Rowe, Sidney E.
2010-01-01
In September 2007, the Engineering Directorate at the Marshall Space Flight Center (MSFC) created the Design System Focus Team (DSFT). MSFC was responsible for the in-house design and development of the Ares I Upper Stage, and the Engineering Directorate was preparing to deploy a new electronic configuration management and data management system, the Design Data Management System (DDMS), based upon a Commercial Off-The-Shelf (COTS) Product Data Management (PDM) system. The DSFT was chartered to establish standardized CAD practices and a new life cycle for design data. Of special interest here, the design teams were to implement Model Based Definition (MBD) in support of the Upper Stage manufacturing contract. Note that this MBD implementation does use partially dimensioned drawings as auxiliary information to the model. The design data life cycle introduced several new release states, used prior to formal release, that allowed models to move through a flow of progressive maturity. The DSFT identified some 17 lessons learned as outcomes of the standards development, pathfinder deployments, and initial application to the Upper Stage design completion. Some of the high-value examples are reviewed.
Mental Models: A Robust Definition
Rook, Laura
2013-01-01
Purpose: The concept of a mental model has been described by theorists from diverse disciplines. The purpose of this paper is to offer a robust definition of an individual mental model for use in organisational management. Design/methodology/approach: The approach adopted involves an interdisciplinary literature review of disciplines, including…
Conceptualising Business Models: Definitions, Frameworks and Classifications
Directory of Open Access Journals (Sweden)
Erwin Fielt
2013-12-01
The business model concept is gaining traction in different disciplines but is still criticized for being fuzzy and vague and lacking consensus on its definition and compositional elements. In this paper we set out to advance our understanding of the business model concept by addressing three areas of foundational research: business model definitions, business model elements, and business model archetypes. We define a business model as a representation of the value logic of an organization in terms of how it creates and captures customer value. This abstract and generic definition is made more specific and operational by the compositional elements, which need to address the customer, value proposition, organizational architecture (firm and network level), and economics dimensions. Business model archetypes complement the definition and elements by providing a more concrete and empirical understanding of the business model concept. The main contributions of this paper are (1) explicitly including the customer value concept in the business model definition and focussing on value creation, (2) presenting four core dimensions that business model elements need to cover, (3) arguing for flexibility by adapting and extending business model elements to cater for different purposes and contexts (e.g. technology, innovation, strategy), (4) stressing a more systematic approach to business model archetypes by using business model elements for their description, and (5) suggesting the use of business model archetype research for the empirical exploration and testing of business model elements and their relationships.
Do we need one business model definition?
Directory of Open Access Journals (Sweden)
Anders Bille Jensen
2013-12-01
Purpose: Different applications and conceptualizations of the business model concept have created discussions on what it actually is. The purpose of this paper is twofold: (1) to establish an overview of current usages of the business model construct, its nature and role in theory building, and, building on this, (2) to derive guiding principles for achieving better clarity of the business model construct in future research. Design/methodology: Variances in the roles, nature and forms of current and diverse applications of the business model concept are discussed along a vertical and a horizontal dimension. Based on the analysis, key issues for achieving construct clarity are proposed. Findings: This paper (1) demonstrates that there are at least three levels of understanding business models (general, conceptual and as a research construct), (2) shows that the business model construct is heavily influenced by the research view, (3) shows that the establishment of specific constructs can be informed by the existing literature, and (4) discusses how the emergent business model concept can be strengthened. Implications: Different and complementary business model perspectives may provide a better understanding and reflection of reality than a single, general and detailed definition. For specific applications, definitions need to explicitly clarify the particular role, nature and boundaries of the business model. Originality/value: The paper provides a methodological contribution to the discussion on business model definitions by adding clarity on the value of the multiple levels and views of current understandings, as well as by showing how to create specific constructs.
Current definition and a generalized Federbush model
International Nuclear Information System (INIS)
Singh, L.P.S.; Hagen, C.R.
1978-01-01
The Federbush model is studied, with particular attention given to the definition of currents. Inasmuch as there is no a priori restriction of local gauge invariance, the currents in the interacting case can be defined more generally than in QED. It is found that two arbitrary parameters are thereby introduced into the theory. Lowest-order perturbation calculations for the current correlation functions and the fermion propagators indicate that the theory admits a whole class of solutions dependent upon these parameters, with the closed solution of Federbush emerging as a special case. The theory is shown to be locally covariant, and a conserved energy-momentum tensor is displayed. One finds in addition that the generators of gauge transformations for the fields are conserved. Finally it is shown that the general theory yields the Federbush solution if suitable Thirring-model-type counterterms are added.
Translating Building Information Modeling to Building Energy Modeling Using Model View Definition
Kim, Jong Bum; Clayton, Mark J.; Haberl, Jeff S.
2014-01-01
This paper presents a new approach to translate between Building Information Modeling (BIM) and Building Energy Modeling (BEM) that uses Modelica, an object-oriented declarative, equation-based simulation environment. The approach (BIM2BEM) has been developed using a data modeling method to enable seamless model translations of building geometry, materials, and topology. Using data modeling, we created a Model View Definition (MVD) consisting of a process model and a class diagram. The process model demonstrates object-mapping between BIM and Modelica-based BEM (ModelicaBEM) and facilitates the definition of required information during model translations. The class diagram represents the information and object relationships to produce a class package intermediate between the BIM and BEM. The implementation of the intermediate class package enables system interface (Revit2Modelica) development for automatic BIM data translation into ModelicaBEM. In order to demonstrate and validate our approach, simulation result comparisons have been conducted via three test cases using (1) the BIM-based Modelica models generated from Revit2Modelica and (2) BEM models manually created using LBNL Modelica Buildings library. Our implementation shows that BIM2BEM (1) enables BIM models to be translated into ModelicaBEM models, (2) enables system interface development based on the MVD for thermal simulation, and (3) facilitates the reuse of original BIM data into building energy simulation without an import/export process. PMID:25309954
Translating building information modeling to building energy modeling using model view definition.
Jeong, WoonSeong; Kim, Jong Bum; Clayton, Mark J; Haberl, Jeff S; Yan, Wei
2014-01-01
This paper presents a new approach to translate between Building Information Modeling (BIM) and Building Energy Modeling (BEM) that uses Modelica, an object-oriented declarative, equation-based simulation environment. The approach (BIM2BEM) has been developed using a data modeling method to enable seamless model translations of building geometry, materials, and topology. Using data modeling, we created a Model View Definition (MVD) consisting of a process model and a class diagram. The process model demonstrates object-mapping between BIM and Modelica-based BEM (ModelicaBEM) and facilitates the definition of required information during model translations. The class diagram represents the information and object relationships to produce a class package intermediate between the BIM and BEM. The implementation of the intermediate class package enables system interface (Revit2Modelica) development for automatic BIM data translation into ModelicaBEM. In order to demonstrate and validate our approach, simulation result comparisons have been conducted via three test cases using (1) the BIM-based Modelica models generated from Revit2Modelica and (2) BEM models manually created using LBNL Modelica Buildings library. Our implementation shows that BIM2BEM (1) enables BIM models to be translated into ModelicaBEM models, (2) enables system interface development based on the MVD for thermal simulation, and (3) facilitates the reuse of original BIM data into building energy simulation without an import/export process.
Translating Building Information Modeling to Building Energy Modeling Using Model View Definition
Directory of Open Access Journals (Sweden)
WoonSeong Jeong
2014-01-01
This paper presents a new approach to translate between Building Information Modeling (BIM) and Building Energy Modeling (BEM) that uses Modelica, an object-oriented declarative, equation-based simulation environment. The approach (BIM2BEM) has been developed using a data modeling method to enable seamless model translations of building geometry, materials, and topology. Using data modeling, we created a Model View Definition (MVD) consisting of a process model and a class diagram. The process model demonstrates object-mapping between BIM and Modelica-based BEM (ModelicaBEM) and facilitates the definition of required information during model translations. The class diagram represents the information and object relationships to produce a class package intermediate between the BIM and BEM. The implementation of the intermediate class package enables system interface (Revit2Modelica) development for automatic BIM data translation into ModelicaBEM. In order to demonstrate and validate our approach, simulation result comparisons have been conducted via three test cases using (1) the BIM-based Modelica models generated from Revit2Modelica and (2) BEM models manually created using LBNL Modelica Buildings library. Our implementation shows that BIM2BEM (1) enables BIM models to be translated into ModelicaBEM models, (2) enables system interface development based on the MVD for thermal simulation, and (3) facilitates the reuse of original BIM data into building energy simulation without an import/export process.
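As a hedged illustration of the object-mapping idea in the records above, the sketch below maps a BIM-like wall record through an intermediate class into a Modelica-style component declaration. The class names, fields, and output format are invented for illustration; the actual MVD, class package, and Revit2Modelica interface are far richer.

```python
# Hypothetical sketch of BIM-to-BEM object mapping via an intermediate
# class, loosely in the spirit of BIM2BEM. All names are invented.

from dataclasses import dataclass

@dataclass
class BimWall:                 # source: BIM-side object
    name: str
    area_m2: float
    u_value: float             # W/(m^2 K)

@dataclass
class ThermalSurface:          # intermediate class (the "class package")
    id: str
    area: float
    conductance: float         # UA value, W/K

def map_wall(w: BimWall) -> ThermalSurface:
    """Object-mapping step: derive BEM-relevant quantities from BIM data."""
    return ThermalSurface(id=w.name, area=w.area_m2,
                          conductance=w.u_value * w.area_m2)

def to_modelica(s: ThermalSurface) -> str:
    """Emit a Modelica-style thermal conductor declaration."""
    return (f"Modelica.Thermal.HeatTransfer.Components.ThermalConductor "
            f"{s.id}(G={s.conductance:g});")

wall = BimWall(name="wall_north", area_m2=12.0, u_value=0.25)
print(to_modelica(map_wall(wall)))
```

The intermediate class decouples the two sides: the BIM exporter and the BEM generator each depend only on `ThermalSurface`, which is the role the class package plays in the translation chain described above.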
Weak Memory Models: Balancing Definitional Simplicity and Implementation Flexibility
Zhang, Sizhuo; Vijayaraghavan, Muralidaran; Arvind
2017-01-01
The memory model for RISC-V, a newly developed open source ISA, has not been finalized yet and thus, offers an opportunity to evaluate existing memory models. We believe RISC-V should not adopt the memory models of POWER or ARM, because their axiomatic and operational definitions are too complicated. We propose two new weak memory models: WMM and WMM-S, which balance definitional simplicity and implementation flexibility differently. Both allow all instruction reorderings except overtaking of...
Moving towards maturity in business model definitions
DEFF Research Database (Denmark)
Nielsen, Christian; Lund, Morten; Bukh, Per Nikolaj
2014-01-01
The field of business models has, as is the case with all emerging fields of practice, slowly matured through the development of frameworks, models, concepts and ideas over the last 15 years. New concepts, theories and models typically transcend a series of maturity phases. For the concept of Bus...
[Safety culture: definition, models and design].
Pfaff, Holger; Hammer, Antje; Ernstmann, Nicole; Kowalski, Christoph; Ommen, Oliver
2009-01-01
Safety culture is a multi-dimensional phenomenon. The safety culture of a healthcare organization is high if it has a common stock of knowledge, values and symbols with regard to patient safety. The article intends, in a first step, to define safety culture and, in a second step, to demonstrate its effects. We present the model of safety behaviour and show how safety culture can affect behaviour and produce safe behaviour. In a third step we look at the causes of safety culture and present the safety culture model. The main hypothesis of this model is that the safety culture of a healthcare organization strongly depends on its communication culture and its social capital. Finally, we investigate how the safety culture of a healthcare organization can be improved. Based on the safety culture model, six measures to improve safety culture are presented.
2013-06-01
Building information exchange (COBie), Building Information Modeling (BIM) … to develop a life-cycle building model have resulted in the definition of a "core" building information model that contains general information … develop an information-exchange Model View Definition (MVD) for building electrical systems. The objective of the current work was to document the …
Weak Memory Models with Matching Axiomatic and Operational Definitions
Zhang, Sizhuo; Vijayaraghavan, Muralidaran; Lustig, Dan; Arvind
2017-01-01
Memory consistency models are notorious for being difficult to define precisely, to reason about, and to verify. More than a decade of effort has gone into nailing down the definitions of the ARM and IBM Power memory models, and yet there are still aspects of those models which (perhaps surprisingly) remain unresolved to this day. In response to these complexities, there has been somewhat of a recent trend in the (general-purpose) architecture community to limit new memory models to being ...
Building a Shared Definitional Model of Long Duration Human Spaceflight
Arias, Diana; Orr, Martin; Whitmire, Alexandra; Leveton, Lauren; Sandoval, Luis
2012-01-01
Objective: To establish the need for a shared definitional model of long duration human spaceflight that would provide a framework and vision to facilitate communication, research, and practice. In 1956, on the eve of human space travel, Hubertus Strughold first proposed a "simple classification of the present and future stages of manned flight" that identified key factors, risks, and developmental stages for the evolutionary journey ahead. As we look to new destinations, we need a current shared working definitional model of long duration human spaceflight to help guide our path. Here we describe our preliminary findings and outline potential approaches for the future development of a definition and a broader classification system.
Spectra of definite type in waveguide models
Czech Academy of Sciences Publication Activity Database
Lotoreichik, Vladimir; Siegl, Petr
2017-01-01
Roč. 145, č. 3 (2017), s. 1231-1246 ISSN 0002-9939 R&D Projects: GA ČR(CZ) GA14-06818S Institutional support: RVO:61389005 Keywords: spectral points of definite type and of type π * weakly coupled bound states * perturbations of essential spectrum * PT-symmetric waveguide Subject RIV: BE - Theoretical Physics OBOR OECD: Applied mathematics Impact factor: 0.679, year: 2016
Building a Shared Definitional Model of Long Duration Human Spaceflight
Orr, M.; Whitmire, A.; Sandoval, L.; Leveton, L.; Arias, D.
2011-01-01
In 1956, on the eve of human space travel, Strughold first proposed a simple classification of the present and future stages of manned flight that identified key factors, risks, and developmental stages for the evolutionary journey ahead. As we look to optimize the potential of the ISS as a gateway to new destinations, we need a current shared working definitional model of long duration human space flight to help guide our path. An initial search of the formal and grey literature was augmented by liaison with subject matter experts. The search strategy focused on the use of both the term long duration mission and the term long duration spaceflight, as well as broader related current and historical definitions and classification models of spaceflight. The related sea and air travel literature was also subsequently explored with a view to identifying analogous models or classification systems. There are multiple different definitions and classification systems for spaceflight, covering phase and type of mission, craft and payload, and related risk management models. However, the frequently used concepts of long duration mission and long duration spaceflight are infrequently operationally defined by authors, and no commonly referenced classical or gold-standard definition or model of these terms emerged from the search. The categorization (Cat) system for sailing was found to be of potential analogous utility, with its focus on understanding the need for crew and craft autonomy at various levels of potential adversity and inability to gain outside support or return to a safe location due to factors of time, distance, and location.
A consensus definition of cataplexy in mouse models of narcolepsy.
Scammell, Thomas E; Willie, Jon T; Guilleminault, Christian; Siegel, Jerome M
2009-01-01
People with narcolepsy often have episodes of cataplexy, brief periods of muscle weakness triggered by strong emotions. Many researchers are now studying mouse models of narcolepsy, but definitions of cataplexy-like behavior in mice differ across labs. To establish a common language, the International Working Group on Rodent Models of Narcolepsy reviewed the literature on cataplexy in people with narcolepsy and in dog and mouse models of narcolepsy and then developed a consensus definition of murine cataplexy. The group concluded that murine cataplexy is an abrupt episode of nuchal atonia lasting at least 10 seconds. In addition, theta activity dominates the EEG during the episode, and video recordings document immobility. To distinguish a cataplexy episode from REM sleep after a brief awakening, at least 40 seconds of wakefulness must precede the episode. Bouts of cataplexy fitting this definition are common in mice with disrupted orexin/hypocretin signaling, but these events almost never occur in wild type mice. It remains unclear whether murine cataplexy is triggered by strong emotions or whether mice remain conscious during the episodes as in people with narcolepsy. This working definition provides helpful insights into murine cataplexy and should allow objective and accurate comparisons of cataplexy in future studies using mouse models of narcolepsy.
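The working definition above is operational enough to sketch in code. The following is a minimal, hypothetical scoring function for candidate episodes; the episode representation and the treatment of the thresholds as parameters are assumptions of this sketch, not part of the consensus paper.

```python
# Hypothetical illustration of the consensus murine cataplexy criteria:
# an abrupt episode of nuchal atonia lasting >= 10 s, preceded by >= 40 s
# of wakefulness, with theta-dominant EEG and video-documented immobility.
# The epoch/record representation here is an assumption, not a standard.

from dataclasses import dataclass

@dataclass
class Episode:
    state_before: str       # behavioral state preceding onset, e.g. "wake"
    wake_duration_s: float  # seconds of wakefulness immediately before onset
    atonia_duration_s: float
    theta_dominant: bool    # EEG dominated by theta during the episode
    immobile_on_video: bool

def is_cataplexy(ep: Episode,
                 min_atonia_s: float = 10.0,
                 min_preceding_wake_s: float = 40.0) -> bool:
    """Apply the working definition; True only if all criteria hold."""
    return (ep.state_before == "wake"
            and ep.wake_duration_s >= min_preceding_wake_s
            and ep.atonia_duration_s >= min_atonia_s
            and ep.theta_dominant
            and ep.immobile_on_video)

# A qualifying episode vs. one too close to prior sleep (likely REM
# sleep after a brief awakening, which the 40 s rule excludes).
ok = Episode("wake", 60.0, 15.0, True, True)
too_soon = Episode("wake", 20.0, 15.0, True, True)
print(is_cataplexy(ok), is_cataplexy(too_soon))
```

The 40-second wakefulness criterion is what does the work of separating cataplexy from sleep-onset REM in this sketch, mirroring the distinction the working group draws.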
A Model-Free Definition of Increasing Uncertainty
Grant, S.; Quiggin, J.
2001-01-01
We present a definition of increasing uncertainty, independent of any notion of subjective probabilities, or of any particular model of preferences. Our notion of an elementary increase in the uncertainty of any act corresponds to the addition of an 'elementary bet' which increases consumption by a
Promoting Model-based Definition to Establish a Complete Product Definition.
Ruemler, Shawn P; Zimmerman, Kyle E; Hartman, Nathan W; Hedberg, Thomas; Feeny, Allison Barnard
2017-05-01
The manufacturing industry is evolving and starting to use 3D models as the central knowledge artifact for product data and product definition, or what is known as Model-based Definition (MBD). The Model-based Enterprise (MBE) uses MBD as a way to transition away from using traditional paper-based drawings and documentation. As MBD grows in popularity, it is imperative to understand what information is needed in the transition from drawings to models so that models represent all the relevant information needed for processes to continue efficiently. Finding this information can help define what data is common amongst different models in different stages of the lifecycle, which could help establish a Common Information Model. The Common Information Model is a source that contains common information from domain specific elements amongst different aspects of the lifecycle. To help establish this Common Information Model, information about how models are used in industry within different workflows needs to be understood. To retrieve this information, a survey mechanism was administered to industry professionals from various sectors. Based on the results of the survey a Common Information Model could not be established. However, the results gave great insight that will help in further investigation of the Common Information Model.
Livrable D1.2 of the PERSEE project : Perceptual-Modelling-Definition-of-the-Models
Wang , Junle; Bosc , Emilie; Li , Jing; Ricordel , Vincent
2011-01-01
Deliverable D1.2 of the ANR PERSEE project. This report was produced within the framework of the ANR PERSEE project (no. ANR-09-BLAN-0170); specifically, it corresponds to deliverable D1.2 of the project. Its title: Perceptual-Modelling-Definition-of-the-Models.
Basic definitions for discrete modeling of computer worms epidemics
Directory of Open Access Journals (Sweden)
Pedro Guevara López
2015-01-01
Information technologies have evolved to the point that communication between computers or hosts has become commonplace, so much so that organizations worldwide (governments and corporations) depend on it; a prolonged failure of these computers would be catastrophic. Unfortunately, networks are attacked by malware such as viruses and worms that can bring systems down. This has motivated the formal study of computer worms and epidemics in order to develop strategies for prevention and protection. For this reason, in this paper, before analyzing epidemiological models, a set of formal definitions based on set theory and functions is proposed to describe 21 concepts used in the study of worms. These definitions provide a basis for future qualitative research on the behavior of computer worms, and a quantitative basis for the study of their epidemiological models.
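The paper proposes formal definitions rather than one particular model, but the discrete epidemiological models it prepares the ground for are typically of the SIR family. As a generic illustration of such a model, and with parameter values invented here rather than taken from the paper:

```python
# A standard discrete-time SIR epidemic model, a common starting point
# for computer-worm propagation studies. Parameters are illustrative only.

def sir_step(s, i, r, beta, gamma, n):
    """One time step: susceptible hosts are infected at rate beta*s*i/n;
    infected hosts are patched/removed at rate gamma."""
    new_infections = beta * s * i / n
    new_removals = gamma * i
    return (s - new_infections,
            i + new_infections - new_removals,
            r + new_removals)

def simulate(n=10_000, i0=10, beta=0.4, gamma=0.1, steps=200):
    """Run the epidemic and track the peak number of infected hosts."""
    s, i, r = n - i0, i0, 0.0
    peak = i
    for _ in range(steps):
        s, i, r = sir_step(s, i, r, beta, gamma, n)
        peak = max(peak, i)
    return s, i, r, peak

s, i, r, peak = simulate()
print(f"final susceptible={s:.0f}, removed={r:.0f}, peak infected={peak:.0f}")
```

With beta/gamma = 4, nearly the whole population is eventually infected; lowering beta (e.g. by filtering worm traffic) below gamma prevents the outbreak, which is the kind of policy question such models quantify.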
Ontology for Life-Cycle Modeling of Water Distribution Systems: Model View Definition
2013-06-01
requirements, the team followed the Information Delivery Manual (IDM) and Model View Definition (MVD) procedures defined by the International Organization … MVD_Format_V2_Proposal_080128.pdf. Wix, J., ed. 2007. "Information Delivery Manual: Guide to Components and Development Methods," http://www.iai.no/idm/idm_resources
Sustainable geothermal utilization - Case histories; definitions; research issues and modelling
International Nuclear Information System (INIS)
Axelsson, Gudni
2010-01-01
Sustainable development, by definition, meets the needs of the present without compromising the ability of future generations to meet their own needs. The Earth's enormous geothermal resources have the potential to contribute significantly to sustainable energy use worldwide as well as to help mitigate climate change. Experience from the use of numerous geothermal systems worldwide, lasting several decades, demonstrates that by maintaining production below a certain limit the systems reach a balance between net energy discharge and recharge that may be maintained for a long time (100-300 years). Modelling studies indicate that the effect of heavy utilization is often reversible on a time-scale comparable to the period of utilization. Thus, geothermal resources can be used in a sustainable manner through (1) constant production below the sustainable limit, (2) step-wise increases in production, (3) intermittent excessive production with breaks, or (4) reduced production after a shorter period of heavy production. The long production histories available for low-temperature as well as high-temperature geothermal systems distributed throughout the world provide the most valuable data for studying sustainable management of geothermal resources, and reservoir modelling is the most powerful tool available for this purpose. The paper presents sustainability modelling studies for the Hamar and Nesjavellir geothermal systems in Iceland, the Beijing Urban system in China and the Olkaria system in Kenya as examples. Several relevant research issues have also been identified, such as the relevance of system boundary conditions during long-term utilization, how far-reaching the interference from utilization is, how effectively geothermal systems recover after heavy utilization, and the reliability of long-term (more than 100 years) model predictions. (author)
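The reversibility argument above can be illustrated with a toy lumped-parameter reservoir sketch in which recharge is proportional to drawdown. All coefficients below are invented for illustration; real sustainability assessments rely on calibrated reservoir models such as those discussed in the paper.

```python
# Toy "tank" reservoir model: pressure drops with production and recovers
# via recharge proportional to drawdown (p0 - p). Units and coefficients
# are arbitrary and chosen only to show the qualitative behavior.

def simulate_pressure(production, p0=100.0, k_recharge=0.05, c=0.2):
    """Step the reservoir pressure through a production schedule
    (one value per year); returns the pressure history."""
    p = p0
    history = []
    for q in production:
        p += k_recharge * (p0 - p) - c * q
        history.append(p)
    return history

# Strategy (4) from the abstract: heavy production for 50 years,
# then reduced production; pressure partially recovers afterwards.
schedule = [20.0] * 50 + [5.0] * 250
h = simulate_pressure(schedule)
print(f"pressure after heavy phase: {h[49]:.1f}, after recovery: {h[-1]:.1f}")
```

In this sketch the drawdown relaxes toward a new equilibrium once production is reduced, mirroring the observation that heavy utilization is often reversible on a time-scale comparable to the utilization period.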
Scoring predictive models using a reduced representation of proteins: model and energy definition
Directory of Open Access Journals (Sweden)
Corazza Alessandra
2007-03-01
Background: Reduced representations of proteins have been playing a key role in the study of protein folding. Many such models are available, with different representation detail. Although the usefulness of many such models for structural bioinformatics applications has been demonstrated in recent years, there are few intermediate-resolution models endowed with an energy model capable, for instance, of detecting native or native-like structures among decoy sets. The aim of the present work is to provide a discrete empirical potential for a reduced protein model termed here PC2CA, because it employs a PseudoCovalent structure with only 2 Centers of interactions per Amino acid, suitable for protein model quality assessment. Results: All protein structures in the set top500H have been converted into reduced form. The distributions of pseudobonds, pseudoangles, pseudodihedrals and distances between centers of interactions have been converted into potentials of mean force. A suitable reference distribution has been defined for non-bonded interactions which takes into account excluded-volume effects and protein finite size. The correlation between adjacent main-chain pseudodihedrals has been converted into an additional energetic term which is able to account for cooperative effects in secondary structure elements. Local energy surface exploration is performed in order to increase the robustness of the energy function. Conclusion: The model and the energy definition proposed have been tested on all the multiple decoys' sets in the Decoys'R'us database. The energetic model is able to recognize, for almost all sets, native-like structures (RMSD less than 2.0 Å). These results and those obtained in the blind CASP7 quality assessment experiment suggest that the model compares well with scoring potentials with finer granularity and could be useful for fast exploration of conformational space. Parameters are available at the URL: http://www.dstb.uniud.it/~ffogolari/download/.
Scoring predictive models using a reduced representation of proteins: model and energy definition.
Fogolari, Federico; Pieri, Lidia; Dovier, Agostino; Bortolussi, Luca; Giugliarelli, Gilberto; Corazza, Alessandra; Esposito, Gennaro; Viglino, Paolo
2007-03-23
Reduced representations of proteins have been playing a key role in the study of protein folding. Many such models are available, with different representation detail. Although the usefulness of many such models for structural bioinformatics applications has been demonstrated in recent years, there are few intermediate-resolution models endowed with an energy model capable, for instance, of detecting native or native-like structures among decoy sets. The aim of the present work is to provide a discrete empirical potential for a reduced protein model termed here PC2CA, because it employs a PseudoCovalent structure with only 2 Centers of interactions per Amino acid, suitable for protein model quality assessment. All protein structures in the set top500H have been converted into reduced form. The distributions of pseudobonds, pseudoangles, pseudodihedrals and distances between centers of interactions have been converted into potentials of mean force. A suitable reference distribution has been defined for non-bonded interactions which takes into account excluded-volume effects and protein finite size. The correlation between adjacent main-chain pseudodihedrals has been converted into an additional energetic term which is able to account for cooperative effects in secondary structure elements. Local energy surface exploration is performed in order to increase the robustness of the energy function. The model and the energy definition proposed have been tested on all the multiple decoys' sets in the Decoys'R'us database. The energetic model is able to recognize, for almost all sets, native-like structures (RMSD less than 2.0 Å). These results and those obtained in the blind CASP7 quality assessment experiment suggest that the model compares well with scoring potentials with finer granularity and could be useful for fast exploration of conformational space. Parameters are available at the URL: http://www.dstb.uniud.it/~ffogolari/download/.
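The central step described in both records, converting observed distributions into potentials of mean force, can be sketched as follows. The toy counts, flat reference, and pseudo-count smoothing are assumptions of this sketch; the paper derives its reference distribution from excluded-volume and finite-size considerations.

```python
# Sketch of a potential-of-mean-force (statistical potential) conversion:
# per-bin energies E = -kT * ln(p_obs / p_ref), with pseudo-counts to
# avoid log(0). Toy data only; not the paper's actual parameterization.

import math

def pmf(observed_counts, reference_counts, kT=1.0, pseudo=1.0):
    """Turn observed vs. reference bin counts into pseudo-energies."""
    n_obs = sum(observed_counts) + pseudo * len(observed_counts)
    n_ref = sum(reference_counts) + pseudo * len(reference_counts)
    energies = []
    for o, ref in zip(observed_counts, reference_counts):
        p_obs = (o + pseudo) / n_obs
        p_ref = (ref + pseudo) / n_ref
        energies.append(-kT * math.log(p_obs / p_ref))
    return energies

# A distance bin enriched in native structures relative to the reference
# gets a favorable (negative) energy; a depleted bin gets a penalty.
obs = [5, 50, 20, 5]    # counts per distance bin in native structures
ref = [20, 20, 20, 20]  # flat reference distribution
e = pmf(obs, ref)
print([round(x, 2) for x in e])
```

Scoring a decoy then amounts to summing these per-bin energies over all pseudobonds, pseudoangles, pseudodihedrals and pair distances in the reduced representation.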
Persuasive Game Design : A model and its definitions
Visch, V.T.; Vegt, N.J.H.; Anderiesen, H.; Van der Kooij, K.
2013-01-01
The following position paper proposes a general theoretical model for persuasive game design. This model combines existing theories on persuasive technology, serious gaming, and gamification. The model is based on user experience, gamification design, and transfer effects.
Integrated source-risk model for radon: A definition study
International Nuclear Information System (INIS)
Laheij, G.M.H.; Aldenkamp, F.J.; Stoop, P.
1993-10-01
The purpose of a source-risk model is to support policy making on radon mitigation by comparing the effects of various policy options and to enable optimization of countermeasures applied to different parts of the source-risk chain. There are several advantages to developing and using a source-risk model: risk calculations are standardized; the effects of measures applied to different parts of the source-risk chain can be better compared because interactions are included; and sensitivity analyses can be used to determine the most important parameters within the total source-risk chain. After an inventory of the processes and sources to be included in the source-risk chain, the models presently available in the Netherlands are investigated. The models were screened for completeness, validation and operational status. The investigation made clear that, by choosing for each part of the source-risk chain the most convenient model, a source-risk chain model for radon can be realized. However, the calculation of dose from radon concentrations and the status of the validation of most models should be improved. Calculations with the proposed source-risk model will, at present, give estimates with a large uncertainty. For further development of the source-risk model, an interaction between the source-risk model and experimental research is recommended. Organisational forms of the source-risk model are discussed. A source-risk model in which only simple models are included is also recommended; the other models are operated and administrated by the model owners. The model owners execute their models for a combination of input parameters, and the output of the models is stored in a database which will be used for calculations with the source-risk model. 5 figs., 15 tabs., 7 appendices, 14 refs.
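The chain structure of such a model (source term, indoor concentration, dose, risk) can be sketched as below. Every coefficient here is a placeholder chosen for illustration, including the dose coefficient; a real source-risk model couples validated sub-models for each link.

```python
# Toy source-risk chain for radon: entry rate -> indoor concentration
# -> annual dose. All coefficients are invented placeholders; the point
# is only that a measure applied at one link propagates down the chain.

def indoor_concentration(entry_rate_bq_h, volume_m3, ventilation_per_h):
    """Steady-state concentration (Bq/m^3) from a well-mixed single-zone
    balance: entry rate divided by volume times air-change rate."""
    return entry_rate_bq_h / (volume_m3 * ventilation_per_h)

def annual_dose_msv(concentration_bq_m3, occupancy_h=7000.0,
                    dose_coeff=6.7e-6):
    """Dose from exposure; the dose coefficient (mSv per Bq h m^-3)
    is a placeholder, not an endorsed value."""
    return concentration_bq_m3 * occupancy_h * dose_coeff

# A mitigation measure applied at one link of the chain: doubling
# ventilation halves the concentration and hence the dose.
base = annual_dose_msv(indoor_concentration(10_000, 250, 0.5))
mitigated = annual_dose_msv(indoor_concentration(10_000, 250, 1.0))
print(f"base: {base:.2f} mSv/y, with extra ventilation: {mitigated:.2f} mSv/y")
```

Comparing countermeasures at different links (sealing the source vs. increasing ventilation vs. reducing occupancy) in one consistent chain is exactly the kind of policy comparison the source-risk model is meant to standardize.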
Test, revision, and cross-validation of the Physical Activity Self-Definition Model.
Kendzierski, Deborah; Morganstein, Mara S
2009-08-01
Structural equation modeling was used to test an extended version of the Kendzierski, Furr, and Schiavoni (1998) Physical Activity Self-Definition Model. A revised model using data from 622 runners fit the data well. Cross-validation indices supported the revised model, and this model also provided a good fit to data from 397 cyclists. Partial invariance was found across activities. In both samples, perceived commitment and perceived ability had direct effects on self-definition, and perceived wanting, perceived trying, and enjoyment had indirect effects. The contribution of perceived ability to self-definition did not differ across activities. Implications concerning the original model, indirect effects, skill salience, and the role of context in self-definition are discussed.
Time domain series system definition and gear set reliability modeling
International Nuclear Information System (INIS)
Xie, Liyang; Wu, Ningxiang; Qian, Wenxue
2016-01-01
Time-dependent multi-configuration is a typical feature of mechanical systems such as gear trains and chain drives. As a series system, a gear train is distinct from a traditional series system, such as a chain, in its load transmission path, system-component relationship, system functioning manner, and time-dependent system configuration. The present paper first defines the time-domain series system, for which the traditional series system reliability model is not adequate. Then, a system-specific reliability modeling technique is proposed for gear sets, including component (tooth) and subsystem (tooth-pair) load history description, material prior/posterior strength expression, time-dependent and system-specific load-strength interference analysis, and treatment of statistically dependent failure events. Consequently, several system reliability models are developed for gear sets with different tooth numbers in the scenario of tooth-root material ultimate tensile strength failure. The application of the models is discussed in the last part, and the differences between the system-specific reliability model and the traditional series system reliability model are illustrated through several numerical examples. - Highlights: • A new type of series system, the time-domain multi-configuration series system, is defined, which is of great significance to reliability modeling. • A multi-level statistical-analysis-based reliability modeling method is presented for gear transmission systems. • Several system-specific reliability models are established for gear set reliability estimation. • The differences between the traditional series system reliability model and the new model are illustrated.
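The gap between the traditional independent-series model and a formulation with statistically dependent failures (all teeth seeing the same load history) can be illustrated numerically. The strength and load distributions below are hypothetical, chosen only to show the effect:

```python
import random
random.seed(2)

N_TEETH = 20       # components in the series system (hypothetical)
TRIALS = 50_000

def strength():    # tooth-root ultimate strength, hypothetical N(120, 10)
    return random.gauss(120.0, 10.0)

def load():        # one load realization, hypothetical N(80, 15)
    return random.gauss(80.0, 15.0)

# Traditional series model: component failures assumed independent,
# so system reliability is the single-component reliability to the n-th power.
p_single = sum(strength() <= load() for _ in range(TRIALS)) / TRIALS
r_independent = (1.0 - p_single) ** N_TEETH

# System-specific view: every tooth sees the SAME load sample per realization,
# which makes failure events statistically dependent.
survive = 0
for _ in range(TRIALS):
    shared = load()
    if all(strength() > shared for _ in range(N_TEETH)):
        survive += 1
r_dependent = survive / TRIALS

print(f"independent-series R = {r_independent:.3f}, "
      f"shared-load R = {r_dependent:.3f}")
```

With a shared load the surviving fraction is higher than the independence assumption predicts, which is the kind of difference the paper's numerical examples illustrate.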
TAPWAT: Definition structure and applications for modelling drinking water treatment
Versteegh JFM; Gaalen FW van; Rietveld LC; Evers EG; Aldenberg TA; Cleij P; Technische Universiteit Delft; LWD
2001-01-01
The 'Tool for the Analysis of the Production of drinking WATer' (TAPWAT) model has been developed for describing drinking-water quality in integral studies in the context of the Environmental Policy Assessment of the RIVM. The model consists of modules that represent individual steps in a treatment
TAPWAT: Definition structure and applications for modelling drinking water treatment
Versteegh JFM; van Gaalen FW; Rietveld LC; Evers EG; Aldenberg TA; Cleij P; LWD
2001-01-01
The TAPWAT model ('Tool for the Analysis of the Production of drinking WATer') has been developed to describe drinking-water quality in integral studies in the context of the RIVM's Environment and Nature Policy Assessment. The model consists of modules that represent the individual treatment steps of the
Towards a Definition of Role-related Concepts for Business Modeling
Meertens, Lucas Onno; Iacob, Maria Eugenia; Nieuwenhuis, Lambertus Johannes Maria
2010-01-01
Abstract—While several role-related concepts play an important role in business modeling, their definitions, relations, and use differ greatly between languages, papers, and reports. Due to this, the knowledge captured by models is not transferred correctly, and models are incomparable. In this
Health literacy and public health: A systematic review and integration of definitions and models
2012-01-01
Background Health literacy concerns the knowledge and competences of persons to meet the complex demands of health in modern society. Although its importance is increasingly recognised, there is no consensus about the definition of health literacy or about its conceptual dimensions, which limits the possibilities for measurement and comparison. The aim of the study is to review definitions and models on health literacy to develop an integrated definition and conceptual model capturing the most comprehensive evidence-based dimensions of health literacy. Methods A systematic literature review was performed to identify definitions and conceptual frameworks of health literacy. A content analysis of the definitions and conceptual frameworks was carried out to identify the central dimensions of health literacy and develop an integrated model. Results The review resulted in 17 definitions of health literacy and 12 conceptual models. Based on the content analysis, an integrative conceptual model was developed containing 12 dimensions referring to the knowledge, motivation and competencies of accessing, understanding, appraising and applying health-related information within the healthcare, disease prevention and health promotion setting, respectively. Conclusions Based upon this review, a model is proposed integrating medical and public health views of health literacy. The model can serve as a basis for developing health literacy enhancing interventions and provide a conceptual basis for the development and validation of measurement tools, capturing the different dimensions of health literacy within the healthcare, disease prevention and health promotion settings. PMID:22276600
Health literacy and public health: A systematic review and integration of definitions and models
LENUS (Irish Health Repository)
Sorensen, Kristine
2012-01-25
Abstract Background Health literacy concerns the knowledge and competences of persons to meet the complex demands of health in modern society. Although its importance is increasingly recognised, there is no consensus about the definition of health literacy or about its conceptual dimensions, which limits the possibilities for measurement and comparison. The aim of the study is to review definitions and models on health literacy to develop an integrated definition and conceptual model capturing the most comprehensive evidence-based dimensions of health literacy. Methods A systematic literature review was performed to identify definitions and conceptual frameworks of health literacy. A content analysis of the definitions and conceptual frameworks was carried out to identify the central dimensions of health literacy and develop an integrated model. Results The review resulted in 17 definitions of health literacy and 12 conceptual models. Based on the content analysis, an integrative conceptual model was developed containing 12 dimensions referring to the knowledge, motivation and competencies of accessing, understanding, appraising and applying health-related information within the healthcare, disease prevention and health promotion setting, respectively. Conclusions Based upon this review, a model is proposed integrating medical and public health views of health literacy. The model can serve as a basis for developing health literacy enhancing interventions and provide a conceptual basis for the development and validation of measurement tools, capturing the different dimensions of health literacy within the healthcare, disease prevention and health promotion settings.
Beyond a Definition: Toward a Framework for Designing and Specifying Mentoring Models
Dawson, Phillip
2014-01-01
More than three decades of mentoring research has yet to converge on a unifying definition of mentoring; this is unsurprising given the diversity of relationships classified as mentoring. This article advances beyond a definition toward a common framework for specifying mentoring models. Sixteen design elements were identified from the literature…
[The definition of chemiluminescence intensity in experimental model of premature senescence].
Rukavishnikova, S A; Arutiunian, A V; Ryzhak, G A
2011-01-01
This article describes the diagnostic possibilities of chemiluminescence intensity measurements for the prognosis of premature senescence. Determining chemiluminescence intensity makes it possible to identify individuals resistant to premature senescence processes in a model of acute radiation.
Validation of the PESTLA model: Definitions, objectives and procedure
Boekhold AE; van den Bosch H; Boesten JJTI; Leistra M; Swartjes FA; van der Linden AMA
1993-01-01
The simulation model PESTLA was developed to produce estimates of accumulation and leaching of pesticides in soil to facilitate classification of pesticides in the Dutch registration procedure. Before PESTLA can be used for quantitative assessment of expected pesticide concentrations in
Family Resilience in the Military: Definitions, Models, and Policies
2015-01-01
Chartrand et al., 2008; Lester et al., 2010; Lincoln and Sweeten, 2011; Chandra et al., 2010; Chandra et al., 2011; Gibbs et al., 2008). Despite...for definitions of family resilience. Of these, 29 presented at least one definition with original content (i.e., a new definition or one building on...policies: (1) policies about programs that originally had different purposes, such as youth programs, which are then modified to address resilience or
Kroeker, Kristine; Widdifield, Jessica; Muthukumarana, Saman; Jiang, Depeng; Lix, Lisa M
2017-06-23
This research proposes a model-based method to facilitate the selection of disease case definitions from validation studies for administrative health data. The method is demonstrated for a rheumatoid arthritis (RA) validation study. Data were from 148 definitions to ascertain cases of RA in hospital, physician and prescription medication administrative data. We considered: (A) separate univariate models for sensitivity and specificity, (B) a univariate model for Youden's summary index and (C) a bivariate (ie, joint) mixed-effects model for sensitivity and specificity. Model covariates included the number of diagnoses in physician, hospital and emergency department records, physician diagnosis observation time, duration of time between physician diagnoses and number of RA-related prescription medication records. The most common case definition attributes were: 1+ hospital diagnosis (65%), 2+ physician diagnoses (43%), 1+ specialist physician diagnosis (51%) and 2+ years of physician diagnosis observation time (27%). Statistically significant improvements in sensitivity and/or specificity in the separate univariate models were associated with several of these case definition criteria; the bivariate model produced similar results. Youden's index was associated with these same case definition criteria, except for the length of the physician diagnosis observation time. A model-based method provides valuable empirical evidence to aid in selecting a definition(s) for ascertaining diagnosed disease cases from administrative health data. The choice between univariate and bivariate models depends on the goals of the validation study and the number of case definitions. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
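The core bookkeeping behind such a study — computing sensitivity, specificity and Youden's index for competing case definitions against a reference standard — can be sketched as follows. The records and candidate definitions are toy stand-ins, not the study's data:

```python
def evaluate(definition, records):
    """Sensitivity, specificity and Youden's J for one candidate definition."""
    tp = sum(1 for r in records if r["ra"] and definition(r))
    fn = sum(1 for r in records if r["ra"] and not definition(r))
    tn = sum(1 for r in records if not r["ra"] and not definition(r))
    fp = sum(1 for r in records if not r["ra"] and definition(r))
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return sens, spec, sens + spec - 1.0   # Youden's J

# Toy administrative records: diagnosis counts are hypothetical; "ra" is the
# reference-standard (validated) case status.
records = [
    {"ra": True,  "md_dx": 3, "hosp_dx": 1},
    {"ra": True,  "md_dx": 2, "hosp_dx": 0},
    {"ra": True,  "md_dx": 1, "hosp_dx": 0},
    {"ra": False, "md_dx": 1, "hosp_dx": 0},
    {"ra": False, "md_dx": 0, "hosp_dx": 0},
    {"ra": False, "md_dx": 0, "hosp_dx": 1},
]

candidates = {
    "2+ physician dx":            lambda r: r["md_dx"] >= 2,
    "1+ hospital dx":             lambda r: r["hosp_dx"] >= 1,
    "2+ md dx or 1+ hospital dx": lambda r: r["md_dx"] >= 2 or r["hosp_dx"] >= 1,
}

best = max(candidates, key=lambda name: evaluate(candidates[name], records)[2])
for name, d in candidates.items():
    sens, spec, j = evaluate(d, records)
    print(f"{name}: sens={sens:.2f} spec={spec:.2f} J={j:.2f}")
print("best by Youden's J:", best)
```

The study's model-based approach goes further by regressing these accuracy measures on definition attributes, but the trade-off being optimized is the one computed here.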
Spädtke, P
2013-01-01
Modeling of technical machines has become a standard technique since computers became powerful enough to handle the amount of data relevant to a specific system. Simulation of an existing physical device requires knowledge of all relevant quantities. Electric fields given by the surrounding boundary, as well as magnetic fields caused by coils or permanent magnets, have to be known. Internal sources for both fields are sometimes taken into account, such as space-charge forces or the internal magnetic field of a moving bunch of charged particles. The solver routines used are briefly described, and some benchmarking is shown to estimate the necessary computing times for different problems. Different types of charged-particle sources are presented together with suitable models to describe their physics. Electron guns are covered, as well as different ion sources (volume ion sources, laser ion sources, Penning ion sources, electron resonance ion sources, and H$^-$ sources), together with some remarks on beam transport.
Biomass Scenario Model Scenario Library: Definitions, Construction, and Description
Energy Technology Data Exchange (ETDEWEB)
Inman, D.; Vimmerstedt, L.; Bush, B.; Peterson, S.
2014-04-01
Understanding the development of the biofuels industry in the United States is important to policymakers and industry. The Biomass Scenario Model (BSM) is a system dynamics model of the biomass-to-biofuels system that can be used to explore policy effects on biofuels development. Because of the complexity of the model, as well as the wide range of possible future conditions that affect biofuels industry development, we have not developed a single reference case but instead developed a set of specific scenarios that provide various contexts for our analyses. The purpose of this report is to describe the scenarios that comprise the BSM scenario library. At present, the library contains the following policy-focused scenarios: minimal policies, ethanol-focused policies, equal access to policies, output-focused policies, technological-diversity-focused policies, and point-of-production-focused policies. This report describes each scenario, its policy settings, and general insights gained through use of the scenarios in analytic studies.
Directory of Open Access Journals (Sweden)
Piet Swanepoel
2011-10-01
Full Text Available
ABSTRACT: This article focuses on some of the problems raised by Atkins and Rundell's (2008) approach to the design of lexicographic definitions for members of lexical sets. The questions raised are how to define and identify lexical sets, how lexical conceptual models (LCMs) can support definitional consistency and coherence in defining members of lexical sets, and what the ideal content and structure of LCMs could be. Although similarity of meaning is proposed as the defining feature of lexical sets, similarity of meaning is only one dimension of the broader concept of lexical coherence. The argument is presented that numerous conceptual lexical models (e.g. taxonomies, folk models, frames, etc.) in fact indicate, justify or explain how lexical items cohere (and thus form sets). In support of Fillmore's (2003) suggestion that definitions of the lexical items of cohering sets should be linked to such explanatory models, additional functionally-orientated arguments are presented for the incorporation of conceptual lexical models in electronic monolingual learners' dictionaries. Numerous resources exist to support the design of LCMs which can improve the functionality of definitions of members of lexical sets. A few examples are discussed of how such resources can be used to design functionally justified LCMs.
SUMMARY: Improving the functionality of dictionary definitions for lexical sets: the role of definition matrices, definitional consistency, definitional coherence and the incorporation of lexical conceptual models. This article focuses on some of the problems raised by Atkins and Rundell's (2008) approach to the design of lexicographic definitions for members of lexical sets. The questions posed are how lexical sets should be defined and identified, and how lexical conceptual models (LCMs) can support definitional consistency and coherence in the definition of members
A formal definition of data flow graph models
Kavi, Krishna M.; Buckles, Bill P.; Bhat, U. Narayan
1986-01-01
In this paper, a new model for parallel computations and parallel computer systems that is based on data flow principles is presented. Uninterpreted data flow graphs can be used to model computer systems including data driven and parallel processors. A data flow graph is defined to be a bipartite graph with actors and links as the two vertex classes. Actors can be considered similar to transitions in Petri nets, and links similar to places. The nondeterministic nature of uninterpreted data flow graphs necessitates the derivation of liveness conditions.
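The firing semantics of such a bipartite graph — actors analogous to Petri-net transitions, links analogous to places holding tokens — can be sketched in a few lines. This is an illustrative uninterpreted model, not a reproduction of the paper's formalism:

```python
# Bipartite data flow graph: links (places) hold tokens; an actor (transition)
# is enabled when every input link holds a token, and firing it moves tokens
# from its input links to its output links.
class DataFlowGraph:
    def __init__(self):
        self.tokens = {}   # link name -> token count
        self.actors = {}   # actor name -> (input links, output links)

    def add_link(self, name, tokens=0):
        self.tokens[name] = tokens

    def add_actor(self, name, inputs, outputs):
        self.actors[name] = (inputs, outputs)

    def enabled(self, actor):
        inputs, _ = self.actors[actor]
        return all(self.tokens[l] > 0 for l in inputs)

    def fire(self, actor):
        inputs, outputs = self.actors[actor]
        assert self.enabled(actor), f"{actor} is not enabled"
        for l in inputs:
            self.tokens[l] -= 1
        for l in outputs:
            self.tokens[l] += 1

g = DataFlowGraph()
for link in ("a", "b", "c"):
    g.add_link(link)
g.tokens["a"] = 1
g.add_actor("f", inputs=["a"], outputs=["b"])
g.add_actor("g", inputs=["b"], outputs=["c"])

g.fire("f")            # token moves a -> b
print(g.enabled("g"))  # True: g's input link now holds a token
g.fire("g")
print(g.tokens)        # {'a': 0, 'b': 0, 'c': 1}
```

Nondeterminism enters when several actors are enabled at once and a scheduler must choose, which is why the paper derives explicit liveness conditions for uninterpreted graphs.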
Analyzing differences in operational disease definitions using ontological modeling
Peelen, Linda; Klein, Michel C.A.; Schlobach, Stefan; De Keizer, Nicolette F.; Peek, Niels
2007-01-01
In medicine, there are many diseases which cannot be precisely characterized but are considered as natural kinds. In the communication between health care professionals, this is generally not problematic. In biomedical research, however, crisp definitions are required to unambiguously distinguish
Eating Competence: Definition and Evidence for the Satter Eating Competence Model
Satter, Ellyn
2007-01-01
The evidence- and practice-based Satter Eating Competence Model (ecSatter) outlines an inclusive definition of the interrelated spectrum of eating attitudes and behaviors. The model is predicated on the utility and effectiveness of biopsychosocial processes: hunger and the drive to survive, appetite and the need for subjective reward and the…
Zoning of agricultural fields is an important task for the utilization of precision farming technology. One method for the definition of zones with different levels of productivity is based on a fuzzy indicator model. A fuzzy indicator model for the identification of zones with different levels of productivit...
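A minimal sketch of fuzzy zoning, assuming triangular membership functions and hypothetical yield ranges (the abstract does not specify the indicator functions actually used):

```python
def triangular(x, a, b, c):
    """Triangular fuzzy membership function on [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical mapping from yield (t/ha) to productivity-zone membership.
zones = {
    "low":    lambda y: triangular(y, -1.0, 0.0, 4.0),
    "medium": lambda y: triangular(y, 2.0, 5.0, 8.0),
    "high":   lambda y: triangular(y, 6.0, 10.0, 15.0),
}

def classify(yield_t_ha):
    """Assign a field cell to the zone with the highest membership degree."""
    memberships = {z: f(yield_t_ha) for z, f in zones.items()}
    return max(memberships, key=memberships.get), memberships

zone, mu = classify(7.0)
print(zone, {k: round(v, 2) for k, v in mu.items()})
```

Because memberships overlap, cells near zone boundaries carry partial membership in adjacent zones rather than a hard label, which is the point of the fuzzy indicator approach.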
The definition of input parameters for modelling of energetic subsystems
Directory of Open Access Journals (Sweden)
Ptacek M.
2013-06-01
Full Text Available This paper provides a short review and a basic description of mathematical models of renewable energy sources representing the individual subsystems of a system created in Matlab/Simulink. It presents the physical and mathematical relationships of photovoltaic and wind energy sources, which are often connected to distribution networks. Fuel cell technology is much less commonly connected to distribution networks, but it could be promising in the near future. The paper therefore introduces a new dynamic model of a low-temperature fuel cell subsystem and defines its main input parameters. Finally, the main graphic results evaluated and achieved for the suggested parameters, and for all the individual subsystems mentioned above, are shown.
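First-order versions of the photovoltaic and wind subsystem relationships mentioned above can be sketched as follows. The power-curve breakpoints, panel area and efficiency are illustrative assumptions, not parameters from the paper:

```python
def wind_power_kw(v, v_cut_in=3.0, v_rated=12.0, v_cut_out=25.0, p_rated=2000.0):
    """Piecewise wind-turbine power curve (hypothetical 2 MW machine)."""
    if v < v_cut_in or v >= v_cut_out:
        return 0.0
    if v >= v_rated:
        return p_rated
    # cubic rise in wind speed between cut-in and rated speed
    return p_rated * ((v**3 - v_cut_in**3) / (v_rated**3 - v_cut_in**3))

def pv_power_kw(irradiance_w_m2, area_m2=100.0, efficiency=0.18):
    """First-order PV output: P = G * A * eta (no temperature correction)."""
    return irradiance_w_m2 * area_m2 * efficiency / 1000.0

print(wind_power_kw(8.0))   # partial load
print(wind_power_kw(15.0))  # rated output
print(pv_power_kw(800.0))   # 14.4 kW
```

Dynamic simulation models of the kind the paper builds add time-dependent behaviour (irradiance profiles, turbine inertia, fuel cell electrochemistry) on top of such static input-output relationships.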
The definition of input parameters for modelling of energetic subsystems
Ptacek, M.
2013-06-01
This paper provides a short review and a basic description of mathematical models of renewable energy sources representing the individual subsystems of a system created in Matlab/Simulink. It presents the physical and mathematical relationships of photovoltaic and wind energy sources, which are often connected to distribution networks. Fuel cell technology is much less commonly connected to distribution networks, but it could be promising in the near future. The paper therefore introduces a new dynamic model of a low-temperature fuel cell subsystem and defines its main input parameters. Finally, the main graphic results evaluated and achieved for the suggested parameters, and for all the individual subsystems mentioned above, are shown.
The Primary Care Behavioral Health (PCBH) Model: An Overview and Operational Definition.
Reiter, Jeffrey T; Dobmeyer, Anne C; Hunter, Christopher L
2018-02-26
The Primary Care Behavioral Health (PCBH) model is a prominent approach to the integration of behavioral health services into primary care settings. Implementation of the PCBH model has grown over the past two decades, yet research and training efforts have been slowed by inconsistent terminology and lack of a concise, operationalized definition of the model and its key components. This article provides the first concise operationalized definition of the PCBH model, developed from examination of multiple published resources and consultation with nationally recognized PCBH model experts. The definition frames the model as a team-based approach to managing biopsychosocial issues that present in primary care, with the over-arching goal of improving primary care in general. The article provides a description of the key components and strategies used in the model, the rationale for those strategies, a brief comparison of this model to other integration approaches, a focused summary of PCBH model outcomes, and an overview of common challenges to implementing the model.
Effect of the MCNP model definition on the computation time
International Nuclear Information System (INIS)
Šunka, Michal
2017-01-01
The presented work studies the influence of the method of defining the geometry in the MCNP transport code and its impact on the computational time, including the difficulty of preparing an input file describing the given geometry. Cases using different geometric definitions including the use of basic 2-dimensional and 3-dimensional objects and theirs combinations were studied. The results indicate that an inappropriate definition can increase the computational time by up to 59% (a more realistic case indicates 37%) for the same results and the same statistical uncertainty. (orig.)
HIV lipodystrophy case definition using artificial neural network modelling
DEFF Research Database (Denmark)
Ioannidis, John P A; Trikalinos, Thomas A; Law, Matthew
2003-01-01
OBJECTIVE: A case definition of HIV lipodystrophy has recently been developed from a combination of clinical, metabolic and imaging/body composition variables using logistic regression methods. We aimed to evaluate whether artificial neural networks could improve the diagnostic accuracy. METHODS:...
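The logistic-regression baseline that the neural network is compared against can be sketched on synthetic data. The features and their separability are invented for illustration; an ANN would replace the linear score below with one or more hidden layers:

```python
import math
import random

random.seed(3)

# Synthetic cases: three standardized features -> lipodystrophy yes/no.
# Entirely invented; constructed to be separable on a linear combination.
def make_case(label):
    base = 1.0 if label else -1.0
    return ([base + random.gauss(0, 0.5) for _ in range(3)], 1 if label else 0)

data = [make_case(i % 2 == 0) for i in range(200)]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Logistic regression fitted by stochastic gradient descent: the reference
# method in the study against which the ANN's accuracy is judged.
w = [0.0, 0.0, 0.0]
b = 0.0
for _ in range(500):
    for x, y in data:
        p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
        err = p - y
        w = [wi - 0.1 * err * xi for wi, xi in zip(w, x)]
        b -= 0.1 * err

accuracy = sum((sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b) > 0.5) == (y == 1)
               for x, y in data) / len(data)
print(f"training accuracy = {accuracy:.2f}")
```

On a linearly separable problem like this toy one, the logistic model already does well; the study's question is whether an ANN improves accuracy when the true decision boundary is not linear.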
TAME - the terrestrial-aquatic model of the environment: model definition
International Nuclear Information System (INIS)
Klos, R.A.; Mueller-Lemans, H.; Dorp, F. van; Gribi, P.
1996-10-01
TAME - the Terrestrial-Aquatic Model of the Environment is a new computer model for use in assessments of the radiological impact of the release of radionuclides to the biosphere, following their disposal in underground waste repositories. Based on regulatory requirements, the end-point of the calculations is the maximum annual individual dose to members of a hypothetical population group inhabiting the biosphere region. Additional mid- and end-points in the TAME calculations are dose as a function of time from eleven exposure pathways, foodstuff concentrations and the distribution of radionuclides in the modelled biosphere. A complete description of the mathematical representations of the biosphere in TAME is given in this document, based on a detailed review of the underlying conceptual framework for the model. Example results are used to illustrate features of the conceptual and mathematical models. The end-point of dose is shown to be robust with respect to the simplifying model assumptions used to define the biosphere for the example calculations. TAME comprises two distinct sub-models - one representing the transport of radionuclides in the near-surface environment and one for the calculation of dose to individual inhabitants of that biosphere. The former is the result of a detailed review of the modelling requirements for such applications and is based on a comprehensive consideration of all features, events and processes (FEPs) relevant to Swiss biospheres, both in the present-day biosphere and in potential future biosphere states. Representations of the transport processes are derived from first principles. Mass balance for water and solid material fluxes is used to determine the rates of contaminant transfer between components of the biosphere system. The calculation of doses is based on existing representations of exposure pathways and draws on experience both from Switzerland and elsewhere. (author) figs., tabs., refs
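Mass-balance transfer between biosphere compartments of the kind described above can be sketched as a pair of first-order rate equations integrated with explicit Euler. The rate constants below are hypothetical, not TAME parameters:

```python
# Two-compartment mass balance: soil -> water transfer, plus independent
# loss and radioactive decay terms. Rates in 1/year, inventories in Bq.
def step(soil, water, k_sw=0.05, k_loss=0.01, decay=0.02, dt=0.1):
    d_soil = -(k_sw + decay) * soil
    d_water = k_sw * soil - (k_loss + decay) * water
    return soil + d_soil * dt, water + d_water * dt

soil, water = 1.0e6, 0.0
for _ in range(int(50 / 0.1)):   # integrate 50 years with explicit Euler
    soil, water = step(soil, water)

print(f"after 50 a: soil = {soil:.3e} Bq, water = {water:.3e} Bq")
```

Chaining such compartment inventories with concentration-to-dose factors for each exposure pathway gives the kind of time-dependent dose mid-points the abstract lists.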
Smart Grid Maturity Model: SGMM Model Definition. Version 1.2
2011-09-01
taking place. This might include using radio-frequency identification (RFID) technology to link assets to an inventory database that connects GIS and...warehoused). Automation might include workers entering the data via keyboard or barcode reader at the warehouse, or something more advanced like using...RFID tags. WAM-3.7 Modeling of asset investments for key components is underway. The asset performance and management modeling is based on real smart
TAME - the terrestrial-aquatic model of the environment: model definition
Energy Technology Data Exchange (ETDEWEB)
Klos, R.A. [Paul Scherrer Inst. (PSI), Villigen (Switzerland); Mueller-Lemans, H. [Tergoso AG fuer Umweltfragen, Sargans (Switzerland); Dorp, F. van [Nationale Genossenschaft fuer die Lagerung Radioaktiver Abfaelle (NAGRA), Baden (Switzerland); Gribi, P. [Colenco AG, Baden (Switzerland)
1996-10-01
TAME - the Terrestrial-Aquatic Model of the Environment is a new computer model for use in assessments of the radiological impact of the release of radionuclides to the biosphere, following their disposal in underground waste repositories. Based on regulatory requirements, the end-point of the calculations is the maximum annual individual dose to members of a hypothetical population group inhabiting the biosphere region. Additional mid- and end-points in the TAME calculations are dose as a function of time from eleven exposure pathways, foodstuff concentrations and the distribution of radionuclides in the modelled biosphere. A complete description of the mathematical representations of the biosphere in TAME is given in this document, based on a detailed review of the underlying conceptual framework for the model. Example results are used to illustrate features of the conceptual and mathematical models. The end-point of dose is shown to be robust with respect to the simplifying model assumptions used to define the biosphere for the example calculations. TAME comprises two distinct sub-models - one representing the transport of radionuclides in the near-surface environment and one for the calculation of dose to individual inhabitants of that biosphere. The former is the result of a detailed review of the modelling requirements for such applications and is based on a comprehensive consideration of all features, events and processes (FEPs) relevant to Swiss biospheres, both in the present-day biosphere and in potential future biosphere states. Representations of the transport processes are derived from first principles. Mass balance for water and solid material fluxes is used to determine the rates of contaminant transfer between components of the biosphere system. The calculation of doses is based on existing representations of exposure pathways and draws on experience both from Switzerland and elsewhere. (author) figs., tabs., refs.
High definition geomagnetic models: A new perspective for improved wellbore positioning
DEFF Research Database (Denmark)
Maus, Stefan; Nair, Manoj C.; Poedjono, Benny
2012-01-01
for accurate well placement, the US National Geophysical Data Center (NGDC), in partnership with industry, has developed high-definition geomagnetic models (HDGM), updated regularly using the latest satellite, airborne and marine measurements of the Earth's magnetic field. Standard geomagnetic reference models....... These are compiled into a global magnetic anomaly grid and expanded into ellipsoidal harmonics. The harmonic expansion coefficients are then included in the high-definition models to accurately represent the direction and strength of the local geomagnetic field. The latest global model to degree and order 720...... facilitates the validation of MWD surveys by keeping the field acceptance criteria centered on the true downhole magnetic field. Together, these factors improve well placement, prevent and mitigate the danger of collision with existing wellbores and enable real-time steering to save rig-time and reduce...
Software tools for object-based audio production using the Audio Definition Model
Geier, Matthias; Carpentier, Thibaut; Noisternig, Markus; Warusfel, Olivier
2017-01-01
We present a publicly available set of tools for the integration of the Audio Definition Model (ADM) in production workflows. ADM is an open metadata model for the description of channel-, scene-, and object-based media within a Broadcast Wave Format (BWF) container. The software tools were developed within the European research project ORPHEUS (https://orpheus-audio.eu/), which aims at developing new end-to-end object-based media chains for broadcast. These tools allow ...
Revisiting the ‘Low BirthWeight paradox’ using a model-based definition
Sol Juárez; George B. Ploubidis; Lynda Clarke
2014-01-01
Introduction: Immigrant mothers in Spain have a lower risk of delivering Low BirthWeight (LBW) babies in comparison to Spaniards (LBW paradox). This study aimed at revisiting this finding by applying a model-based threshold as an alternative to the conventional definition of LBW. Methods: Vital information data from Madrid was used (2005–2006). LBW was defined in two ways (less than 2500 g and Wilcox's proposal). Logistic and linear regression models were run. Results: According to comm...
Asymmetric and Non–Positive Definite Distance Functions Part II: Modeling
Directory of Open Access Journals (Sweden)
H. Sánchez–Larios
2009-01-01
Full Text Available Traditionally, the distance functions involved in problems of Operations Research have been modeled using positive linear combinations of Lp metrics. Thus, the resulting distance functions are symmetric, uniform and positive definite. Starting from a new definition of arc length, we propose a method for modeling generalized distance functions, which we call premetrics, that can be asymmetric, non-uniform, and non-positive definite. We show that every distance function satisfying the triangle inequality and having a continuous one-sided directional derivative can be modeled as a problem of the calculus of variations. The "length" of a d-geodesic arc C(a,b) from a to b with respect to the premetric d (the d-length) can be negative, and therefore the d-distance from a to b may represent the minimum energy needed to move a mobile object from a to b. We illustrate our method with two examples.
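An asymmetric, possibly negative premetric of the kind described can be sketched as an "energy" cost over an elevation profile: moving uphill costs more than moving downhill, and downhill moves can even recover energy. The node elevations and cost coefficients are invented for illustration:

```python
# Asymmetric "energy" premetric on a tiny elevation profile. Uphill movement
# costs k_up per metre of rise; downhill movement has a negative coefficient,
# so a d-length can be negative (energy recovered), as the abstract describes.
ELEV = {"a": 0.0, "b": 50.0, "c": 20.0}   # hypothetical node elevations (m)

def d(p, q, k_up=2.0, k_down=-0.5, k_dist=1.0):
    """Premetric: unit horizontal effort plus a signed elevation cost."""
    rise = ELEV[q] - ELEV[p]
    grade_cost = k_up * rise if rise > 0 else k_down * (-rise)
    return k_dist * 1.0 + grade_cost   # unit horizontal distance assumed

print(d("a", "b"))  # uphill:   1 + 2.0*50 = 101.0
print(d("b", "a"))  # downhill: 1 - 0.5*50 = -24.0 (negative "length")
print(d("a", "c") <= d("a", "b") + d("b", "c"))  # triangle inequality here
```

Note that d("a", "b") != d("b", "a"): the function is a premetric, not a metric, which is exactly the generalization the paper models via the calculus of variations.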
2013-12-31
... DEPARTMENT OF ENERGY 10 CFR Parts 429 and 431 [Docket No. EERE-2011-BT-TP-0024] RIN 1904-AC46 Energy Conservation Program: Alternative Efficiency Determination Methods, Basic Model Definition, and Compliance for Commercial HVAC, Refrigeration, and WH Equipment AGENCY: Office of Energy Efficiency and...
Revisiting the 'Low BirthWeight paradox' using a model-based definition.
Juárez, Sol; Ploubidis, George B; Clarke, Lynda
2014-01-01
Immigrant mothers in Spain have a lower risk of delivering Low BirthWeight (LBW) babies in comparison to Spaniards (the LBW paradox). This study aimed at revisiting this finding by applying a model-based threshold as an alternative to the conventional definition of LBW. Vital information data from Madrid were used (2005-2006). LBW was defined in two ways (less than 2500 g and Wilcox's proposal). Logistic and linear regression models were run. According to the common definition of LBW (less than 2500 g) there is evidence to support the LBW paradox in Spain. Nevertheless, when an alternative model-based definition of LBW is used, the paradox is only clearly present in mothers from the rest of Southern America, suggesting a possible methodological bias effect. In the future, any examination of the existence of the LBW paradox should incorporate model-based definitions of LBW in order to avoid methodological bias. Copyright © 2013 SESPAS. Published by Elsevier Espana. All rights reserved.
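The contrast between a fixed cutoff and a population-specific, model-based threshold can be sketched as follows. The birthweights are synthetic, and the z-score rule is a simplification in the spirit of Wilcox's population-specific proposal, not the paper's exact method:

```python
import statistics

def lbw_fixed(weights, cutoff=2500.0):
    """Conventional definition: birthweight below a fixed 2500 g cutoff."""
    return [w < cutoff for w in weights]

def lbw_model_based(weights, z_cut=-2.0):
    """Model-based definition: below z_cut SDs of the group's OWN birthweight
    distribution (a simplification of population-specific thresholds)."""
    mu, sd = statistics.mean(weights), statistics.stdev(weights)
    return [(w - mu) / sd < z_cut for w in weights]

# Synthetic birthweights (g) for two groups with shifted distributions.
natives = [3300, 3100, 2900, 3500, 2450, 3200, 3000, 3400]
immigrants = [3600, 3400, 3200, 3800, 2700, 3500, 3300, 3700]

for name, grp in (("natives", natives), ("immigrants", immigrants)):
    fixed = sum(lbw_fixed(grp))
    model = sum(lbw_model_based(grp))
    print(f"{name}: fixed-cutoff LBW = {fixed}, model-based LBW = {model}")
```

Because the model-based rule measures each birth against its own group's distribution, a group whose whole distribution is shifted upward can look protected under the fixed cutoff yet not under the relative threshold, which is the methodological point the study makes.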
2013-06-01
the Building Smart Alliance website, http://buildingsmartalliance.org/index.php/projects/commonbimfiles/. The following steps were followed to...updated. 2.3.1 Removing nonessential elements from the models. Nonessential building interior features, such as furniture, casework, electrical
Development of a definition, classification system, and model for cultural geology
Mitchell, Lloyd W., III
The concept for this study is based upon a personal interest by the author, an American Indian, in promoting cultural perspectives in undergraduate college teaching and learning environments. Most academicians recognize that merged fields can enhance undergraduate curricula. However, conflict may occur when instructors attempt to merge social science fields such as history or philosophy with geoscience fields such as mining and geomorphology. For example, ideologies of Earth structures derived from scientific methodologies may conflict with historical and spiritual understandings of Earth structures held by American Indians. Specifically, this study addresses the problem of how to combine cultural studies with the geosciences into a new merged academic discipline called cultural geology. This study further attempts to develop the merged field of cultural geology using an approach consisting of three research foci: a definition, a classification system, and a model. Literature reviews were conducted for all three foci. Additionally, to better understand merged fields, a literature review was conducted specifically for academic fields that merged social and physical sciences. Methodologies concentrated on the three research foci: definition, classification system, and model. The definition was derived via a two-step process: developing keyword hierarchical ranking structures, followed by creating and analyzing semantic word meaning lists. The classification system was developed by reviewing 102 classification systems and incorporating selected components into a system framework. The cultural geology model was also created utilizing a two-step process: a literature review of scientific models was conducted, and then the definition and classification system were incorporated into a model felt to reflect the realm of cultural geology. A course syllabus was then developed that incorporated the resulting definition, classification system, and model. This
А. Лопатьєв; М. Пітин; А. Демічковський
2017-01-01
The objective is to systematize and adapt the basic definitions and concepts of the systems approach, mathematical modeling and information technologies to sports science. Materials and methods. The research has studied the availability of appropriate terms in shooting sports, which would meet the requirements of modern sports science. It has examined the compliance of the shooting sports training program for children and youth sports schools, the Olympic reserve specialized children and ...
Trigeminal cardiac reflex: new thinking model about the definition based on a literature review.
Meuwly, C; Golanov, E; Chowdhury, T; Erne, P; Schaller, B
2015-02-01
Trigeminocardiac reflex (TCR) is a brainstem reflex that manifests as sudden onset of hemodynamic perturbation in mean arterial blood pressure (MABP) and heart rate (HR), as apnea, and as gastric hypermotility during stimulation of any branches of the trigeminal nerve. The molecular and clinical knowledge about the TCR has been growing constantly since 1999, which implies a current need for a review of its definition in this changing context. Relevant literature was identified by searching the PubMed (MEDLINE) and Google Scholar databases for the terms TCR, oculocardiac reflex, diving reflex, and vasovagal response. The definition of the TCR varies in clinical as well as in research studies. The main difference concerns the required change in MABP, and sometimes also HR, which mostly varies between 10% and 20%. Due to this definition problem, we defined, based on the current literature, 2 major criteria (plausibility, reversibility) and 2 minor criteria (repetition, prevention) for a more proper identification of the TCR in a clinical or research setting. Latest research implies that there is a need for a more extended classification with 2 additional subgroups, considering also the diving reflex and the brainstem reflex. In this review, we highlighted criteria for proper definition and classification of the TCR in the light of increased knowledge and present a thinking model to overcome this complexity. Further, we separately discussed the role of HR and MABP and their variation in this context. Another subtopic we gave attention to is the chronic TCR, a variant that is rarely seen in clinical medicine.
Directory of Open Access Journals (Sweden)
А. Лопатьєв
2017-09-01
Full Text Available The objective is to systematize and adapt the basic definitions and concepts of the systems approach, mathematical modeling and information technologies to sports science. Materials and methods. The research has studied the availability of appropriate terms in shooting sports, which would meet the requirements of modern sports science. It has examined the compliance of the shooting sports training program for children and youth sports schools, the Olympic reserve specialized children and youth schools, schools of higher sports skills, and sports educational institutions with the modern requirements and principles. Research results. The paper suggests the basic definitions adapted to the requirements of technical sports and sports science. The research has thoroughly analyzed the shooting sports training program for children and youth sports schools, the Olympic reserve specialized children and youth schools, schools of higher sports skills, and sports educational institutions. The paper offers options to improve the training program in accordance with the modern tendencies of training athletes. Conclusions. The research suggests systematizing and adapting the basic definitions and concepts of the systems approach, mathematical modeling and information technologies using the example of technical sports.
Yilmaz, S; Jonveaux, P; Bicep, C; Pierron, L; Smaïl-Tabbone, M; Devignes, M D
2009-01-15
Computational methods are widely used to discover gene-disease relationships hidden in vast masses of available genomic and post-genomic data. In most current methods, a similarity measure is calculated between gene annotations and known disease genes or disease descriptions. However, more explicit gene-disease relationships are required for better insights into the molecular bases of diseases, especially for complex multi-gene diseases. Explicit relationships between genes and diseases are formulated as candidate gene definitions that may include intermediary genes, e.g. orthologous or interacting genes. These definitions guide data modelling in our database approach for gene-disease relationship discovery and are expressed as views which ultimately lead to the retrieval of documented sets of candidate genes. A system called ACGR (Approach for Candidate Gene Retrieval) has been implemented and tested with three case studies including a rare orphan gene disease.
Health literacy in childhood and youth: a systematic review of definitions and models
Directory of Open Access Journals (Sweden)
Janine Bröder
2017-04-01
Full Text Available Abstract Background Children and young people constitute a core target group for health literacy research and practice: during childhood and youth, fundamental cognitive, physical and emotional development processes take place and health-related behaviours and skills develop. However, there is limited knowledge and academic consensus regarding the abilities and knowledge a child or young person should possess for making sound health decisions. The research presented in this review addresses this gap by providing an overview and synthesis of current understandings of health literacy in childhood and youth. Furthermore, the authors aim to understand to what extent available models capture the unique needs and characteristics of children and young people. Method Six databases were systematically searched with relevant search terms in English and German. Of the n = 1492 publications identified, N = 1021 entered the abstract screening and N = 340 full-texts were screened for eligibility. A total of 30 articles, which defined or conceptualized generic health literacy for a target population of 18 years or younger, were selected for a four-step inductive content analysis. Results The systematic review of the literature identified 12 definitions and 21 models that have been specifically developed for children and young people. In the literature, health literacy in children and young people is described as comprising variable sets of key dimensions, each appearing as a cluster of related abilities, skills, commitments, and knowledge that enable a person to approach health information competently and effectively and to arrive at health-promoting decisions and actions. Discussion Identified definitions and models are very heterogeneous, depicting health literacy as a multidimensional, complex construct. Moreover, health literacy is conceptualized as an action competence, with a strong focus on personal attributes, while also recognising its
Health literacy in childhood and youth: a systematic review of definitions and models.
Bröder, Janine; Okan, Orkan; Bauer, Ullrich; Bruland, Dirk; Schlupp, Sandra; Bollweg, Torsten M; Saboga-Nunes, Luis; Bond, Emma; Sørensen, Kristine; Bitzer, Eva-Maria; Jordan, Susanne; Domanska, Olga; Firnges, Christiane; Carvalho, Graça S; Bittlingmayer, Uwe H; Levin-Zamir, Diane; Pelikan, Jürgen; Sahrai, Diana; Lenz, Albert; Wahl, Patricia; Thomas, Malcolm; Kessl, Fabian; Pinheiro, Paulo
2017-04-26
Children and young people constitute a core target group for health literacy research and practice: during childhood and youth, fundamental cognitive, physical and emotional development processes take place and health-related behaviours and skills develop. However, there is limited knowledge and academic consensus regarding the abilities and knowledge a child or young person should possess for making sound health decisions. The research presented in this review addresses this gap by providing an overview and synthesis of current understandings of health literacy in childhood and youth. Furthermore, the authors aim to understand to what extent available models capture the unique needs and characteristics of children and young people. Six databases were systematically searched with relevant search terms in English and German. Of the n = 1492 publications identified, N = 1021 entered the abstract screening and N = 340 full-texts were screened for eligibility. A total of 30 articles, which defined or conceptualized generic health literacy for a target population of 18 years or younger, were selected for a four-step inductive content analysis. The systematic review of the literature identified 12 definitions and 21 models that have been specifically developed for children and young people. In the literature, health literacy in children and young people is described as comprising variable sets of key dimensions, each appearing as a cluster of related abilities, skills, commitments, and knowledge that enable a person to approach health information competently and effectively and to arrive at health-promoting decisions and actions. Identified definitions and models are very heterogeneous, depicting health literacy as a multidimensional, complex construct. Moreover, health literacy is conceptualized as an action competence, with a strong focus on personal attributes, while also recognising its interrelatedness with social and contextual determinants. Life phase
Escorpizo, Reuben; Reneman, Michiel F; Ekholm, Jan; Fritz, Julie; Krupa, Terry; Marnetoft, Sven-Uno; Maroun, Claude E; Guzman, Julietta Rodriguez; Suzuki, Yoshiko; Stucki, Gerold; Chan, Chetwyn C H
2011-06-01
The International Classification of Functioning, Disability and Health (ICF) is a conceptual framework and classification system by the World Health Organization (WHO) to understand functioning. The objective of this discussion paper is to offer a conceptual definition for vocational rehabilitation (VR) based on the ICF. We presented the ICF as a model for application in VR and the rationale for the integration of the ICF. We also briefly reviewed other work disability models. Five essential elements were found towards a conceptual definition of VR: an engagement or re-engagement with work; a work continuum; health conditions or events leading to work disability; a patient-centered and evidence-based approach; and a multi-professional or multidisciplinary character. VR refers to a multi-professional approach that is provided to individuals of working age with health-related impairments, limitations, or restrictions with work functioning, and whose primary aim is to optimize work participation. We propose that the ICF and VR interface be explored further using empirical and qualitative works and by encouraging stakeholders' participation.
Exploratory analysis regarding the domain definitions for computer based analytical models
Raicu, A.; Oanta, E.; Barhalescu, M.
2017-08-01
Our previous computer-based studies dedicated to structural problems using analytical methods defined the composite cross section of a beam as a result of Boolean operations with so-called 'simple' shapes. Using generalisations, the class of 'simple' shapes was extended to include areas bounded by curves approximated using spline functions and areas approximated as polygons. However, particular definitions lead to particular solutions. In order to move beyond the current limitations, we conceived a general definition of the cross sections, which are now considered calculus domains consisting of several subdomains. The corresponding set of input data uses complex parameterizations. This new vision allows us to naturally assign a general number of attributes to the subdomains. In this way, new phenomena that use map-wise information, such as metal alloy equilibrium diagrams, may be modelled. The hierarchy of the input data text files, which use the comma-separated-value format, and their structure are also presented and discussed in the paper. This new approach allows us to reuse the concepts and part of the data processing software instruments already developed. The corresponding software, to be subsequently developed, will be modularised and generalised in order to be used in upcoming projects that require rapid development of computer-based models.
Questioning the “classical” in Persian painting: models and problems of definition
Directory of Open Access Journals (Sweden)
Christiane Gruber
2012-06-01
Full Text Available In scholarship on Persian book arts, paintings have tended to be organized according to a rise-and-fall model. Within this overarching framework, the Ilkhanid period represents the birth of painting and the Qajar era its supposed decline, while Timurid and Safavid painting mark a high point for the development of pictorial arts in Iran. As a result, scholars have used the term ‘classical’ to describe both Timurid and Safavid painting. The many definitions of ‘classical’ – which alternatively engage with aesthetic criteria, time periods, numerical output, systems of patronage, artistic models, and stylistic imitations – raise a number of significant questions, however. This study highlights the problematic uses of the term in scholarship on Persian manuscript painting. Moreover, by examining a series of interrelated Ilkhanid, Timurid, and Safavid paintings of the Prophet Muhammad in particular, it seeks to explore alternative models for studying the history of Persian manuscript painting, itself too diverse and self-referential to be confined to a linear account.
Directory of Open Access Journals (Sweden)
Raman Sood
2012-01-01
Full Text Available Hematopoiesis is a dynamic process where initiation and maintenance of hematopoietic stem cells, as well as their differentiation into erythroid, myeloid and lymphoid lineages, are tightly regulated by a network of transcription factors. Understanding the genetic controls of hematopoiesis is crucial as perturbations in hematopoiesis lead to diseases such as anemia, thrombocytopenia, or cancers, including leukemias and lymphomas. Animal models, particularly conventional and conditional knockout mice, have played major roles in our understanding of the genetic controls of hematopoiesis. However, knockout mice for most of the hematopoietic transcription factors are embryonic lethal, thus precluding the analysis of their roles during the transition from embryonic to adult hematopoiesis. Zebrafish are an ideal model organism to determine the function of a gene during embryonic-to-adult transition of hematopoiesis since bloodless zebrafish embryos can develop normally into early larval stage by obtaining oxygen through diffusion. In this review, we discuss the current status of the ontogeny and regulation of hematopoiesis in zebrafish. By providing specific examples of zebrafish morphants and mutants, we have highlighted the contributions of the zebrafish model to our overall understanding of the roles of transcription factors in regulation of primitive and definitive hematopoiesis.
Psychoacoustic and cognitive aspects of auditory roughness: definitions, models, and applications
Vassilakis, Pantelis N.; Kendall, Roger A.
2010-02-01
The term "auditory roughness" was first introduced in the 19th century to describe the buzzing, rattling auditory sensation accompanying narrow harmonic intervals (i.e. two tones with frequency difference in the range of ~15-150 Hz, presented simultaneously). A broader definition and an overview of the psychoacoustic correlates of the auditory roughness sensation, also referred to as sensory dissonance, are followed by an examination of efforts to quantify it over the past one hundred and fifty years, leading to the introduction of a new roughness calculation model and an application that automates spectral and roughness analysis of sound signals. Implementation of spectral and roughness analysis is briefly discussed in the context of two pilot perceptual experiments, designed to assess the relationship among cultural background, music performance practice, and aesthetic attitudes towards the auditory roughness sensation.
Conversion of Component-Based Point Definition to VSP Model and Higher Order Meshing
Ordaz, Irian
2011-01-01
Vehicle Sketch Pad (VSP) has become a powerful conceptual and parametric geometry tool with numerous export capabilities for third-party analysis codes, as well as robust surface meshing capabilities for computational fluid dynamics (CFD) analysis. However, a capability gap currently exists for reconstructing a fully parametric VSP model of a geometry generated by third-party software. A computer code called GEO2VSP has been developed to close this gap and to allow the integration of VSP into a closed-loop geometry design process with other third-party design tools. Furthermore, the automated CFD surface meshing capability of VSP is demonstrated for component-based point definition geometries in a conceptual analysis and design framework.
Evans, Natalie; Meñaca, Arantza; Koffman, Jonathan; Harding, Richard; Higginson, Irene J; Pool, Robert; Gysels, Marjolein
2012-07-01
Cultural competency is increasingly recommended in policy and practice to improve end-of-life (EoL) care for minority ethnic groups in multicultural societies. It is imperative to critically analyze this approach to understand its underlying concepts. Our aim was to appraise cultural competency approaches described in the British literature on EoL care and minority ethnic groups. This is a critical review. Articles on cultural competency were identified from a systematic review of the literature on minority ethnic groups and EoL care in the United Kingdom. Terms, definitions, and conceptual models of cultural competency approaches were identified and situated according to purpose, components, and origin. Content analysis of definitions and models was carried out to identify key components. One-hundred thirteen articles on minority ethnic groups and EoL care in the United Kingdom were identified. Over half (n=60) contained a term, definition, or model for cultural competency. In all, 17 terms, 17 definitions, and 8 models were identified. The most frequently used term was "culturally sensitive," though "cultural competence" was defined more often. Definitions contained one or more of the components: "cognitive," "implementation," or "outcome." Models were categorized for teaching or use in patient assessment. Approaches were predominantly of American origin. The variety of terms, definitions, and models underpinning cultural competency approaches demonstrates a lack of conceptual clarity, and potentially complicates implementation. Further research is needed to compare the use of cultural competency approaches in diverse cultures and settings, and to assess the impact of such approaches on patient outcomes.
Directory of Open Access Journals (Sweden)
Pauline Gerus
Full Text Available Neuromusculoskeletal models are a common method to estimate muscle forces. Developing accurate neuromusculoskeletal models is a challenging task due to the complexity of the system and large inter-subject variability. The estimation of muscle forces is based on the mechanical properties of the tendon-aponeurosis complex. Most neuromusculoskeletal models use a generic definition of the tendon-aponeurosis complex based on in vitro tests, perhaps limiting their validity. Ultrasonography allows subject-specific estimates of the tendon-aponeurosis complex's mechanical properties. The aim of this study was to investigate the influence of subject-specific mechanical properties of the tendon-aponeurosis complex on a neuromusculoskeletal model of the ankle joint. Seven subjects performed isometric contractions from which the tendon-aponeurosis force-strain relationship was estimated. Hopping and running tasks were performed and muscle forces were estimated using subject-specific tendon-aponeurosis and generic tendon properties. Two ultrasound probes positioned over the muscle-tendon junction and the mid-belly were combined with motion capture to estimate the in vivo tendon and aponeurosis strain of the medial head of the gastrocnemius muscle. The tendon-aponeurosis force-strain relationship was scaled for the other ankle muscles based on the tendon and aponeurosis length of each muscle measured by ultrasonography. The EMG-driven model was calibrated twice: using the generic tendon definition and a subject-specific tendon-aponeurosis force-strain definition. The use of the subject-specific tendon-aponeurosis definition leads to a higher muscle force estimate for the soleus muscle and the plantar-flexor group, and to a better model prediction of the ankle joint moment compared to the model estimate which used a generic definition. Furthermore, the subject-specific tendon-aponeurosis definition leads to a decoupling behaviour between the muscle fibre and muscle-tendon unit
A species-generalized probabilistic model-based definition of CpG islands.
Irizarry, Rafael A; Wu, Hao; Feinberg, Andrew P
2009-01-01
The DNA of most vertebrates is depleted in CpG dinucleotides, the target for DNA methylation. The remaining CpGs tend to cluster in regions referred to as CpG islands (CGI). CGI have been useful for marking functionally relevant epigenetic loci in genome studies. For example, CGI are enriched in the promoters of vertebrate genes and thought to play an important role in regulation. Currently, CGI are defined algorithmically as an observed-to-expected ratio (O/E) of CpG greater than 0.6, G+C content greater than 0.5, and usually but not necessarily greater than a certain length. Here we find that the current definition leaves out important CpG clusters associated with epigenetic marks, relevant to development and disease, and does not apply at all to nonvertebrate genomes. We propose an alternative Hidden Markov model-based approach that solves these problems. We fit our model to genomes from 30 species, and the results support a new epigenomic view toward the development of DNA methylation in species diversity and evolution. The O/E of CpG in islands and nonislands segregated closely phylogenetically and showed substantial loss in both groups in animals of greater complexity, while maintaining a nearly constant difference in CpG O/E between island and nonisland compartments. Lists of CGI for some species are available at http://www.rafalab.org.
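The "current definition" this abstract criticizes is a simple rule-based test. A minimal sketch of that classical criterion follows, using the thresholds quoted in the abstract (O/E &gt; 0.6, G+C &gt; 0.5); the function names and the 200 bp minimum length are illustrative assumptions, and the paper's HMM alternative is not reproduced here:

```python
def cpg_stats(seq: str):
    """Return (observed/expected CpG ratio, GC content) for a DNA sequence."""
    seq = seq.upper()
    n = len(seq)
    c = seq.count("C")
    g = seq.count("G")
    # Observed CpG dinucleotides (overlapping scan).
    cpg = sum(1 for i in range(n - 1) if seq[i:i + 2] == "CG")
    gc_content = (c + g) / n
    # Expected CpG count if C and G occurred independently: C*G/length.
    expected = (c * g) / n if c and g else 0.0
    oe_ratio = cpg / expected if expected else 0.0
    return oe_ratio, gc_content


def is_cgi(seq: str, min_oe: float = 0.6, min_gc: float = 0.5, min_len: int = 200) -> bool:
    """Classical algorithmic CGI call: O/E > 0.6, G+C > 0.5, minimum length."""
    oe, gc = cpg_stats(seq)
    return len(seq) >= min_len and oe > min_oe and gc > min_gc
```

In practice this rule is applied over sliding windows of a genome; the sketch only evaluates one candidate region at a time.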
A Gaussian mixture model for definition of lung tumor volumes in positron emission tomography
International Nuclear Information System (INIS)
Aristophanous, Michalis; Penney, Bill C.; Martel, Mary K.; Pelizzari, Charles A.
2007-01-01
The increased interest in 18F-fluorodeoxyglucose (FDG) positron emission tomography (PET) in radiation treatment planning in the past five years necessitated the independent and accurate segmentation of gross tumor volume (GTV) from FDG-PET scans. In some studies the radiation oncologist contours the GTV based on a computed tomography scan, while incorporating pertinent data from the PET images. Alternatively, a simple threshold, typically 40% of the maximum intensity, has been employed to differentiate tumor from normal tissue, while other researchers have developed algorithms to aid the PET based GTV definition. None of these methods, however, results in reliable PET tumor segmentation that can be used for more sophisticated treatment plans. For this reason, we developed a Gaussian mixture model (GMM) based segmentation technique on selected PET tumor regions from non-small cell lung cancer patients. The purpose of this study was to investigate the feasibility of using a GMM-based tumor volume definition in a robust, reliable and reproducible way. A GMM relies on the idea that any distribution, in our case a distribution of image intensities, can be expressed as a mixture of Gaussian densities representing different classes. According to our implementation, each class belongs to one of three regions in the image: the background (B), the uncertain (U) and the target (T), and from these regions we can obtain the tumor volume. User interaction in the implementation is required, but is limited to the initialization of the model parameters and the selection of an "analysis region" to which the modeling is restricted. The segmentation was developed on three and tested on another four clinical cases to ensure robustness against differences observed in the clinic. It also compared favorably with thresholding at 40% of the maximum intensity and a threshold determination function based on tumor to background image intensities proposed in a recent paper. The parts of the
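As a hedged illustration of the technique named in this abstract, the following fits a three-component 1-D Gaussian mixture to an intensity sample with plain EM; mapping the components to background (B), uncertain (U) and target (T) is the abstract's idea, but all function names, the quantile initialization and the hard-assignment segmentation step are assumptions, not the authors' implementation:

```python
import numpy as np


def fit_gmm_1d(x, n_components=3, n_iter=100):
    """Fit a 1-D Gaussian mixture with expectation-maximization.

    Returns (weights, means, stds). With n_components=3 the classes can be
    read as the B/U/T intensity regions described in the abstract.
    """
    x = np.asarray(x, dtype=float)
    # Initialise means on quantiles, equal weights, pooled standard deviation.
    means = np.quantile(x, np.linspace(0.1, 0.9, n_components))
    stds = np.full(n_components, x.std() + 1e-9)
    weights = np.full(n_components, 1.0 / n_components)
    for _ in range(n_iter):
        # E-step: responsibility of each component for each sample.
        dens = weights * np.exp(-0.5 * ((x[:, None] - means) / stds) ** 2) \
            / (stds * np.sqrt(2.0 * np.pi))
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means and stds from responsibilities.
        nk = resp.sum(axis=0)
        weights = nk / len(x)
        means = (resp * x[:, None]).sum(axis=0) / nk
        stds = np.sqrt((resp * (x[:, None] - means) ** 2).sum(axis=0) / nk) + 1e-9
    return weights, means, stds


def segment(x, means, stds, weights):
    """Hard-assign each intensity to its most responsible mixture component."""
    x = np.asarray(x, dtype=float)
    dens = weights * np.exp(-0.5 * ((x[:, None] - means) / stds) ** 2) / stds
    return dens.argmax(axis=1)
```

The component with the highest mean would play the role of the target (T) region; a real PET workflow would of course operate on voxel intensities inside the user-selected analysis region.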
Van Buren, Kendra L.; Ouisse, Morvan; Cogan, Scott; Sadoulet-Reboul, Emeline; Maxit, Laurent
2017-09-01
In the development of numerical models, uncertainty quantification (UQ) can inform appropriate allocation of computational resources, often resulting in efficient analysis for activities such as model calibration and robust design. UQ can be especially beneficial for numerical models with significant computational expense, such as coupled models, which require several subsystem models to attain the performance of a more complex, inter-connected system. In the coupled model paradigm, UQ can be applied at either the subsystem model level or the coupled model level. When applied at the subsystem level, UQ is applied directly to the physical input parameters, which can be computationally expensive. In contrast, UQ at the coupled level may not be representative of the physical input parameters, but comes at the benefit of being computationally efficient to implement. To be physically meaningful, analysis at the coupled level requires information about how uncertainty is propagated through from the subsystem level. Herein, the proposed strategy is based on simulations performed at the subsystem level to inform a covariance matrix for UQ performed at the coupled level. The approach is applied to a four-subsystem model of mid-frequency vibrations simulated using the Statistical Modal Energy Distribution Analysis, a variant of the Statistical Energy Analysis. The proposed approach is computationally efficient to implement, while simultaneously capturing information from the subsystem level to ensure the analysis is physically meaningful.
Jensen, Morten B; Guldberg, Trine L; Harbøll, Anja; Lukacova, Slávka; Kallehauge, Jesper F
2017-11-01
The clinical target volume (CTV) in radiotherapy is routinely based on gadolinium contrast-enhanced T1-weighted (T1w + Gd) and T2-weighted fluid-attenuated inversion recovery (T2w FLAIR) magnetic resonance imaging (MRI) sequences, which have been shown to over- or underestimate the microscopic tumor cell spread. Gliomas favor spread along the white matter fiber tracts. Tumor growth models incorporating the MRI diffusion tensors (DTI) make it possible to account more consistently for glioma growth. The aim of the study was to investigate the potential of a DTI-driven growth model to improve target definition in glioblastoma (GBM). Eleven GBM patients were scanned using T1w, T2w FLAIR, T1w + Gd and DTI. The brain was segmented into white matter, gray matter and cerebrospinal fluid. The Fisher-Kolmogorov growth model was used, assuming uniform proliferation and a white-to-gray-matter diffusion ratio of 10. The tensor directionality was tested using an anisotropy weighting parameter set to zero (γ0) and twenty (γ20). The volumetric comparison was performed using Hausdorff distance, Dice similarity coefficient (DSC) and surface area. The median of the standard CTV (CTVstandard) was 180 cm³. The median surface area of CTVstandard was 211 cm². The median surface areas of the respective CTV γ0 and CTV γ20 significantly increased to 338 and 376 cm², respectively. The Hausdorff distance was greater than zero and significantly increased for both CTV γ0 and CTV γ20, with respective medians of 18.7 and 25.2 mm. The DSC for both CTV γ0 and CTV γ20 were significantly below one, with respective medians of 0.74 and 0.72, which means that 74 and 72% of CTVstandard were included in CTV γ0 and CTV γ20, respectively. DTI-driven growth models result in CTVs with a significantly increased surface area, a significantly increased Hausdorff distance and decreased overlap between the standard and model-derived volume.
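The Fisher-Kolmogorov model used in this study is commonly written as a reaction-diffusion equation; a generic form (notation assumed for illustration, not taken verbatim from the paper) is:

```latex
\frac{\partial c}{\partial t} \;=\; \nabla \cdot \big( \mathbf{D}(\mathbf{x})\, \nabla c \big) \;+\; \rho\, c\, (1 - c)
```

Here c is the normalized tumor cell density, ρ the proliferation rate (assumed uniform, per the abstract) and D(x) the local diffusion tensor, taken to be ten times larger in white matter than in gray matter and directionally weighted from the DTI data via the anisotropy parameter γ (set to 0 or 20 in the study).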
A process-based model for the definition of hydrological alert systems in landslide risk mitigation
Directory of Open Access Journals (Sweden)
M. Floris
2012-11-01
Full Text Available The definition of hydrological alert systems for rainfall-induced landslides is strongly related to a deep knowledge of the geological and geomorphological features of the territory. Climatic conditions, spatial and temporal evolution of the phenomena and characterization of landslide triggering, together with propagation mechanisms, are the key elements to be considered. Critical steps for the development of the systems consist of the identification of the hydrological variable related to landslide triggering and of the minimum rainfall threshold for landslide occurrence.
In this paper we report the results from a process-based model to define a hydrological alert system for the Val di Maso Landslide, located in the northeastern Italian Alps and included in the Vicenza Province (Veneto region, NE Italy). The instability occurred in November 2010, due to an exceptional rainfall event that hit the Vicenza Province and the entire NE Italy. Up to 500 mm of 3-day cumulative rainfall generated large flood conditions and triggered hundreds of landslides. During the flood, the Soil Protection Division of the Vicenza Province received more than 500 warnings of instability phenomena. The complexity of the event and the high level of risk to infrastructure and private buildings are the main reasons for deepening the analysis of the specific phenomenon that occurred at Val di Maso.
Empirical and physically-based models have been used to identify the minimum rainfall threshold for the occurrence of instability phenomena in the crown area of Val di Maso landslide, where a retrogressive evolution by multiple rotational slides is expected. Empirical models helped in the identification and in the evaluation of recurrence of critical rainfall events, while physically-based modelling was essential to verify the effects on the slope stability of determined rainfall depths. Empirical relationships between rainfall and landslide consist of the calculation of rainfall
A process-based model for the definition of hydrological alert systems in landslide risk mitigation
Floris, M.; D'Alpaos, A.; De Agostini, A.; Stevan, G.; Tessari, G.; Genevois, R.
2012-11-01
The definition of hydrological alert systems for rainfall-induced landslides is strongly related to a deep knowledge of the geological and geomorphological features of the territory. Climatic conditions, spatial and temporal evolution of the phenomena and characterization of landslide triggering, together with propagation mechanisms, are the key elements to be considered. Critical steps for the development of the systems consist of the identification of the hydrological variable related to landslide triggering and of the minimum rainfall threshold for landslide occurrence. In this paper we report the results from a process-based model to define a hydrological alert system for the Val di Maso Landslide, located in the northeastern Italian Alps and included in the Vicenza Province (Veneto region, NE Italy). The instability occurred in November 2010, due to an exceptional rainfall event that hit the Vicenza Province and the entire NE Italy. Up to 500 mm of 3-day cumulated rainfall generated large flood conditions and triggered hundreds of landslides. During the flood, the Soil Protection Division of the Vicenza Province received more than 500 warnings of instability phenomena. The complexity of the event and the high level of risk to infrastructure and private buildings are the main reasons for deepening the study of the specific phenomenon that occurred at Val di Maso. Empirical and physically-based models have been used to identify the minimum rainfall threshold for the occurrence of instability phenomena in the crown area of Val di Maso landslide, where a retrogressive evolution by multiple rotational slides is expected. Empirical models helped in the identification and in the evaluation of recurrence of critical rainfall events, while physically-based modelling was essential to verify the effects on the slope stability of determined rainfall depths. Empirical relationships between rainfall and landslide consist of the calculation of rainfall Depth-Duration-Frequency (DDF) curves
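The empirical side of such an alert system often reduces to comparing an event's mean intensity against a rainfall intensity-duration threshold. A minimal sketch follows; the α and β values are Caine's (1980) global fit, used only as placeholders, whereas the paper derives site-specific thresholds from local DDF curves.

```python
def exceeds_threshold(duration_h, cumulated_mm, alpha=14.82, beta=0.39):
    """Check a cumulated rainfall event against an empirical
    intensity-duration triggering threshold I = alpha * D**(-beta).
    alpha/beta default to Caine's global values (placeholders only;
    a site-specific threshold would be fitted from local DDF data)."""
    intensity = cumulated_mm / duration_h            # mean intensity, mm/h
    return intensity >= alpha * duration_h ** (-beta)

# The November 2010 event: up to 500 mm in 3 days (72 h),
# ~6.9 mm/h against a threshold of ~2.8 mm/h at that duration.
triggering = exceeds_threshold(72, 500)
```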
DEFF Research Database (Denmark)
Cameron, Ian; Gani, Rafiqul
2011-01-01
This chapter deals with the practicalities of building, testing, deploying and maintaining models. It gives specific advice for each phase of the modelling cycle. To do this, a modelling framework is introduced which covers: problem and model definition; model conceptualization; model data...... requirements; model construction; model solution; model verification; model validation and finally model deployment and maintenance. Within the adopted methodology, each step is discussed through the consideration of key issues and questions relevant to the modelling activity. Practical advice, based on many...... years of experience, is provided to direct the reader in their activities. Traps and pitfalls are discussed and strategies are also given to improve model development towards “fit-for-purpose” models. The emphasis in this chapter is the adoption and exercise of a modelling methodology that has proven very...
Definition of an Object-Oriented Modeling Language for Enterprise Architecture
Lê, Lam Son; Wegmann, Alain
2005-01-01
In enterprise architecture, the goal is to integrate business resources and IT resources in order to improve an enterprise's competitiveness. In an enterprise architecture project, the development team usually constructs a model that represents the enterprise: the enterprise model. In this paper, we present a modeling language for building such enterprise models. Our enterprise models are hierarchical object-oriented representations of the enterprise. This paper presents the foundations of o...
Gäbler, E
1977-12-01
Cooperativity in protein-ligand binding cannot yet be treated as a unitary phenomenon (neither in investigating nor in modelling it), although it may well rest upon only a few general principles. Knowledge of the atomic details of protein-ligand interactions is still limited. Instead of a physico-chemical explanation and definition of cooperativity, a series of schematic concepts has been established throughout the literature. Some dominating or merely observable features are isolated and intermixed with plausibly invented mechanistic details. These cooperativity concepts are put together and partly generalized here. From comparing these concepts with the state of binding theory on the one hand and with the nature of measured binding data on the other, some problems of understanding cooperative effects and of detecting, measuring and identifying them are pointed out. For special cases, suitable criteria and measures are recommended, and graphic and mathematical techniques are discussed. The limited significance and generality of current cooperativity terms is emphasized. Several different levels of understanding have to be distinguished.
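The most common of the schematic measures the abstract alludes to is the Hill coefficient, read off as the slope of the Hill plot. A sketch of the idea (the Hill model is itself one such empirical concept, not a mechanistic definition):

```python
import math

def hill(L, K, n):
    """Fractional saturation Y at free ligand concentration L under the
    empirical Hill model; n > 1 signals positive cooperativity, n = 1
    is the non-cooperative (hyperbolic) limit."""
    return L**n / (K**n + L**n)

def hill_coefficient(L1, Y1, L2, Y2):
    """Slope of the Hill plot log(Y/(1-Y)) vs log(L) between two
    measured points: a standard graphic criterion for cooperativity."""
    logit = lambda y: math.log(y / (1.0 - y))
    return (logit(Y2) - logit(Y1)) / (math.log(L2) - math.log(L1))

# For data generated by the Hill model itself the plot is exactly
# linear, so the two-point slope recovers n:
n_est = hill_coefficient(1.0, hill(1.0, 1.0, 2.8), 2.0, hill(2.0, 1.0, 2.8))
```

For real binding data the plot is generally curved, which is precisely why the paper cautions that such measures have limited significance and generality.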
Zifan, Ali; Ledgerwood-Lee, Melissa; Mittal, Ravinder K
2016-12-01
Three-dimensional high-definition anorectal manometry (3D-HDAM) is used to assess anal sphincter function; it determines profiles of regional pressure distribution along the length and circumference of the anal canal. There is no consensus, however, on the best way to analyze data from 3D-HDAM to distinguish healthy individuals from persons with sphincter dysfunction. We developed a computer analysis system to analyze 3D-HDAM data and to aid in the diagnosis and assessment of patients with fecal incontinence (FI). In a prospective study, we performed 3D-HDAM analysis of 24 asymptomatic healthy subjects (control subjects; all women; mean age, 39 ± 10 years) and 24 patients with symptoms of FI (all women; mean age, 58 ± 13 years). Patients completed a standardized questionnaire (FI severity index) to score the severity of FI symptoms. We developed and evaluated a robust prediction model to distinguish patients with FI from control subjects using linear discriminant, quadratic discriminant, and logistic regression analyses. In addition to collecting pressure information from the HDAM data, we assessed regional features based on shape characteristics and the anal sphincter pressure symmetry index. The combination of pressure values, anal sphincter area, and reflective symmetry values was identified in patients with FI versus control subjects with an area under the curve value of 1.0. In logistic regression analyses using different predictors, the model identified patients with FI with an area under the curve value of 0.96 (interquartile range, 0.22). In discriminant analysis, results were classified with a minimum error of 0.02, calculated using 10-fold cross-validation; different combinations of predictors produced median classification errors of 0.16 in linear discriminant analysis (interquartile range, 0.25) and 0.08 in quadratic discriminant analysis (interquartile range, 0.25). We developed and validated a novel prediction model to analyze 3D-HDAM data. This
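The classification step above (predictors fed to discriminant or logistic models, scored by 10-fold cross-validation) can be sketched with a crude stand-in classifier. The data below are synthetic, mimicking only the study's shape (48 subjects, 3 feature groups); the nearest-centroid rule is a simplification of the discriminant analyses, not the authors' pipeline.

```python
import numpy as np

def kfold_error(X, y, k=10, seed=0):
    """k-fold cross-validated classification error of a nearest-centroid
    rule (a crude stand-in for linear/quadratic discriminant analysis)."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    folds = np.array_split(idx, k)
    errs = []
    for f in folds:
        train = np.setdiff1d(idx, f)
        mu0 = X[train][y[train] == 0].mean(axis=0)   # control centroid
        mu1 = X[train][y[train] == 1].mean(axis=0)   # FI centroid
        pred = (np.linalg.norm(X[f] - mu1, axis=1)
                < np.linalg.norm(X[f] - mu0, axis=1)).astype(int)
        errs.append(np.mean(pred != y[f]))
    return float(np.median(errs))

# Synthetic stand-in for 24 controls + 24 patients with 3 features
# (pressure, sphincter area, symmetry index) -- NOT the study's data.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 1.0, (24, 3)), rng.normal(2.5, 1.0, (24, 3))])
y = np.array([0] * 24 + [1] * 24)
err = kfold_error(X, y)
```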
Definition of Saturn's magnetospheric model parameters for the Pioneer 11 flyby
Directory of Open Access Journals (Sweden)
E. S. Belenkaya
2006-05-01
Full Text Available This paper presents a description of a method for selecting parameters for a global paraboloid model of Saturn's magnetosphere. The model is based on the preexisting paraboloid terrestrial and Jovian models of the magnetospheric field. Interaction of the solar wind with the magnetosphere, i.e. the magnetotail current system, and the magnetopause currents screening all magnetospheric field sources, is taken into account. The input model parameters are determined from observations of the Pioneer 11 inbound flyby.
Solar Sail Models and Test Measurements Correspondence for Validation Requirements Definition
Ewing, Anthony; Adams, Charles
2004-01-01
Solar sails are being developed as a mission-enabling technology in support of future NASA science missions. Current efforts have advanced solar sail technology sufficient to justify a flight validation program. A primary objective of this activity is to test and validate solar sail models that are currently under development so that they may be used with confidence in future science mission development (e.g., scalable to larger sails). Both system and model validation requirements must be defined early in the program to guide design cycles and to ensure that relevant and sufficient test data will be obtained to conduct model validation to the level required. A process of model identification, model input/output documentation, model sensitivity analyses, and test measurement correspondence is required so that decisions can be made to satisfy validation requirements within program constraints.
Trainer, Asa; Hedberg, Thomas; Feeney, Allison Barnard; Fischer, Kevin; Rosche, Phil
2016-01-01
Advances in information technology triggered a digital revolution that holds promise of reduced costs, improved productivity, and higher quality. To ride this wave of innovation, manufacturing enterprises are changing how product definitions are communicated - from paper to models. To achieve industry's vision of the Model-Based Enterprise (MBE), the MBE strategy must include model-based data interoperability from design to manufacturing and quality in the supply chain. The Model-Based Definition (MBD) is created by the original equipment manufacturer (OEM) using Computer-Aided Design (CAD) tools. This information is then shared with the supplier so that they can manufacture and inspect the physical parts. Today, suppliers predominantly use Computer-Aided Manufacturing (CAM) and Coordinate Measuring Machine (CMM) models for these tasks. Traditionally, the OEM has provided design data to the supplier in the form of two-dimensional (2D) drawings, but may also include a three-dimensional (3D)-shape-geometry model, often in a standards-based format such as ISO 10303-203:2011 (STEP AP203). The supplier then creates the respective CAM and CMM models and machine programs to produce and inspect the parts. In the MBE vision for model-based data exchange, the CAD model must include product-and-manufacturing information (PMI) in addition to the shape geometry. Today's CAD tools can generate models with embedded PMI. And, with the emergence of STEP AP242, a standards-based model with embedded PMI can now be shared downstream. The on-going research detailed in this paper seeks to investigate three concepts. First, that the ability to utilize a STEP AP242 model with embedded PMI for CAD-to-CAM and CAD-to-CMM data exchange is possible and valuable to the overall goal of a more efficient process. Second, the research identifies gaps in tools, standards, and processes that inhibit industry's ability to cost-effectively achieve model-based-data interoperability in the pursuit of the
Laffan, Shawn W; Wang, Zhaoyuan; Ward, Michael P
2011-12-01
The definition of the spatial relatedness between infectious and susceptible animal groups is a fundamental component of spatio-temporal modelling of disease outbreaks. A common neighbourhood definition for disease spread in wild and feral animal populations is the distance between the centroids of neighbouring group home ranges. This distance can be used to define neighbourhood interactions, and also to describe the probability of successful disease transmission. Key limitations of this approach are (1) that a susceptible neighbour of an infectious group with an overlapping home range - but whose centroid lies outside the home range of an infectious group - will not be considered for disease transmission, and (2) the degree of overlap between the home ranges is not taken into account for those groups with centroids inside the infectious home range. We assessed the impact of both distance-based and range overlap methods of disease transmission on model-predicted disease spread. Range overlap was calculated using home ranges modelled as circles. We used the Sirca geographic automata model, with the population data from a nine-county study area in Texas that we have previously described. For each method we applied 100 model repetitions, each of 100 time steps, to 30 index locations. The results show that the rate of disease spread for the range-overlap method is clearly less than the distance-based method, with median outbreaks modelled using the latter being 1.4-1.45 times larger. However, the two methods show similar overall trends in the area infected, and the range-overlap median (48 and 120 for cattle and pigs, respectively) falls within the 5th-95th percentile range of the distance-based method (0-96 and 0-252 for cattle and pigs, respectively). These differences can be attributed to the calculation of the interaction probabilities in the two methods, with overlap weights generally resulting in lower interaction probabilities. The definition of spatial
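The two neighbourhood definitions compared above reduce to simple circle geometry. A minimal sketch (coordinates and radii below are invented to show the case the abstract highlights: overlapping ranges whose centroids lie outside each other):

```python
import math

def centroid_neighbour(c1, c2, r_infectious):
    """Distance-based rule: a susceptible group is a neighbour only if
    its centroid lies inside the infectious group's home range."""
    return math.dist(c1, c2) <= r_infectious

def overlap_neighbour(c1, r1, c2, r2):
    """Range-overlap rule: neighbours whenever the two circular home
    ranges intersect, even if neither centroid is inside the other."""
    return math.dist(c1, c2) <= r1 + r2

# A susceptible group whose home range overlaps the infectious one
# but whose centroid lies outside it: missed by the centroid rule,
# captured by the overlap rule.
by_centroid = centroid_neighbour((0, 0), (7, 0), r_infectious=5)  # False
by_overlap = overlap_neighbour((0, 0), 5, (7, 0), 4)              # True
```

The degree of overlap (rather than this binary test) is what the paper uses to weight interaction probabilities, which is why the overlap method spreads disease more slowly.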
Torian, J. G.
1977-01-01
Consumables models required for the mission planning and scheduling function are formulated. The relation of the models to prelaunch, onboard, ground support, and postmission functions for the space transportation systems is established. An analytical model consisting of an orbiter planning processor with a consumables data base is developed. A method of recognizing potential constraint violations in both the planning and flight operations functions is presented, together with a flight data file for storage and retrieval of information over an extended period, which interfaces with a flight operations processor for monitoring of the actual flights.
Lin, Tony; Erfan, Sasan
2016-01-01
Mathematical modeling is an open-ended research subject where no definite answers exist for any problem. Math modeling enables thinking outside the box to connect different fields of studies together including statistics, algebra, calculus, matrices, programming and scientific writing. As an integral part of society, it is the foundation for many…
Model integration and a theory of models
Dolk, Daniel R.; Kottemann, Jeffrey E.
1993-01-01
Model integration extends the scope of model management to include the dimension of manipulation as well. This invariably leads to comparisons with database theory. Model integration is viewed from four perspectives: Organizational, definitional, procedural, and implementational. Strategic modeling is discussed as the organizational motivation for model integration. Schema and process integration are examined as the logical and manipulation counterparts of model integr...
Transport spatial model for the definition of green routes for city logistics centers
International Nuclear Information System (INIS)
Pamučar, Dragan; Gigović, Ljubomir; Ćirović, Goran; Regodić, Miodrag
2016-01-01
This paper presents a transport spatial decision support model (TSDSM) for carrying out the optimization of green routes for city logistics centers. The TSDSM model is based on the integration of the multi-criteria method of Weighted Linear Combination (WLC) and the modified Dijkstra algorithm within a geographic information system (GIS). The GIS is used for processing spatial data. The proposed model makes it possible to plan routes for green vehicles and maximize the positive effects on the environment, which can be seen in the reduction of harmful gas emissions and an increase in the air quality in highly populated areas. The scheduling of delivery vehicles is given as a problem of optimization in terms of the parameters of: the environment, health, use of space and logistics operating costs. Each of these input parameters was thoroughly examined and broken down in the GIS into criteria which further describe them. The model presented here takes into account the fact that logistics operators have a limited number of environmentally friendly (green) vehicles available. The TSDSM was tested on a network of roads with 127 links for the delivery of goods from the city logistics center to the user. The model supports any number of available environmentally friendly or environmentally unfriendly vehicles consistent with the size of the network and the transportation requirements. - Highlights: • Model for routing light delivery vehicles in urban areas. • Optimization of green routes for city logistics centers. • The proposed model maximizes the positive effects on the environment. • The model was tested on a real network.
Transport spatial model for the definition of green routes for city logistics centers
Energy Technology Data Exchange (ETDEWEB)
Pamučar, Dragan, E-mail: dpamucar@gmail.com [University of Defence in Belgrade, Department of Logistics, Pavla Jurisica Sturma 33, 11000 Belgrade (Serbia); Gigović, Ljubomir, E-mail: gigoviclj@gmail.com [University of Defence in Belgrade, Department of Mathematics, Pavla Jurisica Sturma 33, 11000 Belgrade (Serbia); Ćirović, Goran, E-mail: cirovic@sezampro.rs [College of Civil Engineering and Geodesy, The Belgrade University, Hajduk Stankova 2, 11000 Belgrade (Serbia); Regodić, Miodrag, E-mail: mregodic62@gmail.com [University of Defence in Belgrade, Department of Mathematics, Pavla Jurisica Sturma 33, 11000 Belgrade (Serbia)
2016-01-15
This paper presents a transport spatial decision support model (TSDSM) for carrying out the optimization of green routes for city logistics centers. The TSDSM model is based on the integration of the multi-criteria method of Weighted Linear Combination (WLC) and the modified Dijkstra algorithm within a geographic information system (GIS). The GIS is used for processing spatial data. The proposed model makes it possible to plan routes for green vehicles and maximize the positive effects on the environment, which can be seen in the reduction of harmful gas emissions and an increase in the air quality in highly populated areas. The scheduling of delivery vehicles is given as a problem of optimization in terms of the parameters of: the environment, health, use of space and logistics operating costs. Each of these input parameters was thoroughly examined and broken down in the GIS into criteria which further describe them. The model presented here takes into account the fact that logistics operators have a limited number of environmentally friendly (green) vehicles available. The TSDSM was tested on a network of roads with 127 links for the delivery of goods from the city logistics center to the user. The model supports any number of available environmentally friendly or environmentally unfriendly vehicles consistent with the size of the network and the transportation requirements. - Highlights: • Model for routing light delivery vehicles in urban areas. • Optimization of green routes for city logistics centers. • The proposed model maximizes the positive effects on the environment. • The model was tested on a real network.
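The routing core of the TSDSM (multi-criteria WLC scores used as link costs in a Dijkstra search) can be sketched as follows. The network, criteria values and weights below are invented for illustration, and plain Dijkstra is used in place of the paper's modified algorithm.

```python
import heapq

def wlc_weight(criteria, weights):
    """Weighted Linear Combination of normalised link criteria
    (e.g. emissions, health impact, land use, operating cost)."""
    return sum(w * c for w, c in zip(weights, criteria))

def dijkstra(graph, src, dst):
    """Shortest path over links whose costs are WLC scores."""
    dist, prev = {src: 0.0}, {}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue                      # stale queue entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path, node = [dst], dst               # walk predecessors back to src
    while node != src:
        node = prev[node]
        path.append(node)
    return path[::-1], dist[dst]

# Toy network from a logistics center "LC" to a user "U"; link costs
# are WLC scores of hypothetical (emissions, cost) criteria.
w = (0.7, 0.3)                            # assumed criterion weights
g = {"LC": [("A", wlc_weight((0.2, 0.5), w)), ("B", wlc_weight((0.9, 0.1), w))],
     "A":  [("U", wlc_weight((0.3, 0.4), w))],
     "B":  [("U", wlc_weight((0.1, 0.2), w))]}
path, cost = dijkstra(g, "LC", "U")
```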
Kuhn, Thomas; Cunze, Sarah; Kochmann, Judith; Klimpel, Sven
2016-08-01
Marine nematodes of the genus Anisakis are common parasites of a wide range of aquatic organisms. Public interest is primarily based on their importance as zoonotic agents of human Anisakiasis, a severe infection of the gastro-intestinal tract resulting from consuming live larvae in insufficiently cooked fish dishes. The diverse nature of external impacts unequally influencing larval and adult stages of marine endohelminth parasites requires the consideration of both abiotic and biotic factors. Whereas abiotic factors are generally more relevant for early life stages and might also be linked to intermediate hosts, definitive hosts are indispensable for a parasite’s reproduction. In order to better understand the uneven occurrence of parasites in fish species, we here use the maximum entropy approach (Maxent) to model the habitat suitability for nine Anisakis species, accounting for abiotic parameters as well as biotic data (definitive hosts). The modelled habitat suitability reflects the observed distribution quite well for all Anisakis species; however, in some cases, habitat suitability exceeded the known geographical distribution, suggesting a wider distribution than presently recorded. We suggest that integrative modelling combining abiotic and biotic parameters is a valid approach for habitat suitability assessments of Anisakis, and potentially other marine parasite species.
Sidorov, Vladimir P.; Melzitdinova, Anna V.
2017-10-01
This paper presents methods for determining thermal constants from data on the weld width under a normal-circular heat source. The method is based on isoline contouring of “effective power – temperature conductivity coefficient”. Defining these coefficients makes it possible to set requirements on the precision with which welding parameters must be maintained, with sufficient accuracy for engineering practice.
Kaneko, D.
2017-12-01
Climate change initiates abnormal meteorological disasters. Drought causes climate instability, thus producing poor harvests because of low rates of photosynthesis and sterile pollination. This research evaluates drought indices with regard to precipitation and includes this data in global geophysical crop models that account for evaporation, stomata opening, advection effects from sea surface temperature anomalies, photosynthesis, carbon partitioning, crop yields, and crop production. The standard precipitation index (SPI) is a useful tool because it is based on a variable not used in the stomata model. However, SPI is not an adequate tool for drought in irrigated fields. Contrary to expectations, global comparisons of spatial characteristics between stomata opening/evapotranspiration and SPI for monitoring continental crop extremes revealed serious defects and obvious differences between evapotranspiration and the small stomata-opening phenomena. The reason for this is that SPI does not include surface air temperature in its analysis. The Penman equation (Epen) describes potential evaporation better than SPI for recent hot droughts caused by climate change. However, the distribution of precipitation is a necessary condition for crop monitoring because it affirms the trend of the dry results computed by crop models. Consequently, the author uses global precipitation data observed by passive microwave sensors on the TRMM and GCOM-W satellites. This remote sensing data conveniently supplies spatial distributions of global and seasonal precipitation. The author has designed a model to measure the effects of drought on crop yield and the degree of stomata closure related to the photosynthesis rate. To determine yield effects, the drought injury function is defined by integrating stomata closure during the two seasons from flowering to pollination. The stomata closure, defined by the ratio between Epen and Eac, reflects the effects of drought and irrigation. The stomata-closure model includes the
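The SPI criticized above standardises a precipitation total against its historical record. A deliberately simplified sketch follows: the operational SPI first fits a gamma distribution and maps it to a standard normal, whereas a plain z-score is used here for brevity, and the rainfall record is invented.

```python
import statistics

def spi_zscore(monthly_precip_mm, current_mm):
    """Simplified SPI: standardise the current precipitation total
    against the historical record. (The operational SPI fits a gamma
    distribution first; a plain z-score is used here for brevity.)
    Values at or below about -1.5 are commonly read as severe drought."""
    mu = statistics.mean(monthly_precip_mm)
    sigma = statistics.stdev(monthly_precip_mm)
    return (current_mm - mu) / sigma

# Hypothetical 10-year record for one month, then a very dry year:
history = [80, 95, 70, 110, 90, 85, 100, 75, 92, 88]
spi_dry = spi_zscore(history, 40)
```

Note how the index sees only precipitation: a hot, high-evaporation month with average rainfall scores near zero, which is exactly the weakness the abstract attributes to SPI.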
A topo-graph model for indistinct target boundary definition from anatomical images.
Cui, Hui; Wang, Xiuying; Zhou, Jianlong; Gong, Guanzhong; Eberl, Stefan; Yin, Yong; Wang, Lisheng; Feng, Dagan; Fulham, Michael
2018-06-01
It can be challenging to delineate the target object in anatomical imaging when the object boundaries are difficult to discern due to the low contrast or overlapping intensity distributions from adjacent tissues. We propose a topo-graph model to address this issue. The first step is to extract a topographic representation that reflects multiple levels of topographic information in an input image. We then define two types of node connections - nesting branches (NBs) and geodesic edges (GEs). NBs connect nodes corresponding to initial topographic regions and GEs link the nodes at a detailed level. The weights for NBs are defined to measure the similarity of regional appearance, and weights for GEs are defined with geodesic and local constraints. NBs contribute to the separation of topographic regions and the GEs assist the delineation of uncertain boundaries. Final segmentation is achieved by calculating the relevance of the unlabeled nodes to the labels by the optimization of a graph-based energy function. We test our model on 47 low contrast CT studies of patients with non-small cell lung cancer (NSCLC), 10 contrast-enhanced CT liver cases and 50 breast and abdominal ultrasound images. The validation criteria are the Dice similarity coefficient and the Hausdorff distance. Student's t-tests show that our model outperformed the graph models with pixel-only, pixel and regional, neighboring and radial connections (p-values <0.05). Our findings show that the topographic representation and topo-graph model provides improved delineation and separation of objects from adjacent tissues compared to the tested models. Copyright © 2018 Elsevier B.V. All rights reserved.
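The two validation criteria used above (and in the glioma CTV study earlier in this section) are standard and easy to state in code. A minimal sketch on toy binary masks; the brute-force Hausdorff is fine for small point sets.

```python
import numpy as np

def dice(a, b):
    """Dice similarity coefficient between two binary masks."""
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

def hausdorff(p, q):
    """Symmetric Hausdorff distance between two point sets
    (brute force; adequate for small masks)."""
    d = np.linalg.norm(p[:, None, :] - q[None, :, :], axis=-1)
    return float(max(d.min(axis=1).max(), d.min(axis=0).max()))

# Toy delineations: a reference square mask and a copy shifted by (1, 1)
a = np.zeros((10, 10), bool); a[2:8, 2:8] = True
b = np.zeros((10, 10), bool); b[3:9, 3:9] = True
dsc = dice(a, b)                                   # 50/72, about 0.694
hd = hausdorff(np.argwhere(a).astype(float),
               np.argwhere(b).astype(float))       # sqrt(2) for this shift
```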
Russell, Richard A.; Waiss, Richard D.
1988-01-01
A study was conducted to identify the common support equipment and Space Station interface requirements for the IOC (initial operating capabilities) model technology experiments. In particular, each principal investigator for the proposed model technology experiment was contacted and visited for technical understanding and support for the generation of the detailed technical backup data required for completion of this study. Based on the data generated, a strong case can be made for a dedicated technology experiment command and control work station consisting of a command keyboard, cathode ray tube, data processing and storage, and an alert/annunciator panel located in the pressurized laboratory.
Escorpizo, Reuben; Reneman, Michiel F.; Ekholm, Jan; Fritz, Julie; Krupa, Terry; Marnetoft, Sven-Uno; Maroun, Claude E.; Guzman, Julietta Rodriguez; Suzuki, Yoshiko; Stucki, Gerold; Chan, Chetwyn C. H.
Background The International Classification of Functioning, Disability and Health (ICF) is a conceptual framework and classification system by the World Health Organization (WHO) to understand functioning. The objective of this discussion paper is to offer a conceptual definition for vocational
Directory of Open Access Journals (Sweden)
Alba Sandyra Bezerra Lopes
2012-01-01
Full Text Available Motion estimation is the most complex module in a video encoder, requiring a high processing throughput and high memory bandwidth, mainly when the focus is high-definition videos. The throughput problem can be solved by increasing the parallelism of the internal operations. The external memory bandwidth may be reduced using a memory hierarchy. This work presents a memory hierarchy model for a full-search motion estimation core. The proposed memory hierarchy model is based on a data reuse scheme considering the features of the full search algorithm. The proposed memory hierarchy substantially reduces the external memory bandwidth required for the motion estimation process, and it provides a very high data throughput for the ME core. This throughput is necessary to achieve real time when processing high-definition videos. When considering the worst-case bandwidth scenario, this memory hierarchy is able to reduce the external memory bandwidth by a factor of 578. A case study for the proposed hierarchy, using a 32×32 search window and an 8×8 block size, was implemented and prototyped on a Virtex 4 FPGA. The results show that it is possible to reach 38 frames per second when processing full HD frames (1920×1080 pixels) using nearly 299 Mbytes per second of external memory bandwidth.
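A back-of-envelope calculation shows why data reuse dominates external bandwidth here. The assumptions below (8-bit luma samples, one reference frame, every block naively re-reading its whole search window) are deliberately crude and do not reproduce the paper's exact 578× and 299 MB/s figures, which depend on the core's internal schedule; the sketch only illustrates the mechanism.

```python
# Illustrative bandwidth estimate for full-search motion estimation.
width, height, fps = 1920, 1080, 38    # full HD at the reported frame rate
block, search = 8, 32                  # 8x8 blocks, 32x32 search window

frame_bytes = width * height                       # one 8-bit luma frame
blocks_per_frame = (width // block) * (height // block)

# Naive full search: every block re-reads its entire search window.
naive = blocks_per_frame * search * search * fps   # bytes/s, ~1261 MB/s
# With full reuse, the reference frame is read roughly once per frame.
reused = frame_bytes * fps                         # bytes/s, ~79 MB/s
ratio = naive / reused                             # search^2 / block^2 = 16
```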
Upton, J; Murphy, M; Shalloo, L; Groot Koerkamp, P W G; De Boer, I J M
2014-01-01
Our objective was to define and demonstrate a mechanistic model that enables dairy farmers to explore the impact of a technical or managerial innovation on electricity consumption, associated CO2 emissions, and electricity costs. We, therefore, (1) defined a model for electricity consumption on dairy farms (MECD) capable of simulating total electricity consumption along with related CO2 emissions and electricity costs on dairy farms on a monthly basis; (2) validated the MECD using 1 yr of empirical data from commercial spring-calving, grass-based dairy farms with 45, 88, and 195 milking cows; and (3) demonstrated the functionality of the model by applying 2 electricity tariffs to the electricity consumption data and examining the effect on total dairy farm electricity costs. The MECD was developed using a mechanistic modeling approach and required the key inputs of milk production, cow number, and details relating to the milk-cooling system, milking machine system, water-heating system, lighting systems, water pump systems, and the winter housing facilities, as well as details relating to the management of the farm (e.g., season of calving). Model validation showed an overall relative prediction error (RPE) of less than 10% for total electricity consumption. More than 87% of the mean square prediction error of total electricity consumption was accounted for by random variation. The RPE values of the milk-cooling systems, water-heating systems, and milking machine systems were less than 20%. The RPE values for automatic scraper systems, lighting systems, and water pump systems varied from 18 to 113%, indicating a poor prediction for these metrics. However, automatic scrapers, lighting, and water pumps made up only 14% of total electricity consumption across all farms, reducing the overall impact of these poor predictions. Demonstration of the model showed that total farm electricity costs increased by between 29 and 38% by moving from a day and night tariff to a flat
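The validation metric above, the relative prediction error (RPE), is the root mean square prediction error expressed as a percentage of the observed mean. A sketch with invented monthly consumption figures (not the study's data):

```python
import math

def rpe(actual, predicted):
    """Relative prediction error: root mean square prediction error
    as a percentage of the observed mean."""
    mspe = sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)
    return 100.0 * math.sqrt(mspe) / (sum(actual) / len(actual))

# Hypothetical monthly farm electricity consumption (kWh):
# metered values versus model predictions.
metered   = [2100, 1950, 1800, 1700, 1600, 1750]
predicted = [2000, 2050, 1750, 1650, 1700, 1800]
error_pct = rpe(metered, predicted)
```

With these invented numbers the RPE comes out under 10%, the level the abstract treats as an acceptable overall prediction.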
Model for definition of heat transfer coefficient in an annular two-phase flow
International Nuclear Information System (INIS)
Khun, J.
1976-01-01
Near-wall heat exchange in a vertical tube at high vapor velocity in a two-phase vapor-liquid flow is investigated. The flow divides inside the tube into a near-wall liquid film and a vapor core containing liquid droplets, with the boundaries being uniform. The liquid film thickness determines the main resistance to heat transfer between the wall and the vapor core. The theoretical model presented is verified in vaporization experiments with water, the refrigerant R12, and certain hydrocarbons. The friction pressure loss is determined by the Lockhart-Martinelli method. The approximately universal von Karman velocity profile is used to evaluate the velocity in the film, and from this the film thickness is determined. The parameter ranges were Re(vap) = 10^4 to 3x10^6 and Re(liq) = 0.9 to 10. The theoretical model shows good agreement with experiment
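The Lockhart-Martinelli method mentioned above characterizes two-phase pressure drop through the Martinelli parameter. A sketch of the common turbulent-turbulent form; the property values below are rough illustrative figures for water/steam near atmospheric pressure, not data from the paper:

```python
def martinelli_parameter_tt(x, rho_l, rho_v, mu_l, mu_v):
    """Turbulent-turbulent Lockhart-Martinelli parameter:
    X_tt = ((1-x)/x)^0.9 * (rho_v/rho_l)^0.5 * (mu_l/mu_v)^0.1,
    where x is the vapor quality. Small X_tt indicates a vapor-dominated flow."""
    return ((1.0 - x) / x) ** 0.9 * (rho_v / rho_l) ** 0.5 * (mu_l / mu_v) ** 0.1

# Approximate water/steam properties near atmospheric pressure (SI units).
x_low = martinelli_parameter_tt(0.2, rho_l=958.0, rho_v=0.6, mu_l=2.8e-4, mu_v=1.2e-5)
x_high = martinelli_parameter_tt(0.8, rho_l=958.0, rho_v=0.6, mu_l=2.8e-4, mu_v=1.2e-5)
```

Increasing vapor quality shrinks X_tt, consistent with the thin-film, high-vapor-velocity regime studied here.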
Parametric Engineering System Definition Model. Volume II. Appendix C. FORTRAN Listings
1979-08-01
MODEL. VOLUME II, APPENDIX C (FORTRAN LISTINGS). July 1979. Contract DAAK30-78-C-0059. S. Spaulding, A. Weintraub. The appendix contains the FORTRAN listings for the COMPEND model, organized as follows: Section C-2 contains the main program and a listing of all labeled COMMON blocks. (The remainder of the scanned listing is illegible.)
Arnold, S. M.; Saleeb, A. F.
2003-01-01
Given the previously developed complete-potential structure framework, together with the notion of strain- and stress-partitioning in terms of separate contributions of several submechanisms (viscoelastic and viscoplastic) to the thermodynamic functions (stored energy and dissipation), a detailed viscoelastoplastic multimechanism characterization of a specific hardening functional form of the model is presented and discussed. TIMETAL 21S is the material of choice because a comprehensive test matrix, including creep, relaxation, and constant strain-rate tension tests, is available at various temperatures. Discussion of these correlation tests, together with comparisons to several other experimental results, is given to assess the performance and predictive capabilities of the present model, particularly with regard to the notion of hardening saturation as well as the interaction of the multiplicity of dissipative (reversible/irreversible) mechanisms.
Economics definitions, methods, models, and analysis procedures for Homeland Security applications.
Energy Technology Data Exchange (ETDEWEB)
Ehlen, Mark Andrew; Loose, Verne William; Vargas, Vanessa N.; Smith, Braeton J.; Warren, Drake E.; Downes, Paula Sue; Eidson, Eric D.; Mackey, Greg Edward
2010-01-01
This report gives an overview of the types of economic methodologies and models used by Sandia economists in their consequence analysis work for the National Infrastructure Simulation & Analysis Center and other DHS programs. It describes the three primary resolutions at which analysis is conducted (microeconomic, mesoeconomic, and macroeconomic), the tools used at these three levels (from data analysis to internally developed and publicly available tools), and how they are used individually and in concert with each other and other infrastructure tools.
Active Aging for Individuals with Parkinson’s Disease: Definitions, Literature Review, and Models
Directory of Open Access Journals (Sweden)
Seyed-Mohammad Fereshtehnejad
2014-01-01
Full Text Available Active aging has emerged as a way to optimize different aspects of health opportunities during the aging process in order to enhance quality of life. Yet most efforts focus on normal aging, and less attention has been paid to elderly people suffering from a chronic illness such as Parkinson’s disease (PD). The aim of this review was to investigate how the concept of “active aging” fits the elderly with PD and to propose a new model for them using recent improvements in caring models and management approaches. For this purpose, biomedical databases were searched using relevant keywords to find appropriate articles. Movement problems of PD limit physical activity, psychiatric symptoms lessen social communication, and cognitive impairment can worsen mental well-being in the elderly with PD, all of which can lead to earlier retirement and poorer quality of life compared with healthy elderly people. Based on the multisystemic nature of PD, a new “Active Aging Model for Parkinson’s Disease” is proposed, consisting of self-care, multidisciplinary and interdisciplinary care, palliative care, patient-centered care, and personalized care. These strategies could potentially help individuals with PD to better manage their condition in line with the concept of active aging.
Active aging for individuals with Parkinson's disease: definitions, literature review, and models.
Fereshtehnejad, Seyed-Mohammad; Lökk, Johan
2014-01-01
Active aging has emerged as a way to optimize different aspects of health opportunities during the aging process in order to enhance quality of life. Yet most efforts focus on normal aging, and less attention has been paid to elderly people suffering from a chronic illness such as Parkinson's disease (PD). The aim of this review was to investigate how the concept of "active aging" fits the elderly with PD and to propose a new model for them using recent improvements in caring models and management approaches. For this purpose, biomedical databases were searched using relevant keywords to find appropriate articles. Movement problems of PD limit physical activity, psychiatric symptoms lessen social communication, and cognitive impairment can worsen mental well-being in the elderly with PD, all of which can lead to earlier retirement and poorer quality of life compared with healthy elderly people. Based on the multisystemic nature of PD, a new "Active Aging Model for Parkinson's Disease" is proposed, consisting of self-care, multidisciplinary and interdisciplinary care, palliative care, patient-centered care, and personalized care. These strategies could potentially help individuals with PD to better manage their condition in line with the concept of active aging.
Zacher, Hannes
2015-01-01
It is crucial to advance understanding of the concept of successful aging at work to guide rigorous future research and effective practice. Drawing on the gerontology and life-span developmental literatures, I recently proposed a definition and theoretical framework of successful aging at work that
DEFF Research Database (Denmark)
Nielsen, Mogens Peter; Shui, Wan; Johansson, Jens
2011-01-01
term with stresses depending linearly on the strain rates. This term takes into account the transfer of linear momentum from one part of the fluid to another. Besides there is another term, which takes into account the transfer of angular momentum. Thus the model implies a new definition of turbulence...
Cucchi, K.; Kawa, N.; Hesse, F.; Rubin, Y.
2017-12-01
In order to reduce uncertainty in the prediction of subsurface flow and transport processes, practitioners should use all available data. However, classic inverse modeling frameworks typically make use only of the information contained in in-situ field measurements to provide estimates of hydrogeological parameters. Such hydrogeological information about an aquifer is difficult and costly to acquire. In this data-scarce context, the transfer of ex-situ information from previously investigated sites can be critical for improving predictions by better constraining the estimation procedure. Bayesian inverse modeling provides a coherent framework for representing such ex-situ information by virtue of the prior distribution and for combining it with in-situ information from the target site. In this study, we present an innovative data-driven approach for defining such informative priors for hydrogeological parameters at the target site. Our approach consists of two steps, both relying on statistical and machine learning methods. The first step is data selection: sites similar to the target site are selected using clustering methods based on observable hydrogeological features. The second step is data assimilation: data from the selected similar sites are assimilated into the informative prior using a Bayesian hierarchical model, which accounts for inter-site variability and allows for the assimilation of multiple types of site-specific data. We present the application and validation of these methods on an established database of hydrogeological parameters. Data and methods are implemented in the form of an open-source R package, facilitating easy use by other practitioners.
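The data-selection step can be illustrated with a tiny stand-in: instead of full clustering, pick the nearest neighbours of the target site in the space of observable features. The site names, features, and parameter values below are invented for illustration and are not from the study's database:

```python
import math

def feature_distance(a, b):
    """Euclidean distance between two site feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def select_similar_sites(database, target, k=3):
    """Pick the k sites whose observable features are closest to the target;
    a nearest-neighbour stand-in for the clustering step described above."""
    ranked = sorted(database, key=lambda site: feature_distance(site["features"], target))
    return ranked[:k]

# Hypothetical site database: features could be e.g. mean depth and a
# lithology score; logK is the site's hydraulic-conductivity estimate.
sites = [
    {"name": "A", "features": [10.0, 0.3], "logK": -4.1},
    {"name": "B", "features": [50.0, 0.9], "logK": -6.0},
    {"name": "C", "features": [12.0, 0.4], "logK": -4.3},
    {"name": "D", "features": [48.0, 0.8], "logK": -5.8},
]
similar = select_similar_sites(sites, target=[11.0, 0.35], k=2)
# The selected sites' parameter values (here logK) would then be pooled
# into an informative prior via the hierarchical model.
```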
Folkert, Michael R.; Setton, Jeremy; Apte, Aditya P.; Grkovski, Milan; Young, Robert J.; Schöder, Heiko; Thorstad, Wade L.; Lee, Nancy Y.; Deasy, Joseph O.; Oh, Jung Hun
2017-07-01
In this study, we investigate the use of imaging feature-based outcomes research (‘radiomics’) combined with machine learning techniques to develop robust predictive models for the risk of all-cause mortality (ACM), local failure (LF), and distant metastasis (DM) following definitive chemoradiation therapy (CRT). One hundred seventy-four patients with stage III-IV oropharyngeal cancer (OC) treated at our institution with CRT with retrievable pre- and post-treatment 18F-fluorodeoxyglucose positron emission tomography (FDG-PET) scans were identified. From pre-treatment PET scans, 24 representative imaging features of FDG-avid disease regions were extracted. Using machine learning-based feature selection methods, multiparameter logistic regression models were built incorporating clinical factors and imaging features. All model building methods were tested by cross-validation to avoid overfitting, and final outcome models were validated on an independent dataset from a collaborating institution. Multiparameter models were statistically significant on 5-fold cross-validation, with the area under the receiver operating characteristic curve (AUC) = 0.65 (p = 0.004), 0.73 (p = 0.026), and 0.66 (p = 0.015) for ACM, LF, and DM, respectively. The model for LF retained significance on the independent validation cohort with AUC = 0.68 (p = 0.029), whereas the models for ACM and DM did not reach statistical significance but showed predictive power comparable to the 5-fold cross-validation, with AUC = 0.60 (p = 0.092) and 0.65 (p = 0.062), respectively. In the largest study of its kind to date, predictive features including increasing metabolic tumor volume, increasing image heterogeneity, and increasing tumor surface irregularity significantly correlated with mortality, LF, and DM on 5-fold cross-validation in a relatively uniform single-institution cohort. The LF model also retained
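The AUC values reported above summarize ranking performance: the probability that a randomly chosen failing patient receives a higher model score than a randomly chosen non-failing one. A minimal sketch with made-up scores:

```python
def auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    fraction of (positive, negative) pairs ranked correctly; ties count half."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical model outputs: higher score = higher predicted risk of failure.
example_auc = auc([0.9, 0.7, 0.6], [0.4, 0.5, 0.65])
```

An AUC of 0.5 is chance-level ranking; the study's cross-validated AUCs of 0.65-0.73 sit modestly but significantly above it.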
Optimal definition of biological tumor volume using positron emission tomography in an animal model.
Wu, Ingrid; Wang, Hao; Huso, David; Wahl, Richard L
2015-12-01
The goal of the study is to investigate (18)F-fluorodeoxyglucose positron emission tomography ((18)F-FDG-PET)'s ability to delineate the viable portion of a tumor in an animal model using cross-sectional histology as the validation standard. Syngeneic mammary tumors were grown in female Lewis rats. Macroscopic histological images of the transverse tumor sections were paired with their corresponding FDG micro-PET slices of the same cranial-caudal location to form 51 pairs of co-registered images. A binary classification system based on four FDG-PET tumor contouring methods was applied to each pair of images: threshold based on (1) percentage of maximum tumor voxel counts (Cmax), (2) percentage of tumor peak voxel counts (Cpeak), (3) multiples of liver mean voxel counts (Cliver) derived from PERCIST, and (4) an edge-detection-based automated contouring system. The sensitivity, which represented the percentage of viable tumor areas correctly delineated by the gross tumor area (GTA) generated from a particular tumor contouring method, and the ratio (expressed in percentage) of the overestimated areas of a gross tumor area (GTAOE)/whole tumor areas on the macroscopic histology (WTAH), which represented how much a particular GTA extended into the normal structures surrounding the primary tumor target, were calculated. The receiver operating characteristic curves of all pairs of FDG-PET images have a mean area under the curve value of 0.934 (CI of 0.911-0.954), for representing how well each contouring method accurately delineated the viable tumor area. FDG-PET single value threshold tumor contouring based on 30 and 35 % of tumor Cmax or Cpeak and 6 × Cliver + 2 × SD achieved a sensitivity greater than 90 % with a GTAOE/WTAH ratio less than 10 %. Contouring based on 50 % of Cmax or Cpeak had a much lower sensitivity of 67.2-75.6 % with a GTAOE/WTAH ratio of 1.1-1.7 %. Automated edge detection was not reliable in this system. Single
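The percent-of-maximum contouring and its two evaluation metrics can be sketched on a toy one-dimensional "slice". The voxel counts and histology mask below are fabricated; the real study used 2-D co-registered PET/histology image pairs:

```python
def contour_by_percent_max(voxels, percent):
    """Gross tumor area (GTA): voxels at or above percent of the maximum count."""
    cutoff = percent / 100.0 * max(voxels)
    return [v >= cutoff for v in voxels]

def sensitivity_and_overestimate(gta, viable):
    """Sensitivity: fraction of viable-tumor voxels captured by the GTA.
    Overestimate ratio: GTA voxels outside the viable tumor, relative to
    the viable tumor size (a 1-D analogue of the GTAOE/WTAH ratio)."""
    captured = sum(1 for g, t in zip(gta, viable) if g and t)
    outside = sum(1 for g, t in zip(gta, viable) if g and not t)
    n_viable = sum(viable)
    return captured / n_viable, outside / n_viable

counts = [1, 2, 4, 10, 9, 3, 2, 1]  # toy FDG uptake profile along one line
viable = [False, False, True, True, True, True, False, False]
sens30, over30 = sensitivity_and_overestimate(contour_by_percent_max(counts, 30), viable)
sens50, over50 = sensitivity_and_overestimate(contour_by_percent_max(counts, 50), viable)
```

Even in this toy profile the 50% threshold misses the cooler tumor rim, mirroring the lower sensitivity reported above for contours at 50% of Cmax.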
Weatherill, Graeme; Garcia, Julio; Poggi, Valerio; Chen, Yen-Shin; Pagani, Marco
2016-04-01
The Global Earthquake Model (GEM) has, since its inception in 2009, made many contributions to the practice of seismic hazard modeling in different regions of the globe. The OpenQuake-engine (hereafter referred to simply as OpenQuake), GEM's open-source software for calculation of earthquake hazard and risk, has found application in many countries, spanning a diversity of tectonic environments. GEM itself has produced a database of national and regional seismic hazard models, harmonizing into OpenQuake's own definition the varied seismogenic sources found therein. The characterization of active faults in probabilistic seismic hazard analysis (PSHA) is at the centre of this process, motivating many of the developments in OpenQuake and presenting hazard modellers with the challenge of reconciling seismological, geological and geodetic information for the different regions of the world. Faced with these challenges, and from the experience gained in the process of harmonizing existing models of seismic hazard, four critical issues are addressed. The challenge GEM has faced in the development of software is how to define a representation of an active fault (both in terms of geometry and earthquake behaviour) that is sufficiently flexible to adapt to different tectonic conditions and levels of data completeness. By exploring the different fault typologies supported by OpenQuake we illustrate how seismic hazard calculations can, and do, take into account complexities such as geometrical irregularity of faults in the prediction of ground motion, highlighting some of the potential pitfalls and inconsistencies that can arise. This exploration leads to the second main challenge in active fault modeling, what elements of the fault source model impact most upon the hazard at a site, and when does this matter? Through a series of sensitivity studies we show how different configurations of fault geometry, and the corresponding characterisation of near-fault phenomena (including
Nicolau-Raducu, Ramona; Cohen, Ari J; Bokhari, Amjad; Bohorquez, Humberto; Bruce, David; Carmody, Ian; Bugeaud, Emily; Seal, John; Sonnier, Dennis; Nossaman, Bobby; Loss, George
2017-11-01
Early allograft dysfunction (EAD) is a well-defined clinical syndrome that reflects overall graft function within the first week after transplant. The aim of this study was to further refine the definition of EAD. In this study, 1124 patients were included for analysis. Logistic regression was performed to identify markers of liver injury associated with 6-month patient and graft failure. Recursive partitioning identified cut-points of ALT/AST > 3000/6000 IU/dL observed within the first week, with bilirubin ≥ 10 mg/dL and INR ≥ 1.6 on postoperative day 7, for the revised EAD model. The incidence of updated EAD was 15% (164/1124). Multivariable analysis identified eight risk factors associated with EAD: % macrosteatosis, donor location, donor weight, non-heart-beating donors, type of organ transplanted, recipient-associated hepatocellular carcinoma, severity of postreperfusion syndrome, and the amount of transfused fresh frozen plasma. In the presence of EAD, the incidence of post-transplant renal replacement therapy and dialysis dependence increases. There was a significant association of the presence of EAD with 6-month mortality (12% vs 3%) and 6-month graft failure (8% vs 1%). Higher AST/ALT levels are needed as cutoffs in comparison with the old EAD definition. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
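The reported cut-points can be read as a simple decision rule. A hedged sketch: the abstract does not state how the criteria combine, so following the original Olthoff EAD definition we assume any single criterion suffices:

```python
def revised_ead(peak_alt, peak_ast, bilirubin_day7, inr_day7):
    """Flag early allograft dysfunction using the cut-points above:
    ALT > 3000 or AST > 6000 IU/dL within the first week, or
    bilirubin >= 10 mg/dL or INR >= 1.6 on postoperative day 7.
    The 'or' combination is an assumption, not stated in the abstract."""
    return (peak_alt > 3000 or peak_ast > 6000
            or bilirubin_day7 >= 10.0 or inr_day7 >= 1.6)

# Hypothetical patients.
flagged = revised_ead(peak_alt=3500, peak_ast=4000, bilirubin_day7=4.0, inr_day7=1.2)
clear = revised_ead(peak_alt=800, peak_ast=1500, bilirubin_day7=3.0, inr_day7=1.1)
```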
A Mediated Definite Delegation Model allowing for Certified Grid Job Submission
Schreiner, Steffen; Grigoras, Costin; Litmaath, Maarten
2012-01-01
Grid computing infrastructures need to provide traceability and accounting of their users' activity and protection against misuse and privilege escalation. A central aspect of multi-user Grid job environments is the necessary delegation of privileges in the course of a job submission. With respect to these generic requirements, this document describes an improved handling of multi-user Grid jobs in the ALICE ("A Large Ion Collider Experiment") Grid Services. A security analysis of the ALICE Grid job model is presented with derived security objectives, followed by a discussion of existing approaches to unrestricted delegation based on X.509 proxy certificates and the Grid middleware gLExec. Unrestricted delegation has severe security consequences and limitations, most importantly allowing for identity theft and forgery of delegated assignments. These limitations are discussed and formulated, both in general and with respect to an adoption in line with multi-user Grid jobs. Based on the architecture of the ALICE...
Semantic Building Information Modeling and high definition surveys for Cultural Heritage sites
Directory of Open Access Journals (Sweden)
Simone Garagnani
2012-11-01
Full Text Available In recent years, digital technology devoted to building design has experienced significant advancements, allowing designers to reach, by means of Building Information Modeling, goals only imagined since the mid-Seventies of the last century. The BIM process, which brings several advantages to the actors and designers who implement it in their workflow, may be employed even in various case studies related to interventions on the existing architectural Cultural Heritage. The semantics typical of classical architecture, so pervasive in the European urban landscape, as well as the features of Modern or Contemporary architecture, coincide with the self-conscious structure made of “smart objects” proper to BIM, which proves to be an effective system for documenting component relationships. However, the translation of existing buildings' geometric information, acquired using the common techniques of laser scanning and digital photogrammetry, into BIM objects is still a critical process that this paper aims to investigate, describing possible methods and approaches.
Definition of a 5MW/61.5m wind turbine blade reference model.
Energy Technology Data Exchange (ETDEWEB)
Resor, Brian Ray
2013-04-01
A basic structural concept of the blade design that is associated with the frequently utilized "NREL offshore 5-MW baseline wind turbine" is needed for studies involving blade structural design and blade structural design tools. The blade structural design documented in this report represents a concept that meets basic design criteria set forth by IEC standards for the onshore turbine. The design documented in this report is not a fully vetted blade design that is ready for manufacture. The intent of the structural concept described in this report is to provide a good starting point for more detailed and targeted investigations such as blade design optimization, blade design tool verification, blade materials and structures investigations, and blade design standards evaluation. This report documents the information used to create the current model as well as the analyses used to verify that the blade structural performance meets reasonable blade design criteria.
A univocal definition of the neuronal soma morphology using Gaussian mixture models
Luengo-Sanchez, Sergio; Bielza, Concha; Benavides-Piccione, Ruth; Fernaud-Espinosa, Isabel; DeFelipe, Javier; Larrañaga, Pedro
2015-01-01
The definition of the soma is fuzzy, as there is no clear line demarcating the soma of the labeled neurons and the origin of the dendrites and axon. Thus, the morphometric analysis of the neuronal soma is highly subjective. In this paper, we provide a mathematical definition and an automatic segmentation method to delimit the neuronal soma. We applied this method to the characterization of pyramidal cells, which are the most abundant neurons in the cerebral cortex. Since there are no benchmarks with which to compare the proposed procedure, we validated the goodness of this automatic segmentation method against manual segmentation by neuroanatomists to set up a framework for comparison. We concluded that there were no significant differences between automatically and manually segmented somata, i.e., the proposed procedure segments the neurons similarly to how a neuroanatomist does. It also provides univocal, justifiable and objective cutoffs. Thus, this study is a means of characterizing pyramidal neurons in order to objectively compare the morphometry of the somata of these neurons in different cortical areas and species. PMID:26578898
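The core idea of a Gaussian-mixture segmentation is a posterior assignment: given a measurement for a point (here, an illustrative distance from the cell centroid), how likely does it belong to the "soma" component rather than the "dendrite" component? A minimal sketch with fixed, invented mixture parameters rather than parameters fitted to real neurons:

```python
import math

def gauss(x, mu, sigma):
    """Normal probability density."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def soma_posterior(d, mix):
    """Posterior probability that a surface point at distance d from the
    centroid belongs to the 'soma' component of a two-part Gaussian mixture.
    mix maps component name -> (weight, mean, std); values are illustrative."""
    w_s, mu_s, sd_s = mix["soma"]
    w_d, mu_d, sd_d = mix["dendrite"]
    p_s = w_s * gauss(d, mu_s, sd_s)
    p_d = w_d * gauss(d, mu_d, sd_d)
    return p_s / (p_s + p_d)

mix = {"soma": (0.7, 8.0, 2.0), "dendrite": (0.3, 20.0, 5.0)}
# Points close to the centroid are confidently soma; far points are not,
# which is what yields an objective, univocal cutoff between the two.
near, far = soma_posterior(6.0, mix), soma_posterior(25.0, mix)
```

In the actual method the mixture parameters are learned from the 3-D reconstruction (the E-step above is the assignment half of the EM fit), rather than fixed by hand as here.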
Universality Conjecture and Results for a Model of Several Coupled Positive-Definite Matrices
Bertola, Marco; Bothner, Thomas
2015-08-01
The paper contains two main parts: in the first part, we analyze the general case of matrices coupled in a chain subject to Cauchy interaction. Similarly to the Itzykson-Zuber interaction model, the eigenvalues of the Cauchy chain form a multi-level determinantal point process. We first compute all correlation functions in terms of Cauchy biorthogonal polynomials and locate them as specific entries of a matrix-valued solution of a Riemann-Hilbert problem. In the second part, we fix the external potentials as classical Laguerre weights. We then derive strong asymptotics for the Cauchy biorthogonal polynomials when the support of the equilibrium measures contains the origin. As a result, we obtain a new family of universality classes for multi-level random determinantal point fields, which include the Bessel universality for 1 level and the Meijer-G universality for 2 levels. Our analysis uses the Deift-Zhou nonlinear steepest descent method and the explicit construction of an origin parametrix in terms of Meijer G-functions. The solution of the full Riemann-Hilbert problem is derived rigorously only for p = 3, but the general framework of the proof can be extended to the Cauchy chain of arbitrary length p.
Morohashi, Mineo; Ohashi, Yoshiaki; Tani, Saeka; Ishii, Kotaro; Itaya, Mitsuhiro; Nanamiya, Hideaki; Kawamura, Fujio; Tomita, Masaru; Soga, Tomoyoshi
2007-08-01
The soil bacterium Bacillus subtilis forms dormant, robust spores as a tactic to ensure survival under conditions of starvation. However, a sporulating culture includes both sporulating and non-sporulating cells, because only a portion of the cell population initiates sporulation even in the wild-type strain. We anticipated that this population effect must be considered carefully when analysing samples exhibiting population heterogeneity. We first built a mathematical model and simulated the signal transduction of the sporulation cue to see which mechanisms are responsible for generating the heterogeneity. The simulated results were confirmed experimentally: heterogeneity is primarily modulated by negative feedback circuits, resulting in the generation of a bistable response within the sporulating culture. We also confirmed that mutants relevant to the negative feedback yield either sporulating or non-sporulating subpopulations. To see the effect of the molecular mechanisms in sporulating and non-sporulating cells in a distinct manner, metabolome analysis was conducted using the above mutants. The metabolic profiles exhibited distinct characteristics over time regardless of whether sporulation was initiated. In addition, several distinct characteristics of metabolites were observed between strains, which was inconsistent with previously reported data. The results imply that careful consideration must be made in the interpretation of data obtained from cells exhibiting population heterogeneity.
Model of observed stochastic balance between work and free time supporting the LQTAI definition
DEFF Research Database (Denmark)
Ditlevsen, Ove Dalager
2008-01-01
A balance differential equation between free time and money-producing work time on the national economy level was formulated in a previous paper in terms of two dimensionless quantities: the fraction of work time and the total productivity factor, defined as the ratio of the Gross Domestic Product … more significant systematically balance-influencing parameters on the macroeconomic level than those considered in the previous paper's definition of the Life Quality Time Allocation Index.
International Nuclear Information System (INIS)
Kainz, Wolfgang; Christ, Andreas; Kellom, Tocher; Seidman, Seth; Nikoloski, Neviana; Beard, Brian; Kuster, Niels
2005-01-01
This paper presents new definitions for obtaining reproducible results in numerical phone dosimetry. Numerous numerical dosimetric studies have been published about the exposure of mobile phone users which concluded with conflicting results. However, many of these studies lack reproducibility due to shortcomings in the description of the phone positioning. The new approach was tested by two groups applying two different numerical program packages to compare the specific anthropomorphic mannequin (SAM) to 14 anatomically correct head models. A novel definition for the positioning of mobile phones next to anatomically correct head models is given along with other essential parameters to be reported. The definition is solely based on anatomical characteristics of the head. A simple up-to-date phone model was used to determine the peak spatial specific absorption rate (SAR) of mobile phones in SAM and in the anatomically correct head models. The results were validated by measurements. The study clearly shows that SAM gives a conservative estimate of the exposure in anatomically correct head models for head only tissue. Depending on frequency, phone position and head size the numerically calculated 10 g averaged SAR in the pinna can be up to 2.1 times greater than the peak spatial SAR in SAM. Measurements in small structures, such as the pinna, will significantly increase the uncertainty; therefore SAM was designed for SAR assessment in the head only. Whether SAM will provide a conservative value for the pinna depends on the pinna SAR limit of the safety standard considered
Ghanassia, E; Raynaud de Mauverger, E; Brun, J-F; Fedou, C; Mercier, J
2009-01-01
To assess the agreement of the NCEP ATP-III and the IDF definitions of metabolic syndrome and to determine their predictive values for the diagnosis of insulin resistance. For this purpose, we recruited 150 subjects (94 women and 56 men) and determined the presence of metabolic syndrome using the NCEP-ATP III and IDF definitions. We evaluated their insulin sensitivity S(I) using Caumo's oral minimal model after a standardized hyperglucidic breakfast test. Subjects whose S(I) was in the lowest quartile were considered as insulin resistant. We then calculated sensitivity, specificity, positive and negative predictive values of both definitions for the diagnosis of insulin resistance. The prevalence of metabolic syndrome was 37.4% (NCEP-ATP III) and 40% (IDF). Agreement between the two definitions was 96%. Using NCEP-ATP III and IDF criteria for the identification of insulin resistant subjects, sensitivity was 55.3% and 63%, specificity was 68.8% and 67.8%, positive predictive value was 37.5% and 40%, negative predictive value was 81.9% and 84.5%, respectively. Positive predictive value increased with the number of criteria for both definitions. Whatever the definition, the scoring of metabolic syndrome is not a reliable tool for the individual diagnosis of insulin resistance, and is more useful for excluding this diagnosis.
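The four reported metrics follow from a 2x2 table of metabolic-syndrome status against true insulin resistance. A sketch; the counts below are hypothetical, chosen only to roughly reproduce the reported IDF figures (150 subjects, 38 in the lowest S(I) quartile, 60 with the syndrome):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, positive and negative predictive values
    from a 2x2 table of syndrome status vs. true insulin-resistance status."""
    return {
        "sensitivity": tp / (tp + fn),  # resistant subjects flagged
        "specificity": tn / (tn + fp),  # non-resistant subjects cleared
        "ppv": tp / (tp + fp),          # flagged subjects truly resistant
        "npv": tn / (tn + fn),          # cleared subjects truly non-resistant
    }

# Hypothetical counts consistent with the IDF definition's reported values.
m = diagnostic_metrics(tp=24, fp=36, fn=14, tn=76)
```

Note how the low PPV alongside a high NPV matches the abstract's conclusion: the score is more useful for excluding insulin resistance than for diagnosing it.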
Ivy Ang
2013-01-01
The motivational base of this study lies in the real-life problem faced by many L2 learners: How can learners achieve target-like lexical competence? It does not take much to demonstrate that knowing a dictionary definition alone is not enough, but how learning from context actually leads to accuracy of production has remained unclear. The present article addresses this issue by proposing a cognitive model that illustrates the role of context and definition as well as L1 translation in the ac...
DEFF Research Database (Denmark)
Nygård, Lotte; Vogelius, Ivan R; Fischer, Barbara M
2018-01-01
INTRODUCTION: The aim of the study was to build a model of first failure site and lesion-specific failure probability after definitive chemo-radiotherapy for inoperable non-small cell lung cancer (NSCLC). METHODS: We retrospectively analyzed 251 patients receiving definitive chemo…) failure than squamous cell carcinoma, HR 0.45, 95% CI [0.26; 0.76], p = 0.003. Distant failures were more common in the adenocarcinoma group, HR 2.21, 95% CI [1.41; 3.48], p… Time of first failure showed primary tumors were more likely…
Zilli, Thomas; Jorcano, Sandra; Peguret, Nicolas; Caparrotti, Francesca; Hidalgo, Alberto; Khan, Haleem G; Vees, Hansjörg; Weber, Damien C; Miralbell, Raymond
2014-01-01
To assess treatment tolerance by patients treated with a dose-adapted salvage radiotherapy (SRT) protocol based on a multiparametric endorectal magnetic resonance imaging (erMRI) failure definition model after radical prostatectomy (RP). A total of 171 prostate cancer patients recurring after RP undergoing erMRI before SRT were analyzed. A median dose of 64 Gy was delivered to the prostatic bed (PB) with, in addition, a boost of 10 Gy to the suspected relapse as visualized on erMRI in 131 patients (76.6%). Genitourinary (GU) and gastrointestinal (GI) toxicities were scored using the RTOG scale. Grade ≥ 3 GU and GI acute toxicities were observed in three and zero patients, respectively. The four-year grade ≥ 2 and ≥ 3 late GU and GI toxicity-free survival rates (109 patients with at least two years of follow-up) were 83.9 ± 4.7% and 87.1 ± 4.2%, and 92.1 ± 3.6% and 97.5 ± 1.7%, respectively. Boost (p = 0.048) and grade ≥ 2 acute GU toxicity (p = 0.008) were independently correlated with grade ≥ 2 late GU toxicity on multivariate analysis. A dose-adapted, erMRI-based SRT approach treating the PB with a boost to the suspected local recurrence may potentially improve the therapeutic ratio by selecting patients that are most likely to benefit from SRT doses above 70 Gy as well as by reducing the size of the highest-dose target volume. Further prospective trials are needed to investigate the use of erMRI in SRT as well as the role of dose-adapted protocols and the best fractionation schedule.
Zilli, Thomas; Jorcano, Sandra; Peguret, Nicolas; Caparrotti, Francesca; Hidalgo, Alberto; Khan, Haleem G; Vees, Hansjörg; Miralbell, Raymond
2017-04-01
To assess the outcome of patients treated with a dose-adapted salvage radiotherapy (SRT) protocol based on an endorectal magnetic resonance imaging (erMRI) failure definition model after radical prostatectomy (RP). We report on 171 relapsing patients after RP who had undergone an erMRI before SRT. 64 Gy were prescribed to the prostatic bed with, in addition, a boost of 10 Gy to the suspected local relapse as detected on erMRI in 131 patients (76.6%). The 3-year biochemical relapse-free survival (bRFS), local relapse-free survival, distant metastasis-free survival, cancer-specific survival, and overall survival were 64.2±4.3%, 100%, 85.2±3.2%, 100%, and 99.1±0.9%, respectively. A PSA value >1 ng/mL before salvage (P=0.006) and an absence of biochemical progression during RT (P=0.001) were both independently correlated with bRFS on multivariate analysis. No significant difference in 3-year bRFS was observed between the boost and no-boost groups (68.4±4.6% vs. 49.7±10%, P=0.251). A PSA value >1 ng/mL before salvage and a biochemical progression during RT were both independently correlated with worse bRFS after SRT. By using erMRI to select patients who are most likely expected to benefit from dose-escalated SRT protocols, this dose-adapted SRT approach was associated with good biochemical control and outcome, serving as a hypothesis-generating basis for further prospective trials aimed at improving the therapeutic ratio in the salvage setting.
Bartlett, Marcus A.; Liang, Tao; Pu, Liang; Schaefer, Henry F.; Allen, Wesley D.
2018-03-01
The n-propyl + O2 reaction is an important model of chain branching reactions in larger combustion systems. In this work, focal point analyses (FPAs) extrapolating to the ab initio limit were performed on the n-propyl + O2 system based on explicit quantum chemical computations with electron correlation treatments through coupled cluster single, double, triple, and perturbative quadruple excitations [CCSDT(Q)] and basis sets up to cc-pV5Z. All reaction species and transition states were fully optimized at the rigorous CCSD(T)/cc-pVTZ level of theory, revealing some substantial differences in comparison to the density functional theory geometries existing in the literature. A mixed Hessian methodology was implemented and benchmarked that makes the computation of CCSD(T)/cc-pVTZ vibrational frequencies feasible and thus provides critical improvements to zero-point vibrational energies for the n-propyl + O2 system. Two key stationary points, the n-propylperoxy radical (MIN1) and its concerted elimination transition state (TS1), were located 32.7 kcal mol⁻¹ and 2.4 kcal mol⁻¹ below the reactants, respectively. Two competitive β-hydrogen transfer transition states (TS2 and TS2') were found separated by only 0.16 kcal mol⁻¹, a fact unrecognized in the current combustion literature. Incorporating TS2' in master equation (ME) kinetic models might reduce the large discrepancy of 2.5 kcal mol⁻¹ between FPA and ME barrier heights for TS2. TS2 exhibits an anomalously large diagonal Born-Oppenheimer correction (ΔDBOC = 1.71 kcal mol⁻¹), which is indicative of a nearby surface crossing and possible nonadiabatic reaction dynamics. The first systematic conformational search of three hydroperoxypropyl (QOOH) intermediates was completed, uncovering a total of 32 rotamers lying within 1.6 kcal mol⁻¹ of their respective lowest-energy minima. Our definitive energetics for stationary points on the n-propyl + O2 potential energy surface provide key benchmarks for future studies.
DEFF Research Database (Denmark)
Bork Petersen, Franziska
2013-01-01
focus centres on how the catwalk scenography evokes a 'defiguration' of the walking models and to what effect. Vibskov's mobile catwalk draws attention to the walk, which is a key element of models' performance but which usually functions in fashion shows merely to present clothes in the most… catwalks. Vibskov's catwalk induces what the dance scholar Gabriele Brandstetter has labelled a 'defigurative choreography': a straying from definitions, which exist in ballet as in other movement-based genres, of how a figure should move and appear (1998). The catwalk scenography in this instance…
DEFF Research Database (Denmark)
Carlson, Kerstin
The International Criminal Tribunal for the former Yugoslavia (ICTY) was the first and most celebrated of a wave of international criminal tribunals (ICTs) built in the 1990s designed to advance liberalism through international criminal law. Model(ing) Justice examines the case law of the ICTY...
ten Cate, J.M.
2015-01-01
Developing experimental models to understand dental caries has been the theme in our research group. Our first, the pH-cycling model, was developed to investigate the chemical reactions in enamel or dentine which lead to dental caries. It aimed to leverage our understanding of the fluoride mode of action…
Ludwig, C; Grimmer, S; Seyfarth, A; Maus, H-M
2012-09-21
The spring-loaded inverted pendulum (SLIP) model is a well established model for describing bouncy gaits like human running. The notion of spring-like leg behavior has led many researchers to compute the corresponding parameters, predominantly stiffness, in various experimental setups and in various ways. However, different methods yield different results, making comparisons between studies difficult. Further, a model simulation with experimentally obtained leg parameters typically results in comparatively large differences between model and experimental center of mass trajectories. Here, we pursue the opposite approach: calculating model parameters that allow reproduction of an experimental sequence of steps. In addition, to capture energy fluctuations, an extension of the SLIP (ESLIP) is required and presented. The excellent match of the models with the experiment validates the description of human running by the SLIP with the obtained parameters, which we hence call dynamical leg parameters. Copyright © 2012 Elsevier Ltd. All rights reserved.
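The stance-phase dynamics that the SLIP model describes can be sketched in a few lines. The following Python snippet is a minimal illustration under assumed parameter values (the stiffness, mass, and leg length below are hypothetical, not the dynamical leg parameters fitted in the study):

```python
import math

def slip_acceleration(x, y, k, L0, m, g=9.81):
    """Stance-phase acceleration of the SLIP point mass.

    The massless leg spring acts along the line from the foot
    (fixed at the origin) to the centre of mass at (x, y).
    """
    L = math.hypot(x, y)        # current leg length
    f = k * (L0 - L)            # spring force (positive = compressed leg)
    ax = f * (x / L) / m        # radial force decomposed into components
    ay = f * (y / L) / m - g    # gravity acts on the vertical axis only
    return ax, ay

def simulate_stance(x, y, vx, vy, k=20000.0, L0=1.0, m=80.0, dt=1e-4):
    """Semi-implicit Euler integration until the leg regains natural length."""
    while math.hypot(x, y) <= L0:
        ax, ay = slip_acceleration(x, y, k, L0, m)
        vx += ax * dt
        vy += ay * dt
        x += vx * dt
        y += vy * dt
    return x, y, vx, vy
```

At the natural leg length the spring force vanishes, so the only remaining acceleration is gravity; fitting "dynamical leg parameters" in the sense of the abstract amounts to choosing k, L0, and m so that trajectories from this kind of integration reproduce measured steps.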
Land, Kenneth C.
2001-01-01
Examines the definition, construction, and interpretation of social indicators. Shows how standard classes of formalisms used to construct models in contemporary sociology are derived from the general theory of models. Reviews recent model building and evaluation related to active life expectancy among the elderly, fertility rates, and indicators…
DEFF Research Database (Denmark)
Damkjær, Sidsel; Thomsen, Jakob B; Petersen, Svetlana I
2017-01-01
prescribed the same PTV mean dose. Rectal NTCP grade ≥2 was evaluated with the Lyman-Kutcher-Burman model and TCP was estimated by a logistic model using the combined MRI positive volume in SV and prostate as region-of-interest. RESULTS: Fourteen of twenty-one patients were classified as MRI positive, six...
Moghaddasi, L; Bezak, E; Harriss-Phillips, W
2016-05-07
Clinical target volume (CTV) determination may be complex and subjective. In this work a microscopic-scale tumour model was developed to evaluate current CTV practices in glioblastoma multiforme (GBM) external radiotherapy. Previously, a Geant4 cell-based dosimetry model was developed to calculate the dose deposited in individual GBM cells. Microscopic extension probability (MEP) models were then developed using Matlab-2012a. The results of the cell-based dosimetry model and MEP models were combined to calculate survival fractions (SF) for CTV margins of 2.0 and 2.5 cm. In the current work, oxygenation and heterogeneous radiosensitivity profiles were incorporated into the GBM model. The genetic heterogeneity was modelled using a range of α/β values (linear-quadratic model parameters) associated with different GBM cell lines. These values were distributed among the cells randomly, taken from a Gaussian-weighted sample of α/β values. Cellular oxygen pressure was distributed randomly, taken from a sample weighted to profiles obtained from the literature. Three types of GBM models were analysed: homogeneous-normoxic, heterogeneous-normoxic, and heterogeneous-hypoxic. The SF in different regions of the tumour model and the effect of the CTV margin extension from 2.0 to 2.5 cm on SFs were investigated for three MEP models. The SF within the beam increased by up to three and two orders of magnitude following incorporation of heterogeneous radiosensitivities and hypoxia, respectively, into the GBM model. However, the total SF was shown to be dominated by the presence of tumour cells in the penumbra region and, to a lesser extent, by genetic heterogeneity and hypoxia. CTV extension by 0.5 cm reduced the SF by a maximum of 78.6 ± 3.3%, 78.5 ± 3.3%, and 77.7 ± 3.1% for homogeneous-normoxic, heterogeneous-normoxic, and heterogeneous-hypoxic GBMs, respectively. A Monte Carlo model was thus developed to quantitatively evaluate SF for genetically heterogeneous and hypoxic GBMs.
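The linear-quadratic survival fractions with randomly assigned radiosensitivities described above can be sketched as follows; the α/β values, Gaussian spread, and ratio below are illustrative assumptions, not the study's actual cell-line data:

```python
import math
import random

def lq_sf(dose_gy, alpha, beta):
    """Linear-quadratic survival fraction for one cell: exp(-(a*D + b*D^2))."""
    return math.exp(-(alpha * dose_gy + beta * dose_gy ** 2))

def tumour_sf(n_cells, dose_gy, alpha_mean=0.3, alpha_sd=0.05,
              alpha_beta_ratio=10.0, seed=1):
    """Mean SF over a cell population with Gaussian-sampled alpha
    (beta tied to alpha through a fixed, assumed alpha/beta ratio)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_cells):
        a = max(rng.gauss(alpha_mean, alpha_sd), 1e-3)  # keep alpha positive
        b = a / alpha_beta_ratio
        total += lq_sf(dose_gy, a, b)
    return total / n_cells
```

Averaging per-cell survival in this way is what lets heterogeneity raise the population SF: radioresistant outliers dominate the mean even when most cells are killed.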
Collett, David
2002-01-01
INTRODUCTION: Some Examples; The Scope of this Book; Use of Statistical Software. STATISTICAL INFERENCE FOR BINARY DATA: The Binomial Distribution; Inference about the Success Probability; Comparison of Two Proportions; Comparison of Two or More Proportions. MODELS FOR BINARY AND BINOMIAL DATA: Statistical Modelling; Linear Models; Methods of Estimation; Fitting Linear Models to Binomial Data; Models for Binomial Response Data; The Linear Logistic Model; Fitting the Linear Logistic Model to Binomial Data; Goodness of Fit of a Linear Logistic Model; Comparing Linear Logistic Models; Linear Trend in Proportions; Comparing Stimulus-Response Relationships; Non-Convergence and Overfitting; Some Other Goodness of Fit Statistics; Strategy for Model Selection; Predicting a Binary Response Probability. BIOASSAY AND SOME OTHER APPLICATIONS: The Tolerance Distribution; Estimating an Effective Dose; Relative Potency; Natural Response; Non-Linear Logistic Regression Models; Applications of the Complementary Log-Log Model. MODEL CHECKING: Definition of Re…
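Fitting a linear logistic model to binomial dose-response data, as covered in the book, can be illustrated with a small Newton (iteratively reweighted least squares) fit; the dose-response counts below are invented for illustration:

```python
import math

def fit_logistic(x, y, n, iters=25):
    """Fit p_i = 1/(1+exp(-(b0 + b1*x_i))) to binomial counts y_i of n_i
    by Newton's method on the two-parameter log-likelihood."""
    b0, b1 = 0.0, 0.0
    for _ in range(iters):
        p = [1.0 / (1.0 + math.exp(-(b0 + b1 * xi))) for xi in x]
        # score vector (gradient of the log-likelihood)
        g0 = sum(yi - ni * pi for yi, ni, pi in zip(y, n, p))
        g1 = sum((yi - ni * pi) * xi for yi, ni, pi, xi in zip(y, n, p, x))
        # observed information matrix (2x2), solved in closed form
        w = [ni * pi * (1 - pi) for ni, pi in zip(n, p)]
        h00 = sum(w)
        h01 = sum(wi * xi for wi, xi in zip(w, x))
        h11 = sum(wi * xi * xi for wi, xi in zip(w, x))
        det = h00 * h11 - h01 * h01
        b0 += (h11 * g0 - h01 * g1) / det
        b1 += (h00 * g1 - h01 * g0) / det
    return b0, b1

# invented dose-response data: dose levels, responders, group sizes
doses = [0.0, 1.0, 2.0, 3.0, 4.0]
deaths = [2, 8, 20, 35, 47]
sizes = [50, 50, 50, 50, 50]
```

The fitted slope on the logit scale is the "linear trend in proportions" of the chapter headings; a positive b1 means the response probability rises with dose.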
Averaging in cosmological models
Coley, Alan
2010-01-01
The averaging problem in cosmology is of considerable importance for the correct interpretation of cosmological data. We review cosmological observations and discuss some of the issues regarding averaging. We present a precise definition of a cosmological model and a rigorous mathematical definition of averaging, based entirely on scalar invariants.
Zandbelt, Bram
2017-01-01
Introductory presentation on cognitive modeling for the course ‘Cognitive control’ of the MSc program Cognitive Neuroscience at Radboud University. It addresses basic questions, such as 'What is a model?', 'Why use models?', and 'How to use models?'
Anaïs Schaeffer
2012-01-01
By analysing the production of mesons in the forward region of LHC proton-proton collisions, the LHCf collaboration has provided key information needed to calibrate extremely high-energy cosmic ray models. Average transverse momentum (pT) as a function of rapidity loss ∆y: black dots represent LHCf data and red diamonds represent results from the SPS experiment UA7. The predictions of hadronic interaction models are shown by open boxes (sibyll 2.1), open circles (qgsjet II-03) and open triangles (epos 1.99). Among these models, epos 1.99 shows the best overall agreement with the LHCf data. LHCf is dedicated to the measurement of neutral particles emitted at extremely small angles in the very forward region of LHC collisions. Two imaging calorimeters – Arm1 and Arm2 – take data 140 m on either side of the ATLAS interaction point. "The physics goal of this type of analysis is to provide data for calibrating the hadron interaction models – the well-known…
Nygård, Lotte; Vogelius, Ivan R; Fischer, Barbara M; Kjær, Andreas; Langer, Seppo W; Aznar, Marianne C; Persson, Gitte F; Bentzen, Søren M
2018-04-01
The aim of the study was to build a model of first failure site- and lesion-specific failure probability after definitive chemoradiotherapy for inoperable NSCLC. We retrospectively analyzed 251 patients receiving definitive chemoradiotherapy for NSCLC at a single institution between 2009 and 2015. All patients were scanned by fludeoxyglucose positron emission tomography/computed tomography for radiotherapy planning. Clinical patient data and fludeoxyglucose positron emission tomography standardized uptake values from primary tumor and nodal lesions were analyzed by using multivariate cause-specific Cox regression. In patients experiencing locoregional failure, multivariable logistic regression was applied to assess the risk of each lesion being the first site of failure. The two models were used in combination to predict the probability of lesion failure accounting for competing events. Adenocarcinoma had a lower hazard ratio (HR) of locoregional failure than squamous cell carcinoma (HR = 0.45, 95% confidence interval [CI]: 0.26-0.76, p = 0.003). Distant failures were more common in the adenocarcinoma group (HR = 2.21, 95% CI: 1.41-3.48, p …). … failure showed that primary tumors were more likely to fail than lymph nodes (OR = 12.8, 95% CI: 5.10-32.17, p …) … failure (OR = 1.26 per unit increase, 95% CI: 1.12-1.40, p …) … failure site-specific competing risk model based on patient- and lesion-level characteristics. Failure patterns differed between adenocarcinoma and squamous cell carcinoma, illustrating the limitation of aggregating them into NSCLC. Failure site-specific models add complementary information to conventional prognostic models. Copyright © 2018 International Association for the Study of Lung Cancer. Published by Elsevier Inc. All rights reserved.
Luxford, Cynthia J.; Bretz, Stacey Lowery
2013-01-01
Chemistry students encounter a variety of terms, definitions, and classification schemes that many instructors expect students to memorize and be able to use. This research investigated students' descriptions of ionic and covalent bonding beyond definitions in order to explore students' knowledge about chemical bonding. Using Johnstone's Multiple…
Energy Technology Data Exchange (ETDEWEB)
Shipman, Galen M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2016-06-13
These are the slides for a presentation on programming models in HPC, at the Los Alamos National Laboratory's Parallel Computing Summer School. The following topics are covered: Flynn's Taxonomy of computer architectures; single instruction single data; single instruction multiple data; multiple instruction multiple data; address space organization; definition of Trinity (Intel Xeon-Phi is a MIMD architecture); single program multiple data; multiple program multiple data; ExMatEx workflow overview; definition of a programming model, programming languages, runtime systems; programming model and environments; MPI (Message Passing Interface); OpenMP; Kokkos (Performance Portable Thread-Parallel Programming Model); Kokkos abstractions, patterns, policies, and spaces; RAJA, a systematic approach to node-level portability and tuning; overview of the Legion Programming Model; mapping tasks and data to hardware resources; interoperability: supporting task-level models; Legion S3D execution and performance details; workflow, integration of external resources into the programming model.
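The single-program-multiple-data idea mentioned in the slides can be illustrated without any HPC runtime: every "rank" executes the same program text, and only the rank id selects the slice of data it works on. The sketch below emulates SPMD ranks sequentially in plain Python (a real MPI code would launch the same function once per process and reduce with MPI_Reduce):

```python
def spmd_partial_sum(rank, n_ranks, data):
    """The same program runs on every rank; the rank id picks the data slice."""
    chunk = (len(data) + n_ranks - 1) // n_ranks   # ceil division
    lo = rank * chunk
    hi = min((rank + 1) * chunk, len(data))
    return sum(data[lo:hi])

def run_spmd(n_ranks, data):
    """Emulate the launcher: invoke the identical program on each rank, then reduce."""
    partials = [spmd_partial_sum(r, n_ranks, data) for r in range(n_ranks)]
    return sum(partials)  # stands in for the final reduction step
```

The same decomposition pattern underlies MPI, OpenMP worksharing, and the Kokkos/RAJA parallel-for abstractions the slides survey; what differs is how the runtime maps ranks or threads to hardware.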
Person Oriented Models (POMs) provide a basis for simulating aggregate chemical exposures in a population over time (Price and Chaisson, 2005). POMs assign characteristics to simulated individuals that are used to determine the individual’s probability of interacting with e...
DEFF Research Database (Denmark)
Beauquier, Maxime; Schürmann, Carsten
2011-01-01
In this paper, we present a model based on relations for bigraphical reactive systems [Milner09]. Its defining characteristic is that validity and reaction relations are captured as traces in a multi-set rewriting system. The relational model is derived from Milner's graphical definition…
Degeling, Koen; IJzerman, Maarten J; Koopman, Miriam; Koffijberg, Hendrik
2017-12-15
Parametric distributions based on individual patient data can be used to represent both stochastic and parameter uncertainty. Although general guidance is available on how parameter uncertainty should be accounted for in probabilistic sensitivity analysis, there is no comprehensive guidance on reflecting parameter uncertainty in the (correlated) parameters of distributions used to represent stochastic uncertainty in patient-level models. This study aims to provide this guidance by proposing appropriate methods and illustrating the impact of this uncertainty on modeling outcomes. Two approaches, 1) using non-parametric bootstrapping and 2) using multivariate Normal distributions, were applied in a simulation and case study. The approaches were compared based on point estimates and distributions of time-to-event and health economic outcomes. To assess the impact of sample size on the uncertainty in these outcomes, sample size was varied in the simulation study and subgroup analyses were performed for the case study. Accounting for parameter uncertainty in distributions that reflect stochastic uncertainty substantially increased the uncertainty surrounding health economic outcomes, illustrated by larger confidence ellipses surrounding the cost-effectiveness point estimates and different cost-effectiveness acceptability curves. Although both approaches performed similarly for larger sample sizes (i.e., n = 500), the second approach was more sensitive to extreme values for small sample sizes (i.e., n = 25), yielding infeasible modeling outcomes. Modelers should be aware that parameter uncertainty in distributions used to describe stochastic uncertainty needs to be reflected in probabilistic sensitivity analysis, as it could substantially impact the total amount of uncertainty surrounding health economic outcomes. If feasible, the bootstrap approach is recommended to account for this uncertainty.
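The non-parametric bootstrap approach recommended above can be sketched for a single time-to-event parameter; the event times and the exponential distribution choice are illustrative assumptions, not the study's case data:

```python
import random

def bootstrap_rate(times, n_boot=200, seed=42):
    """Non-parametric bootstrap of the exponential rate (events per unit time).

    Each resample refits the parameter, so the spread of the returned
    estimates reflects parameter uncertainty on top of the stochastic
    uncertainty already captured by the fitted distribution itself.
    """
    rng = random.Random(seed)
    estimates = []
    for _ in range(n_boot):
        resample = [rng.choice(times) for _ in times]  # sample with replacement
        estimates.append(len(resample) / sum(resample))  # MLE of the rate
    return estimates

# illustrative patient-level event times (months)
times = [2.1, 3.4, 5.0, 5.5, 7.2, 8.8, 10.1, 12.9, 14.0, 20.3]
```

Feeding each bootstrapped parameter set through the patient-level model, rather than only the point estimate, is what propagates this parameter uncertainty into the cost-effectiveness outputs.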
International Nuclear Information System (INIS)
Lundberg, Jonas; Johansson, Björn JE
2015-01-01
It has been realized that resilience as a concept involves several contradictory definitions, for instance resilience as agile adjustment and as robust resistance to situations. Our analysis of resilience concepts and models suggests that, beyond simplistic definitions, it is possible to draw up a systemic resilience model (SyRes) that maintains these opposing characteristics without contradiction. We outline six functions in a systemic model, drawing primarily on resilience engineering and disaster response: anticipation, monitoring, response, recovery, learning, and self-monitoring. The model consists of four areas: Event-based constraints, Functional Dependencies, Adaptive Capacity and Strategy. The paper describes dependencies between constraints, functions and strategies. We argue that models such as SyRes should be useful both for envisioning new resilience methods and metrics, and for engineering and evaluating resilient systems. - Highlights: • The SyRes model resolves contradictions between previous resilience definitions. • SyRes is a core model for envisioning and evaluating resilience metrics and models. • SyRes describes six functions in a systemic model. • They are anticipation, monitoring, response, recovery, learning, self-monitoring. • The model describes dependencies between constraints, functions and strategies
Arnaoudova, Kristina; Stanchev, Peter
2015-11-01
Business processes are a key asset for every organization, and the design of business process models is a central concern across an organization's functions. Business processes and their proper management depend heavily on the performance of software applications and technology solutions. The paper attempts a definition of a new conceptual model of an IT service provider, which can be viewed as an IT-focused enterprise model, part of the Enterprise Architecture (EA) school.
Edwards, Dylan; Cortes, Mar; Datta, Abhishek; Minhas, Preet; Wassermann, Eric M.; Bikson, Marom
2015-01-01
Transcranial Direct Current Stimulation (tDCS) is a non-invasive, low-cost, well-tolerated technique producing lasting modulation of cortical excitability. Behavioral and therapeutic outcomes of tDCS are linked to the targeted brain regions, but there is little evidence that current reaches the brain as intended. We aimed to: (1) validate a computational model for estimating cortical electric fields in human transcranial stimulation, and (2) assess the magnitude and spread of cortical electric field with a novel High-Definition tDCS (HD-tDCS) scalp montage using a 4×1-Ring electrode configuration. In three healthy adults, Transcranial Electrical Stimulation (TES) over primary motor cortex (M1) was delivered using the 4×1 montage (4× cathode, surrounding a single central anode; montage radius ~3 cm) with sufficient intensity to elicit a discrete muscle twitch in the hand. The estimated current distribution in M1 was calculated using the individualized MRI-based model, and compared with the observed motor response across subjects. The response magnitude was quantified with stimulation over motor cortex as well as anterior and posterior to motor cortex. In each case the model data were consistent with the motor response across subjects. The estimated cortical electric fields with the 4×1 montage were compared (area, magnitude, direction) for TES and tDCS in each subject. We provide direct evidence in humans that TES with a 4×1-Ring configuration can activate motor cortex and that current does not substantially spread outside the stimulation area. Computational models predict that both TES and tDCS waveforms using the 4×1-Ring configuration generate electric fields in cortex with comparable gross current distribution, and preferentially directed normal (inward) currents. The agreement of modeling and experimental data for both current delivery and focality support the use of the HD-tDCS 4×1-Ring montage for cortically targeted neuromodulation.
Walsh, Colin; Hripcsak, George
2014-12-01
Hospital readmission risk prediction remains an active area of investigation and operations in light of the hospital readmissions reduction program through CMS. Multiple models of risk have been reported with variable discriminatory performance, and it remains unclear how design factors affect performance. The aim was to study the effects of varying three factors of model development in the prediction of risk based on health record data: (1) reason for readmission (primary readmission diagnosis); (2) available data and data types (e.g., visit history, laboratory results); (3) cohort selection. Regularized regression (LASSO) was used to generate predictions of readmission risk using prevalence sampling; a Support Vector Machine (SVM) was used for comparison in cohort selection testing, and calibration was performed by model refitting to outcome prevalence. Predicting readmission risk across multiple reasons for readmission resulted in ROC areas ranging from 0.92 for readmission for congestive heart failure to 0.71 for syncope and 0.68 for all-cause readmission. Visit history and laboratory tests contributed the most predictive value; contributions varied by readmission diagnosis. Cohort definition affected performance for both parametric and nonparametric algorithms. Compared to all patients, limiting the cohort to patients whose index admission and readmission diagnoses matched resulted in a decrease in average ROC from 0.78 to 0.55 (difference in ROC 0.23, p value 0.01). Calibration plots demonstrate good calibration with low mean squared error. Targeting reason for readmission in risk prediction impacted discriminatory performance. In general, laboratory data and visit history data contributed the most to prediction; data source contributions varied by reason for readmission. Cohort selection had a large impact on model performance, and these results demonstrate the difficulty of comparing results across different studies of predictive risk modeling. Copyright © 2014 Elsevier Inc. All rights reserved.
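The LASSO penalty used in such readmission models shrinks some coefficients exactly to zero via soft-thresholding. A minimal coordinate-descent sketch for a standardized linear problem is shown below; this is a generic illustration of the technique, not the authors' health-record pipeline:

```python
def soft_threshold(z, lam):
    """The LASSO proximal operator: shrink towards zero, clip at zero."""
    if z > lam:
        return z - lam
    if z < -lam:
        return z + lam
    return 0.0

def lasso_cd(X, y, lam, n_iter=100):
    """Cyclic coordinate descent for (1/2n)*||y - Xb||^2 + lam*||b||_1,
    assuming the columns of X are standardized (mean 0, unit variance)."""
    n, p = len(X), len(X[0])
    b = [0.0] * p
    for _ in range(n_iter):
        for j in range(p):
            # partial residual excluding feature j
            r = [y[i] - sum(X[i][k] * b[k] for k in range(p) if k != j)
                 for i in range(n)]
            rho = sum(X[i][j] * r[i] for i in range(n)) / n
            b[j] = soft_threshold(rho, lam)
    return b
```

Coefficients whose correlation with the residual falls below the penalty are set exactly to zero, which is how the model selects among visit-history and laboratory features.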
International Nuclear Information System (INIS)
Ellis, R.J.
2000-01-01
The US Department of Energy (USDOE) has contracted with Duke Engineering and Services, Cogema, Inc., and Stone and Webster (DCS) to provide mixed-oxide (MOX) fuel fabrication and reactor irradiation services in support of USDOE's mission to dispose of surplus weapons-grade plutonium. The nuclear station units currently identified as mission reactors for this project are Catawba Units 1 and 2 and McGuire Units 1 and 2. This report is specific to Catawba Nuclear Station Units 1 and 2, but the details and materials for the McGuire reactors are very similar. The purpose of this document is to present a complete set of data about the reactor materials and components to be used in modeling the Catawba reactors to predict reactor physics parameters for the Catawba site. Except where noted, Duke Power Company or DCS documents are the sources of these data. These data are being used with the ORNL computer code models of the DCS Catawba (and McGuire) pressurized-water reactors
Ravi, Sathish Kumar; Gawad, Jerzy; Seefeldt, Marc; Van Bael, Albert; Roose, Dirk
2017-10-01
A numerical multi-scale model is being developed to predict the anisotropic macroscopic material response of multi-phase steel. The embedded microstructure is given by a meso-scale Representative Volume Element (RVE), which holds the most relevant features, such as phase distribution, grain orientation and morphology, in sufficient detail to describe the multi-phase behavior of the material. A Finite Element (FE) mesh of the RVE is constructed using statistical information from the individual phases, such as grain size distribution and ODF. The material response of the RVE is obtained for selected loading/deformation modes through numerical FE simulations in Abaqus. For the elasto-plastic response of the individual grains, plastic potential functions based on single crystal plasticity are proposed as Abaqus material definitions. The plastic potential functions are derived using the Facet method for the individual phases in the microstructure at the level of single grains. The proposed method is a new modeling framework; the results presented here, in terms of macroscopic flow curves, are based on the building blocks of the approach, and the model will eventually facilitate the construction of an anisotropic yield locus of the underlying multi-phase microstructure derived from a crystal plasticity based framework.
Wu, Xiaojie; Li, Xiantao
2015-01-01
Results from molecular dynamics simulations often need to be further processed to understand the physics on a larger scale. This paper considers the definitions of momentum and energy fluxes obtained from a control-volume approach. To assess the validity of these defined quantities, two consistency criteria are proposed. As examples, the embedded atom potential and the Tersoff potential are considered. The consistency is verified using analytical and numerical methods.
Directory of Open Access Journals (Sweden)
Tea Ya. Danelyan
2014-01-01
The article states the general principles of structural modeling from the perspective of systems theory and relates it to other types of modeling within the main directions of the field. Mathematical methods of structural modeling, in particular the method of expert evaluations, are considered.
African Journals Online (AJOL)
Moatez Billah HARIDA
The use of the simulator "Hybrid Electrical Vehicle Model Balances Fidelity and Speed (HEVMBFS)" together with the global control strategy makes it possible to achieve encouraging results. Key words: series-parallel hybrid vehicle, nonlinear model, linear model, Diesel engine, engine modelling, HEV simulator, predictive …
DEFF Research Database (Denmark)
Sales-Cruz, Mauricio; Piccolo, Chiara; Heitzig, Martina
2011-01-01
This chapter presents various types of constitutive models and their applications. Three aspects are dealt with: the creation and solution of property models, the application of parameter estimation, and application examples of constitutive models. A systematic procedure is introduced for the analysis and solution of property models. Models that capture and represent the temperature-dependent behaviour of physical properties are introduced, as well as equation of state (EOS) models such as the SRK EOS. Modelling of liquid-phase activity coefficients is also…
Chang, CC
2012-01-01
Model theory deals with a branch of mathematical logic showing connections between a formal language and its interpretations or models. This is the first and most successful textbook in logical model theory. Extensively updated and corrected in 1990 to accommodate developments in model theoretic methods - including classification theory and nonstandard analysis - the third edition added entirely new sections, exercises, and references. Each chapter introduces an individual method and discusses specific applications. Basic methods of constructing models include constants, elementary chains, Skolem functions…
Directory of Open Access Journals (Sweden)
Sona Benesova
2014-11-01
With the aid of DEFORM® software it is possible to conduct numerical simulation of the workpiece phase composition during and after heat treatment. The computation can be based either on the graphical representation of the TTT diagram of the steel in question or on one of the mathematical models integrated in the software, the latter being applicable if the required constants are known. The present paper evaluates the differences between results of numerical simulations with various definitions of the phase transformation for the heat treatment of a gearwheel and of a specially prepared specimen of simple shape. It was found that the preparation of input data, in terms of a thorough mapping of the characteristics of the material, is essential.
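When a mathematical model rather than a graphical TTT diagram is used, the isothermal fraction transformed is commonly described by a JMAK (Avrami) equation. A minimal sketch follows; the rate constant and exponent are hypothetical placeholders, not DEFORM®'s internal values for any particular steel:

```python
import math

def avrami_fraction(t, k, n):
    """JMAK isothermal transformed fraction: X(t) = 1 - exp(-k * t**n)."""
    return 1.0 - math.exp(-k * t ** n)

def time_to_fraction(X, k, n):
    """Invert the JMAK equation: time needed to reach fraction X."""
    return (-math.log(1.0 - X) / k) ** (1.0 / n)
```

This is exactly the kind of model for which "the required constants" (k and n per phase and temperature) must be known before the software can replace the graphical TTT input.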
International Nuclear Information System (INIS)
Pinedo, P.
2002-01-01
The long life of high-level waste and its possible releases from the repository in the far future, over wide time frames, make it difficult to forecast actual doses. Similar difficulties were found when trying to establish or recommend protection criteria for the environment and human health. The stochastic nature of the whole problem, from the causes that initiate radionuclide releases to the nature of the environmental conditions where the impact is evaluated, complicates the treatment of radionuclide transport models and the analysis of radiological impact. The application of radiological protection principles to this management option was also seen as different from other present-day practices. All this gave rise to the diversification of research lines towards new areas that allow for the analysis of radionuclide transport, dose calculations and criteria in this new situation. The approach for the biosphere system based on the 'reference' concept, in essence the same idea as the 'Reference Man' concept, was promoted internationally, first within the BIOMOVS II Project and afterwards in the IAEA BIOMASS Programme. In parallel to the participation in these projects, and based on their conclusions, CIEMAT has been developing for ENRESA a methodology which has to be updated and completed with recent developments from BIOMASS Theme 1, notably, for the justification and identification step, the description of critical groups and the use of the data protocol. An application of this methodology was performed and published in 1998, and its results and conclusions are summarised in the paper. The paper also includes the main conclusions from the biosphere modelling applied in the last ENRESA2000 Spanish PA exercise, and the difficulties found in the consistency between the scenario generation procedure, the treatment of the interface and the source term, and the use of the reference biosphere concept. (author)
International Nuclear Information System (INIS)
Buchler, J.R.; Gottesman, S.T.; Hunter, J.H. Jr.
1990-01-01
Various papers on galactic models are presented. Individual topics addressed include: observations relating to galactic mass distributions; the structure of the Galaxy; mass distribution in spiral galaxies; rotation curves of spiral galaxies in clusters; grand design, multiple arm, and flocculent spiral galaxies; observations of barred spirals; ringed galaxies; elliptical galaxies; the modal approach to models of galaxies; self-consistent models of spiral galaxies; dynamical models of spiral galaxies; N-body models. Also discussed are: two-component models of galaxies; simulations of cloudy, gaseous galactic disks; numerical experiments on the stability of hot stellar systems; instabilities of slowly rotating galaxies; spiral structure as a recurrent instability; model gas flows in selected barred spiral galaxies; bar shapes and orbital stochasticity; three-dimensional models; polar ring galaxies; dynamical models of polar rings
Ducasse, Éric; Yaacoubi, Slah
2010-01-01
A tensor Hankel transform (THT) is defined for vector fields, such as displacement, and second-order tensor fields, such as stress or strain. The THT establishes a bijection between the real space and the wave-vector domain, and, remarkably, cannot be reduced to a scalar transform applied separately to each component. One of the advantages of this approach is that some standard elasticity problems can be concisely rewritten by applying this tensor integral transform coupled with an azimuthal Fourier series expansion. A simple and compact formulation of the boundary conditions is also achieved. Thanks to the THT, we obtain for each azimuthal wavenumber and each azimuthal direction exactly the same wave equation as for a standard 2D model of elastic wave propagation. Thus, waves similar to the standard plane P, SV and SH waves are naturally found. Lastly, the THT is used to calculate the ultrasonic field in an isotropic cylindrical leaky waveguide, the walls of which radiate into a surrounding elastic medium, using a standard scattering approach.
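As context for the scalar transform that the THT generalizes, a minimal numerical sketch of the ordinary order-0 Hankel transform (not the tensor version of the abstract) can verify a known transform pair; the series length and quadrature parameters are arbitrary illustrative choices:

```python
import math

def j0(x):
    # Bessel J0 via its power series: sum_m (-1)^m (x^2/4)^m / (m!)^2
    # (adequate in double precision for the moderate arguments used here)
    term, total = 1.0, 1.0
    for m in range(1, 60):
        term *= -(x * x) / (4.0 * m * m)
        total += term
    return total

def hankel0(f, k, r_max=12.0, n=2400):
    # Order-0 Hankel transform F(k) = integral_0^inf f(r) J0(k r) r dr,
    # truncated at r_max and evaluated by the midpoint rule
    h = r_max / n
    total = 0.0
    for i in range(n):
        r = (i + 0.5) * h
        total += f(r) * j0(k * r) * r
    return total * h

# Known self-reciprocal pair: exp(-r^2/2) transforms to exp(-k^2/2)
for k in (0.0, 0.5, 1.0, 2.0):
    assert abs(hankel0(lambda r: math.exp(-r * r / 2), k) - math.exp(-k * k / 2)) < 1e-3
```

The tensor version in the paper couples components instead of transforming them independently; this sketch only fixes intuition for the scalar kernel it builds on.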
Guerrero, J M; Martínez-Tomás, R; Rincón, M; Peraita, H
2016-01-01
Early detection of Alzheimer's disease (AD) has become one of the principal focuses of research in medicine, particularly when the disease is incipient or even prodromic, because treatments are more effective in these stages. Lexical-semantic-conceptual deficit (LSCD) in the oral definitions of semantic categories for basic objects is an important early indicator in the evaluation of the cognitive state of patients. The objective of this research is to define an economic procedure for cognitive impairment (CI) diagnosis, which may be associated with early stages of AD, by analysing cognitive alterations affecting declarative semantic memory. Because of its low cost, it could be used for routine clinical evaluations or screenings, leading to more expensive and selective tests that confirm or rule out the disease accurately. It should necessarily be an explanatory procedure, which would allow us to study the evolution of the disease in relation to CI, the irregularities in different semantic categories, and other neurodegenerative diseases. On the basis of these requirements, we hypothesise that Bayesian networks (BNs) are the most appropriate tool for this purpose. We have developed a BN for CI diagnosis in mild and moderate AD patients by analysing the oral production of semantic features. The BN causal model represents LSCD in certain semantic categories, both of living things (dog, pine, and apple) and non-living things (chair, car, and trousers), as symptoms of CI. The model structure, the qualitative part of the model, uses domain knowledge obtained from psychology experts and epidemiological studies. Further, the model parameters, the quantitative part of the model, are learnt automatically from epidemiological studies and Peraita and Grasso's linguistic corpus of oral definitions. This corpus was prepared with an incidental sampling and included the analysis of the oral linguistic production of 81 participants (42 cognitively healthy elderly people and 39
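The causal idea in the abstract (CI as cause, per-category lexical-semantic deficits as symptoms) can be sketched as a toy Bayesian network with exact inference by enumeration; all probabilities below are made-up illustrative numbers, not the parameters learnt from the Peraita and Grasso corpus:

```python
# Hypothetical prior and conditional probability tables -- illustrative only
P_CI = 0.3                      # prior probability of cognitive impairment
P_DEFICIT = {                   # P(deficit in category definition | CI)
    "dog":   {True: 0.70, False: 0.10},
    "chair": {True: 0.60, False: 0.15},
    "apple": {True: 0.65, False: 0.12},
}

def posterior_ci(evidence):
    # Exact inference by enumeration over the single binary cause node
    score = {}
    for ci in (True, False):
        p = P_CI if ci else 1.0 - P_CI
        for category, deficit in evidence.items():
            q = P_DEFICIT[category][ci]
            p *= q if deficit else 1.0 - q
        score[ci] = p
    return score[True] / (score[True] + score[False])

post = posterior_ci({"dog": True, "chair": True, "apple": False})
assert post > P_CI   # two observed deficits raise the probability of CI
```

The real model distinguishes living/non-living categories and many more features, but the update mechanics are the same.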
Rao, Sunil V; McCoy, Lisa A; Spertus, John A; Krone, Ronald J; Singh, Mandeep; Fitzgerald, Susan; Peterson, Eric D
2013-09-01
This study sought to develop a model that predicts bleeding complications using an expanded bleeding definition among patients undergoing percutaneous coronary intervention (PCI) in contemporary clinical practice. New knowledge about the importance of periprocedural bleeding combined with techniques to mitigate its occurrence and the inclusion of new data in the updated CathPCI Registry data collection forms encouraged us to develop a new bleeding definition and risk model to improve the monitoring and safety of PCI. Detailed clinical data from 1,043,759 PCI procedures at 1,142 centers from February 2008 through April 2011 participating in the CathPCI Registry were used to identify factors associated with major bleeding complications occurring within 72 h post-PCI. Risk models (full and simplified risk scores) were developed in 80% of the cohort and validated in the remaining 20%. Model discrimination and calibration were assessed in the overall population and among the following pre-specified patient subgroups: females, those older than 70 years of age, those with diabetes mellitus, those with ST-segment elevation myocardial infarction, and those who did not undergo in-hospital coronary artery bypass grafting. Using the updated definition, the rate of bleeding was 5.8%. The full model included 31 variables, and the risk score had 10. The full model had similar discriminatory value across pre-specified subgroups and was well calibrated across the PCI risk spectrum. The updated bleeding definition identifies important post-PCI bleeding events. Risk models that use this expanded definition provide accurate estimates of post-PCI bleeding risk, thereby better informing clinical decision making and facilitating risk-adjusted provider feedback to support quality improvement. Copyright © 2013 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
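The shape of such a simplified points-based risk score can be sketched as follows; the risk factors, point values, and logistic coefficients are hypothetical placeholders, not the published CathPCI bleeding model:

```python
import math

# Hypothetical point weights -- illustrative only, not the CathPCI score
POINTS = {"age_ge_70": 2, "female": 2, "stemi": 3, "shock": 5, "ckd": 3}

def bleeding_risk(features, intercept=-4.0, scale=0.35):
    # Sum points for the risk factors present, then map the total
    # to a predicted probability with a logistic link
    score = sum(POINTS[f] for f in features)
    prob = 1.0 / (1.0 + math.exp(-(intercept + scale * score)))
    return score, prob

low = bleeding_risk([])[1]
high = bleeding_risk(["age_ge_70", "female", "stemi", "shock"])[1]
assert low < high < 1.0   # more risk factors -> higher predicted risk
```

A full model like the 31-variable one described would use fitted regression coefficients rather than rounded points; the 10-item score trades accuracy for bedside usability.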
International Nuclear Information System (INIS)
Borges, C.; Zarza-Moreno, M.; Heath, E.; Teixeira, N.; Vaz, P.
2012-01-01
Purpose: The most recent Varian micro multileaf collimator (MLC), the High Definition (HD120) MLC, was modeled using the BEAMNRC Monte Carlo code. This model was incorporated into a Varian medical linear accelerator, for a 6 MV beam, in static and dynamic mode. The model was validated by comparing simulated profiles with measurements. Methods: The Varian Trilogy (2300C/D) accelerator model was accurately implemented using the state-of-the-art Monte Carlo simulation program BEAMNRC and validated against off-axis and depth dose profiles measured using ionization chambers, by adjusting the energy and the full width at half maximum (FWHM) of the initial electron beam. The HD120 MLC was modeled by developing a new BEAMNRC component module (CM), designated HDMLC, adapting the available DYNVMLC CM and incorporating the specific characteristics of this new micro MLC. The leaf dimensions were provided by the manufacturer. The geometry was visualized by tracing particles through the CM and recording their position when a leaf boundary is crossed. The leaf material density and abutting air gap between leaves were adjusted in order to obtain a good agreement between the simulated leakage profiles and EBT2 film measurements performed in a solid water phantom. To validate the HDMLC implementation, additional MLC static patterns were also simulated and compared to additional measurements. Furthermore, the ability to simulate dynamic MLC fields was implemented in the HDMLC CM. The simulation results of these fields were compared with EBT2 film measurements performed in a solid water phantom. Results: Overall, the discrepancies, with and without MLC, between the opened field simulations and the measurements using ionization chambers in a water phantom, for the off-axis profiles are below 2% and in depth-dose profiles are below 2% after the maximum dose depth and below 4% in the build-up region. On the conditions of these simulations, this tungsten-based MLC has a density of 18.7 g
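A validation criterion like the one reported (profile discrepancies below 2%) can be checked with a small helper; the profile values below are fabricated for illustration, not measured or simulated data:

```python
def max_percent_diff(simulated, measured):
    # Largest local difference between profiles, in percent,
    # normalised to the measured maximum (a common comparison convention)
    ref = max(measured)
    return max(abs(s - m) / ref * 100.0 for s, m in zip(simulated, measured))

# Fabricated off-axis profile samples for illustration
measured  = [10.0, 50.0, 100.0, 95.0, 88.0]
simulated = [10.1, 49.2, 100.9, 94.1, 88.8]
assert max_percent_diff(simulated, measured) < 2.0   # passes the 2% criterion
```

Real comparisons in this context typically also use distance-to-agreement or gamma analysis in high-gradient regions, which a pure percent difference does not capture.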
DEFF Research Database (Denmark)
Ravn, Anders P.; Staunstrup, Jørgen
1994-01-01
This paper proposes a model for specifying interfaces between concurrently executing modules of a computing system. The model does not prescribe a particular type of communication protocol and is aimed at describing interfaces between both software and hardware modules or a combination of the two. The model describes both functional and timing properties of an interface.
Directory of Open Access Journals (Sweden)
Josué Kunjom Mfopou
2014-01-01
Full Text Available Definitive endoderm (DE differentiation from mouse embryonic stem cell (mESC monolayer cultures has been limited by poor cell survival or low efficiency. Recently, a combination of TGFβ and Wnt activation with BMP inhibition improved DE induction in embryoid bodies cultured in suspension. Based on these observations we developed a protocol to efficiently induce DE cells in monolayer cultures of mESCs. We obtained a good cell yield with 54.92% DE induction as shown by Foxa2, Sox17, Cxcr4 and E-Cadherin expression. These DE-cells could be further differentiated into posterior foregut and pancreatic phenotypes using a culture protocol initially developed for human embryonic stem cell (hESC differentiation. In addition, this mESC-derived DE gave rise to hepatocyte-like cells after exposure to BMP and FGF ligands. Our data therefore indicate a substantial improvement of monolayer DE induction from mESCs and support the concept that differentiation conditions for mESC-derived DE are similar to those for hESCs. As mESCs are easier to maintain and manipulate in culture compared to hESCs, and considering the shorter duration of embryonic development in the mouse, this method of efficient DE induction on monolayer will promote the development of new differentiation protocols to obtain DE-derivatives, like pancreatic beta-cells, for future use in cell replacement therapies.
Hydrological models are mediating models
Babel, L. V.; Karssenberg, D.
2013-08-01
Despite the increasing role of models in hydrological research and decision-making processes, only a few accounts of the nature and function of models exist in hydrology. Earlier considerations have traditionally been conducted while making a clear distinction between physically-based and conceptual models. A new philosophical account, primarily based on the fields of physics and economics, transcends classes of models and scientific disciplines by considering models as "mediators" between theory and observations. The core of this approach lies in identifying models as (1) being only partially dependent on theory and observations, (2) integrating non-deductive elements in their construction, and (3) carrying the role of instruments of scientific enquiry about both theory and the world. The applicability of this approach to hydrology is evaluated in the present article. Three widely used hydrological models, each showing a different degree of apparent physicality, are confronted with the main characteristics of the "mediating models" concept. We argue that irrespective of their kind, hydrological models depend on both theory and observations, rather than merely on one of these two domains. Their construction additionally involves a large number of miscellaneous, external ingredients, such as past experiences, model objectives, knowledge and preferences of the modeller, as well as hardware and software resources. We show that hydrological models play the role of instruments in scientific practice by mediating between theory and the world. It follows from these considerations that the traditional distinction between physically-based and conceptual models is necessarily too simplistic and refers at best to the stage at which theory and observations are steering model construction. The large variety of ingredients involved in model construction would deserve closer attention, as they are rarely explicitly presented in peer-reviewed literature. We believe that devoting
International Nuclear Information System (INIS)
Phillips, C.K.
1985-12-01
This lecture provides a survey of the methods used to model fast magnetosonic wave coupling, propagation, and absorption in tokamaks. Three distinct types of modelling codes, whose validity and limitations will be contrasted, include discrete models which utilize ray tracing techniques, approximate continuous field models based on a parabolic approximation of the wave equation, and full field models derived using finite difference techniques. Inclusion of mode conversion effects in these models and modification of the minority distribution function will also be discussed. The lecture will conclude with a presentation of time-dependent global transport simulations of ICRF-heated tokamak discharges obtained in conjunction with the ICRF modelling codes. 52 refs., 15 figs
Zaytsev, V.; Pierantonio, A.; Schätz, B.; Tamzalit, D.
2014-01-01
The evolution of a software language (whether modelled by a grammar or a schema or a metamodel) is not limited to development of new versions and dialects. An important dimension of a software language evolution is maturing in the sense of improving the quality of its definition. In this paper, we
Fernandez, R.; Deveaux, V.
2010-01-01
We provide a formal definition and study the basic properties of partially ordered chains (POC). These systems were proposed to model textures in image processing and to represent independence relations between random variables in statistics (in the latter case they are known as Bayesian networks).
Modelling in Business Model design
Simonse, W.L.
2013-01-01
It appears that business model design might not always produce a design or model as the expected result. However, when designers are involved, a visual model or artefact is produced. To assist strategic managers in thinking about how they can act, the designer's challenge is to combine strategy and
International Nuclear Information System (INIS)
Yang, H.
1999-01-01
The purpose of this analysis and model report (AMR) for the Ventilation Model is to analyze the effects of pre-closure continuous ventilation in the Engineered Barrier System (EBS) emplacement drifts and provide heat removal data to support EBS design. It will also provide input data (initial conditions, and time varying boundary conditions) for the EBS post-closure performance assessment and the EBS Water Distribution and Removal Process Model. The objective of the analysis is to develop, describe, and apply calculation methods and models that can be used to predict thermal conditions within emplacement drifts under forced ventilation during the pre-closure period. The scope of this analysis includes: (1) Provide a general description of effects and heat transfer process of emplacement drift ventilation. (2) Develop a modeling approach to simulate the impacts of pre-closure ventilation on the thermal conditions in emplacement drifts. (3) Identify and document inputs to be used for modeling emplacement ventilation. (4) Perform calculations of temperatures and heat removal in the emplacement drift. (5) Address general considerations of the effect of water/moisture removal by ventilation on the repository thermal conditions. The numerical modeling in this document will be limited to heat-only modeling and calculations. Only a preliminary assessment of the heat/moisture ventilation effects and modeling method will be performed in this revision. Modeling of moisture effects on heat removal and emplacement drift temperature may be performed in the future
International Nuclear Information System (INIS)
Laurence, D.
1997-01-01
This paper is an introductory course in modelling turbulent thermohydraulics, aimed at computational fluid dynamics users. No specific knowledge other than the Navier-Stokes equations is required beforehand. Chapter I (which those who are not beginners can skip) provides basic ideas on turbulence physics and is taken up in a textbook prepared by the teaching team of the ENPC (Benque, Viollet). Chapter II describes turbulent-viscosity-type modelling and the k-ε two-equation model. It provides details of the channel flow case and the boundary conditions. Chapter III describes the 'standard' (R_ij-ε) Reynolds stress transport model and introduces more recent models called 'realizable'. A second paper deals with heat transfer and the effects of gravity, and returns to the Reynolds stress transport model. (author)
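As a concrete anchor for the two-equation model mentioned in Chapter II: in the standard k-ε model the eddy (turbulent) viscosity follows from the turbulent kinetic energy k and its dissipation rate ε alone, with the usual model constant C_mu = 0.09; the sample values below are arbitrary:

```python
# Eddy viscosity from the standard k-epsilon turbulence model:
#   nu_t = C_mu * k^2 / eps
C_MU = 0.09   # standard model constant

def eddy_viscosity(k, eps):
    # k in m^2/s^2, eps in m^2/s^3 -> nu_t in m^2/s
    return C_MU * k * k / eps

nu_t = eddy_viscosity(k=0.5, eps=2.0)   # illustrative values
assert abs(nu_t - 0.09 * 0.25 / 2.0) < 1e-12
```

This closure is what makes the model "two-equation": transport equations are solved for k and ε, and the Reynolds stresses are then modelled through nu_t rather than transported individually as in Chapter III.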
2016-01-01
This book provides a thorough introduction to the challenge of applying mathematics in real-world scenarios. Modelling tasks rarely involve well-defined categories, and they often require multidisciplinary input from mathematics, physics, computer sciences, or engineering. In keeping with this spirit of modelling, the book includes a wealth of cross-references between the chapters and frequently points to the real-world context. The book combines classical approaches to modelling with novel areas such as soft computing methods, inverse problems, and model uncertainty. Attention is also paid to the interaction between models, data and the use of mathematical software. The reader will find a broad selection of theoretical tools for practicing industrial mathematics, including the analysis of continuum models, probabilistic and discrete phenomena, and asymptotic and sensitivity analysis.
DEFF Research Database (Denmark)
Larsen, Lars Bjørn; Vesterager, Johan
This report provides an overview of the existing models of global manufacturing, describes the required modelling views and associated methods and identifies tools which can provide support for this modelling activity. The model adopted for global manufacturing is that of an extended enterprise sharing many of the characteristics of a virtual enterprise. This extended enterprise will have the following characteristics: it is focused on satisfying the current customer requirement, so that it has a limited life expectancy, but should be capable of being recreated to deal... One or more units from beyond the network may complement the extended enterprise. The common reference model for this extended enterprise will utilise GERAM (Generalised Enterprise Reference Architecture and Methodology) to provide an architectural framework for the modelling carried out within...
DEFF Research Database (Denmark)
Blomhøj, Morten
2004-01-01
Developing competences for setting up, analysing and criticising mathematical models are normally seen as relevant only from and above upper secondary level. The general belief among teachers is that modelling activities presuppose conceptual understanding of the mathematics involved. Mathematical modelling, however, can be seen as a practice of teaching that places the relation between real life and mathematics at the centre of teaching and learning mathematics, and this is relevant at all levels. Modelling activities may motivate the learning process and help the learner to establish cognitive roots for the construction of important mathematical concepts. In addition, competences for setting up, analysing and criticising modelling processes and the possible use of models is a formative aim in its own right for mathematics teaching in general education. The paper presents a theoretical...
DEFF Research Database (Denmark)
Bækgaard, Lars
2001-01-01
The purpose of this chapter is to discuss conceptual event modeling within a context of information modeling. Traditionally, information modeling has been concerned with the modeling of a universe of discourse in terms of information structures. However, most interesting universes of discourse are dynamic, and we present a modeling approach that can be used to model such dynamics. We characterize events as both information objects and change agents (Bækgaard 1997). When viewed as information objects, events are phenomena that can be observed and described. For example, borrow events in a library can be characterized by their occurrence times and the participating books and borrowers. When we characterize events as information objects we focus on concepts like information structures. When viewed as change agents, events are phenomena that trigger change. For example, when a borrow event occurs, books are moved...
Bottle, Neil
2013-01-01
The Model : making exhibition was curated by Brian Kennedy in collaboration with Allies & Morrison in September 2013. For the London Design Festival, the Model : making exhibition looked at the increased use of new technologies by both craft-makers and architectural model makers. In both practices traditional ways of making by hand are increasingly being combined with the latest technologies of digital imaging, laser cutting, CNC machining and 3D printing. This exhibition focussed on ...
Wenninger, Magnus J
2012-01-01
Well-illustrated, practical approach to creating star-faced spherical forms that can serve as basic structures for geodesic domes. Complete instructions for making models from circular bands of paper with just a ruler and compass. Discusses tessellation, or tiling, and how to make spherical models of the semiregular solids, and concludes with a discussion of the relationship of polyhedra to geodesic domes and directions for building models of domes. ". . . very pleasant reading." - Science. 1979 edition.
Open source molecular modeling.
Pirhadi, Somayeh; Sunseri, Jocelyn; Koes, David Ryan
2016-09-01
The success of molecular modeling and computational chemistry efforts is, by definition, dependent on quality software applications. Open source software development provides many advantages to users of modeling applications, not the least of which is that the software is free and completely extendable. In this review we categorize, enumerate, and describe available open source software packages for molecular modeling and computational chemistry. An updated online version of this catalog can be found at https://opensourcemolecularmodeling.github.io. Copyright © 2016 The Author(s). Published by Elsevier Inc. All rights reserved.
Modelling dense relational data
DEFF Research Database (Denmark)
Herlau, Tue; Mørup, Morten; Schmidt, Mikkel Nørgaard
2012-01-01
Relational modelling classically considers sparse and discrete data. Measures of influence computed pairwise between temporal sources naturally give rise to dense continuous-valued matrices, for instance p-values from Granger causality. Due to asymmetry or lack of positive definiteness, such matrices are not naturally suited for kernel K-means. We propose a generative Bayesian model for dense matrices which generalizes kernel K-means to consider off-diagonal interactions in matrices of interactions, and demonstrate its ability to detect structure on both artificial data and two real data sets.
Defining fitness in evolutionary models
Indian Academy of Sciences (India)
Keywords: fitness; invasion exponent; adaptive dynamics; game theory; Lyapunov exponent; invasibility; Malthusian parameter. Abstract: The analysis of evolutionary models requires an appropriate definition for fitness.
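One common operationalisation of the fitness notions listed here is invasion fitness as the log of the dominant eigenvalue (the Malthusian parameter) of a population projection matrix, computable by power iteration; the Leslie matrix below is a made-up two-age-class example, not from the paper:

```python
import math

def dominant_eigenvalue(A, iters=500):
    # Power iteration: repeatedly apply A and renormalise; the scaling
    # factor converges to the dominant eigenvalue for a primitive matrix
    n = len(A)
    v = [1.0] * n
    lam = 1.0
    for _ in range(iters):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(abs(x) for x in w)
        v = [x / lam for x in w]
    return lam

# Hypothetical Leslie matrix: fecundities (1, 4) on the top row,
# survival probability 0.5 from age class 1 to 2
L = [[1.0, 4.0],
     [0.5, 0.0]]
lam = dominant_eigenvalue(L)     # characteristic roots are 2 and -1
r = math.log(lam)                # Malthusian parameter
assert r > 0                     # lambda > 1: the type grows / can invade
```

In adaptive-dynamics terms, a mutant with r > 0 in the environment set by the resident can invade; this is the "invasion exponent" of the keyword list in its simplest, density-independent form.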
Models of educational institutions' networking
Shilova Olga Nikolaevna
2015-01-01
The importance of educational institutions' networking in modern sociocultural conditions and a definition of networking in education are presented in the article. The results of research into the levels, methods and models of educational institutions' networking are presented and discussed in detail.
Modeling Documents with Event Model
Directory of Open Access Journals (Sweden)
Longhui Wang
2015-08-01
Full Text Available Currently deep learning has made great breakthroughs in visual and speech processing, mainly because it draws lessons from the hierarchical way the brain deals with images and speech. In the field of NLP, topic models are among the important ways of modeling documents. However, topic models are built on a generative model that clearly does not match the way humans write. In this paper, we propose Event Model, which is unsupervised and based on the language processing mechanism of neurolinguistics, to model documents. In Event Model, documents are descriptions of concrete or abstract events seen, heard, or sensed by people, and words are objects in those events. Event Model has two stages: word learning and dimensionality reduction. Word learning learns the semantics of words using deep learning. Dimensionality reduction represents a document as a low-dimensional vector by a linear model that is completely different from topic models. Event Model achieves state-of-the-art results on document retrieval tasks.
Kummer, E. E.; Siegel, Edward Carl-Ludwig
2011-03-01
Clock-model Archimedes [http://linkage.rockeller.edu/ wli/moved.8.04/ 1fnoise/ index. ru.html] HYPERBOLICITY inevitability throughout physics/pure-maths: Newton-law F=ma, Heisenberg and classical uncertainty-principle=Parseval/Plancherel-theorems causes FUZZYICS definition: (so miscalled) "complexity" = UTTER-SIMPLICITY!!! Watkins[www.secamlocal.ex.ac.uk/people/staff/mrwatkin/]-Hubbard[World According to Wavelets (96)-p.14!]-Franklin[1795]-Fourier[1795;1822]-Brillouin[1922] dual/inverse-space(k,w) analysis key to Fourier-unification in Archimedes hyperbolicity inevitability progress up Siegel cognition hierarchy-of-thinking (HoT): data-info.-know.-understand.-meaning-...-unity-simplicity = FUZZYICS!!! Frohlich-Mossbauer-Goldanskii-del Guidice [Nucl.Phys.B:251,375(85);275,185 (86)]-Young [arXiv-0705.4678y2, (5/31/07] theory of health/life=aqueous-electret/ ferroelectric protoplasm BEC = Archimedes-Siegel [Schrodinger Cent.Symp.(87); Symp.Fractals, MRS Fall Mtg.(89)-5-pprs] 1/w-"noise" Zipf-law power-spectrum hyperbolicity INEVITABILITY= Chi; Dirac delta-function limit w=0 concentration= BEC = Chi-Quong.
Model Reduction in Groundwater Modeling and Management
Siade, A. J.; Kendall, D. R.; Putti, M.; Yeh, W. W.
2008-12-01
Groundwater management requires the development and implementation of mathematical models that, through simulation, evaluate the effects of anthropogenic impacts on an aquifer system. To obtain high levels of accuracy, one must incorporate high levels of complexity, resulting in computationally demanding models. This study provides a methodology for solving groundwater management problems with reduced computational effort by replacing the large, complex numerical model with a significantly smaller, simpler approximation. This is achieved via Proper Orthogonal Decomposition (POD), where the goal is to project the larger model solution space onto a smaller or reduced subspace in which the management problem will be solved, achieving reductions in computation time of up to three orders of magnitude. Once the solution is obtained in the reduced space with acceptable accuracy, it is then projected back to the full model space. A major challenge when using this method is the definition of the reduced solution subspace. In POD, this subspace is defined based on samples or snapshots taken at specific times from the solution of the full model. In this work we determine when snapshots should be taken on the basis of the exponential behavior of the governing partial differential equation. This selection strategy is then generalized for any groundwater model by obtaining and using the optimal snapshot selection for a simplified, dimensionless model. Protocols are developed to allow the snapshot selection results of the simplified, dimensionless model to be transferred to that of a complex, heterogeneous model with any geometry. The proposed methodology is finally applied to a basin in the Oristano Plain on the island of Sardinia, Italy.
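The snapshot-based reduction described here can be sketched in a few lines of numpy; this is a generic POD outline on synthetic low-rank data, not the paper's exponential snapshot-selection strategy:

```python
import numpy as np

# Synthetic "full model" states: dimension 200, but only rank 3
rng = np.random.default_rng(0)
modes = rng.standard_normal((200, 3))
coeffs = rng.standard_normal((3, 25))
snapshots = modes @ coeffs            # 25 snapshots collected over time

# POD basis = leading left singular vectors of the snapshot matrix
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
r = int((s > 1e-10 * s[0]).sum())     # numerical rank / retained modes
basis = U[:, :r]                      # reduced basis, shape (200, r)

# Reduce a full state to r coefficients, then lift it back
x = snapshots[:, 0]
x_reduced = basis.T @ x
x_lifted = basis @ x_reduced
assert r == 3
assert np.allclose(x, x_lifted, atol=1e-8)   # exact here: data is rank 3
```

In a management setting the governing equations are then projected onto `basis`, so the optimization loop runs in r dimensions instead of the full grid dimension, which is where the reported orders-of-magnitude speedups come from.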
DEFF Research Database (Denmark)
Højgaard, Tomas; Hansen, Rune
The purpose of this paper is to introduce Didactical Modelling as a research methodology in mathematics education. We compare the methodology with other approaches and argue that Didactical Modelling has its own specificity. We discuss the methodological "why" and explain why we find it useful to construct this approach in mathematics education research.
Flores, J.; Kiss, S.; Cano, P.; Nijholt, Antinus; Zwiers, Jakob
2003-01-01
We concentrate our efforts on building virtual modelling environments where the content creator uses controls (widgets) as an interactive adjustment modality for the properties of the edited objects. Besides the advantage of being an on-line modelling approach (visualised just like any other on-line
DEFF Research Database (Denmark)
Gøtze, Jens Peter; Krentz, Andrew
2014-01-01
In this issue of Cardiovascular Endocrinology, we are proud to present a broad and dedicated spectrum of reviews on animal models in cardiovascular disease. The reviews cover most aspects of animal models in science from basic differences and similarities between small animals and the human...
Poortman, Sybilla; Sloep, Peter
2006-01-01
Educational models describes a case study on a complex learning object. Possibilities are investigated for using this learning object, which is based on a particular educational model, outside of its original context. Furthermore, this study provides advice that might lead to an increase in
Oh, Phil Seok; Oh, Sung Jin
2013-01-01
Modeling in science has been studied by education researchers for decades and is now being applied broadly in school. It is among the scientific practices featured in the "Next Generation Science Standards" ("NGSS") (Achieve Inc. 2013). This article describes modeling activities in an extracurricular science club in a high…
Jongerden, M.R.; Haverkort, Boudewijn R.H.M.
2008-01-01
The use of mobile devices is often limited by the capacity of the employed batteries. The battery lifetime determines how long one can use a device. Battery modeling can help to predict, and possibly extend this lifetime. Many different battery models have been developed over the years. However,
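One of the analytical models typically compared in such battery-model surveys is the kinetic battery model (KiBaM), in which charge is split between an available well and a bound well; the sketch below is a simple Euler integration with illustrative parameters (the model choice and all numbers are assumptions for illustration, not taken from the abstract):

```python
def kibam_lifetime(capacity=1000.0, c=0.6, k=0.05, current=5.0, dt=0.1):
    # Kinetic battery model: available charge y1 feeds the load directly;
    # bound charge y2 flows into y1 at rate k * (h2 - h1), where h1, h2
    # are the "heights" of the two wells. The battery dies when y1 hits 0.
    y1, y2 = c * capacity, (1.0 - c) * capacity
    t = 0.0
    while y1 > 0.0:
        h1, h2 = y1 / c, y2 / (1.0 - c)
        flow = k * (h2 - h1)               # internal charge recovery
        y1 += dt * (-current + flow)
        y2 += dt * (-flow)
        t += dt
    return t

life = kibam_lifetime()
naive = 1000.0 / 5.0          # ideal linear model: capacity / current
assert 0.0 < life < naive     # rate-capacity effect shortens lifetime
```

This illustrates the survey's point: a naive linear model overestimates lifetime because it ignores the rate-capacity effect, while KiBaM leaves some bound charge unrecovered at the moment the available well empties.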
International Nuclear Information System (INIS)
V. Chipman
2002-01-01
The purpose of the Ventilation Model is to simulate the heat transfer processes in and around waste emplacement drifts during periods of forced ventilation. The model evaluates the effects of emplacement drift ventilation on the thermal conditions in the emplacement drifts and surrounding rock mass, and calculates the heat removal by ventilation as a measure of the viability of ventilation to delay the onset of peak repository temperature and reduce its magnitude. The heat removal by ventilation is temporally and spatially dependent, and is expressed as the fraction of heat carried away by the ventilation air compared to the fraction of heat produced by radionuclide decay. One minus the heat removal is called the wall heat fraction, i.e. the remaining amount of heat that is transferred via conduction to the surrounding rock mass. Downstream models, such as the 'Multiscale Thermohydrologic Model' (BSC 2001), use the wall heat fractions output by the Ventilation Model to initialize their postclosure analyses
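The heat-removal bookkeeping described here reduces to an energy balance on the ventilation air stream; the flow rate, temperatures, and decay heat below are illustrative numbers, not design values from the report:

```python
def heat_removal_fraction(m_dot, cp, t_in, t_out, q_decay):
    # Heat carried away by the air: m_dot * cp * (T_out - T_in), in watts,
    # expressed as a fraction of the radionuclide decay heat q_decay
    q_air = m_dot * cp * (t_out - t_in)
    return q_air / q_decay

# Illustrative values: 18 kg/s of air, cp = 1005 J/(kg K), 20 K temperature
# rise along the drift, 500 kW of decay heat in the ventilated section
eta = heat_removal_fraction(m_dot=18.0, cp=1005.0, t_in=25.0, t_out=45.0,
                            q_decay=5.0e5)
wall_heat_fraction = 1.0 - eta     # conducted into the surrounding rock
assert 0.0 < eta < 1.0
assert abs(eta + wall_heat_fraction - 1.0) < 1e-12
```

The actual model resolves this balance in time and along the drift, but every cell of it obeys this same bookkeeping: removed fraction plus wall heat fraction equals one.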
DEFF Research Database (Denmark)
Kindler, Ekkart
2009-01-01
There are many different notations and formalisms for modelling business processes and workflows. These notations and formalisms have been introduced with different purposes and objectives. Later, influenced by other notations, comparisons with other tools, or by standardization efforts, these notations have been extended in order to increase expressiveness and to be more competitive. This resulted in an increasing number of notations and formalisms for modelling business processes and in an increase of the different modelling constructs provided by modelling notations, which makes it difficult to compare modelling notations and to make transformations between them. One of the reasons is that, in each notation, the new concepts are introduced in a different way by extending the already existing constructs. In this chapter, we go the opposite direction: we show that it is possible to add most
Energy Technology Data Exchange (ETDEWEB)
Veronica J. Rutledge
2013-01-01
The absence of industrial scale nuclear fuel reprocessing in the U.S. has precluded the necessary driver for developing the advanced simulation capability now prevalent in so many other countries. Thus, it is essential to model complex series of unit operations to simulate, understand, and predict inherent transient behavior and feedback loops. A capability of accurately simulating the dynamic behavior of advanced fuel cycle separation processes will provide substantial cost savings and many technical benefits. The specific fuel cycle separation process discussed in this report is the off-gas treatment system. The off-gas separation consists of a series of scrubbers and adsorption beds to capture constituents of interest. Dynamic models are being developed to simulate each unit operation involved so each unit operation can be used as a stand-alone model and in series with multiple others. Currently, an adsorption model has been developed within Multi-physics Object Oriented Simulation Environment (MOOSE) developed at the Idaho National Laboratory (INL). Off-gas Separation and REcoverY (OSPREY) models the adsorption of off-gas constituents for dispersed plug flow in a packed bed under non-isothermal and non-isobaric conditions. Inputs to the model include gas, sorbent, and column properties, equilibrium and kinetic data, and inlet conditions. The simulation outputs component concentrations along the column length as a function of time from which breakthrough data is obtained. The breakthrough data can be used to determine bed capacity, which in turn can be used to size columns. It also outputs temperature along the column length as a function of time and pressure drop along the column length. Experimental data and parameters were input into the adsorption model to develop models specific for krypton adsorption. The same can be done for iodine, xenon, and tritium. The model will be validated with experimental breakthrough curves. Customers will be given access to
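A drastically simplified sketch of the kind of breakthrough simulation described above can convey the idea: a 1D plug-flow column with linear-driving-force uptake and a linear isotherm, integrated by explicit upwind differencing. This is far simpler than OSPREY (which is non-isothermal, non-isobaric, and includes dispersion), and every parameter below is invented for illustration.

```python
import numpy as np

# Minimal isothermal packed-bed breakthrough sketch, in the spirit of the
# adsorption model described above but much simpler. All parameters invented.

def breakthrough(n=50, L=0.1, v=0.01, dt=0.1, steps=20000,
                 k=0.01, K=50.0, phi=1.0, c_in=1.0):
    """Outlet concentration history for a 1D packed bed (explicit upwind)."""
    dx = L / n
    c = np.zeros(n)                  # fluid-phase concentration profile
    q = np.zeros(n)                  # adsorbed-phase loading
    out = np.empty(steps)
    for i in range(steps):
        dq = k * (K * c - q) * dt    # linear-driving-force uptake
        q += dq
        c[1:] -= v * dt / dx * (c[1:] - c[:-1])   # upwind advection
        c[0] = c_in                               # constant inlet feed
        c = np.clip(c - phi * dq, 0.0, None)      # mass transferred to sorbent
        out[i] = c[-1]
    return out

out = breakthrough()
# the outlet stays near zero until the bed saturates, then rises toward c_in;
# the resulting curve is the "breakthrough data" used to size columns
```

The S-shaped `out` curve is the breakthrough data the abstract refers to; its midpoint time reflects the bed capacity.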
Directory of Open Access Journals (Sweden)
Slavik Stefan
2014-12-01
The term business model has been used in practice for a few years, but companies have created, defined and innovated their models subconsciously since the start of business. Our paper aims to clarify the theory of the business model, i.e., its definition and all the components that form each business. In the second part, we create an analytical tool, analyze real business models in Slovakia, and define the characteristics of each part of the business model, i.e., customers, distribution, value, resources, activities, costs and revenues. In the last part of the paper, we discuss the most frequently used characteristics, extremes, discrepancies and the most important facts detected in our research.
McMEEKIN, Thomas A; Ross, Thomas
1996-12-01
The concept of predictive microbiology has developed rapidly through the initial phases of experimental design and model development and the subsequent phase of model validation. A fully validated model represents a general rule which may be brought to bear on particular cases. For some microorganism/food combinations, sufficient confidence now exists to indicate substantial benefits to the food industry from use of predictive models. Several types of devices are available to monitor and record environmental conditions (particularly temperature). These "environmental histories" can be interpreted, using predictive models, in terms of microbial proliferation. The current challenge is to provide systems for the collection and interpretation of environmental information which combine ease of use, reliability, and security, providing the industrial user with the ability to make informed and precise decisions regarding the quality and safety of foods. Many specific applications for predictive modeling can be developed from a basis of understanding the inherent qualities of a fully validated model. These include increased precision and confidence in predictions based on accumulation of quantitative data, objective and rapid assessment of the effect of environmental conditions on microbial proliferation, and flexibility in monitoring the relative contribution of component parts of processing, distribution, and storage systems for assurance of shelf life and safety.
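One widely taught predictive-microbiology model is the square-root (Ratkowsky-type) model, in which the square root of the growth rate rises linearly with temperature above a minimum. The sketch below shows how such a model can turn a recorded "environmental history" into a growth estimate, as the abstract describes; the constants `B` and `T_MIN` are invented for illustration, not fitted values for any real organism.

```python
# Illustrative square-root model from predictive microbiology:
#   sqrt(mu) = B * (T - T_MIN)   for T > T_MIN
# B and T_MIN below are hypothetical, chosen only for demonstration.

B, T_MIN = 0.02, -2.0   # hypothetical constants (mu in log10 units per hour)

def growth_rate(temp_c):
    """Growth rate predicted by the square-root model (0 below T_MIN)."""
    if temp_c <= T_MIN:
        return 0.0
    return (B * (temp_c - T_MIN)) ** 2

def log_growth(history):
    """Integrate a list of (hours, temperature) segments into log10 growth."""
    return sum(hours * growth_rate(t) for hours, t in history)

# A cold chain with a brief warm excursion: the 2 h at 20 C contributes
# roughly a third of the predicted growth.
history = [(24, 4.0), (2, 20.0), (24, 4.0)]
print(round(log_growth(history), 3))  # 1.078
```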
Modeling complexes of modeled proteins.
Anishchenko, Ivan; Kundrotas, Petras J; Vakser, Ilya A
2017-03-01
Structural characterization of proteins is essential for understanding life processes at the molecular level. However, only a fraction of known proteins have experimentally determined structures. This fraction is even smaller for protein-protein complexes. Thus, structural modeling of protein-protein interactions (docking) primarily has to rely on modeled structures of the individual proteins, which typically are less accurate than the experimentally determined ones. Such "double" modeling is the Grand Challenge of structural reconstruction of the interactome. Yet it remains so far largely untested in a systematic way. We present a comprehensive validation of template-based and free docking on a set of 165 complexes, where each protein model has six levels of structural accuracy, from 1 to 6 Å Cα RMSD. Many template-based docking predictions fall into the acceptable quality category, according to the CAPRI criteria, even for highly inaccurate proteins (5-6 Å RMSD), although the number of such models (and, consequently, the docking success rate) drops significantly for models with RMSD > 4 Å. The results show that the existing docking methodologies can be successfully applied to protein models with a broad range of structural accuracy, and the template-based docking is much less sensitive to inaccuracies of protein models than the free docking. Proteins 2017; 85:470-478. © 2016 Wiley Periodicals, Inc.
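The Cα RMSD values used above to grade model accuracy are computed after optimal rigid superposition, conventionally via the Kabsch algorithm. A minimal sketch (not the CAPRI evaluation software) is:

```python
import numpy as np

def kabsch_rmsd(P, Q):
    """RMSD between two (n, 3) coordinate sets after optimal superposition."""
    P = P - P.mean(axis=0)                    # remove translation
    Q = Q - Q.mean(axis=0)
    V, S, Wt = np.linalg.svd(P.T @ Q)         # SVD of the covariance matrix
    d = np.sign(np.linalg.det(V @ Wt))        # guard against reflections
    R = V @ np.diag([1.0, 1.0, d]) @ Wt       # optimal rotation (Kabsch)
    diff = P @ R - Q
    return np.sqrt((diff ** 2).sum() / len(P))

# Two copies of the same points related only by a shift give RMSD ~ 0:
rng = np.random.default_rng(0)
pts = rng.normal(size=(10, 3))
print(round(kabsch_rmsd(pts, pts + 1.0), 6))  # 0.0
```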
DEFF Research Database (Denmark)
Pedersen, Mogens Jin; Stritch, Justin Michael
2018-01-01
Recently, scholars have issued calls for more replication, but academic reflections on when replication adds substantive value to public administration and management research are needed. This concise article presents a conceptual model, RNICE, for assessing when and how a replication study contributes knowledge about a social phenomenon and advances knowledge in the public administration and management literatures. The RNICE model provides a vehicle for researchers who seek to evaluate or demonstrate the value of a replication study systematically. We illustrate the practical application of the model using two previously published replication studies as examples.
Blacher, René
2010-01-01
This report completes the two previous reports and gives a simpler explanation of the earlier results, namely a proof that the sequences obtained are random. In previous reports, we have shown how to transform a text $y_n$ into a random sequence by using Fibonacci functions $T_q$. Now, in this report, we obtain a clearer result by proving that $T_q(y_n)$ has the IID model as a correct model. But it is necessary to define precisely what a correct model is. Then, we also study this pro…
National Oceanic and Atmospheric Administration, Department of Commerce — Computer simulations of past climate. Variables provided as model output are described by parameter keyword. In some cases the parameter keywords are a subset of all...
Regardt, Olle; Rönnbäck, Lars; Bergholtz, Maria; Johannesson, Paul; Wohed, Petia
Maintaining and evolving data warehouses is a complex, error prone, and time consuming activity. The main reason for this state of affairs is that the environment of a data warehouse is in constant change, while the warehouse itself needs to provide a stable and consistent interface to information spanning extended periods of time. In this paper, we propose a modeling technique for data warehousing, called anchor modeling, that offers non-destructive extensibility mechanisms, thereby enabling robust and flexible management of changes in source systems. A key benefit of anchor modeling is that changes in a data warehouse environment only require extensions, not modifications, to the data warehouse. This ensures that existing data warehouse applications will remain unaffected by the evolution of the data warehouse, i.e. existing views and functions will not have to be modified as a result of changes in the warehouse model.
Searle, Shayle R
2012-01-01
This 1971 classic on linear models is once again available--as a Wiley Classics Library Edition. It features material that can be understood by any statistician who understands matrix algebra and basic statistical methods.
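The matrix-algebra core of linear model theory that the book builds on is the least squares estimator from the normal equations, β̂ = (X′X)⁻¹X′y. A minimal sketch on synthetic data (the design, coefficients, and noise level are invented):

```python
import numpy as np

# Normal-equations least squares: beta_hat = (X'X)^{-1} X'y.
# Data are synthetic; in practice np.linalg.lstsq is numerically preferable.

rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), rng.normal(size=50)])  # intercept + 1 regressor
beta_true = np.array([2.0, 3.0])
y = X @ beta_true + 0.01 * rng.normal(size=50)           # small noise

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)             # solve, don't invert
print(np.round(beta_hat, 2))   # close to [2. 3.]
```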
EPA's modeling community is working to gain insights into certain parts of a physical, biological, economic, or social system by conducting environmental assessments for Agency decision making to complex environmental issues.
International Nuclear Information System (INIS)
Rosner, J.L.
1981-01-01
This paper invites experimenters to consider the wide variety of tests suggested by the new aspects of quark models since the discovery of charm and beauty, and nonrelativistic models. Colors and flavours are counted and combined into hadrons. The current quark zoo is summarized. Models and theoretical background are studied under: qualitative QCD: strings and bags, potential models, relativistic effects, electromagnetic transitions, gluon emissions, and single quark transition descriptions. Hadrons containing quarks known before 1974 (i.e. that can be made of "light" quarks u, d, and s) are treated in Section III, while those containing charmed quarks and beauty (b) quarks are discussed in Section IV. Unfolding the properties of the sixth quark from information on its hadrons is seen as a future application of the methods used in this study.
Hodges, Wilfrid
1993-01-01
An up-to-date and integrated introduction to model theory, designed to be used for graduate courses (for students who are familiar with first-order logic), and as a reference for more experienced logicians and mathematicians.
Digital Repository Service at National Institute of Oceanography (India)
Unnikrishnan, A.S.; Manoj, N.T.
developed most of the above models. This is a good approximation to simulate horizontal distribution of active and passive variables. The future challenge lies in developing capability to simulate the distribution in the vertical....
International Nuclear Information System (INIS)
Peccei, R.D.
If quarks and leptons are composite, it should be possible eventually to calculate their mass spectrum and understand the reasons for the observed family replication, questions which lie beyond the standard model. Alas, all experimental evidence to date points towards quark and lepton elementarity, with the typical momentum scale Λ_comp, beyond which effects of inner structure may be seen, probably being greater than 1 TeV. One supersymmetric preon model explained here provides a new dynamical alternative for obtaining light fermions, namely that these states are quasi-Goldstone fermions. This and similar models are discussed. Although quasi-Goldstone fermions provide an answer to the 0th-order question of composite models, the questions of how masses and families are generated remain unanswered. (U.K.)
Skaaret, Eimund
Calculation procedures, used in the design of ventilating systems, which are especially suited for displacement ventilation in addition to linking it to mixing ventilation, are addressed. The two zone flow model is considered and the steady state and transient solutions are addressed. Different methods of supplying air are discussed, and different types of air flow are considered: piston flow, plane flow and radial flow. An evaluation model for ventilation systems is presented.
DEFF Research Database (Denmark)
Lasrado, Lester Allan; Vatrapu, Ravi
2016-01-01
…effects, unicausal reduction, and case specificity. Based on the developments in set-theoretical thinking in social sciences, and employing methods like Qualitative Comparative Analysis (QCA), Necessary Condition Analysis (NCA), and set visualization techniques, in this position paper we propose and demonstrate a new approach to maturity models in the domain of Information Systems. This position paper describes the set-theoretical approach to maturity models, presents current results and outlines future research work.
Accelerated life models modeling and statistical analysis
Bagdonavicius, Vilijandas
2001-01-01
Failure Time DistributionsIntroductionParametric Classes of Failure Time DistributionsAccelerated Life ModelsIntroductionGeneralized Sedyakin's ModelAccelerated Failure Time ModelProportional Hazards ModelGeneralized Proportional Hazards ModelsGeneralized Additive and Additive-Multiplicative Hazards ModelsChanging Shape and Scale ModelsGeneralizationsModels Including Switch-Up and Cycling EffectsHeredity HypothesisSummaryAccelerated Degradation ModelsIntroductionDegradation ModelsModeling the Influence of Explanatory Varia
Model uncertainty: Probabilities for models?
International Nuclear Information System (INIS)
Winkler, R.L.
1994-01-01
Like any other type of uncertainty, model uncertainty should be treated in terms of probabilities. The question is how to do this. The most commonly-used approach has a drawback related to the interpretation of the probabilities assigned to the models. If we step back and look at the big picture, asking what the appropriate focus of the model uncertainty question should be in the context of risk and decision analysis, we see that a different probabilistic approach makes more sense, although it raises some implementation questions. Current work that is underway to address these questions looks very promising.
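The commonly used approach the abstract alludes to is Bayesian model averaging: assign prior probabilities to the candidate models, update them with the data, and weight each model's prediction by its posterior probability. A toy sketch, with invented priors, marginal likelihoods, and forecasts (not the author's proposed alternative):

```python
# Toy Bayesian model averaging over three candidate models. All numbers
# (priors, marginal likelihoods, forecasts) are invented for illustration.

priors      = {"M1": 0.5, "M2": 0.3, "M3": 0.2}
likelihoods = {"M1": 0.02, "M2": 0.05, "M3": 0.01}   # p(data | model)
predictions = {"M1": 10.0, "M2": 14.0, "M3": 8.0}    # each model's forecast

evidence = sum(priors[m] * likelihoods[m] for m in priors)
posterior = {m: priors[m] * likelihoods[m] / evidence for m in priors}

# Probability-weighted forecast across models:
bma_prediction = sum(posterior[m] * predictions[m] for m in priors)
print({m: round(p, 3) for m, p in posterior.items()}, round(bma_prediction, 2))
```

The interpretive drawback mentioned above concerns what these posterior model probabilities mean when none of the candidate models is literally true.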
Stochastic Models of Evolution
Bezruchko, Boris P.; Smirnov, Dmitry A.
To continue the discussion of randomness given in Sect. 2.2.1, we briefly touch on stochastic models of temporal evolution (random processes). They can be specified either via explicit definition of their statistical properties (probability density functions, correlation functions, etc., Sects. 4.1, 4.2 and 4.3) or via stochastic difference or differential equations. Some of the most widely known equations, their properties and applications are discussed in Sects. 4.4 and 4.5.
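A standard example of specifying a random process via a stochastic differential equation, as discussed in Sects. 4.4 and 4.5, is the Ornstein-Uhlenbeck process integrated with the Euler-Maruyama scheme. The parameter values below are arbitrary choices for illustration.

```python
import numpy as np

# Euler-Maruyama integration of the Ornstein-Uhlenbeck SDE
#   dx = -theta * x dt + sigma dW,
# illustrating a random process defined by a stochastic differential
# equation rather than by its statistical properties directly.

def simulate_ou(theta=1.0, sigma=0.5, dt=0.01, steps=10_000, seed=42):
    rng = np.random.default_rng(seed)
    x = np.empty(steps)
    x[0] = 0.0
    for i in range(1, steps):
        dw = rng.normal(0.0, np.sqrt(dt))            # Wiener increment
        x[i] = x[i - 1] - theta * x[i - 1] * dt + sigma * dw
    return x

path = simulate_ou()
# the stationary variance should be near sigma^2 / (2 * theta) = 0.125
print(round(float(path[2000:].var()), 3))
```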
Energy Technology Data Exchange (ETDEWEB)
Curtis, S.B.
1990-09-01
Several models and theories are reviewed that incorporate the idea of radiation-induced lesions (repairable and/or irreparable) that can be related to molecular lesions in the DNA molecule. Usually the DNA double-strand or chromatin break is suggested as the critical lesion. In the models, the shoulder on the low-LET survival curve is hypothesized as being due to one (or more) of the following three mechanisms: (1) "interaction" of lesions produced by statistically independent particle tracks; (2) nonlinear (i.e., linear-quadratic) increase in the yield of initial lesions, and (3) saturation of repair processes at high dose. Comparisons are made between the various approaches. Several significant advances in model development are discussed; in particular, a description of the matrix formulation of the Markov versions of the RMR and LPL models is given. The more advanced theories have incorporated statistical fluctuations in various aspects of the energy-loss and lesion-formation process. An important direction is the inclusion of physical and chemical processes into the formulations by incorporating relevant track structure theory (Monte Carlo track simulations) and chemical reactions of radiation-induced radicals. At the biological end, identification of repair genes and how they operate as well as a better understanding of how DNA misjoinings lead to lethal chromosome aberrations are needed for appropriate inclusion into the theories. More effort is necessary to model the complex end point of radiation-induced carcinogenesis.
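The linear-quadratic shape mentioned in mechanism (2) is the classic survival curve S(D) = exp(-(αD + βD²)), whose quadratic term produces the low-dose "shoulder". A small sketch with illustrative (not fitted) coefficients:

```python
import math

# Linear-quadratic cell survival: S(D) = exp(-(alpha*D + beta*D^2)).
# ALPHA and BETA are hypothetical values chosen only to show the shoulder.

ALPHA, BETA = 0.2, 0.02   # per Gy and per Gy^2, illustrative

def surviving_fraction(dose_gy):
    return math.exp(-(ALPHA * dose_gy + BETA * dose_gy ** 2))

# The quadratic term makes log-survival bend downward at higher doses:
for d in (0, 2, 4, 8):
    print(d, round(surviving_fraction(d), 3))
```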
Quantization of Midisuperspace Models
Directory of Open Access Journals (Sweden)
J. Fernando Barbero G.
2010-10-01
We give a comprehensive review of the quantization of midisuperspace models. Though the main focus of the paper is on quantum aspects, we also provide an introduction to several classical points related to the definition of these models. We cover some important issues, in particular, the use of the principle of symmetric criticality as a very useful tool to obtain the required Hamiltonian formulations. Two main types of reductions are discussed: those involving metrics with two Killing vector fields and spherically-symmetric models. We also review the more general models obtained by coupling matter fields to these systems. Throughout the paper we give separate discussions for standard quantizations using geometrodynamical variables and those relying on loop-quantum-gravity-inspired methods.
Business Model Process Configurations
DEFF Research Database (Denmark)
Taran, Yariv; Nielsen, Christian; Thomsen, Peter
2015-01-01
Purpose – The paper aims: 1) To develop systematically a structural list of various business model process configurations and to group (deductively) these selected configurations in a structured typological categorization list. 2) To facilitate companies in the process of BM innovation, by developing (inductively) an ontological classification framework, in view of the BM process configurations typology developed. Design/methodology/approach – Given the inconsistencies found in the business model studies (e.g. definitions, configurations, classifications) we adopted the analytical induction… strategic preference, as part of their business model innovation activity planned. Practical implications – This paper aimed at strengthening researchers' and, particularly, practitioners' perspectives on the field of business model process configurations. By ensuring an [abstracted] alignment between…
Smith, J. A.; Cooper, K.; Randolph, M.
1984-01-01
A classical description of the one dimensional radiative transfer treatment of vegetation canopies was completed and the results were tested against measured prairie (blue grama) and agricultural canopies (soybean). Phase functions are calculated in terms of directly measurable biophysical characteristics of the canopy medium. While the phase functions tend to exhibit backscattering anisotropy, their exact behavior is somewhat more complex and wavelength dependent. A Monte Carlo model was developed that treats soil surfaces with large periodic variations in three dimensions. A photon-ray tracing technology is used. Currently, the rough soil surface is described by analytic functions and appropriate geometric calculations performed. A bidirectional reflectance distribution function is calculated and, hence, available for other atmospheric or canopy reflectance models as a lower boundary condition. This technique is used together with an adding model to calculate several cases where Lambertian leaves possessing anisotropic leaf angle distributions yield non-Lambertian reflectance; similar behavior is exhibited for simulated soil surfaces.
Eck, Christof; Knabner, Peter
2017-01-01
Mathematical models are the decisive tool to explain and predict phenomena in the natural and engineering sciences. With this book readers will learn to derive mathematical models which help to understand real world phenomena. At the same time a wealth of important examples for the abstract concepts treated in the curriculum of mathematics degrees are given. An essential feature of this book is that mathematical structures are used as an ordering principle and not the fields of application. Methods from linear algebra, analysis and the theory of ordinary and partial differential equations are thoroughly introduced and applied in the modeling process. Examples of applications in the fields electrical networks, chemical reaction dynamics, population dynamics, fluid dynamics, elasticity theory and crystal growth are treated comprehensively.
Cardey, Sylviane
2013-01-01
In response to the need for reliable results from natural language processing, this book presents an original way of decomposing a language(s) in a microscopic manner by means of intra/inter‑language norms and divergences, going progressively from languages as systems to the linguistic, mathematical and computational models, which being based on a constructive approach are inherently traceable. Languages are described with their elements aggregating or repelling each other to form viable interrelated micro‑systems. The abstract model, which contrary to the current state of the art works in int
Directory of Open Access Journals (Sweden)
Aarti Sharma
2009-01-01
The use of computational chemistry in the development of novel pharmaceuticals is becoming an increasingly important tool. In the past, drugs were simply screened for effectiveness. The recent advances in computing power and the exponential growth of the knowledge of protein structures have made it possible for organic compounds to be tailored to decrease the harmful side effects and increase the potency. This article provides a detailed description of the techniques employed in molecular modeling. Molecular modeling is a rapidly developing discipline, and has been supported by the dramatic improvements in computer hardware and software in recent years.
International Nuclear Information System (INIS)
Woosley, S.E.; Weaver, T.A.
1980-01-01
Recent progress in understanding the observed properties of Type I supernovae as a consequence of the thermonuclear detonation of white dwarf stars and the ensuing decay of the ⁵⁶Ni produced therein is reviewed. Within the context of this model for Type I explosions and the 1978 model for Type II explosions, the expected nucleosynthesis and gamma-line spectra from both kinds of supernovae are presented. Finally, a qualitatively new approach to the problem of massive star death and Type II supernovae based upon a combination of rotation and thermonuclear burning is discussed.
DEFF Research Database (Denmark)
Stubkjær, Erik
2005-01-01
…to the modeling of an industrial sector, as it aims at rendering the basic concepts that relate to the domain of real estate and the pertinent human activities. The palpable objects are pieces of land and buildings, documents, data stores and archives, as well as persons in their diverse roles as owners, holders… to land. The paper advances the position that cadastral modeling has to include not only the physical objects, agents, and information sets of the domain, but also the objectives or requirements of cadastral systems.
Hoerbst, Alexander; Hackl, Werner; Ammenwerth, Elske
2010-01-01
Quality assurance is a major task with regard to Electronic Health Records (EHR). Currently there are only a few approaches explicitly dealing with the quality of EHR services as a whole. The objective of this paper is to introduce a new Meta-Model to structure and describe quality requirements of EHRs. This approach should support the transnational quality certification of EHR services. The Model was developed based on interviews with 24 experts and a systematic literature search and comprises a service and requirements model. The service model represents the structure of a service whereas the requirements model can be used to assign specific predefined aims and requirements to a service. The new model differs from existing approaches as it accounts for modern software architectures and the special attributes of EHRs.
African Journals Online (AJOL)
Simple analytic polynomials have been proposed for estimating solar radiation in the traditional Northern, Central and Southern regions of Malawi. There is a strong agreement between the polynomials and the SSE model with R2 values of 0.988, 0.989 and 0.989 and root mean square errors of 0.061, 0.057 and 0.062 ...
DEFF Research Database (Denmark)
Nash, Ulrik William
2014-01-01
Firms consist of people who make decisions to achieve goals. How do these people develop the expectations which underpin the choices they make? The lens model provides one answer to this question. It was developed by cognitive psychologist Egon Brunswik (1952) to illustrate his theory...
Indian Academy of Sciences (India)
…pattern of the watershed LULC, leading to an accretive linear growth of agricultural and settlement areas. The annual rate of… thereby advocates for better agricultural practices with additional energy subsidy to arrest further forest loss and LULC… automaton model and GIS: Long-term urban growth prediction for San…
DEFF Research Database (Denmark)
Arnoldi, Jakob
The article discusses the use of algorithmic models for so-called High Frequency Trading (HFT) in finance. HFT is controversial yet widespread in modern financial markets. It is a form of automated trading technology which critics among other things claim can lead to market manipulation. Drawing ...
Finger Lakes Regional Education Center for Economic Development, Mount Morris, NY.
This guide describes seven model programs that were developed by the Finger Lakes Regional Center for Economic Development (New York) to meet the training needs of female and minority entrepreneurs to help their businesses survive and grow and to assist disabled and dislocated workers and youth in beginning small businesses. The first three models…
DEFF Research Database (Denmark)
About the reconstruction of Palle Nielsen's (b. 1942) work The Model from 1968: a gigantic playground for children in the museum, where they can freely romp about, climb ropes, crawl on wooden structures, work with tools, jump in foam rubber, paint with finger paints and dress up in costumes.
Indian Academy of Sciences (India)
Home; Journals; Resonance – Journal of Science Education; Volume 14; Issue 7. Model Checking - Automated Verification of Computational Systems. Madhavan Mukund. General Article Volume 14 Issue 7 July 2009 pp 667-681. Fulltext. Click here to view fulltext PDF. Permanent link:
Lomnitz, Cinna
Tichelaar and Ruff [1989] propose to “estimate model variance in complicated geophysical problems,” including the determination of focal depth in earthquakes, by means of unconventional statistical methods such as bootstrapping. They are successful insofar as they are able to duplicate the results from more conventional procedures.
Indian Academy of Sciences (India)
Home; Journals; Resonance – Journal of Science Education; Volume 9; Issue 5. Molecular Modeling: A Powerful Tool for Drug Design and Molecular Docking. Rama Rao Nadendla. General Article Volume 9 Issue 5 May 2004 pp 51-60. Fulltext. Click here to view fulltext PDF. Permanent link:
Principles of models based engineering
Energy Technology Data Exchange (ETDEWEB)
Dolin, R.M.; Hefele, J.
1996-11-01
This report describes a Models Based Engineering (MBE) philosophy and implementation strategy that has been developed at Los Alamos National Laboratory's Center for Advanced Engineering Technology. A major theme in this discussion is that models based engineering is an information management technology enabling the development of information driven engineering. Unlike other information management technologies, models based engineering encompasses the breadth of engineering information, from design intent through product definition to consumer application.
Pieters, Sigrid; Saeys, Wouter; Van den Kerkhof, Tom; Goodarzi, Mohammad; Hellings, Mario; De Beer, Thomas; Heyden, Yvan Vander
2013-01-25
Owing to spectral variations from other sources than the component of interest, large investments in the NIR model development may be required to obtain satisfactory and robust prediction performance. To make the NIR model development for routine active pharmaceutical ingredient (API) prediction in tablets more cost-effective, alternative modelling strategies were proposed. They used a massive amount of prior spectral information on intra- and inter-batch variation and the pure component spectra to define a clutter, i.e., the detrimental spectral information. This was subsequently used for artificial data augmentation and/or orthogonal projections. The model performance improved statistically significantly, with a 34-40% reduction in RMSEP while needing fewer model latent variables, by applying the following procedure before PLS regression: (1) augmentation of the calibration spectra with the spectral shapes from the clutter, and (2) net analyte pre-processing (NAP). The improved prediction performance was not compromised when reducing the variability in the calibration set, making exhaustive calibration unnecessary. Strong water content variations in the tablets caused frequency shifts of the API absorption signals that could not be included in the clutter. Updating the model for this kind of variation demonstrated that the completeness of the clutter is critical for the performance of these models and that the model will only be more robust for spectral variation that is not co-linear with the one from the property of interest. Copyright © 2012 Elsevier B.V. All rights reserved.
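The orthogonal-projection step underlying net analyte pre-processing can be sketched generically: collect the detrimental "clutter" spectra as columns of a matrix C and project each measured spectrum onto the complement of their span with P = I − C·C⁺. This is a generic illustration with synthetic spectra, not the authors' exact NAP implementation.

```python
import numpy as np

# Generic net-analyte-style pre-processing: annihilate the subspace spanned
# by known interference ("clutter") spectra via orthogonal projection.
# All spectra below are synthetic random vectors, purely for illustration.

rng = np.random.default_rng(1)
n_channels = 200
clutter = rng.normal(size=(n_channels, 3))    # 3 interference shapes (columns)
analyte = rng.normal(size=n_channels)         # pure-component spectrum

# Projector onto the orthogonal complement of the clutter subspace:
P = np.eye(n_channels) - clutter @ np.linalg.pinv(clutter)

measured = 0.7 * analyte + clutter @ np.array([1.0, -2.0, 0.5])
cleaned = P @ measured

# The clutter contribution is removed exactly; the analyte signal survives
# minus whatever part of it overlaps the clutter subspace.
print(np.allclose(P @ (clutter @ np.array([1.0, -2.0, 0.5])), 0.0))  # True
```

As the abstract notes, such a model is robust only to variation actually captured in the clutter: any interference shape missing from `C` passes straight through the projection.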
Hybrid modeling in computational neuropsychiatry.
Marin-Sanguino, A; Mendoza, E R
2008-09-01
The aim of building mathematical models is to provide a formal structure to explain the behaviour of a whole in terms of its parts. In the particular case of neuropsychiatry, the available information upon which models are to be built is distributed over several fields of expertise. Molecular and cellular biologists, physiologists and clinicians all hold valuable information about the system which has to be distilled into a unified view. Furthermore, modelling is not a sequential process in which the roles of field and modelling experts are separated. Model building is done through iterations in which all the parts have to keep an active role. This work presents some modelling techniques and guidelines on how they can be combined in order to simplify modelling efforts in neuropsychiatry. The proposed approach involves two well known modelling techniques, Petri nets and Biochemical System Theory that provide a general well proven structured definition for biological models.
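Of the two techniques named above, the Petri-net side can be illustrated with a minimal "token game": places hold tokens, and a transition fires when every input place holds enough tokens. The two-step reaction chain below is an invented example, not a model from the paper.

```python
# Minimal Petri-net token game. Places hold tokens; a transition is enabled
# when each of its input places holds the required tokens. The net below
# (a toy two-step reaction chain) is invented for illustration.

marking = {"A": 2, "B": 1, "C": 0}
transitions = {
    "t1": ({"A": 1, "B": 1}, {"C": 1}),   # consumes A and B, produces C
    "t2": ({"C": 1}, {"B": 1}),           # consumes C, regenerates B
}

def enabled(name):
    pre, _ = transitions[name]
    return all(marking[p] >= n for p, n in pre.items())

def fire(name):
    assert enabled(name), f"{name} is not enabled"
    pre, post = transitions[name]
    for p, n in pre.items():
        marking[p] -= n
    for p, n in post.items():
        marking[p] += n

fire("t1"); fire("t2"); fire("t1")
print(marking)   # {'A': 0, 'B': 0, 'C': 1}
```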
Business Model for Czech Agribusiness
Directory of Open Access Journals (Sweden)
Poláková Jana
2015-09-01
Full Text Available Business modelling facilitates the understanding of value-creation logic in organizations in general. Identifying the components of business models based on different criteria helps in understanding the fundamentals of business and the position of entrepreneurs and managers in companies. The present research is focused on the definition of a specific business model for the Czech agribusiness sector. Based on the theoretical background and an evaluation of selected business models, the aim is to create a new business model using components which take into account the specifics of this particular industry.
Model-reduced inverse modeling
Vermeulen, P.T.M.
2006-01-01
Although faster computers have been developed in recent years, they tend to be used to solve even more detailed problems. In many cases this will yield enormous models that can not be solved within acceptable time constraints. Therefore, there is a need for alternative methods that simulate such
Building Models and Building Modelling
DEFF Research Database (Denmark)
Jørgensen, Kaj; Skauge, Jørn
2008-01-01
The report's introductory chapter describes the primary concepts relating to building models and sets out some fundamental aspects of computer-based modelling. In addition, the difference between drawing programs and building-modelling programs is described. Important aspects of comp...
Directory of Open Access Journals (Sweden)
Aarti Sharma
2009-12-01
Full Text Available
Barr, Michael
2002-01-01
Acyclic models is a method heavily used to analyze and compare various homology and cohomology theories appearing in topology and algebra. This book is the first attempt to put together in a concise form this important technique and to include all the necessary background. It presents a brief introduction to category theory and homological algebra. The author then gives the background of the theory of differential modules and chain complexes over an abelian category to state the main acyclic models theorem, generalizing and systematizing the earlier material. This is then applied to various cohomology theories in algebra and topology. The volume could be used as a text for a course that combines homological algebra and algebraic topology. Required background includes a standard course in abstract algebra and some knowledge of topology. The volume contains many exercises. It is also suitable as a reference work for researchers.
DEFF Research Database (Denmark)
Pedersen, Mogens Jin; Stritch, Justin Michael
2018-01-01
Replication studies relate to the scientific principle of replicability and serve the significant purpose of providing supporting (or contradicting) evidence regarding the existence of a phenomenon. However, replication has never been an integral part of public administration and management research. Recently, scholars have issued calls for more replication, but academic reflections on when replication adds substantive value to public administration and management research are needed. This concise article presents a conceptual model, RNICE, for assessing when and how a replication study contributes knowledge about a social phenomenon and advances knowledge in the public administration and management literatures. The RNICE model provides a vehicle for researchers who seek to evaluate or demonstrate the value of a replication study systematically. We illustrate the practical application...
DEFF Research Database (Denmark)
2012-01-01
The relationship between representation and the represented is examined here through the notion of persistent modelling. This notion is not novel to the activity of architectural design if it is considered as describing a continued active and iterative engagement with design concerns – an evident characteristic of architectural practice. But the persistence in persistent modelling can also be understood to apply in other ways, reflecting and anticipating extended roles for representation. This book identifies three principal areas in which these extensions are becoming apparent within contemporary practice. Featuring contributions from some of the world's most advanced thinkers on this subject, it makes essential reading for anyone considering new ways of thinking about architecture, and in drawing upon both historical and contemporary perspectives it provides evidence of the ways in which relations between representation and the represented continue to be reconsidered.
DEFF Research Database (Denmark)
Michael, John
...others' minds. Then (2), in order to bring to light some possible justifications, as well as hazards and criticisms of the methodology of looking-time tests, I will take a closer look at the concept of folk psychology and will focus on the idea that folk psychology involves using oneself as a model of other people in order to predict and understand their behavior. Finally (3), I will discuss the historical location and significance of the emergence of looking-time tests...
1975-01-01
(Abstract garbled in digitization; only fragments are recoverable.) The cavity is rendered visible in his photographs of spheres entering the water by streams of small bubbles from electrolysis; if the cavity is opaque or translucent, the contrast between the jet and ... ; the model velocity scale follows Equation 1.18, and the metals (... and brass, for example) should be so selected that electrolysis is not a problem.
Vincent, Julian F V
2003-01-01
Biomimetics is seen as a path from biology to engineering. The only path from engineering to biology in current use is the application of engineering concepts and models to biological systems. However, there is another pathway: the verification of biological mechanisms by manufacture, leading to an iterative process between biology and engineering in which the new understanding that the engineering implementation of a biological system can bring is fed back into biology, allowing a more compl...
DEFF Research Database (Denmark)
This book reflects and expands on the current trend in the building industry to understand, simulate and ultimately design buildings by taking into consideration the interlinked elements and forces that act on them. This approach overcomes the traditional, exclusive focus on building tasks... The chapter authors were invited speakers at the 5th Symposium "Modelling Behaviour", which took place at the CITA in Copenhagen in September 2015.
1980-02-01
(Abstract garbled in digitization; only fragments are recoverable.) Legible fragments refer to experiments compared against a gray-medium model and a front target, and cite "Possibilità di valutazione dello scambio termico in focolai di caldaie per riscaldamento" (Possibility of evaluating heat exchange in the fireboxes of heating boilers), Atti e Rassegna Tecnica, Società Ingegneri e Architetti in Torino.
DEFF Research Database (Denmark)
This book identifies three principal areas in which these extensions are becoming apparent within contemporary practice: the duration of active influence that representation can hold in relation to the represented; the means, methods and media through which representations are constructed and used; and what it is that is being represented. Featuring contributions from some of the world's most advanced thinkers on this subject, it also provides critical insight into the use of contemporary modelling tools and methods, together with an examination of the implications their use has within the territories of architectural design, realisation and experience.
International Nuclear Information System (INIS)
McIllvaine, C.M.
1994-01-01
Exhaust gases from power plants that burn fossil fuels contain concentrations of sulfur dioxide (SO2), nitric oxide (NO), particulate matter, hydrocarbon compounds and trace metals. Estimated emissions from the operation of a hypothetical 500 MW coal-fired power plant are given. Ozone is considered a secondary pollutant, since it is not emitted directly into the atmosphere but is formed from other air pollutants, specifically nitrogen oxides (NOx) and non-methane organic compounds (NMOC) in the presence of sunlight. (NMOC are sometimes referred to as hydrocarbons, HC, or volatile organic compounds, VOC, and they may or may not include methane.) Additionally, ozone formation is a function of the ratio of NMOC concentrations to NOx concentrations. A typical ozone isopleth is shown, generated with the Empirical Kinetic Modeling Approach (EKMA) option of the Environmental Protection Agency's (EPA) Ozone Isopleth Plotting Mechanism (OZIPM-4) model. Ozone isopleth diagrams, originally generated with smog chamber data, are more commonly generated with photochemical reaction mechanisms and tested against smog chamber data. The shape of the isopleth curves is a function of the region (i.e., background conditions) where ozone concentrations are simulated. The location of an ozone concentration on the isopleth diagram is defined by the ratio of the NMOC and NOx coordinates of the point, known as the NMOC/NOx ratio. Results obtained by the described model are presented.
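The role of the NMOC/NOx ratio can be sketched numerically. In the snippet below the regime thresholds are illustrative assumptions for demonstration only, not values taken from the EKMA or OZIPM-4 model:

```python
# Hypothetical sketch of locating a point on an ozone isopleth diagram
# by its NMOC/NOx ratio. The threshold ratios `low` and `high` are
# illustrative assumptions, not EKMA/OZIPM-4 output.

def nmoc_nox_regime(nmoc_ppmC, nox_ppm, low=4.0, high=15.0):
    """Classify the photochemical regime from the NMOC/NOx ratio."""
    if nox_ppm <= 0:
        raise ValueError("NOx concentration must be positive")
    ratio = nmoc_ppmC / nox_ppm
    if ratio < low:
        regime = "NMOC-limited (VOC-sensitive)"
    elif ratio > high:
        regime = "NOx-limited"
    else:
        regime = "transitional"
    return ratio, regime

ratio, regime = nmoc_nox_regime(nmoc_ppmC=0.6, nox_ppm=0.05)
print(f"NMOC/NOx = {ratio:.1f} -> {regime}")  # NMOC/NOx = 12.0 -> transitional
```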
Energy Technology Data Exchange (ETDEWEB)
Plimpton, Steven James; Heffernan, Julieanne; Sasaki, Darryl Yoshio; Frischknecht, Amalie Lucile; Stevens, Mark Jackson; Frink, Laura J. Douglas
2005-11-01
Understanding the properties and behavior of biomembranes is fundamental to many biological processes and technologies. Microdomains in biomembranes or ''lipid rafts'' are now known to be an integral part of cell signaling, vesicle formation, fusion processes, protein trafficking, and viral and toxin infection processes. Understanding how microdomains form, how they depend on membrane constituents, and how they act not only has biological implications, but also will impact Sandia's effort in development of membranes that structurally adapt to their environment in a controlled manner. To provide such understanding, we created physically-based models of biomembranes. Molecular dynamics (MD) simulations and classical density functional theory (DFT) calculations using these models were applied to phenomena such as microdomain formation, membrane fusion, pattern formation, and protein insertion. Because lipid dynamics and self-organization in membranes occur on length and time scales beyond atomistic MD, we used coarse-grained models of double tail lipid molecules that spontaneously self-assemble into bilayers. DFT provided equilibrium information on membrane structure. Experimental work was performed to further help elucidate the fundamental membrane organization principles.
Abreu, Orlando; Alvear, Daniel
2016-01-01
This book presents an overview of modeling definitions and concepts, theory on human behavior and human performance data, available tools and simulation approaches, model development, and application and validation methods. It considers the data and research efforts needed to develop and incorporate functions for the different parameters into comprehensive escape and evacuation simulations, with a number of examples illustrating different aspects and approaches. After an overview of basic modeling approaches, the book discusses benefits and challenges of current techniques. The representation of evacuees is a central issue, including human behavior and the proper implementation of representational tools. Key topics include the nature and importance of the different parameters involved in ASET and RSET and the interactions between them. A review of the current literature on verification and validation methods is provided, with a set of recommended verification tests and examples of validation tests. The book c...
Ocean General Circulation Models
Energy Technology Data Exchange (ETDEWEB)
Yoon, Jin-Ho; Ma, Po-Lun
2012-09-30
1. Definition of Subject. The purpose of this text is to provide an introduction to aspects of oceanic general circulation models (OGCMs), an important component of a Climate System or Earth System Model (ESM). The role of the ocean in ESMs is described in Chapter XX (EDITOR: PLEASE FIND THE COUPLED CLIMATE or EARTH SYSTEM MODELING CHAPTERS). The emerging need for understanding the Earth's climate system, and especially for projecting its future evolution, has encouraged scientists to explore the dynamical, physical, and biogeochemical processes in the ocean. Understanding the role of these processes in the climate system is an interesting and challenging scientific subject. For example, the research question of how much extra heat or CO2 generated by anthropogenic activities can be stored in the deep ocean is not only scientifically interesting but also important in projecting the future climate of the earth. Thus, OGCMs have been developed and applied to investigate the various oceanic processes and their role in the climate system.
Tirado-Ramos, Alfredo; Hu, Jingkun; Lee, K P
2002-01-01
Supplement 23 to DICOM (Digital Imaging and Communications for Medicine), Structured Reporting, is a specification that supports a semantically rich representation of image and waveform content, enabling experts to share image and related patient information. DICOM SR supports the representation of textual and coded data linked to images and waveforms. Nevertheless, the medical information technology community needs models that work as bridges between the DICOM relational model and open object-oriented technologies. The authors assert that representations of the DICOM Structured Reporting standard, using object-oriented modeling languages such as the Unified Modeling Language, can provide a high-level reference view of the semantically rich framework of DICOM and its complex structures. They have produced an object-oriented model to represent the DICOM SR standard and have derived XML-exchangeable representations of this model using World Wide Web Consortium specifications. They expect the model to benefit developers and system architects who are interested in developing applications that are compliant with the DICOM SR specification.
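The idea of deriving XML-exchangeable representations from an object-oriented model can be sketched in miniature. The classes and element names below are invented simplifications for illustration and do not follow the actual DICOM SR information model or any schema for it:

```python
# Drastically simplified object model of a structured-report content
# tree, serialized to XML with the standard library. Class and element
# names are invented and are NOT the real DICOM SR information model.
import xml.etree.ElementTree as ET

class ContentItem:
    def __init__(self, value_type, concept, value=None):
        self.value_type = value_type  # e.g. "CONTAINER", "TEXT", "NUM"
        self.concept = concept        # coded concept name (simplified)
        self.value = value
        self.children = []

    def to_xml(self):
        elem = ET.Element("ContentItem",
                          valueType=self.value_type, concept=self.concept)
        if self.value is not None:
            elem.set("value", str(self.value))
        for child in self.children:
            elem.append(child.to_xml())
        return elem

root = ContentItem("CONTAINER", "Chest X-Ray Report")
root.children.append(ContentItem("TEXT", "Finding", "No acute disease"))

xml_text = ET.tostring(root.to_xml(), encoding="unicode")
print(xml_text)
```

The same tree-of-content-items shape is what makes an object model a natural bridge between the DICOM relational representation and object-oriented or XML technologies.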
Object Modeling and Building Information Modeling
Auråen, Hege; Gjemdal, Hanne
2016-01-01
The main part of this thesis is an online course (Small Private Online Course) entitled "Introduction to Object Modeling and Building Information Modeling". This supplementary report clarifies the choices made in the process of developing the course. The course examines the basic concepts of object modeling, modeling techniques and a modeling language (UML). Further, building information modeling (BIM) is presented as a modeling process, and the object modeling concepts in the BIM softw...
Directory of Open Access Journals (Sweden)
PAPAJ Jan
2014-05-01
Full Text Available Traditional wireless networks use the concept of point-to-point forwarding inherited from reliable wired networks, which is not ideal for the wireless environment. New emerging applications and networks operate mostly disconnected, and so-called Delay-Tolerant Networks (DTNs) are receiving increasing attention from both academia and industry. DTNs introduced a store-carry-and-forward concept to solve the problem of intermittent connectivity. The behavior of such networks is verified with real models, computer simulation, or a combination of both approaches. Computer simulation has become the primary and cost-effective tool for evaluating the performance of DTNs. OPNET Modeler is our target simulation tool, and we wanted to extend OPNET's simulation capabilities towards DTNs. We implemented the bundle protocol in OPNET Modeler, allowing the simulation of cases based on the bundle concept, such as epidemic forwarding, which relies on flooding the network with messages, and a forwarding algorithm based on the history of past encounters (PRoPHET). The implementation details are provided in the article.
Model Checking Algorithms for Markov Reward Models
Cloth, Lucia; Cloth, L.
2006-01-01
Model checking Markov reward models unites two different approaches of model-based system validation. On the one hand, Markov reward models have a long tradition in model-based performance and dependability evaluation. On the other hand, a formal method like model checking allows for the precise
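For orientation, a Markov reward model in its simplest discrete-time form can be sketched as follows. The chain, rewards and horizon below are invented for illustration and are far simpler than the continuous-time models and logics treated in model checking proper:

```python
# Expected accumulated reward over a finite horizon in a small
# discrete-time Markov reward model (states 0..2). The chain and the
# per-state rewards are illustrative assumptions only.

P = [  # transition probabilities P[s][t]
    [0.5, 0.5, 0.0],
    [0.0, 0.8, 0.2],
    [0.0, 0.0, 1.0],  # absorbing state
]
reward = [2.0, 1.0, 0.0]  # reward earned per step spent in each state

def expected_reward(start, steps):
    """Expected total reward accumulated over `steps` transitions."""
    dist = [0.0] * len(P)
    dist[start] = 1.0
    total = 0.0
    for _ in range(steps):
        total += sum(p * r for p, r in zip(dist, reward))
        dist = [sum(dist[s] * P[s][t] for s in range(len(P)))
                for t in range(len(P))]
    return total

print(expected_reward(start=0, steps=2))  # 2.0 + (0.5*2 + 0.5*1) = 3.5
```

A model checker asks logical questions over exactly such quantities, e.g. whether the accumulated reward stays below a bound with sufficient probability.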
Model Management Via Dependencies Between Variables: An Indexical Reasoning in Mathematical Modeling
National Research Council Canada - National Science Library
Rehber, Devrim
1997-01-01
... declarations and formal model definitions. The utilization of the standard graphical screen objects of a graphics-based operating system provides enhanced visualization of models and more cohesive human-computer interaction...
Students' Models of Curve Fitting: A Models and Modeling Perspective
Gupta, Shweta
2010-01-01
The Models and Modeling Perspectives (MMP) has evolved out of research that began 26 years ago. MMP researchers use Model Eliciting Activities (MEAs) to elicit students' mental models. In this study MMP was used as the conceptual framework to investigate the nature of students' models of curve fitting in a problem-solving environment consisting of…
Model-based software process improvement
Zettervall, Brenda T.
1994-01-01
The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.
Family models: comparing and contrasting the Olson Circumplex Model with the Beavers Systems Model.
Beavers, W R; Voeller, M N
1983-03-01
There is an increasing interest in and need for family models. One such model is the Olson Circumplex Model, previously reported in this journal (18). This model is compared and contrasted with the Beavers Systems Model, which was also developed from empirical data and has had extensive use in family assessment. Though both are cross-sectional, process-oriented, and capable of providing structure for family research, we believe there are certain shortcomings in the Olson model that make it less clinically useful than the Beavers Systems Model. These include definitional problems and a total reliance on curvilinear dimensions with a grid approach to family typology that does not acknowledge a separation/individuation continuum. Our model avoids these deficiencies and includes a continuum of functional competence that reflects the development and differentiation of many living systems, including the family.
International Nuclear Information System (INIS)
Fawaz, S.; Khan, Zulfiquar A.; Mossa, Samir Y.
2006-01-01
A new definition is proposed for analyzing the consultation in primary health care. It integrates other models of consultation and provides a framework by which general practitioners can apply the principles of consultation, using communication skills to reconcile the respective agendas and autonomy of both doctor and patient into a negotiated, agreed plan that includes both the management of health problems and health promotion. The success of consultations depends on time and on the mutual cooperation between patient and doctor shown in the doctor-patient relationship. (author)
Cushman, John H.
1987-04-01
In a recent review article, G. Sposito et al. (1986) examined the various stochastic theories which are concerned with transport of solutes in porous media. In this short note we expand on their discussion to include several topics which had been omitted. We begin by looking at two definitions of probability theory and their relation to the concept of an ensemble. An REV ensemble of soils is defined and examined. The concept of ergodicity is reviewed, and it is pointed out that most stochastic models are theoretically unverifiable. The relationship between scale of measurement and stochasticity is briefly reviewed, and an equation that combines the two concepts is presented.
Directory of Open Access Journals (Sweden)
Cristian GEORGESCU
2005-01-01
Full Text Available The goal of this paper is to investigate how such pattern matching could be performed on models, including the definition of the input language as well as the elaboration of efficient matching algorithms. Design patterns can be considered reusable micro-architectures that contribute to an overall system architecture. Frameworks are also closely related to design patterns. Components offer the possibility to radically change the behaviors and services offered by an application by substitution or addition of new components, even a long time after deployment. Software testing is another aspect of reliable development. Testing activities mainly consist in ensuring that a system implementation conforms to its specifications.
International Nuclear Information System (INIS)
Miranda, Luis E.T.
1997-01-01
A model for the design, and the definition of scale, of a facility for the treatment of low and intermediate-level radioactive wastes is presented. The facility is designed to manage wastes generated in research and production of radioisotopes and labeled compounds, and wastes coming from users of radioisotopes, including: compatible solid wastes, spent sealed sources, radioactive lightning rods, organic and inorganic liquids, and wet bulk solids. The input is one hundred cubic meters per year of untreated wastes and the output is two hundred drums of treated waste in a form suitable for transportation and disposal. (author). 2 refs., 2 figs., 2 tabs
1989-01-01
A wooden model of the ALEPH experiment and its cavern. ALEPH was one of 4 experiments at CERN's 27km Large Electron Positron collider (LEP) that ran from 1989 to 2000. During 11 years of research, LEP's experiments provided a detailed study of the electroweak interaction. Measurements performed at LEP also proved that there are three – and only three – generations of particles of matter. LEP was closed down on 2 November 2000 to make way for the construction of the Large Hadron Collider in the same tunnel. The cavern and detector are in separate locations - the cavern is stored at CERN and the detector is temporarily on display in Glasgow physics department. Both are available for loan.
International Nuclear Information System (INIS)
Hrivnacova, I; Viren, B
2008-01-01
The Virtual Geometry Model (VGM) was introduced at CHEP in 2004 [1], where its concept, based on abstract interfaces to geometry objects, was presented. Since then, it has undergone a design evolution to pure abstract interfaces, and it has been consolidated and completed with more advanced features. Currently it is used in Geant4 VMC to support TGeo geometry definition with Geant4 native geometry navigation, and recently it has been used in the validation of the G4Root tool. The implementation of the VGM for a concrete geometry model represents a small layer between the VGM and the particular native geometry. In addition to the implementations for the Geant4 and Root TGeo geometry models, a third implementation, for AGDD, has now been added; together with the existing XML exporter, this makes the VGM the most advanced tool for exchanging geometry formats, providing 9 ways of conversion between the Geant4, TGeo, AGDD and GDML models. In this presentation we will give an overview of the tool and its present status, review the supported features, and point to possible limits in converting geometry models.
Archetype modeling methodology.
Moner, David; Maldonado, José Alberto; Robles, Montserrat
2018-03-01
Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although there exist many experiences about using archetypes in the literature, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to develop quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the involved participants and tools. It also includes the description of the possible strategies to organize the modeling process. The proposed methodology is inspired by existing best practices of CIMs, software and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. The application of the methodology provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can be also a reference for CIMs development using any other formalism. Copyright © 2018 Elsevier Inc. All rights reserved.
International Nuclear Information System (INIS)
Beltracchi, Leo
1999-01-01
The design and development of a digital computer-based safety system for a nuclear power plant is a complex process. The process of design and product development must result in a final product free of critical errors; operational safety of nuclear power plants must not be compromised. This paper focuses on the development of a safety system model to assist designers, developers, and regulators in establishing and evaluating requirements for a digital computer-based safety system. The model addresses hardware, software, and human elements for use in the requirements definition process. The purpose of the safety system model is to assist and serve as a guide to humans in the cognitive reasoning process of establishing requirements. The goals in the use of the model are to: (1) enhance the completeness of the requirements and (2) reduce the number of errors associated with the requirements definition phase of a project
Ogurtsova, Katherine; Heise, Thomas L; Linnenkamp, Ute; Dintsios, Charalabos-Markos; Lhachimi, Stefan K; Icks, Andrea
2017-12-29
Type 2 diabetes mellitus (T2DM), a highly prevalent chronic disease, puts a large burden on individual health and health care systems. Computer simulation models, used to evaluate the clinical and economic effectiveness of various interventions to handle T2DM, have become a well-established tool in diabetes research. Despite the broad consensus about the general importance of validation, especially external validation, as a crucial instrument for assessing and controlling the quality of these models, there are no systematic reviews comparing such validation of diabetes models. As a result, the main objective of this systematic review is to identify and appraise the different approaches used for the external validation of existing models covering the development and progression of T2DM. We will perform adapted searches by applying respective search strategies to identify suitable studies from 14 electronic databases. Retrieved study records will be included or excluded based on predefined eligibility criteria as defined in this protocol. Among others, a publication filter will exclude studies published before 1995. We will run abstract and full-text screenings and then extract data from all selected studies by filling in a predefined data extraction spreadsheet. We will undertake a descriptive, narrative synthesis of findings to address the study objectives. We will pay special attention to aspects of the quality of these models with regard to the external validation based upon ISPOR and ADA recommendations as well as Mount Hood Challenge reports. All critical stages within the screening, data extraction and synthesis processes will be conducted by at least two authors. This protocol adheres to PRISMA and PRISMA-P standards. The proposed systematic review will provide a broad overview of the current practice in the external validation of models with respect to T2DM incidence and progression in humans built on simulation techniques. PROSPERO CRD42017069983.
International Nuclear Information System (INIS)
Grandi, C; Bonacorsi, D; Colling, D; Fisk, I; Girone, M
2014-01-01
The CMS Computing Model was developed and documented in 2004. Since then the model has evolved to be more flexible and to take advantage of new techniques, but many of the original concepts remain and are in active use. In this presentation we will discuss the changes planned for the restart of the LHC program in 2015. We will discuss the changes planned in the use and definition of the computing tiers that were defined with the MONARC project. We will present how we intend to use new services and infrastructure to provide more efficient and transparent access to the data. We will discuss the computing plans to make better use of the computing capacity by scheduling more of the processor nodes, making better use of the disk storage, and making more intelligent use of the networking.
Averaging in cosmological models using scalars
International Nuclear Information System (INIS)
Coley, A A
2010-01-01
The averaging problem in cosmology is of considerable importance for the correct interpretation of cosmological data. A rigorous mathematical definition of averaging in a cosmological model is necessary. In general, a spacetime is completely characterized by its scalar curvature invariants, and this suggests a particular spacetime averaging scheme based entirely on scalars. We clearly identify the problems of averaging in a cosmological model. We then present a precise definition of a cosmological model, and based upon this definition, we propose an averaging scheme in terms of scalar curvature invariants. This scheme is illustrated in a simple static spherically symmetric perfect fluid cosmological spacetime, where the averaging scales are clearly identified.
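For orientation, the generic covariant volume average that underlies such scalar averaging schemes (a standard construction, not necessarily the paper's exact definition) can be written, for a scalar invariant $I$ over an averaging domain $\mathcal{D}$, as

```latex
\langle I \rangle_{\mathcal{D}}
  \;=\;
  \frac{\int_{\mathcal{D}} I(x)\,\sqrt{-g}\,\mathrm{d}^{4}x}
       {\int_{\mathcal{D}} \sqrt{-g}\,\mathrm{d}^{4}x},
```

so that an averaged cosmological model is characterized by the set $\{\langle I_k \rangle\}$ of averaged scalar curvature invariants rather than by averaged tensor components.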
International Nuclear Information System (INIS)
Schreckenberg, M
2004-01-01
This book by Nino Boccara presents a compilation of model systems commonly termed 'complex'. It starts with a definition of the systems under consideration and how to build up a model to describe the complex dynamics. The subsequent chapters are devoted to various categories of mean-field type models (differential and recurrence equations, chaos) and of agent-based models (cellular automata, networks and power-law distributions). Each chapter is supplemented by a number of exercises and their solutions. The table of contents looks a little arbitrary, but the author took the most prominent model systems investigated over the years (and up until now there has been no unified theory covering the various aspects of complex dynamics). The model systems are explained by looking at a number of applications in various fields. The book is written as a textbook for interested students as well as serving as a comprehensive reference for experts. It is an ideal source for topics to be presented in a lecture on the dynamics of complex systems. This is the first book on this 'wide' topic and I have long awaited such a book (in fact I planned to write it myself, but this is much better than I could ever have written it!). Only section 6 on cellular automata is a little too limited to the author's point of view, and one would have expected more about the famous Domany-Kinzel model (and more accurate citation!). In my opinion this is one of the best textbooks published during the last decade, and even experts can learn a lot from it. Hopefully there will be an updated edition after, say, five years, since this field is growing so quickly. The price is too high for students but this, unfortunately, is the normal case today. Nevertheless I think it will be a great success! (book review)
Business model elements impacting cloud computing adoption
DEFF Research Database (Denmark)
Bogataj, Kristina; Pucihar, Andreja; Sudzina, Frantisek
The paper presents a proposed research framework for identification of business model elements impacting Cloud Computing adoption. We provide a definition of the main Cloud Computing characteristics, discuss previous findings on factors impacting Cloud Computing adoption, and investigate technology adoption theories, such as Diffusion of Innovations, the Technology Acceptance Model, and the Unified Theory of Acceptance and Use of Technology. Further on, a research model for identification of Cloud Computing adoption factors from a business model perspective is presented. The following business model building…
Stickler, Leslie; Sykes, Gary
2016-01-01
This report reviews the scholarly and research evidence supporting the construct labeled modeling and explaining content (MEC), which is measured via a performance assessment in the "ETS"® National Observational Teaching Examination (NOTE) assessment series. This construct involves practices at the heart of teaching that deal with how…
Energy Technology Data Exchange (ETDEWEB)
Andrade, Jose Geraldo Pena de; Koelle, Edmundo; Luvizotto Junior, Edevar [Universidade Estadual de Campinas, SP (Brazil). Faculdade de Engenharia Civil. Dept. de Hidraulica e Saneamento
1997-07-01
This paper presents a complete mathematical and computational model for simulating a generic hydroelectric power plant under steady-state and transient regimes, in extended time, as well as for analyzing the oscillating flows resulting from excitation sources present in the installation, such as vortices in the suction pipe during partial-load operation.
International Nuclear Information System (INIS)
Choi, Sang Hyoun
2007-08-01
Ajou University School of Medicine made the serially sectioned anatomical images from the Visible Korean Human (VKH) Project in Korea. The VKH images, which are high-resolution color photographic images, show the organs and tissues in the human body very clearly at 0.2 mm intervals. In this study, we constructed a high-quality voxel model (VKH-Man) with a total of 30 organs and tissues by manual and automatic segmentation using the serially sectioned anatomical image data from the VKH project. The height and weight of the VKH-Man voxel model are 164 cm and 57.6 kg, respectively, and the voxel resolution is 1.875 x 1.875 x 2 mm³. However, this voxel phantom can be used to calculate the organ and tissue doses of only one person. Therefore, in this study, we adjusted the voxel phantom to the 'Reference Korean' data to construct a voxel phantom that represents the radiation workers in Korea. The height and weight of the finally developed voxel model (HDRK-Man) are 171 cm and 68 kg, respectively, and the voxel resolution is 1.981 x 1.981 x 2.0854 mm³. The VKH-Man and HDRK-Man voxel models were implemented in a Monte Carlo particle transport simulation code for calculation of the organ and tissue doses in various irradiation geometries. The calculated values were compared with each other to see the effect of the adjustment, and also compared with other computational models (KTMAN-2, ICRP-74 and VIP-Man). According to the results, the adjustment of the voxel model was found to hardly affect the dose calculations, and most of the organ and tissue equivalent doses showed some differences among the models. These results show that differences in body figure and organ topology affect the organ doses more than organ size does. The calculated values of the effective dose from VKH-Man and HDRK-Man according to ICRP-60 and the upcoming ICRP recommendation were compared. For the other radiation geometries (AP, LLAT, RLAT) except for PA
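As a rough illustration of how the quoted numbers fit together (a back-of-envelope sketch, not the authors' adjustment procedure), the change in axial voxel size alone accounts for the height difference between the two phantoms:

```python
# Phantom data quoted in the abstract (VKH-Man and HDRK-Man)
vkh = {"height_cm": 164, "weight_kg": 57.6, "voxel": (1.875, 1.875, 2.0)}
hdrk = {"height_cm": 171, "weight_kg": 68.0, "voxel": (1.981, 1.981, 2.0854)}

def voxel_volume_mm3(dx, dy, dz):
    """Volume of a single voxel in mm^3."""
    return dx * dy * dz

# Scale factors applied to the voxel grid during the adjustment
sx = hdrk["voxel"][0] / vkh["voxel"][0]  # in-plane
sz = hdrk["voxel"][2] / vkh["voxel"][2]  # axial
print(f"in-plane scale: {sx:.4f}, axial scale: {sz:.4f}")

# Axial rescaling alone reproduces the height change: 164 cm * 1.0427 ~ 171 cm
print(f"rescaled height: {vkh['height_cm'] * sz:.1f} cm")
```

The corresponding volume scale (sx² · sz ≈ 1.16) brings the body mass close to, but not exactly at, the 68 kg reference value, consistent with the adjustment also involving organ-level changes rather than uniform scaling only.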
Kyriacou, Chris; Sutcliffe, John
1978-01-01
A definition and model of teacher stress is presented which conceptualizes teacher stress as a response syndrome (anger or depression) mediated by (1) an appraisal of threat to the teacher's self-esteem or well-being and (2) coping mechanisms activated to reduce the perceived threat. (Author)
Defining fitness in evolutionary models
Indian Academy of Sciences (India)
2008-12-23
Dec 23, 2008 … The analysis of evolutionary models requires an appropriate definition for fitness. … of dimorphism for dormancy in plants (Cohen 1966). … analyses have assumed nonoverlapping generations (i.e. no age-structure). The solution to defining fitness when the environment is spatially variable and there is a …
Computational Modeling of Culture's Consequences
Hofstede, G.J.; Jonker, C.M.; Verwaart, T.
2010-01-01
This paper presents an approach to formalize the influence of culture on the decision functions of agents in social simulations. The key components are (a) a definition of the domain of study in the form of a decision model, (b) knowledge acquisition based on a dimensional theory of culture,
Beyond (Models of) Disability?
Beaudry, Jonas-Sébastien
2016-04-01
The strategy of developing an ontology or models of disability as a prior step to settling ethical issues regarding disabilities is highly problematic for two reasons. First, key definitional aspects of disability are normative and cannot helpfully be made value-neutral. Second, if we accept that the contested concept of disability is value-laden, it is far from obvious that there are definitive reasons for choosing one interpretation of the concept over another. I conclude that the concept of disability is better left ethically open-ended or broad enough to encompass the examination of various ethical issues (such as oppression, minority rights, or physical discomfort). Alternatively, the concept of disability could be altogether abandoned in order to focus on specific issues without being hindered by debates about the nature of disability. Only political costs, rather than conceptual considerations internal to the models, could be weighed against such a conclusion. © The Author 2016. Published by Oxford University Press, on behalf of the Journal of Medicine and Philosophy Inc. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Cannerfelt, B; Nystedt, J; Jönsen, A; Lätt, J; van Westen, D; Lilja, A; Bengtsson, A; Nilsson, P; Mårtensson, J; Sundgren, P C
2018-01-01
Aim: The aim of this study was to evaluate the extent of white matter lesions, atrophy of the hippocampus and corpus callosum, and their correlation with cognitive dysfunction (CD) in patients diagnosed with systemic lupus erythematosus (SLE). Methods: Seventy SLE patients and 25 healthy individuals (HIs) were included in the study. To evaluate the different SLE and neuropsychiatric SLE (NPSLE) definition schemes, patients were grouped both according to the American College of Rheumatology (ACR) definition and according to the more stringent ACR-Systemic Lupus International Collaborating Clinics definition. Patients and HIs underwent a 3 Tesla brain MRI and a standardized neuropsychological test. MRI data were evaluated for number and volume of white matter lesions and atrophy of the hippocampus and corpus callosum. Differences between groups and subgroups were evaluated for significance. Number and volume of white matter lesions and atrophy of the hippocampus and corpus callosum were correlated to cognitive dysfunction. Results: The total volume of white matter lesions was significantly larger in SLE patients compared to HIs (p = 0.004). However, no significant differences were seen between the different SLE subgroups. Atrophy of the bilateral hippocampus was significantly more pronounced in patients with NPSLE compared to those with non-NPSLE (right: p = 0.010; left: p = 0.023). Significant negative correlations between cognitive test scores on verbal memory and the number and volume of white matter lesions were present. Conclusion: SLE patients have a significantly larger volume of white matter lesions on MRI compared to HIs, and the white matter lesion volume correlates with cognitive dysfunction, specifically verbal memory. No significant differences in the number or volume of white matter lesions were identified between subgroups of SLE patients, regardless of the definition model used.
The Concept of Model. What is Remarkable in Mathematical Models
Bezruchko, Boris P.; Smirnov, Dmitry A.
Dictionaries tell us that the word "model" originates from the Latin word "modulus", which means "measure, template, norm". The term was used in proceedings on civil engineering several centuries BC. Currently, it relates to an enormously wide range of material objects, symbolic structures and ideal images, ranging from models of clothes, small copies of ships and aeroplanes, and various pictures and plots, to mathematical equations and computational algorithms. Starting to define the concept of "model", we would like to recall the difficulty of giving strict definitions of basic concepts. Thus, when university professors define "oscillations" and "waves" in their lectures on this subject, many of them repeat the joke of the Russian academician L.I. Mandel'shtam, who illustrated the problem with the example of the term "heap": how many objects, and of which kind, deserve such a name? He also compared strict definitions at the beginning of studying any topic to "swaddling oneself with barbed wire". Among classical examples of the impossibility of giving exhaustive formulations, one can mention the terms "bald spot", "forest", etc. Therefore, we will not consider the variety of existing definitions of "model" and "modelling" in detail. Any of them relates to the purposes and subjective preferences of its author and is valid in a certain sense. However, each is restricted, since it ignores some objects or properties that deserve attention from other points of view.
Mathematical modeling of aeroelastic systems
Velmisov, Petr A.; Ankilov, Andrey V.; Semenova, Elizaveta P.
2017-12-01
In the paper, the stability of elastic elements of a class of designs that interact with a gas or liquid flow is investigated. The definition of the stability of an elastic body corresponds to the concept of stability of dynamical systems in the sense of Lyapunov. As examples, the mathematical models of flowing channels (models of vibration devices) in a subsonic flow and the mathematical models of a protective surface in a supersonic flow are considered. The models are described by coupled systems of partial differential equations. An analytic investigation of stability is carried out on the basis of the construction of Lyapunov-type functionals; a numerical investigation is carried out on the basis of the Galerkin method. Various models of the gas-liquid environment (compressible, incompressible) and various models of a deformable body (linearly elastic and nonlinearly elastic) are considered.
Radovcich, N. A.; Dreim, D.; Okeefe, D. A.; Linner, L.; Pathak, S. K.; Reaser, J. S.; Richardson, D.; Sweers, J.; Conner, F.
1985-01-01
Work performed in the design of a transport aircraft wing for maximum fuel efficiency is documented with emphasis on design criteria, design methodology, and three design configurations. The design database includes complete finite element model description, sizing data, geometry data, loads data, and inertial data. A design process which satisfies the economics and practical aspects of a real design is illustrated. The cooperative study relationship between the contractor and NASA during the course of the contract is also discussed.
Modelling synergistic effects of appetite regulating hormones
DEFF Research Database (Denmark)
Schmidt, Julie Berg; Ritz, Christian
2016-01-01
We briefly reviewed one definition of dose addition, which is applicable within the framework of generalized linear models. We established how this definition of dose addition corresponds to effect addition in the case where only two doses per compound are considered for evaluating synergistic effects…
A model for persistency of egg production
Grossman, M.; Gossman, T.N.; Koops, W.J.
2000-01-01
The objectives of our study were to propose a new definition for persistency of egg production and to develop a mathematical model to describe the egg production curve, one that includes a new measure for persistency, based on the proposed definition, for use as a selection criterion to improve
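To make the idea of a persistency measure concrete, here is a purely illustrative sketch (this is NOT the authors' model, whose functional form is not given in the abstract): a generic lay curve combining a rapid rise to peak with a slow exponential decline, where a smaller decline rate means production is sustained longer, i.e. higher persistency.

```python
import math

def egg_rate(t, a=0.95, b=0.15, c=0.004):
    """Illustrative weekly rate of lay: rise governed by b, decline by c.

    The decline rate c acts as an (inverse) persistency-like parameter:
    a flatter post-peak curve (smaller c) means higher persistency.
    """
    return a * (1.0 - math.exp(-b * t)) * math.exp(-c * t)

# Compare how much production is retained from week 20 to week 60
for c in (0.004, 0.010):
    retained = egg_rate(60, c=c) / egg_rate(20, c=c)
    print(f"c={c}: production retained week 20 -> 60 = {retained:.2f}")
```

A selection criterion based on such a parameter would favor hens whose fitted c is small, which is the intuition behind using persistency in breeding programs.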
Modelling and Forecasting Multivariate Realized Volatility
DEFF Research Database (Denmark)
Halbleib, Roxana; Voev, Valeri
2011-01-01
This paper proposes a methodology for dynamic modelling and forecasting of realized covariance matrices based on fractionally integrated processes. The approach allows for flexible dependence patterns and automatically guarantees positive definiteness of the forecast. We provide an empirical appl...
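The positive-definiteness guarantee can be illustrated with a minimal sketch: forecast the elements of the Cholesky factor of the realized covariance matrix and reconstruct the forecast from them. Here a naive AR(1)-style shrink toward the mean stands in for the fractionally integrated dynamics used in the paper; the point is only that C = L·Lᵀ is positive definite by construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a short series of 3x3 realized covariance matrices
T, n = 50, 3
covs = []
for _ in range(T):
    X = rng.standard_normal((100, n))
    covs.append(X.T @ X / 100)

# Work on the stacked lower-triangular Cholesky elements
idx = np.tril_indices(n)
chol_series = np.array([np.linalg.cholesky(C)[idx] for C in covs])

# Stand-in forecast of the Cholesky elements (not the paper's model)
mean_elems = chol_series.mean(axis=0)
forecast_elems = 0.7 * chol_series[-1] + 0.3 * mean_elems

# Reconstruct the covariance forecast: positive definite automatically,
# because the forecast diagonal elements remain strictly positive
L = np.zeros((n, n))
L[idx] = forecast_elems
C_forecast = L @ L.T

print("min eigenvalue:", np.linalg.eigvalsh(C_forecast).min())
```

Forecasting on the Cholesky (or log-Cholesky) scale is a common device precisely because no constraint needs to be imposed on the dynamics themselves to keep the forecast a valid covariance matrix.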
Sorensen, K.; van den Broucke, S.; Fullam, J.; Doyle, G.; Pelikan, J.; Slonska, Z.; Brand, H.
2012-01-01
Background: Health literacy concerns the knowledge and competences of persons to meet the complex demands of health in modern society. Although its importance is increasingly recognised, there is no consensus about the definition of health literacy or about its conceptual dimensions, which limits
The IMACLIM model; Le modele IMACLIM
Energy Technology Data Exchange (ETDEWEB)
NONE
2003-07-01
This document provides annexes to the IMACLIM model, offering an updated description of IMACLIM, a model for designing a tool to evaluate greenhouse gas reduction policies. The model is described in a version coupled with POLES, a technical and economic model of the energy industry. Notations, equations, sources, processing and specifications are proposed and detailed. (A.L.B.)
Building Mental Models by Dissecting Physical Models
Srivastava, Anveshna
2016-01-01
When students build physical models from prefabricated components to learn about model systems, there is an implicit trade-off between the physical degrees of freedom in building the model and the intensity of instructor supervision needed. Models that are too flexible, permitting multiple possible constructions, require greater supervision to…
Energy Technology Data Exchange (ETDEWEB)
Blanchard, Miran [Department of Molecular Medicine, Mayo Clinic, Rochester, Minnesota (United States); Department of Radiation Oncology, Mayo Clinic, Rochester, Minnesota (United States); Shim, Kevin G. [Department of Molecular Medicine, Mayo Clinic, Rochester, Minnesota (United States); Department of Immunology, Mayo Clinic, Rochester, Minnesota (United States); Grams, Michael P. [Department of Radiation Oncology, Mayo Clinic, Rochester, Minnesota (United States); Rajani, Karishma; Diaz, Rosa M. [Department of Molecular Medicine, Mayo Clinic, Rochester, Minnesota (United States); Furutani, Keith M. [Department of Radiation Oncology, Mayo Clinic, Rochester, Minnesota (United States); Thompson, Jill [Department of Molecular Medicine, Mayo Clinic, Rochester, Minnesota (United States); Olivier, Kenneth R.; Park, Sean S. [Department of Radiation Oncology, Mayo Clinic, Rochester, Minnesota (United States); Markovic, Svetomir N. [Department of Immunology, Mayo Clinic, Rochester, Minnesota (United States); Department of Medical Oncology, Mayo Clinic, Rochester, Minnesota (United States); Pandha, Hardev [The Postgraduate Medical School, University of Surrey, Guildford (United Kingdom); Melcher, Alan [Leeds Institute of Cancer Studies and Pathology, University of Leeds, Leeds (United Kingdom); Harrington, Kevin [Targeted Therapy Laboratory, The Institute of Cancer Research, London (United Kingdom); Zaidi, Shane [Department of Molecular Medicine, Mayo Clinic, Rochester, Minnesota (United States); Targeted Therapy Laboratory, The Institute of Cancer Research, London (United Kingdom); Vile, Richard, E-mail: vile.richard@mayo.edu [Department of Molecular Medicine, Mayo Clinic, Rochester, Minnesota (United States); Department of Immunology, Mayo Clinic, Rochester, Minnesota (United States); Leeds Institute of Cancer Studies and Pathology, University of Leeds, Leeds (United Kingdom)
2015-11-01
Purpose: The oligometastatic state is an intermediate state between a malignancy that can be completely eradicated with conventional modalities and one in which a palliative approach is undertaken. Clinically, high rates of local tumor control are possible with stereotactic ablative radiation therapy (SABR), using precisely targeted, high-dose, low-fraction radiation therapy. However, in oligometastatic melanoma, virtually all patients develop progression systemically at sites not initially treated with ablative radiation therapy that cannot be managed with conventional chemotherapy and immunotherapy. We have demonstrated in mice that intravenous administration of vesicular stomatitis virus (VSV) expressing defined tumor-associated antigens (TAAs) generates systemic immune responses capable of clearing established tumors. Therefore, in the present preclinical study, we tested whether the combination of systemic VSV-mediated antigen delivery and SABR would be effective against oligometastatic disease. Methods and Materials: We generated a model of oligometastatic melanoma in C57BL/6 immunocompetent mice and then used a combination of SABR and systemically administered VSV-TAA viral immunotherapy to treat both local and systemic disease. Results: Our data showed that SABR generates excellent control or cure of local, clinically detectable, and accessible tumor through direct cell ablation. Also, the immunotherapeutic activity of systemically administered VSV-TAA generated T-cell responses that cleared subclinical metastatic tumors. We also showed that SABR induced weak T-cell-mediated tumor responses, which, particularly if boosted by VSV-TAA, might contribute to control of local and systemic disease. In addition, VSV-TAA therapy alone had significant effects on control of both local and metastatic tumors. Conclusions: We have shown in the present preliminary murine study using a single tumor model that this approach represents an effective, complementary
Model correction factor method for system analysis
DEFF Research Database (Denmark)
Ditlevsen, Ove Dalager; Johannesen, Johannes M.
2000-01-01
The Model Correction Factor Method (MCFM) is an intelligent response surface method based on simplified modeling. MCFM is aimed at reliability analysis in the case of a limit state defined by an elaborate model. Herein it is demonstrated that the method is applicable for elaborate limit state surfaces on which several locally most central points exist without there being a simple geometric definition of the corresponding failure modes, such as is the case for collapse mechanisms in rigid plastic hinge models for frame structures. Taking as simplified idealized model a model of similarity with the elaborate model … surface than existing in the idealized model…
Towards Clone Detection in UML Domain Models
DEFF Research Database (Denmark)
Störrle, Harald
2013-01-01
Code clones (i.e., duplicate fragments of code) have been studied for a long time, and there is strong evidence that they are a major source of software faults. Anecdotal evidence suggests that this phenomenon occurs similarly in models, suggesting that model clones are as detrimental to model quality as they are to code quality. However, programming language code and visual models have significant differences that make it difficult to directly transfer notions and algorithms developed in the code clone arena to model clones. In this article, we develop and propose a definition of the notion of "model clone" based on the thorough analysis of practical scenarios. We propose a formal definition of model clones, specify a clone detection algorithm for UML domain models, and implement it prototypically. We investigate different similarity heuristics to be used in the algorithm, and report the performance of our approach. While…
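One conceivable similarity heuristic (an assumption for illustration, not Störrle's actual algorithm) compares model fragments by the Jaccard similarity of their element signatures, so that a renamed element still leaves most of the signature overlapping:

```python
def fragment_signature(elements):
    """Represent a model fragment as a set of (type, name) pairs."""
    return {(e["type"], e["name"]) for e in elements}

def jaccard(a, b):
    """Jaccard similarity of two sets: |intersection| / |union|."""
    return len(a & b) / len(a | b) if a | b else 1.0

frag1 = fragment_signature([
    {"type": "Class", "name": "Order"},
    {"type": "Class", "name": "Customer"},
    {"type": "Association", "name": "places"},
])
frag2 = fragment_signature([
    {"type": "Class", "name": "Order"},
    {"type": "Class", "name": "Customer"},
    {"type": "Association", "name": "submits"},  # renamed -> near-clone
])

sim = jaccard(frag1, frag2)
print(f"similarity = {sim:.2f}")  # 2 shared of 4 distinct pairs -> 0.50
```

A detector would flag pairs above some threshold as clone candidates; real heuristics would also weigh structure (containment, associations), not just names.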
Atmospheric Models/Global Atmospheric Modeling
1998-09-30
Timothy F. Hogan, Naval Research Laboratory, Monterey, CA 93943-5502. Improvements (initialization of increments, improved cloud prediction, and improved surface fluxes) have been transitioned to 6.4 (Global Atmospheric Models, PE 0603207N, X-0513).
Models in architectural design
Pauwels, Pieter
2017-01-01
Whereas architects and construction specialists used to rely mainly on sketches and physical models as representations of their own cognitive design models, they rely now more and more on computer models. Parametric models, generative models, as-built models, building information models (BIM), and so forth, they are used daily by any practitioner in architectural design and construction. Although processes of abstraction and the actual architectural model-based reasoning itself of course rema...
International Nuclear Information System (INIS)
Tozini, A.V.
1984-01-01
A review is made of some properties of rotating Universe models. Gödel's model is identified as a generalized tilted model. Some properties of new solutions of Einstein's equations, which are rotating non-stationary Universe models, are presented and analyzed. These models have Gödel's model as a particular case. Non-stationary cosmological models are found which generalize Gödel's metric in a way analogous to that in which Friedmann's model generalizes Einstein's. (L.C.)
Jenkyn, T R; Nicol, A C
2007-01-01
A multi-segment kinematic model of the foot was developed for use in a gait analysis laboratory. The foot was divided into hindfoot, talus, midfoot and medial and lateral forefoot segments. Six functional joints were defined: ankle and subtalar joints, frontal and transverse plane motions of the hindfoot relative to midfoot, supination/pronation twist of the forefoot relative to midfoot and medial longitudinal arch height-to-length ratio. Twelve asymptomatic subjects were tested during barefoot walking with a six-camera optical stereometric system and auto-reflective markers organized in triads. Repeatability of the joint motions was tested using coefficients of multiple correlation. Ankle and subtalar joint motions and twisting of the forefoot were most repeatable. Hindfoot motions were least repeatable both within-subjects and between-subjects. Hindfoot and forefoot pronation in the frontal plane was found to coincide with dropping of the medial longitudinal arch between early to mid-stance, followed by supination and rising of the arch in late stance and swing phase. This multi-segment foot model addresses an unfortunate shortcoming in current gait analysis practice: the inability to measure motion within the foot. Such measurements are crucial if gait analysis is to remain relevant in the orthopaedic and rehabilitative treatment of the foot and ankle.
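The repeatability measure named above, the coefficient of multiple correlation (CMC), can be sketched as follows. A common formulation is assumed here (within-trial deviation from the mean curve versus total deviation); the study's exact variant is not specified in the abstract.

```python
import numpy as np

def cmc(waveforms):
    """Coefficient of multiple correlation for repeated gait waveforms.

    waveforms: array-like of shape (n_trials, n_frames), e.g. a joint-angle
    curve measured over several walking trials. Values near 1 indicate that
    the trials reproduce essentially the same curve.
    """
    Y = np.asarray(waveforms, dtype=float)
    n, t = Y.shape
    frame_means = Y.mean(axis=0)            # mean curve across trials
    grand_mean = Y.mean()
    within = ((Y - frame_means) ** 2).sum() / (t * (n - 1))
    total = ((Y - grand_mean) ** 2).sum() / (t * n - 1)
    return float(np.sqrt(1.0 - within / total))

# Synthetic example: five trials of an idealized joint-angle curve plus noise
frames = np.linspace(0, 1, 101)
base = 10 * np.sin(2 * np.pi * frames)
rng = np.random.default_rng(1)
trials = [base + rng.normal(0, 0.5, frames.size) for _ in range(5)]
print(f"CMC = {cmc(trials):.3f}")  # close to 1 => highly repeatable motion
```

With small noise relative to the range of motion, the CMC is close to 1, matching the paper's finding that large, well-tracked motions (ankle, subtalar) are the most repeatable.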
Directory of Open Access Journals (Sweden)
Marcus Alessi Bittencourt
2016-07-01
An indisputable cornerstone of the Western music tradition, the dialectic opposition between the major and minor grammatical modal genders has been present in the imagination of musicians and music theorists for centuries. Such a dialectics of opposition is especially important in the context of nineteenth-century harmonic dualism, with its ideas of tonicity and phonicity. These concepts serve as the main foundation for the way harmonic dualism conceives the major and minor worlds: two worlds with equivalent rights and properties, but with opposed polarities. This paper presents a redefinition of the terms tonicity and phonicity, translating those concepts to the context of post-tonal music theory. The terminologies of generatrix, tonicity, root, phonicity, vertex, and azimuth are explained in this paper, followed by propositions of mathematical models for those concepts, which spring from Richard Parncutt's root-salience model for pitch-class sets. In order to demonstrate the possibilities of using modal gender as a criterion for the study and classification of the universe of Tn-types, we present a taxonomy of the 351 transpositional set types, which comprises the categories of tonic (major), phonic (minor) and neutral (genderless). In addition, there is a small discussion on the effect of set symmetries and asymmetries on the tonic/phonic properties of a Tn-type.
Stochastic Subspace Modelling of Turbulence
DEFF Research Database (Denmark)
Sichani, Mahdi Teimouri; Pedersen, B. J.; Nielsen, Søren R.K.
2009-01-01
Turbulence of the incoming wind field is of paramount importance to the dynamic response of civil engineering structures. Hence reliable stochastic models of the turbulence should be available, from which time series can be generated for dynamic response and structural safety analysis. In the paper, from a positive definite cross-spectral density matrix, a frequency response matrix is constructed which determines the turbulence vector as a linear filtration of Gaussian white noise. Finally, an accurate state space modelling method is proposed which allows selection of an appropriate model order, and estimation of a state space model for the vector turbulence process incorporating its phase spectrum in one stage; its results are compared with a conventional ARMA modelling method.
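The generation step can be sketched in a few lines (illustrative only, not the paper's estimation procedure): once a state space model has been identified, turbulence time series are produced by filtering Gaussian white noise through it.

```python
import numpy as np

rng = np.random.default_rng(42)

# A small stable state space model: x_{k+1} = A x_k + B w_k, y_k = C x_k,
# standing in for a model identified from the turbulence cross-spectra.
A = np.array([[0.9, 0.1],
              [0.0, 0.8]])   # eigenvalues 0.9, 0.8 -> stable filter
B = np.array([[1.0],
              [0.5]])
C = np.array([[1.0, 0.0]])

x = np.zeros(2)
samples = []
for _ in range(10_000):
    x = A @ x + B @ rng.standard_normal(1)   # drive with white noise
    samples.append(float(C @ x))

y = np.array(samples)
print(f"simulated turbulence series: mean={y.mean():.2f}, std={y.std():.2f}")
```

Because the filter is linear and time-invariant, the output is a stationary Gaussian process whose spectrum is shaped entirely by (A, B, C); fitting those matrices to the target cross-spectral density is the estimation problem the paper addresses.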
2012-03-01
Development and Application of a Protocol for Definition of Process Conditions for Directional Solidification (AFRL-RX-WP-TP-2012-0252). Includes a defect map for directional solidification with superimposed predicted preferred solidification conditions for a range of bar thicknesses and mold …
DEFF Research Database (Denmark)
Branlard, Emmanuel Simon Pierre
2017-01-01
Different models of wake expansion are presented in this chapter: the 1D momentum theory model, the cylinder analog model and Theodorsen's model. Far wake models, such as the ones from Frandsen or Rathmann, are only briefly mentioned. The different models are compared to each other. Results from…
Uninformative priors prefer simpler models
Mattingly, Henry; Abbott, Michael; Machta, Benjamin
The Bayesian framework for model selection requires a prior for the probability of candidate models that is uninformative: it minimally biases predictions with preconceptions. For parameterized models, Jeffreys' uninformative prior, pJ, weights parameter space according to the local density of distinguishable model predictions. While pJ is rigorously justifiable in the limit that there is infinite data, it is ill-suited to effective theories and sloppy models. In these models, parameters are very poorly constrained by available data, and even the number of parameters is often arbitrary. We use a principled definition of `uninformative' as the mutual information between parameters and their expected data and study the properties of the prior p* which maximizes it. When data is abundant, p* approaches Jeffreys' prior. With finite data, however, p* is discrete, putting weight on a finite number of atoms in parameter space. In addition, when data is scarce, the prior lies on model boundaries, which in many cases correspond to interpretable models but with fewer parameters. As more data becomes available, the prior puts weight on models with more parameters. Thus, p* quantifies the intuition that better data can justify the use of more complex models.
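For concreteness, here is the standard textbook instance of the Jeffreys prior that the abstract uses as its baseline: for a Bernoulli(θ) model, the Fisher information is I(θ) = 1/(θ(1−θ)), so pJ(θ) ∝ √I(θ), the Beta(1/2, 1/2) (arcsine) distribution.

```python
import math

def fisher_information_bernoulli(theta):
    """Fisher information of a single Bernoulli(theta) observation."""
    return 1.0 / (theta * (1.0 - theta))

def jeffreys_unnormalized(theta):
    """Jeffreys' prior density, up to normalization: sqrt(I(theta))."""
    return math.sqrt(fisher_information_bernoulli(theta))

# The normalizing constant for this model is pi (Beta(1/2, 1/2)).
# Weight piles up near the boundaries, where outcomes change fastest
# with theta, i.e. where predictions are most distinguishable.
for theta in (0.05, 0.5, 0.95):
    print(theta, round(jeffreys_unnormalized(theta) / math.pi, 3))
```

The abstract's point is that when data is finite this continuous density is no longer optimal: the mutual-information-maximizing prior p* instead concentrates on a discrete set of atoms, often on the boundary models.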
Improved Maximum Parsimony Models for Phylogenetic Networks.
Van Iersel, Leo; Jones, Mark; Scornavacca, Celine
2018-05-01
Phylogenetic networks are well suited to represent evolutionary histories comprising reticulate evolution. Several methods aiming at reconstructing explicit phylogenetic networks have been developed in the last two decades. In this article, we propose a new definition of maximum parsimony for phylogenetic networks that permits modeling of biological scenarios that cannot be modeled by the definitions currently present in the literature (namely, the "hardwired" and "softwired" parsimony). Building on this new definition, we provide several algorithmic results that lay the foundations for new parsimony-based methods for phylogenetic network reconstruction.
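As background for what the hardwired and softwired definitions extend, here is the classical small-parsimony computation on a tree via Fitch's algorithm (the tree and leaf states below are illustrative):

```python
def fitch(node, leaf_states):
    """Return (state_set, cost) for the subtree rooted at node.

    A node is either a leaf label (str) or a pair of child subtrees.
    The cost counts the minimum number of character-state changes.
    """
    if isinstance(node, str):                     # leaf: observed state
        return {leaf_states[node]}, 0
    (ls, lc), (rs, rc) = (fitch(child, leaf_states) for child in node)
    inter = ls & rs
    if inter:
        return inter, lc + rc                     # agreement: no extra change
    return ls | rs, lc + rc + 1                   # disagreement: one change

tree = (("A", "B"), ("C", "D"))                   # rooted binary tree
states = {"A": "G", "B": "G", "C": "T", "D": "G"}
_, score = fitch(tree, states)
print(f"parsimony score = {score}")  # one mutation explains the leaf states
```

The network versions generalize this: hardwired parsimony scores changes over every edge of the network, while softwired parsimony minimizes over the trees displayed by the network; the paper's new definition targets scenarios neither of these captures.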
Model Manipulation for End-User Modelers
DEFF Research Database (Denmark)
Acretoaie, Vlad
End-user modelers are domain experts who create and use models as part of their work. They are typically not software engineers, and have little or no programming and meta-modeling experience. However, using model manipulation languages developed in the context of Model-Driven Engineering often … of these proposals. To achieve its first goal, the thesis presents the findings of a Systematic Mapping Study showing that human factors topics are scarcely and relatively poorly addressed in model transformation research. Motivated by these findings, the thesis explores the requirements of end-user modelers …, and transformations using their modeling notation and editor of choice. The VM* languages are implemented via a single execution engine, the VM* Runtime, built on top of the Henshin graph-based transformation engine. This approach combines the benefits of flexibility, maturity, and formality. To simplify model editor…
Pi, E. I.; Siegel, E.
2010-03-01
Siegel[AMS Natl.Mtg.(2002)-Abs.973-60-124] digits logarithmic-law inversion to ONLY BEQS BEC:Quanta/Bosons=#: EMP-like SEVERE VULNERABILITY of ONLY #-networks(VS.ANALOG INvulnerability) via Barabasi NP(VS.dynamics[Not.AMS(5/2009)] critique);(so called)``quantum-computing''(QC) = simple-arithmetic (sans division); algorithmic complexities:INtractibility/UNdecidability/INefficiency/NONcomputability/HARDNESS(so MIScalled) ``noise''-induced-phase-transition(NIT)ACCELERATION:Cook-Levin theorem Reducibility = RG fixed-points; #-Randomness DEFINITION via WHAT? Query(VS. Goldreich[Not.AMS(2002)] How? mea culpa)= ONLY MBCS hot-plasma v #-clumping NON-random BEC; Modular-Arithmetic Congruences = Signal x Noise PRODUCTS = clock-model; NON-Shor[Physica A,341,586(04)]BEC logarithmic-law inversion factorization: Watkins #-theory U statistical-physics); P=/=NP C-S TRIVIAL Proof: Euclid!!! [(So Miscalled) computational-complexity J-O obviation(3 millennia AGO geometry: NO:CC,``CS'';``Feet of Clay!!!'']; Query WHAT?:Definition: (so MIScalled)``complexity''=UTTER-SIMPLICITY!! v COMPLICATEDNESS MEASURE(S).
Model-to-model interface for multiscale materials modeling
Energy Technology Data Exchange (ETDEWEB)
Antonelli, Perry Edward [Iowa State Univ., Ames, IA (United States)
2017-12-17
A low-level model-to-model interface is presented that will enable independent models to be linked into an integrated system of models. The interface is based on a standard set of functions that contain appropriate export and import schemas that enable models to be linked with no changes to the models themselves. These ideas are presented in the context of a specific multiscale material problem that couples atomistic-based molecular dynamics calculations to continuum calculations of fluid flow. These simulations will be used to examine the influence of interactions of the fluid with an adjacent solid on the fluid flow. The interface will also be examined by adding it to an already existing modeling code, Large-scale Atomic/Molecular Massively Parallel Simulator (LAMMPS), and comparing it with our own molecular dynamics code.
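The export/import idea can be sketched as follows. All names here (`export_state`, `import_state`, the `wall_traction` schema) are assumptions for illustration, not the actual interface of the cited work: each model exposes a standard pair of functions keyed by a schema, so models can be coupled without modifying their internals.

```python
class ModelInterface:
    """Standard functions every coupled model implements."""
    def export_state(self, schema):
        raise NotImplementedError
    def import_state(self, schema, data):
        raise NotImplementedError

class AtomisticModel(ModelInterface):
    """Stand-in for an MD code exporting a wall quantity to the continuum side."""
    def __init__(self):
        self.wall_force = 1.25               # placeholder for an MD-computed value
    def export_state(self, schema):
        if schema == "wall_traction":
            return {"force_per_area": self.wall_force}
        raise KeyError(schema)
    def import_state(self, schema, data):
        pass                                 # one-way coupling in this sketch

class ContinuumModel(ModelInterface):
    """Stand-in for a fluid-flow solver receiving a boundary condition."""
    def __init__(self):
        self.boundary_traction = 0.0
    def import_state(self, schema, data):
        if schema == "wall_traction":
            self.boundary_traction = data["force_per_area"]
    def export_state(self, schema):
        return {}

# Coupling step: neither model knows the other's internals, only the schema.
md, fluid = AtomisticModel(), ContinuumModel()
fluid.import_state("wall_traction", md.export_state("wall_traction"))
print(fluid.boundary_traction)  # 1.25
```

The schema acts as the contract: adding the interface to an existing code such as LAMMPS means implementing these two functions against its data structures, with no changes to the solver logic itself.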
Monica, Ratti Maria; Delli Zotti, Giulia Bruna; Spotti, Donatella; Sarno, Lucio
2014-01-01
Chronic Kidney Disease (CKD) and dialytic treatment have a significant psychological impact on patients, their families, and the medical-nursing staff. The psychological aspects linked to the chronic condition of kidney disease generate the need to integrate a psychologist into the healthcare team of the Nephrology, Dialysis and Hypertension Operative Unit, in order to offer specific and professional support to the patient during the different stages of the disease, to their caregivers, and to the medical team. The aim of this collaboration project between Nephrology and Psychology is to create a global and integrated healthcare model, one that attends not only to the physical dimension of patients affected by CKD, but also to the emotional-affective, cognitive and social dimensions and to the health environment.
Directory of Open Access Journals (Sweden)
David Lo Buglio
2012-12-01
Full Text Available With the arrival of digital technologies in the field of architectural documentation, many tools and methods for data acquisition have been developed considerably. However, these developments are primarily used for recording the colorimetric and dimensional properties of the objects processed. The disciplines concerned with 3D digitization of architectural heritage are therefore faced with very large volumes of data, leaving the survey far from its cognitive dimension. In this context, it seems necessary to provide innovative solutions that increase the informational value of the representations produced by strengthening the relations between the "multiplicity" of data and the "intelligibility" of the theoretical model. To answer this perceived lack of methodology, this article offers an approach to the creation of representation systems that articulate the digital instance with the geometric/semantic model.
Tryby, M.; Fries, J. S.; Baranowski, C.
2014-12-01
Extreme precipitation events can cause significant impacts to drinking water and wastewater utilities, including facility damage, water quality impacts, service interruptions and potential risks to human health and the environment due to localized flooding and combined sewer overflows (CSOs). These impacts will become more pronounced with the projected increases in frequency and intensity of extreme precipitation events due to climate change. To model the impacts of extreme precipitation events, wastewater utilities often develop Intensity, Duration, and Frequency (IDF) rainfall curves and "design storms" for use in the U.S. Environmental Protection Agency's (EPA) Storm Water Management Model (SWMM). Wastewater utilities use SWMM for planning, analysis, and facility design related to stormwater runoff, combined and sanitary sewers, and other drainage systems in urban and non-urban areas. SWMM tracks (1) the quantity and quality of runoff generated within each sub-catchment; and (2) the flow rate, flow depth, and quality of water in each pipe and channel during a simulation period made up of multiple time steps. In its current format, EPA SWMM does not consider climate change projection data. Climate change may affect the relationship between intensity, duration, and frequency described by past rainfall events. Therefore, EPA is integrating climate projection data available in the Climate Resilience Evaluation and Awareness Tool (CREAT) into SWMM. CREAT is a climate risk assessment tool for utilities that provides downscaled climate change projection data for changes in the amount of rainfall in a 24-hour period for various extreme precipitation events (e.g., from 5-year to 100-year storm events). Incorporating climate change projections into SWMM will provide wastewater utilities with more comprehensive data they can use in planning for future storm events, thereby reducing the impacts to the utility and customers served from flooding and stormwater issues.
Concept Modeling vs. Data modeling in Practice
DEFF Research Database (Denmark)
Madsen, Bodil Nistrup; Erdman Thomsen, Hanne
2015-01-01
This chapter shows the usefulness of terminological concept modeling as a first step in data modeling. First, we introduce terminological concept modeling with terminological ontologies, i.e. concept systems enriched with characteristics modeled as feature specifications. This enables a formal account of the inheritance of characteristics and allows us to introduce a number of principles and constraints which render concept modeling more coherent than earlier approaches. Second, we explain how terminological ontologies can be used as the basis for developing conceptual and logical data models...
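A minimal sketch of the idea of concepts enriched with feature specifications and formally inherited characteristics (the concepts, features, and data structure here are invented for illustration and are not from the chapter):

```python
# Hypothetical terminological ontology: each concept has a parent (superordinate
# concept) and a feature specification; characteristics are inherited down the
# concept system, with more specific concepts able to refine inherited features.
ontology = {
    'document':   {'parent': None,       'features': {'has_author': True}},
    'e-document': {'parent': 'document', 'features': {'medium': 'digital'}},
}

def inherited_features(ontology, concept):
    """Collect the full feature specification of a concept, applying
    inheritance from the most general concept down to the most specific."""
    chain = []
    while concept is not None:
        chain.append(concept)
        concept = ontology[concept]['parent']
    feats = {}
    for c in reversed(chain):  # general -> specific: subtypes may refine
        feats.update(ontology[c]['features'])
    return feats
```

Here `e-document` inherits `has_author` from `document` while adding its own delimiting characteristic `medium: digital`.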
Cognitive models embedded in system simulation models
International Nuclear Information System (INIS)
Siegel, A.I.; Wolf, J.J.
1982-01-01
If we are to discuss and consider cognitive models, we must first come to grips with two questions: (1) What is cognition? (2) What is a model? Presumably, the answers to these questions can provide a basis for defining a cognitive model. Accordingly, this paper first places these two questions into perspective. Then, cognitive models are set within the context of computer simulation models and a number of computer simulations of cognitive processes are described. Finally, pervasive issues are discussed vis-a-vis cognitive modeling in the computer simulation context.
Model-Based Learning: A Synthesis of Theory and Research
Seel, Norbert M.
2017-01-01
This article provides a review of theoretical approaches to model-based learning and related research. In accordance with the definition of model-based learning as an acquisition and utilization of mental models by learners, the first section centers on mental model theory. In accordance with epistemology of modeling the issues of semantics,…
Modeling and Predistortion of Envelope Tracking Power Amplifiers using a Memory Binomial Model
DEFF Research Database (Denmark)
Tafuri, Felice Francesco; Sira, Daniel; Larsen, Torben
2013-01-01
The model definition is based on binomial series, hence the name memory binomial model (MBM). The MBM is here applied to measured data-sets acquired from an ET measurement set-up. When used as a PA model the MBM showed an NMSE (Normalized Mean Squared Error) as low as −40 dB and an ACEPR (Adjacent Channel...
Dodgson, Mark; Gann, David; Phillips, Nelson; Massa, Lorenzo; Tucci, Christopher
2014-01-01
The chapter offers a broad review of the literature at the nexus between Business Models and innovation studies, and examines the notion of Business Model Innovation in three different situations: Business Model Design in newly formed organizations, Business Model Reconfiguration in incumbent firms, and Business Model Innovation in the broad context of sustainability. Tools and perspectives to make sense of Business Models and support managers and entrepreneurs in dealing with Business Model ...
Air Quality Dispersion Modeling - Alternative Models
Models, not listed in Appendix W, that can be used in regulatory applications with case-by-case justification to the Reviewing Authority as noted in Section 3.2, Use of Alternative Models, in Appendix W.
Wake modelling combining mesoscale and microscale models
DEFF Research Database (Denmark)
Badger, Jake; Volker, Patrick; Prospathospoulos, J.
2013-01-01
In this paper the basis for introducing thrust information from microscale wake models into mesoscale model wake parameterizations will be described. A classification system for the different types of mesoscale wake parameterizations is suggested and outlined. Four different mesoscale wake parameterizations are demonstrated in the Weather Research and Forecasting mesoscale model (WRF) in an idealized atmospheric flow. The model framework is the Horns Rev I wind farm experiencing a 7.97 m/s wind from 269.4°. Three of the four parameterizations use thrust output from the CRESflow-NS microscale model. The characteristics of the mesoscale wake that developed from the four parameterizations are examined. In addition the mesoscale model wakes are compared to measurement data from Horns Rev I. Overall it is seen as an advantage to incorporate microscale model data in mesoscale model wake parameterizations.
A Model of Trusted Measurement Model
Ma Zhili; Wang Zhihao; Dai Liang; Zhu Xiaoqin
2017-01-01
A model of trusted measurement supporting behavior measurement based on the trusted connection architecture (TCA) with three entities and three levels is proposed, and a framework to illustrate the model is given. The model synthesizes three trusted measurement dimensions, including trusted identity, trusted status and trusted behavior; it satisfies the essential requirements of trusted measurement and unifies the TCA with three entities and three levels.
Molecular Models: Construction of Models with Magnets
Directory of Open Access Journals (Sweden)
Kalinovčić P.
2015-07-01
Full Text Available Molecular models are indispensable tools in teaching chemistry. Besides their high price, commercially available models are generally too small for classroom demonstration. This paper suggests how to make space-filling (calotte) models from Styrofoam, with magnetic balls as connectors and disc magnets for showing molecular polarity.
Spencer, Netanya Y; Yan, Ziying; Cong, Le; Zhang, Yulong; Engelhardt, John F; Stanton, Robert C
2016-02-01
Studies to determine subcellular localization and translocation of proteins are important because subcellular localization of proteins affects every aspect of cellular function. Such studies frequently utilize mutagenesis to alter amino acid sequences hypothesized to constitute subcellular localization signals. These studies often utilize fluorescent protein tags to facilitate live cell imaging. These methods are excellent for studies of monomeric proteins, but for multimeric proteins, they are unable to rule out artifacts from native protein subunits already present in the cells. That is, native monomers might direct the localization of fluorescent proteins with their localization signals obliterated. We have developed a method for ruling out such artifacts, and we use glucose 6-phosphate dehydrogenase (G6PD) as a model to demonstrate the method's utility. Because G6PD is capable of homodimerization, we employed a novel approach to remove interference from native G6PD. We produced a G6PD knockout somatic (hepatic) cell line using CRISPR-Cas9 mediated genome engineering. Transfection of G6PD knockout cells with G6PD fluorescent mutant proteins demonstrated that the major subcellular localization sequences of G6PD are within the N-terminal portion of the protein. This approach sets a new gold standard for similar studies of subcellular localization signals in all homodimerization-capable proteins. Copyright © 2015 Elsevier Inc. All rights reserved.
Demir, Resit; Peros, Georgios; Hohenberger, Werner
2011-06-01
Since the introduction of angiogenic therapy by Folkman et al. in the 1970s, many antiangiogenic drugs have been identified. Only a few of them are still in clinical use. The Vascular Endothelial Growth Factor (VEGF), the cytokine with the highest angiogenic activity, has also been identified. Its antagonist, Bevacizumab, is produced and approved for angiogenic therapy in first line for metastatic colorectal cancer. Preclinical studies, however, lack in vivo models that define the "Drug-Angiogenic-Activity-Index" of angiogenic or antiangiogenic drugs. This work proposes a possible standardized procedure to define the "Drug-Angiogenic-Activity-Index" by counting the vascular intersections (VIS) on the Chorioallantoic Membrane after drug application. The index was defined as follows: (ΔVIS[Drug] − ΔVIS[Control]) / ΔVIS[Control]. For VEGF a Drug-Angiogenic-Activity-Index of 0.92 was found, and for Bevacizumab −1. This means almost double the natural angiogenic activity was achieved by VEGF on the Chorioallantoic membrane, while a complete blocking of natural angiogenic activity was observed after Bevacizumab application. Establishing the "Drug-Angiogenic-Activity-Index" in the preclinical phase will give a measure of effectiveness for newly constructed antiangiogenic drugs, comparable to measures of effectiveness in the cortisone family.
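The index defined in the abstract is a one-line computation; a small sketch (the function name is ours):

```python
def drug_angiogenic_activity_index(delta_vis_drug, delta_vis_control):
    """Drug-Angiogenic-Activity-Index as defined in the abstract:
    (dVIS[Drug] - dVIS[Control]) / dVIS[Control]."""
    return (delta_vis_drug - delta_vis_control) / delta_vis_control
```

A drug matching the control's vascular growth gives 0; complete blocking of vascular growth (ΔVIS[Drug] = 0) gives −1, as reported for Bevacizumab.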
Chao, Michael C; Pritchard, Justin R; Zhang, Yanjia J; Rubin, Eric J; Livny, Jonathan; Davis, Brigid M; Waldor, Matthew K
2013-10-01
The coupling of high-density transposon mutagenesis to high-throughput DNA sequencing (transposon-insertion sequencing) enables simultaneous and genome-wide assessment of the contributions of individual loci to bacterial growth and survival. We have refined analysis of transposon-insertion sequencing data by normalizing for the effect of DNA replication on sequencing output and using a hidden Markov model (HMM)-based filter to exploit heretofore unappreciated information inherent in all transposon-insertion sequencing data sets. The HMM can smooth variations in read abundance and thereby reduce the effects of read noise, as well as permit fine scale mapping that is independent of genomic annotation and enable classification of loci into several functional categories (e.g. essential, domain essential or 'sick'). We generated a high-resolution map of genomic loci (encompassing both intra- and intergenic sequences) that are required or beneficial for in vitro growth of the cholera pathogen, Vibrio cholerae. This work uncovered new metabolic and physiologic requirements for V. cholerae survival, and by combining transposon-insertion sequencing and transcriptomic data sets, we also identified several novel noncoding RNA species that contribute to V. cholerae growth. Our findings suggest that HMM-based approaches will enhance extraction of biological meaning from transposon-insertion sequencing genomic data.
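The HMM-based smoothing described above can be illustrated with a minimal two-state Viterbi decoder over binarized insertion calls. This is an illustrative sketch, not the authors' implementation: the state labels, emission probabilities, uniform initial distribution, and transition stickiness are all assumptions.

```python
import math

def viterbi_two_state(obs, p_emit, p_stay=0.99):
    """Smooth a noisy 0/1 sequence with a sticky two-state HMM.

    obs: list of 0/1 (insertion observed at site?).
    p_emit[s]: P(obs = 1 | state s); e.g. state 0 = "essential" (insertions
    rare), state 1 = "nonessential" (insertions common).
    Returns the most likely state path (Viterbi)."""
    log = math.log
    trans = [[log(p_stay), log(1 - p_stay)],
             [log(1 - p_stay), log(p_stay)]]

    def emit(s, o):
        return log(p_emit[s] if o else 1 - p_emit[s])

    # Dynamic programming over log-probabilities, uniform initial state.
    V = [[log(0.5) + emit(s, obs[0]) for s in (0, 1)]]
    back = []
    for o in obs[1:]:
        row, ptr = [], []
        for s in (0, 1):
            score, q = max((V[-1][q] + trans[q][s], q) for q in (0, 1))
            row.append(score + emit(s, o))
            ptr.append(q)
        V.append(row)
        back.append(ptr)

    # Backtrack from the best final state.
    s = 0 if V[-1][0] >= V[-1][1] else 1
    path = [s]
    for ptr in reversed(back):
        s = ptr[s]
        path.append(s)
    return path[::-1]
```

The sticky transitions (p_stay near 1) are what let the decoder ride over isolated read-noise sites while still switching state for a sustained run of insertion-free sites, which is the smoothing behavior the abstract describes.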
On the Way to Appropriate Model Complexity
Höge, M.
2016-12-01
When statistical models are used to represent natural phenomena they are often too simple or too complex - this is known. But what exactly is model complexity? Among many other definitions, the complexity of a model can be conceptualized as a measure of statistical dependence between observations and parameters (Van der Linde, 2014). However, several issues remain when working with model complexity: a unique definition for model complexity is missing. Assuming a definition is accepted, how can model complexity be quantified? And how can we use a quantified complexity to improve modeling? Generally defined, "complexity is a measure of the information needed to specify the relationships between the elements of organized systems" (Bawden & Robinson, 2015). The complexity of a system changes as the knowledge about the system changes. For models this means that complexity is not a static concept: with more data or higher spatio-temporal resolution of parameters, the complexity of a model changes. There are essentially three categories into which all commonly used complexity measures can be classified: (1) an explicit representation of model complexity as "degrees of freedom" of a model, e.g. the effective number of parameters; (2) model complexity as code length, a.k.a. "Kolmogorov complexity": the longer the shortest model code, the higher its complexity (e.g. in bits); (3) complexity defined via the information entropy of parametric or predictive uncertainty. Preliminary results show that Bayes' theorem allows for incorporating all parts of the non-static concept of model complexity, such as data quality and quantity or parametric uncertainty. Therefore, we test how different approaches for measuring model complexity perform in comparison to a fully Bayesian model selection procedure. Ultimately, we want to find a measure that helps to assess the most appropriate model.
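The third category of measures (complexity via information entropy) can be illustrated with a minimal Shannon-entropy function over a discretized parameter distribution; this is a sketch of the general concept, not the study's procedure:

```python
import math

def shannon_entropy(probs):
    """Entropy in bits of a discrete probability distribution. Higher entropy
    means more information is needed to specify the parameter value, i.e. a
    higher complexity under the category-(3) reading above."""
    return -sum(p * math.log2(p) for p in probs if p > 0)
```

A sharply peaked posterior (all mass in one bin) has entropy 0 bits, while a uniform posterior over 4 bins has 2 bits, matching the intuition that a model whose parameters remain highly uncertain carries more descriptive complexity.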
Target Scattering Metrics: Model-Model and Model-Data Comparisons
2017-12-13
be suitable for input to classification schemes. The investigated metrics are then applied to model-data comparisons. Four potential metrics were investigated; the metric based on 2D cross-correlation is typically used in classification algorithms. Targets used in the TIER simulations for the metrics study included a stainless steel replica of an artillery shell.
A 3 x 2 Achievement Goal Model
Elliot, Andrew J.; Murayama, Kou; Pekrun, Reinhard
2011-01-01
In the present research, a 3 x 2 model of achievement goals is proposed and tested. The model is rooted in the definition and valence components of competence, and encompasses 6 goal constructs: task-approach, task-avoidance, self-approach, self-avoidance, other-approach, and other-avoidance. The results from 2 studies provided strong support for…
Extendable linearised adjustment model for deformation analysis
Hiddo Velsink
2015-01-01
Author supplied: "This paper gives a linearised adjustment model for the affine, similarity and congruence transformations in 3D that is easily extendable with other parameters to describe deformations. The model considers all coordinates stochastic. Full positive semi-definite covariance matrices
Extendable linearised adjustment model for deformation analysis
Velsink, H.
2015-01-01
This paper gives a linearised adjustment model for the affine, similarity and congruence transformations in 3D that is easily extendable with other parameters to describe deformations. The model considers all coordinates stochastic. Full positive semi-definite covariance matrices and correlation
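The affine part of such a transformation can be estimated by ordinary least squares. The sketch below is only an illustration of fitting a 3D affine transformation to point correspondences: it assumes error-free source coordinates and ignores the paper's stochastic-coordinate treatment and full covariance matrices.

```python
import numpy as np

def fit_affine_3d(src, dst):
    """Ordinary least-squares estimate of A (3x3) and t (3,) such that
    dst ~= src @ A.T + t, from matched 3D point sets."""
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    # Design matrix [x y z 1] per point; solve for the 4x3 coefficient block.
    X = np.hstack([src, np.ones((len(src), 1))])
    coef, *_ = np.linalg.lstsq(X, dst, rcond=None)
    return coef[:3].T, coef[3]
```

With at least four non-coplanar correspondences the 12 affine parameters are determined; the similarity and congruence cases of the paper restrict A further (scaled rotation and pure rotation, respectively).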
Model Uncertainty for Bilinear Hysteretic Systems
DEFF Research Database (Denmark)
Sørensen, John Dalsgaard; Thoft-Christensen, Palle
1984-01-01
is related to the concept of a failure surface (or limit state surface) in the n-dimensional basic variable space then model uncertainty is at least due to the neglected variables, the modelling of the failure surface and the computational technique used. A more precise definition is given in section 2...
A Formal Model for Context-Awareness
DEFF Research Database (Denmark)
Kjærgaard, Mikkel Baun; Bunde-Pedersen, Jonathan
There is a definite lack of formal support for modeling realistic context-awareness in pervasive computing applications. The Conawa calculus presented in this paper provides mechanisms for modeling complex and interwoven sets of context-information by extending ambient calculus with new construc...
Target-Centric Network Modeling
DEFF Research Database (Denmark)
Mitchell, Dr. William L.; Clark, Dr. Robert M.
reporting formats, along with a tested process that facilitates the production of a wide range of analytical products for civilian, military, and hybrid intelligence environments. Readers will learn how to perform the specific actions of problem definition modeling, target network modeling, and collaborative sharing in the process of creating a high-quality, actionable intelligence product. The case studies reflect the complexity of twenty-first century intelligence issues by dealing with multi-layered target networks that cut across political, economic, social, technological, and military issues...
Exclusion statistics and integrable models
International Nuclear Information System (INIS)
Mashkevich, S.
1998-01-01
The definition of exclusion statistics that was given by Haldane admits a 'statistical interaction' between distinguishable particles (multispecies statistics). For such statistics, thermodynamic quantities can be evaluated exactly; explicit expressions are presented here for cluster coefficients. Furthermore, single-species exclusion statistics is realized in one-dimensional integrable models of the Calogero-Sutherland type. The interesting questions of generalizing this correspondence to the higher-dimensional and the multispecies cases remain essentially open; however, our results provide some hints as to searches for the models in question
Network model of security system
Directory of Open Access Journals (Sweden)
Adamczyk Piotr
2016-01-01
Full Text Available The article presents the concept of building a network security model and its application in the process of risk analysis. It indicates the possibility of a new definition of the role of network models in safety analysis. Special attention was paid to the development of an algorithm describing the process of identifying the assets, vulnerabilities and threats in a given context. The aim of the article is to present how this algorithm reduces the complexity of the problem by eliminating from the base model those components that have no links with other components; as a result, it was possible to build a network model corresponding to reality.
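The elimination step described (dropping components with no links to any other component) can be sketched in a few lines; the function and variable names here are ours, not the article's:

```python
def prune_unlinked(nodes, edges):
    """Reduce a network model by keeping only nodes that participate in at
    least one link, discarding isolated components that cannot affect the
    rest of the model."""
    linked = set()
    for a, b in edges:
        linked.add(a)
        linked.add(b)
    return [n for n in nodes if n in linked]
```

On a model with assets, vulnerabilities, and threats as nodes, this removes entries that were catalogued but have no relationship to anything else, shrinking the model that the risk analysis must traverse.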
International Symposia on Scale Modeling
Ito, Akihiko; Nakamura, Yuji; Kuwana, Kazunori
2015-01-01
This volume thoroughly covers scale modeling and serves as the definitive source of information on scale modeling as a powerful simplifying and clarifying tool used by scientists and engineers across many disciplines. The book elucidates techniques used when it would be too expensive, or too difficult, to test a system of interest in the field. Topics addressed in the current edition include scale modeling to study weather systems, diffusion of pollution in air or water, chemical process in 3-D turbulent flow, multiphase combustion, flame propagation, biological systems, behavior of materials at nano- and micro-scales, and many more. This is an ideal book for students, both graduate and undergraduate, as well as engineers and scientists interested in the latest developments in scale modeling. This book also: Enables readers to evaluate essential and salient aspects of profoundly complex systems, mechanisms, and phenomena at scale Offers engineers and designers a new point of view, liberating creative and inno...
DEFF Research Database (Denmark)
Madsen, Henrik; Zhou, Jianjun; Hansen, Lars Henrik
1997-01-01
This paper describes a case study of identifying the physical model (or the grey box model) of a hydraulic test robot. The obtained model is intended to provide a basis for model-based control of the robot. The physical model is formulated in continuous time and is derived by application...
Automated data model evaluation
International Nuclear Information System (INIS)
Kazi, Zoltan; Kazi, Ljubica; Radulovic, Biljana
2012-01-01
The modeling process is an essential phase within information systems development and implementation. This paper presents methods and techniques for the analysis and evaluation of data model correctness. Recent methodologies and development results regarding automation of the process of model correctness analysis and its relations with ontology tools are presented. Key words: Database modeling, Data model correctness, Evaluation
DEFF Research Database (Denmark)
Hansen, Mads Fogtmann; Fagertun, Jens; Larsen, Rasmus
2011-01-01
This paper presents a fusion of the active appearance model (AAM) and the Riemannian elasticity framework which yields a non-linear shape model and a linear texture model – the active elastic appearance model (EAM). The non-linear elasticity shape model is more flexible than the usual linear subs...
Haiganoush Preisler; Alan Ager
2013-01-01
For applied mathematicians forest fire models refer mainly to a non-linear dynamic system often used to simulate spread of fire. For forest managers forest fire models may pertain to any of the three phases of fire management: prefire planning (fire risk models), fire suppression (fire behavior models), and postfire evaluation (fire effects and economic models). In...
Willden, Jeff
2001-01-01
"Bohr's Atomic Model" is a small interactive multimedia program that introduces the viewer to a simplified model of the atom. This interactive simulation lets students build an atom using an atomic construction set. The underlying design methodology for "Bohr's Atomic Model" is model-centered instruction, which means the central model of the…
DEFF Research Database (Denmark)
Ivang, Reimer; Hinson, Robert; Somasundaram, Ramanathan
2006-01-01
Purpose: Seeks to argue that there are problems associated with e-market definitive efforts and consequently proposes a new e-market model. Design/methodology/approach: Paper based largely on a literature survey and an assessment of the existing e-market conceptualizations. Findings: Based on the literature survey and identification of gaps in the present e-market definitive models, the authors postulate a preliminary e-market reference model. Originality/value: Through synthesizing the e-market literature, and by taking into account contemporary e-market developments, key dimensions that define...
A Method for Model Checking Feature Interactions
DEFF Research Database (Denmark)
Pedersen, Thomas; Le Guilly, Thibaut; Ravn, Anders Peter
2015-01-01
This paper presents a method to check for feature interactions in a system assembled from independently developed concurrent processes as found in many reactive systems. The method combines and refines existing definitions and adds a set of activities. The activities describe how to populate the definitions with models to ensure that all interactions are captured. The method is illustrated on a home automation example with model checking as analysis tool. In particular, the modelling formalism is timed automata and the analysis uses UPPAAL to find interactions.
From Numeric Models to Granular System Modeling
Directory of Open Access Journals (Sweden)
Witold Pedrycz
2015-03-01
To make this study self-contained, we briefly recall the key concepts of granular computing and demonstrate how this conceptual framework and its algorithmic fundamentals give rise to granular models. We discuss several representative formal setups used in describing and processing information granules including fuzzy sets, rough sets, and interval calculus. Key architectures of models dwell upon relationships among information granules. We demonstrate how information granularity and its optimization can be regarded as an important design asset to be exploited in system modeling and giving rise to granular models. With this regard, an important category of rule-based models along with their granular enrichments is studied in detail.
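Of the granule formalisms mentioned, interval calculus is the simplest to illustrate. A minimal sketch of interval addition and multiplication, where a granule is a closed interval represented as a `(lo, hi)` pair:

```python
def interval_add(a, b):
    """[a0, a1] + [b0, b1] = [a0 + b0, a1 + b1]."""
    return (a[0] + b[0], a[1] + b[1])

def interval_mul(a, b):
    """Product interval: the min and max over the four endpoint products,
    which bounds x * y for all x in a, y in b."""
    products = [a[i] * b[j] for i in (0, 1) for j in (0, 1)]
    return (min(products), max(products))
```

Propagating such granules through a model yields granular (interval-valued) outputs in place of single numbers, which is the basic mechanism granular models build on; fuzzy sets and rough sets generalize the same idea with richer membership structure.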
Degrenne, B; Pruvost, J; Titica, M; Takache, H; Legrand, J
2011-10-01
Photosynthetic hydrogen production under light by the green microalga Chlamydomonas reinhardtii was investigated in a torus-shaped PBR in sulfur-deprived conditions. Culture conditions, represented by the dry biomass concentration of the inoculum, sulfate concentration, and incident photon flux density (PFD), were optimized based on a previously published model (Fouchard et al., 2009. Biotechnol Bioeng 102:232-245). This allowed a strictly autotrophic production, whereas the sulfur-deprived protocol is usually applied in photoheterotrophic conditions. Experimental results combined with additional information from kinetic simulations emphasize effects of sulfur deprivation and light attenuation in the PBR in inducing anoxia and hydrogen production. A broad range of PFD was tested (up to 500 µmol photons m(-2) s(-1) ). Maximum hydrogen productivities were 1.0 ± 0.2 mL H₂ /h/L (or 25 ± 5 mL H₂ /m(2) h) and 3.1 mL ± 0.4 H₂ /h L (or 77.5 ± 10 mL H₂ /m(2) h), at 110 and 500 µmol photons m(-2) s(-1) , respectively. These values approached a maximum specific productivity of approximately 1.9 mL ± 0.4 H₂ /h/g of biomass dry weight, clearly indicative of a limitation in cell capacity to produce hydrogen. The efficiency of the process and further optimizations are discussed. Copyright © 2011 Wiley Periodicals, Inc.
Geologic Framework Model Analysis Model Report
International Nuclear Information System (INIS)
Clayton, R.
2000-01-01
The purpose of this report is to document the Geologic Framework Model (GFM), Version 3.1 (GFM3.1) with regard to data input, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, qualification status of the model, and the differences between Version 3.1 and previous versions. The GFM represents a three-dimensional interpretation of the stratigraphy and structural features of the location of the potential Yucca Mountain radioactive waste repository. The GFM encompasses an area of 65 square miles (170 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the GFM were chosen to encompass the most widely distributed set of exploratory boreholes (the Water Table or WT series) and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The GFM was constructed from geologic map and borehole data. Additional information from measured stratigraphy sections, gravity profiles, and seismic profiles was also considered. This interim change notice (ICN) was prepared in accordance with the Technical Work Plan for the Integrated Site Model Process Model Report Revision 01 (CRWMS M and O 2000). The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The GFM is one component of the Integrated Site Model (ISM) (Figure l), which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components: (1) Geologic Framework Model (GFM); (2) Rock Properties Model (RPM); and (3) Mineralogic Model (MM). The ISM merges the detailed project stratigraphy into model stratigraphic units that are most useful for the primary downstream models and
Geologic Framework Model Analysis Model Report
Energy Technology Data Exchange (ETDEWEB)
R. Clayton
2000-12-19
The purpose of this report is to document the Geologic Framework Model (GFM), Version 3.1 (GFM3.1) with regard to data input, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, qualification status of the model, and the differences between Version 3.1 and previous versions. The GFM represents a three-dimensional interpretation of the stratigraphy and structural features of the location of the potential Yucca Mountain radioactive waste repository. The GFM encompasses an area of 65 square miles (170 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the GFM were chosen to encompass the most widely distributed set of exploratory boreholes (the Water Table or WT series) and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The GFM was constructed from geologic map and borehole data. Additional information from measured stratigraphy sections, gravity profiles, and seismic profiles was also considered. This interim change notice (ICN) was prepared in accordance with the Technical Work Plan for the Integrated Site Model Process Model Report Revision 01 (CRWMS M&O 2000). The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The GFM is one component of the Integrated Site Model (ISM) (Figure l), which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components: (1) Geologic Framework Model (GFM); (2) Rock Properties Model (RPM); and (3) Mineralogic Model (MM). The ISM merges the detailed project stratigraphy into model stratigraphic units that are most useful for the primary downstream models and the
Mathematical Modeling Using MATLAB
National Research Council Canada - National Science Library
Phillips, Donovan
1998-01-01
.... Mathematical Modeling Using MATLAB acts as a companion resource to A First Course in Mathematical Modeling, with the goal of guiding the reader to a fuller understanding of the modeling process...
CSIR Research Space (South Africa)
Osburn, L
2010-01-01
Full Text Available The construction industry has turned to energy modelling in order to assist them in reducing the amount of energy consumed by buildings. However, while the energy loads of buildings can be accurately modelled, energy models often under...
DEFF Research Database (Denmark)
Silvennoinen, Annastiina; Teräsvirta, Timo
This article contains a review of multivariate GARCH models. Most common GARCH models are presented and their properties considered. This also includes nonparametric and semiparametric models. Existing specification and misspecification tests are discussed. Finally, there is an empirical example...
Hiemstra, Djoerd; Liu, Ling; Tamer Özsu, M.
2017-01-01
In language modeling, n-gram models are probabilistic models of text that use some limited amount of history, or word dependencies, where n refers to the number of words that participate in the dependence relation.
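The dependence structure described above can be sketched with a toy bigram (n = 2) model. The corpus, the add-one smoothing choice, and all names below are illustrative assumptions, not part of the original entry:

```python
from collections import Counter

def train_bigram(tokens):
    """Count unigrams and bigrams for a bigram (n = 2) language model."""
    unigrams = Counter(tokens)
    bigrams = Counter(zip(tokens, tokens[1:]))
    return unigrams, bigrams

def bigram_prob(unigrams, bigrams, w1, w2, vocab_size):
    """P(w2 | w1) with add-one (Laplace) smoothing."""
    return (bigrams[(w1, w2)] + 1) / (unigrams[w1] + vocab_size)

tokens = "the cat sat on the mat the cat ran".split()
unigrams, bigrams = train_bigram(tokens)
V = len(unigrams)                                     # 6 distinct words
p = bigram_prob(unigrams, bigrams, "the", "cat", V)   # (2 + 1) / (3 + 6)
```

Larger n captures longer dependencies but needs exponentially more data, which is why smoothing is essential in practice.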
Souza, D', Austin
2013-01-01
Presentation given on 13 May 2013 at the meeting "Business Model Canvas Challenge Assen". The Business Model Canvas was designed by Alex Osterwalder. The model is very clearly organised and consists of nine building blocks.
Earth Data Analysis Center, University of New Mexico — The model combines three modeled fire behavior parameters (rate of spread, flame length, crown fire potential) and one modeled ecological health measure (fire regime...
DEFF Research Database (Denmark)
De Giovanni, Domenico
2010-01-01
prepayment models for mortgage backed securities, this paper builds a Rational Expectation (RE) model describing the policyholders' behavior in lapsing the contract. A market model with stochastic interest rates is considered, and the pricing is carried out through numerical approximation...
Natural climate variability in a coupled model
International Nuclear Information System (INIS)
Zebiak, S.E.; Cane, M.A.
1990-01-01
Multi-century simulations with a simplified coupled ocean-atmosphere model are described. These simulations reveal an impressive range of variability on decadal and longer time scales, in addition to the dominant interannual El Niño/Southern Oscillation signal that the model originally was designed to simulate. Based on a very large sample of century-long simulations, it is nonetheless possible to identify distinct model parameter sensitivities that are described here in terms of selected indices. Preliminary experiments motivated by general circulation model results for increasing greenhouse gases suggest a definite sensitivity to model global warming. While these results are not definitive, they strongly suggest that coupled air-sea dynamics figure prominently in global change and must be included in models for reliable predictions.
Brax, P.; Martin, J.; Riazuelo, A.
2001-01-01
A short review of some of the aspects of quintessence model building is presented. We emphasize the role of tracking models and their possible supersymmetric origin.
Computational neurogenetic modeling
Benuskova, Lubica
2010-01-01
Computational Neurogenetic Modeling is a student text, introducing the scope and problems of a new scientific discipline - Computational Neurogenetic Modeling (CNGM). CNGM is concerned with the study and development of dynamic neuronal models for modeling brain functions with respect to genes and dynamic interactions between genes. These include neural network models and their integration with gene network models. This new area brings together knowledge from various scientific disciplines, such as computer and information science, neuroscience and cognitive science, genetics and molecular biol
Eaton, Carrie Diaz; Highlander, Hannah C.; Dahlquist, Kam D.; LaMar, M. Drew; Ledder, Glenn; Schugart, Richard C.
2016-01-01
Despite widespread calls for the incorporation of mathematical modeling into the undergraduate biology curriculum, there is lack of a common understanding around the definition of modeling, which inhibits progress. In this paper, we extend the "Rule of Four," initially used in calculus reform efforts, to a framework for models and modeling that is inclusive of varying disciplinary definitions of each. This unifying framework allows us to both build on strengths that each discipline and its st...
Modelling and evaluation of surgical performance using hidden Markov models.
Megali, Giuseppe; Sinigaglia, Stefano; Tonet, Oliver; Dario, Paolo
2006-10-01
Minimally invasive surgery has become very widespread in the last ten years. Since surgeons experience difficulties in learning and mastering minimally invasive techniques, the development of training methods is of great importance. While the introduction of virtual reality-based simulators has introduced a new paradigm in surgical training, skill evaluation methods are far from being objective. This paper proposes a method for defining a model of surgical expertise and an objective metric to evaluate performance in laparoscopic surgery. Our approach is based on the processing of kinematic data describing movements of surgical instruments. We use hidden Markov model theory to define an expert model that describes expert surgical gesture. The model is trained on kinematic data related to exercises performed on a surgical simulator by experienced surgeons. Subsequently, we use this expert model as a reference model in the definition of an objective metric to evaluate performance of surgeons with different abilities. Preliminary results show that, using different topologies for the expert model, the method can be efficiently used both for the discrimination between experienced and novice surgeons, and for the quantitative assessment of surgical ability.
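As a rough illustration of the likelihood-based discrimination the authors describe, the sketch below scores a short observation sequence against two hand-built discrete HMMs and picks the better-fitting one. The transition/emission numbers and the two-symbol "smooth/jerky" alphabet are invented for this example, not taken from the paper:

```python
import math

def forward_loglik(obs, start, trans, emit):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the forward algorithm (no scaling; fine for short
    sequences)."""
    n_states = len(start)
    alpha = [start[s] * emit[s][obs[0]] for s in range(n_states)]
    for o in obs[1:]:
        alpha = [sum(alpha[s] * trans[s][t] for s in range(n_states)) * emit[t][o]
                 for t in range(n_states)]
    return math.log(sum(alpha))

# Two toy 2-state models over a 2-symbol alphabet (0 = smooth, 1 = jerky).
expert = dict(start=[0.9, 0.1],
              trans=[[0.9, 0.1], [0.5, 0.5]],
              emit=[[0.9, 0.1], [0.4, 0.6]])
novice = dict(start=[0.5, 0.5],
              trans=[[0.5, 0.5], [0.5, 0.5]],
              emit=[[0.5, 0.5], [0.5, 0.5]])

seq = [0, 0, 1, 0, 0, 0]          # mostly smooth instrument kinematics
ll_expert = forward_loglik(seq, **expert)
ll_novice = forward_loglik(seq, **novice)
label = "expert" if ll_expert > ll_novice else "novice"
```

In the paper's setting the models would instead be trained on real kinematic data (e.g. with Baum-Welch) rather than written by hand.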
Overuse Injury Assessment Model
National Research Council Canada - National Science Library
Stuhmiller, James H; Shen, Weixin; Sih, Bryant
2005-01-01
.... Previously, we developed a preliminary model that predicted the stress fracture rate and used biomechanical modeling, nonlinear optimization for muscle force, and bone structural analysis to estimate...
Finch, W Holmes; Kelley, Ken
2014-01-01
A powerful tool for analyzing nested designs in a variety of fields, multilevel/hierarchical modeling allows researchers to account for data collected at multiple levels. Multilevel Modeling Using R provides you with a helpful guide to conducting multilevel data modeling using the R software environment.After reviewing standard linear models, the authors present the basics of multilevel models and explain how to fit these models using R. They then show how to employ multilevel modeling with longitudinal data and demonstrate the valuable graphical options in R. The book also describes models fo
Cosmological models without singularities
International Nuclear Information System (INIS)
Petry, W.
1981-01-01
A previously studied theory of gravitation in flat space-time is applied to homogeneous and isotropic cosmological models. There exist two different classes of models without singularities: (i) ever-expanding models, (ii) oscillating models. The first class contains models with a hot big bang. For these models there exist at the beginning of the universe, in contrast to Einstein's theory, very high but finite densities of matter and radiation, with a big bang of very short duration. After a short time these models pass into the homogeneous and isotropic models of Einstein's theory with spatial curvature equal to zero and cosmological constant Λ ≥ 0. (author)
DEFF Research Database (Denmark)
Knudsen, Torben
2011-01-01
model structure suggested by the University of Lund, the WP4 leader. This particular model structure has the advantage that it fits better into the control design framework used by WP3-4 compared to the model structures previously developed in WP2. The different model structures are first summarised. ... Then issues dealing with optimal experimental design are considered. Finally the parameters are estimated in the chosen static and dynamic models and a validation is performed. Two of the static models, one of them the additive model, explain the data well. In case of dynamic models the suggested additive...
Federal Laboratory Consortium — The Environmental Modeling Center provides the computational tools to perform geostatistical analysis, to model ground water and atmospheric releases for comparison...
National Aeronautics and Space Administration — CLAIRE MONTELEONI*, GAVIN SCHMIDT, AND SHAILESH SAROHA* Climate models are complex mathematical models designed by meteorologists, geophysicists, and climate...
Towards a Tool-Supported Quality Model for Model-Driven Engineering
Mohagheghi, Parastoo
2008-01-01
This paper reviews definitions of model quality before introducing five properties of models that are important for building high-quality models. These are identified to be correctness, completeness, consistency, comprehensibility and confinement. We have earlier defined a quality model that separates intangible quality goals from tangible quality-carrying properties and practices that should be in place to support these properties. A part of that work was to define a metamodel for deve...
Regularized Structural Equation Modeling
Jacobucci, Ross; Grimm, Kevin J.; McArdle, John J.
2016-01-01
A new method is proposed that extends the use of regularization in both lasso and ridge regression to structural equation models. The method is termed regularized structural equation modeling (RegSEM). RegSEM penalizes specific parameters in structural equation models, with the goal of creating easier to understand and simpler models. Although regularization has gained wide adoption in regression, very little has transferred to models with latent variables. By adding penalties to specific parameters in a structural equation model, researchers have a high level of flexibility in reducing model complexity, overcoming poor fitting models, and the creation of models that are more likely to generalize to new samples. The proposed method was evaluated through a simulation study, two illustrative examples involving a measurement model, and one empirical example involving the structural part of the model to demonstrate RegSEM’s utility. PMID:27398019
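RegSEM itself penalizes parameters inside a structural equation model, but the penalty mechanics can be illustrated on plain regression: the sketch below applies the same lasso (L1) penalty via proximal gradient descent (ISTA), zeroing out weak coefficients. All data and parameter values are synthetic assumptions for illustration:

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of the L1 norm."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_ista(X, y, lam, steps=5000):
    """Minimize 0.5*||y - X b||^2 + lam*||b||_1 by proximal gradient (ISTA)."""
    L = np.linalg.norm(X, 2) ** 2      # Lipschitz constant of the smooth part
    b = np.zeros(X.shape[1])
    for _ in range(steps):
        grad = X.T @ (X @ b - y)
        b = soft_threshold(b - grad / L, lam / L)
    return b

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
true_b = np.array([2.0, 0.0, 0.0, -1.5, 0.0])   # sparse "loadings"
y = X @ true_b + 0.1 * rng.normal(size=200)
b_hat = lasso_ista(X, y, lam=5.0)               # zeros out the weak paths
```

The same soft-thresholding idea is what lets RegSEM drop unneeded paths or loadings from an over-parameterized model.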
Integrated Site Model Process Model Report
International Nuclear Information System (INIS)
Booth, T.
2000-01-01
The Integrated Site Model (ISM) provides a framework for discussing the geologic features and properties of Yucca Mountain, which is being evaluated as a potential site for a geologic repository for the disposal of nuclear waste. The ISM is important to the evaluation of the site because it provides 3-D portrayals of site geologic, rock property, and mineralogic characteristics and their spatial variabilities. The ISM is not a single discrete model; rather, it is a set of static representations that provide three-dimensional (3-D), computer representations of site geology, selected hydrologic and rock properties, and mineralogic-characteristics data. These representations are manifested in three separate model components of the ISM: the Geologic Framework Model (GFM), the Rock Properties Model (RPM), and the Mineralogic Model (MM). The GFM provides a representation of the 3-D stratigraphy and geologic structure. Based on the framework provided by the GFM, the RPM and MM provide spatial simulations of the rock and hydrologic properties, and mineralogy, respectively. Functional summaries of the component models and their respective output are provided in Section 1.4. Each of the component models of the ISM considers different specific aspects of the site geologic setting. Each model was developed using unique methodologies and inputs, and the determination of the modeled units for each of the components is dependent on the requirements of that component. Therefore, while the ISM represents the integration of the rock properties and mineralogy into a geologic framework, the discussion of ISM construction and results is most appropriately presented in terms of the three separate components. This Process Model Report (PMR) summarizes the individual component models of the ISM (the GFM, RPM, and MM) and describes how the three components are constructed and combined to form the ISM
Traceability in Model-Based Testing
Directory of Open Access Journals (Sweden)
Mathew George
2012-11-01
Full Text Available The growing complexities of software and the demand for shorter time to market are two important challenges that face today’s IT industry. These challenges demand the increase of both productivity and quality of software. Model-based testing is a promising technique for meeting these challenges. Traceability modeling is a key issue and challenge in model-based testing. Relationships between the different models will help to navigate from one model to another, and trace back to the respective requirements and the design model when the test fails. In this paper, we present an approach for bridging the gaps between the different models in model-based testing. We propose relation definition markup language (RDML for defining the relationships between models.
Minimum-complexity helicopter simulation math model
Heffley, Robert K.; Mnich, Marc A.
1988-01-01
An example of a minimal complexity simulation helicopter math model is presented. Motivating factors are the computational delays, cost, and inflexibility of the very sophisticated math models now in common use. A helicopter model form is given which addresses each of these factors and provides better engineering understanding of the specific handling qualities features which are apparent to the simulator pilot. The technical approach begins with specification of features which are to be modeled, followed by a build up of individual vehicle components and definition of equations. Model matching and estimation procedures are given which enable the modeling of specific helicopters from basic data sources such as flight manuals. Checkout procedures are given which provide for total model validation. A number of possible model extensions and refinements are discussed. Math model computer programs are defined and listed.
4K Video Traffic Prediction using Seasonal Autoregressive Modeling
Directory of Open Access Journals (Sweden)
D. R. Marković
2017-06-01
Full Text Available From the perspective of the average viewer, high definition video streams such as HD (High Definition) and UHD (Ultra HD) are increasing their internet presence year over year. This is not surprising, given the expansion of HD streaming services such as YouTube, Netflix etc. Therefore, high definition video streams are starting to challenge network resource allocation with their bandwidth requirements and statistical characteristics. The need for analysis and modeling of this demanding video traffic is of essential importance for better quality of service and experience support. In this paper we use an easy-to-apply statistical model for prediction of 4K video traffic. Namely, seasonal autoregressive modeling is applied in prediction of 4K video traffic encoded with HEVC (High Efficiency Video Coding). Analysis and modeling were performed within the R programming environment using over 17,000 high definition video frames. It is shown that the proposed methodology provides good accuracy in high definition video traffic modeling.
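A minimal version of the seasonal autoregressive idea, fitted by ordinary least squares on a synthetic series. The period, coefficients, and noise level below are assumptions for illustration; the paper itself works with HEVC-encoded 4K traces in R:

```python
import numpy as np

def fit_seasonal_ar(x, season):
    """Least-squares fit of x[t] = a*x[t-1] + b*x[t-season] + c."""
    t = np.arange(season, len(x))
    A = np.column_stack([x[t - 1], x[t - season], np.ones(len(t))])
    coef, *_ = np.linalg.lstsq(A, x[t], rcond=None)
    return coef                                   # (a, b, c)

def predict_next(x, coef, season):
    """One-step-ahead forecast using the last and last-seasonal values."""
    a, b, c = coef
    return a * x[-1] + b * x[-season] + c

# Synthetic "frame size" series with period-12 seasonality plus noise.
rng = np.random.default_rng(1)
n, s = 600, 12
x = 10 + 3 * np.sin(2 * np.pi * np.arange(n) / s) + 0.2 * rng.normal(size=n)
coef = fit_seasonal_ar(x, s)
x_hat = predict_next(x, coef, s)   # forecast near the seasonal mean of 10
```

A production model would also difference the series and select orders from autocorrelation diagnostics, as full SARIMA fitting does.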
Better models are more effectively connected models
Nunes, João Pedro; Bielders, Charles; Darboux, Frederic; Fiener, Peter; Finger, David; Turnbull-Lloyd, Laura; Wainwright, John
2016-04-01
The concept of hydrologic and geomorphologic connectivity describes the processes and pathways which link sources (e.g. rainfall, snow and ice melt, springs, eroded areas and barren lands) to accumulation areas (e.g. foot slopes, streams, aquifers, reservoirs), and the spatial variations thereof. There are many examples of hydrological and sediment connectivity on a watershed scale; in consequence, a process-based understanding of connectivity is crucial to help managers understand their systems and adopt adequate measures for flood prevention, pollution mitigation and soil protection, among others. Modelling is often used as a tool to understand and predict fluxes within a catchment by complementing observations with model results. Catchment models should therefore be able to reproduce the linkages, and thus the connectivity of water and sediment fluxes within the systems under simulation. In modelling, a high level of spatial and temporal detail is desirable to ensure taking into account a maximum number of components, which then enables connectivity to emerge from the simulated structures and functions. However, computational constraints and, in many cases, lack of data prevent the representation of all relevant processes and spatial/temporal variability in most models. In most cases, therefore, the level of detail selected for modelling is too coarse to represent the system in a way in which connectivity can emerge; a problem which can be circumvented by representing fine-scale structures and processes within coarser scale models using a variety of approaches. This poster focuses on the results of ongoing discussions on modelling connectivity held during several workshops within COST Action Connecteur. It assesses the current state of the art of incorporating the concept of connectivity in hydrological and sediment models, as well as the attitudes of modellers towards this issue. The discussion will focus on the different approaches through which connectivity
Modelling bankruptcy prediction models in Slovak companies
Directory of Open Access Journals (Sweden)
Kovacova Maria
2017-01-01
Full Text Available Intensive research from academics and practitioners has been devoted to models for bankruptcy prediction and credit risk management. In spite of numerous studies forecasting bankruptcy using traditional statistical techniques (e.g. discriminant analysis and logistic regression) and early artificial intelligence models (e.g. artificial neural networks), there is a trend of transition to machine learning models (support vector machines, bagging, boosting, and random forests) to predict bankruptcy one year prior to the event. Comparing the performance of this unconventional approach with results obtained by discriminant analysis, logistic regression, and neural networks, it has been found that bagging, boosting, and random forest models outperform the other techniques, and that prediction accuracy in the testing sample improves when additional variables are included. On the other hand, the prediction accuracy of old and well-known bankruptcy prediction models is quite high. Therefore, we aim to analyse these older models on a dataset of Slovak companies to validate their prediction ability in specific conditions. Furthermore, these models will be modelled according to new trends by calculating the influence of the elimination of selected variables on their overall prediction ability.
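As a hedged sketch of the "traditional statistics" baseline the abstract mentions, the example below fits a plain logistic regression by gradient descent on synthetic data. The single "debt/assets"-style predictor and all numbers are invented for illustration, not drawn from the Slovak dataset:

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Plain gradient-descent logistic regression (bias folded into X)."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w -= lr * X.T @ (p - y) / len(y)
    return w

# Toy data: one ratio (say, debt/assets); higher values -> more likely bankrupt.
rng = np.random.default_rng(2)
n = 400
ratio = rng.normal(size=n)
y = (ratio + 0.5 * rng.normal(size=n) > 0).astype(float)
X = np.column_stack([np.ones(n), ratio])
w = fit_logistic(X, y)
pred = (X @ w > 0).astype(float)   # decision boundary at p = 0.5
accuracy = (pred == y).mean()
```

The ensemble methods the paper favours (bagging, boosting, random forests) replace this single linear boundary with many nonlinear ones, which is where their accuracy gains come from.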
Model Validation in Ontology Based Transformations
Directory of Open Access Journals (Sweden)
Jesús M. Almendros-Jiménez
2012-10-01
Full Text Available Model Driven Engineering (MDE) is an emerging approach to software engineering. MDE emphasizes the construction of models from which the implementation should be derived by applying model transformations. The Ontology Definition Meta-model (ODM) has been proposed as a profile for UML models of the Web Ontology Language (OWL). In this context, transformations of UML models can be mapped into ODM/OWL transformations. On the other hand, model validation is a crucial task in model transformation. Meta-modeling permits giving a syntactic structure to source and target models. However, semantic requirements also have to be imposed on source and target models. A given transformation is sound when source and target models fulfill the syntactic and semantic requirements. In this paper, we present an approach for model validation in ODM-based transformations. Adopting a logic-programming-based transformational approach, we show how it is possible to transform and validate models. Properties to be validated range from structural and semantic requirements of models (pre- and post-conditions) to properties of the transformation (invariants). The approach has been applied to a well-known example of model transformation: the Entity-Relationship (ER) to Relational Model (RM) transformation.
Generalized latent variable modeling multilevel, longitudinal, and structural equation models
Skrondal, Anders; Rabe-Hesketh, Sophia
2004-01-01
This book unifies and extends latent variable models, including multilevel or generalized linear mixed models, longitudinal or panel models, item response or factor models, latent class or finite mixture models, and structural equation models.
International Nuclear Information System (INIS)
M. A. Wasiolek
2003-01-01
The purpose of this report is to document the biosphere model, the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), which describes radionuclide transport processes in the biosphere and associated human exposure that may arise as the result of radionuclide release from the geologic repository at Yucca Mountain. The biosphere model is one of the process models that support the Yucca Mountain Project (YMP) Total System Performance Assessment (TSPA) for the license application (LA), the TSPA-LA. The ERMYN model provides the capability of performing human radiation dose assessments. This report documents the biosphere model, which includes: (1) Describing the reference biosphere, human receptor, exposure scenarios, and primary radionuclides for each exposure scenario (Section 6.1); (2) Developing a biosphere conceptual model using site-specific features, events, and processes (FEPs), the reference biosphere, the human receptor, and assumptions (Section 6.2 and Section 6.3); (3) Building a mathematical model using the biosphere conceptual model and published biosphere models (Sections 6.4 and 6.5); (4) Summarizing input parameters for the mathematical model, including the uncertainty associated with input values (Section 6.6); (5) Identifying improvements in the ERMYN model compared with the model used in previous biosphere modeling (Section 6.7); (6) Constructing an ERMYN implementation tool (model) based on the biosphere mathematical model using GoldSim stochastic simulation software (Sections 6.8 and 6.9); (7) Verifying the ERMYN model by comparing output from the software with hand calculations to ensure that the GoldSim implementation is correct (Section 6.10); and (8) Validating the ERMYN model by corroborating it with published biosphere models; comparing conceptual models, mathematical models, and numerical results (Section 7)
Energy Technology Data Exchange (ETDEWEB)
D. W. Wu
2003-07-16
The purpose of this report is to document the biosphere model, the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), which describes radionuclide transport processes in the biosphere and associated human exposure that may arise as the result of radionuclide release from the geologic repository at Yucca Mountain. The biosphere model is one of the process models that support the Yucca Mountain Project (YMP) Total System Performance Assessment (TSPA) for the license application (LA), the TSPA-LA. The ERMYN model provides the capability of performing human radiation dose assessments. This report documents the biosphere model, which includes: (1) Describing the reference biosphere, human receptor, exposure scenarios, and primary radionuclides for each exposure scenario (Section 6.1); (2) Developing a biosphere conceptual model using site-specific features, events, and processes (FEPs), the reference biosphere, the human receptor, and assumptions (Section 6.2 and Section 6.3); (3) Building a mathematical model using the biosphere conceptual model and published biosphere models (Sections 6.4 and 6.5); (4) Summarizing input parameters for the mathematical model, including the uncertainty associated with input values (Section 6.6); (5) Identifying improvements in the ERMYN model compared with the model used in previous biosphere modeling (Section 6.7); (6) Constructing an ERMYN implementation tool (model) based on the biosphere mathematical model using GoldSim stochastic simulation software (Sections 6.8 and 6.9); (7) Verifying the ERMYN model by comparing output from the software with hand calculations to ensure that the GoldSim implementation is correct (Section 6.10); and (8) Validating the ERMYN model by corroborating it with published biosphere models; comparing conceptual models, mathematical models, and numerical results (Section 7).
Energy Technology Data Exchange (ETDEWEB)
M. A. Wasiolek
2003-10-27
The purpose of this report is to document the biosphere model, the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), which describes radionuclide transport processes in the biosphere and associated human exposure that may arise as the result of radionuclide release from the geologic repository at Yucca Mountain. The biosphere model is one of the process models that support the Yucca Mountain Project (YMP) Total System Performance Assessment (TSPA) for the license application (LA), the TSPA-LA. The ERMYN model provides the capability of performing human radiation dose assessments. This report documents the biosphere model, which includes: (1) Describing the reference biosphere, human receptor, exposure scenarios, and primary radionuclides for each exposure scenario (Section 6.1); (2) Developing a biosphere conceptual model using site-specific features, events, and processes (FEPs), the reference biosphere, the human receptor, and assumptions (Section 6.2 and Section 6.3); (3) Building a mathematical model using the biosphere conceptual model and published biosphere models (Sections 6.4 and 6.5); (4) Summarizing input parameters for the mathematical model, including the uncertainty associated with input values (Section 6.6); (5) Identifying improvements in the ERMYN model compared with the model used in previous biosphere modeling (Section 6.7); (6) Constructing an ERMYN implementation tool (model) based on the biosphere mathematical model using GoldSim stochastic simulation software (Sections 6.8 and 6.9); (7) Verifying the ERMYN model by comparing output from the software with hand calculations to ensure that the GoldSim implementation is correct (Section 6.10); and (8) Validating the ERMYN model by corroborating it with published biosphere models; comparing conceptual models, mathematical models, and numerical results (Section 7).
Lumped Thermal Household Model
DEFF Research Database (Denmark)
Biegel, Benjamin; Andersen, Palle; Stoustrup, Jakob
2013-01-01
a lumped model approach as an alternative to the individual models. In the lumped model, the portfolio is seen as baseline consumption superimposed with an ideal storage of limited power and energy capacity. The benefit of such a lumped model is that the computational effort of flexibility optimization...
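The "baseline consumption plus ideal storage of limited power and energy capacity" idea can be sketched as a simple dispatch rule. The profiles and limits below are illustrative assumptions, not the paper's optimization:

```python
def dispatch(baseline, request, p_max, e_max, dt=1.0):
    """Track a requested consumption profile using baseline consumption
    plus an ideal storage of limited power (p_max) and energy (e_max)."""
    soc = e_max / 2.0                 # state of charge, start half full
    actual = []
    for b, r in zip(baseline, request):
        u = max(-p_max, min(p_max, r - b))              # power limit
        u = max(-soc / dt, min((e_max - soc) / dt, u))  # energy limit
        soc += u * dt
        actual.append(b + u)
    return actual

baseline = [5, 5, 5, 5, 5, 5]
request  = [7, 7, 7, 3, 3, 3]         # shift consumption earlier in time
actual = dispatch(baseline, request, p_max=1.5, e_max=3.0)
# actual follows the request only while the storage limits allow it
```

The computational benefit the abstract points to is visible here: the portfolio is reduced to two scalars (power and energy capacity) instead of one model per household.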
DEFF Research Database (Denmark)
Larsen, Bjarke Alexander; Andkjær, Kasper Ingdahl; Schoenau-Fog, Henrik
2015-01-01
This paper proposes a new relation model, called "The Moody Mask model", for Interactive Digital Storytelling (IDS), based on Francesco Osborne's "Mask Model" from 2011. This, mixed with some elements from Chris Crawford's Personality Models, is a system designed for dynamic interaction between ch...
DEFF Research Database (Denmark)
Hansen, Peter Reinhard; Lunde, Asger; Nason, James M.
The paper introduces the model confidence set (MCS) and applies it to the selection of models. A MCS is a set of models that is constructed such that it will contain the best model with a given level of confidence. The MCS is in this sense analogous to a confidence interval for a parameter. The MCS...
Rahmani, Fouad Lazhar
2010-11-01
The aim of this paper is to present mathematical modelling of the spread of infection in the context of the transmission of the human immunodeficiency virus (HIV) and the acquired immune deficiency syndrome (AIDS). These models are based in part on the models suggested in the field of AIDS mathematical modelling as reported by ISHAM [6].
Numerical Modelling of Streams
DEFF Research Database (Denmark)
Vestergaard, Kristian
In recent years there has been a sharp increase in the use of numerical water quality models. Numerical water quality modelling can be divided into three steps: Hydrodynamic modelling for the determination of stream flow and water levels. Modelling of transport and dispersion of a conservative
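The transport-and-dispersion step for a conservative tracer can be sketched with an explicit upwind/central finite-difference scheme. The grid, velocity, and dispersion coefficient below are arbitrary assumptions, and periodic boundaries are used purely for brevity:

```python
import numpy as np

def advect_disperse(c, u, D, dx, dt, steps):
    """Explicit upwind advection plus central-difference dispersion for a
    conservative tracer on a 1-D reach (periodic ends for brevity).
    Stable for dt*(u/dx + 2*D/dx**2) <= 1 with u > 0."""
    c = c.astype(float).copy()
    for _ in range(steps):
        adv = -u * (c - np.roll(c, 1)) / dx
        disp = D * (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx ** 2
        c += dt * (adv + disp)
    return c

nx = 100
c0 = np.zeros(nx)
c0[10] = 1.0                          # unit tracer pulse released upstream
c1 = advect_disperse(c0, u=0.5, D=0.01, dx=1.0, dt=1.0, steps=40)
# mass is conserved and the pulse drifts ~u*t = 20 cells downstream
```

In a real stream model, u and the water levels would come from the hydrodynamic step, and upwind differencing would typically be replaced by a less diffusive scheme.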
DEFF Research Database (Denmark)
Ayres, Phil
2012-01-01
This essay discusses models. It examines what models are, the roles models perform and suggests various intentions that underlie their construction and use. It discusses how models act as a conversational partner, and how they support various forms of conversation within the conversational activity...
R. Pietersz (Raoul); M. van Regenmortel
2005-01-01
Currently, there are two market models for valuation and risk management of interest rate derivatives, the LIBOR and swap market models. In this paper, we introduce arbitrage-free constant maturity swap (CMS) market models and generic market models featuring forward rates that span
Modeling the Accidental Deaths
Directory of Open Access Journals (Sweden)
Mariyam Hafeez
2008-01-01
Full Text Available The model for accidental deaths in the city of Lahore has been developed by using a class of Generalized Linear Models. Various link functions have been used in developing the model. The diagnostic checks have been carried out to see the validity of the fitted model.
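A minimal example of the GLM-with-link-function approach: a Poisson GLM with a log link fitted by iteratively reweighted least squares (IRLS) on synthetic counts. The covariate and coefficients are assumptions for illustration, not the Lahore accident data:

```python
import numpy as np

def fit_poisson_glm(X, y, steps=25):
    """Poisson GLM with log link fitted by iteratively reweighted least
    squares (IRLS); X must include an intercept column."""
    beta = np.zeros(X.shape[1])
    for _ in range(steps):
        eta = X @ beta
        mu = np.exp(eta)
        z = eta + (y - mu) / mu                     # working response
        W = mu                                      # working weights
        beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))
    return beta

# Synthetic counts: log E[deaths] = 0.5 + 1.0 * x
rng = np.random.default_rng(3)
n = 500
x = rng.uniform(0, 1, size=n)
y = rng.poisson(np.exp(0.5 + 1.0 * x))
X = np.column_stack([np.ones(n), x])
beta = fit_poisson_glm(X, y)          # recovers roughly (0.5, 1.0)
```

Swapping the link function (identity, log, square root) changes only the `mu`, `z`, and `W` lines, which is the flexibility the paper exploits when comparing links.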
Cultural Resource Predictive Modeling
2017-10-01
refining formal, inductive predictive models is the quality of the archaeological and environmental data. To build models efficiently, relevant...geomorphology, and historic information. Lessons Learned: The original model was focused on the identification of prehistoric resources. This...system but uses predictive modeling informally. For example, there is no probability for buried archaeological deposits on the Burton Mesa, but there is
Modelling Railway Interlocking Systems
DEFF Research Database (Denmark)
Lindegaard, Morten Peter; Viuf, P.; Haxthausen, Anne Elisabeth
2000-01-01
In this report we present a model of interlocking systems, and describe how the model may be validated by simulation. Station topologies are modelled by graphs in which the nodes denote track segments, and the edges denote connectivity for train traffic. Points and signals are modelled by annotatio...
Energy Technology Data Exchange (ETDEWEB)
Ibsen, Lars Bo; Liingaard, M.
2006-12-15
A lumped-parameter model represents the frequency dependent soil-structure interaction of a massless foundation placed on or embedded into an unbounded soil domain. In this technical report the steps of establishing a lumped-parameter model are presented. The following sections are included in this report: Static and dynamic formulation, Simple lumped-parameter models and Advanced lumped-parameter models. (au)
Comparing Active Vision Models
Croon, G.C.H.E. de; Sprinkhuizen-Kuyper, I.G.; Postma, E.O.
2009-01-01
Active vision models can simplify visual tasks, provided that they can select sensible actions given incoming sensory inputs. Many active vision models have been proposed, but a comparative evaluation of these models is lacking. We present a comparison of active vision models from two different
Van Bloemendaal, Karen; Dijkema, Gerard P.J.; Woerdman, Edwin; Jong, Mattheus
2015-01-01
This White Paper provides an overview of the modelling approaches adopted by the project partners in the EDGaR project 'Understanding Gas Sector Intra- and Inter- Market interactions' (UGSIIMI). The paper addresses three types of models: complementarity modelling, agent-based modelling and property
Business process modeling in healthcare.
Ruiz, Francisco; Garcia, Felix; Calahorra, Luis; Llorente, César; Gonçalves, Luis; Daniel, Christel; Blobel, Bernd
2012-01-01
The importance of the process point of view is not restricted to a specific enterprise sector. In the field of health, as a result of the nature of the service offered, health institutions' processes are also the basis for decision making focused on achieving their objective of providing quality medical assistance. In this chapter the application of business process modelling, using the Business Process Modelling Notation (BPMN) standard, is described. The main challenges of business process modelling in healthcare are the definition of healthcare processes, the multi-disciplinary nature of healthcare, the flexibility and variability of the activities involved in healthcare processes, the need for interoperability between multiple information systems, and the continuous updating of scientific knowledge in healthcare.
Target-Centric Network Modeling
DEFF Research Database (Denmark)
Mitchell, Dr. William L.; Clark, Dr. Robert M.
In Target-Centric Network Modeling: Case Studies in Analyzing Complex Intelligence Issues, authors Robert Clark and William Mitchell take an entirely new approach to teaching intelligence analysis. Unlike any other book on the market, it offers case study scenarios using actual intelligence...... reporting formats, along with a tested process that facilitates the production of a wide range of analytical products for civilian, military, and hybrid intelligence environments. Readers will learn how to perform the specific actions of problem definition modeling, target network modeling......, and collaborative sharing in the process of creating a high-quality, actionable intelligence product. The case studies reflect the complexity of twenty-first century intelligence issues by dealing with multi-layered target networks that cut across political, economic, social, technological, and military issues...
Mathematical modelling in solid mechanics
Sofonea, Mircea; Steigmann, David
2017-01-01
This book presents new research results in multidisciplinary fields of mathematical and numerical modelling in mechanics. The chapters treat the topics: mathematical modelling in solid, fluid and contact mechanics; nonconvex variational analysis with emphasis on nonlinear solid and structural mechanics; numerical modelling of problems with non-smooth constitutive laws, approximation of variational and hemivariational inequalities, numerical analysis of discrete schemes, numerical methods and the corresponding algorithms, applications to mechanical engineering; numerical aspects of non-smooth mechanics, with emphasis on developing accurate and reliable computational tools; mechanics of fibre-reinforced materials; behaviour of elasto-plastic materials accounting for the microstructural defects; definition of structural defects based on the differential geometry concepts or on the atomistic basis; interaction between phase transformation and dislocations at nano-scale; energetic arguments; bifurcation and post-buckling a...
DEFF Research Database (Denmark)
Ayres, Phil
2012-01-01
This essay discusses models. It examines what models are, the roles models perform and suggests various intentions that underlie their construction and use. It discusses how models act as a conversational partner, and how they support various forms of conversation within the conversational activity...... of design. Three distinctions are drawn through which to develop this discussion of models in an architectural context. An examination of these distinctions serves to nuance particular characteristics and roles of models, the modelling activity itself and those engaged in it....
DEFF Research Database (Denmark)
Gernaey, Krist; Sin, Gürkan
2011-01-01
The state-of-the-art level reached in modeling wastewater treatment plants (WWTPs) is reported. For suspended growth systems, WWTP models have evolved from simple description of biological removal of organic carbon and nitrogen in aeration tanks (ASM1 in 1987) to more advanced levels including...... of WWTP modeling by linking the wastewater treatment line with the sludge handling line in one modeling platform. Application of WWTP models is currently rather time consuming and thus expensive due to the high model complexity, and requires a great deal of process knowledge and modeling expertise...
DEFF Research Database (Denmark)
Gernaey, Krist; Sin, Gürkan
2008-01-01
The state-of-the-art level reached in modeling wastewater treatment plants (WWTPs) is reported. For suspended growth systems, WWTP models have evolved from simple description of biological removal of organic carbon and nitrogen in aeration tanks (ASM1 in 1987) to more advanced levels including...... the practice of WWTP modeling by linking the wastewater treatment line with the sludge handling line in one modeling platform. Application of WWTP models is currently rather time consuming and thus expensive due to the high model complexity, and requires a great deal of process knowledge and modeling expertise...
Morgan, Byron JT; Tanner, Martin Abba; Carlin, Bradley P
2008-01-01
Introduction and Examples: Introduction; Examples of data sets. Basic Model Fitting: Introduction; Maximum-likelihood estimation for a geometric model; Maximum-likelihood for the beta-geometric model; Modelling polyspermy; Which model?; What is a model for?; Mechanistic models. Function Optimisation: Introduction; MATLAB: graphs and finite differences; Deterministic search methods; Stochastic search methods; Accuracy and a hybrid approach. Basic Likelihood Tools: Introduction; Estimating standard errors and correlations; Looking at surfaces: profile log-likelihoods; Confidence regions from profiles; Hypothesis testing in model selection; Score and Wald tests; Classical goodness of fit; Model selection bias. General Principles: Introduction; Parameterisation; Parameter redundancy; Boundary estimates; Regression and influence; The EM algorithm; Alternative methods of model fitting; Non-regular problems. Simulation Techniques: Introduction; Simulating random variables; Integral estimation; Verification; Monte Carlo inference; Estimating sampling distributi...
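The contents above list maximum-likelihood estimation for a geometric model among the book's opening examples. A minimal sketch (invented data, support {1, 2, ...}): the closed-form MLE is the reciprocal of the sample mean, which a grid search over the log-likelihood confirms.

```python
import numpy as np

def geometric_loglik(p, data):
    """Log-likelihood of a geometric model on {1, 2, ...}: P(k) = p (1-p)^(k-1)."""
    data = np.asarray(data)
    return len(data) * np.log(p) + (data - 1).sum() * np.log(1.0 - p)

data = [1, 2, 2, 3, 1, 4, 2, 1]          # hypothetical counts, sample mean = 2
p_closed = 1.0 / np.mean(data)            # closed-form MLE: 1 / sample mean
grid = np.linspace(0.01, 0.99, 981)
p_grid = grid[np.argmax([geometric_loglik(p, data) for p in grid])]
```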
DEFF Research Database (Denmark)
Justesen, Lise; Overgaard, Svend Skafte
2017-01-01
This article presents an analytical model that aims to conceptualize how meal experiences are framed when taking into account a dynamic understanding of hospitality: the meal model is named The Hospitable Meal Model. The idea behind The Hospitable Meal Model is to present a conceptual model...... that can serve as a frame for developing hospitable meal competencies among professionals working within the area of institutional foodservices as well as a conceptual model for analysing meal experiences. The Hospitable Meal Model transcends and transforms existing meal models by presenting a more open......-ended approach towards meal experiences. The underlying purpose of The Hospitable Meal Model is to provide the basis for creating value for the individuals involved in institutional meal services. The Hospitable Meal Model was developed on the basis of an empirical study on hospital meal experiences explored......
Energy Technology Data Exchange (ETDEWEB)
C. Ahlers; H. Liu
2000-03-12
The purpose of this Analysis/Model Report (AMR) is to document the Calibrated Properties Model that provides calibrated parameter sets for unsaturated zone (UZ) flow and transport process models for the Yucca Mountain Site Characterization Project (YMP). This work was performed in accordance with the ''AMR Development Plan for U0035 Calibrated Properties Model REV00''. These calibrated property sets include matrix and fracture parameters for the UZ Flow and Transport Model (UZ Model), drift seepage models, drift-scale and mountain-scale coupled-processes models, and Total System Performance Assessment (TSPA) models as well as Performance Assessment (PA) and other participating national laboratories and government agencies. These process models provide the necessary framework to test conceptual hypotheses of flow and transport at different scales and predict flow and transport behavior under a variety of climatic and thermal-loading conditions.
Energy Technology Data Exchange (ETDEWEB)
C.F. Ahlers, H.H. Liu
2001-12-18
The purpose of this Analysis/Model Report (AMR) is to document the Calibrated Properties Model that provides calibrated parameter sets for unsaturated zone (UZ) flow and transport process models for the Yucca Mountain Site Characterization Project (YMP). This work was performed in accordance with the AMR Development Plan for U0035 Calibrated Properties Model REV00 (CRWMS M&O 1999c). These calibrated property sets include matrix and fracture parameters for the UZ Flow and Transport Model (UZ Model), drift seepage models, drift-scale and mountain-scale coupled-processes models, and Total System Performance Assessment (TSPA) models as well as Performance Assessment (PA) and other participating national laboratories and government agencies. These process models provide the necessary framework to test conceptual hypotheses of flow and transport at different scales and predict flow and transport behavior under a variety of climatic and thermal-loading conditions.
Fundamental concepts of modeling
International Nuclear Information System (INIS)
Garland, W.J.
1990-01-01
This paper addresses the roles of simulation in science and engineering: the extended calculator, the prototyper and the intuition-generation medium. Science and engineering involve thought processes which transcend the rational. Simulation has emerged as a third wing of science that is orthogonal to experimentation and theory. It has the pedestrian role of the extended calculator, in which simulation provides a numerical bridge between symbolic theory and hard experimental data. In this role, discovery has been assisted. But simulation has proved to be more than a super calculator. The nuclear industry and computational fluid dynamics are but two examples of areas that use simulation to replace experimentation (prototyping) for reasons of cost and danger. Further, there is an emerging role of graphics and artificial intelligence in the discovery process. Simulation is clearly becoming not only a tool that reduces the tedium, but one that enhances the creative process. The paper looks at thermalhydraulics and considers the emerging trends. A general modelling scheme is proposed and a systems view is used to suggest the criteria for optimum simulation models for the working environment. The basic theme proposed is to reduce the proportion of rational mental time we spend on perspiration so as to allow more time for non-rational inspiration. Our QUEST, then, is the Quintessential Eureka Stimulator. The QUEST involves the use of existing tools on the microcomputer to enhance problem setup and post-run analysis. The key to successful questing lies, however, in the development of a simulation environment that is seamless through the whole of the process, from problem definition to presentation of results. The framework for this mythical environment is introduced
The Aalborg Model and The Problem
DEFF Research Database (Denmark)
Qvist, Palle
Knowing the definition of a problem has important implications for the possibility to identify and formulate the problem, the starting point of the learning process in the Aalborg Model. For certification it has been suggested that: A problem grows out of students' wondering within...... different disciplines and professional environments. This article goes through the definitions of a problem formulated by researchers at Aalborg University during the lifetime of the university and raises the question to each of them: Does the definition lead to the creation of a feeling or experience...
MulensModel: Microlensing light curves modeling
Poleski, Radoslaw; Yee, Jennifer
2018-03-01
MulensModel calculates light curves of microlensing events. Both single and binary lens events are modeled and various higher-order effects can be included: extended source (with limb-darkening), annual microlensing parallax, and satellite microlensing parallax. The code is object-oriented and written in Python3, and requires AstroPy (ascl:1304.002).
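MulensModel's own API is not reproduced here; the sketch below simply evaluates the standard point-source point-lens (Paczyński) magnification curve that underlies such single-lens light-curve models, with made-up event parameters.

```python
import numpy as np

def pspl_magnification(t, t0, u0, tE):
    """Point-source point-lens (Paczynski) magnification:
    u(t) = sqrt(u0^2 + ((t - t0)/tE)^2), A(u) = (u^2 + 2) / (u sqrt(u^2 + 4))."""
    u = np.hypot(u0, (t - t0) / tE)
    return (u**2 + 2) / (u * np.sqrt(u**2 + 4))

# Hypothetical event: peak at t0 = 0, impact parameter 0.1, timescale 10 days
t = np.linspace(-20.0, 20.0, 201)
A = pspl_magnification(t, t0=0.0, u0=0.1, tE=10.0)
```

The higher-order effects the abstract mentions (limb-darkening, parallax, binary lenses) modify u(t) and A(u) but keep this curve as the base case.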
Business Models and Business Model Innovation
DEFF Research Database (Denmark)
Foss, Nicolai J.; Saebi, Tina
2018-01-01
While research on business models and business model innovation continue to exhibit growth, the field is still, even after more than two decades of research, characterized by a striking lack of cumulative theorizing and an opportunistic borrowing of more or less related ideas from neighbouring...
Takahashi, Takehiro; Schibuya, Noboru
The EMC simulation is now widely used in the design stage of electronic equipment to reduce electromagnetic noise. As the calculated electromagnetic behaviors of the EMC simulator depend on the inputted EMC model of the equipment, the modeling technique is important for obtaining effective results. In this paper, a simple outline of the EMC simulator and the EMC model is described. Some modeling techniques of EMC simulation are also described with an example of an EMC model which is a shield box with an aperture.
Phenomenology of inflationary models
Olyaei, Abbas
2018-01-01
There are many inflationary models compatible with observational data. One can investigate inflationary models by looking at their general features, which are common to most of the models. Here we have investigated some of the single-field models without considering their origin, in order to characterize their phenomenology. We have shown how to adjust the simple harmonic oscillator model so that it is in good agreement with observational data.
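As a hedged illustration of the quadratic ("simple harmonic oscillator") potential mentioned above: in Planck units the standard slow-roll formulas for V = m²φ²/2 give n_s ≈ 1 - 2/N and r ≈ 8/N after N e-folds. These are generic textbook results, not the paper's calculation.

```python
def quadratic_potential_predictions(N):
    """Slow-roll predictions of V = m^2 phi^2 / 2 after N e-folds
    (Planck units): spectral index n_s and tensor-to-scalar ratio r."""
    n_s = 1.0 - 2.0 / N
    r = 8.0 / N
    return n_s, r

# Typical horizon-exit e-fold number
n_s, r = quadratic_potential_predictions(60)
```

For N = 60 this yields n_s ≈ 0.967 and r ≈ 0.13; the relatively large r is why the plain quadratic model is in tension with tensor-mode bounds and motivates the adjustments the abstract refers to.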
Goldstein, Harvey
2011-01-01
This book provides a clear introduction to this important area of statistics. The author provides wide coverage of different kinds of multilevel models, and of how to interpret different statistical methodologies and algorithms applied to such models. This 4th edition reflects the growth and interest in this area and is updated to include new chapters on multilevel models with mixed response types, smoothing and multilevel data, models with correlated random effects and modeling with variance.
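A back-of-envelope sketch of what the simplest multilevel (two-level random-intercept) model estimates, using simulated data and moment estimators rather than the likelihood machinery the book covers: the within- and between-group variance components and the resulting intraclass correlation.

```python
import numpy as np

rng = np.random.default_rng(1)
n_groups, n_per = 50, 20
group_effects = rng.normal(0.0, 2.0, n_groups)            # level-2 residuals, tau = 2
y = group_effects[:, None] + rng.normal(0.0, 1.0, (n_groups, n_per))  # level-1 noise

# Moment-based variance decomposition for a random-intercept model
within = y.var(axis=1, ddof=1).mean()                     # estimates sigma^2 = 1
between = y.mean(axis=1).var(ddof=1) - within / n_per     # estimates tau^2 = 4
icc = between / (between + within)                        # true ICC = 4/5 = 0.8
```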
DEFF Research Database (Denmark)
Langseth, Helge; Nielsen, Thomas Dyhre
2005-01-01
parametric family of distributions. In this paper we propose a new set of models for classification in continuous domains, termed latent classification models. The latent classification model can roughly be seen as combining the Naive Bayes model with a mixture of factor analyzers, thereby relaxing the assumptions...... classification model, and we demonstrate empirically that the accuracy of the proposed model is significantly higher than the accuracy of other probabilistic classifiers....
Geochemistry Model Validation Report: External Accumulation Model
Energy Technology Data Exchange (ETDEWEB)
K. Zarrabi
2001-09-27
The purpose of this Analysis and Modeling Report (AMR) is to validate the External Accumulation Model that predicts accumulation of fissile materials in fractures and lithophysae in the rock beneath a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. (Lithophysae are voids in the rock having concentric shells of finely crystalline alkali feldspar, quartz, and other materials that were formed due to entrapped gas that later escaped, DOE 1998, p. A-25.) The intended use of this model is to estimate the quantities of external accumulation of fissile material for use in external criticality risk assessments for different types of degrading WPs: U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The scope of the model validation is to (1) describe the model and the parameters used to develop the model, (2) provide rationale for selection of the parameters by comparisons with measured values, and (3) demonstrate that the parameters chosen are the most conservative selection for external criticality risk calculations. To demonstrate the applicability of the model, a Pu-ceramic WP is used as an example. The model begins with a source term from separately documented EQ6 calculations, where the source term is defined as the composition versus time of the water flowing out of a breached waste package. Next, PHREEQC is used to simulate the transport and interaction of the source term with the resident water and fractured tuff below the repository. In these simulations the primary mechanism for accumulation is mixing of the high pH, actinide-laden source term with resident water, thus lowering the pH values sufficiently for fissile minerals to become insoluble and precipitate. In the final section of the model, the outputs from PHREEQC are processed to produce mass of accumulation
Geochemistry Model Validation Report: External Accumulation Model
International Nuclear Information System (INIS)
Zarrabi, K.
2001-01-01
The purpose of this Analysis and Modeling Report (AMR) is to validate the External Accumulation Model that predicts accumulation of fissile materials in fractures and lithophysae in the rock beneath a degrading waste package (WP) in the potential monitored geologic repository at Yucca Mountain. (Lithophysae are voids in the rock having concentric shells of finely crystalline alkali feldspar, quartz, and other materials that were formed due to entrapped gas that later escaped, DOE 1998, p. A-25.) The intended use of this model is to estimate the quantities of external accumulation of fissile material for use in external criticality risk assessments for different types of degrading WPs: U.S. Department of Energy (DOE) Spent Nuclear Fuel (SNF) codisposed with High Level Waste (HLW) glass, commercial SNF, and Immobilized Plutonium Ceramic (Pu-ceramic) codisposed with HLW glass. The scope of the model validation is to (1) describe the model and the parameters used to develop the model, (2) provide rationale for selection of the parameters by comparisons with measured values, and (3) demonstrate that the parameters chosen are the most conservative selection for external criticality risk calculations. To demonstrate the applicability of the model, a Pu-ceramic WP is used as an example. The model begins with a source term from separately documented EQ6 calculations, where the source term is defined as the composition versus time of the water flowing out of a breached waste package. Next, PHREEQC is used to simulate the transport and interaction of the source term with the resident water and fractured tuff below the repository. In these simulations the primary mechanism for accumulation is mixing of the high pH, actinide-laden source term with resident water, thus lowering the pH values sufficiently for fissile minerals to become insoluble and precipitate. In the final section of the model, the outputs from PHREEQC are processed to produce mass of accumulation
Crop rotation modelling - A European model intercomparison
DEFF Research Database (Denmark)
Kollas, Chris; Kersebaum, Kurt C; Nendel, Claas
2015-01-01
crop growth simulation models to predict yields in crop rotations at five sites across Europe under minimal calibration. Crop rotations encompassed 301 seasons of ten crop types common to European agriculture and a diverse set of treatments (irrigation, fertilisation, CO2 concentration, soil types...... accurately than main crops (cereals). The majority of models performed better for the treatments of increased CO2 and nitrogen fertilisation than for irrigation and soil-related treatments. The yield simulation of the multi-model ensemble reduced the error compared to single-model simulations. The low degree...... representation of crop rotations, further research is required to synthesise existing knowledge of the physiology of intermediate crops and of carry-over effects from the preceding to the following crop, and to implement/improve the modelling of processes that condition these effects....
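The ensemble-error reduction reported above is easy to reproduce in a toy setting (simulated yields, not the study's data): averaging several unbiased, independently noisy model predictions lowers RMSE by roughly 1/sqrt(n).

```python
import numpy as np

rng = np.random.default_rng(2)
truth = rng.uniform(2.0, 10.0, 300)                 # hypothetical observed yields, t/ha
# Three hypothetical crop models: unbiased but independently noisy predictions
models = truth + rng.normal(0.0, 1.0, (3, 300))

def rmse(pred):
    return np.sqrt(np.mean((pred - truth) ** 2))

single = [rmse(m) for m in models]                  # each close to 1.0
ensemble = rmse(models.mean(axis=0))                # close to 1/sqrt(3) ~ 0.58
```

Real crop models share structural errors, so the gain in the study is smaller than this independent-noise idealization, but the direction of the effect is the same.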
Modelling of an homogeneous equilibrium mixture model
International Nuclear Information System (INIS)
Bernard-Champmartin, A.; Poujade, O.; Mathiaud, J.; Ghidaglia, J.M.
2014-01-01
We present here a model for two phase flows which is simpler than the 6-equations models (with two densities, two velocities, two temperatures) but more accurate than the standard mixture models with 4 equations (with two densities, one velocity and one temperature). We are interested in the case when the two-phases have been interacting long enough for the drag force to be small but still not negligible. The so-called Homogeneous Equilibrium Mixture Model (HEM) that we present is dealing with both mixture and relative quantities, allowing in particular to follow both a mixture velocity and a relative velocity. This relative velocity is not tracked by a conservation law but by a closure law (drift relation), whose expression is related to the drag force terms of the two-phase flow. After the derivation of the model, a stability analysis and numerical experiments are presented. (authors)
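The mixture and relative quantities the HEM model tracks can be written down directly. The sketch below uses generic definitions (volume-fraction-weighted density, mass-weighted mixture velocity) with made-up water/air-like values; it is not the report's drift closure law.

```python
def mixture_state(alpha, rho1, u1, rho2, u2):
    """Mixture density, mass-weighted mixture velocity, and relative velocity
    for a two-phase flow with volume fraction alpha of phase 1."""
    rho = alpha * rho1 + (1.0 - alpha) * rho2
    u = (alpha * rho1 * u1 + (1.0 - alpha) * rho2 * u2) / rho
    ur = u1 - u2   # in HEM, closed by a drift relation tied to the drag force
    return rho, u, ur

# Illustrative values: 30% water-like phase at 1 m/s, air-like phase at 5 m/s
rho, u, ur = mixture_state(0.3, 1000.0, 1.0, 1.2, 5.0)
```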
Directory of Open Access Journals (Sweden)
Nadina Yedid
2013-12-01
Full Text Available The article reviews the findings and opinions of theorists and researchers who have dedicated themselves to analyzing the phenomenon of folksonomies. We highlight the main definitions of the concept of folksonomy, its characteristics, the types of existing folksonomies, and the differences from traditional indexing models based on controlled vocabularies, analyzing the advantages and disadvantages of this new model. We conclude that folksonomies can offer great advantages in information retrieval, especially if they are used to complement indexing with controlled vocabularies.
Havenith, George; Fiala, Dusan; Błazejczyk, Krzysztof; Richards, Mark; Bröde, Peter; Holmér, Ingvar; Rintamaki, Hannu; Benshabat, Yael; Jendritzky, Gerd
2012-05-01
The Universal Thermal Climate Index (UTCI) was conceived as a thermal index covering the whole climate range from heat to cold. This would be impossible without considering clothing as the interface between the person (here, the physiological model of thermoregulation) and the environment. It was decided to develop a clothing model for this application in which the following three factors were considered: (1) typical dressing behaviour in different temperatures, as observed in the field, resulting in a model of the distribution of clothing over the different body segments in relation to the ambient temperature, (2) the changes in clothing insulation and vapour resistance caused by wind and body movement, and (3) the change in wind speed in relation to the height above ground. The outcome was a clothing model that defines in detail the effective clothing insulation and vapour resistance for each of the thermo-physiological model's body segments over a wide range of climatic conditions. This paper details this model's conception and documents its definitions.
Towards Clone Detection in UML Domain Models
DEFF Research Database (Denmark)
Störrle, Harald
2010-01-01
Code clones - that is, duplicate fragments of code - have been studied for a long time. There is strong evidence that code clones are a major source of software faults. Anecdotal evidence suggests that this phenomenon is not restricted to code, but occurs in models in a very similar way. So...... it is likely that model clones are as detrimental to model quality as they are to code quality. However, programming language code and visual models also have significant differences so that notions and algorithms developed in the code clone arena cannot be transferred directly to model clones. In this article......, we discuss how model clones arise by analyzing several practical scenarios. We propose a formal definition of models and clones, that allows us to specify a generic clone detection algorithm. Through a thorough analysis of the detail structure of sample UML domain models, recommendations for clone...
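A toy analogue of the generic clone-detection idea (hypothetical element representation, not the article's algorithm): normalize each model element to a hashable key and report groups whose keys collide, ignoring element names.

```python
from collections import defaultdict

def find_clones(elements):
    """Group model elements whose normalized content collides.
    A naive analogue of code-clone detection lifted to model elements."""
    buckets = defaultdict(list)
    for name, attrs in elements:
        key = frozenset(attrs.items())  # ignore element name and attribute order
        buckets[key].append(name)
    return [names for names in buckets.values() if len(names) > 1]

# Hypothetical UML-like class elements; Customer and Client are structural clones
elements = [
    ("Customer", {"type": "class", "fields": "name,address"}),
    ("Client",   {"type": "class", "fields": "name,address"}),
    ("Order",    {"type": "class", "fields": "id,total"}),
]
clones = find_clones(elements)
```

Real model-clone detectors must also handle near-misses (renamed or slightly modified elements), which exact hashing like this cannot.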
DEFF Research Database (Denmark)
Høskuldsson, Agnar
1996-01-01
Determination of the proper dimension of a given linear model is one of the most important tasks in the applied modeling work. We consider here eight criteria that can be used to determine the dimension of the model, or equivalently, the number of components to use in the model. Four of these cri...... the basic problems in determining the dimension of linear models. Then each of the eight measures is treated. The results are illustrated by examples.
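One common way to choose the number of components, in the spirit of the criteria discussed above (though not necessarily one of the paper's eight), is a cumulative explained-variance threshold, sketched here on simulated data of known dimension.

```python
import numpy as np

rng = np.random.default_rng(3)
# Simulated data whose true dimension is 2: two latent factors plus small noise
latent = rng.normal(size=(200, 2))
loadings = np.array([[1.0, 1.0, 0.0, 0.5, 0.0, 0.5],
                     [0.0, 0.5, 1.0, 0.0, 1.0, 0.5]])
X = latent @ loadings + 0.05 * rng.normal(size=(200, 6))
X -= X.mean(axis=0)

# Criterion: smallest number of components explaining 99% of total variance
s = np.linalg.svd(X, compute_uv=False)
explained = np.cumsum(s**2) / np.sum(s**2)
dim = int(np.searchsorted(explained, 0.99) + 1)
```

Threshold criteria like this are simple but ad hoc; cross-validatory and test-based criteria of the kind the paper compares are usually preferred in practice.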
DEFF Research Database (Denmark)
Cameron, Ian T.; Gani, Rafiqul
This book covers the area of product and process modelling via a case study approach. It addresses a wide range of modelling applications with emphasis on modelling methodology and the subsequent in-depth analysis of mathematical models to gain insight via structural aspects of the models....... These approaches are put into the context of life cycle modelling, where multiscale and multiform modelling is increasingly prevalent in the 21st century. The book commences with a discussion of modern product and process modelling theory and practice followed by a series of case studies drawn from a variety...... to biotechnology applications, food, polymer and human health application areas. The book highlights the important nature of modern product and process modelling in the decision-making processes across the life cycle. As such it provides an important resource for students, researchers and industrial practitioners....
Model Validation Status Review
International Nuclear Information System (INIS)
E.L. Hardin
2001-01-01
The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M and O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural and
Model Validation Status Review
Energy Technology Data Exchange (ETDEWEB)
E.L. Hardin
2001-11-28
The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M&O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural and
Modeling for Battery Prognostics
Kulkarni, Chetan S.; Goebel, Kai; Khasin, Michael; Hogge, Edward; Quach, Patrick
2017-01-01
For any battery-powered vehicle (be it an unmanned aerial vehicle, a small passenger aircraft, or an asset in exoplanetary operations) to operate at maximum efficiency and reliability, it is critical to monitor battery health as well as performance and to predict end of discharge (EOD) and end of useful life (EOL). To fulfil these needs, it is important to capture the battery's inherent characteristics as well as operational knowledge in the form of models that can be used by monitoring, diagnostic, and prognostic algorithms. Several battery modeling methodologies have been developed in the last few years as the understanding of the underlying electrochemical mechanisms has advanced. The models can generally be classified as empirical models, electrochemical engineering models, multi-physics models, and molecular/atomistic models. Empirical models are based on fitting certain functions to past experimental data, without making use of any physicochemical principles. Electrical circuit equivalent models are an example of such empirical models. Electrochemical engineering models are typically continuum models that include electrochemical kinetics and transport phenomena. Each model has its advantages and disadvantages. The former type of model has the advantage of being computationally efficient, but has limited accuracy and robustness due to the approximations used in the developed model, and as a result of such approximations cannot represent aging well. The latter type of model has the advantage of being very accurate, but is often computationally inefficient, having to solve complex sets of partial differential equations, and is thus not well suited for online prognostic applications. In addition, both multi-physics and atomistic models are computationally expensive and hence even less suited to online application. An electrochemistry-based model of Li-ion batteries has been developed that captures crucial electrochemical processes, captures effects of aging, and is computationally efficient
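The equivalent-circuit (empirical) family mentioned in the abstract can be sketched in a few lines. The following first-order Thevenin cell integrates terminal voltage during a constant-current discharge; the open-circuit-voltage curve and all parameter values (`r0`, `r1`, `c1`, `q_ah`) are invented for illustration, not taken from any datasheet or from the paper:

```python
def simulate_thevenin(i_load=2.0, dt=1.0, t_end=600.0,
                      q_ah=2.2, r0=0.05, r1=0.02, c1=2000.0):
    """Euler simulation of a first-order Thevenin equivalent-circuit cell.

    i_load : discharge current (A), positive = discharging
    q_ah   : capacity in amp-hours
    r0     : ohmic resistance; r1, c1 : polarization RC pair
    All parameter values are illustrative, not from a datasheet.
    """
    soc = 1.0          # state of charge, 1.0 = full
    v_rc = 0.0         # voltage across the RC pair
    hist = []
    t = 0.0
    while t <= t_end and soc > 0.0:
        # crude open-circuit voltage vs. state of charge (invented curve)
        ocv = 3.0 + 1.2 * soc - 0.5 * (1.0 - soc) ** 2
        v_term = ocv - i_load * r0 - v_rc
        hist.append((t, soc, v_term))
        # state updates: coulomb counting and RC relaxation
        soc -= i_load * dt / (q_ah * 3600.0)
        v_rc += dt * (i_load / c1 - v_rc / (r1 * c1))
        t += dt
    return hist

hist = simulate_thevenin()
```

Such a model is cheap enough for online EOD prediction, which is exactly the trade-off the abstract describes: speed at the cost of not representing aging.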
Modeling volatility using state space models.
Timmer, J; Weigend, A S
1997-08-01
In time series problems, noise can be divided into two categories: dynamic noise which drives the process, and observational noise which is added in the measurement process, but does not influence future values of the system. In this framework, we show that empirical volatilities (the squared relative returns of prices) exhibit a significant amount of observational noise. To model and predict their time evolution adequately, we estimate state space models that explicitly include observational noise. We obtain relaxation times for shocks in the logarithm of volatility ranging from three weeks (for foreign exchange) to three to five months (for stock indices). In most cases, a two-dimensional hidden state is required to yield residuals that are consistent with white noise. We compare these results with ordinary autoregressive models (without a hidden state) and find that autoregressive models underestimate the relaxation times by about two orders of magnitude since they do not distinguish between observational and dynamic noise. This new interpretation of the dynamics of volatility in terms of relaxators in a state space model carries over to stochastic volatility models and to GARCH models, and is useful for several problems in finance, including risk management and the pricing of derivative securities. Data sets used: Olsen & Associates high frequency DEM/USD foreign exchange rates (8 years). Nikkei 225 index (40 years). Dow Jones Industrial Average (25 years).
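The bias described above, where observational noise makes ordinary autoregressive fits drastically underestimate relaxation times, can be reproduced with a toy simulation. All numbers here (`phi`, the noise levels) are illustrative, not estimates from the paper's data sets:

```python
import random, math

random.seed(0)

def simulate(n=20000, phi=0.98, sig_dyn=0.1, sig_obs=0.5):
    """AR(1) hidden state x_t (log-volatility) observed with additive noise."""
    x, xs, ys = 0.0, [], []
    for _ in range(n):
        x = phi * x + random.gauss(0.0, sig_dyn)   # dynamic noise
        xs.append(x)
        ys.append(x + random.gauss(0.0, sig_obs))  # observational noise
    return xs, ys

def ar1_coeff(z):
    """Least-squares AR(1) coefficient: lag-1 autocovariance / variance."""
    m = sum(z) / len(z)
    num = sum((z[t] - m) * (z[t - 1] - m) for t in range(1, len(z)))
    den = sum((v - m) ** 2 for v in z)
    return num / den

xs, ys = simulate()
phi_state = ar1_coeff(xs)   # close to the true 0.98
phi_naive = ar1_coeff(ys)   # biased toward zero by observational noise

# relaxation time tau = -1 / ln(phi); the naive fit underestimates it
tau_state = -1.0 / math.log(phi_state)
tau_naive = -1.0 / math.log(phi_naive)
```

Fitting the AR coefficient on the noisy observations instead of the hidden state shrinks the estimated relaxation time by more than an order of magnitude, which is the qualitative effect the abstract reports.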
High resolution extremity CT for biomechanics modeling
International Nuclear Information System (INIS)
Ashby, A.E.; Brand, H.; Hollerbach, K.; Logan, C.M.; Martz, H.E.
1995-01-01
With the advent of ever more powerful computing and finite element analysis (FEA) capabilities, the bone and joint geometry detail available from either commercial surface definitions or from medical CT scans is inadequate. For dynamic FEA modeling of joints, precise articular contours are necessary to get appropriate contact definition. In this project, a fresh cadaver extremity was suspended in paraffin in a Lucite cylinder and then scanned with an industrial CT system to generate a high-resolution data set for use in biomechanics modeling.
High resolution extremity CT for biomechanics modeling
Energy Technology Data Exchange (ETDEWEB)
Ashby, A.E.; Brand, H.; Hollerbach, K.; Logan, C.M.; Martz, H.E.
1995-09-23
With the advent of ever more powerful computing and finite element analysis (FEA) capabilities, the bone and joint geometry detail available from either commercial surface definitions or from medical CT scans is inadequate. For dynamic FEA modeling of joints, precise articular contours are necessary to get appropriate contact definition. In this project, a fresh cadaver extremity was suspended in paraffin in a Lucite cylinder and then scanned with an industrial CT system to generate a high-resolution data set for use in biomechanics modeling.
Empirical Model Building Data, Models, and Reality
Thompson, James R
2011-01-01
Praise for the First Edition "This...novel and highly stimulating book, which emphasizes solving real problems...should be widely read. It will have a positive and lasting effect on the teaching of modeling and statistics in general." - Short Book Reviews This new edition features developments and real-world examples that showcase essential empirical modeling techniques Successful empirical model building is founded on the relationship between data and approximate representations of the real systems that generated that data. As a result, it is essential for researchers who construct these m
Zephyr - the prediction models
DEFF Research Database (Denmark)
Nielsen, Torben Skov; Madsen, Henrik; Nielsen, Henrik Aalborg
2001-01-01
This paper briefly describes new models and methods for predicting the wind power output from wind farms. The system is being developed in a project which has the research organization Risø and the department of Informatics and Mathematical Modelling (IMM) as the modelling team and all the Danish utilities as partners and users. The new models are evaluated for five wind farms in Denmark as well as one wind farm in Spain. It is shown that the predictions based on conditional parametric models are superior to the predictions obtained by state-of-the-art parametric models.
International Nuclear Information System (INIS)
Harvey, M.; Khanna, F.C.
1975-01-01
The general problem of what constitutes a physical model and what is known about the free nucleon-nucleon interaction are considered. A time independent formulation of the basic equations is chosen. Construction of the average field in which particles move in a general independent particle model is developed, concentrating on problems of defining the average spherical single particle field for any given nucleus, and methods for construction of effective residual interactions and other physical operators. Deformed shell models and both spherical and deformed harmonic oscillator models are discussed in detail, and connections between spherical and deformed shell models are analyzed. A section on cluster models is included. 11 tables, 21 figures
Peabody, Hume L.
2017-01-01
This presentation is meant to be an overview of the model building process. It is based on typical techniques (Monte Carlo ray tracing for radiation exchange; lumped parameter, finite difference for the thermal solution) used by the aerospace industry. This is not intended to be a "How to Use ThermalDesktop" course. It is intended to be a "How to Build Thermal Models" course, and the techniques will be demonstrated using the capabilities of ThermalDesktop (TD). Other codes may or may not have similar capabilities. The general model building process can be broken into four top-level steps: 1. Build Model; 2. Check Model; 3. Execute Model; 4. Verify Results.
Microsoft tabular modeling cookbook
Braak, Paul te
2013-01-01
This book follows a cookbook style, with recipes explaining the steps for developing analytic data using Business Intelligence Semantic Models. This book is designed for developers who wish to develop powerful and dynamic models for users, as well as for those who are responsible for the administration of models in corporate environments. It is also targeted at analysts and users of Excel who wish to advance their knowledge of Excel through the development of tabular models, or who wish to analyze data through tabular modeling techniques. We assume no prior knowledge of tabular modeling.
Directory of Open Access Journals (Sweden)
Luiz Carlos Bresser-Pereira
2012-03-01
Full Text Available Besides analyzing capitalist societies historically and thinking of them in terms of phases or stages, we may compare different models or varieties of capitalism. In this paper I survey the literature on this subject, and distinguish the classifications that take a production or business approach from those that use a mainly political criterion. I identify five forms of capitalism: among the rich countries, the liberal democratic or Anglo-Saxon model, the social or European model, and the endogenous social integration or Japanese model; among developing countries, I distinguish the Asian developmental model from the liberal-dependent model that characterizes most other developing countries, including Brazil.
Geller, Michael; Telem, Ofri
2015-05-15
We present the first realization of a "twin Higgs" model as a holographic composite Higgs model. Uniquely among composite Higgs models, the Higgs potential is protected by a new standard model (SM) singlet elementary "mirror" sector at the sigma model scale f and not by the composite states at m_{KK}, naturally allowing for m_{KK} beyond the LHC reach. As a result, naturalness in our model cannot be constrained by the LHC, but may be probed by precision Higgs measurements at future lepton colliders, and by direct searches for Kaluza-Klein excitations at a 100 TeV collider.
Energy Technology Data Exchange (ETDEWEB)
D.W. Wu; A.J. Smith
2004-11-08
The purpose of this report is to document the biosphere model, the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), which describes radionuclide transport processes in the biosphere and associated human exposure that may arise as the result of radionuclide release from the geologic repository at Yucca Mountain. The biosphere model is one of the process models that support the Yucca Mountain Project (YMP) Total System Performance Assessment (TSPA) for the license application (LA), TSPA-LA. The ERMYN provides the capability of performing human radiation dose assessments. This report documents the biosphere model, which includes: (1) Describing the reference biosphere, human receptor, exposure scenarios, and primary radionuclides for each exposure scenario (Section 6.1); (2) Developing a biosphere conceptual model using site-specific features, events, and processes (FEPs) (Section 6.2), the reference biosphere (Section 6.1.1), the human receptor (Section 6.1.2), and approximations (Sections 6.3.1.4 and 6.3.2.4); (3) Building a mathematical model using the biosphere conceptual model (Section 6.3) and published biosphere models (Sections 6.4 and 6.5); (4) Summarizing input parameters for the mathematical model, including the uncertainty associated with input values (Section 6.6); (5) Identifying improvements in the ERMYN compared with the model used in previous biosphere modeling (Section 6.7); (6) Constructing an ERMYN implementation tool (model) based on the biosphere mathematical model using GoldSim stochastic simulation software (Sections 6.8 and 6.9); (7) Verifying the ERMYN by comparing output from the software with hand calculations to ensure that the GoldSim implementation is correct (Section 6.10); (8) Validating the ERMYN by corroborating it with published biosphere models; comparing conceptual models, mathematical models, and numerical results (Section 7).
International Nuclear Information System (INIS)
D.W. Wu; A.J. Smith
2004-01-01
The purpose of this report is to document the biosphere model, the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), which describes radionuclide transport processes in the biosphere and associated human exposure that may arise as the result of radionuclide release from the geologic repository at Yucca Mountain. The biosphere model is one of the process models that support the Yucca Mountain Project (YMP) Total System Performance Assessment (TSPA) for the license application (LA), TSPA-LA. The ERMYN provides the capability of performing human radiation dose assessments. This report documents the biosphere model, which includes: (1) Describing the reference biosphere, human receptor, exposure scenarios, and primary radionuclides for each exposure scenario (Section 6.1); (2) Developing a biosphere conceptual model using site-specific features, events, and processes (FEPs) (Section 6.2), the reference biosphere (Section 6.1.1), the human receptor (Section 6.1.2), and approximations (Sections 6.3.1.4 and 6.3.2.4); (3) Building a mathematical model using the biosphere conceptual model (Section 6.3) and published biosphere models (Sections 6.4 and 6.5); (4) Summarizing input parameters for the mathematical model, including the uncertainty associated with input values (Section 6.6); (5) Identifying improvements in the ERMYN compared with the model used in previous biosphere modeling (Section 6.7); (6) Constructing an ERMYN implementation tool (model) based on the biosphere mathematical model using GoldSim stochastic simulation software (Sections 6.8 and 6.9); (7) Verifying the ERMYN by comparing output from the software with hand calculations to ensure that the GoldSim implementation is correct (Section 6.10); (8) Validating the ERMYN by corroborating it with published biosphere models; comparing conceptual models, mathematical models, and numerical results (Section 7)
Modelling of Innovation Diffusion
Directory of Open Access Journals (Sweden)
Arkadiusz Kijek
2010-01-01
Full Text Available Since the publication of the Bass model in 1969, research on the modelling of the diffusion of innovation has resulted in a vast body of scientific literature consisting of articles, books, and studies of real-world applications of this model. The main objective of the diffusion model is to describe a pattern of spread of innovation among potential adopters in terms of a mathematical function of time. This paper assesses the state of the art in mathematical models of innovation diffusion and procedures for estimating their parameters. Moreover, theoretical issues related to the models presented are supplemented with empirical research. The purpose of the research is to explore the extent to which the diffusion of broadband Internet users in 29 OECD countries can be adequately described by three diffusion models, i.e. the Bass model, the logistic model and the dynamic model. The results of this research are ambiguous and do not indicate which model best describes the diffusion pattern of broadband Internet users, but in terms of the results presented, in most cases the dynamic model is inappropriate for describing the diffusion pattern. Issues related to the further development of innovation diffusion models are discussed and some recommendations are given. (original abstract)
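For reference, the Bass model itself has a closed-form adoption curve. The sketch below uses the often-quoted "typical" coefficient magnitudes p = 0.03 and q = 0.38 purely for illustration; they are not estimates for the broadband data set studied in the paper:

```python
import math

def bass_cumulative(t, p=0.03, q=0.38, m=1.0):
    """Cumulative adopters at time t under the Bass (1969) model.

    p : coefficient of innovation, q : coefficient of imitation,
    m : market potential. The default p and q are illustrative only.
    """
    e = math.exp(-(p + q) * t)
    return m * (1.0 - e) / (1.0 + (q / p) * e)

def bass_peak_time(p=0.03, q=0.38):
    """Time of the peak adoption rate: t* = ln(q/p) / (p + q)."""
    return math.log(q / p) / (p + q)

# the characteristic S-curve over 30 periods
curve = [bass_cumulative(t) for t in range(0, 31)]
```

Parameter estimation for real data (as done in the paper for 29 OECD countries) would fit p, q and m to observed adoption counts, e.g. by nonlinear least squares.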
Sumner, J G; Fernández-Sánchez, J; Jarvis, P D
2012-04-07
Recent work has discussed the importance of multiplicative closure for the Markov models used in phylogenetics. For continuous-time Markov chains, a sufficient condition for multiplicative closure of a model class is ensured by demanding that the set of rate-matrices belonging to the model class form a Lie algebra. It is the case that some well-known Markov models do form Lie algebras and we refer to such models as "Lie Markov models". However, it is also the case that some other well-known Markov models unequivocally do not form Lie algebras (GTR being the most conspicuous example). In this paper, we will discuss how to generate Lie Markov models by demanding that the models have certain symmetries under nucleotide permutations. We show that the Lie Markov models include, and hence provide a unifying concept for, "group-based" and "equivariant" models. For each of two and four character states, the full list of Lie Markov models with maximal symmetry is presented and shown to include interesting examples that are neither group-based nor equivariant. We also argue that our scheme is pleasing in the context of applied phylogenetics, as, for a given symmetry of nucleotide substitution, it provides a natural hierarchy of models with increasing number of parameters. We also note that our methods are applicable to any application of continuous-time Markov chains beyond the initial motivations we take from phylogenetics. Crown Copyright © 2011. Published by Elsevier Ltd. All rights reserved.
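The group-based case can be checked numerically: for a model built on an abelian group, such as the strand-symmetric K3ST model on the Klein four-group, any two rate matrices commute, so the Lie bracket vanishes and the Lie-algebra (hence multiplicative-closure) condition holds trivially. A minimal pure-Python sketch for 4×4 matrices:

```python
# Permutation matrices of the Klein four-group Z2 x Z2 acting on {A, G, C, T}
P = {
    "x": [[0, 1, 0, 0], [1, 0, 0, 0], [0, 0, 0, 1], [0, 0, 1, 0]],
    "y": [[0, 0, 1, 0], [0, 0, 0, 1], [1, 0, 0, 0], [0, 1, 0, 0]],
    "z": [[0, 0, 0, 1], [0, 0, 1, 0], [0, 1, 0, 0], [1, 0, 0, 0]],
}
I = [[1 if i == j else 0 for j in range(4)] for i in range(4)]

def k3st(a, b, c):
    """Rate matrix of the group-based K3ST model: a(Px-I) + b(Py-I) + c(Pz-I)."""
    return [[a * (P["x"][i][j] - I[i][j]) + b * (P["y"][i][j] - I[i][j])
             + c * (P["z"][i][j] - I[i][j]) for j in range(4)]
            for i in range(4)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def commutator(A, B):
    """Lie bracket [A, B] = AB - BA."""
    AB, BA = matmul(A, B), matmul(B, A)
    return [[AB[i][j] - BA[i][j] for j in range(4)] for i in range(4)]

Q1, Q2 = k3st(0.2, 0.3, 0.1), k3st(0.5, 0.05, 0.25)
C = commutator(Q1, Q2)
# every entry of [Q1, Q2] vanishes: the K3ST rate matrices span an
# abelian Lie algebra, so the model class is multiplicatively closed
flat = [abs(v) for row in C for v in row]
```

For a non-closed class like GTR the analogous bracket generically falls outside the model's parameter space, which is the failure the abstract refers to.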
Flight Dynamic Model Exchange using XML
Jackson, E. Bruce; Hildreth, Bruce L.
2002-01-01
The AIAA Modeling and Simulation Technical Committee has worked for several years to develop a standard by which the information needed to develop physics-based models of aircraft can be specified. The purpose of this standard is to provide a well-defined set of information, definitions, data tables and axis systems so that cooperating organizations can transfer a model from one simulation facility to another with maximum efficiency. This paper proposes using an application of the eXtensible Markup Language (XML) to implement the AIAA simulation standard. The motivation and justification for using a standard such as XML is discussed. Necessary data elements to be supported are outlined. An example of an aerodynamic model as an XML file is given. This example includes definition of independent and dependent variables for function tables, definition of key variables used to define the model, and axis systems used. The final steps necessary for implementation of the standard are presented. Software to take an XML-defined model and import/export it to/from a given simulation facility is discussed, but not demonstrated. That would be the next step in final implementation of standards for physics-based aircraft dynamic models.
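The idea of an XML-encoded aerodynamic function table can be illustrated with a short sketch. Note that the element names below (`aeroModel`, `functionTable`, `point`) are invented for illustration only and are not the AIAA standard's actual schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical schema sketch: one aerodynamic coefficient tabulated
# against one independent variable, with units declared up front.
DOC = """
<aeroModel name="exampleWing">
  <independentVar id="alpha" units="deg"/>
  <dependentVar id="CL" units="nd"/>
  <functionTable dependent="CL" independent="alpha">
    <point alpha="0">0.20</point>
    <point alpha="5">0.65</point>
    <point alpha="10">1.05</point>
  </functionTable>
</aeroModel>
"""

def load_table(xml_text):
    """Parse the function table into sorted (alpha, CL) break points."""
    root = ET.fromstring(xml_text)
    table = root.find("functionTable")
    ivar = table.get("independent")
    return sorted((float(p.get(ivar)), float(p.text)) for p in table)

def interp(pts, x):
    """Piecewise-linear table lookup, clamped to the table's end points."""
    if x <= pts[0][0]:
        return pts[0][1]
    if x >= pts[-1][0]:
        return pts[-1][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= x <= x1:
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)

pts = load_table(DOC)
cl = interp(pts, 2.5)   # halfway between the 0-deg and 5-deg break points
```

The import/export software the paper discusses would map such a document to and from each facility's native table format.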
Integrated Medical Model – Chest Injury Model
National Aeronautics and Space Administration — The Exploration Medical Capability (ExMC) Element of NASA's Human Research Program (HRP) developed the Integrated Medical Model (IMM) to forecast the resources...
Traffic & safety statewide model and GIS modeling.
2012-07-01
Several steps have been taken over the past two years to advance the Utah Department of Transportation (UDOT) safety initiative. Previous research projects began the development of a hierarchical Bayesian model to analyze crashes on Utah roadways. De...
Nonlinear Modeling by Assembling Piecewise Linear Models
Yao, Weigang; Liou, Meng-Sing
2013-01-01
To preserve the nonlinearity of a full-order system over a parameter range of interest, we propose a simple modeling approach by assembling a set of piecewise local solutions, including the first-order Taylor series terms expanded about some sampling states. The work by Rewienski and White inspired our use of piecewise linear local solutions. The assembly of these local approximations is accomplished by assigning nonlinear weights, through radial basis functions in this study. The efficacy of the proposed procedure is validated for a two-dimensional airfoil moving at different Mach numbers and pitching motions, under which the flow exhibits prominent nonlinear behaviors. All results confirm that our nonlinear model is accurate and stable for predicting not only aerodynamic forces but also detailed flowfields. Moreover, the model remains accurate and robust for inputs considerably different from the base trajectory in form and magnitude. This modeling preserves the nonlinearity of the problems considered in a rather simple and accurate manner.
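The assembly scheme, local first-order Taylor solutions blended by normalized Gaussian radial basis function weights, can be sketched on a scalar toy problem. Here sin(x) stands in for the full-order system, and the sampling states and RBF width are arbitrary choices, not values from the paper:

```python
import math

def local_models(centers):
    """First-order Taylor (linear) local solutions of sin about each
    sampling state, standing in for snapshots of a full-order system."""
    return [(c, math.sin(c), math.cos(c)) for c in centers]

def blended(models, x, width=0.4):
    """Assemble the local solutions with normalized Gaussian RBF weights."""
    ws = [math.exp(-((x - c) / width) ** 2) for c, _, _ in models]
    s = sum(ws)
    return sum(w * (f0 + df * (x - c))
               for w, (c, f0, df) in zip(ws, models)) / s

centers = [0.5 * k for k in range(7)]        # sampling states on [0, 3]
models = local_models(centers)

# worst-case error of the assembled model over a fine grid on [0, 3]
err = max(abs(blended(models, 0.01 * k) - math.sin(0.01 * k))
          for k in range(301))
```

Each local model is exact at its own sampling state, and the normalized weights interpolate smoothly between neighbors, which is how the assembly retains nonlinearity that any single linearization would lose.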
Collision models in quantum optics
Ciccarello, Francesco
2017-12-01
Quantum collision models (CMs) provide advantageous case studies for investigating major issues in open quantum systems theory, and especially quantum non-Markovianity. After reviewing their general definition and distinctive features, we illustrate the emergence of a CM in a familiar quantum optics scenario. This task is carried out by highlighting the close connection between the well-known input-output formalism and CMs. Within this quantum optics framework, usual assumptions in the CMs' literature - such as considering a bath of noninteracting yet initially correlated ancillas - have a clear physical origin.
Núñez Vaquero, Álvaro
2013-01-01
This paper pursues three goals. First, some traditional concepts of ‘legal science’ will be analysed, and a definition of ‘legal science amplo sensu’, ‘legal science stricto sensu’ and ‘legal dogmatics’ will be proposed. Second, a reconstruction of five models of ‘legal science amplo sensu’ will be presented to show the different methodological alternatives available to legal scholars. Third, I claim that it is necessary (for conceptual reasons) to argue for moral reasons when choosing a lega...
Solid Waste Projection Model: Model user's guide
International Nuclear Information System (INIS)
Stiles, D.L.; Crow, V.L.
1990-08-01
The Solid Waste Projection Model (SWPM) system is an analytical tool developed by Pacific Northwest Laboratory (PNL) for Westinghouse Hanford Company (WHC) specifically to address solid waste management issues at the Hanford Central Waste Complex (HCWC). This document, one of six documents supporting the SWPM system, contains a description of the system and instructions for preparing to use SWPM and operating Version 1 of the model. 4 figs., 1 tab
National Research Council Canada - National Science Library
Feiler, Peter
2007-01-01
.... The Society of Automotive Engineers (SAE) Architecture Analysis & Design Language (AADL) is an industry-standard, architecture-modeling notation specifically designed to support a component-based approach to modeling embedded systems...
DEFF Research Database (Denmark)
Juhl, Joakim
This thesis is about mathematical modelling and technology development. While mathematical modelling has become widely deployed within a broad range of scientific practices, it has also gained a central position within technology development. The intersection of mathematical modelling … -efficiency project, this thesis presents an analysis of the central practices that materialised representative physical modelling and implemented operational regulation models. In order to show how the project’s representative modelling and technology development connected physical theory with concrete problems … theoretical outset, the existing literature on simulation models, and the study’s methodological and empirical approach. The purpose of this thesis is to describe the central practices that developed regulation technology for industrial production processes and to analyse how mathematical modelling …
International Nuclear Information System (INIS)
Pulkkinen, U.
2004-04-01
The report describes a simple comparison of two CCF models, the ECLM and the Beta-model. The objective of the comparison is to identify differences in the results of the models by applying them to some simple test data cases. The comparison focuses mainly on theoretical aspects of the above-mentioned CCF models. The properties of the model parameter estimates in the data cases are also discussed. The practical aspects of using and estimating CCF models in a real PSA context (e.g. the data interpretation, properties of computer tools, the model documentation) are not discussed in the report. Similarly, the qualitative CCF analyses needed in using the models are not discussed in the report. (au)
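For orientation, the Beta-model referred to above is the simplest parametric common-cause-failure model: a fraction beta of each component's total failure rate is attributed to a common cause that fails all redundant components together. A minimal sketch with purely illustrative numbers:

```python
def beta_factor_split(lam_total, beta):
    """Beta-factor CCF model: split a component's total failure rate
    into an independent part and a common-cause part."""
    return (1.0 - beta) * lam_total, beta * lam_total

def pair_unavailability(lam_total, beta, t):
    """Approximate probability that both trains of a redundant pair have
    failed by time t (rare-event approximation): the squared independent
    term plus the common-cause term, which usually dominates."""
    q_ind = (1.0 - beta) * lam_total * t
    q_ccf = beta * lam_total * t
    return q_ind ** 2 + q_ccf

# illustrative values: lambda = 1e-5 per hour, beta = 0.1, t = 1000 h
lam_ind, lam_ccf = beta_factor_split(1.0e-5, 0.1)
q = pair_unavailability(1.0e-5, 0.1, 1000.0)
```

More refined parametric models (the ECLM among them) exist precisely because a single beta cannot distinguish failures of two, three, or more components in larger redundancy groups.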
Modeling EERE deployment programs
Energy Technology Data Exchange (ETDEWEB)
Cort, K. A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hostick, D. J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Belzer, D. B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Livingston, O. V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)
2007-11-01
The purpose of the project was to identify and characterize the modeling of deployment programs within the EERE Technology Development (TD) programs, address possible improvements to the modeling process, and note gaps in knowledge for future research.
Controlling Modelling Artifacts
DEFF Research Database (Denmark)
Smith, Michael James Andrew; Nielson, Flemming; Nielson, Hanne Riis
2011-01-01
When analysing the performance of a complex system, we typically build abstract models that are small enough to analyse, but still capture the relevant details of the system. But it is difficult to know whether the model accurately describes the real system, or if its behaviour is due to modelling artifacts that were inadvertently introduced. In this paper, we propose a novel methodology to reason about modelling artifacts, given a detailed model and a high-level (more abstract) model of the same system. By a series of automated abstraction steps, we lift the detailed model to the same state space … the possible configurations of the system (for example, by counting the number of components in a certain state). We motivate our methodology with a case study of the LMAC protocol for wireless sensor networks. In particular, we investigate the accuracy of a recently proposed high-level model of LMAC …
Modeling Fluid Structure Interaction
National Research Council Canada - National Science Library
Benaroya, Haym
2000-01-01
The principal goal of this program is on integrating experiments with analytical modeling to develop physics-based reduced-order analytical models of nonlinear fluid-structure interactions in articulated naval platforms...
Consistent model driven architecture
Niepostyn, Stanisław J.
2015-09-01
The goal of the MDA is to produce software systems from abstract models in a way where human interaction is restricted to a minimum. These abstract models are based on the UML language. However, the semantics of UML models is defined in a natural language. Consequently, verification of the consistency of these diagrams is needed in order to identify errors in requirements at an early stage of the development process. The verification of consistency is difficult due to the semi-formal nature of UML diagrams. We propose automatic verification of the consistency of a series of UML diagrams originating from abstract models, implemented with our consistency rules. This Consistent Model Driven Architecture approach enables us to automatically generate complete workflow applications from consistent and complete models developed from abstract models (e.g. a Business Context Diagram). Therefore, our method can be used to check the practicability (feasibility) of software architecture models.
Agena, S. M.; Pusey, M. L.; Bogle, I. D.
1999-01-01
A thermodynamic framework (UNIQUAC model with temperature dependent parameters) is applied to model the salt-induced protein crystallization equilibrium, i.e., protein solubility. The framework introduces a term for the solubility product describing protein transfer between the liquid and solid phase and a term for the solution behavior describing deviation from ideal solution. Protein solubility is modeled as a function of salt concentration and temperature for a four-component system consisting of a protein, pseudo solvent (water and buffer), cation, and anion (salt). Two different systems, lysozyme with sodium chloride and concanavalin A with ammonium sulfate, are investigated. Comparison of the modeled and experimental protein solubility data results in an average root mean square deviation of 5.8%, demonstrating that the model closely follows the experimental behavior. Model calculations and model parameters are reviewed to examine the model and protein crystallization process. Copyright 1999 John Wiley & Sons, Inc.
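The full UNIQUAC framework used in the abstract is beyond a short sketch, but the qualitative salt dependence it captures resembles the classical Cohn salting-out relation, ln S = beta − Ks·C, which a least-squares line fit recovers. Note the substitution of the simpler Cohn relation is ours, and every data point below is invented for illustration:

```python
import math

def fit_cohn(conc, sol):
    """Least-squares fit of ln(solubility) against salt concentration:
    ln S = beta - Ks * C  (Cohn salting-out relation)."""
    ys = [math.log(s) for s in sol]
    n = len(conc)
    mx, my = sum(conc) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in conc)
    sxy = sum((x - mx) * (y - my) for x, y in zip(conc, ys))
    ks = -sxy / sxx          # salting-out constant (slope, sign flipped)
    beta = my + ks * mx      # intercept: ln S extrapolated to zero salt
    return beta, ks

# invented example data: lysozyme-like solubility falling with NaCl
conc = [0.5, 1.0, 1.5, 2.0]      # mol/L salt (illustrative)
sol = [20.0, 9.0, 4.1, 1.9]      # mg/mL protein (illustrative)
beta, ks = fit_cohn(conc, sol)
```

A temperature-dependent framework like the one in the abstract would in effect make beta and Ks functions of temperature, fitted across several isotherms instead of one.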
International Nuclear Information System (INIS)
Anon.
1977-01-01
Progress in model and code development for reactor physics calculations is summarized. The codes included CINDER-10, PHROG, RAFFLE GAPP, DCFMR, RELAP/4, PARET, and KENO. Kinetics models for the PBF were developed
National Aeronautics and Space Administration — The Galactic model is a spatial and spectral template. The model for the Galactic diffuse emission was developed using spectral line surveys of HI and CO (as a...
National Oceanic and Atmospheric Administration, Department of Commerce — The World Magnetic Model is the standard model used by the U.S. Department of Defense, the U.K. Ministry of Defence, the North Atlantic Treaty Organization (NATO)...
Amir Farbin
The ATLAS Analysis Model is a continually developing vision of how to reconcile physics analysis requirements with the ATLAS offline software and computing model constraints. In the past year this vision has influenced the evolution of the ATLAS Event Data Model, the Athena software framework, and physics analysis tools. These developments, along with the October Analysis Model Workshop and the planning for CSC analyses have led to a rapid refinement of the ATLAS Analysis Model in the past few months. This article introduces some of the relevant issues and presents the current vision of the future ATLAS Analysis Model. Event Data Model The ATLAS Event Data Model (EDM) consists of several levels of detail, each targeted for a specific set of tasks. For example the Event Summary Data (ESD) stores calorimeter cells and tracking system hits thereby permitting many calibration and alignment tasks, but will only be accessible at particular computing sites with potentially large latency. In contrast, the Analysis...
DEFF Research Database (Denmark)
Riis, Troels; Jørgensen, John Leif
1999-01-01
This document describes a test of the implementation of the ASC orbit model for the Champ satellite.
Laboratory of Biological Modeling
Federal Laboratory Consortium — The Laboratory of Biological Modeling is defined by both its methodologies and its areas of application. We use mathematical modeling in many forms and apply it to a...
Bounding species distribution models
Directory of Open Access Journals (Sweden)
Thomas J. STOHLGREN, Catherine S. JARNEVICH, Wayne E. ESAIAS,Jeffrey T. MORISETTE
2011-10-01
Full Text Available Species distribution models are increasing in popularity for mapping suitable habitat for species of management concern. Many investigators now recognize that extrapolations of these models with geographic information systems (GIS) might be sensitive to the environmental bounds of the data used in their development, yet there is no recommended best practice for “clamping” model extrapolations. We relied on two commonly used modeling approaches: classification and regression tree (CART) and maximum entropy (Maxent) models, and we tested a simple alteration of the model extrapolations, bounding extrapolations to the maximum and minimum values of primary environmental predictors, to provide a more realistic map of suitable habitat of hybridized Africanized honey bees in the southwestern United States. Findings suggest that multiple models of bounding, and the most conservative bounding of species distribution models, like those presented here, should probably replace the unbounded or loosely bounded techniques currently used [Current Zoology 57 (5): 642–647, 2011].
Bounding Species Distribution Models
Stohlgren, Thomas J.; Jarnevich, Cahterine S.; Morisette, Jeffrey T.; Esaias, Wayne E.
2011-01-01
Species distribution models are increasing in popularity for mapping suitable habitat for species of management concern. Many investigators now recognize that extrapolations of these models with geographic information systems (GIS) might be sensitive to the environmental bounds of the data used in their development, yet there is no recommended best practice for "clamping" model extrapolations. We relied on two commonly used modeling approaches: classification and regression tree (CART) and maximum entropy (Maxent) models, and we tested a simple alteration of the model extrapolations, bounding extrapolations to the maximum and minimum values of primary environmental predictors, to provide a more realistic map of suitable habitat of hybridized Africanized honey bees in the southwestern United States. Findings suggest that multiple models of bounding, and the most conservative bounding of species distribution models, like those presented here, should probably replace the unbounded or loosely bounded techniques currently used [Current Zoology 57 (5): 642-647, 2011].
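The "clamping" of extrapolations described above amounts to truncating each environmental predictor at the bounds observed during model development before projecting the fitted model. A minimal sketch with made-up predictor values:

```python
import numpy as np

# Hypothetical training data: rows are sites, columns are environmental
# predictors (e.g. temperature, precipitation). Values are invented.
train = np.array([[12.0, 300.0],
                  [18.5, 450.0],
                  [25.0, 800.0]])

# Environmental bounds of the data used in model development.
lo, hi = train.min(axis=0), train.max(axis=0)

def bound_predictors(new_env):
    """Clamp new predictor values to the training bounds before feeding
    them to a fitted SDM (CART, Maxent, ...)."""
    return np.clip(new_env, lo, hi)

# A projection site more extreme than anything in the training data is
# pulled back to the most extreme conditions actually observed.
projected = bound_predictors(np.array([30.0, 100.0]))
```

The fitted model itself is unchanged; only the inputs it is asked to extrapolate over are bounded.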
Directory of Open Access Journals (Sweden)
Oleg Svatos
2013-01-01
Full Text Available In this paper we analyze the complexity of the time limits found especially in regulated processes of public administration. First we review the most popular process modeling languages. We then define an example scenario based on current Czech legislation and capture it in the discussed process modeling languages. The analysis shows that contemporary process modeling languages support capturing time limits only partially. This causes trouble for analysts and unnecessary complexity in the models. Given the unsatisfying results of the contemporary process modeling languages, we analyze the complexity of time limits in greater detail and outline the lifecycles of a time limit using the multiple dynamic generalizations pattern. As an alternative to the popular process modeling languages, we present the PSD process modeling language, which supports the defined lifecycles of a time limit natively and therefore keeps the models simple and easy to understand.
Emissions Modeling Clearinghouse
U.S. Environmental Protection Agency — The Emissions Modeling Clearinghouse (EMCH) supports and promotes emissions modeling activities both internal and external to the EPA. Through this site, the EPA...
Hébert, Hélène; Abadie, Stéphane; Benoit, Michel; Créach, Ronan; Frère, Antoine; Gailler, Audrey; Garzaglia, Sébastien; Hayashi, Yutaka; Loevenbruck, Anne; Macary, Olivier; Marcer, Richard; Morichon, Denis; Pedreros, Rodrigo; Rebour, Vincent; Ricchiuto, Mario; Silva Jacinto, Ricardo; Terrier, Monique; Toucanne, Samuel; Traversa, Paola; Violeau, Damien
2014-05-01
TANDEM (Tsunamis in the Atlantic and the English ChaNnel: Definition of the Effects through numerical Modeling) is a French research project dedicated to the appraisal of coastal effects due to tsunami waves on the French coastlines, with a special focus on the Atlantic and Channel coastlines, where French civil nuclear facilities have been operating for about 30 years. The project aims at drawing conclusions from the 2011 catastrophic tsunami and, together with a Japanese research partner, will design, adapt and validate numerical methods of tsunami hazard assessment using the outstanding database of the 2011 tsunami. The validated methods will then be applied to estimate, as accurately as possible, the tsunami hazard for the French Atlantic and Channel coastlines, in order to provide guidance for risk assessment on the nuclear facilities. The TANDEM project follows the recommendations of the International Atomic Energy Agency (IAEA) to analyse the tsunami exposure of nuclear facilities, as well as the recommendations of the French Nuclear Safety Authority (Autorité de Sûreté Nucléaire, ASN) in the aftermath of the 2011 catastrophe, which required the licensees of nuclear facilities to conduct complementary safety assessments (CSA), including "the robustness beyond their design basis". The tsunami hazard deserves an appraisal in the light of the 2011 catastrophe, to check whether any unforeseen tsunami impact can be expected for these facilities. TANDEM aims at defining the tsunami effects expected for the French Atlantic and Channel coastlines, basically from numerical modeling methods, through adaptation and improvement of numerical methods, in order to study tsunami impacts down to the interaction with coastal structures (thus sometimes using 3D approaches) (WP1). The methods will then be tested to better characterize and quantify the associated uncertainties (in the source, the propagation and the coastal impact) (WP2). The project will
Differential models in ecology
International Nuclear Information System (INIS)
Barco Gomez, Carlos; Barco Gomez, German
2002-01-01
Mathematical models written as differential equations are used to describe the population behavior of animal species through time. These models can be linear or nonlinear. The differential models for a single species include the exponential model of Malthus and the logistic model of Verhulst. The linear differential models describing the interaction between two species include competition, predation and symbiosis relationships.
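The two single-species models named above can be sketched side by side with a simple Euler integration; all parameter values are illustrative:

```python
# Exponential (Malthus) and logistic (Verhulst) growth, integrated with a
# simple Euler scheme. Parameter values are illustrative only.
r, K, n0 = 0.5, 100.0, 10.0      # growth rate, carrying capacity, initial size
dt, steps = 0.01, 1000           # 10 time units

n_malthus, n_verhulst = n0, n0
for _ in range(steps):
    n_malthus  += dt * r * n_malthus                          # dN/dt = rN
    n_verhulst += dt * r * n_verhulst * (1 - n_verhulst / K)  # dN/dt = rN(1 - N/K)

# Exponential growth is unbounded; logistic growth saturates near K.
```

The contrast is the classical one: Malthusian growth diverges, while the Verhulst trajectory approaches the carrying capacity K.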
Tashiro, Tohru
2014-03-01
We propose a new model of the diffusion of a product that includes a memory of how many adopters or advertisements a non-adopter has met, where (non-)adopters are people (not) possessing the product. This effect is lacking in the Bass model. As an application, we use the model to fit iPod sales data and obtain better agreement than with the Bass model.
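For reference, the baseline Bass model that the proposed memory model extends can be simulated in a few lines (discrete-time form; the coefficients are illustrative textbook values, not fitted to iPod data, and the memory term is not reproduced):

```python
# Discrete-time Bass diffusion: adoption is driven by innovation (p) and
# imitation (q) within a market of size M. Illustrative coefficients only.
p, q, M = 0.03, 0.38, 1000.0
N = 0.0          # cumulative adopters
sales = []       # adopters per period
for _ in range(30):
    dN = (p + q * N / M) * (M - N)
    sales.append(dN)
    N += dN
```

Per-period sales rise to an interior peak and then decline as the market saturates, which is the characteristic Bass curve.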
GARCH Modelling of Cryptocurrencies
Directory of Open Access Journals (Sweden)
Jeffrey Chu
2017-10-01
Full Text Available With the exception of Bitcoin, there appears to be little or no literature on GARCH modelling of cryptocurrencies. This paper provides the first GARCH modelling of the seven most popular cryptocurrencies. Twelve GARCH models are fitted to each cryptocurrency, and their fits are assessed in terms of five criteria. Conclusions are drawn on the best fitting models, forecasts and acceptability of value at risk estimates.
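A GARCH(1,1) process, the simplest of the fitted specifications, can be simulated directly; the parameters below are illustrative, and real fitting would use a dedicated package rather than this hand-rolled recursion:

```python
import random

# Simulate a GARCH(1,1) return series:
#   sigma_t^2 = w + a * r_{t-1}^2 + b * sigma_{t-1}^2,  r_t = sigma_t * z_t.
# Parameter values are illustrative only; a + b < 1 ensures stationarity.
random.seed(42)
w, a, b = 0.00001, 0.10, 0.85
sigma2 = w / (1 - a - b)          # start at the unconditional variance
returns = []
for _ in range(5000):
    r = random.gauss(0.0, 1.0) * sigma2 ** 0.5
    returns.append(r)
    sigma2 = w + a * r * r + b * sigma2
```

The simulated series exhibits the volatility clustering and heavy tails that motivate GARCH modelling of cryptocurrency returns.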
Yongquan Zhou; Jian Xie; Liangliang Li; Mingzhi Ma
2014-01-01
Bat algorithm (BA) is a novel stochastic global optimization algorithm. The cloud model is an effective tool for transforming between qualitative concepts and their quantitative representation. Based on the bat echolocation mechanism and the excellent characteristics of the cloud model in representing uncertain knowledge, a new cloud model bat algorithm (CBA) is proposed. This paper focuses on remodeling the echolocation model based on the living and preying characteristics of bats, utilizing the transformati...
Optimization modeling with spreadsheets
Baker, Kenneth R
2015-01-01
An accessible introduction to optimization analysis using spreadsheets Updated and revised, Optimization Modeling with Spreadsheets, Third Edition emphasizes model building skills in optimization analysis. By emphasizing both spreadsheet modeling and optimization tools in the freely available Microsoft® Office Excel® Solver, the book illustrates how to find solutions to real-world optimization problems without needing additional specialized software. The Third Edition includes many practical applications of optimization models as well as a systematic framework that il
Artificial neural network modelling
Samarasinghe, Sandhya
2016-01-01
This book covers theoretical aspects as well as recent innovative applications of Artificial Neural networks (ANNs) in natural, environmental, biological, social, industrial and automated systems. It presents recent results of ANNs in modelling small, large and complex systems under three categories, namely, 1) Networks, Structure Optimisation, Robustness and Stochasticity 2) Advances in Modelling Biological and Environmental Systems and 3) Advances in Modelling Social and Economic Systems. The book aims at serving undergraduates, postgraduates and researchers in ANN computational modelling. .
2006-01-01
This is version 1.1 of the TENCompetence Domain Model (version 1.0 released on 19-6-2006; version 1.1 on 9-11-2008). It contains several files: a) a pdf with the model description, b) three jpg files with class models (also in the pdf), c) a MagicDraw zip file with the model itself, d) a release
Petrone, Giovanni; Spagnuolo, Giovanni
2016-01-01
This comprehensive guide surveys all available models for simulating a photovoltaic (PV) generator at different levels of granularity, from cell to system level, in uniform as well as in mismatched conditions. Providing a thorough comparison among the models, engineers have all the elements needed to choose the right PV array model for specific applications or environmental conditions matched with the model of the electronic circuit used to maximize the PV power production.
International Nuclear Information System (INIS)
Tashiro, Tohru
2014-01-01
We propose a new model of the diffusion of a product that includes a memory of how many adopters or advertisements a non-adopter has met, where (non-)adopters are people (not) possessing the product. This effect is lacking in the Bass model. As an application, we use the model to fit iPod sales data and obtain better agreement than with the Bass model.
Energy Technology Data Exchange (ETDEWEB)
J. Wang
2003-06-24
The purpose of this Model Report is to document the Calibrated Properties Model that provides calibrated parameter sets for unsaturated zone (UZ) flow and transport process models for the Office of Repository Development (ORD). The UZ contains the unsaturated rock layers overlying the repository and host unit, which constitute a natural barrier to flow, and the unsaturated rock layers below the repository which constitute a natural barrier to flow and transport. This work followed, and was planned in, ''Technical Work Plan (TWP) for: Performance Assessment Unsaturated Zone'' (BSC 2002 [160819], Section 1.10.8 [under Work Package (WP) AUZM06, Climate Infiltration and Flow], and Section I-1-1 [in Attachment I, Model Validation Plans]). In Section 4.2, four acceptance criteria (ACs) are identified for acceptance of this Model Report; only one of these (Section 4.2.1.3.6.3, AC 3) was identified in the TWP (BSC 2002 [160819], Table 3-1). These calibrated property sets include matrix and fracture parameters for the UZ Flow and Transport Model (UZ Model), drift seepage models, and drift-scale and mountain-scale coupled-process models from the UZ Flow, Transport and Coupled Processes Department in the Natural Systems Subproject of the Performance Assessment (PA) Project. The Calibrated Properties Model output will also be used by the Engineered Barrier System Department in the Engineering Systems Subproject. The Calibrated Properties Model provides input through the UZ Model and other process models of natural and engineered systems to the Total System Performance Assessment (TSPA) models, in accord with the PA Strategy and Scope in the PA Project of the Bechtel SAIC Company, LLC (BSC). The UZ process models provide the necessary framework to test conceptual hypotheses of flow and transport at different scales and predict flow and transport behavior under a variety of climatic and thermal-loading conditions. UZ flow is a TSPA model component.
DEFF Research Database (Denmark)
Gudiksen, Sune Klok; Poulsen, Søren Bolvig; Buur, Jacob
2014-01-01
Well-established companies are currently struggling to secure profits due to the pressure from new players' business models as they take advantage of communication technology and new business-model configurations. Because of this, the business model research field is currently flourishing; however, t...... illustrates how the application of participatory business model design toolsets can open up discussions of alternative scenarios through improvisation, mock-up making and design game playing, before a qualitative judgment on the most promising scenario is carried out....
Model Checking Feature Interactions
DEFF Research Database (Denmark)
Le Guilly, Thibaut; Olsen, Petur; Pedersen, Thomas
2015-01-01
This paper presents an offline approach to analyzing feature interactions in embedded systems. The approach consists of a systematic process to gather the necessary information about system components and their models. The model is first specified in terms of predicates, before being refined to timed automata. The consistency of the model is verified at different development stages, and the correct linkage between the predicates and their semantic model is checked. The approach is illustrated on a use case from home automation.
DEFF Research Database (Denmark)
Thoft-Christensen, Palle
Modelling of corrosion cracking of reinforced concrete structures is complicated, as a great number of uncertain factors are involved. To obtain a reliable model, a physical and mechanical understanding of the process behind corrosion is needed.
Modeling and Simulation for Safeguards
Energy Technology Data Exchange (ETDEWEB)
Swinhoe, Martyn T. [Los Alamos National Laboratory
2012-07-26
The purpose of this talk is to give an overview of the role of modeling and simulation in Safeguards R&D and introduce you to (some of) the tools used. Some definitions are: (1) Modeling - the representation, often mathematical, of a process, concept, or operation of a system, often implemented by a computer program; (2) Simulation - the representation of the behavior or characteristics of one system through the use of another system, especially a computer program designed for the purpose; and (3) Safeguards - the timely detection of diversion of significant quantities of nuclear material. The roles of modeling and simulation are: (1) calculating amounts of material (plant modeling); (2) calculating signatures of nuclear material etc. (source terms); and (3) assessing detector performance (radiation transport and detection). Plant modeling software (e.g. FACSIM) gives the flows and amounts of material stored at all parts of the process. In safeguards this allows us to calculate the expected uncertainty of the mass and evaluate the expected MUF. We can determine the measurement accuracy required to achieve a certain performance.
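The MUF mentioned above is the closure of a material balance over one period. A minimal sketch with invented numbers, assuming independent 1-sigma measurement uncertainties combined in quadrature:

```python
# Material Unaccounted For (MUF) for one balance period, with simple
# propagation of independent measurement uncertainties (1-sigma, kg).
# All numbers are illustrative, not from any real facility.
inputs,    u_in  = 100.0, 0.5
outputs,   u_out = 95.0,  0.4
begin_inv, u_bi  = 20.0,  0.3
end_inv,   u_ei  = 24.0,  0.3

# MUF = (beginning inventory + inputs) - (ending inventory + outputs)
muf = inputs - outputs + begin_inv - end_inv
u_muf = (u_in**2 + u_out**2 + u_bi**2 + u_ei**2) ** 0.5

# A MUF within roughly 3 sigma of zero is consistent with measurement error.
significant = abs(muf) > 3 * u_muf
```

This is the sense in which plant modeling lets one evaluate the expected MUF and the measurement accuracy needed for a given detection performance.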
Model description and evaluation of model performance: DOSDIM model
International Nuclear Information System (INIS)
Lewyckyj, N.; Zeevaert, T.
1996-01-01
DOSDIM was developed to assess the impact to man from routine and accidental atmospheric releases. It is a compartmental, deterministic, radiological model. For an accidental release, dynamic transfers are used, in contrast to a routine release, for which equilibrium transfer factors are used. Parameter values were chosen to be conservative. Transfers between compartments are described by first-order differential equations. 2 figs
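First-order compartmental transfer of the kind described can be sketched with two compartments; the rate constants are illustrative, not DOSDIM parameters:

```python
# Two-compartment sketch with first-order transfer, as in compartmental
# radiological models: dA/dt = -k12 * A, dB/dt = k12 * A - kB * B.
# Rate constants (per day) and initial activities are illustrative.
k12, kB = 0.2, 0.05
A, B = 1.0, 0.0            # initial activity (arbitrary units)
dt = 0.001
for _ in range(10000):     # 10 days, simple Euler integration
    dA = -k12 * A
    dB = k12 * A - kB * B
    A += dt * dA
    B += dt * dB
```

Compartment A decays exponentially while B first fills from A and then empties at its own rate, the typical behavior of a first-order transfer chain.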
E. Gregory McPherson; Paula J. Peper
2012-01-01
This paper describes three long-term tree growth studies conducted to evaluate tree performance because repeated measurements of the same trees produce critical data for growth model calibration and validation. Several empirical and process-based approaches to modeling tree growth are reviewed. Modeling is more advanced in the fields of forestry and...
Classifying variability modeling techniques
Sinnema, Marco; Deelstra, Sybren
Variability modeling is important for managing variability in software product families, especially during product derivation. In the past few years, several variability modeling techniques have been developed, each using its own concepts to model the variability provided by a product family. The
Energy Technology Data Exchange (ETDEWEB)
Fortelius, C.; Holopainen, E.; Kaurola, J.; Ruosteenoja, K.; Raeisaenen, J. [Helsinki Univ. (Finland). Dept. of Meteorology
1996-12-31
In recent years, the modelling of interannual climate variability, the atmospheric energy and water cycles, and climate simulations with the ECHAM3 model have been studied. In addition, the climate simulations of several models have been compared, with special emphasis on the area of northern Europe
International Nuclear Information System (INIS)
Lum, C.
2004-01-01
The purpose of this model report is to document the Rock Properties Model version 3.1 with regard to input data, model methods, assumptions, uncertainties and limitations of model results, and qualification status of the model. The report also documents the differences between the current and previous versions and validation of the model. The rock properties model provides mean matrix and lithophysae porosity, and the cross-correlated mean bulk density as direct input to the ''Saturated Zone Flow and Transport Model Abstraction'', MDL-NBS-HS-000021, REV 02 (BSC 2004 [DIRS 170042]). The constraints, caveats, and limitations associated with this model are discussed in Section 6.6 and 8.2. Model validation accomplished by corroboration with data not cited as direct input is discussed in Section 7. The revision of this model report was performed as part of activities being conducted under the ''Technical Work Plan for: The Integrated Site Model, Revision 05'' (BSC 2004 [DIRS 169635]). The purpose of this revision is to bring the report up to current procedural requirements and address the Regulatory Integration Team evaluation comments. The work plan describes the scope, objectives, tasks, methodology, and procedures for this process
Vega, Solmaria Halleck; Elhorst, J. Paul
We provide a comprehensive overview of the strengths and weaknesses of different spatial econometric model specifications in terms of spillover effects. Based on this overview, we advocate taking the SLX model as point of departure in case a well-founded theory indicating which model is most
DEFF Research Database (Denmark)
Andresen, Mette
2007-01-01
-authentic modelling is also linked with the potentials of exploration of ready-made models as a forerunner for more authentic modelling processes. The discussion includes analysis of an episode of students? work in the classroom, which serves to illustrate how concept formation may be linked to explorations of a non...
Modeling EERE Deployment Programs
Energy Technology Data Exchange (ETDEWEB)
Cort, K. A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hostick, D. J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Belzer, D. B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Livingston, O. V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)
2007-11-01
This report compiles information and conclusions gathered as part of the “Modeling EERE Deployment Programs” project. The purpose of the project was to identify and characterize the modeling of deployment programs within the EERE Technology Development (TD) programs, address possible improvements to the modeling process, and note gaps in knowledge in which future research is needed.
Models of Business Internationalisation
Directory of Open Access Journals (Sweden)
Jurgita Vabinskaitė
2011-04-01
Full Text Available The study deals with the theoretical models of business internationalisation: the “Uppsala” Internationalisation Model, the modified “Uppsala” model, the Eclectic Paradigm and analysis of transactional costs, the Industrial Network approach, the Advantage Package and the Advantage Cycle. Article in Lithuanian
DEFF Research Database (Denmark)
Cameron, Ian; Gani, Rafiqul
2011-01-01
Engineering of products and processes is increasingly “model-centric”. Models in their multitudinous forms are ubiquitous, being heavily used for a range of decision-making activities across all life cycle phases. This chapter gives an overview of what a model is, the principal activities in the ...
Christensen, V.; Pauly, D.
1996-01-01
A brief review of the status of the ECOPATH modeling approach and software is presented, with emphasis on the recent release of a Windows version (ECOPATH 3.0), which enables consideration of uncertainties, and sets the stage for simulation modeling using ECOSIM. Modeling of coral reefs is emphasized.
International Nuclear Information System (INIS)
Martin Llorente, F.
1990-01-01
Models of atmospheric pollutant dispersion are based on mathematical algorithms that describe the transport, diffusion, elimination and chemical reactions of atmospheric contaminants. These models operate with contaminant emission data and estimate air quality in the area. Such models can be applied to several aspects of atmospheric contamination
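A common concrete instance of such dispersion algorithms is the steady-state Gaussian plume for a continuous point source. The sketch below uses fixed illustrative dispersion coefficients rather than stability-class parameterizations:

```python
import math

# Steady-state Gaussian plume with ground reflection: concentration at
# crosswind offset y and height z for a continuous point source.
def plume_conc(Q, u, sy, sz, H, y=0.0, z=0.0):
    """Concentration (g/m^3) for emission rate Q (g/s), wind speed u (m/s),
    dispersion sigmas sy, sz (m), and effective stack height H (m)."""
    return (Q / (2 * math.pi * u * sy * sz)
            * math.exp(-y**2 / (2 * sy**2))
            * (math.exp(-(z - H)**2 / (2 * sz**2))
               + math.exp(-(z + H)**2 / (2 * sz**2))))

# Ground-level centreline concentration for illustrative inputs.
c = plume_conc(Q=10.0, u=5.0, sy=50.0, sz=20.0, H=30.0)
```

In practice sy and sz grow with downwind distance according to atmospheric stability, which is where the emission and meteorological data mentioned above enter.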
Kelderman, Hendrikus
1984-01-01
Existing statistical tests for the fit of the Rasch model have been criticized, because they are only sensitive to specific violations of its assumptions. Contingency table methods using loglinear models have been used to test various psychometric models. In this paper, the assumptions of the Rasch
Modeling Epidemic Network Failures
DEFF Research Database (Denmark)
Ruepp, Sarah Renée; Fagertun, Anna Manolova
2013-01-01
This paper presents the implementation of a failure propagation model for transport networks when multiple failures occur, resulting in an epidemic. We implement the Susceptible Infected Disabled (SID) epidemic model and validate it by comparing it to analytical solutions. Furthermore, we evaluate...... to evaluate multiple epidemic scenarios in various network types....
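The record does not give the SID equations; a plausible SIR-style reading, in which infected network components become permanently disabled, can be sketched as follows (assumed form and rates, not the paper's model):

```python
# Assumed Susceptible-Infected-Disabled (SID) sketch: susceptibles are
# infected at rate beta*S*I and infected components become disabled at
# rate delta. This is an SIR-like form chosen for illustration only.
beta, delta = 0.3, 0.1
S, I, D = 0.99, 0.01, 0.0      # fractions of network components
dt = 0.01
for _ in range(20000):         # 200 time units, Euler integration
    new_inf = beta * S * I
    new_dis = delta * I
    S += dt * (-new_inf)
    I += dt * (new_inf - new_dis)
    D += dt * new_dis
```

With beta/delta > 1 the failure epidemic takes off and most components end up disabled, which is the regime where such propagation models matter for transport networks.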
DEFF Research Database (Denmark)
Friis, Silje Alberthe Kamille; Gelting, Anne Katrine Gøtzsche
2014-01-01
the approaches and reach a new level of conscious action when designing? Informed by theories of design thinking, knowledge production, and learning, we have developed a model, the 5C model, accompanied by 62 method cards. Examples of how the model has been applied in an educational setting are provided...
The nontopological soliton model
International Nuclear Information System (INIS)
Wilets, L.
1988-01-01
The nontopological soliton model introduced by Friedberg and Lee, and variations of it, provide a method for modeling QCD which can effectively include the dynamics of hadronic collisions as well as spectra. Absolute color confinement is effected by the assumed dielectric properties of the medium. A recently proposed version of the model is chirally invariant. 32 refs., 5 figs., 1 tab
International Nuclear Information System (INIS)
Thomas, A.W.
1981-01-01
Recent developments in the bag model, in which the constraints of chiral symmetry are explicitly included are reviewed. The model leads to a new understanding of the Δ-resonance. The connection of the theory with current algebra is clarified and implications of the model for the structure of the nucleon are discussed
Flexible survival regression modelling
DEFF Research Database (Denmark)
Cortese, Giuliana; Scheike, Thomas H; Martinussen, Torben
2009-01-01
Regression analysis of survival data, and more generally event history data, is typically based on Cox's regression model. We here review some recent methodology, focusing on the limitations of Cox's regression model. The key limitation is that the model is not well suited to represent time-varyi...