WorldWideScience

Sample records for model model definition

  1. Model Based Definition

    Science.gov (United States)

    Rowe, Sidney E.

    2010-01-01

    In September 2007, the Engineering Directorate at the Marshall Space Flight Center (MSFC) created the Design System Focus Team (DSFT). MSFC was responsible for the in-house design and development of the Ares 1 Upper Stage, and the Engineering Directorate was preparing to deploy a new electronic Configuration Management and Data Management System with the Design Data Management System (DDMS), based upon a Commercial Off The Shelf (COTS) Product Data Management (PDM) system. The DSFT was to establish standardized CAD practices and a new data life cycle for design data. Of special interest here, the design teams were to implement Model Based Definition (MBD) in support of the Upper Stage manufacturing contract. Notably, this implementation of MBD retains partially dimensioned drawings as auxiliary information to the model. The design data lifecycle implemented several new release states to be used prior to formal release, allowing the models to move through a flow of progressive maturity. The DSFT identified some 17 Lessons Learned as outcomes of the standards development, pathfinder deployments and initial application to the Upper Stage design completion. Some of the high-value examples are reviewed.

  2. Mental Models: A Robust Definition

    Science.gov (United States)

    Rook, Laura

    2013-01-01

    Purpose: The concept of a mental model has been described by theorists from diverse disciplines. The purpose of this paper is to offer a robust definition of an individual mental model for use in organisational management. Design/methodology/approach: The approach adopted involves an interdisciplinary literature review of disciplines, including…

  4. Conceptualising Business Models: Definitions, Frameworks and Classifications

    Directory of Open Access Journals (Sweden)

    Erwin Fielt

    2013-12-01

    The business model concept is gaining traction in different disciplines but is still criticized for being fuzzy and vague and lacking consensus on its definition and compositional elements. In this paper we set out to advance our understanding of the business model concept by addressing three areas of foundational research: business model definitions, business model elements, and business model archetypes. We define a business model as a representation of the value logic of an organization in terms of how it creates and captures customer value. This abstract and generic definition is made more specific and operational by the compositional elements, which need to address the customer, value proposition, organizational architecture (firm and network level) and economics dimensions. Business model archetypes complement the definition and elements by providing a more concrete and empirical understanding of the business model concept. The main contributions of this paper are (1) explicitly including the customer value concept in the business model definition and focussing on value creation, (2) presenting four core dimensions that business model elements need to cover, (3) arguing for flexibility by adapting and extending business model elements to cater for different purposes and contexts (e.g. technology, innovation, strategy), (4) stressing a more systematic approach to business model archetypes by using business model elements for their description, and (5) suggesting the use of business model archetype research for the empirical exploration and testing of business model elements and their relationships.

  5. A New Definition of Models and Modeling in Chemistry's Teaching

    Science.gov (United States)

    Chamizo, José A.

    2013-07-01

    The synthesis of new chemical compounds makes chemistry the most productive science. Unfortunately, chemistry education practice has not been driven to any great extent by research findings, philosophical positions, or advances in new ways of approaching knowledge, and the changes that have occurred in textbooks during the past three decades show no real recognition of these. In contrast to the different types of models previously reported in the literature, ranging from an `empirical reliability with minimal realism' approach to realism, this paper presents a new simple and broad definition, a typology of models, and their relation with modeling.

  6. Sensitivity of resource selection and connectivity models to landscape definition

    Science.gov (United States)

    Katherine A. Zeller; Kevin McGarigal; Samuel A. Cushman; Paul Beier; T. Winston Vickers; Walter M. Boyce

    2017-01-01

    Context: The definition of the geospatial landscape is the underlying basis for species-habitat models, yet sensitivity of habitat use inference, predicted probability surfaces, and connectivity models to landscape definition has received little attention. Objectives: We evaluated the sensitivity of resource selection and connectivity models to four landscape...

  7. Translating Building Information Modeling to Building Energy Modeling Using Model View Definition

    Directory of Open Access Journals (Sweden)

    WoonSeong Jeong

    2014-01-01

    This paper presents a new approach to translate between Building Information Modeling (BIM) and Building Energy Modeling (BEM) that uses Modelica, an object-oriented declarative, equation-based simulation environment. The approach (BIM2BEM) has been developed using a data modeling method to enable seamless model translations of building geometry, materials, and topology. Using data modeling, we created a Model View Definition (MVD) consisting of a process model and a class diagram. The process model demonstrates object-mapping between BIM and Modelica-based BEM (ModelicaBEM) and facilitates the definition of required information during model translations. The class diagram represents the information and object relationships to produce a class package intermediate between the BIM and BEM. The implementation of the intermediate class package enables system interface (Revit2Modelica) development for automatic BIM data translation into ModelicaBEM. In order to demonstrate and validate our approach, simulation result comparisons have been conducted via three test cases using (1) the BIM-based Modelica models generated from Revit2Modelica and (2) BEM models manually created using LBNL Modelica Buildings library. Our implementation shows that BIM2BEM (1) enables BIM models to be translated into ModelicaBEM models, (2) enables system interface development based on the MVD for thermal simulation, and (3) facilitates the reuse of original BIM data into building energy simulation without an import/export process.

  8. Translating building information modeling to building energy modeling using model view definition.

    Science.gov (United States)

    Jeong, WoonSeong; Kim, Jong Bum; Clayton, Mark J; Haberl, Jeff S; Yan, Wei

    2014-01-01

    This paper presents a new approach to translate between Building Information Modeling (BIM) and Building Energy Modeling (BEM) that uses Modelica, an object-oriented declarative, equation-based simulation environment. The approach (BIM2BEM) has been developed using a data modeling method to enable seamless model translations of building geometry, materials, and topology. Using data modeling, we created a Model View Definition (MVD) consisting of a process model and a class diagram. The process model demonstrates object-mapping between BIM and Modelica-based BEM (ModelicaBEM) and facilitates the definition of required information during model translations. The class diagram represents the information and object relationships to produce a class package intermediate between the BIM and BEM. The implementation of the intermediate class package enables system interface (Revit2Modelica) development for automatic BIM data translation into ModelicaBEM. In order to demonstrate and validate our approach, simulation result comparisons have been conducted via three test cases using (1) the BIM-based Modelica models generated from Revit2Modelica and (2) BEM models manually created using LBNL Modelica Buildings library. Our implementation shows that BIM2BEM (1) enables BIM models to be translated into ModelicaBEM models, (2) enables system interface development based on the MVD for thermal simulation, and (3) facilitates the reuse of original BIM data into building energy simulation without an import/export process.
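
    The abstract above describes an MVD-style mapping from BIM objects (geometry, materials) to Modelica-based BEM components. The sketch below is a hypothetical, heavily simplified illustration of that idea, not the actual BIM2BEM/Revit2Modelica implementation; the `BimWall` class and field names are invented, while the emitted Modelica class (`ThermalConductor` with parameter `G`) is a real component of the Modelica Standard Library.

```python
# Hypothetical sketch of an MVD-style object mapping: a BIM wall record is
# translated into a Modelica thermal-network component declaration.
from dataclasses import dataclass

@dataclass
class BimWall:
    name: str          # element identifier from the BIM model (assumed field)
    area_m2: float     # geometry extracted from the BIM model
    u_value: float     # thermal transmittance from BIM material data, W/(m^2.K)

def to_modelica(wall: BimWall) -> str:
    """Emit a Modelica-style thermal-conductor declaration for one wall."""
    g = wall.u_value * wall.area_m2   # overall conductance G = U * A, W/K
    return (f"Modelica.Thermal.HeatTransfer.Components.ThermalConductor "
            f"{wall.name}(G={g:.2f});")

walls = [BimWall("southWall", 12.0, 0.35), BimWall("northWall", 12.0, 0.35)]
for w in walls:
    print(to_modelica(w))   # each wall becomes one component declaration
```

    A real translation would also carry topology (which surfaces bound which zones), which is what the class-diagram part of the MVD captures.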

  9. Moving towards maturity in business model definitions

    DEFF Research Database (Denmark)

    Nielsen, Christian; Lund, Morten; Bukh, Per Nikolaj

    2014-01-01

    The field of business models has, as is the case with all emerging fields of practice, slowly matured through the development of frameworks, models, concepts and ideas over the last 15 years. New concepts, theories and models typically transcend a series of maturity phases. For the concept of Business Models, we are at the verge of moving from phase 2 to 3, after having spent a lot of time during the 1990's and 2000's arguing for the importance of understanding business models properly and discussing the content and potential building blocks of them. Therefore, in terms of maturity, the time for focusing on the more complex and dynamic aspects of business models seems to be right - right now!

  10. A unified model for yeast transcript definition.

    Science.gov (United States)

    de Boer, Carl G; van Bakel, Harm; Tsui, Kyle; Li, Joyce; Morris, Quaid D; Nislow, Corey; Greenblatt, Jack F; Hughes, Timothy R

    2014-01-01

    Identifying genes in the genomic context is central to a cell's ability to interpret the genome. Yet, in general, the signals used to define eukaryotic genes are poorly described. Here, we derived simple classifiers that identify where transcription will initiate and terminate using nucleic acid sequence features detectable by the yeast cell, which we integrate into a Unified Model (UM) that models transcription as a whole. The cis-elements that denote where transcription initiates function primarily through nucleosome depletion, and, using a synthetic promoter system, we show that most of these elements are sufficient to initiate transcription in vivo. Hrp1 binding sites are the major characteristic of terminators; these binding sites are often clustered in terminator regions and can terminate transcription bidirectionally. The UM predicts global transcript structure by modeling transcription of the genome using a hidden Markov model whose emissions are the outputs of the initiation and termination classifiers. We validated the novel predictions of the UM with available RNA-seq data and tested it further by directly comparing the transcript structure predicted by the model to the transcription generated by the cell for synthetic DNA segments of random design. We show that the UM identifies transcription start sites more accurately than the initiation classifier alone, indicating that the relative arrangement of promoter and terminator elements influences their function. Our model presents a concrete description of how the cell defines transcript units, explains the existence of nongenic transcripts, and provides insight into genome evolution.
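
    The Unified Model above feeds the outputs of initiation and termination classifiers into a hidden Markov model whose decoded state path gives the transcript structure. The following is my own toy sketch of that coupling, not the authors' model: a two-state HMM ("transcribed" vs "intergenic") whose emission log-probabilities come from invented per-position classifier scores, decoded with Viterbi.

```python
# Toy two-state HMM over genome positions; emissions are per-position
# classifier probabilities of being transcribed (all numbers invented).
import math

def viterbi(emis, trans, start):
    """emis: list of dicts state -> log p(obs); trans/start: log-prob dicts."""
    states = list(start)
    V = [{s: start[s] + emis[0][s] for s in states}]   # initial scores
    back = []
    for e in emis[1:]:
        row, ptr = {}, {}
        for s in states:
            prev = max(states, key=lambda p: V[-1][p] + trans[p][s])
            row[s] = V[-1][prev] + trans[prev][s] + e[s]
            ptr[s] = prev
        V.append(row)
        back.append(ptr)
    last = max(states, key=lambda s: V[-1][s])
    path = [last]                      # backtrack the best state sequence
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return path[::-1]

# Classifier scores: high "transcribed" probability in the middle positions.
p_tx = [0.1, 0.2, 0.9, 0.95, 0.9, 0.2, 0.1]
emis = [{"tx": math.log(p), "ig": math.log(1 - p)} for p in p_tx]
log = math.log
trans = {"tx": {"tx": log(0.9), "ig": log(0.1)},   # sticky transitions smooth
         "ig": {"ig": log(0.9), "tx": log(0.1)}}   # over noisy single positions
start = {"tx": log(0.5), "ig": log(0.5)}
print(viterbi(emis, trans, start))
# -> ['ig', 'ig', 'tx', 'tx', 'tx', 'ig', 'ig']: one contiguous transcript unit
```

    The sticky transition probabilities are what let the HMM merge noisy per-position calls into contiguous transcript units, which is the behaviour the paper validates against RNA-seq data.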

  11. Ontology for Life-Cycle Modeling of Electrical Distribution Systems: Application of Model View Definition Attributes

    Science.gov (United States)

    2013-06-01

    Keywords: Construction-Operations Building information exchange (COBie), Building Information Modeling (BIM). ...Efforts to develop a life-cycle building model have resulted in the definition of a "core" building information model that contains general information... ...develop an information-exchange Model View Definition (MVD) for building electrical systems. The objective of the current work was to document the...

  12. 40 CFR 85.2302 - Definition of model year.

    Science.gov (United States)

    2010-07-01

    Title 40, Protection of Environment (2010-07-01 edition), § 85.2302: Definition of model year. Environmental Protection Agency (Continued), Air Programs (Continued), Control of Air Pollution From Mobile Sources, Determination of Model Year for Motor Vehicles and Engines Used in Motor Vehicles Under...

  13. General Meta-Models to Analysis of Software Architecture Definitions

    Directory of Open Access Journals (Sweden)

    GholamAli Nejad HajAli Irani

    2011-12-01

    An important step toward understanding software architecture is to provide a clear definition of it. More than 150 valid definitions have been presented for identifying software architecture, so a comparison among them is needed to give us a better understanding of the existing definitions. In this paper an analysis of different issues of current definitions is provided based on their incorporated elements. In conjunction with this objective, the definitions are first collected and, after an analysis has been conducted, are broken into their constituent elements, which are shown in one table. Then selected parameters in the table are classified into groups for comparison purposes, and all parameters of each individual group are specified and compared with each other. This procedure is repeated for each group. Finally, a meta-model is developed for each group. The aim is not to accept or reject a specific definition, but rather to contrast the definitions and their respective constituent elements in order to construct a background for gaining better perceptions of software architecture, which in turn can benefit the introduction of an appropriate definition.

  14. Building a Shared Definitional Model of Long Duration Human Spaceflight

    Science.gov (United States)

    Orr, M.; Whitmire, A.; Sandoval, L.; Leveton, L.; Arias, D.

    2011-01-01

    In 1956, on the eve of human space travel, Strughold first proposed a simple classification of the present and future stages of manned flight that identified key factors, risks and developmental stages for the evolutionary journey ahead. As we look to optimize the potential of the ISS as a gateway to new destinations, we need a current, shared working definitional model of long duration human spaceflight to help guide our path. An initial search of the formal and grey literature was augmented by liaison with subject matter experts. The search strategy focused on the terms long duration mission and long duration spaceflight, as well as broader related current and historical definitions and classification models of spaceflight. The related sea and air travel literature was subsequently explored with a view to identifying analogous models or classification systems. There are multiple different definitions and classification systems for spaceflight, covering the phase and type of mission, craft and payload, and related risk-management models. However, the frequently used concepts of long duration mission and long duration spaceflight are infrequently operationally defined by authors, and no commonly referenced classical or gold-standard definition or model of these terms emerged from the search. The categorization (Cat) system for sailing was found to be of potential analogous utility, with its focus on understanding the need for crew and craft autonomy at various levels of potential adversity and inability to gain outside support or return to a safe location, due to factors of time, distance and location.

  15. Promoting Model-based Definition to Establish a Complete Product Definition.

    Science.gov (United States)

    Ruemler, Shawn P; Zimmerman, Kyle E; Hartman, Nathan W; Hedberg, Thomas; Feeny, Allison Barnard

    2017-05-01

    The manufacturing industry is evolving and starting to use 3D models as the central knowledge artifact for product data and product definition, or what is known as Model-based Definition (MBD). The Model-based Enterprise (MBE) uses MBD as a way to transition away from using traditional paper-based drawings and documentation. As MBD grows in popularity, it is imperative to understand what information is needed in the transition from drawings to models so that models represent all the relevant information needed for processes to continue efficiently. Finding this information can help define what data is common amongst different models in different stages of the lifecycle, which could help establish a Common Information Model. The Common Information Model is a source that contains common information from domain specific elements amongst different aspects of the lifecycle. To help establish this Common Information Model, information about how models are used in industry within different workflows needs to be understood. To retrieve this information, a survey mechanism was administered to industry professionals from various sectors. Based on the results of the survey a Common Information Model could not be established. However, the results gave great insight that will help in further investigation of the Common Information Model.

  16. Basic definitions for discrete modeling of computer worms epidemics

    Directory of Open Access Journals (Sweden)

    P. Guevara

    2015-04-01

    Information technologies have evolved in such a way that communication between computers or hosts has become common, so much so that organizations worldwide (governments and corporations) depend on it; the consequences if these computers stopped working for a long time would be catastrophic. Unfortunately, networks are attacked by malware such as viruses and worms that could collapse the system. This has served as motivation for the formal study of computer worms and epidemics in order to develop strategies for prevention and protection. That is why in this paper, before analyzing epidemiological models, a set of formal definitions based on set theory and functions is proposed to describe 21 concepts used in the study of worms. These definitions provide a basis for future qualitative research on the behavior of computer worms, and quantitative research on their epidemiological models.
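
    As a concrete illustration of the kind of discrete epidemiological model such definitions underpin, here is my own toy discrete-time SIR-style sketch of a worm outbreak (not the authors' formalism; all rates and the population size are invented): S hosts are vulnerable, I are infected and scanning, R are patched or cleaned.

```python
# Discrete-time SIR-style worm epidemic, one update per time tick.
def step(S, I, R, beta, gamma, N):
    """beta: infection rate per scan contact; gamma: patch/clean rate."""
    new_inf = beta * S * I / N     # expected new infections this tick
    new_rec = gamma * I            # hosts patched or cleaned this tick
    return S - new_inf, I + new_inf - new_rec, R + new_rec

N = 10000.0
S, I, R = N - 10, 10.0, 0.0        # outbreak seeded with 10 infected hosts
for _ in range(50):
    S, I, R = step(S, I, R, beta=0.5, gamma=0.1, N=N)
print(round(S), round(I), round(R))   # epidemic spreads far beyond 10 hosts
```

    With beta/gamma = 5 the worm spreads until the susceptible pool is largely exhausted; the qualitative behaviour, not the exact numbers, is the point.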

  17. A mathematical model of symmetry based on mathematical definition

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Tolerance is imperative for seamless integration of CAD/CAM (Computer Aided Design/Computer Aided Manufacture), yet in present CAD systems tolerance is just a text attribute with no semantics. There are many tolerance types, and the relations between them are very complicated. In addition, the different principles of tolerance make the study of tolerance difficult, and there may be various meanings or interpretations for the same type of tolerance because of its literal definition. In this work, the latest unambiguous mathematical definition was applied to study, explain and clarify (1) the formation and representation of the tolerance zone, and (2) the formation and representation of variational elements, after which the mathematical models of symmetry under different tolerance principles and different interpretations were derived. An example is given to illustrate the application of these models in tolerance analysis.
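
    To make the tolerance-zone idea concrete: under the standard definition, a symmetry tolerance of value t requires the measured median feature (e.g. the median plane of a slot) to lie within a zone of width t centred on the datum plane. The check below is my own minimal illustration of that zone test, not the paper's mathematical model; the sampled deviation values are invented.

```python
# Symmetry tolerance-zone check: every sampled median-point deviation from
# the datum centre plane must lie within +/- t/2.
def within_symmetry_zone(deviations_mm, t_mm):
    """True if all median-point deviations fit inside a zone of width t_mm."""
    return all(abs(d) <= t_mm / 2 for d in deviations_mm)

# Deviations of sampled median points from the datum centre plane, in mm:
sampled = [0.010, -0.015, 0.020, -0.008]
print(within_symmetry_zone(sampled, t_mm=0.05))  # True: all within +/-0.025
print(within_symmetry_zone(sampled, t_mm=0.03))  # False: 0.020 > 0.015
```

    A full mathematical model, as in the paper, would also represent how the zone itself varies under different tolerance principles (e.g. independency vs envelope).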

  19. Fuzzy Entropy: Axiomatic Definition and Neural Networks Model

    Institute of Scientific and Technical Information of China (English)

    QING Ming; CAO Yue; HUANG Tian-min

    2004-01-01

    The measure of uncertainty is adopted as a measure of information. Measures of fuzziness are known as fuzzy information measures, and the measure of the quantity of fuzzy information gained from a fuzzy set or fuzzy system is known as fuzzy entropy. Fuzzy entropy has been studied by many researchers in various fields. In this paper, the axiomatic definition of fuzzy entropy is first discussed. Then, a neural networks model of fuzzy entropy is proposed, based on the computing capability of neural networks. Finally, two examples are discussed to show the efficiency of the model.
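
    As a concrete instance of an axiomatically valid fuzzy entropy (used here for illustration; the paper's neural-network model is not reproduced), the classical De Luca-Termini logarithmic entropy satisfies the usual axioms: it is 0 for crisp sets, maximal when every membership value is 0.5, and symmetric under mu -> 1 - mu.

```python
# Normalised De Luca-Termini fuzzy entropy of a finite fuzzy set.
import math

def fuzzy_entropy(mu):
    """mu: iterable of membership values in [0, 1]; returns entropy in [0, 1]."""
    def h(m):
        if m in (0.0, 1.0):
            return 0.0                       # crisp element: no fuzziness
        return -(m * math.log(m) + (1 - m) * math.log(1 - m))
    mu = list(mu)
    return sum(h(m) for m in mu) / (len(mu) * math.log(2))  # normalise to [0,1]

print(fuzzy_entropy([0.0, 1.0, 1.0]))   # 0.0  (crisp set)
print(fuzzy_entropy([0.5, 0.5]))        # 1.0  (maximally fuzzy)
```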

  20. Structural Dynamics Model Updating with Positive Definiteness and No Spillover

    Directory of Open Access Journals (Sweden)

    Yongxin Yuan

    2014-01-01

    Model updating is a common method to improve the correlation between structural dynamics models and measured data. In conducting the updating, it is desirable to match only the measured spectral data without tampering with the other unmeasured and unknown eigeninformation in the original model (if so, the model is said to be updated with no spillover) and to maintain the positive definiteness of the coefficient matrices. In this paper, an efficient numerical method for updating mass and stiffness matrices simultaneously is presented. The method first updates the modal frequencies. Then, a method is presented to construct a transformation matrix, and this matrix is used to correct the analytical eigenvectors so that the updated model is compatible with the measurement of the eigenvectors. The method preserves both no spillover and the symmetric positive definiteness of the mass and stiffness matrices. It is computationally efficient, as neither iteration nor numerical optimization is required. A numerical example shows that the presented method is quite accurate and efficient.
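
    The positive-definiteness requirement above can be verified cheaply in practice: a real matrix is symmetric positive definite exactly when it is symmetric and has a Cholesky factorisation. The snippet below is an illustrative check (my own, not the paper's updating algorithm), with invented 2x2 stiffness matrices.

```python
# Check that an updated coefficient matrix stays symmetric positive definite.
import numpy as np

def is_spd(A, tol=1e-10):
    """True if A is symmetric positive definite (via Cholesky)."""
    if not np.allclose(A, A.T, atol=tol):
        return False                      # not symmetric
    try:
        np.linalg.cholesky(A)             # succeeds iff A is positive definite
        return True
    except np.linalg.LinAlgError:
        return False

K = np.array([[2.0, -1.0], [-1.0, 2.0]])   # analytical stiffness (SPD)
dK = np.array([[0.1, 0.0], [0.0, 0.1]])    # symmetric correction
print(is_spd(K + dK))                                   # True: SPD preserved
print(is_spd(K - np.array([[3.0, 0.0], [0.0, 0.0]])))   # False: no longer PD
```

    An updating scheme like the paper's is designed so that this check passes by construction, rather than being verified after the fact.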

  1. Persuasive Game Design: A model and its definitions

    NARCIS (Netherlands)

    Visch, V.T.; Vegt, N.J.H.; Anderiesen, H.; Van der Kooij, K.

    2013-01-01

    The following position paper proposes a general theoretical model for persuasive game design. This model combines existing theories on persuasive technology, serious gaming, and gamification. The model is based on user experience, gamification design, and transfer effects.

  3. A Model-Driven Approach for Telecommunications Network Services Definition

    Science.gov (United States)

    Chiprianov, Vanea; Kermarrec, Yvon; Alff, Patrick D.

    The present-day telecommunications market imposes a short concept-to-market time on service providers. To reduce it, we propose a computer-aided, model-driven, service-specific tool, with support for collaborative work and for checking properties on models. We started by defining a prototype of the meta-model (MM) of the service domain. Using this prototype, we defined a simple graphical modeling language specific to service designers. We are currently enlarging the MM of the domain using model transformations from Network Abstraction Layers (NALs). In the future, we will investigate approaches to ensure support for collaborative work and for checking properties on models.

  4. Definition of Model-based diagnosis problems with Altarica

    OpenAIRE

    Pencolé, Yannick; Chanthery, Elodie; Peynot, Thierry

    2016-01-01

    This paper presents a framework for modeling diagnosis problems based on a formal language called Altarica. The initial purpose of the Altarica language was to define a modeling language for safety analysis. The language has been developed as a collaboration between academic and industrial partners and is used in some industrial companies. The paper shows that the expressivity of this language, mixing event-based and state-based models, is sufficient to model classi...

  5. TAPWAT: Definition structure and applications for modelling drinking water treatment

    NARCIS (Netherlands)

    Versteegh JFM; Gaalen FW van; Rietveld LC; Evers EG; Aldenberg TA; Cleij P; Technische Universiteit Delft; LWD

    2001-01-01

    The 'Tool for the Analysis of the Production of drinking WATer' (TAPWAT) model has been developed for describing drinking-water quality in integral studies in the context of the Environmental Policy Assessment of the RIVM. The model consists of modules that represent individual steps in a treatment

  6. TAPWAT: Definition structure and applications for modelling drinking water treatment

    NARCIS (Netherlands)

    Versteegh JFM; van Gaalen FW; Rietveld LC; Evers EG; Aldenberg TA; Cleij P; LWD

    2001-01-01

    The TAPWAT model (Tool for the Analysis of the Production of drinking WATer) was developed to describe drinking-water quality for integral studies in the context of the Environment and Nature Policy Assessment Office of the RIVM. The model consists of modules that represent the individual treatment steps of the

  8. Health literacy and public health: A systematic review and integration of definitions and models

    LENUS (Irish Health Repository)

    Sorensen, Kristine

    2012-01-25

    Background: Health literacy concerns the knowledge and competences of persons to meet the complex demands of health in modern society. Although its importance is increasingly recognised, there is no consensus about the definition of health literacy or about its conceptual dimensions, which limits the possibilities for measurement and comparison. The aim of the study is to review definitions and models of health literacy to develop an integrated definition and conceptual model capturing the most comprehensive evidence-based dimensions of health literacy. Methods: A systematic literature review was performed to identify definitions and conceptual frameworks of health literacy. A content analysis of the definitions and conceptual frameworks was carried out to identify the central dimensions of health literacy and develop an integrated model. Results: The review resulted in 17 definitions of health literacy and 12 conceptual models. Based on the content analysis, an integrative conceptual model was developed containing 12 dimensions referring to the knowledge, motivation and competencies of accessing, understanding, appraising and applying health-related information within the healthcare, disease prevention and health promotion settings, respectively. Conclusions: Based upon this review, a model is proposed integrating medical and public health views of health literacy. The model can serve as a basis for developing health-literacy-enhancing interventions and provide a conceptual basis for the development and validation of measurement tools, capturing the different dimensions of health literacy within the healthcare, disease prevention and health promotion settings.

  10. Influence of magnetospheric inputs definition on modeling of ionospheric storms

    Science.gov (United States)

    Tashchilin, A. V.; Romanova, E. B.; Kurkin, V. I.

    For numerical modeling of ionospheric storms, the parameters of the neutral atmosphere and magnetosphere are usually specified by empirical models. The statistical nature of these models makes them impractical for simulating an individual storm, so the empirical models must be corrected using various additional considerations. This work investigates the influence of magnetospheric inputs, such as the distributions of electric potential and of the number and energy fluxes of precipitating electrons, on the results of ionospheric storm simulations. To this end, for the strong geomagnetic storm of September 25, 1998, hourly global distributions of these magnetospheric inputs from 20 to 27 September were calculated by the magnetogram inversion technique (MIT). Then, with the help of a 3-D ionospheric model, two variants of the ionospheric response to this magnetic storm were simulated, using MIT data and empirical models of the electric fields (Sojka et al., 1986) and electron precipitation (Hardy et al., 1985). Comparison of the results showed that, for high-latitude and subauroral stations, the daily variations of electron density calculated with MIT data are closer to observations than those from the empirical models. In addition, using the MIT data reveals some peculiarities in the daily variations of electron density during a strong geomagnetic storm. References: Sojka J.J., Rasmussen C.E., Schunk R.W., J. Geophys. Res., 1986, N10, p. 11281. Hardy D.A., Gussenhoven M.S., Holeman E.A., J. Geophys. Res., 1985, N5, p. 4229.

  11. Investigation of FE model size definition for surface coating application

    Science.gov (United States)

    Chen, Yanhong; Zhuang, Weimin; Wang, Shiwen; Lin, Jianguo; Balint, Daniel; Shan, Debin

    2012-09-01

    Efficient prediction of the mechanical performance of coating structures has been a constant concern since the dawn of surface engineering. However, the predictive models presented by early research are normally based on traditional solid mechanics and thus cannot predict coating performance accurately. Also, the high computational costs that originate from the distinctive structure of surface coating systems (a large difference in scale between coating and substrate) are not well addressed by these models. To meet the needs for accurate prediction and low computational cost, a multi-axial continuum damage mechanics (CDM)-based constitutive model is introduced for the investigation of the load-bearing capacity and fracture properties of coatings. Material parameters within the proposed constitutive model are determined for a typical coating (TiN) and substrate (Cu) system. An efficient numerical subroutine is developed to implement the constitutive model in the commercial FE solver ABAQUS through the user-defined subroutine VUMAT. By changing the geometrical sizes of the FE models, a series of computations is carried out to investigate (1) loading features, (2) stress distributions, and (3) failure features of the coating system. The results show that there is a critical displacement corresponding to each FE model size, and a reasonable prediction is achieved only if the applied normal loading displacement is smaller than the critical displacement. Finally, a 3D map of the critical displacement is generated to guide users in determining an FE model of suitable geometrical size for surface coating simulations. This paper presents an effective modelling approach for predicting the mechanical performance of surface coatings.

  12. Beyond a Definition: Toward a Framework for Designing and Specifying Mentoring Models

    Science.gov (United States)

    Dawson, Phillip

    2014-01-01

    More than three decades of mentoring research has yet to converge on a unifying definition of mentoring; this is unsurprising given the diversity of relationships classified as mentoring. This article advances beyond a definition toward a common framework for specifying mentoring models. Sixteen design elements were identified from the literature…

  14. Gene-disease relationship discovery based on model-driven data integration and database view definition

    National Research Council Canada - National Science Library

    Yilmaz, S; Jonveaux, P; Bicep, C; Pierron, L; Smaïl-Tabbone, M; Devignes, M.D

    2009-01-01

    .... orthologous or interacting genes. These definitions guide data modelling in our database approach for gene-disease relationship discovery and are expressed as views which ultimately lead to the retrieval of documented sets of candidate genes...

  15. Biomass Scenario Model Scenario Library: Definitions, Construction, and Description

    Energy Technology Data Exchange (ETDEWEB)

    Inman, D.; Vimmerstedt, L.; Bush, B.; Peterson, S.

    2014-04-01

    Understanding the development of the biofuels industry in the United States is important to policymakers and industry. The Biomass Scenario Model (BSM) is a system dynamics model of the biomass-to-biofuels system that can be used to explore policy effects on biofuels development. Because of the complexity of the model, as well as the wide range of possible future conditions that affect biofuels industry development, we have not developed a single reference case but instead developed a set of specific scenarios that provide various contexts for our analyses. The purpose of this report is to describe the scenarios that comprise the BSM scenario library. At present, the library contains the following policy-focused scenarios: minimal policies, ethanol-focused policies, equal access to policies, output-focused policies, technological-diversity-focused policies, and point-of-production-focused policies. This report describes each scenario, its policy settings, and general insights gained through use of the scenarios in analytic studies.

  16. Analyzing Differences in Operational Disease Definitions Using Ontological Modeling

    NARCIS (Netherlands)

    Peelen, Linda; Klein, Michel; Schlobach, Stefan; Keizer, de Nicolette; Peek, Niels

    2007-01-01

    In medicine, there are many diseases which cannot be precisely characterized but are considered as natural kinds. In the communication between health care professionals, this is generally not problematic. In biomedical research, however, crisp definitions are required to unambiguously distinguish

  17. Analyzing Differences in Operational Disease Definitions Using Ontological Modeling

    OpenAIRE

    Peelen, Linda; Klein, Michel; Schlobach, Stefan; Keizer, de, Rob J.W.; Peek, Niels

    2007-01-01

    In medicine, there are many diseases which cannot be precisely characterized but are considered as natural kinds. In the communication between health care professionals, this is generally not problematic. In biomedical research, however, crisp definitions are required to unambiguously distinguish patients with and without the disease. In practice, this results in different operational definitions being in use for a single disease. This paper presents an approach to compare different operation...

  19. Definition of zones with different levels of productivity within an agricultural field using fuzzy modeling

    Science.gov (United States)

    Zoning of agricultural fields is an important task for the utilization of precision farming technology. One method for the definition of zones with different levels of productivity is based on a fuzzy indicator model. A fuzzy indicator model for identification of zones with different levels of productivit...

  20. Holonomy Spin Foam Models: Definition and Coarse Graining

    CERN Document Server

    Bahr, Benjamin; Hellmann, Frank; Kaminski, Wojciech

    2012-01-01

    We propose a new holonomy formulation for spin foams, which naturally extends the theory space of lattice gauge theories. This allows current spin foam models to be defined on arbitrary two-complexes and generalizes them to arbitrary, in particular finite, groups. The similarity with standard lattice gauge theories allows standard coarse-graining methods to be applied, which for finite groups can now easily be studied numerically. We summarize other holonomy and spin network formulations of spin foams and group field theories and explain how the different representations arise through variable transformations in the partition function. A companion paper will provide a description of boundary Hilbert spaces as well as a canonical dynamics encoded in transfer operators.

  1. Latino Definitions of Success: A Cultural Model of Intercultural Competence.

    Science.gov (United States)

    Torres, Lucas

    2009-01-01

    The present study sought to examine Latino intercultural competence via two separate methodologies. Phase 1 entailed discovering and generating themes regarding the features of intercultural competence based on semistructured interviews of 15 Latino adults. Phase 2 included conducting a cultural consensus analysis from the quantitative responses of 46 Latino adults to determine the cultural model of intercultural competence. The major results indicated that the participants, despite variations in socioeconomic and generational statuses, shared a common knowledge base regarding the competencies needed for Latinos to successfully navigate different cultures. Overall, the cultural model of Latino intercultural competence includes a set of skills that integrates traditional cultural values along with attributes of self-efficacy. The findings are discussed within a competence-based conceptualization of cultural adaptation and potential advancements in acculturation research.

  2. Ports: Definition and study of types, sizes and business models

    Directory of Open Access Journals (Sweden)

    Ivan Roa

    2013-09-01

    Full Text Available Purpose: In the world today there are thousands of port facilities of different types and sizes, competing to capture a share of the market for seaborne freight. This article aims to determine the most common port type and size, in order to find out which business model is applied in that segment and what the legal status of the companies operating such infrastructure is. Design/methodology/approach: To achieve this goal, we carried out research on a representative sample of 800 ports worldwide, which handle 90% of containerized port cargo, and then determined the legal status of the companies that manage them. Findings: The results indicate a dominant port type and size, mostly managed by companies operating under a concession model. Research limitations/implications: We study only ports that handle freight (basically containerized), ignoring other activities such as fishing, military, tourism or recreational uses. Originality/value: This investigation shows that the vast majority of port facilities in the studied segment are governed by a similar corporate model and subject to pressure from markets that increasingly demand efficiency and service. Consequently, terminals tend to be concessioned to private operators in a process that might be called privatization, although in the strictest sense of the term this is not entirely accurate, because ownership of the land never ceases to be public.

  3. Static Potential in the SU(2)-Higgs Model and Coupling Constant Definitions in Lattice and Continuum Models

    CERN Document Server

    Csikor, Ferenc; Hegedüs, P; Piróth, A

    1999-01-01

    We present a one-loop calculation of the static potential in the SU(2)-Higgs model. The connection to the coupling constant definition used in lattice simulations is clarified. The consequences in comparing lattice simulations and perturbative results for finite temperature applications are explored.

  4. HIV lipodystrophy case definition using artificial neural network modelling

    DEFF Research Database (Denmark)

    Ioannidis, John P A; Trikalinos, Thomas A; Law, Matthew

    2003-01-01

    OBJECTIVE: A case definition of HIV lipodystrophy has recently been developed from a combination of clinical, metabolic and imaging/body composition variables using logistic regression methods. We aimed to evaluate whether artificial neural networks could improve the diagnostic accuracy. METHODS...

  5. Multimedia Modeling of engineered Nanoparticles with SimpleBox4Nano: Model Definition and Evaluation

    NARCIS (Netherlands)

    Meesters, J.; Koelmans, A.A.; Quik, J.T.K.; Hendriks, A.J.; Meent, van de D.

    2014-01-01

    Screening level models for environmental assessment of engineered nanoparticles (ENP) are not generally available. Here we present SimpleBox for Nano (SB4N) as the first model of this type, motivate its validity and evaluate it by comparisons with a known material flow model. SB4N expresses ENP transport and concentrations in and across air, rain, surface waters, soil, and sediment.

  7. Ontology for Life-Cycle Modeling of Water Distribution Systems: Model View Definition

    Science.gov (United States)

    2013-06-01

    ... building information models (BIM) at the coordinated design stage of building construction. ... To document the process and exchange ... exchanging Building Information Modeling (BIM) data, which defines hundreds of classes for common use in software, currently supported by approximately ... Keywords: Construction Operations Building information exchange (COBie), Building Information Modeling (BIM).

  8. Models

    DEFF Research Database (Denmark)

    Juel-Christiansen, Carsten

    2005-01-01

    The article highlights visual rotation - images, drawings, models, works - as the privileged medium in the communication of ideas between creative architects.

  9. High definition geomagnetic models: A new perspective for improved wellbore positioning

    DEFF Research Database (Denmark)

    Maus, Stefan; Nair, Manoj C.; Poedjono, Benny;

    2012-01-01

    Earth's gravity and magnetic fields are used as natural reference frames in directional drilling. The azimuth of the bottomhole assembly is inferred by comparing the magnetic field measured-while-drilling (MWD) with a geomagnetic reference model. To provide a reference of sufficient quality...... for accurate well placement, the US National Geophysical Data Center (NGDC), in partnership with industry, has developed high-definition geomagnetic models (HDGM), updated regularly using the latest satellite, airborne and marine measurements of the Earth's magnetic field. Standard geomagnetic reference models....... These are compiled into a global magnetic anomaly grid and expanded into ellipsoidal harmonics. The harmonic expansion coefficients are then included in the high-definition models to accurately represent the direction and strength of the local geomagnetic field. The latest global model to degree and order 720...

  10. Implications of the 90-day episode definition used for the Comprehensive Care for Joint Replacement model

    Science.gov (United States)

    Ellimoottil, Chad; Ryan, Andrew M.; Hou, Hechuan; Dupree, James M.; Hallstrom, Brian; Miller, David C.

    2017-01-01

    Importance Under the Comprehensive Care for Joint Replacement (CJR) model, hospitals are held accountable for nearly all Medicare payments that occur from the initial hospitalization through 90 days post-discharge (i.e., the episode of care). It is unknown whether unrelated expenditures resulting from this “broad” episode definition will impact participating hospitals’ average 90-day episode payments. Objective To compare the CJR program’s broad episode definition with a clinically narrow episode definition. Design We identified Medicare claims for patients in Michigan who underwent joint replacement from 2011 through 2013. Using specifications from the CJR model and the clinically narrow Hospital Compare payment measure, we constructed episodes of care and calculated 90-day episode payments. We then compared hospitals’ average 90-day episode payments under the two episode definitions and fit linear regression models to understand whether payment differences were associated with specific hospital characteristics (average CMS-HCC risk score, rural hospital status, joint replacement volume, percentage of Medicaid discharges, teaching hospital status, number of beds, percentage of joint replacements performed on African American patients, and median income of the hospital’s county). Setting All Michigan hospitals located in metropolitan statistical areas. Participants Medicare beneficiaries. Main Outcome and Measure(s) The correlation and difference between average 90-day episode payments using the broad CJR model episode definition and the clinically narrow Hospital Compare episode definition. Results We identified 23,251 joint replacement episodes. 90-day episode payments using the broad CJR episode definition ranged from $17,349 to $29,465 (mean: $22,122, standard deviation: $2,600). Episode payments were slightly lower (mean: $21,670) when the Hospital Compare episode definition was used. Both methods were strongly correlated (r=0.99, p<0.001). The average

  11. Refining definitions of periodontal disease and caries for prediction models of incident tooth loss.

    Science.gov (United States)

    Houshmand, Mohammad; Holtfreter, Birte; Berg, Marie Henrike; Schwahn, Christian; Meisel, Peter; Biffar, Reiner; Kindler, Stefan; Kocher, Thomas

    2012-07-01

    To assess the suitability of different definitions of caries and periodontitis for inclusion in tooth loss prediction models. The Study of Health in Pomerania (SHIP) is a population-based cohort study conducted in 1997-2001 (SHIP-0) and 2002-2006 (SHIP-1). This sample comprised 2,780 subjects aged 20-81 years with complete information on dental and periodontal status [DMFS status, clinical attachment loss (CAL) and probing depth (PD)]. Analyses on five-year tooth loss were limited to half-mouth data. The predictive value of tested definitions was markedly age- and gender-dependent: in 20-39-aged men, the number of decayed or filled surfaces best predicted the number of lost teeth, whereas in young women CAL≥4 mm performed best. In older subjects, periodontal definitions were superior to caries definitions: mean CAL performed best in 40-59-year olds, whereas AL- or PD-related definitions predicted best in 60-81-year olds. On tooth level, mean CAL was the superior definition to assess 5-year incident tooth loss in all strata except for young men. Caries parameters best predicted incident tooth loss in men aged 20-39 years; in the intermediate and oldest age group and in young women, mean AL was most informative. Therefore, prediction models need to be developed for different age and gender groups. © 2012 John Wiley & Sons A/S.

  12. Asymmetric and Non–Positive Definite Distance Functions Part II: Modeling

    Directory of Open Access Journals (Sweden)

    H. Sánchez–Larios

    2009-01-01

    Full Text Available Traditionally, the distance functions involved in problems of Operations Research have been modeled using positive linear combinations of Lp metrics; the resulting distance functions are thus symmetric, uniform and positive definite. Starting from a new definition of arc length, we propose a method for modeling generalized distance functions, which we call premetrics, that can be asymmetric, non-uniform, and non-positive definite. We show that every distance function satisfying the triangle inequality and having a continuous one-sided directional derivative can be modeled as a problem of the calculus of variations. The "length" of a d-geodesic arc C(a,b) from a to b with respect to the premetric d (the d-length) can be negative, and therefore the d-distance from a to b may represent the minimum energy needed to move a mobile object from a to b. We illustrate our method with two examples.
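
    The abstract's notion of a direction-dependent, possibly negative d-length can be illustrated with a toy discrete analogue of the arc-length integral. The 1-D elevation profile and the cost coefficients below are our own assumptions, not the authors' formulation.

```python
# Toy premetric: a direction-dependent path "length" over a 1-D elevation
# profile. k_up and k_down are hypothetical cost coefficients; descending
# recovers energy, so d-lengths can be negative and d(a,b) != d(b,a).

def segment_cost(h0, h1, k_up=2.0, k_down=0.5):
    """Cost of moving from elevation h0 to h1 (asymmetric in direction)."""
    dh = h1 - h0
    return k_up * dh if dh > 0 else k_down * dh

def d_length(profile):
    """Discrete analogue of the arc-length integral along a piecewise path."""
    return sum(segment_cost(profile[k], profile[k + 1])
               for k in range(len(profile) - 1))

forward = d_length([0.0, 3.0, 1.0])   # climb 3, descend 2 -> 5.0
backward = d_length([1.0, 3.0, 0.0])  # climb 2, descend 3 -> 2.5
downhill = d_length([3.0, 0.0])       # pure descent: negative d-length, -1.5
```

    Because the cost is additive along concatenated paths, the triangle inequality the abstract requires is preserved even though symmetry and positive definiteness are not.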

  13. Modelling

    CERN Document Server

    Spädtke, P

    2013-01-01

    Modeling of technical machines became a standard technique once computers became powerful enough to handle the amount of data relevant to the specific system. Simulation of an existing physical device requires knowledge of all relevant quantities. Electric fields given by the surrounding boundary, as well as magnetic fields caused by coils or permanent magnets, have to be known. Internal sources of both fields are sometimes taken into account, such as space-charge forces or the internal magnetic field of a moving bunch of charged particles. The solver routines used are briefly described, and some benchmarking is shown to estimate the necessary computing times for different problems. Different types of charged particle sources are presented together with suitable models to describe the underlying physics. Electron guns are covered as well as different ion sources (volume ion sources, laser ion sources, Penning ion sources, electron resonance ion sources, and H$^-$ sources), together with some remarks on beam transport.

  14. A Lotka-Volterra competition model and its global convergence to a definite axial equilibrium.

    Science.gov (United States)

    Sikder, Asim

    2002-04-01

    We consider a four-species model based on competition and show that the whole four-species system collapses to a definite single species equilibrium at its carrying capacity. To do so, we use the results of Hirsch, Van Den Driessche and Zeeman, Hofbauer and Sigmund, and the product theorem of the Conley connection matrix theory by Mischaikow and Reineck.
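
    The collapse to a single-species equilibrium described above can be sketched with a hypothetical four-species Lotka-Volterra competition system. The coefficients below are our own illustration, not those analyzed in the paper; they are chosen so that species 1 excludes the other three.

```python
# Hypothetical 4-species Lotka-Volterra competition system in which species 1
# (index 0) is the superior competitor, so the whole community collapses to
# its single-species equilibrium at carrying capacity K = 1.

def simulate(x0, r, K, a, dt=0.01, steps=50000):
    """Forward-Euler integration of dx_i/dt = r_i x_i (1 - sum_j a_ij x_j / K_i)."""
    x = list(x0)
    n = len(x)
    for _ in range(steps):
        x = [max(0.0, x[i] + dt * r[i] * x[i]
                 * (1.0 - sum(a[i][j] * x[j] for j in range(n)) / K[i]))
             for i in range(n)]
    return x

n = 4
r, K = [1.0] * n, [1.0] * n
# a[i][j] = effect of species j on species i; species 0 presses harder on
# the others (1.5) than they press on it (0.5), forcing competitive exclusion.
a = [[1.0 if i == j else (0.5 if i == 0 else 1.5) for j in range(n)]
     for i in range(n)]
final = simulate([0.1, 0.2, 0.2, 0.2], r, K, a)  # ~[1.0, 0.0, 0.0, 0.0]
```

    With these coefficients no interior equilibrium exists, so the trajectory converges to the axial equilibrium of species 1, mirroring the global convergence result of the abstract.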

  15. model

    African Journals Online (AJOL)

    the neural construction of individual and communal identities in ... occurs, including models based on information processing, ... Applying the DSM descriptive approach to dissociation in the ... a personal, narrative path that connects personal to ethnic ... managed the problem in the context of the community, using a ...

  16. Development of a definition, classification system, and model for cultural geology

    Science.gov (United States)

    Mitchell, Lloyd W., III

    The concept for this study is based upon a personal interest by the author, an American Indian, in promoting cultural perspectives in undergraduate college teaching and learning environments. Most academicians recognize that merged fields can enhance undergraduate curricula. However, conflict may occur when instructors attempt to merge social science fields such as history or philosophy with geoscience fields such as mining and geomorphology. For example, ideologies of Earth structures derived from scientific methodologies may conflict with historical and spiritual understandings of Earth structures held by American Indians. Specifically, this study addresses the problem of how to combine cultural studies with the geosciences into a new merged academic discipline called cultural geology. This study further attempts to develop the merged field of cultural geology using an approach consisting of three research foci: a definition, a classification system, and a model. Literature reviews were conducted for all three foci. Additionally, to better understand merged fields, a literature review was conducted specifically for academic fields that merged social and physical sciences. Methodologies concentrated on the three research foci: definition, classification system, and model. The definition was derived via a two-step process. The first step, developing keyword hierarchical ranking structures, was followed by creating and analyzing semantic word meaning lists. The classification system was developed by reviewing 102 classification systems and incorporating selected components into a system framework. The cultural geology model was created also utilizing a two-step process. A literature review of scientific models was conducted. Then, the definition and classification system were incorporated into a model felt to reflect the realm of cultural geology. A course syllabus was then developed that incorporated the resulting definition, classification system, and model. 

  17. Multimedia modeling of engineered nanoparticles with SimpleBox4nano: model definition and evaluation.

    Science.gov (United States)

    Meesters, Johannes A J; Koelmans, Albert A; Quik, Joris T K; Hendriks, A Jan; van de Meent, Dik

    2014-05-20

    Screening level models for environmental assessment of engineered nanoparticles (ENP) are not generally available. Here, we present SimpleBox4Nano (SB4N) as the first model of this type, assess its validity, and evaluate it by comparisons with a known material flow model. SB4N expresses ENP transport and concentrations in and across air, rain, surface waters, soil, and sediment, accounting for nanospecific processes such as aggregation, attachment, and dissolution. The model solves simultaneous mass balance equations (MBE) using simple matrix algebra. The MBEs link all concentrations and transfer processes using first-order rate constants for all processes known to be relevant for ENPs. The first-order rate constants are obtained from the literature. The output of SB4N is mass concentrations of ENPs as free dispersive species, heteroaggregates with natural colloids, and larger natural particles in each compartment in time and at steady state. Known scenario studies for Switzerland were used to demonstrate the impact of the transport processes included in SB4N on the prediction of environmental concentrations. We argue that SB4N-predicted environmental concentrations are useful as background concentrations in environmental risk assessment.
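
    The simultaneous mass balance equations solved "using simple matrix algebra" can be sketched for a toy three-compartment system. The compartments, rate constants and emission below are invented placeholders, not SB4N's actual parameterization.

```python
# Toy version of the SB4N-style steady state: mass balances dm/dt = A m + e,
# with first-order rate constants assembled into matrix A, are solved with
# plain matrix algebra. All rate constants here are made-up placeholders.

def solve(A, b):
    """Gaussian elimination with partial pivoting (small dense systems)."""
    n = len(b)
    M = [row[:] + [b[k]] for k, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda rr: abs(M[rr][col]))
        M[col], M[piv] = M[piv], M[col]
        for rr in range(col + 1, n):
            f = M[rr][col] / M[col][col]
            for cc in range(col, n + 1):
                M[rr][cc] -= f * M[col][cc]
    x = [0.0] * n
    for k in range(n - 1, -1, -1):
        x[k] = (M[k][n] - sum(M[k][j] * x[j] for j in range(k + 1, n))) / M[k][k]
    return x

# Compartments: 0 = air, 1 = water, 2 = sediment; rates in 1/day (invented).
k_dep, k_sett, k_resus = 0.2, 0.05, 0.01   # deposition, settling, resuspension
k_deg_a, k_deg_w, k_bur = 0.1, 0.02, 0.03  # degradation (air, water), burial
A = [
    [-(k_dep + k_deg_a), 0.0, 0.0],            # losses from air
    [k_dep, -(k_sett + k_deg_w), k_resus],     # gains/losses of water
    [0.0, k_sett, -(k_resus + k_bur)],         # gains/losses of sediment
]
e = [1.0, 0.0, 0.0]            # continuous emission to air, kg/day
m = solve(A, [-v for v in e])  # steady state of dm/dt = A m + e => A m = -e
```

    At steady state, total emission equals total degradation plus burial, which is a convenient check on any such rate-constant matrix.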

  18. The advanced model definition and analysis of orthodontic parameters on 3D digital models

    Directory of Open Access Journals (Sweden)

    Majstorović Nemanja V.

    2017-01-01

    Full Text Available Introduction/Objective. Digital 3D modeling is slowly becoming an everyday orthodontic practice, and after two decades of research and development it is a basic element of e-orthodontics. The aim of this study was development and use of geometric entities on 3D digital models for diagnosing, planning and monitoring of orthodontic therapy, by using CAD (computer aided design systems. Methods. Statistical analysis and synthesis of 54 orthodontic parameters (28 in the upper and 26 in the lower jaw, defining three hypotheses and their testing, the application of the t-test. Results. All three hypotheses are confirmed, convenience of using geometric entities, higher accuracy of 3D digital models, and more substantial displacement of teeth in the first six months of therapy (Student’s t-test. After the first six months, distances in the x–y plane (occlusal plane were bigger in both the upper and the lower jaw; additionally, the distances in the y–z plane (medial plane decreased on the left and right side, so we can say that the first phase of therapy had success and that both jaws are wider. At the next four controls, parameters showed slight progress that was not statistically significant. Overall, after 11 months of therapy, there was a considerable improvement in the x–y plane, while changes in distances of clinical crown heights were very small. This could be explained by the fact that, during therapy, by using different arches, upper molars were pushed inside, toward the palate. Analyzing 3D computer models, we could notice that in this plane displacement of the upper left first molar was larger. Conclusion. The use of geometric entities for defining orthodontic parameters gives us new possibilities for accurate and reliable analysis of patient’s orthodontic condition.

  19. Defining epidemics in computer simulation models: How do definitions influence conclusions?

    Directory of Open Access Journals (Sweden)

    Carolyn Orbann

    2017-06-01

    Full Text Available Computer models have proven to be useful tools in studying epidemic disease in human populations. Such models are being used by a broader base of researchers, and it has become more important to ensure that descriptions of model construction and data analyses are clear and communicate important features of model structure. Papers describing computer models of infectious disease often lack a clear description of how the data are aggregated and whether or not non-epidemic runs are excluded from analyses. Given that there is no concrete quantitative definition of what constitutes an epidemic within the public health literature, each modeler must decide on a strategy for identifying epidemics during simulation runs. Here, an SEIR model was used to test the effects of how varying the cutoff for considering a run an epidemic changes potential interpretations of simulation outcomes. Varying the cutoff from 0% to 15% of the model population ever infected with the illness generated significant differences in numbers of dead and timing variables. These results are important for those who use models to form public health policy, in which questions of timing or implementation of interventions might be answered using findings from computer simulation models.
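
    The kind of experiment described, classifying a run as an epidemic only when the fraction ever infected exceeds a cutoff, can be sketched with a deterministic SEIR model. All parameter values and the 15% cutoff below are illustrative assumptions, not the study's settings.

```python
# Deterministic SEIR model with an attack-rate cutoff for calling a run an
# "epidemic". Parameters (sigma, gamma, population size, cutoff) are made up.

def seir_attack_rate(beta, sigma=0.2, gamma=0.1, n=1000, i0=1,
                     dt=0.1, steps=20000):
    """Integrate SEIR with forward Euler; return the fraction ever infected."""
    s, e, i, r = float(n - i0), 0.0, float(i0), 0.0
    for _ in range(steps):
        new_exposed = beta * s * i / n * dt   # S -> E
        new_infectious = sigma * e * dt       # E -> I
        new_recovered = gamma * i * dt        # I -> R
        s -= new_exposed
        e += new_exposed - new_infectious
        i += new_infectious - new_recovered
        r += new_recovered
    return (n - s) / n

def is_epidemic(attack_rate, cutoff=0.15):
    """Classify a run by comparing its attack rate to the chosen cutoff."""
    return attack_rate >= cutoff
```

    Raising the cutoff from 0% reclassifies small outbreaks as non-epidemics and so changes the set of runs entering the analysis, which is precisely the sensitivity the abstract reports.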

  20. A New Definition and Calculation Model for Evolutionary Multi-Objective Optimization

    Institute of Scientific and Technical Information of China (English)

    Zhou Ai-min; Kang Li-shan; Chen Yu-ping; Huang Yu-zhen

    2003-01-01

    We present a new definition (Evolving Solutions) for the Multi-objective Optimization Problem (MOP) to answer the basic question (what is a multi-objective optimal solution?) and advance an asynchronous evolutionary model (the MINT Model) to solve MOPs. The new theory is based on our understanding of natural evolution and on an analysis of the difference between natural evolution and MOPs; thus it differs both from Converting Optimization and from Pareto Optimization. Some tests show that the new theory may overcome the disadvantages of the two methods above to some extent.
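
    The Pareto Optimization that the authors contrast their definition with rests on the dominance relation; a minimal sketch of non-dominated filtering (all objectives minimized, sample objective vectors are arbitrary illustrations):

```python
# Minimal Pareto-dominance machinery: a point dominates another if it is no
# worse in every objective and strictly better in at least one; the Pareto
# front is the set of non-dominated points.

def dominates(a, b):
    """True if objective vector a dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Keep the non-dominated points (the Pareto-optimal set)."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

front = pareto_front([(1, 5), (2, 3), (4, 1), (3, 3), (5, 5)])
# -> [(1, 5), (2, 3), (4, 1)]; (3, 3) and (5, 5) are dominated
```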

  1. Process and building information modelling in the construction industry by using information delivery manuals and model view definitions

    DEFF Research Database (Denmark)

    Karlshøj, Jan

    2012-01-01

    The construction industry is gradually increasing its use of structured information and building information modelling. To date, the industry has suffered from the disadvantages of a project-based organizational structure and ad hoc solutions. Furthermore, it is not used to formalizing the flow of information and specifying exactly which objects and properties are needed for each process and which information is produced by the processes. The present study is based on reviewing the existing Information Delivery Manual (IDM) methodology from buildingSMART, which is also an ISO standard (ISO 29481 Part 1), and the Model View Definition (MVD) methodology developed by buildingSMART and BLIS. The research also includes a review of concrete IDM development projects carried out over the last five years. Although the study has identified interest in the IDM methodology in a number...

  2. Gene-disease relationship discovery based on model-driven data integration and database view definition.

    Science.gov (United States)

    Yilmaz, S; Jonveaux, P; Bicep, C; Pierron, L; Smaïl-Tabbone, M; Devignes, M D

    2009-01-15

    Computational methods are widely used to discover gene-disease relationships hidden in vast masses of available genomic and post-genomic data. In most current methods, a similarity measure is calculated between gene annotations and known disease genes or disease descriptions. However, more explicit gene-disease relationships are required for better insights into the molecular bases of diseases, especially for complex multi-gene diseases. Explicit relationships between genes and diseases are formulated as candidate gene definitions that may include intermediary genes, e.g. orthologous or interacting genes. These definitions guide data modelling in our database approach for gene-disease relationship discovery and are expressed as views which ultimately lead to the retrieval of documented sets of candidate genes. A system called ACGR (Approach for Candidate Gene Retrieval) has been implemented and tested with three case studies including a rare orphan gene disease.

  3. A Simple Line Drawing Definition and Transfer Model for Facial Animation Generation

    Directory of Open Access Journals (Sweden)

    Qingxiang Wang

    2013-11-01

    Full Text Available The Line Drawing Animation is an active research area in Non-Photorealistic Rendering. Much research has focused on sketch abstraction, such as portrait drawing of humans and animation generation. However, most models are too complex to compute, or attend to unstable details, which makes them unsuitable for real-time transfer over continuous video sequences. This paper proposes a simple line drawing definition and transfer model built on Bézier curves and the core AAM fit parameters. The facial line drawings cover seven basic emotions: neutral, happiness, anger, disgust, fear, sadness and surprise. Each drawing in a specific model consists of the same set of cubic Bézier curves, so the proposed model is suitable for shape-combination animation. In the experiment, the AAM method is used to extract the facial features of the face and then find the nearest combination of emotions to transfer to the line drawing model. The results show that the method is simple and fast: only a few parameters need to be transferred, which makes it suitable for recording and communication.
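
    The curve primitive behind such a line-drawing definition can be sketched directly: cubic Bézier curves sharing one control-point layout, so two emotions can be blended point-wise. Control points below are invented for illustration (the paper fits its curves from AAM parameters):

```python
def cubic_bezier(p0, p1, p2, p3, t):
    """Evaluate a cubic Bezier curve at parameter t in [0, 1]."""
    u = 1.0 - t
    x = u**3 * p0[0] + 3 * u**2 * t * p1[0] + 3 * u * t**2 * p2[0] + t**3 * p3[0]
    y = u**3 * p0[1] + 3 * u**2 * t * p1[1] + 3 * u * t**2 * p2[1] + t**3 * p3[1]
    return (x, y)

def blend_curves(curve_a, curve_b, w):
    """Blend the control points of two same-layout curves (shape combination)."""
    return [tuple(w * a + (1 - w) * b for a, b in zip(pa, pb))
            for pa, pb in zip(curve_a, curve_b)]

neutral = [(0, 0), (1, 0), (2, 0), (3, 0)]   # a "neutral" mouth stroke
happy = [(0, 0), (1, 1), (2, 1), (3, 0)]     # the same stroke for "happiness"
half = blend_curves(happy, neutral, 0.5)     # halfway between the two emotions
```

    Because every drawing uses the same set of curves, transferring an expression only requires sending the control points (or blend weights), which is what makes the representation cheap to record and communicate.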

  4. Exploratory analysis regarding the domain definitions for computer based analytical models

    Science.gov (United States)

    Raicu, A.; Oanta, E.; Barhalescu, M.

    2017-08-01

    Our previous computer-based studies dedicated to structural problems using analytical methods defined the composite cross section of a beam as the result of Boolean operations on so-called ‘simple’ shapes. Through generalisation, the class of ‘simple’ shapes came to include areas bounded by curves approximated using spline functions and areas approximated as polygons. However, particular definitions lead to particular solutions. In order to move beyond the current limitations, we conceived a general definition of the cross sections, which are now considered calculus domains consisting of several subdomains. The corresponding set of input data uses complex parameterizations. This new vision allows us to naturally assign an arbitrary number of attributes to the subdomains, making it possible to model new phenomena that use map-wise information, such as the equilibrium diagrams of metal alloys. The hierarchy of the input data text files, which use the comma-separated-value format, and their structure are also presented and discussed in the paper. This new approach allows us to reuse the concepts and part of the data-processing software instruments already developed. The corresponding software to be developed will be modularised and generalised for use in upcoming projects that require rapid development of computer-based models.

  5. Roles of Definitional and Assessment Models in the Identification of New or Second Language Learners of English for Special Education

    Science.gov (United States)

    Barrera, Manuel

    2006-01-01

    This article examines the efficacy of current definitional perspectives on learning disabilities (LD) and related assessment models to support appropriate instructional and support services for learners of English with learning-related difficulties. A revised framework for defining LD and an associated assessment model, curriculum-based dynamic…

  6. Subject-Specific Tendon-Aponeurosis Definition in Hill-Type Model Predicts Higher Muscle Forces in Dynamic Tasks

    OpenAIRE

    Pauline Gerus; Guillaume Rao; Eric Berton

    2012-01-01

    Neuromusculoskeletal models are a common method to estimate muscle forces. Developing accurate neuromusculoskeletal models is a challenging task due to the complexity of the system and large inter-subject variability. The estimation of muscle force is based on the mechanical properties of the tendon-aponeurosis complex. Most neuromusculoskeletal models use a generic definition of the tendon-aponeurosis complex based on in vitro tests, perhaps limiting their validity. Ultrasonography allows subject-specific ...

  7. Questioning the “classical” in Persian painting: models and problems of definition

    Directory of Open Access Journals (Sweden)

    Christiane Gruber

    2012-06-01

    Full Text Available In scholarship on Persian book arts, paintings have tended to be organized according to a rise-and-fall model. Within this overarching framework, the Ilkhanid period represents the birth of painting and the Qajar era its supposed decline, while Timurid and Safavid painting mark a high point for the development of pictorial arts in Iran. As a result, scholars have used the term ‘classical’ to describe both Timurid and Safavid painting. The many definitions of ‘classical’ – which alternatively engage with aesthetic criteria, time periods, numerical output, systems of patronage, artistic models, and stylistic imitations – raise a number of significant questions, however. This study highlights the problematic uses of the term in scholarship on Persian manuscript painting. Moreover, by examining a series of interrelated Ilkhanid, Timurid, and Safavid paintings of the Prophet Muhammad in particular, it seeks to explore alternative models for studying the history of Persian manuscript painting, itself too diverse and self-referential to be confined to a linear account.

  8. Novel Insights into the Genetic Controls of Primitive and Definitive Hematopoiesis from Zebrafish Models

    Directory of Open Access Journals (Sweden)

    Raman Sood

    2012-01-01

    Full Text Available Hematopoiesis is a dynamic process where initiation and maintenance of hematopoietic stem cells, as well as their differentiation into erythroid, myeloid and lymphoid lineages, are tightly regulated by a network of transcription factors. Understanding the genetic controls of hematopoiesis is crucial as perturbations in hematopoiesis lead to diseases such as anemia, thrombocytopenia, or cancers, including leukemias and lymphomas. Animal models, particularly conventional and conditional knockout mice, have played major roles in our understanding of the genetic controls of hematopoiesis. However, knockout mice for most of the hematopoietic transcription factors are embryonic lethal, thus precluding the analysis of their roles during the transition from embryonic to adult hematopoiesis. Zebrafish are an ideal model organism to determine the function of a gene during embryonic-to-adult transition of hematopoiesis since bloodless zebrafish embryos can develop normally into early larval stage by obtaining oxygen through diffusion. In this review, we discuss the current status of the ontogeny and regulation of hematopoiesis in zebrafish. By providing specific examples of zebrafish morphants and mutants, we have highlighted the contributions of the zebrafish model to our overall understanding of the roles of transcription factors in regulation of primitive and definitive hematopoiesis.

  9. Challenges in phenotype definition in the whole-genome era: multivariate models of memory and intelligence.

    Science.gov (United States)

    Sabb, F W; Burggren, A C; Higier, R G; Fox, J; He, J; Parker, D S; Poldrack, R A; Chu, W; Cannon, T D; Freimer, N B; Bilder, R M

    2009-11-24

    Refining phenotypes for the study of neuropsychiatric disorders is of paramount importance in neuroscience. Poor phenotype definition provides the greatest obstacle for making progress in disorders like schizophrenia, bipolar disorder, Attention Deficit/Hyperactivity Disorder (ADHD), and autism. Using freely available informatics tools developed by the Consortium for Neuropsychiatric Phenomics (CNP), we provide a framework for defining and refining latent constructs used in neuroscience research and then apply this strategy to review known genetic contributions to memory and intelligence in healthy individuals. This approach can help us begin to build multi-level phenotype models that express the interactions between constructs necessary to understand complex neuropsychiatric diseases. These results are available online through the http://www.phenowiki.org database. Further work needs to be done in order to provide consensus-building applications for the broadly defined constructs used in neuroscience research.

  10. Psychoacoustic and cognitive aspects of auditory roughness: definitions, models, and applications

    Science.gov (United States)

    Vassilakis, Pantelis N.; Kendall, Roger A.

    2010-02-01

    The term "auditory roughness" was first introduced in the 19th century to describe the buzzing, rattling auditory sensation accompanying narrow harmonic intervals (i.e. two tones with frequency difference in the range of ~15-150Hz, presented simultaneously). A broader definition and an overview of the psychoacoustic correlates of the auditory roughness sensation, also referred to as sensory dissonance, is followed by an examination of efforts to quantify it over the past one hundred and fifty years and leads to the introduction of a new roughness calculation model and an application that automates spectral and roughness analysis of sound signals. Implementation of spectral and roughness analysis is briefly discussed in the context of two pilot perceptual experiments, designed to assess the relationship among cultural background, music performance practice, and aesthetic attitudes towards the auditory roughness sensation.
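
    As a rough illustration of the kind of quantity such calculation models produce, here is a simplified Plomp-Levelt/Sethares-style roughness estimate for a pair of pure tones. This is not the authors' own model; the constants are commonly quoted textbook values:

```python
import math

def pair_roughness(f1, a1, f2, a2):
    """Roughness of two simultaneous pure tones (frequencies in Hz, amplitudes)."""
    b1, b2 = 3.5, 5.75        # decay rates of the two exponentials
    s1, s2 = 0.0207, 18.96    # place the peak near 25% of the critical bandwidth
    s = 0.24 / (s1 * min(f1, f2) + s2)
    d = abs(f2 - f1)
    return a1 * a2 * (math.exp(-b1 * s * d) - math.exp(-b2 * s * d))

r_unison = pair_roughness(440, 1.0, 440, 1.0)  # no beating: zero roughness
r_narrow = pair_roughness(440, 1.0, 470, 1.0)  # ~30 Hz apart: pronounced roughness
r_wide = pair_roughness(440, 1.0, 880, 1.0)    # an octave apart: nearly smooth
```

    The curve reproduces the qualitative behaviour in the abstract: roughness vanishes at unison, peaks for frequency differences of roughly 15-150 Hz, and decays again for wide intervals. A full analysis tool sums such pairwise contributions over all partials of a sound's spectrum.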

  11. Implications of the Definition of an Episode of Care Used in the Comprehensive Care for Joint Replacement Model.

    Science.gov (United States)

    Ellimoottil, Chad; Ryan, Andrew M; Hou, Hechuan; Dupree, James M; Hallstrom, Brian; Miller, David C

    2017-01-01

    Under the Comprehensive Care for Joint Replacement (CJR) model, hospitals are held accountable for nearly all Medicare payments that occur during the initial hospitalization until 90 days after hospital discharge (ie, the episode of care). It is not known whether unrelated expenditures resulting from this "broad" definition of an episode of care will affect participating hospitals' average episode-of-care payments. To compare the CJR program's broad definition of an episode of care with a clinically narrow definition of an episode of care. We identified Medicare claims for 23 251 patients in Michigan who were Medicare beneficiaries and who underwent joint replacement during the period from 2011 through 2013 at hospitals located in metropolitan statistical areas. Using specifications from the CJR model and the clinically narrow Hospital Compare payment measure, we constructed episodes of care and calculated 90-day episode payments. We then compared hospitals' average 90-day episode payments using the 2 definitions of an episode of care and fit linear regression models to understand whether payment differences were associated with specific hospital characteristics (average Centers for Medicare & Medicaid Services-hierarchical condition categories risk score, rural hospital status, joint replacement volume, percentage of Medicaid discharges, teaching hospital status, number of beds, percentage of joint replacements performed on African American patients, and median income of the hospital's county). We performed analyses from July 1 through October 1, 2015. The correlation and difference between average 90-day episode payments using the broad definition of an episode of care in the CJR model and the clinically narrow Hospital Compare definition of an episode of care. We identified 23 251 joint replacements (ie, episodes of care). The 90-day episode payments using the broad definition of the CJR model ranged from $17 349 to $29 465 (mean [SD] payment, $22 122
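
    The broad-versus-narrow comparison reduces to summing different subsets of the claims in the 90-day window. A toy sketch with invented claim data (not the Michigan Medicare claims analyzed in the study):

```python
# Each claim in the 90-day episode window, flagged by clinical relatedness.
claims = [
    {"cost": 15000, "related": True},   # index joint-replacement stay
    {"cost": 3000,  "related": True},   # post-acute rehabilitation
    {"cost": 4000,  "related": False},  # unrelated admission in the window
]

# Broad (CJR) definition: every payment in the 90-day window counts.
broad = sum(c["cost"] for c in claims)
# Clinically narrow definition: unrelated claims are excluded.
narrow = sum(c["cost"] for c in claims if c["related"])
difference = broad - narrow  # exposure added by the broad definition
```

    The study's question is whether this difference is uniform across hospitals or systematically associated with hospital characteristics such as case mix and teaching status.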

  12. Subject-specific tendon-aponeurosis definition in Hill-type model predicts higher muscle forces in dynamic tasks.

    Directory of Open Access Journals (Sweden)

    Pauline Gerus

    Full Text Available Neuromusculoskeletal models are a common method to estimate muscle forces. Developing accurate neuromusculoskeletal models is a challenging task due to the complexity of the system and large inter-subject variability. The estimation of muscle force is based on the mechanical properties of the tendon-aponeurosis complex. Most neuromusculoskeletal models use a generic definition of the tendon-aponeurosis complex based on in vitro tests, perhaps limiting their validity. Ultrasonography allows subject-specific estimates of the tendon-aponeurosis complex's mechanical properties. The aim of this study was to investigate the influence of subject-specific mechanical properties of the tendon-aponeurosis complex on a neuromusculoskeletal model of the ankle joint. Seven subjects performed isometric contractions from which the tendon-aponeurosis force-strain relationship was estimated. Hopping and running tasks were performed and muscle forces were estimated using subject-specific tendon-aponeurosis and generic tendon properties. Two ultrasound probes positioned over the muscle-tendon junction and the mid-belly were combined with motion capture to estimate the in vivo tendon and aponeurosis strain of the medial head of the gastrocnemius muscle. The tendon-aponeurosis force-strain relationship was scaled for the other ankle muscles based on the tendon and aponeurosis length of each muscle measured by ultrasonography. The EMG-driven model was calibrated twice - using the generic tendon definition and a subject-specific tendon-aponeurosis force-strain definition. The use of the subject-specific tendon-aponeurosis definition leads to higher muscle force estimates for the soleus muscle and the plantar-flexor group, and to a better model prediction of the ankle joint moment compared to the model estimate that used a generic definition. Furthermore, the subject-specific tendon-aponeurosis definition leads to a decoupling behaviour between the muscle fibre and muscle-tendon unit

  13. Subject-specific tendon-aponeurosis definition in Hill-type model predicts higher muscle forces in dynamic tasks.

    Science.gov (United States)

    Gerus, Pauline; Rao, Guillaume; Berton, Eric

    2012-01-01

    Neuromusculoskeletal models are a common method to estimate muscle forces. Developing accurate neuromusculoskeletal models is a challenging task due to the complexity of the system and large inter-subject variability. The estimation of muscle force is based on the mechanical properties of the tendon-aponeurosis complex. Most neuromusculoskeletal models use a generic definition of the tendon-aponeurosis complex based on in vitro tests, perhaps limiting their validity. Ultrasonography allows subject-specific estimates of the tendon-aponeurosis complex's mechanical properties. The aim of this study was to investigate the influence of subject-specific mechanical properties of the tendon-aponeurosis complex on a neuromusculoskeletal model of the ankle joint. Seven subjects performed isometric contractions from which the tendon-aponeurosis force-strain relationship was estimated. Hopping and running tasks were performed and muscle forces were estimated using subject-specific tendon-aponeurosis and generic tendon properties. Two ultrasound probes positioned over the muscle-tendon junction and the mid-belly were combined with motion capture to estimate the in vivo tendon and aponeurosis strain of the medial head of the gastrocnemius muscle. The tendon-aponeurosis force-strain relationship was scaled for the other ankle muscles based on the tendon and aponeurosis length of each muscle measured by ultrasonography. The EMG-driven model was calibrated twice - using the generic tendon definition and a subject-specific tendon-aponeurosis force-strain definition. The use of the subject-specific tendon-aponeurosis definition leads to higher muscle force estimates for the soleus muscle and the plantar-flexor group, and to a better model prediction of the ankle joint moment compared to the model estimate that used a generic definition. Furthermore, the subject-specific tendon-aponeurosis definition leads to a decoupling behaviour between the muscle fibre and muscle-tendon unit in agreement with
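
    The generic tendon definition that the subject-specific one replaces is typically a nonlinear "toe" region followed by a linear region in the force-strain curve. An illustrative sketch with invented parameter values (not the authors' calibrated properties):

```python
def tendon_force(strain, f_max=1000.0, strain_toe=0.02, k_lin=40.0):
    """Tendon force (N) at a given strain; slack below zero strain."""
    if strain <= 0.0:
        return 0.0
    if strain < strain_toe:
        # quadratic toe region, continuous in value and slope at strain_toe
        return f_max * k_lin * strain ** 2 / (2.0 * strain_toe)
    return f_max * k_lin * (strain - strain_toe / 2.0)
```

    Replacing the default parameters with values estimated from ultrasound for each subject and each muscle is, in essence, the subject-specific calibration the study performs.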

  14. A Gaussian mixture model for definition of lung tumor volumes in positron emission tomography.

    Science.gov (United States)

    Aristophanous, Michalis; Penney, Bill C; Martel, Mary K; Pelizzari, Charles A

    2007-11-01

    The increased interest in 18F-fluorodeoxyglucose (FDG) positron emission tomography (PET) in radiation treatment planning in the past five years necessitated the independent and accurate segmentation of gross tumor volume (GTV) from FDG-PET scans. In some studies the radiation oncologist contours the GTV based on a computed tomography scan, while incorporating pertinent data from the PET images. Alternatively, a simple threshold, typically 40% of the maximum intensity, has been employed to differentiate tumor from normal tissue, while other researchers have developed algorithms to aid the PET based GTV definition. None of these methods, however, results in reliable PET tumor segmentation that can be used for more sophisticated treatment plans. For this reason, we developed a Gaussian mixture model (GMM) based segmentation technique on selected PET tumor regions from non-small cell lung cancer patients. The purpose of this study was to investigate the feasibility of using a GMM-based tumor volume definition in a robust, reliable and reproducible way. A GMM relies on the idea that any distribution, in our case a distribution of image intensities, can be expressed as a mixture of Gaussian densities representing different classes. According to our implementation, each class belongs to one of three regions in the image; the background (B), the uncertain (U) and the target (T), and from these regions we can obtain the tumor volume. User interaction in the implementation is required, but is limited to the initialization of the model parameters and the selection of an "analysis region" to which the modeling is restricted. The segmentation was developed on three and tested on another four clinical cases to ensure robustness against differences observed in the clinic. It also compared favorably with thresholding at 40% of the maximum intensity and a threshold determination function based on tumor to background image intensities proposed in a recent paper. The parts of the
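
    The three-class structure described (background, uncertain, target) can be fitted by expectation-maximization. A minimal 1-D sketch on synthetic intensities, with user-supplied initial parameters as in the paper's workflow; the implementation details here are assumptions, not the authors' code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "PET intensities" standing in for the background (B),
# uncertain (U) and target (T) regions of a selected analysis region.
data = np.concatenate([
    rng.normal(1.0, 0.3, 600),   # background
    rng.normal(3.0, 0.5, 250),   # uncertain
    rng.normal(6.0, 0.7, 150),   # target
])

def fit_gmm(x, means, stds, weights, n_iter=100):
    """Fit a 1-D Gaussian mixture by expectation-maximization."""
    for _ in range(n_iter):
        # E-step: posterior responsibility of each class for each voxel
        dens = np.stack([w * np.exp(-0.5 * ((x - m) / s) ** 2) / s
                         for m, s, w in zip(means, stds, weights)])
        resp = dens / dens.sum(axis=0)
        # M-step: re-estimate class parameters from the responsibilities
        nk = resp.sum(axis=1)
        means = (resp @ x) / nk
        stds = np.sqrt((resp * (x - means[:, None]) ** 2).sum(axis=1) / nk)
        weights = nk / x.size
    return means, stds, weights

# User interaction is limited to initializing the model parameters.
means, stds, weights = fit_gmm(data,
                               means=np.array([0.5, 2.5, 5.0]),
                               stds=np.array([1.0, 1.0, 1.0]),
                               weights=np.array([1 / 3, 1 / 3, 1 / 3]))
# Voxels whose most responsible class is the last one form the tumor volume.
```

    On a real scan the mixture is fitted to the intensities of the analysis region, and the target-class voxels define the GTV.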

  15. A process-based model for the definition of hydrological alert systems in landslide risk mitigation

    Directory of Open Access Journals (Sweden)

    M. Floris

    2012-11-01

    Full Text Available The definition of hydrological alert systems for rainfall-induced landslides is strongly related to a deep knowledge of the geological and geomorphological features of the territory. Climatic conditions, spatial and temporal evolution of the phenomena and characterization of landslide triggering, together with propagation mechanisms, are the key elements to be considered. Critical steps for the development of the systems consist of the identification of the hydrological variable related to landslide triggering and of the minimum rainfall threshold for landslide occurrence.

    In this paper we report the results of a process-based model used to define a hydrological alert system for the Val di Maso landslide, located in the northeastern Italian Alps within the Vicenza Province (Veneto region, NE Italy). The instability occurred in November 2010, due to an exceptional rainfall event that hit the Vicenza Province and the whole of NE Italy. Up to 500 mm of cumulated rainfall over 3 days generated large flood conditions and triggered hundreds of landslides. During the flood, the Soil Protection Division of the Vicenza Province received more than 500 warnings of instability phenomena. The complexity of the event and the high level of risk to infrastructure and private buildings are the main reasons for studying in depth the specific phenomenon that occurred at Val di Maso.

    Empirical and physically-based models have been used to identify the minimum rainfall threshold for the occurrence of instability phenomena in the crown area of Val di Maso landslide, where a retrogressive evolution by multiple rotational slides is expected. Empirical models helped in the identification and in the evaluation of recurrence of critical rainfall events, while physically-based modelling was essential to verify the effects on the slope stability of determined rainfall depths. Empirical relationships between rainfall and landslide consist of the calculation of rainfall
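
    An empirical rainfall threshold of the intensity-duration type can be checked in a few lines. The power-law coefficients below are invented for illustration, not the values calibrated for the Val di Maso landslide:

```python
def exceeds_threshold(duration_h, cumulated_mm, a=10.0, b=0.6):
    """True if mean rainfall intensity crosses the I = a * D**(-b) curve (mm/h)."""
    intensity = cumulated_mm / duration_h
    return intensity >= a * duration_h ** (-b)

# The November 2010 event: roughly 500 mm of rain in 3 days (72 h)
alert = exceeds_threshold(72, 500)   # far above the illustrative threshold
quiet = exceeds_threshold(72, 20)    # ordinary rainfall, no alert
```

    In the alert system described, a check of this kind identifies critical rainfall events, and the physically-based slope-stability model verifies what the corresponding rainfall depths do to the factor of safety.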

  16. Effect of model-form definition on uncertainty quantification in coupled models of mid-frequency range simulations

    Science.gov (United States)

    Van Buren, Kendra L.; Ouisse, Morvan; Cogan, Scott; Sadoulet-Reboul, Emeline; Maxit, Laurent

    2017-09-01

    In the development of numerical models, uncertainty quantification (UQ) can inform appropriate allocation of computational resources, often resulting in efficient analysis for activities such as model calibration and robust design. UQ can be especially beneficial for numerical models with significant computational expense, such as coupled models, which require several subsystem models to attain the performance of a more complex, inter-connected system. In the coupled model paradigm, UQ can be applied at either the subsystem model level or the coupled model level. When applied at the subsystem level, UQ is applied directly to the physical input parameters, which can be computationally expensive. In contrast, UQ at the coupled level may not be representative of the physical input parameters, but comes at the benefit of being computationally efficient to implement. To be physically meaningful, analysis at the coupled level requires information about how uncertainty is propagated through from the subsystem level. Herein, the proposed strategy is based on simulations performed at the subsystem level to inform a covariance matrix for UQ performed at the coupled level. The approach is applied to a four-subsystem model of mid-frequency vibrations simulated using the Statistical Modal Energy Distribution Analysis, a variant of the Statistical Energy Analysis. The proposed approach is computationally efficient to implement, while simultaneously capturing information from the subsystem level to ensure the analysis is physically meaningful.
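
    The proposed strategy, simulating at the subsystem level to build a covariance matrix for coupled-level UQ, can be sketched as follows. The two-output toy subsystem and its input distributions are invented; they stand in for a subsystem of the mid-frequency vibration model:

```python
import numpy as np

rng = np.random.default_rng(1)

def subsystem_model(physical_inputs):
    """Toy subsystem: maps 3 physical parameters (k, m, c) to 2 coupling quantities."""
    k, m, c = physical_inputs
    return np.array([np.sqrt(k / m), c / (2.0 * np.sqrt(k * m))])

# Monte Carlo over the physical input parameters at the (expensive) subsystem level
samples = np.array([subsystem_model(rng.normal([1e4, 2.0, 5.0],
                                               [5e2, 0.1, 0.5]))
                    for _ in range(2000)])
mu = samples.mean(axis=0)
cov = np.cov(samples, rowvar=False)

# Cheap UQ at the coupled level: draw the coupling quantities from the
# fitted Gaussian instead of re-running the subsystem model each time.
coupled_draws = rng.multivariate_normal(mu, cov, size=500)
```

    The covariance carries the subsystem-level physics forward, which is what keeps the computationally efficient coupled-level analysis physically meaningful.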

  17. Process Definition and Process Modeling Methods Version 01.01.00

    Science.gov (United States)

    1991-09-01

    process model. This generic process model is a state machine model. It permits progress in software development to be characterized as transitions... e.g., Entry-Task-Validation-Exit (ETVX) diagram, Petri net, two-level state machine model, state machine, and Structured Analysis and Design

  18. Geosynchronous platform definition study. Volume 4, Part 1: Traffic analysis and system requirements for the baseline traffic model

    Science.gov (United States)

    1973-01-01

    The traffic analyses and system requirements data generated in the study resulted in the development of two traffic models; the baseline traffic model and the new traffic model. The baseline traffic model provides traceability between the numbers and types of geosynchronous missions considered in the study and the entire spectrum of missions foreseen in the total national space program. The information presented pertaining to the baseline traffic model includes: (1) definition of the baseline traffic model, including identification of specific geosynchronous missions and their payload delivery schedules through 1990; (2) Satellite location criteria, including the resulting distribution of the satellite population; (3) Geosynchronous orbit saturation analyses, including the effects of satellite physical proximity and potential electromagnetic interference; and (4) Platform system requirements analyses, including satellite and mission equipment descriptions, the options and limitations in grouping satellites, and on-orbit servicing criteria (both remotely controlled and man-attended).

  19. Soil organic matter quality: Definition, quantification and implications for modeling (Invited)

    Science.gov (United States)

    Plante, A. F.

    2010-12-01

    Soil organic matter (SOM) is an important component of the global C cycle. It contains more C than plant biomass and the atmosphere combined, and contributes to a C flux to and from the atmosphere ten times larger than the C flux due to fossil fuel combustion. Increasing interest in SOM is driven by questions about the permanence of soil C during sequestration, the vulnerability of soil C stocks in response to disturbance or climate change, and its role as an energy and nutrient source/sink for soil biota. Recent research has suggested that the quality of SOM may be as important as its quantity in influencing ecosystem function. Soil organic matter quality is frequently defined as a set of properties meant to characterize how easily SOM can be mineralized. This definition is much too vague to be useful. Part of our conceptualization of SOM quality has been inherited from that of substrate quality when considering above-ground litter decomposition. However, the concept must go beyond biochemical composition and encompass all of the mechanisms that act to stabilize organic matter in soil. The ambiguity of what comprises SOM quality is reflected in the wide range of approaches used to measure or quantify it. Various physical, chemical and biological fractionation techniques as well as analytical and instrumental chemical techniques have been used to characterize SOM quality to varying degrees, though correlations between the various methods are not frequently reported or apparent. While SOM quality may not be a singular property, its implications for the dynamics of SOM make it important to express it in a quantitative manner, likely through the use of indices. While this may seem unsatisfying, the goal, ultimately, is to model and predict how SOM will respond to climate change and changes in land use and management. SOM quality is essentially embedded in current models through the use of multiple compartments.
Multiple compartments reflect the composite nature of SOM with
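
    A minimal two-pool sketch of how compartments encode quality: each pool decays at its own first-order rate, so the partitioning of C among pools, rather than any single index, carries the quality information. Pool sizes and rate constants are illustrative, not calibrated values:

```python
def step_pools(pools, rates, inputs, dt=1.0):
    """Advance each SOM pool one explicit-Euler step of first-order decay."""
    return [c + (i - k * c) * dt for c, k, i in zip(pools, rates, inputs)]

pools = [10.0, 90.0]   # fast (labile) and slow (stabilized) C stocks, Mg/ha
rates = [0.5, 0.01]    # per-year decomposition rate constants
inputs = [1.0, 0.9]    # annual C inputs routed to each pool

for _ in range(50):    # 50 one-year steps
    pools = step_pools(pools, rates, inputs)
# The fast pool relaxes to its steady state inputs/rate = 2.0 Mg/ha within a
# few years; the slow pool sits at its steady state 0.9/0.01 = 90 Mg/ha.
```

    In this framing, "higher quality" SOM is simply SOM allocated more heavily to the fast pool, which is why quantitative indices of quality map naturally onto compartment partitioning.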

  20. Between algorithm and model: different Molecular Surface definitions for the Poisson-Boltzmann based electrostatic characterization of biomolecules in solution

    OpenAIRE

    2012-01-01

    The definition of a molecular surface which is physically sound and computationally efficient is a very interesting and long standing problem in the implicit solvent continuum modeling of biomolecular systems as well as in the molecular graphics field. In this work, two molecular surfaces are evaluated with respect to their suitability for electrostatic computation as alternatives to the widely used Connolly-Richards surface: the blobby surface, an implicit Gaussian atom centered surface, and...

  1. Definition of Saturn's magnetospheric model parameters for the Pioneer 11 flyby

    Directory of Open Access Journals (Sweden)

    E. S. Belenkaya

    2006-05-01

    Full Text Available This paper presents a description of a method for selecting parameters for a global paraboloid model of Saturn's magnetosphere. The model is based on the preexisting paraboloid terrestrial and Jovian models of the magnetospheric field. The interaction of the solar wind with the magnetosphere, i.e. the magnetotail current system and the magnetopause currents screening all magnetospheric field sources, is taken into account. The input model parameters are determined from observations of the Pioneer 11 inbound flyby.

  2. Modelling Practice

    DEFF Research Database (Denmark)

    2011-01-01

    This chapter deals with the practicalities of building, testing, deploying and maintaining models. It gives specific advice for each phase of the modelling cycle. To do this, a modelling framework is introduced which covers: problem and model definition; model conceptualization; model data requirements; model construction; model solution; model verification; model validation and finally model deployment and maintenance. Within the adopted methodology, each step is discussed through the consideration of key issues and questions relevant to the modelling activity. Practical advice, based on many years of experience, is provided to direct the reader in their activities. Traps and pitfalls are discussed and strategies are also given to improve model development towards “fit-for-purpose” models. The emphasis in this chapter is the adoption and exercise of a modelling methodology that has proven very...

  3. A mechanistic model for electricity consumption on dairy farms: Definition, validation, and demonstration

    NARCIS (Netherlands)

    Upton, J.R.; Murphy, M.; Shallo, L.; Groot Koerkamp, P.W.G.; Boer, de I.J.M.

    2014-01-01

    Our objective was to define and demonstrate a mechanistic model that enables dairy farmers to explore the impact of a technical or managerial innovation on electricity consumption, associated CO2 emissions, and electricity costs. We, therefore, (1) defined a model for electricity consumption on dairy ...

  4. Weak localization as a definitive test of diffusive models in the Casimir effect

    Science.gov (United States)

    Allocca, Andrew; Wilson, Justin; Galitski, Victor

    2015-03-01

    Results from many measurements of the Casimir effect suggest that the metallic plates in these experiments should be modeled with the plasma model of free electrons as opposed to the naive diffusive Drude model, while other experiments seem to indicate the exact opposite, with results more in line with a diffusive model. We study the Casimir effect at low temperatures between a thick disordered plate and a purely two-dimensional disordered system where the Drude conductivity decreases logarithmically at low temperatures due to weak localization. This effect can be tuned with either temperature or applied magnetic field, leading to a measurable change in the Casimir force. On the other hand, a ballistic model cannot experience such an effect and is only weakly dependent on temperature and magnetic field. As a result, we propose that an experiment would unambiguously differentiate between diffusive and ballistic models by measuring the effect at low temperatures with an applied magnetic field. Additionally, we calculate the impact that fluctuations in the disorder distribution have on the Casimir effect. Assuming the validity of a diffusive model, we find that the Drude model is a good approximation of a more exact treatment of disorder. This work was supported by the DOE-BES (Grant No. DESC0001911) (A.A. and V.G.), the JQI-PFC (J.W.), and the Simons Foundation.

  5. A point-process model of human heartbeat intervals: new definitions of heart rate and heart rate variability.

    Science.gov (United States)

    Barbieri, Riccardo; Matten, Eric C; Alabi, Abdulrasheed A; Brown, Emery N

    2005-01-01

    Heart rate is a vital sign, whereas heart rate variability is an important quantitative measure of cardiovascular regulation by the autonomic nervous system. Although the design of algorithms to compute heart rate and assess heart rate variability is an active area of research, none of the approaches considers the natural point-process structure of human heartbeats, and none gives instantaneous estimates of heart rate variability. We model the stochastic structure of heartbeat intervals as a history-dependent inverse Gaussian process and derive from it an explicit probability density that gives new definitions of heart rate and heart rate variability: instantaneous R-R interval and heart rate standard deviations. We estimate the time-varying parameters of the inverse Gaussian model by local maximum likelihood and assess model goodness-of-fit by Kolmogorov-Smirnov tests based on the time-rescaling theorem. We illustrate our new definitions in an analysis of human heartbeat intervals from 10 healthy subjects undergoing a tilt-table experiment. Although several studies have identified deterministic, nonlinear dynamical features in human heartbeat intervals, our analysis shows that a highly accurate description of these series at rest and in extreme physiological conditions may be given by an elementary, physiologically based, stochastic model.
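
    The model's new definitions follow from the inverse Gaussian density of the R-R interval. A sketch using the standard moments of the reciprocal of an inverse Gaussian variable; the parameter values are illustrative, and the history dependence of the full model is omitted:

```python
import math

def inverse_gaussian_pdf(w, mu, lam):
    """Density of an R-R interval of length w (s) for mean mu and shape lam."""
    return (math.sqrt(lam / (2.0 * math.pi * w ** 3))
            * math.exp(-lam * (w - mu) ** 2 / (2.0 * mu ** 2 * w)))

mu, lam = 0.8, 20.0   # illustrative mean R-R interval (s) and shape parameter

# Instantaneous heart rate is r = 60 / W; its mean and standard deviation
# follow from E[1/W] = 1/mu + 1/lam and Var(1/W) = 1/(mu*lam) + 2/lam**2.
hr_mean = 60.0 * (1.0 / mu + 1.0 / lam)
hr_std = 60.0 * math.sqrt(1.0 / (mu * lam) + 2.0 / lam ** 2)
```

    In the full method, mu and lam are time-varying, estimated by local maximum likelihood from the preceding heartbeats, which is what makes the heart-rate and variability estimates instantaneous.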

  6. Gaps Analysis of Integrating Product Design, Manufacturing, and Quality Data in The Supply Chain Using Model-Based Definition.

    Science.gov (United States)

    Trainer, Asa; Hedberg, Thomas; Feeney, Allison Barnard; Fischer, Kevin; Rosche, Phil

    2016-01-01

    Advances in information technology triggered a digital revolution that holds the promise of reduced costs, improved productivity, and higher quality. To ride this wave of innovation, manufacturing enterprises are changing how product definitions are communicated - from paper to models. To achieve industry's vision of the Model-Based Enterprise (MBE), the MBE strategy must include model-based data interoperability from design to manufacturing and quality in the supply chain. The Model-Based Definition (MBD) is created by the original equipment manufacturer (OEM) using Computer-Aided Design (CAD) tools. This information is then shared with the supplier so that they can manufacture and inspect the physical parts. Today, suppliers predominantly use Computer-Aided Manufacturing (CAM) and Coordinate Measuring Machine (CMM) models for these tasks. Traditionally, the OEM has provided design data to the supplier in the form of two-dimensional (2D) drawings, but may also include a three-dimensional (3D) shape-geometry model, often in a standards-based format such as ISO 10303-203:2011 (STEP AP203). The supplier then creates the respective CAM and CMM models and machine programs to produce and inspect the parts. In the MBE vision for model-based data exchange, the CAD model must include product-and-manufacturing information (PMI) in addition to the shape geometry. Today's CAD tools can generate models with embedded PMI, and with the emergence of STEP AP242, a standards-based model with embedded PMI can now be shared downstream. The ongoing research detailed in this paper investigates three concepts. First, that utilizing a STEP AP242 model with embedded PMI for CAD-to-CAM and CAD-to-CMM data exchange is both possible and valuable to the overall goal of a more efficient process. Second, the research identifies gaps in tools, standards, and processes that inhibit industry's ability to cost-effectively achieve model-based-data interoperability in the pursuit of the

  7. The effect of neighbourhood definitions on spatio-temporal models of disease outbreaks: Separation distance versus range overlap.

    Science.gov (United States)

    Laffan, Shawn W; Wang, Zhaoyuan; Ward, Michael P

    2011-12-01

    The definition of the spatial relatedness between infectious and susceptible animal groups is a fundamental component of spatio-temporal modelling of disease outbreaks. A common neighbourhood definition for disease spread in wild and feral animal populations is the distance between the centroids of neighbouring group home ranges. This distance can be used to define neighbourhood interactions, and also to describe the probability of successful disease transmission. Key limitations of this approach are (1) that a susceptible neighbour of an infectious group with an overlapping home range - but whose centroid lies outside the home range of the infectious group - will not be considered for disease transmission, and (2) that the degree of overlap between the home ranges is not taken into account for those groups with centroids inside the infectious home range. We assessed the impact of both distance-based and range-overlap methods of disease transmission on model-predicted disease spread. Range overlap was calculated using home ranges modelled as circles. We used the Sirca geographic automata model, with population data from a nine-county study area in Texas that we have previously described. For each method we applied 100 model repetitions, each of 100 time steps, to 30 index locations. The results show that the rate of disease spread for the range-overlap method is clearly lower than for the distance-based method, with median outbreaks modelled using the latter being 1.4-1.45 times larger. However, the two methods show similar overall trends in the area infected, and the range-overlap median (48 and 120 for cattle and pigs, respectively) falls within the 5th-95th percentile range of the distance-based method (0-96 and 0-252 for cattle and pigs, respectively). These differences can be attributed to the calculation of the interaction probabilities in the two methods, with overlap weights generally resulting in lower interaction probabilities. The definition of spatial

  8. Environmental variables and definitive host distribution: a habitat suitability modelling for endohelminth parasites in the marine realm

    Science.gov (United States)

    Kuhn, Thomas; Cunze, Sarah; Kochmann, Judith; Klimpel, Sven

    2016-08-01

    Marine nematodes of the genus Anisakis are common parasites of a wide range of aquatic organisms. Public interest is primarily based on their importance as zoonotic agents of human Anisakiasis, a severe infection of the gastro-intestinal tract resulting from the consumption of live larvae in insufficiently cooked fish dishes. The diverse nature of external impacts unequally influencing larval and adult stages of marine endohelminth parasites requires the consideration of both abiotic and biotic factors. Whereas abiotic factors are generally more relevant for early life stages and might also be linked to intermediate hosts, definitive hosts are indispensable for a parasite’s reproduction. In order to better understand the uneven occurrence of parasites in fish species, we here use the maximum entropy approach (Maxent) to model the habitat suitability for nine Anisakis species, accounting for abiotic parameters as well as biotic data (definitive hosts). The modelled habitat suitability reflects the observed distribution quite well for all Anisakis species; in some cases, however, habitat suitability exceeded the known geographical distribution, suggesting a wider distribution than presently recorded. We suggest that integrative modelling combining abiotic and biotic parameters is a valid approach for habitat suitability assessments of Anisakis, and potentially other marine parasite species.

  9. Modelling SDL, Modelling Languages

    Directory of Open Access Journals (Sweden)

    Michael Piefel

    2007-02-01

    Full Text Available Today's software systems are too complex to implement and model using only one language. As a result, modern software engineering uses different languages for different levels of abstraction and different system aspects. Handling an increasing number of related or integrated languages is thus the most challenging task in the development of tools. We use object-oriented metamodelling to describe languages. Object orientation allows us to derive abstract reusable concept definitions (concept classes) from existing languages. This language definition technique concentrates on semantic abstractions rather than syntactical peculiarities. We present a set of common concept classes that describe structure, behaviour, and data aspects of high-level modelling languages. Our models contain syntax modelling using the OMG MOF as well as static semantic constraints written in OMG OCL. We derive metamodels for subsets of SDL and UML from these common concepts, and we show for parts of these languages that they can be modelled and related to each other through the same abstract concepts.

  10. Modelling acute oral mammalian toxicity. 1. Definition of a quantifiable baseline effect.

    Science.gov (United States)

    Koleva, Yana K; Cronin, Mark T D; Madden, Judith C; Schwöbel, Johannes A H

    2011-10-01

    Quantitative structure-activity relationships (QSARs) provide a useful tool to define a relationship between chemical structure and toxicity and allow for the prediction of the toxicity of untested chemicals. QSAR models based upon an anaesthetic or narcosis mechanism represent a baseline, or minimum, toxicity, i.e. unless a chemical acts by another, more specific, mechanism, its toxicity will be predicted by such models. The aim of this investigation was to develop baseline models for the acute toxicity of chemicals to mammals (rat and mouse) following the oral route of administration. The availability of such baseline toxicity models for mammalian species can provide a probe for testing new chemicals with respect to their molecular mechanism of toxicity. Multiple-regression-type structure-toxicity relationships were derived from oral log LD(50)(-1) data for mammalian species (rat and mouse) and the 1-octanol/water partition coefficient (log P) of classic non-polar narcotics. Subsequently, these models were used to distinguish between reactive chemicals of different mechanistic domains and baseline toxic chemicals. Comparison of measured toxicity data for oral rat and mouse LD(50) with predictions from the baseline QSAR provides a means of identifying mechanistic categories and for categorising more specific acute mechanisms. Copyright © 2011 Elsevier Ltd. All rights reserved.
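A baseline-toxicity QSAR of this kind reduces to a linear regression of log(1/LD50) on log P; chemicals lying well above the baseline are flagged as acting by a more specific mechanism. A minimal sketch (the data values and the 1-log-unit excess-toxicity threshold are our assumptions for illustration, not values from the paper):

```python
def fit_baseline(log_p, log_inv_ld50):
    """Least-squares fit of log(1/LD50) = a * logP + b for baseline
    (narcosis-acting) chemicals; returns (a, b)."""
    n = len(log_p)
    mx = sum(log_p) / n
    my = sum(log_inv_ld50) / n
    a = sum((x - mx) * (y - my) for x, y in zip(log_p, log_inv_ld50)) \
        / sum((x - mx) ** 2 for x in log_p)
    return a, my - a * mx

def is_excess_toxic(log_p, observed, a, b, threshold=1.0):
    """Flag a chemical whose observed log(1/LD50) exceeds the baseline
    prediction by more than `threshold` log units."""
    return observed - (a * log_p + b) > threshold
```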

  11. Adaptation of a canopy reflectance model for sub-aqueous vegetation: Definition and sensitivity analysis

    Energy Technology Data Exchange (ETDEWEB)

    Plummer, S.E. [NERC/RSADU, Cambridgeshire (United Kingdom); Malthus, T.J. [Univ. of Edinburgh (United Kingdom); Clark, C.D. [Univ. of Sheffield (United Kingdom)

    1997-06-01

    Seagrass meadows are a key component of shallow coastal environments, acting as a food resource and nursery and contributing to water oxygenation. Given the importance of these meadows and their susceptibility to anthropogenic disturbance, it is vital that the extent and growth of seagrass are monitored. Remote sensing techniques offer the potential to determine biophysical characteristics of seagrass. This paper presents observations on the development and testing of an invertible model of seagrass canopy reflectance. The model is an adaptation of a land-surface reflectance model that incorporates the effects of attenuation and scattering of the incoming radiative flux in water. Sensitivity analysis reveals that the subsurface reflectance is strongly dependent on water depth, the amount of vegetation (the parameter we wish to determine), and turbidity, in that order. By contrast, the chlorophyll concentration of the water and gelbstoff are relatively unimportant. Water depth and turbidity need to be known or accommodated as free parameters in any inversion.

  12. Definition of the persistence length in the coarse-grained models of DNA elasticity

    Science.gov (United States)

    Fathizadeh, A.; Eslami-Mossallam, B.; Ejtehadi, M. R.

    2012-11-01

    By considering the detailed structure of DNA in the base pair level, two possible definitions of the persistence length are compared. One definition is related to the orientation of the terminal base pairs, and the other is based on the vectors which connect two adjacent base pairs at each end of the molecule. It is shown that although these definitions approach each other for long DNA molecules, they are dramatically different on short length scales. We show analytically that the difference mostly comes from the shear flexibility of the molecule and can be used to measure the shear modulus of DNA.
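For a discrete chain, either definition ultimately rests on the decay of correlations between successive orientations, with ⟨cos θ_k⟩ ≈ exp(−k/l_p). The Monte Carlo sketch below is our own illustration, not the authors' coarse-grained DNA model, and deliberately omits the shear degrees of freedom that the paper identifies as the source of the short-scale difference:

```python
import math
import random

def persistence_length(bend_sd, n=200_000, seed=1):
    """Estimate the persistence length (in segments) of a discrete
    worm-like chain whose bend angle between adjacent bond vectors is
    Gaussian with standard deviation bend_sd (radians), using the
    relation <cos theta> = exp(-1/lp)."""
    rng = random.Random(seed)
    mean_cos = sum(math.cos(rng.gauss(0.0, bend_sd)) for _ in range(n)) / n
    return -1.0 / math.log(mean_cos)
```

    For small bend angles this reproduces the expected lp ≈ 2 / bend_sd² segments.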

  13. A Conceptual Definition of Vocational Rehabilitation Based on the ICF : building a shared global model

    NARCIS (Netherlands)

    Escorpizo, Reuben; Reneman, Michiel F.; Ekholm, Jan; Fritz, Julie; Krupa, Terry; Marnetoft, Sven-Uno; Maroun, Claude E.; Guzman, Julietta Rodriguez; Suzuki, Yoshiko; Stucki, Gerold; Chan, Chetwyn C. H.

    Background The International Classification of Functioning, Disability and Health (ICF) is a conceptual framework and classification system by the World Health Organization (WHO) to understand functioning. The objective of this discussion paper is to offer a conceptual definition for vocational

  14. Roles of definitional and assessment models in the identification of new or second language learners of English for special education.

    Science.gov (United States)

    Barrera, Manuel

    2006-01-01

    This article examines the efficacy of current definitional perspectives on learning disabilities (LD) and related assessment models to support appropriate instructional and support services for learners of English with learning-related difficulties. A revised framework for defining LD and an associated assessment model, curriculum-based dynamic assessment (CDA), are proposed. The results of a teacher assessment study are reported to exemplify how this revised framework may be studied. The study examined the following questions: (a) Can curriculum-based dynamic assessments of authentic learning tasks help educators to differentiate between the work of students with limited English proficiency and their peers identified as having LD? (b) What are the characteristics of curriculum-based work samples of limited English proficient students with LD that may predictably differentiate them from their peers without LD?

  15. Definition of Magnetic Monopole Numbers for SU(N) Lattice Gauge-Higgs Models

    CERN Document Server

    Hollands, S

    2001-01-01

    A geometric definition for a magnetic charge of Abelian monopoles in SU(N) lattice gauge theories with Higgs fields is presented. The corresponding local monopole number defined for almost all field configurations does not require gauge fixing and is stable against small perturbations. Its topological content is that of a 3-cochain. A detailed prescription for calculating the local monopole number is worked out. Our method generalizes a magnetic charge definition previously invented by Phillips and Stone for SU(2).

  16. Systems Biology Markup Language (SBML) Level 2 Version 5: Structures and Facilities for Model Definitions.

    Science.gov (United States)

    Hucka, Michael; Bergmann, Frank T; Dräger, Andreas; Hoops, Stefan; Keating, Sarah M; Le Novère, Nicolas; Myers, Chris J; Olivier, Brett G; Sahle, Sven; Schaff, James C; Smith, Lucian P; Waltemath, Dagmar; Wilkinson, Darren J

    2015-09-04

    Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 5 of SBML Level 2. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org.
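A minimal SBML Level 2 document illustrates the declarative, XML-encoded structure the specification defines (the model content below is a made-up example; consult the specification for the full component set and validation rules). Real tools would use a dedicated library such as libSBML, but the stdlib XML parser is enough to show the encoding:

```python
import xml.etree.ElementTree as ET

SBML_DOC = """<?xml version="1.0" encoding="UTF-8"?>
<sbml xmlns="http://www.sbml.org/sbml/level2/version5" level="2" version="5">
  <model id="simple_decay">
    <listOfCompartments>
      <compartment id="cell" size="1"/>
    </listOfCompartments>
    <listOfSpecies>
      <species id="S" compartment="cell" initialConcentration="10"/>
    </listOfSpecies>
    <listOfReactions>
      <reaction id="decay" reversible="false">
        <listOfReactants>
          <speciesReference species="S"/>
        </listOfReactants>
      </reaction>
    </listOfReactions>
  </model>
</sbml>
"""

ns = {"s": "http://www.sbml.org/sbml/level2/version5"}
root = ET.fromstring(SBML_DOC)
model = root.find("s:model", ns)
# Collect the declared species identifiers
species = [sp.get("id") for sp in model.iter("{%s}species" % ns["s"])]
```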

  17. Systems Biology Markup Language (SBML) Level 2 Version 5: Structures and Facilities for Model Definitions

    Science.gov (United States)

    Hucka, Michael; Bergmann, Frank T.; Dräger, Andreas; Hoops, Stefan; Keating, Sarah M.; Le Novère, Nicolas; Myers, Chris J.; Olivier, Brett G.; Sahle, Sven; Schaff, James C.; Smith, Lucian P.; Waltemath, Dagmar; Wilkinson, Darren J.

    2017-01-01

    Summary Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 5 of SBML Level 2. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org/. PMID:26528569

  18. High definition geomagnetic models: A new perspective for improved wellbore positioning

    DEFF Research Database (Denmark)

    Maus, Stefan; Nair, Manoj C.; Poedjono, Benny

    2012-01-01

    Earth's gravity and magnetic fields are used as natural reference frames in directional drilling. The azimuth of the bottomhole assembly is inferred by comparing the magnetic field measured-while-drilling (MWD) with a geomagnetic reference model. To provide a reference of sufficient quality for a...

  19. Mathematical models for the definition of cell manufacturing layout. Literature review

    Directory of Open Access Journals (Sweden)

    Gustavo Andrés Romero Duque

    2015-11-01

    Full Text Available This review article discusses the layout problem of cell manufacturing (LCM) in a descriptive form: it first considers the problem and its variations, then the elements of the mathematical models, and subsequently the solution methods used; finally, some future perspectives on this topic are considered.

  20. Model for the ready definition and approximate comparison of alternative high voltage transmission systems

    Energy Technology Data Exchange (ETDEWEB)

    1978-12-01

    A model of generic overhead transmission systems in the range of 362 to 1200 kV ac, and ±400 to ±800 kV dc, is developed. Such generic systems are to include (a) transmission from generation to load, and (b) interconnection of two large integrated systems, with and without the existence of an underlying, lower-voltage network in either case. The model provides a means whereby an engineer with some experience in power systems planning can make a reconnaissance study of alternatives within a relatively short span of time and with fair accuracy. Given an amount of power to be transferred over a specified distance, the model can be used: to define the workable alternatives in terms of voltages, number of lines, series compensation, and certain other factors affecting transfer capability; to delineate other salient features of the selected alternatives, notably shunt compensation requirements; and to compare the alternatives in terms of potentially relevant benefits and costs. The significant properties of the model, the basis and assumptions necessary to its formulation, instructions for its use, and inherent limitations upon the accuracy to be expected are described.

  1. Definition and sensitivity of the conceptual MORDOR rainfall-runoff model parameters using different multi-criteria calibration strategies

    Science.gov (United States)

    Garavaglia, F.; Seyve, E.; Gottardi, F.; Le Lay, M.; Gailhard, J.; Garçon, R.

    2014-12-01

    MORDOR is a conceptual hydrological model extensively used in Électricité de France (EDF, French electric utility company) operational applications: (i) hydrological forecasting, (ii) flood risk assessment, (iii) water balance and (iv) climate change studies. MORDOR is a lumped, reservoir-type, elevation-based model with hourly or daily areal rainfall and air temperature as the driving input data. The principal hydrological processes represented are evapotranspiration, direct and indirect runoff, groundwater, snow accumulation and melt, and routing. The model has been used intensively at EDF for more than 20 years, in particular for modeling French mountainous watersheds. For parameter calibration we propose and test alternative multi-criteria techniques based on two specific approaches: automatic calibration using single-objective functions and a priori parameter calibration founded on hydrological watershed features. The automatic calibration approach uses single-objective functions, based on the Kling-Gupta efficiency, to quantify the agreement between simulated and observed runoff, focusing on four different runoff samples: (i) the time series itself, (ii) the annual hydrological regime, (iii) monthly cumulative distribution functions and (iv) recession sequences. The primary purpose of this study is to analyze the definition and sensitivity of MORDOR parameters by testing different calibration techniques in order to: (i) simplify the model structure, (ii) increase the calibration-validation performance of the model and (iii) reduce the equifinality problem of the calibration process. We propose an alternative calibration strategy that reaches these goals. The analysis is illustrated by calibrating the MORDOR model to daily data for 50 watersheds located in French mountainous regions.
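The Kling-Gupta efficiency underlying these objective functions combines correlation, variability bias and mean bias into one score. A stdlib-only sketch of the standard formulation (our illustration; EDF's calibration code is not described in the abstract):

```python
import math

def kge(sim, obs):
    """Kling-Gupta efficiency:
    KGE = 1 - sqrt((r - 1)^2 + (alpha - 1)^2 + (beta - 1)^2),
    where r is the linear correlation between sim and obs,
    alpha = sd(sim)/sd(obs), and beta = mean(sim)/mean(obs).
    KGE = 1 indicates a perfect fit."""
    n = len(sim)
    ms, mo = sum(sim) / n, sum(obs) / n
    ss = math.sqrt(sum((x - ms) ** 2 for x in sim) / n)
    so = math.sqrt(sum((x - mo) ** 2 for x in obs) / n)
    r = sum((x - ms) * (y - mo) for x, y in zip(sim, obs)) / (n * ss * so)
    return 1.0 - math.sqrt((r - 1.0) ** 2 + (ss / so - 1.0) ** 2
                           + (ms / mo - 1.0) ** 2)
```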

  2. A Memory Hierarchy Model Based on Data Reuse for Full-Search Motion Estimation on High-Definition Digital Videos

    Directory of Open Access Journals (Sweden)

    Alba Sandyra Bezerra Lopes

    2012-01-01

    Full Text Available The motion estimation is the most complex module in a video encoder, requiring a high processing throughput and high memory bandwidth, especially when the focus is high-definition videos. The throughput problem can be solved by increasing the parallelism of the internal operations. The external memory bandwidth may be reduced using a memory hierarchy. This work presents a memory hierarchy model for a full-search motion estimation core. The proposed memory hierarchy model is based on a data reuse scheme that exploits the features of the full-search algorithm. The proposed memory hierarchy significantly reduces the external memory bandwidth required for the motion estimation process, and it provides the very high data throughput the ME core needs to achieve real time when processing high-definition videos. In the worst-case bandwidth scenario, this memory hierarchy reduces the external memory bandwidth by a factor of 578. A case study for the proposed hierarchy, using a 32×32 search window and an 8×8 block size, was implemented and prototyped on a Virtex 4 FPGA. The results show that it is possible to reach 38 frames per second when processing full-HD frames (1920×1080 pixels) using nearly 299 Mbytes per second of external memory bandwidth.
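The bandwidth savings come from not re-fetching reference pixels shared by overlapping search windows. The back-of-the-envelope accounting below is our own illustration of one simple reuse level, not the paper's hierarchy; the 578× figure depends on the specific reuse scheme implemented, and these functions only show why reuse helps at all:

```python
def bw_no_reuse(width, height, fps, block, search, bytes_per_px=1):
    """Worst case: each block independently reloads its entire
    (search + block - 1)^2 reference window, in bytes per second."""
    blocks = (width // block) * (height // block)
    window = (search + block - 1) ** 2
    return blocks * window * fps * bytes_per_px

def bw_row_reuse(width, height, fps, block, search, bytes_per_px=1):
    """Reuse across a block row: every reference pixel of a window
    stripe is fetched roughly once per row of blocks."""
    stripe = width * (search + block - 1)
    return (height // block) * stripe * fps * bytes_per_px

# Full-HD case-study parameters from the paper: 32x32 search, 8x8 block
ratio = (bw_no_reuse(1920, 1080, 38, 8, 32)
         / bw_row_reuse(1920, 1080, 38, 8, 32))
```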

  3. Identification and Estimation of Postseismic Deformation: Implications for Plate Motion Models, Models of the Earthquake Cycle, and Terrestrial Reference Frame Definition

    Science.gov (United States)

    Kedar, S.; Bock, Y.; Moore, A. W.; Argus, D. F.; Fang, P.; Liu, Z.; Haase, J. S.; Su, L.; Owen, S. E.; Goldberg, D.; Squibb, M. B.; Geng, J.

    2015-12-01

    Postseismic deformation indicates a viscoelastic response of the lithosphere. It is critical, then, to identify and estimate the extent of postseismic deformation in both space and time, not only for its inherent information on crustal rheology and earthquake physics, but also because it must be considered in plate motion models that are derived geodetically from "steady-state" interseismic velocities, in models of the earthquake cycle that provide interseismic strain accumulation and earthquake probability forecasts, and in terrestrial reference frame definition, which is the basis for space geodetic positioning. As part of the Solid Earth Science ESDR System (SESES) project under a NASA MEaSUREs grant, JPL and SIO estimate combined daily position time series for over 1800 GNSS stations, both globally and at plate boundaries, independently using the GIPSY and GAMIT software packages, but with a consistent set of a priori epoch-date coordinates and metadata. The longest time series began in 1992, and many of them contain postseismic signals. For example, about 90 of the more than 400 global GNSS stations that define the ITRF have experienced one or more major earthquakes and 36 have had multiple earthquakes; as expected, most plate boundary stations have as well. We quantify the spatial (distance from rupture) and temporal (decay time) extent of postseismic deformation. We examine parametric models (logarithmic, exponential) and a physical model (rate- and state-dependent friction) to fit the time series. Using a PCA analysis, we determine whether or not a particular earthquake can be uniformly fit by a single underlying postseismic process - otherwise we fit individual stations. We then investigate whether the estimated time-series velocities can be used directly as input to plate motion models, rather than arbitrarily removing the apparent postseismic portion of a time series and/or eliminating the stations closest to earthquake epicenters.
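The parametric transients mentioned are typically logarithmic or exponential terms added to a secular velocity. A minimal sketch of such a fit model (the functional forms are standard in GNSS time-series analysis; the amplitudes and decay times are placeholders, not estimates from the project):

```python
import math

def postseismic_position(t, v, a_log=0.0, tau_log=1.0,
                         a_exp=0.0, tau_exp=1.0):
    """Position (e.g. mm) at time t (years after the earthquake):
    secular velocity v plus logarithmic and exponential transients
    with amplitudes a_log, a_exp and decay times tau_log, tau_exp."""
    return (v * t
            + a_log * math.log(1.0 + t / tau_log)
            + a_exp * (1.0 - math.exp(-t / tau_exp)))
```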

  4. A mechanistic model for electricity consumption on dairy farms: definition, validation, and demonstration.

    Science.gov (United States)

    Upton, J; Murphy, M; Shalloo, L; Groot Koerkamp, P W G; De Boer, I J M

    2014-01-01

    Our objective was to define and demonstrate a mechanistic model that enables dairy farmers to explore the impact of a technical or managerial innovation on electricity consumption, associated CO2 emissions, and electricity costs. We, therefore, (1) defined a model for electricity consumption on dairy farms (MECD) capable of simulating total electricity consumption along with related CO2 emissions and electricity costs on dairy farms on a monthly basis; (2) validated the MECD using 1 yr of empirical data from commercial spring-calving, grass-based dairy farms with 45, 88, and 195 milking cows; and (3) demonstrated the functionality of the model by applying 2 electricity tariffs to the electricity consumption data and examining the effect on total dairy farm electricity costs. The MECD was developed using a mechanistic modeling approach and required the key inputs of milk production, cow number, and details relating to the milk-cooling system, milking machine system, water-heating system, lighting systems, water pump systems, and the winter housing facilities, as well as details relating to the management of the farm (e.g., season of calving). Model validation showed an overall relative prediction error (RPE) of less than 10% for total electricity consumption. More than 87% of the mean square prediction error of total electricity consumption was accounted for by random variation. The RPE values of the milk-cooling systems, water-heating systems, and milking machine systems were less than 20%. The RPE values for automatic scraper systems, lighting systems, and water pump systems varied from 18 to 113%, indicating poor prediction for these metrics. However, automatic scrapers, lighting, and water pumps made up only 14% of total electricity consumption across all farms, reducing the overall impact of these poor predictions. Demonstration of the model showed that total farm electricity costs increased by between 29 and 38% by moving from a day and night tariff to a flat
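The relative prediction error used in the validation is the root mean-square prediction error expressed as a percentage of the observed mean. A stdlib sketch of this standard metric (our formulation, not the authors' code):

```python
import math

def relative_prediction_error(actual, predicted):
    """RPE (%): root mean-square prediction error divided by the mean
    of the actual (observed) values."""
    n = len(actual)
    mspe = sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n
    return 100.0 * math.sqrt(mspe) / (sum(actual) / n)
```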

  5. Economics definitions, methods, models, and analysis procedures for Homeland Security applications.

    Energy Technology Data Exchange (ETDEWEB)

    Ehlen, Mark Andrew; Loose, Verne William; Vargas, Vanessa N.; Smith, Braeton J.; Warren, Drake E.; Downes, Paula Sue; Eidson, Eric D.; Mackey, Greg Edward

    2010-01-01

    This report gives an overview of the types of economic methodologies and models used by Sandia economists in their consequence analysis work for the National Infrastructure Simulation & Analysis Center and other DHS programs. It describes the three primary resolutions at which analysis is conducted (microeconomic, mesoeconomic, and macroeconomic), the tools used at these three levels (from data analysis to internally developed and publicly available tools), and how they are used individually and in concert with each other and other infrastructure tools.

  6. Vocational services for traumatic brain injury: treatment definition and diversity within model systems of care.

    Science.gov (United States)

    Hart, Tessa; Dijkers, Marcel; Fraser, Robert; Cicerone, Keith; Bogner, Jennifer A; Whyte, John; Malec, James; Waldron, Brigid

    2006-01-01

    To examine characteristics and diversity among vocational treatment services in model programs for traumatic brain injury (TBI) rehabilitation. Vocational or postacute treatment components of 16 TBI Model System (TBIMS) centers. Vocational director/coordinator from each TBIMS surveyed in semistructured phone interview. Survey of vocational services for people with TBI, with about 100 closed and open-ended questions on vocational assessments; pre- and postjob placement treatments; program philosophies; funding; and integration of cognitive, behavioral, family, and medical rehabilitation interventions. Great diversity was found among the vocational services of the 16 TBIMS. Programs fell into 3 clusters emphasizing medical rehabilitation services, supported employment, or a combination of these with an emphasis on case management. Job coaching was identified as a key intervention, but there was great variability in intensity, availability, and funding of coaching services. Diversity in vocational services appears related to funding differences and "parallel evolution" rather than strong treatment philosophy or scientific evidence base. Multicenter research on effectiveness or establishment of best practices in vocational rehabilitation after TBI must deal with substantial existing variability in treatment models and specific interventions, and must examine the relationship of treatment variations to case-mix factors.

  7. Active Aging for Individuals with Parkinson’s Disease: Definitions, Literature Review, and Models

    Directory of Open Access Journals (Sweden)

    Seyed-Mohammad Fereshtehnejad

    2014-01-01

    Full Text Available Active aging has emerged as a way to optimize different aspects of health opportunities during the aging process in order to enhance quality of life. Yet most efforts focus on normal aging, and less attention has been paid to the elderly suffering from a chronic illness such as Parkinson’s disease (PD). The aim of this review was to investigate how the concept of “active aging” fits the elderly with PD and to propose a new model for them using recent improvements in caring models and management approaches. For this purpose, biomedical databases were assessed using relevant keywords to find appropriate articles. Movement problems of PD affect physical activity, psychiatric symptoms lessen social communication, and cognitive impairment can worsen mental well-being in the elderly with PD, all of which can lead to earlier retirement and poorer quality of life compared with healthy elderly. Based on the multisystemic nature of PD, a new “Active Aging Model for Parkinson’s Disease” is proposed, consisting of self-care, multidisciplinary and interdisciplinary care, palliative care, patient-centered care, and personalized care. These strategies could potentially help individuals with PD to better manage their condition in line with the concept of active aging.

  8. "We definitely are role models": Exploring how clinical instructors' influence nursing students' attitudes towards older adults.

    Science.gov (United States)

    Gibbs, Sheena Simpkins; Kulig, Judith C

    2017-07-19

    The world's population is getting older, which will inevitably cause increased demands for nurses to provide high quality care to this demographic. Attitudes have been shown to influence the quality of care that older adults receive. It is therefore important to gain a better understanding of what influences nursing students' attitudes towards older adults. This article reports on one of three inter-connected research questions of a mixed methods study that explored the relationship between clinical instructors' attitudes and nursing students' attitudes towards older adults. Semi-structured interviews were conducted with 6 clinical instructors and 13 nursing students. Interview data was analyzed using thematic analysis. A conceptual model was developed from the research findings, which revealed that nursing instructors are seen as strong role models for their students, and as role models, they influence students through demonstrations, expectations and support. As a result, nursing students mirror the attitudes of their instructors towards older adults. Findings from this study highlight the strong connection between nursing instructors' and students' attitudes. This has important implications for nursing education including strategies that instructors can employ to enhance students' attitudes towards older adults. Insights from this study also have the potential to improve the quality of care that future nurses provide to older adults. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. A Conceptual Definition of Vocational Rehabilitation Based on the ICF : building a shared global model

    NARCIS (Netherlands)

    Escorpizo, Reuben; Reneman, Michiel F.; Ekholm, Jan; Fritz, Julie; Krupa, Terry; Marnetoft, Sven-Uno; Maroun, Claude E.; Guzman, Julietta Rodriguez; Suzuki, Yoshiko; Stucki, Gerold; Chan, Chetwyn C. H.

    2011-01-01

    Background The International Classification of Functioning, Disability and Health (ICF) is a conceptual framework and classification system by the World Health Organization (WHO) to understand functioning. The objective of this discussion paper is to offer a conceptual definition for vocational reha

  10. The importance of a precise definition, comprehensive model, and critical discussion of successful aging at work

    NARCIS (Netherlands)

    Zacher, Hannes

    2015-01-01

    It is crucial to advance understanding of the concept of successful aging at work to guide rigorous future research and effective practice. Drawing on the gerontology and life-span developmental literatures, I recently proposed a definition and theoretical framework of successful aging at work that

  11. A Model for Math Modeling

    Science.gov (United States)

    Lin, Tony; Erfan, Sasan

    2016-01-01

    Mathematical modeling is an open-ended research subject where no definite answers exist for any problem. Math modeling enables thinking outside the box to connect different fields of studies together including statistics, algebra, calculus, matrices, programming and scientific writing. As an integral part of society, it is the foundation for many…

  12. Optimization and planning of operating theatre activities: an original definition of pathways and process modeling.

    Science.gov (United States)

    Barbagallo, Simone; Corradi, Luca; de Ville de Goyet, Jean; Iannucci, Marina; Porro, Ivan; Rosso, Nicola; Tanfani, Elena; Testi, Angela

    2015-05-17

    The Operating Room (OR) is a key resource of all major hospitals, but it also accounts for up to 40% of resource costs. Improving cost effectiveness while maintaining quality of care is a universal objective. These goals imply optimizing the planning and scheduling of the activities involved, which is highly challenging due to the inherently variable and unpredictable nature of surgery. Business Process Modeling Notation (BPMN 2.0) was used to represent the "OR process" (defined as the sequence of all of the elementary steps between "patient ready for surgery" and "patient operated upon") as a general pathway ("path"). The path was standardized as much as possible while keeping all of the key elements needed to address the other steps of planning and the wide inherent variability in terms of patient specificity. The path was used to schedule OR activity, room-by-room and day-by-day, feeding the process from a "waiting list database" and using a mathematical optimization model with the objective of producing an optimized plan. The OR process was defined with special attention paid to flows, timing and resource involvement. Standardization defined an expected operating time for each operation. The optimization model was implemented and tested on real clinical data. Comparison with the real data shows that the optimization model allows scheduling about 30% more patients than in actual practice and better exploits OR capacity, increasing the average operating room utilization rate by up to 20%. The optimization of OR activity planning is essential in order to manage the hospital's waiting list. Optimal planning is facilitated by defining the operation as a standard pathway where all variables are taken into account.
By allowing a precise scheduling, it feeds the process of
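
The scheduling idea in the abstract above can be illustrated with a toy sketch (not the authors' BPMN-based optimization model): given a waiting list where each case carries a standardized expected operating time, pack cases into fixed-length OR sessions and measure utilization. First-fit-decreasing bin packing and the case data below are purely illustrative assumptions.

```python
def schedule_sessions(cases, session_minutes=480):
    """Pack waiting-list cases (id, expected operating minutes) into
    OR sessions using first-fit decreasing: try each open session in
    order, open a new one when the case fits nowhere."""
    sessions = []  # each session: [remaining_minutes, [case ids]]
    for cid, t in sorted(cases, key=lambda c: -c[1]):
        for s in sessions:
            if s[0] >= t:
                s[0] -= t
                s[1].append(cid)
                break
        else:  # no existing session could host this case
            sessions.append([session_minutes - t, [cid]])
    used = sum(session_minutes - s[0] for s in sessions)
    utilization = used / (len(sessions) * session_minutes)
    return sessions, utilization

# Hypothetical waiting list with standardized expected times.
cases = [("a", 240), ("b", 180), ("c", 120), ("d", 90), ("e", 300), ("f", 60)]
sessions, util = schedule_sessions(cases)
print(len(sessions), round(util, 4))  # 3 sessions, utilization 0.6875
```

A real OR planner would add surgeon and equipment constraints and solve an integer program; this sketch only shows why a standardized expected time per operation is the enabling ingredient.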

  13. Model of observed stochastic balance between work and free time supporting the LQTAI definition

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager

    2008-01-01

    A balance differential equation between free time and money-producing work time on the national economy level is formulated in a previous paper in terms of two dimensionless quantities, the fraction of work time and the total productivity factor defined as the ratio of the Gross Domestic Product...... to the total salary paid in return for work. Among the solutions there is one relation that compares surprisingly well with the relevant sequences of Danish data spanning from 1948 to 2003, and also with similar data from several other countries except for slightly different model parameter values. Statistical...

  14. Predictive modeling of outcomes following definitive chemoradiotherapy for oropharyngeal cancer based on FDG-PET image characteristics

    Science.gov (United States)

    Folkert, Michael R.; Setton, Jeremy; Apte, Aditya P.; Grkovski, Milan; Young, Robert J.; Schöder, Heiko; Thorstad, Wade L.; Lee, Nancy Y.; Deasy, Joseph O.; Oh, Jung Hun

    2017-07-01

    In this study, we investigate the use of imaging feature-based outcomes research (‘radiomics’) combined with machine learning techniques to develop robust predictive models for the risk of all-cause mortality (ACM), local failure (LF), and distant metastasis (DM) following definitive chemoradiation therapy (CRT). One hundred seventy-four patients with stage III-IV oropharyngeal cancer (OC) treated at our institution with CRT with retrievable pre- and post-treatment 18F-fluorodeoxyglucose positron emission tomography (FDG-PET) scans were identified. From pre-treatment PET scans, 24 representative imaging features of FDG-avid disease regions were extracted. Using machine learning-based feature selection methods, multiparameter logistic regression models were built incorporating clinical factors and imaging features. All model building methods were tested by cross validation to avoid overfitting, and final outcome models were validated on an independent dataset from a collaborating institution. Multiparameter models were statistically significant on 5-fold cross validation with the area under the receiver operating characteristic curve (AUC) = 0.65 (p = 0.004), 0.73 (p = 0.026), and 0.66 (p = 0.015) for ACM, LF, and DM, respectively. The model for LF retained significance on the independent validation cohort with AUC = 0.68 (p = 0.029) whereas the models for ACM and DM did not reach statistical significance, but resulted in comparable predictive power to the 5-fold cross validation with AUC = 0.60 (p = 0.092) and 0.65 (p = 0.062), respectively. In the largest study of its kind to date, predictive features including increasing metabolic tumor volume, increasing image heterogeneity, and increasing tumor surface irregularity significantly correlated to mortality, LF, and DM on 5-fold cross validation in a relatively uniform single-institution cohort. The LF model also retained
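
The AUC values quoted in the abstract above have a simple probabilistic reading that is easy to check in code: the probability that a randomly chosen positive case is scored above a randomly chosen negative one. The sketch below is not the authors' pipeline; the labels and scores are made-up illustrations.

```python
def auc(labels, scores):
    """Area under the ROC curve via the rank-sum (Mann-Whitney)
    identity: fraction of positive/negative pairs where the positive
    is scored higher, counting ties as one half."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Perfect separation gives 1.0; a constant score gives chance level 0.5.
print(auc([1, 1, 0, 0], [0.9, 0.8, 0.3, 0.1]))  # 1.0
print(auc([1, 1, 0, 0], [0.5, 0.5, 0.5, 0.5]))  # 0.5
```

An AUC of 0.65, as reported for the ACM model, therefore means the model ranks a random event case above a random non-event case about 65% of the time.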

  15. The Role of Acting Participants, Definitions, and the Determining Factors of Adherence to Treatment from Two Perspectives: The Biomedical Model and the Chronic Care Model

    Directory of Open Access Journals (Sweden)

    Adrian LUPU

    2014-11-01

    Full Text Available Management of a chronic illness implies significantly changing the lifestyle: taking medication, watching the diet, introducing and maintaining exercise in daily life, etc. These actions represent elements of adherence to treatment, and they reflect the responsibility of the patient’s participation in healthcare. The increase in adherence to treatment, and implicitly in the quality of care, may depend on allotting the necessary resources within the therapeutic effort and on the effectiveness of the partnership between patient and doctor. Making the medical decision as a team may lead to solving the issue of non-adherence (Armstrong, 2014). Whereas the values of the functional parameters of the body represent an objective measurement of treatment efficiency and, to some extent, implicitly of adherence to it, assessing the patient’s lifestyle involves understanding his experience, which is governed by subjectivity. This article has the following objectives: (1) to analyze the definitions of adherence to treatment from a biomedical perspective and from the perspective of the Chronic Care Model (CCM); (2) to identify the characteristics specific to the roles of acting participants in healthcare and to analyze the modification of roles by the choice of theoretical model; and (3) to identify the determining factors of adherence to treatment.

  16. A Mediated Definite Delegation Model allowing for Certified Grid Job Submission

    CERN Document Server

    Schreiner, Steffen; Grigoras, Costin; Litmaath, Maarten

    2012-01-01

    Grid computing infrastructures need to provide traceability and accounting of their users' activity and protection against misuse and privilege escalation. A central aspect of multi-user Grid job environments is the necessary delegation of privileges in the course of a job submission. With respect to these generic requirements this document describes an improved handling of multi-user Grid jobs in the ALICE ("A Large Ion Collider Experiment") Grid Services. A security analysis of the ALICE Grid job model is presented with derived security objectives, followed by a discussion of existing approaches of unrestricted delegation based on X.509 proxy certificates and the Grid middleware gLExec. Unrestricted delegation has severe security consequences and limitations, most importantly allowing for identity theft and forgery of delegated assignments. These limitations are discussed and formulated, both in general and with respect to an adoption in line with multi-user Grid jobs. Based on the architecture of the ALICE...

  17. Semantic Building Information Modeling and high definition surveys for Cultural Heritage sites

    Directory of Open Access Journals (Sweden)

    Simone Garagnani

    2012-11-01

    Full Text Available In recent years, digital technology devoted to building design has advanced significantly, allowing designers to reach, by means of Building Information Modeling, goals only imagined since the mid-Seventies of the last century. The BIM process, which brings several advantages to the actors and designers who implement it in their workflow, may be employed even in case studies related to interventions on the existing architectural Cultural Heritage. The semantics typical of classical architecture, so pervasive in the European urban landscape, as well as the features of Modern or Contemporary architecture, map well onto the structure of “smart objects” proper to BIM, which proves to be an effective system to document component relationships. However, the translation of existing buildings' geometric information, acquired using the common techniques of laser scanning and digital photogrammetry, into BIM objects is still a critical process that this paper aims to investigate, describing possible methods and approaches.

  18. Definition of a 5MW/61.5m wind turbine blade reference model.

    Energy Technology Data Exchange (ETDEWEB)

    Resor, Brian Ray

    2013-04-01

    A basic structural concept of the blade design that is associated with the frequently utilized "NREL offshore 5-MW baseline wind turbine" is needed for studies involving blade structural design and blade structural design tools. The blade structural design documented in this report represents a concept that meets basic design criteria set forth by IEC standards for the onshore turbine. The design documented in this report is not a fully vetted blade design which is ready for manufacture. The intent of the structural concept described by this report is to provide a good starting point for more detailed and targeted investigations such as blade design optimization, blade design tool verification, blade materials and structures investigations, and blade design standards evaluation. This report documents the information used to create the current model as well as the analyses used to verify that the blade structural performance meets reasonable blade design criteria.

  19. Modelling Active Faults in Probabilistic Seismic Hazard Analysis (PSHA) with OpenQuake: Definition, Design and Experience

    Science.gov (United States)

    Weatherill, Graeme; Garcia, Julio; Poggi, Valerio; Chen, Yen-Shin; Pagani, Marco

    2016-04-01

    The Global Earthquake Model (GEM) has, since its inception in 2009, made many contributions to the practice of seismic hazard modeling in different regions of the globe. The OpenQuake-engine (hereafter referred to simply as OpenQuake), GEM's open-source software for calculation of earthquake hazard and risk, has found application in many countries, spanning a diversity of tectonic environments. GEM itself has produced a database of national and regional seismic hazard models, harmonizing into OpenQuake's own definition the varied seismogenic sources found therein. The characterization of active faults in probabilistic seismic hazard analysis (PSHA) is at the centre of this process, motivating many of the developments in OpenQuake and presenting hazard modellers with the challenge of reconciling seismological, geological and geodetic information for the different regions of the world. Faced with these challenges, and from the experience gained in the process of harmonizing existing models of seismic hazard, four critical issues are addressed. The challenge GEM has faced in the development of software is how to define a representation of an active fault (both in terms of geometry and earthquake behaviour) that is sufficiently flexible to adapt to different tectonic conditions and levels of data completeness. By exploring the different fault typologies supported by OpenQuake we illustrate how seismic hazard calculations can, and do, take into account complexities such as geometrical irregularity of faults in the prediction of ground motion, highlighting some of the potential pitfalls and inconsistencies that can arise. This exploration leads to the second main challenge in active fault modeling, what elements of the fault source model impact most upon the hazard at a site, and when does this matter? 
Through a series of sensitivity studies we show how different configurations of fault geometry, and the corresponding characterisation of near-fault phenomena (including

  20. A univocal definition of the neuronal soma morphology using Gaussian mixture models.

    Science.gov (United States)

    Luengo-Sanchez, Sergio; Bielza, Concha; Benavides-Piccione, Ruth; Fernaud-Espinosa, Isabel; DeFelipe, Javier; Larrañaga, Pedro

    2015-01-01

    The definition of the soma is fuzzy, as there is no clear line demarcating the soma of the labeled neurons and the origin of the dendrites and axon. Thus, the morphometric analysis of the neuronal soma is highly subjective. In this paper, we provide a mathematical definition and an automatic segmentation method to delimit the neuronal soma. We applied this method to the characterization of pyramidal cells, which are the most abundant neurons in the cerebral cortex. Since there are no benchmarks with which to compare the proposed procedure, we validated the goodness of this automatic segmentation method against manual segmentation by neuroanatomists to set up a framework for comparison. We concluded that there were no significant differences between automatically and manually segmented somata, i.e., the proposed procedure segments the neurons similarly to how a neuroanatomist does. It also provides univocal, justifiable and objective cutoffs. Thus, this study is a means of characterizing pyramidal neurons in order to objectively compare the morphometry of the somata of these neurons in different cortical areas and species.

  1. A univocal definition of the neuronal soma morphology using Gaussian mixture models

    Directory of Open Access Journals (Sweden)

    Sergio eLuengo-Sanchez

    2015-11-01

    Full Text Available The definition of the soma is fuzzy, as there is no clear line demarcating the soma of the labeled neurons and the origin of the dendrites and axon. Thus, the morphometric analysis of the neuronal soma is highly subjective. In this paper, we provide a mathematical definition and an automatic segmentation method to delimit the neuronal soma. We applied this method to the characterization of pyramidal cells, which are the most abundant neurons in the cerebral cortex. Since there are no benchmarks with which to compare the proposed procedure, we validated the goodness of this automatic segmentation method against manual segmentation by experts in neuroanatomy to set up a framework for comparison. We concluded that there were no significant differences between automatically and manually segmented somata, i.e., the proposed procedure segments the neurons more or less as an expert does. It also provides univocal, justifiable and objective cutoffs. Thus, this study is a means of characterizing pyramidal neurons in order to objectively compare the morphometry of the somata of these neurons in different cortical areas and species.
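
The Gaussian mixture modeling behind the two soma-segmentation records above can be illustrated in miniature: the EM algorithm softly assigns each point to a component, then re-estimates each component's weight, mean and spread. The 1-D, two-component toy below (with synthetic data) is an illustration of the technique only, not the authors' 3-D soma segmentation method.

```python
import math
import random

def em_gmm_1d(data, n_iter=200):
    """Fit a two-component 1-D Gaussian mixture with EM.
    Returns (weights, means, standard deviations)."""
    mu = [min(data), max(data)]   # spread-out initial means
    sd = [1.0, 1.0]
    w = [0.5, 0.5]

    def pdf(x, m, s):
        return math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))

    for _ in range(n_iter):
        # E-step: responsibility of each component for each point.
        resp = []
        for x in data:
            p = [w[k] * pdf(x, mu[k], sd[k]) for k in (0, 1)]
            t = sum(p)
            resp.append([pk / t for pk in p])
        # M-step: re-estimate weights, means, spreads.
        for k in (0, 1):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, data)) / nk
            sd[k] = max(math.sqrt(var), 1e-3)  # avoid collapse
    return w, mu, sd

random.seed(0)
data = ([random.gauss(0.0, 0.5) for _ in range(300)]
        + [random.gauss(5.0, 0.5) for _ in range(300)])
w, mu, sd = em_gmm_1d(data)
print(sorted(round(m, 2) for m in mu))  # two means, near 0 and 5
```

In the papers the same soft-assignment idea is applied to 3-D surface points so that the "soma" component yields an objective, reproducible cutoff rather than a hand-drawn one.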

  2. Continuity and Resurgence: towards a continuum definition of the CP(N-1) model

    CERN Document Server

    Dunne, Gerald V

    2013-01-01

    We introduce a non-perturbative continuum framework to study the dynamics of quantum field theory (QFT), applied here to the CP(N-1) model, using Ecalle's theory of resurgent trans-series, combined with the physical principle of continuity, in which spatial compactification and a Born-Oppenheimer approximation reduce QFT to quantum mechanics, while preventing all intervening rapid cross-overs or phase transitions. The reduced quantum mechanics contains the germ of all non-perturbative data, e.g., mass gap, of the QFT, all of which are calculable. For CP(N-1), the results obtained at arbitrary N are consistent with lattice and large-N results. These theories are perturbatively non-Borel summable and possess the elusive IR-renormalon singularities. The trans-series expansion, in which perturbative and non-perturbative effects are intertwined, encapsulates the multi-length-scale nature of the theory, and eliminates all perturbative and non-perturbative ambiguities under consistent analytic continuation of the co...

  3. Elaboration of the definition of genetic counseling into a model for counselee decision-making.

    Science.gov (United States)

    Bringle, R G; Antley, R M

    1980-01-01

    Genetic counselors are generally trained in genetics only and often have no basis for determining when a counselee has made an informed decision and the counselor's function is complete. A theory of genetic counseling (GC) is offered which interrelates genetic information, psychological responses, learning theory, and decision making, reflecting a shift from a eugenic orientation to an orientation concerned with the physical and mental well-being of counselees. GC is first defined as enabling the counselee to comprehend the medical facts of genetic disorders, heredity, risks, and alternatives, as well as to make a healthy adjustment to a family member's disorder and risk of recurrence. The process of learning is broken down into a hierarchical relationship between acquisition, understanding, and personalization of facts, and applied to the GC situation; e.g. "the options are as follows;" "they can be exercised by couples in certain ways;" and "we have the following choices to make." Personalization of knowledge means integration into one's own value system where it will affect decisions made, a process affected by factors such as stress. Often, the information provided by GC is not the only information the counselee possesses, and it will be integrated with other conceptions. Normative social influences (e.g. a family's attitude towards abortion) affect the behavioral intention. And finally, the behavioral intention is not always equivalent to the actual behavior. These processes are all related to the way in which a family deals with the stress caused by a genetic disorder. GC outcomes are easier to measure than those of psychological counseling. Extending the model to clinical application implies 1) assessment; 2) setting objectives; 3) counseling; and 4) evaluation.

  4. A covariate-adjustment regression model approach to noninferiority margin definition.

    Science.gov (United States)

    Nie, Lei; Soon, Guoxing

    2010-05-10

    To maintain the interpretability of the effect of experimental treatment (EXP) obtained from a noninferiority trial, current statistical approaches often require the constancy assumption. This assumption typically requires that the control treatment effect in the population of the active control trial is the same as its effect presented in the population of the historical trial. To prevent constancy assumption violation, clinical trial sponsors were recommended to make sure that the design of the active control trial is as close to the design of the historical trial as possible. However, these rigorous requirements are rarely fulfilled in practice. The inevitable discrepancies between the historical trial and the active control trial have led to debates on many controversial issues. Without support from a well-developed quantitative method to determine the impact of the discrepancies on the constancy assumption violation, a correct judgment seems difficult. In this paper, we present a covariate-adjustment generalized linear regression model approach to achieve two goals: (1) to quantify the impact of population difference between the historical trial and the active control trial on the degree of constancy assumption violation and (2) to redefine the active control treatment effect in the active control trial population if the quantification suggests an unacceptable violation. Through achieving goal (1), we examine whether or not a population difference leads to an unacceptable violation. Through achieving goal (2), we redefine the noninferiority margin if the violation is unacceptable. This approach allows us to correctly determine the effect of EXP in the noninferiority trial population when constancy assumption is violated due to the population difference. We illustrate the covariate-adjustment approach through a case study.

  5. Enlarging the toolbox for allergen epitope definition with an allergen-type model protein.

    Science.gov (United States)

    Berkner, Hanna; Seutter von Loetzen, Christian; Hartl, Maximilian; Randow, Stefanie; Gubesch, Michaela; Vogel, Lothar; Husslik, Felix; Reuter, Andreas; Lidholm, Jonas; Ballmer-Weber, Barbara; Vieths, Stefan; Rösch, Paul; Schiller, Dirk

    2014-01-01

    Birch pollen-allergic subjects produce polyclonal cross-reactive IgE antibodies that mediate pollen-associated food allergies. The major allergen Bet v 1 and its homologs in plant foods bind IgE in their native protein conformation. Information on location, number and clinical relevance of IgE epitopes is limited. We addressed the use of an allergen-related protein model to identify amino acids critical for IgE binding of PR-10 allergens. Norcoclaurine synthase (NCS) from meadow rue is structurally homologous to Bet v 1 but does not bind Bet v 1-reactive IgE. NCS was used as the template for epitope grafting. NCS variants were tested with sera from 70 birch pollen allergic subjects and with monoclonal antibody BV16 reported to compete with IgE binding to Bet v 1. We generated an NCS variant (Δ29NCSN57/I58E/D60N/V63P/D68K) harboring an IgE epitope of Bet v 1. Bet v 1-type protein folding of the NCS variant was evaluated by 1H-15N-HSQC NMR spectroscopy. BV16 bound the NCS variant and 71% (50/70 sera) of our study population showed significant IgE binding. We observed IgE and BV16 cross-reactivity to the epitope presented by the NCS variant in a subgroup of Bet v 1-related allergens. Moreover BV16 blocked IgE binding to the NCS variant. Antibody cross-reactivity depended on a defined orientation of amino acids within the Bet v 1-type conformation. Our system allows the evaluation of patient-specific epitope profiles and will facilitate both the identification of clinically relevant epitopes as biomarkers and the monitoring of therapeutic outcomes to improve diagnosis, prognosis, and therapy of allergies caused by PR-10 proteins.

  6. Enlarging the toolbox for allergen epitope definition with an allergen-type model protein.

    Directory of Open Access Journals (Sweden)

    Hanna Berkner

    Full Text Available Birch pollen-allergic subjects produce polyclonal cross-reactive IgE antibodies that mediate pollen-associated food allergies. The major allergen Bet v 1 and its homologs in plant foods bind IgE in their native protein conformation. Information on location, number and clinical relevance of IgE epitopes is limited. We addressed the use of an allergen-related protein model to identify amino acids critical for IgE binding of PR-10 allergens. Norcoclaurine synthase (NCS) from meadow rue is structurally homologous to Bet v 1 but does not bind Bet v 1-reactive IgE. NCS was used as the template for epitope grafting. NCS variants were tested with sera from 70 birch pollen allergic subjects and with monoclonal antibody BV16 reported to compete with IgE binding to Bet v 1. We generated an NCS variant (Δ29NCSN57/I58E/D60N/V63P/D68K) harboring an IgE epitope of Bet v 1. Bet v 1-type protein folding of the NCS variant was evaluated by 1H-15N-HSQC NMR spectroscopy. BV16 bound the NCS variant and 71% (50/70 sera) of our study population showed significant IgE binding. We observed IgE and BV16 cross-reactivity to the epitope presented by the NCS variant in a subgroup of Bet v 1-related allergens. Moreover BV16 blocked IgE binding to the NCS variant. Antibody cross-reactivity depended on a defined orientation of amino acids within the Bet v 1-type conformation. Our system allows the evaluation of patient-specific epitope profiles and will facilitate both the identification of clinically relevant epitopes as biomarkers and the monitoring of therapeutic outcomes to improve diagnosis, prognosis, and therapy of allergies caused by PR-10 proteins.

  7. The learning rate in three dimensional high definition video assisted microvascular anastomosis in a rat model.

    Science.gov (United States)

    Kotsougiani, Dimitra; Hundepool, Caroline A; Bulstra, Liselotte F; Shin, Delaney M; Shin, Alexander Y; Bishop, Allen T

    2016-11-01

    Three-dimensional (3D) high definition (HD) video systems are changing microsurgical practice by providing stereoscopic imaging not only for the surgeon and first assistant using the binocular microscope, but also for others involved in the surgery. The purpose of this study was to evaluate the potential to replace the binocular microscope for microarterial anastomoses and assess the rate of learning based on surgeons' experience. Two experienced and two novice microsurgeons performed a total of 88 rat femoral arterial anastomoses: 44 using a 3D HD video device ('Trenion', Carl Zeiss Meditech) and 44, a binocular microscope. We evaluated anastomosis time and modified OSATS scores as well as the subjects' preference for comfort, image adequacy and technical ease. Experienced microsurgeons showed a steep learning curve for anastomosis times with equivalent OSATS scores for both systems. However, prolonged anastomosis times were required when using the novel 3D-HD system rather than direct binocular vision. Comparable learning rates for anastomosis time were demonstrated for novice microsurgeons and modified OSATS scores did not differ between the different viewing technologies. All microsurgeons reported improved comfort for the 3D HD video system but found the image quality of the conventional microscope superior, facilitating technical ease. The present study demonstrates the potential of 3D HD video systems to replace current binocular microscopes, offering qualitatively-equivalent microvascular anastomosis with improved comfort for experienced microsurgeons. However, image quality was rated inferior with the 3D HD system resulting in prolonged anastomosis times. Microsurgical skill acquisition in novice microsurgeons was not influenced by the viewing system used.

  8. Between algorithm and model: different Molecular Surface definitions for the Poisson-Boltzmann based electrostatic characterization of biomolecules in solution.

    Science.gov (United States)

    Decherchi, Sergio; Colmenares, José; Catalano, Chiara Eva; Spagnuolo, Michela; Alexov, Emil; Rocchia, Walter

    2013-01-01

    The definition of a molecular surface which is physically sound and computationally efficient is a very interesting and long-standing problem in the implicit solvent continuum modeling of biomolecular systems as well as in the molecular graphics field. In this work, two molecular surfaces are evaluated with respect to their suitability for electrostatic computation as alternatives to the widely used Connolly-Richards surface: the blobby surface, an implicit Gaussian atom-centered surface, and the skin surface. As figures of merit, we considered surface differentiability and surface area continuity with respect to atom positions, and the agreement with explicit solvent simulations. Geometric analysis seems to privilege the skin over the blobby surface, and points to an unexpected relationship between the non-connectedness of the surface, caused by interstices in the solute volume, and the dependence of the surface area on atomic centers. In order to assess the ability to reproduce explicit solvent results, specific software tools have been developed to enable the use of the skin surface in Poisson-Boltzmann calculations with the DelPhi solver. Results indicate that the skin and Connolly surfaces have comparable performance from this last point of view.

  9. A four class model for digital breast histopathology using high-definition Fourier transform infrared (FT-IR) spectroscopic imaging

    Science.gov (United States)

    Mittal, Shachi; Wrobel, Tomasz P.; Leslie, L. S.; Kadjacsy-Balla, Andre; Bhargava, Rohit

    2016-03-01

    High-definition (HD) Fourier transform infrared (FT-IR) spectroscopic imaging is an emerging technique that enables chemistry-based visualization of tissue constituents and label-free extraction of biochemical information; its higher spatial detail also makes it a potentially useful platform for digital pathology. This methodology, along with fast and efficient data analysis, can enable both quantitative and automated pathology. Here we demonstrate a combination of HD FT-IR spectroscopic imaging of breast tissue microarrays (TMAs) with data analysis algorithms to perform histologic analysis. The samples comprise four tissue states, namely hyperplasia, dysplasia, cancerous and normal. We identify various cell types which could act as biomarkers for breast cancer detection and differentiate between them using statistical pattern recognition tools, i.e., Random Forest (RF) and Bayesian algorithms. Feature optimization is integrally carried out for the RF algorithm, reducing computation time as well as redundant spectral features. We achieved an order of magnitude reduction in the number of features with comparable prediction accuracy to that of the original feature set. Together, the demonstration of histology and selection of features paves the way for future applications in more complex models and rapid data acquisition.

  10. Turbulence Model

    DEFF Research Database (Denmark)

    Nielsen, Mogens Peter; Shui, Wan; Johansson, Jens

    2011-01-01

    In this report a new turbulence model is presented. In contrast to the bulk of modern work, the model is a classical continuum model with a relatively simple constitutive equation. The constitutive equation is, as usual in continuum mechanics, entirely empirical. It has the usual Newton or Stokes...... term with stresses depending linearly on the strain rates. This term takes into account the transfer of linear momentum from one part of the fluid to another. Besides there is another term, which takes into account the transfer of angular momentum. Thus the model implies a new definition of turbulence....... The model is in a virgin state, but a number of numerical tests have been carried out with good results. It is published to encourage other researchers to study the model in order to find its merits and possible limitations....

  11. Fuzzy Entropy: Axiomatic Definition and Neural Networks Model

    Institute of Scientific and Technical Information of China (English)

    卿铭; 曹悦; 黄天民

    2004-01-01

    The measure of uncertainty is adopted as a measure of information. Measures of fuzziness are known as fuzzy information measures, and the measure of the quantity of fuzzy information gained from a fuzzy set or fuzzy system is known as fuzzy entropy. Fuzzy entropy has been studied by many researchers in various fields. In this paper, the axiomatic definition of fuzzy entropy is first discussed. Then, a neural network model of fuzzy entropy is proposed, based on the computing capability of neural networks. Finally, two examples are discussed to show the efficiency of the model.
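The usual axioms (zero entropy for crisp sets, maximal entropy at membership 0.5) are satisfied, for instance, by the classical De Luca-Termini fuzzy entropy; a minimal sketch assuming that particular definition, not necessarily the one used in the paper:

```python
import numpy as np

def fuzzy_entropy(mu):
    """De Luca-Termini fuzzy entropy of membership values mu in [0, 1]:
    H = -sum(mu*ln(mu) + (1-mu)*ln(1-mu)), with 0*ln(0) taken as 0."""
    mu = np.asarray(mu, dtype=float)
    with np.errstate(divide="ignore", invalid="ignore"):
        t = mu * np.log(mu) + (1 - mu) * np.log(1 - mu)
    return -np.nansum(t)  # nan entries correspond to 0*ln(0) terms

crisp = fuzzy_entropy([0.0, 1.0, 1.0, 0.0])   # crisp set: zero entropy
maximal = fuzzy_entropy([0.5, 0.5])           # maximally fuzzy memberships
```

The entropy is zero for crisp sets and grows as memberships approach 0.5, matching the axiomatic requirements discussed in the abstract.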

  12. Multiple-step model-experiment matching allows precise definition of dynamical leg parameters in human running.

    Science.gov (United States)

    Ludwig, C; Grimmer, S; Seyfarth, A; Maus, H-M

    2012-09-21

    The spring-loaded inverted pendulum (SLIP) model is a well-established model for describing bouncy gaits like human running. The notion of spring-like leg behavior has led many researchers to compute the corresponding parameters, predominantly stiffness, in various experimental setups and in various ways. However, different methods yield different results, making comparison between studies difficult. Further, a model simulation with experimentally obtained leg parameters typically results in comparatively large differences between model and experimental center-of-mass trajectories. Here, we pursue the opposite approach, which is to calculate model parameters that allow reproduction of an experimental sequence of steps. In addition, to capture energy fluctuations, an extension of the SLIP (ESLIP) is required and presented. The excellent match of the models with the experiment validates the description of human running by the SLIP with the obtained parameters, which we hence call dynamical leg parameters.
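The SLIP stance phase described above can be sketched as a point mass on a massless spring leg pinned at the foot; the parameter values below (stiffness, mass, touch-down state) are illustrative placeholders, not the paper's fitted values:

```python
import math

def slip_stance(x, y, vx, vy, k=20000.0, l0=1.0, m=80.0, g=9.81, dt=1e-4):
    """Integrate the SLIP stance phase (foot pinned at the origin) with
    semi-implicit Euler until take-off: leg back at rest length l0 while
    the mass moves upward."""
    traj = [(x, y)]
    for _ in range(200_000):          # safety cap on integration steps
        l = math.hypot(x, y)
        if l >= l0 and vy > 0.0:      # take-off condition
            break
        f = k * (l0 - l)              # spring force magnitude along the leg
        ax = f * x / (l * m)
        ay = f * y / (l * m) - g
        vx += ax * dt
        vy += ay * dt
        x += vx * dt
        y += vy * dt
        traj.append((x, y))
    return x, y, vx, vy, traj

# Touch-down with the leg at rest length, mass moving forward and down.
x0, y0 = -0.2, math.sqrt(1.0 - 0.2 ** 2)
xf, yf, vxf, vyf, traj = slip_stance(x0, y0, 3.0, -0.5)
```

Fitting such a model to experimental steps, as the paper does, amounts to adjusting `k`, `l0` and the touch-down state until the simulated trajectory reproduces the measured center-of-mass motion.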

  13. Information Model for Product Modeling

    Institute of Scientific and Technical Information of China (English)

    焦国方; 刘慎权

    1992-01-01

    The key problems in product modeling for integrated CAD/CAM systems are the information structures and representations of products, which play increasingly important roles in engineering applications. Based on an investigation of engineering product information and from the viewpoint of the industrial process, this paper proposes information models and gives definitions of the framework of product information. The integration and consistency of product information are then discussed by introducing the entity and its instance. In summary, the information structures described in this paper have many advantages and properties helpful in engineering design.

  14. Normalizing a relativistic model of X-ray reflection. Definition of the reflection fraction and its implementation in relxill

    Science.gov (United States)

    Dauser, T.; García, J.; Walton, D. J.; Eikmann, W.; Kallman, T.; McClintock, J.; Wilms, J.

    2016-05-01

    Aims: The only relativistic reflection model that implements a parameter relating the intensity incident on an accretion disk to the observed intensity is relxill. The parameter used in earlier versions of this model, referred to as the reflection strength, is unsatisfactory; it has been superseded by a parameter that provides insight into the accretion geometry, namely the reflection fraction. The reflection fraction is defined as the ratio of the coronal intensity illuminating the disk to the coronal intensity that reaches the observer. Methods: The relxill model combines a general relativistic ray-tracing code and a photoionization code to compute the component of radiation reflected from an accretion disk that is illuminated by an external source. The reflection fraction is a particularly important parameter for relativistic models with well-defined geometry, such as the lamp post model, which is a focus of this paper. Results: Relativistic spectra are compared for three inclinations and for four values of the key parameter of the lamp post model, namely the height above the black hole of the illuminating, on-axis point source. In all cases, the strongest reflection is produced for low source heights and high spin. A low-spin black hole is shown to be incapable of producing enhanced relativistic reflection. Results for the relxill model are compared to those obtained with other models and a Monte Carlo simulation. Conclusions: By fitting data with the relxill model and the recently implemented reflection fraction, the geometry of a system can be constrained. The reflection fraction is independent of system parameters such as inclination and black hole spin. The reflection-fraction parameter was implemented with the name refl_frac in all flavours of the relxill model, and the non-relativistic reflection model xillver, in v0.4a (18 January 2016).

  15. Hsp90 inhibitors, part 1: definition of 3-D QSAutogrid/R models as a tool for virtual screening.

    Science.gov (United States)

    Ballante, Flavio; Caroli, Antonia; Wickersham, Richard B; Ragno, Rino

    2014-03-24

    The multichaperone heat shock protein (Hsp) 90 complex mediates the maturation and stability of a variety of oncogenic signaling proteins. For this reason, Hsp90 has emerged as a promising target for anticancer drug development. Herein, we describe a complete computational procedure for building several 3-D QSAR models used as the ligand-based (LB) component of a comprehensive LB and structure-based (SB) virtual screening (VS) protocol to identify novel molecular scaffolds of Hsp90 inhibitors. By application of the 3-D QSAutogrid/R method, eight SB PLS 3-D QSAR models were generated, leading to a final multiprobe (MP) 3-D QSAR pharmacophoric model capable of recognizing the most significant chemical features for Hsp90 inhibition. Both the monoprobe and multiprobe models were optimized, cross-validated, and tested against an external test set. The obtained statistical results confirmed the models as robust and predictive, suitable for use in a subsequent VS.

  16. Research on a Semi-Definite Programming SVM Model

    Institute of Scientific and Technical Information of China (English)

    张敏; 覃华; 苏一丹

    2011-01-01

    The classification accuracy and generalization ability of an SVM model depend largely on the selection of model parameters. Traditional parameter selection lacks theoretical support, is time-consuming, and does not always yield good classification precision. To address these problems, a semi-definite programming SVM model is proposed that can determine whether a given set of kernel parameters is valid and can combine simple kernels, via combination coefficients, into a better kernel matrix, improving the accuracy of the SVM. Experimental results on UCI data sets show that identifying effective kernel parameters with the new method is feasible and that the semi-definite programming SVM model outperforms the standard SVM. Furthermore, the generalization ability of the heterogeneous-kernel semi-definite programming SVM model significantly exceeds that of the homogeneous-kernel semi-definite programming SVM model.
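The core idea of combining simple kernels into a better kernel matrix can be illustrated without an SDP solver: any convex combination of valid Gram matrices is again positive semi-definite, hence a valid kernel. A numpy sketch (the combination weights here are fixed by hand; the paper's model would optimize them via semi-definite programming):

```python
import numpy as np

def rbf_gram(X, gamma):
    """Gram matrix of the RBF kernel exp(-gamma * ||xi - xj||^2)."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def combine_kernels(grams, weights):
    """Convex combination of Gram matrices; nonnegative weights summing
    to one guarantee the result is itself a valid (PSD) kernel matrix."""
    weights = np.asarray(weights, dtype=float)
    assert np.all(weights >= 0) and abs(weights.sum() - 1.0) < 1e-12
    return sum(w * G for w, G in zip(weights, grams))

def is_psd(G, tol=1e-8):
    """Check positive semi-definiteness via the smallest eigenvalue."""
    return np.linalg.eigvalsh(G).min() >= -tol

rng = np.random.default_rng(1)
X = rng.normal(size=(30, 4))
K = combine_kernels([rbf_gram(X, 0.1), rbf_gram(X, 2.0)], [0.3, 0.7])
```

Replacing the hand-picked `[0.3, 0.7]` with weights chosen by an SDP, subject to the same nonnegativity constraint, is exactly the kernel-learning step the abstract describes.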

  17. Stereometric Modelling

    Science.gov (United States)

    Grimaldi, P.

    2012-07-01

    Stereometric modelling means modelling achieved with: the use of a pair of virtual cameras with parallel axes, positioned at a mutual distance averaging 1/10 of the camera-object distance (in practice, the realization and use of a stereometric camera in the modelling program); the visualization of the shot in two distinct windows; and the stereoscopic viewing of the shot while modelling. Since "3D vision" is often inaccurately used to mean the simple perspective view of an object, the word stereo is added so that "3D stereo vision" stands for a true three-dimensional view in which the width, height and depth of the surveyed image can be measured. Through the development of a stereometric model, either real or virtual, and its "materialization" made visible with a stereoscope, a continuous online updating of the cultural heritage record is feasible with the help of photogrammetry and stereometric modelling. The catalogue of the Architectonic Photogrammetry Laboratory of Politecnico di Bari is available online at: http://rappresentazione.stereofot.it:591/StereoFot/FMPro?-db=StereoFot.fp5&-lay=Scheda&-format=cerca.htm&-view

  18. Monte-Carlo model development for evaluation of current clinical target volume definition for heterogeneous and hypoxic glioblastoma.

    Science.gov (United States)

    Moghaddasi, L; Bezak, E; Harriss-Phillips, W

    2016-05-07

    Clinical target volume (CTV) determination may be complex and subjective. In this work a microscopic-scale tumour model was developed to evaluate current CTV practices in glioblastoma multiforme (GBM) external radiotherapy. Previously, a Geant4 cell-based dosimetry model was developed to calculate the dose deposited in individual GBM cells. Microscopic extension probability (MEP) models were then developed using Matlab-2012a. The results of the cell-based dosimetry model and MEP models were combined to calculate survival fractions (SF) for CTV margins of 2.0 and 2.5 cm. In the current work, oxygenation and heterogeneous radiosensitivity profiles were incorporated into the GBM model. The genetic heterogeneity was modelled using a range of α/β values (linear-quadratic model parameters) associated with different GBM cell lines. These values were distributed randomly among the cells, drawn from a Gaussian-weighted sample of α/β values. Cellular oxygen pressure was distributed randomly, drawn from a sample weighted to profiles obtained from the literature. Three types of GBM models were analysed: homogeneous-normoxic, heterogeneous-normoxic, and heterogeneous-hypoxic. The SF in different regions of the tumour model and the effect of the CTV margin extension from 2.0 to 2.5 cm on SFs were investigated for three MEP models. The SF within the beam was increased by up to three and two orders of magnitude following incorporation of heterogeneous radiosensitivities and hypoxia, respectively, in the GBM model. However, the total SF was shown to be dominated by the presence of tumour cells in the penumbra region and, to a lesser extent, by genetic heterogeneity and hypoxia. CTV extension by 0.5 cm reduced the SF by a maximum of 78.6 ± 3.3%, 78.5 ± 3.3%, and 77.7 ± 3.1% for homogeneous-normoxic, heterogeneous-normoxic, and heterogeneous-hypoxic GBMs, respectively. This Monte-Carlo model was thus developed to quantitatively evaluate SF for genetically heterogeneous and hypoxic GBMs.
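The linear-quadratic survival computation with Gaussian-sampled radiosensitivity described above can be sketched as follows; all parameter values are hypothetical placeholders, not those of the paper's GBM cell lines:

```python
import numpy as np

def lq_survival(dose, alpha, beta):
    """Linear-quadratic cell survival: SF = exp(-(alpha*D + beta*D^2))."""
    return np.exp(-(alpha * dose + beta * dose ** 2))

def population_sf(dose, n_cells=10_000, alpha_mean=0.2, alpha_sd=0.05,
                  abr=10.0, seed=0):
    """Mean survival over a cell population with Gaussian-distributed
    alpha (clipped positive) and a fixed alpha/beta ratio abr (in Gy)."""
    rng = np.random.default_rng(seed)
    alpha = np.clip(rng.normal(alpha_mean, alpha_sd, n_cells), 1e-3, None)
    beta = alpha / abr
    return lq_survival(dose, alpha, beta).mean()

sf0 = population_sf(0.0)   # zero dose: full survival
sf2 = population_sf(2.0)   # survival after a single 2 Gy fraction
```

Extending this per-cell survival with a spatially varying dose (beam vs. penumbra) and an oxygen-dependent radiosensitivity modifier gives the kind of region-wise SF comparison reported in the abstract.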

  19. Monte-Carlo model development for evaluation of current clinical target volume definition for heterogeneous and hypoxic glioblastoma

    Science.gov (United States)

    Moghaddasi, L.; Bezak, E.; Harriss-Phillips, W.

    2016-05-01

    Clinical target volume (CTV) determination may be complex and subjective. In this work a microscopic-scale tumour model was developed to evaluate current CTV practices in glioblastoma multiforme (GBM) external radiotherapy. Previously, a Geant4 cell-based dosimetry model was developed to calculate the dose deposited in individual GBM cells. Microscopic extension probability (MEP) models were then developed using Matlab-2012a. The results of the cell-based dosimetry model and MEP models were combined to calculate survival fractions (SF) for CTV margins of 2.0 and 2.5 cm. In the current work, oxygenation and heterogeneous radiosensitivity profiles were incorporated into the GBM model. The genetic heterogeneity was modelled using a range of α/β values (linear-quadratic model parameters) associated with different GBM cell lines. These values were distributed randomly among the cells, drawn from a Gaussian-weighted sample of α/β values. Cellular oxygen pressure was distributed randomly, drawn from a sample weighted to profiles obtained from the literature. Three types of GBM models were analysed: homogeneous-normoxic, heterogeneous-normoxic, and heterogeneous-hypoxic. The SF in different regions of the tumour model and the effect of the CTV margin extension from 2.0 to 2.5 cm on SFs were investigated for three MEP models. The SF within the beam was increased by up to three and two orders of magnitude following incorporation of heterogeneous radiosensitivities and hypoxia, respectively, in the GBM model. However, the total SF was shown to be dominated by the presence of tumour cells in the penumbra region and, to a lesser extent, by genetic heterogeneity and hypoxia. CTV extension by 0.5 cm reduced the SF by a maximum of 78.6 ± 3.3%, 78.5 ± 3.3%, and 77.7 ± 3.1% for homogeneous-normoxic, heterogeneous-normoxic, and heterogeneous-hypoxic GBMs, respectively. This Monte-Carlo model was thus developed to quantitatively evaluate SF for genetically heterogeneous and hypoxic GBMs.

  20. On Communication Models

    Institute of Scientific and Technical Information of China (English)

    蒋娜; 谢有琪

    2012-01-01

    With the development of human society, the social hub has enlarged beyond a single community, to the extent that the world is deemed a community as a whole. Communication therefore plays an increasingly important role in our daily life. A communication model, consequently, is not so much a definition as a guide to communication. However, some existing communication models are not as practical as they once were. This paper makes an overall comparison among three communication models, the Coded Model, the Gable Communication Model and the Ostensive-Inferential Model, to see how they help people comprehend verbal and non-verbal communication.

  1. Research on the Definitions and Models of Crowdsourcing and Their Future Development

    Institute of Scientific and Technical Information of China (English)

    林素芬; 林峰

    2015-01-01

    Research on crowdsourcing definitions and models is interrelated and interactive, and it is the starting point of other research in the domain. It makes it easier for academics to foresee the development trend of crowdsourcing practice and also offers theoretical support for other aspects of crowdsourcing research. This paper introduces different crowdsourcing definitions and models, covering the information technology, business and knowledge domains. Building on the definition research, research on crowdsourcing models draws on crowdsourcing practice and serves academic research aims. Crowdsourcing information models, business models and knowledge models are the key models of crowdsourcing; they are closely related, interact with each other, and show a diversified development trend.

  2. Geosynchronous platform definition study. Volume 4, Part 2: Traffic analysis and system requirements for the new traffic model

    Science.gov (United States)

    1973-01-01

    A condensed summary of the traffic analyses and systems requirements for the new traffic model is presented. The results of each study activity are explained, key analyses are described, and important results are highlighted.

  3. The Definition and Ray-Tracing of B-Spline Objects in a Combinatorial Solid Geometric Modeling System

    Science.gov (United States)

    2013-04-01

    …surfaces consisting of Bézier curves and Non-Uniform Rational B-Spline Surfaces (NURBS). There are many times, however, when both modeling approaches … have allowed the integration of free-form objects in CSG systems. This presentation will discuss the development and integration of NURBS into the … Ballistics Research Laboratory CSG modeling system. Subject terms: NURBS, B-Spline, ray tracing, CSG, BRL-CAD.

  4. Definition and Experimental Validation of a Simplified Model for a Microgrid Thermal Network and its Integration into Energy Management Systems

    Directory of Open Access Journals (Sweden)

    Andrea Bonfiglio

    2016-11-01

    The present paper aims at defining a simplified but effective model of a thermal network that links the thermal power generation with the resulting temperature time profile in a heated or refrigerated environment. For this purpose, an equivalent electric circuit is proposed together with an experimental procedure to evaluate its input parameters. The paper also highlights the simplicity of implementation of the proposed model into a microgrid Energy Management System. This allows the optimal operation of the thermal network to be achieved on the basis of available data (the desired temperature profile) instead of a less realistic basis (such as the desired thermal power profile). The validation of the proposed model is performed on the Savona Campus Smart Polygeneration Microgrid (SPM) with the following steps: (i) identification of the parameters involved in the equivalent circuit, performed by minimizing the difference between the temperature profile calculated with the proposed model and the measured one in a set of training days; (ii) testing of the model accuracy on a set of testing days, comparing the measured temperature profiles with the calculated ones; (iii) implementation of the model into an Energy Management System in order to optimize the thermal generation starting from a desired hourly temperature profile.
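A minimal version of such an equivalent electric circuit is a single RC stage, with a thermal resistance to the outside and a lumped heat capacity; a sketch with hypothetical parameter values (the paper identifies its parameters by fitting training-day measurements):

```python
def simulate_room(T0, T_ext, power, R, C, dt, steps):
    """First-order equivalent-circuit thermal model, forward Euler:
        C * dT/dt = P(t) - (T - T_ext) / R
    R: thermal resistance to outside [K/W], C: heat capacity [J/K]."""
    T = T0
    out = [T]
    for k in range(steps):
        dT = (power(k * dt) - (T - T_ext) / R) / C
        T += dT * dt
        out.append(T)
    return out

# Constant 2 kW heating of a room starting at the outside temperature.
# Steady state is T_ext + P*R = 5 + 2000*0.01 = 25 degrees.
temps = simulate_room(T0=5.0, T_ext=5.0, power=lambda t: 2000.0,
                      R=0.01, C=2.0e6, dt=60.0, steps=5000)
```

In an Energy Management System this relation would be inverted: given a desired temperature trajectory, the model yields the heating power profile to schedule, which is exactly the use case the abstract motivates.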

  5. Mechanisms of chemical vapor generation by aqueous tetrahydridoborate. Recent developments toward the definition of a more general reaction model

    Science.gov (United States)

    D'Ulivo, Alessandro

    2016-05-01

    A reaction model describing the reactivity of metal and semimetal species with aqueous tetrahydridoborate (THB) has been drawn up taking into account the mechanism of chemical vapor generation (CVG) of hydrides, recent evidence on the mechanism of interference and formation of byproducts in arsane generation, and other evidence from the synthesis of nanoparticles and the catalytic hydrolysis of THB by metal nanoparticles. The new "non-analytical" reaction model is of more general validity than the previously described "analytical" reaction model for CVG. The non-analytical model is valid for the reaction of a single analyte with THB and for conditions approaching those typically encountered in the synthesis of nanoparticles and macroprecipitates. It reduces to the previously proposed analytical model under conditions typically employed in CVG for trace analysis (analyte below the μM level, borane/analyte ≫ 10³ mol/mol, no interference). The non-analytical reaction model is not able to explain all the interference effects observed in CVG, which can be achieved only by assuming interaction among the species of the reaction pathways of different analytical substrates. The reunification of CVG, the synthesis of nanoparticles by aqueous THB and the catalytic hydrolysis of THB within a common framework contributes to rationalizing the complex reactivity of aqueous THB with metal and semimetal species.

  6. Leadership Models.

    Science.gov (United States)

    Freeman, Thomas J.

    This paper discusses six different models of organizational structure and leadership, including the scalar chain or pyramid model, the continuum model, the grid model, the linking pin model, the contingency model, and the circle or democratic model. Each model is examined in a separate section that describes the model and its development, lists…

  7. On inclusion of water resource management in Earth System models – Part 1: Problem definition and representation of water demand

    Directory of Open Access Journals (Sweden)

    A. Nazemi

    2014-07-01

    Human activities have caused various changes in the Earth System, and hence the interconnections between humans and the Earth System should be recognized and reflected in models that simulate Earth System processes. One key anthropogenic activity is water resource management, which determines the dynamics of human-water interactions in time and space. There are various reasons to include water resource management in Earth System models. First, the extent of human water requirements is increasing rapidly at the global scale, and it is crucial to analyze the possible imbalance between water demand and supply under various scenarios of climate change and across various temporal and spatial scales. Second, recent observations show that human-water interactions, manifested through water resource management, can substantially alter the terrestrial water cycle, affect land-atmospheric feedbacks and may further interact with climate and contribute to sea-level change. Here, we divide water resource management into two interdependent elements, related to water demand as well as water supply and allocation. In this paper, we survey the current literature on how various water demands have been included in large-scale models, including Land Surface Schemes and Global Hydrological Models. The available algorithms are classified based on the type of demand, mode of simulation and underlying modeling assumptions. We discuss the pros and cons of available algorithms, address various sources of uncertainty and highlight limitations in current applications. We conclude that the current capability of large-scale models in terms of representing human water demands is rather limited, particularly with respect to future projections and online simulations. We argue that current limitations in simulating various human demands and their impact on the Earth System are mainly due to uncertainties in data support, demand algorithms and large-scale models.

  8. On inclusion of water resource management in Earth system models - Part 1: Problem definition and representation of water demand

    Science.gov (United States)

    Nazemi, A.; Wheater, H. S.

    2015-01-01

    Human activities have caused various changes to the Earth system, and hence the interconnections between human activities and the Earth system should be recognized and reflected in models that simulate Earth system processes. One key anthropogenic activity is water resource management, which determines the dynamics of human-water interactions in time and space and controls human livelihoods and economy, including energy and food production. There are immediate needs to include water resource management in Earth system models. First, the extent of human water requirements is increasing rapidly at the global scale and it is crucial to analyze the possible imbalance between water demands and supply under various scenarios of climate change and across various temporal and spatial scales. Second, recent observations show that human-water interactions, manifested through water resource management, can substantially alter the terrestrial water cycle, affect land-atmospheric feedbacks and may further interact with climate and contribute to sea-level change. Due to the importance of water resource management in determining the future of the global water and climate cycles, the World Climate Research Program's Global Energy and Water Exchanges project (WRCP-GEWEX) has recently identified gaps in describing human-water interactions as one of the grand challenges in Earth system modeling (GEWEX, 2012). Here, we divide water resource management into two interdependent elements, related firstly to water demand and secondly to water supply and allocation. In this paper, we survey the current literature on how various components of water demand have been included in large-scale models, in particular land surface and global hydrological models. Issues of water supply and allocation are addressed in a companion paper. The available algorithms to represent the dominant demands are classified based on the demand type, mode of simulation and underlying modeling assumptions. 

  9. On the Physical Interpretation of the Saleh-Valenzuela Model and the definition of its power delay profiles

    NARCIS (Netherlands)

    Meijerink, Arjan; Molisch, Andreas F.

    2014-01-01

    The physical motivation and interpretation of the stochastic propagation channel model of Saleh and Valenzuela are discussed in detail. This motivation mainly relies on assumptions on the stochastic properties of the positions of transmitter, receiver and scatterers in the propagation environment,

  10. Model Transformations? Transformation Models!

    NARCIS (Netherlands)

    Bézivin, J.; Büttner, F.; Gogolla, M.; Jouault, F.; Kurtev, I.; Lindow, A.

    2006-01-01

    Much of the current work on model transformations seems essentially operational and executable in nature. Executable descriptions are necessary from the point of view of implementation. But from a conceptual point of view, transformations can also be viewed as descriptive models by stating only the

  11. Modelling business models

    NARCIS (Netherlands)

    Simonse, W.L.

    2014-01-01

    Business model design does not always produce a “design” or “model” as the expected result. However, when designers are involved, a visual model or artifact is produced. To assist strategic managers in thinking about how they can act, the designers’ challenge is to combine both strategy and design…

  12. Investigation on the Flexural Creep Stiffness Behavior of PC-ABS Material Processed by Fused Deposition Modeling Using Response Surface Definitive Screening Design

    Science.gov (United States)

    Mohamed, Omar Ahmed; Masood, Syed Hasan; Bhowmik, Jahar Lal

    2017-03-01

    The resistance of polymeric materials to time-dependent plastic deformation is an important requirement of the fused deposition modeling (FDM) design process, its processed products, and their application for long-term loading, durability, and reliability. The creep performance of the material and part processed by FDM is the fundamental criterion for many applications with strict dimensional stability requirements, including medical implants, electrical and electronic products, and various automotive applications. Herein, the effect of FDM fabrication conditions on the flexural creep stiffness behavior of polycarbonate-acrylonitrile-butadiene-styrene processed parts was investigated. A relatively new class of experimental design called "definitive screening design" was adopted for this investigation. The effects of process variables on flexural creep stiffness behavior were monitored, and the best suited quadratic polynomial model with high coefficient of determination (R²) value was developed. This study highlights the value of response surface definitive screening design in optimizing properties for the products and materials, and it demonstrates its role and potential application in material processing and additive manufacturing.

  13. Investigation on the Flexural Creep Stiffness Behavior of PC-ABS Material Processed by Fused Deposition Modeling Using Response Surface Definitive Screening Design

    Science.gov (United States)

    Mohamed, Omar Ahmed; Masood, Syed Hasan; Bhowmik, Jahar Lal

    2016-12-01

    The resistance of polymeric materials to time-dependent plastic deformation is an important requirement of the fused deposition modeling (FDM) design process, its processed products, and their application for long-term loading, durability, and reliability. The creep performance of the material and part processed by FDM is the fundamental criterion for many applications with strict dimensional stability requirements, including medical implants, electrical and electronic products, and various automotive applications. Herein, the effect of FDM fabrication conditions on the flexural creep stiffness behavior of polycarbonate-acrylonitrile-butadiene-styrene processed parts was investigated. A relatively new class of experimental design called "definitive screening design" was adopted for this investigation. The effects of process variables on flexural creep stiffness behavior were monitored, and the best suited quadratic polynomial model with high coefficient of determination (R²) value was developed. This study highlights the value of response surface definitive screening design in optimizing properties for the products and materials, and it demonstrates its role and potential application in material processing and additive manufacturing.

  14. Definition of a unique model for the improvement of the monitoring network and seismic risk reduction of the school buildings in Italy

    Science.gov (United States)

    Greco, M.; Console, R.; Colangelo, A.; Cioè, A.; Trivigno, L.

    2015-12-01

    In the last decade the safety of Italian schools against seismic risk has been a crucial subject for Italian legislation, as well as for the UN Convention on Disaster Risk Reduction and the more specific priorities adopted within the OECD. Recently, the Italian Parliament approved a law (L98/2013) which launched the Commissioning Safety of School Buildings Plan and the Definition of a Unique Model, to be developed by the CGIAM, in order to improve the monitoring network and seismic risk reduction (SRR). The objective of this law is to increase the knowledge basis of public actions aimed at improving the effectiveness of SRR policy for school buildings. The actions of the CGIAM consist in the identification of a significant number of school buildings in Italy, classified mainly by type of construction and material, on which to calibrate specific synthetic parameters and test models. Furthermore, the activities address the quantitative evaluation of intervention efficacy and the set-up of simple instrumental monitoring systems, able to support periodic checks of the general state of preservation. The main issues carried out by the CGIAM concern the completion and enrichment of the existing database of school buildings, through collaboration with the Ministries and other relevant Italian research institutions, the evaluation of seismic hazard and site conditions, and the definition of other seismic risk factors. A cost-benefit analysis as well as the application and dissemination of such tools are proposed too. At the same time, the CGIAM contributes to the definition, experimental installation and use of a Simplified Accelerometric Monitoring Network for school buildings, including a testing phase on a limited number of structures. The work proposes a synthetic overview of the employed methodologies as well as the first results arising from the research and implementation activities.

  15. Product models for the Construction industry

    DEFF Research Database (Denmark)

    Sørensen, Lars Schiøtt

    1996-01-01

    Different types of product models for the building sector were elaborated and grouped, and some discussion of the different models was given. A definition of product models was provided.

  16. SPATIO-TEMPORAL SEGMENTATION AND REGIONS TRACKING OF HIGH DEFINITION VIDEO SEQUENCES USING A MARKOV RANDOM FIELD MODEL

    OpenAIRE

    Brouard, Olivier; Delannay, Fabrice; Ricordel, Vincent; Barba, Dominique

    2008-01-01

    International audience; In this paper, we propose a Markov random field sequence segmentation and region tracking model, which aims at combining color, texture, and motion features. First, a motion-based segmentation is realized: the global motion of the video sequence is estimated and compensated, and from the remaining motion information the motion segmentation is achieved. Then, we use a Markovian approach to update and track the video objects over time. By video object, we mean typically, a...

  17. 78 FR 31836 - Special Conditions: Embraer S.A., Model EMB-550 Airplane, Dive Speed Definition With Speed...

    Science.gov (United States)

    2013-05-28

    ... in the Federal Register on January 24, 2013 (78 FR 5146). We received no substantive comments, and..., Dive Speed Definition With Speed Protection System AGENCY: Federal Aviation Administration (FAA), DOT... design features include a high-speed protection system. The applicable airworthiness regulations do...

  18. Steady induction effects in geomagnetism. Part 1C: Geomagnetic estimation of steady surficial core motions: Application to the definitive geomagnetic reference field models

    Science.gov (United States)

    Voorhies, Coerte V.

    1993-01-01

    In the source-free mantle/frozen-flux core magnetic earth model, the non-linear inverse steady motional induction problem was solved using the method presented in Part 1B. How that method was applied to estimate steady, broad-scale fluid velocity fields near the top of Earth's core that induce the secular change indicated by the Definitive Geomagnetic Reference Field (DGRF) models from 1945 to 1980 is described. Special attention is given to the derivation of weight matrices for the DGRF models because the weights determine the apparent significance of the residual secular change. The derived weight matrices also enable estimation of the secular change signal-to-noise ratio characterizing the DGRF models. Two types of weights were derived in 1987-88: radial field weights for fitting the evolution of the broad-scale portion of the radial geomagnetic field component at Earth's surface implied by the DGRF's, and general weights for fitting the evolution of the broad-scale portion of the scalar potential specified by these models. The difference is non-trivial because not all the geomagnetic data represented by the DGRF's constrain the radial field component. For radial field weights (or general weights), a quantitatively acceptable explication of broad-scale secular change relative to the 1980 Magsat epoch must account for 99.94271 percent (or 99.98784 percent) of the total weighted variance accumulated therein. Tolerable normalized root-mean-square weighted residuals of 2.394 percent (or 1.103 percent) are less than the 7 percent errors expected in the source-free mantle/frozen-flux core approximation.

  19. Wall-to-Wall Forest Mapping Based on Digital Surface Models from Image-Based Point Clouds and a NFI Forest Definition

    Directory of Open Access Journals (Sweden)

    Lars T. Waser

    2015-12-01

    Full Text Available Forest mapping is an important source of information for assessing woodland resources and a key issue for any National Forest Inventory (NFI). In the present study, a detailed wall-to-wall forest cover map was generated for all of Switzerland, which meets the requirements of the Swiss NFI forest definition. The workflow is highly automated and based on digital surface models from image-based point clouds of airborne digital sensor data. It fully takes into account the four key criteria of minimum tree height, crown coverage, width, and land use. The forest cover map was validated using almost 10,000 terrestrial and stereo-interpreted NFI plots, which showed 97% overall agreement. The validation covered different categories such as the five production regions, altitude, tree type, and distance to the forest border. Overall accuracy was lower at forest borders but increased with increasing distance from the forest border. Commission errors remained stable at around 10%, but increased to 17.6% at the upper tree line. Omission errors were low at 1%–10%, but also increased with altitude and mainly occurred at the upper tree line (19.7%). The main reasons for this are the lower image quality and the NFI height definition for forest, which apparently excludes shrub forest from the mask. The presented forest mapping approach is superior to existing products due to its national coverage, high level of detail, regular updating, and implementation of the land use criterion.
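
    The accuracy figures named above (overall agreement, commission errors, omission errors) all follow from a standard 2x2 confusion matrix of mapped versus reference plots. A minimal sketch, with invented plot counts rather than the study's data:

    ```python
    # Validation metrics for a binary forest/non-forest map, computed from a
    # 2x2 confusion matrix of mapped vs. reference NFI plots.
    # The counts below are invented for illustration, not the study's data.
    tp, fp, fn, tn = 900, 10, 20, 70  # true pos., false pos., false neg., true neg.

    overall_accuracy = (tp + tn) / (tp + fp + fn + tn)  # agreement with reference
    commission_error = fp / (tp + fp)  # mapped as forest but actually not forest
    omission_error = fn / (tp + fn)    # reference forest missed by the map

    print(overall_accuracy, commission_error, omission_error)
    ```

    With these invented counts the overall accuracy is 0.97, matching the kind of headline figure reported above.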

  20. The models of phase transformations definition in the software Deform and their effect on the output values from the numerical simulation of gear thermal processing.

    Directory of Open Access Journals (Sweden)

    Sona Benesova

    2014-11-01

    Full Text Available With the aid of DEFORM® software it is possible to conduct numerical simulation of workpiece phase composition during and upon heat treatment. The computation can be based either on the graphical representation of the TTT diagram of the steel in question or on one of the mathematical models integrated in the software, the latter being applicable if the required constants are known. The present paper evaluates the differences between results of numerical simulations with various definitions of phase transformation for the heat treatment of a gearwheel and a specially prepared specimen of simple shape. It was found that the preparation of input data, in terms of thorough mapping of the characteristics of the material, is essential.

  1. Modelling Defiguration

    DEFF Research Database (Denmark)

    Bork Petersen, Franziska

    2013-01-01

    For the presentation of his autumn/winter 2012 collection in Paris and subsequently in Copenhagen, Danish designer Henrik Vibskov installed a mobile catwalk. The article investigates the choreographic impact of this scenography on those who move through it. Drawing on Dance Studies, the analytical...... advantageous manner. Stepping on the catwalk’s sloping, moving surfaces decelerates the models’ walk and makes it cautious, hesitant and shaky: suddenly the models lack exactly the affirmative, staccato, striving quality of motion, and the condescending expression that they perform on most contemporary...... catwalks. Vibskov’s catwalk induces what the dance scholar Gabriele Brandstetter has labelled a ‘defigurative choreography’: a straying from definitions, which exist in ballet as in other movement-based genres, of how a figure should move and appear (1998). The catwalk scenography in this instance...

  2. [First definition of minimal care model: the role of nurses, physiotherapists, dietitians and psychologists in preventive and rehabilitative cardiology].

    Science.gov (United States)

    Bettinardi, Ornella; da Vico, Letizia; Pierobon, Antonia; Iannucci, Manuela; Maffezzoni, Barbara; Borghi, Silvana; Ferrari, Marina; Brazzo, Silvia; Mazza, Antonio; Sommaruga, Marinella; Angelino, Elisabetta; Biffi, Barbara; Agostini, Susanna; Masini, Maria Luisa; Ambrosetti, Marco; Faggiano, Pompilio; Griffo, Raffaele

    2014-09-01

    Rehabilitative and preventive cardiology (CRP) is configured as a preventive intervention to "gain health" through a process of multifactorial care that reduces disability and the risk of subsequent cardiovascular events. It makes use of an interdisciplinary team in which every professional needs multiple intervention paths because of the different levels of clinical and functional complexity of the cardiac patients who currently have access to rehabilitation. The document refers to the interventions by nurses, physiotherapists, dietitians and psychologists that are part of the CRP rehabilitation team; interventions whose empirical effectiveness and organizational efficiency have been documented on the basis of scientific evidence and clinical practice. The methodological approach of this paper is a first attempt to define, through a consensus model, the minimum standards for an evidence-based CRP characterized by clearly defined criteria that can be used by CRP operators. The document describes the activities to be carried out by nurses, physiotherapists, dietitians and psychologists in each of the phases included in the pathways of care. The pathways identified were divided, according to the type of patients who have access to CRP and to the phases of care (including initial assessment, intervention, evaluation and final reporting), into high, medium and low complexity. Examples of reporting models, used by team operators according to the principles of good clinical practice, are provided. This is done to allow traceability of operations, to encourage communication within the working group and with the patient and the caregiver, and to give possible indications for the post-rehabilitation phase.

  3. Realization of MBD model digitalization definition in UG/NX

    Institute of Scientific and Technical Information of China (English)

    陶杰; 葛如海; 周临震

    2013-01-01

    Through studying the development process of DPD and an in-depth analysis of the relationship between DPD and MBD, the importance of DPD as the basis of the MBD technology application system was revealed. The content of Digital Product Definition, the definition method of GD&T in a 3D model, the application of 3D cutaway views and the annotation of non-geometric manufacturing information were analyzed for an air bag support on the UG/NX platform. In addition, the expression method of product manufacturing information under the NX platform was given. DPD, as the basis of MBD technology, will completely change the information carrier from 2D drawings to the 3D model and lead digital manufacturing technology to great changes.

  4. Actant Models

    DEFF Research Database (Denmark)

    Poulsen, Helle

    1996-01-01

    This paper presents a functional modelling method called Actant Modelling, rooted in linguistics and semiotics. Actant modelling can be integrated with Multilevel Flow Modelling (MFM) in order to give an interpretation of actants.

  5. Programming Models in HPC

    Energy Technology Data Exchange (ETDEWEB)

    Shipman, Galen M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-06-13

    These are the slides for a presentation on programming models in HPC, at the Los Alamos National Laboratory's Parallel Computing Summer School. The following topics are covered: Flynn's Taxonomy of computer architectures; single instruction single data; single instruction multiple data; multiple instruction multiple data; address space organization; definition of Trinity (Intel Xeon-Phi is a MIMD architecture); single program multiple data; multiple program multiple data; ExMatEx workflow overview; definition of a programming model, programming languages, runtime systems; programming model and environments; MPI (Message Passing Interface); OpenMP; Kokkos (Performance Portable Thread-Parallel Programming Model); Kokkos abstractions, patterns, policies, and spaces; RAJA, a systematic approach to node-level portability and tuning; overview of the Legion Programming Model; mapping tasks and data to hardware resources; interoperability: supporting task-level models; Legion S3D execution and performance details; workflow, integration of external resources into the programming model.
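
    The single program multiple data (SPMD) style listed in the slides can be illustrated without an MPI installation. The sketch below uses Python processes as a stand-in for MPI ranks (an assumption for illustration only, not part of the presentation): every "rank" runs the same code on its own slice of the data, followed by a reduction.

    ```python
    from multiprocessing import Pool

    # SPMD sketch: every "rank" executes the same program on its own portion of
    # the data, then the partial results are combined -- a stand-in for an MPI
    # job, using Python processes instead of MPI ranks (illustrative only).
    DATA = list(range(100))
    NRANKS = 4

    def rank_work(rank):
        # Cyclic decomposition: rank r owns elements r, r + NRANKS, r + 2*NRANKS, ...
        return sum(DATA[rank::NRANKS])

    if __name__ == "__main__":
        with Pool(NRANKS) as pool:
            partials = pool.map(rank_work, range(NRANKS))  # same code, each rank
        total = sum(partials)                              # the "reduction" step
        print(total)  # equals sum(DATA)
    ```

    In a real MPI program the decomposition would be driven by the process rank and the reduction by a collective such as MPI_Reduce; the structure is the same.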

  6. Modelling the models

    CERN Multimedia

    Anaïs Schaeffer

    2012-01-01

    By analysing the production of mesons in the forward region of LHC proton-proton collisions, the LHCf collaboration has provided key information needed to calibrate extremely high-energy cosmic ray models.   Average transverse momentum (pT) as a function of rapidity loss ∆y. Black dots represent LHCf data and the red diamonds represent SPS experiment UA7 results. The predictions of hadronic interaction models are shown by open boxes (sibyll 2.1), open circles (qgsjet II-03) and open triangles (epos 1.99). Among these models, epos 1.99 shows the best overall agreement with the LHCf data. LHCf is dedicated to the measurement of neutral particles emitted at extremely small angles in the very forward region of LHC collisions. Two imaging calorimeters – Arm1 and Arm2 – take data 140 m either side of the ATLAS interaction point. “The physics goal of this type of analysis is to provide data for calibrating the hadron interaction models – the well-known &...

  7. High definition clouds and precipitation for climate prediction -results from a unified German research initiative on high resolution modeling and observations

    Science.gov (United States)

    Rauser, F.

    2013-12-01

    We present results from the German BMBF initiative 'High Definition Cloud and Precipitation for advancing Climate Prediction -HD(CP)2'. This initiative addresses most of the problems that are discussed in this session in one, unified approach: cloud physics, convection, boundary layer development, radiation and subgrid variability are approached in one organizational framework. HD(CP)2 merges both observation and high performance computing / model development communities to tackle a shared problem: how to improve the understanding of the most important subgrid-scale processes of cloud and precipitation physics, and how to utilize this knowledge for improved climate predictions. HD(CP)2 is a coordinated initiative to: (i) realize; (ii) evaluate; and (iii) statistically characterize and exploit for the purpose of both parameterization development and cloud / precipitation feedback analysis; ultra-high resolution (100 m in the horizontal, 10-50 m in the vertical) regional hind-casts over time periods (3-15 y) and spatial scales (1000-1500 km) that are climatically meaningful. HD(CP)2 thus consists of three elements (the model development and simulations, their observational evaluation and exploitation/synthesis to advance CP prediction) and its first three-year phase has started on October 1st 2012. As a central part of HD(CP)2, the HD(CP)2 Observational Prototype Experiment (HOPE) has been carried out in spring 2013. In this campaign, high resolution measurements with a multitude of instruments from all major centers in Germany have been carried out in a limited domain, to allow for unprecedented resolution and precision in the observation of microphysics parameters on a resolution that will allow for evaluation and improvement of ultra-high resolution models. At the same time, a local area version of the new climate model ICON of the Max Planck Institute and the German weather service has been developed that allows for LES-type simulations on high resolutions on

  8. Modeling Dependencies in Critical Infrastructures

    NARCIS (Netherlands)

    Nieuwenhuijs, A.H.; Luiijf, H.A.M.; Klaver, M.H.A.

    2009-01-01

    This paper describes a model for expressing critical infrastructure dependencies. The model addresses the limitations of existing approaches with respect to clarity of definition, support for quality and the influence of operating states of critical infrastructures and environmental factors.

  9. Criticality Model

    Energy Technology Data Exchange (ETDEWEB)

    A. Alsaed

    2004-09-14

    The "Disposal Criticality Analysis Methodology Topical Report" (YMP 2003) presents the methodology for evaluating potential criticality situations in the monitored geologic repository. As stated in the referenced Topical Report, the detailed methodology for performing the disposal criticality analyses will be documented in model reports. Many of the models developed in support of the Topical Report differ from the definition of models as given in the Office of Civilian Radioactive Waste Management procedure AP-SIII.10Q, "Models", in that they are procedural, rather than mathematical. These model reports document the detailed methodology necessary to implement the approach presented in the Disposal Criticality Analysis Methodology Topical Report and provide calculations utilizing the methodology. Thus, the governing procedure for this type of report is AP-3.12Q, "Design Calculations and Analyses". The "Criticality Model" is of this latter type, providing a process for evaluating the criticality potential of in-package and external configurations. The purpose of this analysis is to lay out the process for calculating the criticality potential for various in-package and external configurations and to calculate lower-bound tolerance limit (LBTL) values and determine range of applicability (ROA) parameters. The LBTL calculations and the ROA determinations are performed using selected benchmark experiments that are applicable to various waste forms and various in-package and external configurations. The waste forms considered in this calculation are pressurized water reactor (PWR), boiling water reactor (BWR), Fast Flux Test Facility (FFTF), Training Research Isotope General Atomic (TRIGA), Enrico Fermi, Shippingport pressurized water reactor, Shippingport light water breeder reactor (LWBR), N-Reactor, Melt and Dilute, and Fort Saint Vrain Reactor spent nuclear fuel (SNF). The scope of

  10. Psychometric latent response models

    NARCIS (Netherlands)

    Maris, E.

    1995-01-01

    In this paper, some psychometric models will be presented that belong to the larger class of latent response models (LRMs). First, LRMs are introduced by means of an application in the field of componential item response theory (Embretson, 1980, 1984). Second, a general definition of LRMs (not specifi

  11. A Bigraph Relational Model

    DEFF Research Database (Denmark)

    Beauquier, Maxime; Schürmann, Carsten

    2011-01-01

    In this paper, we present a model based on relations for bigraphical reactive system [Milner09]. Its defining characteristics are that validity and reaction relations are captured as traces in a multi-set rewriting system. The relational model is derived from Milner's graphical definition...

  12. Definition of Linear Color Models in the RGB Vector Color Space to Detect Red Peaches in Orchard Images Taken under Natural Illumination

    Directory of Open Access Journals (Sweden)

    Jordi Palacín

    2012-06-01

    Full Text Available This work proposes the detection of red peaches in orchard images based on the definition of different linear color models in the RGB vector color space. The classification and segmentation of the pixels of the image is then performed by comparing the color distance from each pixel to the different previously defined linear color models. The methodology proposed has been tested with images obtained in a real orchard under natural light. The peach variety in the orchard was the paraguayo (Prunus persica var. platycarpa peach with red skin. The segmentation results showed that the area of the red peaches in the images was detected with an average error of 11.6%; 19.7% in the case of bright illumination; 8.2% in the case of low illumination; 8.6% for occlusion up to 33%; 12.2% in the case of occlusion between 34 and 66%; and 23% for occlusion above 66%. Finally, a methodology was proposed to estimate the diameter of the fruits based on an ellipsoidal fitting. A first diameter was obtained by using all the contour pixels and a second diameter was obtained by rejecting some pixels of the contour. This approach enables a rough estimate of the fruit occlusion percentage range by comparing the two diameter estimates.
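
    The core classification step described above (comparing each pixel's color distance to a set of linear models in the RGB vector space) can be sketched as follows. The anchor points and directions are invented placeholders, not the models actually calibrated in the study:

    ```python
    import numpy as np

    # Sketch of pixel classification by color distance to linear models in RGB
    # space. Each linear model is a line defined by an anchor point and a
    # direction; the values below are invented placeholders.
    MODELS = {
        "red_peach": (np.array([120.0, 30.0, 40.0]), np.array([1.0, 0.2, 0.25])),
        "leaf":      (np.array([40.0, 90.0, 30.0]),  np.array([0.4, 1.0, 0.35])),
    }

    def distance_to_line(pixel, anchor, direction):
        """Euclidean distance from an RGB pixel to a line in RGB vector space."""
        d = direction / np.linalg.norm(direction)
        v = pixel - anchor
        return np.linalg.norm(v - np.dot(v, d) * d)   # orthogonal rejection of v

    def classify(pixel):
        """Assign a pixel to the linear color model it lies closest to."""
        return min(MODELS, key=lambda name: distance_to_line(pixel, *MODELS[name]))
    ```

    Segmentation then amounts to applying `classify` to every pixel and keeping the connected regions labelled as the target class.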

  13. Deterministic modelling of the cumulative impacts of underground structures on urban groundwater flow and the definition of a potential state of urban groundwater flow: example of Lyon, France

    Science.gov (United States)

    Attard, Guillaume; Rossier, Yvan; Winiarski, Thierry; Cuvillier, Loann; Eisenlohr, Laurent

    2016-08-01

    Underground structures have been shown to have a great influence on subsoil resources in urban aquifers. A methodology to assess the actual and the potential state of the groundwater flow in an urban area is proposed. The study develops a three-dimensional modeling approach to understand the cumulative impacts of underground infrastructures on urban groundwater flow, using a case in the city of Lyon (France). All known underground structures were integrated in the numerical model. Several simulations were run: the actual state of groundwater flow, the potential state of groundwater flow (without underground structures), an intermediate state (without impervious structures), and a transient simulation of the actual state of groundwater flow. The results show that underground structures fragment groundwater flow systems leading to a modification of the aquifer regime. For the case studied, the flow systems are shown to be stable over time with a transient simulation. Structures with drainage systems are shown to have a major impact on flow systems. The barrier effect of impervious structures was negligible because of the small hydraulic gradient of the area. The study demonstrates that the definition of a potential urban groundwater flow and the depiction of urban flow systems, which involves understanding the impact of underground structures, are important issues with respect to urban underground planning.

  14. Efficient definitive endoderm induction from mouse embryonic stem cell adherent cultures: A rapid screening model for differentiation studies

    Directory of Open Access Journals (Sweden)

    Josué Kunjom Mfopou

    2014-01-01

    Full Text Available Definitive endoderm (DE differentiation from mouse embryonic stem cell (mESC monolayer cultures has been limited by poor cell survival or low efficiency. Recently, a combination of TGFβ and Wnt activation with BMP inhibition improved DE induction in embryoid bodies cultured in suspension. Based on these observations we developed a protocol to efficiently induce DE cells in monolayer cultures of mESCs. We obtained a good cell yield with 54.92% DE induction as shown by Foxa2, Sox17, Cxcr4 and E-Cadherin expression. These DE-cells could be further differentiated into posterior foregut and pancreatic phenotypes using a culture protocol initially developed for human embryonic stem cell (hESC differentiation. In addition, this mESC-derived DE gave rise to hepatocyte-like cells after exposure to BMP and FGF ligands. Our data therefore indicate a substantial improvement of monolayer DE induction from mESCs and support the concept that differentiation conditions for mESC-derived DE are similar to those for hESCs. As mESCs are easier to maintain and manipulate in culture compared to hESCs, and considering the shorter duration of embryonic development in the mouse, this method of efficient DE induction on monolayer will promote the development of new differentiation protocols to obtain DE-derivatives, like pancreatic beta-cells, for future use in cell replacement therapies.

  15. Promoting Models

    Science.gov (United States)

    Li, Qin; Zhao, Yongxin; Wu, Xiaofeng; Liu, Si

    There can be multitudinous models specifying aspects of the same system. Each model has a bias towards one aspect. These models often overlap in specific aspects, though they have different expressions. A specification written in one model can be refined by introducing additional information from other models. The paper proposes the concept of promoting models, a methodology to obtain refinements with support from cooperating models. It refines a primary model by integrating the information from a secondary model. The promotion principle is not merely an academic point, but also a reliable and robust engineering technique which can be used to develop software and hardware systems. It can also check the consistency between two specifications from different models. A case of modelling a simple online shopping system with the cooperation of the guarded design model and the CSP model illustrates the practicability of the promotion principle.

  16. Soil depth map definition on a terraced slope for a following distributed, high resolution, numerical modelling analysis

    Science.gov (United States)

    Camera, C.; Apuani, T.; Mele, M.; Kuriakose, S. L.; Giudici, M.

    2012-04-01

    The soil thickness represents key data for every environmental analysis involving soil, but its determination is not always simple. In this particular case, the study area is a small terraced slope (0.6 km2) of Valtellina (Northern Italy), and the soil depth map is necessary for a coupled hydrogeological-stability analysis in a raster environment. During this work, geometrical/morphological and geostatistical interpolation techniques were tested to obtain a satisfying soil depth map; the final product was then validated with geo-electrical resistivity inverse models. In this particular context, the presence of dry-stone retaining walls is of primary importance, since they have an influence on the morphology of the entire area as well as on the physical processes of water infiltration and slope stability. In order to consider the dry-stone walls in the analysis, it is necessary to have base maps with an adequate resolution (cells 1 m x 1 m). Assuming that the walls might be founded on bedrock or in its proximity, it was decided to use the heights of walls and the distribution of rock outcrops as soil depth input data. It was impossible to obtain direct measures with the knocking pole method, as pebbles are frequently present in the backfill soil. Excluding zero depth values, 682 measures were performed. The initial data set was divided into two subsets in order to use one as training points (76% of the total) and the second as test points (24%). Various techniques were tested, from linear multiple regressions with environmental predictors, to ordinary kriging, regression kriging with the same environmental variables, and Gaussian stochastic simulations. In the end, the best result was obtained with co-kriging, using a soil depth class map drawn from the field measures as co-variable. The result is somewhat guided, but it was the only solution to obtain a map that partially takes into account the morphology of the slope.
To verify the
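
    As a much simpler stand-in for the co-kriging actually used in the study, the following sketch shows the basic idea of spreading scattered soil-depth measurements onto raster cells, here with inverse-distance weighting; all coordinates and depths are invented:

    ```python
    import numpy as np

    # Minimal inverse-distance-weighting (IDW) interpolator -- a much simpler
    # stand-in for the co-kriging used in the study, just to illustrate how
    # point soil-depth measurements are spread onto a raster grid.
    def idw(xy_obs, z_obs, xy_query, power=2.0, eps=1e-12):
        """Interpolate z at query points from observed (x, y) samples."""
        d = np.linalg.norm(xy_query[:, None, :] - xy_obs[None, :, :], axis=2)
        w = 1.0 / (d ** power + eps)          # closer samples weigh more
        return (w * z_obs).sum(axis=1) / w.sum(axis=1)

    obs = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])   # sample locations
    depth = np.array([0.5, 1.5, 1.0])                        # invented depths (m)
    grid = np.array([[0.0, 0.0], [5.0, 5.0]])                # raster cell centres
    est = idw(obs, depth, grid)
    print(est)
    ```

    Unlike kriging, IDW ignores the spatial covariance structure and any co-variable, which is precisely what the co-kriging with the soil depth class map adds in the study.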

  17. Cadastral Modeling

    DEFF Research Database (Denmark)

    Stubkjær, Erik

    2005-01-01

    Modeling is a term that refers to a variety of efforts, including data and process modeling. The domain to be modeled may be a department, an organization, or even an industrial sector. E-business presupposes the modeling of an industrial sector, a substantial task. Cadastral modeling compares to...

  18. Open source molecular modeling.

    Science.gov (United States)

    Pirhadi, Somayeh; Sunseri, Jocelyn; Koes, David Ryan

    2016-09-01

    The success of molecular modeling and computational chemistry efforts is, by definition, dependent on quality software applications. Open source software development provides many advantages to users of modeling applications, not the least of which is that the software is free and completely extendable. In this review we categorize, enumerate, and describe available open source software packages for molecular modeling and computational chemistry. An updated online version of this catalog can be found at https://opensourcemolecularmodeling.github.io.

  19. Conceptual IT model

    Science.gov (United States)

    Arnaoudova, Kristina; Stanchev, Peter

    2015-11-01

    Business processes are the key asset of every organization, and the design of business process models is a foremost concern among an organization's functions. Business processes and their proper management depend intensely on the performance of software applications and technology solutions. The paper attempts a definition of a new conceptual model of an IT service provider; it can be examined as an IT-focused enterprise model, part of the Enterprise Architecture (EA) school.

  20. Gauss Modular-Arithmetic Congruence = Signal X Noise PRODUCT: Clock-model Archimedes HYPERBOLICITY Centrality INEVITABILITY: Definition: Complexity= UTTER-SIMPLICITY: Natural-Philosophy UNITY SIMPLICITY Redux!!!

    Science.gov (United States)

    Kummer, E. E.; Siegel, Edward Carl-Ludwig

    2011-03-01

    Clock-model Archimedes [http://linkage.rockeller.edu/ wli/moved.8.04/ 1fnoise/ index. ru.html] HYPERBOLICITY inevitability throughout physics/pure-maths: Newton-law F=ma, Heisenberg and classical uncertainty-principle=Parseval/Plancherel-theorems causes FUZZYICS definition: (so miscalled) "complexity" = UTTER-SIMPLICITY!!! Watkins[www.secamlocal.ex.ac.uk/people/staff/mrwatkin/]-Hubbard[World According to Wavelets (96)-p.14!]-Franklin[1795]-Fourier[1795;1822]-Brillouin[1922] dual/inverse-space(k,w) analysis key to Fourier-unification in Archimedes hyperbolicity inevitability progress up Siegel cognition hierarchy-of-thinking (HoT): data-info.-know.-understand.-meaning-...-unity-simplicity = FUZZYICS!!! Frohlich-Mossbauer-Goldanskii-del Guidice [Nucl.Phys.B:251,375(85);275,185 (86)]-Young [arXiv-0705.4678y2, (5/31/07] theory of health/life=aqueous-electret/ ferroelectric protoplasm BEC = Archimedes-Siegel [Schrodinger Cent.Symp.(87); Symp.Fractals, MRS Fall Mtg.(89)-5-pprs] 1/w-"noise" Zipf-law power-spectrum hyperbolicity INEVITABILITY= Chi; Dirac delta-function limit w=0 concentration= BEC = Chi-Quong.

  1. Toward the definition of a carbon budget model: seasonal variation and temperature effect on respiration rate of vegetative and reproductive organs of pistachio trees (Pistacia vera).

    Science.gov (United States)

    Marra, Francesco P; Barone, Ettore; La Mantia, Michele; Caruso, Tiziano

    2009-09-01

    This study, as a preliminary step toward the definition of a carbon budget model for pistachio trees (Pistacia vera L.), aimed at estimating and evaluating the dynamics of respiration of vegetative and reproductive organs of the pistachio tree. Trials were performed in 2005 in a commercial orchard located in Sicily (370 m a.s.l.) on five bearing 20-year-old pistachio trees of cv. Bianca grafted onto Pistacia terebinthus L. Growth analyses and respiration measurements were done on vegetative (leaf) and reproductive (infructescence) organs during the entire growing season (April-September) at biweekly intervals. Results suggested that the respiration rates of pistachio reproductive and vegetative organs were related to their developmental stage. Both for the leaf and for the infructescence, the highest values were observed during the earlier stages of growth, corresponding to the phases of most intense organ growth. The sensitivity of respiration activity to temperature changes, measured by Q(10), showed an increase throughout the transition from immature to mature leaves, as well as during fruit development. The data collected were also used to estimate the seasonal carbon loss by respiration activity for a single leaf and a single infructescence. The amount of carbon lost by respiration was affected by short-term temperature patterns, organ developmental stage and tissue function.
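
    The Q(10) coefficient mentioned above is conventionally defined as the factor by which a rate changes for a 10 °C rise in temperature. A minimal worked example, with invented rates rather than the study's measurements:

    ```python
    # Q10 temperature coefficient of respiration: the factor by which the rate
    # changes per 10 degC increase. The rates below are invented examples.
    def q10(r1, t1, r2, t2):
        """Q10 from two rate measurements, r1 at t1 and r2 at t2 (degC)."""
        return (r2 / r1) ** (10.0 / (t2 - t1))

    def rate_at(r1, t1, t2, q):
        """Predicted rate at t2, given rate r1 at t1 and a known Q10."""
        return r1 * q ** ((t2 - t1) / 10.0)

    # A doubling of respiration between 15 and 25 degC corresponds to Q10 = 2.
    print(q10(1.0, 15.0, 2.0, 25.0))  # → 2.0
    ```

    The same Q10 then predicts a fourfold rate over a 20 °C rise, which is how short-term temperature patterns feed into a seasonal carbon-loss estimate.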

  2. Testing agile requirements models

    Institute of Scientific and Technical Information of China (English)

    BOTASCHANJAN Jewgenij; PISTER Markus; RUMPE Bernhard

    2004-01-01

    This paper discusses a model-based approach to validating software requirements in agile development processes by simulation and, in particular, automated testing. The use of models as the central development artifact needs to be added to the portfolio of software engineering techniques, to further increase the efficiency and flexibility of development, beginning already early in the requirements definition phase. Testing requirements is one of the most important techniques for giving feedback and increasing the quality of the result. Therefore, testing of artifacts should be introduced as early as possible, even in the requirements definition phase.

  3. Model Warehouse

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    This paper puts forward a new concept: the model warehouse. It analyzes why model warehouses have emerged and introduces their characteristics and architecture. Finally, the paper points out that the model warehouse is an important part of WebGIS.

  4. Constitutive Models

    DEFF Research Database (Denmark)

    2011-01-01

    This chapter presents various types of constitutive models and their applications. There are 3 aspects dealt with in this chapter, namely: creation and solution of property models, the application of parameter estimation and, finally, application examples of constitutive models. A systematic procedure is introduced for the analysis and solution of property models. Models that capture and represent the temperature dependent behaviour of physical properties are introduced, as well as equation of state (EOS) models such as the SRK EOS. Modelling of liquid phase activity coefficients is also covered, illustrating several models such as the Wilson equation and NRTL equation, along with their solution strategies. A section shows how to use experimental data to regress the property model parameters using a least squares approach. A full model analysis is applied in each example that discusses ...

  5. Model cities

    OpenAIRE

    Batty, M.

    2007-01-01

    The term 'model' is now central to our thinking about how we understand and design cities. We suggest a variety of ways in which we use 'models', linking these ideas to Abercrombie's exposition of Town and Country Planning, which represented the state of the art fifty years ago. Here we focus on using models as physical representations of the city, tracing the development of symbolic models, where the focus is on simulating how function generates form, to iconic models, where the focus is on representing...

  6. Model theory

    CERN Document Server

    Chang, CC

    2012-01-01

    Model theory deals with a branch of mathematical logic showing connections between a formal language and its interpretations or models. This is the first and most successful textbook in logical model theory. Extensively updated and corrected in 1990 to accommodate developments in model theoretic methods - including classification theory and nonstandard analysis - the third edition added entirely new sections, exercises, and references. Each chapter introduces an individual method and discusses specific applications. Basic methods of constructing models include constants, elementary chains, Sko

  7. A Monte Carlo model for independent dose verification in IMRT and VMAT for the Varian Novalis TX with high definition MLC

    Directory of Open Access Journals (Sweden)

    Luis Vazquez Quino

    2015-09-01

    Purpose: With intensity modulated radiation therapy (IMRT), the physician can prescribe, design and deliver optimized treatment plans that target the tumor and spare adjacent critical structures. The increased conformity of such plans often comes at the expense of adding significant complexity to the delivery of the treatment. With volumetrically modulated arc therapy (VMAT), in addition to the modulation of the intensity of the radiation beam, other mechanical parameters such as gantry speed and dose rate are varied during treatment delivery. It is therefore imperative that we develop comprehensive and accurate methods to validate such complex delivery techniques prior to the commencement of the patient's treatment. Methods: In this study, a Monte Carlo simulation was performed for the high definition multileaf collimator (HD-MLC) of a Varian Novalis TX linac. Our simulation is based on the MCSIM code and provides a comprehensive model of the linac head. After validating the model in reference geometries, treatment plans for different anatomical sites were simulated and compared against the treatment planning system (TPS) dose calculations. All simulations were performed in a cylindrical water phantom, as opposed to the patient anatomy, to remove any complexities associated with density effects. Finally, a comparison through gamma analysis of dose planes between the simulation, the TPS and the measurements from the Matrixx array (IBA) was conducted to verify the accuracy of our model against both the measurements and the TPS. Results: Gamma analysis of ten IMRT and ten VMAT cases for different anatomical sites was performed, using a 3%/3 mm passing criterion. The average passing rates were 97.5% and 94.3% for the IMRT and the VMAT plans, respectively, when comparing the MCSIM and TPS dose calculations. Conclusion: In the present work a Monte Carlo model of a Novalis TX linac which has been tested and benchmarked to produce phase-space files for the
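The 3%/3 mm gamma analysis used in this abstract can be illustrated with a simplified one-dimensional sketch. This is a toy version under stated assumptions: clinical tools evaluate 2-D/3-D dose grids with interpolation, and the Gaussian profile below is made up for illustration:

```python
import numpy as np

def gamma_pass_rate(ref, evl, coords, dose_tol=0.03, dist_tol=3.0):
    """Simplified global 1-D gamma index: for each reference point, take the
    minimum over evaluated points of sqrt((dd/dose_tol)^2 + (dx/dist_tol)^2),
    with dose differences normalized to the reference maximum."""
    ref, evl, coords = map(np.asarray, (ref, evl, coords))
    norm = dose_tol * ref.max()
    gammas = []
    for r_dose, r_x in zip(ref, coords):
        dd = (evl - r_dose) / norm          # normalized dose differences
        dx = (coords - r_x) / dist_tol      # normalized distances (mm)
        gammas.append(np.sqrt(dd ** 2 + dx ** 2).min())
    return 100.0 * (np.array(gammas) <= 1.0).mean()

x = np.linspace(0.0, 50.0, 51)             # positions in mm
ref = np.exp(-((x - 25.0) / 10.0) ** 2)    # toy Gaussian dose profile
print(gamma_pass_rate(ref, ref, x))        # identical profiles → 100.0
```

A grossly rescaled profile fails near the peak but still passes in the low-dose tails, so its pass rate falls strictly between 0% and 100%.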

  8. Event Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2001-01-01

    The purpose of this chapter is to discuss conceptual event modeling within a context of information modeling. Traditionally, information modeling has been concerned with the modeling of a universe of discourse in terms of information structures. However, most interesting universes of discourse are dynamic, and we present a modeling approach that can be used to model such dynamics. We characterize events as both information objects and change agents (Bækgaard 1997). When viewed as information objects, events are phenomena that can be observed and described. For example, borrow events in a library can...

  10. Numerical models

    Digital Repository Service at National Institute of Oceanography (India)

    Unnikrishnan, A; Manoj, N.T.

    Various numerical models used to study the dynamics and horizontal distribution of salinity in the Mandovi-Zuari estuaries, Goa, India, are discussed in this chapter. Earlier, a one-dimensional network model was developed for representing the complex...

  11. Feature Technology in Product Modeling

    Institute of Scientific and Technical Information of China (English)

    ZHANG Xu; NING Ruxin

    2006-01-01

    A unified feature definition is proposed. A feature is form-concentrated and can be used to model product functionalities, assembly relations, and part geometries. The feature model is given and a feature classification is introduced, including functional, assembly, structural, and manufacturing features. A prototype modeling system is developed in Pro/ENGINEER that can define the assembly and user-defined form features.

  12. Computable models

    CERN Document Server

    Turner, Raymond

    2009-01-01

    Computational models can be found everywhere in present day science and engineering. In providing a logical framework and foundation for the specification and design of specification languages, Raymond Turner uses this framework to introduce and study computable models. In doing so he presents the first systematic attempt to provide computational models with a logical foundation. Computable models have wide-ranging applications from programming language semantics and specification languages, through to knowledge representation languages and formalism for natural language semantics. They are al

  13. MODELING CONSCIOUSNESS

    OpenAIRE

    Taylor, J G

    2009-01-01

    We present tentative answers to three questions: firstly, what is to be assumed about the structure of the brain in attacking the problem of modeling consciousness; secondly, what it is about consciousness that is being modeled; and finally, what the modeling enterprise takes on board, if anything, from the vast works by philosophers about the nature of mind.

  14. Zeebrugge Model

    DEFF Research Database (Denmark)

    Sclütter, Flemming; Frigaard, Peter; Liu, Zhou

    This report presents the model test results on wave run-up on the Zeebrugge breakwater under simulated prototype storms. The model test was performed in January 2000 at the Hydraulics & Coastal Engineering Laboratory, Aalborg University. A detailed description of the model is given...

  15. Interface models

    DEFF Research Database (Denmark)

    Ravn, Anders P.; Staunstrup, Jørgen

    1994-01-01

    This paper proposes a model for specifying interfaces between concurrently executing modules of a computing system. The model does not prescribe a particular type of communication protocol and is aimed at describing interfaces between both software and hardware modules or a combination of the two. The model describes both functional and timing properties of an interface...

  16. Constitutive Models

    DEFF Research Database (Denmark)

    2011-01-01

    This chapter presents various types of constitutive models and their applications. There are 3 aspects dealt with in this chapter, namely: creation and solution of property models, the application of parameter estimation and finally application examples of constitutive models. A systematic...

  17. Model Experiments and Model Descriptions

    Science.gov (United States)

    Jackman, Charles H.; Ko, Malcolm K. W.; Weisenstein, Debra; Scott, Courtney J.; Shia, Run-Lie; Rodriguez, Jose; Sze, N. D.; Vohralik, Peter; Randeniya, Lakshman; Plumb, Ian

    1999-01-01

    The Second Workshop on Stratospheric Models and Measurements (M&M II) is the continuation of the effort started in the first workshop (M&M I, Prather and Remsberg [1993]) held in 1992. As originally stated, the aim of M&M is to provide a foundation for establishing the credibility of stratospheric models used in environmental assessments of the ozone response to chlorofluorocarbons, aircraft emissions, and other climate-chemistry interactions. To accomplish this, a set of measurements of the present day atmosphere was selected. The intent was that successful simulation of this set of measurements should become the prerequisite for accepting these models as providing reliable predictions of future ozone behavior. This section is divided into two parts: model experiments and model descriptions. In the model experiments part, participants were charged with designing a number of experiments that would use observations to test whether models are using the correct mechanisms to simulate the distributions of ozone and other trace gases in the atmosphere. The purpose is closely tied to the need to reduce the uncertainties in the model predicted responses of stratospheric ozone to perturbations. The specifications for the experiments were sent out to the modeling community in June 1997. Twenty-eight modeling groups responded to the requests for input. The first part of this section discusses the different modeling groups, along with the experiments performed. The second part gives brief descriptions of each model as provided by the individual modeling groups.

  18. Bisimulation for models in concurrency

    DEFF Research Database (Denmark)

    Nielsen, Mogens; Clausen, Christian

    1994-01-01

    Recently, Joyal, Nielsen and Winskel suggested a categorical definition of bisimulation, applicable to a wide range of models in concurrency with an accompanying notion of observations. The definition is in terms of spans of open maps, and it coincides with Park and Milner's strong bisimulation...

  19. Model-Driven Constraint Programming

    CERN Document Server

    Chenouard, Raphael; Soto, Ricardo; 10.1145/1389449.1389479

    2010-01-01

    Constraint programming can definitely be seen as a model-driven paradigm. The users write programs for modeling problems. These programs are mapped to executable models to calculate the solutions. This paper focuses on efficient model management (definition and transformation). From this point of view, we propose to revisit the design of constraint-programming systems. A model-driven architecture is introduced to map solving-independent constraint models to solving-dependent decision models. Several important questions are examined, such as the need for a visual high-level modeling language, and the quality of metamodeling techniques to implement the transformations. A main result is the s-COMMA platform, which efficiently implements the chain from modeling to solving constraint problems.

  20. Scalable Models Using Model Transformation

    Science.gov (United States)

    2008-07-13

    This report covers scalable models using model transformation, parametrization, and workflow automation. The work was supported by the Air Force Research Laboratory (AFRL), the State of California Micro Program, and the following companies: Agilent, Bosch, HSBC, Lockheed-Martin, National Instruments, and Toyota.

  1. Data Model to CSS Mappings.

    Energy Technology Data Exchange (ETDEWEB)

    Hamlet, Benjamin R. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Montoya, Mark Sinclair [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Vickers, James Wallace [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Sandoval, Rudy Daniel [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)

    2015-12-01

    This initial draft document contains formative data model content for select areas of Re-Engineering Phase 2 IDC System. The purpose of this document is to facilitate discussion among the stakeholders. It is not intended as a definitive proposal.

  2. Models of educational institutions' networking

    OpenAIRE

    Shilova Olga Nikolaevna

    2015-01-01

    The importance of educational institutions' networking in modern sociocultural conditions and a definition of networking in education are presented in the article. The results of research into the levels, methods and models of educational institutions' networking are presented and discussed in detail.

  4. Cadastral Modeling

    DEFF Research Database (Denmark)

    Stubkjær, Erik

    2005-01-01

    Modeling is a term that refers to a variety of efforts, including data and process modeling. The domain to be modeled may be a department, an organization, or even an industrial sector. E-business presupposes the modeling of an industrial sector, a substantial task. Cadastral modeling compares to the modeling of an industrial sector, as it aims at rendering the basic concepts that relate to the domain of real estate and the pertinent human activities. The palpable objects are pieces of land and buildings, documents, data stores and archives, as well as persons in their diverse roles as owners, holders ... to land. The paper advances the position that cadastral modeling has to include not only the physical objects, agents, and information sets of the domain, but also the objectives or requirements of cadastral systems.

  5. Outside users payload model

    Science.gov (United States)

    1985-01-01

    The outside users payload model, which replaces and supersedes the July 1984 edition, is presented. The time period covered by this model is 1985 through 2000. The following sections are included: (1) definition of the scope of the model; (2) discussion of the methodology used; (3) overview of total demand; (4) summary of the estimated market segmentation by launch vehicle; (5) summary of the estimated market segmentation by user type; (6) details of the STS market forecast; (7) summary of transponder trends; (8) model overview by mission category; and (9) detailed mission models. All known non-NASA, non-DOD reimbursable payloads forecast to be flown by non-Soviet-bloc countries are included in this model, with the exception of Spacelab payloads and small self-contained payloads. Certain DOD-sponsored or cosponsored payloads are included if they are reimbursable launches.

  6. Modelling in Business Model design

    NARCIS (Netherlands)

    Simonse, W.L.

    2013-01-01

    It appears that business model design might not always produce a design or model as the expected result. However, when designers are involved, a visual model or artefact is produced. To assist strategic managers in thinking about how they can act, the designer's challenge is to combine strategy and

  7. Modelling dense relational data

    DEFF Research Database (Denmark)

    Herlau, Tue; Mørup, Morten; Schmidt, Mikkel Nørgaard;

    2012-01-01

    Relational modelling classically considers sparse and discrete data. Measures of influence computed pairwise between temporal sources naturally give rise to dense continuous-valued matrices, for instance p-values from Granger causality. Due to asymmetry or lack of positive definiteness, they are not naturally suited for kernel K-means. We propose a generative Bayesian model for dense matrices which generalizes kernel K-means to consider off-diagonal interactions in matrices of interactions, and demonstrate its ability to detect structure on both artificial data and two real data sets.

  8. IIB Matrix Model

    CERN Document Server

    Aoki, H; Kawai, H; Kitazawa, Y; Tada, T; Tsuchiya, A

    1999-01-01

    We review our proposal for a constructive definition of superstring, type IIB matrix model. The IIB matrix model is a manifestly covariant model for space-time and matter which possesses N=2 supersymmetry in ten dimensions. We refine our arguments to reproduce string perturbation theory based on the loop equations. We emphasize that the space-time is dynamically determined from the eigenvalue distributions of the matrices. We also explain how matter, gauge fields and gravitation appear as fluctuations around dynamically determined space-time.

  9. Higher Order Spreading Models

    CERN Document Server

    Argyros, S A; Tyros, K

    2012-01-01

    We introduce the higher order spreading models associated to a Banach space $X$. Their definition is based on $\ff$-sequences $(x_s)_{s\in\ff}$ with $\ff$ a regular thin family and the plegma families. We show that the higher order spreading models of a Banach space $X$ form an increasing transfinite hierarchy $(\mathcal{SM}_\xi(X))_{\xi<\omega_1}$. Each $\mathcal{SM}_\xi(X)$ contains all spreading models generated by $\ff$-sequences $(x_s)_{s\in\ff}$ with order of $\ff$ equal to $\xi$. We also provide a study of the fundamental properties of the hierarchy.

  10. Climate Models

    Science.gov (United States)

    Druyan, Leonard M.

    2012-01-01

    Climate models is a very broad topic, so a single volume can only offer a small sampling of relevant research activities. This volume of 14 chapters includes descriptions of a variety of modeling studies for a variety of geographic regions by an international roster of authors. The climate research community generally uses the rubric "climate models" to refer to organized sets of computer instructions that produce simulations of climate evolution. The code is based on physical relationships that describe the shared variability of meteorological parameters such as temperature, humidity, precipitation rate, circulation, radiation fluxes, etc. Three-dimensional climate models are integrated over time in order to compute the temporal and spatial variations of these parameters. Model domains can be global or regional, and the horizontal and vertical resolutions of the computational grid vary from model to model. Considering the entire climate system requires accounting for interactions between solar insolation, atmospheric, oceanic and continental processes, the latter including land hydrology and vegetation. Model simulations may concentrate on one or more of these components, but the most sophisticated models will estimate the mutual interactions of all of these environments. Advances in computer technology have prompted investments in more complex model configurations that consider more phenomena interactions than were possible with yesterday's computers. However, not every attempt to add to the computational layers is rewarded by better model performance. Extensive research is required to test and document any advantages gained by greater sophistication in model formulation. One purpose for publishing climate model research results is to present purported advances for evaluation by the scientific community.

  11. Model Evaluation of Continuous Data Pharmacometric Models: Metrics and Graphics.

    Science.gov (United States)

    Nguyen, Tht; Mouksassi, M-S; Holford, N; Al-Huniti, N; Freedman, I; Hooker, A C; John, J; Karlsson, M O; Mould, D R; Pérez Ruixo, J J; Plan, E L; Savic, R; van Hasselt, Jgc; Weber, B; Zhou, C; Comets, E; Mentré, F

    2017-02-01

    This article represents the first in a series of tutorials on model evaluation in nonlinear mixed effect models (NLMEMs), from the International Society of Pharmacometrics (ISoP) Model Evaluation Group. Numerous tools are available for evaluation of NLMEM, with a particular emphasis on visual assessment. This first basic tutorial focuses on presenting graphical evaluation tools of NLMEM for continuous data. It illustrates graphs for correct or misspecified models, discusses their pros and cons, and recalls the definition of metrics used.

  12. Model Evaluation of Continuous Data Pharmacometric Models: Metrics and Graphics

    Science.gov (United States)

    Nguyen, THT; Mouksassi, M‐S; Holford, N; Al‐Huniti, N; Freedman, I; Hooker, AC; John, J; Karlsson, MO; Mould, DR; Pérez Ruixo, JJ; Plan, EL; Savic, R; van Hasselt, JGC; Weber, B; Zhou, C; Comets, E

    2017-01-01

    This article represents the first in a series of tutorials on model evaluation in nonlinear mixed effect models (NLMEMs), from the International Society of Pharmacometrics (ISoP) Model Evaluation Group. Numerous tools are available for evaluation of NLMEM, with a particular emphasis on visual assessment. This first basic tutorial focuses on presenting graphical evaluation tools of NLMEM for continuous data. It illustrates graphs for correct or misspecified models, discusses their pros and cons, and recalls the definition of metrics used. PMID:27884052

  13. SeReM2--a meta-model for the structured definition of quality requirements for electronic health record services.

    Science.gov (United States)

    Hoerbst, Alexander; Hackl, Werner; Ammenwerth, Elske

    2010-01-01

    Quality assurance is a major task with regard to Electronic Health Records (EHR). Currently there are only a few approaches explicitly dealing with the quality of EHR services as a whole. The objective of this paper is to introduce a new Meta-Model to structure and describe quality requirements of EHRs. This approach should support the transnational quality certification of EHR services. The Model was developed based on interviews with 24 experts and a systematic literature search and comprises a service and requirements model. The service model represents the structure of a service whereas the requirements model can be used to assign specific predefined aims and requirements to a service. The new model differs from existing approaches as it accounts for modern software architectures and the special attributes of EHRs.

  14. Mathematical modelling

    CERN Document Server

    2016-01-01

    This book provides a thorough introduction to the challenge of applying mathematics in real-world scenarios. Modelling tasks rarely involve well-defined categories, and they often require multidisciplinary input from mathematics, physics, computer sciences, or engineering. In keeping with this spirit of modelling, the book includes a wealth of cross-references between the chapters and frequently points to the real-world context. The book combines classical approaches to modelling with novel areas such as soft computing methods, inverse problems, and model uncertainty. Attention is also paid to the interaction between models, data and the use of mathematical software. The reader will find a broad selection of theoretical tools for practicing industrial mathematics, including the analysis of continuum models, probabilistic and discrete phenomena, and asymptotic and sensitivity analysis.

  15. Mathematical modelling

    DEFF Research Database (Denmark)

    Blomhøj, Morten

    2004-01-01

    Developing competences for setting up, analysing and criticising mathematical models is normally seen as relevant only at and above upper secondary level. The general belief among teachers is that modelling activities presuppose conceptual understanding of the mathematics involved. Mathematical modelling, however, can be seen as a practice of teaching that places the relation between real life and mathematics at the centre of teaching and learning mathematics, and this is relevant at all levels. Modelling activities may motivate the learning process and help the learner to establish cognitive roots for the construction of important mathematical concepts. In addition, competences for setting up, analysing and criticising modelling processes and the possible use of models are a formative aim in their own right for mathematics teaching in general education. The paper presents a theoretical...

  16. Spherical models

    CERN Document Server

    Wenninger, Magnus J

    2012-01-01

    Well-illustrated, practical approach to creating star-faced spherical forms that can serve as basic structures for geodesic domes. Complete instructions for making models from circular bands of paper with just a ruler and compass. Discusses tessellation, or tiling, and how to make spherical models of the semiregular solids, and concludes with a discussion of the relationship of polyhedra to geodesic domes and directions for building models of domes. ". . . very pleasant reading." - Science. 1979 edition.

  17. Zeebrugge Model

    DEFF Research Database (Denmark)

    Liu, Zhou; Frigaard, Peter

    This report presents the model on wave run-up and run-down on the Zeebrugge breakwater under short-crested oblique wave attacks. The model test was performed in March-April 2000 at the Hydraulics & Coastal Engineering Laboratory, Aalborg University.

  18. Stream Modelling

    DEFF Research Database (Denmark)

    Vestergaard, Kristian

    ... the engineers, but as the scale and the complexity of the hydraulic works increased, the mathematical models became so complex that a mathematical solution could not be obtained. This created a demand for new methods, and again the experimental investigation became popular, but this time as measurements on small-scale models. But still the scale and complexity of hydraulic works were increasing, and soon even small-scale models reached a natural limit for some applications. In the meantime the modern computer was developed, and it became possible to solve complex mathematical models by use of computer-based numerical...

  19. Ventilation Model

    Energy Technology Data Exchange (ETDEWEB)

    V. Chipman

    2002-10-05

    The purpose of the Ventilation Model is to simulate the heat transfer processes in and around waste emplacement drifts during periods of forced ventilation. The model evaluates the effects of emplacement drift ventilation on the thermal conditions in the emplacement drifts and surrounding rock mass, and calculates the heat removal by ventilation as a measure of the viability of ventilation to delay the onset of peak repository temperature and reduce its magnitude. The heat removal by ventilation is temporally and spatially dependent, and is expressed as the fraction of heat carried away by the ventilation air compared to the fraction of heat produced by radionuclide decay. One minus the heat removal is called the wall heat fraction, or the remaining amount of heat that is transferred via conduction to the surrounding rock mass. Downstream models, such as the "Multiscale Thermohydrologic Model" (BSC 2001), use the wall heat fractions output by the Ventilation Model to initialize their post-closure analyses. The Ventilation Model report was initially developed to analyze the effects of preclosure continuous ventilation in the Engineered Barrier System (EBS) emplacement drifts, and to provide heat removal data to support EBS design. Revision 00 of the Ventilation Model included documentation of the modeling results from the ANSYS-based heat transfer model. The purposes of Revision 01 of the Ventilation Model are: (1) to validate the conceptual model for preclosure ventilation of emplacement drifts and verify its numerical application in accordance with new procedural requirements as outlined in AP-SIII-10Q, Models (Section 7.0); and (2) to satisfy technical issues posed in KTI agreement RDTME 3.14 (Reamer and Williams 2001a), specifically to demonstrate, with respect to the ANSYS ventilation model, the adequacy of the discretization (Section 6.2.3.1) and the downstream applicability of the model results (i.e. wall heat fractions) to
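The "one minus the heat removal" relation described in the abstract is simple arithmetic; a minimal sketch with illustrative numbers (not values from the report):

```python
def wall_heat_fraction(removed_by_ventilation, decay_heat):
    """Fraction of decay heat conducted into the surrounding rock mass:
    one minus the fraction carried away by the ventilation air."""
    return 1.0 - removed_by_ventilation / decay_heat

# Illustrative: if ventilation removes 70 kW of a 100 kW decay-heat load,
# 30% of the heat is conducted into the rock mass.
print(round(wall_heat_fraction(70.0, 100.0), 2))  # → 0.3
```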

  20. Modeling Documents with Event Model

    Directory of Open Access Journals (Sweden)

    Longhui Wang

    2015-08-01

    Currently deep learning has made great breakthroughs in visual and speech processing, mainly because it draws on the hierarchical way the brain deals with images and speech. In the field of NLP, topic models are one of the important ways of modeling documents. Topic models are built on a generative model that clearly does not match the way humans write. In this paper, we propose Event Model, which is unsupervised and based on the language processing mechanism of neurolinguistics, to model documents. In Event Model, documents are descriptions of concrete or abstract events seen, heard, or sensed by people, and words are objects in those events. Event Model has two stages: word learning and dimensionality reduction. Word learning learns the semantics of words based on deep learning. Dimensionality reduction represents a document as a low-dimensional vector using a linear method that is completely different from topic models. Event Model achieves state-of-the-art results on document retrieval tasks.

  1. Robust calibrations on reduced sample sets for API content prediction in tablets: definition of a cost-effective NIR model development strategy.

    Science.gov (United States)

    Pieters, Sigrid; Saeys, Wouter; Van den Kerkhof, Tom; Goodarzi, Mohammad; Hellings, Mario; De Beer, Thomas; Heyden, Yvan Vander

    2013-01-25

    Owing to spectral variations from other sources than the component of interest, large investments in the NIR model development may be required to obtain satisfactory and robust prediction performance. To make the NIR model development for routine active pharmaceutical ingredient (API) prediction in tablets more cost-effective, alternative modelling strategies were proposed. They used a massive amount of prior spectral information on intra- and inter-batch variation and the pure component spectra to define a clutter, i.e., the detrimental spectral information. This was subsequently used for artificial data augmentation and/or orthogonal projections. The model performance improved statistically significantly, with a 34-40% reduction in RMSEP while needing fewer model latent variables, by applying the following procedure before PLS regression: (1) augmentation of the calibration spectra with the spectral shapes from the clutter, and (2) net analyte pre-processing (NAP). The improved prediction performance was not compromised when reducing the variability in the calibration set, making exhaustive calibration unnecessary. Strong water content variations in the tablets caused frequency shifts of the API absorption signals that could not be included in the clutter. Updating the model for this kind of variation demonstrated that the completeness of the clutter is critical for the performance of these models and that the model will only be more robust for spectral variation that is not co-linear with the one from the property of interest.
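    The orthogonal-projection idea behind net analyte pre-processing can be sketched as follows: spectra are projected onto the complement of the space spanned by the clutter, so the detrimental variation is removed while the part of the API signal orthogonal to it survives. All names, dimensions and spectra below are synthetic, not taken from the study.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    n_wl = 50                              # number of wavelengths (synthetic)
    api = rng.normal(size=n_wl)            # pure-component API spectrum (synthetic)
    clutter = rng.normal(size=(3, n_wl))   # rows span the detrimental variation

    # NAP-style orthogonal projector onto the complement of the clutter space:
    # P = I - K^+ K, where the rows of K span the clutter.
    P = np.eye(n_wl) - np.linalg.pinv(clutter) @ clutter

    # A measured spectrum = API signal + clutter contribution + noise.
    measured = (0.7 * api
                + clutter.T @ np.array([1.0, -0.5, 2.0])
                + 1e-3 * rng.normal(size=n_wl))
    cleaned = P @ measured

    # After projection the clutter contribution is numerically removed.
    residual_clutter = np.linalg.norm(clutter @ cleaned)
    ```

    The cleaned spectra would then be passed to the PLS regression; in the study this projection is combined with augmenting the calibration set with the clutter shapes.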

  2. Comparative Town Meetings: A Search for Causative Models of Feminine Involvement in Politics with New Operational Definitions of a Well Calloused Dependent Variable.

    Science.gov (United States)

    Bryan, Frank M.

    Variations in the level of female political participation were examined in the context of the "standard" model of political participation (higher socioeconomic status, urbanism, living at society's center, increased participation) and the "decline of community" model (decreased group membership, increased mobility, decline of…

  3. Analysis of Business Models

    Directory of Open Access Journals (Sweden)

    Slavik Stefan

    2014-12-01

    Full Text Available The term business model has been used in practice for only a few years, but companies have created, defined and innovated their models subconsciously since the start of business. Our paper aims to clarify the theory of the business model, i.e., its definition and all the components that form each business. In the second part, we create an analytical tool, analyze real business models in Slovakia and define the characteristics of each part of the business model, i.e., customers, distribution, value, resources, activities, cost and revenue. In the last part of our paper, we discuss the most frequent characteristics, extremes, discrepancies and the most important facts detected in our research.

  4. Modelling Epistemic Systems

    CERN Document Server

    Martins, Andre C R

    2012-01-01

    In this Chapter, I will explore the use of modeling in order to understand how Science works. I will discuss the modeling of scientific communities, providing a general, non-comprehensive overview of existing models, with a focus on the use of the tools of Agent-Based Modeling and Opinion Dynamics. Special attention will be paid to models inspired by a Bayesian formalism of Opinion Dynamics. The objective of this exploration is to better understand the effect that different conditions might have on the reliability of the opinions of a scientific community. We will see that, by using artificial worlds as exploring grounds, we can prevent some epistemological problems with the definition of truth and obtain insights into the conditions that might cause the quest for more reliable knowledge to fail.

  5. Model Selection for Geostatistical Models

    Energy Technology Data Exchange (ETDEWEB)

    Hoeting, Jennifer A.; Davis, Richard A.; Merton, Andrew A.; Thompson, Sandra E.

    2006-02-01

    We consider the problem of model selection for geospatial data. Spatial correlation is typically ignored in the selection of explanatory variables and this can influence model selection results. For example, the inclusion or exclusion of particular explanatory variables may not be apparent when spatial correlation is ignored. To address this problem, we consider the Akaike Information Criterion (AIC) as applied to a geostatistical model. We offer a heuristic derivation of the AIC in this context and provide simulation results that show that using AIC for a geostatistical model is superior to the often used approach of ignoring spatial correlation in the selection of explanatory variables. These ideas are further demonstrated via a model for lizard abundance. We also employ the principle of minimum description length (MDL) to variable selection for the geostatistical model. The effect of sampling design on the selection of explanatory covariates is also explored.
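    The AIC referred to above trades goodness of fit against model size, AIC = 2k - 2 ln L. A minimal sketch for ordinary Gaussian regression follows (the spatial-correlation part of the geostatistical likelihood is omitted for brevity, and the data are synthetic): it shows AIC preferring the model that includes a genuinely relevant covariate.

    ```python
    import numpy as np

    def gaussian_aic(y, X):
        """AIC = 2k - 2 ln L for ordinary least squares with Gaussian errors.
        k counts the regression coefficients plus the error variance."""
        n = len(y)
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        sigma2 = resid @ resid / n              # ML estimate of the error variance
        loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
        return 2 * (X.shape[1] + 1) - 2 * loglik

    rng = np.random.default_rng(1)
    n = 200
    x1 = rng.normal(size=n)                      # genuinely relevant covariate
    y = 1.0 + 2.0 * x1 + rng.normal(scale=0.5, size=n)

    X_with = np.column_stack([np.ones(n), x1])   # intercept + x1
    X_without = np.ones((n, 1))                  # intercept only
    aic_with = gaussian_aic(y, X_with)
    aic_without = gaussian_aic(y, X_without)     # larger AIC: covariate wrongly excluded
    ```

    In the geostatistical setting the log-likelihood would instead be evaluated under a spatially correlated error covariance, which is exactly where ignoring the correlation can change which covariates appear worth keeping.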

  6. Didactical modelling

    DEFF Research Database (Denmark)

    Højgaard, Tomas; Hansen, Rune

    2016-01-01

    The purpose of this paper is to introduce Didactical Modelling as a research methodology in mathematics education. We compare the methodology with other approaches and argue that Didactical Modelling has its own specificity. We discuss the methodological “why” and explain why we find it useful to...

  7. Didactical modelling

    OpenAIRE

    Højgaard, Tomas; Hansen, Rune

    2016-01-01

    The purpose of this paper is to introduce Didactical Modelling as a research methodology in mathematics education. We compare the methodology with other approaches and argue that Didactical Modelling has its own specificity. We discuss the methodological “why” and explain why we find it useful to construct this approach in mathematics education research.

  8. Animal models

    DEFF Research Database (Denmark)

    Gøtze, Jens Peter; Krentz, Andrew

    2014-01-01

    In this issue of Cardiovascular Endocrinology, we are proud to present a broad and dedicated spectrum of reviews on animal models in cardiovascular disease. The reviews cover most aspects of animal models in science from basic differences and similarities between small animals and the human...

  9. Martingale Model

    OpenAIRE

    Giandomenico, Rossano

    2006-01-01

    The model determines a stochastic continuous process as continuous limit of a stochastic discrete process so to show that the stochastic continuous process converges to the stochastic discrete process such that we can integrate it. Furthermore, the model determines the expected volatility and the expected mean so to show that the volatility and the mean are increasing function of the time.
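    The passage from a stochastic discrete process to its stochastic continuous limit can be illustrated with the classical example of this construction, a scaled symmetric random walk converging to Brownian motion (an illustrative stand-in, not the author's model): its empirical variance grows linearly in time, matching the claim that the variance is an increasing function of time.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    n_paths = 20000
    n_steps = 400          # partition of the interval [0, 1]
    dt = 1.0 / n_steps

    # S_k = sum of i.i.d. +/- sqrt(dt) steps: a discrete martingale whose
    # continuous limit (Donsker's theorem) is standard Brownian motion.
    steps = rng.choice([-1.0, 1.0], size=(n_paths, n_steps)) * np.sqrt(dt)
    paths = np.cumsum(steps, axis=1)

    # Empirically, Var[S(t)] is close to t, as for the limit process.
    var_half = paths[:, n_steps // 2 - 1].var()   # t = 0.5
    var_full = paths[:, -1].var()                 # t = 1.0
    ```

    The martingale property shows up as a sample mean near zero at every time, while the variance increases with t.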

  10. Dispersion Modeling.

    Science.gov (United States)

    Budiansky, Stephen

    1980-01-01

    This article discusses the need for more accurate and complete input data and field verification of the various models of air pollutant dispersion. Consideration should be given to changing the form of air quality standards based on enhanced dispersion modeling techniques. (Author/RE)

  11. Education models

    NARCIS (Netherlands)

    Poortman, Sybilla; Sloep, Peter

    2006-01-01

    Educational Models describes a case study on a complex learning object. Possibilities are investigated for using this learning object, which is based on a particular educational model, outside of its original context. Furthermore, this study provides advice that might lead to an increase in

  12. Battery Modeling

    NARCIS (Netherlands)

    Jongerden, M.R.; Haverkort, Boudewijn R.H.M.

    2008-01-01

    The use of mobile devices is often limited by the capacity of the employed batteries. The battery lifetime determines how long one can use a device. Battery modeling can help to predict, and possibly extend this lifetime. Many different battery models have been developed over the years. However,

  13. 城市电视台高清电视发展模式研究%Research on the development model of high definition TV in city TV station

    Institute of Scientific and Technical Information of China (English)

    张晓丽

    2016-01-01

    At present, China is improving the relevant policies and systems that can promote the development of high-definition television. The position of HDTV within digital television has gradually risen, the HD market has gradually opened, and HD receiving products and HD displays are increasingly common on the market; the State Administration of Radio, Film and Television takes an encouraging and actively supportive attitude toward the development of high-definition television. On this basis, this paper presents a systematic study of the development model of high-definition television at city TV stations.

  14. Linguistic models and linguistic modeling.

    Science.gov (United States)

    Pedrycz, W; Vasilakos, A V

    1999-01-01

    The study is concerned with a linguistic approach to the design of a new category of fuzzy (granular) models. In contrast to numerically driven identification techniques, we concentrate on building meaningful linguistic labels (granules) in the space of experimental data and forming the ensuing model as a web of associations between such granules. As such models are designed at the level of information granules and generate results in the same granular rather than pure numeric format, we refer to them as linguistic models. Furthermore, as there are no detailed numeric estimation procedures involved in the construction of the linguistic models carried out in this way, their design mode can be viewed as that of a rapid prototyping. The underlying algorithm used in the development of the models utilizes an augmented version of the clustering technique (context-based clustering) that is centered around a notion of linguistic contexts: a collection of fuzzy sets or fuzzy relations defined in the data space (more precisely a space of input variables). The detailed design algorithm is provided and contrasted with the standard modeling approaches commonly encountered in the literature. The usefulness of the linguistic mode of system modeling is discussed and illustrated with the aid of numeric studies including both synthetic data as well as some time series dealing with modeling traffic intensity over a broadband telecommunication network.

  15. OSPREY Model

    Energy Technology Data Exchange (ETDEWEB)

    Veronica J. Rutledge

    2013-01-01

    The absence of industrial scale nuclear fuel reprocessing in the U.S. has precluded the necessary driver for developing the advanced simulation capability now prevalent in so many other countries. Thus, it is essential to model complex series of unit operations to simulate, understand, and predict inherent transient behavior and feedback loops. A capability of accurately simulating the dynamic behavior of advanced fuel cycle separation processes will provide substantial cost savings and many technical benefits. The specific fuel cycle separation process discussed in this report is the off-gas treatment system. The off-gas separation consists of a series of scrubbers and adsorption beds to capture constituents of interest. Dynamic models are being developed to simulate each unit operation involved so each unit operation can be used as a stand-alone model and in series with multiple others. Currently, an adsorption model has been developed within Multi-physics Object Oriented Simulation Environment (MOOSE) developed at the Idaho National Laboratory (INL). Off-gas Separation and REcoverY (OSPREY) models the adsorption of off-gas constituents for dispersed plug flow in a packed bed under non-isothermal and non-isobaric conditions. Inputs to the model include gas, sorbent, and column properties, equilibrium and kinetic data, and inlet conditions. The simulation outputs component concentrations along the column length as a function of time from which breakthrough data is obtained. The breakthrough data can be used to determine bed capacity, which in turn can be used to size columns. It also outputs temperature along the column length as a function of time and pressure drop along the column length. Experimental data and parameters were input into the adsorption model to develop models specific for krypton adsorption. The same can be done for iodine, xenon, and tritium. The model will be validated with experimental breakthrough curves. Customers will be given access to

  16. Modelling synergistic effects of appetite regulating hormones

    DEFF Research Database (Denmark)

    Schmidt, Julie Berg; Ritz, Christian

    2016-01-01

    We briefly reviewed one definition of dose addition, which is applicable within the framework of generalized linear models. We established how this definition of dose addition corresponds to effect addition in case only two doses per compound are considered for evaluating synergistic effects. ... The link between definitions was exemplified for an appetite study where two appetite hormones were studied.

  17. Model hydrographs

    Science.gov (United States)

    Mitchell, W.D.

    1972-01-01

    Model hydrographs are composed of pairs of dimensionless ratios, arrayed in tabular form, which, when modified by the appropriate values of rainfall excess and by the time and areal characteristics of the drainage basin, satisfactorily represent the flood hydrograph for the basin. Model hydrographs are developed from a dimensionless translation hydrograph, having a time base of T hours and appropriately modified for storm duration by routing through reservoir storage, S = kO^x. Models fall into two distinct classes: (1) those for which the value of x is unity and which have all the characteristics of true unit hydrographs and (2) those for which the value of x is other than unity and to which the unit-hydrograph principles of proportionality and superposition do not apply. Twenty-six families of linear models and eight families of nonlinear models in tabular form form the principal subject of this report. Supplemental discussions describe the development of the models and illustrate their application. Other sections of the report, supplemental to the tables, describe methods of determining the hydrograph characteristics, T, k, and x, both from observed hydrographs and from the physical characteristics of the drainage basin. Five illustrative examples of use show that the models, when properly converted to incorporate actual rainfall excess and the time and areal characteristics of the drainage basins, do indeed satisfactorily represent the observed flood hydrographs for the basins.
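    For the linear class of models (x = 1), the storage relation S = kO combined with continuity, I - O = dS/dt, gives I - O = k dO/dt, which can be stepped explicitly. A minimal sketch follows; the triangular inflow, time step, k value and units are invented for illustration.

    ```python
    # Linear-reservoir routing (x = 1): I - O = k * dO/dt, stepped with forward Euler.
    def route_linear_reservoir(inflow, k, dt):
        """Route an inflow hydrograph through storage S = k*O (hours, cfs assumed)."""
        outflow = [0.0]
        for i_t in inflow:
            o = outflow[-1]
            outflow.append(o + dt * (i_t - o) / k)
        return outflow[1:]

    # Hypothetical triangular translation hydrograph: rise to a peak, then recession.
    inflow = [0, 50, 100, 150, 100, 50, 0, 0, 0, 0, 0, 0]
    outflow = route_linear_reservoir(inflow, k=2.0, dt=1.0)

    peak_in = max(inflow)
    peak_out = max(outflow)   # routing attenuates and delays the peak
    ```

    The attenuated, delayed outflow peak is the qualitative behaviour the report's storm-duration modification is meant to capture.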

  18. Beyond (Models of) Disability?

    Science.gov (United States)

    Beaudry, Jonas-Sébastien

    2016-01-01

    The strategy of developing an ontology or models of disability as a prior step to settling ethical issues regarding disabilities is highly problematic for two reasons. First, key definitional aspects of disability are normative and cannot helpfully be made value-neutral. Second, if we accept that the contested concept of disability is value-laden, it is far from obvious that there are definitive reasons for choosing one interpretation of the concept over another. I conclude that the concept of disability is better left ethically open-ended or broad enough to encompass the examination of various ethical issues (such as oppression, minority rights, or physical discomfort). Alternatively, the concept of disability could be altogether abandoned in order to focus on specific issues without being hindered by debates about the nature of disability. Only political costs, rather than conceptual considerations internal to the models, could be weighed against such a conclusion. PMID:26892249

  19. Modeling complexes of modeled proteins.

    Science.gov (United States)

    Anishchenko, Ivan; Kundrotas, Petras J; Vakser, Ilya A

    2017-03-01

    Structural characterization of proteins is essential for understanding life processes at the molecular level. However, only a fraction of known proteins have experimentally determined structures. This fraction is even smaller for protein-protein complexes. Thus, structural modeling of protein-protein interactions (docking) primarily has to rely on modeled structures of the individual proteins, which typically are less accurate than the experimentally determined ones. Such "double" modeling is the Grand Challenge of structural reconstruction of the interactome. Yet it remains so far largely untested in a systematic way. We present a comprehensive validation of template-based and free docking on a set of 165 complexes, where each protein model has six levels of structural accuracy, from 1 to 6 Å C(α) RMSD. Many template-based docking predictions fall into acceptable quality category, according to the CAPRI criteria, even for highly inaccurate proteins (5-6 Å RMSD), although the number of such models (and, consequently, the docking success rate) drops significantly for models with RMSD > 4 Å. The results show that the existing docking methodologies can be successfully applied to protein models with a broad range of structural accuracy, and the template-based docking is much less sensitive to inaccuracies of protein models than the free docking. Proteins 2017; 85:470-478. © 2016 Wiley Periodicals, Inc.
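    The accuracy measure used above, C(α) RMSD, is the root-mean-square deviation between corresponding atom coordinates after superposition. A minimal computation on invented coordinates (the optimal alignment is assumed to have been done already; this is not the CAPRI assessment code):

    ```python
    import math

    def rmsd(coords_a, coords_b):
        """Root-mean-square deviation between paired 3-D coordinates (already aligned)."""
        assert len(coords_a) == len(coords_b)
        sq = sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
                 for (ax, ay, az), (bx, by, bz) in zip(coords_a, coords_b))
        return math.sqrt(sq / len(coords_a))

    # Hypothetical C-alpha traces: the "model" is the "native" shifted 1 Å along z,
    # so the RMSD is exactly 1.0 Å.
    native = [(0.0, 0.0, 0.0), (1.5, 0.0, 0.0), (3.0, 0.0, 0.0)]
    model = [(0.0, 0.0, 1.0), (1.5, 0.0, 1.0), (3.0, 0.0, 1.0)]
    ```

    In practice the superposition itself is found first (e.g. by the Kabsch algorithm) so that the reported RMSD is the minimum over rigid-body fits.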

  20. Modelling survival

    DEFF Research Database (Denmark)

    Ashauer, Roman; Albert, Carlo; Augustine, Starrlight

    2016-01-01

    The General Unified Threshold model for Survival (GUTS) integrates previously published toxicokinetic-toxicodynamic models and estimates survival with explicitly defined assumptions. Importantly, GUTS accounts for time-variable exposure to the stressor. We performed three studies to test the ability of GUTS to predict survival of aquatic organisms across different pesticide exposure patterns, time scales and species. Firstly, using synthetic data, we identified experimental data requirements which allow for the estimation of all parameters of the GUTS proper model. Secondly, we assessed how

  1. Modelling Constructs

    DEFF Research Database (Denmark)

    Kindler, Ekkart

    2009-01-01

    There are many different notations and formalisms for modelling business processes and workflows. These notations and formalisms have been introduced with different purposes and objectives. Later, influenced by other notations, comparisons with other tools, or by standardization efforts, these no...

  2. Linear Models

    CERN Document Server

    Searle, Shayle R

    2012-01-01

    This 1971 classic on linear models is once again available--as a Wiley Classics Library Edition. It features material that can be understood by any statistician who understands matrix algebra and basic statistical methods.

  3. Modeling Arcs

    CERN Document Server

    Insepov, Zeke; Veitzer, Seth; Mahalingam, Sudhakar

    2011-01-01

    Although vacuum arcs were first identified over 110 years ago, they are not yet well understood. We have since developed a model of breakdown and gradient limits that tries to explain, in a self-consistent way: arc triggering, plasma initiation, plasma evolution, surface damage and gradient limits. We use simple PIC codes for modeling plasmas, molecular dynamics for modeling surface breakdown and surface damage, and mesoscale surface thermodynamics and finite element electrostatic codes to evaluate surface properties. Since any given experiment seems to have more variables than data points, we have tried to consider a wide variety of arcing (rf structures, e-beam welding, laser ablation, etc.) to help constrain the problem, and concentrate on common mechanisms. While the mechanisms can be comparatively simple, modeling can be challenging.

  4. Paleoclimate Modeling

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Computer simulations of past climate. Variables provided as model output are described by parameter keyword. In some cases the parameter keywords are a subset of all...

  5. Anchor Modeling

    Science.gov (United States)

    Regardt, Olle; Rönnbäck, Lars; Bergholtz, Maria; Johannesson, Paul; Wohed, Petia

    Maintaining and evolving data warehouses is a complex, error prone, and time consuming activity. The main reason for this state of affairs is that the environment of a data warehouse is in constant change, while the warehouse itself needs to provide a stable and consistent interface to information spanning extended periods of time. In this paper, we propose a modeling technique for data warehousing, called anchor modeling, that offers non-destructive extensibility mechanisms, thereby enabling robust and flexible management of changes in source systems. A key benefit of anchor modeling is that changes in a data warehouse environment only require extensions, not modifications, to the data warehouse. This ensures that existing data warehouse applications will remain unaffected by the evolution of the data warehouse, i.e. existing views and functions will not have to be modified as a result of changes in the warehouse model.

  6. Model theory

    CERN Document Server

    Hodges, Wilfrid

    1993-01-01

    An up-to-date and integrated introduction to model theory, designed to be used for graduate courses (for students who are familiar with first-order logic), and as a reference for more experienced logicians and mathematicians.

  7. Product Development Process Modeling

    Institute of Scientific and Technical Information of China (English)

    1999-01-01

    The use of Concurrent Engineering and other modern methods of product development and maintenance requires that a large number of time-overlapped "processes" be performed by many people. However, successfully describing and optimizing these processes is becoming ever more difficult. The perspective of industrial process theory (the definition of a process) and the perspective of process implementation (process transition, accumulation, and inter-operation between processes) are used to survey the method used to build a base-model (multi-view) process model.

  8. Accelerated life models modeling and statistical analysis

    CERN Document Server

    Bagdonavicius, Vilijandas

    2001-01-01

    Failure Time Distributions: Introduction; Parametric Classes of Failure Time Distributions. Accelerated Life Models: Introduction; Generalized Sedyakin's Model; Accelerated Failure Time Model; Proportional Hazards Model; Generalized Proportional Hazards Models; Generalized Additive and Additive-Multiplicative Hazards Models; Changing Shape and Scale Models; Generalizations; Models Including Switch-Up and Cycling Effects; Heredity Hypothesis; Summary. Accelerated Degradation Models: Introduction; Degradation Models; Modeling the Influence of Explanatory Varia

  9. Business Model Process Configurations

    DEFF Research Database (Denmark)

    Taran, Yariv; Nielsen, Christian; Thomsen, Peter

    2015-01-01

    Purpose – The paper aims: 1) To develop systematically a structural list of various business model process configurations and to group (deductively) these selected configurations in a structured typological categorization list. 2) To facilitate companies in the process of BM innovation, by developing (inductively) an ontological classification framework, in view of the BM process configurations typology developed. Design/methodology/approach – Given the inconsistencies found in the business model studies (e.g. definitions, configurations, classifications) we adopted the analytical induction method of data analysis. Findings - A comprehensive literature review and analysis resulted in a list of business model process configurations systematically organized under five classification groups, namely, revenue model; value proposition; value configuration; target customers, and strategic

  10. Principles of models based engineering

    Energy Technology Data Exchange (ETDEWEB)

    Dolin, R.M.; Hefele, J.

    1996-11-01

    This report describes a Models Based Engineering (MBE) philosophy and implementation strategy that has been developed at Los Alamos National Laboratory's Center for Advanced Engineering Technology. A major theme in this discussion is that models based engineering is an information management technology enabling the development of information driven engineering. Unlike other information management technologies, models based engineering encompasses the breadth of engineering information, from design intent through product definition to consumer application.

  11. Geoneutrino and Hydridic Earth model

    CERN Document Server

    Bezrukov, Leonid

    2013-01-01

    Uranium, Thorium and Potassium-40 abundances in the Earth were calculated in the framework of the Hydridic Earth model. Terrestrial heat production from U, Th and K-40 decays was also calculated. We must admit the existence of an Earth expansion process to understand the large calculated value of terrestrial heat production. A geoneutrino detector with a volume of more than 5 kT (LENA type) must be constructed to definitively distinguish between the Bulk Silicate Earth model and the Hydridic Earth model.

  12. Do stroke models model stroke?

    Directory of Open Access Journals (Sweden)

    Philipp Mergenthaler

    2012-11-01

    Full Text Available Stroke is one of the leading causes of death worldwide and the biggest reason for long-term disability. Basic research has formed the modern understanding of stroke pathophysiology, and has revealed important molecular, cellular and systemic mechanisms. However, despite decades of research, most translational stroke trials that aim to introduce basic research findings into clinical treatment strategies – most notably in the field of neuroprotection – have failed. Among other obstacles, poor methodological and statistical standards, negative publication bias, and incomplete preclinical testing have been proposed as ‘translational roadblocks’. In this article, we introduce the models commonly used in preclinical stroke research, discuss some of the causes of failed translational success and review potential remedies. We further introduce the concept of modeling ‘care’ of stroke patients, because current preclinical research models the disorder but does not model care or state-of-the-art clinical testing. Stringent statistical methods and controlled preclinical trials have been suggested to counteract weaknesses in preclinical research. We conclude that preclinical stroke research requires (1) appropriate modeling of the disorder, (2) appropriate modeling of the care of stroke patients and (3) an approach to preclinical testing that is similar to clinical testing, including Phase 3 randomized controlled preclinical trials as necessary additional steps before new therapies enter clinical testing.

  13. Persistent Modelling

    DEFF Research Database (Denmark)

    2012-01-01

    The relationship between representation and the represented is examined here through the notion of persistent modelling. This notion is not novel to the activity of architectural design if it is considered as describing a continued active and iterative engagement with design concerns – an evident characteristic of architectural practice. But the persistence in persistent modelling can also be understood to apply in other ways, reflecting and anticipating extended roles for representation. This book identifies three principal areas in which these extensions are becoming apparent within contemporary... It also provides critical insight into the use of contemporary modelling tools and methods, together with an examination of the implications their use has within the territories of architectural design, realisation and experience.

  14. Mathematical modeling

    CERN Document Server

    Eck, Christof; Knabner, Peter

    2017-01-01

    Mathematical models are the decisive tool to explain and predict phenomena in the natural and engineering sciences. With this book readers will learn to derive mathematical models which help to understand real-world phenomena. At the same time, a wealth of important examples for the abstract concepts treated in the curriculum of mathematics degrees is given. An essential feature of this book is that mathematical structures are used as an ordering principle and not the fields of application. Methods from linear algebra, analysis and the theory of ordinary and partial differential equations are thoroughly introduced and applied in the modeling process. Examples of applications in the fields of electrical networks, chemical reaction dynamics, population dynamics, fluid dynamics, elasticity theory and crystal growth are treated comprehensively.

  15. Inflatable Models

    Institute of Scientific and Technical Information of China (English)

    Ling Li; Vasily Volkov

    2006-01-01

    A physically-based model is presented for the simulation of a new type of deformable objects: inflatable objects, such as shaped balloons, which consist of pressurized air enclosed by an elastic surface. These objects have properties inherent in both 3D and 2D elastic bodies, as they demonstrate the behaviour of 3D shapes using 2D formulations. As there is no internal structure in them, their behaviour is substantially different from the behaviour of deformable solid objects. We use one of the few available models for deformable surfaces, and enhance it to include the forces of internal and external pressure. These pressure forces may also incorporate buoyancy forces, to allow objects filled with a low-density gas to float in denser media. The obtained models demonstrate rich dynamic behaviour, such as bouncing, floating, deflation and inflation.

  16. Lens Model

    DEFF Research Database (Denmark)

    Nash, Ulrik William

    2014-01-01

    Firms consist of people who make decisions to achieve goals. How do these people develop the expectations which underpin the choices they make? The lens model provides one answer to this question. It was developed by cognitive psychologist Egon Brunswik (1952) to illustrate his theory of probabilistic functionalism, and concerns the environment and the mind, and adaptation by the latter to the former. This entry is about the lens model, and probabilistic functionalism more broadly. Focus will mostly be on firms and their employees, but, to fully appreciate the scope, we have to keep in mind

  17. Controversy around the definition of waste

    CSIR Research Space (South Africa)

    Oelofse, Suzanna HH

    2009-11-20

    Full Text Available This paper presents information concerning the definition of waste, discussing the importance of a clear definition, ongoing debates, the broad definition of waste, problems with the broad definition, interpretation, and the current waste management model

  18. Lens Model

    DEFF Research Database (Denmark)

    Nash, Ulrik William

    2014-01-01

    Firms consist of people who make decisions to achieve goals. How do these people develop the expectations which underpin the choices they make? The lens model provides one answer to this question. It was developed by cognitive psychologist Egon Brunswik (1952) to illustrate his theory of probabilistic functionalism...

  19. Molecular modeling

    Directory of Open Access Journals (Sweden)

    Aarti Sharma

    2009-01-01

    Full Text Available The use of computational chemistry in the development of novel pharmaceuticals is becoming an increasingly important tool. In the past, drugs were simply screened for effectiveness. The recent advances in computing power and the exponential growth of the knowledge of protein structures have made it possible for organic compounds to be tailored to decrease the harmful side effects and increase the potency. This article provides a detailed description of the techniques employed in molecular modeling. Molecular modeling is a rapidly developing discipline, and has been supported by the dramatic improvements in computer hardware and software in recent years.

  20. Smashnova Model

    CERN Document Server

    Sivaram, C

    2007-01-01

    An alternate model for gamma ray bursts is suggested. For a white dwarf (WD) and neutron star (NS) very close binary system, the WD (close to Mch) can detonate due to tidal heating, leading to a SN. Material falling on to the NS at relativistic velocities can cause its collapse to a magnetar or quark star or black hole leading to a GRB. As the material smashes on to the NS, it is dubbed the Smashnova model. Here the SN is followed by a GRB. NS impacting a RG (or RSG) (like in Thorne-Zytkow objects) can also cause a SN outburst followed by a GRB. Other variations are explored.

  1. Modelling language

    CERN Document Server

    Cardey, Sylviane

    2013-01-01

    In response to the need for reliable results from natural language processing, this book presents an original way of decomposing a language(s) in a microscopic manner by means of intra/inter‑language norms and divergences, going progressively from languages as systems to the linguistic, mathematical and computational models, which being based on a constructive approach are inherently traceable. Languages are described with their elements aggregating or repelling each other to form viable interrelated micro‑systems. The abstract model, which contrary to the current state of the art works in int

  2. Business Model for Czech Agribusiness

    Directory of Open Access Journals (Sweden)

    Poláková Jana

    2015-09-01

    Full Text Available Business modelling facilitates the understanding of value creation logic in organizations in general. Identifying the components of business models based on different criteria helps understanding the fundamentals of business and the position of entrepreneurs and managers in companies. The present research is focused on the definition of a specific business model for the Czech agribusiness sector. Based on the theoretical background and evaluation of selected business models, the aim is to create a new business model, using components which take into account the specifics of this particular industry.

  3. Model for analysis and definition of the governor constants in hydroelectric power; Modelo para analise e definicao das constantes do regulador em usinas hidreletricas

    Energy Technology Data Exchange (ETDEWEB)

    Andrade, Jose Geraldo Pena de; Koelle, Edmundo; Luvizotto Junior, Edevar [Universidade Estadual de Campinas, SP (Brazil). Faculdade de Engenharia Civil. Dept. de Hidraulica e Saneamento

    1997-07-01

    This paper presents a complete mathematical and computer model which allows simulating a generic hydroelectric power plant under steady state and transitory regimes, in the extensive time, and also the analysis of the oscillating flows resulting from excitation sources present in the installation, such as vortices in the suction pipe during partial load operation.

  4. Definition for business-oriented dynamic task allocation work model

    Institute of Scientific and Technical Information of China (English)

    李春芳; 徐建军

    2014-01-01

    To address the lack of a complete business-oriented dynamic task allocation work model and the unclear working relationships among the model's components, a strategy-first dynamic task allocation work model, S-DTAWM, is proposed. Using set theory, the components, working relationships and executing algorithms of S-DTAWM are formally defined, laying a foundation for the research and practice of universal task allocation systems. Cases from the bank credit examination and approval process illustrate how the model is applied.

  5. Definition and detection approach of control-flow anti-patterns in process models

    Institute of Scientific and Technical Information of China (English)

    韩兆刚; 巩朋; 张莉; 吕方兴

    2013-01-01

    A control-flow anti-pattern definition and detection approach for process models is proposed. Control-flow structures are first extracted from different process models and reduced to a language-independent process structure tree; control-flow anti-patterns are then detected in that tree according to their CAPDL (control-flow anti-pattern description language) definitions. The approach supports different process modeling languages and allows users to define their own control-flow anti-patterns. A detection experiment based on 215 real-world BPMN process models shows that the approach detects user-defined control-flow anti-patterns effectively and, compared with existing approaches, greatly improves detection efficiency.
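
    The reduce-and-match idea in this abstract can be sketched in a few lines: a process structure tree is a nested, language-independent structure, and an anti-pattern query is a walk over it. The tree shape, node names and the particular anti-pattern below are illustrative assumptions, not the paper's CAPDL notation.

    ```python
    # Hypothetical sketch of anti-pattern detection on a process structure tree.
    # A node is (node_type, body); body is a child list or a task name.

    def find_deadlocks(node, path=()):
        """Find XOR-SPLIT ... AND-JOIN sequences (a classic deadlock anti-pattern:
        an exclusive choice later synchronized by a parallel join)."""
        kind, body = node
        hits = []
        if kind == "SEQ":
            kinds = [child[0] for child in body]
            for i, k in enumerate(kinds):
                if k == "XOR-SPLIT" and "AND-JOIN" in kinds[i + 1:]:
                    hits.append(path + (i,))  # record where the pattern starts
        if isinstance(body, list):
            for i, child in enumerate(body):
                hits.extend(find_deadlocks(child, path + (i,)))
        return hits

    process = ("SEQ", [
        ("TASK", "receive application"),
        ("XOR-SPLIT", [("TASK", "manual review"), ("TASK", "auto review")]),
        ("AND-JOIN", [("TASK", "archive")]),
    ])

    print(find_deadlocks(process))  # -> [(1,)]: the XOR-SPLIT at index 1
    ```

    A real implementation would parse BPMN (or another notation) into such a tree and read the pattern itself from a CAPDL definition rather than hard-coding it.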

  6. Model molecules mimicking asphaltenes.

    Science.gov (United States)

    Sjöblom, Johan; Simon, Sébastien; Xu, Zhenghe

    2015-04-01

    Asphaltenes are typically defined as the fraction of petroleum insoluble in n-alkanes (typically heptane, but also hexane or pentane) but soluble in toluene. This fraction causes problems of emulsion formation and deposition/precipitation during crude oil production, processing and transport. From the definition it follows that asphaltenes are not a homogeneous fraction but are composed of molecules polydisperse in molecular weight, structure and functionality. Their complexity makes their properties difficult to understand. Model molecules with well-defined structures that resemble the properties of real asphaltenes can help improve this understanding. Over the last ten years, different research groups have proposed asphaltene model molecules and studied how well they mimic the properties of asphaltenes and what mechanisms underlie those properties. This article reviews the different classes of model compounds proposed and presents their properties by comparison with fractionated asphaltenes. After presenting the motivation for developing model asphaltenes, the composition and properties of asphaltenes are described, followed by the approaches and accomplishments of the different schools working on asphaltene model compounds. The bulk and interfacial properties of the perylene-based model asphaltene compounds developed by Sjöblom et al. are presented next. Finally, the emulsion-stabilization properties of fractionated asphaltenes and model asphaltene compounds are presented and discussed.

  7. Building Models and Building Modelling

    DEFF Research Database (Denmark)

    Jørgensen, Kaj Asbjørn; Skauge, Jørn

    The report's introductory chapter describes the primary concepts of building models and establishes some fundamental aspects of computer-based modelling. The difference between drawing programs and building modelling programs is also described, along with important aspects of modelling and building models. It is stressed that modelling should be carried out at several levels of abstraction and in two dimensions in the so-called modelling matrix. From this, the primary phases of building modelling are identified. The basic characteristics of building models are then described, including a clarification of the concepts of object-oriented software and object-oriented models. It is emphasized that the notion of object-based modelling provides a sufficient and better understanding. Finally, the idea of the ideal building model is described as one unified model that is used throughout...

  8. Zeebrugge Model

    DEFF Research Database (Denmark)

    Jensen, Morten S.; Frigaard, Peter

    In the following, results from model tests with the Zeebrugge breakwater are presented. The objective of these tests is partly to investigate the influence on wave run-up of a changing water level during a storm, and partly the influence on wave run-up of an introduced longshore current...

  9. Why Model?

    Directory of Open Access Journals (Sweden)

    Olaf eWolkenhauer

    2014-01-01

    Full Text Available Next generation sequencing technologies are bringing about a renaissance of mining approaches. A comprehensive picture of the genetic landscape of an individual patient will be useful, for example, to identify groups of patients that do or do not respond to certain therapies. The high expectations may however not be satisfied if the number of patient groups with similar characteristics is going to be very large. I therefore doubt that mining sequence data will give us an understanding of why and when therapies work. For understanding the mechanisms underlying diseases, an alternative approach is to model small networks in quantitative mechanistic detail, to elucidate the role of genes and proteins in dynamically changing the functioning of cells. Here an obvious critique is that these models consider too few components, compared to what might be relevant for any particular cell function. I show here that mining approaches and dynamical systems theory are two ends of a spectrum of methodologies to choose from. Drawing upon personal experience in numerous interdisciplinary collaborations, I provide guidance on how to model by discussing the question "Why model?"

  10. Why model?

    Science.gov (United States)

    Wolkenhauer, Olaf

    2014-01-01

    Next generation sequencing technologies are bringing about a renaissance of mining approaches. A comprehensive picture of the genetic landscape of an individual patient will be useful, for example, to identify groups of patients that do or do not respond to certain therapies. The high expectations may however not be satisfied if the number of patient groups with similar characteristics is going to be very large. I therefore doubt that mining sequence data will give us an understanding of why and when therapies work. For understanding the mechanisms underlying diseases, an alternative approach is to model small networks in quantitative mechanistic detail, to elucidate the role of genes and proteins in dynamically changing the functioning of cells. Here an obvious critique is that these models consider too few components, compared to what might be relevant for any particular cell function. I show here that mining approaches and dynamical systems theory are two ends of a spectrum of methodologies to choose from. Drawing upon personal experience in numerous interdisciplinary collaborations, I provide guidance on how to model by discussing the question "Why model?"

  11. Model CAPM

    OpenAIRE

    Burianová, Eva

    2008-01-01

    The aim of the first part of this bachelor thesis is, by analysing the source texts, to give a theoretical summary of the economic models and theories on which the CAPM is built: Markowitz's portfolio theory (the analysis of expected-utility maximization and the model of optimal portfolio selection based on it), and Tobin's extension of the Markowitz model (splitting optimal portfolio selection into two phases: first determining the optimal combination of risky instruments, and then allocating the available capital between this optimal ...

  12. Transport modeling

    Institute of Scientific and Technical Information of China (English)

    R.E. Waltz

    2007-01-01

    There has been remarkable progress during the past decade in understanding and modeling turbulent transport in tokamaks. With some exceptions, the progress is derived from the huge increases in computational power and the ability to simulate tokamak turbulence with ever more fundamental and physically realistic dynamical equations, e.g.

  13. Painting models

    Science.gov (United States)

    Baart, F.; Donchyts, G.; van Dam, A.; Plieger, M.

    2015-12-01

    The emergence of interactive art has blurred the line between electronics, computer graphics and art. Here we apply this art form to numerical models and show how the transformation of a numerical model into an interactive painting can both provide insights and solve real-world problems. The cases used as examples include forensic reconstructions, dredging optimization and barrier design. The system can be fed by any source of time-varying vector fields, such as hydrodynamic models. The cases used here, the Indian Ocean (HYCOM), the Wadden Sea (Delft3D Curvilinear), and San Francisco Bay (3Di subgrid and Delft3D Flexible Mesh), show that the method is suitable for different time and spatial scales. High-resolution numerical models become interactive paintings by exchanging their velocity fields with a high-resolution (>=1M cells) image-based flow visualization that runs in an html5-compatible web browser. The image-based flow visualization combines three images into a new image: the current image, a drawing, and a uv + mask field. The advection scheme that computes the resulting image is executed on the graphics card using WebGL, allowing for 1M grid cells at 60Hz performance on mediocre graphics cards. The software is provided as open source. By using different sources for the drawing, one can gain insight into several aspects of the velocity fields, including not only the commonly represented magnitude and direction, but also divergence, topology and turbulence.
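
    As a toy illustration of the advection idea in this abstract, the sketch below backtraces each pixel one step along a velocity field and samples the previous image there (nearest-neighbour semi-Lagrangian advection). The real system runs per frame on the GPU via WebGL; the grid size, time step and velocity field here are invented for illustration.

    ```python
    # Minimal CPU sketch of one image-based flow visualization advection step.
    W, H, DT = 8, 8, 1.0

    def advect(image, velocity):
        """Return a new image advected one step by `velocity` (u, v per cell)."""
        out = [[0.0] * W for _ in range(H)]
        for y in range(H):
            for x in range(W):
                u, v = velocity(x, y)
                # backtrace: where did this pixel's value come from?
                sx = min(max(int(round(x - DT * u)), 0), W - 1)
                sy = min(max(int(round(y - DT * v)), 0), H - 1)
                out[y][x] = image[sy][sx]
        return out

    # uniform rightward flow moves a bright column one cell to the right
    image = [[1.0 if x == 2 else 0.0 for x in range(W)] for y in range(H)]
    moved = advect(image, lambda x, y: (1.0, 0.0))
    print([row[3] for row in moved])  # the bright column is now at x == 3
    ```

    Repeating this step while blending in a drawing image and masking with the uv field is, in essence, what the WebGL shader does a million cells at a time.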

  14. Modeling Muscles

    Science.gov (United States)

    Goodwyn, Lauren; Salm, Sarah

    2007-01-01

    Teaching the anatomy of the muscle system to high school students can be challenging. Students often learn about muscle anatomy by memorizing information from textbooks or by observing plastic, inflexible models. Although these mediums help students learn about muscle placement, the mediums do not facilitate understanding regarding integration of…

  15. Entrepreneurship Models.

    Science.gov (United States)

    Finger Lakes Regional Education Center for Economic Development, Mount Morris, NY.

    This guide describes seven model programs that were developed by the Finger Lakes Regional Center for Economic Development (New York) to meet the training needs of female and minority entrepreneurs to help their businesses survive and grow and to assist disabled and dislocated workers and youth in beginning small businesses. The first three models…

  16. Quality modelling

    NARCIS (Netherlands)

    Tijskens, L.M.M.

    2003-01-01

    For modelling product behaviour with respect to quality for users and consumers, it is essential to have at least a fundamental notion of what quality really is, and which product properties determine the quality assigned by the consumer to a product. In other words: what is allowed and what is to be

  17. Cloud Robotics Model

    Directory of Open Access Journals (Sweden)

    Gyula Mester

    2015-01-01

    Full Text Available Cloud Robotics was born from the merger of service robotics and cloud technologies. It allows robots to benefit from the powerful computational, storage, and communications resources of modern data centres. Cloud robotics allows robots to take advantage of the rapid increase in data transfer rates to offload tasks without hard real time requirements. Cloud Robotics has rapidly gained momentum with initiatives by companies such as Google, Willow Garage and Gostai as well as more than a dozen active research projects around the world. The presentation summarizes the main idea, the definition, the cloud model composed of essential characteristics, service models and deployment models, planning task execution and beyond. Finally some cloud robotics projects are discussed.

  18. Evacuation modeling trends

    CERN Document Server

    Abreu, Orlando; Alvear, Daniel

    2016-01-01

    This book presents an overview of modeling definitions and concepts, theory on human behavior and human performance data, available tools and simulation approaches, model development, and application and validation methods. It considers the data and research efforts needed to develop and incorporate functions for the different parameters into comprehensive escape and evacuation simulations, with a number of examples illustrating different aspects and approaches. After an overview of basic modeling approaches, the book discusses benefits and challenges of current techniques. The representation of evacuees is a central issue, including human behavior and the proper implementation of representational tools. Key topics include the nature and importance of the different parameters involved in ASET and RSET and the interactions between them. A review of the current literature on verification and validation methods is provided, with a set of recommended verification tests and examples of validation tests. The book c...

  19. A Hybrid Model. DEMETER

    Energy Technology Data Exchange (ETDEWEB)

    Gerlagh, Reyer [University of Manchester, Manchester (United Kingdom); Van der Zwaan, Bob [ECN Policy Studies, Petten (Netherlands)

    2009-11-15

    This insightful book explores the issue of sustainable development in its more operative and applied sense. Although a great deal of research has addressed potential interpretations and definitions of sustainable development, much of this work is too abstract to offer policy-makers and researchers the feasible and effective guidelines they require. This book redresses the balance. The authors highlight how various indicators and aggregate measures can be included in models that are used for decision-making support and sustainability assessment. They also demonstrate the importance of identifying practical means to assess whether policy proposals, specific decisions or targeted scenarios are sustainable. With discussions of basic concepts relevant to understanding applied sustainability analysis, such as definitions of costs and revenue recycling, this book provides policy-makers, researchers and graduate students with feasible and effective principles for measuring sustainable development.

  20. Study for the optimization of a transport aircraft wing for maximum fuel efficiency. Volume 1: Methodology, criteria, aeroelastic model definition and results

    Science.gov (United States)

    Radovcich, N. A.; Dreim, D.; Okeefe, D. A.; Linner, L.; Pathak, S. K.; Reaser, J. S.; Richardson, D.; Sweers, J.; Conner, F.

    1985-01-01

    Work performed in the design of a transport aircraft wing for maximum fuel efficiency is documented with emphasis on design criteria, design methodology, and three design configurations. The design database includes complete finite element model description, sizing data, geometry data, loads data, and inertial data. A design process which satisfies the economics and practical aspects of a real design is illustrated. The cooperative study relationship between the contractor and NASA during the course of the contract is also discussed.

  1. Building Models and Building Modelling

    DEFF Research Database (Denmark)

    Jørgensen, Kaj; Skauge, Jørn

    2008-01-01

    The report's introductory chapter describes the primary concepts of building models and establishes some fundamental aspects of computer-based modelling. The difference between drawing programs and building modelling programs is also described. Important aspects of comp...

  2. Molecular Modelling

    Directory of Open Access Journals (Sweden)

    Aarti Sharma

    2009-12-01

    Full Text Available The use of computational chemistry in the development of novel pharmaceuticals is becoming an increasingly important tool. In the past, drugs were simply screened for effectiveness. The recent advances in computing power and the exponential growth of the knowledge of protein structures have made it possible for organic compounds to be tailored to decrease harmful side effects and increase the potency. This article provides a detailed description of the techniques employed in molecular modelling. Molecular modelling is a rapidly developing discipline, and has been supported by the dramatic improvements in computer hardware and software in recent years.

  3. Cheating models

    DEFF Research Database (Denmark)

    Arnoldi, Jakob

    The article discusses the use of algorithmic models for so-called High Frequency Trading (HFT) in finance. HFT is controversial yet widespread in modern financial markets. It is a form of automated trading technology which critics, among other things, claim can lead to market manipulation. Drawing on two cases, this article shows that manipulation more likely happens in the reverse way, meaning that human traders attempt to make algorithms 'make mistakes' or 'mislead' algos. Thus, it is algorithmic models, not humans, that are manipulated. Such manipulation poses challenges for security exchanges. The article analyses these challenges and argues that we witness a new post-social form of human-technology interaction that will lead to a reconfiguration of professional codes for financial trading.

  4. Acyclic models

    CERN Document Server

    Barr, Michael

    2002-01-01

    Acyclic models is a method heavily used to analyze and compare various homology and cohomology theories appearing in topology and algebra. This book is the first attempt to put together in a concise form this important technique and to include all the necessary background. It presents a brief introduction to category theory and homological algebra. The author then gives the background of the theory of differential modules and chain complexes over an abelian category to state the main acyclic models theorem, generalizing and systemizing the earlier material. This is then applied to various cohomology theories in algebra and topology. The volume could be used as a text for a course that combines homological algebra and algebraic topology. Required background includes a standard course in abstract algebra and some knowledge of topology. The volume contains many exercises. It is also suitable as a reference work for researchers.

  5. Nuclear Models

    Science.gov (United States)

    Fossión, Rubén

    2010-09-01

    The atomic nucleus is a typical example of a many-body problem. On the one hand, the number of nucleons (protons and neutrons) that constitute the nucleus is too large to allow for exact calculations. On the other hand, the number of constituent particles is too small for the individual nuclear excitation states to be explained by statistical methods. Another problem, particular to the atomic nucleus, is that the nucleon-nucleon (n-n) interaction is not one of the fundamental forces of Nature, and is hard to put in a single closed equation. The nucleon-nucleon interaction also behaves differently between two free nucleons (bare interaction) and between two nucleons in the nuclear medium (dressed interaction). For the above reasons, specific nuclear many-body models have been devised, each of which sheds light on some selected aspects of nuclear structure. Only by combining the viewpoints of different models can a global insight into the atomic nucleus be gained. In this chapter, we review the Nuclear Shell Model as an example of the microscopic approach, and the Collective Model as an example of the geometric approach. Finally, we study the statistical properties of nuclear spectra, based on symmetry principles, to find out whether there is quantum chaos in the atomic nucleus. All three major approaches have been rewarded with the Nobel Prize in Physics. In the text, we will stress how each approach introduces its own series of approximations to reduce the prohibitively large number of degrees of freedom of the full many-body problem to a smaller, manageable number of effective degrees of freedom.

  6. Modelling Behaviour

    DEFF Research Database (Denmark)

    2015-01-01

    This book reflects and expands on the current trend in the building industry to understand, simulate and ultimately design buildings by taking into consideration the interlinked elements and forces that act on them. This approach overcomes the traditional, exclusive focus on building tasks... The chapter authors were invited speakers at the 5th Symposium "Modelling Behaviour", which took place at the CITA in Copenhagen in September 2015.

  7. Modeling Minds

    DEFF Research Database (Denmark)

    Michael, John

    others' minds. Then (2), in order to bring to light some possible justifications, as well as hazards and criticisms of the methodology of looking time tests, I will take a closer look at the concept of folk psychology and will focus on the idea that folk psychology involves using oneself as a model of other people in order to predict and understand their behavior. Finally (3), I will discuss the historical location and significance of the emergence of looking time tests...

  8. Modeling biomembranes.

    Energy Technology Data Exchange (ETDEWEB)

    Plimpton, Steven James; Heffernan, Julieanne; Sasaki, Darryl Yoshio; Frischknecht, Amalie Lucile; Stevens, Mark Jackson; Frink, Laura J. Douglas

    2005-11-01

    Understanding the properties and behavior of biomembranes is fundamental to many biological processes and technologies. Microdomains in biomembranes or "lipid rafts" are now known to be an integral part of cell signaling, vesicle formation, fusion processes, protein trafficking, and viral and toxin infection processes. Understanding how microdomains form, how they depend on membrane constituents, and how they act not only has biological implications, but also will impact Sandia's effort in development of membranes that structurally adapt to their environment in a controlled manner. To provide such understanding, we created physically-based models of biomembranes. Molecular dynamics (MD) simulations and classical density functional theory (DFT) calculations using these models were applied to phenomena such as microdomain formation, membrane fusion, pattern formation, and protein insertion. Because lipid dynamics and self-organization in membranes occur on length and time scales beyond atomistic MD, we used coarse-grained models of double-tail lipid molecules that spontaneously self-assemble into bilayers. DFT provided equilibrium information on membrane structure. Experimental work was performed to further help elucidate the fundamental membrane organization principles.

  9. Model Mapping Approach Based on Ontology Semantics

    Directory of Open Access Journals (Sweden)

    Jinkui Hou

    2013-09-01

    Full Text Available The mapping relations between different models are the foundation for model transformation in model-driven software development. On the basis of ontology semantics, model mappings between different levels are classified by using the structural semantics of modeling languages. The general definition process for mapping relations is explored, and the principles of structure mapping are subsequently proposed. The approach is further illustrated by the mapping relations from the class model of an object-oriented modeling language to C programming code. The application research shows that the approach provides theoretical guidance for the realization of model mapping, and thus can effectively support model-driven software development.

  10. PATTERN MATCHING IN MODELS

    Directory of Open Access Journals (Sweden)

    Cristian GEORGESCU

    2005-01-01

    Full Text Available The goal of this paper is to investigate how such pattern matching could be performed on models, including the definition of the input language as well as the elaboration of efficient matching algorithms. Design patterns can be considered reusable micro-architectures that contribute to an overall system architecture. Frameworks are also closely related to design patterns. Components offer the possibility to radically change the behaviors and services offered by an application by substitution or addition of new components, even a long time after deployment. Software testing is another aspect of reliable development. Testing activities mainly consist in ensuring that a system implementation conforms to its specifications.

  11. Model Construct Based Enterprise Model Architecture and Its Modeling Approach

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    In order to support enterprise integration, a model construct based enterprise model architecture and its modeling approach are studied in this paper. First, the structural makeup and internal relationships of the enterprise model architecture are discussed. Then, the concept of the reusable model construct (MC), which belongs to the control view and can help to derive other views, is proposed. The modeling approach based on model constructs consists of three steps: reference model architecture synthesis, enterprise model customization, and system design and implementation. Following the MC-based modeling approach, a case study with the background of one-kind-product machinery manufacturing enterprises is illustrated. It is shown that the proposed model construct based enterprise model architecture and modeling approach are practical and efficient.

  12. DTN Modeling in OPNET Modeler

    Directory of Open Access Journals (Sweden)

    PAPAJ Jan

    2014-05-01

    Full Text Available Traditional wireless networks use the concept of point-to-point forwarding inherited from reliable wired networks, which is not ideal for the wireless environment. New emerging applications and networks operate mostly disconnected. So-called Delay-Tolerant Networks (DTNs) are receiving increasing attention from both academia and industry. DTNs introduced a store-carry-and-forward concept that solves the problem of intermittent connectivity. The behavior of such networks is verified with real models, computer simulation, or a combination of both approaches. Computer simulation has become the primary and cost-effective tool for evaluating the performance of DTNs. OPNET Modeler is our target simulation tool, and we wanted to extend OPNET's simulation capabilities towards DTNs. We implemented the bundle protocol in OPNET Modeler, allowing the simulation of cases based on the bundle concept, such as epidemic forwarding, which relies on flooding the network with messages, and a forwarding algorithm based on the history of past encounters (PRoPHET). The implementation details are provided in the article.
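
    The store-carry-and-forward idea behind epidemic forwarding can be sketched outside any simulator: whenever two nodes meet, each copies to the other any bundle the other lacks. The node names and contact trace below are invented for illustration; a real OPNET model would drive contacts from node mobility.

    ```python
    # Toy sketch of epidemic forwarding in a Delay-Tolerant Network.
    buffers = {"A": {"bundle-1"}, "B": set(), "C": set(), "D": set()}

    def contact(n1, n2):
        """Two nodes meet: exchange summary vectors, replicate missing bundles."""
        union = buffers[n1] | buffers[n2]
        buffers[n1] = set(union)
        buffers[n2] = set(union)

    # an intermittent contact trace: A never meets D directly,
    # yet the bundle still reaches D via carriers B and C
    for meeting in [("A", "B"), ("B", "C"), ("C", "D")]:
        contact(*meeting)

    print(sorted(n for n in buffers if "bundle-1" in buffers[n]))  # -> ['A', 'B', 'C', 'D']
    ```

    PRoPHET refines this by replicating only to nodes whose encounter history suggests a higher delivery probability, rather than flooding every contact.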

  13. A Model

    Institute of Scientific and Technical Information of China (English)

    Liu Zhiyang

    2011-01-01

    Similar to ISO Technical Committees, SAC Technical Committees undertake the management and coordination of standards development and amendment in various industry sectors, acting as a bridge among enterprises, research institutions and the governmental standardization administration. How to fully play this essential role is a vital issue SAC has been committed to resolving. Among hundreds of SAC TCs, one stands out in knitting together isolated, scattered, but highly competitive enterprises in the same industry with the "Standards" thread, and in achieving remarkable results in promoting industry development through standardization. It sets a role model for other TCs.

  14. PATHS groundwater hydrologic model

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, R.W.; Schur, J.A.

    1980-04-01

    A preliminary evaluation capability for two-dimensional groundwater pollution problems was developed as part of the Transport Modeling Task for the Waste Isolation Safety Assessment Program (WISAP). Our approach was to use the data limitations as a guide in setting the level of modeling detail. The PATHS Groundwater Hydrologic Model is the first-level (simplest) idealized hybrid analytical/numerical model for two-dimensional, saturated groundwater flow and single-component transport in homogeneous geology. This document describes the PATHS groundwater hydrologic model: the preliminary evaluation capability prepared for WISAP, including the enhancements made as a result of the authors' experience with the earlier capability. Appendixes A through D supplement the report as follows: complete derivations of the background equations are provided in Appendix A. Appendix B is a comprehensive set of instructions for users of PATHS, written for users who have little or no experience with computers. Appendix C is for the programmer; it contains information on how input parameters are passed between programs in the system, along with program listings and a test case listing. Appendix D is a definition of terms.

  15. Model building techniques for analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Walther, Howard P.; McDaniel, Karen Lynn; Keener, Donald; Cordova, Theresa Elena; Henry, Ronald C.; Brooks, Sean; Martin, Wilbur D.

    2009-09-01

    The practice of mechanical engineering for product development has evolved into a complex activity that requires a team of specialists for success. Sandia National Laboratories (SNL) has product engineers, mechanical designers, design engineers, manufacturing engineers, mechanical analysts and experimentalists, qualification engineers, and others that contribute through product realization teams to develop new mechanical hardware. The goal of SNL's Design Group is to change product development by enabling design teams to collaborate within a virtual model-based environment whereby analysis is used to guide design decisions. Computer-aided design (CAD) models using PTC's Pro/ENGINEER software tools are heavily relied upon in the product definition stage of parts and assemblies at SNL. The three-dimensional CAD solid model acts as the design solid model that is filled with all of the detailed design definition needed to manufacture the parts. Analysis is an important part of the product development process. The CAD design solid model (DSM) is the foundation for the creation of the analysis solid model (ASM). Creating an ASM from the DSM is currently a time-consuming effort; the turnaround time for analysis results needs to be decreased to have an impact on the overall product development. This effort can be reduced immensely through simple Pro/ENGINEER modeling techniques that come down to the way features are created in a part model. This document contains recommended modeling techniques that increase the efficiency of creating the ASM from the DSM.

  16. Business model elements impacting cloud computing adoption

    DEFF Research Database (Denmark)

    Bogataj, Kristina; Pucihar, Andreja; Sudzina, Frantisek

    The paper presents a proposed research framework for identification of business model elements impacting Cloud Computing adoption. We provide a definition of the main Cloud Computing characteristics, discuss previous findings on factors impacting Cloud Computing adoption, and investigate technology adoption theories such as the Diffusion of Innovations, the Technology Acceptance Model, and the Unified Theory of Acceptance and Use of Technology. Further on, a research model for identification of Cloud Computing adoption factors from a business model perspective is presented. The following business model building...

  18. Comparison of serum calcium measurements with respect to five models of atomic absorption spectrometers using NBS-AACC calcium reference method and isotope-dilution mass spectrometry as the definitive method.

    Science.gov (United States)

    Copeland, B E; Grisley, D W; Casella, J; Bailey, H

    1976-10-01

    Utilizing the recently described reference method for calcium (NBS-AACC) and the recently developed definitive (referee) NBS method for serum calcium measurement by isotope-dilution mass spectrometry (IDMS), an evaluation of five recent-model atomic absorption spectrometers was carried out. Under optimal conditions of instrument operation using aqueous standards, significant differences were found during the comparative analyses of three lyophilized pool samples and one liquid serum pool sample. Use of the NBS-AACC serum calcium protocol did not guarantee analytic results within +/- 2% of the IDMS value; in four of eight comparisons, differences from IDMS greater than 2% were observed. Several variables were studied to account for these differences. It was shown that a serum matrix, when present in the standards used to bracket the unknown sample, reduced differences between instruments in four of four instances and improved the accuracy of the results from a range of -1.1 to +3.5% to a range of +0.1 to +1.0%. It is concluded that a serum sample with a verified IDMS calcium value is a valuable tool that establishes an accurate and stable reference point for serum calcium measurement. The use of transfer-of-NBS-technology multipliers is suggested. Regional quality control serum pools and clinical chemistry survey sample materials that have been analyzed for calcium concentration by the NBS-IDMS definitive method are examples of such multipliers.

  19. Availability growth modeling

    Energy Technology Data Exchange (ETDEWEB)

    Wendelberger, J.R.

    1998-12-01

    In reliability modeling, the term availability is used to represent the fraction of time that a process is operating successfully. Several different definitions have been proposed for different types of availability. One commonly used measure of availability is cumulative availability, which is defined as the ratio of the amount of time that a system is up and running to the total elapsed time. During the startup phase of a process, cumulative availability may be treated as a growth process. A procedure for modeling cumulative availability as a function of time is proposed. Estimates of other measures of availability are derived from the estimated cumulative availability function. The use of empirical Bayes techniques to improve the resulting estimates is also discussed.
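
    The cumulative availability defined above is straightforward to compute from a log of up intervals. The sketch below uses illustrative names and is not the estimation procedure proposed in the report:

    ```python
    def cumulative_availability(up_intervals, t):
        """Fraction of [0, t] during which the system was up.

        up_intervals: list of (start, end) times of uninterrupted operation.
        """
        up_time = sum(max(0.0, min(end, t) - start)
                      for start, end in up_intervals if start < t)
        return up_time / t

    # Hypothetical startup log: up during [0, 5] and [10, 15], observed to t = 20.
    log = [(0.0, 5.0), (10.0, 15.0)]
    print(cumulative_availability(log, 20.0))  # 0.5
    ```

    During startup, evaluating this ratio at successive times yields the growth curve that the report models as a function of time.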

  20. Computational Modeling of Culture's Consequences

    NARCIS (Netherlands)

    Hofstede, G.J.; Jonker, C.M.; Verwaart, T.

    2010-01-01

    This paper presents an approach to formalize the influence of culture on the decision functions of agents in social simulations. The key components are (a) a definition of the domain of study in the form of a decision model, (b) knowledge acquisition based on a dimensional theory of culture, resulti

  1. Selective Maintenance Model Considering Time Uncertainty

    OpenAIRE

    Le Chen; Zhengping Shu; Yuan Li; Xuezhi Lv

    2012-01-01

    This study proposes a selective maintenance model for a weapon system during the mission interval. First, it gives relevant definitions and the operational process of the materiel support system. Then, it reviews current research on selective maintenance modeling. Finally, it establishes a numerical model for selecting corrective and preventive maintenance tasks, considering the time uncertainty brought by the unpredictability of maintenance procedures, the indetermination of downtime for spares and difference of skil...

  2. [Integration of a psychologist into Nephrology-Dialysis-Hypertension Operative Unit: from needs evaluation to the definition of an intervention model].

    Science.gov (United States)

    Monica, Ratti Maria; Delli Zotti, Giulia Bruna; Spotti, Donatella; Sarno, Lucio

    2014-01-01

    Chronic Kidney Disease (CKD) and dialytic treatment cause a significant psychological impact on patients, on their families, and on the medical-nursing staff as well. The psychological aspects linked to the chronic condition of kidney disease generate the need to integrate a psychologist into the healthcare team of the Nephrology, Dialysis and Hypertension Operative Unit, in order to offer specific, professional support to the patient during the different stages of the disease, to their caregivers and to the medical team. The aim of this collaboration project between Nephrology and Psychology is to create a global and integrated healthcare model, one that attends not only to the physical dimension of patients affected by CKD, but also to the emotional-affective, cognitive and social dimensions and to the health environment.

  3. Using EPA Tools and Data Services to Inform Changes to Design Storm Definitions for Wastewater Utilities based on Climate Model Projections

    Science.gov (United States)

    Tryby, M.; Fries, J. S.; Baranowski, C.

    2014-12-01

    Extreme precipitation events can cause significant impacts to drinking water and wastewater utilities, including facility damage, water quality impacts, service interruptions and potential risks to human health and the environment due to localized flooding and combined sewer overflows (CSOs). These impacts will become more pronounced with the projected increases in frequency and intensity of extreme precipitation events due to climate change. To model the impacts of extreme precipitation events, wastewater utilities often develop Intensity, Duration, and Frequency (IDF) rainfall curves and "design storms" for use in the U.S. Environmental Protection Agency's (EPA) Storm Water Management Model (SWMM). Wastewater utilities use SWMM for planning, analysis, and facility design related to stormwater runoff, combined and sanitary sewers, and other drainage systems in urban and non-urban areas. SWMM tracks (1) the quantity and quality of runoff generated within each sub-catchment; and (2) the flow rate, flow depth, and quality of water in each pipe and channel during a simulation period made up of multiple time steps. In its current form, EPA SWMM does not consider climate change projection data. Climate change may affect the relationship between intensity, duration, and frequency described by past rainfall events. Therefore, EPA is integrating climate projection data available in the Climate Resilience Evaluation and Awareness Tool (CREAT) into SWMM. CREAT is a climate risk assessment tool for utilities that provides downscaled climate change projection data for changes in the amount of rainfall in a 24-hour period for various extreme precipitation events (e.g., from 5-year to 100-year storm events). Incorporating climate change projections into SWMM will provide wastewater utilities with more comprehensive data they can use in planning for future storm events, thereby reducing the impacts to the utility and customers served from flooding and stormwater issues.
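
    One standard way to turn an IDF curve into a "design storm" hyetograph of the kind fed to SWMM is the alternating block method. The sketch below assumes a hypothetical power-law IDF curve; it is not part of SWMM or CREAT:

    ```python
    def alternating_block_hyetograph(idf, dt, n):
        """Design-storm hyetograph from an IDF curve (alternating block method).

        idf(d): average intensity (mm/h) for duration d (h); dt: block length (h);
        n: number of blocks.  Returns per-block depths (mm) with the peak centred.
        """
        depths = [idf(k * dt) * k * dt for k in range(1, n + 1)]  # cumulative depth
        increments = [depths[0]] + [depths[k] - depths[k - 1] for k in range(1, n)]
        increments.sort(reverse=True)
        left, right = [], []
        for i, inc in enumerate(increments):
            (right if i % 2 == 0 else left).append(inc)  # alternate around the peak
        return list(reversed(left)) + right

    # Hypothetical power-law IDF curve (NOT from CREAT): i = 50 * d^-0.5 mm/h.
    idf = lambda d: 50.0 * d ** -0.5
    storm = alternating_block_hyetograph(idf, 0.25, 8)  # eight 15-minute blocks
    ```

    The total depth of the blocks equals the IDF depth for the full storm duration, and the largest increment sits at the centre, which is what makes the method convenient as a SWMM rain-gage time series.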

  4. Representation of architectural artifacts: definition of an approach combining the complexity of the 3d digital instance with the intelligibility of the theoretical model.

    Directory of Open Access Journals (Sweden)

    David Lo Buglio

    2012-12-01

    Full Text Available With the arrival of digital technologies in the field of architectural documentation, many tools and methods for data acquisition have been considerably developed. However, these developments are primarily used for recording the colorimetric and dimensional properties of the objects processed. The actors of the disciplines concerned with 3D digitization of architectural heritage are faced with a large quantity of data, leaving the survey far from its cognitive dimension. In this context, it seems necessary to provide innovative solutions in order to increase the informational value of the representations produced, by strengthening the relations between the "multiplicity" of data and the "intelligibility" of the theoretical model. To address the lack of methodology we perceive, this article offers an approach to the creation of representation systems that articulate the digital instance with the geometric/semantic model.

  5. Modelling Behaviour

    DEFF Research Database (Denmark)

    2015-01-01

    This book reflects and expands on the current trend in the building industry to understand, simulate and ultimately design buildings by taking into consideration the interlinked elements and forces that act on them. This approach overcomes the traditional, exclusive focus on building tasks, while posing new challenges in all areas of the industry from the material and structural to the urban scale. Contributions from invited experts, papers and case studies provide the reader with a comprehensive overview of the field, as well as perspectives from related disciplines, such as computer science... The chapter authors were invited speakers at the 5th Symposium "Modelling Behaviour", which took place at the CITA in Copenhagen in September 2015.

  6. Econometric modelling

    Directory of Open Access Journals (Sweden)

    M. Alguacil Marí

    2017-08-01

    Full Text Available The current economic environment, together with the low scores obtained by our students in recent years, makes it necessary to incorporate new teaching methods. In this sense, econometric modelling provides a unique opportunity, offering the student the basic tools to address the study of econometrics in a deeper and more novel way. In this article, this teaching method is described, and an example is presented based on a recent study carried out by two students of the Degree in Economics. Likewise, the success of the method is evaluated quantitatively in terms of academic performance. The results confirm our initial idea that the greater involvement of the student, as well as the need for a more complete knowledge of the subject, provides a stimulus for the study of this subject. As evidence, we show that the students who opted for the method proposed here obtained higher marks than those who chose the traditional method.

  7. The Concept of Model. What is Remarkable in Mathematical Models

    Science.gov (United States)

    Bezruchko, Boris P.; Smirnov, Dmitry A.

    Dictionaries tell us that the word "model" originates from the Latin word "modulus", which means "measure, template, norm". The term was used in treatises on civil engineering several centuries BC. Currently, it covers an enormously wide range of material objects, symbolic structures and ideal images, ranging from models of clothes, small copies of ships and aeroplanes, and various pictures and plots to mathematical equations and computational algorithms. In starting to define the concept of "model", we should recall the difficulty of giving strict definitions of basic concepts. Thus, when university professors define "oscillations" and "waves" in their lectures on the subject, many of them repeat the joke of the Russian academician L.I. Mandel'shtam, who illustrated the problem with the example of the term "heap": how many objects, and of which kind, deserve such a name? He also compared strict definitions at the beginning of studying any topic to "swaddling oneself with barbed wire". Among classical examples of the impossibility of giving exhaustive formulations, one can mention the terms "bald spot", "forest", etc. We will therefore not consider the variety of existing definitions of "model" and "modelling" in detail. Each of them reflects the purposes and subjective preferences of its author and is valid in a certain sense, yet remains restricted in that it ignores objects or properties that deserve attention from other points of view.

  8. Dynamical Modelling of Meteoroid Streams

    Science.gov (United States)

    Clark, David; Wiegert, P. A.

    2012-10-01

    Accurate simulations of meteoroid streams permit the prediction of stream interaction with Earth and provide a measure of risk to Earth satellites and interplanetary spacecraft. Current cometary ejecta and meteoroid stream models have been somewhat successful in predicting some stream observations, but have required questionable assumptions and significant simplifications. Extending the approach of Vaubaillon et al. (2005)1, we model dust ejection from the cometary nucleus and generate sample particles representing bins of distinct dynamical evolution-regulating characteristics (size, density, direction, albedo). Ephemerides of the sample particles are integrated and recorded for later assignment of frequency based on model parameter changes. To assist in model analysis we are developing interactive software that permits the “turning of knobs” of model parameters, allowing near-real-time 3D visualization of the resulting stream structure. With this tool, we will revisit prior assumptions and observe the impact of introducing non-uniform cometary surface attributes and temporal activity. The software uses a single model definition and implementation throughout model verification, sample-particle bin generation and integration, and analysis. It supports the adjustment, with feedback, of both dependent and independent model values, with the intent of providing an interface supporting multivariate analysis. Propagation of measurement uncertainties and model parameter precisions is tracked rigorously throughout. We maintain a separation of the model itself from the abstract concepts of model definition, parameter manipulation, and real-time analysis and visualization, and are therefore able to adapt quickly to fundamental model changes. It is hoped the tool will also be of use in other solar system dynamics problems. 1 Vaubaillon, J.; Colas, F.; Jorda, L. (2005) A new method to predict meteor showers. I. Description of the model. Astronomy and

  9. On Activity modelling in process modeling

    Directory of Open Access Journals (Sweden)

    Dorel Aiordachioaie

    2001-12-01

    Full Text Available The paper examines the dynamic dimension of the meta-models of the process modelling process: time. Some principles are considered and discussed as main dimensions of any modelling activity: the compatibility of the substances, the equipresence of phenomena and the solvability of the model. The activity models are considered and represented at the meta-level.

  10. Modelling and Forecasting Multivariate Realized Volatility

    DEFF Research Database (Denmark)

    Halbleib, Roxana; Voev, Valeri

    2011-01-01

    This paper proposes a methodology for dynamic modelling and forecasting of realized covariance matrices based on fractionally integrated processes. The approach allows for flexible dependence patterns and automatically guarantees positive definiteness of the forecast. We provide an empirical appl...
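
    A common way to guarantee positive definiteness of a covariance forecast, in the spirit of this line of work, is to forecast the elements of the Cholesky factor and reconstruct the matrix, since L Lᵀ is positive semi-definite by construction. In the sketch below the element-wise forecast is a plain average, a stand-in for the paper's fractionally integrated dynamics:

    ```python
    import numpy as np

    def forecast_covariance(chol_history):
        """Covariance forecast that is positive definite by construction.

        chol_history: lower-triangular Cholesky factors of past realized
        covariance matrices.  The per-element forecast here is a plain average
        (a stand-in for fractionally integrated dynamics); any element-wise
        forecast works, because L @ L.T is positive semi-definite, and positive
        definite whenever the averaged diagonal stays strictly positive.
        """
        L = np.mean(chol_history, axis=0)
        return L @ L.T

    rng = np.random.default_rng(0)
    history = []
    for _ in range(10):
        A = rng.standard_normal((3, 3))
        history.append(np.linalg.cholesky(A @ A.T + 3.0 * np.eye(3)))

    S = forecast_covariance(history)
    print(np.linalg.eigvalsh(S))  # all eigenvalues strictly positive
    ```

    Because `np.linalg.cholesky` returns factors with strictly positive diagonals, their average is full-rank and the reconstructed forecast is positive definite without any post-hoc correction.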

  11. Experimental models of epilepsy

    Directory of Open Access Journals (Sweden)

    Stanojlović Olivera P.

    2004-01-01

    Full Text Available Introduction An epileptic seizure is a clinical event, and epilepsy is a group of symptoms rather than a disease. The main features all epilepsies have in common include spontaneous occurrence, repetitiveness, and ictal correlation within the EEG. Epilepsies are manifested with distinct EEG changes, requiring exact clinical definition and consequential treatment. Current data show that 1% of the world's population (approximately 50 million people) suffers from epilepsy, with 25% of patients being refractory to therapy, requiring a search for new substances that decrease the EEG and behavioral manifestations of epilepsies. Material and methods With regard to the discovery and testing of anticonvulsant substances, the best results have been achieved by the use of experimental models. Animal models of epilepsy are useful in acquiring basic knowledge regarding pathogenesis, neurotransmitters (glutamate), receptors (NMDA/AMPA/kainate), and propagation of epileptic seizures, and in the preclinical assessment of antiepileptics (competitive and non-competitive NMDA antagonists). Results and conclusions In our lab, we have developed a pharmacologic model (metaphit, NMDA and remacemide-cilastatin) of generalized, reflex, and audiogenic epilepsy. The model is suitable for testing various anticonvulsant substances (e.g. APH, APV, CPP, MK-801) and potential antiepileptics (e.g. DSIP and its tetra- and octa-analogues).

  12. Improved Trailing Edge Noise Model

    DEFF Research Database (Denmark)

    Bertagnolio, Franck

    2012-01-01

    The modeling of the surface pressure spectrum under a turbulent boundary layer is investigated in the presence of an adverse pressure gradient along the flow direction. It is shown that discrepancies between measurements and results from a well-known model increase as the pressure gradient increases. This model is modified by introducing anisotropy in the definition of the vertical velocity component spectrum across the boundary layer. The degree of anisotropy is directly related to the strength of the pressure gradient. It is shown that by appropriately normalizing the pressure gradient and by tuning the anisotropy factor, experimental results can be closely reproduced by the modified model. In this section, the original TNO-Blake model is modified in order to account for the effects of a pressure gradient through turbulence anisotropy. The model results are compared with measurements...

  13. Definitive localization of intracellular proteins: Novel approach using CRISPR-Cas9 genome editing, with glucose 6-phosphate dehydrogenase as a model.

    Science.gov (United States)

    Spencer, Netanya Y; Yan, Ziying; Cong, Le; Zhang, Yulong; Engelhardt, John F; Stanton, Robert C

    2016-02-01

    Studies to determine subcellular localization and translocation of proteins are important because subcellular localization of proteins affects every aspect of cellular function. Such studies frequently utilize mutagenesis to alter amino acid sequences hypothesized to constitute subcellular localization signals. These studies often utilize fluorescent protein tags to facilitate live cell imaging. These methods are excellent for studies of monomeric proteins, but for multimeric proteins, they are unable to rule out artifacts from native protein subunits already present in the cells. That is, native monomers might direct the localization of fluorescent proteins with their localization signals obliterated. We have developed a method for ruling out such artifacts, and we use glucose 6-phosphate dehydrogenase (G6PD) as a model to demonstrate the method's utility. Because G6PD is capable of homodimerization, we employed a novel approach to remove interference from native G6PD. We produced a G6PD knockout somatic (hepatic) cell line using CRISPR-Cas9 mediated genome engineering. Transfection of G6PD knockout cells with G6PD fluorescent mutant proteins demonstrated that the major subcellular localization sequences of G6PD are within the N-terminal portion of the protein. This approach sets a new gold standard for similar studies of subcellular localization signals in all homodimerization-capable proteins.

  14. Life sciences domain analysis model.

    Science.gov (United States)

    Freimuth, Robert R; Freund, Elaine T; Schick, Lisa; Sharma, Mukesh K; Stafford, Grace A; Suzek, Baris E; Hernandez, Joyce; Hipp, Jason; Kelley, Jenny M; Rokicki, Konrad; Pan, Sue; Buckler, Andrew; Stokes, Todd H; Fernandez, Anna; Fore, Ian; Buetow, Kenneth H; Klemm, Juli D

    2012-01-01

    Meaningful exchange of information is a fundamental challenge in collaborative biomedical research. To help address this, the authors developed the Life Sciences Domain Analysis Model (LS DAM), an information model that provides a framework for communication among domain experts and technical teams developing information systems to support biomedical research. The LS DAM is harmonized with the Biomedical Research Integrated Domain Group (BRIDG) model of protocol-driven clinical research. Together, these models can facilitate data exchange for translational research. The content of the LS DAM was driven by analysis of life sciences and translational research scenarios and the concepts in the model are derived from existing information models, reference models and data exchange formats. The model is represented in the Unified Modeling Language and uses ISO 21090 data types. The LS DAM v2.2.1 is comprised of 130 classes and covers several core areas including Experiment, Molecular Biology, Molecular Databases and Specimen. Nearly half of these classes originate from the BRIDG model, emphasizing the semantic harmonization between these models. Validation of the LS DAM against independently derived information models, research scenarios and reference databases supports its general applicability to represent life sciences research. The LS DAM provides unambiguous definitions for concepts required to describe life sciences research. The processes established to achieve consensus among domain experts will be applied in future iterations and may be broadly applicable to other standardization efforts. The LS DAM provides common semantics for life sciences research. Through harmonization with BRIDG, it promotes interoperability in translational science.

  15. Exposure at Default Modeling with Default Intensities

    OpenAIRE

    Witzany, Jiří

    2011-01-01

    The paper provides an overview of the Exposure at Default (EAD) definition, requirements, and estimation methods as set by the Basel II regulation. A new methodology connected to intensity-of-default modeling is proposed. The numerical examples show that various estimation techniques may lead to quite different results, with the intensity-of-default-based model recommended as the most faithful with respect to a precise probabilistic definition of the EAD parameter.

  16. Semiparametric Regression and Model Refining

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    This paper presents a semiparametric adjustment method suitable for general cases. Assuming that the regularizer matrix is positive definite, the calculation method is discussed and the corresponding formulae are presented. Finally, a simulated adjustment problem is constructed to illustrate the method. The results from the semiparametric model and the G-M model are compared; they demonstrate that model errors or systematic errors of the observations can be detected correctly with the semiparametric estimation method.

  17. Definition of a dynamic laparoscopic model for the prediction of incomplete cytoreduction in advanced epithelial ovarian cancer: proof of a concept.

    Science.gov (United States)

    Petrillo, M; Vizzielli, G; Fanfani, F; Gallotta, V; Cosentino, F; Chiantera, V; Legge, F; Carbone, V; Scambia, G; Fagotti, A

    2015-10-01

    To develop an updated laparoscopy-based model to predict incomplete cytoreduction (RT>0) in advanced epithelial ovarian cancer (AEOC) after the introduction of upper abdominal surgery (UAS). The presence of omental cake, extensive peritoneal carcinomatosis, diaphragmatic confluent carcinomatosis, bowel infiltration, stomach and/or spleen and/or lesser omentum infiltration, and superficial liver metastases was evaluated by staging laparoscopy (S-LPS) in a consecutive series of 234 women with newly diagnosed AEOC receiving laparotomic PDS after S-LPS. Parameters showing specificity ≥75%, PPV ≥50%, and NPV ≥50% received a score of 1 point, with an additional point for an accuracy ≥60% in predicting incomplete cytoreduction. The overall discriminating performance of the LPS-PI was finally estimated by ROC curve analysis. No gross residual disease at PDS was achieved in 135 cases (57.5%). Among them, UAS was required in 72 cases (53.3%), for a total of 112 procedures, and around 25% of these patients received bowel resection, excluding recto-sigmoid resection. We observed very high overall agreement between S-LPS and laparotomic findings, ranging from 74.7% for omental cake to 94.8% for stomach infiltration. At an LPS-PI value ≥10 the chance of achieving complete PDS was 0, and the risk of unnecessary laparotomy was 33.2%. The discriminating performance of the LPS-PI was very high (AUC=0.885). S-LPS is confirmed as an accurate tool in the prediction of complete PDS in women with AEOC. The updated LPS-PI showed improved discriminating performance, with a lower rate of inappropriate laparotomic explorations at the established cut-off value of 10. Copyright © 2015 Elsevier Inc. All rights reserved.

  18. Towards a Multi Business Model Innovation Model

    DEFF Research Database (Denmark)

    Lindgren, Peter; Jørgensen, Rasmus

    2012-01-01

    This paper studies the evolution of business model (BM) innovation related to a multi business model framework. The paper addresses the research questions: • What are the requirements for a multi business model innovation model (BMIM)? • What should a multi business model innovation model look like? Different generations of BMIMs are first studied to lay the baseline for what the next-generation multi-BM innovation model (BMIM) should look like. All generations of models are analyzed with the purpose of comparing the characteristics and challenges of previous...

  19. Better Language Models with Model Merging

    CERN Document Server

    Brants, T

    1996-01-01

    This paper investigates model merging, a technique for deriving Markov models from text or speech corpora. Models are derived by starting with a large and specific model and by successively combining states to build smaller and more general models. We present methods to reduce the time complexity of the algorithm and report on experiments on deriving language models for a speech recognition task. The experiments show the advantage of model merging over the standard bigram approach. The merged model assigns a lower perplexity to the test set and uses considerably fewer states.
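
    The state-merging idea can be miniaturized: treat each word history of a bigram model as a state, merge two states by pooling their outgoing counts, and measure perplexity. The toy below is illustrative, not the paper's algorithm; it shows the merged model becoming smaller and more general, with training-set perplexity rising accordingly (the paper's gain, a lower perplexity, is measured on held-out test data):

    ```python
    import math
    from collections import defaultdict

    def train(corpus):
        """Bigram counts: one Markov state per preceding word."""
        counts = defaultdict(lambda: defaultdict(int))
        for prev, nxt in zip(corpus, corpus[1:]):
            counts[prev][nxt] += 1
        return counts

    def merge(counts, s1, s2, merged):
        """Pool the outgoing counts of states s1 and s2 into a single state.

        (A full implementation would also remap transitions *into* s1/s2.)
        """
        pooled = defaultdict(int)
        for s in (s1, s2):
            for w, c in counts[s].items():
                pooled[w] += c
        new = {s: dict(t) for s, t in counts.items() if s not in (s1, s2)}
        new[merged] = dict(pooled)
        return new

    def perplexity(counts, corpus, state_of=lambda w: w):
        logp, n = 0.0, 0
        for prev, nxt in zip(corpus, corpus[1:]):
            trans = counts[state_of(prev)]
            logp += math.log(trans[nxt] / sum(trans.values()))
            n += 1
        return math.exp(-logp / n)

    corpus = "a b a b a b".split()
    full = train(corpus)
    small = merge(full, "a", "b", "ab")           # two states collapsed into one
    pp_full = perplexity(full, corpus)            # 1.0: every transition certain
    pp_small = perplexity(small, corpus, state_of=lambda w: "ab")
    ```

    Successively applying such merges is what derives the "smaller and more general" models the abstract describes.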

  20. Towards Clone Detection in UML Domain Models

    DEFF Research Database (Denmark)

    Störrle, Harald

    2013-01-01

    Code clones (i.e., duplicate fragments of code) have been studied for a long time, and there is strong evidence that they are a major source of software faults. Anecdotal evidence suggests that this phenomenon occurs similarly in models, suggesting that model clones are as detrimental to model quality as they are to code quality. However, programming language code and visual models have significant differences that make it difficult to directly transfer notions and algorithms developed in the code clone arena to model clones. In this article, we develop and propose a definition of the notion of “model clone” based on a thorough analysis of practical scenarios. We propose a formal definition of model clones, specify a clone detection algorithm for UML domain models, and implement it prototypically. We investigate different similarity heuristics to be used in the algorithm, and report the performance of our approach. While…

  1. Managing Analysis Models in the Design Process

    Science.gov (United States)

    Briggs, Clark

    2006-01-01

    Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.

  2. Model correction factor method for system analysis

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager; Johannesen, Johannes M.

    2000-01-01

    The Model Correction Factor Method (MCFM) is an intelligent response surface method based on simplified modeling. MCFM is aimed at reliability analysis in the case of a limit state defined by an elaborate model. Herein it is demonstrated that the method is applicable for elaborate limit state surfaces on which several locally most central points exist without there being a simple geometric definition of the corresponding failure modes, such as is the case for collapse mechanisms in rigid plastic hinge models for frame structures. Taking as simplified idealized model a model of similarity with the elaborate model… surface than existing in the idealized model…

  3. Model Selection Principles in Misspecified Models

    CERN Document Server

    Lv, Jinchi

    2010-01-01

    Model selection is of fundamental importance to high dimensional modeling featured in many contemporary applications. Classical principles of model selection include the Kullback-Leibler divergence principle and the Bayesian principle, which lead to the Akaike information criterion and Bayesian information criterion when models are correctly specified. Yet model misspecification is unavoidable when we have no knowledge of the true model or when we have the correct family of distributions but miss some true predictor. In this paper, we propose a family of semi-Bayesian principles for model selection in misspecified models, which combine the strengths of the two well-known principles. We derive asymptotic expansions of the semi-Bayesian principles in misspecified generalized linear models, which give the new semi-Bayesian information criteria (SIC). A specific form of SIC admits a natural decomposition into the negative maximum quasi-log-likelihood, a penalty on model dimensionality, and a penalty on model miss...
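
    The decomposition mentioned in the final sentence can be rendered schematically as follows; the symbols are our own shorthand for the three terms named in the abstract, not the paper's notation:

```latex
\operatorname{SIC}(M) \;=\;
\underbrace{-\,\ell^{Q}_{n}\!\bigl(\widehat{\beta}_{M}\bigr)}_{\text{neg.\ max.\ quasi-log-likelihood}}
\;+\;\underbrace{\gamma_{1}\,|M|}_{\text{dimensionality penalty}}
\;+\;\underbrace{\gamma_{2}\,\widehat{D}_{n}(M)}_{\text{misspecification penalty}}
```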

  4. Kinetic modeling of light limitation and sulfur deprivation effects in the induction of hydrogen production with Chlamydomonas reinhardtii. Part II: Definition of model-based protocols and experimental validation.

    Science.gov (United States)

    Degrenne, B; Pruvost, J; Titica, M; Takache, H; Legrand, J

    2011-10-01

    Photosynthetic hydrogen production under light by the green microalga Chlamydomonas reinhardtii was investigated in a torus-shaped PBR in sulfur-deprived conditions. Culture conditions, represented by the dry biomass concentration of the inoculum, sulfate concentration, and incident photon flux density (PFD), were optimized based on a previously published model (Fouchard et al., 2009. Biotechnol Bioeng 102:232-245). This allowed a strictly autotrophic production, whereas the sulfur-deprived protocol is usually applied in photoheterotrophic conditions. Experimental results combined with additional information from kinetic simulations emphasize effects of sulfur deprivation and light attenuation in the PBR in inducing anoxia and hydrogen production. A broad range of PFD was tested (up to 500 µmol photons m⁻² s⁻¹). Maximum hydrogen productivities were 1.0 ± 0.2 mL H₂/h/L (or 25 ± 5 mL H₂/m² h) and 3.1 ± 0.4 mL H₂/h/L (or 77.5 ± 10 mL H₂/m² h), at 110 and 500 µmol photons m⁻² s⁻¹, respectively. These values approached a maximum specific productivity of approximately 1.9 ± 0.4 mL H₂/h/g of biomass dry weight, clearly indicative of a limitation in cell capacity to produce hydrogen. The efficiency of the process and further optimizations are discussed.

  5. Imposing causality on a matrix model

    Energy Technology Data Exchange (ETDEWEB)

    Benedetti, Dario [Perimeter Institute for Theoretical Physics, 31 Caroline St. N, N2L 2Y5, Waterloo ON (Canada)], E-mail: dbenedetti@perimeterinstitute.ca; Henson, Joe [Perimeter Institute for Theoretical Physics, 31 Caroline St. N, N2L 2Y5, Waterloo ON (Canada)

    2009-07-13

    We introduce a new matrix model that describes Causal Dynamical Triangulations (CDT) in two dimensions. In order to do so, we introduce a new, simpler definition of 2D CDT and show it to be equivalent to the old one. The model makes use of ideas from dually weighted matrix models, combined with multi-matrix models, and can be studied by the method of character expansion.

  6. TWO REGRESSION CREDIBILITY MODELS

    Directory of Open Access Journals (Sweden)

    Constanţa-Nicoleta BODEA

    2010-03-01

    In this communication we will discuss two regression credibility models from Non-Life Insurance Mathematics that can be solved by means of matrix theory. In the first regression credibility model, starting from a well-known representation formula of the inverse for a special class of matrices, a risk premium will be calculated for a contract with risk parameter θ. In the next regression credibility model, we will obtain a credibility solution in the form of a linear combination of the individual estimate (based on the data of a particular state) and the collective estimate (based on aggregate USA data). To illustrate the solution with the properties mentioned above, we shall need the well-known representation theorem for a special class of matrices, the properties of the trace of a square matrix, the scalar product of two vectors, the norm with respect to a positive definite matrix given in advance, and the mathematical properties of conditional expectations and conditional covariances.
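
    The linear-combination solution described above has the familiar credibility form. The sketch below uses the textbook Buhlmann weight Z = n/(n + k) purely for illustration; the paper itself derives the weights from a regression model via matrix identities:

```python
# Sketch of the credibility idea behind both models: the premium is a
# linear combination of the individual estimate (one state's data) and the
# collective estimate (aggregate data), weighted by a credibility factor Z.
# The Buhlmann form Z = n / (n + k) is an illustrative stand-in.

def credibility_premium(individual, collective, n, k):
    z = n / (n + k)                      # credibility weight in [0, 1)
    return z * individual + (1 - z) * collective

# A state with 3 years of its own experience, against aggregate USA data:
premium = credibility_premium(individual=120.0, collective=100.0, n=3, k=1)
print(premium)  # 115.0
```

    With more individual data (larger n) the weight Z approaches 1 and the premium leans on the state's own experience; with none it falls back to the collective estimate.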

  7. The IMACLIM model; Le modele IMACLIM

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2003-07-01

    This document provides annexes to the IMACLIM model, offering an updated description of IMACLIM, a model designed as a tool for evaluating greenhouse gas reduction policies. The model is described in a version coupled with POLES, a technical and economic model of the energy industry. Notations, equations, sources, processing and specifications are proposed and detailed. (A.L.B.)

  8. Building Mental Models by Dissecting Physical Models

    Science.gov (United States)

    Srivastava, Anveshna

    2016-01-01

    When students build physical models from prefabricated components to learn about model systems, there is an implicit trade-off between the physical degrees of freedom in building the model and the intensity of instructor supervision needed. Models that are too flexible, permitting multiple possible constructions require greater supervision to…

  10. Modelling live forensic acquisition

    CSIR Research Space (South Africa)

    Grobler, MM

    2009-06-01

    This paper discusses the development of a South African model for Live Forensic Acquisition - Liforac. The Liforac model is a comprehensive model that presents a range of aspects related to Live Forensic Acquisition. The model provides forensic…

  11. Continuous Time Model Estimation

    OpenAIRE

    Carl Chiarella; Shenhuai Gao

    2004-01-01

    This paper introduces an easy to follow method for continuous time model estimation. It serves as an introduction on how to convert a state space model from continuous time to discrete time, how to decompose a hybrid stochastic model into a trend model plus a noise model, how to estimate the trend model by simulation, and how to calculate standard errors from estimation of the noise model. It also discusses the numerical difficulties involved in discrete time models that bring about the unit ...
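
    The continuous-to-discrete conversion step described above can be sketched for the simplest scalar case; the function name and parameter values below are illustrative only, assuming a linear stochastic model of the form dx = a·x dt + s·dW:

```python
import math
import random

# For the scalar linear SDE  dx = a*x dt + s*dW  (Ornstein-Uhlenbeck-type
# when a < 0), the exact discrete-time model over a sampling interval dt
# is an AR(1) process:  x[k+1] = F*x[k] + e[k],  e[k] ~ N(0, Q).

def discretize(a, s, dt):
    F = math.exp(a * dt)                                   # state transition
    Q = s**2 * (math.exp(2 * a * dt) - 1) / (2 * a)        # noise variance
    return F, Q

F, Q = discretize(a=-0.5, s=1.0, dt=0.1)

x = 1.0
for _ in range(100):                  # simulate the discrete-time model
    x = F * x + random.gauss(0.0, math.sqrt(Q))
```

    The same F and Q appear in the state-space form used for maximum likelihood estimation, e.g. via a Kalman filter over the sampled data.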

  12. Implementation of WPDL Conforming Workflow Model

    Institute of Scientific and Technical Information of China (English)

    张志君; 范玉顺

    2003-01-01

    Workflow process definition language (WPDL) facilitates the transfer of workflow process definitions between separate workflow products. However, much work is still needed to transfer the specific workflow model to a WPDL conforming model. CIMFlow is a workflow management system developed by the National CIMS Engineering Research Center. This paper discusses the methods by which the CIMFlow model conforms to the WPDL meta-model and the differences between the WPDL meta-model and the CIMFlow model. Some improvements are proposed for the WPDL specification. Finally, the mapping and translating methods between the entities and attributes are given for the two models. The proposed methods and improvements are valuable as a reference for other mapping applications and the WPDL specification.

  13. Comparative Protein Structure Modeling Using MODELLER.

    Science.gov (United States)

    Webb, Benjamin; Sali, Andrej

    2016-06-20

    Comparative protein structure modeling predicts the three-dimensional structure of a given protein sequence (target) based primarily on its alignment to one or more proteins of known structure (templates). The prediction process consists of fold assignment, target-template alignment, model building, and model evaluation. This unit describes how to calculate comparative models using the program MODELLER and how to use the ModBase database of such models, and discusses all four steps of comparative modeling, frequently observed errors, and some applications. Modeling lactate dehydrogenase from Trichomonas vaginalis (TvLDH) is described as an example. The download and installation of the MODELLER software is also described. © 2016 by John Wiley & Sons, Inc.

  14. Adaptive response modelling

    Science.gov (United States)

    Campa, Alessandro; Esposito, Giuseppe; Belli, Mauro

    Cellular response to radiation is often modified by a previous delivery of a small "priming" dose: a smaller amount of damage, defined by the end point being investigated, is observed, and for this reason the effect is called adaptive response. An improved understanding of this effect is essential (as much as for the case of the bystander effect) for a reliable radiation risk assessment when low dose irradiations are involved. Experiments on adaptive response have shown that there are a number of factors that strongly influence the occurrence (and the level) of the adaptation. In particular, priming doses and dose rates have to fall in defined ranges; the same is true for the time interval between the delivery of the small priming dose and the irradiation with the main, larger, dose (called in this case the challenging dose). Different hypotheses can be formulated on the main mechanism(s) determining the adaptive response: an increased efficiency of DNA repair, an increased level of antioxidant enzymes, an alteration of cell cycle progression, or a chromatin conformation change. Clear-cut experimental evidence pointing definitely in the direction of one of these explanations is not yet available. Modelling can be done at different levels. Simple models, relating the amount of damage, through elementary differential equations, to the dose and dose rate experienced by the cell, are relatively easy to handle, and they can be modified to account for the priming irradiation. However, this can hardly be of decisive help in the explanation of the mechanisms, since each parameter of these models often incorporates in an effective way several cellular processes related to the response to radiation. In this presentation we show our attempts to describe adaptive response with models that explicitly contain, as a dynamical variable, the inducible adaptive agent. At the price of a more difficult treatment, this approach is more likely to give support to the experimental studies.

  15. Towards a Formal Model of Context Awareness

    DEFF Research Database (Denmark)

    Kjærgaard, Mikkel Baun; Bunde-Pedersen, Jonathan

    2006-01-01

    There is a definite lack of formal support for modeling realistic context-awareness in pervasive computing applications. The CONAWA calculus presented in this paper provides mechanisms for modeling complex and interwoven sets of context-information by extending ambient calculus with new constructs...

  16. Model Uncertainty for Bilinear Hysteretic Systems

    DEFF Research Database (Denmark)

    1984-01-01

    is related to the concept of a failure surface (or limit state surface) in the n-dimensional basic variable space then model uncertainty is at least due to the neglected variables, the modelling of the failure surface and the computational technique used. A more precise definition is given in section 2...

  17. A Formal Model for Context-Awareness

    DEFF Research Database (Denmark)

    Kjærgaard, Mikkel Baun; Bunde-Pedersen, Jonathan

    There is a definite lack of formal support for modeling realistic context-awareness in pervasive computing applications. The Conawa calculus presented in this paper provides mechanisms for modeling complex and interwoven sets of context-information by extending ambient calculus with new constructs…

  18. A study on intercultural competence definitions and intercultural competence models

    Institute of Scientific and Technical Information of China (English)

    周玲

    2016-01-01

    Accelerating globalization has made intercultural competence increasingly important for people living in a multicultural society, especially for students in higher education who will be competing in the globalized international market. Intercultural competence has become a shared focus of academia and education. This essay first critically discusses and analyzes various definitions of intercultural competence in the West, and then explores Byram's influential model of intercultural competence in a critical way.

  19. Research on Concept Definition and Scenario Models of Resilient Affordable Housing Communities

    Institute of Scientific and Technical Information of China (English)

    李德智; 吴洁; 杨钧

    2016-01-01

    The idea of resilience is introduced into the construction and management of affordable housing communities in order to deal with uncertain disturbances and achieve their sustainable development. Based on the definitions of resilience, resilient communities and affordable housing communities, the paper defines the concept of a resilient affordable housing community. It then builds scenario models of resilient affordable housing communities covering technical, organizational, social and economic resilience, and finally proposes corresponding suggestions for constructing resilient affordable housing communities.

  20. E-Learning Security Models

    Directory of Open Access Journals (Sweden)

    Vladimir I. Zuev

    2012-06-01

    The article looks into methods and models that are useful when analyzing the risks and vulnerabilities of complex e-learning systems in an emergency management context. Definitions of vulnerability and emergency response capabilities, such as "VLE/PLE attack surface", are suggested. The article provides insight into some of the issues related to analysis of risks and vulnerabilities of e-learning systems, but more research is needed to address this difficult and comprehensive task.

  1. Integrated modeling of European migration

    OpenAIRE

    Raymer, James; Wiśniowski, Arkadiusz; Forster, Jonathan J.; Peter W. F. Smith; Bijak, Jakub

    2013-01-01

    International migration data in Europe are collected by individual countries with separate collection systems and designs. As a result, reported data are inconsistent in availability, definition and quality. In this paper, we propose a Bayesian model to overcome the limitations of the various data sources. The focus is on estimating recent international migration flows amongst 31 countries in the European Union and European Free Trade Association from 2002 to 2008, using data collated by Euro...

  2. Concept Modeling vs. Data modeling in Practice

    DEFF Research Database (Denmark)

    Madsen, Bodil Nistrup; Erdman Thomsen, Hanne

    2015-01-01

    This chapter shows the usefulness of terminological concept modeling as a first step in data modeling. First, we introduce terminological concept modeling with terminological ontologies, i.e. concept systems enriched with characteristics modeled as feature specifications. This enables a formal account of the inheritance of characteristics and allows us to introduce a number of principles and constraints which render concept modeling more coherent than earlier approaches. Second, we explain how terminological ontologies can be used as the basis for developing conceptual and logical data models…

  3. Towards an E-market Model

    DEFF Research Database (Denmark)

    Ivang, Reimer; Hinson, Robert; Somasundaram, Ramanathan

    2006-01-01

    Purpose: Seeks to argue that there are problems associated with e-market definitional efforts and consequently proposes a new e-market model. Design/methodology/approach: Paper based largely on a literature survey and an assessment of the existing e-market conceptualizations. Findings: Based on the literature survey and identification of gaps in the present e-market definitional models, the authors postulate a preliminary e-market reference model. Originality/value: Through synthesizing the e-market literature, and by taking into account contemporary e-market developments, key dimensions that define an e-market are identified and explained…

  4. A Method for Model Checking Feature Interactions

    DEFF Research Database (Denmark)

    Pedersen, Thomas; Le Guilly, Thibaut; Ravn, Anders Peter;

    2015-01-01

    This paper presents a method to check for feature interactions in a system assembled from independently developed concurrent processes, as found in many reactive systems. The method combines and refines existing definitions and adds a set of activities. The activities describe how to populate the definitions with models to ensure that all interactions are captured. The method is illustrated on a home automation example with model checking as the analysis tool. In particular, the modelling formalism is timed automata and the analysis uses UPPAAL to find interactions.

  5. Standard-model coupling constants from compositeness

    CERN Document Server

    Besprosvany, J

    2003-01-01

    A coupling-constant definition is given based on the compositeness property of some particle states with respect to the elementary states of other particles. It is applied in the context of the vector-spin-1/2-particle interaction vertices of a field theory, and the standard model. The definition reproduces Weinberg's angle in a grand-unified theory. One obtains coupling values close to the experimental ones for appropriate configurations of the standard-model vector particles, at the unification scale within grand-unified models, and at the electroweak breaking scale.

  6. Business model: unveiling the construct

    Directory of Open Access Journals (Sweden)

    Cyntia Vilasboas Calixto

    2015-09-01

    This essay was developed based on a systematic literature review to identify the main definitions of business model as well as the elements that compose this construct. We analyzed 81 papers published in journals with scores above 1.5 according to Journal Citation Report (JCR) standards. We found that the relationship between business models and multinational companies has been neglected by researchers and therefore appears as an opportunity for research. Considering that business models describe how a company creates value through a combination of internal and external activities and resources, it is important to understand the design elements of the business model established by the multinational enterprise.

  7. Shell Models of Magnetohydrodynamic Turbulence

    CERN Document Server

    Plunian, Franck; Frick, Peter

    2012-01-01

    Shell models of hydrodynamic turbulence originated in the seventies. Their main aim was to describe the statistics of homogeneous and isotropic turbulence in spectral space, using a simple set of ordinary differential equations. In the eighties, shell models of magnetohydrodynamic (MHD) turbulence emerged, based on the same principles as their hydrodynamic counterpart but also incorporating interactions between magnetic and velocity fields. In recent years, significant improvements have been made, such as the inclusion of non-local interactions and appropriate definitions for helicities. Though shell models cannot account for the spatial complexity of MHD turbulence, their dynamics are not oversimplified and do reflect those of real MHD turbulence, including intermittency or chaotic reversals of large-scale modes. Furthermore, these models use realistic values for dimensionless parameters (high kinetic and magnetic Reynolds numbers, low or high magnetic Prandtl number), allowing extended inertial range and accu…

  8. International Symposia on Scale Modeling

    CERN Document Server

    Ito, Akihiko; Nakamura, Yuji; Kuwana, Kazunori

    2015-01-01

    This volume thoroughly covers scale modeling and serves as the definitive source of information on scale modeling as a powerful simplifying and clarifying tool used by scientists and engineers across many disciplines. The book elucidates techniques used when it would be too expensive, or too difficult, to test a system of interest in the field. Topics addressed in the current edition include scale modeling to study weather systems, diffusion of pollution in air or water, chemical processes in 3-D turbulent flow, multiphase combustion, flame propagation, biological systems, behavior of materials at nano- and micro-scales, and many more. This is an ideal book for students, both graduate and undergraduate, as well as engineers and scientists interested in the latest developments in scale modeling. This book also enables readers to evaluate essential and salient aspects of profoundly complex systems, mechanisms, and phenomena at scale, and offers engineers and designers a new point of view, liberating creative and inno…

  9. Network model of security system

    Directory of Open Access Journals (Sweden)

    Adamczyk Piotr

    2016-01-01

    The article presents the concept of building a network security model and its application in the process of risk analysis. It indicates the possibility of a new definition of the role of network models in safety analysis. Special attention was paid to the development of the use of an algorithm describing the process of identifying the assets, vulnerabilities and threats in a given context. The aim of the article is to present how this algorithm reduces the complexity of the problem by eliminating from the base model those components that have no links with other components; as a result, it was possible to build a real network model corresponding to reality.
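
    The elimination step described above can be sketched as a simple graph-pruning routine; the component names and link list below are illustrative, not taken from the article:

```python
# Sketch of the pruning step: components (assets) with no links to any
# other component are removed from the base model, shrinking the network
# that the risk analysis has to consider.

def prune_isolated(components, links):
    linked = {c for pair in links for c in pair}        # endpoints of any link
    return [c for c in components if c in linked]

components = ["web server", "database", "old printer", "firewall"]
links = [("web server", "database"), ("firewall", "web server")]

print(prune_isolated(components, links))
# ['web server', 'database', 'firewall']
```

    The isolated asset ("old printer") drops out; only the connected subgraph remains in the working security model.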

  10. Business Model Innovation

    OpenAIRE

    Dodgson, Mark; Gann, David; Phillips, Nelson; Massa, Lorenzo; Tucci, Christopher

    2014-01-01

    The chapter offers a broad review of the literature at the nexus between Business Models and innovation studies, and examines the notion of Business Model Innovation in three different situations: Business Model Design in newly formed organizations, Business Model Reconfiguration in incumbent firms, and Business Model Innovation in the broad context of sustainability. Tools and perspectives to make sense of Business Models and support managers and entrepreneurs in dealing with Business Model ...

  11. Modeling cholera outbreaks.

    Science.gov (United States)

    Chao, Dennis L; Longini, Ira M; Morris, J Glenn

    2014-01-01

    Mathematical modeling can be a valuable tool for studying infectious disease outbreak dynamics and simulating the effects of possible interventions. Here, we describe approaches to modeling cholera outbreaks and how models have been applied to explore intervention strategies, particularly in Haiti. Mathematical models can play an important role in formulating and evaluating complex cholera outbreak response options. Major challenges to cholera modeling are insufficient data for calibrating models and the need to tailor models for different outbreak scenarios.

  12. Modeling cholera outbreaks

    Science.gov (United States)

    Longini, Ira M.; Morris, J. Glenn

    2014-01-01

    Mathematical modeling can be a valuable tool for studying infectious disease outbreak dynamics and simulating the effects of possible interventions. Here, we describe approaches to modeling cholera outbreaks and how models have been applied to explore intervention strategies, particularly in Haiti. Mathematical models can play an important role in formulating and evaluating complex cholera outbreak response options. Major challenges to cholera modeling are insufficient data for calibrating models and the need to tailor models for different outbreak scenarios. PMID:23412687

  13. Model Manipulation for End-User Modelers

    DEFF Research Database (Denmark)

    Acretoaie, Vlad

    End-user modelers are domain experts who create and use models as part of their work. They are typically not Software Engineers, and have little or no programming and meta-modeling experience. However, using model manipulation languages developed in the context of Model-Driven Engineering often requires such experience. These languages are therefore only used by a small subset of the modelers that could, in theory, benefit from them. The goals of this thesis are to substantiate this observation, introduce the concepts and tools required to overcome it, and provide empirical evidence in support of these proposals. To achieve its first goal, the thesis presents the findings of a Systematic Mapping Study showing that human factors topics are scarcely and relatively poorly addressed in model transformation research. Motivated by these findings, the thesis explores the requirements of end-user modelers…

  14. Air Quality Dispersion Modeling - Alternative Models

    Science.gov (United States)

    Models, not listed in Appendix W, that can be used in regulatory applications with case-by-case justification to the Reviewing Authority as noted in Section 3.2, Use of Alternative Models, in Appendix W.

  15. JPI UML Software Modeling

    Directory of Open Access Journals (Sweden)

    Cristian Vidal Silva

    2015-12-01

    Aspect-Oriented Programming (AOP) extends object-oriented programming (OOP) with aspects to modularize crosscutting behavior on classes, advising base code at the occurrence of join points according to pointcut rule definitions. However, join points introduce dependencies between aspects and base code, a major obstacle to achieving truly independent development of software modules. Join Point Interfaces (JPI) represent join points as interfaces between classes and aspects, so that these modules do not depend on each other. Nevertheless, like AOP, JPI is a programming methodology; for a complete aspect-oriented software development process, it is necessary to define JPI requirements and JPI modeling phases. Toward that goal, this article proposes JPI UML class and sequence diagrams for modeling JPI software solutions. A purpose of these diagrams is to facilitate understanding of the structure and behavior of JPI programs. As an application example, this article applies the proposed JPI UML diagrams to a case study and analyzes the associated JPI code to demonstrate their coherence.

  16. From Product Models to Product State Models

    DEFF Research Database (Denmark)

    Larsen, Michael Holm

    1999-01-01

    A well-known technology designed to handle product data is Product Models. Product Models are in their current form not able to handle all types of product state information. Hence, the concept of a Product State Model (PSM) is proposed. The PSM, and in particular how to model a PSM, is the research object of this project. In the presentation, benefits and challenges of the PSM will be presented as a basis for the discussion.

  17. Measurement and Modeling: Infectious Disease Modeling

    NARCIS (Netherlands)

    Kretzschmar, MEE

    2016-01-01

    After some historical remarks about the development of mathematical theory for infectious disease dynamics we introduce a basic mathematical model for the spread of an infection with immunity. The concepts of the model are explained and the model equations are derived from first principles. Using th
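
    A minimal sketch of the kind of basic model described here: the classic SIR equations for an infection conferring immunity, integrated with simple Euler steps (the parameter values below are illustrative, not from the chapter):

```python
# SIR model for the spread of an infection with immunity, as fractions of
# a closed population: susceptible s, infectious i, removed (immune) r.

def sir(beta, gamma, s0, i0, r0, dt, steps):
    s, i, r = s0, i0, r0
    for _ in range(steps):
        new_inf = beta * s * i * dt       # mass-action incidence
        new_rec = gamma * i * dt          # recovery with immunity
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
    return s, i, r

# beta/gamma = 3, i.e. a basic reproduction number R0 of 3:
s, i, r = sir(beta=0.3, gamma=0.1, s0=0.99, i0=0.01, r0=0.0, dt=0.1, steps=2000)
# The epidemic burns out leaving a positive susceptible fraction and a
# majority of the population in the removed (immune) class.
```

    With R0 above 1 the infectious fraction first grows, peaks, and then declines as susceptibles are depleted, the qualitative behavior such models are introduced to explain.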

  18. Modelling of Hydraulic Robot

    DEFF Research Database (Denmark)

    Madsen, Henrik; Zhou, Jianjun; Hansen, Lars Henrik

    1997-01-01

    This paper describes a case study of identifying the physical model (or the grey box model) of a hydraulic test robot. The obtained model is intended to provide a basis for model-based control of the robot. The physical model is formulated in continuous time and is derived by application of the laws of physics on the system. The unknown (or uncertain) parameters are estimated with Maximum Likelihood (ML) parameter estimation. The identified model has been evaluated by comparing the measurements with simulation of the model. The identified model was much more capable of describing the dynamics of the system than the deterministic model.

  19. Product and Process Modelling

    DEFF Research Database (Denmark)

    Cameron, Ian T.; Gani, Rafiqul

    This book covers the area of product and process modelling via a case study approach. It addresses a wide range of modelling applications with emphasis on modelling methodology and the subsequent in-depth analysis of mathematical models to gain insight via structural aspects of the models. These ...

  20. "Bohr's Atomic Model."

    Science.gov (United States)

    Willden, Jeff

    2001-01-01

    "Bohr's Atomic Model" is a small interactive multimedia program that introduces the viewer to a simplified model of the atom. This interactive simulation lets students build an atom using an atomic construction set. The underlying design methodology for "Bohr's Atomic Model" is model-centered instruction, which means the central model of the…

  1. Modelling of Hydraulic Robot

    DEFF Research Database (Denmark)

    Madsen, Henrik; Zhou, Jianjun; Hansen, Lars Henrik

    1997-01-01

    This paper describes a case study of identifying the physical model (or the grey box model) of a hydraulic test robot. The obtained model is intended to provide a basis for model-based control of the robot. The physical model is formulated in continuous time and is derived by application...

  2. Forest-fire models

    Science.gov (United States)

    Haiganoush Preisler; Alan Ager

    2013-01-01

    For applied mathematicians forest fire models refer mainly to a non-linear dynamic system often used to simulate spread of fire. For forest managers forest fire models may pertain to any of the three phases of fire management: prefire planning (fire risk models), fire suppression (fire behavior models), and postfire evaluation (fire effects and economic models). In...

  3. Wind Shear Modeling for Aircraft Hazard Definition.

    Science.gov (United States)

    1978-02-01

    ...involved is too complicated to present in detail in this report and the reader is referred to Reference [2-15]. There is considerable encouragement

  4. Wind Shear Modeling for Aircraft Hazard Definition

    Science.gov (United States)

    1977-03-01

    Fichtl, "Rough to Smooth Transition of an Equilibrium Neutral Constant Stress Layer," NASA TM X-3322 (1975). Geiger, Rudolf, The Climate Near the... Roy Steiner and K. G. Pratt, "Dynamic Response of Airplanes to Atmospheric Turbulence Including Flight Data on Input and Response," NASA TR R-199

  5. Solicited abstract: Global hydrological modeling and models

    Science.gov (United States)

    Xu, Chong-Yu

    2010-05-01

    The origins of rainfall-runoff modeling in the broad sense can be found in the middle of the 19th century, arising in response to three types of engineering problems: (1) urban sewer design, (2) land reclamation drainage systems design, and (3) reservoir spillway design. Since then numerous empirical, conceptual and physically-based models have been developed, including event-based models using the unit hydrograph concept, Nash's linear reservoir models, the HBV model, TOPMODEL, the SHE model, etc. From the late 1980s, the evolution of global and continental-scale hydrology has placed new demands on hydrologic modellers. The macro-scale (global and regional scale) hydrological models were developed on the basis of the following motivations (Arnell, 1999). First, for a variety of operational and planning purposes, water resource managers responsible for large regions need to estimate the spatial variability of resources over large areas, at a spatial resolution finer than can be provided by observed data alone. Second, hydrologists and water managers are interested in the effects of land-use and climate variability and change over a large geographic domain. Third, there is an increasing need to use hydrologic models as a basis for estimating point and non-point sources of pollution loading to streams. Fourth, hydrologists and atmospheric modellers have perceived weaknesses in the representation of hydrological processes in regional and global climate models, and have developed global hydrological models to overcome these weaknesses. Considerable progress in the development and application of global hydrological models has been achieved to date; however, large uncertainties still exist in the model structure (including large-scale flow routing), parameterization, input data, etc. 
This presentation will focus on the global hydrological models, and the discussion includes (1) types of global hydrological models, (2) procedure of global hydrological model development

  6. Bayesian Model Selection and Statistical Modeling

    CERN Document Server

    Ando, Tomohiro

    2010-01-01

    Bayesian model selection is a fundamental part of the Bayesian statistical modeling process. The quality of these solutions usually depends on the goodness of the constructed Bayesian model. Realizing how crucial this issue is, many researchers and practitioners have been extensively investigating the Bayesian model selection problem. This book provides comprehensive explanations of the concepts and derivations of the Bayesian approach for model selection and related criteria, including the Bayes factor, the Bayesian information criterion (BIC), the generalized BIC, and the pseudo marginal lik
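As a concrete illustration of the Bayesian information criterion mentioned above (a minimal sketch, not taken from the book; the data-generating model and all values are assumed), polynomial regression models of increasing degree can be scored with BIC = n·ln(RSS/n) + k·ln(n), trading goodness of fit against the number of fitted coefficients:

```python
import numpy as np

# Illustrative sketch (not from the book): the true model is quadratic, and
# BIC = n*ln(RSS/n) + k*ln(n) is used to pick the polynomial degree, where
# k counts the fitted coefficients. Lower BIC is better.

rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 400)
y = 1.0 + 2.0 * x + 3.0 * x**2 + rng.normal(0.0, 0.1, x.size)

def bic(degree):
    coeffs = np.polyfit(x, y, degree)
    rss = np.sum((np.polyval(coeffs, x) - y) ** 2)
    n, k = x.size, degree + 1
    return n * np.log(rss / n) + k * np.log(n)

bics = {d: bic(d) for d in range(1, 6)}
best = min(bics, key=bics.get)  # the quadratic should win or nearly win
```

The underfitting linear model is penalized through its large residual sum of squares, while higher-degree models pay the k·ln(n) complexity penalty for negligible gains in fit.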

  7. From Numeric Models to Granular System Modeling

    Directory of Open Access Journals (Sweden)

    Witold Pedrycz

    2015-03-01

    To make this study self-contained, we briefly recall the key concepts of granular computing and demonstrate how this conceptual framework and its algorithmic fundamentals give rise to granular models. We discuss several representative formal setups used in describing and processing information granules, including fuzzy sets, rough sets, and interval calculus. The key architectures of the models dwell upon relationships among information granules. We demonstrate how information granularity and its optimization can be regarded as an important design asset to be exploited in system modeling, giving rise to granular models. In this regard, an important category of rule-based models, along with their granular enrichments, is studied in detail.

  8. Multivariable Wind Modeling in State Space

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Pedersen, B. J.

    2011-01-01

    Turbulence of the incoming wind field is of paramount importance to the dynamic response of wind turbines. Hence reliable stochastic models of the turbulence should be available from which time series can be generated for dynamic response and structural safety analysis. In the paper an empirical cross-spectral density function for the along-wind turbulence component over the rotor plane is taken as the starting point. The spectrum is spatially discretized in terms of a Hermitian cross-spectral density matrix for the turbulence state vector, which turns out not to be positive definite. Since the succeeding state space and ARMA modeling of the turbulence rely on the positive definiteness of the cross-spectral density matrix, the problem with the non-positive definiteness of such matrices is first addressed and suitable treatments are proposed. From the adjusted positive definite cross...
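The positive-definiteness problem described above can be illustrated with one common generic remedy (a sketch of a standard treatment, not necessarily the adjustment used in the paper): clip the negative eigenvalues of the Hermitian matrix and reassemble it from its eigendecomposition.

```python
import numpy as np

# Generic illustration: a sampled Hermitian cross-spectral density matrix may
# fail to be positive semidefinite. Clipping negative eigenvalues to zero
# yields the nearest PSD matrix in Frobenius norm.

def make_psd(S, eps=0.0):
    """Return a positive semidefinite approximation of Hermitian S."""
    w, V = np.linalg.eigh(S)           # real eigenvalues for Hermitian input
    w_clipped = np.clip(w, eps, None)  # remove the negative spectrum
    return (V * w_clipped) @ V.conj().T

# A symmetric matrix with one negative eigenvalue (eigenvalues 3 and -1).
S = np.array([[1.0, 2.0],
              [2.0, 1.0]])
S_psd = make_psd(S)
eigs = np.linalg.eigvalsh(S_psd)
```

Here the negative eigenvalue -1 is clipped to zero, so the repaired matrix becomes the rank-one matrix [[1.5, 1.5], [1.5, 1.5]] with a nonnegative spectrum.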

  9. Project TANDEM (Tsunamis in the Atlantic and the English ChaNnel: Definition of the Effects through numerical Modeling) (2014-2018): a French initiative to draw lessons from the Tohoku-oki tsunami on French coastal nuclear facilities

    Science.gov (United States)

    Hébert, Hélène; Abadie, Stéphane; Benoit, Michel; Créach, Ronan; Frère, Antoine; Gailler, Audrey; Garzaglia, Sébastien; Hayashi, Yutaka; Loevenbruck, Anne; Macary, Olivier; Marcer, Richard; Morichon, Denis; Pedreros, Rodrigo; Rebour, Vincent; Ricchiuto, Mario; Silva Jacinto, Ricardo; Terrier, Monique; Toucanne, Samuel; Traversa, Paola; Violeau, Damien

    2014-05-01

    TANDEM (Tsunamis in the Atlantic and the English ChaNnel: Definition of the Effects through numerical Modeling) is a French research project dedicated to the appraisal of coastal effects due to tsunami waves on the French coastlines, with a special focus on the Atlantic and Channel coastlines, where French civil nuclear facilities have been operating for about 30 years. This project aims at drawing lessons from the 2011 catastrophic tsunami and, together with a Japanese research partner, will allow the design, adaptation and validation of numerical methods of tsunami hazard assessment, using the outstanding database of the 2011 tsunami. The validated methods will then be applied to estimate, as accurately as possible, the tsunami hazard for the French Atlantic and Channel coastlines, in order to provide guidance for risk assessment on the nuclear facilities. The TANDEM project follows the recommendations of the International Atomic Energy Agency (IAEA) to analyse the tsunami exposure of nuclear facilities, as well as the recommendations of the French Nuclear Safety Authority (Autorité de Sûreté Nucléaire, ASN) in the aftermath of the 2011 catastrophe, which required the licensees of nuclear facilities to conduct complementary safety assessments (CSA), including "the robustness beyond their design basis". The tsunami hazard deserves an appraisal in the light of the 2011 catastrophe, to check whether any unforeseen tsunami impact can be expected for these facilities. TANDEM aims at defining the tsunami effects expected for the French Atlantic and Channel coastlines, basically from numerical modeling methods, through adaptation and improvement of numerical methods, in order to study tsunami impacts down to the interaction with coastal structures (thus sometimes using 3D approaches) (WP1). The methods will then be tested to better characterize and quantify the associated uncertainties (in the source, the propagation and the coastal impact) (WP2). The project will

  10. Geologic Framework Model Analysis Model Report

    Energy Technology Data Exchange (ETDEWEB)

    R. Clayton

    2000-12-19

    The purpose of this report is to document the Geologic Framework Model (GFM), Version 3.1 (GFM3.1) with regard to data input, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, the qualification status of the model, and the differences between Version 3.1 and previous versions. The GFM represents a three-dimensional interpretation of the stratigraphy and structural features at the location of the potential Yucca Mountain radioactive waste repository. The GFM encompasses an area of 65 square miles (170 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the GFM were chosen to encompass the most widely distributed set of exploratory boreholes (the Water Table or WT series) and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The GFM was constructed from geologic map and borehole data. Additional information from measured stratigraphy sections, gravity profiles, and seismic profiles was also considered. This interim change notice (ICN) was prepared in accordance with the Technical Work Plan for the Integrated Site Model Process Model Report Revision 01 (CRWMS M&O 2000). The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The GFM is one component of the Integrated Site Model (ISM) (Figure 1), which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components: (1) the Geologic Framework Model (GFM); (2) the Rock Properties Model (RPM); and (3) the Mineralogic Model (MM). 
The ISM merges the detailed project stratigraphy into model stratigraphic units that are most useful for the primary downstream models and the

  11. Model Theory and Applications

    CERN Document Server

    Mangani, P

    2011-01-01

    This title includes: Lectures - G.E. Sacks - Model theory and applications, and H.J. Keisler - Constructions in model theory; and, Seminars - M. Servi - SH formulas and generalized exponential, and J.A. Makowski - Topological model theory.

  12. Wildfire Risk Main Model

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — The model combines three modeled fire behavior parameters (rate of spread, flame length, crown fire potential) and one modeled ecological health measure (fire regime...

  13. Energy modelling software

    CSIR Research Space (South Africa)

    Osburn, L

    2010-01-01

    Full Text Available The construction industry has turned to energy modelling in order to assist them in reducing the amount of energy consumed by buildings. However, while the energy loads of buildings can be accurately modelled, energy models often under...

  14. A Comparative of business process modelling techniques

    Science.gov (United States)

    Tangkawarow, I. R. H. T.; Waworuntu, J.

    2016-04-01

    In this era, there are many business process modelling techniques. This article presents research on the differences between these techniques, explaining the definition and structure of each. It offers a comparative analysis of some popular business process modelling techniques, based on two criteria: notation and how each technique works when implemented in Somerleyton Animal Park. The treatment of each technique ends with its advantages and disadvantages. The final conclusion recommends business process modelling techniques that are easy to use and serves as a basis for evaluating further modelling techniques.

  15. Model correction factor method for system analysis

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager; Johannesen, Johannes M.

    2000-01-01

    several locally most central points exist without there being a simple geometric definition of the corresponding failure modes, such as is the case for collapse mechanisms in rigid plastic hinge models for frame structures. Taking as simplified idealized model a model of similarity with the elaborate model but with clearly defined failure modes, the MCFM can be started from each idealized single-mode limit state in turn to identify a locally most central point on the elaborate limit-state surface. Typically this procedure leads to a fewer number of locally most central failure points on the elaborate limit-state surface than exist in the idealized model.

  16. Performance Appraisal: A New Model for Academic Advisement.

    Science.gov (United States)

    Hazleton, Vincent; Tuttle, George E.

    1981-01-01

    Presents the performance appraisal model for student advisement, a centralized developmental model that focuses on the content and process of advisement. The model has three content objectives: job definition, performance assessment, and goal setting. Operation of the model is described. Benefits and potential limitations are identified. (Author)

  17. Traceability in Model-Based Testing

    Directory of Open Access Journals (Sweden)

    Mathew George

    2012-11-01

    Full Text Available The growing complexities of software and the demand for shorter time to market are two important challenges that face today’s IT industry. These challenges demand the increase of both productivity and quality of software. Model-based testing is a promising technique for meeting these challenges. Traceability modeling is a key issue and challenge in model-based testing. Relationships between the different models will help to navigate from one model to another, and trace back to the respective requirements and the design model when the test fails. In this paper, we present an approach for bridging the gaps between the different models in model-based testing. We propose relation definition markup language (RDML for defining the relationships between models.

  18. A View of Earth System Model Development

    Institute of Scientific and Technical Information of China (English)

    ZHOU Tianjun; YU Yongqiang; WANG Bin

    2009-01-01

    This paper gives a definition of the earth system model and describes its three development phases, namely the physical climate system model, the earth climate system model, and the earth system model, based on an investigation of climate system models around the world. It expounds the strategic significance of the future development of earth system models, introduces some representative scientific research plans on earth system model development at home and abroad, and reviews their status and trends based on the models of the Fourth Assessment Report (AR4) of the Intergovernmental Panel on Climate Change (IPCC). Some suggestions on the future development of earth system models in China are given, which are expected to help advance this development.

  19. Multiple Retrieval Models and Regression Models for Prior Art Search

    CERN Document Server

    Lopez, Patrice

    2009-01-01

    This paper presents the system called PATATRAS (PATent and Article Tracking, Retrieval and AnalysiS) realized for the IP track of CLEF 2009. Our approach presents three main characteristics: 1. The usage of multiple retrieval models (KL, Okapi) and term index definitions (lemma, phrase, concept) for the three languages considered in the present track (English, French, German) producing ten different sets of ranked results. 2. The merging of the different results based on multiple regression models using an additional validation set created from the patent collection. 3. The exploitation of patent metadata and of the citation structures for creating restricted initial working sets of patents and for producing a final re-ranking regression model. As we exploit specific metadata of the patent documents and the citation relations only at the creation of initial working sets and during the final post ranking step, our architecture remains generic and easy to extend.

  20. Computational neurogenetic modeling

    CERN Document Server

    Benuskova, Lubica

    2010-01-01

    Computational Neurogenetic Modeling is a student text, introducing the scope and problems of a new scientific discipline - Computational Neurogenetic Modeling (CNGM). CNGM is concerned with the study and development of dynamic neuronal models for modeling brain functions with respect to genes and dynamic interactions between genes. These include neural network models and their integration with gene network models. This new area brings together knowledge from various scientific disciplines, such as computer and information science, neuroscience and cognitive science, genetics and molecular biol

  1. Predictive Models for Music

    OpenAIRE

    Paiement, Jean-François; Grandvalet, Yves; Bengio, Samy

    2008-01-01

    Modeling long-term dependencies in time series has proved very difficult to achieve with traditional machine learning methods. This problem occurs when considering music data. In this paper, we introduce generative models for melodies. We decompose melodic modeling into two subtasks. We first propose a rhythm model based on the distributions of distances between subsequences. Then, we define a generative model for melodies given chords and rhythms based on modeling sequences of Narmour featur...

  2. An Acoustic Charge Transport Imager for High Definition Television Applications: Reliability Modeling and Parametric Yield Prediction of GaAs Multiple Quantum Well Avalanche Photodiodes. Degree awarded Oct. 1997

    Science.gov (United States)

    Hunt, W. D.; Brennan, K. F.; Summers, C. J.; Yun, Ilgu

    1994-01-01

    Reliability modeling and parametric yield prediction of GaAs/AlGaAs multiple quantum well (MQW) avalanche photodiodes (APDs), which are of interest as an ultra-low noise image capture mechanism for high definition systems, have been investigated. First, the effect of various doping methods on the reliability of GaAs/AlGaAs multiple quantum well (MQW) avalanche photodiode (APD) structures fabricated by molecular beam epitaxy is investigated. Reliability is examined by accelerated life tests by monitoring dark current and breakdown voltage. Median device lifetime and the activation energy of the degradation mechanism are computed for undoped, doped-barrier, and doped-well APD structures. Lifetimes for each device structure are examined via a statistically designed experiment. Analysis of variance shows that dark-current is affected primarily by device diameter, temperature and stressing time, and breakdown voltage depends on the diameter, stressing time and APD type. It is concluded that the undoped APD has the highest reliability, followed by the doped well and doped barrier devices, respectively. To determine the source of the degradation mechanism for each device structure, failure analysis using the electron-beam induced current method is performed. This analysis reveals some degree of device degradation caused by ionic impurities in the passivation layer, and energy-dispersive spectrometry subsequently verified the presence of ionic sodium as the primary contaminant. However, since all device structures are similarly passivated, sodium contamination alone does not account for the observed variation between the differently doped APDs. This effect is explained by the dopant migration during stressing, which is verified by free carrier concentration measurements using the capacitance-voltage technique.

  3. TRACKING CLIMATE MODELS

    Data.gov (United States)

    National Aeronautics and Space Administration — CLAIRE MONTELEONI*, GAVIN SCHMIDT, AND SHAILESH SAROHA* Climate models are complex mathematical models designed by meteorologists, geophysicists, and climate...

  4. Environmental Modeling Center

    Data.gov (United States)

    Federal Laboratory Consortium — The Environmental Modeling Center provides the computational tools to perform geostatistical analysis, to model ground water and atmospheric releases for comparison...

  5. Multilevel modeling using R

    CERN Document Server

    Finch, W Holmes; Kelley, Ken

    2014-01-01

    A powerful tool for analyzing nested designs in a variety of fields, multilevel/hierarchical modeling allows researchers to account for data collected at multiple levels. Multilevel Modeling Using R provides you with a helpful guide to conducting multilevel data modeling using the R software environment.After reviewing standard linear models, the authors present the basics of multilevel models and explain how to fit these models using R. They then show how to employ multilevel modeling with longitudinal data and demonstrate the valuable graphical options in R. The book also describes models fo

  6. Global Business Models

    DEFF Research Database (Denmark)

    Rask, Morten

    insight from the literature about business models, international product policy, international entry modes and globalization into a conceptual model of relevant design elements of global business models, enabling global business model innovation to deal with differences in a downstream perspective regarding the customer interface and in an upstream perspective regarding the supply infrastructure. The paper offers a coherent conceptual dynamic meta-model of global business model innovation. Students, scholars and managers within the field of international business can use this conceptualization to understand, to study, and to create global business model innovation. Managerial and research implications draw on the developed ideal type of global business model innovation.

  7. Continuous system modeling

    Science.gov (United States)

    Cellier, Francois E.

    1991-01-01

    A comprehensive and systematic introduction is presented for the concepts associated with 'modeling', involving the transition from a physical system down to an abstract description of that system in the form of a set of differential and/or difference equations, and basing its treatment of modeling on the mathematics of dynamical systems. Attention is given to the principles of passive electrical circuit modeling, planar mechanical systems modeling, hierarchical modular modeling of continuous systems, and bond-graph modeling. Also discussed are modeling in equilibrium thermodynamics, population dynamics, and system dynamics, inductive reasoning, artificial neural networks, and automated model synthesis.

  8. Understandings of 'Modelling'

    DEFF Research Database (Denmark)

    Andresen, Mette

    2007-01-01

    This paper meets the common critique of the teaching of non-authentic modelling in school mathematics. In the paper, non-authentic modelling is related to a change of view on the intentions of modelling, from knowledge about applications of mathematical models to modelling for concept formation. Non-authentic modelling is also linked with the potentials of exploration of ready-made models as a forerunner for more authentic modelling processes. The discussion includes analysis of an episode of students' work in the classroom, which serves to illustrate how concept formation may be linked to explorations of a non...

  9. Interfacing materials models with fire field models

    Energy Technology Data Exchange (ETDEWEB)

    Nicolette, V.F.; Tieszen, S.R.; Moya, J.L.

    1995-12-01

    For flame spread over solid materials, there has traditionally been a large technology gap between fundamental combustion research and the somewhat simplistic approaches used for practical, real-world applications. Recent advances in computational hardware and computational fluid dynamics (CFD)-based software have led to the development of fire field models. These models, when used in conjunction with material burning models, have the potential to bridge the gap between research and application by implementing physics-based engineering models in a transient, multi-dimensional tool. This paper discusses the coupling that is necessary between fire field models and burning material models for the simulation of solid material fires. Fire field models are capable of providing detailed information about the local fire environment. This information serves as an input to the solid material combustion submodel, which subsequently calculates the impact of the fire environment on the material. The response of the solid material (in terms of thermal response, decomposition, charring, and off-gassing) is then fed back into the field model as a source of mass, momentum and energy. The critical parameters which must be passed between the field model and the material burning model have been identified. Many computational issues must be addressed when developing such an interface. Some examples include the ability to track multiple fuels and species, local ignition criteria, and the need to use local grid refinement over the burning material of interest.

  10. Combustion modeling in a model combustor

    Institute of Scientific and Technical Information of China (English)

    L.Y.Jiang; I.Campbell; K.Su

    2007-01-01

    The flow-field of a propane-air diffusion flame combustor with interior and exterior conjugate heat transfer was numerically studied. Results obtained from four combustion models, combined with the re-normalization group (RNG) k-ε turbulence model, discrete ordinates radiation model and enhanced wall treatment, are presented and discussed. The results are compared with a comprehensive database obtained from a series of experimental measurements. The flow patterns and the recirculation zone length in the combustion chamber are accurately predicted, and the mean axial velocities are in fairly good agreement with the experimental data, particularly at downstream sections, for all four combustion models. The mean temperature profiles are captured fairly well by the eddy dissipation (EDS), probability density function (PDF), and laminar flamelet combustion models. However, the EDS-finite-rate combustion model fails to provide an acceptable temperature field. In general, the flamelet model shows little superiority over the PDF model, and to some extent the PDF model performs better than the EDS model.

  11. Model Validation in Ontology Based Transformations

    Directory of Open Access Journals (Sweden)

    Jesús M. Almendros-Jiménez

    2012-10-01

    Full Text Available Model Driven Engineering (MDE) is an emerging approach to software engineering. MDE emphasizes the construction of models from which the implementation is derived by applying model transformations. The Ontology Definition Meta-model (ODM) has been proposed as a profile for UML models of the Web Ontology Language (OWL). In this context, transformations of UML models can be mapped into ODM/OWL transformations. On the other hand, model validation is a crucial task in model transformation. Meta-modeling gives a syntactic structure to source and target models; however, semantic requirements also have to be imposed on them. A given transformation is sound when source and target models fulfill the syntactic and semantic requirements. In this paper, we present an approach for model validation in ODM-based transformations. Adopting a logic-programming-based transformational approach, we show how models can be transformed and validated. The properties to be validated range from structural and semantic requirements on models (pre- and postconditions) to properties of the transformation itself (invariants). The approach has been applied to a well-known example of model transformation: the Entity-Relationship (ER) to Relational Model (RM) transformation.
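The idea of guarding a transformation with pre- and postconditions can be sketched as follows. This is a hypothetical toy example in Python, not the paper's logic-programming implementation; the ER/relational representations and all names are assumed for illustration:

```python
# Hypothetical sketch: a toy Entity-Relationship to relational transformation,
# guarded by a precondition on the source model and a postcondition on the
# target model. All data structures and names are illustrative assumptions.

def pre_condition(er_model):
    # Every entity must declare at least one key attribute.
    return all(any(a["key"] for a in e["attributes"]) for e in er_model["entities"])

def transform(er_model):
    # Each entity becomes a table; key attributes become primary-key columns.
    return {
        e["name"]: {
            "columns": [a["name"] for a in e["attributes"]],
            "primary_key": [a["name"] for a in e["attributes"] if a["key"]],
        }
        for e in er_model["entities"]
    }

def post_condition(rm):
    # Every table must have a non-empty primary key drawn from its own columns.
    return all(t["primary_key"] and set(t["primary_key"]) <= set(t["columns"])
               for t in rm.values())

er = {"entities": [{"name": "Person",
                    "attributes": [{"name": "ssn", "key": True},
                                   {"name": "name", "key": False}]}]}
assert pre_condition(er)   # source model fulfills the semantic requirement
rm = transform(er)
assert post_condition(rm)  # target model fulfills its requirement
```

A transformation invariant (e.g. one table per entity) could be checked the same way, relating the source and target models.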

  12. Model space of economic events

    Science.gov (United States)

    Romanovsky, M. Yu.

    A method for constructing the model, or virtual, space of economic events, in which economic objects can be treated as material ones, is suggested. We describe the change of share rates in time at stock markets as the potential difference between attracted bodies in this virtual space. Each share of each enterprise is represented by a single particle with a unit "charge". It is shown that the random value of the potential difference at the origin of coordinates, measured over a definite time interval, has a probability density coinciding with the known distribution of "Levy flights" or "Levy walks". A distribution of the alteration in time of the "Standard and Poor's" index value obtained by Mantegna and Stanley (who showed that it is a "Levy walks" distribution too) (Mantegna and Stanley, Nature 376 (1995) 46) is used to determine the dependence of the introduced potential on coordinates in the model space. A simple phenomenological model of the interaction potential is introduced. The potential law of each particle turns out to be close to r^(-2.14) in the minimal possible three-dimensional model space. This model permits calculation of the time of random potential correlations at a given point of the model space. These correlations could characterize the time period over which an investor makes a decision at the stock exchange. It is shown that this time is notably shorter in unstable periods (1987). A "microscopical" model of interaction in the virtual space is also discussed.
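The heavy-tailed "Levy flight" step distribution mentioned above can be illustrated by inverse-transform sampling from a Pareto-type law (an illustrative sketch, not the paper's model; the tail exponent and seed are arbitrary choices for this demo):

```python
import random

# Illustrative sketch: if U ~ Uniform(0, 1], then U**(-1/alpha) follows a
# Pareto(alpha) law with tail P(step > s) = s**(-alpha) for s >= 1 -- the
# heavy-tailed step-length distribution characteristic of "Levy flights".
# alpha = 1.5 and the seed are assumptions for this demo only.

def levy_steps(n, alpha=1.5, seed=42):
    rng = random.Random(seed)
    # 1 - random() lies in (0, 1], avoiding a zero base.
    return [(1.0 - rng.random()) ** (-1.0 / alpha) for _ in range(n)]

steps = levy_steps(100_000)
max_step = max(steps)  # heavy tails produce occasional very large excursions
```

For alpha < 2 the variance of the step length is infinite, which is what distinguishes such walks from ordinary Gaussian diffusion.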

  13. Holographic dark-energy models

    Science.gov (United States)

    Del Campo, Sergio; Fabris, Júlio. C.; Herrera, Ramón; Zimdahl, Winfried

    2011-06-01

    Different holographic dark-energy models are studied from a unifying point of view. We compare models for which the Hubble scale, the future event horizon or a quantity proportional to the Ricci scale are taken as the infrared cutoff length. We demonstrate that the mere definition of the holographic dark-energy density generally implies an interaction with the dark-matter component. We discuss the relation between the equation-of-state parameter and the energy density ratio of both components for each of the choices, as well as the possibility of noninteracting and scaling solutions. Parameter estimations for all three cutoff options are performed with the help of a Bayesian statistical analysis, using data from supernovae type Ia and the history of the Hubble parameter. The ΛCDM model is the clear winner of the analysis. According to the Bayesian information criterion (BIC), all holographic models should be considered as ruled out, since the difference ΔBIC to the corresponding ΛCDM value is >10. According to the Akaike information criterion (AIC), however, we find ΔAIC<2 for models with Hubble-scale and Ricci-scale cutoffs, indicating, that they may still be competitive. As we show for the example of the Ricci-scale case, also the use of certain priors, reducing the number of free parameters to that of the ΛCDM model, may result in a competitive holographic model.

  14. Regularized Structural Equation Modeling.

    Science.gov (United States)

    Jacobucci, Ross; Grimm, Kevin J; McArdle, John J

    A new method is proposed that extends the use of regularization, as in lasso and ridge regression, to structural equation models. The method is termed regularized structural equation modeling (RegSEM). RegSEM penalizes specific parameters in structural equation models, with the goal of creating simpler, easier-to-understand models. Although regularization has gained wide adoption in regression, very little has transferred to models with latent variables. By adding penalties to specific parameters in a structural equation model, researchers gain a high level of flexibility in reducing model complexity, overcoming poorly fitting models, and creating models that are more likely to generalize to new samples. The proposed method was evaluated through a simulation study, two illustrative examples involving a measurement model, and one empirical example involving the structural part of the model to demonstrate RegSEM's utility.
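
    The penalization idea can be sketched with an ordinary lasso fitted by proximal gradient descent. The plain least-squares objective below stands in for RegSEM's SEM fit function; the penalty mechanics (an l1 term that drives selected parameters to exactly zero) are the same idea, and all data are toy values.

```python
import random

def soft_threshold(x, t):
    """Proximal operator of the l1 penalty: shrink x toward zero by t."""
    if x > t:
        return x - t
    if x < -t:
        return x + t
    return 0.0

def lasso(X, y, lam, lr=0.002, steps=2000):
    """Proximal gradient descent for 0.5*||y - Xb||^2 + lam*||b||_1."""
    n, p = len(y), len(X[0])
    b = [0.0] * p
    for _ in range(steps):
        resid = [sum(X[i][j] * b[j] for j in range(p)) - y[i] for i in range(n)]
        grad = [sum(X[i][j] * resid[i] for i in range(n)) for j in range(p)]
        b = [soft_threshold(b[j] - lr * grad[j], lr * lam) for j in range(p)]
    return b

# Toy data: y depends only on the first predictor; the l1 penalty should
# drive the irrelevant second coefficient to exactly zero.
rng = random.Random(1)
X = [[rng.gauss(0, 1), rng.gauss(0, 1)] for _ in range(200)]
y = [2.0 * row[0] for row in X]
b = lasso(X, y, lam=5.0)
print(b)
```

    The first coefficient is shrunk slightly below its true value of 2 (the usual lasso bias), while the second is set exactly to zero, which is the "simpler model" behavior the abstract describes.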

  15. Wastewater treatment models

    DEFF Research Database (Denmark)

    Gernaey, Krist; Sin, Gürkan

    2011-01-01

    The state-of-the-art level reached in modeling wastewater treatment plants (WWTPs) is reported. For suspended growth systems, WWTP models have evolved from simple description of biological removal of organic carbon and nitrogen in aeration tanks (ASM1 in 1987) to more advanced levels including...... of WWTP modeling by linking the wastewater treatment line with the sludge handling line in one modeling platform. Application of WWTP models is currently rather time consuming and thus expensive due to the high model complexity, and requires a great deal of process knowledge and modeling expertise....... Efficient and good modeling practice therefore requires the use of a proper set of guidelines, thus grounding the modeling studies on a general and systematic framework. Last but not least, general limitations of WWTP models – more specifically activated sludge models – are introduced since these define...

  16. Wastewater Treatment Models

    DEFF Research Database (Denmark)

    Gernaey, Krist; Sin, Gürkan

    2008-01-01

    The state-of-the-art level reached in modeling wastewater treatment plants (WWTPs) is reported. For suspended growth systems, WWTP models have evolved from simple description of biological removal of organic carbon and nitrogen in aeration tanks (ASM1 in 1987) to more advanced levels including...... the practice of WWTP modeling by linking the wastewater treatment line with the sludge handling line in one modeling platform. Application of WWTP models is currently rather time consuming and thus expensive due to the high model complexity, and requires a great deal of process knowledge and modeling expertise....... Efficient and good modeling practice therefore requires the use of a proper set of guidelines, thus grounding the modeling studies on a general and systematic framework. Last but not least, general limitations of WWTP models – more specifically, activated sludge models – are introduced since these define...

  17. ROCK PROPERTIES MODEL ANALYSIS MODEL REPORT

    Energy Technology Data Exchange (ETDEWEB)

    Clinton Lum

    2002-02-04

    The purpose of this Analysis and Model Report (AMR) is to document Rock Properties Model (RPM) 3.1 with regard to input data, model methods, assumptions, uncertainties and limitations of model results, and qualification status of the model. The report also documents the differences between the current and previous versions and validation of the model. The rock properties models are intended principally for use as input to numerical physical-process modeling, such as of ground-water flow and/or radionuclide transport. This work was conducted in accordance with the following planning documents: WA-0344, ''3-D Rock Properties Modeling for FY 1998'' (SNL 1997); WA-0358, ''3-D Rock Properties Modeling for FY 1999'' (SNL 1999); and the technical development plan, Rock Properties Model Version 3.1 (CRWMS M&O 1999c). The Interim Change Notices (ICNs) ICN 02 and ICN 03 of this AMR were prepared as part of activities being conducted under the Technical Work Plan, TWP-NBS-GS-000003, ''Technical Work Plan for the Integrated Site Model, Process Model Report, Revision 01'' (CRWMS M&O 2000b). The purpose of ICN 03 is to record changes in data input status due to data qualification and verification activities. These work plans describe the scope, objectives, tasks, methodology, and implementing procedures for model construction. The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The work scope for this activity consists of the following: (1) Conversion of the input data (laboratory measured porosity data, x-ray diffraction mineralogy, petrophysical calculations of bound water, and petrophysical calculations of porosity) for each borehole into stratigraphic coordinates; (2) Re-sampling and merging of data sets; (3

  18. Neutrino and The Standard Model

    CERN Document Server

    Bilenky, S M

    2014-01-01

    After the discovery of the Higgs boson at CERN, the Standard Model acquired the status of a full, correct theory of the elementary particles in the electroweak range. What general conclusions can be inferred from the SM? I suggest here that, in the framework of such general principles as local gauge symmetry, unification of the weak and electromagnetic interactions, and Brout-Englert-Higgs spontaneous breaking of the electroweak symmetry, nature chooses the simplest possibilities. It is very plausible that massless left-handed neutrinos (the simplest, most economical possibility) play a crucial role in determining the charged-current structure of the Standard Model and that neutrino properties (masses and nature) are determined by physics beyond the Standard Model. The discovery of neutrinoless double $\beta$-decay and proof that neutrinos with definite masses are Majorana particles would be important evidence in favor of the considered scenario.

  19. Mathematical modelling in solid mechanics

    CERN Document Server

    Sofonea, Mircea; Steigmann, David

    2017-01-01

    This book presents new research results in multidisciplinary fields of mathematical and numerical modelling in mechanics. The chapters treat the following topics: mathematical modelling in solid, fluid and contact mechanics; nonconvex variational analysis with emphasis on nonlinear solid and structural mechanics; numerical modelling of problems with non-smooth constitutive laws, approximation of variational and hemivariational inequalities, numerical analysis of discrete schemes, numerical methods and the corresponding algorithms, and applications to mechanical engineering; numerical aspects of non-smooth mechanics, with emphasis on developing accurate and reliable computational tools; mechanics of fibre-reinforced materials; behaviour of elasto-plastic materials accounting for microstructural defects; definition of structural defects based on differential geometry concepts or on an atomistic basis; interaction between phase transformation and dislocations at the nano-scale; energetic arguments; bifurcation and post-buckling a...

  20. Business process modeling in healthcare.

    Science.gov (United States)

    Ruiz, Francisco; Garcia, Felix; Calahorra, Luis; Llorente, César; Gonçalves, Luis; Daniel, Christel; Blobel, Bernd

    2012-01-01

    The importance of the process point of view is not restricted to a specific enterprise sector. In the field of health, as a result of the nature of the service offered, health institutions' processes are also the basis for decision making, which is focused on achieving their objective of providing quality medical assistance. In this chapter the application of business process modelling - using the Business Process Modelling Notation (BPMN) standard - is described. The main challenges of business process modelling in healthcare are the definition of healthcare processes, the multi-disciplinary nature of healthcare, the flexibility and variability of the activities involved in healthcare processes, the need for interoperability between multiple information systems, and the continuous updating of scientific knowledge in healthcare.

  1. Plasto-damage modelling for semi-brittle geomaterials

    OpenAIRE

    Alizadeh Ali; Gatmiri Behrouz

    2016-01-01

    This paper presents an elastoplastic damage model for constitutive modelling of semi-brittle geomaterials exhibiting two irreversible mechanisms. On the one hand, the model deals with the plastic behaviour of a porous medium through a new variant of the Barcelona Basic Model. On the other hand, it combines the micromechanical definition of damage with phenomenological concepts in the framework of Continuum Damage Mechanics (CDM) for damage modelling. A second order tensorial damage variable is adopted ...

  2. Model Reduction of Nonlinear Fire Dynamics Models

    OpenAIRE

    Lattimer, Alan Martin

    2016-01-01

    Due to the complexity, multi-scale, and multi-physics nature of the mathematical models for fires, current numerical models require too much computational effort to be useful in design and real-time decision making, especially when dealing with fires over large domains. To reduce the computational time while retaining the complexity of the domain and physics, our research has focused on several reduced-order modeling techniques. Our contributions are improving wildland fire reduced-order mod...

  3. Better models are more effectively connected models

    Science.gov (United States)

    Nunes, João Pedro; Bielders, Charles; Darboux, Frederic; Fiener, Peter; Finger, David; Turnbull-Lloyd, Laura; Wainwright, John

    2016-04-01

    The concept of hydrologic and geomorphologic connectivity describes the processes and pathways which link sources (e.g. rainfall, snow and ice melt, springs, eroded areas and barren lands) to accumulation areas (e.g. foot slopes, streams, aquifers, reservoirs), and the spatial variations thereof. There are many examples of hydrological and sediment connectivity on a watershed scale; in consequence, a process-based understanding of connectivity is crucial to help managers understand their systems and adopt adequate measures for flood prevention, pollution mitigation and soil protection, among others. Modelling is often used as a tool to understand and predict fluxes within a catchment by complementing observations with model results. Catchment models should therefore be able to reproduce the linkages, and thus the connectivity of water and sediment fluxes within the systems under simulation. In modelling, a high level of spatial and temporal detail is desirable to ensure taking into account a maximum number of components, which then enables connectivity to emerge from the simulated structures and functions. However, computational constraints and, in many cases, lack of data prevent the representation of all relevant processes and spatial/temporal variability in most models. In most cases, therefore, the level of detail selected for modelling is too coarse to represent the system in a way in which connectivity can emerge; a problem which can be circumvented by representing fine-scale structures and processes within coarser scale models using a variety of approaches. This poster focuses on the results of ongoing discussions on modelling connectivity held during several workshops within COST Action Connecteur. It assesses the current state of the art of incorporating the concept of connectivity in hydrological and sediment models, as well as the attitudes of modellers towards this issue. 
The discussion will focus on the different approaches through which connectivity

  4. Multiple Model Approaches to Modelling and Control,

    DEFF Research Database (Denmark)

    on the ease with which prior knowledge can be incorporated. It is interesting to note that researchers in Control Theory, Neural Networks,Statistics, Artificial Intelligence and Fuzzy Logic have more or less independently developed very similar modelling methods, calling them Local ModelNetworks, Operating...... of introduction of existing knowledge, as well as the ease of model interpretation. This book attempts to outlinemuch of the common ground between the various approaches, encouraging the transfer of ideas.Recent progress in algorithms and analysis is presented, with constructive algorithms for automated model...

  5. Integrity modelling of tropospheric delay models

    Science.gov (United States)

    Rózsa, Szabolcs; Bastiaan Ober, Pieter; Mile, Máté; Ambrus, Bence; Juni, Ildikó

    2017-04-01

    The effect of the neutral atmosphere on signal propagation is routinely estimated by various tropospheric delay models in satellite navigation. Although numerous studies in the literature investigate the accuracy of these models, for safety-of-life applications it is crucial to study and model the worst-case performance of these models at very low recurrence frequencies. The main objective of the INTegrity of TROpospheric models (INTRO) project funded by the ESA PECS programme is to establish a model (or models) of the residual error of existing tropospheric delay models for safety-of-life applications. Such models are required to overbound rare tropospheric delays and should thus include the tails of the error distributions. Their use should lead to safe error bounds on the user position and should allow computation of protection levels for the horizontal and vertical position errors. The current tropospheric model from the RTCA SBAS Minimal Operational Standards has an associated residual error of 0.12 meters in the vertical direction. This value is derived by simply extrapolating the observed distribution of the residuals into the tail (where no data is present) and taking the point where the cumulative distribution reaches an exceedance level of 10^-7. While the resulting standard deviation is much higher than the standard deviation that best fits the data (0.05 meters), it surely is conservative for most applications. In the context of the INTRO project some widely used and newly developed tropospheric delay models (e.g. RTCA MOPS, ESA GALTROPO and GPT2W) were tested using 16 years of daily ERA-INTERIM Reanalysis numerical weather model data and the raytracing technique. The results showed that the performance of some of the widely applied models has a clear seasonal dependency and is also affected by geographical position. In order to provide a more realistic, but still conservative estimation of the residual

  6. Processing Approach of Non-linear Adjustment Models in the Space of Non-linear Models

    Institute of Scientific and Technical Information of China (English)

    LI Chaokui; ZHU Qing; SONG Chengfang

    2003-01-01

    This paper investigates the mathematical features of non-linear models and discusses how to process the non-linear factors that contribute to the non-linearity of a non-linear model. On the basis of the error definition, this paper puts forward a new adjustment criterion, SGPE. Last, this paper investigates the solution of a non-linear regression model in the non-linear model space and compares the estimated values in non-linear model space with those in linear model space.

  7. Numerical Modelling of Streams

    DEFF Research Database (Denmark)

    Vestergaard, Kristian

    In recent years there has been a sharp increase in the use of numerical water quality models. Numeric water quality modeling can be divided into three steps: Hydrodynamic modeling for the determination of stream flow and water levels. Modelling of transport and dispersion of a conservative...

  8. Graphical Models with R

    DEFF Research Database (Denmark)

    Højsgaard, Søren; Edwards, David; Lauritzen, Steffen

    , the book provides examples of how more advanced aspects of graphical modeling can be represented and handled within R. Topics covered in the seven chapters include graphical models for contingency tables, Gaussian and mixed graphical models, Bayesian networks and modeling high dimensional data...

  9. Dynamic Latent Classification Model

    DEFF Research Database (Denmark)

    Zhong, Shengtong; Martínez, Ana M.; Nielsen, Thomas Dyhre

    as possible. Motivated by this problem setting, we propose a generative model for dynamic classification in continuous domains. At each time point the model can be seen as combining a naive Bayes model with a mixture of factor analyzers (FA). The latent variables of the FA are used to capture the dynamics...... in the process as well as modeling dependences between attributes....

  10. HRM: HII Region Models

    Science.gov (United States)

    Wenger, Trey V.; Kepley, Amanda K.; Balser, Dana S.

    2017-07-01

    HII Region Models fits HII region models to observed radio recombination line and radio continuum data. The algorithm includes the calculations of departure coefficients to correct for non-LTE effects. HII Region Models has been used to model star formation in the nucleus of IC 342.

  11. Multilevel IRT Model Assessment

    NARCIS (Netherlands)

    Fox, Jean-Paul; Ark, L. Andries; Croon, Marcel A.

    2005-01-01

    Modelling complex cognitive and psychological outcomes in, for example, educational assessment led to the development of generalized item response theory (IRT) models. A class of models was developed to solve practical and challenging educational problems by generalizing the basic IRT models. An IRT

  12. Models for Dynamic Applications

    DEFF Research Database (Denmark)

    2011-01-01

    be applied to formulate, analyse and solve these dynamic problems and how in the case of the fuel cell problem the model consists of coupledmeso and micro scale models. It is shown how data flows are handled between the models and how the solution is obtained within the modelling environment....

  13. Multivariate GARCH models

    DEFF Research Database (Denmark)

    Silvennoinen, Annastiina; Teräsvirta, Timo

    This article contains a review of multivariate GARCH models. Most common GARCH models are presented and their properties considered. This also includes nonparametric and semiparametric models. Existing specification and misspecification tests are discussed. Finally, there is an empirical example...... in which several multivariate GARCH models are fitted to the same data set and the results compared....

  14. The Model Confidence Set

    DEFF Research Database (Denmark)

    Hansen, Peter Reinhard; Lunde, Asger; Nason, James M.

    The paper introduces the model confidence set (MCS) and applies it to the selection of models. A MCS is a set of models that is constructed such that it will contain the best model with a given level of confidence. The MCS is in this sense analogous to a confidence interval for a parameter. The M...

  15. Modelling Railway Interlocking Systems

    DEFF Research Database (Denmark)

    Lindegaard, Morten Peter; Viuf, P.; Haxthausen, Anne Elisabeth

    2000-01-01

    In this report we present a model of interlocking systems, and describe how the model may be validated by simulation. Station topologies are modelled by graphs in which the nodes denote track segments, and the edges denote connectivity for train traffic. Points and signals are modelled by annotatio...

  16. AIDS Epidemiological models

    Science.gov (United States)

    Rahmani, Fouad Lazhar

    2010-11-01

    The aim of this paper is to present mathematical modelling of the spread of infection in the context of the transmission of the human immunodeficiency virus (HIV) and the acquired immune deficiency syndrome (AIDS). These models are based in part on the models suggested in the field of AIDS mathematical modelling as reported by ISHAM [6].

  17. Multivariate GARCH models

    DEFF Research Database (Denmark)

    Silvennoinen, Annastiina; Teräsvirta, Timo

    This article contains a review of multivariate GARCH models. Most common GARCH models are presented and their properties considered. This also includes nonparametric and semiparametric models. Existing specification and misspecification tests are discussed. Finally, there is an empirical example...... in which several multivariate GARCH models are fitted to the same data set and the results compared....

  18. Multilevel IRT Model Assessment

    NARCIS (Netherlands)

    Fox, Gerardus J.A.; Ark, L. Andries; Croon, Marcel A.

    2005-01-01

    Modelling complex cognitive and psychological outcomes in, for example, educational assessment led to the development of generalized item response theory (IRT) models. A class of models was developed to solve practical and challenging educational problems by generalizing the basic IRT models. An IRT

  19. Biomass Scenario Model

    Energy Technology Data Exchange (ETDEWEB)

    2015-09-01

    The Biomass Scenario Model (BSM) is a unique, carefully validated, state-of-the-art dynamic model of the domestic biofuels supply chain which explicitly focuses on policy issues, their feasibility, and potential side effects. It integrates resource availability, physical/technological/economic constraints, behavior, and policy. The model uses a system dynamics simulation (not optimization) to model dynamic interactions across the supply chain.
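
    The stock-and-flow structure underlying a system dynamics simulation can be sketched in a few lines. The single-stock model below is a generic illustration of the method (explicit Euler integration of flows into a stock), not the BSM's actual supply-chain structure; all rates are invented.

```python
def simulate(stock0, inflow, outflow_rate, dt, t_end):
    """Explicit-Euler integration of one stock-and-flow loop, the basic
    building block of a system dynamics model: flows set the derivative,
    the stock accumulates it."""
    stock, t, history = stock0, 0.0, []
    while t < t_end:
        history.append(stock)
        net = inflow(t) - outflow_rate * stock  # inflow minus first-order outflow
        stock += net * dt
        t += dt
    return history

# Constant investment inflow with first-order retirement: the stock
# approaches the equilibrium inflow / outflow_rate = 100 / 0.1 = 1000.
h = simulate(0.0, lambda t: 100.0, 0.1, dt=0.25, t_end=100.0)
print(h[-1])
```

    Simulation (rather than optimization) means the trajectory simply emerges from integrating these feedback loops forward in time, which is how the BSM explores policy side effects.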

  20. Lumped-parameter models

    Energy Technology Data Exchange (ETDEWEB)

    Ibsen, Lars Bo; Liingaard, M.

    2006-12-15

    A lumped-parameter model represents the frequency-dependent soil-structure interaction of a massless foundation placed on or embedded into an unbounded soil domain. In this technical report the steps of establishing a lumped-parameter model are presented. The following sections are included in this report: Static and dynamic formulation, Simple lumped-parameter models, and Advanced lumped-parameter models. (au)

  1. Plant development models

    NARCIS (Netherlands)

    Chuine, I.; Garcia de Cortazar-Atauri, I.; Kramer, K.; Hänninen, H.

    2013-01-01

    In this chapter we provide a brief overview of plant phenology modeling, focusing on mechanistic phenological models. After a brief history of plant phenology modeling, we present the different models which have been described in the literature so far and highlight the main differences between them,

  2. Generic Market Models

    NARCIS (Netherlands)

    R. Pietersz (Raoul); M. van Regenmortel

    2005-01-01

    textabstractCurrently, there are two market models for valuation and risk management of interest rate derivatives, the LIBOR and swap market models. In this paper, we introduce arbitrage-free constant maturity swap (CMS) market models and generic market models featuring forward rates that span perio

  3. A Model for Conversation

    DEFF Research Database (Denmark)

    Ayres, Phil

    2012-01-01

    This essay discusses models. It examines what models are, the roles models perform and suggests various intentions that underlie their construction and use. It discusses how models act as a conversational partner, and how they support various forms of conversation within the conversational activity...... of design. Three distinctions are drawn through which to develop this discussion of models in an architectural context. An examination of these distinctions serves to nuance particular characteristics and roles of models, the modelling activity itself and those engaged in it....

  4. Talk about toy models

    Science.gov (United States)

    Luczak, Joshua

    2017-02-01

    Scientific models are frequently discussed in philosophy of science. A great deal of the discussion is centred on approximation, idealisation, and on how these models achieve their representational function. Despite their importance, distinct nature, and prevalence, toy models have received little attention from philosophers. This paper hopes to remedy this situation. It aims to elevate the status of toy models: by distinguishing them from approximations and idealisations, by highlighting and elaborating on several ways the Kac ring, a simple statistical mechanical model, is used as a toy model, and by explaining why toy models can be used to successfully carry out important work without performing a representational function.

  5. Latent classification models

    DEFF Research Database (Denmark)

    Langseth, Helge; Nielsen, Thomas Dyhre

    2005-01-01

    One of the simplest, and yet most consistently well-performing, sets of classifiers is the naive Bayes models. These models rely on two assumptions: (i) all the attributes used to describe an instance are conditionally independent given the class of that instance, and (ii) all attributes follow a specific...... parametric family of distributions. In this paper we propose a new set of models for classification in continuous domains, termed latent classification models. The latent classification model can roughly be seen as combining the naive Bayes model with a mixture of factor analyzers, thereby relaxing the assumptions...... classification model, and we demonstrate empirically that the accuracy of the proposed model is significantly higher than the accuracy of other probabilistic classifiers....

  6. Wastewater treatment models

    DEFF Research Database (Denmark)

    Gernaey, Krist; Sin, Gürkan

    2011-01-01

    The state-of-the-art level reached in modeling wastewater treatment plants (WWTPs) is reported. For suspended growth systems, WWTP models have evolved from simple description of biological removal of organic carbon and nitrogen in aeration tanks (ASM1 in 1987) to more advanced levels including...... of WWTP modeling by linking the wastewater treatment line with the sludge handling line in one modeling platform. Application of WWTP models is currently rather time consuming and thus expensive due to the high model complexity, and requires a great deal of process knowledge and modeling expertise...

  7. Wastewater Treatment Models

    DEFF Research Database (Denmark)

    Gernaey, Krist; Sin, Gürkan

    2008-01-01

    The state-of-the-art level reached in modeling wastewater treatment plants (WWTPs) is reported. For suspended growth systems, WWTP models have evolved from simple description of biological removal of organic carbon and nitrogen in aeration tanks (ASM1 in 1987) to more advanced levels including...... the practice of WWTP modeling by linking the wastewater treatment line with the sludge handling line in one modeling platform. Application of WWTP models is currently rather time consuming and thus expensive due to the high model complexity, and requires a great deal of process knowledge and modeling expertise...

  8. The Hospitable Meal Model

    DEFF Research Database (Denmark)

    Justesen, Lise; Overgaard, Svend Skafte

    2017-01-01

    This article presents an analytical model that aims to conceptualize how meal experiences are framed when taking into account a dynamic understanding of hospitality: the meal model is named The Hospitable Meal Model. The idea behind The Hospitable Meal Model is to present a conceptual model...... that can serve as a frame for developing hospitable meal competencies among professionals working within the area of institutional foodservices as well as a conceptual model for analysing meal experiences. The Hospitable Meal Model transcends and transforms existing meal models by presenting a more open-ended approach towards meal experiences. The underlying purpose of The Hospitable Meal Model is to provide the basis for creating value for the individuals involved in institutional meal services. The Hospitable Meal Model was developed on the basis of an empirical study on hospital meal experiences explored...

  9. Protein Models Comparator

    CERN Document Server

    Widera, Paweł

    2011-01-01

    The process of comparison of computer generated protein structural models is an important element of protein structure prediction. It has many uses including model quality evaluation, selection of the final models from a large set of candidates or optimisation of parameters of energy functions used in template free modelling and refinement. Although many protein comparison methods are available online on numerous web servers, their ability to handle a large scale model comparison is often very limited. Most of the servers offer only a single pairwise structural comparison, and they usually do not provide a model-specific comparison with a fixed alignment between the models. To bridge the gap between the protein and model structure comparison we have developed the Protein Models Comparator (pm-cmp). To be able to deliver the scalability on demand and handle large comparison experiments the pm-cmp was implemented "in the cloud". Protein Models Comparator is a scalable web application for a fast distributed comp...

  10. Nonuniform Markov models

    CERN Document Server

    Ristad, E S; Ristad, Eric Sven; Thomas, Robert G.

    1996-01-01

    A statistical language model assigns probability to strings of arbitrary length. Unfortunately, it is not possible to gather reliable statistics on strings of arbitrary length from a finite corpus. Therefore, a statistical language model must decide that each symbol in a string depends on at most a small, finite number of other symbols in the string. In this report we propose a new way to model conditional independence in Markov models. The central feature of our nonuniform Markov model is that it makes predictions of varying lengths using contexts of varying lengths. Experiments on the Wall Street Journal reveal that the nonuniform model performs slightly better than the classic interpolated Markov model. This result is somewhat remarkable because both models contain identical numbers of parameters whose values are estimated in a similar manner. The only difference between the two models is how they combine the statistics of longer and shorter strings. Keywords: nonuniform Markov model, interpolated Markov m...
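
    The key idea, combining the statistics of longer and shorter contexts, can be sketched as a recursively interpolated Markov model. The fixed interpolation weight and the toy corpus below are illustrative assumptions, not the paper's estimation scheme (which learns how the statistics are combined).

```python
from collections import defaultdict

def train_counts(text, max_order=3):
    """Collect context -> next-symbol counts for every context length
    up to max_order (a toy variable-length Markov model)."""
    counts = defaultdict(lambda: defaultdict(int))
    for i in range(len(text)):
        for k in range(max_order + 1):
            if i - k >= 0:
                counts[text[i - k:i]][text[i]] += 1
    return counts

def prob(counts, context, symbol, lam=0.6):
    """Recursively interpolate the longest-context estimate with the
    next-shorter one: P = lam * P_long + (1 - lam) * P_short."""
    if context == "":
        total = sum(counts[""].values())
        return counts[""][symbol] / total if total else 0.0
    ctx = counts.get(context, {})
    total = sum(ctx.values())
    shorter = prob(counts, context[1:], symbol, lam)
    if not total:
        return shorter  # unseen context: fall back entirely to shorter one
    return lam * ctx.get(symbol, 0) / total + (1 - lam) * shorter

counts = train_counts("abracadabra")
print(prob(counts, "br", "a"))  # "br" is always followed by "a" here
```

    Because the unigram distribution is mixed in at the bottom of the recursion, the estimate never assigns probability 1 to "a" even though it always follows "br" in the training string.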

  11. Lumped Thermal Household Model

    DEFF Research Database (Denmark)

    Biegel, Benjamin; Andersen, Palle; Stoustrup, Jakob

    2013-01-01

    a lumped model approach as an alternative to the individual models. In the lumped model, the portfolio is seen as baseline consumption superimposed with an ideal storage of limited power and energy capacity. The benefit of such a lumped model is that the computational effort of flexibility optimization......In this paper we discuss two different approaches to model the flexible power consumption of heat pump heated households: individual household modeling and lumped modeling. We illustrate that a benefit of individual modeling is that we can overview and optimize the complete flexibility of a heat...... pump portfolio. Following, we illustrate two disadvantages of individual models, namely that it requires much computational effort to optimize over a large portfolio, and second that it is difficult to accurately model the houses in certain time periods due to local disturbances. Finally, we propose...

  12. Calibrated Properties Model

    Energy Technology Data Exchange (ETDEWEB)

    C. Ahlers; H. Liu

    2000-03-12

    The purpose of this Analysis/Model Report (AMR) is to document the Calibrated Properties Model that provides calibrated parameter sets for unsaturated zone (UZ) flow and transport process models for the Yucca Mountain Site Characterization Project (YMP). This work was performed in accordance with the ''AMR Development Plan for U0035 Calibrated Properties Model REV00''. These calibrated property sets include matrix and fracture parameters for the UZ Flow and Transport Model (UZ Model), drift seepage models, drift-scale and mountain-scale coupled-processes models, and Total System Performance Assessment (TSPA) models, as well as models used by Performance Assessment (PA) and other participating national laboratories and government agencies. These process models provide the necessary framework to test conceptual hypotheses of flow and transport at different scales and to predict flow and transport behavior under a variety of climatic and thermal-loading conditions.

  13. The UTCI-clothing model

    Science.gov (United States)

    Havenith, George; Fiala, Dusan; Błazejczyk, Krzysztof; Richards, Mark; Bröde, Peter; Holmér, Ingvar; Rintamaki, Hannu; Benshabat, Yael; Jendritzky, Gerd

    2012-05-01

    The Universal Thermal Climate Index (UTCI) was conceived as a thermal index covering the whole climate range from heat to cold. This would be impossible without considering clothing as the interface between the person (here, the physiological model of thermoregulation) and the environment. It was decided to develop a clothing model for this application in which the following three factors were considered: (1) typical dressing behaviour in different temperatures, as observed in the field, resulting in a model of the distribution of clothing over the different body segments in relation to the ambient temperature, (2) the changes in clothing insulation and vapour resistance caused by wind and body movement, and (3) the change in wind speed in relation to the height above ground. The outcome was a clothing model that defines in detail the effective clothing insulation and vapour resistance for each of the thermo-physiological model's body segments over a wide range of climatic conditions. This paper details this model's conception and documents its definitions.

  14. Introduction to Adjoint Models

    Science.gov (United States)

    Errico, Ronald M.

    2015-01-01

    In this lecture, some fundamentals of adjoint models will be described. This includes a basic derivation of tangent linear and corresponding adjoint models from a parent nonlinear model, the interpretation of adjoint-derived sensitivity fields, a description of methods of automatic differentiation, and the use of adjoint models to solve various optimization problems, including singular vectors. Concluding remarks will attempt to correct common misconceptions about adjoint models and their utilization.
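
    The core relationship in the abstract above is that the tangent linear model (TLM) is the Jacobian of the nonlinear parent model, and the adjoint satisfies the inner-product identity ⟨L dx, dy⟩ = ⟨dx, Lᵀ dy⟩. A minimal sketch with an assumed toy model (not from the lecture):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Nonlinear "parent" model (illustrative)
    def model(x):
        return np.array([x[0] * x[1], np.sin(x[2]), x[0] + x[2] ** 2])

    def tangent_linear(x):
        """Jacobian of the model at x: the tangent linear model (TLM)."""
        return np.array([
            [x[1], x[0], 0.0],
            [0.0,  0.0,  np.cos(x[2])],
            [1.0,  0.0,  2.0 * x[2]],
        ])

    x0 = rng.standard_normal(3)
    L = tangent_linear(x0)    # TLM
    L_adj = L.T               # adjoint = transpose for real-valued operators

    dx = rng.standard_normal(3)
    dy = rng.standard_normal(3)

    # Adjoint identity: <L dx, dy> == <dx, L* dy>
    lhs = (L @ dx) @ dy
    rhs = dx @ (L_adj @ dy)

    # Finite-difference check that L really is the model's linearization
    fd = (model(x0 + 1e-6 * dx) - model(x0 - 1e-6 * dx)) / 2e-6
    ```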

  15. Modeling cholera outbreaks

    OpenAIRE

    Chao, Dennis L.; Ira M Longini; Morris, J. Glenn

    2014-01-01

    Mathematical modeling can be a valuable tool for studying infectious disease outbreak dynamics and simulating the effects of possible interventions. Here, we describe approaches to modeling cholera outbreaks and how models have been applied to explore intervention strategies, particularly in Haiti. Mathematical models can play an important role in formulating and evaluating complex cholera outbreak response options. Major challenges to cholera modeling are insufficient data for calibrating mo...

  16. Business Model Visualization

    OpenAIRE

    Zagorsek, Branislav

    2013-01-01

    A business model describes the company's most important activities, its proposed value, and the compensation for that value. Business model visualization makes it possible to simply and systematically capture and describe the most important components of the business model, while standardization of the concept allows comparison between companies. There are several ways to visualize the model. The aim of this paper is to describe the options for business model visualization and business mod...

  17. Diffeomorphic Statistical Deformation Models

    DEFF Research Database (Denmark)

    Hansen, Michael Sass; Hansen, Mads Fogtmann; Larsen, Rasmus

    2007-01-01

    In this paper we present a new method for constructing diffeomorphic statistical deformation models in arbitrary dimensional images with a nonlinear generative model and a linear parameter space. Our deformation model is a modified version of the diffeomorphic model introduced by Cootes et al. Th...... with ground truth in form of manual expert annotations, and compared to Cootes's model. We anticipate applications in unconstrained diffeomorphic synthesis of images, e.g. for tracking, segmentation, registration or classification purposes....

  18. Dimension of linear models

    DEFF Research Database (Denmark)

    Høskuldsson, Agnar

    1996-01-01

    Determination of the proper dimension of a given linear model is one of the most important tasks in the applied modeling work. We consider here eight criteria that can be used to determine the dimension of the model, or equivalently, the number of components to use in the model. Four...... the basic problems in determining the dimension of linear models. Then each of the eight measures are treated. The results are illustrated by examples....
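
    One common criterion for the dimension of a linear (component) model, of the kind surveyed above, is the smallest number of components whose cumulative explained variance exceeds a threshold. A sketch on synthetic data with a known intrinsic dimension of two (the data, loadings and threshold are illustrative):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Data generated from 2 latent components plus small noise
    scores = rng.standard_normal((200, 2))
    loadings = np.array([[1.0, 1.0, 0.0, 0.0, 1.0, 0.0],
                         [0.0, 0.0, 1.0, 1.0, 0.0, 1.0]])
    X = scores @ loadings + 0.05 * rng.standard_normal((200, 6))
    X -= X.mean(axis=0)

    # Cumulative explained variance from the singular values
    s = np.linalg.svd(X, compute_uv=False)
    explained = np.cumsum(s**2) / np.sum(s**2)
    dim = int(np.searchsorted(explained, 0.95) + 1)   # smallest k with >= 95%
    ```

    Cross-validated prediction error (PRESS) is the usual, more robust alternative to a fixed variance threshold.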

  20. Multiple Model Approaches to Modelling and Control,

    DEFF Research Database (Denmark)

    Why Multiple Models?This book presents a variety of approaches which produce complex models or controllers by piecing together a number of simpler subsystems. Thisdivide-and-conquer strategy is a long-standing and general way of copingwith complexity in engineering systems, nature and human probl...

  1. Model Checking of Boolean Process Models

    CERN Document Server

    Schneider, Christoph

    2011-01-01

    In the field of Business Process Management, formal models for the control flow of business processes have been designed for more than 15 years. Which methods are best suited to verify the bulk of these models? The first step is to select a formal language which fixes the semantics of the models. We adopt the language of Boolean systems as the reference language for Boolean process models. Boolean systems form a simple subclass of coloured Petri nets. Their characteristic features are a low number of tokens, to model states explicitly with a subsequent skipping of activations, and arbitrary logical rules of type AND, XOR, OR, etc. to model the split and join of the control flow. We apply model checking as a verification method for the safeness and liveness of Boolean systems. Model checking of Boolean systems uses the elementary theory of propositional logic; no modal operators are needed. Our verification builds on a finite complete prefix of a certain T-system attached to the Boolean system. It splits the processes of the Boolean sy...
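
    The kind of property being checked can be illustrated with a brute-force state-space exploration of a tiny token system with an XOR split and two branches. This is a generic sketch of explicit-state model checking, not the prefix-based method of the paper; the workflow and place names are made up.

    ```python
    from collections import deque

    # Tiny control-flow model: start --XOR--> {a} or {b}; a --> end; b --> end.
    transitions = [
        ({"start"}, [{"a"}, {"b"}]),   # XOR split: exactly one outcome fires
        ({"a"}, [{"end"}]),
        ({"b"}, [{"end"}]),
    ]

    def successors(marking):
        """All markings reachable in one step from the given marking."""
        for pre, posts in transitions:
            if pre <= marking:
                for post in posts:
                    yield frozenset((marking - pre) | post)

    def reachable(initial):
        """Breadth-first exploration of the full reachable state space."""
        seen = {frozenset(initial)}
        queue = deque(seen)
        while queue:
            m = queue.popleft()
            for n in successors(m):
                if n not in seen:
                    seen.add(n)
                    queue.append(n)
        return seen

    states = reachable({"start"})
    # Liveness-style check: every non-final marking has a successor (no deadlock)
    deadlock_free = all(
        m == frozenset({"end"}) or any(True for _ in successors(m))
        for m in states
    )
    ```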

  2. Pavement Aging Model by Response Surface Modeling

    Directory of Open Access Journals (Sweden)

    Manzano-Ramírez A.

    2011-10-01

    In this work, surface course aging was modeled by Response Surface Methodology (RSM). The Marshall specimens were placed in a conventional oven under time and temperature conditions established on the basis of the environmental factors of the region where the surface course is constructed, using AC-20 from the Ing. Antonio M. Amor refinery. Volatilized material (VM), load resistance increment (ΔL) and flow resistance increment (ΔF) models were developed by RSM. Cylindrical specimens with real aging were extracted from the surface course pilot to evaluate the error of the models. The VM model was adequate; in contrast, the ΔL and ΔF models were only nearly adequate, with an error of 20% that was associated with other environmental factors not considered at the beginning of the research.
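
    RSM typically fits a second-order polynomial in the design factors, here aging time and oven temperature. A minimal sketch with synthetic data (the coefficients and factor ranges are invented, not the paper's):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Second-order response surface: y = b0 + b1 t + b2 T + b3 t^2 + b4 T^2 + b5 tT
    t = rng.uniform(0, 48, 40)          # aging time, h (illustrative)
    T = rng.uniform(60, 120, 40)        # oven temperature, deg C (illustrative)
    true = np.array([1.0, 0.05, 0.02, -1e-4, 1e-4, 3e-4])

    D = np.column_stack([np.ones_like(t), t, T, t**2, T**2, t * T])
    y = D @ true + 0.01 * rng.standard_normal(40)   # "measured" response

    coef, *_ = np.linalg.lstsq(D, y, rcond=None)    # fit the surface
    pred = D @ coef
    r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
    ```

    In practice the design points come from a designed experiment (e.g. central composite), not random sampling as here.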

  3. Modelling and Forecasting Multivariate Realized Volatility

    DEFF Research Database (Denmark)

    Chiriac, Roxana; Voev, Valeri

    This paper proposes a methodology for modelling time series of realized covariance matrices in order to forecast multivariate risks. The approach allows for flexible dynamic dependence patterns and guarantees positive definiteness of the resulting forecasts without imposing parameter restrictions....... We provide an empirical application of the model, in which we show by means of stochastic dominance tests that the returns from an optimal portfolio based on the model's forecasts second-order dominate returns of portfolios optimized on the basis of traditional MGARCH models. This result implies...
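
    The positive-definiteness guarantee mentioned above comes from modelling a matrix decomposition rather than the covariance itself. A hedged sketch of the idea with a naive element-wise forecast of Cholesky factors (the data and the forecasting rule are illustrative, not the paper's dynamic model):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def random_cov(d=3):
        """A random symmetric positive-definite 'realized covariance'."""
        A = rng.standard_normal((d, d))
        return A @ A.T + d * np.eye(d)

    covs = [random_cov() for _ in range(20)]

    # Model the Cholesky elements instead of the covariance entries.
    chols = [np.linalg.cholesky(S) for S in covs]
    chol_forecast = np.mean(chols, axis=0)       # naive element-wise forecast
    cov_forecast = chol_forecast @ chol_forecast.T

    # Positive definite by construction: L L^T with invertible L
    eigvals = np.linalg.eigvalsh(cov_forecast)
    ```

    Forecasting the covariance entries directly, by contrast, can produce matrices with negative eigenvalues.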

  4. Magnetization curve modelling of soft magnetic alloys

    Energy Technology Data Exchange (ETDEWEB)

    Meszaros, I, E-mail: meszaros@eik.bme.hu [Department of Materials Science and Engineering, Budapest University of Technology and Economics, Bertalan L. street 7., Budapest, H-1111 (Hungary)

    2011-01-01

    In this paper we present an application of the so called hyperbolic model of magnetization. The model was modified and it was applied for nine different soft magnetic alloys. The tested samples were electro-technical steels (FeSi alloys) and a permalloy (FeNi alloy) with strongly different magnetic properties. Among them there are top, medium and definitely poor quality soft magnetic materials as well. Their minor hysteresis loops and normal magnetization curves were measured by alternating current measurement. The hyperbolic model of magnetization was applied for the experimental normal magnetization curves. It was proved that the applied model is excellent for describing mathematically the experimental magnetization curves.
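
    As a hedged illustration of a hyperbolic magnetization curve, consider the single-term form M(H) = Ms·tanh(H/H0); the actual model in the paper may use a sum of such terms, and the parameter values below are invented. Given Ms, the model linearizes, so H0 can be recovered by least squares:

    ```python
    import numpy as np

    # Single-term hyperbolic magnetization curve (illustrative parameters)
    Ms, H0 = 1.6, 250.0                      # saturation (T), field scale (A/m)
    H = np.linspace(-1000, 1000, 101)        # applied field, A/m
    M = Ms * np.tanh(H / H0)                 # "measured" normal curve

    # Linearization: atanh(M / Ms) = H / H0, a line through the origin
    z = np.arctanh(np.clip(M / Ms, -0.999999, 0.999999))
    slope = np.sum(z * H) / np.sum(H * H)    # least squares through origin
    H0_est = 1.0 / slope
    ```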

  5. 岩爆、冲击地压的定义、机制、分类及其定量预测模型%Definition, mechanism, classification and quantitative forecast model for rockburst and pressure bump

    Institute of Scientific and Technical Information of China (English)

    钱七虎

    2014-01-01

    As a review, this paper quotes the discourses of authoritative international experts on the mechanism and definition of rockburst. On this basis, and according to the different mechanisms of occurrence, rockbursts are divided into the sliding mode type, resulting from fault slip or shear fracture, and the strain mode type, resulting from the failure of the rock. Combined with specific accident cases, the mechanisms and characteristics of the two types of rockburst and of pressure bumps are analyzed. Finally, based on the mechanism analysis, results on the quantitative forecast of rock (coal) pillar strain mode rockburst, enclosing rock strain mode rockburst and sliding mode rockburst are introduced, including the author's latest research on the quantitative forecast and numerical simulation of strain mode rockburst after incompatible deformation, using a non-Euclidean geometry model.

  6. Model Validation Status Review

    Energy Technology Data Exchange (ETDEWEB)

    E.L. Hardin

    2001-11-28

    The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M&O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural and

  7. Product and Process Modelling

    DEFF Research Database (Denmark)

    Cameron, Ian T.; Gani, Rafiqul

    This book covers the area of product and process modelling via a case study approach. It addresses a wide range of modelling applications with emphasis on modelling methodology and the subsequent in-depth analysis of mathematical models to gain insight via structural aspects of the models....... These approaches are put into the context of life cycle modelling, where multiscale and multiform modelling is increasingly prevalent in the 21st century. The book commences with a discussion of modern product and process modelling theory and practice followed by a series of case studies drawn from a variety...... to biotechnology applications, food, polymer and human health application areas. The book highlights to important nature of modern product and process modelling in the decision making processes across the life cycle. As such it provides an important resource for students, researchers and industrial practitioners....

  8. Practical Marginalized Multilevel Models.

    Science.gov (United States)

    Griswold, Michael E; Swihart, Bruce J; Caffo, Brian S; Zeger, Scott L

    2013-01-01

    Clustered data analysis is characterized by the need to describe both systematic variation in a mean model and cluster-dependent random variation in an association model. Marginalized multilevel models embrace the robustness and interpretations of a marginal mean model, while retaining the likelihood inference capabilities and flexible dependence structures of a conditional association model. Although there has been increasing recognition of the attractiveness of marginalized multilevel models, there has been a gap in their practical application arising from a lack of readily available estimation procedures. We extend the marginalized multilevel model to allow for nonlinear functions in both the mean and association aspects. We then formulate marginal models through conditional specifications to facilitate estimation with mixed model computational solutions already in place. We illustrate the MMM and approximate MMM approaches on a cerebrovascular deficiency crossover trial using SAS and an epidemiological study on race and visual impairment using R. Datasets, SAS and R code are included as supplemental materials.

  9. Modelling Foundations and Applications

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 8th European Conference on Modelling Foundations and Applications, held in Kgs. Lyngby, Denmark, in July 2012. The 20 revised full foundations track papers and 10 revised full applications track papers presented were carefully reviewed...... and selected from 81 submissions. Papers on all aspects of MDE were received, including topics such as architectural modelling and product lines, code generation, domain-specic modeling, metamodeling, model analysis and verication, model management, model transformation and simulation. The breadth of topics...

  10. Modeling worldwide highway networks

    Science.gov (United States)

    Villas Boas, Paulino R.; Rodrigues, Francisco A.; da F. Costa, Luciano

    2009-12-01

    This Letter addresses the problem of modeling the highway systems of different countries by using complex networks formalism. More specifically, we compare two traditional geographical models with a modified geometrical network model where paths, rather than edges, are incorporated at each step between the origin and the destination vertices. Optimal configurations of parameters are obtained for each model and used for the comparison. The highway networks of Australia, Brazil, India, and Romania are considered and shown to be properly modeled by the modified geographical model.
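
    The simplest of the geographical models compared above connects nearby vertices. A minimal random geometric sketch (the node count and radius are illustrative; the paper's modified model grows paths rather than single edges):

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # N cities at random positions; an edge whenever two cities are
    # closer than a connection radius r.
    N, r = 100, 0.15
    pos = rng.random((N, 2))
    d = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
    adj = (d < r) & ~np.eye(N, dtype=bool)

    degrees = adj.sum(axis=1)
    mean_degree = degrees.mean()   # one statistic to compare against real networks
    ```

    Model comparison then proceeds by matching such network statistics (degree, clustering, path lengths) against the measured highway networks.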

  11. THE IMPROVED XINANJIANG MODEL

    Institute of Scientific and Technical Information of China (English)

    LI Zhi-jia; YAO Cheng; KONG Xiang-guang

    2005-01-01

    To improve the Xinanjiang model, runoff generation from infiltration excess is added to the model, introducing another six parameters. In principle, the improved Xinanjiang model can be used to simulate runoff in humid, semi-humid and also semi-arid regions. The application to the Yi River shows that the improved Xinanjiang model can forecast discharge with higher accuracy and satisfy practical requirements. It also shows that the improved model is reasonable.
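
    The mechanism added to the model, infiltration-excess (Hortonian) runoff, is simply rainfall intensity in excess of the infiltration capacity. A minimal sketch with invented values (the actual improved model has six calibrated parameters, not one):

    ```python
    import numpy as np

    f_cap = 8.0                                  # infiltration capacity, mm/h
    rain = np.array([2.0, 10.0, 15.0, 5.0])      # rainfall intensity, mm/h

    runoff = np.maximum(rain - f_cap, 0.0)       # infiltration-excess runoff
    infiltration = rain - runoff                 # water that enters the soil
    ```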

  12. Microsoft tabular modeling cookbook

    CERN Document Server

    Braak, Paul te

    2013-01-01

    This book follows a cookbook style with recipes explaining the steps for developing analytic data using Business Intelligence Semantic Models.This book is designed for developers who wish to develop powerful and dynamic models for users as well as those who are responsible for the administration of models in corporate environments. It is also targeted at analysts and users of Excel who wish to advance their knowledge of Excel through the development of tabular models or who wish to analyze data through tabular modeling techniques. We assume no prior knowledge of tabular modeling

  13. Five models of capitalism

    Directory of Open Access Journals (Sweden)

    Luiz Carlos Bresser-Pereira

    2012-03-01

    Besides analyzing capitalist societies historically and thinking of them in terms of phases or stages, we may compare different models or varieties of capitalism. In this paper I survey the literature on this subject, and distinguish the classification that has a production or business approach from those that use a mainly political criterion. I identify five forms of capitalism: among the rich countries, the liberal democratic or Anglo-Saxon model, the social or European model, and the endogenous social integration or Japanese model; among developing countries, I distinguish the Asian developmental model from the liberal-dependent model that characterizes most other developing countries, including Brazil.

  14. Holographic twin Higgs model.

    Science.gov (United States)

    Geller, Michael; Telem, Ofri

    2015-05-15

    We present the first realization of a "twin Higgs" model as a holographic composite Higgs model. Uniquely among composite Higgs models, the Higgs potential is protected by a new standard model (SM) singlet elementary "mirror" sector at the sigma model scale f and not by the composite states at m_{KK}, naturally allowing for m_{KK} beyond the LHC reach. As a result, naturalness in our model cannot be constrained by the LHC, but may be probed by precision Higgs measurements at future lepton colliders, and by direct searches for Kaluza-Klein excitations at a 100 TeV collider.

  15. Energy-consumption modelling

    Energy Technology Data Exchange (ETDEWEB)

    Reiter, E.R.

    1980-01-01

    A highly sophisticated and accurate approach is described to compute, on an hourly or daily basis, the energy consumption for space heating by individual buildings, urban sectors, and whole cities. The need for models, specifically weather-sensitive models, composite models, and space-heating models, is discussed. Development of the Colorado State University Model, based on heat-transfer equations and on a heuristic, adaptive, self-organizing computational learning approach, is described. Results of modeling energy consumption by the cities of Minneapolis and Cheyenne are given. Some data on energy consumption in individual buildings are included.
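
    The weather-sensitive core of such models can be sketched with a degree-hour calculation: hourly heating load proportional to the indoor-outdoor temperature difference. This is a generic illustration, not the CSU model; the loss coefficient and temperatures are invented.

    ```python
    def heating_energy(ua, t_set, t_out_hourly):
        """Space-heating energy over an hourly series.

        ua: overall heat-loss coefficient (kW/K, hypothetical); returns kWh.
        """
        return sum(max(t_set - t, 0.0) * ua for t in t_out_hourly)

    # One cold day: hourly outdoor temperatures (deg C)
    temps = [-5.0] * 6 + [0.0] * 12 + [-2.0] * 6
    e = heating_energy(ua=0.3, t_set=20.0, t_out_hourly=temps)
    ```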

  16. Empirical Model Building Data, Models, and Reality

    CERN Document Server

    Thompson, James R

    2011-01-01

    Praise for the First Edition "This...novel and highly stimulating book, which emphasizes solving real problems...should be widely read. It will have a positive and lasting effect on the teaching of modeling and statistics in general." - Short Book Reviews This new edition features developments and real-world examples that showcase essential empirical modeling techniques Successful empirical model building is founded on the relationship between data and approximate representations of the real systems that generated that data. As a result, it is essential for researchers who construct these m

  17. Multiscale modeling methods in biomechanics.

    Science.gov (United States)

    Bhattacharya, Pinaki; Viceconti, Marco

    2017-01-19

    More and more frequently, computational biomechanics deals with problems where the portion of physical reality to be modeled spans over such a large range of spatial and temporal dimensions, that it is impossible to represent it as a single space-time continuum. We are forced to consider multiple space-time continua, each representing the phenomenon of interest at a characteristic space-time scale. Multiscale models describe a complex process across multiple scales, and account for how quantities transform as we move from one scale to another. This review offers a set of definitions for this emerging field, and provides a brief summary of the most recent developments on multiscale modeling in biomechanics. Of all possible perspectives, we chose that of the modeling intent, which vastly affects the nature and the structure of each research activity. To this purpose we organized all papers reviewed into three categories: 'causal confirmation,' where multiscale models are used as materializations of the causation theories; 'predictive accuracy,' where multiscale modeling is aimed at improving the predictive accuracy; and 'determination of effect,' where multiscale modeling is used to model how a change at one scale manifests in an effect at another radically different space-time scale. Consistent with how the volume of computational biomechanics research is distributed across application targets, we extensively reviewed papers targeting the musculoskeletal and the cardiovascular systems, and covered only a few exemplary papers targeting other organ systems. The review shows a research subdomain still in its infancy, where causal confirmation papers remain the most common. For further resources related to this article, please visit the WIREs website.

  18. Towards a Formal Model of Context Awareness

    DEFF Research Database (Denmark)

    Kjærgaard, Mikkel Baun; Bunde-Pedersen, Jonathan

    2006-01-01

    There is a definite lack of formal support for modeling realistic context-awareness in pervasive computing applications. The CONAWA calculus presented in this paper provides mechanisms for modeling complex and interwoven sets of context-information by extending ambient calculus with new constructs...... and capabilities. The calculus is a step in the direction of making formal methods applicable in the area of pervasive computing....

  19. Incorporating Resilience into Dynamic Social Models

    Science.gov (United States)

    2016-07-20

    Keywords: resiliency, computational modeling, computational social science/systems, modeling and simulation. The relationships between random variables are given as conditional probability rules. BKBs are represented as a directed graph with... and BKB inferencing methods can be found in Santos et al [20]. A BKB is a directed, bipartite graph consisting...

  20. AXIOLOGICAL MODEL OF INSTRUCTIONAL DESIGN

    Directory of Open Access Journals (Sweden)

    Takushevich I. A.

    2015-10-01

    The article presents instructional design as a new approach to the issue of developing a value-oriented worldview. Scientific research and analysis led the author to summarize instructional design theory, broaden the definition of instructional design and apply it to instruction and learning in a new manner. The goal of building a pattern of instruction aimed at developing learners' value-oriented worldview required the author to study the existing instructional design model, to analyse and generalize a number of monographs and articles devoted to the problem of building value systems and value orientations, and finally to investigate and apply the new knowledge to real life in the form of an experiment. The work conducted brought the author to an axiological model of instructional design, which consists of three dimensions: a linear sequence of events from designing the instructional material to independent learning activities, interaction between a teacher and a learner, and the pace of learning and design. The article touches upon every dimension, level and stage of the model, describes and defines the procedures that take place in each of them, and suggests a possible way to visualize the model in the form of a sketch. The author also points out the advantages of using instructional design as an efficient and smart tool to organize learning and justifies the use of the new instructional design model in the XXI century.

  1. Mouse models of intracranial aneurysm.

    Science.gov (United States)

    Wang, Yutang; Emeto, Theophilus I; Lee, James; Marshman, Laurence; Moran, Corey; Seto, Sai-wang; Golledge, Jonathan

    2015-05-01

    Subarachnoid hemorrhage secondary to rupture of an intracranial aneurysm is a highly lethal medical condition. Current management strategies for unruptured intracranial aneurysms involve radiological surveillance and neurosurgical or endovascular interventions. There is no pharmacological treatment available to decrease the risk of aneurysm rupture and subsequent subarachnoid hemorrhage. There is growing interest in the pathogenesis of intracranial aneurysm focused on the development of drug therapies to decrease the incidence of aneurysm rupture. The study of rodent models of intracranial aneurysms has the potential to improve our understanding of intracranial aneurysm development and progression. This review summarizes current mouse models of intact and ruptured intracranial aneurysms and discusses the relevance of these models to human intracranial aneurysms. The article also reviews the importance of these models in investigating the molecular mechanisms involved in the disease. Finally, potential pharmaceutical targets for intracranial aneurysm suggested by previous studies are discussed. Examples of potential drug targets include matrix metalloproteinases, stromal cell-derived factor-1, tumor necrosis factor-α, the renin-angiotensin system and the β-estrogen receptor. An agreed clear, precise and reproducible definition of what constitutes an aneurysm in the models would assist in their use to better understand the pathology of intracranial aneurysm and applying findings to patients.

  2. Develop a Model Component

    Science.gov (United States)

    Ensey, Tyler S.

    2013-01-01

    During my internship at NASA, I was a model developer for Ground Support Equipment (GSE). The purpose of a model developer is to develop and unit test model component libraries (fluid, electrical, gas, etc.). The models are designed to simulate software for GSE (Ground Special Power, Crew Access Arm, Cryo, Fire and Leak Detection System, Environmental Control System (ECS), etc.) before they are implemented in hardware. These models support verifying local control and remote software for End-Item Software Under Test (SUT). Each model simulates the physical behavior (function, state, limits and I/O) of its end-item and its dependencies as defined in the Subsystem Interface Table, Software Requirements & Design Specification (SRDS), Ground Integrated Schematic (GIS), and System Mechanical Schematic (SMS). The software of each specific model component is simulated through MATLAB's Simulink program. The intensive model development life cycle is as follows: identify source documents; identify model scope; update schedule; preliminary design review; develop model requirements; update model scope; update schedule; detailed design review; create/modify library components; implement library component references; implement subsystem components; develop a test script; run the test script; develop a user's guide; send the model out for peer review; the model is then sent out for verification/validation; if there is empirical data, a validation data package is generated; if there is not, a verification package is generated; the test results are then reviewed; and finally, the user requests accreditation, and a statement of accreditation is prepared. Once each component model is reviewed and approved, they are intertwined into one integrated model. This integrated model is then itself tested, through a test script and autotest, so that it can be concluded that all models work conjointly for a single purpose. The component I was assigned, specifically, was a

  3. Biosphere Model Report

    Energy Technology Data Exchange (ETDEWEB)

    D.W. Wu; A.J. Smith

    2004-11-08

    The purpose of this report is to document the biosphere model, the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), which describes radionuclide transport processes in the biosphere and associated human exposure that may arise as the result of radionuclide release from the geologic repository at Yucca Mountain. The biosphere model is one of the process models that support the Yucca Mountain Project (YMP) Total System Performance Assessment (TSPA) for the license application (LA), TSPA-LA. The ERMYN provides the capability of performing human radiation dose assessments. This report documents the biosphere model, which includes: (1) Describing the reference biosphere, human receptor, exposure scenarios, and primary radionuclides for each exposure scenario (Section 6.1); (2) Developing a biosphere conceptual model using site-specific features, events, and processes (FEPs) (Section 6.2), the reference biosphere (Section 6.1.1), the human receptor (Section 6.1.2), and approximations (Sections 6.3.1.4 and 6.3.2.4); (3) Building a mathematical model using the biosphere conceptual model (Section 6.3) and published biosphere models (Sections 6.4 and 6.5); (4) Summarizing input parameters for the mathematical model, including the uncertainty associated with input values (Section 6.6); (5) Identifying improvements in the ERMYN compared with the model used in previous biosphere modeling (Section 6.7); (6) Constructing an ERMYN implementation tool (model) based on the biosphere mathematical model using GoldSim stochastic simulation software (Sections 6.8 and 6.9); (7) Verifying the ERMYN by comparing output from the software with hand calculations to ensure that the GoldSim implementation is correct (Section 6.10); (8) Validating the ERMYN by corroborating it with published biosphere models; comparing conceptual models, mathematical models, and numerical results (Section 7).

  4. Major Differences between the Jerome Model and the Horace Model

    Institute of Scientific and Technical Information of China (English)

    朱艳

    2014-01-01

    There are three famous translation models in the field of translation: the Jerome model, the Horace model and the Schleiermacher model. The production and development of the three models have significantly influenced translation. To identify the major differences between two western classical translation models, this paper discusses the Jerome model and the Horace model in depth.

  5. External model validation of binary clinical risk prediction models in cardiovascular and thoracic surgery.

    Science.gov (United States)

    Hickey, Graeme L; Blackstone, Eugene H

    2016-08-01

    Clinical risk-prediction models serve an important role in healthcare. They are used for clinical decision-making and measuring the performance of healthcare providers. To establish confidence in a model, external model validation is imperative. When designing such an external model validation study, thought must be given to patient selection, risk factor and outcome definitions, missing data, and the transparent reporting of the analysis. In addition, there are a number of statistical methods available for external model validation. Execution of a rigorous external validation study rests in proper study design, application of suitable statistical methods, and transparent reporting.
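
    The two pillars of external validation mentioned above, discrimination and calibration, can be sketched numerically. The cohort below is hypothetical, and the c-statistic and observed/expected ratio are only two of the statistical methods the authors survey:

```python
import numpy as np

def c_statistic(y_true, y_prob):
    """Discrimination: probability that a random event case is assigned
    a higher predicted risk than a random non-event case (the AUC)."""
    pos = y_prob[y_true == 1]
    neg = y_prob[y_true == 0]
    wins = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (wins + 0.5 * ties) / (len(pos) * len(neg))

def observed_expected_ratio(y_true, y_prob):
    """Calibration-in-the-large: observed events over total predicted risk."""
    return y_true.sum() / y_prob.sum()

# hypothetical external cohort: observed outcomes and model-predicted risks
y = np.array([0, 0, 1, 0, 1, 1, 0, 1])
p = np.array([0.1, 0.3, 0.7, 0.2, 0.9, 0.6, 0.4, 0.8])
print(c_statistic(y, p))              # 1.0 here: every event outranks every non-event
print(observed_expected_ratio(y, p))  # 1.0 here: total predicted risk matches observed events
```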

  6. Modelling cointegration in the vector autoregressive model

    DEFF Research Database (Denmark)

    Johansen, Søren

    2000-01-01

    A survey is given of some results obtained for the cointegrated VAR. The Granger representation theorem is discussed and the notions of cointegration and common trends are defined. The statistical model for cointegrated I(1) variables is defined, and it is shown how hypotheses on the cointegrating relations can be estimated under suitable identification conditions. The asymptotic theory is briefly mentioned and a few economic applications of the cointegration model are indicated.
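
    The idea of a cointegrating relation can be illustrated with a small simulation. This is an Engle–Granger-style least-squares sketch on synthetic data, not Johansen's maximum-likelihood procedure for the VAR:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
# a common stochastic trend (random walk) shared by both series
trend = np.cumsum(rng.normal(size=n))
x = trend + rng.normal(scale=0.5, size=n)
y = 2.0 * trend + rng.normal(scale=0.5, size=n)   # y - 2x is stationary

# least-squares estimate of the cointegrating coefficient (first Engle-Granger step)
beta = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
resid = y - beta * x

print(beta)                    # superconsistent: close to the true value 2.0
print(resid.var() < y.var())   # True: the residual does not inherit the common trend
```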

  7. The Control System Modeling Language

    CERN Document Server

    Zagar, K; Sekoranja, M; Tkacik, G; Vodovnik, A; Zagar, Klemen; Plesko, Mark; Sekoranja, Matej; Tkacik, Gasper; Vodovnik, Anze

    2001-01-01

    The well-known Unified Modeling Language (UML) describes software entities, such as interfaces, classes, operations and attributes, as well as relationships among them, e.g. inheritance, containment and dependency. The power of UML lies in Computer Aided Software Engineering (CASE) tools such as Rational Rose, which are also capable of generating software structures from visual object definitions and relations. UML also allows add-ons that define specific structures and patterns in order to steer and automate the design process. We have developed an add-on called Control System Modeling Language (CSML). It introduces entities and relationships that we know from control systems, such as "property" representing a single controllable point/channel, or an "event" specifying that a device is capable of notifying its clients through events. Entities can also possess CSML-specific characteristics, such as physical units and valid ranges for input parameters. CSML is independent of any specific language or technology...

  8. The Concept of Data Model Pattern Based on Fully Communication Oriented Information Modeling (FCO-IM

    Directory of Open Access Journals (Sweden)

    Fazat Nur Azizah

    2010-04-01

    Full Text Available Just as in many areas of software engineering, patterns have been used in data modeling to create high quality data models. We provide a concept of data model pattern based on Fully Communication Oriented Information Modeling (FCO-IM, a fact oriented data modeling method. A data model pattern is defined as the relation between context, problem, and solution. This definition is adopted from the concept of pattern by Christopher Alexander. We define the concept of Information Grammar for Pattern (IGP in the solution part of a pattern, which works as a template to create a data model. The IGP also shows how a pattern can relate to other patterns. The data model pattern concept is then used to describe 15 data model patterns, organized into 4 categories. A case study on geographical location is provided to show the use of the concept in a real case.

  9. Emissions Modeling Clearinghouse

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Emissions Modeling Clearinghouse (EMCH) supports and promotes emissions modeling activities both internal and external to the EPA. Through this site, the EPA...

  10. ASC Champ Orbit Model

    DEFF Research Database (Denmark)

    Riis, Troels; Jørgensen, John Leif

    1999-01-01

    This document describes a test of the implementation of the ASC orbit model for the Champ satellite.

  11. World Magnetic Model 2015

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The World Magnetic Model is the standard model used by the U.S. Department of Defense, the U.K. Ministry of Defence, the North Atlantic Treaty Organization (NATO)...

  12. Laboratory of Biological Modeling

    Data.gov (United States)

    Federal Laboratory Consortium — The Laboratory of Biological Modeling is defined by both its methodologies and its areas of application. We use mathematical modeling in many forms and apply it to...

  13. Model comparison in ANOVA.

    Science.gov (United States)

    Rouder, Jeffrey N; Engelhardt, Christopher R; McCabe, Simon; Morey, Richard D

    2016-12-01

    Analysis of variance (ANOVA), the workhorse analysis of experimental designs, consists of F-tests of main effects and interactions. Yet, testing, including traditional ANOVA, has been recently critiqued on a number of theoretical and practical grounds. In light of these critiques, model comparison and model selection serve as an attractive alternative. Model comparison differs from testing in that one can support a null or nested model vis-a-vis a more general alternative by penalizing more flexible models. We argue this ability to support more simple models allows for more nuanced theoretical conclusions than provided by traditional ANOVA F-tests. We provide a model comparison strategy and show how ANOVA models may be reparameterized to better address substantive questions in data analysis.
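
    The penalized-comparison idea can be sketched for the simplest one-way design. The BIC used here is one common penalty-based criterion (the paper's own strategy uses Bayes-factor model comparison); the data and effect size are made up:

```python
import numpy as np

rng = np.random.default_rng(1)
# two-group toy experiment with a genuine mean difference
g1 = rng.normal(0.0, 1.0, 50)
g2 = rng.normal(1.5, 1.0, 50)
y = np.concatenate([g1, g2])
n = len(y)

def bic(rss, k):
    # Gaussian BIC up to a constant: fit term plus a k*log(n) flexibility penalty
    return n * np.log(rss / n) + k * np.log(n)

rss_null = np.sum((y - y.mean()) ** 2)          # intercept-only (null) model
rss_full = (np.sum((g1 - g1.mean()) ** 2)
            + np.sum((g2 - g2.mean()) ** 2))    # separate-means (effect) model

# the more flexible model must overcome its penalty to be preferred
print(bic(rss_full, 2) < bic(rss_null, 1))  # True for this clear effect
```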

  14. Graphical Models with R

    DEFF Research Database (Denmark)

    Højsgaard, Søren; Edwards, David; Lauritzen, Steffen

    Graphical models in their modern form have been around since the late 1970s and appear today in many areas of the sciences. Along with the ongoing developments of graphical models, a number of different graphical modeling software programs have been written over the years. In recent years many of these software developments have taken place within the R community, either in the form of new packages or by providing an R interface to existing software. This book attempts to give the reader a gentle introduction to graphical modeling using R and the main features of some of these packages. In addition, the book provides examples of how more advanced aspects of graphical modeling can be represented and handled within R. Topics covered in the seven chapters include graphical models for contingency tables, Gaussian and mixed graphical models, Bayesian networks and modeling high dimensional data.

  15. Controlling Modelling Artifacts

    DEFF Research Database (Denmark)

    Smith, Michael James Andrew; Nielson, Flemming; Nielson, Hanne Riis

    2011-01-01

    When analysing the performance of a complex system, we typically build abstract models that are small enough to analyse, but still capture the relevant details of the system. But it is difficult to know whether the model accurately describes the real system, or if its behaviour is due to modelling artifacts that were inadvertently introduced. In this paper, we propose a novel methodology to reason about modelling artifacts, given a detailed model and a high-level (more abstract) model of the same system. By a series of automated abstraction steps, we lift the detailed model to the same state space, describing the possible configurations of the system (for example, by counting the number of components in a certain state). We motivate our methodology with a case study of the LMAC protocol for wireless sensor networks. In particular, we investigate the accuracy of a recently proposed high-level model of LMAC.

  16. Laboratory of Biological Modeling

    Data.gov (United States)

    Federal Laboratory Consortium — The Laboratory of Biological Modeling is defined by both its methodologies and its areas of application. We use mathematical modeling in many forms and apply it to a...

  17. Modeling EERE deployment programs

    Energy Technology Data Exchange (ETDEWEB)

    Cort, K. A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hostick, D. J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Belzer, D. B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Livingston, O. V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2007-11-01

    The purpose of the project was to identify and characterize the modeling of deployment programs within the EERE Technology Development (TD) programs, address possible improvements to the modeling process, and note gaps in knowledge for future research.

  18. Consistent model driven architecture

    Science.gov (United States)

    Niepostyn, Stanisław J.

    2015-09-01

    The goal of the MDA is to produce software systems from abstract models in a way where human interaction is restricted to a minimum. These abstract models are based on the UML language. However, the semantics of UML models is defined in a natural language, so verification of the consistency of these diagrams is needed in order to identify errors in requirements at an early stage of the development process. This verification is difficult due to the semi-formal nature of UML diagrams. We propose automatic verification of the consistency of a series of UML diagrams originating from abstract models, implemented with our consistency rules. This Consistent Model Driven Architecture approach enables us to automatically generate complete workflow applications from consistent and complete models developed from abstract models (e.g. a Business Context Diagram). Our method can therefore be used to check the practicability (feasibility) of software architecture models.

  19. Bounding species distribution models

    Directory of Open Access Journals (Sweden)

    Thomas J. STOHLGREN, Catherine S. JARNEVICH, Wayne E. ESAIAS, Jeffrey T. MORISETTE

    2011-10-01

    Full Text Available Species distribution models are increasing in popularity for mapping suitable habitat for species of management concern. Many investigators now recognize that extrapolations of these models with geographic information systems (GIS) might be sensitive to the environmental bounds of the data used in their development, yet there is no recommended best practice for “clamping” model extrapolations. We relied on two commonly used modeling approaches: classification and regression tree (CART) and maximum entropy (Maxent) models, and we tested a simple alteration of the model extrapolations, bounding extrapolations to the maximum and minimum values of primary environmental predictors, to provide a more realistic map of suitable habitat of hybridized Africanized honey bees in the southwestern United States. Findings suggest that multiple models of bounding, and the most conservative bounding of species distribution models, like those presented here, should probably replace the unbounded or loosely bounded techniques currently used [Current Zoology 57 (5): 642–647, 2011].
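
    The “clamping” alteration tested in this study amounts to truncating each projection-grid predictor at the bounds observed in the training data. A minimal sketch with hypothetical predictor values:

```python
import numpy as np

# hypothetical training sites: columns are two environmental predictors
train_env = np.array([[5.0, 200.0],
                      [15.0, 800.0],
                      [25.0, 400.0]])
lo, hi = train_env.min(axis=0), train_env.max(axis=0)

# projection grid containing conditions outside the fitted range
proj_env = np.array([[2.0, 1000.0],
                     [18.0, 300.0]])

# bound extrapolations to the min/max of the primary predictors
bounded = np.clip(proj_env, lo, hi)
print(bounded)  # out-of-range cells are pulled back to [5, 25] and [200, 800]
```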

  20. CCF model comparison

    Energy Technology Data Exchange (ETDEWEB)

    Pulkkinen, U. [VTT Industrial Systems (Finland)

    2004-04-01

    The report describes a simple comparison of two CCF models, the ECLM and the Beta-model. The objective of the comparison is to identify differences in the results of the models by applying them to some simple test data cases. The comparison focuses mainly on theoretical aspects of the above-mentioned CCF models. The properties of the model parameter estimates in the data cases are also discussed. The practical aspects of using and estimating CCF models in a real PSA context (e.g. data interpretation, properties of computer tools, model documentation) are not discussed in the report, nor are the qualitative CCF analyses needed in using the models. (au)
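
    The report does not reproduce the models' equations, but the core idea of the Beta-model (commonly called the beta-factor model) can be sketched; all parameter values below are hypothetical:

```python
# beta-factor model: a fraction beta of each component's failure
# probability is a common-cause contribution shared by all redundant trains
q_total = 1e-3     # total failure probability per component (hypothetical)
beta = 0.1         # common-cause fraction (hypothetical)

q_ccf = beta * q_total          # all trains fail together
q_ind = (1 - beta) * q_total    # independent residual

# probability that both trains of a two-train redundancy fail
p_both = q_ccf + q_ind ** 2
p_naive = q_total ** 2          # what a fully independent model would give
print(p_both / p_naive)         # the common-cause term dominates (~100x here)
```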

  1. Chip Multithreaded Consistency Model

    Institute of Scientific and Technical Information of China (English)

    Zu-Song Li; Dan-Dan Huan; Wei-Wu Hu; Zhi-Min Tang

    2008-01-01

    Multithreading is the developing trend of high-performance processors, and the memory consistency model is essential to the correctness, performance, and complexity of a multithreaded processor. This paper proposes a chip multithreaded consistency model adapted to multithreaded processors. The restrictions that chip multithreaded consistency imposes on memory event ordering are presented and formalized. Using the critical-cycle idea of Wei-Wu Hu, we prove that the proposed chip multithreaded consistency model satisfies the correctness criterion of the sequential consistency model. The chip multithreaded consistency model offers a way of achieving higher performance than sequential consistency while ensuring software compatibility: the execution result on a multithreaded processor is the same as on a uniprocessor. An implementation strategy for the model in the Godson-2 SMT processor is also proposed; Godson-2 supports chip multithreaded consistency correctly through an exception scheme based on each thread's sequential memory access queue.

  2. Models (Part 1).

    Science.gov (United States)

    Callison, Daniel

    2002-01-01

    Defines models and describes information search models that can be helpful to instructional media specialists in meeting users' abilities and information needs. Explains pathfinders and Kuhlthau's information search process, including the pre-writing information search process. (LRW)

  3. Modeling DNA Replication.

    Science.gov (United States)

    Bennett, Joan

    1998-01-01

    Recommends the use of a model of DNA made out of Velcro to help students visualize the steps of DNA replication. Includes a materials list, construction directions, and details of the demonstration using the model parts. (DDR)

  4. Modeling Complex Time Limits

    Directory of Open Access Journals (Sweden)

    Oleg Svatos

    2013-01-01

    Full Text Available In this paper we analyze the complexity of the time limits found especially in the regulated processes of public administration. First we review the most popular process modeling languages. We define an example scenario based on current Czech legislation and capture it in the process modeling languages discussed. The analysis shows that contemporary process modeling languages support capturing time limits only partially, which causes trouble for analysts and unnecessary complexity in the models. Given these unsatisfactory results, we analyze the complexity of time limits in greater detail and outline the lifecycle of a time limit using the multiple dynamic generalizations pattern. As an alternative to the popular process modeling languages we present the PSD process modeling language, which supports the defined time-limit lifecycles natively and therefore keeps the models simple and easy to understand.

  5. World Magnetic Model 2010

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The World Magnetic Model is the standard model used by the U.S. Department of Defense, the U.K. Ministry of Defence, the North Atlantic Treaty Organization (NATO)...

  6. Bounding Species Distribution Models

    Science.gov (United States)

    Stohlgren, Thomas J.; Jarnevich, Catherine S.; Morisette, Jeffrey T.; Esaias, Wayne E.

    2011-01-01

    Species distribution models are increasing in popularity for mapping suitable habitat for species of management concern. Many investigators now recognize that extrapolations of these models with geographic information systems (GIS) might be sensitive to the environmental bounds of the data used in their development, yet there is no recommended best practice for "clamping" model extrapolations. We relied on two commonly used modeling approaches: classification and regression tree (CART) and maximum entropy (Maxent) models, and we tested a simple alteration of the model extrapolations, bounding extrapolations to the maximum and minimum values of primary environmental predictors, to provide a more realistic map of suitable habitat of hybridized Africanized honey bees in the southwestern United States. Findings suggest that multiple models of bounding, and the most conservative bounding of species distribution models, like those presented here, should probably replace the unbounded or loosely bounded techniques currently used [Current Zoology 57 (5): 642-647, 2011].

  7. The ATLAS Analysis Model

    CERN Multimedia

    Amir Farbin

    The ATLAS Analysis Model is a continually developing vision of how to reconcile physics analysis requirements with the constraints of the ATLAS offline software and computing model. In the past year this vision has influenced the evolution of the ATLAS Event Data Model, the Athena software framework, and physics analysis tools. These developments, along with the October Analysis Model Workshop and the planning for CSC analyses, have led to a rapid refinement of the ATLAS Analysis Model in the past few months. This article introduces some of the relevant issues and presents the current vision of the future ATLAS Analysis Model. Event Data Model The ATLAS Event Data Model (EDM) consists of several levels of detail, each targeted at a specific set of tasks. For example, the Event Summary Data (ESD) stores calorimeter cells and tracking-system hits, thereby permitting many calibration and alignment tasks, but will be accessible only at particular computing sites with potentially large latency. In contrast, the Analysis...

  8. LAT Background Models

    Data.gov (United States)

    National Aeronautics and Space Administration — The Galactic model is a spatial and spectral template. The model for the Galactic diffuse emission was developed using spectral line surveys of HI and CO (as a...

  9. NUMERICAL SIMULATION AND MODELING OF UNSTEADY FLOW ...

    African Journals Online (AJOL)

    2014-06-30

    Jun 30, 2014 ... Modeling is by definition an approximation of reality, so its results are ... The values of lift coefficient were improved after modifications of the .... of static pressure is defined boundary conditions at the origin of the variation of ...

  10. Measurements and Information in Spin Foam Models

    CERN Document Server

    Garcia-Islas, J Manuel

    2012-01-01

    We present a problem relating measurements and information theory in spin foam models. In the three dimensional case of quantum gravity we can compute probabilities of spin network graphs and study the behaviour of the Shannon entropy associated to the corresponding information. We present a general definition, compute the Shannon entropy of some examples, and find some interesting inequalities.
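
    The Shannon entropy referred to here is the standard one over a normalized probability distribution. A toy calculation, with made-up weights standing in for the spin network probabilities:

```python
import math

def shannon_entropy(probs):
    """H = -sum p*log2(p) for a normalized distribution (in bits)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# hypothetical unnormalized weights for four spin network configurations
weights = [4.0, 2.0, 1.0, 1.0]
z = sum(weights)                    # normalization (partition-function role)
probs = [w / z for w in weights]

h = shannon_entropy(probs)
print(h)  # 1.75 bits; the uniform distribution gives the upper bound log2(4) = 2
```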

  11. Compositional design and reuse of a generic agent model

    NARCIS (Netherlands)

    Brazier, F.M.T.; Jonker, C.M.; Treur, J.

    2000-01-01

    This article introduces a formally specified design of a compositional generic agent model (GAM). This agent model abstracts from specific application domains; it provides a unified formal definition of a model for weak agenthood. It can be (re) used as a template or pattern for a large variety of a

  12. q-Deformation of Lorentzian spin foam models

    CERN Document Server

    Fairbairn, Winston J

    2011-01-01

    We construct and analyse a quantum deformation of the Lorentzian EPRL model. The model is based on the representation theory of the quantum Lorentz group with real deformation parameter. We give a definition of the quantum EPRL intertwiner, study its convergence and braiding properties and construct an amplitude for the four-simplexes. We find that the resulting model is finite.

  13. Visualisation of Domain-Specific Modelling Languages Using UML

    NARCIS (Netherlands)

    Graaf, B.; Van Deursen, A.

    2006-01-01

    Currently, general-purpose modelling tools are often only used to draw diagrams for the documentation. The introduction of model-driven software development approaches involves the definition of domain-specific modelling languages that allow code generation. Although graphical representations of the

  14. The research on HRM model of geosciences engineering perambulation enterprise

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    First, this paper defines the geosciences engineering perambulation enterprise, which is a type of knowledge-based enterprise. It then summarizes the general HRM models presented by other researchers and, building on those models, constructs a new HRM model for the geosciences engineering perambulation enterprise.

  15. Photovoltaic sources modeling

    CERN Document Server

    Petrone, Giovanni; Spagnuolo, Giovanni

    2016-01-01

    This comprehensive guide surveys all available models for simulating a photovoltaic (PV) generator at different levels of granularity, from cell to system level, in uniform as well as in mismatched conditions. Providing a thorough comparison among the models, engineers have all the elements needed to choose the right PV array model for specific applications or environmental conditions matched with the model of the electronic circuit used to maximize the PV power production.

  16. Toric models of graphs

    CERN Document Server

    Buczyńska, Weronika

    2010-01-01

    We define the toric projective model of a trivalent graph as a generalization of the binary symmetric model of a trivalent phylogenetic tree. Generators of the projective coordinate rings of the models of graphs with one cycle are explicitly described. Models of graphs with the same topological invariants are deformation equivalent and share the same Hilbert function. We also provide an algorithm to compute the Hilbert function.

  17. Model of magnetostrictive actuator

    Institute of Scientific and Technical Information of China (English)

    LI Lin; ZHANG Yuan-yuan

    2005-01-01

    The hysteresis of a magnetostrictive actuator was studied. A mathematical model of the hysteresis loop was obtained on the basis of experiment. The model depends on the frequency and amplitude of the alternating current input to the magnetostrictive actuator. Based on the model, the effect of hysteresis on the dynamic output of the magnetostrictive actuator was investigated. Finally, the paper discusses how to account for hysteresis and establish a dynamic model of a magnetostrictive actuator system when a practical system is designed and applied.

  18. GIS Conceptual Data Model

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    In order to set up a conceptual data model that reflects the real world as accurately as possible, this paper first reviews and analyzes the disadvantages of the conceptual data models used by traditional GIS to simulate geographic space, gives a new explanation of geographic space, and analyzes its essential characteristics. Finally, it proposes several detailed key points for designing a new type of GIS data model and gives a simple holistic GIS data model.

  19. Modeling Digital Video Database

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    The main purpose of the model is to present how the Unified Modeling Language (UML) can be used to model a digital video database system (VDBS). It demonstrates the modeling process that can be followed during the analysis phase of complex applications. In order to guarantee continuity in the mapping of the models, the authors propose some suggestions for transforming the use case diagrams into an object diagram, which is one of the main diagrams for the next development phases.

  20. Hierarchical Bass model

    Science.gov (United States)

    Tashiro, Tohru

    2014-03-01

    We propose a new model of product diffusion that includes a memory of how many adopters or advertisements a non-adopter has encountered, where (non-)adopters are people who do (not) possess the product. This effect is absent from the Bass model. As an application, we use the model to fit iPod sales data and obtain better agreement than with the Bass model.
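
    For reference, the classic Bass dynamics that this model extends can be sketched in discrete time; the innovation coefficient p, imitation coefficient q, and market potential m below are illustrative, not fitted values:

```python
import numpy as np

def bass_adopters(p, q, m, periods):
    """Discrete-time Bass model: new adopters per period.
    p: innovation coefficient, q: imitation coefficient, m: market potential."""
    cum, new = 0.0, []
    for _ in range(periods):
        n_t = (p + q * cum / m) * (m - cum)  # adoption hazard times remaining market
        new.append(n_t)
        cum += n_t
    return np.array(new)

sales = bass_adopters(p=0.03, q=0.38, m=1000.0, periods=30)
print(int(np.argmax(sales)))      # adoption peaks mid-diffusion (bell-shaped sales curve)
print(round(float(sales.sum())))  # cumulative adoption approaches m = 1000
```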