WorldWideScience

Sample records for model model definition

  1. Conceptualising Business Models: Definitions, Frameworks and Classifications

    Directory of Open Access Journals (Sweden)

    Erwin Fielt

    2013-12-01

    The business model concept is gaining traction in different disciplines but is still criticized for being fuzzy and vague and lacking consensus on its definition and compositional elements. In this paper we set out to advance our understanding of the business model concept by addressing three areas of foundational research: business model definitions, business model elements, and business model archetypes. We define a business model as a representation of the value logic of an organization in terms of how it creates and captures customer value. This abstract and generic definition is made more specific and operational by the compositional elements, which need to address the customer, value proposition, organizational architecture (firm and network level), and economics dimensions. Business model archetypes complement the definition and elements by providing a more concrete and empirical understanding of the business model concept. The main contributions of this paper are (1) explicitly including the customer value concept in the business model definition and focussing on value creation, (2) presenting four core dimensions that business model elements need to cover, (3) arguing for flexibility by adapting and extending business model elements to cater for different purposes and contexts (e.g. technology, innovation, strategy), (4) stressing a more systematic approach to business model archetypes by using business model elements for their description, and (5) suggesting the use of business model archetype research for the empirical exploration and testing of business model elements and their relationships.

  2. Conceptualising Business Models: Definitions, Frameworks and Classifications

    OpenAIRE

    Erwin Fielt

    2013-01-01

    The business model concept is gaining traction in different disciplines but is still criticized for being fuzzy and vague and lacking consensus on its definition and compositional elements. In this paper we set out to advance our understanding of the business model concept by addressing three areas of foundational research: business model definitions, business model elements, and business model archetypes. We define a business model as a representation of the value logic of an organization in...

  3. Translating building information modeling to building energy modeling using model view definition.

    Science.gov (United States)

    Jeong, WoonSeong; Kim, Jong Bum; Clayton, Mark J; Haberl, Jeff S; Yan, Wei

    2014-01-01

    This paper presents a new approach to translate between Building Information Modeling (BIM) and Building Energy Modeling (BEM) that uses Modelica, an object-oriented, declarative, equation-based simulation environment. The approach (BIM2BEM) has been developed using a data modeling method to enable seamless model translations of building geometry, materials, and topology. Using data modeling, we created a Model View Definition (MVD) consisting of a process model and a class diagram. The process model demonstrates object-mapping between BIM and Modelica-based BEM (ModelicaBEM) and facilitates the definition of required information during model translations. The class diagram represents the information and object relationships to produce a class package intermediate between the BIM and BEM. The implementation of the intermediate class package enables system interface (Revit2Modelica) development for automatic BIM data translation into ModelicaBEM. In order to demonstrate and validate our approach, simulation result comparisons have been conducted via three test cases using (1) the BIM-based Modelica models generated from Revit2Modelica and (2) BEM models manually created using the LBNL Modelica Buildings library. Our implementation shows that BIM2BEM (1) enables BIM models to be translated into ModelicaBEM models, (2) enables system interface development based on the MVD for thermal simulation, and (3) facilitates the reuse of original BIM data in building energy simulation without an import/export process.

  4. Translating Building Information Modeling to Building Energy Modeling Using Model View Definition

    Directory of Open Access Journals (Sweden)

    WoonSeong Jeong

    2014-01-01

    This paper presents a new approach to translate between Building Information Modeling (BIM) and Building Energy Modeling (BEM) that uses Modelica, an object-oriented, declarative, equation-based simulation environment. The approach (BIM2BEM) has been developed using a data modeling method to enable seamless model translations of building geometry, materials, and topology. Using data modeling, we created a Model View Definition (MVD) consisting of a process model and a class diagram. The process model demonstrates object-mapping between BIM and Modelica-based BEM (ModelicaBEM) and facilitates the definition of required information during model translations. The class diagram represents the information and object relationships to produce a class package intermediate between the BIM and BEM. The implementation of the intermediate class package enables system interface (Revit2Modelica) development for automatic BIM data translation into ModelicaBEM. In order to demonstrate and validate our approach, simulation result comparisons have been conducted via three test cases using (1) the BIM-based Modelica models generated from Revit2Modelica and (2) BEM models manually created using the LBNL Modelica Buildings library. Our implementation shows that BIM2BEM (1) enables BIM models to be translated into ModelicaBEM models, (2) enables system interface development based on the MVD for thermal simulation, and (3) facilitates the reuse of original BIM data in building energy simulation without an import/export process.

  5. Promoting Model-based Definition to Establish a Complete Product Definition.

    Science.gov (United States)

    Ruemler, Shawn P; Zimmerman, Kyle E; Hartman, Nathan W; Hedberg, Thomas; Feeny, Allison Barnard

    2017-05-01

    The manufacturing industry is evolving and starting to use 3D models as the central knowledge artifact for product data and product definition, or what is known as Model-based Definition (MBD). The Model-based Enterprise (MBE) uses MBD as a way to transition away from using traditional paper-based drawings and documentation. As MBD grows in popularity, it is imperative to understand what information is needed in the transition from drawings to models so that models represent all the relevant information needed for processes to continue efficiently. Finding this information can help define what data is common amongst different models in different stages of the lifecycle, which could help establish a Common Information Model. The Common Information Model is a source that contains common information from domain specific elements amongst different aspects of the lifecycle. To help establish this Common Information Model, information about how models are used in industry within different workflows needs to be understood. To retrieve this information, a survey mechanism was administered to industry professionals from various sectors. Based on the results of the survey a Common Information Model could not be established. However, the results gave great insight that will help in further investigation of the Common Information Model.

  6. Weak Memory Models: Balancing Definitional Simplicity and Implementation Flexibility

    OpenAIRE

    Zhang, Sizhuo; Vijayaraghavan, Muralidaran; Arvind

    2017-01-01

    The memory model for RISC-V, a newly developed open-source ISA, has not yet been finalized and thus offers an opportunity to evaluate existing memory models. We believe RISC-V should not adopt the memory models of POWER or ARM, because their axiomatic and operational definitions are too complicated. We propose two new weak memory models: WMM and WMM-S, which balance definitional simplicity and implementation flexibility differently. Both allow all instruction reorderings except overtaking of...

  7. What Is A Homosexual? A Definitional Model.

    Science.gov (United States)

    Berger, Raymond M.

    1983-01-01

    Presents a definitional model to explain homosexuality and discusses its implications for practice. Contends that social workers must discard the traditional binary model of heterosexual versus homosexual for one incorporating relevant psychosocial factors including life experiences, social reaction, and association with others. (Author/JAC)

  8. Formal Definition of Measures for BPMN Models

    Science.gov (United States)

    Reynoso, Luis; Rolón, Elvira; Genero, Marcela; García, Félix; Ruiz, Francisco; Piattini, Mario

    Business process models are currently attaining more relevance, and more attention is therefore being paid to their quality. This situation led us to define a set of measures for the understandability of BPMN models, which was presented in a previous work. We focus on understandability since a model must be well understood before any changes are made to it. These measures were originally defined informally in natural language. As is well known, natural language is ambiguous and may lead to misunderstandings and a misinterpretation of the concepts captured by a measure and the way in which the measure value is obtained. This has motivated us to provide a formal definition of the proposed measures using OCL (Object Constraint Language) upon the BPMN (Business Process Modeling Notation) metamodel presented in this paper. The main advantages and lessons learned (which were obtained both from the current work and from previous works carried out in relation to the formal definition of other measures) are also summarized.

  9. A definitional framework for the human/biometric sensor interaction model

    Science.gov (United States)

    Elliott, Stephen J.; Kukula, Eric P.

    2010-04-01

    Existing definitions for biometric testing and evaluation do not fully explain errors in a biometric system. This paper provides a definitional framework for the Human Biometric-Sensor Interaction (HBSI) model. This paper proposes six new definitions based around two classifications of presentations, erroneous and correct. The new terms are: defective interaction (DI), concealed interaction (CI), false interaction (FI), failure to detect (FTD), failure to extract (FTX), and successfully acquired samples (SAS). As with all definitions, the new terms require a modification to the general biometric model developed by Mansfield and Wayman [1].
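The six terms above partition presentations by whether they were correct or erroneous and by how the system responded. A minimal sketch of that taxonomy as a decision function follows; the exact branch conditions distinguishing concealed (CI) from false (FI) interactions are this editor's reading of the framework, not taken verbatim from Elliott and Kukula.

```python
def classify_presentation(correct: bool, detected: bool, handled: bool) -> str:
    """Classify a single biometric presentation under the HBSI taxonomy.

    correct  -- the user presented the biometric trait as intended
    detected -- the sensor/system registered the presentation
    handled  -- features were extracted (correct case) or the error was
                flagged by the system (erroneous case); an assumption here
    """
    if correct:
        if not detected:
            return "FTD"   # failure to detect
        if not handled:
            return "FTX"   # failure to extract
        return "SAS"       # successfully acquired sample
    if not detected:
        return "DI"        # defective interaction
    return "FI" if handled else "CI"  # false vs. concealed interaction
```

For example, a correct presentation that the sensor registers and fully processes maps to SAS, while an erroneous presentation the sensor never registers maps to DI.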

  10. Livrable D1.2 of the PERSEE project : Perceptual-Modelling-Definition-of-the-Models

    OpenAIRE

    Wang , Junle; Bosc , Emilie; Li , Jing; Ricordel , Vincent

    2011-01-01

    Deliverable D1.2 of the ANR PERSEE project. This report was produced within the ANR PERSEE project (no. ANR-09-BLAN-0170) and corresponds to deliverable D1.2 of the project. Its title: Perceptual-Modelling-Definition-of-the-Models

  11. A consensus definition of cataplexy in mouse models of narcolepsy.

    Science.gov (United States)

    Scammell, Thomas E; Willie, Jon T; Guilleminault, Christian; Siegel, Jerome M

    2009-01-01

    People with narcolepsy often have episodes of cataplexy, brief periods of muscle weakness triggered by strong emotions. Many researchers are now studying mouse models of narcolepsy, but definitions of cataplexy-like behavior in mice differ across labs. To establish a common language, the International Working Group on Rodent Models of Narcolepsy reviewed the literature on cataplexy in people with narcolepsy and in dog and mouse models of narcolepsy and then developed a consensus definition of murine cataplexy. The group concluded that murine cataplexy is an abrupt episode of nuchal atonia lasting at least 10 seconds. In addition, theta activity dominates the EEG during the episode, and video recordings document immobility. To distinguish a cataplexy episode from REM sleep after a brief awakening, at least 40 seconds of wakefulness must precede the episode. Bouts of cataplexy fitting this definition are common in mice with disrupted orexin/hypocretin signaling, but these events almost never occur in wild type mice. It remains unclear whether murine cataplexy is triggered by strong emotions or whether mice remain conscious during the episodes as in people with narcolepsy. This working definition provides helpful insights into murine cataplexy and should allow objective and accurate comparisons of cataplexy in future studies using mouse models of narcolepsy.
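The consensus criteria above can be expressed as a simple predicate. The thresholds (at least 10 s of nuchal atonia, at least 40 s of preceding wakefulness) come from the abstract; the flat argument representation is a hypothetical simplification of what a scoring pipeline would extract from EEG/EMG and video.

```python
def is_murine_cataplexy(prior_wake_s: float,
                        atonia_s: float,
                        theta_dominant: bool,
                        immobile_on_video: bool) -> bool:
    """Return True if an episode meets the working definition of murine cataplexy."""
    return (prior_wake_s >= 40       # rules out REM sleep after a brief awakening
            and atonia_s >= 10       # abrupt nuchal atonia lasting >= 10 seconds
            and theta_dominant       # theta activity dominates the EEG
            and immobile_on_video)   # video recording documents immobility
```

An episode preceded by only 20 s of wakefulness would be rejected, consistent with the group's concern about confusing cataplexy with REM sleep after a brief awakening.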

  12. Towards a Definition of Role-related Concepts for Business Modeling

    NARCIS (Netherlands)

    Meertens, Lucas Onno; Iacob, Maria Eugenia; Nieuwenhuis, Lambertus Johannes Maria

    2010-01-01

    Abstract—While several role-related concepts play an important role in business modeling, their definitions, relations, and use differ greatly between languages, papers, and reports. Due to this, the knowledge captured by models is not transferred correctly, and models are incomparable. In this

  13. Integral definition of transition time in the Landau-Zener model

    International Nuclear Information System (INIS)

    Yan Yue; Wu Biao

    2010-01-01

    We give a general definition for the transition time in the Landau-Zener model. This definition allows us to compute numerically the Landau-Zener transition time at any sweeping rate without ambiguity in both diabatic and adiabatic bases. With this new definition, analytical results are obtained in both the adiabatic limit and the sudden limit.

  14. Building a Shared Definitional Model of Long Duration Human Spaceflight

    Science.gov (United States)

    Orr, M.; Whitmire, A.; Sandoval, L.; Leveton, L.; Arias, D.

    2011-01-01

    In 1956, on the eve of human space travel, Strughold first proposed a simple classification of the present and future stages of manned flight that identified key factors, risks and developmental stages for the evolutionary journey ahead. As we look to optimize the potential of the ISS as a gateway to new destinations, we need a current shared working definitional model of long duration human space flight to help guide our path. An initial search of formal and grey literature was augmented by liaison with subject matter experts. The search strategy focused on the terms "long duration mission" and "long duration spaceflight", as well as broader related current and historical definitions and classification models of spaceflight. The related sea and air travel literature was subsequently explored with a view to identifying analogous models or classification systems. There are multiple different definitions and classification systems for spaceflight, including phase and type of mission, craft and payload, and related risk management models. However, the frequently used concepts of long duration mission and long duration spaceflight are infrequently operationally defined by authors, and no commonly referenced classical or gold standard definition or model of these terms emerged from the search. The categorization (Cat) system for sailing was found to be of potential analogous utility, with its focus on understanding the need for crew and craft autonomy at various levels of potential adversity and inability to gain outside support or return to a safe location, due to factors of time, distance and location.

  15. A Model-Free Definition of Increasing Uncertainty

    NARCIS (Netherlands)

    Grant, S.; Quiggin, J.

    2001-01-01

    We present a definition of increasing uncertainty, independent of any notion of subjective probabilities, or of any particular model of preferences.Our notion of an elementary increase in the uncertainty of any act corresponds to the addition of an 'elementary bet' which increases consumption by a

  16. Weak Memory Models with Matching Axiomatic and Operational Definitions

    OpenAIRE

    Zhang, Sizhuo; Vijayaraghavan, Muralidaran; Lustig, Dan; Arvind

    2017-01-01

    Memory consistency models are notorious for being difficult to define precisely, to reason about, and to verify. More than a decade of effort has gone into nailing down the definitions of the ARM and IBM Power memory models, and yet there still remain aspects of those models which (perhaps surprisingly) remain unresolved to this day. In response to these complexities, there has been somewhat of a recent trend in the (general-purpose) architecture community to limit new memory models to being ...

  17. Development of a definition, classification system, and model for cultural geology

    Science.gov (United States)

    Mitchell, Lloyd W., III

    The concept for this study is based upon a personal interest by the author, an American Indian, in promoting cultural perspectives in undergraduate college teaching and learning environments. Most academicians recognize that merged fields can enhance undergraduate curricula. However, conflict may occur when instructors attempt to merge social science fields such as history or philosophy with geoscience fields such as mining and geomorphology. For example, ideologies of Earth structures derived from scientific methodologies may conflict with historical and spiritual understandings of Earth structures held by American Indians. Specifically, this study addresses the problem of how to combine cultural studies with the geosciences into a new merged academic discipline called cultural geology. This study further attempts to develop the merged field of cultural geology using an approach consisting of three research foci: a definition, a classification system, and a model. Literature reviews were conducted for all three foci. Additionally, to better understand merged fields, a literature review was conducted specifically for academic fields that merged social and physical sciences. Methodologies concentrated on the three research foci: definition, classification system, and model. The definition was derived via a two-step process. The first step, developing keyword hierarchical ranking structures, was followed by creating and analyzing semantic word meaning lists. The classification system was developed by reviewing 102 classification systems and incorporating selected components into a system framework. The cultural geology model was created also utilizing a two-step process. A literature review of scientific models was conducted. Then, the definition and classification system were incorporated into a model felt to reflect the realm of cultural geology. A course syllabus was then developed that incorporated the resulting definition, classification system, and model. 

  18. Health literacy and public health: A systematic review and integration of definitions and models

    LENUS (Irish Health Repository)

    Sorensen, Kristine

    2012-01-25

    Abstract Background Health literacy concerns the knowledge and competences of persons to meet the complex demands of health in modern society. Although its importance is increasingly recognised, there is no consensus about the definition of health literacy or about its conceptual dimensions, which limits the possibilities for measurement and comparison. The aim of the study is to review definitions and models on health literacy to develop an integrated definition and conceptual model capturing the most comprehensive evidence-based dimensions of health literacy. Methods A systematic literature review was performed to identify definitions and conceptual frameworks of health literacy. A content analysis of the definitions and conceptual frameworks was carried out to identify the central dimensions of health literacy and develop an integrated model. Results The review resulted in 17 definitions of health literacy and 12 conceptual models. Based on the content analysis, an integrative conceptual model was developed containing 12 dimensions referring to the knowledge, motivation and competencies of accessing, understanding, appraising and applying health-related information within the healthcare, disease prevention and health promotion setting, respectively. Conclusions Based upon this review, a model is proposed integrating medical and public health views of health literacy. The model can serve as a basis for developing health literacy enhancing interventions and provide a conceptual basis for the development and validation of measurement tools, capturing the different dimensions of health literacy within the healthcare, disease prevention and health promotion settings.
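The 12 dimensions of the integrated model arise as the cross-product of four information-processing competencies and three health domains named in the abstract. A small sketch makes the combinatorics explicit; the phrasing of each dimension label is illustrative, not the paper's exact wording.

```python
from itertools import product

# Four competencies and three domains, as listed in the integrated model.
COMPETENCIES = ["accessing", "understanding", "appraising", "applying"]
DOMAINS = ["healthcare", "disease prevention", "health promotion"]

# Each (competency, domain) pair yields one of the model's 12 dimensions.
DIMENSIONS = [f"{c} health information in {d}"
              for c, d in product(COMPETENCIES, DOMAINS)]
```

Enumerating the pairs confirms 4 x 3 = 12 dimensions, matching the count reported in the review.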

  19. Multiple organ definition in CT using a Bayesian approach for 3D model fitting

    Science.gov (United States)

    Boes, Jennifer L.; Weymouth, Terry E.; Meyer, Charles R.

    1995-08-01

    Organ definition in computed tomography (CT) is of interest for treatment planning and response monitoring. We present a method for organ definition using a priori information about shape encoded in a set of biometric organ models--specifically for the liver and kidney-- that accurately represents patient population shape information. Each model is generated by averaging surfaces from a learning set of organ shapes previously registered into a standard space defined by a small set of landmarks. The model is placed in a specific patient's data set by identifying these landmarks and using them as the basis for model deformation; this preliminary representation is then iteratively fit to the patient's data based on a Bayesian formulation of the model's priors and CT edge information, yielding a complete organ surface. We demonstrate this technique using a set of fifteen abdominal CT data sets for liver surface definition both before and after the addition of a kidney model to the fitting; we demonstrate the effectiveness of this tool for organ surface definition in this low-contrast domain.

  20. Basic definitions for discrete modeling of computer worms epidemics

    Directory of Open Access Journals (Sweden)

    Pedro Guevara López

    2015-01-01

    Information technologies have evolved in such a way that communication between computers or hosts has become common, so much so that worldwide organizations (governments and corporations) depend on it; what could happen if these computers stopped working for a long time would be catastrophic. Unfortunately, networks are attacked by malware such as viruses and worms that could collapse the system. This has served as motivation for the formal study of computer worms and epidemics in order to develop strategies for prevention and protection; this is why in this paper, before analyzing epidemiological models, a set of formal definitions based on set theory and functions is proposed for describing 21 concepts used in the study of worms. These definitions provide a basis for future qualitative research on the behavior of computer worms, and quantitative research for the study of their epidemiological models.
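The epidemiological models the paper builds toward are typically discrete-time compartment recurrences. A minimal sketch of a susceptible-infected-removed (SIR) step for worm propagation follows; the parameter names and values are illustrative and are not drawn from the paper's 21 formal definitions.

```python
def sir_step(s, i, r, beta, gamma):
    """One discrete time step; s, i, r are host-population fractions summing to 1."""
    new_infections = beta * s * i   # susceptible hosts hit by the worm this step
    new_removals = gamma * i        # infected hosts patched or taken offline
    return (s - new_infections,
            i + new_infections - new_removals,
            r + new_removals)

def simulate(steps, s=0.99, i=0.01, r=0.0, beta=0.5, gamma=0.1):
    """Iterate the recurrence for a fixed number of steps."""
    for _ in range(steps):
        s, i, r = sir_step(s, i, r, beta, gamma)
    return s, i, r
```

The recurrence conserves the total population by construction, and the infected fraction grows whenever beta * s exceeds gamma, the usual epidemic-threshold condition.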

  1. Fuzzy Entropy: Axiomatic Definition and Neural Networks Model

    Institute of Scientific and Technical Information of China (English)

    QING Ming; CAO Yue; HUANG Tian-min

    2004-01-01

    The measure of uncertainty is adopted as a measure of information. Measures of fuzziness are known as fuzzy information measures, and the measure of the quantity of fuzzy information gained from a fuzzy set or fuzzy system is known as fuzzy entropy. Fuzzy entropy has been studied by many researchers in various fields. In this paper, the axiomatic definition of fuzzy entropy is first discussed. Then, a neural network model of fuzzy entropy is proposed, based on the computing capability of neural networks. Finally, two examples are discussed to show the efficiency of the model.
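One classical measure satisfying the usual fuzzy-entropy axioms (zero on crisp sets, maximal when every membership is 0.5) is the De Luca-Termini entropy. The sketch below implements it as a reference point; the paper's own axiomatics and neural network model may differ in detail.

```python
import math

def fuzzy_entropy(memberships):
    """Normalized Shannon-style (De Luca-Termini) fuzzy entropy in [0, 1]."""
    def h(x):
        if x in (0.0, 1.0):        # a crisp element contributes no fuzziness
            return 0.0
        return -(x * math.log2(x) + (1 - x) * math.log2(1 - x))
    return sum(h(x) for x in memberships) / len(memberships)
```

A crisp set (all memberships 0 or 1) scores 0, and a maximally fuzzy set (all memberships 0.5) scores 1, matching the boundary axioms.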

  2. Beyond a Definition: Toward a Framework for Designing and Specifying Mentoring Models

    Science.gov (United States)

    Dawson, Phillip

    2014-01-01

    More than three decades of mentoring research has yet to converge on a unifying definition of mentoring; this is unsurprising given the diversity of relationships classified as mentoring. This article advances beyond a definition toward a common framework for specifying mentoring models. Sixteen design elements were identified from the literature…

  3. Cultural competence in end-of-life care: terms, definitions, and conceptual models from the British literature.

    Science.gov (United States)

    Evans, Natalie; Meñaca, Arantza; Koffman, Jonathan; Harding, Richard; Higginson, Irene J; Pool, Robert; Gysels, Marjolein

    2012-07-01

    Cultural competency is increasingly recommended in policy and practice to improve end-of-life (EoL) care for minority ethnic groups in multicultural societies. It is imperative to critically analyze this approach to understand its underlying concepts. Our aim was to appraise cultural competency approaches described in the British literature on EoL care and minority ethnic groups. This is a critical review. Articles on cultural competency were identified from a systematic review of the literature on minority ethnic groups and EoL care in the United Kingdom. Terms, definitions, and conceptual models of cultural competency approaches were identified and situated according to purpose, components, and origin. Content analysis of definitions and models was carried out to identify key components. One-hundred thirteen articles on minority ethnic groups and EoL care in the United Kingdom were identified. Over half (n=60) contained a term, definition, or model for cultural competency. In all, 17 terms, 17 definitions, and 8 models were identified. The most frequently used term was "culturally sensitive," though "cultural competence" was defined more often. Definitions contained one or more of the components: "cognitive," "implementation," or "outcome." Models were categorized for teaching or use in patient assessment. Approaches were predominantly of American origin. The variety of terms, definitions, and models underpinning cultural competency approaches demonstrates a lack of conceptual clarity, and potentially complicates implementation. Further research is needed to compare the use of cultural competency approaches in diverse cultures and settings, and to assess the impact of such approaches on patient outcomes.

  4. Scoring predictive models using a reduced representation of proteins: model and energy definition.

    Science.gov (United States)

    Fogolari, Federico; Pieri, Lidia; Dovier, Agostino; Bortolussi, Luca; Giugliarelli, Gilberto; Corazza, Alessandra; Esposito, Gennaro; Viglino, Paolo

    2007-03-23

    Reduced representations of proteins have been playing a key role in the study of protein folding. Many such models are available, with different representation detail. Although the usefulness of many such models for structural bioinformatics applications has been demonstrated in recent years, there are few intermediate-resolution models endowed with an energy model capable, for instance, of detecting native or native-like structures among decoy sets. The aim of the present work is to provide a discrete empirical potential for a reduced protein model termed here PC2CA, because it employs a PseudoCovalent structure with only 2 Centers of interaction per Amino acid, suitable for protein model quality assessment. All protein structures in the set top500H have been converted into reduced form. The distributions of pseudobonds, pseudoangles, pseudodihedrals and distances between centers of interaction have been converted into potentials of mean force. A suitable reference distribution has been defined for non-bonded interactions which takes into account excluded volume effects and protein finite size. The correlation between adjacent main-chain pseudodihedrals has been converted into an additional energetic term which is able to account for cooperative effects in secondary structure elements. Local energy surface exploration is performed in order to increase the robustness of the energy function. The model and the energy definition proposed have been tested on all the multiple-decoy sets in the Decoys'R'us database. The energetic model is able to recognize, for almost all sets, native-like structures (RMSD less than 2.0 Å). These results and those obtained in the blind CASP7 quality assessment experiment suggest that the model compares well with scoring potentials with finer granularity and could be useful for fast exploration of conformational space. Parameters are available at the URL: http://www.dstb.uniud.it/~ffogolari/download/.
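Converting observed geometric distributions into potentials of mean force is, generically, a Boltzmann-inversion step: E_i = -kT ln(p_obs_i / p_ref_i). The sketch below shows that step in isolation; the kT value and the capping of empty histogram bins are illustrative choices, not the paper's parameters.

```python
import math

def potential_of_mean_force(p_obs, p_ref, kT=0.593, cap=10.0):
    """Boltzmann inversion: E_i = -kT * ln(p_obs_i / p_ref_i) per bin.

    p_obs -- observed (e.g. distance) distribution from the structure set
    p_ref -- reference distribution accounting for excluded volume, finite size
    cap   -- energy assigned to bins never observed (avoids log of zero)
    """
    energies = []
    for po, pr in zip(p_obs, p_ref):
        if po <= 0.0 or pr <= 0.0:
            energies.append(cap)   # unseen states get a large, finite penalty
        else:
            energies.append(-kT * math.log(po / pr))
    return energies
```

Bins over-represented relative to the reference come out with negative (favourable) energy, which is the property that lets such potentials rank native-like structures below decoys.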

  5. Current definition and a generalized Federbush model

    International Nuclear Information System (INIS)

    Singh, L.P.S.; Hagen, C.R.

    1978-01-01

    The Federbush model is studied, with particular attention being given to the definition of currents. Inasmuch as there is no a priori restriction of local gauge invariance, the currents in the interacting case can be defined more generally than in QED. It is found that two arbitrary parameters are thereby introduced into the theory. Lowest-order perturbation calculations for the current correlation functions and the fermion propagators indicate that the theory admits a whole class of solutions dependent upon these parameters, with the closed solution of Federbush emerging as a special case. The theory is shown to be locally covariant, and a conserved energy-momentum tensor is displayed. One finds in addition that the generators of gauge transformations for the fields are conserved. Finally, it is shown that the general theory yields the Federbush solution if suitable Thirring-model-type counterterms are added.

  6. High-Dimensional Modeling for Cytometry: Building Rock Solid Models Using GemStone™ and Verity Cen-se'™ High-Definition t-SNE Mapping.

    Science.gov (United States)

    Bruce Bagwell, C

    2018-01-01

    This chapter outlines how to approach the complex tasks associated with designing models for high-dimensional cytometry data. Unlike gating approaches, modeling lends itself to automation and accounts for measurement overlap among cellular populations. Designing these models is now easier because of a new technique called high-definition t-SNE mapping. Nontrivial examples are provided that serve as a guide to create models that are consistent with data.

  7. Making the Case for a Model-Based Definition of Engineering Materials (Postprint)

    Science.gov (United States)

    2017-09-12

    MBE relies on digital representations, or a model-based definition (MBD), to define a product throughout design, manufacturing and sustainment...discovery through development, scale-up, product design and qualification, manufacture and sustainment have changed little over the past decades. This...testing data provided a certifiable material definition, so as to minimize risk and simplify procurement of materials during the design, manufacture, and

  8. HIV lipodystrophy case definition using artificial neural network modelling

    DEFF Research Database (Denmark)

    Ioannidis, John P A; Trikalinos, Thomas A; Law, Matthew

    2003-01-01

    OBJECTIVE: A case definition of HIV lipodystrophy has recently been developed from a combination of clinical, metabolic and imaging/body composition variables using logistic regression methods. We aimed to evaluate whether artificial neural networks could improve the diagnostic accuracy. METHODS: The database of the case-control Lipodystrophy Case Definition Study was split into 504 subjects (265 with and 239 without lipodystrophy) used for training and 284 independent subjects (152 with and 132 without lipodystrophy) used for validation. Back-propagation neural networks with one or two middle layers were trained and validated. Results were compared against logistic regression models using the same information. RESULTS: Neural networks using clinical variables only (41 items) achieved consistently superior performance than logistic regression in terms of specificity, overall accuracy and area under...

  9. A hierarchical modeling methodology for the definition and selection of requirements

    Science.gov (United States)

    Dufresne, Stephane

    This dissertation describes the development of a requirements analysis methodology that takes into account the concept of operations and the hierarchical decomposition of aerospace systems. At the core of the methodology, the Analytic Network Process (ANP) is used to ensure traceability between the qualitative and quantitative information present in the hierarchical model. The proposed methodology is applied to the requirements definition of a hurricane tracker Unmanned Aerial Vehicle. Three research objectives are identified in this work: (1) improve the requirements mapping process by matching the stakeholder expectations with the concept of operations, systems and available resources; (2) reduce the epistemic uncertainty surrounding the requirements and requirements mapping; and (3) improve the requirements down-selection process by taking into account the level of importance of the criteria and the available resources. Several challenges are associated with the identification and definition of requirements. The complexity of the system implies that a large number of requirements are needed to define the systems. These requirements are defined early in conceptual design, when the level of knowledge is relatively low and the level of uncertainty is large. The proposed methodology intends to increase the level of knowledge and reduce the level of uncertainty by guiding the design team through a structured process. To address these challenges, a new methodology is created to flow down the requirements from the stakeholder expectations to the system alternatives. A taxonomy of requirements is created to classify the information gathered during problem definition. Subsequently, the operational and systems functions and measures of effectiveness are integrated into a hierarchical model to allow the traceability of the information. Monte Carlo methods are used to evaluate the variations of the hierarchical model elements and consequently reduce the
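The Monte Carlo step mentioned in the abstract can be illustrated with a toy two-level hierarchy: sample the criteria weights within their uncertainty bands, renormalize, and record how often each alternative ranks first. The criteria, weights, and alternative scores below are invented for illustration; the dissertation's actual ANP network is far richer.

```python
import random

# hypothetical criteria weights and alternative scores (0..1), illustrative only
criteria_weights = {"endurance": 0.5, "payload": 0.3, "cost": 0.2}
scores = {
    "UAV-A": {"endurance": 0.9, "payload": 0.4, "cost": 0.6},
    "UAV-B": {"endurance": 0.6, "payload": 0.8, "cost": 0.7},
}

def sample_weights(rng, nominal, spread=0.1):
    """Perturb each weight uniformly within +/- spread, then renormalize."""
    raw = {k: max(1e-6, v + rng.uniform(-spread, spread)) for k, v in nominal.items()}
    total = sum(raw.values())
    return {k: v / total for k, v in raw.items()}

def monte_carlo_rank(n_runs=5000, seed=1):
    """Fraction of sampled weightings under which each alternative ranks first."""
    rng = random.Random(seed)
    wins = {alt: 0 for alt in scores}
    for _ in range(n_runs):
        w = sample_weights(rng, criteria_weights)
        totals = {alt: sum(w[c] * s[c] for c in w) for alt, s in scores.items()}
        wins[max(totals, key=totals.get)] += 1
    return {alt: n / n_runs for alt, n in wins.items()}
```

Because the two alternatives score almost identically under the nominal weights, the sampled rankings split between them, which is exactly the kind of epistemic uncertainty the methodology aims to expose before down-selection.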

  10. Defining epidemics in computer simulation models: How do definitions influence conclusions?

    Directory of Open Access Journals (Sweden)

    Carolyn Orbann

    2017-06-01

    Computer models have proven to be useful tools in studying epidemic disease in human populations. Such models are being used by a broader base of researchers, and it has become more important to ensure that descriptions of model construction and data analyses are clear and communicate important features of model structure. Papers describing computer models of infectious disease often lack a clear description of how the data are aggregated and whether or not non-epidemic runs are excluded from analyses. Given that there is no concrete quantitative definition of what constitutes an epidemic within the public health literature, each modeler must decide on a strategy for identifying epidemics during simulation runs. Here, an SEIR model was used to test the effects of how varying the cutoff for considering a run an epidemic changes potential interpretations of simulation outcomes. Varying the cutoff from 0% to 15% of the model population ever infected with the illness generated significant differences in numbers of dead and timing variables. These results are important for those who use models to form public health policy, in which questions of timing or implementation of interventions might be answered using findings from computer simulation models.
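The cutoff experiment described above can be reproduced in miniature with a deterministic daily-step SEIR model: classify a run as an epidemic only if the fraction ever infected exceeds the cutoff, and watch the count of "epidemic" runs shrink as the cutoff rises. All parameter values below are illustrative, not those of the paper.

```python
def run_seir(n=1000, beta=0.3, sigma=0.5, gamma=0.1, days=1000):
    """Deterministic daily-step SEIR; returns the fraction ever infected."""
    s, e, i = n - 1.0, 0.0, 1.0
    for _ in range(days):
        new_e = beta * s * i / n   # S -> E: new exposures
        new_i = sigma * e          # E -> I: end of latency
        new_r = gamma * i          # I -> R: recovery
        s -= new_e
        e += new_e - new_i
        i += new_i - new_r
    return (n - s) / n

# four "runs" of differing transmissibility, classified under three cutoffs
attack_rates = [run_seir(beta=b) for b in (0.05, 0.12, 0.2, 0.3)]
for cutoff in (0.0, 0.05, 0.15):
    n_epidemics = sum(rate > cutoff for rate in attack_rates)
    print(f"cutoff {cutoff:.0%}: {n_epidemics} of {len(attack_rates)} runs count as epidemics")
```

With a 0% cutoff even the subcritical run (one index case that barely spreads) counts as an epidemic, while a 15% cutoff excludes it; any downstream statistic averaged over "epidemic runs" therefore shifts with the cutoff choice.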

  11. The participative method of subject definition as used in the quantitative modelling of hospital laundry services.

    Science.gov (United States)

    Hammer, K A; Janes, F R

    1995-01-01

    The objectives for developing the participative method of subject definition were to gain all the relevant information to a high level of fidelity in the earliest stages of the work and so be able to build a realistic model at reduced labour cost. In order to better integrate the two activities--information acquisition and mathematical modelling--a procedure was devised using the methods of interactive management to facilitate teamwork. This procedure provided the techniques to create suitable working relationships between the two groups, the informants and the modellers, so as to maximize their free and accurate intercommunication, both during the initial definition of the linen service and during the monitoring of the accuracy and reality of the draft models. The objectives of this project were met in that the final model was quickly validated and approved, at a low labour cost.

  12. Improving the Functionality of Dictionary Definitions for Lexical Sets: The Role of Definitional Templates, Definitional Consistency, Definitional Coherence and the Incorporation of Lexical Conceptual Models

    Directory of Open Access Journals (Sweden)

    Piet Swanepoel

    2011-10-01


    ABSTRACT: This article focuses on some of the problems raised by Atkins and Rundell's (2008) approach to the design of lexicographic definitions for members of lexical sets. The questions raised are how to define and identify lexical sets, how lexical conceptual models (LCMs) can support definitional consistency and coherence in defining members of lexical sets, and what the ideal content and structure of LCMs could be. Although similarity of meaning is proposed as the defining feature of lexical sets, similarity of meaning is only one dimension of the broader concept of lexical coherence. The argument is presented that numerous conceptual lexical models (e.g. taxonomies, folk models, frames, etc.) in fact indicate, justify or explain how lexical items cohere (and thus form sets). In support of Fillmore's (2003) suggestion that definitions of the lexical items of cohering sets should be linked to such explanatory models, additional functionally-orientated arguments are presented for the incorporation of conceptual lexical models in electronic monolingual learners' dictionaries. Numerous resources exist to support the design of LCMs which can improve the functionality of definitions of members of lexical sets. A few examples are discussed of how such resources can be used to design functionally justified LCMs.

    SUMMARY: Improving the functionality of dictionary definitions for lexical sets: The role of definitional templates, definitional consistency, definitional coherence and the incorporation of lexical conceptual models. This article focuses on some of the problems raised by Atkins and Rundell's (2008) approach to the design of lexicographic definitions for members of lexical sets. The questions posed are how lexical sets should be defined and identified, and how lexical conceptual models (LCMs) can support definitional consistency and coherence in the definition of members

  13. Micro worlds versus boundary objects in group model building; evidence from the literature on problem definition and model conceptualization

    Energy Technology Data Exchange (ETDEWEB)

    Zagonel, Aldo A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Systems Engineering & Analysis; Andersen, David F. [University in Albany, NY (United States). The Rockefeller College of Public Affairs & Policy

    2007-03-01

    Based upon participant observation in group model building and content analysis of the system dynamics literature, we postulate that modeling efforts have a dual nature. On one hand, the modeling process aims to create a useful representation of a real-world system. This must be done, however, while aligning the clients’ mental models around a shared view of the system. There is significant overlap and confusion between these two goals and how they play out on a practical level. This research clarifies these distinctions by establishing an ideal-type dichotomy. To highlight the differences, we created two straw men: “micro world” characterizes a model that represents reality and “boundary object” represents a socially negotiated model. Using this framework, the literature was examined, revealing evidence for several competing views on problem definition and model conceptualization. The results are summarized in the text of this article, substantiated with strikingly polarized citations, often from the same authors. We also introduce hypotheses for the duality across the remaining phases of the modeling process. Finally, understanding and appreciation of the differences between these ideal types can promote constructive debate on their balance in system dynamics theory and practice.

  14. Linking definitions, mechanisms, and modeling of drought-induced tree death.

    Science.gov (United States)

    Anderegg, William R L; Berry, Joseph A; Field, Christopher B

    2012-12-01

    Tree death from drought and heat stress is a critical and uncertain component in forest ecosystem responses to a changing climate. Recent research has illuminated how tree mortality is a complex cascade of changes involving interconnected plant systems over multiple timescales. Explicit consideration of the definitions, dynamics, and temporal and biological scales of tree mortality research can guide experimental and modeling approaches. In this review, we draw on the medical literature concerning human death to propose a water resource-based approach to tree mortality that considers the tree as a complex organism with a distinct growth strategy. This approach provides insight into mortality mechanisms at the tree and landscape scales and presents promising avenues into modeling tree death from drought and temperature stress. Copyright © 2012 Elsevier Ltd. All rights reserved.

  15. Integration Of Externalized Decision Models In The Definition Of Workflows For Digital Pathology

    Directory of Open Access Journals (Sweden)

    J. van Leeuwen

    2016-06-01

    We proposed a workflow solution enabling the representation of decision models as externalized executable tasks in the process definition. Our approach separates the task implementations from the workflow model, ensuring scalability and allowing for the inclusion of complex decision logic in the workflow execution. We depict a simplified model of a pathology diagnosis workflow (starting with the digitization of the slides), represented according to the BPMN modeling conventions. The example shows a workflow sequence that automatically orders a HER2 FISH test when IHC is borderline according to defined customizable thresholds. The process model integrates an image analysis algorithm that scores images. Based on the score and the thresholds, the decision model evaluates the condition and recommends the pre-ordering of an additional test when the score falls between the two thresholds. This leads to faster diagnosis and allows balancing the cost of an additional test against the overhead of the pathologist by choosing the values of the thresholds.

  16. Exploratory analysis regarding the domain definitions for computer based analytical models

    Science.gov (United States)

    Raicu, A.; Oanta, E.; Barhalescu, M.

    2017-08-01

    Our previous computer-based studies dedicated to structural problems using analytical methods defined the composite cross section of a beam as the result of Boolean operations with so-called ‘simple’ shapes. Using generalisations, the class of ‘simple’ shapes was extended to include areas bounded by curves approximated using spline functions and areas approximated as polygons. However, particular definitions lead to particular solutions. In order to move beyond these limitations, we conceived a general definition of the cross sections, which are now considered calculus domains consisting of several subdomains. The corresponding set of input data uses complex parameterizations. This new vision allows us to naturally assign an arbitrary number of attributes to the subdomains. In this way, new phenomena that use map-wise information, such as metal alloy equilibrium diagrams, may be modelled. The hierarchy of the input data text files, which use the comma-separated-value format, and their structure are also presented and discussed in the paper. This new approach allows us to reuse the concepts and part of the data processing software instruments already developed. The software to be subsequently developed will be modularised and generalised in order to be used in upcoming projects that require rapid development of computer-based models.
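The subdomain idea can be sketched with signed polygon areas: each ‘simple’ shape contributes with a Boolean sign, and arbitrary attributes (here a hypothetical material tag) ride along with each subdomain. The shapes and attributes below are illustrative, not taken from the paper.

```python
def polygon_area(pts):
    """Shoelace formula; returns the absolute area of a simple polygon."""
    n = len(pts)
    s = sum(pts[i][0] * pts[(i + 1) % n][1] - pts[(i + 1) % n][0] * pts[i][1]
            for i in range(n))
    return abs(s) / 2.0

# each subdomain: (vertices, sign, attributes); sign -1 models Boolean subtraction
subdomains = [
    ([(0, 0), (4, 0), (4, 2), (0, 2)], +1, {"material": "steel"}),        # outer plate
    ([(1, 0.5), (3, 0.5), (3, 1.5), (1, 1.5)], -1, {"material": "void"}), # cut-out
]

# net cross-section area of the composite domain: 4x2 plate minus 2x1 hole
net_area = sum(sign * polygon_area(pts) for pts, sign, _ in subdomains)
```

Because the attributes travel with each subdomain rather than with the whole section, map-wise data such as per-region material properties fit the same input structure without changing the area computation.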

  17. Diffusion tensor magnetic resonance imaging driven growth modeling for radiotherapy target definition in glioblastoma

    DEFF Research Database (Denmark)

    Jensen, Morten B; Guldberg, Trine L; Harbøll, Anja

    2017-01-01

    the microscopic tumor cell spread. Gliomas favor spread along the white matter fiber tracts. Tumor growth models incorporating the MRI diffusion tensors (DTI) allow to account more consistently for the glioma growth. The aim of the study was to investigate the potential of a DTI driven growth model to improve...... target definition in glioblastoma (GBM). MATERIAL AND METHODS: Eleven GBM patients were scanned using T1w, T2w FLAIR, T1w + Gd and DTI. The brain was segmented into white matter, gray matter and cerebrospinal fluid. The Fisher-Kolmogorov growth model was used assuming uniform proliferation...
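The Fisher-Kolmogorov model referred to above evolves a normalized tumor cell density u via du/dt = div(D grad u) + rho*u*(1-u); in the DTI-driven variant, D becomes a tensor field that is larger along white-matter fiber tracts. A one-dimensional explicit finite-difference sketch with a scalar, uniform D is shown below (illustrative parameters, not the study's).

```python
def fk_step(u, dx, dt, D, rho):
    """One explicit step of du/dt = D*u_xx + rho*u*(1-u), zero-flux boundaries."""
    n = len(u)
    out = []
    for i in range(n):
        left = u[i - 1] if i > 0 else u[0]       # one-sided zero-flux boundary
        right = u[i + 1] if i < n - 1 else u[-1]
        lap = (left - 2.0 * u[i] + right) / dx ** 2
        out.append(u[i] + dt * (D * lap + rho * u[i] * (1.0 - u[i])))
    return out

# seed a small tumor in the middle of a 21-node 1D domain
u = [0.0] * 21
u[10] = 0.5
for _ in range(200):   # dt = 0.5 satisfies the stability bound dt <= dx**2 / (2*D)
    u = fk_step(u, dx=1.0, dt=0.5, D=0.5, rho=0.2)
```

In the imaging-driven setting, a target volume can then be defined by thresholding the simulated density u rather than by a fixed geometric margin, which is the improvement the study investigates.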

  18. Modelling SDL, Modelling Languages

    Directory of Open Access Journals (Sweden)

    Michael Piefel

    2007-02-01

    Today's software systems are too complex to implement and model using only one language. As a result, modern software engineering uses different languages for different levels of abstraction and different system aspects. Handling an increasing number of related or integrated languages has thus become the most challenging task in the development of tools. We use object-oriented metamodelling to describe languages. Object orientation allows us to derive abstract reusable concept definitions (concept classes) from existing languages. This language definition technique concentrates on semantic abstractions rather than syntactical peculiarities. We present a set of common concept classes that describe structure, behaviour, and data aspects of high-level modelling languages. Our models contain syntax modelling using the OMG MOF as well as static semantic constraints written in OMG OCL. We derive metamodels for subsets of SDL and UML from these common concepts, and we show for parts of these languages that they can be modelled and related to each other through the same abstract concepts.

  19. Model integration and a theory of models

    OpenAIRE

    Dolk, Daniel R.; Kottemann, Jeffrey E.

    1993-01-01

    Model integration extends the scope of model management to include the dimension of manipulation as well. This invariably leads to comparisons with database theory. Model integration is viewed from four perspectives: Organizational, definitional, procedural, and implementational. Strategic modeling is discussed as the organizational motivation for model integration. Schema and process integration are examined as the logical and manipulation counterparts of model integr...

  20. An integrated operational definition and conceptual model of asthma self-management in teens.

    Science.gov (United States)

    Mammen, Jennifer; Rhee, Hyekyun; Norton, Sally A; Butz, Arlene M; Halterman, Jill S; Arcoleo, Kimberly

    2018-01-19

    A previous definition of adolescent asthma self-management was derived from interviews with clinicians/researchers and published literature; however, it did not incorporate perspectives of teens or parents. Therefore, we conducted in-depth interviews with teens and parents and synthesized present findings with the prior analysis to develop a more encompassing definition and model. Focal concepts were qualitatively extracted from 14-day self-management voice-diaries (n = 14) and 1-hour interviews (n = 42) with teens and parents (28 individuals) along with concepts found in the previous clinical/research oriented analysis. Conceptual structure and relationships were identified and key findings synthesized to develop a revised definition and model of adolescent asthma self-management. There were two primary self-management constructs: processes of self-management and tasks of self-management. Self-management was defined as the iterative process of assessing, deciding, and responding to specific situations in order to achieve personally important outcomes. Clinically relevant asthma self-management tasks included monitoring asthma, managing active issues through pharmacologic and non-pharmacologic strategies, preventing future issues, and communicating with others as needed. Self-management processes were reciprocally influenced by intrapersonal factors (both cognitive and physical), interpersonal factors (family, social and physical environments), and personally relevant asthma and non-asthma outcomes. This is the first definition of asthma self-management incorporating teen, parent, clinician, and researcher perspectives, which suggests that self-management processes and behaviors are influenced by individually variable personal and interpersonal factors, and are driven by personally important outcomes. Clinicians and researchers should investigate teens' symptom perceptions, medication beliefs, current approaches to symptom management, relevant outcomes, and

  1. Process and building information modelling in the construction industry by using information delivery manuals and model view definitions

    DEFF Research Database (Denmark)

    Karlshøj, Jan

    2012-01-01

    The construction industry is gradually increasing its use of structured information and building information modelling. To date, the industry has suffered from the disadvantages of a project-based organizational structure and ad hoc solutions. Furthermore, it is not used to formalizing the flow...... of information and specifying exactly which objects and properties are needed for each process and which information is produced by the processes. The present study is based on reviewing the existing methodology of Information Delivery Manuals (IDM) from Buildingsmart, which is also an ISO standard 29481...... Part 1; and the Model View Definition (MVD) methodology developed by Buildingsmart and BLIS. The research also includes a review of concrete IDM development projects that have been developed over the last five years. Although the study has identified interest in the IDM methodology in a number

  2. Systemic resilience model

    International Nuclear Information System (INIS)

    Lundberg, Jonas; Johansson, Björn JE

    2015-01-01

    It has been realized that resilience as a concept involves several contradictory definitions, for instance resilience as agile adjustment versus resilience as robust resistance to situations. Our analysis of resilience concepts and models suggests that, beyond simplistic definitions, it is possible to draw up a systemic resilience model (SyRes) that maintains these opposing characteristics without contradiction. We outline six functions in a systemic model, drawing primarily on resilience engineering and disaster response: anticipation, monitoring, response, recovery, learning, and self-monitoring. The model consists of four areas: Event-based constraints, Functional Dependencies, Adaptive Capacity and Strategy. The paper describes dependencies between constraints, functions and strategies. We argue that models such as SyRes should be useful both for envisioning new resilience methods and metrics, as well as for engineering and evaluating resilient systems. - Highlights: • The SyRes model resolves contradictions between previous resilience definitions. • SyRes is a core model for envisioning and evaluating resilience metrics and models. • SyRes describes six functions in a systemic model. • They are anticipation, monitoring, response, recovery, learning, self-monitoring. • The model describes dependencies between constraints, functions and strategies

  3. FORECASTING MODELS IN MANAGEMENT

    OpenAIRE

    Sindelar, Jiri

    2008-01-01

    This article deals with the problems of forecasting models. The first part of the article is dedicated to definition of the relevant areas (vertical and horizontal pillars of definition), and then the forecasting model itself is defined; as the article presents theoretical background for further primary research, this definition is crucial. Finally, the position of forecasting models within the management system is identified. The paper is a part of the outputs of FEM CULS grant no. 1312/11/3121.

  4. Modeling, Simulation, and Analysis of Novel Threshold Voltage Definition for Nano-MOSFET

    Directory of Open Access Journals (Sweden)

    Yashu Swami

    2017-01-01

    Threshold voltage (VTH) is an indispensable parameter in MOSFET design, modeling, and operation. Diverse definitions and extraction methods exist to model the on-off transition characteristics of the device. The governing criteria for an efficient threshold voltage definition and extraction method can be itemized as clarity, simplicity, precision, and stability throughout the operating conditions and technology nodes. The outcomes of extraction methods diverge from the exact values due to various short-channel effects (SCEs) and nonidealities present in the device. A new approach to define and extract the real value of VTH of a MOSFET is proposed in the manuscript. The resulting novel SCE-independent VTH extraction method, named the “hybrid extrapolation VTH extraction method” (HEEM), is elaborated, modeled, and compared with a few prevalent MOSFET threshold voltage extraction methods for validation of the results. All results are verified by extensive 2D TCAD simulation and confirmed analytically at various technology nodes.
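For reference, the classical extrapolation-in-the-linear-region (ELR) method that extrapolation-based methods such as HEEM refine can be sketched as follows: find the bias point of maximum transconductance and extrapolate the tangent of the ID-VG curve there to ID = 0. The idealized long-channel data below are synthetic; real devices deviate because of the SCEs and nonidealities the paper discusses.

```python
def extract_vth_elr(vg, id_):
    """Extrapolation in the linear region: VTH is the VG-axis intercept of the
    tangent to the ID-VG curve at the point of maximum transconductance."""
    gm = [(id_[i + 1] - id_[i - 1]) / (vg[i + 1] - vg[i - 1])
          for i in range(1, len(vg) - 1)]        # central-difference dID/dVG
    j = max(range(len(gm)), key=gm.__getitem__)  # index of maximum gm (into gm)
    k = j + 1                                    # corresponding index into vg
    return vg[k] - id_[k] / gm[j]

# synthetic long-channel linear-region characteristic: ID = K*(VG - VT) above VT
VT_TRUE, K = 0.7, 1e-4
vg = [0.1 * i for i in range(21)]
id_ = [K * max(0.0, v - VT_TRUE) for v in vg]
vth = extract_vth_elr(vg, id_)
```

On this ideal characteristic the extracted value matches the true threshold; the method's sensitivity to series resistance and mobility degradation on real data is what motivates refinements such as HEEM.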

  5. Basic Definitions and Concepts of Systems Approach, Mathematical Modeling and Information Technologies in Sports Science

    Directory of Open Access Journals (Sweden)

    А. Лопатьєв

    2017-09-01

    The objective is to systematize and adapt the basic definitions and concepts of the systems approach, mathematical modeling and information technologies to sports science. Materials and methods. The research has studied the availability of appropriate terms in shooting sports, which would meet the requirements of modern sports science. It has examined the compliance of the shooting sports training program for children and youth sports schools, the Olympic reserve specialized children and youth schools, schools of higher sports skills, and sports educational institutions with the modern requirements and principles. Research results. The paper suggests the basic definitions adapted to the requirements of technical sports and sports science. The research has thoroughly analyzed the shooting sports training program for children and youth sports schools, the Olympic reserve specialized children and youth schools, schools of higher sports skills, and sports educational institutions. The paper offers options to improve the training program in accordance with the modern tendencies of training athletes. Conclusions. The research suggests to systematize and adapt the basic definitions and concepts of the systems approach, mathematical modeling and information technologies using the example of technical sports.

  6. Improving transcriptome construction in non-model organisms: integrating manual and automated gene definition in Emiliania huxleyi.

    OpenAIRE

    Feldmesser, Ester; Rosenwasser, Shilo; Vardi, Assaf; Ben-Dor, Shifra

    2014-01-01

    Background: The advent of Next Generation Sequencing technologies and corresponding bioinformatics tools allows the definition of transcriptomes in non-model organisms. Non-model organisms are of great ecological and biotechnological significance, and consequently the understanding of their unique metabolic pathways is essential. Several methods that integrate de novo assembly with genome-based assembly have been proposed. Yet, there are many open challenges in defining genes, particularly whe...

  7. Modelling Practice

    DEFF Research Database (Denmark)

    Cameron, Ian; Gani, Rafiqul

    2011-01-01

    This chapter deals with the practicalities of building, testing, deploying and maintaining models. It gives specific advice for each phase of the modelling cycle. To do this, a modelling framework is introduced which covers: problem and model definition; model conceptualization; model data...... requirements; model construction; model solution; model verification; model validation and finally model deployment and maintenance. Within the adopted methodology, each step is discussed through the consideration of key issues and questions relevant to the modelling activity. Practical advice, based on many

  8. The effect of neighbourhood definitions on spatio-temporal models of disease outbreaks: Separation distance versus range overlap.

    Science.gov (United States)

    Laffan, Shawn W; Wang, Zhaoyuan; Ward, Michael P

    2011-12-01

    The definition of the spatial relatedness between infectious and susceptible animal groups is a fundamental component of spatio-temporal modelling of disease outbreaks. A common neighbourhood definition for disease spread in wild and feral animal populations is the distance between the centroids of neighbouring group home ranges. This distance can be used to define neighbourhood interactions, and also to describe the probability of successful disease transmission. Key limitations of this approach are (1) that a susceptible neighbour of an infectious group with an overlapping home range - but whose centroid lies outside the home range of an infectious group - will not be considered for disease transmission, and (2) the degree of overlap between the home ranges is not taken into account for those groups with centroids inside the infectious home range. We assessed the impact of both distance-based and range overlap methods of disease transmission on model-predicted disease spread. Range overlap was calculated using home ranges modelled as circles. We used the Sirca geographic automata model, with the population data from a nine-county study area in Texas that we have previously described. For each method we applied 100 model repetitions, each of 100 time steps, to 30 index locations. The results show that the rate of disease spread for the range-overlap method is clearly less than the distance-based method, with median outbreaks modelled using the latter being 1.4-1.45 times larger. However, the two methods show similar overall trends in the area infected, and the range-overlap median (48 and 120 for cattle and pigs, respectively) falls within the 5th-95th percentile range of the distance-based method (0-96 and 0-252 for cattle and pigs, respectively). These differences can be attributed to the calculation of the interaction probabilities in the two methods, with overlap weights generally resulting in lower interaction probabilities. The definition of spatial
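The two neighbourhood definitions can be made concrete for circular home ranges (radii and centroids below are illustrative; the paper's model runs on real population data). Under the centroid rule, group B neighbours infectious group A only if B's centroid lies inside A's range; under the overlap rule, any positive intersection area qualifies, and its size can weight the interaction probability.

```python
import math

def circle_overlap_area(c1, r1, c2, r2):
    """Intersection area of two circular home ranges (standard lens formula)."""
    d = math.dist(c1, c2)
    if d >= r1 + r2:
        return 0.0                           # disjoint ranges
    if d <= abs(r1 - r2):
        return math.pi * min(r1, r2) ** 2    # one range fully inside the other
    a1 = r1 * r1 * math.acos((d * d + r1 * r1 - r2 * r2) / (2 * d * r1))
    a2 = r2 * r2 * math.acos((d * d + r2 * r2 - r1 * r1) / (2 * d * r2))
    tri = 0.5 * math.sqrt((-d + r1 + r2) * (d + r1 - r2)
                          * (d - r1 + r2) * (d + r1 + r2))
    return a1 + a2 - tri

# centroids 1.5 km apart, both ranges 1 km in radius
a, b = ((0.0, 0.0), 1.0), ((1.5, 0.0), 1.0)
d = math.dist(a[0], b[0])
centroid_rule = d < a[1]                          # False: B's centroid is outside A's range
overlap_rule = circle_overlap_area(*a, *b) > 0.0  # True: the ranges still overlap
```

This pair is exactly the first limitation noted in the abstract: the centroid rule excludes a susceptible group whose range overlaps the infectious one, while the overlap rule includes it with a weight proportional to the shared area.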

  9. Product models for the Construction industry

    DEFF Research Database (Denmark)

    Sørensen, Lars Schiøtt

    1996-01-01

    Different types of product models for the building sector were elaborated and grouped, and the different models were discussed. A definition of product models was given.

  10. Sustainable geothermal utilization - Case histories; definitions; research issues and modelling

    International Nuclear Information System (INIS)

    Axelsson, Gudni

    2010-01-01

    Sustainable development by definition meets the needs of the present without compromising the ability of future generations to meet their own needs. The Earth's enormous geothermal resources have the potential to contribute significantly to sustainable energy use worldwide as well as to help mitigate climate change. Experience from the use of numerous geothermal systems worldwide lasting several decades demonstrates that by maintaining production below a certain limit the systems reach a balance between net energy discharge and recharge that may be maintained for a long time (100-300 years). Modelling studies indicate that the effect of heavy utilization is often reversible on a time-scale comparable to the period of utilization. Thus, geothermal resources can be used in a sustainable manner either through (1) constant production below the sustainable limit, (2) step-wise increase in production, (3) intermittent excessive production with breaks, and (4) reduced production after a shorter period of heavy production. The long production histories that are available for low-temperature as well as high-temperature geothermal systems distributed throughout the world, provide the most valuable data available for studying sustainable management of geothermal resources, and reservoir modelling is the most powerful tool available for this purpose. The paper presents sustainability modelling studies for the Hamar and Nesjavellir geothermal systems in Iceland, the Beijing Urban system in China and the Olkaria system in Kenya as examples. Several relevant research issues have also been identified, such as the relevance of system boundary conditions during long-term utilization, how far reaching interference from utilization is, how effectively geothermal systems recover after heavy utilization and the reliability of long-term (more than 100 years) model predictions. (author)
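The production strategies listed above can be explored with a toy lumped-parameter reservoir model in which natural recharge increases as stored energy is drawn down, so that production below a limit settles into the balance the abstract describes. All parameter values are invented for illustration; real sustainability assessments use detailed reservoir models of the kind cited.

```python
def simulate(production, years=300, e0=1000.0, r0=2.0, k=0.01):
    """Stored energy under constant production; recharge is pressure-driven,
    rising linearly with drawdown: recharge = r0 + k*(e0 - e)."""
    e, history = e0, []
    for _ in range(years):
        e += (r0 + k * (e0 - e)) - production
        history.append(e)
    return history

# production equal to natural recharge: the reservoir holds steady
steady = simulate(production=2.0)
# heavier production: stored energy settles toward a new, lower equilibrium,
# e* = e0 - (production - r0) / k = 700 in this toy parameterization
drawn = simulate(production=5.0)
```

Step-wise or intermittent strategies amount to changing `production` over time in the same loop; the reversibility noted in the abstract corresponds to the stored energy relaxing back toward `e0` once production stops.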

  11. Gaps Analysis of Integrating Product Design, Manufacturing, and Quality Data in The Supply Chain Using Model-Based Definition.

    Science.gov (United States)

    Trainer, Asa; Hedberg, Thomas; Feeney, Allison Barnard; Fischer, Kevin; Rosche, Phil

    2016-01-01

    Advances in information technology triggered a digital revolution that holds promise of reduced costs, improved productivity, and higher quality. To ride this wave of innovation, manufacturing enterprises are changing how product definitions are communicated - from paper to models. To achieve industry's vision of the Model-Based Enterprise (MBE), the MBE strategy must include model-based data interoperability from design to manufacturing and quality in the supply chain. The Model-Based Definition (MBD) is created by the original equipment manufacturer (OEM) using Computer-Aided Design (CAD) tools. This information is then shared with the supplier so that they can manufacture and inspect the physical parts. Today, suppliers predominantly use Computer-Aided Manufacturing (CAM) and Coordinate Measuring Machine (CMM) models for these tasks. Traditionally, the OEM has provided design data to the supplier in the form of two-dimensional (2D) drawings, but may also include a three-dimensional (3D)-shape-geometry model, often in a standards-based format such as ISO 10303-203:2011 (STEP AP203). The supplier then creates the respective CAM and CMM models and machine programs to produce and inspect the parts. In the MBE vision for model-based data exchange, the CAD model must include product-and-manufacturing information (PMI) in addition to the shape geometry. Today's CAD tools can generate models with embedded PMI. And, with the emergence of STEP AP242, a standards-based model with embedded PMI can now be shared downstream. The ongoing research detailed in this paper investigates three concepts. First, that a STEP AP242 model with embedded PMI can be used for CAD-to-CAM and CAD-to-CMM data exchange, and that this capability is valuable to the overall goal of a more efficient process.
Second, the research identifies gaps in tools, standards, and processes that inhibit industry's ability to cost-effectively achieve model-based-data interoperability in the pursuit of the

  12. TAME - the terrestrial-aquatic model of the environment: model definition

    International Nuclear Information System (INIS)

    Klos, R.A.; Mueller-Lemans, H.; Dorp, F. van; Gribi, P.

    1996-10-01

    TAME - the Terrestrial-Aquatic Model of the Environment is a new computer model for use in assessments of the radiological impact of the release of radionuclides to the biosphere, following their disposal in underground waste repositories. Based on regulatory requirements, the end-point of the calculations is the maximum annual individual dose to members of a hypothetical population group inhabiting the biosphere region. Additional mid- and end-points in the TAME calculations are dose as function of time from eleven exposure pathways, foodstuff concentrations and the distribution of radionuclides in the modelled biosphere. A complete description of the mathematical representations of the biosphere in TAME is given in this document, based on a detailed review of the underlying conceptual framework for the model. Example results are used to illustrate features of the conceptual and mathematical models. The end-point of dose is shown to be robust for the simplifying model assumptions used to define the biosphere for the example calculations. TAME comprises two distinct sub-models - one representing the transport of radionuclides in the near-surface environment and one for the calculation of dose to individual inhabitants of that biosphere. The former is the result of a detailed review of the modelling requirements for such applications and is based on a comprehensive consideration of all features, events and processes (FEPs) relevant to Swiss biospheres, both in the present-day biosphere and in potential future biosphere states. Representations of the transport processes are derived from first principles. Mass balance for water and solid material fluxes is used to determine the rates of contaminant transfer between components of the biosphere system. The calculation of doses is based on existing representations of exposure pathways and draws on experience both from Switzerland and elsewhere. (author) figs., tabs., refs
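    The mass-balance transport scheme described above can be sketched as a first-order compartment model; the compartment names and rate constants below are purely illustrative, not values from TAME:

```python
# Illustrative sketch of a mass-balance biosphere transport model:
# contaminant inventories in compartments evolve according to first-order
# transfer rates. Compartment names and rates are invented, not from TAME.

def step(inventories, rates, decay, dt):
    """Advance compartment inventories (Bq) by one Euler step.

    rates: dict mapping (source, target) -> transfer rate (1/yr)
    decay: radioactive decay constant (1/yr), applied to every compartment
    """
    change = {c: -decay * inv for c, inv in inventories.items()}
    for (src, dst), k in rates.items():
        flux = k * inventories[src]      # first-order transfer (Bq/yr)
        change[src] -= flux
        change[dst] += flux
    return {c: inventories[c] + dt * change[c] for c in inventories}

inv = {"soil": 1.0, "river": 0.0, "sediment": 0.0}
rates = {("soil", "river"): 0.05, ("river", "sediment"): 0.2}
for _ in range(1000):                    # 100 years at dt = 0.1 yr
    inv = step(inv, rates, decay=0.0, dt=0.1)
total = sum(inv.values())
```

With decay switched off, the transfers only move mass between compartments, so the total inventory is conserved - the mass-balance property the abstract refers to.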

  13. TAME - the terrestrial-aquatic model of the environment: model definition

    Energy Technology Data Exchange (ETDEWEB)

    Klos, R.A. [Paul Scherrer Inst. (PSI), Villigen (Switzerland); Mueller-Lemans, H. [Tergoso AG fuer Umweltfragen, Sargans (Switzerland); Dorp, F. van [Nationale Genossenschaft fuer die Lagerung Radioaktiver Abfaelle (NAGRA), Baden (Switzerland); Gribi, P. [Colenco AG, Baden (Switzerland)

    1996-10-01

    TAME - the Terrestrial-Aquatic Model of the Environment is a new computer model for use in assessments of the radiological impact of the release of radionuclides to the biosphere, following their disposal in underground waste repositories. Based on regulatory requirements, the end-point of the calculations is the maximum annual individual dose to members of a hypothetical population group inhabiting the biosphere region. Additional mid- and end-points in the TAME calculations are dose as function of time from eleven exposure pathways, foodstuff concentrations and the distribution of radionuclides in the modelled biosphere. A complete description of the mathematical representations of the biosphere in TAME is given in this document, based on a detailed review of the underlying conceptual framework for the model. Example results are used to illustrate features of the conceptual and mathematical models. The end-point of dose is shown to be robust for the simplifying model assumptions used to define the biosphere for the example calculations. TAME comprises two distinct sub-models - one representing the transport of radionuclides in the near-surface environment and one for the calculation of dose to individual inhabitants of that biosphere. The former is the result of a detailed review of the modelling requirements for such applications and is based on a comprehensive consideration of all features, events and processes (FEPs) relevant to Swiss biospheres, both in the present-day biosphere and in potential future biosphere states. Representations of the transport processes are derived from first principles. Mass balance for water and solid material fluxes is used to determine the rates of contaminant transfer between components of the biosphere system. The calculation of doses is based on existing representations of exposure pathways and draws on experience both from Switzerland and elsewhere. (author) figs., tabs., refs.

  14. Concrete syntax definition for modeling languages

    OpenAIRE

    Fondement, Frédéric; Baar, Thomas

    2008-01-01

    Model Driven Engineering (MDE) promotes the use of models as primary artefacts of a software development process, as an attempt to handle complexity through abstraction, e.g. to cope with the evolution of execution platforms. MDE follows a stepwise approach, by prescribing to develop abstract models further improved to integrate little by little details relative to the final deployment platforms. Thus, the application of an MDE process results in various models residing at various levels of a...

  15. Concrete syntax definition for modeling languages

    OpenAIRE

    Fondement, Frédéric

    2007-01-01

    Model Driven Engineering (MDE) promotes the use of models as primary artefacts of a software development process, as an attempt to handle complexity through abstraction, e.g. to cope with the evolution of execution platforms. MDE follows a stepwise approach, by prescribing to develop abstract models further improved to integrate little by little details relative to the final deployment platforms. Thus, the application of an MDE process results in various models residing at various levels of a...

  16. A conceptual definition of vocational rehabilitation based on the ICF: building a shared global model.

    Science.gov (United States)

    Escorpizo, Reuben; Reneman, Michiel F; Ekholm, Jan; Fritz, Julie; Krupa, Terry; Marnetoft, Sven-Uno; Maroun, Claude E; Guzman, Julietta Rodriguez; Suzuki, Yoshiko; Stucki, Gerold; Chan, Chetwyn C H

    2011-06-01

    The International Classification of Functioning, Disability and Health (ICF) is a conceptual framework and classification system developed by the World Health Organization (WHO) to describe functioning. The objective of this discussion paper is to offer a conceptual definition of vocational rehabilitation (VR) based on the ICF. We present the ICF as a model for application in VR and the rationale for integrating the ICF, and we briefly review other work disability models. Five essential elements were identified for a conceptual definition of VR: it concerns engagement or re-engagement in work; it operates along a work continuum; it involves health conditions or events leading to work disability; it is patient-centered and evidence-based; and it is multi-professional or multidisciplinary. VR thus refers to a multi-professional approach provided to individuals of working age with health-related impairments, limitations, or restrictions in work functioning, whose primary aim is to optimize work participation. We propose that the ICF-VR interface be explored further through empirical and qualitative work and by encouraging stakeholders' participation.

  17. A competing risk model of first failure site after definitive (chemo) radiation therapy for locally advanced non-small cell lung cancer

    DEFF Research Database (Denmark)

    Nygård, Lotte; Vogelius, Ivan R; Fischer, Barbara M

    2018-01-01

    INTRODUCTION: The aim of the study was to build a model of first failure site and lesion specific failure probability after definitive chemo-radiotherapy for inoperable non-small cell lung cancer (NSCLC). METHODS: We retrospectively analyzed 251 patients receiving definitive chemo......-regional failure, multivariable logistic regression was applied to assess risk of each lesion being first site of failure. The two models were used in combination to predict lesion failure probability accounting for competing events. RESULTS: Adenocarcinoma had a lower hazard ratio (HR) of loco-regional (LR...

  18. Model-based software process improvement

    Science.gov (United States)

    Zettervall, Brenda T.

    1994-01-01

    The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.

  19. Limitations of JEDI Models | Jobs and Economic Development Impact Models |

    Science.gov (United States)

    JEDI models draw their multipliers from the IMPLAN Group's IMPLAN accounting software; for JEDI, these are updated every two years from the best available data. Input-output modeling remains a widely used methodology for measuring economic development activity, but results depend on the definition of the geographic area under consideration. Datasets of multipliers from IMPLAN are available at

  20. Programming Models in HPC

    Energy Technology Data Exchange (ETDEWEB)

    Shipman, Galen M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-06-13

    These are the slides for a presentation on programming models in HPC, at the Los Alamos National Laboratory's Parallel Computing Summer School. The following topics are covered: Flynn's Taxonomy of computer architectures; single instruction single data; single instruction multiple data; multiple instruction multiple data; address space organization; definition of Trinity (Intel Xeon-Phi is a MIMD architecture); single program multiple data; multiple program multiple data; ExMatEx workflow overview; definition of a programming model, programming languages, runtime systems; programming model and environments; MPI (Message Passing Interface); OpenMP; Kokkos (Performance Portable Thread-Parallel Programming Model); Kokkos abstractions, patterns, policies, and spaces; RAJA, a systematic approach to node-level portability and tuning; overview of the Legion Programming Model; mapping tasks and data to hardware resources; interoperability: supporting task-level models; Legion S3D execution and performance details; workflow, integration of external resources into the programming model.
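    As a minimal illustration of one of the listed concepts - the single program multiple data (SPMD) pattern - the sketch below runs the same worker code over different slices of the data and combines the partial results. It is a pure-Python stand-in for what MPI ranks would do; all names are invented:

```python
# SPMD sketch: every worker executes the same program on its own data slice,
# then partial results are reduced. Threads stand in for MPI ranks here.
from concurrent.futures import ThreadPoolExecutor

def worker(chunk):
    # identical code on every "rank"; only the data differs
    return sum(v * v for v in chunk)

data = list(range(1000))
chunks = [data[i::4] for i in range(4)]   # one interleaved slice per rank

with ThreadPoolExecutor(max_workers=4) as ex:
    partial = list(ex.map(worker, chunks))
total = sum(partial)                       # the "reduction" step
```

In a real MPI program the slicing would be implicit in the rank number and the final sum would be an `MPI_Reduce`; the structure is the same.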

  1. [Safety culture: definition, models and design].

    Science.gov (United States)

    Pfaff, Holger; Hammer, Antje; Ernstmann, Nicole; Kowalski, Christoph; Ommen, Oliver

    2009-01-01

    Safety culture is a multi-dimensional phenomenon. The safety culture of a healthcare organization is high if it has a common stock of knowledge, values and symbols with regard to patient safety. This article first defines safety culture and then, in a second step, demonstrates its effects. We present the model of safety behaviour and show how safety culture can affect behaviour and produce safe behaviour. In a third step we look at the causes of safety culture and present the safety culture model. The main hypothesis of this model is that the safety culture of a healthcare organization strongly depends on its communication culture and its social capital. Finally, we investigate how the safety culture of a healthcare organization can be improved. Based on the safety culture model, six measures to improve safety culture are presented.

  2. Finsler Geometry Modeling of an Orientation-Asymmetric Surface Model for Membranes

    Science.gov (United States)

    Proutorov, Evgenii; Koibuchi, Hiroshi

    2017-12-01

    In this paper, a triangulated surface model is studied in the context of Finsler geometry (FG) modeling. This FG model is an extended version of a recently reported model for two-component membranes, and it is asymmetric under surface inversion. We show that the definition of the model is independent of how the Finsler length of a bond is defined. This leads us to understand that the canonical (or Euclidean) surface model is obtained from the FG model such that it is uniquely determined as a trivial model from the viewpoint of well definedness.

  3. Document Models

    Directory of Open Access Journals (Sweden)

    A.A. Malykh

    2017-08-01

    Full Text Available In this paper, the concept of locally simple models is considered. Locally simple models are arbitrarily complex models built from relatively simple components. Many practically important domains of discourse can be described as locally simple models, for example, the business models of enterprises and companies. Up to now, research in human reasoning automation has mainly concentrated on the most intellectually intensive activities, such as automated theorem proving. On the other hand, a retailer's business model is formed from "jobs", and each "job" can be modelled and automated more or less easily. At the same time, the whole retailer model as an integrated system is extremely complex. In this paper, we offer a variant of a mathematical definition of a locally simple model. This definition is intended for modelling a wide range of domains; therefore, we must also take into account perceptual and psychological issues. Logic is elitist, and if we want to attract as many people as possible to our models, we need to hide this elitism behind some metaphor to which "ordinary" people are accustomed. As such a metaphor, we use the concept of a document, so our locally simple models are called document models. Document models are built in the paradigm of semantic programming. This allows us to achieve another important goal - to make the document models executable. Executable models are models that can act as practical information systems in the described domain of discourse. Thus, if our model is executable, then programming becomes redundant. The direct use of a model, instead of its programming coding, brings important advantages, for example, a drastic cost reduction for development and maintenance. Moreover, since the model is well-defined and sound, and not dissolved within programming modules, we can directly apply AI tools, in particular machine learning. This significantly expands the possibilities for automation and

  4. Assessing harmonic current source modelling and power definitions in balanced and unbalanced networks

    Energy Technology Data Exchange (ETDEWEB)

    Atkinson-Hope, Gary; Stemmet, W.C. [Cape Peninsula University of Technology, Cape Town Campus, Cape Town (South Africa)

    2006-07-01

    The purpose of this paper is to assess the DIgSILENT PowerFactory software's power definitions (indices) in terms of phase and sequence components for balanced and unbalanced networks when harmonic distortion is present, and to compare its results with hand calculations performed following the recommendations of the IEEE Working Group on this topic. The paper also develops a flowchart for calculating power indices in balanced and unbalanced three-phase networks when non-sinusoidal voltages and currents are present. A further purpose is to determine how two industrial-grade harmonic analysis software packages (DIgSILENT and ERACS) model three-phase harmonic sources used for current penetration studies, and to compare their results when applied to a network. From these investigations, a methodology was also developed for modelling harmonic current sources based on a spectrum obtained from measurements. Three case studies were conducted, and the assessment and developed methodologies were shown to be effective. (Author)

  5. Group-level self-definition and self-investment: a hierarchical (multicomponent) model of in-group identification.

    Science.gov (United States)

    Leach, Colin Wayne; van Zomeren, Martijn; Zebel, Sven; Vliek, Michael L W; Pennekamp, Sjoerd F; Doosje, Bertjan; Ouwerkerk, Jaap W; Spears, Russell

    2008-07-01

    Recent research shows individuals' identification with in-groups to be psychologically important and socially consequential. However, there is little agreement about how identification should be conceptualized or measured. On the basis of previous work, the authors identified 5 specific components of in-group identification and offered a hierarchical 2-dimensional model within which these components are organized. Studies 1 and 2 used confirmatory factor analysis to validate the proposed model of self-definition (individual self-stereotyping, in-group homogeneity) and self-investment (solidarity, satisfaction, and centrality) dimensions, across 3 different group identities. Studies 3 and 4 demonstrated the construct validity of the 5 components by examining their (concurrent) correlations with established measures of in-group identification. Studies 5-7 demonstrated the predictive and discriminant validity of the 5 components by examining their (prospective) prediction of individuals' orientation to, and emotions about, real intergroup relations. Together, these studies illustrate the conceptual and empirical value of a hierarchical multicomponent model of in-group identification.

  6. Metamodeling for Business Model Design : Facilitating development and communication of Business Model Canvas (BMC) models with an OMG standards-based metamodel.

    OpenAIRE

    Hauksson, Hilmar

    2013-01-01

    Interest for business models and business modeling has increased rapidly since the mid-1990‘s and there are numerous approaches used to create business models. The business model concept has many definitions which can lead to confusion and slower progress in the research and development of business models. A business model ontology (BMO) was created in 2004 where the business model concept was conceptualized based on an analysis of existing literature. A few years later the Business Model Can...

  7. Common problematic aspects of coupling hydrological models with groundwater flow models on the river catchment scale

    Directory of Open Access Journals (Sweden)

    R. Barthel

    2006-01-01

    Full Text Available Model coupling requires a thorough conceptualisation of the coupling strategy, including an exact definition of the individual model domains, the "transboundary" processes and the exchange parameters. It is shown here that in the case of coupling groundwater flow and hydrological models – in particular on the regional scale – it is very important to find a common definition and scale-appropriate process description of groundwater recharge and baseflow (or "groundwater runoff/discharge" in order to achieve a meaningful representation of the processes that link the unsaturated and saturated zones and the river network. As such, integration by means of coupling established disciplinary models is problematic given that in such models, processes are defined from a purpose-oriented, disciplinary perspective and are therefore not necessarily consistent with definitions of the same process in the model concepts of other disciplines. This article contains a general introduction to the requirements and challenges of model coupling in Integrated Water Resources Management including a definition of the most relevant technical terms, a short description of the commonly used approach of model coupling and finally a detailed consideration of the role of groundwater recharge and baseflow in coupling groundwater models with hydrological models. The conclusions summarize the most relevant problems rather than giving practical solutions. This paper aims to point out that working on a large scale in an integrated context requires rethinking traditional disciplinary workflows and encouraging communication between the different disciplines involved. It is worth noting that the aspects discussed here are mainly viewed from a groundwater perspective, which reflects the author's background.

  8. Moving towards maturity in business model definitions

    DEFF Research Database (Denmark)

    Nielsen, Christian; Lund, Morten; Bukh, Per Nikolaj

    2014-01-01

    The field of business models has, as is the case with all emerging fields of practice, slowly matured through the development of frameworks, models, concepts and ideas over the last 15 years. New concepts, theories and models typically transcend a series of maturity phases. For the concept of Bus...

  9. Draft Common Frame of Reference. Principles, Definitions and Model Rules of European Private Law

    OpenAIRE

    AA.VV; IUDICA G.

    2009-01-01

    European private law in principles, definitions and model rules. The volumes contain the results of the work of the Study Group on a European Civil Code (the “Study Group”) and the Research Group on Existing EC Private Law (the “Acquis Group”). The former Commission on European Contract Law (the “Lando Commission”) provided the basis for much of Books II and III; it was on their Principles of European Contract Law (PECL)1 that the Study Group and the Acquis Group built. The Acquis Group ...

  10. Towards an E-market Model

    DEFF Research Database (Denmark)

    Ivang, Reimer; Hinson, Robert; Somasundaram, Ramanathan

    2006-01-01

    Purpose: Seeks to argue that there are problems associated with e-market definitive efforts and consequently proposes a new e-market model. Design/methodology/approach: Paper based largely on a literature survey and an assessment of the existing e-market conceptualizations. Findings: Based on the literature survey and identification of gaps in the present e-market definitive models, the authors postulate a preliminary e-market reference model. Originality/value: Through synthesizing the e-market literature, and by taking into account contemporary e-market developments, key dimensions that define......

  11. Dosimetric comparison of the specific anthropomorphic mannequin (SAM) to 14 anatomical head models using a novel definition for the mobile phone positioning

    International Nuclear Information System (INIS)

    Kainz, Wolfgang; Christ, Andreas; Kellom, Tocher; Seidman, Seth; Nikoloski, Neviana; Beard, Brian; Kuster, Niels

    2005-01-01

    This paper presents new definitions for obtaining reproducible results in numerical phone dosimetry. Numerous numerical dosimetric studies have been published about the exposure of mobile phone users which concluded with conflicting results. However, many of these studies lack reproducibility due to shortcomings in the description of the phone positioning. The new approach was tested by two groups applying two different numerical program packages to compare the specific anthropomorphic mannequin (SAM) to 14 anatomically correct head models. A novel definition for the positioning of mobile phones next to anatomically correct head models is given along with other essential parameters to be reported. The definition is solely based on anatomical characteristics of the head. A simple up-to-date phone model was used to determine the peak spatial specific absorption rate (SAR) of mobile phones in SAM and in the anatomically correct head models. The results were validated by measurements. The study clearly shows that SAM gives a conservative estimate of the exposure in anatomically correct head models for head only tissue. Depending on frequency, phone position and head size the numerically calculated 10 g averaged SAR in the pinna can be up to 2.1 times greater than the peak spatial SAR in SAM. Measurements in small structures, such as the pinna, will significantly increase the uncertainty; therefore SAM was designed for SAR assessment in the head only. Whether SAM will provide a conservative value for the pinna depends on the pinna SAR limit of the safety standard considered

  12. Psychoacoustic and cognitive aspects of auditory roughness: definitions, models, and applications

    Science.gov (United States)

    Vassilakis, Pantelis N.; Kendall, Roger A.

    2010-02-01

    The term "auditory roughness" was first introduced in the 19th century to describe the buzzing, rattling auditory sensation accompanying narrow harmonic intervals (i.e., two tones presented simultaneously with a frequency difference in the range of ~15-150 Hz). A broader definition and an overview of the psychoacoustic correlates of the auditory roughness sensation, also referred to as sensory dissonance, are followed by an examination of efforts to quantify it over the past one hundred and fifty years, leading to the introduction of a new roughness calculation model and an application that automates spectral and roughness analysis of sound signals. Implementation of spectral and roughness analysis is briefly discussed in the context of two pilot perceptual experiments designed to assess the relationship among cultural background, music performance practice, and aesthetic attitudes towards the auditory roughness sensation.
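    One widely cited parametric approximation of the roughness-versus-frequency-difference curve is Sethares' fit to the Plomp-Levelt dissonance data. It is offered below only as an illustrative stand-in, not as the new model the paper itself introduces:

```python
# Sethares-style approximation of the Plomp-Levelt roughness curve for two
# simultaneous pure tones. Illustrative only; not the paper's own model.
import math

def roughness(f1, f2):
    """Approximate sensory roughness of two simultaneous pure tones (Hz)."""
    fmin, fmax = min(f1, f2), max(f1, f2)
    s = 0.24 / (0.0207 * fmin + 18.96)   # scales the curve with register
    d = fmax - fmin
    return math.exp(-3.5 * s * d) - math.exp(-5.75 * s * d)

r_unison = roughness(440.0, 440.0)   # no beating -> zero roughness
r_narrow = roughness(440.0, 470.0)   # ~30 Hz apart: inside the rough band
r_wide   = roughness(440.0, 740.0)   # wide interval: roughness dies away
```

The curve peaks for narrow separations of a few tens of Hz and decays toward zero both at unison and at wide intervals, matching the ~15-150 Hz "rough band" described above.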

  13. Flight Dynamic Model Exchange using XML

    Science.gov (United States)

    Jackson, E. Bruce; Hildreth, Bruce L.

    2002-01-01

    The AIAA Modeling and Simulation Technical Committee has worked for several years to develop a standard by which the information needed to develop physics-based models of aircraft can be specified. The purpose of this standard is to provide a well-defined set of information, definitions, data tables and axis systems so that cooperating organizations can transfer a model from one simulation facility to another with maximum efficiency. This paper proposes using an application of the eXtensible Markup Language (XML) to implement the AIAA simulation standard. The motivation and justification for using a standard such as XML is discussed. Necessary data elements to be supported are outlined. An example of an aerodynamic model as an XML file is given. This example includes definition of independent and dependent variables for function tables, definition of key variables used to define the model, and axis systems used. The final steps necessary for implementation of the standard are presented. Software to take an XML-defined model and import/export it to/from a given simulation facility is discussed, but not demonstrated. That would be the next step in final implementation of standards for physics-based aircraft dynamic models.
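    As a hedged sketch of the idea - not the actual AIAA schema - the snippet below encodes a one-dimensional aerodynamic function table in XML (element and attribute names invented for illustration) and reads it back for linear interpolation:

```python
# Sketch: a 1-D aerodynamic function table exchanged as XML, then used for
# piecewise-linear lookup. The tag names are invented for illustration; the
# real AIAA standard defines its own schema.
import xml.etree.ElementTree as ET

XML = """
<functionTable name="CL_vs_alpha">
  <independentVar name="alpha" units="deg">-4 0 4 8 12</independentVar>
  <dependentVar name="CL">-0.2 0.2 0.6 1.0 1.3</dependentVar>
</functionTable>
"""

root = ET.fromstring(XML)
xs = [float(v) for v in root.find("independentVar").text.split()]
ys = [float(v) for v in root.find("dependentVar").text.split()]

def interp(x):
    """Piecewise-linear lookup with flat extrapolation at the ends."""
    if x <= xs[0]:
        return ys[0]
    if x >= xs[-1]:
        return ys[-1]
    i = max(j for j in range(len(xs)) if xs[j] <= x)
    t = (x - xs[i]) / (xs[i + 1] - xs[i])
    return ys[i] + t * (ys[i + 1] - ys[i])

cl = interp(2.0)   # halfway between the 0 and 4 deg breakpoints
```

Keeping the independent variable, units, and breakpoints explicit in the file is what lets two simulation facilities reproduce the same table without ambiguity.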

  14. Multivariable Wind Modeling in State Space

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Pedersen, B. J.

    2011-01-01

    Turbulence of the incoming wind field is of paramount importance to the dynamic response of wind turbines. Hence reliable stochastic models of the turbulence should be available from which time series can be generated for dynamic response and structural safety analysis. In the paper an empirical...... the succeeding state space and ARMA modeling of the turbulence rely on the positive definiteness of the cross-spectral density matrix, so the problem of the non-positive definiteness of such matrices is first addressed and suitable treatments regarding it are proposed. From the adjusted positive definite cross...... for the vector turbulence process incorporating its phase spectrum in one stage, and its results are compared with a conventional ARMA modeling method.
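    One generic way to repair a non-positive-definite cross-spectral density estimate - offered here only as an illustrative treatment, not necessarily the one proposed in the paper - is eigenvalue clipping: symmetrize, replace negative eigenvalues by a floor, and reassemble.

```python
# Eigenvalue clipping: a generic fix for an indefinite (cross-spectral
# density) matrix. Not necessarily the treatment used in the paper.
import numpy as np

def nearest_psd(S, floor=0.0):
    """Return a symmetric positive semi-definite adjustment of S."""
    S = 0.5 * (S + S.T)                  # enforce symmetry first
    w, V = np.linalg.eigh(S)
    w_clipped = np.clip(w, floor, None)  # clip negative eigenvalues
    return V @ np.diag(w_clipped) @ V.T

S = np.array([[2.0, 1.9, 0.0],
              [1.9, 1.0, 0.0],
              [0.0, 0.0, 0.5]])          # indefinite: one negative eigenvalue
S_psd = nearest_psd(S)
eigs = np.linalg.eigvalsh(S_psd)
```

Positive semi-definiteness is what makes the matrix factorizable (e.g. by Cholesky-like decompositions), which the subsequent state space/ARMA fitting requires.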

  15. A probabilistic approach using deformable organ models for automatic definition of normal anatomical structures for 3D treatment planning

    International Nuclear Information System (INIS)

    Fritsch, Daniel; Yu Liyun; Johnson, Valen; McAuliffe, Matthew; Pizer, Stephen; Chaney, Edward

    1996-01-01

    Purpose/Objective: Current clinical methods for defining normal anatomical structures on tomographic images are time consuming and subject to intra- and inter-user variability. With the widespread implementation of 3D RTP, conformal radiotherapy, and dose escalation the implications of imprecise object definition have assumed a much higher level of importance. Object definition and volume-weighted metrics for normal anatomy, such as DVHs and NTCPs, play critical roles in aiming, shaping, and weighting beams. Improvements in object definition, including computer automation, are essential to yield reliable volume-weighted metrics and gains in human efficiency. The purpose of this study was to investigate a probabilistic approach using deformable models to automatically recognize and extract normal anatomy in tomographic images. Materials and Methods: Object models were created from normal organs that were segmented by an interactive method which involved placing a cursor near the center of the object on a slice and clicking a mouse button to initiate computation of structures called cores. Cores describe the skeletal and boundary shape of image objects in a manner that, in 2D, associates a location on the skeleton with the width of the object at that location. A significant advantage of cores is stability against image disturbances such as noise and blur. The model was composed of a relatively small set of extracted points on the skeleton and boundary. The points were carefully chosen to summarize the shape information captured by the cores. Neighborhood relationships between points were represented mathematically by energy functions that penalize, due to warping of the model, the "goodness" of match between the model and the image data at any stage during the segmentation process. 
The model was matched against the image data using a probabilistic approach based on Bayes theorem, which provides a means for computing a posteriori (posterior) probability from 1) a
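    The Bayesian matching step can be illustrated with a deliberately tiny numeric sketch (all energies and scores below are invented): the posterior score of each candidate deformation is the product of a Gibbs-style shape prior, derived from the warping energy, and an image-match likelihood, normalized over candidates.

```python
# Toy illustration of Bayes' theorem as used to score candidate
# segmentations: P(deformation | image) is proportional to
# P(image | deformation) * P(deformation). All numbers are made up.
import math

prior_energy = [0.0, 1.0, 4.0]    # larger model warp -> larger energy
likelihood   = [0.2, 0.5, 0.6]    # how well each candidate matches the image

prior = [math.exp(-e) for e in prior_energy]       # Gibbs-style shape prior
unnorm = [p * l for p, l in zip(prior, likelihood)]
posterior = [u / sum(unnorm) for u in unnorm]      # normalize over candidates
best = max(range(3), key=posterior.__getitem__)
```

Here the third candidate matches the image best, but its large warping energy makes it implausible under the prior, so the undeformed candidate wins - exactly the trade-off the energy functions above encode.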

  16. 4K Video Traffic Prediction using Seasonal Autoregressive Modeling

    Directory of Open Access Journals (Sweden)

    D. R. Marković

    2017-06-01

    Full Text Available From the perspective of the average viewer, high definition video streams such as HD (High Definition) and UHD (Ultra HD) are increasing their internet presence year over year. This is not surprising given the expansion of HD streaming services such as YouTube, Netflix, etc. High definition video streams are therefore starting to challenge network resource allocation with their bandwidth requirements and statistical characteristics. The analysis and modeling of this demanding video traffic is essential for better quality-of-service and quality-of-experience support. In this paper we use an easy-to-apply statistical model for the prediction of 4K video traffic: seasonal autoregressive modeling is applied to the prediction of 4K video traffic encoded with HEVC (High Efficiency Video Coding). Analysis and modeling were performed within the R programming environment using over 17,000 high definition video frames. It is shown that the proposed methodology provides good accuracy in high definition video traffic modeling.
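    The paper fits seasonal autoregressive models in R; as an illustrative stand-in, the sketch below fits a toy seasonal regression x[t] = a·x[t-1] + b·x[t-s] + c by least squares to a synthetic periodic "frame size" series. The period, coefficients, and data are all invented for illustration:

```python
# Minimal seasonal autoregressive prediction on synthetic traffic data:
# regress each value on its previous value and its value one season back.
import numpy as np

rng = np.random.default_rng(0)
s = 12                                   # seasonal period (illustrative)
t = np.arange(600)
x = 10 + 3 * np.sin(2 * np.pi * t / s) + 0.3 * rng.standard_normal(600)

# Regression design: rows [x[t-1], x[t-s], 1] -> target x[t]
A = np.column_stack([x[s - 1:-1], x[:-s], np.ones(len(x) - s)])
y = x[s:]
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

pred = A @ coef
rmse = np.sqrt(np.mean((y - pred) ** 2))          # in-sample fit error
naive = np.sqrt(np.mean((y - x[s - 1:-1]) ** 2))  # predict-previous baseline
```

Because the seasonal lag x[t-s] captures the periodic component directly, the seasonal regression beats the naive predict-previous baseline on this kind of series - the same reason SARIMA-type models suit periodic video traffic.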

  17. The infinitesimal model: Definition, derivation, and implications.

    Science.gov (United States)

    Barton, N H; Etheridge, A M; Véber, A

    2017-12-01

    Our focus here is on the infinitesimal model. In this model, one or several quantitative traits are described as the sum of a genetic and a non-genetic component, the first being distributed within families as a normal random variable centred at the average of the parental genetic components, and with a variance independent of the parental traits. Thus, the variance that segregates within families is not perturbed by selection, and can be predicted from the variance components. This does not necessarily imply that the trait distribution across the whole population should be Gaussian, and indeed selection or population structure may have a substantial effect on the overall trait distribution. One of our main aims is to identify some general conditions on the allelic effects for the infinitesimal model to be accurate. We first review the long history of the infinitesimal model in quantitative genetics. Then we formulate the model at the phenotypic level in terms of individual trait values and relationships between individuals, but including different evolutionary processes: genetic drift, recombination, selection, mutation, population structure, …. We give a range of examples of its application to evolutionary questions related to stabilising selection, assortative mating, effective population size and response to selection, habitat preference and speciation. We provide a mathematical justification of the model as the limit as the number M of underlying loci tends to infinity of a model with Mendelian inheritance, mutation and environmental noise, when the genetic component of the trait is purely additive. We also show how the model generalises to include epistatic effects. We prove in particular that, within each family, the genetic components of the individual trait values in the current generation are indeed normally distributed with a variance independent of ancestral traits, up to an error of order 1∕M. Simulations suggest that in some cases the convergence
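    The family-level assumption described above is easy to see in a toy simulation (all variance values below are illustrative, not from the paper): offspring genetic values are drawn around the parental midpoint with a fixed segregation variance, and that within-family variance is untouched by selection on the parents.

```python
# Toy simulation of the infinitesimal model's core assumption: each
# offspring's genetic value is the parental midpoint plus a normal
# "segregation" deviation whose variance does not depend on the parents.
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
v_seg = 0.5                              # within-family segregation variance

mothers = rng.normal(0.0, 1.0, n)
fathers = rng.normal(0.0, 1.0, n)
offspring = 0.5 * (mothers + fathers) + rng.normal(0.0, np.sqrt(v_seg), n)

within = offspring - 0.5 * (mothers + fathers)   # within-family deviation

# Apply strong truncation "selection" to the parents: the within-family
# variance is unchanged, because selection shifts means, not segregation.
sel = mothers + fathers > 1.0
var_all = within.var()
var_sel = within[sel].var()
```

This is the property the abstract emphasizes: the variance segregating within families is not perturbed by selection, even though selection can shift the population mean substantially.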

  18. Modelling binary data

    CERN Document Server

    Collett, David

    2002-01-01

    INTRODUCTION Some Examples The Scope of this Book Use of Statistical Software STATISTICAL INFERENCE FOR BINARY DATA The Binomial Distribution Inference about the Success Probability Comparison of Two Proportions Comparison of Two or More Proportions MODELS FOR BINARY AND BINOMIAL DATA Statistical Modelling Linear Models Methods of Estimation Fitting Linear Models to Binomial Data Models for Binomial Response Data The Linear Logistic Model Fitting the Linear Logistic Model to Binomial Data Goodness of Fit of a Linear Logistic Model Comparing Linear Logistic Models Linear Trend in Proportions Comparing Stimulus-Response Relationships Non-Convergence and Overfitting Some other Goodness of Fit Statistics Strategy for Model Selection Predicting a Binary Response Probability BIOASSAY AND SOME OTHER APPLICATIONS The Tolerance Distribution Estimating an Effective Dose Relative Potency Natural Response Non-Linear Logistic Regression Models Applications of the Complementary Log-Log Model MODEL CHECKING Definition of Re...
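    As a rough illustration of one step in this outline, fitting the linear logistic model to grouped binomial data, the sketch below runs Newton-Raphson on the binomial log-likelihood for a single covariate. The bioassay-style data and the function name are invented for the example, not taken from the book:

```python
import math

def fit_logistic(doses, successes, trials, iters=25):
    """Fit log(p/(1-p)) = a + b*dose to grouped binomial data by
    Newton-Raphson on the binomial log-likelihood (a hand-rolled sketch)."""
    a, b = 0.0, 0.0
    for _ in range(iters):
        # score vector and expected information (negative Hessian)
        s0 = s1 = h00 = h01 = h11 = 0.0
        for x, y, n in zip(doses, successes, trials):
            p = 1.0 / (1.0 + math.exp(-(a + b * x)))
            w = n * p * (1.0 - p)
            r = y - n * p
            s0 += r; s1 += r * x
            h00 += w; h01 += w * x; h11 += w * x * x
        det = h00 * h11 - h01 * h01
        a += ( h11 * s0 - h01 * s1) / det   # Newton step: H^{-1} * score
        b += (-h01 * s0 + h00 * s1) / det
    return a, b

# Hypothetical bioassay-style data: success rate rises with dose.
doses = [0.0, 1.0, 2.0, 3.0, 4.0]
successes = [2, 8, 15, 23, 27]
trials = [30, 30, 30, 30, 30]
a, b = fit_logistic(doses, successes, trials)
print(round(b, 3))  # positive slope: higher dose, higher success probability
```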

  19. Conceptual IT model

    Science.gov (United States)

    Arnaoudova, Kristina; Stanchev, Peter

    2015-11-01

    Business processes are the key asset of every organization, and the design of business process models is a foremost concern among an organization's functions. Business processes and their proper management depend heavily on the performance of software applications and technology solutions. This paper attempts to define a new conceptual model of an IT service provider; it can be viewed as an IT-focused enterprise model, part of the Enterprise Architecture (EA) school.

  20. Persuasive Game Design : A model and its definitions

    NARCIS (Netherlands)

    Visch, V.T.; Vegt, N.J.H.; Anderiesen, H.; Van der Kooij, K.

    2013-01-01

    The following position paper proposes a general theoretical model for persuasive game design. This model combines existing theories on persuasive technology, serious gaming, and gamification. The model is based on user experience, gamification design, and transfer effects.

  1. A Bigraph Relational Model

    DEFF Research Database (Denmark)

    Beauquier, Maxime; Schürmann, Carsten

    2011-01-01

    In this paper, we present a model based on relations for bigraphical reactive system [Milner09]. Its defining characteristics are that validity and reaction relations are captured as traces in a multi-set rewriting system. The relational model is derived from Milner's graphical definition...

  2. Integrated Safety Culture Model and Application

    Institute of Scientific and Technical Information of China (English)

    汪磊; 孙瑞山; 刘汉辉

    2009-01-01

    A new safety culture model is constructed and applied to analyze the correlations between safety culture and SMS. On the basis of previous typical definitions, models and theories of safety culture, an in-depth analysis of safety culture's structure, composing elements and their correlations was conducted. A new definition of safety culture was proposed from the perspective of sub-culture. Seven types of safety sub-culture were defined: safety priority culture, standardizing culture, flexible culture, learning culture, teamwork culture, reporting culture and justice culture. An integrated safety culture model (ISCM) was then put forward based on this definition. The model divides safety culture into an intrinsic latency level and an extrinsic indication level, and explains the potential relationship between the safety sub-cultures and all safety culture dimensions. Finally, in analyzing safety culture and SMS, it concludes that a positive safety culture is the basis for implementing SMS effectively, and that an advanced SMS will in turn improve safety culture all around.

  3. Integrated source-risk model for radon: A definition study

    International Nuclear Information System (INIS)

    Laheij, G.M.H.; Aldenkamp, F.J.; Stoop, P.

    1993-10-01

    The purpose of a source-risk model is to support policy making on radon mitigation by comparing the effects of various policy options and to enable optimization of countermeasures applied to different parts of the source-risk chain. There are several advantages to developing and using a source-risk model: risk calculations are standardized; the effects of measures applied to different parts of the source-risk chain can be better compared because interactions are included; and sensitivity analyses can be used to determine the most important parameters within the total source-risk chain. After an inventory of the processes and sources to be included in the source-risk chain, the models presently available in the Netherlands are investigated. The models were screened for completeness, validation and operational status. The investigation made clear that, by choosing the most convenient model for each part of the source-risk chain, a source-risk chain model for radon may be realized. However, the calculation of dose from radon concentrations and the validation status of most models should be improved. At present, calculations with the proposed source-risk model will give estimates with a large uncertainty. For further development of the source-risk model, an interaction between the source-risk model and experimental research is recommended. Organisational forms of the source-risk model are discussed. A source-risk model in which only simple models are included is also recommended. The other models are operated and administrated by the model owners. The model owners execute their models for a combination of input parameters. The output of the models is stored in a database which will be used for calculations with the source-risk model. 5 figs., 15 tabs., 7 appendices, 14 refs

  4. The modified turning bands (MTB) model for space-time rainfall. I. Model definition and properties

    Science.gov (United States)

    Mellor, Dale

    1996-02-01

    A new stochastic model of space-time rainfall, the Modified Turning Bands (MTB) model, is proposed which reproduces, in particular, the movements and developments of rainbands, cluster potential regions and raincells, as well as their respective interactions. The ensemble correlation structure is unsuitable for practical estimation of the model parameters because the model is not ergodic in this statistic, and hence it cannot easily be measured from a single real storm. Thus, some general theory on the internal covariance structure of a class of stochastic models is presented, of which the MTB model is an example. It is noted that, for the MTB model, the internal covariance structure may be measured from a single storm, and can thus be used for model identification.

  5. Natural climate variability in a coupled model

    International Nuclear Information System (INIS)

    Zebiak, S.E.; Cane, M.A.

    1990-01-01

    Multi-century simulations with a simplified coupled ocean-atmosphere model are described. These simulations reveal an impressive range of variability on decadal and longer time scales, in addition to the dominant interannual El Niño/Southern Oscillation signal that the model was originally designed to simulate. Based on a very large sample of century-long simulations, it is nonetheless possible to identify distinct model parameter sensitivities, which are described here in terms of selected indices. Preliminary experiments motivated by general circulation model results for increasing greenhouse gases suggest a definite sensitivity to model global warming. While these results are not definitive, they strongly suggest that coupled air-sea dynamics figure prominently in global change and must be included in models for reliable predictions

  6. A Method for Model Checking Feature Interactions

    DEFF Research Database (Denmark)

    Pedersen, Thomas; Le Guilly, Thibaut; Ravn, Anders Peter

    2015-01-01

    This paper presents a method to check for feature interactions in a system assembled from independently developed concurrent processes, as found in many reactive systems. The method combines and refines existing definitions and adds a set of activities. The activities describe how to populate the definitions with models to ensure that all interactions are captured. The method is illustrated on a home automation example with model checking as the analysis tool. In particular, the modelling formalism is timed automata and the analysis uses UPPAAL to find interactions.

  7. A Gaussian mixture model for definition of lung tumor volumes in positron emission tomography

    International Nuclear Information System (INIS)

    Aristophanous, Michalis; Penney, Bill C.; Martel, Mary K.; Pelizzari, Charles A.

    2007-01-01

    The increased interest in 18F-fluorodeoxyglucose (FDG) positron emission tomography (PET) in radiation treatment planning in the past five years necessitated the independent and accurate segmentation of gross tumor volume (GTV) from FDG-PET scans. In some studies the radiation oncologist contours the GTV based on a computed tomography scan, while incorporating pertinent data from the PET images. Alternatively, a simple threshold, typically 40% of the maximum intensity, has been employed to differentiate tumor from normal tissue, while other researchers have developed algorithms to aid the PET based GTV definition. None of these methods, however, results in reliable PET tumor segmentation that can be used for more sophisticated treatment plans. For this reason, we developed a Gaussian mixture model (GMM) based segmentation technique on selected PET tumor regions from non-small cell lung cancer patients. The purpose of this study was to investigate the feasibility of using a GMM-based tumor volume definition in a robust, reliable and reproducible way. A GMM relies on the idea that any distribution, in our case a distribution of image intensities, can be expressed as a mixture of Gaussian densities representing different classes. According to our implementation, each class belongs to one of three regions in the image; the background (B), the uncertain (U) and the target (T), and from these regions we can obtain the tumor volume. User interaction in the implementation is required, but is limited to the initialization of the model parameters and the selection of an "analysis region" to which the modeling is restricted. The segmentation was developed on three and tested on another four clinical cases to ensure robustness against differences observed in the clinic. It also compared favorably with thresholding at 40% of the maximum intensity and a threshold determination function based on tumor to background image intensities proposed in a recent paper. The parts of the ...
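    The core of the segmentation idea, expressing an intensity distribution as a mixture of Gaussian classes (background, uncertain, target), can be sketched with a plain EM fit in one dimension. This is a simplified stand-in for the paper's method: the intensities are synthetic, the initial parameters are hand-picked, and there is no analysis-region selection:

```python
import math, random

def em_gmm_1d(data, means, sds, weights, iters=50):
    """Minimal 1-D EM fit of a 3-class Gaussian mixture (background /
    uncertain / target classes), sketching the idea behind GMM segmentation."""
    for _ in range(iters):
        # E-step: responsibility of each class for each intensity value
        resp = []
        for x in data:
            dens = [w * math.exp(-(x - m) ** 2 / (2 * s * s)) / (s * math.sqrt(2 * math.pi))
                    for m, s, w in zip(means, sds, weights)]
            tot = sum(dens)
            resp.append([d / tot for d in dens])
        # M-step: re-estimate class parameters from the responsibilities
        for k in range(3):
            nk = sum(r[k] for r in resp)
            means[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            sds[k] = math.sqrt(sum(r[k] * (x - means[k]) ** 2
                                   for r, x in zip(resp, data)) / nk) or 1e-6
            weights[k] = nk / len(data)
    return means, sds, weights

rng = random.Random(0)
# Synthetic "image intensities": dark background, mid-level rim, bright target.
data = ([rng.gauss(10, 2) for _ in range(300)] +
        [rng.gauss(30, 4) for _ in range(100)] +
        [rng.gauss(60, 5) for _ in range(50)])
means, sds, weights = em_gmm_1d(data, [5.0, 25.0, 50.0],
                                [5.0, 5.0, 5.0], [1/3, 1/3, 1/3])
print([round(m) for m in sorted(means)])
```

After fitting, voxels would be assigned to the class with the highest responsibility, and the target class yields the tumor volume.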

  8. A Competing Risk Model of First Failure Site after Definitive Chemoradiation Therapy for Locally Advanced Non-Small Cell Lung Cancer.

    Science.gov (United States)

    Nygård, Lotte; Vogelius, Ivan R; Fischer, Barbara M; Kjær, Andreas; Langer, Seppo W; Aznar, Marianne C; Persson, Gitte F; Bentzen, Søren M

    2018-04-01

    The aim of the study was to build a model of first failure site- and lesion-specific failure probability after definitive chemoradiotherapy for inoperable NSCLC. We retrospectively analyzed 251 patients receiving definitive chemoradiotherapy for NSCLC at a single institution between 2009 and 2015. All patients were scanned by fludeoxyglucose positron emission tomography/computed tomography for radiotherapy planning. Clinical patient data and fludeoxyglucose positron emission tomography standardized uptake values from primary tumor and nodal lesions were analyzed by using multivariate cause-specific Cox regression. In patients experiencing locoregional failure, multivariable logistic regression was applied to assess risk of each lesion being the first site of failure. The two models were used in combination to predict probability of lesion failure accounting for competing events. Adenocarcinoma had a lower hazard ratio (HR) of locoregional failure than squamous cell carcinoma (HR = 0.45, 95% confidence interval [CI]: 0.26-0.76, p = 0.003). Distant failures were more common in the adenocarcinoma group (HR = 2.21, 95% CI: 1.41-3.48, p failure showed that primary tumors were more likely to fail than lymph nodes (OR = 12.8, 95% CI: 5.10-32.17, p failure (OR = 1.26 per unit increase, 95% CI: 1.12-1.40, p failure site-specific competing risk model based on patient- and lesion-level characteristics. Failure patterns differed between adenocarcinoma and squamous cell carcinoma, illustrating the limitation of aggregating them into NSCLC. Failure site-specific models add complementary information to conventional prognostic models. Copyright © 2018 International Association for the Study of Lung Cancer. Published by Elsevier Inc. All rights reserved.
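    A competing-risk analysis of first failure site rests on cumulative incidence rather than naive one-minus-Kaplan-Meier curves. The sketch below implements the nonparametric Aalen-Johansen estimator for one cause; the follow-up times and event codes are invented for illustration and are not the study's data:

```python
def cumulative_incidence(times, events, cause):
    """Nonparametric (Aalen-Johansen) cumulative incidence for one failure
    cause in the presence of competing events.
    events: 0 = censored, 1, 2, ... = failure causes."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    surv = 1.0     # all-cause survival just before the current time
    cif = 0.0
    curve = []
    i = 0
    while i < len(order):
        t = times[order[i]]
        d_cause = d_all = n_cens = 0
        while i < len(order) and times[order[i]] == t:   # group ties at t
            e = events[order[i]]
            if e == 0:
                n_cens += 1
            else:
                d_all += 1
                if e == cause:
                    d_cause += 1
            i += 1
        cif += surv * d_cause / at_risk      # chance of failing from 'cause' at t
        surv *= 1.0 - d_all / at_risk        # any-cause failure reduces survival
        at_risk -= d_all + n_cens
        curve.append((t, cif))
    return curve

# Invented follow-up times (months); cause 1 = locoregional, 2 = distant.
times  = [3, 5, 5, 8, 10, 12, 14, 18, 20, 24]
events = [1, 2, 0, 1,  2,  0,  1,  2,  0,  0]
curve = cumulative_incidence(times, events, cause=1)
print(curve[-1])
```

Unlike treating competing events as censoring, this estimator keeps the cause-specific incidences summing to the all-cause failure probability.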

  9. Improved Maximum Parsimony Models for Phylogenetic Networks.

    Science.gov (United States)

    Van Iersel, Leo; Jones, Mark; Scornavacca, Celine

    2018-05-01

    Phylogenetic networks are well suited to represent evolutionary histories comprising reticulate evolution. Several methods aiming at reconstructing explicit phylogenetic networks have been developed in the last two decades. In this article, we propose a new definition of maximum parsimony for phylogenetic networks that permits to model biological scenarios that cannot be modeled by the definitions currently present in the literature (namely, the "hardwired" and "softwired" parsimony). Building on this new definition, we provide several algorithmic results that lay the foundations for new parsimony-based methods for phylogenetic network reconstruction.
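    On a tree (the degenerate network with no reticulations), parsimony scoring reduces to the classical Fitch small-parsimony algorithm. The sketch below, with an invented four-taxon tree and character, counts the minimum number of state changes; it covers only the tree case, not the hardwired/softwired network generalizations discussed in the article:

```python
def fitch(tree, leaf_states):
    """Fitch small parsimony on a rooted binary tree: minimum number of
    state changes needed to explain one character at the leaves."""
    def walk(node):
        if isinstance(node, str):                     # leaf label
            return {leaf_states[node]}, 0
        left, right = node
        sl, cl = walk(left)
        sr, cr = walk(right)
        inter = sl & sr
        if inter:
            return inter, cl + cr                     # states agree: no change
        return sl | sr, cl + cr + 1                   # disagreement: one change
    return walk(tree)[1]

# Hypothetical 4-taxon tree ((A,B),(C,D)) and one binary character.
tree = (("A", "B"), ("C", "D"))
states = {"A": 0, "B": 0, "C": 1, "D": 0}
print(fitch(tree, states))
```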

  10. Ferrofluids: Modeling, numerical analysis, and scientific computation

    Science.gov (United States)

    Tomas, Ignacio

    This dissertation presents some developments in the numerical analysis of partial differential equations (PDEs) describing the behavior of ferrofluids. The most widely accepted PDE model for ferrofluids is the micropolar model proposed by R.E. Rosensweig. The Micropolar Navier-Stokes Equations (MNSE) are a subsystem of PDEs within the Rosensweig model. Being a simplified version of the much bigger system of PDEs proposed by Rosensweig, the MNSE are a natural starting point of this thesis. The MNSE couple linear velocity u, angular velocity w, and pressure p. We propose and analyze a first-order semi-implicit fully-discrete scheme for the MNSE, which decouples the computation of the linear and angular velocities, is unconditionally stable, and delivers optimal convergence rates under assumptions analogous to those used for the Navier-Stokes equations. Moving on to the much more complex Rosensweig model, we provide a definition (approximation) for the effective magnetizing field h, and explain the assumptions behind this definition. Unlike previous definitions available in the literature, this new definition is able to accommodate the effect of external magnetic fields. Using this definition we set up the system of PDEs coupling linear velocity u, pressure p, angular velocity w, magnetization m, and magnetic potential ϕ. We show that this system is energy-stable and devise a numerical scheme that mimics the same stability property. We prove that solutions of the numerical scheme always exist and, under certain simplifying assumptions, that the discrete solutions converge. A notable outcome of the analysis of the numerical scheme for the Rosensweig model is the choice of finite element spaces that allow the construction of an energy-stable scheme. Finally, with the lessons learned from the Rosensweig model, we develop a diffuse-interface model describing the behavior of two-phase ferrofluid flows and present an energy-stable numerical scheme for this model. For a

  11. Criterion of Semi-Markov Dependent Risk Model

    Institute of Scientific and Technical Information of China (English)

    Xiao Yun MO; Xiang Qun YANG

    2014-01-01

    A rigorous definition of the semi-Markov dependent risk model is given. This model is a generalization of the Markov dependent risk model. A criterion and necessary conditions for the semi-Markov dependent risk model are obtained. The results clarify the relations between the elements of the semi-Markov dependent risk model and are also applicable to the Markov dependent risk model.

  12. Decision-Making Theories and Models: A Discussion of Rational and Psychological Decision-Making Theories and Models: The Search for a Cultural-Ethical Decision-Making Model

    OpenAIRE

    Oliveira, Arnaldo

    2007-01-01

    This paper examines rational and psychological decision-making models. Descriptive and normative methodologies such as attribution theory, schema theory, prospect theory, ambiguity model, game theory, and expected utility theory are discussed. The definition of culture is reviewed, and the relationship between culture and decision making is also highlighted as many organizations use a cultural-ethical decision-making model.

  13. A Model-Driven Approach for Telecommunications Network Services Definition

    Science.gov (United States)

    Chiprianov, Vanea; Kermarrec, Yvon; Alff, Patrick D.

    The present-day telecommunications market imposes a short concept-to-market time on service providers. To reduce it, we propose a computer-aided, model-driven, service-specific tool, with support for collaborative work and for checking properties on models. We started by defining a prototype of the Meta-model (MM) of the service domain. Using this prototype, we defined a simple graphical modeling language specific to service designers. We are currently enlarging the MM of the domain using model transformations from Network Abstractions Layers (NALs). In the future, we will investigate approaches to ensure support for collaborative work and for checking properties on models.

  14. Progress in modeling and simulation.

    Science.gov (United States)

    Kindler, E

    1998-01-01

    For the modeling of systems, computers are used more and more, while the other "media" (including the human intellect) carrying the models are abandoned. For the modeling of knowledge, i.e. of more or less general concepts (possibly used to model systems composed of instances of such concepts), object-oriented programming is nowadays widely used. For the modeling of processes existing and developing in time, computer simulation is used, the results of which are often presented by means of animation (graphical pictures moving and changing in time). Unfortunately, object-oriented programming tools are commonly not designed to be of great use for simulation, while programming tools for simulation do not enable their users to apply the advantages of object-oriented programming. Nevertheless, there are exceptions that enable general concepts represented at a computer to be used for constructing simulation models and for their easy modification. They are described in the present paper, together with precise definitions of modeling, simulation and object-oriented programming (including cases that do not satisfy the definitions but risk introducing misunderstanding), and an outline of their applications and further development. Since computing systems are being introduced as control components into a large spectrum of (technological, social and biological) systems, attention is directed to models of systems containing modeling components.
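    The combination the abstract argues for, general concepts represented as classes whose instances drive a simulation in time, can be sketched as a minimal object-oriented discrete-event simulation. The `Machine`/`Simulation` classes and the service times are invented for illustration:

```python
import heapq

class Machine:
    """A general concept ("class") whose instances take part in a process
    model: each machine repeatedly serves jobs of a fixed duration and
    schedules its own next completion event."""
    def __init__(self, name, service_time):
        self.name, self.service_time, self.done = name, service_time, 0

    def finish(self, sim, now):
        self.done += 1                                # one job completed
        sim.schedule(now + self.service_time, self)   # start the next job

class Simulation:
    """A minimal discrete-event engine: a time-ordered event queue."""
    def __init__(self):
        self.queue, self.counter = [], 0

    def schedule(self, time, machine):
        self.counter += 1                             # tie-breaker for equal times
        heapq.heappush(self.queue, (time, self.counter, machine))

    def run(self, until):
        while self.queue and self.queue[0][0] <= until:
            now, _, machine = heapq.heappop(self.queue)
            machine.finish(self, now)

machines = [Machine("M1", 3.0), Machine("M2", 5.0)]
sim = Simulation()
for m in machines:
    sim.schedule(m.service_time, m)   # first completion of each machine
sim.run(until=20.0)
print({m.name: m.done for m in machines})
```

Subclassing `Machine` is where the object-oriented side pays off: a refined concept inherits the event-scheduling behavior unchanged.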

  15. Systems and context modeling approach to requirements analysis

    Science.gov (United States)

    Ahuja, Amrit; Muralikrishna, G.; Patwari, Puneet; Subhrojyoti, C.; Swaminathan, N.; Vin, Harrick

    2014-08-01

    Ensuring completeness and correctness of the requirements for a complex system such as the SKA is challenging. Current system engineering practice includes developing a stakeholder needs definition, a concept of operations, and defining system requirements in terms of use cases and requirements statements. We present a method that enhances this current practice into a collection of system models with mutual consistency relationships. These include stakeholder goals, needs definition and system-of-interest models, together with a context model that participates in the consistency relationships among these models. We illustrate this approach by using it to analyze the SKA system requirements.

  16. Ontology for Life-Cycle Modeling of Water Distribution Systems: Model View Definition

    Science.gov (United States)

    2013-06-01

    • To all attributes with a defined datatype indicating a measure datatype; • To all properties and quantities with a defined datatype indicating a measure datatype and with no local unit definitions provided. 3.2.3.4 Project context: A project representation context indicates the coordinate system orientation...

  17. A Memory Hierarchy Model Based on Data Reuse for Full-Search Motion Estimation on High-Definition Digital Videos

    Directory of Open Access Journals (Sweden)

    Alba Sandyra Bezerra Lopes

    2012-01-01

    Full Text Available The motion estimation is the most complex module in a video encoder, requiring a high processing throughput and high memory bandwidth, especially when the focus is high-definition video. The throughput problem can be solved by increasing the parallelism of the internal operations, while the external memory bandwidth may be reduced using a memory hierarchy. This work presents a memory hierarchy model for a full-search motion estimation core. The proposed memory hierarchy model is based on a data reuse scheme that considers the features of the full-search algorithm. The proposed memory hierarchy substantially reduces the external memory bandwidth required for the motion estimation process, and it provides the very high data throughput the ME core needs to achieve real time when processing high-definition videos. When considering the worst-case bandwidth scenario, this memory hierarchy is able to reduce the external memory bandwidth by a factor of 578. A case study for the proposed hierarchy, using a 32×32 search window and an 8×8 block size, was implemented and prototyped on a Virtex 4 FPGA. The results show that it is possible to reach 38 frames per second when processing full HD frames (1920×1080 pixels) using nearly 299 Mbytes per second of external memory bandwidth.
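    The operation this memory hierarchy feeds, full-search block matching, is simple to state: every candidate displacement inside the search window is scored, and the lowest SAD wins. A toy pure-software sketch follows (synthetic 32×32 frames and an invented function name; the paper's contribution is the hardware data-reuse scheme, not this algorithm):

```python
def full_search(ref, cur, bx, by, bs, sw):
    """Exhaustive (full-search) motion estimation for one bs x bs block of
    the current frame: every candidate displacement within a +/-sw search
    window in the reference frame is scored by the sum of absolute
    differences (SAD)."""
    def sad(dx, dy):
        return sum(abs(cur[by + y][bx + x] - ref[by + dy + y][bx + dx + x])
                   for y in range(bs) for x in range(bs))
    best = None
    for dy in range(-sw, sw + 1):
        for dx in range(-sw, sw + 1):
            # keep the candidate block inside the reference frame
            if 0 <= bx + dx and bx + dx + bs <= len(ref[0]) and \
               0 <= by + dy and by + dy + bs <= len(ref):
                cost = sad(dx, dy)
                if best is None or cost < best[0]:
                    best = (cost, dx, dy)
    return best  # (SAD, dx, dy)

# Synthetic frames: an 8x8 bright square moves 2 px right and 1 px down.
ref = [[10] * 32 for _ in range(32)]
cur = [[10] * 32 for _ in range(32)]
for y in range(8):
    for x in range(8):
        ref[8 + y][8 + x] = 200
        cur[9 + y][10 + x] = 200
print(full_search(ref, cur, bx=10, by=9, bs=8, sw=4))
```

Note that neighboring candidates share almost all of their reference pixels, which is exactly the data-reuse opportunity the proposed hierarchy exploits.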

  18. How to define the tool kit for the corrective maintenance service? : a tool kit definition model under the service performance criterion

    NARCIS (Netherlands)

    Chen, Denise

    2009-01-01

    Currently, the rules for defining tool kits vary and are largely oriented toward the engineers' perspective. However, deciding on a tool kit's definition is a trade-off between cost and service performance. This project develops a model that can integrate the engineer's preferences

  19. Ontology for Life-Cycle Modeling of Electrical Distribution Systems: Model View Definition

    Science.gov (United States)

    2013-06-01

    contexts; • To all attributes with a defined datatype indicating a measure datatype; • To all properties and quantities with a defined datatype indicating a measure datatype and with no local unit definitions provided. 3.2.2.3 Project context: A project representation context indicates the ...

  20. BUDAYA TRI HITA KARANA DALAM MODEL UTAUT

    Directory of Open Access Journals (Sweden)

    Dodik Ariyanto

    2017-08-01

    Full Text Available Abstract: Tri Hita Karana Culture in the UTAUT Model. This study explores the definitions and question indicators that represent Tri Hita Karana (THK) culture in the Unified Theory of Acceptance and Use of Technology (UTAUT) model. The study uses a literature study (to derive definitions) and a field test (to validate them). It finds Social Factor Culture (FSB) as a new indicator in the UTAUT model. FSB is defined as the individual's perception of what is considered important in the adoption, utilization, and use of Accounting Information Systems. FSB is influenced by the important people around the individual, individual thinking, and the level of spirituality.

  1. Predictive modeling of outcomes following definitive chemoradiotherapy for oropharyngeal cancer based on FDG-PET image characteristics

    Science.gov (United States)

    Folkert, Michael R.; Setton, Jeremy; Apte, Aditya P.; Grkovski, Milan; Young, Robert J.; Schöder, Heiko; Thorstad, Wade L.; Lee, Nancy Y.; Deasy, Joseph O.; Oh, Jung Hun

    2017-07-01

    In this study, we investigate the use of imaging feature-based outcomes research (‘radiomics’) combined with machine learning techniques to develop robust predictive models for the risk of all-cause mortality (ACM), local failure (LF), and distant metastasis (DM) following definitive chemoradiation therapy (CRT). One hundred seventy four patients with stage III-IV oropharyngeal cancer (OC) treated at our institution with CRT with retrievable pre- and post-treatment 18F-fluorodeoxyglucose positron emission tomography (FDG-PET) scans were identified. From pre-treatment PET scans, 24 representative imaging features of FDG-avid disease regions were extracted. Using machine learning-based feature selection methods, multiparameter logistic regression models were built incorporating clinical factors and imaging features. All model building methods were tested by cross validation to avoid overfitting, and final outcome models were validated on an independent dataset from a collaborating institution. Multiparameter models were statistically significant on 5 fold cross validation with the area under the receiver operating characteristic curve (AUC)  =  0.65 (p  =  0.004), 0.73 (p  =  0.026), and 0.66 (p  =  0.015) for ACM, LF, and DM, respectively. The model for LF retained significance on the independent validation cohort with AUC  =  0.68 (p  =  0.029) whereas the models for ACM and DM did not reach statistical significance, but resulted in comparable predictive power to the 5 fold cross validation with AUC  =  0.60 (p  =  0.092) and 0.65 (p  =  0.062), respectively. In the largest study of its kind to date, predictive features including increasing metabolic tumor volume, increasing image heterogeneity, and increasing tumor surface irregularity significantly correlated to mortality, LF, and DM on 5 fold cross validation in a relatively uniform single-institution cohort. The LF model also retained
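    The AUC values reported above can be reproduced conceptually from raw model scores via the rank-based (Mann-Whitney) formulation of the area under the ROC curve: the probability that a random positive case scores above a random negative one, with ties counted as one half. The labels and scores below are invented for illustration:

```python
def auc(labels, scores):
    """Area under the ROC curve by the Mann-Whitney / rank formulation."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    # count positive-over-negative "wins"; a tie counts as half a win
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical risk scores from a fitted model; 1 = event (e.g. local failure).
labels = [1, 1, 1, 0, 0, 0, 0, 1]
scores = [0.9, 0.7, 0.4, 0.3, 0.6, 0.2, 0.4, 0.8]
print(round(auc(labels, scores), 3))
```

An AUC of 0.5 means the scores carry no discriminative information; values around 0.65-0.73, as in the study, indicate modest but real predictive power.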

  2. Diffusion tensor magnetic resonance imaging driven growth modeling for radiotherapy target definition in glioblastoma.

    Science.gov (United States)

    Jensen, Morten B; Guldberg, Trine L; Harbøll, Anja; Lukacova, Slávka; Kallehauge, Jesper F

    2017-11-01

    The clinical target volume (CTV) in radiotherapy is routinely based on gadolinium contrast enhanced T1 weighted (T1w + Gd) and T2 weighted fluid attenuated inversion recovery (T2w FLAIR) magnetic resonance imaging (MRI) sequences, which have been shown to over- or underestimate the microscopic tumor cell spread. Gliomas favor spread along the white matter fiber tracts. Tumor growth models incorporating the MRI diffusion tensors (DTI) make it possible to account more consistently for the glioma growth. The aim of the study was to investigate the potential of a DTI driven growth model to improve target definition in glioblastoma (GBM). Eleven GBM patients were scanned using T1w, T2w FLAIR, T1w + Gd and DTI. The brain was segmented into white matter, gray matter and cerebrospinal fluid. The Fisher-Kolmogorov growth model was used assuming uniform proliferation and a difference in white and gray matter diffusion of a ratio of 10. The tensor directionality was tested using an anisotropy weighting parameter set to zero (γ0) and twenty (γ20). The volumetric comparison was performed using Hausdorff distance, Dice similarity coefficient (DSC) and surface area. The median of the standard CTV (CTVstandard) was 180 cm³. The median surface area of CTVstandard was 211 cm². The median surface areas of CTV γ0 and CTV γ20 significantly increased to 338 and 376 cm², respectively. The Hausdorff distance was greater than zero and significantly increased for both CTV γ0 and CTV γ20, with respective medians of 18.7 and 25.2 mm. The DSC for both CTV γ0 and CTV γ20 were significantly below one, with respective medians of 0.74 and 0.72, which means that 74 and 72% of CTVstandard were included in CTV γ0 and CTV γ20, respectively. DTI driven growth models result in CTVs with a significantly increased surface area, a significantly increased Hausdorff distance and decreased overlap between the standard and model derived volume.
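    The Fisher-Kolmogorov model underlying this work couples diffusion with logistic proliferation. A one-dimensional explicit finite-difference toy (invented grid and parameters; a scalar diffusion coefficient stands in for the DTI tensor, with the 10x white/gray ratio mentioned above) illustrates why white matter is invaded faster:

```python
def fisher_kpp_step(u, D, rho, dx, dt):
    """One explicit finite-difference step of the Fisher-Kolmogorov equation
    du/dt = D * d2u/dx2 + rho*u*(1-u) on a 1-D line with zero-flux ends.
    D varies per cell: 10x larger in "white matter" to mimic faster
    spread along fiber tracts."""
    n = len(u)
    new = u[:]
    for i in range(n):
        left = u[i - 1] if i > 0 else u[i]        # zero-flux boundaries
        right = u[i + 1] if i < n - 1 else u[i]
        new[i] = u[i] + dt * (D[i] * (left - 2 * u[i] + right) / dx ** 2
                              + rho * u[i] * (1 - u[i]))
    return new

# Tumor seed in the middle; cells 30-59 are "white matter" (10x diffusion).
n = 60
u = [0.0] * n
u[n // 2] = 1.0
D = [0.01 if i < 30 else 0.1 for i in range(n)]
for _ in range(400):
    u = fisher_kpp_step(u, D, rho=0.5, dx=1.0, dt=0.1)
print(round(sum(u[:30]), 1), round(sum(u[30:]), 1))  # gray-side vs white-side mass
```

The travelling-wave speed scales as 2*sqrt(rho*D), so the higher-diffusion side is invaded over a wider margin, which is qualitatively why the DTI-driven CTVs grow larger surface areas than the uniform-margin standard.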

  3. Generating WS-SecurityPolicy documents via security model transformation

    DEFF Research Database (Denmark)

    Jensen, Meiko

    2009-01-01

    When SOA-based business processes are to be enhanced with security properties, the model-driven business process development approach enables an easier and more reliable security definition compared to manually crafting the security realizations afterwards. In this paper, we outline an appropriate security model definition and transformation approach, targeting the WS-SecurityPolicy and WS-BPEL specifications, in order to enable Web-Service-based secure business process development.

  4. Turbulence Model

    DEFF Research Database (Denmark)

    Nielsen, Mogens Peter; Shui, Wan; Johansson, Jens

    2011-01-01

    term with stresses depending linearly on the strain rates. This term takes into account the transfer of linear momentum from one part of the fluid to another. Besides there is another term, which takes into account the transfer of angular momentum. Thus the model implies a new definition of turbulence...

  5. Compendium of Material Composition Data for Radiation Transport Modeling

    International Nuclear Information System (INIS)

    Williams, Ralph G.; Gesh, Christopher J.; Pagh, Richard T.

    2006-01-01

    Computational modeling of radiation transport problems, including homeland security, radiation shielding and protection, and criticality safety, depends upon material definitions. This document has been created to serve two purposes: (1) to provide a quick reference of material compositions for analysts and (2) to provide a standardized reference that reduces the differences between results from two independent analysts. Analysts are always encountering a variety of materials for which elemental definitions are not readily available or densities are not defined. This document provides a single location for unique or hard-to-define materials, reducing duplicated research for modeling purposes. Additionally, having a common set of material definitions helps to standardize modeling across PNNL and gives two separate researchers the ability to compare modeling results from a common materials basis.

  6. Model of observed stochastic balance between work and free time supporting the LQTAI definition

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager

    2008-01-01

    A balance differential equation between free time and money-producing work time on the national economy level is formulated in a previous paper in terms of two dimensionless quantities: the fraction of work time, and the total productivity factor defined as the ratio of the Gross Domestic Product to the total salary paid in return for work. Among the solutions there is one relation that compares surprisingly well with the relevant sequences of Danish data spanning from 1948 to 2003, and also with similar data from several other countries except for slightly different model parameter values. Statistical ... significant systematically balance-influencing parameters on the macro-economic level than those considered in the definition of the Life Quality Time Allocation Index in the previous paper.

  7. Hidrodinamički model podvodnog projektila / Hidrodinamical model of an underwater projectile

    Directory of Open Access Journals (Sweden)

    Miroslav Radosavljević

    2008-07-01

    Full Text Available The paper analyzes an underwater projectile. The input and output values and the projectile's speed and acceleration are defined in order to obtain a high-quality mathematical model. With the conditions of the projectile's possible movement set out in advance, the torpedo model is defined by six equations.

  8. Traceability in Model-Based Testing

    Directory of Open Access Journals (Sweden)

    Mathew George

    2012-11-01

    Full Text Available The growing complexities of software and the demand for shorter time to market are two important challenges that face today’s IT industry. These challenges demand the increase of both productivity and quality of software. Model-based testing is a promising technique for meeting these challenges. Traceability modeling is a key issue and challenge in model-based testing. Relationships between the different models will help to navigate from one model to another, and trace back to the respective requirements and the design model when the test fails. In this paper, we present an approach for bridging the gaps between the different models in model-based testing. We propose relation definition markup language (RDML for defining the relationships between models.

  9. A Formal Model for Context-Awareness

    DEFF Research Database (Denmark)

    Kjærgaard, Mikkel Baun; Bunde-Pedersen, Jonathan

    There is a definite lack of formal support for modeling realistic context-awareness in pervasive computing applications. The Conawa calculus presented in this paper provides mechanisms for modeling complex and interwoven sets of context-information by extending ambient calculus with new construc…

  10. Principles of models based engineering

    Energy Technology Data Exchange (ETDEWEB)

    Dolin, R.M.; Hefele, J.

    1996-11-01

    This report describes a Models Based Engineering (MBE) philosophy and implementation strategy that has been developed at Los Alamos National Laboratory's Center for Advanced Engineering Technology. A major theme in this discussion is that models based engineering is an information management technology enabling the development of information driven engineering. Unlike other information management technologies, models based engineering encompasses the breadth of engineering information, from design intent through product definition to consumer application.

  11. Establishing model credibility involves more than validation

    International Nuclear Information System (INIS)

    Kirchner, T.

    1991-01-01

    One widely used definition of validation is the quantitative test of the performance of a model through the comparison of model predictions to independent sets of observations from the system being simulated. The ability to show that the model predictions compare well with observations is often thought to be the most rigorous test that can be used to establish credibility for a model in the scientific community. However, such tests are only part of the process used to establish credibility, and in some cases may be either unnecessary or misleading. Naylor and Finger extended the concept of validation to include the establishment of validity for the postulates embodied in the model and the test of assumptions used to select postulates for the model. Validity of postulates is established through concurrence by experts in the field of study that the mathematical or conceptual model contains the structural components and mathematical relationships necessary to adequately represent the system with respect to the goals for the model. This extended definition of validation provides for consideration of the structure of the model, not just its performance, in establishing credibility. Evaluation of a simulation model should establish the correctness of the code and the efficacy of the model within its domain of applicability. (24 refs., 6 figs.)

  12. Open source molecular modeling.

    Science.gov (United States)

    Pirhadi, Somayeh; Sunseri, Jocelyn; Koes, David Ryan

    2016-09-01

    The success of molecular modeling and computational chemistry efforts are, by definition, dependent on quality software applications. Open source software development provides many advantages to users of modeling applications, not the least of which is that the software is free and completely extendable. In this review we categorize, enumerate, and describe available open source software packages for molecular modeling and computational chemistry. An updated online version of this catalog can be found at https://opensourcemolecularmodeling.github.io. Copyright © 2016 The Author(s). Published by Elsevier Inc. All rights reserved.

  13. Agent oriented modeling of business information systems

    OpenAIRE

    Vymetal, Dominik

    2009-01-01

    Enterprise modeling is an abstract definition of processes running in an enterprise, using process, value, data and resource models. There are two perspectives of business modeling: the process perspective and the value chain perspective. Both have advantages and disadvantages. This paper proposes a combination of both perspectives into one generic model. The model also takes the social part of the enterprise system into consideration and pays attention to disturbances influencing the enterprise system…

  14. Partially ordered models

    NARCIS (Netherlands)

    Fernandez, R.; Deveaux, V.

    2010-01-01

    We provide a formal definition and study the basic properties of partially ordered chains (POC). These systems were proposed to model textures in image processing and to represent independence relations between random variables in statistics (in the latter case they are known as Bayesian networks).

  15. Definition of an Object-Oriented Modeling Language for Enterprise Architecture

    OpenAIRE

    Lê, Lam Son; Wegmann, Alain

    2005-01-01

    In enterprise architecture, the goal is to integrate business resources and IT resources in order to improve an enterprise's competitiveness. In an enterprise architecture project, the development team usually constructs a model that represents the enterprise: the enterprise model. In this paper, we present a modeling language for building such enterprise models. Our enterprise models are hierarchical object-oriented representations of the enterprises. This paper presents the foundations of o…

  16. Time domain series system definition and gear set reliability modeling

    International Nuclear Information System (INIS)

    Xie, Liyang; Wu, Ningxiang; Qian, Wenxue

    2016-01-01

    Time-dependent multi-configuration is a typical feature for mechanical systems such as gear trains and chain drives. As a series system, a gear train is distinct from a traditional series system, such as a chain, in load transmission path, system-component relationship, system functioning manner, as well as time-dependent system configuration. Firstly, the present paper defines time-domain series system to which the traditional series system reliability model is not adequate. Then, system specific reliability modeling technique is proposed for gear sets, including component (tooth) and subsystem (tooth-pair) load history description, material priori/posterior strength expression, time-dependent and system specific load-strength interference analysis, as well as statistically dependent failure events treatment. Consequently, several system reliability models are developed for gear sets with different tooth numbers in the scenario of tooth root material ultimate tensile strength failure. The application of the models is discussed in the last part, and the differences between the system specific reliability model and the traditional series system reliability model are illustrated by virtue of several numerical examples. - Highlights: • A new type of series system, i.e. time-domain multi-configuration series system is defined, that is of great significance to reliability modeling. • Multi-level statistical analysis based reliability modeling method is presented for gear transmission system. • Several system specific reliability models are established for gear set reliability estimation. • The differences between the traditional series system reliability model and the new model are illustrated.

  17. Spectra of definite type in waveguide models

    Czech Academy of Sciences Publication Activity Database

    Lotoreichik, Vladimir; Siegl, Petr

    2017-01-01

    Roč. 145, č. 3 (2017), s. 1231-1246 ISSN 0002-9939 R&D Projects: GA ČR(CZ) GA14-06818S Institutional support: RVO:61389005 Keywords: spectral points of definite and of type pi * weakly coupled bound states * perturbations of essential spectrum * PT-symmetric waveguide Subject RIV: BE - Theoretical Physics OBOR OECD: Applied mathematics Impact factor: 0.679, year: 2016

  18. High resolution extremity CT for biomechanics modeling

    International Nuclear Information System (INIS)

    Ashby, A.E.; Brand, H.; Hollerbach, K.; Logan, C.M.; Martz, H.E.

    1995-01-01

    With the advent of ever more powerful computing and finite element analysis (FEA) capabilities, the bone and joint geometry detail available from either commercial surface definitions or from medical CT scans is inadequate. For dynamic FEA modeling of joints, precise articular contours are necessary to get appropriate contact definition. In this project, a fresh cadaver extremity was suspended in paraffin in a Lucite cylinder and then scanned with an industrial CT system to generate a high resolution data set for use in biomechanics modeling.

  19. High resolution extremity CT for biomechanics modeling

    Energy Technology Data Exchange (ETDEWEB)

    Ashby, A.E.; Brand, H.; Hollerbach, K.; Logan, C.M.; Martz, H.E.

    1995-09-23

    With the advent of ever more powerful computing and finite element analysis (FEA) capabilities, the bone and joint geometry detail available from either commercial surface definitions or from medical CT scans is inadequate. For dynamic FEA modeling of joints, precise articular contours are necessary to get appropriate contact definition. In this project, a fresh cadaver extremity was suspended in paraffin in a Lucite cylinder and then scanned with an industrial CT system to generate a high resolution data set for use in biomechanics modeling.

  20. Modelling synergistic effects of appetite regulating hormones

    DEFF Research Database (Denmark)

    Schmidt, Julie Berg; Ritz, Christian

    2016-01-01

    We briefly reviewed one definition of dose addition, which is applicable within the framework of generalized linear models. We established how this definition of dose addition corresponds to effect addition in case only two doses per compound are considered for evaluating synergistic effects. The link between definitions was exemplified for an appetite study where two appetite hormones were studied.

  1. Domain Modeling for Adaptive Training and Education in Support of the US Army Learning Model-Research Outline

    Science.gov (United States)

    2015-06-01

    Definitions are provided for this section to distinguish between adaptive training and education elements and also to highlight their relationships… To illustrate this point, Franke (2011) asserts that through the use of case study examples, instruction can provide the pedagogical foundation for decision… a prime example of an adaptive training and education system: a learner or trainee model, an instructional or pedagogical model, a domain model

  2. Streamline Your Project: A Lifecycle Model.

    Science.gov (United States)

    Viren, John

    2000-01-01

    Discusses one approach to project organization, providing a baseline lifecycle model for multimedia/CBT development. This variation of the standard four-phase model of Analysis, Design, Development, and Implementation includes a Pre-Analysis phase, called Definition, and a Post-Implementation phase, known as Maintenance. Each phase is described.

  3. Modeling and cellular studies

    International Nuclear Information System (INIS)

    Anon.

    1982-01-01

    Testing the applicability of mathematical models with carefully designed experiments is a powerful tool in the investigations of the effects of ionizing radiation on cells. The modeling and cellular studies complement each other, for modeling provides guidance for designing critical experiments which must provide definitive results, while the experiments themselves provide new input to the model. Based on previous experimental results, the model for the accumulation of damage in Chlamydomonas reinhardi has been extended to include various multiple two-event combinations. Split dose survival experiments have shown that models tested to date predict most but not all the observed behavior. Stationary-phase mammalian cells, required for tests of other aspects of the model, have been shown to be at different points in the cell cycle depending on how they were forced to stop proliferating. These cultures also demonstrate different capacities for repair of sublethal radiation damage.

  4. Data Modeling Challenges of Advanced Interoperability.

    Science.gov (United States)

    Blobel, Bernd; Oemig, Frank; Ruotsalainen, Pekka

    2018-01-01

    Progressive health paradigms, involving many different disciplines and combining multiple policy domains, require advanced interoperability solutions. This results in special challenges for modeling health systems. The paper discusses classification systems for data models and enterprise business architectures and compares them with the ISO Reference Architecture. On that basis, existing definitions, specifications and standards of data models for interoperability are evaluated and their limitations are discussed. Amendments to correctly use those models and to better meet the aforementioned challenges are offered.

  5. The Shuttle Cost and Price model

    Science.gov (United States)

    Leary, Katherine; Stone, Barbara

    1983-01-01

    The Shuttle Cost and Price (SCP) model was developed as a tool to assist in evaluating major aspects of Shuttle operations that have direct and indirect economic consequences. It incorporates the major aspects of NASA Pricing Policy and corresponds to the NASA definition of STS operating costs. An overview of the SCP model is presented and the cost model portion of SCP is described in detail. Selected recent applications of the SCP model to NASA Pricing Policy issues are presented.

  6. Modelling and evaluation of surgical performance using hidden Markov models.

    Science.gov (United States)

    Megali, Giuseppe; Sinigaglia, Stefano; Tonet, Oliver; Dario, Paolo

    2006-10-01

    Minimally invasive surgery has become very widespread in the last ten years. Since surgeons experience difficulties in learning and mastering minimally invasive techniques, the development of training methods is of great importance. While the introduction of virtual reality-based simulators has introduced a new paradigm in surgical training, skill evaluation methods are far from being objective. This paper proposes a method for defining a model of surgical expertise and an objective metric to evaluate performance in laparoscopic surgery. Our approach is based on the processing of kinematic data describing movements of surgical instruments. We use hidden Markov model theory to define an expert model that describes expert surgical gesture. The model is trained on kinematic data related to exercises performed on a surgical simulator by experienced surgeons. Subsequently, we use this expert model as a reference model in the definition of an objective metric to evaluate performance of surgeons with different abilities. Preliminary results show that, using different topologies for the expert model, the method can be efficiently used both for the discrimination between experienced and novice surgeons, and for the quantitative assessment of surgical ability.
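The expert-model idea above can be sketched with a toy discrete HMM: a gesture sequence is scored by its log-likelihood under a model trained on expert data, with higher scores meaning motion closer to the expert reference. The two-state model, its probabilities, and the symbol sequences below are invented for illustration only (the paper works with continuous kinematic data, not discrete symbols):

```python
import math

def forward_loglik(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the scaled forward algorithm."""
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    loglik = 0.0
    for t in range(len(obs)):
        if t > 0:
            alpha = [sum(alpha[j] * A[j][i] for j in range(n)) * B[i][obs[t]]
                     for i in range(n)]
        scale = sum(alpha)           # rescale to avoid numerical underflow
        loglik += math.log(scale)
        alpha = [a / scale for a in alpha]
    return loglik

# Hypothetical 2-state "expert gesture" model: sticky states, distinct emissions.
pi = [0.8, 0.2]                      # initial state distribution
A  = [[0.9, 0.1], [0.2, 0.8]]        # state transition probabilities
B  = [[0.7, 0.3], [0.1, 0.9]]        # emission probabilities over 2 symbols

smooth_seq = [0, 0, 0, 1, 1, 1]      # expert-like: one clean state change
jerky_seq  = [0, 1, 0, 1, 0, 1]      # novice-like: erratic switching
print(forward_loglik(smooth_seq, pi, A, B) >
      forward_loglik(jerky_seq, pi, A, B))    # → True
```

In the same spirit as the paper, the log-likelihood gap between the two sequences serves as an objective skill metric: the "expert" model assigns a clearly higher score to the smooth gesture.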

  7. Extendable linearised adjustment model for deformation analysis

    NARCIS (Netherlands)

    Hiddo Velsink

    2015-01-01

    Author supplied: "This paper gives a linearised adjustment model for the affine, similarity and congruence transformations in 3D that is easily extendable with other parameters to describe deformations. The model considers all coordinates stochastic. Full positive semi-definite covariance matrices

  8. Extendable linearised adjustment model for deformation analysis

    NARCIS (Netherlands)

    Velsink, H.

    2015-01-01

    This paper gives a linearised adjustment model for the affine, similarity and congruence transformations in 3D that is easily extendable with other parameters to describe deformations. The model considers all coordinates stochastic. Full positive semi-definite covariance matrices and correlation

  9. Controversy around the definition of waste

    CSIR Research Space (South Africa)

    Oelofse, Suzanna HH

    2009-11-20

    Full Text Available This paper presents the information concerning the definition of waste. Discussing the importance of the clear definition, ongoing debates, broad definition of waste, problems with the broad definition, interpretation, current waste management model...

  10. Business Model Process Configurations

    DEFF Research Database (Denmark)

    Taran, Yariv; Nielsen, Christian; Thomsen, Peter

    2015-01-01

    …by developing (inductively) an ontological classification framework, in view of the BM process configurations typology developed. Design/methodology/approach – Given the inconsistencies found in business model studies (e.g. definitions, configurations, classifications), we adopted the analytical induction…

  11. Tree-Structured Digital Organisms Model

    Science.gov (United States)

    Suzuki, Teruhiko; Nobesawa, Shiho; Tahara, Ikuo

    Tierra and Avida are well-known models of digital organisms. They describe a life process as a sequence of computation codes. A linear sequence model may not be the only way to describe a digital organism, though it is very simple for a computer-based model. Thus we propose a new digital organism model based on a tree structure, which is rather similar to genetic programming. In our model, a life process is a combination of various functions, as life in the real world is. This implies that our model can easily describe the hierarchical structure of life, and it can simulate evolutionary computation through the mutual interaction of functions. We verified by simulations that our model can be regarded as a digital organism model according to its definitions. Our model even succeeded in creating species such as viruses and parasites.
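As a rough illustration of the tree-structured idea (our own sketch, not the authors' code), an organism can be represented as an expression tree whose internal nodes are functions and whose leaves are inputs, so that a "life process" is a combination of functions rather than a linear code list; mutation then rewrites function nodes:

```python
import random

# Hypothetical function set; the tree encoding below is for illustration only.
FUNCS = {'add': lambda a, b: a + b, 'mul': lambda a, b: a * b}

def evaluate(node, x):
    """Run the organism's life process on input x."""
    if node[0] == 'x':               # leaf: the organism's input
        return x
    name, left, right = node
    return FUNCS[name](evaluate(left, x), evaluate(right, x))

def mutate(node, rng):
    """Point mutation: possibly replace a function node's operator."""
    if node[0] == 'x':
        return node
    name, left, right = node
    if rng.random() < 0.5:
        name = rng.choice(list(FUNCS))
    return (name, mutate(left, rng), mutate(right, rng))

organism = ('add', ('mul', ('x',), ('x',)), ('x',))   # encodes x*x + x
print(evaluate(organism, 3))                           # → 12
child = mutate(organism, random.Random(0))
print(evaluate(child, 3))
```

The hierarchical structure the abstract mentions falls out naturally: subtrees are reusable sub-processes, and mutation or subtree exchange between organisms drives the evolutionary computation.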

  12. Grammar Maturity Model

    NARCIS (Netherlands)

    Zaytsev, V.; Pierantonio, A.; Schätz, B.; Tamzalit, D.

    2014-01-01

    The evolution of a software language (whether modelled by a grammar or a schema or a metamodel) is not limited to development of new versions and dialects. An important dimension of a software language evolution is maturing in the sense of improving the quality of its definition. In this paper, we

  13. Communication and Procedural Models of the E-Commerce Systems

    Directory of Open Access Journals (Sweden)

    Petr SUCHÁNEK

    2009-06-01

    Full Text Available E-commerce systems have become a standard interface between sellers (or suppliers) and customers. A basic condition for an e-commerce system to be efficient is the correct definition and description of all internal and external processes, all targeted at customers' needs and requirements. Modeling and simulation are the optimal and most exact way to find the best structure for an e-commerce system and its processes in companies. In this article the author shows a basic model of communication between customers and sellers in connection with customer feedback, as well as procedural models of e-commerce systems in terms of e-shops. The procedural model was made with the aid of an SOA definition.

  14. Stochastic Subspace Modelling of Turbulence

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Pedersen, B. J.; Nielsen, Søren R.K.

    2009-01-01

    Turbulence of the incoming wind field is of paramount importance to the dynamic response of civil engineering structures. Hence reliable stochastic models of the turbulence should be available from which time series can be generated for dynamic response and structural safety analysis. In the paper… from the positive definite cross-spectral density matrix a frequency response matrix is constructed which determines the turbulence vector as a linear filtration of Gaussian white noise. Finally, an accurate state space modelling method is proposed which allows selection of an appropriate model order, and estimation of a state space model for the vector turbulence process incorporating its phase spectrum in one stage, and its results are compared with a conventional ARMA modelling method.

  15. Usability Prediction & Ranking of SDLC Models Using Fuzzy Hierarchical Usability Model

    Science.gov (United States)

    Gupta, Deepak; Ahlawat, Anil K.; Sagar, Kalpna

    2017-06-01

    Evaluation of software quality is an important aspect for controlling and managing the software. By such evaluation, improvements in the software process can be made. Software quality is significantly dependent on software usability. Many researchers have proposed a number of usability models. Each model considers a set of usability factors but does not cover all the usability aspects. Practical implementation of these models is still missing, as there is a lack of a precise definition of usability. It is also very difficult to integrate these models into current software engineering practices. In order to overcome these challenges, this paper aims to define the term `usability' using the proposed hierarchical usability model with its detailed taxonomy. The taxonomy considers generic evaluation criteria for identifying the quality components, which brings together factors, attributes and characteristics defined in various HCI and software models. For the first time, the usability model is also implemented to predict more accurate usability values. The proposed system is named the fuzzy hierarchical usability model and can be easily integrated into current software engineering practices. In order to validate the work, a dataset of six software development life cycle models is created and employed. These models are ranked according to their predicted usability values. This research also focuses on a detailed comparison of the proposed model with the existing usability models.

  16. SatisFactory Common Information Data Exchange Model

    OpenAIRE

    CERTH

    2016-01-01

    This deliverable defines the Common Information Data Exchange Model (CIDEM). The aim of CIDEM is to provide a model of the information elements (e.g. concepts, events, relations, interfaces) used for information exchange between components, as well as for modelling work performed by other tasks (e.g. knowledge models to support human resources optimization). The CIDEM definition is considered a shared vocabulary that addresses the information needs of the SatisFactory framework components.

  17. Business process model abstraction : a definition, catalog, and survey

    NARCIS (Netherlands)

    Smirnov, S.; Reijers, H.A.; Weske, M.H.; Nugteren, T.

    2012-01-01

    The discipline of business process management aims at capturing, understanding, and improving work in organizations by using process models as central artifacts. Since business-oriented tasks require different information from such models to be highlighted, a range of abstraction techniques has been

  18. STEREOMETRIC MODELLING

    Directory of Open Access Journals (Sweden)

    P. Grimaldi

    2012-07-01

    Full Text Available Stereometric modelling means modelling achieved with: – the use of a pair of virtual cameras with parallel axes, positioned at a mutual distance averaging 1/10 of the camera-object distance (in practice, the realization and use of a stereometric camera in the modelling program; – the visualization of the shot in two distinct windows; – the stereoscopic viewing of the shot while modelling. Since the definition of "3D vision" is inaccurately applied to the simple perspective of an object, the word "stereo" must be added, so that "3D stereo vision" stands for "three-dimensional view" and therefore measures the width, height and depth of the surveyed image. Thanks to the development of a stereometric model, either real or virtual, through the "materialization", either real or virtual, of the optical-stereometric model made visible with a stereoscope, a continuous online updating of the cultural heritage is feasible with the help of photogrammetry and stereometric modelling. The catalogue of the Architectonic Photogrammetry Laboratory of Politecnico di Bari is available online at: http://rappresentazione.stereofot.it:591/StereoFot/FMPro?-db=StereoFot.fp5&-lay=Scheda&-format=cerca.htm&-view

  19. MOS modeling hierarchy including radiation effects

    International Nuclear Information System (INIS)

    Alexander, D.R.; Turfler, R.M.

    1975-01-01

    A hierarchy of modeling procedures has been developed for MOS transistors, circuit blocks, and integrated circuits which include the effects of total dose radiation and photocurrent response. The models were developed for use with the SCEPTRE circuit analysis program, but the techniques are suitable for other modern computer aided analysis programs. The modeling hierarchy permits the designer or analyst to select the level of modeling complexity consistent with circuit size, parametric information, and accuracy requirements. Improvements have been made in the implementation of important second order effects in the transistor MOS model, in the definition of MOS building block models, and in the development of composite terminal models for MOS integrated circuits

  20. A model for persistency of egg production

    NARCIS (Netherlands)

    Grossman, M.; Gossman, T.N.; Koops, W.J.

    2000-01-01

    The objectives of our study were to propose a new definition for persistency of egg production and to develop a mathematical model to describe the egg production curve, one that includes a new measure for persistency, based on the proposed definition, for use as a selection criterion to improve

  1. FOSSIL2 energy policy model documentation: FOSSIL2 documentation

    Energy Technology Data Exchange (ETDEWEB)

    None

    1980-10-01

    This report discusses the structure, derivations, assumptions, and mathematical formulation of the FOSSIL2 model. Each major facet of the model - supply/demand interactions, industry financing, and production - has been designed to parallel closely the actual cause/effect relationships determining the behavior of the United States energy system. The data base for the FOSSIL2 program is large, as is appropriate for a system dynamics simulation model. When possible, all data were obtained from sources well known to experts in the energy field. Cost and resource estimates are based on DOE data whenever possible. This report presents the FOSSIL2 model at several levels. Volumes II and III of this report list the equations that comprise the FOSSIL2 model, along with variable definitions and a cross-reference list of the model variables. Volume II provides the model equations with each of their variables defined, while Volume III lists the equations, with a one-line definition for each equation, in a shorter, more readable format.

  2. Improving fire season definition by optimized temporal modelling of daily human-caused ignitions.

    Science.gov (United States)

    Costafreda-Aumedes, S; Vega-Garcia, C; Comas, C

    2018-07-01

    Wildfire suppression management is usually based on fast control of all ignitions, especially in highly populated countries with pervasive values-at-risk. To minimize values-at-risk loss by improving response time of suppression resources it is necessary to anticipate ignitions, which are mainly caused by people. Previous studies have found that human-ignition patterns change spatially and temporally depending on socio-economic activities, hence, the deployment of suppression resources along the year should consider these patterns. However, full suppression capacity is operational only within legally established fire seasons, driven by past events and budgets, which limits response capacity and increases damages out of them. The aim of this study was to assess the temporal definition of fire seasons from the perspective of human-ignition patterns for the case study of Spain, where people cause over 95% of fires. Humans engage in activities that use fire as a tool in certain periods within a year, and in locations linked to specific spatial factors. Geographic variables (population, infrastructures, physiography and land uses) were used as explanatory variables for human-ignition patterns. The changing influence of these geographic variables on occurrence along the year was analysed with day-by-day logistic regression models. Daily models were built for all the municipal units in the two climatic regions in Spain (Atlantic and Mediterranean Spain) from 2002 to 2014, and similar models were grouped within continuous periods, designated as ignition-based seasons. We found three ignition-based seasons in the Mediterranean region and five in the Atlantic zones, not coincidental with calendar seasons, but with a high degree of agreement with current legally designated operational fire seasons. Our results suggest that an additional late-winter-early-spring fire season in the Mediterranean area and the extension of this same season in the Atlantic zone should be re
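The day-by-day modelling idea can be sketched as follows: fit a small logistic model per day and merge consecutive days with similar coefficients into contiguous "ignition-based seasons". Everything below (a single synthetic predictor, the gradient-descent fit, and the slope-difference grouping rule with its tolerance) is a hypothetical stand-in for the paper's municipal-level geographic models:

```python
import math, random

def fit_logistic(xs, ys, steps=2000, lr=0.1):
    """Single-feature logistic regression by gradient descent:
    a toy stand-in for one day-by-day occurrence model."""
    w, b = 0.0, 0.0
    for _ in range(steps):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))
            gw += (p - y) * x
            gb += (p - y)
        w -= lr * gw / len(xs)
        b -= lr * gb / len(xs)
    return w, b

def group_days(coeffs, tol=2.0):
    """Merge consecutive days whose fitted slopes differ by less than tol,
    yielding contiguous candidate 'seasons'."""
    seasons = [[0]]
    for d in range(1, len(coeffs)):
        if abs(coeffs[d][0] - coeffs[seasons[-1][0]][0]) < tol:
            seasons[-1].append(d)
        else:
            seasons.append([d])
    return seasons

rng = random.Random(1)
# Synthetic data: on days 0-2 ignition odds rise with the predictor,
# on days 3-5 they fall, mimicking a seasonal change in human activity.
daily = []
for day in range(6):
    slope = 3.0 if day < 3 else -3.0
    xs = [rng.uniform(-1, 1) for _ in range(200)]
    ys = [1 if rng.random() < 1 / (1 + math.exp(-slope * x)) else 0 for x in xs]
    daily.append(fit_logistic(xs, ys))
print(group_days(daily))
```

With this setup the six daily models collapse into two contiguous groups, echoing how similar daily models in the paper are merged into a small number of ignition-based seasons.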

  3. Preliminary Multi-Variable Parametric Cost Model for Space Telescopes

    Science.gov (United States)

    Stahl, H. Philip; Hendrichs, Todd

    2010-01-01

    This slide presentation reviews creating a preliminary multi-variable cost model for the contract costs of making a space telescope. There is discussion of the methodology for collecting the data, definition of the statistical analysis methodology, single variable model results, testing of historical models and an introduction of the multi variable models.

  4. Model Uncertainty for Bilinear Hysteretic Systems

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Thoft-Christensen, Palle

    1984-01-01

    . The statistical uncertainty -due to lack of information can e.g. be taken into account by describing the variables by predictive density functions, Veneziano [2). In general, model uncertainty is the uncertainty connected with mathematical modelling of the physical reality. When structural reliability analysis...... is related to the concept of a failure surface (or limit state surface) in the n-dimensional basic variable space then model uncertainty is at least due to the neglected variables, the modelling of the failure surface and the computational technique used. A more precise definition is given in section 2...

  5. A formal definition of data flow graph models

    Science.gov (United States)

    Kavi, Krishna M.; Buckles, Bill P.; Bhat, U. Narayan

    1986-01-01

    In this paper, a new model for parallel computations and parallel computer systems that is based on data flow principles is presented. Uninterpreted data flow graphs can be used to model computer systems including data driven and parallel processors. A data flow graph is defined to be a bipartite graph with actors and links as the two vertex classes. Actors can be considered similar to transitions in Petri nets, and links similar to places. The nondeterministic nature of uninterpreted data flow graphs necessitates the derivation of liveness conditions.
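    The actor/link bipartition can be made concrete with a small sketch; the example graph below is invented for illustration, not taken from the paper.

```python
# Sketch: a data flow graph as a bipartite graph with actors and links
# as the two vertex classes. Edges may only connect an actor to a link,
# mirroring the transition/place split in Petri nets. Example graph invented.

def is_bipartite_dfg(actors, links, edges):
    """Check that every edge joins an actor with a link (in either direction)."""
    actors, links = set(actors), set(links)
    for u, v in edges:
        ok = (u in actors and v in links) or (u in links and v in actors)
        if not ok:
            return False
    return True

actors = ["plus", "times"]
links = ["l1", "l2", "l3"]
edges = [("l1", "plus"), ("l2", "plus"), ("plus", "l3"), ("l3", "times")]
print(is_bipartite_dfg(actors, links, edges))  # True
```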

  6. On dark degeneracy and interacting models

    International Nuclear Information System (INIS)

    Carneiro, S.; Borges, H.A.

    2014-01-01

    Cosmological background observations cannot fix the dark energy equation of state, which is related to a degeneracy in the definition of the dark sector components. Here we show that this degeneracy can be broken at perturbation level by imposing two observational properties on dark matter. First, dark matter is defined as the clustering component we observe in large scale structures. This definition is meaningful only if dark energy is unperturbed, which is achieved if we additionally assume, as a second condition, that dark matter is cold, i.e. non-relativistic. As a consequence, dark energy models with equation-of-state parameter −1 ≤ ω < 0 are reduced to two observationally distinguishable classes with ω = −1, equally competitive when tested against observations. The first comprises the ΛCDM model with constant dark energy density. The second consists of interacting models with an energy flux from dark energy to dark matter.

  7. Stochastic dynamical models for ecological regime shifts

    DEFF Research Database (Denmark)

    Møller, Jan Kloppenborg; Carstensen, Jacob; Madsen, Henrik

    the physical and biological knowledge of the system, and nonlinearities introduced here can generate regime shifts or enhance the probability of regime shifts in the case of stochastic models, typically characterized by a threshold value for the known driver. A simple model for light competition between...... definition and stability of regimes become less subtle. Ecological regime shifts and their modeling must be viewed in a probabilistic manner, particularly if such model results are to be used in ecosystem management....

  8. Towards the quantitative evaluation of visual attention models.

    Science.gov (United States)

    Bylinskii, Z; DeGennaro, E M; Rajalingham, R; Ruda, H; Zhang, J; Tsotsos, J K

    2015-11-01

    Scores of visual attention models have been developed over the past several decades of research. Differences in implementation, assumptions, and evaluations have made comparison of these models very difficult. Taxonomies have been constructed in an attempt at the organization and classification of models, but are not sufficient at quantifying which classes of models are most capable of explaining available data. At the same time, a multitude of physiological and behavioral findings have been published, measuring various aspects of human and non-human primate visual attention. All of these elements highlight the need to integrate the computational models with the data by (1) operationalizing the definitions of visual attention tasks and (2) designing benchmark datasets to measure success on specific tasks, under these definitions. In this paper, we provide some examples of operationalizing and benchmarking different visual attention tasks, along with the relevant design considerations. Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. Genetic Programming for Automatic Hydrological Modelling

    Science.gov (United States)

    Chadalawada, Jayashree; Babovic, Vladan

    2017-04-01

    One of the recent challenges for the hydrologic research community is the need for the development of coupled systems that involve the integration of hydrologic, atmospheric and socio-economic relationships. This poses a requirement for novel modelling frameworks that can accurately represent complex systems, given the limited understanding of underlying processes, increasing volumes of data and high levels of uncertainty. Each of the existing hydrological models varies in terms of conceptualization and process representation and is best suited to capture the environmental dynamics of a particular hydrological system. Data-driven approaches can be used in the integration of alternative process hypotheses in order to achieve a unified theory at catchment scale. The key steps in the implementation of an integrated modelling framework that is informed by prior understanding and data include the choice of the technique for the induction of knowledge from data, identification of alternative structural hypotheses, definition of rules and constraints for meaningful, intelligent combination of model component hypotheses, and definition of evaluation metrics. This study aims at defining a Genetic Programming based modelling framework that tests different conceptual model constructs against a wide range of objective functions and evolves accurate and parsimonious models that capture dominant hydrological processes at catchment scale. In this paper, GP initializes the evolutionary process using the modelling decisions inspired by the Superflex framework [Fenicia et al., 2011] and automatically combines them into model structures that are scrutinized against observed data using statistical, hydrological and flow duration curve based performance metrics. The collaboration between data-driven and physical, conceptual modelling paradigms improves the ability to model and manage hydrologic systems. Fenicia, F., D. Kavetski, and H. H. Savenije (2011), Elements of a flexible approach

  10. Persistence and extinction for stochastic logistic model with Levy noise and impulsive perturbation

    OpenAIRE

    Chun Lu; Qiang Ma; Xiaohua Ding

    2015-01-01

    This article investigates a stochastic logistic model with Levy noise and impulsive perturbation. In the model, the impulsive perturbation and Levy noise are taken into account simultaneously. This model is new, more feasible, and more consistent with reality. The definition of a solution to a stochastic differential equation with Levy noise and impulsive perturbation is established. Based on this definition, we show that our model has a unique global positive solut...
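    The flavour of such a model can be illustrated with a minimal Euler-type simulation of a logistic SDE. This is a sketch only: the Levy noise is crudely approximated by compound Poisson jumps, and all parameter values, jump sizes and impulse times are invented.

```python
import random

# Sketch: Euler-Maruyama simulation of a logistic SDE
#   dx = x*(r - k*x) dt + sigma*x dW + jumps,
# with Levy noise crudely approximated by compound Poisson jumps and
# impulsive perturbations applied at fixed times. All values invented.

def simulate(x0=0.5, r=1.0, k=1.0, sigma=0.1, jump_rate=0.5,
             jump_size=-0.1, impulses={50: 1.2}, dt=0.01, steps=100, seed=1):
    rng = random.Random(seed)
    x, path = x0, [x0]
    for n in range(1, steps + 1):
        dW = rng.gauss(0.0, dt ** 0.5)                 # Brownian increment
        x += x * (r - k * x) * dt + sigma * x * dW     # drift + diffusion
        if rng.random() < jump_rate * dt:              # Poisson jump event
            x *= (1.0 + jump_size)
        if n in impulses:                              # impulsive perturbation
            x *= impulses[n]
        x = max(x, 1e-9)                               # keep the state positive
        path.append(x)
    return path

path = simulate()
print(len(path), all(p > 0 for p in path))  # 101 True
```

    The positivity clamp stands in for the paper's analytical result that the true solution stays positive; a discretized trajectory needs the guard.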

  11. The manifold model for space-time

    International Nuclear Information System (INIS)

    Heller, M.

    1981-01-01

    Physical processes happen on a space-time arena. It turns out that all contemporary macroscopic physical theories presuppose a common mathematical model for this arena, the so-called manifold model of space-time. The first part of study is an heuristic introduction to the concept of a smooth manifold, starting with the intuitively more clear concepts of a curve and a surface in the Euclidean space. In the second part the definitions of the Csub(infinity) manifold and of certain structures, which arise in a natural way from the manifold concept, are given. The role of the enveloping Euclidean space (i.e. of the Euclidean space appearing in the manifold definition) in these definitions is stressed. The Euclidean character of the enveloping space induces to the manifold local Euclidean (topological and differential) properties. A suggestion is made that replacing the enveloping Euclidean space by a discrete non-Euclidean space would be a correct way towards the quantization of space-time. (author)

  12. Models and automation technologies for the curriculum development

    Directory of Open Access Journals (Sweden)

    V. N. Volkova

    2016-01-01

    Full Text Available The aim of the research was to determine the sequence of the curriculum development stages on the basis of system analysis, as well as to create models and information technologies for the implementation of these stages. The methods and models of systems theory and system analysis were used, including methods and automated procedures for structuring organizational aims and for organizing complex expertise. On the basis of the analysis of existing studies in the field of curriculum modeling using formal mathematical language, including optimization models that help to distribute disciplines over years and semesters in accordance with the relevant restrictions, it is shown that the complexity and dimension of these tasks require the development of special software; the problem of defining the input data and restrictions requires a large time investment, which seems difficult to provide in real conditions of plan development, and thus it is almost impossible to verify the objectivity of the input data and the restrictions in such models. For a complete analysis of the process of curriculum development it is proposed to use a system definition based on the system-targeted approach. On the basis of this definition the following sequence of integrated stages for the development of the curriculum was justified: (1) definition (specification) of the requirements for the educational content; (2) determination of the number of subjects included in the curriculum; (3) definition of the sequence of the subjects; (4) distribution of subjects by semesters. The models and technologies for the implementation of these stages of curriculum development are given in the article: (1) models based on the information approach of A. Denisov and the modified degree of compliance with objectives based on Denisov's evaluation index (in the article the idea of evaluating the degree of the impact of disciplines for realization

  13. Mobility Models for Systems Evaluation

    Science.gov (United States)

    Musolesi, Mirco; Mascolo, Cecilia

    Mobility models are used to simulate and evaluate the performance of mobile wireless systems and the algorithms and protocols underlying them. The definition of realistic mobility models is one of the most critical and, at the same time, difficult aspects of the simulation of applications and systems designed for mobile environments. There are essentially two possible types of mobility patterns that can be used to evaluate mobile network protocols and algorithms by means of simulations: traces and synthetic models [130]. Traces are obtained by means of measurements of deployed systems and usually consist of logs of connectivity or location information, whereas synthetic models are mathematical models, such as sets of equations, which try to capture the movement of the devices.
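    As an illustration of a synthetic model, the classic random-waypoint generator can be sketched in a few lines; the area size and speed range below are invented for the example.

```python
import random

# Sketch: random waypoint mobility, a classic synthetic mobility model.
# Each node picks a random destination and speed, moves toward it, then
# picks a new waypoint. Area size and speed range are invented.

def random_waypoint(steps, area=(100.0, 100.0), speed=(1.0, 5.0), seed=42):
    rng = random.Random(seed)
    x, y = rng.uniform(0, area[0]), rng.uniform(0, area[1])
    trace = [(x, y)]
    dest = (rng.uniform(0, area[0]), rng.uniform(0, area[1]))
    v = rng.uniform(*speed)
    for _ in range(steps):
        dx, dy = dest[0] - x, dest[1] - y
        dist = (dx * dx + dy * dy) ** 0.5
        if dist <= v:                      # reached the waypoint: pick a new one
            x, y = dest
            dest = (rng.uniform(0, area[0]), rng.uniform(0, area[1]))
            v = rng.uniform(*speed)
        else:                              # move one step toward the waypoint
            x, y = x + v * dx / dist, y + v * dy / dist
        trace.append((x, y))
    return trace

trace = random_waypoint(200)
print(len(trace))  # 201
```

    Real evaluations usually add pause times at waypoints and discard the initial transient, since the stationary speed distribution of random waypoint differs from the initial one.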

  14. Equation-oriented specification of neural models for simulations

    Directory of Open Access Journals (Sweden)

    Marcel eStimberg

    2014-02-01

    Full Text Available Simulating biological neuronal networks is a core method of research in computational neuroscience. A full specification of such a network model includes a description of the dynamics and state changes of neurons and synapses, as well as the synaptic connectivity patterns and the initial values of all parameters. A standard approach in neuronal modelling software is to build models based on a library of pre-defined models and mechanisms; if a model component does not yet exist, it has to be defined in a special-purpose or general low-level language and potentially be compiled and linked with the simulator. Here we propose an alternative approach that allows flexible definition of models by writing textual descriptions based on mathematical notation. We demonstrate that this approach allows the definition of a wide range of models with minimal syntax. Furthermore, such explicit model descriptions allow the generation of executable code for various target languages and devices, since the description is not tied to an implementation. Finally, this approach also has advantages for readability and reproducibility, because the model description is fully explicit, and because it can be automatically parsed and transformed into formatted descriptions. The presented approach has been implemented in the Brian2 simulator.
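    The core idea of turning a textual equation into executable update code can be sketched in a few lines. This is an illustration of the equation-oriented approach only, not how Brian2 is actually implemented, and the equation and parameter values are invented.

```python
# Sketch: "compile" a textual model description such as "dv/dt = (I - v)/tau"
# into a forward-Euler update function. Illustrates the equation-oriented
# idea only; this is not the Brian2 simulator's implementation.

def compile_ode(description):
    lhs, rhs = description.split("=", 1)
    var = lhs.strip().split("/")[0][1:]          # "dv/dt" -> "v"
    code = compile(rhs.strip(), "<model>", "eval")

    def step(state, dt):
        new = dict(state)
        # evaluate the right-hand side with the state dict as the namespace
        new[var] = state[var] + dt * eval(code, {}, state)
        return new

    return step

step = compile_ode("dv/dt = (I - v)/tau")
state = {"v": 0.0, "I": 1.0, "tau": 10.0}
for _ in range(1000):
    state = step(state, 0.1)
print(round(state["v"], 3))  # 1.0 -- v relaxes toward I
```

    A real simulator would parse the expression properly, check physical units, and generate vectorized or compiled target code instead of calling `eval`.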

  15. Petri Net-Based Workflow Access Control Model

    Institute of Scientific and Technical Information of China (English)

    陈卓; 骆婷; 石磊; 洪帆

    2004-01-01

    Access control is an important protection mechanism for information systems. This paper shows how to implement access control in workflow systems. We give a workflow access control model (WACM) based on several current access control models. The model supports role assignment and dynamic authorization. The paper defines the workflow using Petri nets. It first gives the definition and description of the workflow, and then analyzes the architecture of the workflow access control model (WACM). Finally, an example of an e-commerce workflow access control model is discussed in detail.
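    The token-game semantics underlying such a Petri-net workflow definition can be sketched minimally; the two-step submit/approve net below is a toy example, not the paper's WACM model.

```python
# Sketch: token-game semantics of a tiny workflow Petri net.
# A transition fires only when all of its input places hold enough tokens;
# the submit -> approve net below is a toy example, not the WACM model.

def enabled(marking, pre):
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    assert enabled(marking, pre), "transition not enabled"
    m = dict(marking)
    for p, n in pre.items():           # consume input tokens
        m[p] -= n
    for p, n in post.items():          # produce output tokens
        m[p] = m.get(p, 0) + n
    return m

# Each transition is (pre-places, post-places)
transitions = {
    "submit":  ({"start": 1}, {"review": 1}),
    "approve": ({"review": 1}, {"done": 1}),
}
m = {"start": 1}
for t in ["submit", "approve"]:
    m = fire(m, *transitions[t])
print(m)  # {'start': 0, 'review': 0, 'done': 1}
```

    In an access control setting, a role-authorization check would be added to `enabled`, so a transition fires only for users holding the required role.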

  16. 3D virtual human rapid modeling method based on top-down modeling mechanism

    Directory of Open Access Journals (Sweden)

    LI Taotao

    2017-01-01

    Full Text Available Aiming to satisfy the vast custom-made character demand of 3D virtual humans and the need for rapid modeling in the field of 3D virtual reality, a new virtual human top-down rapid modeling method is put forward in this paper based on a systematic analysis of the current state and shortcomings of virtual human modeling technology. After the top-level realization of the virtual human hierarchical structure frame design, modular expression of the virtual human and parameter design for each module are achieved gradually, level by level, downwards. While the relationship of connectors and mapping restraints among different modules is established, the definition of the size and texture parameters is also completed. A standardized process is meanwhile produced to support and adapt the virtual human top-down rapid modeling practice operation. Finally, a modeling application, which takes a Chinese captain character as an example, is carried out to validate the virtual human rapid modeling method based on the top-down modeling mechanism. The result demonstrates high modeling efficiency and provides one new concept for 3D virtual human geometric modeling and texture modeling.

  17. A Data-Driven Air Transportation Delay Propagation Model Using Epidemic Process Models

    Directory of Open Access Journals (Sweden)

    B. Baspinar

    2016-01-01

    Full Text Available In air transport network management, in addition to defining the performance behavior of the system’s components, identification of their interaction dynamics is a delicate issue in both strategic and tactical decision-making process so as to decide which elements of the system are “controlled” and how. This paper introduces a novel delay propagation model utilizing epidemic spreading process, which enables the definition of novel performance indicators and interaction rates of the elements of the air transportation network. In order to understand the behavior of the delay propagation over the network at different levels, we have constructed two different data-driven epidemic models approximating the dynamics of the system: (a) a flight-based epidemic model and (b) an airport-based epidemic model. The flight-based epidemic model utilizing the SIS epidemic model focuses on the individual flights where each flight can be in susceptible or infected states. The airport-centric epidemic model, in addition to the flight-to-flight interactions, allows us to define the collective behavior of the airports, which are modeled as metapopulations. In network model construction, we have utilized historical flight-track data of Europe and performed analysis for certain days involving certain disturbances. Through this effort, we have validated the proposed delay propagation models under disruptive events.
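    The SIS dynamics at the heart of the flight-based model can be illustrated with a deterministic mean-field iteration, where "infected" stands for delayed flights. The infection and recovery rates below are invented for illustration, not fitted values from the paper.

```python
# Sketch: deterministic mean-field SIS dynamics,
#   di/dt = beta * i * (1 - i) - gamma * i,
# where i is the fraction of "infected" (delayed) flights.
# The rates beta and gamma below are invented, not fitted values.

def sis_trajectory(i0=0.01, beta=0.6, gamma=0.2, dt=0.1, steps=1000):
    i, traj = i0, [i0]
    for _ in range(steps):
        i += (beta * i * (1.0 - i) - gamma * i) * dt   # forward-Euler step
        traj.append(i)
    return traj

traj = sis_trajectory()
# SIS endemic equilibrium: i* = 1 - gamma/beta = 1 - 0.2/0.6
print(round(traj[-1], 3))  # 0.667
```

    When beta > gamma the delayed fraction settles at the endemic equilibrium; when beta <= gamma delays die out, which is the qualitative distinction such interaction rates are meant to capture.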

  18. Out-of-equilibrium dynamics in a Gaussian trap model

    International Nuclear Information System (INIS)

    Diezemann, Gregor

    2007-01-01

    The violations of the fluctuation-dissipation theorem are analysed for a trap model with a Gaussian density of states. In this model, the system reaches thermal equilibrium for long times after a quench to any finite temperature and therefore all ageing effects are of a transient nature. For not too long times after the quench it is found that the so-called fluctuation-dissipation ratio tends to a non-trivial limit, thus indicating the possibility for the definition of a timescale-dependent effective temperature. However, different definitions of the effective temperature yield distinct results. In particular, plots of the integrated response versus the correlation function strongly depend on the way they are constructed. Also the definition of effective temperatures in the frequency domain is not unique for the model considered. This may have some implications for the interpretation of results from computer simulations and experimental determinations of effective temperatures

  19. Specialty Payment Model Opportunities and Assessment: Gastroenterology and Cardiology Model Design Report.

    Science.gov (United States)

    Mulcahy, Andrew W; Chan, Chris; Hirshman, Samuel; Huckfeldt, Peter J; Kofner, Aaron; Liu, Jodi L; Lovejoy, Susan L; Popescu, Ioana; Timbie, Justin W; Hussey, Peter S

    2015-07-15

    Gastroenterology and cardiology services are common and costly among Medicare beneficiaries. Episode-based payment, which aims to create incentives for high-quality, low-cost care, has been identified as a promising alternative payment model. This article describes research related to the design of episode-based payment models for ambulatory gastroenterology and cardiology services for possible testing by the Center for Medicare and Medicaid Innovation at the Centers for Medicare and Medicaid Services (CMS). The authors analyzed Medicare claims data to describe the frequency and characteristics of gastroenterology and cardiology index procedures, the practices that delivered index procedures, and the patients that received index procedures. The results of these analyses can help inform CMS decisions about the definition of episodes in an episode-based payment model; payment adjustments for service setting, multiple procedures, or other factors; and eligibility for the payment model.

  20. Comparison of BrainTool to other UML modeling and model transformation tools

    Science.gov (United States)

    Nikiforova, Oksana; Gusarovs, Konstantins

    2017-07-01

    In the last 30 years numerous model-generated software systems have been offered to address problems with development productivity and the resulting software quality. CASE tools developed to date are advertised as having "complete code-generation capabilities". Nowadays the Object Management Group (OMG) makes similar claims regarding Unified Modeling Language (UML) models at different levels of abstraction. It is said that software development automation using CASE tools enables a significant level of automation. Today's CASE tools usually offer a combination of several features, starting with a model editor and a model repository for the traditional ones, and ending with a code generator (possibly using a scripting or domain-specific (DSL) language), a transformation tool to produce new artifacts from manually created ones, and a transformation definition editor to define new transformations for the most advanced ones. The present paper contains the results of a comparison of CASE tools (mainly UML editors) against the level of automation they offer.

  1. Modelling Active Faults in Probabilistic Seismic Hazard Analysis (PSHA) with OpenQuake: Definition, Design and Experience

    Science.gov (United States)

    Weatherill, Graeme; Garcia, Julio; Poggi, Valerio; Chen, Yen-Shin; Pagani, Marco

    2016-04-01

    The Global Earthquake Model (GEM) has, since its inception in 2009, made many contributions to the practice of seismic hazard modeling in different regions of the globe. The OpenQuake-engine (hereafter referred to simply as OpenQuake), GEM's open-source software for calculation of earthquake hazard and risk, has found application in many countries, spanning a diversity of tectonic environments. GEM itself has produced a database of national and regional seismic hazard models, harmonizing into OpenQuake's own definition the varied seismogenic sources found therein. The characterization of active faults in probabilistic seismic hazard analysis (PSHA) is at the centre of this process, motivating many of the developments in OpenQuake and presenting hazard modellers with the challenge of reconciling seismological, geological and geodetic information for the different regions of the world. Faced with these challenges, and from the experience gained in the process of harmonizing existing models of seismic hazard, four critical issues are addressed. The challenge GEM has faced in the development of software is how to define a representation of an active fault (both in terms of geometry and earthquake behaviour) that is sufficiently flexible to adapt to different tectonic conditions and levels of data completeness. By exploring the different fault typologies supported by OpenQuake we illustrate how seismic hazard calculations can, and do, take into account complexities such as geometrical irregularity of faults in the prediction of ground motion, highlighting some of the potential pitfalls and inconsistencies that can arise. This exploration leads to the second main challenge in active fault modeling, what elements of the fault source model impact most upon the hazard at a site, and when does this matter? 
Through a series of sensitivity studies we show how different configurations of fault geometry, and the corresponding characterisation of near-fault phenomena (including

  2. Exploring the Freemium Business Model

    OpenAIRE

    Reime, Erlend Vihovde

    2011-01-01

    This thesis explores the Freemium business model, answering how a Freemium model is defined and how it works in real life. It discusses the original definition by Fred Wilson and presents the context where the Freemium business model is used: Internet services in Web 2.0. It also looks at how customers react to free services and the Internet. After this, the three main directions within business strategy theory are explored: Industry-based competition, Firm-specific Resources and Capa...

  3. Model of facilitation of emotional intelligence to promote wholeness ...

    African Journals Online (AJOL)

    The facilitation of inherent affective and mental resourcefulness and resilience was the main concept of the model. Step two comprised the definition and classification of central and related concepts. Step three provides a description of the model. The model operates in three phases namely the dependent phase, partially ...

  4. Minimum-complexity helicopter simulation math model

    Science.gov (United States)

    Heffley, Robert K.; Mnich, Marc A.

    1988-01-01

    An example of a minimal complexity simulation helicopter math model is presented. Motivating factors are the computational delays, cost, and inflexibility of the very sophisticated math models now in common use. A helicopter model form is given which addresses each of these factors and provides better engineering understanding of the specific handling qualities features which are apparent to the simulator pilot. The technical approach begins with specification of features which are to be modeled, followed by a build up of individual vehicle components and definition of equations. Model matching and estimation procedures are given which enable the modeling of specific helicopters from basic data sources such as flight manuals. Checkout procedures are given which provide for total model validation. A number of possible model extensions and refinement are discussed. Math model computer programs are defined and listed.

  5. Network model of security system

    Directory of Open Access Journals (Sweden)

    Adamczyk Piotr

    2016-01-01

    Full Text Available The article presents the concept of building a network security model and its application in the process of risk analysis. It indicates the possibility of a new definition of the role of network models in safety analysis. Special attention was paid to the development of an algorithm describing the process of identifying assets, vulnerabilities and threats in a given context. The aim of the article is to present how this algorithm reduced the complexity of the problem by eliminating from the base model those components that have no links with other components; as a result it was possible to build a real network model corresponding to reality.

  6. Improving the quality of clinical coding: a comprehensive audit model

    Directory of Open Access Journals (Sweden)

    Hamid Moghaddasi

    2014-04-01

    Full Text Available Introduction: The review of medical records with the aim of assessing the quality of codes has long been conducted in different countries. Auditing medical coding, as an instructive approach, could help to review the quality of codes objectively using defined attributes, and this in turn would lead to improvement of the quality of codes. Method: The current study aimed to present a model for auditing the quality of clinical codes. The audit model was formed after reviewing other audit models, considering their strengths and weaknesses. A clear definition was presented for each quality attribute and more detailed criteria were then set for assessing the quality of codes. Results: The audit tool (based on the quality attributes of legibility, relevancy, completeness, accuracy, definition and timeliness) led to the development of an audit model for assessing the quality of medical coding. The Delphi technique was then used to reassure the validity of the model. Conclusion: The inclusive audit model designed could provide a reliable and valid basis for assessing the quality of codes, considering more quality attributes and their clear definition. The inter-observer check suggested in the method of auditing is of particular importance to reassure the reliability of coding.

  7. Simulation modelling for new gas turbine fuel controller creation.

    Science.gov (United States)

    Vendland, L. E.; Pribylov, V. G.; Borisov, Yu A.; Arzamastsev, M. A.; Kosoy, A. A.

    2017-11-01

    State-of-the-art gas turbine fuel flow control systems are based on the throttle principle. The major disadvantage of such systems is that they require a high-pressure fuel intake. A different approach to fuel flow control is to use a regulating compressor, and for this approach, because of the interaction between the controller and the gas turbine, a specific regulating compressor is required. Difficulties emerge as early as the requirement definition stage: to define requirements for a new object, its properties must be known. Simulation modelling helps to overcome these difficulties. At the requirement definition stage the most simplified mathematical model is used. The mathematical models will become more complex and detailed as the planned work advances. In the future, adjustment of the regulating compressor physical model to work with a virtual gas turbine and a physical control system is planned.

  8. Modeling exposure–lag–response associations with distributed lag non-linear models

    Science.gov (United States)

    Gasparrini, Antonio

    2014-01-01

    In biomedical research, a health effect is frequently associated with protracted exposures of varying intensity sustained in the past. The main complexity of modeling and interpreting such phenomena lies in the additional temporal dimension needed to express the association, as the risk depends on both intensity and timing of past exposures. This type of dependency is defined here as exposure–lag–response association. In this contribution, I illustrate a general statistical framework for such associations, established through the extension of distributed lag non-linear models, originally developed in time series analysis. This modeling class is based on the definition of a cross-basis, obtained by the combination of two functions to flexibly model linear or nonlinear exposure-responses and the lag structure of the relationship, respectively. The methodology is illustrated with an example application to cohort data and validated through a simulation study. This modeling framework generalizes to various study designs and regression models, and can be applied to study the health effects of protracted exposures to environmental factors, drugs or carcinogenic agents, among others. © 2013 The Authors. Statistics in Medicine published by John Wiley & Sons, Ltd. PMID:24027094
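    The cross-basis construction, combining an exposure-response basis with a lag-structure basis, can be sketched as a row-wise tensor product. Simple polynomial bases are used here purely for illustration; DLNMs typically use splines, and the dimensions below are invented.

```python
# Sketch: one row of a cross-basis as the tensor product of an exposure
# basis and a lag basis, summed over the lag window. Polynomial bases are
# used for illustration only; DLNMs typically use spline bases.

def poly_basis(x, degree):
    return [x ** d for d in range(1, degree + 1)]

def cross_basis(exposure_history, exp_degree=2, lag_degree=2):
    """exposure_history: exposures at lags 0..L (most recent first).
    Returns one cross-basis row: the sum over lags of the outer product
    of exposure-basis and lag-basis terms."""
    L = len(exposure_history) - 1
    row = [0.0] * (exp_degree * lag_degree)
    for lag, x in enumerate(exposure_history):
        fx = poly_basis(x, exp_degree)
        # scale lag into [0, 1] so the lag basis is well conditioned
        wl = poly_basis(lag / L if L else 0.0, lag_degree)
        k = 0
        for f in fx:
            for w in wl:
                row[k] += f * w
                k += 1
    return row

row = cross_basis([1.0, 0.5, 0.0])  # exposure over lags 0..2
print(len(row))  # 4 = exp_degree * lag_degree
```

    Each subject (or time point) contributes one such row; the resulting columns enter an ordinary regression model, and the fitted coefficients jointly describe the exposure-lag-response surface.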

  9. On an elementary definition of visual saliency

    DEFF Research Database (Denmark)

    Loog, Marco

    2008-01-01

    Various approaches to computational modelling of bottom-up visual attention have been proposed in the past two decades. As part of this trend, researchers have studied ways to characterize the saliency map underlying many of these models. In more recent years, several definitions based on probabilistic and information or decision theoretic considerations have been proposed. These provide experimentally successful, appealing, low-level, operational, and elementary definitions of visual saliency (see e.g. Bruce, 2005 Neurocomputing 65 125 - 133). Here, I demonstrate that, in fact, all...

  10. Modeling and Predistortion of Envelope Tracking Power Amplifiers using a Memory Binomial Model

    DEFF Research Database (Denmark)

    Tafuri, Felice Francesco; Sira, Daniel; Larsen, Torben

    2013-01-01

    The model definition is based on binomial series, hence the name memory binomial model (MBM). The MBM is here applied to measured data-sets acquired from an ET measurement set-up. When used as a PA model the MBM showed an NMSE (Normalized Mean Squared Error) as low as −40 dB and an ACEPR (Adjacent Channel Error Power Ratio) below −51 dB. The simulated predistortion results showed that the MBM can improve the compensation of distortion in the adjacent channel by 5.8 dB and 5.7 dB compared to a memory polynomial predistorter (MPPD). The predistortion performance in the time domain showed an NMSE...

  11. OWL references in ORM conceptual modelling

    Science.gov (United States)

    Matula, Jiri; Belunek, Roman; Hunka, Frantisek

    2017-07-01

    Object Role Modelling methodology is the fact-based type of conceptual modelling. The aim of the paper is to emphasize a close connection to OWL documents and its possible mutual cooperation. The definition of entities or domain values is an indispensable part of the conceptual schema design procedure defined by the ORM methodology. Many of these entities are already defined in OWL documents. Therefore, it is not necessary to declare entities again, whereas it is possible to utilize references from OWL documents during modelling of information systems.

  12. Function Model for Community Health Service Information

    Science.gov (United States)

    Yang, Peng; Pan, Feng; Liu, Danhong; Xu, Yongyong

    In order to construct a function model of community health service (CHS) information for the development of a CHS information management system, Integration Definition for Function Modeling (IDEF0), an IEEE standard which is extended from the Structured Analysis and Design Technique (SADT) and is now a widely used function modeling method, was used to classify its information from top to bottom. The contents of every level of the model were described and coded. Then a function model for CHS information, which includes 4 super-classes, 15 classes and 28 sub-classes of business function, 43 business processes and 168 business activities, was established. This model can facilitate information management system development and workflow refinement.

  13. A Comparative of business process modelling techniques

    Science.gov (United States)

    Tangkawarow, I. R. H. T.; Waworuntu, J.

    2016-04-01

    In this era, there are many business process modeling techniques. This article researches the differences between business process modeling techniques: for each technique, the definition and the structure are explained. This paper presents a comparative analysis of some popular business process modelling techniques. The comparative framework is based on 2 criteria: notation and how each technique works when implemented in Somerleyton Animal Park. The discussion of each technique ends with its advantages and disadvantages. The final conclusion gives recommendations for business process modeling techniques that are easy to use and serves as the basis for evaluating further modelling techniques.

  14. Business Model Innovation to Create and Capture Resource Value in Future Circular Material Chains

    OpenAIRE

    Roos, Göran

    2014-01-01

    This article briefly discusses the origins and development of the business model concept resulting in a high level definition. Against this backdrop, frameworks from the literature around green business models with examples of green business models and the business model innovation process are presented. The article then discusses the origins and meaning of different "green" concepts relevant for the circular value chain concluding with a high level definition. The article finally outl...

  15. Marginal Models for Categorial Data

    NARCIS (Netherlands)

    Bergsma, W.P.; Rudas, T.

    2002-01-01

    Statistical models defined by imposing restrictions on marginal distributions of contingency tables have received considerable attention recently. This paper introduces a general definition of marginal log-linear parameters and describes conditions for a marginal log-linear parameter to be a smooth

  16. Effect of the MCNP model definition on the computation time

    International Nuclear Information System (INIS)

    Šunka, Michal

    2017-01-01

    The presented work studies how the method of defining the geometry in the MCNP transport code affects the computational time, as well as the difficulty of preparing an input file describing the given geometry. Cases using different geometric definitions, including the use of basic 2-dimensional and 3-dimensional objects and their combinations, were studied. The results indicate that an inappropriate definition can increase the computational time by up to 59% (a more realistic case indicates 37%) for the same results and the same statistical uncertainty. (orig.)

  17. EIA model documentation: World oil refining logistics demand model,''WORLD'' reference manual

    International Nuclear Information System (INIS)

    1994-01-01

    This manual is intended primarily for use as a reference by analysts applying the WORLD model to regional studies. It also provides overview information on WORLD features of potential interest to managers and analysts. Broadly, the manual covers WORLD model features in progressively increasing detail. Section 2 provides an overview of the WORLD model, how it has evolved, what its design goals are, what it produces, and where it can be taken with further enhancements. Section 3 reviews model management covering data sources, managing over-optimization, calibration and seasonality, check-points for case construction and common errors. Section 4 describes in detail the WORLD system, including: data and program systems in overview; details of mainframe and PC program control and files; model generation, size management, debugging and error analysis; use with different optimizers; and reporting and results analysis. Section 5 provides a detailed description of every WORLD model data table, covering model controls, case and technology data. Section 6 goes into the details of WORLD matrix structure. It provides an overview, describes how regional definitions are controlled and defines the naming conventions for all model rows, columns, right-hand sides, and bounds. It also includes a discussion of the formulation of product blending and specifications in WORLD. Several Appendices supplement the main sections

  18. On the implementation of the spherical collapse model for dark energy models

    Energy Technology Data Exchange (ETDEWEB)

    Pace, Francesco [Jodrell Bank Centre for Astrophysics, School of Physics and Astronomy, The University of Manchester, Manchester, M13 9PL (United Kingdom); Meyer, Sven; Bartelmann, Matthias, E-mail: francesco.pace@manchester.ac.uk, E-mail: sven.meyer@uni-heidelberg.de, E-mail: bartelmann@uni-heidelberg.de [Zentrum für Astronomie der Universität Heidelberg, Institut für theoretische Astrophysik, Philosophenweg 12, D-69120, Heidelberg (Germany)

    2017-10-01

    In this work we review the theory of the spherical collapse model and critically analyse the aspects of the numerical implementation of its fundamental equations. By extending a recent work by [1], we show how different aspects, such as the initial integration time, the definition of constant infinity and the criterion for the extrapolation method (how close the inverse of the overdensity has to be to zero at the collapse time) can lead to an erroneous estimation (a few per mill error which translates to a few percent in the mass function) of the key quantity in the spherical collapse model: the linear critical overdensity δ{sub c}, which plays a crucial role for the mass function of halos. We provide a better recipe to adopt in designing a code suitable to a generic smooth dark energy model and we compare our numerical results with analytic predictions for the EdS and the ΛCDM models. We further discuss the evolution of δ{sub c} for selected classes of dark energy models as a general test of the robustness of our implementation. We finally outline which modifications need to be taken into account to extend the code to more general classes of models, such as clustering dark energy models and non-minimally coupled models.
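The key quantity discussed above, the linear critical overdensity δc, has a classical closed-form benchmark in the Einstein-de Sitter case, δc = (3/20)(12π)^(2/3) ≈ 1.686, which is the standard sanity check for any numerical spherical-collapse implementation such as the one the authors describe:

```python
import math

# Classical Einstein-de Sitter benchmark: the linearly extrapolated
# critical overdensity at collapse, delta_c = (3/20) * (12*pi)**(2/3),
# approximately 1.686. A numerical spherical-collapse code run for an
# EdS cosmology should reproduce this value.
delta_c_eds = (3.0 / 20.0) * (12.0 * math.pi) ** (2.0 / 3.0)
print(round(delta_c_eds, 3))  # 1.686
```

Because the mass function depends exponentially on δc, even the few-per-mille numerical errors the authors describe propagate to percent-level errors in halo abundances, which is why such analytic checks matter.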

  19. On the implementation of the spherical collapse model for dark energy models

    Science.gov (United States)

    Pace, Francesco; Meyer, Sven; Bartelmann, Matthias

    2017-10-01

    In this work we review the theory of the spherical collapse model and critically analyse the aspects of the numerical implementation of its fundamental equations. By extending a recent work by [1], we show how different aspects, such as the initial integration time, the definition of constant infinity and the criterion for the extrapolation method (how close the inverse of the overdensity has to be to zero at the collapse time) can lead to an erroneous estimation (a few per mill error which translates to a few percent in the mass function) of the key quantity in the spherical collapse model: the linear critical overdensity δc, which plays a crucial role for the mass function of halos. We provide a better recipe to adopt in designing a code suitable to a generic smooth dark energy model and we compare our numerical results with analytic predictions for the EdS and the ΛCDM models. We further discuss the evolution of δc for selected classes of dark energy models as a general test of the robustness of our implementation. We finally outline which modifications need to be taken into account to extend the code to more general classes of models, such as clustering dark energy models and non-minimally coupled models.

  20. A new multidimensional model with text dimensions: definition and implementation

    Directory of Open Access Journals (Sweden)

    MariaJ. Martin-Bautista

    2013-02-01

    Full Text Available We present a new multidimensional model with textual dimensions based on a knowledge structure extracted from the texts, where any textual attribute in a database can be processed, and not only XML texts. This dimension allows treating the textual data in the same way as the non-textual data, automatically and without user intervention, so all the classical operations in the multidimensional model can be defined for this textual dimension. While most of the models dealing with texts that can be found in the literature are not implemented, in this proposal the multidimensional model and the OLAP system have been implemented in a software tool, so it can be tested on real data. A case study with medical data is included in this work.

  1. On the equivalence of quadrupole phonon model and interacting boson model

    International Nuclear Information System (INIS)

    Kyrchev, G.

    1980-01-01

    A rigorous proof of the equivalence of the quadrupole phonon model (QPM) and the interacting boson model (IBM) (the Hamiltonians and the relevant operators of both models are identical) is presented. Within the theory of classical Lie algebras, the Schwinger representation (SR) construction of the SU(6) algebra, generated by the QPM collective coordinates, conjugated momenta and their commutators, is given. Having the explicit form of the SU(6) generators in SR, we obtain the QPM collective Hamiltonian in SR (previously the Holstein-Primakoff infinite boson expansion had been applied to this Hamiltonian). The QPM Hamiltonian thus obtained contains all boson structures which are present in the Hamiltonian of IBM, and under definite relations between their parameters both Hamiltonians coincide identically. The relevant operators are identical too. Thus, though QPM and IBM were advanced independently and developed in different fashions, they are essentially equivalent

  2. Modelers' perception of mathematical modeling in epidemiology: a web-based survey.

    Directory of Open Access Journals (Sweden)

    Gilles Hejblum

    Full Text Available BACKGROUND: Mathematical modeling in epidemiology (MME) is being used increasingly. However, there are many uncertainties in terms of definitions, uses and quality features of MME. METHODOLOGY/PRINCIPAL FINDINGS: To delineate the current status of these models, a 10-item questionnaire on MME was devised. Proposed via an anonymous internet-based survey, the questionnaire was completed by 189 scientists who had published in the domain of MME. A small minority (18%) of respondents claimed to have in mind a concise definition of MME. Some techniques were identified by the researchers as characterizing MME (e.g. Markov models), while others, at the same level of sophistication in terms of mathematics, were not (e.g. Cox regression). The researchers' opinions also contrasted regarding the potential applications of MME, perceived as highly relevant for providing insight into complex mechanisms and less relevant for identifying causal factors. The quality criteria were those of good science and were not related to the size and the nature of the public health problems addressed. CONCLUSIONS/SIGNIFICANCE: This study shows that perceptions of the nature, uses and quality criteria of MME are contrasted, even among the very community of published authors in this domain. Nevertheless, MME is an emerging discipline in epidemiology and this study underlines that it is associated with specific areas of application and methods. The development of this discipline is likely to deserve a framework providing recommendations and guidance at various steps of the studies, from design to report.

  3. Study on individual stochastic model of GNSS observations for precise kinematic applications

    Science.gov (United States)

    Próchniewicz, Dominik; Szpunar, Ryszard

    2015-04-01

    The proper definition of the mathematical positioning model, which is defined by functional and stochastic models, is a prerequisite to obtaining the optimal estimation of unknown parameters. Especially important in this definition is realistic modelling of the stochastic properties of observations, which are more receiver-dependent and time-varying than the deterministic relationships. This is particularly true with respect to precise kinematic applications, which are characterized by weakened model strength. In this case, an incorrect or simplified definition of the stochastic model means that the performance of ambiguity resolution and the accuracy of position estimation can be limited. In this study we investigate methods of describing the measurement noise of GNSS observations and its impact on deriving a precise kinematic positioning model. In particular, stochastic modelling of individual components of the variance-covariance matrix of observation noise, performed using observations from a very short baseline and a laboratory GNSS signal generator, is analyzed. Experimental test results indicate that utilizing an individual stochastic model of observations, including elevation dependency and cross-correlation, instead of assuming that raw measurements are independent with the same variance, improves the performance of ambiguity resolution as well as rover positioning accuracy. This shows that the proposed stochastic assessment method could be an important part of a complex calibration procedure for GNSS equipment.
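An elevation-dependent, cross-correlated stochastic model of the kind discussed above is often written as σ²(E) = a² + b²/sin²(E) for each satellite at elevation E, assembled into a per-epoch variance-covariance matrix. A hedged sketch (the coefficients a, b and the correlation ρ are illustrative placeholders, not calibrated values from the study):

```python
import numpy as np

# A commonly used elevation-dependent noise model for GNSS observations:
#   sigma(E)**2 = a**2 + b**2 / sin(E)**2
# a, b (metres) and rho are illustrative placeholders.
def obs_covariance(elevations_deg, a=0.003, b=0.003, rho=0.0):
    """Per-epoch variance-covariance matrix, one satellite per row/column."""
    e = np.radians(np.asarray(elevations_deg, dtype=float))
    sigma = np.sqrt(a**2 + b**2 / np.sin(e) ** 2)
    q = np.outer(sigma, sigma)        # sigma_i * sigma_j
    corr = np.full_like(q, rho)       # assumed common cross-correlation
    np.fill_diagonal(corr, 1.0)
    return q * corr

Q = obs_covariance([15.0, 45.0, 90.0])
# The low-elevation satellite gets the largest variance (down-weighting):
print(int(np.diag(Q).argmax()))  # 0
```

Using the inverse of such a matrix as the weight matrix in least-squares adjustment down-weights noisy low-elevation observations, which is the mechanism by which the individual stochastic model improves ambiguity resolution.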

  4. Modeling and analysis of stochastic systems

    CERN Document Server

    Kulkarni, Vidyadhar G

    2011-01-01

    Based on the author's more than 25 years of teaching experience, Modeling and Analysis of Stochastic Systems, Second Edition covers the most important classes of stochastic processes used in the modeling of diverse systems, from supply chains and inventory systems to genetics and biological systems. For each class of stochastic process, the text includes its definition, characterization, applications, transient and limiting behavior, first passage times, and cost/reward models. Along with reorganizing the material, this edition revises and adds new exercises and examples. New to the second edi

  5. Phenotyping animal models of diabetic neuropathy

    DEFF Research Database (Denmark)

    Biessels, G J; Bril, V; Calcutt, N A

    2014-01-01

    NIDDK, JDRF, and the Diabetic Neuropathy Study Group of EASD sponsored a meeting to explore the current status of animal models of diabetic peripheral neuropathy. The goal of the workshop was to develop a set of consensus criteria for the phenotyping of rodent models of diabetic neuropathy...... with a discussion on the merits and limitations of a unified approach to phenotyping rodent models of diabetic neuropathy and a consensus formed on the definition of the minimum criteria required for establishing the presence of the disease. A neuropathy phenotype in rodents was defined as the presence...

  6. Bioavailability in the boris assessment model

    International Nuclear Information System (INIS)

    Norden, M.; Avila, R.; Gonze, M.A.; Tamponnet, C.

    2004-01-01

    The fifth framework EU project BORIS (Bioavailability Of Radionuclides In Soils: role of biological components and resulting improvement of prediction models) has three scientific objectives. The first is to improve understanding of the mechanisms governing the transfer of radionuclides to plants. The second is to improve existing predictive models of radionuclide interaction with soils by incorporating the knowledge acquired from the experimental results. The third and last objective is to extract from the experimental results a scientific basis for the development of bioremediation methods for radionuclide-contaminated soils and to apprehend the role of additional non-radioactive pollutants on radionuclide bioavailability. This paper is focused on the second objective. The purpose of the BORIS assessment model is to describe the behaviour of radionuclides in the soil-plant system with the aim of making predictions of the time dynamics of the bioavailability of radionuclides in soil and the radionuclide concentrations in plants. To be useful, the assessment model should be rather simple and use only a few parameters, which are commonly available or possible to measure for different sites. The model shall take into account, as much as possible, the results of the experimental studies and the mechanistic models developed in the BORIS project. One possible approach is to introduce in the assessment model a quantitative relationship between the bioavailability of the radionuclides in soil and the soil properties. To do this, an operational definition of bioavailability is needed. Here operational means experimentally measurable, directly or indirectly, and that the bioavailability can be translated into a mathematical expression. This paper describes the reasoning behind the chosen definition of bioavailability for the assessment model, how to derive operational expressions for the bioavailability and how to use them in the assessment model. (author)

  7. Solar Sail Models and Test Measurements Correspondence for Validation Requirements Definition

    Science.gov (United States)

    Ewing, Anthony; Adams, Charles

    2004-01-01

    Solar sails are being developed as a mission-enabling technology in support of future NASA science missions. Current efforts have advanced solar sail technology sufficient to justify a flight validation program. A primary objective of this activity is to test and validate solar sail models that are currently under development so that they may be used with confidence in future science mission development (e.g., scalable to larger sails). Both system and model validation requirements must be defined early in the program to guide design cycles and to ensure that relevant and sufficient test data will be obtained to conduct model validation to the level required. A process of model identification, model input/output documentation, model sensitivity analyses, and test measurement correspondence is required so that decisions can be made to satisfy validation requirements within program constraints.

  8. Target-Centric Network Modeling

    DEFF Research Database (Denmark)

    Mitchell, Dr. William L.; Clark, Dr. Robert M.

    In Target-Centric Network Modeling: Case Studies in Analyzing Complex Intelligence Issues, authors Robert Clark and William Mitchell take an entirely new approach to teaching intelligence analysis. Unlike any other book on the market, it offers case study scenarios using actual intelligence...... reporting formats, along with a tested process that facilitates the production of a wide range of analytical products for civilian, military, and hybrid intelligence environments. Readers will learn how to perform the specific actions of problem definition modeling, target network modeling......, and collaborative sharing in the process of creating a high-quality, actionable intelligence product. The case studies reflect the complexity of twenty-first century intelligence issues by dealing with multi-layered target networks that cut across political, economic, social, technological, and military issues...

  9. Applying the Business Process and Practice Alignment Meta-model: Daily Practices and Process Modelling

    Directory of Open Access Journals (Sweden)

    Ventura Martins Paula

    2017-03-01

    Full Text Available Background: Business Process Modelling (BPM) is one of the most important phases of information system design. Business Process (BP) meta-models allow capturing informational and behavioural aspects of business processes. Unfortunately, standard BP meta-modelling approaches focus just on process description, providing different BP models. It is not possible to compare and identify related daily practices in order to improve BP models. This lack of information implies that further research on BP meta-models is needed to reflect the evolution/change in BP. Considering this limitation, this paper introduces a new BP meta-model, the Business Process and Practice Alignment Meta-model (BPPAM). Our intention is to present a meta-model that addresses features related to the alignment between daily work practices and BP descriptions. Objectives: This paper intends to present a meta-model which integrates daily work information into coherent and sound process definitions. Methods/Approach: The methodology employed in the research follows a design-science approach. Results: The results of the case study are related to the application of the proposed meta-model to align the specification of a BP model with work practice models. Conclusions: This meta-model can be used within the BPPAM methodology to specify or improve business process models based on work practice descriptions.

  10. Definition of Videogames

    Directory of Open Access Journals (Sweden)

    Grant Tavinor

    2008-01-01

    Full Text Available Can videogames be defined? The new field of games studies has generated three somewhat competing models of videogaming that characterize games as new forms of gaming, narratives, and interactive fictions. When treated as necessary and sufficient condition definitions, however, each of the three approaches fails to pick out all and only videogames. In this paper I argue that looking more closely at the formal qualities of definition helps to set out the range of definitional options open to the games theorist. A disjunctive definition of videogaming seems the most appropriate of these definitional options. The disjunctive definition I offer here is motivated by the observation that there is more than one characteristic way of being a videogame.

  11. Designing a Sustainable Future with Mental Models

    OpenAIRE

    Bernotat, Anke; Bertling, Jürgen; English, Christiane; Schanz, Judith

    2017-01-01

    Inspired by the question of the Club of Rome as to whether design could help translate the ubiquitous knowledge on sustainability into daily practice, and by Peter Senge's belief that mental models are a limiting factor in the implementation of systemic insight (Senge 2006), we explored working with mental models as a sustainable design tool. We propose a definition for design uses. At the 7th Sustainable Summer School we collected general unsustainable mental models and "designed" sustainable ones. These me...

  12. Multimodality Tumor Delineation and Predictive Modelling via Fuzzy-Fusion Deformable Models and Biological Potential Functions

    Science.gov (United States)

    Wasserman, Richard Marc

    The radiation therapy treatment planning (RTTP) process may be subdivided into three planning stages: gross tumor delineation, clinical target delineation, and modality dependent target definition. The research presented will focus on the first two planning tasks. A gross tumor target delineation methodology is proposed which focuses on the integration of MRI, CT, and PET imaging data towards the generation of a mathematically optimal tumor boundary. The solution to this problem is formulated within a framework integrating concepts from the fields of deformable modelling, region growing, fuzzy logic, and data fusion. The resulting fuzzy fusion algorithm can integrate both edge and region information from multiple medical modalities to delineate optimal regions of pathological tissue content. The subclinical boundaries of an infiltrating neoplasm cannot be determined explicitly via traditional imaging methods and are often defined to extend a fixed distance from the gross tumor boundary. In order to improve the clinical target definition process an estimation technique is proposed via which tumor growth may be modelled and subclinical growth predicted. An in vivo, macroscopic primary brain tumor growth model is presented, which may be fit to each patient undergoing treatment, allowing for the prediction of future growth and consequently the ability to estimate subclinical local invasion. Additionally, the patient specific in vivo tumor model will be of significant utility in multiple diagnostic clinical applications.

  13. Quasinuclear colored quark model for hadrons

    International Nuclear Information System (INIS)

    Lipkin, H.J.

    1978-09-01

    Lectures are presented on a quasinuclear constituent quark model in which hadrons are assumed to be made of constituent quarks interacting with a two-body color-exchange logarithmic potential. The color degree of freedom is discussed in detail. Some properties of the logarithmic potential, the definition of the quasinuclear model and its validity, and a comparison of some of its predictions with experiment are described. 31 references

  14. Considering lesbian identity from a social-psychological perspective: two different models of "being a lesbian".

    Science.gov (United States)

    Tate, Charlotte Chuck

    2012-01-01

    One long-standing project within lesbian studies has been to develop a satisfactory working definition of "lesbian." This article proposes two new models of a definition using principles of social psychology. Each model (a) utilizes the premise that gender lacks a categorical essence and (b) separates behavioral adherence to cultural stereotypes of femininity and masculinity from one's gender self-categorization. From these premises, I generate and critique two internally coherent models of lesbian identity that are inclusive (to different degrees) of various gender identities. For each model, the potential inclusion of trans men, trans women, genderqueers, and lesbian-identified cisgender men is evaluated. The explanatory power of these models is twofold. One, the models can serve as theoretical perspectives for scholars who study the intersection of gender and sexual identity. Two, the models can also characterize the everyday experience of people who have tacit working definitions of lesbian identity.

  15. Analyzing differences in operational disease definitions using ontological modeling

    NARCIS (Netherlands)

    Peelen, Linda; Klein, Michel C.A.; Schlobach, Stefan; De Keizer, Nicolette F.; Peek, Niels

    2007-01-01

    In medicine, there are many diseases which cannot be precisely characterized but are considered as natural kinds. In the communication between health care professionals, this is generally not problematic. In biomedical research, however, crisp definitions are required to unambiguously distinguish

  16. Object-oriented biomedical system modelling--the language.

    Science.gov (United States)

    Hakman, M; Groth, T

    1999-11-01

    The paper describes a new object-oriented biomedical continuous system modelling language (OOBSML). It is fully object-oriented and supports model inheritance, encapsulation, and model component instantiation and behaviour polymorphism. Besides the traditional differential and algebraic equation expressions, the language also includes formal expressions for documenting models and defining model quantity types and quantity units. It supports explicit definition of model input-, output- and state quantities, model components and component connections. The OOBSML model compiler produces self-contained, independent, executable model components that can be instantiated and used within other OOBSML models and/or stored within model and model component libraries. In this way complex models can be structured as multilevel, multi-component model hierarchies. Technically, the model components produced by the OOBSML compiler are executable computer code objects based on distributed object and object request broker technology. This paper includes both the language tutorial and the formal language syntax and semantic description.

  17. Model Validation in Ontology Based Transformations

    Directory of Open Access Journals (Sweden)

    Jesús M. Almendros-Jiménez

    2012-10-01

    Full Text Available Model Driven Engineering (MDE) is an emerging approach of software engineering. MDE emphasizes the construction of models from which the implementation should be derived by applying model transformations. The Ontology Definition Meta-model (ODM) has been proposed as a profile for UML models of the Web Ontology Language (OWL). In this context, transformations of UML models can be mapped into ODM/OWL transformations. On the other hand, model validation is a crucial task in model transformation. Meta-modeling permits giving a syntactic structure to source and target models. However, semantic requirements have to be imposed on source and target models. A given transformation will be sound when source and target models fulfill the syntactic and semantic requirements. In this paper, we present an approach for model validation in ODM based transformations. Adopting a logic programming based transformational approach, we show how it is possible to transform and validate models. Properties to be validated range from structural and semantic requirements of models (pre- and post-conditions) to properties of the transformation (invariants). The approach has been applied to a well-known example of model transformation: the Entity-Relationship (ER) to Relational Model (RM) transformation.

  18. Process modelling on a canonical basis

    Energy Technology Data Exchange (ETDEWEB)

    Siepmann, Volker

    2006-12-20

    possible to retrieve symbolically obtained derivatives of arbitrary process properties with respect to process parameters efficiently as a post calculation. The approach is therefore perfectly suitable to perform advanced process systems engineering tasks, such as sensitivity analysis, process optimisation, and data reconciliation. The concept of canonical modelling yields a natural definition of a general exergy state function for second law analysis. By partitioning of exergy into latent, mechanical, and chemical contributions, irreversible effects can be identified specifically, even for black-box models. The calculation core of a new process simulator called Yasim is developed and implemented. The software design follows the concepts described in the theoretical part of this thesis. Numerous exemplary process models are presented to address various subtopics of canonical modelling (author)

  19. Biomass Scenario Model Scenario Library: Definitions, Construction, and Description

    Energy Technology Data Exchange (ETDEWEB)

    Inman, D.; Vimmerstedt, L.; Bush, B.; Peterson, S.

    2014-04-01

    Understanding the development of the biofuels industry in the United States is important to policymakers and industry. The Biomass Scenario Model (BSM) is a system dynamics model of the biomass-to-biofuels system that can be used to explore policy effects on biofuels development. Because of the complexity of the model, as well as the wide range of possible future conditions that affect biofuels industry development, we have not developed a single reference case but instead developed a set of specific scenarios that provide various contexts for our analyses. The purpose of this report is to describe the scenarios that comprise the BSM scenario library. At present, we have the following policy-focused scenarios in our library: minimal policies, ethanol-focused policies, equal access to policies, output-focused policies, technological-diversity-focused policies, and point-of-production-focused policies. This report describes each scenario, its policy settings, and general insights gained through use of the scenarios in analytic studies.

  20. Evacuation modeling trends

    CERN Document Server

    Abreu, Orlando; Alvear, Daniel

    2016-01-01

    This book presents an overview of modeling definitions and concepts, theory on human behavior and human performance data, available tools and simulation approaches, model development, and application and validation methods. It considers the data and research efforts needed to develop and incorporate functions for the different parameters into comprehensive escape and evacuation simulations, with a number of examples illustrating different aspects and approaches. After an overview of basic modeling approaches, the book discusses benefits and challenges of current techniques. The representation of evacuees is a central issue, including human behavior and the proper implementation of representational tools. Key topics include the nature and importance of the different parameters involved in ASET and RSET and the interactions between them. A review of the current literature on verification and validation methods is provided, with a set of recommended verification tests and examples of validation tests. The book c...

  1. New temperature model of the Netherlands from new data and novel modelling methodology

    Science.gov (United States)

    Bonté, Damien; Struijk, Maartje; Békési, Eszter; Cloetingh, Sierd; van Wees, Jan-Diederik

    2017-04-01

    Deep geothermal energy has grown in interest in Western Europe in recent decades, for direct use but also, as knowledge of the subsurface improves, for electricity generation. In the Netherlands, where the sector took off with the first system in 2005, geothermal energy is seen as a key player for a sustainable future. Knowledge of the subsurface temperature, together with the available flow from the reservoir, is an important factor that can determine the success of a geothermal energy project. To support the development of deep geothermal energy systems in the Netherlands, we made a first assessment of the subsurface temperature based on thermal data but also on geological elements (Bonté et al., 2012). An outcome of this work was ThermoGIS, which uses the temperature model. This work is a revision of the model used in ThermoGIS. The improvements over the first model are multiple: we have improved not only the dataset used for the calibration and the structural model, but also the methodology, through improved software (called b3t). The temperature dataset has been updated by integrating temperatures from newly accessible wells. The sedimentary description of the basin has been improved by using an updated and refined structural model and an improved lithological definition. A major improvement comes from the methodology used to perform the modelling: with b3t the calibration is made not only using the lithospheric parameters but also using the thermal conductivity of the sediments. The result is a much more accurate definition of the parameters for the model and a perfected handling of the calibration process. The outcome is a precise and improved temperature model of the Netherlands. The thermal conductivity variation in the sediments associated with the geometry of the layers is an important factor in temperature variations, and the influence of the Zechstein salt in the north of the country is important. In addition, the radiogenic heat

  2. Loopless nontrapping invasion-percolation model for fracking.

    Science.gov (United States)

    Norris, J Quinn; Turcotte, Donald L; Rundle, John B

    2014-02-01

    Recent developments in hydraulic fracturing (fracking) have enabled the recovery of large quantities of natural gas and oil from old, low-permeability shales. These developments include a change from low-volume, high-viscosity fluid injection to high-volume, low-viscosity injection. The injected fluid introduces distributed damage that provides fracture permeability for the extraction of the gas and oil. In order to model this process, we utilize a loopless nontrapping invasion percolation model previously introduced to model optimal polymers in a strongly disordered medium and to determine minimum energy spanning trees on a lattice. We perform numerical simulations on a two-dimensional square lattice and find significant differences from other percolation models. Additionally, we find that the growing fracture network satisfies both Horton-Strahler and Tokunaga network statistics. As with other invasion percolation models, our model displays burst dynamics, in which the cluster extends rapidly into a connected region. We introduce an alternative definition of bursts as a consecutive series of opened bonds whose strengths are all below a specified value. Using this definition of bursts, we find good agreement with a power-law frequency-area distribution. These results are generally consistent with the distribution of microseismicity observed during a high-volume frack.
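The growth rule described above (repeatedly open the weakest bond on the cluster boundary, but never one that would close a loop, so the cluster stays a tree) can be sketched in a few lines. This is an illustrative reimplementation in Python, not the authors' code; the lattice size, seed handling, and bond bookkeeping below are assumptions.

```python
import heapq
import random

def invasion_percolation(n, steps, seed=0):
    """Loopless nontrapping invasion percolation on an n x n square lattice.

    Each bond gets an i.i.d. random strength (generated lazily). At every step
    the weakest boundary bond is opened, but only if it reaches a site not yet
    invaded; skipping bonds between two invaded sites keeps the cluster a tree.
    """
    rng = random.Random(seed)
    strength = {}  # canonical bond key -> random strength

    def bond(a, b):
        key = (min(a, b), max(a, b))
        if key not in strength:
            strength[key] = rng.random()
        return strength[key]

    start = (n // 2, n // 2)
    invaded = {start}
    frontier = []  # min-heap of (strength, from_site, to_site)
    opened = []    # opened bonds in invasion order

    def push_neighbors(site):
        x, y = site
        for nb in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nb[0] < n and 0 <= nb[1] < n and nb not in invaded:
                heapq.heappush(frontier, (bond(site, nb), site, nb))

    push_neighbors(start)
    while frontier and len(opened) < steps:
        s, a, b = heapq.heappop(frontier)
        if b in invaded:
            continue  # would close a loop; discard (loopless growth)
        invaded.add(b)
        opened.append((a, b, s))
        push_neighbors(b)
    return invaded, opened

invaded, opened = invasion_percolation(41, 200)
```

Because every accepted bond invades exactly one new site, the cluster always contains one more site than opened bonds, which is the tree property the loopless rule enforces.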

  3. Toward an Expanded Definition of Adaptive Decision Making.

    Science.gov (United States)

    Phillips, Susan D.

    1997-01-01

    Uses the lifespan, life-space model to examine the definition of adaptive decision making. Reviews the existing definition of adaptive decision making as "rational" decision making and offers alternate perspectives on decision making with an emphasis on the implications of using the model. Makes suggestions for future theory, research,…

  4. Analyzing numerics of bulk microphysics schemes in community models: warm rain processes

    Directory of Open Access Journals (Sweden)

    I. Sednev

    2012-08-01

    Full Text Available Implementation of bulk cloud microphysics (BLK) parameterizations in atmospheric models of different scales has gained momentum in the last two decades. Utilization of these parameterizations in cloud-resolving models, when timesteps used for the host model integration are a few seconds or less, is justified from the point of view of cloud physics. However, mechanistic extrapolation of the applicability of BLK schemes to regional or global scales, with timesteps of hundreds up to thousands of seconds, affects both physics and numerics.

    We focus on the mathematical aspects of BLK schemes, such as stability and positive-definiteness. We provide a strict mathematical definition for the problem of warm rain formation. We also derive a general analytical condition (the SM-criterion) that remains valid regardless of the parameterizations used for warm rain processes in an explicit Eulerian time integration framework used to advance the finite-difference equations that govern warm rain formation processes in microphysics packages in the Community Atmosphere Model and the Weather Research and Forecasting model. The SM-criterion allows for the existence of a unique positive-definite stable mass-conserving numerical solution, imposes an additional constraint on the timestep permitted due to the microphysics (like the Courant-Friedrichs-Lewy condition for the advection equation), and prohibits the use of any additional assumptions not included in the strict mathematical definition of the problem under consideration.

    By analyzing the numerics of warm rain processes in the source codes of BLK schemes implemented in community models, we provide general guidelines regarding the appropriate choice of timesteps in these models.
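The kind of timestep constraint the SM-criterion formalizes can be illustrated with a toy linear sink term: an explicit Euler update loses positivity as soon as the timestep exceeds the process timescale. This sketch is an illustrative analogue only, not the SM-criterion itself and not the actual BLK source code.

```python
def euler_decay(q0, tau, dt, nsteps):
    """Explicit Euler for the linear sink dq/dt = -q/tau, a toy stand-in for a
    microphysical conversion term. The update q <- q*(1 - dt/tau) keeps the
    mixing ratio q positive only if dt < tau; for dt > tau the scheme produces
    unphysical negative values (loss of positive-definiteness).
    """
    q = q0
    history = [q]
    for _ in range(nsteps):
        q *= (1.0 - dt / tau)
        history.append(q)
    return history

stable = euler_decay(1.0, tau=100.0, dt=50.0, nsteps=10)     # dt/tau = 0.5
unstable = euler_decay(1.0, tau=100.0, dt=150.0, nsteps=10)  # dt/tau = 1.5
```

With dt/tau = 0.5 the sequence decays monotonically toward zero; with dt/tau = 1.5 the very first step overshoots through zero, which is exactly the failure mode a positivity criterion must exclude.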

  5. TAPWAT: Definition structure and applications for modelling drinking water treatment

    NARCIS (Netherlands)

    Versteegh JFM; Gaalen FW van; Rietveld LC; Evers EG; Aldenberg TA; Cleij P; Technische Universiteit Delft; LWD

    2001-01-01

    The 'Tool for the Analysis of the Production of drinking WATer' (TAPWAT) model has been developed for describing drinking-water quality in integral studies in the context of the Environmental Policy Assessment of the RIVM. The model consists of modules that represent individual steps in a treatment

  6. TAPWAT: Definition structure and applications for modelling drinking water treatment

    NARCIS (Netherlands)

    Versteegh JFM; van Gaalen FW; Rietveld LC; Evers EG; Aldenberg TA; Cleij P; LWD

    2001-01-01

    The TAPWAT model ('Tool for the Analysis of the Production of drinking WATer') has been developed to describe drinking-water quality for integral studies in the context of the Environmental Policy Assessment of the RIVM. The model consists of modules that represent the individual treatment steps of the

  7. Scope Definition

    DEFF Research Database (Denmark)

    Bjørn, Anders; Owsianiak, Mikołaj; Laurent, Alexis

    2018-01-01

    The scope definition is the second phase of an LCA. It determines what product systems are to be assessed and how this assessment should take place. This chapter teaches how to perform a scope definition. First, important terminology and key concepts of LCA are introduced. Then, the nine items...... making up a scope definition are elaborately explained: (1) Deliverables. (2) Object of assessment, (3) LCI modelling framework and handling of multifunctional processes, (4) System boundaries and completeness requirements, (5) Representativeness of LCI data, (6) Preparing the basis for the impact...... assessment, (7) Special requirements for system comparisons, (8) Critical review needs and (9) Planning reporting of results. The instructions relate both to the performance and reporting of a scope definition and are largely based on ILCD....

  8. ARMA Cholesky Factor Models for the Covariance Matrix of Linear Models.

    Science.gov (United States)

    Lee, Keunbaik; Baek, Changryong; Daniels, Michael J

    2017-11-01

    In longitudinal studies, serial dependence of repeated outcomes must be taken into account to make correct inferences on covariate effects. As such, care must be taken in modeling the covariance matrix. However, estimation of the covariance matrix is challenging because there are many parameters in the matrix and the estimated covariance matrix should be positive definite. To overcome these limitations, two Cholesky decomposition approaches have been proposed: modified Cholesky decomposition for autoregressive (AR) structure and moving average Cholesky decomposition for moving average (MA) structure. However, the correlations of repeated outcomes are often not captured parsimoniously using either approach separately. In this paper, we propose a class of flexible, nonstationary, heteroscedastic models that exploits the structure allowed by combining the AR and MA modeling of the covariance matrix, which we denote ARMACD. We analyze a recent lung cancer study to illustrate the power of our proposed methods.
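A minimal sketch of the modified Cholesky decomposition for AR structure mentioned above: a covariance matrix sigma is reduced to a diagonal innovation-variance matrix D by a unit lower-triangular matrix T whose rows hold the negated autoregressive coefficients, so that T sigma T' = D. This is the generic textbook construction, not the paper's ARMACD estimator; the test matrix below is illustrative.

```python
import numpy as np

def modified_cholesky(sigma):
    """Modified Cholesky decomposition T @ sigma @ T.T = D.

    T is unit lower-triangular; row j holds the negated coefficients of the
    regression of outcome j on outcomes 1..j-1 (the generalized autoregressive
    parameters). D holds the prediction-error (innovation) variances, which
    are positive whenever sigma is positive definite.
    """
    p = sigma.shape[0]
    T = np.eye(p)
    D = np.zeros(p)
    D[0] = sigma[0, 0]
    for j in range(1, p):
        phi = np.linalg.solve(sigma[:j, :j], sigma[:j, j])  # AR coefficients
        T[j, :j] = -phi
        D[j] = sigma[j, j] - sigma[:j, j] @ phi             # innovation variance
    return T, np.diag(D)

# Illustrative positive-definite "longitudinal" covariance matrix.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
sigma = A @ A.T + 4.0 * np.eye(4)
T, D = modified_cholesky(sigma)
```

The appeal of this parameterization is that any choice of the phi coefficients and positive innovation variances yields a valid positive-definite covariance matrix, which is why it is a natural starting point for the AR/MA combinations discussed in the abstract.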

  9. Serpentinization reaction pathways: implications for modeling approach

    Energy Technology Data Exchange (ETDEWEB)

    Janecky, D.R.

    1986-01-01

    Experimental seawater-peridotite reaction pathways to form serpentinites at 300 °C, 500 bars, can be accurately modeled using the EQ3/6 codes in conjunction with thermodynamic and kinetic data from the literature and unpublished compilations. These models provide both confirmation of experimental interpretations and more detailed insight into hydrothermal reaction processes within the oceanic crust. The accuracy of these models depends on careful evaluation of the aqueous speciation model, use of mineral compositions that closely reproduce compositions in the experiments, and definition of realistic reactive components in terms of composition, thermodynamic data, and reaction rates.

  10. Exclusion statistics and integrable models

    International Nuclear Information System (INIS)

    Mashkevich, S.

    1998-01-01

    The definition of exclusion statistics that was given by Haldane admits a 'statistical interaction' between distinguishable particles (multispecies statistics). For such statistics, thermodynamic quantities can be evaluated exactly; explicit expressions are presented here for cluster coefficients. Furthermore, single-species exclusion statistics is realized in one-dimensional integrable models of the Calogero-Sutherland type. The interesting questions of generalizing this correspondence to the higher-dimensional and the multispecies cases remain essentially open; however, our results provide some hints as to searches for the models in question

  11. A Model of Teacher Stress

    Science.gov (United States)

    Kyriacou, Chris; Sutcliffe, John

    1978-01-01

    A definition and model of teacher stress is presented which conceptualizes teacher stress as a response syndrome (anger or depression) mediated by (1) an appraisal of threat to the teacher's self-esteem or well-being and (2) coping mechanisms activated to reduce the perceived threat. (Author)

  12. Radiation budget measurement/model interface research

    Science.gov (United States)

    Vonderhaar, T. H.

    1981-01-01

    The NIMBUS 6 data were analyzed to form an up-to-date climatology of the Earth radiation budget as a basis for numerical model definition studies. Global maps depicting infrared emitted flux, net flux and albedo from processed NIMBUS 6 data for July, 1977, are presented. Zonal averages of net radiation flux for April, May, and June and zonal mean emitted flux and net flux for the December to January period are also presented. The development of two models is reported. The first is a statistical dynamical model with vertical and horizontal resolution. The second model is a two level global linear balance model. The results of time integration of the model up to 120 days, to simulate the January circulation, are discussed. Average zonal wind, meridional wind component, vertical velocity, and moisture budget are among the parameters addressed.

  13. Multiple Scenario Generation of Subsurface Models

    DEFF Research Database (Denmark)

    Cordua, Knud Skou

    of information is obeyed such that no unknown assumptions and biases influence the solution to the inverse problem. This involves a definition of the probabilistically formulated inverse problem, a discussion about how prior models can be established based on statistical information from sample models...... of the probabilistic formulation of the inverse problem. This function is based on an uncertainty model that describes the uncertainties related to the observed data. In a similar way, a formulation of the prior probability distribution that takes into account uncertainties related to the sample model statistics...... similar to observation uncertainties. We refer to the effect of these approximations as modeling errors. Examples that show how the modeling error is estimated are provided. Moreover, it is shown how these effects can be taken into account in the formulation of the posterior probability distribution...

  14. Verilog HDL digital design and modeling

    CERN Document Server

    Cavanagh, Joseph

    2007-01-01

    PREFACE INTRODUCTION History of HDL Verilog HDL IEEE Standard Features Assertion Levels OVERVIEW Design Methodologies Modulo-16 Synchronous Counter Four-Bit Ripple Adder Modules and Ports Designing a Test Bench for Simulation Construct Definitions Introduction to Dataflow Modeling Two-Input Exclusive-OR Gate Four 2-Input AND Gates With Delay Introduction to Behavioral Modeling Three-Input OR Gate Four-Bit Adder Modulo-16 Synchronous Counter Introduction to Structural Modeling Sum-of-Products Implementation Full Adder Four-Bit Ripple Adder Introduction to Mixed-Design Modeling Full Adder Problems LANGUAGE ELEMENTS Comments Identifiers Keywords Bidirectional Gates Charge Storage Strengths CMOS Gates Combinational Logic Gates Continuous Assignment Data Types Module Declaration MOS Switches Multiple-Way Branching Named Ev...

  15. Feature-based tolerancing for intelligent inspection process definition

    International Nuclear Information System (INIS)

    Brown, C.W.

    1993-07-01

    This paper describes a feature-based tolerancing capability that complements a geometric solid model with an explicit representation of conventional and geometric tolerances. This capability is focused on supporting an intelligent inspection process definition system. The feature-based tolerance model's benefits include advancing complete product definition initiatives (e.g., STEP -- Standard for Exchange of Product model data), supplying computer-integrated manufacturing applications (e.g., generative process planning and automated part programming) with product definition information, and assisting in the solution of measurement performance issues. A feature-based tolerance information model was developed based upon the notion of a feature's toleranceable aspects and describes an object-oriented scheme for representing and relating tolerance features, tolerances, and datum reference frames. For easy incorporation, the tolerance feature entities are interconnected with STEP solid model entities. This schema will explicitly represent the tolerance specification for mechanical products, support advanced dimensional measurement applications, and assist in tolerance-related methods divergence issues

  16. Towards Clone Detection in UML Domain Models

    DEFF Research Database (Denmark)

    Störrle, Harald

    2013-01-01

    Code clones (i.e., duplicate fragments of code) have been studied for long, and there is strong evidence that they are a major source of software faults. Anecdotal evidence suggests that this phenomenon occurs similarly in models, suggesting that model clones are as detrimental to model quality...... as they are to code quality. However, programming language code and visual models have significant differences that make it difficult to directly transfer notions and algorithms developed in the code clone arena to model clones. In this article, we develop and propose a definition of the notion of “model clone” based...... we believe that our approach advances the state of the art significantly, it is restricted to UML models, its results leave room for improvements, and there is no validation by field studies....

  17. Communicative Modelling of Cultural Transmission and Evolution Through a Holographic Cognition Model

    Directory of Open Access Journals (Sweden)

    Ambjörn Naeve

    2012-12-01

    Full Text Available This article presents communicative ways to model the transmission and evolution of the processes and artefacts of a culture as the result of ongoing interactions between its members, both at the tacit and the explicit level. The purpose is not to model the entire cultural process, but to provide semantically rich "conceptual placeholders" for modelling any cultural activity that is considered important enough within a certain context. The general purpose of communicative modelling is to create models that improve the quality of communication between people. In order to capture the subjective aspects of Gregory Bateson's definition of information as "a difference that makes a difference," the article introduces a Holographic Cognition Model that uses optical holography as an analogy for human cognition, with the object beam of holography corresponding to the first difference (the situation that the cognitive agent encounters), and the reference beam of holography corresponding to the subjective experiences and biases that the agent brings to the situation, which make the second difference (the interference/interpretation pattern) unique for each agent. By combining the HCM with a semantically rich and recursive form of process modelling, based on the SECI theory of knowledge creation, we arrive at a way to model the cultural transmission and evolution process that is consistent with the Unified Theory of Information (the Triple-C model with its emphasis on intra-, inter- and supra-actions).

  18. Modelling and Forecasting Multivariate Realized Volatility

    DEFF Research Database (Denmark)

    Halbleib, Roxana; Voev, Valeri

    2011-01-01

    This paper proposes a methodology for dynamic modelling and forecasting of realized covariance matrices based on fractionally integrated processes. The approach allows for flexible dependence patterns and automatically guarantees positive definiteness of the forecast. We provide an empirical appl...

  19. Computational Modeling of Culture's Consequences

    NARCIS (Netherlands)

    Hofstede, G.J.; Jonker, C.M.; Verwaart, T.

    2010-01-01

    This paper presents an approach to formalize the influence of culture on the decision functions of agents in social simulations. The key components are (a) a definition of the domain of study in the form of a decision model, (b) knowledge acquisition based on a dimensional theory of culture,

  20. Accurate phenotyping: Reconciling approaches through Bayesian model averaging.

    Directory of Open Access Journals (Sweden)

    Carla Chia-Ming Chen

    Full Text Available Genetic research into complex diseases is frequently hindered by a lack of clear biomarkers for phenotype ascertainment. Phenotypes for such diseases are often identified on the basis of clinically defined criteria; however such criteria may not be suitable for understanding the genetic composition of the diseases. Various statistical approaches have been proposed for phenotype definition; however our previous studies have shown that differences in phenotypes estimated using different approaches have substantial impact on subsequent analyses. Instead of obtaining results based upon a single model, we propose a new method, using Bayesian model averaging to overcome problems associated with phenotype definition. Although Bayesian model averaging has been used in other fields of research, this is the first study that uses Bayesian model averaging to reconcile phenotypes obtained using multiple models. We illustrate the new method by applying it to simulated genetic and phenotypic data for Kofendred personality disorder, an imaginary disease with several sub-types. Two separate statistical methods were used to identify clusters of individuals with distinct phenotypes: latent class analysis and grade of membership. Bayesian model averaging was then used to combine the two clusterings for the purpose of subsequent linkage analyses. We found that causative genetic loci for the disease produced higher LOD scores using model averaging than under either individual model separately. We attribute this improvement to consolidation of the cores of phenotype clusters identified using each individual method.
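The model-averaging step can be sketched generically: posterior model weights are computed from (log) model evidences, assuming a uniform prior over models, and are used to blend the class-membership probabilities produced by the two clusterings. The evidence values and two-class layout below are illustrative assumptions, not the paper's data or its exact weighting scheme.

```python
import math

def bma_weights(log_evidences):
    """Posterior model probabilities from log model evidences, assuming a
    uniform prior over models; the max-shift is the usual log-sum-exp trick
    to avoid overflow/underflow."""
    m = max(log_evidences)
    unnorm = [math.exp(le - m) for le in log_evidences]
    total = sum(unnorm)
    return [u / total for u in unnorm]

def average_assignments(probs_per_model, weights):
    """Model-averaged class-membership probabilities for one individual.

    probs_per_model holds one probability vector per model, in the same class
    order (in practice the cluster labels must be aligned across models first).
    """
    k = len(probs_per_model[0])
    return [sum(w * p[i] for w, p in zip(weights, probs_per_model))
            for i in range(k)]

# Hypothetical example: model 1 is exp(2) times better supported than model 2.
w = bma_weights([-1000.0, -1002.0])
avg = average_assignments([[0.9, 0.1], [0.6, 0.4]], w)
```

Because the weights sum to one and each input vector sums to one, the averaged memberships remain a valid probability vector, dominated by the better-supported model.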

  1. European project for a multinational macrosectoral model

    Energy Technology Data Exchange (ETDEWEB)

    d' Alcantara, G; Italianer, A

    1984-01-01

    This paper describes the HERMES project, a multinational macrosectoral European econometric modelling effort, sponsored by the Directorates General II (Economic and Financial Affairs), XII (Science, Research and Development), XVII (Energy) and the SOEC. The set-up of the model is sketched against the background of problems of growth, unemployment, inflation, trade balances, government balances and energy policy. Although the definitions of the variables and a complete specification of the model are given in the Appendix, the major features of the model are described extensively in the text. These include private and collective consumption (incl. a consumer demand system), the putty-clay production process, price and wage formation, sectoral bilateral trade flows and integrated energy economy modelling.

  2. A metric model of lambda calculus with guarded recursion

    DEFF Research Database (Denmark)

    Birkedal, Lars; Schwinghammer, Jan; Støvring, Kristian

    2010-01-01

    We give a model for Nakano’s typed lambda calculus with guarded recursive definitions in a category of metric spaces. By proving a computational adequacy result that relates the interpretation with the operational semantics, we show that the model can be used to reason about contextual equivalence....

  3. Deriving simulators for hybrid Chi models

    NARCIS (Netherlands)

    Beek, van D.A.; Man, K.L.; Reniers, M.A.; Rooda, J.E.; Schiffelers, R.R.H.

    2006-01-01

    The hybrid Chi language is a formalism for modeling, simulation and verification of hybrid systems. The formal semantics of hybrid Chi allows the definition of provably correct implementations for simulation, verification and real-time control. This paper discusses the principles of deriving an

  4. Generalized formal model of Big Data

    OpenAIRE

    Shakhovska, N.; Veres, O.; Hirnyak, M.

    2016-01-01

    This article dwells on the basic characteristic features of Big Data technologies. The existing definitions of the term "big data" are analyzed. The article proposes and describes the elements of a generalized formal model of big data, and analyzes the peculiarities of applying the proposed model components. The fundamental differences between Big Data technology and business analytics are described. Big Data is supported by the distributed file system Google File System ...

  5. Macroeconomic models and energy transition

    International Nuclear Information System (INIS)

    Douillard, Pierre; Le Hir, Boris; Epaulard, Anne

    2016-02-01

    As a new policy for energy transition has just been adopted, several questions emerge about the best way to reduce CO2 emissions, about the policies which enable this reduction, and about their costs and opportunities. This note discusses the contribution macro-economic models can make in this respect, notably in the definition of policies which trigger behaviour changes and of those which support the energy transition. The authors first discuss the stakes of the assessment of energy transition, then describe macro-economic models which can be used for such an assessment, and give and comment on some results of simulations performed for France using four of these models (Mesange, Numesis, ThreeME, and Imaclim-R France). The authors finally draw lessons about the way to use these models and to interpret their results within the frame of the energy transition

  6. Improving transcriptome construction in non-model organisms: integrating manual and automated gene definition in Emiliania huxleyi.

    Science.gov (United States)

    Feldmesser, Ester; Rosenwasser, Shilo; Vardi, Assaf; Ben-Dor, Shifra

    2014-02-22

    The advent of Next Generation Sequencing technologies and corresponding bioinformatics tools allows the definition of transcriptomes in non-model organisms. Non-model organisms are of great ecological and biotechnological significance, and consequently the understanding of their unique metabolic pathways is essential. Several methods that integrate de novo assembly with genome-based assembly have been proposed. Yet, there are many open challenges in defining genes, particularly where genomes are not available or incomplete. Despite the large numbers of transcriptome assemblies that have been performed, quality control of the transcript building process, particularly on the protein level, is rarely performed if ever. To test and improve the quality of the automated transcriptome reconstruction, we used manually defined and curated genes, several of them experimentally validated. Several approaches to transcript construction were utilized, based on the available data: a draft genome, high quality RNAseq reads, and ESTs. In order to maximize the contribution of the various data, we integrated methods including de novo and genome based assembly, as well as EST clustering. After each step a set of manually curated genes was used for quality assessment of the transcripts. The interplay between the automated pipeline and the quality control indicated which additional processes were required to improve the transcriptome reconstruction. We discovered that E. huxleyi has a very high percentage of non-canonical splice junctions, and relatively high rates of intron retention, which caused unique issues with the currently available tools. While individual tools missed genes and artificially joined overlapping transcripts, combining the results of several tools improved the completeness and quality considerably. The final collection, created from the integration of several quality control and improvement rounds, was compared to the manually defined set both on the DNA and

  7. Different use of medical terminology and culture-specific models of ...

    African Journals Online (AJOL)

    Where words were in the vocabulary of both groups, significant differences existed in the number and range of definitions, with many clinically significant discordances of definition being apparent. Some common examples relevant to paediatric respiratory problems are presented. Three culture-specific explanatory models ...

  8. Dual deep modeling: multi-level modeling with dual potencies and its formalization in F-Logic.

    Science.gov (United States)

    Neumayr, Bernd; Schuetz, Christoph G; Jeusfeld, Manfred A; Schrefl, Michael

    2018-01-01

    An enterprise database contains a global, integrated, and consistent representation of a company's data. Multi-level modeling facilitates the definition and maintenance of such an integrated conceptual data model in a dynamic environment of changing data requirements of diverse applications. Multi-level models transcend the traditional separation of class and object with clabjects as the central modeling primitive, which allows for a more flexible and natural representation of many real-world use cases. In deep instantiation, the number of instantiation levels of a clabject or property is indicated by a single potency. Dual deep modeling (DDM) differentiates between source potency and target potency of a property or association and supports the flexible instantiation and refinement of the property by statements connecting clabjects at different modeling levels. DDM comes with multiple generalization of clabjects, subsetting/specialization of properties, and multi-level cardinality constraints. Examples are presented using a UML-style notation for DDM together with UML class and object diagrams for the representation of two-level user views derived from the multi-level model. Syntax and semantics of DDM are formalized and implemented in F-Logic, supporting the modeler with integrity checks and rich query facilities.

  9. The Open Business Model: Understanding an Emerging Concept

    OpenAIRE

    Weiblen Tobias

    2014-01-01

    Along with the emergence of phenomena such as value co-creation, firm networks, and open innovation, open business models have achieved growing attention in research. Scholars from different fields use the open business model, largely without providing a definition. This has led to an overall lack of clarity of the concept itself. Based on a comprehensive review of scholarly literature in the field, commonalities and differences in the perceived nature of the open business model are carved ou...

  10. Modelling of the Deterioration of Reinforced Concrete Structures

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle

    Stochastic modelling of the deterioration of reinforced concrete structures is addressed in this paper on basis of a detailed modelling of corrosion initiation and corrosion cracking. It is proposed that modelling of the deterioration of concrete should be based on a sound understanding...... of the physical and chemical properties of the concrete. The relationship between rebar corrosion and crack width is investigated. A new service life definition based on evolution of the corrosion crack width is proposed....

  11. Defining generic architecture for Cloud IaaS provisioning model

    NARCIS (Netherlands)

    Demchenko, Y.; de Laat, C.; Mavrin, A.; Leymann, F.; Ivanov, I.; van Sinderen, M.; Shishkov, B.

    2011-01-01

    Infrastructure as a Service (IaaS) is one of the provisioning models for Clouds as defined in the NIST Cloud definition. Although widely used, current IaaS implementations and solutions do not have a common and well-defined architecture model. The paper attempts to define a generic architecture for

  12. Organizational Frustration: A Model and Review of the Literature.

    Science.gov (United States)

    Spector, Paul E.

    1978-01-01

    This discussion is divided into four parts: (1) the definition of frustration; (2) general behavioral reactions to frustration which have implications for organizations; (3) integration of the individual behavioral reactions into a model of organizational frustration; and (4) a review of the supporting evidence for the model. (Author)

  13. QSAR models for anti-androgenic effect - a preliminary study

    DEFF Research Database (Denmark)

    Jensen, Gunde Egeskov; Nikolov, Nikolai Georgiev; Wedebye, Eva Bay

    2011-01-01

    Three modelling systems (MultiCase (R), LeadScope (R) and MDL (R) QSAR) were used for construction of androgenic receptor antagonist models. There were 923-942 chemicals in the training sets. The models were cross-validated (leave-groups-out) with concordances of 77-81%, specificity of 78...... of the model for a particular application, balance of training sets, domain definition, and cut-offs for prediction interpretation should also be taken into account. Different descriptors in the modelling systems are illustrated with hydroxyflutamide and dexamethasone as examples (a non-steroid and a steroid...

  14. Persistence and extinction for stochastic logistic model with Levy noise and impulsive perturbation

    Directory of Open Access Journals (Sweden)

    Chun Lu

    2015-09-01

    Full Text Available This article investigates a stochastic logistic model with Levy noise and impulsive perturbation, in which both effects are taken into account simultaneously. This model is new, more feasible, and more in accordance with reality. The definition of a solution to a stochastic differential equation with Levy noise and impulsive perturbation is established. Based on this definition, we show that our model has a unique global positive solution and obtain its explicit expression. Sufficient conditions for extinction are established, as well as for nonpersistence in the mean, weak persistence, and stochastic permanence. The threshold between weak persistence and extinction is obtained.
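    The construction summarized above can be pictured with a minimal simulation sketch. This is an illustration, not the paper's actual formulation: it uses an Euler-Maruyama discretisation of a stochastic logistic equation, a simple compound-Poisson stand-in for the Levy jump component, and multiplicative impulses at fixed times. All parameter values and the `simulate_logistic` helper are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_logistic(x0=0.5, b=1.0, a=1.0, sigma=0.2,
                      lam=0.5, jump_scale=0.1,
                      impulses=None, T=10.0, dt=1e-3):
    """Euler-Maruyama path of a stochastic logistic model
    dX = X(b - aX) dt + sigma X dW + jump term,
    with multiplicative impulses X(t_k+) = (1 + c_k) X(t_k)."""
    impulses = impulses or {}  # {time: c_k}, c_k > -1
    n = int(T / dt)
    x = np.empty(n + 1)
    x[0] = x0
    imp_steps = {int(round(t / dt)): c for t, c in impulses.items()}
    for i in range(n):
        dw = rng.normal(0.0, np.sqrt(dt))
        # compound-Poisson stand-in for the Levy jump component
        jumps = rng.poisson(lam * dt)
        dj = jump_scale * rng.normal(size=jumps).sum() if jumps else 0.0
        x[i + 1] = x[i] + x[i] * (b - a * x[i]) * dt \
                   + sigma * x[i] * dw + x[i] * dj
        x[i + 1] = max(x[i + 1], 1e-12)  # keep the path positive
        if (i + 1) in imp_steps:          # impulsive perturbation
            x[i + 1] *= 1.0 + imp_steps[i + 1]
    return x

path = simulate_logistic(impulses={2.0: 0.3, 5.0: -0.4})
print(path[-1])
```

    With positive drift `b` the simulated path stays bounded away from zero, in line with the persistence regime the abstract describes; driving `b` negative pushes the path toward extinction.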

  15. Phenomenological model of nanocluster in polymer matrix

    International Nuclear Information System (INIS)

    Oksengendler, B.L.; Turaeva, N.N.; Azimov, J.; Rashidova, S.Sh.

    2010-01-01

    The phenomenological model of matrix nanoclusters is presented, based on the Woods-Saxon potential used in nuclear physics. In the framework of this model the following problems have been considered: calculation of the width of the diffusive layer between nanocluster and matrix, definition of the Tamm surface electronic states taking into account the diffusive layer width, and derivation of an expression for the specific magnetic moment of nanoclusters taking into account the interface width. (authors)

  16. Parameters and error of a theoretical model

    International Nuclear Information System (INIS)

    Moeller, P.; Nix, J.R.; Swiatecki, W.

    1986-09-01

    We propose a definition for the error of a theoretical model of the type whose parameters are determined from adjustment to experimental data. By applying a standard statistical method, the maximum-likelihood method, we derive expressions for both the parameters of the theoretical model and its error. We investigate the derived equations by solving them for simulated experimental and theoretical quantities generated by use of random number generators. 2 refs., 4 tabs
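    The procedure described in the abstract can be sketched on a toy model (this is not the authors' derivation; the linear model, parameter values, and the `ml_fit` helper are hypothetical): simulate "experimental" data with hidden noise, then recover both the model parameter and the model error by maximum likelihood.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated "experimental" data: a one-parameter linear model with
# hidden Gaussian noise, standing in for the adjusted quantities.
x = np.linspace(1.0, 10.0, 50)
y = 2.5 * x + rng.normal(0.0, 0.8, size=x.size)

def ml_fit(x, y):
    """Maximum-likelihood estimates for y = theta*x + eps, eps ~ N(0, s^2).
    theta_hat minimizes the squared residuals; s_hat is the ML estimate
    of the model error (note: no n-1 correction in the ML variant)."""
    theta = np.dot(x, y) / np.dot(x, x)
    resid = y - theta * x
    s = np.sqrt(np.mean(resid ** 2))
    return theta, s

theta_hat, err_hat = ml_fit(x, y)
print(theta_hat, err_hat)
```

    The recovered `theta_hat` is close to the true value 2.5 and `err_hat` close to the injected noise level 0.8, mirroring the abstract's check of the derived equations against simulated quantities.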

  17. A stage-based model of design teaching

    DEFF Research Database (Denmark)

    Beier, Sofie

    2014-01-01

    With a focus on the teaching of design students in higher education, the article will present a teaching approach model that follows the stages of the design process. The model suggests that at the Definition stage, the supervisor can focus on leading the student into a more thorough elaboration ...... apply an approach inspired by the master–apprentice relationship, where the student learns by observing the master at work....

  18. The Langevin method and Hubbard-like models

    International Nuclear Information System (INIS)

    Gross, M.; Hamber, H.

    1989-01-01

    The authors reexamine the difficulties associated with application of the Langevin method to numerical simulation of models with non-positive definite statistical weights, including the Hubbard model. They show how to avoid the violent crossing of the zeroes of the weight and how to move those nodes away from the real axis. However, it still appears necessary to keep track of the sign (or phase) of the weight

  19. Business process modeling in healthcare.

    Science.gov (United States)

    Ruiz, Francisco; Garcia, Felix; Calahorra, Luis; Llorente, César; Gonçalves, Luis; Daniel, Christel; Blobel, Bernd

    2012-01-01

    The importance of the process point of view is not restricted to a specific enterprise sector. In the field of health, as a result of the nature of the service offered, health institutions' processes are also the basis for decision making, which is focused on achieving their objective of providing quality medical assistance. In this chapter the application of business process modelling, using the Business Process Modelling Notation (BPMN) standard, is described. The main challenges of business process modelling in healthcare are the definition of healthcare processes, the multi-disciplinary nature of healthcare, the flexibility and variability of the activities involved in healthcare processes, the need for interoperability between multiple information systems, and the continuous updating of scientific knowledge in healthcare.

  20. An introduction to multilevel flow modeling

    DEFF Research Database (Denmark)

    Lind, Morten

    2011-01-01

    Multilevel Flow Modeling (MFM) is a methodology for functional modeling of industrial processes on several interconnected levels of means-end and part-whole abstractions. The basic idea of MFM is to represent an industrial plant as a system which provides the means required to serve purposes in i...... in detail by a water mill example. The overall reasoning capabilities of MFM and its basis in cause-effect relations are also explained. The appendix contains an overview of MFM concepts and their definitions....

  1. The (Mathematical) Modeling Process in Biosciences.

    Science.gov (United States)

    Torres, Nestor V; Santos, Guido

    2015-01-01

    In this communication, we introduce a general framework and discussion on the role of models and the modeling process in the field of biosciences. The objective is to sum up the common procedures during the formalization and analysis of a biological problem from the perspective of Systems Biology, which approaches the study of biological systems as a whole. We begin by presenting the definitions of (biological) system and model. Particular attention is given to the meaning of mathematical model within the context of biology. Then, we present the process of modeling and analysis of biological systems. Three stages are described in detail: conceptualization of the biological system into a model, mathematical formalization of the previous conceptual model and optimization and system management derived from the analysis of the mathematical model. All along this work the main features and shortcomings of the process are analyzed and a set of rules that could help in the task of modeling any biological system are presented. Special regard is given to the formative requirements and the interdisciplinary nature of this approach. We conclude with some general considerations on the challenges that modeling is posing to current biology.

  2. Tolerance of uncertainty: Conceptual analysis, integrative model, and implications for healthcare.

    Science.gov (United States)

    Hillen, Marij A; Gutheil, Caitlin M; Strout, Tania D; Smets, Ellen M A; Han, Paul K J

    2017-05-01

    Uncertainty tolerance (UT) is an important, well-studied phenomenon in health care and many other important domains of life, yet its conceptualization and measurement by researchers in various disciplines have varied substantially and its essential nature remains unclear. The objectives of this study were to: 1) analyze the meaning and logical coherence of UT as conceptualized by developers of UT measures, and 2) develop an integrative conceptual model to guide future empirical research regarding the nature, causes, and effects of UT. A narrative review and conceptual analysis of 18 existing measures of Uncertainty and Ambiguity Tolerance was conducted, focusing on how measure developers in various fields have defined both the "uncertainty" and "tolerance" components of UT-both explicitly through their writings and implicitly through the items constituting their measures. Both explicit and implicit conceptual definitions of uncertainty and tolerance vary substantially and are often poorly and inconsistently specified. A logically coherent, unified understanding or theoretical model of UT is lacking. To address these gaps, we propose a new integrative definition and multidimensional conceptual model that construes UT as the set of negative and positive psychological responses-cognitive, emotional, and behavioral-provoked by the conscious awareness of ignorance about particular aspects of the world. This model synthesizes insights from various disciplines and provides an organizing framework for future research. We discuss how this model can facilitate further empirical and theoretical research to better measure and understand the nature, determinants, and outcomes of UT in health care and other domains of life. Uncertainty tolerance is an important and complex phenomenon requiring more precise and consistent definition. An integrative definition and conceptual model, intended as a tentative and flexible point of departure for future research, adds needed breadth

  3. Ethics in Librarianship: A Management Model.

    Science.gov (United States)

    Du Mont, Rosemary Ruhig

    1991-01-01

    Presents a management model of ethical decision making in librarianship. Highlights include a definition of ethics; ethical concerns in information professions; the concept of social responsibility; ethical dimensions of decision making, including access to information and hiring decisions; ethical considerations for managers; and strategies for…

  4. PATHS groundwater hydrologic model

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, R.W.; Schur, J.A.

    1980-04-01

    A preliminary evaluation capability for two-dimensional groundwater pollution problems was developed as part of the Transport Modeling Task for the Waste Isolation Safety Assessment Program (WISAP). Our approach was to use the data limitations as a guide in setting the level of modeling detail. PATHS Groundwater Hydrologic Model is the first level (simplest) idealized hybrid analytical/numerical model for two-dimensional, saturated groundwater flow and single component transport; homogeneous geology. This document consists of the description of the PATHS groundwater hydrologic model. The preliminary evaluation capability prepared for WISAP, including the enhancements that were made because of the authors' experience using the earlier capability is described. Appendixes A through D supplement the report as follows: complete derivations of the background equations are provided in Appendix A. Appendix B is a comprehensive set of instructions for users of PATHS. It is written for users who have little or no experience with computers. Appendix C is for the programmer. It contains information on how input parameters are passed between programs in the system. It also contains program listings and test case listing. Appendix D is a definition of terms.

  5. Continuation-like semantics for modeling structural process anomalies

    Directory of Open Access Journals (Sweden)

    Grewe Niels

    2012-09-01

    Full Text Available Abstract Background Biomedical ontologies usually encode knowledge that applies always or at least most of the time, that is, in normal circumstances. But for some applications, like phenotype ontologies, it is becoming increasingly important to represent information about aberrations from a norm. These aberrations may be modifications of physiological structures, but also modifications of biological processes. Methods To facilitate precise definitions of process-related phenotypes, such as delayed eruption of the primary teeth or disrupted ocular pursuit movements, I introduce a modeling approach that draws inspiration from the use of continuations in the analysis of programming languages and apply a similar idea to ontological modeling. This approach characterises processes by describing their outcome up to a certain point and the way they will continue in the canonical case. Definitions of process types are then given in terms of their continuations, and anomalous phenotypes are defined by their differences from the canonical definitions. Results The resulting model is capable of accurately representing structural process anomalies. It allows distinguishing between different kinds of anomaly (delays, interruptions), gives identity criteria for interrupted processes, and explains why normal and anomalous process instances can be subsumed under a common type, thus establishing the connection between canonical and anomalous process-related phenotypes. Conclusion This paper shows how to give semantically rich definitions of process-related phenotypes. These allow the application areas of phenotype ontologies to be expanded beyond literature annotation and the establishment of genotype-phenotype associations to the detection of anomalies in suitably encoded datasets.

  6. Model for paramagnetic Fermi systems

    International Nuclear Information System (INIS)

    Ainsworth, T.L.; Bedell, K.S.; Brown, G.E.; Quader, K.F.

    1983-01-01

    We develop a model for paramagnetic Fermi liquids. This model has both direct and induced interactions, the latter including both density-density and current-current response. The direct interactions are chosen to reproduce the Fermi liquid parameters F_0^s, F_0^a, F_1^s and to satisfy the forward scattering sum rule. The F_1^a and the F_l^{s,a} for l>1 are determined self-consistently by the induced interactions; they are checked against experimental determinations. The model is applied in detail to liquid 3He, using data from spin-echo experiments, sound attenuation, and the velocities of first and zero sound. Consistency with experiments gives definite preferences for values of m. The model is also applied to paramagnetic metals. Arguments are given that this model should provide a basis for calculating effects of magnetic fields

  7. Unpacking prevention capacity: an intersection of research-to-practice models and community-centered models.

    Science.gov (United States)

    Flaspohler, Paul; Duffy, Jennifer; Wandersman, Abraham; Stillman, Lindsey; Maras, Melissa A

    2008-06-01

    Capacity is a complex construct that lacks definitional clarity. Little has been done to define capacity, explicate components of capacity, or explore the development of capacity in prevention. This article represents an attempt to operationalize capacity and distinguish among types and levels of capacity as they relate to dissemination and implementation through the use of a taxonomy of capacity. The development of the taxonomy was informed by the capacity literature from two divergent models in the field: research-to-practice (RTP) models and community-centered (CC) models. While these models differ in perspective and focus, both emphasize the importance of capacity to the dissemination and sustainability of prevention innovations. Based on the review of the literature, the taxonomy differentiates the concepts of capacity among two dimensions: level (individual, organizational, and community levels) and type (general capacity and innovation-specific capacity). The proposed taxonomy can aid in understanding the concept of capacity and developing methods to support the implementation and sustainability of prevention efforts in novel settings.

  8. Categorical model of structural operational semantics for imperative language

    Directory of Open Access Journals (Sweden)

    William Steingartner

    2016-12-01

    Full Text Available Definition of programming languages consists of the formal definition of syntax and semantics. One of the most popular semantic methods used in various stages of software engineering is structural operational semantics. It describes program behavior in the form of state changes after the execution of elementary steps of the program. This feature makes structural operational semantics useful for the implementation of programming languages and also for verification purposes. In our paper we present a new approach to structural operational semantics. We model the behavior of programs in a category of states, where objects are states (an abstraction of computer memory) and morphisms model state changes (the execution of a program in elementary steps). The advantage of using a categorical model is its exact mathematical structure, with many useful proved properties, and its graphical illustration of program behavior as a path, i.e. a composition of morphisms. Our approach is able to accentuate the dynamics of structural operational semantics. For simplicity, we assume that data are intuitively typed. Our model is not only a new model of the structural operational semantics of imperative programming languages but can also serve educational purposes.
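    The idea of morphisms modelling elementary state changes can be sketched in a few lines. This is a toy illustration of the general idea, not the paper's categorical formalism: states are dictionaries, each statement is a function from state to state, and running a program is composing those functions, i.e. a path in the category of states.

```python
from functools import reduce

# Objects: states (dicts mapping variables to values).
# Morphisms: functions State -> State, one per elementary statement.
def assign(var, expr):
    """Morphism for the assignment `var := expr(state)`."""
    def step(state):
        new = dict(state)          # states are immutable objects
        new[var] = expr(state)
        return new
    return step

def compose(*steps):
    """Composition of morphisms = the program's execution path."""
    return lambda state: reduce(lambda s, f: f(s), steps, state)

# x := 1; y := x + 2; x := x * y
program = compose(
    assign("x", lambda s: 1),
    assign("y", lambda s: s["x"] + 2),
    assign("x", lambda s: s["x"] * s["y"]),
)

final = program({})
print(final)  # {'x': 3, 'y': 3}
```

    Because each step returns a fresh state, the intermediate objects of the path remain observable, which is what makes this style convenient for teaching operational semantics.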

  9. Archetype modeling methodology.

    Science.gov (United States)

    Moner, David; Maldonado, José Alberto; Robles, Montserrat

    2018-03-01

    Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although there exist many experiences about using archetypes in the literature, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to develop quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the involved participants and tools. It also includes the description of the possible strategies to organize the modeling process. The proposed methodology is inspired by existing best practices of CIMs, software and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. The application of the methodology provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can also be a reference for CIM development using any other formalism.

  10. Homogenized thermal conduction model for particulate foods

    OpenAIRE

    Chinesta , Francisco; Torres , Rafael; Ramón , Antonio; Rodrigo , Mari Carmen; Rodrigo , Miguel

    2002-01-01

    International audience; This paper deals with the definition of an equivalent thermal conductivity for particulate foods. A homogenized thermal model is used to assess the effect of particulate spatial distribution and of differences in thermal conductivities. We prove that the spatial average of the conductivity can be used in a homogenized heat transfer model if the conductivity differences among the food components are not very large, usually the highest conductivity ratio between the foods ...

  11. Formulation of consumables management models: Mission planning processor payload interface definition

    Science.gov (United States)

    Torian, J. G.

    1977-01-01

    Consumables models required for the mission planning and scheduling function are formulated. The relation of the models to prelaunch, onboard, ground support, and postmission functions for the space transportation systems is established. An analytical model consisting of an orbiter planning processor with a consumables data base is developed. A method of recognizing potential constraint violations in both the planning and flight operations functions is presented, along with a flight data file for storage/retrieval of information over an extended period, which interfaces with a flight operations processor for monitoring of the actual flights.

  12. Flow modelling in fractured aquifers, development of multi-continua model (direct and inverse problems) and application to the CEA/Cadarache site

    International Nuclear Information System (INIS)

    Cartalade, Alain

    2002-01-01

    This research thesis concerns the modelling of aquifer flows under the CEA/Cadarache site. The author reports the implementation of a numerical simulation tool adapted to large scale flows in fractured media, and its application to the Cadarache nuclear site. After a description of the site geological and hydrogeological characteristics, the author presents the conceptual model on which the modelling is based, presents the inverse model which allows a better definition of parameters, reports the validation of the inverse approach by means of synthetic and semi-synthetic cases. Then, he reports experiments and simulation of the Cadarache site

  13. Groundwater Model Validation

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed E. Hassan

    2006-01-24

    Models have an inherent uncertainty. The difficulty in fully characterizing the subsurface environment makes uncertainty an integral component of groundwater flow and transport models, which dictates the need for continuous monitoring and improvement. Building and sustaining confidence in closure decisions and monitoring networks based on models of subsurface conditions require developing confidence in the models through an iterative process. The definition of model validation is postulated as a confidence building and long-term iterative process (Hassan, 2004a). Model validation should be viewed as a process not an end result. Following Hassan (2004b), an approach is proposed for the validation process of stochastic groundwater models. The approach is briefly summarized herein and detailed analyses of acceptance criteria for stochastic realizations and of using validation data to reduce input parameter uncertainty are presented and applied to two case studies. During the validation process for stochastic models, a question arises as to the sufficiency of the number of acceptable model realizations (in terms of conformity with validation data). Using a hierarchical approach to make this determination is proposed. This approach is based on computing five measures or metrics and following a decision tree to determine if a sufficient number of realizations attain satisfactory scores regarding how they represent the field data used for calibration (old) and used for validation (new). The first two of these measures are applied to hypothetical scenarios using the first case study and assuming field data consistent with the model or significantly different from the model results. In both cases it is shown how the two measures would lead to the appropriate decision about the model performance. Standard statistical tests are used to evaluate these measures with the results indicating they are appropriate measures for evaluating model realizations. The use of validation
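    The first acceptance measure of the hierarchical approach described above can be pictured as a simple pass-rate computation over stochastic realizations. This is a hypothetical stand-in, since the abstract does not specify the five metrics: a realization is counted as acceptable when its RMS deviation from the field data stays within a tolerance.

```python
import numpy as np

rng = np.random.default_rng(2)

def acceptable_fraction(realizations, field_data, tol=1.0):
    """Illustrative acceptance metric for stochastic realizations:
    a realization passes when its RMS deviation from the field data
    is within `tol`; return the fraction of realizations that pass."""
    rms = np.sqrt(np.mean((realizations - field_data) ** 2, axis=1))
    return np.mean(rms <= tol)

# 200 synthetic realizations scattered around a "true" head field
truth = np.linspace(10.0, 5.0, 25)
reals = truth + rng.normal(0.0, 0.8, size=(200, truth.size))
frac = acceptable_fraction(reals, truth, tol=1.0)
print(frac)
```

    In a decision-tree setting like the one proposed, such a fraction would be compared against a threshold to decide whether enough realizations conform with the validation data before proceeding to the next measure.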

  14. Matrix models with non-even potentials

    International Nuclear Information System (INIS)

    Marzban, C.; Raju Viswanathan, R.

    1990-07-01

    We study examples of hermitian 1-matrix models with even and odd terms present in the potential. A definition of criticality is presented which in these cases leads to multicritical models falling into the same universality classes as those of the purely even potentials. We also show that, in our examples, for polynomial potentials ending in odd powers (unbounded) the coupling constants, in addition to their expected real critical values, also admit critical values which alternate between imaginary/real values in the odd/even terms. We find that, remarkably, the ensuing statistical models are insensitive to the real/imaginary nature of these critical values. This feature may be of relevance in the recently-studied connection between matrix models and the moduli space of Riemann surfaces. (author). 9 refs

  15. What Makes Hydrologic Models Differ? Using SUMMA to Systematically Explore Model Uncertainty and Error

    Science.gov (United States)

    Bennett, A.; Nijssen, B.; Chegwidden, O.; Wood, A.; Clark, M. P.

    2017-12-01

    Model intercomparison experiments have been conducted to quantify the variability introduced during the model development process, but have had limited success in identifying the sources of this model variability. The Structure for Unifying Multiple Modeling Alternatives (SUMMA) has been developed as a framework which defines a general set of conservation equations for mass and energy as well as a common core of numerical solvers along with the ability to set options for choosing between different spatial discretizations and flux parameterizations. SUMMA can be thought of as a framework for implementing meta-models which allows for the investigation of the impacts of decisions made during the model development process. Through this flexibility we develop a hierarchy of definitions which allows for models to be compared to one another. This vocabulary allows us to define the notion of weak equivalence between model instantiations. Through this weak equivalence we develop the concept of model mimicry, which can be used to investigate the introduction of uncertainty and error during the modeling process as well as provide a framework for identifying modeling decisions which may complement or negate one another. We instantiate SUMMA instances that mimic the behaviors of the Variable Infiltration Capacity (VIC) model and the Precipitation Runoff Modeling System (PRMS) by choosing modeling decisions which are implemented in each model. We compare runs from these models and their corresponding mimics across the Columbia River Basin located in the Pacific Northwest of the United States and Canada. From these comparisons, we are able to determine the extent to which model implementation has an effect on the results, as well as determine the changes in sensitivity of parameters due to these implementation differences. By examining these changes in results and sensitivities we can attempt to postulate changes in the modeling decisions which may provide better estimation of

  16. Model validation: a systemic and systematic approach

    International Nuclear Information System (INIS)

    Sheng, G.; Elzas, M.S.; Cronhjort, B.T.

    1993-01-01

    The term 'validation' is used ubiquitously in association with the modelling activities of numerous disciplines, including the social, political, natural, and physical sciences, and engineering. There is, however, a wide range of definitions, which gives rise to very different interpretations of what activities the process involves. Analyses of results from the present large international effort in modelling radioactive waste disposal systems illustrate the urgent need to develop a common approach to model validation. Some possible explanations are offered to account for the present state of affairs. The methodology developed treats model validation and code verification in a systematic fashion. In fact, this approach may be regarded as a comprehensive framework to assess the adequacy of any simulation study. (author)

  17. Formalization of Database Systems -- and a Formal Definition of {IMS}

    DEFF Research Database (Denmark)

    Bjørner, Dines; Løvengreen, Hans Henrik

    1982-01-01

    Drawing upon an analogy between Programming Language Systems and Database Systems, we outline the requirements that architectural specifications of database systems must fulfill, and argue that only formal, mathematical definitions may satisfy these. Then we illustrate some aspects and touch upon some uses of formal definitions of data models and database management systems. A formal model of IMS will carry this discussion. Finally we survey some of the existing literature on formal definitions of database systems. The emphasis will be on constructive definitions in the denotational semantics style of the VDM: Vienna Development Method. The role of formal definitions in international standardisation efforts is briefly mentioned....

  18. Efficient methodology for multibody simulations with discontinuous changes in system definition

    International Nuclear Information System (INIS)

    Mukherjee, Rudranarayan M.; Anderson, Kurt S.

    2007-01-01

    A new method is presented for accurately and efficiently simulating multi-scale multibody systems with discontinuous changes in system definitions, as encountered in adaptive switching between models with different resolutions as well as models with different system topologies. An example of a model resolution change is the transition of a system from a discrete particle model to a reduced-order articulated multi-rigid-body model. The discontinuous changes in system definition may be viewed as an instantaneous change (release or impulsive application) of the system constraints. The method uses a spatial impulse-momentum formulation in a divide-and-conquer scheme. The approach utilizes a hierarchic assembly-disassembly process by traversing the system topology in a binary tree map to solve for the jumps in the system generalized speeds and the constraint impulsive loads in linear and logarithmic cost in serial and parallel implementations, respectively. The method is applicable to systems in serial chain as well as kinematical loop topologies. The coupling between the unilateral and bilateral constraints is handled efficiently through the use of kinematic joint definitions. The equations of motion for the system are produced in a hierarchic sub-structured form. This has the advantage that changes in sub-structure definitions/models result in changes to the system equations only within the associated sub-structure. This allows for significant changes in model types and definitions without having to reformulate the equations for the whole system

  19. 3D active shape and appearance models in cardiac image analysis

    NARCIS (Netherlands)

    Lelieveldt, B.P.F.; Frangi, A.F.; Mitchell, S.C.; Assen, van H.C.; Ordás, S.; Reiber, J.H.C.; Sonka, M.; Paragios, N.; Chen, Y.; Faugeras, O.

    2006-01-01

    This chapter introduces statistical shape- and appearance models and their biomedical applications. Three- and four-dimensional extension of these models are the main focus. Approaches leading to automated landmark definition are introduced and discussed. The applicability is underlined by

  20. EIA model documentation: World oil refining logistics demand model, "WORLD" reference manual. Version 1.1

    Energy Technology Data Exchange (ETDEWEB)

    1994-04-11

    This manual is intended primarily for use as a reference by analysts applying the WORLD model to regional studies. It also provides overview information on WORLD features of potential interest to managers and analysts. Broadly, the manual covers WORLD model features in progressively increasing detail. Section 2 provides an overview of the WORLD model, how it has evolved, what its design goals are, what it produces, and where it can be taken with further enhancements. Section 3 reviews model management, covering data sources, managing over-optimization, calibration and seasonality, check-points for case construction, and common errors. Section 4 describes the WORLD system in detail, including: data and program systems in overview; details of mainframe and PC program control and files; model generation, size management, debugging, and error analysis; use with different optimizers; and reporting and results analysis. Section 5 provides a detailed description of every WORLD model data table, covering model controls, case and technology data. Section 6 goes into the details of the WORLD matrix structure. It provides an overview, describes how regional definitions are controlled, and defines the naming conventions for all model rows, columns, right-hand sides, and bounds. It also includes a discussion of the formulation of product blending and specifications in WORLD. Several appendixes supplement the main sections.

  1. Business Model Innovation to Create and Capture Resource Value in Future Circular Material Chains

    Directory of Open Access Journals (Sweden)

    Göran Roos

    2014-03-01

    Full Text Available This article briefly discusses the origins and development of the business model concept, resulting in a high-level definition. Against this backdrop, frameworks from the literature around green business models are presented, with examples of green business models and the business model innovation process. The article then discusses the origins and meaning of different "green" concepts relevant to the circular value chain, concluding with a high-level definition. The article finally outlines the process by which a business model for a circular value chain can be developed, taking into account the social dilemma that exists in these types of situations. The article concludes with the specific questions that need to be answered in order to create an appropriate business model for a circular value chain.

  2. Consultation Models Revisited

    International Nuclear Information System (INIS)

    Fawaz, S.; Khan, Zulfiquar A.; Mossa, Samir Y.

    2006-01-01

    A new definition is proposed for analyzing the consultation in primary health care. It integrates other models of consultation and provides a framework by which general practitioners can apply the principles of consultation, using communication skills to reconcile the respective agendas and autonomy of both doctor and patient into a negotiated, agreed plan that includes both management of health problems and health promotion. The success of a consultation depends on time and on the mutual cooperation between patient and doctor shown in the doctor-patient relationship. (author)

  3. Distributions with given marginals and statistical modelling

    CERN Document Server

    Fortiana, Josep; Rodriguez-Lallena, José

    2002-01-01

    This book contains a selection of the papers presented at the meeting `Distributions with given marginals and statistical modelling', held in Barcelona (Spain), July 17-20, 2000. In 24 chapters, this book covers topics such as the theory of copulas and quasi-copulas, the theory and compatibility of distributions, models for survival distributions and other well-known distributions, time series, categorical models, definition and estimation of measures of dependence, monotonicity and stochastic ordering, shape and separability of distributions, hidden truncation models, diagonal families, orthogonal expansions, tests of independence, and goodness of fit assessment. These topics share the use and properties of distributions with given marginals, this being the fourth specialised text on this theme. The innovative aspect of the book is the inclusion of statistical aspects such as modelling, Bayesian statistics, estimation, and tests.

  4. Business model elements impacting cloud computing adoption

    DEFF Research Database (Denmark)

    Bogataj, Kristina; Pucihar, Andreja; Sudzina, Frantisek

    The paper presents a proposed research framework for identification of business model elements impacting Cloud Computing Adoption. We provide a definition of main Cloud Computing characteristics, discuss previous findings on factors impacting Cloud Computing Adoption, and investigate technology a...

  5. Financial Modelling: Where to go? with an illustration for portfolio management

    NARCIS (Netherlands)

    W.G.P.M. Hallerbach (Winfried); J. Spronk (Jaap)

    1997-01-01

    The definition of Financial Modelling chosen by the EURO working group on financial modelling is ‘the development and implementation of tools supporting firms, investors, intermediaries, governments and others in their financial-economic decision making, including the validation of the

  6. Towards a Unified Business Model Vocabulary: A Proposition of Key Constructs

    OpenAIRE

    Mettler, Tobias

    2014-01-01

    The design of business models is of decisive importance and as such it has been a major research theme in service and particularly electronic markets. Today, different definitions of the term and ideas of core constructs of business models exist. In this paper we present a unified vocabulary for business models that builds upon the elementary perception of three existing, yet very dissimilar ontologies for modeling the essence of a business. The resulting unified business model vocabulary not...

  7. Mathematical modelling in solid mechanics

    CERN Document Server

    Sofonea, Mircea; Steigmann, David

    2017-01-01

    This book presents new research results in multidisciplinary fields of mathematical and numerical modelling in mechanics. The chapters treat the following topics: mathematical modelling in solid, fluid and contact mechanics; nonconvex variational analysis with emphasis on nonlinear solid and structural mechanics; numerical modelling of problems with non-smooth constitutive laws, approximation of variational and hemivariational inequalities, numerical analysis of discrete schemes, numerical methods and the corresponding algorithms, and applications to mechanical engineering; numerical aspects of non-smooth mechanics, with emphasis on developing accurate and reliable computational tools; mechanics of fibre-reinforced materials; behaviour of elasto-plastic materials accounting for microstructural defects; definition of structural defects based on differential-geometry concepts or on an atomistic basis; interaction between phase transformation and dislocations at the nano-scale; energetic arguments; bifurcation and post-buckling a...

  8. Activated sludge models ASM1, ASM2, ASM2d and ASM3

    DEFF Research Database (Denmark)

    Henze, Mogens; Gujer, W.; Mino, T.

    This book has been produced to give a total overview of the Activated Sludge Model (ASM) family at the start of 2000 and to give the reader easy access to the different models in their original versions. It thus presents ASM1, ASM2, ASM2d and ASM3 together for the first time. Modelling of activated sludge processes has become a common part of the design and operation of wastewater treatment plants. Today models are being used in design, control, teaching and research. Contents: ASM3: Introduction, Comparison of ASM1 and ASM3, ASM3: Definition of compounds in the model, ASM3: Definition of processes in the model, ASM3: Stoichiometry, ASM3: Kinetics, Limitations of ASM3, Aspects of application of ASM3, ASM3C: A carbon based model, Conclusion; ASM 2d: Introduction, Conceptual Approach, ASM 2d, Typical Wastewater Characteristics and Kinetic and Stoichiometric Constants, Limitations, Conclusion; ASM 2...

  9. Modeling and Simulation for Safeguards

    International Nuclear Information System (INIS)

    Swinhoe, Martyn T.

    2012-01-01

    The purpose of this talk is to give an overview of the role of modeling and simulation in Safeguards R and D and to introduce you to (some of) the tools used. Some definitions are: (1) Modeling - the representation, often mathematical, of a process, concept, or operation of a system, often implemented by a computer program; (2) Simulation - the representation of the behavior or characteristics of one system through the use of another system, especially a computer program designed for the purpose; and (3) Safeguards - the timely detection of diversion of significant quantities of nuclear material. The roles of modeling and simulation are: (1) calculating amounts of material (plant modeling); (2) calculating signatures of nuclear material etc. (source terms); and (3) modeling detector performance (radiation transport and detection). Plant modeling software (e.g. FACSIM) gives the flows and amounts of material stored at all parts of the process. In safeguards this allows us to calculate the expected uncertainty of the mass and evaluate the expected MUF. We can determine the measurement accuracy required to achieve a certain performance.
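    The material-balance bookkeeping behind MUF evaluation can be sketched in a few lines. The balance quantities, measurement uncertainties, and the simple quadrature error propagation below are illustrative assumptions, not values from the talk:

```python
import math

def muf(beginning_inventory, additions, removals, ending_inventory):
    """Material Unaccounted For: book inventory minus measured ending inventory."""
    book = beginning_inventory + additions - removals
    return book - ending_inventory

def muf_sigma(sigmas):
    """Expected uncertainty of MUF, assuming independent measurement errors
    combined in quadrature (an illustrative simplification)."""
    return math.sqrt(sum(s * s for s in sigmas))

# Hypothetical plant balance (kg of nuclear material):
m = muf(beginning_inventory=100.0, additions=40.0, removals=35.0,
        ending_inventory=104.2)
sigma = muf_sigma([0.5, 0.3, 0.3, 0.5])
alarm = abs(m) > 3 * sigma  # simple 3-sigma detection criterion
print(round(m, 3), round(sigma, 3), alarm)  # -> 0.8 0.825 False
```

    Tightening the individual measurement accuracies shrinks the expected MUF uncertainty, which is how the required measurement performance can be backed out from a detection goal.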

  10. Defining Generic Architecture for Cloud Infrastructure as a Service model

    NARCIS (Netherlands)

    Demchenko, Y.; de Laat, C.

    2011-01-01

    Infrastructure as a Service (IaaS) is one of the provisioning models for Clouds, as defined in the NIST Clouds definition. Although widely used, current IaaS implementations and solutions do not have a common and well-defined architecture model. The paper attempts to define a generic architecture for

  11. On a Graphical Technique for Evaluating Some Rational Expectations Models

    DEFF Research Database (Denmark)

    Johansen, Søren; Swensen, Anders R.

    2011-01-01

    Campbell and Shiller (1987) proposed a graphical technique for the present value model, which consists of plotting estimates of the spread and theoretical spread as calculated from the cointegrated vector autoregressive model without imposing the restrictions implied by the present value model. In addition to getting a visual impression of the fit of the model, the purpose is to see if the two spreads are nevertheless similar as measured by correlation, variance ratio, and noise ratio. We extend these techniques to a number of rational expectation models and give a general definition of spread...

  12. International Symposia on Scale Modeling

    CERN Document Server

    Ito, Akihiko; Nakamura, Yuji; Kuwana, Kazunori

    2015-01-01

    This volume thoroughly covers scale modeling and serves as the definitive source of information on scale modeling as a powerful simplifying and clarifying tool used by scientists and engineers across many disciplines. The book elucidates techniques used when it would be too expensive, or too difficult, to test a system of interest in the field. Topics addressed in the current edition include scale modeling to study weather systems, diffusion of pollution in air or water, chemical processes in 3-D turbulent flow, multiphase combustion, flame propagation, biological systems, behavior of materials at nano- and micro-scales, and many more. This is an ideal book for students, both graduate and undergraduate, as well as engineers and scientists interested in the latest developments in scale modeling. This book also: enables readers to evaluate essential and salient aspects of profoundly complex systems, mechanisms, and phenomena at scale; offers engineers and designers a new point of view, liberating creative and inno...

  13. A sensitivity analysis of the WIPP disposal room model: Phase 1

    Energy Technology Data Exchange (ETDEWEB)

    Labreche, D.A.; Beikmann, M.A. [RE/SPEC, Inc., Albuquerque, NM (United States); Osnes, J.D. [RE/SPEC, Inc., Rapid City, SD (United States); Butcher, B.M. [Sandia National Labs., Albuquerque, NM (United States)

    1995-07-01

    The WIPP Disposal Room Model (DRM) is a numerical model with three major components -- constitutive models of TRU waste, crushed salt backfill, and intact halite -- and several secondary components, including air gap elements, slidelines, and assumptions on symmetry and geometry. A sensitivity analysis of the Disposal Room Model was initiated on two of the three major components (waste and backfill models) and on several secondary components as a group. The immediate goal of this component sensitivity analysis (Phase I) was to sort (rank) model parameters in terms of their relative importance to model response so that a Monte Carlo analysis on a reduced set of DRM parameters could be performed under Phase II. The goal of the Phase II analysis will be to develop a probabilistic definition of a disposal room porosity surface (porosity, gas volume, time) that could be used in WIPP Performance Assessment analyses. This report documents a literature survey which quantifies the relative importance of the secondary room components to room closure, a differential analysis of the creep consolidation model and definition of a follow-up Monte Carlo analysis of the model, and an analysis and refitting of the waste component data on which a volumetric plasticity model of TRU drum waste is based. A summary, evaluation of progress, and recommendations for future work conclude the report.

  14. An Implicit Model Development Process for Bounding External, Seemingly Intangible/Non-Quantifiable Factors

    Science.gov (United States)

    2017-06-01

    This research expands the modeling and simulation (M and S) body of knowledge through the development of an Implicit Model Development Process (IMDP). When augmented to traditional Model Development Processes (MDP), the IMDP enables the development of models that can address a broader array of potential impacts on operational effectiveness. Specifically, the IMDP provides a formalized methodology for developing an improved model definition

  15. Description logics with approximate definitions precise modeling of vague concepts

    NARCIS (Netherlands)

    Schlobach, Stefan; Klein, Michel; Peelen, Linda

    2007-01-01

    We extend traditional Description Logics (DL) with a simple mechanism to handle approximate concept definitions in a qualitative way. Often, for example in medical applications, concepts are not definable in a crisp way but can fairly exhaustively be constrained through a particular sub- and a

  16. Modeling Routinization in Games: An Information Theory Approach

    DEFF Research Database (Denmark)

    Wallner, Simon; Pichlmair, Martin; Hecher, Michael

    2015-01-01

    Routinization is the result of practicing until an action stops being a goal-directed process. This paper formulates a definition of routinization in games based on prior research in the fields of activity theory and practice theory. Routinization is analyzed using the formal model of discrete-time, discrete-space Markov chains and information theory to measure the actual error between the dynamically trained models and the player interaction. Preliminary research supports the hypothesis that Markov chains can be effectively used to model routinization in games. A full study design is presented...
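    The approach described in this abstract — training a discrete-time, discrete-space Markov chain on player actions and measuring the error between model and interaction — can be sketched as follows. Using average surprisal (cross-entropy) as the error measure is our assumption for illustration, not necessarily the paper's exact metric:

```python
import math
from collections import defaultdict

def train_markov(actions):
    """First-order, discrete-time Markov chain: transition counts -> probabilities."""
    counts = defaultdict(lambda: defaultdict(int))
    for a, b in zip(actions, actions[1:]):
        counts[a][b] += 1
    return {a: {b: n / sum(nxt.values()) for b, n in nxt.items()}
            for a, nxt in counts.items()}

def cross_entropy(model, actions, eps=1e-12):
    """Average surprisal (bits) of the observed transitions under the model;
    routinized, predictable play yields low values."""
    bits = [-math.log2(model.get(a, {}).get(b, eps))
            for a, b in zip(actions, actions[1:])]
    return sum(bits) / len(bits)

routine = list("abcabcabcabc")
model = train_markov(routine)
print(cross_entropy(model, routine))  # -> 0.0 for a fully deterministic routine
```

    Retraining the chain on a sliding window of recent actions would make the model "dynamically trained" in the abstract's sense, with the error tracking how routinized play has become.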

  17. Building Information Model: advantages, tools and adoption efficiency

    Science.gov (United States)

    Abakumov, R. G.; Naumov, A. E.

    2018-03-01

    The paper expands on the definition and essence of Building Information Modeling. It describes the content of, and the effects from, applying Information Modeling at different stages of a real property item's life cycle. An analysis of long-term and short-term advantages is given. The authors include an analytical review of the Revit software package in comparison with Autodesk with respect to: features, advantages and disadvantages, cost and pay cutoff. A prognostic calculation is given for the efficiency of adopting the Building Information Modeling technology, with examples of its successful adoption in Russia and worldwide.

  18. Circular Business Models: Defining a Concept and Framing an Emerging Research Field

    Directory of Open Access Journals (Sweden)

    Julia L. K. Nußholz

    2017-10-01

    Full Text Available To aid companies in transitioning towards a circular economy and adopting strategies such as reuse, repair, and remanufacturing, the concept of circular business models has been developed. Although the concept draws on contributions from various academic disciplines, and despite its increasingly frequent use, few scholars clearly define what a circular business model is. Understanding of what makes a business model circular is diverse, hampering the theoretical development and practical application of circular business models. This study aims to help frame the field of circular business model research by clarifying the fundamentals of the concept from the perspectives of resource efficiency and business model innovation. Expanding on these findings, a review of how the concept is used in recent academic literature is provided. It shows that a coherent view is lacking on which resource efficiency strategies classify a business model as circular. This study clarifies which resource efficiency strategies can be deemed relevant key strategies for circular business models, and suggests a new definition of the concept. With the definition grounded in analysis of the fundamentals in terms of resource efficiency and business models, the study contributes to theoretical advancement and effective implementation of circular business models.

  19. Minimalistic Neutrino Mass Model

    CERN Document Server

    De Gouvêa, A; Gouvea, Andre de

    2001-01-01

    We consider the simplest model which solves the solar and atmospheric neutrino puzzles, in the sense that it contains the smallest amount of beyond the Standard Model ingredients. The solar neutrino data is accounted for by Planck-mass effects while the atmospheric neutrino anomaly is due to the existence of a single right-handed neutrino at an intermediate mass scale between 10^9 GeV and 10^14 GeV. Even though the neutrino mixing angles are not exactly predicted, they can be naturally large, which agrees well with the current experimental situation. Furthermore, the amount of lepton asymmetry produced in the early universe by the decay of the right-handed neutrino is very predictive and may be enough to explain the current baryon-to-photon ratio if the right-handed neutrinos are produced out of thermal equilibrium. One definitive test for the model is the search for anomalous seasonal effects at Borexino.

  20. Theoretical aspects of the optical model

    International Nuclear Information System (INIS)

    Mahaux, C.

    1980-01-01

    We first recall the definition of the optical-model potential for nucleons and the physical interpretation of the main related quantities. We then survey the recent theoretical progress towards a reliable calculation of this potential. The present limitations of the theory and some prospects for future developments are outlined. (author)

  1. Using Interaction Scenarios to Model Information Systems

    DEFF Research Database (Denmark)

    Bækgaard, Lars; Bøgh Andersen, Peter

    The purpose of this paper is to define and discuss a set of interaction primitives that can be used to model the dynamics of socio-technical activity systems, including information systems, in a way that emphasizes structural aspects of the interaction that occurs in such systems. The primitives are based on a unifying, conceptual definition of the disparate interaction types - a robust model of the types. The primitives can be combined and may thus represent mediated interaction. We present a set of visualizations that can be used to define multiple related interactions, and we present and discuss a number of case studies that indicate that interaction primitives can be useful modeling tools for supplementing conventional flow-oriented modeling of business processes.

  2. Enhancements to NURBS-Based FEA Airfoil Modeler: SABER

    Science.gov (United States)

    Saleeb, A. F.; Trowbridge, D. A.

    2003-01-01

    NURBS (Non-Uniform Rational B-Splines) have become a common way for CAD programs to fit a smooth surface to discrete geometric data. This concept has been extended to allow for the fitting of analysis data in a similar manner and "attaching" the analysis data to the geometric definition of the structure. The "attaching" of analysis data to the geometric definition allows for a more seamless sharing of data between analysis disciplines. NURBS have become a useful tool in the modeling of airfoils. The use of NURBS has allowed for the development of software that easily and consistently generates plate finite element models of the midcamber surface of a given airfoil. The resulting displacements can then be applied to the original airfoil surface and the deformed shape calculated.
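    The NURBS machinery mentioned above can be illustrated with a minimal curve evaluator based on the Cox-de Boor recursion. This is a generic sketch, not SABER's actual implementation; with all weights equal to 1, a clamped quadratic NURBS reduces to a Bezier curve, which gives an easy check:

```python
def basis(i, p, u, knots):
    """Cox-de Boor recursion for the B-spline basis N_{i,p}(u), with 0/0 := 0.
    (Evaluation exactly at the final knot would need the last interval closed.)"""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = right = 0.0
    if knots[i + p] != knots[i]:
        left = (u - knots[i]) / (knots[i + p] - knots[i]) * basis(i, p - 1, u, knots)
    if knots[i + p + 1] != knots[i + 1]:
        right = ((knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1])
                 * basis(i + 1, p - 1, u, knots))
    return left + right

def nurbs_point(u, degree, knots, ctrl, weights):
    """Rational combination: sum(N_i w_i P_i) / sum(N_i w_i)."""
    terms = [(basis(i, degree, u, knots) * w, pt)
             for i, (pt, w) in enumerate(zip(ctrl, weights))]
    denom = sum(t for t, _ in terms)
    return tuple(sum(t * c[k] for t, c in terms) / denom
                 for k in range(len(ctrl[0])))

# Clamped quadratic with unit weights reduces to a Bezier curve:
knots = [0, 0, 0, 1, 1, 1]
ctrl = [(0.0, 0.0), (1.0, 2.0), (2.0, 0.0)]
print(nurbs_point(0.5, 2, knots, ctrl, [1.0, 1.0, 1.0]))  # -> (1.0, 1.0)
```

    Fitting analysis data in the same parametric form — the "attaching" the abstract describes — amounts to solving for control-point values of the data field over the same knot vector as the geometry.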

  3. Labour Quality Model for Organic Farming Food Chains

    OpenAIRE

    Gassner, B.; Freyer, B.; Leitner, H.

    2008-01-01

    The debate on labour quality in science is controversial as well as in the organic agriculture community. Therefore, we reviewed literature on different labour quality models and definitions, and had key informant interviews on labour quality issues with stakeholders in a regional oriented organic agriculture bread food chain. We developed a labour quality model with nine quality categories and discussed linkages to labour satisfaction, ethical values and IFOAM principles.

  4. Modeling humanoid swarm robots with petri nets

    OpenAIRE

    Maharjan, Bikee

    2015-01-01

    Master's thesis in Computer science. Robots have become a hot topic in today's electronic world. There are many definitions of them; one, in the Oxford dictionary, states that "a robot is a machine capable of carrying out a complex series of actions automatically, especially one programmable by a computer". This paper deals with a special kind of robot known as the humanoid robot. These robots are replications of human beings, with head, torso, arms and legs. A model of hum...

  5. A process-based model for the definition of hydrological alert systems in landslide risk mitigation

    Directory of Open Access Journals (Sweden)

    M. Floris

    2012-11-01

    Full Text Available The definition of hydrological alert systems for rainfall-induced landslides is strongly related to a deep knowledge of the geological and geomorphological features of the territory. Climatic conditions, spatial and temporal evolution of the phenomena and characterization of landslide triggering, together with propagation mechanisms, are the key elements to be considered. Critical steps for the development of the systems consist of the identification of the hydrological variable related to landslide triggering and of the minimum rainfall threshold for landslide occurrence.

    In this paper we report the results from a process-based model used to define a hydrological alert system for the Val di Maso landslide, located in the northeastern Italian Alps and included in the Vicenza Province (Veneto region, NE Italy). The instability occurred in November 2010, due to an exceptional rainfall event that hit the Vicenza Province and all of NE Italy. Up to 500 mm of 3-day cumulated rainfall generated large flood conditions and triggered hundreds of landslides. During the flood, the Soil Protection Division of the Vicenza Province received more than 500 warnings of instability phenomena. The complexity of the event and the high level of risk to infrastructure and private buildings are the main reasons for studying in depth the specific phenomenon that occurred at Val di Maso.

    Empirical and physically-based models have been used to identify the minimum rainfall threshold for the occurrence of instability phenomena in the crown area of the Val di Maso landslide, where retrogressive evolution by multiple rotational slides is expected. Empirical models helped in identifying critical rainfall events and evaluating their recurrence, while physically-based modelling was essential to verify the effects of given rainfall depths on slope stability. Empirical relationships between rainfall and landslide consist of the calculation of rainfall
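    The empirical rainfall-threshold step can be sketched as an intensity-duration power law. The coefficients below are Caine's (1980) global values, used only as illustrative placeholders for the site-specific threshold the authors derive for Val di Maso:

```python
def threshold_intensity(duration_h, a=14.82, b=0.39):
    """Minimum mean rainfall intensity (mm/h) for landslide triggering,
    modelled as the power law I = a * D**-b. Defaults are Caine's (1980)
    global coefficients, shown for illustration only; an operational alert
    system needs locally calibrated values."""
    return a * duration_h ** -b

def exceeds_threshold(total_rain_mm, duration_h):
    """True if the event's mean intensity lies above the threshold curve."""
    return total_rain_mm / duration_h > threshold_intensity(duration_h)

# 500 mm of rain over 3 days (72 h), as in the November 2010 event:
print(exceeds_threshold(500.0, 72.0))  # -> True
```

    In an alert system, the threshold exceedance would trigger the physically-based slope-stability check rather than an alarm directly.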

  6. Parametric cost models for space telescopes

    Science.gov (United States)

    Stahl, H. Philip; Henrichs, Todd; Dollinger, Courtnay

    2017-11-01

    Multivariable parametric cost models for space telescopes provide several benefits to designers and space system project managers. They identify major architectural cost drivers and allow high-level design trades. They enable cost-benefit analysis for technology development investment. And, they provide a basis for estimating total project cost. A survey of historical models found that there is no definitive space telescope cost model. In fact, published models vary greatly [1]. Thus, there is a need for parametric space telescopes cost models. An effort is underway to develop single variable [2] and multi-variable [3] parametric space telescope cost models based on the latest available data and applying rigorous analytical techniques. Specific cost estimating relationships (CERs) have been developed which show that aperture diameter is the primary cost driver for large space telescopes; technology development as a function of time reduces cost at the rate of 50% per 17 years; it costs less per square meter of collecting aperture to build a large telescope than a small telescope; and increasing mass reduces cost.

  7. Parametric Cost Models for Space Telescopes

    Science.gov (United States)

    Stahl, H. Philip; Henrichs, Todd; Dollinger, Courtney

    2010-01-01

    Multivariable parametric cost models for space telescopes provide several benefits to designers and space system project managers. They identify major architectural cost drivers and allow high-level design trades. They enable cost-benefit analysis for technology development investment. And, they provide a basis for estimating total project cost. A survey of historical models found that there is no definitive space telescope cost model. In fact, published models vary greatly [1]. Thus, there is a need for parametric space telescopes cost models. An effort is underway to develop single variable [2] and multi-variable [3] parametric space telescope cost models based on the latest available data and applying rigorous analytical techniques. Specific cost estimating relationships (CERs) have been developed which show that aperture diameter is the primary cost driver for large space telescopes; technology development as a function of time reduces cost at the rate of 50% per 17 years; it costs less per square meter of collecting aperture to build a large telescope than a small telescope; and increasing mass reduces cost.
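    The relationships stated in this abstract — aperture diameter as the primary cost driver and a 50% cost reduction per 17 years of technology development — can be combined into an illustrative CER. The scale factor and diameter exponent below are hypothetical placeholders, not the papers' fitted values:

```python
def telescope_cost(aperture_m, design_year, ref_year=2000,
                   k=100.0, diameter_exp=1.7, halving_years=17.0):
    """Illustrative cost estimating relationship (CER), in $M: cost scales
    as a power of aperture diameter and halves for every 17 years of
    technology development, per the surveyed trend. k and diameter_exp
    are hypothetical, not fitted values."""
    tech_factor = 0.5 ** ((design_year - ref_year) / halving_years)
    return k * aperture_m ** diameter_exp * tech_factor

# The same 2 m telescope designed 17 years later costs half as much:
print(telescope_cost(2.0, 2017) / telescope_cost(2.0, 2000))  # -> 0.5
```

    With any diameter exponent below 2, the sketch also reproduces the stated finding that a large telescope costs less per square meter of collecting aperture than a small one.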

  8. Towards Accurate Modelling of Galaxy Clustering on Small Scales: Testing the Standard ΛCDM + Halo Model

    Science.gov (United States)

    Sinha, Manodeep; Berlind, Andreas A.; McBride, Cameron K.; Scoccimarro, Roman; Piscionere, Jennifer A.; Wibking, Benjamin D.

    2018-04-01

    Interpreting the small-scale clustering of galaxies with halo models can elucidate the connection between galaxies and dark matter halos. Unfortunately, the modelling is typically not sufficiently accurate for ruling out models statistically. It is thus difficult to use the information encoded in small scales to test cosmological models or probe subtle features of the galaxy-halo connection. In this paper, we attempt to push halo modelling into the "accurate" regime with a fully numerical mock-based methodology and careful treatment of statistical and systematic errors. With our forward-modelling approach, we can incorporate clustering statistics beyond the traditional two-point statistics. We use this modelling methodology to test the standard ΛCDM + halo model against the clustering of SDSS DR7 galaxies. Specifically, we use the projected correlation function, group multiplicity function and galaxy number density as constraints. We find that while the model fits each statistic separately, it struggles to fit them simultaneously. Adding group statistics leads to a more stringent test of the model and significantly tighter constraints on model parameters. We explore the impact of varying the adopted halo definition and cosmological model and find that changing the cosmology makes a significant difference. The most successful model we tried (Planck cosmology with Mvir halos) matches the clustering of low luminosity galaxies, but exhibits a 2.3σ tension with the clustering of luminous galaxies, thus providing evidence that the "standard" halo model needs to be extended. This work opens the door to adding interesting freedom to the halo model and including additional clustering statistics as constraints.
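    The galaxy-halo connection in such halo models is commonly parameterized by a halo occupation distribution (HOD). The sketch below uses the standard five-parameter form popularized by Zheng et al. (2005); the parameter values are hypothetical, not the fits from this paper:

```python
import math

def mean_centrals(log_m, log_m_min=12.0, sigma_logm=0.2):
    """Mean central occupation: a smoothed step function in log halo mass."""
    return 0.5 * (1.0 + math.erf((log_m - log_m_min) / sigma_logm))

def mean_satellites(log_m, log_m0=12.0, log_m1=13.3, alpha=1.0):
    """Mean satellite occupation: a power law above a cutoff mass."""
    m, m0, m1 = 10.0 ** log_m, 10.0 ** log_m0, 10.0 ** log_m1
    return ((m - m0) / m1) ** alpha if m > m0 else 0.0

def mean_occupation(log_m):
    # Satellites are conventionally modulated by the central occupation.
    return mean_centrals(log_m) * (1.0 + mean_satellites(log_m))

print(round(mean_occupation(14.0), 2))  # cluster-mass halos host several galaxies
```

    In a mock-based pipeline like the one described, these occupation functions populate simulation halos with galaxies, from which the projected correlation function and group multiplicity function are then measured.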

  9. The territorial biorefinery as a new business model

    Directory of Open Access Journals (Sweden)

    Ion Lucian Ceapraz

    2016-05-01

    Full Text Available The transition toward more sustainable industries opens the way for alternative solutions based upon new economic models using agricultural inputs, or biomass, to substitute for oil-based inputs. In this context, different generations of biorefinery complexes are evolving rapidly and highlight the numerous possibilities for the organization of processing activities, from supply to final markets. The evolution of these biorefineries has followed two main business models: the port biorefinery, based on the import of raw materials, and the territorial biorefinery, based on strong relationships with local (or regional) supply bases. In this article we focus on the concept of the ‘territorial biorefinery’, seen as a new business model. We develop the idea of a link between the biorefinery and its territory through several relevant theoretical approaches and demonstrate that the definition of ‘territorial biorefinery’ does not, across these theoretical backgrounds, achieve consensus. More importantly, we emphasise that the theoretical assumptions underlying the different definitions used should be made explicit in order to facilitate the manner in which practitioners study, develop and set up businesses of this kind.

  10. Dynamical compensation and structural identifiability of biological models: Analysis, implications, and reconciliation.

    Science.gov (United States)

    Villaverde, Alejandro F; Banga, Julio R

    2017-11-01

    The concept of dynamical compensation has been recently introduced to describe the ability of a biological system to keep its output dynamics unchanged in the face of varying parameters. However, the original definition of dynamical compensation amounts to lack of structural identifiability. This is relevant if model parameters need to be estimated, as is often the case in biological modelling. Care should be taken when using an unidentifiable model to extract biological insight: the estimated values of structurally unidentifiable parameters are meaningless, and model predictions about unmeasured state variables can be wrong. Taking this into account, we explore alternative definitions of dynamical compensation that do not necessarily imply structural unidentifiability. Accordingly, we show different ways in which a model can be made identifiable while exhibiting dynamical compensation. Our analyses enable the use of the new concept of dynamical compensation in the context of parameter identification, and reconcile it with the desirable property of structural identifiability.

  11. Coupled intertwiner dynamics: A toy model for coupling matter to spin foam models

    Science.gov (United States)

    Steinhaus, Sebastian

    2015-09-01

    The universal coupling of matter and gravity is one of the most important features of general relativity. In quantum gravity, in particular spin foams, matter couplings have been defined in the past, yet the mutual dynamics, in particular if matter and gravity are strongly coupled, are hardly explored, which is related to the definition of both matter and gravitational degrees of freedom on the discretization. However, extracting these mutual dynamics is crucial in testing the viability of the spin foam approach and also establishing connections to other discrete approaches such as lattice gauge theories. Therefore, we introduce a simple two-dimensional toy model for Yang-Mills coupled to spin foams, namely an Ising model coupled to so-called intertwiner models defined for SU(2)_k. The two systems are coupled by choosing the Ising coupling constant to depend on spin labels of the background, as these are interpreted as the edge lengths of the discretization. We coarse grain this toy model via tensor network renormalization and uncover an interesting dynamics: the Ising phase transition temperature turns out to be sensitive to the background configurations and conversely, the Ising model can induce phase transitions in the background. Moreover, we observe a strong coupling of both systems if close to both phase transitions.
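    The coupling mechanism described here — an Ising coupling constant that depends on background spin labels — can be illustrated with a toy Metropolis simulation on a ring. The mapping J(j) = j/2 from background label to coupling is an arbitrary stand-in, not the paper's choice, and this sketch omits the intertwiner-model dynamics entirely:

```python
import math
import random

def energy(spins, couplings):
    """H = -sum_<ij> J_ij s_i s_j on a 1D ring; each edge coupling J_ij is
    set by a background label (a stand-in for the spin-foam edge data)."""
    n = len(spins)
    return -sum(couplings[i] * spins[i] * spins[(i + 1) % n] for i in range(n))

def metropolis_sweep(spins, couplings, beta, rng):
    """One Metropolis sweep of the Ising spins on the fixed background."""
    n = len(spins)
    for i in range(n):
        # Energy change from flipping spin i (its two ring neighbours).
        d_e = 2 * spins[i] * (couplings[i - 1] * spins[i - 1]
                              + couplings[i] * spins[(i + 1) % n])
        if d_e <= 0 or rng.random() < math.exp(-beta * d_e):
            spins[i] = -spins[i]
    return spins

# Background labels set the edge couplings (hypothetical J(j) = j / 2):
background = [1, 2, 1, 2, 1, 2]
couplings = [j / 2 for j in background]
spins = [1] * len(background)
print(energy(spins, couplings))  # -> -4.5
```

    Making the background labels dynamical as well, and updating them alongside the spins, would mimic the back-reaction by which the Ising sector can drive transitions in the background.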

  12. Controlled Nonlinear Stochastic Delay Equations: Part I: Modeling and Approximations

    International Nuclear Information System (INIS)

    Kushner, Harold J.

    2012-01-01

    This two-part paper deals with "foundational" issues that have not been previously considered in the modeling and numerical optimization of nonlinear stochastic delay systems. There are new classes of models, such as those with nonlinear functions of several controls (such as products), each with its own delay, controlled random Poisson measure driving terms, admissions control with delayed retrials, and others. There are two basic and interconnected themes for these models. The first, dealt with in this part, concerns the definition of admissible control. The classical definition of an admissible control as a nonanticipative relaxed control is inadequate for these models and needs to be extended. This is needed for the convergence proofs of numerical approximations for optimal controls, as well as to have a well-defined model. It is shown that the new classes of admissible controls do not enlarge the range of the value functions, are closed (together with the associated paths) under weak convergence, and can be approximated by ordinary controls. The second theme, dealt with in Part II, concerns transportation equation representations and their role in the development of numerical algorithms with much-reduced memory and computational requirements.

  13. First-order regional seismotectonic model for South Africa

    CSIR Research Space (South Africa)

    Singh, M

    2011-10-01

    Full Text Available A first-order seismotectonic model was created for South Africa. This was done using four logical steps: geoscientific data collection, characterisation, assimilation and zonation. Through the definition of subunits of concentrations of earthquake...

  14. Behavioral models as theoretical frames to analyze the business objective

    Directory of Open Access Journals (Sweden)

    Hernán Alonso Bafico

    2015-12-01

    Full Text Available This paper examines Pfeffer's models of behavior and connects each of them with attributes of the definition of the firm's objective, assumed to be the maximization of the sustainable, long-term value of the residual claims. Each of the five models of behavior (rational, social, moral, retrospective and cognitive) contributes its own complementary elements to the decision-making and goal-setting processes, from those assuming complete rationality and frictionless markets to the models emphasizing the role of ethical positions and the presence of perceptive and cognitive mechanisms. The analysis highlights the main contributions of critical theories and models of behavior, underlining their focus on non-traditional variables, regarded as critical inputs for goal-setting processes and for designing alternative executive incentive schemes. The explicit consideration of those variables does not indicate the need for a new definition of the corporate objective. The maximization of the long-term value of the shareholders' claims still defines the relevant objective function of the firm, remaining the main yardstick of corporate performance. Behavioral models are recognized as important tools to help managers direct their attention to long-term strategies. In the last part, we comment on the relationship between the objective function and behavioral models from the practitioners' perspective. Key words: Firm Objectives, Behavioral Models, Value Maximization, Stakeholder Theory.

  15. The potential model of coloured quarks

    International Nuclear Information System (INIS)

    Greenberg, O.W.

    1981-01-01

    The success of the additive potential model of colored quarks for the masses, decay rates, and other properties of single mesons and baryons does not imply that this model can yield the observed meson-nucleon and nucleon-nucleon interactions. We give a comprehensive discussion of this issue. In agreement with previous authors, we conclude that, on the contrary, this model predicts inverse-power color-analog van der Waals potentials between separated hadrons which are in substantial contradiction with experimental data. We also discuss pathologies of non-abelian confining potentials, and show that the Hamiltonian is unbounded below for an arbitrary number of quarks and antiquarks in a definite color state, for all color states except the singlet, triplet, and antitriplet. (orig.)

  16. The Twin-Cycle Experiential Learning Model: Reconceptualising Kolb's Theory

    Science.gov (United States)

    Bergsteiner, Harald; Avery, Gayle C.

    2014-01-01

    Experiential learning styles remain popular despite criticisms about their validity, usefulness, fragmentation and poor definitions and categorisation. After examining four prominent models and building on Bergsteiner, Avery, and Neumann's suggestion of a dual cycle, this paper proposes a twin-cycle experiential learning model to overcome…

  17. Modeling Complex Systems

    International Nuclear Information System (INIS)

    Schreckenberg, M

    2004-01-01

    This book by Nino Boccara presents a compilation of model systems commonly termed 'complex'. It starts with a definition of the systems under consideration and how to build up a model to describe the complex dynamics. The subsequent chapters are devoted to various categories of mean-field type models (differential and recurrence equations, chaos) and of agent-based models (cellular automata, networks and power-law distributions). Each chapter is supplemented by a number of exercises and their solutions. The table of contents looks a little arbitrary, but the author took the most prominent model systems investigated over the years (and up until now there has been no unified theory covering the various aspects of complex dynamics). The model systems are explained by looking at a number of applications in various fields. The book is written as a textbook for interested students as well as serving as a comprehensive reference for experts. It is an ideal source for topics to be presented in a lecture on the dynamics of complex systems. This is the first book on this 'wide' topic and I have long awaited such a book (in fact I planned to write it myself but this is much better than I could ever have written it!). Only section 6 on cellular automata is a little too limited to the author's point of view and one would have expected more about the famous Domany-Kinzel model (and more accurate citation!). In my opinion this is one of the best textbooks published during the last decade and even experts can learn a lot from it. Hopefully there will be an updated edition after, say, five years, since this field is growing so quickly. The price is too high for students but this, unfortunately, is the normal case today. Nevertheless I think it will be a great success! (book review)

  18. A COMPARISON OF SEMANTIC SIMILARITY MODELS IN EVALUATING CONCEPT SIMILARITY

    Directory of Open Access Journals (Sweden)

    Q. X. Xu

    2012-08-01

    Full Text Available Semantic similarities are important in concept definition, recognition, categorization, interpretation, and integration. Many semantic similarity models have been established to evaluate the semantic similarities of objects and/or concepts. To find out the suitability and performance of different models in evaluating concept similarities, we compare four main types of models in this paper: the geometric model, the feature model, the network model, and the transformational model. First, the fundamental principles and main characteristics of these models are introduced and compared. Land use and land cover concepts of NLCD92 are employed as examples in the case study. The results demonstrate that the correlations between these models are very high, possibly because all these models are designed to simulate the similarity judgement of the human mind.
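    Of the four model types compared, the feature model is the easiest to make concrete. A minimal sketch of Tversky's ratio-model similarity over feature sets follows; the land-cover feature sets are invented for illustration, in the spirit of the NLCD92 examples, and are not taken from the paper.

```python
def tversky_similarity(a, b, alpha=0.5, beta=0.5):
    """Feature-model (Tversky ratio model) similarity between two concepts
    given as feature sets: shared features raise similarity, distinctive
    features lower it. alpha = beta = 0.5 gives the symmetric variant."""
    a, b = set(a), set(b)
    common = len(a & b)
    return common / (common + alpha * len(a - b) + beta * len(b - a))

# Hypothetical land-cover concepts described by feature sets:
forest = {"vegetated", "trees", "natural"}
shrubland = {"vegetated", "shrubs", "natural"}
water = {"open water"}
print(tversky_similarity(forest, shrubland))  # shares 2 of 3 features
print(tversky_similarity(forest, water))      # no shared features -> 0.0
```

    Setting alpha and beta unequal makes the measure asymmetric, which is one way the feature model differs from the purely metric geometric model.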

  19. Models for reliability and management of NDT data

    International Nuclear Information System (INIS)

    Simola, K.

    1997-01-01

    In this paper the reliability of NDT measurements was approached from three directions. We have modelled the flaw sizing performance and the probability of flaw detection, and developed models to update the knowledge of the true flaw size based on sequential measurement results and the flaw sizing reliability model. In the models discussed, the measured flaw characteristics (depth, length) are assumed to be simple functions of the true characteristics plus random noise corresponding to measurement errors, and the models are based on logarithmic transforms. Models for Bayesian updating of the flaw size distributions were developed. Using these models, it is possible to take the prior information on the flaw size into account and combine it with the measured results. A Bayesian approach could contribute, e.g., to the definition of an appropriate combination of practical assessments and technical justifications in NDT system qualifications, as expressed by the European regulatory bodies
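    The measurement model described (measured size equal to the true size plus noise, after a logarithmic transform) admits a conjugate sketch. The following assumes a normal prior and normal noise on the log scale; the function name and the numbers are illustrative, not taken from the paper.

```python
import math

def update_log_flaw_size(prior_mu, prior_var, measurement, noise_var):
    """Conjugate normal update on the log flaw depth. Assumes
    log(measured) = log(true) + e with e ~ N(0, noise_var), and a
    lognormal prior on the true depth (normal on the log scale)."""
    y = math.log(measurement)
    post_var = 1.0 / (1.0 / prior_var + 1.0 / noise_var)
    post_mu = post_var * (prior_mu / prior_var + y / noise_var)
    return post_mu, post_var

# Sequential NDT readings tighten the posterior on the true depth:
mu, var = math.log(5.0), 0.5 ** 2          # prior: depth around 5 mm
for m in (4.2, 4.8, 4.5):                  # hypothetical repeated readings (mm)
    mu, var = update_log_flaw_size(mu, var, m, 0.2 ** 2)
print(math.exp(mu), var)                   # posterior median depth; shrunk variance
```

    Each additional measurement adds a precision term, so repeated inspections narrow the flaw size distribution exactly as the sequential-updating scheme in the paper intends.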

  20. [Decision modeling for economic evaluation of health technologies].

    Science.gov (United States)

    de Soárez, Patrícia Coelho; Soares, Marta Oliveira; Novaes, Hillegonda Maria Dutilh

    2014-10-01

    Most economic evaluations that feed into decision-making processes for the incorporation and financing of technologies in health systems use decision models to assess the costs and benefits of the compared strategies. Despite the large number of economic evaluations conducted in Brazil, there is a pressing need for an in-depth methodological study of the types of decision models and their applicability in our setting. The objective of this literature review is to contribute to the knowledge and use of decision models in the national context of economic evaluations of health technologies. This article presents general definitions about models and concerns with their use; it describes the main models: decision trees, Markov chains, micro-simulation, and discrete-event and dynamic simulation; it discusses the elements involved in the choice of model; and it exemplifies the models addressed in national economic evaluation studies of preventive, diagnostic and therapeutic technologies and health programs.

  1. Specialty Payment Model Opportunities and Assessment

    Science.gov (United States)

    Mulcahy, Andrew W.; Chan, Chris; Hirshman, Samuel; Huckfeldt, Peter J.; Kofner, Aaron; Liu, Jodi L.; Lovejoy, Susan L.; Popescu, Ioana; Timbie, Justin W.; Hussey, Peter S.

    2015-01-01

    Abstract Gastroenterology and cardiology services are common and costly among Medicare beneficiaries. Episode-based payment, which aims to create incentives for high-quality, low-cost care, has been identified as a promising alternative payment model. This article describes research related to the design of episode-based payment models for ambulatory gastroenterology and cardiology services for possible testing by the Center for Medicare and Medicaid Innovation at the Centers for Medicare and Medicaid Services (CMS). The authors analyzed Medicare claims data to describe the frequency and characteristics of gastroenterology and cardiology index procedures, the practices that delivered index procedures, and the patients that received index procedures. The results of these analyses can help inform CMS decisions about the definition of episodes in an episode-based payment model; payment adjustments for service setting, multiple procedures, or other factors; and eligibility for the payment model. PMID:28083363

  2. Leggett's noncontextual model studied with neutrons

    International Nuclear Information System (INIS)

    Durstberger-Rennhofer, K.; Sponar, S.; Badurek, G.; Hasegawa, Y.; Schmitzer, C.; Bartosik, H.; Klepp, J.

    2011-01-01

    Full text: It is a long-standing debate whether nature can be described by deterministic hidden variable theories (HVT) underlying quantum mechanics (QM). Bell inequalities for local HVT as well as the Kochen-Specker theorem for non-contextual models stress the conflict between these alternative theories and QM. Leggett showed that even nonlocal hidden variable models are incompatible with quantum predictions. Neutron interferometry and polarimetry are well-suited tools for analysing the behaviour of single-neutron systems, where entanglement is created between different degrees of freedom (e.g., spin/path, spin/energy) and thus quantum contextuality can be studied. We report the first experimental test of a contextual model of quantum mechanics à la Leggett, which deals with the definiteness of measurement results before the measurements. The results show a discrepancy between our model and quantum mechanics of more than 7 standard deviations and confirm quantum indefiniteness under the contextual condition. (author)

  3. Ecosystem models are by definition simplifications of the real ...

    African Journals Online (AJOL)

    spamer

    to calculate changes in total phytoplankton vegetative biomass with time ... into account when modelling phytoplankton population dynamics. ... Then, the means whereby the magnitude of ..... There was increased heat input and slight stratification from mid to ... conditions must be optimal and the water should be extremely ...

  4. Application of blocking diagnosis methods to general circulation models. Part II: model simulations

    Energy Technology Data Exchange (ETDEWEB)

    Barriopedro, D.; Trigo, R.M. [Universidade de Lisboa, CGUL-IDL, Faculdade de Ciencias, Lisbon (Portugal); Garcia-Herrera, R.; Gonzalez-Rouco, J.F. [Universidad Complutense de Madrid, Departamento de Fisica de la Tierra II, Facultad de C.C. Fisicas, Madrid (Spain)

    2010-12-15

    A previously defined automatic method is applied to reanalysis and present-day (1950-1989) forced simulations of the ECHO-G model in order to assess its performance in reproducing atmospheric blocking in the Northern Hemisphere. Unlike previous methodologies, critical parameters and thresholds to estimate blocking occurrence in the model are not calibrated with an observed reference, but objectively derived from the simulated climatology. The choice of model-dependent parameters allows for an objective definition of blocking and corrects for some intrinsic model bias, the difference between model and observed thresholds providing a measure of systematic errors in the model. The model reasonably captures the main blocking features (location, amplitude, annual cycle and persistence) found in observations, but reveals a relative southward shift of Eurasian blocks and an overall underestimation of blocking activity, especially over the Euro-Atlantic sector. Blocking underestimation mostly arises from the model's inability to generate long persistent blocks with the observed frequency. This error is mainly attributed to a bias in the basic state. The bias pattern consists of excessive zonal winds over the Euro-Atlantic sector and a southward shift at the exit zone of the jet stream extending into the Eurasian continent, which are more prominent in cold and warm seasons and account for much of the Euro-Atlantic and Eurasian blocking errors, respectively. It is shown that other widely used blocking indices or empirical observational thresholds may not give a proper account of the lack of realism in the model as compared with the proposed method.
This suggests that in addition to blocking changes that could be ascribed to natural variability processes or climate change signals in the simulated climate, attention should be paid to significant departures in the diagnosis of phenomena that can also arise from an inappropriate adaptation of detection methods to the climate of the

  5. Engineering modelling. A contribution to the CommonKADS library

    Energy Technology Data Exchange (ETDEWEB)

    Top, J.L.; Akkermans, J.M.

    1993-12-01

    Generic knowledge components and models for the task of engineering modelling in particular are presented. It is intended as a contribution to the CommonKADS library. In the first chapter an executive summary is provided. Next, the Conceptual Modelling Language (CML) definitions of the various generic library components are given. In the following two chapters the underlying theory is developed. First, a task-oriented analysis is made, based upon the similarities between modelling and design tasks. Second, an ontological analysis is given, which shows that ontology differentiation constitutes an important problem-solving method (PSM) for engineering modelling, on a par with task-decomposition PSMs. Finally, three different modelling applications, based on existing knowledge-based systems, are analyzed; this analysis illustrates and provides data points for the discussed generic components and models for modelling. 50 figs., 77 refs.

  6. REGIONAL FIRST ORDER PERIODIC AUTOREGRESSIVE MODELS FOR MONTHLY FLOWS

    Directory of Open Access Journals (Sweden)

    Ceyhun ÖZÇELİK

    2008-01-01

    Full Text Available First-order periodic autoregressive models are among the most widely used models for describing the time dependency of hydrological flow processes. In these models, the periodicity of the correlogram is preserved as well as the time dependency of the process. However, the parameters of these models, namely the inter-monthly lag-1 autocorrelation coefficients, may often be estimated erroneously from short samples, since they are statistics of high-order moments. Therefore, constituting a regional model may be a solution that produces more reliable and decisive estimates and allows models and model parameters to be derived at any required point of the basin considered. In this study, definitions of a homogeneous region for lag-1 autocorrelation coefficients are made; five parametric and non-parametric models are proposed to set up regional models of lag-1 autocorrelation coefficients. The regional models are applied to 30 stream flow gauging stations in the Seyhan and Ceyhan basins, and tested by the criteria of relative absolute bias and simple and relative root mean square errors.
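    The model in question can be sketched directly. Below, a standardized first-order periodic autoregressive (PAR(1)) process is simulated and its twelve inter-monthly lag-1 autocorrelations re-estimated; with short records the estimates scatter widely, which is the motivation given for pooling stations into a regional model. The names and the constant-rho test case are illustrative.

```python
import math
import random

def simulate_par1(rho, n_years, seed=1):
    """Simulate a standardized PAR(1) process:
    z[m] = rho[m] z[m-1] + sqrt(1 - rho[m]^2) eps,  eps ~ N(0, 1),
    where rho[m] is the lag-1 autocorrelation entering month m."""
    rng = random.Random(seed)
    z, prev = [], rng.gauss(0.0, 1.0)
    for _ in range(n_years):
        for m in range(12):
            prev = rho[m] * prev + math.sqrt(1.0 - rho[m] ** 2) * rng.gauss(0.0, 1.0)
            z.append(prev)
    return z

def monthly_lag1(z):
    """Estimate the twelve inter-monthly lag-1 autocorrelations."""
    n = len(z) // 12
    rhos = []
    for m in range(12):
        pairs = [(z[12 * y + m - 1], z[12 * y + m])
                 for y in range(n) if 12 * y + m - 1 >= 0]
        xs, ys = zip(*pairs)
        mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
        cov = sum((a - mx) * (b - my) for a, b in pairs)
        norm = math.sqrt(sum((a - mx) ** 2 for a in xs)
                         * sum((b - my) ** 2 for b in ys))
        rhos.append(cov / norm)
    return rhos

est = monthly_lag1(simulate_par1([0.5] * 12, n_years=400))
print(max(abs(r - 0.5) for r in est))  # small for a 400-year record, large for short ones
```

    Re-running the estimation with, say, 20 years instead of 400 shows the sampling error that regionalization is meant to suppress.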

  7. A parametric costing model for wave energy technology

    International Nuclear Information System (INIS)

    1992-01-01

    This document describes the philosophy and technical approach of a parametric cost model for offshore wave energy systems. Consideration is given both to existing known devices and to other devices yet to be conceptualised. The report is complementary to a spreadsheet-based cost estimating model. The latter permits users to derive capital cost estimates using either inherent default data or user-provided data, if a particular scheme provides sufficient design definition for more accurate estimation. The model relies on design default data obtained from wave energy device designs and a set of specifically collected cost data. (author)

  8. One-dimensional reactor kinetics model for RETRAN

    International Nuclear Information System (INIS)

    Gose, G.C.; Peterson, C.E.; Ellis, N.L.; McClure, J.A.

    1981-01-01

    Previous versions of RETRAN have had only a point kinetics model to describe the reactor core behavior during thermal-hydraulic transients. The principal assumption in deriving the point kinetics model is that the neutron flux may be separated into a time-dependent amplitude function and a time-independent shape function. Certain types of transients cannot be correctly analyzed under this assumption, since proper definitions for core average quantities such as reactivity or lifetime include the inner product of the adjoint flux with the perturbed flux. A one-dimensional neutronics model has been included in a preliminary version of RETRAN-02. The ability to account for flux shape changes will permit an improved representation of the thermal and hydraulic feedback effects. This paper describes the neutronics model and discusses some of the analyses

  9. Theoretical Modelling of Intercultural Communication Process

    Directory of Open Access Journals (Sweden)

    Mariia Soter

    2016-08-01

    Full Text Available The definitions of the concepts of “communication”, “intercultural communication” and “model of communication” are analyzed in the article. The basic components of the communication process are singled out. A model of intercultural communication is developed. Communicative, behavioral and complex skills for the optimal organization of intercultural communication, the establishment of productive contact with a foreign partner to achieve mutual understanding, and the search for acceptable ways of organizing interaction and cooperation for both communicants are highlighted in the article. It is noted that intercultural communication, through interaction between people, affects the development of various aspects of the cultures involved.

  10. Bayesian models a statistical primer for ecologists

    CERN Document Server

    Hobbs, N Thompson

    2015-01-01

    Bayesian modeling has become an indispensable tool for ecological research because it is uniquely suited to deal with complexity in a statistically coherent way. This textbook provides a comprehensive and accessible introduction to the latest Bayesian methods-in language ecologists can understand. Unlike other books on the subject, this one emphasizes the principles behind the computations, giving ecologists a big-picture understanding of how to implement this powerful statistical approach. Bayesian Models is an essential primer for non-statisticians. It begins with a definition of probabili

  11. Modelling of crustal rock mechanics for radioactive waste storage in Fennoscandia - problem definition

    International Nuclear Information System (INIS)

    Stephansson, O.

    1987-05-01

    Existing knowledge of crustal stresses for Fennoscandia is presented. Generic, two-dimensional models are proposed for vertical and planar sections of a traverse having a direction NW-SE in Northern Fennoscandia. The proposed traverse will include the major neotectonic structures at Lansjaerv and Paervie, respectively, and also the study site for storage of spent nuclear fuel at Kamlunge. The influence of glaciation, deglaciation and glacial rebound on crustal rock mechanics and stability is studied for the modelling work. Global models, with a length of roughly 100 km, will increase our overall understanding of the change in stresses and deformations. These can provide boundary conditions for regional and near-field models. Properties of strength and stiffness of intact granitic rock masses, faults and joints are considered in the modelling of the crustal rock mechanics for any of the three models described. (orig./HP)

  12. Computer models versus reality: how well do in silico models currently predict the sensitization potential of a substance.

    Science.gov (United States)

    Teubner, Wera; Mehling, Anette; Schuster, Paul Xaver; Guth, Katharina; Worth, Andrew; Burton, Julien; van Ravenzwaay, Bennard; Landsiedel, Robert

    2013-12-01

    National legislations for the assessment of the skin sensitization potential of chemicals are increasingly based on the globally harmonized system (GHS). In this study, experimental data on 55 non-sensitizing and 45 sensitizing chemicals were evaluated according to GHS criteria and used to test the performance of computer (in silico) models for the prediction of skin sensitization. Statistical models (Vega, Case Ultra, TOPKAT), mechanistic models (Toxtree, OECD (Q)SAR toolbox, DEREK) and a hybrid model (TIMES-SS) were evaluated. Between three and nine of the substances evaluated were found in the individual training sets of the various models. Mechanism-based models performed better than statistical models and gave better predictivities depending on the stringency of the domain definition. The best performance was achieved by TIMES-SS, with a perfect prediction, although only 16% of the substances were within its reliability domain. Some models offer modules for potency; however, predictions did not correlate well with the GHS sensitization subcategory derived from the experimental data. In conclusion, although mechanistic models can be used to a certain degree under well-defined conditions, at present the in silico models are not sufficiently accurate for broad application to predict skin sensitization potentials. Copyright © 2013 Elsevier Inc. All rights reserved.

  13. A 3D elasto-plastic soil model for lateral buckling analysis

    DEFF Research Database (Denmark)

    Hededal, Ole; Strandgaard, Torsten

    2008-01-01

    Modeling the lay-down of pipelines and subsequently the in- service conditions for a pipeline involves definition of a pipe-soil interaction model. A generalized true 3D elasto-plastic spring element based on an anisotropic hardening/degradation model for sliding is presented. The basis...... for the model is the elasto-plastic framework. A generic format is selected, allowing different yield criteria and flow rules to be implemented in a simple way. The model complies to a finite element format allowing it to be directly implemented into a standard finite element code. Examples demonstrating...

  14. Engineering Model of High Pressure Moist Air

    Directory of Open Access Journals (Sweden)

    Hyhlík Tomáš

    2017-01-01

    Full Text Available The article deals with the moist air equation of state. Several equations of state are discussed in the article, i.e. the model of an ideal mixture of ideal gases, the model of an ideal mixture of real gases and the model based on the virial equation of state. The evaluation of sound speed based on the ideal mixture concept is mentioned. The sound speed calculated by the model of an ideal mixture of ideal gases is compared with the sound speed calculated using the model based on the concept of an ideal mixture of real gases. A comparison of enthalpy and entropy based on the model of an ideal mixture of ideal gases and the model of an ideal mixture of real gases is performed. It is shown that the model of an ideal mixture of real gases deviates from the model of an ideal mixture of ideal gases only in the case of high pressure. The impossibility of defining partial pressure in a mixture of real gases, where the virial equation of state is used, is discussed.
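    The sound speed evaluation mentioned can be sketched for the simplest of the three models, the ideal mixture of ideal gases, where c = sqrt(gamma R T / M) with mole-fraction-weighted molar mass and heat capacity. The cp values below are rough near-ambient textbook numbers, not the paper's data.

```python
import math

R_GAS = 8.314462618  # J/(mol K), universal gas constant

def mixture_sound_speed(T, x_water):
    """Sound speed of moist air modeled as an ideal mixture of ideal gases:
    c = sqrt(gamma * R * T / M), with molar mass M and heat capacity cp
    averaged over mole fractions (x_water = mole fraction of water vapour)."""
    M = (1.0 - x_water) * 0.028964 + x_water * 0.018015   # kg/mol
    cp = (1.0 - x_water) * 29.1 + x_water * 33.6          # J/(mol K), approximate
    cv = cp - R_GAS                                       # ideal-gas relation cp - cv = R
    gamma = cp / cv
    return math.sqrt(gamma * R_GAS * T / M)

print(mixture_sound_speed(293.15, 0.00))  # dry air at 20 C, close to the textbook ~343 m/s
print(mixture_sound_speed(293.15, 0.02))  # humid air: lighter mixture, slightly faster
```

    The humid value exceeds the dry one because water vapour lowers the mean molar mass more than it lowers gamma, which is the qualitative behaviour the ideal-mixture model predicts.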

  15. Models for assessing and managing credit risk

    Directory of Open Access Journals (Sweden)

    Neogradi Slađana

    2014-01-01

    Full Text Available This essay deals with the definition of a model for assessing and managing credit risk. Risk is an inseparable component of any average and normal credit transaction. We look at the different aspects of the identification and classification of risk in the banking industry, as well as at the key components of modern risk management. The first part of the essay analyzes how credit risk affects a bank and presents empirical models for detecting the financial difficulties in which a company may find itself; on the basis of these models, a bank can reduce the number of risky assets it approves. In the second part, we consider models for improving credit risk management, with emphasis on Basel I, II and III, and in the third part, we conclude which model is most appropriate and gives the best results for measuring credit risk in domestic banks.
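    The abstract does not name the empirical distress models it uses; a classical example of the kind it describes is Altman's (1968) Z-score, a linear score over five balance-sheet ratios, sketched here purely for illustration.

```python
def altman_z(wc_ta, re_ta, ebit_ta, mve_tl, sales_ta):
    """Altman's (1968) Z-score: a linear discriminant over five ratios
    (working capital, retained earnings, EBIT, market value of equity,
    and sales, each scaled by total assets or total liabilities)."""
    return (1.2 * wc_ta + 1.4 * re_ta + 3.3 * ebit_ta
            + 0.6 * mve_tl + 1.0 * sales_ta)

# Conventional zones: Z > 2.99 "safe", Z < 1.81 "distress", otherwise grey.
z = altman_z(0.2, 0.25, 0.15, 1.1, 1.3)  # hypothetical firm's ratios
zone = "safe" if z > 2.99 else ("distress" if z < 1.81 else "grey")
print(z, zone)
```

    A bank screening loan applicants with such a score can decline or re-price those in the distress zone, which is the reduction of risky approved assets the essay refers to.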

  16. Emergent randomness in the Jaynes-Cummings model

    International Nuclear Information System (INIS)

    Garraway, B M; Stenholm, S

    2008-01-01

    We consider the well-known Jaynes-Cummings model and ask if it can display randomness. As a solvable Hamiltonian system, it does not display chaotic behaviour in the ordinary sense. Here, however, we look at the distribution of values taken up during the total time evolution. This evolution is determined by the eigenvalues distributed as the square roots of integers and leads to a seemingly erratic behaviour. That this may display a random Gaussian value distribution is suggested by an exactly provable result by Kac. In order to reach our conclusion we use the Kac model to develop tests for the emergence of a Gaussian. Even if the consequent double limits are difficult to evaluate numerically, we find definite indications that the Jaynes-Cummings case also produces a randomness in its value distributions. Numerical methods do not establish such a result beyond doubt, but our conclusions are definite enough to suggest strongly an unexpected randomness emerging in a dynamic time evolution
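    The value distribution in question can be sampled directly from the standard Jaynes-Cummings atomic inversion for an initially coherent field, whose Rabi frequencies are proportional to square roots of integers. The window and parameters below are illustrative; this reproduces the flavour of the test, not the paper's Kac-based analysis.

```python
import math
import random

def jc_inversion(t, nbar=10.0, g=1.0, nmax=60):
    """Atomic inversion of the Jaynes-Cummings model for an initially
    coherent field: W(t) = sum_n p_n cos(2 g sqrt(n+1) t), with p_n a
    Poisson distribution of mean nbar. The sqrt(n+1) Rabi frequencies are
    the square-roots-of-integers spectrum behind the erratic behaviour."""
    w, p = 0.0, math.exp(-nbar)                  # p starts at p_0
    for n in range(nmax):
        w += p * math.cos(2.0 * g * math.sqrt(n + 1.0) * t)
        p *= nbar / (n + 1.0)                    # advance Poisson weight to p_{n+1}
    return w

# Sample the value distribution at long times, well past the initial collapse:
rng = random.Random(0)
samples = [jc_inversion(100.0 + 900.0 * rng.random()) for _ in range(4000)]
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
print(round(mean, 3), round(var, 4))  # mean near zero; variance near sum_n p_n^2 / 2
```

    Whether the resulting histogram is Gaussian is exactly the question the paper addresses with Kac-style tests; the sketch only produces the samples one would feed into such a test.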

  17. An object-oriented approach to energy-economic modeling

    Energy Technology Data Exchange (ETDEWEB)

    Wise, M.A.; Fox, J.A.; Sands, R.D.

    1993-12-01

    In this paper, the authors discuss their experiences in creating an object-oriented economic model of the U.S. energy and agriculture markets. After a discussion of some central concepts, they provide an overview of the model, focusing on the methodology of designing an object-oriented class hierarchy specification based on standard microeconomic production functions. The evolution of the model from the class definition stage to programming it in C++, a standard object-oriented programming language, is detailed. The authors then discuss the main differences between writing the object-oriented program and a procedure-oriented program of the same model. Finally, they conclude with a discussion of the advantages and limitations of the object-oriented approach, based on their experience in building energy-economic models with procedure-oriented approaches and languages.
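    The design the authors describe, a class hierarchy whose leaves are standard microeconomic production functions behind a common interface, can be sketched briefly. This is a generic illustration in Python rather than the paper's C++ code; all names are hypothetical.

```python
class ProductionSector:
    """Base class of the hierarchy: a sector turns input quantities into
    one output. The rest of the market model only sees this interface."""
    def __init__(self, name, inputs):
        self.name, self.inputs = name, inputs

    def output(self, quantities):
        raise NotImplementedError

class CobbDouglasSector(ProductionSector):
    """Cobb-Douglas technology: q = scale * prod_i x_i ** alpha_i."""
    def __init__(self, name, alphas, scale=1.0):
        super().__init__(name, sorted(alphas))
        self.alphas, self.scale = alphas, scale

    def output(self, quantities):
        q = self.scale
        for inp, a in self.alphas.items():
            q *= quantities[inp] ** a
        return q

refinery = CobbDouglasSector("refinery", {"crude": 0.7, "labor": 0.3})
print(refinery.output({"crude": 100.0, "labor": 50.0}))  # = 100**0.7 * 50**0.3
```

    Swapping in another subclass (CES, Leontief, and so on) changes the technology without touching the market-clearing code, which is the modularity benefit the paper attributes to the object-oriented approach.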

  18. Architecture oriented modeling and simulation method for combat mission profile

    Directory of Open Access Journals (Sweden)

    CHEN Xia

    2017-05-01

    Full Text Available In order to effectively analyze the system behavior and system performance of a combat mission profile, an architecture-oriented modeling and simulation method is proposed. Starting from architecture modeling, this paper describes the mission profile based on the definitions from the National Military Standard of China and the US Department of Defense Architecture Framework (DoDAF) model, and constructs the architecture model of the mission profile. Then the transformation relationship between the architecture model and the agent simulation model is proposed to form an executable model of the mission profile. Finally, taking the air-defense mission profile as an example, the agent simulation model is established based on the architecture model, and the input and output relations of the simulation model are analyzed. This provides method guidance for combat mission profile design.

  19. 78 FR 79579 - Energy Conservation Program: Alternative Efficiency Determination Methods, Basic Model Definition...

    Science.gov (United States)

    2013-12-31

    ... features to be excluded from certification, verification, and enforcement testing as long as specific... [table fragment: number of units per equipment class that must be tested] Self-Contained Open Refrigerators: 2 basic models; Self-Contained Open Freezers: 2 basic models; Self-Contained Closed Refrigerators: 2 basic models; Self-Contained...

  20. Uncertainty Quantification in Geomagnetic Field Modeling

    Science.gov (United States)

    Chulliat, A.; Nair, M. C.; Alken, P.; Meyer, B.; Saltus, R.; Woods, A.

    2017-12-01

    Geomagnetic field models are mathematical descriptions of the various sources of the Earth's magnetic field, and are generally obtained by solving an inverse problem. They are widely used in research to separate and characterize field sources, but also in many practical applications such as aircraft and ship navigation, smartphone orientation, satellite attitude control, and directional drilling. In recent years, more sophisticated models have been developed, thanks to the continuous availability of high quality satellite data and to progress in modeling techniques. Uncertainty quantification has become an integral part of model development, both to assess the progress made and to address specific users' needs. Here we report on recent advances made by our group in quantifying the uncertainty of geomagnetic field models. We first focus on NOAA's World Magnetic Model (WMM) and the International Geomagnetic Reference Field (IGRF), two reference models of the main (core) magnetic field produced every five years. We describe the methods used in quantifying the model commission error as well as the omission error attributed to various un-modeled sources such as magnetized rocks in the crust and electric current systems in the atmosphere and near-Earth environment. A simple error model was derived from this analysis, to facilitate usage in practical applications. We next report on improvements brought by combining a main field model with a high resolution crustal field model and a time-varying, real-time external field model, like in NOAA's High Definition Geomagnetic Model (HDGM). The obtained uncertainties are used by the directional drilling industry to mitigate health, safety and environment risks.

  1. Dark energy observational evidence and theoretical models

    CERN Document Server

    Novosyadlyj, B; Shtanov, Yu; Zhuk, A

    2013-01-01

    The book elucidates the current state of the dark energy problem and presents the results of the authors, who work in this area. It describes the observational evidence for the existence of dark energy, the methods and results of constraining its parameters, the modeling of dark energy by scalar fields, space-times with extra spatial dimensions, especially Kaluza-Klein models, and braneworld models with a single extra dimension, as well as the problems of the positive definiteness of gravitational energy in General Relativity, energy conditions and the consequences of their violation in the presence of dark energy. This monograph is intended for science professionals, educators and graduate students specializing in general relativity, cosmology, field theory and particle physics.

  2. Flexible Bayesian Dynamic Modeling of Covariance and Correlation Matrices

    KAUST Repository

    Lan, Shiwei; Holbrook, Andrew; Fortin, Norbert J.; Ombao, Hernando; Shahbaba, Babak

    2017-01-01

    Modeling covariance (and correlation) matrices is a challenging problem due to the large dimensionality and positive-definiteness constraint. In this paper, we propose a novel Bayesian framework based on decomposing the covariance matrix
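One common way to handle the positive-definiteness constraint, in the spirit of the decomposition this line of work builds on, is to separate the covariance into marginal standard deviations and a correlation matrix. The sketch below is illustrative only and is not the paper's specific Bayesian construction; the matrix entries are invented.

```python
import numpy as np

# Variance-correlation decomposition: Sigma = D R D, with D diagonal
# (standard deviations) and R a correlation matrix with unit diagonal.
sigma = np.array([[4.0, 1.2, 0.6],
                  [1.2, 9.0, 2.4],
                  [0.6, 2.4, 1.0]])

stds = np.sqrt(np.diag(sigma))          # D: marginal standard deviations
corr = sigma / np.outer(stds, stds)     # R: correlation matrix

# Reassemble and check positive definiteness via a Cholesky factorization,
# which raises LinAlgError if sigma is not positive definite.
reassembled = np.outer(stds, stds) * corr
np.linalg.cholesky(sigma)
```

Separating scale (D) from dependence (R) lets priors be placed on each part independently, which is one motivation for decompositions of this kind.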

  3. Models of sequential decision making in consumer lending

    OpenAIRE

    Kanshukan Rajaratnam; Peter A. Beling; George A. Overstreet

    2016-01-01

Abstract In this paper, we introduce models of sequential decision making in consumer lending. From the definition of adverse selection in static lending models, we show that homogeneous borrowers take up offers at different instances of time when faced with a sequence of loan offers. We postulate that bounded rationality and diverse decision heuristics used by consumers drive the decisions they make about credit offers. Under that postulate, we show how observation of early decisions in a seq...

  4. Crisis and emergency risk communication as an integrative model.

    Science.gov (United States)

    Reynolds, Barbara; W Seeger, Matthew

    2005-01-01

    This article describes a model of communication known as crisis and emergency risk communication (CERC). The model is outlined as a merger of many traditional notions of health and risk communication with work in crisis and disaster communication. The specific kinds of communication activities that should be called for at various stages of disaster or crisis development are outlined. Although crises are by definition uncertain, equivocal, and often chaotic situations, the CERC model is presented as a tool health communicators can use to help manage these complex events.

  5. Working Group 2: A critical appraisal of model simulations

    International Nuclear Information System (INIS)

    MacCracken, M.; Cubasch, U.; Gates, W.L.; Harvey, L.D.; Hunt, B.; Katz, R.; Lorenz, E.; Manabe, S.; McAvaney, B.; McFarlane, N.; Meehl, G.; Meleshko, V.; Robock, A.; Stenchikov, G.; Stouffer, R.; Wang, W.C.; Washington, W.; Watts, R.; Zebiak, S.

    1990-01-01

The complexity of the climate system and the absence of definitive analogs to the evolving climatic situation force the use of theoretical models to project the future climatic influence of the relatively rapid, ongoing increase in the atmospheric concentrations of CO2 and other trace gases. A wide variety of climate models has been developed, with differing mixes of complexity and resource requirements, to study particular aspects of the problem; all such models have contributed insights into it

  6. Elements for modeling and design of centrifugal compressor housings

    International Nuclear Information System (INIS)

    Magoia, J.E.; Calderon, T.

    1990-01-01

Various aspects of the structural analysis of centrifugal compressor housings, commonly used in different kinds of nuclear facilities, are studied. Several areas of the analysis are evaluated with elastic finite element models: sensitivity to different variables, and the quality of the models when compared against theoretical solutions and actual measurements. The development of an eccentric bar element, refined for the stiffened-plate model, is included. The work concludes with criteria for a more efficient structural analysis and recommendations for the design of centrifugal compressor housings. (Author) [es

  7. Stochastic Modelling Of The Repairable System

    Directory of Open Access Journals (Sweden)

    Andrzejczak Karol

    2015-11-01

    Full Text Available All reliability models consisting of random time factors form stochastic processes. In this paper we recall the definitions of the most common point processes which are used for modelling of repairable systems. Particularly this paper presents stochastic processes as examples of reliability systems for the support of the maintenance related decisions. We consider the simplest one-unit system with a negligible repair or replacement time, i.e., the unit is operating and is repaired or replaced at failure, where the time required for repair and replacement is negligible. When the repair or replacement is completed, the unit becomes as good as new and resumes operation. The stochastic modelling of recoverable systems constitutes an excellent method of supporting maintenance related decision-making processes and enables their more rational use.
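The one-unit system described above, with negligible repair time and as-good-as-new repair, is a renewal process: failure epochs are separated by i.i.d. lifetimes. A minimal simulation sketch, assuming exponential lifetimes for illustration (the paper treats general point processes):

```python
import random

def simulate_failures(mean_life, horizon, seed=0):
    """Simulate failure epochs of a one-unit repairable system.

    Lifetimes are drawn i.i.d. exponential (a hypothetical choice).
    Repair/replacement time is negligible and restores the unit to
    as-good-as-new, so the failure epochs form a renewal process.
    """
    rng = random.Random(seed)
    t, epochs = 0.0, []
    while True:
        t += rng.expovariate(1.0 / mean_life)   # draw the next lifetime
        if t > horizon:
            break
        epochs.append(t)                        # instantaneous renewal
    return epochs

failures = simulate_failures(mean_life=100.0, horizon=10_000.0)
rate = len(failures) / 10_000.0  # long-run rate should approach 1/mean_life
```

Counting the simulated failure epochs gives an empirical failure rate, which for a long horizon approaches the reciprocal of the mean lifetime; such estimates are the kind of quantity used to support maintenance decisions.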

  8. The Long Time Behavior of a Stochastic Logistic Model with Infinite Delay and Impulsive Perturbation

    OpenAIRE

    Lu, Chun; Wu, Kaining

    2016-01-01

This paper considers a stochastic logistic model with infinite delay and impulsive perturbation. Firstly, with the space $C_{g}$ as phase space, the definition of solution to a stochastic functional differential equation with infinite delay and impulsive perturbation is established. According to this definition, we show that our model has a unique global positive solution. Then we establish the sufficient and necessary conditions for extinction and stochastic permanence of the...

  9. A multi-scale modeling of surface effect via the modified boundary Cauchy-Born model

    Energy Technology Data Exchange (ETDEWEB)

    Khoei, A.R., E-mail: arkhoei@sharif.edu; Aramoon, A.

    2012-10-01

In this paper, a new multi-scale approach is presented based on the modified boundary Cauchy-Born (MBCB) technique to model the surface effects of nano-structures. The salient point of the MBCB model is the definition of radial quadrature used in the surface elements which is an indicator of material behavior. The characteristics of quadrature are derived by interpolating data from atoms laid in a circular support around the quadrature, in a least-squares sense. The total-Lagrangian formulation is derived for the equivalent continua by employing the Cauchy-Born hypothesis for calculating the strain energy density function of the continua. The numerical results of the proposed method are compared with direct atomistic and finite element simulation results to indicate that the proposed technique provides promising results for modeling surface effects of nano-structures. - Highlights: • A multi-scale approach is presented to model the surface effects in nano-structures. • The total-Lagrangian formulation is derived by employing the Cauchy-Born hypothesis. • The radial quadrature is used to model the material behavior in surface elements. • The quadrature characteristics are derived using the data at the atomistic level.

  10. THE INTERNAL CONTROL MODELS IN ROMANIA

    Directory of Open Access Journals (Sweden)

    TEODORESCU CRISTIAN DRAGOȘ

    2015-06-01

Full Text Available Internal control is indissolubly linked to business and accounting. Throughout history, domestic and international trade has grown exponentially, which has led to an increasing complexity of internal control, and to new methods and techniques to control the business. The literature has presented the first models of internal control in the Sumerian period (3600-3200 BC), and the emergence and development of internal control in Egypt, Persia, the Greek and Roman Empires, and in the Middle Ages through to modern times. The purpose of this article is to present the models of internal control in Romania, starting from the principles of the classical model of internal control (the COSO model). For a better understanding of the implications of internal control in the public and private sectors, I have structured the article in the following parts: (a) the definition of internal control in the literature; (b) the presentation of the COSO model; (c) internal control and internal audit in public institutions; (d) internal control issues in accounting regulations on the individual and consolidated annual financial statements; (e) internal/managerial control; (f) conclusions.

  11. Assessing the capability of CORDEX models in simulating onset of rainfall in West Africa

    Science.gov (United States)

    Mounkaila, Moussa S.; Abiodun, Babatunde J.; `Bayo Omotosho, J.

    2015-01-01

Reliable forecasts of rainfall-onset dates (RODs) are crucial for agricultural planning and food security in West Africa. This study evaluates the ability of nine CORDEX regional climate models (RCMs: ARPEGE, CRCM5, RACMO, RCA35, REMO, RegCM3, PRECIS, CCLM and WRF) in simulating RODs over the region. Four definitions are used to compute RODs, and two observation datasets (GPCP and TRMM) are used in the model evaluation. The evaluation considers how well the RCMs, driven by ERA-Interim reanalysis (ERAIN), simulate the observed mean, standard deviation and inter-annual variability of RODs over West Africa. It also investigates how well the models link RODs with the northward movement of the monsoon system over the region. The model performances are compared to that of the driving reanalysis—ERAIN. Observations show that the mean RODs in West Africa have a zonal distribution, and the dates increase from the Guinea coast northward. ERAIN fails to reproduce the spatial distribution of the RODs as observed. The performance of some RCMs in simulating the RODs depends on the ROD definition used. For instance, ARPEGE, RACMO, PRECIS and CCLM produce a better ROD distribution than that of ERAIN when three of the ROD definitions are used, but give a worse ROD distribution than that of ERAIN when the fourth definition is used. However, regardless of the definition used, CRCM5, RCA35, REMO, RegCM3 and WRF show a remarkable improvement over ERAIN. The study shows that the ability of the RCMs in simulating RODs over West Africa strongly depends on how well the models reproduce the northward movement of the monsoon system and the associated features. The results show that there are some differences in the RODs obtained between the two observation datasets and RCMs, and the differences are magnified by differences in the ROD definitions. However, the study shows that most CORDEX RCMs have remarkable skills in predicting the RODs in West Africa.
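The abstract does not reproduce the four ROD definitions it compares, but definitions of this family are typically wet-spell criteria with a false-onset check. The sketch below implements one hypothetical definition of that kind; all thresholds are illustrative assumptions, not the study's actual definitions.

```python
def rainfall_onset_day(daily_mm, wet_total=20.0, window=5, dry_thresh=1.0,
                       dry_run=7, lookahead=30):
    """Return the 0-based index of the rainfall-onset day, or None.

    Hypothetical definition: onset is the first wet day starting a
    `window`-day spell accumulating at least `wet_total` mm, with no dry
    spell of `dry_run` consecutive days (< `dry_thresh` mm each) within
    the following `lookahead` days (a false-onset check).
    """
    n = len(daily_mm)
    for d in range(n - window + 1):
        if daily_mm[d] < dry_thresh or sum(daily_mm[d:d + window]) < wet_total:
            continue
        run, false_onset = 0, False
        for r in daily_mm[d + window:d + window + lookahead]:
            run = run + 1 if r < dry_thresh else 0
            if run >= dry_run:
                false_onset = True
                break
        if not false_onset:
            return d
    return None
```

Because different threshold choices shift the detected date, running several such definitions over the same series illustrates why the study's results depend on the ROD definition used.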

  12. Implementation of the Business Process Modelling Notation (BPMN) in the modelling of anatomic pathology processes.

    Science.gov (United States)

    Rojo, Marcial García; Rolón, Elvira; Calahorra, Luis; García, Felix Oscar; Sánchez, Rosario Paloma; Ruiz, Francisco; Ballester, Nieves; Armenteros, María; Rodríguez, Teresa; Espartero, Rafael Martín

    2008-07-15

Process orientation is one of the essential elements of quality management systems, including those in use in healthcare. Business processes in hospitals are very complex and variable. BPMN (Business Process Modelling Notation) is a user-oriented language specifically designed for the modelling of business (organizational) processes. We are not aware of previous uses of this notation for process modelling within Pathology, in Spain or elsewhere. We present our experience in the elaboration of conceptual models of Pathology processes, as part of a global programmed surgical patient process, using BPMN. With the objective of analyzing the use of BPMN notation in real cases, a multidisciplinary work group was created, including software engineers from the Dep. of Technologies and Information Systems from the University of Castilla-La Mancha and health professionals and administrative staff from the Hospital General de Ciudad Real. The collaborative work was carried out in six phases: informative meetings, intensive training, process selection, definition of the work method, process description by hospital experts, and process modelling. The modelling of the processes of Anatomic Pathology is presented using BPMN. The presented subprocesses are those corresponding to the surgical pathology examination of samples coming from the operating theatre, including the planning and realization of frozen studies. The modelling of Anatomic Pathology subprocesses has allowed the creation of an understandable graphical model, where management and improvements are more easily implemented by health professionals.

  13. Development of flexible process-centric web applications: An integrated model driven approach

    NARCIS (Netherlands)

    Bernardi, M.L.; Cimitile, M.; Di Lucca, G.A.; Maggi, F.M.

    2012-01-01

    In recent years, Model Driven Engineering (MDE) approaches have been proposed and used to develop and evolve WAs. However, the definition of appropriate MDE approaches for the development of flexible process-centric WAs is still limited. In particular, (flexible) workflow models have never been

  14. User-owned utility models for rural electrification

    Energy Technology Data Exchange (ETDEWEB)

    Waddle, D.

    1997-12-01

The author discusses the history of rural electric cooperatives (REC) in the United States, and the broader question of whether such organizations can serve as a model for rural electrification in other countries. The author points out the features of such cooperatives which have given them stability and strength, and emphasizes that, for such programs to succeed, many of these same features must be present. He argues that the cooperative model is not outdated, but that it needs strong local support and a governmental structure that is supportive, or at the least not obstructive.

  15. Verification-Driven Slicing of UML/OCL Models

    DEFF Research Database (Denmark)

    Shaikh, Asadullah; Clarisó Viladrosa, Robert; Wiil, Uffe Kock

    2010-01-01

    computational complexity can limit their scalability. In this paper, we consider a specific static model (UML class diagrams annotated with unrestricted OCL constraints) and a specific property to verify (satisfiability, i.e., “is it possible to create objects without violating any constraint?”). Current...... approaches to this problem have an exponential worst-case runtime. We propose a technique to improve their scalability by partitioning the original model into submodels (slices) which can be verified independently and where irrelevant information has been abstracted. The definition of the slicing procedure...

  16. Functional Decomposition of Modeling and Simulation Terrain Database Generation Process

    National Research Council Canada - National Science Library

    Yakich, Valerie R; Lashlee, J. D

    2008-01-01

    .... This report documents the conceptual procedure as implemented by Lockheed Martin Simulation, Training, and Support and decomposes terrain database construction using the Integration Definition for Function Modeling (IDEF...

  17. Scientific white paper on concentration-QTc modeling.

    Science.gov (United States)

    Garnett, Christine; Bonate, Peter L; Dang, Qianyu; Ferber, Georg; Huang, Dalong; Liu, Jiang; Mehrotra, Devan; Riley, Steve; Sager, Philip; Tornoe, Christoffer; Wang, Yaning

    2018-06-01

    The International Council for Harmonisation revised the E14 guideline through the questions and answers process to allow concentration-QTc (C-QTc) modeling to be used as the primary analysis for assessing the QTc interval prolongation risk of new drugs. A well-designed and conducted QTc assessment based on C-QTc modeling in early phase 1 studies can be an alternative approach to a thorough QT study for some drugs to reliably exclude clinically relevant QTc effects. This white paper provides recommendations on how to plan and conduct a definitive QTc assessment of a drug using C-QTc modeling in early phase clinical pharmacology and thorough QT studies. Topics included are: important study design features in a phase 1 study; modeling objectives and approach; exploratory plots; the pre-specified linear mixed effects model; general principles for model development and evaluation; and expectations for modeling analysis plans and reports. The recommendations are based on current best modeling practices, scientific literature and personal experiences of the authors. These recommendations are expected to evolve as their implementation during drug development provides additional data and with advances in analytical methodology.
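As a rough illustration of the analysis described, the sketch below fits a simplified fixed-effects version of the linear C-QTc model (the white paper's pre-specified model is a linear mixed-effects model with subject-level random effects, which this sketch does not include). The concentration and ΔQTc data are invented for illustration.

```python
import numpy as np

# Hypothetical data: plasma concentrations (ng/mL) and baseline-corrected
# QTc changes (ms) pooled across subjects.
conc = np.array([0.0, 50.0, 100.0, 200.0, 400.0, 800.0])
dqtc = np.array([0.5, 1.0, 2.1, 3.9, 8.2, 16.1])

# Ordinary least squares for delta_QTc = intercept + slope * concentration.
X = np.column_stack([np.ones_like(conc), conc])
beta, *_ = np.linalg.lstsq(X, dqtc, rcond=None)
intercept, slope = beta

# Point prediction of the QTc effect at a high clinically relevant
# concentration; in practice the upper confidence bound of this quantity
# is what is compared against the regulatory threshold of concern.
pred_at_cmax = intercept + slope * 800.0
```

A full analysis would add subject random effects, confidence intervals for the slope, and model evaluation diagnostics, as the white paper recommends.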

  18. Stochastic approaches to inflation model building

    International Nuclear Information System (INIS)

    Ramirez, Erandy; Liddle, Andrew R.

    2005-01-01

    While inflation gives an appealing explanation of observed cosmological data, there are a wide range of different inflation models, providing differing predictions for the initial perturbations. Typically models are motivated either by fundamental physics considerations or by simplicity. An alternative is to generate large numbers of models via a random generation process, such as the flow equations approach. The flow equations approach is known to predict a definite structure to the observational predictions. In this paper, we first demonstrate a more efficient implementation of the flow equations exploiting an analytic solution found by Liddle (2003). We then consider alternative stochastic methods of generating large numbers of inflation models, with the aim of testing whether the structures generated by the flow equations are robust. We find that while typically there remains some concentration of points in the observable plane under the different methods, there is significant variation in the predictions amongst the methods considered

  19. Exclusion statistics and integrable models

    International Nuclear Information System (INIS)

    Mashkevich, S.

    1998-01-01

The definition of exclusion statistics, as given by Haldane, allows for a statistical interaction between distinguishable particles (multi-species statistics). The thermodynamic quantities for such statistics can be evaluated exactly. The explicit expressions for the cluster coefficients are presented. Furthermore, single-species exclusion statistics is realized in one-dimensional integrable models. The interesting questions of generalizing this correspondence to the higher-dimensional and multi-species cases remain essentially open

  20. A discussion on validation of hydrogeological models

    International Nuclear Information System (INIS)

    Carrera, J.; Mousavi, S.F.; Usunoff, E.J.; Sanchez-Vila, X.; Galarza, G.

    1993-01-01

    Groundwater flow and solute transport are often driven by heterogeneities that elude easy identification. It is also difficult to select and describe the physico-chemical processes controlling solute behaviour. As a result, definition of a conceptual model involves numerous assumptions both on the selection of processes and on the representation of their spatial variability. Validating a numerical model by comparing its predictions with actual measurements may not be sufficient for evaluating whether or not it provides a good representation of 'reality'. Predictions will be close to measurements, regardless of model validity, if these are taken from experiments that stress well-calibrated model modes. On the other hand, predictions will be far from measurements when model parameters are very uncertain, even if the model is indeed a very good representation of the real system. Hence, we contend that 'classical' validation of hydrogeological models is not possible. Rather, models should be viewed as theories about the real system. We propose to follow a rigorous modeling approach in which different sources of uncertainty are explicitly recognized. The application of one such approach is illustrated by modeling a laboratory uranium tracer test performed on fresh granite, which was used as Test Case 1b in INTRAVAL. (author)

  1. Model Hadron asymptotic behaviour

    International Nuclear Information System (INIS)

    Kralchevsky, P.; Nikolov, A.

    1983-01-01

The work is devoted to the problem of solving a set of asymptotic equations describing the model hadron interaction. More specifically, an iterative procedure consisting of two stages is proposed, and the first stage is exhaustively studied here. The principle of contracting transformations is applied for this purpose. Under rather general and natural assumptions, solutions are found in a series of metric spaces suitable for physical applications; in each of these spaces the solution is unique. (authors)

  2. Comparing Two Definitions of Work for a Biological Quantum Heat Engine

    International Nuclear Information System (INIS)

    Xu You-Yang; Zhao Shun-Cai; Liu Juan

    2015-01-01

    Systems of photosynthetic reaction centres have been modelled as heat engines, while it has also been reported that the efficiency and power of such heat engines can be enhanced by quantum interference — a trait that has attracted much interest. We compare two definitions of the work of such a photosynthetic heat engine, i.e. definition A used by Weimer et al. and B by Dorfman et al. We also introduce a coherent interaction between donor and acceptor (CIDA) to demonstrate a reversible energy transport. We show that these two definitions of work can impart contradictory results, that is, CIDA enhances the power and efficiency of the photosynthetic heat engine with definition B but not with A. Additionally, we find that both reversible and irreversible excitation-energy transport can be described with definition A, but definition B can only model irreversible transport. As a result, we conclude that definition A is more suitable for photosynthetic systems than definition B. (paper)

  3. Model dependence and its effect on ensemble projections in CMIP5

    Science.gov (United States)

    Abramowitz, G.; Bishop, C.

    2013-12-01

Conceptually, the notion of model dependence within climate model ensembles is relatively simple - modelling groups share a literature base, parametrisations, data sets and even model code - so the potential for dependence in sampling different climate futures is clear. How, though, can this conceptual problem inform a practical solution that demonstrably improves the ensemble mean and ensemble variance as an estimate of system uncertainty? While some research has already focused on error correlation or error covariance as a candidate to improve ensemble mean estimates, a complete definition of independence must at least implicitly subscribe to an ensemble interpretation paradigm, such as the 'truth-plus-error', 'indistinguishable', or more recently 'replicate Earth' paradigm. Using a definition of model dependence based on error covariance within the replicate Earth paradigm, this presentation will show that accounting for dependence in surface air temperature gives cooler projections in CMIP5 - by as much as 20% globally in some RCPs - although results differ significantly for each RCP, especially regionally. That accounting for dependence changes projections by different amounts in different RCPs is not an inconsistent result. Different numbers of submissions to each RCP by different modelling groups mean that differences in projections from different RCPs are not entirely about RCP forcing conditions - they also reflect different sampling strategies.
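An error-covariance-based treatment of dependence can be sketched in a few lines: given an inter-model error covariance matrix estimated against observations, the weights that minimise the variance of the weighted-mean error down-weight near-duplicate models. The 3-model covariance matrix below is hypothetical, and this is an illustration of the general idea rather than the presentation's exact procedure.

```python
import numpy as np

# Hypothetical inter-model error covariance: models 1 and 2 share most
# of their errors, while model 3 is nearly independent of both.
A = np.array([[1.0, 0.8, 0.1],
              [0.8, 1.0, 0.1],
              [0.1, 0.1, 1.0]])

# Minimum-error-variance weights are proportional to the row sums of the
# inverse of A, normalised to sum to 1.
ones = np.ones(A.shape[0])
w = np.linalg.solve(A, ones)
w /= w @ ones
```

With these weights, the two near-duplicate models effectively split one model's worth of weight, while the more independent model is weighted up relative to a naive equal-weighted mean.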

  4. Theoretical Biology and Medical Modelling: ensuring continued growth and future leadership.

    Science.gov (United States)

    Nishiura, Hiroshi; Rietman, Edward A; Wu, Rongling

    2013-07-11

    Theoretical biology encompasses a broad range of biological disciplines ranging from mathematical biology and biomathematics to philosophy of biology. Adopting a broad definition of "biology", Theoretical Biology and Medical Modelling, an open access journal, considers original research studies that focus on theoretical ideas and models associated with developments in biology and medicine.

  5. Effect of dental technician disparities on the 3-dimensional accuracy of definitive casts.

    Science.gov (United States)

    Emir, Faruk; Piskin, Bulent; Sipahi, Cumhur

    2017-03-01

Studies that evaluated the effect of dental technician disparities on the accuracy of presectioned and postsectioned definitive casts are lacking. The purpose of this in vitro study was to evaluate the accuracy of presectioned and postsectioned definitive casts fabricated by different dental technicians by using a 3-dimensional computer-aided measurement method. An arch-shaped metal master model consisting of 5 abutments resembling prepared mandibular incisors, canines, and first molars and with a 6-degree total angle of convergence was designed and fabricated by computer-aided design and computer-aided manufacturing (CAD-CAM) technology. Complete arch impressions were made (N=110) from the master model using polyvinyl siloxane (PVS) and delivered to 11 dental technicians. Each technician fabricated 10 definitive casts with dental stone, and the obtained casts were numbered. All casts were sectioned, and removable dies were obtained. The master model and the presectioned and postsectioned definitive casts were digitized with an extraoral scanner, and the virtual master model and virtual presectioned and postsectioned definitive casts were obtained. All definitive casts were compared with the master model by using computer-aided measurements, and the 3-dimensional accuracy of the definitive casts was determined with best fit alignment and represented in color-coded maps. Differences were analyzed using univariate analyses of variance, and the Tukey honest significant difference post hoc tests were used for multiple comparisons (α=.05). The accuracy of presectioned and postsectioned definitive casts was significantly affected by dental technician differences (P<.05). Copyright © 2016 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.

  6. The theory of planned behaviour in medical education: a model for integrating professionalism training.

    Science.gov (United States)

    Archer, Ray; Elder, William; Hustedde, Carol; Milam, Andrea; Joyce, Jennifer

    2008-08-01

    Teaching and evaluating professionalism remain important issues in medical education. However, two factors hinder attempts to integrate curricular elements addressing professionalism into medical school training: there is no common definition of medical professionalism used across medical education, and there is no commonly accepted theoretical model upon which to integrate professionalism into the curriculum. This paper proposes a definition of professionalism, examines this definition in the context of some of the previous definitions of professionalism and connects this definition to the attitudinal roots of professionalism. The problems described above bring uncertainty about the best content and methods with which to teach professionalism in medical education. Although various aspects of professionalism have been incorporated into medical school curricula, content, teaching and evaluation remain controversial. We suggest that intervening variables, which may augment or interfere with medical students' implementation of professionalism knowledge, skills and, therefore, attitudes, may go unaddressed. We offer a model based on the theory of planned behaviour (TPB), which describes the relationships of attitudes, social norms and perceived behavioural control with behaviour. It has been used to predict a wide range of behaviours, including doctor professional behaviours. Therefore, we propose an educational model that expands the TPB as an organisational framework that can integrate professionalism training into medical education. We conclude with a discussion about the implications of using this model to transform medical school curricula to develop positive professionalism attitudes, alter the professionalism social norms of the medical school and increase students' perceived control over their behaviours.

  7. Landau-Lifshitz sigma-models, fermions and the AdS/CFT correspondence

    OpenAIRE

    Stefanski Jr, B.

    2007-01-01

We define Landau-Lifshitz sigma models on general coset space $G/H$, with $H$ a maximal stability sub-group of $G$. These are non-relativistic models that have $G$-valued Noether charges, local $H$ invariance and are classically integrable. Using this definition, we construct the $PSU(2,2|4)/PS(U(2|2)^2)$ Landau-Lifshitz sigma-model. This sigma model describes the thermodynamic limit of the spin-chain Hamiltonian obtained from the complete one-loop dilatation operator of the N=4 super Yang-M...

  8. A Grammatical Approach to the Modeling of an Autonomous Robot

    Directory of Open Access Journals (Sweden)

    Gabriel López-García

    2012-06-01

Full Text Available Virtual Worlds Generator is a grammatical model that is proposed to define virtual worlds. It integrates the diversity of sensors and interaction devices, multimodality and a virtual simulation system. Its grammar allows the scenes of the virtual world to be defined and abstracted as symbol strings, independently of the hardware that is used to represent the world or to interact with it. A case study is presented to explain how to use the proposed model to formalize a robot navigation system with multimodal perception and a hybrid control scheme. The result is an instance of the model grammar that implements the robotic system and is independent of the sensing devices used for perception and interaction. In conclusion, the Virtual Worlds Generator adds value to the simulation of virtual worlds, since the definition can be made formally and independently of the peculiarities of the supporting devices.

  9. A topo-graph model for indistinct target boundary definition from anatomical images.

    Science.gov (United States)

    Cui, Hui; Wang, Xiuying; Zhou, Jianlong; Gong, Guanzhong; Eberl, Stefan; Yin, Yong; Wang, Lisheng; Feng, Dagan; Fulham, Michael

    2018-06-01

It can be challenging to delineate the target object in anatomical imaging when the object boundaries are difficult to discern due to the low contrast or overlapping intensity distributions from adjacent tissues. We propose a topo-graph model to address this issue. The first step is to extract a topographic representation that reflects multiple levels of topographic information in an input image. We then define two types of node connections - nesting branches (NBs) and geodesic edges (GEs). NBs connect nodes corresponding to initial topographic regions and GEs link the nodes at a detailed level. The weights for NBs are defined to measure the similarity of regional appearance, and weights for GEs are defined with geodesic and local constraints. NBs contribute to the separation of topographic regions and the GEs assist the delineation of uncertain boundaries. Final segmentation is achieved by calculating the relevance of the unlabeled nodes to the labels by the optimization of a graph-based energy function. We test our model on 47 low contrast CT studies of patients with non-small cell lung cancer (NSCLC), 10 contrast-enhanced CT liver cases and 50 breast and abdominal ultrasound images. The validation criteria are the Dice similarity coefficient and the Hausdorff distance. Student's t-tests show that our model outperformed the graph models with pixel-only, pixel and regional, neighboring and radial connections (p-values <0.05). Our findings show that the topographic representation and topo-graph model provide improved delineation and separation of objects from adjacent tissues compared to the tested models. Copyright © 2018 Elsevier B.V. All rights reserved.
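The final labeling step described above, computing the relevance of unlabeled nodes to the labels by optimizing a graph-based energy, can be sketched as a harmonic solution on a small weighted graph. The 4-node graph and its weights below are hypothetical stand-ins for a graph whose edges would combine the paper's nesting branches and geodesic edges.

```python
import numpy as np

# Symmetric edge weights on a 4-node graph (hypothetical values).
W = np.array([[0.0, 0.9, 0.1, 0.0],
              [0.9, 0.0, 0.2, 0.1],
              [0.1, 0.2, 0.0, 0.8],
              [0.0, 0.1, 0.8, 0.0]])
L = np.diag(W.sum(axis=1)) - W        # combinatorial graph Laplacian

labels = {0: 1.0, 3: 0.0}             # node 0: object seed, node 3: background seed
lab_idx = sorted(labels)
unl_idx = [i for i in range(W.shape[0]) if i not in labels]
f_lab = np.array([labels[i] for i in lab_idx])

# Minimising sum_ij w_ij (f_i - f_j)^2 with labeled values held fixed
# reduces to solving L_uu f_u = -L_ul f_l for the unlabeled nodes.
f_u = np.linalg.solve(L[np.ix_(unl_idx, unl_idx)],
                      -L[np.ix_(unl_idx, lab_idx)] @ f_lab)
# Unlabeled nodes with f_u > 0.5 are assigned to the object label.
```

In this toy graph the node strongly connected to the object seed receives a relevance near 1, and the node strongly connected to the background seed a relevance near 0, mirroring how the energy optimization propagates label evidence along strong edges.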

  10. Study of seismic data acquisition using physical modeling system; Butsuri model jikken sochi wo mochiita data shutoku gijutsu ni kansuru kento

    Energy Technology Data Exchange (ETDEWEB)

    Tsukui, R; Tsuru, T [Japan National Oil Corp., Tokyo (Japan). Technology Research Center; Matsuoka, T [Japan Petroleum Exploration Corp., Tokyo (Japan)

    1996-10-01

    With the physical modeling system of the Technology Research Center, Japan National Oil Corporation, data acquisition at sea and on land can be simulated using scale models. The system can provide data for verifying data processing and elastic wave simulation algorithms. It can also provide data for deciding experiment specifications, by building a model that simulates the underground structure of a given test field. The model used for the physical modeling system was a gradient multilayer model with a six-layer structure. Depth migration before stacking was conducted using data obtained through two acquisition methods, up-dip acquisition and down-dip acquisition. In the result from the up-dip acquisition data, a definite reflection surface was observed that had not been observed in the processing results of the down-dip acquisition data. 9 figs.

  11. Qualitative and Quantitative Integrated Modeling for Stochastic Simulation and Optimization

    Directory of Open Access Journals (Sweden)

    Xuefeng Yan

    2013-01-01

    Full Text Available The simulation and optimization of an actual physical system are usually constructed based on stochastic models, which inherently have both qualitative and quantitative characteristics. Most modeling specifications and frameworks find it difficult to describe the qualitative model directly. In order to deal with expert knowledge, uncertain reasoning, and other qualitative information, a combined qualitative and quantitative modeling specification was proposed, based on a hierarchical model structure framework that includes the meta-meta model, the meta-model and the high-level model. A description logic system is defined for formal definition and verification of the new modeling specification. A stochastic defense simulation was developed to illustrate how to model the system and optimize the result. The results show that the proposed method can describe a complex system more comprehensively, and that the survival probability of the target is higher when qualitative models are introduced into the quantitative simulation.

  12. Multivariate Variance Targeting in the BEKK-GARCH Model

    DEFF Research Database (Denmark)

    Pedersen, Rasmus Søndergaard; Rahbek, Anders

    2014-01-01

    This paper considers asymptotic inference in the multivariate BEKK model based on (co-)variance targeting (VT). By definition the VT estimator is a two-step estimator and the theory presented is based on expansions of the modified likelihood function, or estimating function, corresponding...

  13. The Active Model: a calibration of material intent

    DEFF Research Database (Denmark)

    Ramsgaard Thomsen, Mette; Tamke, Martin

    2012-01-01

    created it. This definition suggests structural characteristics that are perhaps not immediately obvious when implemented within architectural models. It opens the idea that materiality might persist into the digital environment, as well as the digital lingering within the material. It implies questions...

  14. Murine Models of Gastric Corpus Preneoplasia

    Directory of Open Access Journals (Sweden)

    Christine P. Petersen

    2017-01-01

    Full Text Available Intestinal-type gastric adenocarcinoma evolves in a field of pre-existing metaplasia. Over the past 20 years, a number of murine models have been developed to address aspects of the physiology and pathophysiology of metaplasia induction. Although none of these models has achieved true recapitulation of the induction of adenocarcinoma, they have led to important insights into the factors that influence the induction and progression of metaplasia. Here, we review the pathologic definitions relevant to alterations in gastric corpus lineages and classification of metaplasia by specific lineage markers. In addition, we review present murine models of the induction and progression of spasmolytic polypeptide (TFF2)-expressing metaplasia, the predominant metaplastic lineage observed in murine models. These models provide a basis for the development of a broader understanding of the physiological and pathophysiological roles of metaplasia in the stomach. Keywords: SPEM, Intestinal Metaplasia, Gastric Cancer, TFF2, Chief Cell, Hyperplasia

  15. Software Tools For Building Decision-support Models For Flood Emergency Situations

    Science.gov (United States)

    Garrote, L.; Molina, M.; Ruiz, J. M.; Mosquera, J. C.

    The SAIDA decision-support system was developed by the Spanish Ministry of the Environment to provide assistance to decision-makers during flood situations. SAIDA has been tentatively implemented in two test basins: Jucar and Guadalhorce, and the Ministry is currently planning to have it implemented in all major Spanish basins in a few years' time. During the development cycle of SAIDA, the need for providing assistance to end-users in model definition and calibration was clearly identified. System developers usually emphasise abstraction and generality with the goal of providing a versatile software environment. End users, on the other hand, require concretion and specificity to adapt the general model to their local basins. As decision-support models become more complex, the gap between model developers and users gets wider: who takes care of model definition, calibration and validation? Initially, model developers perform these tasks, but the scope is usually limited to a few small test basins. Before the model enters the operational stage, end users must get involved in model construction and calibration, in order to gain confidence in the model recommendations. However, getting the users involved in these activities is a difficult task. The goal of this research is to develop representation techniques for simulation and management models in order to define, develop and validate a mechanism, supported by a software environment, oriented to provide assistance to the end-user in building decision models for the prediction and management of river floods in real time. The system is based on three main building blocks: a library of simulators of the physical system, an editor to assist the user in building simulation models, and a machine learning method to calibrate decision models based on the simulation models provided by the user.

  16. Towards a public, standardized, diagnostic benchmarking system for land surface models

    Directory of Open Access Journals (Sweden)

    G. Abramowitz

    2012-06-01

    Full Text Available This work examines different conceptions of land surface model benchmarking and the importance of internationally standardized evaluation experiments that specify data sets, variables, metrics and model resolutions. It additionally demonstrates how essential the definition of a priori expectations of model performance can be, based on the complexity of a model and the amount of information being provided to it, and gives an example of how these expectations might be quantified. Finally, the Protocol for the Analysis of Land Surface models (PALS) is introduced – a free, online land surface model benchmarking application that is structured to meet both of these goals.

  17. Realizing three generations of the Standard Model fermions in the type IIB matrix model

    International Nuclear Information System (INIS)

    Aoki, Hajime; Nishimura, Jun; Tsuchiya, Asato

    2014-01-01

    We discuss how the Standard Model particles appear from the type IIB matrix model, which is considered to be a nonperturbative formulation of superstring theory. In particular, we are concerned with a constructive definition of the theory, in which we start with finite-N matrices and take the large-N limit afterwards. In that case, it was pointed out recently that realizing chiral fermions in the model is more difficult than it had been thought from formal arguments at N=∞ and that the introduction of a matrix version of the warp factor is necessary. Based on this new insight, we show that two generations of the Standard Model fermions can be realized by considering a rather generic configuration of fuzzy S² and fuzzy S²×S² in the extra dimensions. We also show that three generations can be obtained by squashing one of the S²'s that appear in the configuration. Chiral fermions appear at the intersections of the fuzzy manifolds with nontrivial Yukawa couplings to the Higgs field, which can be calculated from the overlap of their wave functions.

  18. ModFossa: A library for modeling ion channels using Python.

    Science.gov (United States)

    Ferneyhough, Gareth B; Thibealut, Corey M; Dascalu, Sergiu M; Harris, Frederick C

    2016-06-01

    The creation and simulation of ion channel models using continuous-time Markov processes is a powerful and well-used tool in the field of electrophysiology and ion channel research. While several software packages exist for the purpose of ion channel modeling, most are GUI based, and none are available as a Python library. In an attempt to provide an easy-to-use, yet powerful Markov model-based ion channel simulator, we have developed ModFossa, a Python library supporting easy model creation and stimulus definition, complete with a fast numerical solver, and attractive vector graphics plotting.

  19. Quantitative model of New Zealand's energy supply industry

    Energy Technology Data Exchange (ETDEWEB)

    Smith, B. R. [Victoria Univ., Wellington, (New Zealand); Lucas, P. D. [Ministry of Energy Resources (New Zealand)

    1977-10-15

    A mathematical model is presented to assist in an analysis of energy policy options available. The model is based on an engineering orientated description of New Zealand's energy supply and distribution system. The system is cast as a linear program, in which energy demand is satisfied at least cost. The capacities and operating modes of process plant (such as power stations, oil refinery units, and LP-gas extraction plants) are determined by the model, as well as the optimal mix of fuels supplied to the final consumers. Policy analysis with the model enables a wide ranging assessment of the alternatives and uncertainties within a consistent quantitative framework. It is intended that the model be used as a tool to investigate the relative effects of various policy options, rather than to present a definitive plan for satisfying the nation's energy requirements.
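    For the simple single-demand structure described, the least-cost linear program reduces to merit-order dispatch: fill demand from the cheapest source first, up to each source's capacity. A toy sketch of that reduction (names and numbers are illustrative, not the New Zealand model):

```python
def least_cost_dispatch(options, demand):
    """options: list of (name, unit_cost, capacity). Returns (plan, total_cost).

    Greedy merit order: equivalent to the LP optimum when the only
    constraints are one demand equality and per-source capacities."""
    plan, cost = {}, 0.0
    for name, unit_cost, cap in sorted(options, key=lambda o: o[1]):
        take = min(cap, demand)          # dispatch as much as possible from cheapest
        if take > 0:
            plan[name] = take
            cost += take * unit_cost
            demand -= take
    if demand > 1e-9:
        raise ValueError("demand exceeds total capacity")
    return plan, cost

plan, cost = least_cost_dispatch(
    [("hydro", 1.0, 50.0), ("gas", 3.0, 80.0), ("coal", 2.0, 40.0)],
    demand=100.0)
# hydro 50 + coal 40 + gas 10 -> total cost 50*1 + 40*2 + 10*3 = 160
```

The full model in the paper is of course richer (process plant operating modes, fuel mixes), which is why it is cast as a general linear program rather than a greedy rule.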

  20. Models in geography ? A sense to research

    Directory of Open Access Journals (Sweden)

    Roger Brunet

    2001-12-01

    Full Text Available Ideas on models and modelling made a conspicuous entry into geography in the 1960s. They have since evolved, through practice and under the influence of—partly justified—criticism. No serious research can dispense with modelling as a means to reach the essential and to evaluate the divergence between singular geographical objects and the models that assist their interpretation. On two conditions, which merit further definition and exploration : models must have meaning in and through the practices, objectives and intentions of human action ; and we must know how to use models—whether tried and tested or new—to understand the structure and dynamics of singular geographical objects, and not just to infer general mechanisms from them, even though they will certainly enhance our understanding of the nature and scope of general mechanisms.

  1. The spherical limit of the n-vector model and correlation inequalities

    International Nuclear Information System (INIS)

    Angelescu, N.; Bundaru, M.; Costache, G.

    1978-08-01

    The asymptotics of the state of the n-vector model with a finite number of spins in the spherical limit is studied. Besides rederiving the limit free energy, corresponding to a generalized spherical model (with ''spherical constraint'' at every site), we obtain also the limit of the correlation functions, which allow a precise definition of the state of the latter model. Correlation inequalities are proved for ferromagnetic interactions in the asymptotic regime. In particular, it is shown that the generalized spherical model fulfills the expected Griffiths' type inequalities, differing in this respect from the spherical model with overall constraint. (author)

  2. PerMallows: An R Package for Mallows and Generalized Mallows Models

    Directory of Open Access Journals (Sweden)

    Ekhine Irurozki

    2016-08-01

    Full Text Available In this paper we present the R package PerMallows, which is a complete toolbox to work with permutations, distances and some of the most popular probability models for permutations: the Mallows and the Generalized Mallows models. The Mallows model is an exponential location model, considered analogous to the Gaussian distribution. It is based on the definition of a distance between permutations. The Generalized Mallows model is its best-known extension. The package includes functions for making inference, sampling and learning such distributions. The distances considered in PerMallows are Kendall's τ, Cayley, Hamming and Ulam.
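    The Mallows model assigns a permutation π probability proportional to exp(−θ·d(π, σ₀)) for a central permutation σ₀, a spread θ and a distance d such as Kendall's τ. A minimal sketch of the distance and the explicitly normalized model over S₃ (independent of the PerMallows implementation):

```python
from itertools import combinations, permutations
from math import exp

def kendall_tau(sigma, pi):
    """Kendall's tau distance: number of item pairs ranked in opposite order."""
    return sum(1 for i, j in combinations(range(len(sigma)), 2)
               if (sigma[i] < sigma[j]) != (pi[i] < pi[j]))

def mallows_pmf(theta, center, n):
    """Mallows model P(pi) ∝ exp(-theta * d(pi, center)), normalized over S_n."""
    perms = list(permutations(range(n)))
    weights = [exp(-theta * kendall_tau(p, center)) for p in perms]
    z = sum(weights)                      # partition function by brute force
    return {p: w / z for p, w in zip(perms, weights)}

pmf = mallows_pmf(theta=1.0, center=(0, 1, 2), n=3)
# the central permutation is the mode; probability decays with distance from it
```

Brute-force normalization is only viable for tiny n; the closed-form partition function for Kendall's τ is what makes packages like PerMallows practical for larger permutations.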

  3. Transport spatial model for the definition of green routes for city logistics centers

    Energy Technology Data Exchange (ETDEWEB)

    Pamučar, Dragan, E-mail: dpamucar@gmail.com [University of Defence in Belgrade, Department of Logistics, Pavla Jurisica Sturma 33, 11000 Belgrade (Serbia); Gigović, Ljubomir, E-mail: gigoviclj@gmail.com [University of Defence in Belgrade, Department of Mathematics, Pavla Jurisica Sturma 33, 11000 Belgrade (Serbia); Ćirović, Goran, E-mail: cirovic@sezampro.rs [College of Civil Engineering and Geodesy, The Belgrade University, Hajduk Stankova 2, 11000 Belgrade (Serbia); Regodić, Miodrag, E-mail: mregodic62@gmail.com [University of Defence in Belgrade, Department of Mathematics, Pavla Jurisica Sturma 33, 11000 Belgrade (Serbia)

    2016-01-15

    This paper presents a transport spatial decision support model (TSDSM) for carrying out the optimization of green routes for city logistics centers. The TSDSM model is based on the integration of the multi-criteria method of Weighted Linear Combination (WLC) and the modified Dijkstra algorithm within a geographic information system (GIS). The GIS is used for processing spatial data. The proposed model makes it possible to plan routes for green vehicles and maximize the positive effects on the environment, which can be seen in the reduction of harmful gas emissions and an increase in the air quality in highly populated areas. The scheduling of delivery vehicles is formulated as an optimization problem in terms of the following parameters: environment, health, use of space, and logistics operating costs. Each of these input parameters was thoroughly examined and broken down in the GIS into criteria which further describe them. The model presented here takes into account the fact that logistics operators have a limited number of environmentally friendly (green) vehicles available. The TSDSM was tested on a network of roads with 127 links for the delivery of goods from the city logistics center to the user. The model supports any number of available environmentally friendly or environmentally unfriendly vehicles consistent with the size of the network and the transportation requirements. - Highlights: • Model for routing light delivery vehicles in urban areas. • Optimization of green routes for city logistics centers. • The proposed model maximizes the positive effects on the environment. • The model was tested on a real network.
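    The two ingredients named in the abstract, a WLC score per road link and a Dijkstra search over those costs, can be sketched together as follows (criteria names, weights and the toy network are illustrative, not from the paper):

```python
import heapq

# Hypothetical WLC weights over normalized criteria scores in [0, 1].
WEIGHTS = {"emissions": 0.4, "health": 0.3, "space": 0.1, "cost": 0.2}

def wlc_cost(criteria):
    """Weighted Linear Combination: weighted sum of normalized criteria."""
    return sum(WEIGHTS[k] * v for k, v in criteria.items())

def dijkstra(graph, source, target):
    """graph: {node: [(neighbor, criteria_dict), ...]}. Returns minimal total cost."""
    dist, heap = {source: 0.0}, [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == target:
            return d
        if d > dist.get(u, float("inf")):
            continue                      # stale heap entry
        for v, criteria in graph.get(u, []):
            nd = d + wlc_cost(criteria)
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return float("inf")

g = {
    "depot": [("a", {"emissions": 0.2, "health": 0.1, "space": 0.5, "cost": 0.4}),
              ("b", {"emissions": 0.9, "health": 0.8, "space": 0.2, "cost": 0.1})],
    "a": [("user", {"emissions": 0.3, "health": 0.2, "space": 0.1, "cost": 0.3})],
    "b": [("user", {"emissions": 0.1, "health": 0.1, "space": 0.1, "cost": 0.1})],
}
best = dijkstra(g, "depot", "user")
# route via "a" wins: 0.24 + 0.25 = 0.49 versus 0.64 + 0.10 = 0.74 via "b"
```

Folding the environmental and health criteria into the edge weight is what turns a shortest-path search into a "green" routing problem.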

  4. Transport spatial model for the definition of green routes for city logistics centers

    International Nuclear Information System (INIS)

    Pamučar, Dragan; Gigović, Ljubomir; Ćirović, Goran; Regodić, Miodrag

    2016-01-01

    This paper presents a transport spatial decision support model (TSDSM) for carrying out the optimization of green routes for city logistics centers. The TSDSM model is based on the integration of the multi-criteria method of Weighted Linear Combination (WLC) and the modified Dijkstra algorithm within a geographic information system (GIS). The GIS is used for processing spatial data. The proposed model makes it possible to plan routes for green vehicles and maximize the positive effects on the environment, which can be seen in the reduction of harmful gas emissions and an increase in the air quality in highly populated areas. The scheduling of delivery vehicles is formulated as an optimization problem in terms of the following parameters: environment, health, use of space, and logistics operating costs. Each of these input parameters was thoroughly examined and broken down in the GIS into criteria which further describe them. The model presented here takes into account the fact that logistics operators have a limited number of environmentally friendly (green) vehicles available. The TSDSM was tested on a network of roads with 127 links for the delivery of goods from the city logistics center to the user. The model supports any number of available environmentally friendly or environmentally unfriendly vehicles consistent with the size of the network and the transportation requirements. - Highlights: • Model for routing light delivery vehicles in urban areas. • Optimization of green routes for city logistics centers. • The proposed model maximizes the positive effects on the environment. • The model was tested on a real network.

  5. Damage modelling in concrete subject to sulfate attack

    Directory of Open Access Journals (Sweden)

    N. Cefis

    2014-07-01

    Full Text Available In this paper, we consider the mechanical effect of sulfate attack on concrete. The durability analysis of concrete structures in contact with external sulfate solutions requires the definition of a proper diffusion-reaction model for the computation of the varying sulfate concentration and of the consequent ettringite formation, coupled to a mechanical model for the prediction of swelling and material degradation. In this work, we make use of a two-ion formulation of the reactive-diffusion problem and we propose a bi-phase chemo-elastic damage model aimed at simulating the mechanical response of concrete and suitable for use in structural analyses.
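    The diffusion-reaction side of such an analysis can be illustrated with a one-dimensional explicit finite-difference scheme: sulfate diffuses in from the exposed face and is consumed by a first-order reaction sink. All coefficients below are illustrative, not calibrated to the paper's chemo-elastic model:

```python
def diffuse_react(n=50, steps=2000, D=1e-2, k=1e-3, dx=1.0, dt=1.0, c_boundary=1.0):
    """Explicit scheme for dc/dt = D d2c/dx2 - k*c on a 1-D bar.

    Stable here since D*dt/dx**2 = 0.01 <= 0.5. The far end stays at c = 0."""
    c = [0.0] * n
    c[0] = c_boundary                       # external sulfate solution at x = 0
    for _ in range(steps):
        new = c[:]
        for i in range(1, n - 1):
            lap = (c[i - 1] - 2 * c[i] + c[i + 1]) / dx**2
            new[i] = c[i] + dt * (D * lap - k * c[i])  # diffusion minus reaction sink
        new[0] = c_boundary
        c = new
    return c

profile = diffuse_react()
# concentration decays monotonically with depth into the specimen
```

The resulting concentration profile is the input the mechanical model needs: where sulfate (and hence ettringite) accumulates determines where swelling and damage are predicted.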

  6. A reference model for model-based design of critical infrastructure protection systems

    Science.gov (United States)

    Shin, Young Don; Park, Cheol Young; Lee, Jae-Chon

    2015-05-01

    Today's battlefield environment is increasingly varied, as the activities of unconventional warfare such as terrorist attacks and cyber-attacks have noticeably increased of late. The damage caused by such unconventional warfare has also proved serious, particularly when the targets are critical infrastructures constructed in support of banking and finance, transportation, power, information and communication, government, and so on. Critical infrastructures are usually interconnected and thus very vulnerable to attack. Ensuring the security of critical infrastructures is therefore very important, and so the concept of critical infrastructure protection (CIP) has emerged. At the national level, programs to realize CIP take the form of statutes in each country. It is also necessary, however, to protect each individual critical infrastructure. The objective of this paper is to study an effort to do so, which can be called a CIP system (CIPS). There are a variety of ways to design CIPSs. Instead of considering the design of each individual CIPS, this paper takes a reference model-based approach. The reference model represents the design of all CIPSs, which have many design elements in common. The development of the reference model is carried out using a variety of model diagrams. The modeling language used is the Systems Modeling Language (SysML), a de facto standard developed and managed by the Object Management Group (OMG). Using SysML, the structure and operational concept of the reference model are designed to fulfil the goals of CIPSs, resulting in block definition and activity diagrams. As a case study, the operational scenario of a nuclear power plant under terrorist attack is studied using the reference model. The effectiveness of the results is also analyzed using multiple analysis models. It is thus expected that the approach taken here has some merits

  7. CMS computing model evolution

    International Nuclear Information System (INIS)

    Grandi, C; Bonacorsi, D; Colling, D; Fisk, I; Girone, M

    2014-01-01

    The CMS Computing Model was developed and documented in 2004. Since then the model has evolved to be more flexible and to take advantage of new techniques, but many of the original concepts remain and are in active use. In this presentation we will discuss the changes planned for the restart of the LHC program in 2015. We will discuss the changes planned in the use and definition of the computing tiers that were defined with the MONARC project. We will present how we intend to use new services and infrastructure to provide more efficient and transparent access to the data. We will discuss the computing plans to make better use of the computing capacity by scheduling more of the processor nodes, making better use of the disk storage, and more intelligent use of the networking.

  8. Towards benchmarking an in-stream water quality model

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available A method of model evaluation is presented which utilises a comparison with a benchmark model. The proposed benchmarking concept is one that can be applied to many hydrological models but, in this instance, is implemented in the context of an in-stream water quality model. The benchmark model is defined in such a way that it is easily implemented within the framework of the test model, i.e. the approach relies on two applications of the same model code rather than the application of two separate model codes. This is illustrated using two case studies from the UK, the Rivers Aire and Ouse, with the objective of simulating a water quality classification, general quality assessment (GQA), which is based on dissolved oxygen, biochemical oxygen demand and ammonium. Comparisons between the benchmark and test models are made based on GQA, as well as a step-wise assessment against the components required in its derivation. The benchmarking process yields a great deal of important information about the performance of the test model and raises issues about a priori definition of the assessment criteria.

  9. Exploring EFL Teachers’ Cognitive Models Through Metaphor Analysis

    Directory of Open Access Journals (Sweden)

    Hui Xiong

    2015-10-01

    Full Text Available This study aims to investigate how a group of Chinese university teachers developed their cognitive models by using “English as a Foreign Language (EFL) teachers” metaphors. The research method includes an open-ended questionnaire, a checklist questionnaire, and verbal reports. The goal for this research is twofold. First, we will present those metaphors we believe to be the most frequently used or most central in shaping the thoughts or ideas they have had for EFL teaching and learning. Second, we will provide a description of their internal process of developing cognitive models, as well as factors that could account for such models. The findings showed that (a) most of us had three ways of understanding EFL teachers in terms of the educational journey metaphor, the educational building metaphor, and the educational conduit metaphor; (b) we used such a cluster of converging cognitive models as the instructor model, the transmitter model, and the builder model to construct definitions for EFL teachers, with the instructor model as a central model; and (c) metaphor can actually serve as a useful, effective, and analytic tool for making us aware of the cognitive model underlying our conceptual framework.

  10. VHTR core modeling: coupling between neutronic and thermal-hydraulics

    International Nuclear Information System (INIS)

    Limaiem, I.; Damian, F.; Raepsaet, X.; Studer, E.

    2005-01-01

    Following the present interest in the next generation nuclear power plant (NGNP), Cea is deploying a special effort to develop new models and qualify its research tools for these next-generation reactor cores. In this framework, the Very High Temperature Reactor (VHTR) concept has an increasing place in the current research program. In this type of core, a strong interaction exists between neutronics and thermal-hydraulics. Consequently, global core modelling requires accounting for the temperature feedback in the neutronic models. The purpose of this paper is to present the new neutronic and thermal-hydraulics coupling model dedicated to High Temperature Reactors (HTR). The coupling model integrates a new version of the neutronic calculation scheme developed in collaboration between Cea and Framatome-ANP. The neutronic calculations are performed using a specific calculation process based on the APOLLO2 transport code and the CRONOS2 diffusion code, which are part of the French reactor physics code system SAPHYR. The thermal-hydraulics model is characterised by an equivalent porous medium and a 1-D fluid/3-D thermal model implemented in the CAST3M/ARCTURUS code. The porous medium approach involves the definition of both homogeneous and heterogeneous models to ensure a correct temperature feedback. This study highlights the sensitivity of the coupling system's parameters (radial/axial meshing and the data exchange strategy between the neutronic and thermal-hydraulics codes). The parameter sensitivity study leads to the definition of an optimal coupling system specification for the VHTR. In addition, this work presents the first physical analysis of the VHTR core in steady-state conditions. The analysis gives information about the 3-D power peaking and the temperature coefficient. Indeed, it covers different core configurations with different helium distributions in the core bypass. (authors)

  11. The CRAFT Fortran Programming Model

    Directory of Open Access Journals (Sweden)

    Douglas M. Pase

    1994-01-01

    Full Text Available Many programming models for massively parallel machines exist, and each has its advantages and disadvantages. In this article we present a programming model that combines features from other programming models that (1) can be efficiently implemented on present and future Cray Research massively parallel processor (MPP) systems and (2) are useful in constructing highly parallel programs. The model supports several styles of programming: message-passing, data parallel, global address (shared data), and work-sharing. These styles may be combined within the same program. The model includes features that allow a user to define a program in terms of the behavior of the system as a whole, where the behavior of individual tasks is implicit from this systemic definition. (In general, features marked as shared are designed to support this perspective.) It also supports an opposite perspective, where a program may be defined in terms of the behaviors of individual tasks, and a program is implicitly the sum of the behaviors of all tasks. (Features marked as private are designed to support this perspective.) Users can exploit any combination of either set of features without ambiguity and thus are free to define a program from whatever perspective is most appropriate to the problem at hand.

  12. What are hierarchical models and how do we analyze them?

    Science.gov (United States)

    Royle, Andy

    2016-01-01

    In this chapter we provide a basic definition of hierarchical models and introduce the two canonical hierarchical models in this book: site occupancy and N-mixture models. The former is a hierarchical extension of logistic regression and the latter is a hierarchical extension of Poisson regression. We introduce basic concepts of probability modeling and statistical inference including likelihood and Bayesian perspectives. We go through the mechanics of maximizing the likelihood and characterizing the posterior distribution by Markov chain Monte Carlo (MCMC) methods. We give a general perspective on topics such as model selection and assessment of model fit, although we demonstrate these topics in practice in later chapters (especially Chapters 5, 6, 7, and 10).
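    The site-occupancy model mentioned above is a zero-inflated binomial: a site is occupied with probability ψ, and an occupied site yields a detection on each of J visits with probability p. A minimal sketch of its likelihood, maximized here by a crude grid search on toy data (illustrative, not from the book):

```python
from math import comb, log

def log_likelihood(psi, p, counts, J):
    """Site-occupancy log-likelihood for detection counts y_i out of J visits:
    L_i = psi * Binom(y_i | J, p) + (1 - psi) * 1{y_i = 0}."""
    ll = 0.0
    for y in counts:
        binom = comb(J, y) * p**y * (1 - p)**(J - y)
        ll += log(psi * binom + (1 - psi) * (1.0 if y == 0 else 0.0))
    return ll

def grid_mle(counts, J, steps=99):
    """Crude MLE: evaluate the likelihood on a grid over (psi, p) in (0, 1)^2."""
    grid = [(i + 1) / (steps + 1) for i in range(steps)]
    return max(((psi, p) for psi in grid for p in grid),
               key=lambda t: log_likelihood(t[0], t[1], counts, J))

counts = [0, 0, 0, 2, 3, 1, 0, 2, 3, 0]   # toy detections out of J = 3 visits
psi_hat, p_hat = grid_mle(counts, J=3)
```

The zero-inflation term is what separates true absence from non-detection: an all-zero history can come either from an unoccupied site or from an occupied site that was never detected.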

  13. Geometric accuracy of wax blade models manufactured in silicon moulds

    Directory of Open Access Journals (Sweden)

    G. Budzik

    2010-01-01

    Full Text Available The article presents the test results of the geometric accuracy of wax blade models manufactured in silicon moulds in the Rapid Tooling process, with the application of the Vacuum Casting technology. In batch production casting waxes are designed for the manufacture of models and components of model sets through injection into a metal die. The objective of the tests was to determine the possibility of using traditional wax for the production of casting models in the rapid prototyping process. Blade models made of five types of casting wax were measured. The definition of the geometric accuracy of wax blade models makes it possible to introduce individual modifications aimed at improving their shape in order to increase the dimensional accuracy of blade models manufactured in the rapid prototyping process.

  14. A critical analysis of the X.400 model of message handling systems

    NARCIS (Netherlands)

    van Sinderen, Marten J.; Dorregeest, Evert

    1988-01-01

    The CCITT X.400 model of store and forward Message Handling Systems (MHS) serves as a common basis for the definition of electronic mail services and protocols both within CCITT and ISO. This paper presents an analysis of this model and its related recommendations from two perspectives. First the

  15. FORMAL MODELLING OF BUSINESS RULES: WHAT KIND OF TOOL TO USE?

    Directory of Open Access Journals (Sweden)

    Sandra Lovrenčić

    2006-12-01

    Full Text Available Business rules are today an essential part of a business system model. But at present there are still various approaches to, definitions of, and classifications of this concept. Similarly, there are also different approaches to business rules formalization and implementation. This paper investigates formalization using a formal language in association with straightforward domain modelling. Two tools that enable such an approach are described and compared according to several factors. They represent ontology modelling and UML, nowadays a widely used standard for object-oriented modelling. A simple example is also presented.

  16. An ecological process model of systems change.

    Science.gov (United States)

    Peirson, Leslea J; Boydell, Katherine M; Ferguson, H Bruce; Ferris, Lorraine E

    2011-06-01

    In June 2007 the American Journal of Community Psychology published a special issue focused on theories, methods and interventions for systems change which included calls from the editors and authors for theoretical advancement in this field. We propose a conceptual model of systems change that integrates familiar and fundamental community psychology principles (succession, interdependence, cycling of resources, adaptation) and accentuates a process orientation. To situate our framework we offer a definition of systems change and a brief review of the ecological perspective and principles. The Ecological Process Model of Systems Change is depicted, described and applied to a case example of policy driven systems level change in publicly funded social programs. We conclude by identifying salient implications for thinking and action which flow from the Model.

  17. Map-Based Channel Model for Urban Macrocell Propagation Scenarios

    Directory of Open Access Journals (Sweden)

    Jose F. Monserrat

    2015-01-01

Full Text Available The evolution of LTE towards 5G has started, and different research projects and institutions are in the process of verifying new technology components through simulations. Coordination between groups is strongly recommended and, in this sense, a common definition of test cases and simulation models is needed. The scope of this paper is to present a realistic channel model for urban macrocell scenarios. This model is map-based and takes into account the layout of the buildings situated in the area under study. A detailed description of the model is given, together with a comparison with other widely used channel models. The benchmark includes a measurement campaign in which the proposed model is shown to be much closer to the actual behavior of a cellular system. Particular attention is given to the outdoor component of the model, since it is here that the proposed approach differs most from previous models.
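A map-based channel model starts from a geometric question: does the straight ray between transmitter and receiver cross a building? The grid representation and function below are an invented toy, not the paper's model:

```python
# Illustrative sketch (not the paper's model): a line-of-sight test on a
# building grid, sampling points along the ray from transmitter to receiver.

def is_los(grid, tx, rx, samples=200):
    """Return True if the straight ray from tx to rx crosses no building cell.
    grid[y][x] == 1 marks a building; tx/rx are (x, y) in cell units."""
    for s in range(samples + 1):
        t = s / samples
        x = tx[0] + t * (rx[0] - tx[0])
        y = tx[1] + t * (rx[1] - tx[1])
        if grid[int(y)][int(x)] == 1:
            return False
    return True

# 5x5 map with one building block in the middle column.
grid = [
    [0, 0, 0, 0, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 0, 0, 0],
]
print(is_los(grid, (0.5, 0.5), (4.5, 0.5)))  # along the open top row: True
print(is_los(grid, (0.5, 2.5), (4.5, 2.5)))  # through the building: False
```

In a full model this LOS/NLOS decision would then select the path-loss law applied to each link.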

  18. Classification criteria of syndromes by latent variable models

    DEFF Research Database (Denmark)

    Petersen, Janne

    2010-01-01

, although this is often desired. I have proposed a new method for predicting class membership that, in contrast to methods based on posterior probabilities of class membership, yields consistent estimates when regressed on explanatory variables in a subsequent analysis. There are four different basic models... analyses. Part 1: HALS involves different phenotypic changes of peripheral lipoatrophy and central lipohypertrophy. There are several different definitions of HALS and no consensus on the number of phenotypes. Many of the definitions consist of counting fulfilled criteria on markers and do not include...

  19. Improving the functionality of dictionary definitions for lexical sets ...

    African Journals Online (AJOL)

    2008) approach to the design of lexicographic definitions for members of lexical sets. The questions raised are how to define and identify lexical sets, how lexical conceptual models (LCMs) can support definitional consistency and coherence in ...

  20. Lotka-Volterra competition models for sessile organisms.

    Science.gov (United States)

    Spencer, Matthew; Tanner, Jason E

    2008-04-01

    Markov models are widely used to describe the dynamics of communities of sessile organisms, because they are easily fitted to field data and provide a rich set of analytical tools. In typical ecological applications, at any point in time, each point in space is in one of a finite set of states (e.g., species, empty space). The models aim to describe the probabilities of transitions between states. In most Markov models for communities, these transition probabilities are assumed to be independent of state abundances. This assumption is often suspected to be false and is rarely justified explicitly. Here, we start with simple assumptions about the interactions among sessile organisms and derive a model in which transition probabilities depend on the abundance of destination states. This model is formulated in continuous time and is equivalent to a Lotka-Volterra competition model. We fit this model and a variety of alternatives in which transition probabilities do not depend on state abundances to a long-term coral reef data set. The Lotka-Volterra model describes the data much better than all models we consider other than a saturated model (a model with a separate parameter for each transition at each time interval, which by definition fits the data perfectly). Our approach provides a basis for further development of stochastic models of sessile communities, and many of the methods we use are relevant to other types of community. We discuss possible extensions to spatially explicit models.
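The abundance-dependent dynamics described in this abstract can be sketched as a continuous-time Lotka-Volterra competition system integrated with a simple Euler scheme. This is an illustrative toy, not the authors' fitted reef model; all parameter values are invented:

```python
# Toy Lotka-Volterra competition model in which each state's growth depends on
# the abundances of all states (invented parameters, not the fitted model).

def lv_step(cover, growth, interaction, dt=0.01):
    """One Euler step of dx_i/dt = x_i * (growth_i - sum_j a_ij * x_j)."""
    n = len(cover)
    return [cover[i] + dt * cover[i] *
            (growth[i] - sum(interaction[i][j] * cover[j] for j in range(n)))
            for i in range(n)]

# Two competing sessile "species" with stable coexistence (a_12*a_21 < a_11*a_22).
cover = [0.3, 0.2]
growth = [1.0, 0.8]
interaction = [[1.0, 0.5],
               [0.7, 1.0]]

for _ in range(5000):            # integrate to t = 50
    cover = lv_step(cover, growth, interaction)

print([round(c, 3) for c in cover])
```

With these coefficients the trajectory approaches the coexistence equilibrium that solves growth = interaction · cover, roughly (0.923, 0.154).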

  1. Specialty Payment Model Opportunities and Assessment: Oncology Model Design Report.

    Science.gov (United States)

    Huckfeldt, Peter J; Chan, Chris; Hirshman, Samuel; Kofner, Aaron; Liu, Jodi L; Mulcahy, Andrew W; Popescu, Ioana; Stevens, Clare; Timbie, Justin W; Hussey, Peter S

    2015-07-15

    This article describes research related to the design of a payment model for specialty oncology services for possible testing by the Center for Medicare and Medicaid Innovation at the Centers for Medicare & Medicaid Services (CMS). Cancer is a common and costly condition. Episode-based payment, which aims to create incentives for high-quality, low-cost care, has been identified as a promising alternative payment model for oncology care. Episode-based payment systems can provide flexibility to health care providers to select among the most effective and efficient treatment alternatives, including activities that are not currently reimbursed under Medicare payment policies. However, the model design also needs to ensure that high-quality care is delivered and that beneficial treatments are not withheld from patients. CMS asked MITRE and RAND to conduct analyses to inform design decisions related to an episode-based oncology model for Medicare beneficiaries undergoing chemotherapy treatment for cancer. In particular, this study focuses on analyses of Medicare claims data related to the definition of the initiation of an episode of chemotherapy, patterns of spending during and surrounding episodes of chemotherapy, and attribution of episodes of chemotherapy to physician practices. We found that the time between the primary cancer diagnosis and chemotherapy initiation varied widely across patients, ranging from one day to over seven years, with a median of 2.4 months. The average level of total monthly payments varied considerably across cancers, with the highest spending peak of $9,972 for lymphoma, and peaks of $3,109 for breast cancer and $2,135 for prostate cancer.

  2. Equicontrollability and its application to model-following and decoupling.

    Science.gov (United States)

    Curran, R. T.

    1971-01-01

Discussion of 'model following,' a term describing a class of problems involving two dynamic systems, generically known as the 'plant' and the 'model'; the task is to find a controller to attach to the plant so that the resulting compensated system behaves, in an input/output sense, like the model. The approach presented takes a structural point of view. The result is a complex but informative definition that solves the problem as posed. The application of both the algorithm and its basis, equicontrollability, to the decoupling problem is considered.

  3. IMPLEMENTASI EDITOR MODEL DATA KONSEPTUAL DAN MODEL DATA FISIK DENGAN ROUND-TRIP ENGINEERING

    Directory of Open Access Journals (Sweden)

    Aldy Sefan Rezanaldy

    2012-01-01

Full Text Available Database model design is an important phase in the development of an information system application, and a data model editor for database design is much needed in the IT world. Some existing editors do not yet apply the round-trip engineering concept in real time, so a change made in one data model requires an explicit update event to refresh the other data model. The data model editor presented here supports round-trip engineering: conversion back and forth between the conceptual data model and the physical data model. The editor was developed using the C# .NET Framework and an object-oriented design-pattern implementation. In implementing an editor, the most important aspects, besides all features working, are performance and user convenience, which are evaluated in their own right for a data model editor. The resulting application is expected to be usable for database design with back-and-forth conversion, so that no manual update from one data model to the other is required.
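The round-trip conversion between a conceptual and a physical data model can be illustrated with a minimal sketch. The representations, type map and function names below are invented for illustration and are not the editor's actual C# code:

```python
# Minimal round-trip sketch: conceptual entity <-> physical table (all
# representations invented; real editors track far richer metadata).

TYPE_MAP = {"string": "VARCHAR(255)", "integer": "INT"}
REVERSE_MAP = {v: k for k, v in TYPE_MAP.items()}

def to_physical(entity):
    """Conceptual model -> physical model (forward engineering)."""
    return {"table": entity["name"].lower(),
            "columns": [(attr, TYPE_MAP[t]) for attr, t in entity["attributes"]]}

def to_conceptual(table):
    """Physical model -> conceptual model (reverse engineering)."""
    return {"name": table["table"].capitalize(),
            "attributes": [(col, REVERSE_MAP[t]) for col, t in table["columns"]]}

entity = {"name": "Customer", "attributes": [("name", "string"), ("age", "integer")]}
physical = to_physical(entity)
round_tripped = to_conceptual(physical)
print(physical["table"])        # customer
print(round_tripped == entity)  # True: this toy round trip loses nothing
```

The design point is that both conversions are total and mutually inverse, which is what lets an editor propagate changes in either direction without a manual update step.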

  4. a Predator-Prey Model Based on the Fully Parallel Cellular Automata

    Science.gov (United States)

    He, Mingfeng; Ruan, Hongbo; Yu, Changliang

We present a predator-prey lattice model containing movable wolves and sheep, which are characterized by Penna double bit strings. Sexual reproduction and child-care strategies are considered. To implement this model in an efficient way, we build a fully parallel cellular automaton based on a new definition of the neighborhood. We show the roles played by the initial densities of the populations, the mutation rate and the linear size of the lattice in the evolution of this model.
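"Fully parallel" here means every cell's next state is computed from the current lattice before any cell is overwritten. A toy synchronous update on a small torus, with an invented eat rule far simpler than the Penna-string model, might look like:

```python
# Toy fully parallel (synchronous) CA step on a toroidal lattice: a sheep 'S'
# is eaten if any wolf 'W' sits in its Moore neighborhood (rule invented for
# illustration only).

def step(grid):
    n = len(grid)
    def wolf_near(i, j):
        for di in (-1, 0, 1):
            for dj in (-1, 0, 1):
                if (di, dj) != (0, 0) and grid[(i + di) % n][(j + dj) % n] == "W":
                    return True
        return False
    # Every cell is updated from the *old* grid: a fully parallel update.
    return [["." if grid[i][j] == "S" and wolf_near(i, j) else grid[i][j]
             for j in range(n)] for i in range(n)]

grid = [list("S.S"),
        list(".W."),
        list("S.S")]
new = step(grid)
print(sum(row.count("S") for row in new))  # the central wolf reaches every corner
```

Because the new lattice is built from a snapshot of the old one, the update order of cells cannot influence the result, which is what makes the scheme trivially parallelizable.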

  5. Defining and detecting structural sensitivity in biological models: developing a new framework.

    Science.gov (United States)

    Adamson, M W; Morozov, A Yu

    2014-12-01

When we construct mathematical models to represent biological systems, there is always uncertainty with regard to the model specification, whether with respect to the parameters or to the formulation of the model functions. Sometimes choosing two different functions with close shapes in a model can result in substantially different model predictions: a phenomenon known in the literature as structural sensitivity, which is a significant obstacle to improving the predictive power of biological models. In this paper, we revisit the general definition of structural sensitivity, compare several more specific definitions and discuss their usefulness for the construction and analysis of biological models. We then propose a general approach to reveal structural sensitivity with regard to certain system properties, one which considers infinite-dimensional neighbourhoods of the model functions: a far more powerful technique than the conventional approach of varying parameters for a fixed functional form. In particular, we suggest a rigorous method to unearth sensitivity with respect to the local stability of systems' equilibrium points. We present a method for specifying the neighbourhood of a general unknown function with [Formula: see text] inflection points in terms of a finite number of local function properties, and provide a rigorous proof of its completeness. Using this powerful result, we implement our method to explore sensitivity in several well-known multicomponent ecological models and demonstrate the existence of structural sensitivity in these models. Finally, we argue that structural sensitivity is an important intrinsic property of biological models, and a direct consequence of the complexity of the underlying real systems.

  6. Mathematical models for estimating radio channels utilization when ...

    African Journals Online (AJOL)

    Definition of the radio channel utilization indicator is given. Mathematical models for radio channels utilization assessment by real-time flows transfer in the wireless self-organized network are presented. Estimated experiments results according to the average radio channel utilization productivity with and without buffering of ...

  7. Verifying and Validating Simulation Models

    Energy Technology Data Exchange (ETDEWEB)

    Hemez, Francois M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-02-23

This presentation is a high-level discussion of the Verification and Validation (V&V) of computational models. Definitions of V&V are given to emphasize that “validation” is never performed in a vacuum; it accounts, instead, for the current state of knowledge in the discipline considered. In particular, comparisons between physical measurements and numerical predictions should account for their respective sources of uncertainty. The differences between error (bias), aleatoric uncertainty (randomness) and epistemic uncertainty (ignorance, lack of knowledge) are briefly discussed. Four types of uncertainty in physics and engineering are discussed: 1) experimental variability, 2) variability and randomness, 3) numerical uncertainty and 4) model-form uncertainty. Statistical sampling methods are available to propagate, and analyze, variability and randomness. Numerical uncertainty originates from the truncation error introduced by the discretization of partial differential equations in time and space. Model-form uncertainty is introduced by assumptions often formulated to render a complex problem more tractable and amenable to modeling and simulation. The discussion concludes with high-level guidance to assess the “credibility” of numerical simulations, which stems from the level of rigor with which these various sources of uncertainty are assessed and quantified.
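The statistical sampling mentioned above for propagating variability can be sketched as a plain Monte Carlo loop. The "model" and the input distributions below are invented stand-ins, not anything from the presentation:

```python
# Monte Carlo propagation of input variability through a model (toy example:
# the "simulation" is a static spring deflection; distributions are invented).
import random

random.seed(42)

def model(load, stiffness):
    """A stand-in 'simulation': static deflection of a linear spring."""
    return load / stiffness

# Aleatoric inputs: load ~ N(100, 5), stiffness ~ N(50, 2).
samples = [model(random.gauss(100, 5), random.gauss(50, 2)) for _ in range(20000)]

mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / (len(samples) - 1)
print(round(mean, 2), round(var ** 0.5, 3))
```

The output distribution, not a single deterministic value, is what a validation comparison against (also uncertain) measurements should use.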

  8. Development of a Prototype System for Archiving Integrative/Hybrid Structure Models of Biological Macromolecules.

    Science.gov (United States)

    Vallat, Brinda; Webb, Benjamin; Westbrook, John D; Sali, Andrej; Berman, Helen M

    2018-04-09

    Essential processes in biology are carried out by large macromolecular assemblies, whose structures are often difficult to determine by traditional methods. Increasingly, researchers combine measured data and computed information from several complementary methods to obtain "hybrid" or "integrative" structural models of macromolecules and their assemblies. These integrative/hybrid (I/H) models are not archived in the PDB because of the absence of standard data representations and processing mechanisms. Here we present the development of data standards and a prototype system for archiving I/H models. The data standards provide the definitions required for representing I/H models that span multiple spatiotemporal scales and conformational states, as well as spatial restraints derived from different experimental techniques. Based on these data definitions, we have built a prototype system called PDB-Dev, which provides the infrastructure necessary to archive I/H structural models. PDB-Dev is now accepting structures and is open to the community for new submissions. Copyright © 2018 Elsevier Ltd. All rights reserved.

  9. Halo Models of Large Scale Structure and Reliability of Cosmological N-Body Simulations

    Directory of Open Access Journals (Sweden)

    José Gaite

    2013-05-01

Full Text Available Halo models of the large-scale structure of the Universe are critically examined, focusing on the definition of halos as smooth distributions of cold dark matter. This definition is essentially based on the results of cosmological N-body simulations. By a careful analysis of the standard assumptions of halo models and N-body simulations, and by taking into account previous studies of the self-similarity of the cosmic web structure, we conclude that N-body cosmological simulations are not fully reliable in the range of scales where halos appear. Therefore, to have a consistent definition of halos, it is necessary either to define them as entities of arbitrary size with a grainy rather than smooth structure or to define their size in terms of small-scale baryonic physics.

  10. Ports: Definition and study of types, sizes and business models

    Directory of Open Access Journals (Sweden)

    Ivan Roa

    2013-09-01

Full Text Available Purpose: In the world today there are thousands of port facilities of different types and sizes competing to capture market share of freight carried mainly by sea. This article aims to determine the most common port type and size, in order to find out which business model is applied in that segment and what the legal status of the companies running such infrastructure is.Design/methodology/approach: To achieve this goal, we develop a research on a representative sample of 800 ports worldwide, which manage 90% of containerized port loading, and then determine the legal status of the companies that manage them.Findings: The results indicate a dominant port type and size, mostly managed by companies subject to a concession model.Research limitations/implications: In this research we study only those ports that handle freight (basically containerized), ignoring other activities such as fishing, military, tourism or recreational uses.Originality/value: This investigation shows that the vast majority of port facilities in the studied segment are governed by a similar corporate model and subject to pressure from the markets, which increasingly demand efficiency and service. Consequently, terminals tend to be conceded to private operators in a process that might be called privatization, although in the strictest sense of the term this is not entirely accurate because ownership of the land never ceases to be public.

  11. Computer modeling of flow induced in-reactor vibrations

    International Nuclear Information System (INIS)

    Turula, P.; Mulcahy, T.M.

    1977-01-01

    An assessment of the reliability of finite element method computer models, as applied to the computation of flow induced vibration response of components used in nuclear reactors, is presented. The prototype under consideration was the Fast Flux Test Facility reactor being constructed for US-ERDA. Data were available from an extensive test program which used a scale model simulating the hydraulic and structural characteristics of the prototype components, subjected to scaled prototypic flow conditions as well as to laboratory shaker excitations. Corresponding analytical solutions of the component vibration problems were obtained using the NASTRAN computer code. Modal analyses and response analyses were performed. The effect of the surrounding fluid was accounted for. Several possible forcing function definitions were considered. Results indicate that modal computations agree well with experimental data. Response amplitude comparisons are good only under conditions favorable to a clear definition of the structural and hydraulic properties affecting the component motion. 20 refs

  12. Stability of rotor systems: A complex modelling approach

    DEFF Research Database (Denmark)

    Kliem, Wolfhard; Pommer, Christian; Stoustrup, Jakob

    1998-01-01

The dynamics of a large class of rotor systems can be modelled by a linearized complex matrix differential equation of second order, Mz'' + (D + iG)z' + (K + iN)z = 0, where the system matrices M, D, G, K and N are real symmetric. Moreover, M and K are assumed to be positive definite and D... approach applying bounds of appropriate Rayleigh quotients. The rotor systems tested are: a simple Laval rotor, a Laval rotor with additional elasticity and damping in the bearings, and a number of rotor systems with complex symmetric 4 x 4 randomly generated matrices.
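A scalar analogue makes the stability question concrete: for a single degree of freedom, m z'' + (d + ig) z' + (k + in) z = 0 is asymptotically stable when both characteristic roots have negative real part. The coefficient values below are invented for illustration:

```python
# Scalar analogue of M z'' + (D + iG) z' + (K + iN) z = 0: solve the
# characteristic quadratic a*lambda^2 + b*lambda + c = 0 with complex
# coefficients and test Re(lambda) < 0 (numbers invented).
import cmath

m, d, g, k, n = 1.0, 0.4, 0.1, 2.0, 0.05   # mass, damping, gyroscopic, stiffness, circulatory

a = m
b = complex(d, g)    # D + iG
c = complex(k, n)    # K + iN

disc = cmath.sqrt(b * b - 4 * a * c)
roots = [(-b + disc) / (2 * a), (-b - disc) / (2 * a)]
stable = all(r.real < 0 for r in roots)
print(stable)
```

With positive damping and only a small circulatory term, both roots sit in the left half-plane; increasing n relative to d is the classic route to instability that the Rayleigh-quotient bounds in the paper are designed to detect in the matrix case.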

  13. Credit Risk Evaluation : Modeling - Analysis - Management

    OpenAIRE

    Wehrspohn, Uwe

    2002-01-01

An analysis and further development of the building blocks of modern credit risk management: definitions of default; estimation of default probabilities; exposures; recovery rates; pricing; concepts of portfolio dependence; time horizons for risk calculations; quantification of portfolio risk; estimation of risk measures; portfolio analysis and portfolio improvement; evaluation and comparison of credit risk models; analytic portfolio loss distributions. The thesis contributes to the evaluation...

  14. The architecture and prototype implementation of the Model Environment system

    Science.gov (United States)

    Donchyts, G.; Treebushny, D.; Primachenko, A.; Shlyahtun, N.; Zheleznyak, M.

    2007-01-01

An approach that simplifies the software development of model-based decision support systems for environmental management is introduced. The approach is based on the definition and management of metadata and data related to a computational model without losing data semantics, together with proposed methods for integrating new modules into the information system and managing them. An architecture of the integrated modelling system is presented. The proposed architecture has been implemented as a prototype of an integrated modelling system using .NET/Gtk# and is currently being used to re-design the European Decision Support System for Nuclear Emergency Management RODOS (http://www.rodos.fzk.de) using Java/Swing.

  15. Tensor Dictionary Learning for Positive Definite Matrices.

    Science.gov (United States)

    Sivalingam, Ravishankar; Boley, Daniel; Morellas, Vassilios; Papanikolopoulos, Nikolaos

    2015-11-01

    Sparse models have proven to be extremely successful in image processing and computer vision. However, a majority of the effort has been focused on sparse representation of vectors and low-rank models for general matrices. The success of sparse modeling, along with popularity of region covariances, has inspired the development of sparse coding approaches for these positive definite descriptors. While in earlier work, the dictionary was formed from all, or a random subset of, the training signals, it is clearly advantageous to learn a concise dictionary from the entire training set. In this paper, we propose a novel approach for dictionary learning over positive definite matrices. The dictionary is learned by alternating minimization between sparse coding and dictionary update stages, and different atom update methods are described. A discriminative version of the dictionary learning approach is also proposed, which simultaneously learns dictionaries for different classes in classification or clustering. Experimental results demonstrate the advantage of learning dictionaries from data both from reconstruction and classification viewpoints. Finally, a software library is presented comprising C++ binaries for all the positive definite sparse coding and dictionary learning approaches presented here.

  16. A Privacy Model for RFID Tag Ownership Transfer

    Directory of Open Access Journals (Sweden)

    Xingchun Yang

    2017-01-01

Full Text Available The ownership of an RFID tag is often transferred from one owner to another during its life cycle. To address the privacy problem caused by tag ownership transfer, we propose a tag privacy model which captures the adversary’s abilities to obtain secret information inside readers, to corrupt tags, to authenticate tags, and to observe tag ownership transfer processes. This model gives formal definitions of tag forward privacy and backward privacy and can be used to measure the privacy properties of a tag ownership transfer scheme. We also present a tag ownership transfer scheme which is privacy-preserving under the proposed model and satisfies the other common security requirements, in addition to achieving better performance.

  17. MODEL PERANGKAT PEMBELAJARAN MENULIS BERDASARKAN PENDEKATAN PROSES GENRE BAGI SISWA SMP

    Directory of Open Access Journals (Sweden)

    Kastam Syamsi

    2013-01-01

writing based on the genre process approach for junior high school students. The procedure employed the R2D2 model, consisting of three main stages: (1) definition, (2) design and development, and (3) dissemination. The data consisted of qualitative and quantitative data. The qualitative data were analyzed using the domain analysis technique with critical and reflective principles, while the quantitative data were analyzed using a t-test with SPSS 16.0 for Windows. The study produces five models: (1) a syllabus model, (2) a lesson plan model, (3) a learning materials model, (4) an evaluation instrument model, and (5) a teacher guide model for the teaching of writing. Based on the data analysis, the five models of teaching kits for writing are effective in improving the students’ writing ability.

  18. Extending enterprise architecture modelling with business goals and requirements

    Science.gov (United States)

    Engelsman, Wilco; Quartel, Dick; Jonkers, Henk; van Sinderen, Marten

    2011-02-01

    The methods for enterprise architecture (EA), such as The Open Group Architecture Framework, acknowledge the importance of requirements modelling in the development of EAs. Modelling support is needed to specify, document, communicate and reason about goals and requirements. The current modelling techniques for EA focus on the products, services, processes and applications of an enterprise. In addition, techniques may be provided to describe structured requirements lists and use cases. Little support is available however for modelling the underlying motivation of EAs in terms of stakeholder concerns and the high-level goals that address these concerns. This article describes a language that supports the modelling of this motivation. The definition of the language is based on existing work on high-level goal and requirements modelling and is aligned with an existing standard for enterprise modelling: the ArchiMate language. Furthermore, the article illustrates how EA can benefit from analysis techniques from the requirements engineering domain.

  19. Volume-based geometric modeling for radiation transport calculations

    International Nuclear Information System (INIS)

    Li, Z.; Williamson, J.F.

    1992-01-01

    Accurate theoretical characterization of radiation fields is a valuable tool in the design of complex systems, such as linac heads and intracavitary applicators, and for generation of basic dose calculation data that is inaccessible to experimental measurement. Both Monte Carlo and deterministic solutions to such problems require a system for accurately modeling complex 3-D geometries that supports ray tracing, point and segment classification, and 2-D graphical representation. Previous combinatorial approaches to solid modeling, which involve describing complex structures as set-theoretic combinations of simple objects, are limited in their ease of use and place unrealistic constraints on the geometric relations between objects such as excluding common boundaries. A new approach to volume-based solid modeling has been developed which is based upon topologically consistent definitions of boundary, interior, and exterior of a region. From these definitions, FORTRAN union, intersection, and difference routines have been developed that allow involuted and deeply nested structures to be described as set-theoretic combinations of ellipsoids, elliptic cylinders, prisms, cones, and planes that accommodate shared boundaries. Line segments between adjacent intersections on a trajectory are assigned to the appropriate region by a novel sorting algorithm that generalizes upon Siddon's approach. Two 2-D graphic display tools are developed to help the debugging of a given geometric model. In this paper, the mathematical basis of our system is described, it is contrasted to other approaches, and examples are discussed
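The set-theoretic combination of primitives can be sketched with membership predicates. This simplified closed-set classification does not capture the paper's topologically consistent treatment of shared boundaries, and the shapes and function names are invented:

```python
# Toy CSG point classification: regions are membership predicates, combined
# by set-theoretic operators (a simplified analogue of the paper's routines).

def sphere(cx, cy, cz, r):
    return lambda p: (p[0]-cx)**2 + (p[1]-cy)**2 + (p[2]-cz)**2 <= r*r

def union(a, b):        return lambda p: a(p) or b(p)
def intersection(a, b): return lambda p: a(p) and b(p)
def difference(a, b):   return lambda p: a(p) and not b(p)

# A sphere with a smaller concentric sphere removed: a hollow shell.
shell = difference(sphere(0, 0, 0, 2.0), sphere(0, 0, 0, 1.0))

print(shell((1.5, 0, 0)))  # inside the shell wall: True
print(shell((0.5, 0, 0)))  # inside the cavity: False
print(shell((3.0, 0, 0)))  # outside everything: False
```

Ray tracing in such a system amounts to classifying the midpoint of each segment between successive surface intersections with exactly this kind of predicate.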

  20. Perspectives on continuum flow models for force-driven nano-channel liquid flows

    Science.gov (United States)

    Beskok, Ali; Ghorbanian, Jafar; Celebi, Alper

    2017-11-01

A phenomenological continuum model is developed using systematic molecular dynamics (MD) simulations of force-driven liquid argon flows confined in gold nano-channels at a fixed thermodynamic state. The well-known density layering near the walls leads to the definition of an effective channel height and a density deficit parameter. While the former defines the slip plane, the latter relates the channel-averaged density to the desired thermodynamic state value. Defining these new parameters requires a single MD simulation, performed for a specific liquid-solid pair at the desired thermodynamic state and used to calibrate the model parameters. Combined with our observations of constant slip length and kinematic viscosity, the model accurately predicts the velocity distribution and the volumetric and mass flow rates for force-driven liquid flows in nano-channels of different heights. The model is verified for liquid argon flow at distinct thermodynamic states and with various argon-gold interaction strengths. Further verification is performed for water flow in silica and gold nano-channels, exhibiting slip lengths of 1.2 nm and 15.5 nm, respectively. Excellent agreement between the model and the MD simulations is reported for channel heights as small as 3 nm for various liquid-solid pairs.
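For planar Poiseuille flow with a Navier slip length Ls on both walls, the standard result is that the flow rate is enhanced by a factor 1 + 6Ls/h over the no-slip rate. A quick check with the slip lengths quoted in the abstract (the 10 nm channel height is chosen arbitrarily for illustration):

```python
# Standard slip-enhanced Poiseuille flow rate (not the paper's calibrated
# model): Q_slip / Q_noslip = 1 + 6*Ls/h for equal slip on both walls.

def flow_rate_enhancement(slip_length_nm, height_nm):
    """Ratio of slip to no-slip volumetric flow rate in a planar channel."""
    return 1.0 + 6.0 * slip_length_nm / height_nm

# Water in a hypothetical 10 nm channel: silica (Ls ~ 1.2 nm) vs gold
# (Ls ~ 15.5 nm), using the slip lengths quoted in the abstract.
print(round(flow_rate_enhancement(1.2, 10.0), 2))   # 1.72: modest enhancement
print(round(flow_rate_enhancement(15.5, 10.0), 2))  # 10.3: slip dominates
```

The contrast shows why the slip plane and slip length must be pinned down carefully: in a gold nano-channel most of the transport comes from the slip term itself.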

  1. THERMODYNAMIC MODEL OF GAS HYDRATES

    OpenAIRE

    Недоступ, В. И.; Недоступ, О. В.

    2015-01-01

Interest in gas hydrates has grown in recent years, so the development of reliable computational and theoretical methods for determining their properties is necessary. A thermodynamic model of gas hydrates is described, in which the central place is occupied by the behaviour of the guest molecule in its cell. Equations for the interaction of the hydrate-forming gas molecule with the cell are derived, and the enthalpy and energy of escape of the molecule from the cell are determined. The equation for the calculation of thermody...

  2. Proposing a Metaliteracy Model to Redefine Information Literacy

    Science.gov (United States)

    Jacobson, Trudi E.; Mackey, Thomas P.

    2013-01-01

    Metaliteracy is envisioned as a comprehensive model for information literacy to advance critical thinking and reflection in social media, open learning settings, and online communities. At this critical time in higher education, an expansion of the original definition of information literacy is required to include the interactive production and…

  3. Spin foam models for quantum gravity

    International Nuclear Information System (INIS)

    Perez, Alejandro

    2003-01-01

    In this topical review, we review the present status of the spin foam formulation of non-perturbative (background-independent) quantum gravity. The topical review is divided into two parts. In the first part, we present a general introduction to the main ideas emphasizing their motivation from various perspectives. Riemannian three-dimensional gravity is used as a simple example to illustrate conceptual issues and the main goals of the approach. The main features of the various existing models for four-dimensional gravity are also presented here. We conclude with a discussion of important questions to be addressed in four dimensions (gauge invariance, discretization independence, etc). In the second part, we concentrate on the definition of the Barrett-Crane model. We present the main results obtained in this framework from a critical perspective. Finally, we review the combinatorial formulation of spin foam models based on the dual group field theory technology. We present the Barrett-Crane model in this framework and review the finiteness results obtained for both its Riemannian and its Lorentzian variants. (topical review)

  4. A cautionary note on generalized linear models for covariance of unbalanced longitudinal data

    KAUST Repository

    Huang, Jianhua Z.

    2012-03-01

    Missing data in longitudinal studies can create enormous challenges in data analysis when coupled with the positive-definiteness constraint on a covariance matrix. For complete balanced data, the Cholesky decomposition of a covariance matrix makes it possible to remove the positive-definiteness constraint and use a generalized linear model setup to jointly model the mean and covariance using covariates (Pourahmadi, 2000). However, this approach may not be directly applicable when the longitudinal data are unbalanced, as coherent regression models for the dependence across all times and subjects may not exist. Within the existing generalized linear model framework, we show how to overcome this and other challenges by embedding the covariance matrix of the observed data for each subject in a larger covariance matrix and employing the familiar EM algorithm to compute the maximum likelihood estimates of the parameters and their standard errors. We illustrate and assess the methodology using real data sets and simulations. © 2011 Elsevier B.V.
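The key point, that an unconstrained Cholesky-type parameterization always yields a valid covariance, can be checked directly. The toy numbers below are invented; Sigma = L L^T with a positive diagonal on L is symmetric positive definite for any real parameters:

```python
# Sketch: unconstrained real parameters -> valid covariance matrix via a
# Cholesky-type factor (toy 3x3; all parameter values invented).
import math

# Unconstrained real parameters; the diagonal is forced positive via exp().
theta_diag = [0.3, -0.2, 0.1]
theta_off = [0.5, -0.4, 0.8]          # (2,1), (3,1), (3,2) entries of L

L = [[math.exp(theta_diag[0]), 0.0, 0.0],
     [theta_off[0], math.exp(theta_diag[1]), 0.0],
     [theta_off[1], theta_off[2], math.exp(theta_diag[2])]]

# Sigma = L * L^T is symmetric by construction.
sigma = [[sum(L[i][k] * L[j][k] for k in range(3)) for j in range(3)]
         for i in range(3)]

def minor_det(m, n):
    """Determinant of the n x n leading principal submatrix (n <= 3)."""
    sub = [row[:n] for row in m[:n]]
    if n == 1: return sub[0][0]
    if n == 2: return sub[0][0]*sub[1][1] - sub[0][1]*sub[1][0]
    return (sub[0][0]*(sub[1][1]*sub[2][2]-sub[1][2]*sub[2][1])
            - sub[0][1]*(sub[1][0]*sub[2][2]-sub[1][2]*sub[2][0])
            + sub[0][2]*(sub[1][0]*sub[2][1]-sub[1][1]*sub[2][0]))

# Positive definite by Sylvester's criterion: all leading minors positive.
print(all(minor_det(sigma, n) > 0 for n in (1, 2, 3)))
```

This is exactly why regressing the factor entries on covariates poses no constraint problem for balanced data; the paper's contribution is handling the unbalanced case, where such a coherent parameterization need not exist across subjects.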

  5. Structures and scan strategies of software net models

    International Nuclear Information System (INIS)

    Puhr-Westerheide, P.; Sandbaek, H.

    1984-01-01

    The present paper deals with some aspects of plant control and monitoring systems as used in nuclear power plants. These aspects concern executable net models to run on computers. A short survey on the nets' environment and on some net scan strategies is given. Among the strategies are the 'topologically ordered scan' and the 'signal propagation scan'. A combined method, the 'topologically ordered signal propagation (TOSIP) scan', is outlined, as well as a net model data structure that allows the definition of subsystems for clear structuring and for offloading to distributed systems. (author)
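As a rough illustration of what a 'topologically ordered scan' involves (the net and block names here are invented for this sketch), the blocks of a net model can be evaluated in dependency order using Kahn's algorithm, so each block is computed only after all of its inputs:

```python
from collections import deque

def topological_order(deps):
    """deps: block -> list of input blocks. Returns an evaluation order
    in which every block appears after all of its inputs (Kahn's algorithm)."""
    indegree = {b: len(inputs) for b, inputs in deps.items()}
    consumers = {b: [] for b in deps}
    for b, inputs in deps.items():
        for src in inputs:
            consumers[src].append(b)     # src feeds block b
    ready = deque(b for b, d in indegree.items() if d == 0)
    order = []
    while ready:
        b = ready.popleft()
        order.append(b)
        for c in consumers[b]:
            indegree[c] -= 1
            if indegree[c] == 0:         # all inputs of c are now evaluated
                ready.append(c)
    return order

net = {"sensor": [], "filter": ["sensor"], "alarm": ["filter", "sensor"]}
order = topological_order(net)  # sensor before filter before alarm
```

A signal propagation scan, by contrast, would re-evaluate only the consumers of blocks whose outputs actually changed.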

  6. Modeling of processes of an adaptive business management

    Directory of Open Access Journals (Sweden)

    Karev Dmitry Vladimirovich

    2011-04-01

    Full Text Available Based on an analysis of adaptive business management systems, an original version of a real adaptive management system is proposed, built on a dynamic recursive model of cash-flow forecasts and real data. Definitions and simulations of the scales and intervals of model time in the control system are proposed, as well as observation thresholds and conditions for changing (correcting) management decisions. The adaptive management process is illustrated using a business development scenario proposed by the author.

  7. Documentation of the Uranium Market Model (UMM)

    International Nuclear Information System (INIS)

    1989-01-01

    The Uranium Market Model is used to make projections of activity in the US uranium mining and milling industry. The primary data sources were EIA, the Nuclear Assurance Corporation, and, to a lesser extent, Nuexco and Nuclear Resources International. The Uranium Market Model is a microeconomic simulation model in which uranium supplied by the mining and milling industry is provided to meet the demand for uranium by electric utilities with nuclear power plants. Uranium is measured on a U3O8 (uranium oxide) equivalent basis. The model considers every major production center and utility on a worldwide basis (with Centrally Planned Economies considered in a limited way), and makes annual projections for each major uranium production and consumption region in the world. Typically, nine regions are used: the United States, Canada, Australia, South Africa, Other Africa, Europe, Latin America, the Far East, and Other. Production centers and utilities are identified as being in one of these regions. In general, the model can accommodate any user-provided set of regional definitions and data.

  8. Numerical Solution of Fractional Neutron Point Kinetics Model in Nuclear Reactor

    Directory of Open Access Journals (Sweden)

    Nowak Tomasz Karol

    2014-06-01

    Full Text Available This paper presents results concerning solutions of the fractional neutron point kinetics model for a nuclear reactor. Proposed model consists of a bilinear system of fractional and ordinary differential equations. Three methods to solve the model are presented and compared. The first one entails application of discrete Grünwald-Letnikov definition of the fractional derivative in the model. Second involves building an analog scheme in the FOMCON Toolbox in MATLAB environment. Third is the method proposed by Edwards. The impact of selected parameters on the model’s response was examined. The results for typical input were discussed and compared.
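The first of the three methods rests on the discrete Grünwald-Letnikov definition of the fractional derivative. A minimal sketch of that definition (an illustration, not the paper's reactor solver):

```python
import numpy as np

def grunwald_letnikov(f_vals, alpha, h):
    """Discrete Grünwald-Letnikov fractional derivative of order alpha.
    f_vals are samples f(t0), f(t0+h), ..., f(tn); returns the GL
    approximation at each grid point using the full history up to that
    point. The weights w_k = (-1)^k * binom(alpha, k) are generated by
    the standard recurrence w_0 = 1, w_k = w_{k-1} * (1 - (alpha+1)/k)."""
    n = len(f_vals)
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (1.0 - (alpha + 1.0) / k)
    out = np.empty(n)
    for i in range(n):
        # sum_k w_k * f(t_i - k*h), scaled by h^(-alpha)
        out[i] = np.dot(w[: i + 1], f_vals[i::-1]) / h ** alpha
    return out

# Sanity check: for alpha = 1 the GL weights become [1, -1, 0, ...] and
# the operator reduces to the ordinary backward difference.
t = np.linspace(0.0, 1.0, 101)
d1 = grunwald_letnikov(t ** 2, alpha=1.0, h=t[1] - t[0])
```

For non-integer alpha the same loop yields the long-memory behavior that distinguishes the fractional point kinetics model from the classical one.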

  9. Measurement of psychological disorders using cognitive diagnosis models.

    Science.gov (United States)

    Templin, Jonathan L; Henson, Robert A

    2006-09-01

    Cognitive diagnosis models are constrained (multiple classification) latent class models that characterize the relationship of questionnaire responses to a set of dichotomous latent variables. Having emanated from educational measurement, several aspects of such models seem well suited to use in psychological assessment and diagnosis. This article presents the development of a new cognitive diagnosis model for use in psychological assessment--the DINO (deterministic input; noisy "or" gate) model--which, as an illustrative example, is applied to evaluate and diagnose pathological gamblers. As part of this example, a demonstration of the estimates obtained by cognitive diagnosis models is provided. Such estimates include the probability an individual meets each of a set of dichotomous Diagnostic and Statistical Manual of Mental Disorders (text revision [DSM-IV-TR]; American Psychiatric Association, 2000) criteria, resulting in an estimate of the probability an individual meets the DSM-IV-TR definition for being a pathological gambler. Furthermore, a demonstration of how the hypothesized underlying factors contributing to pathological gambling can be measured with the DINO model is presented, through use of a covariance structure model for the tetrachoric correlation matrix of the dichotomous latent variables representing DSM-IV-TR criteria. Copyright 2006 APA
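The "noisy or" gate at the heart of the DINO model can be illustrated with a small sketch (all parameter values here are hypothetical): the probability of endorsing a criterion is one minus the probability that every present latent factor fails to trigger it, allowing for a baseline "leak" probability:

```python
def noisy_or(latent_states, trigger_probs, leak=0.05):
    """Noisy-or item response sketch. latent_states: 0/1 indicator per
    latent factor; trigger_probs: probability that a present factor
    triggers endorsement of the criterion; leak: baseline probability
    of endorsement when no factor is present."""
    p_all_fail = 1.0 - leak
    for state, lam in zip(latent_states, trigger_probs):
        if state:                      # only present factors can trigger
            p_all_fail *= (1.0 - lam)
    return 1.0 - p_all_fail

# Two of three hypothetical factors present:
p = noisy_or([1, 0, 1], [0.6, 0.7, 0.5], leak=0.05)
# 1 - 0.95 * 0.4 * 0.5 = 0.81
```

Endorsement probability rises with each additional present factor, which is the disjunctive ("or") structure that distinguishes DINO from conjunctive models such as DINA.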

  10. Large-scale modeling of rain fields from a rain cell deterministic model

    Science.gov (United States)

    Féral, Laurent; Sauvageot, Henri; Castanet, Laurent; Lemorton, Joël; Cornet, Frédéric; Leconte, Katia

    2006-04-01

    A methodology to simulate two-dimensional rain rate fields at large scale (1000 × 1000 km2, the scale of a satellite telecommunication beam or a terrestrial fixed broadband wireless access network) is proposed. It relies on a rain rate field cellular decomposition. At small scale (˜20 × 20 km2), the rain field is split up into its macroscopic components, the rain cells, described by the Hybrid Cell (HYCELL) cellular model. At midscale (˜150 × 150 km2), the rain field results from the conglomeration of rain cells modeled by HYCELL. To account for the rain cell spatial distribution at midscale, the latter is modeled by a doubly aggregative isotropic random walk, the optimal parameterization of which is derived from radar observations at midscale. The extension of the simulation area from the midscale to the large scale (1000 × 1000 km2) requires the modeling of the weather frontal area. The latter is first modeled by a Gaussian field with anisotropic covariance function. The Gaussian field is then turned into a binary field, giving the large-scale locations over which it is raining. This transformation requires the definition of the rain occupation rate over large-scale areas. Its probability distribution is determined from observations by the French operational radar network ARAMIS. The coupling with the rain field modeling at midscale is immediate whenever the large-scale field is split up into midscale subareas. The rain field thus generated accounts for the local CDF at each point, defining a structure spatially correlated at small scale, midscale, and large scale. It is then suggested that this approach be used by system designers to evaluate diversity gain, terrestrial path attenuation, or slant path attenuation for different azimuth and elevation angle directions.
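One step above — turning the Gaussian field into a binary rain/no-rain field with a prescribed large-scale occupation rate — can be sketched by thresholding the field at the corresponding quantile. This is a simplified stand-in (the paper's field has an anisotropic covariance structure; here an uncorrelated field is used purely to show the thresholding):

```python
import numpy as np

def binarize_gaussian_field(field, occupation_rate):
    """Return a 0/1 field in which a fraction `occupation_rate` of cells
    rain: the threshold is the (1 - rate) quantile of the field values."""
    threshold = np.quantile(field, 1.0 - occupation_rate)
    return (field > threshold).astype(int)

rng = np.random.default_rng(1)
g = rng.normal(size=(100, 100))        # stand-in for a correlated Gaussian field
rain_mask = binarize_gaussian_field(g, occupation_rate=0.3)
frac = rain_mask.mean()                # close to 0.30 by construction
```

With a spatially correlated input field, the same thresholding yields contiguous raining regions rather than speckle, which is what lets the midscale cellular model be embedded inside the large-scale binary field.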

  11. Examining the Etiology of Reading Disability as Conceptualized by the Hybrid Model

    Science.gov (United States)

    Erbeli, Florina; Hart, Sara A.; Wagner, Richard K.; Taylor, Jeanette

    2018-01-01

    A fairly recent definition of reading disability (RD) is that in the form of a hybrid model. The model views RD as a latent construct that is manifested through various observable unexpected impairments in reading-related skills and through inadequate response to intervention. The current report evaluated this new conceptualization of RD from an…

  12. A compositional modelling framework for exploring MPSoC systems

    DEFF Research Database (Denmark)

    Tranberg-Hansen, Anders Sejer; Madsen, Jan

    2009-01-01

    This paper presents a novel compositional framework for system level performance estimation and exploration of Multi-Processor System On Chip (MPSoC) based systems. The main contributions are the definition of a compositional model which allows quantitative performance estimation to be carried ou...

  13. System modelling of a lateral force microscope

    International Nuclear Information System (INIS)

    Michal, Guillaume; Lu, Cheng; Kiet Tieu, A

    2008-01-01

    To quantitatively analyse lateral force microscope measurements one needs to develop a model able to relate the photodiode signal to the force acting on the tip apex. In this paper we focus on the modelling of the interaction between the cantilever and the optical chain. The laser beam is discretized by a set of rays which propagates in the system. The analytical equation of a single ray's position on the optical sensor is presented as a function of the reflection's state on top of the cantilever. We use a finite element analysis on the cantilever to connect the optical model with the force acting on the tip apex. A first-order approximation of the constitutive equations is derived along with a definition of the system's crosstalk. Finally, the model is used to analytically simulate the 'wedge method' in the presence of crosstalk in 2D. The analysis shows how the torsion loop and torsion offset signals are affected by the crosstalk.

  14. Aerodynamic models for a Darrieus wind turbine

    Science.gov (United States)

    Fraunie, P.; Beguier, C.; Paraschivoiu, I.; Delclaux, F.

    1982-11-01

    Various models proposed for the aerodynamics of Darrieus wind turbines are reviewed. The magnitude of the L/D ratio for a Darrieus rotor blade is dependent on the profile, the Re, boundary layer characteristics, and the three-dimensional flow effects. The aerodynamic efficiency is theoretically the Betz limit, and the interference of one blade with another is constrained by the drag force integrated over all points on the actuator disk. A single streamtube model can predict the power available in a Darrieus, but the model lacks definition of the flow structure and the cyclic stresses. Techniques for calculating the velocity profiles and the consequent induced velocity at the blades are presented. The multiple streamtube theory has been devised to account for the repartition of the velocity in the rotor interior. The model has been expanded as the double multiple streamtube theory at Sandia Laboratories. Further work is necessary, however, to include the effects of dynamic decoupling at high rotation speeds and to accurately describe blade behavior.
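The Betz limit mentioned above follows from actuator-disk theory: the power coefficient is Cp(a) = 4a(1 - a)^2 for axial induction factor a, maximized at a = 1/3 where Cp = 16/27 ≈ 0.593. A quick numerical check:

```python
def power_coefficient(a):
    """Actuator-disk power coefficient Cp(a) = 4 a (1 - a)^2."""
    return 4.0 * a * (1.0 - a) ** 2

# Scan induction factors 0 .. 0.5 in steps of 0.001 for the maximum.
best_a = max((i / 1000.0 for i in range(501)), key=power_coefficient)
# best_a is close to 1/3 and power_coefficient(best_a) close to 16/27
```

No streamtube model, single or multiple, can exceed this bound; the models differ only in how the induced velocity deficit is distributed across the rotor.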

  15. Aspect Based Classification Model for Social Reviews

    Directory of Open Access Journals (Sweden)

    J. Mir

    2017-12-01

    Full Text Available Aspect-based opinion mining investigates in depth the emotions related to particular aspects. Aspect and opinion-word identification is the core task of aspect-based opinion mining. In previous studies, aspect-based opinion mining has been applied to service or product domains. Moreover, product reviews are short and simple, whereas social reviews are long and complex. This study introduces an efficient model for social reviews which classifies aspects and opinion words related to the social domain. The main contributions of this paper are the auto-tagging and data-training phase, the feature-set definition, and dictionary usage. The proposed model's results are compared with the CR model and a Naïve Bayes classifier on the same dataset, achieving 98.17% accuracy and 96.01% precision, while recall and F1 are 96.00% and 96.01%, respectively. The experimental results show that the proposed model performs better than the CR model and the Naïve Bayes classifier.
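For reference, the evaluation metrics quoted above (accuracy, precision, recall, F1) are all derived from a binary confusion matrix, as in this small illustrative sketch (toy labels, not the paper's data):

```python
def classification_metrics(y_true, y_pred):
    """Compute accuracy, precision, recall and F1 from binary labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return accuracy, precision, recall, f1

acc, prec, rec, f1 = classification_metrics([1, 1, 0, 0, 1], [1, 0, 0, 1, 1])
# tp=2, fn=1, fp=1, tn=1 -> acc=0.6, prec=2/3, rec=2/3, f1=2/3
```

Comparing classifiers on the same dataset with these four numbers, as the paper does, guards against a model that trades recall for precision appearing uniformly better.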

  16. Academic Talent Development Programs: A Best Practices Model

    Science.gov (United States)

    Gagné, Françoys

    2015-01-01

    This article aims to describe how schools should structure the development of academic talent at all levels of the K-12 educational system. Adopting as its theoretical framework the "Differentiating Model of Giftedness and Talent," the author proposes (a) a formal definition of academic talent development (ATD) inspired by the principles…

  17. Logistic regression model for diagnosis of transition zone prostate cancer on multi-parametric MRI.

    Science.gov (United States)

    Dikaios, Nikolaos; Alkalbani, Jokha; Sidhu, Harbir Singh; Fujiwara, Taiki; Abd-Alazeez, Mohamed; Kirkham, Alex; Allen, Clare; Ahmed, Hashim; Emberton, Mark; Freeman, Alex; Halligan, Steve; Taylor, Stuart; Atkinson, David; Punwani, Shonit

    2015-02-01

    We aimed to develop logistic regression (LR) models for classifying prostate cancer within the transition zone on multi-parametric magnetic resonance imaging (mp-MRI). One hundred and fifty-five patients (training cohort, 70 patients; temporal validation cohort, 85 patients) underwent mp-MRI and transperineal-template-prostate-mapping (TPM) biopsy. Positive cores were classified by cancer definitions: (1) any-cancer; (2) definition-1 [≥Gleason 4 + 3 or ≥ 6 mm cancer core length (CCL)] [high risk significant]; and (3) definition-2 (≥Gleason 3 + 4 or ≥ 4 mm CCL) cancer [intermediate-high risk significant]. For each, logistic-regression mp-MRI models were derived from the training cohort and validated internally and with the temporal cohort. Sensitivity/specificity and the area under the receiver operating characteristic (ROC-AUC) curve were calculated. LR model performance was compared to radiologists' performance. Twenty-eight of 70 patients from the training cohort, and 25/85 patients from the temporal validation cohort had significant cancer on TPM. The ROC-AUC of the LR model for classification of cancer was 0.73/0.67 at internal/temporal validation. The radiologist A/B ROC-AUC was 0.65/0.74 (temporal cohort). For patients scored by radiologists as Prostate Imaging Reporting and Data System (Pi-RADS) score 3, sensitivity/specificity of radiologist A 'best guess' and LR model was 0.14/0.54 and 0.71/0.61, respectively; and radiologist B 'best guess' and LR model was 0.40/0.34 and 0.50/0.76, respectively. LR models can improve classification of Pi-RADS score 3 lesions similar to experienced radiologists. • MRI helps find prostate cancer in the anterior of the gland • Logistic regression models based on mp-MRI can classify prostate cancer • Computers can help confirm cancer in areas doctors are uncertain about.
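The modelling pattern described — fit a logistic regression on a training cohort, then report ROC-AUC on a held-out temporal validation cohort — can be sketched with scikit-learn. Everything below is invented stand-in data: the three features are hypothetical mp-MRI-style predictors, and only the cohort sizes come from the abstract:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n_train, n_val = 70, 85                    # cohort sizes from the abstract

# Synthetic stand-in features (e.g. ADC, T2, dynamic-contrast measures)
# with labels driven mostly by the first feature plus noise.
X_train = rng.normal(size=(n_train, 3))
y_train = (X_train[:, 0] + 0.5 * rng.normal(size=n_train) > 0).astype(int)
X_val = rng.normal(size=(n_val, 3))
y_val = (X_val[:, 0] + 0.5 * rng.normal(size=n_val) > 0).astype(int)

model = LogisticRegression().fit(X_train, y_train)   # train on cohort 1
auc = roc_auc_score(y_val, model.predict_proba(X_val)[:, 1])  # validate on cohort 2
```

Reporting the AUC on a temporally separate cohort, rather than on the training data, is what makes the 0.73/0.67 internal/temporal figures in the abstract comparable to radiologist performance.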

  18. Mouse Models as Predictors of Human Responses: Evolutionary Medicine.

    Science.gov (United States)

    Uhl, Elizabeth W; Warner, Natalie J

    Mice offer a number of advantages and are extensively used to model human diseases and drug responses. Selective breeding and genetic manipulation of mice have made many different genotypes and phenotypes available for research. However, in many cases, mouse models have failed to be predictive. Important sources of the prediction problem have been the failure to consider the evolutionary basis for species differences, especially in drug metabolism, and disease definitions that do not reflect the complexity of gene expression underlying disease phenotypes. Incorporating evolutionary insights into mouse models allows for unique opportunities to characterize the effects of diet, different gene expression profiles, and microbiomics underlying human drug responses and disease phenotypes.

  19. Logistic regression model for diagnosis of transition zone prostate cancer on multi-parametric MRI

    Energy Technology Data Exchange (ETDEWEB)

    Dikaios, Nikolaos; Halligan, Steve; Taylor, Stuart; Atkinson, David; Punwani, Shonit [University College London, Centre for Medical Imaging, London (United Kingdom); University College London Hospital, Departments of Radiology, London (United Kingdom); Alkalbani, Jokha; Sidhu, Harbir Singh; Fujiwara, Taiki [University College London, Centre for Medical Imaging, London (United Kingdom); Abd-Alazeez, Mohamed; Ahmed, Hashim; Emberton, Mark [University College London, Research Department of Urology, London (United Kingdom); Kirkham, Alex; Allen, Clare [University College London Hospital, Departments of Radiology, London (United Kingdom); Freeman, Alex [University College London Hospital, Department of Histopathology, London (United Kingdom)

    2014-09-17

    We aimed to develop logistic regression (LR) models for classifying prostate cancer within the transition zone on multi-parametric magnetic resonance imaging (mp-MRI). One hundred and fifty-five patients (training cohort, 70 patients; temporal validation cohort, 85 patients) underwent mp-MRI and transperineal-template-prostate-mapping (TPM) biopsy. Positive cores were classified by cancer definitions: (1) any-cancer; (2) definition-1 [≥Gleason 4 + 3 or ≥ 6 mm cancer core length (CCL)] [high risk significant]; and (3) definition-2 (≥Gleason 3 + 4 or ≥ 4 mm CCL) cancer [intermediate-high risk significant]. For each, logistic-regression mp-MRI models were derived from the training cohort and validated internally and with the temporal cohort. Sensitivity/specificity and the area under the receiver operating characteristic (ROC-AUC) curve were calculated. LR model performance was compared to radiologists' performance. Twenty-eight of 70 patients from the training cohort, and 25/85 patients from the temporal validation cohort had significant cancer on TPM. The ROC-AUC of the LR model for classification of cancer was 0.73/0.67 at internal/temporal validation. The radiologist A/B ROC-AUC was 0.65/0.74 (temporal cohort). For patients scored by radiologists as Prostate Imaging Reporting and Data System (Pi-RADS) score 3, sensitivity/specificity of radiologist A 'best guess' and LR model was 0.14/0.54 and 0.71/0.61, respectively; and radiologist B 'best guess' and LR model was 0.40/0.34 and 0.50/0.76, respectively. LR models can improve classification of Pi-RADS score 3 lesions similar to experienced radiologists. (orig.)

  1. Viable Techniques, Leontief’s Closed Model, and Sraffa’s Subsistence Economies

    Directory of Open Access Journals (Sweden)

    Alberto Benítez

    2014-11-01

    Full Text Available This paper studies the production techniques employed in economies that reproduce themselves. Special attention is paid to the distinction usually made between those that do not produce a surplus and those that do, which are referred to as first and second class economies, respectively. Based on this, we present a new definition of viable economies and show that every viable economy of the second class can be represented as a viable economy of the first class in two different forms, Leontief's closed model and Sraffa's subsistence economies. This allows us to present some remarks concerning the economic interpretation of the two models. On the one hand, we argue that the participation of each good in the production of every good can be considered as a normal characteristic of the first model and, on the other hand, we provide a justification for the same condition to be considered a characteristic of the second model. Furthermore, we discuss three definitions of viable techniques advanced by other authors and show that they differ from ours because they admit economies that do not reproduce themselves completely.
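One common formal test connected to the first-class/second-class distinction can be sketched numerically (a hedged illustration, not necessarily the paper's own definition of viability): for an input coefficient matrix A, a subsistence economy reproduces itself exactly when the spectral radius of A equals 1, while a surplus-producing economy has spectral radius below 1:

```python
import numpy as np

# Hypothetical two-good input matrices (A[i, j] = input of good i per
# unit output of good j).
A_subsistence = np.array([[0.5, 0.5],
                          [0.5, 0.5]])   # no surplus: all output is used up
A_surplus = np.array([[0.3, 0.2],
                      [0.2, 0.3]])       # produces a surplus

def spectral_radius(A):
    """Largest eigenvalue modulus of A."""
    return max(abs(np.linalg.eigvals(A)))

# spectral_radius(A_subsistence) is 1 (first class);
# spectral_radius(A_surplus) is 0.5 < 1 (second class).
```

In the surplus case (I - A) is invertible with a nonnegative inverse, which is what allows the economy to be re-expressed as a first-class system by absorbing the surplus, in the spirit of the representations discussed in the paper.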

  2. An approach to modeling tensile–compressive asymmetry for martensitic shape memory alloys

    International Nuclear Information System (INIS)

    Zaki, Wael

    2010-01-01

    In this paper, the asymmetric tensile–compressive behavior of shape memory alloys is modeled based on the mathematical framework of Raniecki and Mróz (2008 Acta Mech. 195 81–102). The framework allows the definition of smooth, non-symmetric, pressure-insensitive yield functions that are used here to incorporate tensile–compressive modeling capabilities into the Zaki–Moumni (ZM) model for shape memory materials. It is found that, despite some increased complexity, the generalized model is capable of producing satisfactory results that agree with uniaxial experimental data taken from the literature

  3. A formal model for classifying trusted Semantic Web Services

    OpenAIRE

    Galizia, Stefania; Gugliotta, Alessio; Pedrinaci, Carlos

    2008-01-01

    Semantic Web Services (SWS) aim to alleviate Web service limitations, by combining Web service technologies with the potential of Semantic Web. Several open issues have to be tackled yet, in order to enable a safe and efficient Web services selection. One of them is represented by trust. In this paper, we introduce a trust definition and formalize a model for managing trust in SWS. The model approaches the selection of trusted Web services as a classification problem, and it is realized by an...

  4. Communication and Procedural Models of the E-commerce Systems

    OpenAIRE

    Suchánek, Petr

    2009-01-01

    E-commerce systems have become a standard interface between sellers (or suppliers) and customers. One basic condition for an e-commerce system to be efficient is the correct definition and description of all its internal and external processes. Everything is targeted at customers' needs and requirements. The optimal and most exact way to find the best solution for an e-commerce system and its process structure in companies is modeling and simulation. In this article the author shows basic model...

  5. Craving's place in addiction theory: contributions of the major models.

    Science.gov (United States)

    Skinner, Marilyn D; Aubin, Henri-Jean

    2010-03-01

    We examine in this paper the unfolding of craving concepts within 18 models that span roughly 60 years (1948-2009). The amassed evidence suggests that craving is an indispensable construct, useful as a research area because it has continued to destabilize patients seeking treatment for substances. The models fall into four categories: the conditioning-based models, the cognitive models, the psychobiological models, and the motivation models. In the conditioning models, craving is assumed to be an automatic, unconscious reaction to a stimulus. In the cognitive models, craving arises from the operation of information processing systems. In the psychobiological models, craving can be explained at least in part by biological factors with an emphasis on motivational components. Finally, in the motivation models, craving is viewed as a component of a larger decision-making framework. It is well accepted that no single model explains craving completely, suggesting that a solid understanding of the phenomenon will only occur with consideration from multiple angles. A reformulated definition of craving is proposed. (c) 2009 Elsevier Ltd. All rights reserved.

  6. Model-Based Systems Engineering in Concurrent Engineering Centers

    Science.gov (United States)

    Iwata, Curtis; Infeld, Samantha; Bracken, Jennifer Medlin; McGuire, Melissa; McQuirk, Christina; Kisdi, Aron; Murphy, Jonathan; Cole, Bjorn; Zarifian, Pezhman

    2015-01-01

    Concurrent Engineering Centers (CECs) are specialized facilities with a goal of generating and maturing engineering designs by enabling rapid design iterations. This is accomplished by co-locating a team of experts (either physically or virtually) in a room with a narrow design goal and a limited timeline of a week or less. The systems engineer uses a model of the system to capture the relevant interfaces and manage the overall architecture. A single model that integrates other design information and modeling allows the entire team to visualize the concurrent activity and identify conflicts more efficiently, potentially resulting in a systems model that will continue to be used throughout the project lifecycle. Performing systems engineering using such a system model is the definition of model-based systems engineering (MBSE); therefore, CECs evolving their approach to incorporate advances in MBSE are more successful in reducing time and cost needed to meet study goals. This paper surveys space mission CECs that are in the middle of this evolution, and the authors share their experiences in order to promote discussion within the community.

  7. Adapting Evaluations of Alternative Payment Models to a Changing Environment.

    Science.gov (United States)

    Grannemann, Thomas W; Brown, Randall S

    2018-04-01

    To identify the most robust methods for evaluating alternative payment models (APMs) in the emerging health care delivery system environment. We assess the impact of widespread testing of alternative payment models on the ability to find credible comparison groups. We consider the applicability of factorial research designs for assessing the effects of these models. The widespread adoption of alternative payment models could effectively eliminate the possibility of comparing APM results with a "pure" control or comparison group unaffected by other interventions. In this new environment, factorial experiments have distinct advantages over the single-model experimental or quasi-experimental designs that have been the mainstay of recent tests of Medicare payment and delivery models. The best prospects for producing definitive evidence of the effects of payment incentives for APMs include fractional factorial experiments that systematically vary requirements and payment provisions within a payment model. © Health Research and Educational Trust.

  8. Implementation of a Goal-Based Systems Engineering Process Using the Systems Modeling Language (SysML)

    Science.gov (United States)

    Breckenridge, Jonathan T.; Johnson, Stephen B.

    2013-01-01

    Building upon the purpose, theoretical approach, and use of a Goal-Function Tree (GFT) being presented by Dr. Stephen B. Johnson, described in a related Infotech 2013 ISHM abstract titled "Goal-Function Tree Modeling for Systems Engineering and Fault Management", this paper will describe the core framework used to implement the GFT-based systems engineering process using the Systems Modeling Language (SysML). These two papers are ideally accepted and presented together in the same Infotech session. Statement of problem: SysML, as a tool, is currently not capable of implementing the theoretical approach described within the "Goal-Function Tree Modeling for Systems Engineering and Fault Management" paper cited above. More generally, SysML's current capabilities to model functional decompositions in the rigorous manner required in the GFT approach are limited. The GFT is a new Model-Based Systems Engineering (MBSE) approach to the development of goals and requirements, functions, and its linkage to design. As a growing standard for systems engineering, it is important to develop methods to implement GFT in SysML. Proposed Method of Solution: Many of the central concepts of the SysML language are needed to implement a GFT for large complex systems. In the implementation of those central concepts, the following will be described in detail: changes to the nominal SysML process, model view definitions and examples, diagram definitions and examples, and detailed SysML construct and stereotype definitions.

  9. Computational Modeling for Language Acquisition: A Tutorial With Syntactic Islands.

    Science.gov (United States)

    Pearl, Lisa S; Sprouse, Jon

    2015-06-01

    Given the growing prominence of computational modeling in the acquisition research community, we present a tutorial on how to use computational modeling to investigate learning strategies that underlie the acquisition process. This is useful for understanding both typical and atypical linguistic development. We provide a general overview of why modeling can be a particularly informative tool and some general considerations when creating a computational acquisition model. We then review a concrete example of a computational acquisition model for complex structural knowledge referred to as syntactic islands. This includes an overview of syntactic islands knowledge, a precise definition of the acquisition task being modeled, the modeling results, and how to meaningfully interpret those results in a way that is relevant for questions about knowledge representation and the learning process. Computational modeling is a powerful tool that can be used to understand linguistic development. The general approach presented here can be used to investigate any acquisition task and any learning strategy, provided both are precisely defined.

  10. A visual LISP program for voxelizing AutoCAD solid models

    Science.gov (United States)

    Marschallinger, Robert; Jandrisevits, Carmen; Zobl, Fritz

    2015-01-01

    AutoCAD solid models are increasingly recognized in geological and geotechnical 3D modeling. In order to bridge the currently existing gap between AutoCAD solid models and the grid modeling realm, a Visual LISP program is presented that converts AutoCAD solid models into voxel arrays. Acad2Vox voxelizer works on a 3D-model that is made up of arbitrary non-overlapping 3D-solids. After definition of the target voxel array geometry, 3D-solids are scanned at grid positions and properties are streamed to an ASCII output file. Acad2Vox has a novel voxelization strategy that combines a hierarchical reduction of sampling dimensionality with an innovative use of AutoCAD-specific methods for a fast and memory-saving operation. Acad2Vox provides georeferenced, voxelized analogs of 3D design data that can act as regions-of-interest in later geostatistical modeling and simulation. The Supplement includes sample geological solid models with instructions for practical work with Acad2Vox.
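The scanning idea — test each voxel centre against a solid and stream the property to a grid — can be sketched in Python. This is an illustrative stand-in, not the Visual LISP code, and it omits Acad2Vox's hierarchical reduction of sampling dimensionality:

```python
import numpy as np

def voxelize(contains, origin, spacing, shape):
    """Sample each voxel centre against a solid's containment test.
    contains(x, y, z) -> bool; returns a 0/1 voxel array of `shape`."""
    vox = np.zeros(shape, dtype=np.uint8)
    for i in range(shape[0]):
        for j in range(shape[1]):
            for k in range(shape[2]):
                x = origin[0] + (i + 0.5) * spacing   # voxel centre
                y = origin[1] + (j + 0.5) * spacing
                z = origin[2] + (k + 0.5) * spacing
                if contains(x, y, z):
                    vox[i, j, k] = 1
    return vox

# Example: a unit sphere centred at the origin, scanned on an 8x8x8 grid
# over the cube [-1, 1]^3.
sphere = lambda x, y, z: x * x + y * y + z * z <= 1.0
v = voxelize(sphere, origin=(-1.0, -1.0, -1.0), spacing=0.25, shape=(8, 8, 8))
# v.mean() approximates the sphere/cube volume ratio pi/6 (about 0.52)
```

In Acad2Vox the containment test is replaced by AutoCAD-specific interference queries against the 3D-solids, and the property values are streamed to an ASCII file rather than held in memory.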

  11. Activated sludge wastewater treatment plant modelling and simulation: state of the art

    DEFF Research Database (Denmark)

    Gernaey, Krist; Loosdrecht, M.C.M. van; Henze, Mogens

    2004-01-01

    This review paper focuses on modelling of wastewater treatment plants (WWTP). White-box modelling is widely applied in this field, with learning, design and process optimisation as the main applications. The introduction of the ASM model family by the IWA task group was of great importance, providing researchers and practitioners with a standardised set of basis models. This paper introduces the nowadays most frequently used white-box models for description of biological nitrogen and phosphorus removal activated sludge processes. These models are mainly applicable to municipal wastewater systems, but can be adapted easily to specific situations such as the presence of industrial wastewater. Some of the main model assumptions are highlighted, and their implications for practical model application are discussed. A step-wise procedure leads from the model purpose definition to a calibrated...

  12. On the topology of the inflaton field in minimal supergravity models

    Science.gov (United States)

    Ferrara, Sergio; Fré, Pietro; Sorin, Alexander S.

    2014-04-01

    We consider global issues in minimal supergravity models where a single-field inflaton potential emerges. In a particular case we reproduce the Starobinsky model and its description dual to a certain formulation of R + R^2 supergravity. For definiteness we confine our analysis to spaces of constant curvature, either vanishing or negative. Five distinct models arise: two flat models, with a quadratic and a quartic potential respectively, and three based on the SU(1,1)/U(1) space, in which its distinct isometries (elliptic, hyperbolic and parabolic) are gauged. Fayet-Iliopoulos terms are introduced in a geometric way and turn out to be a crucial ingredient in describing the de Sitter inflationary phase of the Starobinsky model.

  13. On the Topology of the Inflaton Field in Minimal Supergravity Models

    CERN Document Server

    Ferrara, Sergio; Sorin, Alexander S

    2014-01-01

    We consider global issues in minimal supergravity models where a single field inflaton potential emerges. In a particular case we reproduce the Starobinsky model and its description dual to a certain formulation of R+R^2 supergravity. For definiteness we confine our analysis to spaces at constant curvature, either vanishing or negative. Five distinct models arise, two flat models with respectively a quadratic and a quartic potential and three based on the SU(1,1)/U(1) space where its distinct isometries, elliptic, hyperbolic and parabolic are gauged. Fayet-Iliopoulos terms are introduced in a geometric way and they turn out to be a crucial ingredient in order to describe the de Sitter inflationary phase of the Starobinsky model.

  14. Reducing uncertainty based on model fitness: Application to a ...

    African Journals Online (AJOL)

    A weakness of global sensitivity and uncertainty analysis methodologies is the often subjective definition of prior parameter probability distributions, especially ... The reservoir representing the central part of the wetland, where flood waters separate into several independent distributaries, is a keystone area within the model.

  15. Software vulnerability: Definition, modelling, and practical evaluation for e-mail transfer software

    International Nuclear Information System (INIS)

    Kimura, Mitsuhiro

    2006-01-01

    This paper proposes a method of assessing software vulnerability quantitatively. By expanding the concept of the IPO (input-program-output) model, we first define software vulnerability and construct a stochastic model. We then evaluate the software vulnerability of the sendmail system by analyzing actual security-hole data collected from its release notes. We also show the relationship between the estimated software reliability and the vulnerability of the analyzed system

  16. N=3 and N=4 superconformal WZNW sigma models in superspace. Part 1

    International Nuclear Information System (INIS)

    Ivanov, E.A.; Krivonos, S.O.

    1989-01-01

    A manifestly invariant superfield description is given of N=3 and N=4 2D superconformal WZNW sigma models with U(1)xO(3) and U(1)xO(4) as the bosonic target manifolds. We construct the N=3 superspace formulation of the U(1)xO(3) model and a self-contained definition of the N=3 supercurrent via the basic U(1)xO(3) model. 23 refs

  17. Equivalence of a Measurement Model of Cognitive Abilities in U.S. Standardization and Australian Neuroscience Samples

    Science.gov (United States)

    Bowden, Stephen C.; Weiss, Lawrence G.; Holdnack, James A.; Bardenhagen, Fiona J.; Cook, Mark J.

    2008-01-01

    A psychological measurement model provides an explicit definition of (a) the theoretical and (b) the numerical relationships between observed scores and the latent variables that underlie the observed scores. Examination of the metric invariance of a measurement model involves testing the hypothesis that all components of the model relating…

  18. Job stress models for predicting burnout syndrome: a review.

    Science.gov (United States)

    Chirico, Francesco

    2016-01-01

    In Europe, Council Directive 89/391 on the improvement of workers' safety and health has emphasized the importance of addressing all occupational risk factors, and hence also psychosocial and organizational risk factors. Nevertheless, the construct of "work-related stress" elaborated by EU-OSHA does not fully correspond to "psychosocial" risk, which is a broader category comprising various different psychosocial risk factors. The term "burnout", which has no binding definition, tries to integrate symptoms as well as causes of the burnout process. In Europe, the most important methods developed for work-related stress risk assessment are based on Cox's transactional model of job stress. Nevertheless, there are more specific models for predicting burnout syndrome. This literature review provides an overview of job burnout, highlighting the most important models of job burnout, such as the Job Strain, Effort/Reward Imbalance, and Job Demands-Resources models. The difference between these models and Cox's model of job stress is explored.

  19. Modeling and simulation of axisymmetric coating growth on nanofibers

    International Nuclear Information System (INIS)

    Moore, K.; Clemons, C. B.; Kreider, K. L.; Young, G. W.

    2007-01-01

    This work is a modeling and simulation extension of an integrated experimental/modeling investigation of a procedure to coat nanofibers and core-clad nanostructures with thin film materials using plasma enhanced physical vapor deposition. In the experimental effort, electrospun polymer nanofibers are coated with metallic materials under different operating conditions to observe changes in the coating morphology. The modeling effort focuses on linking simple models at the reactor level, nanofiber level, and atomic level to form a comprehensive model. The comprehensive model leads to the definition of an evolution equation for the coating free surface. This equation was previously derived and solved under a single-valued assumption in a polar geometry to determine the coating morphology as a function of operating conditions. The present work considers the axisymmetric geometry and solves the evolution equation without the single-valued assumption and under less restrictive assumptions on the concentration field than the previous work

  20. What Models and Satellites Tell Us (and Don't Tell Us) About Arctic Sea Ice Melt Season Length

    Science.gov (United States)

    Ahlert, A.; Jahn, A.

    2017-12-01

    Melt season length—the difference between the sea ice melt onset date and the sea ice freeze onset date—plays an important role in the radiation balance of the Arctic and the predictability of the sea ice cover. However, there are multiple possible definitions for sea ice melt and freeze onset in climate models, and none of them exactly correspond to the remote sensing definition. Using the CESM Large Ensemble model simulations, we show how this mismatch between model and remote sensing definitions of melt and freeze onset limits the utility of melt season remote sensing data for bias detection in models. It also opens up new questions about the precise physical meaning of the melt season remote sensing data. Despite these challenges, we find that the increase in melt season length in the CESM is not as large as that derived from remote sensing data, even when we account for internal variability and different definitions. At the same time, we find that the CESM ensemble members that have the largest trend in sea ice extent over the period 1979-2014 also have the largest melt season trend, driven primarily by the trend towards later freeze onsets. This might be an indication that an underestimation of the melt season length trend is one factor contributing to the generally underestimated sea ice loss within the CESM, and potentially climate models in general.
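The definitional point in the abstract is simple enough to state in code: melt season length is the difference between freeze onset and melt onset, so a trend toward later freeze onsets lengthens the melt season even with melt onset fixed. The day-of-year values below are made up for illustration, not CESM or satellite data.

```python
# Melt season length = freeze onset date - melt onset date (days).
# Illustrative day-of-year values only (not CESM or remote sensing data).

def melt_season_length(melt_onset_doy, freeze_onset_doy):
    return freeze_onset_doy - melt_onset_doy

# A later freeze onset lengthens the melt season with melt onset unchanged,
# the mechanism the abstract identifies as driving the simulated trend.
baseline = melt_season_length(150, 260)
late_freeze = melt_season_length(150, 275)
trend_contribution = late_freeze - baseline
```

Note that the abstract's caveat applies here too: the computed length depends entirely on which of the several possible onset definitions is used for `melt_onset_doy` and `freeze_onset_doy`.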

  1. Automated cost modeling for coal combustion systems

    International Nuclear Information System (INIS)

    Rowe, R.M.; Anast, K.R.

    1991-01-01

    This paper reports on cost information developed at the AMAX R and D Center for coal-water slurry production, implemented in an automated spreadsheet (Lotus 123) for personal computer use. The spreadsheet format allows the user to evaluate the impacts of various process options, coal feedstock characteristics, fuel characteristics, plant location sites, and plant sizes on fuel cost. Model flexibility reduces the time and labor required to determine fuel costs and provides a basis to compare fuels manufactured by different processes. The model input includes coal characteristics, plant flowsheet definition, plant size, and market location. Based on these inputs, selected unit operations are chosen for coal processing

  2. Towards a streaming model for nested data parallelism

    DEFF Research Database (Denmark)

    Madsen, Frederik Meisner; Filinski, Andrzej

    2013-01-01

    The language-integrated cost semantics for nested data parallelism pioneered by NESL provides an intuitive, high-level model for predicting performance and scalability of parallel algorithms with reasonable accuracy. However, this predictability, obtained through a uniform, parallelism-flattening... -processable in a streaming fashion. This semantics is directly compatible with previously proposed piecewise execution models for nested data parallelism, but allows the expected space usage to be reasoned about directly at the source-language level. The language definition and implementation are still very much work...

  3. Electrothermal Equivalent Three-Dimensional Finite-Element Model of a Single Neuron.

    Science.gov (United States)

    Cinelli, Ilaria; Destrade, Michel; Duffy, Maeve; McHugh, Peter

    2018-06-01

    We propose a novel approach for modelling the interdependence of electrical and mechanical phenomena in nervous cells, by using electrothermal equivalences in finite element (FE) analysis so that existing thermomechanical tools can be applied. First, the equivalence between electrical and thermal properties of the nerve materials is established, and results of a pure heat conduction analysis performed in Abaqus CAE Software 6.13-3 are validated with analytical solutions for a range of steady and transient conditions. This validation includes the definition of equivalent active membrane properties that enable prediction of the action potential. Then, as a step toward fully coupled models, electromechanical coupling is implemented through the definition of equivalent piezoelectric properties of the nerve membrane using the thermal expansion coefficient, enabling prediction of the mechanical response of the nerve to the action potential. Results of the coupled electromechanical model are validated with previously published experimental results of deformation for squid giant axon, crab nerve fibre, and garfish olfactory nerve fibre. A simplified coupled electromechanical modelling approach is established through an electrothermal equivalent FE model of a nervous cell for biomedical applications. One of the key findings is the mechanical characterization of the neural activity in a coupled electromechanical domain, which provides insights into the electromechanical behaviour of nervous cells, such as thinning of the membrane. This is a first step toward modelling three-dimensional electromechanical alteration induced by trauma at nerve bundle, tissue, and organ levels.
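The equivalence exploited above rests on the fact that passive electrical conduction and heat conduction obey the same diffusion-type equation, so a thermal solver integrates the electrical problem once the coefficients are mapped (voltage to temperature, electrical to thermal conductivity). A minimal sketch of that idea, with illustrative parameters rather than the paper's calibrated nerve properties or its Abaqus model:

```python
# Sketch of the electrothermal equivalence: the same 1-D explicit
# finite-difference diffusion solver serves both the electrical and the
# thermal problem when the coefficients are mapped. Parameters are
# illustrative, not the paper's nerve-membrane properties.

def diffuse(u, D, dx, dt, steps):
    """Explicit step of du/dt = D * d2u/dx2 with fixed-value ends."""
    u = list(u)
    for _ in range(steps):
        u = [u[0]] + [
            u[i] + D * dt / dx ** 2 * (u[i - 1] - 2 * u[i] + u[i + 1])
            for i in range(1, len(u) - 1)
        ] + [u[-1]]
    return u

# Electrical problem: a membrane-potential spike relaxing along a fibre.
v0 = [0.0] * 10 + [100.0] + [0.0] * 10
D_electrical = 0.2           # "diffusivity" built from the cable constants
v = diffuse(v0, D_electrical, dx=1.0, dt=0.5, steps=50)

# Thermal twin: same solver, thermal diffusivity set equal by the mapping,
# so the temperature field reproduces the voltage field exactly.
t = diffuse(v0, 0.2, dx=1.0, dt=0.5, steps=50)
```

The coupled electromechanical step in the paper goes further, reusing thermal-expansion and piezoelectric machinery, but the one-to-one field correspondence shown here is the foundation of the approach.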

  4. A mechanistic model for electricity consumption on dairy farms: Definition, validation, and demonstration

    NARCIS (Netherlands)

    Upton, J.R.; Murphy, M.; Shallo, L.; Groot Koerkamp, P.W.G.; Boer, de I.J.M.

    2014-01-01

    Our objective was to define and demonstrate a mechanistic model that enables dairy farmers to explore the impact of a technical or managerial innovation on electricity consumption, associated CO2 emissions, and electricity costs. We, therefore, (1) defined a model for electricity consumption on

  5. Modelling a New Product Model on the Basis of an Existing STEP Application Protocol

    Directory of Open Access Journals (Sweden)

    B.-R. Hoehn

    2005-01-01

    Full Text Available During recent years a great range of computer-aided tools has been generated to support the development process of various products. The goal of a continuous data flow, needed for high efficiency, requires powerful standards for data exchange. At the FZG (Gear Research Centre of the Technical University of Munich) there was a need for a common gear data format for data exchange between gear calculation programs. The STEP standard ISO 10303 was developed for this type of purpose, but a suitable definition of gear data was still missing, even in Application Protocol AP 214, developed for the design process in the automotive industry. The creation of a new STEP Application Protocol, or the extension of an existing protocol, would be a very time-consuming normative process. So a new method was introduced by FZG. Some very general definitions of an Application Protocol (here AP 214) were used to determine rules for an exact specification of the required kind of data. In this case a product model for gear units was defined based on elements of AP 214, so no change of the Application Protocol is necessary. Meanwhile the product model for gear units has been published as a VDMA paper and successfully introduced for data exchange within the German gear industry associated with FVA (German Research Organisation for Gears and Transmissions). This method can also be adopted for other applications not yet sufficiently defined by STEP.

  6. Modeling X-Ray Scattering Process and Applications of the Scattering Model

    Science.gov (United States)

    Al-Jundi, Taher Lutfi

    1995-01-01

    Computer modeling of nondestructive inspections with x-rays is proving to be a very useful tool for enhancing the performance of these techniques. Two x-ray-based inspection techniques are considered in this study. The first is radiographic inspection, where an existing simulation model has been improved to account for scattered-radiation effects. The second is inspection with Compton backscattering, where a new simulation model has been developed. The effect of scattered radiation on a simulated radiographic image can be insignificant, equally important, or more important than the effect of the uncollided flux. Techniques to account for scattered-radiation effects include Monte Carlo methods and solving the particle transport equation for photons. These two techniques, although accurate, are computationally expensive and hence inappropriate for use in computer simulation of radiography. A less accurate but computationally efficient approach is the principle of buildup factors. Traditionally, buildup factors are defined for monoenergetic photons of energies typical of a nuclear reactor. In this work I have expanded the definition of buildup factors to include a bremsstrahlung spectrum of photons with energies typically used in radiography (keVs instead of MeVs). This expansion of the definition relies on intensive experimental work to measure buildup factors for a white spectrum of x-rays. I have also developed a Monte Carlo code to reproduce the measured buildup factors. The code was then converted to a parallel code and distributed on a network of workstations to reduce the execution time. The second inspection technique is based on Compton backscattering, where photons are scattered at large angles, more than 90 degrees. The importance of this technique arises when the inspected object is very large, or when access is limited to only one side of the specimen. The downside of detecting photons from backscattering is the low
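The buildup-factor principle mentioned in the abstract can be illustrated in a few lines: the measured (broad-beam) flux behind a shield exceeds the uncollided narrow-beam flux by a factor B >= 1 that accounts for scattered photons reaching the detector. The attenuation coefficient and buildup value below are made up for illustration, not measured data from this work.

```python
import math

# Illustration of the buildup-factor concept: total flux = B * uncollided
# flux, with B >= 1 capturing in-scattered photons. Values of mu and B are
# hypothetical, not the dissertation's measured data.

def uncollided_flux(I0, mu, x):
    """Narrow-beam exponential attenuation through thickness x."""
    return I0 * math.exp(-mu * x)

def total_flux(I0, mu, x, B):
    """Broad-beam flux, including photons scattered into the detector."""
    return B * uncollided_flux(I0, mu, x)

def buildup_factor(total, uncollided):
    """The defining ratio: B = total flux / uncollided flux."""
    return total / uncollided

u = uncollided_flux(1000.0, 0.5, 2.0)   # one mean free path of shielding
t = total_flux(1000.0, 0.5, 2.0, 1.8)
```

The dissertation's extension replaces the single monoenergetic B with values measured for a bremsstrahlung (white) spectrum, but the defining ratio is the same.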

  7. The School Principals' Instructional Leadership Model (Model Kepemimpinan Instruksional Kepala Sekolah)

    Directory of Open Access Journals (Sweden)

    Husaini Usman

    2015-12-01

    Full Text Available Abstract: This study aimed to develop a model of school principals' instructional leadership. It used a qualitative, multi-case approach, with the researcher himself as the research instrument. Data were collected through in-depth interviews, participant observation, and documentation. The subjects were school principals, vice principals, and teachers, selected by snowball sampling; the key informants were the school principals. The objects of the study were actors, concepts, places, and activities. The analysis steps followed Creswell's (2014) model, and trustworthiness was established through the criteria of credibility, transferability, dependability, and confirmability. The study identified a school principals' instructional leadership model with the following cycle: understanding the definition of instructional leadership; its aims and benefits; indicators of effective instructional leadership; instructional leadership strategies; and practical ways of implementing instructional leadership. Keywords: leadership, instructional, school principal

  8. Computer Modelling «Smart Building»

    Directory of Open Access Journals (Sweden)

    O. Yu. Maryasin

    2016-01-01

    Full Text Available Currently, "smart building" or "smart house" technology is developing actively in industrialized countries. The main idea of a smart building or smart house is to have a system able to identify definite situations occurring in the house and respond accordingly. An automated house management system provides automated control and management and organizes interaction between the separate engineering-equipment systems, which it includes as automation subsystems. In order to study different operating modes of the engineering subsystems and of the whole system, mathematical and computer modeling must be used. From a mathematical point of view, the description of a smart building is a continuous-discrete or hybrid system consisting of interacting elements of different natures, whose behavior is described by continuous and discrete processes. In this article the authors present a computer model of a smart building which makes it possible to model the operation of the main engineering subsystems and management algorithms. The model is created in Simulink (MATLAB) with the "physical modeling" library Simscape and the Stateflow library. The peculiarity of this model is the use of specialized management and control algorithms which provide coordinated interaction of the subsystems and optimize power consumption.

  9. A plastic damage model with stress triaxiality-dependent hardening

    International Nuclear Information System (INIS)

    Shen Xinpu; Shen Guoxiao; Zhou Lin

    2005-01-01

    Emphasis in this study was placed on modelling the plastic damage behaviour of prestressed structural concrete, with special attention paid to the stress-triaxiality-dependent plastic hardening law and the corresponding damage evolution law. A definition of stress triaxiality was proposed and introduced in the model presented here. Drucker-Prager-type plasticity was adopted in the formulation of the plastic damage constitutive equations. Numerical validations were performed for the proposed plasticity-based damage model with a driver subroutine developed in this study. The predicted stress-strain behaviour is reasonably accurate for uniaxial tension and uniaxial compression compared with the experimental data reported in the references. Numerical calculations of compression under various hydrostatic stress confinements were carried out in order to validate the stress-triaxiality-dependent properties of the model. (authors)
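For readers unfamiliar with the quantity being made a hardening variable here, the most common definition of stress triaxiality is the ratio of mean (hydrostatic) stress to von Mises equivalent stress. The paper proposes its own definition; the standard form is sketched below purely to illustrate the concept.

```python
import math

# Standard stress-triaxiality ratio eta = sigma_mean / sigma_vonMises
# (the paper proposes a variant definition; this is the textbook form).

def triaxiality(s):
    """s: 3x3 symmetric Cauchy stress tensor as nested lists."""
    mean = (s[0][0] + s[1][1] + s[2][2]) / 3.0
    # Deviatoric part and second invariant J2
    dev = [[s[i][j] - (mean if i == j else 0.0) for j in range(3)]
           for i in range(3)]
    j2 = 0.5 * sum(dev[i][j] ** 2 for i in range(3) for j in range(3))
    vm = math.sqrt(3.0 * j2)
    return mean / vm

# Uniaxial tension gives eta = 1/3; pure shear gives eta = 0. Hydrostatic
# confinement shifts eta, which is what the confined-compression runs probe.
uniaxial = triaxiality([[100.0, 0.0, 0.0], [0.0, 0.0, 0.0], [0.0, 0.0, 0.0]])
shear = triaxiality([[0.0, 50.0, 0.0], [50.0, 0.0, 0.0], [0.0, 0.0, 0.0]])
```

Making the hardening law depend on this ratio is what lets one material model reproduce both the tension and the confined-compression experiments cited above.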

  10. A Review of Recent Research in Indoor Modelling & Mapping

    Science.gov (United States)

    Gunduz, M.; Isikdag, U.; Basaraner, M.

    2016-06-01

    Indoor modeling and mapping has been an active area of research over the last 20 years, aiming to tackle problems related to the positioning and tracking of people and objects indoors, and it provides many opportunities for domains ranging from emergency response to logistics in micro urban spaces. The outputs of recent research in the field have been presented in several scientific publications and events, primarily related to spatial information science and technology. This paper summarizes the outputs of the last 10 years of research on indoor modeling and mapping within a classification covering seven areas: Information Acquisition by Sensors, Model Definition, Model Integration, Indoor Positioning and LBS, Routing & Navigation Methods, Augmented and Virtual Reality Applications, and Ethical Issues. Finally, the paper outlines current and future research directions and offers concluding remarks.

  11. Item Construction Using Reflective, Formative, or Rasch Measurement Models: Implications for Group Work

    Science.gov (United States)

    Peterson, Christina Hamme; Gischlar, Karen L.; Peterson, N. Andrew

    2017-01-01

    Measures that accurately capture the phenomenon are critical to research and practice in group work. The vast majority of group-related measures were developed using the reflective measurement model rooted in classical test theory (CTT). Depending on the construct definition and the measure's purpose, the reflective model may not always be the…

  12. Testbed model and data assimilation for ARM

    International Nuclear Information System (INIS)

    Louis, J.F.

    1992-01-01

    The objectives of this contract are to further develop and test the ALFA (AER Local Forecast and Assimilation) model, originally designed at AER for local weather prediction, and apply it to three distinct but related purposes in connection with the Atmospheric Radiation Measurement (ARM) program: (a) to provide a testbed that simulates a global climate model in order to facilitate the development and testing of new cloud parametrizations and radiation models; (b) to assimilate the ARM data continuously at the scale of a climate model, using the adjoint method, thus providing the initial conditions and verification data for testing parametrizations; (c) to study the sensitivity of a radiation scheme to cloud parameters, again using the adjoint method, thus demonstrating the usefulness of the testbed model. The data assimilation will use a variational technique that minimizes the difference between the model results and the observations during the analysis period. The adjoint model is used to compute the gradient of a measure of the model errors with respect to nudging terms that are added to the equations to force the model output closer to the data. The radiation scheme that will be included in the basic ALFA model makes use of a general two-stream approximation and is designed for vertically inhomogeneous, multiple-scattering atmospheres. The sensitivity of this model to the definition of cloud parameters will be studied, and the adjoint technique will also be used to compute the sensitivities. This project is designed to provide the Science Team members with the appropriate tools and modeling environment for proper testing and tuning of new radiation models and cloud parametrization schemes

  13. PBL and CDIO: Complementary Models for Engineering Education Development

    Science.gov (United States)

    Edström, Kristina; Kolmos, Anette

    2014-01-01

    This paper compares two models for reforming engineering education, problem/project-based learning (PBL), and conceive-design-implement-operate (CDIO), identifying and explaining similarities and differences. PBL and CDIO are defined and contrasted in terms of their history, community, definitions, curriculum design, relation to disciplines,…

  14. The International Geomagnetic Reference Field (IGRF) generation 12: BGS candidates and final models

    OpenAIRE

    Beggan, Ciaran D.; Hamilton, Brian; Taylor, Victoria; Macmillan, Susan; Thomson, Alan

    2015-01-01

    The International Geomagnetic Reference Field (IGRF) model is a reference main field magnetic model updated on a quinquennial basis. The latest revision (generation 12) was released in January 2015. The IGRF-12 consists of a definitive model (DGRF2010) of the main field for 2010.0, a model for the field at 2015.0 (IGRF2015) and a prediction of secular variation (IGRF-12 SV) for the forthcoming five years until 2020.0. The remaining coefficients of IGRF-12 are unchanged from IGRF-11. Nin...

  15. Uncertainty modeling in vibration, control and fuzzy analysis of structural systems

    CERN Document Server

    Halder, Achintya; Ayyub, Bilal M

    1997-01-01

    This book gives an overview of the current state of uncertainty modeling in vibration, control, and fuzzy analysis of structural and mechanical systems. It is a coherent compendium written by leading experts and offers the reader a sampling of exciting research areas in several fast-growing branches in this field. Uncertainty modeling and analysis are becoming an integral part of system definition and modeling in many fields. The book consists of ten chapters that report the work of researchers, scientists and engineers on theoretical developments and diversified applications in engineering sy

  16. The role of computer modelling in participatory integrated assessments

    International Nuclear Information System (INIS)

    Siebenhuener, Bernd; Barth, Volker

    2005-01-01

    In a number of recent research projects, computer models have been included in participatory procedures to assess global environmental change. The intention was to support knowledge production and to help the involved non-scientists to develop a deeper understanding of the interactions between natural and social systems. This paper analyses the experiences made in three projects with the use of computer models from a participatory and a risk management perspective. Our cross-cutting analysis of the objectives, the employed project designs and moderation schemes and the observed learning processes in participatory processes with model use shows that models play a mixed role in informing participants and stimulating discussions. However, no deeper reflection on values and belief systems could be achieved. In terms of the risk management phases, computer models serve best the purposes of problem definition and option assessment within participatory integrated assessment (PIA) processes

  17. Bayesian models: A statistical primer for ecologists

    Science.gov (United States)

    Hobbs, N. Thompson; Hooten, Mevin B.

    2015-01-01

    Bayesian modeling has become an indispensable tool for ecological research because it is uniquely suited to deal with complexity in a statistically coherent way. This textbook provides a comprehensive and accessible introduction to the latest Bayesian methods, in language ecologists can understand. Unlike other books on the subject, this one emphasizes the principles behind the computations, giving ecologists a big-picture understanding of how to implement this powerful statistical approach. Bayesian Models is an essential primer for non-statisticians. It begins with a definition of probability and develops a step-by-step sequence of connected ideas, including basic distribution theory, network diagrams, hierarchical models, Markov chain Monte Carlo, and inference from single and multiple models. This unique book places less emphasis on computer coding, favoring instead a concise presentation of the mathematical statistics needed to understand how and why Bayesian analysis works. It also explains how to write out properly formulated hierarchical Bayesian models and use them in computing, research papers, and proposals. This primer enables ecologists to understand the statistical principles behind Bayesian modeling and apply them to research, teaching, policy, and management. It presents the mathematical and statistical foundations of Bayesian modeling in language accessible to non-statisticians; covers basic distribution theory, network diagrams, hierarchical models, Markov chain Monte Carlo, and more; deemphasizes computer coding in favor of basic principles; and explains how to write out properly factored statistical expressions representing Bayesian models.
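The "step-by-step sequence of connected ideas" the book describes starts from exactly this kind of prior-to-posterior update. A minimal conjugate example, with toy numbers not taken from the book: estimating a detection probability with a beta-binomial model.

```python
# Toy beta-binomial update illustrating the Bayesian prior -> posterior
# workflow (hypothetical numbers; not an example from the book).

def beta_binomial_update(alpha, beta, successes, failures):
    """Conjugate update: Beta prior + binomial data -> Beta posterior."""
    return alpha + successes, beta + failures

def beta_mean(alpha, beta):
    """Mean of a Beta(alpha, beta) distribution."""
    return alpha / (alpha + beta)

# Flat Beta(1, 1) prior on a species' detection probability;
# observe 7 detections in 10 survey visits.
a_post, b_post = beta_binomial_update(1.0, 1.0, 7, 3)
posterior_mean = beta_mean(a_post, b_post)
```

Hierarchical models, network diagrams, and MCMC then generalize this same update to settings where no closed-form posterior exists.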

  18. Simplified Models for LHC New Physics Searches

    International Nuclear Information System (INIS)

    Alves, Daniele; Arkani-Hamed, Nima; Arora, Sanjay; Bai, Yang; Baumgart, Matthew; Berger, Joshua; Butler, Bart; Chang, Spencer; Cheng, Hsin-Chia; Cheung, Clifford; Chivukula, R. Sekhar; Cho, Won Sang; Cotta, Randy; D'Alfonso, Mariarosaria; El Hedri, Sonia; Essig, Rouven; Fitzpatrick, Liam; Fox, Patrick; Franceschini, Roberto

    2012-01-01

    This document proposes a collection of simplified models relevant to the design of new-physics searches at the LHC and the characterization of their results. Both ATLAS and CMS have already presented some results in terms of simplified models, and we encourage them to continue and expand this effort, which supplements both signature-based results and benchmark model interpretations. A simplified model is defined by an effective Lagrangian describing the interactions of a small number of new particles. Simplified models can equally well be described by a small number of masses and cross-sections. These parameters are directly related to collider physics observables, making simplified models a particularly effective framework for evaluating searches and a useful starting point for characterizing positive signals of new physics. This document serves as an official summary of the results from the 'Topologies for Early LHC Searches' workshop, held at SLAC in September of 2010, the purpose of which was to develop a set of representative models that can be used to cover all relevant phase space in experimental searches. Particular emphasis is placed on searches relevant for the first ∼50-500 pb⁻¹ of data and those motivated by supersymmetric models. This note largely summarizes material posted at http://lhcnewphysics.org/, which includes simplified model definitions, Monte Carlo material, and supporting contacts within the theory community. We also comment on future developments that may be useful as more data is gathered and analyzed by the experiments.

  19. Simplified Models for LHC New Physics Searches

    Energy Technology Data Exchange (ETDEWEB)

    Alves, Daniele; /SLAC; Arkani-Hamed, Nima; /Princeton, Inst. Advanced Study; Arora, Sanjay; /Rutgers U., Piscataway; Bai, Yang; /SLAC; Baumgart, Matthew; /Johns Hopkins U.; Berger, Joshua; /Cornell U., Phys. Dept.; Buckley, Matthew; /Fermilab; Butler, Bart; /SLAC; Chang, Spencer; /Oregon U. /UC, Davis; Cheng, Hsin-Chia; /UC, Davis; Cheung, Clifford; /UC, Berkeley; Chivukula, R.Sekhar; /Michigan State U.; Cho, Won Sang; /Tokyo U.; Cotta, Randy; /SLAC; D' Alfonso, Mariarosaria; /UC, Santa Barbara; El Hedri, Sonia; /SLAC; Essig, Rouven, (ed.); /SLAC; Evans, Jared A.; /UC, Davis; Fitzpatrick, Liam; /Boston U.; Fox, Patrick; /Fermilab; Franceschini, Roberto; /LPHE, Lausanne /Pittsburgh U. /Argonne /Northwestern U. /Rutgers U., Piscataway /Rutgers U., Piscataway /Carleton U. /CERN /UC, Davis /Wisconsin U., Madison /SLAC /SLAC /SLAC /Rutgers U., Piscataway /Syracuse U. /SLAC /SLAC /Boston U. /Rutgers U., Piscataway /Seoul Natl. U. /Tohoku U. /UC, Santa Barbara /Korea Inst. Advanced Study, Seoul /Harvard U., Phys. Dept. /Michigan U. /Wisconsin U., Madison /Princeton U. /UC, Santa Barbara /Wisconsin U., Madison /Michigan U. /UC, Davis /SUNY, Stony Brook /TRIUMF; /more authors..

    2012-06-01

    This document proposes a collection of simplified models relevant to the design of new-physics searches at the LHC and the characterization of their results. Both ATLAS and CMS have already presented some results in terms of simplified models, and we encourage them to continue and expand this effort, which supplements both signature-based results and benchmark model interpretations. A simplified model is defined by an effective Lagrangian describing the interactions of a small number of new particles. Simplified models can equally well be described by a small number of masses and cross-sections. These parameters are directly related to collider physics observables, making simplified models a particularly effective framework for evaluating searches and a useful starting point for characterizing positive signals of new physics. This document serves as an official summary of the results from the 'Topologies for Early LHC Searches' workshop, held at SLAC in September of 2010, the purpose of which was to develop a set of representative models that can be used to cover all relevant phase space in experimental searches. Particular emphasis is placed on searches relevant for the first ∼50-500 pb⁻¹ of data and those motivated by supersymmetric models. This note largely summarizes material posted at http://lhcnewphysics.org/, which includes simplified model definitions, Monte Carlo material, and supporting contacts within the theory community. We also comment on future developments that may be useful as more data is gathered and analyzed by the experiments.

  20. A new approach to Naturalness in SUSY models

    CERN Document Server

    Ghilencea, D M

    2013-01-01

    We review recent results that provide a new approach to the old problem of naturalness in supersymmetric models, without relying on subjective definitions for the fine-tuning associated with fixing the EW scale (to its measured value) in the presence of quantum corrections. The approach can address in a model-independent way many questions related to this problem. The results show that naturalness and its measure (fine-tuning) are an intrinsic part of the likelihood to fit the data that includes the EW scale. One important consequence is that the additional constraint of fixing the EW scale, usually not imposed in the data fits of the models, affects their overall likelihood to fit the data (or χ²/ndf, ndf: number of degrees of freedom). This has negative implications for the viability of currently popular supersymmetric extensions of the Standard Model.

  1. The Aalborg Model and The Problem

    DEFF Research Database (Denmark)

    Qvist, Palle

    To know the definition of a problem is important for the ability to identify and formulate the problem1, the starting point of the learning process in the Aalborg Model2 3. For certification it has been suggested that: A problem grows out of students’ wondering within differ...... – a wondering – that something is different from what is expected, something novel and unexpected or inexplicable; astonishment mingled with perplexity or bewildered curiosity?...

  2. Assessment of nutritional status in the elderly: a proposed function-driven model.

    Science.gov (United States)

    Engelheart, Stina; Brummer, Robert

    2018-01-01

    There is no accepted or standardized definition of 'malnutrition'. Hence, there is also no definition of what constitutes an adequate nutritional status. In elderly people, assessment of nutritional status is complex and is complicated by multi-morbidity and disabilities combined with nutrition-related problems, such as dysphagia, decreased appetite, fatigue, and muscle weakness. We propose a nutritional status model that presents nutritional status from a comprehensive functional perspective. This model visualizes the complexity of the nutritional status in elderly people. The presented model can be interpreted as meaning that nutritional status is conditional on a person's optimal function or situation. Another way of looking at it might be that a person's nutritional status affects his or her optimal situation. The proposed model includes four domains: (1) physical function and capacity; (2) health and somatic disorders; (3) food and nutrition; and (4) cognitive, affective, and sensory function. Each domain has a major impact on nutritional status, which in turn has a major impact on the outcome of each domain. Nutritional status is a multifaceted concept and there exist several knowledge gaps in the diagnosis, prevention, and optimization of treatment of inadequate nutritional status in elderly people. The nutritional status model may be useful in nutritional assessment research, as well as in the clinical setting.

  3. 78 FR 62472 - Energy Conservation Program: Alternative Efficiency Determination Methods, Basic Model Definition...

    Science.gov (United States)

    2013-10-22

    ... transformers, electric motors, and small electric motors to use AEDMs to rate their non-tested combinations... electric storage water heaters • Commercial gas-fired and oil-fired storage water heaters •.... Electric Water Heaters 2 Basic Models. Heat Pump Water Heaters 2 Basic Models. Unfired Hot Water Storage...

  4. Regulation models for district heating. Main report; Denmark; Reguleringsmodeller for fjernvarmen. Hovedrapport

    Energy Technology Data Exchange (ETDEWEB)

    2012-02-15

    With regard to choice of model for the regulation of district heating prices the report points out that a detailed analysis of a cost+ model could be considered. Such an analysis could provide further definition of the extended right to recoup the investment for heating companies, the shaping of the possibility of recognition of opportunity costs and the fixed cost allocation, and the clarified definition of necessary costs. The report also suggests that neither a price cap regulation nor completely free pricing in the entire sector is an appropriate form of regulation. The report's analysis clearly shows that the choice of price regulation in the heat sector has an impact on the incentives in terms of investment, green conversion, etc. It also appears that the different regulatory models have very different advantages and disadvantages, and lessons learned from other sectors and abroad show that changing price regulation rules can be a difficult and lengthy process with unintended consequences along the way. (LN)

  5. INCORPORATING MULTIPLE OBJECTIVES IN PLANNING MODELS OF LOW-RESOURCE FARMERS

    OpenAIRE

    Flinn, John C.; Jayasuriya, Sisira; Knight, C. Gregory

    1980-01-01

    Linear goal programming provides a means of formally incorporating the multiple goals of a household into the analysis of farming systems. Using this approach, the set of plans which come as close as possible to achieving a set of desired goals under conditions of land and cash scarcity are derived for a Filipino tenant farmer. A challenge in making LGP models empirically operational is the accurate definition of the goals of the farm household being modelled.
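The weighted-deviation idea behind linear goal programming can be sketched in a few lines. The crops, prices, goal targets, and weights below are hypothetical illustrations (not the study's data), and a coarse grid search stands in for a proper LP solver: the plan minimizes the weighted under-achievement of an income goal and a subsistence-food goal subject to land and cash constraints.

```python
# Illustrative weighted goal program in the spirit of LGP farm planning.
# All numbers, crops, and goals are hypothetical, not from the study.

def plan(land=3.0, cash=120.0):
    best = None
    # decision variables: hectares of rice (x1) and vegetables (x2), 0.1 ha grid
    for i in range(0, 31):
        for j in range(0, 31):
            x1, x2 = i / 10.0, j / 10.0
            if x1 + x2 > land:            # land constraint
                continue
            if 30 * x1 + 60 * x2 > cash:  # cash (input cost) constraint
                continue
            income = 100 * x1 + 180 * x2  # hypothetical net returns
            food = 2.0 * x1               # hypothetical rice yield (tonnes)
            # under-achievement deviations from the two goals
            d_income = max(0.0, 250.0 - income)
            d_food = max(0.0, 3.0 - food)
            score = 1.0 * d_income + 50.0 * d_food  # goal weights
            if best is None or score < best[0]:
                best = (score, x1, x2)
    return best

print(plan())
```

With the food goal weighted heavily, the search first satisfies subsistence rice production and then fills in the income goal, mimicking how LGP ranks household goals rather than maximizing a single profit objective.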

  6. Diversity in case management modalities: the Summit model.

    Science.gov (United States)

    Peterson, G A; Drone, I D; Munetz, M R

    1997-06-01

    Though ubiquitous in community mental health agencies, case management suffers from a lack of consensus regarding its definition, essential components, and appropriate application. Meaningful comparisons of various case management models await such a consensus. Global assessments of case management must be replaced by empirical studies of specific interventions with respect to the needs of specific populations. The authors describe a highly differentiated and prescriptive system of case management involving the application of more than one model of service delivery. Such a diversified and targeted system offers an opportunity to study the technology of case management in a more meaningful manner.

  7. MODEL OF THE TOKAMAK EDGE DENSITY PEDESTAL INCLUDING DIFFUSIVE NEUTRALS

    International Nuclear Information System (INIS)

    BURRELL, K.H.

    2003-01-01

    OAK-B135 Several previous analytic models of the tokamak edge density pedestal have been based on diffusive transport of plasma plus free-streaming of neutrals. This latter neutral model includes only the effect of ionization and neglects charge exchange. The present work models the edge density pedestal using diffusive transport for both the plasma and the neutrals. In contrast to the free-streaming model, a diffusion model for the neutrals includes the effect of both charge exchange and ionization and is valid when charge exchange is the dominant interaction. Surprisingly, the functional forms for the electron and neutral density profiles from the present calculation are identical to the results of the previous analytic models. There are some differences in the detailed definition of various parameters in the solution. For experimentally relevant cases where ionization and charge exchange rate are comparable, both models predict approximately the same width for the edge density pedestal

  8. Entropic Constitutive Relation and Modeling for Fourier and Hyperbolic Heat Conductions

    Directory of Open Access Journals (Sweden)

    Shu-Nan Li

    2017-12-01

    Most existing phenomenological heat conduction models are expressed by temperature and heat flux distributions, whose definitions might be debatable in heat conductions with strong non-equilibrium. The constitutive relations of Fourier and hyperbolic heat conductions are here rewritten by the entropy and entropy flux distributions in the frameworks of classical irreversible thermodynamics (CIT) and extended irreversible thermodynamics (EIT). The entropic constitutive relations are then generalized by Boltzmann–Gibbs–Shannon (BGS) statistical mechanics, which can avoid the debatable definitions of thermodynamic quantities relying on local equilibrium. This shows a possibility of modeling heat conduction through entropic constitutive relations. The applicability of the generalizations by BGS statistical mechanics is also discussed based on the relaxation time approximation, and it is found that the generalizations require a sufficiently small entropy production rate.

  9. Towards Precise Metadata-set for Discovering 3D Geospatial Models in Geo-portals

    Science.gov (United States)

    Zamyadi, A.; Pouliot, J.; Bédard, Y.

    2013-09-01

    Accessing 3D geospatial models, eventually at no cost and for unrestricted use, is certainly an important issue as they become popular among participatory communities, consultants, and officials. Various geo-portals, mainly established for 2D resources, have tried to provide access to existing 3D resources such as digital elevation models, LIDAR or classic topographic data. Describing the content of data, metadata is a key component of data discovery in geo-portals. An inventory of seven online geo-portals and commercial catalogues shows that the metadata referring to 3D information is very different from one geo-portal to another as well as for similar 3D resources in the same geo-portal. The inventory considered 971 data resources affiliated with elevation. 51% of them were from three geo-portals running at Canadian federal and municipal levels whose metadata resources did not consider 3D models by any definition. Regarding the remaining 49% which refer to 3D models, different definitions of terms and metadata were found, resulting in confusion and misinterpretation. The overall assessment of these geo-portals clearly shows that the provided metadata do not integrate specific and common information about 3D geospatial models. Accordingly, the main objective of this research is to improve 3D geospatial model discovery in geo-portals by adding a specific metadata-set. Based on the knowledge and current practices on 3D modeling, and 3D data acquisition and management, a set of metadata is proposed to increase its suitability for 3D geospatial models. This metadata-set enables the definition of genuine classes, fields, and code-lists for a 3D metadata profile. The main structure of the proposal contains 21 metadata classes. These classes are organized into three packages: General and Complementary, covering contextual and structural information, and Availability, covering the transition from storage to delivery format.
The proposed metadata set is compared with Canadian Geospatial
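The three-package layout described above lends itself to a typed schema. The sketch below is a minimal illustration of that structure; the class names General, Complementary, and Availability follow the abstract, but every field name and value is an assumption for illustration, not one of the paper's 21 proposed metadata classes.

```python
# Minimal sketch of a 3D-specific metadata profile structured into the three
# packages named in the abstract. Field names/values are hypothetical.
from dataclasses import dataclass, field

@dataclass
class General:                      # contextual information
    title: str
    geometry_type: str              # e.g. "B-rep", "TIN", "point cloud"
    level_of_detail: str

@dataclass
class Complementary:                # structural information
    has_textures: bool = False
    vertical_datum: str = "unknown"

@dataclass
class Availability:                 # transition from storage to delivery
    storage_format: str = "CityGML"
    delivery_formats: list = field(default_factory=list)

@dataclass
class Model3DMetadata:
    general: General
    complementary: Complementary = field(default_factory=Complementary)
    availability: Availability = field(default_factory=Availability)

record = Model3DMetadata(
    general=General("Campus buildings", "B-rep", "LoD2"),
    availability=Availability(delivery_formats=["CityGML", "OBJ"]),
)
print(record.general.geometry_type, record.availability.delivery_formats)
```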

  10. Criticality Model

    International Nuclear Information System (INIS)

    Alsaed, A.

    2004-01-01

    The ''Disposal Criticality Analysis Methodology Topical Report'' (YMP 2003) presents the methodology for evaluating potential criticality situations in the monitored geologic repository. As stated in the referenced Topical Report, the detailed methodology for performing the disposal criticality analyses will be documented in model reports. Many of the models developed in support of the Topical Report differ from the definition of models as given in the Office of Civilian Radioactive Waste Management procedure AP-SIII.10Q, ''Models'', in that they are procedural, rather than mathematical. These model reports document the detailed methodology necessary to implement the approach presented in the Disposal Criticality Analysis Methodology Topical Report and provide calculations utilizing the methodology. Thus, the governing procedure for this type of report is AP-3.12Q, ''Design Calculations and Analyses''. The ''Criticality Model'' is of this latter type, providing a process for evaluating the criticality potential of in-package and external configurations. The purpose of this analysis is to lay out the process for calculating the criticality potential for various in-package and external configurations and to calculate lower-bound tolerance limit (LBTL) values and determine range of applicability (ROA) parameters. The LBTL calculations and the ROA determinations are performed using selected benchmark experiments that are applicable to various waste forms and various in-package and external configurations. The waste forms considered in this calculation are pressurized water reactor (PWR), boiling water reactor (BWR), Fast Flux Test Facility (FFTF), Training Research Isotope General Atomic (TRIGA), Enrico Fermi, Shippingport pressurized water reactor, Shippingport light water breeder reactor (LWBR), N-Reactor, Melt and Dilute, and Fort Saint Vrain Reactor spent nuclear fuel (SNF). The scope of this analysis is to document the criticality computational method. 
The criticality

  11. Film grain noise modeling in advanced video coding

    Science.gov (United States)

    Oh, Byung Tae; Kuo, C.-C. Jay; Sun, Shijun; Lei, Shawmin

    2007-01-01

    A new technique for film grain noise extraction, modeling and synthesis is proposed and applied to the coding of high definition video in this work. The film grain noise is viewed as a part of artistic presentation by people in the movie industry. On one hand, since the film grain noise can boost the natural appearance of pictures in high definition video, it should be preserved in high-fidelity video processing systems. On the other hand, video coding with film grain noise is expensive. It is desirable to extract film grain noise from the input video as a pre-processing step at the encoder and re-synthesize the film grain noise and add it back to the decoded video as a post-processing step at the decoder. Under this framework, the coding gain of the denoised video is higher while the quality of the final reconstructed video can still be well preserved. Following this idea, we present a method to remove film grain noise from image/video without distorting its original content. Besides, we describe a parametric model containing a small set of parameters to represent the extracted film grain noise. The proposed model generates the film grain noise that is close to the real one in terms of power spectral density and cross-channel spectral correlation. Experimental results are shown to demonstrate the efficiency of the proposed scheme.
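The synthesis side of this pipeline can be illustrated with spatially correlated noise. The sketch below is a hedged stand-in, not the paper's model: it uses a simple first-order autoregressive (AR) filter whose coefficient shapes the power spectrum, whereas the actual parametric model also matches cross-channel spectral correlation. All parameter values are assumptions.

```python
# Hedged sketch: synthesizing parametric "film grain" as spatially
# correlated noise via a first-order autoregressive (AR) filter, then
# adding it back to a decoded (denoised) frame at the post-processing step.
import random

def synthesize_grain(width, height, rho=0.6, sigma=8.0, seed=42):
    """Return a height x width grid of zero-mean grain values.

    rho controls horizontal correlation (shaping the power spectral
    density); sigma controls grain strength. Both are illustrative.
    """
    rng = random.Random(seed)
    grain = []
    for _ in range(height):
        row = [0.0] * width
        prev = 0.0
        for x in range(width):
            # AR(1): each sample correlates with its left neighbour
            prev = rho * prev + rng.gauss(0.0, sigma)
            row[x] = prev
        grain.append(row)
    return grain

g = synthesize_grain(8, 4)
# add grain back to a flat decoded frame, clamping to the 8-bit range
frame = [[128] * 8 for _ in range(4)]
noisy = [[max(0, min(255, int(p + n))) for p, n in zip(pr, nr)]
         for pr, nr in zip(frame, g)]
print(noisy[0])
```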

  12. Development of a global 1-D chemically radiatively coupled model and an introduction to the development of a chemically coupled General Circulation Model

    International Nuclear Information System (INIS)

    Akiyoshi, H.

    1997-01-01

    A global one-dimensional, chemically and radiatively coupled model has been developed. The basic concept of the coupled model, the definition of globally averaged zenith angles, the formulation of the model chemistry, radiation, the coupled processes, and profiles and diurnal variations of temperature and chemical species at a normal steady state are presented. Furthermore, a suddenly doubled CO2 experiment and a Pinatubo aerosol increase experiment were performed with the model. The time scale of variations in ozone and temperature in the lower stratosphere of the coupled system in the doubled CO2 experiment was long, due to a feedback process among ultraviolet radiation, O(1D), NOy, NOx, and O3. From the Pinatubo aerosol experiment, a delay of the maximum ozone decrease from the maximum aerosol loading is shown and discussed. The development of 3-D chemical models with coupled processes is briefly described, and the ozone distribution from the first version of the 3-D model is presented. Chemical model development at the National Institute for Environmental Studies (NIES) is briefly described. (author)

  13. Modification of a rainfall-runoff model for distributed modeling in a GIS and its validation

    Science.gov (United States)

    Nyabeze, W. R.

    A rainfall-runoff model that can be interfaced with a Geographical Information System (GIS) to integrate the definition and measurement of spatial features and the calculation of their parameter values presents considerable advantages. The modification of the GWBasic Wits Rainfall-Runoff Erosion Model (GWBRafler) to enable parameter value estimation in a GIS (GISRafler) is presented in this paper. Algorithms are applied to estimate parameter values, reducing the number of input parameters and the effort to populate them. The use of a GIS makes the relationship between parameter estimates and cover characteristics more evident. This paper has been produced as part of research to generalize the GWBRafler on a spatially distributed basis. Modular data structures are assumed and parameter values are weighted relative to the module area and centroid properties. Modifications to the GWBRafler enable better estimation of low flows, which are typical in drought conditions.
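The area-weighting step mentioned above is a simple aggregation and can be sketched directly. The module names, areas, and runoff coefficients below are invented for illustration; only the weighting scheme (per-module parameter values combined in proportion to module area) reflects the approach described.

```python
# Illustrative sketch: a per-module (sub-catchment) parameter combined into
# a catchment-level value weighted by module area. Data are hypothetical.

def area_weighted(modules, key):
    """Area-weighted mean of a per-module parameter value."""
    total_area = sum(m["area_km2"] for m in modules)
    return sum(m["area_km2"] * m[key] for m in modules) / total_area

modules = [
    {"area_km2": 12.0, "runoff_coeff": 0.35},  # grassland module
    {"area_km2": 6.0,  "runoff_coeff": 0.55},  # cultivated module
    {"area_km2": 2.0,  "runoff_coeff": 0.80},  # urban module
]
print(round(area_weighted(modules, "runoff_coeff"), 3))
```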

  14. Multiple-event probability in general-relativistic quantum mechanics. II. A discrete model

    International Nuclear Information System (INIS)

    Mondragon, Mauricio; Perez, Alejandro; Rovelli, Carlo

    2007-01-01

    We introduce a simple quantum mechanical model in which time and space are discrete and periodic. These features avoid the complications related to continuous-spectrum operators and infinite-norm states. The model provides a tool for discussing the probabilistic interpretation of generally covariant quantum systems, without the confusion generated by spurious infinities. We use the model to illustrate the formalism of general-relativistic quantum mechanics, and to test the definition of multiple-event probability introduced in a companion paper [Phys. Rev. D 75, 084033 (2007)]. We consider a version of the model with unitary time evolution and a version without unitary time evolution

  15. Core monitoring with analytical model adaption

    International Nuclear Information System (INIS)

    Linford, R.B.; Martin, C.L.; Parkos, G.R.; Rahnema, F.; Williams, R.D.

    1992-01-01

    The monitoring of BWR cores has evolved rapidly due to more capable computer systems, improved analytical models and new types of core instrumentation. Coupling of first-principles diffusion theory models, such as those applied to design, to the core instrumentation has been achieved by GE with an adaptive methodology in the 3D Monicore system. The adaptive methods allow definition of 'leakage parameters' which are incorporated directly into the diffusion models to enhance monitoring accuracy and predictions. These improved models for core monitoring allow for substitution of traversing in-core probe (TIP) and local power range monitor (LPRM) signals with calculations to continue monitoring with no loss of accuracy or reduction of thermal limits. Experience in small BWR cores has shown that with one out of three TIP machines failed there was no operating limitation or impact from the substitute calculations. Other capabilities exist in 3D Monicore to align TIPs more accurately and accommodate other types of system measurements or anomalies. 3D Monicore also includes an accurate predictive capability which uses the adaptive results from previous monitoring calculations and is used to plan and optimize reactor maneuvers/operations to improve operating efficiency and reduce support requirements

  16. On the topology of the inflaton field in minimal supergravity models

    Energy Technology Data Exchange (ETDEWEB)

    Ferrara, Sergio [Physics Department, Theory Unit, CERN,CH 1211, Geneva 23 (Switzerland); INFN - Laboratori Nazionali di Frascati,Via Enrico Fermi 40, I-00044, Frascati (Italy); Department of Physics and Astronomy, University of California,Los Angeles, CA 90095-1547 (United States); Fré, Pietro [Dipartimento di Fisica, Università di Torino, INFN - Sezione di Torino,via P. Giuria 1, I-10125 Torino (Italy); Sorin, Alexander S. [Bogoliubov Laboratory of Theoretical Physics,and Veksler and Baldin Laboratory of High Energy Physics,Joint Institute for Nuclear Research,141980 Dubna, Moscow Region (Russian Federation)

    2014-04-14

    We consider global issues in minimal supergravity models where a single field inflaton potential emerges. In a particular case we reproduce the Starobinsky model and its description dual to a certain formulation of R+R² supergravity. For definiteness we confine our analysis to spaces at constant curvature, either vanishing or negative. Five distinct models arise, two flat models with respectively a quadratic and a quartic potential and three based on the SU(1,1)/U(1) space where its distinct isometries, elliptic, hyperbolic and parabolic, are gauged. Fayet-Iliopoulos terms are introduced in a geometric way and they turn out to be a crucial ingredient in order to describe the de Sitter inflationary phase of the Starobinsky model.

  17. Standardizing terminology and definitions of medication adherence and persistence in research employing electronic databases.

    Science.gov (United States)

    Raebel, Marsha A; Schmittdiel, Julie; Karter, Andrew J; Konieczny, Jennifer L; Steiner, John F

    2013-08-01

    To propose a unifying set of definitions for prescription adherence research utilizing electronic health record prescribing databases, prescription dispensing databases, and pharmacy claims databases and to provide a conceptual framework to operationalize these definitions consistently across studies. We reviewed recent literature to identify definitions in electronic database studies of prescription-filling patterns for chronic oral medications. We then develop a conceptual model and propose standardized terminology and definitions to describe prescription-filling behavior from electronic databases. The conceptual model we propose defines 2 separate constructs: medication adherence and persistence. We define primary and secondary adherence as distinct subtypes of adherence. Metrics for estimating secondary adherence are discussed and critiqued, including a newer metric (New Prescription Medication Gap measure) that enables estimation of both primary and secondary adherence. Terminology currently used in prescription adherence research employing electronic databases lacks consistency. We propose a clear, consistent, broadly applicable conceptual model and terminology for such studies. The model and definitions facilitate research utilizing electronic medication prescribing, dispensing, and/or claims databases and encompasses the entire continuum of prescription-filling behavior. Employing conceptually clear and consistent terminology to define medication adherence and persistence will facilitate future comparative effectiveness research and meta-analytic studies that utilize electronic prescription and dispensing records.
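A common way to operationalize secondary adherence from dispensing records is the proportion of days covered (PDC). The sketch below computes that generic textbook metric; it is not the paper's New Prescription Medication Gap measure, and the fill dates and days-supply values are invented for illustration.

```python
# Hedged sketch: proportion of days covered (PDC) from dispensing records.
# A generic secondary-adherence metric, not the paper's NPMG measure.
from datetime import date, timedelta

def pdc(fills, start, end):
    """fills: list of (dispense_date, days_supply) tuples.

    Counts each calendar day in [start, end] covered by at least one
    fill (overlapping supplies are not double-counted).
    """
    covered = set()
    for dispensed, days_supply in fills:
        for d in range(days_supply):
            day = dispensed + timedelta(days=d)
            if start <= day <= end:
                covered.add(day)
    period = (end - start).days + 1
    return len(covered) / period

fills = [(date(2023, 1, 1), 30), (date(2023, 2, 15), 30)]
print(round(pdc(fills, date(2023, 1, 1), date(2023, 3, 31)), 3))
```

A PDC threshold (often 0.8 in the adherence literature) is then applied to classify a patient as adherent over the observation window.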

  18. Modeling biological pathway dynamics with timed automata.

    Science.gov (United States)

    Schivo, Stefano; Scholma, Jetse; Wanders, Brend; Urquidi Camacho, Ricardo A; van der Vet, Paul E; Karperien, Marcel; Langerak, Rom; van de Pol, Jaco; Post, Janine N

    2014-05-01

    Living cells are constantly subjected to a plethora of environmental stimuli that require integration into an appropriate cellular response. This integration takes place through signal transduction events that form tightly interconnected networks. The understanding of these networks requires capturing their dynamics through computational support and models. ANIMO (Analysis of Networks with Interactive Modeling) is a tool that enables the construction and exploration of executable models of biological networks, helping to derive hypotheses and to plan wet-lab experiments. The tool is based on the formalism of Timed Automata, which can be analyzed via the UPPAAL model checker. Thanks to Timed Automata, we can provide a formal semantics for the domain-specific language used to represent signaling networks. This enforces precision and uniformity in the definition of signaling pathways, contributing to the integration of isolated signaling events into complex network models. We propose an approach to discretization of reaction kinetics that allows us to efficiently use UPPAAL as the computational engine to explore the dynamic behavior of the network of interest. A user-friendly interface hides the use of Timed Automata from the user, while keeping the expressive power intact. Abstraction to single-parameter kinetics speeds up construction of models that remain faithful enough to provide meaningful insight. The resulting dynamic behavior of the network components is displayed graphically, allowing for an intuitive and interactive modeling experience.
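The discretization idea, nodes with bounded integer activity levels updated by single-parameter interactions, can be sketched without the Timed Automata machinery. The toy network (EGF/ERK/DUSP), the threshold rule, and all rate constants below are assumptions for illustration only; ANIMO's actual semantics is given by timed transitions checked in UPPAAL.

```python
# Sketch of discretized single-parameter kinetics: each node holds an
# activity level in 0..n_max; an interaction nudges its target one level
# when its source is active enough. Network and parameters hypothetical.

def step(levels, interactions, n_max=10):
    """One synchronous update of discrete activity levels."""
    new = dict(levels)
    for src, dst, sign, k in interactions:
        # single-parameter kinetics: firing threshold scales with k
        if levels[src] * k >= 1.0:
            new[dst] = max(0, min(n_max, new[dst] + sign))
    return new

levels = {"EGF": 10, "ERK": 0, "DUSP": 0}
interactions = [
    ("EGF", "ERK", +1, 0.2),   # EGF activates ERK
    ("ERK", "DUSP", +1, 0.2),  # ERK induces DUSP
    ("DUSP", "ERK", -1, 0.2),  # DUSP inhibits ERK (negative feedback)
]
for _ in range(6):
    levels = step(levels, interactions)
print(levels)
```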

  19. BLENDED LEARNING AND FEATURES OF THE USE OF THE ROTATION MODEL IN THE EDUCATIONAL PROCESS

    Directory of Open Access Journals (Sweden)

    Tkachuk H.

    2017-12-01

    The article analyzes the problem of blended learning in higher education institutions. In particular, it reviews the legislative documents on the implementation of information technologies in the educational process, strategies for higher education, and the introduction of distance learning, which together underline the importance of blended learning. The author also analyzes the concept of blended learning on the basis of the definitions considered in the scientific and pedagogical literature; this analysis reveals the ambiguity and imprecision of the various definitions, and an author's definition of the term is proposed. To identify the benefits of blended learning, the positive and negative aspects of all the technologies combined in a blended learning system are analyzed. Based on an analysis of different learning models, the station rotation model and the flipped classroom are identified as the most suitable. The article provides an example of the use of a combination of these models for teaching the topic "Computer Structure" to students in the "Informatics" training programme. The session took place in several stages and involved rotation through five stations. Based on this research, general didactic and methodological principles for the organization of blended learning were identified.

  20. Business Model Disclosures in Corporate Reports

    Directory of Open Access Journals (Sweden)

    Jan Michalak

    2017-01-01

    Purpose: In this paper, we investigate the development, the current state, and the potential of business model disclosures to illustrate where, why and how organizations might want to disclose their business models to their stakeholders. The description of the business model may be relevant to stakeholders if it helps them to comprehend the company ‘story’ and increase understanding of other provided data (i.e. financial statements, risk exposure, sustainability of operations). It can also aid stakeholders in the assessment of sustainability of business models and the whole company. To realize these goals, business model descriptions should fulfil requirements of users suggested by various guidelines. Design/Methodology/Approach: First, we review and analyse literature on business model disclosure and some of its antecedents, including voluntary disclosure of intellectual capital. We also discuss business model reporting incentives from the viewpoint of shareholders, stakeholders and legitimacy theory. Second, we compare and discuss reporting guidelines on strategic reports, intellectual capital reports, and integrated reports through the lens of their requirements for business model disclosure and the consequences of their use for corporate report users. Third, we present, analyse and compare examples of good corporate practices in business model reporting. Findings: In the examined reporting guidelines, we find similarities, e.g. mostly structural but also qualitative attributes, in their presented information: materiality, completeness, connectivity, future orientation and conciseness. We also identify important differences between their frameworks concerning the target audience of the reports, business model definitions and business model disclosure requirements. Discontinuation of intellectual capital reporting conforming to DATI guidelines provides important warnings for the proponents of voluntary disclosure – especially for

  1. Continuum model for chiral induced spin selectivity in helical molecules

    Energy Technology Data Exchange (ETDEWEB)

    Medina, Ernesto [Centro de Física, Instituto Venezolano de Investigaciones Científicas, 21827, Caracas 1020 A (Venezuela, Bolivarian Republic of); Groupe de Physique Statistique, Institut Jean Lamour, Université de Lorraine, 54506 Vandoeuvre-les-Nancy Cedex (France); Department of Chemistry and Biochemistry, Arizona State University, Tempe, Arizona 85287 (United States); González-Arraga, Luis A. [IMDEA Nanoscience, Cantoblanco, 28049 Madrid (Spain); Finkelstein-Shapiro, Daniel; Mujica, Vladimiro [Department of Chemistry and Biochemistry, Arizona State University, Tempe, Arizona 85287 (United States); Berche, Bertrand [Centro de Física, Instituto Venezolano de Investigaciones Científicas, 21827, Caracas 1020 A (Venezuela, Bolivarian Republic of); Groupe de Physique Statistique, Institut Jean Lamour, Université de Lorraine, 54506 Vandoeuvre-les-Nancy Cedex (France)

    2015-05-21

    A minimal model is exactly solved for electron spin transport on a helix. Electron transport is assumed to be supported by well oriented p_z type orbitals on base molecules forming a staircase of definite chirality. In a tight binding interpretation, the spin-orbit coupling (SOC) opens up an effective π_z − π_z coupling via interbase p_{x,y} − p_z hopping, introducing spin coupled transport. The resulting continuum model spectrum shows two Kramers doublet transport channels with a gap proportional to the SOC. Each doubly degenerate channel satisfies time reversal symmetry; nevertheless, a bias chooses a transport direction and thus selects for spin orientation. The model predicts (i) which spin orientation is selected depending on chirality and bias, (ii) changes in spin preference as a function of input Fermi level and (iii) back-scattering suppression protected by the SO gap. We compute the spin current with a definite helicity and find it to be proportional to the torsion of the chiral structure and the non-adiabatic Aharonov-Anandan phase. To describe room temperature transport, we assume that the total transmission is the result of a product of coherent steps.

  2. Rule-based modularization in model transformation languages illustrated with ATL

    NARCIS (Netherlands)

    Ivanov, Ivan; van den Berg, Klaas; Jouault, Frédéric

    2007-01-01

    This paper studies ways for modularizing transformation definitions in current rule-based model transformation languages. Two scenarios are shown in which the modular units are identified on the basis of relations between source and target metamodels and on the basis of generic transformation

  3. The multichannel n-propyl + O2 reaction surface: Definitive theory on a model hydrocarbon oxidation mechanism

    Science.gov (United States)

    Bartlett, Marcus A.; Liang, Tao; Pu, Liang; Schaefer, Henry F.; Allen, Wesley D.

    2018-03-01

    The n-propyl + O2 reaction is an important model of chain branching reactions in larger combustion systems. In this work, focal point analyses (FPAs) extrapolating to the ab initio limit were performed on the n-propyl + O2 system based on explicit quantum chemical computations with electron correlation treatments through coupled cluster single, double, triple, and perturbative quadruple excitations [CCSDT(Q)] and basis sets up to cc-pV5Z. All reaction species and transition states were fully optimized at the rigorous CCSD(T)/cc-pVTZ level of theory, revealing some substantial differences in comparison to the density functional theory geometries existing in the literature. A mixed Hessian methodology was implemented and benchmarked that essentially makes the computations of CCSD(T)/cc-pVTZ vibrational frequencies feasible and thus provides critical improvements to zero-point vibrational energies for the n-propyl + O2 system. Two key stationary points, n-propylperoxy radical (MIN1) and its concerted elimination transition state (TS1), were located 32.7 kcal mol-1 and 2.4 kcal mol-1 below the reactants, respectively. Two competitive β-hydrogen transfer transition states (TS2 and TS2') were found separated by only 0.16 kcal mol-1, a fact unrecognized in the current combustion literature. Incorporating TS2' in master equation (ME) kinetic models might reduce the large discrepancy of 2.5 kcal mol-1 between FPA and ME barrier heights for TS2. TS2 exhibits an anomalously large diagonal Born-Oppenheimer correction (ΔDBOC = 1.71 kcal mol-1), which is indicative of a nearby surface crossing and possible nonadiabatic reaction dynamics. The first systematic conformational search of three hydroperoxypropyl (QOOH) intermediates was completed, uncovering a total of 32 rotamers lying within 1.6 kcal mol-1 of their respective lowest-energy minima. 
Our definitive energetics for stationary points on the n-propyl + O2 potential energy surface provide key benchmarks for future studies
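
    The focal point analyses above extrapolate correlation energies toward the complete-basis-set (CBS) limit. As a rough illustration of the arithmetic involved (a generic two-point X⁻³ formula of the Helgaker type, not necessarily the authors' exact scheme), one could sketch:

```python
def cbs_extrapolate(e_x_minus_1, e_x, x):
    """Two-point X^-3 extrapolation of correlation energies to the CBS limit.

    e_x_minus_1, e_x: correlation energies (hartree) for basis sets with
    cardinal numbers X-1 and X (e.g. cc-pVQZ and cc-pV5Z give x = 5).
    Assumes the correlation error decays as A * X^-3.
    """
    return (x**3 * e_x - (x - 1)**3 * e_x_minus_1) / (x**3 - (x - 1)**3)

# Illustrative (made-up) QZ/5Z correlation energies in hartree:
e_cbs = cbs_extrapolate(-0.500, -0.510, 5)
```

    The extrapolated value overshoots the larger-basis result slightly, as expected, since the formula removes the residual A·X⁻³ error.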

  4. Understanding sexual harassment using aggregate construct models.

    Science.gov (United States)

    Nye, Christopher D; Brummel, Bradley J; Drasgow, Fritz

    2014-11-01

    Sexual harassment has received a substantial amount of empirical attention over the past few decades, and this research has consistently shown that experiencing these behaviors has a detrimental effect on employees' well-being, job attitudes, and behaviors at work. However, these findings, and the conclusions that are drawn from them, make the implicit assumption that the empirical models used to examine sexual harassment are properly specified. This article presents evidence that properly specified aggregate construct models are more consistent with theoretical structures and definitions of sexual harassment and can result in different conclusions about the nomological network of harassment. Results from 3 large samples, 2 military and 1 from a civilian population, are used to illustrate the differences between aggregate construct and reflective indicator models of sexual harassment. These analyses suggested that the factor structure and the nomological network of sexual harassment differ when modeling harassment as an aggregate construct. The implications of these results for the continued study of sexual harassment are discussed. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  5. Cyber threat model for tactical radio networks

    Science.gov (United States)

    Kurdziel, Michael T.

    2014-05-01

    The shift to a full information-centric paradigm on the battlefield has allowed ConOps to be developed that are only possible using modern network communications systems. Securing these Tactical Networks without impacting their capabilities has been a challenge. Tactical networks with fixed infrastructure have similar vulnerabilities to their commercial counterparts (although they need to be secure against adversaries with greater capabilities, resources and motivation). However, networks with mobile infrastructure components and Mobile Ad hoc Networks (MANETs) have additional unique vulnerabilities that must be considered. It is useful to examine Tactical Network-based ConOps and use them to construct a threat model and baseline cyber security requirements for Tactical Networks with fixed infrastructure, mobile infrastructure and/or ad hoc modes of operation. This paper will present an introduction to threat model assessment. A definition and detailed discussion of a Tactical Network threat model is also presented. Finally, the model is used to derive baseline requirements that can be used to design or evaluate a cyber security solution that can be scaled and adapted to the needs of specific deployments.
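
    As a miniature illustration of how a threat model can drive baseline requirements, the sketch below maps threats to the three modes of operation named above. The threat names and mitigations are invented for illustration and are not taken from the paper:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Threat:
    name: str
    modes: frozenset  # modes of operation the threat applies to

# Hypothetical catalogue -- a real assessment enumerates threats per ConOps.
THREATS = [
    Threat("traffic interception", frozenset({"fixed", "mobile", "ad hoc"})),
    Threat("rogue relay insertion", frozenset({"mobile", "ad hoc"})),
    Threat("route-table poisoning", frozenset({"ad hoc"})),
]

def baseline_requirements(mode):
    """Derive one baseline requirement per threat applicable to a mode."""
    return [f"mitigate {t.name}" for t in THREATS if mode in t.modes]
```

    A fixed-infrastructure deployment then inherits only the commercial-style threats, while an ad hoc deployment accumulates the full set, mirroring the scaling-and-adaptation point made in the abstract.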

  6. Business models of sharing economy companies : exploring features responsible for sharing economy companies’ internationalization

    OpenAIRE

    Kosintceva, Aleksandra

    2016-01-01

    This paper is dedicated to the sharing economy business models and their features responsible for internationalization. The study proposes derived definitions for the concepts of “sharing economy” and “business model” and first generic sharing economy business models typology. The typology was created through the qualitative analysis of secondary data on twenty sharing economy companies from nine different industries. The outlined categories of sharing economy business models a...

  7. Higher dimensional generalizations of the SYK model

    Energy Technology Data Exchange (ETDEWEB)

    Berkooz, Micha [Department of Particle Physics and Astrophysics, Weizmann Institute of Science,Rehovot 7610001 (Israel); Narayan, Prithvi [International Centre for Theoretical Sciences, Hesaraghatta,Bengaluru North, 560 089 (India); Rozali, Moshe [Department of Physics and Astronomy, University of British Columbia,Vancouver, BC V6T 1Z1 (Canada); Simón, Joan [School of Mathematics and Maxwell Institute for Mathematical Sciences, University of Edinburgh,King’s Buildings, Edinburgh EH9 3FD (United Kingdom)

    2017-01-31

    We discuss a 1+1 dimensional generalization of the Sachdev-Ye-Kitaev model. The model contains N Majorana fermions at each lattice site with a nearest-neighbour hopping term. The SYK random interaction is restricted to low momentum fermions of definite chirality within each lattice site. This gives rise to an ordinary 1+1 field theory above some energy scale and a low energy SYK-like behavior. We exhibit a class of low-pass filters which give rise to a rich variety of hyperscaling behaviour in the IR. We also discuss another set of generalizations which describes probing an SYK system with an external fermion, together with the new scaling behavior they exhibit in the IR.
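
    For context, the single-site model being generalized is the SYK Hamiltonian of N Majorana fermions χ_i (with {χ_i, χ_j} = δ_ij) coupled by Gaussian random four-fermion interactions; in a standard convention (normalizations vary across the literature) it reads:

```latex
H_{\mathrm{SYK}} \;=\; \sum_{1 \le i<j<k<l \le N} J_{ijkl}\,\chi_i \chi_j \chi_k \chi_l,
\qquad
\left\langle J_{ijkl}^{\,2} \right\rangle \;=\; \frac{3!\,J^2}{N^3}.
```

    The lattice generalization in the record keeps this on-site structure but restricts the random couplings to low-momentum fermions of definite chirality.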

  8. Objectives for next generation of practical short-range atmospheric dispersion models

    International Nuclear Information System (INIS)

    Olesen, H.R.; Mikkelsen, T.

    1992-01-01

    The proceedings contains papers from the workshop ''Objectives for Next Generation of Practical Short-Range Atmospheric Dispersion Models''. They deal with two types of models, namely models for regulatory purposes and models for real-time applications. The workshop was the result of an action started in 1991 for increased cooperation and harmonization within atmospheric dispersion modelling. The focus of the workshop was on the management of model development and the definition of model objectives, rather than on detailed model contents. It was the intention to identify actions that can be taken in order to improve the development and use of atmospheric dispersion models. The papers in the proceedings deal with various topics within the broad spectrum of matters related to up-to-date practical models, such as their scientific basis, requirements for model input and output, meteorological preprocessing, standardisation within modelling, electronic information exchange as a potentially useful tool, model evaluation and data bases for model evaluation. In addition to the papers, the proceedings contain summaries of the discussions at the workshop. These summaries point to a number of recommended actions which can be taken in order to improve ''modelling culture''. (AB)

  9. 48 CFR 811.001 - Definitions.

    Science.gov (United States)

    2010-10-01

    ... ACQUISITION PLANNING DESCRIBING AGENCY NEEDS 811.001 Definitions. For the purposes of this part: Brand name product means a commercial product described by brand name and make or model number or other appropriate... distributor. Salient characteristics means those particular characteristics that specifically describe the...

  10. An OO visual language definition approach supporting multiple views

    OpenAIRE

    Akehurst, David H.; I.E.E.E. Computer Society

    2000-01-01

    The formal approach to visual language definition is to use graph grammars and/or graph transformation techniques. These techniques focus on specifying the syntax and manipulation rules of the concrete representation. This paper presents a constraint and object-oriented approach to defining visual languages that uses UML and OCL as a definition language. Visual language definitions specify a mapping between concrete and abstract models of possible visual sentences, which can subsequently be ...

  11. In silico ADME-Tox modeling: progress and prospects.

    Science.gov (United States)

    Alqahtani, Saeed

    2017-11-01

    Although significant progress has been made in high-throughput screening of absorption, distribution, metabolism and excretion, and toxicity (ADME-Tox) properties in drug discovery and development, in silico ADME-Tox prediction continues to play an important role in facilitating the appropriate selection of candidate drugs by pharmaceutical companies prior to expensive clinical trials. Areas covered: This review provides an overview of the available in silico models that have been used to predict the ADME-Tox properties of compounds. It also provides a comprehensive overview and summarization of the latest modeling methods and algorithms available for the prediction of physicochemical characteristics, ADME properties, and drug toxicity issues. Expert opinion: The in silico models currently available have greatly contributed to the knowledge of screening approaches in the early stages of drug discovery and the development process. As the definitive goal of in silico modeling is to predict the pharmacokinetics and disposition of compounds in vivo by assembling all kinetic processes within one global model, PBPK models can serve this purpose. However, much work remains to be done in this area to generate more data and input parameters to build more reliable and accurate prediction models.
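
    As a minimal sketch of the kind of kinetic building block that PBPK models assemble into a global whole, the classical one-compartment model with first-order oral absorption and elimination can be written as follows (parameter names are generic, not drawn from any specific PBPK package):

```python
import math

def concentration(t, dose, f, ka, ke, v):
    """Plasma concentration after a single oral dose.

    One-compartment model, first-order absorption (ka) and
    elimination (ke), bioavailability f, volume of distribution v:
        C(t) = f*dose*ka / (v*(ka - ke)) * (exp(-ke*t) - exp(-ka*t))
    Assumes ka != ke.
    """
    return (f * dose * ka) / (v * (ka - ke)) * (
        math.exp(-ke * t) - math.exp(-ka * t))

# Illustrative parameters: 100 mg dose, f = 0.8, ka = 1/h, ke = 0.1/h, v = 30 L
c_2h = concentration(2.0, 100.0, 0.8, 1.0, 0.1, 30.0)
```

    A full PBPK model chains many such compartments (gut, liver, plasma, tissues) with physiologically derived rate constants, which is what makes it a candidate for the "one global model" the abstract describes.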

  12. A REVIEW OF RECENT RESEARCH IN INDOOR MODELLING & MAPPING

    Directory of Open Access Journals (Sweden)

    M. Gunduz

    2016-06-01

    Full Text Available Indoor modeling and mapping has been an active area of research in the last 20 years, in order to tackle the problems related to positioning and tracking of people and objects indoors, and provides many opportunities for several domains ranging from emergency response to logistics in micro urban spaces. The outputs of recent research in the field have been presented in several scientific publications and events primarily related to spatial information science and technology. This paper summarizes the outputs of the last 10 years of research on indoor modeling and mapping within a proper classification which covers 7 areas, i.e. Information Acquisition by Sensors, Model Definition, Model Integration, Indoor Positioning and LBS, Routing & Navigation Methods, Augmented and Virtual Reality Applications, and Ethical Issues. Finally, the paper outlines current and future research directions and offers concluding remarks.

  13. Importing CAD models into MONK and MCBEND

    International Nuclear Information System (INIS)

    Searson, K.; Fleurot, F.; Cooper, A. J.; Cowan, P.

    2009-01-01

    The direct use of Computer Aided Design (CAD) models in criticality and shielding codes has been a long-standing goal for Sellafield Ltd. Such functionality could offer several advantages over the traditional method of text-based modelling systems. Analysts would be able to take advantage of the advanced Graphical User Interface based modelling features provided by solid modellers, potentially reducing the costs associated with creating models in a format suitable for the analyst's criticality and shielding code. A prototype system has been developed that allows CAD models created in Autodesk Inventor or Solidworks to be used in criticality and shielding calculations. The system is based on the ANSI Initial Graphics Exchange Specification 5.3 standard and models are exported from the CAD software in Trimmed NURBS format. The format retains much more of the model's geometrical information than a format based on solid meshing techniques and avoids many of the associated problems such as large memory costs, surface approximations and void spaces. The time-consuming and complex meshing process is also avoided. Runtime intersection calculations are performed using either a Bezier clipping process for NURBS-based surface definitions, or by transforming the coordinate system through which the ray tracks for Surface of Revolution (SR) calculations. NURBS surfaces are therefore converted to Bezier form as the model is imported. In addition, the SR generatrix is, in general, converted to a 'strip tree' representation, allowing the SR intersection calculations to be performed with arbitrary generatrix shapes. Details of recent improvements to the Bezier clipping process are provided. Reduction in runtime of SR-based Solidworks models over equivalent NURBS-based Autodesk Inventor models is also demonstrated. (authors)
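
    The Bezier clipping mentioned above repeatedly subdivides a curve's or surface's control net to bracket ray intersections. A minimal sketch of the underlying de Casteljau subdivision step is shown below for a 2-D Bezier curve (illustrative only; the production code described in the record operates on trimmed NURBS surfaces):

```python
def de_casteljau_split(ctrl, t=0.5):
    """Split a Bezier curve given by control points `ctrl` at parameter t.

    Returns the control polygons of the two halves.  Repeated splitting,
    discarding halves whose convex hull cannot contain the ray, is the
    subdivision step at the heart of Bezier-clipping intersection searches.
    """
    left, right = [ctrl[0]], [ctrl[-1]]
    pts = list(ctrl)
    while len(pts) > 1:
        # One round of linear interpolation between consecutive points.
        pts = [tuple((1 - t) * a + t * b for a, b in zip(p, q))
               for p, q in zip(pts, pts[1:])]
        left.append(pts[0])
        right.append(pts[-1])
    return left, right[::-1]

# Splitting a quadratic at t = 0.5; both halves share the curve point B(0.5).
left, right = de_casteljau_split([(0.0, 0.0), (1.0, 2.0), (2.0, 0.0)])
```

    Because each half's control polygon shrinks toward the curve, the bracketing interval converges quickly, which is why clipping avoids the meshing step entirely.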

  14. A Grammatical Approach to the Modeling of an Autonomous Robot

    OpenAIRE

    Gabriel López-García; A. Javier Gallego-Sánchez; J. Luis Dalmau-Espert; Rafael Molina-Carmona; Patricia Compañ-Rosique

    2012-01-01

    Virtual Worlds Generator is a grammatical model that is proposed to define virtual worlds. It integrates the diversity of sensors and interaction devices, multimodality and a virtual simulation system. Its grammar allows the definition and abstraction in symbols strings of the scenes of the virtual world, independently of the hardware that is used to represent the world or to interact with it. A case study is presented to explain how to use the proposed model to formalize a robot navigation s...

  15. Slope wavenumber spectrum models of capillary and capillary-gravity waves

    Institute of Scientific and Technical Information of China (English)

    贾永君; 张杰; 王岩峰

    2010-01-01

    Capillary and capillary-gravity waves possess a random character, and the slope wavenumber spectra of them can be used to represent mean distributions of wave energy with respect to spatial scale of variability. But simple and practical models of the slope wavenumber spectra have not been put forward so far. In this article, we address the accurate definition of the slope wavenumber spectra of water surface capillary and capillary-gravity waves. By combining the existing slope wavenumber models and using th...
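
    In one dimension, a slope wavenumber spectrum is simply k² times the elevation spectrum, so the short capillary scales are weighted up relative to long gravity waves. A minimal FFT-based estimate is sketched below (a raw periodogram; normalization conventions vary and this is not the paper's model):

```python
import numpy as np

def slope_wavenumber_spectrum(eta, dx):
    """Estimate the 1-D slope wavenumber spectrum of a surface record.

    eta : sampled surface elevations, dx : sample spacing.
    Returns angular wavenumbers k and the slope spectrum k^2 * S_eta(k),
    where S_eta is a one-sided periodogram estimate of the elevation
    spectral density.
    """
    n = len(eta)
    k = 2 * np.pi * np.fft.rfftfreq(n, d=dx)   # angular wavenumber
    s_eta = (np.abs(np.fft.rfft(eta)) ** 2) * dx / n
    return k, k**2 * s_eta
```

    For a pure sinusoidal surface the estimate peaks at the sinusoid's wavenumber, as expected.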

  16. MULTI-STAKEHOLDER MODEL OF EDUCATION PROJECT QUALITY MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Юлия Юрьевна ГУСЕВА

    2015-05-01

    Full Text Available An analysis of approaches to defining the stakeholders of higher education projects is conducted. A model of education project quality management incorporating the influence of stakeholders is formed. A mechanism for recognizing new groups of project stakeholders on the basis of set theory is proposed.
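
    The set-theoretic recognition step can be illustrated in miniature (group names are hypothetical, not taken from the paper): a party observed to influence project quality but absent from the registered stakeholder set is flagged as a new stakeholder group.

```python
# Registered stakeholder groups of an education project (illustrative).
known_stakeholders = {"students", "faculty", "ministry", "employers"}

def recognise_new_groups(observed_parties, known=known_stakeholders):
    """New stakeholder groups = observed influencing parties \\ known set."""
    return set(observed_parties) - known

new = recognise_new_groups({"students", "accreditation agency", "alumni"})
```

    The set difference makes the recognition step idempotent: once a group is added to the known set, it is never flagged again.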

  17. Clinical Reasoning in Athletic Training Education: Modeling Expert Thinking

    Science.gov (United States)

    Geisler, Paul R.; Lazenby, Todd W.

    2009-01-01

    Objective: To address the need for a more definitive approach to critical thinking during athletic training educational experiences by introducing the clinical reasoning model for critical thinking. Background: Educators are aware of the need to teach students how to think critically. The multiple domains of athletic training are comprehensive and…

  18. 33 Change Mantra and Leadership Model: Schoolings from Emmy ...

    African Journals Online (AJOL)

    Change: According to Webster's New Collegiate Dictionary, change is “to become or make different; to become something or somebody different.” From this definition, one can deduce that it means a transformation in one's identity or makeup. This source, again at another level ...

  19. Beyond Cultural Relativism: An Ecological Model for Rhetorical Ethics.

    Science.gov (United States)

    Mackin, Jim

    A model intended to overcome the cultural relativism of determining what is an ethical act draws an analogy to environmental studies. Beginning with the concepts of "telos" (final purpose) and "archai" (priority), the notion of an ecosystem of ethics avoids limitation to a particular historical definition of good. Since the…

  20. Evaluation of animal models of neurobehavioral disorders

    Directory of Open Access Journals (Sweden)

    Nordquist Rebecca E

    2009-02-01

    that for improving animal models, guided by the procedure expounded upon in this paper, the developmental and evaluation procedure itself may be improved by careful definition of the purpose(s of a model and by defining better evaluation criteria, based on the proposed use of the model.