WorldWideScience

Sample records for modeling approach specifically

  1. An approach for activity-based DEVS model specification

    DEFF Research Database (Denmark)

    Alshareef, Abdurrahman; Sarjoughian, Hessam S.; Zarrin, Bahram

    2016-01-01

    Creation of DEVS models has been advanced through Model Driven Architecture and its frameworks. The overarching role of the frameworks has been to help develop model specifications in a disciplined fashion. Frameworks can provide intermediary layers between the higher level mathematical models an...

  2. Estimating, Testing, and Comparing Specific Effects in Structural Equation Models: The Phantom Model Approach

    Science.gov (United States)

    Macho, Siegfried; Ledermann, Thomas

    2011-01-01

    The phantom model approach for estimating, testing, and comparing specific effects within structural equation models (SEMs) is presented. The rationale underlying this novel method consists in representing the specific effect to be assessed as a total effect within a separate latent variable model, the phantom model that is added to the main…

  3. AN OBJECT ORIENTED APPROACH TOWARDS THE SPECIFICATION OF SIMULATION MODELS

    Directory of Open Access Journals (Sweden)

    Antonie Van Rensburg

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: To manage problems is to try to cope with a flux of interacting events and ideas that unrolls through time, with the manager trying to improve situations seen as problematical, or at least as less than perfect. The ability to manage or solve these problems depends on the skill of the problem solver in analysing them. This article introduces and discusses a proposed methodology for analysing real-world problems in order to construct valid simulation models.

    AFRIKAANSE OPSOMMING (translated): Managers try to manage or improve problem situations by understanding and handling a flood of dynamic, interacting events. Success in managing or solving these problems depends on the expertise of the problem solver in analysing them. The article discusses a proposed approach to the analysis of problems so that simulation models can be constructed from them.

  4. A Model Driven Approach to domain standard specifications exemplified by Finance Accounts Receivable/Accounts Payable

    OpenAIRE

    Khan, Bahadar

    2005-01-01

    This thesis was written as part of a master's degree at the University of Oslo. The thesis work was conducted at SINTEF and carried out between November 2002 and April 2005. This thesis may interest anyone interested in a Domain Standard Specification Language developed using the MDA approach to software development. The Model Driven Architecture (MDA) allows the system functionality specification to be separated from its implementation on any specific technolo...

  5. Integrating UML, the Q-model and a Multi-Agent Approach in Process Specifications and Behavioural Models of Organisations

    Directory of Open Access Journals (Sweden)

    Raul Savimaa

    2005-08-01

    Full Text Available Efficient estimation and representation of an organisation's behaviour requires specification of business processes and modelling of actors' behaviour. The existing classical approaches, which concentrate only on planned processes, are therefore not suitable; an approach that integrates process specifications with behavioural models of actors should be used instead. The present research indicates that a suitable approach should be based on interactive computing. This paper examines the integration of UML diagrams for process specifications, the Q-model specifications for modelling timing criteria of existing and planned processes, and a multi-agent approach for simulating the non-deterministic behaviour of human actors in an organisation. The corresponding original methodology is introduced and some of its applications are reviewed as case studies.

  6. Structural modeling of age specific fertility curves in Peninsular Malaysia: An approach of Lee Carter method

    Science.gov (United States)

    Hanafiah, Hazlenah; Jemain, Abdul Aziz

    2013-11-01

    In recent years, the study of fertility has received considerable attention among researchers, following fears of fertility deterioration driven by rapid economic development. This study examines the feasibility of developing fertility forecasts based on age structure. The Lee-Carter model (1992) is applied because it is an established and widely used model for analysing demographic trends. A singular value decomposition approach is combined with an ARIMA model to estimate age-specific fertility rates in Peninsular Malaysia over the period 1958-2007. Residual plots are used to assess the goodness of fit of the model. A fertility index forecast using a random walk with drift is then used to predict future age-specific fertility. Results indicate that the proposed model fits the data reasonably well. In addition, there is an apparent and continuous decline in age-specific fertility curves over the next 10 years, particularly among mothers in their early 20s and 40s. The study of fertility is vital for maintaining a balance between population growth and the provision of facilities and related resources.
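The Lee-Carter pipeline described above can be sketched in a few lines: take logs of the age-specific rates, remove the average age profile, extract the leading singular vectors, and project the time index with a random walk with drift. The synthetic rate matrix and parameter values below are illustrative assumptions, not the Malaysian dataset used in the paper.

```python
import numpy as np

def lee_carter_fit(rates):
    """Fit log m(x,t) = a_x + b_x * k_t by SVD (Lee & Carter, 1992).
    rates: (n_ages, n_years) array of age-specific fertility rates."""
    log_m = np.log(rates)
    a = log_m.mean(axis=1)                       # average age profile
    U, s, Vt = np.linalg.svd(log_m - a[:, None], full_matrices=False)
    b = U[:, 0] / U[:, 0].sum()                  # normalise so sum(b) = 1
    k = s[0] * Vt[0] * U[:, 0].sum()             # compensating rescale of the index
    return a, b, k

def forecast_index(k, horizon):
    """Random walk with drift for the fertility index, as in the abstract."""
    drift = (k[-1] - k[0]) / (len(k) - 1)
    return k[-1] + drift * np.arange(1, horizon + 1)

# Synthetic demo: a declining fertility index over 50 "years"
ages = np.arange(15, 50)
true_k = np.linspace(2.0, -2.0, 50)                      # declining index
true_b = np.exp(-0.5 * ((ages - 27) / 8) ** 2)
true_b /= true_b.sum()
true_a = -3.0 + 0.5 * true_b * len(ages)
rates = np.exp(true_a[:, None] + np.outer(true_b, true_k))

a, b, k = lee_carter_fit(rates)
recon = np.exp(a[:, None] + np.outer(b, k))
print(np.allclose(recon, rates))          # rank-1 structure is recovered exactly
print(forecast_index(k, 10)[-1] < k[-1])  # the declining trend continues
```

On noise-free rank-1 data the decomposition is exact; with real rates the first singular pair captures only the dominant trend, which is why the paper checks fit with residual plots.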

  7. Lamina specific loss of inhibition may lead to distinct neuropathic manifestations: a computational modeling approach

    Directory of Open Access Journals (Sweden)

    Erick Javier Argüello Prada

    Full Text Available Introduction: It has been reported that inhibitory control at the superficial dorsal horn (SDH) can act in a regionally distinct manner, which suggests that regionally specific subpopulations of SDH inhibitory neurons may prevent a specific neuropathic condition. Methods: In an attempt to address this issue, we provide an alternative approach by integrating neuroanatomical information from different studies to construct a network model of the SDH. We use Neuroids to simulate each neuron in that model, adapting available experimental evidence. Results: Simulations suggest that maintenance of the proper level of pain sensitivity may be attributed to lamina II inhibitory neurons; hyperalgesia may therefore be elicited by suppression of the inhibitory tone in that lamina. In contrast, lamina III inhibitory neurons are more likely responsible for keeping the nociceptive pathway separate from the mechanoreceptive pathway, so loss of inhibitory control in that region may result in allodynia. The SDH network model is also able to replicate non-linearities associated with pain processing, such as Aβ-fiber-mediated analgesia and the frequency-dependent increase of the neural response. Discussion: By incorporating biophysical accuracy and newer experimental evidence, the SDH network model may become a valuable tool for assessing the contribution of specific SDH connectivity patterns to noxious transmission in both physiological and pathological conditions.
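The disinhibition argument can be illustrated with a deliberately minimal rate model, not the Neuroid-based implementation used in the paper: projection-neuron output is excitatory drive scaled down by a lamina-specific inhibitory gain, then rectified, so removing the inhibition raises the output.

```python
def projection_output(drive, inhibitory_gain):
    """Rectified output of a projection neuron receiving excitatory
    drive and proportional lamina-specific inhibition. A minimal
    sketch under strong assumptions, not the paper's Neuroid model."""
    return max(0.0, (1.0 - inhibitory_gain) * drive)

normal = projection_output(1.0, inhibitory_gain=0.6)     # intact inhibition
lesioned = projection_output(1.0, inhibitory_gain=0.0)   # inhibition lost
print(lesioned > normal)   # disinhibition raises nociceptive output
```

In this caricature, zeroing the gain of a "lamina II" unit exaggerates responses to noxious drive (hyperalgesia), while routing mechanoreceptive drive into the same unit would stand in for allodynia.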

  8. Risk-based technical specifications: Development and application of an approach to the generation of a plant specific real-time risk model

    International Nuclear Information System (INIS)

    Puglia, B.; Gallagher, D.; Amico, P.; Atefi, B.

    1992-10-01

    This report describes a process developed to convert an existing PRA into a model amenable to real-time, risk-based technical specification calculations. In earlier studies (culminating in NUREG/CR-5742), several risk-based approaches to technical specifications were evaluated. A real-time approach using a plant-specific PRA capable of modeling plant configurations as they change was identified as the most comprehensive approach to controlling plant risk. A master fault tree logic model representative of all of the core damage sequences was developed. Portions of the system fault trees were modularized, and supercomponents comprising component failures with similar effects were developed to reduce the size of the model and the quantification times. Modifications to the master fault tree logic were made to properly model the effect of maintenance and recovery actions. Fault trees representing several actuation systems not modeled in detail in the existing PRA were added to the master fault tree logic. This process was applied to the Surry NUREG-1150 Level 1 PRA. The master logic model was confirmed. The model was then used to evaluate the frequency associated with several plant configurations using the IRRAS code. For all cases analyzed, computational time was less than three minutes. This document, Volume 2, contains appendices A, B, and C. These provide, respectively: the Surry Technical Specifications Model Database, the Surry Technical Specifications Model, and a list of supercomponents used in the Surry Technical Specifications Model.
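The modularization idea, collapsing component failures with similar effects into supercomponents and re-quantifying as the plant configuration changes, can be sketched with two gate functions under an independence assumption. The two-train layout and failure probabilities below are hypothetical, not values from the Surry model.

```python
def or_gate(probs):
    """P(at least one basic event occurs), assuming independence."""
    p_none = 1.0
    for q in probs:
        p_none *= (1.0 - q)
    return 1.0 - p_none

def and_gate(probs):
    """P(all inputs occur), assuming independence."""
    p_all = 1.0
    for q in probs:
        p_all *= q
    return p_all

# Hypothetical two-train system: each train is a "supercomponent"
# collapsing pump and valve failures that have the same effect.
TRAIN_A = [1e-3, 5e-4]          # pump A, valve A failure probabilities
TRAIN_B = [1e-3, 5e-4]
baseline = and_gate([or_gate(TRAIN_A), or_gate(TRAIN_B)])

# Real-time reconfiguration: valve B removed for maintenance is modeled
# by setting its failure probability to 1.0 and re-quantifying.
maintenance = and_gate([or_gate(TRAIN_A), or_gate([1e-3, 1.0])])
print(maintenance > 100 * baseline)   # risk jumps during maintenance
```

This is the essence of a configuration-aware model: the tree structure is fixed, and a maintenance action only changes basic-event probabilities before re-quantification.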

  9. Model-based versus specific dosimetry in diagnostic context: Comparison of three dosimetric approaches

    Energy Technology Data Exchange (ETDEWEB)

    Marcatili, S., E-mail: sara.marcatili@inserm.fr; Villoing, D.; Mauxion, T.; Bardiès, M. [Inserm, UMR1037 CRCT, Toulouse F-31000, France and Université Toulouse III-Paul Sabatier, UMR1037 CRCT, Toulouse F-31000 (France); McParland, B. J. [Imaging Technology Group, GE Healthcare, Life Sciences, B22U The Grove Centre, White Lion Road, Amersham, England HP7 9LL (United Kingdom)

    2015-03-15

    Purpose: The dosimetric assessment of novel radiotracers represents a legal requirement in most countries. While the techniques for the computation of internal absorbed dose in a therapeutic context have made huge progress in recent years, in a diagnostic scenario the absorbed dose is usually extracted from model-based lookup tables, most often derived from International Commission on Radiological Protection (ICRP) or Medical Internal Radiation Dose (MIRD) Committee models. The level of approximation introduced by these models may impact the resulting dosimetry. The aim of this work is to establish whether a more refined approach to dosimetry can be implemented in nuclear medicine diagnostics, by analyzing a specific case. Methods: The authors calculated absorbed doses to various organs in six healthy volunteers administered with flutemetamol ({sup 18}F) injection. Each patient underwent from 8 to 10 whole-body 3D PET/CT scans. This dataset was analyzed using a Monte Carlo (MC) application developed in-house with the toolkit GATE, capable of taking into account patient-specific anatomy and radiotracer distribution at the voxel level. The authors compared the absorbed doses obtained with GATE to those calculated with two commercially available software packages: OLINDA/EXM, and STRATOS, which implements a dose voxel kernel convolution approach. Results: Absorbed doses calculated with GATE were higher than those calculated with OLINDA. The average ratio between GATE absorbed doses and OLINDA's was 1.38 ± 0.34 σ (from 0.93 to 2.23). The discrepancy was particularly high for the thyroid, with an average GATE/OLINDA ratio of 1.97 ± 0.83 σ for the six patients. Differences between STRATOS and GATE were found to be higher. The average ratio between GATE and STRATOS absorbed doses was 2.51 ± 1.21 σ (from 1.09 to 6.06). Conclusions: This study demonstrates how the choice of the absorbed dose calculation algorithm may introduce a bias when gamma radiations are of importance, as is...

  10. A simplified approach to control system specification and design using domain modelling and mapping

    International Nuclear Information System (INIS)

    Ludgate, G.A.

    1992-01-01

    Recent developments in the field of accelerator-domain and computer-domain modelling have led to a better understanding of the 'art' of control system specification and design. It now appears possible to 'compile' a control system specification to produce the architectural design. The information required by the 'compiler' is discussed and one hardware optimization algorithm is presented. The desired characteristics of the hardware and software components of a distributed control system architecture are discussed, as are the shortcomings of some commercial products. (author)

  11. Patient-specific in vitro models for hemodynamic analysis of congenital heart disease - Additive manufacturing approach.

    Science.gov (United States)

    Medero, Rafael; García-Rodríguez, Sylvana; François, Christopher J; Roldán-Alzate, Alejandro

    2017-03-21

    Non-invasive hemodynamic assessment of the total cavopulmonary connection (TCPC) is challenging due to the complex anatomy. Additive manufacturing (AM) is a suitable alternative for creating patient-specific in vitro models for flow measurements using four-dimensional (4D) Flow MRI. These in vitro systems have the potential to serve as validation for computational fluid dynamics (CFD), simulating different physiological conditions. This study investigated three different AM technologies, stereolithography (SLA), selective laser sintering (SLS), and fused deposition modeling (FDM), to determine differences in hemodynamics when measuring flow using 4D Flow MRI. The models were created using patient-specific MRI data from an extracardiac TCPC. These models were connected to a perfusion pump circulating water at three different flow rates. Data were processed for visualization and quantification of velocity, flow distribution, vorticity, and kinetic energy. These results were compared between the models. In addition, the flow distribution obtained in vitro was compared to in vivo data. The results showed a significant difference in velocities measured at the outlets of the models that required internal support material during printing. Furthermore, an ultrasound flow sensor was used to validate flow measurements at the inlets and outlets of the in vitro models; these results were highly correlated with those measured by 4D Flow MRI. This study showed that commercially available AM technologies can be used to create patient-specific vascular models for in vitro hemodynamic studies at reasonable cost. However, technologies that do not require internal supports during manufacturing allow smoother internal surfaces, which makes them better suited for flow analyses. Copyright © 2017 Elsevier Ltd. All rights reserved.
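The validation step, checking that ultrasound flow-sensor readings track the 4D Flow MRI measurements, is a paired correlation. A sketch with hypothetical paired flow rates (the abstract reports the correlation only qualitatively, so the numbers below are invented for illustration):

```python
import numpy as np

# Hypothetical paired flow-rate readings (L/min) at the model's
# inlets and outlets: ultrasound transit-time sensor vs 4D Flow MRI.
ultrasound = np.array([1.02, 2.05, 2.98, 1.51, 2.49])
mri        = np.array([0.98, 2.10, 2.91, 1.47, 2.55])

# Pearson correlation between the two measurement modalities
r = np.corrcoef(ultrasound, mri)[0, 1]
print(r > 0.99)   # "highly correlated", as the abstract reports
```

A Bland-Altman plot of the paired differences would be the usual complement to a correlation coefficient in this kind of method-agreement study.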

  12. Recognition of Emotions in Mexican Spanish Speech: An Approach Based on Acoustic Modelling of Emotion-Specific Vowels

    Directory of Open Access Journals (Sweden)

    Santiago-Omar Caballero-Morales

    2013-01-01

    Full Text Available An approach for the recognition of emotions in speech is presented. The target language is Mexican Spanish, and for this purpose a speech database was created. The approach consists of the phoneme-level acoustic modelling of emotion-specific vowels. For this, a standard phoneme-based Automatic Speech Recognition (ASR) system was built with Hidden Markov Models (HMMs), where different phoneme HMMs were built for the consonants and for the emotion-specific vowels associated with four emotional states (anger, happiness, neutral, sadness). Then, the emotional state of a spoken sentence is estimated by counting the number of emotion-specific vowels found in the ASR's output for the sentence. With this approach, an accuracy of 87-100% was achieved for recognition of the emotional state of Mexican Spanish speech.
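The final decision rule, count the emotion-tagged vowels in the ASR output and pick the majority emotion, can be sketched directly. The `a_anger`-style tag format is an illustrative assumption about how emotion-specific vowel models might be labeled, not the paper's actual phoneme inventory.

```python
from collections import Counter

EMOTIONS = ("anger", "happiness", "neutral", "sadness")

def estimate_emotion(phoneme_sequence):
    """Pick the emotion whose tagged vowels appear most often in a
    decoded phoneme sequence. Phonemes are strings like 'a_anger'
    (an emotion-specific vowel) or 'm' (a plain consonant)."""
    counts = Counter()
    for ph in phoneme_sequence:
        _, _, tag = ph.partition("_")
        if tag in EMOTIONS:
            counts[tag] += 1
    if not counts:
        return "neutral"          # no emotion-tagged vowels recognised
    return counts.most_common(1)[0][0]

decoded = ["m", "a_sadness", "n", "o_sadness", "s", "e_anger"]
print(estimate_emotion(decoded))   # sadness: 2 sad vowels vs 1 angry
```

The heavy lifting is done upstream by the HMM decoder; this counting step is what converts a phoneme transcription into a sentence-level emotion label.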

  13. Scalp simulation - A novel approach to site-specific biomechanical modeling of the skin.

    Science.gov (United States)

    Pittar, N; Winter, T; Falland-Cheung, L; Tong, D; Waddell, J N

    2018-01-01

    This study aimed to determine the hardness of the human scalp in vivo in order to identify an appropriate scalp simulant, from a range of commercially available silicone materials, for force impact assessment. Site-dependent variation in scalp hardness, and the applicability of contemporary skin simulants to the scalp, were also considered. A Shore A-type durometer was used to collect hardness data from the scalps of 30 human participants (five males and five females in each of three age categories: 18-30, 31-40, 41-50) and from four commercially available silicones (light-, medium-, and heavy-bodied PVS, and duplication silicone). One-sample t-tests were used to compare the mean hardness of the simulants to that of the scalp. Site-dependent variation in scalp hardness was assessed using a mixed-model repeated-measures ANOVA. Mean human scalp hardness was 20.6 Durometer Units (DU; SD = 3.4). Analysis revealed only the medium-bodied PVS to be an acceptable scalp simulant when compared to the mean hardness of the human scalp (p = 0.869). Scalp hardness varied significantly anteroposteriorly (with an observable linear trend, p < 0.001), but not mediolaterally (p = 0.271). Comparisons of simulants to site-specific variation in scalp hardness anteroposteriorly found the medium-bodied PVS to be suitable only in the central region of the scalp (p = 0.391). In contrast, the duplication silicone (p = 0.074) and light-bodied PVS (p = 0.147) were comparable only to the posterior region. Contemporary skin simulants fail to accurately represent the scalp in terms of hardness. There is strong support for the use of medium-bodied PVS as a scalp simulant. Human scalp hardness varies significantly anteroposteriorly, but not mediolaterally, corresponding to regional anatomical variation within the scalp. A number of materials were identified as potential simulants for different regions of the scalp when more site-specific simulant research is required.
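The one-sample comparison used to accept or reject each silicone can be sketched with hypothetical durometer readings; only the scalp mean of 20.6 DU comes from the abstract, and the normal-tail p-value below is a dependency-free stand-in for the exact t distribution (crude at small n, but adequate for illustration).

```python
import numpy as np
from math import erf, sqrt

def one_sample_t(sample, mu0):
    """One-sample t statistic with a two-sided p-value from the normal
    approximation. SciPy's ttest_1samp would use the exact t tail."""
    n = len(sample)
    t = (np.mean(sample) - mu0) / (np.std(sample, ddof=1) / sqrt(n))
    p = 2.0 * (1.0 - 0.5 * (1.0 + erf(abs(t) / sqrt(2.0))))
    return t, p

# Hypothetical Shore A readings (DU) for a candidate silicone,
# compared against the study's mean scalp hardness of 20.6 DU.
readings = np.array([20.1, 21.0, 19.8, 20.9, 20.4, 20.6, 20.2, 20.8])
t, p = one_sample_t(readings, mu0=20.6)
print(p > 0.05)   # no evidence this simulant differs from the scalp
```

A large p-value here is what "an acceptable scalp simulant (p = 0.869)" means in the abstract: the test fails to distinguish the silicone's mean hardness from the scalp's.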

  14. A data-driven modeling approach to identify disease-specific multi-organ networks driving physiological dysregulation.

    Directory of Open Access Journals (Sweden)

    Warren D Anderson

    2017-07-01

    Full Text Available Multiple physiological systems interact throughout the development of a complex disease. Knowledge of the dynamics and connectivity of interactions across physiological systems could facilitate the prevention or mitigation of the organ damage underlying complex diseases, many of which are currently refractory to available therapeutics (e.g., hypertension). We studied the regulatory interactions operating within and across organs throughout disease development by integrating in vivo analysis of gene expression dynamics with a reverse-engineering approach to infer data-driven dynamic network models of multi-organ gene regulatory influences. We obtained experimental data on the expression of 22 genes across five organs, over a time span that encompassed the development of autonomic nervous system dysfunction and hypertension. We pursued a unique approach to the identification of continuous-time models that jointly described the dynamics and structure of multi-organ networks by estimating a sparse subset of the ∼12,000 possible gene regulatory interactions. Our analyses revealed that an autonomic dysfunction-specific multi-organ sequence of gene expression activation patterns was associated with a distinct gene regulatory network. We analyzed the model structures for adaptation motifs and identified disease-specific network motifs involving genes that exhibited aberrant temporal dynamics. Bioinformatic analyses identified disease-specific single nucleotide variants within or near transcription factor binding sites upstream of key genes implicated in maintaining physiological homeostasis. Our approach illustrates a novel framework for investigating the pathogenesis of complex disease through model-based analysis of multi-organ system dynamics and network properties. Our results yielded novel candidate molecular targets driving the development of cardiovascular disease, metabolic syndrome, and immune dysfunction.
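The reverse-engineering step, estimating a sparse set of regulatory influences in a continuous-time model dx/dt ≈ Ax, can be sketched with a plain l1-penalised least-squares fit (ISTA) on synthetic data. This is a generic sparse-regression stand-in, not the authors' estimator, and the 4-gene toy system below is not their 22-gene, five-organ dataset.

```python
import numpy as np

def soft(x, t):
    """Soft-thresholding operator for the l1 penalty."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def sparse_network(X, dXdt, lam=0.05, iters=2000):
    """Estimate a sparse A in dx/dt ≈ A x by ISTA.
    X: (T, n) expression samples; dXdt: (T, n) time derivatives."""
    n = X.shape[1]
    A = np.zeros((n, n))
    step = 1.0 / np.linalg.norm(X.T @ X, 2)      # safe gradient step
    for _ in range(iters):
        grad = (X @ A.T - dXdt).T @ X            # d/dA of 0.5||X A' - dXdt||^2
        A = soft(A - step * grad, step * lam)
    return A

# Synthetic 4-gene system with two true regulatory interactions
rng = np.random.default_rng(0)
A_true = np.zeros((4, 4))
A_true[1, 0] = 0.8       # gene 0 activates gene 1
A_true[3, 2] = -0.6      # gene 2 represses gene 3
X = rng.normal(size=(200, 4))
dXdt = X @ A_true.T + 0.01 * rng.normal(size=(200, 4))

A_hat = sparse_network(X, dXdt)
print((np.abs(A_hat) > 0.3).sum())   # only the two true edges survive
```

The l1 penalty is what turns the ∼n² candidate interactions into a sparse, interpretable network; the surviving nonzero entries are the inferred regulatory edges.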

  15. A new modelling approach for compacted clayey soils using specific water volume as a state variable

    OpenAIRE

    Abeyrathne, Wedumpuli Koralalage Arunodi Prabashini

    2017-01-01

    One of the key challenges of the present geotechnical engineering community is the accurate definition of unsaturated soil behaviour in routine engineering practice. This is because despite the remarkable progression of unsaturated soil mechanics as a branch of geotechnical engineering over the last few decades, the gap between unsaturated soils research and practice has widened significantly as the models to predict the soil behaviour have become more and more complex. Ther...

  16. A competing risk approach for the European Heart SCORE model based on cause-specific and all-cause mortality

    DEFF Research Database (Denmark)

    Stovring, H.; Harmsen, C. G.; Wisloff, T.

    2013-01-01

    ..., and the expected residual lifetime, together with corresponding expected effects of statin treatment. Results: The modified model provided CVD-specific 10-year mortality risks similar to those of the European Heart SCORE model. Incorporation of non-CVD mortality increased 10-year mortality risks, in particular for older individuals. When non-CVD mortality was assumed unaffected by smoking status, the absolute risk reduction due to statin treatment ranged from 0.0% to 3.5%, whereas the gain in expected residual lifetime ranged from 3 to 11 months. Statin effectiveness increased for non-smokers and declined for smokers when smoking was allowed to influence non-CVD mortality. Conclusion: The modified model provides mathematically consistent estimates of mortality risk and expected residual lifetime, together with expected benefits from statin treatment.

  17. Intent Specifications: An Approach to Building Human-Centered Specifications

    Science.gov (United States)

    Leveson, Nancy G.

    1999-01-01

    This paper examines and proposes an approach to writing software specifications, based on research in systems theory, cognitive psychology, and human-machine interaction. The goal is to provide specifications that support human problem solving and the tasks that humans must perform in software development and evolution. A type of specification, called intent specifications, is constructed upon this underlying foundation.

  18. Comparisons of node-based and element-based approaches of assigning bone material properties onto subject-specific finite element models.

    Science.gov (United States)

    Chen, G; Wu, F Y; Liu, Z C; Yang, K; Cui, F

    2015-08-01

    Subject-specific finite element (FE) models can be generated from computed tomography (CT) datasets of a bone. A key step is assigning material properties automatically onto finite element models, which remains a great challenge. This paper proposes a node-based assignment approach and compares it with the element-based approach in the literature. Both approaches were implemented using ABAQUS. The assignment procedure is divided into two steps: generating the data file of the image intensity of a bone in a MATLAB program, and reading the data file into ABAQUS via user subroutines. The node-based approach assigns the material properties to each node of the finite element mesh, while the element-based approach assigns the material properties directly to each integration point of an element. Both approaches are independent of the type of elements. A number of FE meshes were tested and both give accurate solutions; comparatively, the node-based approach involves less programming effort. The node-based approach is also independent of the type of analysis; it has been tested on the nonlinear analysis of a Sawbone femur. The node-based approach substantially improves the level of automation of the assignment procedure for bone material properties. It is the simplest and most powerful approach and is applicable to many types of analyses and elements. Copyright © 2015 IPEM. Published by Elsevier Ltd. All rights reserved.
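The core of a node-based assignment is to sample the CT intensity at each node's coordinates and map it to a modulus. A minimal sketch under stated assumptions: the density-modulus power law coefficients are illustrative literature-style values, not the paper's, and nearest-voxel lookup stands in for trilinear interpolation.

```python
import numpy as np

def hu_to_modulus(hu):
    """Map CT Hounsfield units to Young's modulus (MPa) through an
    apparent-density power law E = c * rho^p. The coefficients are
    illustrative assumptions, not values from the paper."""
    rho = np.maximum(0.0, 0.0008 * hu + 0.8)    # apparent density [g/cm^3]
    return 6850.0 * rho ** 1.49

def sample_at_nodes(volume, nodes):
    """Nearest-voxel lookup of image intensity at node coordinates
    (a simple stand-in for trilinear interpolation).
    volume: (nx, ny, nz) HU grid; nodes: (n, 3) voxel-space coords."""
    idx = np.clip(np.rint(nodes).astype(int), 0, np.array(volume.shape) - 1)
    return volume[idx[:, 0], idx[:, 1], idx[:, 2]]

# Toy CT block with a dense (cortical-like) layer at the top slice
ct = np.full((4, 4, 4), 200.0)
ct[:, :, 3] = 1500.0
nodes = np.array([[1.2, 2.0, 0.4],    # node in trabecular-like region
                  [2.8, 1.1, 3.0]])   # node in the dense layer
E = hu_to_modulus(sample_at_nodes(ct, nodes))
print(E[1] > E[0])   # the dense-layer node gets a stiffer assignment
```

In the paper's workflow this per-node modulus would be written to a data file and read back inside an ABAQUS user subroutine; the element-based variant would instead sample at integration-point coordinates.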

  19. Combining the GW formalism with the polarizable continuum model: A state-specific non-equilibrium approach

    Science.gov (United States)

    Duchemin, Ivan; Jacquemin, Denis; Blase, Xavier

    2016-04-01

    We have implemented the polarizable continuum model within the framework of the many-body Green's function GW formalism for the calculation of electron addition and removal energies in solution. The present formalism includes both ground-state and non-equilibrium polarization effects. In addition, the polarization energies are state-specific, allowing one to obtain the bath-induced renormalisation energy of all occupied and virtual energy levels. Our implementation is validated by comparisons with ΔSCF calculations performed at both the density functional theory and coupled-cluster singles and doubles levels for solvated nucleobases. The present study opens the way to GW and Bethe-Salpeter calculations in disordered condensed phases of interest in organic optoelectronics, wet chemistry, and biology.

  20. Combining the GW formalism with the polarizable continuum model: A state-specific non-equilibrium approach

    Energy Technology Data Exchange (ETDEWEB)

    Duchemin, Ivan, E-mail: ivan.duchemin@cea.fr [INAC, SP2M/L-Sim, CEA/UJF Cedex 09, 38054 Grenoble (France); Jacquemin, Denis [Laboratoire CEISAM - UMR CNR 6230, Université de Nantes, 2 Rue de la Houssinière, BP 92208, 44322 Nantes Cedex 3 (France); Institut Universitaire de France, 1 rue Descartes, 75005 Paris Cedex 5 (France); Blase, Xavier [CNRS, Inst. NÉEL, F-38000 Grenoble (France); Univ. Grenoble Alpes, Inst. NÉEL, F-38000 Grenoble (France)

    2016-04-28

    We have implemented the polarizable continuum model within the framework of the many-body Green’s function GW formalism for the calculation of electron addition and removal energies in solution. The present formalism includes both ground-state and non-equilibrium polarization effects. In addition, the polarization energies are state-specific, allowing one to obtain the bath-induced renormalisation energy of all occupied and virtual energy levels. Our implementation is validated by comparisons with ΔSCF calculations performed at both the density functional theory and coupled-cluster singles and doubles levels for solvated nucleobases. The present study opens the way to GW and Bethe-Salpeter calculations in disordered condensed phases of interest in organic optoelectronics, wet chemistry, and biology.

  1. An Asset Pricing Approach to Testing General Term Structure Models including Heath-Jarrow-Morton Specifications and Affine Subclasses

    DEFF Research Database (Denmark)

    Christensen, Bent Jesper; van der Wel, Michel

    We develop a new empirical approach to term structure analysis that allows testing for time-varying risk premia and for the absence of arbitrage opportunities based on the drift restriction within the Heath, Jarrow and Morton (1992) framework. As in the equity case, a zero intercept condition...... of the risk premium is associated with the slope factor, and individual risk prices depend on own past values, factor realizations, and past values of other risk prices, and are significantly related to the output gap, consumption, and the equity risk price. The absence of arbitrage opportunities is strongly...

  2. Patient-Specific Computational Modeling

    CERN Document Server

    Peña, Estefanía

    2012-01-01

    This book addresses patient-specific modeling. It integrates computational modeling, experimental procedures, clinical image segmentation, and mesh generation with the finite element method (FEM) to solve problems in computational biomedicine and bioengineering. Specific areas of interest include cardiovascular problems, the ocular and muscular systems, and soft tissue modeling. Patient-specific modeling has been the subject of serious research over the last seven years; interest in the area is continually growing, and it is expected to develop further in the near future.

  3. A patient-specific intracranial aneurysm model with endothelial lining: a novel in vitro approach to bridge the gap between biology and flow dynamics.

    Science.gov (United States)

    Kaneko, Naoki; Mashiko, Toshihiro; Namba, Katsunari; Tateshima, Satoshi; Watanabe, Eiju; Kawai, Kensuke

    2018-03-01

    To develop an in vitro model for studying the biological effect of complex-flow stress on endothelial cells in three-dimensional (3D) patient-specific vascular geometry. A vessel replica was fabricated with polydimethylsiloxanes using 3D printing technology from vascular image data acquired by rotational angiography. The vascular model was coated with fibronectin, immersed in a tube filled with a suspension of endothelial cells, and then cultured while being slowly rotated in three dimensions. Viscosity-adjusted culture medium was perfused through the circuit containing the endothelialized vascular model. A computational fluid dynamics (CFD) study was conducted using the perfusion conditions of the flow experiment. The morphology of the endothelial cells was observed under a confocal microscope. The CFD study showed low wall shear stress and circulating flow in the apex of the basilar tip aneurysm, with linear flow in the parent artery. Confocal imaging demonstrated that the inner surface of the vascular model was evenly covered with a monolayer of endothelial cells. After 24 h of flow circulation, endothelial cells in the parent artery exhibited a spindle shape and aligned with the flow direction. In contrast, endothelial cells in the aneurysmal apex were irregular in shape and size. A geometrically realistic intracranial aneurysm model with a live endothelial lining was successfully developed. This in vitro model enables a new research approach combining study of the biological impact of complex flow on endothelial cells with CFD analysis and patient information, including the presence of aneurysmal growth or rupture. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  4. A formal validation approach for holonic control system specifications

    OpenAIRE

    Leitão, Paulo; Colombo, Armando W.; Restivo, Francisco

    2003-01-01

    Indexed in ISI. The holonic manufacturing paradigm allows a new approach to the emergent requirements faced by the manufacturing world, through the concepts of modularity, decentralisation, autonomy, and re-use of control software components. The formal modelling and validation of the structural and behavioural specifications of holonic control systems therefore assumes a critical role. This paper discusses the formal validation of the Petri net models designed to represent the behaviour and specifications...

  5. An Approach to Greater Specificity for Glucocorticoids

    Directory of Open Access Journals (Sweden)

    Carson C. Chow

    2018-03-01

    Full Text Available Glucocorticoid steroids are among the most prescribed drugs each year. Nonetheless, their many undesirable side effects and lack of selectivity restrict their wider usage. Research to increase glucocorticoid specificity has spanned many years. These efforts have been hampered by the ability of glucocorticoids both to induce and to repress gene transcription, and also by the lack of success in defining any predictable properties that control glucocorticoid specificity. Correlations of transcriptional specificity have been observed with changes in steroid structure, receptor and chromatin conformation, DNA sequence for receptor binding, and associated cofactors. However, none of these studies have progressed to the point of being able to offer guidance for increased specificity. We summarize here a mathematical theory that allows a novel and quantifiable approach to increased selectivity. The theory applies to all three major actions of glucocorticoid receptors: induction by agonists, induction by antagonists, and repression by agonists. Simple graphical analysis of competition assays involving any two factors (steroid, chemical, peptide, protein, DNA, etc.) yields information (1) about the kinetically described mechanism of action for each factor at the step where it acts in the overall reaction sequence and (2) about the relative position of the step where each factor acts. These two pieces of information uniquely provide direction for increasing the specificity of glucocorticoid action. Consideration of all three modes of action indicates that the most promising approach for increased specificity is to vary the concentrations of those cofactors/pharmaceuticals that act closest to the observed end point. The potential for selectivity is even greater when varying cofactors/pharmaceuticals in conjunction with a select class of antagonists.

  6. Industry specific financial distress modeling

    Directory of Open Access Journals (Sweden)

    Naz Sayari

    2017-01-01

    Full Text Available This study investigates the uncertainty levels of various industries and identifies the financial ratios with the greatest information content for capturing industry characteristics. It then uses these ratios to develop industry-specific financial distress models. First, we employ factor analysis to determine the set of ratios that are most informative in specified industries. Second, we use a method based on the concept of entropy to measure the level of uncertainty in industries and to single out the ratios that best reflect the uncertainty levels in specific industries. Finally, we conduct a logistic regression analysis and derive industry-specific financial distress models which can be used to judge the predictive ability of the selected financial ratios for each industry. The results show that financial ratios do indeed echo industry characteristics and that the information content of specific ratios varies among industries. Our findings show the diverging impact of industry characteristics on companies, and thus the necessity of constructing industry-specific financial distress models.
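    The entropy-based ranking step described in this record can be pictured in a few lines. The sketch below is a minimal illustration, not the authors' actual procedure; the ratio names and firm values are hypothetical:

```python
import math

def shannon_entropy(values):
    """Entropy of a ratio's distribution across firms: normalize the
    (non-negative) values into shares, then apply Shannon's formula."""
    total = sum(values)
    shares = [v / total for v in values if v > 0]
    return -sum(p * math.log(p) for p in shares)

# Hypothetical current-ratio and debt-ratio observations for five firms.
ratios = {
    "current_ratio": [1.9, 2.1, 2.0, 1.8, 2.2],   # nearly uniform -> high entropy
    "debt_ratio":    [0.1, 0.1, 0.1, 0.1, 9.6],   # concentrated   -> low entropy
}

# Rank ratios by entropy; a lower entropy means the ratio's values are
# concentrated and thus discriminate more strongly between firms.
ranked = sorted(ratios, key=lambda r: shannon_entropy(ratios[r]))
print(ranked[0])  # debt_ratio
```

    The ranked ratios would then feed a logistic regression per industry, as the abstract describes.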

  7. Peptide Based Radiopharmaceuticals: Specific Construct Approach

    Energy Technology Data Exchange (ETDEWEB)

    Som, P; Rhodes, B A; Sharma, S S

    1997-10-21

    The objective of this project was to develop receptor-based peptides for diagnostic imaging and therapy. A series of peptides related to cell adhesion molecules (CAM) and immune regulation were designed for radiolabeling with 99mTc and evaluated in animal models as potential diagnostic imaging agents for various disease conditions such as thrombus (clot), acute kidney failure, and infection/inflammation imaging. The peptides for this project were designed by the industrial partner, Palatin Technologies (formerly Rhomed, Inc.), using various peptide design approaches, including a newly developed rational computer-assisted drug design (CADD) approach termed MIDAS (Metal ion Induced Distinctive Array of Structures). In this approach, the biological function domain and the 99mTc complexing domain are fused together so that structurally these domains are indistinguishable. This approach allows construction of conformationally rigid metallo-peptide molecules (similar to cyclic peptides) that are metabolically stable in vivo. All the newly designed peptides were screened in various in vitro receptor binding and functional assays to identify a lead compound. The lead compounds were formulated in a one-step 99mTc labeling kit form and studied by BNL for detailed in vivo imaging using various animal models of human disease. Two main peptides using the MIDAS approach evolved and were investigated: an RGD peptide for acute renal failure and an immunomodulatory peptide derived from tuftsin (RMT-1) for infection/inflammation imaging. Various RGD-based metallopeptides were designed, synthesized and assayed for their efficacy in inhibiting ADP-induced human platelet aggregation. Most of these peptides displayed biological activity in the 1-100 µM range. Based on previous work by others, RGD-I and RGD-II were evaluated in animal models of acute renal failure. These earlier studies showed that after acute ischemic injury the renal cortex displays ...

  8. HEDR modeling approach

    International Nuclear Information System (INIS)

    Shipler, D.B.; Napier, B.A.

    1992-07-01

    This report details the conceptual approaches to be used in calculating radiation doses to individuals throughout the various periods of operations at the Hanford Site. The report considers the major environmental transport pathways--atmospheric, surface water, and ground water--and projects an appropriate modeling technique for each. The modeling sequence chosen for each pathway depends on the available data on doses, the degree of confidence justified by such existing data, and the level of sophistication deemed appropriate for the particular pathway and time period being considered.

  9. Modeling prosody: Different approaches

    Science.gov (United States)

    Carmichael, Lesley M.

    2002-11-01

    Prosody pervades all aspects of a speech signal, both in terms of raw acoustic outcomes and linguistically meaningful units, from the phoneme to the discourse unit. It is carried in the suprasegmental features of fundamental frequency, loudness, and duration. Several models have been developed to account for the way prosody organizes speech, and they vary widely in terms of their theoretical assumptions, organizational primitives, actual procedures of application to speech, and intended use (e.g., to generate speech from text vs. to model the prosodic phonology of a language). In many cases, these models overtly contradict one another with regard to their fundamental premises or their identification of the perceptible objects of linguistic prosody. These competing models are directly compared. Each model is applied to the same speech samples. This parallel analysis allows for a critical inspection of each model and its efficacy in assessing the suprasegmental behavior of the speech. The analyses illustrate how different approaches are better equipped to account for different aspects of prosody. Viewing the models and their successes from an objective perspective allows for creative possibilities in terms of combining strengths from models which might otherwise be considered fundamentally incompatible.

  10. General and specific attention-deficit/hyperactivity disorder factors of children 4 to 6 years of age: An exploratory structural equation modeling approach to assessing symptom multidimensionality.

    Science.gov (United States)

    Arias, Víctor B; Ponce, Fernando P; Martínez-Molina, Agustín; Arias, Benito; Núñez, Daniel

    2016-01-01

    We tested first-order factor and bifactor models of attention-deficit/hyperactivity disorder (ADHD) using confirmatory factor analysis (CFA) and exploratory structural equation modeling (ESEM) to adequately summarize the Diagnostic and Statistical Manual of Mental Disorders, 4th Edition (DSM-IV-TR) symptoms observed in a Spanish sample of preschoolers and kindergarteners. Six ESEM and CFA models were estimated based on teacher evaluations of the behavior of 638 children 4 to 6 years of age. An ESEM bifactor model with a central dimension plus 3 specific factors (inattention, hyperactivity, and impulsivity) showed the best fit and interpretability. Strict invariance between the sexes was observed. The bifactor model provided a solution to previously encountered inconsistencies in the factorial models of ADHD in young children. However, the low reliability of the specific factors casts doubt on the utility of the subscales for ADHD measurement. More research is necessary to clarify the nature of the G and S factors of ADHD. (c) 2016 APA, all rights reserved.

  11. A Generalised Approach to Petri Nets and Algebraic Specifications

    International Nuclear Information System (INIS)

    Sivertsen, Terje

    1998-02-01

    The present report represents a continuation of the work on Petri nets and algebraic specifications. The reported research has focused on generalising the approach introduced in HWR-454, with the aim of facilitating the translation of a wider class of Petri nets into algebraic specifications. This includes autonomous Petri nets with increased descriptive power, as well as non-autonomous Petri nets allowing the modelling of systems (1) involving extensive data processing; (2) with transitions synchronized on external events; (3) whose evolutions are time dependent. The generalised approach has the important property of being modular, in the sense that the translated specifications can be gradually extended to include data processing, synchronization, and timing. The report also discusses the relative merits of state-based and transition-based specifications, and includes a non-trivial case study involving automated proofs of a large number of interrelated theorems. The examples in the report illustrate the use of the new HRP Prover. Of particular importance in this context is the automatic transformation between state-based and transition-based specifications. It is expected that the approach introduced in HWR-454 and generalised in the present report will prove useful in future work on the combination of a wide variety of specification techniques.
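    The state-based view of an autonomous Petri net that such translations start from can be sketched very compactly. This is a generic place/transition toy, not the HWR-454 construction; the net and place names are invented for illustration:

```python
# Minimal place/transition net: a transition fires when every input place
# holds enough tokens; firing consumes input tokens and produces output tokens.
def enabled(marking, transition):
    pre, _ = transition
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, transition):
    pre, post = transition
    if not enabled(marking, transition):
        raise ValueError("transition not enabled")
    new = dict(marking)
    for p, n in pre.items():
        new[p] -= n
    for p, n in post.items():
        new[p] = new.get(p, 0) + n
    return new

# Toy net: transition t moves one token from 'ready' to 'done'.
t = ({"ready": 1}, {"done": 1})
m0 = {"ready": 2, "done": 0}
m1 = fire(m0, t)
print(m1)  # {'ready': 1, 'done': 1}
```

    An algebraic specification of the same net would axiomatize `enabled` and `fire` as operations over a sort of markings, which is what the translation described above automates.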

  12. A Parametric Counterexample Refinement Approach for Robust Timed Specifications

    Directory of Open Access Journals (Sweden)

    Louis-Marie Traonouez

    2012-07-01

    Full Text Available Robustness analysis studies the impact of small perturbations on the semantics of a model. This makes it possible to model hardware imprecision, and it has therefore been applied to determine the implementability of timed automata. In a recent paper, we extended this problem to a specification theory for real-timed systems based on timed input/output automata, which are interpreted as two-player games. We propose a construction that allows us to synthesize an implementation of a specification that is robust under a given timed perturbation, and we study the impact of these perturbations when composing different specifications. To complete this work, we present a technique that evaluates the greatest admissible perturbation. It consists of an iterative process that extracts a spoiling strategy when a game is lost and, through a parametric analysis, refines the admissible values for the perturbation. We demonstrate this approach with a prototype implementation.
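    The refinement loop for the greatest admissible perturbation can be pictured as narrowing an interval of candidate values. The sketch below substitutes a simple bisection and a stub "game is won" oracle for the paper's actual parametric strategy analysis; the tolerance and the 0.3 threshold are invented for illustration:

```python
def greatest_admissible(wins, lo=0.0, hi=1.0, tol=1e-6):
    """Shrink [lo, hi] until it brackets the largest perturbation delta
    for which wins(delta) still holds (wins is assumed monotone)."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if wins(mid):
            lo = mid      # still implementable: admissible values extend further
        else:
            hi = mid      # a spoiling strategy exists: refine the bound downwards
    return lo

# Stub oracle: the specification tolerates perturbations below 0.3 time units.
delta = greatest_admissible(lambda d: d < 0.3)
print(round(delta, 3))  # 0.3
```

    In the actual technique, the "spoiling strategy" extracted from a lost game drives the refinement instead of a blind bisection step.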

  13. Building a Flexible Software Factory Using Partial Domain Specific Models

    NARCIS (Netherlands)

    Warmer, J.B.; Kleppe, A.G.

    2006-01-01

    This paper describes some experiences in building a software factory by defining multiple small domain specific languages (DSLs) and having multiple small models per DSL. This is in high contrast with traditional approaches using monolithic models, e.g. written in UML. In our approach, models behave

  14. Material Modelling - Composite Approach

    DEFF Research Database (Denmark)

    Nielsen, Lauge Fuglsang

    1997-01-01

    This report is part of a research project on "Control of Early Age Cracking", which, in turn, is part of the major research programme "High Performance Concrete - The Contractor's Technology (HETEK)", coordinated by the Danish Road Directorate, Copenhagen, Denmark, 1997. A composite-rheological model of concrete is presented by which consistent predictions of creep, relaxation, and internal stresses can be made from known concrete composition, age at loading, and climatic conditions. No other existing "creep prediction method" offers these possibilities in one approach. The model ... in this report is that cement paste and concrete behave practically as linear-viscoelastic materials from an age of approximately 10 hours. This is a significant age extension relative to earlier studies in the literature, where linear-viscoelastic behavior is only demonstrated from ages of a few days. Thus ...

  15. Modeling software behavior a craftsman's approach

    CERN Document Server

    Jorgensen, Paul C

    2009-01-01

    A common problem with most texts on requirements specifications is that they emphasize structural models to the near exclusion of behavioral models, focusing on what the software is rather than what it does. If they do cover behavioral models, the coverage is brief and usually focused on a single model. Modeling Software Behavior: A Craftsman's Approach provides detailed treatment of various models of software behavior that support early analysis, comprehension, and model-based testing. It is based on the popular and continually evolving course on requirements specification models taught by the author ...

  16. Friendship networks and psychological well-being from late adolescence to young adulthood: a gender-specific structural equation modeling approach.

    Science.gov (United States)

    Miething, Alexander; Almquist, Ylva B; Östberg, Viveca; Rostila, Mikael; Edling, Christofer; Rydgren, Jens

    2016-07-11

    The importance of supportive social relationships for psychological well-being has been previously recognized, but the direction of the associations between the two dimensions, and how they evolve as adolescents enter adulthood, has scarcely been addressed. The present study aims to examine the gender-specific associations between self-reported friendship network quality and the psychological well-being of young people during the transition from late adolescence to young adulthood, taking into account the direction of association. A random sample of Swedes born in 1990 was surveyed at age 19 and again at age 23 regarding their own health and their relationships with a maximum of five self-nominated friends. The response rate was 55.3 % at baseline and 43.7 % at follow-up, resulting in 772 cases eligible for analysis. Gender-specific structural equation modeling was conducted to explore the associations between network quality and well-being. The measurement part included a latent measure of well-being, whereas the structural part accounted for autocorrelation of network quality and of well-being over time and further examined the cross-lagged associations. The results show that network quality increased while well-being decreased from age 19 to age 23. Females reported worse well-being at both time points, whereas no gender differences were found for network quality. Network quality at age 19 predicted network quality at age 23, and well-being at age 19 predicted well-being at age 23. The results further show positive correlations between network quality and well-being for males and females alike. The strength of the correlations diminished over time but remained significant at age 23. Simultaneously testing social causation and social selection in a series of competing models indicates that while there were no cross-lagged associations among males, there was a weak reverse association between well-being at age 19 and network quality at age 23 among females.

  17. Specific approach for continuous air quality monitoring

    Directory of Open Access Journals (Sweden)

    Živković Predrag M.

    2012-01-01

    Full Text Available Rapid industry development, as well as the increase of traffic volume across the world, has resulted in air quality becoming one of the most important factors of everyday life. Air quality monitoring is a necessary factor for proper decision making regarding air pollution. An integral part of such investigations is the measurement of wind characteristics, as wind is the most influential factor in the turbulent diffusion of pollution into the atmosphere. Most air pollution originates from combustion processes, so it is important to make quantitative as well as qualitative analyses, as the sources of pollution can be very distant. In this paper, a specific methodology for continuous wind, temperature and air quality data acquisition is presented. A comparison of the measured results is given, as well as a detailed presentation of the characteristics of the acquisition software used.

  18. Current Approaches to the Establishment of Credit Risk Specific Provisions

    Directory of Open Access Journals (Sweden)

    Ion Nitu

    2008-10-01

    Full Text Available The aim of the new Basel II and IFRS approaches is to make the operations of financial institutions more transparent and thus to create a better basis for market participants and supervisory authorities to acquire information and make decisions. In the banking sector, a continuous debate is being led on the similarities and differences between the IFRS approach to loan loss provisions and the Basel II approach to calculating capital requirements, judged against the classical method of loan provisioning currently used by Romanian banks following the Central Bank's regulations. Banks must take into consideration that the IFRS and Basel II objectives are fundamentally different. While IFRS aims to ensure that the financial statements adequately reflect the losses recorded at each balance sheet date, the Basel II objective is to ensure that the bank has enough provisions or capital to face expected losses in the next 12 months and eventual unexpected losses. Consequently, there are clear differences between the objectives of the two models. Basel II works on statistical modeling of expected losses, while IFRS, although allowing statistical models, requires a trigger event to have occurred before they can be used. IAS 39 specifically states that losses that are expected as a result of future events, no matter how likely, are not recognized. This is a clear and fundamental area of difference between the two frameworks.
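    The Basel II notion of expected loss that this record contrasts with IFRS provisioning is conventionally written as EL = PD x LGD x EAD. A toy computation, with purely illustrative figures:

```python
def expected_loss(pd, lgd, ead):
    """Basel II expected loss over the next 12 months:
    probability of default x loss given default x exposure at default."""
    return pd * lgd * ead

# Hypothetical loan: 2% default probability, 45% loss severity, 1M exposure.
el = expected_loss(pd=0.02, lgd=0.45, ead=1_000_000)
print(round(el, 2))  # 9000.0
```

    Under IFRS (IAS 39) as described above, a provision of this kind could only be booked once a trigger event had actually occurred.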

  19. Semiring-based Specification Approaches for Quantitative Security

    Directory of Open Access Journals (Sweden)

    Fabio Martinelli

    2015-09-01

    Full Text Available Our goal is to provide different semiring-based formal tools for the specification of security requirements: we quantitatively enhance the open-system approach, according to which a system is only partially specified. We therefore suppose the existence of an unknown and possibly malicious agent that interacts in parallel with the system. Two specification frameworks are designed along two different (but still related) lines. First, by comparing the behaviour of a system with the expected one, or by checking whether such a system satisfies some security requirements, we investigate a novel approximate behavioural equivalence for comparing process behaviour, thus extending the Generalised Non Deducibility on Composition (GNDC) approach with scores. As a second result, we equip a modal logic with semiring values, so that a weight is associated with the satisfaction of a formula specifying some requested property. Finally, we generalise the classical partial model-checking function, naming it quantitative partial model-checking, in such a way as to point out the necessary and sufficient conditions that a system has to satisfy in order to be considered secure with respect to a fixed security/functionality threshold value.
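    A semiring in this quantitative setting is just a carrier set with a "choice" and a "sequencing" operation. The sketch below uses the tropical (min, +) semiring, a standard textbook instance rather than anything specific to this paper, to accumulate a hypothetical "cost of violation" along alternative behaviours:

```python
# Tropical semiring: "plus" picks the best alternative (min), "times"
# accumulates cost along a sequence (+); float('inf') is the zero element.
class Tropical:
    zero = float("inf")   # annihilator: an impossible behaviour
    one = 0.0             # neutral cost for sequencing
    @staticmethod
    def plus(a, b):
        return min(a, b)
    @staticmethod
    def times(a, b):
        return a + b

def path_cost(*steps):
    """Cost of one behaviour: semiring product of its per-step costs."""
    cost = Tropical.one
    for s in steps:
        cost = Tropical.times(cost, s)
    return cost

# Two ways to satisfy a requirement, with per-step violation costs:
best = Tropical.plus(path_cost(1.0, 2.5), path_cost(0.5, 0.5, 1.0))
print(best)  # 2.0
```

    Replacing the tropical semiring with another instance (e.g. probabilities under max and product) changes the quantitative interpretation without changing the framework, which is the point of the semiring-based approach.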

  20. Model Commissioning Plan and Guide Specifications

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    The objectives of the Model Commissioning Plan and Guide Specifications are to ensure that the design team applies commissioning concepts to the design and prepares commissioning specifications and a commissioning plan for inclusion in the bid construction documents.

  1. Introducing a game approach towards IS requirements specification

    DEFF Research Database (Denmark)

    Jensen, Mika Yasuoka; Kadoya, Kyoichi; Niwa, Takashi

    2014-01-01

    Devising a system requirements specification is a challenging task. Even after several decades of system development research, specifications for large-scale, widely-used systems remain difficult. In this paper, we suggest a first step toward a requirements specification through a stakeholder involvement approach with game elements. We report preliminary findings from a practice case in which our methods are applied to the requirement specification phase of a project management system. The analysis showed that our game approach fostered innovative idea generation and captured implicit user ... A stakeholder involvement method with game elements can be effectively utilized as a first step towards requirement specification.

  2. Site specific optimization of wind turbines energy cost: Iterative approach

    International Nuclear Information System (INIS)

    Rezaei Mirghaed, Mohammad; Roshandel, Ramin

    2013-01-01

    Highlights: • Optimization model of wind turbine parameters plus rectangular farm layout is developed. • Results show that the levelized cost for a single turbine fluctuates between 46.6 and 54.5 $/MW h. • Modeling results for two specific farms reported optimal sizing and farm layout. • Results show that the levelized cost of the wind farms fluctuates between 45.8 and 67.2 $/MW h. - Abstract: The present study was aimed at developing a model to optimize the sizing parameters and farm layout of wind turbines according to the wind resource and economic aspects. The proposed model, including aerodynamic, economic and optimization sub-models, is used to achieve minimum levelized cost of electricity. The blade element momentum theory is utilized for aerodynamic modeling of pitch-regulated horizontal axis wind turbines. Also, a comprehensive cost model including capital costs of all turbine components is considered. An iterative approach is used to develop the optimization model. The modeling results are presented for three potential regions in Iran: Khaf, Ahar and Manjil. The optimum configurations and sizing for a single turbine with minimum levelized cost of electricity are presented. The optimal cost of energy for one turbine is calculated at about 46.7, 54.5 and 46.6 dollars per MW h in the studied sites, respectively. In addition, the optimal size of turbines, annual electricity production, capital cost, and wind farm layout for two different rectangular and square shaped farms in the proposed areas have been recognized. According to the results, the optimal system configuration corresponds to a minimum levelized cost of electricity of about 45.8 to 67.2 dollars per MW h in the studied wind farms.
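    The levelized cost figures in this record follow the standard LCOE definition: annualized capital cost plus operating cost, divided by annual energy production. A minimal sketch with invented numbers, not the paper's detailed component cost model:

```python
def crf(rate, years):
    """Capital recovery factor: annualizes an up-front investment
    at a given discount rate over the project lifetime."""
    return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

def lcoe(capital_cost, annual_om, annual_energy_mwh, rate=0.08, life=20):
    """Levelized cost of electricity in $/MWh."""
    annualized = capital_cost * crf(rate, life) + annual_om
    return annualized / annual_energy_mwh

# Hypothetical 2 MW turbine: $2.4M capital, $60k/yr O&M, 6000 MWh/yr.
print(round(lcoe(2_400_000, 60_000, 6_000), 1))  # 50.7
```

    In the paper's iterative approach, turbine sizing parameters and farm layout are varied until this quantity is minimized for the site's wind resource.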

  3. Risk-informed approach in US-APWR technical specifications

    International Nuclear Information System (INIS)

    Saji, Etsuro; Tanaka, Futoshi; Kuroiwa, Katsuya; Kawai, Katsunori

    2009-01-01

    The Risk-Managed Technical Specifications and the Surveillance Frequency Control Program have been adopted in the US-APWR Technical Specifications. These risk-informed approaches are unique among the technical specifications for the advanced light water reactor designs adopted by planned nuclear power stations in the United States. (author)

  4. On Automatic Modeling and Use of Domain-specific Ontologies

    DEFF Research Database (Denmark)

    Andreasen, Troels; Knappe, Rasmus; Bulskov, Henrik

    2005-01-01

    In this paper, we firstly introduce an approach to the modeling of a domain-specific ontology for use in connection with a given document collection. Secondly, we present a methodology for deriving conceptual similarity from the domain-specific ontology. Adopted for ontology representation is a s...

  5. An integrated in silico approach to design specific inhibitors targeting human poly(A)-specific ribonuclease.

    Directory of Open Access Journals (Sweden)

    Dimitrios Vlachakis

    Full Text Available Poly(A)-specific ribonuclease (PARN) is an exoribonuclease/deadenylase that degrades 3'-end poly(A) tails in almost all eukaryotic organisms. Much of the biochemical and structural information on PARN comes from the human enzyme. However, the existence of PARN all along the eukaryotic evolutionary ladder requires further and thorough investigation. Although the complete structure of the full-length human PARN, as well as several aspects of the catalytic mechanism, still remain elusive, many previous studies indicate that PARN can be used as a potent and promising anti-cancer target. In the present study, we attempt to complement the existing structural information on PARN with in-depth bioinformatics analyses, in order to get a hologram of the molecular evolution of PARN's active site. In an effort to draw an outline that allows specific drug design targeting PARN, an unequivocally specific platform was designed for the development of selective modulators focusing on the unique structural and catalytic features of the enzyme. Extensive phylogenetic analysis based on all the publicly available genomes indicated a broad distribution for PARN across eukaryotic species and revealed structurally important amino acids that could be assigned as potentially strong contributors to the regulation of the catalytic mechanism of PARN. Based on the above, we propose a comprehensive in silico model for PARN's catalytic mechanism and, moreover, we developed a 3D pharmacophore model, which was subsequently used for the introduction of the DNP-poly(A) amphipathic substrate analog as a potential inhibitor of PARN. Indeed, biochemical analysis revealed that DNP-poly(A) inhibits PARN competitively. Our approach provides an efficient integrated platform for the rational design of pharmacophore models as well as novel modulators of PARN with therapeutic potential.

  6. Cost Concept Model and Gateway Specification

    DEFF Research Database (Denmark)

    Kejser, Ulla Bøgvad

    2014-01-01

    ... user communities and builds upon our understanding of the requirements, drivers, obstacles and objectives that various stakeholder groups have relating to digital curation. Ultimately, this concept model should provide a critical input to the development and refinement of cost models, as well as helping ... and solution providers, and researchers in follow-up research and development projects. The Framework includes:
    • A Cost Concept Model, which defines the core concepts that should be included in curation cost models;
    • An Implementation Guide for the cost concept model, which provides guidance and proposes questions that should be considered when developing new cost models and refining existing ones;
    • A Gateway Specification Template, which provides standard metadata for each of the core cost concepts and is intended for use by future model developers, model users, and service and solution providers ...

  7. Patient-specific models of cardiac biomechanics

    Science.gov (United States)

    Krishnamurthy, Adarsh; Villongco, Christopher T.; Chuang, Joyce; Frank, Lawrence R.; Nigam, Vishal; Belezzuoli, Ernest; Stark, Paul; Krummen, David E.; Narayan, Sanjiv; Omens, Jeffrey H.; McCulloch, Andrew D.; Kerckhoffs, Roy C. P.

    2013-07-01

    Patient-specific models of cardiac function have the potential to improve diagnosis and management of heart disease by integrating medical images with heterogeneous clinical measurements subject to constraints imposed by physical first principles and prior experimental knowledge. We describe new methods for creating three-dimensional patient-specific models of ventricular biomechanics in the failing heart. Three-dimensional bi-ventricular geometry is segmented from cardiac CT images at end-diastole from patients with heart failure. Human myofiber and sheet architecture is modeled using eigenvectors computed from diffusion tensor MR images from an isolated, fixed human organ-donor heart and transformed to the patient-specific geometric model using large deformation diffeomorphic mapping. Semi-automated methods were developed for optimizing the passive material properties while simultaneously computing the unloaded reference geometry of the ventricles for stress analysis. Material properties of active cardiac muscle contraction were optimized to match ventricular pressures measured by cardiac catheterization, and parameters of a lumped-parameter closed-loop model of the circulation were estimated with a circulatory adaptation algorithm making use of information derived from echocardiography. These components were then integrated to create a multi-scale model of the patient-specific heart. These methods were tested in five heart failure patients from the San Diego Veteran's Affairs Medical Center who gave informed consent. The simulation results showed good agreement with measured echocardiographic and global functional parameters such as ejection fraction and peak cavity pressures.

  8. Morphing patient-specific musculoskeletal models

    DEFF Research Database (Denmark)

    Rasmussen, John; Galibarov, Pavel E.; Al-Munajjed, Amir

    ... or surface scans. Furthermore, we assume that a set of corresponding anatomical landmarks can be identified in the medical imaging data and on the generic musculoskeletal model. A nonlinear transformation, i.e. a morphing, is created by means of radial basis functions that maps point set (i) to point set (ii). The morphing is subsequently used to transform parts of the generic musculoskeletal model into a patient-specific version, thus changing bone shapes, muscle insertion points, joint locations and other geometrical properties. Research questions include how to select point sets and whether ... other conditions may require CT or MRI data. The method and its theoretical assumptions, advantages and limitations are presented, and several examples illustrate morphing to patient-specific models. [1] Carbes S; Tørholm S; Rasmussen J. A Detailed Twenty-six Segments Kinematic Foot Model ...
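    Radial-basis-function morphing of the kind this record describes can be sketched per coordinate: solve for weights that carry the generic-model landmarks onto the patient landmarks, then apply the same interpolant to every other model point. The kernel choice, landmark values and 1D setting below are invented for illustration; real morphing tools use thin-plate splines or similar in 3D:

```python
def rbf(r):
    # Simple cubic kernel for the sketch (not the kernel used by any
    # particular musculoskeletal modeling tool).
    return r * r * r

def solve(A, b):
    """Naive Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_morph(src, dst):
    """Weights w such that sum_j w_j * rbf(|x - src_j|) interpolates dst."""
    A = [[rbf(abs(x - s)) for s in src] for x in src]
    return solve(A, dst)

def morph(x, src, w):
    return sum(wj * rbf(abs(x - sj)) for wj, sj in zip(w, src))

# Generic-model landmark coordinates and their patient-specific positions.
src = [0.0, 1.0, 2.0]
dst = [0.0, 1.2, 2.1]
w = fit_morph(src, dst)
# Landmarks are reproduced exactly; in-between points are interpolated,
# which is how non-landmark geometry (bone surfaces, insertion points)
# gets carried along.
assert all(abs(morph(s, src, w) - d) < 1e-9 for s, d in zip(src, dst))
```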

  9. New Approach to Total Dose Specification for Spacecraft Electronics

    Science.gov (United States)

    Xapsos, Michael

    2017-01-01

    Variability of the space radiation environment is investigated with regard to total dose specification for spacecraft electronics. It is shown to have a significant impact. A new approach is developed for total dose requirements that replaces the radiation design margin concept with failure probability during a mission.

  10. MODEL OF TEACHING PROFESSION SPECIFIC BILATERAL TRANSLATION

    Directory of Open Access Journals (Sweden)

    Yana Fabrychna

    2017-03-01

    Full Text Available The article deals with the author's interpretation of the process of teaching profession-specific bilateral translation to student teachers of English in the Master's program. The goal of developing the model of teaching profession-specific bilateral translation is to determine the logical sequence of educational activities of the teacher as the organizer of the educational process and of the students as its members. English and Ukrainian texts on methods of foreign languages and cultures teaching are defined as the object of study. Learning activities aimed at the development of student teachers of English profession-specific competence in bilateral translation, and a Translation Proficiency Language Portfolio for Student Teachers of English, are suggested as teaching tools. The realization of the model of teaching profession-specific bilateral translation to student teachers of English in the Master's program is suggested within the module topics of the academic discipline «Practice of English as the first foreign language»: Globalization; Localization; Education; Work; The role of new communication technologies in personal and professional development. We believe that the amount of time needed for efficient functioning of the model is 48 academic hours, which was determined by calculating the total number of academic hours allotted for the academic discipline «Practice of English as the first foreign language» in Ukrainian universities. Peculiarities of the model realization as well as learning goals and content of class activities and home self-study work of students are outlined.

  11. Neural network approaches for noisy language modeling.

    Science.gov (United States)

    Li, Jun; Ouazzane, Karim; Kazemian, Hassan B; Afzal, Muhammad Sajid

    2013-11-01

    Text entry from people is not only grammatical and distinct, but also noisy. For example, a user's typing stream contains all the information about the user's interaction with the computer using a QWERTY keyboard, which may include the user's typing mistakes as well as specific vocabulary, typing habits, and typing performance. These features are particularly pronounced in disabled users' typing streams. This paper proposes a new concept called noisy language modeling by further developing information theory, and applies neural networks to one of its specific applications, the typing stream. This paper experimentally uses a neural network approach to analyze disabled users' typing streams, both in general and in specific ways, to identify their typing behaviors and, subsequently, to make typing predictions and typing corrections. In this paper, a focused time-delay neural network (FTDNN) language model, a time gap model, a prediction model based on time gap, and a probabilistic neural network model (PNN) are developed. A 38% first hitting rate (HR) and a 53% first-three HR in symbol prediction are obtained based on the analysis of a user's typing history through FTDNN language modeling, while the modeling results using the time gap prediction model and the PNN model demonstrate that the correction rates lie predominantly between 65% and 90% with the current testing samples, and that 70% of all test scores are above basic correction rates. The modeling process demonstrates that a neural network is a suitable and robust language modeling tool for analyzing a noisy language stream. The research also paves the way for practical application development in areas such as informational analysis, text prediction, and error correction by providing a theoretical basis of neural network approaches for noisy language modeling.
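    The shape of the symbol-prediction task can be seen without a neural network at all. The sketch below uses a plain bigram frequency model over a toy typing stream; the FTDNN described above plays an analogous but far more powerful role, and the example string is invented:

```python
from collections import Counter, defaultdict

def train_bigrams(stream):
    """Count, for each typed symbol, which symbol tends to follow it."""
    model = defaultdict(Counter)
    for a, b in zip(stream, stream[1:]):
        model[a][b] += 1
    return model

def predict(model, last_symbol, k=3):
    """Top-k candidates for the next key, most frequent first."""
    return [s for s, _ in model[last_symbol].most_common(k)]

# Toy typing history standing in for a user's real stream.
history = "the theme of the thesis"
model = train_bigrams(history)
print(predict(model, "t"))  # ['h']  ('t' is always followed by 'h' here)
```

    Hitting-rate evaluation, as reported in the paper, would then count how often the true next symbol appears first (or among the first three) in the predicted list.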

  12. XRLSim model specifications and user interfaces

    Energy Technology Data Exchange (ETDEWEB)

    Young, K.D.; Breitfeller, E.; Woodruff, J.P.

    1989-12-01

    The two chapters in this manual document the engineering development leading to modification of XRLSim -- an Ada-based computer program developed to provide a realistic simulation of an x-ray laser weapon platform. Complete documentation of the FY88 effort to develop XRLSim was published in April 1989, as UCID-21736: XRLSIM Model Specifications and User Interfaces, by L. C. Ng, D. T. Gavel, R. M. Shectman, P. L. Sholl, and J. P. Woodruff. The FY89 effort has been primarily to enhance the x-ray laser weapon-platform model fidelity. Chapter 1 of this manual details enhancements made to XRLSim model specifications during FY89. Chapter 2 provides the user with changes in user interfaces brought about by these enhancements. This chapter is offered as a series of deletions, replacements, and insertions to the original document to enable XRLSim users to implement enhancements developed during FY89.

  13. Learning Actions Models: Qualitative Approach

    DEFF Research Database (Denmark)

    Bolander, Thomas; Gierasimczuk, Nina

    2015-01-01

    First, we check two basic learnability criteria: finite identifiability (conclusively inferring the appropriate action model in finite time) and identifiability in the limit (inconclusive convergence to the right action model). We show that deterministic actions are finitely identifiable, while non-deterministic actions require more learning power: they are identifiable in the limit. We then move on to a particular learning method, which proceeds via restriction of a space of events within a learning-specific action model. This way of learning closely resembles the well-known update method from dynamic epistemic logic. We introduce several different learning methods suited for finite identifiability of particular types of deterministic actions.

  14. Model-specification uncertainty in future forest pest outbreak.

    Science.gov (United States)

    Boulanger, Yan; Gray, David R; Cooke, Barry J; De Grandpré, Louis

    2016-04-01

    Climate change will modify forest pest outbreak characteristics, although there are disagreements regarding the specifics of these changes. A large part of this variability may be attributed to model specifications. As a case study, we developed a consensus model predicting spruce budworm (SBW, Choristoneura fumiferana [Clem.]) outbreak duration using two different predictor data sets and six different correlative methods. The model was used to project outbreak duration and the uncertainty associated with using different data sets and correlative methods (model-specification uncertainty) for 2011-2040, 2041-2070 and 2071-2100, according to three forcing scenarios (RCP 2.6, RCP 4.5 and RCP 8.5). The consensus model showed very high explanatory power and low bias. The model projected a more important northward shift and decrease in outbreak duration under the RCP 8.5 scenario. However, variation in single-model projections increases with time, making future projections highly uncertain. Notably, the magnitude of the shifts in northward expansion, overall outbreak duration and the patterns of outbreak duration at the southern edge were highly variable according to the predictor data set and correlative method used. We also demonstrated that variation in forcing scenarios contributed only slightly to the uncertainty of model projections compared with the two sources of model-specification uncertainty. Our approach helped to quantify model-specification uncertainty in future forest pest outbreak characteristics. It may contribute to sounder decision-making by acknowledging the limits of the projections and help to identify areas where model-specification uncertainty is high. As such, we further stress that this uncertainty should be strongly considered when making forest management plans, notably by adopting adaptive management strategies so as to reduce future risks.

  15. Modeling, Specification and Construction of PLC-programs

    Directory of Open Access Journals (Sweden)

    E. V. Kuzmin

    2013-01-01

    Full Text Available A new approach to the construction of reliable discrete PLC-programs with timers is proposed: programming based on specification and verification. Timers are modelled in a discrete way. For the specification of program behavior we use the linear-time temporal logic LTL. Programming is carried out in the ST-language according to an LTL-specification. The proposed approach to PLC programming is shown by an example, and it provides the ability to analyze the correctness of PLC-programs using the model checking method. The programming requires fulfillment of the following two conditions: (1) the value of each variable should be changed no more than once per full PLC-program execution (per one full working cycle of the PLC); (2) the value of each variable should only be changed in one place of a PLC-program. Under the proposed approach the change of the value of each program variable is described by a pair of LTL-formulas. The first LTL-formula describes situations that increase the value of the corresponding variable; the second LTL-formula specifies conditions leading to a decrease of the variable value. The LTL-formulas (used for specification of the corresponding variable behavior) are constructive in the sense that they determine the PLC-program satisfying the temporal properties they express. Thus, the programming of a PLC is reduced to the construction of an LTL-specification of the behavior of each program variable.
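The pair-of-conditions discipline can be mimicked outside of ST; below is a hedged Python analogue (the variable name `motor` and the input signals are invented) in which each variable has one "increase/set" condition and one "decrease/reset" condition and is assigned at exactly one place per scan cycle, mirroring the paper's two programming rules.

```python
def scan_cycle(state, inputs):
    """One PLC-like working cycle: each variable is assigned exactly once,
    at a single place, from a 'set' and a 'reset' condition (the Python
    analogue of the paper's pair of LTL-formulas)."""
    new_state = dict(state)

    # conditions for the hypothetical output variable 'motor'
    raise_motor = inputs["start"] and not inputs["stop"]  # situations that set it
    lower_motor = inputs["stop"]                          # situations that reset it

    # the single assignment point for 'motor'
    new_state["motor"] = (state["motor"] or raise_motor) and not lower_motor
    return new_state

s = {"motor": False}
s = scan_cycle(s, {"start": True, "stop": False})   # switches on
s = scan_cycle(s, {"start": False, "stop": False})  # stays latched
s = scan_cycle(s, {"start": False, "stop": True})   # switches off
print(s["motor"])
```

Because every variable changes at most once per cycle and in one place, the program's behavior maps directly onto a temporal specification amenable to model checking.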

  16. The Danish national passenger modelModel specification and results

    DEFF Research Database (Denmark)

    Rich, Jeppe; Hansen, Christian Overgaard

    2016-01-01

    First, the paper provides a description of a large-scale forecast model with a discussion of the linkage between population synthesis, demand and assignment. Secondly, the paper gives specific attention to model specification, in particular the choice of functional form and cost-damping; specifically, we suggest a family of logarithmic spline functions and illustrate how it is applied in the model. Thirdly and finally, we evaluate model sensitivity and performance by evaluating the distance distribution and elasticities. We present results in which the spline function is compared with more traditional function types, indicating that the spline function provides a better description of the data. Results are also provided in the form of a back-casting exercise in which the model is tested against the year 2002.
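The paper's exact spline family is not reproduced in the abstract; as a minimal sketch of cost-damping via a spline in log-cost, the snippet below builds a piecewise-linear function of log(cost) from hinge basis functions, with knots and coefficients that are purely hypothetical.

```python
import math

def log_spline(cost, knots, betas):
    """Linear spline in log(cost) built from hinge basis functions:
    f(c) = beta0*log(c) + sum_k beta_k * max(0, log(c) - log(tau_k)).
    Slopes that flatten beyond each knot give the cost-damping effect."""
    x = math.log(cost)
    f = betas[0] * x
    for tau, b in zip(knots, betas[1:]):
        f += b * max(0.0, x - math.log(tau))
    return f

# hypothetical knots (in km) and coefficients, illustration only
knots = [10.0, 50.0]
betas = [-1.0, 0.4, 0.3]   # marginal disutility shrinks beyond each knot

for d in (5, 20, 100):
    print(d, round(log_spline(d, knots, betas), 3))
```

The damping is visible in the slopes: below the first knot the disutility falls at rate -1.0 per unit log-cost, while beyond the second knot the net rate is only -0.3.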

  17. A Conceptual Modeling Approach for OLAP Personalization

    Science.gov (United States)

    Garrigós, Irene; Pardillo, Jesús; Mazón, Jose-Norberto; Trujillo, Juan

    Data warehouses rely on multidimensional models in order to provide decision makers with appropriate structures to intuitively analyze data with OLAP technologies. However, data warehouses may be potentially large, and multidimensional structures become increasingly complex to understand at a glance. Even if a departmental data warehouse (also known as a data mart) is used, these structures would still be too complex. As a consequence, acquiring the required information is more costly than expected, and decision makers using OLAP tools may get frustrated. In this context, current approaches for data warehouse design are focused on deriving a unique OLAP schema for all analysts from their previously stated information requirements, which is not enough to lighten the complexity of the decision-making process. To overcome this drawback, we argue for personalizing multidimensional models for OLAP technologies according to the continuously changing user characteristics, context, requirements and behaviour. In this paper, we present a novel approach to personalizing OLAP systems at the conceptual level based on the underlying multidimensional model of the data warehouse, a user model and a set of personalization rules. The great advantage of our approach is that a personalized OLAP schema is provided for each decision maker, helping to better satisfy their specific analysis needs. Finally, we show the applicability of our approach through a sample scenario based on our CASE tool for data warehouse development.

  18. Context-specific graphical models for discrete longitudinal data

    DEFF Research Database (Denmark)

    Edwards, David; Anantharama Ankinakatte, Smitha

    2015-01-01

    Ron et al. (1998) introduced a rich family of models for discrete longitudinal data called acyclic probabilistic finite automata. These may be represented as directed graphs that embody context-specific conditional independence relations. Here, the approach is developed from a statistical perspective. It is shown that likelihood ratio tests may be constructed using standard contingency table methods, a model selection procedure that minimizes a penalized likelihood criterion is described, and a way to extend the models to incorporate covariates is proposed. The methods are applied...

  19. HEDR modeling approach: Revision 1

    International Nuclear Information System (INIS)

    Shipler, D.B.; Napier, B.A.

    1994-05-01

    This report is a revision of the previous Hanford Environmental Dose Reconstruction (HEDR) Project modeling approach report. This revised report describes the methods used in performing scoping studies and estimating final radiation doses to real and representative individuals who lived in the vicinity of the Hanford Site. The scoping studies and dose estimates pertain to various environmental pathways during various periods of time. The original report discussed the concepts under consideration in 1991. The methods for estimating dose have been refined as understanding of existing data, the scope of pathways, and the magnitudes of dose estimates were evaluated through scoping studies.

  20. HEDR modeling approach: Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Shipler, D.B.; Napier, B.A.

    1994-05-01

    This report is a revision of the previous Hanford Environmental Dose Reconstruction (HEDR) Project modeling approach report. This revised report describes the methods used in performing scoping studies and estimating final radiation doses to real and representative individuals who lived in the vicinity of the Hanford Site. The scoping studies and dose estimates pertain to various environmental pathways during various periods of time. The original report discussed the concepts under consideration in 1991. The methods for estimating dose have been refined as understanding of existing data, the scope of pathways, and the magnitudes of dose estimates were evaluated through scoping studies.

  1. Using Built-In Domain-Specific Modeling Support to Guide Model-Based Test Generation

    Directory of Open Access Journals (Sweden)

    Teemu Kanstrén

    2012-02-01

    Full Text Available We present a model-based testing approach to support automated test generation with domain-specific concepts. The approach involves a language expert, who is experienced in building test models, and domain experts, who know the domain of the system under test. First, we provide a framework to support the language expert in building test models using a full (Java) programming language with the help of simple but powerful modeling elements of the framework. Second, based on the model built with this framework, the toolset automatically forms a domain-specific modeling language that can be used to further constrain and guide test generation from these models by a domain expert. This makes it possible to generate a large set of test cases covering the full model, chosen (constrained) parts of the model, or to manually define specific test cases on top of the model while using concepts familiar to the domain experts.
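The abstract does not show the framework's Java API, so the following is a generic, hedged stand-in for the core idea: test generation as a walk over a behavioral model, with a constraint set standing in for the domain expert's restriction of the generated language. The session model and step names are invented.

```python
import random

# A toy test model: states and (step, next-state) transitions of a
# hypothetical system under test
TRANSITIONS = {
    "logged_out": [("login", "logged_in")],
    "logged_in":  [("browse", "logged_in"), ("buy", "logged_in"),
                   ("logout", "logged_out")],
}

def generate_test(allowed_steps, length, seed=0):
    """Random-walk test generation; 'allowed_steps' plays the role of the
    domain expert's constraint on which model steps may be taken."""
    rng = random.Random(seed)
    state, trace = "logged_out", []
    for _ in range(length):
        options = [(s, nxt) for s, nxt in TRANSITIONS[state] if s in allowed_steps]
        if not options:
            break
        step, state = rng.choice(options)
        trace.append(step)
    return trace

full = generate_test({"login", "browse", "buy", "logout"}, 6)
constrained = generate_test({"login", "browse"}, 6)  # suite without purchases
print(full, constrained)
```

Covering "the full model" versus "chosen parts" then amounts to generating with the unrestricted versus the restricted step set.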

  2. A Modeling Approach for Marine Observatory

    Directory of Open Access Journals (Sweden)

    Charbel Geryes Aoun

    2015-02-01

    Full Text Available The infrastructure of a Marine Observatory (MO) is an UnderWater Sensor Network (UW-SN) that performs collaborative monitoring tasks over a given area. This observation should take into consideration the environmental constraints, since it may require specific tools, materials and devices (cables, servers, etc.). The logical and physical components that are used in these observatories provide data exchanged between the various devices of the environment (Smart Sensor, Data Fusion). These components provide new functionalities or services due to the long running period of the network. In this paper, we present our approach to extending modeling languages to include new domain-specific concepts and constraints. Thus, we propose a meta-model that is used to generate a new design tool (ArchiMO). We illustrate our proposal with an example from the MO domain on object localization with several acoustic sensors. Additionally, we generate the corresponding simulation code for a standard network simulator using our self-developed domain-specific model compiler. Our approach helps to reduce the complexity and time of the design activity of a Marine Observatory. It provides a way to share the different viewpoints of the designers in the MO domain and to obtain simulation results to estimate the network capabilities.

  3. Modeling Approaches in Planetary Seismology

    Science.gov (United States)

    Weber, Renee; Knapmeyer, Martin; Panning, Mark; Schmerr, Nick

    2014-01-01

    Of the many geophysical means that can be used to probe a planet's interior, seismology remains the most direct. Given that the seismic data gathered on the Moon over 40 years ago revolutionized our understanding of the Moon and are still being used today to produce new insight into the state of the lunar interior, it is no wonder that many future missions, both real and conceptual, plan to take seismometers to other planets. To best facilitate the return of high-quality data from these instruments, as well as to further our understanding of the dynamic processes that modify a planet's interior, various modeling approaches are used to quantify parameters such as the amount and distribution of seismicity, tidal deformation, and seismic structure on and of the terrestrial planets. In addition, recent advances in wavefield modeling have permitted a renewed look at seismic energy transmission and the effects of attenuation and scattering, as well as the presence and effect of a core, on recorded seismograms. In this chapter, we will review these approaches.

  4. Predicting Cumulative Incidence Probability: Marginal and Cause-Specific Modelling

    DEFF Research Database (Denmark)

    Scheike, Thomas H.; Zhang, Mei-Jie

    2005-01-01

    cumulative incidence probability; cause-specific hazards; subdistribution hazard; binomial modelling
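The record lists only keywords, but the relation between cause-specific hazards and the cumulative incidence probability can be illustrated numerically; the sketch below uses the standard discrete-time identity (with made-up constant hazards), not the authors' estimation method.

```python
def cumulative_incidence(h_cause, h_other):
    """Discrete-time cumulative incidence for one cause:
    CIF(t) = sum_{s<=t} h_cause[s] * S(s-), where S is the overall
    event-free survival from all competing causes combined."""
    surv, total, cif = 1.0, 0.0, []
    for h1, h2 in zip(h_cause, h_other):
        total += h1 * surv          # events of the cause of interest now
        cif.append(total)
        surv *= (1.0 - h1 - h2)     # survive all causes to the next period
    return cif

# constant per-period hazards for two competing causes
cif = cumulative_incidence([0.1] * 3, [0.05] * 3)
print([round(c, 4) for c in cif])
```

Note that the competing cause depresses the incidence of the cause of interest, which is exactly why the cumulative incidence differs from one minus the cause-specific survival function.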

  5. Patient-Specific Modeling of Intraventricular Hemodynamics

    Science.gov (United States)

    Vedula, Vijay; Marsden, Alison

    2017-11-01

    Heart disease is one of the leading causes of death in the world. Apart from malfunctions in electrophysiology and myocardial mechanics, abnormal hemodynamics is a major factor attributed to heart disease across all ages. Computer simulations offer an efficient means to accurately reproduce in vivo flow conditions and also make predictions of post-operative outcomes and disease progression. We present an experimentally validated computational framework for performing patient-specific modeling of intraventricular hemodynamics. Our modeling framework employs the SimVascular open source software to build an anatomic model and employs robust image registration methods to extract ventricular motion from the image data. We then employ a stabilized finite element solver to simulate blood flow in the ventricles, solving the Navier-Stokes equations in arbitrary Lagrangian-Eulerian (ALE) coordinates by prescribing the wall motion extracted during registration. We model the fluid-structure interaction effects of the cardiac valves using an immersed boundary method and discuss the potential application of this methodology in single ventricle physiology and trans-catheter aortic valve replacement (TAVR). This research is supported in part by the Stanford Child Health Research Institute and the Stanford NIH-NCATS-CTSA through Grant UL1 TR001085 and partly through NIH NHLBI R01 Grant 5R01HL129727-02.

  6. SLS Navigation Model-Based Design Approach

    Science.gov (United States)

    Oliver, T. Emerson; Anzalone, Evan; Geohagan, Kevin; Bernard, Bill; Park, Thomas

    2018-01-01

    The approach spans the management of design requirements to the development of usable models, model requirements, and model verification and validation efforts. The models themselves are represented in C/C++ code and accompanying data files. Under the idealized process, potential ambiguity in specification is reduced because the model must be implementable, unlike a requirement, which is not necessarily subject to this constraint. Further, the models are shown to emulate the hardware during validation. For models developed by the Navigation Team, a common interface/standalone environment was developed. The common environment allows for easy implementation in design and analysis tools. Mechanisms such as unit test cases ensure implementation as the developer intended. The model verification and validation process provides a very high level of component design insight. The origin and implementation of the SLS variant of Model-based Design is described from the perspective of the SLS Navigation Team. The format of the models and the requirements are described. The Model-based Design approach has many benefits but is not without potential complications. Key lessons learned from the implementation of the Model-based Design approach and process, from infancy to verification and certification, are discussed.

  7. Subgrid geoelectric field specification for GIC modeling

    Science.gov (United States)

    Butala, M.; Grawe, M.; Kamalabadi, F.; Makela, J. J.

    2017-12-01

    Geomagnetically induced currents (GICs) result from surface geomagnetic field (B) variation driven by space weather disturbances. For the most intense disturbances, the consequences can range from power grid instability to even widespread failure. Modeling GICs to assess vulnerability requires the specification of the surface geoelectric field (E) at all spatial locations coincident with the electric power system. In this study, we investigate how best to reproduce E given the available sparse, irregularly spaced magnetometer measurements of B and suitable electromagnetic transfer functions (EMTFs) to transform the local B to E. The assessment is made against ground truth from publicly available E measurements provided by the EarthScope magnetotelluric (MT) array, a set of 7 fixed and several transportable joint B and E sensors. The scope of this study spans several dimensions: geomagnetic disturbance intensity, spatial interpolation scheme, and EMTF type, i.e., 1-D models based on studies of local geology and 3-D models derived from the EarthScope MT data.
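As a hedged sketch of the simplest EMTF case (a 1-D uniform half-space under the plane-wave assumption, not the study's interpolation or 3-D models), the snippet below converts a synthetic B time series to E in the frequency domain via the surface impedance Z(w) = sqrt(i*w*mu0*rho); the resistivity and signal parameters are invented.

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability, H/m

def geoelectric_field(by, dt, rho=100.0):
    """1-D plane-wave estimate of the surface geoelectric field:
    E_x(w) = Z(w) * B_y(w) / mu0, with half-space impedance
    Z(w) = sqrt(i * w * mu0 * rho).  by in tesla, dt in seconds."""
    n = len(by)
    omega = 2 * np.pi * np.fft.rfftfreq(n, dt)
    Z = np.sqrt(1j * omega * MU0 * rho)
    return np.fft.irfft(Z * np.fft.rfft(by) / MU0, n)  # V/m

# synthetic 100-s geomagnetic oscillation, 1-s cadence, 100 nT amplitude
t = np.arange(0, 3600.0, 1.0)
by = 100e-9 * np.sin(2 * np.pi * t / 100.0)
e = geoelectric_field(by, 1.0)
print(float(np.max(np.abs(e))))
```

For these parameters the peak field is a fraction of a mV/m; real assessments replace the uniform impedance with measured EMTFs such as those from the EarthScope MT array.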

  8. A computer model of the evolution of specific maximum lifespan.

    Science.gov (United States)

    Miller, A R

    1981-05-01

    To answer the question of why organisms have evolved finite and specific maximum lifespans, I have built and experimentally studied a discrete-event simulation model of the evolution of lifespan. Through natural selection, the model evolves an apparent plateau in maximum lifespan, the height of which is a decreasing function of both the intensity of niche fluctuations and specific fecundity. Evolved lifespan is therefore finite (small and essentially constant over accessible time intervals) and specific. Experiments demonstrate that the plateau is not due to group selection. Instead, it occurs because the rate of increase of maximum lifespan by natural selection - in an environment presenting a finite probability that death will occur prior to reaching the genetically specified maximum - is a decreasing function of maximum lifespan itself and asymptotically approaches zero. This supports in part a class of existing hypotheses that finite lifespan is due to an equilibrium between weak selection, as in the model, and various lifespan-decreasing processes, which however were not simulated in the present experiments. Although the model shows that such counter processes are not strictly necessary for the evolution of finite and specific maximum lifespan, my interpretation of the model's correspondence to organic evolution does imply a counter process, a bias in random genetic drift toward shorter lifespan, that is more general than those previously hypothesized.

  9. Branding approach and valuation models

    Directory of Open Access Journals (Sweden)

    Mamula Tatjana

    2006-01-01

    Full Text Available Much of the skill of marketing and branding nowadays is concerned with building equity for products whose characteristics, pricing, distribution and availability are really quite close to each other. Brands allow the consumer to shop with confidence. The real power of successful brands is that they meet the expectations of those that buy them or, to put it another way, they represent a promise kept. As such they are a contract between a seller and a buyer: if the seller keeps to its side of the bargain, the buyer will be satisfied; if not, the buyer will in future look elsewhere. Understanding consumer perceptions and associations is an important first step to understanding brand preferences and choices. In this paper, we discuss different models for measuring brand value according to a couple of well-known approaches requested by companies. We rely upon several empirical examples.

  10. Seismic Safety Margins Research Program (Phase I). Project VII. Systems analysis specification of computational approach

    International Nuclear Information System (INIS)

    Wall, I.B.; Kaul, M.K.; Post, R.I.; Tagart, S.W. Jr.; Vinson, T.J.

    1979-02-01

    An initial specification is presented of a computational approach for a probabilistic risk assessment model for use in the Seismic Safety Margins Research Program. This model encompasses the whole seismic calculational chain, from seismic input through soil-structure interaction and transfer functions to the probability of component failure, integrating these failures into a system model and thereby estimating the probability of a release of radioactive material to the environment. It is intended that the primary use of this model will be in sensitivity studies to assess the potential conservatism of different modeling elements in the chain and to provide guidance on priorities for research in the seismic design of nuclear power plants.
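The final link of such a chain (component failure probability from a seismic hazard) can be sketched numerically; the following is a generic illustration with a lognormal fragility curve and a hypothetical discretized hazard, not the program's actual model.

```python
import math

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def failure_probability(accels, probs, median=0.5, beta=0.4):
    """Combine the probability of each ground-acceleration level with a
    lognormal component fragility P(fail | a) = Phi(ln(a/median)/beta):
    P(fail) = sum_a P(fail | a) * P(a)."""
    return sum(p * normal_cdf(math.log(a / median) / beta)
               for a, p in zip(accels, probs))

# hypothetical discretized hazard: peak ground acceleration levels (g)
accels = [0.1, 0.3, 0.5, 0.7]
probs = [0.70, 0.20, 0.08, 0.02]   # probability of each level
pf = failure_probability(accels, probs)
print(round(pf, 4))
```

Sensitivity studies of the kind the abstract describes would then vary the fragility parameters (median, beta) or the hazard discretization and observe the effect on the failure probability.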

  11. A Set Theoretical Approach to Maturity Models

    DEFF Research Database (Denmark)

    Lasrado, Lester; Vatrapu, Ravi; Andersen, Kim Normann

    2016-01-01

    Maturity Model research in IS has been criticized for the lack of theoretical grounding, methodological rigor, empirical validations, and ignorance of multiple and non-linear paths to maturity. To address these criticisms, this paper proposes a novel set-theoretical approach to maturity models characterized by equifinality, multiple conjunctural causation, and case diversity. We prescribe methodological guidelines consisting of a six-step procedure to systematically apply set-theoretic methods to conceptualize, develop, and empirically derive maturity models, and provide a demonstration of its application on a social media maturity data-set. Specifically, we employ Necessary Condition Analysis (NCA) to identify maturity stage boundaries as necessary conditions and Qualitative Comparative Analysis (QCA) to arrive at multiple configurations that can be equally effective in progressing to higher maturity stages.

  12. Position-specific isotope modeling of organic micropollutants transformation through different reaction pathways

    DEFF Research Database (Denmark)

    Jin, Biao; Rolle, Massimo

    2016-01-01

    The approach allows a position-specific description of isotope fractionation occurring at different molecular positions. To demonstrate specific features of the modeling approach, we simulated the degradation of three selected organic micropollutants: dichlorobenzamide (BAM), isoproturon (IPU) and diclofenac (DCF). The model accurately reproduces...

  13. A Grammar Inference Approach for Predicting Kinase Specific Phosphorylation Sites

    Science.gov (United States)

    Datta, Sutapa; Mukhopadhyay, Subhasis

    2015-01-01

    Kinase-mediated phosphorylation is a key post-translational modification that plays an important role in regulating various cellular processes and phenotypes. Many diseases, like cancer, are related to signaling defects associated with protein phosphorylation. Characterizing protein kinases and their substrates enhances our ability to understand the mechanism of protein phosphorylation and extends our knowledge of signaling networks, thereby helping us to treat such diseases. Experimental methods for detecting phosphorylation sites are labour intensive and expensive. Also, the manifold increase of protein sequences in the databanks over the years necessitates fast and accurate computational methods for predicting phosphorylation sites in protein sequences. To date, a number of computational methods have been proposed by various researchers for predicting phosphorylation sites, but there remains much scope for improvement. In this communication, we present a simple and novel method based on a Grammatical Inference (GI) approach to automate the prediction of kinase-specific phosphorylation sites. In this regard, we have used the popular GI algorithm Alergia to infer Deterministic Stochastic Finite State Automata (DSFA) that represent the regular grammar corresponding to the phosphorylation sites. Extensive experiments on several datasets generated by us reveal that our inferred grammar successfully predicts phosphorylation sites in a kinase-specific manner. It performs significantly better when compared with other existing phosphorylation site prediction methods. We have also compared our inferred DSFA with two other GI inference algorithms. The DSFA generated by our method performs superior, which indicates that our method is robust and has potential for predicting phosphorylation sites in a kinase-specific manner. PMID:25886273
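A much-simplified stand-in for the grammatical-inference idea is sketched below: build a frequency prefix-tree acceptor from positive site windows and score candidate windows by smoothed log-probability. Real ALERGIA additionally merges statistically compatible states to obtain a compact automaton; that step, and the sequence windows themselves, are omitted or invented here.

```python
import math
from collections import defaultdict

AMINO = "ACDEFGHIKLMNPQRSTVWY"  # 20 standard amino acids

def build_prefix_tree(sequences):
    """Frequency prefix-tree acceptor: counts[prefix][next_symbol].
    ALERGIA would go on to merge compatible states; omitted in this sketch."""
    counts = defaultdict(lambda: defaultdict(int))
    for seq in sequences:
        for i, sym in enumerate(seq):
            counts[seq[:i]][sym] += 1
    return counts

def log_score(counts, seq, alpha=0.1):
    """Smoothed log-probability of a window under the acceptor."""
    lp = 0.0
    for i, sym in enumerate(seq):
        node = counts[seq[:i]]
        total = sum(node.values())
        lp += math.log((node[sym] + alpha) / (total + alpha * len(AMINO)))
    return lp

# hypothetical positive windows around a phosphorylation site
positives = ["RRASVA", "RRASLG", "KRASVA", "RRPSVA"]
tree = build_prefix_tree(positives)
print(log_score(tree, "RRASVA") > log_score(tree, "GGGGGG"))
```

In a kinase-specific setting one acceptor would be inferred per kinase, and a window would be assigned to the kinase whose automaton scores it highest.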

  14. Scaling up biomass gasifier use: an application-specific approach

    International Nuclear Information System (INIS)

    Ghosh, Debyani; Sagar, Ambuj D.; Kishore, V.V.N.

    2006-01-01

    Biomass energy accounts for about 11% of the global primary energy supply, and it is estimated that about 2 billion people worldwide depend on biomass for their energy needs. Yet, most of the use of biomass is in a primitive and inefficient manner, primarily in developing countries, leading to a host of adverse implications for human health, environment, workplace conditions, and social well-being. Therefore, the utilization of biomass in a clean and efficient manner to deliver modern energy services to the world's poor remains an imperative for the development community. One possible approach to do this is through the use of biomass gasifiers. Although significant efforts have been directed towards developing and deploying biomass gasifiers in many countries, scaling up their dissemination remains an elusive goal. Based on an examination of biomass gasifier development, demonstration, and deployment efforts in India, a country with more than two decades of experience in biomass gasifier development and dissemination, this article identifies a number of barriers that have hindered widespread deployment of biomass gasifier-based energy systems. It also suggests a possible approach for moving forward, which involves a focus on specific application areas that satisfy a set of criteria critical to the deployment of biomass gasifiers, and then tailoring the scaling-up strategy to the characteristics of the user groups for each application. Our technical, financial, economic and institutional analysis suggests that an initial focus on four categories of applications (small and medium enterprises, the informal sector, biomass-processing industries, and some rural areas) may be particularly feasible and fruitful.

  15. A Bayesian approach to model uncertainty

    International Nuclear Information System (INIS)

    Buslik, A.

    1994-01-01

    A Bayesian approach to model uncertainty is taken. For the case of a finite number of alternative models, the model uncertainty is equivalent to parameter uncertainty. A derivation based on Savage's partition problem is given
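The abstract's central point, that with a finite set of alternative models the model uncertainty behaves like parameter uncertainty, can be shown in a few lines: place a prior over the models, update with the data likelihoods, and average predictions. The priors, likelihoods, and predictions below are hypothetical.

```python
def model_posterior(priors, likelihoods):
    """Posterior over a finite set of alternative models:
    P(M_i | D) is proportional to P(D | M_i) * P(M_i)."""
    joint = [p * l for p, l in zip(priors, likelihoods)]
    z = sum(joint)
    return [j / z for j in joint]

# three alternative models, equal priors, hypothetical data likelihoods
priors = [1 / 3, 1 / 3, 1 / 3]
likelihoods = [0.02, 0.10, 0.04]
post = model_posterior(priors, likelihoods)
print([round(p, 3) for p in post])

# Bayesian model averaging of a model-specific prediction
predictions = [1.0, 2.0, 1.5]
bma = sum(p * y for p, y in zip(post, predictions))
print(round(bma, 3))
```

The averaged prediction weights each model by how well it explains the data, which is the discrete analogue of integrating out an uncertain parameter.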

  16. Specific Cell (Re-)Programming: Approaches and Perspectives.

    Science.gov (United States)

    Hausburg, Frauke; Jung, Julia Jeannine; David, Robert

    2018-01-01

    Many disorders are manifested by dysfunction of key cell types or their disturbed integration in complex organs. Thereby, adult organ systems often bear restricted self-renewal potential and are incapable of achieving functional regeneration. This underlies the need for novel strategies in the field of cell (re-)programming-based regenerative medicine as well as for drug development in vitro. The regenerative field has been hampered by the restricted availability of adult stem cells and the potentially hazardous features of pluripotent embryonic stem cells (ESCs) and induced pluripotent stem cells (iPSCs). Moreover, ethical concerns and legal restrictions regarding the generation and use of ESCs still exist. The establishment of direct reprogramming protocols for various therapeutically valuable somatic cell types has overcome some of these limitations. Meanwhile, new perspectives for the safe and efficient generation of different specified somatic cell types have emerged from numerous approaches relying on exogenous expression of lineage-specific transcription factors, coding and noncoding RNAs, and chemical compounds. It should be of highest priority to develop protocols for the production of mature and physiologically functional cells with properties ideally matching those of their endogenous counterparts. Their availability can bring together basic research, drug screening, safety testing, and ultimately clinical trials. Here, we highlight the remarkable successes in cellular (re-)programming, which have greatly advanced the field of regenerative medicine in recent years. In particular, we review recent progress on the generation of cardiomyocyte subtypes, with a focus on cardiac pacemaker cells.

  17. Computer models of bacterial cells: from generalized coarse-grained to genome-specific modular models

    International Nuclear Information System (INIS)

    Nikolaev, Evgeni V; Atlas, Jordan C; Shuler, Michael L

    2006-01-01

    We discuss a modular modelling framework to rapidly develop mathematical models of bacterial cells that would explicitly link genomic details to cell physiology and population response. An initial step in this approach is the development of a coarse-grained model, describing pseudo-chemical interactions between lumped species. A hybrid model of interest can then be constructed by embedding genome-specific detail for a particular cellular subsystem (e.g. central metabolism), called here a module, into the coarse-grained model. Specifically, a new strategy for sensitivity analysis of the cell division limit cycle is introduced to identify which pseudo-molecular processes should be delumped to implement a particular biological function in a growing cell (e.g. ethanol overproduction or pathogen viability). To illustrate the modeling principles and highlight computational challenges, the Cornell coarse-grained model of Escherichia coli B/r-A is used to benchmark the proposed framework

  18. Chemical cleaning specification: few tube test model

    International Nuclear Information System (INIS)

    Hampton, L.V.; Simpson, J.L.

    1979-09-01

    The specification is for the waterside chemical cleaning of the 2 1/4 Cr - 1 Mo steel steam generator tubes. It describes the reagents and conditions for post-chemical cleaning passivation of the evaporator tubes

  19. Learning Action Models: Qualitative Approach

    NARCIS (Netherlands)

    Bolander, T.; Gierasimczuk, N.; van der Hoek, W.; Holliday, W.H.; Wang, W.-F.

    2015-01-01

    In dynamic epistemic logic, actions are described using action models. In this paper we introduce a framework for studying learnability of action models from observations. We present first results concerning propositional action models. First we check two basic learnability criteria: finite

  20. Gaussian graphical modeling reveals specific lipid correlations in glioblastoma cells

    Science.gov (United States)

    Mueller, Nikola S.; Krumsiek, Jan; Theis, Fabian J.; Böhm, Christian; Meyer-Bäse, Anke

    2011-06-01

    Advances in high-throughput measurements of biological specimens necessitate the development of biologically driven computational techniques. To understand the molecular level of many human diseases, such as cancer, lipid quantifications have been shown to offer an excellent opportunity to reveal disease-specific regulations. The data analysis of the cell lipidome, however, remains a challenging task and cannot be accomplished solely based on intuitive reasoning. We have developed a method to identify a lipid correlation network which is entirely disease-specific. A powerful method to correlate experimentally measured lipid levels across the various samples is a Gaussian Graphical Model (GGM), which is based on partial correlation coefficients. In contrast to regular Pearson correlations, partial correlations aim to identify only direct correlations while eliminating indirect associations. Conventional GGM calculations on the entire dataset can, however, not provide information on whether a correlation is truly disease-specific with respect to the disease samples and not a correlation of control samples. Thus, we implemented a novel differential GGM approach unraveling only the disease-specific correlations, and applied it to the lipidome of immortal Glioblastoma tumor cells. A large set of lipid species were measured by mass spectrometry in order to evaluate lipid remodeling as a result to a combination of perturbation of cells inducing programmed cell death, while the other perturbations served solely as biological controls. With the differential GGM, we were able to reveal Glioblastoma-specific lipid correlations to advance biomedical research on novel gene therapies.
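The partial-correlation machinery behind a GGM can be sketched in a few lines: invert the sample covariance to obtain the precision matrix, rescale it into partial correlations, and compare condition-specific networks to flag disease-specific edges. This is a generic illustration of the approach described in the abstract, not the authors' implementation; the function names and the edge threshold are placeholders.

```python
import numpy as np

def partial_correlations(data):
    """Partial correlation matrix from a samples-by-variables array,
    computed by inverting the sample covariance (precision matrix)."""
    prec = np.linalg.inv(np.cov(data, rowvar=False))
    d = np.sqrt(np.diag(prec))
    pcor = -prec / np.outer(d, d)   # rho_ij = -p_ij / sqrt(p_ii * p_jj)
    np.fill_diagonal(pcor, 1.0)
    return pcor

def differential_edges(disease, control, threshold=0.3):
    """Boolean adjacency of edges whose partial correlation differs
    between disease and control samples beyond a chosen threshold
    (a crude stand-in for the paper's differential GGM test)."""
    diff = partial_correlations(disease) - partial_correlations(control)
    return np.abs(diff) > threshold
```

In practice a regularized estimator (e.g. a graphical lasso) and a proper significance test would replace the raw inversion and the fixed threshold; the sketch only shows the direct-versus-indirect-correlation idea.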

  1. Fusarium diversity in soil using a specific molecular approach and a cultural approach.

    Science.gov (United States)

    Edel-Hermann, Véronique; Gautheron, Nadine; Mounier, Arnaud; Steinberg, Christian

    2015-04-01

    Fusarium species are ubiquitous in soil. They cause plant and human diseases and can produce mycotoxins. Surveys of Fusarium species diversity in environmental samples usually rely on laborious culture-based methods. In the present study, we have developed a molecular method to analyze Fusarium diversity directly from soil DNA. We designed primers targeting the translation elongation factor 1-alpha (EF-1α) gene and demonstrated their specificity toward Fusarium using a large collection of fungi. We used the specific primers to construct a clone library from three contrasting soils. Sequence analysis confirmed the specificity of the assay, with 750 clones identified as Fusarium and distributed among eight species or species complexes. The Fusarium oxysporum species complex (FOSC) was the most abundant one in the three soils, followed by the Fusarium solani species complex (FSSC). We then compared our molecular approach results with those obtained by isolating Fusarium colonies on two culture media and identifying species by sequencing part of the EF-1α gene. The 750 isolates were distributed into eight species or species complexes, with the same dominant species as with the cloning method. Sequence diversity was much higher in the clone library than in the isolate collection. The molecular approach proved to be a valuable tool to assess Fusarium diversity in environmental samples. Combined with high throughput sequencing, it will allow for in-depth analysis of large numbers of samples. Published by Elsevier B.V.

  2. Face recognition: a model specific ability

    Directory of Open Access Journals (Sweden)

    Jeremy B Wilmer

    2014-10-01

    Full Text Available In our everyday lives, we view it as a matter of course that different people are good at different things. It can be surprising, in this context, to learn that most of what is known about cognitive ability variation across individuals concerns the broadest of all cognitive abilities, often labeled g. In contrast, our knowledge of specific abilities, those that correlate little with g, is severely constrained. Here, we draw upon our experience investigating an exceptionally specific ability, face recognition, to make the case that many specific abilities could easily have been missed. In making this case, we derive key insights from earlier false starts in the measurement of face recognition’s variation across individuals, and we highlight the convergence of factors that enabled the recent discovery that this variation is specific. We propose that the case of face recognition ability illustrates a set of tools and perspectives that could accelerate fruitful work on specific cognitive abilities. By revealing relatively independent dimensions of human ability, such work would enhance our capacity to understand the uniqueness of individual minds.

  3. Face recognition: a model specific ability.

    Science.gov (United States)

    Wilmer, Jeremy B; Germine, Laura T; Nakayama, Ken

    2014-01-01

    In our everyday lives, we view it as a matter of course that different people are good at different things. It can be surprising, in this context, to learn that most of what is known about cognitive ability variation across individuals concerns the broadest of all cognitive abilities; an ability referred to as general intelligence, general mental ability, or just g. In contrast, our knowledge of specific abilities, those that correlate little with g, is severely constrained. Here, we draw upon our experience investigating an exceptionally specific ability, face recognition, to make the case that many specific abilities could easily have been missed. In making this case, we derive key insights from earlier false starts in the measurement of face recognition's variation across individuals, and we highlight the convergence of factors that enabled the recent discovery that this variation is specific. We propose that the case of face recognition ability illustrates a set of tools and perspectives that could accelerate fruitful work on specific cognitive abilities. By revealing relatively independent dimensions of human ability, such work would enhance our capacity to understand the uniqueness of individual minds.

  4. The 727 Approach Energy Management System avionics specification (preliminary)

    Energy Technology Data Exchange (ETDEWEB)

    Jackson, D.O.; Lambregts, A.A.

    1976-08-01

    Hardware and software requirements for an Approach Energy Management System (AEMS) consisting of an airborne digital computer and cockpit displays are presented. The displays provide the pilot with a visual indication of when to manually operate the gear, flaps, and throttles during a delayed flap approach so as to reduce approach time, fuel consumption, and community noise. The AEMS is an independent system that does not interact with other navigation or control systems, and is compatible with manually flown or autopilot coupled approaches. Operational use of the AEMS requires a DME ground station colocated with the flight path reference.

  5. SPECIFIC MODELS OF REPRESENTING THE INTELLECTUAL CAPITAL

    Directory of Open Access Journals (Sweden)

    Andreea Feraru

    2014-12-01

    Full Text Available Various scientists in the modern age of management have launched different models for evaluating intellectual capital, and some of these models are analysed critically in this study, too. Most authors examine intellectual capital from a static perspective and focus on the development of its various evaluation models. In this chapter we surveyed the classical static models: Sveiby, Edvinsson, Balanced Scorecard, as well as the canonical model of intellectual capital. In a spectral dynamic analysis, organisational intellectual capital is structured into organisational knowledge, organisational intelligence and organisational values, and their value is built on certain mechanisms entitled integrators, whose chief constitutive elements are individual knowledge, individual intelligence and individual cultural values. Organizations, as employers, must especially reconsider the work of those employees who value knowledge, because such employees are free to choose how, and especially where, they are inclined to invest their own energy, skills and time, and they can be treated as freelancers or as small entrepreneurs.

  6. modeling, observation and control, a multi-model approach

    OpenAIRE

    Elkhalil, Mansoura

    2011-01-01

    This thesis is devoted to the control of systems which dynamics can be suitably described by a multimodel approach from an investigation study of a model reference adaptative control performance enhancement. Four multimodel control approaches have been proposed. The first approach is based on an output reference model control design. A successful experimental validation involving a chemical reactor has been carried out. The second approach is based on a suitable partial state model reference ...

  7. Combining IVUS and Optical Coherence Tomography for More Accurate Coronary Cap Thickness Quantification and Stress/Strain Calculations: A Patient-Specific Three-Dimensional Fluid-Structure Interaction Modeling Approach.

    Science.gov (United States)

    Guo, Xiaoya; Giddens, Don P; Molony, David; Yang, Chun; Samady, Habib; Zheng, Jie; Mintz, Gary S; Maehara, Akiko; Wang, Liang; Pei, Xuan; Li, Zhi-Yong; Tang, Dalin

    2018-04-01

    Accurate cap thickness and stress/strain quantifications are of fundamental importance for vulnerable plaque research. Virtual histology intravascular ultrasound (VH-IVUS) sets cap thickness to zero when the cap is under the resolution limit and IVUS does not see it. An innovative modeling approach combining IVUS and optical coherence tomography (OCT) is introduced for cap thickness quantification and more accurate cap stress/strain calculations. In vivo IVUS and OCT coronary plaque data were acquired with informed consent obtained. IVUS and OCT images were merged to form the IVUS + OCT data set, with biplane angiography providing three-dimensional (3D) vessel curvature. For components where VH-IVUS set zero cap thickness (i.e., no cap), a cap was added with minimum cap thickness set as 50 and 180 μm to generate IVUS50 and IVUS180 data sets for model construction, respectively. 3D fluid-structure interaction (FSI) models based on IVUS + OCT, IVUS50, and IVUS180 data sets were constructed to investigate the impact of cap thickness on stress/strain calculations. Compared to IVUS + OCT, IVUS50 underestimated mean cap thickness (27 slices) by 34.5% and overestimated mean cap stress by 45.8% (96.4 versus 66.1 kPa). IVUS50 maximum cap stress was 59.2% higher than that from the IVUS + OCT model (564.2 versus 354.5 kPa). Differences between IVUS and IVUS + OCT models for cap strain and flow shear stress (FSS) were modest (cap strain <12%; FSS <6%). IVUS + OCT data and models could provide more accurate cap thickness and stress/strain calculations, which will serve as a basis for further plaque investigations.

  8. SPECIFICITIES OF COMPETENCY APPROACH IMPLEMENTATION: UKRAINIAN AND EUROPEAN EXPERIENCE

    Directory of Open Access Journals (Sweden)

    Oksana V. Ovcharuk

    2010-08-01

    Full Text Available The article deals with the problems of implementing the competency approach in the formation of education content. A comparative analysis of the European and Ukrainian experience in discussing lists of key competencies is presented. Perspectives on integrating the competency approach into the content of Ukrainian education curricula are outlined.

  9. Domain-specific modeling enabling full code generation

    CERN Document Server

    Kelly, Steven

    2007-01-01

    Domain-Specific Modeling (DSM) is the latest approach to software development, promising to greatly increase the speed and ease of software creation. Early adopters of DSM have been enjoying productivity increases of 500–1000% in production for over a decade. This book introduces DSM and offers examples from various fields to illustrate to experienced developers how DSM can improve software development in their teams. Two authorities in the field explain what DSM is, why it works, and how to successfully create and use a DSM solution to improve productivity and quality. Divided into four parts, the book covers: background and motivation; fundamentals; in-depth examples; and creating DSM solutions. There is an emphasis throughout the book on practical guidelines for implementing DSM, including how to identify the necessary language constructs, how to generate full code from models, and how to provide tool support for a new DSM language. The example cases described in the book are available at the book's Website, www.dsmbook....

  10. Modeling the Cumulative Effects of Social Exposures on Health: Moving beyond Disease-Specific Models

    Directory of Open Access Journals (Sweden)

    Heather L. White

    2013-03-01

    Full Text Available The traditional explanatory models used in epidemiology are “disease specific”, identifying risk factors for specific health conditions. Yet social exposures lead to a generalized, cumulative health impact which may not be specific to one illness. Disease-specific models may therefore misestimate social factors’ effects on health. Using data from the Canadian Community Health Survey and the Canada 2001 Census, we construct and compare “disease-specific” and “generalized health impact” (GHI) models to gauge the negative health effects of one social exposure: socioeconomic position (SEP). We use logistic and multinomial multilevel modeling with neighbourhood-level material deprivation and individual-level education and household income to compare and contrast the two approaches. In disease-specific models, the social determinants under study were each associated with the health conditions of interest. However, larger effect sizes were apparent when outcomes were modeled as compound health problems (0, 1, 2, or 3+ conditions) using the GHI approach. To more accurately estimate social exposures’ impacts on population health, researchers should consider a GHI framework.

  11. Selection of hydrologic modeling approaches for climate change assessment: A comparison of model scale and structures

    Science.gov (United States)

    Surfleet, Christopher G.; Tullos, Desirèe; Chang, Heejun; Jung, Il-Won

    2012-09-01

    A wide variety of approaches to hydrologic (rainfall-runoff) modeling of river basins confounds our ability to select, develop, and interpret models, particularly in the evaluation of prediction uncertainty associated with climate change assessment. To inform the model selection process, we characterized and compared three structurally-distinct approaches and spatial scales of parameterization to modeling catchment hydrology: a large-scale approach (using the VIC model; 671,000 km2 area), a basin-scale approach (using the PRMS model; 29,700 km2 area), and a site-specific approach (the GSFLOW model; 4700 km2 area) forced by the same future climate estimates. For each approach, we present measures of fit to historic observations and predictions of future response, as well as estimates of model parameter uncertainty, when available. While the site-specific approach generally had the best fit to historic measurements, the performance of the model approaches varied. The site-specific approach generated the best fit at unregulated sites, the large scale approach performed best just downstream of flood control projects, and model performance varied at the farthest downstream sites where streamflow regulation is mitigated to some extent by unregulated tributaries and water diversions. These results illustrate how selection of a modeling approach and interpretation of climate change projections require (a) appropriate parameterization of the models for climate and hydrologic processes governing runoff generation in the area under study, (b) understanding and justifying the assumptions and limitations of the model, and (c) estimates of uncertainty associated with the modeling approach.

  12. Interoperable transactions in business models: A structured approach

    NARCIS (Netherlands)

    Weigand, H.; Verharen, E.; Dignum, F.P.M.

    1996-01-01

    Recent database research has given much attention to the specification of "flexible" transactions that can be used in interoperable systems. Starting from a quite different angle, Business Process Modelling has approached the area of communication modelling as well (the Language/Action

  13. Global energy modeling - A biophysical approach

    Energy Technology Data Exchange (ETDEWEB)

    Dale, Michael

    2010-09-15

    This paper contrasts the standard economic approach to energy modelling with energy models using a biophysical approach. Neither of these approaches includes changing energy-returns-on-investment (EROI) due to declining resource quality or the capital intensive nature of renewable energy sources. Both of these factors will become increasingly important in the future. An extension to the biophysical approach is outlined which encompasses a dynamic EROI function that explicitly incorporates technological learning. The model is used to explore several scenarios of long-term future energy supply especially concerning the global transition to renewable energy sources in the quest for a sustainable energy system.
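As a toy illustration of a dynamic EROI function of the kind the abstract describes, one can combine a depletion term that erodes EROI as cumulative output grows with a learning term that partially offsets it. All functional forms and parameter values below are assumptions for illustration, not those of the cited model.

```python
def dynamic_eroi(cumulative_output, eroi_0=30.0, depletion_exp=0.2,
                 learning_rate=0.1):
    """Illustrative dynamic energy-return-on-investment (EROI):
    declining resource quality reduces EROI with cumulative output,
    while technological learning partially offsets the decline.
    Power-law forms and all parameters are hypothetical."""
    depletion = (1.0 + cumulative_output) ** (-depletion_exp)   # resource quality decline
    learning = (1.0 + cumulative_output) ** learning_rate       # experience-curve gains
    return eroi_0 * depletion * learning
```

With the depletion exponent larger than the learning rate, net EROI declines over time, which is the qualitative behavior such scenario models explore.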

  14. From AADL Model to LNT Specification

    OpenAIRE

    Mkaouar, Hana; Zalila, Bechir; Hugues, Jérôme; Jmaiel, Mohamed

    2015-01-01

    The verification of distributed real-time systems designed by architectural languages such as AADL (Architecture Analysis and Design Language) is a research challenge. These systems are often used in safety- critical domains where one mistake can result in physical damages and even life loss. In such domains, formal methods are a suitable solution for rigorous analysis. This paper studies the formal verification of distributed real-time systems modelled with AADL. We transform AADL model to a...

  15. Challenges and opportunities for integrating lake ecosystem modelling approaches

    Science.gov (United States)

    Mooij, Wolf M.; Trolle, Dennis; Jeppesen, Erik; Arhonditsis, George; Belolipetsky, Pavel V.; Chitamwebwa, Deonatus B.R.; Degermendzhy, Andrey G.; DeAngelis, Donald L.; Domis, Lisette N. De Senerpont; Downing, Andrea S.; Elliott, J. Alex; Ruberto, Carlos Ruberto; Gaedke, Ursula; Genova, Svetlana N.; Gulati, Ramesh D.; Hakanson, Lars; Hamilton, David P.; Hipsey, Matthew R.; Hoen, Jochem 't; Hulsmann, Stephan; Los, F. Hans; Makler-Pick, Vardit; Petzoldt, Thomas; Prokopkin, Igor G.; Rinke, Karsten; Schep, Sebastiaan A.; Tominaga, Koji; Van Dam, Anne A.; Van Nes, Egbert H.; Wells, Scott A.; Janse, Jan H.

    2010-01-01

    A large number and wide variety of lake ecosystem models have been developed and published during the past four decades. We identify two challenges for making further progress in this field. One such challenge is to avoid developing more models largely following the concept of others ('reinventing the wheel'). The other challenge is to avoid focusing on only one type of model, while ignoring new and diverse approaches that have become available ('having tunnel vision'). In this paper, we aim at improving the awareness of existing models and knowledge of concurrent approaches in lake ecosystem modelling, without covering all possible model tools and avenues. First, we present a broad variety of modelling approaches. To illustrate these approaches, we give brief descriptions of rather arbitrarily selected sets of specific models. We deal with static models (steady state and regression models), complex dynamic models (CAEDYM, CE-QUAL-W2, Delft 3D-ECO, LakeMab, LakeWeb, MyLake, PCLake, PROTECH, SALMO), structurally dynamic models and minimal dynamic models. We also discuss a group of approaches that could all be classified as individual based: super-individual models (Piscator, Charisma), physiologically structured models, stage-structured models and trait-based models. We briefly mention genetic algorithms, neural networks, Kalman filters and fuzzy logic. Thereafter, we zoom in, as an in-depth example, on the multi-decadal development and application of the lake ecosystem model PCLake and related models (PCLake Metamodel, Lake Shira Model, IPH-TRIM3D-PCLake). In the discussion, we argue that while the historical development of each approach and model is understandable given its 'leading principle', there are many opportunities for combining approaches. We take the point of view that a single 'right' approach does not exist and should not be strived for. Instead, multiple modelling approaches, applied concurrently to a given problem, can help develop an integrative

  16. A Unified Approach to Modeling and Programming

    DEFF Research Database (Denmark)

    Madsen, Ole Lehrmann; Møller-Pedersen, Birger

    2010-01-01

SIMULA was a language for modeling and programming and provided a unified approach to modeling and programming, in contrast to methodologies based on structured analysis and design. The current development seems to be going in the direction of separation of modeling and programming. The goal of this paper is to go back to the future and get inspiration from SIMULA and propose a unified approach. In addition to reintroducing the contributions of SIMULA and the Scandinavian approach to object-oriented programming, we do this by discussing a number of issues in modeling and programming and argue why we ...

  17. General and specific consciousness: a first-order representationalist approach

    Science.gov (United States)

    Mehta, Neil; Mashour, George A.

    2013-01-01

    It is widely acknowledged that a complete theory of consciousness should explain general consciousness (what makes a state conscious at all) and specific consciousness (what gives a conscious state its particular phenomenal quality). We defend first-order representationalism, which argues that consciousness consists of sensory representations directly available to the subject for action selection, belief formation, planning, etc. We provide a neuroscientific framework for this primarily philosophical theory, according to which neural correlates of general consciousness include prefrontal cortex, posterior parietal cortex, and non-specific thalamic nuclei, while neural correlates of specific consciousness include sensory cortex and specific thalamic nuclei. We suggest that recent data support first-order representationalism over biological theory, higher-order representationalism, recurrent processing theory, information integration theory, and global workspace theory. PMID:23882231

  18. Szekeres models: a covariant approach

    Science.gov (United States)

    Apostolopoulos, Pantelis S.

    2017-05-01

    We exploit the 1+1+2 formalism to covariantly describe the inhomogeneous and anisotropic Szekeres models. It is shown that an average scale length can be defined covariantly which satisfies a 2d equation of motion driven from the effective gravitational mass (EGM) contained in the dust cloud. The contributions to the EGM are encoded in the energy density of the dust fluid and the free gravitational field E_ab. We show that the quasi-symmetric property of the Szekeres models is justified through the existence of 3 independent intrinsic Killing vector fields (IKVFs). In addition the notions of the apparent and absolute apparent horizons are briefly discussed and we give an alternative gauge-invariant form to define them in terms of the kinematical variables of the spacelike congruences. We argue that the proposed program can be used in order to express Sachs’ optical equations in a covariant form and analyze the confrontation of a spatially inhomogeneous irrotational overdense fluid model with the observational data.

  19. XRLSim model specifications and user interfaces

    Energy Technology Data Exchange (ETDEWEB)

    Ng, L.C.; Gavel, D.T.; Shectman, R.M.; Sholl, P.L.; Woodruff, J.P.

    1989-04-01

    This report summarizes our FY88 engineering development effort of XRLSim --- an Ada-based computer program developed to provide a realistic simulation of an x-ray laser weapon platform. XRLSim can be used to assess platform requirements in track handoff, target acquisition, tracking, and pointing as well as engagement time line. Development effort continues in FY89 to enhance the model fidelity of the platform and to improve the performance of the tracking algorithms. Simulated targets available in XRLSim include midcourse reentry vehicles and orbiting satellites. At this time, the current version of XRLSim can only simulate a one-on-one engagement scenario. 8 refs., 26 figs.

  20. A historical approach to English for Specific Purposes in Ecuador

    Directory of Open Access Journals (Sweden)

    Maritza Sandra Pibaque Pionce

    2015-01-01

    Full Text Available The systematization of experiences in the process of doctoral training in Educational Sciences allows proposing a historical analysis of English for Specific Purposes in Ecuador, particularly for International Business Degree students, as the basis for the theoretical analysis and scientific assessment of educational research. The definition and justification of the periods established over time are presented, with the aim of assessing the historical development of the teaching-learning process of English for Specific Purposes related to business formation, which makes it necessary to address the effectiveness of English teaching. Thus the linguo-cultural competence focuses on the ability to negotiate cultural meanings and implement effective communication behaviors, connecting aspects of language directly with specific cultural facts of a society. This research work has been based on bibliographic analysis and on inductive-deductive and analysis-synthesis methods.

  1. Multiple Model Approaches to Modelling and Control,

    DEFF Research Database (Denmark)

    appeal in building systems which operate robustly over a wide range of operating conditions by decomposing them into a number of simpler linear modelling or control problems, even for nonlinear modelling or control problems. This appeal has been a factor in the development of increasingly popular `local ... to problems in the process industries, biomedical applications and autonomous systems. The successful application of the ideas to demanding problems is already encouraging, but creative development of the basic framework is needed to better allow the integration of human knowledge with automated learning. ... The underlying question is `How should we partition the system - what is `local'?'. This book presents alternative ways of bringing submodels together, which lead to varying levels of performance and insight. Some are further developed for autonomous learning of parameters from data, while others have focused ...

  2. System Behavior Models: A Survey of Approaches

    Science.gov (United States)

    2016-06-01

    A comparison of approaches ... identical state space results. The combined state space graph of the Petri model allowed a quick assessment of all potential states but was more cumbersome to build than the MP model.

  3. Artificial neural network approach for estimation of surface specific ...

    Indian Academy of Sciences (India)

    Microwave sensor MSMR (Multifrequency Scanning Microwave Radiometer) data onboard Oceansat-1 was used for retrieval of monthly averages of near-surface specific humidity and air temperature by means of an Artificial Neural Network (ANN). The MSMR measures the microwave radiances in 8 channels at ...

  4. Current approaches to gene regulatory network modelling

    Directory of Open Access Journals (Sweden)

    Brazma Alvis

    2007-09-01

    Full Text Available Many different approaches have been developed to model and simulate gene regulatory networks. We propose the following categories for gene regulatory network models: network parts lists, network topology models, network control logic models, and dynamic models. Here we describe some examples for each of these categories. We study the topology of gene regulatory networks in yeast in more detail, comparing a direct network derived from transcription factor binding data and an indirect network derived from genome-wide expression data in mutants. Regarding the network dynamics, we briefly describe discrete and continuous approaches to network modelling, then describe a hybrid model called the Finite State Linear Model and demonstrate that some simple network dynamics can be simulated in this model.
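The "discrete" end of the dynamic-model spectrum mentioned above can be illustrated with a minimal synchronous Boolean network. The three genes and their update rules here are invented for illustration and are not taken from the article.

```python
# Minimal synchronous Boolean network: each gene's next state is a
# Boolean function of the current states. Genes A, B, C and their
# regulatory rules are hypothetical.
def step(state):
    a, b, c = state["A"], state["B"], state["C"]
    return {
        "A": not c,      # C represses A
        "B": a,          # A activates B
        "C": a and b,    # A and B jointly activate C
    }

def trajectory(state, n_steps):
    """All states visited, starting from `state`, over n_steps updates."""
    states = [state]
    for _ in range(n_steps):
        state = step(state)
        states.append(state)
    return states
```

Starting from A on and B, C off, the network cycles through a small set of states; finding such attractors is one standard use of discrete network models.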

  5. A Polyadic pi-Calculus Approach for the Formal Specification of UML-RT

    Directory of Open Access Journals (Sweden)

    J. M. Bezerra

    2009-01-01

    Full Text Available UML-RT is a UML real-time profile that allows modeling event-driven and distributed systems; however, it is not a formal specification language. This paper proposes a formal approach for UML-RT through a mapping of the UML-RT communicating elements into the π-calculus (or pi-calculus) process algebra. The formal approach both captures the intended behavior of the system being modeled and provides a rigorous and unambiguous system description. Our proposal differentiates itself from other research work because we map UML-RT to the π-calculus, and we allow the mapping of dynamic reconfiguration of UML-RT unwired ports. We illustrate the usage and applicability of the mapping through three examples. The first example focuses on explaining the mapping; the second one aims to demonstrate the use of the π-calculus definitions to verify system requirements; the third case is an example of mobile processes called the Handover protocol.

  6. Distributed simulation a model driven engineering approach

    CERN Document Server

    Topçu, Okan; Oğuztüzün, Halit; Yilmaz, Levent

    2016-01-01

    Backed by substantive case studies, the novel approach to software engineering for distributed simulation outlined in this text demonstrates the potent synergies between model-driven techniques, simulation, intelligent agents, and computer systems development.

  7. Validation of Modeling Flow Approaching Navigation Locks

    Science.gov (United States)

    2013-08-01

    USACE, Pittsburgh District (LRP) requested that the US Army Engineer Research and Development Center, Coastal and Hydraulics ... approaching the lock and dam. The second set of experiments considered a design, referred to as the Plan B lock approach, which contained the weir field in ... conditions and model parameters. A discharge of 1.35 cfs was set as the inflow boundary condition at the upstream end of the model. The outflow boundary was

  8. Estimating a DIF decomposition model using a random-weights linear logistic test model approach.

    Science.gov (United States)

    Paek, Insu; Fukuhara, Hirotaka

    2015-09-01

    A differential item functioning (DIF) decomposition model separates a testlet item DIF into two sources: item-specific differential functioning and testlet-specific differential functioning. This article provides an alternative model-building framework and estimation approach for a DIF decomposition model that was proposed by Beretvas and Walker (2012). Although their model is formulated under multilevel modeling with the restricted pseudolikelihood estimation method, our approach illustrates DIF decomposition modeling that is directly built upon the random-weights linear logistic test model framework with the marginal maximum likelihood estimation method. In addition to demonstrating our approach's performance, we provide detailed information on how to implement this new DIF decomposition model using an item response theory software program; using DIF decomposition may be challenging for practitioners, yet practical information on how to implement it has previously been unavailable in the measurement literature.
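    The decomposition described above can be illustrated with a minimal Rasch-type sketch (all names and numeric values here are illustrative, not taken from the Beretvas and Walker model): for a focal-group respondent, the effective item difficulty is the base difficulty plus an item-specific and a testlet-specific DIF shift.

```python
import math

def irt_prob(theta, difficulty):
    """Rasch-type probability of a correct response."""
    return 1.0 / (1.0 + math.exp(-(theta - difficulty)))

# Decomposed difficulty for a focal-group respondent: base item difficulty
# plus an item-specific and a testlet-specific DIF shift (values illustrative).
b_item, dif_item, dif_testlet = 0.2, 0.15, 0.10

p_reference = irt_prob(theta=0.5, difficulty=b_item)
p_focal = irt_prob(theta=0.5, difficulty=b_item + dif_item + dif_testlet)
```

    Separating the two shifts is what lets the model attribute differential functioning to the item itself versus the testlet it belongs to.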

  9. Hybrid approaches to physiologic modeling and prediction

    Science.gov (United States)

    Olengü, Nicholas O.; Reifman, Jaques

    2005-05-01

    This paper explores how the accuracy of a first-principles physiological model can be enhanced by integrating data-driven, "black-box" models with the original model to form a "hybrid" model system. Both linear (autoregressive) and nonlinear (neural network) data-driven techniques are separately combined with a first-principles model to predict human body core temperature. Rectal core temperature data from nine volunteers, subject to four 30/10-minute cycles of moderate exercise/rest regimen in both CONTROL and HUMID environmental conditions, are used to develop and test the approach. The results show significant improvements in prediction accuracy, with average improvements of up to 30% for prediction horizons of 20 minutes. The models developed from one subject's data are also used in the prediction of another subject's core temperature. Initial results for this approach for a 20-minute horizon show no significant improvement over the first-principles model by itself.
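    The hybrid idea, a first-principles prediction corrected by a data-driven model of its own residuals, can be sketched as follows (a toy AR(1) correction; the data and function names are illustrative, not from the study):

```python
def fit_ar1(residuals):
    """Least-squares estimate of phi in r[t] = phi * r[t-1] + noise."""
    num = sum(residuals[t] * residuals[t - 1] for t in range(1, len(residuals)))
    den = sum(r * r for r in residuals[:-1])
    return num / den

def hybrid_predict(physical_pred, last_residual, phi):
    """First-principles prediction plus the AR(1) forecast of its error."""
    return physical_pred + phi * last_residual

# Toy data: the physical model runs consistently ~0.5 degC low.
observed  = [37.0, 37.2, 37.5, 37.9, 38.2]
physical  = [36.5, 36.7, 37.0, 37.4, 37.7]
residuals = [o - p for o, p in zip(observed, physical)]

phi = fit_ar1(residuals)                         # near 1.0 for a persistent bias
corrected = hybrid_predict(38.0, residuals[-1], phi)
```

    The neural-network variant replaces the AR term with a nonlinear map of recent residuals, but the correction structure is the same.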

  10. Asymptotic Independence and Separability in Covariance Structure Models: Implications for Specification Error, Power, and Model Modification.

    Science.gov (United States)

    Kaplan, D; Wenger, R N

    1993-10-01

    This article presents a didactic discussion of the role of asymptotically independent test statistics and separable hypotheses as they pertain to issues of specification error, power, and model modification in the covariance structure modeling framework. Specifically, it is shown that when restricting two parameter estimates on the basis of the multivariate Wald test, the condition of asymptotic independence is necessary but not sufficient for the univariate Wald test statistics to sum to the multivariate Wald test. Instead, what is required is mutual asymptotic independence (MAI) among the univariate tests. This result generalizes to sets of multivariate tests as well. When MAI is lacking, hypotheses can exhibit transitive relationships. It is also shown that the pattern of zero and non-zero elements of the covariance matrix of the estimates is indicative of mutually asymptotically independent test statistics and of separable and transitive hypotheses. The concepts of MAI, separability, and transitivity serve as an explanatory framework for how specification errors are propagated through systems of equations and how power analyses are differentially affected by specification errors of the same magnitude. A small population study supports the major findings of this article. The question of univariate versus multivariate sequential model modification is also addressed. We argue that multivariate sequential model modification strategies do not take into account the typical lack of MAI, thus inadvertently misleading substantive investigators. Instead, a prudent approach favors univariate sequential model modification.

  11. Understanding Gulf War Illness: An Integrative Modeling Approach

    Science.gov (United States)

    2017-10-01

    high-order diffusion imaging in a rat model of Gulf War Illness. Brain Behavior and Immunity. pii…astrocyte-specific transcriptome responses to neurotoxicity. Submitted for Internal CDC-NIOSH…Antagonist: Evaluation of Beneficial Effects for Gulf War Illness 4) GW160116 (Nathanson) Genomics approach to find gender-specific mechanisms of GWI

  12. Geostatistical approach for identifying scale-specific correlations between soil thickness and topographic attributes

    Science.gov (United States)

    Bourennane, Hocine; Salvador-Blanes, Sébastien; Couturier, Alain; Chartin, Caroline; Pasquier, Catherine; Hinschberger, Florent; Macaire, Jean-Jacques; Daroussin, Joël

    2014-09-01

    This paper investigates how the spatial correlations between topographic attributes and soil thickness can be improved by focusing on the relationships between them at specific spatial scales. In addition, this paper examines the effects of the topographic attribute data sources that are used as explanatory variables for modeling the response variable, and considers the possibility of model extrapolation for mapping beyond the area where the model was established. Here, factorial kriging analysis (FKA) and partial least squares regression (PLSR) analysis are used to separate nugget, small-scale and large-scale structures in data comprising four topographic attributes and soil thickness (ST). These analyses were conducted at different scales to analyze the relationships between ST and the selected topographic attributes in the southwest region of the Parisian Basin. The structural correlation coefficients from the FKA show strong correlations between the variables. These correlations, which change as a function of spatial scale, are not revealed by the linear correlation coefficients. The eigenvectors from the principal component analysis performed on the small-scale and large-scale structures of the linear co-regionalization model are used to obtain ST and the topographic attributes at both spatial scales over the study area. The ST models are built as a function of topographic attributes using PLSR. The results show that models built using variables assessed at a specific scale are better at predicting the target variable than models built using raw data. In the models built using raw data, the structural correlations that occur at different spatial scales are merged together, and the variance-covariance matrix of the nugget, which represents data noise, is not filtered out. Measures of model performance based on a validation data set show that the model based on small-scale structure (Model-S) is

  13. A Double Selection Approach to Achieve Specific Expression of Toxin Genes for Ovarian Cancer Gene Therapy

    National Research Council Canada - National Science Library

    Curiel, David T; Siegal, Gene; Wang, Minghui

    2005-01-01

    ... embodies the requisite properties of efficacy and specificity required for ovarian cancer gene therapy. This approach is based on targeting the delivered anti-cancer gene to tumor via two complementary approaches...

  14. A Double Selection Approach to Achieve Specific Expression of Toxin Genes for Ovarian Cancer Gene Therapy

    National Research Council Canada - National Science Library

    Curiel, David T; Siegal, Gene; Wang, Minghui

    2006-01-01

    ... embodies the requisite properties of efficacy and specificity required for ovarian cancer gene therapy. This approach is based on targeting the delivered anti-cancer gene to tumor via two complementary approaches...

  15. Specificity of continuous auditing approach on information technology internal controls

    Directory of Open Access Journals (Sweden)

    Kaćanski Slobodan

    2012-01-01

    Full Text Available The contemporary business world cannot be imagined without the use of information technology in all aspects of business. The use of information technology in the activities of manufacturing and non-manufacturing companies can greatly facilitate and accelerate operations and their control. Because of their complexity, such systems contain vulnerable areas and leave room for accidental and intentional frauds that can have a significant material effect on the business decisions made by a company's management. Implementing internal controls can greatly reduce the level of errors that contribute to wrong decisions. To protect the operating system, a company's management implements an internal audit to periodically examine the fundamental quality of the internal control systems. Since internal audit, by its nature, only periodically checks the quality of internal control systems and information technologies and then reports to management, a problem arises: findings may reach the management structures of the business entity too late. To eliminate this problem, management implements a special internal audit approach called continuous auditing.

  16. A model-driven approach to information security compliance

    Science.gov (United States)

    Correia, Anacleto; Gonçalves, António; Teodoro, M. Filomena

    2017-06-01

    The availability, integrity and confidentiality of information are fundamental to the long-term survival of any organization. Information security is a complex issue that must be approached holistically, combining assets that support corporate systems in an extended network of business partners, vendors, customers and other stakeholders. This paper addresses the conception and implementation of information security systems conforming to the ISO/IEC 27000 set of standards, using the model-driven approach. The process begins with the conception of a domain-level model (computation independent model) based on the information security vocabulary present in the ISO/IEC 27001 standard. Based on this model, after embedding in it the mandatory rules for attaining ISO/IEC 27001 conformance, a platform independent model is derived. Finally, a platform specific model serves as the base for testing the compliance of information security systems with the ISO/IEC 27000 set of standards.

  17. A Proposal for a Flexible Trend Specification in DSGE Models

    Directory of Open Access Journals (Sweden)

    Slanicay Martin

    2016-06-01

    Full Text Available In this paper I propose a flexible trend specification for estimating DSGE models on log differences. I demonstrate this flexible trend specification on a New Keynesian DSGE model of two economies, which I consequently estimate on data from the Czech economy and the euro area, using Bayesian techniques. The advantage of the trend specification proposed is that the trend component and the cyclical component are modelled jointly in a single model. The proposed trend specification is flexible in the sense that smoothness of the trend can be easily modified by different calibration of some of the trend parameters. The results suggest that this method is capable of finding a very reasonable trend in the data. Moreover, comparison of forecast performance reveals that the proposed specification offers more reliable forecasts than the original variant of the model.

  18. Specific and General Human Capital in an Endogenous Growth Model

    OpenAIRE

    Evangelia Vourvachaki; Vahagn Jerbashian; Sergey Slobodyan

    2014-01-01

    In this article, we define specific (general) human capital in terms of the occupations whose use is spread in a limited (wide) set of industries. We analyze the growth impact of an economy's composition of specific and general human capital, in a model where education and research and development are costly and complementary activities. The model suggests that a declining share of specific human capital, as observed in the Czech Republic, can be associated with a lower rate of long-term grow...

  19. Specification test for Markov models with measurement errors.

    Science.gov (United States)

    Kim, Seonjin; Zhao, Zhibiao

    2014-09-01

    Most existing works on specification testing assume that we have direct observations from the model of interest. We study specification testing for Markov models based on contaminated observations. The evolving model dynamics of the unobservable Markov chain is implicitly coded into the conditional distribution of the observed process. To test whether the underlying Markov chain follows a parametric model, we propose measuring the deviation between nonparametric and parametric estimates of conditional regression functions of the observed process. Specifically, we construct a nonparametric simultaneous confidence band for conditional regression functions and check whether the parametric estimate is contained within the band.
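    The core idea, comparing a nonparametric estimate of the conditional regression function with the parametric one and rejecting when the deviation leaves a confidence band, can be sketched as follows (toy data and a fixed band half-width; a real test derives the band from the estimator's sampling distribution):

```python
import math

def nw_estimate(x0, xs, ys, h):
    """Nadaraya-Watson kernel estimate of E[Y | X = x0], Gaussian kernel."""
    weights = [math.exp(-0.5 * ((x - x0) / h) ** 2) for x in xs]
    return sum(w * y for w, y in zip(weights, ys)) / sum(weights)

# Toy contaminated observations whose true regression is roughly y = 0.5 * x.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [0.1, 0.6, 0.9, 1.6, 2.0]

def parametric_mean(x):          # the parametric model under test
    return 0.5 * x

# Accept the parametric model only if it stays inside the band
# around the nonparametric estimate at every evaluation point.
half_width = 0.4
deviations = [abs(nw_estimate(x, xs, ys, h=1.0) - parametric_mean(x)) for x in xs]
model_ok = max(deviations) <= half_width
```

    Note the boundary bias of kernel smoothing at the endpoints; the simultaneous band in the paper accounts for such estimator variability rather than using a constant half-width.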

  20. A visual approach for modeling spatiotemporal relations

    NARCIS (Netherlands)

    R.L. Guimarães (Rodrigo); C.S.S. Neto; L.F.G. Soares

    2008-01-01

    Textual programming languages have proven to be difficult to learn and to use effectively for many people. To that end, visual tools can be useful to abstract the complexity of such textual languages, minimizing the specification effort. In this paper we present a visual approach for

  1. Risk Modelling for Passages in Approach Channel

    Directory of Open Access Journals (Sweden)

    Leszek Smolarek

    2013-01-01

    Full Text Available Methods of multivariate statistics, stochastic processes, and simulation methods are used to identify and assess the risk measures. This paper presents the use of generalized linear models and Markov models to study risks to ships along the approach channel. These models combined with simulation testing are used to determine the time required for continuous monitoring of endangered objects or period at which the level of risk should be verified.
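    A Markov model of passage risk of the kind described can be sketched with a small discrete-time chain (states, transition probabilities and the risk threshold are invented for illustration): the monitoring interval is the number of steps before the probability of the absorbing incident state exceeds a tolerated level.

```python
def step(dist, P):
    """One transition of a discrete-time Markov chain: dist' = dist . P."""
    n = len(dist)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

# Hypothetical 3-state chain for a passage: 0 = safe, 1 = endangered, 2 = incident.
P = [
    [0.95, 0.04, 0.01],
    [0.30, 0.60, 0.10],
    [0.00, 0.00, 1.00],   # incident is absorbing
]

dist = [1.0, 0.0, 0.0]
threshold = 0.05          # verify risk level before P(incident) exceeds 5%
steps_until_check = 0
while dist[2] <= threshold:
    dist = step(dist, P)
    steps_until_check += 1
```

    In the paper's setting the transition probabilities would themselves come from generalized linear models fitted to traffic data, and the threshold from a risk acceptance criterion.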

  2. Measurement Model Specification Error in LISREL Structural Equation Models.

    Science.gov (United States)

    Baldwin, Beatrice; Lomax, Richard

    This LISREL study examines the robustness of the maximum likelihood estimates under varying degrees of measurement model misspecification. A true model containing five latent variables (two endogenous and three exogenous) and two indicator variables per latent variable was used. Measurement model misspecification considered included errors of…

  3. Modelling subject-specific childhood growth using linear mixed-effect models with cubic regression splines.

    Science.gov (United States)

    Grajeda, Laura M; Ivanescu, Andrada; Saito, Mayuko; Crainiceanu, Ciprian; Jaganath, Devan; Gilman, Robert H; Crabtree, Jean E; Kelleher, Dermott; Cabrera, Lilia; Cama, Vitaliano; Checkley, William

    2016-01-01

    Childhood growth is a cornerstone of pediatric research. Statistical models need to consider individual trajectories to adequately describe growth outcomes. Specifically, well-defined longitudinal models are essential to characterize both population and subject-specific growth. Linear mixed-effect models with cubic regression splines can account for the nonlinearity of growth curves and provide reasonable estimators of population and subject-specific growth, velocity and acceleration. We provide a stepwise approach that builds from simple to complex models, and accounts for the intrinsic complexity of the data. We start with standard cubic splines regression models and build up to a model that includes subject-specific random intercepts and slopes and residual autocorrelation. We then compared cubic regression splines vis-à-vis linear piecewise splines, and with varying number of knots and positions. Statistical code is provided to ensure reproducibility and improve dissemination of methods. Models are applied to longitudinal height measurements in a cohort of 215 Peruvian children followed from birth until their fourth year of life. Unexplained variability, as measured by the variance of the regression model, was reduced from 7.34 when using ordinary least squares to 0.81 when using linear mixed-effect models with random slopes and a first-order continuous autoregressive error term. There was substantial heterogeneity in both the intercept and the slope. We provide a linear regression equation for both estimation and prediction of population- and individual-level growth in height. We show that cubic regression splines are superior to linear regression splines for the case of a small number of knots in both estimation and prediction with the full linear mixed effect model (AIC 19,352 vs. 19,598, respectively). While the regression parameters are more complex to interpret in the former, we argue that inference for any problem depends more on the estimated curve or differences in curves rather
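    A cubic regression spline fit of the kind described can be sketched with a truncated power basis and ordinary least squares (toy growth data and knot placement are illustrative; the paper's models add subject-specific random effects and residual autocorrelation on top of such a basis):

```python
import numpy as np

def cubic_spline_basis(t, knots):
    """Truncated power basis for a cubic regression spline:
    [1, t, t^2, t^3] plus (t - k)_+^3 for each interior knot k."""
    cols = [np.ones_like(t), t, t**2, t**3]
    cols += [np.clip(t - k, 0.0, None) ** 3 for k in knots]
    return np.column_stack(cols)

# Toy longitudinal heights (cm) over age in years; knots in the interior.
age    = np.linspace(0.0, 4.0, 25)
height = 50.0 + 20.0 * age - 2.0 * age**2     # smooth "true" growth curve
knots  = [1.0, 2.0, 3.0]

X = cubic_spline_basis(age, knots)
beta, *_ = np.linalg.lstsq(X, height, rcond=None)
fitted = X @ beta
```

    Because the toy curve is a polynomial, it lies in the span of the basis and the fit is exact; real height data would leave residuals for the random-effects layer to absorb.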

  4. Feasibility assessment of a risk-based approach to technical specifications

    International Nuclear Information System (INIS)

    Atefi, B.; Gallagher, D.W.

    1991-05-01

    To assess the potential use of risk and reliability techniques for improving the effectiveness of technical specifications in controlling plant operational risk, the Technical Specifications Branch of the Nuclear Regulatory Commission initiated an effort to identify and evaluate alternative risk-based approaches that could bring greater risk perspective to these requirements. In the first phase, four alternative approaches were identified and their characteristics were analyzed. Among these, the risk-based approach to technical specifications is the most promising for controlling plant operational risk using technical specifications. The second phase of the study concentrated on the detailed characteristics of the real-time risk-based approach. It is concluded that a real-time risk-based approach to technical specifications has the potential to improve both plant safety and availability. 33 refs., 5 figs., 6 tabs

  5. Gray-box modelling approach for description of storage tunnel

    DEFF Research Database (Denmark)

    Harremoës, Poul; Carstensen, Jacob

    1999-01-01

    The dynamics of a storage tunnel are examined using a model based on on-line measured data and a combination of simple deterministic and black-box stochastic elements. This approach, called gray-box modeling, is a promising new methodology for giving an on-line state description of sewer systems. ...... in a SCADA system because the most important information on the specific system is provided on-line

  6. Towards new approaches in phenological modelling

    Science.gov (United States)

    Chmielewski, Frank-M.; Götz, Klaus-P.; Rawel, Harshard M.; Homann, Thomas

    2014-05-01

    Modelling of phenological stages has been based on temperature sums for many decades, describing both the chilling and the forcing requirement of woody plants until the beginning of leafing or flowering. Parts of this approach go back to Reaumur (1735), who originally proposed the concept of growing degree-days. There is now a growing body of opinion calling for new methods in phenological modelling and more in-depth studies of dormancy release in woody plants. This requirement is easily understandable if we consider the wide application of phenological models, which can even affect the results of climate models. To this day, a number of parameters in phenological models still need to be optimised against observations, although some basic physiological knowledge of the chilling and forcing requirement of plants is already considered in these approaches (semi-mechanistic models). What limits a fundamental improvement of these models is the lack of knowledge about the course of dormancy in woody plants, which cannot be directly observed and which is also insufficiently described in the literature. Modern metabolomic methods provide a solution for this problem and allow both the validation of currently used phenological models and the development of mechanistic approaches. In order to develop this kind of model, changes in metabolites (concentration, temporal course) must be set in relation to the variability of environmental (steering) parameters (weather, day length, etc.). This necessarily requires multi-year (3-5 yr.) and high-resolution (weekly probes between autumn and spring) data. The feasibility of this approach has already been tested in a 3-year pilot study on sweet cherries. The suggested methodology is not limited to the flowering of fruit trees; it can also be applied to tree species of the natural vegetation, where even greater deficits in phenological modelling exist.
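    The classic temperature-sum approach that these newer methods build on can be sketched in a few lines (base temperature and forcing requirement are illustrative values, not calibrated parameters):

```python
def growing_degree_days(daily_mean_temps, base_temp=5.0):
    """Classic Reaumur-style temperature sum: accumulate degrees above a base."""
    return sum(max(t - base_temp, 0.0) for t in daily_mean_temps)

def predict_leafing_day(daily_mean_temps, base_temp, forcing_requirement):
    """First day on which the accumulated forcing meets the requirement."""
    total = 0.0
    for day, t in enumerate(daily_mean_temps, start=1):
        total += max(t - base_temp, 0.0)
        if total >= forcing_requirement:
            return day
    return None

# Toy spring warming sequence of daily mean temperatures (degC).
daily_means = [4.0, 6.0, 8.0, 10.0, 12.0, 14.0]
gdd = growing_degree_days(daily_means)                  # total forcing
leafing = predict_leafing_day(daily_means, 5.0, 15.0)   # predicted onset day
```

    Semi-mechanistic models add a chilling submodel before forcing begins; the metabolomic work described above aims to replace such fitted parameters with observed physiology.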

  7. Position-specific isotope modeling of organic micropollutants transformation through different reaction pathways

    International Nuclear Information System (INIS)

    Jin, Biao; Rolle, Massimo

    2016-01-01

    The degradation of organic micropollutants occurs via different reaction pathways. Compound specific isotope analysis is a valuable tool to identify such degradation pathways in different environmental systems. We propose a mechanism-based modeling approach that provides a quantitative framework to simultaneously evaluate concentration as well as bulk and position-specific multi-element isotope evolution during the transformation of organic micropollutants. The model explicitly simulates position-specific isotopologues for those atoms that experience isotope effects and, thereby, provides a mechanistic description of isotope fractionation occurring at different molecular positions. To demonstrate specific features of the modeling approach, we simulated the degradation of three selected organic micropollutants: dichlorobenzamide (BAM), isoproturon (IPU) and diclofenac (DCF). The model accurately reproduces the multi-element isotope data observed in previous experimental studies. Furthermore, it precisely captures the dual element isotope trends characteristic of different reaction pathways as well as their range of variation consistent with observed bulk isotope fractionation. It was also possible to directly validate the model capability to predict the evolution of position-specific isotope ratios with available experimental data. Therefore, the approach is useful both for a mechanism-based evaluation of experimental results and as a tool to explore transformation pathways in scenarios for which position-specific isotope data are not yet available. - Highlights: • Mechanism-based, position-specific isotope modeling of micropollutants degradation. • Simultaneous description of concentration and primary and secondary isotope effects. • Key features of the model are demonstrated with three illustrative examples. • Model as a tool to explore reaction mechanisms and to design experiments. - We propose a modeling approach incorporating mechanistic information and
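    A minimal sketch of the bookkeeping involved (a closed-form Rayleigh model for the substrate isotope ratio, with a bulk enrichment factor diluted over non-reactive positions; values are illustrative, and the paper's model tracks full position-specific isotopologues rather than this shortcut):

```python
def rayleigh_delta(delta0, f, epsilon):
    """Rayleigh model: delta value (per mil) of the remaining substrate
    after a remaining fraction f, given an enrichment factor epsilon (per mil)."""
    R0 = delta0 / 1000.0 + 1.0
    R = R0 * f ** (epsilon / 1000.0)     # R/R0 = f^(alpha - 1), eps = (alpha-1)*1000
    return (R - 1.0) * 1000.0

def bulk_epsilon(position_epsilon, n_positions):
    """One reactive position out of n equivalent atoms: the compound-average
    (bulk) enrichment factor is the position-specific one diluted by n."""
    return position_epsilon / n_positions

# Illustrative: a -30 per mil position-specific effect over 6 carbon atoms.
eps_bulk = bulk_epsilon(-30.0, 6)                     # -5 per mil
delta_remaining = rayleigh_delta(0.0, 0.5, eps_bulk)  # substrate at 50% remaining
```

    Dual-element plots then compare such enrichment trends across elements to discriminate reaction pathways.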

  8. 3D Image Modelling and Specific Treatments in Orthodontics Domain

    Directory of Open Access Journals (Sweden)

    Dionysis Goularas

    2007-01-01

    Full Text Available In this article, we present a 3D specific dental plaster treatment system for orthodontics. From computer tomography scanner images, we propose first a 3D image modelling and reconstruction method of the Mandible and Maxillary based on an adaptive triangulation allowing management of contours meant for the complex topologies. Secondly, we present two specific treatment methods directly achieved on obtained 3D model allowing the automatic correction for the setting in occlusion of the Mandible and the Maxillary, and the teeth segmentation allowing more specific dental examinations. Finally, these specific treatments are presented via a client/server application with the aim of allowing a telediagnosis and treatment.

  9. Authoring and verification of clinical guidelines: a model driven approach.

    Science.gov (United States)

    Pérez, Beatriz; Porres, Ivan

    2010-08-01

    The goal of this research is to provide a framework that enables the authoring and verification of clinical guidelines. The framework is part of a larger research project aimed at improving the representation, quality and application of clinical guidelines in daily clinical practice. The verification process of a guideline is based on (1) model checking techniques to verify guidelines against semantic errors and inconsistencies in their definition, (2) combined with Model Driven Development (MDD) techniques, which enable us to automatically process manually created guideline specifications and the temporal-logic statements to be checked against them, making the verification process faster and more cost-effective. In particular, we use UML statecharts to represent the dynamics of guidelines and, based on these manually defined guideline specifications, we use an MDD-based tool chain to automatically process them to generate the input model of a model checker. The model checker takes the resulting model together with the specific guideline requirements, and verifies whether the guideline fulfils such properties. The overall framework has been implemented as an Eclipse plug-in named GBDSSGenerator which, starting from the UML statechart representing a guideline, allows the verification of the guideline against specific requirements. Additionally, we have established a pattern-based approach for defining commonly occurring types of requirements in guidelines. We have successfully validated our overall approach by verifying properties in different clinical guidelines, resulting in the detection of some inconsistencies in their definition. The proposed framework allows (1) the authoring and (2) the verification of clinical guidelines against specific requirements defined on the basis of a set of property specification patterns, enabling non-experts to easily write formal specifications and thus easing the verification process.
Copyright 2010 Elsevier Inc.
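    The flavor of such a check can be sketched as a reachability query over a toy statechart (the states and the safety property are invented for illustration; the framework above uses a real model checker over UML statecharts and temporal logic):

```python
def reachable(transitions, start):
    """States reachable from `start` in a statechart's transition relation."""
    seen, stack = {start}, [start]
    while stack:
        s = stack.pop()
        for t in transitions.get(s, []):
            if t not in seen:
                seen.add(t)
                stack.append(t)
    return seen

# Hypothetical guideline statechart: a safety property could require that
# 'prescribe' is never reached without passing through 'check_allergy'.
transitions = {
    "start": ["assess"],
    "assess": ["check_allergy", "prescribe"],   # inconsistency: bypasses the check
    "check_allergy": ["prescribe"],
    "prescribe": ["end"],
}

# Cut the checked state out and see whether 'prescribe' is still reachable;
# if so, some path violates the property.
unchecked = {s: [t for t in ts if t != "check_allergy"] for s, ts in transitions.items()}
violates_property = "prescribe" in reachable(unchecked, "start")
```

    A real model checker verifies such properties exhaustively over the statechart semantics, including parallel regions and guards, rather than plain graph reachability.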

  10. Communicative Syllabus Design: A Sociolinguistic Model for Defining the Content of Purpose-Specific Language Programmes.

    Science.gov (United States)

    Munby, John

    The design of a dynamic processing model for teaching English for Specific Purposes (ESP) is discussed in this book. The model starts with the learner and ends with the learner's target communicative competence in the particular area needed. In this communicative approach to syllabus design, the first chapter is a discussion of theories of…

  11. Assessing risk factors for dental caries: a statistical modeling approach.

    Science.gov (United States)

    Trottini, Mario; Bossù, Maurizio; Corridore, Denise; Ierardo, Gaetano; Luzzi, Valeria; Saccucci, Matteo; Polimeni, Antonella

    2015-01-01

    The problem of identifying potential determinants and predictors of dental caries is of key importance in caries research and it has received considerable attention in the scientific literature. On the methodological side, a broad range of statistical models is currently available for analyzing dental caries indices (DMFT, dmfs, etc.). These models have been applied in several studies to investigate the impact of different risk factors on the cumulative severity of dental caries experience. However, in most cases (i) these studies focus on a very specific subset of risk factors, and (ii) in the statistical modeling only a few candidate models are considered and model selection is at best only marginally addressed. As a result, our understanding of the robustness of the statistical inferences with respect to the choice of the model is very limited; the richness of the set of statistical models available for analysis is only marginally exploited; and inferences could be biased due to the omission of potentially important confounding variables in the model's specification. In this paper we argue that these limitations can be overcome by considering a general class of candidate models and carefully exploring the model space using standard model selection criteria and measures of global fit and predictive performance of the candidate models. Strengths and limitations of the proposed approach are illustrated with a real data set. In our illustration the model space contains more than 2.6 million models, which require inferences to be adjusted for 'optimism'.
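    Exploring a model space with a standard selection criterion can be sketched as follows (toy count data and two candidate Poisson models, invented for illustration; the paper's space holds millions of candidates):

```python
import math

def poisson_loglik(counts, lam):
    """Log-likelihood of i.i.d. Poisson counts with mean lam."""
    return sum(-lam + c * math.log(lam) - math.lgamma(c + 1) for c in counts)

def aic(loglik, n_params):
    return 2 * n_params - 2 * loglik

# Toy DMFT-like counts from two hypothetical groups.
group_a = [0, 1, 1, 2, 3]
group_b = [4, 5, 5, 6, 7]
all_counts = group_a + group_b

# Candidate 1: one common Poisson mean (1 parameter).
lam_all = sum(all_counts) / len(all_counts)
aic_common = aic(poisson_loglik(all_counts, lam_all), 1)

# Candidate 2: group-specific means (2 parameters).
lam_a = sum(group_a) / len(group_a)
lam_b = sum(group_b) / len(group_b)
aic_group = aic(poisson_loglik(group_a, lam_a) + poisson_loglik(group_b, lam_b), 2)

prefer_group_model = aic_group < aic_common
```

    The AIC penalty for the extra parameter guards against overfitting; over a large model space, the optimism adjustment mentioned above additionally corrects for selecting the best-looking of many candidates.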

  12. Feasibility assessment of a risk-based approach to technical specifications

    International Nuclear Information System (INIS)

    Atefi, B.; Gallagher, D.W.

    1991-05-01

    The first phase of the assessment concentrates on (1) identification of selected risk-based approaches for improving current technical specifications, (2) appraisal of characteristics of each approach, including advantages and disadvantages, and (3) recommendation of one or more approaches that might result in improving current technical specification requirements. The second phase of the work concentrates on assessment of the feasibility of implementation of a pilot program to study detailed characteristics of the preferred approach. The real time risk-based approach was identified as the preferred approach to technical specifications for controlling plant operational risk. There do not appear to be any technical or institutional obstacles to prevent initiation of a pilot program to assess the characteristics and effectiveness of such an approach. 2 tabs

  13. A comprehensive approach to age-dependent dosimetric modeling

    Energy Technology Data Exchange (ETDEWEB)

    Leggett, R.W.; Cristy, M.; Eckerman, K.F.

    1986-01-01

    In the absence of age-specific biokinetic models, current retention models of the International Commission on Radiological Protection (ICRP) frequently are used as a point of departure for evaluation of exposures to the general population. These models were designed and intended for estimation of long-term integrated doses to the adult worker. Their format and empirical basis preclude incorporation of much valuable physiological information and physiologically reasonable assumptions that could be used in characterizing the age-specific behavior of radioelements in humans. In this paper we discuss a comprehensive approach to age-dependent dosimetric modeling in which consideration is given not only to changes with age in masses and relative geometries of body organs and tissues but also to the best available physiological and radiobiological information relating to the age-specific biobehavior of radionuclides. This approach is useful in obtaining more accurate estimates of long-term dose commitments as a function of age at intake, but it may be particularly valuable in establishing more accurate estimates of dose rate as a function of age. Age-specific dose rates are needed for a proper analysis of the potential effects on estimates of risk of elevated dose rates per unit intake in certain stages of life, elevated response per unit dose received during some stages of life, and age-specific non-radiogenic competing risks.

  14. A comprehensive approach to age-dependent dosimetric modeling

    International Nuclear Information System (INIS)

    Leggett, R.W.; Cristy, M.; Eckerman, K.F.

    1986-01-01

    In the absence of age-specific biokinetic models, current retention models of the International Commission on Radiological Protection (ICRP) frequently are used as a point of departure for evaluation of exposures to the general population. These models were designed and intended for estimation of long-term integrated doses to the adult worker. Their format and empirical basis preclude incorporation of much valuable physiological information and physiologically reasonable assumptions that could be used in characterizing the age-specific behavior of radioelements in humans. In this paper we discuss a comprehensive approach to age-dependent dosimetric modeling in which consideration is given not only to changes with age in masses and relative geometries of body organs and tissues but also to the best available physiological and radiobiological information relating to the age-specific biobehavior of radionuclides. This approach is useful in obtaining more accurate estimates of long-term dose commitments as a function of age at intake, but it may be particularly valuable in establishing more accurate estimates of dose rate as a function of age. Age-specific dose rates are needed for a proper analysis of the potential effects on estimates of risk of elevated dose rates per unit intake in certain stages of life, elevated response per unit dose received during some stages of life, and age-specific non-radiogenic competing risks.

  15. 3D Image Modelling and Specific Treatments in Orthodontics Domain

    OpenAIRE

    Goularas, Dionysis; Djemal, Khalifa; Mannoussakis, Yannis

    2007-01-01

    In this article, we present a 3D specific dental plaster treatment system for orthodontics. From computer tomography scanner images, we first propose a 3D image modelling and reconstruction method for the Mandible and Maxillary, based on an adaptive triangulation that handles contours with complex topologies. Secondly, we present two specific treatment methods applied directly to the obtained 3D model, allowing automatic correction of the setting in occlusion of the Mandible...

  16. An algebraic approach to modeling in software engineering

    International Nuclear Information System (INIS)

    Loegel, C.J.; Ravishankar, C.V.

    1993-09-01

    Our work couples the formalism of universal algebras with the engineering techniques of mathematical modeling to develop a new approach to the software engineering process. Our purpose in using this combination is twofold. First, abstract data types and their specification using universal algebras can be considered a common point between the practical requirements of software engineering and the formal specification of software systems. Second, mathematical modeling principles provide us with a means for effectively analyzing real-world systems. We first use modeling techniques to analyze a system and then represent the analysis using universal algebras. The rest of the software engineering process exploits properties of universal algebras that preserve the structure of our original model. This paper describes our software engineering process and our experience using it on both research and commercial systems. We need a new approach because current software engineering practices often deliver software that is difficult to develop and maintain. Formal software engineering approaches use universal algebras to describe "computer science" objects like abstract data types, but in practice software errors are often caused because "real-world" objects are improperly modeled. There is a large semantic gap between the customer's objects and abstract data types. In contrast, mathematical modeling uses engineering techniques to construct valid models for real-world systems, but these models are often implemented in an ad hoc manner. A combination of the best features of both approaches would enable software engineering to formally specify and develop software systems that better model real systems. Software engineering, like mathematical modeling, should concern itself first and foremost with understanding a real system and its behavior under given circumstances, and then with expressing this knowledge in an executable form

  17. Results from Development of Model Specifications for Multifamily Energy Retrofits

    Energy Technology Data Exchange (ETDEWEB)

    Brozyna, K.

    2012-08-01

    Specifications, modeled after CSI MasterFormat, provide the trade contractors and builders with requirements and recommendations on specific building materials, components, and industry practices that comply with the expectations and intent of the requirements within the various funding programs associated with a project. The goal is to create a greater level of consistency in the execution of energy efficiency retrofit measures across the multiple regions in which a developer may work. IBACOS and Mercy Housing developed sample model specifications based on a common building construction type that Mercy Housing encounters.

  19. Heat transfer modeling an inductive approach

    CERN Document Server

    Sidebotham, George

    2015-01-01

    This innovative text emphasizes a "less-is-more" approach to modeling complicated systems such as heat transfer by treating them first as "1-node lumped models" that yield simple closed-form solutions. The author develops numerical techniques for students to obtain more detail, but also trains them to use the techniques only when simpler approaches fail. Covering all essential methods offered in traditional texts, but with a different order, Professor Sidebotham stresses inductive thinking and problem solving as well as a constructive understanding of modern, computer-based practice. Readers learn to develop their own code in the context of the material, rather than just how to use packaged software, offering a deeper, intrinsic grasp behind models of heat transfer. Developed from over twenty-five years of lecture notes to teach students of mechanical and chemical engineering at The Cooper Union for the Advancement of Science and Art, the book is ideal for students and practitioners across engineering discipl...
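The "1-node lumped model" idea the book builds on reduces a transient heat-transfer problem to a single energy balance with a closed-form exponential solution. A minimal sketch (the property values are illustrative, not taken from the book):

```python
import math

def lumped_node_temperature(t, T0, T_inf, h, A, rho, V, c):
    """1-node lumped-capacitance model: the body is one thermal mass, so
    dT/dt = -(h*A)/(rho*V*c) * (T - T_inf), with a closed-form solution."""
    tau = rho * V * c / (h * A)     # thermal time constant [s]
    return T_inf + (T0 - T_inf) * math.exp(-t / tau)

# illustrative values: small aluminum sphere cooling in air for 60 s
T = lumped_node_temperature(t=60.0, T0=100.0, T_inf=25.0,
                            h=25.0, A=0.01, rho=2700.0, V=1e-5, c=900.0)
print(round(T, 1))
```

The single time constant τ = ρVc/(hA) is all the lumped model exposes; the numerical techniques the book develops later add spatial detail only when this closed-form picture fails.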

  20. Software Requirements Specification Verifiable Fuel Cycle Simulation (VISION) Model

    International Nuclear Information System (INIS)

    D. E. Shropshire; W. H. West

    2005-01-01

    The purpose of this Software Requirements Specification (SRS) is to define the top-level requirements for a Verifiable Fuel Cycle Simulation Model (VISION) of the Advanced Fuel Cycle (AFC). This simulation model is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI (including cost estimates) and Generation IV reactor development studies

  1. Mountain range specific analog weather forecast model for ...

    Indian Academy of Sciences (India)

    Journal of Earth System Science, Volume 117, Issue 5. A mountain range specific analog weather forecast model is developed utilizing surface weather observations of reference stations in each mountain range in northwest Himalaya (NW-Himalaya). The model searches past ...

  2. Position-specific isotope modeling of organic micropollutants transformation through different reaction pathways.

    Science.gov (United States)

    Jin, Biao; Rolle, Massimo

    2016-03-01

    The degradation of organic micropollutants occurs via different reaction pathways. Compound specific isotope analysis is a valuable tool to identify such degradation pathways in different environmental systems. We propose a mechanism-based modeling approach that provides a quantitative framework to simultaneously evaluate concentration as well as bulk and position-specific multi-element isotope evolution during the transformation of organic micropollutants. The model explicitly simulates position-specific isotopologues for those atoms that experience isotope effects and, thereby, provides a mechanistic description of isotope fractionation occurring at different molecular positions. To demonstrate specific features of the modeling approach, we simulated the degradation of three selected organic micropollutants: dichlorobenzamide (BAM), isoproturon (IPU) and diclofenac (DCF). The model accurately reproduces the multi-element isotope data observed in previous experimental studies. Furthermore, it precisely captures the dual element isotope trends characteristic of different reaction pathways as well as their range of variation consistent with observed bulk isotope fractionation. It was also possible to directly validate the model capability to predict the evolution of position-specific isotope ratios with available experimental data. Therefore, the approach is useful both for a mechanism-based evaluation of experimental results and as a tool to explore transformation pathways in scenarios for which position-specific isotope data are not yet available. Copyright © 2015 Elsevier Ltd. All rights reserved.
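The bulk versus position-specific distinction in this approach can be illustrated with the classical Rayleigh fractionation equation: an intrinsic isotope effect at a single reacting position is diluted over the non-reacting positions in the bulk ratio. A toy numpy sketch (the enrichment factors and carbon count are illustrative, not values from the paper):

```python
import numpy as np

def rayleigh_delta(delta0, eps_permil, f):
    """Rayleigh fractionation: delta value (permil) of the remaining
    substrate when a fraction f (0 < f <= 1) is left unreacted."""
    return (delta0 + 1000.0) * f ** (eps_permil / 1000.0) - 1000.0

f = np.linspace(1.0, 0.05, 50)            # fraction of substrate remaining
bulk = rayleigh_delta(-30.0, -5.0, f)     # bulk carbon, eps = -5 permil (toy)
# the reacting position carries the full intrinsic effect that the bulk
# ratio dilutes over n carbons (simple dilution assumption)
n_carbons = 10
position = rayleigh_delta(-30.0, -5.0 * n_carbons, f)
print(round(bulk[-1], 2), round(position[-1], 2))
```

The much stronger enrichment at the reacting position is what makes position-specific isotope data diagnostic of the reaction mechanism, which the paper's isotopologue model resolves explicitly.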

  3. A multiscale modeling approach for biomolecular systems

    Energy Technology Data Exchange (ETDEWEB)

    Bowling, Alan, E-mail: bowling@uta.edu; Haghshenas-Jaryani, Mahdi, E-mail: mahdi.haghshenasjaryani@mavs.uta.edu [The University of Texas at Arlington, Department of Mechanical and Aerospace Engineering (United States)

    2015-04-15

    This paper presents a new multiscale molecular dynamic model for investigating the effects of external interactions, such as contact and impact, during stepping and docking of motor proteins and other biomolecular systems. The model retains the mass properties ensuring that the result satisfies Newton’s second law. This idea is presented using a simple particle model to facilitate discussion of the rigid body model; however, the particle model does provide insights into particle dynamics at the nanoscale. The resulting three-dimensional model predicts a significant decrease in the effect of the random forces associated with Brownian motion. This conclusion runs contrary to the widely accepted notion that the motor protein’s movements are primarily the result of thermal effects. This work focuses on the mechanical aspects of protein locomotion; the effect of ATP hydrolysis is estimated as internal forces acting on the mechanical model. In addition, the proposed model can be numerically integrated in a reasonable amount of time. Herein, the differences between the motion predicted by the old and new modeling approaches are compared using a simplified model of myosin V.

  4. Quasirelativistic quark model in quasipotential approach

    CERN Document Server

    Matveev, V A; Savrin, V I; Sissakian, A N

    2002-01-01

    The relativistic particle interaction is described within the frame of the quasipotential approach. The presentation is based on the so-called covariant simultaneous formulation of quantum field theory, whereby the theory is considered on a space-like three-dimensional hypersurface in Minkowski space. Special attention is paid to the methods of constructing various quasipotentials as well as to the applications of the quasipotential approach to describing the characteristics of relativistic particle interactions in quark models, namely: the elastic scattering amplitudes of hadrons, the mass spectra and decay widths of mesons, and the cross sections of deep inelastic lepton scattering on hadrons

  5. A new approach for developing adjoint models

    Science.gov (United States)

    Farrell, P. E.; Funke, S. W.

    2011-12-01

    Many data assimilation algorithms rely on the availability of gradients of misfit functionals, which can be efficiently computed with adjoint models. However, the development of an adjoint model for a complex geophysical code is generally very difficult. Algorithmic differentiation (AD, also called automatic differentiation) offers one strategy for simplifying this task: it takes the abstraction that a model is a sequence of primitive instructions, each of which may be differentiated in turn. While extremely successful, this low-level abstraction runs into time-consuming difficulties when applied to the whole codebase of a model, such as differentiating through linear solves, model I/O, calls to external libraries, language features that are unsupported by the AD tool, and the use of multiple programming languages. While these difficulties can be overcome, it requires a large amount of technical expertise and an intimate familiarity with both the AD tool and the model. An alternative to applying the AD tool to the whole codebase is to assemble the discrete adjoint equations and use these to compute the necessary gradients. With this approach, the AD tool must be applied to the nonlinear assembly operators, which are typically small, self-contained units of the codebase. The disadvantage of this approach is that the assembly of the discrete adjoint equations is still very difficult to perform correctly, especially for complex multiphysics models that perform temporal integration; as it stands, this approach is as difficult and time-consuming as applying AD to the whole model. In this work, we have developed a library which greatly simplifies and automates the alternate approach of assembling the discrete adjoint equations. We propose a complementary, higher-level abstraction to that of AD: that a model is a sequence of linear solves. The developer annotates model source code with library calls that build a 'tape' of the operators involved and their dependencies, and ...
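The "model as a sequence of linear solves" abstraction can be illustrated on a single parameterized solve: one extra transpose (adjoint) solve yields the exact gradient of a functional with respect to a model parameter, which is what the authors' library automates at scale. A minimal numpy sketch (the operator, functional and parameter here are toy examples, not the authors' library or API):

```python
import numpy as np

# Forward model: A(m) u = b, with the operator A depending on a scalar parameter m.
def assemble_A(m):
    return np.array([[2.0 + m, -1.0],
                     [-1.0, 2.0]])

dA_dm = np.array([[1.0, 0.0],    # exact derivative of A with respect to m
                  [0.0, 0.0]])

b = np.array([1.0, 0.0])
c = np.array([1.0, 1.0])         # functional J(u) = c . u
m = 0.5

u = np.linalg.solve(assemble_A(m), b)
lam = np.linalg.solve(assemble_A(m).T, c)   # one adjoint (transpose) solve
grad_adjoint = -lam @ dA_dm @ u             # dJ/dm from the adjoint solution

# finite-difference check of the adjoint gradient
h = 1e-6
u_h = np.linalg.solve(assemble_A(m + h), b)
grad_fd = (c @ u_h - c @ u) / h
print(grad_adjoint, grad_fd)
```

The cost of the gradient is one extra linear solve regardless of the number of parameters, which is the key payoff of the adjoint approach described in the abstract.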

  6. A Model Management Approach for Co-Simulation Model Evaluation

    NARCIS (Netherlands)

    Zhang, X.C.; Broenink, Johannes F.; Filipe, Joaquim; Kacprzyk, Janusz; Pina, Nuno

    2011-01-01

    Simulating formal models is a common means for validating the correctness of the system design and reducing the time-to-market. In most embedded control system designs, multiple engineering disciplines and various domain-specific models are often involved, such as mechanical, control, software

  7. An equilibrium approach to modelling social interaction

    Science.gov (United States)

    Gallo, Ignacio

    2009-07-01

    The aim of this work is to put forward a statistical mechanics theory of social interaction, generalizing econometric discrete choice models. After showing the formal equivalence linking econometric multinomial logit models to equilibrium statistical mechanics, a multi-population generalization of the Curie-Weiss model for ferromagnets is considered as a starting point in developing a model capable of describing sudden shifts in aggregate human behaviour. Existence of the thermodynamic limit for the model is shown by an asymptotic sub-additivity method and factorization of correlation functions is proved almost everywhere. The exact solution of the model is provided in the thermodynamic limit by finding converging upper and lower bounds for the system's pressure, and the solution is used to prove an analytic result regarding the number of possible equilibrium states of a two-population system. The work stresses the importance of linking regimes predicted by the model to real phenomena, and to this end it proposes two possible procedures to estimate the model's parameters starting from micro-level data. These are applied to three case studies based on census type data: though these studies are found to be ultimately inconclusive on an empirical level, considerations are drawn that encourage further refinements of the chosen modelling approach.
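At the mean-field level, the multi-population Curie-Weiss generalization mentioned above leads to coupled self-consistency equations of the form m_i = tanh(β(Σ_j J_ij m_j + h_i)) for the population magnetizations. A small fixed-point sketch (the coupling matrix, fields and inverse temperature are illustrative values, not taken from the paper):

```python
import numpy as np

def mean_field_fixed_point(J, h, beta, tol=1e-12, max_iter=10000):
    """Solve m_i = tanh(beta * (sum_j J[i,j] m_j + h[i])) by fixed-point
    iteration for a multi-population Curie-Weiss model (illustrative)."""
    m = np.zeros(len(h))
    for _ in range(max_iter):
        m_new = np.tanh(beta * (J @ m + h))
        if np.max(np.abs(m_new - m)) < tol:
            return m_new
        m = m_new
    return m

# two interacting populations with symmetric cross-coupling (toy numbers)
J = np.array([[1.0, 0.3],
              [0.3, 0.8]])
h = np.array([0.1, -0.05])
m = mean_field_fixed_point(J, h, beta=0.5)
print(m)
```

For larger β the fixed-point map ceases to be a contraction and multiple equilibria appear, which is the mechanism behind the "sudden shifts in aggregate behaviour" the abstract refers to.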

  8. Evolutionary modeling-based approach for model errors correction

    Directory of Open Access Journals (Sweden)

    S. Q. Wan

    2012-08-01

    Full Text Available The inverse problem of using the information of historical data to estimate model errors is one of the science frontier research topics. In this study, we investigate such a problem using the classic Lorenz (1963) equation as a prediction model and the Lorenz equation with a periodic evolutionary function as an accurate representation of reality to generate "observational data."

    On the basis of the intelligent features of evolutionary modeling (EM), including self-organization, self-adaptation and self-learning, the dynamic information contained in the historical data can be identified and extracted by the computer automatically. Thereby, a new approach to estimate model errors based on EM is proposed in the present paper. Numerical tests demonstrate the ability of the new approach to correct model structural errors. In fact, it can realize the combination of statistics and dynamics to a certain extent.
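The twin-experiment setup described in this record — the classic Lorenz (1963) equations as the imperfect prediction model, and a periodically perturbed version standing in for "reality" — can be sketched as follows. The specific forcing term and parameters below are assumptions for illustration; the paper's evolutionary-modeling step would then learn the resulting discrepancy from the data:

```python
import numpy as np

def lorenz_rhs(state, t, forced=False):
    """Classic Lorenz (1963) system; the 'truth' run adds a small periodic
    term to the first equation (an assumed stand-in for model error)."""
    x, y, z = state
    dx = 10.0 * (y - x) + (0.5 * np.sin(0.1 * t) if forced else 0.0)
    dy = 28.0 * x - y - x * z
    dz = x * y - (8.0 / 3.0) * z
    return np.array([dx, dy, dz])

def integrate(forced, n_steps=2000, dt=0.01):
    s = np.array([1.0, 1.0, 1.0])
    traj = [s]
    for k in range(n_steps):           # classical 4th-order Runge-Kutta
        t = k * dt
        k1 = lorenz_rhs(s, t, forced)
        k2 = lorenz_rhs(s + 0.5 * dt * k1, t + 0.5 * dt, forced)
        k3 = lorenz_rhs(s + 0.5 * dt * k2, t + 0.5 * dt, forced)
        k4 = lorenz_rhs(s + dt * k3, t + dt, forced)
        s = s + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
        traj.append(s)
    return np.array(traj)

obs = integrate(forced=True)           # "observations" from the perturbed truth
model = integrate(forced=False)        # imperfect prediction model
error = obs - model                    # discrepancy an EM step would learn from
print(np.abs(error).max())
```

Because the system is chaotic, even the small periodic forcing produces a growing model-error signal, which is exactly the quantity the inverse problem targets.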

  9. Teaching Sustainability Using an Active Learning Constructivist Approach: Discipline-Specific Case Studies in Higher Education

    Directory of Open Access Journals (Sweden)

    Maria Kalamas Hedden

    2017-07-01

    Full Text Available In this paper we present our rationale for using an active learning constructivist approach to teach sustainability-related topics in higher education. To push the boundaries of ecological literacy, we also develop a theoretical model for sustainability knowledge co-creation. Drawing on the experiences of faculty at a major Southeastern University in the United States, we present case studies in architecture, engineering, geography, and marketing. Four Sustainability Faculty Fellows describe their discipline-specific case studies, all of which are project-based learning experiences, and include details regarding teaching and assessment. Easily replicated in other educational contexts, these case studies contribute to the advancement of sustainability education.

  10. MODELS OF TECHNOLOGY ADOPTION: AN INTEGRATIVE APPROACH

    Directory of Open Access Journals (Sweden)

    Andrei OGREZEANU

    2015-06-01

    Full Text Available The interdisciplinary study of information technology adoption has developed rapidly over the last 30 years. Various theoretical models have been developed and applied, such as the Technology Acceptance Model (TAM), Innovation Diffusion Theory (IDT), the Theory of Planned Behavior (TPB), etc. The result of these many years of research is thousands of contributions to the field, which, however, remain highly fragmented. This paper develops a theoretical model of technology adoption by integrating major theories in the field: primarily IDT, TAM, and TPB. To do so while avoiding further fragmentation, an approach that goes back to basics in the development of independent variable types is proposed, emphasizing: (1) the logic of classification, and (2) the psychological mechanisms behind variable types. Once developed, these types are then populated with variables originating in empirical research. Conclusions are developed on which types are underpopulated and present potential for future research. I end with a set of methodological recommendations for future application of the model.

  11. Interfacial Fluid Mechanics A Mathematical Modeling Approach

    CERN Document Server

    Ajaev, Vladimir S

    2012-01-01

    Interfacial Fluid Mechanics: A Mathematical Modeling Approach provides an introduction to mathematical models of viscous flow used in the rapidly developing fields of microfluidics and microscale heat transfer. The basic physical effects are first introduced in the context of simple configurations and their relative importance in typical microscale applications is discussed. Then, several configurations of importance to microfluidics, most notably thin films/droplets on substrates and confined bubbles, are discussed in detail. Topics from current research on electrokinetic phenomena, liquid flow near structured solid surfaces, evaporation/condensation, and surfactant phenomena are discussed in the later chapters. This book also discusses mathematical models in the context of actual applications such as electrowetting, includes unique material on fluid flow near structured surfaces and phase change phenomena, and shows readers how to solve modeling problems related to microscale multiphase flows. Interfacial Fluid Me...

  12. Continuum modeling an approach through practical examples

    CERN Document Server

    Muntean, Adrian

    2015-01-01

    This book develops continuum modeling skills and approaches the topic from three sides: (1) derivation of global integral laws together with the associated local differential equations, (2) design of constitutive laws and (3) modeling boundary processes. The focus of this presentation lies on many practical examples covering aspects such as coupled flow, diffusion and reaction in porous media or microwave heating of a pizza, as well as traffic issues in bacterial colonies and energy harvesting from geothermal wells. The target audience comprises primarily graduate students in pure and applied mathematics as well as working practitioners in engineering who are faced by nonstandard rheological topics like those typically arising in the food industry.

  13. Generic solar photovoltaic system dynamic simulation model specification

    Energy Technology Data Exchange (ETDEWEB)

    Ellis, Abraham [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Behnke, Michael Robert [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Elliott, Ryan Thomas [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2013-10-01

    This document is intended to serve as a specification for generic solar photovoltaic (PV) system positive-sequence dynamic models to be implemented by software developers and approved by the WECC MVWG for use in bulk system dynamic simulations in accordance with NERC MOD standards. Two specific dynamic models are included in the scope of this document. The first, a Central Station PV System model, is intended to capture the most important dynamic characteristics of large scale (> 10 MW) PV systems with a central Point of Interconnection (POI) at the transmission level. The second, a Distributed PV System model, is intended to represent an aggregation of smaller, distribution-connected systems that comprise a portion of a composite load that might be modeled at a transmission load bus.

  14. Analysis specifications for the CC3 geosphere model GEONET

    International Nuclear Information System (INIS)

    Melnyk, T.W.

    1995-04-01

    AECL is assessing a concept for disposing of Canada's nuclear fuel waste in a sealed vault deep in plutonic rock of the Canadian Shield. A computer program has been developed as an analytical tool for the postclosure assessment. For the case study, a system model, CC3 (Canadian Concept, generation 3), has been developed to describe a hypothetical disposal system. This system model includes separate models for the engineered barriers within the disposal vault, the geosphere in which the vault is emplaced, and the biosphere in the vicinity of any discharge zones. The system model is embedded within a computer code, SYVAC3 (SYstems Variability Analysis Code, generation 3), which takes parameter uncertainty into account by repeated simulation of the system. GEONET (GEOsphere NETwork) is the geosphere model component of this system model. It simulates contaminant transport from the vault to the biosphere along a transport network composed of one-dimensional transport segments that are connected together in three-dimensional space. This document is a set of specifications for GEONET that were developed over a number of years. Improvements to the code will be based on revisions to these specifications. The specifications consist of a model synopsis, describing all the relevant equations and assumptions used in the model, a set of formal data flow diagrams and minispecifications, and a data dictionary. (author). 26 refs., 20 figs
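The network idea behind GEONET — one-dimensional transport segments chained together in space — can be conveyed with a deliberately trivial sketch that accumulates a per-segment quantity along a path. The segment data below are hypothetical, and the real model also treats dispersion, sorption and decay rather than pure advection:

```python
def path_travel_time(segments):
    """Total advective travel time along a chain of 1-D transport segments,
    each given as (length_m, velocity_m_per_a). Illustrative only: GEONET's
    segments carry full transport physics, not just advection."""
    return sum(length / velocity for length, velocity in segments)

# hypothetical three-segment path from vault to discharge zone
path = [(200.0, 0.5), (350.0, 1.2), (120.0, 0.8)]
print(path_travel_time(path))
```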

  15. Analysis specifications for the CC3 biosphere model BIOTRAC

    International Nuclear Information System (INIS)

    Szekely, J.G.; Wojciechowski, L.C.; Stephens, M.E.; Halliday, H.A.

    1994-12-01

    AECL Research is assessing a concept for disposing of Canada's nuclear fuel waste in a vault deep in plutonic rock of the Canadian Shield. A computer program called the Systems Variability Analysis Code (SYVAC) has been developed as an analytical tool for the postclosure (long-term) assessment of the concept. SYVAC3, the third generation of the code, is an executive program that directs repeated simulation of the disposal system to take into account parameter variation. For the postclosure assessment, the system model, CC3 (Canadian Concept, generation 3), was developed to describe a hypothetical disposal system that includes a disposal vault, the local geosphere and the biosphere in the vicinity of any discharge zones. BIOTRAC (BIOsphere TRansport And Consequences) is the biosphere model in the CC3 system model. The specifications for BIOTRAC, which were developed over a period of seven years, were subjected to numerous walkthrough examinations by the Biosphere Model Working Group to ensure that the intent of the model developers would be correctly specified for transformation into FORTRAN code. The FORTRAN version of BIOTRAC was written from interim versions of these specifications. Improvements to the code are based on revised versions of these specifications. The specifications consist of a data dictionary; sets of synopses, data flow diagrams and mini specs for the component models of BIOTRAC (surface water, soil, atmosphere, and food chain and dose); and supporting calculations (interface to the geosphere, consequences, and mass balance). (author). 20 refs., tabs., figs

  16. Reconciliation with oneself and with others: From approach to model

    Directory of Open Access Journals (Sweden)

    Nikolić-Ristanović Vesna

    2010-01-01

    Full Text Available The paper presents the approach to dealing with war and its consequences that was developed within the Victimology Society of Serbia over the last five years, in the framework of the Association Joint Action for Truth and Reconciliation (ZAIP). First, a short review of the Association and the process through which the ZAIP approach to dealing with the past was developed is presented. Then, a detailed description of the approach itself is given, identifying its most important specificities. In the conclusion, next steps are suggested, aimed at developing a model of reconciliation grounded in the ZAIP approach and appropriate to the social context of Serbia and its surroundings.

  17. A Bayesian alternative for multi-objective ecohydrological model specification

    Science.gov (United States)

    Tang, Yating; Marshall, Lucy; Sharma, Ashish; Ajami, Hoori

    2018-01-01

    Recent studies have identified the importance of vegetation processes in terrestrial hydrologic systems. Process-based ecohydrological models combine hydrological, physical, biochemical and ecological processes of the catchments, and as such are generally more complex and parametric than conceptual hydrological models. Thus, appropriate calibration objectives and model uncertainty analysis are essential for ecohydrological modeling. In recent years, Bayesian inference has become one of the most popular tools for quantifying the uncertainties in hydrological modeling with the development of Markov chain Monte Carlo (MCMC) techniques. The Bayesian approach offers an appealing alternative to traditional multi-objective hydrologic model calibrations by defining proper prior distributions that can be considered analogous to the ad-hoc weighting often prescribed in multi-objective calibration. Our study aims to develop appropriate prior distributions and likelihood functions that minimize the model uncertainties and bias within a Bayesian ecohydrological modeling framework based on a traditional Pareto-based model calibration technique. In our study, a Pareto-based multi-objective optimization and a formal Bayesian framework are implemented in a conceptual ecohydrological model that combines a hydrological model (HYMOD) and a modified Bucket Grassland Model (BGM). Simulations focused on one objective (streamflow/LAI) and multiple objectives (streamflow and LAI) with different emphasis defined via the prior distribution of the model error parameters. Results show more reliable outputs for both predicted streamflow and LAI using Bayesian multi-objective calibration with specified prior distributions for error parameters based on results from the Pareto front in the ecohydrological modeling. The methodology implemented here provides insight into the usefulness of multiobjective Bayesian calibration for ecohydrologic systems and the importance of appropriate prior
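The idea of letting prior distributions on error parameters play the role of multi-objective weights can be sketched with a toy two-output model and random-walk Metropolis sampling. The model, priors and synthetic data below are invented for illustration; the paper itself couples HYMOD with a grassland model and calibrates against streamflow and LAI:

```python
import numpy as np

rng = np.random.default_rng(0)

# toy "model": y1 = a * x (streamflow-like), y2 = b * sqrt(x) (LAI-like)
x = np.linspace(1.0, 10.0, 40)
obs1 = 2.0 * x + rng.normal(0.0, 0.5, x.size)
obs2 = 1.5 * np.sqrt(x) + rng.normal(0.0, 0.2, x.size)

def log_posterior(theta):
    a, b, log_s1, log_s2 = theta
    s1, s2 = np.exp(log_s1), np.exp(log_s2)
    ll1 = -0.5 * np.sum(((obs1 - a * x) / s1) ** 2) - x.size * log_s1
    ll2 = -0.5 * np.sum(((obs2 - b * np.sqrt(x)) / s2) ** 2) - x.size * log_s2
    # weakly informative priors on the error scales act like objective weights
    lp = -0.5 * (log_s1 ** 2 + log_s2 ** 2)
    return ll1 + ll2 + lp

# random-walk Metropolis over (a, b, log_s1, log_s2)
theta = np.array([1.0, 1.0, 0.0, 0.0])
lp = log_posterior(theta)
samples = []
for _ in range(20000):
    prop = theta + rng.normal(0.0, 0.02, 4)
    lp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)
post = np.array(samples)[10000:]          # discard burn-in
print(post[:, :2].mean(axis=0))
```

Tightening or loosening the priors on log_s1 and log_s2 shifts the effective weight given to each objective, which is the Bayesian analogue of choosing a point on the Pareto front.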

  18. Modelling subject-specific childhood growth using linear mixed-effect models with cubic regression splines

    Directory of Open Access Journals (Sweden)

    Laura M. Grajeda

    2016-01-01

    Full Text Available Abstract Background Childhood growth is a cornerstone of pediatric research. Statistical models need to consider individual trajectories to adequately describe growth outcomes. Specifically, well-defined longitudinal models are essential to characterize both population and subject-specific growth. Linear mixed-effect models with cubic regression splines can account for the nonlinearity of growth curves and provide reasonable estimators of population and subject-specific growth, velocity and acceleration. Methods We provide a stepwise approach that builds from simple to complex models, and account for the intrinsic complexity of the data. We start with standard cubic splines regression models and build up to a model that includes subject-specific random intercepts and slopes and residual autocorrelation. We then compared cubic regression splines vis-à-vis linear piecewise splines, and with varying number of knots and positions. Statistical code is provided to ensure reproducibility and improve dissemination of methods. Models are applied to longitudinal height measurements in a cohort of 215 Peruvian children followed from birth until their fourth year of life. Results Unexplained variability, as measured by the variance of the regression model, was reduced from 7.34 when using ordinary least squares to 0.81 (p < 0.001) when using a linear mixed-effect model with random slopes and a first order continuous autoregressive error term. There was substantial heterogeneity in both the intercept (p < 0.001) and slopes (p < 0.001) of the individual growth trajectories. We also identified important serial correlation within the structure of the data (ρ = 0.66; 95 % CI 0.64 to 0.68; p < 0.001), which we modeled with a first order continuous autoregressive error term as evidenced by the variogram of the residuals and by a lack of association among residuals. The final model provides a parametric linear regression equation for both estimation and
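The cubic regression spline component of the models above can be sketched at the population level with a truncated-power basis fitted by ordinary least squares; the subject-specific random effects and the continuous autoregressive error term require mixed-model software (e.g., nlme in R, which the paper's published code uses) and are omitted here. All data below are simulated, not the Peruvian cohort:

```python
import numpy as np

def cubic_spline_basis(age, knots):
    """Truncated-power cubic spline basis: 1, age, age^2, age^3, plus
    (age - k)^3_+ for each interior knot (a common construction; the
    paper's exact basis may differ)."""
    cols = [np.ones_like(age), age, age ** 2, age ** 3]
    cols += [np.clip(age - k, 0.0, None) ** 3 for k in knots]
    return np.column_stack(cols)

rng = np.random.default_rng(1)
age = np.sort(rng.uniform(0.0, 4.0, 200))          # years
truth = 50.0 + 20.0 * np.log1p(age)                # smooth growth curve (cm)
height = truth + rng.normal(0.0, 1.0, age.size)    # noisy measurements

X = cubic_spline_basis(age, knots=[1.0, 2.0, 3.0])
beta, *_ = np.linalg.lstsq(X, height, rcond=None)
fitted = X @ beta
rmse = float(np.sqrt(np.mean((fitted - truth) ** 2)))
print(rmse)
```

Differentiating the fitted basis once and twice gives velocity and acceleration curves, which is how spline growth models recover those quantities.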

  19. Joint Modeling of Multiple Crimes: A Bayesian Spatial Approach

    Directory of Open Access Journals (Sweden)

    Hongqiang Liu

    2017-01-01

    Full Text Available A multivariate Bayesian spatial modeling approach was used to jointly model the counts of two types of crime, i.e., burglary and non-motor vehicle theft, and explore the geographic pattern of crime risks and relevant risk factors. In contrast to the univariate model, which assumes independence across outcomes, the multivariate approach takes into account potential correlations between crimes. Six independent variables are included in the model as potential risk factors. In order to fully present this method, both the multivariate model and its univariate counterpart are examined. We fitted the two models to the data and assessed them using the deviance information criterion. A comparison of the results from the two models indicates that the multivariate model was superior to the univariate model. Our results show that population density and bar density are clearly associated with both burglary and non-motor vehicle theft risks and indicate a close relationship between these two types of crime. The posterior means and 2.5% percentile of type-specific crime risks estimated by the multivariate model were mapped to uncover the geographic patterns. The implications, limitations and future work of the study are discussed in the concluding section.
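The motivation for joint modeling — correlation between crime types induced by shared area-level risk factors — can be made concrete with a toy simulation in which a common random effect drives both Poisson counts. All numbers are invented:

```python
import numpy as np

rng = np.random.default_rng(3)
n_areas = 500
# common area-level log-risk (e.g., driven by population density, bar density)
shared = rng.normal(0.0, 0.5, n_areas)
burglary = rng.poisson(np.exp(1.0 + shared))
theft = rng.poisson(np.exp(0.8 + shared))

corr = np.corrcoef(burglary, theft)[0, 1]
print(round(corr, 2))
```

A univariate model treats the two outcomes as independent and ignores this correlation; the multivariate Bayesian model in the record borrows strength across outcomes, which is why it scores better on the deviance information criterion.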

  20. Evaluation of Workflow Management Systems - A Meta Model Approach

    Directory of Open Access Journals (Sweden)

    Michael Rosemann

    1998-11-01

    Full Text Available The automated enactment of processes through the use of workflow management systems enables the outsourcing of the control flow from application systems. By now, a large number of systems that follow different workflow paradigms are available. This leads to the problem of selecting the appropriate workflow management system for a given situation. In this paper we outline the benefits of a meta model approach for the evaluation and comparison of different workflow management systems. After a general introduction on the topic of meta modeling, the meta models of the workflow management systems WorkParty (Siemens Nixdorf) and FlowMark (IBM) are compared as an example. These product specific meta models can be generalized to meta reference models, which helps to specify a workflow methodology. As an example, an organisational reference meta model is presented, which helps users in specifying their requirements for a workflow management system.

  1. Thematic report: Macroeconomic models including specifically social and environmental aspects

    OpenAIRE

    Kratena, Kurt

    2015-01-01

    WWWforEurope Deliverable No. 8, 30 pages A significant reduction of the global environmental consequences of European consumption and production activities are the main objective of the policy simulations carried out in this paper. For this purpose three different modelling approaches have been chosen. Two macroeconomic models following the philosophy of consistent stock-flow accounting for the main institutional sectors (households, firms, banks, central bank and government) are used for...

  2. An Integrated Framework to Specify Domain-Specific Modeling Languages

    DEFF Research Database (Denmark)

    Zarrin, Bahram; Baumeister, Hubert

    2018-01-01

    In this paper, we propose an integrated framework that can be used by DSL designers to implement their desired graphical domain-specific languages. This framework relies on Microsoft DSL Tools, a meta-modeling framework to build graphical domain-specific languages, and an extension of ForSpec, a ...... language to define their semantics. Integrating these technologies under the umbrella of Microsoft Visual Studio IDE allows DSL designers to utilize a single development environment for developing their desired domain-specific languages....

  3. Accounting for standard errors of vision-specific latent trait in regression models.

    Science.gov (United States)

    Wong, Wan Ling; Li, Xiang; Li, Jialiang; Wong, Tien Yin; Cheng, Ching-Yu; Lamoureux, Ecosse L

    2014-07-11

    To demonstrate the effectiveness of the Hierarchical Bayesian (HB) approach in a modeling framework for association effects that accounts for SEs of vision-specific latent traits assessed using Rasch analysis. A systematic literature review was conducted in four major ophthalmic journals to evaluate Rasch analysis performed on vision-specific instruments. The HB approach was used to synthesize the Rasch model and multiple linear regression model for the assessment of the association effects related to vision-specific latent traits. This novel HB one-stage "joint-analysis" approach allows all model parameters to be estimated simultaneously; its effectiveness was compared in our simulation study with the frequently used two-stage "separate-analysis" approach (Rasch analysis followed by traditional statistical analyses without adjustment for the SE of the latent trait). Sixty-six reviewed articles performed evaluation and validation of vision-specific instruments using Rasch analysis, and 86.4% (n = 57) performed further statistical analyses on the Rasch-scaled data using traditional statistical methods; none took into consideration SEs of the estimated Rasch-scaled scores. The two models on real data differed for effect size estimations and the identification of "independent risk factors." Simulation results showed that our proposed HB one-stage "joint-analysis" approach produces greater accuracy (average of 5-fold decrease in bias) with comparable power and precision in estimation of associations when compared with the frequently used two-stage "separate-analysis" procedure despite accounting for greater uncertainty due to the latent trait. Patient-reported data, using Rasch analysis techniques, do not take into account the SE of latent trait in association analyses. The HB one-stage "joint-analysis" is a better approach, producing accurate effect size estimations and information about the independent association of exposure variables with vision-specific latent traits

  4. Datamining approaches for modeling tumor control probability.

    Science.gov (United States)

    Naqa, Issam El; Deasy, Joseph O; Mu, Yi; Huang, Ellen; Hope, Andrew J; Lindsay, Patricia E; Apte, Aditya; Alaly, James; Bradley, Jeffrey D

    2010-11-01

    Tumor control probability (TCP) to radiotherapy is determined by complex interactions between tumor biology, tumor microenvironment, radiation dosimetry, and patient-related variables. The complexity of these heterogeneous variable interactions constitutes a challenge for building predictive models for routine clinical practice. We describe a datamining framework that can unravel the higher order relationships among dosimetric dose-volume prognostic variables, interrogate various radiobiological processes, and generalize to unseen data when applied prospectively. Several datamining approaches are discussed that include dose-volume metrics, equivalent uniform dose, mechanistic Poisson model, and model building methods using statistical regression and machine learning techniques. Institutional datasets of non-small cell lung cancer (NSCLC) patients are used to demonstrate these methods. The performance of the different methods was evaluated using bivariate Spearman rank correlations (rs). Over-fitting was controlled via resampling methods. Using a dataset of 56 patients with primary NSCLC tumors and 23 candidate variables, we estimated GTV volume and V75 to be the best model parameters for predicting TCP using statistical resampling and a logistic model. Using these variables, the support vector machine (SVM) kernel method provided superior performance for TCP prediction with an rs=0.68 on leave-one-out testing compared to logistic regression (rs=0.4), Poisson-based TCP (rs=0.33), and cell kill equivalent uniform dose model (rs=0.17). The prediction of treatment response can be improved by utilizing datamining approaches, which are able to unravel important non-linear complex interactions among model variables and have the capacity to predict on unseen data for prospective clinical applications.
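The evaluation metric in this record is the bivariate Spearman rank correlation (rs). As an illustration of that metric only (the function names are ours, not from the paper), a minimal stdlib-only sketch with average ranks for ties:

```python
def rank(values):
    # Average ranks for ties; rank 1 corresponds to the smallest value.
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of 1-based positions i..j
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    # Pearson correlation of the rank-transformed data.
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

Because Spearman correlation works on ranks, any monotone (even non-linear) relationship scores 1.0, which is why it suits dose-response comparisons.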

  5. Multilevel Models for the Analysis of Angle-Specific Torque Curves with Application to Master Athletes

    Science.gov (United States)

    Carvalho, Humberto M

    2015-01-01

    The aim of this paper was to outline a multilevel modeling approach to fit individual angle-specific torque curves describing concentric knee extension and flexion isokinetic muscular actions in Master athletes. The potential of the analytical approach to examine between individual differences across the angle-specific torque curves was illustrated including between-individuals variation due to gender differences at a higher level. Torques in concentric muscular actions of knee extension and flexion at 60º·s−1 were considered within a range of motion between 5º and 85º (only torques “truly” isokinetic). Multilevel time series models with autoregressive covariance structures were superior fits compared with standard multilevel models for repeated measures in fitting angle-specific torque curves. Third and fourth order polynomial models were the best fits to describe angle-specific torque curves of isokinetic knee flexion and extension concentric actions, respectively. The fixed exponents allow interpretations for initial acceleration, the angle at peak torque and the decrement of torque after peak torque. Also, the multilevel models were flexible to illustrate the influence of gender differences on the shape of torque throughout the range of motion and in the shape of the curves. The presented multilevel regression models may afford a general framework to examine angle-specific moment curves by isokinetic dynamometry, and add to the understanding of the mechanisms of strength development, particularly the force-length relationship, both related to performance and injury prevention. PMID:26839603
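The curves in this record are described by third- and fourth-order polynomials. A stdlib-only sketch of the underlying operation, least-squares polynomial fitting via the normal equations (this is only the fixed-effects idea; the authors' multilevel models additionally include random effects and autoregressive covariance structures):

```python
def polyfit(xs, ys, degree):
    # Least-squares polynomial fit: solve (X^T X) c = X^T y by Gaussian elimination.
    n = degree + 1
    A = [[sum(x ** (i + j) for x in xs) for j in range(n)] for i in range(n)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(n)]
    for col in range(n):
        # Partial pivoting for numerical stability.
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coeffs = [0.0] * n
    for r in range(n - 1, -1, -1):
        coeffs[r] = (b[r] - sum(A[r][c] * coeffs[c]
                                for c in range(r + 1, n))) / A[r][r]
    return coeffs  # c0 + c1*x + c2*x^2 + ...
```

Fitting torque against joint angle this way yields the coefficients whose interpretation (initial acceleration, angle at peak torque, post-peak decrement) the abstract describes.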

  6. Metal Mixture Modeling Evaluation project: 2. Comparison of four modeling approaches

    Science.gov (United States)

    Farley, Kevin J.; Meyer, Joe; Balistrieri, Laurie S.; DeSchamphelaere, Karl; Iwasaki, Yuichi; Janssen, Colin; Kamo, Masashi; Lofts, Steve; Mebane, Christopher A.; Naito, Wataru; Ryan, Adam C.; Santore, Robert C.; Tipping, Edward

    2015-01-01

    As part of the Metal Mixture Modeling Evaluation (MMME) project, models were developed by the National Institute of Advanced Industrial Science and Technology (Japan), the U.S. Geological Survey (USA), HDR⎪HydroQual, Inc. (USA), and the Centre for Ecology and Hydrology (UK) to address the effects of metal mixtures on biological responses of aquatic organisms. A comparison of the 4 models, as they were presented at the MMME Workshop in Brussels, Belgium (May 2012), is provided herein. Overall, the models were found to be similar in structure (free ion activities computed by WHAM; specific or non-specific binding of metals/cations in or on the organism; specification of metal potency factors and/or toxicity response functions to relate metal accumulation to biological response). Major differences in modeling approaches are attributed to various modeling assumptions (e.g., single versus multiple types of binding site on the organism) and specific calibration strategies that affected the selection of model parameters. The models provided a reasonable description of additive (or nearly additive) toxicity for a number of individual toxicity test results. Less-than-additive toxicity was more difficult to describe with the available models. Because of limitations in the available datasets and the strong inter-relationships among the model parameters (log KM values, potency factors, toxicity response parameters), further evaluation of specific model assumptions and calibration strategies is needed.

  7. Crime Modeling using Spatial Regression Approach

    Science.gov (United States)

    Saleh Ahmar, Ansari; Adiatma; Kasim Aidid, M.

    2018-01-01

    Criminal acts in Indonesia increase in both variety and quantity every year: murder, rape, assault, vandalism, theft, fraud, fencing, and other cases that make people feel unsafe. The risk of society being exposed to crime is measured by the number of cases reported to the police; the higher the number of reports to the police, the higher the crime in the region. In this research, criminality in South Sulawesi, Indonesia is modelled with society's exposure to the risk of crime as the dependent variable. Modelling was done with an area-based approach using the Spatial Autoregressive (SAR) and Spatial Error Model (SEM) methods. The independent variables used are population density, the number of poor inhabitants, GDP per capita, unemployment and the human development index (HDI). The spatial regression analysis shows that there are no spatial dependencies, in either the lag or the errors, in South Sulawesi.
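A standard preliminary diagnostic for the spatial dependence that SAR/SEM models target is Moran's I. A stdlib-only sketch (the sparse symmetric weights representation is our own choice, not from the record):

```python
def morans_i(values, weights):
    # weights: dict (i, j) -> w_ij, a symmetric spatial adjacency matrix.
    # I = (n / S0) * (sum_ij w_ij * d_i * d_j) / (sum_i d_i^2), d = deviation from mean.
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    num = sum(w * dev[i] * dev[j] for (i, j), w in weights.items())
    den = sum(d * d for d in dev)
    s0 = sum(weights.values())
    return (n / s0) * (num / den)
```

Values near +1 indicate clustering of similar values across neighbouring areas, values near -1 indicate checkerboard-like alternation, and values near the expectation -1/(n-1) suggest no spatial dependence, consistent with the record's finding for South Sulawesi.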

  8. Modeling the Development of Goal-Specificity in Mirror Neurons.

    Science.gov (United States)

    Thill, Serge; Svensson, Henrik; Ziemke, Tom

    2011-12-01

    Neurophysiological studies have shown that parietal mirror neurons encode not only actions but also the goal of these actions. Although some mirror neurons will fire whenever a certain action is perceived (goal-independently), most will only fire if the motion is perceived as part of an action with a specific goal. This result is important for the action-understanding hypothesis as it provides a potential neurological basis for such a cognitive ability. It is also relevant for the design of artificial cognitive systems, in particular robotic systems that rely on computational models of the mirror system in their interaction with other agents. Yet, to date, no computational model has explicitly addressed the mechanisms that give rise to both goal-specific and goal-independent parietal mirror neurons. In the present paper, we present a computational model based on a self-organizing map, which receives artificial inputs representing information about both the observed or executed actions and the context in which they were executed. We show that the map develops a biologically plausible organization in which goal-specific mirror neurons emerge. We further show that the fundamental cause for both the appearance and the number of goal-specific neurons can be found in geometric relationships between the different inputs to the map. The results are important to the action-understanding hypothesis as they provide a mechanism for the emergence of goal-specific parietal mirror neurons and lead to a number of predictions: (1) Learning of new goals may mostly reassign existing goal-specific neurons rather than recruit new ones; (2) input differences between executed and observed actions can explain observed corresponding differences in the number of goal-specific neurons; and (3) the percentage of goal-specific neurons may differ between motion primitives.
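The model in this record is built on a self-organizing map. A deliberately minimal 1-D SOM trainer in plain Python, to illustrate the mechanism only (the hard step neighbourhood, constant learning rate, and toy inputs are simplifications of ours, not the paper's setup):

```python
import random

def train_som(data, n_units, epochs=50, lr=0.5, seed=0):
    # data: list of feature vectors; units lie on a 1-D grid.
    rng = random.Random(seed)
    dim = len(data[0])
    units = [[rng.random() for _ in range(dim)] for _ in range(n_units)]
    for epoch in range(epochs):
        # Neighbourhood radius shrinks over training, down to 1 grid step.
        radius = max(1.0, n_units / 2 * (1 - epoch / epochs))
        for x in data:
            # Best-matching unit: smallest squared distance to the input.
            bmu = min(range(n_units),
                      key=lambda u: sum((units[u][d] - x[d]) ** 2
                                        for d in range(dim)))
            for u in range(n_units):
                # Hard step neighbourhood: update only units near the BMU.
                if abs(u - bmu) <= radius:
                    for d in range(dim):
                        units[u][d] += lr * (x[d] - units[u][d])
    return units
```

After training, inputs from distinct regions of the input space recruit distinct map units; in the paper's setting, combined action-plus-context inputs analogously give rise to goal-specific units.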

  9. REQUIREMENTS PARTICLE NETWORKS: AN APPROACH TO FORMAL SOFTWARE FUNCTIONAL REQUIREMENTS MODELLING

    OpenAIRE

    Wiwat Vatanawood; Wanchai Rivepiboon

    2001-01-01

    In this paper, an approach to software functional requirements modelling using requirements particle networks is presented. In our approach, a set of requirements particles is defined as an essential tool to construct a visual model of software functional requirements specification during the software analysis phase and the relevant formal specification is systematically generated without the experience of writing formal specification. A number of algorithms are presented to perform these for...

  10. MERGING DIGITAL SURFACE MODELS IMPLEMENTING BAYESIAN APPROACHES

    Directory of Open Access Journals (Sweden)

    H. Sadeq

    2016-06-01

    Full Text Available In this research different DSMs from different sources have been merged. The merging is based on a probabilistic model using a Bayesian Approach. The implemented data have been sourced from very high resolution satellite imagery sensors (e.g. WorldView-1 and Pleiades. It is deemed preferable to use a Bayesian Approach when the data obtained from the sensors are limited and it is difficult to obtain many measurements or it would be very costly, thus the problem of the lack of data can be solved by introducing a priori estimations of data. To infer the prior data, it is assumed that the roofs of the buildings are specified as smooth, and for that purpose local entropy has been implemented. In addition to the a priori estimations, GNSS RTK measurements have been collected in the field which are used as check points to assess the quality of the DSMs and to validate the merging result. The model has been applied in the West-End of Glasgow containing different kinds of buildings, such as flat roofed and hipped roofed buildings. Both quantitative and qualitative methods have been employed to validate the merged DSM. The validation results have shown that the model was successfully able to improve the quality of the DSMs and improving some characteristics such as the roof surfaces, which consequently led to better representations. In addition to that, the developed model has been compared with the well established Maximum Likelihood model and showed similar quantitative statistical results and better qualitative results. Although the proposed model has been applied on DSMs that were derived from satellite imagery, it can be applied to any other sourced DSMs.
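At a single pixel, Bayesian merging of height estimates under a Gaussian noise assumption reduces to precision weighting: posterior precision is the sum of the precisions, and the posterior mean is the precision-weighted mean, optionally including a prior height. A sketch under that Gaussian assumption (an illustration of the principle, not the authors' full entropy-based model):

```python
def merge_heights(z1, var1, z2, var2, prior_mean=None, prior_var=None):
    # Fuse independent Gaussian height estimates (e.g., two DSMs) and an
    # optional Gaussian prior into a posterior mean and variance.
    terms = [(z1, var1), (z2, var2)]
    if prior_mean is not None:
        terms.append((prior_mean, prior_var))
    precision = sum(1.0 / v for _, v in terms)
    mean = sum(z / v for z, v in terms) / precision
    return mean, 1.0 / precision
```

Note that the posterior variance is always smaller than either input variance, which is the sense in which merging "improves the quality" of the DSMs.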

  11. Medical Inpatient Journey Modeling and Clustering: A Bayesian Hidden Markov Model Based Approach.

    Science.gov (United States)

    Huang, Zhengxing; Dong, Wei; Wang, Fei; Duan, Huilong

    2015-01-01

    Modeling and clustering medical inpatient journeys is useful to healthcare organizations for a number of reasons, including reorganizing inpatient journeys into a form that is more convenient to understand and browse. In this study, we present a probabilistic model-based approach to model and cluster medical inpatient journeys. Specifically, we exploit a Bayesian Hidden Markov Model based approach to transform medical inpatient journeys into a probabilistic space, which can be seen as a richer representation of inpatient journeys to be clustered. Then, using hierarchical clustering on the matrix of similarities, inpatient journeys can be clustered into different categories with respect to their clinical and temporal characteristics. We evaluated the proposed approach on a real clinical data set pertaining to the unstable angina treatment process. The experimental results reveal that our method can identify and model latent treatment topics underlying in personalized inpatient journeys, and yield impressive clustering quality.
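The core operation behind mapping a sequence into "a probabilistic space" is scoring it against an HMM, which is the forward algorithm. A log-space sketch in plain Python (a toy dense-matrix representation of our own, not the authors' Bayesian implementation):

```python
import math

def _logsumexp(xs):
    # Numerically stable log(sum(exp(x))).
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))

def log_likelihood(obs, start, trans, emit):
    # Forward algorithm: log P(obs | HMM).
    # start[s], trans[s][t], emit[s][o] are probabilities.
    n_states = len(start)
    alpha = [math.log(start[s]) + math.log(emit[s][obs[0]])
             for s in range(n_states)]
    for o in obs[1:]:
        alpha = [
            math.log(emit[t][o]) + _logsumexp(
                [alpha[s] + math.log(trans[s][t]) for s in range(n_states)])
            for t in range(n_states)
        ]
    return _logsumexp(alpha)
```

Scoring every journey under every candidate model yields a likelihood matrix; hierarchical clustering of the similarities derived from it then groups the journeys, as the abstract describes.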

  12. Age and gender specific biokinetic model for strontium in humans

    Energy Technology Data Exchange (ETDEWEB)

    Shagina, N. B.; Tolstykh, E. I.; Degteva, M. O.; Anspaugh, L. R.; Napier, Bruce A.

    2015-03-01

    A biokinetic model for strontium in humans is necessary for quantification of internal doses due to strontium radioisotopes. The ICRP-recommended biokinetic model for strontium has limitation for use in a population study, because it is not gender specific and does not cover all age ranges. The extensive Techa River data set on 90Sr in humans (tens of thousands of measurements) is a unique source of data on long-term strontium retention for men and women of all ages at intake. These, as well as published data, were used for evaluation of age- and gender-specific parameters for a new compartment biokinetic model for strontium (Sr-AGe model). The Sr-AGe model has similar structure as the ICRP model for the alkaline earth elements. The following parameters were mainly reevaluated: gastro-intestinal absorption and parameters related to the processes of bone formation and resorption defining calcium and strontium transfers in skeletal compartments. The Sr-AGe model satisfactorily describes available data sets on strontium retention for different kinds of intake (dietary and intravenous) at different ages (0–80 years old) and demonstrates good agreement with data sets for different ethnic groups. The Sr-AGe model can be used for dose assessment in epidemiological studies of general population exposed to ingested strontium radioisotopes.
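Compartment biokinetic models of this kind are systems of first-order transfers. A toy two-compartment sketch with invented rate constants (the actual Sr-AGe model has many more compartments and age- and gender-specific parameters; this only illustrates the class of model):

```python
def simulate(intake_rate, k_plasma_to_bone, k_bone_to_plasma, k_excretion,
             days, dt=0.1):
    # Forward-Euler integration of a two-compartment first-order model:
    # plasma gains dietary intake and bone resorption, loses skeletal
    # uptake and excretion; bone exchanges with plasma only.
    plasma, bone = 0.0, 0.0
    for _ in range(int(days / dt)):
        uptake = k_plasma_to_bone * plasma
        resorb = k_bone_to_plasma * bone
        excrete = k_excretion * plasma
        plasma += dt * (intake_rate - uptake - excrete + resorb)
        bone += dt * (uptake - resorb)
    return plasma, bone
```

At steady state the plasma level settles at intake/k_excretion and bone at (k_plasma_to_bone/k_bone_to_plasma) times the plasma level, which is how such models relate chronic intake to long-term skeletal retention.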

  13. Modeling software with finite state machines a practical approach

    CERN Document Server

    Wagner, Ferdinand; Wagner, Thomas; Wolstenholme, Peter

    2006-01-01

    Modeling Software with Finite State Machines: A Practical Approach explains how to apply finite state machines to software development. It provides a critical analysis of using finite state machines as a foundation for executable specifications to reduce software development effort and improve quality. This book discusses the design of a state machine and of a system of state machines. It also presents a detailed analysis of development issues relating to behavior modeling with design examples and design rules for using finite state machines. This volume describes a coherent and well-tested fr
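A table-driven finite state machine of the kind the book advocates can be sketched in a few lines; the transition table maps (state, event) pairs to (next state, action). The door example below is ours, not from the book:

```python
class StateMachine:
    # Minimal table-driven FSM: transitions maps (state, event) -> (state, action).
    def __init__(self, initial, transitions):
        self.state = initial
        self.transitions = transitions

    def handle(self, event):
        key = (self.state, event)
        if key not in self.transitions:
            return None  # event is ignored in the current state
        self.state, action = self.transitions[key]
        return action

door = StateMachine("closed", {
    ("closed", "open"): ("opened", "unlock"),
    ("opened", "close"): ("closed", "lock"),
})
```

Keeping behavior in a declarative table rather than scattered conditionals is what makes the state machine an executable specification that can be reviewed and tested directly.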

  14. A Principled Approach to the Specification of System Architectures for Space Missions

    Science.gov (United States)

    McKelvin, Mark L. Jr.; Castillo, Robert; Bonanne, Kevin; Bonnici, Michael; Cox, Brian; Gibson, Corrina; Leon, Juan P.; Gomez-Mustafa, Jose; Jimenez, Alejandro; Madni, Azad

    2015-01-01

    Modern space systems are increasing in complexity and scale at an unprecedented pace. Consequently, innovative methods, processes, and tools are needed to cope with the increasing complexity of architecting these systems. A key systems challenge in practice is the ability to scale processes, methods, and tools used to architect complex space systems. Traditionally, the process for specifying space system architectures has largely relied on capturing the system architecture in informal descriptions that are often embedded within loosely coupled design documents and domain expertise. Such informal descriptions often lead to misunderstandings between design teams, ambiguous specifications, difficulty in maintaining consistency as the architecture evolves throughout the system development life cycle, and costly design iterations. Therefore, traditional methods are becoming increasingly inefficient to cope with ever-increasing system complexity. We apply the principles of component-based design and platform-based design to the development of the system architecture for a practical space system to demonstrate feasibility of our approach using SysML. Our results show that we are able to apply a systematic design method to manage system complexity, thus enabling effective data management, semantic coherence and traceability across different levels of abstraction in the design chain. Just as important, our approach enables interoperability among heterogeneous tools in a concurrent engineering model based design environment.

  15. A conceptual model specification language (CMSL Version 2)

    NARCIS (Netherlands)

    Wieringa, Roelf J.

    1992-01-01

    Version 2 of a language (CMSL) to specify conceptual models is defined. CMSL consists of two parts, the value specification language VSL and the object specification language OSL. There is a formal semantics and an inference system for CMSL but research on this still continues. A method for

  16. Modeling growth of specific spoilage organisms in tilapia ...

    African Journals Online (AJOL)

    enoh

    2012-03-29

    Tilapia is an important aquatic fish, but severe spoilage of tilapia is most likely related to the global aquaculture. The spoilage is mostly caused by specific spoilage organisms (SSO). Therefore, it is very important to use microbial models to predict the growth of SSO in tilapia. This study firstly verified.

  17. Modeling growth of specific spoilage organisms in tilapia ...

    African Journals Online (AJOL)

    Tilapia is an important aquatic fish, but severe spoilage of tilapia is most likely related to the global aquaculture. The spoilage is mostly caused by specific spoilage organisms (SSO). Therefore, it is very important to use microbial models to predict the growth of SSO in tilapia. This study firstly verified Pseudomonas and Vibrio ...

  18. Verifying OCL specifications of UML models : tool support and compositionality

    NARCIS (Netherlands)

    Kyas, Marcel

    2006-01-01

    The Unified Modelling Language (UML) and the Object Constraint Language (OCL) serve as specification languages for embedded and real-time systems used in a safety-critical environment. In this dissertation class diagrams, object diagrams, and OCL constraints are formalised. The formalisation

  19. Altering coenzyme specificity of Pichia stipitis xylose reductase by the semi-rational approach CASTing

    Directory of Open Access Journals (Sweden)

    Zhang Jingqing

    2007-11-01

    Full Text Available Abstract Background The NAD(P)H-dependent Pichia stipitis xylose reductase (PsXR) is one of the key enzymes for xylose fermentation, and has been cloned into the commonly used ethanol-producing yeast Saccharomyces cerevisiae. In order to eliminate the redox imbalance resulting from the preference of this enzyme toward NADPH, efforts have been made to alter the coenzyme specificity of PsXR by site-directed mutagenesis, with limited success. Given the industrial importance of PsXR, it is of interest to investigate further ways to create mutants of PsXR that prefer NADH rather than NADPH, by the alternative directed evolution approach. Results Based on a homology model of PsXR, six residues were predicted to interact with the adenine ribose of NAD(P)H in PsXR and altered using a semi-rational mutagenesis approach (CASTing). Three rounds of saturation mutagenesis were carried out to randomize these residues, and a microplate-based assay was applied in the screening. A best mutant 2-2C12, which carried four mutations K270S, N272P, S271G and R276F, was obtained. The mutant showed a preference toward NADH over NADPH by a factor of about 13-fold, or an improvement of about 42-fold, as measured by the ratio of the specificity constant kcat/Km with respect to the coenzyme. Compared with the wild-type, the kcat with NADH for the best mutant was only slightly lower, while the kcat with NADPH decreased by a factor of about 10. Furthermore, the specific activity of 2-2C12 in the presence of NADH was 20.6 U·mg-1, which is highest among PsXR mutants reported. Conclusion A seemingly simplistic and yet very effective mutagenesis approach, CASTing, was applied successfully to alter the NAD(P)H preference for Pichia stipitis xylose reductase, an important enzyme for xylose-fermenting yeast. The observed change in the NAD(P)H preference for this enzyme seems to have resulted from the altered active site that is more unfavorable for NADPH than NADH in terms of both Km and kcat. There are potentials for

  20. A nationwide modelling approach to decommissioning - 16182

    International Nuclear Information System (INIS)

    Kelly, Bernard; Lowe, Andy; Mort, Paul

    2009-01-01

    In this paper we describe a proposed UK national approach to modelling decommissioning. For the first time, we shall have an insight into optimizing the safety and efficiency of a national decommissioning strategy. To do this we use the General Case Integrated Waste Algorithm (GIA), a universal model of decommissioning nuclear plant, power plant, waste arisings and the associated knowledge capture. The model scales from individual items of plant through cells, groups of cells, buildings, whole sites and then on up to a national scale. We describe the national vision for GIA which can be broken down into three levels: 1) the capture of the chronological order of activities that an experienced decommissioner would use to decommission any nuclear facility anywhere in the world - this is Level 1 of GIA; 2) the construction of an Operational Research (OR) model based on Level 1 to allow 'what if' scenarios to be tested quickly (Level 2); 3) the construction of a state of the art knowledge capture capability that allows future generations to learn from our current decommissioning experience (Level 3). We show the progress to date in developing GIA in levels 1 and 2. As part of level 1, GIA has assisted in the development of an IMechE professional decommissioning qualification. Furthermore, we describe GIA as the basis of a UK-owned database of decommissioning norms for such things as costs, productivity, durations, etc. From level 2, we report on a pilot study that has successfully tested the basic principles for the OR numerical simulation of the algorithm. We then highlight the advantages of applying the OR modelling approach nationally. In essence, a series of 'what if...' scenarios can be tested that will improve the safety and efficiency of decommissioning. (authors)

  1. A multiscale approach for modeling atherosclerosis progression.

    Science.gov (United States)

    Exarchos, Konstantinos P; Carpegianni, Clara; Rigas, Georgios; Exarchos, Themis P; Vozzi, Federico; Sakellarios, Antonis; Marraccini, Paolo; Naka, Katerina; Michalis, Lambros; Parodi, Oberdan; Fotiadis, Dimitrios I

    2015-03-01

    Progression of atherosclerotic process constitutes a serious and quite common condition due to accumulation of fatty materials in the arterial wall, consequently posing serious cardiovascular complications. In this paper, we assemble and analyze a multitude of heterogeneous data in order to model the progression of atherosclerosis (ATS) in coronary vessels. The patient's medical record, biochemical analytes, monocyte information, adhesion molecules, and therapy-related data comprise the input for the subsequent analysis. As indicator of coronary lesion progression, two consecutive coronary computed tomography angiographies have been evaluated in the same patient. To this end, a set of 39 patients is studied using a twofold approach, namely, baseline analysis and temporal analysis. The former approach employs baseline information in order to predict the future state of the patient (in terms of progression of ATS). The latter is based on an approach encompassing dynamic Bayesian networks whereby snapshots of the patient's status over the follow-up are analyzed in order to model the evolvement of ATS, taking into account the temporal dimension of the disease. The quantitative assessment of our work has resulted in 93.3% accuracy for the case of baseline analysis, and 83% overall accuracy for the temporal analysis, in terms of modeling and predicting the evolvement of ATS. It should be noted that the application of the SMOTE algorithm for handling class imbalance and the subsequent evaluation procedure might have introduced an overestimation of the performance metrics, due to the employment of synthesized instances. The most prominent features found to play a substantial role in the progression of the disease are: diabetes, cholesterol and cholesterol/HDL. Among novel markers, the CD11b marker of leukocyte integrin complex is associated with coronary plaque progression.

  2. IDENTIFYING CANCER SPECIFIC METABOLIC SIGNATURES USING CONSTRAINT-BASED MODELS.

    Science.gov (United States)

    Schultz, A; Mehta, S; Hu, C W; Hoff, F W; Horton, T M; Kornblau, S M; Qutub, A A

    2017-01-01

    Cancer metabolism differs remarkably from the metabolism of healthy surrounding tissues, and it is extremely heterogeneous across cancer types. While these metabolic differences provide promising avenues for cancer treatments, much work remains to be done in understanding how metabolism is rewired in malignant tissues. To that end, constraint-based models provide a powerful computational tool for the study of metabolism at the genome scale. To generate meaningful predictions, however, these generalized human models must first be tailored for specific cell or tissue sub-types. Here we first present two improved algorithms for (1) the generation of these context-specific metabolic models based on omics data, and (2) Monte-Carlo sampling of the metabolic model flux space. By applying these methods to generate and analyze context-specific metabolic models of diverse solid cancer cell line data, and primary leukemia pediatric patient biopsies, we demonstrate how the methodology presented in this study can generate insights into the rewiring differences across solid tumors and blood cancers.
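Monte-Carlo sampling of a flux space means drawing points from the convex polytope defined by the model's linear constraints; hit-and-run samplers are the standard family of methods for this. A 2-D sketch over half-space constraints (a toy polytope of ours, not a genome-scale model, and not the authors' specific algorithm):

```python
import random

def hit_and_run(constraints, x0, n_samples, seed=0):
    # Sample from {x : a . x <= b for each (a, b) in constraints},
    # starting from an interior point x0.
    rng = random.Random(seed)
    x = list(x0)
    dim = len(x0)
    samples = []
    for _ in range(n_samples):
        # Random direction on the unit sphere.
        d = [rng.gauss(0, 1) for _ in range(dim)]
        norm = sum(v * v for v in d) ** 0.5
        d = [v / norm for v in d]
        # Intersect the line x + t*d with every half-space constraint.
        t_lo, t_hi = -1e9, 1e9
        for a, b in constraints:
            ad = sum(ai * di for ai, di in zip(a, d))
            ax = sum(ai * xi for ai, xi in zip(a, x))
            if abs(ad) < 1e-12:
                continue
            t = (b - ax) / ad
            if ad > 0:
                t_hi = min(t_hi, t)
            else:
                t_lo = max(t_lo, t)
        # Jump to a uniform point on the feasible segment.
        t = rng.uniform(t_lo, t_hi)
        x = [xi + t * di for xi, di in zip(x, d)]
        samples.append(list(x))
    return samples
```

The resulting sample characterizes the distribution of feasible flux states rather than a single optimum, which is what makes flux-space sampling useful for comparing metabolic rewiring across contexts.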

  3. Modeling in transport phenomena a conceptual approach

    CERN Document Server

    Tosun, Ismail

    2007-01-01

    Modeling in Transport Phenomena, Second Edition presents and clearly explains with example problems the basic concepts and their applications to fluid flow, heat transfer, mass transfer, chemical reaction engineering and thermodynamics. A balanced approach is presented between analysis and synthesis, so that students will understand how to use the solution in engineering analysis. Systematic derivations of the equations and the physical significance of each term are given in detail, for students to easily understand and follow up the material. There is a strong incentive in science and engineering to

  4. Model approach brings multi-level success.

    Science.gov (United States)

    Howell, Mark

    2012-08-01

    In an article that first appeared in the US magazine Medical Construction & Design, Mark Howell, senior vice-president of Skanska USA Building, based in Seattle, describes the design and construction of a new nine-storey, 350,000 ft² extension to the Good Samaritan Hospital in Puyallup, Washington state. He explains how the use of an Integrated Project Delivery (IPD) approach by the key players, and extensive use of building information modelling (BIM), combined to deliver a healthcare facility that he believes should meet the needs of patients, families, and the clinical care team, 'well into the future'.

  5. XML for data representation and model specification in neuroscience.

    Science.gov (United States)

    Crook, Sharon M; Howell, Fred W

    2007-01-01

    EXtensible Markup Language (XML) technology provides an ideal representation for the complex structure of models and neuroscience data, as it is an open file format and provides a language-independent method for storing arbitrarily complex structured information. XML is composed of text and tags that explicitly describe the structure and semantics of the content of the document. In this chapter, we describe some of the common uses of XML in neuroscience, with case studies in representing neuroscience data and defining model descriptions based on examples from NeuroML. The specific methods that we discuss include (1) reading and writing XML from applications, (2) exporting XML from databases, (3) using XML standards to represent neuronal morphology data, (4) using XML to represent experimental metadata, and (5) creating new XML specifications for models.
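Points (1) and (2), reading and writing XML from applications, can be illustrated with Python's standard library. The element and attribute names below are invented for the sketch; they are not NeuroML:

```python
import xml.etree.ElementTree as ET

def write_cell(dest, cell_id, segments):
    # Build an XML tree for a hypothetical morphology format and serialize it.
    # segments: list of (segment_id, length_in_micrometres) pairs.
    root = ET.Element("cell", id=cell_id)
    for seg_id, length_um in segments:
        ET.SubElement(root, "segment", id=seg_id, length=str(length_um))
    ET.ElementTree(root).write(dest)

def read_segments(source):
    # Parse the document back and recover the structured content.
    root = ET.parse(source).getroot()
    return [(s.get("id"), float(s.get("length")))
            for s in root.findall("segment")]
```

Because tags and attributes carry the semantics explicitly, any XML-aware tool, in any language, can consume the same file, which is the interoperability argument the chapter makes.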

  6. Multilevel Models for the Analysis of Angle-Specific Torque Curves with Application to Master Athletes

    Directory of Open Access Journals (Sweden)

    Carvalho Humberto M.

    2015-12-01

    Full Text Available The aim of this paper was to outline a multilevel modeling approach to fit individual angle-specific torque curves describing concentric knee extension and flexion isokinetic muscular actions in Master athletes. The potential of the analytical approach to examine between-individual differences across the angle-specific torque curves was illustrated, including between-individual variation due to gender differences at a higher level. Torques in concentric muscular actions of knee extension and knee flexion at 60°·s-1 were considered within a range of motion between 5° and 85° (only torques that were “truly” isokinetic). Multilevel time series models with autoregressive covariance structures were superior fits compared with standard multilevel models for repeated measures when fitting angle-specific torque curves. Third and fourth order polynomial models were the best fits to describe angle-specific torque curves of isokinetic knee flexion and extension concentric actions, respectively. The fixed exponents allow interpretations for initial acceleration, the angle at peak torque and the decrement of torque after peak torque. The multilevel models were also flexible enough to illustrate the influence of gender differences on torque throughout the range of motion and on the shape of the curves. The presented multilevel regression models may afford a general framework to examine angle-specific moment curves obtained by isokinetic dynamometry, and add to the understanding of the mechanisms of strength development, particularly the force-length relationship, both related to performance and injury prevention.
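Setting the multilevel structure aside, the shape-description step can be sketched at the single-curve level: a fourth-order polynomial (the order used for the extension curves) fitted to an angle-torque record, from which the angle at peak torque is read off. All data and parameter values below are invented for illustration.

```python
import numpy as np

# Synthetic concentric knee-extension torques (Nm) over 5..85 degrees,
# generated from an invented smooth curve plus measurement noise.
rng = np.random.default_rng(0)
angle = np.linspace(5, 85, 41)
true_torque = 150 * np.exp(-((angle - 55.0) / 30.0) ** 2)  # peak near 55 deg
torque = true_torque + rng.normal(0, 2, angle.size)

# Fourth-order polynomial fit of torque as a function of joint angle.
coeffs = np.polyfit(angle, torque, deg=4)
poly = np.poly1d(coeffs)

# Angle at peak torque: maximize the fitted polynomial over the tested range.
grid = np.linspace(5, 85, 1601)
angle_at_peak = grid[np.argmax(poly(grid))]
print(round(angle_at_peak, 1))
```

A multilevel fit would additionally let the polynomial coefficients vary by individual (and, at a higher level, by gender), which is what allows the between-individual comparisons described in the abstract.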

  7. Patient Specific Modeling of Head-Up Tilt

    DEFF Research Database (Denmark)

    Williams, Nakeya; Wright, Andrew; Mehlsen, Jesper

    2014-01-01

    Short term cardiovascular responses to head-up tilt (HUT) experiments involve complex cardiovascular regulation in order to maintain blood pressure at homeostatic levels. This manuscript presents a patient specific compartmental model developed to predict dynamic changes in heart rate and arterial blood pressure. The model contains five compartments representing arteries and veins in the upper and lower body of the systemic circulation, as well as the left ventricle facilitating pumping of the heart. A physiologically based sub-model describes gravitational effects on pooling of blood during … that it is possible to estimate a subset of model parameters that allows prediction of observed changes in arterial blood pressure. Furthermore, the model adequately predicts arterial and venous blood pressures, as well as cardiac output in compartments for which data are not available.
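The compartmental idea can be illustrated with a deliberately reduced closed-loop model: one arterial and one venous compartment joined by a systemic resistance and driven by a constant cardiac output, integrated with forward Euler. All parameter values are invented for the sketch, not the patient-specific estimates of the paper.

```python
# Invented parameters: R in mmHg·s/ml, compliances in ml/mmHg, flow in ml/s.
Q = 83.0           # constant cardiac output (~5 L/min)
R = 1.1            # systemic resistance between the compartments
Ca, Cv = 1.5, 50.0 # arterial and venous compliances

pa, pv = 90.0, 5.0             # initial arterial / venous pressures (mmHg)
dt = 0.01
for _ in range(int(60 / dt)):  # integrate 60 s with forward Euler
    q_sys = (pa - pv) / R      # flow through the systemic resistance
    pa += dt * (Q - q_sys) / Ca
    pv += dt * (q_sys - Q) / Cv
print(round(pa - pv, 1))       # approaches the steady-state gradient Q * R
```

A patient-specific model adds more compartments, a ventricle sub-model, gravity-dependent pooling, and regulation of heart rate and resistance, and then estimates a parameter subset from measured data.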

  8. Pedagogic process modeling: Humanistic-integrative approach

    Directory of Open Access Journals (Sweden)

    Boritko Nikolaj M.

    2007-01-01

    Full Text Available The paper deals with some current problems of modeling the dynamics of the subject-features development of the individual. The term "process" is considered in the context of the humanistic-integrative approach, in which the principles of self education are regarded as criteria for efficient pedagogic activity. Four basic characteristics of the pedagogic process are pointed out: intentionality reflects logicality and regularity of the development of the process; discreteness (stageability) indicates qualitative stages through which the pedagogic phenomenon passes; nonlinearity explains the crisis character of pedagogic processes and reveals inner factors of self-development; situationality requires a selection of pedagogic conditions in accordance with the inner factors, which would enable steering the pedagogic process. Offered are two steps for singling out a particular stage and the algorithm for developing an integrative model for it. The suggested conclusions might be of use for further theoretic research, analyses of educational practices and for realistic predicting of pedagogical phenomena.

  9. Mathematical Formulation Requirements and Specifications for the Process Models

    International Nuclear Information System (INIS)

    Steefel, C.; Moulton, D.; Pau, G.; Lipnikov, K.; Meza, J.; Lichtner, P.; Wolery, T.; Bacon, D.; Spycher, N.; Bell, J.; Moridis, G.; Yabusaki, S.; Sonnenthal, E.; Zyvoloski, G.; Andre, B.; Zheng, L.; Davis, J.

    2010-01-01

    The Advanced Simulation Capability for Environmental Management (ASCEM) is intended to be a state-of-the-art scientific tool and approach for understanding and predicting contaminant fate and transport in natural and engineered systems. The ASCEM program is aimed at addressing critical EM program needs to better understand and quantify flow and contaminant transport behavior in complex geological systems. It will also address the long-term performance of engineered components including cementitious materials in nuclear waste disposal facilities, in order to reduce uncertainties and risks associated with DOE EM's environmental cleanup and closure activities. Building upon national capabilities developed from decades of Research and Development in subsurface geosciences, computational and computer science, modeling and applied mathematics, and environmental remediation, the ASCEM initiative will develop an integrated, open-source, high-performance computer modeling system for multiphase, multicomponent, multiscale subsurface flow and contaminant transport. This integrated modeling system will incorporate capabilities for predicting releases from various waste forms, identifying exposure pathways and performing dose calculations, and conducting systematic uncertainty quantification. The ASCEM approach will be demonstrated on selected sites, and then applied to support the next generation of performance assessments of nuclear waste disposal and facility decommissioning across the EM complex. The Multi-Process High Performance Computing (HPC) Simulator is one of three thrust areas in ASCEM. The other two are the Platform and Integrated Toolsets (dubbed the Platform) and Site Applications. The primary objective of the HPC Simulator is to provide a flexible and extensible computational engine to simulate the coupled processes and flow scenarios described by the conceptual models developed using the ASCEM Platform. 
The graded and iterative approach to assessments naturally

  10. Specific heat of a non-local attractive Hubbard model

    Energy Technology Data Exchange (ETDEWEB)

    Calegari, E.J., E-mail: eleonir@ufsm.br [Laboratório de Teoria da Matéria Condensada, Departamento de Física, UFSM, 97105-900, Santa Maria, RS (Brazil); Lobo, C.O. [Laboratório de Teoria da Matéria Condensada, Departamento de Física, UFSM, 97105-900, Santa Maria, RS (Brazil); Magalhaes, S.G. [Instituto de Física, Universidade Federal Fluminense, Av. Litorânea s/n, 24210, 346, Niterói, Rio de Janeiro (Brazil); Chaves, C.M.; Troper, A. [Centro Brasileiro de Pesquisas Físicas, Rua Xavier Sigaud 150, 22290-180, Rio de Janeiro, RJ (Brazil)

    2013-10-01

    The specific heat C(T) of an attractive (interaction G<0) non-local Hubbard model is investigated within a two-pole approximation that leads to a set of correlation functions, which play an important role as a source of anomalies such as the pseudogap. For a given range of G and n_T (where n_T = n_↑ + n_↓), the specific heat as a function of temperature presents a two-peak structure. Nevertheless, the presence of a pseudogap eliminates the two-peak structure. The effects of second-nearest-neighbor hopping on C(T) are also investigated.

  11. The National Map seamless digital elevation model specifications

    Science.gov (United States)

    Archuleta, Christy-Ann M.; Constance, Eric W.; Arundel, Samantha T.; Lowe, Amanda J.; Mantey, Kimberly S.; Phillips, Lori A.

    2017-08-02

    This specification documents the requirements and standards used to produce the seamless elevation layers for The National Map of the United States. Seamless elevation data are available for the conterminous United States, Hawaii, Alaska, and the U.S. territories, in three different resolutions—1/3-arc-second, 1-arc-second, and 2-arc-second. These specifications include requirements and standards information about source data requirements, spatial reference system, distribution tiling schemes, horizontal resolution, vertical accuracy, digital elevation model surface treatment, georeferencing, data source and tile dates, distribution and supporting file formats, void areas, metadata, spatial metadata, and quality assurance and control.

  12. Spring assisted cranioplasty: A patient specific computational model.

    Science.gov (United States)

    Borghi, Alessandro; Rodriguez-Florez, Naiara; Rodgers, Will; James, Gregory; Hayward, Richard; Dunaway, David; Jeelani, Owase; Schievano, Silvia

    2018-03-01

    Implantation of spring-like distractors in the treatment of sagittal craniosynostosis is a novel technique that has proven functionally and aesthetically effective in correcting skull deformities; however, final shape outcomes remain moderately unpredictable due to an incomplete understanding of the skull-distractor interaction. The aim of this study was to create a patient specific computational model of spring assisted cranioplasty (SAC) that can help predict the individual overall final head shape. Pre-operative computed tomography images of a SAC patient were processed to extract a 3D model of the infant skull anatomy and simulate spring implantation. The distractors were modeled based on mechanical experimental data. Viscoelastic bone properties from the literature were tuned using the specific patient procedural information recorded during surgery and from x-ray measurements at follow-up. The model accurately captured spring expansion on-table (within 9% of the measured values), as well as at first and second follow-ups (within 8% of the measured values). Comparison between immediate post-operative 3D head scanning and numerical results for this patient proved that the model could successfully predict the final overall head shape. This preliminary work showed the potential application of computational modeling to study SAC, to support pre-operative planning and guide novel distractor design. Copyright © 2018 IPEM. Published by Elsevier Ltd. All rights reserved.

  13. Tritium Specific Adsorption Simulation Utilizing the OSPREY Model

    Energy Technology Data Exchange (ETDEWEB)

    Veronica Rutledge; Lawrence Tavlarides; Ronghong Lin; Austin Ladshaw

    2013-09-01

    During the processing of used nuclear fuel, volatile radionuclides will be discharged to the atmosphere if no recovery processes are in place to limit their release. The volatile radionuclides of concern are ³H, ¹⁴C, ⁸⁵Kr, and ¹²⁹I. Methods are being developed, via adsorption and absorption unit operations, to capture these radionuclides. It is necessary to model these unit operations to aid in the evaluation of technologies and in the future development of an advanced used nuclear fuel processing plant. A collaboration between Fuel Cycle Research and Development Offgas Sigma Team member INL and a NEUP grant including ORNL, Syracuse University, and Georgia Institute of Technology has been formed to develop off gas models and support off gas research. This report discusses the development of a tritium specific adsorption model. The model was developed by integrating the OSPREY model with a fundamental-level isotherm model developed under the NEUP grant, together with experimental data provided by the grant.
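The adsorption equilibrium that such a column model needs can be illustrated with a generic Langmuir isotherm. This is a stand-in sketch with invented parameter values, not necessarily the fundamental-level isotherm model developed under the NEUP grant.

```python
def langmuir_loading(p, q_max, b):
    """Equilibrium adsorbed loading q for partial pressure p.

    q_max: saturation capacity (mol/kg), b: affinity constant (1/Pa).
    Illustrative parameter values only.
    """
    return q_max * b * p / (1.0 + b * p)

q_max, b = 2.0, 1e-3
# Loading rises nearly linearly at low pressure (Henry's-law regime) ...
print(langmuir_loading(10.0, q_max, b))
# ... and saturates toward q_max at high pressure.
print(langmuir_loading(1e6, q_max, b))
```

In a column simulator, an isotherm like this closes the mass balance by relating the gas-phase concentration in each axial cell to the equilibrium solid-phase loading.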

  14. Diagnostic Air Quality Model Evaluation of Source-Specific ...

    Science.gov (United States)

    Ambient measurements of 78 source-specific tracers of primary and secondary carbonaceous fine particulate matter collected at four midwestern United States locations over a full year (March 2004–February 2005) provided an unprecedented opportunity to diagnostically evaluate the results of a numerical air quality model. Previous analyses of these measurements demonstrated excellent mass closure for the variety of contributing sources. In this study, a carbon-apportionment version of the Community Multiscale Air Quality (CMAQ) model was used to track primary organic and elemental carbon emissions from 15 independent sources such as mobile sources and biomass burning in addition to four precursor-specific classes of secondary organic aerosol (SOA) originating from isoprene, terpenes, aromatics, and sesquiterpenes. Conversion of the source-resolved model output into organic tracer concentrations yielded a total of 2416 data pairs for comparison with observations. While emission source contributions to the total model bias varied by season and measurement location, the largest absolute bias of −0.55 μgC/m3 was attributed to insufficient isoprene SOA in the summertime CMAQ simulation. Biomass combustion was responsible for the second largest summertime model bias (−0.46 μgC/m3 on average). Several instances of compensating errors were also evident; model underpredictions in some sectors were masked by overpredictions in others.

  15. Carcinogen specific dosimetry model for passive smokers of various ages

    International Nuclear Information System (INIS)

    Robinson, Risa J.

    2005-01-01

    Studies indicate that being exposed to second hand smoke increases the chance of developing lung cancer. Understanding the deposition of carcinogenic particles present in second hand smoke is necessary to understand the development of specific histologic type cancers. In this study, a deposition model is presented for subjects of various ages exposed to sidestream smoke. The model included particle dynamics of coagulation, hygroscopic growth, charge and cloud behavior. Concentrations were varied from the maximum measured indoor concentrations (10⁶ particles/cm³) to what would be expected from wisps of smoke (10⁸ particles/cm³). Model results agreed well with experimental data taken from human subject deposition measurements (four studies). The model results were used to determine the dose intensity (dose per unit airway surface area) of Benzo[a]pyrene (BaP) in the respiratory tract for subjects of various ages. Model predictions for BaP surface concentration on the airway walls paralleled incidence rates of tumors by location in the upper tracheobronchial region. Mass deposition efficiency was found to be larger for younger subjects, consistent with diffusion being the predominant mechanism for this particle size range. However, the actual dose intensity of BaP was found to be smaller for children than adults. This occurred due to the predominant effect of the smaller initial inhaled mass for children resulting from smaller tidal volumes. The resulting model is a useful tool to predict carcinogen specific particle deposition.

  16. Stage-specific predictive models for breast cancer survivability.

    Science.gov (United States)

    Kate, Rohit J; Nadig, Ramya

    2017-01-01

    Survivability rates vary widely among various stages of breast cancer. Although machine learning models built in the past to predict breast cancer survivability were given stage as one of the features, they were not trained or evaluated separately for each stage. To investigate whether there are differences in performance of machine learning models trained and evaluated across different stages for predicting breast cancer survivability. Using three different machine learning methods we built models to predict breast cancer survivability separately for each stage and compared them with the traditional joint models built for all the stages. We also evaluated the models separately for each stage and together for all the stages. Our results show that the most suitable model to predict survivability for a specific stage is the model trained for that particular stage. In our experiments, using additional examples of other stages during training did not help; in fact, it made performance worse in some cases. The most important features for predicting survivability were also found to be different for different stages. By evaluating the models separately on different stages we found that the performance widely varied across them. We also demonstrate that evaluating predictive models for survivability on all the stages together, as was done in the past, is misleading because it overestimates performance. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  17. Domain Specific Language for Modeling Waste Management Systems

    DEFF Research Database (Denmark)

    Zarrin, Bahram

    In order to develop sustainable waste management systems considering a life cycle perspective, scientists and domain experts in environmental science require readily applicable tools for modeling and evaluating the life cycle impacts of waste management systems. Practice has proved that modeling these systems with general-purpose tools is a cumbersome task. On the one hand, the scientists have to spend a considerable amount of time to understand these tools in order to develop their models. On the other hand, integrated assessments are becoming gradually common in environmental management … environmental technologies, i.e. solid waste management systems. Flow-based programming is used to support concurrent execution of the processes and provides a model-integration language for composing processes from homogeneous or heterogeneous domains, and a domain-specific language is used to define atomic …

  18. Analysis specifications for the CC3 biosphere model biotrac

    Energy Technology Data Exchange (ETDEWEB)

    Szekely, J.G.; Wojciechowski, L.C.; Stephens, M.E.; Halliday, H.A.

    1994-12-01

    The CC3 (Canadian Concept, generation 3) model BIOTRAC (Biosphere Transport and Consequences) describes the movement in the biosphere of releases from an underground disposal vault, and the consequent radiological dose to a reference individual. Concentrations of toxic substances in different parts of the biosphere are also calculated. BIOTRAC was created specifically for the postclosure analyses of the Environmental Impact Statement that AECL is preparing on the concept for disposal of Canada's nuclear fuel waste. The model relies on certain assumptions and constraints on the system, which are described by Davis et al. Accordingly, great care must be exercised if BIOTRAC is used for any other purpose.

  19. Surface mesh to voxel data registration for patient-specific anatomical modeling

    Science.gov (United States)

    de Oliveira, Júlia E. E.; Giessler, Paul; Keszei, András; Herrler, Andreas; Deserno, Thomas M.

    2016-03-01

    Virtual Physiological Human (VPH) models are frequently used for training, planning, and performing medical procedures. The Regional Anaesthesia Simulator and Assistant (RASimAs) project has the goal of increasing the application and effectiveness of regional anesthesia (RA) by combining a simulator of ultrasound-guided and electrical nerve-stimulated RA procedures and a subject-specific assistance system through an integration of image processing, physiological models, subject-specific data, and virtual reality. Individualized models enrich the virtual training tools for learning and improving RA skills. Therefore, we suggest patient-specific VPH models that are composed by registering the general mesh-based models with patient voxel data-based recordings. Specifically, the pelvis region has been focused on to support the femoral nerve block. The processing pipeline is composed of different freely available toolboxes such as MATLAB, the Simulation Open Framework Architecture (SOFA), and MeshLab. The approach of Gilles is applied for mesh-to-voxel registration. Personalized VPH models include anatomical as well as mechanical properties of the tissues. Two commercial VPH models (Zygote and Anatomium) were used together with 34 MRI data sets. Results are presented for the skin surface and pelvic bones. Future work will extend the registration procedure to cope with all model tissues (i.e., skin, muscle, bone, vessel, nerve, fascia) in a one-step procedure and extrapolate the personalized models to body regions outside the captured field of view.

  20. Conceptual Model of the Globalization for Domain-Specific Languages

    OpenAIRE

    Clark, Tony; Van Den Brand, Mark; Combemale, Benoit; Rumpe, Bernhard

    2015-01-01

    Domain Specific Languages (DSLs) have received some prominence recently. Designing a DSL and all of its tools is still cumbersome and a lot of work. The engineering of DSLs is still in its infancy; not even the terms have been coined and agreed on. In particular, globalization and all its consequences need to be precisely defined and discussed. This chapter provides a definition of the relevant terms and relates them, such that a conceptual model emerges. The authors think that th...

  1. A new approach to modeling aviation accidents

    Science.gov (United States)

    Rao, Arjun Harsha

    views aviation accidents as a set of hazardous states of a system (pilot and aircraft), and triggers that cause the system to move between hazardous states. I used the NTSB's accident coding manual (that contains nearly 4000 different codes) to develop a "dictionary" of hazardous states, triggers, and information codes. Then, I created the "grammar", or a set of rules, that: (1) orders the hazardous states in each accident; and, (2) links the hazardous states using the appropriate triggers. This approach: (1) provides a more correct count of the causes for accidents in the NTSB database; and, (2) checks for gaps or omissions in NTSB accident data, and fills in some of these gaps using logic-based rules. These rules also help identify and count causes for accidents that were not discernable from previous analyses of historical accident data. I apply the model to 6200 helicopter accidents that occurred in the US between 1982 and 2015. First, I identify the states and triggers that are most likely to be associated with fatal and non-fatal accidents. The results suggest that non-fatal accidents, which account for approximately 84% of the accidents, provide valuable opportunities to learn about the causes for accidents. Next, I investigate the causes of inflight loss of control (LOC) using both a conventional approach and the state-based approach. The conventional analysis provides little insight into the causal mechanism for LOC. For instance, the top cause of LOC is "aircraft control/directional control not maintained", which does not provide any insight. In contrast, the state-based analysis showed that pilots' tendency to clip objects frequently triggered LOC (16.7% of LOC accidents)--this finding was not directly discernable from conventional analyses. Finally, I investigate the causes for improper autorotations using both a conventional approach and the state-based approach.
The conventional approach uses modifiers (e.g., "improper", "misjudged") associated with "24520

  2. Mathematical modelling of digit specification by a sonic hedgehog gradient

    KAUST Repository

    Woolley, Thomas E.

    2013-11-26

    Background: The three chick wing digits represent a classical example of a pattern specified by a morphogen gradient. Here we have investigated whether a mathematical model of a Shh gradient can describe the specification of the identities of the three chick wing digits and if it can be applied to limbs with more digits. Results: We have produced a mathematical model for specification of chick wing digit identities by a Shh gradient that can be extended to the four digits of the chick leg with Shh-producing cells forming a digit. This model cannot be extended to specify the five digits of the mouse limb. Conclusions: Our data suggest that the parameters of a classical-type morphogen gradient are sufficient to specify the identities of three different digits. However, to specify more digit identities, this core mechanism has to be coupled to alternative processes, one being that in the chick leg and mouse limb, Shh-producing cells give rise to digits; another that in the mouse limb, the cellular response to the Shh gradient adapts over time so that digit specification does not depend simply on Shh concentration. Developmental Dynamics 243:290-298, 2014. © 2013 Wiley Periodicals, Inc.
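A classical-type morphogen gradient of the kind this model builds on is, at steady state, exponential in distance from the source, and digit identities can then be read off from concentration thresholds. The decay length and threshold values below are invented for illustration; they are not the fitted parameters of the paper.

```python
import math

def shh_concentration(x, c0=1.0, decay_length=50.0):
    # Steady-state exponential gradient from a source at x = 0 (distance in µm;
    # c0 and decay_length are illustrative values, not fitted ones).
    return c0 * math.exp(-x / decay_length)

def digit_identity(c, thresholds=((0.5, "digit 3"), (0.2, "digit 2"))):
    # Cells above the highest threshold adopt the most posterior identity.
    for threshold, identity in thresholds:
        if c >= threshold:
            return identity
    return "digit 1"

identities = [digit_identity(shh_concentration(x)) for x in (10, 60, 150)]
print(identities)  # prints ['digit 3', 'digit 2', 'digit 1']
```

The paper's conclusion can be restated in these terms: three thresholds on a single exponential gradient suffice for three identities, but specifying more digits requires extra mechanisms (source cells becoming digits, or responses adapting over time) beyond concentration alone.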

  3. Flexible parametric modelling of cause-specific hazards to estimate cumulative incidence functions

    Science.gov (United States)

    2013-01-01

    Background Competing risks are a common occurrence in survival analysis. They arise when a patient is at risk of more than one mutually exclusive event, such as death from different causes, and the occurrence of one of these may prevent any other event from ever happening. Methods There are two main approaches to modelling competing risks: the first is to model the cause-specific hazards and transform these to the cumulative incidence function; the second is to model directly on a transformation of the cumulative incidence function. We focus on the first approach in this paper. This paper advocates the use of the flexible parametric survival model in this competing risk framework. Results An illustrative example on the survival of breast cancer patients has shown that the flexible parametric proportional hazards model has almost perfect agreement with the Cox proportional hazards model. However, the large epidemiological data set used here shows clear evidence of non-proportional hazards. The flexible parametric model is able to adequately account for these through the incorporation of time-dependent effects. Conclusion A key advantage of using this approach is that smooth estimates of both the cause-specific hazard rates and the cumulative incidence functions can be obtained. It is also relatively easy to incorporate time-dependent effects which are commonly seen in epidemiological studies. PMID:23384310
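The first approach, transforming cause-specific hazards into a cumulative incidence function, amounts to evaluating CIF_k(t) = ∫_0^t h_k(u) S(u) du, where S(u) is the all-cause survival function. A numerical sketch with two constant, invented hazards (a flexible parametric model would supply smooth hazard estimates in their place):

```python
import math

# Two constant cause-specific hazards (per year, invented values).
h1, h2 = 0.10, 0.05

def cif(hazard, t, total=h1 + h2, steps=10000):
    # CIF_k(t) = integral of h_k(u) * S(u) du with S(u) = exp(-(h1 + h2) * u),
    # approximated with the trapezoidal rule.
    du = t / steps
    area = 0.0
    for i in range(steps):
        u0, u1 = i * du, (i + 1) * du
        f0 = hazard * math.exp(-total * u0)
        f1 = hazard * math.exp(-total * u1)
        area += 0.5 * (f0 + f1) * du
    return area

t = 10.0
print(round(cif(h1, t), 4), round(cif(h2, t), 4))
```

Note that the two cumulative incidence functions sum to the all-cause failure probability 1 − S(t), which is why modeling each cause-specific hazard and transforming is a coherent way to handle competing risks.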

  4. Towards patient specific thermal modelling of the prostate

    Science.gov (United States)

    Van den Berg, Cornelis A. T.; Van de Kamer, Jeroen B.; DeLeeuw, Astrid A. C.; Jeukens, Cécile R. L. P. N.; Raaymakers, Bas W.; van Vulpen, Marco; Lagendijk, Jan J. W.

    2006-02-01

    The application of thermal modelling for hyperthermia and thermal ablation is severely hampered by lack of information about perfusion and vasculature. However, recently, with the advent of sophisticated angiography and dynamic contrast enhanced (DCE) imaging techniques, it has become possible to image small vessels and blood perfusion, bringing the ultimate goal of patient specific thermal modelling closer within reach. In this study dynamic contrast enhanced multi-slice CT imaging techniques are employed to investigate the feasibility of this concept for regional hyperthermia treatment of the prostate. The results are retrospectively compared with clinical thermometry data of a patient group from an earlier trial. Furthermore, the role of the prostate vasculature in the establishment of the prostate temperature distribution is studied. Quantitative 3D perfusion maps of the prostate were constructed for five patients using a distributed-parameter tracer kinetics model to analyse dynamic CT data. CT angiography was applied to construct a discrete vessel model of the pelvis. Additionally, a discrete vessel model of the prostate vasculature was constructed of a prostate taken from a human corpse. Three thermal modelling schemes with increasing inclusion of the patient specific physiological information were used to simulate the temperature distribution of the prostate during regional hyperthermia. Prostate perfusion was found to be heterogeneous and T3 prostate carcinomas are often characterized by a strongly elevated tumour perfusion (up to 70–80 ml·100 g⁻¹·min⁻¹). This elevated tumour perfusion leads to 1-2 °C lower tumour temperatures than thermal simulations based on a homogeneous prostate perfusion. Furthermore, the comparison has shown that the simulations with the measured perfusion maps result in consistently lower prostate temperatures than clinically achieved. The simulations with the discrete vessel model indicate that significant pre-heating takes place.

  5. Approaches and models of intercultural education

    Directory of Open Access Journals (Sweden)

    Iván Manuel Sánchez Fontalvo

    2013-10-01

    Full Text Available Being aware of the need to build an intercultural society means that awareness must be assumed in all social spheres, among which education plays a transcendental role: it must promote educational spaces that form people with the virtues and capacities that allow them to live together in multicultural and socially diverse (sometimes unequal) contexts in an increasingly globalized and interconnected world, and it must foster the development of shared feelings of civic belonging to the neighborhood, city, region and country, giving them concern for and critical judgement of marginalization, poverty, misery and the inequitable distribution of wealth, the causes of structural violence, while at the same time wanting to work for the welfare and transformation of these scenarios. From these premises, it is important to know the approaches and models of intercultural education that have been developed so far, analysing their impact on the educational contexts where they are applied.

  6. An approach to analyse the specific impact of rapamycin on mRNA-ribosome association

    Directory of Open Access Journals (Sweden)

    Jaquier-Gubler Pascale

    2008-08-01

    Full Text Available Abstract Background Recent work, using both cell culture model systems and tumour derived cell lines, suggests that the differential recruitment into polysomes of mRNA populations may be sufficient to initiate and maintain tumour formation. Consequently, a major effort is underway to use high density microarray profiles to establish molecular fingerprints for cells exposed to defined drug regimes. The aim of these pharmacogenomic approaches is to provide new information on how drugs can impact on the translational read-out within a defined cellular background. Methods We describe an approach that permits the analysis of de-novo mRNA-ribosome association in-vivo during short drug exposures. It combines hypertonic shock, polysome fractionation and high-throughput analysis to provide a molecular phenotype of translationally responsive transcripts. Compared to previous translational profiling studies, the procedure offers increased specificity due to the elimination of the drug's secondary effects (e.g. on the transcriptional read-out). For this pilot "proof-of-principle" assay we selected the drug rapamycin because of its extensively studied impact on translation initiation. Results High throughput analysis on both the light and heavy polysomal fractions has identified mRNAs whose re-recruitment onto free ribosomes responded to short exposure to the drug rapamycin. The results of the microarray have been confirmed using real-time RT-PCR. The selective down-regulation of TOP transcripts is also consistent with previous translational profiling studies using this drug. Conclusion The technical advance outlined in this manuscript offers the possibility of new insights into mRNA features that impact on translation initiation and provides a molecular fingerprint for transcript-ribosome association in any cell type and in the presence of a range of drugs of interest.
Such molecular phenotypes defined pre-clinically may ultimately impact on the evaluation of

  7. The discontinuity of the specific heat for the 5D Ising model

    Directory of Open Access Journals (Sweden)

    P.H. Lundow

    2015-06-01

    Full Text Available In this paper we investigate the behaviour of the specific heat around the critical point of the Ising model in dimension 5. We find a specific heat discontinuity, like that for the mean-field Ising model, and provide estimates for the left and right hand limits of the specific heat at the critical point. We also estimate the singular exponents describing how the specific heat approaches those limits. Additionally, we make a smaller-scale investigation of the same properties in dimensions 6 and 7, and provide strongly improved estimates for the critical temperature Kc in d=5,6,7 which bring the best MC estimates closer to those obtained by long high-temperature series expansions.
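    As a rough illustration of how such specific-heat estimates are extracted from Monte Carlo data, the sketch below applies the standard energy-fluctuation formula C = (⟨E²⟩ − ⟨E⟩²)/(N T²) to a tiny 2D Ising lattice. This is a hedged toy, not the paper's setup: the study works in d = 5-7 at far larger lattice sizes, and the lattice size, temperature, and sweep counts here are illustrative only.

```python
import numpy as np

def metropolis_specific_heat(L=4, T=5.0, sweeps=2000, seed=0):
    """Estimate the specific heat per spin of a small 2D Ising lattice
    from energy fluctuations: C = (<E^2> - <E>^2) / (N * T^2)."""
    rng = np.random.default_rng(seed)
    s = rng.choice([-1, 1], size=(L, L))
    N = L * L

    def energy(spins):
        # nearest-neighbour coupling with periodic boundaries
        return -np.sum(spins * (np.roll(spins, 1, 0) + np.roll(spins, 1, 1)))

    E = energy(s)
    samples = []
    for sweep in range(sweeps):
        for _ in range(N):
            i, j = rng.integers(L, size=2)
            nb = (s[(i + 1) % L, j] + s[(i - 1) % L, j]
                  + s[i, (j + 1) % L] + s[i, (j - 1) % L])
            dE = 2 * s[i, j] * nb          # energy change of a single flip
            if dE <= 0 or rng.random() < np.exp(-dE / T):
                s[i, j] = -s[i, j]
                E += dE
        if sweep >= sweeps // 2:           # discard burn-in
            samples.append(E)
    samples = np.array(samples, dtype=float)
    return samples.var() / (N * T * T)

C = metropolis_specific_heat()
```

    Near a critical point one would measure this quantity on both sides of Kc at increasing volumes to estimate the left and right limits, which is far beyond this sketch.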

  8. Evaluation of mesh morphing and mapping techniques in patient specific modeling of the human pelvis.

    Science.gov (United States)

    Salo, Zoryana; Beek, Maarten; Whyne, Cari Marisa

    2013-01-01

    Robust generation of pelvic finite element models is necessary to understand the variation in mechanical behaviour resulting from differences in gender, aging, disease and injury. The objective of this study was to apply and evaluate mesh morphing and mapping techniques to facilitate the creation and structural analysis of specimen-specific finite element (FE) models of the pelvis. A specimen-specific pelvic FE model (source mesh) was generated following a traditional user-intensive meshing scheme. The source mesh was morphed onto a computed tomography scan generated target surface of a second pelvis using a landmark-based approach, in which exterior source nodes were shifted to target surface vertices, while constrained along a normal. A second copy of the morphed model was further refined through mesh mapping, in which surface nodes of the initial morphed model were selected in patches and remapped onto the surfaces of the target model. Computed tomography intensity based material properties were assigned to each model. The source, target, morphed and mapped models were analyzed under axial compression using linear static FE analysis and their strain distributions were evaluated. Morphing and mapping techniques were effectively applied to generate good-quality, geometrically complex specimen-specific pelvic FE models. Mapping significantly improved strain concurrence with the target pelvis FE model. Copyright © 2012 John Wiley & Sons, Ltd.
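    The landmark-based morphing step described above can be sketched roughly as follows: each exterior source node is displaced along its unit normal toward the nearest target-surface vertex. This is a simplified stand-in for the full surface-projection constraint used in the study, and all geometry below is hypothetical.

```python
import numpy as np

def morph_nodes(src_nodes, src_normals, tgt_verts):
    """Shift each source node along its normal to the point on that
    normal line closest to its nearest target vertex (toy morphing step)."""
    morphed = np.empty_like(src_nodes)
    for k, (p, n) in enumerate(zip(src_nodes, src_normals)):
        n = n / np.linalg.norm(n)
        # nearest target vertex to this source node
        q = tgt_verts[np.argmin(np.linalg.norm(tgt_verts - p, axis=1))]
        t = np.dot(q - p, n)          # signed travel distance along the normal
        morphed[k] = p + t * n        # displacement constrained to the normal
    return morphed

src = np.array([[0.0, 0.0, 0.0]])
normals = np.array([[0.0, 0.0, 1.0]])
tgt = np.array([[0.0, 0.0, 2.0], [5.0, 5.0, 5.0]])
moved = morph_nodes(src, normals, tgt)
```

    A production pipeline would intersect each normal with the triangulated target surface rather than snap to its nearest vertex.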

  10. Flexible parametric modelling of the cause-specific cumulative incidence function.

    Science.gov (United States)

    Lambert, Paul C; Wilkes, Sally R; Crowther, Michael J

    2017-04-30

    Competing risks arise with time-to-event data when individuals are at risk of more than one type of event and the occurrence of one event precludes the occurrence of all other events. A useful measure with competing risks is the cause-specific cumulative incidence function (CIF), which gives the probability of experiencing a particular event as a function of follow-up time, accounting for the fact that some individuals may have a competing event. When modelling the cause-specific CIF, the most common model is a semi-parametric proportional subhazards model. In this paper, we propose the use of flexible parametric survival models to directly model the cause-specific CIF where the effect of follow-up time is modelled using restricted cubic splines. The models provide smooth estimates of the cause-specific CIF with the important advantage that the approach is easily extended to model time-dependent effects. The models can be fitted using standard survival analysis tools by a combination of data expansion and introducing time-dependent weights. Various link functions are available that allow modelling on different scales and have proportional subhazards, proportional odds and relative absolute risks as particular cases. We conduct a simulation study to evaluate how well the spline functions approximate subhazard functions with complex shapes. The methods are illustrated using data from the European Blood and Marrow Transplantation Registry showing excellent agreement between parametric estimates of the cause-specific CIF and those obtained from a semi-parametric model. We also fit models relaxing the proportional subhazards assumption using alternative link functions and/or including time-dependent effects. Copyright © 2016 John Wiley & Sons, Ltd.
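    The restricted cubic splines used here to model the effect of follow-up time have a standard closed form: cubic between the knots, constrained to be linear beyond the boundary knots. A minimal sketch of the basis construction (knot placement below is arbitrary, not taken from the paper):

```python
import numpy as np

def rcs_basis(x, knots):
    """Restricted cubic spline basis: a linear column plus K-2 cubic terms
    constructed so the function is linear beyond the boundary knots."""
    x = np.asarray(x, dtype=float)
    k = np.sort(np.asarray(knots, dtype=float))
    kmax = k[-1]
    plus = lambda u: np.maximum(u, 0.0) ** 3          # truncated cubic (u)_+^3
    lam = lambda kj: (kmax - kj) / (kmax - k[-2])
    cols = [x]                                        # linear term
    for kj in k[:-2]:
        cols.append(plus(x - kj)
                    - lam(kj) * plus(x - k[-2])
                    - (1.0 - lam(kj)) * plus(x - kmax))
    return np.column_stack(cols)

# Evaluate beyond the last knot: each spline column must be exactly linear there.
B = rcs_basis(np.array([4.0, 5.0, 6.0]), [0.0, 1.0, 2.0, 3.0])
```

    In the modelling approach described above, these columns (in log time) enter the linear predictor for the transformed CIF; interactions between them and covariates give time-dependent effects.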

  11. Genetic Algorithm Approaches to Prebiotic Chemistry Modeling

    Science.gov (United States)

    Lohn, Jason; Colombano, Silvano

    1997-01-01

    We model an artificial chemistry composed of interacting polymers by specifying two initial conditions: a distribution of polymers and a fixed set of reversible catalytic reactions. A genetic algorithm is used to find a set of reactions that exhibits a desired dynamical behavior. Such a technique is useful because it allows an investigator to determine whether a specific pattern of dynamics can be produced, and if it can, the reaction network found can then be analyzed. We present our results in the context of studying simplified chemical dynamics in theorized protocells - hypothesized precursors of the first living organisms. Our results show that, given a small sample of plausible protocell reaction dynamics, catalytic reaction sets can be found. We present cases where this is not possible and also analyze the evolved reaction sets.
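    The search loop can be sketched generically: a bitstring genome marks which reactions from a fixed candidate pool are active, and fitness scores how closely the resulting dynamics match a target. The sketch below uses a stand-in fitness (distance to a target activation pattern) rather than an actual chemistry simulation; genome length, rates, and population size are all invented.

```python
import random

def evolve(fitness, genome_len=16, pop_size=30, generations=40, seed=1):
    """Minimal genetic algorithm: tournament selection, one-point
    crossover, and bit-flip mutation over a reaction-mask genome."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        def pick():                      # binary tournament
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, genome_len)
            child = p1[:cut] + p2[cut:]  # one-point crossover
            if rng.random() < 0.1:       # occasional bit-flip mutation
                i = rng.randrange(genome_len)
                child[i] ^= 1
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

# Stand-in fitness: prefer reaction masks close to a target pattern.
target = [1, 0] * 8
best = evolve(lambda g: -sum(gi != ti for gi, ti in zip(g, target)))
```

    In the paper's setting the fitness evaluation would instead integrate the polymer dynamics under the selected reactions and compare them with the desired behavior.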

  12. Tumour resistance to cisplatin: a modelling approach

    International Nuclear Information System (INIS)

    Marcu, L; Bezak, E; Olver, I; Doorn, T van

    2005-01-01

    Although chemotherapy has revolutionized the treatment of haematological tumours, in many common solid tumours the success has been limited. Some of the reasons for the limitations are: the timing of drug delivery, resistance to the drug, repopulation between cycles of chemotherapy and the lack of complete understanding of the pharmacokinetics and pharmacodynamics of a specific agent. Cisplatin is among the most effective cytotoxic agents used in head and neck cancer treatments. When modelling cisplatin as a single agent, the properties of cisplatin only have to be taken into account, reducing the number of assumptions that are considered in the generalized chemotherapy models. The aim of the present paper is to model the biological effect of cisplatin and to simulate the consequence of cisplatin resistance on tumour control. The 'treated' tumour is a squamous cell carcinoma of the head and neck, previously grown by computer-based Monte Carlo techniques. The model maintained the biological constitution of a tumour through the generation of stem cells, proliferating cells and non-proliferating cells. Cell kinetic parameters (mean cell cycle time, cell loss factor, thymidine labelling index) were also consistent with the literature. A sensitivity study on the contribution of various mechanisms leading to drug resistance is undertaken. To quantify the extent of drug resistance, the cisplatin resistance factor (CRF) is defined as the ratio between the number of surviving cells of the resistant population and the number of surviving cells of the sensitive population, determined after the same treatment time. It is shown that there is a supra-linear dependence of CRF on the percentage of cisplatin-DNA adducts formed, and a sigmoid-like dependence between CRF and the percentage of cells killed in resistant tumours. Drug resistance is shown to be a cumulative process which can eventually overcome tumour regression, leading to treatment failure.
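    The CRF defined above is a simple ratio of surviving-cell counts at a common treatment time. A toy sketch under a plain exponential-kill assumption (the rate constants below are hypothetical, not values fitted to the paper's Monte Carlo model):

```python
import math

def surviving(n0, kill_rate, t):
    """Cells surviving treatment under a simple exponential-kill model."""
    return n0 * math.exp(-kill_rate * t)

def crf(n0, k_sensitive, k_resistant, t):
    """Cisplatin resistance factor: surviving resistant cells divided by
    surviving sensitive cells after the same treatment time."""
    return surviving(n0, k_resistant, t) / surviving(n0, k_sensitive, t)

# Hypothetical rates: resistant cells killed half as fast as sensitive ones.
print(crf(1e6, 0.5, 0.25, 10.0))   # exp(2.5) ≈ 12.18
```

    In this toy form the CRF grows exponentially with treatment time whenever the kill rates differ, which is the sense in which resistance is "cumulative".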

  13. Modelling and simulating retail management practices: a first approach

    OpenAIRE

    Siebers, Peer-Olaf; Aickelin, Uwe; Celia, Helen; Clegg, Chris

    2010-01-01

    Multi-agent systems offer a new and exciting way of understanding the world of work. We apply agent-based modeling and simulation to investigate a set of problems in a retail context. Specifically, we are working to understand the relationship between people management practices on the shop-floor and retail performance. Despite the fact that we are working within a relatively novel and complex domain, it is clear that using an agent-based approach offers great potential for improving organizati...

  14. Mathematical Formulation Requirements and Specifications for the Process Models

    Energy Technology Data Exchange (ETDEWEB)

    Steefel, C.; Moulton, D.; Pau, G.; Lipnikov, K.; Meza, J.; Lichtner, P.; Wolery, T.; Bacon, D.; Spycher, N.; Bell, J.; Moridis, G.; Yabusaki, S.; Sonnenthal, E.; Zyvoloski, G.; Andre, B.; Zheng, L.; Davis, J.

    2010-11-01

    The Advanced Simulation Capability for Environmental Management (ASCEM) is intended to be a state-of-the-art scientific tool and approach for understanding and predicting contaminant fate and transport in natural and engineered systems. The ASCEM program is aimed at addressing critical EM program needs to better understand and quantify flow and contaminant transport behavior in complex geological systems. It will also address the long-term performance of engineered components including cementitious materials in nuclear waste disposal facilities, in order to reduce uncertainties and risks associated with DOE EM's environmental cleanup and closure activities. Building upon national capabilities developed from decades of Research and Development in subsurface geosciences, computational and computer science, modeling and applied mathematics, and environmental remediation, the ASCEM initiative will develop an integrated, open-source, high-performance computer modeling system for multiphase, multicomponent, multiscale subsurface flow and contaminant transport. This integrated modeling system will incorporate capabilities for predicting releases from various waste forms, identifying exposure pathways and performing dose calculations, and conducting systematic uncertainty quantification. The ASCEM approach will be demonstrated on selected sites, and then applied to support the next generation of performance assessments of nuclear waste disposal and facility decommissioning across the EM complex. The Multi-Process High Performance Computing (HPC) Simulator is one of three thrust areas in ASCEM. The other two are the Platform and Integrated Toolsets (dubbed the Platform) and Site Applications. The primary objective of the HPC Simulator is to provide a flexible and extensible computational engine to simulate the coupled processes and flow scenarios described by the conceptual models developed using the ASCEM Platform. The graded and iterative approach to assessments

  15. Systems Approaches to Modeling Chronic Mucosal Inflammation

    Science.gov (United States)

    Gao, Boning; Choudhary, Sanjeev; Wood, Thomas G.; Carmical, Joseph R.; Boldogh, Istvan; Mitra, Sankar; Minna, John D.; Brasier, Allan R.

    2013-01-01

    The respiratory mucosa is a major coordinator of the inflammatory response in chronic airway diseases, including asthma and chronic obstructive pulmonary disease (COPD). Signals produced by the chronic inflammatory process induce epithelial mesenchymal transition (EMT), which dramatically alters the epithelial cell phenotype. Although the effects of EMT on epigenetic reprogramming and the activation of transcriptional networks are known, its effects on the innate inflammatory response are underexplored. We used a multiplex gene expression profiling platform to investigate the perturbations of the innate pathways induced by TGFβ in a primary airway epithelial cell model of EMT. EMT had dramatic effects on the induction of the innate pathway and the coupling interval of the canonical and noncanonical NF-κB pathways. Simulation experiments demonstrate that rapid, coordinated cap-independent translation of TRAF-1 and NF-κB2 is required to reduce the noncanonical pathway coupling interval. Experiments using amantadine confirmed the prediction that TRAF-1 and NF-κB2/p100 production is mediated by an IRES-dependent mechanism. These data indicate that the epigenetic changes produced by EMT induce dynamic state changes of the innate signaling pathway. Further applications of systems approaches will provide understanding of this complex phenotype through deterministic modeling and multidimensional (genomic and proteomic) profiling. PMID:24228254

  16. ECOMOD - An ecological approach to radioecological modelling

    International Nuclear Information System (INIS)

    Sazykina, Tatiana G.

    2000-01-01

    A unified methodology is proposed to simulate the dynamic processes of radionuclide migration in aquatic food chains in parallel with their stable analogue elements. The distinguishing feature of the unified radioecological/ecological approach is the description of radionuclide migration along with dynamic equations for the ecosystem. The ability of the methodology to predict the results of radioecological experiments is demonstrated by an example of radionuclide (iron group) accumulation by a laboratory culture of the algae Platymonas viridis. Based on the unified methodology, the 'ECOMOD' radioecological model was developed to simulate dynamic radioecological processes in aquatic ecosystems. It comprises three basic modules, which are operated as a set of inter-related programs. The 'ECOSYSTEM' module solves non-linear ecological equations, describing the biomass dynamics of essential ecosystem components. The 'RADIONUCLIDE DISTRIBUTION' module calculates the radionuclide distribution in abiotic and biotic components of the aquatic ecosystem. The 'DOSE ASSESSMENT' module calculates doses to aquatic biota and doses to man from aquatic food chains. The application of the ECOMOD model to reconstruct the radionuclide distribution in the Chernobyl Cooling Pond ecosystem in the early period after the accident shows good agreement with observations.
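    The core idea of solving radionuclide transfer alongside ecosystem dynamics can be sketched as a pair of coupled equations: logistic biomass growth driving uptake of activity from water. This is a hedged toy in the spirit of the unified approach; the rate constants are illustrative inventions, not ECOMOD's calibrated values.

```python
def simulate(days=100, dt=0.01):
    """Euler integration of toy coupled ecosystem/radionuclide dynamics:
    logistic algal biomass B, activity in water Aw, activity in biota Ab."""
    r, K = 0.5, 10.0          # algal growth rate, carrying capacity
    k_up, lam = 0.02, 0.01    # uptake rate per unit biomass, decay constant
    B, Aw, Ab = 0.1, 100.0, 0.0
    for _ in range(int(days / dt)):
        dB = r * B * (1.0 - B / K)          # ecosystem equation
        dAw = -k_up * B * Aw - lam * Aw     # water loses activity to biota + decay
        dAb = k_up * B * Aw - lam * Ab      # biota gains activity, decays
        B += dt * dB
        Aw += dt * dAw
        Ab += dt * dAb
    return B, Aw, Ab

B, Aw, Ab = simulate()
```

    The coupling is the key point: biota activity cannot be predicted from a fixed concentration factor when the biomass itself is changing, which is what the unified radioecological/ecological description addresses.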

  17. Multiple imputation of covariates by fully conditional specification: Accommodating the substantive model.

    Science.gov (United States)

    Bartlett, Jonathan W; Seaman, Shaun R; White, Ian R; Carpenter, James R

    2015-08-01

    Missing covariate data commonly occur in epidemiological and clinical research, and are often dealt with using multiple imputation. Imputation of partially observed covariates is complicated if the substantive model is non-linear (e.g. Cox proportional hazards model), or contains non-linear (e.g. squared) or interaction terms, and standard software implementations of multiple imputation may impute covariates from models that are incompatible with such substantive models. We show how imputation by fully conditional specification, a popular approach for performing multiple imputation, can be modified so that covariates are imputed from models which are compatible with the substantive model. We investigate through simulation the performance of this proposal, and compare it with existing approaches. Simulation results suggest our proposal gives consistent estimates for a range of common substantive models, including models which contain non-linear covariate effects or interactions, provided data are missing at random and the assumed imputation models are correctly specified and mutually compatible. Stata software implementing the approach is freely available. © The Author(s) 2014.
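    One way to impute a covariate compatibly with a non-linear substantive model, broadly in the spirit of the proposal above, is rejection sampling: draw a candidate from a proposal distribution and accept it with probability proportional to the substantive-model likelihood of the observed outcome. A toy sketch for a quadratic substantive model; all parameter values and the proposal pool are hypothetical.

```python
import numpy as np

def impute_compatible(y_obs, beta, sigma, x_proposal_draws, rng):
    """Rejection sampler: accept a proposed x with probability given by the
    (normalised-to-max-1) substantive-model likelihood
    f(y | x), where y = b0 + b1*x + b2*x^2 + N(0, sigma^2)."""
    b0, b1, b2 = beta
    lik = lambda x: np.exp(-0.5 * ((y_obs - (b0 + b1 * x + b2 * x**2))
                                   / sigma) ** 2)   # always <= 1
    while True:
        x = rng.choice(x_proposal_draws)
        if rng.random() < lik(x):
            return x

rng = np.random.default_rng(0)
pool = rng.normal(0.0, 1.0, size=1000)   # proposal: draws resembling observed x
x_imp = impute_compatible(y_obs=1.0, beta=(0.0, 1.0, 0.5), sigma=0.5,
                          x_proposal_draws=pool, rng=rng)
```

    Because the acceptance step weights by f(y | x), imputed values concentrate where the assumed quadratic outcome model makes the observed y plausible, avoiding the incompatibility that arises when x is imputed from a purely linear imputation model.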

  18. Psychological approaches in the treatment of specific phobias: A meta-analysis

    NARCIS (Netherlands)

    Wolitzky-Taylor, K.B.; Horowitz, J.D.; Powers, M.B.; Telch, M.J.

    2008-01-01

    Data from 33 randomized treatment studies were subjected to a meta-analysis to address questions surrounding the efficacy of psychological approaches in the treatment of specific phobia. As expected, exposure-based treatment produced large effect sizes relative to no treatment. They also

  19. Scaffolding in tissue engineering: general approaches and tissue-specific considerations.

    Science.gov (United States)

    Chan, B P; Leong, K W

    2008-12-01

    Scaffolds represent important components for tissue engineering. However, researchers often encounter an enormous variety of choices when selecting scaffolds for tissue engineering. This paper aims to review the functions of scaffolds and the major scaffolding approaches as important guidelines for selecting scaffolds, and to discuss tissue-specific considerations for scaffolding, using the intervertebral disc as an example.

  20. A Functional-Notional Approach for English for Specific Purposes (ESP) Programs.

    Science.gov (United States)

    Kim, Young-Min

    English for Specific Purposes (ESP) programs, characterized by the special needs of the language learners, are described and a review of the literature on a functional-notional approach to the syllabus design of ESP programs is presented. It is suggested that effective ESP programs should teach the language skills necessary to function and perform…

  1. Cognitive Approach to Assessing Pragmatic Language Comprehension in Children with Specific Language Impairment

    Science.gov (United States)

    Ryder, Nuala; Leinonen, Eeva; Schulz, Joerg

    2008-01-01

    Background: Pragmatic language impairment in children with specific language impairment has proved difficult to assess, and the nature of their abilities to comprehend pragmatic meaning has not been fully investigated. Aims: To develop both a cognitive approach to pragmatic language assessment based on Relevance Theory and an assessment tool for…

  2. A Discipline-Specific Approach to the History of U.S. Science Education

    Science.gov (United States)

    Otero, Valerie K.; Meltzer, David E.

    2017-01-01

    Although much has been said and written about the value of using the history of science in teaching science, relatively little is available to guide educators in the various science disciplines through the educational history of their own discipline. Through a discipline-specific approach to a course on the history of science education in the…

  3. A Bayesian Model of Category-Specific Emotional Brain Responses

    Science.gov (United States)

    Wager, Tor D.; Kang, Jian; Johnson, Timothy D.; Nichols, Thomas E.; Satpute, Ajay B.; Barrett, Lisa Feldman

    2015-01-01

    Understanding emotion is critical for a science of healthy and disordered brain function, but the neurophysiological basis of emotional experience is still poorly understood. We analyzed human brain activity patterns from 148 studies of emotion categories (2159 total participants) using a novel hierarchical Bayesian model. The model allowed us to classify which of five categories—fear, anger, disgust, sadness, or happiness—is engaged by a study with 66% accuracy (43-86% across categories). Analyses of the activity patterns encoded in the model revealed that each emotion category is associated with unique, prototypical patterns of activity across multiple brain systems including the cortex, thalamus, amygdala, and other structures. The results indicate that emotion categories are not contained within any one region or system, but are represented as configurations across multiple brain networks. The model provides a precise summary of the prototypical patterns for each emotion category, and demonstrates that a sufficient characterization of emotion categories relies on (a) differential patterns of involvement in neocortical systems that differ between humans and other species, and (b) distinctive patterns of cortical-subcortical interactions. Thus, these findings are incompatible with several contemporary theories of emotion, including those that emphasize emotion-dedicated brain systems and those that propose emotion is localized primarily in subcortical activity. They are consistent with componential and constructionist views, which propose that emotions are differentiated by a combination of perceptual, mnemonic, prospective, and motivational elements. Such brain-based models of emotion provide a foundation for new translational and clinical approaches. PMID:25853490

  4. Electron/muon specific two Higgs doublet model

    Energy Technology Data Exchange (ETDEWEB)

    Kajiyama, Yuji, E-mail: kajiyama-yuuji@akita-pref.ed.jp [Akita Highschool, Tegata-Nakadai 1, Akita, 010-0851 (Japan); Okada, Hiroshi, E-mail: hokada@kias.re.kr [School of Physics, KIAS, Seoul 130-722 (Korea, Republic of); Yagyu, Kei, E-mail: keiyagyu@ncu.edu.tw [Department of Physics, National Central University, Chungli, 32001, Taiwan, ROC (China)

    2014-10-15

    We discuss two Higgs doublet models with a softly-broken discrete S₃ symmetry, where the mass matrix for charged leptons is predicted to be diagonal in the weak eigenbasis of the lepton fields. As with the introduction of a Z₂ symmetry, tree-level flavor changing neutral currents can be forbidden by imposing the S₃ symmetry on the model. Under the S₃ symmetry, there are four types of Yukawa interactions, depending on the S₃ charge assignment of the right-handed fermions. We find that the extra Higgs bosons can be muon and electron specific in one of the four types of Yukawa interaction. This property does not appear in any other two Higgs doublet model with a softly-broken Z₂ symmetry. We discuss the phenomenology of the muon and electron specific Higgs bosons at the Large Hadron Collider; namely, we evaluate the parameter regions allowed by the current Higgs boson search data and the discovery potential for such a Higgs boson in the 14 TeV run.

  5. Metabolic network modeling approaches for investigating the "hungry cancer".

    Science.gov (United States)

    Sharma, Ashwini Kumar; König, Rainer

    2013-08-01

    Metabolism is the functional phenotype of a cell, at a given condition, resulting from an intricate interplay of various regulatory processes. The study of these dynamic metabolic processes and their capabilities helps to identify the fundamental properties of living systems. Metabolic deregulation is an emerging hallmark of cancer cells. This deregulation results in a rewiring of the metabolic circuitry, conferring an exploitative metabolic advantage on the tumor cells, which leads to a distinct benefit in survival and lays the basis for unbounded progression. Metabolism can be considered a thermodynamically open system in which source substrates of high value are processed through a well-established, interconnected biochemical conversion system, strictly obeying physicochemical principles, generating useful intermediates and finally resulting in the release of byproducts. Based on this basic principle of an input-output balance, various models have been developed to interrogate metabolism and elucidate its underlying functional properties. However, only a few modeling approaches have proved computationally feasible for elucidating the metabolic nature of cancer at a systems level. Besides this, statistical approaches have been set up to identify biochemical pathways that are more relevant for specific types of tumor cells. In this review, we briefly introduce the basic statistical approaches followed by the major modeling concepts. We have put an emphasis on the methods, and their applications, that have been used to a greater extent in understanding the metabolic remodeling of cancer. Copyright © 2013 Elsevier Ltd. All rights reserved.
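    The input-output balance underlying constraint-based (flux balance) models can be made concrete: at steady state the stoichiometric matrix S must satisfy S·v = 0, and capacity bounds on the fluxes then determine the feasible throughput. A minimal, entirely hypothetical three-reaction sketch (uptake → A → B → biomass), small enough that the null space of S is one-dimensional and the optimum can be read off directly:

```python
import numpy as np

# Rows = metabolites (A, B); columns = reactions (uptake, conversion, biomass).
S = np.array([[1.0, -1.0,  0.0],    # A: made by uptake, used by conversion
              [0.0,  1.0, -1.0]])   # B: made by conversion, drained by biomass
upper = np.array([10.0, np.inf, np.inf])   # hypothetical uptake cap, 10 units/h

# S v = 0 forces v1 = v2 = v3, so biomass flux is set by the tightest bound
# along the chain -- the input-output balance in miniature.
null = np.linalg.svd(S)[2][-1]     # spans the 1-D null space of S
null = null / null[0]              # rescale so the uptake flux equals 1
scale = np.min(upper / null)       # largest feasible multiple of the null vector
v_opt = scale * null
```

    Genome-scale metabolic models apply the same logic with thousands of reactions, replacing this one-line optimisation with a linear program.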

  6. On specification of initial conditions in turbulence models

    Energy Technology Data Exchange (ETDEWEB)

    Rollin, Bertrand [Los Alamos National Laboratory; Andrews, Malcolm J [Los Alamos National Laboratory

    2010-12-01

    Recent research has shown that initial conditions have a significant influence on the evolution of a flow towards turbulence. This important finding offers a unique opportunity for turbulence control, but also raises the question of how to properly specify initial conditions in turbulence models. We study this problem in the context of the Rayleigh-Taylor instability. The Rayleigh-Taylor instability is an interfacial fluid instability that leads to turbulence and turbulent mixing. It occurs when a light fluid is accelerated into a heavy fluid because of misalignment between density and pressure gradients. The Rayleigh-Taylor instability plays a key role in a wide variety of natural and man-made flows ranging from supernovae to the implosion phase of Inertial Confinement Fusion (ICF). Our approach consists of providing the turbulence models with a predicted profile of their key variables at the appropriate time, in accordance with the initial conditions of the problem.
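    One concrete way initial conditions are pinned down in Rayleigh-Taylor studies is by prescribing the perturbation spectrum of the initial interface: a band of Fourier modes with random phases, rescaled to a target RMS amplitude. The sketch below is a generic illustration of that practice (mode band, amplitude, and resolution are invented, not values from this report):

```python
import numpy as np

def multimode_interface(nx=256, length=1.0, kmin=8, kmax=32,
                        rms=1e-3, seed=0):
    """Initial interface perturbation eta(x): a sum of Fourier modes in the
    band [kmin, kmax] with random phases, rescaled to a target RMS."""
    rng = np.random.default_rng(seed)
    x = np.linspace(0.0, length, nx, endpoint=False)
    eta = np.zeros(nx)
    for k in range(kmin, kmax + 1):
        phase = rng.uniform(0.0, 2.0 * np.pi)
        eta += np.cos(2.0 * np.pi * k * x / length + phase)
    eta *= rms / np.sqrt(np.mean(eta ** 2))   # enforce the target RMS amplitude
    return x, eta

x, eta = multimode_interface()
```

    Because the late-time mixing depends on this spectrum, a turbulence model initialised from it must receive consistent initial profiles of its own variables, which is the problem the paper addresses.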

  7. Agribusiness model approach to territorial food development

    Directory of Open Access Journals (Sweden)

    Murcia Hector Horacio

    2011-04-01

    Full Text Available

    Several research efforts within the academic program of Agricultural Business Management at the University De La Salle (Bogota D.C.) have been coordinated toward the design and implementation of a sustainable agribusiness model applied to food development, with territorial projection. Rural development is considered as a process that aims to improve the current capacity and potential of the inhabitants of the sector, which refers not only to production levels and productivity of agricultural items. It takes into account the guidelines of the United Nations' "Millennium Development Goals" and the concept of sustainable food and agriculture development, including food security and nutrition, in an integrated interdisciplinary context with a holistic and systemic dimension. The analysis is specified by a model with an emphasis on sustainable agribusiness production chains related to agricultural food items in a specific region. This model was correlated with farm (technical objectives), family (social purposes) and community (collective orientations) projects. Within this dimension, food development concepts and methodologies of Participatory Action Research (PAR) are considered. Finally, it addresses the need to link the results to low-income communities, within the concepts of the "new rurality".

  8. Characterizing economic trends by Bayesian stochastic model specification search

    DEFF Research Database (Denmark)

    Grassi, Stefano; Proietti, Tommaso

    on whether their parameters are fixed or evolutive. Stochastic model specification is carried out to discriminate two alternative hypotheses concerning the generation of trends: the trend-stationary hypothesis, on the one hand, for which the trend is a deterministic function of time and the short run dynamics are represented by a stationary autoregressive process; the difference-stationary hypothesis, on the other, according to which the trend results from the cumulation of the effects of random disturbances. We illustrate the methodology for a set of U.S. macroeconomic time series, which includes
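    The two hypotheses can be illustrated by simulation: under trend-stationarity the deviation from the deterministic trend has bounded variance, while under difference-stationarity the variance of the cumulated shocks grows with time. A sketch (AR coefficient, drift, and sample sizes are arbitrary choices, not estimates from the paper):

```python
import numpy as np

rng = np.random.default_rng(42)
n, reps = 200, 500

def trend_stationary(rng):
    """Deterministic linear trend plus stationary AR(1) short-run dynamics."""
    e = np.zeros(n)
    for t in range(1, n):
        e[t] = 0.7 * e[t - 1] + rng.normal()
    return 0.05 * np.arange(n) + e

def difference_stationary(rng):
    """Trend emerging from cumulated random disturbances (drifting walk)."""
    return np.cumsum(0.05 + rng.normal(size=n))

# End-of-sample values across many replications.
ts_end = np.array([trend_stationary(rng)[-1] for _ in range(reps)])
ds_end = np.array([difference_stationary(rng)[-1] for _ in range(reps)])
# DS end-point variance grows linearly with t; the TS one stays bounded.
```

    This widening variance gap is precisely what the stochastic model specification search exploits when deciding whether a trend component should be treated as fixed or evolutive.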

  9. Integrating Cloud-Computing-Specific Model into Aircraft Design

    Science.gov (United States)

    Zhimin, Tian; Qi, Lin; Guangwen, Yang

    Cloud Computing is becoming increasingly relevant, as it will enable companies involved in spreading this technology to open the door to Web 3.0. The new categories of services introduced will slowly replace many types of computational resources currently used. In this perspective, grid computing, the basic element for the large-scale supply of cloud services, will play a fundamental role in defining how those services will be provided. The paper tries to integrate a cloud-computing-specific model into aircraft design. This work has achieved good results in sharing licenses of large-scale and expensive software, such as CFD (Computational Fluid Dynamics), UG, CATIA, and so on.

  10. SPECIFIC APPROACHES OF PRODUCTIVITY IN SERVICES. VALUES OF SEVERAL COMPARATIVE INDICATORS

    Directory of Open Access Journals (Sweden)

    Jivan Alexandru

    2013-07-01

    Full Text Available The paper proposes a synthesis of several specific approaches to productivity in services, in order to provide a better-argued theoretical basis for a realistic assessment of the values of several comparative indicators. Conceptually, the paper takes as its starting point definitions of productivity as a quantitative, industrial indicator, revealing aspects that require a broadening of the approach to make it suitable for immaterial activities. Based on a brief analysis of productivity, performance and servicity indicators, the research establishes correlations between them in the immaterial field. Both older and more recent analyses from the most rigorous sources in the literature are used and supplemented with the authors' own approaches, which point out several features specific to services and intellect-intensive activities. Technical and financial aspects of conventional productivity are taken into account, as well as performance in achieving various goals, including non-economic ones, and the service components, in a complex approach. The presentation focuses on nuances in the quoted references, with the aim of finely defining the indicators and approaches, and specific particularizations are drawn from the literature. Methodologically, the paper approaches in an unorthodox manner the added value arising from human activity, viewed through a fine analysis of the service performed by any economic activity in a market system. The conceptual importance of these comparative indicators lies mainly in the opening they provide for practical applied analyses, which are proposed as future developments of the topic.
The research results reveal the relationship between these comparative indicators as well as some conceptual differences between

  11. Anomalous superconductivity in the tJ model; moment approach

    DEFF Research Database (Denmark)

    Sørensen, Mads Peter; Rodriguez-Nunez, J.J.

    1997-01-01

    By extending the moment approach of Nolting (Z. Phys. 225 (1972) 25) to the superconducting phase, we have constructed the one-particle spectral functions (diagonal and off-diagonal) for the tJ model in any dimension. We propose that both the diagonal and the off-diagonal spectral functions … Hartree shift, which in the end enlarges the bandwidth of the free carriers, allowing us to take relatively high values of J/t and allowing superconductivity to live in the Tc-ρ phase diagram, in agreement with numerical calculations in a cluster. We have calculated the static spin susceptibility, χ(T), and the specific heat, Cv(T), within the moment approach. We find that all the relevant physical quantities show the signature of superconductivity at Tc in the form of kinks (anomalous behavior) or jumps, for low density, in agreement with recently published literature, showing a generic

  12. Modeling a Predictive Energy Equation Specific for Maintenance Hemodialysis.

    Science.gov (United States)

    Byham-Gray, Laura D; Parrott, J Scott; Peters, Emily N; Fogerite, Susan Gould; Hand, Rosa K; Ahrens, Sean; Marcus, Andrea Fleisch; Fiutem, Justin J

    2017-03-01

    Hypermetabolism is theorized in patients diagnosed with chronic kidney disease who are receiving maintenance hemodialysis (MHD). We aimed to distinguish key disease-specific determinants of resting energy expenditure to create a predictive energy equation that more precisely establishes energy needs with the intent of preventing protein-energy wasting. For this 3-year multisite cross-sectional study (N = 116), eligible participants were diagnosed with chronic kidney disease and were receiving MHD for at least 3 months. Predictors for the model included weight, sex, age, C-reactive protein (CRP), glycosylated hemoglobin, and serum creatinine. The outcome variable was measured resting energy expenditure (mREE). Regression modeling was used to generate predictive formulas and Bland-Altman analyses to evaluate accuracy. The majority were male (60.3%), black (81.0%), and non-Hispanic (76.7%), and 23% were ≥65 years old. After screening for multicollinearity, the best predictive model of mREE ( R 2 = 0.67) included weight, age, sex, and CRP. Two alternative models with acceptable predictability ( R 2 = 0.66) were derived with glycosylated hemoglobin or serum creatinine. Based on Bland-Altman analyses, the maintenance hemodialysis equation that included CRP had the best precision, with the highest proportion of participants' predicted energy expenditure classified as accurate (61.2%) and with the lowest number of individuals with underestimation or overestimation. This study confirms disease-specific factors as key determinants of mREE in patients on MHD and provides a preliminary predictive energy equation. Further prospective research is necessary to test the reliability and validity of this equation across diverse populations of patients who are receiving MHD.
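As a rough illustration of the workflow described above (regression modeling plus a Bland-Altman-style accuracy check), the sketch below fits an ordinary least squares equation to simulated stand-ins for the study's predictors; the variable values and coefficients are hypothetical, not the published equation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for the study's predictors (hypothetical values,
# NOT the published data): weight (kg), age (y), sex (0/1), CRP (mg/L).
n = 116
X = np.column_stack([
    rng.normal(80, 15, n),   # weight
    rng.normal(55, 12, n),   # age
    rng.integers(0, 2, n),   # sex
    rng.gamma(2.0, 3.0, n),  # CRP
])
true_beta = np.array([15.0, -3.0, 120.0, 8.0])
mree = 500 + X @ true_beta + rng.normal(0, 50, n)  # measured REE (kcal/d)

# Ordinary least squares fit of a predictive equation.
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, mree, rcond=None)
pred = A @ coef

# Bland-Altman-style accuracy: share of predictions within +/-10% of mREE.
accurate = np.mean(np.abs(pred - mree) / mree <= 0.10)
```

On simulated data the fitted coefficients recover the generating values up to sampling error, which is the property a predictive energy equation needs before validation on real patients.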

  13. Systematic approach to verification and validation: High explosive burn models

    Energy Technology Data Exchange (ETDEWEB)

    Menikoff, Ralph [Los Alamos National Laboratory; Scovel, Christina A. [Los Alamos National Laboratory

    2012-04-16

    Most material models used in numerical simulations are based on heuristics and empirically calibrated to experimental data. For a specific model, key questions are determining its domain of applicability and assessing its relative merits compared to other models. Answering these questions should be a part of model verification and validation (V and V). Here, we focus on V and V of high explosive models. Typically, model developers implement their models in their own hydro codes and use different sets of experiments to calibrate model parameters. Rarely can one find in the literature simulation results for different models of the same experiment. Consequently, it is difficult to assess objectively the relative merits of different models. This situation results in part from the fact that experimental data is scattered through the literature (articles in journals and conference proceedings) and that the printed literature does not allow the reader to obtain data from a figure in electronic form needed to make detailed comparisons among experiments and simulations. In addition, it is very time consuming to set up and run simulations to compare different models over sufficiently many experiments to cover the range of phenomena of interest. The first difficulty could be overcome if the research community were to support an online web based database. The second difficulty can be greatly reduced by automating procedures to set up and run simulations of similar types of experiments. Moreover, automated testing would be greatly facilitated if the data files obtained from a database were in a standard format that contained key experimental parameters as meta-data in a header to the data file. To illustrate our approach to V and V, we have developed a high explosive database (HED) at LANL. It now contains a large number of shock initiation experiments. 
Utilizing the header information in a data file from HED, we have written scripts to generate an input file for a hydro code

  14. Risk communication: a mental models approach

    National Research Council Canada - National Science Library

    Morgan, M. Granger (Millett Granger)

    2002-01-01

    ... information about risks. The procedure uses approaches from risk and decision analysis to identify the most relevant information; it also uses approaches from psychology and communication theory to ensure that its message is understood. This book is written in nontechnical terms, designed to make the approach feasible for anyone willing to try it. It is illustrat...

  15. Checking Architectural and Implementation Constraints for Domain-Specific Component Frameworks using Models

    OpenAIRE

    Noguera, Carlos; Loiret, Frédéric

    2009-01-01

    Acceptance rate: 38%; International audience; Software components are used in various application domains, and many component models and frameworks have been proposed to fulfill domain-specific requirements. The ad-hoc development of these component frameworks hampers the reuse of tools and abstractions across different frameworks. We believe that in order to promote the reuse of components within various domain contexts a homogeneous design approach is needed. A key requirement of such an a...

  16. Patient-Specific Deep Architectural Model for ECG Classification

    Directory of Open Access Journals (Sweden)

    Kan Luo

    2017-01-01

    Full Text Available Heartbeat classification is a crucial step for arrhythmia diagnosis during electrocardiographic (ECG) analysis. The new scenario of wireless body sensor network- (WBSN-) enabled ECG monitoring puts forward a higher-level demand for this traditional ECG analysis task. Previously reported methods mainly addressed this requirement with the applications of a shallow structured classifier and expert-designed features. In this study, modified frequency slice wavelet transform (MFSWT) was first employed to produce the time-frequency image for the heartbeat signal. Then a deep learning (DL) method was applied for the heartbeat classification. Here, we proposed a novel model incorporating automatic feature abstraction and a deep neural network (DNN) classifier. Features were automatically abstracted by a stacked denoising auto-encoder (SDA) from the transferred time-frequency image. The DNN classifier was constructed from an encoder layer of the SDA and a softmax layer. In addition, a deterministic patient-specific heartbeat classifier was achieved by fine-tuning on heartbeat samples, which included a small subset of individual samples. The performance of the proposed model was evaluated on the MIT-BIH arrhythmia database. Results showed that an overall accuracy of 97.5% was achieved using the proposed model, confirming that the proposed DNN model is a powerful tool for heartbeat pattern recognition.

  17. Testing a model of caffeinated alcohol-specific expectancies.

    Science.gov (United States)

    Linden-Carmichael, Ashley N; Lau-Barraco, Cathy; Stamates, Amy L

    2015-08-01

    The present study sought to further understand the association between caffeinated alcoholic beverage (CAB) use and alcohol-related risks. In particular, we focused on the role of two identified expectancies specific to CAB use: intoxication enhancement and avoidance of negative consequences. Although outcome expectancies are consistent predictors of substance use, limited research has examined expectancies related to CAB use and their association with alcohol-related behaviors, such as protecting themselves from alcohol-related harms. Consequently, the present study examined CAB-specific expectancies and protective behavioral strategies (PBS) as mediators of CAB use and negative consequences. Participants were 322 (219 women) college drinkers who completed self-report measures of typical CAB and alcohol use, CAB-specific expectancies, PBS use, and alcohol-related harms. Structural equation modeling revealed, after controlling for typical non-CAB heavy alcohol use, a significant indirect effect of CAB use to alcohol-related problems through avoidance of negative consequences CAB expectancies and PBS use. However, intoxication enhancement expectancies did not mediate this association. Our findings indicate that heavier CAB use was associated with stronger expectations that drinking CABs can help avoid negative consequences. These beliefs were related to using fewer PBS when drinking and a greater likelihood of experiencing problems. Given that these expectancies may be underlying mechanisms of CAB use, their inclusion in existing alcohol interventions may be beneficial. Copyright © 2015 Elsevier Ltd. All rights reserved.
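The product-of-coefficients logic behind such mediation tests can be sketched outside a full structural equation model. The toy example below (simulated data; path values and variable names are hypothetical stand-ins for CAB use, a mediator, and alcohol-related problems) estimates the indirect effect a·b with a percentile-bootstrap confidence interval:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 322  # sample size mirroring the study; data here are simulated

# Hypothetical variables: x = CAB use, m = expectancy/PBS mediator,
# y = alcohol-related problems.  True paths: a = 0.5, b = 0.4.
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)
y = 0.4 * m + 0.1 * x + rng.normal(size=n)

def path(dep, *preds):
    """Return OLS slopes of dep on preds (intercept dropped)."""
    A = np.column_stack([np.ones(len(dep))] + list(preds))
    return np.linalg.lstsq(A, dep, rcond=None)[0][1:]

a = path(m, x)[0]        # x -> m
b = path(y, m, x)[0]     # m -> y, controlling for x
indirect = a * b         # product-of-coefficients estimate of a*b = 0.2

# Percentile bootstrap CI for the indirect effect.
boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)
    boot.append(path(m[idx], x[idx])[0] * path(y[idx], m[idx], x[idx])[0])
lo, hi = np.percentile(boot, [2.5, 97.5])
```

A confidence interval excluding zero is the usual criterion for a significant indirect effect in this framework.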

  18. Activity-specific ecological niche models for planning reintroductions of California condors (Gymnogyps californianus)

    Science.gov (United States)

    D'Elia, Jesse; Haig, Susan M.; Johnson, Matthew J.; Marcot, Bruce G.; Young, Richard

    2015-01-01

    Ecological niche models can be a useful tool to identify candidate reintroduction sites for endangered species but have been infrequently used for this purpose. In this paper, we (1) develop activity-specific ecological niche models (nesting, roosting, and feeding) for the critically endangered California condor (Gymnogyps californianus) to aid in reintroduction planning in California, Oregon, and Washington, USA, (2) test the accuracy of these models using empirical data withheld from model development, and (3) integrate model results with information on condor movement ecology and biology to produce predictive maps of reintroduction site suitability. Our approach, which disentangles niche models into activity-specific components, has applications for other species where it is routinely assumed (often incorrectly) that individuals fulfill all requirements for life within a single environmental space. Ecological niche models conformed to our understanding of California condor ecology, had good predictive performance when tested with data withheld from model development, and aided in the identification of several candidate reintroduction areas outside of the current distribution of the species. Our results suggest there are large unoccupied regions of the California condor’s historical range that have retained ecological features similar to currently occupied habitats, and thus could be considered for future reintroduction efforts. Combining our activity-specific ENMs with ground reconnaissance and information on other threat factors that could not be directly incorporated into empirical ENMs will ultimately improve our ability to select successful reintroduction sites for the California condor.
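A minimal stand-in for one activity-specific niche model is a presence/background classifier over environmental covariates. The sketch below uses plain logistic regression fitted by gradient ascent on toy data (the covariates and coefficients are hypothetical, not the authors' ENM software or condor data):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy presence/background data for ONE activity (e.g. nesting): two
# standardized environmental covariates (hypothetical, e.g. terrain
# ruggedness and distance to cliffs).
n = 400
env = rng.normal(size=(n, 2))
logit = 2.0 * env[:, 0] - 1.5 * env[:, 1]
presence = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

# Logistic-regression niche model fitted by gradient ascent.
X = np.column_stack([np.ones(n), env])
w = np.zeros(3)
for _ in range(3000):
    p = 1 / (1 + np.exp(-X @ w))
    w += 0.05 * X.T @ (presence - p) / n

suitability = 1 / (1 + np.exp(-X @ w))  # predicted habitat suitability
```

Separate models of this kind, one per activity (nesting, roosting, feeding), can then be combined into a single suitability map, which is the disentangling idea the abstract describes.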

  19. Multilevel Autoregressive Mediation Models: Specification, Estimation, and Applications.

    Science.gov (United States)

    Zhang, Qian; Wang, Lijuan; Bergeman, C S

    2017-11-27

    In the current study, extending from the cross-lagged panel models (CLPMs) in Cole and Maxwell (2003), we proposed the multilevel autoregressive mediation models (MAMMs) by allowing the coefficients to differ across individuals. In addition, Level-2 covariates can be included to explain the interindividual differences of mediation effects. Given the complexity of the proposed models, Bayesian estimation was used. Both a CLPM and an unconditional MAMM were fitted to daily diary data. The 2 models yielded different statistical conclusions regarding the average mediation effect. A simulation study was conducted to examine the estimation accuracy of Bayesian estimation for MAMMs and consequences of model mis-specifications. Factors considered included the sample size (N), number of time points (T), fixed indirect and direct effect sizes, and Level-2 variances and covariances. Results indicated that the fixed effect estimates for the indirect effect components (a and b) and the fixed effects of Level-2 covariates were accurate when N ≥ 50 and T ≥ 5. For estimating Level-2 variances and covariances, they were accurate provided a sufficiently large N and T (e.g., N ≥ 500 and T ≥ 50). Estimates of the average mediation effect were generally accurate when N ≥ 100 and T ≥ 10, or N ≥ 50 and T ≥ 20. Furthermore, we found that when Level-2 variances were zero, MAMMs yielded valid inferences about the fixed effects, whereas when random effects existed, CLPMs had low coverage rates for fixed effects. DIC can be used for model selection. Limitations and future directions were discussed. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  20. Modelling and subject-specific validation of the heart-arterial tree system.

    Science.gov (United States)

    Guala, Andrea; Camporeale, Carlo; Tosello, Francesco; Canuto, Claudio; Ridolfi, Luca

    2015-01-01

    A modeling approach integrated with a novel subject-specific characterization is proposed here for the assessment of hemodynamic values of the arterial tree. A 1D model is adopted to characterize large-to-medium arteries, while the left ventricle, aortic valve and distal micro-circulation sectors are described by lumped submodels. A new velocity profile and a new formulation of the non-linear viscoelastic constitutive relation suitable for the {Q, A} modeling are also proposed. The model is first verified semi-quantitatively against literature data. A simple but effective procedure for obtaining subject-specific model characterization from non-invasive measurements is then designed. A detailed subject-specific validation against in vivo measurements from a population of six healthy young men is also performed. Several key quantities of heart dynamics - mean ejected flow, ejection fraction, and left-ventricular end-diastolic, end-systolic and stroke volumes - and the pressure waveforms (at the central, radial, brachial, femoral, and posterior tibial sites) are compared with measured data. Mean errors of around 5 and 8%, obtained for the heart and arterial quantities, respectively, testify to the effectiveness of the model and its subject-specific characterization.
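The distal lumped submodels mentioned above are typically Windkessel-type circuits. The sketch below integrates a two-element Windkessel, C dP/dt = Q(t) − P/R, with forward Euler; the resistance, compliance, and inflow waveform are illustrative round numbers, not the paper's calibrated subject-specific values:

```python
import math

# Two-element Windkessel: C dP/dt = Q(t) - P/R (illustrative parameters).
R = 4.0   # peripheral resistance (mmHg*s/mL)
C = 1.5   # arterial compliance (mL/mmHg)
T = 0.8   # cardiac period (s)

def inflow(t):
    """Half-sine ejection over the first 0.3 s of each beat (toy Q)."""
    tau = t % T
    return 90.0 * math.sin(math.pi * tau / 0.3) if tau < 0.3 else 0.0

dt, P = 1e-4, 80.0
history = []
for i in range(int(10 * T / dt)):      # 10 beats, forward Euler
    t = i * dt
    P += dt * (inflow(t) - P / R) / C
    history.append(P)

beat = history[-int(T / dt):]          # last (near-periodic) beat
systolic, diastolic = max(beat), min(beat)
```

The mean pressure settles near R times the mean inflow, and the pulse pressure scales inversely with compliance, which is why such lumped terminals are a convenient boundary condition for the 1D arterial tree.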

  1. Towards Subject-Specific Strength Training Design through Predictive Use of Musculoskeletal Models

    Directory of Open Access Journals (Sweden)

    Michael Plüss

    2018-01-01

    Full Text Available Lower extremity dysfunction is often associated with hip muscle strength deficiencies. Detailed knowledge of the muscle forces generated in the hip under specific external loading conditions enables specific structures to be trained. The aim of this study was to find the most effective movement type and loading direction to enable the training of specific parts of the hip muscles using a standing posture and a pulley system. In a novel approach to release the predictive power of musculoskeletal modelling techniques based on inverse dynamics, flexion/extension and ab-/adduction movements were virtually created. To demonstrate the effectiveness of this approach, three hip orientations and an external loading force that was systematically rotated around the body were simulated using a state-of-the-art OpenSim model in order to establish ideal designs for training of the anterior and posterior parts of the M. gluteus medius (GM). The external force direction as well as the hip orientation greatly influenced the muscle forces in the different parts of the GM. No setting was found for simultaneous training of the anterior and posterior parts with a muscle force higher than 50% of the maximum. Importantly, this study has demonstrated the use of musculoskeletal models as an approach to predict muscle force variations for different strength and rehabilitation exercise variations.

  2. Effects of Sample Size, Estimation Methods, and Model Specification on Structural Equation Modeling Fit Indexes.

    Science.gov (United States)

    Fan, Xitao; Wang, Lin; Thompson, Bruce

    1999-01-01

    A Monte Carlo simulation study investigated the effects on 10 structural equation modeling fit indexes of sample size, estimation method, and model specification. Some fit indexes did not appear to be comparable, and it was apparent that estimation method strongly influenced almost all fit indexes examined, especially for misspecified models. (SLD)
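The core of such a Monte Carlo design can be sketched for one easily computed fit index. The toy example below compares SRMR (standardized root-mean-square residual) under a correctly specified versus a misspecified (independence) model for a two-indicator population; the loading value and replication counts are hypothetical, not the cited study's design:

```python
import numpy as np

rng = np.random.default_rng(3)

# Population: two indicators loading .7 on one factor, so the true
# inter-indicator correlation is .49.
true_corr = np.array([[1.0, 0.49], [0.49, 1.0]])

def srmr(sample_corr, implied_corr):
    """Root-mean-square residual between correlation matrices."""
    iu = np.triu_indices_from(sample_corr)
    return np.sqrt(np.mean((sample_corr[iu] - implied_corr[iu]) ** 2))

correct = true_corr    # correctly specified model-implied matrix
misspec = np.eye(2)    # misspecified model: independence

results = {"correct": [], "misspec": []}
L = np.linalg.cholesky(true_corr)
for n in (100, 1000):                  # two sample-size conditions
    for _ in range(200):               # Monte Carlo replications
        data = rng.normal(size=(n, 2)) @ L.T
        r = np.corrcoef(data, rowvar=False)
        results["correct"].append(srmr(r, correct))
        results["misspec"].append(srmr(r, misspec))
```

Averaging each condition's index values over replications is exactly how such studies quantify an index's sensitivity to sample size and misspecification.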

  3. The Relationship Between Approach to Activity Engagement, Specific Aspects of Physical Function, and Pain Duration in Chronic Pain.

    Science.gov (United States)

    Andrews, Nicole E; Strong, Jenny; Meredith, Pamela J

    2016-01-01

    To examine: (1) the relationships between habitual approach to activity engagement and specific aspects of physical functioning in chronic pain; and (2) whether or not these relationships differ according to pain duration. Outpatients (N=169) with generalized chronic pain completed a set of written questionnaires. Categories of "approach to activity engagement" were created using the confronting and avoidance subscales of the Pain and Activity Relations Questionnaire. An interaction term between "approach to activity engagement" categories and pain duration was entered into analysis with age, sex, pain intensity, the categorical "approach to activity engagement" variable, and pain duration, in 9 ordinal regression models investigating functioning in a variety of daily activities. The "approach to activity engagement" category predicted the personal care, lifting, sleeping, social life, and traveling aspects of physical functioning but, interestingly, not the performance skills used during these activities, that is, walking, sitting, and standing. The interaction term was significant in 2 models; however, the effect of pain duration on associations was the inverse of that theorized, with the relationship between variables becoming less pronounced with increasing duration of pain. The results of this study do not support the commonly held notion that avoidance and/or overactivity behavior leads to deconditioning and reduced physical capacity over time. Findings do, however, suggest that a relationship exists between avoidance and/or overactivity behavior and reduced participation in activities. Implications for the clinical management of chronic pain and directions for further research are discussed.

  4. Modeling drug- and chemical- induced hepatotoxicity with systems biology approaches

    Directory of Open Access Journals (Sweden)

    Sudin Bhattacharya

    2012-12-01

    Full Text Available We provide an overview of computational systems biology approaches as applied to the study of chemical- and drug-induced toxicity. The concept of ‘toxicity pathways’ is described in the context of the 2007 US National Academies of Science report, Toxicity Testing in the 21st Century: A Vision and A Strategy. Pathway mapping and modeling based on network biology concepts are a key component of the vision laid out in this report for a more biologically based analysis of dose-response behavior and the safety of chemicals and drugs. We focus on toxicity of the liver (hepatotoxicity) – a complex phenotypic response with contributions from a number of different cell types and biological processes. We describe three case studies of complementary multi-scale computational modeling approaches to understand perturbation of toxicity pathways in the human liver as a result of exposure to environmental contaminants and specific drugs. One approach involves development of a spatial, multicellular virtual tissue model of the liver lobule that combines molecular circuits in individual hepatocytes with cell-cell interactions and blood-mediated transport of toxicants through hepatic sinusoids, to enable quantitative, mechanistic prediction of hepatic dose-response for activation of the AhR toxicity pathway. Simultaneously, methods are being developed to extract quantitative maps of intracellular signaling and transcriptional regulatory networks perturbed by environmental contaminants, using a combination of gene expression and genome-wide protein-DNA interaction data. A predictive physiological model (DILIsym™) to understand drug-induced liver injury (DILI), the most common adverse event leading to termination of clinical development programs and regulatory actions on drugs, is also described. The model initially focuses on reactive metabolite-induced DILI in response to administration of acetaminophen, and spans multiple biological scales.

  5. Generating patient-specific pulmonary vascular models for surgical planning

    Science.gov (United States)

    Murff, Daniel; Co-Vu, Jennifer; O'Dell, Walter G.

    2015-03-01

    Each year in the U.S., 7.4 million surgical procedures involving the major vessels are performed. Many of our patients require multiple surgeries, and many of the procedures include "surgical exploration". Procedures of this kind come with a significant amount of risk, carrying up to a 17.4% predicted mortality rate. This is especially concerning for our target population of pediatric patients with congenital abnormalities of the heart and major pulmonary vessels. This paper offers a novel approach to surgical planning which includes studying virtual and physical models of pulmonary vasculature of an individual patient before operation obtained from conventional 3D X-ray computed tomography (CT) scans of the chest. These models would provide clinicians with a non-invasive, intricately detailed representation of patient anatomy, and could reduce the need for invasive planning procedures such as exploratory surgery. Researchers involved in the AirPROM project have already demonstrated the utility of virtual and physical models in treatment planning of the airways of the chest. Clinicians have acknowledged the potential benefit from such a technology. A method for creating patient-derived physical models is demonstrated on pulmonary vasculature extracted from a CT scan with contrast of an adult human. Using a modified version of the NIH ImageJ program, a series of image processing functions are used to extract and mathematically reconstruct the vasculature tree structures of interest. An auto-generated STL file is sent to a 3D printer to create a physical model of the major pulmonary vasculature generated from 3D CT scans of patients.
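The final step above, emitting an STL file for the 3D printer, can be illustrated with a minimal ASCII STL serializer. This is a toy stand-in for the auto-generation step, not the ImageJ pipeline; the triangle data and solid name are hypothetical, and normals are left as zeros for the slicer to recompute:

```python
import io

def write_ascii_stl(triangles, name="vessel"):
    """Serialize triangles [(v0, v1, v2), ...] (each vertex an (x, y, z)
    tuple) to an ASCII STL string."""
    out = io.StringIO()
    out.write(f"solid {name}\n")
    for v0, v1, v2 in triangles:
        out.write("  facet normal 0 0 0\n    outer loop\n")
        for v in (v0, v1, v2):
            out.write(f"      vertex {v[0]:.6g} {v[1]:.6g} {v[2]:.6g}\n")
        out.write("    endloop\n  endfacet\n")
    out.write(f"endsolid {name}\n")
    return out.getvalue()

# One triangle of a (toy) vessel surface mesh.
stl_text = write_ascii_stl([((0, 0, 0), (1, 0, 0), (0, 1, 0))])
```

In practice the triangle list would come from an isosurface extraction (e.g. marching cubes) of the segmented vasculature volume.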

  6. Specification of advanced safety modeling requirements (Rev. 0).

    Energy Technology Data Exchange (ETDEWEB)

    Fanning, T. H.; Tautges, T. J.

    2008-06-30

    The U.S. Department of Energy's Global Nuclear Energy Partnership has led to renewed interest in liquid-metal-cooled fast reactors for the purpose of closing the nuclear fuel cycle and making more efficient use of future repository capacity. However, the U.S. has not designed or constructed a fast reactor in nearly 30 years. Accurate, high-fidelity, whole-plant dynamics safety simulations will play a crucial role by providing confidence that component and system designs will satisfy established design limits and safety margins under a wide variety of operational, design basis, and beyond design basis transient conditions. Current modeling capabilities for fast reactor safety analyses have resulted from several hundred person-years of code development effort supported by experimental validation. The broad spectrum of mechanistic and phenomenological models that have been developed represents an enormous amount of institutional knowledge that needs to be maintained. Complicating this, the existing code architectures for safety modeling evolved from programming practices of the 1970s. This has led to monolithic applications with interdependent data models which require significant knowledge of the complexities of the entire code in order for each component to be maintained. In order to develop an advanced fast reactor safety modeling capability, the limitations of the existing code architecture must be overcome while preserving the capabilities that already exist. To accomplish this, a set of advanced safety modeling requirements is defined, based on modern programming practices, that focuses on modular development within a flexible coupling framework. An approach for integrating the existing capabilities of the SAS4A/SASSYS-1 fast reactor safety analysis code into the SHARP framework is provided in order to preserve existing capabilities while providing a smooth transition to advanced modeling capabilities. In doing this, the advanced fast reactor safety models
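The contrast drawn above, monolithic applications with interdependent data models versus modular development within a flexible coupling framework, can be sketched abstractly. The module names, exchanged fields, and numbers below are hypothetical illustrations, not SAS4A/SASSYS-1 or SHARP APIs:

```python
# Each physics module hides its internals behind a uniform interface;
# a driver advances modules in lockstep, exchanging only declared fields.
class Module:
    provides: tuple = ()
    def advance(self, dt, fields):     # read inputs, return outputs
        raise NotImplementedError

class PowerModule(Module):
    provides = ("power",)
    def advance(self, dt, fields):
        return {"power": 1000.0}       # stand-in neutronics result (W)

class ThermalModule(Module):
    provides = ("fuel_temp",)
    def advance(self, dt, fields):
        # Stand-in thermal response driven by the power field.
        return {"fuel_temp": 300.0 + 0.05 * fields["power"]}

def couple(modules, dt, steps):
    """Loose (operator-split) coupling: modules see only shared fields."""
    fields = {"power": 0.0}
    for _ in range(steps):
        for m in modules:
            fields.update(m.advance(dt, fields))
    return fields

state = couple([PowerModule(), ThermalModule()], dt=0.01, steps=10)
```

Because modules interact only through named fields, each can be maintained or replaced without knowledge of the others' internals, which is the maintainability argument the requirements document makes.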

  7. Virtuous organization: A structural equation modeling approach

    Directory of Open Access Journals (Sweden)

    Majid Zamahani

    2013-02-01

    Full Text Available For years, the idea of virtue was unfavorable among researchers, and virtues were traditionally considered culture-specific, relativistic, and associated with social conservatism, religious or moral dogmatism, and scientific irrelevance. Virtue and virtuousness have recently been taken seriously among organizational researchers. The proposed study of this paper examines the relationships between leadership, organizational culture, human resource, structure and processes, care for community and virtuous organization. Structural equation modeling is employed to investigate the effects of each variable on the other components. The data used in this study consist of questionnaire responses from employees of Payam e Noor University in Yazd province. A total of 250 questionnaires were sent out and a total of 211 valid responses were received. Our results reveal that all five variables have positive and significant impacts on virtuous organization. Among the five variables, organizational culture has the most direct impact (0.80) and human resource has the most total impact (0.844) on virtuous organization.

  8. Quantitative versus qualitative modeling: a complementary approach in ecosystem study.

    Science.gov (United States)

    Bondavalli, C; Favilla, S; Bodini, A

    2009-02-01

    Natural disturbance or human perturbation act upon ecosystems by changing some dynamical parameters of one or more species. Foreseeing these modifications is necessary before embarking on an intervention: predictions may help to assess management options and define hypotheses for interventions. Models become valuable tools for studying and making predictions only when they capture the types of interactions and their magnitude. Quantitative models are more precise and specific about a system, but require a large effort in model construction. Because of this, ecological systems very often remain only partially specified, and one possible approach to their description and analysis comes from qualitative modelling. Qualitative models yield predictions as directions of change in species abundance, but in complex systems these predictions are often ambiguous, being the result of opposite actions exerted on the same species by way of multiple pathways of interactions. Again, to avoid such ambiguities one needs to know the intensity of all links in the system. One way to make link magnitude explicit so that it can be used in qualitative analysis is described in this paper, and takes advantage of another type of ecosystem representation: ecological flow networks. These flow diagrams contain the structure, the relative position and the connections between the components of a system, and the quantity of matter flowing along every connection. In this paper it is shown how these ecological flow networks can be used to produce a quantitative model similar to the qualitative counterpart. Analyzed through the apparatus of loop analysis, this quantitative model yields predictions that are by no means ambiguous, solving in an elegant way the basic problem of qualitative analysis. The approach adopted in this work is still preliminary and we must be careful in its application.
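The loop-analysis step can be made concrete: for a community (Jacobian) matrix A, the classic press-perturbation prediction for species i's response to a sustained input to species j is the (i, j) entry of -A^-1. A toy two-species sketch with hypothetical coefficients (not the paper's flow-network-derived values):

```python
import numpy as np

# Community (Jacobian) matrix for a toy prey-predator pair:
# A[i, j] = effect of species j on the growth of species i.
A = np.array([[-2.0, -1.0],    # prey: self-limited, eaten by predator
              [ 1.0, -1.0]])   # predator: benefits from prey, self-limited

# Press-perturbation predictions: response of species i to a sustained
# positive input to species j is entry (i, j) of -inv(A).
response = -np.linalg.inv(A)
signs = np.sign(response)      # the qualitative (directional) prediction
```

With link magnitudes specified, every entry of `response` has a definite sign, removing the ambiguity that purely qualitative loop analysis can leave when opposing pathways compete.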

  9. An energy-based model for the image edge-histogram specification problem.

    Science.gov (United States)

    Mignotte, Max

    2012-01-01

    In this correspondence, we present an original energy-based model that achieves the edge-histogram specification of a real input image and thus extends the exact specification method of the image luminance (or gray level) distribution recently proposed by Coltuc et al. Our edge-histogram specification approach is stated as an optimization problem in which each edge of a real input image will tend iteratively toward some specified gradient magnitude values given by a target edge distribution (or a normalized edge histogram possibly estimated from a target image). To this end, a hybrid optimization scheme combining a global and deterministic conjugate-gradient-based procedure and a local stochastic search using the Metropolis criterion is proposed herein to find a reliable solution to our energy-based model. Experimental results are presented, and several applications follow from this procedure.
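For contrast with the paper's energy-based optimization, classical histogram specification can be done by rank-order (quantile) matching. The sketch below applies that simpler idea to simulated gradient magnitudes; the input and target distributions are hypothetical, and this is not the authors' hybrid conjugate-gradient/Metropolis scheme:

```python
import numpy as np

rng = np.random.default_rng(4)

def specify_histogram(values, target):
    """Exact rank-order specification: map each value to the quantile of
    `target` that matches its rank among `values`."""
    ranks = np.argsort(np.argsort(values))       # 0..n-1
    q = (ranks + 0.5) / values.size
    return np.quantile(target, q)

grad_mag = rng.exponential(1.0, size=10_000)  # input gradient magnitudes
target = rng.rayleigh(2.0, size=5_000)        # desired edge distribution
matched = specify_histogram(grad_mag, target)
```

Rank-order matching preserves the ordering of edge strengths while imposing the target distribution; the energy-based formulation in the paper additionally keeps the result spatially coherent as an image.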

  10. Modelling, Specification and Robustness Issues for Robotic Manipulation Tasks

    Directory of Open Access Journals (Sweden)

    Danica Kragic

    2008-11-01

    Full Text Available In this paper, a system for modeling of service robot tasks is presented. Our work is motivated by the idea that a robotic task may be represented as a set of tractable modules, each responsible for a certain part of the task. For general fetch-and-carry robotic applications, there will be varying demands for precision and degrees of freedom involved, depending on the complexity of the individual module. The particular research problem considered here is the development of a system that supports simple design of complex tasks from a set of basic primitives. The three system levels considered are: (i) task graph generation, which allows the user to easily design or model a task, (ii) task graph execution, which executes the task graph, and (iii) at the lowest level, the specification and development of primitives required for general fetch-and-carry robotic applications. In terms of robustness, we believe that one way of increasing the robustness of the whole system is by increasing the robustness of individual modules. In particular, we consider a number of different parameters that affect the performance of a model-based tracking system. Parameters such as color channels, feature detection, validation gates, outlier rejection and feature selection are considered here, and their effect on the overall system performance is discussed. Experimental evaluation shows how some of these parameters can successfully be evaluated (learned) on-line and consequently improve the performance of the system.

  12. Improving Baseline Model Assumptions: Evaluating the Impacts of Typical Methodological Approaches in Watershed Models

    Science.gov (United States)

    Muenich, R. L.; Kalcic, M. M.; Teshager, A. D.; Long, C. M.; Wang, Y. C.; Scavia, D.

    2017-12-01

    Thanks to the availability of open-source software, online tutorials, and advanced software capabilities, watershed modeling has expanded its user-base and applications significantly in the past thirty years. Even complicated models like the Soil and Water Assessment Tool (SWAT) are being used and documented in hundreds of peer-reviewed publications each year, and are likely applied even more often in practice. These models can help improve our understanding of present, past, and future conditions, or analyze important "what-if" management scenarios. However, baseline data and methods are often adopted and applied without rigorous testing. In multiple collaborative projects, we have evaluated the influence of some of these common approaches on model results. Specifically, we examined impacts of baseline data and assumptions involved in manure application, combined sewer overflows, and climate data incorporation across multiple watersheds in the Western Lake Erie Basin. In these efforts, we seek to understand the impact of using typical modeling data and assumptions, versus using improved data and enhanced assumptions, on model outcomes and thus, ultimately, study conclusions. We provide guidance for modelers as they adopt and apply data and models for their specific study region. While it is difficult to quantitatively assess the full uncertainty surrounding model input data and assumptions, recognizing the impacts of model input choices is important when considering actions at both the field and watershed scales.

  13. Identification and Validation of Specific Markers of Bacillus anthracis Spores by Proteomics and Genomics Approaches*

    Science.gov (United States)

    Chenau, Jérôme; Fenaille, François; Caro, Valérie; Haustant, Michel; Diancourt, Laure; Klee, Silke R.; Junot, Christophe; Ezan, Eric; Goossens, Pierre L.; Becher, François

    2014-01-01

    Bacillus anthracis is the causative bacterium of anthrax, an acute and often fatal disease in humans. The infectious agent, the spore, represents a real bioterrorism threat and its specific identification is crucial. However, because of the high genomic relatedness within the Bacillus cereus group, it is still a real challenge to identify B. anthracis spores confidently. Mass spectrometry-based tools represent a powerful approach to the efficient discovery and identification of such protein markers. Here we undertook comparative proteomics analyses of Bacillus anthracis, cereus and thuringiensis spores to identify proteoforms unique to B. anthracis. The marker discovery pipeline developed combined peptide- and protein-centric approaches using liquid chromatography coupled to tandem mass spectrometry experiments using a high resolution/high mass accuracy LTQ-Orbitrap instrument. By combining these data with those from complementary bioinformatics approaches, we were able to highlight a dozen novel proteins consistently observed across all the investigated B. anthracis spores while being absent in B. cereus/thuringiensis spores. To further demonstrate the relevance of these markers and their strict specificity to B. anthracis, the number of strains studied was extended to 55, by including closely related strains such as B. thuringiensis 9727, and above all the B. cereus biovar anthracis CI, CA strains that possess pXO1- and pXO2-like plasmids. Under these conditions, the combination of proteomics and genomics approaches confirms the pertinence of 11 markers. Genes encoding these 11 markers are located on the chromosome, which provides additional targets complementary to the commonly used plasmid-encoded markers. Last but not least, we also report the development of a targeted liquid chromatography coupled to tandem mass spectrometry method involving the selected reaction monitoring mode for the monitoring of the 4 most suitable protein markers. Within a proof

  14. A Discrete Monetary Economic Growth Model with the MIU Approach

    Directory of Open Access Journals (Sweden)

    Wei-Bin Zhang

    2008-01-01

    Full Text Available This paper proposes an alternative approach to economic growth with money. The production side is the same as the Solow model, the Ramsey model, and the Tobin model. But we deal with the behavior of consumers differently from the traditional approaches. The model is influenced by the money-in-the-utility (MIU) approach in monetary economics. It provides a mechanism of endogenous saving which the Solow model lacks and avoids the assumption of adding up utility over a period of time upon which the Ramsey approach is based.
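For reference, the standard MIU formulation from monetary economics that the abstract refers to (a Sidrauski-type textbook setup, which the paper adapts rather than adopts wholesale, since it avoids the Ramsey-style summation of utility over time) can be sketched as:

```latex
% Generic money-in-the-utility (MIU) household problem (textbook form,
% not the paper's exact notation). Real balances m_t = M_t / P_t enter
% the per-period utility alongside consumption c_t:
\max_{\{c_t,\, m_t,\, k_{t+1}\}} \;\; \sum_{t=0}^{\infty} \beta^{t}\, u(c_t, m_t),
    \qquad 0 < \beta < 1,
% subject to the period budget constraint, with Solow-type output f(k_t),
% depreciation rate \delta and inflation rate \pi_t:
\text{s.t.} \quad c_t + k_{t+1} - (1-\delta)\,k_t
    + m_t - \frac{m_{t-1}}{1+\pi_t} \;=\; f(k_t).
```

The paper's contribution is to keep the MIU idea of valuing real balances while replacing the infinite-horizon summation of utility with an alternative treatment of household behaviour.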

  15. Factors affecting forward pricing behaviour: implications of alternative regression model specifications

    Directory of Open Access Journals (Sweden)

    Henry Jordaan

    2010-12-01

    Full Text Available Price risk associated with maize production became a reason for concern in South Africa only after the deregulation of the agricultural commodities markets in the mid-1990s, when farmers became responsible for marketing their own crops. Although farmers can use, inter alia, cash forward contracting and/or the derivatives market to manage price risk, few farmers actually participate in forward pricing. A similar reluctance to use forward pricing methods is also found internationally. A number of different model specifications have been used in previous research to model forward pricing behaviour, based on the assumption that the same variables influence both the adoption and the quantity decision. This study compares the results from a model specification which models forward pricing behaviour in a single-decision framework with the results from modelling the quantity decision conditional on the adoption decision in a two-step approach. The results suggest that substantially more information is obtained by modelling forward pricing behaviour as two separate decisions rather than a single decision. Such information may be valuable in material compiled to educate farmers in the effective use of forward pricing methods in price risk management. Modelling forward pricing behaviour as two separate decisions is thus a more effective means of modelling forward pricing behaviour than modelling it as a single decision.
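The two-step structure described above (a discrete adoption decision followed by a quantity decision conditional on adoption) can be sketched on synthetic data. The covariates, coefficients and estimators below are illustrative assumptions, not the study's actual specification:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Hypothetical farmer covariates: intercept, farm size, risk aversion.
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])

# Step-1 truth: adoption of forward pricing (binary).
adopt_p = 1 / (1 + np.exp(-(X @ np.array([-0.5, 1.0, -0.8]))))
adopt = rng.random(n) < adopt_p

# Step-2 truth: quantity hedged, observed only for adopters.
qty = X @ np.array([10.0, 3.0, 0.5]) + rng.normal(scale=1.0, size=n)

# --- Step 1: logit fit by Newton-Raphson (adoption decision) ---
beta = np.zeros(3)
for _ in range(25):
    p = 1 / (1 + np.exp(-(X @ beta)))
    grad = X.T @ (adopt - p)             # gradient of the log-likelihood
    hess = -(X * (p * (1 - p))[:, None]).T @ X
    beta -= np.linalg.solve(hess, grad)  # Newton update

# --- Step 2: OLS on the adopters only (quantity decision) ---
gamma, *_ = np.linalg.lstsq(X[adopt], qty[adopt], rcond=None)

print(np.round(beta, 2))   # adoption-equation coefficients
print(np.round(gamma, 2))  # quantity-equation coefficients
```

The single-decision framework would force one coefficient vector on both choices; here each decision gets its own, which is the extra information the abstract refers to.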

  16. Element-specific density profiles in interacting biomembrane models

    International Nuclear Information System (INIS)

    Schneck, Emanuel; Rodriguez-Loureiro, Ignacio; Bertinetti, Luca; Gochev, Georgi; Marin, Egor; Novikov, Dmitri; Konovalov, Oleg

    2017-01-01

    Surface interactions involving biomembranes, such as cell–cell interactions or membrane contacts inside cells play important roles in numerous biological processes. Structural insight into the interacting surfaces is a prerequisite to understand the interaction characteristics as well as the underlying physical mechanisms. Here, we work with simplified planar experimental models of membrane surfaces, composed of lipids and lipopolymers. Their interaction is quantified in terms of pressure–distance curves using ellipsometry at controlled dehydrating (interaction) pressures. For selected pressures, their internal structure is investigated by standing-wave x-ray fluorescence (SWXF). This technique yields specific density profiles of the chemical elements P and S belonging to lipid headgroups and polymer chains, as well as counter-ion profiles for charged surfaces. (paper)

  17. Validation of a Parametric Approach for 3d Fortification Modelling: Application to Scale Models

    Science.gov (United States)

    Jacquot, K.; Chevrier, C.; Halin, G.

    2013-02-01

    The parametric modelling approach applied to cultural heritage virtual representation is a field of research that has been explored for years, since it can address many limitations of digitising tools. For example, essential historical sources for fortification virtual reconstructions, like plans-reliefs, have several shortcomings when they are scanned. To overcome those problems, knowledge-based modelling can be used: knowledge models based on the analysis of the theoretical literature of a specific domain, such as bastioned fortification treatises, can be the cornerstone of the creation of a parametric library of fortification components. Implemented in Grasshopper, these components are manually adjusted on the data available (i.e. 3D surveys of plans-reliefs or scanned maps). Most of the fortification area is now modelled and the question of accuracy assessment is raised. A specific method is used to evaluate the accuracy of the parametric components. The results of the assessment process will allow us to validate the parametric approach. The automation of the adjustment process can finally be planned. The virtual model of fortification is part of a larger project aimed at valorising and diffusing a very unique cultural heritage item: the collection of plans-reliefs. As such, knowledge models are precious assets when automation and semantic enhancements are considered.

  18. Mathematical Modelling Approach in Mathematics Education

    Science.gov (United States)

    Arseven, Ayla

    2015-01-01

    The topic of models and modeling has come to be important for science and mathematics education in recent years. The topic of "Modeling" is especially important for examinations such as PISA, which is conducted at an international level and measures a student's success in mathematics. Mathematical modeling can be defined as using…

  19. A Multivariate Approach to Functional Neuro Modeling

    DEFF Research Database (Denmark)

    Mørch, Niels J.S.

    1998-01-01

    by the application of linear and more flexible, nonlinear microscopic regression models to a real-world dataset. The dependency of model performance, as quantified by generalization error, on model flexibility and training set size is demonstrated, leading to the important realization that no uniformly optimal model......, provides the basis for a generalization theoretical framework relating model performance to model complexity and dataset size. Briefly summarized the major topics discussed in the thesis include: - An introduction of the representation of functional datasets by pairs of neuronal activity patterns...... exists. - Model visualization and interpretation techniques. The simplicity of this task for linear models contrasts the difficulties involved when dealing with nonlinear models. Finally, a visualization technique for nonlinear models is proposed. A single observation emerges from the thesis...

  20. A new approach for modeling dry deposition velocity of particles

    Science.gov (United States)

    Giardina, M.; Buffa, P.

    2018-05-01

    The dry deposition process is recognized as an important pathway among the various removal processes of pollutants in the atmosphere. Several models reported in the literature can predict the dry deposition velocity of particles of different diameters, but many of them cannot represent dry deposition phenomena for several categories of pollutants and deposition surfaces. Moreover, their application is valid only under specific conditions, and only if the data in that application meet all of the assumptions used to define the model. In this paper a new dry deposition velocity model based on an electrical analogy scheme is proposed to overcome the above issues. The dry deposition velocity is evaluated by assuming that the resistances that affect the particle flux in the quasi-laminar sublayer can be combined to take into account local features of the mutual influence of inertial impact processes and turbulent ones. Comparisons with experimental data from the literature indicate that the proposed model captures the main dry deposition phenomena with good agreement for the examined environmental conditions and deposition surfaces. The proposed approach could be easily implemented within atmospheric dispersion modelling codes, efficiently addressing different deposition surfaces for several particle pollutants.
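The electrical-analogy idea of combining resistances can be illustrated with the classical series-resistance formula for dry deposition (the textbook form, not the specific model proposed in the paper):

```python
def deposition_velocity(r_a: float, r_b: float, v_s: float) -> float:
    """Classical series-resistance estimate of dry deposition velocity (m/s).

    r_a: aerodynamic resistance (s/m), r_b: quasi-laminar sublayer
    resistance (s/m), v_s: gravitational settling velocity (m/s).
    The r_a * r_b * v_s term accounts for settling acting in parallel
    with the two resistances in series.
    """
    return v_s + 1.0 / (r_a + r_b + r_a * r_b * v_s)

# Illustrative values (not from the paper): a moderately turbulent
# surface layer and a fine particle with negligible settling.
print(deposition_velocity(r_a=50.0, r_b=200.0, v_s=1e-4))
```

The paper's contribution is to replace fixed resistances like `r_b` with terms that depend on the local interplay of inertial impaction and turbulence; the combination rule above is the shared skeleton.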

  1. Violence among young men: the importance of a gender-specific developmental approach to adolescent male suicide and homicide.

    Science.gov (United States)

    Rice, Timothy R

    2015-05-01

    Suicide and homicide are much more commonly committed by adolescent males than females. Herein, a proposal in favor of a gender-specific understanding of and approach to these violent behaviors is presented. Social and healthcare service system factors, including issues of male adolescents' access to care and help-seeking behaviors, are reviewed alongside the epidemiology of adolescent suicide and homicide, as a transition into a detailed discussion of the putative biological factors at play. An emphasis upon the male androgen testosterone organizes the discussion. Behavioral manifestations of this brain-based organizational model are presented with a focus on impulsivity, aggression, and externalizing dysregulated emotionality. Treatment considerations and implications are developed.

  2. Specific acoustic models for spontaneous and dictated style in indonesian speech recognition

    Science.gov (United States)

    Vista, C. B.; Satriawan, C. H.; Lestari, D. P.; Widyantoro, D. H.

    2018-03-01

    The performance of an automatic speech recognition system is affected by differences in speech style between the data the model is originally trained upon and incoming speech to be recognized. In this paper, the usage of GMM-HMM acoustic models for specific speech styles is investigated. We develop two systems for the experiments; the first employs a speech style classifier to predict the speech style of incoming speech, either spontaneous or dictated, then decodes this speech using an acoustic model specifically trained for that speech style. The second system uses both acoustic models to recognize incoming speech and decides upon a final result by comparing the confidence scores of the decodings. Results show that training specific acoustic models for spontaneous and dictated speech styles confers a slight recognition advantage compared to a baseline model trained on a mixture of spontaneous and dictated training data. In addition, the speech style classifier approach of the first system produced slightly more accurate results than the confidence scoring employed in the second system.

  3. Sex-specific habitat suitability models for Panthera tigris in Chitwan National Park, Nepal

    Science.gov (United States)

    Battle, Curtis Scott

    Although research on wildlife species across taxa has shown that males and females differentially select habitat, sex-specific models of habitat suitability for endangered species are uncommon. Here, we developed such models for Bengal Tigers (Panthera tigris) based on camera trap data collected from 20 January to 22 March, 2010, within Chitwan National Park, Nepal, and its buffer zone. We compared these to a sex-indiscriminate habitat suitability model in order to identify information that is lost when occurrence data for both sexes are included in the same model, as well as to assess the benefits of a sex-specific approach to habitat suitability modelling. Our sex-specific models allowed us to produce more informative and detailed habitat suitability maps, highlighting key differences in the distribution of suitable habitats for males and females, preferences in vegetation structure, and habitat use near human settlements. In the context of global tiger conservation, such information is essential to fulfilling established conservation goals and population recovery targets.

  4. Zooming-in on cancer metabolic rewiring with tissue specific constraint-based models.

    Science.gov (United States)

    Di Filippo, Marzia; Colombo, Riccardo; Damiani, Chiara; Pescini, Dario; Gaglio, Daniela; Vanoni, Marco; Alberghina, Lilia; Mauri, Giancarlo

    2016-06-01

    The metabolic rearrangements occurring in cancer cells can be effectively investigated with a Systems Biology approach supported by metabolic network modeling. We here present tissue-specific constraint-based core models for three different types of tumors (liver, breast and lung) that serve this purpose. The core models were extracted and manually curated from the corresponding genome-scale metabolic models in the Human Metabolic Atlas database with a focus on the pathways that are known to play a key role in cancer growth and proliferation. Along similar lines, we also reconstructed a core model from the original general human metabolic network to be used as a reference model. A comparative Flux Balance Analysis between the reference and the cancer models highlighted both a clear distinction between the two conditions and a heterogeneity within the three different cancer types in terms of metabolic flux distribution. These results emphasize the need for modeling approaches able to keep up with this tumoral heterogeneity in order to identify more suitable drug targets and develop effective treatments. According to this perspective, we identified key points able to reverse the tumoral phenotype toward the reference one or vice-versa. Copyright © 2016 Elsevier Ltd. All rights reserved.
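A Flux Balance Analysis of the kind compared here reduces to a linear program: maximise a biomass objective subject to steady-state mass balance S·v = 0 and flux bounds. A minimal sketch on a toy four-reaction network (illustrative only, not the paper's curated core models):

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: R1 (uptake -> A), R2 (A -> B), R3 (A -> C), R4 (B + C -> biomass).
# Rows are metabolites A, B, C; columns are reactions R1..R4.
S = np.array([
    [1, -1, -1,  0],   # A
    [0,  1,  0, -1],   # B
    [0,  0,  1, -1],   # C
])
bounds = [(0, 10), (0, None), (0, None), (0, None)]  # uptake capped at 10

# FBA: maximise the biomass flux (R4). linprog minimises, so negate it.
c = np.array([0, 0, 0, -1])
res = linprog(c, A_eq=S, b_eq=np.zeros(3), bounds=bounds)

print(res.x)      # optimal flux distribution
print(-res.fun)   # maximal biomass flux
```

Comparing such flux distributions between a reference model and a tumor-specific model is exactly the kind of comparative FBA the abstract describes, only at genome/core scale.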

  5. Modelling the Heat Consumption in District Heating Systems using a Grey-box approach

    DEFF Research Database (Denmark)

    Nielsen, Henrik Aalborg; Madsen, Henrik

    2006-01-01

    The heat consumption in a large geographical area is considered together with climate measurements on a single location in the area. The purpose is to identify a model linking the heat consumption to climate and calendar information. The process of building a model is split into a theoretical based...... identification of an overall model structure followed by data-based modelling, whereby the details of the model are identified. This approach is sometimes called grey-box modelling, but the specific approach used here does not require states to be specified. Overall, the paper demonstrates the power of the grey...

  6. A systems biology approach to the analysis of subset-specific responses to lipopolysaccharide in dendritic cells.

    Directory of Open Access Journals (Sweden)

    David G Hancock

    Full Text Available Dendritic cells (DCs) are critical for regulating CD4 and CD8 T cell immunity, controlling Th1, Th2, and Th17 commitment, generating inducible Tregs, and mediating tolerance. It is believed that distinct DC subsets have evolved to control these different immune outcomes. However, how DC subsets mount different responses to inflammatory and/or tolerogenic signals in order to accomplish their divergent functions remains unclear. Lipopolysaccharide (LPS) provides an excellent model for investigating responses in closely related splenic DC subsets, as all subsets express the LPS receptor TLR4 and respond to LPS in vitro. However, previous studies of the LPS-induced DC transcriptome have been performed only on mixed DC populations. Moreover, comparisons of the in vivo response of two closely related DC subsets to LPS stimulation have not been reported in the literature to date. We compared the transcriptomes of murine splenic CD8 and CD11b DC subsets after in vivo LPS stimulation, using RNA-Seq and systems biology approaches. We identified subset-specific gene signatures, which included multiple functional immune mediators unique to each subset. To explain the observed subset-specific differences, we used a network analysis approach. While both DC subsets used a conserved set of transcription factors and major signalling pathways, the subsets showed differential regulation of sets of genes that 'fine-tune' the network hubs expressed in common. We propose a model in which signalling through common pathway components is 'fine-tuned' by transcriptional control of subset-specific modulators, thus allowing for distinct functional outcomes in closely related DC subsets. We extend this analysis to comparable datasets from the literature and confirm that our model can account for cell subset-specific responses to LPS stimulation in multiple subpopulations in mouse and man.

  7. Multi-approach assessment of the spatial distribution of the specific yield: application to the Crau plain aquifer, France

    Science.gov (United States)

    Seraphin, Pierre; Gonçalvès, Julio; Vallet-Coulomb, Christine; Champollion, Cédric

    2018-03-01

    Spatially distributed values of the specific yield, a fundamental parameter for transient groundwater mass balance calculations, were obtained by means of three independent methods for the Crau plain, France. In contrast to its traditional use to assess recharge based on a given specific yield, the water-table fluctuation (WTF) method, applied using major recharging events, gave a first set of reference values. Then, large infiltration processes recorded by monitored boreholes and caused by major precipitation events were interpreted in terms of specific yield by means of a one-dimensional vertical numerical model solving Richards' equations within the unsaturated zone. Finally, two gravity field campaigns, at low and high piezometric levels, were carried out to assess the groundwater mass variation and thus alternative specific yield values. The range obtained by the WTF method for this aquifer made of alluvial detrital material was 2.9-26%, in line with the scarce data available so far. The average spatial value of specific yield by the WTF method (9.1%) is consistent with the aquifer scale value from the hydro-gravimetric approach. In this investigation, an estimate of the hitherto unknown spatial distribution of the specific yield over the Crau plain was obtained using the most reliable method (the WTF method). A groundwater mass balance calculation over the domain using this distribution yielded similar results to an independent quantification based on a stable isotope-mixing model. This agreement reinforces the relevance of such estimates, which can be used to build a more accurate transient hydrogeological model.
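The WTF method itself is a one-line calculation: the specific yield is the ratio of an event's recharge depth to the water-table rise it produces. A sketch with hypothetical values (not the Crau data):

```python
def specific_yield_wtf(recharge_mm: float, head_rise_mm: float) -> float:
    """Water-table fluctuation (WTF) estimate: Sy = R / dh, where R is the
    recharge depth of an event and dh the water-table rise it produces
    (same length units for both)."""
    if head_rise_mm <= 0:
        raise ValueError("WTF requires a positive water-table rise")
    return recharge_mm / head_rise_mm

# Illustrative event: 45 mm of recharging rainfall producing a 500 mm rise.
print(specific_yield_wtf(45.0, 500.0))  # 0.09, i.e. a specific yield of 9%
```

Conversely, once Sy is known at a location, the same relation inverted (R = Sy · dh) is the method's traditional use for recharge assessment, as the abstract notes.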

  8. Seismic safety margins research program. Phase I. Project VII: systems analysis specifications of computational approach

    International Nuclear Information System (INIS)

    Collins, J.D.; Hudson, J.M.; Chrostowski, J.D.

    1979-02-01

    A computational methodology is presented for the prediction of core melt probabilities in a nuclear power plant due to earthquake events. The proposed model has four modules: seismic hazard, structural dynamics (including soil-structure interaction), component failure and core melt sequence. The proposed modules would operate in series and would not have to be operated at the same time. The basic statistical approach uses a Monte Carlo simulation to treat random and systematic error, but alternate statistical approaches are permitted by the program design.
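The modules-in-series Monte Carlo structure can be sketched as follows; the hazard distribution, fragility parameters and AND-gate core melt logic below are toy assumptions for illustration, not the methodology's calibrated values:

```python
import math
import random

random.seed(1)

def sample_pga():
    """Module 1 (seismic hazard): draw a toy lognormal peak ground
    acceleration, in g. Parameters are illustrative only."""
    return random.lognormvariate(-1.5, 0.6)

def component_fails(pga, median=0.6, beta=0.4):
    """Modules 2-3 (structural response + component failure): sample
    failure from a lognormal fragility curve evaluated at this pga."""
    z = math.log(pga / median) / beta
    p_fail = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # lognormal CDF
    return random.random() < p_fail

def core_melt(n_trains=3):
    """Module 4 (core melt sequence): toy AND-gate over redundant safety
    trains, all conditioned on the same earthquake."""
    pga = sample_pga()
    return all(component_fails(pga) for _ in range(n_trains))

trials = 100_000
p_cm = sum(core_melt() for _ in range(trials)) / trials
print(p_cm)  # crude conditional core-melt probability per (toy) event
```

Because the modules only exchange intermediate quantities (here, the sampled pga), they can indeed be run in series rather than simultaneously, as the abstract describes.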

  9. Uncertainty in biology a computational modeling approach

    CERN Document Server

    Gomez-Cabrero, David

    2016-01-01

    Computational modeling of biomedical processes is gaining more and more weight in the current research into the etiology of biomedical problems and potential treatment strategies. Computational modeling allows researchers to reduce, refine and replace animal experimentation as well as to translate findings obtained in these experiments to the human background. However, these biomedical problems are inherently complex, with a myriad of influencing factors, which strongly complicates the model building and validation process. This book wants to address four main issues related to the building and validation of computational models of biomedical processes: Modeling establishment under uncertainty Model selection and parameter fitting Sensitivity analysis and model adaptation Model predictions under uncertainty In each of the abovementioned areas, the book discusses a number of key techniques by means of a general theoretical description followed by one or more practical examples. This book is intended for graduate stude...

  10. Relaxed memory models: an operational approach

    OpenAIRE

    Boudol , Gérard; Petri , Gustavo

    2009-01-01

    International audience; Memory models define an interface between programs written in some language and their implementation, determining which behaviour the memory (and thus a program) is allowed to have in a given model. A minimal guarantee memory models should provide to the programmer is that well-synchronized, that is, data-race free code has a standard semantics. Traditionally, memory models are defined axiomatically, setting constraints on the order in which memory operations are allow...

  11. Numerical modelling approach for mine backfill

    Indian Academy of Sciences (India)

    ... of mine backfill material needs special attention as the numerical model must behave realistically and in accordance with the site conditions. This paper discusses a numerical modelling strategy for modelling mine backfill material. The modelling strategy is studied using a case study mine from the Canadian mining industry.

  12. Personalization of models with many model parameters: an efficient sensitivity analysis approach.

    Science.gov (United States)

    Donders, W P; Huberts, W; van de Vosse, F N; Delhaas, T

    2015-10-01

    Uncertainty quantification and global sensitivity analysis are indispensable for patient-specific applications of models that enhance diagnosis or aid decision-making. Variance-based sensitivity analysis methods, which apportion each fraction of the output uncertainty (variance) to the effects of individual input parameters or their interactions, are considered the gold standard. The variance portions are called the Sobol sensitivity indices and can be estimated by a Monte Carlo (MC) approach (e.g., Saltelli's method [1]) or by employing a metamodel (e.g., the (generalized) polynomial chaos expansion (gPCE) [2, 3]). All these methods require a large number of model evaluations when estimating the Sobol sensitivity indices for models with many parameters [4]. To reduce the computational cost, we introduce a two-step approach. In the first step, a subset of important parameters is identified for each output of interest using the screening method of Morris [5]. In the second step, a quantitative variance-based sensitivity analysis is performed using gPCE. Efficient sampling strategies are introduced to minimize the number of model runs required to obtain the sensitivity indices for models considering multiple outputs. The approach is tested using a model that was developed for predicting post-operative flows after creation of a vascular access for renal failure patients. We compare the sensitivity indices obtained with the novel two-step approach with those obtained from a reference analysis that applies Saltelli's MC method. The two-step approach was found to yield accurate estimates of the sensitivity indices at two orders of magnitude lower computational cost. Copyright © 2015 John Wiley & Sons, Ltd.
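Step one of the two-step approach, Morris screening, estimates a mean absolute elementary effect (mu*) per parameter; only parameters with large mu* are carried into the expensive gPCE step. A simplified one-at-a-time sketch (a reduced variant, not Morris' original trajectory design):

```python
import numpy as np

def morris_screening(f, k, r=50, levels=4, seed=0):
    """Elementary-effects screening of f on [0, 1]^k.

    Returns mu* (mean absolute elementary effect) per input; a large
    mu* flags that input as important enough for the follow-up
    variance-based analysis."""
    rng = np.random.default_rng(seed)
    delta = levels / (2.0 * (levels - 1))   # standard Morris step size
    step = round(delta * (levels - 1))      # the same step in grid units
    ee = np.zeros((r, k))
    for i in range(r):
        # Base point on the grid, chosen so that x + delta stays in [0, 1].
        x = rng.integers(0, levels - step, size=k) / (levels - 1)
        fx = f(x)
        for j in range(k):
            xp = x.copy()
            xp[j] += delta                  # perturb one input at a time
            ee[i, j] = (f(xp) - fx) / delta
    return np.abs(ee).mean(axis=0)

# Toy model: strong in x0, weak in x1, inert in x2 (illustrative only).
def toy(x):
    return 10.0 * x[0] + 0.5 * x[1] ** 2

mu_star = morris_screening(toy, k=3)
print(np.round(mu_star, 2))  # x0 dominates, x2 is ~0 and can be dropped
```

In the two-step scheme, inputs like `x2` would be frozen at nominal values, shrinking the parameter space the gPCE metamodel has to cover.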

  13. Numerical modelling of carbonate platforms and reefs: approaches and opportunities

    Energy Technology Data Exchange (ETDEWEB)

    Dalmasso, H.; Montaggioni, L.F.; Floquet, M. [Universite de Provence, Marseille (France). Centre de Sedimentologie-Palaeontologie; Bosence, D. [Royal Holloway University of London, Egham (United Kingdom). Dept. of Geology

    2001-07-01

    This paper compares different computing procedures that have been utilized in simulating shallow-water carbonate platform development. Based on our geological knowledge we can usually give a rather accurate qualitative description of the mechanisms controlling geological phenomena. Further description requires the use of computer stratigraphic simulation models that allow quantitative evaluation and understanding of the complex interactions of sedimentary depositional carbonate systems. The roles of modelling include: (1) encouraging accuracy and precision in data collection and process interpretation (Watney et al., 1999); (2) providing a means to quantitatively test interpretations concerning the control of various mechanisms on producing sedimentary packages; (3) predicting or extrapolating results into areas of limited control; (4) gaining new insights regarding the interaction of parameters; (5) helping to focus future studies to resolve specific problems. This paper addresses two main questions, namely: (1) What are the advantages and disadvantages of various types of models? (2) How well do models perform? In this paper we compare and discuss the application of six numerical models: CARBONATE (Bosence and Waltham, 1990), FUZZIM (Nordlund, 1999), CARBPLAT (Bosscher, 1992), DYNACARB (Li et al., 1993), PHIL (Bowman, 1997) and SEDPAK (Kendall et al., 1991). The comparison, testing and evaluation of these models allow one to gain a better knowledge and understanding of controlling parameters of carbonate platform development, which are necessary for modelling. Evaluating numerical models, critically comparing results from models using different approaches, and pushing experimental tests to their limits, provide an effective vehicle to improve and develop new numerical models. A main feature of this paper is to closely compare the performance between two numerical models: a forward model (CARBONATE) and a fuzzy logic model (FUZZIM). These two models use common

  14. Cognitive approach to assessing pragmatic language comprehension in children with specific language impairment.

    Science.gov (United States)

    Ryder, Nuala; Leinonen, Eeva; Schulz, Joerg

    2008-01-01

    Pragmatic language impairment in children with specific language impairment has proved difficult to assess, and the nature of their abilities to comprehend pragmatic meaning has not been fully investigated. To develop both a cognitive approach to pragmatic language assessment based on Relevance Theory and an assessment tool for identifying a group of children with pragmatic language impairment from within a specific language impairment group. The authors focused on Relevance Theory's view of the role of context in pragmatic language comprehension, using questions of increasing pragmatic complexity in different verbal contexts (scenarios with and without pictures and a story with supporting pictures). The performances of the children with and without pragmatic impairment on the most pragmatically demanding Implicature questions were examined. This study included 99 children: 27 with specific language impairment (including nine pragmatically impaired children) and two groups of typically developing children (32 children aged 5-6 years and 40 children aged 7-11 years). The specific language impairment group performed similarly to their peers when utilizing context in inferring referents, inferring semantic meaning, and generating Implicatures, only when the answer was provided by pictorial context. Both the children with specific language impairment and the 5-6 year olds were not yet competent at utilizing verbal context when answering the most pragmatically demanding questions (targeting Implicature). On these questions the children with pragmatic language impairment performed significantly more poorly than the rest of the specific language impairment group, and performance scores on Implicature questions were found to accurately identify the children with pragmatic language impairment from the rest of the specific language impairment group (sensitivity = 89%). 
Children's ability to infer and integrate information in the comprehension of pragmatic meaning was found to be

  15. A working group's conclusion on site specific flow and transport modelling

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, J. [Golder Associates AB (Sweden); Ahokas, H. [Fintact Oy, Helsinki (Finland); Koskinen, L.; Poteri, A. [VTT Energy, Espoo (Finland); Niemi, A. [Royal Inst. of Technology, Stockholm (Sweden). Hydraulic Engineering; Hautojaervi, A. [Posiva Oy, Helsinki (Finland)

    1998-03-01

    This document presents a strategy plan for the groundwater flow and transport modelling to be used in the site-specific performance assessment of spent nuclear fuel disposal, in support of the site selection planned by the year 2000. Considering the suggested general regulations in Finland, as well as the suggested regulations in Sweden and the approach taken in recent safety assessment exercises conducted in these countries, it is clear that such an analysis, in addition to showing that the proposed repository is safe, needs to strengthen the link between field data, groundwater flow modelling and the derivation of safety assessment parameters, and to assess uncertainty and variability. The suggested strategy plan builds on an evaluation of different approaches to modelling groundwater flow in crystalline basement rock, the abundance of data collected in the site investigation programme in Finland, and the modelling methodology developed in the programme so far. It is suggested to model the whole system using nested models, where larger-scale models provide the boundary conditions for the smaller ones.

  16. Glass Transition Temperature- and Specific Volume- Composition Models for Tellurite Glasses

    Energy Technology Data Exchange (ETDEWEB)

    Riley, Brian J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Vienna, John D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2017-09-01

    This report provides models for predicting composition-property relationships for tellurite glasses, namely specific gravity and glass transition temperature. Included are the partial specific coefficients for each model, the component validity ranges, and the model fit parameters.

  17. Radiation Belt Specification and Situational Awareness using Data Assimilation Based Modeling

    Science.gov (United States)

    Reeves, G. D.; Koller, J.; Chen, Y.; Friedel, R. H.; Cayton, T. E.

    2006-12-01

    For a number of years now the operational limitations of the standard radiation belt models have been widely discussed. Doses from specific parts of the spectrum can be over- or under-estimated. The averaging procedures used do not give statistical distributions or worst-case fluences. And, critically, the models are not time-dependent or real-time. Here we present a new approach to radiation belt specification that provides fluxes, fluences, or dose rates for any arbitrary orbit and for any arbitrary mission duration up to and including real time. DREAM (the Dynamic Radiation Environment Assimilation Model) uses data assimilation techniques to combine measurements from geosynchronous and GPS satellites along with a physics-based model to derive an optimal state specification of the full radiation belts. The physical equations are solved by evolving phase space density at fixed adiabatic invariants. Once the underlying physical equations are solved and optimized with the current measured state (based on the observations), the phase space density representation is inverted back to physical space and physical fluxes. We show initial results for six months in 2002 and compare the orbital dose rates predicted by DREAM with those measured by HEO satellites. We also discuss how this model could be implemented with real-time data to provide space situational awareness and short-term radiation belt forecasts.

  18. Delay equations modeling the effects of phase-specific drugs and immunotherapy on proliferating tumor cells.

    Science.gov (United States)

    Barbarossa, Maria Vittoria; Kuttler, Christina; Zinsl, Jonathan

    2012-04-01

    In this work we present a mathematical model for tumor growth based on the biology of the cell cycle. For an appropriate description of the effects of phase-specific drugs, it is necessary to look at the cell cycle and its phases. Our model reproduces the dynamics of three different tumor cell populations: quiescent cells, cells during the interphase and mitotic cells. Starting from a partial differential equations (PDEs) setting, a delay differential equations (DDE) model is derived for an easier and more realistic approach. Our equations also include interactions of tumor cells with immune system effectors. We investigate the model both from the analytical and the numerical point of view, give conditions for positivity of solutions and focus on the stability of the cancer-free equilibrium. Different immunotherapeutic strategies and their effects on the tumor growth are considered, as well.
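
    The quiescent/interphase/mitotic structure described above can be sketched as a fixed-step integration with a discrete delay. All rates, the delay, and the survival treatment below are illustrative placeholders, not the paper's calibrated model, and the immune interactions are omitted:

```python
import numpy as np

def simulate_tumor(a1=0.8, a2=0.9, d=0.1, tau=1.0, T=20.0, h=0.01):
    """Toy delay model: quiescent cells Q enter interphase I at rate a1,
    reappear as mitotic cells M after a fixed delay tau (discounted by
    survival over the delay), and each completed mitosis returns two
    cells to Q. d is a common death rate. All values are illustrative."""
    n, lag = int(T / h), int(tau / h)
    Q, I, M = np.zeros(n + 1), np.zeros(n + 1), np.zeros(n + 1)
    Q[0] = 1.0
    surv = np.exp(-d * tau)  # fraction of a cohort surviving the interphase delay
    for k in range(n):
        # cells that entered interphase at time t - tau now become mitotic
        inflow = a1 * Q[k - lag] * surv if k >= lag else 0.0
        Q[k + 1] = Q[k] + h * (-a1 * Q[k] + 2.0 * a2 * M[k] - d * Q[k])
        I[k + 1] = I[k] + h * (a1 * Q[k] - inflow - d * I[k])
        M[k + 1] = M[k] + h * (inflow - a2 * M[k] - d * M[k])
    return Q, I, M
```

    With these placeholder rates the per-cycle reproduction number exceeds one, so the total population grows despite deaths, which is the qualitative regime the stability analysis in the paper addresses.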

  19. A Residual Approach for Balanced Truncation Model Reduction (BTMR) of Compartmental Systems

    Directory of Open Access Journals (Sweden)

    William La Cruz

    2014-05-01

    Full Text Available This paper presents a residual approach to the square-root balanced truncation algorithm for model order reduction of continuous, linear, time-invariant compartmental systems. Specifically, the new approach uses a residual method to approximate the controllability and observability Gramians, whose computation is an essential step of the square-root balanced truncation algorithm and carries a great computational cost. Numerical experiments are included to highlight the efficacy of the proposed approach.
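
    For reference, the baseline square-root balanced truncation algorithm that the residual approach modifies can be sketched as follows; the residual Gramian approximation itself is not reproduced, and the function name and test system are our own:

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, cholesky, svd

def sqrt_balanced_truncation(A, B, C, r):
    # Controllability/observability Gramians from the Lyapunov equations
    #   A P + P A^T + B B^T = 0   and   A^T Q + Q A + C^T C = 0
    P = solve_continuous_lyapunov(A, -B @ B.T)
    Q = solve_continuous_lyapunov(A.T, -C.T @ C)
    # Square-root (Cholesky) factors of the Gramians; computing these is
    # the expensive step the paper approximates with a residual method
    Lp = cholesky(P, lower=True)
    Lq = cholesky(Q, lower=True)
    # Hankel singular values from the SVD of the factor product
    U, s, Vt = svd(Lq.T @ Lp)
    Sr = np.diag(s[:r] ** -0.5)
    T = Lp @ Vt[:r].T @ Sr   # right projector
    W = Lq @ U[:, :r] @ Sr   # left projector (W.T @ T is the identity)
    return W.T @ A @ T, W.T @ B, C @ T, s
```

    Truncating to order r keeps the states associated with the largest Hankel singular values; with r equal to the full order the projection reduces to a similarity transform.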

  20. Models Portability: Some Considerations about Transdisciplinary Approaches

    Science.gov (United States)

    Giuliani, Alessandro

    Some critical issues concerning the portability of models and solutions across disciplinary barriers are discussed. The risks linked to the use of models and theories coming from different disciplines are highlighted, with particular emphasis on biology. A metaphorical use of conceptual tools coming from other fields is suggested, together with the inescapable need to judge the relative merits of a model on the basis of how many facts it explains within its particular domain of application. Some examples of metaphorical modeling from biochemistry and psychobiology are briefly discussed in order to clarify these positions.

  1. Systems approaches to computational modeling of the oral microbiome

    Directory of Open Access Journals (Sweden)

    Dimiter V. Dimitrov

    2013-07-01

    Full Text Available Current microbiome research has generated tremendous amounts of data providing snapshots of molecular activity in a variety of organisms, environments, and cell types. However, turning this knowledge into whole-system-level understanding of pathways and processes has proven to be a challenging task. In this review we highlight the applicability of bioinformatics and visualization techniques to large collections of data in order to better understand the information contained in diet – oral microbiome – host mucosal transcriptome interactions. In particular we focus on the systems biology of Porphyromonas gingivalis in the context of high-throughput computational methods tightly integrated with translational systems medicine. Those approaches have applications both for basic research, where we can direct specific laboratory experiments in model organisms and cell cultures, and for human disease, where we can validate new mechanisms and biomarkers for prevention and treatment of chronic disorders.

  2. A Centerline Based Model Morphing Algorithm for Patient-Specific Finite Element Modelling of the Left Ventricle.

    Science.gov (United States)

    Behdadfar, S; Navarro, L; Sundnes, J; Maleckar, M; Ross, S; Odland, H H; Avril, S

    2017-09-20

    Hexahedral automatic model generation is a recurrent problem in computer vision and computational biomechanics. It may even become a challenging problem when one wants to develop a patient-specific finite-element (FE) model of the left ventricle (LV), particularly when only low-resolution images are available. In the present study, a fast and efficient algorithm is presented and tested to address such a situation. A template FE hexahedral model was created for a LV geometry using a General Electric (GE) ultrasound (US) system. A system of centerlines was defined for this LV mesh. The nodes located on the endocardial and epicardial surfaces were then projected from this centerline onto the actual endocardial and epicardial surfaces reconstructed from the patient's US data. Finally, the position of the internal nodes was derived by finding the deformations with minimal elastic energy. This approach was applied to eight patients suffering from congestive heart disease. A FE analysis was performed on each of them to derive the stress induced in the LV tissue by diastolic blood pressure. Our model morphing algorithm was applied successfully and the obtained meshes showed only marginal mismatches when compared to the corresponding US geometries. The diastolic FE analyses were successfully performed in seven patients to derive the distribution of principal stresses. The original model morphing algorithm is fast and robust with low computational cost. This low-cost model morphing algorithm may be highly beneficial for future patient-specific reduced-order modelling of the LV, with potential application to other crucial organs.

  3. Modelling of subject specific based segmental dynamics of knee joint

    Science.gov (United States)

    Nasir, N. H. M.; Ibrahim, B. S. K. K.; Huq, M. S.; Ahmad, M. K. I.

    2017-09-01

    This study determines segmental dynamics parameters using a subject-specific method. Five hemiplegic patients participated in the study, two men and three women. Their ages ranged from 50 to 60 years, weights from 60 to 70 kg, and heights from 145 to 170 cm. The sample group included patients with different affected sides of stroke. The segmental dynamics parameters describing the knee joint functions were measured using Winter's measurements, and the model was generated by employing Kane's equations of motion. Inertial parameters, in the form of anthropometry, can be identified and measured by applying standard human dimensions to subjects in a hemiplegic condition. The inertial parameters are the location of the centre of mass (COM) along the length of the limb segment, the moment of inertia around the COM, and the masses of the shank and foot, which are needed to generate accurate equations of motion. This investigation also identified several advantages of employing Winter's anthropometry table in movement biomechanics together with Kane's equations of motion. A general procedure is presented to yield accurate estimates of the inertial parameters of the knee joint for subjects with a history of stroke.
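
    For context, segment inertial parameters of this kind are typically read off an anthropometric table. A sketch using shank fractions commonly quoted from Winter's table follows; the fractions should be verified against the original table before use:

```python
def shank_inertial_params(body_mass_kg, shank_length_m):
    """Shank inertial parameters from fractions commonly quoted from
    Winter's anthropometric table (verify against the original source):
    segment mass = 0.0465*M, COM at 0.433*L from the knee, radius of
    gyration about the COM = 0.302*L."""
    m = 0.0465 * body_mass_kg
    com = 0.433 * shank_length_m             # distance of COM from the knee
    i_com = m * (0.302 * shank_length_m) ** 2
    i_knee = i_com + m * com ** 2            # parallel-axis theorem
    return m, com, i_com, i_knee
```

    These four quantities (mass, COM location, and moments of inertia about the COM and the joint) are exactly the inputs needed by equation-of-motion formulations such as Kane's.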

  4. Rotary ATPases: models, machine elements and technical specifications.

    Science.gov (United States)

    Stewart, Alastair G; Sobti, Meghna; Harvey, Richard P; Stock, Daniela

    2013-01-01

    Rotary ATPases are molecular rotary motors involved in biological energy conversion. They either synthesize or hydrolyze the universal biological energy carrier adenosine triphosphate. Recent work has elucidated the general architecture and subunit compositions of all three sub-types of rotary ATPases. Composite models of the intact F-, V- and A-type ATPases have been constructed by fitting high-resolution X-ray structures of individual subunits or sub-complexes into low-resolution electron densities of the intact enzymes derived from electron cryo-microscopy. Electron cryo-tomography has provided new insights into the supra-molecular arrangement of eukaryotic ATP synthases within mitochondria and mass-spectrometry has started to identify specifically bound lipids presumed to be essential for function. Taken together these molecular snapshots show that nano-scale rotary engines have much in common with basic design principles of man made machines from the function of individual "machine elements" to the requirement of the right "fuel" and "oil" for different types of motors.

  5. Nonlinear Modeling of the PEMFC Based On NNARX Approach

    OpenAIRE

    Shan-Jen Cheng; Te-Jen Chang; Kuang-Hsiung Tan; Shou-Ling Kuo

    2015-01-01

    The Polymer Electrolyte Membrane Fuel Cell (PEMFC) is a time-varying nonlinear dynamic system. Traditional linear modeling approaches struggle to estimate the structure of a PEMFC system correctly. For this reason, this paper presents nonlinear modeling of the PEMFC using the Neural Network Auto-regressive model with eXogenous inputs (NNARX) approach. A multilayer perceptron (MLP) network is applied to evaluate the structure of the NNARX model of the PEMFC. The validity and accuracy...
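
    As a rough illustration of the NNARX idea, lagged outputs and inputs fed as regressors to a small MLP, the following NumPy sketch uses an architecture and training loop of our own choosing, not the paper's, and synthetic data in place of PEMFC measurements:

```python
import numpy as np

def make_narx_regressors(u, y, na=2, nb=2):
    """Rows are [y_{k-1},...,y_{k-na}, u_{k-1},...,u_{k-nb}], target y_k."""
    start = max(na, nb)
    X = np.array([np.concatenate([y[k - na:k][::-1], u[k - nb:k][::-1]])
                  for k in range(start, len(y))])
    return X, y[start:]

def train_narx_mlp(X, t, hidden=8, lr=0.05, epochs=5000, seed=0):
    """One-hidden-layer tanh MLP trained by full-batch gradient descent."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0.0, 0.5, (X.shape[1], hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 0.5, hidden); b2 = 0.0
    n = len(t)
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)
        err = H @ W2 + b2 - t
        dH = np.outer(err, W2) * (1.0 - H ** 2)   # backprop through tanh
        W2 -= lr * (H.T @ err) / n; b2 -= lr * err.mean()
        W1 -= lr * (X.T @ dH) / n;  b1 -= lr * dH.mean(axis=0)
    return lambda Z: np.tanh(Z @ W1 + b1) @ W2 + b2
```

    The regressor construction is the "ARX" part; replacing the usual linear map from regressors to output with an MLP is what makes the model NNARX.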

  6. DIVERSE APPROACHES TO MODELLING THE ASSIMILATIVE ...

    African Journals Online (AJOL)

    This study evaluated the assimilative capacity of the Ikpoba River using different approaches, namely: homogeneous differential equations, ANOVA/Duncan multiple range tests, first- and second-order differential equations, correlation analysis, eigenvalues and eigenvectors, multiple linear regression, bootstrapping and far-field ...

  7. Comparison of two novel approaches to model fibre reinforced concrete

    NARCIS (Netherlands)

    Radtke, F.K.F.; Simone, A.; Sluys, L.J.

    2009-01-01

    We present two approaches to model fibre reinforced concrete. In both approaches, discrete fibre distributions and the behaviour of the fibre-matrix interface are explicitly considered. One approach employs the reaction forces from fibre to matrix while the other is based on the partition of unity

  8. Modeling Approaches for Describing Microbial Population Heterogeneity

    DEFF Research Database (Denmark)

    Lencastre Fernandes, Rita

    in a computational fluid dynamics (CFD) model. The anaerobic growth of a budding yeast population in a continuously run microbioreactor was used as an example. The proposed integrated model describes the fluid flow, the local cell size and cell cycle position distributions, as well as the local concentrations of glucose...

  9. A simplified approach to feedwater train modeling

    International Nuclear Information System (INIS)

    Ollat, X.; Smoak, R.A.

    1990-01-01

    This paper presents a method to simplify feedwater train models for power plants. A simple set of algebraic equations, based on mass and energy balances, is developed to replace complex representations of the components under certain assumptions. The method was tested and used to model the low pressure heaters of the Sequoyah Nuclear Plant in a larger simulation
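
    A minimal example of the kind of algebraic mass/energy balance described, a generic closed feedwater heater relation with hypothetical stream values, not the Sequoyah model itself:

```python
def heater_outlet_enthalpy(m_feed, h_in, m_extract, h_extract, h_drain):
    """Steady-state energy balance for a closed feedwater heater
    (generic textbook form): extraction steam condenses from h_extract
    to h_drain, and the released heat raises the feedwater enthalpy.
    Units: kg/s and kJ/kg; all stream values here are hypothetical."""
    q = m_extract * (h_extract - h_drain)  # heat given up by the steam, kW
    return h_in + q / m_feed               # feedwater outlet enthalpy, kJ/kg
```

    Chaining one such balance per heater replaces a detailed component model with a handful of algebraic equations, which is the simplification the paper exploits.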

  10. The female gametophyte: an emerging model for cell type-specific systems biology in plant development

    Directory of Open Access Journals (Sweden)

    Marc William Schmid

    2015-11-01

    Full Text Available Systems biology, a holistic approach describing a system emerging from the interactions of its molecular components, critically depends on accurate qualitative determination and quantitative measurements of these components. The development and improvement of large-scale profiling methods (omics) now facilitate comprehensive measurements of many relevant molecules. For multicellular organisms, such as animals, fungi, algae, and plants, the complexity of the system is augmented by the presence of specialized cell types and organs, and a complex interplay within and between them. Cell type-specific analyses are therefore crucial for the understanding of developmental processes and environmental responses. This review first gives an overview of current methods used for large-scale profiling of specific cell types, exemplified by recent advances in plant biology. The focus then lies on suitable model systems to study plant development and cell type specification. We introduce the female gametophyte of flowering plants as an ideal model to study fundamental developmental processes. Moreover, the female reproductive lineage is of importance for the emergence of evolutionary novelties such as an unequal parental contribution to the tissue nurturing the embryo or the clonal production of seeds by asexual reproduction (apomixis). Understanding these processes is not only interesting from a developmental or evolutionary perspective, but bears great potential for further crop improvement and the simplification of breeding efforts. We finally highlight novel methods, which are already available or which will likely soon facilitate large-scale profiling of the specific cell types of the female gametophyte in both model and non-model species. We conclude that it may take only a few years until an evolutionary systems biology approach toward female gametogenesis may decipher some of its biologically most interesting and economically most valuable processes.

  11. New Approaches in Reusable Booster System Life Cycle Cost Modeling

    Science.gov (United States)

    Zapata, Edgar

    2013-01-01

    This paper presents the results of a 2012 life cycle cost (LCC) study of hybrid Reusable Booster Systems (RBS) conducted by NASA Kennedy Space Center (KSC) and the Air Force Research Laboratory (AFRL). The work included the creation of a new cost estimating model and an LCC analysis, building on past work where applicable, but emphasizing the integration of new approaches in life cycle cost estimation. Specifically, the inclusion of industry processes/practices and indirect costs was a new and significant part of the analysis. The focus of LCC estimation has traditionally been from the perspective of technology, design characteristics, and related factors such as reliability. Technology has informed the cost related support to decision makers interested in risk and budget insight. This traditional emphasis on technology occurs even though it is well established that complex aerospace systems costs are mostly about indirect costs, with likely only partial influence in these indirect costs being due to the more visible technology products. Organizational considerations, processes/practices, and indirect costs are traditionally derived ("wrapped") only by relationship to tangible product characteristics. This traditional approach works well as long as it is understood that no significant changes, and by relation no significant improvements, are being pursued in the area of either the government acquisition or industry's indirect costs. In this sense then, most launch systems cost models ignore most costs. The alternative was implemented in this LCC study, whereby the approach considered technology and process/practices in balance, with as much detail for one as the other. This RBS LCC study has avoided point-designs, for now, instead emphasizing exploring the trade-space of potential technology advances joined with potential process/practice advances. Given the range of decisions, and all their combinations, it was necessary to create a model of the original model

  13. Vaccination with lipid core peptides fails to induce epitope-specific T cell responses but confers non-specific protective immunity in a malaria model.

    Directory of Open Access Journals (Sweden)

    Simon H Apte

    Full Text Available Vaccines against many pathogens for which conventional approaches have failed remain an unmet public health priority. Synthetic peptide-based vaccines offer an attractive alternative to whole protein and whole organism vaccines, particularly for complex pathogens that cause chronic infection. Previously, we have reported a promising lipid core peptide (LCP) vaccine delivery system that incorporates the antigen, carrier, and adjuvant in a single molecular entity. LCP vaccines have been used to deliver several peptide subunit-based vaccine candidates and induced high titre functional antibodies and protected against Group A streptococcus in mice. Herein, we have evaluated whether LCP constructs incorporating defined CD4(+) and/or CD8(+) T cell epitopes could induce epitope-specific T cell responses and protect against pathogen challenge in a rodent malaria model. We show that LCP vaccines failed to induce an expansion of antigen-specific CD8(+) T cells following primary immunization or by boosting. We further demonstrated that the LCP vaccines induced a non-specific type 2 polarized cytokine response, rather than an epitope-specific canonical CD8(+) T cell type 1 response. Cytotoxic responses of unknown specificity were also induced. These non-specific responses were able to protect against parasite challenge. These data demonstrate that vaccination with lipid core peptides fails to induce canonical epitope-specific T cell responses, at least in our rodent model, but can nonetheless confer non-specific protective immunity against Plasmodium parasite challenge.

  14. Vaccination with lipid core peptides fails to induce epitope-specific T cell responses but confers non-specific protective immunity in a malaria model.

    Science.gov (United States)

    Apte, Simon H; Groves, Penny L; Skwarczynski, Mariusz; Fujita, Yoshio; Chang, Chenghung; Toth, Istvan; Doolan, Denise L

    2012-01-01

    Vaccines against many pathogens for which conventional approaches have failed remain an unmet public health priority. Synthetic peptide-based vaccines offer an attractive alternative to whole protein and whole organism vaccines, particularly for complex pathogens that cause chronic infection. Previously, we have reported a promising lipid core peptide (LCP) vaccine delivery system that incorporates the antigen, carrier, and adjuvant in a single molecular entity. LCP vaccines have been used to deliver several peptide subunit-based vaccine candidates and induced high titre functional antibodies and protected against Group A streptococcus in mice. Herein, we have evaluated whether LCP constructs incorporating defined CD4(+) and/or CD8(+) T cell epitopes could induce epitope-specific T cell responses and protect against pathogen challenge in a rodent malaria model. We show that LCP vaccines failed to induce an expansion of antigen-specific CD8(+) T cells following primary immunization or by boosting. We further demonstrated that the LCP vaccines induced a non-specific type 2 polarized cytokine response, rather than an epitope-specific canonical CD8(+) T cell type 1 response. Cytotoxic responses of unknown specificity were also induced. These non-specific responses were able to protect against parasite challenge. These data demonstrate that vaccination with lipid core peptides fails to induce canonical epitope-specific T cell responses, at least in our rodent model, but can nonetheless confer non-specific protective immunity against Plasmodium parasite challenge.

  15. Position-specific isotope modeling of organic micropollutants transformations through different reaction pathways

    Science.gov (United States)

    Jin, Biao; Rolle, Massimo

    2016-04-01

    Organic compounds are produced in vast quantities for industrial and agricultural use, as well as for human and animal healthcare [1]. These chemicals and their metabolites are frequently detected at trace levels in fresh water environments, where they undergo degradation via different reaction pathways. Compound-specific stable isotope analysis (CSIA) is a valuable tool to identify such degradation pathways in different environmental systems. Recent advances in analytical techniques have promoted the fast development and implementation of multi-element CSIA. However, quantitative frameworks to evaluate multi-element stable isotope data and to incorporate mechanistic information on the degradation processes [2,3] are still lacking. In this study we propose a mechanism-based modeling approach to simultaneously evaluate concentration as well as bulk and position-specific multi-element isotope evolution during the transformation of organic micropollutants. The model explicitly simulates position-specific isotopologues for those atoms that experience isotope effects and, thereby, provides a mechanistic description of isotope fractionation occurring at different molecular positions. We validate the proposed approach with the concentration and multi-element isotope data of three selected organic micropollutants: dichlorobenzamide (BAM), isoproturon (IPU) and diclofenac (DCF). The model precisely captures the dual-element isotope trends characteristic of different reaction pathways and their range of variation, consistent with the observed multi-element (C, N) bulk isotope fractionation. The proposed approach can also be used as a tool to explore transformation pathways in scenarios for which position-specific isotope data are not yet available. [1] Schwarzenbach, R.P., Egli, T., Hofstetter, T.B., von Gunten, U., Wehrli, B., 2010. Global Water Pollution and Human Health. Annu. Rev. Environ. Resour. doi:10.1146/annurev-environ-100809-125342.
[2] Jin, B., Haderlein, S.B., Rolle, M
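
    For orientation, the standard bulk Rayleigh relations that underlie CSIA data evaluation can be written in a few lines; this is a generic illustration, not the authors' position-specific isotopologue model:

```python
def rayleigh_delta(delta0_permil, epsilon_permil, f_remaining):
    """Exact Rayleigh evolution of a bulk isotope delta value in the
    residual substrate: R/R0 = f**(alpha - 1), with alpha = 1 + eps/1000."""
    r0 = 1.0 + delta0_permil / 1000.0
    r = r0 * f_remaining ** (epsilon_permil / 1000.0)
    return (r - 1.0) * 1000.0

def dual_isotope_slope(eps_n, eps_c):
    """Slope (Lambda ~ eps_N / eps_C) of the dual-element plot commonly
    used to distinguish reaction pathways (linearized approximation)."""
    return eps_n / eps_c
```

    Different reaction pathways produce different enrichment factors per element, so the dual-element slope separates pathways even when single-element data are ambiguous.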

  16. Methodologies for Development of Patient Specific Bone Models from Human Body CT Scans

    Science.gov (United States)

    Chougule, Vikas Narayan; Mulay, Arati Vinayak; Ahuja, Bharatkumar Bhagatraj

    2016-06-01

    This work deals with the development of an algorithm for the physical replication of patient-specific human bones and the construction of corresponding implant/insert RP models by using a Reverse Engineering approach from non-invasive medical images for surgical purposes. In the medical field, volumetric data, i.e. voxel- and triangular-facet-based models, are primarily used for bio-modelling and visualization, which requires huge memory space. On the other hand, recent advances in Computer Aided Design (CAD) technology provide additional facilities/functions for the design, prototyping and manufacturing of any object having freeform surfaces, based on boundary representation techniques. This work presents a process for the physical replication of 3D rapid prototyping (RP) models of human bone from various CAD modelling techniques, developed by using 3D point cloud data obtained from non-invasive CT/MRI scans in DICOM 3.0 format. This point cloud data is used for the construction of a 3D CAD model by fitting B-spline curves through these points and then fitting surfaces between these curve networks by using swept blend techniques. The same result can also be achieved by generating a triangular mesh directly from the 3D point cloud data, without developing any surface model, using any commercial CAD software. The STL file generated from the 3D point cloud data is used as the basic input for the RP process. The Delaunay tetrahedralization approach is used to process the 3D point cloud data to obtain the STL file. CT scan data of a metacarpus (human bone) is used as the case study for the generation of the 3D RP model. A 3D physical model of the human bone is generated on a rapid prototyping machine and its virtual reality model is presented for visualization. The CAD models generated by the different techniques are compared for accuracy and reliability. The results of this research work are assessed for clinical reliability in the replication of human bones in the medical field.
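
    The Delaunay-based surface extraction step can be sketched as a generic boundary-face filter (a face on the surface belongs to exactly one tetrahedron); this is an illustration of the idea, not the authors' implementation:

```python
import numpy as np
from scipy.spatial import Delaunay
from collections import Counter

def surface_triangles(points):
    """Tetrahedralize a 3D point cloud and keep the boundary faces:
    triangular faces shared by two tetrahedra are interior, while faces
    belonging to exactly one tetrahedron lie on the surface."""
    tet = Delaunay(points)
    faces = Counter()
    for simplex in tet.simplices:
        for idx in ((0, 1, 2), (0, 1, 3), (0, 2, 3), (1, 2, 3)):
            faces[tuple(sorted(int(simplex[i]) for i in idx))] += 1
    return [f for f, n in faces.items() if n == 1]
```

    The resulting triangle list maps directly to STL facets, which is the hand-off point to the RP process described in the abstract.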

  17. A Bayesian Approach to Inferring Rates of Selfing and Locus-Specific Mutation.

    Science.gov (United States)

    Redelings, Benjamin D; Kumagai, Seiji; Tatarenkov, Andrey; Wang, Liuyang; Sakai, Ann K; Weller, Stephen G; Culley, Theresa M; Avise, John C; Uyenoyama, Marcy K

    2015-11-01

    We present a Bayesian method for characterizing the mating system of populations reproducing through a mixture of self-fertilization and random outcrossing. Our method uses patterns of genetic variation across the genome as a basis for inference about reproduction under pure hermaphroditism, gynodioecy, and a model developed to describe the self-fertilizing killifish Kryptolebias marmoratus. We extend the standard coalescence model to accommodate these mating systems, accounting explicitly for multilocus identity disequilibrium, inbreeding depression, and variation in fertility among mating types. We incorporate the Ewens sampling formula (ESF) under the infinite-alleles model of mutation to obtain a novel expression for the likelihood of mating system parameters. Our Markov chain Monte Carlo (MCMC) algorithm assigns locus-specific mutation rates, drawn from a common mutation rate distribution that is itself estimated from the data using a Dirichlet process prior model. Our sampler is designed to accommodate additional information, including observations pertaining to the sex ratio, the intensity of inbreeding depression, and other aspects of reproduction. It can provide joint posterior distributions for the population-wide proportion of uniparental individuals, locus-specific mutation rates, and the number of generations since the most recent outcrossing event for each sampled individual. Further, estimation of all basic parameters of a given model permits estimation of functions of those parameters, including the proportion of the gene pool contributed by each sex and relative effective numbers. Copyright © 2015 by the Genetics Society of America.
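
    A toy stand-in for one ingredient of this inference: under mixed mating, the number of consecutive selfing generations behind an individual is geometrically distributed, which gives a simple grid posterior for the selfing rate s under a uniform prior. The paper's MCMC over the full model (ESF likelihood, locus-specific mutation rates, mating-type fertilities) is far richer:

```python
import numpy as np

def selfing_posterior(gens_since_outcross, grid=None):
    """Grid posterior for the selfing rate s with a uniform prior:
    the generations T since the last outcrossing event are geometric,
    P(T = t) = s**t * (1 - s), independently across individuals."""
    s = np.linspace(0.001, 0.999, 999) if grid is None else grid
    loglik = sum(t * np.log(s) + np.log(1.0 - s) for t in gens_since_outcross)
    post = np.exp(loglik - loglik.max())
    return s, post / post.sum()
```

    With a uniform prior the posterior mode coincides with the maximum-likelihood estimate mean/(1 + mean) of the geometric model.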

  18. A novel patient-specific model to compute coronary fractional flow reserve.

    Science.gov (United States)

    Kwon, Soon-Sung; Chung, Eui-Chul; Park, Jin-Seo; Kim, Gook-Tae; Kim, Jun-Woo; Kim, Keun-Hong; Shin, Eun-Seok; Shim, Eun Bo

    2014-09-01

    The fractional flow reserve (FFR) is a widely used clinical index to evaluate the functional severity of coronary stenosis. A computer simulation method based on patients' computed tomography (CT) data is a plausible non-invasive approach for computing the FFR. This method can provide a detailed solution for the stenosed coronary hemodynamics by coupling computational fluid dynamics (CFD) with the lumped parameter model (LPM) of the cardiovascular system. In this work, we have implemented a simple computational method to compute the FFR. As this method uses only coronary arteries for the CFD model and includes only the LPM of the coronary vascular system, it provides simpler boundary conditions for the coronary geometry and is computationally more efficient than existing approaches. To test the efficacy of this method, we simulated a three-dimensional straight vessel using CFD coupled with the LPM. The computed results were compared with those of the LPM. To validate this method in terms of clinically realistic geometry, a patient-specific model of stenosed coronary arteries was constructed from CT images, and the computed FFR was compared with clinically measured results. We evaluated the effect of a model aorta on the computed FFR and compared this with a model without the aorta. Computationally, the model without the aorta was more efficient than that with the aorta, reducing the CPU time required for computing a cardiac cycle to 43.4%. Copyright © 2014. Published by Elsevier Ltd.
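
    The index itself has a simple definition: FFR is the ratio of mean distal coronary pressure to mean aortic pressure under hyperemia. A zero-dimensional resistive sketch, purely illustrative and far simpler than the CFD-LPM coupling above, shows why a tighter stenosis lowers it:

```python
def ffr_series_resistance(r_stenosis, r_microvascular):
    """FFR in a purely resistive series model (illustrative only):
    with venous pressure ~ 0 and hyperemic flow Q = Pa / (Rs + Rm),
    distal pressure is Pd = Q * Rm, so FFR = Pd / Pa = Rm / (Rs + Rm)."""
    return r_microvascular / (r_stenosis + r_microvascular)
```

    Values below roughly 0.80 are commonly treated as functionally significant, which is the clinical threshold the simulated FFR is compared against.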

  19. The workshop on ecosystems modelling approaches for South ...

    African Journals Online (AJOL)

    roles played by models in the OMP approach, and raises questions about the costs of the data collection. (in particular) needed to apply a multispecies modelling approach in South African fisheries management. It then summarizes the deliberations of workshops held by the Scientific Committees of two international ma-.

  20. A simple approach to modeling ductile failure.

    Energy Technology Data Exchange (ETDEWEB)

    Wellman, Gerald William

    2012-06-01

    Sandia National Laboratories has the need to predict the behavior of structures after the occurrence of an initial failure. In some cases determining the extent of failure, beyond initiation, is required, while in a few cases the initial failure is a design feature used to tailor the subsequent load paths. In either case, the ability to numerically simulate the initiation and propagation of failures is a highly desired capability. This document describes one approach to the simulation of failure initiation and propagation.

  1. Advanced language modeling approaches, case study: Expert search

    NARCIS (Netherlands)

    Hiemstra, Djoerd

    2008-01-01

    This tutorial gives a clear and detailed overview of advanced language modeling approaches and tools, including the use of document priors, translation models, relevance models, parsimonious models and expectation maximization training. Expert search will be used as a case study to explain the

  2. An interdisciplinary approach to modeling tritium transfer into the environment

    International Nuclear Information System (INIS)

    Galeriu, D; Melintescu, A.

    2005-01-01

equations between soil and plants. Considering mammals, we recently showed that the simplistic models currently applied did not accurately match experimental data from rats and sheep. Specific data for many farm and wild animals are scarce. In this paper, we are advancing a different approach based on energy metabolism, which can be parameterized predominantly based on published metabolic data for mature mammals. We started with the observation that the measured dynamics of 14C and non-exchangeable organically bound tritium (OBT) were, not surprisingly, similar. We therefore introduced a metabolic definition for the 14C and OBT loss rate (assumed to be the same) from the whole body and specific organs. We assumed that this was given by the specific metabolic rate of the whole body or organ, divided by the enthalpy of combustion of a kilogram of fresh matter. Since basal metabolism data were taken from the literature, they were modified for energy expenditure above basal need. To keep the model simple, organs were grouped according to their metabolic activity or importance in the food chain. Pools considered were viscera (high metabolic rate organs except the brain), muscle, adipose tissue, blood, and other (all other tissues). We disregarded any detail on substrate utilization from the dietary intake and condensed the postprandial respiration into a single rate. We included considerations of net maintenance and growth needs. For tritium, the transfer between body water and organic compartments was modeled using knowledge of basic metabolism and published relations. We considered the potential influence of rumen digestion and bacterial protein in ruminants. As for model application, we focused on laboratory and farm animals, where some experimental data were available. The model performed well for rat muscle, viscera and adipose tissue, but due to the simplicity of model structure and assumptions, blood and urine data were only satisfactorily reproduced.
Whilst for sheep fed

  3. Using hedonic property models to value public water bodies: An analysis of specification issues

    Science.gov (United States)

    Muller, Nicholas Z.

    2009-01-01

The hedonic literature has established that public water bodies provide external benefits that are reflected in the value of nearby residential real estate. The literature has employed several approaches to quantify these nonmarket services. With a residential hedonic model, this paper tests whether model specification affects resource valuation using an actively managed reservoir in Indiana and a passively managed lake in Connecticut. The results indicate that valuation is quite sensitive to model specification and that omitting either the waterview or waterfront variables from the hedonic function likely results in a misspecified model. The findings from this study are important for researchers and public agencies charged with managing water resources to bear in mind as the external benefits from existing or proposed man-made lakes and reservoirs are estimated. Therefore, while it requires considerably more effort to determine which properties are in waterfront locations and which properties have a view, the potential misspecification of "distance-only" models likely justifies these extra research costs. Further, the findings in this analysis call into question results from distance-only models in the literature.
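The specification issue can be made concrete with a toy hedonic regression. In the sketch below, the data-generating process, variable names, and coefficient values are all invented for illustration; it fits a log-price equation with distance, waterfront, and view terms, the kind of specification the paper compares against distance-only models.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500
log_dist = rng.uniform(0.0, 3.0, n)          # log distance to the water body
waterfront = (log_dist < 0.3).astype(float)  # hypothetical waterfront dummy
view = (log_dist < 1.0).astype(float)        # hypothetical view dummy
# Assumed data-generating process, purely for illustration.
log_price = (12.0 - 0.10 * log_dist + 0.40 * waterfront
             + 0.15 * view + rng.normal(0.0, 0.05, n))

# Full specification: intercept, distance, waterfront, view.
X = np.column_stack([np.ones(n), log_dist, waterfront, view])
beta, *_ = np.linalg.lstsq(X, log_price, rcond=None)
# beta[2] and beta[3] are the implicit price premia for waterfront and view.
```

Omitting the waterfront and view dummies would force their capitalized value into the distance coefficient, which is precisely the misspecification the paper warns about.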

  4. Chemotaxis: A Multi-Scale Modeling Approach

    Science.gov (United States)

    Bhowmik, Arpan

We are attempting to build a working simulation of population-level self-organization in Dictyostelium discoideum cells by combining existing models for chemo-attractant production and detection, along with phenomenological motility models. Our goal is to create a computationally viable model framework within which a population of cells can self-generate chemo-attractant waves and self-organize based on the directional cues of those waves. The work is a direct continuation of our previous work published in Physical Biology titled ``Excitable waves and direction-sensing in Dictyostelium Discoideum: steps towards a chemotaxis model''. This is a work in progress; no official draft/paper exists yet.

  5. An Integrated Approach to Modeling Evacuation Behavior

    Science.gov (United States)

    2011-02-01

    A spate of recent hurricanes and other natural disasters have drawn a lot of attention to the evacuation decision of individuals. Here we focus on evacuation models that incorporate two economic phenomena that seem to be increasingly important in exp...

  6. Infectious disease modeling a hybrid system approach

    CERN Document Server

    Liu, Xinzhi

    2017-01-01

    This volume presents infectious diseases modeled mathematically, taking seasonality and changes in population behavior into account, using a switched and hybrid systems framework. The scope of coverage includes background on mathematical epidemiology, including classical formulations and results; a motivation for seasonal effects and changes in population behavior, an investigation into term-time forced epidemic models with switching parameters, and a detailed account of several different control strategies. The main goal is to study these models theoretically and to establish conditions under which eradication or persistence of the disease is guaranteed. In doing so, the long-term behavior of the models is determined through mathematical techniques from switched systems theory. Numerical simulations are also given to augment and illustrate the theoretical results and to help study the efficacy of the control schemes.
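The switched-systems idea can be illustrated with a normalized SIR model whose contact rate cycles through seasonal values (all parameter values below are illustrative assumptions):

```python
def switched_sir(betas, period, gamma, s0, i0, dt, steps):
    """Normalized SIR model with a piecewise-constant (switched)
    contact rate: beta(t) cycles through `betas`, switching every
    `period` time units. Forward-Euler integration of
    S' = -beta S I,  I' = beta S I - gamma I."""
    s, i, t = s0, i0, 0.0
    for _ in range(steps):
        beta = betas[int(t // period) % len(betas)]
        new_inf = beta * s * i
        s -= dt * new_inf
        i += dt * (new_inf - gamma * i)
        t += dt
    return s, i
```

When every seasonal reproduction number beta_k/gamma is below one, the infection decays regardless of the switching sequence, the flavor of eradication condition the book establishes through switched-systems theory.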

  7. Modelling energy demand of developing countries: Are the specific features adequately captured?

    International Nuclear Information System (INIS)

    Bhattacharyya, Subhes C.; Timilsina, Govinda R.

    2010-01-01

    This paper critically reviews existing energy demand forecasting methodologies highlighting the methodological diversities and developments over the past four decades in order to investigate whether the existing energy demand models are appropriate for capturing the specific features of developing countries. The study finds that two types of approaches, econometric and end-use accounting, are commonly used in the existing energy demand models. Although energy demand models have greatly evolved since the early seventies, key issues such as the poor-rich and urban-rural divides, traditional energy resources and differentiation between commercial and non-commercial energy commodities are often poorly reflected in these models. While the end-use energy accounting models with detailed sectoral representations produce more realistic projections as compared to the econometric models, they still suffer from huge data deficiencies especially in developing countries. Development and maintenance of more detailed energy databases, further development of models to better reflect developing country context and institutionalizing the modelling capacity in developing countries are the key requirements for energy demand modelling to deliver richer and more reliable input to policy formulation in developing countries.

  8. Reconstruction of Arabidopsis metabolic network models accounting for subcellular compartmentalization and tissue-specificity.

    Science.gov (United States)

    Mintz-Oron, Shira; Meir, Sagit; Malitsky, Sergey; Ruppin, Eytan; Aharoni, Asaph; Shlomi, Tomer

    2012-01-03

    Plant metabolic engineering is commonly used in the production of functional foods and quality trait improvement. However, to date, computational model-based approaches have only been scarcely used in this important endeavor, in marked contrast to their prominent success in microbial metabolic engineering. In this study we present a computational pipeline for the reconstruction of fully compartmentalized tissue-specific models of Arabidopsis thaliana on a genome scale. This reconstruction involves automatic extraction of known biochemical reactions in Arabidopsis for both primary and secondary metabolism, automatic gap-filling, and the implementation of methods for determining subcellular localization and tissue assignment of enzymes. The reconstructed tissue models are amenable for constraint-based modeling analysis, and significantly extend upon previous model reconstructions. A set of computational validations (i.e., cross-validation tests, simulations of known metabolic functionalities) and experimental validations (comparison with experimental metabolomics datasets under various compartments and tissues) strongly testify to the predictive ability of the models. The utility of the derived models was demonstrated in the prediction of measured fluxes in metabolically engineered seed strains and the design of genetic manipulations that are expected to increase vitamin E content, a significant nutrient for human health. Overall, the reconstructed tissue models are expected to lay down the foundations for computational-based rational design of plant metabolic engineering. The reconstructed compartmentalized Arabidopsis tissue models are MIRIAM-compliant and are available upon request.
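Constraint-based analysis of such reconstructions usually means flux balance analysis: maximize an objective flux subject to steady-state mass balance S v = 0 and flux bounds. A toy three-reaction sketch (the network, bounds, and objective are invented for illustration and are unrelated to the actual Arabidopsis reconstruction):

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: uptake (v1): -> A, conversion (v2): A -> B,
# biomass drain (v3): B -> . Internal metabolites: A, B.
S = np.array([[1.0, -1.0, 0.0],    # mass balance for A
              [0.0, 1.0, -1.0]])   # mass balance for B
bounds = [(0.0, 10.0), (0.0, None), (0.0, None)]  # uptake capped at 10
c = [0.0, 0.0, -1.0]  # linprog minimizes, so negate to maximize v3

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
# res.x is the optimal flux distribution; res.x[2] is the maximal biomass flux.
```

Here the optimum simply routes the capped uptake flux straight through to the biomass drain; genome-scale models do the same thing with thousands of reactions and compartment-specific constraints.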

  9. Quantification of Cooperativity in Heterodimer-DNA Binding Improves the Accuracy of Binding Specificity Models*

    Science.gov (United States)

    Isakova, Alina; Berset, Yves; Hatzimanikatis, Vassily; Deplancke, Bart

    2016-01-01

    Many transcription factors (TFs) have the ability to cooperate on DNA elements as heterodimers. Despite the significance of TF heterodimerization for gene regulation, a quantitative understanding of cooperativity between various TF dimer partners and its impact on heterodimer DNA binding specificity models is still lacking. Here, we used a novel integrative approach, combining microfluidics-steered measurements of dimer-DNA assembly with mechanistic modeling of the implicated protein-protein-DNA interactions to quantitatively interrogate the cooperative DNA binding behavior of the adipogenic peroxisome proliferator-activated receptor γ (PPARγ):retinoid X receptor α (RXRα) heterodimer. Using the high throughput MITOMI (mechanically induced trapping of molecular interactions) platform, we derived equilibrium DNA binding data for PPARγ, RXRα, as well as the PPARγ:RXRα heterodimer to more than 300 target DNA sites and variants thereof. We then quantified cooperativity underlying heterodimer-DNA binding and derived an integrative heterodimer DNA binding constant. Using this cooperativity-inclusive constant, we were able to build a heterodimer-DNA binding specificity model that has superior predictive power than the one based on a regular one-site equilibrium. Our data further revealed that individual nucleotide substitutions within the target site affect the extent of cooperativity in PPARγ:RXRα-DNA binding. Our study therefore emphasizes the importance of assessing cooperativity when generating DNA binding specificity models for heterodimers. PMID:26912662
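A cooperativity-inclusive binding constant of this kind can be written as a simple statistical-mechanical model of a composite site. In the sketch below (illustrative, not the authors' MITOMI-fitted model), omega scales the Boltzmann weight of the doubly bound state, and omega = 1 recovers independent binding:

```python
def heterodimer_occupancy(a, b, Ka, Kb, omega):
    """Probability that both partners (free concentrations a, b;
    association constants Ka, Kb) occupy a composite DNA element.
    omega > 1: cooperative co-binding; omega = 1: independent sites."""
    w_both = omega * Ka * a * Kb * b
    z = 1.0 + Ka * a + Kb * b + w_both  # partition function over 4 states
    return w_both / z
```

At omega = 1 the expression factorizes into the product of the two single-site occupancies, which provides an easy consistency check; increasing omega monotonically increases co-occupancy, the behavior quantified for PPARγ:RXRα.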

  10. Quantification of Cooperativity in Heterodimer-DNA Binding Improves the Accuracy of Binding Specificity Models.

    Science.gov (United States)

    Isakova, Alina; Berset, Yves; Hatzimanikatis, Vassily; Deplancke, Bart

    2016-05-06

    Many transcription factors (TFs) have the ability to cooperate on DNA elements as heterodimers. Despite the significance of TF heterodimerization for gene regulation, a quantitative understanding of cooperativity between various TF dimer partners and its impact on heterodimer DNA binding specificity models is still lacking. Here, we used a novel integrative approach, combining microfluidics-steered measurements of dimer-DNA assembly with mechanistic modeling of the implicated protein-protein-DNA interactions to quantitatively interrogate the cooperative DNA binding behavior of the adipogenic peroxisome proliferator-activated receptor γ (PPARγ):retinoid X receptor α (RXRα) heterodimer. Using the high throughput MITOMI (mechanically induced trapping of molecular interactions) platform, we derived equilibrium DNA binding data for PPARγ, RXRα, as well as the PPARγ:RXRα heterodimer to more than 300 target DNA sites and variants thereof. We then quantified cooperativity underlying heterodimer-DNA binding and derived an integrative heterodimer DNA binding constant. Using this cooperativity-inclusive constant, we were able to build a heterodimer-DNA binding specificity model that has superior predictive power than the one based on a regular one-site equilibrium. Our data further revealed that individual nucleotide substitutions within the target site affect the extent of cooperativity in PPARγ:RXRα-DNA binding. Our study therefore emphasizes the importance of assessing cooperativity when generating DNA binding specificity models for heterodimers. © 2016 by The American Society for Biochemistry and Molecular Biology, Inc.

  11. The technique of 3D printing patient-specific models for auricular reconstruction.

    Science.gov (United States)

    Flores, Roberto L; Liss, Hannah; Raffaelli, Samuel; Humayun, Aiza; Khouri, Kimberly S; Coelho, Paulo G; Witek, Lukasz

    2017-06-01

Currently, surgeons approach autogenous microtia repair by creating a two-dimensional (2D) tracing of the unaffected ear to approximate a three-dimensional (3D) construct, a difficult process. To address these shortcomings, this study introduces the fabrication of a patient-specific, sterilizable 3D printed auricular model for autogenous auricular reconstruction. A high-resolution 3D digital photograph was captured of the patient's unaffected ear and surrounding anatomic structures. The photographs were exported and uploaded into Amira for transformation into a digital (.stl) model, which was imported into Blender, an open source software platform for digital modification of data. The unaffected auricle was digitally isolated and inverted to render a model for the contralateral side. The depths of the scapha, triangular fossa, and cymba were deepened to accentuate their contours. Extra relief was added to the helical root to further distinguish this structure. The ear was then digitally deconstructed and separated into its individual auricular components for reconstruction. The completed ear and its individual components were 3D printed using polylactic acid filament and sterilized following manufacturer specifications. The sterilized models were brought to the operating room to be utilized by the surgeon. The models allowed for more accurate anatomic measurements compared to 2D tracings, which reduced the degree of estimation required by surgeons. Approximately 20 g of PLA filament were utilized for the construction of these models, yielding a total material cost of approximately $1. Using the methodology detailed in this report, as well as departmentally available resources (3D digital photography and 3D printing), a sterilizable, patient-specific, and inexpensive 3D auricular model was fabricated to be used intraoperatively. This technique of printing customized-to-patient models for surgeons to use as 'guides' shows great promise. Copyright © 2017 European

  12. Dispersion modeling approaches for near road ...

    Science.gov (United States)

Roadway design and roadside barriers can have significant effects on the dispersion of traffic-generated pollutants, especially in the near-road environment. Dispersion models that can accurately simulate these effects are needed to fully assess these impacts for a variety of applications. For example, such models can be useful for evaluating the mitigation potential of roadside barriers in reducing near-road exposures and their associated adverse health effects. Two databases, a tracer field study and a wind tunnel study, provide measurements used in the development and/or validation of algorithms to simulate dispersion in the presence of noise barriers. The tracer field study was performed in Idaho Falls, ID, USA with a 6-m noise barrier and a finite line source in a variety of atmospheric conditions. The second study was performed in the meteorological wind tunnel at the US EPA and simulated line sources at different distances from a model noise barrier to capture the effect on emissions from individual lanes of traffic. In both cases, velocity and concentration measurements characterized the effect of the barrier on dispersion. This paper presents comparisons with the two datasets of the barrier algorithms implemented in two different dispersion models: US EPA’s R-LINE (a research dispersion modelling tool under development by the US EPA’s Office of Research and Development) and CERC’s ADMS model (ADMS-Urban). In R-LINE the physical features reveal
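The baseline physics in line-source models such as R-LINE and ADMS is Gaussian dispersion with ground reflection; the barrier algorithms compared in the paper modify this picture. A minimal point-source sketch (flat terrain, no barrier, all inputs illustrative):

```python
import math

def gaussian_plume(q, u, y, z, sigma_y, sigma_z, h):
    """Steady-state Gaussian plume concentration for a point source of
    strength q at release height h, wind speed u, crosswind offset y,
    receptor height z, with an image source reflecting at the ground."""
    lateral = math.exp(-y ** 2 / (2 * sigma_y ** 2))
    vertical = (math.exp(-(z - h) ** 2 / (2 * sigma_z ** 2))
                + math.exp(-(z + h) ** 2 / (2 * sigma_z ** 2)))
    return q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical
```

The image-source term doubles the ground-level concentration for a ground-level release, a standard check of the reflection treatment.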

  13. A Sampling Based Approach to Spacecraft Autonomous Maneuvering with Safety Specifications

    Science.gov (United States)

    Starek, Joseph A.; Barbee, Brent W.; Pavone, Marco

    2015-01-01

This paper presents a method for safe spacecraft autonomous maneuvering that applies robotic motion-planning techniques to spacecraft control. Specifically, the scenario we consider is an in-plane rendezvous of a chaser spacecraft in proximity to a target spacecraft at the origin of the Clohessy-Wiltshire-Hill frame. The trajectory for the chaser spacecraft is generated in a receding-horizon fashion by executing a sampling-based robotic motion-planning algorithm named Fast Marching Trees (FMT), which efficiently grows a tree of trajectories over a set of probabilistically drawn samples in the state space. To enforce safety, the tree is only grown over actively safe samples, for which there exists a one-burn collision-avoidance maneuver that circularizes the spacecraft orbit along a collision-free coasting arc and that can be executed under potential thruster failures. The overall approach establishes a provably correct framework for the systematic encoding of safety specifications into the spacecraft trajectory generation process, and appears amenable to real-time implementation on orbit. Simulation results are presented for a two-fault-tolerant spacecraft during autonomous approach to a single client in Low Earth Orbit.
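The passive-safety idea rests on Clohessy-Wiltshire relative dynamics, whose in-plane solution is closed form. A sketch (illustrative, not the paper's FMT implementation): propagate the chaser state for target mean motion n, and note that choosing the along-track velocity to cancel secular drift yields a bounded relative ellipse, i.e. a safe coasting arc.

```python
import math

def cw_state(t, n, x0, y0, xd0, yd0):
    """In-plane Clohessy-Wiltshire solution: radial x, along-track y,
    relative to a target in circular orbit with mean motion n."""
    s, c = math.sin(n * t), math.cos(n * t)
    x = (4 - 3 * c) * x0 + (s / n) * xd0 + (2 / n) * (1 - c) * yd0
    y = (6 * (s - n * t)) * x0 + y0 - (2 / n) * (1 - c) * xd0 \
        + (1 / n) * (4 * s - 3 * n * t) * yd0
    xd = 3 * n * s * x0 + c * xd0 + 2 * s * yd0
    yd = 6 * n * (c - 1) * x0 - 2 * s * xd0 + (4 * c - 3) * yd0
    return x, y, xd, yd

# Passive safety intuition: choosing yd0 = -2*n*x0 cancels the secular
# along-track drift, leaving a bounded (periodic) relative orbit.
```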

  14. A New Approach to Model Verification, Falsification and Selection

    Directory of Open Access Journals (Sweden)

    Andrew J. Buck

    2015-06-01

    Full Text Available This paper shows that a qualitative analysis, i.e., an assessment of the consistency of a hypothesized sign pattern for structural arrays with the sign pattern of the estimated reduced form, can always provide decisive insight into a model’s validity both in general and compared to other models. Qualitative analysis can show that it is impossible for some models to have generated the data used to estimate the reduced form, even though standard specification tests might show the model to be adequate. A partially specified structural hypothesis can be falsified by estimating as few as one reduced form equation. Zero restrictions in the structure can themselves be falsified. It is further shown how the information content of the hypothesized structural sign patterns can be measured using a commonly applied concept of statistical entropy. The lower the hypothesized structural sign pattern’s entropy, the more a priori information it proposes about the sign pattern of the estimated reduced form. As an hypothesized structural sign pattern has a lower entropy, it is more subject to type 1 error and less subject to type 2 error. Three cases illustrate the approach taken here.

  15. A chain reaction approach to modelling gene pathways.

    Science.gov (United States)

    Cheng, Gary C; Chen, Dung-Tsa; Chen, James J; Soong, Seng-Jaw; Lamartiniere, Coral; Barnes, Stephen

    2012-08-01

    BACKGROUND: Of great interest in cancer prevention is how nutrient components affect gene pathways associated with the physiological events of puberty. Nutrient-gene interactions may cause changes in breast or prostate cells and, therefore, may result in cancer risk later in life. Analysis of gene pathways can lead to insights about nutrient-gene interactions and the development of more effective prevention approaches to reduce cancer risk. To date, researchers have relied heavily upon experimental assays (such as microarray analysis, etc.) to identify genes and their associated pathways that are affected by nutrient and diets. However, the vast number of genes and combinations of gene pathways, coupled with the expense of the experimental analyses, has delayed the progress of gene-pathway research. The development of an analytical approach based on available test data could greatly benefit the evaluation of gene pathways, and thus advance the study of nutrient-gene interactions in cancer prevention. In the present study, we have proposed a chain reaction model to simulate gene pathways, in which the gene expression changes through the pathway are represented by the species undergoing a set of chemical reactions. We have also developed a numerical tool to solve for the species changes due to the chain reactions over time. Through this approach we can examine the impact of nutrient-containing diets on the gene pathway; moreover, transformation of genes over time with a nutrient treatment can be observed numerically, which is very difficult to achieve experimentally. We apply this approach to microarray analysis data from an experiment which involved the effects of three polyphenols (nutrient treatments), epigallo-catechin-3-O-gallate (EGCG), genistein, and resveratrol, in a study of nutrient-gene interaction in the estrogen synthesis pathway during puberty. 
RESULTS: In this preliminary study, the estrogen synthesis pathway was simulated by a chain reaction model. By
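The chain reaction representation, with species concentrations standing in for gene-expression levels, can be sketched as a two-step mass-action chain A -> B -> C (rate constants and the initial level are illustrative assumptions, not values from the study):

```python
def simulate_chain(k1, k2, a0, dt, steps):
    """Mass-action chain A -> B -> C: the 'expression level' a0 of an
    upstream species is converted through an intermediate B into a
    downstream product C. Forward-Euler integration."""
    a, b, c = a0, 0.0, 0.0
    for _ in range(steps):
        r1, r2 = k1 * a, k2 * b
        a -= dt * r1
        b += dt * (r1 - r2)
        c += dt * r2
    return a, b, c
```

Mass is conserved exactly by this update, since each flow leaves one pool and enters another; over long times everything accumulates in the terminal product C, and intermediate transients can be observed at any time step, which is the kind of temporal detail that is hard to obtain experimentally.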

  16. Experimental development based on mapping rule between requirements analysis model and web framework specific design model.

    Science.gov (United States)

    Okuda, Hirotaka; Ogata, Shinpei; Matsuura, Saeko

    2013-12-01

Model Driven Development is a promising approach to developing high-quality software systems. We have proposed a method of model-driven requirements analysis using the Unified Modeling Language (UML). The main feature of our method is to automatically generate a Web user interface prototype from the UML requirements analysis model, so that we can confirm the validity of input/output data for each page, and of page transitions in the system, by directly operating the prototype. We propose a mapping rule in which design information independent of any particular web application framework implementation is defined based on the requirements analysis model, so as to improve traceability from the validated requirements analysis model to the final product. This paper discusses the result of applying our method to the development of a Group Work Support System that is currently running in our department.

  17. Country-specific approaches to integrated population, nutrition and food programmes.

    Science.gov (United States)

    1979-01-01

    The specific objectives of the Regional Seminar on an Integrated Approach to Population, Food and Nutrition Policies and Programs for National Development organized by the Population Division of the Economic and Social Commission for Asia and the Pacific (ESCAP) region were the following: 1) to catalogue, review and exchange information on current population, food and nutrition policies and programs in countries of the region; 2) to formulate appropriate strategies for effectively developing complementary policies in the field of action-oriented integrated programs that will consider available manpower and financial resources; and 3) to suggest measures to ensure the cooperation and coordination necessary for the implementation of such programs. The Seminar prepared a set of guidelines, established goals for an integrated approach, and worked out a strategy to achieve the goals. It was emphasized that the content of an integrated policy should reflect the intersectoral nature of the problem. Systematic planning was identified as necessary for each of the sectoral areas. The need for research workers to broaden their approach was also expressed at the Seminar.

  18. A consortium approach to glass furnace modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Chang, S.-L.; Golchert, B.; Petrick, M.

    1999-04-20

    Using computational fluid dynamics to model a glass furnace is a difficult task for any one glass company, laboratory, or university to accomplish. The task of building a computational model of the furnace requires knowledge and experience in modeling two dissimilar regimes (the combustion space and the liquid glass bath), along with the skill necessary to couple these two regimes. Also, a detailed set of experimental data is needed in order to evaluate the output of the code to ensure that the code is providing proper results. Since all these diverse skills are not present in any one research institution, a consortium was formed between Argonne National Laboratory, Purdue University, Mississippi State University, and five glass companies in order to marshal these skills into one three-year program. The objective of this program is to develop a fully coupled, validated simulation of a glass melting furnace that may be used by industry to optimize the performance of existing furnaces.

  19. Fractal approach to computer-analytical modelling of tree crown

    International Nuclear Information System (INIS)

    Berezovskaya, F.S.; Karev, G.P.; Kisliuk, O.F.; Khlebopros, R.G.; Tcelniker, Yu.L.

    1993-09-01

In this paper we discuss three approaches to the modeling of tree crown development. These approaches are experimental (i.e. regression-based), theoretical (i.e. analytical) and simulation (i.e. computer) modeling. The common assumption of these approaches is that a tree can be regarded as a fractal object, i.e. a collection of self-similar objects that combines the properties of two- and three-dimensional bodies. We show that a fractal measure of the crown can be used as the link between mathematical models of crown growth and light propagation through the canopy. The computer approach makes it possible to visualize crown development and to calibrate the model against experimental data. In the paper the different stages of the above-mentioned approaches are described. Experimental data for spruce, a description of the computer modeling system and a variant of the computer model are presented. (author). 9 refs, 4 figs
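A fractal measure of the kind used here to link crown growth with light propagation can be estimated by box counting. A minimal 2-D sketch (illustrative):

```python
import math

def box_counting_dimension(points, sizes):
    """Estimate the box-counting dimension of a 2-D point set as the
    least-squares slope of log N(eps) against log(1/eps), where N(eps)
    is the number of eps-sized grid boxes containing at least one point."""
    samples = []
    for eps in sizes:
        occupied = {(int(x // eps), int(y // eps)) for x, y in points}
        samples.append((math.log(1.0 / eps), math.log(len(occupied))))
    n = len(samples)
    mu = sum(u for u, _ in samples) / n
    mv = sum(v for _, v in samples) / n
    num = sum((u - mu) * (v - mv) for u, v in samples)
    den = sum((u - mu) ** 2 for u, _ in samples)
    return num / den  # slope = dimension estimate
```

Points along a smooth curve give a dimension near 1, while a dense, space-filling crown silhouette would push the estimate toward 2.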

  20. 3D Modelling and Printing Technology to Produce Patient-Specific 3D Models.

    Science.gov (United States)

    Birbara, Nicolette S; Otton, James M; Pather, Nalini

    2017-11-10

A comprehensive knowledge of mitral valve (MV) anatomy is crucial in the assessment of MV disease. While the use of three-dimensional (3D) modelling and printing in MV assessment has undergone early clinical evaluation, the precision and usefulness of this technology requires further investigation. This study aimed to assess and validate 3D modelling and printing technology to produce patient-specific 3D MV models. A prototype method for MV 3D modelling and printing was developed from computed tomography (CT) scans of a plastinated human heart. Mitral valve models were printed using four 3D printing methods and validated to assess precision. Cardiac CT and 3D echocardiography imaging data of four MV disease patients were used to produce patient-specific 3D printed models, and 40 cardiac health professionals (CHPs) were surveyed on the perceived value and potential uses of 3D models in a clinical setting. The prototype method demonstrated submillimetre precision for all four 3D printing methods used, and statistical analysis showed a significant difference between the printing methods. The 3D printed models, particularly those using multiple print materials, were considered useful by CHPs for preoperative planning, as well as for other applications such as teaching and training. This study suggests that, with further advances in 3D modelling and printing technology, patient-specific 3D MV models could serve as a useful clinical tool. The findings also highlight the potential of this technology to be applied in a variety of medical areas within both clinical and educational settings. Copyright © 2017 Australian and New Zealand Society of Cardiac and Thoracic Surgeons (ANZSCTS) and the Cardiac Society of Australia and New Zealand (CSANZ). Published by Elsevier B.V. All rights reserved.

  1. On Automatic Modeling and Use of Domain-specific Ontologies

    DEFF Research Database (Denmark)

    Andreasen, Troels; Knappe, Rasmus; Bulskov, Henrik

    2005-01-01

    collection the general ontology is restricted to a domain specific ontology encompassing concepts instantiated in the collection. The resulting domain specific ontology and similarity can be applied for surveying the collection through key concepts and conceptual relations and provides a means for topic...

  2. Central neuronal motor behaviour in skilled and less skilled novices - Approaching sports-specific movement techniques.

    Science.gov (United States)

    Vogt, Tobias; Kato, Kouki; Schneider, Stefan; Türk, Stefan; Kanosue, Kazuyuki

    2017-04-01

Research on motor behavioural processes preceding voluntary movements often refers to analysing the readiness potential (RP). For this, decades of studies used laboratory setups with controlled sports-related actions. Further, recent applied approaches focus on athlete-non-athlete comparisons, omitting possible effects of training history on RP. However, RP preceding real sport-specific movements in relation to skill acquisition remains to be elucidated. Therefore, after familiarization, 16 right-handed males with no experience in archery volunteered to perform repeated sports-specific movements, i.e. 40 arrow-releasing shots with 60 s rest at a standard target 15 m away. Continuous, synchronised EEG and right limb EMG recordings during arrow-releasing served to detect movement onsets for RP analyses over distinct cortical motor areas. Based on attained scores on target, archery novices were, a posteriori, subdivided into skilled and less skilled groups. EMG results for mean values revealed no significant changes (all p>0.05), whereas RP amplitudes and onsets differed between groups but not between motor areas. Arrow-releasing was preceded by larger RP amplitudes during this sports-specific movement. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. Forecasting selected specific age mortality rate of Malaysia by using Lee-Carter model

    Science.gov (United States)

    Shukri Kamaruddin, Halim; Ismail, Noriszura

    2018-03-01

Observing mortality patterns and trends is an important subject for any country seeking to maintain a sound socio-economy over the coming projection years. A declining mortality trend gives a good impression of what a government has done for its citizens. Selecting a particular mortality model can be tricky, depending on the method adopted. The Lee-Carter model is adopted here because of its simplicity and the reliability of its results under a regression approach. Lee-Carter fitting and projection have been applied worldwide in most mortality research in developed countries. This paper studies Malaysia's past mortality pattern using the original Lee-Carter (1992) model, together with cross-sectional observation for single ages. The data, supplied by the Department of Statistics Malaysia, are indexed by age at death and year of death from 1984 to 2012. The results are modelled using RStudio, and the analysis focuses on the trend and projection of the mortality rate and age-specific mortality rates. This work can be extended to the various extensions of Lee-Carter, or to any stochastic mortality tool, using the Malaysian mortality experience as the central case.
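
The model above has a compact closed form: the Lee-Carter decomposition ln m(x,t) = a(x) + b(x)·k(t) + ε(x,t), which the original 1992 paper fits via a singular value decomposition of the centred log-rate matrix. A minimal sketch of that fitting route on synthetic data (the paper's analysis uses RStudio; the Python/NumPy below is purely illustrative):

```python
import numpy as np

def fit_lee_carter(log_m):
    """Fit ln m[x,t] = a[x] + b[x]*k[t] by SVD (Lee & Carter, 1992).

    log_m : (n_ages, n_years) array of log central death rates.
    Returns (a, b, k) under the usual constraints sum(b)=1, sum(k)=0.
    """
    a = log_m.mean(axis=1)                 # age-specific average level a[x]
    centred = log_m - a[:, None]           # remove the age pattern
    U, s, Vt = np.linalg.svd(centred, full_matrices=False)
    b, k = U[:, 0], s[0] * Vt[0, :]        # first singular component
    scale = b.sum()                        # impose the identifiability constraints
    return a, b / scale, k * scale

# Demo on noiseless synthetic data: the SVD recovers the components exactly.
rng = np.random.default_rng(0)
ages, years = 5, 20
a_true = np.linspace(-6.0, -2.0, ages)
b_true = rng.random(ages)
b_true /= b_true.sum()                     # sum(b) = 1
k_true = np.linspace(5.0, -5.0, years)
k_true -= k_true.mean()                    # sum(k) = 0
log_m = a_true[:, None] + np.outer(b_true, k_true)

a_hat, b_hat, k_hat = fit_lee_carter(log_m)
```

On real mortality data the first singular component is only an approximation, and k(t) is typically re-estimated so that fitted deaths match observed totals before projecting it forward.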

  4. Modeling Exon-Specific Bias Distribution Improves the Analysis of RNA-Seq Data.

    Directory of Open Access Journals (Sweden)

    Xuejun Liu

Full Text Available RNA-seq technology has become an important tool for quantifying gene and transcript expression in transcriptome studies. The two major difficulties in gene and transcript expression quantification are read mapping ambiguity and the overdispersion of the read distribution along the reference sequence. Many approaches have been proposed to deal with these difficulties. A number of existing methods use a Poisson distribution to model the read counts, which easily splits the counts into the contributions from multiple transcripts. Meanwhile, various solutions have been put forward to account for overdispersion in the Poisson models. By checking the similarities among the variation patterns of read counts for individual genes, we found that the count variation is exon-specific and has a conserved pattern across samples for each individual gene. We introduce Gamma-distributed latent variables to model the read sequencing preference for each exon. These variables are embedded in the rate parameter of a Poisson model to account for the overdispersion of the read distribution. The model is tractable since the Gamma priors can be integrated out in the maximum likelihood estimation. We evaluate the proposed approach, PGseq, using four real datasets and one simulated dataset, and compare its performance with other popular methods. Results show that PGseq presents competitive performance compared to other alternatives in terms of accuracy in gene and transcript expression calculation and in the downstream differential expression analysis. In particular, we show the advantage of our method in the analysis of low expression.
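
The modelling device described above, a Gamma-distributed latent preference multiplying a Poisson rate, has a well-known consequence: the marginal counts are negative binomial, with variance mu + mu^2/r rather than mu. A small simulation makes the overdispersion visible (the parameter values are illustrative and this is not PGseq's actual code):

```python
import numpy as np

# A Gamma(shape=r, scale=1/r) latent factor (mean 1) multiplying a
# Poisson rate mu gives a negative-binomial marginal:
#   E[Y] = mu,   Var[Y] = mu + mu**2 / r   (overdispersion).
rng = np.random.default_rng(42)
mu, r, n = 20.0, 5.0, 200_000

theta = rng.gamma(shape=r, scale=1.0 / r, size=n)   # exon-level preference
y = rng.poisson(mu * theta)                         # observed read counts

print(y.mean())   # close to mu = 20
print(y.var())    # close to mu + mu**2 / r = 100, well above the mean
```

Because the Gamma factor has mean 1, the mean count stays at mu while the variance is inflated by mu^2/r; the "integrated out" likelihood the abstract mentions is exactly this negative-binomial marginal.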

  5. Phytoplankton as Particles - A New Approach to Modeling Algal Blooms

    Science.gov (United States)

    2013-07-01

ERDC/EL TR-13-13, Civil Works Basic Research Program: Phytoplankton as Particles – A New Approach to Modeling Algal Blooms. Carl F. Cerco and Mark R. Noel, Environmental Laboratory, U.S. Army Engineer Research... phytoplankton blooms can be modeled by treating phytoplankton as discrete particles capable of self-induced transport via buoyancy regulation or other

  6. Contribution of a companion modelling approach

    African Journals Online (AJOL)

    2009-09-16

    Sep 16, 2009 ... This paper describes the role of participatory modelling and simulation as a way to provide a meaningful framework to enable actors to understand the interdependencies in peri-urban catchment management. A role-playing game, connecting the quantitative and qualitative dynamics of the resources with ...

  7. Numerical modelling approach for mine backfill

    Indian Academy of Sciences (India)

    Muhammad Zaka Emad

    2017-07-24

    Jul 24, 2017 ... Abstract. Numerical modelling is broadly used for assessing complex scenarios in underground mines, including mining sequence and blast-induced vibrations from production blasting. Sublevel stoping mining methods with delayed backfill are extensively used to exploit steeply dipping ore bodies by ...

  8. Energy and development : A modelling approach

    NARCIS (Netherlands)

    van Ruijven, B.J.|info:eu-repo/dai/nl/304834521

    2008-01-01

Rapid economic growth of developing countries like India and China implies that these countries become important actors in the global energy system. Examples of this impact are the present-day oil shortages and rapidly increasing emissions of greenhouse gases. Global energy models are used to explore

  9. Numerical modelling approach for mine backfill

    Indian Academy of Sciences (India)

    Muhammad Zaka Emad

    2017-07-24

Jul 24, 2017 ... pulse is applied as a stress history on the CRF stope. Blast wave data obtained from the on-site monitoring are very complex and require processing before being interpreted and used in numerical models. Generally, mining companies hire geophysics experts for the interpretation of such data. The blast wave ...

  10. A new approach to model mixed hydrates

    Czech Academy of Sciences Publication Activity Database

    Hielscher, S.; Vinš, Václav; Jäger, A.; Hrubý, Jan; Breitkopf, C.; Span, R.

    2018-01-01

Roč. 459, March (2018), s. 170-185 ISSN 0378-3812 R&D Projects: GA ČR(CZ) GA17-08218S Institutional support: RVO:61388998 Keywords: gas hydrate * mixture * modeling Subject RIV: BJ - Thermodynamics Impact factor: 2.473, year: 2016 https://www.sciencedirect.com/science/article/pii/S0378381217304983

  11. Energy and Development. A Modelling Approach

    International Nuclear Information System (INIS)

    Van Ruijven, B.J.

    2008-01-01

Rapid economic growth of developing countries like India and China implies that these countries become important actors in the global energy system. Examples of this impact are the present-day oil shortages and rapidly increasing emissions of greenhouse gases. Global energy models are used to explore possible future developments of the global energy system and identify policies to prevent potential problems. Such estimations of future energy use in developing countries are very uncertain. Crucial factors in the future energy use of these regions are electrification, urbanisation and income distribution, issues that are generally not included in present-day global energy models. Model simulations in this thesis show that current insights into developments in low-income regions lead to a wide range of expected energy use in 2030 for the residential and transport sectors. This is mainly caused by the many different model calibration options that result from the limited data availability for model development and calibration. We developed a method to identify the impact of model calibration uncertainty on future projections. We developed a new model for residential energy use in India, in collaboration with the Indian Institute of Science. Experiments with this model show that the impact of electrification and income distribution is less univocal than often assumed. The use of fuelwood, with related health risks, can decrease rapidly if the income of poor groups increases. However, there is a trade-off in terms of CO2 emissions because these groups gain access to electricity and the ownership of appliances increases. Another issue is the potential role of new technologies in developing countries: will they use the opportunities of leapfrogging? We explored the potential role of hydrogen, an energy carrier that might play a central role in a sustainable energy system. We found that hydrogen only plays a role before 2050 under very optimistic assumptions. Regional energy

  12. Energy and Development. A Modelling Approach

    Energy Technology Data Exchange (ETDEWEB)

    Van Ruijven, B.J.

    2008-12-17

Rapid economic growth of developing countries like India and China implies that these countries become important actors in the global energy system. Examples of this impact are the present-day oil shortages and rapidly increasing emissions of greenhouse gases. Global energy models are used to explore possible future developments of the global energy system and identify policies to prevent potential problems. Such estimations of future energy use in developing countries are very uncertain. Crucial factors in the future energy use of these regions are electrification, urbanisation and income distribution, issues that are generally not included in present-day global energy models. Model simulations in this thesis show that current insights into developments in low-income regions lead to a wide range of expected energy use in 2030 for the residential and transport sectors. This is mainly caused by the many different model calibration options that result from the limited data availability for model development and calibration. We developed a method to identify the impact of model calibration uncertainty on future projections. We developed a new model for residential energy use in India, in collaboration with the Indian Institute of Science. Experiments with this model show that the impact of electrification and income distribution is less univocal than often assumed. The use of fuelwood, with related health risks, can decrease rapidly if the income of poor groups increases. However, there is a trade-off in terms of CO2 emissions because these groups gain access to electricity and the ownership of appliances increases. Another issue is the potential role of new technologies in developing countries: will they use the opportunities of leapfrogging? We explored the potential role of hydrogen, an energy carrier that might play a central role in a sustainable energy system. We found that hydrogen only plays a role before 2050 under very optimistic assumptions. Regional energy

  13. A model-based systems approach to pharmaceutical product-process design and analysis

    DEFF Research Database (Denmark)

    Gernaey, Krist; Gani, Rafiqul

    2010-01-01

    This is a perspective paper highlighting the need for systematic model-based design and analysis in pharmaceutical product-process development. A model-based framework is presented and the role, development and use of models of various types are discussed together with the structure of the models...... for the product and the process. The need for a systematic modelling framework is highlighted together with modelling issues related to model identification, adaptation and extension. In the area of product design and analysis, predictive models are needed with a wide application range. In the area of process...... synthesis and design, the use of generic process models from which specific process models can be generated, is highlighted. The use of a multi-scale modelling approach to extend the application range of the property models is highlighted as well. Examples of different types of process models, model...

  14. A path integral approach to the Hodgkin-Huxley model

    Science.gov (United States)

    Baravalle, Roman; Rosso, Osvaldo A.; Montani, Fernando

    2017-11-01

    To understand how single neurons process sensory information, it is necessary to develop suitable stochastic models to describe the response variability of the recorded spike trains. Spikes in a given neuron are produced by the synergistic action of sodium and potassium of the voltage-dependent channels that open or close the gates. Hodgkin and Huxley (HH) equations describe the ionic mechanisms underlying the initiation and propagation of action potentials, through a set of nonlinear ordinary differential equations that approximate the electrical characteristics of the excitable cell. Path integral provides an adequate approach to compute quantities such as transition probabilities, and any stochastic system can be expressed in terms of this methodology. We use the technique of path integrals to determine the analytical solution driven by a non-Gaussian colored noise when considering the HH equations as a stochastic system. The different neuronal dynamics are investigated by estimating the path integral solutions driven by a non-Gaussian colored noise q. More specifically we take into account the correlational structures of the complex neuronal signals not just by estimating the transition probability associated to the Gaussian approach of the stochastic HH equations, but instead considering much more subtle processes accounting for the non-Gaussian noise that could be induced by the surrounding neural network and by feedforward correlations. This allows us to investigate the underlying dynamics of the neural system when different scenarios of noise correlations are considered.
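
For reference, the deterministic Hodgkin-Huxley system underlying the paper's stochastic treatment can be integrated in a few lines. The sketch below uses the standard squid-axon parameters and forward Euler; it is the textbook baseline only, not the path-integral model with non-Gaussian colored noise:

```python
import numpy as np

# Standard Hodgkin-Huxley gating rate functions (V in mV, rates in 1/ms).
def alpha_n(V): return 0.01 * (V + 55) / (1 - np.exp(-(V + 55) / 10))
def beta_n(V):  return 0.125 * np.exp(-(V + 65) / 80)
def alpha_m(V): return 0.1 * (V + 40) / (1 - np.exp(-(V + 40) / 10))
def beta_m(V):  return 4.0 * np.exp(-(V + 65) / 18)
def alpha_h(V): return 0.07 * np.exp(-(V + 65) / 20)
def beta_h(V):  return 1.0 / (1 + np.exp(-(V + 35) / 10))

C, gNa, gK, gL = 1.0, 120.0, 36.0, 0.3    # uF/cm^2 and mS/cm^2
ENa, EK, EL = 50.0, -77.0, -54.387        # reversal potentials, mV

def simulate(I_ext=10.0, T=50.0, dt=0.01):
    """Forward-Euler integration of the HH equations under constant current."""
    V = -65.0
    m = alpha_m(V) / (alpha_m(V) + beta_m(V))   # start gates at steady state
    h = alpha_h(V) / (alpha_h(V) + beta_h(V))
    n = alpha_n(V) / (alpha_n(V) + beta_n(V))
    trace = []
    for _ in range(int(T / dt)):
        I_ion = (gNa * m**3 * h * (V - ENa)
                 + gK * n**4 * (V - EK)
                 + gL * (V - EL))
        V += dt * (I_ext - I_ion) / C
        m += dt * (alpha_m(V) * (1 - m) - beta_m(V) * m)
        h += dt * (alpha_h(V) * (1 - h) - beta_h(V) * h)
        n += dt * (alpha_n(V) * (1 - n) - beta_n(V) * n)
        trace.append(V)
    return np.array(trace)

V = simulate()
print(V.max())   # peak voltage well above 0 mV: the model fires
```

The stochastic versions discussed in the paper replace the constant drive with a noise process and ask for transition probabilities of this system, which is where the path-integral machinery enters.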

  15. Stabilization Approaches for Linear and Nonlinear Reduced Order Models

    Science.gov (United States)

    Rezaian, Elnaz; Wei, Mingjun

    2017-11-01

It has been a major concern to establish reduced order models (ROMs) as reliable representatives of the dynamics inherent in high-fidelity simulations while achieving fast computation. In practice, this comes down to the stability and accuracy of the ROMs. Given the inviscid nature of the Euler equations, it becomes more challenging to achieve stability, especially where moving discontinuities exist. Originally unstable linear and nonlinear ROMs are stabilized here by two approaches. First, a hybrid method is developed by integrating two different stabilization algorithms. At the same time, the symmetry inner product is introduced in the generation of ROMs for its known robust behavior for compressible flows. Results have shown a notable improvement in computational efficiency and robustness compared to similar approaches. Second, a new stabilization algorithm is developed specifically for nonlinear ROMs. This method adopts Particle Swarm Optimization to enforce a bounded ROM response, minimizing the discrepancy between the high-fidelity simulation and the ROM outputs. Promising results are obtained in its application to the nonlinear ROM of an inviscid fluid flow with discontinuities. Supported by ARL.

  16. Dynamic Metabolic Model Building Based on the Ensemble Modeling Approach

    Energy Technology Data Exchange (ETDEWEB)

    Liao, James C. [Univ. of California, Los Angeles, CA (United States)

    2016-10-01

    Ensemble modeling of kinetic systems addresses the challenges of kinetic model construction, with respect to parameter value selection, and still allows for the rich insights possible from kinetic models. This project aimed to show that constructing, implementing, and analyzing such models is a useful tool for the metabolic engineering toolkit, and that they can result in actionable insights from models. Key concepts are developed and deliverable publications and results are presented.

  17. Physical models have gender‐specific effects on student understanding of protein structure–function relationships

    Science.gov (United States)

    Harris, Michelle A.; Chang, Wesley S.; Dent, Erik W.; Nordheim, Erik V.; Franzen, Margaret A.

    2016-01-01

    Abstract Understanding how basic structural units influence function is identified as a foundational/core concept for undergraduate biological and biochemical literacy. It is essential for students to understand this concept at all size scales, but it is often more difficult for students to understand structure–function relationships at the molecular level, which they cannot as effectively visualize. Students need to develop accurate, 3‐dimensional mental models of biomolecules to understand how biomolecular structure affects cellular functions at the molecular level, yet most traditional curricular tools such as textbooks include only 2‐dimensional representations. We used a controlled, backward design approach to investigate how hand‐held physical molecular model use affected students' ability to logically predict structure–function relationships. Brief (one class period) physical model use increased quiz score for females, whereas there was no significant increase in score for males using physical models. Females also self‐reported higher learning gains in their understanding of context‐specific protein function. Gender differences in spatial visualization may explain the gender‐specific benefits of physical model use observed. © 2016 The Authors Biochemistry and Molecular Biology Education published by Wiley Periodicals, Inc. on behalf of International Union of Biochemistry and Molecular Biology, 44(4):326–335, 2016. PMID:26923186

  18. Physical models have gender-specific effects on student understanding of protein structure-function relationships.

    Science.gov (United States)

    Forbes-Lorman, Robin M; Harris, Michelle A; Chang, Wesley S; Dent, Erik W; Nordheim, Erik V; Franzen, Margaret A

    2016-07-08

    Understanding how basic structural units influence function is identified as a foundational/core concept for undergraduate biological and biochemical literacy. It is essential for students to understand this concept at all size scales, but it is often more difficult for students to understand structure-function relationships at the molecular level, which they cannot as effectively visualize. Students need to develop accurate, 3-dimensional mental models of biomolecules to understand how biomolecular structure affects cellular functions at the molecular level, yet most traditional curricular tools such as textbooks include only 2-dimensional representations. We used a controlled, backward design approach to investigate how hand-held physical molecular model use affected students' ability to logically predict structure-function relationships. Brief (one class period) physical model use increased quiz score for females, whereas there was no significant increase in score for males using physical models. Females also self-reported higher learning gains in their understanding of context-specific protein function. Gender differences in spatial visualization may explain the gender-specific benefits of physical model use observed. © 2016 The Authors Biochemistry and Molecular Biology Education published by Wiley Periodicals, Inc. on behalf of International Union of Biochemistry and Molecular Biology, 44(4):326-335, 2016. © 2016 The International Union of Biochemistry and Molecular Biology.

  19. Integration models: multicultural and liberal approaches confronted

    Science.gov (United States)

    Janicki, Wojciech

    2012-01-01

European societies have been shaped by their Christian past, an upsurge of international migration, democratic rule, and a liberal tradition rooted in religious tolerance. Accelerating globalization processes impose new challenges on European societies striving to protect their diversity. This struggle is especially visible in the case of minorities trying to resist melting into the mainstream culture. European countries' legal systems and cultural policies respond to these efforts in many ways. Respecting group rights driven by identity politics seems to be the most common approach, resulting in the creation of a multicultural society. However, the outcome of respecting group rights may be remarkably contradictory both to individual rights growing out of the liberal tradition and to a reinforced concept of integration of immigrants into host societies. This paper discusses the upturn of identity politics in the context of both individual rights and the integration of European societies.

  20. Modelling thermal plume impacts - Kalpakkam approach

    International Nuclear Information System (INIS)

    Rao, T.S.; Anup Kumar, B.; Narasimhan, S.V.

    2002-01-01

A good understanding of temperature patterns in the receiving waters is essential for knowing the heat dissipation from thermal plumes originating at coastal power plants. The seasonal temperature profiles of the Kalpakkam coast near the Madras Atomic Power Station (MAPS) thermal outfall site are determined and analysed. It is observed that the seasonal current reversal in the near-shore zone is one of the major mechanisms for transporting effluents away from the point of mixing. To further refine our understanding of the mixing and dilution processes, it is necessary to numerically simulate the coastal ocean processes by parameterising the key factors concerned. In this paper, we outline the experimental approach to achieve this objective. (author)

  1. The Layer-Oriented Approach to Declarative Languages for Biological Modeling

    Science.gov (United States)

    Raikov, Ivan; De Schutter, Erik

    2012-01-01

    We present a new approach to modeling languages for computational biology, which we call the layer-oriented approach. The approach stems from the observation that many diverse biological phenomena are described using a small set of mathematical formalisms (e.g. differential equations), while at the same time different domains and subdomains of computational biology require that models are structured according to the accepted terminology and classification of that domain. Our approach uses distinct semantic layers to represent the domain-specific biological concepts and the underlying mathematical formalisms. Additional functionality can be transparently added to the language by adding more layers. This approach is specifically concerned with declarative languages, and throughout the paper we note some of the limitations inherent to declarative approaches. The layer-oriented approach is a way to specify explicitly how high-level biological modeling concepts are mapped to a computational representation, while abstracting away details of particular programming languages and simulation environments. To illustrate this process, we define an example language for describing models of ionic currents, and use a general mathematical notation for semantic transformations to show how to generate model simulation code for various simulation environments. We use the example language to describe a Purkinje neuron model and demonstrate how the layer-oriented approach can be used for solving several practical issues of computational neuroscience model development. We discuss the advantages and limitations of the approach in comparison with other modeling language efforts in the domain of computational biology and outline some principles for extensible, flexible modeling language design. We conclude by describing in detail the semantic transformations defined for our language. PMID:22615554

  2. HIV-specific probabilistic models of protein evolution.

    Directory of Open Access Journals (Sweden)

    David C Nickle

    2007-06-01

Full Text Available Comparative sequence analyses, including such fundamental bioinformatics techniques as similarity searching, sequence alignment and phylogenetic inference, have become a mainstay for researchers studying type 1 Human Immunodeficiency Virus (HIV-1) genome structure and evolution. Implicit in comparative analyses is an underlying model of evolution, and the chosen model can significantly affect the results. In general, evolutionary models describe the probabilities of replacing one amino acid character with another over a period of time. Most widely used evolutionary models for protein sequences have been derived from curated alignments of hundreds of proteins, usually based on mammalian genomes. It is unclear to what extent these empirical models are generalizable to a very different organism, such as HIV-1, the most extensively sequenced organism in existence. We developed a maximum likelihood model fitting procedure for a collection of HIV-1 alignments sampled from different viral genes, and inferred two empirical substitution models, suitable for describing between- and within-host evolution. Our procedure pools the information from multiple sequence alignments, and the provided software implementation can be run efficiently in parallel on a computer cluster. We describe how the inferred substitution models can be used to generate scoring matrices suitable for alignment and similarity searches. Our models had a consistently superior fit relative to the best existing models and to parameter-rich data-driven models when benchmarked on independent HIV-1 alignments, demonstrating evolutionary biases in amino-acid substitution that are unique to HIV and that are not captured by the existing models. The scoring matrices derived from the models showed a marked difference from common amino-acid scoring matrices. The use of an appropriate evolutionary model recovered a known viral transmission history, whereas a poorly chosen model introduced phylogenetic
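
The step from a fitted substitution model to a scoring matrix mentioned above is conventionally a log-odds transform, S(i,j) = round((1/lambda) * log2(P(i,j) / (q(i)·q(j)))). A toy sketch on a three-letter alphabet (the probabilities and the scale lambda are invented for illustration, not taken from the paper's HIV-1 models):

```python
import numpy as np

# BLOSUM-style log-odds scoring matrix from a substitution model:
# P[i, j] is the joint substitution probability, q the background
# (marginal) frequencies, lam the scale of the matrix in bits.
P = np.array([[0.30, 0.03, 0.02],
              [0.03, 0.25, 0.04],
              [0.02, 0.04, 0.27]])   # symmetric, entries sum to 1
q = P.sum(axis=1)                    # marginal residue frequencies
lam = 0.5                            # illustrative scale

S = np.round(np.log2(P / np.outer(q, q)) / lam).astype(int)
print(S)   # positive on the diagonal (conserved), negative off it
```

Substitutions that are more likely than chance under the model score positively; rarer-than-chance substitutions score negatively, which is what makes the matrix usable for alignment and similarity searches.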

  3. Modelling approach for photochemical pollution studies

    International Nuclear Information System (INIS)

    Silibello, C.; Catenacci, G.; Calori, G.; Crapanzano, G.; Pirovano, G.

    1996-01-01

The comprehension of the relationships between primary pollutant emissions and secondary pollutant concentrations and deposition is necessary for designing policies and strategies for the maintenance of a healthy environment. Mathematical models are a powerful tool to assess the effects of emissions, and of the physical and chemical transformations of pollutants, on air quality. A photochemical model, Calgrid, developed by CARB (California Air Resources Board), has been used to test the effect of different meteorological and air quality scenarios on ozone concentration levels. In this way we can evaluate the influence of these conditions and determine the most important chemical species and reactions in the atmosphere. The ozone levels are strongly related to the reactive hydrocarbon concentrations and to the solar radiation flux

  4. Colour texture segmentation using modelling approach

    Czech Academy of Sciences Publication Activity Database

    Haindl, Michal; Mikeš, Stanislav

    2005-01-01

    Roč. 3687, č. - (2005), s. 484-491 ISSN 0302-9743. [International Conference on Advances in Pattern Recognition /3./. Bath, 22.08.2005-25.08.2005] R&D Projects: GA MŠk 1M0572; GA AV ČR 1ET400750407; GA AV ČR IAA2075302 Institutional research plan: CEZ:AV0Z10750506 Keywords : colour texture segmentation * image models * segmentation benchmark Subject RIV: BD - Theory of Information

  5. A combined DNA-microarray and mechanism-specific toxicity approach with zebrafish embryos to investigate the pollution of river sediments

    NARCIS (Netherlands)

    Kosmehl, Thomas; Otte, Jens C.; Yang, Lixin; Legradi, J.B.; Bluhm, Kerstin; Zinsmeister, Christian; Keiter, Steffen H.; Reifferscheid, Georg; Manz, Werner; Braunbeck, Thomas; Strähle, Uwe; Hollert, Henner

    2012-01-01

    The zebrafish embryo has repeatedly proved to be a useful model for the analysis of effects by environmental toxicants. This proof-of-concept study was performed to investigate if an approach combining mechanism-specific bioassays with microarray techniques can obtain more in-depth insights into the

  6. An approach to model validation and model-based prediction -- polyurethane foam case study.

    Energy Technology Data Exchange (ETDEWEB)

    Dowding, Kevin J.; Rutherford, Brian Milne

    2003-07-01

Enhanced software methodology and improved computing hardware have advanced the state of simulation technology to a point where large physics-based codes can be a major contributor in many systems analyses. This shift toward the use of computational methods has brought with it new research challenges in a number of areas including characterization of uncertainty, model validation, and the analysis of computer output. It is these challenges that have motivated the work described in this report. Approaches to and methods for model validation and (model-based) prediction have been developed recently in the engineering, mathematics and statistical literatures. In this report we have provided a fairly detailed account of one approach to model validation and prediction applied to an analysis investigating thermal decomposition of polyurethane foam. A model simulates the evolution of the foam in a high temperature environment as it transforms from a solid to a gas phase. The available modeling and experimental results serve as data for a case study focusing our model validation and prediction developmental efforts on this specific thermal application. We discuss several elements of the "philosophy" behind the validation and prediction approach: (1) We view the validation process as an activity applying to the use of a specific computational model for a specific application. We do acknowledge, however, that an important part of the overall development of a computational simulation initiative is the feedback provided to model developers and analysts associated with the application. (2) We utilize information obtained for the calibration of model parameters to estimate the parameters and quantify uncertainty in the estimates. We rely, however, on validation data (or data from similar analyses) to measure the variability that contributes to the uncertainty in predictions for specific systems or units (unit-to-unit variability). (3) We perform statistical

  7. ISM Approach to Model Offshore Outsourcing Risks

    Directory of Open Access Journals (Sweden)

    Sunand Kumar

    2014-07-01

Full Text Available In an effort to achieve a competitive advantage via cost reductions and improved market responsiveness, organizations are increasingly employing offshore outsourcing as a major component of their supply chain strategies. But as is evident from the literature, a number of risks, such as political risk, risk due to cultural differences, compliance and regulatory risk, opportunistic risk and organizational structural risk, adversely affect the performance of offshore outsourcing in a supply chain network. This also leads to dissatisfaction among different stakeholders. The main objective of this paper is to identify and understand the mutual interaction among the various risks which affect the performance of offshore outsourcing. To this effect, the authors have identified various risks through an extant review of the literature. From this information, an integrated model of the risks affecting offshore outsourcing is developed using interpretive structural modelling (ISM), and the structural relationships between these risks are modeled. Further, MICMAC analysis is done to analyze the driving power and dependence of the risks, which helps managers identify and classify the important criteria and reveal the direct and indirect effects of each criterion on offshore outsourcing. Results show that political risk and risk due to cultural differences act as strong drivers.
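
Mechanically, ISM plus MICMAC reduces to a transitive closure of a binary direct-influence matrix followed by row and column sums. A sketch with five risks and hypothetical influence links (the links below are invented; the paper derives its own from the literature and expert input):

```python
import numpy as np

# Direct-influence matrix A: A[i, j] = 1 if risk i influences risk j.
risks = ["political", "cultural", "regulatory", "opportunistic", "structural"]
A = np.array([
    [1, 1, 1, 0, 1],   # political influences cultural, regulatory, structural
    [0, 1, 0, 1, 0],   # cultural influences opportunistic
    [0, 0, 1, 1, 1],   # regulatory influences opportunistic, structural
    [0, 0, 0, 1, 0],   # opportunistic influences nothing else
    [0, 0, 0, 1, 1],   # structural influences opportunistic
], dtype=bool)

# Warshall transitive closure -> ISM reachability matrix R.
R = A.copy()
for k in range(len(R)):
    R |= np.outer(R[:, k], R[k, :])

driving = R.sum(axis=1)      # MICMAC driving power: how many risks it reaches
dependence = R.sum(axis=0)   # MICMAC dependence: how many risks reach it

for name, d, p in zip(risks, driving, dependence):
    print(f"{name:13s} driving={d} dependence={p}")
```

High driving power with low dependence marks a driver risk (here "political"); high dependence with low driving power marks a dependent outcome (here "opportunistic"), which is the classification MICMAC diagrams plot.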

  8. Spatial smoothing in Bayesian models: a comparison of weights matrix specifications and their impact on inference.

    Science.gov (United States)

    Duncan, Earl W; White, Nicole M; Mengersen, Kerrie

    2017-12-16

    When analysing spatial data, it is important to account for spatial autocorrelation. In Bayesian statistics, spatial autocorrelation is commonly modelled by the intrinsic conditional autoregressive prior distribution. At the heart of this model is a spatial weights matrix which controls the behaviour and degree of spatial smoothing. The purpose of this study is to review the main specifications of the spatial weights matrix found in the literature, and together with some new and less common specifications, compare the effect that they have on smoothing and model performance. The popular BYM model is described, and a simple solution for addressing the identifiability issue among the spatial random effects is provided. Seventeen different definitions of the spatial weights matrix are defined, which are classified into four classes: adjacency-based weights, and weights based on geographic distance, distance between covariate values, and a hybrid of geographic and covariate distances. These last two definitions embody the main novelty of this research. Three synthetic data sets are generated, each representing a different underlying spatial structure. These data sets together with a real spatial data set from the literature are analysed using the models. The models are evaluated using the deviance information criterion and Moran's I statistic. The deviance information criterion indicated that the model which uses binary, first-order adjacency weights to perform spatial smoothing is generally an optimal choice for achieving a good model fit. Distance-based weights also generally perform quite well and offer similar parameter interpretations. The less commonly explored options for performing spatial smoothing generally provided a worse model fit than models with more traditional approaches to smoothing, but usually outperformed the benchmark model which did not conduct spatial smoothing. 
The specification of the spatial weights matrix can have a colossal impact on model
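The adjacency-based weights and the Moran's I diagnostic described in this abstract can be made concrete in a few lines. The sketch below is an illustrative toy (regular grid, binary first-order "rook" adjacency), not the authors' code:

```python
import numpy as np

def rook_adjacency(rows, cols):
    """Binary first-order (rook) spatial weights matrix for a regular grid."""
    n = rows * cols
    W = np.zeros((n, n))
    for r in range(rows):
        for c in range(cols):
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols:
                    W[r * cols + c, rr * cols + cc] = 1.0
    return W

def morans_i(x, W):
    """Moran's I: > 0 for positive spatial autocorrelation, < 0 for negative."""
    z = x - x.mean()
    return len(x) / W.sum() * (z @ W @ z) / (z @ z)
```

A smooth gradient across the grid yields a positive Moran's I, while a checkerboard pattern (every neighbour dissimilar) yields a negative value, which is the kind of diagnostic the study uses to evaluate the smoothing behaviour of each weights specification.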

  9. Automated model integration at source code level: An approach for implementing models into the NASA Land Information System

    Science.gov (United States)

    Wang, S.; Peters-Lidard, C. D.; Mocko, D. M.; Kumar, S.; Nearing, G. S.; Arsenault, K. R.; Geiger, J. V.

    2014-12-01

Model integration bridges the data flow between modeling frameworks and models. However, models usually do not fit directly into a particular modeling environment unless they were designed for it. An example includes implementing different types of models into the NASA Land Information System (LIS), a software framework for land-surface modeling and data assimilation. Model implementation requires scientific knowledge and software expertise and may take a developer months to learn LIS and the model's software structure. Debugging and testing of the model implementation is also time-consuming due to not fully understanding LIS or the model. This time spent is costly for research and operational projects. To address this issue, an approach has been developed to automate model integration into LIS. With this in mind, a general model interface was designed to retrieve forcing inputs, parameters, and state variables needed by the model and to provide state variables and outputs back to LIS. Every model can be wrapped to comply with the interface, usually with a FORTRAN 90 subroutine. Development then requires only knowledge of the model and basic programming skills. With such wrappers, the logic is the same for implementing all models. Code templates defined for this general model interface can be re-used with any specific model. Therefore, the model implementation can be done automatically. An automated model implementation toolkit was developed with Microsoft Excel and its built-in VBA language. It allows model specifications in three worksheets and contains FORTRAN 90 code templates in VBA programs. According to the model specification, the toolkit generates data structures and procedures within FORTRAN modules and subroutines, which transfer data between LIS and the model wrapper. Model implementation is standardized, and about 80-90% of the development load is reduced. In this presentation, the automated model implementation approach is described along with LIS programming
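The "general model interface" idea can be illustrated with a Python analogy: every model is wrapped behind one fixed interface, so the framework-side glue is identical for all models and can be generated mechanically. The real toolkit generates FORTRAN 90; all names below are hypothetical:

```python
# Registry the framework-side driver iterates over; each entry is a wrapped model.
MODEL_REGISTRY = {}

def register_model(name):
    """Decorator: expose a wrapped model under a name the driver can look up."""
    def wrap(cls):
        MODEL_REGISTRY[name] = cls
        return cls
    return wrap

@register_model("bucket")
class BucketModel:
    """Toy storage model: state is soil water; forcing is rain; param is leak rate."""
    def __init__(self, leak=0.1):
        self.leak = leak
        self.storage = 0.0          # state variable exchanged with the framework

    def run_step(self, rain):
        self.storage += rain
        runoff = self.leak * self.storage
        self.storage -= runoff
        return {"runoff": runoff, "storage": self.storage}

def drive(name, forcings, **params):
    """Framework-side glue: identical for every registered model."""
    model = MODEL_REGISTRY[name](**params)
    return [model.run_step(f) for f in forcings]
```

Because `drive` never mentions a specific model, the same loop serves any model that complies with the interface, which is what makes the code-generation step in the abstract possible.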

  10. Smeared crack modelling approach for corrosion-induced concrete damage

    DEFF Research Database (Denmark)

    Thybo, Anna Emilie Anusha; Michel, Alexander; Stang, Henrik

    2017-01-01

In this paper a smeared crack modelling approach is used to simulate corrosion-induced damage in reinforced concrete. The presented modelling approach utilizes a thermal analogy to mimic the expansive nature of solid corrosion products, while taking into account the penetration of corrosion products into the surrounding concrete, non-uniform precipitation of corrosion products, and creep. To demonstrate the applicability of the presented modelling approach, numerical predictions in terms of corrosion-induced deformations as well as formation and propagation of micro- and macrocracks were…

  11. Applied Regression Modeling A Business Approach

    CERN Document Server

    Pardoe, Iain

    2012-01-01

    An applied and concise treatment of statistical regression techniques for business students and professionals who have little or no background in calculusRegression analysis is an invaluable statistical methodology in business settings and is vital to model the relationship between a response variable and one or more predictor variables, as well as the prediction of a response value given values of the predictors. In view of the inherent uncertainty of business processes, such as the volatility of consumer spending and the presence of market uncertainty, business professionals use regression a

  12. Jackiw-Pi model: A superfield approach

    Science.gov (United States)

    Gupta, Saurabh

    2014-12-01

We derive the off-shell nilpotent and absolutely anticommuting Becchi-Rouet-Stora-Tyutin (BRST) as well as anti-BRST transformations s(a)b corresponding to the Yang-Mills gauge transformations of the 3D Jackiw-Pi model by exploiting the "augmented" superfield formalism. We also show that the Curci-Ferrari restriction, which is a hallmark of any non-Abelian 1-form gauge theory, emerges naturally within this formalism and plays an instrumental role in providing the proof of absolute anticommutativity of s(a)b.

  13. A modular approach to numerical human body modeling

    NARCIS (Netherlands)

    Forbes, P.A.; Griotto, G.; Rooij, L. van

    2007-01-01

    The choice of a human body model for a simulated automotive impact scenario must take into account both accurate model response and computational efficiency as key factors. This study presents a "modular numerical human body modeling" approach which allows the creation of a customized human body

  14. Mathematical Modeling in Mathematics Education: Basic Concepts and Approaches

    Science.gov (United States)

    Erbas, Ayhan Kürsat; Kertil, Mahmut; Çetinkaya, Bülent; Çakiroglu, Erdinç; Alacaci, Cengiz; Bas, Sinem

    2014-01-01

    Mathematical modeling and its role in mathematics education have been receiving increasing attention in Turkey, as in many other countries. The growing body of literature on this topic reveals a variety of approaches to mathematical modeling and related concepts, along with differing perspectives on the use of mathematical modeling in teaching and…

  15. Convenience of Statistical Approach in Studies of Architectural Ornament and Other Decorative Elements Specific Application

    Science.gov (United States)

    Priemetz, O.; Samoilov, K.; Mukasheva, M.

    2017-11-01

An ornament is an actual phenomenon of the modern theory of architecture and a common element in the practice of design and construction. It has been an important aspect of shaping for millennia, and the description of the methods of its application occupies a large place in studies on the theory and practice of architecture. However, the saturation of compositions with ornamentation and the specificity of its themes and forms have not yet been sufficiently studied. This aspect requires the accumulation of additional knowledge. Applying quantitative methods to the types of plastic solutions and the thematic diversity of facade compositions of buildings constructed in different periods creates another tool for an objective analysis of ornament development. The paper demonstrates the application of this approach to studying the features of architectural development in Kazakhstan from the end of the XIX century to the XXI century.

  16. Spatial and temporal determinants of A-weighted and frequency specific sound levels-An elastic net approach.

    Science.gov (United States)

    Walker, Erica D; Hart, Jaime E; Koutrakis, Petros; Cavallari, Jennifer M; VoPham, Trang; Luna, Marcos; Laden, Francine

    2017-11-01

    Urban sound levels are a ubiquitous environmental stressor and have been shown to be associated with a wide variety of health outcomes. While much is known about the predictors of A-weighted sound pressure levels in the urban environment, far less is known about other frequencies. To develop a series of spatial-temporal sound models to predict A-weighted sound pressure levels, low, mid, and high frequency sound for Boston, Massachusetts. Short-term sound levels were gathered at n = 400 sites from February 2015 - February 2016. Spatial and meteorological attributes at or near the sound monitoring site were obtained using publicly available data and a portable weather station. An elastic net variable selection technique was used to select predictors of A-weighted, low, mid, and high frequency sound. The final models for low, mid, high, and A-weighted sound levels explained 59 - 69% of the variability in each measure. Similar to other A-weighted models, our sound models included transportation related variables such as length of roads and bus lines in the surrounding area; distance to road and rail lines; traffic volume, vehicle mix, residential and commercial land use. However, frequency specific models highlighted additional predictors not included in the A-weighted model including temperature, vegetation, impervious surfaces, vehicle mix, and density of entertainment establishments and restaurants. Building spatial temporal models to characterize sound levels across the frequency spectrum using an elastic net approach can be a promising tool for noise exposure assessments within the urban soundscape. Models of sound's character may give us additional important sound exposure metrics to be utilized in epidemiological studies. Copyright © 2017 Elsevier Inc. All rights reserved.
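The elastic net variable selection used in this study can be sketched with plain coordinate descent. This is a minimal numpy illustration of the technique, not the authors' pipeline (scikit-learn's `ElasticNet` implements the same penalized objective in production form):

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator: the L1 part of the elastic net update."""
    return np.sign(z) * max(abs(z) - t, 0.0)

def elastic_net(X, y, alpha=0.1, l1_ratio=0.5, n_iter=200):
    """Coordinate-descent elastic net (features assumed roughly standardized).

    Minimizes (1/2n)||y - Xb||^2 + alpha * (l1_ratio * ||b||_1
                                            + (1 - l1_ratio)/2 * ||b||^2).
    """
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            resid = y - X @ beta + X[:, j] * beta[j]   # partial residual
            rho = X[:, j] @ resid / n
            denom = X[:, j] @ X[:, j] / n + alpha * (1 - l1_ratio)
            beta[j] = soft_threshold(rho, alpha * l1_ratio) / denom
    return beta
```

The L1 part drives the coefficients of weak predictors exactly to zero, which is why the method doubles as variable selection across the many candidate spatial and meteorological attributes described above.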

  17. Implicit moral evaluations: A multinomial modeling approach.

    Science.gov (United States)

    Cameron, C Daryl; Payne, B Keith; Sinnott-Armstrong, Walter; Scheffer, Julian A; Inzlicht, Michael

    2017-01-01

    Implicit moral evaluations-i.e., immediate, unintentional assessments of the wrongness of actions or persons-play a central role in supporting moral behavior in everyday life. Yet little research has employed methods that rigorously measure individual differences in implicit moral evaluations. In five experiments, we develop a new sequential priming measure-the Moral Categorization Task-and a multinomial model that decomposes judgment on this task into multiple component processes. These include implicit moral evaluations of moral transgression primes (Unintentional Judgment), accurate moral judgments about target actions (Intentional Judgment), and a directional tendency to judge actions as morally wrong (Response Bias). Speeded response deadlines reduced Intentional Judgment but not Unintentional Judgment (Experiment 1). Unintentional Judgment was stronger toward moral transgression primes than non-moral negative primes (Experiments 2-4). Intentional Judgment was associated with increased error-related negativity, a neurophysiological indicator of behavioral control (Experiment 4). Finally, people who voted for an anti-gay marriage amendment had stronger Unintentional Judgment toward gay marriage primes (Experiment 5). Across Experiments 1-4, implicit moral evaluations converged with moral personality: Unintentional Judgment about wrong primes, but not negative primes, was negatively associated with psychopathic tendencies and positively associated with moral identity and guilt proneness. Theoretical and practical applications of formal modeling for moral psychology are discussed. Copyright © 2016 Elsevier B.V. All rights reserved.
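A multinomial processing tree of the kind described here can be written down directly as predicted response probabilities. The parameterization below is an illustrative guess at such a tree (Intentional Judgment, then Unintentional Judgment, then Response Bias), not necessarily the authors' exact model:

```python
def p_wrong(intentional, unintentional, bias, target_wrong):
    """Predicted probability of a 'morally wrong' response after a wrong prime.

    Tree: with prob. `intentional` the target itself is judged (correctly);
    failing that, with prob. `unintentional` the prime drives a 'wrong'
    response; failing both, `bias` gives the directional guessing tendency.
    """
    driven_by_target = intentional * (1.0 if target_wrong else 0.0)
    driven_by_prime = (1 - intentional) * unintentional
    guessing = (1 - intentional) * (1 - unintentional) * bias
    return driven_by_target + driven_by_prime + guessing
```

Fitting such a model means choosing the three parameters so that the predicted probabilities match observed response frequencies across prime-target conditions, which is how the task decomposes one behavioral measure into separable component processes.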

  18. Keyring models: An approach to steerability

    Science.gov (United States)

    Miller, Carl A.; Colbeck, Roger; Shi, Yaoyun

    2018-02-01

    If a measurement is made on one half of a bipartite system, then, conditioned on the outcome, the other half has a new reduced state. If these reduced states defy classical explanation—that is, if shared randomness cannot produce these reduced states for all possible measurements—the bipartite state is said to be steerable. Determining which states are steerable is a challenging problem even for low dimensions. In the case of two-qubit systems, a criterion is known for T-states (that is, those with maximally mixed marginals) under projective measurements. In the current work, we introduce the concept of keyring models—a special class of local hidden state models. When the measurements made correspond to real projectors, these allow us to study steerability beyond T-states. Using keyring models, we completely solve the steering problem for real projective measurements when the state arises from mixing a pure two-qubit state with uniform noise. We also give a partial solution in the case when the uniform noise is replaced by independent depolarizing channels.

  19. A Pattern-Oriented Approach to a Methodical Evaluation of Modeling Methods

    Directory of Open Access Journals (Sweden)

    Michael Amberg

    1996-11-01

Full Text Available: The paper describes a pattern-oriented approach to evaluate modeling methods and to compare various methods with each other from a methodical viewpoint. A specific set of principles (the patterns) is defined by investigating the notations and the documentation of comparable modeling methods. Each principle helps to examine some parts of the methods from a specific point of view. All principles together lead to an overall picture of the method under examination. First the core ("method-neutral") meaning of each principle is described. Then the methods are examined regarding the principle. Afterwards the method-specific interpretations are compared with each other and with the core meaning of the principle. By this procedure, the strengths and weaknesses of modeling methods regarding methodical aspects are identified. The principles are described uniformly using a principle description template, following the descriptions of object-oriented design patterns. The approach is demonstrated by evaluating a business process modeling method.

  20. Hepsoft - an approach for up to date multi-platform deployment of HEP specific software

    International Nuclear Information System (INIS)

    Roiser, S

    2011-01-01

LHC experiments depend on a rich palette of software components to build their specific applications. These underlying software components include the ROOT analysis framework, the Geant4 simulation toolkit, Monte Carlo generators, grid middleware, graphics libraries, scripting languages, databases, tools, etc., which are provided centrally in up to date versions on multiple platforms (Linux, Mac, Windows). Until recently this set of packages had been tested and released in a tree-like structure as a consistent set of versions across operating systems, architectures and compilers for LHC experiments only. Because of the tree-like deployment, these releases were only usable in connection with a configuration management tool which provided the proper build and run-time environments, hindering other parties outside the LHC from easily using this palette of packages. In a new approach the releases will be grouped in a 'flat structure' such that interested parties can start using them without configuration management, retaining all the above mentioned advantages. In addition to increased usability, the software shall also be distributed via system-provided package deployment systems (rpm, apt, etc.). The approach follows the idea of providing a wide range of HEP-specific software packages and tools in a coherent, up to date and modular way on multiple platforms. The target audience for such software deployments are individual developers or smaller development groups / experiments who don't have the resources to maintain this kind of infrastructure. This new software deployment strategy has already been successfully implemented for groups at CERN.

  1. Forecasting wind-driven wildfires using an inverse modelling approach

    Directory of Open Access Journals (Sweden)

    O. Rios

    2014-06-01

Full Text Available: A technology able to rapidly forecast wildfire dynamics would lead to a paradigm shift in the response to emergencies, providing the Fire Service with essential information about the ongoing fire. This paper presents and explores a novel methodology to forecast wildfire dynamics in wind-driven conditions, using real-time data assimilation and inverse modelling. The forecasting algorithm combines Rothermel's rate of spread theory with a perimeter expansion model based on Huygens' principle and solves the optimisation problem with a tangent linear approach and forward automatic differentiation. Its potential is investigated using synthetic data and evaluated in different wildfire scenarios. The results show the capacity of the method to quickly predict the location of the fire front with a positive lead time (ahead of the event) of the order of 10 min for a spatial scale of 100 m. The greatest strengths of our method are lightness, speed and flexibility. We specifically tailor the forecast to be efficient and computationally cheap so it can be used in mobile systems for field deployment and operational use. Thus, we put emphasis on producing a positive lead time and the means to maximise it.
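The Huygens-style perimeter expansion can be sketched for the simplest case of circular wavelets and a uniform, windless rate of spread: each vertex of the fire front advances along its local outward normal. This is a toy illustration of the geometric idea, not the paper's forecasting code (which uses elliptical wavelets driven by Rothermel's rate of spread):

```python
import math

def expand_perimeter(points, ros, dt):
    """One Huygens step: advance each vertex of a counter-clockwise fire
    perimeter outward along its local normal by ros * dt."""
    n = len(points)
    out = []
    for i, (px, py) in enumerate(points):
        x0, y0 = points[i - 1]
        x1, y1 = points[(i + 1) % n]
        tx, ty = x1 - x0, y1 - y0            # chord through the two neighbours
        h = math.hypot(tx, ty)
        nx, ny = ty / h, -tx / h             # outward normal (CCW orientation)
        out.append((px + nx * ros * dt, py + ny * ros * dt))
    return out
```

Iterating this step traces the predicted fire front forward in time; the inverse-modelling part of the paper then tunes the spread parameters so the simulated perimeter matches observations.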

  2. PIEteR : a field specific bio-economic production model for decision support in sugar beet growing

    NARCIS (Netherlands)

    Smit, A.B.

    1996-01-01


To support decisions in sugar beet growing, a model, PIEteR, was developed. It simulates growth and production of the crop in a field-specific way, making a tailor-made approach to decision making possible.

    PIEteR is based on causal regression analysis of Dutch data of mostly

  3. Modelling Approach to Assess Future Agricultural Water Demand

    Science.gov (United States)

    Spano, D.; Mancosu, N.; Orang, M.; Sarreshteh, S.; Snyder, R. L.

    2013-12-01

The combination of long-term climate changes (e.g., warmer average temperatures) and extreme events (e.g., droughts) can have decisive impacts on water demand, with further implications for ecosystems. In countries already affected by water scarcity, water management problems are becoming increasingly serious. The sustainable management of available water resources at the global, regional, and site-specific level is necessary. In agriculture, the first step is to compute how much water is needed by crops under given climate conditions. A modelling approach can be a way to compute crop water requirement (CWR). In this study, the improved version of the SIMETAW model was used. The model is a user-friendly soil water balance model, developed by the University of California, Davis, the California Department of Water Resources, and the University of Sassari. The SIMETAW# model assesses CWR and generates hypothetical irrigation scheduling for a wide range of irrigated crops experiencing full, deficit, or no irrigation. The model computes the evapotranspiration of the applied water (ETaw), which is the net amount of irrigation water needed to match losses due to the crop evapotranspiration (ETc). ETaw is determined by first computing reference evapotranspiration (ETo) using the daily standardized Reference Evapotranspiration equation. ETaw is computed as ETaw = CETc - CEr, where CETc and CEr are the cumulative total crop ET and effective rainfall values, respectively. Crop evapotranspiration is estimated as ETc = ETo x Kc, where Kc is the corrected midseason tabular crop coefficient, adjusted for climate conditions. The net irrigation amounts are determined from a daily soil water balance, using an integrated approach that considers soil and crop management information, and the daily ETc estimates.
Using input information on irrigation system distribution uniformity and runoff, when appropriate, the model estimates the applied water to the low quarter of the
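The seasonal bookkeeping in the abstract (daily ETc = ETo x Kc, then ETaw = CETc - CEr) reduces to a few lines. Clamping the result at zero, for seasons where effective rainfall exceeds crop ET, is my assumption rather than something stated in the abstract:

```python
def applied_water_demand(eto, kc, effective_rain):
    """ETaw = cumulative crop ET (daily ETo * Kc) minus cumulative effective
    rainfall, clamped at zero (the clamp is an assumption, not from the text)."""
    cetc = sum(e * k for e, k in zip(eto, kc))   # CETc: sum of daily ETo * Kc
    cer = sum(effective_rain)                    # CEr: cumulative effective rain
    return max(cetc - cer, 0.0)
```

The full SIMETAW# model goes further, distributing this net demand over individual irrigation events via a daily soil water balance and adjusting for distribution uniformity and runoff.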

  4. Development of a Navy Job-Specific Vocational Interest Model

    Science.gov (United States)

    2006-12-01

…the deal with circumplex models? Journal of Vocational Behavior, 48, 77-84. Gottfredson, G. D. (1996). Prestige in vocational interests. Journal of… (Eds.), Circumplex models of personality and emotions (pp. 81-102). Washington, DC: American Psychological Association. Hansen, J. I. C. (1996). What… representations. In R. Plutchik & H.R. Conte (Eds.), Circumplex models of personality and emotions (pp. 103-132). Washington, DC: American Psychological…

  5. Cellular communication and “non-targeted effects”: Modelling approaches

    Science.gov (United States)

    Ballarini, Francesca; Facoetti, Angelica; Mariotti, Luca; Nano, Rosanna; Ottolenghi, Andrea

    2009-10-01

During the last decade, a large number of experimental studies on the so-called "non-targeted effects", in particular bystander effects, outlined that cellular communication plays a significant role in the pathways leading to radiobiological damage. Although it is known that two main types of cellular communication (i.e. via gap junctions and/or molecular messengers diffusing in the extra-cellular environment, such as cytokines, NO etc.) play a major role, it is of utmost importance to better understand the underlying mechanisms, and how such mechanisms can be modulated by ionizing radiation. Though the "final" goal is of course to elucidate the in vivo scenario, in the meantime in vitro studies can also provide useful insights. In the present paper we discuss key issues on the mechanisms underlying non-targeted effects and cell communication, for which theoretical models and simulation codes can be of great help. In this framework, we present in detail three literature models, as well as an approach under development at the University of Pavia. More specifically, we first focus on a version of the "State-Vector Model" including bystander-induced apoptosis of initiated cells, which was successfully fitted to in vitro data on neoplastic transformation, supporting the hypothesis of a protective bystander effect mediated by apoptosis. The second analyzed model, focusing on the kinetics of bystander effects in 3D tissues, was successfully fitted to data on bystander damage in an artificial 3D skin system, indicating a signal range of the order of 0.7-1 mm. A third model of the bystander effect, taking into account spatial location, cell killing and repopulation, showed dose-response curves increasing approximately linearly at low dose rates but quickly flattening out for higher dose rates, also predicting an augmented effect following dose fractionation.
Concerning the Pavia approach, which can model the release, diffusion and depletion/degradation of

  6. Internal Models Support Specific Gaits in Orthotic Devices

    DEFF Research Database (Denmark)

    Matthias Braun, Jan; Wörgötter, Florentin; Manoonpong, Poramate

    2014-01-01

…such limitations is to supply the patient—via the orthosis—with situation-dependent gait models. To achieve this, we present a method for gait recognition using model invalidation. We show that these models are capable of predicting the individual patient's movements and supplying the correct gait. We investigate the system's accuracy and robustness on a Knee-Ankle-Foot Orthosis, introducing behaviour changes depending on the patient's current walking situation. We conclude that the model-based support of different gaits presented here has the power to enhance the patient's mobility.

  7. Functional RG approach to the Potts model

    Science.gov (United States)

    Ben Alì Zinati, Riccardo; Codello, Alessandro

    2018-01-01

The critical behavior of the (n+1)-state Potts model in d dimensions is studied with functional renormalization group techniques. We devise a general method to derive β-functions for continuous values of d and n, and we write the flow equation for the effective potential (LPA') when n is instead fixed. We calculate several critical exponents, which are found to be in good agreement with Monte Carlo simulations and ε-expansion results available in the literature. In particular, we focus on Percolation (n → 0) and Spanning Forest (n → -1), which are the only non-trivial universality classes in d = 4, 5 and where our methods converge faster.

  8. Fusion modeling approach for novel plasma sources

    International Nuclear Information System (INIS)

    Melazzi, D; Manente, M; Pavarin, D; Cardinali, A

    2012-01-01

The physics involved in the coupling, propagation and absorption of RF helicon waves (electronic whistler) in low-temperature Helicon plasma sources is investigated by solving the 3D Maxwell-Vlasov model equations using a WKB asymptotic expansion. The reduced set of equations is formally Hamiltonian and allows for the reconstruction of the wave front of the propagating wave, monitoring along the calculation that the WKB expansion remains satisfied. This method can be fruitfully employed in a new investigation of the power deposition mechanisms involved in common Helicon low-temperature plasma sources when a general confinement magnetic field configuration is allowed, unveiling new physical insight into the wave propagation and absorption phenomena and stimulating further research for the design of innovative and more efficient low-temperature plasma sources. A brief overview of this methodology and its capabilities is presented in this paper.

  9. Carbonate rock depositional models: A microfacies approach

    Energy Technology Data Exchange (ETDEWEB)

    Carozzi, A.V.

    1988-01-01

Carbonate rocks contain more than 50% by weight carbonate minerals such as calcite, dolomite, and siderite. Understanding how these rocks form can lead to more efficient methods of petroleum exploration. Microfacies analysis techniques can be used as a method of predicting models of sedimentation for carbonate rocks. Microfacies in carbonate rocks can be seen clearly only in thin sections under a microscope. Thin-section analysis of carbonate rocks is a tool that can be used to understand depositional environments, diagenetic evolution of carbonate rocks, and the formation of porosity and permeability in carbonate rocks. Microfacies analysis techniques are applied to understanding the origin and formation of carbonate ramps, carbonate platforms, and carbonate slopes and basins. This book will be of interest to students and professionals concerned with the disciplines of sedimentary petrology, sedimentology, petroleum geology, and paleontology.

  10. Wind Turbine Control: Robust Model Based Approach

    DEFF Research Database (Denmark)

    Mirzaei, Mahmood

…This is because, on the one hand, control methods can decrease the cost of energy by keeping the turbine close to its maximum efficiency. On the other hand, they can reduce structural fatigue and therefore increase the lifetime of the wind turbine. The power produced by a wind turbine is proportional to the square of its rotor radius, therefore it seems reasonable to increase the size of the wind turbine in order to capture more power. However, as the size increases, the mass of the blades increases as the cube of the rotor size. This means that in order to keep structural feasibility and the mass of the whole structure reasonable, the ratio of mass to size should be reduced. This trend results in more flexible structures. Control of the flexible structure of a wind turbine in a wind field with stochastic nature is very challenging. In this thesis we examine a number of robust model-based methods for wind turbine…

  11. Risk prediction model: Statistical and artificial neural network approach

    Science.gov (United States)

    Paiman, Nuur Azreen; Hariri, Azian; Masood, Ibrahim

    2017-04-01

Prediction models are increasingly popular and have been used in numerous areas of study to complement and support clinical reasoning and decision making. The adoption of such models assists physicians' decision making and individuals' behavior, and consequently improves individual outcomes and the cost-effectiveness of care. The objective of this paper is to review articles related to risk prediction models in order to understand the suitable approach, development and validation process of a risk prediction model. A qualitative review of the aims, methods and significant main outcomes of nineteen published articles that developed risk prediction models in numerous fields was conducted. This paper also reviews how researchers develop and validate risk prediction models based on statistical and artificial neural network approaches. From the review, some methodological recommendations for developing and validating prediction models are highlighted. According to the reviewed studies, artificial neural network approaches to developing prediction models were more accurate than statistical approaches. However, only limited published literature currently discusses which approach is more accurate for risk prediction model development.

  12. A dual model approach to ground water recovery trench design

    International Nuclear Information System (INIS)

    Clodfelter, C.L.; Crouch, M.S.

    1992-01-01

The design of trenches for contaminated ground water recovery must consider several variables. This paper presents a dual-model approach for effectively recovering contaminated ground water migrating toward a trench by advection. The approach involves an analytical model to determine the vertical influence of the trench and a numerical flow model to determine the capture zone within the trench and the surrounding aquifer. The analytical model is utilized by varying trench dimensions and head values to design a trench which meets the remediation criteria. The numerical flow model is utilized to select the type of backfill and location of sumps within the trench. The dual-model approach can be used to design a recovery trench which effectively captures advective migration of contaminants in the vertical and horizontal planes.

  13. Using SoaML Models and Event-B Specifications for Modeling SOA Design Patterns

    OpenAIRE

    Tounsi , Imen; Zied , Hrichi; Hadj Kacem , Mohamed; Hadj Kacem , Ahmed; Drira , Khalil

    2013-01-01

Although design patterns have become increasingly popular, most of them are presented in an informal way. Patterns, proposed by the SOA design pattern community, are described with a proprietary informal notation, which can raise ambiguity and may lead to their incorrect usage. Modeling SOA design patterns with a standard formal notation avoids misunderstanding by software architects and helps endow design methods. In this paper, we present an approach that aims, first...

  14. Univariate and Multivariate Specification Search Indices in Covariance Structure Modeling.

    Science.gov (United States)

    Hutchinson, Susan R.

    1993-01-01

    Simulated population data were used to compare relative performances of the modification index and C. Chou and P. M. Bentler's Lagrange multiplier test (a multivariate generalization of a modification index) for four levels of model misspecification. Both indices failed to recover the true model except at the lowest level of misspecification. (SLD)

  15. Task-specific visual cues for improving process model understanding

    NARCIS (Netherlands)

    Petrusel, Razvan; Mendling, Jan; Reijers, Hajo A.

    2016-01-01

    Context Business process models support various stakeholders in managing business processes and designing process-aware information systems. In order to make effective use of these models, they have to be readily understandable. Objective Prior research has emphasized the potential of visual cues to

  16. Testing the specifications of parametric models using anchoring vignettes

    NARCIS (Netherlands)

    van Soest, A.H.O.; Vonkova, H.

    Comparing assessments on a subjective scale across countries or socio-economic groups is often hampered by differences in response scales across groups. Anchoring vignettes help to correct for such differences, either in parametric models (the compound hierarchical ordered probit (CHOPIT) model and

  17. Communicative Competence: A Pedagogically Motivated Model with Content Specifications.

    Science.gov (United States)

    Celce-Murcia, Marianne; And Others

    1995-01-01

    Argues the need for an updated and explicit description of language teaching areas generated with reference to a detailed model of communicative competence. Describes two existing models of communicative competence and proposes a pedagogically motivated construct, which includes discourse, linguistic, actional, sociocultural and strategic…

  18. EPS: an empirical Bayes approach to integrating pleiotropy and tissue-specific information for prioritizing risk genes.

    Science.gov (United States)

    Liu, Jin; Wan, Xiang; Ma, Shuangge; Yang, Can

    2016-06-15

    Researchers worldwide have generated a huge volume of genomic data, including thousands of genome-wide association studies (GWAS) and massive amounts of gene expression data from different tissues. How to perform a joint analysis of these data to gain new biological insights has become a critical step in understanding the etiology of complex diseases. Due to the polygenic architecture of complex diseases, the identification of risk genes remains challenging. Motivated by the shared risk genes found in complex diseases and tissue-specific gene expression patterns, we propose an Empirical Bayes approach to integrating Pleiotropy and Tissue-Specific information (EPS) for prioritizing risk genes. As demonstrated by extensive simulation studies, EPS greatly improves the power of identifying disease-risk genes. EPS enables rigorous hypothesis testing of pleiotropy and tissue-specific risk gene expression patterns. All of the model parameters can be adaptively estimated using the derived expectation-maximization (EM) algorithm. We applied EPS to the bipolar disorder and schizophrenia GWAS from the Psychiatric Genomics Consortium, along with the gene expression data for multiple tissues from the Genotype-Tissue Expression project. The results of the real data analysis demonstrate many advantages of EPS. The EPS software is available at https://sites.google.com/site/liujin810822. Contact: eeyang@hkbu.edu.hk. Supplementary data are available at Bioinformatics online.
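    The EM estimation at the heart of such empirical Bayes methods can be illustrated with a toy two-group mixture: each score comes from a "null" or a "risk" component, and the E- and M-steps alternate between computing posterior membership probabilities and updating the parameters. This is a minimal generic sketch, not the EPS model itself; the simulated data, component means, and fixed unit variances are invented for illustration.

    ```python
    import math
    import random

    random.seed(0)
    # Toy data: a "null" component centred at 0 and a smaller "risk" component at 3
    data = [random.gauss(0, 1) for _ in range(200)] + [random.gauss(3, 1) for _ in range(50)]

    def norm_pdf(x, mu):
        # Normal density with unit variance (kept fixed to shorten the sketch)
        return math.exp(-0.5 * (x - mu) ** 2) / math.sqrt(2 * math.pi)

    pi, mu0, mu1 = 0.5, 0.0, 1.0  # initial guesses
    for _ in range(100):
        # E-step: posterior probability that each point belongs to the risk component
        resp = [pi * norm_pdf(x, mu1) / (pi * norm_pdf(x, mu1) + (1 - pi) * norm_pdf(x, mu0))
                for x in data]
        # M-step: update the mixing weight and the two component means
        pi = sum(resp) / len(data)
        mu1 = sum(r * x for r, x in zip(resp, data)) / sum(resp)
        mu0 = sum((1 - r) * x for r, x in zip(resp, data)) / sum(1 - r for r in resp)

    print(round(pi, 2), round(mu0, 2), round(mu1, 2))
    ```

    The estimates converge near the simulated values (mixing weight about 0.2, means near 0 and 3), illustrating how EM adaptively recovers all parameters from the data alone.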

  19. Simple queueing approach to segregation dynamics in Schelling model

    OpenAIRE

    Sobkowicz, Pawel

    2007-01-01

    A simple queueing approach for segregation of agents in modified one dimensional Schelling segregation model is presented. The goal is to arrive at simple formula for the number of unhappy agents remaining after the segregation.
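    The segregation process the paper analyzes can be reproduced in a few lines of simulation. The sketch below is a generic one-dimensional Schelling model on a ring in which pairs of unhappy, unlike agents swap places; it is not the paper's queueing formula, and the population size, happiness rule, and update scheme are assumptions.

    ```python
    import random

    random.seed(1)
    N = 100  # agents of two types arranged on a ring
    agents = [random.choice("AB") for _ in range(N)]

    def unhappy(i):
        # An agent is unhappy when neither of its two ring neighbours shares its type
        return all(agents[(i + d) % N] != agents[i] for d in (-1, 1))

    # Attempt random swaps; a swap happens only between two unhappy, unlike agents,
    # and each successful swap makes both of them (and their neighbours) no worse off.
    for _ in range(20000):
        i, j = random.randrange(N), random.randrange(N)
        if agents[i] != agents[j] and unhappy(i) and unhappy(j):
            agents[i], agents[j] = agents[j], agents[i]

    remaining = sum(unhappy(i) for i in range(N))
    print(remaining, "unhappy agents remain")
    ```

    The simulation terminates with a small residue of unhappy agents of one type only, which is exactly the quantity the queueing formula in the paper aims to predict analytically.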

  20. Survey of Traceability Approaches in Model-Driven Engineering

    NARCIS (Netherlands)

    Galvao, I.; Göknil, Arda

    2007-01-01

    Models have been used in various engineering fields to help managing complexity and represent information in different abstraction levels, according to specific notations and stakeholder's viewpoints. Model-Driven Engineering (MDE) gives the basic principles for the use of models as primary

  1. A systemic approach for modeling soil functions

    Science.gov (United States)

    Vogel, Hans-Jörg; Bartke, Stephan; Daedlow, Katrin; Helming, Katharina; Kögel-Knabner, Ingrid; Lang, Birgit; Rabot, Eva; Russell, David; Stößel, Bastian; Weller, Ulrich; Wiesmeier, Martin; Wollschläger, Ute

    2018-03-01

    The central importance of soil for the functioning of terrestrial systems is increasingly recognized. Critically relevant for water quality, climate control, nutrient cycling and biodiversity, soil provides more functions than just the basis for agricultural production. Nowadays, soil is increasingly under pressure as a limited resource for the production of food, energy and raw materials. This has led to an increasing demand for concepts assessing soil functions so that they can be adequately considered in decision-making aimed at sustainable soil management. The various soil science disciplines have progressively developed highly sophisticated methods to explore the multitude of physical, chemical and biological processes in soil. It is not obvious, however, how the steadily improving insight into soil processes may contribute to the evaluation of soil functions. Here, we present a new systemic modeling framework that allows for a consistent coupling between reductionist yet observable indicators for soil functions and detailed process understanding. It is based on the mechanistic relationships between soil functional attributes, each explained by a network of interacting processes as derived from scientific evidence. The non-linear character of these interactions produces stability and resilience of soil with respect to functional characteristics. We anticipate that this new conceptual framework will integrate the various soil science disciplines and help identify important future research questions at the interface between disciplines. It makes the overwhelming complexity of soil systems tractable and paves the way for steadily improving our capability to assess soil functions based on scientific understanding.

  2. A Constructive Neural-Network Approach to Modeling Psychological Development

    Science.gov (United States)

    Shultz, Thomas R.

    2012-01-01

    This article reviews a particular computational modeling approach to the study of psychological development--that of constructive neural networks. This approach is applied to a variety of developmental domains and issues, including Piagetian tasks, shift learning, language acquisition, number comparison, habituation of visual attention, concept…

  3. Towards Translating Graph Transformation Approaches by Model Transformations

    NARCIS (Netherlands)

    Hermann, F.; Kastenberg, H.; Modica, T.; Karsai, G.; Taentzer, G.

    2006-01-01

    Recently, many researchers are working on semantics preserving model transformation. In the field of graph transformation one can think of translating graph grammars written in one approach to a behaviourally equivalent graph grammar in another approach. In this paper we translate graph grammars

  4. An Almost Integration-free Approach to Ordered Response Models

    NARCIS (Netherlands)

    van Praag, B.M.S.; Ferrer-i-Carbonell, A.

    2006-01-01

    In this paper we propose an alternative approach to the estimation of ordered response models. We show that the Probit-method may be replaced by a simple OLS-approach, called P(robit)OLS, without any loss of efficiency. This method can be generalized to the analysis of panel data. For large-scale
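    The core trick behind such OLS replacements for ordered probit can be sketched as follows: each observed ordered category is replaced by the conditional expectation of a standard normal variable on the corresponding interval, after which plain OLS recovers the latent slope up to scale. The thresholds, sample size, and true coefficient below are invented for illustration; this is a sketch of the general idea, not the authors' exact estimator.

    ```python
    import math
    import random

    def phi(x):   # standard normal density
        return math.exp(-0.5 * x * x) / math.sqrt(2 * math.pi)

    def Phi(x):   # standard normal cdf
        return 0.5 * (1.0 + math.erf(x / math.sqrt(2)))

    def cond_mean(a, b):
        # E[Z | a < Z <= b] for a standard normal Z
        return (phi(a) - phi(b)) / (Phi(b) - Phi(a))

    cuts = [-math.inf, -1.0, 0.0, 1.0, math.inf]   # hypothetical category thresholds
    z_values = [cond_mean(cuts[k], cuts[k + 1]) for k in range(4)]

    # Simulate an ordered response from a latent linear model, then regress the
    # category-wise conditional means on the covariate with ordinary least squares.
    random.seed(2)
    xs = [random.gauss(0, 1) for _ in range(5000)]
    cats = [sum(0.8 * x + random.gauss(0, 1) > c for c in cuts[1:-1]) for x in xs]
    z = [z_values[k] for k in cats]

    mx, mz = sum(xs) / len(xs), sum(z) / len(z)
    slope = sum((a - mx) * (b - mz) for a, b in zip(xs, z)) / sum((a - mx) ** 2 for a in xs)
    print(round(slope, 2))   # positive, proportional to the normalized latent slope
    ```

    With a true latent coefficient of 0.8 and total latent standard deviation of about 1.28, the OLS slope lands near the normalized coefficient 0.8/1.28, illustrating the "no loss of efficiency" claim in spirit.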

  5. Optimizing technology investments: a broad mission model approach

    Science.gov (United States)

    Shishko, R.

    2003-01-01

    A long-standing problem in NASA is how to allocate scarce technology development resources across advanced technologies in order to best support a large set of future potential missions. Within NASA, two orthogonal paradigms have received attention in recent years: the real-options approach and the broad mission model approach. This paper focuses on the latter.

  6. A generalized quarter car modelling approach with frame flexibility ...

    Indian Academy of Sciences (India)

    ... mass distribution and damping. Here we propose a generalized quarter-car modelling approach, incorporating both the frame as well as other-wheel ground contacts. Our approach is linear, uses Laplace transforms, involves vertical motions of key points of interest and has intermediate complexity with improved realism.

  7. Agent-based modeling: a new approach for theory building in social psychology.

    Science.gov (United States)

    Smith, Eliot R; Conrey, Frederica R

    2007-02-01

    Most social and psychological phenomena occur not as the result of isolated decisions by individuals but rather as the result of repeated interactions between multiple individuals over time. Yet the theory-building and modeling techniques most commonly used in social psychology are less than ideal for understanding such dynamic and interactive processes. This article describes an alternative approach to theory building, agent-based modeling (ABM), which involves simulation of large numbers of autonomous agents that interact with each other and with a simulated environment and the observation of emergent patterns from their interactions. The authors believe that the ABM approach is better able than prevailing approaches in the field, variable-based modeling (VBM) techniques such as causal modeling, to capture types of complex, dynamic, interactive processes so important in the social world. The article elaborates several important contrasts between ABM and VBM and offers specific recommendations for learning more and applying the ABM approach.

  8. Time-specific ecological niche modeling predicts spatial dynamics of vector insects and human dengue cases.

    Science.gov (United States)

    Peterson, A Townsend; Martínez-Campos, Carmen; Nakazawa, Yoshinori; Martínez-Meyer, Enrique

    2005-09-01

    Numerous human diseases (malaria, dengue, yellow fever and leishmaniasis, to name a few) are transmitted by insect vectors with brief life cycles and biting activity that varies in both space and time. Although the general geographic distributions of these epidemiologically important species are known, the spatiotemporal variation in their emergence and activity remains poorly understood. We used ecological niche modeling via a genetic algorithm to produce time-specific predictive models of monthly distributions of Aedes aegypti in Mexico in 1995. Significant predictions of monthly mosquito activity and distributions indicate that predicting spatiotemporal dynamics of disease vector species is feasible; significant coincidence with human cases of dengue indicates that these dynamics probably translate directly into transmission of dengue virus to humans. This approach provides new potential for optimizing use of resources for disease prevention and remediation via automated forecasting of disease transmission risk.

  9. Numerical approaches to expansion process modeling

    Directory of Open Access Journals (Sweden)

    G. V. Alekseev

    2017-01-01

    Full Text Available Forage production is currently undergoing a period of intensive renovation and introduction of advanced technologies and equipment. Methods such as barley toasting; grain extrusion; steaming and flattening of grain; fluidized-bed explosion; infrared treatment of cereals and legumes followed by flattening; and one- or two-stage granulation of purified whole grain without humidification in matrix presses with subsequent grinding of the granules are used increasingly often. These methods require special apparatuses, machines and auxiliary equipment, designed on the basis of mathematical models compiled by different methods. In roasting, simulation of the heat fields arising in the working chamber provides conditions under which a portion of the starch decomposes to monosaccharides, which makes the grain sweetish, although protein denaturation somewhat reduces protein digestibility and amino acid availability. Grain is roasted mainly for young animals, in order to teach them to eat feed at an early age, to stimulate the secretory activity of digestion, and to develop the masticatory muscles. In addition, the high temperature is detrimental to bacterial contamination and various types of fungi, which largely avoids possible diseases of the gastrointestinal tract. This method has found wide application directly on farms. Legumes such as peas, soy, lupine and lentils are also used in animal feeding. These feeds are preliminarily ground and then cooked for 1 hour or steamed for 30–40 minutes in the feed mill. Such processing inactivates the anti-nutrients that reduce the effectiveness of the feeds. After processing, legumes are used as protein supplements in an amount of 25–30% of the total nutritional value of the diet. Only grain of good quality should be cooked or steamed; poor-quality grain that has been stored for a long time and damaged by pathogenic micro flora is subject to

  10. Graphical approach to model reduction for nonlinear biochemical networks.

    Science.gov (United States)

    Holland, David O; Krainak, Nicholas C; Saucerman, Jeffrey J

    2011-01-01

    Model reduction is a central challenge to the development and analysis of multiscale physiology models. Advances in model reduction are needed not only for computational feasibility but also for obtaining conceptual insights from complex systems. Here, we introduce an intuitive graphical approach to model reduction based on phase plane analysis. Timescale separation is identified by the degree of hysteresis observed in phase-loops, which guides a "concentration-clamp" procedure for estimating explicit algebraic relationships between species equilibrating on fast timescales. The primary advantages of this approach over Jacobian-based timescale decomposition are that: 1) it incorporates nonlinear system dynamics, and 2) it can be easily visualized, even directly from experimental data. We tested this graphical model reduction approach using a 25-variable model of cardiac β(1)-adrenergic signaling, obtaining 6- and 4-variable reduced models that retain good predictive capabilities even in response to new perturbations. These 6 signaling species appear to be optimal "kinetic biomarkers" of the overall β(1)-adrenergic pathway. The 6-variable reduced model is well suited for integration into multiscale models of heart function, and more generally, this graphical model reduction approach is readily applicable to a variety of other complex biological systems.
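    The timescale separation this approach exploits can be seen in a minimal fast-slow system: one variable equilibrates almost instantly and can be replaced by an algebraic relation, mirroring the "concentration-clamp" substitution. The two-variable model, rate constants, and Euler integration below are invented for illustration and are not the cardiac signaling model from the paper.

    ```python
    # Fast-slow toy system: y relaxes quickly (rate 1/eps) toward x/(1+x), so on the
    # slow timescale y can be replaced by that algebraic relationship.
    eps, dt, T = 0.01, 0.001, 5.0

    def simulate_full():
        x, y = 0.0, 0.0
        for _ in range(int(T / dt)):
            dx = 1.0 - 0.5 * x * y           # slow dynamics
            dy = (x / (1.0 + x) - y) / eps   # fast equilibration
            x, y = x + dt * dx, y + dt * dy
        return x, y

    def simulate_reduced():
        x = 0.0
        for _ in range(int(T / dt)):
            y = x / (1.0 + x)                # algebraic substitution for the fast species
            x = x + dt * (1.0 - 0.5 * x * y)
        return x

    xf, yf = simulate_full()
    xr = simulate_reduced()
    print(xf, xr)  # the one-variable reduced model tracks the full model closely
    ```

    The discrepancy between the two trajectories is of order eps, which is why eliminating fast-equilibrating species retains predictive capability, as reported for the 6- and 4-variable reduced signaling models.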

  11. Graphical approach to model reduction for nonlinear biochemical networks.

    Directory of Open Access Journals (Sweden)

    David O Holland

    Full Text Available Model reduction is a central challenge to the development and analysis of multiscale physiology models. Advances in model reduction are needed not only for computational feasibility but also for obtaining conceptual insights from complex systems. Here, we introduce an intuitive graphical approach to model reduction based on phase plane analysis. Timescale separation is identified by the degree of hysteresis observed in phase-loops, which guides a "concentration-clamp" procedure for estimating explicit algebraic relationships between species equilibrating on fast timescales. The primary advantages of this approach over Jacobian-based timescale decomposition are that: 1) it incorporates nonlinear system dynamics, and 2) it can be easily visualized, even directly from experimental data. We tested this graphical model reduction approach using a 25-variable model of cardiac β(1)-adrenergic signaling, obtaining 6- and 4-variable reduced models that retain good predictive capabilities even in response to new perturbations. These 6 signaling species appear to be optimal "kinetic biomarkers" of the overall β(1)-adrenergic pathway. The 6-variable reduced model is well suited for integration into multiscale models of heart function, and more generally, this graphical model reduction approach is readily applicable to a variety of other complex biological systems.

  12. Subject-specific computational modeling of DBS in the PPTg area

    Directory of Open Access Journals (Sweden)

    Laura M. Zitella

    2015-07-01

    Full Text Available Deep brain stimulation (DBS) in the pedunculopontine tegmental nucleus (PPTg) has been proposed to alleviate medically intractable gait difficulties associated with Parkinson's disease. Clinical trials have shown somewhat variable outcomes, stemming in part from surgical targeting variability, modulation of fiber pathways implicated in side effects, and a general lack of mechanistic understanding of DBS in this brain region. Subject-specific computational models of DBS are a promising tool to investigate the underlying therapy and side effects. In this study, a parkinsonian rhesus macaque was implanted unilaterally with an 8-contact DBS lead in the PPTg region. Fiber tracts adjacent to PPTg, including the oculomotor nerve, central tegmental tract, and superior cerebellar peduncle, were reconstructed from a combination of pre-implant 7T MRI, post-implant CT, and post-mortem histology. These structures were populated with axon models and coupled with a finite element model simulating the voltage distribution in the surrounding neural tissue during stimulation. This study introduces two empirical approaches to evaluate model parameters. First, incremental monopolar cathodic stimulation (20 Hz, 90 µs pulse width) was evaluated for each electrode, during which a right eyelid flutter was observed at the proximal four contacts (-1.0 to -1.4 mA). These current amplitudes followed closely with model-predicted activation of the oculomotor nerve when assuming an anisotropic conduction medium. Second, PET imaging was collected OFF-DBS and twice during DBS (two different contacts), which supported the model-predicted activation of the central tegmental tract and superior cerebellar peduncle. Together, subject-specific models provide a framework to more precisely predict pathways modulated by DBS.

  13. Specification of Change Mechanisms in Pregnant Smokers for Malleable Target Identification: A Novel Approach to a Tenacious Public Health Problem.

    Science.gov (United States)

    Massey, Suena H; Decety, Jean; Wisner, Katherine L; Wakschlag, Lauren S

    2017-01-01

    Maternal smoking during pregnancy (MSDP) continues to be a leading modifiable risk factor for perinatal complications and a range of neurodevelopmental and cardio-metabolic outcomes across the lifespan. Despite 40 years of intervention research, less than one in five pregnant smokers who receive an intervention quit by delivery. Within this context, recognition of pregnancy is commonly associated with abrupt suspension or reduction of smoking in the absence of intervention, yet has not been investigated as a volitional target. The goal of this article is to provide the empirical foundation for a novel direction of research aimed at identifying malleable targets for intervention through the specification of behavior change mechanisms specific to pregnant women. To do so, we: (1) summarize progress on MSDP in the United States generated from conventional empirical approaches to health behavior change; (2) discuss the phenomenon of spontaneous change in the absence of intervention among pregnant smokers to illustrate the need for mechanistic specification of behavior change motivated by concern for fetal well-being; (3) summarize component processes in neurobiological models of parental and non-parental social behaviors as a conceptual framework for understanding change mechanisms during pregnancy; (4) discuss the evidence for the malleability of these processes to support their translational relevance for preventive interventions; and (5) propose a roadmap for validating the proposed change mechanism using an experimental medicine approach. A greater understanding of social and interpersonal processes that facilitate health behavior change among expectant mothers and how these processes differ interindividually could yield novel volitional targets for prenatal interventions. More broadly, explicating other-oriented mechanisms of behavior change during pregnancy could serve as a paradigm for understanding how social and interpersonal processes positively influence

  14. Specification of Change Mechanisms in Pregnant Smokers for Malleable Target Identification: A Novel Approach to a Tenacious Public Health Problem

    Directory of Open Access Journals (Sweden)

    Suena H. Massey

    2017-09-01

    Full Text Available Maternal smoking during pregnancy (MSDP) continues to be a leading modifiable risk factor for perinatal complications and a range of neurodevelopmental and cardio-metabolic outcomes across the lifespan. Despite 40 years of intervention research, less than one in five pregnant smokers who receive an intervention quit by delivery. Within this context, recognition of pregnancy is commonly associated with abrupt suspension or reduction of smoking in the absence of intervention, yet has not been investigated as a volitional target. The goal of this article is to provide the empirical foundation for a novel direction of research aimed at identifying malleable targets for intervention through the specification of behavior change mechanisms specific to pregnant women. To do so, we: (1) summarize progress on MSDP in the United States generated from conventional empirical approaches to health behavior change; (2) discuss the phenomenon of spontaneous change in the absence of intervention among pregnant smokers to illustrate the need for mechanistic specification of behavior change motivated by concern for fetal well-being; (3) summarize component processes in neurobiological models of parental and non-parental social behaviors as a conceptual framework for understanding change mechanisms during pregnancy; (4) discuss the evidence for the malleability of these processes to support their translational relevance for preventive interventions; and (5) propose a roadmap for validating the proposed change mechanism using an experimental medicine approach. A greater understanding of social and interpersonal processes that facilitate health behavior change among expectant mothers and how these processes differ interindividually could yield novel volitional targets for prenatal interventions. More broadly, explicating other-oriented mechanisms of behavior change during pregnancy could serve as a paradigm for understanding how social and interpersonal processes

  15. Data Analysis A Model Comparison Approach, Second Edition

    CERN Document Server

    Judd, Charles M; Ryan, Carey S

    2008-01-01

    This completely rewritten classic text features many new examples, insights and topics including mediational, categorical, and multilevel models. Substantially reorganized, this edition provides a briefer, more streamlined examination of data analysis. Noted for its model-comparison approach and unified framework based on the general linear model, the book provides readers with a greater understanding of a variety of statistical procedures. This consistent framework, including consistent vocabulary and notation, is used throughout to develop fewer but more powerful model building techniques. T

  16. An Open Modelling Approach for Availability and Reliability of Systems - OpenMARS

    CERN Document Server

    Penttinen, Jussi-Pekka; Gutleber, Johannes

    2018-01-01

    This document introduces and gives the specification for OpenMARS, an open modelling approach for the availability and reliability of systems. It supports the most common risk assessment and operation modelling techniques. Uniquely, OpenMARS allows combining and connecting models defined with different techniques. This ensures that a modeller has a high degree of freedom to accurately describe the modelled system without limitations imposed by an individual technique. Here the OpenMARS model definition is specified with a tool-independent tabular format, which supports managing models developed in a collaborative fashion. The origin of our research is in the Future Circular Collider (FCC) study, where we developed the unique features of our concept to model the availability and luminosity production of particle colliders. We were motivated to describe our approach in detail as we see potential further applications in performance and energy efficiency analyses of large scientific infrastructures or industrial processe...

  17. Mammary-Specific Gene Transfer for Modeling Breast Cancer

    National Research Council Canada - National Science Library

    Li, Yi

    2001-01-01

    In order to develop a mouse model system that allows for rapid assessment of genetic lesions involved in breast tumor development, we are adapting a somatic gene transfer system based on avian leukosis virus A (ALV...

  18. Pancreas specific expression of oncogenes in a porcine model

    DEFF Research Database (Denmark)

    Berthelsen, Martin Fogtmann; Callesen, Morten Møbjerg; Østergaard, Tanja Stenshøj

    2017-01-01

    crucial for successful treatment. However, pancreatic cancer is difficult to detect in its earliest stages and once symptoms appear, the cancer has often progressed beyond possibility for curing. Research into the disease has been hampered by the lack of good models. We have generated a porcine...... model of pancreatic cancer with use of transgenic overexpression of an oncogene cassette containing MYC, KRASG12D and SV40 LT. The expression was initiated from a modified Pdx-1 promoter during embryogenesis in a subset of pancreatic epithelial cells. Furthermore, cells expressing the oncogenes also...... foci, with beginning abnormality at day 45. Cells in the foci expressed the oncogenic proteins and the majority of the cells were positive for the proliferation marker, Ki67. We predict that this model could be used for advanced studies in pancreatic cancer in a large animal model with focus on early...

  19. Development of a Navy Job-Specific Vocational Interest Model

    National Research Council Canada - National Science Library

    Farmer, William L; Bearden, Ronald M; Fedak, Geoffrey E; Watson, Stephen E; Lightfoot, Mary A; Alley, William E; Schultz, Sheila R; Heggested, Eric D

    2006-01-01

    .... We combined a review of the theoretical literature with a qualitative analysis of entry-level enlisted ratings to create a model that is grounded in current research and reflects the critical work...

  20. Specification of a STEP Based Reference Model for Exchange of Robotics Models

    DEFF Research Database (Denmark)

    Haenisch, Jochen; Kroszynski, Uri; Ludwig, Arnold

    combining geometric, dynamic, process and robot specific data.The growing need for accurate information about manufacturing data (models of robots and other mechanisms) in diverse industrial applications has initiated ESPRIT Project 6457: InterRob. Besides the topics associated with standards for industrial...... of pilot processor programs are based. The processors allow for the exchange of product data models between Analysis systems (e.g. ADAMS), CAD systems (e.g. CATIA, BRAVO), Simulation and off-line programming systems (e.g. GRASP, KISMET, ROPSIM)....

  1. New Keynesian DSGE models: theory, empirical implementation, and specification

    OpenAIRE

    Röhe, Oke

    2012-01-01

    The core of the dissertation consists of three chapters. Chapter 2 provides a graphical and formal representation of a basic dynamic stochastic general equilibrium (DSGE) economy and discusses the prerequisites needed for an empirical implementation. The aim of this chapter is to present the core features of the models used in chapter 3 and 4 of this work and to introduce the estimation techniques employed in the remainder of the thesis. In chapter 3 we estimate a New Keynesian DSGE model...

  2. Communicative Competence: A Pedagogically Motivated Model with Content Specifications

    OpenAIRE

    Celce-Murcia, Marianne; Dornyei, Zoltan; Thurrell, Sarah

    1995-01-01

    This paper argues the need for an updated and explicit description of language teaching areas generated with reference to a detailed model of communicative competence. We describe two existing models of communicative competence and then propose our own pedagogically motivated construct, which includes five components: (1) discourse competence, (2) linguistic competence, (3) actional competence, (4) sociocultural competence, and (5) strategic competence. We discuss these competencies in as muc...

  3. Fractional Gaussian noise: Prior specification and model comparison

    KAUST Repository

    Sørbye, Sigrunn Holbek

    2017-07-07

    Fractional Gaussian noise (fGn) is a stationary stochastic process used to model antipersistent or persistent dependency structures in observed time series. Properties of the autocovariance function of fGn are characterised by the Hurst exponent (H), which, in Bayesian contexts, typically has been assigned a uniform prior on the unit interval. This paper argues why a uniform prior is unreasonable and introduces the use of a penalised complexity (PC) prior for H. The PC prior is computed to penalise divergence from the special case of white noise and is invariant to reparameterisations. An immediate advantage is that the exact same prior can be used for the autocorrelation coefficient ϕ of a first-order autoregressive process AR(1), as this model also reflects a flexible version of white noise. Within the general setting of latent Gaussian models, this allows us to compare an fGn model component with AR(1) using Bayes factors, avoiding the confounding effects of prior choices for the two hyperparameters H and ϕ. Among others, this is useful in climate regression models where inference for underlying linear or smooth trends depends heavily on the assumed noise model.
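    The white-noise special case that the PC prior penalises divergence from is easy to verify numerically from the standard autocovariance function of unit-variance fGn, γ(k) = ½(|k+1|^{2H} − 2|k|^{2H} + |k−1|^{2H}). The sketch below implements this textbook formula; the specific H values are chosen for illustration only.

    ```python
    def fgn_autocov(k, H):
        # Autocovariance of unit-variance fractional Gaussian noise at integer lag k,
        # parameterised by the Hurst exponent H in (0, 1)
        return 0.5 * (abs(k + 1) ** (2 * H)
                      - 2 * abs(k) ** (2 * H)
                      + abs(k - 1) ** (2 * H))

    # H = 0.5 reproduces white noise (zero correlation at all nonzero lags);
    # H > 0.5 gives the persistent, slowly decaying correlations typical of fGn.
    for H in (0.5, 0.7, 0.9):
        print(H, [round(fgn_autocov(k, H), 4) for k in range(5)])
    ```

    The base model for the PC prior is exactly the H = 0.5 row, where all lagged covariances vanish, which is the sense in which both fGn and AR(1) can be viewed as flexible extensions of white noise.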

  4. Current status of top-specific variant axion model

    Science.gov (United States)

    Chiang, Cheng-Wei; Fukuda, Hajime; Takeuchi, Michihisa; Yanagida, Tsutomu T.

    2018-02-01

    The invisible variant axion model is one of the very attractive models which solves the strong CP problem but does not provoke the domain wall problem. At the electroweak scale, this model requires at least two Higgs doublets, one of which carries a nonzero Peccei-Quinn (PQ) charge and the other is neutral. We consider a scenario where only the right-handed top quark is charged under the PQ symmetry and couples with the PQ-charged Higgs doublet. As a general prediction of this model, the top quark can decay to the observed standard model-like Higgs boson h and the charm or up quark, t → hc/u, which recently exhibited slight excesses at LHC run-I and run-II and will soon be testable at the LHC run-II. If the rare top decay excess stays at the observed central value, we show that tan β ∼ 1 or smaller is preferred by the Higgs data. The chiral nature of the Higgs flavor-changing interaction is a distinctive feature of this model and testable using the angular distribution of the t → ch decays at the LHC.

  5. Model for the perception of the specific e-leadership skills and features in learning management systems environments

    OpenAIRE

    Samartinho, João; Faria, Jorge; Silva, Paulo

    2015-01-01

    This article proposes a model for understanding the specific skills and characteristics of e-leadership in Learning Management Systems (LMS) used for the implementation of virtual teams. A framework was developed to identify the skills and characteristics of e-leaders, based on a review of the literature on the e-leadership paradigm and its relationship with virtual teams. It also presents the empirical model and an explanatory table with the research methodology and the main approaches in t...

  6. A novel approach to modeling and diagnosing the cardiovascular system

    Energy Technology Data Exchange (ETDEWEB)

    Keller, P.E.; Kangas, L.J.; Hashem, S.; Kouzes, R.T. [Pacific Northwest Lab., Richland, WA (United States); Allen, P.A. [Life Link, Richland, WA (United States)

    1995-07-01

    A novel approach to modeling and diagnosing the cardiovascular system is introduced. A model exhibits a subset of the dynamics of the cardiovascular behavior of an individual by using a recurrent artificial neural network. Potentially, a model will be incorporated into a cardiovascular diagnostic system. This approach is unique in that each cardiovascular model is developed from physiological measurements of an individual. Any differences between the modeled variables and the variables of an individual at a given time are used for diagnosis. This approach also exploits sensor fusion to optimize the utilization of biomedical sensors. The advantage of sensor fusion has been demonstrated in applications including control and diagnostics of mechanical and chemical processes.

  7. Reusable Component Model Development Approach for Parallel and Distributed Simulation

    Science.gov (United States)

    Zhu, Feng; Yao, Yiping; Chen, Huilong; Yao, Feng

    2014-01-01

    Model reuse is a key issue to be resolved in parallel and distributed simulation at present. However, component models built by different domain experts usually have diversiform interfaces, couple tightly, and bind closely with simulation platforms. As a result, they are difficult to reuse across different simulation platforms and applications. To address the problem, this paper first proposes a reusable component model framework. Based on this framework, our reusable model development approach is then elaborated, which contains two phases: (1) domain experts create simulation computational modules observing three principles to achieve their independence; (2) a model developer encapsulates these simulation computational modules with six standard service interfaces to improve their reusability. The case study of a radar model indicates that a model developed using our approach has good reusability and is easy to use in different simulation platforms and applications. PMID:24729751
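The encapsulation idea in this record can be illustrated with a hedged sketch: a fixed set of service interfaces decouples a computational module from any particular simulation platform. The six interface names and the trivial radar module below are assumptions chosen for illustration; the paper's actual interface definitions are not reproduced here.

```python
from abc import ABC, abstractmethod

class SimulationComponent(ABC):
    """Standard service interfaces a platform calls, decoupling the
    computational module from any specific simulation engine."""

    @abstractmethod
    def initialize(self, config: dict) -> None: ...
    @abstractmethod
    def advance(self, dt: float) -> None: ...   # one simulation step
    @abstractmethod
    def get_output(self, name: str): ...
    @abstractmethod
    def set_input(self, name: str, value) -> None: ...
    @abstractmethod
    def save_state(self) -> dict: ...
    @abstractmethod
    def finalize(self) -> None: ...

class RadarComponent(SimulationComponent):
    """Wraps an independent computational module (here: a trivial range update)."""
    def initialize(self, config):
        self.range_m = config.get("initial_range_m", 1000.0)
        self.closing_speed = config.get("closing_speed_m_s", -50.0)
    def advance(self, dt):
        self.range_m += self.closing_speed * dt
    def get_output(self, name):
        return {"range_m": self.range_m}[name]
    def set_input(self, name, value):
        if name == "closing_speed_m_s":
            self.closing_speed = value
    def save_state(self):
        return {"range_m": self.range_m, "closing_speed": self.closing_speed}
    def finalize(self):
        pass

radar = RadarComponent()
radar.initialize({"initial_range_m": 1000.0})
radar.advance(2.0)
print(radar.get_output("range_m"))  # 900.0
```

Any platform that knows only the `SimulationComponent` interface can drive the radar module, which is the reusability property the record describes.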

  8. Mathematical models for therapeutic approaches to control HIV disease transmission

    CERN Document Server

    Roy, Priti Kumar

    2015-01-01

    The book discusses different therapeutic approaches, based on different mathematical models, to control HIV/AIDS disease transmission. It uses clinical data, collected from different cited sources, to formulate deterministic as well as stochastic mathematical models of HIV/AIDS. It provides complementary approaches, from deterministic and stochastic points of view, to optimal control strategy with perfect drug adherence, and also examines the same issue from different angles, ranging from various mathematical models to computer simulations. The book presents essential methods and techniques for students who are interested in designing epidemiological models of HIV/AIDS. It also guides research scientists working in the periphery of mathematical modeling, helping them to explore a hypothetical method by examining its consequences in the form of mathematical modelling and making scientific predictions. The model equations, mathematical analysis and several numerical simulations that are...

  9. New prediction model for probe specificity in an allele-specific extension reaction for haplotype-specific extraction (HSE) of Y chromosome mixtures.

    Directory of Open Access Journals (Sweden)

    Jessica Rothe

    Full Text Available Allele-specific extension reactions (ASERs) use 3' terminus-specific primers for the selective extension of completely annealed matches by polymerase. The ability of the polymerase to extend non-specific 3' terminal mismatches leads to a failure of the reaction, a process that is only partly understood and predictable, and often requires time-consuming assay design. In our studies we investigated haplotype-specific extraction (HSE) for the separation of male DNA mixtures. HSE is an ASER and provides the ability to distinguish between diploid chromosomes from one or more individuals. Here, we show that the success of HSE and allele-specific extension depend strongly on the concentration difference between complete match and 3' terminal mismatch. Using the oligonucleotide-modeling platform Visual Omp, we demonstrated the dependency of the discrimination power of the polymerase on match- and mismatch-target hybridization between different probe lengths. Therefore, the probe specificity in HSE could be predicted by performing a relative comparison of different probe designs with their simulated differences between the duplex concentration of target-probe match and mismatches. We tested this new model for probe design in more than 300 HSE reactions with 137 different probes and obtained an accordance of 88%.

  10. Initial assessment of a multi-model approach to spring flood forecasting in Sweden

    Science.gov (United States)

    Olsson, J.; Uvo, C. B.; Foster, K.; Yang, W.

    2015-06-01

    Hydropower is a major energy source in Sweden and proper reservoir management prior to the spring flood onset is crucial for optimal production. This requires useful forecasts of the accumulated discharge in the spring flood period (i.e. the spring-flood volume, SFV). Today's SFV forecasts are generated using a model-based climatological ensemble approach, where time series of precipitation and temperature from historical years are used to force a calibrated and initialised set-up of the HBV model. In this study, a number of new approaches to spring flood forecasting, reflecting the latest developments in analysis and modelling on seasonal time scales, are presented and evaluated. Three main approaches, represented by specific methods, are evaluated in SFV hindcasts for three main Swedish rivers over a 10-year period with lead times between 0 and 4 months. In the first approach, years historically analogous with respect to the climate in the period preceding the spring flood are identified and used to compose a reduced ensemble. In the second, seasonal meteorological ensemble forecasts are used to drive the HBV model over the spring flood period. In the third approach, statistical relationships between the SFV and the large-scale atmospheric circulation are used to build forecast models. None of the new approaches consistently outperforms the climatological ensemble approach, but for specific locations and lead times improvements of 20-30 % are found. When all forecasts are combined in a weighted multi-model approach, a mean improvement over all locations and lead times of nearly 10 % is indicated. This demonstrates the potential of the approach, and further development and optimisation into an operational system are ongoing.
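The weighted multi-model combination reported in this record can be sketched generically. Inverse-error weighting is an assumption made here for illustration, since the abstract does not specify the weighting scheme; the forecast and error values are synthetic.

```python
def combine_forecasts(forecasts, past_errors):
    """Weight each model's SFV forecast by the inverse of its mean
    absolute hindcast error, so historically better models count more."""
    weights = [1.0 / e for e in past_errors]
    total = sum(weights)
    weights = [w / total for w in weights]
    return sum(w * f for w, f in zip(weights, forecasts))

# Three hypothetical SFV forecasts (mm) and the models' past hindcast errors (mm)
sfv = combine_forecasts([120.0, 150.0, 90.0], [10.0, 20.0, 40.0])
print(round(sfv, 1))  # 124.3
```

The model with the smallest past error (10 mm) dominates the combination, pulling the result toward its 120 mm forecast.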

  11. A Model-Based Approach to Support Validation of Medical Cyber-Physical Systems

    Directory of Open Access Journals (Sweden)

    Lenardo C. Silva

    2015-10-01

    Full Text Available Medical Cyber-Physical Systems (MCPS) are context-aware, life-critical systems with patient safety as the main concern, demanding rigorous processes for validation to guarantee user requirement compliance and specification-oriented correctness. In this article, we propose a model-based approach for early validation of MCPS, focusing on promoting reusability and productivity. It enables system developers to build MCPS formal models based on a library of patient and medical device models, and simulate the MCPS to identify undesirable behaviors at design time. Our approach has been applied to three different clinical scenarios to evaluate its reusability potential for different contexts. We have also validated our approach through an empirical evaluation with developers to assess productivity and reusability. Finally, our models have been formally verified considering functional and safety requirements and model coverage.

  12. A Model-Based Approach to Support Validation of Medical Cyber-Physical Systems.

    Science.gov (United States)

    Silva, Lenardo C; Almeida, Hyggo O; Perkusich, Angelo; Perkusich, Mirko

    2015-10-30

    Medical Cyber-Physical Systems (MCPS) are context-aware, life-critical systems with patient safety as the main concern, demanding rigorous processes for validation to guarantee user requirement compliance and specification-oriented correctness. In this article, we propose a model-based approach for early validation of MCPS, focusing on promoting reusability and productivity. It enables system developers to build MCPS formal models based on a library of patient and medical device models, and simulate the MCPS to identify undesirable behaviors at design time. Our approach has been applied to three different clinical scenarios to evaluate its reusability potential for different contexts. We have also validated our approach through an empirical evaluation with developers to assess productivity and reusability. Finally, our models have been formally verified considering functional and safety requirements and model coverage.

  13. A systems approach to predict oncometabolites via context-specific genome-scale metabolic networks.

    Directory of Open Access Journals (Sweden)

    Hojung Nam

    2014-09-01

    Full Text Available Altered metabolism in cancer cells has been viewed as a passive response required for a malignant transformation. However, this view has changed through the recently described metabolic oncogenic factors: mutated isocitrate dehydrogenases (IDH), succinate dehydrogenase (SDH), and fumarate hydratase (FH) that produce oncometabolites that competitively inhibit epigenetic regulation. In this study, we demonstrate in silico predictions of oncometabolites that have the potential to dysregulate epigenetic controls in nine types of cancer by incorporating massive-scale genetic mutation information (collected from more than 1,700 cancer genomes), expression profiling data, and deploying Recon 2 to reconstruct context-specific genome-scale metabolic models. Our analysis predicted 15 compounds and 24 substructures of potential oncometabolites that could result from the loss-of-function and gain-of-function mutations of metabolic enzymes, respectively. These results suggest a substantial potential for discovering unidentified oncometabolites in various forms of cancer.

  14. Irreversibility of T-Cell Specification: Insights from Computational Modelling of a Minimal Network Architecture.

    Directory of Open Access Journals (Sweden)

    Erica Manesso

    Full Text Available A cascade of gene activations under the control of Notch signalling is required during T-cell specification, when T-cell precursors gradually lose the potential to undertake other fates and become fully committed to the T-cell lineage. We elucidate how the gene/protein dynamics for a core transcriptional module governs this important process by computational means. We first assembled existing knowledge about transcription factors known to be important for T-cell specification to form a minimal core module consisting of TCF-1, GATA-3, BCL11B, and PU.1, aiming at dynamical modeling. Model architecture was based on published experimental measurements of the effects on each factor when each of the others is perturbed. While several studies provided gene expression measurements at different stages of T-cell development, pure time series are not available, thus precluding a straightforward study of the dynamical interactions among these genes. We therefore translate stage-dependent data into time series. A feed-forward motif with multiple positive feedbacks can account for the observed delay between BCL11B versus TCF-1 and GATA-3 activation by Notch signalling. With a novel computational approach, all 32 possible interactions among Notch signalling, TCF-1, and GATA-3 are explored by translating combinatorial logic expressions into differential equations for the BCL11B production rate. Our analysis reveals that only 3 of 32 possible configurations, where GATA-3 works as a dimer, are able to explain not only the time delay, but very importantly, also give rise to irreversibility. The winning models explain the data within the 95% confidence region and are consistent with regard to decay rates. This first generation model for early T-cell specification has relatively few players. Yet it explains the gradual transition into a committed state with no return. Encoding logics in a rate equation setting allows determination of binding properties beyond what is
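The translation of a combinatorial logic expression into a production-rate differential equation can be sketched as follows. The Hill-function encoding, all parameter values, and the specific AND configuration below are illustrative assumptions, not the paper's fitted model; the only ingredient taken from the record is the idea that GATA-3 acts as a dimer (cooperativity 2) in the winning configurations.

```python
def hill(x, k, n=1):
    """Saturating activation term in [0, 1)."""
    return x**n / (k**n + x**n)

def dBCL11B_dt(bcl11b, notch, tcf1, gata3, k_prod=1.0, k_deg=0.1):
    # AND logic -> product of activation terms; GATA-3 enters as a dimer (n=2)
    production = k_prod * hill(notch, 0.5) * hill(tcf1, 0.5) * hill(gata3, 0.5, n=2)
    return production - k_deg * bcl11b

# Forward-Euler simulation: BCL11B rises only when all three inputs are on
b, dt = 0.0, 0.01
for step in range(10000):
    b += dt * dBCL11B_dt(b, notch=1.0, tcf1=1.0, gata3=1.0)
print(round(b, 2))  # near the steady state production/k_deg
```

With any one input set to zero the production term vanishes, so BCL11B stays off, which is exactly the AND semantics the rate equation encodes.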

  15. School Processes Mediate School Compositional Effects: Model Specification and Estimation

    Science.gov (United States)

    Liu, Hongqiang; Van Damme, Jan; Gielen, Sarah; Van Den Noortgate, Wim

    2015-01-01

    School composition effects have been consistently verified, but few studies ever attempted to study how school composition affects school achievement. Based on prior research findings, we employed multilevel mediation modeling to examine whether school processes mediate the effect of school composition upon school outcomes based on the data of 28…

  16. On the logical specification of probabilistic transition models

    CSIR Research Space (South Africa)

    Rens, G

    2013-05-01

    Full Text Available We investigate the requirements for specifying the behaviors of actions in a stochastic domain. That is, we propose how to write sentences in a logical language to capture a model of probabilistic transitions due to the execution of actions of some...

  17. Code Shift: Grid Specifications and Dynamic Wind Turbine Models

    DEFF Research Database (Denmark)

    Ackermann, Thomas; Ellis, Abraham; Fortmann, Jens

    2013-01-01

    Grid codes (GCs) and dynamic wind turbine (WT) models are key tools to allow increasing renewable energy penetration without challenging security of supply. In this article, the state of the art and the further development of both tools are discussed, focusing on the European and North American e...

  18. Teaching Higher Order Thinking in the Introductory MIS Course: A Model-Directed Approach

    Science.gov (United States)

    Wang, Shouhong; Wang, Hai

    2011-01-01

    One vision of education evolution is to change the modes of thinking of students. Critical thinking, design thinking, and system thinking are higher order thinking paradigms that are specifically pertinent to business education. A model-directed approach to teaching and learning higher order thinking is proposed. An example of application of the…

  19. A Review of Nutrition-Specific and Nutrition-Sensitive Approaches to Preventing Moderate Acute Malnutrition

    International Nuclear Information System (INIS)

    Mucha, Noreen; Jimenez, Michelle; Stone-Jimenez, Maryanne; Brown, Rebecca

    2014-01-01

    Full text: Recent literature reviews have demonstrated the limited efficacy of targeted supplementary feeding programmes aimed at both treating and preventing moderate acute malnutrition (MAM), with high rates of defaulting, low coverage and high associated costs. There is a growing interest in a) reviewing and improving protocols / tools for the management of acute malnutrition and b) increasing the quality and variety of products available for the treatment / prevention of moderate acute malnutrition. There is, however, varying evidence on the impact of nutritional products aimed at preventing or treating acute malnutrition, or on the comparative efficacy of different products. Following several literature reviews and operational research with varying results, there is increasing consensus that MAM should be tackled not only through products, and that clearer guidance should be provided on broader preventive strategies, such as optimal infant and young child feeding (IYCF) and caregiving practices, optimal maternal nutrition, counselling, social protection, food security and livelihoods, and water, sanitation and hygiene (WASH). The CMAM Forum has commissioned Technical Briefs which aim to summarise current thinking and practice relating to preventive approaches to MAM, looking at the role of both nutrition-specific and nutrition-sensitive interventions. The work is being launched in January 2014 and results will be available for presentation at the IAEA MAM Symposium in May 2014. The briefs aim to provide: • An overview of approaches to preventing MAM across different sectors (e.g. agriculture, health, IYCF, social protection, water and sanitation) and in different contexts. • A review of current knowledge including: – Evidence from systematic and literature reviews. – Existing approaches and practice for prevention of MAM. – Current guidance on making programmatic choices relating to MAM prevention interventions and decision-making frameworks.

  20. A Model-Driven Approach to e-Course Management

    Science.gov (United States)

    Savic, Goran; Segedinac, Milan; Milenkovic, Dušica; Hrin, Tamara; Segedinac, Mirjana

    2018-01-01

    This paper presents research on using a model-driven approach to the development and management of electronic courses. We propose a course management system which stores a course model represented as distinct machine-readable components containing domain knowledge of different course aspects. Based on this formally defined platform-independent…

  1. Modeling Alaska boreal forests with a controlled trend surface approach

    Science.gov (United States)

    Mo Zhou; Jingjing Liang

    2012-01-01

    A Controlled Trend Surface approach was proposed to take large-scale spatial trends and nonspatial effects into consideration simultaneously. A geospatial model of the Alaska boreal forest was developed from 446 permanent sample plots, which addressed large-scale spatial trends in recruitment, diameter growth, and mortality. The model was tested on two sets of...

  2. Sensitivity analysis approaches applied to systems biology models.

    Science.gov (United States)

    Zi, Z

    2011-11-01

    With the rising application of systems biology, sensitivity analysis methods have been widely applied to study biological systems, including metabolic networks, signalling pathways and genetic circuits. Sensitivity analysis can provide valuable insights about how robust the biological responses are with respect to the changes of biological parameters and which model inputs are the key factors that affect the model outputs. In addition, sensitivity analysis is valuable for guiding experimental analysis, model reduction and parameter estimation. Local and global sensitivity analysis approaches are the two types of sensitivity analysis that are commonly applied in systems biology. Local sensitivity analysis is a classic method that studies the impact of small perturbations on the model outputs. On the other hand, global sensitivity analysis approaches have been applied to understand how the model outputs are affected by large variations of the model input parameters. In this review, the author introduces the basic concepts of sensitivity analysis approaches applied to systems biology models. Moreover, the author discusses the advantages and disadvantages of different sensitivity analysis methods, how to choose a proper sensitivity analysis approach, the available sensitivity analysis tools for systems biology models and the caveats in the interpretation of sensitivity analysis results.
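Local sensitivity analysis as described in this record can be sketched with central finite differences on a toy model. The logistic growth model and the 1% perturbation size are illustrative assumptions; real systems-biology applications would substitute the model of interest.

```python
def simulate(r, K, x0=1.0, dt=0.01, steps=500):
    """Forward-Euler integration of logistic growth dx/dt = r*x*(1 - x/K)."""
    x = x0
    for _ in range(steps):
        x += dt * r * x * (1 - x / K)
    return x  # model output: state at final time

def local_sensitivity(output_fn, params, name, rel_step=0.01):
    """Normalized local sensitivity S = (p/y) * dy/dp via central differences."""
    p = params[name]
    up = dict(params, **{name: p * (1 + rel_step)})
    dn = dict(params, **{name: p * (1 - rel_step)})
    y_up, y_dn = output_fn(**up), output_fn(**dn)
    y0 = output_fn(**params)
    return (p / y0) * (y_up - y_dn) / (2 * p * rel_step)

params = {"r": 0.5, "K": 10.0}
for name in params:
    print(name, round(local_sensitivity(simulate, params, name), 3))
```

Normalizing by `p/y` makes sensitivities comparable across parameters with different units, which is what lets one rank "key factors" as the review describes; global methods would instead vary all parameters over wide ranges.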

  3. Towards modeling future energy infrastructures - the ELECTRA system engineering approach

    DEFF Research Database (Denmark)

    Uslar, Mathias; Heussen, Kai

    2016-01-01

    of the IEC 62559 use case template as well as needed changes to cope particularly with the aspects of controller conflicts and Greenfield technology modeling. From the original envisioned use of the standards, we show a possible transfer on how to properly deal with a Greenfield approach when modeling....

  4. Child human model development: a hybrid validation approach

    NARCIS (Netherlands)

    Forbes, P.A.; Rooij, L. van; Rodarius, C.; Crandall, J.

    2008-01-01

    The current study presents a development and validation approach of a child human body model that will help understand child impact injuries and improve the biofidelity of child anthropometric test devices. Due to the lack of fundamental child biomechanical data needed to fully develop such models a

  5. Refining the Committee Approach and Uncertainty Prediction in Hydrological Modelling

    NARCIS (Netherlands)

    Kayastha, N.

    2014-01-01

    Due to the complexity of hydrological systems a single model may be unable to capture the full range of a catchment response and accurately predict the streamflows. The multi modelling approach opens up possibilities for handling such difficulties and allows improving the predictive capability of

  6. Refining the committee approach and uncertainty prediction in hydrological modelling

    NARCIS (Netherlands)

    Kayastha, N.

    2014-01-01

    Due to the complexity of hydrological systems a single model may be unable to capture the full range of a catchment response and accurately predict the streamflows. The multi modelling approach opens up possibilities for handling such difficulties and allows improving the predictive capability of

  7. Product Trial Processing (PTP): a model approach from ...

    African Journals Online (AJOL)

    Product Trial Processing (PTP): a model approach from the consumer's perspective. ... Global Journal of Social Sciences ... The constructs used in the model of the consumer's processing of product trial include experiential and non-experiential attributes, perceived validity of product trial, consumer perceived expertise, ...

  8. A MIXTURE LIKELIHOOD APPROACH FOR GENERALIZED LINEAR-MODELS

    NARCIS (Netherlands)

    WEDEL, M; DESARBO, WS

    1995-01-01

    A mixture model approach is developed that simultaneously estimates the posterior membership probabilities of observations to a number of unobservable groups or latent classes, and the parameters of a generalized linear model which relates the observations, distributed according to some member of
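The mixture-likelihood idea, alternating posterior membership probabilities (E-step) with component parameter updates (M-step), can be sketched in its simplest special case: a two-component Gaussian mixture, i.e. an intercept-only normal GLM. The data below are synthetic; the full approach in the record additionally regresses each component on covariates.

```python
import math
import random

def em_two_gaussians(xs, iters=200):
    """EM for a 2-component Gaussian mixture: the E-step computes posterior
    membership probabilities, the M-step re-estimates weights, means, variances."""
    mu = [min(xs), max(xs)]
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each observation
        resp = []
        for x in xs:
            dens = [pi[k] / math.sqrt(2 * math.pi * var[k])
                    * math.exp(-(x - mu[k]) ** 2 / (2 * var[k])) for k in range(2)]
            s = sum(dens)
            resp.append([d / s for d in dens])
        # M-step: responsibility-weighted maximum-likelihood updates
        for k in range(2):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / len(xs)
            mu[k] = sum(r[k] * x for r, x in zip(resp, xs)) / nk
            var[k] = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, xs)) / nk + 1e-6
    return pi, mu, var

random.seed(0)
xs = [random.gauss(0, 1) for _ in range(300)] + [random.gauss(5, 1) for _ in range(300)]
pi, mu, var = em_two_gaussians(xs)
print([round(m, 1) for m in sorted(mu)])  # component means near 0 and 5
```

The `resp` matrix holds exactly the posterior membership probabilities the abstract mentions; replacing the mean update with a weighted GLM fit per component yields the full mixture-of-GLMs estimator.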

  9. Wall Shear Stress Distribution in a Patient-Specific Cerebral Aneurysm Model using Reduced Order Modeling

    Science.gov (United States)

    Han, Suyue; Chang, Gary Han; Schirmer, Clemens; Modarres-Sadeghi, Yahya

    2016-11-01

    We construct a reduced-order model (ROM) to study the Wall Shear Stress (WSS) distributions in image-based patient-specific aneurysm models. The magnitude of WSS has been shown to be a critical factor in the growth and rupture of human aneurysms. We start the process by running a training case using a Computational Fluid Dynamics (CFD) simulation with time-varying flow parameters, such that these parameters cover the range of parameters of interest. The method of snapshot Proper Orthogonal Decomposition (POD) is utilized to construct the reduced-order bases from the training CFD simulation. The resulting ROM enables us to study the flow patterns and the WSS distributions over a range of system parameters computationally very efficiently with a relatively small number of modes. This enables comprehensive analysis of the model system across a range of physiological conditions without the need to re-compute the simulation for small changes in the system parameters.
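The snapshot POD step can be sketched with NumPy: stack the snapshots as columns of a matrix, take a thin SVD, and keep the leading left-singular vectors as the reduced basis. The synthetic rank-3 snapshot matrix below stands in for real CFD flow fields, which the record would use in practice.

```python
import numpy as np

def pod_basis(snapshots, n_modes):
    """Snapshot POD via thin SVD: columns of U are the spatial modes,
    ordered by the energy they capture (singular values squared)."""
    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    energy = np.cumsum(s**2) / np.sum(s**2)
    return U[:, :n_modes], energy[n_modes - 1]

# Synthetic "training simulation": 200 grid points, 50 time snapshots built
# from 3 underlying spatial structures -> 3 modes capture essentially all energy.
rng = np.random.default_rng(1)
x = np.linspace(0, 1, 200)
modes_true = np.stack([np.sin(np.pi * x), np.sin(2 * np.pi * x), np.sin(3 * np.pi * x)])
coeffs = rng.normal(size=(50, 3))
snapshots = (coeffs @ modes_true).T  # shape (grid points, snapshots) = (200, 50)

basis, captured = pod_basis(snapshots, n_modes=3)
print(basis.shape, round(float(captured), 6))  # (200, 3) 1.0
```

Projecting the governing equations onto these few modes is what makes the ROM cheap to evaluate across parameter ranges, as the abstract describes.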

  10. Comparison of Five Modeling Approaches to Quantify and ...

    Science.gov (United States)

    A generally accepted value for the Radiation Amplification Factor (RAF), with respect to the erythemal action spectrum for sunburn of human skin, is −1.1, indicating that a 1.0% increase in stratospheric ozone leads to a 1.1% decrease in the biologically damaging UV radiation in the erythemal action spectrum reaching the Earth. The RAF is used to quantify the non-linear change in the biologically damaging UV radiation in the erythemal action spectrum as a function of total column ozone (O3). Spectrophotometer measurements recorded at ten US monitoring sites were used in this analysis, and over 71,000 total UVR measurement scans of the sky were collected at those 10 sites between 1998 and 2000 to assess the RAF value. This UVR dataset was examined to determine the specific impact of clouds on the RAF, an indicator of the amount of UV radiation reaching the Earth that can affect sunburn of human skin. Five de novo modeling approaches were used on the dataset, and the calculated RAF values ranged from a low of −0.80 to a high of −1.38.
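Since the RAF enters through the power-law relation UV ∝ O3^RAF, it can be estimated as the slope of ln(UV) versus ln(O3). A minimal sketch on synthetic data, with the canonical value of −1.1 injected for illustration and no cloud term (i.e. clear-sky conditions assumed):

```python
import math
import random

def estimate_raf(ozone_du, uv):
    """Least-squares slope of ln(UV) vs ln(O3); the slope is the RAF."""
    lx = [math.log(o) for o in ozone_du]
    ly = [math.log(u) for u in uv]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(lx, ly))
    sxx = sum((a - mx) ** 2 for a in lx)
    return sxy / sxx

random.seed(42)
ozone = [random.uniform(250, 400) for _ in range(500)]  # total column O3, Dobson units
uv = [10.0 * (o / 300.0) ** -1.1 * math.exp(random.gauss(0, 0.02)) for o in ozone]
print(round(estimate_raf(ozone, uv), 2))  # close to the injected -1.1
```

Cloud effects would add scatter (and possibly bias) to this regression, which is why the record's five modeling approaches recover RAF values spread between −0.80 and −1.38.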

  11. Meta-analysis a structural equation modeling approach

    CERN Document Server

    Cheung, Mike W-L

    2015-01-01

    Presents a novel approach to conducting meta-analysis using structural equation modeling. Structural equation modeling (SEM) and meta-analysis are two powerful statistical methods in the educational, social, behavioral, and medical sciences. They are often treated as two unrelated topics in the literature. This book presents a unified framework on analyzing meta-analytic data within the SEM framework, and illustrates how to conduct meta-analysis using the metaSEM package in the R statistical environment. Meta-Analysis: A Structural Equation Modeling Approach begins by introducing the impo

  12. A study of multidimensional modeling approaches for data warehouse

    Science.gov (United States)

    Yusof, Sharmila Mat; Sidi, Fatimah; Ibrahim, Hamidah; Affendey, Lilly Suriani

    2016-08-01

    A data warehouse system is used to support the process of organizational decision making. Hence, the system must extract and integrate information from heterogeneous data sources in order to uncover relevant knowledge suitable for the decision making process. However, the development of a data warehouse is a difficult and complex process, especially in its conceptual design (multidimensional modeling). Thus, various approaches have been proposed to overcome the difficulty. This study surveys and compares approaches to multidimensional modeling and highlights the issues, trends and solutions proposed to date. The contribution is a state-of-the-art review of multidimensional modeling design.

  13. Numerical linked-cluster approach to quantum lattice models.

    Science.gov (United States)

    Rigol, Marcos; Bryant, Tyler; Singh, Rajiv R P

    2006-11-03

    We present a novel algorithm that allows one to obtain temperature dependent properties of quantum lattice models in the thermodynamic limit from exact diagonalization of small clusters. Our numerical linked-cluster approach provides a systematic framework to assess finite-size effects and is valid for any quantum lattice model. Unlike high temperature expansions, which have a finite radius of convergence in inverse temperature, these calculations are accurate at all temperatures provided the range of correlations is finite. We illustrate the power of our approach studying spin models on kagomé, triangular, and square lattices.

  14. Specific and generic stem biomass and volume models of tree species in a West African tropical semi-deciduous forest

    DEFF Research Database (Denmark)

    Goussanou, Cédric A.; Guendehou, Sabin; Assogbadjo, Achille E.

    2016-01-01

    The quantification of the contribution of tropical forests to global carbon stocks and climate change mitigation requires availability of data and tools such as allometric equations. This study made available volume and biomass models for eighteen tree species in a semi-deciduous tropical forest...... enabled to conclude that the non-destructive sampling was a good approach to determining reliable basic wood density. The comparative analysis of species-specific models in this study with selected generic models for tropical forests indicated low probability to identify effective generic models with good...

  15. A Companion Model Approach to Modelling and Simulation of Industrial Processes

    International Nuclear Information System (INIS)

    Juslin, K.

    2005-09-01

    Modelling and simulation provide huge possibilities if broadly taken up by engineers as a working method. However, when considering the launching of modelling and simulation tools in an engineering design project, they shall be easy to learn and use. There is no time to write equations, to consult suppliers' experts, or to manually transfer data from one tool to another. The answer seems to lie in the integration of easy-to-use and dependable simulation software with engineering tools. Accordingly, the modelling and simulation software shall accept as input such structured design information on industrial unit processes and their connections as provided by e.g. CAD software and product databases. The software technology, including the required specification and communication standards, is already available. Internet-based service repositories make it possible for equipment manufacturers to supply 'extended products', including such design data as needed by engineers engaged in process and automation integration. There is a market niche evolving for simulation service centres, operating in co-operation with project consultants, equipment manufacturers, process integrators, automation designers, plant operating personnel, and maintenance centres. The companion model approach for the specification and solution of process simulation models, as presented herein, is developed from the above premises. The focus is on how to tackle real-world processes, which from the modelling point of view are heterogeneous, dynamic, very stiff, very nonlinear and only piecewise continuous, without extensive manual interventions by human experts. An additional challenge, solving the arising equations fast and reliably, is dealt with as well. (orig.)

  16. Mud crab ecology encourages site-specific approaches to fishery management

    Science.gov (United States)

    Dumas, P.; Léopold, M.; Frotté, L.; Peignon, C.

    2012-01-01

    Little is known about the effects of mud crabs population patterns on their exploitation. We used complementary approaches (experimental, fisher-based) to investigate how small-scale variations in density, size and sex-ratio related to the ecology of S. serrata may impact fishing practices in New Caledonia. Crabs were measured/sexed across 9 stations in contrasted mangrove systems between 2007 and 2009. Stations were described and classified in different kinds of mangrove forests (coastal, riverine, and estuarine); vegetation cover was qualitatively described at station scale. Annual catch was used as an indicator of fishing pressure. Middle-scale environmental factors (oceanic influence, vegetation cover) had significant contributions to crab density (GLM, 84.8% of variance), crab size and sex-ratio (social implications in the Pacific area, where land tenure system and traditional access rights prevent most fishers from freely selecting their harvest zones. This offers a great opportunity to encourage site-specific management of mud crab fisheries.

  17. Metamodelling Approach and Software Tools for Physical Modelling and Simulation

    Directory of Open Access Journals (Sweden)

    Vitaliy Mezhuyev

    2015-02-01

    Full Text Available In computer science, the metamodelling approach is becoming more and more popular for the purpose of software systems development. In this paper, we discuss the applicability of the metamodelling approach to the development of software tools for physical modelling and simulation. To define a metamodel for physical modelling, an analysis of physical models is carried out. The result of this analysis reveals the invariant physical structures that we propose to use as the basic abstractions of the physical metamodel. It is a system of geometrical objects, allowing one to build a spatial structure of physical models and to set a distribution of physical properties. For such a geometry of distributed physical properties, different mathematical methods can be applied. To validate the proposed metamodelling approach, we consider the developed prototypes of software tools.

  18. Learning the Task Management Space of an Aircraft Approach Model

    Science.gov (United States)

    Krall, Joseph; Menzies, Tim; Davies, Misty

    2014-01-01

    Validating models of airspace operations is a particular challenge. These models are often aimed at finding and exploring safety violations, and aim to be accurate representations of real-world behavior. However, the rules governing the behavior are quite complex: nonlinear physics, operational modes, human behavior, and stochastic environmental concerns all determine the responses of the system. In this paper, we present a study on aircraft runway approaches as modeled in Georgia Tech's Work Models that Compute (WMC) simulation. We use a new learner, Genetic-Active Learning for Search-Based Software Engineering (GALE) to discover the Pareto frontiers defined by cognitive structures. These cognitive structures organize the prioritization and assignment of tasks of each pilot during approaches. We discuss the benefits of our approach, and also discuss future work necessary to enable uncertainty quantification.

  19. Specification, Model Generation, and Verification of Distributed Applications

    OpenAIRE

    Madelaine, Eric

    2011-01-01

    Since 2001, in the Oasis team, I have conducted research on the semantics of applications based on distributed objects, applying my previous research in the field of process algebras in the context of a real language and applications of realistic size. The various aspects of this work naturally include behavioral semantics and the definition of procedures for model generation, taking into account the different concepts of distributed applications, but also, upstream, static code analysis a...

  20. Positive Mathematical Programming Approaches – Recent Developments in Literature and Applied Modelling

    Directory of Open Access Journals (Sweden)

    Thomas Heckelei

    2012-05-01

    Full Text Available This paper reviews and discusses the more recent literature and application of Positive Mathematical Programming in the context of agricultural supply models. Specifically, advances in the empirical foundation of parameter specifications as well as the economic rationalisation of PMP models – both criticized in earlier reviews – are investigated. Moreover, the paper provides an overview of a larger set of models with regular/repeated policy application that apply variants of PMP. Results show that most applications today avoid arbitrary parameter specifications and rely on exogenous information on supply responses to calibrate model parameters. However, only a few approaches use multiple observations to estimate parameters, which is likely due to the still considerable technical challenges associated with it. Equally, we found only limited reflection on the behavioral or technological assumptions that could rationalise the PMP model structure while still keeping the model’s advantages.

  1. An integrated modeling approach to age invariant face recognition

    Science.gov (United States)

    Alvi, Fahad Bashir; Pears, Russel

    2015-03-01

    This research study proposes a novel method for face recognition based on anthropometric features, using an integrated approach that comprises a global model and personalized models. The system is aimed at situations where lighting, illumination, and pose variations cause problems in face recognition. A personalized model covers the individual aging patterns, while a global model captures the general aging patterns in the database. We introduce a de-aging factor that de-ages each individual in the database test and training sets. We used the k-nearest-neighbor approach for building the personalized and global models, with regression analysis applied to build the models. During the test phase, we resort to voting on different features. We used the FG-Net database for checking the results of our technique and achieved a 65 percent Rank-1 identification rate.
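The split between a pooled global model and a per-subject personalized model described above can be illustrated with a plain k-nearest-neighbour regression: the same estimator is trained either on everyone's data or on a single subject's history. The feature (age), target values, and choice of k below are hypothetical, chosen only for illustration, and are not taken from the FG-Net study.

```python
import numpy as np

def knn_regress(train_x, train_y, query, k):
    """Plain k-nearest-neighbour regression on a 1-D feature."""
    dists = np.abs(train_x - query)
    idx = np.argsort(dists)[:k]          # indices of the k closest points
    return train_y[idx].mean()           # average their targets

# Hypothetical pooled "global" data: feature = age, target = some
# anthropometric ratio. A "personalized" model would call the same
# function with only one subject's measurements.
ages = np.array([5.0, 10.0, 15.0, 20.0, 25.0, 30.0])
ratio = np.array([0.50, 0.55, 0.60, 0.63, 0.65, 0.66])
pred = knn_regress(ages, ratio, query=18.0, k=2)
```

For the query age 18, the two nearest training ages are 20 and 15, so the prediction is the mean of their targets.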

  2. Modeling Approaches and Systems Related to Structured Modeling.

    Science.gov (United States)

    1987-02-01

    Lasdon and Maturana give surveys of several modern systems. CAMPS (Lucas and Mitra) -- Computer Assisted Mathe... 583-589. MATURANA, S. "Comparative Analysis of Mathematical Modeling Systems," informal note, Graduate School of Management, UCLA, February

  3. Process-based monitoring and modeling of Karst springs - Linking intrinsic to specific vulnerability.

    Science.gov (United States)

    Epting, Jannis; Page, Rebecca M; Auckenthaler, Adrian; Huggenberger, Peter

    2018-06-01

    The presented work illustrates to what extent field investigations as well as monitoring and modeling approaches are necessary to understand the high discharge dynamics and vulnerability of Karst springs. In complex settings, the application of 3D geological models is essential for evaluating the vulnerability of Karst systems. Such models allow deriving information on catchment characteristics, such as the geometry of aquifers and aquitards as well as their displacements along faults. A series of Karst springs in northwestern Switzerland were compared, and Karst system dynamics with respect to qualitative and quantitative issues were evaluated. The main objective of the studies was to combine information on catchment characteristics and data from novel monitoring systems (physicochemical and microbiological parameters) to assess the intrinsic vulnerability of Karst springs to microbiological contamination, together with simulated spring discharges derived from numerical modeling (linear storage models). The numerically derived relation of fast and slow groundwater flow components enabled us to relate different sources of groundwater recharge and to characterize the dynamics of the Karst springs. Our study illustrates that comparably simple model setups were able to reproduce the overall dynamic intrinsic vulnerability of several Karst systems, and that one of the most important processes involved was the temporal variation of groundwater recharge (precipitation, evapotranspiration and snow melt). Furthermore, we make a first attempt at linking intrinsic to specific vulnerability of Karst springs, which involves activities within the catchment area, such as human impacts from agriculture and settlements. Likewise, with a more detailed representation of system dynamics, the influence of surface water infiltrating into the Karst system, which is impacted by release events from storm sewers, could be considered. Overall, we demonstrate that our approach can be the basis for a more flexible and ...

  4. Soil moisture simulations using two different modelling approaches

    Czech Academy of Sciences Publication Activity Database

    Šípek, Václav; Tesař, Miroslav

    2013-01-01

    Roč. 64, 3-4 (2013), s. 99-103 ISSN 0006-5471 R&D Projects: GA AV ČR IAA300600901; GA ČR GA205/08/1174 Institutional research plan: CEZ:AV0Z20600510 Keywords : soil moisture modelling * SWIM model * box modelling approach Subject RIV: DA - Hydrology ; Limnology http://www.boku.ac.at/diebodenkultur/volltexte/sondernummern/band-64/heft-3-4/sipek.pdf

  5. A generic approach to haptic modeling of textile artifacts

    Science.gov (United States)

    Shidanshidi, H.; Naghdy, F.; Naghdy, G.; Wood Conroy, D.

    2009-08-01

    Haptic modeling of textiles has attracted significant interest over the last decade. In spite of extensive research, no generic system has been proposed. Previous work mainly assumes that textile has a 2D planar structure and requires time-consuming measurement of textile properties for construction of the mechanical model. A novel approach for haptic modeling of textile is proposed to overcome these shortcomings. The method is generic, assumes a 3D structure for the textile, and deploys computational intelligence to estimate the mechanical properties of the textile. The approach is designed primarily for display of textile artifacts in museums. The haptic model is constructed by superimposing the mechanical model of the textile over its geometrical model. Digital image processing is applied to a still image of the textile to identify its pattern and structure through a fuzzy rule-based algorithm. The 3D geometric model of the artifact is automatically generated in VRML, based on the identified pattern and structure obtained from the textile image. Selected mechanical properties of the textile are estimated by an artificial neural network, deploying the textile's geometric characteristics and yarn properties as inputs. The estimated mechanical properties are then deployed in the construction of the textile mechanical model. The proposed system is introduced and the developed algorithms are described. Validation of the method indicates the feasibility of the approach and its superiority to other haptic modeling algorithms.

  6. Site-specific and multielement approach to the determination of liquid-vapor isotope fractionation parameters. The case of alcohols

    International Nuclear Information System (INIS)

    Moussa, I.; Naulet, N.; Martin, M.L.; Martin, G.J.

    1990-01-01

    Isotope fractionation phenomena occurring at the natural abundance level in the course of liquid-vapor transformation have been investigated by using the SNIF-NMR method (site-specific natural isotope fractionation studied by NMR) which has a unique capability of providing simultaneous access to fractionation parameters associated with different molecular isotopomers. This new approach has been combined with the determination of overall carbon and hydrogen fractionation effects by isotope ratio mass spectrometry (IRMS). The results of distillation and evaporation experiments of alcohols performed in technical conditions of practical interest have been analyzed according to the Rayleigh-type model. In order to check the performance of the column, unit fractionation factors were measured beforehand for water and for the hydroxylic sites of methanol and ethanol for which liquid-vapor equilibrium constants were already known. Inverse isotope effects are determined in distillation experiments for the overall carbon isotope ratio and for the site-specific hydrogen isotope ratios associated with the methyl and methylene sites of methanol and ethanol. In contrast, normal isotope effects are produced by distillation for the hydroxylic sites and by evaporation for all the isotopic ratios

  7. Specificity of DNA microarray hybridization: characterization, effectors and approaches for data correction

    OpenAIRE

    Koltai, Hinanit; Weingarten-Baror, Carmiya

    2008-01-01

    Microarray-hybridization specificity is one of the main effectors of microarray result quality. In the present review, we suggest a definition for specificity that spans four hybridization levels, from the single probe to the microarray platform. For increased hybridization specificity, it is important to quantify the extent of the specificity at each of these levels, and correct the data accordingly. We outline possible effects of low hybridization specificity on the obtained results and lis...

  8. Modeling gene expression measurement error: a quasi-likelihood approach

    Directory of Open Access Journals (Sweden)

    Strimmer Korbinian

    2003-03-01

    Full Text Available Abstract Background Using suitable error models for gene expression measurements is essential in the statistical analysis of microarray data. However, the true probabilistic model underlying gene expression intensity readings is generally not known. Instead, in currently used approaches either some simple parametric model is assumed (usually a transformed normal distribution) or the empirical distribution is estimated. However, both these strategies may not be optimal for gene expression data, as the non-parametric approach ignores known structural information whereas the fully parametric models run the risk of misspecification. A further related problem is the choice of a suitable scale for the model (e.g. observed vs. log-scale). Results Here a simple semi-parametric model for gene expression measurement error is presented. In this approach inference is based on an approximate likelihood function (the extended quasi-likelihood). Only partial knowledge about the unknown true distribution is required to construct this function. In the case of gene expression this information is available in the form of the postulated (e.g. quadratic) variance structure of the data. As the quasi-likelihood behaves (almost) like a proper likelihood, it allows for the estimation of calibration and variance parameters, and it is also straightforward to obtain corresponding approximate confidence intervals. Unlike most other frameworks, it also allows analysis on any preferred scale, i.e. both on the original linear scale as well as on a transformed scale. It can also be employed in regression approaches to model systematic (e.g. array or dye) effects. Conclusions The quasi-likelihood framework provides a simple and versatile approach to analyze gene expression data that does not make any strong distributional assumptions about the underlying error model. For several simulated as well as real data sets it provides a better fit to the data than competing models. In an example it also ...
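The quasi-score idea behind this kind of approach can be sketched in a few lines: with a postulated quadratic variance function V(mu) = a + b*mu^2, the mean is estimated by solving the quasi-score equation U(mu) = 0. The parameter values and data below are hypothetical, and the bisection root finder is a generic stand-in, not anything from the paper.

```python
import numpy as np

def quasi_score(y, mu, a, b):
    """Quasi-score U(mu) = sum_i (y_i - mu) / V(mu) for a postulated
    quadratic variance function V(mu) = a + b * mu**2."""
    return np.sum((y - mu) / (a + b * mu ** 2))

def fit_mean(y, a, b, tol=1e-10):
    """Solve U(mu) = 0 by bisection on an interval bracketing the root."""
    lo, hi = y.min() - 1.0, y.max() + 1.0   # U(lo) > 0 > U(hi)
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if quasi_score(y, lo, a, b) * quasi_score(y, mid, a, b) <= 0:
            hi = mid                         # root lies in [lo, mid]
        else:
            lo = mid                         # root lies in [mid, hi]
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

# Hypothetical replicate intensity readings for one gene.
y = np.array([4.8, 5.1, 5.3, 4.9, 5.0])
mu_hat = fit_mean(y, a=0.1, b=0.01)
```

Because V(mu) here does not vary across observations, the quasi-score root coincides with the sample mean; the framework's value shows up once calibration or scale parameters enter the variance function.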

  9. Model-driven design using IEC 61499 a synchronous approach for embedded and automation systems

    CERN Document Server

    Yoong, Li Hsien; Bhatti, Zeeshan E; Kuo, Matthew M Y

    2015-01-01

    This book describes a novel approach for the design of embedded systems and industrial automation systems, using a unified model-driven approach that is applicable in both domains.  The authors illustrate their methodology, using the IEC 61499 standard as the main vehicle for specification, verification, static timing analysis and automated code synthesis.  The well-known synchronous approach is used as the main vehicle for defining an unambiguous semantics that ensures determinism and deadlock freedom. The proposed approach also ensures very efficient implementations either on small-scale embedded devices or on industry-scale programmable automation controllers (PACs). It can be used for both centralized and distributed implementations. Significantly, the proposed approach can be used without the need for any run-time support. This approach, for the first time, blurs the gap between embedded systems and automation systems and can be applied in wide-ranging applications in automotive, robotics, and industri...

  10. Comparing Three Patterns of Strengths and Weaknesses Models for the Identification of Specific Learning Disabilities

    Science.gov (United States)

    Miller, Daniel C.; Maricle, Denise E.; Jones, Alicia M.

    2016-01-01

    Processing Strengths and Weaknesses (PSW) models have been proposed as a method for identifying specific learning disabilities. Three PSW models were examined for their ability to predict expert identified specific learning disabilities cases. The Dual Discrepancy/Consistency Model (DD/C; Flanagan, Ortiz, & Alfonso, 2013) as operationalized by…

  11. Site-Specific Seismic Site Response Model for the Waste Treatment Plant, Hanford, Washington

    Energy Technology Data Exchange (ETDEWEB)

    Rohay, Alan C.; Reidel, Steve P.

    2005-02-24

    This interim report documents the collection of site-specific geologic and geophysical data characterizing the Waste Treatment Plant site and the modeling of the site-specific structure response to earthquake ground motions.

  12. Space-Wise approach for airborne gravity data modelling

    Science.gov (United States)

    Sampietro, D.; Capponi, M.; Mansi, A. H.; Gatti, A.; Marchetti, P.; Sansò, F.

    2017-05-01

    Regional gravity field modelling by means of the remove-compute-restore procedure is nowadays widely applied in different contexts: it is the most used technique for regional gravimetric geoid determination, and it is also used in exploration geophysics to predict grids of gravity anomalies (Bouguer, free-air, isostatic, etc.), which are useful to understand and map geological structures in a specific region. Considering this last application, due to the required accuracy and resolution, airborne gravity observations are usually adopted. However, due to the relatively high acquisition velocity, presence of atmospheric turbulence, aircraft vibration, instrumental drift, etc., airborne data are usually contaminated by a very high observation error. For this reason, a proper procedure to filter the raw observations in both the low and high frequencies should be applied to recover valuable information. In this work, software to filter and grid raw airborne observations is presented: the proposed solution consists of a combination of an along-track Wiener filter and a classical Least Squares Collocation technique. Basically, the proposed procedure is an adaptation to airborne gravimetry of the Space-Wise approach, developed by Politecnico di Milano to process data coming from the ESA satellite mission GOCE. Among the main differences with respect to the satellite application of this approach is the fact that, while in processing GOCE data the stochastic characteristics of the observation error can be considered a priori well known, in airborne gravimetry, due to the complex environment in which the observations are acquired, these characteristics are unknown and should be retrieved from the dataset itself. The presented solution is suited for airborne data analysis, making it possible to quickly filter and grid gravity observations in an easy way. Some innovative theoretical aspects, focusing in particular on theoretical covariance modelling, are presented as well.
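As a rough illustration of the along-track filtering step, the sketch below applies a frequency-domain Wiener filter, with gain H = S/(S+N), to a synthetic noisy track. The signal and noise power spectra are assumed known here, whereas the abstract stresses that in airborne gravimetry these characteristics must be estimated from the dataset itself; the spectral shapes and noise level are purely illustrative.

```python
import numpy as np

def wiener_filter_track(obs, signal_psd, noise_psd):
    """Frequency-domain Wiener filter H = S / (S + N) applied along track.
    signal_psd and noise_psd are arrays over the rfft frequencies."""
    spectrum = np.fft.rfft(obs)
    gain = signal_psd / (signal_psd + noise_psd)
    return np.fft.irfft(spectrum * gain, n=len(obs))

# Synthetic track: a smooth low-frequency "gravity" signal plus white noise.
rng = np.random.default_rng(1)
n = 512
t = np.arange(n)
signal = np.sin(2 * np.pi * t / 128.0)
obs = signal + rng.normal(0.0, 0.5, n)

freqs = np.fft.rfftfreq(n)
signal_psd = 1.0 / (1.0 + (freqs / 0.02) ** 4)   # assumed low-pass signal model
noise_psd = np.full_like(freqs, 0.25)            # assumed white-noise variance
smoothed = wiener_filter_track(obs, signal_psd, noise_psd)
```

The filter passes the low-frequency signal band and suppresses the broadband noise, reducing the error relative to the raw observations.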

  13. A Toolkit Modeling Approach for Sustainable Forest Management Planning: Achieving Balance between Science and Local Needs

    Directory of Open Access Journals (Sweden)

    Brian R. Sturtevant

    2007-12-01

    Full Text Available To assist forest managers in balancing an increasing diversity of resource objectives, we developed a toolkit modeling approach for sustainable forest management (SFM). The approach inserts a meta-modeling strategy into a collaborative modeling framework grounded in adaptive management philosophy that facilitates participation among stakeholders, decision makers, and local domain experts in the meta-model building process. The modeling team works iteratively with each of these groups to define essential questions, identify data resources, and then determine whether available tools can be applied or adapted, or whether new tools can be rapidly created to fit the need. The desired goal of the process is a linked series of domain-specific models (tools) that balances generalized "top-down" models (i.e., scientific models developed without input from the local system) with case-specific customized "bottom-up" models that are driven primarily by local needs. Information flow between models is organized according to vertical (i.e., between-scale) and horizontal (i.e., within-scale) dimensions. We illustrate our approach within a 2.1 million hectare forest planning district in central Labrador, a forested landscape where social and ecological values receive a higher priority than economic values. However, the focus of this paper is on the process of how SFM modeling tools and concepts can be rapidly assembled and applied in new locations, balancing efficient transfer of science with adaptation to local needs. We use the Labrador case study to illustrate strengths and challenges uniquely associated with a meta-modeling approach to integrated modeling as it fits within the broader collaborative modeling framework. Principal advantages of the approach include the scientific rigor introduced by peer-reviewed models, combined with the adaptability of meta-modeling. A key challenge is the limited transparency of scientific models to different participatory groups.

  14. Spatial regression-based model specifications for exogenous and endogenous spatial interaction

    OpenAIRE

    Manfred M Fischer; James P. LeSage

    2014-01-01

    Spatial interaction models represent a class of models that are used for modelling origin-destination flow data. The focus of this paper is on the log-normal version of the model. In this context, we consider spatial econometric specifications that can be used to accommodate two types of dependence scenarios, one involving endogenous interaction and the other exogenous interaction. These model specifications replace the conventional assumption of independence between origin-destination flows ...

  15. Confidence Level Based Approach to Total Dose Specification for Spacecraft Electronics

    Science.gov (United States)

    Xapsos, M. A.; Stauffer, C.; Phan, A.; McClure, S. S.; Ladbury, R. L.; Pellish, J. A.; Campola, M. J.; Label, K. A.

    2017-01-01

    A confidence level based approach to total dose radiation hardness assurance is presented for spacecraft electronics. It is applicable to both ionizing and displacement damage dose. Results are compared to the traditional approach that uses radiation design margin and advantages of the new approach are discussed.

  16. A novel scoring approach for protein co-purification data reveals high interaction specificity.

    Directory of Open Access Journals (Sweden)

    Xueping Yu

    2009-09-01

    Full Text Available Large-scale protein interaction networks (PINs have typically been discerned using affinity purification followed by mass spectrometry (AP/MS and yeast two-hybrid (Y2H techniques. It is generally recognized that Y2H screens detect direct binary interactions while the AP/MS method captures co-complex associations; however, the latter technique is known to yield prevalent false positives arising from a number of effects, including abundance. We describe a novel approach to compute the propensity for two proteins to co-purify in an AP/MS data set, thereby allowing us to assess the detected level of interaction specificity by analyzing the corresponding distribution of interaction scores. We find that two recent AP/MS data sets of yeast contain enrichments of specific, or high-scoring, associations as compared to commensurate random profiles, and that curated, direct physical interactions in two prominent data bases have consistently high scores. Our scored interaction data sets are generally more comprehensive than those of previous studies when compared against four diverse, high-quality reference sets. Furthermore, we find that our scored data sets are more enriched with curated, direct physical associations than Y2H sets. A high-confidence protein interaction network (PIN derived from the AP/MS data is revealed to be highly modular, and we show that this topology is not the result of misrepresenting indirect associations as direct interactions. In fact, we propose that the modularity in Y2H data sets may be underrepresented, as they contain indirect associations that are significantly enriched with false negatives. The AP/MS PIN is also found to contain significant assortative mixing; however, in line with a previous study we confirm that Y2H interaction data show weak disassortativeness, thus revealing more clearly the distinctive natures of the interaction detection methods. We expect that our scored yeast data sets are ideal for further

  17. A novel scoring approach for protein co-purification data reveals high interaction specificity.

    Science.gov (United States)

    Yu, Xueping; Ivanic, Joseph; Wallqvist, Anders; Reifman, Jaques

    2009-09-01

    Large-scale protein interaction networks (PINs) have typically been discerned using affinity purification followed by mass spectrometry (AP/MS) and yeast two-hybrid (Y2H) techniques. It is generally recognized that Y2H screens detect direct binary interactions while the AP/MS method captures co-complex associations; however, the latter technique is known to yield prevalent false positives arising from a number of effects, including abundance. We describe a novel approach to compute the propensity for two proteins to co-purify in an AP/MS data set, thereby allowing us to assess the detected level of interaction specificity by analyzing the corresponding distribution of interaction scores. We find that two recent AP/MS data sets of yeast contain enrichments of specific, or high-scoring, associations as compared to commensurate random profiles, and that curated, direct physical interactions in two prominent data bases have consistently high scores. Our scored interaction data sets are generally more comprehensive than those of previous studies when compared against four diverse, high-quality reference sets. Furthermore, we find that our scored data sets are more enriched with curated, direct physical associations than Y2H sets. A high-confidence protein interaction network (PIN) derived from the AP/MS data is revealed to be highly modular, and we show that this topology is not the result of misrepresenting indirect associations as direct interactions. In fact, we propose that the modularity in Y2H data sets may be underrepresented, as they contain indirect associations that are significantly enriched with false negatives. The AP/MS PIN is also found to contain significant assortative mixing; however, in line with a previous study we confirm that Y2H interaction data show weak disassortativeness, thus revealing more clearly the distinctive natures of the interaction detection methods. We expect that our scored yeast data sets are ideal for further biological discovery.

  18. The Specifics and Non-Specifics of using Small Interfering RNAs for Targeting of Viral Genes in a Fish Model

    DEFF Research Database (Denmark)

    Schyth, Brian Dall

    2007-01-01

    A novel in vivo-model composed of small juvenile rainbow trout and a fish-pathogenic virus is suggested to analyze delivery and antiviral effect of formulated siRNAs. This model was used for testing delivery of intraperitoneally injected siRNAs formulated in polycationic liposomes. These, and to a lesser degree naked siRNAs, primarily entered free intraperitoneal cells including macrophage-like cells. Furthermore, uptake correlated with antiviral activity seen as reduced mortality of fish challenged with VHSV. Protection at the disease level was not dependent upon which one of three tested siRNAs was used, and protection correlated with up-regulation of an interferon-related gene in the liver, indicating a systemic interferon response. The results show the validity of the fish model for testing delivery and non-specific effects of siRNAs in a high-throughput vertebrate model. The purchase...

  19. Modeling electricity loads in California: a continuous-time approach

    Science.gov (United States)

    Weron, R.; Kozłowska, B.; Nowicka-Zagrajek, J.

    2001-10-01

    In this paper we address the issue of modeling electricity loads and prices with diffusion processes. More specifically, we study models which belong to the class of generalized Ornstein-Uhlenbeck processes. After comparing properties of simulated paths with those of deseasonalized data from the California power market and performing out-of-sample forecasts we conclude that, despite certain advantages, the analyzed continuous-time processes are not adequate models of electricity load and price dynamics.
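A basic (non-generalized) Ornstein-Uhlenbeck process of the kind underlying these models, dX = theta(mu - X)dt + sigma dW, can be simulated with an Euler-Maruyama scheme. The parameter values below are hypothetical and purely illustrative; in a load model, theta, mu, and sigma would be estimated from the deseasonalized data.

```python
import numpy as np

def simulate_ou(theta, mu, sigma, x0, dt, n_steps, rng):
    """Euler-Maruyama discretization of dX = theta*(mu - X) dt + sigma dW."""
    x = np.empty(n_steps + 1)
    x[0] = x0
    for i in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt))  # Brownian increment over dt
        x[i + 1] = x[i] + theta * (mu - x[i]) * dt + sigma * dw
    return x

rng = np.random.default_rng(0)
# Hypothetical parameters: fast mean reversion toward the level mu = 1.0.
path = simulate_ou(theta=5.0, mu=1.0, sigma=0.2, x0=0.0,
                   dt=0.01, n_steps=1000, rng=rng)
```

With strong mean reversion, the path quickly forgets its initial condition and fluctuates around mu, which is the qualitative behavior these diffusion models impose on deseasonalized loads and prices.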

  20. Wave Resource Characterization Using an Unstructured Grid Modeling Approach

    Directory of Open Access Journals (Sweden)

    Wei-Cheng Wu

    2018-03-01

    Full Text Available This paper presents a modeling study conducted on the central Oregon coast for wave resource characterization, using the unstructured grid Simulating WAve Nearshore (SWAN) model coupled with a nested grid WAVEWATCH III® (WWIII) model. The flexibility of models with various spatial resolutions and the effects of open boundary conditions simulated by a nested grid WWIII model with different physics packages were evaluated. The model results demonstrate the advantage of the unstructured grid-modeling approach for flexible model resolution and good model skills in simulating the six wave resource parameters recommended by the International Electrotechnical Commission in comparison to the observed data in Year 2009 at National Data Buoy Center Buoy 46050. Notably, spectral analysis indicates that the ST4 physics package improves upon the ST2 physics package’s ability to predict wave power density for large waves, which is important for wave resource assessment, load calculation of devices, and risk management. In addition, bivariate distributions show that the simulated sea state of maximum occurrence with the ST4 physics package matched the observed data better than with the ST2 physics package. This study demonstrated that the unstructured grid wave modeling approach, driven by regional nested grid WWIII outputs along with the ST4 physics package, can efficiently provide accurate wave hindcasts to support wave resource characterization. Our study also suggests that wind effects need to be considered if the dimension of the model domain is greater than approximately 100 km, i.e., on the order of 10² km.

  1. Intelligent Transportation and Evacuation Planning A Modeling-Based Approach

    CERN Document Server

    Naser, Arab

    2012-01-01

    Intelligent Transportation and Evacuation Planning: A Modeling-Based Approach provides a new paradigm for evacuation planning strategies and techniques. Recently, evacuation planning and modeling have increasingly attracted interest among researchers as well as government officials. This interest stems from the recent catastrophic hurricanes and weather-related events that occurred in the southeastern United States (Hurricane Katrina and Rita). The evacuation methods that were in place before and during the hurricanes did not work well and resulted in thousands of deaths. This book offers insights into the methods and techniques that allow for implementing mathematical-based, simulation-based, and integrated optimization and simulation-based engineering approaches for evacuation planning. This book also: Comprehensively discusses the application of mathematical models for evacuation and intelligent transportation modeling Covers advanced methodologies in evacuation modeling and planning Discusses principles a...

  2. A mouse model for adult cardiac-specific gene deletion with CRISPR/Cas9.

    Science.gov (United States)

    Carroll, Kelli J; Makarewich, Catherine A; McAnally, John; Anderson, Douglas M; Zentilin, Lorena; Liu, Ning; Giacca, Mauro; Bassel-Duby, Rhonda; Olson, Eric N

    2016-01-12

    Clustered regularly interspaced short palindromic repeats (CRISPR)-associated (Cas)9 genomic editing has revolutionized the generation of mutant animals by simplifying the creation of null alleles in virtually any organism. However, most current approaches with this method require zygote injection, making it difficult to assess the adult, tissue-specific functions of genes that are widely expressed or which cause embryonic lethality when mutated. Here, we describe the generation of cardiac-specific Cas9 transgenic mice, which express high levels of Cas9 in the heart, but display no overt defects. In proof-of-concept experiments, we used Adeno-Associated Virus 9 (AAV9) to deliver single-guide RNA (sgRNA) that targets the Myh6 locus exclusively in cardiomyocytes. Intraperitoneal injection of postnatal cardiac-Cas9 transgenic mice with AAV9 encoding sgRNA against Myh6 resulted in robust editing of the Myh6 locus. These mice displayed severe cardiomyopathy and loss of cardiac function, with elevation of several markers of heart failure, confirming the effectiveness of this method of adult cardiac gene deletion. Mice with cardiac-specific expression of Cas9 provide a tool that will allow rapid and accurate deletion of genes following a single injection of AAV9-sgRNAs, thereby circumventing embryonic lethality. This method will be useful for disease modeling and provides a means of rapidly editing genes of interest in the heart.

  3. A review of function modeling: Approaches and applications

    OpenAIRE

    Erden, M.S.; Komoto, H.; Van Beek, T.J.; D'Amelio, V.; Echavarria, E.; Tomiyama, T.

    2008-01-01

    This work is aimed at establishing a common frame and understanding of function modeling (FM) for our ongoing research activities. A comparative review of the literature is performed to grasp the various FM approaches with their commonalities and differences. The relations of FM with the research fields of artificial intelligence, design theory, and maintenance are discussed. In this discussion the goals are to highlight the features of various classical approaches in relation to FM, to delin...

  4. Rule-based modeling: a computational approach for studying biomolecular site dynamics in cell signaling systems.

    Science.gov (United States)

    Chylek, Lily A; Harris, Leonard A; Tung, Chang-Shung; Faeder, James R; Lopez, Carlos F; Hlavacek, William S

    2014-01-01

    Rule-based modeling was developed to address the limitations of traditional approaches for modeling chemical kinetics in cell signaling systems. These systems consist of multiple interacting biomolecules (e.g., proteins), which themselves consist of multiple parts (e.g., domains, linear motifs, and sites of phosphorylation). Consequently, biomolecules that mediate information processing generally have the potential to interact in multiple ways, with the number of possible complexes and posttranslational modification states tending to grow exponentially with the number of binary interactions considered. As a result, only large reaction networks capture all possible consequences of the molecular interactions that occur in a cell signaling system, which is problematic because traditional modeling approaches for chemical kinetics (e.g., ordinary differential equations) require explicit network specification. This problem is circumvented through representation of interactions in terms of local rules. With this approach, network specification is implicit and model specification is concise. Concise representation results in a coarse graining of chemical kinetics, which is introduced because all reactions implied by a rule inherit the rate law associated with that rule. Coarse graining can be appropriate if interactions are modular, and the coarseness of a model can be adjusted as needed. Rules can be specified using specialized model-specification languages, and recently developed tools designed for specification of rule-based models allow one to leverage powerful software engineering capabilities. A rule-based model comprises a set of rules, which can be processed by general-purpose simulation and analysis tools to achieve different objectives (e.g., to perform either a deterministic or stochastic simulation). © 2013 Wiley Periodicals, Inc.
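
The combinatorial bookkeeping described above can be sketched in a few lines. This is a toy illustration only, not the actual rule-based formalism of tools such as BioNetGen: n independent "phosphorylate site i" rules implicitly define a network over 2**n species, and every generated reaction inherits the rate constant of its rule (the coarse graining mentioned in the abstract).

```python
from itertools import product

def expand_rules(n_sites, k_phos):
    """Enumerate every reaction implied by n 'phosphorylate site i' rules.

    A protein with n independent sites has 2**n states, yet the rule-based
    description needs only n rules; each generated reaction inherits the
    single rate constant k_phos attached to its rule (coarse graining)."""
    reactions = []
    for state in product((0, 1), repeat=n_sites):    # implicit species list
        for i in range(n_sites):
            if state[i] == 0:                        # rule applies: site i free
                phosphorylated = state[:i] + (1,) + state[i + 1:]
                reactions.append((state, phosphorylated, k_phos))
    return reactions

# 3 rules -> 8 species and 12 reactions, all sharing one rate law
rxns = expand_rules(3, k_phos=0.5)
```

For n = 20 sites the explicit network already has over a million species, which is exactly why implicit, rule-based specification is preferred.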

  5. A model-data based systems approach to process intensification

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    . Their developments, however, are largely due to experiment based trial and error approaches and while they do not require validation, they can be time consuming and resource intensive. Also, one may ask, can a truly new intensified unit operation be obtained in this way? An alternative two-stage approach is to apply...... a model-based synthesis method to systematically generate and evaluate alternatives in the first stage and an experiment-model based validation in the second stage. In this way, the search for alternatives is done very quickly, reliably and systematically over a wide range, while resources are preserved...... for focused validation of only the promising candidates in the second-stage. This approach, however, would be limited to intensification based on “known” unit operations, unless the PI process synthesis/design is considered at a lower level of aggregation, namely the phenomena level. That is, the model-based...

  6. Identifying Country-Specific Cultures of Physics Education: A differential item functioning approach

    Science.gov (United States)

    Mesic, Vanes

    2012-11-01

In international large-scale assessments of educational outcomes, student achievement is often represented by unidimensional constructs. This approach allows for drawing general conclusions about country rankings with respect to the given achievement measure, but it typically does not provide the specific diagnostic information that is necessary for systematic comparisons and improvements of educational systems. Useful information could be obtained by exploring the differences in national profiles of student achievement between low-achieving and high-achieving countries. In this study, we aimed to identify the relative weaknesses and strengths of eighth graders' physics achievement in Bosnia and Herzegovina in comparison with the achievement of their peers from Slovenia. For this purpose, we ran a secondary analysis of Trends in International Mathematics and Science Study (TIMSS) 2007 data. The student sample consisted of 4,220 students from Bosnia and Herzegovina and 4,043 students from Slovenia. After analysing the cognitive demands of TIMSS 2007 physics items, the corresponding differential item functioning (DIF)/differential group functioning contrasts were estimated. Approximately 40% of items exhibited large DIF contrasts, indicating significant differences between the cultures of physics education in Bosnia and Herzegovina and Slovenia. The relative strength of students from Bosnia and Herzegovina was shown to be mainly associated with the topic area 'Electricity and magnetism'. Classes of items that required knowledge of the experimental method, counterintuitive thinking, proportional reasoning and/or the use of complex knowledge structures proved to be differentially easier for students from Slovenia. In the light of the presented results, the common practice of ranking countries with respect to universally established cognitive categories seems to be potentially misleading.
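
One standard way to estimate such DIF contrasts is the Mantel-Haenszel common odds ratio; the sketch below is illustrative only and not necessarily the exact procedure used in the study.

```python
import math

def mantel_haenszel_or(strata):
    """Common odds ratio for one test item across ability strata.

    Each stratum is a 2x2 table (a, b, c, d): reference group right/wrong,
    focal group right/wrong. An odds ratio of 1.0 means no DIF."""
    num = sum(a * d / (a + b + c + d) for a, b, c, d in strata)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in strata)
    return num / den

def ets_delta(odds_ratio):
    """ETS delta metric; |delta| >= 1.5 is conventionally 'large' DIF."""
    return -2.35 * math.log(odds_ratio)
```

An odds ratio above 1 (negative delta) flags an item as differentially easier for the reference group after conditioning on ability.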

  7. Assessing the functional diversity of herbivorous reef fishes using a compound-specific stable isotope approach

    KAUST Repository

    Tietbohl, Matthew

    2016-12-01

Herbivorous coral reef fishes play an important role in helping to structure their environment, directly by consuming algae and indirectly by promoting coral health and growth. These fishes are generally separated into three broad groups: browsers, grazers, and excavators/scrapers, with each group often thought to have a fixed general function and all fishes within a group thought to have similar ecological roles. This categorization assumes a high level of functional redundancy within herbivorous fishes. However, recent evidence questions the use of this broad classification scheme and posits that there may actually be more resource partitioning within these functional groupings. Here, I use a compound-specific stable isotope approach (CSIA) to show that there appears to be a greater diversity of functional roles within broad functional groups than previously assumed. The δ13C signatures from essential amino acids of reef end-members (coral, macroalgae, detritus, and phytoplankton) and fish muscle were analyzed to investigate differences in resource use between fishes. Most end-members displayed clear isotopic differences, and most fishes within functional groups were dissimilar in their isotopic signature, implying differences in the resources they target. No grazers closely resembled each other isotopically, implying a much lower level of functional redundancy within this group; scraping parrotfish were also distinct from excavating parrotfish and, to a lesser degree, from one another. This study highlights the potential of CSIA to help distinguish fine-scale ecological differences within other groups of reef organisms as well. These results question the utility of lumping nominally herbivorous fishes into broad groups with assumed similar roles. Given the apparent functional differences between nominally herbivorous reef fishes, it is important for managers to incorporate the diversity of functional roles these fish play.

  8. Flax stems: from a specific architecture to an instructive model for bioinspired composite structures.

    Science.gov (United States)

    Baley, Christophe; Goudenhooft, Camille; Gibaud, Marianne; Bourmaud, Alain

    2018-01-10

The present paper proposes a careful study and description of the reinforcement mechanisms within a flax stem, an exceptional natural model of a composite structure. Through accurate microscopic investigations, using both optical and SEM methods, we depict the flax stem architecture in detail; it can be viewed as a composite structure with an outer protective layer, a unidirectional ply on the periphery and a porous core, each component having a specific function, such as mechanical reinforcement for the unidirectional ply and the porous core. The significant mechanical role of the fibres is underlined, as well as their local organisation in cohesive bundles, formed through intrusive growth and evidenced in this work by nanomechanical AFM measurements and 3D reconstruction. Following a biomimetic approach, these data provide a source of inspiration for the composite materials of tomorrow. © 2018 IOP Publishing Ltd.

  9. METHODOLOGICAL APPROACHES FOR MODELING THE RURAL SETTLEMENT DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Gorbenkova Elena Vladimirovna

    2017-10-01

Subject: the paper describes the research results on validation of a rural settlement development model. The basic methods and approaches for solving the problem of assessing the efficiency of urban and rural settlement development are considered. Research objectives: determination of methodological approaches to modeling and creating a model for the development of rural settlements. Materials and methods: domestic and foreign experience in modeling the territorial development of urban and rural settlements and settlement structures was generalized. The motivation for using the Pentagon-model for solving similar problems was demonstrated. Based on a systematic analysis of existing development models of urban and rural settlements, as well as the authors' method for assessing the level of agro-town development, the systems/factors that are necessary for the sustainable development of a rural settlement are identified. Results: we created a rural development model which consists of five major systems that include critical factors essential for achieving the sustainable development of a settlement system: an ecological system, an economic system, an administrative system, an anthropogenic (physical) system and a social system (supra-structure). The methodological approaches for creating an evaluation model of rural settlement development were revealed; the basic motivating factors that provide the interrelations of the systems were determined; the critical factors for each subsystem were identified and substantiated. Such an approach is justified by the composition of tasks for territorial planning at the local and state administration levels. The feasibility of applying the basic Pentagon-model, which has been successfully used for solving analogous problems of sustainable development, was shown. Conclusions: the resulting model can be used for identifying and substantiating the critical factors for rural sustainable development and also become the basis of

  10. Decision support models for solid waste management: Review and game-theoretic approaches

    Energy Technology Data Exchange (ETDEWEB)

    Karmperis, Athanasios C., E-mail: athkarmp@mail.ntua.gr [Sector of Industrial Management and Operational Research, School of Mechanical Engineering, National Technical University of Athens, Iroon Polytechniou 9, 15780 Athens (Greece); Army Corps of Engineers, Hellenic Army General Staff, Ministry of Defence (Greece); Aravossis, Konstantinos; Tatsiopoulos, Ilias P.; Sotirchos, Anastasios [Sector of Industrial Management and Operational Research, School of Mechanical Engineering, National Technical University of Athens, Iroon Polytechniou 9, 15780 Athens (Greece)

    2013-05-15

    Highlights: ► The mainly used decision support frameworks for solid waste management are reviewed. ► The LCA, CBA and MCDM models are presented and their strengths, weaknesses, similarities and possible combinations are analyzed. ► The game-theoretic approach in a solid waste management context is presented. ► The waste management bargaining game is introduced as a specific decision support framework. ► Cooperative and non-cooperative game-theoretic approaches to decision support for solid waste management are discussed. - Abstract: This paper surveys decision support models that are commonly used in the solid waste management area. Most models are mainly developed within three decision support frameworks, which are the life-cycle assessment, the cost–benefit analysis and the multi-criteria decision-making. These frameworks are reviewed and their strengths and weaknesses as well as their critical issues are analyzed, while their possible combinations and extensions are also discussed. Furthermore, the paper presents how cooperative and non-cooperative game-theoretic approaches can be used for the purpose of modeling and analyzing decision-making in situations with multiple stakeholders. Specifically, since a waste management model is sustainable when considering not only environmental and economic but also social aspects, the waste management bargaining game is introduced as a specific decision support framework in which future models can be developed.
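
The bargaining game itself is only introduced conceptually here; as a minimal numeric illustration, the classical two-player Nash bargaining solution over a divisible cost saving (all payoff values below are hypothetical) awards each stakeholder its disagreement payoff plus half of the cooperative surplus.

```python
def nash_split(total, d1, d2):
    """Two-player Nash bargaining over a divisible payoff `total`.

    Maximising the Nash product (x1 - d1) * (x2 - d2) subject to
    x1 + x2 = total gives each player its disagreement payoff plus
    half of the cooperative surplus."""
    surplus = total - d1 - d2
    if surplus < 0:
        return d1, d2          # disagreement: no mutually beneficial deal
    return d1 + surplus / 2, d2 + surplus / 2
```

For example, a municipality (disagreement payoff 2) and a contractor (disagreement payoff 4) bargaining over a joint saving of 10 would receive 4 and 6 respectively.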

  11. Improving Evolutionary Models for Mitochondrial Protein Data with Site-Class Specific Amino Acid Exchangeability Matrices

    Science.gov (United States)

    Dunn, Katherine A.; Jiang, Wenyi; Field, Christopher; Bielawski, Joseph P.

    2013-01-01

    Adequate modeling of mitochondrial sequence evolution is an essential component of mitochondrial phylogenomics (comparative mitogenomics). There is wide recognition within the field that lineage-specific aspects of mitochondrial evolution should be accommodated through lineage-specific amino-acid exchangeability matrices (e.g., mtMam for mammalian data). However, such a matrix must be applied to all sites and this implies that all sites are subject to the same, or largely similar, evolutionary constraints. This assumption is unjustified. Indeed, substantial differences are expected to arise from three-dimensional structures that impose different physiochemical environments on individual amino acid residues. The objectives of this paper are (1) to investigate the extent to which amino acid evolution varies among sites of mitochondrial proteins, and (2) to assess the potential benefits of explicitly modeling such variability. To achieve this, we developed a novel method for partitioning sites based on amino acid physiochemical properties. We apply this method to two datasets derived from complete mitochondrial genomes of mammals and fish, and use maximum likelihood to estimate amino acid exchangeabilities for the different groups of sites. Using this approach we identified large groups of sites evolving under unique physiochemical constraints. Estimates of amino acid exchangeabilities differed significantly among such groups. Moreover, we found that joint estimates of amino acid exchangeabilities do not adequately represent the natural variability in evolutionary processes among sites of mitochondrial proteins. Significant improvements in likelihood are obtained when the new matrices are employed. We also find that maximum likelihood estimates of branch lengths can be strongly impacted. We provide sets of matrices suitable for groups of sites subject to similar physiochemical constraints, and discuss how they might be used to analyze real data. We also discuss how
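
A deliberately simplified, hypothetical sketch of the partitioning idea is to classify alignment columns into site classes by their mean Kyte-Doolittle hydrophobicity; the paper's actual method is more elaborate, and the two-class scheme and zero threshold here are assumptions for illustration.

```python
# Kyte-Doolittle hydrophobicity scale
KD = {"A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "C": 2.5, "Q": -3.5,
      "E": -3.5, "G": -0.4, "H": -3.2, "I": 4.5, "L": 3.8, "K": -3.9,
      "M": 1.9, "F": 2.8, "P": -1.6, "S": -0.8, "T": -0.7, "W": -0.9,
      "Y": -1.3, "V": 4.2}

def partition_sites(alignment, threshold=0.0):
    """Split alignment columns into (hydrophobic, hydrophilic) site classes
    by the mean hydrophobicity of the residues observed at each site."""
    hydrophobic, hydrophilic = [], []
    for j in range(len(alignment[0])):
        mean_kd = sum(KD[seq[j]] for seq in alignment) / len(alignment)
        (hydrophobic if mean_kd >= threshold else hydrophilic).append(j)
    return hydrophobic, hydrophilic
```

Separate exchangeability matrices would then be estimated by maximum likelihood for each site class.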

  12. Improving evolutionary models for mitochondrial protein data with site-class specific amino acid exchangeability matrices.

    Directory of Open Access Journals (Sweden)

    Katherine A Dunn

Adequate modeling of mitochondrial sequence evolution is an essential component of mitochondrial phylogenomics (comparative mitogenomics). There is wide recognition within the field that lineage-specific aspects of mitochondrial evolution should be accommodated through lineage-specific amino-acid exchangeability matrices (e.g., mtMam for mammalian data). However, such a matrix must be applied to all sites and this implies that all sites are subject to the same, or largely similar, evolutionary constraints. This assumption is unjustified. Indeed, substantial differences are expected to arise from three-dimensional structures that impose different physiochemical environments on individual amino acid residues. The objectives of this paper are (1) to investigate the extent to which amino acid evolution varies among sites of mitochondrial proteins, and (2) to assess the potential benefits of explicitly modeling such variability. To achieve this, we developed a novel method for partitioning sites based on amino acid physiochemical properties. We apply this method to two datasets derived from complete mitochondrial genomes of mammals and fish, and use maximum likelihood to estimate amino acid exchangeabilities for the different groups of sites. Using this approach we identified large groups of sites evolving under unique physiochemical constraints. Estimates of amino acid exchangeabilities differed significantly among such groups. Moreover, we found that joint estimates of amino acid exchangeabilities do not adequately represent the natural variability in evolutionary processes among sites of mitochondrial proteins. Significant improvements in likelihood are obtained when the new matrices are employed. We also find that maximum likelihood estimates of branch lengths can be strongly impacted. We provide sets of matrices suitable for groups of sites subject to similar physiochemical constraints, and discuss how they might be used to analyze real data. We

  13. A stepwise approach for defining the applicability domain of SAR and QSAR models

    DEFF Research Database (Denmark)

    Dimitrov, Sabcho; Dimitrova, Gergana; Pavlov, Todor

    2005-01-01

    parametric requirements are imposed in the first stage, specifying in the domain only those chemicals that fall in the range of variation of the physicochemical properties of the chemicals in the training set. The second stage defines the structural similarity between chemicals that are correctly predicted...... by the model. The structural neighborhood of atom-centered fragments is used to determine this similarity. The third stage in defining the domain is based on a mechanistic understanding of the modeled phenomenon. Here, the model domain combines the reliability of specific reactive groups hypothesized to cause......, if metabolic activation of chemicals is a part of the (Q)SAR model. Some of the stages of the proposed approach for defining the model domain can be eliminated depending on the availability and quality of the experimental data used to derive the model, the specificity of (Q)SARs, and the goals...
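
The first stage described above, the parametric range check, reduces to verifying that each descriptor of a query chemical lies within the range spanned by the training set; a minimal sketch (the descriptor choice below is hypothetical):

```python
def in_parametric_domain(training_descriptors, query):
    """Stage 1 of the applicability-domain check: every descriptor of the
    query chemical must fall within the range spanned by the training set."""
    lows = [min(col) for col in zip(*training_descriptors)]
    highs = [max(col) for col in zip(*training_descriptors)]
    return all(lo <= q <= hi for lo, q, hi in zip(lows, query, highs))

# hypothetical two-descriptor training set: (logKow, molecular weight)
train = [(1.2, 100.0), (3.4, 250.0), (2.0, 180.0)]
```

Chemicals passing this stage would then be screened by the structural-similarity and mechanistic stages of the approach.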

  14. Pathogen-specific responses in the bovine udder. Models and immunoprophylactic concepts.

    Science.gov (United States)

    Petzl, Wolfram; Zerbe, Holm; Günther, Juliane; Seyfert, Hans-Martin; Hussen, Jamal; Schuberth, Hans-Joachim

    2018-02-01

Bovine mastitis is a disease with major economic impact on the dairy industry worldwide. Experimental in vivo infection models have been widely proven to be an effective tool for the investigation of pathogen-specific host immune responses. Staphylococcus aureus (S. aureus) and Escherichia coli (E. coli) are two common mastitis pathogens with opposite clinical outcomes of the disease. E. coli and S. aureus have proven to be valid surrogates to model clinical and subclinical mastitis, respectively. Contemporary transcriptome profiling studies demonstrated that the transcriptomic response in the teat reflects the course of pathogen-specific mastitis, being ultimately determined by the immune response of the mammary epithelial cells. After an experimental in vivo challenge, E. coli induces a vigorous early transcriptional response in udder tissue that is quantitatively and, notably, qualitatively distinct from the much weaker response to an S. aureus infection. E. coli mastitis models proved that the local response in the infected udder quarters is accompanied by a response in non-infected neighbouring udder quarters, systemically modulating their immune responsiveness. Immunomodulation of the udder was investigated in animal models. Pathophysiological consequences were studied after intramammary administration of cytokines, chemokines, growth factors, steroidal anti-inflammatory drugs, or priming of tissue-resident cells with pathogen-derived molecules. The latter approaches provided only temporary protection of the udder, transiently reducing the risk of infection but durably lowering the severity of any mastitis that did occur. They offer an alternative to vaccination trials, which over decades have likewise failed to yield protection against new infections. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Injury prevention risk communication: A mental models approach

    DEFF Research Database (Denmark)

    Austin, Laurel Cecelia; Fischhoff, Baruch

    2012-01-01

    Individuals' decisions and behaviour can play a critical role in determining both the probability and severity of injury. Behavioural decision research studies peoples' decision-making processes in terms comparable to scientific models of optimal choices, providing a basis for focusing...... interventions on the most critical opportunities to reduce risks. That research often seeks to identify the ‘mental models’ that underlie individuals' interpretations of their circumstances and the outcomes of possible actions. In the context of injury prevention, a mental models approach would ask why people...... and uses examples to discuss how the approach can be used to develop scientifically validated context-sensitive injury risk communications....

  16. A modeling approach to hospital location for effective marketing.

    Science.gov (United States)

    Cokelez, S; Peacock, E

    1993-01-01

This paper develops a mixed integer linear programming model for locating health care facilities. The parameters of the objective function of this model are based on factor-rating analysis and the grid method. Subjective and objective factors representative of real-life situations are incorporated into the model in a unique way, permitting a trade-off analysis of certain factors pertinent to the location of hospitals. This results in a unified approach and a single model whose credibility is further enhanced by the inclusion of geographical and demographical factors.
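
For small instances the location trade-off can even be enumerated directly; the sketch below brute-forces a p-facility version of the problem with hypothetical cost data (the paper's actual MILP formulation, solved with an integer-programming solver, scales far better):

```python
from itertools import combinations

def locate_facilities(fixed_cost, travel_cost, p):
    """Open exactly p sites minimising fixed cost plus each demand zone's
    travel cost to its nearest open site.

    travel_cost[j][i] is the cost from demand zone j to candidate site i.
    Brute force over all site subsets -- fine for tiny instances only."""
    best = (float("inf"), None)
    for open_sites in combinations(range(len(fixed_cost)), p):
        cost = sum(fixed_cost[i] for i in open_sites)
        cost += sum(min(zone[i] for i in open_sites) for zone in travel_cost)
        best = min(best, (cost, open_sites))
    return best

fixed = [5, 3, 4]                            # hypothetical site opening costs
travel = [[1, 4, 9], [8, 2, 7], [6, 5, 1]]   # zone-to-site travel costs
best_cost, best_sites = locate_facilities(fixed, travel, 2)
```

Subjective factor ratings enter such a model by weighting the cost coefficients before optimisation.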

  17. Mathematical and computer modeling of electro-optic systems using a generic modeling approach

    OpenAIRE

    Smith, M.I.; Murray-Smith, D.J.; Hickman, D.

    2007-01-01

    The conventional approach to modelling electro-optic sensor systems is to develop separate models for individual systems or classes of system, depending on the detector technology employed in the sensor and the application. However, this ignores commonality in design and in components of these systems. A generic approach is presented for modelling a variety of sensor systems operating in the infrared waveband that also allows systems to be modelled with different levels of detail and at diffe...

  18. Deep Appearance Models: A Deep Boltzmann Machine Approach for Face Modeling

    OpenAIRE

    Duong, Chi Nhan; Luu, Khoa; Quach, Kha Gia; Bui, Tien D.

    2016-01-01

    The "interpretation through synthesis" approach to analyze face images, particularly Active Appearance Models (AAMs) method, has become one of the most successful face modeling approaches over the last two decades. AAM models have ability to represent face images through synthesis using a controllable parameterized Principal Component Analysis (PCA) model. However, the accuracy and robustness of the synthesized faces of AAM are highly depended on the training sets and inherently on the genera...

  19. A component-based approach to integrated modeling in the geosciences: The design of CSDMS

    Science.gov (United States)

    Peckham, Scott D.; Hutton, Eric W. H.; Norris, Boyana

    2013-04-01

    Development of scientific modeling software increasingly requires the coupling of multiple, independently developed models. Component-based software engineering enables the integration of plug-and-play components, but significant additional challenges must be addressed in any specific domain in order to produce a usable development and simulation environment that also encourages contributions and adoption by entire communities. In this paper we describe the challenges in creating a coupling environment for Earth-surface process modeling and the innovative approach that we have developed to address them within the Community Surface Dynamics Modeling System.
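
The plug-and-play idea can be sketched with two toy components coupled through a shared state field by a driver. The class and method names below are invented for illustration and are not the actual CSDMS/BMI interface:

```python
class UpliftComponent:
    """Raises surface elevation at a constant rate on a shared 1-D grid."""
    def initialize(self, n_cells, rate):
        self.z, self.rate = [0.0] * n_cells, rate

    def update(self, dt):
        self.z = [zi + self.rate * dt for zi in self.z]


class DiffusionComponent:
    """Hillslope diffusion acting on an elevation field passed in by a driver."""
    def initialize(self, kappa, dx):
        self.kappa, self.dx = kappa, dx

    def update(self, z, dt):
        lap = [0.0] + [(z[i - 1] - 2 * z[i] + z[i + 1]) / self.dx ** 2
                       for i in range(1, len(z) - 1)] + [0.0]
        return [zi + self.kappa * li * dt for zi, li in zip(z, lap)]


# Driver: couples the two components through the shared elevation field.
uplift, diffusion = UpliftComponent(), DiffusionComponent()
uplift.initialize(n_cells=5, rate=1.0)
diffusion.initialize(kappa=0.2, dx=1.0)
for _ in range(3):
    uplift.update(dt=1.0)
    uplift.z = diffusion.update(uplift.z, dt=1.0)
```

Because each component exposes the same small interface, the driver can swap either process model for another implementation without rewriting the rest of the simulation.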

  20. Earthquake response analysis of RC bridges using simplified modeling approaches

    Science.gov (United States)

    Lee, Do Hyung; Kim, Dookie; Park, Taehyo

    2009-07-01

    In this paper, simplified modeling approaches describing the hysteretic behavior of reinforced concrete bridge piers are proposed. For this purpose, flexure-axial and shear-axial interaction models are developed and implemented into a nonlinear finite element analysis program. Comparative verifications for reinforced concrete columns prove that the analytical predictions obtained with the new formulations show good correlation with experimental results under various levels of axial forces and section types. In addition, analytical correlation studies for the inelastic earthquake response of reinforced concrete bridge structures are also carried out using the simplified modeling approaches. Relatively good agreement is observed in the results between the current modeling approach and the elaborated fiber models. It is thus encouraging that the present developments and approaches are capable of identifying the contribution of deformation mechanisms correctly. Subsequently, the present developments can be used as a simple yet effective tool for the deformation capacity evaluation of reinforced concrete columns in general and reinforced concrete bridge piers in particular.
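
A minimal example of the kind of hysteretic law such simplified models build on is the bilinear model with kinematic bounding lines (parameter values hypothetical; the paper's flexure-axial and shear-axial interaction models are considerably richer):

```python
def bilinear_hysteresis(path, k0, fy, alpha):
    """Force history of a bilinear hysteretic spring driven through `path`.

    k0: initial (elastic) stiffness, fy: yield force,
    alpha: post-yield stiffness ratio. The elastic predictor is clipped
    to the two kinematic bounding lines f = +/-fy + alpha*k0*d."""
    forces, f, d_prev = [], 0.0, 0.0
    for d in path:
        f += k0 * (d - d_prev)                 # elastic predictor
        f = min(max(f, -fy + alpha * k0 * d),  # lower bounding line
                fy + alpha * k0 * d)           # upper bounding line
        forces.append(f)
        d_prev = d
    return forces
```

Driving the spring through a loading-unloading cycle traces the open hysteresis loop that dissipates energy during earthquake excitation.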

  1. Bianchi VI0 and III models: self-similar approach

    International Nuclear Information System (INIS)

    Belinchon, Jose Antonio

    2009-01-01

We study several cosmological models with Bianchi VI0 and III symmetries under the self-similar approach. We find new solutions for the 'classical' perfect fluid model as well as for the vacuum model, although they are very restrictive with respect to the equation of state. We also study a perfect fluid model with time-varying constants, G and Λ. As in other models studied, we find that the behaviours of G and Λ are related: if G behaves as a growing function of time then Λ is a positive decreasing function of time, but if G is decreasing then Λ is negative. We end by studying a massive cosmic string model, putting special emphasis on calculating the numerical values of the equations of state. We show that there is no self-similar solution for a string model with time-varying constants.

  2. Modeling and control approach to a distinctive quadrotor helicopter.

    Science.gov (United States)

    Wu, Jun; Peng, Hui; Chen, Qing; Peng, Xiaoyan

    2014-01-01

The quadrotor helicopter considered in this paper has a unique configuration. It is more complex than commonly used quadrotors because of its imprecisely known parameters, non-ideal symmetric structure and unknown nonlinear dynamics. A novel method is presented to handle its modeling and control problems: a MIMO RBF neural-network-based state-dependent ARX (RBF-ARX) model is adopted to represent its nonlinear dynamics, and a MIMO RBF-ARX model-based global LQR controller is then proposed to stabilize the quadrotor's attitude. By comparison with a physical model-based LQR controller and an ARX model-set-based gain-scheduling LQR controller, the superiority of the MIMO RBF-ARX model-based control approach is confirmed. This successful application verifies the validity of the MIMO RBF-ARX modeling method for a quadrotor helicopter with complex nonlinearity. © 2013 Published by ISA. All rights reserved.
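
At each operating point, an RBF-ARX based global LQR design ultimately reduces to solving a discrete-time Riccati equation for the locally linearised dynamics. The scalar sketch below shows only that final LQR step, with hypothetical system values rather than the quadrotor's actual model:

```python
def dlqr_scalar(a, b, q, r, iters=200):
    """Steady-state LQR gain for x[k+1] = a*x[k] + b*u[k] with cost
    sum(q*x**2 + r*u**2), via backward iteration of the Riccati recursion."""
    p = q
    for _ in range(iters):
        k = (b * p * a) / (r + b * b * p)   # gain implied by current p
        p = q + a * p * (a - b * k)         # Riccati update
    return k   # optimal state feedback u[k] = -k * x[k]
```

For an unstable plant such as a = 1.2, b = 1, the resulting feedback u = -k x places the closed-loop pole a - b*k inside the unit circle.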

  3. Software sensors based on the grey-box modelling approach

    DEFF Research Database (Denmark)

    Carstensen, J.; Harremoës, P.; Strube, Rune

    1996-01-01

In recent years the grey-box modelling approach has been applied to wastewater transportation and treatment. Grey-box models are characterized by the combination of deterministic and stochastic terms to form a model where all the parameters are statistically identifiable from the on-line measurements. With respect to the development of software sensors, the grey-box models possess two important features. Firstly, the on-line measurements can be filtered according to the grey-box model in order to remove noise deriving from the measuring equipment and controlling devices. Secondly, the grey-box models may contain terms which can be estimated on-line by use of the models and measurements. In this paper, it is demonstrated that many storage basins in sewer systems can be used as an on-line flow measurement provided that the basin is monitored on-line with a level transmitter and that a grey
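
The filtering feature described above is essentially state estimation; a scalar Kalman filter shows the principle of combining a deterministic state model with statistically identified noise terms, i.e. a grey-box model in miniature (parameter values hypothetical):

```python
def kalman_1d(measurements, process_var, meas_var, x0=0.0, p0=1.0):
    """Scalar Kalman filter over a random-walk state model -- the grey-box
    pattern of a deterministic model plus statistically identified noise."""
    x, p, filtered = x0, p0, []
    for z in measurements:
        p += process_var              # predict: state uncertainty grows
        k = p / (p + meas_var)        # Kalman gain from the two variances
        x += k * (z - x)              # correct with measurement innovation
        p *= 1 - k
        filtered.append(x)
    return filtered
```

Applied to a noisy basin level signal, the filtered state can then be differentiated against the basin geometry to serve as a software flow sensor.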

  4. Environmental Radiation Effects on Mammals A Dynamical Modeling Approach

    CERN Document Server

    Smirnova, Olga A

    2010-01-01

    This text is devoted to the theoretical studies of radiation effects on mammals. It uses the framework of developed deterministic mathematical models to investigate the effects of both acute and chronic irradiation in a wide range of doses and dose rates on vital body systems including hematopoiesis, small intestine and humoral immunity, as well as on the development of autoimmune diseases. Thus, these models can contribute to the development of the system and quantitative approaches in radiation biology and ecology. This text is also of practical use. Its modeling studies of the dynamics of granulocytopoiesis and thrombocytopoiesis in humans testify to the efficiency of employment of the developed models in the investigation and prediction of radiation effects on these hematopoietic lines. These models, as well as the properly identified models of other vital body systems, could provide a better understanding of the radiation risks to health. The modeling predictions will enable the implementation of more ef...

  5. Bayesian specification analysis and estimation of simultaneous equation models using Monte Carlo methods

    NARCIS (Netherlands)

    A. Zellner (Arnold); L. Bauwens (Luc); H.K. van Dijk (Herman)

    1988-01-01

    textabstractBayesian procedures for specification analysis or diagnostic checking of modeling assumptions for structural equations of econometric models are developed and applied using Monte Carlo numerical methods. Checks on the validity of identifying restrictions, exogeneity assumptions and other

  6. Modelling Cyclic Walking in Femurs With Metastatic Lesions : Femur-Specific Accumulation of Plasticity

    NARCIS (Netherlands)

    Derikx, L.; Janssen, D.; Schepers, J.; Wesseling, M.; Verdonschot, N.; Jonkers, I.; Tanck, E.

    2015-01-01

    Introduction Clinical fracture risk assessment in metastatic bone disease is extremely difficult, but subject-specific finite element (FE) modelling may improve these assessments in the future [Derikx, 2015]. By coupling to musculoskeletal modelling, realistic loading conditions can be implemented

  7. A patient-specific computational model of hypoxia-modulated radiation resistance in glioblastoma using 18F-FMISO-PET.

    Science.gov (United States)

    Rockne, Russell C; Trister, Andrew D; Jacobs, Joshua; Hawkins-Daarud, Andrea J; Neal, Maxwell L; Hendrickson, Kristi; Mrugala, Maciej M; Rockhill, Jason K; Kinahan, Paul; Krohn, Kenneth A; Swanson, Kristin R

    2015-02-06

    Glioblastoma multiforme (GBM) is a highly invasive primary brain tumour that has poor prognosis despite aggressive treatment. A hallmark of these tumours is diffuse invasion into the surrounding brain, necessitating a multi-modal treatment approach, including surgery, radiation and chemotherapy. We have previously demonstrated the ability of our model to predict radiographic response immediately following radiation therapy in individual GBM patients using a simplified geometry of the brain and theoretical radiation dose. Using only two pre-treatment magnetic resonance imaging scans, we calculate net rates of proliferation and invasion as well as radiation sensitivity for a patient's disease. Here, we present the application of our clinically targeted modelling approach to a single glioblastoma patient as a demonstration of our method. We apply our model in the full three-dimensional architecture of the brain to quantify the effects of regional resistance to radiation owing to hypoxia in vivo determined by [(18)F]-fluoromisonidazole positron emission tomography (FMISO-PET) and the patient-specific three-dimensional radiation treatment plan. Incorporation of hypoxia into our model with FMISO-PET increases the model-data agreement by an order of magnitude. This improvement was robust to our definition of hypoxia or the degree of radiation resistance quantified with the FMISO-PET image and our computational model, respectively. This work demonstrates a useful application of patient-specific modelling in personalized medicine and how mathematical modelling has the potential to unify multi-modality imaging and radiation treatment planning.
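
The underlying proliferation-invasion model is a reaction-diffusion equation, du/dt = D ∇²u + ρ u(1 − u). A one-dimensional explicit finite-difference sketch is given below (grid and rates hypothetical; the paper works in full 3D patient anatomy with hypoxia-modulated radiosensitivity):

```python
def simulate_invasion_1d(n=101, steps=500, d=0.1, rho=0.05, dt=0.1, dx=1.0):
    """Explicit finite-difference integration of the proliferation-invasion
    (Fisher-KPP) equation du/dt = d*u_xx + rho*u*(1 - u), seeded centrally.
    Explicit stability requires d*dt/dx**2 <= 0.5 (here 0.01)."""
    u = [0.0] * n
    u[n // 2] = 0.1                      # small initial tumour seed
    for _ in range(steps):
        nxt = u[:]
        for i in range(1, n - 1):
            lap = (u[i - 1] - 2.0 * u[i] + u[i + 1]) / dx ** 2
            nxt[i] = u[i] + dt * (d * lap + rho * u[i] * (1.0 - u[i]))
        u = nxt
    return u

profile = simulate_invasion_1d()
```

The parameters d (net invasion) and rho (net proliferation) are exactly the patient-specific rates the authors estimate from serial MRI.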

  8. Post-closure biosphere assessment modelling: comparison of complex and more stylised approaches.

    Science.gov (United States)

    Walke, Russell C; Kirchner, Gerald; Xu, Shulan; Dverstorp, Björn

    2015-10-01

    Geological disposal facilities are the preferred option for high-level radioactive waste, due to their potential to provide isolation from the surface environment (biosphere) on very long timescales. Assessments need to strike a balance between stylised models and more complex approaches that draw more extensively on site-specific information. This paper explores the relative merits of complex versus more stylised biosphere models in the context of a site-specific assessment. The more complex biosphere modelling approach was developed by the Swedish Nuclear Fuel and Waste Management Co (SKB) for the Formark candidate site for a spent nuclear fuel repository in Sweden. SKB's approach is built on a landscape development model, whereby radionuclide releases to distinct hydrological basins/sub-catchments (termed 'objects') are represented as they evolve through land rise and climate change. Each of seventeen of these objects is represented with more than 80 site specific parameters, with about 22 that are time-dependent and result in over 5000 input values per object. The more stylised biosphere models developed for this study represent releases to individual ecosystems without environmental change and include the most plausible transport processes. In the context of regulatory review of the landscape modelling approach adopted in the SR-Site assessment in Sweden, the more stylised representation has helped to build understanding in the more complex modelling approaches by providing bounding results, checking the reasonableness of the more complex modelling, highlighting uncertainties introduced through conceptual assumptions and helping to quantify the conservatisms involved. The more stylised biosphere models are also shown capable of reproducing the results of more complex approaches. A major recommendation is that biosphere assessments need to justify the degree of complexity in modelling approaches as well as simplifying and conservative assumptions. 
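
A stylised biosphere model of the kind compared here can be as simple as a pair of linearly coupled compartments. The sketch below is an invented two-compartment example (constant release into soil, first-order transfer to water, loss, and radioactive decay); the rate constants are illustrative, not site-specific values from the assessment.

```python
import numpy as np

# Stylised two-compartment biosphere model: constant radionuclide release
# into soil, first-order transfer to water, loss from water, and decay in
# both compartments. Rate constants (per year) are invented for
# illustration, not site-specific values.

def run_compartments(years=200.0, release=1.0, k_soil_water=0.05,
                     k_loss=0.01, half_life=30.0, dt=0.1):
    decay = np.log(2.0) / half_life
    soil = water = 0.0
    for _ in range(int(years / dt)):       # forward-Euler time stepping
        d_soil = release - (k_soil_water + decay) * soil
        d_water = k_soil_water * soil - (k_loss + decay) * water
        soil += dt * d_soil
        water += dt * d_water
    return soil, water

soil, water = run_compartments()
# At steady state, soil inventory ~ release / (k_soil_water + decay)
print(abs(soil - 1.0 / (0.05 + np.log(2.0) / 30.0)) < 0.05)  # prints True
```

Such a model has a handful of parameters and an analytically checkable steady state, which is precisely what makes it useful for bounding and sanity-checking a 5000-parameter landscape model.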

  9. Modelling dynamic ecosystems : venturing beyond boundaries with the Ecopath approach

    OpenAIRE

    Coll, Marta; Akoglu, E.; Arreguin-Sanchez, F.; Fulton, E. A.; Gascuel, D.; Heymans, J. J.; Libralato, S.; Mackinson, S.; Palomera, I.; Piroddi, C.; Shannon, L. J.; Steenbeek, J.; Villasante, S.; Christensen, V.

    2015-01-01

    Thirty years of progress using the Ecopath with Ecosim (EwE) approach in different fields such as ecosystem impacts of fishing and climate change, emergent ecosystem dynamics, ecosystem-based management, and marine conservation and spatial planning were showcased November 2014 at the conference "Ecopath 30 years-modelling dynamic ecosystems: beyond boundaries with EwE". Exciting new developments include temporal-spatial and end-to-end modelling, as well as novel applications to environmental ...

  10. Regularization of quantum gravity in the matrix model approach

    International Nuclear Information System (INIS)

    Ueda, Haruhiko

    1991-02-01

    We study the divergence problem of the partition function in the matrix model approach to two-dimensional quantum gravity. We propose a new model V(φ) = 1/2 Tr φ^2 + (g_4/N) Tr φ^4 + (g'/N^4)(Tr φ^4)^2 and show that in the sphere case it has no divergence problem and that the critical exponent is that of pure gravity. (author)
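
In standard one-matrix-model notation (conventions may differ in detail from the paper's), the partition function with this modified potential reads:

```latex
Z = \int d\phi \, e^{-V(\phi)},
\qquad
V(\phi) = \frac{1}{2}\,\mathrm{Tr}\,\phi^{2}
        + \frac{g_4}{N}\,\mathrm{Tr}\,\phi^{4}
        + \frac{g'}{N^{4}}\left(\mathrm{Tr}\,\phi^{4}\right)^{2}
```

where \(\phi\) is an \(N \times N\) Hermitian matrix; the double-trace term \(\left(\mathrm{Tr}\,\phi^{4}\right)^{2}\) is the new ingredient whose effect on the sphere partition function is studied.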

  11. Development of a Conservative Model Validation Approach for Reliable Analysis

    Science.gov (United States)

    2015-01-01

    DETC2015-46982, DEVELOPMENT OF A CONSERVATIVE MODEL VALIDATION APPROACH FOR RELIABLE... [DRAFT], CIE 2015, August 2-5, 2015, Boston, Massachusetts, USA. Excerpts: ...a PDF and a probability of failure are selected from these predicted output PDFs at a user-specified conservativeness level for validation. ...the conservativeness level, the conservative probability of failure obtained from Section 4 must be maintained. The mathematical formulation of conservative model...

  12. Patient-specific induced pluripotent stem cells in neurological disease modeling: the importance of nonhuman primate models

    Directory of Open Access Journals (Sweden)

    Qiu Z

    2013-07-01

    Full Text Available Zhifang Qiu,1,2 Steven L Farnsworth,2 Anuja Mishra,1,2 Peter J Hornsby1,2 1Geriatric Research Education and Clinical Center, South Texas Veterans Health Care System, San Antonio, TX, USA; 2Barshop Institute for Longevity and Aging Studies, University of Texas Health Science Center, San Antonio, TX, USA Abstract: The development of the technology for derivation of induced pluripotent stem (iPS) cells from human patients and animal models has opened up new pathways to the better understanding of many human diseases, and has created new opportunities for therapeutic approaches. Here, we consider one important neurological disease, Parkinson's, the development of relevant neural cell lines for studying this disease, and the animal models that are available for testing the survival and function of the cells, following transplantation into the central nervous system. Rapid progress has been made recently in the application of protocols for neuroectoderm differentiation and neural patterning of pluripotent stem cells. These developments have resulted in the ability to produce large numbers of dopaminergic neurons with midbrain characteristics for further study. These cells have been shown to be functional in both rodent and nonhuman primate (NHP) models of Parkinson's disease. Patient-specific iPS cells and derived dopaminergic neurons have been developed, in particular from patients with genetic causes of Parkinson's disease. For complete modeling of the disease, it is proposed that the introduction of genetic changes into NHP iPS cells, followed by studying the phenotype of the genetic change in cells transplanted into the NHP as host animal, will yield new insights into disease processes not possible with rodent models alone. Keywords: Parkinson's disease, pluripotent cell differentiation, neural cell lines, dopaminergic neurons, cell transplantation, animal models

  13. Improvement of tool support of the spatial approach to regional planning: problems, specifics, trends

    Directory of Open Access Journals (Sweden)

    Nataliya Gennadievna Yushkova

    2015-01-01

    Full Text Available The emerging imperatives of innovation economic development in Russia determine the content of conceptual and institutional constraints to the development of regional economic systems (RES. They consider the regional planning system as a leading priority in its inseparable unity with modern public administration tasks. However, the practice of developing long-term plans in the RF subjects proves that the innovation challenges of economic policy are not reflected properly in them, or are significantly distorted. The following reasons reduce the effectiveness of modernization processes in the RF subjects and hamper the appropriate reaction of RES to their impact: the lack of coordination between socio-economic and spatial regional plans, the imbalance of interaction between state authorities engaged in long-term planning, and the lack of real prerequisites for the implementation of innovation initiatives in the regions. Systematization and analysis of long-term plans make it possible to substantiate the consistency of the spatial approach to regional planning, expressed in the dominance of the transformational function that synchronizes the configuration and parameters of RES, and to establish ways to integrate spatial components in the system of regional planning through optimization of its tool support. The change in the content of the instrumentation support is based on the synthesis of the predominant basic characteristics of the existing tools used in isolated subsystems of regional planning of socio-economic and territorial development. The study has established a system of tool support for regional planning that adapts to changes in both internal and external factors in the development of RES. Three main groups of tools (organizing, regulating, and coordinating) are defined by their typing in accordance with the groups of management functions. The article proposes the modeling of combinations of tools that are subordinated to the

  14. Comparative flood damage model assessment: towards a European approach

    Science.gov (United States)

    Jongman, B.; Kreibich, H.; Apel, H.; Barredo, J. I.; Bates, P. D.; Feyen, L.; Gericke, A.; Neal, J.; Aerts, J. C. J. H.; Ward, P. J.

    2012-12-01

    There is a wide variety of flood damage models in use internationally, differing substantially in their approaches and economic estimates. Since these models are being used more and more as a basis for investment and planning decisions on an increasingly large scale, there is a need to reduce the uncertainties involved and develop a harmonised European approach, in particular with respect to the EU Flood Risks Directive. In this paper we present a qualitative and quantitative assessment of seven flood damage models, using two case studies of past flood events in Germany and the United Kingdom. The qualitative analysis shows that modelling approaches vary strongly, and that current methodologies for estimating infrastructural damage are not as well developed as methodologies for the estimation of damage to buildings. The quantitative results show that the model outcomes are very sensitive to uncertainty in both vulnerability (i.e. depth-damage functions) and exposure (i.e. asset values), whereby the former has a larger effect than the latter. We conclude that care needs to be taken when using aggregated land use data for flood risk assessment, and that it is essential to adjust asset values to the regional economic situation and property characteristics. We call for the development of a flexible but consistent European framework that applies best practice from existing models while providing room for including necessary regional adjustments.
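
The depth-damage functions named in the abstract map inundation depth to a damage fraction of the exposed asset value. The sketch below shows that mechanism with an invented residential curve and asset value; none of the numbers come from the seven assessed models.

```python
import numpy as np

# Stage-damage (depth-damage) mechanism common to the compared models:
# a vulnerability curve maps inundation depth to a damage fraction, which
# then scales the exposed asset value. Curve points and asset value are
# invented for illustration.

DEPTHS = np.array([0.0, 0.5, 1.0, 2.0, 4.0])         # inundation depth (m)
FRACTIONS = np.array([0.0, 0.15, 0.30, 0.55, 0.80])  # damage fraction (-)

def flood_damage(depth_m, asset_value):
    """Interpolate the depth-damage curve and scale by exposure."""
    frac = np.interp(depth_m, DEPTHS, FRACTIONS)     # clamps outside range
    return frac * asset_value

print(flood_damage(1.5, 200_000.0))   # halfway between the 1 m and 2 m points
```

The paper's sensitivity finding maps directly onto the two inputs here: uncertainty in FRACTIONS (vulnerability) and in asset_value (exposure).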

  15. A Final Approach Trajectory Model for Current Operations

    Science.gov (United States)

    Gong, Chester; Sadovsky, Alexander

    2010-01-01

    Predicting accurate trajectories with limited intent information is a challenge faced by air traffic management decision support tools in operation today. One such tool is the FAA's Terminal Proximity Alert system which is intended to assist controllers in maintaining safe separation of arrival aircraft during final approach. In an effort to improve the performance of such tools, two final approach trajectory models are proposed; one based on polynomial interpolation, the other on the Fourier transform. These models were tested against actual traffic data and used to study effects of the key final approach trajectory modeling parameters of wind, aircraft type, and weight class, on trajectory prediction accuracy. Using only the limited intent data available to today's ATM system, both the polynomial interpolation and Fourier transform models showed improved trajectory prediction accuracy over a baseline dead reckoning model. Analysis of actual arrival traffic showed that this improved trajectory prediction accuracy leads to improved inter-arrival separation prediction accuracy for longer look ahead times. The difference in mean inter-arrival separation prediction error between the Fourier transform and dead reckoning models was 0.2 nmi for a look ahead time of 120 sec, a 33 percent improvement, with a corresponding 32 percent improvement in standard deviation.
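
The two prediction ideas can be contrasted on a synthetic, smoothly decelerating track (invented units and numbers): dead reckoning extrapolates a constant speed from the last two fixes, while a low-order polynomial fit to the track history captures the deceleration. The paper's models also use Fourier-series fits; this sketch shows only the polynomial half of the comparison.

```python
import numpy as np

# Synthetic decelerating final-approach track (illustrative units).
t_hist = np.arange(0.0, 60.0, 5.0)            # 60 s of track history
x_hist = 100.0 * t_hist - 0.3 * t_hist**2     # decelerating along-track pos.

def dead_reckoning(t_pred):
    # Constant speed estimated from the last two fixes.
    v = (x_hist[-1] - x_hist[-2]) / (t_hist[-1] - t_hist[-2])
    return x_hist[-1] + v * (t_pred - t_hist[-1])

def poly_predict(t_pred, degree=2):
    # Least-squares polynomial fit to the whole history, then extrapolate.
    coeffs = np.polyfit(t_hist, x_hist, degree)
    return np.polyval(coeffs, t_pred)

t_pred = 120.0
truth = 100.0 * t_pred - 0.3 * t_pred**2
print(abs(poly_predict(t_pred) - truth) <
      abs(dead_reckoning(t_pred) - truth))    # prints True
```

Because the synthetic track is exactly quadratic, the degree-2 fit recovers it and the dead-reckoning error grows with look-ahead time, mirroring the paper's finding that the benefit appears at longer look-ahead times.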

  16. The Generalised Ecosystem Modelling Approach in Radiological Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Klos, Richard

    2008-03-15

    An independent modelling capability is required by SSI in order to evaluate dose assessments carried out in Sweden by, amongst others, SKB. The main focus is the evaluation of the long-term radiological safety of radioactive waste repositories for both spent fuel and low-level radioactive waste. To meet the requirement for an independent modelling tool for use in biosphere dose assessments, SSI through its modelling team CLIMB commissioned the development of a new model in 2004, a project to produce an integrated model of radionuclides in the landscape. The generalised ecosystem modelling approach (GEMA) is the result. GEMA is a modular system of compartments representing the surface environment. It can be configured, through water and solid material fluxes, to represent local details in the range of ecosystem types found in the past, present and future Swedish landscapes. The approach is generic but fine tuning can be carried out using local details of the surface drainage system. The modular nature of the modelling approach means that GEMA modules can be linked to represent large scale surface drainage features over an extended domain in the landscape. System change can also be managed in GEMA, allowing a flexible and comprehensive model of the evolving landscape to be constructed. Environmental concentrations of radionuclides can be calculated and the GEMA dose pathway model provides a means of evaluating the radiological impact of radionuclide release to the surface environment. This document sets out the philosophy and details of GEMA and illustrates the functioning of the model with a range of examples featuring the recent CLIMB review of SKB's SR-Can assessment.
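
The final step of such a dose pathway model is conceptually simple: an environmental activity concentration is turned into an annual effective dose via intake rates and a dose coefficient. The function name and all numbers below are illustrative, not GEMA's actual parameterisation.

```python
# Hedged sketch of a single ingestion dose pathway: annual dose (Sv/yr) =
# concentration (Bq/kg) * intake (kg/yr) * dose coefficient (Sv/Bq).
# All values are invented for illustration.

def annual_ingestion_dose(conc_bq_per_kg, intake_kg_per_yr,
                          dose_coeff_sv_per_bq):
    return conc_bq_per_kg * intake_kg_per_yr * dose_coeff_sv_per_bq

# e.g. 50 Bq/kg in food, 100 kg/yr intake, 1.3e-8 Sv/Bq (illustrative)
dose = annual_ingestion_dose(50.0, 100.0, 1.3e-8)
print(dose)   # annual effective dose in Sv per year
```

A full model sums such terms over radionuclides and pathways (ingestion, inhalation, external exposure) using the compartment concentrations computed upstream.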

  17. River water quality model no. 1 (RWQM1): I. Modelling approach

    DEFF Research Database (Denmark)

    Shanahan, P.; Borchardt, D.; Henze, Mogens

    2001-01-01

    Successful river water quality modelling requires the specification of an appropriate model structure and process formulation. Both must be related to the compartment structure of running water ecosystems including their longitudinal, vertical, and lateral zonation patterns. Furthermore...

  18. Combining engineering and data-driven approaches: Development of a generic fire risk model facilitating calibration

    DEFF Research Database (Denmark)

    De Sanctis, G.; Fischer, K.; Kohler, J.

    2014-01-01

    Fire risk models support decision making for engineering problems under the consistent consideration of the associated uncertainties. Empirical approaches can be used for cost-benefit studies when enough data about the decision problem are available. But often the empirical approaches...... are not detailed enough. Engineering risk models, on the other hand, may be detailed but typically involve assumptions that may result in a biased risk assessment and make a cost-benefit study problematic. In two related papers it is shown how engineering and data-driven modeling can be combined by developing...... a generic risk model that is calibrated to observed fire loss data. Generic risk models assess the risk of buildings based on specific risk indicators and support risk assessment at a portfolio level. After an introduction to the principles of generic risk assessment, the focus of the present paper...
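
The calibration idea can be illustrated with a toy version: a generic engineering model with one free parameter predicts expected loss per building from a risk indicator, and that parameter is fitted to observed loss data. Model form, indicator, and synthetic data below are all invented; the papers calibrate to real fire loss data.

```python
import numpy as np

# Toy calibration of a generic risk model: predicted loss = theta * sqrt(area),
# with theta fitted to (synthetic) observed losses by least squares.
rng = np.random.default_rng(0)
area = rng.uniform(100.0, 2000.0, size=200)     # risk indicator (m^2)
theta_true = 0.8
losses = theta_true * np.sqrt(area) * rng.lognormal(0.0, 0.3, size=200)

def predicted_loss(theta, area):
    return theta * np.sqrt(area)                # generic model form

# Closed-form least squares, since the model is linear in theta.
phi = np.sqrt(area)
theta_hat = float(phi @ losses / (phi @ phi))
print(round(theta_hat, 3))                      # close to theta_true = 0.8
```

In the papers the same logic is applied with more structure: the generic model supplies the functional form from engineering reasoning, and the observed loss data anchor its free parameters, combining the strengths of both approaches.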

  19. Modeling of phase equilibria with CPA using the homomorph approach

    DEFF Research Database (Denmark)

    Breil, Martin Peter; Tsivintzelis, Ioannis; Kontogeorgis, Georgios

    2011-01-01

    For association models, like CPA and SAFT, a classical approach is often used for estimating pure-compound and mixture parameters. According to this approach, the pure-compound parameters are estimated from vapor pressure and liquid density data. Then, the binary interaction parameters, kij......, are estimated from binary systems; one binary interaction parameter per system. No additional mixing rules are needed for cross-associating systems, but combining rules are required, e.g. the Elliott rule or the so-called CR-1 rule. There is a very large class of mixtures, e.g. water or glycols with aromatic...... interaction parameters are often used for solvating systems; one for the physical part (kij) and one for the association part (βcross). This limits the predictive capabilities and possibilities of generalization of the model. In this work we present an approach to reduce the number of adjustable parameters...
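
The combining rules mentioned here have a simple form: in the CR-1 rule the cross-association energy is the arithmetic mean, and the cross-association volume the geometric mean, of the pure-compound values; for solvating systems a beta_cross is instead often fitted, as the abstract notes. The numeric inputs in the sketch are illustrative, not regressed CPA parameters.

```python
import math

# CR-1 combining rule for cross-association in CPA/SAFT-type models:
#   eps_cross  = (eps_a + eps_b) / 2        (arithmetic mean)
#   beta_cross = sqrt(beta_a * beta_b)      (geometric mean)
# Input values are illustrative, not regressed pure-compound parameters.

def cr1(eps_a, beta_a, eps_b, beta_b):
    eps_cross = 0.5 * (eps_a + eps_b)
    beta_cross = math.sqrt(beta_a * beta_b)
    return eps_cross, beta_cross

eps, beta = cr1(166.6, 0.0692, 197.5, 0.0141)
print(round(eps, 2), round(beta, 4))            # prints: 182.05 0.0312
```

Replacing the fitted beta_cross with such a combining rule is what reduces the number of adjustable parameters, at the cost the abstract discusses for solvating (e.g. water-aromatic) systems.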

  20. A model-data based systems approach to process intensification

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    . Their developments, however, are largely due to experiment based trial and error approaches and while they do not require validation, they can be time consuming and resource intensive. Also, one may ask, can a truly new intensified unit operation be obtained in this way? An alternative two-stage approach is to apply...... for focused validation of only the promising candidates in the second-stage. This approach, however, would be limited to intensification based on “known” unit operations, unless the PI process synthesis/design is considered at a lower level of aggregation, namely the phenomena level. That is, the model......-based synthesis method must employ models at lower levels of aggregation and through combination rules for phenomena, generate (synthesize) new intensified unit operations. An efficient solution procedure for the synthesis problem is needed to tackle the potentially large number of options that would be obtained...