WorldWideScience

Sample records for modeling approach specifically

  1. An approach for activity-based DEVS model specification

    DEFF Research Database (Denmark)

    Alshareef, Abdurrahman; Sarjoughian, Hessam S.; Zarrin, Bahram

    2016-01-01

    Creation of DEVS models has been advanced through Model Driven Architecture and its frameworks. The overarching role of the frameworks has been to help develop model specifications in a disciplined fashion. Frameworks can provide intermediary layers between the higher level mathematical models...... and their corresponding software specifications from both structural and behavioral aspects. Unlike structural modeling, developing models to specify behavior of systems is known to be harder and more complex, particularly when operations with non-trivial control schemes are required. In this paper, we propose specifying...... activity-based behavior modeling of parallel DEVS atomic models. We consider UML activities and actions as fundamental units of behavior modeling, especially in the presence of recent advances in the UML 2.5 specifications. We describe in detail how to approach activity modeling with a set of elemental...

  2. A Specific N=2 Supersymmetric Quantum Mechanical Model: Supervariable Approach

    Directory of Open Access Journals (Sweden)

    Aradhya Shukla

    2017-01-01

Full Text Available By exploiting the supersymmetric invariant restrictions on the chiral and antichiral supervariables, we derive the off-shell nilpotent symmetry transformations for a specific (0 + 1)-dimensional N=2 supersymmetric quantum mechanical model which is considered on a (1, 2)-dimensional supermanifold (parametrized by a bosonic variable t and a pair of Grassmannian variables (θ, θ̄)). We also provide the geometrical meaning to the symmetry transformations. Finally, we show that this specific N=2 SUSY quantum mechanical model is a model for Hodge theory.

  3. Integrating UML, the Q-model and a Multi-Agent Approach in Process Specifications and Behavioural Models of Organisations

    Directory of Open Access Journals (Sweden)

    Raul Savimaa

    2005-08-01

    Full Text Available Efficient estimation and representation of an organisation's behaviour requires specification of business processes and modelling of actors' behaviour. Therefore the existing classical approaches that concentrate only on planned processes are not suitable and an approach that integrates process specifications with behavioural models of actors should be used instead. The present research indicates that a suitable approach should be based on interactive computing. This paper examines the integration of UML diagrams for process specifications, the Q-model specifications for modelling timing criteria of existing and planned processes and a multi-agent approach for simulating non-deterministic behaviour of human actors in an organisation. The corresponding original methodology is introduced and some of its applications as case studies are reviewed.

  4. A modeling approach to compare ΣPCB concentrations between congener-specific analyses

    Science.gov (United States)

    Gibson, Polly P.; Mills, Marc A.; Kraus, Johanna M.; Walters, David M.

    2017-01-01

    Changes in analytical methods over time pose problems for assessing long-term trends in environmental contamination by polychlorinated biphenyls (PCBs). Congener-specific analyses vary widely in the number and identity of the 209 distinct PCB chemical configurations (congeners) that are quantified, leading to inconsistencies among summed PCB concentrations (ΣPCB) reported by different studies. Here we present a modeling approach using linear regression to compare ΣPCB concentrations derived from different congener-specific analyses measuring different co-eluting groups. The approach can be used to develop a specific conversion model between any two sets of congener-specific analytical data from similar samples (similar matrix and geographic origin). We demonstrate the method by developing a conversion model for an example data set that includes data from two different analytical methods, a low resolution method quantifying 119 congeners and a high resolution method quantifying all 209 congeners. We used the model to show that the 119-congener set captured most (93%) of the total PCB concentration (i.e., Σ209PCB) in sediment and biological samples. ΣPCB concentrations estimated using the model closely matched measured values (mean relative percent difference = 9.6). General applications of the modeling approach include (a) generating comparable ΣPCB concentrations for samples that were analyzed for different congener sets; and (b) estimating the proportional contribution of different congener sets to ΣPCB. This approach may be especially valuable for enabling comparison of long-term remediation monitoring results even as analytical methods change over time. 
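The conversion model described above can be sketched as a simple least-squares fit between paired ΣPCB totals from the two analyses. The paired values below are illustrative placeholders, not data from the study:

```python
import numpy as np

# Hypothetical paired totals (arbitrary units): sum over the 119-congener
# set (low-resolution method) vs. the full 209-congener set.
sigma_119 = np.array([10.2, 25.1, 4.8, 60.3, 33.7])
sigma_209 = np.array([11.0, 27.0, 5.2, 64.9, 36.1])

# Fit sigma_209 = a * sigma_119 + b by ordinary least squares.
a, b = np.polyfit(sigma_119, sigma_209, deg=1)

def convert(sigma_119_value):
    """Estimate the full 209-congener total from a 119-congener total."""
    return a * sigma_119_value + b

# Proportional contribution of the smaller congener set to total PCB.
capture_fraction = sigma_119.sum() / sigma_209.sum()
```

With real data one would also inspect residuals and restrict the model to samples of similar matrix and geographic origin, as the abstract stipulates.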

  5. Structural modeling of age specific fertility curves in Peninsular Malaysia: An approach of Lee Carter method

    Science.gov (United States)

    Hanafiah, Hazlenah; Jemain, Abdul Aziz

    2013-11-01

In recent years, the study of fertility has attracted considerable research attention, driven by concern that rapid economic development is depressing fertility. Hence, this study examines the feasibility of developing fertility forecasts based on age structure. The Lee-Carter model (1992) is applied in this study, as it is an established and widely used model for analysing demographic aspects. A singular value decomposition approach is incorporated with an ARIMA model to estimate age-specific fertility rates in Peninsular Malaysia over the period 1958-2007. Residual plots are used to assess the goodness of fit of the model. The fertility index, forecast using a random walk with drift, is then used to predict future age-specific fertility. Results indicate that the proposed model provides a relatively good and reasonable fit to the data. In addition, there is an apparent and continuous decline in age-specific fertility curves over the next 10 years, particularly among mothers in their early 20s and 40s. The study of fertility is vital in order to maintain a balance between population growth and the provision of related facilities and resources.
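The Lee-Carter machinery used above can be sketched in a few lines: centre the log age-specific rates by age, take the leading singular component as the age profile and period index, and forecast the index with a random walk with drift. The rates below are synthetic placeholders, not the Malaysian data, and the usual normalisation constraints on the components are omitted for brevity:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic matrix of log age-specific fertility rates: ages x years.
n_ages, n_years = 7, 50
log_rates = rng.normal(-2.0, 0.3, size=(n_ages, n_years))

# Lee-Carter decomposition: log m(x,t) = a(x) + b(x) * k(t) + error.
a_x = log_rates.mean(axis=1)                   # average age pattern
centred = log_rates - a_x[:, None]
U, S, Vt = np.linalg.svd(centred, full_matrices=False)
b_x = U[:, 0]                                  # age response profile
k_t = S[0] * Vt[0, :]                          # period fertility index

# Forecast k(t) with a random walk with drift, then rebuild rates.
drift = (k_t[-1] - k_t[0]) / (n_years - 1)
k_future = k_t[-1] + drift * np.arange(1, 11)  # 10-year horizon
forecast = np.exp(a_x[:, None] + np.outer(b_x, k_future))
```

In the study an ARIMA model plays the role of the simple drift forecast shown here.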

  6. Lamina specific loss of inhibition may lead to distinct neuropathic manifestations: a computational modeling approach

    Directory of Open Access Journals (Sweden)

    Erick Javier Argüello Prada

Full Text Available Introduction: It has been reported that inhibitory control at the superficial dorsal horn (SDH) can act in a regionally distinct manner, which suggests that regionally specific subpopulations of SDH inhibitory neurons may each prevent one specific neuropathic condition. Methods: In an attempt to address this issue, we provide an alternative approach by integrating neuroanatomical information from different studies to construct a network model of the SDH. We use Neuroids to simulate each neuron included in that model, adapting available experimental evidence. Results: Simulations suggest that maintenance of the proper level of pain sensitivity may be attributed to lamina II inhibitory neurons; therefore, hyperalgesia may be elicited by suppression of the inhibitory tone in that lamina. In contrast, lamina III inhibitory neurons are more likely to be responsible for keeping the nociceptive pathway separate from the mechanoreceptive pathway, so loss of inhibitory control in that region may result in allodynia. The SDH network model is also able to replicate non-linearities associated with pain processing, such as Aβ-fiber-mediated analgesia and the frequency-dependent increase of the neural response. Discussion: By incorporating biophysical accuracy and newer experimental evidence, the SDH network model may become a valuable tool for assessing the contribution of specific SDH connectivity patterns to noxious transmission in both physiological and pathological conditions.

  7. Risk-based technical specifications: Development and application of an approach to the generation of a plant specific real-time risk model

    International Nuclear Information System (INIS)

    Puglia, B.; Gallagher, D.; Amico, P.; Atefi, B.

    1992-10-01

This report describes a process developed to convert an existing PRA into a model amenable to real-time, risk-based technical specification calculations. In earlier studies (culminating in NUREG/CR-5742), several risk-based approaches to technical specifications were evaluated. A real-time approach using a plant-specific PRA capable of modeling plant configurations as they change was identified as the most comprehensive approach to control plant risk. A master fault tree logic model representative of all of the core damage sequences was developed. Portions of the system fault trees were modularized, and supercomponents comprising component failures with similar effects were developed to reduce the size of the model and quantification times. Modifications to the master fault tree logic were made to properly model the effect of maintenance and recovery actions. Fault trees representing several actuation systems not modeled in detail in the existing PRA were added to the master fault tree logic. This process was applied to the Surry NUREG-1150 Level 1 PRA. The master logic model was confirmed. The model was then used to evaluate the frequency associated with several plant configurations using the IRRAS code. For all cases analyzed, computational time was less than three minutes. This document, Volume 2, contains appendices A, B, and C. These provide, respectively: the Surry Technical Specifications Model Database, the Surry Technical Specifications Model, and a list of supercomponents used in the Surry Technical Specifications Model

  8. Model-based versus specific dosimetry in diagnostic context: Comparison of three dosimetric approaches

    Energy Technology Data Exchange (ETDEWEB)

    Marcatili, S., E-mail: sara.marcatili@inserm.fr; Villoing, D.; Mauxion, T.; Bardiès, M. [Inserm, UMR1037 CRCT, Toulouse F-31000, France and Université Toulouse III-Paul Sabatier, UMR1037 CRCT, Toulouse F-31000 (France); McParland, B. J. [Imaging Technology Group, GE Healthcare, Life Sciences, B22U The Grove Centre, White Lion Road, Amersham, England HP7 9LL (United Kingdom)

    2015-03-15

Purpose: The dosimetric assessment of novel radiotracers represents a legal requirement in most countries. While techniques for the computation of internal absorbed dose in a therapeutic context have made huge progress in recent years, in a diagnostic scenario the absorbed dose is usually extracted from model-based lookup tables, most often derived from International Commission on Radiological Protection (ICRP) or Medical Internal Radiation Dose (MIRD) Committee models. The level of approximation introduced by these models may impact the resulting dosimetry. The aim of this work is to establish whether a more refined approach to dosimetry can be implemented in nuclear medicine diagnostics, by analyzing a specific case. Methods: The authors calculated absorbed doses to various organs in six healthy volunteers administered with flutemetamol ({sup 18}F) injection. Each patient underwent from 8 to 10 whole-body 3D PET/CT scans. This dataset was analyzed using a Monte Carlo (MC) application developed in-house using the toolkit GATE, which is capable of taking into account patient-specific anatomy and radiotracer distribution at the voxel level. They compared the absorbed doses obtained with GATE to those calculated with two commercially available software packages: OLINDA/EXM and STRATOS, the latter implementing a dose voxel kernel convolution approach. Results: Absorbed doses calculated with GATE were higher than those calculated with OLINDA. The average ratio between GATE absorbed doses and OLINDA's was 1.38 ± 0.34 σ (from 0.93 to 2.23). The discrepancy was particularly high for the thyroid, with an average GATE/OLINDA ratio of 1.97 ± 0.83 σ for the six patients. Differences between STRATOS and GATE were found to be higher. The average ratio between GATE and STRATOS absorbed doses was 2.51 ± 1.21 σ (from 1.09 to 6.06). Conclusions: This study demonstrates how the choice of the absorbed dose calculation algorithm may introduce a bias when gamma radiations are of importance, as is

  9. A simplified approach to control system specification and design using domain modelling and mapping

    International Nuclear Information System (INIS)

    Ludgate, G.A.

    1992-01-01

Recent developments in the field of accelerator-domain and computer-domain modelling have led to a better understanding of the 'art' of control system specification and design. It now appears possible to 'compile' a control system specification to produce the architectural design. The information required by the 'compiler' is discussed and one hardware optimization algorithm is presented. The desired characteristics of the hardware and software components of a distributed control system architecture are discussed, as are the shortcomings of some commercial products. (author)

  10. Recognition of Emotions in Mexican Spanish Speech: An Approach Based on Acoustic Modelling of Emotion-Specific Vowels

    Directory of Open Access Journals (Sweden)

    Santiago-Omar Caballero-Morales

    2013-01-01

Full Text Available An approach for the recognition of emotions in speech is presented. The target language is Mexican Spanish, and for this purpose a speech database was created. The approach consists of acoustic modelling of emotion-specific vowels at the phoneme level. For this, a standard phoneme-based Automatic Speech Recognition (ASR) system was built with Hidden Markov Models (HMMs), where different phoneme HMMs were built for the consonants and for the emotion-specific vowels associated with four emotional states (anger, happiness, neutral, sadness). Then, the emotional state of a spoken sentence is estimated by counting the number of emotion-specific vowels found in the ASR's output for the sentence. With this approach, an accuracy of 87–100% was achieved for the recognition of the emotional state of Mexican Spanish speech.
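The decision rule described, counting emotion-tagged vowel tokens in the recognizer output and taking the majority, can be sketched as follows. The phoneme label format (`vowel_emotion`) is a hypothetical stand-in for whatever tagging the actual ASR lexicon uses:

```python
from collections import Counter

EMOTIONS = ("anger", "happiness", "neutral", "sadness")

def estimate_emotion(asr_phonemes):
    """Pick the emotion whose tagged vowels occur most often.

    asr_phonemes: list of phoneme labels where vowels carry an emotion
    suffix, e.g. 'a_anger' or 'e_sadness' (hypothetical format).
    """
    counts = Counter()
    for ph in asr_phonemes:
        for emotion in EMOTIONS:
            if ph.endswith("_" + emotion):
                counts[emotion] += 1
    # Fall back to neutral when no tagged vowel was recognized.
    return counts.most_common(1)[0][0] if counts else "neutral"

print(estimate_emotion(["k", "a_anger", "s", "a_anger", "o_neutral"]))  # anger
```

The heavy lifting in the paper is done by the emotion-specific HMMs; this sketch covers only the final counting step.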

  11. Inference of type-specific HPV transmissibility, progression and clearance rates: a mathematical modelling approach.

    Directory of Open Access Journals (Sweden)

    Helen C Johnson

Full Text Available Quantifying rates governing the clearance of Human Papillomavirus (HPV) and its progression to clinical disease, together with viral transmissibility and the duration of naturally-acquired immunity, is essential in estimating the impact of vaccination programmes and screening or testing regimes. However, the complex natural history of HPV makes this difficult. We infer the viral transmissibility, rate of waning natural immunity and rates of progression and clearance of infection of 13 high-risk and 2 non-oncogenic HPV types, making use of a number of rich datasets from Sweden. Estimates of viral transmissibility, clearance of initial infection and waning immunity were derived in a Bayesian framework by fitting a susceptible-infectious-recovered-susceptible (SIRS) transmission model to age- and type-specific HPV prevalence data from both a cross-sectional study and a randomised controlled trial (RCT) of primary HPV screening. The models fitted well, but over-estimated the prevalence of four high-risk types with respect to the data. Three of these types (HPV-33, -35 and -58) are among the most closely related phylogenetically to the most prevalent type, HPV-16. The fourth (HPV-45) is the most closely related to HPV-18, the second most prevalent type. We suggest that this may be an indicator of cross-immunity. Rates of progression and clearance of clinical lesions were additionally estimated from longitudinal data gathered as part of the same RCT. Our estimates of progression and clearance rates are consistent with the findings of survival analysis studies, and we extend the literature by estimating progression and clearance rates for non-16 and non-18 high-risk types. We anticipate that such type-specific estimates will be useful in the parameterisation of further models and in developing our understanding of HPV natural history.
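The SIRS structure fitted in this study (transmission, clearance, waning immunity) can be illustrated with a minimal forward-Euler integration. The parameter values below are illustrative placeholders, not the paper's estimates:

```python
def simulate_sirs(beta, gamma, sigma, days=2000, dt=0.1):
    """Integrate the SIRS fractions with forward Euler.

    dS/dt = -beta*S*I + sigma*R   (infection, waning immunity)
    dI/dt =  beta*S*I - gamma*I   (infection, clearance)
    dR/dt =  gamma*I  - sigma*R   (clearance, waning immunity)
    """
    s, i, r = 0.99, 0.01, 0.0
    for _ in range(int(days / dt)):
        new_inf = beta * s * i
        ds = -new_inf + sigma * r
        di = new_inf - gamma * i
        dr = gamma * i - sigma * r
        s, i, r = s + dt * ds, i + dt * di, r + dt * dr
    return s, i, r

# Illustrative rates per day: transmissibility beta, clearance gamma,
# waning of natural immunity sigma. R0 = beta / gamma = 3 here, so the
# endemic susceptible fraction should settle near 1/R0 = 1/3.
s, i, r = simulate_sirs(beta=0.3, gamma=0.1, sigma=0.01)
```

The actual inference runs a type- and age-structured version of this model inside a Bayesian fitting loop against prevalence data.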

  12. A competing risk approach for the European Heart SCORE model based on cause-specific and all-cause mortality

    DEFF Research Database (Denmark)

    Støvring, Henrik; Harmsen, Charlotte G; Wisløff, Torbjørn

    2013-01-01

    for older individuals. When non-CVD mortality was assumed unaffected by smoking status, the absolute risk reduction due to statin treatment ranged from 0.0% to 3.5%, whereas the gain in expected residual lifetime ranged from 3 to 11 months. Statin effectiveness increased for non-smokers and declined...... pressure, and total cholesterol level. The SCORE model, however, is not mathematically consistent and does not estimate all-cause mortality. Our aim is to modify the SCORE model to allow consistent estimation of both CVD-specific and all-cause mortality. Methods: Using a competing risk approach, we first...

  13. A data-driven modeling approach to identify disease-specific multi-organ networks driving physiological dysregulation.

    Directory of Open Access Journals (Sweden)

    Warren D Anderson

    2017-07-01

Full Text Available Multiple physiological systems interact throughout the development of a complex disease. Knowledge of the dynamics and connectivity of interactions across physiological systems could facilitate the prevention or mitigation of organ damage underlying complex diseases, many of which are currently refractory to available therapeutics (e.g., hypertension). We studied the regulatory interactions operating within and across organs throughout disease development by integrating in vivo analysis of gene expression dynamics with a reverse engineering approach to infer data-driven dynamic network models of multi-organ gene regulatory influences. We obtained experimental data on the expression of 22 genes across five organs, over a time span that encompassed the development of autonomic nervous system dysfunction and hypertension. We pursued a unique approach for identification of continuous-time models that jointly described the dynamics and structure of multi-organ networks by estimating a sparse subset of ∼12,000 possible gene regulatory interactions. Our analyses revealed that an autonomic dysfunction-specific multi-organ sequence of gene expression activation patterns was associated with a distinct gene regulatory network. We analyzed the model structures for adaptation motifs, and identified disease-specific network motifs involving genes that exhibited aberrant temporal dynamics. Bioinformatic analyses identified disease-specific single nucleotide variants within or near transcription factor binding sites upstream of key genes implicated in maintaining physiological homeostasis. Our approach illustrates a novel framework for investigating pathogenesis through model-based analysis of multi-organ system dynamics and network properties. Our results yielded novel candidate molecular targets driving the development of cardiovascular disease, metabolic syndrome, and immune dysfunction.
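The continuous-time, sparse-network identification described above can be caricatured in a few lines: simulate trajectories from a known sparse linear system dx/dt = A x, estimate A by least squares on finite-difference derivatives, and zero out weak coefficients. The thresholding step is a crude stand-in for the authors' sparse estimation procedure, and all data here are synthetic:

```python
import numpy as np

rng = np.random.default_rng(2)
n_genes, n_steps, dt = 5, 40, 0.1

# Known sparse regulatory matrix: distinct decay rates on the diagonal
# plus a few cross-gene influences.
A_true = np.diag([-0.1, -0.2, -0.3, -0.4, -0.5])
A_true[0, 1], A_true[1, 2], A_true[3, 4] = 0.5, -0.4, 0.3

# Simulate expression trajectories: x(t+dt) = x(t) + dt * A x(t).
X = np.zeros((n_genes, n_steps))
X[:, 0] = rng.normal(1.0, 0.2, n_genes)
for t in range(n_steps - 1):
    X[:, t + 1] = X[:, t] + dt * (A_true @ X[:, t])

# Recover A from finite-difference derivatives by least squares,
# then threshold small coefficients to enforce sparsity.
dXdt = (X[:, 1:] - X[:, :-1]) / dt
A_hat = np.linalg.lstsq(X[:, :-1].T, dXdt.T, rcond=None)[0].T
A_hat[np.abs(A_hat) < 0.05] = 0.0
```

The real problem is far harder: noisy measurements, 22 genes across five organs, and ~12,000 candidate interactions, which is why the paper needs a dedicated sparse estimation framework rather than a simple threshold.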

  14. Incorporating Gender Specific Approaches for Incarcerated Female Adolescents: Multilevel Risk Model for Practice

    Science.gov (United States)

    Welch, Chiquitia L.; Roberts-Lewis, Amelia C.; Parker, Sharon

    2009-01-01

    The rise in female delinquency has resulted in large numbers of girls being incarcerated in Youth Development Centers (YDC). However, there are few gender specific treatment programs for incarcerated female adolescent offenders, particularly for those with a history of substance dependency. In this article, we present a Multi-level Risk Model…

  15. Towards Agent-Based Model Specification in Smart Grid: A Cognitive Agent-based Computing Approach

    OpenAIRE

    Akram, Waseem; Niazi, Muaz A.; Iantovics, Laszlo Barna

    2017-01-01

A smart grid can be considered as a complex network where each node represents a generation unit or a consumer, and links represent transmission lines. One way to study complex systems is by using the agent-based modeling (ABM) paradigm. An ABM is a way of representing a complex system of autonomous agents interacting with each other. Previously, a number of studies have been presented in the smart grid domain making use of the ABM paradigm. However, to the best of our know...

  16. Self-dual cluster renormalization group approach for the square lattice Ising model specific heat and magnetization

    International Nuclear Information System (INIS)

    Martin, H.O.; Tsallis, C.

    1981-01-01

A simple renormalization group approach based on self-dual clusters is proposed for the two-dimensional nearest-neighbour spin-1/2 Ising model on the square lattice; it reproduces the exact critical point. The internal energy and specific heat for vanishing external magnetic field, the spontaneous magnetization, and the thermal (Y sub(T)) and magnetic (Y sub(H)) critical exponents are calculated. The results obtained from the first four smallest cluster sizes strongly suggest convergence towards the exact values as the cluster size increases. Even for the smallest cluster, where the calculation is very simple, the results are quite accurate, particularly in the neighbourhood of the critical point. (Author) [pt
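The exact critical point that the self-dual cluster scheme reproduces is the fixed point of the Kramers-Wannier duality, sinh(2K_c) = 1, i.e. K_c = (1/2) ln(1 + √2). A quick numerical check:

```python
import math

# Kramers-Wannier duality maps coupling K to K' with
# sinh(2K') * sinh(2K) = 1, so the self-dual (critical) point
# of the square-lattice Ising model satisfies sinh(2 * K_c) = 1.
K_c = 0.5 * math.log(1.0 + math.sqrt(2.0))
assert abs(math.sinh(2.0 * K_c) - 1.0) < 1e-12

# Critical temperature in units of J/k_B: T_c = 2 / ln(1 + sqrt(2)).
T_c = 1.0 / K_c
print(round(T_c, 3))  # prints 2.269
```

A renormalization scheme built from self-dual clusters inherits this fixed point by construction, which is why the approach recovers the exact critical coupling even for small clusters.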

  17. An Approach for Patient-Specific Multi-domain Vascular Mesh Generation Featuring Spatially Varying Wall Thickness Modeling

    OpenAIRE

    Raut, Samarth S.; Liu, Peng; Finol, Ender A.

    2015-01-01

    In this work, we present a computationally efficient image-derived volume mesh generation approach for vasculatures that implements spatially varying patient-specific wall thickness with a novel inward extrusion of the wall surface mesh. Multi-domain vascular meshes with arbitrary numbers, locations, and patterns of both iliac bifurcations and thrombi can be obtained without the need to specify features or landmark points as input. In addition, the mesh output is coordinate-frame independent ...

  18. An Asset Pricing Approach to Testing General Term Structure Models including Heath-Jarrow-Morton Specifications and Affine Subclasses

    DEFF Research Database (Denmark)

    Christensen, Bent Jesper; van der Wel, Michel

    of the risk premium is associated with the slope factor, and individual risk prices depend on own past values, factor realizations, and past values of other risk prices, and are significantly related to the output gap, consumption, and the equity risk price. The absence of arbitrage opportunities is strongly...... is tested, but in addition to the standard bilinear term in factor loadings and market prices of risk, the relevant mean restriction in the term structure case involves an additional nonlinear (quadratic) term in factor loadings. We estimate our general model using likelihood-based dynamic factor model...... techniques for a variety of volatility factors, and implement the relevant likelihood ratio tests. Our factor model estimates are similar across a general state space implementation and an alternative robust two-step principal components approach. The evidence favors time-varying market prices of risk. Most...

  19. A multiscale modelling approach to understand atherosclerosis formation: A patient-specific case study in the aortic bifurcation

    Science.gov (United States)

    Alimohammadi, Mona; Pichardo-Almarza, Cesar; Agu, Obiekezie; Díaz-Zuccarini, Vanessa

    2017-01-01

Atherogenesis, the formation of plaques in the wall of blood vessels, starts as a result of lipid accumulation (low-density lipoprotein cholesterol) in the vessel wall. Such accumulation is related to the site of endothelial mechanotransduction, the endothelial response to mechanical stimuli and haemodynamics, which determines biochemical processes regulating the vessel wall permeability. This interaction between biomechanical and biochemical phenomena is complex, spanning different biological scales and is patient-specific, requiring tools able to capture such mathematical and biological complexity in a unified framework. Mathematical models offer an elegant and efficient way of doing this, by taking into account multifactorial and multiscale processes and mechanisms, in order to capture the fundamentals of plaque formation in individual patients. In this study, a mathematical model to understand plaque and calcification locations is presented: this model provides a strong interpretability and physical meaning through a multiscale, complex index or metric (the penetration site of low-density lipoprotein cholesterol, expressed as volumetric flux). Computed tomography scans of the aortic bifurcation and iliac arteries are analysed and compared with the results of the multifactorial model. The results indicate that the model shows potential to predict the majority of the plaque locations, while not falsely predicting plaques in regions where they are absent. The promising results from this case study provide a proof of concept that can be applied to a larger patient population. PMID:28427316

  20. Combining the GW formalism with the polarizable continuum model: A state-specific non-equilibrium approach

    Energy Technology Data Exchange (ETDEWEB)

    Duchemin, Ivan, E-mail: ivan.duchemin@cea.fr [INAC, SP2M/L-Sim, CEA/UJF Cedex 09, 38054 Grenoble (France); Jacquemin, Denis [Laboratoire CEISAM - UMR CNR 6230, Université de Nantes, 2 Rue de la Houssinière, BP 92208, 44322 Nantes Cedex 3 (France); Institut Universitaire de France, 1 rue Descartes, 75005 Paris Cedex 5 (France); Blase, Xavier [CNRS, Inst. NÉEL, F-38000 Grenoble (France); Univ. Grenoble Alpes, Inst. NÉEL, F-38000 Grenoble (France)

    2016-04-28

We have implemented the polarizable continuum model within the framework of the many-body Green’s function GW formalism for the calculation of electron addition and removal energies in solution. The present formalism includes both ground-state and non-equilibrium polarization effects. In addition, the polarization energies are state-specific, allowing one to obtain the bath-induced renormalisation energy of all occupied and virtual energy levels. Our implementation is validated by comparisons with ΔSCF calculations performed at both the density functional theory and coupled-cluster singles and doubles levels for solvated nucleobases. The present study opens the way to GW and Bethe-Salpeter calculations in disordered condensed phases of interest in organic optoelectronics, wet chemistry, and biology.

  21. Towards personalised management of atherosclerosis via computational models in vascular clinics: technology based on patient-specific simulation approach

    Science.gov (United States)

    Di Tomaso, Giulia; Agu, Obiekezie; Pichardo-Almarza, Cesar

    2014-01-01

    The development of a new technology based on patient-specific modelling for personalised healthcare in the case of atherosclerosis is presented. Atherosclerosis is the main cause of death in the world and it has become a burden on clinical services as it manifests itself in many diverse forms, such as coronary artery disease, cerebrovascular disease/stroke and peripheral arterial disease. It is also a multifactorial, chronic and systemic process that lasts for a lifetime, putting enormous financial and clinical pressure on national health systems. In this Letter, the postulate is that the development of new technologies for healthcare using computer simulations can, in the future, be developed as in-silico management and support systems. These new technologies will be based on predictive models (including the integration of observations, theories and predictions across a range of temporal and spatial scales, scientific disciplines, key risk factors and anatomical sub-systems) combined with digital patient data and visualisation tools. Although the problem is extremely complex, a simulation workflow and an exemplar application of this type of technology for clinical use is presented, which is currently being developed by a multidisciplinary team following the requirements and constraints of the Vascular Service Unit at the University College Hospital, London. PMID:26609369

  22. Simulation-based model checking approach to cell fate specification during Caenorhabditis elegans vulval development by hybrid functional Petri net with extension

    Directory of Open Access Journals (Sweden)

    Ueno Kazuko

    2009-04-01

Full Text Available Abstract Background: Model checking approaches were applied to biological pathway validations around 2003. Recently, Fisher et al. proved the importance of the model checking approach by inferring new regulation of signaling crosstalk in C. elegans and confirming the regulation with biological experiments. They took a discrete, state-based approach to explore all possible states of the system underlying vulval precursor cell (VPC) fate specification for desired properties. However, since both discrete and continuous features appear to be indispensable parts of biological processes, it is more appropriate to use quantitative models to capture the dynamics of biological systems. Our key motivation in this paper is to establish a quantitative methodology to model and analyze in silico models incorporating the use of the model checking approach. Results: A novel method of modeling and simulating biological systems with the use of the model checking approach is proposed, based on hybrid functional Petri net with extension (HFPNe) as the framework dealing with both discrete and continuous events. Firstly, we construct a quantitative VPC fate model with 1761 components by using HFPNe. Secondly, we apply two major biological fate determination rules, Rule I and Rule II, to the VPC fate model. We then conduct 10,000 simulations for each of 48 sets of different genotypes, investigate variations of cell fate patterns under each genotype, and validate the two rules by comparing three simulation targets consisting of fate patterns obtained from in silico and in vivo experiments. In particular, an evaluation was successfully done by using our VPC fate model to investigate one target derived from biological experiments involving hybrid lineage observations. However, hybrid lineages are hard to interpret on a discrete model, because a hybrid lineage occurs when the system comes close to certain thresholds, as discussed by Sternberg and Horvitz in

  23. Patient-Specific Computational Modeling

    CERN Document Server

    Peña, Estefanía

    2012-01-01

This book addresses patient-specific modeling. It integrates computational modeling, experimental procedures, imaging, clinical segmentation and mesh generation with the finite element method (FEM) to solve problems in computational biomedicine and bioengineering. Specific areas of interest include cardiovascular problems, ocular and muscular systems, and soft tissue modeling. Patient-specific modeling has been the subject of serious research over the last seven years; interest in the area is continually growing and it is expected to develop further in the near future.

  24. Industry specific financial distress modeling

    Directory of Open Access Journals (Sweden)

    Naz Sayari

    2017-01-01

Full Text Available This study investigates the uncertainty levels of various industries and tries to determine the financial ratios having the greatest information content in characterising each industry. It then uses these ratios to develop industry-specific financial distress models. First, we employ factor analysis to determine the set of ratios that are most informative in specified industries. Second, we use a method based on the concept of entropy to measure the level of uncertainty in industries and also to single out the ratios that best reflect the uncertainty levels in specific industries. Finally, we conduct a logistic regression analysis and derive industry-specific financial distress models which can be used to judge the predictive ability of the selected financial ratios for each industry. The results show that financial ratios do indeed echo industry characteristics and that the information content of specific ratios varies among industries. Our findings show the diverging impact of industry characteristics on companies, and thus the necessity of constructing industry-specific financial distress models.
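The entropy step described, ranking which ratios carry the most information about an industry's uncertainty, can be sketched with Shannon entropy over binned ratio values. The data below are synthetic: a volatile ratio spreads across many bins (high entropy), a stable one concentrates in a few (low entropy):

```python
import numpy as np

def shannon_entropy(values, bins=10):
    """Shannon entropy (bits) of a ratio's empirical distribution."""
    counts, _ = np.histogram(values, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]                      # ignore empty bins (0 * log 0 = 0)
    return float(-np.sum(p * np.log2(p)))

rng = np.random.default_rng(1)
# Hypothetical samples of two financial ratios across 1000 firms.
volatile_ratio = rng.uniform(0.0, 5.0, size=1000)   # spread out
stable_ratio = rng.normal(1.0, 0.05, size=1000)     # tightly clustered

h_volatile = shannon_entropy(volatile_ratio)
h_stable = shannon_entropy(stable_ratio)
```

In the study, ratios ranked this way feed a logistic regression that yields the industry-specific distress model.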

  5. Peptide Based Radiopharmaceuticals: Specific Construct Approach

    Energy Technology Data Exchange (ETDEWEB)

    Som, P; Rhodes, B A; Sharma, S S

    1997-10-21

    The objective of this project was to develop receptor based peptides for diagnostic imaging and therapy. A series of peptides related to cell adhesion molecules (CAM) and immune regulation were designed for radiolabeling with 99mTc and evaluated in animal models as potential diagnostic imaging agents for various disease conditions such as thrombus (clot), acute kidney failure, and infection/inflammation imaging. The peptides for this project were designed by the industrial partner, Palatin Technologies (formerly Rhomed, Inc.), using various peptide design approaches, including a newly developed rational computer assisted drug design (CADD) approach termed MIDAS (Metal ion Induced Distinctive Array of Structures). In this approach, the biological function domain and the 99mTc complexing domain are fused together so that structurally these domains are indistinguishable. This approach allows construction of conformationally rigid metallo-peptide molecules (similar to cyclic peptides) that are metabolically stable in-vivo. All the newly designed peptides were screened in various in vitro receptor binding and functional assays to identify a lead compound. The lead compounds were formulated in a one-step 99mTc labeling kit form, which was studied by BNL for detailed in-vivo imaging using various animal models of human disease. Two main peptides using the MIDAS approach evolved and were investigated: an RGD peptide for acute renal failure and an immunomodulatory peptide derived from tuftsin (RMT-1) for infection/inflammation imaging. Various RGD based metallopeptides were designed, synthesized and assayed for their efficacy in inhibiting ADP-induced human platelet aggregation. Most of these peptides displayed biological activity in the 1-100 µM range. Based on previous work by others, RGD-I and RGD-II were evaluated in animal models of acute renal failure. These earlier studies showed that after acute ischemic injury the renal cortex displays

  6. General and specific attention-deficit/hyperactivity disorder factors of children 4 to 6 years of age: An exploratory structural equation modeling approach to assessing symptom multidimensionality.

    Science.gov (United States)

    Arias, Víctor B; Ponce, Fernando P; Martínez-Molina, Agustín; Arias, Benito; Núñez, Daniel

    2016-01-01

    We tested first-order factor and bifactor models of attention-deficit/hyperactivity disorder (ADHD) using confirmatory factor analysis (CFA) and exploratory structural equation modeling (ESEM) to adequately summarize the Diagnostic and Statistical Manual of Mental Disorders, 4th Edition (DSM-IV-TR) symptoms observed in a Spanish sample of preschoolers and kindergarteners. Six ESEM and CFA models were estimated based on teacher evaluations of the behavior of 638 children 4 to 6 years of age. An ESEM bifactor model with a central dimension plus 3 specific factors (inattention, hyperactivity, and impulsivity) showed the best fit and interpretability. Strict invariance between the sexes was observed. The bifactor model provided a solution to previously encountered inconsistencies in the factorial models of ADHD in young children. However, the low reliability of the specific factors casts doubt on the utility of the subscales for ADHD measurement. More research is necessary to clarify the nature of the G and S factors of ADHD. (c) 2016 APA, all rights reserved.

  7. Coupling biomechanics to a cellular level model: an approach to patient-specific image driven multi-scale and multi-physics tumor simulation.

    Science.gov (United States)

    May, Christian P; Kolokotroni, Eleni; Stamatakos, Georgios S; Büchler, Philippe

    2011-10-01

    Modeling of tumor growth has been performed according to various approaches addressing different biocomplexity levels and spatiotemporal scales. Mathematical treatments range from partial differential equation based diffusion models to rule-based cellular level simulators, aiming at both improving our quantitative understanding of the underlying biological processes and, in the mid- and long term, constructing reliable multi-scale predictive platforms to support patient-individualized treatment planning and optimization. The aim of this paper is to establish a multi-scale and multi-physics approach to tumor modeling taking into account both the cellular and the macroscopic mechanical level. Therefore, an already developed biomodel of clinical tumor growth and response to treatment is self-consistently coupled with a biomechanical model. Results are presented for the free growth case of the imageable component of an initially point-like glioblastoma multiforme tumor. The composite model leads to significant tumor shape corrections that are achieved through the utilization of environmental pressure information and the application of biomechanical principles. Using the ratio of smallest to largest moment of inertia of the tumor material to quantify the effect of our coupled approach, we have found a tumor shape correction of 20% by coupling biomechanics to the cellular simulator as compared to a cellular simulation without preferred growth directions. We conclude that the integration of the two models provides additional morphological insight into realistic tumor growth behavior. Therefore, it might be used for the development of an advanced oncosimulator focusing on tumor types for which morphology plays an important role in surgical and/or radio-therapeutic treatment planning. Copyright © 2011 Elsevier Ltd. All rights reserved.
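The shape metric this abstract uses (ratio of smallest to largest moment of inertia) can be computed directly from a segmented tumor mask. The sketch below restricts itself to 2D point sets for brevity, where the 2x2 inertia tensor has closed-form eigenvalues; the blob shapes are invented:

```python
import math

def inertia_ratio(points):
    # Ratio of smallest to largest principal moment of inertia for a set
    # of unit-mass points in 2D; 1.0 for a perfectly isotropic shape,
    # approaching 0 for a highly elongated one.
    n = len(points)
    cx = sum(x for x, _ in points) / n
    cy = sum(y for _, y in points) / n
    ixx = sum((y - cy) ** 2 for _, y in points)
    iyy = sum((x - cx) ** 2 for x, _ in points)
    ixy = -sum((x - cx) * (y - cy) for x, y in points)
    # Eigenvalues of the symmetric 2x2 inertia tensor, in closed form.
    mean = (ixx + iyy) / 2.0
    dev = math.hypot((ixx - iyy) / 2.0, ixy)
    lo_m, hi_m = mean - dev, mean + dev
    return lo_m / hi_m if hi_m else 1.0

# A round blob scores near 1; an elongated rod scores well below 1.
disc = [(x, y) for x in range(-5, 6) for y in range(-5, 6)
        if x * x + y * y <= 25]
rod = [(x, y) for x in range(-10, 11) for y in (-1, 0, 1)]
```

A 20% change in this ratio, as reported in the paper, corresponds to a clearly visible change of elongation in such a mask.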

  8. A Generalised Approach to Petri Nets and Algebraic Specifications

    International Nuclear Information System (INIS)

    Sivertsen, Terje

    1998-02-01

    The present report represents a continuation of the work on Petri nets and algebraic specifications. The reported research has focused on generalising the approach introduced in HWR-454, with the aim of facilitating the translation of a wider class of Petri nets into algebraic specification. This includes autonomous Petri nets with increased descriptive power, as well as non-autonomous Petri nets allowing the modelling of systems (1) involving extensive data processing; (2) with transitions synchronized on external events; (3) whose evolutions are time-dependent. The generalised approach has the important property of being modular in the sense that the translated specifications can be gradually extended to include data processing, synchronization, and timing. The report also discusses the relative merits of state-based and transition-based specifications, and includes a non-trivial case study involving automated proofs of a large number of interrelated theorems. The examples in the report illustrate the use of the new HRP Prover. Of particular importance in this context is the automatic transformation between state-based and transition-based specifications. It is expected that the approach introduced in HWR-454 and generalised in the present report will prove useful in future work on combining a wide variety of specification techniques.
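The token-game semantics of the autonomous Petri nets discussed in this record can be stated in a few lines: a transition is enabled when every input place holds enough tokens, and firing consumes input tokens and produces output tokens. The dictionary-based encoding below is an illustrative choice, not the report's notation:

```python
# Minimal place/transition net semantics. A marking maps place names to
# token counts; a transition lists its input and output arc weights.
def enabled(marking, transition):
    # Enabled iff every input place carries at least the arc weight.
    return all(marking.get(p, 0) >= n for p, n in transition["in"].items())

def fire(marking, transition):
    # Firing returns a new marking; the original is left untouched.
    if not enabled(marking, transition):
        raise ValueError("transition not enabled")
    m = dict(marking)
    for p, n in transition["in"].items():
        m[p] -= n
    for p, n in transition["out"].items():
        m[p] = m.get(p, 0) + n
    return m

# Two-place net: transition t moves one token from 'free' to 'busy'.
t = {"in": {"free": 1}, "out": {"busy": 1}}
m0 = {"free": 2, "busy": 0}
m1 = fire(m0, t)
```

Translating such a net into an algebraic specification, as in the report, amounts to axiomatizing `enabled` and `fire` as operations over an abstract marking sort.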

  9. HEDR modeling approach

    International Nuclear Information System (INIS)

    Shipler, D.B.; Napier, B.A.

    1992-07-01

    This report details the conceptual approaches to be used in calculating radiation doses to individuals throughout the various periods of operations at the Hanford Site. The report considers the major environmental transport pathways--atmospheric, surface water, and ground water--and projects an appropriate modeling technique for each. The modeling sequence chosen for each pathway depends on the available data on doses, the degree of confidence justified by such existing data, and the level of sophistication deemed appropriate for the particular pathway and time period being considered.

  10. Friendship networks and psychological well-being from late adolescence to young adulthood: a gender-specific structural equation modeling approach.

    Science.gov (United States)

    Miething, Alexander; Almquist, Ylva B; Östberg, Viveca; Rostila, Mikael; Edling, Christofer; Rydgren, Jens

    2016-07-11

    The importance of supportive social relationships for psychological well-being has been previously recognized, but the direction of associations between both dimensions and how they evolve when adolescents enter adulthood have scarcely been addressed. The present study aims to examine the gender-specific associations between self-reported friendship network quality and psychological well-being of young people during the transition from late adolescence to young adulthood by taking into account the direction of association. A random sample of Swedes born in 1990 was surveyed at age 19 and again at age 23 regarding their own health and their relationships with a maximum of five self-nominated friends. The response rate was 55.3 % at baseline and 43.7 % at follow-up, resulting in 772 cases eligible for analysis. Gender-specific structural equation modeling was conducted to explore the associations between network quality and well-being. The measurement part included a latent measure of well-being, whereas the structural part accounted for autocorrelation for network quality and for well-being over time and further examined the cross-lagged associations. The results show that network quality increased while well-being decreased from age 19 to age 23. Females reported worse well-being at both time points, whereas no gender differences were found for network quality. Network quality at age 19 predicted network quality at age 23, and well-being at age 19 predicted well-being at age 23. The results further show positive correlations between network quality and well-being for males and females alike. The strength of the correlations diminished over time but remained significant at age 23. Simultaneously testing social causation and social selection in a series of competing models indicates that while there were no cross-lagged associations among males, there was a weak reverse association between well-being at age 19 and network quality at age 23 among females. 
The study
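The cross-lagged logic in this abstract (stability paths plus the two lagged cross-paths) can be illustrated with plain correlations. A real SEM estimates these paths jointly with a latent well-being factor, which the sketch below omits; the two-wave panel data are invented:

```python
import math

def pearson(xs, ys):
    # Plain Pearson correlation between two equally long series.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Toy two-wave panel: network quality (nq) and well-being (wb)
# measured at ages 19 and 23 for six respondents.
nq19 = [3.0, 4.0, 2.0, 5.0, 3.5, 4.5]
wb19 = [2.5, 3.8, 2.2, 4.6, 3.1, 4.0]
nq23 = [3.5, 4.2, 2.6, 5.0, 3.8, 4.6]
wb23 = [2.4, 3.5, 2.0, 4.4, 3.0, 3.9]

stability_nq = pearson(nq19, nq23)  # autocorrelation of network quality
cross_nq_wb = pearson(nq19, wb23)   # cross-lagged: networks -> well-being
cross_wb_nq = pearson(wb19, nq23)   # cross-lagged: well-being -> networks
```

Comparing the two cross-lagged coefficients is what distinguishes social causation (networks drive well-being) from social selection (well-being drives networks) in the competing models.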

  11. Current Approaches to the Establishment of Credit Risk Specific Provisions

    Directory of Open Access Journals (Sweden)

    Ion Nitu

    2008-10-01

    Full Text Available The aim of the new Basel II and IFRS approaches is to make the operations of financial institutions more transparent and thus to create a better basis for market participants and supervisory authorities to acquire information and make decisions. In the banking sector, a continuous debate is being led about the similarities and differences between the IFRS approach to loan loss provisions and the Basel II approach to calculating capital requirements, as compared with the classical provisioning method currently used by Romanian banks under the Central Bank's regulations. Banks must take into consideration that IFRS and Basel II objectives are fundamentally different. While IFRS aims to ensure that the financial statements adequately reflect the losses recorded at each balance sheet date, the Basel II objective is to ensure that the bank has enough provisions or capital in order to face expected losses in the next 12 months and eventual unexpected losses. Consequently, there are clear differences between the objectives of the two models. Basel II works on statistical modeling of expected losses, while IFRS, although allowing statistical models, requires a trigger event to have occurred before they can be used. IAS 39 specifically states that losses that are expected as a result of future events, no matter how likely, are not recognized. This is a clear and fundamental area of difference between the two frameworks.

  12. Domain specific modeling and analysis

    NARCIS (Netherlands)

    Jacob, Joost Ferdinand

    2008-01-01

    It is desirable to model software systems in such a way that analysis of the systems, and tool development for such analysis, is readily possible and feasible in the context of large scientific research projects. This thesis emphasizes the methodology that serves as a basis for such developments.

  13. The development of model generators for specific reactors

    Energy Technology Data Exchange (ETDEWEB)

    Chow, J.C. [Atomic Energy of Canada Limited, Chalk River, Ontario (Canada)

    2012-07-01

    Authoring reactor models is a routine task for practitioners in nuclear engineering for reactor design, safety analysis, and code validation. The conventional approach is to use a text-editor to either manually manipulate an existing model or to assemble a new model by copying and pasting or direct typing. This approach is error-prone and substantial effort is required for verification. Alternatively, models can be generated programmatically for a specific system via a centralized data source and with rigid algorithms to generate models consistently and efficiently. This approach is demonstrated here for model generators for MCNP and KENO for the ZED-2 reactor. (author)
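The programmatic-generation idea this record describes (one centralized data source, rigid algorithms emitting a consistent model) can be sketched as a template expander. The card layout below is a simplified stand-in invented for illustration, not actual MCNP or KENO input syntax:

```python
# Generate a text input deck from a single centralized spec dict, so
# every model is produced consistently instead of hand-edited.
# NOTE: the 'title'/'material'/'cell' card format here is illustrative
# only and does not follow any real transport-code syntax.
def generate_deck(spec):
    lines = [f"title {spec['title']}"]
    for name, props in sorted(spec["materials"].items()):
        lines.append(f"material {name} density={props['density']}")
    for name, props in sorted(spec["cells"].items()):
        lines.append(f"cell {name} material={props['material']}")
    return "\n".join(lines) + "\n"

spec = {
    "title": "demo-lattice",
    "materials": {"fuel": {"density": 10.4}, "mod": {"density": 1.0}},
    "cells": {"c1": {"material": "fuel"}, "c2": {"material": "mod"}},
}
deck = generate_deck(spec)
```

Because the deck is derived from one data source, a change to a density propagates to every generated model, which is the verification advantage over copy-and-paste editing that the abstract argues for.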

  14. Model Commissioning Plan and Guide Specifications

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    The objectives of Model Commissioning Plan and Guide Specifications are to ensure that the design team applies commissioning concepts to the design and prepares commissioning specifications and a commission plan for inclusion in the bid construction documents.

  15. Risk-informed approach in US-APWR technical specifications

    International Nuclear Information System (INIS)

    Saji, Etsuro; Tanaka, Futoshi; Kuroiwa, Katsuya; Kawai, Katsunori

    2009-01-01

    The Risk-Managed Technical Specifications and the Surveillance Frequency Control Program have been adopted in the US-APWR Technical Specifications. These risk-informed approaches are unique among the technical specifications for the advanced light water reactor designs adopted by planned nuclear power stations in the United States. (author)

  16. Site specific optimization of wind turbines energy cost: Iterative approach

    International Nuclear Information System (INIS)

    Rezaei Mirghaed, Mohammad; Roshandel, Ramin

    2013-01-01

    Highlights: • Optimization model of wind turbine parameters plus rectangular farm layout is developed. • Results show that levelized cost for single turbine fluctuates between 46.6 and 54.5 $/MW h. • Modeling results for two specific farms reported optimal sizing and farm layout. • Results show that levelized cost of the wind farms fluctuates between 45.8 and 67.2 $/MW h. - Abstract: The present study was aimed at developing a model to optimize the sizing parameters and farm layout of wind turbines according to the wind resource and economic aspects. The proposed model, including aerodynamic, economic and optimization sub-models, is used to achieve minimum levelized cost of electricity. The blade element momentum theory is utilized for aerodynamic modeling of pitch-regulated horizontal axis wind turbines. Also, a comprehensive cost model including capital costs of all turbine components is considered. An iterative approach is used to develop the optimization model. The modeling results are presented for three potential regions in Iran: Khaf, Ahar and Manjil. The optimum configurations and sizing for a single turbine with minimum levelized cost of electricity are presented. The optimal cost of energy for one turbine is calculated about 46.7, 54.5 and 46.6 dollars per MW h in the studied sites, respectively. In addition, optimal size of turbines, annual electricity production, capital cost, and wind farm layout for two different rectangular and square shaped farms in the proposed areas have been recognized. According to the results, optimal system configuration corresponds to minimum levelized cost of electricity about 45.8 to 67.2 dollars per MW h in the studied wind farms
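The iterative sizing loop described above can be caricatured as a one-dimensional search over rotor diameter for minimum levelized cost. The cost coefficients and the cubic blade-mass scaling term below are assumed figures chosen only to create an interior optimum; they are not the paper's model:

```python
import math

def lcoe(d, v=7.0, fixed=200000.0, a=400.0, b=6.0,
         crf=0.08, cp=0.4, rho=1.225, cap_factor=0.35):
    # Crude levelized cost of energy ($/MWh) for rotor diameter d (m):
    # fixed balance-of-plant cost, a*d^2 for tower/nacelle, b*d^3 for
    # blade-mass scaling; annual energy scales with swept area and v^3.
    capital = fixed + a * d ** 2 + b * d ** 3
    area = math.pi * (d / 2.0) ** 2
    aep_mwh = 0.5 * rho * area * cp * v ** 3 * 8760 * cap_factor / 1e6
    return capital * crf / aep_mwh

def optimize_diameter(lo=20.0, hi=150.0, step=1.0):
    # Iterative search: evaluate LCOE over candidate diameters and keep
    # the cheapest, mimicking the paper's iterative optimization loop.
    candidates = (lo + i * step for i in range(int((hi - lo) / step) + 1))
    best_d = min(candidates, key=lcoe)
    return best_d, lcoe(best_d)

d_opt, cost_opt = optimize_diameter()
```

With cost growing like d^3 while energy grows like d^2, the trade-off yields a finite optimal diameter inside the search interval rather than "bigger is always better".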

  17. On Automatic Modeling and Use of Domain-specific Ontologies

    DEFF Research Database (Denmark)

    Andreasen, Troels; Knappe, Rasmus; Bulskov, Henrik

    2005-01-01

    In this paper, we firstly introduce an approach to the modeling of a domain-specific ontology for use in connection with a given document collection. Secondly, we present a methodology for deriving conceptual similarity from the domain-specific ontology. Adopted for ontology representation is a s...

  18. An integrated in silico approach to design specific inhibitors targeting human poly(a-specific ribonuclease.

    Directory of Open Access Journals (Sweden)

    Dimitrios Vlachakis

    Full Text Available Poly(A)-specific ribonuclease (PARN) is an exoribonuclease/deadenylase that degrades 3'-end poly(A) tails in almost all eukaryotic organisms. Much of the biochemical and structural information on PARN comes from the human enzyme. However, the existence of PARN all along the eukaryotic evolutionary ladder requires further and thorough investigation. Although the complete structure of the full-length human PARN, as well as several aspects of the catalytic mechanism, still remain elusive, many previous studies indicate that PARN can be used as a potent and promising anti-cancer target. In the present study, we attempt to complement the existing structural information on PARN with in-depth bioinformatics analyses, in order to get a hologram of the molecular evolution of PARN's active site. In an effort to draw an outline that allows specific drug design targeting PARN, an unequivocally specific platform was designed for the development of selective modulators focusing on the unique structural and catalytic features of the enzyme. Extensive phylogenetic analysis based on all the publicly available genomes indicated a broad distribution for PARN across eukaryotic species and revealed structurally important amino acids which could be assigned as potentially strong contributors to the regulation of the catalytic mechanism of PARN. Based on the above, we propose a comprehensive in silico model for PARN's catalytic mechanism and, moreover, we developed a 3D pharmacophore model, which was subsequently used for the introduction of the DNP-poly(A) amphipathic substrate analog as a potential inhibitor of PARN. Indeed, biochemical analysis revealed that DNP-poly(A) inhibits PARN competitively. Our approach provides an efficient integrated platform for the rational design of pharmacophore models as well as novel modulators of PARN with therapeutic potential.

  19. Material Modelling - Composite Approach

    DEFF Research Database (Denmark)

    Nielsen, Lauge Fuglsang

    1997-01-01

    is successfully justified comparing predicted results with experimental data obtained in the HETEK-project on creep, relaxation, and shrinkage of very young concretes cured at a temperature of T = 20^o C and a relative humidity of RH = 100%. The model is also justified comparing predicted creep, shrinkage......, and internal stresses caused by drying shrinkage with experimental results reported in the literature on the mechanical behavior of mature concretes. It is then concluded that the model presented applies in general with respect to age at loading. From a stress analysis point of view the most important finding... in this report is that cement paste and concrete behave practically as linear-viscoelastic materials from an age of approximately 10 hours. This is a significant age extension relative to earlier studies in the literature, where linear-viscoelastic behavior is only demonstrated from ages of a few days. Thus

  20. Investigation of a CER[NP]- and [AP]-Based Stratum Corneum Modeling Membrane System: Using Specifically Deuterated CER Together with a Neutron Diffraction Approach.

    Science.gov (United States)

    Schmitt, Thomas; Lange, Stefan; Dobner, Bodo; Sonnenberger, Stefan; Hauß, Thomas; Neubert, Reinhard H H

    2018-01-30

    Neutron diffraction was used as a tool to investigate the lamellar as well as molecular nanostructure of ceramide-[NP]/ceramide-[AP]/cholesterol/lignoceric acid model systems with a nativelike 2:1 ratio and a 1:2 ratio to study the influence of the ceramide-[AP]. By using mixtures together with cholesterol and free fatty acids as well as a humidity and temperature chamber while measuring, natural conditions were simulated as closely as possible. Despite its simplicity, the system simulated the native stratum corneum lipid matrix fairly closely, showing a similar lamellar thickness with a repeat distance of 5.45 ± 0.1 nm and a similar arrangement with overlapping long C24 chains. Furthermore, despite the very minor chemical difference between ceramide-[NP] and ceramide-[AP], which is only a single OH group, it was possible to demonstrate substantial differences between the structural influence of the two ceramides. Ceramide-[AP] could be concluded to be arranged in such a way that its C24 chain in both ratios is somehow shorter than that of ceramide-[NP], not overlapping as much with the opposite lamellar leaflet. Furthermore, in the unnatural 1:2 ratio, the higher ceramide-[AP] content causes an increased tilt of the ceramide acyl chains. This leads to even less overlapping within the lamellar midplane, whereas the repeat distance stays the same as for the ceramide-[NP]-rich system. In this nativelike 2:1 ratio, the chains are arranged mostly straight, and the long C24 chains show a broad overlapping region in the lamellar midplane.

  1. Patient-Specific Modeling in Tomorrow's Medicine

    CERN Document Server

    2012-01-01

    This book reviews the frontier of research and clinical applications of Patient Specific Modeling, and provides a state-of-the-art update as well as perspectives on future directions in this exciting field. The book is useful for medical physicists, biomedical engineers and other engineers who are interested in the science and technology aspects of Patient Specific Modeling, as well as for radiologists and other medical specialists who wish to be updated about the state of implementation.

  2. Alternative approaches to risk-based technical specifications

    International Nuclear Information System (INIS)

    Atefi, B.; Gallagher, D.W.; Liner, R.T.; Lofgren, E.V.

    1987-01-01

    Four alternative risk-based approaches to Technical Specifications are identified. These are: a Probabilistic Risk Assessment (PRA) oriented approach; a reliability goal-oriented approach; an approach based on configuration control; a data-oriented approach. Based on preliminary results, the PRA-oriented approach, which has been developed further than the other approaches, seems to offer a logical, quantitative basis for setting Allowed Outage Times (AOTs) and Surveillance Test Intervals (STIs) for some plant components and systems. The most attractive feature of this approach is that it directly links the AOTs and STIs with the risk associated with the operation of the plant. This would focus the plant operator's and the regulatory agency's attention on the most risk-significant components of the plant. A series of practical issues related to the level of detail and content of the plant PRAs, requirements for the review of these PRAs, and monitoring of the plant's performance by the regulatory agency must be resolved before the approach could be implemented. Future efforts will examine the other three approaches and their practicality before firm conclusions are drawn regarding the viability of any of these approaches.

  3. Patient-specific models of cardiac biomechanics

    Science.gov (United States)

    Krishnamurthy, Adarsh; Villongco, Christopher T.; Chuang, Joyce; Frank, Lawrence R.; Nigam, Vishal; Belezzuoli, Ernest; Stark, Paul; Krummen, David E.; Narayan, Sanjiv; Omens, Jeffrey H.; McCulloch, Andrew D.; Kerckhoffs, Roy C. P.

    2013-07-01

    Patient-specific models of cardiac function have the potential to improve diagnosis and management of heart disease by integrating medical images with heterogeneous clinical measurements subject to constraints imposed by physical first principles and prior experimental knowledge. We describe new methods for creating three-dimensional patient-specific models of ventricular biomechanics in the failing heart. Three-dimensional bi-ventricular geometry is segmented from cardiac CT images at end-diastole from patients with heart failure. Human myofiber and sheet architecture is modeled using eigenvectors computed from diffusion tensor MR images from an isolated, fixed human organ-donor heart and transformed to the patient-specific geometric model using large deformation diffeomorphic mapping. Semi-automated methods were developed for optimizing the passive material properties while simultaneously computing the unloaded reference geometry of the ventricles for stress analysis. Material properties of active cardiac muscle contraction were optimized to match ventricular pressures measured by cardiac catheterization, and parameters of a lumped-parameter closed-loop model of the circulation were estimated with a circulatory adaptation algorithm making use of information derived from echocardiography. These components were then integrated to create a multi-scale model of the patient-specific heart. These methods were tested in five heart failure patients from the San Diego Veterans Affairs Medical Center who gave informed consent. The simulation results showed good agreement with measured echocardiographic and global functional parameters such as ejection fraction and peak cavity pressures.

  4. Assessing the Learning Path Specification: a Pragmatic Quality Approach

    NARCIS (Netherlands)

    Janssen, José; Berlanga, Adriana; Heyenrath, Stef; Martens, Harrie; Vogten, Hubert; Finders, Anton; Herder, Eelco; Hermans, Henry; Melero, Javier; Schaeps, Leon; Koper, Rob

    2010-01-01

    Janssen, J., Berlanga, A. J., Heyenrath, S., Martens, H., Vogten, H., Finders, A., Herder, E., Hermans, H., Melero Gallardo, J., Schaeps, L., & Koper, R. (2010). Assessing the Learning Path Specification: a Pragmatic Quality Approach. Journal of Universal Computer Science, 16(21), 3191-3209.

  5. Formal specification with the Java modeling language

    NARCIS (Netherlands)

    Huisman, Marieke; Ahrendt, Wolfgang; Grahl, Daniel; Hentschel, Martin; Ahrendt, Wolfgang; Beckert, Bernhard; Bubel, Richard; Hähnle, Reiner; Schmitt, Peter H.; Ulbrich, Mattoas

    2016-01-01

    This text is a general, self contained, and tool independent introduction into the Java Modeling Language, JML. It appears in a book about the KeY approach and tool, because JML is the dominating starting point of KeY style Java verification. However, this chapter does not depend on KeY, nor any

  6. Equation-oriented specification of neural models for simulations

    Directory of Open Access Journals (Sweden)

    Marcel eStimberg

    2014-02-01

    Full Text Available Simulating biological neuronal networks is a core method of research in computational neuroscience. A full specification of such a network model includes a description of the dynamics and state changes of neurons and synapses, as well as the synaptic connectivity patterns and the initial values of all parameters. A standard approach in neuronal modelling software is to build models based on a library of pre-defined models and mechanisms; if a model component does not yet exist, it has to be defined in a special-purpose or general low-level language and potentially be compiled and linked with the simulator. Here we propose an alternative approach that allows flexible definition of models by writing textual descriptions based on mathematical notation. We demonstrate that this approach allows the definition of a wide range of models with minimal syntax. Furthermore, such explicit model descriptions allow the generation of executable code for various target languages and devices, since the description is not tied to an implementation. Finally, this approach also has advantages for readability and reproducibility, because the model description is fully explicit, and because it can be automatically parsed and transformed into formatted descriptions. The presented approach has been implemented in the Brian2 simulator.
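The equation-oriented idea in this abstract, a textual mathematical description compiled into executable update code, can be demonstrated in miniature. This is an illustration of the principle only, not how Brian 2 is implemented; the single-equation format and Euler stepping are simplifying assumptions:

```python
# Toy equation-oriented specification: the membrane equation is given
# as text and turned into an Euler update function.
# (eval is used for brevity; a real simulator generates proper code.)
def compile_ode(eq_text):
    # Expect the form "dX/dt = <expression>"; names in the expression
    # are resolved from the state dict at run time.
    lhs, rhs = (s.strip() for s in eq_text.split("="))
    var = lhs[1:lhs.index("/")]  # "dv/dt" -> "v"
    code = compile(rhs, "<eq>", "eval")

    def step(state, dt):
        deriv = eval(code, {"__builtins__": {}}, state)
        new = dict(state)
        new[var] = state[var] + dt * deriv
        return new

    return var, step

# Leaky integrator relaxing toward I with time constant tau.
var, step = compile_ode("dv/dt = (I - v) / tau")
state = {"v": 0.0, "I": 1.0, "tau": 10.0}
for _ in range(1000):
    state = step(state, 0.01)
```

Because the model lives in the text, the same description could equally be translated to C or GPU code, which is the portability argument the abstract makes.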

  7. MODEL OF TEACHING PROFESSION SPECIFIC BILATERAL TRANSLATION

    Directory of Open Access Journals (Sweden)

    Yana Fabrychna

    2017-03-01

    Full Text Available The article deals with the author’s interpretation of the process of teaching profession specific bilateral translation to student teachers of English in the Master’s program. The goal of the model of teaching profession specific bilateral translation development is to determine the logical sequence of educational activities of the teacher as the organizer of the educational process and of students as its members. English and Ukrainian texts on methods of foreign languages and cultures teaching are defined as the object of study. Learning activities aimed at the development of student teachers of English profession specific competence in bilateral translation and the Translation Proficiency Language Portfolio for Student Teachers of English are suggested as teaching tools. The realization of the model of teaching profession specific bilateral translation to student teachers of English in the Master’s program is suggested within the module topics of the academic discipline «Practice of English as the first foreign language»: Globalization; Localization; Education; Work; The role of new communication technologies in personal and professional development. We believe that the amount of time needed for efficient functioning of the model is 48 academic hours, which was determined by calculating the total number of academic hours allotted for the academic discipline «Practice of English as the first foreign language» in Ukrainian universities. Peculiarities of the model realization as well as learning goals and content of class activities and home self-study work of students are outlined.

  8. Slicing AADL specifications for model checking

    NARCIS (Netherlands)

    Odenbrett, M.R.; Nguyen, V.Y.; Noll, T.

    2010-01-01

    To combat the state-space explosion problem in model checking larger systems, abstraction techniques can be employed. Here, methods that operate on the system specification before constructing its state space are preferable to those that try to minimize the resulting transition system as they

  9. Morphing patient-specific musculoskeletal models

    DEFF Research Database (Denmark)

    Rasmussen, John; Galibarov, Pavel E.; Al-Munajjed, Amir

    the resulting models do indeed represent the patients’ biomechanics. As a particularly challenging case, foot deformities based only on point sets recovered from surface scans are considered as shown in the figure. The preliminary results are promising for the cases of severe flat foot and metatarsalgia while...... other conditions may require CT or MRI data. The method and its theoretical assumptions, advantages and limitations are presented, and several examples will illustrate morphing to patient-specific models. [1] Carbes S; Tørholm S; Rasmussen, J. A Detailed Twenty-six Segments Kinematic Foot model...

  10. Set-Theoretic Approach to Maturity Models

    DEFF Research Database (Denmark)

    Lasrado, Lester Allan

    Despite being widely accepted and applied, maturity models in Information Systems (IS) have been criticized for the lack of theoretical grounding, methodological rigor, empirical validations, and ignorance of multiple and non-linear paths to maturity. This PhD thesis focuses on addressing...... these criticisms by incorporating recent developments in configuration theory, in particular application of set-theoretic approaches. The aim is to show the potential of employing a set-theoretic approach for maturity model research and empirically demonstrating equifinal paths to maturity. Specifically...... methodological guidelines consisting of detailed procedures to systematically apply set theoretic approaches for maturity model research and provides demonstrations of it application on three datasets. The thesis is a collection of six research papers that are written in a sequential manner. The first paper...
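The set-theoretic machinery referred to in this thesis abstract includes fuzzy-set consistency measures of the kind used in qualitative comparative analysis to test whether a condition is sufficient for an outcome. A minimal sketch with invented membership scores:

```python
# Fuzzy-set sufficiency consistency: how well do cases that are members
# of condition X also show outcome Y?
#   consistency = sum(min(x, y)) / sum(x)
# Scores near 1 support "X is sufficient for Y"; equifinality shows up
# as several distinct conditions each reaching high consistency.
def consistency(xs, ys):
    return sum(min(x, y) for x, y in zip(xs, ys)) / sum(xs)

# Invented membership scores for five organizations:
# x = membership in "high process maturity", y = "high performance".
x = [0.9, 0.8, 0.7, 0.2, 0.1]
y = [1.0, 0.9, 0.6, 0.4, 0.3]
score = consistency(x, y)
```

In a maturity-model study, computing this score for several candidate configurations is how multiple, non-linear paths to maturity can be demonstrated empirically.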

  11. Evaporator modeling - A hybrid approach

    International Nuclear Information System (INIS)

    Ding Xudong; Cai Wenjian; Jia Lei; Wen Changyun

    2009-01-01

    In this paper, a hybrid modeling approach is proposed to model two-phase flow evaporators. The main procedures for hybrid modeling include: (1) formulate the fundamental governing equations of the process from energy and material balances and thermodynamic principles; (2) select input/output (I/O) variables responsible for the system performance that can be measured and controlled; (3) represent those variables that appear in the original equations but are not measurable as simple functions of the selected I/Os or as constants; (4) obtain a single equation that correlates system inputs and outputs; and (5) identify the unknown parameters by linear or nonlinear least-squares methods. The method takes advantage of both physical and empirical modeling approaches, can accurately predict performance over a wide operating range in real time, and significantly reduces the computational burden while increasing prediction accuracy. The model is verified with experimental data taken from a testing system. The testing results show that the proposed model accurately predicts the performance of the operating evaporator in real time, with a maximum error of ±8%. The developed models will have wide applications in operational optimization, performance assessment, and fault detection and diagnosis.
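    Step (5) above, identifying the unknown parameters by least squares, can be sketched in miniature. The linear correlation y = a*x + b and the synthetic noise-free data below are illustrative, not the paper's evaporator model.

```python
# Sketch of step (5): identifying unknown model parameters by
# ordinary least squares. The linear form y = a*x + b and the data
# are invented for illustration.

def fit_least_squares(xs, ys):
    """Closed-form OLS fit of y = a*x + b."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    a = sxy / sxx
    b = mean_y - a * mean_x
    return a, b

# Synthetic "measurements" generated from y = 2x + 1.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [3.0, 5.0, 7.0, 9.0, 11.0]
a, b = fit_least_squares(xs, ys)
print(a, b)  # recovers a=2, b=1 exactly on noise-free data
```

    For a nonlinear correlation, the same idea applies with an iterative solver in place of the closed form.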

  12. The Danish national passenger modelModel specification and results

    DEFF Research Database (Denmark)

    Rich, Jeppe; Hansen, Christian Overgaard

    2016-01-01

    The paper describes the structure of the new Danish National Passenger model and provides on this basis a general discussion of large-scale model design, cost-damping and model validation. The paper aims at providing three main contributions to the existing literature. Firstly, at the general level......, the paper provides a description of a large-scale forecast model with a discussion of the linkage between population synthesis, demand and assignment. Secondly, the paper gives specific attention to model specification and in particular choice of functional form and cost-damping. Specifically we suggest...... a family of logarithmic spline functions and illustrate how it is applied in the model. Thirdly and finally, we evaluate model sensitivity and performance by evaluating the distance distribution and elasticities. In the paper we present results where the spline-function is compared with more traditional...
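    The logarithmic spline cost-damping is only named above; as a hedged illustration of the general idea, the sketch below uses a continuous piecewise cost function that is linear up to a breakpoint and logarithmic beyond it, with value and slope matched at the breakpoint. The specific functional form and breakpoint are assumptions, not the paper's specification.

```python
import math

def damped_cost(c, c0=50.0):
    """Cost-damping sketch: linear below the breakpoint c0, logarithmic
    above it, with the two pieces matched in value and first derivative
    at c0 so the function stays smooth."""
    if c <= c0:
        return c
    return c0 + c0 * math.log(c / c0)

print(damped_cost(30.0))   # below the breakpoint: undamped
print(damped_cost(100.0))  # above the breakpoint: damped below 100
```

    Damping the cost term this way reduces the sensitivity of long-distance demand to marginal cost changes, which is the behaviour cost-damping specifications are meant to capture.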

  13. Model-specification uncertainty in future forest pest outbreak.

    Science.gov (United States)

    Boulanger, Yan; Gray, David R; Cooke, Barry J; De Grandpré, Louis

    2016-04-01

    Climate change will modify forest pest outbreak characteristics, although there are disagreements regarding the specifics of these changes. A large part of this variability may be attributed to model specifications. As a case study, we developed a consensus model predicting spruce budworm (SBW, Choristoneura fumiferana [Clem.]) outbreak duration using two different predictor data sets and six different correlative methods. The model was used to project outbreak duration and the uncertainty associated with using different data sets and correlative methods (=model-specification uncertainty) for 2011-2040, 2041-2070 and 2071-2100, according to three forcing scenarios (RCP 2.6, RCP 4.5 and RCP 8.5). The consensus model showed very high explanatory power and low bias. The model projected a more important northward shift and decrease in outbreak duration under the RCP 8.5 scenario. However, variation in single-model projections increases with time, making future projections highly uncertain. Notably, the magnitude of the shifts in northward expansion, overall outbreak duration and the patterns of outbreak duration at the southern edge were highly variable according to the predictor data set and correlative method used. We also demonstrated that variation in forcing scenarios contributed only slightly to the uncertainty of model projections compared with the two sources of model-specification uncertainty. Our approach helped to quantify model-specification uncertainty in future forest pest outbreak characteristics. It may contribute to sounder decision-making by acknowledging the limits of the projections and help to identify areas where model-specification uncertainty is high. As such, we further stress that this uncertainty should be strongly considered when making forest management plans, notably by adopting adaptive management strategies so as to reduce future risks. © 2015 Her Majesty the Queen in Right of Canada. Global Change Biology © 2015 Published by John

  14. Sri Lankan FRAX model and country-specific intervention thresholds.

    Science.gov (United States)

    Lekamwasam, Sarath

    2013-01-01

    There is wide variation in the fracture probabilities estimated by Asian FRAX models, although the outputs of the South Asian models are concordant. Clinicians can choose either fixed or age-specific intervention thresholds when making treatment decisions in postmenopausal women. The cost-effectiveness of such an approach, however, needs to be addressed. This study examined suitable fracture probability intervention thresholds (ITs) for Sri Lanka, based on the Sri Lankan FRAX model. Fracture probabilities were estimated and compared using all Asian FRAX models for a postmenopausal woman with a BMI of 25 kg/m² and no clinical risk factors apart from a fragility fracture. Age-specific ITs were estimated based on the Sri Lankan FRAX model using the method followed by the National Osteoporosis Guideline Group in the UK. Using the age-specific ITs as the reference standard, suitable fixed ITs were also estimated. Fracture probabilities estimated by the different Asian FRAX models varied widely. The Japanese and Taiwanese models showed higher fracture probabilities, while the Chinese, Philippine, and Indonesian models gave lower ones. Outputs of the remaining FRAX models were generally similar. Age-specific ITs of major osteoporotic fracture probability (MOFP) based on the Sri Lankan FRAX model varied from 2.6 to 18% between 50 and 90 years. ITs of hip fracture probability (HFP) varied from 0.4 to 6.5% between 50 and 90 years. Among candidate fixed ITs, a MOFP of 11% and an HFP of 3.5% gave the lowest misclassification and the highest agreement. The Sri Lankan FRAX model behaves similarly to other Asian FRAX models such as the Indian, Singapore-Indian, Thai, and South Korean models. Clinicians may use either the fixed or the age-specific ITs in making therapeutic decisions in postmenopausal women. The economic aspects of such decisions, however, need to be considered.
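    Using the fixed intervention thresholds reported above (MOFP 11%, HFP 3.5%), a treatment decision reduces to a simple comparison. The helper below is illustrative only; the function and its interface are not part of the FRAX tool.

```python
# Minimal sketch of applying the study's fixed intervention thresholds
# (MOFP >= 11%, HFP >= 3.5%). Names and interface are invented.

MOFP_THRESHOLD = 11.0  # major osteoporotic fracture probability, %
HFP_THRESHOLD = 3.5    # hip fracture probability, %

def treatment_indicated(mofp, hfp):
    """Return True if either fixed intervention threshold is reached."""
    return mofp >= MOFP_THRESHOLD or hfp >= HFP_THRESHOLD

print(treatment_indicated(12.0, 1.0))  # True: MOFP above threshold
print(treatment_indicated(8.0, 2.0))   # False: both below
```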

  15. An Integrated Framework to Specify Domain-Specific Modeling Languages

    DEFF Research Database (Denmark)

    Zarrin, Bahram; Baumeister, Hubert

    2018-01-01

    , a logic-based specification language. The drawback of MS DSL Tools is that it does not provide a formal and rigorous approach for semantics specifications. In this framework, we use Microsoft DSL Tools to define the metamodel and graphical notations of DSLs, and an extended version of ForSpec as a formal......In this paper, we propose an integrated framework that can be used by DSL designers to implement their desired graphical domain-specific languages. This framework relies on Microsoft DSL Tools, a meta-modeling framework to build graphical domain-specific languages, and an extension of ForSpec...... language to define their semantics. Integrating these technologies under the umbrella of the Microsoft Visual Studio IDE allows DSL designers to utilize a single development environment for developing their desired domain-specific languages....

  16. Patient-Specific Modeling of Intraventricular Hemodynamics

    Science.gov (United States)

    Vedula, Vijay; Marsden, Alison

    2017-11-01

    Heart disease is one of the leading causes of death in the world. Apart from malfunctions in electrophysiology and myocardial mechanics, abnormal hemodynamics is a major factor attributed to heart disease across all ages. Computer simulations offer an efficient means to accurately reproduce in vivo flow conditions and also make predictions of post-operative outcomes and disease progression. We present an experimentally validated computational framework for performing patient-specific modeling of intraventricular hemodynamics. Our modeling framework employs the SimVascular open source software to build an anatomic model and employs robust image registration methods to extract ventricular motion from the image data. We then employ a stabilized finite element solver to simulate blood flow in the ventricles, solving the Navier-Stokes equations in arbitrary Lagrangian-Eulerian (ALE) coordinates by prescribing the wall motion extracted during registration. We model the fluid-structure interaction effects of the cardiac valves using an immersed boundary method and discuss the potential application of this methodology in single ventricle physiology and trans-catheter aortic valve replacement (TAVR). This research is supported in part by the Stanford Child Health Research Institute and the Stanford NIH-NCATS-CTSA through Grant UL1 TR001085 and partly through NIH NHLBI R01 Grant 5R01HL129727-02.

  17. Predicting Cumulative Incidence Probability: Marginal and Cause-Specific Modelling

    DEFF Research Database (Denmark)

    Scheike, Thomas H.; Zhang, Mei-Jie

    2005-01-01

    cumulative incidence probability; cause-specific hazards; subdistribution hazard; binomial modelling

  18. A Conceptual Modeling Approach for OLAP Personalization

    Science.gov (United States)

    Garrigós, Irene; Pardillo, Jesús; Mazón, Jose-Norberto; Trujillo, Juan

    Data warehouses rely on multidimensional models in order to provide decision makers with appropriate structures to intuitively analyze data with OLAP technologies. However, data warehouses may be potentially large, and multidimensional structures become increasingly complex to understand at a glance. Even if a departmental data warehouse (also known as a data mart) is used, these structures would still be too complex. As a consequence, acquiring the required information is more costly than expected, and decision makers using OLAP tools may get frustrated. In this context, current approaches for data warehouse design are focused on deriving a unique OLAP schema for all analysts from their previously stated information requirements, which is not enough to lighten the complexity of the decision-making process. To overcome this drawback, we argue for personalizing multidimensional models for OLAP technologies according to continuously changing user characteristics, context, requirements and behaviour. In this paper, we present a novel approach to personalizing OLAP systems at the conceptual level based on the underlying multidimensional model of the data warehouse, a user model and a set of personalization rules. The great advantage of our approach is that a personalized OLAP schema is provided for each decision maker, helping to better satisfy their specific analysis needs. Finally, we show the applicability of our approach through a sample scenario based on our CASE tool for data warehouse development.
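    A conceptual-level personalization rule of the kind described can be sketched as a mapping from a user model to the subset of cube dimensions exposed to that analyst. The schema, roles, and rules below are invented for illustration and are not taken from the paper.

```python
# Illustrative sketch of conceptual-level OLAP personalization: rules
# map a user model onto the subset of a cube's dimensions the analyst
# should see. Schema and rules are invented for the example.

SALES_CUBE = {"Time", "Product", "Store", "Customer", "Promotion"}

# Each rule: (predicate on the user model, dimensions to expose).
RULES = [
    (lambda u: u["role"] == "marketing", {"Time", "Product", "Promotion"}),
    (lambda u: u["role"] == "logistics", {"Time", "Store", "Product"}),
]

def personalize(cube_dims, user):
    """Union of the dimensions granted by every matching rule."""
    visible = set()
    for pred, dims in RULES:
        if pred(user):
            visible |= dims & cube_dims
    return visible or cube_dims  # no rule matched: fall back to full schema

print(sorted(personalize(SALES_CUBE, {"role": "marketing"})))
# → ['Product', 'Promotion', 'Time']
```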

  19. HEDR modeling approach: Revision 1

    International Nuclear Information System (INIS)

    Shipler, D.B.; Napier, B.A.

    1994-05-01

    This report is a revision of the previous Hanford Environmental Dose Reconstruction (HEDR) Project modeling approach report. This revised report describes the methods used in performing scoping studies and estimating final radiation doses to real and representative individuals who lived in the vicinity of the Hanford Site. The scoping studies and dose estimates pertain to various environmental pathways during various periods of time. The original report discussed the concepts under consideration in 1991. The methods for estimating dose have been refined as understanding of existing data, the scope of pathways, and the magnitudes of dose estimates were evaluated through scoping studies.

  20. SLS Navigation Model-Based Design Approach

    Science.gov (United States)

    Oliver, T. Emerson; Anzalone, Evan; Geohagan, Kevin; Bernard, Bill; Park, Thomas

    2018-01-01

    management of design requirements to the development of usable models, model requirements, and model verification and validation efforts. The models themselves are represented in C/C++ code and accompanying data files. Under the idealized process, potential ambiguity in specification is reduced because the model must be implementable, whereas a requirement is not necessarily subject to this constraint. Further, the models are shown to emulate the hardware during validation. For models developed by the Navigation Team, a common interface/standalone environment was developed. The common environment allows for easy implementation in design and analysis tools. Mechanisms such as unit test cases ensure implementation as the developer intended. The model verification and validation process provides a very high level of component design insight. The origin and implementation of the SLS variant of Model-based Design is described from the perspective of the SLS Navigation Team. The format of the models and the requirements are described. The Model-based Design approach has many benefits but is not without potential complications. Key lessons learned associated with the implementation of the Model-based Design approach and process from infancy to verification and certification are discussed.

  1. Global Environmental Change: An integrated modelling approach

    International Nuclear Information System (INIS)

    Den Elzen, M.

    1993-01-01

    Two major global environmental problems are dealt with: climate change and stratospheric ozone depletion (and their mutual interactions), briefly surveyed in part 1. In Part 2 a brief description of the integrated modelling framework IMAGE 1.6 is given. Some specific parts of the model are described in more detail in other Chapters, e.g. the carbon cycle model, the atmospheric chemistry model, the halocarbon model, and the UV-B impact model. In Part 3 an uncertainty analysis of climate change and stratospheric ozone depletion is presented (Chapter 4). Chapter 5 briefly reviews the social and economic uncertainties implied by future greenhouse gas emissions. Chapters 6 and 7 describe a model and sensitivity analysis pertaining to the scientific uncertainties and/or lacunae in the sources and sinks of methane and carbon dioxide, and their biogeochemical feedback processes. Chapter 8 presents an uncertainty and sensitivity analysis of the carbon cycle model, the halocarbon model, and the IMAGE model 1.6 as a whole. Part 4 presents the risk assessment methodology as applied to the problems of climate change and stratospheric ozone depletion more specifically. In Chapter 10, this methodology is used as a means with which to assess current ozone policy and a wide range of halocarbon policies. Chapter 11 presents and evaluates the simulated globally-averaged temperature and sea level rise (indicators) for the IPCC-1990 and 1992 scenarios, concluding with a Low Risk scenario, which would meet the climate targets. Chapter 12 discusses the impact of sea level rise on the frequency of the Dutch coastal defence system (indicator) for the IPCC-1990 scenarios. Chapter 13 presents projections of mortality rates due to stratospheric ozone depletion based on model simulations employing the UV-B chain model for a number of halocarbon policies. Chapter 14 presents an approach for allocating future emissions of CO2 among regions. (Abstract Truncated)

  2. Seismic Safety Margins Research Program (Phase I). Project VII. Systems analysis specification of computational approach

    International Nuclear Information System (INIS)

    Wall, I.B.; Kaul, M.K.; Post, R.I.; Tagart, S.W. Jr.; Vinson, T.J.

    1979-02-01

    An initial specification is presented of a computational approach for a probabilistic risk assessment model for use in the Seismic Safety Margins Research Program. This model encompasses the whole seismic calculational chain, from seismic input through soil-structure interaction and transfer functions to the probability of component failure, and integrates these failures into a system model to estimate the probability of a release of radioactive material to the environment. It is intended that the primary use of this model will be in sensitivity studies to assess the potential conservatism of different modeling elements in the chain and to provide guidance on priorities for research in seismic design of nuclear power plants.

  3. Modeling task-specific neuronal ensembles improves decoding of grasp

    Science.gov (United States)

    Smith, Ryan J.; Soares, Alcimar B.; Rouse, Adam G.; Schieber, Marc H.; Thakor, Nitish V.

    2018-06-01

    Objective. Dexterous movement involves the activation and coordination of networks of neuronal populations across multiple cortical regions. Attempts to model the firing of individual neurons commonly treat the firing rate as directly modulating with motor behavior. However, motor behavior may additionally be associated with modulations in the activity and functional connectivity of neurons in a broader ensemble. Accounting for variations in neural ensemble connectivity may provide additional information about the behavior being performed. Approach. In this study, we examined neural ensemble activity in primary motor cortex (M1) and premotor cortex (PM) of two male rhesus monkeys during performance of a center-out reach, grasp and manipulate task. We constructed point process encoding models of neuronal firing that incorporated task-specific variations in the baseline firing rate as well as variations in functional connectivity with the neural ensemble. Models were evaluated both in terms of their encoding capabilities and their ability to properly classify the grasp being performed. Main results. Task-specific ensemble models correctly predicted the performed grasp with over 95% accuracy and were shown to outperform models of neuronal activity that assume only a variable baseline firing rate. Task-specific ensemble models exhibited superior decoding performance in 82% of units in both monkeys (p < 0.01). Inclusion of ensemble activity also broadly improved the ability of models to describe observed spiking. Encoding performance of task-specific ensemble models, measured by spike timing predictability, improved upon baseline models in 62% of units. Significance. These results suggest that additional discriminative information about motor behavior found in the variations in functional connectivity of neuronal ensembles located in motor-related cortical regions is relevant to decode complex tasks such as grasping objects, and may serve as the basis for more
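    The model comparison idea can be illustrated in miniature: score an observed spike-count sequence under a baseline-only rate versus an ensemble-coupled rate and compare Poisson log-likelihoods. All data, rates, and coupling weights below are invented; the paper's actual models are point-process encoding models fit to recorded M1/PM activity.

```python
import math

# Toy sketch: does a rate modulated by ensemble activity explain a
# unit's spiking better than a constant baseline rate? All numbers
# are synthetic.

def poisson_loglik(counts, rates):
    """Poisson log-likelihood of spike counts given per-bin rates."""
    return sum(k * math.log(r) - r - math.lgamma(k + 1)
               for k, r in zip(counts, rates))

ensemble = [0, 1, 3, 4, 1, 0]        # summed activity of peer neurons
spikes   = [0, 1, 2, 3, 1, 0]        # the modeled unit's spike counts

baseline_rate = [1.2] * len(spikes)  # constant firing-rate model
coupled_rate  = [0.3 + 0.7 * e for e in ensemble]  # ensemble-coupled model

ll_base = poisson_loglik(spikes, baseline_rate)
ll_ens  = poisson_loglik(spikes, coupled_rate)
print(ll_ens > ll_base)  # prints True: the coupled model fits better here
```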

  4. A grammar inference approach for predicting kinase specific phosphorylation sites.

    Science.gov (United States)

    Datta, Sutapa; Mukhopadhyay, Subhasis

    2015-01-01

    Kinase-mediated phosphorylation site detection is a key mechanism of post-translational modification that plays an important role in regulating various cellular processes and phenotypes. Many diseases, like cancer, are related to signaling defects associated with protein phosphorylation. Characterizing protein kinases and their substrates enhances our ability to understand the mechanism of protein phosphorylation and extends our knowledge of signaling networks, thereby helping us to treat such diseases. Experimental methods for predicting phosphorylation sites are labour-intensive and expensive. Also, the manifold increase of protein sequences in the databanks over the years necessitates the development of fast and accurate computational methods for predicting phosphorylation sites in protein sequences. To date, a number of computational methods have been proposed by various researchers for predicting phosphorylation sites, but there remains much scope for improvement. In this communication, we present a simple and novel method based on a Grammatical Inference (GI) approach to automate the prediction of kinase-specific phosphorylation sites. In this regard, we have used the popular GI algorithm Alergia to infer Deterministic Stochastic Finite State Automata (DSFA) that represent the regular grammar corresponding to the phosphorylation sites. Extensive experiments on several datasets generated by us reveal that our inferred grammar successfully predicts phosphorylation sites in a kinase-specific manner. It performs significantly better when compared with other existing phosphorylation site prediction methods. We have also compared our inferred DSFA with two other GI inference algorithms. The DSFA generated by our method performs better, which indicates that our method is robust and has the potential for predicting phosphorylation sites in a kinase-specific manner.
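    As a hedged sketch of where Alergia-style inference starts, the code below builds a frequency prefix-tree acceptor from positive example windows and scores new sequences by their empirical transition probabilities. The state-merging step that produces the final DSFA is omitted for brevity, and the residue windows are invented, not real kinase motifs.

```python
from collections import defaultdict

# Frequency prefix-tree acceptor: the starting structure of Alergia.
# States are string prefixes; transition counts come from the samples.

def build_counts(samples):
    trans = defaultdict(lambda: defaultdict(int))
    for s in samples:
        state = ""
        for sym in s:
            trans[state][sym] += 1
            state += sym
    return trans

def sequence_prob(trans, s):
    """Product of empirical transition probabilities along s."""
    p, state = 1.0, ""
    for sym in s:
        total = sum(trans[state].values())
        if total == 0 or trans[state][sym] == 0:
            return 0.0
        p *= trans[state][sym] / total
        state += sym
    return p

# Toy "phosphorylation site" windows (invented, not real motifs).
train = ["RRS", "RRS", "RKS"]
model = build_counts(train)
print(sequence_prob(model, "RRS"))  # seen often: high probability
print(sequence_prob(model, "AAA"))  # unseen: probability 0.0
```

    Alergia proper would then merge states whose outgoing distributions are statistically compatible, yielding a compact DSFA rather than this raw tree.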

  5. A Grammar Inference Approach for Predicting Kinase Specific Phosphorylation Sites

    Science.gov (United States)

    Datta, Sutapa; Mukhopadhyay, Subhasis

    2015-01-01

    Kinase-mediated phosphorylation site detection is a key mechanism of post-translational modification that plays an important role in regulating various cellular processes and phenotypes. Many diseases, like cancer, are related to signaling defects associated with protein phosphorylation. Characterizing protein kinases and their substrates enhances our ability to understand the mechanism of protein phosphorylation and extends our knowledge of signaling networks, thereby helping us to treat such diseases. Experimental methods for predicting phosphorylation sites are labour-intensive and expensive. Also, the manifold increase of protein sequences in the databanks over the years necessitates the development of fast and accurate computational methods for predicting phosphorylation sites in protein sequences. To date, a number of computational methods have been proposed by various researchers for predicting phosphorylation sites, but there remains much scope for improvement. In this communication, we present a simple and novel method based on a Grammatical Inference (GI) approach to automate the prediction of kinase-specific phosphorylation sites. In this regard, we have used the popular GI algorithm Alergia to infer Deterministic Stochastic Finite State Automata (DSFA) that represent the regular grammar corresponding to the phosphorylation sites. Extensive experiments on several datasets generated by us reveal that our inferred grammar successfully predicts phosphorylation sites in a kinase-specific manner. It performs significantly better when compared with other existing phosphorylation site prediction methods. We have also compared our inferred DSFA with two other GI inference algorithms. The DSFA generated by our method performs better, which indicates that our method is robust and has the potential for predicting phosphorylation sites in a kinase-specific manner. PMID:25886273

  6. Formal Specification and Automatic Analysis of Business Processes under Authorization Constraints: An Action-Based Approach

    Science.gov (United States)

    Armando, Alessandro; Giunchiglia, Enrico; Ponta, Serena Elisa

    We present an approach to the formal specification and automatic analysis of business processes under authorization constraints based on the action language C. The use of C allows for a natural and concise modeling of the business process and the associated security policy, and for the automatic analysis of the resulting specification by using the Causal Calculator (CCALC). Our approach improves upon previous work by greatly simplifying the specification step while retaining the ability to perform a fully automatic analysis. To illustrate the effectiveness of the approach, we describe its application to a version of a business process taken from the banking domain and use CCALC to determine resource allocation plans complying with the security policy.
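    The paper performs its analysis by encoding the process and policy in the action language C and querying CCALC. Purely as an illustration of the kind of authorization property involved, the sketch below checks an execution trace against a separation-of-duty constraint; the contract-approval example and all names are invented.

```python
# Illustrative separation-of-duty check on an execution trace, the kind
# of authorization constraint the paper encodes formally. Agents,
# actions, and duties are invented for the example.

def violates_sod(trace, duties):
    """True if one agent performed two actions that must be separated."""
    who = {action: agent for agent, action in trace}
    for a, b in duties:
        if a in who and b in who and who[a] == who[b]:
            return True
    return False

duties = [("prepare_contract", "approve_contract")]

ok_trace  = [("alice", "prepare_contract"), ("bob", "approve_contract")]
bad_trace = [("alice", "prepare_contract"), ("alice", "approve_contract")]

print(violates_sod(ok_trace, duties))   # False
print(violates_sod(bad_trace, duties))  # True
```

    A model checker like CCALC does the same kind of check exhaustively over all reachable executions rather than over a single given trace.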

  7. Cow-specific treatment of clinical mastitis: an economic approach.

    Science.gov (United States)

    Steeneveld, W; van Werven, T; Barkema, H W; Hogeveen, H

    2011-01-01

    Under Dutch circumstances, most clinical mastitis (CM) cases of cows on dairy farms are treated with a standard intramammary antimicrobial treatment. Several antimicrobial treatments are available for CM, differing in antimicrobial compound, route of application, duration, and cost. Because cow factors (e.g., parity, stage of lactation, and somatic cell count history) and the causal pathogen influence the probability of cure, cow-specific treatment of CM is often recommended. The objective of this study was to determine if cow-specific treatment of CM is economically beneficial. Using a stochastic Monte Carlo simulation model, 20,000 CM cases were simulated. These CM cases were caused by Streptococcus uberis and Streptococcus dysgalactiae (40%), Staphylococcus aureus (30%), or Escherichia coli (30%). For each simulated CM case, the consequences of using different antimicrobial treatment regimens (standard 3-d intramammary, extended 5-d intramammary, combination 3-d intramammary+systemic, combination 3-d intramammary+systemic+1-d nonsteroidal antiinflammatory drugs, and combination extended 5-d intramammary+systemic) were simulated simultaneously. Finally, total costs of the 5 antimicrobial treatment regimens were compared. Some inputs for the model were based on literature information and assumptions made by the authors were used if no information was available. Bacteriological cure for each individual cow depended on the antimicrobial treatment regimen, the causal pathogen, and the cow factors parity, stage of lactation, somatic cell count history, CM history, and whether the cow was systemically ill. Total costs for each case depended on treatment costs for the initial CM case (including costs for antibiotics, milk withdrawal, and labor), treatment costs for follow-up CM cases, costs for milk production losses, and costs for culling. Average total costs for CM using the 5 treatments were (US) $224, $247, $253, $260, and $275, respectively. 
Average probabilities
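    A stochastic simulation of this kind can be sketched as follows. The regimens, cure probabilities, and costs below are invented placeholders, not the study's inputs; the real model conditions cure on the causal pathogen and cow factors as described above.

```python
import random

# Illustrative Monte Carlo sketch of the study's comparison: simulate
# clinical mastitis cases and accumulate total cost per treatment
# regimen. All probabilities and costs are invented.

random.seed(42)

REGIMENS = {
    "standard_3d": {"treat_cost": 120.0, "cure_prob": 0.60},
    "extended_5d": {"treat_cost": 160.0, "cure_prob": 0.70},
}
FOLLOW_UP_COST = 250.0  # extra cost when the first treatment fails

def simulate(regimen, n_cases=20000):
    """Average total cost per case for one regimen."""
    spec = REGIMENS[regimen]
    total = 0.0
    for _ in range(n_cases):
        total += spec["treat_cost"]
        if random.random() > spec["cure_prob"]:  # not cured: follow-up
            total += FOLLOW_UP_COST
    return total / n_cases

costs = {name: simulate(name) for name in REGIMENS}
print(costs)  # expected values: ~220 for standard_3d, ~235 for extended_5d
```

    With these invented inputs, the cheaper regimen wins despite its lower cure probability; with the study's real inputs, the ranking depends on pathogen mix and cow factors.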

  8. On the specification of structural equation models for ecological systems

    Science.gov (United States)

    Grace, J.B.; Michael, Anderson T.; Han, O.; Scheiner, S.M.

    2010-01-01

    The use of structural equation modeling (SEM) is often motivated by its utility for investigating complex networks of relationships, but also by its promise as a means of representing theoretical concepts using latent variables. In this paper, we discuss characteristics of ecological theory and some of the challenges for proper specification of theoretical ideas in structural equation models (SE models). In our presentation, we describe some of the requirements for classical latent variable models, in which observed variables (indicators) are interpreted as the effects of underlying causes. We also describe alternative model specifications in which indicators are interpreted as having causal influences on the theoretical concepts. We suggest that this latter nonclassical specification (which involves another variable type, the composite) will often be appropriate for ecological studies because of the multifaceted nature of our theoretical concepts. In this paper, we employ meta-models to aid the translation of theory into SE models and also to facilitate our ability to relate results back to our theories. We demonstrate our approach by showing how a synthetic theory of grassland biodiversity can be evaluated using SEM and data from a coastal grassland. In this example, the theory focuses on the responses of species richness to abiotic stress and disturbance, both directly and through intervening effects on community biomass. Models examined include both those based on classical forms (where each concept is represented using a single latent variable) and ones in which the concepts are recognized to be multifaceted and modeled as such. To address the challenge of matching SE models with the conceptual level of our theory, two approaches are illustrated: compositing and aggregation. Both approaches are shown to have merits, with the former being preferable for cases where the multiple facets of a concept have widely differing effects in the

  9. Scaling up biomass gasifier use: an application-specific approach

    International Nuclear Information System (INIS)

    Ghosh, Debyani; Sagar, Ambuj D.; Kishore, V.V.N.

    2006-01-01

    Biomass energy accounts for about 11% of the global primary energy supply, and it is estimated that about 2 billion people worldwide depend on biomass for their energy needs. Yet most biomass is used in a primitive and inefficient manner, primarily in developing countries, leading to a host of adverse implications for human health, the environment, workplace conditions, and social well-being. Therefore, utilizing biomass in a clean and efficient manner to deliver modern energy services to the world's poor remains an imperative for the development community. One possible approach is the use of biomass gasifiers. Although significant efforts have been directed towards developing and deploying biomass gasifiers in many countries, scaling up their dissemination remains an elusive goal. Based on an examination of biomass gasifier development, demonstration, and deployment efforts in India, a country with more than two decades of experience in biomass gasifier development and dissemination, this article identifies a number of barriers that have hindered widespread deployment of biomass gasifier-based energy systems. It also suggests a possible approach for moving forward, which involves focusing on specific application areas that satisfy a set of criteria critical to the deployment of biomass gasifiers, and then tailoring the scaling-up strategy to the characteristics of the user groups for each application. Our technical, financial, economic and institutional analysis suggests that an initial focus on four categories of applications (small and medium enterprises, the informal sector, biomass-processing industries, and some rural areas) may be particularly feasible and fruitful.

  10. Specific Cell (Re-)Programming: Approaches and Perspectives.

    Science.gov (United States)

    Hausburg, Frauke; Jung, Julia Jeannine; David, Robert

    2018-01-01

    Many disorders are manifested by dysfunction of key cell types or their disturbed integration in complex organs. Adult organ systems often bear restricted self-renewal potential and are incapable of achieving functional regeneration. This underlies the need for novel strategies in the field of cell (re-)programming-based regenerative medicine as well as for drug development in vitro. The regenerative field has been hampered by the restricted availability of adult stem cells and the potentially hazardous features of pluripotent embryonic stem cells (ESCs) and induced pluripotent stem cells (iPSCs). Moreover, ethical concerns and legal restrictions regarding the generation and use of ESCs still exist. The establishment of direct reprogramming protocols for various therapeutically valuable somatic cell types has overcome some of these limitations. Meanwhile, new perspectives for the safe and efficient generation of different specified somatic cell types have emerged from numerous approaches relying on exogenous expression of lineage-specific transcription factors, coding and noncoding RNAs, and chemical compounds. It should be of highest priority to develop protocols for the production of mature and physiologically functional cells with properties ideally matching those of their endogenous counterparts. Their availability can bring together basic research, drug screening, safety testing, and ultimately clinical trials. Here, we highlight the remarkable successes in cellular (re-)programming, which have greatly advanced the field of regenerative medicine in recent years. In particular, we review recent progress on the generation of cardiomyocyte subtypes, with a focus on cardiac pacemaker cells.

  11. A Bayesian approach to model uncertainty

    International Nuclear Information System (INIS)

    Buslik, A.

    1994-01-01

    A Bayesian approach to model uncertainty is taken. For the case of a finite number of alternative models, the model uncertainty is equivalent to parameter uncertainty. A derivation based on Savage's partition problem is given
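The equivalence noted above (a finite set of alternative models makes model uncertainty a discrete parameter uncertainty) can be sketched with Bayes' rule over the model index; the priors and likelihoods below are invented purely for illustration:

```python
import numpy as np

# Prior probabilities over three candidate models (assumed values).
prior = np.array([0.5, 0.3, 0.2])

# Likelihood of the observed data under each model (assumed values).
likelihood = np.array([0.02, 0.05, 0.01])

# Bayes' rule: the model index is treated as a discrete parameter,
# so its posterior follows the usual prior-times-likelihood update.
posterior = prior * likelihood
posterior /= posterior.sum()

print(posterior)  # posterior mass over the finite model set
```

Here the data favour the second model despite its smaller prior, which is exactly how model uncertainty and parameter uncertainty coincide in the finite case.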

  12. Specific Type of Knowledge Map: Mathematical Model

    OpenAIRE

    Milan, Houška; Martina, Beránková

    2005-01-01

    The article deals with relationships between mathematical models and knowledge maps. The goal of the article is to suggest how to use the mathematical model as a knowledge map and/or as a part (esp. the inference mechanism) of the knowledge system. The results are demonstrated on the case study, when the knowledge from a story is expressed by mathematical model. The model is used for both knowledge warehousing and inferencing new artificially derived knowledge.

  13. Chemical cleaning specification: few tube test model

    International Nuclear Information System (INIS)

    Hampton, L.V.; Simpson, J.L.

    1979-09-01

    The specification is for the waterside chemical cleaning of the 2 1/4 Cr - 1 Mo steel steam generator tubes. It describes the reagents and conditions for post-chemical cleaning passivation of the evaporator tubes

  14. Introducing a game approach towards IS requirements specification

    DEFF Research Database (Denmark)

    Jensen, Mika Yasuoka; Kadoya, Kyoichi; Niwa, Takashi

    2014-01-01

    Devising a system requirements specification is a challenging task. Even after several decades of system development research, specifications for large-scale, widely-used systems remain difficult. In this paper, we suggest a first step toward a requirements specification through a stakeholder inv...

  15. Domain-Specific Modelling Languages in Bigraphs

    DEFF Research Database (Denmark)

    Perrone, Gian David

    ...of models, in order to improve the utility of the models we build, and to ease the process of model construction by moving the languages we use to express such models closer to their respective domains. This thesis is concerned with the study of bigraphical reactive systems as a host for domain...... for deciding reaction rule causation. Finally, we provide a mechanism for the modular construction of domain-specific modelling languages as bigraphical reactive systems, exploring the relationship between vertical refinement and language specialisation in this setting. The thesis is composed of several...

  16. Cost Concept Model and Gateway Specification

    DEFF Research Database (Denmark)

    Kejser, Ulla Bøgvad

    2014-01-01

    This document introduces a Framework supporting the implementation of a cost concept model against which current and future cost models for curating digital assets can be benchmarked. The value built into this cost concept model leverages the comprehensive engagement by the 4C project with various...... to promote interoperability; • A Nested Model for Digital Curation—that visualises the core concepts, demonstrates how they interact and places them into context visually by linking them to A Cost and Benefit Model for Curation; This Framework provides guidance for data collection and associated calculations...

  17. Gaussian graphical modeling reveals specific lipid correlations in glioblastoma cells

    Science.gov (United States)

    Mueller, Nikola S.; Krumsiek, Jan; Theis, Fabian J.; Böhm, Christian; Meyer-Bäse, Anke

    2011-06-01

    Advances in high-throughput measurements of biological specimens necessitate the development of biologically driven computational techniques. To understand the molecular level of many human diseases, such as cancer, lipid quantifications have been shown to offer an excellent opportunity to reveal disease-specific regulations. The data analysis of the cell lipidome, however, remains a challenging task and cannot be accomplished solely based on intuitive reasoning. We have developed a method to identify a lipid correlation network which is entirely disease-specific. A powerful method to correlate experimentally measured lipid levels across the various samples is a Gaussian Graphical Model (GGM), which is based on partial correlation coefficients. In contrast to regular Pearson correlations, partial correlations aim to identify only direct correlations while eliminating indirect associations. Conventional GGM calculations on the entire dataset can, however, not provide information on whether a correlation is truly disease-specific with respect to the disease samples and not a correlation of control samples. Thus, we implemented a novel differential GGM approach unraveling only the disease-specific correlations, and applied it to the lipidome of immortal Glioblastoma tumor cells. A large set of lipid species were measured by mass spectrometry in order to evaluate lipid remodeling as a result to a combination of perturbation of cells inducing programmed cell death, while the other perturbations served solely as biological controls. With the differential GGM, we were able to reveal Glioblastoma-specific lipid correlations to advance biomedical research on novel gene therapies.
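A minimal sketch of the partial-correlation idea behind a Gaussian Graphical Model, on simulated data with a known chain structure (A regulates B, B regulates C); the structure and all numbers are assumptions for illustration, not the paper's lipid data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a chain A -> B -> C: A and C are only indirectly associated.
n = 2000
a = rng.normal(size=n)
b = 0.8 * a + rng.normal(size=n)
c = 0.8 * b + rng.normal(size=n)
X = np.column_stack([a, b, c])

# Regular Pearson correlations: A and C appear correlated (indirect link).
pearson = np.corrcoef(X, rowvar=False)

# GGM partial correlations from the precision (inverse covariance) matrix:
# rho_ij = -Omega_ij / sqrt(Omega_ii * Omega_jj)
omega = np.linalg.inv(np.cov(X, rowvar=False))
d = np.sqrt(np.diag(omega))
partial = -omega / np.outer(d, d)
np.fill_diagonal(partial, 1.0)

# The spurious A-C association vanishes in the partial-correlation matrix.
print(pearson[0, 2], partial[0, 2])
```

This is the core contrast the abstract draws: Pearson correlation reports the indirect A-C association, while the partial correlation isolates only direct links.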

  18. Fusarium diversity in soil using a specific molecular approach and a cultural approach.

    Science.gov (United States)

    Edel-Hermann, Véronique; Gautheron, Nadine; Mounier, Arnaud; Steinberg, Christian

    2015-04-01

    Fusarium species are ubiquitous in soil. They cause plant and human diseases and can produce mycotoxins. Surveys of Fusarium species diversity in environmental samples usually rely on laborious culture-based methods. In the present study, we have developed a molecular method to analyze Fusarium diversity directly from soil DNA. We designed primers targeting the translation elongation factor 1-alpha (EF-1α) gene and demonstrated their specificity toward Fusarium using a large collection of fungi. We used the specific primers to construct a clone library from three contrasting soils. Sequence analysis confirmed the specificity of the assay, with 750 clones identified as Fusarium and distributed among eight species or species complexes. The Fusarium oxysporum species complex (FOSC) was the most abundant one in the three soils, followed by the Fusarium solani species complex (FSSC). We then compared our molecular approach results with those obtained by isolating Fusarium colonies on two culture media and identifying species by sequencing part of the EF-1α gene. The 750 isolates were distributed into eight species or species complexes, with the same dominant species as with the cloning method. Sequence diversity was much higher in the clone library than in the isolate collection. The molecular approach proved to be a valuable tool to assess Fusarium diversity in environmental samples. Combined with high throughput sequencing, it will allow for in-depth analysis of large numbers of samples. Published by Elsevier B.V.

  19. Specimen-specific modeling of hip fracture pattern and repair.

    Science.gov (United States)

    Ali, Azhar A; Cristofolini, Luca; Schileo, Enrico; Hu, Haixiang; Taddei, Fulvia; Kim, Raymond H; Rullkoetter, Paul J; Laz, Peter J

    2014-01-22

    Hip fracture remains a major health problem for the elderly. Clinical studies have assessed fracture risk based on bone quality in the aging population and cadaveric testing has quantified bone strength and fracture loads. Prior modeling has primarily focused on quantifying the strain distribution in bone as an indicator of fracture risk. Recent advances in the extended finite element method (XFEM) enable prediction of the initiation and propagation of cracks without requiring a priori knowledge of the crack path. Accordingly, the objectives of this study were to predict femoral fracture in specimen-specific models using the XFEM approach, to perform one-to-one comparisons of predicted and in vitro fracture patterns, and to develop a framework to assess the mechanics and load transfer in the fractured femur when it is repaired with an osteosynthesis implant. Five specimen-specific femur models were developed from in vitro experiments under a simulated stance loading condition. Predicted fracture patterns closely matched the in vitro patterns; however, predictions of fracture load differed by approximately 50% due to sensitivity to local material properties. Specimen-specific intertrochanteric fractures were induced by subjecting the femur models to a sideways fall and repaired with a contemporary implant. Under a post-surgical stance loading, model-predicted load sharing between the implant and bone across the fracture surface varied from 59%:41% to 89%:11%, underscoring the importance of considering anatomic and fracture variability in the evaluation of implants. XFEM modeling shows potential as a macro-level analysis enabling fracture investigations of clinical cohorts, including at-risk groups, and the design of robust implants. © 2013 Published by Elsevier Ltd.

  20. Modular Modelling and Simulation Approach - Applied to Refrigeration Systems

    DEFF Research Database (Denmark)

    Sørensen, Kresten Kjær; Stoustrup, Jakob

    2008-01-01

    This paper presents an approach to modelling and simulation of the thermal dynamics of a refrigeration system, specifically a reefer container. A modular approach is used and the objective is to increase the speed and flexibility of the developed simulation environment. The refrigeration system...

  1. SPECIFICITIES OF COMPETENCY APPROACH IMPLEMENTATION: UKRAINIAN AND EUROPEAN EXPERIENCE

    Directory of Open Access Journals (Sweden)

    Oksana V. Ovcharuk

    2010-08-01

    Full Text Available The article deals with the problems of implementing the competency approach in the formation of education content. A comparative analysis of European and Ukrainian experience in discussing lists of key competencies has been carried out. Ukrainian perspectives on integrating the competency approach into education curricula are revealed.

  2. System Behavior Models: A Survey of Approaches

    Science.gov (United States)

    2016-06-01

    The spiral model was chosen for researching and structuring this thesis, shown in Figure 1. This approach allowed multiple iterations of source material... applications and refining through iteration. The research is limited to a literature review...

  3. SPECIFIC MODELS OF REPRESENTING THE INTELLECTUAL CAPITAL

    Directory of Open Access Journals (Sweden)

    Andreea Feraru

    2014-12-01

    Full Text Available Various scientists in the modern age of management have proposed different models for evaluating intellectual capital, and some of these models are analysed critically in this study. Most authors examine intellectual capital from a static perspective and focus on the development of its various evaluation models. In this chapter we survey the classical static models (Sveiby, Edvinsson, Balanced Scorecard) as well as the canonical model of intellectual capital. In a spectral dynamic analysis, organisational intellectual capital is structured into organisational knowledge, organisational intelligence and organisational values, whose value is built on certain mechanisms called integrators, whose chief constitutive elements are individual knowledge, individual intelligence and individual cultural values. Organizations, as employers, must especially reconsider the work of those employees who value knowledge, because such employees are free to choose how, and especially where, they invest their own energy, skills and time, and can be treated as freelancers or as small entrepreneurs.

  4. A Model-Driven Approach for Telecommunications Network Services Definition

    Science.gov (United States)

    Chiprianov, Vanea; Kermarrec, Yvon; Alff, Patrick D.

    The present-day telecommunications market imposes a short concept-to-market time on service providers. To reduce it, we propose a computer-aided, model-driven, service-specific tool with support for collaborative work and for checking properties on models. We started by defining a prototype of the meta-model (MM) of the service domain. Using this prototype, we defined a simple graphical modeling language specific to service designers. We are currently enlarging the MM of the domain using model transformations from Network Abstraction Layers (NALs). In the future, we will investigate approaches to ensure support for collaborative work and for checking properties on models.

  5. Domain-specific modeling enabling full code generation

    CERN Document Server

    Kelly, Steven

    2007-01-01

    Domain-Specific Modeling (DSM) is the latest approach to software development, promising to greatly increase the speed and ease of software creation. Early adopters of DSM have been enjoying productivity increases of 500–1000% in production for over a decade. This book introduces DSM and offers examples from various fields to illustrate to experienced developers how DSM can improve software development in their teams. Two authorities in the field explain what DSM is, why it works, and how to successfully create and use a DSM solution to improve productivity and quality. Divided into four parts, the book covers: background and motivation; fundamentals; in-depth examples; and creating DSM solutions. There is an emphasis throughout the book on practical guidelines for implementing DSM, including how to identify the necessary language constructs, how to generate full code from models, and how to provide tool support for a new DSM language. The example cases described in the book are available on the book's Website, www.dsmbook....

  6. Learning Actions Models: Qualitative Approach

    DEFF Research Database (Denmark)

    Bolander, Thomas; Gierasimczuk, Nina

    2015-01-01

    In dynamic epistemic logic, actions are described using action models. In this paper we introduce a framework for studying learnability of action models from observations. We present first results concerning propositional action models. First we check two basic learnability criteria: finite ident...

  7. Modeling the Cumulative Effects of Social Exposures on Health: Moving beyond Disease-Specific Models

    Directory of Open Access Journals (Sweden)

    Heather L. White

    2013-03-01

    Full Text Available The traditional explanatory models used in epidemiology are “disease specific”, identifying risk factors for specific health conditions. Yet social exposures lead to a generalized, cumulative health impact which may not be specific to one illness. Disease-specific models may therefore misestimate social factors’ effects on health. Using data from the Canadian Community Health Survey and the Canada 2001 Census, we construct and compare “disease-specific” and “generalized health impact” (GHI) models to gauge the negative health effects of one social exposure: socioeconomic position (SEP). We use logistic and multinomial multilevel modeling with neighbourhood-level material deprivation, individual-level education and household income to compare and contrast the two approaches. In disease-specific models, the social determinants under study were each associated with the health conditions of interest. However, larger effect sizes were apparent when outcomes were modeled as compound health problems (0, 1, 2, or 3+ conditions) using the GHI approach. To more accurately estimate social exposures’ impacts on population health, researchers should consider a GHI framework.

  8. Position-specific isotope modeling of organic micropollutants transformation through different reaction pathways

    DEFF Research Database (Denmark)

    Jin, Biao; Rolle, Massimo

    2016-01-01

    The degradation of organic micropollutants occurs via different reaction pathways. Compound specific isotope analysis is a valuable tool to identify such degradation pathways in different environmental systems. We propose a mechanism-based modeling approach that provides a quantitative framework ...

  9. Global energy modeling - A biophysical approach

    Energy Technology Data Exchange (ETDEWEB)

    Dale, Michael

    2010-09-15

    This paper contrasts the standard economic approach to energy modelling with energy models using a biophysical approach. Neither of these approaches includes changing energy-returns-on-investment (EROI) due to declining resource quality or the capital intensive nature of renewable energy sources. Both of these factors will become increasingly important in the future. An extension to the biophysical approach is outlined which encompasses a dynamic EROI function that explicitly incorporates technological learning. The model is used to explore several scenarios of long-term future energy supply especially concerning the global transition to renewable energy sources in the quest for a sustainable energy system.
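The dynamic-EROI idea sketched in the abstract (declining fossil resource quality versus technological learning for renewables) can be illustrated with a toy iteration; the functional forms and every parameter value below are invented, not taken from the paper's model:

```python
import math

def fossil_eroi(cum_extraction, initial=30.0, decline=0.02):
    # Declining resource quality lowers the energy return on investment.
    return initial / (1.0 + decline * cum_extraction)

def renewable_eroi(cum_capacity, start=8.0, ceiling=25.0, rate=0.01):
    # Technological learning raises the return towards a ceiling.
    return ceiling - (ceiling - start) * math.exp(-rate * cum_capacity)

extraction = capacity = 0.0
history = []
for year in range(100):
    extraction += 10.0   # annual fossil extraction (arbitrary units)
    capacity += 10.0     # annual renewable build-out (arbitrary units)
    history.append((fossil_eroi(extraction), renewable_eroi(capacity)))

# Fossil EROI leads early but declines; the learning curve lets
# renewables overtake it later in the scenario.
print(history[0], history[-1])
```

The crossover behaviour, not the specific numbers, is the point: a static-EROI model cannot represent this transition at all.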

  10. Challenges and opportunities for integrating lake ecosystem modelling approaches

    Science.gov (United States)

    Mooij, Wolf M.; Trolle, Dennis; Jeppesen, Erik; Arhonditsis, George; Belolipetsky, Pavel V.; Chitamwebwa, Deonatus B.R.; Degermendzhy, Andrey G.; DeAngelis, Donald L.; Domis, Lisette N. De Senerpont; Downing, Andrea S.; Elliott, J. Alex; Ruberto, Carlos Ruberto; Gaedke, Ursula; Genova, Svetlana N.; Gulati, Ramesh D.; Hakanson, Lars; Hamilton, David P.; Hipsey, Matthew R.; Hoen, Jochem 't; Hulsmann, Stephan; Los, F. Hans; Makler-Pick, Vardit; Petzoldt, Thomas; Prokopkin, Igor G.; Rinke, Karsten; Schep, Sebastiaan A.; Tominaga, Koji; Van Dam, Anne A.; Van Nes, Egbert H.; Wells, Scott A.; Janse, Jan H.

    2010-01-01

    A large number and wide variety of lake ecosystem models have been developed and published during the past four decades. We identify two challenges for making further progress in this field. One such challenge is to avoid developing more models largely following the concept of others ('reinventing the wheel'). The other challenge is to avoid focusing on only one type of model, while ignoring new and diverse approaches that have become available ('having tunnel vision'). In this paper, we aim at improving the awareness of existing models and knowledge of concurrent approaches in lake ecosystem modelling, without covering all possible model tools and avenues. First, we present a broad variety of modelling approaches. To illustrate these approaches, we give brief descriptions of rather arbitrarily selected sets of specific models. We deal with static models (steady state and regression models), complex dynamic models (CAEDYM, CE-QUAL-W2, Delft 3D-ECO, LakeMab, LakeWeb, MyLake, PCLake, PROTECH, SALMO), structurally dynamic models and minimal dynamic models. We also discuss a group of approaches that could all be classified as individual based: super-individual models (Piscator, Charisma), physiologically structured models, stage-structured models and trait-based models. We briefly mention genetic algorithms, neural networks, Kalman filters and fuzzy logic. Thereafter, we zoom in, as an in-depth example, on the multi-decadal development and application of the lake ecosystem model PCLake and related models (PCLake Metamodel, Lake Shira Model, IPH-TRIM3D-PCLake). In the discussion, we argue that while the historical development of each approach and model is understandable given its 'leading principle', there are many opportunities for combining approaches. We take the point of view that a single 'right' approach does not exist and should not be strived for. Instead, multiple modelling approaches, applied concurrently to a given problem, can help develop an integrative

  11. A Unified Approach to Modeling and Programming

    DEFF Research Database (Denmark)

    Madsen, Ole Lehrmann; Møller-Pedersen, Birger

    2010-01-01

    SIMULA was a language for modeling and programming and provided a unified approach to modeling and programming in contrast to methodologies based on structured analysis and design. The current development seems to be going in the direction of separation of modeling and programming. The goal of this paper is to go back to the future and get inspiration from SIMULA and propose a unified approach. In addition to reintroducing the contributions of SIMULA and the Scandinavian approach to object-oriented programming, we do this by discussing a number of issues in modeling and programming and argue why we...

  12. Towards a CPN-Based Modelling Approach for Reconciling Verification and Implementation of Protocol Models

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge; Kristensen, Lars Michael

    2013-01-01

    Formal modelling of protocols is often aimed at one specific purpose such as verification or automatically generating an implementation. This leads to models that are useful for one purpose, but not for others. Being able to derive models for verification and implementation from a single model...... is beneficial both in terms of reduced total modelling effort and confidence that the verification results are valid also for the implementation model. In this paper we introduce the concept of a descriptive specification model and an approach based on refining a descriptive model to target both verification...... how this model can be refined to target both verification and implementation....

  13. Modelling road accidents: An approach using structural time series

    Science.gov (United States)

    Junus, Noor Wahida Md; Ismail, Mohd Tahir

    2014-09-01

    In this paper, the trend of road accidents in Malaysia for the years 2001 until 2012 was modelled using a structural time series approach. The structural time series model was identified using a stepwise method, and the residuals for each model were tested. The best-fitted model was chosen based on the smallest Akaike Information Criterion (AIC) and prediction error variance. In order to check the quality of the model, a data validation procedure was performed by predicting the monthly number of road accidents for the year 2012. Results indicate that the best specification of the structural time series model to represent road accidents is the local level with a seasonal model.
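The selection step described above (fit candidate specifications, keep the one with the smallest AIC) can be sketched on synthetic monthly data; this uses plain regression specifications rather than the paper's structural time series software, and the series and all numbers are invented:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic monthly series with a trend and annual seasonality (assumed).
n = 144  # 12 years of monthly observations
t = np.arange(n)
y = 500 + 0.5 * t + 10 * np.sin(2 * np.pi * t / 12) + rng.normal(scale=5, size=n)

def fit_aic(X, y):
    """OLS fit; Gaussian AIC up to a constant: n*log(RSS/n) + 2k."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    k = X.shape[1] + 1  # coefficients plus the error variance
    return len(y) * np.log(rss / len(y)) + 2 * k

# Candidate specifications: level only, level+trend, level+trend+seasonal.
ones = np.ones((n, 1))
trend = t.reshape(-1, 1)
dummies = np.eye(12)[t % 12][:, 1:]  # 11 monthly dummies

candidates = {
    "level": ones,
    "level+trend": np.hstack([ones, trend]),
    "level+trend+seasonal": np.hstack([ones, trend, dummies]),
}
aics = {name: fit_aic(X, y) for name, X in candidates.items()}
best = min(aics, key=aics.get)
print(best)
```

On this seasonal series the richest specification wins despite its AIC penalty, mirroring the stepwise identification the abstract describes.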

  14. Multiple Model Approaches to Modelling and Control,

    DEFF Research Database (Denmark)

    on the ease with which prior knowledge can be incorporated. It is interesting to note that researchers in Control Theory, Neural Networks, Statistics, Artificial Intelligence and Fuzzy Logic have more or less independently developed very similar modelling methods, calling them Local Model Networks, Operating......, and allows direct incorporation of high-level and qualitative plant knowledge into the model. These advantages have proven to be very appealing for industrial applications, and the practical, intuitively appealing nature of the framework is demonstrated in chapters describing applications of local methods...... to problems in the process industries, biomedical applications and autonomous systems. The successful application of the ideas to demanding problems is already encouraging, but creative development of the basic framework is needed to better allow the integration of human knowledge with automated learning...

  15. Geometrical approach to fluid models

    International Nuclear Information System (INIS)

    Kuvshinov, B.N.; Schep, T.J.

    1997-01-01

    Differential geometry based upon the Cartan calculus of differential forms is applied to investigate invariant properties of equations that describe the motion of continuous media. The main feature of this approach is that physical quantities are treated as geometrical objects. The geometrical notion of invariance is introduced in terms of Lie derivatives and a general procedure for the construction of local and integral fluid invariants is presented. The solutions of the equations for invariant fields can be written in terms of Lagrange variables. A generalization of the Hamiltonian formalism for finite-dimensional systems to continuous media is proposed. Analogously to finite-dimensional systems, Hamiltonian fluids are introduced as systems that annihilate an exact two-form. It is shown that Euler and ideal, charged fluids satisfy this local definition of a Hamiltonian structure. A new class of scalar invariants of Hamiltonian fluids is constructed that generalizes the invariants that are related with gauge transformations and with symmetries (Noether). copyright 1997 American Institute of Physics

  16. TIPPtool: Compositional Specification and Analysis of Markovian Performance Models

    NARCIS (Netherlands)

    Hermanns, H.; Halbwachs, N.; Peled, D.; Mertsiotakis, V.; Siegle, M.

    1999-01-01

    In this short paper we briefly describe a tool which is based on a Markovian stochastic process algebra. The tool offers both model specification and quantitative model analysis in a compositional fashion, wrapped in a userfriendly graphical front-end.

  17. Leakage flow simulation in a specific pump model

    International Nuclear Information System (INIS)

    Dupont, P; Bayeul-Lainé, A C; Dazin, A; Bois, G; Roussette, O; Si, Q

    2014-01-01

    This paper deals with the influence of the leakage flow existing in the SHF pump model on the analysis of the internal flow behaviour inside the vane diffuser of the pump model, using both experiments and calculations. PIV measurements have been performed at different hub-to-shroud planes inside one diffuser channel passage for a given speed of rotation and various flow rates. For each operating condition, the PIV measurements have been triggered at different angular impeller positions. The performances and the static pressure rise of the diffuser were also measured using a three-hole probe. The numerical simulations were carried out with the Star-CCM+ 8.06 code (RANS frozen and unsteady calculations). Comparisons between numerical and experimental results are presented and discussed for three flow rates. The diffuser performances obtained from the numerical simulations are compared with those obtained from the three-hole probe indications. The comparisons show little influence of fluid leakage on global performances but a real improvement in the prediction of the impeller efficiency, the pump efficiency and the velocity distributions. These results show that leakage is an important parameter that has to be taken into account in order to make improved comparisons between numerical approaches and experiments in such a specific model setup.

  18. Modelling the Heat Consumption in District Heating Systems using a Grey-box approach

    DEFF Research Database (Denmark)

    Nielsen, Henrik Aalborg; Madsen, Henrik

    2006-01-01

    identification of an overall model structure followed by data-based modelling, whereby the details of the model are identified. This approach is sometimes called grey-box modelling, but the specific approach used here does not require states to be specified. Overall, the paper demonstrates the power of the grey-box approach. (c) 2005 Elsevier B.V. All rights reserved.

  19. Current approaches to gene regulatory network modelling

    Directory of Open Access Journals (Sweden)

    Brazma Alvis

    2007-09-01

    Full Text Available Abstract Many different approaches have been developed to model and simulate gene regulatory networks. We proposed the following categories for gene regulatory network models: network parts lists, network topology models, network control logic models, and dynamic models. Here we will describe some examples for each of these categories. We will study the topology of gene regulatory networks in yeast in more detail, comparing a direct network derived from transcription factor binding data and an indirect network derived from genome-wide expression data in mutants. Regarding the network dynamics we briefly describe discrete and continuous approaches to network modelling, then describe a hybrid model called Finite State Linear Model and demonstrate that some simple network dynamics can be simulated in this model.
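As a toy illustration of the discrete approaches mentioned above, a hypothetical three-gene Boolean network (not the Finite State Linear Model itself, whose details the abstract does not give) can be stepped forward in a few lines:

```python
# Hypothetical regulatory logic: each gene is on/off, and its next state
# is a logical function of its regulators.

def step(state):
    a, b, c = state
    return (
        not c,       # gene A is repressed by C
        a,           # gene B is activated by A
        a and b,     # gene C requires both A and B
    )

# Iterate from an initial condition and record the trajectory.
state = (True, False, False)
trajectory = [state]
for _ in range(6):
    state = step(state)
    trajectory.append(state)

for s in trajectory:
    print(tuple(int(g) for g in s))
```

This tiny circuit settles into a periodic orbit (the state returns to its initial value after five steps), the discrete analogue of the oscillatory dynamics that continuous network models capture with differential equations.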

  20. Specificity of continuous auditing approach on information technology internal controls

    Directory of Open Access Journals (Sweden)

    Kaćanski Slobodan

    2012-01-01

    Full Text Available The contemporary business world cannot be imagined without the use of information technology in all aspects of business. The use of information technology in the activities of manufacturing and non-production companies can greatly facilitate and accelerate operations and control. Because of their complexity, however, information systems contain vulnerable areas and leave room for accidental and intentional frauds that can materially affect the business decisions made by a company's management. The implementation of internal controls can greatly reduce the level of errors that contribute to making wrong decisions. To protect the operating system, management implements an internal audit to periodically examine the fundamental quality of the internal control systems. Since internal audit, by its character, only periodically checks the quality of internal control systems and information technologies and then reports to managers, a problem arises: the management structures of the business entity are not informed in a timely manner. To eliminate this problem, management implements a special approach to internal audit, called continuous auditing.

  1. Distributed simulation a model driven engineering approach

    CERN Document Server

    Topçu, Okan; Oğuztüzün, Halit; Yilmaz, Levent

    2016-01-01

    Backed by substantive case studies, the novel approach to software engineering for distributed simulation outlined in this text demonstrates the potent synergies between model-driven techniques, simulation, intelligent agents, and computer systems development.

  2. Service creation: a model-based approach

    NARCIS (Netherlands)

    Quartel, Dick; van Sinderen, Marten J.; Ferreira Pires, Luis

    1999-01-01

    This paper presents a model-based approach to support service creation. In this approach, services are assumed to be created from (available) software components. The creation process may involve multiple design steps in which the requested service is repeatedly decomposed into more detailed

  3. Models of galaxies - The modal approach

    International Nuclear Information System (INIS)

    Lin, C.C.; Lowe, S.A.

    1990-01-01

    The general viability of the modal approach to the spiral structure in normal spirals and the barlike structure in certain barred spirals is discussed. The usefulness of the modal approach in the construction of models of such galaxies is examined, emphasizing the adoption of a model appropriate to observational data for both the spiral structure of a galaxy and its basic mass distribution. 44 refs

  4. Specific and General Human Capital in an Endogenous Growth Model

    OpenAIRE

    Evangelia Vourvachaki; Vahagn Jerbashian; : Sergey Slobodyan

    2014-01-01

    In this article, we define specific (general) human capital in terms of the occupations whose use is spread in a limited (wide) set of industries. We analyze the growth impact of an economy's composition of specific and general human capital, in a model where education and research and development are costly and complementary activities. The model suggests that a declining share of specific human capital, as observed in the Czech Republic, can be associated with a lower rate of long-term grow...

  5. A Proposal for a Flexible Trend Specification in DSGE Models

    Directory of Open Access Journals (Sweden)

    Slanicay Martin

    2016-06-01

In this paper I propose a flexible trend specification for estimating DSGE models on log differences. I demonstrate this flexible trend specification on a New Keynesian DSGE model of two economies, which I consequently estimate on data from the Czech economy and the euro area, using Bayesian techniques. The advantage of the trend specification proposed is that the trend component and the cyclical component are modelled jointly in a single model. The proposed trend specification is flexible in the sense that smoothness of the trend can be easily modified by different calibration of some of the trend parameters. The results suggest that this method is capable of finding a very reasonable trend in the data. Moreover, comparison of forecast performance reveals that the proposed specification offers more reliable forecasts than the original variant of the model.

  6. Modelling subject-specific childhood growth using linear mixed-effect models with cubic regression splines.

    Science.gov (United States)

    Grajeda, Laura M; Ivanescu, Andrada; Saito, Mayuko; Crainiceanu, Ciprian; Jaganath, Devan; Gilman, Robert H; Crabtree, Jean E; Kelleher, Dermott; Cabrera, Lilia; Cama, Vitaliano; Checkley, William

    2016-01-01

    Childhood growth is a cornerstone of pediatric research. Statistical models need to consider individual trajectories to adequately describe growth outcomes. Specifically, well-defined longitudinal models are essential to characterize both population and subject-specific growth. Linear mixed-effect models with cubic regression splines can account for the nonlinearity of growth curves and provide reasonable estimators of population and subject-specific growth, velocity and acceleration. We provide a stepwise approach that builds from simple to complex models, and account for the intrinsic complexity of the data. We start with standard cubic splines regression models and build up to a model that includes subject-specific random intercepts and slopes and residual autocorrelation. We then compared cubic regression splines vis-à-vis linear piecewise splines, and with varying number of knots and positions. Statistical code is provided to ensure reproducibility and improve dissemination of methods. Models are applied to longitudinal height measurements in a cohort of 215 Peruvian children followed from birth until their fourth year of life. Unexplained variability, as measured by the variance of the regression model, was reduced from 7.34 when using ordinary least squares to 0.81 (p linear mixed-effect models with random slopes and a first order continuous autoregressive error term. There was substantial heterogeneity in both the intercept (p modeled with a first order continuous autoregressive error term as evidenced by the variogram of the residuals and by a lack of association among residuals. The final model provides a parametric linear regression equation for both estimation and prediction of population- and individual-level growth in height. We show that cubic regression splines are superior to linear regression splines for the case of a small number of knots in both estimation and prediction with the full linear mixed effect model (AIC 19,352 vs. 19
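The spline machinery described in this abstract can be illustrated with a minimal fixed-effects-only sketch, assuming a truncated-power cubic basis and synthetic growth data (the paper's full model additionally includes subject-specific random intercepts and slopes and an autoregressive error term, omitted here):

```python
import numpy as np

def cubic_spline_basis(x, knots):
    """Truncated-power cubic basis: 1, x, x^2, x^3, plus (x - k)_+^3 per knot."""
    cols = [np.ones_like(x), x, x**2, x**3]
    cols += [np.clip(x - k, 0.0, None) ** 3 for k in knots]
    return np.column_stack(cols)

rng = np.random.default_rng(0)
age = np.sort(rng.uniform(0, 4, 200))                       # years, as in a 0-4 year cohort
height = 50 + 20 * np.log1p(age) + rng.normal(0, 0.5, 200)  # synthetic growth curve, cm

X = cubic_spline_basis(age, knots=[1.0, 2.0, 3.0])          # illustrative knot placement
beta, *_ = np.linalg.lstsq(X, height, rcond=None)
fitted = X @ beta
rmse = float(np.sqrt(np.mean((height - fitted) ** 2)))
print(round(rmse, 2))
```

A complete analysis along the paper's lines would fit the same basis inside a mixed-effects framework (for example `MixedLM` in statsmodels, or `nlme` in R), comparing knot counts and positions as in the stepwise procedure the authors describe.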

  7. A Hybrid Approach to Finding Relevant Social Media Content for Complex Domain Specific Information Needs.

    Science.gov (United States)

    Cameron, Delroy; Sheth, Amit P; Jaykumar, Nishita; Thirunarayan, Krishnaprasad; Anand, Gaurish; Smith, Gary A

    2014-12-01

While contemporary semantic search systems offer to improve classical keyword-based search, they are not always adequate for complex domain specific information needs. The domain of prescription drug abuse, for example, requires knowledge of both ontological concepts and "intelligible constructs" not typically modeled in ontologies. These intelligible constructs convey essential information that includes notions of intensity, frequency, interval, dosage and sentiments, which could be important to the holistic needs of the information seeker. In this paper, we present a hybrid approach to domain specific information retrieval that integrates ontology-driven query interpretation with synonym-based query expansion and domain specific rules, to facilitate search in social media on prescription drug abuse. Our framework is based on a context-free grammar (CFG) that defines the query language of constructs interpretable by the search system. The grammar provides two levels of semantic interpretation: 1) a top-level CFG that facilitates retrieval of diverse textual patterns, which belong to broad templates and 2) a low-level CFG that enables interpretation of specific expressions belonging to such textual patterns. These low-level expressions occur as concepts from four different categories of data: 1) ontological concepts, 2) concepts in lexicons (such as emotions and sentiments), 3) concepts in lexicons with only partial ontology representation, called lexico-ontology concepts (such as side effects and routes of administration (ROA)), and 4) domain specific expressions (such as date, time, interval, frequency and dosage) derived solely through rules. Our approach is embodied in a novel Semantic Web platform called PREDOSE, which provides search support for complex domain specific information needs in prescription drug abuse epidemiology. When applied to a corpus of over 1 million drug abuse-related web forum posts, our search framework proved effective in retrieving
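The two-level interpretation idea can be caricatured in a few lines. The toy patterns below stand in for the low-level grammar of dosage and frequency expressions; the vocabulary, rule names, and drug/unit tokens are illustrative assumptions (regular patterns rather than PREDOSE's actual CFG):

```python
import re

# Toy two-level interpretation, loosely in the spirit of the paper's design:
# top level:   query -> DRUG construct
# low level:   dose  -> NUMBER UNIT ;  freq -> NUMBER "x" "/" PERIOD
# All patterns and the vocabulary are illustrative assumptions.
RULES = {
    "dose": re.compile(r"^(?P<num>\d+)\s*(?P<unit>mg|g|ml)$"),
    "freq": re.compile(r"^(?P<num>\d+)x\s*/\s*(?P<period>day|week)$"),
}

def interpret(text):
    """Split off the drug token, then try each low-level construct pattern."""
    drug, _, rest = text.partition(" ")
    for name, pattern in RULES.items():
        m = pattern.match(rest.strip())
        if m:
            return {"drug": drug, "construct": name, **m.groupdict()}
    return None

print(interpret("loperamide 8 mg"))
print(interpret("loperamide 3x/day"))
```

A real system would of course back the low level with ontology and lexicon lookups rather than hand-written regular expressions.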

  8. Feasibility assessment of a risk-based approach to technical specifications

    International Nuclear Information System (INIS)

    Atefi, B.; Gallagher, D.W.

    1991-05-01

To assess the potential use of risk and reliability techniques for improving the effectiveness of the technical specifications to control plant operational risk, the Technical Specifications Branch of the Nuclear Regulatory Commission initiated an effort to identify and evaluate alternative risk-based approaches that could bring greater risk perspective to these requirements. In the first phase, four alternative approaches were identified and their characteristics were analyzed. Among these, the risk-based approach to technical specifications is the most promising approach for controlling plant operational risk using technical specifications. The second phase of the study concentrated on detailed characteristics of the real time risk-based approach. It is concluded that a real time risk-based approach to technical specifications has the potential to improve both plant safety and availability. 33 refs., 5 figs., 6 tabs

  9. Modeling of requirement specification for safety critical real time computer system using formal mathematical specifications

    International Nuclear Information System (INIS)

    Sankar, Bindu; Sasidhar Rao, B.; Ilango Sambasivam, S.; Swaminathan, P.

    2002-01-01

Real time computer systems are increasingly used for safety critical supervision and control of nuclear reactors. Typical application areas are supervision of reactor core against coolant flow blockage, supervision of clad hot spot, supervision of undesirable power excursion, power control and control logic for fuel handling systems. The most frequent cause of fault in safety critical real time computer system is traced to fuzziness in requirement specification. To ensure the specified safety, it is necessary to model the requirement specification of safety critical real time computer systems using formal mathematical methods. Modeling eliminates the fuzziness in the requirement specification and also helps to prepare the verification and validation schemes. Test data can be easily designed from the model of the requirement specification. Z and B are the popular languages used for modeling the requirement specification. A typical safety critical real time computer system for supervising the reactor core of prototype fast breeder reactor (PFBR) against flow blockage is taken as case study. Modeling techniques and the actual model are explained in detail. The advantages of modeling for ensuring the safety are summarized

  10. Measurement Model Specification Error in LISREL Structural Equation Models.

    Science.gov (United States)

    Baldwin, Beatrice; Lomax, Richard

    This LISREL study examines the robustness of the maximum likelihood estimates under varying degrees of measurement model misspecification. A true model containing five latent variables (two endogenous and three exogenous) and two indicator variables per latent variable was used. Measurement model misspecification considered included errors of…

  11. Multiscale approach to equilibrating model polymer melts

    DEFF Research Database (Denmark)

    Svaneborg, Carsten; Ali Karimi-Varzaneh, Hossein; Hojdis, Nils

    2016-01-01

We present an effective and simple multiscale method for equilibrating Kremer-Grest model polymer melts of varying stiffness. In our approach, we progressively equilibrate the melt structure above the tube scale, inside the tube and finally at the monomeric scale. We make use of models designed

  12. A model-driven approach to information security compliance

    Science.gov (United States)

    Correia, Anacleto; Gonçalves, António; Teodoro, M. Filomena

    2017-06-01

The availability, integrity and confidentiality of information are fundamental to the long-term survival of any organization. Information security is a complex issue that must be approached holistically, combining assets that support corporate systems in an extended network of business partners, vendors, customers and other stakeholders. This paper addresses the conception and implementation of information security systems conforming to the ISO/IEC 27000 set of standards, using the model-driven approach. The process begins with the conception of a domain-level model (computation independent model) based on the information security vocabulary present in the ISO/IEC 27001 standard. From this model, after embedding the mandatory rules for attaining ISO/IEC 27001 conformance, a platform independent model is derived. Finally, a platform specific model serves as the basis for testing the compliance of information security systems with the ISO/IEC 27000 set of standards.

  13. A visual approach for modeling spatiotemporal relations

    NARCIS (Netherlands)

    R.L. Guimarães (Rodrigo); C.S.S. Neto; L.F.G. Soares

    2008-01-01

Textual programming languages have proven to be difficult to learn and to use effectively for many people. For this reason, visual tools can be useful to abstract the complexity of such textual languages, minimizing the specification effort. In this paper we present a visual approach for

  14. Application of various FLD modelling approaches

    Science.gov (United States)

    Banabic, D.; Aretz, H.; Paraianu, L.; Jurco, P.

    2005-07-01

    This paper focuses on a comparison between different modelling approaches to predict the forming limit diagram (FLD) for sheet metal forming under a linear strain path using the recently introduced orthotropic yield criterion BBC2003 (Banabic D et al 2005 Int. J. Plasticity 21 493-512). The FLD models considered here are a finite element based approach, the well known Marciniak-Kuczynski model, the modified maximum force criterion according to Hora et al (1996 Proc. Numisheet'96 Conf. (Dearborn/Michigan) pp 252-6), Swift's diffuse (Swift H W 1952 J. Mech. Phys. Solids 1 1-18) and Hill's classical localized necking approach (Hill R 1952 J. Mech. Phys. Solids 1 19-30). The FLD of an AA5182-O aluminium sheet alloy has been determined experimentally in order to quantify the predictive capabilities of the models mentioned above.

  15. Characterizing economic trends by Bayesian stochastic model specification search

    DEFF Research Database (Denmark)

    Grassi, Stefano; Proietti, Tommaso

    We extend a recently proposed Bayesian model selection technique, known as stochastic model specification search, for characterising the nature of the trend in macroeconomic time series. In particular, we focus on autoregressive models with possibly time-varying intercept and slope and decide on ...

  16. Position-specific isotope modeling of organic micropollutants transformation through different reaction pathways

    International Nuclear Information System (INIS)

    Jin, Biao; Rolle, Massimo

    2016-01-01

    The degradation of organic micropollutants occurs via different reaction pathways. Compound specific isotope analysis is a valuable tool to identify such degradation pathways in different environmental systems. We propose a mechanism-based modeling approach that provides a quantitative framework to simultaneously evaluate concentration as well as bulk and position-specific multi-element isotope evolution during the transformation of organic micropollutants. The model explicitly simulates position-specific isotopologues for those atoms that experience isotope effects and, thereby, provides a mechanistic description of isotope fractionation occurring at different molecular positions. To demonstrate specific features of the modeling approach, we simulated the degradation of three selected organic micropollutants: dichlorobenzamide (BAM), isoproturon (IPU) and diclofenac (DCF). The model accurately reproduces the multi-element isotope data observed in previous experimental studies. Furthermore, it precisely captures the dual element isotope trends characteristic of different reaction pathways as well as their range of variation consistent with observed bulk isotope fractionation. It was also possible to directly validate the model capability to predict the evolution of position-specific isotope ratios with available experimental data. Therefore, the approach is useful both for a mechanism-based evaluation of experimental results and as a tool to explore transformation pathways in scenarios for which position-specific isotope data are not yet available. - Highlights: • Mechanism-based, position-specific isotope modeling of micropollutants degradation. • Simultaneous description of concentration and primary and secondary isotope effects. • Key features of the model are demonstrated with three illustrative examples. • Model as a tool to explore reaction mechanisms and to design experiments. - We propose a modeling approach incorporating mechanistic information and
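The kind of bookkeeping such a model performs can be hinted at with a minimal Rayleigh-type sketch. The enrichment factor, the number of atoms, and the simple dilution of a position-specific effect into the bulk signature are all illustrative assumptions, not the paper's full isotopologue model:

```python
import numpy as np

def rayleigh_delta(delta0, f, eps_permil):
    """Delta value (per mil) of the remaining substrate after a fraction f survives."""
    alpha = 1.0 + eps_permil / 1000.0          # fractionation factor from enrichment eps
    r = (delta0 / 1000.0 + 1.0) * f ** (alpha - 1.0)
    return (r - 1.0) * 1000.0

f = np.linspace(1.0, 0.1, 50)   # remaining fraction of the micropollutant
eps_pos = -30.0                 # assumed enrichment factor at the reactive position (per mil)
n_atoms = 10                    # assumed atoms of the element in the molecule
delta_pos = rayleigh_delta(0.0, f, eps_pos)
delta_bulk = rayleigh_delta(0.0, f, eps_pos / n_atoms)   # position effect diluted in the bulk
print(round(float(delta_pos[-1]), 1), round(float(delta_bulk[-1]), 1))
```

The contrast between `delta_pos` and `delta_bulk` is the point: a strong effect at one molecular position appears only weakly in the compound-average signature, which is why position-specific modeling adds diagnostic power.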

  17. Risk Modelling for Passages in Approach Channel

    Directory of Open Access Journals (Sweden)

    Leszek Smolarek

    2013-01-01

Methods of multivariate statistics, stochastic processes, and simulation are used to identify and assess risk measures. This paper presents the use of generalized linear models and Markov models to study risks to ships along the approach channel. These models, combined with simulation testing, are used to determine the time required for continuous monitoring of endangered objects or the period at which the level of risk should be verified.
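A toy version of the Markov component can be sketched as follows, with a three-state passage model whose transition probabilities are invented for illustration (the paper's models are estimated from traffic data):

```python
import numpy as np

# Illustrative three-state passage model: 0 = normal, 1 = near-miss, 2 = incident.
# The transition probabilities are assumptions for this sketch, not values from the study.
P = np.array([
    [0.95, 0.04, 0.01],
    [0.70, 0.25, 0.05],
    [0.00, 0.00, 1.00],   # incident is absorbing
])

state = np.array([1.0, 0.0, 0.0])   # each passage starts in the normal state
risk_limit = 0.10                   # verify risk once P(incident) exceeds 10%

steps = 0
while state[2] <= risk_limit:
    state = state @ P               # propagate the distribution one passage forward
    steps += 1
print(steps, round(float(state[2]), 3))
```

The loop answers exactly the kind of question the abstract mentions: after how many passages does the accumulated incident probability cross a threshold at which the risk level should be re-verified.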

  18. Genetic Approaches to Study Meiosis and Meiosis-Specific Gene Expression in Saccharomyces cerevisiae.

    Science.gov (United States)

    Kassir, Yona; Stuart, David T

    2017-01-01

The budding yeast Saccharomyces cerevisiae has a long history as a model organism for studies of meiosis and the cell cycle. The popularity of this yeast as a model is in large part due to the variety of genetic and cytological approaches that can be effectively performed with the cells. Cultures of the cells can be induced to synchronously progress through meiosis and sporulation allowing large-scale gene expression and biochemical studies to be performed. Additionally, the spore tetrads resulting from meiosis make it possible to characterize the haploid products of meiosis allowing investigation of meiotic recombination and chromosome segregation. Here we describe genetic methods for analyzing the progression of S. cerevisiae through meiosis and sporulation, with an emphasis on strategies for the genetic analysis of regulators of meiosis-specific genes.

  19. Hybrid parallel execution model for logic-based specification languages

    CERN Document Server

    Tsai, Jeffrey J P

    2001-01-01

Parallel processing is a very important technique for improving the performance of various software development and maintenance activities. The purpose of this book is to introduce important techniques for parallel execution of high-level specifications of software systems. These techniques are very useful for the construction, analysis, and transformation of reliable large-scale and complex software systems. Contents: Current Approaches; Overview of the New Approach; FRORL Requirements Specification Language and Its Decomposition; Rewriting and Data Dependency, Control Flow Analysis of a Lo

  20. Feasibility assessment of a risk-based approach to technical specifications

    International Nuclear Information System (INIS)

    Atefi, B.; Gallagher, D.W.

    1991-05-01

    The first phase of the assessment concentrates on (1) identification of selected risk-based approaches for improving current technical specifications, (2) appraisal of characteristics of each approach, including advantages and disadvantages, and (3) recommendation of one or more approaches that might result in improving current technical specification requirements. The second phase of the work concentrates on assessment of the feasibility of implementation of a pilot program to study detailed characteristics of the preferred approach. The real time risk-based approach was identified as the preferred approach to technical specifications for controlling plant operational risk. There do not appear to be any technical or institutional obstacles to prevent initiation of a pilot program to assess the characteristics and effectiveness of such an approach. 2 tabs

  1. A Framework for the Specification of Acquisition Models

    National Research Council Canada - National Science Library

    Meyers, B

    2001-01-01

.... The timing properties associated with the items receive special treatment. The value of a framework is that one can develop specifications of various acquisition models, such as waterfall, spiral, or incremental, as instances of that framework...

  2. Context-Specific Metabolic Model Extraction Based on Regularized Least Squares Optimization.

    Directory of Open Access Journals (Sweden)

    Semidán Robaina Estévez

Genome-scale metabolic models have proven highly valuable in investigating cell physiology. Recent advances include the development of methods to extract context-specific models capable of describing metabolism under more specific scenarios (e.g., cell types. Yet, none of the existing computational approaches allows for a fully automated model extraction and determination of a flux distribution independent of user-defined parameters. Here we present RegrEx, a fully automated approach that relies solely on context-specific data and ℓ1-norm regularization to extract a context-specific model and to provide a flux distribution that maximizes its correlation to data. Moreover, the publicly available implementation of RegrEx was used to extract 11 context-specific human models using publicly available RNAseq expression profiles, Recon1 and also Recon2, the most recent human metabolic model. The comparison of the performance of RegrEx and its contending alternatives demonstrates that the proposed method extracts models for which both the structure, i.e., reactions included, and the flux distributions are in concordance with the employed data. These findings are supported by validation and comparison of method performance on additional data not used in context-specific model extraction. Therefore, our study sets the ground for applications of other regularization techniques in large-scale metabolic modeling.
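The core optimization idea, least squares coupled with ℓ1-norm regularization, can be sketched independently of the metabolic setting. The design matrix and sparse "flux" vector below are synthetic stand-ins, and iterative soft thresholding (ISTA) is used here as a generic solver, not necessarily RegrEx's implementation:

```python
import numpy as np

def soft_threshold(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def l1_least_squares(A, d, lam, iters=2000):
    """Minimize 0.5*||A v - d||^2 + lam*||v||_1 by iterative soft thresholding (ISTA)."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L, with L the gradient's Lipschitz constant
    v = np.zeros(A.shape[1])
    for _ in range(iters):
        v = soft_threshold(v - step * (A.T @ (A @ v - d)), lam * step)
    return v

rng = np.random.default_rng(7)
A = rng.standard_normal((100, 30))           # stand-in for context-specific measurements
v_true = np.zeros(30)
v_true[:4] = [1.5, 1.0, 2.0, 0.8]            # only a few "reactions" carry flux
d = A @ v_true + 0.01 * rng.standard_normal(100)

v_hat = l1_least_squares(A, d, lam=0.1)
active = int(np.sum(np.abs(v_hat) > 0.1))    # reactions kept in the extracted model
err = float(np.max(np.abs(v_hat - v_true)))
print(active, round(err, 3))
```

The ℓ1 penalty is what drives most coefficients to exactly zero, which is the mechanism by which a regularized fit can simultaneously select a model structure (which reactions stay) and a flux distribution consistent with the data.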

  3. Mathematical Modeling Approaches in Plant Metabolomics.

    Science.gov (United States)

    Fürtauer, Lisa; Weiszmann, Jakob; Weckwerth, Wolfram; Nägele, Thomas

    2018-01-01

    The experimental analysis of a plant metabolome typically results in a comprehensive and multidimensional data set. To interpret metabolomics data in the context of biochemical regulation and environmental fluctuation, various approaches of mathematical modeling have been developed and have proven useful. In this chapter, a general introduction to mathematical modeling is presented and discussed in context of plant metabolism. A particular focus is laid on the suitability of mathematical approaches to functionally integrate plant metabolomics data in a metabolic network and combine it with other biochemical or physiological parameters.

  4. Results from Development of Model Specifications for Multifamily Energy Retrofits

    Energy Technology Data Exchange (ETDEWEB)

    Brozyna, K.

    2012-08-01

Specifications, modeled after CSI MasterFormat, provide trade contractors and builders with requirements and recommendations on specific building materials, components and industry practices that comply with the expectations and intent of the requirements within the various funding programs associated with a project. The goal is to create a greater level of consistency in the execution of energy efficiency retrofit measures across the multiple regions in which a developer may work. IBACOS and Mercy Housing developed sample model specifications based on a common building construction type that Mercy Housing encounters.

  5. Results From Development of Model Specifications for Multifamily Energy Retrofits

    Energy Technology Data Exchange (ETDEWEB)

    Brozyna, Kevin [IBACOS, Inc., Pittsburgh, PA (United States)

    2012-08-01

Specifications, modeled after CSI MasterFormat, provide trade contractors and builders with requirements and recommendations on specific building materials, components and industry practices that comply with the expectations and intent of the requirements within the various funding programs associated with a project. The goal is to create a greater level of consistency in the execution of energy efficiency retrofit measures across the multiple regions in which a developer may work. IBACOS and Mercy Housing developed sample model specifications based on a common building construction type that Mercy Housing encounters.

  6. Stochastic approaches to inflation model building

    International Nuclear Information System (INIS)

    Ramirez, Erandy; Liddle, Andrew R.

    2005-01-01

    While inflation gives an appealing explanation of observed cosmological data, there are a wide range of different inflation models, providing differing predictions for the initial perturbations. Typically models are motivated either by fundamental physics considerations or by simplicity. An alternative is to generate large numbers of models via a random generation process, such as the flow equations approach. The flow equations approach is known to predict a definite structure to the observational predictions. In this paper, we first demonstrate a more efficient implementation of the flow equations exploiting an analytic solution found by Liddle (2003). We then consider alternative stochastic methods of generating large numbers of inflation models, with the aim of testing whether the structures generated by the flow equations are robust. We find that while typically there remains some concentration of points in the observable plane under the different methods, there is significant variation in the predictions amongst the methods considered

  7. Model validation: a systemic and systematic approach

    International Nuclear Information System (INIS)

    Sheng, G.; Elzas, M.S.; Cronhjort, B.T.

    1993-01-01

The term 'validation' is used ubiquitously in association with the modelling activities of numerous disciplines, including the social, political, natural, and physical sciences, and engineering. There is, however, a wide range of definitions, which gives rise to very different interpretations of what activities the process involves. Analyses of results from the present large international effort in modelling radioactive waste disposal systems illustrate the urgent need to develop a common approach to model validation. Some possible explanations are offered to account for the present state of affairs. The methodology developed treats model validation and code verification in a systematic fashion. In fact, this approach may be regarded as a comprehensive framework for assessing the adequacy of any simulation study. (author)

  8. A comprehensive approach to age-dependent dosimetric modeling

    International Nuclear Information System (INIS)

    Leggett, R.W.; Cristy, M.; Eckerman, K.F.

    1986-01-01

In the absence of age-specific biokinetic models, current retention models of the International Commission on Radiological Protection (ICRP) frequently are used as a point of departure for evaluation of exposures to the general population. These models were designed and intended for estimation of long-term integrated doses to the adult worker. Their format and empirical basis preclude incorporation of much valuable physiological information and physiologically reasonable assumptions that could be used in characterizing the age-specific behavior of radioelements in humans. In this paper we discuss a comprehensive approach to age-dependent dosimetric modeling in which consideration is given not only to changes with age in masses and relative geometries of body organs and tissues but also to best available physiological and radiobiological information relating to the age-specific biobehavior of radionuclides. This approach is useful in obtaining more accurate estimates of long-term dose commitments as a function of age at intake, but it may be particularly valuable in establishing more accurate estimates of dose rate as a function of age. Age-specific dose rates are needed for a proper analysis of the potential effects on estimates of risk of elevated dose rates per unit intake in certain stages of life, elevated response per unit dose received during some stages of life, and age-specific non-radiogenic competing risks

  9. A comprehensive approach to age-dependent dosimetric modeling

    International Nuclear Information System (INIS)

    Leggett, R.W.; Cristy, M.; Eckerman, K.F.

    1987-01-01

In the absence of age-specific biokinetic models, current retention models of the International Commission on Radiological Protection (ICRP) frequently are used as a point of departure for evaluation of exposures to the general population. These models were designed and intended for estimation of long-term integrated doses to the adult worker. Their format and empirical basis preclude incorporation of much valuable physiological information and physiologically reasonable assumptions that could be used in characterizing the age-specific behavior of radioelements in humans. In this paper a comprehensive approach to age-dependent dosimetric modeling is discussed in which consideration is given not only to changes with age in masses and relative geometries of body organs and tissues but also to best available physiological and radiobiological information relating to the age-specific biobehavior of radionuclides. This approach is useful in obtaining more accurate estimates of long-term dose commitments as a function of age at intake, but it may be particularly valuable in establishing more accurate estimates of dose rate as a function of age. Age-specific dose rates are needed for a proper analysis of the potential effects on estimates of risk of elevated dose rates per unit intake in certain stages of life, elevated response per unit dose received during some stages of life, and age-specific non-radiogenic competing risks. 16 refs.; 3 figs.; 1 table

  10. An object-oriented approach to energy-economic modeling

    Energy Technology Data Exchange (ETDEWEB)

    Wise, M.A.; Fox, J.A.; Sands, R.D.

    1993-12-01

In this paper, the authors discuss their experience in creating an object-oriented economic model of the U.S. energy and agriculture markets. After a discussion of some central concepts, they provide an overview of the model, focusing on the methodology of designing an object-oriented class hierarchy specification based on standard microeconomic production functions. The evolution of the model from the class definition stage to its implementation in C++, a standard object-oriented programming language, is detailed. The authors then discuss the main differences between writing the object-oriented program and a procedure-oriented program of the same model. Finally, they conclude with a discussion of the advantages and limitations of the object-oriented approach, based on their experience building energy-economic models with procedure-oriented approaches and languages.

  11. Variational approach to chiral quark models

    Energy Technology Data Exchange (ETDEWEB)

    Futami, Yasuhiko; Odajima, Yasuhiko; Suzuki, Akira

    1987-03-01

    A variational approach is applied to a chiral quark model to test the validity of the perturbative treatment of the pion-quark interaction based on the chiral symmetry principle. It is indispensably related to the chiral symmetry breaking radius if the pion-quark interaction can be regarded as a perturbation.

  12. A variational approach to chiral quark models

    International Nuclear Information System (INIS)

    Futami, Yasuhiko; Odajima, Yasuhiko; Suzuki, Akira.

    1987-01-01

    A variational approach is applied to a chiral quark model to test the validity of the perturbative treatment of the pion-quark interaction based on the chiral symmetry principle. It is indispensably related to the chiral symmetry breaking radius if the pion-quark interaction can be regarded as a perturbation. (author)

  13. Approaches for Establishing Clinically Relevant Dissolution Specifications for Immediate Release Solid Oral Dosage Forms.

    Science.gov (United States)

    Hermans, Andre; Abend, Andreas M; Kesisoglou, Filippos; Flanagan, Talia; Cohen, Michael J; Diaz, Dorys A; Mao, Y; Zhang, Limin; Webster, Gregory K; Lin, Yiqing; Hahn, David A; Coutant, Carrie A; Grady, Haiyan

    2017-11-01

    This manuscript represents the perspective of the Dissolution Analytical Working Group of the IQ Consortium. The intent of this manuscript is to highlight the challenges of, and to provide a recommendation on, the development of clinically relevant dissolution specifications (CRS) for immediate release (IR) solid oral dosage forms. A roadmap toward the development of CRS for IR products containing active ingredients with a non-narrow therapeutic window is discussed, within the context of mechanistic dissolution understanding, supported by in-human pharmacokinetic (PK) data. Two case studies present potential outcomes of following the CRS roadmap and setting dissolution specifications. These cases reveal some benefits and challenges of pursuing CRS with additional PK data, in light of current regulatory positions, including that of the US Food and Drug Administration (FDA), who generally favor this approach, but with the understanding that both industry and regulatory agency perspectives are still evolving in this relatively new field. The CRS roadmap discussed in this manuscript also describes a way to develop clinically relevant dissolution specifications based primarily on dissolution data for batches used in pivotal clinical studies, acknowledging that not all IR product development efforts need to be supported by additional PK studies, albeit with the associated risk of potentially unnecessarily tight manufacturing controls. Recommendations are provided on what stages during the life cycle investment into in vivo studies may be valuable. Finally, the opportunities for CRS within the context of post-approval changes, Modeling and Simulation (M&S), and the application of biowaivers, are briefly discussed.

  14. An algebraic approach to modeling in software engineering

    International Nuclear Information System (INIS)

    Loegel, C.J.; Ravishankar, C.V.

    1993-09-01

Our work couples the formalism of universal algebras with the engineering techniques of mathematical modeling to develop a new approach to the software engineering process. Our purpose in using this combination is twofold. First, abstract data types and their specification using universal algebras can be considered a common point between the practical requirements of software engineering and the formal specification of software systems. Second, mathematical modeling principles provide us with a means for effectively analyzing real-world systems. We first use modeling techniques to analyze a system and then represent the analysis using universal algebras. The rest of the software engineering process exploits properties of universal algebras that preserve the structure of our original model. This paper describes our software engineering process and our experience using it on both research and commercial systems. We need a new approach because current software engineering practices often deliver software that is difficult to develop and maintain. Formal software engineering approaches use universal algebras to describe ''computer science'' objects like abstract data types, but in practice software errors often arise because ''real-world'' objects are improperly modeled. There is a large semantic gap between the customer's objects and abstract data types. In contrast, mathematical modeling uses engineering techniques to construct valid models for real-world systems, but these models are often implemented in an ad hoc manner. A combination of the best features of both approaches would enable software engineering to formally specify and develop software systems that better model real systems. Software engineering, like mathematical modeling, should concern itself first and foremost with understanding a real system and its behavior under given circumstances, and then with expressing this knowledge in an executable form.

  15. A Set Theoretical Approach to Maturity Models

    DEFF Research Database (Denmark)

    Lasrado, Lester; Vatrapu, Ravi; Andersen, Kim Normann

    2016-01-01

Maturity Model research in IS has been criticized for the lack of theoretical grounding, methodological rigor, empirical validations, and ignorance of multiple and non-linear paths to maturity. To address these criticisms, this paper proposes a novel set-theoretical approach to maturity models characterized by equifinality, multiple conjunctural causation, and case diversity. We prescribe methodological guidelines consisting of a six-step procedure to systematically apply set theoretic methods to conceptualize, develop, and empirically derive maturity models and provide a demonstration...

  16. Teaching Sustainability Using an Active Learning Constructivist Approach: Discipline-Specific Case Studies in Higher Education

    Directory of Open Access Journals (Sweden)

    Maria Kalamas Hedden

    2017-07-01

Full Text Available In this paper we present our rationale for using an active learning constructivist approach to teach sustainability-related topics in higher education. To push the boundaries of ecological literacy, we also develop a theoretical model for sustainability knowledge co-creation. Drawing on the experiences of faculty at a major Southeastern University in the United States, we present case studies in architecture, engineering, geography, and marketing. Four Sustainability Faculty Fellows describe their discipline-specific case studies, all of which are project-based learning experiences, and include details regarding teaching and assessment. Easily replicated in other educational contexts, these case studies contribute to the advancement of sustainability education.

  17. A hybrid modeling approach for option pricing

    Science.gov (United States)

    Hajizadeh, Ehsan; Seifi, Abbas

    2011-11-01

The complexity of option pricing has led many researchers to develop sophisticated models for such purposes. The commonly used Black-Scholes model suffers from a number of limitations, one of which is the controversial assumption that the underlying probability distribution is lognormal. We propose a pair of hybrid models to reduce these limitations and enhance option pricing ability. The key input to an option pricing model is volatility. In this paper, we use three popular GARCH-type models for estimating volatility. We then develop two non-parametric models, based on neural networks and neuro-fuzzy networks, to price call options on the S&P 500 index. We compare the results with those of the Black-Scholes model and show that both the neural network and neuro-fuzzy network models outperform it. Furthermore, comparing the two approaches, we observe that the neural network model performs better for at-the-money options, while the neuro-fuzzy model provides better results for both in-the-money and out-of-the-money options.
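The volatility input described above comes from GARCH-type models. As a minimal sketch (the three specific GARCH variants and fitted parameters used in the paper are not given here; the parameter values below are illustrative), the GARCH(1,1) conditional-variance recursion can be filtered over a return series like this:

```python
import numpy as np

def garch11_volatility(returns, omega=1e-6, alpha=0.08, beta=0.9):
    """Filter returns through a GARCH(1,1) variance recursion:
    sigma2[t] = omega + alpha * r[t-1]**2 + beta * sigma2[t-1]."""
    sigma2 = np.empty_like(returns)
    sigma2[0] = np.var(returns)          # initialize at the sample variance
    for t in range(1, len(returns)):
        sigma2[t] = omega + alpha * returns[t - 1] ** 2 + beta * sigma2[t - 1]
    return np.sqrt(sigma2)               # conditional volatility path

rng = np.random.default_rng(0)
r = 0.01 * rng.standard_normal(500)      # synthetic daily returns
vol = garch11_volatility(r)
```

In a hybrid setup of the kind described, this volatility path would be one of the inputs (alongside strike, maturity, etc.) fed to the neural or neuro-fuzzy pricing network.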

  18. Heat transfer modeling an inductive approach

    CERN Document Server

    Sidebotham, George

    2015-01-01

    This innovative text emphasizes a "less-is-more" approach to modeling complicated systems such as heat transfer by treating them first as "1-node lumped models" that yield simple closed-form solutions. The author develops numerical techniques for students to obtain more detail, but also trains them to use the techniques only when simpler approaches fail. Covering all essential methods offered in traditional texts, but with a different order, Professor Sidebotham stresses inductive thinking and problem solving as well as a constructive understanding of modern, computer-based practice. Readers learn to develop their own code in the context of the material, rather than just how to use packaged software, offering a deeper, intrinsic grasp behind models of heat transfer. Developed from over twenty-five years of lecture notes to teach students of mechanical and chemical engineering at The Cooper Union for the Advancement of Science and Art, the book is ideal for students and practitioners across engineering discipl...

  19. Nonperturbative approach to the attractive Hubbard model

    International Nuclear Information System (INIS)

    Allen, S.; Tremblay, A.-M. S.

    2001-01-01

A nonperturbative approach to the single-band attractive Hubbard model is presented in the general context of functional-derivative approaches to many-body theories. As in previous work on the repulsive model, the first step is based on a local-field-type ansatz, on enforcement of the Pauli principle, and on a number of crucial sum rules. The Mermin-Wagner theorem in two dimensions is automatically satisfied. At this level, two-particle self-consistency has been achieved. In the second step of the approximation, an improved expression for the self-energy is obtained by using the results of the first step in an exact expression for the self-energy, where the high- and low-frequency behaviors appear separately. The result is a cooperon-like formula. The required vertex corrections are included in this self-energy expression, as required by the absence of a Migdal theorem for this problem. Other approaches to the attractive Hubbard model are critically compared. Physical consequences of the present approach and agreement with Monte Carlo simulations are demonstrated in the accompanying paper (following this one).

  20. Analysis specifications for the CC3 geosphere model GEONET

    International Nuclear Information System (INIS)

    Melnyk, T.W.

    1995-04-01

AECL is assessing a concept for disposing of Canada's nuclear fuel waste in a sealed vault deep in plutonic rock of the Canadian Shield. A computer program has been developed as an analytical tool for the postclosure assessment. For the case study, a system model, CC3 (Canadian Concept, generation 3), has been developed to describe a hypothetical disposal system. This system model includes separate models for the engineered barriers within the disposal vault, the geosphere in which the vault is emplaced, and the biosphere in the vicinity of any discharge zones. The system model is embedded within a computer code, SYVAC3 (SYstems Variability Analysis Code, generation 3), which takes parameter uncertainty into account by repeated simulation of the system. GEONET (GEOsphere NETwork) is the geosphere model component of this system model. It simulates contaminant transport from the vault to the biosphere along a transport network composed of one-dimensional transport segments that are connected together in three-dimensional space. This document is a set of specifications for GEONET that were developed over a number of years. Improvements to the code will be based on revisions to these specifications. The specifications consist of a model synopsis describing all the relevant equations and assumptions used in the model, a set of formal data flow diagrams and minispecifications, and a data dictionary. (author). 26 refs., 20 figs

  1. Analysis specifications for the CC3 biosphere model BIOTRAC

    International Nuclear Information System (INIS)

    Szekely, J.G.; Wojciechowski, L.C.; Stephens, M.E.; Halliday, H.A.

    1994-12-01

    AECL Research is assessing a concept for disposing of Canada's nuclear fuel waste in a vault deep in plutonic rock of the Canadian Shield. A computer program called the Systems Variability Analysis Code (SYVAC) has been developed as an analytical tool for the postclosure (long-term) assessment of the concept. SYVAC3, the third generation of the code, is an executive program that directs repeated simulation of the disposal system to take into account parameter variation. For the postclosure assessment, the system model, CC3 (Canadian Concept, generation 3), was developed to describe a hypothetical disposal system that includes a disposal vault, the local geosphere and the biosphere in the vicinity of any discharge zones. BIOTRAC (BIOsphere TRansport And Consequences) is the biosphere model in the CC3 system model. The specifications for BIOTRAC, which were developed over a period of seven years, were subjected to numerous walkthrough examinations by the Biosphere Model Working Group to ensure that the intent of the model developers would be correctly specified for transformation into FORTRAN code. The FORTRAN version of BIOTRAC was written from interim versions of these specifications. Improvements to the code are based on revised versions of these specifications. The specifications consist of a data dictionary; sets of synopses, data flow diagrams and mini specs for the component models of BIOTRAC (surface water, soil, atmosphere, and food chain and dose); and supporting calculations (interface to the geosphere, consequences, and mass balance). (author). 20 refs., tabs., figs

  2. Quasirelativistic quark model in quasipotential approach

    CERN Document Server

    Matveev, V A; Savrin, V I; Sissakian, A N

    2002-01-01

The interaction of relativistic particles is described within the framework of the quasipotential approach. The presentation is based on the so-called covariant simultaneous formulation of quantum field theory, whereby the theory is considered on a space-like three-dimensional hypersurface in Minkowski space. Special attention is paid to methods of constructing various quasipotentials as well as to applications of the quasipotential approach to describing the characteristics of relativistic particle interactions in quark models, namely: elastic hadron scattering amplitudes, mass spectra and widths of meson decays, and cross sections of deep inelastic lepton scattering on hadrons.

  3. A multiscale modeling approach for biomolecular systems

    Energy Technology Data Exchange (ETDEWEB)

    Bowling, Alan, E-mail: bowling@uta.edu; Haghshenas-Jaryani, Mahdi, E-mail: mahdi.haghshenasjaryani@mavs.uta.edu [The University of Texas at Arlington, Department of Mechanical and Aerospace Engineering (United States)

    2015-04-15

This paper presents a new multiscale molecular dynamic model for investigating the effects of external interactions, such as contact and impact, during stepping and docking of motor proteins and other biomolecular systems. The model retains the mass properties ensuring that the result satisfies Newton’s second law. This idea is presented using a simple particle model to facilitate discussion of the rigid body model; however, the particle model does provide insights into particle dynamics at the nanoscale. The resulting three-dimensional model predicts a significant decrease in the effect of the random forces associated with Brownian motion. This conclusion runs contrary to the widely accepted notion that the motor protein’s movements are primarily the result of thermal effects. This work focuses on the mechanical aspects of protein locomotion; the effect of ATP hydrolysis is estimated as internal forces acting on the mechanical model. In addition, the proposed model can be numerically integrated in a reasonable amount of time. Herein, the differences between the motion predicted by the old and new modeling approaches are compared using a simplified model of myosin V.

  4. Domain Specific Language for Modeling Waste Management Systems

    DEFF Research Database (Denmark)

    Zarrin, Bahram

In order to develop sustainable waste management systems from a life cycle perspective, scientists and domain experts in environmental science require readily applicable tools for modeling and evaluating the life cycle impacts of waste management systems. Practice has proved... environmental technologies, i.e. solid waste management systems. Flow-based programming is used to support concurrent execution of the processes, and provides a model-integration language for composing processes from homogeneous or heterogeneous domains. A domain-specific language is used to define atomic... a domain-specific language for modeling of waste-management systems on the basis of our framework. We evaluate the language by providing a set of case studies. The contributions of this thesis are: addressing separation of concerns in Flow-based programming and providing the formal specification of its...

  5. Software sensors based on the grey-box modelling approach

    DEFF Research Database (Denmark)

    Carstensen, J.; Harremoës, P.; Strube, Rune

    1996-01-01

In recent years the grey-box modelling approach has been applied to wastewater transportation and treatment. Grey-box models are characterized by the combination of deterministic and stochastic terms to form a model where all the parameters are statistically identifiable from the on-line measurements. With respect to the development of software sensors, the grey-box models possess two important features. Firstly, the on-line measurements can be filtered according to the grey-box model in order to remove noise deriving from the measuring equipment and controlling devices. Secondly, the grey-box model for the specific dynamics is identified. Similarly, an on-line software sensor for detecting the occurrence of backwater phenomena can be developed by comparing the dynamics of a flow measurement with a nearby level measurement. For treatment plants it is found that grey-box models applied to on...

  6. A Bayesian alternative for multi-objective ecohydrological model specification

    Science.gov (United States)

    Tang, Yating; Marshall, Lucy; Sharma, Ashish; Ajami, Hoori

    2018-01-01

    Recent studies have identified the importance of vegetation processes in terrestrial hydrologic systems. Process-based ecohydrological models combine hydrological, physical, biochemical and ecological processes of the catchments, and as such are generally more complex and parametric than conceptual hydrological models. Thus, appropriate calibration objectives and model uncertainty analysis are essential for ecohydrological modeling. In recent years, Bayesian inference has become one of the most popular tools for quantifying the uncertainties in hydrological modeling with the development of Markov chain Monte Carlo (MCMC) techniques. The Bayesian approach offers an appealing alternative to traditional multi-objective hydrologic model calibrations by defining proper prior distributions that can be considered analogous to the ad-hoc weighting often prescribed in multi-objective calibration. Our study aims to develop appropriate prior distributions and likelihood functions that minimize the model uncertainties and bias within a Bayesian ecohydrological modeling framework based on a traditional Pareto-based model calibration technique. In our study, a Pareto-based multi-objective optimization and a formal Bayesian framework are implemented in a conceptual ecohydrological model that combines a hydrological model (HYMOD) and a modified Bucket Grassland Model (BGM). Simulations focused on one objective (streamflow/LAI) and multiple objectives (streamflow and LAI) with different emphasis defined via the prior distribution of the model error parameters. Results show more reliable outputs for both predicted streamflow and LAI using Bayesian multi-objective calibration with specified prior distributions for error parameters based on results from the Pareto front in the ecohydrological modeling. The methodology implemented here provides insight into the usefulness of multiobjective Bayesian calibration for ecohydrologic systems and the importance of appropriate prior

  7. A new approach for developing adjoint models

    Science.gov (United States)

    Farrell, P. E.; Funke, S. W.

    2011-12-01

Many data assimilation algorithms rely on the availability of gradients of misfit functionals, which can be efficiently computed with adjoint models. However, the development of an adjoint model for a complex geophysical code is generally very difficult. Algorithmic differentiation (AD, also called automatic differentiation) offers one strategy for simplifying this task: it takes the abstraction that a model is a sequence of primitive instructions, each of which may be differentiated in turn. While extremely successful, this low-level abstraction runs into time-consuming difficulties when applied to the whole codebase of a model, such as differentiating through linear solves, model I/O, calls to external libraries, language features that are unsupported by the AD tool, and the use of multiple programming languages. While these difficulties can be overcome, it requires a large amount of technical expertise and an intimate familiarity with both the AD tool and the model. An alternative to applying the AD tool to the whole codebase is to assemble the discrete adjoint equations and use these to compute the necessary gradients. With this approach, the AD tool must be applied to the nonlinear assembly operators, which are typically small, self-contained units of the codebase. The disadvantage of this approach is that the assembly of the discrete adjoint equations is still very difficult to perform correctly, especially for complex multiphysics models that perform temporal integration; as it stands, this approach is as difficult and time-consuming as applying AD to the whole model. In this work, we have developed a library which greatly simplifies and automates the alternate approach of assembling the discrete adjoint equations. We propose a complementary, higher-level abstraction to that of AD: that a model is a sequence of linear solves. The developer annotates model source code with library calls that build a 'tape' of the operators involved and their dependencies, and...
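The abstraction proposed above, that a model is a sequence of linear solves, can be illustrated on a single solve: for a functional J = c^T x subject to A x = b, one adjoint solve with A^T yields the full gradient dJ/db. A minimal numpy sketch (the matrix, right-hand side, and weights are arbitrary illustrations, not part of the library described):

```python
import numpy as np

A = np.array([[4.0, 1.0], [1.0, 3.0]])   # forward operator
b = np.array([1.0, 2.0])                 # right-hand side (the "control")
c = np.array([1.0, -1.0])                # functional weights, J(b) = c @ x(b)

x = np.linalg.solve(A, b)                # forward solve: A x = b
lam = np.linalg.solve(A.T, c)            # adjoint solve: A^T lam = c
grad = lam                               # dJ/db = lam; no per-entry perturbation needed

# check against a finite-difference approximation of dJ/db
eps = 1e-6
fd = np.array([
    (c @ np.linalg.solve(A, b + eps * e) - c @ x) / eps
    for e in np.eye(2)
])
```

The point of the higher-level abstraction is that only the operators (A and its transpose) need to be known; the gradient with respect to any input then follows from solves already expressible in the model's own terms.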

  8. Eutrophication Modeling Using Variable Chlorophyll Approach

    International Nuclear Information System (INIS)

    Abdolabadi, H.; Sarang, A.; Ardestani, M.; Mahjoobi, E.

    2016-01-01

In this study, eutrophication was investigated in Lake Ontario to identify the interactions among effective drivers. The complexity of the phenomenon was modeled using a system dynamics approach based on a consideration of constant and variable stoichiometric ratios. The system dynamics approach is a powerful tool for developing object-oriented models to simulate complex phenomena that involve feedback effects. Utilizing stoichiometric ratios is a method for converting the concentrations of state variables. During the physical segmentation of the model, Lake Ontario was divided into two layers, i.e., the epilimnion and hypolimnion, and differential equations were developed for each layer. The model structure included 16 state variables related to phytoplankton, herbivorous zooplankton, carnivorous zooplankton, ammonium, nitrate, dissolved phosphorus, and particulate and dissolved carbon in the epilimnion and hypolimnion during a time horizon of one year. Several verification tests indicated that the model performs well: a Nash-Sutcliffe coefficient close to 1 (0.98), a data correlation coefficient of 0.98, and low standard errors (0.96). The results revealed that there were significant differences in the concentrations of the state variables in constant and variable stoichiometry simulations. Consequently, the consideration of variable stoichiometric ratios in algae and nutrient concentration simulations may be applied in future modeling studies to enhance the accuracy of the results and reduce the likelihood of inefficient control policies.
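The layered mass-balance structure described above can be sketched as a small system of coupled ODEs stepped with forward Euler; the three state variables and rate constants below are illustrative stand-ins for the 16-variable Lake Ontario model:

```python
def step(state, dt=0.01):
    """One Euler step for a toy two-layer lake: phytoplankton P grows on
    dissolved nutrient N in the epilimnion; settling moves mass into the
    hypolimnion pool H, which slowly remineralizes back (illustrative rates)."""
    P, N, H = state
    growth = 0.5 * P * N / (N + 0.02)    # Michaelis-Menten nutrient uptake
    settling = 0.1 * P                   # loss of phytoplankton to the hypolimnion
    remin = 0.02 * H                     # remineralization flux back upward
    dP = growth - settling
    dN = -growth + remin
    dH = settling - remin                # the three derivatives sum to zero
    return (P + dt * dP, N + dt * dN, H + dt * dH)

state = (0.1, 1.0, 0.0)                  # initial concentrations (mg/L, illustrative)
for _ in range(5000):                    # 50 time units
    state = step(state)
```

Because the fluxes appear with opposite signs in paired equations, total mass is conserved at every step, which is the kind of mass-balance check the full model's verification tests rely on.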

  9. Benchmarking novel approaches for modelling species range dynamics.

    Science.gov (United States)

    Zurell, Damaris; Thuiller, Wilfried; Pagel, Jörn; Cabral, Juliano S; Münkemüller, Tamara; Gravel, Dominique; Dullinger, Stefan; Normand, Signe; Schiffers, Katja H; Moore, Kara A; Zimmermann, Niklaus E

    2016-08-01

    Increasing biodiversity loss due to climate change is one of the most vital challenges of the 21st century. To anticipate and mitigate biodiversity loss, models are needed that reliably project species' range dynamics and extinction risks. Recently, several new approaches to model range dynamics have been developed to supplement correlative species distribution models (SDMs), but applications clearly lag behind model development. Indeed, no comparative analysis has been performed to evaluate their performance. Here, we build on process-based, simulated data for benchmarking five range (dynamic) models of varying complexity including classical SDMs, SDMs coupled with simple dispersal or more complex population dynamic models (SDM hybrids), and a hierarchical Bayesian process-based dynamic range model (DRM). We specifically test the effects of demographic and community processes on model predictive performance. Under current climate, DRMs performed best, although only marginally. Under climate change, predictive performance varied considerably, with no clear winners. Yet, all range dynamic models improved predictions under climate change substantially compared to purely correlative SDMs, and the population dynamic models also predicted reasonable extinction risks for most scenarios. When benchmarking data were simulated with more complex demographic and community processes, simple SDM hybrids including only dispersal often proved most reliable. Finally, we found that structural decisions during model building can have great impact on model accuracy, but prior system knowledge on important processes can reduce these uncertainties considerably. Our results reassure the clear merit in using dynamic approaches for modelling species' response to climate change but also emphasize several needs for further model and data improvement. We propose and discuss perspectives for improving range projections through combination of multiple models and for making these approaches

  10. Accounting for standard errors of vision-specific latent trait in regression models.

    Science.gov (United States)

    Wong, Wan Ling; Li, Xiang; Li, Jialiang; Wong, Tien Yin; Cheng, Ching-Yu; Lamoureux, Ecosse L

    2014-07-11

    To demonstrate the effectiveness of Hierarchical Bayesian (HB) approach in a modeling framework for association effects that accounts for SEs of vision-specific latent traits assessed using Rasch analysis. A systematic literature review was conducted in four major ophthalmic journals to evaluate Rasch analysis performed on vision-specific instruments. The HB approach was used to synthesize the Rasch model and multiple linear regression model for the assessment of the association effects related to vision-specific latent traits. The effectiveness of this novel HB one-stage "joint-analysis" approach allows all model parameters to be estimated simultaneously and was compared with the frequently used two-stage "separate-analysis" approach in our simulation study (Rasch analysis followed by traditional statistical analyses without adjustment for SE of latent trait). Sixty-six reviewed articles performed evaluation and validation of vision-specific instruments using Rasch analysis, and 86.4% (n = 57) performed further statistical analyses on the Rasch-scaled data using traditional statistical methods; none took into consideration SEs of the estimated Rasch-scaled scores. The two models on real data differed for effect size estimations and the identification of "independent risk factors." Simulation results showed that our proposed HB one-stage "joint-analysis" approach produces greater accuracy (average of 5-fold decrease in bias) with comparable power and precision in estimation of associations when compared with the frequently used two-stage "separate-analysis" procedure despite accounting for greater uncertainty due to the latent trait. Patient-reported data, using Rasch analysis techniques, do not take into account the SE of latent trait in association analyses. The HB one-stage "joint-analysis" is a better approach, producing accurate effect size estimations and information about the independent association of exposure variables with vision-specific latent traits
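The attenuation that arises when the SE of the latent trait is ignored can be reproduced with a toy two-stage simulation (all numbers are illustrative, not from the paper): regressing an outcome on a noisily estimated trait shrinks the slope by the reliability ratio var(theta)/(var(theta) + SE^2), which a joint model that carries the SE forward avoids.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 50_000
theta = rng.standard_normal(n)            # true latent trait, variance 1
se = 0.7                                  # measurement SE from a Rasch-style scoring
theta_hat = theta + se * rng.standard_normal(n)
y = 2.0 * theta + rng.standard_normal(n)  # outcome with true slope 2.0

# two-stage "separate analysis": regress y on the noisy trait estimate
slope_naive = np.cov(theta_hat, y)[0, 1] / np.var(theta_hat, ddof=1)

# classical correction using the known reliability, i.e. the quantity a
# joint ("one-stage") model effectively recovers
reliability = 1.0 / (1.0 + se**2)
slope_corrected = slope_naive / reliability
```

With se = 0.7 the reliability is about 0.67, so the naive slope lands near 1.34 instead of 2.0, mirroring the biased effect sizes the review attributes to the two-stage procedure.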

  11. Modeling the Development of Goal-Specificity in Mirror Neurons.

    Science.gov (United States)

    Thill, Serge; Svensson, Henrik; Ziemke, Tom

    2011-12-01

    Neurophysiological studies have shown that parietal mirror neurons encode not only actions but also the goal of these actions. Although some mirror neurons will fire whenever a certain action is perceived (goal-independently), most will only fire if the motion is perceived as part of an action with a specific goal. This result is important for the action-understanding hypothesis as it provides a potential neurological basis for such a cognitive ability. It is also relevant for the design of artificial cognitive systems, in particular robotic systems that rely on computational models of the mirror system in their interaction with other agents. Yet, to date, no computational model has explicitly addressed the mechanisms that give rise to both goal-specific and goal-independent parietal mirror neurons. In the present paper, we present a computational model based on a self-organizing map, which receives artificial inputs representing information about both the observed or executed actions and the context in which they were executed. We show that the map develops a biologically plausible organization in which goal-specific mirror neurons emerge. We further show that the fundamental cause for both the appearance and the number of goal-specific neurons can be found in geometric relationships between the different inputs to the map. The results are important to the action-understanding hypothesis as they provide a mechanism for the emergence of goal-specific parietal mirror neurons and lead to a number of predictions: (1) Learning of new goals may mostly reassign existing goal-specific neurons rather than recruit new ones; (2) input differences between executed and observed actions can explain observed corresponding differences in the number of goal-specific neurons; and (3) the percentage of goal-specific neurons may differ between motion primitives.
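A minimal version of the self-organizing map described above can be trained on inputs that concatenate action and context information; the map size, input dimensions, and random data below are illustrative assumptions, not the model's actual configuration:

```python
import numpy as np

rng = np.random.default_rng(1)
n_rows = n_cols = 5
dim = 4                                   # e.g. 2 "action" dims + 2 "context/goal" dims
weights = rng.random((n_rows * n_cols, dim))
grid = np.array([(i, j) for i in range(n_rows) for j in range(n_cols)], dtype=float)

def train(weights, data, epochs=20, lr0=0.5, radius0=2.0):
    """Classic SOM updates: move the best-matching unit and its map
    neighbors toward each input, with decaying rate and radius."""
    for epoch in range(epochs):
        lr = lr0 * (1.0 - epoch / epochs)
        radius = max(radius0 * (1.0 - epoch / epochs), 0.5)
        for x in data:
            bmu = np.argmin(((weights - x) ** 2).sum(axis=1))   # best-matching unit
            d2 = ((grid - grid[bmu]) ** 2).sum(axis=1)          # distance on the map
            h = np.exp(-d2 / (2.0 * radius ** 2))               # neighborhood kernel
            weights = weights + lr * h[:, None] * (x - weights)
    return weights

data = rng.random((200, dim))             # synthetic action+context inputs
weights = train(weights, data)
```

In the paper's terms, units whose weights come to respond only when particular context dimensions accompany an action would correspond to goal-specific mirror neurons, while units dominated by the action dimensions would respond goal-independently.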

  12. Comparisons of Multilevel Modeling and Structural Equation Modeling Approaches to Actor-Partner Interdependence Model.

    Science.gov (United States)

    Hong, Sehee; Kim, Soyoung

    2018-01-01

    There are basically two modeling approaches applicable to analyzing an actor-partner interdependence model: the multilevel modeling (hierarchical linear model) and the structural equation modeling. This article explains how to use these two models in analyzing an actor-partner interdependence model and how these two approaches work differently. As an empirical example, marital conflict data were used to analyze an actor-partner interdependence model. The multilevel modeling and the structural equation modeling produced virtually identical estimates for a basic model. However, the structural equation modeling approach allowed more realistic assumptions on measurement errors and factor loadings, rendering better model fit indices.

  13. Evolutionary modeling-based approach for model errors correction

    Directory of Open Access Journals (Sweden)

    S. Q. Wan

    2012-08-01

Full Text Available The inverse problem of using the information of historical data to estimate model errors is one of the science frontier research topics. In this study, we investigate such a problem using the classic Lorenz (1963) equation as a prediction model and the Lorenz equation with a periodic evolutionary function as an accurate representation of reality to generate "observational data."

    On the basis of the intelligent features of evolutionary modeling (EM, including self-organization, self-adaptive and self-learning, the dynamic information contained in the historical data can be identified and extracted by computer automatically. Thereby, a new approach is proposed to estimate model errors based on EM in the present paper. Numerical tests demonstrate the ability of the new approach to correct model structural errors. In fact, it can actualize the combination of the statistics and dynamics to certain extent.
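The classic Lorenz (1963) system used here as the prediction model can be integrated with a standard fourth-order Runge-Kutta step (the canonical parameters sigma = 10, rho = 28, beta = 8/3 are assumed):

```python
import numpy as np

def lorenz(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz (1963) equations."""
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def rk4_step(f, state, dt):
    """One classic fourth-order Runge-Kutta step."""
    k1 = f(state)
    k2 = f(state + 0.5 * dt * k1)
    k3 = f(state + 0.5 * dt * k2)
    k4 = f(state + dt * k3)
    return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

state = np.array([1.0, 1.0, 1.0])
dt = 0.01
trajectory = [state]
for _ in range(2000):                     # 20 time units on the attractor
    state = rk4_step(lorenz, state, dt)
    trajectory.append(state)
trajectory = np.array(trajectory)
```

In the study's setup, "observational data" would come from a perturbed version of these equations (e.g. with a periodic term added), and the evolutionary-modeling step searches for the correction that reconciles the two.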

  14. MODELS OF TECHNOLOGY ADOPTION: AN INTEGRATIVE APPROACH

    Directory of Open Access Journals (Sweden)

    Andrei OGREZEANU

    2015-06-01

Full Text Available The interdisciplinary study of information technology adoption has developed rapidly over the last 30 years. Various theoretical models have been developed and applied, such as the Technology Acceptance Model (TAM), Innovation Diffusion Theory (IDT), the Theory of Planned Behavior (TPB), etc. The result of these many years of research is thousands of contributions to the field, which, however, remain highly fragmented. This paper develops a theoretical model of technology adoption by integrating major theories in the field: primarily IDT, TAM, and TPB. To do so while avoiding conceptual clutter, an approach that goes back to basics in developing independent-variable types is proposed, emphasizing: (1) the logic of classification, and (2) the psychological mechanisms behind variable types. Once developed, these types are populated with variables originating in empirical research. Conclusions are developed on which types are underpopulated and present potential for future research. I end with a set of methodological recommendations for future application of the model.

  15. Interfacial Fluid Mechanics A Mathematical Modeling Approach

    CERN Document Server

    Ajaev, Vladimir S

    2012-01-01

Interfacial Fluid Mechanics: A Mathematical Modeling Approach provides an introduction to mathematical models of viscous flow used in the rapidly developing fields of microfluidics and microscale heat transfer. The basic physical effects are first introduced in the context of simple configurations, and their relative importance in typical microscale applications is discussed. Then, several configurations of importance to microfluidics, most notably thin films/droplets on substrates and confined bubbles, are discussed in detail. Topics from current research on electrokinetic phenomena, liquid flow near structured solid surfaces, evaporation/condensation, and surfactant phenomena are discussed in the later chapters. This book also: Discusses mathematical models in the context of actual applications such as electrowetting; Includes unique material on fluid flow near structured surfaces and phase change phenomena; Shows readers how to solve modeling problems related to microscale multiphase flows. Interfacial Fluid Me...

  16. A new modelling approach for zooplankton behaviour

    Science.gov (United States)

    Keiyu, A. Y.; Yamazaki, H.; Strickler, J. R.

    We have developed a new simulation technique to model zooplankton behaviour. The approach utilizes neither conventional artificial intelligence nor neural network methods. We have designed an adaptive behaviour network, similar to that of BEER [(1990) Intelligence as an adaptive behaviour: an experiment in computational neuroethology, Academic Press], based on observational studies of zooplankton behaviour. The proposed method is compared with non-"intelligent" models (random walk and correlated walk models) as well as with observed behaviour in a laboratory tank. Although the network is simple, the model exhibits rich behavioural patterns similar to those of live copepods.

  17. Continuum modeling an approach through practical examples

    CERN Document Server

    Muntean, Adrian

    2015-01-01

    This book develops continuum modeling skills and approaches the topic from three sides: (1) derivation of global integral laws together with the associated local differential equations, (2) design of constitutive laws and (3) modeling boundary processes. The focus of this presentation lies on many practical examples covering aspects such as coupled flow, diffusion and reaction in porous media or microwave heating of a pizza, as well as traffic issues in bacterial colonies and energy harvesting from geothermal wells. The target audience comprises primarily graduate students in pure and applied mathematics as well as working practitioners in engineering who are faced with nonstandard rheological topics like those typically arising in the food industry.

  18. A Principled Approach to the Specification of System Architectures for Space Missions

    Science.gov (United States)

    McKelvin, Mark L. Jr.; Castillo, Robert; Bonanne, Kevin; Bonnici, Michael; Cox, Brian; Gibson, Corrina; Leon, Juan P.; Gomez-Mustafa, Jose; Jimenez, Alejandro; Madni, Azad

    2015-01-01

    Modern space systems are increasing in complexity and scale at an unprecedented pace. Consequently, innovative methods, processes, and tools are needed to cope with the increasing complexity of architecting these systems. A key challenge in practice is the ability to scale the processes, methods, and tools used to architect complex space systems. Traditionally, the process for specifying space system architectures has largely relied on capturing the system architecture in informal descriptions that are often embedded within loosely coupled design documents and domain expertise. Such informal descriptions often lead to misunderstandings between design teams, ambiguous specifications, difficulty in maintaining consistency as the architecture evolves throughout the system development life cycle, and costly design iterations. Traditional methods are therefore becoming increasingly inefficient in coping with ever-increasing system complexity. We apply the principles of component-based design and platform-based design to the development of the system architecture for a practical space system to demonstrate the feasibility of our approach using SysML. Our results show that we are able to apply a systematic design method to manage system complexity, thus enabling effective data management, semantic coherence and traceability across different levels of abstraction in the design chain. Just as important, our approach enables interoperability among heterogeneous tools in a concurrent-engineering, model-based design environment.

  19. Patient Specific Modeling of Head-Up Tilt

    DEFF Research Database (Denmark)

    Williams, Nakeya; Wright, Andrew; Mehlsen, Jesper

    2014-01-01

    Short term cardiovascular responses to head-up tilt (HUT) experiments involve complex cardiovascular regulation in order to maintain blood pressure at homeostatic levels. This manuscript presents a patient specific compartmental model developed to predict dynamic changes in heart rate and arterial...

  20. Specific heat of the simple-cubic Ising model

    NARCIS (Netherlands)

    Feng, X.; Blöte, H.W.J.

    2010-01-01

    We provide an expression quantitatively describing the specific heat of the Ising model on the simple-cubic lattice in the critical region. This expression is based on finite-size scaling of numerical results obtained by means of a Monte Carlo method. It agrees satisfactorily with series expansions

  1. A conceptual model specification language (CMSL Version 2)

    NARCIS (Netherlands)

    Wieringa, Roelf J.

    1992-01-01

    Version 2 of a language (CMSL) to specify conceptual models is defined. CMSL consists of two parts, the value specification language VSL and the object specification language OSL. There is a formal semantics and an inference system for CMSL, but research on this still continues. A method for

  2. Modeling growth of specific spoilage organisms in tilapia ...

    African Journals Online (AJOL)

    Tilapia is an important aquaculture fish, but severe spoilage of tilapia is a widespread problem in global aquaculture. The spoilage is mostly caused by specific spoilage organisms (SSO). Therefore, it is very important to use microbial models to predict the growth of SSO in tilapia. This study first verified Pseudomonas and Vibrio ...

  3. Verifying large SDL-specifications using model checking

    NARCIS (Netherlands)

    Sidorova, N.; Steffen, M.; Reed, R.; Reed, J.

    2001-01-01

    In this paper we propose a methodology for model-checking based verification of large SDL specifications. The methodology is illustrated by a case study of an industrial medium-access protocol for wireless ATM. To cope with the state space explosion, the verification exploits the layered and modular

  4. Verifying OCL specifications of UML models : tool support and compositionality

    NARCIS (Netherlands)

    Kyas, Marcel

    2006-01-01

    The Unified Modelling Language (UML) and the Object Constraint Language (OCL) serve as specification languages for embedded and real-time systems used in a safety-critical environment. In this dissertation class diagrams, object diagrams, and OCL constraints are formalised. The formalisation

  5. Evaluation of Workflow Management Systems - A Meta Model Approach

    Directory of Open Access Journals (Sweden)

    Michael Rosemann

    1998-11-01

    Full Text Available The automated enactment of processes through the use of workflow management systems enables the outsourcing of the control flow from application systems. By now, a large number of systems that follow different workflow paradigms are available. This leads to the problem of selecting the appropriate workflow management system for a given situation. In this paper we outline the benefits of a meta model approach for the evaluation and comparison of different workflow management systems. After a general introduction on the topic of meta modeling, the meta models of the workflow management systems WorkParty (Siemens Nixdorf) and FlowMark (IBM) are compared as an example. These product-specific meta models can be generalized to meta reference models, which helps to specify a workflow methodology. As an example, an organisational reference meta model is presented, which helps users in specifying their requirements for a workflow management system.

  6. A new biodegradation prediction model specific to petroleum hydrocarbons.

    Science.gov (United States)

    Howard, Philip; Meylan, William; Aronson, Dallas; Stiteler, William; Tunkel, Jay; Comber, Michael; Parkerton, Thomas F

    2005-08-01

    A new predictive model for determining quantitative primary biodegradation half-lives of individual petroleum hydrocarbons has been developed. This model uses a fragment-based approach similar to that of several other biodegradation models, such as those within the Biodegradation Probability Program (BIOWIN). In the present study, a half-life in days is estimated using multiple linear regression against counts of 31 distinct molecular fragments. The model was developed using a data set consisting of 175 compounds with environmentally relevant experimental data that was divided into training and validation sets. The original fragments from the Ministry of International Trade and Industry BIOWIN model were used initially as structural descriptors, and additional fragments were then added to better describe the ring systems found in petroleum hydrocarbons and to adjust for nonlinearity within the experimental data. The training and validation sets had r² values of 0.91 and 0.81, respectively.
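    The core of such a fragment-contribution model is a multiple linear regression of half-lives against counts of structural fragments. A minimal sketch of that idea follows; the three fragment columns and all numeric values are invented for illustration (the actual model fits 31 fragments to 175 compounds):

    ```python
    import numpy as np

    # Hypothetical fragment-count matrix: rows = compounds, columns = counts of
    # three illustrative structural fragments (the actual model uses 31).
    X = np.array([
        [2, 0, 1],
        [1, 1, 0],
        [0, 2, 1],
        [3, 1, 2],
        [1, 0, 0],
    ], dtype=float)

    # Hypothetical primary biodegradation half-lives in days (made-up values).
    halflife_days = np.array([12.0, 7.5, 9.0, 20.0, 5.0])

    # Add an intercept column and fit by ordinary least squares, as in a
    # fragment-contribution regression model.
    A = np.hstack([np.ones((X.shape[0], 1)), X])
    coef, residuals, rank, sv = np.linalg.lstsq(A, halflife_days, rcond=None)

    # Predict the half-life of a new compound from its fragment counts.
    new_compound = np.array([1.0, 1.0, 1.0])
    prediction = coef[0] + new_compound @ coef[1:]
    print(round(float(prediction), 2))
    ```

    Each fitted coefficient can then be read as the estimated contribution of one fragment to the half-life, which is what makes fragment-based models easy to interpret.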

  7. Metal Mixture Modeling Evaluation project: 2. Comparison of four modeling approaches

    Science.gov (United States)

    Farley, Kevin J.; Meyer, Joe; Balistrieri, Laurie S.; DeSchamphelaere, Karl; Iwasaki, Yuichi; Janssen, Colin; Kamo, Masashi; Lofts, Steve; Mebane, Christopher A.; Naito, Wataru; Ryan, Adam C.; Santore, Robert C.; Tipping, Edward

    2015-01-01

    As part of the Metal Mixture Modeling Evaluation (MMME) project, models were developed by the National Institute of Advanced Industrial Science and Technology (Japan), the U.S. Geological Survey (USA), HDR|HydroQual, Inc. (USA), and the Centre for Ecology and Hydrology (UK) to address the effects of metal mixtures on biological responses of aquatic organisms. A comparison of the 4 models, as they were presented at the MMME Workshop in Brussels, Belgium (May 2012), is provided herein. Overall, the models were found to be similar in structure (free ion activities computed by WHAM; specific or non-specific binding of metals/cations in or on the organism; specification of metal potency factors and/or toxicity response functions to relate metal accumulation to biological response). Major differences in modeling approaches are attributed to various modeling assumptions (e.g., single versus multiple types of binding site on the organism) and specific calibration strategies that affected the selection of model parameters. The models provided a reasonable description of additive (or nearly additive) toxicity for a number of individual toxicity test results. Less-than-additive toxicity was more difficult to describe with the available models. Because of limitations in the available datasets and the strong inter-relationships among the model parameters (log KM values, potency factors, toxicity response parameters), further evaluation of specific model assumptions and calibration strategies is needed.

  8. XML for data representation and model specification in neuroscience.

    Science.gov (United States)

    Crook, Sharon M; Howell, Fred W

    2007-01-01

    EXtensible Markup Language (XML) technology provides an ideal representation for the complex structure of models and neuroscience data, as it is an open file format and provides a language-independent method for storing arbitrarily complex structured information. XML is composed of text and tags that explicitly describe the structure and semantics of the content of the document. In this chapter, we describe some of the common uses of XML in neuroscience, with case studies in representing neuroscience data and defining model descriptions based on examples from NeuroML. The specific methods that we discuss include (1) reading and writing XML from applications, (2) exporting XML from databases, (3) using XML standards to represent neuronal morphology data, (4) using XML to represent experimental metadata, and (5) creating new XML specifications for models.
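    Method (1), reading and writing XML from an application, can be sketched with Python's standard library. The element and attribute names below are illustrative placeholders only, not the actual NeuroML schema:

    ```python
    import xml.etree.ElementTree as ET

    # Build a small, hypothetical model description (element names are
    # illustrative, not actual NeuroML).
    root = ET.Element("model", attrib={"name": "demo_cell"})
    morph = ET.SubElement(root, "morphology")
    ET.SubElement(morph, "segment", attrib={"id": "0", "diameter_um": "2.0"})
    ET.SubElement(morph, "segment", attrib={"id": "1", "diameter_um": "1.5"})

    # Serialize to text, as an application would when writing a model file.
    xml_text = ET.tostring(root, encoding="unicode")

    # Read it back and extract the structured content.
    parsed = ET.fromstring(xml_text)
    diameters = [float(seg.get("diameter_um")) for seg in parsed.iter("segment")]
    print(diameters)
    ```

    Because the tags explicitly describe the semantics of each value (here, a segment diameter in micrometres), any XML-aware tool can consume the same file without language-specific parsing code.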

  9. Crime Modeling using Spatial Regression Approach

    Science.gov (United States)

    Saleh Ahmar, Ansari; Adiatma; Kasim Aidid, M.

    2018-01-01

    Acts of criminality in Indonesia increase in both variety and quantity every year: murder, rape, assault, vandalism, theft, fraud, fencing, and other cases that make people feel unsafe. The societal risk of exposure to crime is measured by the number of cases reported to police institutions; the higher the number of reports, the higher the level of crime in the region. This research models criminality in South Sulawesi, Indonesia, with the number of people exposed to the risk of crime as the dependent variable. Modelling follows an areal approach using the Spatial Autoregressive (SAR) and Spatial Error Model (SEM) methods. The independent variables used are population density, the number of poor inhabitants, GDP per capita, unemployment and the human development index (HDI). The spatial regression analysis shows that there are no spatial dependencies, in either lag or error terms, in South Sulawesi.
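    The SAR model used in such analyses has the reduced form y = (I - rho*W)^(-1)(X*beta + eps), which makes it straightforward to simulate. The sketch below uses an invented 4-region weight matrix and made-up coefficients, not the South Sulawesi data:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical row-standardised spatial weights for 4 neighbouring regions.
    W = np.array([
        [0.0, 0.5, 0.5, 0.0],
        [0.5, 0.0, 0.0, 0.5],
        [0.5, 0.0, 0.0, 0.5],
        [0.0, 0.5, 0.5, 0.0],
    ])

    rho = 0.4                      # spatial autoregressive parameter (invented)
    beta = np.array([1.0, 0.02])   # intercept and effect of one covariate
    X = np.column_stack([np.ones(4), rng.uniform(50, 150, size=4)])  # e.g. density
    eps = rng.normal(0, 0.1, size=4)

    # Reduced form of the SAR model: y = (I - rho*W)^(-1) (X @ beta + eps)
    I = np.eye(4)
    y = np.linalg.solve(I - rho * W, X @ beta + eps)
    print(np.round(y, 3))
    ```

    When the estimated rho (and the analogous error-dependence parameter in SEM) is not significantly different from zero, as reported here, the spatial terms drop out and ordinary regression suffices.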

  10. Merging Digital Surface Models Implementing Bayesian Approaches

    Science.gov (United States)

    Sadeq, H.; Drummond, J.; Li, Z.

    2016-06-01

    In this research different DSMs from different sources have been merged. The merging is based on a probabilistic model using a Bayesian Approach. The implemented data have been sourced from very high resolution satellite imagery sensors (e.g. WorldView-1 and Pleiades). It is deemed preferable to use a Bayesian Approach when the data obtained from the sensors are limited and it is difficult to obtain many measurements or it would be very costly, thus the problem of the lack of data can be solved by introducing a priori estimations of data. To infer the prior data, it is assumed that the roofs of the buildings are specified as smooth, and for that purpose local entropy has been implemented. In addition to the a priori estimations, GNSS RTK measurements have been collected in the field which are used as check points to assess the quality of the DSMs and to validate the merging result. The model has been applied in the West-End of Glasgow containing different kinds of buildings, such as flat roofed and hipped roofed buildings. Both quantitative and qualitative methods have been employed to validate the merged DSM. The validation results have shown that the model was successfully able to improve the quality of the DSMs and improving some characteristics such as the roof surfaces, which consequently led to better representations. In addition to that, the developed model has been compared with the well established Maximum Likelihood model and showed similar quantitative statistical results and better qualitative results. Although the proposed model has been applied on DSMs that were derived from satellite imagery, it can be applied to any other sourced DSMs.
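    One simple way to realise a Bayesian merge of overlapping DSMs is precision-weighted (inverse-variance) fusion of the per-cell heights. The sketch below uses invented 2x2 grids and assumed per-source standard deviations; the paper's actual prior construction (smooth roofs inferred via local entropy) is more elaborate:

    ```python
    import numpy as np

    # Hypothetical 2x2 height grids (metres) from two DSM sources.
    dsm_a = np.array([[10.2, 11.0], [9.8, 10.5]])
    dsm_b = np.array([[10.0, 11.4], [9.6, 10.1]])

    # Assumed measurement standard deviations for each source (priors on quality).
    sigma_a, sigma_b = 0.5, 1.0

    # Gaussian (precision-weighted) fusion: the posterior mean weights each
    # source by its inverse variance.
    w_a, w_b = 1.0 / sigma_a**2, 1.0 / sigma_b**2
    merged = (w_a * dsm_a + w_b * dsm_b) / (w_a + w_b)
    posterior_sigma = (w_a + w_b) ** -0.5
    print(np.round(merged, 3), round(posterior_sigma, 3))
    ```

    The posterior standard deviation is smaller than either input's, which is the formal sense in which merging improves DSM quality.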

  11. MERGING DIGITAL SURFACE MODELS IMPLEMENTING BAYESIAN APPROACHES

    Directory of Open Access Journals (Sweden)

    H. Sadeq

    2016-06-01

    Full Text Available In this research different DSMs from different sources have been merged. The merging is based on a probabilistic model using a Bayesian Approach. The implemented data have been sourced from very high resolution satellite imagery sensors (e.g. WorldView-1 and Pleiades). It is deemed preferable to use a Bayesian Approach when the data obtained from the sensors are limited and it is difficult to obtain many measurements or it would be very costly, thus the problem of the lack of data can be solved by introducing a priori estimations of data. To infer the prior data, it is assumed that the roofs of the buildings are specified as smooth, and for that purpose local entropy has been implemented. In addition to the a priori estimations, GNSS RTK measurements have been collected in the field which are used as check points to assess the quality of the DSMs and to validate the merging result. The model has been applied in the West-End of Glasgow containing different kinds of buildings, such as flat roofed and hipped roofed buildings. Both quantitative and qualitative methods have been employed to validate the merged DSM. The validation results have shown that the model was successfully able to improve the quality of the DSMs and improving some characteristics such as the roof surfaces, which consequently led to better representations. In addition to that, the developed model has been compared with the well established Maximum Likelihood model and showed similar quantitative statistical results and better qualitative results. Although the proposed model has been applied on DSMs that were derived from satellite imagery, it can be applied to any other sourced DSMs.

  12. Multilevel Models for the Analysis of Angle-Specific Torque Curves with Application to Master Athletes

    Directory of Open Access Journals (Sweden)

    Carvalho Humberto M.

    2015-12-01

    Full Text Available The aim of this paper was to outline a multilevel modeling approach to fit individual angle-specific torque curves describing concentric knee extension and flexion isokinetic muscular actions in Master athletes. The potential of the analytical approach to examine between-individual differences across the angle-specific torque curves was illustrated, including between-individual variation due to gender differences at a higher level. Torques in concentric muscular actions of knee extension and flexion at 60°·s⁻¹ were considered within a range of motion between 5° and 85° (only torques that were “truly” isokinetic). Multilevel time-series models with autoregressive covariance structures were superior fits compared with standard multilevel models for repeated measures to fit angle-specific torque curves. Third- and fourth-order polynomial models were the best fits to describe angle-specific torque curves of isokinetic knee flexion and extension concentric actions, respectively. The fixed exponents allow interpretations of initial acceleration, the angle at peak torque and the decrement of torque after peak torque. The multilevel models were also flexible enough to illustrate the influence of gender differences on the shape of torque throughout the range of motion and on the shape of the curves. The presented multilevel regression models may afford a general framework to examine angle-specific torque curves obtained by isokinetic dynamometry, and add to the understanding of the mechanisms of strength development, particularly the force-length relationship, as related both to performance and to injury prevention.
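    Setting aside the multilevel (random-effects) structure, the fixed part of such a model is a polynomial in joint angle, and the angle at peak torque falls out of the fitted curve. The angle-torque values below are invented for illustration:

    ```python
    import numpy as np

    # Hypothetical angle-specific torque data for one athlete: angles in degrees
    # within the 5-85 degree range, torques in N·m (made-up values).
    angle = np.array([5, 15, 25, 35, 45, 55, 65, 75, 85], dtype=float)
    torque = np.array([60, 110, 150, 170, 165, 150, 125, 95, 60], dtype=float)

    # A fourth-order polynomial, as reported for knee-extension curves; the
    # fitted curve gives the angle at peak torque directly.
    coeffs = np.polyfit(angle, torque, deg=4)
    curve = np.poly1d(coeffs)

    fine_angles = np.linspace(5, 85, 801)
    angle_at_peak = float(fine_angles[np.argmax(curve(fine_angles))])
    print(round(angle_at_peak, 1))
    ```

    In the multilevel version, the polynomial coefficients additionally vary by athlete and by gender rather than being fitted to a single curve.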

  13. A new approach for modeling composite materials

    Science.gov (United States)

    Alcaraz de la Osa, R.; Moreno, F.; Saiz, J. M.

    2013-03-01

    The increasing use of composite materials is due to their ability to be tailored for special purposes, with applications evolving day by day. This is why predicting the properties of these systems from their constituents, or phases, has become so important. However, assigning macroscopic optical properties to these materials from the bulk properties of their constituents is not a straightforward task. In this research, we present a spectral analysis of typical three-dimensional random composite nanostructures using an Extension of the Discrete Dipole Approximation (the E-DDA code), comparing different approaches and emphasizing the influence of the optical properties of the constituents and of their concentration. In particular, we propose a new approach that preserves the individual nature of the constituents while introducing a variation in the optical properties of each discrete element driven by the surrounding medium. The results obtained with this new approach compare more favorably with experiment than previous ones. We have also applied it to a non-conventional material composed of a metamaterial embedded in a dielectric matrix. Our version of the Discrete Dipole Approximation code, the E-DDA code, has been formulated specifically to tackle this kind of problem, including materials with magnetic or tensor properties.

  14. Carcinogen specific dosimetry model for passive smokers of various ages

    International Nuclear Information System (INIS)

    Robinson, Risa J.

    2005-01-01

    Studies indicate that being exposed to second hand smoke increases the chance of developing lung cancer. Understanding the deposition of carcinogenic particles present in second hand smoke is necessary to understand the development of specific histologic type cancers. In this study, a deposition model is presented for subjects of various ages exposed to sidestream smoke. The model included particle dynamics of coagulation, hygroscopic growth, charge and cloud behavior. Concentrations were varied from the maximum measured indoor concentrations (10⁶ particles/cm³) to what would be expected from wisps of smoke (10⁸ particles/cm³). Model results agreed well with experimental data taken from human subject deposition measurements (four studies). The model results were used to determine the dose intensity (dose per unit airway surface area) of Benzo[a]pyrene (BaP) in the respiratory tract for subjects of various ages. Model predictions for BaP surface concentration on the airway walls paralleled incident rates of tumors by location in the upper tracheobronchial region. Mass deposition efficiency was found to be larger for younger subjects, consistent with diffusion being the predominant mechanism for this particle size range. However, the actual dose intensity of BaP was found to be smaller for children than adults. This occurred due to the predominant effect of the smaller initial inhaled mass for children resulting from smaller tidal volumes. The resulting model is a useful tool to predict carcinogen-specific particle deposition.

  15. Mathematical Formulation Requirements and Specifications for the Process Models

    International Nuclear Information System (INIS)

    Steefel, C.; Moulton, D.; Pau, G.; Lipnikov, K.; Meza, J.; Lichtner, P.; Wolery, T.; Bacon, D.; Spycher, N.; Bell, J.; Moridis, G.; Yabusaki, S.; Sonnenthal, E.; Zyvoloski, G.; Andre, B.; Zheng, L.; Davis, J.

    2010-01-01

    The Advanced Simulation Capability for Environmental Management (ASCEM) is intended to be a state-of-the-art scientific tool and approach for understanding and predicting contaminant fate and transport in natural and engineered systems. The ASCEM program is aimed at addressing critical EM program needs to better understand and quantify flow and contaminant transport behavior in complex geological systems. It will also address the long-term performance of engineered components including cementitious materials in nuclear waste disposal facilities, in order to reduce uncertainties and risks associated with DOE EM's environmental cleanup and closure activities. Building upon national capabilities developed from decades of Research and Development in subsurface geosciences, computational and computer science, modeling and applied mathematics, and environmental remediation, the ASCEM initiative will develop an integrated, open-source, high-performance computer modeling system for multiphase, multicomponent, multiscale subsurface flow and contaminant transport. This integrated modeling system will incorporate capabilities for predicting releases from various waste forms, identifying exposure pathways and performing dose calculations, and conducting systematic uncertainty quantification. The ASCEM approach will be demonstrated on selected sites, and then applied to support the next generation of performance assessments of nuclear waste disposal and facility decommissioning across the EM complex. The Multi-Process High Performance Computing (HPC) Simulator is one of three thrust areas in ASCEM. The other two are the Platform and Integrated Toolsets (dubbed the Platform) and Site Applications. The primary objective of the HPC Simulator is to provide a flexible and extensible computational engine to simulate the coupled processes and flow scenarios described by the conceptual models developed using the ASCEM Platform. The graded and iterative approach to assessments naturally

  16. Stage-specific predictive models for breast cancer survivability.

    Science.gov (United States)

    Kate, Rohit J; Nadig, Ramya

    2017-01-01

    Survivability rates vary widely among various stages of breast cancer. Although machine learning models built in the past to predict breast cancer survivability were given stage as one of the features, they were not trained or evaluated separately for each stage. To investigate whether there are differences in the performance of machine learning models trained and evaluated across different stages, we used three different machine learning methods to build models that predict breast cancer survivability separately for each stage, and compared them with the traditional joint models built for all the stages. We also evaluated the models separately for each stage and together for all the stages. Our results show that the most suitable model to predict survivability for a specific stage is the model trained for that particular stage. In our experiments, using additional examples of other stages during training did not help; in fact, it made performance worse in some cases. The most important features for predicting survivability were also found to be different for different stages. By evaluating the models separately on different stages, we found that performance varied widely across them. We also demonstrate that evaluating predictive models for survivability on all the stages together, as was done in the past, is misleading because it overestimates performance. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
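    The paper's central design choice, one model per stage instead of a single joint model, can be illustrated with a deliberately trivial majority-class predictor on invented records (the actual study used three machine learning methods):

    ```python
    from collections import Counter, defaultdict

    # Hypothetical training records: (stage, survived) pairs; survivability
    # differs sharply by stage, so a joint majority model misses the pattern.
    records = [
        ("I", 1), ("I", 1), ("I", 1), ("I", 1), ("I", 0),
        ("IV", 0), ("IV", 0), ("IV", 0), ("IV", 1),
    ]

    def majority(labels):
        """Return the most common label."""
        return Counter(labels).most_common(1)[0][0]

    # Joint model: one majority class for all stages.
    joint_prediction = majority([y for _, y in records])

    # Stage-specific models: one majority class per stage.
    by_stage = defaultdict(list)
    for stage, y in records:
        by_stage[stage].append(y)
    stage_predictions = {stage: majority(ys) for stage, ys in by_stage.items()}

    print(joint_prediction, stage_predictions)
    ```

    Even in this toy case the joint model predicts survival for every patient, while the per-stage models recover the stage-dependent pattern, which is the effect the paper measures with real learners.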

  17. Towards a 3d Spatial Urban Energy Modelling Approach

    Science.gov (United States)

    Bahu, J.-M.; Koch, A.; Kremers, E.; Murshed, S. M.

    2013-09-01

    Today's needs to reduce the environmental impact of energy use impose dramatic changes for energy infrastructure and existing demand patterns (e.g. buildings) corresponding to their specific context. In addition, future energy systems are expected to integrate a considerable share of fluctuating power sources and equally a high share of distributed generation of electricity. Energy system models capable of describing such future systems and allowing the simulation of the impact of these developments thus require a spatial representation in order to reflect the local context and the boundary conditions. This paper describes two recent research approaches developed at EIFER in the fields of (a) geo-localised simulation of heat energy demand in cities based on 3D morphological data and (b) spatially explicit Agent-Based Models (ABM) for the simulation of smart grids. 3D city models were used to assess solar potential and heat energy demand of residential buildings which enable cities to target the building refurbishment potentials. Distributed energy systems require innovative modelling techniques where individual components are represented and can interact. With this approach, several smart grid demonstrators were simulated, where heterogeneous models are spatially represented. Coupling 3D geodata with energy system ABMs holds different advantages for both approaches. On one hand, energy system models can be enhanced with high resolution data from 3D city models and their semantic relations. Furthermore, they allow for spatial analysis and visualisation of the results, with emphasis on spatial and structural correlations among the different layers (e.g. infrastructure, buildings, administrative zones) to provide an integrated approach. On the other hand, 3D models can benefit from more detailed system description of energy infrastructure, representing dynamic phenomena and high resolution models for energy use at component level. The proposed modelling strategies

  18. A nationwide modelling approach to decommissioning - 16182

    International Nuclear Information System (INIS)

    Kelly, Bernard; Lowe, Andy; Mort, Paul

    2009-01-01

    In this paper we describe a proposed UK national approach to modelling decommissioning. For the first time, we shall have an insight into optimizing the safety and efficiency of a national decommissioning strategy. To do this we use the General Case Integrated Waste Algorithm (GIA), a universal model of decommissioning nuclear plant, power plant, waste arisings and the associated knowledge capture. The model scales from individual items of plant through cells, groups of cells, buildings, whole sites and then on up to a national scale. We describe the national vision for GIA, which can be broken down into three levels: 1) the capture of the chronological order of activities that an experienced decommissioner would use to decommission any nuclear facility anywhere in the world (Level 1 of GIA); 2) the construction of an Operational Research (OR) model based on Level 1 to allow 'what if' scenarios to be tested quickly (Level 2); 3) the construction of a state-of-the-art knowledge capture capability that allows future generations to learn from our current decommissioning experience (Level 3). We show the progress to date in developing GIA at Levels 1 and 2. As part of Level 1, GIA has assisted in the development of an IMechE professional decommissioning qualification. Furthermore, we describe GIA as the basis of a UK-owned database of decommissioning norms for such things as costs, productivity and durations. From Level 2, we report on a pilot study that has successfully tested the basic principles for the OR numerical simulation of the algorithm. We then highlight the advantages of applying the OR modelling approach nationally. In essence, a series of 'what if...' scenarios can be tested that will improve the safety and efficiency of decommissioning. (authors)

  19. Modeling in transport phenomena a conceptual approach

    CERN Document Server

    Tosun, Ismail

    2007-01-01

    Modeling in Transport Phenomena, Second Edition presents and clearly explains, with example problems, the basic concepts and their applications to fluid flow, heat transfer, mass transfer, chemical reaction engineering and thermodynamics. A balanced approach is presented between analysis and synthesis, and students will understand how to use the solution in engineering analysis. Systematic derivations of the equations and the physical significance of each term are given in detail, so that students can easily understand and follow the material. There is a strong incentive in science and engineering to

  20. Nuclear physics for applications. A model approach

    International Nuclear Information System (INIS)

    Prussin, S.G.

    2007-01-01

    Written by a researcher and teacher with experience at top institutes in the US and Europe, this textbook provides advanced undergraduates minoring in physics with working knowledge of the principles of nuclear physics. Simplifying models and approaches reveal the essence of the principles involved, with the mathematical and quantum mechanical background integrated in the text where it is needed and not relegated to the appendices. The practicality of the book is enhanced by numerous end-of-chapter problems and solutions available on the Wiley homepage. (orig.)

  1. Generalized framework for context-specific metabolic model extraction methods

    Directory of Open Access Journals (Sweden)

    Semidán Robaina Estévez

    2014-09-01

    Full Text Available Genome-scale metabolic models are increasingly applied to investigate the physiology not only of simple prokaryotes, but also of eukaryotes, such as plants, characterized by compartmentalized cells of multiple types. While genome-scale models aim at including the entirety of known metabolic reactions, mounting evidence has indicated that only a subset of these reactions is active in a given context, such as a developmental stage, cell type, or environment. As a result, several methods have been proposed to reconstruct context-specific models from existing genome-scale models by integrating various types of high-throughput data. Here we present a mathematical framework that puts all existing methods under one umbrella and provides the means to better understand their functioning, to highlight similarities and differences, and to help users in selecting the most suitable method for an application.
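    The shared core of these extraction methods, restricting a genome-scale model to reactions supported by context-specific evidence, can be sketched as set filtering. All reaction and gene names below are hypothetical, and real methods replace this simple threshold rule with optimisation-based formulations:

    ```python
    # Hypothetical genome-scale model: reaction -> associated genes, plus
    # per-context expression evidence; names are illustrative only.
    reactions = {
        "R_glycolysis_1": {"geneA"},
        "R_photorespiration": {"geneB"},
        "R_starch_synthesis": {"geneC", "geneD"},
    }

    # Genes with expression evidence in two contexts (e.g. leaf vs root).
    expressed = {
        "leaf": {"geneA", "geneB"},
        "root": {"geneA", "geneC", "geneD"},
    }

    def extract_context_model(reactions, expressed_genes):
        """Keep reactions whose gene set contains at least one expressed gene."""
        return {r for r, genes in reactions.items() if genes & expressed_genes}

    leaf_model = extract_context_model(reactions, expressed["leaf"])
    root_model = extract_context_model(reactions, expressed["root"])
    print(sorted(leaf_model), sorted(root_model))
    ```

    The framework in the paper formalises exactly this kind of mapping from evidence to an active subnetwork, which is what makes the different published methods comparable.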

  2. Analysis specifications for the CC3 biosphere model biotrac

    Energy Technology Data Exchange (ETDEWEB)

    Szekely, J G; Wojciechowski, L C; Stephens, M E; Halliday, H A

    1994-12-01

    The CC3 (Canadian Concept, generation 3) model BIOTRAC (Biosphere Transport and Consequences) describes the movement in the biosphere of releases from an underground disposal vault, and the consequent radiological dose to a reference individual. Concentrations of toxic substances in different parts of the biosphere are also calculated. BIOTRAC was created specifically for the postclosure analyses of the Environmental Impact Statement that AECL is preparing on the concept for disposal of Canada's nuclear fuel waste. The model relies on certain assumptions and constraints on the system, which are described by Davis et al. Accordingly, great care must be exercised if BIOTRAC is used for any other purpose.

  3. The contribution of emotional empathy to approachability judgements assigned to emotional faces is context specific

    Directory of Open Access Journals (Sweden)

    Megan L Willis

    2015-08-01

    Full Text Available Previous research on approachability judgements has indicated that facial expressions modulate how these judgements are made, but the relationship between emotional empathy and context in this decision-making process has not yet been examined. This study examined the contribution of emotional empathy to approachability judgements assigned to emotional faces in different contexts. One hundred and twenty female participants completed the Questionnaire Measure of Emotional Empathy. Participants provided approachability judgements to faces displaying angry, disgusted, fearful, happy, neutral and sad expressions, in three different contexts – when evaluating whether they would approach another individual to: (1) receive help; (2) give help; or (3) when no contextual information was provided. In addition, participants were also required to provide ratings of perceived threat, emotional intensity and label facial expressions. Emotional empathy significantly predicted approachability ratings for specific emotions in each context, over and above the contribution of perceived threat and intensity, which were associated with emotional empathy. Higher emotional empathy predicted less willingness to approach people with angry and disgusted faces to receive help, and a greater willingness to approach people with happy faces to receive help. Higher emotional empathy also predicted a greater willingness to approach people with sad faces to offer help, and more willingness to approach people with happy faces when no contextual information was provided. These results highlight the important contribution of individual differences in emotional empathy in predicting how approachability judgements are assigned to facial expressions in context.

  4. Pedagogic process modeling: Humanistic-integrative approach

    Directory of Open Access Journals (Sweden)

    Boritko Nikolaj M.

    2007-01-01

    Full Text Available The paper deals with some current problems of modeling the dynamics of the subject-features development of the individual. The term "process" is considered in the context of the humanistic-integrative approach, in which the principles of self-education are regarded as criteria for efficient pedagogic activity. Four basic characteristics of the pedagogic process are pointed out: intentionality reflects logicality and regularity of the development of the process; discreteness (stageability) indicates qualitative stages through which the pedagogic phenomenon passes; nonlinearity explains the crisis character of pedagogic processes and reveals inner factors of self-development; situationality requires a selection of pedagogic conditions in accordance with the inner factors, which would enable steering the pedagogic process. Offered are two steps for singling out a particular stage and the algorithm for developing an integrative model for it. The suggested conclusions might be of use for further theoretic research, analyses of educational practices and for realistic predicting of pedagogical phenomena.

  5. Mathematical modelling of digit specification by a sonic hedgehog gradient

    KAUST Repository

    Woolley, Thomas E.; Baker, Ruth E.; Tickle, Cheryll; Maini, Philip K.; Towers, Matthew

    2013-01-01

    Background: The three chick wing digits represent a classical example of a pattern specified by a morphogen gradient. Here we have investigated whether a mathematical model of a Shh gradient can describe the specification of the identities of the three chick wing digits and if it can be applied to limbs with more digits. Results: We have produced a mathematical model for specification of chick wing digit identities by a Shh gradient that can be extended to the four digits of the chick leg with Shh-producing cells forming a digit. This model cannot be extended to specify the five digits of the mouse limb. Conclusions: Our data suggest that the parameters of a classical-type morphogen gradient are sufficient to specify the identities of three different digits. However, to specify more digit identities, this core mechanism has to be coupled to alternative processes, one being that in the chick leg and mouse limb, Shh-producing cells give rise to digits; another that in the mouse limb, the cellular response to the Shh gradient adapts over time so that digit specification does not depend simply on Shh concentration. Developmental Dynamics 243:290-298, 2014. © 2013 Wiley Periodicals, Inc.
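The classical-type gradient mechanism described above can be illustrated with a toy calculation: a steady-state exponential Shh profile read out through ordered concentration thresholds, one per digit identity. All numbers below are illustrative assumptions, not fitted parameters from the paper:

```python
import math

def shh_concentration(x, c0=1.0, lam=50.0):
    """Steady-state exponential morphogen gradient C(x) = C0 * exp(-x / lambda)."""
    return c0 * math.exp(-x / lam)

def digit_identity(c, thresholds):
    """Map a local Shh concentration to a digit identity via ordered thresholds."""
    for digit, t in thresholds:
        if c >= t:
            return digit
    return None  # below the lowest threshold: no digit specified

# Illustrative thresholds and positions (distance from the Shh source)
thresholds = [("digit 3", 0.5), ("digit 2", 0.2), ("digit 1", 0.05)]
positions = [10, 60, 120]

ids = [digit_identity(shh_concentration(x), thresholds) for x in positions]
```

Three thresholds cleanly partition the gradient into three identities; the paper's point is that packing more identities into the same profile requires additional mechanisms (Shh-producing cells forming a digit, or temporal adaptation of the response) beyond this core readout.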

  6. Mathematical modelling of digit specification by a sonic hedgehog gradient

    KAUST Repository

    Woolley, Thomas E.

    2013-11-26

    Background: The three chick wing digits represent a classical example of a pattern specified by a morphogen gradient. Here we have investigated whether a mathematical model of a Shh gradient can describe the specification of the identities of the three chick wing digits and if it can be applied to limbs with more digits. Results: We have produced a mathematical model for specification of chick wing digit identities by a Shh gradient that can be extended to the four digits of the chick leg with Shh-producing cells forming a digit. This model cannot be extended to specify the five digits of the mouse limb. Conclusions: Our data suggest that the parameters of a classical-type morphogen gradient are sufficient to specify the identities of three different digits. However, to specify more digit identities, this core mechanism has to be coupled to alternative processes, one being that in the chick leg and mouse limb, Shh-producing cells give rise to digits; another that in the mouse limb, the cellular response to the Shh gradient adapts over time so that digit specification does not depend simply on Shh concentration. Developmental Dynamics 243:290-298, 2014. © 2013 Wiley Periodicals, Inc.

  7. Bayesian model to detect phenotype-specific genes for copy number data

    Directory of Open Access Journals (Sweden)

    González Juan R

    2012-06-01

    Full Text Available Abstract Background An important question in genetic studies is to determine those genetic variants, in particular CNVs, that are specific to different groups of individuals. This could help in elucidating differences in disease predisposition and response to pharmaceutical treatments. We propose a Bayesian model designed to analyze thousands of copy number variants (CNVs where only few of them are expected to be associated with a specific phenotype. Results The model is illustrated by analyzing three major human groups belonging to HapMap data. We also show how the model can be used to determine specific CNVs related to response to treatment in patients diagnosed with ovarian cancer. The model is also extended to address the problem of how to adjust for confounding covariates (e.g., population stratification. Through a simulation study, we show that the proposed model outperforms other approaches that are typically used to analyze this data when analyzing common copy-number polymorphisms (CNPs or complex CNVs. We have developed an R package, called bayesGen, that implements the model and estimation algorithms. Conclusions Our proposed model is useful to discover specific genetic variants when different subgroups of individuals are analyzed. The model can address studies with or without a control group. By integrating all data in a unique model we can obtain a list of genes that are associated with a given phenotype as well as a different list of genes that are shared among the different subtypes of cases.
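A much-reduced sketch of the underlying idea: for each CNV, compare a shared-frequency hypothesis against group-specific frequencies via a Bayes factor, with a sparse prior encoding that only a few of many variants are expected to be associated. The carrier counts and prior below are hypothetical; the published model (bayesGen) is considerably richer:

```python
from math import lgamma, exp

def log_betabinom(k, n, a=1.0, b=1.0):
    """Log of the integrated likelihood term for k carriers in n samples
    under a Beta(a, b) prior on the carrier frequency (binomial coefficients
    cancel between the two hypotheses, so they are omitted)."""
    return (lgamma(a + b) - lgamma(a) - lgamma(b)
            + lgamma(a + k) + lgamma(b + n - k) - lgamma(a + b + n))

def posterior_assoc(k1, n1, k2, n2, prior=0.01):
    """Posterior probability that a CNV's carrier frequency differs between
    two groups. H0: one shared frequency; H1: group-specific frequencies.
    The small prior reflects sparsity across thousands of CNVs."""
    log_h0 = log_betabinom(k1 + k2, n1 + n2)
    log_h1 = log_betabinom(k1, n1) + log_betabinom(k2, n2)
    bf = exp(log_h1 - log_h0)
    return prior * bf / (prior * bf + 1 - prior)

p_diff = posterior_assoc(40, 50, 5, 50)   # very different carrier counts
p_same = posterior_assoc(20, 50, 22, 50)  # similar carrier counts
```

Even with a sparse prior, strongly divergent counts overwhelm it, while similar counts leave the posterior near zero, which is the behaviour needed when scanning thousands of CNVs for a handful of phenotype-specific ones.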

  8. Surface mesh to voxel data registration for patient-specific anatomical modeling

    Science.gov (United States)

    de Oliveira, Júlia E. E.; Giessler, Paul; Keszei, András; Herrler, Andreas; Deserno, Thomas M.

    2016-03-01

    Virtual Physiological Human (VPH) models are frequently used for training, planning, and performing medical procedures. The Regional Anaesthesia Simulator and Assistant (RASimAs) project has the goal of increasing the application and effectiveness of regional anesthesia (RA) by combining a simulator of ultrasound-guided and electrical nerve-stimulated RA procedures and a subject-specific assistance system through an integration of image processing, physiological models, subject-specific data, and virtual reality. Individualized models enrich the virtual training tools for learning and improving regional anaesthesia (RA) skills. Therefore, we suggest patient-specific VPH models that are composed by registering the general mesh-based models with patient voxel data-based recordings. Specifically, the pelvis region has been focused for the support of the femoral nerve block. The processing pipeline is composed of different freely available toolboxes such as MATLAB, the Simulation Open Framework Architecture (SOFA), and MeshLab. The approach of Gilles is applied for mesh-to-voxel registration. Personalized VPH models include anatomical as well as mechanical properties of the tissues. Two commercial VPH models (Zygote and Anatomium) were used together with 34 MRI data sets. Results are presented for the skin surface and pelvic bones. Future work will extend the registration procedure to cope with all model tissues (i.e., skin, muscle, bone, vessel, nerve, fascia) in a one-step procedure and extrapolating the personalized models to body regions actually being out of the captured field of view.

  9. Popularity Modeling for Mobile Apps: A Sequential Approach.

    Science.gov (United States)

    Zhu, Hengshu; Liu, Chuanren; Ge, Yong; Xiong, Hui; Chen, Enhong

    2015-07-01

    The popularity information in App stores, such as chart rankings, user ratings, and user reviews, provides an unprecedented opportunity to understand user experiences with mobile Apps, learn the process of adoption of mobile Apps, and thus enable better mobile App services. While the importance of popularity information is well recognized in the literature, the use of the popularity information for mobile App services is still fragmented and under-explored. To this end, in this paper, we propose a sequential approach based on hidden Markov model (HMM) for modeling the popularity information of mobile Apps toward mobile App services. Specifically, we first propose a popularity based HMM (PHMM) to model the sequences of the heterogeneous popularity observations of mobile Apps. Then, we introduce a bipartite based method to precluster the popularity observations. This can help to learn the parameters and initial values of the PHMM efficiently. Furthermore, we demonstrate that the PHMM is a general model and can be applicable for various mobile App services, such as trend based App recommendation, rating and review spam detection, and ranking fraud detection. Finally, we validate our approach on two real-world data sets collected from the Apple App Store. Experimental results clearly validate both the effectiveness and efficiency of the proposed popularity modeling approach.
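The core of any such HMM-based treatment is the forward recursion that scores a sequence of popularity observations. Below is a toy two-state version in Python, with invented "hot"/"cold" hidden states and coarse chart-position emissions; the actual PHMM uses heterogeneous observations and bipartite preclustering to initialize these tables:

```python
def forward(obs, states, start_p, trans_p, emit_p):
    """Forward algorithm: likelihood of an observation sequence under a
    discrete HMM, summing over all hidden state paths."""
    alpha = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    for o in obs[1:]:
        alpha.append({s: emit_p[s][o] * sum(alpha[-1][r] * trans_p[r][s]
                                            for r in states)
                      for s in states})
    return sum(alpha[-1].values())

# Illustrative parameters: a "hot" App tends to stay hot and chart high
states = ("hot", "cold")
start_p = {"hot": 0.5, "cold": 0.5}
trans_p = {"hot": {"hot": 0.8, "cold": 0.2},
           "cold": {"hot": 0.3, "cold": 0.7}}
emit_p = {"hot": {"top": 0.7, "mid": 0.2, "low": 0.1},
          "cold": {"top": 0.1, "mid": 0.3, "low": 0.6}}

lik = forward(["top", "top", "mid"], states, start_p, trans_p, emit_p)
```

Once trained, the same machinery supports the services listed in the abstract: an anomalously low likelihood under the learned model is the kind of signal used for ranking-fraud and review-spam detection.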

  10. Multiple-Input Subject-Specific Modeling of Plasma Glucose Concentration for Feedforward Control.

    Science.gov (United States)

    Kotz, Kaylee; Cinar, Ali; Mei, Yong; Roggendorf, Amy; Littlejohn, Elizabeth; Quinn, Laurie; Rollins, Derrick K

    2014-11-26

    The ability to accurately develop subject-specific, input causation models for blood glucose concentration (BGC) over large input sets can have a significant impact on tightening control for insulin dependent diabetes. More specifically, for Type 1 diabetics (T1Ds), it can lead to an effective artificial pancreas (i.e., an automatic control system that delivers exogenous insulin) under extreme changes in critical disturbances. These disturbances include food consumption, activity variations, and physiological stress changes. Thus, this paper presents a free-living, outpatient, multiple-input, modeling method for BGC with strong causation attributes that is stable and guards against overfitting to provide an effective modeling approach for feedforward control (FFC). This approach is a Wiener block-oriented methodology, which has unique attributes for meeting critical requirements for effective, long-term, FFC.
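A Wiener block-oriented structure means a linear dynamic block followed by a static nonlinearity. A deliberately simple sketch: a first-order lag driven by carbohydrate input feeding a saturating output map. Every coefficient here is an illustrative assumption, not an identified patient model:

```python
def wiener_response(u, a, b, poly):
    """Wiener structure: linear first-order dynamic block followed by a
    static nonlinearity.  x[t] = a*x[t-1] + b*u[t];  y[t] = poly(x[t])."""
    x, y = 0.0, []
    for ut in u:
        x = a * x + b * ut      # linear dynamics: lagged internal state
        y.append(poly(x))       # static nonlinearity on the state
    return y

# Toy glucose sketch: a carbohydrate bolus drives a lagged internal state
# whose effect on BGC saturates (hypothetical units and coefficients)
carbs = [0, 50, 0, 0, 0]                      # grams per sampling interval
nonlin = lambda x: 100 + 40 * x / (20 + x)    # saturating rise over baseline
bgc = wiener_response(carbs, a=0.6, b=0.2, poly=nonlin)
```

The separation into a linear dynamic part and a static nonlinear part is what makes this block-oriented family attractive for feedforward control: the inverse of the static map can be applied first, leaving a linear compensation problem.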

  11. Towards patient specific thermal modelling of the prostate

    International Nuclear Information System (INIS)

    Berg, Cornelis A T van den; Kamer, Jeroen B van de; Leeuw, Astrid A C de; Jeukens, Cecile R L P N; Raaymakers, Bas W; Vulpen, Marco van; Lagendijk, Jan J W

    2006-01-01

    The application of thermal modelling for hyperthermia and thermal ablation is severely hampered by lack of information about perfusion and vasculature. However, recently, with the advent of sophisticated angiography and dynamic contrast enhanced (DCE) imaging techniques, it has become possible to image small vessels and blood perfusion bringing the ultimate goal of patient specific thermal modelling closer within reach. In this study dynamic contrast enhanced multi-slice CT imaging techniques are employed to investigate the feasibility of this concept for regional hyperthermia treatment of the prostate. The results are retrospectively compared with clinical thermometry data of a patient group from an earlier trial. Furthermore, the role of the prostate vasculature in the establishment of the prostate temperature distribution is studied. Quantitative 3D perfusion maps of the prostate were constructed for five patients using a distributed-parameter tracer kinetics model to analyse dynamic CT data. CT angiography was applied to construct a discrete vessel model of the pelvis. Additionally, a discrete vessel model of the prostate vasculature was constructed of a prostate taken from a human corpse. Three thermal modelling schemes with increasing inclusion of the patient specific physiological information were used to simulate the temperature distribution of the prostate during regional hyperthermia. Prostate perfusion was found to be heterogeneous and T3 prostate carcinomas are often characterized by a strongly elevated tumour perfusion (up to 70-80 ml 100 g⁻¹ min⁻¹). This elevated tumour perfusion leads to 1-2 °C lower tumour temperatures than thermal simulations based on a homogeneous prostate perfusion. Furthermore, the comparison has shown that the simulations with the measured perfusion maps result in consistently lower prostate temperatures than clinically achieved.
The simulations with the discrete vessel model indicate that significant pre-heating takes
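The reported 1-2 °C effect of elevated perfusion is consistent with a back-of-the-envelope Pennes-type energy balance in which conduction is neglected: at steady state the absorbed power density is carried away by perfusion, so the temperature rise is power density divided by perfusion times blood heat capacity. The power density and tissue constants below are assumed, illustrative values:

```python
def perfusion_si(w_ml_per_100g_min, rho_tissue=1000.0, rho_blood=1050.0):
    """Convert perfusion from ml/(100 g min) to kg blood / (m^3 tissue * s)."""
    vol_per_100g = 0.1 / rho_tissue                      # m^3 of 100 g tissue
    vol_rate = w_ml_per_100g_min * 1e-6 / vol_per_100g / 60.0  # 1/s
    return vol_rate * rho_blood

def steady_temp_rise(q, w_si, c_blood=3600.0):
    """Pennes balance with conduction neglected:
    q = w * c_b * dT  =>  dT = q / (w * c_b)   [K]."""
    return q / (w_si * c_blood)

q = 10_000.0  # assumed absorbed power density, W/m^3
dt_baseline = steady_temp_rise(q, perfusion_si(10.0))  # baseline perfusion
dt_tumour = steady_temp_rise(q, perfusion_si(70.0))    # elevated tumour perfusion
```

With these illustrative numbers, the seven-fold perfusion elevation drops the steady temperature rise by roughly 1.4 K, in line with the 1-2 °C difference reported for the tumour simulations.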

  12. A novel approach to pipeline tensioner modeling

    Energy Technology Data Exchange (ETDEWEB)

    O'Grady, Robert; Ilie, Daniel; Lane, Michael [MCS Software Division, Galway (Ireland)]

    2009-07-01

    As subsea pipeline developments continue to move into deep and ultra-deep water locations, there is an increasing need for the accurate prediction of expected pipeline fatigue life. A significant factor that must be considered as part of this process is the fatigue damage sustained by the pipeline during installation. The magnitude of this installation-related damage is governed by a number of different agents, one of which is the dynamic behavior of the tensioner systems during pipe-laying operations. There are a variety of traditional finite element methods for representing dynamic tensioner behavior. These existing methods, while basic in nature, have been proven to provide adequate forecasts in terms of the dynamic variation in typical installation parameters such as top tension and sagbend/overbend strain. However due to the simplicity of these current approaches, some of them tend to over-estimate the frequency of tensioner pay out/in under dynamic loading. This excessive level of pay out/in motion results in the prediction of additional stress cycles at certain roller beds, which in turn leads to the prediction of unrealistic fatigue damage to the pipeline. This unwarranted fatigue damage then equates to an over-conservative value for the accumulated damage experienced by a pipeline weld during installation, and so leads to a reduction in the estimated fatigue life for the pipeline. This paper describes a novel approach to tensioner modeling which allows for greater control over the velocity of dynamic tensioner pay out/in and so provides a more accurate estimation of fatigue damage experienced by the pipeline during installation. The paper reports on a case study, as outlined in the proceeding section, in which a comparison is made between results from this new tensioner model and from a more conventional approach. The comparison considers typical installation parameters as well as an in-depth look at the predicted fatigue damage for the two methods.

  13. An approach to analyse the specific impact of rapamycin on mRNA-ribosome association

    Directory of Open Access Journals (Sweden)

    Jaquier-Gubler Pascale

    2008-08-01

    Full Text Available Abstract Background Recent work, using both cell culture model systems and tumour derived cell lines, suggests that the differential recruitment into polysomes of mRNA populations may be sufficient to initiate and maintain tumour formation. Consequently, a major effort is underway to use high density microarray profiles to establish molecular fingerprints for cells exposed to defined drug regimes. The aim of these pharmacogenomic approaches is to provide new information on how drugs can impact on the translational read-out within a defined cellular background. Methods We describe an approach that permits the analysis of de-novo mRNA-ribosome association in-vivo during short drug exposures. It combines hypertonic shock, polysome fractionation and high-throughput analysis to provide a molecular phenotype of translationally responsive transcripts. Compared to previous translational profiling studies, the procedure offers increased specificity due to the elimination of the drug's secondary effects (e.g. on the transcriptional read-out). For this pilot "proof-of-principle" assay we selected the drug rapamycin because of its extensively studied impact on translation initiation. Results High throughput analysis on both the light and heavy polysomal fractions has identified mRNAs whose re-recruitment onto free ribosomes responded to short exposure to the drug rapamycin. The results of the microarray have been confirmed using real-time RT-PCR. The selective down-regulation of TOP transcripts is also consistent with previous translational profiling studies using this drug. Conclusion The technical advance outlined in this manuscript offers the possibility of new insights into mRNA features that impact on translation initiation and provides a molecular fingerprint for transcript-ribosome association in any cell type and in the presence of a range of drugs of interest.
Such molecular phenotypes defined pre-clinically may ultimately impact on the evaluation of

  14. A Variational Approach to the Modeling of MIMO Systems

    Directory of Open Access Journals (Sweden)

    Jraifi A

    2007-01-01

    Full Text Available Motivated by the study of the optimization of the quality of service for multiple input multiple output (MIMO) systems in 3G (third generation), we develop a method for modeling the MIMO channel. This method, which uses a statistical approach, is based on a variational form of the usual channel equation, expressed in terms of a scalar variable. The minimum distance between received vectors is used as the random variable to model the MIMO channel. This variable is of crucial importance for the performance of the transmission system, as it captures the degree of interference between neighboring vectors. We then use this approach to compute numerically the total probability of error with respect to the signal-to-noise ratio (SNR) and to predict the number of antennas. By fixing the SNR to a specific value, we extract information on the optimal number of MIMO antennas.

  15. INDIVIDUAL BASED MODELLING APPROACH TO THERMAL ...

    Science.gov (United States)

    Diadromous fish populations in the Pacific Northwest face challenges along their migratory routes from declining habitat quality, harvest, and barriers to longitudinal connectivity. Changes in river temperature regimes are producing an additional challenge for upstream migrating adult salmon and steelhead, species that are sensitive to absolute and cumulative thermal exposure. Adult salmon populations have been shown to utilize cold water patches along migration routes when mainstem river temperatures exceed thermal optimums. We are employing an individual based model (IBM) to explore the costs and benefits of spatially-distributed cold water refugia for adult migrating salmon. Our model, developed in the HexSim platform, is built around a mechanistic behavioral decision tree that drives individual interactions with their spatially explicit simulated environment. Population-scale responses to dynamic thermal regimes, coupled with other stressors such as disease and harvest, become emergent properties of the spatial IBM. Other model outputs include arrival times, species-specific survival rates, body energetic content, and reproductive fitness levels. Here, we discuss the challenges associated with parameterizing an individual based model of salmon and steelhead in a section of the Columbia River. Many rivers and streams in the Pacific Northwest are currently listed as impaired under the Clean Water Act as a result of high summer water temperatures. Adverse effec

  16. Psychological approaches in the treatment of specific phobias: A meta-analysis

    NARCIS (Netherlands)

    Wolitzky-Taylor, K.B.; Horowitz, J.D.; Powers, M.B.; Telch, M.J.

    2008-01-01

    Data from 33 randomized treatment studies were subjected to a meta-analysis to address questions surrounding the efficacy of psychological approaches in the treatment of specific phobia. As expected, exposure-based treatment produced large effect sizes relative to no treatment. They also

  17. A Discipline-Specific Approach to the History of U.S. Science Education

    Science.gov (United States)

    Otero, Valerie K.; Meltzer, David E.

    2017-01-01

    Although much has been said and written about the value of using the history of science in teaching science, relatively little is available to guide educators in the various science disciplines through the educational history of their own discipline. Through a discipline-specific approach to a course on the history of science education in the…

  18. A bottleneck model of set-specific capture.

    Directory of Open Access Journals (Sweden)

    Katherine Sledge Moore

    Full Text Available Set-specific contingent attentional capture is a particularly strong form of capture that occurs when multiple attentional sets guide visual search (e.g., "search for green letters" and "search for orange letters"). In this type of capture, a potential target that matches one attentional set (e.g., a green stimulus) impairs the ability to identify a temporally proximal target that matches another attentional set (e.g., an orange stimulus). In the present study, we investigated whether set-specific capture stems from a bottleneck in working memory or from a depletion of limited resources that are distributed across multiple attentional sets. In each trial, participants searched a rapid serial visual presentation (RSVP) stream for up to three target letters (T1-T3) that could appear in any of three target colors (orange, green, or lavender). The most revealing findings came from trials in which T1 and T2 matched different attentional sets and were both identified. In these trials, T3 accuracy was lower when it did not match T1's set than when it did match, but only when participants failed to identify T2. These findings support a bottleneck model of set-specific capture in which a limited-capacity mechanism in working memory enhances only one attentional set at a time, rather than a resource model in which processing capacity is simultaneously distributed across multiple attentional sets.
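The bottleneck account lends itself to a small simulation: only one attentional set is enhanced at a time, and the enhanced set updates only when a target is identified. With hypothetical identification probabilities, the simulation reproduces the key signature, lower T3 accuracy for targets that do not match T1's set on trials where T2 was missed:

```python
import random

def simulate(trials=20_000, seed=1):
    """Toy bottleneck model of set-specific capture. T1 (set A) is identified,
    so set A is enhanced. T2 belongs to set B; the enhanced set would switch
    to B only if T2 were identified. We keep T2-missed trials and compare T3
    accuracy for T3 in set A (still enhanced) vs set B (not enhanced).
    All probabilities are illustrative assumptions."""
    p_enhanced, p_other = 0.9, 0.4   # identification probability by set status
    rng = random.Random(seed)
    hits = {"same_as_T1": [0, 0], "other_set": [0, 0]}  # [identified, total]
    for _ in range(trials):
        if rng.random() < p_other:   # T2 identified: not the condition of interest
            continue
        for key, p in (("same_as_T1", p_enhanced), ("other_set", p_other)):
            hits[key][1] += 1
            if rng.random() < p:
                hits[key][0] += 1
    return {k: c / n for k, (c, n) in hits.items()}

acc = simulate()
```

A resource model, by contrast, would distribute capacity across both sets simultaneously and predict no such asymmetry conditioned on missing T2.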

  19. Evaluation of mesh morphing and mapping techniques in patient specific modeling of the human pelvis.

    Science.gov (United States)

    Salo, Zoryana; Beek, Maarten; Whyne, Cari Marisa

    2013-01-01

    Robust generation of pelvic finite element models is necessary to understand the variation in mechanical behaviour resulting from differences in gender, aging, disease and injury. The objective of this study was to apply and evaluate mesh morphing and mapping techniques to facilitate the creation and structural analysis of specimen-specific finite element (FE) models of the pelvis. A specimen-specific pelvic FE model (source mesh) was generated following a traditional user-intensive meshing scheme. The source mesh was morphed onto a computed tomography scan generated target surface of a second pelvis using a landmark-based approach, in which exterior source nodes were shifted to target surface vertices, while constrained along a normal. A second copy of the morphed model was further refined through mesh mapping, in which surface nodes of the initial morphed model were selected in patches and remapped onto the surfaces of the target model. Computed tomography intensity based material properties were assigned to each model. The source, target, morphed and mapped models were analyzed under axial compression using linear static FE analysis and their strain distributions evaluated. Morphing and mapping techniques were effectively applied to generate good-quality, geometrically complex specimen-specific pelvic FE models. Mapping significantly improved strain concurrence with the target pelvis FE model. Copyright © 2012 John Wiley & Sons, Ltd.
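The morphing step, shifting exterior source nodes toward the target surface while constrained along a normal, can be sketched for a single node as follows. This is pure-Python toy geometry under assumed conventions (unit normal, candidate target vertices supplied directly), not the study's actual implementation:

```python
def morph_node(p, n, target_vertices):
    """Shift source node p onto the target surface, constrained along its unit
    normal n: pick the target vertex with the smallest off-normal offset, then
    move the node by that vertex's signed along-normal distance."""
    def sub(a, b): return tuple(x - y for x, y in zip(a, b))
    def dot(a, b): return sum(x * y for x, y in zip(a, b))

    def off_normal_sq(v):
        d = sub(v, p)
        along = dot(d, n)
        # squared length of the offset component perpendicular to the normal
        return sum((c - along * nc) ** 2 for c, nc in zip(d, n))

    best = min(target_vertices, key=off_normal_sq)
    shift = dot(sub(best, p), n)           # signed distance along the normal
    return tuple(x + shift * nc for x, nc in zip(p, n))

new_p = morph_node((0.0, 0.0, 0.0), (0.0, 0.0, 1.0),
                   [(0.1, 0.0, 2.0), (5.0, 5.0, 0.0)])
```

Constraining the shift to the normal direction is what preserves the source mesh's element quality during morphing; the subsequent patch-wise mapping step then refines the residual surface mismatch.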

  20. Evaluation of mesh morphing and mapping techniques in patient specific modelling of the human pelvis.

    Science.gov (United States)

    Salo, Zoryana; Beek, Maarten; Whyne, Cari Marisa

    2012-08-01

    Robust generation of pelvic finite element models is necessary to understand variation in mechanical behaviour resulting from differences in gender, aging, disease and injury. The objective of this study was to apply and evaluate mesh morphing and mapping techniques to facilitate the creation and structural analysis of specimen-specific finite element (FE) models of the pelvis. A specimen-specific pelvic FE model (source mesh) was generated following a traditional user-intensive meshing scheme. The source mesh was morphed onto a computed tomography scan generated target surface of a second pelvis using a landmark-based approach, in which exterior source nodes were shifted to target surface vertices, while constrained along a normal. A second copy of the morphed model was further refined through mesh mapping, in which surface nodes of the initial morphed model were selected in patches and remapped onto the surfaces of the target model. Computed tomography intensity-based material properties were assigned to each model. The source, target, morphed and mapped models were analyzed under axial compression using linear static FE analysis, and their strain distributions were evaluated. Morphing and mapping techniques were effectively applied to generate good quality and geometrically complex specimen-specific pelvic FE models. Mapping significantly improved strain concurrence with the target pelvis FE model. Copyright © 2012 John Wiley & Sons, Ltd.

  1. Specification of advanced safety modeling requirements (Rev. 0)

    International Nuclear Information System (INIS)

    Fanning, T. H.; Tautges, T. J.

    2008-01-01

    The U.S. Department of Energy's Global Nuclear Energy Partnership has led to renewed interest in liquid-metal-cooled fast reactors for the purpose of closing the nuclear fuel cycle and making more efficient use of future repository capacity. However, the U.S. has not designed or constructed a fast reactor in nearly 30 years. Accurate, high-fidelity, whole-plant dynamics safety simulations will play a crucial role by providing confidence that component and system designs will satisfy established design limits and safety margins under a wide variety of operational, design basis, and beyond design basis transient conditions. Current modeling capabilities for fast reactor safety analyses have resulted from several hundred person-years of code development effort supported by experimental validation. The broad spectrum of mechanistic and phenomenological models that have been developed represent an enormous amount of institutional knowledge that needs to be maintained. Complicating this, the existing code architectures for safety modeling evolved from programming practices of the 1970s. This has led to monolithic applications with interdependent data models which require significant knowledge of the complexities of the entire code in order for each component to be maintained. In order to develop an advanced fast reactor safety modeling capability, the limitations of the existing code architecture must be overcome while preserving the capabilities that already exist. To accomplish this, a set of advanced safety modeling requirements is defined, based on modern programming practices, that focuses on modular development within a flexible coupling framework. An approach for integrating the existing capabilities of the SAS4A/SASSYS-1 fast reactor safety analysis code into the SHARP framework is provided in order to preserve existing capabilities while providing a smooth transition to advanced modeling capabilities.
In doing this, the advanced fast reactor safety models will

  2. Mathematical Formulation Requirements and Specifications for the Process Models

    Energy Technology Data Exchange (ETDEWEB)

    Steefel, C.; Moulton, D.; Pau, G.; Lipnikov, K.; Meza, J.; Lichtner, P.; Wolery, T.; Bacon, D.; Spycher, N.; Bell, J.; Moridis, G.; Yabusaki, S.; Sonnenthal, E.; Zyvoloski, G.; Andre, B.; Zheng, L.; Davis, J.

    2010-11-01

    The Advanced Simulation Capability for Environmental Management (ASCEM) is intended to be a state-of-the-art scientific tool and approach for understanding and predicting contaminant fate and transport in natural and engineered systems. The ASCEM program is aimed at addressing critical EM program needs to better understand and quantify flow and contaminant transport behavior in complex geological systems. It will also address the long-term performance of engineered components including cementitious materials in nuclear waste disposal facilities, in order to reduce uncertainties and risks associated with DOE EM's environmental cleanup and closure activities. Building upon national capabilities developed from decades of Research and Development in subsurface geosciences, computational and computer science, modeling and applied mathematics, and environmental remediation, the ASCEM initiative will develop an integrated, open-source, high-performance computer modeling system for multiphase, multicomponent, multiscale subsurface flow and contaminant transport. This integrated modeling system will incorporate capabilities for predicting releases from various waste forms, identifying exposure pathways and performing dose calculations, and conducting systematic uncertainty quantification. The ASCEM approach will be demonstrated on selected sites, and then applied to support the next generation of performance assessments of nuclear waste disposal and facility decommissioning across the EM complex. The Multi-Process High Performance Computing (HPC) Simulator is one of three thrust areas in ASCEM. The other two are the Platform and Integrated Toolsets (dubbed the Platform) and Site Applications. The primary objective of the HPC Simulator is to provide a flexible and extensible computational engine to simulate the coupled processes and flow scenarios described by the conceptual models developed using the ASCEM Platform. The graded and iterative approach to assessments

  3. PVUSA model technical specification for a turnkey photovoltaic power system

    Energy Technology Data Exchange (ETDEWEB)

    Dows, R.N.; Gough, E.J.

    1995-11-01

    One of the five objectives of PVUSA is to offer U.S. utilities hands-on experience in designing, procuring, and operating PV systems. The procurement process included the development of a detailed set of technical requirements for a PV system. PVUSA embodied its requirements in a technical specification used as an attachment to its contracts for four utility-scale PV systems in the 200 kW to 500 kW range. The technical specification has also been adapted and used by several utilities. The PVUSA Technical Specification has now been updated and is presented here as a Model Technical Specification (MTS) for utility use. The MTS text is also furnished on a computer disk in Microsoft Word 6.0 so that it may be conveniently adapted by each user. The text includes guidance in the form of comments and by the use of parentheses to indicate where technical information must be developed and inserted. Commercial terms and conditions will reflect the procurement practice of the buyer. The reader is referred to PG&E Report Number 95-3090000. 1, PVUSA Procurement, Acceptance and Rating Practices for Photovoltaic Power Plants (1995) for PVUSA experience and practice. The MTS is regarded by PVUSA as a use-proven document, but needs to be adapted with care and attention to detail.

  4. An analysis of single amino acid repeats as use case for application specific background models

    Directory of Open Access Journals (Sweden)

    Sykacek Peter

    2011-05-01

    Full Text Available Abstract Background Sequence analysis aims to identify biologically relevant signals against a backdrop of functionally meaningless variation. Increasingly, it is recognized that the quality of the background model directly affects the performance of analyses. State-of-the-art approaches rely on classical sequence models that are adapted to the studied dataset. Although performing well in the analysis of globular protein domains, these models break down in regions of stronger compositional bias or low complexity. While these regions are typically filtered, there is increasing anecdotal evidence of functional roles. This motivates an exploration of more complex sequence models and application-specific approaches for the investigation of biased regions. Results Traditional Markov-chains and application-specific regression models are compared using the example of predicting runs of single amino acids, a particularly simple class of biased regions. Cross-fold validation experiments reveal that the alternative regression models capture the multi-variate trends well, despite their low dimensionality and in contrast even to higher-order Markov-predictors. We show how the significance of unusual observations can be computed for such empirical models. The power of a dedicated model in the detection of biologically interesting signals is then demonstrated in an analysis identifying the unexpected enrichment of contiguous leucine-repeats in signal-peptides. Considering different reference sets, we show how the question examined actually defines what constitutes the 'background'. Results can thus be highly sensitive to the choice of appropriate model training sets. Conversely, the choice of reference data determines the questions that can be investigated in an analysis. Conclusions Using a specific case of studying biased regions as an example, we have demonstrated that the construction of application-specific background models is both necessary and
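    The idea of a background model assigning probability to residue runs can be illustrated with a minimal first-order Markov sketch (all parameter values below are illustrative assumptions, not the paper's fitted models):

```python
# Illustrative sketch only: probability of a run of at least k identical
# residues starting at a given position under a first-order Markov
# background, with residue frequency p_start and self-transition
# probability p_self. Numbers are assumptions, not from the paper.

def run_probability(k, p_start, p_self):
    """P(run of length >= k) beginning at a fixed position."""
    return p_start * p_self ** (k - 1)

# A background with stronger compositional bias (higher self-transition)
# makes long runs far less surprising than an i.i.d.-style model does,
# which is why the choice of background changes what counts as "unusual".
p_iid = run_probability(8, 0.1, 0.1)
p_biased = run_probability(8, 0.1, 0.4)
print(p_iid < p_biased)  # → True
```

    The same comparison underlies the abstract's point: significance of an observed repeat depends entirely on which background model generates the expectation.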

  5. A Subject-Specific Kinematic Model to Predict Human Motion in Exoskeleton-Assisted Gait.

    Science.gov (United States)

    Torricelli, Diego; Cortés, Camilo; Lete, Nerea; Bertelsen, Álvaro; Gonzalez-Vargas, Jose E; Del-Ama, Antonio J; Dimbwadyo, Iris; Moreno, Juan C; Florez, Julian; Pons, Jose L

    2018-01-01

    The relative motion between human and exoskeleton is a crucial factor that has remarkable consequences on the efficiency, reliability and safety of human-robot interaction. Unfortunately, its quantitative assessment has been largely overlooked in the literature. Here, we present a methodology that allows predicting the motion of the human joints from the knowledge of the angular motion of the exoskeleton frame. Our method combines a subject-specific skeletal model with a kinematic model of a lower limb exoskeleton (H2, Technaid), imposing specific kinematic constraints between them. To calibrate the model and validate its ability to predict the relative motion in a subject-specific way, we performed experiments on seven healthy subjects during treadmill walking tasks. We demonstrate a prediction accuracy lower than 3.5° globally, and around 1.5° at the hip level, which represents an improvement of up to 66% compared to the traditional approach assuming no relative motion between the user and the exoskeleton.
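    (No insert body; see replace below.)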

  6. Internal Models Support Specific Gaits in Orthotic Devices

    DEFF Research Database (Denmark)

    Matthias Braun, Jan; Wörgötter, Florentin; Manoonpong, Poramate

    2014-01-01

    Patients use orthoses and prostheses for the lower limbs to support and enable movements they cannot perform themselves, or can perform only with difficulty. Because traditional devices support only a limited set of movements, patients are restricted in their mobility. A possible approach to overcome such...... the system's accuracy and robustness on a Knee-Ankle-Foot-Orthosis, introducing behaviour changes depending on the patient's current walking situation. We conclude that the model-based support of different gaits presented here has the power to enhance the patient's mobility....

  7. A Bayesian Model of Category-Specific Emotional Brain Responses

    Science.gov (United States)

    Wager, Tor D.; Kang, Jian; Johnson, Timothy D.; Nichols, Thomas E.; Satpute, Ajay B.; Barrett, Lisa Feldman

    2015-01-01

    Understanding emotion is critical for a science of healthy and disordered brain function, but the neurophysiological basis of emotional experience is still poorly understood. We analyzed human brain activity patterns from 148 studies of emotion categories (2159 total participants) using a novel hierarchical Bayesian model. The model allowed us to classify which of five categories—fear, anger, disgust, sadness, or happiness—is engaged by a study with 66% accuracy (43-86% across categories). Analyses of the activity patterns encoded in the model revealed that each emotion category is associated with unique, prototypical patterns of activity across multiple brain systems including the cortex, thalamus, amygdala, and other structures. The results indicate that emotion categories are not contained within any one region or system, but are represented as configurations across multiple brain networks. The model provides a precise summary of the prototypical patterns for each emotion category, and demonstrates that a sufficient characterization of emotion categories relies on (a) differential patterns of involvement in neocortical systems that differ between humans and other species, and (b) distinctive patterns of cortical-subcortical interactions. Thus, these findings are incompatible with several contemporary theories of emotion, including those that emphasize emotion-dedicated brain systems and those that propose emotion is localized primarily in subcortical activity. They are consistent with componential and constructionist views, which propose that emotions are differentiated by a combination of perceptual, mnemonic, prospective, and motivational elements. Such brain-based models of emotion provide a foundation for new translational and clinical approaches. PMID:25853490

  8. Coupling of EIT with computational lung modeling for predicting patient-specific ventilatory responses.

    Science.gov (United States)

    Roth, Christian J; Becher, Tobias; Frerichs, Inéz; Weiler, Norbert; Wall, Wolfgang A

    2017-04-01

    Providing optimal personalized mechanical ventilation for patients with acute or chronic respiratory failure is still a challenge within a clinical setting for each case anew. In this article, we integrate electrical impedance tomography (EIT) monitoring into a powerful patient-specific computational lung model to create an approach for personalizing protective ventilatory treatment. The underlying computational lung model is based on a single computed tomography scan and able to predict global airflow quantities, as well as local tissue aeration and strains for any ventilation maneuver. For validation, a novel "virtual EIT" module is added to our computational lung model, allowing to simulate EIT images based on the patient's thorax geometry and the results of our numerically predicted tissue aeration. Clinically measured EIT images are not used to calibrate the computational model. Thus they provide an independent method to validate the computational predictions at high temporal resolution. The performance of this coupling approach has been tested in an example patient with acute respiratory distress syndrome. The method shows good agreement between computationally predicted and clinically measured airflow data and EIT images. These results imply that the proposed framework can be used for numerical prediction of patient-specific responses to certain therapeutic measures before applying them to an actual patient. In the long run, definition of patient-specific optimal ventilation protocols might be assisted by computational modeling. NEW & NOTEWORTHY In this work, we present a patient-specific computational lung model that is able to predict global and local ventilatory quantities for a given patient and any selected ventilation protocol. For the first time, such a predictive lung model is equipped with a virtual electrical impedance tomography module allowing real-time validation of the computed results with the patient measurements. First promising results

  9. A Combined Experimental and Computational Approach to Subject-Specific Analysis of Knee Joint Laxity

    Science.gov (United States)

    Harris, Michael D.; Cyr, Adam J.; Ali, Azhar A.; Fitzpatrick, Clare K.; Rullkoetter, Paul J.; Maletsky, Lorin P.; Shelburne, Kevin B.

    2016-01-01

    Modeling complex knee biomechanics is a continual challenge, which has resulted in many models of varying levels of quality, complexity, and validation. Beyond modeling healthy knees, accurately mimicking pathologic knee mechanics, such as after cruciate rupture or meniscectomy, is difficult. Experimental tests of knee laxity can provide important information about ligament engagement and overall contributions to knee stability for development of subject-specific models to accurately simulate knee motion and loading. Our objective was to provide combined experimental tests and finite-element (FE) models of natural knee laxity that are subject-specific, have one-to-one experiment to model calibration, simulate ligament engagement in agreement with literature, and are adaptable for a variety of biomechanical investigations (e.g., cartilage contact, ligament strain, in vivo kinematics). Calibration involved perturbing ligament stiffness, initial ligament strain, and attachment location until model-predicted kinematics and ligament engagement matched experimental reports. Errors between model-predicted and experimental kinematics were low, and ligament engagement agreed with literature descriptions. These results demonstrate the ability of our constraint models to be customized for multiple individuals and simultaneously call attention to the need to verify that ligament engagement is in good general agreement with literature. To facilitate further investigations of subject-specific or population based knee joint biomechanics, data collected during the experimental and modeling phases of this study are available for download by the research community. PMID:27306137
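    The calibration step described here — perturbing a model parameter until predicted kinematics match an experimental target — can be sketched with a toy one-parameter example (the ligament model and all values are hypothetical, not the study's FE pipeline):

```python
# Hypothetical sketch of the calibration idea: adjust a single
# "ligament stiffness" parameter until the model-predicted laxity
# matches an experimentally measured target. The linear ligament
# model and all numbers are illustrative assumptions.

def predicted_laxity(stiffness, load=10.0):
    return load / stiffness  # toy model: laxity falls as stiffness rises

def calibrate(target, lo=1.0, hi=100.0, tol=1e-6):
    """Bisection on stiffness so predicted laxity matches the target."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if predicted_laxity(mid) > target:
            lo = mid          # still too lax -> stiffen
        else:
            hi = mid
    return 0.5 * (lo + hi)

k = calibrate(target=2.0)               # experimental laxity target
print(round(predicted_laxity(k), 3))    # ~2.0
```

    Real calibrations vary many parameters at once (stiffness, initial strain, attachment sites), but the loop structure — perturb, simulate, compare to experiment — is the same.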

  10. Approaches and models of intercultural education

    Directory of Open Access Journals (Sweden)

    Iván Manuel Sánchez Fontalvo

    2013-10-01

    Full Text Available Building an intercultural society requires awareness in all social spheres, and education plays a central role among them. Its role is transcendental, since education must create spaces that form people with the virtues and capacities to live together in multicultural and socially diverse (sometimes unequal) contexts in an increasingly globalized and interconnected world. It must also foster shared feelings of civic belonging to the neighbourhood, city, region and country, together with concern and critical judgement towards marginalization, poverty, misery and the inequitable distribution of wealth, which are causes of structural violence, and the will to work for the welfare and transformation of these scenarios. On these premises, it is important to know the approaches and models of intercultural education developed so far, analysing their impact on the educational contexts where they are applied.

  11. Transport modeling: An artificial immune system approach

    Directory of Open Access Journals (Sweden)

    Teodorović Dušan

    2006-01-01

    Full Text Available This paper describes an artificial immune system (AIS) approach to modeling time-dependent (dynamic, real-time) transportation phenomena characterized by uncertainty. The basic idea behind this research is to develop an Artificial Immune System that generates a set of antibodies (decisions, control actions) which together can successfully cover a wide range of potential situations. The proposed artificial immune system develops antibodies (the best control strategies) for different antigens (different traffic "scenarios"). This task is performed using optimization or heuristic techniques. A set of antibodies is then combined to create the Artificial Immune System. The developed artificial immune transportation systems are able to generalize, adapt, and learn based on new knowledge and new information. Applications of the systems are considered for airline yield management, stochastic vehicle routing, and real-time traffic control at an isolated intersection. The preliminary research results are very promising.
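    The antibody/antigen matching idea can be illustrated with a minimal clonal-selection toy (one-dimensional "control actions", invented parameters; not the authors' system):

```python
import random

# Toy clonal-selection sketch of the AIS idea: antibodies (candidate
# control actions, here plain real numbers) are iteratively cloned and
# mutated so the population covers an antigen (a traffic "scenario",
# here a single target value). Entirely illustrative.

def affinity(antibody, antigen):
    return -abs(antibody - antigen)  # higher is better

def clonal_selection(antigen, pop_size=20, generations=50, seed=0):
    rng = random.Random(seed)
    pop = [rng.uniform(0, 100) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ab: affinity(ab, antigen), reverse=True)
        elite = pop[: pop_size // 4]          # keep the best antibodies
        # clone each elite antibody with small mutations
        clones = [ab + rng.gauss(0, 1.0) for ab in elite for _ in range(3)]
        # a little fresh diversity, as AIS variants typically inject
        pop = elite + clones + [rng.uniform(0, 100) for _ in range(2)]
    return max(pop, key=lambda ab: affinity(ab, antigen))

best = clonal_selection(antigen=42.0)
print(best)
```

    A full AIS would maintain many antibodies against many antigens so that, at run time, the best-matching stored strategy is recalled for the current scenario.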

  12. System approach to modeling of industrial technologies

    Science.gov (United States)

    Toropov, V. S.; Toropov, E. S.

    2018-03-01

    The authors present a system of methods for modeling and improving industrial technologies. The system consists of an information part and a software part. The information part is structured information about industrial technologies, organized according to a template. The template contains several essential categories used to improve the technological process and eliminate weaknesses in the process chain; the base category is the physical effect that takes place as the technological process proceeds. The software part of the system can apply various methods of creative search to the content stored in the information part, paying particular attention to energy transformations in the technological process. Applying the system will allow a systematic approach to improving technologies and obtaining new technical solutions.

  13. On specification of initial conditions in turbulence models

    Energy Technology Data Exchange (ETDEWEB)

    Rollin, Bertrand [Los Alamos National Laboratory; Andrews, Malcolm J [Los Alamos National Laboratory

    2010-12-01

    Recent research has shown that initial conditions have a significant influence on the evolution of a flow towards turbulence. This important finding offers a unique opportunity for turbulence control, but also raises the question of how to properly specify initial conditions in turbulence models. We study this problem in the context of the Rayleigh-Taylor instability. The Rayleigh-Taylor instability is an interfacial fluid instability that leads to turbulence and turbulent mixing. It occurs when a light fluid is accelerated into a heavy fluid because of misalignment between density and pressure gradients. The Rayleigh-Taylor instability plays a key role in a wide variety of natural and man-made flows ranging from supernovae to the implosion phase of Inertial Confinement Fusion (ICF). Our approach consists of providing the turbulence models with a predicted profile of their key variables at the appropriate time, in accordance with the initial conditions of the problem.

  14. Integrating Cloud-Computing-Specific Model into Aircraft Design

    Science.gov (United States)

    Zhimin, Tian; Qi, Lin; Guangwen, Yang

    Cloud computing is becoming increasingly relevant, as it will enable companies involved in spreading this technology to open the door to Web 3.0. The new categories of services it introduces will slowly replace many types of computational resources currently in use. In this perspective, grid computing, the basic element for the large-scale supply of cloud services, will play a fundamental role in defining how those services are provided. The paper attempts to integrate a cloud-computing-specific model into aircraft design. This work has achieved good results in sharing licenses of large-scale and expensive software, such as CFD (Computational Fluid Dynamics), UG, CATIA, and so on.

  15. Tumour resistance to cisplatin: a modelling approach

    International Nuclear Information System (INIS)

    Marcu, L; Bezak, E; Olver, I; Doorn, T van

    2005-01-01

    Although chemotherapy has revolutionized the treatment of haematological tumours, in many common solid tumours the success has been limited. Some of the reasons for the limitations are: the timing of drug delivery, resistance to the drug, repopulation between cycles of chemotherapy and the lack of complete understanding of the pharmacokinetics and pharmacodynamics of a specific agent. Cisplatin is among the most effective cytotoxic agents used in head and neck cancer treatments. When modelling cisplatin as a single agent, the properties of cisplatin only have to be taken into account, reducing the number of assumptions that are considered in the generalized chemotherapy models. The aim of the present paper is to model the biological effect of cisplatin and to simulate the consequence of cisplatin resistance on tumour control. The 'treated' tumour is a squamous cell carcinoma of the head and neck, previously grown by computer-based Monte Carlo techniques. The model maintained the biological constitution of a tumour through the generation of stem cells, proliferating cells and non-proliferating cells. Cell kinetic parameters (mean cell cycle time, cell loss factor, thymidine labelling index) were also consistent with the literature. A sensitivity study on the contribution of various mechanisms leading to drug resistance is undertaken. To quantify the extent of drug resistance, the cisplatin resistance factor (CRF) is defined as the ratio between the number of surviving cells of the resistant population and the number of surviving cells of the sensitive population, determined after the same treatment time. It is shown that there is a supra-linear dependence of CRF on the percentage of cisplatin-DNA adducts formed, and a sigmoid-like dependence between CRF and the percentage of cells killed in resistant tumours. Drug resistance is shown to be a cumulative process which eventually can overcome tumour regression leading to treatment failure
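    The cisplatin resistance factor defined in this abstract is a simple ratio, which can be written out directly (the survival counts below are invented for illustration):

```python
# Minimal illustration of the cisplatin resistance factor (CRF) as
# defined in the abstract: surviving cells of the resistant population
# divided by surviving cells of the sensitive population, measured
# after the same treatment time. The cell counts are made-up numbers.

def resistance_factor(surviving_resistant, surviving_sensitive):
    return surviving_resistant / surviving_sensitive

# e.g. 8e5 resistant vs 2e5 sensitive cells survive the same treatment
print(resistance_factor(8e5, 2e5))  # → 4.0
```

    The abstract's supra-linear and sigmoid dependences describe how this ratio varies with adduct formation and kill fraction; the ratio itself is the quantity being tracked.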

  16. Tumour resistance to cisplatin: a modelling approach

    Energy Technology Data Exchange (ETDEWEB)

    Marcu, L [School of Chemistry and Physics, University of Adelaide, North Terrace, SA 5000 (Australia); Bezak, E [School of Chemistry and Physics, University of Adelaide, North Terrace, SA 5000 (Australia); Olver, I [Faculty of Medicine, University of Adelaide, North Terrace, SA 5000 (Australia); Doorn, T van [School of Chemistry and Physics, University of Adelaide, North Terrace, SA 5000 (Australia)

    2005-01-07

    Although chemotherapy has revolutionized the treatment of haematological tumours, in many common solid tumours the success has been limited. Some of the reasons for the limitations are: the timing of drug delivery, resistance to the drug, repopulation between cycles of chemotherapy and the lack of complete understanding of the pharmacokinetics and pharmacodynamics of a specific agent. Cisplatin is among the most effective cytotoxic agents used in head and neck cancer treatments. When modelling cisplatin as a single agent, the properties of cisplatin only have to be taken into account, reducing the number of assumptions that are considered in the generalized chemotherapy models. The aim of the present paper is to model the biological effect of cisplatin and to simulate the consequence of cisplatin resistance on tumour control. The 'treated' tumour is a squamous cell carcinoma of the head and neck, previously grown by computer-based Monte Carlo techniques. The model maintained the biological constitution of a tumour through the generation of stem cells, proliferating cells and non-proliferating cells. Cell kinetic parameters (mean cell cycle time, cell loss factor, thymidine labelling index) were also consistent with the literature. A sensitivity study on the contribution of various mechanisms leading to drug resistance is undertaken. To quantify the extent of drug resistance, the cisplatin resistance factor (CRF) is defined as the ratio between the number of surviving cells of the resistant population and the number of surviving cells of the sensitive population, determined after the same treatment time. It is shown that there is a supra-linear dependence of CRF on the percentage of cisplatin-DNA adducts formed, and a sigmoid-like dependence between CRF and the percentage of cells killed in resistant tumours. Drug resistance is shown to be a cumulative process which eventually can overcome tumour regression leading to treatment failure.

  17. Modelling and simulating retail management practices: a first approach

    OpenAIRE

    Siebers, Peer-Olaf; Aickelin, Uwe; Celia, Helen; Clegg, Chris

    2010-01-01

    Multi-agent systems offer a new and exciting way of understanding the world of work. We apply agent-based modeling and simulation to investigate a set of problems in a retail context. Specifically, we are working to understand the relationship between people management practices on the shop-floor and retail performance. Despite the fact we are working within a relatively novel and complex domain, it is clear that using an agent-based approach offers great potential for improving organizati...

  18. A Dual-Specific Targeting Approach Based on the Simultaneous Recognition of Duplex and Quadruplex Motifs.

    Science.gov (United States)

    Nguyen, Thi Quynh Ngoc; Lim, Kah Wai; Phan, Anh Tuân

    2017-09-20

    Small-molecule ligands targeting nucleic acids have been explored as potential therapeutic agents. Duplex groove-binding ligands have been shown to recognize DNA in a sequence-specific manner. On the other hand, quadruplex-binding ligands exhibit high selectivity between quadruplex and duplex, but show limited discrimination between different quadruplex structures. Here we propose a dual-specific approach through the simultaneous application of duplex- and quadruplex-binders. We demonstrated that a quadruplex-specific ligand and a duplex-specific ligand can simultaneously interact at two separate binding sites of a quadruplex-duplex hybrid harbouring both quadruplex and duplex structural elements. Such a dual-specific targeting strategy would combine the sequence specificity of duplex-binders and the strong binding affinity of quadruplex-binders, potentially allowing the specific targeting of unique quadruplex structures. Future research can be directed towards the development of conjugated compounds targeting specific genomic quadruplex-duplex sites, for which the linker would be highly context-dependent in terms of length and flexibility, as well as the attachment points onto both ligands.

  19. Modeling a Predictive Energy Equation Specific for Maintenance Hemodialysis.

    Science.gov (United States)

    Byham-Gray, Laura D; Parrott, J Scott; Peters, Emily N; Fogerite, Susan Gould; Hand, Rosa K; Ahrens, Sean; Marcus, Andrea Fleisch; Fiutem, Justin J

    2017-03-01

    Hypermetabolism is theorized in patients diagnosed with chronic kidney disease who are receiving maintenance hemodialysis (MHD). We aimed to distinguish key disease-specific determinants of resting energy expenditure to create a predictive energy equation that more precisely establishes energy needs with the intent of preventing protein-energy wasting. For this 3-year multisite cross-sectional study (N = 116), eligible participants were diagnosed with chronic kidney disease and were receiving MHD for at least 3 months. Predictors for the model included weight, sex, age, C-reactive protein (CRP), glycosylated hemoglobin, and serum creatinine. The outcome variable was measured resting energy expenditure (mREE). Regression modeling was used to generate predictive formulas and Bland-Altman analyses to evaluate accuracy. The majority were male (60.3%), black (81.0%), and non-Hispanic (76.7%), and 23% were ≥65 years old. After screening for multicollinearity, the best predictive model of mREE (R² = 0.67) included weight, age, sex, and CRP. Two alternative models with acceptable predictability (R² = 0.66) were derived with glycosylated hemoglobin or serum creatinine. Based on Bland-Altman analyses, the maintenance hemodialysis equation that included CRP had the best precision, with the highest proportion of participants' predicted energy expenditure classified as accurate (61.2%) and with the lowest number of individuals with underestimation or overestimation. This study confirms disease-specific factors as key determinants of mREE in patients on MHD and provides a preliminary predictive energy equation. Further prospective research is necessary to test the reliability and validity of this equation across diverse populations of patients who are receiving MHD.
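    The accuracy classification used alongside the Bland-Altman evaluation can be sketched as a tolerance-band count. The ±10% band and the sample pairs below are illustrative assumptions, and the published equation's coefficients are deliberately not reproduced:

```python
# Hedged sketch: a predicted resting energy expenditure (REE) is
# counted as "accurate" when it falls within a tolerance band around
# the measured value. The 10% band and the (measured, predicted)
# pairs are invented for illustration, not the study's data.

def accuracy_rate(pairs, band=0.10):
    """Fraction of (measured, predicted) pairs within the +/-band."""
    hits = sum(1 for m, p in pairs if abs(p - m) <= band * m)
    return hits / len(pairs)

pairs = [(1500, 1480), (1600, 1900), (1400, 1350), (1700, 1710)]
print(accuracy_rate(pairs))  # → 0.75
```

    Reporting the proportion classified as accurate, as the abstract does (61.2%), complements the Bland-Altman bias and limits-of-agreement view.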

  20. Extracting business vocabularies from business process models: SBVR and BPMN standards-based approach

    Science.gov (United States)

    Skersys, Tomas; Butleris, Rimantas; Kapocius, Kestutis

    2013-10-01

    Approaches for the analysis and specification of business vocabularies and rules are highly relevant topics in both the Business Process Management and Information Systems Development disciplines. However, in common practice of Information Systems Development, business modeling activities are still mostly empirical in nature. In this paper, basic aspects of an approach for the semi-automated extraction of business vocabularies from business process models are presented. The approach is based on the novel business modeling-level OMG standards "Business Process Model and Notation" (BPMN) and "Semantics of Business Vocabulary and Business Rules" (SBVR), thus contributing to OMG's vision of Model-Driven Architecture (MDA) and to model-driven development in general.
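    The extraction idea can be illustrated with a toy pass over BPMN-style task labels. The "verb object" naming pattern and the term/fact split are assumptions for the sketch, not the paper's actual SBVR mapping rules:

```python
# Toy illustration only: harvest candidate vocabulary entries from
# BPMN-style task names, assuming the common "verb object" labeling
# convention. Terms become noun concepts; (verb, object) pairs stand
# in for SBVR-like fact types. Not the authors' algorithm.

def extract_vocabulary(task_names):
    """Split 'verb object' task labels into terms and fact types."""
    terms, fact_types = set(), set()
    for name in task_names:
        words = name.lower().split()
        if len(words) >= 2:
            verb, obj = words[0], " ".join(words[1:])
            terms.add(obj)
            fact_types.add((verb, obj))
    return terms, fact_types

tasks = ["Register order", "Check customer credit", "Ship order"]
terms, facts = extract_vocabulary(tasks)
print(sorted(terms))  # → ['customer credit', 'order']
```

    A real SBVR-targeted extractor would also draw on data objects, lanes, and gateways, but the core move — mining model element labels for vocabulary candidates — is the same.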

  1. ECOMOD - An ecological approach to radioecological modelling

    International Nuclear Information System (INIS)

    Sazykina, Tatiana G.

    2000-01-01

    A unified methodology is proposed to simulate the dynamic processes of radionuclide migration in aquatic food chains in parallel with their stable analogue elements. The distinguishing feature of the unified radioecological/ecological approach is the description of radionuclide migration along with dynamic equations for the ecosystem. The ability of the methodology to predict the results of radioecological experiments is demonstrated by an example of radionuclide (iron group) accumulation by a laboratory culture of the algae Platymonas viridis. Based on the unified methodology, the 'ECOMOD' radioecological model was developed to simulate dynamic radioecological processes in aquatic ecosystems. It comprises three basic modules, which are operated as a set of inter-related programs. The 'ECOSYSTEM' module solves non-linear ecological equations, describing the biomass dynamics of essential ecosystem components. The 'RADIONUCLIDE DISTRIBUTION' module calculates the radionuclide distribution in abiotic and biotic components of the aquatic ecosystem. The 'DOSE ASSESSMENT' module calculates doses to aquatic biota and doses to man from aquatic food chains. The application of the ECOMOD model to reconstruct the radionuclide distribution in the Chernobyl Cooling Pond ecosystem in the early period after the accident shows good agreement with observations

  2. Modelling Approach In Islamic Architectural Designs

    Directory of Open Access Journals (Sweden)

    Suhaimi Salleh

    2014-06-01

    Full Text Available Architectural designs contribute as one of the main factors to be considered in minimizing negative impacts in the planning and structural development of buildings such as mosques. In this paper, the ergonomics perspective is revisited, focusing on conditional factors involving organisational, psychological, social and population aspects as a whole. The paper highlights the functional and architectural integration of aesthetic elements in the form of decorative and ornamental outlay, as well as incorporating building structures such as walls, domes and gates. It further focuses on the mathematical aspects of the architectural designs, such as polar equations and the golden ratio. These designs are modelled into mathematical equations of various forms, while the golden ratio in the mosque is verified using two techniques, namely geometric construction and the numerical method. The exemplary designs are taken from the Sabah Bandaraya Mosque in Likas, Kota Kinabalu and the Sarawak State Mosque in Kuching, while the Universiti Malaysia Sabah Mosque is used for the golden ratio. Results show that Islamic architectural buildings and designs have long had mathematical concepts and techniques underlying their foundation; hence, a modelling approach is needed to rejuvenate these Islamic designs.
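    A generic numerical route to the golden ratio — one plausible form of the "numerical method" mentioned, though not necessarily the paper's measurement procedure — is iterating ratios of consecutive Fibonacci numbers:

```python
import math

# Generic sketch: ratios of consecutive Fibonacci numbers converge to
# the golden ratio phi = (1 + sqrt(5)) / 2. This illustrates numerical
# verification of phi in the abstract's sense, not the mosque
# measurement procedure itself.

def fibonacci_ratio(n):
    a, b = 1, 1
    for _ in range(n):
        a, b = b, a + b
    return b / a

phi = (1 + math.sqrt(5)) / 2
print(abs(fibonacci_ratio(30) - phi) < 1e-9)  # → True
```

    In an architectural study, the analogous check is whether measured length ratios (e.g. of facade elements) fall close to phi within a stated tolerance.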

  3. Formal approach to modeling of modern Information Systems

    Directory of Open Access Journals (Sweden)

    Bálint Molnár

    2016-01-01

    Full Text Available Most recently, the concept of business documents has started to play a double role. On one hand, a business document (a word-processing text or calculation sheet) can be used as a specification tool; on the other hand, the business document is an immanent constituent of business processes, and thereby an essential component of business Information Systems. The recent tendency is that the majority of documents and their contents within business Information Systems remain in semi-structured format, and a lesser part of documents is transformed into schemas of structured databases. In order to keep the emerging situation in hand, we suggest the creation of (1) a theoretical framework for modeling business Information Systems; and (2) a design method for practical application based on the theoretical model that provides the structuring principles. The modeling approach, which focuses on documents and their interrelationships with business processes, assists in perceiving the activities of modern Information Systems.

  4. A magnetospheric specification model validation study: Geosynchronous electrons

    Science.gov (United States)

    Hilmer, R. V.; Ginet, G. P.

    2000-09-01

    The Rice University Magnetospheric Specification Model (MSM) is an operational space environment model of the inner and middle magnetosphere designed to specify charged particle fluxes up to 100 keV. Validation test data taken between January 1996 and June 1998 consist of electron fluxes measured by a charge control system (CCS) on a Defense Satellite Communications System (DSCS) spacecraft. The CCS includes both electrostatic analyzers to measure the particle environment and surface potential monitors to track differential charging between various materials and vehicle ground. While typical RMS error analysis methods provide a sense of the model's overall abilities, they do not specifically address physical situations critical to operations, i.e., how well the model specifies when a high differential charging state is probable. In this validation study, differential charging states observed by DSCS are used to determine several threshold fluxes for the associated 20-50 keV electrons, and joint probability distributions are constructed to determine Hit, Miss, and False Alarm rates for the model. An MSM run covering the two-and-one-half-year interval is performed using the minimum required input parameter set, consisting of only the magnetic activity index Kp, in order to statistically examine the model's seasonal and yearly performance. In addition, the relative merits of the input parameters, i.e., Kp, Dst, the equatorward boundary of the diffuse aurora at midnight, cross-polar cap potential, solar wind density and velocity, and interplanetary magnetic field values, are evaluated as drivers of shorter model runs of 100 d each. In an effort to develop operational tools that can address spacecraft charging issues, we also identify temporal features in the model output that can be directly linked to input parameter variations and model boundary conditions. All model output is interpreted using the full three-dimensional, dipole tilt-dependent algorithms currently in
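    The Hit / Miss / False Alarm bookkeeping described here reduces to a standard 2x2 contingency count over threshold exceedances. The flux values and threshold below are invented for illustration:

```python
# Sketch of the Hit / Miss / False Alarm accounting in the abstract:
# compare model-predicted and observed exceedances of an electron-flux
# threshold, event by event. Threshold and flux values are invented.

def contingency(observed, predicted, threshold):
    hit = miss = false_alarm = correct_null = 0
    for o, p in zip(observed, predicted):
        if o >= threshold and p >= threshold:
            hit += 1                 # both exceed: model caught it
        elif o >= threshold:
            miss += 1                # event occurred, model quiet
        elif p >= threshold:
            false_alarm += 1         # model alarmed, nothing occurred
        else:
            correct_null += 1        # both quiet
    return hit, miss, false_alarm, correct_null

obs  = [5, 12, 30, 8, 25, 40]
pred = [6, 25, 10, 7, 30, 45]
print(contingency(obs, pred, threshold=20))  # → (2, 1, 1, 2)
```

    Operational skill scores (hit rate, false-alarm rate) follow directly from these four counts, which is what makes this framing more operationally relevant than a single RMS error.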

  5. EDF's approach to determine specifications for nuclear power plant bulk chemicals

    International Nuclear Information System (INIS)

    Basile, Alix; Dijoux, Michel; Le-Calvar, Marc; Gressier, Frederic; Mole, Didier

    2012-09-01

    Chemical impurities in the primary, secondary and auxiliary circuits of nuclear power plants generate risks of corrosion of the fuel cladding, steels and nickel-based alloys. The PMUC (Products and Materials Used in plants) organization established by EDF intends to limit this risk by specifying maximum levels of impurities in products and materials used for the operation and maintenance of Nuclear Power Plants (NPPs). Bulk chemicals specifications, applied to primary and secondary circuit chemicals and to hydrogen and nitrogen gases, are particularly important to prevent chemical species from being involved in the corrosion of NPP materials. The application of EDF specifications should reasonably exclude any risk of degradation of the first and second containment barriers and of auxiliary circuits Important to Safety (IPS) by limiting the concentrations of chlorides, fluorides, sulfates... The risk of metal embrittlement by elements with low melting points (mercury, lead...) is also included. For the primary circuit, the specifications intend to exclude the risk of activation of impurities introduced by the bulk chemicals. For the first containment barrier, to reduce the risk of deposits such as zeolites, PMUC product specifications set limit values for calcium, magnesium, aluminum and silica. EDF's approach to establishing specifications for bulk chemicals also takes into account industrial production capacity, as well as costs, the limitations of analytical control methods (detection limits) and environmental release issues. This paper aims to explain EDF's approach to specifications of impurities in bulk chemicals. Also presented are the various parameters taken into account to determine the maximum pollution levels in the chemicals, the theoretical hypotheses used to set the specifications and the calculation method used to verify that the specifications are suitable. (authors)

  6. Modelling and subject-specific validation of the heart-arterial tree system.

    Science.gov (United States)

    Guala, Andrea; Camporeale, Carlo; Tosello, Francesco; Canuto, Claudio; Ridolfi, Luca

    2015-01-01

    A modeling approach integrated with a novel subject-specific characterization is proposed here for the assessment of hemodynamic values of the arterial tree. A 1D model is adopted to characterize large-to-medium arteries, while the left ventricle, aortic valve and distal micro-circulation sectors are described by lumped submodels. A new velocity profile and a new formulation of the non-linear viscoelastic constitutive relation suitable for the {Q, A} modeling are also proposed. The model is first verified semi-quantitatively against literature data. A simple but effective procedure for obtaining subject-specific model characterization from non-invasive measurements is then designed. A detailed subject-specific validation against in vivo measurements from a population of six healthy young men is also performed. Several key quantities of heart dynamics (mean ejected flow, ejection fraction, and left-ventricular end-diastolic, end-systolic and stroke volumes) and the pressure waveforms (at the central, radial, brachial, femoral, and posterior tibial sites) are compared with measured data. Mean errors around 5 and 8%, obtained for the heart and arterial quantities, respectively, testify to the effectiveness of the model and its subject-specific characterization.

  7. Towards Subject-Specific Strength Training Design through Predictive Use of Musculoskeletal Models

    Directory of Open Access Journals (Sweden)

    Michael Plüss

    2018-01-01

    Full Text Available Lower extremity dysfunction is often associated with hip muscle strength deficiencies. Detailed knowledge of the muscle forces generated in the hip under specific external loading conditions enables specific structures to be trained. The aim of this study was to find the most effective movement type and loading direction to enable the training of specific parts of the hip muscles using a standing posture and a pulley system. In a novel approach to harness the predictive power of musculoskeletal modelling techniques based on inverse dynamics, flexion/extension and ab-/adduction movements were virtually created. To demonstrate the effectiveness of this approach, three hip orientations and an external loading force that was systematically rotated around the body were simulated using a state-of-the-art OpenSim model in order to establish ideal designs for training of the anterior and posterior parts of the M. gluteus medius (GM). The external force direction as well as the hip orientation greatly influenced the muscle forces in the different parts of the GM. No setting was found for simultaneous training of the anterior and posterior parts with a muscle force higher than 50% of the maximum. Importantly, this study has demonstrated the use of musculoskeletal models as an approach to predict muscle force variations for different strength and rehabilitation exercise variations.

  8. Object instance recognition using motion cues and instance specific appearance models

    Science.gov (United States)

    Schumann, Arne

    2014-03-01

    In this paper we present an object instance retrieval approach. The baseline approach consists of a pool of image features which are computed on the bounding boxes of a query object track and compared to a database of tracks in order to find additional appearances of the same object instance. We improve over this simple baseline approach in multiple ways: 1) we include motion cues to achieve improved robustness to viewpoint and rotation changes, 2) we include operator feedback to iteratively re-rank the resulting retrieval lists and 3) we use operator feedback and location constraints to train classifiers and learn an instance specific appearance model. We use these classifiers to further improve the retrieval results. The approach is evaluated on two popular public datasets for two different applications. We evaluate person re-identification on the CAVIAR shopping mall surveillance dataset and vehicle instance recognition on the VIVID aerial dataset and achieve significant improvements over our baseline results.

  9. Agribusiness model approach to territorial food development

    Directory of Open Access Journals (Sweden)

    Murcia Hector Horacio

    2011-04-01

    Full Text Available

    Several research efforts have been coordinated through the academic program in Agricultural Business Management at the University De La Salle (Bogota D.C.) toward the design and implementation of a sustainable agribusiness model applied to food development, with territorial projection. Rural development is considered as a process that aims to improve the current capacity and potential of the inhabitants of the sector, which refers not only to production levels and productivity of agricultural items. It takes into account the guidelines of the United Nations “Millennium Development Goals” and the concept of sustainable food and agriculture development, including food security and nutrition in an integrated interdisciplinary context, with a holistic and systemic dimension. The analysis is specified by a model with an emphasis on sustainable agribusiness production chains related to agricultural food items in a specific region. This model was correlated with farm (technical objectives), family (social purposes) and community (collective orientations) projects. Within this dimension, food development concepts and methodologies of Participatory Action Research (PAR) are considered. Finally, it addresses the need to link the results to low-income communities, within the concepts of the “new rurality”.

  10. Specification of advanced safety modeling requirements (Rev. 0).

    Energy Technology Data Exchange (ETDEWEB)

    Fanning, T. H.; Tautges, T. J.

    2008-06-30

    The U.S. Department of Energy's Global Nuclear Energy Partnership has led to renewed interest in liquid-metal-cooled fast reactors for the purpose of closing the nuclear fuel cycle and making more efficient use of future repository capacity. However, the U.S. has not designed or constructed a fast reactor in nearly 30 years. Accurate, high-fidelity, whole-plant dynamics safety simulations will play a crucial role by providing confidence that component and system designs will satisfy established design limits and safety margins under a wide variety of operational, design basis, and beyond design basis transient conditions. Current modeling capabilities for fast reactor safety analyses have resulted from several hundred person-years of code development effort supported by experimental validation. The broad spectrum of mechanistic and phenomenological models that have been developed represents an enormous amount of institutional knowledge that needs to be maintained. Complicating this, the existing code architectures for safety modeling evolved from programming practices of the 1970s. This has led to monolithic applications with interdependent data models which require significant knowledge of the complexities of the entire code in order for each component to be maintained. In order to develop an advanced fast reactor safety modeling capability, the limitations of the existing code architecture must be overcome while preserving the capabilities that already exist. To accomplish this, a set of advanced safety modeling requirements is defined, based on modern programming practices, that focuses on modular development within a flexible coupling framework. An approach for integrating the existing capabilities of the SAS4A/SASSYS-1 fast reactor safety analysis code into the SHARP framework is provided in order to preserve existing capabilities while providing a smooth transition to advanced modeling capabilities. In doing this, the advanced fast reactor safety models

  11. An integrated approach to permeability modeling using micro-models

    Energy Technology Data Exchange (ETDEWEB)

    Hosseini, A.H.; Leuangthong, O.; Deutsch, C.V. [Society of Petroleum Engineers, Canadian Section, Calgary, AB (Canada)]|[Alberta Univ., Edmonton, AB (Canada)

    2008-10-15

    An important factor in predicting the performance of steam assisted gravity drainage (SAGD) well pairs is the spatial distribution of permeability. Complications that make the inference of a reliable porosity-permeability relationship impossible include the presence of short-scale variability in sand/shale sequences; preferential sampling of core data; and uncertainty in upscaling parameters. Micro-modelling is a simple and effective method for overcoming these complications. This paper proposed a micro-modelling approach to account for sampling bias, small laminated features with high permeability contrast, and uncertainty in upscaling parameters. The paper described the steps and challenges of micro-modelling and discussed the construction of binary mixture geo-blocks; flow simulation and upscaling; extended power law formalism (EPLF); and the application of micro-modelling and EPLF. An extended power-law formalism to account for changes in clean sand permeability as a function of macroscopic shale content was also proposed and tested against flow simulation results. There was close agreement between the model and simulation results. The proposed methodology was also applied to build the porosity-permeability relationship for laminated and brecciated facies of McMurray oil sands. Model predictions were in good agreement with the experimental data. 8 refs., 17 figs.

  12. Anomalous superconductivity in the tJ model; moment approach

    DEFF Research Database (Denmark)

    Sørensen, Mads Peter; Rodriguez-Nunez, J.J.

    1997-01-01

    By extending the moment approach of Nolting (Z. Phys. 225 (1972) 25) to the superconducting phase, we have constructed the one-particle spectral functions (diagonal and off-diagonal) for the tJ model in any dimension. We propose that both the diagonal and the off-diagonal spectral functions... Hartree shift which in the end enlarges the bandwidth of the free carriers, allowing us to take relatively high values of J/t and allowing superconductivity to live in the T-c-rho phase diagram, in agreement with numerical calculations in a cluster. We have calculated the static spin susceptibility..., chi(T), and the specific heat, C-v(T), within the moment approach. We find that all the relevant physical quantities show the signature of superconductivity at T-c in the form of kinks (anomalous behavior) or jumps, for low density, in agreement with recently published literature, showing a generic

  13. A rule-based approach to model checking of UML state machines

    Science.gov (United States)

    Grobelna, Iwona; Grobelny, Michał; Stefanowicz, Łukasz

    2016-12-01

    In the paper, a new approach to formal verification of control process specifications expressed by means of UML state machines in version 2.x is proposed. In contrast to other approaches from the literature, we use an abstract and universal rule-based logical model suitable both for model checking (using the nuXmv model checker) and for logical synthesis in the form of rapid prototyping. Hence, a prototype implementation in the hardware description language VHDL can be obtained that fully reflects the primary, already formally verified specification in the form of UML state machines. The presented approach increases the assurance that the implemented system meets the user-defined requirements.

  14. Risk communication: a mental models approach

    National Research Council Canada - National Science Library

    Morgan, M. Granger (Millett Granger)

    2002-01-01

    ... information about risks. The procedure uses approaches from risk and decision analysis to identify the most relevant information; it also uses approaches from psychology and communication theory to ensure that its message is understood. This book is written in nontechnical terms, designed to make the approach feasible for anyone willing to try it. It is illustrat...

  15. A quality risk management model approach for cell therapy manufacturing.

    Science.gov (United States)

    Lopez, Fabio; Di Bartolo, Chiara; Piazza, Tommaso; Passannanti, Antonino; Gerlach, Jörg C; Gridelli, Bruno; Triolo, Fabio

    2010-12-01

    International regulatory authorities view risk management as an essential production need for the development of innovative, somatic cell-based therapies in regenerative medicine. The available risk management guidelines, however, provide little guidance on specific risk analysis approaches and procedures applicable in clinical cell therapy manufacturing. This raises a number of problems. Cell manufacturing is a poorly automated process, prone to operator-introduced variations, and affected by heterogeneity of the processed organs/tissues and lot-dependent variability of reagent (e.g., collagenase) efficiency. In this study, the principal challenges faced in a cell-based product manufacturing context (i.e., high dependence on human intervention and absence of reference standards for acceptable risk levels) are identified and addressed, and a risk management model approach applicable to manufacturing of cells for clinical use is described for the first time. The use of the heuristic and pseudo-quantitative failure mode and effect analysis/failure mode and critical effect analysis risk analysis technique associated with direct estimation of severity, occurrence, and detection is, in this specific context, as effective as, but more efficient than, the analytic hierarchy process. Moreover, a severity/occurrence matrix and Pareto analysis can be successfully adopted to identify priority failure modes on which to act to mitigate risks. The application of this approach to clinical cell therapy manufacturing in regenerative medicine is also discussed. © 2010 Society for Risk Analysis.
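
    The severity/occurrence/detection scoring and Pareto analysis described above can be sketched as follows; the failure modes, scores, and the 80% cutoff are invented for illustration, not taken from the study:

```python
# Hedged sketch of FMEA-style risk ranking: each failure mode gets severity
# (S), occurrence (O) and detection (D) scores, a risk priority number
# RPN = S*O*D, and a Pareto cut selects the modes covering ~80% of total risk.

def pareto_priority(failure_modes, cutoff=0.8):
    """Rank failure modes by RPN; return those covering `cutoff` of total risk."""
    ranked = sorted(failure_modes,
                    key=lambda fm: fm["S"] * fm["O"] * fm["D"],
                    reverse=True)
    total = sum(fm["S"] * fm["O"] * fm["D"] for fm in ranked)
    selected, cumulative = [], 0.0
    for fm in ranked:
        selected.append(fm["name"])
        cumulative += fm["S"] * fm["O"] * fm["D"]
        if cumulative / total >= cutoff:
            break
    return selected

# Invented failure modes for a cell-manufacturing context (scores 1-10)
modes = [
    {"name": "operator pipetting error", "S": 8, "O": 6, "D": 4},      # RPN 192
    {"name": "collagenase lot variability", "S": 7, "O": 5, "D": 5},   # RPN 175
    {"name": "incubator temperature drift", "S": 6, "O": 2, "D": 2},   # RPN 24
]
print(pareto_priority(modes))
# prints: ['operator pipetting error', 'collagenase lot variability']
```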

  16. A Multi-Model Approach for System Diagnosis

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Poulsen, Niels Kjølstad; Bækgaard, Mikkel Ask Buur

    2007-01-01

    A multi-model approach for system diagnosis is presented in this paper. The relation with fault diagnosis as well as performance validation is considered. The approach is based on testing a number of pre-described models and finding which one is the best. It is based on an active approach..., i.e. an auxiliary input to the system is applied. The multi-model approach is applied to a wind turbine system.

  17. Element-specific density profiles in interacting biomembrane models

    International Nuclear Information System (INIS)

    Schneck, Emanuel; Rodriguez-Loureiro, Ignacio; Bertinetti, Luca; Gochev, Georgi; Marin, Egor; Novikov, Dmitri; Konovalov, Oleg

    2017-01-01

    Surface interactions involving biomembranes, such as cell–cell interactions or membrane contacts inside cells play important roles in numerous biological processes. Structural insight into the interacting surfaces is a prerequisite to understand the interaction characteristics as well as the underlying physical mechanisms. Here, we work with simplified planar experimental models of membrane surfaces, composed of lipids and lipopolymers. Their interaction is quantified in terms of pressure–distance curves using ellipsometry at controlled dehydrating (interaction) pressures. For selected pressures, their internal structure is investigated by standing-wave x-ray fluorescence (SWXF). This technique yields specific density profiles of the chemical elements P and S belonging to lipid headgroups and polymer chains, as well as counter-ion profiles for charged surfaces. (paper)

  18. Model-based analysis of context-specific cognitive control

    Directory of Open Access Journals (Sweden)

    Joseph A. King

    2012-09-01

    Full Text Available Interference resolution is improved for stimuli presented in contexts (e.g. locations) associated with frequent conflict. This phenomenon, the context-specific proportion congruent (CSPC) effect, has challenged the traditional juxtaposition of automatic and controlled processing because it suggests that contextual cues can prime top-down control settings in a bottom-up manner. We recently obtained support for this priming-of-control hypothesis with fMRI by showing that CSPC effects are mediated by contextually-cued adjustments in processing selectivity. However, an equally plausible explanation is that CSPC effects reflect adjustments in response caution triggered by expectancy violations (i.e. prediction errors) when encountering rare events as compared to common ones (e.g. high-conflict incongruent trials in a task context associated with infrequent conflict). Here, we applied a quantitative model of choice, the linear ballistic accumulator (LBA), to distil the reaction time and accuracy data from four independent samples that performed a modified flanker task into latent variables representing the psychological processes underlying task-related decision making. We contrasted models which differentially accounted for CSPC effects as arising either from contextually-cued shifts in the rate of sensory evidence accumulation (drift models) or in the amount of evidence required to reach a decision (threshold models). For the majority of the participants, the LBA ascribed CSPC effects to increases in response threshold for contextually-infrequent trial types (e.g. low-conflict congruent trials in the frequent-conflict context), suggesting that the phenomenon may reflect a prediction error-triggered shift in decision criterion rather than enhanced sensory evidence accumulation under conditions of frequent conflict.
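
    A minimal simulation can illustrate the LBA's threshold account: raising the response threshold lengthens predicted reaction times while the drift (evidence accumulation) rates stay fixed. All parameter values below are invented, not fitted estimates from the study:

```python
import random

# Minimal two-accumulator linear ballistic accumulator (LBA): each
# accumulator starts at a uniform point in [0, A], accumulates evidence
# linearly at a normally distributed drift rate, and the first to reach
# threshold b determines the choice and reaction time.

def lba_trial(drifts, b, A=0.5, s=0.3, t0=0.2, rng=random):
    """One LBA trial: accumulators race linearly to threshold b."""
    best_rt, choice = float("inf"), None
    for i, v in enumerate(drifts):
        k = rng.uniform(0.0, A)   # random start point
        d = rng.gauss(v, s)       # trial-to-trial drift variability
        if d <= 0:
            continue              # this accumulator never finishes
        rt = t0 + (b - k) / d     # non-decision time plus race time
        if rt < best_rt:
            best_rt, choice = rt, i
    return choice, best_rt

def mean_rt(b, n=20000, seed=1):
    rng = random.Random(seed)
    rts = [lba_trial([1.0, 0.6], b, rng=rng)[1] for _ in range(n)]
    rts = [rt for rt in rts if rt != float("inf")]
    return sum(rts) / len(rts)

# Raising threshold b slows mean responses; drift rates are unchanged.
print(mean_rt(b=1.0) < mean_rt(b=1.5))  # prints: True
```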

  19. A point of view on Otto cycle approach specific for an undergraduate thermodynamics course in CMU

    Science.gov (United States)

    Memet, F.; Preda, A.

    2015-11-01

    This paper describes a way of presenting to future marine engineers the analysis of the performance of an Otto cycle, in a manner which goes beyond the classic approach of the thermodynamics course at Constanta Maritime University. The conventional thermodynamics course deals with the performance analysis of the cycle of an internal combustion engine with isochoric combustion for the situation in which the working medium is treated as a perfect gas. This type of approach is viable only when relatively small temperature differences are considered, i.e. when specific heats can be regarded as constant. Practical experience has shown, however, that the temperature differences involved are not small, resulting in the need to evaluate variable specific heats. The presentation below takes the adiabatic exponent as a linear function of temperature. In the section of this paper dedicated to methods and materials, the case in which the specific heat is taken as constant is not neglected; additionally, the algorithm for variable specific heat is given. For both cases, the way in which the work output is assessed is shown. The calculus is based on the cycle shown in the temperature-entropy diagram, in which the irreversible adiabatic compression and expansion are also indicated. The experience gained by understanding this theory will allow future professionals to deal successfully with the design practice of internal combustion engines.
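
    The contrast drawn above between constant and variable specific heats can be made concrete with a small numerical sketch; the cycle temperatures and the linear cv(T) coefficients below are illustrative assumptions, not values from the course material:

```python
# Otto-cycle heat addition/rejection with a temperature-dependent specific
# heat cv(T) = a + b*T instead of a constant cv. All numbers are invented.

def q_cv_linear(t_from, t_to, a, b):
    """Heat per unit mass for an isochoric process with cv(T) = a + b*T."""
    return a * (t_to - t_from) + 0.5 * b * (t_to**2 - t_from**2)

# Illustrative cycle temperatures (K): 1-2 adiabatic compression, 2-3
# isochoric heat addition, 3-4 adiabatic expansion, 4-1 isochoric rejection.
T1, T2, T3, T4 = 300.0, 700.0, 2200.0, 1000.0
a, b = 718.0, 0.08   # J/(kg K), J/(kg K^2): cv grows with temperature

q_in = q_cv_linear(T2, T3, a, b)
q_out = q_cv_linear(T1, T4, a, b)
w_net = q_in - q_out           # net work output per unit mass
eta = w_net / q_in             # thermal efficiency, variable cv

# Constant-cv comparison (cv = a cancels out of the efficiency ratio)
eta_const = 1 - (T4 - T1) / (T3 - T2)
print(f"eta(variable cv) = {eta:.3f}, eta(constant cv) = {eta_const:.3f}")
# prints: eta(variable cv) = 0.569, eta(constant cv) = 0.533
```

    The variable-cv efficiency differs from the constant-cv value at these (deliberately large) temperature differences, which is the motivation given above for evaluating variable specific heats.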

  20. Modeling drug- and chemical- induced hepatotoxicity with systems biology approaches

    Directory of Open Access Journals (Sweden)

    Sudin eBhattacharya

    2012-12-01

    Full Text Available We provide an overview of computational systems biology approaches as applied to the study of chemical- and drug-induced toxicity. The concept of 'toxicity pathways' is described in the context of the 2007 US National Academies of Science report, Toxicity Testing in the 21st Century: A Vision and A Strategy. Pathway mapping and modeling based on network biology concepts are a key component of the vision laid out in this report for a more biologically-based analysis of dose-response behavior and the safety of chemicals and drugs. We focus on toxicity of the liver (hepatotoxicity), a complex phenotypic response with contributions from a number of different cell types and biological processes. We describe three case studies of complementary multi-scale computational modeling approaches to understand perturbation of toxicity pathways in the human liver as a result of exposure to environmental contaminants and specific drugs. One approach involves development of a spatial, multicellular virtual tissue model of the liver lobule that combines molecular circuits in individual hepatocytes with cell-cell interactions and blood-mediated transport of toxicants through hepatic sinusoids, to enable quantitative, mechanistic prediction of hepatic dose-response for activation of the AhR toxicity pathway. Simultaneously, methods are being developed to extract quantitative maps of intracellular signaling and transcriptional regulatory networks perturbed by environmental contaminants, using a combination of gene expression and genome-wide protein-DNA interaction data. A predictive physiological model (DILIsym™) to understand drug-induced liver injury (DILI), the most common adverse event leading to termination of clinical development programs and regulatory actions on drugs, is also described. The model initially focuses on reactive metabolite-induced DILI in response to administration of acetaminophen, and spans multiple biological scales.

  1. Virtuous organization: A structural equation modeling approach

    Directory of Open Access Journals (Sweden)

    Majid Zamahani

    2013-02-01

    Full Text Available For years, the idea of virtue was unfavorable among researchers: virtues were traditionally considered culture-specific and relativistic, and were supposed to be associated with social conservatism, religious or moral dogmatism, and scientific irrelevance. Virtue and virtuousness have recently been taken seriously among organizational researchers. The proposed study of this paper examines the relationships between leadership, organizational culture, human resource, structure and processes, care for community and the virtuous organization. Structural equation modeling is employed to investigate the effects of each variable on the other components. The data used in this study consist of questionnaire responses from employees at Payam e Noor University in Yazd province. A total of 250 questionnaires were sent out and 211 valid responses were received. Our results reveal that all five variables have positive and significant impacts on the virtuous organization. Among the five variables, organizational culture has the strongest direct impact (0.80) and human resource has the strongest total impact (0.844) on the virtuous organization.

  2. Factors affecting forward pricing behaviour: implications of alternative regression model specifications

    Directory of Open Access Journals (Sweden)

    Henry Jordaan

    2010-12-01

    Full Text Available Price risk associated with maize production became a reason for concern in South Africa only after the deregulation of the agricultural commodities markets in the mid-1990s, when farmers became responsible for marketing their own crops. Although farmers can use, inter alia, cash forward contracting and/or the derivatives market to manage price risk, few farmers actually participate in forward pricing. A similar reluctance to use forward pricing methods is also found internationally. A number of different model specifications have been used in previous research to model forward pricing behaviour, based on the assumption that the same variables influence both the adoption and the quantity decision. This study compares the results from a model specification which models forward pricing behaviour in a single-decision framework with the results from modelling the quantity decision conditional on the adoption decision in a two-step approach. The results suggest that substantially more information is obtained by modelling forward pricing behaviour as two separate decisions rather than a single decision. Such information may be valuable in educational material compiled to educate farmers in the effective use of forward pricing methods in price risk management. Modelling forward pricing behaviour as two separate decisions is thus a more effective means of modelling forward pricing behaviour than modelling it as a single decision.

  3. The impact of case specificity and generalisable skills on clinical performance: a correlated traits-correlated methods approach.

    Science.gov (United States)

    Wimmers, Paul F; Fung, Cha-Chi

    2008-06-01

    The finding of case or content specificity in medical problem solving moved the focus of research away from generalisable skills towards the importance of content knowledge. However, controversy about the content dependency of clinical performance and the generalisability of skills remains. This study aimed to explore the relative impact of both perspectives (case specificity and generalisable skills) on different components (history taking, physical examination, communication) of clinical performance within and across cases. Data from a clinical performance examination (CPX) taken by 350 Year 3 students were used in a correlated traits-correlated methods (CTCM) approach using confirmatory factor analysis, whereby 'traits' refers to generalisable skills and 'methods' to individual cases. The baseline CTCM model was analysed and compared with four nested models using structural equation modelling techniques. The CPX consisted of three skills components and five cases. Comparison of the four different models with the least-restricted baseline CTCM model revealed that a model with uncorrelated generalisable skills factors and correlated case-specific knowledge factors represented the data best. The generalisable processes found in history taking, physical examination and communication were responsible for half the explained variance, compared with the variance related to case specificity. In conclusion, pure knowledge-based and pure skill-based perspectives on clinical performance both seem too one-dimensional, and new evidence supports the idea that a substantial amount of variance contributes to both aspects of performance. It can be concluded that generalisable skills and specialised knowledge go hand in hand: both are essential aspects of clinical performance.

  4. Designing Class Activities to Meet Specific Core Training Competencies: A Developmental Approach

    Science.gov (United States)

    Guth, Lorraine J.; McDonnell, Kelly A.

    2004-01-01

    This article presents a developmental model for designing and utilizing class activities to meet specific Association for Specialists in Group Work (ASGW) core training competencies for group workers. A review of the relevant literature about teaching group work and meeting core training standards is provided. The authors suggest a process by…

  5. Dark matter physics in neutrino specific two Higgs doublet model

    Energy Technology Data Exchange (ETDEWEB)

    Baek, Seungwon; Nomura, Takaaki [School of Physics, Korea Institute for Advanced Study,85 Hoegiro, Dongdaemun-gu, Seoul 02455 (Korea, Republic of)

    2017-03-10

    Although the seesaw mechanism is a natural explanation for the small neutrino masses, there are cases when the Majorana mass terms for the right-handed neutrinos are not allowed due to symmetry. In that case, if a neutrino-specific Higgs doublet is introduced, neutrinos become Dirac particles and their small masses can be explained by its small VEV. We show that the same symmetry, which we assume to be a global U(1)_X, can also be used to explain the stability of dark matter. In our model, a new singlet scalar breaks the global symmetry spontaneously down to a discrete Z_2 symmetry. The dark matter particle, the lightest Z_2-odd fermion, is stabilized. We discuss the phenomenology of dark matter: relic density, direct detection, and indirect detection. We find that the relic density can be explained by a novel Goldstone boson channel or by a resonance channel. In most of the parameter space considered, direct detection is suppressed well below the current experimental bound. Our model can be further tested in indirect detection experiments such as Fermi-LAT gamma ray searches or neutrinoless double beta decay experiments.

  6. Computational and Game-Theoretic Approaches for Modeling Bounded Rationality

    NARCIS (Netherlands)

    L. Waltman (Ludo)

    2011-01-01

    textabstractThis thesis studies various computational and game-theoretic approaches to economic modeling. Unlike traditional approaches to economic modeling, the approaches studied in this thesis do not rely on the assumption that economic agents behave in a fully rational way. Instead, economic

  7. Multi-approach assessment of the spatial distribution of the specific yield: application to the Crau plain aquifer, France

    Science.gov (United States)

    Seraphin, Pierre; Gonçalvès, Julio; Vallet-Coulomb, Christine; Champollion, Cédric

    2018-03-01

    Spatially distributed values of the specific yield, a fundamental parameter for transient groundwater mass balance calculations, were obtained by means of three independent methods for the Crau plain, France. In contrast to its traditional use to assess recharge based on a given specific yield, the water-table fluctuation (WTF) method, applied using major recharging events, gave a first set of reference values. Then, large infiltration processes recorded by monitored boreholes and caused by major precipitation events were interpreted in terms of specific yield by means of a one-dimensional vertical numerical model solving Richards' equations within the unsaturated zone. Finally, two gravity field campaigns, at low and high piezometric levels, were carried out to assess the groundwater mass variation and thus alternative specific yield values. The range obtained by the WTF method for this aquifer made of alluvial detrital material was 2.9-26%, in line with the scarce data available so far. The average spatial value of specific yield by the WTF method (9.1%) is consistent with the aquifer-scale value from the hydro-gravimetric approach. In this investigation, an estimate of the hitherto unknown spatial distribution of the specific yield over the Crau plain was obtained using the most reliable method (the WTF method). A groundwater mass balance calculation over the domain using this distribution yielded similar results to an independent quantification based on a stable isotope-mixing model. This agreement reinforces the relevance of such estimates, which can be used to build a more accurate transient hydrogeological model.
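
    The WTF estimate referred to above reduces, per recharge event, to the ratio of recharge depth to water-table rise, Sy = R / Δh. A minimal sketch with invented event values (not Crau-plain data):

```python
# Water-table fluctuation (WTF) method, per event: specific yield Sy is the
# recharge depth divided by the observed water-table rise. Values invented.

def specific_yield_wtf(recharge_mm, rise_mm):
    """Specific yield from one recharge event via the WTF method."""
    return recharge_mm / rise_mm

# Illustrative events: (recharge depth in mm, water-table rise in mm)
events = [(45.0, 500.0), (60.0, 650.0), (30.0, 320.0)]
sy_values = [specific_yield_wtf(r, dh) for r, dh in events]
print([round(sy, 3) for sy in sy_values])
# prints: [0.09, 0.092, 0.094]
```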

  8. A systems biology approach to the analysis of subset-specific responses to lipopolysaccharide in dendritic cells.

    Science.gov (United States)

    Hancock, David G; Shklovskaya, Elena; Guy, Thomas V; Falsafi, Reza; Fjell, Chris D; Ritchie, William; Hancock, Robert E W; Fazekas de St Groth, Barbara

    2014-01-01

Dendritic cells (DCs) are critical for regulating CD4 and CD8 T cell immunity, controlling Th1, Th2, and Th17 commitment, generating inducible Tregs, and mediating tolerance. It is believed that distinct DC subsets have evolved to control these different immune outcomes. However, how DC subsets mount different responses to inflammatory and/or tolerogenic signals in order to accomplish their divergent functions remains unclear. Lipopolysaccharide (LPS) provides an excellent model for investigating responses in closely related splenic DC subsets, as all subsets express the LPS receptor TLR4 and respond to LPS in vitro. However, previous studies of the LPS-induced DC transcriptome have been performed only on mixed DC populations. Moreover, comparisons of the in vivo response of two closely related DC subsets to LPS stimulation have not been reported in the literature to date. We compared the transcriptomes of murine splenic CD8 and CD11b DC subsets after in vivo LPS stimulation, using RNA-Seq and systems biology approaches. We identified subset-specific gene signatures, which included multiple functional immune mediators unique to each subset. To explain the observed subset-specific differences, we used a network analysis approach. While both DC subsets used a conserved set of transcription factors and major signalling pathways, the subsets showed differential regulation of sets of genes that 'fine-tune' the network hubs expressed in common. We propose a model in which signalling through common pathway components is 'fine-tuned' by transcriptional control of subset-specific modulators, thus allowing for distinct functional outcomes in closely related DC subsets. We extend this analysis to comparable datasets from the literature and confirm that our model can account for cell subset-specific responses to LPS stimulation in multiple subpopulations in mouse and man.

  9. Seismic safety margins research program. Phase I. Project VII: systems analysis specifications of computational approach

    International Nuclear Information System (INIS)

    Collins, J.D.; Hudson, J.M.; Chrostowski, J.D.

    1979-02-01

A computational methodology is presented for the prediction of core melt probabilities in a nuclear power plant due to earthquake events. The proposed model has four modules: seismic hazard, structural dynamics (including soil-structure interaction), component failure and core melt sequence. The proposed modules would operate in series and would not have to be operated at the same time. The basic statistical approach uses a Monte Carlo simulation to treat random and systematic error, but alternate statistical approaches are permitted by the program design.
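The four-module chain (hazard, structural response, component failure, core melt sequence) combined with Monte Carlo sampling can be caricatured in a few lines. Every distribution and threshold below is invented for illustration and does not reflect the program's actual specifications:

```python
import random

def core_melt_probability(n=100_000, seed=0):
    """Toy Monte Carlo chain mirroring four modules in series:
    hazard -> structural response -> component failure -> core melt.
    All distributions are illustrative stand-ins, not the actual
    seismic-safety program's models."""
    rng = random.Random(seed)
    melts = 0
    for _ in range(n):
        pga = rng.lognormvariate(-2.0, 0.6)          # seismic hazard sample (g)
        demand = pga * rng.lognormvariate(0.0, 0.3)  # structural response factor
        capacity = rng.lognormvariate(-0.5, 0.4)     # component capacity
        if demand > capacity:                        # failure triggers melt sequence
            melts += 1
    return melts / n

print(core_melt_probability())
```

Because the modules operate in series, each one can be replaced independently (for instance, substituting a site-specific hazard curve) without touching the rest of the chain, which mirrors the modular design described in the abstract.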

  10. Specific acoustic models for spontaneous and dictated style in indonesian speech recognition

    Science.gov (United States)

    Vista, C. B.; Satriawan, C. H.; Lestari, D. P.; Widyantoro, D. H.

    2018-03-01

The performance of an automatic speech recognition system is affected by differences in speech style between the data the model was originally trained on and the incoming speech to be recognized. In this paper, the usage of GMM-HMM acoustic models for specific speech styles is investigated. We develop two systems for the experiments; the first employs a speech style classifier to predict the speech style of incoming speech, either spontaneous or dictated, then decodes this speech using an acoustic model specifically trained for that style. The second system uses both acoustic models to recognize incoming speech and decides upon a final result by comparing the confidence scores of the decodings. Results show that training specific acoustic models for spontaneous and dictated speech styles confers a slight recognition advantage compared to a baseline model trained on a mixture of spontaneous and dictated training data. In addition, the speech style classifier approach of the first system produced slightly more accurate results than the confidence scoring employed in the second system.
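The two decision strategies compared in this record reduce to simple selection logic. A sketch with stub decoders; all names and the hypothesis interface are hypothetical, not the authors' implementation:

```python
from collections import namedtuple

# Hypothetical decoder output: transcript plus a decoding confidence score
Hypothesis = namedtuple("Hypothesis", ["text", "confidence"])

def decode_with_classifier(audio, classify_style, models):
    """System 1: predict the speech style first, then decode with
    the acoustic model trained for that style."""
    style = classify_style(audio)  # 'spontaneous' or 'dictated'
    return models[style](audio)

def decode_with_confidence(audio, models):
    """System 2: decode with every style-specific model and keep
    the hypothesis with the highest confidence score."""
    return max((decode(audio) for decode in models.values()),
               key=lambda hyp: hyp.confidence)

# Toy demo with stub decoders standing in for the GMM-HMM models
models = {
    "spontaneous": lambda a: Hypothesis("uh hello there", 0.62),
    "dictated": lambda a: Hypothesis("hello there", 0.81),
}
print(decode_with_confidence("audio.wav", models).text)  # hello there
```

System 1 pays one classifier pass but a single decode; System 2 avoids a classifier at the cost of decoding once per style, which is why the classifier route is the cheaper default when the style classifier is reliable.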

  11. Electronic Excitations in Solution: The Interplay between State Specific Approaches and a Time-Dependent Density Functional Theory Description.

    Science.gov (United States)

    Guido, Ciro A; Jacquemin, Denis; Adamo, Carlo; Mennucci, Benedetta

    2015-12-08

We critically analyze the performance of continuum solvation models when coupled to time-dependent density functional theory (TD-DFT) to predict solvent effects on both absorption and emission energies of chromophores in solution. Different polarization schemes of the polarizable continuum model (PCM), such as linear response (LR) and three different state-specific (SS) approaches, are considered and compared. We show the necessity of introducing an SS model in cases where large electron density rearrangements are involved in the excitations, such as charge-transfer transitions in both twisted and quadrupolar compounds, and underline the very delicate interplay between the selected polarization method and the chosen exchange-correlation functional. This interplay originates in the different descriptions of the transition and ground/excited state multipolar moments by the different functionals. As a result, the choice of both the DFT functional and the solvent polarization scheme has to be consistent with the nature of the studied electronic excitation.

  12. Improving Baseline Model Assumptions: Evaluating the Impacts of Typical Methodological Approaches in Watershed Models

    Science.gov (United States)

    Muenich, R. L.; Kalcic, M. M.; Teshager, A. D.; Long, C. M.; Wang, Y. C.; Scavia, D.

    2017-12-01

Thanks to the availability of open-source software, online tutorials, and advanced software capabilities, watershed modeling has expanded its user base and applications significantly in the past thirty years. Even complicated models like the Soil and Water Assessment Tool (SWAT) are used and documented in hundreds of peer-reviewed publications each year, and likely applied even more in practice. These models can help improve our understanding of present, past, and future conditions, or analyze important "what-if" management scenarios. However, baseline data and methods are often adopted and applied without rigorous testing. In multiple collaborative projects, we have evaluated the influence of some of these common approaches on model results. Specifically, we examined the impacts of baseline data and assumptions involved in manure application, combined sewer overflows, and climate data incorporation across multiple watersheds in the Western Lake Erie Basin. In these efforts, we seek to understand the impact of using typical modeling data and assumptions, versus improved data and enhanced assumptions, on model outcomes and thus, ultimately, on study conclusions. We provide guidance for modelers as they adopt and apply data and models for their specific study regions. While it is difficult to quantitatively assess the full uncertainty surrounding model input data and assumptions, recognizing the impacts of model input choices is important when considering actions at both the field and watershed scales.

  13. A Discrete Monetary Economic Growth Model with the MIU Approach

    Directory of Open Access Journals (Sweden)

    Wei-Bin Zhang

    2008-01-01

This paper proposes an alternative approach to economic growth with money. The production side is the same as in the Solow model, the Ramsey model, and the Tobin model, but we treat the behavior of consumers differently from the traditional approaches. The model is influenced by the money-in-the-utility (MIU) approach in monetary economics. It provides a mechanism of endogenous saving, which the Solow model lacks, and avoids the assumption of adding up utility over a period of time, upon which the Ramsey approach is based.

  14. Birthdating studies reshape models for pituitary gland cell specification.

    Science.gov (United States)

    Davis, Shannon W; Mortensen, Amanda H; Camper, Sally A

    2011-04-15

The intermediate and anterior lobes of the pituitary gland are derived from an invagination of oral ectoderm that forms Rathke's pouch. During gestation, proliferating cells are enriched around the pouch lumen, and they appear to delaminate as they exit the cell cycle and differentiate. During late mouse gestation and the postnatal period, anterior lobe progenitors re-enter the cell cycle and expand the populations of specialized, hormone-producing cells. At birth, all cell types are present, and their localization appears stratified based on cell type. We conducted a birth-dating study of Rathke's pouch derivatives to determine whether the location of specialized cells at birth is correlated with the timing of cell cycle exit. We find that all of the anterior lobe cell types initiate differentiation concurrently, with a peak between e11.5 and e13.5. Differentiation of intermediate lobe melanotropes is delayed relative to anterior lobe cell types. We discovered that specialized cell types are not grouped together based on birth date and are dispersed throughout the anterior lobe; thus, the apparent stratification of specialized cells at birth is not correlated with cell cycle exit. Consequently, the currently popular model of cell specification, dependent upon the timing of extrinsic, directional gradients of signaling molecules, needs revision. We propose that signals intrinsic to Rathke's pouch are necessary for cell specification between e11.5 and e13.5 and that cell-cell communication likely plays an important role in regulating this process. Copyright © 2011 Elsevier Inc. All rights reserved.

  15. Numerical renormalization group calculation of impurity internal energy and specific heat of quantum impurity models

    Science.gov (United States)

    Merker, L.; Costi, T. A.

    2012-08-01

We introduce a method to obtain the specific heat of quantum impurity models via a direct calculation of the impurity internal energy, requiring only the evaluation of local quantities within a single numerical renormalization group (NRG) calculation for the total system. For the Anderson impurity model we show that the impurity internal energy can be expressed as a sum of purely local static correlation functions and a term that also involves the impurity Green function. The temperature dependence of the latter can be neglected in many cases, thereby allowing the impurity specific heat C_imp to be calculated accurately from local static correlation functions; specifically via C_imp = ∂E_ionic/∂T + (1/2) ∂E_hyb/∂T, where E_ionic and E_hyb are the energy of the (embedded) impurity and the hybridization energy, respectively. The term involving the Green function can also be evaluated in cases where its temperature dependence is non-negligible, adding an extra term to C_imp. For the nondegenerate Anderson impurity model, we show by comparison with exact Bethe ansatz calculations that the results accurately recover both the Kondo-induced peak in the specific heat at low temperatures and the high-temperature peak due to the resonant level. The approach applies to multiorbital and multichannel Anderson impurity models with arbitrary local Coulomb interactions. An application to the Ohmic two-state system and the anisotropic Kondo model is also given, with comparisons to Bethe ansatz calculations. The approach could also be of interest within other impurity solvers, for example, within quantum Monte Carlo techniques.
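The relation C_imp = ∂E_ionic/∂T + (1/2) ∂E_hyb/∂T lends itself to a direct numerical check once the temperature-dependent energies are available. A sketch using central differences with deliberately toy energy functions (not NRG output):

```python
def impurity_specific_heat(E_ionic, E_hyb, T, dT=1e-4):
    """Central-difference estimate of
    C_imp = dE_ionic/dT + (1/2) dE_hyb/dT,
    given the impurity energy E_ionic(T) and hybridization
    energy E_hyb(T) as callables."""
    dE_ionic = (E_ionic(T + dT) - E_ionic(T - dT)) / (2 * dT)
    dE_hyb = (E_hyb(T + dT) - E_hyb(T - dT)) / (2 * dT)
    return dE_ionic + 0.5 * dE_hyb

# Toy energies, linear in T, so C_imp is constant:
C = impurity_specific_heat(lambda T: 2.0 * T, lambda T: 0.5 * T, T=1.0)
print(C)  # ≈ 2.0 + 0.5 * 0.5 = 2.25 (up to floating-point rounding)
```

In practice E_ionic(T) and E_hyb(T) would come from the NRG static correlation functions evaluated on a temperature grid, with the derivative taken between neighbouring grid points.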

  16. Mathematical Modelling Approach in Mathematics Education

    Science.gov (United States)

    Arseven, Ayla

    2015-01-01

The topic of models and modeling has come to be important for science and mathematics education in recent years. The "Modeling" topic is especially important for examinations such as PISA, which is conducted at an international level and measures a student's success in mathematics. Mathematical modeling can be defined as using…

  17. A Multivariate Approach to Functional Neuro Modeling

    DEFF Research Database (Denmark)

    Mørch, Niels J.S.

    1998-01-01

by the application of linear and more flexible, nonlinear microscopic regression models to a real-world dataset. The dependency of model performance, as quantified by generalization error, on model flexibility and training set size is demonstrated, leading to the important realization that no uniformly optimal model…, provides the basis for a generalization theoretical framework relating model performance to model complexity and dataset size. Briefly summarized, the major topics discussed in the thesis include: - An introduction of the representation of functional datasets by pairs of neuronal activity patterns… exists. - Model visualization and interpretation techniques. The simplicity of this task for linear models contrasts the difficulties involved when dealing with nonlinear models. Finally, a visualization technique for nonlinear models is proposed. A single observation emerges from the thesis…

  18. Rival approaches to mathematical modelling in immunology

    Science.gov (United States)

    Andrew, Sarah M.; Baker, Christopher T. H.; Bocharov, Gennady A.

    2007-08-01

    In order to formulate quantitatively correct mathematical models of the immune system, one requires an understanding of immune processes and familiarity with a range of mathematical techniques. Selection of an appropriate model requires a number of decisions to be made, including a choice of the modelling objectives, strategies and techniques and the types of model considered as candidate models. The authors adopt a multidisciplinary perspective.

  19. A hybrid agent-based approach for modeling microbiological systems.

    Science.gov (United States)

    Guo, Zaiyi; Sloot, Peter M A; Tay, Joc Cing

    2008-11-21

Models for systems biology commonly adopt differential-equation or agent-based modeling approaches for simulating the processes as a whole. Models based on differential equations presuppose phenomenological intracellular behavioral mechanisms, while models based on the multi-agent approach often use directly translated, and quantitatively less precise, if-then logical rule constructs. We propose an extendible systems model based on a hybrid agent-based approach where biological cells are modeled as individuals (agents) while molecules are represented by quantities. This hybridization in entity representation entails a combined modeling strategy with agent-based behavioral rules and differential equations, thereby balancing the requirements of extendible model granularity with computational tractability. We demonstrate the efficacy of this approach with models of chemotaxis involving an assay of 10^3 cells and 1.2×10^6 molecules. The model produces cell migration patterns that are comparable to laboratory observations.
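The agents-plus-quantities split can be illustrated with a deliberately tiny 1-D toy (not the authors' chemotaxis model): cells are agents performing a biased random walk, while the chemoattractant is a per-site quantity, here a fixed linear gradient.

```python
import random

def hybrid_chemotaxis(steps=200, n_cells=30, width=50, seed=0):
    """Toy hybrid model: cells are individual agents on a 1-D grid;
    the chemoattractant is a per-site quantity (a static linear
    gradient rising toward the right edge). Each step, an agent
    moves up the local gradient with probability 0.8, otherwise
    takes a random step. Returns the mean final cell position."""
    rng = random.Random(seed)
    field = [i / width for i in range(width)]   # attractant concentration
    cells = [rng.randrange(width) for _ in range(n_cells)]
    for _ in range(steps):
        for i, x in enumerate(cells):
            left, right = max(x - 1, 0), min(x + 1, width - 1)
            uphill = right if field[right] >= field[left] else left
            cells[i] = uphill if rng.random() < 0.8 else rng.choice([left, right])
    return sum(cells) / n_cells

print(hybrid_chemotaxis())  # mean position ends near the high-concentration edge
```

In the full hybrid scheme the field itself would be updated each step by a differential-equation rule (diffusion, secretion, degradation) while the agent rules stay discrete; keeping the molecules as quantities is what makes 10^6 molecules tractable where 10^6 molecular agents would not be.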

  20. A new approach for modeling dry deposition velocity of particles

    Science.gov (United States)

    Giardina, M.; Buffa, P.

    2018-05-01

The dry deposition process is recognized as an important pathway among the various removal processes of pollutants in the atmosphere. Several models reported in the literature predict the dry deposition velocity of particles of different diameters, but many of them are not capable of representing dry deposition phenomena for several categories of pollutants and deposition surfaces. Moreover, their application is valid only under specific conditions, and only if the data in the application meet all of the assumptions of the data used to define the model. In this paper a new dry deposition velocity model based on an electrical analogy scheme is proposed to overcome these issues. The dry deposition velocity is evaluated by assuming that the resistances that affect the particle flux in the quasi-laminar sub-layer can be combined to take into account local features of the mutual influence of inertial impact processes and turbulent ones. Comparisons with experimental data from the literature indicate that the proposed model captures, with good agreement, the main dry deposition phenomena for the examined environmental conditions and deposition surfaces. The proposed approach could be easily implemented within atmospheric dispersion modeling codes, efficiently addressing different deposition surfaces and several classes of particle pollutants.
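For context, the electrical analogy referred to above is conventionally written as resistances in series. A sketch of the standard textbook form for particles (not the modified scheme proposed in this paper):

```python
def deposition_velocity(r_a, r_b, v_s=0.0):
    """Common resistance-in-series form for particle dry deposition:
    v_d = v_s + 1 / (r_a + r_b + r_a * r_b * v_s),
    with aerodynamic resistance r_a (s/m), quasi-laminar-layer
    resistance r_b (s/m), and gravitational settling velocity
    v_s (m/s). This is the classical scheme; the paper's model
    modifies how the quasi-laminar-layer resistances are combined."""
    return v_s + 1.0 / (r_a + r_b + r_a * r_b * v_s)

# Illustrative values for a small particle over a rough surface
print(deposition_velocity(r_a=50.0, r_b=300.0, v_s=0.001))
```

The cross term r_a * r_b * v_s accounts for particles that settle through one layer while being transported through the other; for sub-micron particles with negligible v_s, the formula collapses to the plain series sum 1 / (r_a + r_b).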

  1. Glass Transition Temperature- and Specific Volume- Composition Models for Tellurite Glasses

    Energy Technology Data Exchange (ETDEWEB)

    Riley, Brian J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Vienna, John D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2017-09-01

This report provides composition-property models for tellurite glasses, namely for specific gravity and glass transition temperature. Included are the partial specific coefficients for each model, the component validity ranges, and the model fit parameters.

  2. Specification and Aggregation Errors in Environmentally Extended Input-Output Models

    NARCIS (Netherlands)

    Bouwmeester, Maaike C.; Oosterhaven, Jan

    This article considers the specification and aggregation errors that arise from estimating embodied emissions and embodied water use with environmentally extended national input-output (IO) models, instead of with an environmentally extended international IO model. Model specification errors result

  3. A working group's conclusion on site specific flow and transport modelling

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, J. [Golder Associates AB (Sweden); Ahokas, H. [Fintact Oy, Helsinki (Finland); Koskinen, L.; Poteri, A. [VTT Energy, Espoo (Finland); Niemi, A. [Royal Inst. of Technology, Stockholm (Sweden). Hydraulic Engineering; Hautojaervi, A. [Posiva Oy, Helsinki (Finland)

    1998-03-01

This document suggests a strategy plan for groundwater flow and transport modelling to be used in the site-specific performance assessment analysis of spent nuclear fuel disposal, in support of the site selection planned by the year 2000. Considering the suggested general regulations in Finland, the suggested regulations in Sweden, and the approach taken in recent safety assessment exercises conducted in these countries, it is clear that in such an analysis, in addition to showing that the proposed repository is safe, there is a need to strengthen the link between field data, groundwater flow modelling and the derivation of safety assessment parameters, and to assess uncertainty and variability. The suggested strategy plan builds on an evaluation of different approaches to modelling the groundwater flow in crystalline basement rock, the abundance of data collected in the site investigation programme in Finland, and the modelling methodology developed in the programme so far. It is suggested to model the whole system using nested models, where larger-scale models provide the boundary conditions for the smaller ones. 62 refs.

  4. A microbial-mineralization approach for syntheses of iron oxides with a high specific surface area.

    Science.gov (United States)

    Yagita, Naoki; Oaki, Yuya; Imai, Hiroaki

    2013-04-02

Of minerals and microbes: A microbial-mineralization-inspired approach was used to facilitate the syntheses of iron oxides with a high specific surface area, such as 253 m^2 g^-1 for maghemite (γ-Fe2O3) and 148 m^2 g^-1 for hematite (α-Fe2O3). These iron oxides can be applied as electrode materials for lithium-ion batteries, adsorbents, and catalysts. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Modelling of subject specific based segmental dynamics of knee joint

    Science.gov (United States)

    Nasir, N. H. M.; Ibrahim, B. S. K. K.; Huq, M. S.; Ahmad, M. K. I.

    2017-09-01

This study determines segmental dynamics parameters using a subject-specific method. Five hemiplegic patients participated in the study, two men and three women. Their ages ranged from 50 to 60 years, their weights from 60 to 70 kg and their heights from 145 to 170 cm. The sample group included patients with different sides affected by stroke. The segmental dynamics parameters describing knee joint function were measured using Winter's anthropometric measurements, and the model was generated using Kane's equations of motion. The inertial parameters, in the form of anthropometry, can be identified and measured by applying standard human dimensions to subjects in a hemiplegic condition. The inertial parameters are the location of the centre of mass (COM) along the limb segment, the moment of inertia around the COM, and the masses of the shank and foot, which are needed to generate accurate equations of motion. This investigation also identified several advantages of employing the anthropometry table in movement biomechanics with Winter's measurements and Kane's equations of motion. A general procedure is presented to yield accurate estimates of the inertial parameters of the knee joint for subjects with a history of stroke.

  6. Finite element speaker-specific face model generation for the study of speech production.

    Science.gov (United States)

    Bucki, Marek; Nazari, Mohammad Ali; Payan, Yohan

    2010-08-01

    In situations where automatic mesh generation is unsuitable, the finite element (FE) mesh registration technique known as mesh-match-and-repair (MMRep) is an interesting option for quickly creating a subject-specific FE model by fitting a predefined template mesh onto the target organ. The irregular or poor quality elements produced by the elastic deformation are corrected by a 'mesh reparation' procedure ensuring that the desired regularity and quality standards are met. Here, we further extend the MMRep capabilities and demonstrate the possibility of taking into account additional relevant anatomical features. We illustrate this approach with an example of biomechanical model generation of a speaker's face comprising face muscle insertions. While taking advantage of the a priori knowledge about tissues conveyed by the template model, this novel, fast and automatic mesh registration technique makes it possible to achieve greater modelling realism by accurately representing the organ surface as well as inner anatomical or functional structures of interest.

  7. Delay equations modeling the effects of phase-specific drugs and immunotherapy on proliferating tumor cells.

    Science.gov (United States)

    Barbarossa, Maria Vittoria; Kuttler, Christina; Zinsl, Jonathan

    2012-04-01

In this work we present a mathematical model for tumor growth based on the biology of the cell cycle. For an appropriate description of the effects of phase-specific drugs, it is necessary to look at the cell cycle and its phases. Our model reproduces the dynamics of three different tumor cell populations: quiescent cells, cells during the interphase and mitotic cells. Starting from a partial differential equations (PDEs) setting, a delay differential equations (DDE) model is derived for an easier and more realistic approach. Our equations also include interactions of tumor cells with immune system effectors. We investigate the model both from the analytical and the numerical point of view, give conditions for positivity of solutions and focus on the stability of the cancer-free equilibrium. Different immunotherapeutic strategies and their effects on tumor growth are also considered.
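A DDE model of this kind can be integrated numerically with a simple history buffer for the delayed state. As a stand-in for the three-population model, a delayed logistic equation with illustrative parameters (not the paper's equations):

```python
def delayed_logistic(r=0.5, K=1.0, tau=1.0, x0=0.1, dt=0.01, t_end=20.0):
    """Euler integration of the delayed logistic equation
    x'(t) = r * x(t) * (1 - x(t - tau) / K),
    a minimal stand-in for the paper's three-population DDE model.
    The history is kept in a list; x(t - tau) is taken as x0
    (constant initial history) for t < tau."""
    lag = int(round(tau / dt))
    xs = [x0]
    for n in range(int(round(t_end / dt))):
        x_lag = xs[n - lag] if n >= lag else x0
        xs.append(xs[n] + dt * r * xs[n] * (1 - x_lag / K))
    return xs

traj = delayed_logistic()
print(round(traj[-1], 3))  # settles toward the carrying capacity K = 1
```

The same history-buffer pattern extends to systems of DDEs: each population keeps its own list, and each right-hand side may index any population's buffer at its own lag, which is how phase durations enter as delays.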

  8. Numerical modelling approach for mine backfill

    Indian Academy of Sciences (India)

    Muhammad Zaka Emad

    2017-07-24

This paper discusses a numerical modelling strategy for mine backfill material.

  9. Uncertainty in biology a computational modeling approach

    CERN Document Server

    Gomez-Cabrero, David

    2016-01-01

Computational modeling of biomedical processes is gaining more and more weight in current research into the etiology of biomedical problems and potential treatment strategies. Computational modeling allows one to reduce, refine and replace animal experimentation, as well as to translate findings obtained in these experiments to the human background. However, these biomedical problems are inherently complex, with a myriad of influencing factors, which strongly complicates the model building and validation process. This book addresses four main issues related to the building and validation of computational models of biomedical processes: (i) model establishment under uncertainty; (ii) model selection and parameter fitting; (iii) sensitivity analysis and model adaptation; and (iv) model predictions under uncertainty. In each of the abovementioned areas, the book discusses a number of key techniques by means of a general theoretical description followed by one or more practical examples. This book is intended for graduate stude...

  10. OILMAP: A global approach to spill modeling

    International Nuclear Information System (INIS)

    Spaulding, M.L.; Howlett, E.; Anderson, E.; Jayko, K.

    1992-01-01

    OILMAP is an oil spill model system suitable for use in both rapid response mode and long-range contingency planning. It was developed for a personal computer and employs full-color graphics to enter data, set up spill scenarios, and view model predictions. The major components of OILMAP include environmental data entry and viewing capabilities, the oil spill models, and model prediction display capabilities. Graphic routines are provided for entering wind data, currents, and any type of geographically referenced data. Several modes of the spill model are available. The surface trajectory mode is intended for quick spill response. The weathering model includes the spreading, evaporation, entrainment, emulsification, and shoreline interaction of oil. The stochastic and receptor models simulate a large number of trajectories from a single site for generating probability statistics. Each model and the algorithms they use are described. Several additional capabilities are planned for OILMAP, including simulation of tactical spill response and subsurface oil transport. 8 refs

  11. Relaxed memory models: an operational approach

    OpenAIRE

    Boudol , Gérard; Petri , Gustavo

    2009-01-01

Memory models define an interface between programs written in some language and their implementation, determining which behaviour the memory (and thus a program) is allowed to have in a given model. A minimal guarantee memory models should provide to the programmer is that well-synchronized, that is, data-race free code has a standard semantics. Traditionally, memory models are defined axiomatically, setting constraints on the order in which memory operations are allow...

  12. Vaccination with lipid core peptides fails to induce epitope-specific T cell responses but confers non-specific protective immunity in a malaria model.

    Directory of Open Access Journals (Sweden)

    Simon H Apte

Vaccines against many pathogens for which conventional approaches have failed remain an unmet public health priority. Synthetic peptide-based vaccines offer an attractive alternative to whole-protein and whole-organism vaccines, particularly for complex pathogens that cause chronic infection. Previously, we have reported a promising lipid core peptide (LCP) vaccine delivery system that incorporates the antigen, carrier, and adjuvant in a single molecular entity. LCP vaccines have been used to deliver several peptide subunit-based vaccine candidates, induced high-titre functional antibodies and protected against Group A streptococcus in mice. Herein, we have evaluated whether LCP constructs incorporating defined CD4(+) and/or CD8(+) T cell epitopes could induce epitope-specific T cell responses and protect against pathogen challenge in a rodent malaria model. We show that LCP vaccines failed to induce an expansion of antigen-specific CD8(+) T cells following primary immunization or by boosting. We further demonstrated that the LCP vaccines induced a non-specific type 2 polarized cytokine response, rather than an epitope-specific canonical CD8(+) T cell type 1 response. Cytotoxic responses of unknown specificity were also induced. These non-specific responses were able to protect against parasite challenge. These data demonstrate that vaccination with lipid core peptides fails to induce canonical epitope-specific T cell responses, at least in our rodent model, but can nonetheless confer non-specific protective immunity against Plasmodium parasite challenge.

  13. Modeling composting kinetics: A review of approaches

    NARCIS (Netherlands)

    Hamelers, H.V.M.

    2004-01-01

    Composting kinetics modeling is necessary to design and operate composting facilities that comply with strict market demands and tight environmental legislation. Current composting kinetics modeling can be characterized as inductive, i.e. the data are the starting point of the modeling process and

  14. Conformally invariant models: A new approach

    International Nuclear Information System (INIS)

    Fradkin, E.S.; Palchik, M.Ya.; Zaikin, V.N.

    1996-02-01

    A pair of mathematical models of quantum field theory in D dimensions is analyzed, particularly, a model of a charged scalar field defined by two generations of secondary fields in the space of even dimensions D>=4 and a model of a neutral scalar field defined by two generations of secondary fields in two-dimensional space. 6 refs

  15. A Centerline Based Model Morphing Algorithm for Patient-Specific Finite Element Modelling of the Left Ventricle.

    Science.gov (United States)

    Behdadfar, S; Navarro, L; Sundnes, J; Maleckar, M; Ross, S; Odland, H H; Avril, S

    2017-09-20

Hexahedral automatic model generation is a recurrent problem in computer vision and computational biomechanics. It may even become a challenging problem when one wants to develop a patient-specific finite-element (FE) model of the left ventricle (LV), particularly when only low-resolution images are available. In the present study, a fast and efficient algorithm is presented and tested to address such a situation. A template FE hexahedral model was created for an LV geometry using a General Electric (GE) ultrasound (US) system. A centerline system was constructed for this LV mesh. The nodes located on the endocardial and epicardial surfaces were then projected from this centerline onto the actual endocardial and epicardial surfaces reconstructed from the patient's US data. Finally, the positions of the internal nodes were derived by finding the deformations with minimal elastic energy. This approach was applied to eight patients suffering from congestive heart disease. An FE analysis was performed on each of them to derive the stress induced in the LV tissue by diastolic blood pressure. Our model morphing algorithm was applied successfully, and the obtained meshes showed only marginal mismatches when compared to the corresponding US geometries. The diastolic FE analyses were successfully performed in seven patients to derive the distribution of principal stresses. The original model morphing algorithm is fast and robust, with low computational cost, and may be highly beneficial for future patient-specific reduced-order modelling of the LV, with potential application to other crucial organs.
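The projection step described above can be illustrated geometrically: each template surface node is pushed along the ray from its centerline point until it lies on the target surface. A toy example with a spherical stand-in target (the real algorithm projects onto US-reconstructed endocardial and epicardial surfaces):

```python
import math

def project_from_centerline(node, center, target_radius):
    """Project a template-mesh surface node radially from its
    centerline point onto a target surface, here idealized as a
    sphere of the given radius around the centerline point (a toy
    stand-in for the patient's reconstructed ultrasound surface)."""
    d = [n - c for n, c in zip(node, center)]
    norm = math.sqrt(sum(x * x for x in d))
    if norm == 0:
        raise ValueError("node coincides with centerline point")
    return [c + target_radius * x / norm for c, x in zip(center, d)]

# Template node 2 units from the centerline, target surface at radius 3
print(project_from_centerline([2.0, 0.0, 0.0], [0.0, 0.0, 0.0], 3.0))
# [3.0, 0.0, 0.0]
```

After both surfaces are projected this way, the interior node positions are not projected at all; they are solved for by minimizing elastic energy, which is what keeps the hexahedral elements from degenerating.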

  16. The female gametophyte: an emerging model for cell type-specific systems biology in plant development

    Directory of Open Access Journals (Sweden)

    Marc William Schmid

    2015-11-01

    Full Text Available Systems biology, a holistic approach describing a system emerging from the interactions of its molecular components, critically depends on accurate qualitative determination and quantitative measurements of these components. Development and improvement of large-scale profiling methods (omics) now facilitate comprehensive measurements of many relevant molecules. For multicellular organisms, such as animals, fungi, algae, and plants, the complexity of the system is augmented by the presence of specialized cell types and organs, and a complex interplay within and between them. Cell type-specific analyses are therefore crucial for the understanding of developmental processes and environmental responses. This review first gives an overview of current methods used for large-scale profiling of specific cell types, exemplified by recent advances in plant biology. The focus then lies on suitable model systems to study plant development and cell type specification. We introduce the female gametophyte of flowering plants as an ideal model to study fundamental developmental processes. Moreover, the female reproductive lineage is of importance for the emergence of evolutionary novelties such as an unequal parental contribution to the tissue nurturing the embryo or the clonal production of seeds by asexual reproduction (apomixis). Understanding these processes is not only interesting from a developmental or evolutionary perspective, but bears great potential for further crop improvement and the simplification of breeding efforts. We finally highlight novel methods that are already available, or will likely soon facilitate, large-scale profiling of the specific cell types of the female gametophyte in both model and non-model species. We conclude that it may take only a few years until an evolutionary systems biology approach to female gametogenesis deciphers some of its biologically most interesting and economically most valuable processes.

  17. Linear mixed-effects modeling approach to FMRI group analysis.

    Science.gov (United States)

    Chen, Gang; Saad, Ziad S; Britton, Jennifer C; Pine, Daniel S; Cox, Robert W

    2013-06-01

    Conventional group analysis is usually performed with a Student-type t-test, regression, or standard AN(C)OVA, in which the variance-covariance matrix is presumed to have a simple structure. Some correction approaches are adopted when assumptions about the covariance structure are violated. However, as experiments are designed with different degrees of sophistication, these traditional methods can become cumbersome, or even be unable to handle the situation at hand. For example, most current FMRI software packages have difficulty analyzing the following scenarios at the group level: (1) taking within-subject variability into account when there are effect estimates from multiple runs or sessions; (2) modeling of continuous explanatory variables (covariates) in the presence of a within-subject (repeated measures) factor, multiple subject-grouping (between-subjects) factors, or a mixture of both; (3) subject-specific adjustments in covariate modeling; (4) group analysis with estimation of the hemodynamic response (HDR) function by multiple basis functions; (5) various cases of missing data in longitudinal studies; and (6) group studies involving family members or twins. Here we present a linear mixed-effects modeling (LME) methodology that extends the conventional group analysis approach to analyze many complicated cases, including the six prototypes delineated above, whose analyses would otherwise be either difficult or infeasible under traditional frameworks such as AN(C)OVA and the general linear model (GLM). In addition, the strength of the LME framework lies in its flexibility to model and estimate the variance-covariance structures for both random effects and residuals. The intraclass correlation (ICC) values can be easily obtained with an LME model with crossed random effects, even in the presence of confounding fixed effects. The simulations of one prototypical scenario indicate that the LME modeling keeps a balance between the control for false positives and the sensitivity.
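
    As a small illustration of the ICC mentioned above, the sketch below estimates an intraclass correlation from simulated repeated measurements using the classic one-way ANOVA method of moments. It is a toy stand-in, not the crossed-random-effects LME estimator of the paper; all effect sizes are simulated.

```python
import numpy as np

# Sketch: intraclass correlation (ICC) from a one-way random-effects model,
# estimated by the ANOVA method of moments on simulated data.

rng = np.random.default_rng(0)
n_subj, n_rep = 20, 4
subj_effect = rng.normal(0, 2.0, size=(n_subj, 1))   # between-subject sd = 2
noise = rng.normal(0, 1.0, size=(n_subj, n_rep))     # within-subject sd = 1
y = 5.0 + subj_effect + noise                        # (subjects x repeats)

grand = y.mean()
ms_between = n_rep * ((y.mean(axis=1) - grand) ** 2).sum() / (n_subj - 1)
ms_within = ((y - y.mean(axis=1, keepdims=True)) ** 2).sum() / (n_subj * (n_rep - 1))

var_between = (ms_between - ms_within) / n_rep
icc = var_between / (var_between + ms_within)
# true ICC = 4 / (4 + 1) = 0.8; the estimate should land in that vicinity
```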

  18. Position-specific isotope modeling of organic micropollutants transformations through different reaction pathways

    Science.gov (United States)

    Jin, Biao; Rolle, Massimo

    2016-04-01

    Organic compounds are produced in vast quantities for industrial and agricultural use, as well as for human and animal healthcare [1]. These chemicals and their metabolites are frequently detected at trace levels in fresh water environments, where they undergo degradation via different reaction pathways. Compound-specific stable isotope analysis (CSIA) is a valuable tool to identify such degradation pathways in different environmental systems. Recent advances in analytical techniques have promoted the fast development and implementation of multi-element CSIA. However, quantitative frameworks that evaluate multi-element stable isotope data and incorporate mechanistic information on the degradation processes [2,3] are still lacking. In this study we propose a mechanism-based modeling approach to simultaneously evaluate concentration as well as bulk and position-specific multi-element isotope evolution during the transformation of organic micropollutants. The model explicitly simulates position-specific isotopologues for those atoms that experience isotope effects and thereby provides a mechanistic description of isotope fractionation occurring at different molecular positions. We validate the proposed approach with the concentration and multi-element isotope data of three selected organic micropollutants: dichlorobenzamide (BAM), isoproturon (IPU) and diclofenac (DCF). The model precisely captures the dual-element isotope trends characteristic of different reaction pathways, and their range of variation is consistent with the observed multi-element (C, N) bulk isotope fractionation. The proposed approach can also be used as a tool to explore transformation pathways in scenarios for which position-specific isotope data are not yet available. [1] Schwarzenbach, R.P., Egli, T., Hofstetter, T.B., von Gunten, U., Wehrli, B., 2010. Global Water Pollution and Human Health. Annu. Rev. Environ. Resour. doi:10.1146/annurev-environ-100809-125342.
[2] Jin, B., Haderlein, S.B., Rolle, M
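
    For readers unfamiliar with isotope-fractionation modeling, the bulk (not position-specific) evolution of substrate isotope ratios is often described by the classical Rayleigh model. The sketch below is generic, with illustrative enrichment factors rather than values fitted to BAM, IPU or DCF.

```python
import numpy as np

# Sketch of the classical Rayleigh model for bulk compound-specific isotope
# fractionation; the enrichment factors are illustrative, not fitted values.

def rayleigh_delta(delta0, eps_permil, f):
    """Isotope signature (delta notation, permil) of the remaining substrate
    when a fraction f (0 < f <= 1) is left unreacted."""
    alpha = 1.0 + eps_permil / 1000.0
    return (delta0 + 1000.0) * f ** (alpha - 1.0) - 1000.0

f = np.linspace(1.0, 0.1, 50)
delta_C = rayleigh_delta(-30.0, -5.0, f)   # carbon,   eps = -5 permil
delta_N = rayleigh_delta(0.0, -10.0, f)    # nitrogen, eps = -10 permil

# the dual-element slope d(delta_N)/d(delta_C) approximates eps_N/eps_C = 2,
# which is the kind of pathway-diagnostic trend the paper's model reproduces
slope = (delta_N[-1] - delta_N[0]) / (delta_C[-1] - delta_C[0])
```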

  19. Personalization of models with many model parameters: an efficient sensitivity analysis approach.

    Science.gov (United States)

    Donders, W P; Huberts, W; van de Vosse, F N; Delhaas, T

    2015-10-01

    Uncertainty quantification and global sensitivity analysis are indispensable for patient-specific applications of models that enhance diagnosis or aid decision-making. Variance-based sensitivity analysis methods, which apportion each fraction of the output uncertainty (variance) to the effects of individual input parameters or their interactions, are considered the gold standard. The variance portions are called the Sobol sensitivity indices and can be estimated by a Monte Carlo (MC) approach (e.g., Saltelli's method [1]) or by employing a metamodel (e.g., the (generalized) polynomial chaos expansion (gPCE) [2, 3]). All these methods require a large number of model evaluations when estimating the Sobol sensitivity indices for models with many parameters [4]. To reduce the computational cost, we introduce a two-step approach. In the first step, a subset of important parameters is identified for each output of interest using the screening method of Morris [5]. In the second step, a quantitative variance-based sensitivity analysis is performed using gPCE. Efficient sampling strategies are introduced to minimize the number of model runs required to obtain the sensitivity indices for models considering multiple outputs. The approach is tested using a model that was developed for predicting post-operative flows after creation of a vascular access for renal failure patients. We compare the sensitivity indices obtained with the novel two-step approach with those obtained from a reference analysis that applies Saltelli's MC method. The two-step approach was found to yield accurate estimates of the sensitivity indices at two orders of magnitude lower computational cost. Copyright © 2015 John Wiley & Sons, Ltd.
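
    The screening step can be illustrated with a minimal elementary-effects computation. The sketch below uses a radial one-at-a-time design (a simplified variant of Morris's trajectory design) on a toy linear model, not the vascular-access model of the paper.

```python
import numpy as np

# Sketch of the first (screening) step: elementary effects identify the
# parameters worth carrying into the quantitative (gPCE) step.

def morris_mu_star(model, n_params, n_traj=50, delta=0.1, rng=None):
    """Mean absolute elementary effect per parameter (mu*), using a radial
    one-at-a-time design on the unit hypercube."""
    rng = np.random.default_rng(rng)
    effects = np.zeros((n_traj, n_params))
    for t in range(n_traj):
        x = rng.uniform(0, 1 - delta, size=n_params)
        base = model(x)
        for i in range(n_params):            # perturb one parameter at a time
            x_step = x.copy()
            x_step[i] += delta
            effects[t, i] = (model(x_step) - base) / delta
    return np.abs(effects).mean(axis=0)

model = lambda x: 5.0 * x[0] + 0.2 * x[1] + 0.01 * x[2]   # toy model
mu_star = morris_mu_star(model, n_params=3, rng=0)
# mu* ranks x0 >> x1 >> x2, so only x0 (and perhaps x1) survive screening
```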

  20. Numerical modelling of carbonate platforms and reefs: approaches and opportunities

    Energy Technology Data Exchange (ETDEWEB)

    Dalmasso, H.; Montaggioni, L.F.; Floquet, M. [Universite de Provence, Marseille (France). Centre de Sedimentologie-Palaeontologie; Bosence, D. [Royal Holloway University of London, Egham (United Kingdom). Dept. of Geology

    2001-07-01

    This paper compares different computing procedures that have been utilized in simulating shallow-water carbonate platform development. Based on our geological knowledge we can usually give a rather accurate qualitative description of the mechanisms controlling geological phenomena. Further description requires the use of computer stratigraphic simulation models that allow quantitative evaluation and understanding of the complex interactions of sedimentary depositional carbonate systems. The roles of modelling include: (1) encouraging accuracy and precision in data collection and process interpretation (Watney et al., 1999); (2) providing a means to quantitatively test interpretations concerning the control of various mechanisms on producing sedimentary packages; (3) predicting or extrapolating results into areas of limited control; (4) gaining new insights regarding the interaction of parameters; (5) helping focus future studies to resolve specific problems. This paper addresses two main questions, namely: (1) What are the advantages and disadvantages of various types of models? (2) How well do models perform? In this paper we compare and discuss the application of six numerical models: CARBONATE (Bosence and Waltham, 1990), FUZZIM (Nordlund, 1999), CARBPLAT (Bosscher, 1992), DYNACARB (Li et al., 1993), PHIL (Bowman, 1997) and SEDPAK (Kendall et al., 1991). The comparison, testing and evaluation of these models allow one to gain a better understanding of the controlling parameters of carbonate platform development, which are necessary for modelling. Evaluating numerical models, critically comparing results from models using different approaches, and pushing experimental tests to their limits provide an effective vehicle to improve and develop new numerical models. A main feature of this paper is a close comparison of the performance of two numerical models: a forward model (CARBONATE) and a fuzzy logic model (FUZZIM). These two models use common

  1. Methodologies for Development of Patient Specific Bone Models from Human Body CT Scans

    Science.gov (United States)

    Chougule, Vikas Narayan; Mulay, Arati Vinayak; Ahuja, Bharatkumar Bhagatraj

    2016-06-01

    This work deals with the development of an algorithm for physical replication of patient-specific human bone and construction of corresponding implant/insert RP models by using a Reverse Engineering approach from non-invasive medical images for surgical purposes. In the medical field, volumetric data, i.e. voxel and triangular facet based models, are primarily used for bio-modelling and visualization, which requires huge memory space. On the other hand, recent advances in Computer Aided Design (CAD) technology provide additional facilities/functions for design, prototyping and manufacturing of any object having freeform surfaces based on boundary representation techniques. This work presents a process for physical replication of 3D rapid prototyping (RP) models of human bone from various CAD modeling techniques, developed by using 3D point cloud data obtained from non-invasive CT/MRI scans in DICOM 3.0 format. This point cloud data is used for construction of a 3D CAD model by fitting B-spline curves through these points and then fitting surfaces between these curve networks by using swept blend techniques. This process can also be achieved by generating the triangular mesh directly from 3D point cloud data without developing any surface model using any commercial CAD software. The STL file generated from 3D point cloud data is used as a basic input for the RP process. The Delaunay tetrahedralization approach is used to process the 3D point cloud data to obtain the STL file. CT scan data of a metacarpus (human bone) is used as the case study for the generation of the 3D RP model. A 3D physical model of the human bone is generated on a rapid prototyping machine and its virtual reality model is presented for visualization. The CAD models generated by different techniques are compared for accuracy and reliability. The results of this research work are assessed for clinical reliability in replication of human bone in the medical field.
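
    One building block of the CAD route, fitting a closed B-spline through the points of one slice, can be sketched with SciPy's FITPACK wrappers; the slice points below are synthetic stand-ins for CT contour data.

```python
import numpy as np
from scipy import interpolate

# Sketch: fit a closed parametric B-spline through one ordered slice of
# point-cloud data, then resample it densely for surface lofting.

theta = np.linspace(0, 2 * np.pi, 30, endpoint=False)
x = 10 * np.cos(theta) + 0.2 * np.cos(5 * theta)   # synthetic bone-like outline
y = 8 * np.sin(theta) + 0.2 * np.sin(5 * theta)

# close the contour (the periodic fit expects first point == last point)
xc, yc = np.r_[x, x[0]], np.r_[y, y[0]]

# s=0 -> interpolating spline; per=True -> periodic (closed) curve
tck, u = interpolate.splprep([xc, yc], s=0, per=True)

# resample the curve densely, e.g. for lofting surfaces between slices
u_fine = np.linspace(0, 1, 200)
x_f, y_f = interpolate.splev(u_fine, tck)
```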

  2. A Sampling Based Approach to Spacecraft Autonomous Maneuvering with Safety Specifications

    Science.gov (United States)

    Starek, Joseph A.; Barbee, Brent W.; Pavone, Marco

    2015-01-01

    This paper presents a method for safe spacecraft autonomous maneuvering that applies robotic motion-planning techniques to spacecraft control. Specifically, the scenario we consider is an in-plane rendezvous of a chaser spacecraft in proximity to a target spacecraft at the origin of the Clohessy-Wiltshire-Hill frame. The trajectory for the chaser spacecraft is generated in a receding-horizon fashion by executing a sampling-based robotic motion planning algorithm named Fast Marching Trees (FMT), which efficiently grows a tree of trajectories over a set of probabilistically drawn samples in the state space. To enforce safety, the tree is only grown over actively safe samples, for which there exists a one-burn collision-avoidance maneuver that circularizes the spacecraft orbit along a collision-free coasting arc and that can be executed under potential thruster failures. The overall approach establishes a provably correct framework for the systematic encoding of safety specifications into the spacecraft trajectory generation process and appears amenable to real-time implementation on orbit. Simulation results are presented for a two-fault-tolerant spacecraft during autonomous approach to a single client in Low Earth Orbit.
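
    The passive-safety idea (a coasting arc that stays collision-free) can be illustrated with the closed-form in-plane Clohessy-Wiltshire solution. The mean motion, initial state and keep-out radius below are illustrative values, not numbers from the paper.

```python
import numpy as np

# Sketch of the safety check behind "actively safe" sampling: propagate the
# in-plane Clohessy-Wiltshire (CW) dynamics of a coasting chaser and verify
# it stays outside a keep-out sphere around the target (at the origin).

def cw_state(t, n, x0, y0, vx0, vy0):
    """Closed-form in-plane CW solution (x radial, y along-track)."""
    s, c = np.sin(n * t), np.cos(n * t)
    x = (4 - 3 * c) * x0 + (s / n) * vx0 + (2 / n) * (1 - c) * vy0
    y = (6 * (s - n * t) * x0 + y0
         - (2 / n) * (1 - c) * vx0 + (4 * s - 3 * n * t) / n * vy0)
    return x, y

n = 0.0011                      # mean motion for LEO, rad/s (approximate)
t = np.linspace(0, 5400, 500)   # roughly one orbit of coasting

# chaser parked 200 m behind the target along-track, zero relative velocity
x, y = cw_state(t, n, 0.0, -200.0, 0.0, 0.0)
dist = np.hypot(x, y)
passively_safe = bool(dist.min() > 50.0)   # 50 m keep-out radius
```

    A pure along-track offset with zero relative velocity is a stationary point of the CW dynamics, so this particular arc keeps a constant 200 m separation.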

  3. A novel patient-specific model to compute coronary fractional flow reserve.

    Science.gov (United States)

    Kwon, Soon-Sung; Chung, Eui-Chul; Park, Jin-Seo; Kim, Gook-Tae; Kim, Jun-Woo; Kim, Keun-Hong; Shin, Eun-Seok; Shim, Eun Bo

    2014-09-01

    The fractional flow reserve (FFR) is a widely used clinical index to evaluate the functional severity of coronary stenosis. A computer simulation method based on patients' computed tomography (CT) data is a plausible non-invasive approach for computing the FFR. This method can provide a detailed solution for the stenosed coronary hemodynamics by coupling computational fluid dynamics (CFD) with the lumped parameter model (LPM) of the cardiovascular system. In this work, we have implemented a simple computational method to compute the FFR. As this method uses only coronary arteries for the CFD model and includes only the LPM of the coronary vascular system, it provides simpler boundary conditions for the coronary geometry and is computationally more efficient than existing approaches. To test the efficacy of this method, we simulated a three-dimensional straight vessel using CFD coupled with the LPM. The computed results were compared with those of the LPM. To validate this method in terms of clinically realistic geometry, a patient-specific model of stenosed coronary arteries was constructed from CT images, and the computed FFR was compared with clinically measured results. We evaluated the effect of a model aorta on the computed FFR and compared this with a model without the aorta. Computationally, the model without the aorta was more efficient than that with the aorta, reducing the CPU time required for computing a cardiac cycle to 43.4%. Copyright © 2014. Published by Elsevier Ltd.
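
    The index itself is simple: FFR is the ratio of mean distal coronary pressure to mean aortic pressure under hyperemia. The sketch below evaluates it on a toy stenosis pressure-drop law; the loss coefficients and flow are illustrative, not outputs of the paper's CFD-LPM coupling.

```python
# Sketch of the FFR definition on top of a toy stenosis pressure-drop model.

def stenosis_pressure_drop(q, k_v=0.03, k_t=0.002):
    """Pressure loss (mmHg) across a stenosis for flow q (mL/s):
    linear viscous term plus quadratic turbulent/expansion term."""
    return k_v * q + k_t * q * q

p_aorta = 90.0          # mean aortic pressure, mmHg (illustrative)
q_hyperemic = 100.0     # hyperemic coronary flow, mL/s (illustrative)

p_distal = p_aorta - stenosis_pressure_drop(q_hyperemic)
ffr = p_distal / p_aorta
# FFR <= 0.80 is the usual cut-off for a functionally significant stenosis
```

    With these toy numbers the drop is 23 mmHg, giving FFR of about 0.74, i.e. below the 0.80 cut-off.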

  4. A Residual Approach for Balanced Truncation Model Reduction (BTMR) of Compartmental Systems

    Directory of Open Access Journals (Sweden)

    William La Cruz

    2014-05-01

    Full Text Available This paper presents a residual approach to the square-root balanced truncation algorithm for model order reduction of continuous, linear and time-invariant compartmental systems. Specifically, the new approach uses a residual method to approximate the controllability and observability gramians, whose computation is an essential, and computationally costly, step of the square-root balanced truncation algorithm. Numerical experiments are included to highlight the efficacy of the proposed approach.
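
    For reference, the exact (non-residual) square-root balanced truncation that the paper accelerates can be sketched directly with Lyapunov solves; the 3-compartment system below is illustrative.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, cholesky, svd

# Sketch of square-root balanced truncation: exact gramians via Lyapunov
# solves, then projection onto the dominant Hankel directions.

A = np.array([[-2.0, 1.0, 0.0],
              [1.0, -3.0, 1.0],
              [0.0, 1.0, -2.0]])     # stable 3-compartment system (toy)
B = np.array([[1.0], [0.0], [0.0]])
C = np.array([[0.0, 0.0, 1.0]])

# gramians: A Wc + Wc A' + B B' = 0 and A' Wo + Wo A + C' C = 0
Wc = solve_continuous_lyapunov(A, -B @ B.T)
Wo = solve_continuous_lyapunov(A.T, -C.T @ C)

Lc = cholesky(Wc, lower=True)        # square-root factors
Lo = cholesky(Wo, lower=True)
U, hsv, Vt = svd(Lo.T @ Lc)          # hsv = Hankel singular values

r = 2                                # truncation order
S = np.diag(hsv[:r] ** -0.5)
T = Lc @ Vt[:r].T @ S                # right projection
Ti = S @ U[:, :r].T @ Lo.T           # left projection

Ar, Br, Cr = Ti @ A @ T, Ti @ B, C @ T
```

    The H-infinity error of the reduced model is bounded by twice the sum of the discarded Hankel singular values, which is what makes the truncation order a principled choice.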

  5. Quantification of Cooperativity in Heterodimer-DNA Binding Improves the Accuracy of Binding Specificity Models*

    Science.gov (United States)

    Isakova, Alina; Berset, Yves; Hatzimanikatis, Vassily; Deplancke, Bart

    2016-01-01

    Many transcription factors (TFs) have the ability to cooperate on DNA elements as heterodimers. Despite the significance of TF heterodimerization for gene regulation, a quantitative understanding of cooperativity between various TF dimer partners and its impact on heterodimer DNA binding specificity models is still lacking. Here, we used a novel integrative approach, combining microfluidics-steered measurements of dimer-DNA assembly with mechanistic modeling of the implicated protein-protein-DNA interactions to quantitatively interrogate the cooperative DNA binding behavior of the adipogenic peroxisome proliferator-activated receptor γ (PPARγ):retinoid X receptor α (RXRα) heterodimer. Using the high throughput MITOMI (mechanically induced trapping of molecular interactions) platform, we derived equilibrium DNA binding data for PPARγ, RXRα, as well as the PPARγ:RXRα heterodimer to more than 300 target DNA sites and variants thereof. We then quantified cooperativity underlying heterodimer-DNA binding and derived an integrative heterodimer DNA binding constant. Using this cooperativity-inclusive constant, we were able to build a heterodimer-DNA binding specificity model that has superior predictive power than the one based on a regular one-site equilibrium. Our data further revealed that individual nucleotide substitutions within the target site affect the extent of cooperativity in PPARγ:RXRα-DNA binding. Our study therefore emphasizes the importance of assessing cooperativity when generating DNA binding specificity models for heterodimers. PMID:26912662
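
    The notion of a cooperativity-inclusive binding constant can be illustrated with a minimal one-site occupancy model in which the heterodimer affinity is the product of the monomer affinities scaled by a cooperativity factor; all constants below are illustrative, not MITOMI-derived values.

```python
# Sketch: equilibrium occupancy of a DNA site by a heterodimer whose
# "integrative" association constant folds in a cooperativity factor omega.

def fraction_bound(conc_dimer, k_p, k_r, omega):
    """One-site occupancy with K_het = omega * k_p * k_r."""
    k_het = omega * k_p * k_r
    return k_het * conc_dimer / (1.0 + k_het * conc_dimer)

k_p, k_r = 1e3, 1e3     # monomer association constants (1/M), illustrative
conc = 1e-6             # free dimer concentration (M), illustrative

theta_independent = fraction_bound(conc, k_p, k_r, omega=1.0)    # no cooperativity
theta_cooperative = fraction_bound(conc, k_p, k_r, omega=50.0)   # positive cooperativity
# positive cooperativity raises occupancy at the same dimer concentration,
# which is why omitting it biases a specificity model
```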

  6. Quantification of Cooperativity in Heterodimer-DNA Binding Improves the Accuracy of Binding Specificity Models.

    Science.gov (United States)

    Isakova, Alina; Berset, Yves; Hatzimanikatis, Vassily; Deplancke, Bart

    2016-05-06

    Many transcription factors (TFs) have the ability to cooperate on DNA elements as heterodimers. Despite the significance of TF heterodimerization for gene regulation, a quantitative understanding of cooperativity between various TF dimer partners and its impact on heterodimer DNA binding specificity models is still lacking. Here, we used a novel integrative approach, combining microfluidics-steered measurements of dimer-DNA assembly with mechanistic modeling of the implicated protein-protein-DNA interactions to quantitatively interrogate the cooperative DNA binding behavior of the adipogenic peroxisome proliferator-activated receptor γ (PPARγ):retinoid X receptor α (RXRα) heterodimer. Using the high throughput MITOMI (mechanically induced trapping of molecular interactions) platform, we derived equilibrium DNA binding data for PPARγ, RXRα, as well as the PPARγ:RXRα heterodimer to more than 300 target DNA sites and variants thereof. We then quantified cooperativity underlying heterodimer-DNA binding and derived an integrative heterodimer DNA binding constant. Using this cooperativity-inclusive constant, we were able to build a heterodimer-DNA binding specificity model that has superior predictive power than the one based on a regular one-site equilibrium. Our data further revealed that individual nucleotide substitutions within the target site affect the extent of cooperativity in PPARγ:RXRα-DNA binding. Our study therefore emphasizes the importance of assessing cooperativity when generating DNA binding specificity models for heterodimers. © 2016 by The American Society for Biochemistry and Molecular Biology, Inc.

  7. Systems approaches to computational modeling of the oral microbiome

    Directory of Open Access Journals (Sweden)

    Dimiter V. Dimitrov

    2013-07-01

    Full Text Available Current microbiome research has generated tremendous amounts of data providing snapshots of molecular activity in a variety of organisms, environments, and cell types. However, turning this knowledge into a whole-system level of understanding of pathways and processes has proven to be a challenging task. In this review we highlight the applicability of bioinformatics and visualization techniques to large collections of data in order to better understand the information they contain about diet-oral microbiome-host mucosal transcriptome interactions. In particular we focus on systems biology of Porphyromonas gingivalis in the context of high-throughput computational methods tightly integrated with translational systems medicine. These approaches have applications ranging from basic research, where we can direct specific laboratory experiments in model organisms and cell cultures, to human disease, where we can validate new mechanisms and biomarkers for prevention and treatment of chronic disorders

  8. A systemic approach to modelling of radiobiological effects

    International Nuclear Information System (INIS)

    Obaturov, G.M.

    1988-01-01

    Basic principles of the systemic approach to modelling of the radiobiological effects at different levels of cell organization have been formulated. The methodology is proposed for theoretical modelling of the effects at these levels

  9. Serpentinization reaction pathways: implications for modeling approach

    Energy Technology Data Exchange (ETDEWEB)

    Janecky, D.R.

    1986-01-01

    Experimental seawater-peridotite reaction pathways to form serpentinites at 300 °C, 500 bars, can be accurately modeled using the EQ3/6 codes in conjunction with thermodynamic and kinetic data from the literature and unpublished compilations. These models provide both confirmation of experimental interpretations and more detailed insight into hydrothermal reaction processes within the oceanic crust. The accuracy of these models depends on careful evaluation of the aqueous speciation model, use of mineral compositions that closely reproduce compositions in the experiments, and definition of realistic reactive components in terms of composition, thermodynamic data, and reaction rates.

  10. Consumer preference models: fuzzy theory approach

    Science.gov (United States)

    Turksen, I. B.; Wilson, I. A.

    1993-12-01

    Consumer preference models are widely used in new product design, marketing management, pricing and market segmentation. The purpose of this article is to develop and test a fuzzy set preference model which can represent linguistic variables in individual-level models implemented in parallel with existing conjoint models. The potential improvements in market share prediction and predictive validity can substantially improve management decisions about what to make (product design), for whom to make it (market segmentation) and how much to make (market share prediction).
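
    The basic ingredient of such a model, a linguistic variable represented by a fuzzy membership function, can be sketched as follows; the breakpoints are illustrative.

```python
# Sketch: triangular fuzzy membership functions for a linguistic variable
# ("price"), the kind of term an individual-level fuzzy preference model
# combines with conjoint part-worths.

def triangular(x, a, b, c):
    """Triangular membership: 0 at a, peak 1 at b, back to 0 at c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# linguistic terms for "price" in dollars (breakpoints illustrative)
low = lambda p: triangular(p, 0.0, 25.0, 50.0)
medium = lambda p: triangular(p, 25.0, 50.0, 75.0)

# a price of $40 is partly "low" and partly "medium" at the same time,
# which is exactly what a crisp (non-fuzzy) coding cannot express
mu_low, mu_medium = low(40.0), medium(40.0)
```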

  11. The technique for 3D printing patient-specific models for auricular reconstruction.

    Science.gov (United States)

    Flores, Roberto L; Liss, Hannah; Raffaelli, Samuel; Humayun, Aiza; Khouri, Kimberly S; Coelho, Paulo G; Witek, Lukasz

    2017-06-01

    Currently, surgeons approach autogenous microtia repair by creating a two-dimensional (2D) tracing of the unaffected ear to approximate a three-dimensional (3D) construct, a difficult process. To address these shortcomings, this study introduces the fabrication of a patient-specific, sterilizable, 3D-printed auricular model for autogenous auricular reconstruction. A high-resolution 3D digital photograph was captured of the patient's unaffected ear and surrounding anatomic structures. The photographs were exported and uploaded into Amira for transformation into a digital (.stl) model, which was imported into Blender, an open source software platform for digital modification of data. The unaffected auricle was digitally isolated and inverted to render a model for the contralateral side. The scapha, triangular fossa, and cymba were deepened to accentuate their contours. Extra relief was added to the helical root to further distinguish this structure. The ear was then digitally deconstructed and separated into its individual auricular components for reconstruction. The completed ear and its individual components were 3D printed using polylactic acid filament and sterilized following manufacturer specifications. The sterilized models were brought to the operating room to be utilized by the surgeon. The models allowed for more accurate anatomic measurements compared to 2D tracings, which reduced the degree of estimation required by surgeons. Approximately 20 g of the PLA filament were utilized for the construction of these models, yielding a total material cost of approximately $1. Using the methodology detailed in this report, as well as departmentally available resources (3D digital photography and 3D printing), a sterilizable, patient-specific, and inexpensive 3D auricular model was fabricated to be used intraoperatively. This technique of printing customized-to-patient models for surgeons to use as 'guides' shows great promise. Copyright © 2017 European
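
    The "digitally inverted" step is a simple reflection across the sagittal plane. A minimal sketch on a toy point cloud (coordinates illustrative):

```python
import numpy as np

# Sketch: mirror the healthy ear's point cloud across the x = 0 (sagittal)
# plane to obtain a template for the contralateral side.

def mirror_sagittal(points):
    """Reflect an (N, 3) point cloud across the x = 0 plane."""
    reflected = points.copy()
    reflected[:, 0] *= -1.0
    return reflected

right_ear = np.array([[12.0, 4.0, 7.0],
                      [11.5, 5.2, 6.1]])   # toy surface points, mm
left_template = mirror_sagittal(right_ear)
# reflection is an isometry, so all inter-point distances are preserved
```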

  12. Modelling energy demand of developing countries: Are the specific features adequately captured?

    International Nuclear Information System (INIS)

    Bhattacharyya, Subhes C.; Timilsina, Govinda R.

    2010-01-01

    This paper critically reviews existing energy demand forecasting methodologies highlighting the methodological diversities and developments over the past four decades in order to investigate whether the existing energy demand models are appropriate for capturing the specific features of developing countries. The study finds that two types of approaches, econometric and end-use accounting, are commonly used in the existing energy demand models. Although energy demand models have greatly evolved since the early seventies, key issues such as the poor-rich and urban-rural divides, traditional energy resources and differentiation between commercial and non-commercial energy commodities are often poorly reflected in these models. While the end-use energy accounting models with detailed sectoral representations produce more realistic projections as compared to the econometric models, they still suffer from huge data deficiencies especially in developing countries. Development and maintenance of more detailed energy databases, further development of models to better reflect developing country context and institutionalizing the modelling capacity in developing countries are the key requirements for energy demand modelling to deliver richer and more reliable input to policy formulation in developing countries.

  13. A modeling approach for compounds affecting body composition.

    Science.gov (United States)

    Gennemark, Peter; Jansson-Löfmark, Rasmus; Hyberg, Gina; Wigstrand, Maria; Kakol-Palm, Dorota; Håkansson, Pernilla; Hovdal, Daniel; Brodin, Peter; Fritsch-Fredin, Maria; Antonsson, Madeleine; Ploj, Karolina; Gabrielsson, Johan

    2013-12-01

    Body composition and body mass are pivotal clinical endpoints in studies of welfare diseases. We present a combined effort of established and new mathematical models based on rigorous monitoring of energy intake (EI) and body mass in mice. Specifically, we parameterize a mechanistic turnover model based on the law of energy conservation coupled to a drug mechanism model. Key model variables are fat-free mass (FFM) and fat mass (FM), governed by EI and energy expenditure (EE). An empirical Forbes curve relating FFM to FM was derived experimentally for female C57BL/6 mice. The Forbes curve differs from a previously reported curve for male C57BL/6 mice, and we thoroughly analyse how the choice of Forbes curve impacts model predictions. The drug mechanism function acts on EI or EE, or both. Drug mechanism parameters (two to three parameters) and system parameters (up to six free parameters) could be estimated with good precision (low coefficients of variation). The model was then used to predict body mass and FM changes at different drug provocations using a similar model for man. Surprisingly, model simulations indicate that an increase in EE (e.g. 10%) was more efficient than an equal lowering of EI. Also, the relative change in body mass and FM is greater in man than in mouse at the same relative change in either EI or EE. We acknowledge that this assumes the same drug mechanism impact across the two species. A set of recommendations regarding the Forbes curve, vehicle control groups, dual action on EI and loss, and translational aspects are discussed. This quantitative approach significantly improves data interpretation, disease system understanding, safety assessment and translation across species.
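
    The energy-conservation backbone of such turnover models can be sketched as a small simulation; every constant below (tissue energy densities, the allometric expenditure law, the Forbes-type partition) is an illustrative assumption, not a parameter estimated in the paper.

```python
# Sketch: energy stores change as intake minus expenditure, with the
# imbalance partitioned between fat mass (FM) and fat-free mass (FFM)
# by a Forbes-type rule. All constants are illustrative assumptions.

rho_fm, rho_ffm = 39.5, 7.6   # kJ per g of tissue (approximate literature values)

def simulate(ei_kj_per_day, days, fm0=5.0, ffm0=20.0, dt=0.1):
    """Forward-Euler simulation of FM/FFM (grams) in a mouse-sized animal."""
    fm, ffm = fm0, ffm0
    for _ in range(int(days / dt)):
        ee = 8.0 * (fm + ffm) ** 0.75      # toy allometric expenditure, kJ/day
        imbalance = ei_kj_per_day - ee     # kJ/day stored (+) or mobilized (-)
        p = 2.0 / (2.0 + fm)               # Forbes-type partition toward FFM
        ffm = max(ffm + dt * p * imbalance / rho_ffm, 0.0)
        fm = max(fm + dt * (1.0 - p) * imbalance / rho_fm, 0.0)
    return fm, ffm

fm_low, ffm_low = simulate(ei_kj_per_day=40.0, days=30)   # hypocaloric intake
fm_ref, ffm_ref = simulate(ei_kj_per_day=90.0, days=30)   # near maintenance
# the hypocaloric arm should end with less fat mass than the reference arm
```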

  14. Modelling energy demand of developing countries: Are the specific features adequately captured?

    Energy Technology Data Exchange (ETDEWEB)

    Bhattacharyya, Subhes C. [CEPMLP, University of Dundee, Dundee DD1 4HN (United Kingdom); Timilsina, Govinda R. [Development Research Group, The World Bank, Washington DC (United States)

    2010-04-15

    This paper critically reviews existing energy demand forecasting methodologies highlighting the methodological diversities and developments over the past four decades in order to investigate whether the existing energy demand models are appropriate for capturing the specific features of developing countries. The study finds that two types of approaches, econometric and end-use accounting, are commonly used in the existing energy demand models. Although energy demand models have greatly evolved since the early seventies, key issues such as the poor-rich and urban-rural divides, traditional energy resources and differentiation between commercial and non-commercial energy commodities are often poorly reflected in these models. While the end-use energy accounting models with detailed sectoral representations produce more realistic projections as compared to the econometric models, they still suffer from huge data deficiencies especially in developing countries. Development and maintenance of more detailed energy databases, further development of models to better reflect developing country context and institutionalizing the modelling capacity in developing countries are the key requirements for energy demand modelling to deliver richer and more reliable input to policy formulation in developing countries. (author)

  15. PRODUCT TRIAL PROCESSING (PTP): A MODEL APPROACH ...

    African Journals Online (AJOL)

    Admin

    This study is a theoretical approach to consumers' processing of product trial, and equally explores ... consumer's first usage experience with a company's brand or product that is most important in determining ... product, what it is really marketing is the expected ..... confidence, thus there is a positive relationship between ...

  16. Targeting lysine specific demethylase 4A (KDM4A) tandem TUDOR domain - A fragment based approach.

    Science.gov (United States)

    Upadhyay, Anup K; Judge, Russell A; Li, Leiming; Pithawalla, Ron; Simanis, Justin; Bodelle, Pierre M; Marin, Violeta L; Henry, Rodger F; Petros, Andrew M; Sun, Chaohong

    2018-06-01

    The tandem TUDOR domains present in the non-catalytic C-terminal half of the KDM4A, 4B and 4C enzymes play important roles in regulating their chromatin localizations and substrate specificities. They achieve this regulatory role by binding to different tri-methylated lysine residues on histone H3 (H3-K4me3, H3-K23me3) and histone H4 (H4-K20me3) depending upon the specific chromatin environment. In this work, we have used a 2D-NMR based fragment screening approach to identify a novel fragment (1a), which binds to the KDM4A-TUDOR domain and shows modest competition with H3-K4me3 binding in biochemical as well as in vitro cell based assays. A co-crystal structure of KDM4A TUDOR domain in complex with 1a shows that the fragment binds stereo-specifically to the methyl lysine binding pocket forming a network of strong hydrogen bonds and hydrophobic interactions. We anticipate that the fragment 1a can be further developed into a novel allosteric inhibitor of the KDM4 family of enzymes through targeting their C-terminal tandem TUDOR domain. Copyright © 2018 Elsevier Ltd. All rights reserved.

  17. Bus drivers' exposure to bullying at work: an occupation-specific approach.

    Science.gov (United States)

    Glasø, Lars; Bele, Edvard; Nielsen, Morten Birkeland; Einarsen, Ståle

    2011-10-01

    The present study employs an occupation-specific approach to examine bus drivers' exposure to bullying and their trait anger, job engagement, job satisfaction and turnover intentions. A total of 1,023 bus drivers from a large public transport organization participated in the study. The findings show that bus driving can be a high risk occupation with regard to bullying, since 70% of the bus drivers had experienced one or more acts typical of bullying during the last six months. As many as 11% defined themselves as victims of bullying, 33% of whom (i.e. 3.6% of the total sample) see themselves as victims of frequent bullying. Colleagues were most frequently reported as perpetrators. Exposure to bullying was negatively related to job engagement and job satisfaction and positively related to turnover intentions. Job engagement and job satisfaction mediated the relationship between bullying and intention to leave, respectively. Trait anger had an interaction effect on the relationship between bullying and turnover intentions. This study indicates that workplace bullying has context-specific aspects that require increased use of context-specific policies and intervention methods. © 2011 The Authors. Scandinavian Journal of Psychology © 2011 The Scandinavian Psychological Associations.

  18. Nonlinear Modeling of the PEMFC Based On NNARX Approach

    OpenAIRE

    Shan-Jen Cheng; Te-Jen Chang; Kuang-Hsiung Tan; Shou-Ling Kuo

    2015-01-01

    A Polymer Electrolyte Membrane Fuel Cell (PEMFC) is a time-varying nonlinear dynamic system. The traditional linear modeling approach makes it hard to estimate the structure of a PEMFC system correctly. For this reason, this paper presents a nonlinear model of the PEMFC using the Neural Network Auto-regressive model with eXogenous inputs (NNARX) approach. The multilayer perceptron (MLP) network is applied to evaluate the structure of the NNARX model of the PEMFC. The validity and accurac...
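    The NNARX idea is to predict the next output from a window of past outputs and inputs, with a neural network as the nonlinear map. The sketch below is illustrative only: the toy dynamic system, lag orders, and network size are assumptions standing in for the paper's PEMFC data.

```python
import numpy as np

def make_arx_regressors(y, u, na=2, nb=2):
    """Build NNARX regressors: x(t) = [y(t-1)..y(t-na), u(t-1)..u(t-nb)]."""
    n = max(na, nb)
    X = [np.concatenate([y[t - na:t][::-1], u[t - nb:t][::-1]])
         for t in range(n, len(y))]
    return np.array(X), y[n:]

# Toy nonlinear dynamic system standing in for measured PEMFC data
rng = np.random.default_rng(1)
u = rng.uniform(-1, 1, 500)
y = np.zeros(500)
for t in range(2, 500):
    y[t] = 0.5 * y[t - 1] - 0.2 * y[t - 2] + np.tanh(u[t - 1]) + 0.1 * u[t - 2]

X, target = make_arx_regressors(y, u)

# One-hidden-layer perceptron (the "NN" in NNARX), full-batch gradient descent
W1 = rng.normal(0.0, 0.5, (X.shape[1], 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.5, 16); b2 = 0.0
lr = 0.05
for _ in range(3000):
    h = np.tanh(X @ W1 + b1)
    err = h @ W2 + b2 - target            # prediction error on the batch
    gh = np.outer(err, W2) * (1.0 - h ** 2)  # backprop through tanh layer
    W2 -= lr * h.T @ err / len(err); b2 -= lr * err.mean()
    W1 -= lr * X.T @ gh / len(err); b1 -= lr * gh.mean(axis=0)

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - target) ** 2))
```

In practice the lag orders na and nb and the hidden-layer size are themselves design choices, usually selected by validating one-step-ahead prediction error on held-out data.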

  19. Development of a Conservative Model Validation Approach for Reliable Analysis

    Science.gov (United States)

    2015-01-01

    CIE 2015, August 2-5, 2015, Boston, Massachusetts, USA [DRAFT] DETC2015-46982. ... obtain a conservative simulation model for reliable design even with limited experimental data. Very little research has taken into account the ... In Section 3, the proposed conservative model validation is briefly compared to the conventional model validation approach. Section 4 describes how to account

  20. New Approaches in Reusable Booster System Life Cycle Cost Modeling

    Science.gov (United States)

    Zapata, Edgar

    2013-01-01

    This paper presents the results of a 2012 life cycle cost (LCC) study of hybrid Reusable Booster Systems (RBS) conducted by NASA Kennedy Space Center (KSC) and the Air Force Research Laboratory (AFRL). The work included the creation of a new cost estimating model and an LCC analysis, building on past work where applicable, but emphasizing the integration of new approaches in life cycle cost estimation. Specifically, the inclusion of industry processes/practices and indirect costs were a new and significant part of the analysis. The focus of LCC estimation has traditionally been from the perspective of technology, design characteristics, and related factors such as reliability. Technology has informed the cost related support to decision makers interested in risk and budget insight. This traditional emphasis on technology occurs even though it is well established that complex aerospace systems costs are mostly about indirect costs, with likely only partial influence in these indirect costs being due to the more visible technology products. Organizational considerations, processes/practices, and indirect costs are traditionally derived ("wrapped") only by relationship to tangible product characteristics. This traditional approach works well as long as it is understood that no significant changes, and by relation no significant improvements, are being pursued in the area of either the government acquisition or industry's indirect costs. In this sense then, most launch systems cost models ignore most costs. The alternative was implemented in this LCC study, whereby the approach considered technology and process/practices in balance, with as much detail for one as the other. This RBS LCC study has avoided point-designs, for now, instead emphasizing exploring the trade-space of potential technology advances joined with potential process/practice advances. Given the range of decisions, and all their combinations, it was necessary to create a model of the original model

  2. Comparison of two novel approaches to model fibre reinforced concrete

    NARCIS (Netherlands)

    Radtke, F.K.F.; Simone, A.; Sluys, L.J.

    2009-01-01

    We present two approaches to model fibre reinforced concrete. In both approaches, discrete fibre distributions and the behaviour of the fibre-matrix interface are explicitly considered. One approach employs the reaction forces from fibre to matrix while the other is based on the partition of unity

  3. Statistical modelling approach to derive quantitative nanowastes classification index; estimation of nanomaterials exposure

    CSIR Research Space (South Africa)

    Ntaka, L

    2013-08-01

    Full Text Available. In this work, a statistical inference approach, specifically non-parametric bootstrapping and a linear model, was applied. Data used to develop the model were sourced from the literature. 104 data points with information on aggregation, natural organic matter...
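    Non-parametric bootstrapping of the kind mentioned here resamples the observed data with replacement and recomputes the statistic of interest on each resample, giving a sampling distribution without distributional assumptions. A generic sketch (the sample below is synthetic, not the nanowaste dataset):

```python
import numpy as np

def bootstrap_ci(data, stat, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for statistic `stat`:
    resample with replacement, recompute the statistic, and take the
    alpha/2 and 1-alpha/2 quantiles of the replicates."""
    rng = np.random.default_rng(seed)
    n = len(data)
    reps = np.array([stat(data[rng.integers(0, n, n)]) for _ in range(n_boot)])
    lo, hi = np.quantile(reps, [alpha / 2, 1 - alpha / 2])
    return float(lo), float(hi)

# Synthetic sample standing in for the 104 literature data points
rng = np.random.default_rng(42)
sample = rng.normal(5.0, 2.0, 200)
lo, hi = bootstrap_ci(sample, np.mean)
```

The same function works for any statistic, e.g. a regression coefficient, by passing a different `stat` callable.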

  4. Specific modes of vibratory technological machines: mathematical models, peculiarities of interaction of system elements

    Science.gov (United States)

    Eliseev, A. V.; Sitov, I. S.; Eliseev, S. V.

    2018-03-01

    The methodological basis of constructing mathematical models of vibratory technological machines is developed in the article. An approach is proposed that makes it possible to introduce a vibration table in a specific mode that provides conditions for the dynamic damping of oscillations for the zone of placement of a vibration exciter while providing specified vibration parameters in the working zone of the vibration table. The aim of the work is to develop methods of mathematical modeling, oriented to technological processes with long cycles. The technologies of structural mathematical modeling are used with structural schemes, transfer functions and amplitude-frequency characteristics. The concept of the work is to test the possibilities of combining the conditions for reducing loads with working components of a vibration exciter while simultaneously maintaining sufficiently wide limits in varying the parameters of the vibrational field.

  5. Modeling thrombin generation: plasma composition based approach.

    Science.gov (United States)

    Brummel-Ziedins, Kathleen E; Everse, Stephen J; Mann, Kenneth G; Orfeo, Thomas

    2014-01-01

    Thrombin has multiple functions in blood coagulation and its regulation is central to maintaining the balance between hemorrhage and thrombosis. Empirical and computational methods that capture thrombin generation can provide advancements to current clinical screening of the hemostatic balance at the level of the individual. In any individual, procoagulant and anticoagulant factor levels together act to generate a unique coagulation phenotype (net balance) that is reflective of the sum of its developmental, environmental, genetic, nutritional and pharmacological influences. Defining such thrombin phenotypes may provide a means to track disease progression pre-crisis. In this review we briefly describe thrombin function, methods for assessing thrombin dynamics as a phenotypic marker, computationally derived thrombin phenotypes versus determined clinical phenotypes, the boundaries of normal range thrombin generation using plasma composition based approaches and the feasibility of these approaches for predicting risk.

  6. A Simulation Modeling Approach Method Focused on the Refrigerated Warehouses Using Design of Experiment

    Science.gov (United States)

    Cho, G. S.

    2017-09-01

    For performance optimization of Refrigerated Warehouses, design parameters are selected based on physical parameters, such as the number of equipment and aisles and the speeds of forklifts, for ease of modification. This paper provides a comprehensive framework approach for the system design of Refrigerated Warehouses. We propose a modeling approach which aims at simulation optimization so as to meet required design specifications using the Design of Experiments (DOE), and analyze a simulation model using an integrated aspect-oriented modeling approach (i-AOMA). As a result, this suggested method can evaluate the performance of a variety of Refrigerated Warehouse operations.
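    A DOE study of this kind enumerates combinations of factor levels and runs the simulation at each design point. The sketch below is purely illustrative: the factors, levels, and the stand-in throughput function are invented for demonstration and are not taken from the paper.

```python
from itertools import product

# Hypothetical design factors for a refrigerated-warehouse simulation
factors = {
    "num_forklifts": [2, 4, 6],
    "num_aisles": [8, 12],
    "forklift_speed": [1.0, 1.5],  # m/s
}

def simulate_throughput(num_forklifts, num_aisles, forklift_speed):
    """Stand-in for the warehouse simulation: pallets handled per hour."""
    travel = num_aisles * 30.0 / forklift_speed      # travel time per cycle (s)
    return num_forklifts * 3600.0 / (travel + 45.0)  # 45 s fixed handling time

# Full factorial design: every combination of factor levels
design = [dict(zip(factors, levels)) for levels in product(*factors.values())]
results = [(simulate_throughput(**run), run) for run in design]
best_throughput, best_run = max(results, key=lambda row: row[0])
```

With three factors at 3, 2, and 2 levels this yields a 12-run full factorial; fractional designs would be used when the number of factors makes the full enumeration too expensive.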

  7. 3D Modelling and Printing Technology to Produce Patient-Specific 3D Models.

    Science.gov (United States)

    Birbara, Nicolette S; Otton, James M; Pather, Nalini

    2017-11-10

    A comprehensive knowledge of mitral valve (MV) anatomy is crucial in the assessment of MV disease. While the use of three-dimensional (3D) modelling and printing in MV assessment has undergone early clinical evaluation, the precision and usefulness of this technology requires further investigation. This study aimed to assess and validate 3D modelling and printing technology to produce patient-specific 3D MV models. A prototype method for MV 3D modelling and printing was developed from computed tomography (CT) scans of a plastinated human heart. Mitral valve models were printed using four 3D printing methods and validated to assess precision. Cardiac CT and 3D echocardiography imaging data of four MV disease patients was used to produce patient-specific 3D printed models, and 40 cardiac health professionals (CHPs) were surveyed on the perceived value and potential uses of 3D models in a clinical setting. The prototype method demonstrated submillimetre precision for all four 3D printing methods used, and statistical analysis showed a significant difference. 3D printed models, particularly using multiple print materials, were considered useful by CHPs for preoperative planning, as well as other applications such as teaching and training. This study suggests that, with further advances in 3D modelling and printing technology, patient-specific 3D MV models could serve as a useful clinical tool. The findings also highlight the potential of this technology to be applied in a variety of medical areas within both clinical and educational settings. Copyright © 2017 Australian and New Zealand Society of Cardiac and Thoracic Surgeons (ANZSCTS) and the Cardiac Society of Australia and New Zealand (CSANZ). Published by Elsevier B.V. All rights reserved.

  8. A simple approach to modeling ductile failure.

    Energy Technology Data Exchange (ETDEWEB)

    Wellman, Gerald William

    2012-06-01

    Sandia National Laboratories has the need to predict the behavior of structures after the occurrence of an initial failure. In some cases determining the extent of failure, beyond initiation, is required, while in a few cases the initial failure is a design feature used to tailor the subsequent load paths. In either case, the ability to numerically simulate the initiation and propagation of failures is a highly desired capability. This document describes one approach to the simulation of failure initiation and propagation.

  9. Disease-specific induced pluripotent stem cells: a platform for human disease modeling and drug discovery.

    Science.gov (United States)

    Jang, Jiho; Yoo, Jeong-Eun; Lee, Jeong-Ah; Lee, Dongjin R; Kim, Ji Young; Huh, Yong Jun; Kim, Dae-Sung; Park, Chul-Yong; Hwang, Dong-Youn; Kim, Han-Soo; Kang, Hoon-Chul; Kim, Dong-Wook

    2012-03-31

    The generation of disease-specific induced pluripotent stem cell (iPSC) lines from patients with incurable diseases is a promising approach for studying disease mechanisms and drug screening. Such innovation makes it possible to obtain autologous cell sources for regenerative medicine. Herein, we report the generation and characterization of iPSCs from fibroblasts of patients with sporadic or familial diseases, including Parkinson's disease (PD), Alzheimer's disease (AD), juvenile-onset type I diabetes mellitus (JDM), and Duchenne type muscular dystrophy (DMD), as well as from normal human fibroblasts (WT). As an example of modeling disease using disease-specific iPSCs, we also discuss the childhood cerebral adrenoleukodystrophy (CCALD)- and adrenomyeloneuropathy (AMN)-iPSCs previously established by our group. Through DNA fingerprinting analysis, the origins of generated disease-specific iPSC lines were identified. Each iPSC line exhibited an intense alkaline phosphatase activity, expression of pluripotent markers, and the potential to differentiate into all three embryonic germ layers: the ectoderm, endoderm, and mesoderm. Expression of endogenous pluripotent markers and downregulation of retrovirus-delivered transgenes [OCT4 (POU5F1), SOX2, KLF4, and c-MYC] were observed in the generated iPSCs. Collectively, our results demonstrated that disease-specific iPSC lines characteristically resembled hESC lines. Furthermore, we were able to differentiate PD-iPSCs, one of the disease-specific iPSC lines we generated, into dopaminergic (DA) neurons, the cell type mostly affected by PD. These PD-specific DA neurons along with other examples of cell models derived from disease-specific iPSCs would provide a powerful platform for examining the pathophysiology of relevant diseases at the cellular and molecular levels and for developing new drugs and therapeutic regimens.

  10. Approach to modeling of human performance for purposes of probabilistic risk assessment

    International Nuclear Information System (INIS)

    Swain, A.D.

    1983-01-01

    This paper describes the general approach taken in NUREG/CR-1278 to model human performance in sufficient detail to permit probabilistic risk assessments of nuclear power plant operations. To show the basis for the more specific models in the above NUREG, a simplified model of the human component in man-machine systems is presented, the role of performance shaping factors is discussed, and special problems in modeling the cognitive aspect of behavior are described.

  11. Forecasting selected specific age mortality rate of Malaysia by using Lee-Carter model

    Science.gov (United States)

    Shukri Kamaruddin, Halim; Ismail, Noriszura

    2018-03-01

    Observing mortality patterns and trends is an important subject for any country seeking to maintain a sound social economy in the coming projection years. A declining mortality trend gives a good impression of what a government has done for its citizens at the macro level. Selecting a particular mortality model can be tricky, depending on the method adopted. The Lee-Carter model is adopted here because of its simplicity and the reliability of its regression-based results. Implementations of Lee-Carter for model fitting and projection have been used worldwide in much of the mortality research in developed countries. This paper studies the past mortality pattern of Malaysia by using the original Lee-Carter (1992) model and its cross-sectional observation for a single age. The data are indexed by age of death and year of death from 1984 to 2012 and are supplied by the Department of Statistics Malaysia. The results are modelled using RStudio, and the analysis focuses on the trend and projection of the mortality rate and age-specific mortality rates in the future. This paper can be extended to different variant extensions of Lee-Carter or any stochastic mortality tool, using the Malaysian mortality experience as the central case.
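    The Lee-Carter model decomposes the log age-specific mortality rate as log m(x,t) = a_x + b_x k_t. It is usually fitted by taking a_x as the row means of the log rates and extracting b_x and k_t from the first singular vectors of the centred matrix, under the identifiability constraints sum(b_x) = 1 and sum(k_t) = 0. A minimal sketch on synthetic rates (not the Malaysian data):

```python
import numpy as np

def fit_lee_carter(m):
    """Fit log m(x,t) = a_x + b_x * k_t with sum(b) = 1 and sum(k) = 0.

    m: mortality-rate matrix, rows = ages, columns = years.
    """
    log_m = np.log(m)
    a = log_m.mean(axis=1)                      # age profile a_x
    U, s, Vt = np.linalg.svd(log_m - a[:, None], full_matrices=False)
    b = U[:, 0] / U[:, 0].sum()                 # age sensitivity b_x
    k = s[0] * Vt[0] * U[:, 0].sum()            # period index k_t
    return a, b, k

# Synthetic rates with a known linear decline in the period index
ages, years = 5, 30
a_true = np.linspace(-6, -2, ages)
b_true = np.full(ages, 1 / ages)
k_true = np.linspace(10, -10, years)
m = np.exp(a_true[:, None] + b_true[:, None] * k_true[None, :])
a, b, k = fit_lee_carter(m)
```

Projection then reduces to extrapolating the single time series k_t, classically with a random walk with drift.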

  12. Theoretical approach to the phonon modes and specific heat of germanium nanowires

    Energy Technology Data Exchange (ETDEWEB)

    Trejo, A.; López-Palacios, L.; Vázquez-Medina, R.; Cruz-Irisson, M., E-mail: irisson@ipn.mx

    2014-11-15

    The phonon modes and specific heat of Ge nanowires were computed using a first-principles density functional theory scheme with a generalized gradient approximation and finite-displacement supercell algorithms. The nanowires were modeled in three different directions: [001], [111], and [110], using the supercell technique. All surface dangling bonds were saturated with hydrogen atoms. The results show that the specific heat of the GeNWs at room temperature increases as the nanowire diameter decreases, regardless of the orientation, due to phonon confinement and surface passivation. Phonon confinement effects could also be observed, since the highest optical phonon modes in the Ge vibration interval shifted to a lower frequency compared to their bulk counterparts.
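    Once the phonon frequencies are known, the harmonic specific heat follows from summing the Einstein contribution of each mode, C = k_B Σ x² eˣ/(eˣ-1)² with x = ħω/k_BT. A sketch with invented mode frequencies (not the computed GeNW spectra):

```python
import numpy as np

HBAR = 1.054571817e-34  # reduced Planck constant, J s
KB = 1.380649e-23       # Boltzmann constant, J/K

def harmonic_specific_heat(freqs_thz, T):
    """Harmonic-approximation heat capacity (J/K) summed over phonon modes:
        C = k_B * sum_i x_i^2 e^{x_i} / (e^{x_i} - 1)^2,
    with x_i = hbar*omega_i / (k_B * T).
    """
    omega = 2.0 * np.pi * np.asarray(freqs_thz) * 1e12  # rad/s
    x = HBAR * omega / (KB * T)
    return KB * np.sum(x**2 * np.exp(x) / np.expm1(x)**2)

# Illustrative mode frequencies (THz); each mode approaches the classical
# Dulong-Petit contribution k_B at high temperature and freezes out at low T
modes = [2.0, 5.0, 9.0]
c_high = harmonic_specific_heat(modes, 5000.0)
c_low = harmonic_specific_heat(modes, 10.0)
```

The red-shift of confined optical modes reported in the abstract enters this formula directly: lowering ω lowers x at fixed T, which raises each mode's contribution and hence the specific heat.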

  13. An interdisciplinary approach to modeling tritium transfer into the environment

    International Nuclear Information System (INIS)

    Galeriu, D; Melintescu, A.

    2005-01-01

    equations between soil and plants. Considering mammals, we recently showed that the simplistic models currently applied did not accurately match experimental data from rats and sheep. Specific data for many farm and wild animals are scarce. In this paper, we are advancing a different approach based on energy metabolism, which can be parameterized predominantly based on published metabolic data for mature mammals. We started with the observation that the measured dynamics of 14 C and non-exchangeable organically bound tritium (OBT) were, not surprisingly, similar. We therefore introduced a metabolic definition for the 14 C and OBT loss rate (assumed to be the same) from the whole body and specific organs. We assumed that this was given by the specific metabolic rate of the whole body or organ, divided by the enthalpy of combustion of a kilogram of fresh matter. Since basal metabolism data were taken from the literature, they were modified for energy expenditure above basal need. To keep the model simple, organs were grouped according to their metabolic activity or importance in the food chain. Pools considered were viscera (high metabolic rate organs except the brain), muscle, adipose tissue, blood, and other (all other tissues). We disregarded any detail on substrate utilization from the dietary intake and condensed the postprandial respiration in a single rate. We included considerations of net maintenance and growth needs. For tritium, the transfer between body water and organic compartments was modeled using knowledge of basic metabolism and published relations. We considered the potential influence of rumen digestion and bacterial protein in ruminants. As for model application, we focused on laboratory and farm animals, where some experimental data were available. The model performed well for rat muscle, viscera and adipose tissue, but due to the simplicity of model structure and assumptions, blood and urine data were only satisfactorily reproduced. 
Whilst for sheep fed

  14. An Integrated Approach to Modeling Evacuation Behavior

    Science.gov (United States)

    2011-02-01

    A spate of recent hurricanes and other natural disasters have drawn a lot of attention to the evacuation decision of individuals. Here we focus on evacuation models that incorporate two economic phenomena that seem to be increasingly important in exp...

  15. Infectious disease modeling a hybrid system approach

    CERN Document Server

    Liu, Xinzhi

    2017-01-01

    This volume presents infectious diseases modeled mathematically, taking seasonality and changes in population behavior into account, using a switched and hybrid systems framework. The scope of coverage includes background on mathematical epidemiology, including classical formulations and results; a motivation for seasonal effects and changes in population behavior, an investigation into term-time forced epidemic models with switching parameters, and a detailed account of several different control strategies. The main goal is to study these models theoretically and to establish conditions under which eradication or persistence of the disease is guaranteed. In doing so, the long-term behavior of the models is determined through mathematical techniques from switched systems theory. Numerical simulations are also given to augment and illustrate the theoretical results and to help study the efficacy of the control schemes.
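    A switched epidemic model of the kind this volume studies lets a parameter, typically the contact rate β, jump between values on a seasonal schedule while the SIR dynamics dS/dt = -βSI, dI/dt = βSI - γI, dR/dt = γI run within each phase. A minimal forward-Euler sketch (all parameter values below are illustrative assumptions):

```python
def simulate_switched_sir(beta_schedule, gamma=0.1, days_per_phase=90,
                          dt=0.1, s0=0.99, i0=0.01):
    """Integrate an SIR model whose contact rate beta switches between
    phases (e.g. term time vs holidays) using forward Euler."""
    s, i, r = s0, i0, 0.0
    peak_i = i
    for beta in beta_schedule:
        for _ in range(int(days_per_phase / dt)):
            new_inf = beta * s * i   # dS/dt = -beta*S*I
            rec = gamma * i          # dR/dt = gamma*I
            s -= new_inf * dt
            i += (new_inf - rec) * dt
            r += rec * dt
            peak_i = max(peak_i, i)
    return s, i, r, peak_i

# Alternate high- and low-transmission seasons over four phases
s, i, r, peak = simulate_switched_sir([0.4, 0.15, 0.4, 0.15])
```

Eradication versus persistence questions of the kind the book analyzes amount to whether the infected fraction decays to zero under the switching schedule, which the theory characterizes via the phase-wise reproduction numbers β/γ.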

  16. On Combining Language Models: Oracle Approach

    National Research Council Canada - National Science Library

    Hacioglu, Kadri; Ward, Wayne

    2001-01-01

    In this paper, we address the problem of combining several language models (LMs). We find that simple interpolation methods, like log-linear and linear interpolation, improve the performance but fall short of the performance of an oracle...
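    The two interpolation baselines named in the abstract combine component models either in probability space or in log space. A toy sketch over two hypothetical unigram distributions:

```python
def linear_interp(p1, p2, lam):
    """Linear interpolation: P(w) = lam*P1(w) + (1-lam)*P2(w)."""
    return {w: lam * p1.get(w, 0.0) + (1 - lam) * p2.get(w, 0.0)
            for w in set(p1) | set(p2)}

def loglinear_interp(p1, p2, lam, floor=1e-12):
    """Log-linear interpolation: P(w) proportional to P1(w)^lam * P2(w)^(1-lam),
    renormalized over the vocabulary (floor avoids zero probabilities)."""
    scores = {w: max(p1.get(w, 0.0), floor) ** lam
                 * max(p2.get(w, 0.0), floor) ** (1 - lam)
              for w in set(p1) | set(p2)}
    z = sum(scores.values())
    return {w: s / z for w, s in scores.items()}

# Two hypothetical component models over a tiny vocabulary
p_background = {"the": 0.5, "cat": 0.3, "sat": 0.2}
p_topic = {"cat": 0.6, "sat": 0.3, "mat": 0.1}
mix = linear_interp(p_background, p_topic, 0.5)
logmix = loglinear_interp(p_background, p_topic, 0.5)
```

Linear interpolation stays normalized automatically, while the log-linear combination needs explicit renormalization; the weight lam is typically tuned on held-out data.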

  17. HIV-specific probabilistic models of protein evolution.

    Directory of Open Access Journals (Sweden)

    David C Nickle

    2007-06-01

    Full Text Available Comparative sequence analyses, including such fundamental bioinformatics techniques as similarity searching, sequence alignment and phylogenetic inference, have become a mainstay for researchers studying type 1 Human Immunodeficiency Virus (HIV-1 genome structure and evolution. Implicit in comparative analyses is an underlying model of evolution, and the chosen model can significantly affect the results. In general, evolutionary models describe the probabilities of replacing one amino acid character with another over a period of time. Most widely used evolutionary models for protein sequences have been derived from curated alignments of hundreds of proteins, usually based on mammalian genomes. It is unclear to what extent these empirical models are generalizable to a very different organism, such as HIV-1-the most extensively sequenced organism in existence. We developed a maximum likelihood model fitting procedure to a collection of HIV-1 alignments sampled from different viral genes, and inferred two empirical substitution models, suitable for describing between-and within-host evolution. Our procedure pools the information from multiple sequence alignments, and provided software implementation can be run efficiently in parallel on a computer cluster. We describe how the inferred substitution models can be used to generate scoring matrices suitable for alignment and similarity searches. Our models had a consistently superior fit relative to the best existing models and to parameter-rich data-driven models when benchmarked on independent HIV-1 alignments, demonstrating evolutionary biases in amino-acid substitution that are unique to HIV, and that are not captured by the existing models. The scoring matrices derived from the models showed a marked difference from common amino-acid scoring matrices. The use of an appropriate evolutionary model recovered a known viral transmission history, whereas a poorly chosen model introduced phylogenetic
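    A continuous-time substitution model of this kind is specified by a rate matrix Q (non-negative off-diagonal rates, rows summing to zero); the replacement probabilities after evolutionary time t are P(t) = exp(Qt). A sketch using a toy 3-state matrix in place of a full 20x20 amino-acid model:

```python
import numpy as np

def transition_probs(Q, t):
    """P(t) = exp(Q t) computed via eigendecomposition
    (assumes Q is diagonalizable, as empirical substitution matrices are).
    """
    vals, vecs = np.linalg.eig(Q * t)
    return (vecs @ np.diag(np.exp(vals)) @ np.linalg.inv(vecs)).real

# Toy 3-state rate matrix standing in for a 20x20 amino-acid model
Q = np.array([[-0.3,  0.2,  0.1],
              [ 0.1, -0.4,  0.3],
              [ 0.2,  0.2, -0.4]])
P_short = transition_probs(Q, 0.1)   # close to I + 0.1*Q for short branches
P_long = transition_probs(Q, 100.0)  # rows converge to the stationary distribution
```

Log-odds scoring matrices of the kind mentioned above are then derived by comparing P(t) at a chosen divergence time against the stationary amino-acid frequencies.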

  18. Advanced language modeling approaches, case study: Expert search

    NARCIS (Netherlands)

    Hiemstra, Djoerd

    2008-01-01

    This tutorial gives a clear and detailed overview of advanced language modeling approaches and tools, including the use of document priors, translation models, relevance models, parsimonious models and expectation maximization training. Expert search will be used as a case study to explain the

  19. Approaches to modelling hydrology and ecosystem interactions

    Science.gov (United States)

    Silberstein, Richard P.

    2014-05-01

    As the pressures of industry, agriculture and mining on groundwater resources increase, there is a burgeoning unmet need to be able to capture these multiple, direct and indirect stresses in a formal framework that will enable better assessment of impact scenarios. While there are many catchment hydrological models, and there are some models that represent ecological states and change (e.g. FLAMES, Liedloff and Cook, 2007), these have not been linked in any deterministic or substantive way. Without such coupled eco-hydrological models, quantitative assessments of the impacts of water use intensification on water dependent ecosystems under a changing climate are difficult, if not impossible. The concept would include facility for direct and indirect water related stresses that may develop around mining and well operations, climate stresses such as rainfall and temperature, biological stresses such as diseases and invasive species, and competition such as encroachment from other competing land uses. An indirect water impact could be, for example, a change in groundwater conditions that affects the stream flow regime, and hence aquatic ecosystems. This paper reviews previous work examining models combining ecology and hydrology, with a view to developing a conceptual framework linking a biophysically defensible model that combines ecosystem function with hydrology. The objective is to develop a model capable of representing the cumulative impact of multiple stresses on water resources and associated ecosystem function.

  20. Constructing a justice model based on Sen's capability approach

    OpenAIRE

    Yüksel, Sevgi

    2008-01-01

    The thesis provides a possible justice model based on Sen's capability approach. For this goal, we first analyze the general structure of a theory of justice, identifying the main variables and issues. Furthermore, based on Sen (2006) and Kolm (1998), we look at 'transcendental' and 'comparative' approaches to justice and concentrate on the sufficiency condition for the comparative approach. Then, taking Rawls' theory of justice as a starting point, we present how Sen's capability approach em...

  1. Convenience of Statistical Approach in Studies of Architectural Ornament and Other Decorative Elements Specific Application

    Science.gov (United States)

    Priemetz, O.; Samoilov, K.; Mukasheva, M.

    2017-11-01

    Ornament is a topical phenomenon in the modern theory of architecture and a common element in design and construction practice. It has been an important aspect of shaping for millennia. The description of the methods of its application occupies a large place in studies on the theory and practice of architecture. However, the problem of the saturation of compositions with ornamentation, and the specificity of its themes and forms, has not been sufficiently studied yet. This aspect requires the accumulation of additional knowledge. Applying quantitative methods to the types of plastic solutions and the thematic diversity of the facade compositions of buildings constructed in different periods creates another tool for an objective analysis of ornament development. The paper demonstrates the application of this approach to studying the features of architectural development in Kazakhstan from the end of the XIX century to the XXI century.

  2. An Isomer-Specific Approach to Endocrine-Disrupting Nonylphenol in Infant Food.

    Science.gov (United States)

    Günther, Klaus; Räcker, Torsten; Böhme, Roswitha

    2017-02-15

    Nonylphenols (NPs) are persistent endocrine disruptors that are priority hazardous substances of the European Union Water Framework Directive. Their presence in the environment has caused growing concern regarding their impact on human health. Recent studies have shown that nonylphenol is ubiquitous in commercially available foodstuffs and is also present in human blood. The isomer distribution of 4-nonylphenol was analyzed by gas chromatography - mass spectrometry in 44 samples of infant food. Our study shows that the distribution of nonylphenol isomers is dependent on the foodstuff analyzed. Although some isomer groups prevail, different distributions are frequent. Variations are even found in the same food group. Nonylphenol is a complex mixture of isomers, and the estrogenic potentials of each of these isomers are very different. Consequently, to determine the potential toxicological impact of NP in food, an isomer-specific approach is necessary.

  3. An ontology-based approach for modelling architectural styles

    OpenAIRE

    Pahl, Claus; Giesecke, Simon; Hasselbring, Wilhelm

    2007-01-01

    The conceptual modelling of software architectures is of central importance for the quality of a software system. A rich modelling language is required to integrate the different aspects of architecture modelling, such as architectural styles, structural and behavioural modelling, into a coherent framework. We propose an ontological approach for architectural style modelling based on description logic as an abstract, meta-level modelling instrument. Architect...

  4. Mathematical modelling a case studies approach

    CERN Document Server

    Illner, Reinhard; McCollum, Samantha; Roode, Thea van

    2004-01-01

    Mathematical modelling is a subject without boundaries. It is the means by which mathematics becomes useful to virtually any subject. Moreover, modelling has been and continues to be a driving force for the development of mathematics itself. This book explains the process of modelling real situations to obtain mathematical problems that can be analyzed, thus solving the original problem. The presentation is in the form of case studies, which are developed much as they would be in true applications. In many cases, an initial model is created, then modified along the way. Some cases are familiar, such as the evaluation of an annuity. Others are unique, such as the fascinating situation in which an engineer, armed only with a slide rule, had 24 hours to compute whether a valve would hold when a temporary rock plug was removed from a water tunnel. Each chapter ends with a set of exercises and some suggestions for class projects. Some projects are extensive, as with the explorations of the predator-prey model; oth...

  5. Hepsoft - an approach for up to date multi-platform deployment of HEP specific software

    International Nuclear Information System (INIS)

    Roiser, S

    2011-01-01

    LHC experiments depend on a rich palette of software components to build their specific applications. These underlying software components include the ROOT analysis framework, the Geant4 simulation toolkit, Monte Carlo generators, grid middleware, graphics libraries, scripting languages, databases, tools, etc., which are provided centrally in up-to-date versions on multiple platforms (Linux, Mac, Windows). Until recently this set of packages had been tested and released in a tree-like structure as a consistent set of versions across operating systems, architectures and compilers for LHC experiments only. Because of the tree-like deployment, these releases were only usable in connection with a configuration management tool, which provided the proper build and run-time environments, and this hindered other parties outside the LHC from easily using this palette of packages. In a new approach the releases will be grouped in a 'flat structure' such that interested parties can start using them without configuration management, retaining all the above-mentioned advantages. In addition to increased usability, the software shall also be distributed via system-provided package deployment systems (rpm, apt, etc.). The approach to software deployment follows the idea of providing a wide range of HEP-specific software packages and tools in a coherent, up-to-date and modular way on multiple platforms. The target audience for such software deployments are individual developers or smaller development groups / experiments who don't have the resources to maintain this kind of infrastructure. This new software deployment strategy has already been successfully implemented for groups at CERN.

  6. A Subject-Specific Kinematic Model to Predict Human Motion in Exoskeleton-Assisted Gait

    Science.gov (United States)

    Torricelli, Diego; Cortés, Camilo; Lete, Nerea; Bertelsen, Álvaro; Gonzalez-Vargas, Jose E.; del-Ama, Antonio J.; Dimbwadyo, Iris; Moreno, Juan C.; Florez, Julian; Pons, Jose L.

    2018-01-01

    The relative motion between human and exoskeleton is a crucial factor that has remarkable consequences on the efficiency, reliability and safety of human-robot interaction. Unfortunately, its quantitative assessment has been largely overlooked in the literature. Here, we present a methodology that allows predicting the motion of the human joints from the knowledge of the angular motion of the exoskeleton frame. Our method combines a subject-specific skeletal model with a kinematic model of a lower limb exoskeleton (H2, Technaid), imposing specific kinematic constraints between them. To calibrate the model and validate its ability to predict the relative motion in a subject-specific way, we performed experiments on seven healthy subjects during treadmill walking tasks. We demonstrate a prediction accuracy lower than 3.5° globally, and around 1.5° at the hip level, which represent an improvement up to 66% compared to the traditional approach assuming no relative motion between the user and the exoskeleton. PMID:29755336

  7. Effective use of integrated hydrological models in basin-scale water resources management: surrogate modeling approaches

    Science.gov (United States)

    Zheng, Y.; Wu, B.; Wu, X.

    2015-12-01

    Integrated hydrological models (IHMs) consider surface water and subsurface water as a unified system, and have been widely adopted in basin-scale water resources studies. However, due to IHMs' mathematical complexity and high computational cost, it is difficult to implement them in an iterative model evaluation process (e.g., Monte Carlo Simulation, simulation-optimization analysis, etc.), which diminishes their applicability for supporting decision-making in real-world situations. Our studies investigated how to effectively use complex IHMs to address real-world water issues via surrogate modeling. Three surrogate modeling approaches were considered, including 1) DYCORS (DYnamic COordinate search using Response Surface models), a well-established response surface-based optimization algorithm; 2) SOIM (Surrogate-based Optimization for Integrated surface water-groundwater Modeling), a response surface-based optimization algorithm that we developed specifically for IHMs; and 3) Probabilistic Collocation Method (PCM), a stochastic response surface approach. Our investigation was based on a modeling case study in the Heihe River Basin (HRB), China's second largest endorheic river basin. The GSFLOW (Coupled Ground-Water and Surface-Water Flow Model) model was employed. Two decision problems were discussed. One is to optimize, both in time and in space, the conjunctive use of surface water and groundwater for agricultural irrigation in the middle HRB region; and the other is to cost-effectively collect hydrological data based on a data-worth evaluation. Overall, our study results highlight the value of incorporating an IHM in making decisions of water resources management and hydrological data collection. An IHM like GSFLOW can provide great flexibility to formulating proper objective functions and constraints for various optimization problems. 
On the other hand, it has been demonstrated that surrogate modeling approaches can pave the way for such incorporation in real-world decision-making.
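The response-surface idea behind approaches such as DYCORS and SOIM can be sketched generically: sample the expensive model at a few design points, fit a cheap surrogate, and search the surrogate instead of the original model. The toy objective below merely stands in for an expensive integrated hydrological model run; it and all sample points are illustrative assumptions, not anything from the study.

```python
import numpy as np

def expensive_model(x):
    # Stand-in for a costly simulation run (illustrative assumption):
    # the true optimum of this toy objective is at x = 2.0.
    return (x - 2.0) ** 2 + 1.0

# Sample the expensive model at a handful of design points
xs = np.linspace(0.0, 4.0, 5)
ys = np.array([expensive_model(x) for x in xs])

# Fit a quadratic response surface (the surrogate)
coeffs = np.polyfit(xs, ys, 2)
surrogate = np.poly1d(coeffs)

# Optimize the cheap surrogate instead of the expensive model
x_grid = np.linspace(0.0, 4.0, 401)
x_best = x_grid[np.argmin(surrogate(x_grid))]
```

In practice the surrogate is refined iteratively around promising points (as DYCORS does) rather than fit once, but the cost structure is the same: many cheap surrogate evaluations replace repeated expensive model runs.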

  8. The simplified models approach to constraining supersymmetry

    Energy Technology Data Exchange (ETDEWEB)

    Perez, Genessis [Institut fuer Theoretische Physik, Karlsruher Institut fuer Technologie (KIT), Wolfgang-Gaede-Str. 1, 76131 Karlsruhe (Germany); Kulkarni, Suchita [Laboratoire de Physique Subatomique et de Cosmologie, Universite Grenoble Alpes, CNRS IN2P3, 53 Avenue des Martyrs, 38026 Grenoble (France)

    2015-07-01

    The interpretation of the experimental results at the LHC is model dependent, which implies that the searches provide limited constraints on scenarios such as supersymmetry (SUSY). The Simplified Models Spectra (SMS) framework used by the ATLAS and CMS collaborations is useful for overcoming this limitation. The SMS framework involves a small number of parameters (all the properties are reduced to the mass spectrum, the production cross section and the branching ratio) and hence is more generic than presenting results in terms of soft parameters. In our work, the SMS framework was used to test the Natural SUSY (NSUSY) scenario. To accomplish this task, two automated tools (SModelS and Fastlim) were used to decompose the NSUSY parameter space in terms of simplified models and to confront the theoretical predictions against the experimental results. The achievements of both tools, as well as their strengths and limitations, are presented here for the NSUSY scenario.

  9. Lightweight approach to model traceability in a CASE tool

    Science.gov (United States)

    Vileiniskis, Tomas; Skersys, Tomas; Pavalkis, Saulius; Butleris, Rimantas; Butkiene, Rita

    2017-07-01

    The term "model-driven" is by no means a new buzzword within the system development community. Nevertheless, the ever-increasing complexity of model-driven approaches keeps fueling all kinds of discussions around this paradigm and pushes researchers to develop new and more effective ways of system development. With this increasing complexity, model traceability, and model management as a whole, become indispensable activities of the model-driven system development process. The main goal of this paper is to present a conceptual design and implementation of a practical lightweight approach to model traceability in a CASE tool.

  10. Testing Process Predictions of Models of Risky Choice: A Quantitative Model Comparison Approach

    Directory of Open Access Journals (Sweden)

    Thorsten ePachur

    2013-09-01

    This article presents a quantitative model comparison contrasting the process predictions of two prominent views on risky choice. One view assumes a trade-off between probabilities and outcomes (or nonlinear functions thereof) and the separate evaluation of risky options (expectation models). Another view assumes that risky choice is based on comparative evaluation, limited search, aspiration levels, and the forgoing of trade-offs (heuristic models). We derived quantitative process predictions for a generic expectation model and for a specific heuristic model, namely the priority heuristic (Brandstätter, Gigerenzer, & Hertwig, 2006), and tested them in two experiments. The focus was on two key features of the cognitive process: acquisition frequencies (i.e., how frequently individual reasons are looked up) and direction of search (i.e., gamble-wise vs. reason-wise). In Experiment 1, the priority heuristic predicted direction of search better than the expectation model (although neither model predicted the acquisition process perfectly); acquisition frequencies, however, were inconsistent with both models. Additional analyses revealed that these frequencies were primarily a function of what Rubinstein (1988) called similarity. In Experiment 2, the quantitative model comparison approach showed that people seemed to rely more on the priority heuristic in difficult problems, but to make more trade-offs in easy problems. This finding suggests that risky choice may be based on a mental toolbox of strategies.

  11. Testing process predictions of models of risky choice: a quantitative model comparison approach

    Science.gov (United States)

    Pachur, Thorsten; Hertwig, Ralph; Gigerenzer, Gerd; Brandstätter, Eduard

    2013-01-01

    This article presents a quantitative model comparison contrasting the process predictions of two prominent views on risky choice. One view assumes a trade-off between probabilities and outcomes (or non-linear functions thereof) and the separate evaluation of risky options (expectation models). Another view assumes that risky choice is based on comparative evaluation, limited search, aspiration levels, and the forgoing of trade-offs (heuristic models). We derived quantitative process predictions for a generic expectation model and for a specific heuristic model, namely the priority heuristic (Brandstätter et al., 2006), and tested them in two experiments. The focus was on two key features of the cognitive process: acquisition frequencies (i.e., how frequently individual reasons are looked up) and direction of search (i.e., gamble-wise vs. reason-wise). In Experiment 1, the priority heuristic predicted direction of search better than the expectation model (although neither model predicted the acquisition process perfectly); acquisition frequencies, however, were inconsistent with both models. Additional analyses revealed that these frequencies were primarily a function of what Rubinstein (1988) called “similarity.” In Experiment 2, the quantitative model comparison approach showed that people seemed to rely more on the priority heuristic in difficult problems, but to make more trade-offs in easy problems. This finding suggests that risky choice may be based on a mental toolbox of strategies. PMID:24151472
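The priority heuristic's stopping rule is concrete enough to sketch in code. Below is a deliberately simplified version for choices between two-outcome gain gambles, following the published reason order (minimum gain, probability of the minimum gain, maximum gain) with the 1/10 aspiration levels from Brandstätter et al. (2006); the tuple representation and function name are our own assumptions.

```python
def priority_heuristic(a, b):
    """Choose between two-outcome gain gambles (min_gain, p_min, max_gain).

    Simplified sketch of the priority heuristic: examine one reason at a
    time and stop as soon as a reason discriminates, forgoing trade-offs.
    """
    max_gain = max(a[2], b[2])
    # Reason 1: minimum gains; stop if they differ by >= 1/10 of the max gain
    if abs(a[0] - b[0]) >= 0.1 * max_gain:
        return a if a[0] > b[0] else b
    # Reason 2: probabilities of the minimum gains; stop if they differ
    # by >= 0.1 (lower probability of the minimum gain is better)
    if abs(a[1] - b[1]) >= 0.1:
        return a if a[1] < b[1] else b
    # Reason 3: maximum gains decide
    return a if a[2] >= b[2] else b
```

Note how this contrasts with an expectation model: no probabilities and outcomes are ever multiplied, which is exactly what drives the differing process predictions (reason-wise search, few acquisitions) tested in the article.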

  12. New approaches for modeling type Ia supernovae

    International Nuclear Information System (INIS)

    Zingale, Michael; Almgren, Ann S.; Bell, John B.; Day, Marcus S.; Rendleman, Charles A.; Woosley, Stan

    2007-01-01

    Type Ia supernovae (SNe Ia) are the largest thermonuclear explosions in the Universe. Their light output can be seen across great distances and has led to the discovery that the expansion rate of the Universe is accelerating. Despite the significance of SNe Ia, there are still a large number of uncertainties in current theoretical models. Computational modeling offers the promise to help answer the outstanding questions. However, even with today's supercomputers, such calculations are extremely challenging because of the wide range of length and timescales. In this paper, we discuss several new algorithms for simulations of SNe Ia and demonstrate some of their successes.

  13. Chancroid transmission dynamics: a mathematical modeling approach.

    Science.gov (United States)

    Bhunu, C P; Mushayabasa, S

    2011-12-01

    Mathematical models have long been used to better understand disease transmission dynamics and how to effectively control them. Here, a chancroid infection model is presented and analyzed. The disease-free equilibrium is shown to be globally asymptotically stable when the reproduction number is less than unity. High levels of treatment are shown to reduce the reproduction number suggesting that treatment has the potential to control chancroid infections in any given community. This result is also supported by numerical simulations which show a decline in chancroid cases whenever the reproduction number is less than unity.
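The threshold behavior described in this abstract can be illustrated with a minimal SIS-type simulation (a generic sketch, not the model from the paper; all parameter values are illustrative assumptions). When the reproduction number beta/gamma is below one, the infected fraction decays to zero; treatment can be thought of as raising the recovery rate gamma.

```python
def simulate_sis(beta, gamma, i0=0.1, days=200.0, dt=0.1):
    """Forward-Euler integration of di/dt = beta*i*(1-i) - gamma*i,
    where i is the infected fraction of the population."""
    i = i0
    for _ in range(int(days / dt)):
        i += dt * (beta * i * (1.0 - i) - gamma * i)
    return i
```

With beta=0.1, gamma=0.2 (reproduction number 0.5) the infection dies out, while with beta=0.4, gamma=0.2 (reproduction number 2) the infected fraction settles at the endemic level 1 - gamma/beta = 0.5.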

  14. A kinetic approach to magnetospheric modeling

    International Nuclear Information System (INIS)

    Whipple, E.C. Jr.

    1979-01-01

    The earth's magnetosphere is caused by the interaction between the flowing solar wind and the earth's magnetic dipole, with the distorted magnetic field in the outer parts of the magnetosphere due to the current systems resulting from this interaction. It is surprising that even the conceptually simple problem of the collisionless interaction of a flowing plasma with a dipole magnetic field has not been solved. A kinetic approach is essential if one is to take into account the dispersion of particles with different energies and pitch angles and the fact that particles on different trajectories have different histories and may come from different sources. Solving the interaction problem involves finding the various types of possible trajectories, populating them with particles appropriately, and then treating the electric and magnetic fields self-consistently with the resulting particle densities and currents. This approach is illustrated by formulating a procedure for solving the collisionless interaction problem on open field lines in the case of a slowly flowing magnetized plasma interacting with a magnetic dipole

  15. A kinetic approach to magnetospheric modeling

    Science.gov (United States)

    Whipple, E. C., Jr.

    1979-01-01

    The earth's magnetosphere is caused by the interaction between the flowing solar wind and the earth's magnetic dipole, with the distorted magnetic field in the outer parts of the magnetosphere due to the current systems resulting from this interaction. It is surprising that even the conceptually simple problem of the collisionless interaction of a flowing plasma with a dipole magnetic field has not been solved. A kinetic approach is essential if one is to take into account the dispersion of particles with different energies and pitch angles and the fact that particles on different trajectories have different histories and may come from different sources. Solving the interaction problem involves finding the various types of possible trajectories, populating them with particles appropriately, and then treating the electric and magnetic fields self-consistently with the resulting particle densities and currents. This approach is illustrated by formulating a procedure for solving the collisionless interaction problem on open field lines in the case of a slowly flowing magnetized plasma interacting with a magnetic dipole.

  16. Fractal approach to computer-analytical modelling of tree crown

    International Nuclear Information System (INIS)

    Berezovskaya, F.S.; Karev, G.P.; Kisliuk, O.F.; Khlebopros, R.G.; Tcelniker, Yu.L.

    1993-09-01

    In this paper we discuss three approaches to the modeling of tree crown development: experimental (i.e. regression), theoretical (i.e. analytical) and simulation (i.e. computer) modeling. Common to all of them is the assumption that a tree can be regarded as a fractal object, i.e. a collection of self-similar elements that combines the properties of two- and three-dimensional bodies. We show that a fractal measure of the crown can be used as the link between mathematical models of crown growth and of light propagation through the canopy. The computer approach makes it possible to visualize crown development and to calibrate the model on experimental data. In the paper the different stages of the above-mentioned approaches are described. The experimental data for spruce, the description of the computer system for modeling and a variant of the computer model are presented. (author). 9 refs, 4 figs

  17. Common and specific liability to addiction: approaches to association studies of opioid addiction.

    Science.gov (United States)

    Nielsen, David A; Kreek, Mary Jeanne

    2012-06-01

    Opioid addiction, whether to opiates such as heroin and morphine, and/or to non-medical use of opioids, is a major problem worldwide. Although drug-induced and environmental factors are essential for the liability to develop opioid addiction, the genetic background of an individual is now known also to play a substantial role. The overall goal of this article is to address the common and specific liabilities to addiction in the context of approaches to studies of one addiction, opioid addiction. Literature on identifying genetic variants that may play a role in the development of opioid addiction was reviewed. A substantial number of genetic variants have been reported to be associated with opioid addiction. No single variant has been found in any of the reported GWAS studies with a substantial effect size on the liability to develop heroin addiction. It appears that there is a complex interaction of a large number of variants, some rare, some common, which interact with the environment and in response to specific drugs of abuse to increase the liability of developing opioid addiction. In spite of the inherent difficulties in obtaining large well-phenotyped cohorts for genetic studies, new findings have been reported that are being used to develop testable hypotheses into the biological basis of opioid addiction. Copyright © 2012. Published by Elsevier Ireland Ltd.

  18. A novel approach to modeling atmospheric convection

    Science.gov (United States)

    Goodman, A.

    2016-12-01

    The inadequate representation of clouds continues to be a large source of uncertainty in the projections from global climate models (GCMs). With continuous advances in computational power, however, the ability for GCMs to explicitly resolve cumulus convection will soon be realized. For this purpose, Jung and Arakawa (2008) proposed the Vector Vorticity Model (VVM), in which vorticity is the predicted variable instead of momentum. This has the advantage of eliminating the pressure gradient force within the framework of an anelastic system. However, the VVM was designed for use on a planar quadrilateral grid, making it unsuitable for implementation in global models discretized on the sphere. Here we have proposed a modification to the VVM where instead the curl of the horizontal vorticity is the primary predicted variable. This allows us to maintain the benefits of the original VVM while working within the constraints of a non-quadrilateral mesh. We found that our proposed model produced results from a warm bubble simulation that were consistent with the VVM. Further improvements that can be made to the VVM are also discussed.

  19. A new approach to model mixed hydrates

    Czech Academy of Sciences Publication Activity Database

    Hielscher, S.; Vinš, Václav; Jäger, A.; Hrubý, Jan; Breitkopf, C.; Span, R.

    2018-01-01

    Vol. 459, March (2018), pp. 170-185, ISSN 0378-3812 R&D Projects: GA ČR(CZ) GA17-08218S Institutional support: RVO:61388998 Keywords : gas hydrate * mixture * modeling Subject RIV: BJ - Thermodynamics Impact factor: 2.473, year: 2016 https://www.sciencedirect.com/science/article/pii/S0378381217304983

  20. Energy and development : A modelling approach

    NARCIS (Netherlands)

    van Ruijven, B.J.|info:eu-repo/dai/nl/304834521

    2008-01-01

    Rapid economic growth of developing countries like India and China implies that these countries become important actors in the global energy system. Examples of this impact are the present-day oil shortages and rapidly increasing emissions of greenhouse gases. Global energy models are used to explore possible future developments of the global energy system and to identify policies to prevent potential problems.

  1. Modeling Approaches for Describing Microbial Population Heterogeneity

    DEFF Research Database (Denmark)

    Lencastre Fernandes, Rita

    environmental conditions. Three cases are presented and discussed in this thesis. Common to all is the use of S. cerevisiae as model organism, and the use of cell size and cell cycle position as single-cell descriptors. The first case focuses on the experimental and mathematical description of a yeast...

  2. Specification for a standard radar sea clutter model

    Science.gov (United States)

    Paulus, Richard A.

    1990-09-01

    A model for the average sea clutter radar cross section is proposed for the Oceanographic and Atmospheric Master Library. This model is a function of wind speed (or sea state), wind direction relative to the antenna, refractive conditions, radar antenna height, frequency, polarization, horizontal beamwidth, and compressed pulse length. The model is fully described, a FORTRAN 77 computer listing is provided, and test cases are given to demonstrate the proper operation of the program.

  3. Energy and Development. A Modelling Approach

    International Nuclear Information System (INIS)

    Van Ruijven, B.J.

    2008-01-01

    Rapid economic growth of developing countries like India and China implies that these countries become important actors in the global energy system. Examples of this impact are the present day oil shortages and rapidly increasing emissions of greenhouse gases. Global energy models are used to explore possible future developments of the global energy system and identify policies to prevent potential problems. Such estimations of future energy use in developing countries are very uncertain. Crucial factors in the future energy use of these regions are electrification, urbanisation and income distribution, issues that are generally not included in present day global energy models. Model simulations in this thesis show that current insight in developments in low-income regions lead to a wide range of expected energy use in 2030 of the residential and transport sectors. This is mainly caused by many different model calibration options that result from the limited data availability for model development and calibration. We developed a method to identify the impact of model calibration uncertainty on future projections. We developed a new model for residential energy use in India, in collaboration with the Indian Institute of Science. Experiments with this model show that the impact of electrification and income distribution is less univocal than often assumed. The use of fuelwood, with related health risks, can decrease rapidly if the income of poor groups increases. However, there is a trade off in terms of CO2 emissions because these groups gain access to electricity and the ownership of appliances increases. Another issue is the potential role of new technologies in developing countries: will they use the opportunities of leapfrogging? We explored the potential role of hydrogen, an energy carrier that might play a central role in a sustainable energy system. We found that hydrogen only plays a role before 2050 under very optimistic assumptions. Regional energy

  4. A path integral approach to the Hodgkin-Huxley model

    Science.gov (United States)

    Baravalle, Roman; Rosso, Osvaldo A.; Montani, Fernando

    2017-11-01

    To understand how single neurons process sensory information, it is necessary to develop suitable stochastic models to describe the response variability of the recorded spike trains. Spikes in a given neuron are produced by the synergistic action of sodium and potassium of the voltage-dependent channels that open or close the gates. Hodgkin and Huxley (HH) equations describe the ionic mechanisms underlying the initiation and propagation of action potentials, through a set of nonlinear ordinary differential equations that approximate the electrical characteristics of the excitable cell. Path integral provides an adequate approach to compute quantities such as transition probabilities, and any stochastic system can be expressed in terms of this methodology. We use the technique of path integrals to determine the analytical solution driven by a non-Gaussian colored noise when considering the HH equations as a stochastic system. The different neuronal dynamics are investigated by estimating the path integral solutions driven by a non-Gaussian colored noise q. More specifically we take into account the correlational structures of the complex neuronal signals not just by estimating the transition probability associated to the Gaussian approach of the stochastic HH equations, but instead considering much more subtle processes accounting for the non-Gaussian noise that could be induced by the surrounding neural network and by feedforward correlations. This allows us to investigate the underlying dynamics of the neural system when different scenarios of noise correlations are considered.
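The paper's path-integral treatment with non-Gaussian colored noise is well beyond a few lines, but the underlying object, a stochastically driven membrane equation, can be sketched with a Euler-Maruyama step on a leaky integrator with additive Gaussian white noise. This is a deliberately reduced stand-in for the stochastic HH system, and every parameter value below is an illustrative assumption.

```python
import random

def simulate_membrane(v0=-65.0, v_rest=-65.0, tau=10.0, sigma=2.0,
                      dt=0.01, steps=5000, seed=1):
    """Euler-Maruyama integration of dV = -(V - v_rest)/tau dt + sigma dW.

    Returns the membrane potential (mV) after `steps` time steps.
    """
    rng = random.Random(seed)
    v = v0
    for _ in range(steps):
        drift = -(v - v_rest) / tau
        diffusion = sigma * (dt ** 0.5) * rng.gauss(0.0, 1.0)
        v += dt * drift + diffusion
    return v
```

A path-integral calculation would assign each such noisy trajectory a weight and sum over trajectories to obtain transition probabilities; replacing `rng.gauss` with correlated, non-Gaussian draws is where the q-dependent analysis of the paper departs from this Gaussian white-noise sketch.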

  5. Estimating inverse probability weights using super learner when weight-model specification is unknown in a marginal structural Cox model context.

    Science.gov (United States)

    Karim, Mohammad Ehsanul; Platt, Robert W

    2017-06-15

    Correct specification of the inverse probability weighting (IPW) model is necessary for consistent inference from a marginal structural Cox model (MSCM). In practical applications, researchers are typically unaware of the true specification of the weight model. Nonetheless, IPWs are commonly estimated using parametric models, such as the main-effects logistic regression model. In practice, assumptions underlying such models may not hold and data-adaptive statistical learning methods may provide an alternative. Many candidate statistical learning approaches are available in the literature. However, the optimal approach for a given dataset is impossible to predict. Super learner (SL) has been proposed as a tool for selecting an optimal learner from a set of candidates using cross-validation. In this study, we evaluate the usefulness of a SL in estimating IPW in four different MSCM simulation scenarios, in which we varied the true weight model specification (linear and/or additive). Our simulations show that, in the presence of weight model misspecification, with a rich and diverse set of candidate algorithms, SL can generally offer a better alternative to the commonly used statistical learning approaches in terms of MSE as well as the coverage probabilities of the estimated effect in an MSCM. The findings from the simulation studies guided the application of the MSCM in a multiple sclerosis cohort from British Columbia, Canada (1995-2008), to estimate the impact of beta-interferon treatment in delaying disability progression. Copyright © 2017 John Wiley & Sons, Ltd.
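As a minimal illustration of the weighting step itself, separate from how the propensity model is fit (whether by logistic regression or a super learner), stabilized inverse probability weights for a point-treatment setting can be computed as follows. The function name and data layout are our own assumptions, and the MSCM setting additionally requires cumulative products of such weights over time.

```python
def stabilized_ipw(treated, propensity):
    """Stabilized IPW: P(A=a) / P(A=a | X), per subject.

    treated    -- list of 0/1 treatment indicators
    propensity -- list of estimated P(A=1 | X), one per subject
                  (assumed fitted elsewhere, e.g. by a super learner)
    """
    p_treat = sum(treated) / len(treated)  # marginal treatment prevalence
    weights = []
    for a, e in zip(treated, propensity):
        numerator = p_treat if a == 1 else 1.0 - p_treat
        denominator = e if a == 1 else 1.0 - e
        weights.append(numerator / denominator)
    return weights
```

The point of the article is that misspecifying the model that produces `propensity` biases everything downstream, which is why a cross-validated super learner over diverse candidate algorithms can be safer than a single parametric weight model.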

  6. Modeling growth of specific spoilage organisms in tilapia ...

    African Journals Online (AJOL)

    enoh

    2012-03-29

    Mar 29, 2012 ... used for market research and is a relatively new method in data classification. The three ... characteristics of the predictor variables in a set of as few as possible ..... Segmentation approaches in data-mining: A comparison of ...

  7. Vehicle-specific emissions modeling based upon on-road measurements.

    Science.gov (United States)

    Frey, H Christopher; Zhang, Kaishan; Rouphail, Nagui M

    2010-05-01

    Vehicle-specific microscale fuel use and emissions rate models are developed based upon real-world hot-stabilized tailpipe measurements made using a portable emissions measurement system. Consecutive averaging periods of one to three multiples of the response time are used to compare two semiempirical physically based modeling schemes. One scheme is based on internally observable variables (IOVs), such as engine speed and manifold absolute pressure, while the other is based on externally observable variables (EOVs), such as speed, acceleration, and road grade. For NO, HC, and CO emission rates, the average R² ranged from 0.41 to 0.66 for the former and from 0.17 to 0.30 for the latter. The EOV models have R² for CO₂ of 0.43 to 0.79 versus 0.99 for the IOV models. The models are sensitive to episodic events in driving cycles such as high acceleration. Intervehicle and fleet average modeling approaches are compared; the former account for microscale variations that might be useful for some types of assessments. EOV-based models have practical value for traffic management or simulation applications since IOVs usually are not available or not used for emission estimation.
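The comparison in this abstract hinges on the coefficient of determination. For reference, R² for any fitted model can be computed from observed and predicted values; this is a generic sketch, unrelated to the specific IOV/EOV regressions in the paper, and the toy data are assumptions.

```python
def r_squared(y_obs, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean = sum(y_obs) / len(y_obs)
    ss_res = sum((o - p) ** 2 for o, p in zip(y_obs, y_pred))
    ss_tot = sum((o - mean) ** 2 for o in y_obs)
    return 1.0 - ss_res / ss_tot
```

A perfect fit gives R² = 1, while a model that only predicts the mean gives R² = 0; the gap between the IOV values (0.41 to 0.66) and EOV values (0.17 to 0.30) quoted above is a gap in exactly this quantity.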

  8. Integration models: multicultural and liberal approaches confronted

    Science.gov (United States)

    Janicki, Wojciech

    2012-01-01

    European societies have been shaped by their Christian past, the upsurge of international migration, democratic rule and a liberal tradition rooted in religious tolerance. Accelerating globalization processes impose new challenges on European societies striving to protect their diversity. This struggle is especially clearly visible in the case of minorities trying to resist melting into the mainstream culture. European countries' legal systems and cultural policies respond to these efforts in many ways. Respecting identity-politics-driven group rights seems to be the most common approach, resulting in the creation of a multicultural society. However, the outcome of respecting group rights may be remarkably contradictory both to individual rights growing out of the liberal tradition, and to the reinforced concept of integration of immigrants into host societies. This paper discusses the upturn of identity politics in the context of both individual rights and the integration of European societies.

  9. Modelling thermal plume impacts - Kalpakkam approach

    International Nuclear Information System (INIS)

    Rao, T.S.; Anup Kumar, B.; Narasimhan, S.V.

    2002-01-01

    A good understanding of temperature patterns in the receiving waters is essential to know the heat dissipation from thermal plumes originating from coastal power plants. The seasonal temperature profiles of the Kalpakkam coast near the Madras Atomic Power Station (MAPS) thermal outfall site are determined and analysed. It is observed that the seasonal current reversal in the near-shore zone is one of the major mechanisms for the transport of effluents away from the point of mixing. To further refine our understanding of the mixing and dilution processes, it is necessary to numerically simulate the coastal ocean processes by parameterising the key factors concerned. In this paper, we outline the experimental approach to achieve this objective. (author)

  10. Dynamic Metabolic Model Building Based on the Ensemble Modeling Approach

    Energy Technology Data Exchange (ETDEWEB)

    Liao, James C. [Univ. of California, Los Angeles, CA (United States)

    2016-10-01

    Ensemble modeling of kinetic systems addresses the challenges of kinetic model construction, with respect to parameter value selection, and still allows for the rich insights possible from kinetic models. This project aimed to show that constructing, implementing, and analyzing such models is a useful tool for the metabolic engineering toolkit, and that they can result in actionable insights from models. Key concepts are developed and deliverable publications and results are presented.

  11. PIEteR : a field specific bio-economic production model for decision support in sugar beet growing

    NARCIS (Netherlands)

    Smit, A.B.

    1996-01-01


    To support decisions in sugar beet growing, a model, PIEteR, was developed. It simulates growth and production of the crop in a field-specific way, making a tailor-made approach to decision making possible.

    PIEteR is based on causal regression analysis of Dutch data of mostly

  12. Conceptual language models for domain-specific retrieval

    NARCIS (Netherlands)

    Meij, E.; Trieschnigg, D.; de Rijke, M.; Kraaij, W.

    2010-01-01

    Over the years, various meta-languages have been used to manually enrich documents with conceptual knowledge of some kind. Examples include keyword assignment to citations or, more recently, tags to websites. In this paper we propose generative concept models as an extension to query modeling within

  13. Task-specific visual cues for improving process model understanding

    NARCIS (Netherlands)

    Petrusel, Razvan; Mendling, Jan; Reijers, Hajo A.

    2016-01-01

    Context Business process models support various stakeholders in managing business processes and designing process-aware information systems. In order to make effective use of these models, they have to be readily understandable. Objective Prior research has emphasized the potential of visual cues to

  14. Univariate and Multivariate Specification Search Indices in Covariance Structure Modeling.

    Science.gov (United States)

    Hutchinson, Susan R.

    1993-01-01

    Simulated population data were used to compare relative performances of the modification index and C. Chou and P. M. Bentler's Lagrange multiplier test (a multivariate generalization of a modification index) for four levels of model misspecification. Both indices failed to recover the true model except at the lowest level of misspecification. (SLD)

  15. Nuclear security assessment with Markov model approach

    International Nuclear Information System (INIS)

    Suzuki, Mitsutoshi; Terao, Norichika

    2013-01-01

    Nuclear security risk assessment with a Markov model based on random events is performed to explore an evaluation methodology for physical protection in nuclear facilities. Because security incidents are initiated by malicious and intentional acts, expert judgment and Bayesian updating are used to estimate scenario and initiation likelihoods, and it is assumed that a Markov model derived from a stochastic process can be applied to the incident sequence. Both an unauthorized intrusion, as a Design Basis Threat (DBT), and a stand-off attack, as beyond-DBT, are assumed for hypothetical facilities, and the performance of physical protection and the mitigation and minimization of consequences are investigated to develop the assessment methodology in a semi-quantitative manner. It is shown that cooperation between the facility operator and the security authority is important to respond to beyond-DBT incidents. (author)
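The incident-sequence idea can be illustrated with a small discrete-time Markov chain; the states and all transition probabilities below are illustrative placeholders, not figures from the paper:

```python
# Toy incident-sequence Markov chain for a physical protection scenario.
# States: 0 = intrusion attempt, 1 = detected, 2 = neutralized (absorbing),
# 3 = sabotage completed (absorbing).
P = [
    [0.0, 0.7, 0.0, 0.3],   # attempt: detected, or succeeds undetected
    [0.0, 0.0, 0.8, 0.2],   # detected: response interrupts, or fails
    [0.0, 0.0, 1.0, 0.0],   # neutralized stays neutralized
    [0.0, 0.0, 0.0, 1.0],   # completed stays completed
]

def absorption_probs(P, start, steps=100):
    """Propagate the state distribution until it settles in the absorbing states."""
    n = len(P)
    dist = [0.0] * n
    dist[start] = 1.0
    for _ in range(steps):
        dist = [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]
    return dist

dist = absorption_probs(P, start=0)
# P(neutralized) = 0.7 * 0.8 = 0.56, P(sabotage) = 0.3 + 0.7 * 0.2 = 0.44
```

Varying the detection and response entries then quantifies, in the same semi-quantitative spirit, how much improved protection performance shifts probability mass away from the adverse absorbing state.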

  16. An integrated chemical biology approach identifies specific vulnerability of Ewing's sarcoma to combined inhibition of Aurora kinases A and B.

    Science.gov (United States)

    Winter, Georg E; Rix, Uwe; Lissat, Andrej; Stukalov, Alexey; Müllner, Markus K; Bennett, Keiryn L; Colinge, Jacques; Nijman, Sebastian M; Kubicek, Stefan; Kovar, Heinrich; Kontny, Udo; Superti-Furga, Giulio

    2011-10-01

    Ewing's sarcoma is a pediatric cancer of the bone that is characterized by the expression of the chimeric transcription factor EWS-FLI1 that confers a highly malignant phenotype and results from the chromosomal translocation t(11;22)(q24;q12). Poor overall survival and pronounced long-term side effects associated with traditional chemotherapy necessitate the development of novel, targeted, therapeutic strategies. We therefore conducted a focused viability screen with 200 small molecule kinase inhibitors in 2 different Ewing's sarcoma cell lines. This resulted in the identification of several potential molecular intervention points. Most notably, tozasertib (VX-680, MK-0457) displayed unique nanomolar efficacy, which extended to other cell lines, but was specific for Ewing's sarcoma. Furthermore, tozasertib showed strong synergies with the chemotherapeutic drugs etoposide and doxorubicin, the current standard agents for Ewing's sarcoma. To identify the relevant targets underlying the specific vulnerability toward tozasertib, we determined its cellular target profile by chemical proteomics. We identified 20 known and unknown serine/threonine and tyrosine protein kinase targets. Additional target deconvolution and functional validation by RNAi showed simultaneous inhibition of Aurora kinases A and B to be responsible for the observed tozasertib sensitivity, thereby revealing a new mechanism for targeting Ewing's sarcoma. We further corroborated our cellular observations with xenograft mouse models. In summary, the multilayered chemical biology approach presented here identified a specific vulnerability of Ewing's sarcoma to concomitant inhibition of Aurora kinases A and B by tozasertib and danusertib, which has the potential to become a new therapeutic option.

  17. An Approach for Modeling Supplier Resilience

    Science.gov (United States)

    2016-04-30

    interests include resilience modeling of supply chains, reliability engineering, and meta-heuristic optimization. Abstract: ...be availability, or the extent to which the products produced by the supply chain are available for use (measured as a ratio of uptime to total time...of the use of the product). Available systems are important in many industries, particularly in the Department of Defense, where weapons systems

  18. ISM Approach to Model Offshore Outsourcing Risks

    Directory of Open Access Journals (Sweden)

    Sunand Kumar

    2014-07-01

    Full Text Available In an effort to achieve a competitive advantage via cost reductions and improved market responsiveness, organizations are increasingly employing offshore outsourcing as a major component of their supply chain strategies. But, as is evident from the literature, a number of risks, such as political risk, risk due to cultural differences, compliance and regulatory risk, opportunistic risk and organizational structural risk, adversely affect the performance of offshore outsourcing in a supply chain network. This also leads to dissatisfaction among different stakeholders. The main objective of this paper is to identify and understand the mutual interaction among the various risks which affect the performance of offshore outsourcing. To this effect, the authors have identified various risks through an extant review of the literature. From this information, an integrated model of the risks affecting offshore outsourcing is developed using interpretive structural modelling (ISM), and the structural relationships between these risks are modeled. Further, MICMAC analysis is done to analyze the driving power and dependence of the risks, which shall be helpful to managers to identify and classify important criteria and to reveal the direct and indirect effects of each criterion on offshore outsourcing. Results show that political risk and risk due to cultural differences act as strong drivers.
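ISM's reachability computation and the MICMAC driving-power/dependence counts can be sketched as follows; the 5×5 influence matrix is a made-up example for the five risk categories named above, not the authors' survey data:

```python
# Illustrative binary influence matrix (1 = row risk influences column risk).
risks = ["political", "cultural", "regulatory", "opportunistic", "structural"]
A = [
    [1, 1, 1, 0, 1],  # political risk influences most others
    [0, 1, 0, 1, 1],
    [0, 0, 1, 1, 0],
    [0, 0, 0, 1, 0],
    [0, 0, 0, 1, 1],
]

def transitive_closure(A):
    """Warshall's algorithm: ISM's final reachability matrix."""
    n = len(A)
    R = [row[:] for row in A]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                R[i][j] = R[i][j] or (R[i][k] and R[k][j])
    return R

R = transitive_closure(A)
n = len(risks)
driving = {r: sum(R[i]) for i, r in enumerate(risks)}                 # row sums
dependence = {r: sum(R[i][j] for i in range(n)) for j, r in enumerate(risks)}
# High driving power with low dependence marks a strong driver, which is
# how a MICMAC plot would flag political risk in this toy matrix.
```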

  19. Remote sensing approach to structural modelling

    International Nuclear Information System (INIS)

    El Ghawaby, M.A.

    1989-01-01

    Remote sensing techniques are quite dependable tools in investigating geologic problems, especially those related to structural aspects. Landsat imagery provides discrimination between rock units, detection of large-scale structures such as folds and faults, as well as small-scale fabric elements such as foliation and banding. In order to fulfill the aim of geologic application of remote sensing, some essential maps should be prepared from the images prior to the structural interpretation: land-use, land-form, drainage pattern, lithological unit and structural lineament maps. Afterwards, field verification should lead to the interpretation of a comprehensive structural model of the study area to apply to the target problem. To deduce such a model, there are two ways of analysis the interpreter may go through: the direct and the indirect methods. The direct one is needed in cases where the resources or targets are controlled by an obvious or exposed structural element or pattern. The indirect way is necessary for areas where the target is governed by a complicated structural pattern. Some case histories of structural modelling methods applied successfully for exploration of radioactive minerals, iron deposits and groundwater aquifers in Egypt are presented. The progress in imagery enhancement and the integration of remote sensing data with other geophysical and geochemical data allow a geologic interpretation to be carried out that is better than that achieved with either of the individual data sets. 9 refs

  20. Conceptual model of the globalization for domain-specific languages

    NARCIS (Netherlands)

    Clark, T.; van den Brand, M.; Combemale, B.; Rumpe, B.; Combemale, B.

    2015-01-01

    Domain-Specific Languages (DSLs) have received some prominence recently. Designing a DSL and all of its tools is still cumbersome and labor-intensive. The engineering of DSLs is still in its infancy; not even the terms have been coined and agreed on. In particular, globalization and all its consequences need to

  1. Assessing the accuracy of subject-specific, muscle-model parameters determined by optimizing to match isometric strength.

    Science.gov (United States)

    DeSmitt, Holly J; Domire, Zachary J

    2016-12-01

    Biomechanical models are sensitive to the choice of model parameters. Therefore, determination of accurate subject specific model parameters is important. One approach to generate these parameters is to optimize the values such that the model output will match experimentally measured strength curves. This approach is attractive as it is inexpensive and should provide an excellent match to experimentally measured strength. However, given the problem of muscle redundancy, it is not clear that this approach generates accurate individual muscle forces. The purpose of this investigation is to evaluate this approach using simulated data to enable a direct comparison. It is hypothesized that the optimization approach will be able to recreate accurate muscle model parameters when information from measurable parameters is given. A model of isometric knee extension was developed to simulate a strength curve across a range of knee angles. In order to realistically recreate experimentally measured strength, random noise was added to the modeled strength. Parameters were solved for using a genetic search algorithm. When noise was added to the measurements the strength curve was reasonably recreated. However, the individual muscle model parameters and force curves were far less accurate. Based upon this examination, it is clear that very different sets of model parameters can recreate similar strength curves. Therefore, experimental variation in strength measurements has a significant influence on the results. Given the difficulty in accurately recreating individual muscle parameters, it may be more appropriate to perform simulations with lumped actuators representing similar muscles.
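The non-uniqueness the abstract reports is easy to reproduce with a toy model: two lumped knee extensors with identical moment arms, so only their summed strength is identifiable. The muscle model, parameter values, and the crude random search standing in for the genetic search algorithm are all illustrative assumptions:

```python
import math
import random

def strength(theta, f1, f2):
    """Net isometric knee-extension torque from two lumped muscles (toy
    model): both muscles share the same angle-dependent moment arm."""
    return (f1 + f2) * math.sin(theta)

random.seed(1)
true = (900.0, 600.0)                       # "subject-specific" max forces
angles = [math.radians(a) for a in range(30, 100, 10)]
# "Measured" strength curve: model output plus experimental noise
measured = [strength(t, *true) + random.gauss(0, 5) for t in angles]

def sse(params):
    """Sum of squared errors between modeled and measured strength."""
    return sum((strength(t, *params) - m) ** 2
               for t, m in zip(angles, measured))

# Crude random search standing in for the genetic search algorithm
best = min(((random.uniform(0, 1500), random.uniform(0, 1500))
            for _ in range(20000)), key=sse)
# The optimizer recovers the identifiable sum f1 + f2 well, but the
# individual (f1, f2) split is essentially arbitrary.
```

Any split of the recovered total force reproduces the same strength curve, which is exactly why matching strength alone cannot pin down individual muscle parameters.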

  2. Automatic generation of a subject-specific model for accurate markerless motion capture and biomechanical applications.

    Science.gov (United States)

    Corazza, Stefano; Gambaretto, Emiliano; Mündermann, Lars; Andriacchi, Thomas P

    2010-04-01

    A novel approach for the automatic generation of a subject-specific model consisting of morphological and joint location information is described. The aim is to address the need for efficient and accurate model generation for markerless motion capture (MMC) and biomechanical studies. The algorithm applied and expanded on previous work on human shape spaces by embedding location information for ten joint centers in a subject-specific free-form surface. The optimal locations of joint centers in the 3-D mesh were learned through linear regression over a set of nine subjects whose joint centers were known. The model was shown to be sufficiently accurate for both kinematic (joint centers) and morphological (shape of the body) information to allow accurate tracking with MMC systems. The automatic model generation algorithm was applied to 3-D meshes of different quality and resolution such as laser scans and visual hulls. The complete method was tested using nine subjects of different gender, body mass index (BMI), age, and ethnicity. Experimental training error and cross-validation errors were 19 and 25 mm, respectively, on average over the joints of the ten subjects analyzed in the study.
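The regression step (learning joint-center locations from body-shape information) can be sketched in one dimension; the stature feature, the 0.53 coefficient, and the noise level are invented for illustration and are not the paper's actual features or values:

```python
import random

# Hypothetical training data for nine subjects: a single shape feature
# (stature, in metres) and a measured vertical hip-joint-centre position.
# The real method regresses many mesh-shape features onto 3-D locations.
random.seed(3)
stature = [1.55 + 0.05 * i for i in range(9)]
hip_z = [0.53 * s + random.gauss(0, 0.005) for s in stature]   # toy relation

n = len(stature)
mx = sum(stature) / n
my = sum(hip_z) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(stature, hip_z))
         / sum((x - mx) ** 2 for x in stature))
intercept = my - slope * mx

# Predict the joint centre for a new subject from their shape feature
pred = intercept + slope * 1.78
train_err = sum(abs(intercept + slope * x - y)
                for x, y in zip(stature, hip_z)) / n
```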

  3. A moving approach for the Vector Hysteron Model

    Energy Technology Data Exchange (ETDEWEB)

    Cardelli, E. [Department of Engineering, University of Perugia, Via G. Duranti 93, 06125 Perugia (Italy); Faba, A., E-mail: antonio.faba@unipg.it [Department of Engineering, University of Perugia, Via G. Duranti 93, 06125 Perugia (Italy); Laudani, A. [Department of Engineering, Roma Tre University, Via V. Volterra 62, 00146 Rome (Italy); Quondam Antonio, S. [Department of Engineering, University of Perugia, Via G. Duranti 93, 06125 Perugia (Italy); Riganti Fulginei, F.; Salvini, A. [Department of Engineering, Roma Tre University, Via V. Volterra 62, 00146 Rome (Italy)

    2016-04-01

    A moving approach for the VHM (Vector Hysteron Model) is described here, to reconstruct both scalar and rotational magnetization of electrical steels with weak anisotropy, such as non-oriented grain silicon steel. The hysteron distribution is postulated to be a function of the magnetization state of the material, in order to overcome the practical limitation of the congruency property of the standard VHM approach. By using this formulation and a suitable accommodation procedure, the results obtained indicate that the model is accurate, in particular in reproducing the experimental behavior approaching the saturation region, allowing a real improvement with respect to the previous approach.

  4. An approach to model validation and model-based prediction -- polyurethane foam case study.

    Energy Technology Data Exchange (ETDEWEB)

    Dowding, Kevin J.; Rutherford, Brian Milne

    2003-07-01

    Enhanced software methodology and improved computing hardware have advanced the state of simulation technology to a point where large physics-based codes can be a major contributor in many systems analyses. This shift toward the use of computational methods has brought with it new research challenges in a number of areas including characterization of uncertainty, model validation, and the analysis of computer output. It is these challenges that have motivated the work described in this report. Approaches to and methods for model validation and (model-based) prediction have been developed recently in the engineering, mathematics and statistical literatures. In this report we have provided a fairly detailed account of one approach to model validation and prediction applied to an analysis investigating thermal decomposition of polyurethane foam. A model simulates the evolution of the foam in a high temperature environment as it transforms from a solid to a gas phase. The available modeling and experimental results serve as data for a case study focusing our model validation and prediction developmental efforts on this specific thermal application. We discuss several elements of the "philosophy" behind the validation and prediction approach: (1) We view the validation process as an activity applying to the use of a specific computational model for a specific application. We do acknowledge, however, that an important part of the overall development of a computational simulation initiative is the feedback provided to model developers and analysts associated with the application. (2) We utilize information obtained for the calibration of model parameters to estimate the parameters and quantify uncertainty in the estimates. We rely, however, on validation data (or data from similar analyses) to measure the variability that contributes to the uncertainty in predictions for specific systems or units (unit-to-unit variability). (3) We perform statistical

  5. Specification of Change Mechanisms in Pregnant Smokers for Malleable Target Identification: A Novel Approach to a Tenacious Public Health Problem

    Directory of Open Access Journals (Sweden)

    Suena H. Massey

    2017-09-01

    Full Text Available Maternal smoking during pregnancy (MSDP) continues to be a leading modifiable risk factor for perinatal complications and a range of neurodevelopmental and cardio-metabolic outcomes across the lifespan. Despite 40 years of intervention research, less than one in five pregnant smokers who receive an intervention quit by delivery. Within this context, recognition of pregnancy is commonly associated with abrupt suspension or reduction of smoking in the absence of intervention, yet this has not been investigated as a volitional target. The goal of this article is to provide the empirical foundation for a novel direction of research aimed at identifying malleable targets for intervention through the specification of behavior change mechanisms specific to pregnant women. To do so, we: (1) summarize progress on MSDP in the United States generated from conventional empirical approaches to health behavior change; (2) discuss the phenomenon of spontaneous change in the absence of intervention among pregnant smokers to illustrate the need for mechanistic specification of behavior change motivated by concern for fetal well-being; (3) summarize component processes in neurobiological models of parental and non-parental social behaviors as a conceptual framework for understanding change mechanisms during pregnancy; (4) discuss the evidence for the malleability of these processes to support their translational relevance for preventive interventions; and (5) propose a roadmap for validating the proposed change mechanism using an experimental medicine approach. A greater understanding of social and interpersonal processes that facilitate health behavior change among expectant mothers, and how these processes differ interindividually, could yield novel volitional targets for prenatal interventions. More broadly, explicating other-oriented mechanisms of behavior change during pregnancy could serve as a paradigm for understanding how social and interpersonal processes

  6. Engineering approach to modeling of piled systems

    International Nuclear Information System (INIS)

    Coombs, R.F.; Silva, M.A.G. da

    1980-01-01

    Available methods of analysis of piled systems subjected to dynamic excitation invade areas of mathematics usually beyond the reach of a practising engineer. A simple technique that avoids that conflict is proposed, at least for preliminary studies, and its application, compared with other methods, is shown to be satisfactory. A corrective factor for parameters currently used to represent transmitting boundaries is derived for a finite strip that models an infinite layer. The influence of internal damping on the dynamic stiffness of the layer and on radiation damping is analysed. (Author) [pt

  7. Jackiw-Pi model: A superfield approach

    Science.gov (United States)

    Gupta, Saurabh

    2014-12-01

    We derive the off-shell nilpotent and absolutely anticommuting Becchi-Rouet-Stora-Tyutin (BRST) as well as anti-BRST transformations s_(a)b corresponding to the Yang-Mills gauge transformations of the 3D Jackiw-Pi model by exploiting the "augmented" superfield formalism. We also show that the Curci-Ferrari restriction, which is a hallmark of any non-Abelian 1-form gauge theory, emerges naturally within this formalism and plays an instrumental role in providing the proof of the absolute anticommutativity of s_(a)b.

  8. Applied Regression Modeling A Business Approach

    CERN Document Server

    Pardoe, Iain

    2012-01-01

    An applied and concise treatment of statistical regression techniques for business students and professionals who have little or no background in calculus. Regression analysis is an invaluable statistical methodology in business settings and is vital to model the relationship between a response variable and one or more predictor variables, as well as the prediction of a response value given values of the predictors. In view of the inherent uncertainty of business processes, such as the volatility of consumer spending and the presence of market uncertainty, business professionals use regression a

  9. Time-specific ecological niche modeling predicts spatial dynamics of vector insects and human dengue cases.

    Science.gov (United States)

    Peterson, A Townsend; Martínez-Campos, Carmen; Nakazawa, Yoshinori; Martínez-Meyer, Enrique

    2005-09-01

    Numerous human diseases (malaria, dengue, yellow fever and leishmaniasis, to name a few) are transmitted by insect vectors with brief life cycles and biting activity that varies in both space and time. Although the general geographic distributions of these epidemiologically important species are known, the spatiotemporal variation in their emergence and activity remains poorly understood. We used ecological niche modeling via a genetic algorithm to produce time-specific predictive models of monthly distributions of Aedes aegypti in Mexico in 1995. Significant predictions of monthly mosquito activity and distributions indicate that predicting the spatiotemporal dynamics of disease vector species is feasible; significant coincidence with human cases of dengue indicates that these dynamics probably translate directly into transmission of dengue virus to humans. This approach provides new potential for optimizing the use of resources for disease prevention and remediation via automated forecasting of disease transmission risk.

  10. Implicit moral evaluations: A multinomial modeling approach.

    Science.gov (United States)

    Cameron, C Daryl; Payne, B Keith; Sinnott-Armstrong, Walter; Scheffer, Julian A; Inzlicht, Michael

    2017-01-01

    Implicit moral evaluations, i.e., immediate, unintentional assessments of the wrongness of actions or persons, play a central role in supporting moral behavior in everyday life. Yet little research has employed methods that rigorously measure individual differences in implicit moral evaluations. In five experiments, we develop a new sequential priming measure, the Moral Categorization Task, and a multinomial model that decomposes judgment on this task into multiple component processes. These include implicit moral evaluations of moral transgression primes (Unintentional Judgment), accurate moral judgments about target actions (Intentional Judgment), and a directional tendency to judge actions as morally wrong (Response Bias). Speeded response deadlines reduced Intentional Judgment but not Unintentional Judgment (Experiment 1). Unintentional Judgment was stronger toward moral transgression primes than non-moral negative primes (Experiments 2-4). Intentional Judgment was associated with increased error-related negativity, a neurophysiological indicator of behavioral control (Experiment 4). Finally, people who voted for an anti-gay-marriage amendment had stronger Unintentional Judgment toward gay marriage primes (Experiment 5). Across Experiments 1-4, implicit moral evaluations converged with moral personality: Unintentional Judgment about wrong primes, but not negative primes, was negatively associated with psychopathic tendencies and positively associated with moral identity and guilt proneness. Theoretical and practical applications of formal modeling for moral psychology are discussed. Copyright © 2016 Elsevier B.V. All rights reserved.
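The decomposition can be written as a small multinomial processing tree: intentional judgment of the target takes precedence; failing that, the unintentional evaluation of the prime drives the response; failing both, a response bias applies. The branch ordering below is an illustrative reconstruction, not necessarily the paper's exact equations:

```python
def p_wrong(intentional, unintentional, bias, target_wrong, prime_wrong):
    """Probability of responding 'morally wrong' on one trial.

    intentional   -- P(accurate judgment of the target drives the response)
    unintentional -- P(implicit evaluation of the prime drives it otherwise)
    bias          -- P('wrong' response when neither process drives it)
    """
    p = intentional * (1.0 if target_wrong else 0.0)
    p += (1.0 - intentional) * unintentional * (1.0 if prime_wrong else 0.0)
    p += (1.0 - intentional) * (1.0 - unintentional) * bias
    return p
```

Fitting such a model means choosing the three parameters so that these predicted probabilities match observed response frequencies across the prime/target trial types, which is what lets the task separate the component processes.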

  11. Modeling Saturn's Inner Plasmasphere: Cassini's Closest Approach

    Science.gov (United States)

    Moore, L.; Mendillo, M.

    2005-05-01

    Ion densities from the three-dimensional Saturn-Thermosphere-Ionosphere-Model (STIM, Moore et al., 2004) are extended above the plasma exobase using the formalism of Pierrard and Lemaire (1996, 1998), which evaluates the balance of gravitational, centrifugal and electric forces on the plasma. The parameter space of low-energy ionospheric contributions to Saturn's plasmasphere is explored by comparing results that span the observed extremes of plasma temperature, 650 K to 1700 K, and a range of velocity distributions, Lorentzian (or Kappa) to Maxwellian. Calculations are made for plasma densities along the path of the Cassini spacecraft's orbital insertion on 1 July 2004. These calculations neglect any ring or satellite sources of plasma, which are most likely minor contributors at 1.3 Saturn radii. Modeled densities will be compared with Cassini measurements as they become available. Moore, L.E., M. Mendillo, I.C.F. Mueller-Wodarg, and D.L. Murr, Icarus, 172, 503-520, 2004. Pierrard, V. and J. Lemaire, J. Geophys. Res., 101, 7923-7934, 1996. Pierrard, V. and J. Lemaire, J. Geophys. Res., 103, 4117, 1998.

  12. Keyring models: An approach to steerability

    Science.gov (United States)

    Miller, Carl A.; Colbeck, Roger; Shi, Yaoyun

    2018-02-01

    If a measurement is made on one half of a bipartite system, then, conditioned on the outcome, the other half has a new reduced state. If these reduced states defy classical explanation—that is, if shared randomness cannot produce these reduced states for all possible measurements—the bipartite state is said to be steerable. Determining which states are steerable is a challenging problem even for low dimensions. In the case of two-qubit systems, a criterion is known for T-states (that is, those with maximally mixed marginals) under projective measurements. In the current work, we introduce the concept of keyring models—a special class of local hidden state models. When the measurements made correspond to real projectors, these allow us to study steerability beyond T-states. Using keyring models, we completely solve the steering problem for real projective measurements when the state arises from mixing a pure two-qubit state with uniform noise. We also give a partial solution in the case when the uniform noise is replaced by independent depolarizing channels.

  13. Fractional Gaussian noise: Prior specification and model comparison

    KAUST Repository

    Sørbye, Sigrunn Holbek; Rue, Haavard

    2017-01-01

    Fractional Gaussian noise (fGn) is a stationary stochastic process used to model antipersistent or persistent dependency structures in observed time series. Properties of the autocovariance function of fGn are characterised by the Hurst exponent (H), which, in Bayesian contexts, typically has been assigned a uniform prior on the unit interval. This paper argues why a uniform prior is unreasonable and introduces the use of a penalised complexity (PC) prior for H. The PC prior is computed to penalise divergence from the special case of white noise and is invariant to reparameterisations. An immediate advantage is that the exact same prior can be used for the autocorrelation coefficient ϕ of a first-order autoregressive process AR(1), as this model also reflects a flexible version of white noise. Within the general setting of latent Gaussian models, this allows us to compare an fGn model component with AR(1) using Bayes factors, avoiding the confounding effects of prior choices for the two hyperparameters H and ϕ. Among others, this is useful in climate regression models where inference for underlying linear or smooth trends depends heavily on the assumed noise model.
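The role of H, and the white-noise special case the PC prior shrinks toward, can be checked from the standard fGn autocovariance, γ(k) = σ²/2 (|k+1|^{2H} − 2|k|^{2H} + |k−1|^{2H}); this is textbook fGn, not anything specific to the paper's prior construction:

```python
def fgn_autocov(k, H, sigma2=1.0):
    """Autocovariance of fractional Gaussian noise at integer lag k."""
    return 0.5 * sigma2 * (abs(k + 1) ** (2 * H)
                           - 2 * abs(k) ** (2 * H)
                           + abs(k - 1) ** (2 * H))

# H = 0.5 recovers white noise (zero covariance at every nonzero lag),
# H > 0.5 gives persistent (positive) lag-one covariance, and
# H < 0.5 gives antipersistent (negative) lag-one covariance.
```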

  15. Current status of top-specific variant axion model

    Science.gov (United States)

    Chiang, Cheng-Wei; Fukuda, Hajime; Takeuchi, Michihisa; Yanagida, Tsutomu T.

    2018-02-01

    The invisible variant axion model is one of the very attractive models which solve the strong CP problem but do not provoke the domain wall problem. At the electroweak scale, this model requires at least two Higgs doublets, one of which carries a nonzero Peccei-Quinn (PQ) charge and the other is neutral. We consider a scenario where only the right-handed top quark is charged under the PQ symmetry and couples with the PQ-charged Higgs doublet. As a general prediction of this model, the top quark can decay to the observed standard model-like Higgs boson h and the charm or up quark, t → hc/u, which recently exhibited slight excesses at LHC run-I and run-II and will soon be testable at LHC run-II. If the rare top decay excess stays at the observed central value, we show that tan β ∼ 1 or smaller is preferred by the Higgs data. The chiral nature of the Higgs flavor-changing interaction is a distinctive feature of this model and testable using the angular distribution of the t → ch decays at the LHC.

  16. Mathematical Modeling in Mathematics Education: Basic Concepts and Approaches

    Science.gov (United States)

    Erbas, Ayhan Kürsat; Kertil, Mahmut; Çetinkaya, Bülent; Çakiroglu, Erdinç; Alacaci, Cengiz; Bas, Sinem

    2014-01-01

    Mathematical modeling and its role in mathematics education have been receiving increasing attention in Turkey, as in many other countries. The growing body of literature on this topic reveals a variety of approaches to mathematical modeling and related concepts, along with differing perspectives on the use of mathematical modeling in teaching and…

  17. A BEHAVIORAL-APPROACH TO LINEAR EXACT MODELING

    NARCIS (Netherlands)

    ANTOULAS, AC; WILLEMS, JC

    1993-01-01

    The behavioral approach to system theory provides a parameter-free framework for the study of the general problem of linear exact modeling and recursive modeling. The main contribution of this paper is the solution of the (continuous-time) polynomial-exponential time series modeling problem. Both

  18. A modular approach to numerical human body modeling

    NARCIS (Netherlands)

    Forbes, P.A.; Griotto, G.; Rooij, L. van

    2007-01-01

    The choice of a human body model for a simulated automotive impact scenario must take into account both accurate model response and computational efficiency as key factors. This study presents a "modular numerical human body modeling" approach which allows the creation of a customized human body

  19. A Pattern-Oriented Approach to a Methodical Evaluation of Modeling Methods

    Directory of Open Access Journals (Sweden)

    Michael Amberg

    1996-11-01

    The paper describes a pattern-oriented approach to evaluate modeling methods and to compare various methods with each other from a methodical viewpoint. A specific set of principles (the patterns) is defined by investigating the notations and the documentation of comparable modeling methods. Each principle helps to examine some parts of the methods from a specific point of view. All principles together lead to an overall picture of the method under examination. First the core ("method-neutral") meaning of each principle is described. Then the methods are examined regarding the principle. Afterwards the method-specific interpretations are compared with each other and with the core meaning of the principle. By this procedure, the strengths and weaknesses of modeling methods regarding methodical aspects are identified. The principles are described uniformly using a principle description template, analogous to descriptions of object-oriented design patterns. The approach is demonstrated by evaluating a business process modeling method.

  20. Subject-specific computational modeling of DBS in the PPTg area

    Directory of Open Access Journals (Sweden)

    Laura M. Zitella

    2015-07-01

    Deep brain stimulation (DBS) in the pedunculopontine tegmental nucleus (PPTg) has been proposed to alleviate medically intractable gait difficulties associated with Parkinson's disease. Clinical trials have shown somewhat variable outcomes, stemming in part from surgical targeting variability, modulation of fiber pathways implicated in side effects, and a general lack of mechanistic understanding of DBS in this brain region. Subject-specific computational models of DBS are a promising tool to investigate the underlying therapy and side effects. In this study, a parkinsonian rhesus macaque was implanted unilaterally with an 8-contact DBS lead in the PPTg region. Fiber tracts adjacent to PPTg, including the oculomotor nerve, central tegmental tract, and superior cerebellar peduncle, were reconstructed from a combination of pre-implant 7T MRI, post-implant CT, and post-mortem histology. These structures were populated with axon models and coupled with a finite element model simulating the voltage distribution in the surrounding neural tissue during stimulation. This study introduces two empirical approaches to evaluate model parameters. First, incremental monopolar cathodic stimulation (20 Hz, 90 µs pulse width) was evaluated for each electrode, during which a right eyelid flutter was observed at the proximal four contacts (-1.0 to -1.4 mA). These current amplitudes followed closely with model-predicted activation of the oculomotor nerve when assuming an anisotropic conduction medium. Second, PET imaging was collected OFF-DBS and twice during DBS (two different contacts), which supported the model-predicted activation of the central tegmental tract and superior cerebellar peduncle. Together, subject-specific models provide a framework to more precisely predict pathways modulated by DBS.

  1. A shorter and more specific oral sensitization-based experimental model of food allergy in mice.

    Science.gov (United States)

    Bailón, Elvira; Cueto-Sola, Margarita; Utrilla, Pilar; Rodríguez-Ruiz, Judith; Garrido-Mesa, Natividad; Zarzuelo, Antonio; Xaus, Jordi; Gálvez, Julio; Comalada, Mònica

    2012-07-31

    Cow's milk protein allergy (CMPA) is one of the most prevalent human food-borne allergies, particularly in children. Experimental animal models have become critical tools with which to perform research on new therapeutic approaches and on the molecular mechanisms involved. However, oral food allergen sensitization in mice requires several weeks and is usually associated with unspecific immune responses. To overcome these inconveniences, we have developed a new food allergy model that takes only two weeks while retaining the main characteristics of the allergic response to food antigens. The new model is characterized by oral sensitization of weaned Balb/c mice with 5 doses of purified cow's milk protein (CMP) plus cholera toxin (CT) for only two weeks, followed by an intraperitoneal challenge with the allergen at the end of the sensitization period. In parallel, we studied a conventional protocol that lasts for seven weeks, and also the non-specific effects exerted by CT in both protocols. The shorter protocol achieves a similar clinical score as the original food allergy model without macroscopically affecting gut morphology or physiology. Moreover, the shorter protocol caused an increased IL-4 production and a more selective antigen-specific IgG1 response. Finally, the extended CT administration during the sensitization period of the conventional protocol is responsible for the exacerbated immune response observed in that model. Therefore, the new model presented here allows a reduction not only in experimental time but also in the number of animals required per experiment while maintaining the features of conventional allergy models. We propose that the new protocol reported will contribute to advancing allergy research. Copyright © 2012 Elsevier B.V. All rights reserved.

  2. Forecasting wind-driven wildfires using an inverse modelling approach

    Directory of Open Access Journals (Sweden)

    O. Rios

    2014-06-01

    A technology able to rapidly forecast wildfire dynamics would lead to a paradigm shift in the response to emergencies, providing the Fire Service with essential information about the ongoing fire. This paper presents and explores a novel methodology to forecast wildfire dynamics in wind-driven conditions, using real-time data assimilation and inverse modelling. The forecasting algorithm combines Rothermel's rate-of-spread theory with a perimeter expansion model based on Huygens' principle and solves the optimisation problem with a tangent linear approach and forward automatic differentiation. Its potential is investigated using synthetic data and evaluated in different wildfire scenarios. The results show the capacity of the method to quickly predict the location of the fire front with a positive lead time (ahead of the event) on the order of 10 min for a spatial scale of 100 m. The greatest strengths of our method are its lightness, speed and flexibility. We specifically tailor the forecast to be efficient and computationally cheap so it can be used in mobile systems for field deployment and operational use. Thus, we put emphasis on producing a positive lead time and the means to maximise it.
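    A minimal sketch of the Huygens-type perimeter expansion step (a hypothetical illustration, not the authors' code): each perimeter vertex is advanced along its outward normal at the local rate of spread. In the full method the rate of spread comes from Rothermel's model and the wind field; here it is simply passed in as an array.

```python
import numpy as np

def expand_perimeter(xy, ros, dt):
    """Advance each fire-perimeter vertex along its outward normal.

    xy  : (n, 2) vertices of a closed, counter-clockwise polygon
    ros : (n,) local rate of spread [m/s] at each vertex
    dt  : time step [s]
    """
    nxt, prv = np.roll(xy, -1, axis=0), np.roll(xy, 1, axis=0)
    tang = nxt - prv                                      # central-difference tangent
    normal = np.column_stack([tang[:, 1], -tang[:, 0]])   # outward normal for CCW polygon
    normal /= np.linalg.norm(normal, axis=1, keepdims=True)
    return xy + normal * (np.asarray(ros) * dt)[:, None]
```

    Repeating this step with data-assimilated spread rates is the forward model that the inverse (tangent linear) machinery differentiates through.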

  3. Multi-atlas and label fusion approach for patient-specific MRI based skull estimation.

    Science.gov (United States)

    Torrado-Carvajal, Angel; Herraiz, Joaquin L; Hernandez-Tamames, Juan A; San Jose-Estepar, Raul; Eryaman, Yigitcan; Rozenholc, Yves; Adalsteinsson, Elfar; Wald, Lawrence L; Malpica, Norberto

    2016-04-01

    MRI-based skull segmentation is a useful procedure for many imaging applications. This study describes a methodology for automatic segmentation of the complete skull from a single T1-weighted volume. The skull is estimated using a multi-atlas segmentation approach. Using a whole head computed tomography (CT) scan database, the skull in a new MRI volume is detected by nonrigid image registration of the volume to every CT, and combination of the individual segmentations by label-fusion. We have compared Majority Voting, Simultaneous Truth and Performance Level Estimation (STAPLE), Shape Based Averaging (SBA), and the Selective and Iterative Method for Performance Level Estimation (SIMPLE) algorithms. The pipeline has been evaluated quantitatively using images from the Retrospective Image Registration Evaluation database (reaching an overlap of 72.46 ± 6.99%), a clinical CT-MR dataset (maximum overlap of 78.31 ± 6.97%), and a whole head CT-MRI pair (maximum overlap 78.68%). A qualitative evaluation has also been performed on MRI acquisition of volunteers. It is possible to automatically segment the complete skull from MRI data using a multi-atlas and label fusion approach. This will allow the creation of complete MRI-based tissue models that can be used in electromagnetic dosimetry applications and attenuation correction in PET/MR. © 2015 Wiley Periodicals, Inc.
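    Of the label-fusion algorithms compared above, Majority Voting is the simplest to illustrate. The sketch below is a generic toy (not the study's pipeline) that fuses co-registered atlas segmentations voxel-wise:

```python
import numpy as np

def majority_vote(label_maps):
    """Fuse co-registered atlas segmentations voxel-wise by majority vote.

    label_maps : sequence of integer label arrays on a common grid.
    Returns the per-voxel most frequent label (ties resolved to the lowest label).
    """
    stack = np.stack(label_maps)
    labels = np.unique(stack)
    # votes[i] counts, per voxel, how many atlases assigned labels[i]
    votes = np.stack([(stack == lab).sum(axis=0) for lab in labels])
    return labels[np.argmax(votes, axis=0)]
```

    STAPLE, SBA and SIMPLE replace the flat vote with performance-weighted or shape-aware combinations, which is why they can outperform this baseline.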

  4. A Bayesian approach for quantification of model uncertainty

    International Nuclear Information System (INIS)

    Park, Inseok; Amarchinta, Hemanth K.; Grandhi, Ramana V.

    2010-01-01

    In most engineering problems, more than one model can be created to represent an engineering system's behavior. Uncertainty is inevitably involved in selecting the best model from among the models that are possible. Uncertainty in model selection cannot be ignored, especially when the differences between the predictions of competing models are significant. In this research, a methodology is proposed to quantify model uncertainty using measured differences between experimental data and model outcomes under a Bayesian statistical framework. The adjustment factor approach is used to propagate model uncertainty into prediction of a system response. A nonlinear vibration system is used to demonstrate the processes for implementing the adjustment factor approach. Finally, the methodology is applied to the engineering benefits of a laser peening process, and a confidence band for residual stresses is established to indicate the reliability of model prediction.
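    The adjustment-factor idea can be sketched in a few lines: the best model's prediction is shifted by a posterior-probability-weighted adjustment, and the between-model spread serves as a model-uncertainty variance. This is a generic illustration of the additive variant, not the paper's exact formulation:

```python
import numpy as np

def adjusted_prediction(preds, post_probs):
    """Additive adjustment-factor combination of competing model predictions.

    preds      : (n_models,) predictions of the same response
    post_probs : (n_models,) posterior model probabilities (must sum to 1)
    Returns (adjusted mean, variance reflecting model uncertainty).
    """
    preds = np.asarray(preds, float)
    w = np.asarray(post_probs, float)
    best = preds[np.argmax(w)]                  # prediction of the most probable model
    mean = best + np.sum(w * (preds - best))    # equals the probability-weighted mean
    var = np.sum(w * (preds - mean)**2)         # between-model spread
    return mean, var
```

    A confidence band like the one reported for residual stresses then follows from mean ± a multiple of sqrt(var), combined with within-model uncertainty.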

  5. A Networks Approach to Modeling Enzymatic Reactions.

    Science.gov (United States)

    Imhof, P

    2016-01-01

    Modeling enzymatic reactions is a demanding task due to the complexity of the system, the many degrees of freedom involved and the complex, chemical, and conformational transitions associated with the reaction. Consequently, enzymatic reactions are not determined by precisely one reaction pathway. Hence, it is beneficial to obtain a comprehensive picture of possible reaction paths and competing mechanisms. By combining individually generated intermediate states and chemical transition steps a network of such pathways can be constructed. Transition networks are a discretized representation of a potential energy landscape consisting of a multitude of reaction pathways connecting the end states of the reaction. The graph structure of the network allows an easy identification of the energetically most favorable pathways as well as a number of alternative routes. © 2016 Elsevier Inc. All rights reserved.
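    Once a transition network is built, graph search makes "energetically most favorable pathway" concrete. One common criterion, sketched below under that assumption (this is an illustration, not the chapter's method), minimises the rate-limiting (maximum) barrier along the path using a Dijkstra-style search:

```python
import heapq

def most_favorable_path(graph, start, goal):
    """Minimax path search on a transition network.

    graph : {node: [(neighbor, barrier), ...]} with barrier heights as edge weights.
    Returns (highest barrier along the best path, node list), where "best"
    minimises the rate-limiting maximum barrier between the end states.
    """
    heap = [(0.0, start, [start])]
    seen = {}
    while heap:
        worst, node, path = heapq.heappop(heap)
        if node == goal:
            return worst, path
        if node in seen and seen[node] <= worst:
            continue
        seen[node] = worst
        for nbr, barrier in graph.get(node, []):
            heapq.heappush(heap, (max(worst, barrier), nbr, path + [nbr]))
    return float('inf'), []
```

    Enumerating the next-best alternatives in the same way yields the "number of alternative routes" the abstract mentions.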

  6. Carbonate rock depositional models: A microfacies approach

    Energy Technology Data Exchange (ETDEWEB)

    Carozzi, A.V.

    1988-01-01

    Carbonate rocks contain more than 50% by weight carbonate minerals such as calcite, dolomite, and siderite. Understanding how these rocks form can lead to more efficient methods of petroleum exploration. Microfacies analysis techniques can be used as a method of predicting models of sedimentation for carbonate rocks. Microfacies in carbonate rocks can be seen clearly only in thin sections under a microscope. Thin section analysis of carbonate rocks is a tool that can be used to understand depositional environments, the diagenetic evolution of carbonate rocks, and the formation of porosity and permeability in carbonate rocks. Microfacies analysis techniques are applied to understanding the origin and formation of carbonate ramps, carbonate platforms, and carbonate slopes and basins. This book will be of interest to students and professionals concerned with the disciplines of sedimentary petrology, sedimentology, petroleum geology, and paleontology.

  7. 76 FR 189 - Notice of Availability of the Models for Plant-Specific Adoption of Technical Specifications Task...

    Science.gov (United States)

    2011-01-03

    ... [pressurized water reactor] Operability Requirements and Actions for RCS [reactor coolant system] Leakage... Specifications (STS) to define a new time limit for restoring inoperable RCS leakage detection instrumentation to... operability of the RCS leakage detection instrumentation. The CLIIP model SE will facilitate expedited...

  8. 75 FR 79048 - Notice of Availability of the Models for Plant-Specific Adoption of Technical Specifications Task...

    Science.gov (United States)

    2010-12-17

    ... [boiling water reactor] Operability Requirements and Actions for RCS [reactor coolant system] Leakage... Specifications (STS) to define a new time limit for restoring inoperable RCS leakage detection instrumentation to... operability of the RCS leakage detection instrumentation. The CLIIP model SE will facilitate expedited...

  9. Specification Search for Identifying the Correct Mean Trajectory in Polynomial Latent Growth Models

    Science.gov (United States)

    Kim, Minjung; Kwok, Oi-Man; Yoon, Myeongsun; Willson, Victor; Lai, Mark H. C.

    2016-01-01

    This study investigated the optimal strategy for model specification search under the latent growth modeling (LGM) framework, specifically on searching for the correct polynomial mean or average growth model when there is no a priori hypothesized model in the absence of theory. In this simulation study, the effectiveness of different starting…

  10. Risk prediction model: Statistical and artificial neural network approach

    Science.gov (United States)

    Paiman, Nuur Azreen; Hariri, Azian; Masood, Ibrahim

    2017-04-01

    Prediction models are increasingly gaining popularity and have been used in numerous areas of study to complement clinical reasoning and decision making. The adoption of such models assists physicians' decision making and individuals' behavior, and consequently improves individual outcomes and the cost-effectiveness of care. The objective of this paper is to review articles related to risk prediction models in order to understand the suitable approach, development and validation process of a risk prediction model. A qualitative review of the aims, methods and significant main outcomes of nineteen published articles that developed risk prediction models in numerous fields was conducted. This paper also reviews how researchers develop and validate risk prediction models based on statistical and artificial neural network approaches. From the review, some methodological recommendations for developing and validating prediction models are highlighted. According to the studies reviewed, artificial neural network approaches to developing prediction models were more accurate than statistical approaches. However, only limited published literature currently discusses which approach is more accurate for risk prediction model development.
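    As a concrete baseline for the statistical approach discussed above, a risk prediction model can be as simple as logistic regression fit by gradient descent. The sketch below is a generic illustration (all names and settings are hypothetical), not a reconstruction of any reviewed study:

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, epochs=500):
    """Fit a logistic-regression risk model by batch gradient descent.

    X : (n, p) predictor matrix; y : (n,) binary outcomes (0/1).
    Returns weights of length p+1, including an intercept term.
    """
    Xb = np.hstack([np.ones((len(X), 1)), X])   # prepend intercept column
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))       # predicted risk
        w -= lr * Xb.T @ (p - y) / len(y)       # gradient of the log-loss
    return w

def predict_risk(w, X):
    Xb = np.hstack([np.ones((len(X), 1)), X])
    return 1.0 / (1.0 + np.exp(-Xb @ w))
```

    Validation then proceeds as the review describes: held-out discrimination and calibration checks, against which neural-network alternatives are compared.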

  11. Modelling contaminant transport using site specific data from Vaalputs

    International Nuclear Information System (INIS)

    Botha, J.F.

    1986-01-01

    The transport of a contaminant through the upper layers of the earth's surface is a complex phenomenon. To develop a model for this requires a good understanding of the physical nature of the phenomenon. This paper discusses two difficulties frequently encountered in developing such a model - the nature of the subsurface and the mathematical representation of the unsaturated hydraulic parameters. It is proposed that information obtained from pump and packer tests be used to circumvent the first difficulty, and that the unsaturated flow parameters be approximated by C∞-continuous functions.

  12. A systems approach to predict oncometabolites via context-specific genome-scale metabolic networks.

    Directory of Open Access Journals (Sweden)

    Hojung Nam

    2014-09-01

    Altered metabolism in cancer cells has been viewed as a passive response required for a malignant transformation. However, this view has changed through the recently described metabolic oncogenic factors: mutated isocitrate dehydrogenases (IDH), succinate dehydrogenase (SDH), and fumarate hydratase (FH), which produce oncometabolites that competitively inhibit epigenetic regulation. In this study, we demonstrate in silico predictions of oncometabolites that have the potential to dysregulate epigenetic controls in nine types of cancer by incorporating massive-scale genetic mutation information (collected from more than 1,700 cancer genomes), expression profiling data, and deploying Recon 2 to reconstruct context-specific genome-scale metabolic models. Our analysis predicted 15 compounds and 24 substructures of potential oncometabolites that could result from the loss-of-function and gain-of-function mutations of metabolic enzymes, respectively. These results suggest a substantial potential for discovering unidentified oncometabolites in various forms of cancer.

  13. A dual model approach to ground water recovery trench design

    International Nuclear Information System (INIS)

    Clodfelter, C.L.; Crouch, M.S.

    1992-01-01

    The design of trenches for contaminated ground water recovery must consider several variables. This paper presents a dual-model approach for effectively recovering contaminated ground water migrating toward a trench by advection. The approach involves an analytical model to determine the vertical influence of the trench and a numerical flow model to determine the capture zone within the trench and the surrounding aquifer. The analytical model is utilized by varying trench dimensions and head values to design a trench which meets the remediation criteria. The numerical flow model is utilized to select the type of backfill and location of sumps within the trench. The dual-model approach can be used to design a recovery trench which effectively captures advective migration of contaminants in the vertical and horizontal planes

  14. A Review of Nutrition-Specific and Nutrition-Sensitive Approaches to Preventing Moderate Acute Malnutrition

    International Nuclear Information System (INIS)

    Mucha, Noreen; Jimenez, Michelle; Stone-Jimenez, Maryanne; Brown, Rebecca

    2014-01-01

    Recent literature reviews have demonstrated the limited efficacy of targeted supplementary feeding programmes aimed at both treating and preventing moderate acute malnutrition (MAM), with high rates of defaulting, low coverage and high associated costs. There is a growing interest in a) reviewing and improving protocols / tools for the management of acute malnutrition and b) increasing the quality and variety of products available for the treatment / prevention of moderate acute malnutrition. There is, however, varying evidence on the impact of nutritional products aimed at preventing or treating acute malnutrition, or on the comparative efficacy of different products. Following several literature reviews and operational research with varying results, there is increasing consensus that MAM should be tackled not only through products, and that clearer guidance should be provided on broader preventive strategies, such as optimal infant and young child feeding (IYCF) and caregiving practices, optimal maternal nutrition, counselling, social protection, food security and livelihoods, and water, sanitation and hygiene (WASH). The CMAM Forum has commissioned Technical Briefs which aim to summarise current thinking and practice relating to preventive approaches to MAM, looking at the role of both nutrition-specific and nutrition-sensitive interventions. The work is being launched in January 2014 and results will be available for presentation at the IAEA MAM Symposium in May 2014. The briefs aim to provide: • An overview of approaches to preventing MAM across different sectors (e.g. agriculture, health, IYCF, social protection, water and sanitation) and in different contexts. • A review of current knowledge including: – Evidence from systematic and literature reviews. – Existing approaches and practice for prevention of MAM. – Current guidance on making programmatic choices relating to MAM prevention interventions and decision-making frameworks.

  15. Predicting the functions and specificity of triterpenoid synthases: a mechanism-based multi-intermediate docking approach.

    Directory of Open Access Journals (Sweden)

    Bo-Xue Tian

    2014-10-01

    Terpenoid synthases construct the carbon skeletons of tens of thousands of natural products. To predict functions and specificity of triterpenoid synthases, a mechanism-based, multi-intermediate docking approach is proposed. In addition to enzyme function prediction, other potential applications of the current approach, such as enzyme mechanistic studies and enzyme redesign by mutagenesis, are discussed.

  16. Re-conceptualising prenatal life stressors in predicting post-partum depression: cumulative-, specific-, and domain-specific approaches to calculating risk.

    Science.gov (United States)

    Liu, Cindy H; Tronick, Ed

    2013-09-01

    Prenatal life stress predicts post-partum depression (PPD); however, studies generally examine individual stressors (a specific approach) or the summation of such exposure (a cumulative approach) and their associations with PPD. Such approaches may oversimplify prenatal life stress as a risk factor for PPD. We evaluated approaches in assessing prenatal life stress as a predictor of PPD diagnosis, including a domain-specific approach that captures cumulative life stress while accounting for stress across different life stress domains: financial, relational, and physical health. The Pregnancy Risk Assessment Monitoring System, a population-based survey, was used to analyse the association of prenatal life stressors with PPD diagnoses among 3566 New York City post-partum women. Specific stressors were not associated with PPD diagnosis after controlling for sociodemographic variables. Exposure to a greater number of stressors was associated with PPD diagnosis, even after adjusting for both sociodemographic variables and specific stressors [odds ratio (OR) = 3.1, 95% confidence interval (CI) = 1.5, 6.7]. Individuals reporting a moderate-to-high number of financial problems along with a moderate-to-high number of physical problems were at greater odds of PPD (OR = 4.2, 95% CI = 1.2, 15.3); those with a moderate-to-high number of problems in all three domains were at over fivefold increased odds of PPD (OR = 5.5, CI = 1.1, 28.5). In assessing prenatal stress, clinicians should consider the extent to which stressors occur across different life domains; this association appears stronger with PPD diagnosis than simple assessments of individual stressors, which typically overestimate risk or cumulative exposures. © 2013 John Wiley & Sons Ltd.
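    The distinction between cumulative and domain-specific scoring is mechanical and easy to sketch. The helper below is hypothetical (the threshold for "moderate-to-high" is an assumed placeholder, not the study's cut-off); it returns both the total stressor count used by the cumulative approach and the set of flagged domains used by the domain-specific approach:

```python
def stress_risk_profile(counts, threshold=2):
    """Summarise prenatal stress exposure per domain and overall.

    counts    : dict of stressor counts per life domain,
                e.g. {"financial": 3, "relational": 0, "physical": 2}
    threshold : count at which a domain is flagged moderate-to-high (assumed value)
    Returns (total stressor count, set of flagged domains).
    """
    flagged = {domain for domain, n in counts.items() if n >= threshold}
    return sum(counts.values()), flagged
```

    The study's finding is that the number of flagged domains, not the raw total alone, carries the strongest association with PPD diagnosis.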

  17. A systemic approach for modeling soil functions

    Science.gov (United States)

    Vogel, Hans-Jörg; Bartke, Stephan; Daedlow, Katrin; Helming, Katharina; Kögel-Knabner, Ingrid; Lang, Birgit; Rabot, Eva; Russell, David; Stößel, Bastian; Weller, Ulrich; Wiesmeier, Martin; Wollschläger, Ute

    2018-03-01

    The central importance of soil for the functioning of terrestrial systems is increasingly recognized. Critically relevant for water quality, climate control, nutrient cycling and biodiversity, soil provides more functions than just the basis for agricultural production. Nowadays, soil is increasingly under pressure as a limited resource for the production of food, energy and raw materials. This has led to an increasing demand for concepts assessing soil functions so that they can be adequately considered in decision-making aimed at sustainable soil management. The various soil science disciplines have progressively developed highly sophisticated methods to explore the multitude of physical, chemical and biological processes in soil. It is not obvious, however, how the steadily improving insight into soil processes may contribute to the evaluation of soil functions. Here, we present a new systemic modeling framework that allows for a consistent coupling between reductionist yet observable indicators for soil functions and detailed process understanding. It is based on the mechanistic relationships between soil functional attributes, each explained by a network of interacting processes as derived from scientific evidence. The non-linear character of these interactions produces stability and resilience of soil with respect to functional characteristics. We anticipate that this new conceptual framework will integrate the various soil science disciplines and help identify important future research questions at the interface between disciplines. It allows the overwhelming complexity of soil systems to be adequately coped with and paves the way for steadily improving our capability to assess soil functions based on scientific understanding.

  18. On the specification of structural equation models for ecological systems

    NARCIS (Netherlands)

    Grace, James B.; Anderson, T. Michael; Olff, Han; Scheiner, Samuel M.

    The use of structural equation modeling (SEM) is often motivated by its utility for investigating complex networks of relationships, but also because of its promise as a means of representing theoretical concepts using latent variables. In this paper, we discuss characteristics of ecological theory

  19. On the logical specification of probabilistic transition models

    CSIR Research Space (South Africa)

    Rens, G

    2013-05-01

    We investigate the requirements for specifying the behaviors of actions in a stochastic domain. That is, we propose how to write sentences in a logical language to capture a model of probabilistic transitions due to the execution of actions of some...

  20. School Processes Mediate School Compositional Effects: Model Specification and Estimation

    Science.gov (United States)

    Liu, Hongqiang; Van Damme, Jan; Gielen, Sarah; Van Den Noortgate, Wim

    2015-01-01

    School composition effects have been consistently verified, but few studies ever attempted to study how school composition affects school achievement. Based on prior research findings, we employed multilevel mediation modeling to examine whether school processes mediate the effect of school composition upon school outcomes based on the data of 28…

  1. Model Adoption Exchange Payment System: Technical Specifications and User Instructions.

    Science.gov (United States)

    Ambrosino, Robert J.

    This user's manual, designed to meet the needs of adoption exchange administrators and program managers for a formal tool to assist them in the overall management and operation of their program, presents the Model Adoption Exchange Payment System (MAEPS), which was developed to improve the delivery of adoption exchange services throughout the…

  2. Code Shift: Grid Specifications and Dynamic Wind Turbine Models

    DEFF Research Database (Denmark)

    Ackermann, Thomas; Ellis, Abraham; Fortmann, Jens

    2013-01-01

    Grid codes (GCs) and dynamic wind turbine (WT) models are key tools to allow increasing renewable energy penetration without challenging security of supply. In this article, the state of the art and the further development of both tools are discussed, focusing on the European and North American e...

  3. Pancreas specific expression of oncogenes in a porcine model

    DEFF Research Database (Denmark)

    Berthelsen, Martin Fogtmann; Callesen, Morten Møbjerg; Østergaard, Tanja Stenshøj

    2017-01-01

    crucial for successful treatment. However, pancreatic cancer is difficult to detect in its earliest stages and once symptoms appear, the cancer has often progressed beyond possibility for curing. Research into the disease has been hampered by the lack of good models. We have generated a porcine m...

  4. Modeling of phase equilibria with CPA using the homomorph approach

    DEFF Research Database (Denmark)

    Breil, Martin Peter; Tsivintzelis, Ioannis; Kontogeorgis, Georgios

    2011-01-01

    For association models, like CPA and SAFT, a classical approach is often used for estimating pure-compound and mixture parameters. According to this approach, the pure-compound parameters are estimated from vapor pressure and liquid density data. Then, the binary interaction parameters, kij, are ...

  5. A Constructive Neural-Network Approach to Modeling Psychological Development

    Science.gov (United States)

    Shultz, Thomas R.

    2012-01-01

    This article reviews a particular computational modeling approach to the study of psychological development--that of constructive neural networks. This approach is applied to a variety of developmental domains and issues, including Piagetian tasks, shift learning, language acquisition, number comparison, habituation of visual attention, concept…

  6. The Intersystem Model of Psychotherapy: An Integrated Systems Treatment Approach

    Science.gov (United States)

    Weeks, Gerald R.; Cross, Chad L.

    2004-01-01

    This article introduces the intersystem model of psychotherapy and discusses its utility as a truly integrative and comprehensive approach. The foundation of this conceptually complex approach comes from dialectic metatheory; hence, its derivation requires an understanding of both foundational and integrational constructs. The article provides a…

  7. Bystander Approaches: Empowering Students to Model Ethical Sexual Behavior

    Science.gov (United States)

    Lynch, Annette; Fleming, Wm. Michael

    2005-01-01

    Sexual violence on college campuses is well documented. Prevention education has emerged as an alternative to victim-- and perpetrator--oriented approaches used in the past. One sexual violence prevention education approach focuses on educating and empowering the bystander to become a point of ethical intervention. In this model, bystanders to…

  8. FDTD-based Transcranial Magnetic Stimulation model applied to specific neurodegenerative disorders.

    Science.gov (United States)

    Fanjul-Vélez, Félix; Salas-García, Irene; Ortega-Quijano, Noé; Arce-Diego, José Luis

    2015-01-01

    Non-invasive treatment of neurodegenerative diseases is particularly challenging in Western countries, where the population age is increasing. In this work, magnetic propagation in the human head is modelled by the Finite-Difference Time-Domain (FDTD) method, taking into account specific characteristics of Transcranial Magnetic Stimulation (TMS) in neurodegenerative diseases. It uses a realistic high-resolution three-dimensional human head mesh. The numerical method is applied to the analysis of magnetic radiation distribution in the brain using two realistic magnetic source models: a circular coil and a figure-8 coil commonly employed in TMS. The complete model was applied to the study of magnetic stimulation in Alzheimer's and Parkinson's diseases (AD, PD). The results show the electric field distribution when magnetic stimulation is supplied to those brain areas of specific interest for each particular disease. The current approach thereby holds high potential for establishing the as yet underdeveloped TMS dosimetry in its emerging application to AD and PD. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
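    The FDTD method at the heart of this model advances electric and magnetic field components on a staggered grid in leapfrog fashion. A minimal 1-D sketch in normalised units (nothing like the paper's 3-D head mesh or coil sources, just the characteristic update structure) looks like this:

```python
import numpy as np

def fdtd_1d(n_cells=200, n_steps=150, src=100):
    """Minimal 1-D FDTD leapfrog loop (free space, normalised Yee units)."""
    ez = np.zeros(n_cells)   # electric field samples
    hy = np.zeros(n_cells)   # magnetic field samples, staggered half a cell
    for t in range(n_steps):
        hy[:-1] += 0.5 * (ez[1:] - ez[:-1])          # H update (Courant number 0.5)
        ez[1:] += 0.5 * (hy[1:] - hy[:-1])           # E update
        ez[src] += np.exp(-0.5 * ((t - 30) / 8)**2)  # soft Gaussian source
    return ez
```

    The 3-D version used for TMS replaces the scalar updates with the full Yee-cell curl equations and drives them with coil current sources, but the leapfrog pattern is the same.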

  9. Numerical approaches to expansion process modeling

    Directory of Open Access Journals (Sweden)

    G. V. Alekseev

    2017-01-01

    Full Text Available Forage production is currently undergoing intensive renovation, with the introduction of advanced technologies and equipment: barley toasting, grain extrusion, steaming and grain flattening, boiling-bed explosion, infrared-ray treatment of cereals and legumes followed by flattening, and single- or double-pass granulation of purified whole grain in matrix presses without humidification, followed by grinding of the granules. These methods require special apparatuses, machines and auxiliary equipment designed on the basis of mathematical models. In roasting, simulation of the heat fields arising in the working chamber makes it possible to maintain conditions under which part of the starch decomposes into monosaccharides, giving the grain a sweetish taste, although protein denaturation somewhat reduces protein digestibility and amino acid availability. Grain is roasted mainly for young animals, to accustom them to feed at an early age, stimulate the secretory activity of digestion and develop the masticatory muscles. In addition, the high temperature destroys bacterial contamination and various fungi, largely preventing diseases of the gastrointestinal tract; the method has therefore found wide application directly on farms. Legumes such as peas, soy, lupine and lentils are also used in animal feeding: they are pre-ground and then boiled or steamed in the feed mill for 30–40 minutes. Such processing inactivates the anti-nutritive substances that reduce the effectiveness of these feeds. After processing, legumes are used as protein supplements at 25–30% of the total nutritional value of the diet. Only grain of good quality should be boiled or steamed, however; poor-quality grain that has been stored for a long time and damaged by pathogenic microflora is subject to…

  10. Design Approach and Implementation of Application Specific Instruction Set Processor for SHA-3 BLAKE Algorithm

    Science.gov (United States)

    Zhang, Yuli; Han, Jun; Weng, Xinqian; He, Zhongzhu; Zeng, Xiaoyang

    This paper presents an Application Specific Instruction-set Processor (ASIP) for the SHA-3 BLAKE algorithm family, built by instruction set extension (ISE) of a RISC (reduced instruction set computer) processor. Through a design space exploration of this ASIP to increase performance and reduce area cost, we accomplish an efficient hardware and software implementation of the BLAKE algorithm. The special instructions and their well-matched hardware function unit accelerate the calculation of the key section of the algorithm, namely the G-functions. Also, relaxing the time constraint of the special function unit decreases its hardware cost while keeping the high data throughput of the processor. Evaluation results show the ASIP achieves 335 Mbps for BLAKE-256 and 176 Mbps for BLAKE-512. The extra area cost is only 8.06k equivalent gates. The proposed ASIP outperforms several software approaches on various platforms in cycles per byte. In fact, both the high throughput and the low hardware cost achieved by this programmable processor are comparable to those of ASIC implementations.
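The G-functions named above as the key section of the algorithm can be sketched in a few lines. The following minimal Python rendering assumes the standard BLAKE-256 rotation distances (16, 12, 8, 7); the inputs `mx` and `my` stand in for the (message word XOR constant word) values selected by the round permutation, and are illustrative placeholders rather than real round inputs.

```python
# Sketch of the BLAKE-256 G-function, the inner operation the ASIP's
# special instructions accelerate. Rotation distances (16, 12, 8, 7)
# follow the BLAKE-256 specification; mx and my are placeholders for
# the permuted (message XOR constant) round inputs.

MASK = 0xFFFFFFFF

def rotr32(x, n):
    """Rotate a 32-bit word right by n bits."""
    return ((x >> n) | (x << (32 - n))) & MASK

def g(a, b, c, d, mx, my):
    """One BLAKE-256 G evaluation on four 32-bit state words."""
    a = (a + b + mx) & MASK
    d = rotr32(d ^ a, 16)
    c = (c + d) & MASK
    b = rotr32(b ^ c, 12)
    a = (a + b + my) & MASK
    d = rotr32(d ^ a, 8)
    c = (c + d) & MASK
    b = rotr32(b ^ c, 7)
    return a, b, c, d

state = g(0x6A09E667, 0xBB67AE85, 0x3C6EF372, 0xA54FF53A, 0, 0)
print([hex(w) for w in state])
```

In the ASIP, it is exactly this add-xor-rotate chain that the special instructions and their matched function unit collapse into fewer cycles.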

  11. Modelling and Generating Ajax Applications : A Model-Driven Approach

    NARCIS (Netherlands)

    Gharavi, V.; Mesbah, A.; Van Deursen, A.

    2008-01-01

    Preprint of paper published in: IWWOST 2008 - 7th International Workshop on Web-Oriented Software Technologies, 14-15 July 2008. AJAX is a promising and rapidly evolving approach for building highly interactive web applications. In AJAX, user interface components and the event-based interaction…

  12. Agent-based modeling: a new approach for theory building in social psychology.

    Science.gov (United States)

    Smith, Eliot R; Conrey, Frederica R

    2007-02-01

    Most social and psychological phenomena occur not as the result of isolated decisions by individuals but rather as the result of repeated interactions between multiple individuals over time. Yet the theory-building and modeling techniques most commonly used in social psychology are less than ideal for understanding such dynamic and interactive processes. This article describes an alternative approach to theory building, agent-based modeling (ABM), which involves simulation of large numbers of autonomous agents that interact with each other and with a simulated environment and the observation of emergent patterns from their interactions. The authors believe that the ABM approach is better able than prevailing approaches in the field, variable-based modeling (VBM) techniques such as causal modeling, to capture types of complex, dynamic, interactive processes so important in the social world. The article elaborates several important contrasts between ABM and VBM and offers specific recommendations for learning more and applying the ABM approach.
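As a concrete illustration of the ABM approach described above, the following minimal Python sketch simulates autonomous agents on a ring who repeatedly adopt the local majority opinion, so that homogeneous blocks emerge from purely local interactions. The rule, topology and parameters are illustrative choices, not a model from the article.

```python
# Minimal agent-based model: many simple agents interact locally and a
# global pattern emerges. Agents on a ring adopt the majority opinion
# of themselves and their two neighbours each step.
import random

def step(opinions):
    """Each agent adopts the majority opinion among itself and its
    two ring neighbours (0 or 1)."""
    n = len(opinions)
    return [1 if opinions[(i - 1) % n] + opinions[i] + opinions[(i + 1) % n] >= 2
            else 0
            for i in range(n)]

def run(n_agents=50, n_steps=30, seed=1):
    """Iterate the local rule until a fixed point or the step cap."""
    random.seed(seed)
    opinions = [random.randint(0, 1) for _ in range(n_agents)]
    for _ in range(n_steps):
        new = step(opinions)
        if new == opinions:   # fixed point: an emergent block pattern
            break
        opinions = new
    return opinions

final = run()
print("".join(map(str, final)))
```

The emergent block structure is exactly the kind of macro-level pattern that, in the VBM framing, would be invisible in any single agent's variables.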

  13. Specific and generic stem biomass and volume models of tree species in a West African tropical semi-deciduous forest

    DEFF Research Database (Denmark)

    Goussanou, Cédric A.; Guendehou, Sabin; Assogbadjo, Achille E.

    2016-01-01

    The quantification of the contribution of tropical forests to global carbon stocks and climate change mitigation requires the availability of data and tools such as allometric equations. This study made available volume and biomass models for eighteen tree species in a semi-deciduous tropical forest… in West Africa. Generic models were also developed for the forest ecosystem, and basic wood density was determined for the tree species. A non-destructive sampling approach was carried out on five hundred and one sample trees to analyse stem volume and biomass. The modelling of volume and biomass… enabled the conclusion that non-destructive sampling was a good approach to determining reliable basic wood density. The comparative analysis of the species-specific models in this study with selected generic models for tropical forests indicated a low probability of identifying effective generic models with good…

  14. Understanding Gulf War Illness: An Integrative Modeling Approach

    Science.gov (United States)

    2017-10-01

    …using a novel mathematical model. The computational biology approach will enable the consortium to quickly identify targets of dysfunction and find… computer/mathematical paradigms for evaluation of treatment strategies… develop pilot clinical trials on the basis of animal studies… the goal of testing chemical treatments. The immune and autonomic biomarkers will be tested using a computational modeling approach allowing for a…

  15. A Structural Modeling Approach to a Multilevel Random Coefficients Model.

    Science.gov (United States)

    Rovine, Michael J.; Molenaar, Peter C. M.

    2000-01-01

    Presents a method for estimating the random coefficients model using covariance structure modeling and allowing one to estimate both fixed and random effects. The method is applied to real and simulated data, including marriage data from J. Belsky and M. Rovine (1990). (SLD)

  16. Data Analysis A Model Comparison Approach, Second Edition

    CERN Document Server

    Judd, Charles M; Ryan, Carey S

    2008-01-01

    This completely rewritten classic text features many new examples, insights and topics including mediational, categorical, and multilevel models. Substantially reorganized, this edition provides a briefer, more streamlined examination of data analysis. Noted for its model-comparison approach and unified framework based on the general linear model, the book provides readers with a greater understanding of a variety of statistical procedures. This consistent framework, including consistent vocabulary and notation, is used throughout to develop fewer but more powerful model building techniques…

  17. Modelling Career Intent of Specific Air Force Personnel Categories

    Science.gov (United States)

    1982-09-01

    Contributions are payments the participant makes to the organization in the form of work. It is postulated that as the balance of inducements…contributions is believed to have the opposite effect. The inducement-contribution balance is a function of two major components: perceived ease of movement…set of issues, preliminary efforts centered around the development of a theoretically based quality-of-worklife model which would provide a logical…

  18. An Open Modelling Approach for Availability and Reliability of Systems - OpenMARS

    CERN Document Server

    Penttinen, Jussi-Pekka; Gutleber, Johannes

    2018-01-01

    This document introduces and gives the specification for OpenMARS, an open modelling approach for availability and reliability of systems. It supports the most common risk assessment and operation modelling techniques. Uniquely, OpenMARS allows combining and connecting models defined with different techniques. This ensures that a modeller has a high degree of freedom to describe the modelled system accurately, without limitations imposed by an individual technique. Here the OpenMARS model definition is specified with a tool-independent tabular format, which supports managing models developed in a collaborative fashion. Our research originates in the Future Circular Collider (FCC) study, where we developed the unique features of our concept to model the availability and luminosity production of particle colliders. We were motivated to describe our approach in detail as we see potential further applications in performance and energy efficiency analyses of large scientific infrastructures or industrial processes…

  19. Specification of a STEP Based Reference Model for Exchange of Robotics Models

    DEFF Research Database (Denmark)

    Haenisch, Jochen; Kroszynski, Uri; Ludwig, Arnold

    ESPRIT Project 6457: "Interoperability of Standards for Robotics in CIME" (InterRob) belongs to the Subprogramme "Computer Integrated Manufacturing and Engineering" of ESPRIT, the European Specific Programme for Research and Development in Information Technology supported by the European Commission. InterRob aims to develop an integrated solution to precision manufacturing by combining product data and database technologies with robotic off-line programming and simulation. Benefits arise from the use of high-level simulation tools and from developing standards for the exchange of product model data… robot programming, the descriptions of geometry, kinematics, robotics, dynamics, and controller data using STEP are addressed as major goals of the project. The Project Consortium has now released the "Specification of a STEP Based Reference Model for Exchange of Robotics Models", on which a series…

  20. ON THE RELATIVE IMPORTANCE OF SPECIFIC AND NONSPECIFIC APPROACHES TO ORAL MICROBIAL ADHESION

    NARCIS (Netherlands)

    BUSSCHER, HJ; COWAN, MM; VANDERMEI, HC

    In this paper, it is suggested that specificity and non-specificity in (oral) microbial adhesion are different expressions for the same phenomena. It is argued that the same basic, physicochemical forces are responsible for so-called 'non-specific' and 'specific' binding and that from a

  1. Pathway index models for construction of patient-specific risk profiles.

    Science.gov (United States)

    Eng, Kevin H; Wang, Sijian; Bradley, William H; Rader, Janet S; Kendziorski, Christina

    2013-04-30

    Statistical methods for variable selection, prediction, and classification have proven extremely useful in moving personalized genomics medicine forward, in particular, leading to a number of genomic-based assays now in clinical use for predicting cancer recurrence. Although invaluable in individual cases, the information provided by these assays is limited. Most often, a patient is classified into one of very few groups (e.g., recur or not), limiting the potential for truly personalized treatment. Furthermore, although these assays provide information on which individuals are at most risk (e.g., those for which recurrence is predicted), they provide no information on the aberrant biological pathways that give rise to the increased risk. We have developed an approach to address these limitations. The approach models a time-to-event outcome as a function of known biological pathways, identifies important genomic aberrations, and provides pathway-based patient-specific assessments of risk. As we demonstrate in a study of ovarian cancer from The Cancer Genome Atlas project, the patient-specific risk profiles are powerful and efficient characterizations useful in addressing a number of questions related to identifying informative patient subtypes and predicting survival. Copyright © 2012 John Wiley & Sons, Ltd.
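The pathway-based risk profile idea can be illustrated with a small sketch: gene-level aberration scores are collapsed into one index per known pathway, and the pathway indices feed a Cox-style linear predictor. All gene names, pathway definitions, weights and coefficients below are invented for illustration; the paper's actual model is fitted to time-to-event data such as the TCGA ovarian cancer cohort.

```python
# Illustrative pathway-index construction: collapse gene-level
# aberrations into per-pathway indices, then combine them into a
# Cox-style relative risk. Names and numbers are hypothetical.
import math

PATHWAYS = {
    "p53_signaling": {"TP53": 0.9, "MDM2": 0.4},
    "rb_cell_cycle": {"RB1": 0.8, "CCNE1": 0.5},
}

def pathway_profile(patient):
    """Map a patient's gene-level aberration scores to one index
    per pathway (weighted sum of the member genes)."""
    return {name: sum(w * patient.get(gene, 0.0) for gene, w in genes.items())
            for name, genes in PATHWAYS.items()}

def relative_risk(profile, betas):
    """Cox-style relative risk exp(beta . x) from a pathway profile."""
    lp = sum(betas[name] * score for name, score in profile.items())
    return math.exp(lp)

patient = {"TP53": 1.0, "CCNE1": 2.0}   # hypothetical aberration scores
profile = pathway_profile(patient)
risk = relative_risk(profile, {"p53_signaling": 0.7, "rb_cell_cycle": 0.3})
print(profile, round(risk, 3))
```

Unlike a recur/not-recur label, the per-pathway indices in `profile` say *which* biological processes drive an elevated risk, which is the interpretability gain the abstract emphasises.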

  2. A novel approach to modeling and diagnosing the cardiovascular system

    Energy Technology Data Exchange (ETDEWEB)

    Keller, P.E.; Kangas, L.J.; Hashem, S.; Kouzes, R.T. [Pacific Northwest Lab., Richland, WA (United States); Allen, P.A. [Life Link, Richland, WA (United States)

    1995-07-01

    A novel approach to modeling and diagnosing the cardiovascular system is introduced. A model exhibits a subset of the dynamics of the cardiovascular behavior of an individual by using a recurrent artificial neural network. Potentially, a model will be incorporated into a cardiovascular diagnostic system. This approach is unique in that each cardiovascular model is developed from physiological measurements of an individual. Any differences between the modeled variables and the variables of an individual at a given time are used for diagnosis. This approach also exploits sensor fusion to optimize the utilization of biomedical sensors. The advantage of sensor fusion has been demonstrated in applications including control and diagnostics of mechanical and chemical processes.

  3. Synthesis of industrial applications of local approach to fracture models

    International Nuclear Information System (INIS)

    Eripret, C.

    1993-03-01

    This report gathers different applications of local approach to fracture models to various industrial configurations, such as nuclear pressure vessel steel, cast duplex stainless steels, or primary circuit welds such as bimetallic welds. Because the models are developed on the basis of microstructural observations, damage-mechanism analyses, and the fracture process, the local approach to fracture solves problems where classical fracture mechanics concepts fail. The local approach therefore proves to be a powerful tool, which complements the standard fracture criteria used in the nuclear industry by exhibiting where and why those classical concepts become invalid. (author). 1 tab., 18 figs., 25 refs

  4. The calculation of exchange forces: General results and specific models

    International Nuclear Information System (INIS)

    Scott, T.C.; Babb, J.F.; Dalgarno, A.; Morgan, J.D. III

    1993-01-01

    In order to clarify questions about the calculation of the exchange energy of a homonuclear molecular ion, an analysis is carried out of a model problem consisting of the one-dimensional limit of H2+. It is demonstrated that the use of the infinite polarization expansion for the localized wave function in the Holstein-Herring formula yields an approximate exchange energy which at large internuclear distances R has the correct leading behavior to O(e^(-R)) and is close to, but not equal to, the exact exchange energy. The extension to the n-dimensional double-well problem is presented.

  5. 78 FR 26849 - Model Specifications for Breath Alcohol Ignition Interlock Devices (BAIIDs)

    Science.gov (United States)

    2013-05-08

    …requirements, and asked whether the Model Specifications should limit sensor technology to alcohol-specific sensors (such as fuel cell technology based on electrochemical oxidation of alcohol) or other emerging… have demanded alcohol-specific sensor technology. [Interlocks that] are not alcohol-specific…

  6. Site-specific and multielement approach to the determination of liquid-vapor isotope fractionation parameters. The case of alcohols

    International Nuclear Information System (INIS)

    Moussa, I.; Naulet, N.; Martin, M.L.; Martin, G.J.

    1990-01-01

    Isotope fractionation phenomena occurring at the natural abundance level in the course of liquid-vapor transformation have been investigated by using the SNIF-NMR method (site-specific natural isotope fractionation studied by NMR) which has a unique capability of providing simultaneous access to fractionation parameters associated with different molecular isotopomers. This new approach has been combined with the determination of overall carbon and hydrogen fractionation effects by isotope ratio mass spectrometry (IRMS). The results of distillation and evaporation experiments of alcohols performed in technical conditions of practical interest have been analyzed according to the Rayleigh-type model. In order to check the performance of the column, unit fractionation factors were measured beforehand for water and for the hydroxylic sites of methanol and ethanol for which liquid-vapor equilibrium constants were already known. Inverse isotope effects are determined in distillation experiments for the overall carbon isotope ratio and for the site-specific hydrogen isotope ratios associated with the methyl and methylene sites of methanol and ethanol. In contrast, normal isotope effects are produced by distillation for the hydroxylic sites and by evaporation for all the isotopic ratios
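The Rayleigh-type analysis referred to above can be sketched numerically. With the convention alpha = R_vapour / R_liquid, the isotope ratio of the residual liquid follows R = R0 · f^(alpha − 1) as the remaining liquid fraction f decreases; the alpha value used below is an illustrative placeholder, not a measured fractionation factor from the paper.

```python
# Rayleigh distillation sketch: evolution of the residual-liquid
# isotope ratio as the liquid fraction f shrinks. alpha is a
# hypothetical liquid-vapour fractionation factor (R_vapour/R_liquid);
# alpha < 1 corresponds to a normal effect (residue enriches).

def rayleigh_ratio(r0, f_remaining, alpha):
    """Isotope ratio of the residual liquid after Rayleigh distillation."""
    if not 0.0 < f_remaining <= 1.0:
        raise ValueError("f_remaining must be in (0, 1]")
    return r0 * f_remaining ** (alpha - 1.0)

r0 = 1.0        # initial (normalised) isotope ratio of the liquid
alpha = 0.990   # illustrative normal fractionation factor
for f in (1.0, 0.5, 0.1):
    print(f, rayleigh_ratio(r0, f, alpha))
```

Fitting measured residual-liquid ratios against ln(f) on this model is what yields the site-specific fractionation parameters discussed in the abstract.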

  7. Mathematical models for therapeutic approaches to control HIV disease transmission

    CERN Document Server

    Roy, Priti Kumar

    2015-01-01

    The book discusses different therapeutic approaches based on different mathematical models to control the HIV/AIDS disease transmission. It uses clinical data, collected from different cited sources, to formulate the deterministic as well as stochastic mathematical models of HIV/AIDS. It provides complementary approaches, from deterministic and stochastic points of view, to optimal control strategy with perfect drug adherence and also tries to seek viewpoints of the same issue from different angles with various mathematical models to computer simulations. The book presents essential methods and techniques for students who are interested in designing epidemiological models on HIV/AIDS. It also guides research scientists, working in the periphery of mathematical modeling, and helps them to explore a hypothetical method by examining its consequences in the form of a mathematical modelling and making some scientific predictions. The model equations, mathematical analysis and several numerical simulations that are...

  8. Patient-specific fibre-based models of muscle wrapping

    Science.gov (United States)

    Kohout, J.; Clapworthy, G. J.; Zhao, Y.; Tao, Y.; Gonzalez-Garcia, G.; Dong, F.; Wei, H.; Kohoutová, E.

    2013-01-01

    In many biomechanical problems, the availability of a suitable model for the wrapping of muscles when undergoing movement is essential for the estimation of forces produced on and by the body during motion. This is an important factor in the Osteoporotic Virtual Physiological Human project which is investigating the likelihood of fracture for osteoporotic patients undertaking a variety of movements. The weakening of their skeletons makes them particularly vulnerable to bone fracture caused by excessive loading being placed on the bones, even in simple everyday tasks. This paper provides an overview of a novel volumetric model that describes muscle wrapping around bones and other muscles during movement, and which includes a consideration of how the orientations of the muscle fibres change during the motion. The method can calculate the form of wrapping of a muscle of medium size and visualize the outcome within tenths of seconds on commodity hardware, while conserving muscle volume. This makes the method suitable not only for educational biomedical software, but also for clinical applications used to identify weak muscles that should be strengthened during rehabilitation or to identify bone stresses in order to estimate the risk of fractures. PMID:24427519

  9. One Approach for Dynamic L-lysine Modelling of Repeated Fed-batch Fermentation

    Directory of Open Access Journals (Sweden)

    Kalin Todorov

    2007-03-01

    Full Text Available This article deals with the establishment of a dynamic unstructured model of a variable-volume fed-batch fermentation process with intensive droppings for L-lysine production. The presented approach includes the following main procedures: description of the process by generalized stoichiometric equations; preliminary data processing and calculation of the specific rates for the main kinetic variables; identification of the specific rates as second-order non-linear dynamic models; establishment and optimisation of a dynamic model of the process; and simulation research. MATLAB is used as the research environment.
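A minimal variable-volume fed-batch model of the kind described can be sketched as three balance equations integrated with explicit Euler. The Monod growth law and all parameter values below are illustrative placeholders, not the identified second-order specific-rate models of the article (which was developed in MATLAB).

```python
# Variable-volume fed-batch sketch: biomass X, substrate S and volume V
# under a constant feed F, integrated with explicit Euler. All kinetic
# parameters are hypothetical placeholders.

def simulate(t_end=10.0, dt=0.01):
    X, S, V = 0.1, 20.0, 1.0         # biomass g/L, substrate g/L, volume L
    F, S_in = 0.05, 100.0            # feed rate L/h, feed substrate g/L
    mu_max, Ks, Yxs = 0.3, 0.5, 0.5  # illustrative Monod kinetics
    t = 0.0
    while t < t_end:
        mu = mu_max * S / (Ks + S)   # specific growth rate (Monod)
        dX = mu * X - (F / V) * X               # growth minus dilution
        dS = -(mu / Yxs) * X + (F / V) * (S_in - S)  # uptake plus feed
        X += dt * dX
        S += dt * dS
        V += dt * F                  # volume grows with the feed
        t += dt
    return X, S, V

X, S, V = simulate()
print(round(X, 3), round(S, 3), round(V, 3))
```

In the article's procedure, the specific rates (here the single Monod `mu`) are what get replaced by identified second-order non-linear dynamic models before the overall model is optimised.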

  10. Initial assessment of a multi-model approach to spring flood forecasting in Sweden

    Science.gov (United States)

    Olsson, J.; Uvo, C. B.; Foster, K.; Yang, W.

    2015-06-01

    Hydropower is a major energy source in Sweden and proper reservoir management prior to the spring flood onset is crucial for optimal production. This requires useful forecasts of the accumulated discharge in the spring flood period (i.e. the spring-flood volume, SFV). Today's SFV forecasts are generated using a model-based climatological ensemble approach, where time series of precipitation and temperature from historical years are used to force a calibrated and initialised set-up of the HBV model. In this study, a number of new approaches to spring flood forecasting, which reflect the latest developments with respect to analysis and modelling on seasonal time scales, are presented and evaluated. Three main approaches, represented by specific methods, are evaluated in SFV hindcasts for three main Swedish rivers over a 10-year period with lead times between 0 and 4 months. In the first approach, historically analogue years with respect to the climate in the period preceding the spring flood are identified and used to compose a reduced ensemble. In the second, seasonal meteorological ensemble forecasts are used to drive the HBV model over the spring flood period. In the third approach, statistical relationships between SFV and the large-scale atmospheric circulation are used to build forecast models. None of the new approaches consistently outperform the climatological ensemble approach, but for specific locations and lead times improvements of 20-30 % are found. When combining all forecasts in a weighted multi-model approach, a mean improvement over all locations and lead times of nearly 10 % was indicated. This demonstrates the potential of the approach, and further development and optimisation into an operational system is ongoing.
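The weighted multi-model combination mentioned in the closing sentences can be sketched as a skill-weighted average of the per-approach SFV forecasts. The forecast values and weights below are invented numbers for illustration, not results from the study.

```python
# Weighted multi-model combination of spring-flood-volume forecasts:
# each approach contributes one forecast, and (e.g. hindcast-skill)
# weights produce the combined value. All numbers are hypothetical.

def combine(forecasts, weights):
    """Weighted average of per-model SFV forecasts."""
    if set(forecasts) != set(weights):
        raise ValueError("each forecast needs a weight")
    total_w = sum(weights.values())
    return sum(weights[m] * forecasts[m] for m in forecasts) / total_w

forecasts = {                       # hypothetical SFV forecasts, mm
    "climatological_ensemble": 210.0,
    "analogue_years": 190.0,
    "seasonal_meteo": 230.0,
    "statistical_teleconnection": 200.0,
}
weights = {                         # hypothetical skill-based weights
    "climatological_ensemble": 0.4,
    "analogue_years": 0.2,
    "seasonal_meteo": 0.2,
    "statistical_teleconnection": 0.2,
}
print(combine(forecasts, weights))
```

In an operational setting the weights would be refitted per location and lead time from hindcast skill, which is where the reported ~10 % mean improvement comes from.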

  11. Model-centric approaches for the development of health information systems.

    Science.gov (United States)

    Tuomainen, Mika; Mykkänen, Juha; Luostarinen, Heli; Pöyhölä, Assi; Paakkanen, Esa

    2007-01-01

    Modeling is used increasingly in healthcare to increase shared knowledge, to improve processes, and to document the requirements of solutions related to health information systems (HIS). There are numerous modeling approaches intended to support these goals, but a careful assessment of their strengths, weaknesses and deficiencies is needed. In this paper, we compare three model-centric approaches in the context of HIS development: Model-Driven Architecture, Business Process Modeling with BPMN and BPEL, and the HL7 Development Framework. The comparison reveals that all these approaches are viable candidates for the development of HIS. However, they have distinct strengths and abstraction levels, they require local and project-specific adaptation, and they offer varying levels of automation. In addition, the illustration of solutions to end users must be improved.

  12. Teaching Higher Order Thinking in the Introductory MIS Course: A Model-Directed Approach

    Science.gov (United States)

    Wang, Shouhong; Wang, Hai

    2011-01-01

    One vision of education evolution is to change the modes of thinking of students. Critical thinking, design thinking, and system thinking are higher order thinking paradigms that are specifically pertinent to business education. A model-directed approach to teaching and learning higher order thinking is proposed. An example of application of the…

  13. Segmented Assimilation Theory and the Life Model: An Integrated Approach to Understanding Immigrants and Their Children

    Science.gov (United States)

    Piedra, Lissette M.; Engstrom, David W.

    2009-01-01

    The life model offers social workers a promising framework to use in assisting immigrant families. However, the complexities of adaptation to a new country may make it difficult for social workers to operate from a purely ecological approach. The authors use segmented assimilation theory to better account for the specificities of the immigrant…

  14. A model-based combinatorial optimisation approach for energy-efficient processing of microalgae

    NARCIS (Netherlands)

    Slegers, P.M.; Koetzier, B.J.; Fasaei, F.; Wijffels, R.H.; Straten, van G.; Boxtel, van A.J.B.

    2014-01-01

    The analyses of algae biorefinery performance are commonly based on fixed performance data for each processing step. In this work, we demonstrate a model-based combinatorial approach to derive the design-specific upstream energy consumption and biodiesel yield in the production of biodiesel from

  15. Total Variability Modeling using Source-specific Priors

    DEFF Research Database (Denmark)

    Shepstone, Sven Ewan; Lee, Kong Aik; Li, Haizhou

    2016-01-01

    …sequence of an utterance. In both cases the prior for the latent variable is assumed to be non-informative, since for homogeneous datasets there is no gain in generality in using an informative prior. This work shows that, in the heterogeneous case, using informative priors for computing the posterior can lead to favorable results. We focus on modeling the priors using a minimum divergence criterion or factor analysis techniques. Tests on the NIST 2008 and 2010 Speaker Recognition Evaluation (SRE) datasets show that our proposed method beats four baselines: for i-vector extraction using an already trained matrix, five out of eight female and four out of eight male common conditions were improved on the short2-short3 task in SRE'08. For the core-extended task in SRE'10, four out of nine female and six out of nine male common conditions were improved. When incorporating prior information…

  16. Site-Specific Seismic Site Response Model for the Waste Treatment Plant, Hanford, Washington

    Energy Technology Data Exchange (ETDEWEB)

    Rohay, Alan C.; Reidel, Steve P.

    2005-02-24

    This interim report documents the collection of site-specific geologic and geophysical data characterizing the Waste Treatment Plant site and the modeling of the site-specific structure response to earthquake ground motions.

  17. Identifying Country-Specific Cultures of Physics Education: A differential item functioning approach

    Science.gov (United States)

    Mesic, Vanes

    2012-11-01

    In international large-scale assessments of educational outcomes, student achievement is often represented by unidimensional constructs. This approach allows for drawing general conclusions about country rankings with respect to the given achievement measure, but it typically does not provide the specific diagnostic information which is necessary for systematic comparisons and improvements of educational systems. Useful information could be obtained by exploring the differences in national profiles of student achievement between low-achieving and high-achieving countries. In this study, we aimed to identify the relative weaknesses and strengths of eighth graders' physics achievement in Bosnia and Herzegovina in comparison to the achievement of their peers from Slovenia. For this purpose, we ran a secondary analysis of Trends in International Mathematics and Science Study (TIMSS) 2007 data. The student sample consisted of 4,220 students from Bosnia and Herzegovina and 4,043 students from Slovenia. After analysing the cognitive demands of TIMSS 2007 physics items, the corresponding differential item functioning (DIF)/differential group functioning contrasts were estimated. Approximately 40% of items exhibited large DIF contrasts, indicating significant differences between the cultures of physics education in Bosnia and Herzegovina and Slovenia. The relative strength of students from Bosnia and Herzegovina proved to be mainly associated with the topic area 'Electricity and magnetism'. Classes of items which required knowledge of the experimental method, counterintuitive thinking, proportional reasoning and/or the use of complex knowledge structures proved to be differentially easier for students from Slovenia. In the light of the presented results, the common practice of ranking countries with respect to universally established cognitive categories seems to be potentially misleading.
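One standard way to estimate a DIF contrast of the kind analysed above (not necessarily the estimator used in the study) is the Mantel-Haenszel procedure: students are stratified by ability (total score), and a common odds ratio compares item success between the reference and focal groups across strata. The counts below are fabricated for illustration; a TIMSS analysis would use the real response matrices.

```python
# Mantel-Haenszel DIF sketch: a common odds ratio over ability strata,
# then the ETS delta transform. All counts are fabricated.
import math

def mantel_haenszel_or(strata):
    """strata: list of 2x2 tables (a, b, c, d) per ability level, where
    a, b = reference group right/wrong and c, d = focal group right/wrong."""
    num = sum(a * d / (a + b + c + d) for a, b, c, d in strata)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in strata)
    return num / den

# Three ability strata; the focal group does slightly worse on the item.
strata = [(30, 10, 24, 16), (25, 15, 20, 20), (15, 25, 10, 30)]
or_mh = mantel_haenszel_or(strata)
delta = -2.35 * math.log(or_mh)   # ETS delta scale; |delta| > 1.5 ~ large DIF
print(round(or_mh, 3), round(delta, 3))
```

An odds ratio above 1 (negative delta) here flags the item as differentially easier for the reference group, even after conditioning on overall ability.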

  18. Assessing the functional diversity of herbivorous reef fishes using a compound-specific stable isotope approach

    KAUST Repository

    Tietbohl, Matthew

    2016-12-01

    Herbivorous coral reef fishes play an important role in helping to structure their environment directly by consuming algae and indirectly by promoting coral health and growth. These fishes are generally separated into three broad groups: browsers, grazers, and excavators/scrapers, with these groupings often thought to have a fixed general function and all fishes within a group thought to have similar ecological roles. This categorization assumes a high level of functional redundancy within herbivorous fishes. However, recent evidence questions the use of this broad classification scheme, and posits that there may actually be more resource partitioning within these functional groupings. Here, I use a compound-specific stable isotope approach (CSIA) to show there appears to be a greater diversity of functional roles than previously assumed within broad functional groups. The δ13C signatures from essential amino acids of reef end-members (coral, macroalgae, detritus, and phytoplankton) and fish muscle were analyzed to investigate differences in resource use between fishes. Most end-members displayed clear isotopic differences, and most fishes within functional groups were dissimilar in their isotopic signature, implying differences in the resources they target. No grazers closely resembled each other isotopically, implying a much lower level of functional redundancy within this group; scraping parrotfish were also distinct from excavating parrotfish and to a lesser degree distinct between scrapers. This study highlights the potential of CSIA to help distinguish fine-scale ecological differences within other groups of reef organisms as well. These results question the utility of lumping nominally herbivorous fishes into broad groups with assumed similar roles. Given the apparent functional differences between nominally herbivorous reef fishes, it is important for managers to incorporate the diversity of functional roles these fish play.

  20. Partial information decomposition as a unified approach to the specification of neural goal functions.

    Science.gov (United States)

    Wibral, Michael; Priesemann, Viola; Kay, Jim W; Lizier, Joseph T; Phillips, William A

    2017-03-01

In many neural systems anatomical motifs are present repeatedly, but despite their structural similarity they can serve very different tasks. A prime example for such a motif is the canonical microcircuit of six-layered neo-cortex, which is repeated across cortical areas, and is involved in a number of different tasks (e.g. sensory, cognitive, or motor tasks). This observation has spawned interest in finding a common underlying principle, a 'goal function', of information processing implemented in this structure. By definition such a goal function, if universal, cannot be cast in processing-domain specific language (e.g. 'edge filtering', 'working memory'). Thus, to formulate such a principle, we have to use a domain-independent framework. Information theory offers such a framework. However, while the classical framework of information theory focuses on the relation between one input and one output (Shannon's mutual information), we argue that neural information processing crucially depends on the combination of multiple inputs to create the output of a processor. To account for this, we use a very recent extension of Shannon information theory, called partial information decomposition (PID). PID allows one to quantify the information that several inputs provide individually (unique information), redundantly (shared information) or only jointly (synergistic information) about the output. First, we review the framework of PID. Then we apply it to reevaluate and analyze several earlier proposals of information-theoretic neural goal functions (predictive coding, infomax and coherent infomax, efficient coding). We find that PID allows one to compare these goal functions in a common framework, and also provides a versatile approach to design new goal functions from first principles. Building on this, we design and analyze a novel goal function, called 'coding with synergy', which builds on combining external input and prior knowledge in a synergistic manner. We suggest that
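
    The distinction PID draws can be illustrated with the textbook XOR example: each input alone carries no Shannon information about the output, yet together the inputs determine it completely, so all of the mutual information is synergistic. A minimal sketch in plain Python (the joint-distribution layout is an illustrative choice, not the paper's notation):

    ```python
    import math

    def mutual_information(joint, xs_idx, y_idx):
        """Mutual information I(X;Y) in bits from a discrete joint distribution.

        joint maps outcome tuples to probabilities; xs_idx selects the source
        variable positions, y_idx the target position.
        """
        px, py, pxy = {}, {}, {}
        for outcome, p in joint.items():
            x = tuple(outcome[i] for i in xs_idx)
            y = outcome[y_idx]
            px[x] = px.get(x, 0.0) + p
            py[y] = py.get(y, 0.0) + p
            pxy[(x, y)] = pxy.get((x, y), 0.0) + p
        return sum(p * math.log2(p / (px[x] * py[y]))
                   for (x, y), p in pxy.items() if p > 0)

    # XOR: Y = X1 xor X2, with X1, X2 uniform and independent.
    joint = {(x1, x2, x1 ^ x2): 0.25 for x1 in (0, 1) for x2 in (0, 1)}

    i1 = mutual_information(joint, (0,), 2)     # I(X1;Y) = 0: no unique info
    i2 = mutual_information(joint, (1,), 2)     # I(X2;Y) = 0
    i12 = mutual_information(joint, (0, 1), 2)  # I(X1,X2;Y) = 1 bit: all synergy
    ```

    A full PID additionally separates unique and shared contributions; for XOR the decomposition is trivial because the entire 1 bit is synergistic.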

  1. Comparing Three Patterns of Strengths and Weaknesses Models for the Identification of Specific Learning Disabilities

    Science.gov (United States)

    Miller, Daniel C.; Maricle, Denise E.; Jones, Alicia M.

    2016-01-01

    Processing Strengths and Weaknesses (PSW) models have been proposed as a method for identifying specific learning disabilities. Three PSW models were examined for their ability to predict expert identified specific learning disabilities cases. The Dual Discrepancy/Consistency Model (DD/C; Flanagan, Ortiz, & Alfonso, 2013) as operationalized by…

  2. A new approach to improve the specificity of flow-mediated dilation for indicating endothelial function in cardiovascular research.

    Science.gov (United States)

    Atkinson, Greg; Batterham, Alan M; Thijssen, Dick H J; Green, Daniel J

    2013-02-01

Flow-mediated dilation (FMD) is a noninvasive indicator of endothelial function and is routinely expressed as the percentage change in arterial diameter (FMD%) from a resting baseline (Dbase) to a postischemic peak (Dpeak). This expression is equivalent to the ratio of Dpeak/Dbase and is, therefore, dependent on important statistical assumptions, which have never been analysed in the context of FMD%. We aimed to investigate these assumptions, via a comparison of FMD between samples of children and adults, as well as to explore other approaches to scaling diameter change for Dbase. We found that FMD% did not scale accurately for interindividual differences in Dbase but, as expected, overestimated endothelial function for low Dbase and vice versa. We argue that this imprecise scaling of FMD% is predictable, not explained by physiology and is probably common. This problem is resolved by applying scaling principles, whereby the difference in diameter is the outcome and Dbase is a covariate in a logarithmic-linked generalized linear model. A specific allometric expression of FMD can be derived and we found this to be Dpeak/Dbase rather than a simple ratio in our particular dataset. We found that sample differences in endothelial function were inaccurate with FMD% versus our new allometric approach, and that FMD% misclassified participants into 'high' and 'low' cohorts, which has implications for prognostic-type studies. We conclude that the general use of FMD% could have led to biased comparisons of different conditions and/or populations in past studies. Our new approach to scaling FMD is flexible for different datasets and is not based on the current assumption that a percentage change is appropriate in all circumstances.
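
    The scaling argument can be sketched numerically. The toy below uses synthetic diameters and ordinary least squares on the log scale (the authors themselves use a log-linked generalized linear model) to estimate an allometric exponent b, after which Dpeak/Dbase^b removes the baseline-diameter dependence that plain FMD% retains:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic diameters (mm): Dpeak = c * Dbase^b with noise; b < 1 means the
    # simple ratio Dpeak/Dbase (i.e. FMD%) over-corrects for small arteries.
    d_base = rng.uniform(2.5, 4.5, 200)
    b_true, c_true = 0.8, 1.6
    d_peak = c_true * d_base ** b_true * np.exp(rng.normal(0.0, 0.01, 200))

    # Allometric exponent from a log-log regression.
    b_hat, log_c_hat = np.polyfit(np.log(d_base), np.log(d_peak), 1)

    # Scale-free index: dividing by Dbase^b removes the dependence on baseline
    # size, whereas plain FMD% = 100 * (Dpeak - Dbase) / Dbase does not when b != 1.
    allometric_index = d_peak / d_base ** b_hat
    ```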

  3. Modelling diversity in building occupant behaviour: a novel statistical approach

    DEFF Research Database (Denmark)

    Haldi, Frédéric; Calì, Davide; Andersen, Rune Korsholm

    2016-01-01

    We propose an advanced modelling framework to predict the scope and effects of behavioural diversity regarding building occupant actions on window openings, shading devices and lighting. We develop a statistical approach based on generalised linear mixed models to account for the longitudinal nat...

  4. Sensitivity analysis approaches applied to systems biology models.

    Science.gov (United States)

    Zi, Z

    2011-11-01

    With the rising application of systems biology, sensitivity analysis methods have been widely applied to study the biological systems, including metabolic networks, signalling pathways and genetic circuits. Sensitivity analysis can provide valuable insights about how robust the biological responses are with respect to the changes of biological parameters and which model inputs are the key factors that affect the model outputs. In addition, sensitivity analysis is valuable for guiding experimental analysis, model reduction and parameter estimation. Local and global sensitivity analysis approaches are the two types of sensitivity analysis that are commonly applied in systems biology. Local sensitivity analysis is a classic method that studies the impact of small perturbations on the model outputs. On the other hand, global sensitivity analysis approaches have been applied to understand how the model outputs are affected by large variations of the model input parameters. In this review, the author introduces the basic concepts of sensitivity analysis approaches applied to systems biology models. Moreover, the author discusses the advantages and disadvantages of different sensitivity analysis methods, how to choose a proper sensitivity analysis approach, the available sensitivity analysis tools for systems biology models and the caveats in the interpretation of sensitivity analysis results.
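
    The local approach described above reduces, in its simplest form, to normalised finite-difference coefficients S_i = (p_i/y)(dy/dp_i). A self-contained sketch on a Michaelis-Menten toy model (illustrative only, not any specific tool from the review):

    ```python
    def local_sensitivity(model, params, rel_step=1e-6):
        """Normalised local sensitivity coefficients S_i = (p_i / y) * dy/dp_i,
        estimated by central finite differences around the nominal parameters."""
        y0 = model(params)
        coeffs = {}
        for name, p in params.items():
            h = rel_step * abs(p)
            up = dict(params, **{name: p + h})
            dn = dict(params, **{name: p - h})
            dy_dp = (model(up) - model(dn)) / (2 * h)
            coeffs[name] = p * dy_dp / y0
        return coeffs

    # Toy "pathway output": Michaelis-Menten rate v = Vmax * S / (Km + S).
    def mm_rate(p):
        return p["Vmax"] * p["S"] / (p["Km"] + p["S"])

    s = local_sensitivity(mm_rate, {"Vmax": 10.0, "Km": 2.0, "S": 8.0})
    # Vmax enters linearly, so its normalised sensitivity is exactly 1;
    # Km's is -Km/(Km+S) = -0.2 at these parameter values.
    ```

    Global methods instead vary all parameters simultaneously over wide ranges (e.g. by Latin hypercube sampling) rather than perturbing one at a time.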

  5. A qualitative evaluation approach for energy system modelling frameworks

    DEFF Research Database (Denmark)

    Wiese, Frauke; Hilpert, Simon; Kaldemeyer, Cord

    2018-01-01

    properties define how useful it is in regard to the existing challenges. For energy system models, evaluation methods exist, but we argue that many decisions upon properties are rather made on the model generator or framework level. Thus, this paper presents a qualitative approach to evaluate frameworks...

  6. Modeling Alaska boreal forests with a controlled trend surface approach

    Science.gov (United States)

    Mo Zhou; Jingjing Liang

    2012-01-01

    An approach of Controlled Trend Surface was proposed to simultaneously take into consideration large-scale spatial trends and nonspatial effects. A geospatial model of the Alaska boreal forest was developed from 446 permanent sample plots, which addressed large-scale spatial trends in recruitment, diameter growth, and mortality. The model was tested on two sets of...

  7. Refining the Committee Approach and Uncertainty Prediction in Hydrological Modelling

    NARCIS (Netherlands)

    Kayastha, N.

    2014-01-01

Due to the complexity of hydrological systems a single model may be unable to capture the full range of a catchment response and accurately predict the streamflows. The multi-modelling approach opens up possibilities for handling such difficulties and allows one to improve the predictive capability of

  8. Towards modeling future energy infrastructures - the ELECTRA system engineering approach

    DEFF Research Database (Denmark)

    Uslar, Mathias; Heussen, Kai

    2016-01-01

    of the IEC 62559 use case template as well as needed changes to cope particularly with the aspects of controller conflicts and Greenfield technology modeling. From the original envisioned use of the standards, we show a possible transfer on how to properly deal with a Greenfield approach when modeling....

  9. A Model-Driven Approach to e-Course Management

    Science.gov (United States)

    Savic, Goran; Segedinac, Milan; Milenkovic, Dušica; Hrin, Tamara; Segedinac, Mirjana

    2018-01-01

    This paper presents research on using a model-driven approach to the development and management of electronic courses. We propose a course management system which stores a course model represented as distinct machine-readable components containing domain knowledge of different course aspects. Based on this formally defined platform-independent…

  10. A study of multidimensional modeling approaches for data warehouse

    Science.gov (United States)

    Yusof, Sharmila Mat; Sidi, Fatimah; Ibrahim, Hamidah; Affendey, Lilly Suriani

    2016-08-01

A data warehouse system is used to support the process of organizational decision making. Hence, the system must extract and integrate information from heterogeneous data sources in order to uncover relevant knowledge suitable for the decision-making process. However, the development of a data warehouse is a difficult and complex process, especially in its conceptual design (multidimensional modeling). Thus, various approaches have been proposed to overcome the difficulty. This study surveys and compares the approaches to multidimensional modeling and highlights the issues, trends and solutions proposed to date. The contribution is a state-of-the-art overview of multidimensional modeling design.

  11. Gray-box modelling approach for description of storage tunnel

    DEFF Research Database (Denmark)

    Harremoës, Poul; Carstensen, Jacob

    1999-01-01

    The dynamics of a storage tunnel is examined using a model based on on-line measured data and a combination of simple deterministic and black-box stochastic elements. This approach, called gray-box modeling, is a new promising methodology for giving an on-line state description of sewer systems...... of the water in the overflow structures. The capacity of a pump draining the storage tunnel is estimated for two different rain events, revealing that the pump was malfunctioning during the first rain event. The proposed modeling approach can be used in automated online surveillance and control and implemented...

  12. Meta-analysis a structural equation modeling approach

    CERN Document Server

    Cheung, Mike W-L

    2015-01-01

    Presents a novel approach to conducting meta-analysis using structural equation modeling. Structural equation modeling (SEM) and meta-analysis are two powerful statistical methods in the educational, social, behavioral, and medical sciences. They are often treated as two unrelated topics in the literature. This book presents a unified framework on analyzing meta-analytic data within the SEM framework, and illustrates how to conduct meta-analysis using the metaSEM package in the R statistical environment. Meta-Analysis: A Structural Equation Modeling Approach begins by introducing the impo

  13. A Companion Model Approach to Modelling and Simulation of Industrial Processes

    International Nuclear Information System (INIS)

    Juslin, K.

    2005-09-01

Modelling and simulation provides for huge possibilities if broadly taken up by engineers as a working method. However, when considering the launching of modelling and simulation tools in an engineering design project, they shall be easy to learn and use. Then, there is no time to write equations, to consult suppliers' experts, or to manually transfer data from one tool to another. The answer seems to be in the integration of easy to use and dependable simulation software with engineering tools. Accordingly, the modelling and simulation software shall accept as input such structured design information on industrial unit processes and their connections, as provided for by e.g. CAD software and product databases. The software technology, including required specification and communication standards, is already available. Internet based service repositories make it possible for equipment manufacturers to supply 'extended products', including such design data as needed by engineers engaged in process and automation integration. There is a market niche evolving for simulation service centres, operating in co-operation with project consultants, equipment manufacturers, process integrators, automation designers, plant operating personnel, and maintenance centres. The companion model approach for specification and solution of process simulation models, as presented herein, is developed from the above premises. The focus is on how to tackle real world processes, which from the modelling point of view are heterogeneous, dynamic, very stiff, very nonlinear and only piecewise continuous, without extensive manual interventions of human experts. An additional challenge, to solve the arising equations fast and reliably, is dealt with, as well. (orig.)

  14. Positive Mathematical Programming Approaches – Recent Developments in Literature and Applied Modelling

    Directory of Open Access Journals (Sweden)

    Thomas Heckelei

    2012-05-01

This paper reviews and discusses the more recent literature and application of Positive Mathematical Programming in the context of agricultural supply models. Specifically, advances in the empirical foundation of parameter specifications as well as the economic rationalisation of PMP models – both criticized in earlier reviews – are investigated. Moreover, the paper provides an overview on a larger set of models with regular/repeated policy application that apply variants of PMP. Results show that most applications today avoid arbitrary parameter specifications and rely on exogenous information on supply responses to calibrate model parameters. However, only few approaches use multiple observations to estimate parameters, which is likely due to the still considerable technical challenges associated with it. Equally, we found only limited reflection on the behavioral or technological assumptions that could rationalise the PMP model structure while still keeping the model’s advantages.

  15. Learning the Task Management Space of an Aircraft Approach Model

    Science.gov (United States)

    Krall, Joseph; Menzies, Tim; Davies, Misty

    2014-01-01

    Validating models of airspace operations is a particular challenge. These models are often aimed at finding and exploring safety violations, and aim to be accurate representations of real-world behavior. However, the rules governing the behavior are quite complex: nonlinear physics, operational modes, human behavior, and stochastic environmental concerns all determine the responses of the system. In this paper, we present a study on aircraft runway approaches as modeled in Georgia Tech's Work Models that Compute (WMC) simulation. We use a new learner, Genetic-Active Learning for Search-Based Software Engineering (GALE) to discover the Pareto frontiers defined by cognitive structures. These cognitive structures organize the prioritization and assignment of tasks of each pilot during approaches. We discuss the benefits of our approach, and also discuss future work necessary to enable uncertainty quantification.
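
    The Pareto frontier such a learner searches for is simply the non-dominated subset of candidate configurations. A minimal dominance filter (illustrative only, not part of GALE or WMC):

    ```python
    def pareto_front(points):
        """Return the non-dominated subset of points, minimising every objective.

        A point q dominates p if q is <= p in all objectives and differs from p.
        Quadratic scan -- fine for the small candidate sets a learner evaluates.
        """
        front = []
        for p in points:
            dominated = any(
                all(qi <= pi for qi, pi in zip(q, p)) and q != p
                for q in points
            )
            if not dominated:
                front.append(p)
        return front

    # Toy trade-off between, say, pilot workload and approach duration.
    candidates = [(1, 9), (3, 7), (5, 5), (4, 6), (6, 6), (9, 1)]
    front = pareto_front(candidates)  # (6, 6) is dominated by (5, 5)
    ```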

  16. A novel approach of modeling continuous dark hydrogen fermentation.

    Science.gov (United States)

    Alexandropoulou, Maria; Antonopoulou, Georgia; Lyberatos, Gerasimos

    2018-02-01

    In this study a novel modeling approach for describing fermentative hydrogen production in a continuous stirred tank reactor (CSTR) was developed, using the Aquasim modeling platform. This model accounts for the key metabolic reactions taking place in a fermentative hydrogen producing reactor, using fixed stoichiometry but different reaction rates. Biomass yields are determined based on bioenergetics. The model is capable of describing very well the variation in the distribution of metabolic products for a wide range of hydraulic retention times (HRT). The modeling approach is demonstrated using the experimental data obtained from a CSTR, fed with food industry waste (FIW), operating at different HRTs. The kinetic parameters were estimated through fitting to the experimental results. Hydrogen and total biogas production rates were predicted very well by the model, validating the basic assumptions regarding the implicated stoichiometric biochemical reactions and their kinetic rates. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. An integrated modeling approach to age invariant face recognition

    Science.gov (United States)

    Alvi, Fahad Bashir; Pears, Russel

    2015-03-01

This research study proposes a novel method for face recognition based on anthropometric features that makes use of an integrated approach comprising a global and a personalized model. The system is aimed at situations where lighting, illumination, and pose variations cause problems in face recognition. A personalized model covers the individual aging patterns while a global model captures general aging patterns in the database. We introduced a de-aging factor that de-ages each individual in the database test and training sets. We used the k-nearest-neighbor approach for building the personalized and global models. Regression analysis was applied to build the models. During the test phase, we resort to voting on different features. We used the FG-Net database for checking the results of our technique and achieved a 65 percent Rank 1 identification rate.
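
    The k-nearest-neighbour building block mentioned above is straightforward; a generic sketch with illustrative one-dimensional feature vectors (not the FG-Net pipeline):

    ```python
    import numpy as np

    def knn_predict(train_X, train_y, query, k=3):
        """Plain k-nearest-neighbour prediction: average the targets of the k
        training points closest (Euclidean distance) to the query vector."""
        d = np.linalg.norm(train_X - query, axis=1)
        nearest = np.argsort(d)[:k]
        return train_y[nearest].mean()

    # Toy data: feature value roughly predicts the target.
    train_X = np.array([[0.0], [1.0], [2.0], [10.0]])
    train_y = np.array([0.0, 1.0, 2.0, 10.0])
    pred = knn_predict(train_X, train_y, np.array([1.5]), k=2)
    ```

    A personalized model would restrict train_X to one subject's history, while a global model pools all subjects.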

  18. Depletion-of-Battery Attack: Specificity, Modelling and Analysis.

    Science.gov (United States)

    Shakhov, Vladimir; Koo, Insoo

    2018-06-06

    The emerging Internet of Things (IoT) has great potential; however, the societal costs of the IoT can outweigh its benefits. To unlock IoT potential, there needs to be improvement in the security of IoT applications. There are several standardization initiatives for sensor networks, which eventually converge with the Internet of Things. As sensor-based applications are deployed, security emerges as an essential requirement. One of the critical issues of wireless sensor technology is limited sensor resources, including sensor batteries. This creates a vulnerability to battery-exhausting attacks. Rapid exhaustion of sensor battery power is not only explained by intrusions, but can also be due to random failure of embedded sensor protocols. Thus, most wireless sensor applications, without tools to defend against rash battery exhausting, would be unable to function during prescribed times. In this paper, we consider a special type of threat, in which the harm is malicious depletion of sensor battery power. In contrast to the traditional denial-of-service attack, quality of service under the considered attack is not necessarily degraded. Moreover, the quality of service can increase up to the moment of the sensor set crashes. We argue that this is a distinguishing type of attack. Hence, the application of a traditional defense mechanism against this threat is not always possible. Therefore, effective methods should be developed to counter the threat. We first discuss the feasibility of rash depletion of battery power. Next, we propose a model for evaluation of energy consumption when under attack. Finally, a technique to counter the attack is discussed.
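
    The kind of energy-consumption evaluation proposed here can be caricatured with a linear drain model (purely illustrative numbers and parameter names, not the authors' model): extra attacker-generated requests raise served traffic while cutting battery lifetime.

    ```python
    def sensor_lifetime(capacity_j, idle_j, per_request_j,
                        legit_rate, attack_rate):
        """Time steps until battery exhaustion under a simple linear energy
        model: each step costs idle energy plus energy per served request.
        Illustrates how attack traffic shortens lifetime even while the
        'quality of service' (requests served per step) goes up."""
        drain = idle_j + per_request_j * (legit_rate + attack_rate)
        return capacity_j / drain

    baseline = sensor_lifetime(1000.0, 0.05, 0.2, legit_rate=1.0, attack_rate=0.0)
    attacked = sensor_lifetime(1000.0, 0.05, 0.2, legit_rate=1.0, attack_rate=4.0)
    # Four extra attack requests per step more than quadruple the per-step
    # drain here, so lifetime drops sharply.
    ```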

  19. Depletion-of-Battery Attack: Specificity, Modelling and Analysis

    Directory of Open Access Journals (Sweden)

    Vladimir Shakhov

    2018-06-01


  20. Development of a Subject-Specific Foot-Ground Contact Model for Walking.

    Science.gov (United States)

    Jackson, Jennifer N; Hass, Chris J; Fregly, Benjamin J

    2016-09-01

    largest errors in AP CoP occurred at the beginning and end of stance phase when the vertical ground reaction force (vGRF) was small. Subject-specific deformable foot-ground contact models created using this approach should enable changes in foot-ground contact pattern to be predicted accurately by gait optimization studies, which may lead to improvements in personalized rehabilitation medicine.

  1. On a model-based approach to radiation protection

    International Nuclear Information System (INIS)

    Waligorski, M.P.R.

    2002-01-01

    There is a preoccupation with linearity and absorbed dose as the basic quantifiers of radiation hazard. An alternative is the fluence approach, whereby radiation hazard may be evaluated, at least in principle, via an appropriate action cross section. In order to compare these approaches, it may be useful to discuss them as quantitative descriptors of survival and transformation-like endpoints in cell cultures in vitro - a system thought to be relevant to modelling radiation hazard. If absorbed dose is used to quantify these biological endpoints, then non-linear dose-effect relations have to be described, and, e.g. after doses of densely ionising radiation, dose-correction factors as high as 20 are required. In the fluence approach only exponential effect-fluence relationships can be readily described. Neither approach alone exhausts the scope of experimentally observed dependencies of effect on dose or fluence. Two-component models, incorporating a suitable mixture of the two approaches, are required. An example of such a model is the cellular track structure theory developed by Katz over thirty years ago. The practical consequences of modelling radiation hazard using this mixed two-component approach are discussed. (author)

  2. The harmony model - A case study on new contractors approach

    NARCIS (Netherlands)

    Wagemakers, G.; Favie, R.; Eekelen, van A.L.M.; Willemsen, J.G.L.; Maas, G.J.; Milford, R.

    2007-01-01

    This paper describes the added value of the more integrated project organisation. After explaining the differences between the traditional model and the integrated process, the characteristics of the specific industrial clients are elaborated on. The needs of these clients are related to both models

  3. Model-driven design using IEC 61499 a synchronous approach for embedded and automation systems

    CERN Document Server

    Yoong, Li Hsien; Bhatti, Zeeshan E; Kuo, Matthew M Y

    2015-01-01

    This book describes a novel approach for the design of embedded systems and industrial automation systems, using a unified model-driven approach that is applicable in both domains.  The authors illustrate their methodology, using the IEC 61499 standard as the main vehicle for specification, verification, static timing analysis and automated code synthesis.  The well-known synchronous approach is used as the main vehicle for defining an unambiguous semantics that ensures determinism and deadlock freedom. The proposed approach also ensures very efficient implementations either on small-scale embedded devices or on industry-scale programmable automation controllers (PACs). It can be used for both centralized and distributed implementations. Significantly, the proposed approach can be used without the need for any run-time support. This approach, for the first time, blurs the gap between embedded systems and automation systems and can be applied in wide-ranging applications in automotive, robotics, and industri...

  4. Modeling gene expression measurement error: a quasi-likelihood approach

    Directory of Open Access Journals (Sweden)

    Strimmer Korbinian

    2003-03-01

Background: Using suitable error models for gene expression measurements is essential in the statistical analysis of microarray data. However, the true probabilistic model underlying gene expression intensity readings is generally not known. Instead, in currently used approaches some simple parametric model is assumed (usually a transformed normal distribution) or the empirical distribution is estimated. However, both these strategies may not be optimal for gene expression data, as the non-parametric approach ignores known structural information whereas the fully parametric models run the risk of misspecification. A further related problem is the choice of a suitable scale for the model (e.g. observed vs. log-scale). Results: Here a simple semi-parametric model for gene expression measurement error is presented. In this approach inference is based on an approximate likelihood function (the extended quasi-likelihood). Only partial knowledge about the unknown true distribution is required to construct this function. In case of gene expression this information is available in the form of the postulated (e.g. quadratic) variance structure of the data. As the quasi-likelihood behaves (almost) like a proper likelihood, it allows for the estimation of calibration and variance parameters, and it is also straightforward to obtain corresponding approximate confidence intervals. Unlike most other frameworks, it also allows analysis on any preferred scale, i.e. both on the original linear scale as well as on a transformed scale. It can also be employed in regression approaches to model systematic (e.g. array or dye) effects. Conclusions: The quasi-likelihood framework provides a simple and versatile approach to analyze gene expression data that does not make any strong distributional assumptions about the underlying error model. For several simulated as well as real data sets it provides a better fit to the data than competing models. In an example it also

  5. Space-Wise approach for airborne gravity data modelling

    Science.gov (United States)

    Sampietro, D.; Capponi, M.; Mansi, A. H.; Gatti, A.; Marchetti, P.; Sansò, F.

    2017-05-01

Regional gravity field modelling by means of the remove-compute-restore procedure is nowadays widely applied in different contexts: it is the most used technique for regional gravimetric geoid determination, and it is also used in exploration geophysics to predict grids of gravity anomalies (Bouguer, free-air, isostatic, etc.), which are useful to understand and map geological structures in a specific region. Considering this last application, due to the required accuracy and resolution, airborne gravity observations are usually adopted. However, due to the relatively high acquisition velocity, presence of atmospheric turbulence, aircraft vibration, instrumental drift, etc., airborne data are usually contaminated by a very high observation error. For this reason, a proper procedure to filter the raw observations in both the low and high frequencies should be applied to recover valuable information. In this work, a software to filter and grid raw airborne observations is presented: the proposed solution consists in a combination of an along-track Wiener filter and a classical Least Squares Collocation technique. Basically, the proposed procedure is an adaptation to airborne gravimetry of the Space-Wise approach, developed by Politecnico di Milano to process data coming from the ESA satellite mission GOCE. Among the main differences with respect to the satellite application of this approach, there is the fact that, while in processing GOCE data the stochastic characteristics of the observation error can be considered a priori well known, in airborne gravimetry, due to the complex environment in which the observations are acquired, these characteristics are unknown and should be retrieved from the dataset itself. The presented solution is suited for airborne data analysis in order to be able to quickly and easily filter and grid gravity observations. Some innovative theoretical aspects, focusing in particular on the theoretical covariance modelling, are presented, too.
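
    The filtering step can be illustrated with a generic frequency-domain Wiener-style attenuation S/(S+N), assuming white noise of known variance. This is a deliberate simplification of the along-track filter described above, in which the noise characteristics must instead be estimated from the data:

    ```python
    import numpy as np

    def wiener_lowpass(signal, noise_var):
        """Frequency-domain Wiener-style filter: attenuate each Fourier
        component by S/(S + N), with the signal power S estimated from the
        noisy spectrum and a flat (white) noise spectrum N assumed."""
        n = len(signal)
        spec = np.fft.rfft(signal)
        power = np.abs(spec) ** 2 / n          # per-component power estimate
        gain = np.maximum(power - noise_var, 0.0) / np.maximum(power, 1e-30)
        return np.fft.irfft(gain * spec, n)

    rng = np.random.default_rng(1)
    t = np.linspace(0.0, 1.0, 512, endpoint=False)
    clean = np.sin(2 * np.pi * 3 * t)           # smooth 'gravity' signal
    noisy = clean + rng.normal(0.0, 0.5, t.size)
    filtered = wiener_lowpass(noisy, noise_var=0.25)
    ```

    The attenuation leaves the strong low-frequency component nearly untouched while suppressing most of the broadband noise.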

  6. Rosetta comparative modeling for library design: Engineering alternative inducer specificity in a transcription factor.

    Science.gov (United States)

    Jha, Ramesh K; Chakraborti, Subhendu; Kern, Theresa L; Fox, David T; Strauss, Charlie E M

    2015-07-01

Structure-based rational mutagenesis for engineering protein functionality has been limited by the scarcity and difficulty of obtaining crystal structures of desired proteins. On the other hand, when high-throughput selection is possible, directed evolution-based approaches for gaining protein functionalities have been random and fortuitous with limited rationalization. We combine comparative modeling of dimer structures, ab initio loop reconstruction, and ligand docking to select positions for mutagenesis to create a library focused on the ligand-contacting residues. The rationally reduced library requirement enabled conservative control of the substitutions by oligonucleotide synthesis and bounding its size within practical transformation efficiencies (∼10^7 variants). This rational approach was successfully applied on an inducer-binding domain of an Acinetobacter transcription factor (TF), pobR, which shows high specificity for the natural effector molecule, 4-hydroxy benzoate (4HB), but no native response to 3,4-dihydroxy benzoate (34DHB). Selection for mutants with high transcriptional induction by 34DHB was carried out at the single-cell level under flow cytometry (via green fluorescent protein expression under the control of the pobR promoter). Critically, this selection protocol allows both selection for induction and rejection of constitutively active mutants. In addition to gain-of-function for 34DHB induction, the selected mutants also showed enhanced sensitivity and response for 4HB (native inducer) while no sensitivity was observed for a non-targeted but chemically similar molecule, 2-hydroxy benzoate (2HB). This is a unique application of the Rosetta modeling protocols for library design to engineer a TF. Our approach extends applicability of the Rosetta redesign protocol into regimes without a priori precision structural information. © 2015 Wiley Periodicals, Inc.

  7. Improvement of tool support of the spatial approach to regional planning: problems, specifics, trends

    Directory of Open Access Journals (Sweden)

    Nataliya Gennadievna Yushkova

    2015-01-01

    Full Text Available The emerging imperatives of innovation-driven economic development in Russia determine the content of conceptual and institutional constraints on the development of regional economic systems (RES). They consider the regional planning system as a leading priority in its inseparable unity with modern public administration tasks. However, the practice of developing long-term plans in the RF subjects proves that the innovation challenges of economic policy are either not reflected properly in them or are significantly distorted. The following reasons reduce the effectiveness of modernization processes in the RF subjects and hamper the appropriate response of RES to their impact: the lack of coordination between socio-economic and spatial regional plans, the imbalance of interaction between state authorities engaged in long-term planning, and the lack of real prerequisites for the implementation of innovation initiatives in the regions. Systematization and analysis of long-term plans make it possible to substantiate the consistency of the spatial approach to regional planning, expressed in the dominance of the transformational function that synchronizes the configuration and parameters of RES, and to establish ways to integrate spatial components into the system of regional planning through optimization of its tool support. The change in the content of the instrumentation support is based on the synthesis of the predominant basic characteristics of the existing tools used in isolated subsystems of regional planning of socio-economic and territorial development. The study has established a system of tool support for regional planning that adapts to changes in both internal and external factors in the development of RES. Three main groups of tools (organizing, regulating, and coordinating) are defined by their typology in accordance with the groups of management functions. The article proposes the modeling of combinations of tools that are subordinated to the

  8. High resolution bone material property assignment yields robust subject specific finite element models of complex thin bone structures.

    Science.gov (United States)

    Pakdel, Amirreza; Fialkov, Jeffrey; Whyne, Cari M

    2016-06-14

    Accurate finite element (FE) modeling of complex skeletal anatomy requires high resolution in both meshing and the heterogeneous mapping of material properties onto the generated mesh. This study introduces Node-based elastic Modulus Assignment with Partial-volume correction (NMAP) as a new approach for FE material property assignment to thin bone structures. The NMAP approach incorporates point spread function based deblurring of CT images, partial-volume correction of CT image voxel intensities and anisotropic interpolation and mapping of CT intensity assignment to FE mesh nodes. The NMAP procedure combined with a derived craniomaxillo-facial skeleton (CMFS) specific density-isotropic elastic modulus relationship was applied to produce specimen-specific FE models of 6 cadaveric heads. The NMAP procedure successfully generated models of the complex thin bone structures with surface elastic moduli reflective of cortical bone material properties. The specimen-specific CMFS FE models were able to accurately predict experimental strains measured under in vitro temporalis and masseter muscle loading (r=0.93, slope=1.01, n=5). The strength of this correlation represents a robust validation for CMFS FE modeling that can be used to better understand load transfer in this complex musculoskeletal system. The developed methodology offers a systematic process-flow able to address the complexity of the CMFS that can be further applied to create high-fidelity models of any musculoskeletal anatomy. Copyright © 2016 Elsevier Ltd. All rights reserved.
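
    The mapping chain the abstract builds on (CT voxel intensity → apparent density → elastic modulus via a density–modulus relationship) can be sketched as below. All coefficients are illustrative placeholders, not the CMFS-specific relationship derived in the paper:

```python
# Hedged sketch of a typical CT-based material mapping pipeline:
# Hounsfield units -> apparent density -> elastic modulus (power law).
# Calibration constants here are placeholders, not the paper's values.
def hu_to_density(hu: float, a: float = 0.0, b: float = 0.0008) -> float:
    """Apparent density (g/cm^3) from a linear CT calibration; the
    +1000 shift maps air (~ -1000 HU) near zero density."""
    return a + b * (hu + 1000.0)

def density_to_modulus(rho: float, c: float = 6500.0, d: float = 1.5) -> float:
    """Elastic modulus (MPa) from a power-law density relationship."""
    return c * rho ** d

E_cortical = density_to_modulus(hu_to_density(1500.0))   # dense bone
E_trabecular = density_to_modulus(hu_to_density(300.0))  # porous bone
```

In a node-based scheme such as NMAP, a relationship of this kind would be evaluated at mesh nodes after deblurring and partial-volume correction of the voxel intensities, rather than per element.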

  9. A Toolkit Modeling Approach for Sustainable Forest Management Planning: Achieving Balance between Science and Local Needs

    Directory of Open Access Journals (Sweden)

    Brian R. Sturtevant

    2007-12-01

    Full Text Available To assist forest managers in balancing an increasing diversity of resource objectives, we developed a toolkit modeling approach for sustainable forest management (SFM). The approach inserts a meta-modeling strategy into a collaborative modeling framework grounded in adaptive management philosophy that facilitates participation among stakeholders, decision makers, and local domain experts in the meta-model building process. The modeling team works iteratively with each of these groups to define essential questions, identify data resources, and then determine whether available tools can be applied or adapted, or whether new tools can be rapidly created to fit the need. The desired goal of the process is a linked series of domain-specific models (tools) that balances generalized "top-down" models (i.e., scientific models developed without input from the local system) with case-specific customized "bottom-up" models that are driven primarily by local needs. Information flow between models is organized according to vertical (i.e., between-scale) and horizontal (i.e., within-scale) dimensions. We illustrate our approach within a 2.1 million hectare forest planning district in central Labrador, a forested landscape where social and ecological values receive a higher priority than economic values. However, the focus of this paper is on the process of how SFM modeling tools and concepts can be rapidly assembled and applied in new locations, balancing efficient transfer of science with adaptation to local needs. We use the Labrador case study to illustrate strengths and challenges uniquely associated with a meta-modeling approach to integrated modeling as it fits within the broader collaborative modeling framework. Principal advantages of the approach include the scientific rigor introduced by peer-reviewed models, combined with the adaptability of meta-modeling. A key challenge is the limited transparency of scientific models to different participatory groups.

  10. Modelling Cyclic Walking in Femurs With Metastatic Lesions : Femur-Specific Accumulation of Plasticity

    NARCIS (Netherlands)

    Derikx, L.; Janssen, D.; Schepers, J.; Wesseling, M.; Verdonschot, N.; Jonkers, I.; Tanck, E.

    2015-01-01

    Introduction Clinical fracture risk assessment in metastatic bone disease is extremely difficult, but subject-specific finite element (FE) modelling may improve these assessments in the future [Derikx, 2015]. By coupling to musculoskeletal modelling, realistic loading conditions can be implemented

  11. A review of function modeling: Approaches and applications

    OpenAIRE

    Erden, M.S.; Komoto, H.; Van Beek, T.J.; D'Amelio, V.; Echavarria, E.; Tomiyama, T.

    2008-01-01

    This work is aimed at establishing a common frame and understanding of function modeling (FM) for our ongoing research activities. A comparative review of the literature is performed to grasp the various FM approaches with their commonalities and differences. The relations of FM with the research fields of artificial intelligence, design theory, and maintenance are discussed. In this discussion the goals are to highlight the features of various classical approaches in relation to FM, to delin...

  12. Top-down approach to unified supergravity models

    International Nuclear Information System (INIS)

    Hempfling, R.

    1994-03-01

    We introduce a new approach for studying unified supergravity models. In this approach all the parameters of the grand unified theory (GUT) are fixed by imposing the corresponding number of low energy observables. This determines the remaining particle spectrum whose dependence on the low energy observables can now be investigated. We also include some SUSY threshold corrections that have previously been neglected. In particular the SUSY threshold corrections to the fermion masses can have a significant impact on the Yukawa coupling unification. (orig.)

  13. Intelligent Transportation and Evacuation Planning A Modeling-Based Approach

    CERN Document Server

    Naser, Arab

    2012-01-01

    Intelligent Transportation and Evacuation Planning: A Modeling-Based Approach provides a new paradigm for evacuation planning strategies and techniques. Recently, evacuation planning and modeling have increasingly attracted interest among researchers as well as government officials. This interest stems from the recent catastrophic hurricanes and weather-related events that occurred in the southeastern United States (Hurricanes Katrina and Rita). The evacuation methods that were in place before and during the hurricanes did not work well and resulted in thousands of deaths. This book offers insights into the methods and techniques that allow for implementing mathematical-based, simulation-based, and integrated optimization and simulation-based engineering approaches for evacuation planning. This book also: Comprehensively discusses the application of mathematical models for evacuation and intelligent transportation modeling Covers advanced methodologies in evacuation modeling and planning Discusses principles a...

  14. Multi-model approach to characterize human handwriting motion.

    Science.gov (United States)

    Chihi, I; Abdelkrim, A; Benrejeb, M

    2016-02-01

    This paper deals with characterization and modelling of human handwriting motion from two forearm muscle activity signals, called electromyography signals (EMG). In this work, an experimental approach was used to record the coordinates of a pen tip moving on the (x, y) plane and EMG signals during the handwriting act. The main purpose is to design a new mathematical model which characterizes this biological process. Based on a multi-model approach, this system was originally developed to generate letters and geometric forms written by different writers. A Recursive Least Squares algorithm is used to estimate the parameters of each sub-model of the multi-model basis. Simulations show good agreement between predicted results and the recorded data.
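
    The abstract names a Recursive Least Squares algorithm for estimating each sub-model's parameters. A minimal generic RLS sketch; the regressor structure and the synthetic data below are illustrative, not the authors' handwriting sub-models:

```python
import numpy as np

# Generic recursive least squares (RLS) with forgetting factor `lam`.
# This is the standard algorithm the abstract names, not the specific
# multi-model structure identified from EMG and pen-tip data.
def rls_fit(X, y, lam=1.0, delta=1e3):
    n = X.shape[1]
    theta = np.zeros(n)        # running parameter estimate
    P = delta * np.eye(n)      # inverse correlation matrix (large init)
    for x_t, y_t in zip(X, y):
        Px = P @ x_t
        k = Px / (lam + x_t @ Px)            # gain vector
        theta += k * (y_t - x_t @ theta)     # correct by prediction error
        P = (P - np.outer(k, x_t) @ P) / lam
    return theta

# Recover known coefficients from noiseless synthetic data:
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
true = np.array([0.5, -1.2, 2.0])
theta = rls_fit(X, X @ true)
```

In a multi-model setting, one such recursion would run per sub-model, each fitted on the data segments attributed to it.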

  15. Wave Resource Characterization Using an Unstructured Grid Modeling Approach

    Directory of Open Access Journals (Sweden)

    Wei-Cheng Wu

    2018-03-01

    Full Text Available This paper presents a modeling study conducted on the central Oregon coast for wave resource characterization, using the unstructured grid Simulating WAve Nearshore (SWAN) model coupled with a nested grid WAVEWATCH III® (WWIII) model. The flexibility of models with various spatial resolutions and the effects of open boundary conditions simulated by a nested grid WWIII model with different physics packages were evaluated. The model results demonstrate the advantage of the unstructured grid-modeling approach for flexible model resolution and good model skills in simulating the six wave resource parameters recommended by the International Electrotechnical Commission in comparison to the observed data in Year 2009 at National Data Buoy Center Buoy 46050. Notably, spectral analysis indicates that the ST4 physics package improves upon the ST2 physics package's ability to predict wave power density for large waves, which is important for wave resource assessment, load calculation of devices, and risk management. In addition, bivariate distributions show that the simulated sea state of maximum occurrence with the ST4 physics package matched the observed data better than with the ST2 physics package. This study demonstrated that the unstructured grid wave modeling approach, driven by regional nested grid WWIII outputs along with the ST4 physics package, can efficiently provide accurate wave hindcasts to support wave resource characterization. Our study also suggests that wind effects need to be considered if the dimension of the model domain is greater than approximately 100 km, or O(10² km).
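
    One of the six IEC wave resource parameters referred to, omnidirectional wave power density, has a simple deep-water estimate from significant wave height and energy period. A sketch with an assumed sea state (the formula is the standard deep-water approximation, not the study's spectral computation):

```python
import math

# Deep-water estimate of omnidirectional wave power density,
# J = (rho * g^2 / 64*pi) * Hm0^2 * Te, in kW per metre of wave crest.
def wave_power_density(hm0: float, te: float,
                       rho: float = 1025.0, g: float = 9.81) -> float:
    """hm0: significant wave height (m); te: energy period (s)."""
    return rho * g**2 / (64 * math.pi) * hm0**2 * te / 1000.0

# e.g. an assumed 2.5 m, 10 s sea state:
J = wave_power_density(2.5, 10.0)   # roughly 31 kW/m
```

The full models instead integrate the directional wave spectrum with the group velocity, which is where the ST2/ST4 physics differences in the spectral tail show up.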

  16. Rule-based modeling: a computational approach for studying biomolecular site dynamics in cell signaling systems

    Science.gov (United States)

    Chylek, Lily A.; Harris, Leonard A.; Tung, Chang-Shung; Faeder, James R.; Lopez, Carlos F.

    2013-01-01

    Rule-based modeling was developed to address the limitations of traditional approaches for modeling chemical kinetics in cell signaling systems. These systems consist of multiple interacting biomolecules (e.g., proteins), which themselves consist of multiple parts (e.g., domains, linear motifs, and sites of phosphorylation). Consequently, biomolecules that mediate information processing generally have the potential to interact in multiple ways, with the number of possible complexes and post-translational modification states tending to grow exponentially with the number of binary interactions considered. As a result, only large reaction networks capture all possible consequences of the molecular interactions that occur in a cell signaling system, which is problematic because traditional modeling approaches for chemical kinetics (e.g., ordinary differential equations) require explicit network specification. This problem is circumvented through representation of interactions in terms of local rules. With this approach, network specification is implicit and model specification is concise. Concise representation results in a coarse graining of chemical kinetics, which is introduced because all reactions implied by a rule inherit the rate law associated with that rule. Coarse graining can be appropriate if interactions are modular, and the coarseness of a model can be adjusted as needed. Rules can be specified using specialized model-specification languages, and recently developed tools designed for specification of rule-based models allow one to leverage powerful software engineering capabilities. A rule-based model comprises a set of rules, which can be processed by general-purpose simulation and analysis tools to achieve different objectives (e.g., to perform either a deterministic or stochastic simulation). PMID:24123887

  17. Rule-based modeling: a computational approach for studying biomolecular site dynamics in cell signaling systems.

    Science.gov (United States)

    Chylek, Lily A; Harris, Leonard A; Tung, Chang-Shung; Faeder, James R; Lopez, Carlos F; Hlavacek, William S

    2014-01-01

    Rule-based modeling was developed to address the limitations of traditional approaches for modeling chemical kinetics in cell signaling systems. These systems consist of multiple interacting biomolecules (e.g., proteins), which themselves consist of multiple parts (e.g., domains, linear motifs, and sites of phosphorylation). Consequently, biomolecules that mediate information processing generally have the potential to interact in multiple ways, with the number of possible complexes and posttranslational modification states tending to grow exponentially with the number of binary interactions considered. As a result, only large reaction networks capture all possible consequences of the molecular interactions that occur in a cell signaling system, which is problematic because traditional modeling approaches for chemical kinetics (e.g., ordinary differential equations) require explicit network specification. This problem is circumvented through representation of interactions in terms of local rules. With this approach, network specification is implicit and model specification is concise. Concise representation results in a coarse graining of chemical kinetics, which is introduced because all reactions implied by a rule inherit the rate law associated with that rule. Coarse graining can be appropriate if interactions are modular, and the coarseness of a model can be adjusted as needed. Rules can be specified using specialized model-specification languages, and recently developed tools designed for specification of rule-based models allow one to leverage powerful software engineering capabilities. A rule-based model comprises a set of rules, which can be processed by general-purpose simulation and analysis tools to achieve different objectives (e.g., to perform either a deterministic or stochastic simulation). © 2013 Wiley Periodicals, Inc.
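
    The core idea above, implicit network specification from local rules, can be illustrated with a toy model. The three-site protein here is hypothetical, not a system from the text, and this is plain Python rather than a rule-based language such as BioNetGen:

```python
from itertools import product

# Toy illustration of implicit network specification: a protein with 3
# modification sites has 2**3 = 8 states, and a site-specific
# phosphorylation rule covers every reaction in which that site is
# unmodified, regardless of the other sites' states.
SITES = 3
states = list(product([0, 1], repeat=SITES))   # full state space: 8 states

def reactions_for_rule(site):
    """Enumerate all reactions implied by the rule 'phosphorylate `site`'.
    Under rule-based coarse graining, each inherits the rule's rate law."""
    rxns = []
    for s in states:
        if s[site] == 0:            # the rule's only local condition
            p = list(s)
            p[site] = 1
            rxns.append((s, tuple(p)))
    return rxns

# 3 rules (one per site) imply 3 * 2**(SITES-1) = 12 distinct reactions:
network = [r for site in range(SITES) for r in reactions_for_rule(site)]
```

With 10 sites the same 10 rules would imply 5,120 reactions, which is exactly the exponential blow-up that makes explicit network specification impractical.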

  18. A comprehensive dynamic modeling approach for giant magnetostrictive material actuators

    International Nuclear Information System (INIS)

    Gu, Guo-Ying; Zhu, Li-Min; Li, Zhi; Su, Chun-Yi

    2013-01-01

    In this paper, a comprehensive modeling approach for a giant magnetostrictive material actuator (GMMA) is proposed based on the description of nonlinear electromagnetic behavior, the magnetostrictive effect and frequency response of the mechanical dynamics. It maps the relationships between current and magnetic flux at the electromagnetic part to force and displacement at the mechanical part in a lumped parameter form. Towards this modeling approach, the nonlinear hysteresis effect of the GMMA appearing only in the electrical part is separated from the linear dynamic plant in the mechanical part. Thus, a two-module dynamic model is developed to completely characterize the hysteresis nonlinearity and the dynamic behaviors of the GMMA. The first module is a static hysteresis model to describe the hysteresis nonlinearity, and the cascaded second module is a linear dynamic plant to represent the dynamic behavior. To validate the proposed dynamic model, an experimental platform is established. Then, the linear dynamic part and the nonlinear hysteresis part of the proposed model are identified in sequence. For the linear part, an approach based on axiomatic design theory is adopted. For the nonlinear part, a Prandtl–Ishlinskii model is introduced to describe the hysteresis nonlinearity and a constrained quadratic optimization method is utilized to identify its coefficients. Finally, experimental tests are conducted to demonstrate the effectiveness of the proposed dynamic model and the corresponding identification method. (paper)
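
    The Prandtl–Ishlinskii model named for the hysteresis module is a weighted superposition of play (backlash) operators. A minimal discrete-time sketch; the thresholds and weights are made up for illustration, not the identified GMMA coefficients:

```python
# Discrete Prandtl-Ishlinskii operator: weighted sum of play operators.
def play(u: float, y_prev: float, r: float) -> float:
    """Backlash (play) operator with threshold r."""
    return max(u - r, min(u + r, y_prev))

def prandtl_ishlinskii(u_seq, thresholds, weights):
    y_states = [0.0] * len(thresholds)
    out = []
    for u in u_seq:
        y_states = [play(u, y, r) for y, r in zip(y_states, thresholds)]
        out.append(sum(w * y for w, y in zip(weights, y_states)))
    return out

# A triangle input produces a hysteresis loop: the output at u = 0.5
# differs between the rising and the falling branch.
u = [i / 10 for i in range(11)] + [1 - i / 10 for i in range(11)]
y = prandtl_ishlinskii(u, thresholds=[0.0, 0.2, 0.4], weights=[1.0, 0.5, 0.5])
```

In the paper's two-module structure, an operator of this kind (identified by constrained quadratic optimization) would sit in cascade before the linear dynamic plant.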

  19. A website evaluation model by integration of previous evaluation models using a quantitative approach

    Directory of Open Access Journals (Sweden)

    Ali Moeini

    2015-01-01

    Full Text Available Given the growth of e-commerce, websites play an essential role in business success. Therefore, many authors have offered website evaluation models since 1995. However, the multiplicity and diversity of evaluation models make it difficult to integrate them into a single comprehensive model. In this paper a quantitative method has been used to integrate previous models into a comprehensive model that is compatible with them. In this approach the researcher's judgment has no role in the integration of models, and the new model takes its validity from the 93 previous models and the systematic quantitative approach.

  20. Tracing carbon flow through coral reef food webs using a compound-specific stable isotope approach.

    Science.gov (United States)

    McMahon, Kelton W; Thorrold, Simon R; Houghton, Leah A; Berumen, Michael L

    2016-03-01

    Coral reefs support spectacularly productive and diverse communities in tropical and sub-tropical waters throughout the world's oceans. Debate continues, however, on the degree to which reef biomass is supported by new water column production, benthic primary production, and recycled detrital carbon (C). We coupled compound-specific stable C isotope ratio (δ¹³C) analyses with Bayesian mixing models to quantify C flow from primary producers to coral reef fishes across multiple feeding guilds and trophic positions in the Red Sea. Analyses of reef fishes with putative diets composed primarily of zooplankton (Amblyglyphidodon indicus), benthic macroalgae (Stegastes nigricans), reef-associated detritus (Ctenochaetus striatus), and coral tissue (Chaetodon trifascialis) confirmed that δ¹³C values of essential amino acids from all baseline C sources were both isotopically diagnostic and accurately recorded in consumer tissues. While all four source end-members contributed to the production of coral reef fishes in our study, a single-source end-member often dominated dietary C assimilation of a given species, even for highly mobile, generalist top predators. Microbially reworked detritus was an important secondary C source for most species. Seascape configuration played an important role in structuring resource utilization patterns. For instance, Lutjanus ehrenbergii showed a significant shift from a benthic macroalgal food web on shelf reefs (71 ± 13 % of dietary C) to a phytoplankton-based food web (72 ± 11 %) on oceanic reefs. Our work provides insights into the roles that diverse C sources play in the structure and function of coral reef ecosystems and illustrates a powerful fingerprinting method to develop and test nutritional frameworks for understanding resource utilization.
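
    At its simplest, the mixing-model idea reduces to linear unmixing of end-member isotope values. A two-source, one-tracer sketch with illustrative numbers; the study itself used Bayesian mixing models over multiple amino-acid tracers and four end-members:

```python
# Minimal two-source, one-tracer isotope mixing calculation -- the simplest
# analog of a Bayesian mixing model. End-member values are illustrative,
# not the study's measured baselines.
def two_source_mixing(d_consumer: float, d_source1: float, d_source2: float):
    """Fractions of dietary carbon from each source, assuming the consumer
    value is a linear mixture of the two end-members."""
    f1 = (d_consumer - d_source2) / (d_source1 - d_source2)
    return f1, 1.0 - f1

# e.g. a consumer delta-13C of -19 per mil between an assumed macroalgal
# end-member (-17) and an assumed phytoplankton end-member (-24):
f_algae, f_phyto = two_source_mixing(-19.0, -17.0, -24.0)   # ~0.71, ~0.29
```

With more sources than tracers the problem is underdetermined, which is why the Bayesian formulation (priors plus posterior distributions over source fractions) is used in practice.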

  1. Tracing carbon flow through coral reef food webs using a compound-specific stable isotope approach

    KAUST Repository

    McMahon, Kelton

    2015-11-21

    Coral reefs support spectacularly productive and diverse communities in tropical and sub-tropical waters throughout the world’s oceans. Debate continues, however, on the degree to which reef biomass is supported by new water column production, benthic primary production, and recycled detrital carbon (C). We coupled compound-specific stable C isotope ratio (δ13C) analyses with Bayesian mixing models to quantify C flow from primary producers to coral reef fishes across multiple feeding guilds and trophic positions in the Red Sea. Analyses of reef fishes with putative diets composed primarily of zooplankton (Amblyglyphidodon indicus), benthic macroalgae (Stegastes nigricans), reef-associated detritus (Ctenochaetus striatus), and coral tissue (Chaetodon trifascialis) confirmed that δ13C values of essential amino acids from all baseline C sources were both isotopically diagnostic and accurately recorded in consumer tissues. While all four source end-members contributed to the production of coral reef fishes in our study, a single-source end-member often dominated dietary C assimilation of a given species, even for highly mobile, generalist top predators. Microbially reworked detritus was an important secondary C source for most species. Seascape configuration played an important role in structuring resource utilization patterns. For instance, Lutjanus ehrenbergii showed a significant shift from a benthic macroalgal food web on shelf reefs (71 ± 13 % of dietary C) to a phytoplankton-based food web (72 ± 11 %) on oceanic reefs. Our work provides insights into the roles that diverse C sources play in the structure and function of coral reef ecosystems and illustrates a powerful fingerprinting method to develop and test nutritional frameworks for understanding resource utilization.

  2. Decision support models for solid waste management: Review and game-theoretic approaches

    International Nuclear Information System (INIS)

    Karmperis, Athanasios C.; Aravossis, Konstantinos; Tatsiopoulos, Ilias P.; Sotirchos, Anastasios

    2013-01-01

    Highlights: ► The mainly used decision support frameworks for solid waste management are reviewed. ► The LCA, CBA and MCDM models are presented and their strengths, weaknesses, similarities and possible combinations are analyzed. ► The game-theoretic approach in a solid waste management context is presented. ► The waste management bargaining game is introduced as a specific decision support framework. ► Cooperative and non-cooperative game-theoretic approaches to decision support for solid waste management are discussed. - Abstract: This paper surveys decision support models that are commonly used in the solid waste management area. Most models are mainly developed within three decision support frameworks, which are the life-cycle assessment, the cost–benefit analysis and the multi-criteria decision-making. These frameworks are reviewed and their strengths and weaknesses as well as their critical issues are analyzed, while their possible combinations and extensions are also discussed. Furthermore, the paper presents how cooperative and non-cooperative game-theoretic approaches can be used for the purpose of modeling and analyzing decision-making in situations with multiple stakeholders. Specifically, since a waste management model is sustainable when considering not only environmental and economic but also social aspects, the waste management bargaining game is introduced as a specific decision support framework in which future models can be developed

  3. Decision support models for solid waste management: Review and game-theoretic approaches

    Energy Technology Data Exchange (ETDEWEB)

    Karmperis, Athanasios C., E-mail: athkarmp@mail.ntua.gr [Sector of Industrial Management and Operational Research, School of Mechanical Engineering, National Technical University of Athens, Iroon Polytechniou 9, 15780 Athens (Greece); Army Corps of Engineers, Hellenic Army General Staff, Ministry of Defence (Greece); Aravossis, Konstantinos; Tatsiopoulos, Ilias P.; Sotirchos, Anastasios [Sector of Industrial Management and Operational Research, School of Mechanical Engineering, National Technical University of Athens, Iroon Polytechniou 9, 15780 Athens (Greece)

    2013-05-15

    Highlights: ► The mainly used decision support frameworks for solid waste management are reviewed. ► The LCA, CBA and MCDM models are presented and their strengths, weaknesses, similarities and possible combinations are analyzed. ► The game-theoretic approach in a solid waste management context is presented. ► The waste management bargaining game is introduced as a specific decision support framework. ► Cooperative and non-cooperative game-theoretic approaches to decision support for solid waste management are discussed. - Abstract: This paper surveys decision support models that are commonly used in the solid waste management area. Most models are mainly developed within three decision support frameworks, which are the life-cycle assessment, the cost–benefit analysis and the multi-criteria decision-making. These frameworks are reviewed and their strengths and weaknesses as well as their critical issues are analyzed, while their possible combinations and extensions are also discussed. Furthermore, the paper presents how cooperative and non-cooperative game-theoretic approaches can be used for the purpose of modeling and analyzing decision-making in situations with multiple stakeholders. Specifically, since a waste management model is sustainable when considering not only environmental and economic but also social aspects, the waste management bargaining game is introduced as a specific decision support framework in which future models can be developed.
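
    The waste management bargaining game introduced above builds on the Nash bargaining framework. A toy two-stakeholder sketch of the generic Nash solution; the payoffs and disagreement points are illustrative, not the authors' model:

```python
# Toy Nash bargaining: two stakeholders split a fixed benefit `total`,
# each with a disagreement payoff d1/d2. The Nash solution maximizes the
# product (u1 - d1) * (u2 - d2) over feasible splits (grid search here).
def nash_bargaining(total: float, d1: float, d2: float, grid: int = 10001):
    best, best_x = -1.0, None
    for i in range(grid):
        x = total * i / (grid - 1)      # payoff to player 1
        u1, u2 = x, total - x
        if u1 >= d1 and u2 >= d2:
            p = (u1 - d1) * (u2 - d2)
            if p > best:
                best, best_x = p, x
    return best_x

# With equal disagreement payoffs, the surplus splits evenly:
x = nash_bargaining(total=10.0, d1=2.0, d2=2.0)
```

Analytically the solution is x = d1 + (total − d1 − d2)/2, so each player gets their disagreement payoff plus half the surplus; asymmetric disagreement points shift the split accordingly.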

  4. Smeared crack modelling approach for corrosion-induced concrete damage

    DEFF Research Database (Denmark)

    Thybo, Anna Emilie Anusha; Michel, Alexander; Stang, Henrik

    2017-01-01

    In this paper a smeared crack modelling approach is used to simulate corrosion-induced damage in reinforced concrete. The presented modelling approach utilizes a thermal analogy to mimic the expansive nature of solid corrosion products, while taking into account the penetration of corrosion...... products into the surrounding concrete, non-uniform precipitation of corrosion products, and creep. To demonstrate the applicability of the presented modelling approach, numerical predictions in terms of corrosion-induced deformations as well as formation and propagation of micro- and macrocracks were......-induced damage phenomena in reinforced concrete. Moreover, good agreements were also found between experimental and numerical data for corrosion-induced deformations along the circumference of the reinforcement....

  5. A model-data based systems approach to process intensification

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    . Their developments, however, are largely due to experiment-based trial-and-error approaches, and while they do not require validation, they can be time-consuming and resource-intensive. Also, one may ask: can a truly new intensified unit operation be obtained in this way? An alternative two-stage approach is to apply...... a model-based synthesis method to systematically generate and evaluate alternatives in the first stage and an experiment-model based validation in the second stage. In this way, the search for alternatives is done very quickly, reliably and systematically over a wide range, while resources are preserved...... for focused validation of only the promising candidates in the second stage. This approach, however, would be limited to intensification based on "known" unit operations, unless the PI process synthesis/design is considered at a lower level of aggregation, namely the phenomena level. That is, the model-based

  6. METHODOLOGICAL APPROACHES FOR MODELING THE RURAL SETTLEMENT DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Gorbenkova Elena Vladimirovna

    2017-10-01

    Full Text Available Subject: the paper describes the research results on validation of a rural settlement development model. The basic methods and approaches for solving the problem of assessing the efficiency of urban and rural settlement development are considered. Research objectives: determination of methodological approaches to modeling and creating a model for the development of rural settlements. Materials and methods: domestic and foreign experience in modeling the territorial development of urban and rural settlements and settlement structures was generalized. The motivation for using the Pentagon-model for solving similar problems was demonstrated. Based on a systematic analysis of existing development models of urban and rural settlements, as well as the authors' method for assessing the level of agro-town development, the systems/factors that are necessary for the sustainable development of a rural settlement are identified. Results: we created a rural development model which consists of five major systems that include critical factors essential for achieving the sustainable development of a settlement system: an ecological system, an economic system, an administrative system, an anthropogenic (physical) system, and a social system (supra-structure). The methodological approaches for creating an evaluation model of rural settlement development were revealed; the basic motivating factors that provide interrelations of systems were determined; the critical factors for each subsystem were identified and substantiated. Such an approach is justified by the composition of tasks for territorial planning at the local and state administration levels. The feasibility of applying the basic Pentagon-model, which was successfully used for solving analogous problems of sustainable development, was shown. Conclusions: the resulting model can be used for identifying and substantiating the critical factors for rural sustainable development and also become the basis of

  7. Patient-specific induced pluripotent stem cells in neurological disease modeling: the importance of nonhuman primate models

    Directory of Open Access Journals (Sweden)

    Qiu Z

    2013-07-01

    Full Text Available Zhifang Qiu,1,2 Steven L Farnsworth,2 Anuja Mishra,1,2 Peter J Hornsby1,2; 1Geriatric Research Education and Clinical Center, South Texas Veterans Health Care System, San Antonio, TX, USA; 2Barshop Institute for Longevity and Aging Studies, University of Texas Health Science Center, San Antonio, TX, USA. Abstract: The development of the technology for derivation of induced pluripotent stem (iPS) cells from human patients and animal models has opened up new pathways to the better understanding of many human diseases, and has created new opportunities for therapeutic approaches. Here, we consider one important neurological disease, Parkinson's, the development of relevant neural cell lines for studying this disease, and the animal models that are available for testing the survival and function of the cells, following transplantation into the central nervous system. Rapid progress has been made recently in the application of protocols for neuroectoderm differentiation and neural patterning of pluripotent stem cells. These developments have resulted in the ability to produce large numbers of dopaminergic neurons with midbrain characteristics for further study. These cells have been shown to be functional in both rodent and nonhuman primate (NHP) models of Parkinson's disease. Patient-specific iPS cells and derived dopaminergic neurons have been developed, in particular from patients with genetic causes of Parkinson's disease. For complete modeling of the disease, it is proposed that the introduction of genetic changes into NHP iPS cells, followed by studying the phenotype of the genetic change in cells transplanted into the NHP as host animal, will yield new insights into disease processes not possible with rodent models alone. Keywords: Parkinson's disease, pluripotent cell differentiation, neural cell lines, dopaminergic neurons, cell transplantation, animal models

  8. Modelling of ductile and cleavage fracture by local approach

    International Nuclear Information System (INIS)

    Samal, M.K.; Dutta, B.K.; Kushwaha, H.S.

    2000-08-01

    This report describes the modelling of ductile and cleavage fracture processes by the local approach. It is now well known that conventional fracture mechanics methods based on single-parameter criteria are not adequate to model fracture processes, because the fracture resistance behaviour of a structure depends on the size and geometry of the flaw and on the type and rate of loading. It is therefore questionable to use fracture resistance curves determined from standard tests in the analysis of real-life components, where all of the above effects are present. What is needed is a method in which the parameters used in the analysis are true material properties, i.e. independent of geometry and size. One solution to this problem is the use of local approaches. These approaches have been extensively studied and applied to different materials (including SA33 Gr.6) in this report, with each method treated in a separate section. The report is divided into five sections. Section-I gives a brief review of the fundamentals of the fracture process. Section-II deals with modelling of ductile fracture by locally uncoupled models. In this section, the critical cavity growth parameters of the different models have been determined for the primary heat transport (PHT) piping material of the Indian pressurised heavy water reactor (PHWR), and a comparative study has been made among the models. The dependence of the critical parameters on the stress triaxiality factor has also been studied. It is observed that Rice and Tracey's model is the most suitable one, although its parameters are not fully independent of the triaxiality factor. For this reason, a modification to Rice and Tracey's model is suggested in Section-III. Section-IV deals with modelling of the ductile fracture process by locally coupled models. Section-V deals with modelling of the cleavage fracture process by Beremin's model, which is based on Weibull's statistics.
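The Rice and Tracey cavity growth law discussed above can be sketched numerically. This is an illustrative integration only: the 0.283 prefactor is the classical Rice–Tracey constant, but the critical growth ratio used here is a placeholder, not a value determined in the report.

```python
import math

def rice_tracey_growth(triaxiality, strain_increments, alpha=0.283):
    """Integrate the Rice-Tracey void growth law
        d(ln(R/R0)) = alpha * exp(1.5 * T) * d(eps_p),
    where T = sigma_m / sigma_eq is the stress triaxiality and
    eps_p is the equivalent plastic strain. Returns R/R0."""
    ln_growth = 0.0
    for deps in strain_increments:
        ln_growth += alpha * math.exp(1.5 * triaxiality) * deps
    return math.exp(ln_growth)

def has_failed(growth_ratio, critical_ratio=1.5):
    """Ductile failure is predicted when the cavity growth ratio R/R0
    reaches a critical value (hypothetical placeholder used here)."""
    return growth_ratio >= critical_ratio

# Higher triaxiality accelerates cavity growth, consistent with the
# report's observation that the critical parameters depend on the
# stress triaxiality factor.
low_t = rice_tracey_growth(1.0, [0.01] * 10)
high_t = rice_tracey_growth(2.0, [0.01] * 10)
```
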

  9. A time series modeling approach in risk appraisal of violent and sexual recidivism.

    Science.gov (United States)

    Bani-Yaghoub, Majid; Fedoroff, J Paul; Curry, Susan; Amundsen, David E

    2010-10-01

    For over half a century, various clinical and actuarial methods have been employed to assess the likelihood of violent recidivism. Yet there is a need for new methods that can improve the accuracy of recidivism predictions. This study proposes a new time series modeling approach that generates high levels of predictive accuracy over short and long periods of time. The proposed approach outperformed two widely used actuarial instruments (i.e., the Violence Risk Appraisal Guide and the Sex Offender Risk Appraisal Guide). Furthermore, analysis of temporal risk variations based on specific time series models can add valuable information to the risk assessment and management of violent offenders.

  10. Atomistic approach for modeling metal-semiconductor interfaces

    DEFF Research Database (Denmark)

    Stradi, Daniele; Martinez, Umberto; Blom, Anders

    2016-01-01

    We present a general framework for simulating interfaces using an atomistic approach based on density functional theory and non-equilibrium Green's functions. The method includes all the relevant ingredients, such as doping and an accurate value of the semiconductor band gap, required to model realistic metal-semiconductor interfaces, and allows for a direct comparison between theory and experiments via the I–V curve. In particular, it will be demonstrated how doping (and bias) modifies the Schottky barrier, and how finite size models (the slab approach) are unable to describe these interfaces.

  11. Systems and context modeling approach to requirements analysis

    Science.gov (United States)

    Ahuja, Amrit; Muralikrishna, G.; Patwari, Puneet; Subhrojyoti, C.; Swaminathan, N.; Vin, Harrick

    2014-08-01

    Ensuring completeness and correctness of the requirements for a complex system such as the SKA is challenging. Current system engineering practice includes developing a stakeholder needs definition, a concept of operations, and defining system requirements in terms of use cases and requirements statements. We present a method that enhances this current practice into a collection of system models with mutual consistency relationships. These include stakeholder goals, needs definition and system-of-interest models, together with a context model that participates in the consistency relationships among these models. We illustrate this approach by using it to analyze the SKA system requirements.

  12. An approach to multiscale modelling with graph grammars.

    Science.gov (United States)

    Ong, Yongzhi; Streit, Katarína; Henke, Michael; Kurth, Winfried

    2014-09-01

    Functional-structural plant models (FSPMs) simulate biological processes at different spatial scales. Methods exist for multiscale data representation and modification, but the advantages of using multiple scales in the dynamic aspects of FSPMs remain unclear. Results from multiscale models in various other areas of science that share fundamental modelling issues with FSPMs suggest that potential advantages do exist, and this study therefore aims to introduce an approach to multiscale modelling in FSPMs. A three-part graph data structure and grammar is revisited, and presented with a conceptual framework for multiscale modelling. The framework is used for identifying roles, categorizing and describing scale-to-scale interactions, thus allowing alternative approaches to model development as opposed to correlation-based modelling at a single scale. Reverse information flow (from macro- to micro-scale) is catered for in the framework. The methods are implemented within the programming language XL. Three example models are implemented using the proposed multiscale graph model and framework. The first illustrates the fundamental usage of the graph data structure and grammar, the second uses probabilistic modelling for organs at the fine scale in order to derive crown growth, and the third combines multiscale plant topology with ozone trends and metabolic network simulations in order to model juvenile beech stands under exposure to a toxic trace gas. The graph data structure supports data representation and grammar operations at multiple scales. The results demonstrate that multiscale modelling is a viable method in FSPM and an alternative to correlation-based modelling. Advantages and disadvantages of multiscale modelling are illustrated by comparisons with single-scale implementations, leading to motivations for further research in sensitivity analysis and run-time efficiency for these models.

  13. Mucin 1-specific immunotherapy in a mouse model of spontaneous breast cancer.

    Science.gov (United States)

    Mukherjee, Pinku; Madsen, Cathy S; Ginardi, Amelia R; Tinder, Teresa L; Jacobs, Fred; Parker, Joanne; Agrawal, Babita; Longenecker, B Michael; Gendler, Sandra J

    2003-01-01

    Human mucin 1 (MUC1) is an epithelial mucin glycoprotein that is overexpressed in 90% of all adenocarcinomas including breast, lung, pancreas, prostate, stomach, colon, and ovary. MUC1 is a target for immune intervention because, in patients with solid adenocarcinomas, low-level cellular and humoral immune responses to MUC1 have been observed, which are not sufficiently strong to eradicate the growing tumor. The hypothesis for this study is that enhancing MUC1-specific immunity will result in antitumor immunity. To test this, the authors have developed a clinically relevant breast cancer model that demonstrates peripheral and central tolerance to MUC1 and develops spontaneous tumors of the mammary gland. In these mice, the authors tested a vaccine formulation composed of liposomal MUC1 lipopeptide and human recombinant interleukin-2. Results indicate that when compared with untreated mice, immunized mice develop T cells that express intracellular IFN-gamma, are reactive with MHC class I H-2Db/MUC1 tetramer, and are cytotoxic against MUC1-expressing tumor cells in vitro. The presence of MUC1-specific CTL did not translate into a clinical response as measured by time of tumor onset, tumor burden, and survival. The authors demonstrate that the immune-evasion mechanisms used by the tumor cells include downregulation of MHC class I molecules, expression of TGF-beta2, and a decrease in IFN-gamma-expressing effector T cells as tumors progress. Finally, utilizing an injectable breast cancer model, the authors show that targeting a single tumor antigen may not be an effective antitumor treatment, but that immunization with dendritic cells fed with whole tumor lysate is effective in breaking tolerance and protecting mice from subsequent tumor challenge. A physiologically relevant spontaneous breast cancer model has been developed to test improved immunotherapeutic approaches.

  14. 75 FR 61820 - Model Specifications for Breath Alcohol Ignition Interlock Devices (BAIIDs)

    Science.gov (United States)

    2010-10-06

    ... technology to alcohol-specific sensors (such as fuel cell technology based on electro-chemical oxidation of alcohol) or other emerging sensor technologies? Or, should NHTSA not specify the sensor technology and... require alcohol-specific technology in the Model Specifications, but that the particular sensor design...

  15. A robust quantitative near infrared modeling approach for blend monitoring.

    Science.gov (United States)

    Mohan, Shikhar; Momose, Wataru; Katz, Jeffrey M; Hossain, Md Nayeem; Velez, Natasha; Drennen, James K; Anderson, Carl A

    2018-01-30

    This study demonstrates a material-sparing Near-Infrared modeling approach for powder blend monitoring. In this new approach, gram-scale powder mixtures are subjected to compression loads to simulate the effect of scale, using an Instron universal testing system. Models prepared by the new method development approach (small-scale method) and by a traditional method development (blender-scale method) were compared by simultaneously monitoring a 1 kg batch-size blend run. Both models demonstrated similar performance. The small-scale method strategy significantly reduces the total resources expended to develop Near-Infrared calibration models for on-line blend monitoring. Further, this development approach does not require the actual equipment (i.e., blender) to which the method will be applied, only a similar optical interface. Thus, a robust on-line blend monitoring method can be fully developed before any large-scale blending experiment is viable, allowing the blend method to be used during scale-up and blend development trials. Copyright © 2017. Published by Elsevier B.V.

  16. Deep Appearance Models: A Deep Boltzmann Machine Approach for Face Modeling

    OpenAIRE

    Duong, Chi Nhan; Luu, Khoa; Quach, Kha Gia; Bui, Tien D.

    2016-01-01

    The "interpretation through synthesis" approach to analyzing face images, particularly the Active Appearance Models (AAMs) method, has become one of the most successful face modeling approaches over the last two decades. AAM models have the ability to represent face images through synthesis using a controllable parameterized Principal Component Analysis (PCA) model. However, the accuracy and robustness of the synthesized faces of AAM are highly dependent on the training sets and inherently on the genera...
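The "interpretation through synthesis" idea behind AAMs rests on a PCA model: an image is projected to a small parameter vector and then synthesized back from it. The sketch below illustrates only that principle on synthetic stand-in vectors (the data, dimensions, and component count are arbitrary assumptions, not from the paper).

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)

# Stand-in "face" vectors: 200 samples lying near a 5-D linear subspace,
# mimicking the low-dimensional appearance variation PCA captures.
latent = rng.standard_normal((200, 5))
basis = rng.standard_normal((5, 64))
images = latent @ basis + 0.01 * rng.standard_normal((200, 64))

pca = PCA(n_components=5)
pca.fit(images)

# Interpretation through synthesis: project an image to its appearance
# parameters, then reconstruct (synthesize) it from those parameters.
params = pca.transform(images[:1])
reconstruction = pca.inverse_transform(params)
rel_error = float(np.linalg.norm(reconstruction - images[0])
                  / np.linalg.norm(images[0]))
```

The Deep Boltzmann Machine approach of the paper replaces this linear PCA generator with a learned nonlinear one, precisely because a linear basis limits how well unseen faces can be synthesized.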

  17. Technical note: Comparison of methane ebullition modelling approaches used in terrestrial wetland models

    Science.gov (United States)

    Peltola, Olli; Raivonen, Maarit; Li, Xuefei; Vesala, Timo

    2018-02-01

    Emission via bubbling, i.e. ebullition, is one of the main methane (CH4) emission pathways from wetlands to the atmosphere. Direct measurement of gas bubble formation, growth and release in the peat-water matrix is challenging; in consequence these processes are relatively unknown and are coarsely represented in current wetland CH4 emission models. In this study we aimed to evaluate three ebullition modelling approaches and their effect on model performance. This was achieved by implementing the three approaches in one process-based CH4 emission model. All the approaches were based on some kind of threshold: either on CH4 pore water concentration (ECT), pressure (EPT) or free-phase gas volume (EBG). The model was run using 4 years of data from a boreal sedge fen and the results were compared with eddy covariance measurements of CH4 fluxes. Modelled annual CH4 emissions were largely unaffected by the different ebullition modelling approaches; however, temporal variability in CH4 emissions varied by an order of magnitude between the approaches. Hence the ebullition modelling approach drives the temporal variability in modelled CH4 emissions and therefore significantly impacts, for instance, high-frequency (daily scale) model comparison and calibration against measurements. The modelling approach based on the most recent knowledge of the ebullition process (volume threshold, EBG) agreed best with the measured fluxes (R2 = 0.63) and hence produced the most reasonable results, although there was a scale mismatch between the measurements (ecosystem scale with heterogeneous ebullition locations) and the model results (a single horizontally homogeneous peat column). This approach should be favoured over the two other more widely used ebullition modelling approaches, and researchers are encouraged to implement it in their CH4 emission models.
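The three threshold-based triggers compared in the study (pore-water concentration, pressure, and free-phase gas volume) can be sketched as simple predicates. The threshold values below are placeholders chosen for illustration, not the calibrated values used in the paper.

```python
def ebullition_triggered(ch4_concentration, pressure, gas_volume_fraction,
                         approach="EBG",
                         conc_threshold=500e-6,      # mol m^-3, placeholder
                         pressure_threshold=1.15e5,  # Pa, placeholder
                         volume_threshold=0.10):     # pore-space fraction, placeholder
    """Return True if the chosen ebullition scheme releases a bubble flux.

    ECT: CH4 pore-water concentration exceeds a threshold.
    EPT: total (hydrostatic + gas) pressure exceeds a threshold.
    EBG: free-phase gas volume exceeds a threshold fraction of pore space.
    """
    if approach == "ECT":
        return ch4_concentration > conc_threshold
    if approach == "EPT":
        return pressure > pressure_threshold
    if approach == "EBG":
        return gas_volume_fraction > volume_threshold
    raise ValueError(f"unknown approach: {approach}")
```

In a full wetland model each predicate would be evaluated per soil layer and time step, with the excess CH4 above the threshold routed to the atmosphere as an ebullition flux; the paper's finding is that the choice of predicate mainly changes the timing, not the annual total, of the modelled emissions.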

  18. A model for firm-specific strategic wisdom : including illustrations and 49 guiding questions

    NARCIS (Netherlands)

    van Straten, Roeland Peter

    2017-01-01

    This PhD thesis provides an answer to the question ‘How may one think strategically?’. It does so by presenting a new prescriptive ‘Model for Firm-Specific Strategic Wisdom’. This Model aims to guide any individual strategist in his or her thinking from a state of firm-specific ‘ignorance’ to a state

  19. Bianchi VI0 and III models: self-similar approach

    International Nuclear Information System (INIS)

    Belinchon, Jose Antonio

    2009-01-01

    We study several cosmological models with Bianchi VI0 and III symmetries under the self-similar approach. We find new solutions for the 'classical' perfect fluid model as well as for the vacuum model, although they are really restrictive for the equation of state. We also study a perfect fluid model with time-varying constants, G and Λ. As in other studied models, we find that the behaviours of G and Λ are related: if G behaves as a growing function of time then Λ is a positive decreasing function of time, but if G is decreasing then Λ is negative. We end by studying a massive cosmic string model, putting special emphasis on calculating the numerical values of the equations of state. We show that there is no self-similar solution for a string model with time-varying constants.

  20. Environmental Radiation Effects on Mammals A Dynamical Modeling Approach

    CERN Document Server

    Smirnova, Olga A

    2010-01-01

    This text is devoted to the theoretical studies of radiation effects on mammals. It uses the framework of developed deterministic mathematical models to investigate the effects of both acute and chronic irradiation in a wide range of doses and dose rates on vital body systems including hematopoiesis, small intestine and humoral immunity, as well as on the development of autoimmune diseases. Thus, these models can contribute to the development of the system and quantitative approaches in radiation biology and ecology. This text is also of practical use. Its modeling studies of the dynamics of granulocytopoiesis and thrombocytopoiesis in humans testify to the efficiency of employment of the developed models in the investigation and prediction of radiation effects on these hematopoietic lines. These models, as well as the properly identified models of other vital body systems, could provide a better understanding of the radiation risks to health. The modeling predictions will enable the implementation of more ef...