WorldWideScience

Sample records for modeling approach specifically

  1. An approach for activity-based DEVS model specification

    DEFF Research Database (Denmark)

    Alshareef, Abdurrahman; Sarjoughian, Hessam S.; Zarrin, Bahram

    2016-01-01

    activity-based behavior modeling of parallel DEVS atomic models. We consider UML activities and actions as fundamental units of behavior modeling, especially in the presence of recent advances in the UML 2.5 specifications. We describe in detail how to approach activity modeling with a set of elemental...

  2. Integrating UML, the Q-model and a Multi-Agent Approach in Process Specifications and Behavioural Models of Organisations

    Directory of Open Access Journals (Sweden)

    Raul Savimaa

    2005-08-01

    Full Text Available Efficient estimation and representation of an organisation's behaviour requires specification of business processes and modelling of actors' behaviour. Therefore the existing classical approaches that concentrate only on planned processes are not suitable and an approach that integrates process specifications with behavioural models of actors should be used instead. The present research indicates that a suitable approach should be based on interactive computing. This paper examines the integration of UML diagrams for process specifications, the Q-model specifications for modelling timing criteria of existing and planned processes and a multi-agent approach for simulating non-deterministic behaviour of human actors in an organisation. The corresponding original methodology is introduced and some of its applications as case studies are reviewed.

  3. A Domain-specific Modeling Approach to the Development of Online Peer Assessment

    NARCIS (Netherlands)

    Miao, Yongwu; Koper, Rob

    2007-01-01

    Miao, Y., & Koper, R. (2007). A Domain-specific Modeling Approach to the Development of Online Peer Assessment. In T. Navarette, J. Blat & R. Koper (Eds.). Proceedings of the 3rd TENCompetence Open Workshop 'Current Research on IMS Learning Design and Lifelong Competence Development Infrastructures'

  4. A modeling approach to compare ΣPCB concentrations between congener-specific analyses

    Science.gov (United States)

    Gibson, Polly P.; Mills, Marc A.; Kraus, Johanna M.; Walters, David M.

    2017-01-01

    Changes in analytical methods over time pose problems for assessing long-term trends in environmental contamination by polychlorinated biphenyls (PCBs). Congener-specific analyses vary widely in the number and identity of the 209 distinct PCB chemical configurations (congeners) that are quantified, leading to inconsistencies among summed PCB concentrations (ΣPCB) reported by different studies. Here we present a modeling approach using linear regression to compare ΣPCB concentrations derived from different congener-specific analyses measuring different co-eluting groups. The approach can be used to develop a specific conversion model between any two sets of congener-specific analytical data from similar samples (similar matrix and geographic origin). We demonstrate the method by developing a conversion model for an example data set that includes data from two different analytical methods, a low resolution method quantifying 119 congeners and a high resolution method quantifying all 209 congeners. We used the model to show that the 119-congener set captured most (93%) of the total PCB concentration (i.e., Σ209PCB) in sediment and biological samples. ΣPCB concentrations estimated using the model closely matched measured values (mean relative percent difference = 9.6). General applications of the modeling approach include (a) generating comparable ΣPCB concentrations for samples that were analyzed for different congener sets; and (b) estimating the proportional contribution of different congener sets to ΣPCB. This approach may be especially valuable for enabling comparison of long-term remediation monitoring results even as analytical methods change over time. 
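A rough sketch of the kind of conversion model described here is an ordinary least-squares regression of the full-congener sum on the partial-congener sum over paired samples. Everything below is illustrative: the concentrations are invented numbers, not data from the study.

```python
import numpy as np

# Hypothetical paired measurements of the same samples by both analytical methods
sigma_119 = np.array([10.2, 25.1, 40.3, 55.8, 80.0])  # ng/g, 119-congener method
sigma_209 = np.array([11.0, 27.0, 43.5, 60.1, 86.2])  # ng/g, 209-congener method

# Ordinary least squares fit: sigma_209 ~ intercept + slope * sigma_119
slope, intercept = np.polyfit(sigma_119, sigma_209, 1)

def convert(x):
    """Estimate a full 209-congener sum from a 119-congener sum."""
    return intercept + slope * x

# Proportion of the total PCB concentration captured by the smaller congener set
fraction_captured = sigma_119.sum() / sigma_209.sum()
```

In the paper's example data the 119-congener set captured about 93% of Σ209PCB; the toy numbers above merely reproduce the workflow (fit once on paired samples, then convert any new partial sum), not the study's values.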

  5. Structural modeling of age specific fertility curves in Peninsular Malaysia: An approach of Lee Carter method

    Science.gov (United States)

    Hanafiah, Hazlenah; Jemain, Abdul Aziz

    2013-11-01

    In recent years, the study of fertility has attracted considerable research attention worldwide, following fears that rapid economic development is driving a deterioration of fertility. Hence, this study examines the feasibility of developing fertility forecasts based on age structure. The Lee-Carter model (1992) is applied in this study as it is an established and widely used model for analysing demographic aspects. A singular value decomposition approach is incorporated with an ARIMA model to estimate age-specific fertility rates in Peninsular Malaysia over the period 1958-2007. Residual plots are used to measure the goodness of fit of the model. A fertility index forecast using a random walk with drift is then utilised to predict future age-specific fertility. Results indicate that the proposed model provides a relatively good and reasonable data fit. In addition, there is an apparent and continuous decline in age-specific fertility curves over the next 10 years, particularly among mothers in their early 20s and 40s. The study of fertility is vital in order to maintain a balance between population growth and the provision of facilities and related resources.
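A minimal numerical sketch of the Lee-Carter pipeline (average age pattern, SVD for the age and period components, random walk with drift for the forecast) might look like this. The fertility surface is synthetic and the normalisation conventions are common defaults, not the paper's calibration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic log age-specific fertility rates: (ages x years), a smooth age
# pattern plus a declining time trend plus small noise (illustrative only)
ages, years = 7, 30
log_f = (np.linspace(-3.5, -2.0, ages)[:, None]
         + np.linspace(0.3, -0.3, years)[None, :]
         + 0.005 * rng.standard_normal((ages, years)))

# Lee-Carter decomposition: log f_{x,t} = a_x + b_x * k_t
a_x = log_f.mean(axis=1)                      # average age pattern
U, s, Vt = np.linalg.svd(log_f - a_x[:, None], full_matrices=False)
b_x = U[:, 0] / U[:, 0].sum()                 # age response, normalised to sum to 1
k_t = s[0] * Vt[0] * U[:, 0].sum()            # fertility index over time

# Forecast the index 10 years ahead with a random walk with drift
drift = (k_t[-1] - k_t[0]) / (years - 1)
k_future = k_t[-1] + drift * np.arange(1, 11)
forecast_log_f = a_x[:, None] + np.outer(b_x, k_future)
```

The rank-1 SVD term captures essentially all of the systematic time variation here, which is the core assumption of the Lee-Carter family of models.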

  6. Lamina specific loss of inhibition may lead to distinct neuropathic manifestations: a computational modeling approach

    Directory of Open Access Journals (Sweden)

    Erick Javier Argüello Prada

    Full Text Available Introduction: It has been reported that inhibitory control at the superficial dorsal horn (SDH) can act in a regionally distinct manner, which suggests that regionally specific subpopulations of SDH inhibitory neurons may prevent one specific neuropathic condition. Methods: In an attempt to address this issue, we provide an alternative approach by integrating neuroanatomical information provided by different studies to construct a network model of the SDH. We use Neuroids to simulate each neuron included in that model by adapting available experimental evidence. Results: Simulations suggest that maintenance of the proper level of pain sensitivity may be attributed to lamina II inhibitory neurons and, therefore, hyperalgesia may be elicited by suppression of the inhibitory tone at that lamina. In contrast, lamina III inhibitory neurons are more likely to be responsible for keeping the nociceptive pathway separate from the mechanoreceptive pathway, so loss of inhibitory control in that region may result in allodynia. The SDH network model is also able to replicate non-linearities associated with pain processing, such as Aβ-fiber-mediated analgesia and the frequency-dependent increase of the neural response. Discussion: By incorporating biophysical accuracy and newer experimental evidence, the SDH network model may become a valuable tool for assessing the contribution of specific SDH connectivity patterns to noxious transmission in both physiological and pathological conditions.

  7. Computational modeling of bone density profiles in response to gait: a subject-specific approach.

    Science.gov (United States)

    Pang, Henry; Shiwalkar, Abhishek P; Madormo, Chris M; Taylor, Rebecca E; Andriacchi, Thomas P; Kuhl, Ellen

    2012-03-01

    The goal of this study is to explore the potential of computational growth models to predict bone density profiles in the proximal tibia in response to gait-induced loading. From a modeling point of view, we design a finite element-based computational algorithm using the theory of open system thermodynamics. In this algorithm, the biological problem, the balance of mass, is solved locally on the integration point level, while the mechanical problem, the balance of linear momentum, is solved globally on the node point level. Specifically, the local bone mineral density is treated as an internal variable, which is allowed to change in response to mechanical loading. From an experimental point of view, we perform a subject-specific gait analysis to identify the relevant forces during walking using an inverse dynamics approach. These forces are directly applied as loads in the finite element simulation. To validate the model, we take a Dual-Energy X-ray Absorptiometry scan of the subject's right knee from which we create a geometric model of the proximal tibia. For qualitative validation, we compare the computationally predicted density profiles to the bone mineral density extracted from this scan. For quantitative validation, we adopt the region of interest method and determine the density values at fourteen discrete locations using standard and custom-designed image analysis tools. Qualitatively, our two- and three-dimensional density predictions are in excellent agreement with the experimental measurements. Quantitatively, errors are less than 3% for the two-dimensional analysis and less than 10% for the three-dimensional analysis. The proposed approach has the potential to ultimately improve the long-term success of possible treatment options for chronic diseases such as osteoarthritis on a patient-specific basis by accurately addressing the complex interactions between ambulatory loads and tissue changes.
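The core of the algorithm described above, treating local bone mineral density as an internal variable updated at each integration point, can be sketched with a generic energy-driven evolution law. The functional form and all constants below are placeholder assumptions for illustration, not the constitutive law calibrated in the study.

```python
def update_density(rho, psi, dt, c=1.0, rho0=1.0, m=2.0, psi0=0.01):
    """One explicit Euler step of a density evolution law of the form
    d(rho)/dt = c * ((rho/rho0)**(-m) * psi - psi0),
    where psi is the local strain-energy density driven by the gait loads
    and psi0 is the biological equilibrium (setpoint) value."""
    drho = c * ((rho / rho0) ** (-m) * psi - psi0)
    return max(rho + dt * drho, 1e-6)  # density must stay positive

# An overloaded integration point densifies, an underloaded one resorbs
denser = update_density(rho=1.0, psi=0.02, dt=0.1)
sparser = update_density(rho=1.0, psi=0.005, dt=0.1)
```

In a finite element setting this update would run locally at every integration point each load step, while the global momentum balance is solved at the nodes, mirroring the local/global split the abstract describes.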

  8. Recognition of Emotions in Mexican Spanish Speech: An Approach Based on Acoustic Modelling of Emotion-Specific Vowels

    OpenAIRE

    Santiago-Omar Caballero-Morales

    2013-01-01

    An approach for the recognition of emotions in speech is presented. The target language is Mexican Spanish, and for this purpose a speech database was created. The approach consists of the phoneme acoustic modelling of emotion-specific vowels. For this, a standard phoneme-based Automatic Speech Recognition (ASR) system was built with Hidden Markov Models (HMMs), where different phoneme HMMs were built for the consonants and emotion-specific vowels associated with four emotional states (anger,...

  9. A competing risk approach for the European Heart SCORE model based on cause-specific and all-cause mortality

    DEFF Research Database (Denmark)

    Støvring, Henrik; Harmsen, Charlotte G; Wisløff, Torbjørn;

    2013-01-01

    pressure, and total cholesterol level. The SCORE model, however, is not mathematically consistent and does not estimate all-cause mortality. Our aim is to modify the SCORE model to allow consistent estimation of both CVD-specific and all-cause mortality. Methods: Using a competing risk approach, we first...

  10. Patient-specific in vitro models for hemodynamic analysis of congenital heart disease - Additive manufacturing approach.

    Science.gov (United States)

    Medero, Rafael; García-Rodríguez, Sylvana; François, Christopher J; Roldán-Alzate, Alejandro

    2017-03-21

    Non-invasive hemodynamic assessment of total cavopulmonary connection (TCPC) is challenging due to the complex anatomy. Additive manufacturing (AM) is a suitable alternative for creating patient-specific in vitro models for flow measurements using four-dimensional (4D) Flow MRI. These in vitro systems have the potential to serve as validation for computational fluid dynamics (CFD), simulating different physiological conditions. This study investigated three different AM technologies, stereolithography (SLA), selective laser sintering (SLS) and fused deposition modeling (FDM), to determine differences in hemodynamics when measuring flow using 4D Flow MRI. The models were created using patient-specific MRI data from an extracardiac TCPC. These models were connected to a perfusion pump circulating water at three different flow rates. Data was processed for visualization and quantification of velocity, flow distribution, vorticity and kinetic energy. These results were compared between each model. In addition, the flow distribution obtained in vitro was compared to in vivo. The results showed significant difference in velocities measured at the outlets of the models that required internal support material when printing. Furthermore, an ultrasound flow sensor was used to validate flow measurements at the inlets and outlets of the in vitro models. These results were highly correlated to those measured with 4D Flow MRI. This study showed that commercially available AM technologies can be used to create patient-specific vascular models for in vitro hemodynamic studies at reasonable costs. However, technologies that do not require internal supports during manufacturing allow smoother internal surfaces, which makes them better suited for flow analyses. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Recognition of Emotions in Mexican Spanish Speech: An Approach Based on Acoustic Modelling of Emotion-Specific Vowels

    Directory of Open Access Journals (Sweden)

    Santiago-Omar Caballero-Morales

    2013-01-01

    Full Text Available An approach for the recognition of emotions in speech is presented. The target language is Mexican Spanish, and for this purpose a speech database was created. The approach consists of the phoneme acoustic modelling of emotion-specific vowels. For this, a standard phoneme-based Automatic Speech Recognition (ASR) system was built with Hidden Markov Models (HMMs), where different phoneme HMMs were built for the consonants and emotion-specific vowels associated with four emotional states (anger, happiness, neutral, sadness). Then, estimation of the emotional state from a spoken sentence is performed by counting the number of emotion-specific vowels found in the ASR's output for the sentence. With this approach, accuracy of 87–100% was achieved for the recognition of the emotional state of Mexican Spanish speech.

  12. Recognition of emotions in Mexican Spanish speech: an approach based on acoustic modelling of emotion-specific vowels.

    Science.gov (United States)

    Caballero-Morales, Santiago-Omar

    2013-01-01

    An approach for the recognition of emotions in speech is presented. The target language is Mexican Spanish, and for this purpose a speech database was created. The approach consists of the phoneme acoustic modelling of emotion-specific vowels. For this, a standard phoneme-based Automatic Speech Recognition (ASR) system was built with Hidden Markov Models (HMMs), where different phoneme HMMs were built for the consonants and emotion-specific vowels associated with four emotional states (anger, happiness, neutral, sadness). Then, estimation of the emotional state from a spoken sentence is performed by counting the number of emotion-specific vowels found in the ASR's output for the sentence. With this approach, accuracy of 87-100% was achieved for the recognition of the emotional state of Mexican Spanish speech.
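The final decision step, counting emotion-specific vowels in the recognizer output, can be sketched as below. The phoneme labels and the underscore tagging scheme are invented for illustration; they are not the paper's actual lexicon.

```python
from collections import Counter

# Hypothetical ASR output: consonants are plain labels, vowels carry an
# emotion tag (e.g. "a_anger" is the vowel /a/ under the anger acoustic model)
asr_output = ["k", "a_anger", "s", "e_anger", "t", "o_neutral", "a_anger"]

EMOTIONS = ("anger", "happiness", "neutral", "sadness")

def estimate_emotion(phonemes):
    """Return the emotion whose tagged vowels occur most often in the output."""
    counts = Counter(p.split("_", 1)[1] for p in phonemes
                     if "_" in p and p.split("_", 1)[1] in EMOTIONS)
    return counts.most_common(1)[0][0] if counts else "neutral"
```

With the toy output above, the anger-tagged vowels outnumber the others, so the sentence would be classified as angry; a sentence with no tagged vowels falls back to neutral in this sketch.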

  13. Nilpotent Symmetries of a Specific N = 2 Supersymmetric Quantum Mechanical Model: A Novel Approach

    CERN Document Server

    Krishna, S; Malik, R P

    2013-01-01

    We derive the on-shell nilpotent supersymmetric (SUSY) transformations for the N = 2 SUSY quantum mechanical model of a one (0 + 1)-dimensional free particle by exploiting the SUSY invariant restrictions on the (anti-)chiral supervariables of the SUSY theory that is defined on a (1, 2)-dimensional supermanifold (parametrized by a bosonic variable t and a pair of Grassmannian variables θ and θ̄ with θ² = θ̄² = 0, θθ̄ + θ̄θ = 0). Within the framework of our novel approach, we express the Lagrangian and conserved SUSY charges in terms of the (anti-)chiral supervariables to demonstrate the SUSY invariance of the Lagrangian and the nilpotency of the conserved charges in a simple manner. Our approach has the potential to be generalized to the description of other N = 2 SUSY quantum mechanical systems with physically interesting potential functions.

  14. A data-driven modeling approach to identify disease-specific multi-organ networks driving physiological dysregulation.

    Directory of Open Access Journals (Sweden)

    Warren D Anderson

    2017-07-01

    Full Text Available Multiple physiological systems interact throughout the development of a complex disease. Knowledge of the dynamics and connectivity of interactions across physiological systems could facilitate the prevention or mitigation of organ damage underlying complex diseases, many of which are currently refractory to available therapeutics (e.g., hypertension). We studied the regulatory interactions operating within and across organs throughout disease development by integrating in vivo analysis of gene expression dynamics with a reverse engineering approach to infer data-driven dynamic network models of multi-organ gene regulatory influences. We obtained experimental data on the expression of 22 genes across five organs, over a time span that encompassed the development of autonomic nervous system dysfunction and hypertension. We pursued a unique approach for identification of continuous-time models that jointly described the dynamics and structure of multi-organ networks by estimating a sparse subset of ∼12,000 possible gene regulatory interactions. Our analyses revealed that an autonomic dysfunction-specific multi-organ sequence of gene expression activation patterns was associated with a distinct gene regulatory network. We analyzed the model structures for adaptation motifs, and identified disease-specific network motifs involving genes that exhibited aberrant temporal dynamics. Bioinformatic analyses identified disease-specific single nucleotide variants within or near transcription factor binding sites upstream of key genes implicated in maintaining physiological homeostasis. Our approach illustrates a novel framework for investigating pathogenesis through model-based analysis of multi-organ system dynamics and network properties. Our results yielded novel candidate molecular targets driving the development of cardiovascular disease, metabolic syndrome, and immune dysfunction.

  15. The tripartite model of fear in children with specific phobias: assessing concordance and discordance using the behavioral approach test.

    Science.gov (United States)

    Ollendick, Thomas; Allen, Ben; Benoit, Kristy; Cowart, Maria

    2011-08-01

    Lang's tripartite model posits that three main components characterize a fear response: physiological arousal, cognitive (subjective) distress, and behavioral avoidance. These components may occur in tandem with one another (concordance) or they may vary independently (discordance). The behavioral approach test (BAT) has been used to simultaneously examine the three components of the fear response. In the present study, 73 clinic-referred children and adolescents with a specific phobia participated in a phobia-specific BAT. Results revealed an overall pattern of concordance: correlation analyses revealed the three indices were significantly related to one another in the predicted directions. However, considerable variation was noted such that some children were concordant across the response components while others were not. More specifically, based on levels of physiological arousal and subjective distress, two concordant groups (high arousal-high distress, low arousal-low distress) and one discordant (high arousal-low distress or low arousal-high distress) group of youth were identified. These concordant and discordant groups were then compared on the percentage of behavioral steps completed on the BAT. Analyses revealed that the low arousal-low distress group completed a significantly greater percentage of steps than the high arousal-high distress group, and a marginally greater percentage of steps than the discordant group. Potential group differences associated with age, gender, phobia severity, and phobia type were also explored and no significant differences were detected. Implications for theory and treatment are discussed. Copyright © 2011 Elsevier Ltd. All rights reserved.

  16. An Asset Pricing Approach to Testing General Term Structure Models including Heath-Jarrow-Morton Specifications and Affine Subclasses

    DEFF Research Database (Denmark)

    Christensen, Bent Jesper; van der Wel, Michel

    … is tested, but in addition to the standard bilinear term in factor loadings and market prices of risk, the relevant mean restriction in the term structure case involves an additional nonlinear (quadratic) term in factor loadings. We estimate our general model using likelihood-based dynamic factor model techniques for a variety of volatility factors, and implement the relevant likelihood ratio tests. Our factor model estimates are similar across a general state space implementation and an alternative robust two-step principal components approach. The evidence favors time-varying market prices of risk. Most of the risk premium is associated with the slope factor, and individual risk prices depend on own past values, factor realizations, and past values of other risk prices, and are significantly related to the output gap, consumption, and the equity risk price. The absence of arbitrage opportunities is strongly …

  17. A multiscale modelling approach to understand atherosclerosis formation: A patient-specific case study in the aortic bifurcation.

    Science.gov (United States)

    Alimohammadi, Mona; Pichardo-Almarza, Cesar; Agu, Obiekezie; Díaz-Zuccarini, Vanessa

    2017-05-01

    Atherogenesis, the formation of plaques in the wall of blood vessels, starts as a result of lipid accumulation (low-density lipoprotein cholesterol) in the vessel wall. Such accumulation is related to the site of endothelial mechanotransduction, the endothelial response to mechanical stimuli and haemodynamics, which determines biochemical processes regulating vessel wall permeability. This interaction between biomechanical and biochemical phenomena is complex, spans different biological scales, and is patient-specific, requiring tools able to capture such mathematical and biological complexity in a unified framework. Mathematical models offer an elegant and efficient way of doing this, by taking into account multifactorial and multiscale processes and mechanisms, in order to capture the fundamentals of plaque formation in individual patients. In this study, a mathematical model to understand plaque and calcification locations is presented: this model provides strong interpretability and physical meaning through a multiscale, complex index or metric (the penetration site of low-density lipoprotein cholesterol, expressed as volumetric flux). Computed tomography scans of the aortic bifurcation and iliac arteries are analysed and compared with the results of the multifactorial model. The results indicate that the model shows potential to predict the majority of plaque locations, while also predicting no plaques in regions where they are absent. The promising results from this case study provide a proof of concept that can be applied to a larger patient population.

  18. Combining the GW formalism with the polarizable continuum model: A state-specific non-equilibrium approach.

    Science.gov (United States)

    Duchemin, Ivan; Jacquemin, Denis; Blase, Xavier

    2016-04-28

    We have implemented the polarizable continuum model within the framework of the many-body Green's function GW formalism for the calculation of electron addition and removal energies in solution. The present formalism includes both ground-state and non-equilibrium polarization effects. In addition, the polarization energies are state-specific, allowing one to obtain the bath-induced renormalisation energy of all occupied and virtual energy levels. Our implementation is validated by comparisons with ΔSCF calculations performed at both the density functional theory and coupled-cluster singles and doubles levels for solvated nucleobases. The present study opens the way to GW and Bethe-Salpeter calculations in disordered condensed phases of interest in organic optoelectronics, wet chemistry, and biology.

  19. Towards personalised management of atherosclerosis via computational models in vascular clinics: technology based on patient-specific simulation approach.

    Science.gov (United States)

    Díaz-Zuccarini, Vanessa; Di Tomaso, Giulia; Agu, Obiekezie; Pichardo-Almarza, Cesar

    2014-01-01

    The development of a new technology based on patient-specific modelling for personalised healthcare in the case of atherosclerosis is presented. Atherosclerosis is the main cause of death in the world and it has become a burden on clinical services as it manifests itself in many diverse forms, such as coronary artery disease, cerebrovascular disease/stroke and peripheral arterial disease. It is also a multifactorial, chronic and systemic process that lasts for a lifetime, putting enormous financial and clinical pressure on national health systems. In this Letter, the postulate is that the development of new technologies for healthcare using computer simulations can, in the future, be developed as in-silico management and support systems. These new technologies will be based on predictive models (including the integration of observations, theories and predictions across a range of temporal and spatial scales, scientific disciplines, key risk factors and anatomical sub-systems) combined with digital patient data and visualisation tools. Although the problem is extremely complex, a simulation workflow and an exemplar application of this type of technology for clinical use is presented, which is currently being developed by a multidisciplinary team following the requirements and constraints of the Vascular Service Unit at the University College Hospital, London.

  20. Simulation-based model checking approach to cell fate specification during Caenorhabditis elegans vulval development by hybrid functional Petri net with extension

    Directory of Open Access Journals (Sweden)

    Ueno Kazuko

    2009-04-01

    Full Text Available Abstract Background: Model checking approaches were applied to biological pathway validations around 2003. Recently, Fisher et al. demonstrated the importance of the model checking approach by inferring new regulation of signaling crosstalk in C. elegans and confirming the regulation with biological experiments. They took a discrete, state-based approach to explore all possible states of the system underlying vulval precursor cell (VPC) fate specification for desired properties. However, since both discrete and continuous features appear to be indispensable parts of biological processes, it is more appropriate to use quantitative models to capture the dynamics of biological systems. The key motivation of this paper is to establish a quantitative methodology to model and analyze in silico models incorporating the model checking approach. Results: A novel method of modeling and simulating biological systems with the model checking approach is proposed, based on hybrid functional Petri net with extension (HFPNe) as a framework dealing with both discrete and continuous events. First, we construct a quantitative VPC fate model with 1761 components using HFPNe. Second, we apply two major biological fate determination rules, Rule I and Rule II, to the VPC fate model. We then conduct 10,000 simulations for each of 48 sets of different genotypes, investigate variations of cell fate patterns under each genotype, and validate the two rules by comparing three simulation targets consisting of fate patterns obtained from in silico and in vivo experiments. In particular, an evaluation was successfully performed using our VPC fate model to investigate one target derived from biological experiments involving hybrid lineage observations. However, hybrid lineages are hard to interpret on a discrete model because a hybrid lineage occurs when the system comes close to certain thresholds, as discussed by Sternberg and Horvitz in

  1. Patient-Specific Computational Modeling

    CERN Document Server

    Peña, Estefanía

    2012-01-01

    This book addresses patient-specific modeling. It integrates computational modeling, experimental procedures, clinical image segmentation and mesh generation with the finite element method (FEM) to solve problems in computational biomedicine and bioengineering. Specific areas of interest include cardiovascular problems, ocular and muscular systems and soft tissue modeling. Patient-specific modeling has been the subject of serious research over the last seven years; interest in the area is continually growing, and it is expected to develop further in the near future.

  2. General and specific attention-deficit/hyperactivity disorder factors of children 4 to 6 years of age: An exploratory structural equation modeling approach to assessing symptom multidimensionality.

    Science.gov (United States)

    Arias, Víctor B; Ponce, Fernando P; Martínez-Molina, Agustín; Arias, Benito; Núñez, Daniel

    2016-01-01

    We tested first-order factor and bifactor models of attention-deficit/hyperactivity disorder (ADHD) using confirmatory factor analysis (CFA) and exploratory structural equation modeling (ESEM) to adequately summarize the Diagnostic and Statistical Manual of Mental Disorders, 4th Edition, (DSM-IV-TR) symptoms observed in a Spanish sample of preschoolers and kindergarteners. Six ESEM and CFA models were estimated based on teacher evaluations of the behavior of 638 children 4 to 6 years of age. An ESEM bifactor model with a central dimension plus 3 specific factors (inattention, hyperactivity, and impulsivity) showed the best fit and interpretability. Strict invariance between the sexes was observed. The bifactor model provided a solution to previously encountered inconsistencies in the factorial models of ADHD in young children. However, the low reliability of the specific factors casts doubt on the utility of the subscales for ADHD measurement. More research is necessary to clarify the nature of G and S factors of ADHD.

  3. A Specification Test of Stochastic Diffusion Models

    Institute of Scientific and Technical Information of China (English)

    Shu-lin ZHANG; Zheng-hong WEI; Qiu-xiang BI

    2013-01-01

    In this paper, we propose a hypothesis testing approach to checking model mis-specification in continuous-time stochastic diffusion models. The key idea behind the development of our test statistic is rooted in the generalized information equality in the context of martingale estimating equations. We propose a bootstrap resampling method to implement the proposed diagnostic procedure numerically. Through intensive simulation studies, we show that our approach performs well in terms of type I error control, power, and computational efficiency.
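The parametric-bootstrap idea can be illustrated on a toy diffusion check. A discretely observed Ornstein-Uhlenbeck process is a Gaussian AR(1), so one can fit that null model, choose a statistic that should be near zero under it, and recompute the statistic on paths simulated from the fit to obtain a p-value. The excess-kurtosis statistic below is a simple stand-in for illustration, not the authors' martingale-estimating-equation statistic.

```python
import numpy as np

rng = np.random.default_rng(42)

def fit_ar1(x):
    """A discretely observed OU diffusion is a Gaussian AR(1); fit by OLS."""
    phi = np.polyfit(x[:-1], x[1:], 1)[0]
    resid = x[1:] - phi * x[:-1]
    return phi, resid.std()

def kurtosis_stat(x):
    """Excess kurtosis of one-step increments; near zero under the Gaussian null."""
    d = np.diff(x)
    z = (d - d.mean()) / d.std()
    return np.mean(z**4) - 3.0

def bootstrap_pvalue(x, n_boot=200):
    """Parametric bootstrap: simulate from the fitted null, compare statistics."""
    phi, s = fit_ar1(x)
    t_obs = abs(kurtosis_stat(x))
    exceed = 0
    for _ in range(n_boot):
        xb = np.empty(len(x))
        xb[0] = x[0]
        for i in range(1, len(x)):
            xb[i] = phi * xb[i - 1] + s * rng.standard_normal()
        exceed += abs(kurtosis_stat(xb)) >= t_obs
    return exceed / n_boot

# Data generated from the null model itself should typically not be rejected
x = np.empty(500)
x[0] = 0.0
for i in range(1, 500):
    x[i] = 0.8 * x[i - 1] + 0.5 * rng.standard_normal()
p = bootstrap_pvalue(x)
```

The same skeleton works for any choice of test statistic: only `kurtosis_stat` would be swapped for the statistic derived from the generalized information equality.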

  4. Peptide Based Radiopharmaceuticals: Specific Construct Approach

    Energy Technology Data Exchange (ETDEWEB)

    Som, P; Rhodes, B A; Sharma, S S

    1997-10-21

    The objective of this project was to develop receptor-based peptides for diagnostic imaging and therapy. A series of peptides related to cell adhesion molecules (CAM) and immune regulation were designed for radiolabeling with 99mTc and evaluated in animal models as potential diagnostic imaging agents for various disease conditions such as thrombus (clot), acute kidney failure, and infection/inflammation imaging. The peptides for this project were designed by the industrial partner, Palatin Technologies (formerly Rhomed, Inc.), using various peptide design approaches, including a newly developed rational computer-assisted drug design (CADD) approach termed MIDAS (Metal ion Induced Distinctive Array of Structures). In this approach, the biological function domain and the 99mTc complexing domain are fused together so that structurally these domains are indistinguishable. This approach allows construction of conformationally rigid metallo-peptide molecules (similar to cyclic peptides) that are metabolically stable in vivo. All the newly designed peptides were screened in various in vitro receptor binding and functional assays to identify a lead compound. The lead compounds were formulated in one-step 99mTc labeling kits, which were studied by BNL for detailed in vivo imaging using various animal models of human disease. Two main peptides using the MIDAS approach evolved and were investigated: an RGD peptide for acute renal failure and an immunomodulatory peptide derived from tuftsin (RMT-1) for infection/inflammation imaging. Various RGD-based metallopeptides were designed, synthesized and assayed for their efficacy in inhibiting ADP-induced human platelet aggregation. Most of these peptides displayed biological activity in the 1-100 µM range. Based on previous work by others, RGD-I and RGD-II were evaluated in animal models of acute renal failure. These earlier studies showed that after acute ischemic injury the renal cortex displays

  5. Some Approaches for Integration of Specification Techniques

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth

    2000-01-01

    It is often useful to apply several specification techniques within the same software development project. This raises the question how specification techniques can be integrated. In this presentation we give three different examples of how this can be done. In the first example, we summarise how...

  6. An Approach to User Interface Specification with Attribute Grammars

    Institute of Scientific and Technical Information of China (English)

    华庆一

    1997-01-01

    An approach to supporting user interfaces using an attribute grammar combined with an event model is described. The main emphasis is on how to represent a multi-thread dialogue model in direct-manipulation user interfaces. It is shown that control sequences within dialogues, communication with other dialogues, and some computations for applications can be specified with a syntactic and semantic notation. The attribute grammar specification can be implemented using an attribute grammar interpreter embedded in the run-time structure supporting communication, synchronization, and dialogue execution.

  7. Postoperative pain: specific-procedure approach

    OpenAIRE

    Martínez-Vísbal Alfonso Luis; Rodríguez-Betancourt Nancy Tatiana

    2012-01-01

    Pain is an unpleasant sensory and emotional experience associated with real or potential tissue response, especially if the trigger factor is known, as it is in a surgical procedure. In spite of advances in analgesic techniques and medication, moderate to severe postoperative pain is present in 70% of operated patients. Multimodal analgesia has been proposed because it involves preventive analgesia and advanced analgesia. Equally, a specific analgesic management for each surgical proced...

  8. Inter-annual and inter-specific differences in the drift of fish eggs and yolksac larvae in the North Sea: A biophysical modeling approach

    Directory of Open Access Journals (Sweden)

    Myron A. Peck

    2009-10-01

    We employed 3-D biophysical modeling and dispersion kernel analysis to explore inter-annual and inter-specific differences in the drift trajectories of eggs and yolksac larvae of plaice (Pleuronectes platessa), Atlantic cod (Gadus morhua), sprat (Sprattus sprattus) and horse mackerel (Trachurus trachurus) in the North Sea. In this region, these four species exhibit peak spawning during the boreal winter, late winter/early spring, late spring/early summer, and mid-summer, respectively, but utilize the same spawning locations (our simulations included Dogger Bank, the Southern Bight and the German Bight). Inter-annual differences in temperature history, and the increase in the area of dispersion and final distribution at the end of the yolksac phase, were more pronounced (and related to the North Atlantic Oscillation) for winter and early-spring spawners than for late-spring/summer spawners. The progeny of the latter experienced the largest (up to 10-fold) inter-annual differences in drift distances, although their absolute drift distances were modest (~2 to 30 km) compared to those of the former (~20 to 130 km). Our results highlight the complex interplay between the specific life history strategies of the different species and the impacts of variability in (climate-driven) physical factors during the earliest life stages of marine fish.

  9. Modeling software behavior a craftsman's approach

    CERN Document Server

    Jorgensen, Paul C

    2009-01-01

    A common problem with most texts on requirements specifications is that they emphasize structural models to the near exclusion of behavioral models-focusing on what the software is, rather than what it does. If they do cover behavioral models, the coverage is brief and usually focused on a single model. Modeling Software Behavior: A Craftsman's Approach provides detailed treatment of various models of software behavior that support early analysis, comprehension, and model-based testing. Based on the popular and continually evolving course on requirements specification models taught by the auth

  10. Current Approaches to the Establishment of Credit Risk Specific Provisions

    Directory of Open Access Journals (Sweden)

    Ion Nitu

    2008-10-01

    The aim of the new Basel II and IFRS approaches is to make the operations of financial institutions more transparent and thus to create a better basis for market participants and supervisory authorities to acquire information and make decisions. In the banking sector, a continuous debate is being led about the similarities and differences between the IFRS approach to loan loss provisions and the Basel II approach to calculating capital requirements, compared with the classical method of loan provisioning currently used by Romanian banks under the Central Bank's regulations. Banks must take into consideration that the IFRS and Basel II objectives are fundamentally different. While IFRS aims to ensure that the financial statements adequately reflect the losses incurred at each balance sheet date, the Basel II objective is to ensure that the bank has enough provisions or capital to face expected losses in the next 12 months and any unexpected losses. Consequently, there are clear differences between the objectives of the two models. Basel II works on statistical modeling of expected losses, while IFRS, although allowing statistical models, requires a trigger event to have occurred before they can be used. IAS 39 specifically states that losses that are expected as a result of future events, no matter how likely, are not recognized. This is a clear and fundamental area of difference between the two frameworks.
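The Basel II expected loss discussed above is conventionally computed as EL = PD × LGD × EAD (probability of default × loss given default × exposure at default). A minimal sketch of that calculation follows; all figures and the portfolio composition are hypothetical illustrations, not data from the article:

```python
# Illustrative Basel II expected-loss calculation (hypothetical figures).
# EL = PD * LGD * EAD, where
#   PD  = probability of default over the next 12 months,
#   LGD = loss given default (fraction of exposure lost),
#   EAD = exposure at default.

def expected_loss(pd: float, lgd: float, ead: float) -> float:
    """Return the expected loss for a single exposure."""
    return pd * lgd * ead

# A hypothetical two-loan portfolio: (PD, LGD, EAD) per exposure.
portfolio = [
    (0.02, 0.45, 1_000_000),  # corporate loan
    (0.05, 0.60, 250_000),    # SME loan
]

portfolio_el = sum(expected_loss(pd, lgd, ead) for pd, lgd, ead in portfolio)
print(portfolio_el)  # 9000 + 7500 = 16500.0
```

Under IFRS (IAS 39), by contrast, such a forward-looking figure would not be recognized as a provision until a loss trigger event had occurred.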

  11. Foundational Issues in Statistical Modeling: Statistical Model Specification and Validation

    Directory of Open Access Journals (Sweden)

    Aris Spanos

    2011-01-01

    Statistical model specification and validation raise crucial foundational problems whose pertinent resolution holds the key to learning from data by securing the reliability of frequentist inference. The paper questions the judiciousness of several current practices, including the theory-driven approach and Akaike-type model selection procedures, arguing that they often lead to unreliable inferences. This is primarily because goodness-of-fit/prediction measures and other substantive and pragmatic criteria are of questionable value when the estimated model is statistically misspecified. Foisting one's favorite model on the data often yields estimated models which are both statistically and substantively misspecified, with no way to delineate between the two sources of error and apportion blame. The paper argues that the error-statistical approach can address this Duhemian ambiguity by distinguishing between statistical and substantive premises and viewing empirical modeling in a piecemeal way, with a view to delineating the various issues more effectively. It is also argued that Hendry's general-to-specific procedure does a much better job in model selection than the theory-driven and Akaike-type procedures, primarily because of its error-statistical underpinnings.

  12. Introducing a game approach towards IS requirements specification

    DEFF Research Database (Denmark)

    Jensen, Mika Yasuoka; Kadoya, Kyoichi; Niwa, Takashi

    2014-01-01

    Devising a system requirements specification is a challenging task. In this paper, we suggest a first step toward a requirements specification through a stakeholder involvement approach with game elements. We report preliminary findings from a practice case in which our methods are applied to the requirement specification phase of a project management system. The analysis showed that our game approach fostered innovative idea generation and captured implicit user expectations, and that a stakeholder involvement method with game elements can be effectively utilized as a first step towards requirement specification.

  13. Model Commissioning Plan and Guide Specifications

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    The objectives of the Model Commissioning Plan and Guide Specifications are to ensure that the design team applies commissioning concepts to the design and prepares commissioning specifications and a commissioning plan for inclusion in the bid construction documents.

  14. Visualisation of Domain-Specific Modelling Languages Using UML

    NARCIS (Netherlands)

    Graaf, B.; Van Deursen, A.

    2006-01-01

    Currently, general-purpose modelling tools are often used only to draw diagrams for documentation. The introduction of model-driven software development approaches involves the definition of domain-specific modelling languages that allow code generation. Although graphical representations of the

  15. Specific energy optimization in sawing of rocks using Taguchi approach

    Institute of Scientific and Technical Information of China (English)

    Izzet Karakurt

    2014-01-01

    This work aims at selecting optimal operating variables to obtain the minimum specific energy (SE) in sawing of rocks. A particular granite was sampled and sawn by a fully automated circular diamond sawblade. The peripheral speed, the traverse speed, the cut depth and the flow rate of cooling fluid were selected as the operating variables. The Taguchi approach was adopted as a statistical design-of-experiments technique for the optimization studies. The results were evaluated based on analysis of variance and the signal-to-noise ratio (S/N ratio). Statistically significant operating variables and their percentage contributions to the process were also determined. Additionally, a statistical model was developed using regression analysis to describe the relationship between SE and the operating variables, and the model was then verified. It was found that the optimal combination of operating variables for minimum SE is a peripheral speed of 25 m/s, a traverse speed of 70 cm/min, a cut depth of 2 cm and a cooling-fluid flow rate of 100 mL/s. The cut depth and the traverse speed were statistically determined to be the significant operating variables affecting SE. Furthermore, the regression results reveal that the predictive model is highly applicable in practice.
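The Taguchi evaluation described above ranks factor settings by their signal-to-noise ratio; for a smaller-the-better response such as specific energy, the standard form is S/N = -10·log10((1/n)·Σyᵢ²). A minimal sketch follows; the measurement values are hypothetical, not taken from the study:

```python
import math

# Smaller-the-better signal-to-noise ratio used in Taguchi analysis:
#   S/N = -10 * log10( (1/n) * sum(y_i**2) )
# where y_i are the measured responses (here, specific-energy values).

def sn_smaller_is_better(values):
    n = len(values)
    return -10.0 * math.log10(sum(y * y for y in values) / n)

# Hypothetical specific-energy measurements (kWh/m^3) for two factor settings:
setting_a = [1.20, 1.25, 1.18]
setting_b = [1.05, 1.10, 1.07]

# The setting with the larger S/N ratio is preferred: it combines a lower
# mean response with less scatter across replicates.
print(sn_smaller_is_better(setting_b) > sn_smaller_is_better(setting_a))  # True
```

In a full Taguchi study, such S/N ratios would be averaged per factor level across an orthogonal array to pick the optimal combination, as the abstract reports for peripheral speed, traverse speed, cut depth and coolant flow rate.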

  16. Model Oriented Approach for Industrial Software Development

    Directory of Open Access Journals (Sweden)

    P. D. Drobintsev

    2015-01-01

    The article considers the specifics of a model-oriented approach to software development based on the usage of Model Driven Architecture (MDA), Model Driven Software Development (MDSD) and Model Driven Development (MDD) technologies. Benefits of using this approach in the software development industry are described. The main emphasis is put on system design, automated code generation for large systems, verification, proof of system properties and reduction of bug density. Drawbacks of the approach are also considered. The approach proposed in the article is specific to industrial software systems development. These systems are characterized by the different levels of abstraction used in the modeling and code development phases. The approach makes it possible to detail the model down to the level of the system code while preserving the verified model semantics and providing checking of the whole detailed model. Steps for translating abstract data structures (including transactions, signals and their parameters) into the data structures used in the detailed system implementation are presented. The grammar of a language for specifying rules for transforming abstract model data structures into real-system detailed data structures is also described. The results of applying the proposed method in an industrial technology are shown. The article is published in the authors’ wording.

  17. An integrated in silico approach to design specific inhibitors targeting human poly(a-specific ribonuclease.

    Directory of Open Access Journals (Sweden)

    Dimitrios Vlachakis

    Poly(A)-specific ribonuclease (PARN) is an exoribonuclease/deadenylase that degrades 3'-end poly(A) tails in almost all eukaryotic organisms. Much of the biochemical and structural information on PARN comes from the human enzyme. However, the existence of PARN all along the eukaryotic evolutionary ladder requires further and thorough investigation. Although the complete structure of the full-length human PARN, as well as several aspects of the catalytic mechanism, still remains elusive, many previous studies indicate that PARN can be used as a potent and promising anti-cancer target. In the present study, we attempt to complement the existing structural information on PARN with in-depth bioinformatics analyses, in order to get a hologram of the molecular evolution of PARN's active site. In an effort to draw an outline that allows specific drug design targeting PARN, an unequivocally specific platform was designed for the development of selective modulators focusing on the unique structural and catalytic features of the enzyme. Extensive phylogenetic analysis based on all publicly available genomes indicated a broad distribution for PARN across eukaryotic species and revealed structurally important amino acids which could be assigned as potentially strong contributors to the regulation of the catalytic mechanism of PARN. Based on the above, we propose a comprehensive in silico model for PARN's catalytic mechanism and, moreover, we developed a 3D pharmacophore model, which was subsequently used for the introduction of the DNP-poly(A) amphipathic substrate analog as a potential inhibitor of PARN. Indeed, biochemical analysis revealed that DNP-poly(A) inhibits PARN competitively. Our approach provides an efficient integrated platform for the rational design of pharmacophore models as well as novel modulators of PARN with therapeutic potential.

  18. Introducing a game approach towards IS requirements specification

    DEFF Research Database (Denmark)

    Jensen, Mika Yasuoka; Kadoya, Kyoichi; Niwa, Takashi

    2014-01-01

    Devising a system requirements specification is a challenging task. Even after several decades of system development research, specifications for large-scale, widely-used systems remain difficult. In this paper, we suggest a first step toward a requirements specification through a stakeholder involvement approach with game elements. We report preliminary findings from a practice case in which our methods are applied to the requirement specification phase of a project management system. The analysis showed that our game approach fostered innovative idea generation and captured implicit user expectations, and as a result provided a list of requirements from other perspectives than those of conventional specification analysis. The granularities of extracted system requirements need to be refined and transferred to detailed requirements for developers to use; however, our results imply that our...

  19. Industrial application of formal models generated from domain specific languages

    NARCIS (Netherlands)

    Hooman, J.

    2016-01-01

    Domain Specific Languages (DSLs) provide a lightweight approach to incorporate formal techniques into the industrial workflow. From DSL instances, formal models and other artefacts can be generated, such as simulation models and code. Having a single source for all artefacts improves maintenance and

  20. A Systematic Approach for Developing Bacteria-Specific Imaging Tracers.

    Science.gov (United States)

    Ordonez, Alvaro A; Weinstein, Edward A; Bambarger, Lauren E; Saini, Vikram; Chang, Yong S; DeMarco, Vincent P; Klunk, Mariah H; Urbanowski, Michael E; Moulton, Kimberly L; Murawski, Allison M; Pokkali, Supriya; Kalinda, Alvin S; Jain, Sanjay K

    2017-01-01

    The modern patient is increasingly susceptible to bacterial infections, including those due to multidrug-resistant organisms (MDROs). Noninvasive whole-body analysis with pathogen-specific imaging technologies can significantly improve patient outcomes by rapidly identifying a source of infection and monitoring the response to treatment, but no such technology exists clinically. We systematically screened 961 random radiolabeled molecules in silico as substrates for essential metabolic pathways in bacteria, followed by in vitro uptake in representative bacteria: Staphylococcus aureus, Escherichia coli, Pseudomonas aeruginosa, and mycobacteria. Fluorine-labeled analogs, which could be developed as PET-based imaging tracers, were evaluated in a murine myositis model. We identified 3 novel, nontoxic molecules demonstrating selective bacterial uptake: para-aminobenzoic acid (PABA), with uptake in all representative bacteria including Mycobacterium tuberculosis; mannitol, with selective uptake in S. aureus and E. coli; and sorbitol, accumulating only in E. coli. None accumulated in mammalian cells or heat-killed bacteria, suggesting metabolism-derived specificity. In addition to an extended bacterial panel of laboratory strains, all 3 molecules rapidly accumulated in respective clinical isolates of interest, including MDROs such as methicillin-resistant S. aureus, extended-spectrum β-lactamase-producing, and carbapenem-resistant Enterobacteriaceae. In a murine myositis model, fluorine-labeled analogs of all 3 molecules could rapidly detect and differentiate infection sites from sterile inflammation in mice (P = 0.03). Finally, 2-deoxy-2-[18F]fluoro-D-sorbitol (18F-FDS) can be easily synthesized from 18F-FDG. PET, with 18F-FDS synthesized using current good manufacturing practice, could rapidly differentiate true infection from sterile inflammation to selectively localize E. coli infection in mice. We have developed a systematic approach that exploits unique

  1. Evaluation of methods for modeling transcription-factor sequence specificity

    Science.gov (United States)

    Weirauch, Matthew T.; Cote, Atina; Norel, Raquel; Annala, Matti; Zhao, Yue; Riley, Todd R.; Saez-Rodriguez, Julio; Cokelaer, Thomas; Vedenko, Anastasia; Talukder, Shaheynoor; Bussemaker, Harmen J.; Morris, Quaid D.; Bulyk, Martha L.; Stolovitzky, Gustavo

    2013-01-01

    Genomic analyses often involve scanning for potential transcription-factor (TF) binding sites using models of the sequence specificity of DNA binding proteins. Many approaches have been developed to model and learn a protein’s binding specificity, but these methods have not been systematically compared. Here we applied 26 such approaches to in vitro protein binding microarray data for 66 mouse TFs belonging to various families. For 9 TFs, we also scored the resulting motif models on in vivo data, and found that the best in vitro–derived motifs performed similarly to motifs derived from in vivo data. Our results indicate that simple models based on mononucleotide position weight matrices learned by the best methods perform similarly to more complex models for most TFs examined, but fall short in specific cases (<10%). In addition, the best-performing motifs typically have relatively low information content, consistent with widespread degeneracy in eukaryotic TF sequence preferences. PMID:23354101
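The mononucleotide position weight matrix models that the benchmark above finds competitive can be sketched as a simple log-odds scan over a sequence. The toy PWM and sequence below are illustrative inventions, not a real TF model from the study:

```python
import math

# Minimal mononucleotide position-weight-matrix (PWM) scan, the kind of
# model used when scoring potential TF binding sites. Each PWM position
# maps base -> probability; scores are log-odds against a uniform background.

BACKGROUND = 0.25  # uniform background frequency for A, C, G, T

def pwm_score(pwm, site):
    """Sum of per-position log2-odds (site vs. background)."""
    return sum(math.log2(pos[base] / BACKGROUND) for pos, base in zip(pwm, site))

# Toy 2-position motif preferring "AG" (hypothetical, for illustration only).
toy_pwm = [
    {"A": 0.7, "C": 0.1, "G": 0.1, "T": 0.1},
    {"A": 0.1, "C": 0.1, "G": 0.7, "T": 0.1},
]

def best_hit(pwm, sequence):
    """Return (offset, score) of the best-scoring window in sequence."""
    w = len(pwm)
    return max(
        ((i, pwm_score(pwm, sequence[i:i + w])) for i in range(len(sequence) - w + 1)),
        key=lambda t: t[1],
    )

offset, score = best_hit(toy_pwm, "TTAGTT")
print(offset)  # 2 -- the "AG" window matches the motif best
```

Dinucleotide or higher-order models extend this scheme with dependencies between positions; the benchmark's finding is that for most TFs this added complexity buys little.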

  2. Patient-Specific Modeling in Tomorrow's Medicine

    CERN Document Server

    2012-01-01

    This book reviews the frontier of research and clinical applications of Patient Specific Modeling, and provides a state-of-the-art update as well as perspectives on future directions in this exciting field. The book is useful for medical physicists, biomedical engineers and other engineers who are interested in the science and technology aspects of Patient Specific Modeling, as well as for radiologists and other medical specialists who wish to be updated about the state of implementation.

  3. Multiple Model Approaches to Modelling and Control,

    DEFF Research Database (Denmark)

    Why Multiple Models? This book presents a variety of approaches which produce complex models or controllers by piecing together a number of simpler subsystems. This divide-and-conquer strategy is a long-standing and general way of coping with complexity in engineering systems, nature and human probl...

  4. Assessing the Learning Path Specification: a Pragmatic Quality Approach

    NARCIS (Netherlands)

    Janssen, José; Berlanga, Adriana; Heyenrath, Stef; Martens, Harrie; Vogten, Hubert; Finders, Anton; Herder, Eelco; Hermans, Henry; Melero, Javier; Schaeps, Leon; Koper, Rob

    2010-01-01

    Janssen, J., Berlanga, A. J., Heyenrath, S., Martens, H., Vogten, H., Finders, A., Herder, E., Hermans, H., Melero Gallardo, J., Schaeps, L., & Koper, R. (2010). Assessing the Learning Path Specification: a Pragmatic Quality Approach. Journal of Universal Computer Science, 16(21), 3191-3209.

  5. New Approach to Total Dose Specification for Spacecraft Electronics

    Science.gov (United States)

    Xapsos, Michael

    2017-01-01

    Variability of the space radiation environment is investigated with regard to total dose specification for spacecraft electronics. It is shown to have a significant impact. A new approach is developed for total dose requirements that replaces the radiation design margin concept with failure probability during a mission.

  6. Morphing patient-specific musculoskeletal models

    DEFF Research Database (Denmark)

    Rasmussen, John; Galibarov, Pavel E.; Al-Munajjed, Amir

    other conditions may require CT or MRI data. The method and its theoretical assumptions, advantages and limitations are presented, and several examples will illustrate morphing to patient-specific models. [1] Carbes S; Tørholm S; Rasmussen, J. A Detailed Twenty-six Segments Kinematic Foot model...

  7. Multiple Model Approaches to Modelling and Control,

    DEFF Research Database (Denmark)

    on the ease with which prior knowledge can be incorporated. It is interesting to note that researchers in Control Theory, Neural Networks, Statistics, Artificial Intelligence and Fuzzy Logic have more or less independently developed very similar modelling methods, calling them Local Model Networks, Operating... of introduction of existing knowledge, as well as the ease of model interpretation. This book attempts to outline much of the common ground between the various approaches, encouraging the transfer of ideas. Recent progress in algorithms and analysis is presented, with constructive algorithms for automated model...

  8. Model Construct Based Enterprise Model Architecture and Its Modeling Approach

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    In order to support enterprise integration, a model-construct-based enterprise model architecture and its modeling approach are studied in this paper. First, the structural makeup and internal relationships of the enterprise model architecture are discussed. Then, the concept of a reusable model construct (MC), which belongs to the control view and can help to derive other views, is proposed. The modeling approach based on model constructs consists of three steps: reference model architecture synthesis, enterprise model customization, and system design and implementation. Following the MC-based modeling approach, a case study with the background of one-kind-product machinery manufacturing enterprises is illustrated. It is shown that the proposed model-construct-based enterprise model architecture and modeling approach are practical and efficient.

  9. Towards a Multiscale Approach to Cybersecurity Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Hogan, Emilie A.; Hui, Peter SY; Choudhury, Sutanay; Halappanavar, Mahantesh; Oler, Kiri J.; Joslyn, Cliff A.

    2013-11-12

    We propose a multiscale approach to modeling cyber networks, with the goal of capturing a view of the network and overall situational awareness with respect to a few key properties (connectivity, distance, and centrality) for a system under an active attack. We focus on the theoretical and algorithmic foundations of multiscale graphs from an algorithmic perspective, with the goal of modeling cyber system defense as a specific use-case scenario. We first define a notion of multiscale graphs, in contrast with their well-studied single-scale counterparts. We develop multiscale analogs of paths and distance metrics. As a simple, motivating example of a common metric, we present a multiscale analog of the all-pairs shortest-path problem, along with a multiscale analog of a well-known algorithm which solves it. From a cyber defense perspective, this metric might be used to model the distance from an attacker's position in the network to a sensitive machine. In addition, we investigate probabilistic models of connectivity. These models exploit the hierarchy to quantify the likelihood that sensitive targets might be reachable from compromised nodes. We believe that our novel multiscale approach to modeling cyber-physical systems will advance several aspects of cyber defense, specifically allowing for a more efficient and agile approach to defending these systems.
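The single-scale all-pairs shortest-path computation that the multiscale analog above generalizes can be sketched with the classical Floyd-Warshall algorithm. The node names and toy topology below are hypothetical; in the cyber-defense reading, `dist["attacker"][target]` models the distance from an attacker's position to a sensitive machine:

```python
# Classical (single-scale) all-pairs shortest paths via Floyd-Warshall,
# the baseline that the paper's multiscale analog generalizes.

INF = float("inf")

def all_pairs_shortest_paths(nodes, edges):
    # Initialize: 0 on the diagonal, infinity elsewhere.
    dist = {u: {v: (0 if u == v else INF) for v in nodes} for u in nodes}
    for u, v, w in edges:  # undirected, weighted edges
        dist[u][v] = min(dist[u][v], w)
        dist[v][u] = min(dist[v][u], w)
    # Relax every pair through each intermediate node k.
    for k in nodes:
        for i in nodes:
            for j in nodes:
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist

# Hypothetical attack path: attacker -> gateway -> server -> database.
nodes = ["attacker", "gateway", "server", "db"]
edges = [("attacker", "gateway", 1), ("gateway", "server", 2), ("server", "db", 1)]
dist = all_pairs_shortest_paths(nodes, edges)
print(dist["attacker"]["db"])  # 4
```

The multiscale version would run such a computation on a hierarchy of coarsened graphs rather than a single flat one, trading exactness for efficiency and situational overview.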

  10. Learning Actions Models: Qualitative Approach

    DEFF Research Database (Denmark)

    Bolander, Thomas; Gierasimczuk, Nina

    2015-01-01

    In dynamic epistemic logic, actions are described using action models. In this paper we introduce a framework for studying learnability of action models from observations. We present first results concerning propositional action models. First we check two basic learnability criteria: finite identifiability (conclusively inferring the appropriate action model in finite time) and identifiability in the limit (inconclusive convergence to the right action model). We show that deterministic actions are finitely identifiable, while non-deterministic actions require more learning power: they are identifiable in the limit. We then move on to a particular learning method, which proceeds via restriction of a space of events within a learning-specific action model. This way of learning closely resembles the well-known update method from dynamic epistemic logic. We introduce several different learning...

  11. Formal specification with the Java modeling language

    NARCIS (Netherlands)

    Huisman, Marieke; Ahrendt, Wolfgang; Grahl, Daniel; Hentschel, Martin; Ahrendt, Wolfgang; Beckert, Bernhard; Bubel, Richard; Hähnle, Reiner; Schmitt, Peter H.; Ulbrich, Mattoas

    2016-01-01

    This text is a general, self-contained, and tool-independent introduction to the Java Modeling Language, JML. It appears in a book about the KeY approach and tool, because JML is the dominating starting point of KeY-style Java verification. However, this chapter does not depend on KeY, nor any

  12. Ensemble approach to predict specificity determinants: benchmarking and validation

    Directory of Open Access Journals (Sweden)

    Panchenko Anna R

    2009-07-01

    Background: It is extremely important and challenging to identify the sites that are responsible for functional specification or diversification in protein families. In this study, a rigorous comparative benchmarking protocol was employed to provide a reliable evaluation of methods which predict specificity determining sites. Subsequently, the three best-performing methods were applied to identify new potential specificity determining sites through an ensemble approach and the common agreement of their prediction results. Results: It was shown that the analysis of structural characteristics of predicted specificity determining sites might provide a means to validate their prediction accuracy. For example, we found that for smaller distances the more reliable the prediction method is, the closer the predicted specificity determining sites are to each other and to the ligand. Conclusion: We observed certain similarities of structural features between predicted and actual subsites which might point to their functional relevance. We speculate that the majority of the identified potential specificity determining sites might be indirectly involved in specific interactions and could be ideal targets for mutagenesis experiments.

  13. Matrix Model Approach to Cosmology

    CERN Document Server

    Chaney, A; Stern, A

    2015-01-01

    We perform a systematic search for rotationally invariant cosmological solutions to matrix models, or more specifically the bosonic sector of Lorentzian IKKT-type matrix models, in dimensions $d$ less than ten, specifically $d=3$ and $d=5$. After taking a continuum (or commutative) limit they yield $d-1$ dimensional space-time surfaces, with an attached Poisson structure, which can be associated with closed, open or static cosmologies. For $d=3$, we obtain recursion relations from which it is possible to generate rotationally invariant matrix solutions which yield open universes in the continuum limit. Specific examples of matrix solutions have also been found which are associated with closed and static two-dimensional space-times in the continuum limit. The solutions provide for a matrix resolution of cosmological singularities. The commutative limit reveals other desirable features, such as a solution describing a smooth transition from an initial inflation to a noninflationary era. Many of the $d=3$ soluti...

  14. Globalization and Income Distribution: A Specific Factors Continuum Approach

    OpenAIRE

    Anderson, James E.

    2009-01-01

    Does globalization widen inequality or increase income risk? In the specific-factors continuum model of this paper, globalization widens inequality, amplifying the positive (negative) premia for export (import-competing) sectors. Globalization amplifies the risk from idiosyncratic relative productivity shocks but reduces risk from aggregate shocks to absolute advantage, relative endowments and transfers. Aggregate-shock-induced income risk bears most heavily on the poorest specific factors, ...

  15. Learning Actions Models: Qualitative Approach

    DEFF Research Database (Denmark)

    Bolander, Thomas; Gierasimczuk, Nina

    2015-01-01

    First we check two basic learnability criteria: finite identifiability (conclusively inferring the appropriate action model in finite time) and identifiability in the limit (inconclusive convergence to the right action model). We show that deterministic actions are finitely identifiable, while non-deterministic actions require more learning power: they are identifiable in the limit. We then move on to a particular learning method, which proceeds via restriction of a space of events within a learning-specific action model. This way of learning closely resembles the well-known update method from dynamic epistemic logic. We introduce several different learning...

  16. Challenges in structural approaches to cell modeling.

    Science.gov (United States)

    Im, Wonpil; Liang, Jie; Olson, Arthur; Zhou, Huan-Xiang; Vajda, Sandor; Vakser, Ilya A

    2016-07-31

    Computational modeling is essential for structural characterization of biomolecular mechanisms across the broad spectrum of scales. Adequate understanding of biomolecular mechanisms inherently involves our ability to model them. Structural modeling of individual biomolecules and their interactions has been rapidly progressing. However, in terms of the broader picture, the focus is shifting toward larger systems, up to the level of a cell. Such modeling involves a more dynamic and realistic representation of the interactomes in vivo, in a crowded cellular environment, as well as membranes and membrane proteins, and other cellular components. Structural modeling of a cell complements computational approaches to cellular mechanisms based on differential equations, graph models, and other techniques to model biological networks, imaging data, etc. Structural modeling along with other computational and experimental approaches will provide a fundamental understanding of life at the molecular level and lead to important applications to biology and medicine. A cross section of diverse approaches presented in this review illustrates the developing shift from the structural modeling of individual molecules to that of cell biology. Studies in several related areas are covered: biological networks; automated construction of three-dimensional cell models using experimental data; modeling of protein complexes; prediction of non-specific and transient protein interactions; thermodynamic and kinetic effects of crowding; cellular membrane modeling; and modeling of chromosomes. The review presents an expert opinion on the current state-of-the-art in these various aspects of structural modeling in cellular biology, and the prospects of future developments in this emerging field.

  17. A conditional likelihood approach for regression analysis using biomarkers measured with batch-specific error.

    Science.gov (United States)

    Wang, Ming; Flanders, W Dana; Bostick, Roberd M; Long, Qi

    2012-12-20

    Measurement error is common in epidemiological and biomedical studies. When biomarkers are measured in batches or groups, measurement error is potentially correlated within each batch or group. In regression analysis, most existing methods are not applicable in the presence of batch-specific measurement error in predictors. We propose a robust conditional likelihood approach to account for batch-specific error in predictors when the batch effect is additive and the predominant source of error, which requires no assumptions on the distribution of measurement error. Although a regression model with batch as a categorical covariate yields the same parameter estimates as the proposed conditional likelihood approach for linear regression, this result does not hold in general for all generalized linear models, in particular, logistic regression. Our simulation studies show that the conditional likelihood approach achieves better finite sample performance than the regression calibration approach or a naive approach without adjustment for measurement error. In the case of logistic regression, our proposed approach is shown to also outperform the regression approach with batch as a categorical covariate. In addition, we also examine a 'hybrid' approach combining the conditional likelihood method and the regression calibration method, which is shown in simulations to achieve good performance in the presence of both batch-specific and measurement-specific errors. We illustrate our method by using data from a colorectal adenoma study.
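
    The comparison the abstract draws for linear regression, between a naive fit and one that conditions on batch, can be sketched in a small simulation (a generic illustration with invented parameters, not the authors' method or data). An additive batch-shared error in the predictor attenuates the naive OLS slope; within-batch centering, which is equivalent to including batch as a categorical covariate in linear regression, removes the shared error and recovers the true slope:

```python
import numpy as np

rng = np.random.default_rng(0)
n_batches, per_batch, beta = 200, 10, 1.0
batch = np.repeat(np.arange(n_batches), per_batch)
x = rng.normal(size=n_batches * per_batch)         # true predictor
u = rng.normal(size=n_batches)[batch]              # additive error shared within each batch
w = x + u                                          # observed biomarker
y = beta * x + rng.normal(scale=0.5, size=x.size)  # outcome

def ols_slope(xv, yv):
    xc = xv - xv.mean()
    return float(xc @ yv / (xc @ xc))

def center_by_batch(v):
    # subtract each batch's mean; the shared batch error cancels exactly
    means = np.zeros(n_batches)
    np.add.at(means, batch, v)
    return v - (means / per_batch)[batch]

naive = ols_slope(w, y)                                     # attenuated toward 0
within = ols_slope(center_by_batch(w), center_by_batch(y))  # consistent for beta
print(round(naive, 2), round(within, 2))
```

    With a batch-error variance equal to the predictor variance, the naive slope is attenuated to roughly half of beta, while the within-batch estimate stays close to 1.0; as the abstract notes, this equivalence to a batch covariate does not carry over to logistic regression.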

  18. Hydraulic Modeling of Lock Approaches

    Science.gov (United States)

    2016-08-01

    ...cation was that the guidewall design changed from a solid wall to one on pilings in which water was allowed to flow through and/or under the wall ... develops innovative solutions in civil and military engineering, geospatial sciences, water resources, and environmental sciences for the Army, the ... magnitudes and directions at lock approaches for open river conditions. The meshes were developed using the Surface-water Modeling System. The two

  19. LP Approach to Statistical Modeling

    OpenAIRE

    Mukhopadhyay, Subhadeep; Parzen, Emanuel

    2014-01-01

    We present an approach to statistical data modeling and exploratory data analysis called `LP Statistical Data Science.' It aims to generalize and unify traditional and novel statistical measures, methods, and exploratory tools. This article outlines fundamental concepts along with real-data examples to illustrate how the `LP Statistical Algorithm' can systematically tackle different varieties of data types, data patterns, and data structures under a coherent theoretical framework. A fundament...

  20. 1-Meter Digital Elevation Model specification

    Science.gov (United States)

    Arundel, Samantha T.; Archuleta, Christy-Ann M.; Phillips, Lori A.; Roche, Brittany L.; Constance, Eric W.

    2015-10-21

    In January 2015, the U.S. Geological Survey National Geospatial Technical Operations Center began producing the 1-Meter Digital Elevation Model data product. This new product was developed to provide high resolution bare-earth digital elevation models from light detection and ranging (lidar) elevation data and other elevation data collected over the conterminous United States (lower 48 States), Hawaii, and potentially Alaska and the U.S. territories. The 1-Meter Digital Elevation Model consists of hydroflattened, topographic bare-earth raster digital elevation models, with a 1-meter x 1-meter cell size, and is available in 10,000-meter x 10,000-meter square blocks with a 6-meter overlap. This report details the specifications required for the production of the 1-Meter Digital Elevation Model.
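
    A quick back-of-envelope on the tile geometry described above (the overlap interpretation is an assumption, not stated in this summary: the 6-meter overlap is taken to extend beyond each edge of the 10,000-meter block, with cells stored as 32-bit floats):

```python
# Hypothetical tile-size arithmetic for a 1-m DEM distributed in
# 10,000 m x 10,000 m blocks with an assumed 6 m overlap on every edge.
cell_m, block_m, overlap_m = 1, 10_000, 6

side_cells = (block_m + 2 * overlap_m) // cell_m  # 10,012 cells per side
cells_per_tile = side_cells ** 2                  # 100,240,144 cells
mib_float32 = cells_per_tile * 4 / 2**20          # ~382 MiB uncompressed

print(side_cells, cells_per_tile, round(mib_float32))
```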

  1. The Danish national passenger model - model specification and results

    DEFF Research Database (Denmark)

    Rich, Jeppe; Hansen, Christian Overgaard

    2016-01-01

    The paper describes the structure of the new Danish National Passenger model and provides on this basis a general discussion of large-scale model design, cost-damping and model validation. The paper aims at providing three main contributions to the existing literature. Firstly, at the general level......, the paper provides a description of a large-scale forecast model with a discussion of the linkage between population synthesis, demand and assignment. Secondly, the paper gives specific attention to model specification and in particular choice of functional form and cost-damping. Specifically we suggest...... a family of logarithmic spline functions and illustrate how it is applied in the model. Thirdly and finally, we evaluate model sensitivity and performance by evaluating the distance distribution and elasticities. In the paper we present results where the spline-function is compared with more traditional...

  2. Approaches to Modeling of Recrystallization

    Directory of Open Access Journals (Sweden)

    Håkan Hallberg

    2011-10-01

    Full Text Available Control of the material microstructure in terms of the grain size is a key component in tailoring material properties of metals and alloys and in creating functionally graded materials. To exert this control, reliable and efficient modeling and simulation of the recrystallization process whereby the grain size evolves is vital. The present contribution is a review paper, summarizing the current status of various approaches to modeling grain refinement due to recrystallization. The underlying mechanisms of recrystallization are briefly recollected and different simulation methods are discussed. Analytical and empirical models, continuum mechanical models and discrete methods as well as phase field, vertex and level set models of recrystallization will be considered. Such numerical methods have been reviewed previously, but with the present focus on recrystallization modeling and with a rapidly increasing amount of related publications, an updated review is called for. Advantages and disadvantages of the different methods are discussed in terms of applicability, underlying assumptions, physical relevance, implementation issues and computational efficiency.

  3. Subject-specific modeling of intracranial aneurysms

    Science.gov (United States)

    Cebral, Juan R.; Hernandez, Monica; Frangi, Alejandro; Putman, Christopher; Pergolizzi, Richard; Burgess, James

    2004-04-01

    Characterization of the blood flow patterns in cerebral aneurysms is important to explore possible correlations between the hemodynamics conditions and the morphology, location, type and risk of rupture of intracranial aneurysms. For this purpose, realistic patient-specific models are constructed from computed tomography angiography and 3D rotational angiography image data. Visualizations of the distribution of hemodynamics forces on the aneurysm walls as well as the intra-aneurysmal flow patterns are presented for a number of cerebral aneurysms of different sizes, types and locations. The numerical models indicate that there are different classes of intra-aneurysmal flow patterns, that may carry different risks of rupture.

  4. Domain-specific and domain-general processes in social perception--A complementary approach.

    Science.gov (United States)

    Michael, John; D'Ausilio, Alessandro

    2015-11-01

    In this brief discussion, we explicate and evaluate Heyes and colleagues' deflationary approach to interpreting apparent evidence of domain-specific processes for social perception. We argue that the deflationary approach sheds important light on how functionally specific processes in social perception can be subserved at least in part by domain-general processes. On the other hand, we also argue that the fruitfulness of this approach has been unnecessarily hampered by a contrastive conception of the relationship between domain-general and domain-specific processes. As an alternative, we propose a complementary conception: the identification of domain-general processes that are engaged in instances of social perception can play a positive, structuring role by adding additional constraints to be accounted for in modelling the domain-specific processes that are also involved in such instances. Copyright © 2014 Elsevier Inc. All rights reserved.

  5. Academic literacies approaches for facilitating language for specific purposes

    Directory of Open Access Journals (Sweden)

    Magnus Gustafsson

    2011-10-01

    Full Text Available This paper offers a possible framework for working with language for specific purposes (LSP) in an integrated fashion, i.e. with disciplinary learning as the main lever to promote academic literacy. I suggest that a genuine literacies approach in higher education is already disciplinary by necessity and that even if we do not have an immediate disciplinary context to work in, we still need to work with the students’ understanding of the communities they are active in. The framework draws on previous research on “literacies” and “generic skills” as the basic components and incorporates ways of adapting other frameworks such as peer learning and activity theory at the institutional level. The framework is applied to three cases at the Division for Language and Communication. The examples indicate how important flexibility in application is, and how the facilitation of learning under an umbrella concept like “academic literacies” is inherently dependent on learning philosophy. The examples also show how the consistent implementation of a framework philosophy requires versatile solutions of the constructive alignment puzzle in designing the environment, the activities, and the assessment of specific interventions. In combination with the three examples, the suggested framework offers a way of prioritising approaches for arriving at academic literacy.

  6. [Neurocognitive and pharmacological approach to specific learning disorders].

    Science.gov (United States)

    Etchepareborda, M C

    1999-02-01

    Specific learning disorders are distinguished from general developmental disorders since, in general, only a certain number of processing mechanisms are involved whilst the remainder are unaffected. The classification proposed by the DSM-IV takes a step towards clinical understanding and the use of a common nomenclature. However, neuropsychological assessment is essential to understanding clinical subtypes. The neurocognitive approach, when taking into account the processing systems affected or involved, should include the strategies and principles of a cognitive-behavioural approach, accompanied by computerized cognitive training. Pharmacological treatment uses drugs with different modes of action depending on the specific neuropsychological characteristics of each type of neurodevelopmental disorder. We discuss the clinical use of various drugs in view of investigations, present and past: methylphenidate for the dysattentional subtype of ADHD; piracetam in developmental dyslexia of the dysideatic type; citicoline in infantile dysphasias with a predominance of sensory input; tiapride in the dysfluent and combined subtypes of ADHD; pipamperone in behaviour disorders and the hyperactive-impulsive subtype of ADHD, with or without associated ...; and selegiline in the dysattentional subtype of ADHD and in dysgraphias with a predominance of calligraphic and spatial disorders.

  7. Animal models of antimuscle specific kinase myasthenia

    Science.gov (United States)

    Richman, David P.; Nishi, Kayoko; Ferns, Michael J.; Schnier, Joachim; Pytel, Peter; Maselli, Ricardo A.; Agius, Mark A.

    2014-01-01

    Antimuscle specific kinase (anti-MuSK) myasthenia (AMM) differs from antiacetylcholine receptor myasthenia gravis in exhibiting more focal muscle involvement (neck, shoulder, facial, and bulbar muscles) with wasting of the involved, primarily axial, muscles. AMM is not associated with thymic hyperplasia and responds poorly to anticholinesterase treatment. Animal models of AMM have been induced in rabbits, mice, and rats by immunization with purified xenogeneic MuSK ectodomain, and by passive transfer of large quantities of purified serum IgG from AMM patients into mice. The models have confirmed the pathogenic role of the MuSK antibodies in AMM and have demonstrated the involvement of both the presynaptic and postsynaptic components of the neuromuscular junction. The observations in this human disease and its animal models demonstrate the role of MuSK not only in the formation of this synapse but also in its maintenance. PMID:23252909

  8. Subject-specific models for image-guided cardiac surgery

    Science.gov (United States)

    Wierzbicki, Marcin; Moore, John; Drangova, Maria; Peters, Terry

    2008-10-01

    Three-dimensional visualization for planning and guidance is still not routinely available for minimally invasive cardiac surgery (MICS). This can be addressed by providing the surgeon with subject-specific geometric models derived from 3D preoperative images for planning of port locations or to rehearse the procedure. For guidance purposes, these models can also be registered to the subject using intraoperative images. In this paper, we present a method for extracting subject-specific heart geometry from preoperative MR images. The main obstacle we face is the low quality of clinical data in terms of resolution, signal-to-noise ratio, and presence of artefacts. Instead of using these images directly, we approach the problem in three steps: (1) generate a high quality template model, (2) register the template with the preoperative data, and (3) animate the result over the cardiac cycle. Validation of this approach showed that dynamic subject-specific models can be generated with a mean error of 3.6 ± 1.1 mm from low resolution target images (6 mm slices). Thus, the models are sufficiently accurate for MICS training and procedure planning. In terms of guidance, we also demonstrate how the resulting models may be adapted to the operating room using intraoperative ultrasound imaging.

  9. Context-specific graphical models for discrete longitudinal data

    DEFF Research Database (Denmark)

    Edwards, David; Anantharama Ankinakatte, Smitha

    2015-01-01

    Ron et al. (1998) introduced a rich family of models for discrete longitudinal data called acyclic probabilistic finite automata. These may be represented as directed graphs that embody context-specific conditional independence relations. Here, the approach is developed from a statistical...... perspective. It is shown here that likelihood ratio tests may be constructed using standard contingency table methods, a model selection procedure that minimizes a penalized likelihood criterion is described, and a way to extend the models to incorporate covariates is proposed. The methods are applied...

  10. Dromedary immune response and specific Kv2.1 antibody generation using a specific immunization approach.

    Science.gov (United States)

    Hassiki, Rym; Labro, Alain J; Benlasfar, Zakaria; Vincke, Cécile; Somia, Mahmoud; El Ayeb, Mohamed; Muyldermans, Serge; Snyders, Dirk J; Bouhaouala-Zahar, Balkiss

    2016-12-01

    Voltage-gated potassium (Kv) channels provide the repolarizing power of cells and are commonly expressed in excitable cells. In non-excitable cells, Kv channels such as Kv2.1 are involved in cell differentiation and growth. Due to the involvement of Kv2.1 in several physiological processes, these channels are promising therapeutic targets. To develop Kv2.1-specific antibody-based channel modulators, we applied a novel approach and immunized a dromedary with heterologous Ltk- cells that overexpress the mouse Kv2.1 channel, instead of immunizing with channel protein fragments. The advantage of this approach is that the channel is presented in its native tetrameric configuration. Using a Cell-ELISA, we demonstrated the ability of the immune serum to detect Kv2.1 channels on the surface of cells that express the channel. Then, using a patch-clamp electrophysiology assay, we explored the capability of the dromedary serum to modulate Kv2.1 currents. Cells that were incubated for 3 h with serum taken at day 51 from the start of the immunization displayed a statistically significant two-fold reduction in current density compared with control conditions and with cells incubated with serum from day 0. Here we show that an immunization approach with cells overexpressing the Kv2.1 channel yields immune serum with Kv2.1-specific antibodies.

  11. Using Built-In Domain-Specific Modeling Support to Guide Model-Based Test Generation

    CERN Document Server

    Kanstrén, Teemu; 10.4204/EPTCS.80.5

    2012-01-01

    We present a model-based testing approach to support automated test generation with domain-specific concepts. This includes a language expert who is an expert at building test models and domain experts who are experts in the domain of the system under test. First, we provide a framework to support the language expert in building test models using a full (Java) programming language with the help of simple but powerful modeling elements of the framework. Second, based on the model built with this framework, the toolset automatically forms a domain-specific modeling language that can be used to further constrain and guide test generation from these models by a domain expert. This makes it possible to generate a large set of test cases covering the full model, chosen (constrained) parts of the model, or manually define specific test cases on top of the model while using concepts familiar to the domain experts.

  12. Using Built-In Domain-Specific Modeling Support to Guide Model-Based Test Generation

    Directory of Open Access Journals (Sweden)

    Teemu Kanstrén

    2012-02-01

    Full Text Available We present a model-based testing approach to support automated test generation with domain-specific concepts. This includes a language expert who is an expert at building test models and domain experts who are experts in the domain of the system under test. First, we provide a framework to support the language expert in building test models using a full (Java) programming language with the help of simple but powerful modeling elements of the framework. Second, based on the model built with this framework, the toolset automatically forms a domain-specific modeling language that can be used to further constrain and guide test generation from these models by a domain expert. This makes it possible to generate a large set of test cases covering the full model, chosen (constrained) parts of the model, or manually define specific test cases on top of the model while using concepts familiar to the domain experts.
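
    The transition-coverage idea at the heart of model-based test generation can be shown with a minimal sketch (a generic Python toy with an invented login model, not the authors' Java framework or its generated DSL): the test model is a finite-state machine, and the generator walks it until every transition has been exercised.

```python
# Toy model-based test generation: cover every transition of a small FSM.
# The model and action names are invented for illustration.
model = {  # state -> {action: next_state}
    "logged_out": {"login": "logged_in"},
    "logged_in": {"browse": "logged_in", "logout": "logged_out"},
}

def generate_tests(model, start="logged_out", max_steps=20):
    uncovered = {(s, a) for s, actions in model.items() for a in actions}
    state, trace = start, []
    while uncovered and len(trace) < max_steps:
        actions = model[state]
        # greedy choice: prefer a transition not yet covered
        action = next((a for a in actions if (state, a) in uncovered),
                      next(iter(actions)))
        uncovered.discard((state, action))
        trace.append(action)
        state = actions[action]
    return [trace]

print(generate_tests(model))  # [['login', 'browse', 'logout']]
```

    A domain expert could constrain generation by shrinking the model to the states and actions of interest, which is the spirit of the domain-specific layer described in the abstract.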

  13. A Three Doublet Lepton-Specific Model

    CERN Document Server

    Merchand, Marco

    2016-01-01

    In the lepton-specific version of two Higgs doublet models, a discrete symmetry is used to couple one Higgs, $\\Phi_2$, to quarks and the other, $\\Phi_1$, to leptons. The symmetry eliminates tree level flavor changing neutral currents (FCNC). Motivated by strong constraints on such currents in the quark sector from meson-antimeson mixing, and by hints of $h \\to \\mu\\tau$ in the lepton sector, we study a simple three Higgs doublet model in which one doublet couples to quarks and the other two to leptons. Unlike most other studies of three Higgs doublet models, we impose no flavor symmetry and just use a $Z_2$ symmetry to constrain the Yukawa couplings. We present the model and discuss the various mixing angles. Constraining the parameters to be consistent with observations of the Higgs boson at the LHC, we study the properties of the charged Higgs boson(s) in the model, focusing on the case in which the charged Higgs is above the top threshold. It is found that one can have the branching fraction of the charged ...

  14. Seismic Safety Margins Research Program (Phase I). Project VII. Systems analysis specification of computational approach

    Energy Technology Data Exchange (ETDEWEB)

    Wall, I.B.; Kaul, M.K.; Post, R.I.; Tagart, S.W. Jr.; Vinson, T.J.

    1979-02-01

    An initial specification is presented of a computational approach for a probabilistic risk assessment model for use in the Seismic Safety Margins Research Program. This model encompasses the whole seismic calculational chain, from seismic input through soil-structure interaction and transfer functions to the probability of component failure, and integrates these failures into a system model to estimate the probability of a release of radioactive material to the environment. It is intended that the primary use of this model will be in sensitivity studies to assess the potential conservatism of different modeling elements in the chain and to provide guidance on priorities for research in seismic design of nuclear power plants.

  15. Modelling the World Wool Market: A Hybrid Approach

    OpenAIRE

    2007-01-01

    We present a model of the world wool market that merges two modelling traditions: the partial-equilibrium commodity-specific approach and the computable general-equilibrium approach. The model captures the multistage nature of the wool production system, and the heterogeneous nature of raw wool, processed wool and wool garments. It also captures the important wool producing and consuming regions of the world. We illustrate the utility of the model by estimating the effects of tariff barriers o...

  16. Modeling Airflow Using Subject-Specific 4DCT-Based Deformable Volumetric Lung Models

    OpenAIRE

    Olusegun J. Ilegbusi; Zhiliang Li; Behnaz Seyfi; Yugang Min; Sanford Meeks; Patrick Kupelian; Santhanam, Anand P.

    2012-01-01

    Lung radiotherapy is greatly benefitted when the tumor motion caused by breathing can be modeled. The aim of this paper is to present the importance of using anisotropic and subject-specific tissue elasticity for simulating the airflow inside the lungs. A computational-fluid-dynamics (CFD) based approach is presented to simulate airflow inside a subject-specific deformable lung for modeling lung tumor motion and the motion of the surrounding tissues during radiotherapy. A flow-structure inter...

  17. A grammar inference approach for predicting kinase specific phosphorylation sites.

    Science.gov (United States)

    Datta, Sutapa; Mukhopadhyay, Subhasis

    2015-01-01

    Kinase-mediated phosphorylation is a key post-translational mechanism that plays an important role in regulating various cellular processes and phenotypes. Many diseases, such as cancer, are related to signaling defects associated with protein phosphorylation. Characterizing protein kinases and their substrates enhances our ability to understand the mechanism of protein phosphorylation and extends our knowledge of signaling networks, thereby helping us to treat such diseases. Experimental methods for detecting phosphorylation sites are labour-intensive and expensive. Also, the manifold increase of protein sequences in the databanks over the years necessitates fast and accurate computational methods for predicting phosphorylation sites in protein sequences. To date, a number of computational methods have been proposed by various researchers for predicting phosphorylation sites, but there remains much scope for improvement. In this communication, we present a simple and novel method based on a grammatical inference (GI) approach to automate the prediction of kinase-specific phosphorylation sites. In this regard, we have used a popular GI algorithm, Alergia, to infer Deterministic Stochastic Finite State Automata (DSFA) that represent the regular grammar corresponding to the phosphorylation sites. Extensive experiments on several datasets generated by us reveal that our inferred grammar successfully predicts phosphorylation sites in a kinase-specific manner. It performs significantly better when compared with the other existing phosphorylation site prediction methods. We have also compared our inferred DSFA with those produced by two other GI inference algorithms. The DSFA generated by our method performs better, which indicates that our method is robust and has potential for predicting phosphorylation sites in a kinase-specific manner.
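
    The flavor of stochastic-automaton scoring behind this approach can be illustrated with a toy sketch (this is a smoothed frequency prefix tree, not the Alergia algorithm or the authors' datasets; the peptide windows are invented): sequences around known sites define transition counts, and a candidate window is scored by the product of its transition probabilities.

```python
from collections import defaultdict

# Invented 5-residue windows around hypothetical phosphosites.
positives = ["RRXSL", "RRASL", "RRXSF", "KRXSL"]

counts = defaultdict(lambda: defaultdict(int))  # prefix state -> next residue -> count
for seq in positives:
    state = ""
    for ch in seq:
        counts[state][ch] += 1
        state += ch  # each prefix of a training window is an automaton state

def score(seq, smooth=1e-3, alphabet=20):
    """Product of smoothed transition frequencies along the prefix path."""
    p, state = 1.0, ""
    for ch in seq:
        total = sum(counts[state].values())
        p *= (counts[state][ch] + smooth) / (total + smooth * alphabet) if total else smooth
        state += ch
    return p

# A window resembling the positives scores orders of magnitude higher
# than an unrelated one.
print(score("RRXSL") > score("GGGGG"))  # True
```

    Alergia goes further by merging statistically compatible states of this prefix tree into a compact automaton; the sketch keeps the tree as-is for brevity.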

  18. Formal Specification and Automatic Analysis of Business Processes under Authorization Constraints: An Action-Based Approach

    Science.gov (United States)

    Armando, Alessandro; Giunchiglia, Enrico; Ponta, Serena Elisa

    We present an approach to the formal specification and automatic analysis of business processes under authorization constraints based on the action language \\cal{C}. The use of \\cal{C} allows for a natural and concise modeling of the business process and the associated security policy and for the automatic analysis of the resulting specification by using the Causal Calculator (CCALC). Our approach improves upon previous work by greatly simplifying the specification step while retaining the ability to perform a fully automatic analysis. To illustrate the effectiveness of the approach we describe its application to a version of a business process taken from the banking domain and use CCALC to determine resource allocation plans complying with the security policy.

  19. A novel approach to investigate tissue-specific trinucleotide repeat instability

    Directory of Open Access Journals (Sweden)

    Boily Marie-Josee

    2010-03-01

    Full Text Available Abstract Background In Huntington's disease (HD), an expanded CAG repeat produces characteristic striatal neurodegeneration. Interestingly, the HD CAG repeat, whose length determines age at onset, undergoes tissue-specific somatic instability, predominant in the striatum, suggesting that tissue-specific CAG length changes could modify the disease process. Therefore, understanding the mechanisms underlying the tissue specificity of somatic instability may provide novel routes to therapies. However, progress in this area has been hampered by the lack of sensitive high-throughput instability quantification methods and global approaches to identify the underlying factors. Results Here we describe a novel approach to gain insight into the factors responsible for the tissue specificity of somatic instability. Using accurate genetic knock-in mouse models of HD, we developed a reliable, high-throughput method to quantify tissue HD CAG repeat instability and integrated this with genome-wide bioinformatic approaches. Using tissue instability quantified in 16 tissues as a phenotype and tissue microarray gene expression as a predictor, we built a mathematical model and identified a gene expression signature that accurately predicted tissue instability. Using the predictive ability of this signature we found that somatic instability was not a consequence of pathogenesis. In support of this, genetic crosses with models of accelerated neuropathology failed to induce somatic instability. In addition, we searched for genes and pathways that correlated with tissue instability. We found that expression levels of DNA repair genes did not explain the tissue specificity of somatic instability. Instead, our data implicate other pathways, particularly cell cycle, metabolism and neurotransmitter pathways, acting in combination to generate tissue-specific patterns of instability. Conclusion Our study clearly demonstrates that multiple tissue factors reflect the level of

  20. On the specification of structural equation models for ecological systems

    Science.gov (United States)

    Grace, J.B.; Anderson, T.M.; Olff, H.; Scheiner, S.M.

    2010-01-01

    The use of structural equation modeling (SEM) is often motivated by its utility for investigating complex networks of relationships, but also because of its promise as a means of representing theoretical concepts using latent variables. In this paper, we discuss characteristics of ecological theory and some of the challenges for proper specification of theoretical ideas in structural equation models (SE models). In our presentation, we describe some of the requirements for classical latent variable models in which observed variables (indicators) are interpreted as the effects of underlying causes. We also describe alternative model specifications in which indicators are interpreted as having causal influences on the theoretical concepts. We suggest that this latter nonclassical specification (which involves another variable type-the composite) will often be appropriate for ecological studies because of the multifaceted nature of our theoretical concepts. In this paper, we employ the use of meta-models to aid the translation of theory into SE models and also to facilitate our ability to relate results back to our theories. We demonstrate our approach by showing how a synthetic theory of grassland biodiversity can be evaluated using SEM and data from a coastal grassland. In this example, the theory focuses on the responses of species richness to abiotic stress and disturbance, both directly and through intervening effects on community biomass. Models examined include both those based on classical forms (where each concept is represented using a single latent variable) and also ones in which the concepts are recognized to be multifaceted and modeled as such. To address the challenge of matching SE models with the conceptual level of our theory, two approaches are illustrated, compositing and aggregation. Both approaches are shown to have merits, with the former being preferable for cases where the multiple facets of a concept have widely differing effects in the

  1. New technologies and the Mission Specific Platform approach

    Science.gov (United States)

    McInroy, D.; Smith, D.; Freudenthal, T.

    2009-04-01

    Within the Integrated Ocean Drilling Program (IODP), ECORD-operated Mission Specific Platforms (MSPs) have allowed scientific ocean drilling to recover core from targets that are generally inaccessible to the two dedicated IODP platforms: the US-operated JOIDES Resolution and the Japanese-operated Chikyu. By contracting vessels, drilling and logging services on a case-by-case basis, IODP has used MSPs to successfully conduct expeditions in the high Arctic Ocean and around Tahiti, and has shown that the program can recover cores in ice-covered waters and in very shallow water. The key strength of the MSP approach is that vessels, drilling and logging systems can be contracted to meet the particular needs of a scientific proposal. Within IODP, MSPs carry the necessary staff and equipment to recover and curate the core, to carry out initial descriptions, undertake a tailored downhole logging program and conduct essential measurements of physical and ephemeral properties. Comprehensive description and analysis of the cores to IODP standards takes place after the offshore phase has ended at the IODP Bremen Core Repository (BCR) in Germany. Depending on availability and cost, potentially any vessel, drilling or logging system can be hired to conduct an MSP. Future possibilities may include the Aurora Borealis that is currently being planned as an ice-breaking drilling vessel with the capability to penetrate 1000 m in 5000 m of water. The concept of MSPs could also be widened beyond vessels with conventional drill rigs. New and alternative technologies can be contracted as part of an MSP Expedition, for example remotely-operated shallow rock drills like MeBo (developed by the MARUM - Center for Marine Environmental Sciences) and the BGS Rockdrills (developed by the British Geological Survey). 
Such technologies have many advantages: they can be quickly deployed from a range of research and industry vessels, they can operate in a wide range of water depths (up to 6000 m by

  2. 77 FR 27814 - Model Safety Evaluation for Plant-Specific Adoption of Technical Specifications Task Force...

    Science.gov (United States)

    2012-05-11

    ... COMMISSION Model Safety Evaluation for Plant-Specific Adoption of Technical Specifications Task Force... availability. SUMMARY: The U.S. Nuclear Regulatory Commission (NRC) is announcing the availability of the model safety evaluation (SE) for plant-specific adoption of Technical Specifications (TSs) Task Force...

  3. Modular Modelling and Simulation Approach - Applied to Refrigeration Systems

    DEFF Research Database (Denmark)

    Sørensen, Kresten Kjær; Stoustrup, Jakob

    2008-01-01

    This paper presents an approach to modelling and simulation of the thermal dynamics of a refrigeration system, specifically a reefer container. A modular approach is used and the objective is to increase the speed and flexibility of the developed simulation environment. The refrigeration system...

  4. Validation of Modeling Flow Approaching Navigation Locks

    Science.gov (United States)

    2013-08-01

Figure captions recovered from the report front matter: Figure 9. Tools and instrumentation, bracket attached to rail. Figure 10. Tools and instrumentation, direction vernier. Figure 11. Plan A lock approach, upstream approach. Numerical model …

  5. Fusarium diversity in soil using a specific molecular approach and a cultural approach.

    Science.gov (United States)

    Edel-Hermann, Véronique; Gautheron, Nadine; Mounier, Arnaud; Steinberg, Christian

    2015-04-01

    Fusarium species are ubiquitous in soil. They cause plant and human diseases and can produce mycotoxins. Surveys of Fusarium species diversity in environmental samples usually rely on laborious culture-based methods. In the present study, we have developed a molecular method to analyze Fusarium diversity directly from soil DNA. We designed primers targeting the translation elongation factor 1-alpha (EF-1α) gene and demonstrated their specificity toward Fusarium using a large collection of fungi. We used the specific primers to construct a clone library from three contrasting soils. Sequence analysis confirmed the specificity of the assay, with 750 clones identified as Fusarium and distributed among eight species or species complexes. The Fusarium oxysporum species complex (FOSC) was the most abundant one in the three soils, followed by the Fusarium solani species complex (FSSC). We then compared our molecular approach results with those obtained by isolating Fusarium colonies on two culture media and identifying species by sequencing part of the EF-1α gene. The 750 isolates were distributed into eight species or species complexes, with the same dominant species as with the cloning method. Sequence diversity was much higher in the clone library than in the isolate collection. The molecular approach proved to be a valuable tool to assess Fusarium diversity in environmental samples. Combined with high throughput sequencing, it will allow for in-depth analysis of large numbers of samples.

  6. Model Mapping Approach Based on Ontology Semantics

    Directory of Open Access Journals (Sweden)

    Jinkui Hou

    2013-09-01

The mapping relations between different models are the foundation for model transformation in model-driven software development. On the basis of ontology semantics, model mappings between different levels are classified by using the structural semantics of modeling languages. The general definition process for mapping relations is explored, and the principles of structure mapping are proposed subsequently. The approach is further illustrated by the mapping relations from a class model in an object-oriented modeling language to C programming code. The application research shows that the approach provides theoretical guidance for the realization of model mapping and can thus effectively support model-driven software development.

  7. Fence - An Efficient Parser with Ambiguity Support for Model-Driven Language Specification

    CERN Document Server

    Quesada, Luis; Cortijo, Francisco J

    2011-01-01

    Model-based language specification has applications in the implementation of language processors, the design of domain-specific languages, model-driven software development, data integration, text mining, natural language processing, and corpus-based induction of models. Model-based language specification decouples language design from language processing and, unlike traditional grammar-driven approaches, which constrain language designers to specific kinds of grammars, it needs general parser generators able to deal with ambiguities. In this paper, we propose Fence, an efficient bottom-up parsing algorithm with lexical and syntactic ambiguity support that enables the use of model-based language specification in practice.

  8. A Model-Driven Approach for Telecommunications Network Services Definition

    Science.gov (United States)

    Chiprianov, Vanea; Kermarrec, Yvon; Alff, Patrick D.

The present-day telecommunications market imposes a short concept-to-market time on service providers. To reduce it, we propose a computer-aided, model-driven, service-specific tool, with support for collaborative work and for checking properties on models. We started by defining a prototype of the Meta-model (MM) of the service domain. Using this prototype, we defined a simple graphical modeling language specific to service designers. We are currently enlarging the MM of the domain using model transformations from Network Abstraction Layers (NALs). In the future, we will investigate approaches to ensure support for collaborative work and for checking properties on models.

  9. The 727 approach energy management system avionics specification (preliminary)

    Science.gov (United States)

    Jackson, D. O.; Lambregts, A. A.

    1976-01-01

    Hardware and software requirements for an Approach Energy Management System (AEMS) consisting of an airborne digital computer and cockpit displays are presented. The displays provide the pilot with a visual indication of when to manually operate the gear, flaps, and throttles during a delayed flap approach so as to reduce approach time, fuel consumption, and community noise. The AEMS is an independent system that does not interact with other navigation or control systems, and is compatible with manually flown or autopilot coupled approaches. Operational use of the AEMS requires a DME ground station colocated with the flight path reference.

  10. Learning Action Models: Qualitative Approach

    NARCIS (Netherlands)

    Bolander, T.; Gierasimczuk, N.; van der Hoek, W.; Holliday, W.H.; Wang, W.-F.

    2015-01-01

    In dynamic epistemic logic, actions are described using action models. In this paper we introduce a framework for studying learnability of action models from observations. We present first results concerning propositional action models. First we check two basic learnability criteria: finite

  11. Computer models of bacterial cells: from generalized coarse-grained to genome-specific modular models

    Science.gov (United States)

    Nikolaev, Evgeni V.; Atlas, Jordan C.; Shuler, Michael L.

    2006-09-01

    We discuss a modular modelling framework to rapidly develop mathematical models of bacterial cells that would explicitly link genomic details to cell physiology and population response. An initial step in this approach is the development of a coarse-grained model, describing pseudo-chemical interactions between lumped species. A hybrid model of interest can then be constructed by embedding genome-specific detail for a particular cellular subsystem (e.g. central metabolism), called here a module, into the coarse-grained model. Specifically, a new strategy for sensitivity analysis of the cell division limit cycle is introduced to identify which pseudo-molecular processes should be delumped to implement a particular biological function in a growing cell (e.g. ethanol overproduction or pathogen viability). To illustrate the modeling principles and highlight computational challenges, the Cornell coarse-grained model of Escherichia coli B/r-A is used to benchmark the proposed framework.

  12. Geometrical approach to fluid models

    NARCIS (Netherlands)

    Kuvshinov, B. N.; Schep, T. J.

    1997-01-01

    Differential geometry based upon the Cartan calculus of differential forms is applied to investigate invariant properties of equations that describe the motion of continuous media. The main feature of this approach is that physical quantities are treated as geometrical objects. The geometrical

  14. Model based feature fusion approach

    NARCIS (Netherlands)

    Schwering, P.B.W.

    2001-01-01

    In recent years different sensor data fusion approaches have been analyzed and evaluated in the field of mine detection. In various studies comparisons have been made between different techniques. Although claims can be made for advantages for using certain techniques, until now there has been no si

  15. State-oriented models in software specification

    Directory of Open Access Journals (Sweden)

    Adilson Luiz Bonifácio

    2004-01-01

    These techniques can be formal or not, according to the system under development. In this work, a formal modeling technique is applied in a case study: the Finite State Machine model is used to specify the functionalities of a calculator, modeling its basic arithmetical operations.
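A Finite State Machine specification of this kind can be sketched as a transition table. The states, events and transitions below are invented for illustration and are not taken from the paper:

```python
# Minimal FSM sketch for specifying a calculator's input behaviour.
# States, events and transitions are hypothetical illustrations.
TRANSITIONS = {
    ("start",    "digit"):  "operand1",
    ("operand1", "digit"):  "operand1",
    ("operand1", "op"):     "operator",
    ("operator", "digit"):  "operand2",
    ("operand2", "digit"):  "operand2",
    ("operand2", "equals"): "result",
}

def run(events, state="start"):
    """Drive the machine over a sequence of input events."""
    for event in events:
        state = TRANSITIONS.get((state, event))
        if state is None:          # no transition defined: reject input
            return "error"
    return state

print(run(["digit", "op", "digit", "equals"]))  # a valid "1 + 2 =" entry
```

Undefined transitions (for example, an operator with no first operand) are rejected, which is exactly what a state-oriented specification makes easy to check.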

  16. Global energy modeling - A biophysical approach

    Energy Technology Data Exchange (ETDEWEB)

    Dale, Michael

    2010-09-15

    This paper contrasts the standard economic approach to energy modelling with energy models using a biophysical approach. Neither of these approaches includes changing energy-returns-on-investment (EROI) due to declining resource quality or the capital intensive nature of renewable energy sources. Both of these factors will become increasingly important in the future. An extension to the biophysical approach is outlined which encompasses a dynamic EROI function that explicitly incorporates technological learning. The model is used to explore several scenarios of long-term future energy supply especially concerning the global transition to renewable energy sources in the quest for a sustainable energy system.
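The dynamic EROI idea can be illustrated with a simple experience-curve sketch. The functional form and the parameter values below are assumptions for illustration only, not the paper's model:

```python
import numpy as np

# Hypothetical EROI function with technological learning: the energy
# invested per unit output falls with cumulative production C, so EROI
# rises as C**b while the technology matures. eroi0 and b are invented.
def eroi(cumulative_production, eroi0=10.0, b=0.2):
    return eroi0 * np.asarray(cumulative_production) ** b

C = np.array([1.0, 2.0, 4.0, 8.0])   # doublings of cumulative production
print(eroi(C))                        # monotonically increasing EROI
```

In a fuller model, declining resource quality would add an opposing term that pulls EROI back down, which is the tension the paper's scenarios explore.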

  17. A POMDP approach to Affective Dialogue Modeling

    NARCIS (Netherlands)

    Bui Huu Trung, B.H.T.; Poel, Mannes; Nijholt, Antinus; Zwiers, Jakob; Keller, E.; Marinaro, M.; Bratanic, M.

    2007-01-01

    We propose a novel approach to developing a dialogue model that is able to take into account some aspects of the user's affective state and to act appropriately. Our dialogue model uses a Partially Observable Markov Decision Process approach with observations composed of the observed user's

  18. The chronic diseases modelling approach

    NARCIS (Netherlands)

    Hoogenveen RT; Hollander AEM de; Genugten MLL van; CCM

    1998-01-01

    A mathematical model structure is described that can be used to simulate the changes of the Dutch public health state over time. The model is based on the concept of demographic and epidemiologic processes (events) and is mathematically based on the lifetable method. The population is divided over s

  19. Cost Concept Model and Gateway Specification

    DEFF Research Database (Denmark)

    Kejser, Ulla Bøgvad

    2014-01-01

    This document introduces a Framework supporting the implementation of a cost concept model against which current and future cost models for curating digital assets can be benchmarked. The value built into this cost concept model leverages the comprehensive engagement by the 4C project with various user communities and builds upon our understanding of the requirements, drivers, obstacles and objectives that various stakeholder groups have relating to digital curation. Ultimately, this concept model should provide a critical input to the development and refinement of cost models as well as helping … to promote interoperability; • A Nested Model for Digital Curation—that visualises the core concepts, demonstrates how they interact and places them into context visually by linking them to A Cost and Benefit Model for Curation. This Framework provides guidance for data collection and associated calculations …

  20. Fast reconstruction of compact context-specific metabolic network models.

    Directory of Open Access Journals (Sweden)

    Nikos Vlassis

    2014-01-01

    Systemic approaches to the study of a biological cell or tissue rely increasingly on the use of context-specific metabolic network models. The reconstruction of such a model from high-throughput data can routinely involve large numbers of tests under different conditions and extensive parameter tuning, which calls for fast algorithms. We present fastcore, a generic algorithm for reconstructing context-specific metabolic network models from global genome-wide metabolic network models such as Recon X. fastcore takes as input a core set of reactions that are known to be active in the context of interest (e.g., cell or tissue), and it searches for a flux-consistent subnetwork of the global network that contains all reactions from the core set and a minimal set of additional reactions. Our key observation is that a minimal consistent reconstruction can be defined via a set of sparse modes of the global network, and fastcore iteratively computes such a set via a series of linear programs. Experiments on liver data demonstrate speedups of several orders of magnitude, and significantly more compact reconstructions, over a rival method. Given its simplicity and its excellent performance, fastcore can form the backbone of many future metabolic network reconstruction algorithms.
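The flux-consistency notion that fastcore builds on can be illustrated with a toy linear program per reaction (this is the classic FVA-style consistency test, not the fastcore algorithm itself, and the network is invented):

```python
import numpy as np
from scipy.optimize import linprog

# Toy network (invented): R1: ->A, R2: A->B, R3: B->, R4: A->, R5: B->C,
# where metabolite C is a dead end, so R5 can never carry steady-state flux.
S = np.array([
    [1, -1,  0, -1,  0],   # metabolite A
    [0,  1, -1,  0, -1],   # metabolite B
    [0,  0,  0,  0,  1],   # metabolite C (produced, never consumed)
], dtype=float)
lb, ub = np.zeros(5), np.full(5, 10.0)   # all reactions irreversible

def carries_flux(i, tol=1e-6):
    """Can reaction i have non-zero flux under steady state S v = 0?"""
    c = np.zeros(5)
    c[i] = -1.0                            # maximise v_i (linprog minimises)
    res = linprog(c, A_eq=S, b_eq=np.zeros(3),
                  bounds=list(zip(lb, ub)), method="highs")
    return res.success and -res.fun > tol

consistent = [i for i in range(5) if carries_flux(i)]
print(consistent)   # reaction index 4 (R5) is flux-inconsistent
```

fastcore's contribution is to avoid solving one LP per reaction, instead computing sparse flux modes that certify many core reactions at once.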

  1. Challenges and opportunities for integrating lake ecosystem modelling approaches

    Science.gov (United States)

    Mooij, Wolf M.; Trolle, Dennis; Jeppesen, Erik; Arhonditsis, George; Belolipetsky, Pavel V.; Chitamwebwa, Deonatus B.R.; Degermendzhy, Andrey G.; DeAngelis, Donald L.; Domis, Lisette N. De Senerpont; Downing, Andrea S.; Elliott, J. Alex; Ruberto, Carlos Ruberto; Gaedke, Ursula; Genova, Svetlana N.; Gulati, Ramesh D.; Hakanson, Lars; Hamilton, David P.; Hipsey, Matthew R.; Hoen, Jochem 't; Hulsmann, Stephan; Los, F. Hans; Makler-Pick, Vardit; Petzoldt, Thomas; Prokopkin, Igor G.; Rinke, Karsten; Schep, Sebastiaan A.; Tominaga, Koji; Van Dam, Anne A.; Van Nes, Egbert H.; Wells, Scott A.; Janse, Jan H.

    2010-01-01

    A large number and wide variety of lake ecosystem models have been developed and published during the past four decades. We identify two challenges for making further progress in this field. One such challenge is to avoid developing more models largely following the concept of others ('reinventing the wheel'). The other challenge is to avoid focusing on only one type of model, while ignoring new and diverse approaches that have become available ('having tunnel vision'). In this paper, we aim at improving the awareness of existing models and knowledge of concurrent approaches in lake ecosystem modelling, without covering all possible model tools and avenues. First, we present a broad variety of modelling approaches. To illustrate these approaches, we give brief descriptions of rather arbitrarily selected sets of specific models. We deal with static models (steady state and regression models), complex dynamic models (CAEDYM, CE-QUAL-W2, Delft 3D-ECO, LakeMab, LakeWeb, MyLake, PCLake, PROTECH, SALMO), structurally dynamic models and minimal dynamic models. We also discuss a group of approaches that could all be classified as individual based: super-individual models (Piscator, Charisma), physiologically structured models, stage-structured models and trait-based models. We briefly mention genetic algorithms, neural networks, Kalman filters and fuzzy logic. Thereafter, we zoom in, as an in-depth example, on the multi-decadal development and application of the lake ecosystem model PCLake and related models (PCLake Metamodel, Lake Shira Model, IPH-TRIM3D-PCLake). In the discussion, we argue that while the historical development of each approach and model is understandable given its 'leading principle', there are many opportunities for combining approaches. We take the point of view that a single 'right' approach does not exist and should not be strived for. Instead, multiple modelling approaches, applied concurrently to a given problem, can help develop an integrative

  2. Towards a CPN-Based Modelling Approach for Reconciling Verification and Implementation of Protocol Models

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge; Kristensen, Lars Michael

    2013-01-01

    Formal modelling of protocols is often aimed at one specific purpose such as verification or automatically generating an implementation. This leads to models that are useful for one purpose, but not for others. Being able to derive models for verification and implementation from a single model … and implementation. Our approach has been developed in the context of the Coloured Petri Nets (CPNs) modelling language. We illustrate our approach by presenting a descriptive specification model of the WebSocket protocol, which is currently under development by the Internet Engineering Task Force (IETF), and we show …

  3. Face recognition: a model specific ability

    Directory of Open Access Journals (Sweden)

    Jeremy B Wilmer

    2014-10-01

    In our everyday lives, we view it as a matter of course that different people are good at different things. It can be surprising, in this context, to learn that most of what is known about cognitive ability variation across individuals concerns the broadest of all cognitive abilities, often labeled g. In contrast, our knowledge of specific abilities, those that correlate little with g, is severely constrained. Here, we draw upon our experience investigating an exceptionally specific ability, face recognition, to make the case that many specific abilities could easily have been missed. In making this case, we derive key insights from earlier false starts in the measurement of face recognition’s variation across individuals, and we highlight the convergence of factors that enabled the recent discovery that this variation is specific. We propose that the case of face recognition ability illustrates a set of tools and perspectives that could accelerate fruitful work on specific cognitive abilities. By revealing relatively independent dimensions of human ability, such work would enhance our capacity to understand the uniqueness of individual minds.

  4. Morphing patient-specific musculoskeletal models

    DEFF Research Database (Denmark)

    Rasmussen, John; Galibarov, Pavel E.; Al-Munajjed, Amir;

    Anatomically realistic musculoskeletal models tend to be very complicated. The current full-body model of the AnyScript Model Repository comprises more than 1000 individually activated muscles and hundreds of bones and joints, and the development of these generic body parts represents an investment … the generic model differs significantly from the patient in question. The scenario therefore entails two sets of data: (i) a generic musculoskeletal model representing a single (average) individual, and (ii) a set of 3-D medical imaging data, typically in the form of a DICOM file obtained from CT, MRI … or surface scans. Furthermore, we assume that a set of corresponding anatomical landmarks can be identified in the medical imaging data and on the generic musculoskeletal model. A nonlinear transformation, i.e. a morphing, is created by means of radial basis functions that maps point set (i) to point set …
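The landmark-based morphing step can be sketched with SciPy's RBFInterpolator. The landmark coordinates below are invented, and the "patient" is simply a scaled-and-shifted copy of the generic model so that the mapping has a known answer:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Hypothetical corresponding landmarks on the generic model and in the
# patient's imaging data (here the patient is an affine copy for testing).
generic = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.],
                    [0., 0., 1.], [1., 1., 1.]])
patient = generic * 1.1 + np.array([0.02, -0.01, 0.0])

# Radial-basis-function mapping from generic space to patient space
morph = RBFInterpolator(generic, patient, kernel="thin_plate_spline")

# Morph any other point of the generic model, e.g. a muscle attachment site
point = np.array([[0.5, 0.5, 0.5]])
print(morph(point))   # -> approximately [[0.57, 0.54, 0.55]]
```

Because the thin-plate-spline kernel includes a degree-1 polynomial tail, purely affine differences between the two point sets are reproduced exactly; real patient anatomy would additionally bend the mapping nonlinearly between landmarks.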

  5. Cost Concept Model and Gateway Specification

    DEFF Research Database (Denmark)

    Kejser, Ulla Bøgvad

    2014-01-01

    This document introduces a Framework supporting the implementation of a cost concept model against which current and future cost models for curating digital assets can be benchmarked. The value built into this cost concept model leverages the comprehensive engagement by the 4C project with various user communities and builds upon our understanding of the requirements, drivers, obstacles and objectives that various stakeholder groups have relating to digital curation. Ultimately, this concept model should provide a critical input to the development and refinement of cost models as well as helping … to ensure that the curation and preservation solutions and services that will inevitably arise from the commercial sector as ‘supply’ respond to a much better understood ‘demand’ for cost-effective and relevant tools. To meet acknowledged gaps in current provision, a nested model of curation which addresses …

  6. Domain-specific modeling enabling full code generation

    CERN Document Server

    Kelly, Steven

    2007-01-01

    Domain-Specific Modeling (DSM) is the latest approach to software development, promising to greatly increase the speed and ease of software creation. Early adopters of DSM have been enjoying productivity increases of 500–1000% in production for over a decade. This book introduces DSM and offers examples from various fields to illustrate to experienced developers how DSM can improve software development in their teams. Two authorities in the field explain what DSM is, why it works, and how to successfully create and use a DSM solution to improve productivity and quality. Divided into four parts, the book covers: background and motivation; fundamentals; in-depth examples; and creating DSM solutions. There is an emphasis throughout the book on practical guidelines for implementing DSM, including how to identify the necessary language constructs, how to generate full code from models, and how to provide tool support for a new DSM language. The example cases described in the book are available on the book's Website, www.dsmbook....

  7. A Unified Approach to Modeling and Programming

    DEFF Research Database (Denmark)

    Madsen, Ole Lehrmann; Møller-Pedersen, Birger

    2010-01-01

    SIMULA was a language for modeling and programming and provided a unified approach to modeling and programming, in contrast to methodologies based on structured analysis and design. The current development seems to be going in the direction of separation of modeling and programming. The goal … of this paper is to go back to the future and get inspiration from SIMULA and propose a unified approach. In addition to reintroducing the contributions of SIMULA and the Scandinavian approach to object-oriented programming, we do this by discussing a number of issues in modeling and programming and argue why we …

  8. 77 FR 15399 - Model Safety Evaluation for Plant-Specific Adoption of Technical Specifications Task Force...

    Science.gov (United States)

    2012-03-15

    ... COMMISSION Model Safety Evaluation for Plant-Specific Adoption of Technical Specifications Task Force... Regulatory Commission (NRC) is announcing the availability of the model safety evaluation (SE) for plant..., Revision 1, is available in ADAMS under Accession No. ML111650552; the model application is available...

  9. 77 FR 58421 - Model Safety Evaluation for Plant-Specific Adoption of Technical Specifications Task Force...

    Science.gov (United States)

    2012-09-20

    ... COMMISSION Model Safety Evaluation for Plant-Specific Adoption of Technical Specifications Task Force...-415- 4737, or by email to pdr.resource@nrc.gov . TSTF-522, Revision 0, includes a model application and is available in ADAMS under Accession No. ML100890316. The model safety evaluation (SE) of...

  10. A semiparametric approach to physiological flow models.

    Science.gov (United States)

    Verotta, D; Sheiner, L B; Ebling, W F; Stanski, D R

    1989-08-01

    By regarding sampled tissues in a physiological model as linear subsystems, the usual advantages of flow models are preserved while mitigating two of their disadvantages: (i) the need for assumptions regarding intratissue kinetics, and (ii) the need to simultaneously fit data from several tissues. To apply the linear systems approach, both arterial blood and (interesting) tissue drug concentrations must be measured. The body is modeled as having an arterial compartment (A) distributing drug to different linear subsystems (tissues), connected in a specific way by blood flow. The response (CA, with dimensions of concentration) of A is measured. Tissues receive input from A (and optionally from other tissues), and send output to the outside or to other parts of the body. The response CT of a given tissue T (the total amount of drug in T divided by the volume of T) is also observed. From linear systems theory, CT can be expressed as the convolution of CA with a disposition function F(t) (with dimensions 1/time). The function F(t) depends on the (unknown) structure of T, but has certain other constant properties: the integral ∫0^∞ F(t) dt is the steady-state ratio of CT to CA, and the point F(0) is the clearance rate of drug from A to T divided by the volume of T. A formula for the clearance rate of drug from T to outside T can be derived. To estimate F(t) empirically, and thus mitigate disadvantage (i), we suggest that, first, a nonparametric (or parametric) function be fitted to the CA data, yielding predicted values ĈA, and, second, that the convolution integral of ĈA with F(t) be fitted to the CT data using a deconvolution method. By so doing, each tissue's data are analyzed separately, thus mitigating disadvantage (ii). A method for system simulation is also proposed. The results of applying the approach to simulated data and to real thiopental data are reported.
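The convolution step CT(t) = (CA ∗ F)(t) can be sketched numerically. The single-exponential form of F(t) and every parameter value below are assumptions for illustration, not the paper's model:

```python
import numpy as np
from scipy.optimize import curve_fit

t = np.linspace(0.0, 10.0, 201)
dt = t[1] - t[0]
CA = np.exp(-0.5 * t)   # stand-in for the fitted arterial curve CA(t)

def predict_CT(t_obs, a, k):
    """CT(t) = (CA * F)(t) with an assumed F(t) = a*exp(-k*t).
    Note F(0) = a, and the integral of F is a/k (steady-state CT/CA ratio)."""
    F = a * np.exp(-k * t)
    CT = np.convolve(CA, F)[:len(t)] * dt    # discrete convolution
    return np.interp(t_obs, t, CT)

# Simulate noiseless tissue data from "true" a = 2.0, k = 0.3, then refit
CT_obs = predict_CT(t, 2.0, 0.3)
(a_hat, k_hat), _ = curve_fit(predict_CT, t, CT_obs, p0=[1.0, 1.0])
print(a_hat, k_hat)   # estimates of a and k
```

Fitting each tissue's CT separately against the shared arterial input is what lets the approach avoid simultaneous multi-tissue fits.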

  11. SPECIFIC MODELS OF REPRESENTING THE INTELLECTUAL CAPITAL

    Directory of Open Access Journals (Sweden)

    Andreea Feraru

    2014-12-01

    Various scientists in the modern age of management have proposed different models for evaluating intellectual capital, and some of these models are analysed critically in this study. Most authors examine intellectual capital from a static perspective and focus on the development of its various evaluation models. In this chapter we survey the classical static models: Sveiby, Edvisson, Balanced Scorecard, as well as the canonical model of intellectual capital. In a spectral dynamic analysis, organisational intellectual capital is structured into organisational knowledge, organisational intelligence and organisational values, whose value is built on certain mechanisms called integrators, whose chief constitutive elements are individual knowledge, individual intelligence and individual cultural values. Organizations, as employers, must especially reconsider the work of those employees who value knowledge, because such employees are free to choose how, and especially where, they are inclined to invest their own energy, skills and time, and they can be treated as freelancers or as small entrepreneurs.

  12. Domain-Specific Modelling Languages in Bigraphs

    DEFF Research Database (Denmark)

    Perrone, Gian David

    " of models, in order to improve the utility of the models we build, and to ease the process of model construction by moving the languages we use to express such models closer to their respective domains. This thesis is concerned with the study of bigraphical reactive systems as a host for domain......-specic modelling languages. We present a number of novel technical developments, including a new complete meta-calculus presentation of bigraphical reactive systems, an abstract machine that instantiates to an abstract machine for any instance calculus, and a mechanism for dening declarative sorting predicates...... that always give rise to wellbehaved sortings. We explore bigraphical renement relations that permit formalisation of the relationship between dierent languages instantiated as bigraphical reactive systems. We detail a prototype verication tool for instance calculi, and provide a tractable heuristic...

  13. A historical approach to English for Specific Purposes in Ecuador

    Directory of Open Access Journals (Sweden)

    Maritza Sandra Pibaque Pionce

    2015-01-01

    The systematization of experiences in the process of doctoral training in Educational Sciences allows proposing a historical analysis of English for Specific Purposes in Ecuador, particularly for International Business Degree students, as the basis for the theoretical analysis in the scientific assessment of educational research. The definition and justification of the periods established over time are presented, with the aim of assessing the historical development of the teaching-learning process of English for Specific Purposes in business education, for which it is necessary to address the effectiveness of English teaching. Thus, lingo-cultural competence focuses on the ability to negotiate cultural meanings and implement effective communication behaviors, connecting aspects of language directly with specific cultural facts of a society. This research work is based on bibliographic analysis and on inductive-deductive and analysis-synthesis methods.

  14. Allergen-specific immunotherapy: from therapeutic vaccines to prophylactic approaches.

    Science.gov (United States)

    Valenta, R; Campana, R; Marth, K; van Hage, M

    2012-08-01

    Immunoglobulin E-mediated allergies affect more than 25% of the population. Allergen exposure induces a variety of symptoms in allergic patients, which include rhinitis, conjunctivitis, asthma, dermatitis, food allergy and life-threatening systemic anaphylaxis. At present, allergen-specific immunotherapy (SIT), which is based on the administration of the disease-causing allergens, is the only disease-modifying treatment for allergy. Current therapeutic allergy vaccines are still prepared from relatively poorly defined allergen extracts. However, with the availability of the structures of the most common allergen molecules, it has become possible to produce well-defined recombinant and synthetic allergy vaccines that allow specific targeting of the mechanisms of allergic disease. Here we provide a summary of the development and mechanisms of SIT, and then review new forms of therapeutic vaccines that are based on recombinant and synthetic molecules. Finally, we discuss possible allergen-specific strategies for prevention of allergic disease.

  15. Modelling road accidents: An approach using structural time series

    Science.gov (United States)

    Junus, Noor Wahida Md; Ismail, Mohd Tahir

    2014-09-01

    In this paper, the trend of road accidents in Malaysia for the years 2001 to 2012 was modelled using a structural time series approach. The structural time series model was identified using a stepwise method, and the residuals for each model were tested. The best-fitted model was chosen based on the smallest Akaike Information Criterion (AIC) and prediction error variance. In order to check the quality of the model, a data validation procedure was performed by predicting the monthly number of road accidents for the year 2012. Results indicate that the best specification of the structural time series model to represent road accidents is the local level with a seasonal model.
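The local-level component of such a structural model can be sketched with a minimal Kalman filter. The variance values and the data series below are invented for illustration:

```python
import numpy as np

# Local-level model sketch: y_t = mu_t + eps_t,  mu_t = mu_{t-1} + eta_t.
# The variances sigma_eps2 and sigma_eta2 are assumed here, not estimated
# (a full structural model would add a seasonal component, estimate the
# variances by maximum likelihood, and compare candidate models by AIC).
def local_level_filter(y, sigma_eps2=1.0, sigma_eta2=0.1):
    mu, P = y[0], 1e7                 # diffuse-style initialisation
    filtered = []
    for obs in y:
        P += sigma_eta2               # predict step
        K = P / (P + sigma_eps2)      # Kalman gain
        mu += K * (obs - mu)          # update step
        P *= (1.0 - K)
        filtered.append(mu)
    return np.array(filtered)

y = np.array([10.0, 10.5, 9.8, 10.2, 11.0, 10.7])  # stand-in monthly counts
print(local_level_filter(y).round(2))
```

The filtered level tracks the slowly drifting mean of the series while damping the observation noise, which is the role the local-level component plays in the accident model.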

  16. A FORMAL SPECIFICATION LANGUAGE FOR DYNAMIC STRAND SPACE MODEL

    Institute of Scientific and Technical Information of China (English)

    刘东喜; 李晓勇; 白英彩

    2002-01-01

    A specification language provides the information needed to model a cryptographic protocol. This paper first extends the strand space model to a dynamic strand model, and then defines a formal specification language for this model using a BNF grammar. Compared with those in the literature, the language is simpler because it concerns only the algebraic properties of cryptographic protocols.

  17. Blogs for Specific Purposes: Expressivist or Socio-Cognitivist Approach?

    Science.gov (United States)

    Murray, Liam; Hourigan, Triona

    2008-01-01

    This paper represents an earnest attempt to identify specific pedagogical roles for blogs in language learning. After briefly describing various types of blogs and defining their purposes (Herring "et al.", 2005) we attempt to accommodate their position and application within language teaching (Thorne & Scott Payne, 2005), relating evidence from…

  18. Modelling the Heat Consumption in District Heating Systems using a Grey-box approach

    DEFF Research Database (Denmark)

    Nielsen, Henrik Aalborg; Madsen, Henrik

    2006-01-01

    identification of an overall model structure followed by data-based modelling, whereby the details of the model are identified. This approach is sometimes called grey-box modelling, but the specific approach used here does not require states to be specified. Overall, the paper demonstrates the power of the grey-box approach. (c) 2005 Elsevier B.V. All rights reserved.

  20. Szekeres models: a covariant approach

    CERN Document Server

    Apostolopoulos, Pantelis S

    2016-01-01

    We exploit the 1+1+2 formalism to covariantly describe the inhomogeneous and anisotropic Szekeres models. It is shown that an average scale length can be defined covariantly which satisfies a 2d equation of motion driven by the effective gravitational mass (EGM) contained in the dust cloud. The contributions to the EGM are encoded in the energy density of the dust fluid and the free gravitational field $E_{ab}$. In addition, the notions of the Apparent and Absolute Apparent Horizons are briefly discussed and we give an alternative gauge-invariant form to define them in terms of the kinematical variables of the spacelike congruences. We argue that the proposed program can be used to express the Sachs optical equations in covariant form and analyze the confrontation of a spatially inhomogeneous irrotational overdense fluid model with the observational data.

  1. A new approach to adaptive data models

    Directory of Open Access Journals (Sweden)

    Ion LUNGU

    2016-12-01

    Full Text Available Over the last decade, there has been a substantial increase in the volume and complexity of data we collect, store and process. We are now aware of the increasing demand for real time data processing in every continuous business process that evolves within the organization. We witness a shift from a traditional static data approach to a more adaptive model approach. This article aims to extend understanding in the field of data models used in information systems by examining how an adaptive data model approach for managing business processes can help organizations accommodate on the fly and build dynamic capabilities to react in a dynamic environment.

  2. TAVI device selection: time for a patient-specific approach.

    Science.gov (United States)

    Lee, Marcus; Modine, Thomas; Piazza, Nicolo; Mylotte, Darren

    2016-09-18

    Individualised, patient-centred care is a central tenet of modern medicine. The variety of transcatheter heart valves currently available affords the opportunity to select the most appropriate device for each individual patient. Prosthesis selection should be based on operator experience and pre-procedural multimodal three-dimensional imaging. Herein, we outline a number of clinical scenarios where specific transcatheter heart valve technologies have the potential to optimise clinical outcome.

  3. A chemical genetics approach for specific differentiation of stem cells to somatic cells: a new promising therapeutical approach.

    Science.gov (United States)

    Sachinidis, Agapios; Sotiriadou, Isaia; Seelig, Bianca; Berkessel, Albrecht; Hescheler, Jürgen

    2008-01-01

    Cell replacement therapy of severe degenerative diseases such as diabetes, myocardial infarction and Parkinson's disease through transplantation of somatic cells generated from embryonic stem (ES) cells is currently receiving considerable attention for therapeutic applications. ES cells, harvested from the inner cell mass (ICM) of the early embryo, can proliferate indefinitely in vitro while retaining the ability to differentiate into all somatic cells, thereby providing an unlimited renewable source of somatic cells. In this context, identifying soluble factors, in particular chemically synthesized small molecules, and signal cascades involved in specific differentiation processes toward a defined tissue-specific cell type is crucial for optimizing the generation of somatic cells in vitro for therapeutic approaches. However, experimental models are required that allow rapid and "easy-to-handle" parallel screening of chemical libraries to achieve this goal. Recently, the forward chemical genetic screening strategy has been proposed to screen small molecules in cellular systems for a specific desired phenotypic effect. The current review focuses on the progress of ES cell research in the context of chemical genetics to identify small molecules promoting specific differentiation of ES cells to a desired cell phenotype. Chemical genetics in the context of ES cell-based cell replacement therapy remains a challenge for the near future for several scientific fields, including chemistry, molecular biology, medical physics and robotic technologies.

  4. Current approaches to gene regulatory network modelling

    Directory of Open Access Journals (Sweden)

    Brazma Alvis

    2007-09-01

    Full Text Available Abstract Many different approaches have been developed to model and simulate gene regulatory networks. We proposed the following categories for gene regulatory network models: network parts lists, network topology models, network control logic models, and dynamic models. Here we will describe some examples for each of these categories. We will study the topology of gene regulatory networks in yeast in more detail, comparing a direct network derived from transcription factor binding data and an indirect network derived from genome-wide expression data in mutants. Regarding the network dynamics we briefly describe discrete and continuous approaches to network modelling, then describe a hybrid model called Finite State Linear Model and demonstrate that some simple network dynamics can be simulated in this model.

  5. Comparison of approaches for parameter estimation on stochastic models: Generic least squares versus specialized approaches.

    Science.gov (United States)

    Zimmer, Christoph; Sahle, Sven

    2016-04-01

    Parameter estimation for models with intrinsic stochasticity poses specific challenges that do not exist for deterministic models. Therefore, specialized numerical methods for parameter estimation in stochastic models have been developed. Here, we study whether dedicated algorithms for stochastic models are indeed superior to the naive approach of applying the readily available least squares algorithm designed for deterministic models. We compare the performance of the recently developed multiple shooting for stochastic systems (MSS) method designed for parameter estimation in stochastic models, a stochastic differential equations based Bayesian approach and a chemical master equation based technique with the least squares approach for parameter estimation in models of ordinary differential equations (ODE). As test data, 1000 realizations of the stochastic models are simulated. For each realization an estimation is performed with each method, resulting in 1000 estimates for each approach. These are compared with respect to their deviation from the true parameter and, for the genetic toggle switch, also their ability to reproduce the symmetry of the switching behavior. Results are shown for different sets of parameter values of a genetic toggle switch, leading to symmetric and asymmetric switching behavior, as well as an immigration-death and a susceptible-infected-recovered model. This comparison shows that it is important to choose a parameter estimation technique that can treat intrinsic stochasticity, and that the specific choice of this algorithm shows only minor performance differences.

  6. Measurement of the Specific Heat Using a Gravity Cancellation Approach

    Science.gov (United States)

    Zhong, Fang

    2003-01-01

    The specific heat at constant volume C(sub V) of a simple fluid diverges near its liquid-vapor critical point. However, gravity-induced density stratification due to the divergence of isothermal susceptibility hinders the direct comparison of the experimental data with the predictions of renormalization group theory. In the past, a microgravity environment has been considered essential to eliminate the density stratification. We propose to perform specific heat measurements of He-3 on the ground using a method to cancel the density stratification. A He-3 fluid layer will be heated from below, using the thermal expansion of the fluid to cancel the hydrostatic compression. A 6% density stratification at a reduced temperature of 10(exp -5) can be cancelled to better than 0.1% with a steady 1.7 micro K temperature difference across a 0.05 cm thick fluid layer. A conventional AC calorimetry technique will be used to determine the heat capacity. The minimized bulk density stratification with a relaxation time 6500 sec at a reduced temperature of 10(exp -5) will stay unchanged during 1 Hz AC heating. The smear of the specific heat divergence due to the temperature difference across the cell is about 0.1% at a reduced temperature of 10(exp -6). The combination of using High Resolution Thermometry with a 0.5 nK temperature resolution in the AC technique and the cancellation of the density stratification will enable C(sub V) to be measured down to a reduced temperature of 10(exp -6) with less than a 1% systematic error.

  7. Distributed simulation a model driven engineering approach

    CERN Document Server

    Topçu, Okan; Oğuztüzün, Halit; Yilmaz, Levent

    2016-01-01

    Backed by substantive case studies, the novel approach to software engineering for distributed simulation outlined in this text demonstrates the potent synergies between model-driven techniques, simulation, intelligent agents, and computer systems development.

  8. Public sector risk management: a specific model.

    Science.gov (United States)

    Lawlor, Ted

    2002-07-01

    Risk management programs for state mental health authorities are generally limited in scope and reactive in nature. Recent changes in how mental health care is provided render it necessary to redirect the risk management focus from its present institutional basis to a statewide, network-based paradigm that is integrated across public and private inpatient and community programs alike. These changes include treating an increasing number of individuals in less-secure settings and contracting for an increasing number of public mental health services with private providers. The model proposed here is closely linked to the Quality Management Process.

  9. Fundamentals of object-oriented information systems specification and design: the OBLOG/TROLL approach

    Science.gov (United States)

    Ehrich, Hans-Dieter

    1994-12-01

    A survey of concepts for an information system specification is given, based on the viewpoint that an information system is a community of interacting objects. Objects are self-contained units of structure and behavior capable of operating independently and cooperating concurrently. The approach integrates concepts from semantic data modeling and concurrent processes, adopting structuring principles partly developed in the framework of object-orientation and partly in that of abstract data types. The languages OBLOG and TROLL are based on these concepts and their use is illustrated by examples.

  10. A multilevel approach to modeling of porous bioceramics

    Science.gov (United States)

    Mikushina, Valentina A.; Sidorenko, Yury N.

    2015-10-01

    The paper is devoted to a discussion of multiscale models of heterogeneous materials. A specific feature of the approach considered is the use of a geometrical model of the composite's representative volume, which must be generated taking the reinforcement structure of the material into account. Within such a model, different physical processes that influence the effective mechanical properties of the composite may be considered, in particular the process of damage accumulation. It is shown that this approach can be used to predict the macroscopic ultimate strength of the composite. As an example, the particular problem of studying the mechanical properties of a biocomposite, a porous ceramic matrix filled with cortical bone tissue, is discussed.

  11. Polychronous Interpretation of Synoptic, a Domain Specific Modeling Language for Embedded Flight-Software

    CERN Document Server

    Besnard, L; Ouy, J; Talpin, J -P; Bodeveix, J -P; Cortier, A; Pantel, M; Strecker, M; Garcia, G; Rugina, A; Buisson, J; Dagnat, F

    2010-01-01

    The SPaCIFY project, which aims at bringing advances in MDE to the satellite flight software industry, advocates a top-down approach built on a domain-specific modeling language named Synoptic. In line with previous approaches to real-time modeling such as Statecharts and Simulink, Synoptic features hierarchical decomposition of application and control modules in synchronous block diagrams and state machines. Its semantics is described in the polychronous model of computation, which is that of the synchronous language Signal.

  12. Synthesizing Distributed Protocol Specifications from a UML State Machine Modeled Service Specification

    Institute of Scientific and Technical Information of China (English)

    Jehad Al Dallal; Kassem A.Saleh

    2012-01-01

    The object-oriented paradigm is widely applied in designing and implementing communication systems. Unified Modeling Language (UML) is a standard language used to model the design of object-oriented systems. A protocol state machine is a UML adopted diagram that is widely used in designing communication protocols. It has two key attractive advantages over traditional finite state machines: modeling concurrency and modeling nested hierarchical states. In a distributed communication system, each entity of the system has its own protocol that defines when and how the entity exchanges messages with other communicating entities in the system. The order of the exchanged messages must conform to the overall service specifications of the system. In object-oriented systems, both the service and the protocol specifications are modeled in UML protocol state machines. Protocol specification synthesis methods have to be applied to automatically derive the protocol specification from the service specification. Otherwise, a time-consuming process of design, analysis, and error detection and correction has to be applied iteratively until the design of the protocol becomes error-free and consistent with the service specification. Several synthesis methods are proposed in the literature for models other than UML protocol state machines, and therefore, because of the unique features of the protocol state machines, these methods are inapplicable to services modeled in UML protocol state machines. In this paper, we propose a synthesis method that automatically synthesizes the protocol specification of distributed protocol entities from the service specification, given that both types of specifications are modeled in UML protocol state machines. Our method is based on the latest UML version (UML 2.3), and it is proven to synthesize protocol specifications that are syntactically and semantically correct. As an example application, the synthesis method is used to derive the protocol specification of the H.323…

  13. Specificity of continuous auditing approach on information technology internal controls

    Directory of Open Access Journals (Sweden)

    Kaćanski Slobodan

    2012-01-01

    Full Text Available The contemporary business world cannot be imagined without the use of information technology in all aspects of business. The use of information technology in companies' production and non-production activities can greatly facilitate and accelerate operation and control processes. Because of their complexity, these systems possess vulnerable areas and provide space for accidental and intentional frauds that can materially affect the business decisions made by a company's management. Implementation of internal controls can greatly reduce the level of errors that contribute to making wrong decisions. In order to protect the operating system, the company's management implements internal audit to periodically examine the quality of the internal control systems. Since internal audit, by its nature, only periodically checks the quality of internal control systems and information technologies and reports the results to management, a problem arises: the management structures of the business entity may not be informed in a timely manner. To eliminate this problem, management implements a special approach to internal audit, called continuous auditing.

  14. Low temperature specific heat of glasses: a non-extensive approach

    OpenAIRE

    Razdan, Ashok

    2005-01-01

    Specific heat is calculated using Tsallis statistics. It is observed that it is possible to explain some low temperature specific heat properties of glasses using non-extensive approach. A similarity between temperature dependence of non-extensive specific heat and fractal specific heat is also discussed.

  15. A model-driven approach to information security compliance

    Science.gov (United States)

    Correia, Anacleto; Gonçalves, António; Teodoro, M. Filomena

    2017-06-01

    The availability, integrity and confidentiality of information are fundamental to the long-term survival of any organization. Information security is a complex issue that must be approached holistically, combining assets that support corporate systems in an extended network of business partners, vendors, customers and other stakeholders. This paper addresses the conception and implementation of information security systems conforming to the ISO/IEC 27000 set of standards, using the model-driven approach. The process begins with the conception of a domain level model (computation independent model) based on the information security vocabulary present in the ISO/IEC 27001 standard. Based on this model, after embedding mandatory rules for attaining ISO/IEC 27001 conformance, a platform independent model is derived. Finally, a platform specific model serves as the basis for testing the compliance of information security systems with the ISO/IEC 27000 set of standards.

  16. A Set Theoretical Approach to Maturity Models

    DEFF Research Database (Denmark)

    Lasrado, Lester; Vatrapu, Ravi; Andersen, Kim Normann

    2016-01-01

    Maturity Model research in IS has been criticized for the lack of theoretical grounding, methodological rigor, empirical validations, and ignorance of multiple and non-linear paths to maturity. To address these criticisms, this paper proposes a novel set-theoretical approach to maturity models ch...

  17. Modeling diffuse pollution with a distributed approach.

    Science.gov (United States)

    León, L F; Soulis, E D; Kouwen, N; Farquhar, G J

    2002-01-01

    The transferability of parameters for non-point source pollution models to other watersheds, especially those in remote areas without enough data for calibration, is a major problem in diffuse pollution modeling. A water quality component was developed for WATFLOOD (a flood forecast hydrological model) to deal with sediment and nutrient transport. The model uses a distributed group response unit approach for water quantity and quality modeling. Runoff, sediment yield and soluble nutrient concentrations are calculated separately for each land cover class, weighted by area and then routed downstream. The distributed approach for the water quality model for diffuse pollution in agricultural watersheds is described in this paper. Integrating the model with data extracted using GIS technology (Geographical Information Systems) for a local watershed, the model is calibrated for the hydrologic response and validated for the water quality component. With the connection to GIS and the group response unit approach used in this paper, model portability increases substantially, which will improve non-point source modeling at the watershed scale level.
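The group response unit idea described above — computing quantities per land cover class, weighting by area, then combining — can be sketched as follows. The classes and numbers are purely illustrative assumptions, not values from WATFLOOD:

```python
# Area-weighted aggregation over land cover classes (grouped response units).
# All areas, runoff depths and concentrations below are hypothetical.
land_cover = {
    #  class      (area_km2, runoff_mm, nutrient_mg_per_L)
    "cropland":   (12.0, 35.0, 4.2),
    "forest":     (20.0, 12.0, 0.8),
    "urban":      (3.0,  55.0, 2.5),
}

total_area = sum(a for a, _, _ in land_cover.values())

# runoff depth: area-weighted mean over the watershed
runoff = sum(a * r for a, r, _ in land_cover.values()) / total_area

# concentration: flow-weighted, i.e. total load divided by total runoff volume
load = sum(a * r * c for a, r, c in land_cover.values())
concentration = load / sum(a * r for a, r, _ in land_cover.values())
```

Note that the mixed concentration is flow-weighted rather than area-weighted, so high-runoff classes such as urban land dominate the downstream signal even when their area is small.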

  18. LFTOP: An LF-Based Approach to Domain-Specific Reasoning

    Institute of Scientific and Technical Information of China (English)

    Jian-Min Pang; Paul Callaghan; Zhao-Hui Luo

    2005-01-01

    A new approach to domain-specific reasoning is presented that is based on a type-theoretic logical framework (LF) but does not require the user to be an expert in type theory. The concepts of the domain and its related reasoning systems are formalized in LF, but the user works with the system through a syntax and interface appropriate to his/her work. A middle layer provides translation between the user syntax and LF, and allows additional support for reasoning (e.g., model checking). Thus, the complexity of the logical framework is hidden but the benefits of using type theory and its related tools are retained, such as precision and machine-checkable proofs. This approach is investigated through a number of case studies: here, the authors consider the verification of properties of concurrency. The authors have formalized a specification language (CCS) and logic (μ-calculus) in LF, together with useful lemmas, and a user-oriented syntax has been designed. The authors demonstrate the approach with simple examples. However, applying lemmas to objects introduced by the user may result in framework-level objects which cannot be translated back to the user level. The authors discuss this problem, define a notion of adequacy, and prove that in this case study, translation can always be reversed.

  19. MODULAR APPROACH WITH ROUGH DECISION MODELS

    Directory of Open Access Journals (Sweden)

    Ahmed T. Shawky

    2012-09-01

    Full Text Available Decision models which adopt rough set theory have been used effectively in many real world applications. However, rough decision models suffer from high computational complexity when dealing with datasets of huge size. In this research we propose a new rough decision model that allows making decisions based on a modularity mechanism. According to the proposed approach, large-size datasets can be divided into arbitrary moderate-size datasets, then a group of rough decision models can be built as separate decision modules. The overall model decision is computed as the consensus decision of all decision modules through some aggregation technique. This approach provides a flexible and quick way of extracting decision rules from large size information tables using rough decision models.
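A minimal sketch of the modular mechanism described above, in plain Python: the dataset is split into moderate-size chunks, one simple decision module is built per chunk (a stand-in for a rough-set classifier, not an implementation of one), and the overall decision is the majority-vote consensus of the modules. All data and rules are hypothetical:

```python
from collections import Counter

def split_into_modules(rows, module_size):
    """Divide a large dataset into moderate-size chunks, one per module."""
    return [rows[i:i + module_size] for i in range(0, len(rows), module_size)]

def train_module(rows):
    """Stand-in for building a rough decision model: predict the
    majority label seen for each attribute value in this chunk."""
    table = {}
    for attr, label in rows:
        table.setdefault(attr, Counter())[label] += 1
    return {attr: c.most_common(1)[0][0] for attr, c in table.items()}

def consensus(modules, attr):
    """Aggregate module decisions by majority vote."""
    votes = Counter(m[attr] for m in modules if attr in m)
    return votes.most_common(1)[0][0]

rows = ([("low", "reject")] * 6 + [("low", "accept")] * 2
        + [("high", "accept")] * 8)
modules = [train_module(chunk) for chunk in split_into_modules(rows, 4)]
```

Majority voting is only one possible aggregation technique; the abstract leaves the choice open, and weighted or confidence-based consensus would slot into `consensus` the same way.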

  1. Analysis of Geometrical Specification Model Based on the New Geometrical Product Specification Language

    Institute of Scientific and Technical Information of China (English)

    马利民; 王金星; 蒋向前; 李柱; 徐振高

    2004-01-01

    Geometrical Product Specification and verification (GPS) is an ISO standard system covering standards of size, dimension, geometrical tolerance and surface texture of geometrical products. ISO/TC213 on the GPS has been working towards coordination of the previous standards in tolerance and related metrology in order to publish the next generation of the GPS language. This paper introduces the geometrical product specification model for design, manufacturing and verification based on the improved GPS and its new concepts, i.e., surface models, geometrical features and operations. An application example for the geometrical product specification model is then given.

  2. Modeling approach suitable for energy system

    Energy Technology Data Exchange (ETDEWEB)

    Goetschel, D. V.

    1979-01-01

    Recently increased attention has been placed on optimization problems related to the determination and analysis of operating strategies for energy systems. Presented in this paper is a nonlinear model that can be used in the formulation of certain energy-conversion systems-modeling problems. The model lends itself nicely to solution approaches based on nonlinear-programming algorithms and, in particular, to those methods falling into the class of variable metric algorithms for nonlinearly constrained optimization.

  3. Stormwater infiltration trenches: a conceptual modelling approach.

    Science.gov (United States)

    Freni, Gabriele; Mannina, Giorgio; Viviani, Gaspare

    2009-01-01

    In recent years, limitations of traditional urban drainage schemes have been pointed out, and new approaches are being developed that introduce more natural methods for retaining and/or disposing of stormwater. These mitigation measures are generally called Best Management Practices or Sustainable Urban Drainage Systems, and they include practices such as infiltration and storage tanks intended to reduce the peak flow and retain part of the polluting components. The introduction of such practices in urban drainage systems entails an upgrade of existing modelling frameworks in order to evaluate their efficiency in mitigating the impact of urban drainage systems on receiving water bodies. While storage tank modelling approaches are quite well documented in the literature, gaps remain for infiltration facilities, mainly owing to the complexity of the physical processes involved. In this study, a simplified conceptual modelling approach for the simulation of infiltration trenches is presented. The main goal is to develop a model that can be employed to assess the mitigation efficiency of infiltration trenches in an integrated urban drainage context. Particular care was given to the simulation of infiltration structures, considering the performance reduction due to clogging phenomena. The proposed model has been compared with other simplified modelling approaches and with a physically based model adopted as a benchmark; it performed better than the other simplified approaches for both unclogged facilities and the effect of clogging. On the basis of a long-term simulation of six years of rain data, the performance and effectiveness of an infiltration trench measure are assessed. The study confirmed the important role played by the clogging phenomenon in such infiltration structures.

  4. Building Water Models, A Different Approach

    CERN Document Server

    Izadi, Saeed; Onufriev, Alexey V

    2014-01-01

    Simplified, classical models of water are an integral part of atomistic molecular simulations, especially in biology and chemistry where hydration effects are critical. Yet, despite several decades of effort, these models are still far from perfect. Presented here is an alternative approach to constructing point charge water models - currently, the most commonly used type. In contrast to the conventional approach, we do not impose any geometry constraints on the model other than symmetry. Instead, we optimize the distribution of point charges to best describe the "electrostatics" of the water molecule, which is key to many unusual properties of liquid water. The search for the optimal charge distribution is performed in 2D parameter space of key lowest multipole moments of the model, to find best fit to a small set of bulk water properties at room temperature. A virtually exhaustive search is enabled via analytical equations that relate the charge distribution to the multipole moments. The resulting "optimal"...
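The central idea above — choosing point charges to reproduce the molecule's low-order multipole moments — can be illustrated by computing the dipole moment of a rigid three-site charge distribution. The geometry and charges below are TIP3P-like values quoted from memory and should be treated as assumptions, not as the model proposed in the abstract:

```python
import numpy as np

# Three-site point-charge water (TIP3P-like, illustrative values):
# O-H bond length 0.9572 Angstrom, H-O-H angle 104.52 degrees.
q_O, q_H = -0.834, 0.417                       # charges in units of e
theta = np.radians(104.52 / 2.0)               # half the H-O-H angle
r_O = np.array([0.0, 0.0, 0.0])
r_H1 = 0.9572 * np.array([np.cos(theta),  np.sin(theta), 0.0])
r_H2 = 0.9572 * np.array([np.cos(theta), -np.sin(theta), 0.0])

# Dipole moment: mu = sum_i q_i * r_i (net charge is zero, so the
# result is independent of the choice of origin)
mu = q_O * r_O + q_H * r_H1 + q_H * r_H2       # in e*Angstrom
mu_debye = np.linalg.norm(mu) * 4.803          # 1 e*Angstrom ~ 4.803 Debye
```

The optimization described in the abstract would tune such a charge distribution so that the dipole and higher moments best reproduce a target set of bulk water properties, rather than fixing the geometry a priori.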

  5. Characterizing economic trends by Bayesian stochastic model specification search

    DEFF Research Database (Denmark)

    Grassi, Stefano; Proietti, Tommaso

    We extend a recently proposed Bayesian model selection technique, known as stochastic model specification search, for characterising the nature of the trend in macroeconomic time series. In particular, we focus on autoregressive models with possibly time-varying intercept and slope and decide on whether their parameters are fixed or evolutive. Stochastic model specification is carried out to discriminate two alternative hypotheses concerning the generation of trends: the trend-stationary hypothesis, on the one hand, for which the trend is a deterministic function of time and the short run …, estimated by a suitable Gibbs sampling scheme, provides useful insight on the quasi-integrated nature of the specifications selected.

  6. Towards new approaches in phenological modelling

    Science.gov (United States)

    Chmielewski, Frank-M.; Götz, Klaus-P.; Rawel, Harshard M.; Homann, Thomas

    2014-05-01

    Modelling of phenological stages has been based on temperature sums for many decades, describing both the chilling and the forcing requirement of woody plants until the beginning of leafing or flowering. Parts of this approach go back to Reaumur (1735), who originally proposed the concept of growing degree-days. Now, there is a growing body of opinion that asks for new methods in phenological modelling and more in-depth studies on dormancy release of woody plants. This requirement is easily understandable if we consider the wide application of phenological models, which can even affect the results of climate models. To this day, a number of parameters in phenological models still need to be optimised on observations, although some basic physiological knowledge of the chilling and forcing requirement of plants is already considered in these approaches (semi-mechanistic models). Limiting for a fundamental improvement of these models is the lack of knowledge about the course of dormancy in woody plants, which cannot be directly observed and which is also insufficiently described in the literature. Modern metabolomic methods provide a solution to this problem and allow both the validation of currently used phenological models and the development of mechanistic approaches. In order to develop this kind of model, changes in metabolites (concentration, temporal course) must be set in relation to the variability of environmental (steering) parameters (weather, day length, etc.). This necessarily requires multi-year (3-5 yr.) and high-resolution (weekly probes between autumn and spring) data. The feasibility of this approach has already been tested in a 3-year pilot study on sweet cherries. Our suggested methodology is not limited to the flowering of fruit trees; it can also be applied to tree species of the natural vegetation, where even greater deficits in phenological modelling exist.
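The temperature-sum (growing degree-day) approach that the abstract takes as its starting point can be sketched in a few lines; the base temperature, forcing requirement and daily means below are illustrative assumptions:

```python
def growing_degree_days(daily_mean_temps, base=5.0):
    """Classic forcing accumulation after Reaumur: sum of daily mean
    temperatures above a base threshold (base value is illustrative)."""
    return sum(max(t - base, 0.0) for t in daily_mean_temps)

def day_of_forcing_fulfilled(daily_mean_temps, requirement, base=5.0):
    """First day (1-indexed) on which accumulated forcing reaches a
    hypothetical species-specific requirement; None if never reached."""
    total = 0.0
    for day, t in enumerate(daily_mean_temps, start=1):
        total += max(t - base, 0.0)
        if total >= requirement:
            return day
    return None

temps = [2.0, 4.0, 7.0, 9.0, 12.0, 11.0, 6.0]   # hypothetical daily means, deg C
```

The mechanistic models the authors call for would replace the fixed `base` and `requirement` parameters with quantities tied to observed metabolite dynamics during dormancy.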

  7. Modelling subject-specific childhood growth using linear mixed-effect models with cubic regression splines.

    Science.gov (United States)

    Grajeda, Laura M; Ivanescu, Andrada; Saito, Mayuko; Crainiceanu, Ciprian; Jaganath, Devan; Gilman, Robert H; Crabtree, Jean E; Kelleher, Dermott; Cabrera, Lilia; Cama, Vitaliano; Checkley, William

    2016-01-01

    Childhood growth is a cornerstone of pediatric research. Statistical models need to consider individual trajectories to adequately describe growth outcomes. Specifically, well-defined longitudinal models are essential to characterize both population and subject-specific growth. Linear mixed-effect models with cubic regression splines can account for the nonlinearity of growth curves and provide reasonable estimators of population and subject-specific growth, velocity and acceleration. We provide a stepwise approach that builds from simple to complex models, and account for the intrinsic complexity of the data. We start with standard cubic splines regression models and build up to a model that includes subject-specific random intercepts and slopes and residual autocorrelation. We then compared cubic regression splines vis-à-vis linear piecewise splines, and with varying number of knots and positions. Statistical code is provided to ensure reproducibility and improve dissemination of methods. Models are applied to longitudinal height measurements in a cohort of 215 Peruvian children followed from birth until their fourth year of life. Unexplained variability, as measured by the variance of the regression model, was reduced from 7.34 when using ordinary least squares to 0.81 (p linear mixed-effect models with random slopes and a first order continuous autoregressive error term. There was substantial heterogeneity in both the intercept (p linear regression equation for both estimation and prediction of population- and individual-level growth in height. We show that cubic regression splines are superior to linear regression splines for the case of a small number of knots in both estimation and prediction with the full linear mixed effect model (AIC 19,352 vs. 19,598, respectively). While the regression parameters are more complex to interpret in the former, we argue that inference for any problem depends more on the estimated curve or differences in curves rather
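A population-level version of the cubic regression spline fit can be sketched with a truncated-power basis; the knot positions and the synthetic growth data below are invented for illustration, and the subject-specific random effects and autoregressive errors of the full model would require a mixed-model package (e.g. statsmodels MixedLM):

```python
import numpy as np

def cubic_spline_basis(x, knots):
    """Truncated-power cubic spline basis: [1, x, x^2, x^3, (x - k)_+^3 per knot]."""
    x = np.asarray(x, dtype=float)
    cols = [np.ones_like(x), x, x ** 2, x ** 3]
    cols += [np.clip(x - k, 0.0, None) ** 3 for k in knots]
    return np.column_stack(cols)

# Synthetic age (years) vs. height (cm) data -- invented, not the Peruvian cohort.
age = np.linspace(0.0, 4.0, 50)
height = 50.0 + 30.0 * np.sqrt(age) + np.random.default_rng(0).normal(0.0, 1.0, 50)

# Population-level fit by ordinary least squares; knot positions are illustrative.
X = cubic_spline_basis(age, knots=[1.0, 2.0, 3.0])
beta, *_ = np.linalg.lstsq(X, height, rcond=None)
fitted = X @ beta
```

The design matrix has 4 polynomial columns plus one column per knot; adding subject-specific random intercepts and slopes then amounts to letting the low-order coefficients vary per child.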

  8. Position-specific isotope modeling of organic micropollutants transformation through different reaction pathways

    DEFF Research Database (Denmark)

    Jin, Biao; Rolle, Massimo

    2016-01-01

    The degradation of organic micropollutants occurs via different reaction pathways. Compound specific isotope analysis is a valuable tool to identify such degradation pathways in different environmental systems. We propose a mechanism-based modeling approach that provides a quantitative framework...... to simultaneously evaluate concentration as well as bulk and position-specific multi-element isotope evolution during the transformation of organic micropollutants. The model explicitly simulates position-specific isotopologues for those atoms that experience isotope effects and, thereby, provides a mechanistic...... description of isotope fractionation occurring at different molecular positions. To demonstrate specific features of the modeling approach, we simulated the degradation of three selected organic micropollutants: dichlorobenzamide (BAM), isoproturon (IPU) and diclofenac (DCF). The model accurately reproduces...

  9. 3D Image Modelling and Specific Treatments in Orthodontics Domain

    Directory of Open Access Journals (Sweden)

    Dionysis Goularas

    2007-01-01

Full Text Available In this article, we present a 3D dental plaster treatment system specific to orthodontics. From computed tomography scanner images, we first propose a 3D image modelling and reconstruction method for the Mandible and Maxillary, based on an adaptive triangulation that manages contours with complex topologies. Secondly, we present two specific treatment methods applied directly to the obtained 3D model: automatic correction of the occlusion of the Mandible and the Maxillary, and teeth segmentation allowing more specific dental examinations. Finally, these specific treatments are presented via a client/server application with the aim of enabling telediagnosis and treatment.

  10. Modelling Coagulation Systems: A Stochastic Approach

    CERN Document Server

    Ryazanov, V V

    2011-01-01

    A general stochastic approach to the description of coagulating aerosol system is developed. As the object of description one can consider arbitrary mesoscopic values (number of aerosol clusters, their size etc). The birth-and-death formalism for a number of clusters can be regarded as a partial case of the generalized storage model. An application of the storage model to the number of monomers in a cluster is discussed.
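A minimal birth-and-death simulation of the cluster count, as a toy instance of the stochastic description above (the rates are illustrative, not taken from the paper):

```python
import random

def birth_death_path(n0, birth_rate, death_rate, t_max, seed=42):
    """Gillespie-style simulation of a birth-and-death process for a cluster
    count N(t): births at constant rate `birth_rate`, deaths at per-capita
    rate `death_rate`. Returns the list of (time, count) jump points.
    """
    rng = random.Random(seed)
    t, n = 0.0, n0
    path = [(t, n)]
    while t < t_max:
        total = birth_rate + death_rate * n
        if total <= 0:
            break
        t += rng.expovariate(total)          # waiting time to the next event
        if rng.random() < birth_rate / total:
            n += 1                           # a new cluster is born
        else:
            n -= 1                           # an existing cluster disappears
        path.append((t, n))
    return path

# Illustrative rates; the stationary mean count is birth_rate / death_rate = 20.
path = birth_death_path(n0=10, birth_rate=2.0, death_rate=0.1, t_max=100.0)
```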

  11. Contextual Information and Specific Language Models for Spoken Language Understanding

    CERN Document Server

    Baggia, P; Gerbino, E; Moisa, L M; Popovici, C; Baggia, Paolo; Danieli, Morena; Gerbino, Elisabetta; Moisa, Loreta M.; Popovici, Cosmin

    1999-01-01

In this paper we explain how contextual expectations are generated and used in the task-oriented spoken language understanding system Dialogos. The hard task of recognizing spontaneous speech on the telephone may greatly benefit from the use of specific language models during the recognition of callers' utterances. By 'specific language models' we mean a set of language models that are trained on contextually appropriate data, and that are used during different states of the dialogue on the basis of the information sent to the acoustic level by the dialogue management module. In this paper we describe how the specific language models are obtained on the basis of contextual information. The experimental results we report show that recognition and understanding performance are improved thanks to the use of specific language models.
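The idea of switching language models by dialogue state can be illustrated with toy unigram models; the states and training sentences below are invented for illustration and have nothing to do with the actual Dialogos corpus:

```python
from collections import Counter

class ContextSpecificLM:
    """Toy unigram language models keyed by dialogue state.

    The dialogue manager picks which model scores a recognized utterance;
    states and training sentences are invented for illustration.
    """

    def __init__(self):
        self.models = {}

    def train(self, state, sentences):
        counts = Counter(w for s in sentences for w in s.split())
        total = sum(counts.values())
        self.models[state] = {w: c / total for w, c in counts.items()}

    def score(self, state, sentence, floor=1e-6):
        # Unseen words get a small floor probability instead of zero.
        model = self.models[state]
        prob = 1.0
        for w in sentence.split():
            prob *= model.get(w, floor)
        return prob

lm = ContextSpecificLM()
lm.train("ask_date", ["monday morning", "tuesday evening", "monday evening"])
lm.train("ask_city", ["to rome", "to milan", "from turin to rome"])
```

An utterance such as "monday evening" then scores far higher under the model for the date-asking state than under the city-asking one.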

  12. A Multiple Model Approach to Modeling Based on LPF Algorithm

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

Input-output data fitting methods are often used for modeling nonlinear systems of unknown structure. Based on model-on-demand tactics, a multiple model approach to modeling nonlinear systems is presented. The basic idea is to find, from vast historical system input-output data sets, the data sets matching the current working point, and then to develop a local model using the Local Polynomial Fitting (LPF) algorithm. As the working point changes, multiple local models are built, which together realize accurate modeling of the global system. Compared with other methods, the simulation results show good performance: the estimation is simple, effective and reliable.
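A minimal sketch of the model-on-demand idea with a local polynomial fit: historical points near the current working point are weighted by a kernel, and a low-degree polynomial is fitted locally (the bandwidth and degree are illustrative choices, not the paper's settings):

```python
import numpy as np

def lpf_predict(x_query, x_hist, y_hist, bandwidth=0.5, degree=1):
    """Model-on-demand prediction via local polynomial fitting.

    Historical points near the query get a high Gaussian-kernel weight; a
    weighted polynomial of low degree is fitted and evaluated at the query.
    """
    x_hist = np.asarray(x_hist, dtype=float)
    y_hist = np.asarray(y_hist, dtype=float)
    w = np.exp(-0.5 * ((x_hist - x_query) / bandwidth) ** 2)
    X = np.vander(x_hist - x_query, degree + 1, increasing=True)
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y_hist)
    return beta[0]  # intercept = fitted value at x_query

# Historical input-output data from a nonlinear system (here y = sin(x)).
x_hist = np.linspace(0.0, 6.0, 60)
y_hist = np.sin(x_hist)
y_at_3 = lpf_predict(3.0, x_hist, y_hist)  # close to sin(3) ~= 0.1411
```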

  13. Towards patient-specific modelling of lesion formation during radiofrequency catheter ablation for atrial fibrillation

    Science.gov (United States)

    Soor, Navjeevan; Morgan, Ross; Varela, Marta; Aslanidi, Oleg V.

    2017-01-01

Radiofrequency catheter ablation procedures are a first-line method of clinical treatment for atrial fibrillation. However, they suffer from suboptimal success rates and are also prone to potentially serious adverse effects. These limitations can be at least partially attributed to inter- and intra-patient variations in atrial wall thickness, and could be mitigated by patient-specific approaches to the procedure. In this study, a modelling approach to optimising ablation procedures in subject-specific 3D atrial geometries was applied. The approach enabled the evaluation of optimal ablation times to create lesions for a given wall thickness measured from MRI. A nonlinear relationship was revealed between wall thickness and the catheter contact time required for fully transmural lesions. Hence, our approach based on MRI reconstruction of the atrial wall combined with subject-specific modelling of ablation can provide useful information for improving clinical procedures. PMID:28261003

  14. On the multistream approach of relativistic Weibel instability. I. Linear analysis and specific illustrations

    Energy Technology Data Exchange (ETDEWEB)

    Ghizzo, A.; Bertrand, P. [Institut Jean Lamour-UMR 7168, University of Lorraine, BP 239 F-54506 Vandoeuvre les Nancy (France)

    2013-08-15

A one-dimensional multistream formalism is extended for the study of temperature anisotropy driven Weibel-type instabilities in collisionless and relativistic plasma. The formulation is based on a Hamiltonian reduction technique using the invariance of generalized canonical momentum in the transverse direction. The Vlasov-Maxwell model is expressed in terms of an ensemble of one-dimensional Vlasov-type equations, coupled together with the Maxwell equations in a self-consistent way. Although the model is fundamentally nonlinear, this first of three companion papers focuses on the linear aspect. Dispersion relations of the Weibel instability are derived in the linear regime for different kinds of polarization of the electromagnetic potential vector. The model allows new unexpected insights on the instability: enhanced growth rates for the Weibel instability are predicted when a dissymmetric distribution is considered in p⊥. In the case of a circular polarization, a simplification of the linear analysis can be obtained by the introduction of the “multiring” approach, allowing the analytical model of Yoon and Davidson [Phys. Rev. A 35, 2718 (1987)] to be extended. Applications of this model are left to the other two papers of the series, where specific problems are addressed pertaining to the nonlinear and relativistic dynamics of magnetically trapped particles met in the saturation regime of the Weibel instability.

  15. A transformation approach for collaboration based requirement models

    CERN Document Server

    Harbouche, Ahmed; Mokhtari, Aicha

    2012-01-01

    Distributed software engineering is widely recognized as a complex task. Among the inherent complexities is the process of obtaining a system design from its global requirement specification. This paper deals with such transformation process and suggests an approach to derive the behavior of a given system components, in the form of distributed Finite State Machines, from the global system requirements, in the form of an augmented UML Activity Diagrams notation. The process of the suggested approach is summarized in three steps: the definition of the appropriate source Meta-Model (requirements Meta-Model), the definition of the target Design Meta-Model and the definition of the rules to govern the transformation during the derivation process. The derivation process transforms the global system requirements described as UML diagram activities (extended with collaborations) to system roles behaviors represented as UML finite state machines. The approach is implemented using Atlas Transformation Language (ATL).

  16. A TRANSFORMATION APPROACH FOR COLLABORATION BASED REQUIREMENT MODELS

    Directory of Open Access Journals (Sweden)

    Ahmed Harbouche

    2012-02-01

Full Text Available Distributed software engineering is widely recognized as a complex task. Among the inherent complexities is the process of obtaining a system design from its global requirement specification. This paper deals with such transformation process and suggests an approach to derive the behavior of a given system components, in the form of distributed Finite State Machines, from the global system requirements, in the form of an augmented UML Activity Diagrams notation. The process of the suggested approach is summarized in three steps: the definition of the appropriate source Meta-Model (requirements Meta-Model), the definition of the target Design Meta-Model and the definition of the rules to govern the transformation during the derivation process. The derivation process transforms the global system requirements described as UML diagram activities (extended with collaborations) to system roles behaviors represented as UML finite state machines. The approach is implemented using Atlas Transformation Language (ATL).

  17. Context-Specific Metabolic Model Extraction Based on Regularized Least Squares Optimization.

    Directory of Open Access Journals (Sweden)

    Semidán Robaina Estévez

Full Text Available Genome-scale metabolic models have proven highly valuable in investigating cell physiology. Recent advances include the development of methods to extract context-specific models capable of describing metabolism under more specific scenarios (e.g., cell types). Yet, none of the existing computational approaches allows for a fully automated model extraction and determination of a flux distribution independent of user-defined parameters. Here we present RegrEx, a fully automated approach that relies solely on context-specific data and ℓ1-norm regularization to extract a context-specific model and to provide a flux distribution that maximizes its correlation to data. Moreover, the publicly available implementation of RegrEx was used to extract 11 context-specific human models using publicly available RNAseq expression profiles, Recon1 and also Recon2, the most recent human metabolic model. The comparison of the performance of RegrEx and its contending alternatives demonstrates that the proposed method extracts models for which both the structure, i.e., reactions included, and the flux distributions are in concordance with the employed data. These findings are supported by validation and comparison of method performance on additional data not used in context-specific model extraction. Therefore, our study lays the groundwork for applications of other regularization techniques in large-scale metabolic modeling.
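The ℓ1-regularized least-squares core of such an approach can be sketched with plain proximal-gradient descent (ISTA); this is a generic sketch, not RegrEx itself, and the flux-balance constraints of the actual method are omitted:

```python
import numpy as np

def ista_lasso(A, b, lam, n_iter=2000):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by proximal gradient (ISTA)."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2       # 1/L, L = Lipschitz constant
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        z = x - step * (A.T @ (A @ x - b))       # gradient step on the smooth part
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft threshold
    return x

# Synthetic sparse-recovery demo: two nonzero coefficients out of twenty.
rng = np.random.default_rng(1)
A = rng.normal(size=(50, 20))
x_true = np.zeros(20)
x_true[2], x_true[7] = 3.0, -2.0
x_hat = ista_lasso(A, A @ x_true, lam=0.1)
```

The ℓ1 penalty drives most coefficients exactly to zero, which is the property exploited to keep only the reactions supported by the context-specific data.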

  18. Post-16 Biology--Some Model Approaches?

    Science.gov (United States)

    Lock, Roger

    1997-01-01

    Outlines alternative approaches to the teaching of difficult concepts in A-level biology which may help student learning by making abstract ideas more concrete and accessible. Examples include models, posters, and poems for illustrating meiosis, mitosis, genetic mutations, and protein synthesis. (DDR)

  19. Hybrid parallel execution model for logic-based specification languages

    CERN Document Server

    Tsai, Jeffrey J P

    2001-01-01

Parallel processing is a very important technique for improving the performance of various software development and maintenance activities. The purpose of this book is to introduce important techniques for the parallel execution of high-level specifications of software systems. These techniques are very useful for the construction, analysis, and transformation of reliable large-scale and complex software systems. Contents: Current Approaches; Overview of the New Approach; FRORL Requirements Specification Language and Its Decomposition; Rewriting and Data Dependency, Control Flow Analysis of a Lo

  20. An algebraic approach to modeling in software engineering

    Energy Technology Data Exchange (ETDEWEB)

    Loegel, G.J. [Superconducting Super Collider Lab., Dallas, TX (United States)]|[Michigan Univ., Ann Arbor, MI (United States); Ravishankar, C.V. [Michigan Univ., Ann Arbor, MI (United States)

    1993-09-01

Our work couples the formalism of universal algebras with the engineering techniques of mathematical modeling to develop a new approach to the software engineering process. Our purpose in using this combination is twofold. First, abstract data types and their specification using universal algebras can be considered a common point between the practical requirements of software engineering and the formal specification of software systems. Second, mathematical modeling principles provide us with a means for effectively analyzing real-world systems. We first use modeling techniques to analyze a system and then represent the analysis using universal algebras. The rest of the software engineering process exploits properties of universal algebras that preserve the structure of our original model. This paper describes our software engineering process and our experience using it on both research and commercial systems. We need a new approach because current software engineering practices often deliver software that is difficult to develop and maintain. Formal software engineering approaches use universal algebras to describe "computer science" objects like abstract data types, but in practice software errors are often caused because "real-world" objects are improperly modeled. There is a large semantic gap between the customer's objects and abstract data types. In contrast, mathematical modeling uses engineering techniques to construct valid models for real-world systems, but these models are often implemented in an ad hoc manner. A combination of the best features of both approaches would enable software engineering to formally specify and develop software systems that better model real systems. Software engineering, like mathematical modeling, should concern itself first and foremost with understanding a real system and its behavior under given circumstances, and then with expressing this knowledge in an executable form.

  1. Decomposition approach to model smart suspension struts

    Science.gov (United States)

    Song, Xubin

    2008-10-01

Modeling and simulation studies are the starting point for engineering design and development, especially for developing vehicle control systems. This paper presents a methodology for building models for the application of smart struts to vehicle suspension control development. The modeling approach is based on decomposition of the testing data. Per the strut functions, the data is dissected according to both control and physical variables. Then the data sets are characterized to represent different aspects of the strut working behaviors. Next, different mathematical equations can be built and optimized to best fit the corresponding data sets, respectively. In this way, the model optimization can be facilitated, compared with the traditional approach of finding a globally optimal set of model parameters for a complicated nonlinear model from a series of testing data. Finally, two struts are introduced as examples for this modeling study: magneto-rheological (MR) dampers and compressible fluid (CF) based struts. The model validation shows that this methodology can truly capture macro-behaviors of these struts.

  2. Heat transfer modeling an inductive approach

    CERN Document Server

    Sidebotham, George

    2015-01-01

    This innovative text emphasizes a "less-is-more" approach to modeling complicated systems such as heat transfer by treating them first as "1-node lumped models" that yield simple closed-form solutions. The author develops numerical techniques for students to obtain more detail, but also trains them to use the techniques only when simpler approaches fail. Covering all essential methods offered in traditional texts, but with a different order, Professor Sidebotham stresses inductive thinking and problem solving as well as a constructive understanding of modern, computer-based practice. Readers learn to develop their own code in the context of the material, rather than just how to use packaged software, offering a deeper, intrinsic grasp behind models of heat transfer. Developed from over twenty-five years of lecture notes to teach students of mechanical and chemical engineering at The Cooper Union for the Advancement of Science and Art, the book is ideal for students and practitioners across engineering discipl...

  3. Modeling and Simulation. II. Specificity Models for Visual Cortex Development.

    Science.gov (United States)

    1986-12-12

Excitation and Inhibition: excitation in VC comes via the specific thalamic afferents, spiny stellate interneurons, and collaterals of local... (the remainder of this scanned abstract is illegible apart from reference fragments, e.g. "Receptive field properties of EPSPs and IPSPs in cat visual cortex," Soc. Neurosci. Abstr. 10, 521, 1984, and "Freeman, P. D. and A. B. Bonds").

  4. On Automatic Modeling and Use of Domain-specific Ontologies

    DEFF Research Database (Denmark)

    Andreasen, Troels; Knappe, Rasmus; Bulskov, Henrik

    2005-01-01

    is a specific lattice-based concept algebraic language by which ontologies are inherently generative. The modeling of a domain specific ontology is based on a general ontology built upon common knowledge resources as dictionaries and thesauri. Based on analysis of concept occurrences in the object document......-based navigation. Finally, a measure of concept similarity is derived from the domain specific ontology based on occurrences, commonalities, and distances in the ontology....

  5. Building enterprise reuse program--A model-based approach

    Institute of Scientific and Technical Information of China (English)

    梅宏; 杨芙清

    2002-01-01

Reuse is viewed as a realistically effective approach to solving the software crisis. For an organization that wants to build a reuse program, technical and non-technical issues must be considered in parallel. In this paper, a model-based approach to building a systematic reuse program is presented. Component-based reuse is currently a dominant approach to software reuse. In this approach, building the right reusable component model is the first important step. In order to achieve systematic reuse, a set of component models should be built from different perspectives. Each of these models gives a specific view of the components so as to satisfy the different needs of the different persons involved in the enterprise reuse program. There already exist some component models for reuse from technical perspectives, but less attention has been paid to reusable components from a non-technical view, especially from the view of process and management. In our approach, a reusable component model--the FLP model for reusable components--is introduced. This model describes components along three dimensions (Form, Level, and Presentation) and views components and their relationships from the perspective of process and management. It determines the sphere of reusable components, the points in the development process at which components are reused, and the means needed to present components in terms of abstraction level, logical granularity and presentation media. Being the basis on which management and technical decisions are made, our model will be used as the kernel model to initialize and normalize a systematic enterprise reuse program.

  6. Teaching Sustainability Using an Active Learning Constructivist Approach: Discipline-Specific Case Studies in Higher Education

    Directory of Open Access Journals (Sweden)

    Maria Kalamas Hedden

    2017-07-01

Full Text Available In this paper we present our rationale for using an active learning constructivist approach to teach sustainability-related topics in higher education. To push the boundaries of ecological literacy, we also develop a theoretical model for sustainability knowledge co-creation. Drawing on the experiences of faculty at a major Southeastern University in the United States, we present case studies in architecture, engineering, geography, and marketing. Four Sustainability Faculty Fellows describe their discipline-specific case studies, all of which are project-based learning experiences, and include details regarding teaching and assessment. Easily replicated in other educational contexts, these case studies contribute to the advancement of sustainability education.

  7. A Bayesian Shrinkage Approach for AMMI Models.

    Science.gov (United States)

    da Silva, Carlos Pereira; de Oliveira, Luciano Antonio; Nuvunga, Joel Jorge; Pamplona, Andrezza Kéllen Alves; Balestre, Marcio

    2015-01-01

    Linear-bilinear models, especially the additive main effects and multiplicative interaction (AMMI) model, are widely applicable to genotype-by-environment interaction (GEI) studies in plant breeding programs. These models allow a parsimonious modeling of GE interactions, retaining a small number of principal components in the analysis. However, one aspect of the AMMI model that is still debated is the selection criteria for determining the number of multiplicative terms required to describe the GE interaction pattern. Shrinkage estimators have been proposed as selection criteria for the GE interaction components. In this study, a Bayesian approach was combined with the AMMI model with shrinkage estimators for the principal components. A total of 55 maize genotypes were evaluated in nine different environments using a complete blocks design with three replicates. The results show that the traditional Bayesian AMMI model produces low shrinkage of singular values but avoids the usual pitfalls in determining the credible intervals in the biplot. On the other hand, Bayesian shrinkage AMMI models have difficulty with the credible interval for model parameters, but produce stronger shrinkage of the principal components, converging to GE matrices that have more shrinkage than those obtained using mixed models. This characteristic allowed more parsimonious models to be chosen, and resulted in models being selected that were similar to those obtained by the Cornelius F-test (α = 0.05) in traditional AMMI models and cross validation based on leave-one-out. This characteristic allowed more parsimonious models to be chosen and more GEI pattern retained on the first two components. The resulting model chosen by posterior distribution of singular value was also similar to those produced by the cross-validation approach in traditional AMMI models. Our method enables the estimation of credible interval for AMMI biplot plus the choice of AMMI model based on direct posterior
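The multiplicative part of an AMMI model is, at its core, an SVD of the double-centered genotype-by-environment table; a minimal sketch, without the shrinkage or Bayesian machinery discussed above, and with a small made-up data matrix:

```python
import numpy as np

def ammi_decompose(Y, n_components=2):
    """Fit the multiplicative part of an AMMI model by SVD.

    Y: genotype-by-environment matrix of means. Main effects are removed by
    double-centering; the interaction is approximated by the leading singular
    triplets. No shrinkage of singular values is applied here.
    """
    Y = np.asarray(Y, dtype=float)
    grand = Y.mean()
    g_eff = Y.mean(axis=1, keepdims=True) - grand   # genotype main effects
    e_eff = Y.mean(axis=0, keepdims=True) - grand   # environment main effects
    interaction = Y - grand - g_eff - e_eff         # double-centered residual
    U, s, Vt = np.linalg.svd(interaction, full_matrices=False)
    k = n_components
    approx = (U[:, :k] * s[:k]) @ Vt[:k]
    return grand, g_eff.ravel(), e_eff.ravel(), s, approx

# Small made-up table of trait means: 4 genotypes x 3 environments.
Y = np.array([[4.0, 5.0, 6.0],
              [3.0, 7.0, 5.0],
              [8.0, 6.0, 7.0],
              [5.0, 4.0, 9.0]])
grand, g_eff, e_eff, s, approx = ammi_decompose(Y, n_components=2)
```

Because the double-centered interaction of a 4 x 3 table has rank at most 2, two components reproduce it exactly; the debated question in the abstract is how many components to retain for real, noisy data.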

  8. A Bayesian Shrinkage Approach for AMMI Models.

    Directory of Open Access Journals (Sweden)

    Carlos Pereira da Silva

Full Text Available Linear-bilinear models, especially the additive main effects and multiplicative interaction (AMMI) model, are widely applicable to genotype-by-environment interaction (GEI) studies in plant breeding programs. These models allow a parsimonious modeling of GE interactions, retaining a small number of principal components in the analysis. However, one aspect of the AMMI model that is still debated is the selection criteria for determining the number of multiplicative terms required to describe the GE interaction pattern. Shrinkage estimators have been proposed as selection criteria for the GE interaction components. In this study, a Bayesian approach was combined with the AMMI model with shrinkage estimators for the principal components. A total of 55 maize genotypes were evaluated in nine different environments using a complete blocks design with three replicates. The results show that the traditional Bayesian AMMI model produces low shrinkage of singular values but avoids the usual pitfalls in determining the credible intervals in the biplot. On the other hand, Bayesian shrinkage AMMI models have difficulty with the credible interval for model parameters, but produce stronger shrinkage of the principal components, converging to GE matrices that have more shrinkage than those obtained using mixed models. This characteristic allowed more parsimonious models to be chosen, and resulted in models being selected that were similar to those obtained by the Cornelius F-test (α = 0.05) in traditional AMMI models and cross validation based on leave-one-out. This characteristic allowed more parsimonious models to be chosen and more GEI pattern retained on the first two components. The resulting model chosen by posterior distribution of singular value was also similar to those produced by the cross-validation approach in traditional AMMI models. Our method enables the estimation of credible interval for AMMI biplot plus the choice of AMMI model based on direct

  9. Model Convolution: A Computational Approach to Digital Image Interpretation

    Science.gov (United States)

    Gardner, Melissa K.; Sprague, Brian L.; Pearson, Chad G.; Cosgrove, Benjamin D.; Bicek, Andrew D.; Bloom, Kerry; Salmon, E. D.

    2010-01-01

    Digital fluorescence microscopy is commonly used to track individual proteins and their dynamics in living cells. However, extracting molecule-specific information from fluorescence images is often limited by the noise and blur intrinsic to the cell and the imaging system. Here we discuss a method called “model-convolution,” which uses experimentally measured noise and blur to simulate the process of imaging fluorescent proteins whose spatial distribution cannot be resolved. We then compare model-convolution to the more standard approach of experimental deconvolution. In some circumstances, standard experimental deconvolution approaches fail to yield the correct underlying fluorophore distribution. In these situations, model-convolution removes the uncertainty associated with deconvolution and therefore allows direct statistical comparison of experimental and theoretical data. Thus, if there are structural constraints on molecular organization, the model-convolution method better utilizes information gathered via fluorescence microscopy, and naturally integrates experiment and theory. PMID:20461132
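The model-convolution idea can be sketched in one dimension: a hypothesized fluorophore distribution is convolved with a measured point-spread function and corrupted with noise, and the result is compared with the experimental image. The PSF width and noise level below are illustrative stand-ins for measured values:

```python
import numpy as np

def model_convolve(truth, psf, noise_sd, seed=0):
    """Simulate imaging of a model fluorophore distribution: convolve with a
    (measured) point-spread function, then add Gaussian camera noise."""
    blurred = np.convolve(truth, psf / psf.sum(), mode="same")
    rng = np.random.default_rng(seed)
    return blurred + rng.normal(0.0, noise_sd, size=blurred.shape)

# Two point sources closer together than the PSF width: unresolved after blur.
truth = np.zeros(100)
truth[[45, 55]] = 1.0
psf = np.exp(-0.5 * (np.arange(-15, 16) / 5.0) ** 2)  # Gaussian PSF, sigma = 5 px
image = model_convolve(truth, psf, noise_sd=0.002)
```

Simulated images like `image` can be compared statistically with experimental data, rather than attempting to deconvolve the blur out of the measurement.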

  10. Benchmarking novel approaches for modelling species range dynamics.

    Science.gov (United States)

    Zurell, Damaris; Thuiller, Wilfried; Pagel, Jörn; Cabral, Juliano S; Münkemüller, Tamara; Gravel, Dominique; Dullinger, Stefan; Normand, Signe; Schiffers, Katja H; Moore, Kara A; Zimmermann, Niklaus E

    2016-08-01

Increasing biodiversity loss due to climate change is one of the most vital challenges of the 21st century. To anticipate and mitigate biodiversity loss, models are needed that reliably project species' range dynamics and extinction risks. Recently, several new approaches to model range dynamics have been developed to supplement correlative species distribution models (SDMs), but applications clearly lag behind model development. Indeed, no comparative analysis has been performed to evaluate their performance. Here, we build on process-based, simulated data for benchmarking five range (dynamic) models of varying complexity including classical SDMs, SDMs coupled with simple dispersal or more complex population dynamic models (SDM hybrids), and a hierarchical Bayesian process-based dynamic range model (DRM). We specifically test the effects of demographic and community processes on model predictive performance. Under current climate, DRMs performed best, although only marginally. Under climate change, predictive performance varied considerably, with no clear winners. Yet, all range dynamic models improved predictions under climate change substantially compared to purely correlative SDMs, and the population dynamic models also predicted reasonable extinction risks for most scenarios. When benchmarking data were simulated with more complex demographic and community processes, simple SDM hybrids including only dispersal often proved most reliable. Finally, we found that structural decisions during model building can have great impact on model accuracy, but prior system knowledge on important processes can reduce these uncertainties considerably. Our results confirm the clear merit of using dynamic approaches for modelling species' response to climate change but also emphasize several needs for further model and data improvement. We propose and discuss perspectives for improving range projections through combination of multiple models and for making these approaches

  11. Position-specific isotope modeling of organic micropollutants transformation through different reaction pathways.

    Science.gov (United States)

    Jin, Biao; Rolle, Massimo

    2016-03-01

    The degradation of organic micropollutants occurs via different reaction pathways. Compound specific isotope analysis is a valuable tool to identify such degradation pathways in different environmental systems. We propose a mechanism-based modeling approach that provides a quantitative framework to simultaneously evaluate concentration as well as bulk and position-specific multi-element isotope evolution during the transformation of organic micropollutants. The model explicitly simulates position-specific isotopologues for those atoms that experience isotope effects and, thereby, provides a mechanistic description of isotope fractionation occurring at different molecular positions. To demonstrate specific features of the modeling approach, we simulated the degradation of three selected organic micropollutants: dichlorobenzamide (BAM), isoproturon (IPU) and diclofenac (DCF). The model accurately reproduces the multi-element isotope data observed in previous experimental studies. Furthermore, it precisely captures the dual element isotope trends characteristic of different reaction pathways as well as their range of variation consistent with observed bulk isotope fractionation. It was also possible to directly validate the model capability to predict the evolution of position-specific isotope ratios with available experimental data. Therefore, the approach is useful both for a mechanism-based evaluation of experimental results and as a tool to explore transformation pathways in scenarios for which position-specific isotope data are not yet available.
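The Rayleigh-type behavior underlying position-specific fractionation can be illustrated with a minimal sketch: each molecular position evolves with its own enrichment factor, so a reacting position grows isotopically heavier as degradation proceeds while non-reacting positions stay unchanged. The enrichment factors and starting delta value below are hypothetical, not taken from the study:

```python
import math

def rayleigh_delta(delta0, epsilon, f):
    """Rayleigh evolution of an isotope delta value (permil) at one
    molecular position with enrichment factor epsilon (permil), as the
    remaining substrate fraction f falls from 1 toward 0."""
    r0 = 1.0 + delta0 / 1000.0
    r = r0 * f ** (epsilon / 1000.0)
    return (r - 1.0) * 1000.0

# Hypothetical positions: one bond-breaking carbon with a strong normal
# isotope effect, one position not involved in the reaction.
positions = {"reacting C": -25.0, "non-reacting C": 0.0}  # epsilon, permil
delta0 = -30.0  # assumed initial delta13C at both positions

for f in (1.0, 0.5, 0.1):
    deltas = {p: round(rayleigh_delta(delta0, eps, f), 1)
              for p, eps in positions.items()}
    print(f, deltas)
```

In practice such per-position curves are what produce the dual-element isotope trends the model captures; a real application would calibrate one enrichment factor per position and element against experimental data.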

  12. Results from Development of Model Specifications for Multifamily Energy Retrofits

    Energy Technology Data Exchange (ETDEWEB)

    Brozyna, K.

    2012-08-01

    Specifications, modeled after the CSI MasterFormat, provide trade contractors and builders with requirements and recommendations on specific building materials, components, and industry practices that comply with the expectations and intent of the requirements of the various funding programs associated with a project. The goal is to create a greater level of consistency in the execution of energy efficiency retrofit measures across the multiple regions in which a developer may work. IBACOS and Mercy Housing developed sample model specifications based on a common building construction type that Mercy Housing encounters.

  14. A model selection approach to analysis of variance and covariance.

    Science.gov (United States)

    Alber, Susan A; Weiss, Robert E

    2009-06-15

    An alternative to analysis of variance is a model selection approach where every partition of the treatment means into clusters with equal value is treated as a separate model. The null hypothesis that all treatments are equal corresponds to the partition with all means in a single cluster. The alternative hypothesis corresponds to the set of all other partitions of treatment means. A model selection approach can also be used for a treatment by covariate interaction, where the null hypothesis and each alternative correspond to a partition of treatments into clusters with equal covariate effects. We extend the partition-as-model approach to simultaneous inference for both treatment main effect and treatment interaction with a continuous covariate, with separate partitions for the intercepts and treatment-specific slopes. The model space is the Cartesian product of the intercept partition and the slope partition, and we develop five joint priors for this model space. In four of these priors the intercept and slope partitions are dependent. We advise on setting priors over models, and we use the model to analyze an orthodontic data set that compares the frictional resistance created by orthodontic fixtures. Copyright (c) 2009 John Wiley & Sons, Ltd.
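The partition-as-model idea can be made concrete with a small sketch: enumerate every partition of the treatments, fit a common mean per cluster, and score each partition as a separate model. BIC is used here in place of the paper's Bayesian priors, and the data are invented for illustration:

```python
import math

def partitions(items):
    """Generate every partition of `items` into non-empty clusters."""
    if not items:
        yield []
        return
    first, rest = items[0], items[1:]
    for part in partitions(rest):
        for i in range(len(part)):        # put `first` into an existing cluster
            yield part[:i] + [part[i] + [first]] + part[i + 1:]
        yield part + [[first]]            # ...or into a cluster of its own

def bic(groups, data):
    """BIC of the model in which treatments sharing a cluster share a mean."""
    n = sum(len(obs) for obs in data.values())
    rss = 0.0
    for cluster in groups:
        ys = [y for t in cluster for y in data[t]]
        mean = sum(ys) / len(ys)
        rss += sum((y - mean) ** 2 for y in ys)
    return n * math.log(rss / n) + len(groups) * math.log(n)

# Toy data: treatments A and B look alike, C clearly differs.
data = {"A": [1.0, 1.2, 0.9], "B": [1.1, 0.8, 1.0], "C": [3.0, 3.2, 2.9]}
best = min(partitions(list(data)), key=lambda p: bic(p, data))
print(best)  # A and B end up in one cluster, C in its own
```

The number of candidate models grows as the Bell numbers (5 partitions for 3 treatments, 15 for 4), which is why the paper's joint intercept-slope space, a Cartesian product of two such partition sets, needs carefully constructed priors.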

  15. Scientific Theories, Models and the Semantic Approach

    Directory of Open Access Journals (Sweden)

    Décio Krause

    2007-12-01

    Full Text Available According to the semantic view, a theory is characterized by a class of models. In this paper, we examine critically some of the assumptions that underlie this approach. First, we recall that models are models of something. Thus we cannot leave completely aside the axiomatization of the theories under consideration, nor can we ignore the metamathematics used to elaborate these models, for changes in the metamathematics often impose restrictions on the resulting models. Second, based on a parallel between van Fraassen’s modal interpretation of quantum mechanics and Skolem’s relativism regarding set-theoretic concepts, we introduce a distinction between relative and absolute concepts in the context of the models of a scientific theory. And we discuss the significance of that distinction. Finally, by focusing on contemporary particle physics, we raise the question: since there is no generally accepted unification of the parts of the standard model (namely, QED and QCD), we have no theory, in the usual sense of the term. This poses a difficulty: if there is no theory, how can we speak of its models? What are the latter models of? We conclude by noting that it is unclear that the semantic view can be applied to contemporary physical theories.

  16. Computational Modeling of Traffic Related Thoracic Injury of a 10-Year-Old Child Using Subject-Specific Modeling Technique.

    Science.gov (United States)

    Zhu, Feng; Jiang, Binhui; Hu, Jingwen; Wang, Yulong; Shen, Ming; Yang, King H

    2016-01-01

    Traffic injuries have become a major health-related issue for school-aged children. To study this type of injury with numerical simulations, a finite element model was developed to represent the full body of a 10-year-old (YO) child. The model has been validated against test data at both body-part and full-body levels in previous studies. Representing only the average 10-YO child, this model did not include subject-specific attributes, such as the variations in size and shape among different children. In this paper, a new modeling approach was used to morph this baseline model into a subject-specific model, based on anthropometric data collected from pediatric subjects. This mesh-morphing method was used to rapidly morph the baseline mesh into the subject-specific geometry while maintaining a good mesh quality. The morphed model was subsequently applied to simulate a real-world motor vehicle crash. A lung injury observed in the crash was well captured by the subject-specific model. The findings of this study demonstrate the feasibility of the proposed morphing approach for developing subject-specific human models and confirm their capability in predicting traffic injuries.
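The mesh-morphing step can be sketched in miniature: given matched landmark positions on the baseline and target geometries, interpolate the landmark displacements onto every mesh node. The study's method is landmark-based; the inverse-distance weighting below is a simplified stand-in for it, and all coordinates are made up:

```python
import math

def idw_morph(nodes, landmarks, targets, power=2.0):
    """Morph mesh nodes by interpolating landmark displacements with
    inverse-distance weights; a node lying exactly on a landmark moves
    onto the corresponding target."""
    disps = [(tx - lx, ty - ly)
             for (lx, ly), (tx, ty) in zip(landmarks, targets)]
    morphed = []
    for x, y in nodes:
        wsum = dx = dy = 0.0
        exact = None
        for (lx, ly), (ux, uy) in zip(landmarks, disps):
            d = math.hypot(x - lx, y - ly)
            if d == 0.0:                  # node coincides with a landmark
                exact = (ux, uy)
                break
            w = 1.0 / d ** power
            wsum += w
            dx += w * ux
            dy += w * uy
        ux, uy = exact if exact is not None else (dx / wsum, dy / wsum)
        morphed.append((x + ux, y + uy))
    return morphed

# Made-up geometry: pull the right edge of a unit square outward.
landmarks = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
targets = [(0.0, 0.0), (1.3, 0.0), (1.3, 1.0), (0.0, 1.0)]
nodes = [(0.5, 0.5), (1.0, 0.0)]
morphed = idw_morph(nodes, landmarks, targets)
print(morphed)
```

Production morphing tools typically use radial basis functions rather than inverse-distance weights, plus mesh-quality checks after the displacement is applied, but the flow of data, landmarks in, smoothly displaced nodes out, is the same.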

  17. Reconciliation with oneself and with others: From approach to model

    Directory of Open Access Journals (Sweden)

    Nikolić-Ristanović Vesna

    2010-01-01

    Full Text Available This paper presents the approach to dealing with war and its consequences that was developed within the Victimology Society of Serbia over the last five years, in the framework of the Association Joint Action for Truth and Reconciliation (ZAIP). First, a short review of the Association and of the process through which the ZAIP approach to dealing with the past was developed is presented. Then, a detailed description of the approach itself follows, with identification of its most important specificities. In the conclusion, next steps are suggested, aimed at development of a model of reconciliation that will be based on the ZAIP approach and appropriate to the social context of Serbia and its surroundings.

  18. A Model Management Approach for Co-Simulation Model Evaluation

    NARCIS (Netherlands)

    Zhang, X.C.; Broenink, Johannes F.; Filipe, Joaquim; Kacprzyk, Janusz; Pina, Nuno

    2011-01-01

    Simulating formal models is a common means for validating the correctness of the system design and reducing the time-to-market. In most embedded control system designs, multiple engineering disciplines and various domain-specific models are often involved, such as mechanical, control, software

  19. Software Requirements Specification Verifiable Fuel Cycle Simulation (VISION) Model

    Energy Technology Data Exchange (ETDEWEB)

    D. E. Shropshire; W. H. West

    2005-11-01

    The purpose of this Software Requirements Specification (SRS) is to define the top-level requirements for a Verifiable Fuel Cycle Simulation Model (VISION) of the Advanced Fuel Cycle (AFC). This simulation model is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI (including cost estimates) and Generation IV reactor development studies.

  20. Multiscale Model Approach for Magnetization Dynamics Simulations

    CERN Document Server

    De Lucia, Andrea; Tretiakov, Oleg A; Kläui, Mathias

    2016-01-01

    Simulations of magnetization dynamics in a multiscale environment enable rapid evaluation of the Landau-Lifshitz-Gilbert equation in a mesoscopic sample with nanoscopic accuracy in areas where such accuracy is required. We have developed a multiscale magnetization dynamics simulation approach that can be applied to large systems with spin structures that vary locally on small length scales. To implement this, the conventional micromagnetic simulation framework has been expanded to include a multiscale solving routine. The software selectively simulates different regions of a ferromagnetic sample according to the spin structures located within, in order to employ a suitable discretization and use either a micromagnetic or an atomistic model. To demonstrate the validity of the multiscale approach, we simulate the spin wave transmission across the regions simulated with the two different models and different discretizations. We find that the interface between the regions is fully transparent for spin waves with f...
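At the atomistic end of such a framework, the basic building block is time integration of the Landau-Lifshitz-Gilbert equation for each spin. A minimal single-spin sketch in dimensionless units (explicit Euler with renormalization; the field, damping, and step size are illustrative, not taken from the paper):

```python
import math

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def llg_step(m, h, alpha, dt):
    """One explicit step of the Landau-Lifshitz form of the LLG equation,
    dm/dt = -(m x h + alpha * m x (m x h)), for a unit moment m in a
    dimensionless effective field h; |m| is renormalized each step."""
    prec = cross(m, h)                    # precession term
    damp = cross(m, prec)                 # Gilbert damping term
    m = tuple(mi + dt * (-(p + alpha * d))
              for mi, p, d in zip(m, prec, damp))
    norm = math.sqrt(sum(mi * mi for mi in m))
    return tuple(mi / norm for mi in m)

h = (0.0, 0.0, 1.0)   # field along z
m = (1.0, 0.0, 0.0)   # moment starts along x
for _ in range(20000):
    m = llg_step(m, h, alpha=0.1, dt=0.01)
print(m)  # the moment has relaxed essentially onto the field direction
```

The damping term steadily pulls the moment toward the field while the precession term rotates it around the field axis; a multiscale solver evaluates exactly this kind of update per atomistic spin in the finely discretized regions, and a cell-averaged micromagnetic counterpart elsewhere.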

  1. Continuum modeling an approach through practical examples

    CERN Document Server

    Muntean, Adrian

    2015-01-01

    This book develops continuum modeling skills and approaches the topic from three sides: (1) derivation of global integral laws together with the associated local differential equations, (2) design of constitutive laws and (3) modeling boundary processes. The focus of this presentation lies on many practical examples covering aspects such as coupled flow, diffusion and reaction in porous media or microwave heating of a pizza, as well as traffic issues in bacterial colonies and energy harvesting from geothermal wells. The target audience comprises primarily graduate students in pure and applied mathematics as well as working practitioners in engineering who are faced with nonstandard rheological topics like those typically arising in the food industry.

  2. A Multivariate Approach to Functional Neuro Modeling

    DEFF Research Database (Denmark)

    Mørch, Niels J.S.

    1998-01-01

    This Ph.D. thesis, A Multivariate Approach to Functional Neuro Modeling, deals with the analysis and modeling of data from functional neuro imaging experiments. A multivariate dataset description is provided which facilitates efficient representation of typical datasets and, more importantly...... and overall conditions governing the functional experiment, via associated micro- and macroscopic variables. The description facilitates an efficient microscopic re-representation, as well as a handle on the link between brain and behavior; the latter is achieved by hypothesizing variations in the micro...... a generalization theoretical framework centered around measures of model generalization error. - Only few, if any, examples of the application of generalization theory to functional neuro modeling currently exist in the literature. - Exemplification of the proposed generalization theoretical framework...

  3. Interfacial Fluid Mechanics A Mathematical Modeling Approach

    CERN Document Server

    Ajaev, Vladimir S

    2012-01-01

    Interfacial Fluid Mechanics: A Mathematical Modeling Approach provides an introduction to mathematical models of viscous flow used in rapidly developing fields of microfluidics and microscale heat transfer. The basic physical effects are first introduced in the context of simple configurations and their relative importance in typical microscale applications is discussed. Then, several configurations of importance to microfluidics, most notably thin films/droplets on substrates and confined bubbles, are discussed in detail. Topics from current research on electrokinetic phenomena, liquid flow near structured solid surfaces, evaporation/condensation, and surfactant phenomena are discussed in the later chapters. This book also: discusses mathematical models in the context of actual applications such as electrowetting; includes unique material on fluid flow near structured surfaces and phase change phenomena; shows readers how to solve modeling problems related to microscale multiphase flows. Interfacial Fluid Me...

  4. Systematic approach to MIS model creation

    Directory of Open Access Journals (Sweden)

    Macura Perica

    2004-01-01

    Full Text Available In this paper, by applying the basic principles of the general theory of systems (the systematic approach), we formulate a model of a marketing information system. The basis for the research was the basic characteristics of the systematic approach and of the marketing system. The informational basis for management of the marketing system, i.e. of the marketing instruments, is presented by listing the most important information for decision making per individual marketing mix instrument. In the projected model of the marketing information system, the information listed in this way creates a basis for establishing databases, i.e. bases of information on product, price, distribution and promotion. The paper gives the basic preconditions for the formulation and functioning of the model. The model is presented by explicating the elements of its structure (environment, database operators, information system analysts, decision makers - managers), i.e. input, process, output, feedback and the relations between these elements which are necessary for its optimal functioning. In addition, basic elements for implementation of the model into a business system are given, as well as conditions for its efficient functioning and development.

  5. Comparing Supply-Side Specifications in Models of Global Agriculture and the Food System

    Energy Technology Data Exchange (ETDEWEB)

    Robinson, Sherman; van Meijl, Hans; Willenbockel, Dirk; Valin, Hugo; Fujimori, Shinichiro; Masui, Toshihiko; Sands, Ronald; Wise, Marshall A.; Calvin, Katherine V.; Havlik, Petr; Mason d' Croz, Daniel; Tabeau, Andrzej; Kavallari, Aikaterini; Schmitz, Christoph; Dietrich, Jan P.; von Lampe, Martin

    2014-01-01

    This paper compares the theoretical specification of production and technical change across the partial equilibrium (PE) and computable general equilibrium (CGE) models of the global agricultural and food system included in the AgMIP model comparison study. The two modeling approaches have different theoretical underpinnings concerning the scope of economic activity they capture and how they represent technology and the behavior of supply and demand in markets. This paper focuses on their different specifications of technology and supply behavior, comparing their theoretical and empirical treatments. While the models differ widely in their specifications of technology, both within and between the PE and CGE classes of models, we find that the theoretical responsiveness of supply to changes in prices can be similar, depending on parameter choices that define the behavior of supply functions over the domain of applicability defined by the common scenarios used in the AgMIP comparisons. In particular, we compare the theoretical specification of supply in CGE models with neoclassical production functions and PE models that focus on land and crop yields in agriculture. In practice, however, comparability of results given parameter choices is an empirical question, and the models differ in their sensitivity to variations in specification. To illustrate the issues, sensitivity analysis is done with one global CGE model, MAGNET, to indicate how the results vary with different specification of technical change, and how they compare with the results from PE models.

  6. Joint Modeling of Multiple Crimes: A Bayesian Spatial Approach

    Directory of Open Access Journals (Sweden)

    Hongqiang Liu

    2017-01-01

    Full Text Available A multivariate Bayesian spatial modeling approach was used to jointly model the counts of two types of crime, i.e., burglary and non-motor vehicle theft, and explore the geographic pattern of crime risks and relevant risk factors. In contrast to the univariate model, which assumes independence across outcomes, the multivariate approach takes into account potential correlations between crimes. Six independent variables are included in the model as potential risk factors. In order to fully present this method, both the multivariate model and its univariate counterpart are examined. We fitted the two models to the data and assessed them using the deviance information criterion. A comparison of the results from the two models indicates that the multivariate model was superior to the univariate model. Our results show that population density and bar density are clearly associated with both burglary and non-motor vehicle theft risks and indicate a close relationship between these two types of crime. The posterior means and 2.5% percentile of type-specific crime risks estimated by the multivariate model were mapped to uncover the geographic patterns. The implications, limitations and future work of the study are discussed in the concluding section.

  7. Evaluation of Workflow Management Systems - A Meta Model Approach

    Directory of Open Access Journals (Sweden)

    Michael Rosemann

    1998-11-01

    Full Text Available The automated enactment of processes through the use of workflow management systems enables the outsourcing of the control flow from application systems. By now a large number of systems, following different workflow paradigms, are available. This leads to the problem of selecting the appropriate workflow management system for a given situation. In this paper we outline the benefits of a meta model approach for the evaluation and comparison of different workflow management systems. After a general introduction to meta modeling, the meta models of the workflow management systems WorkParty (Siemens Nixdorf) and FlowMark (IBM) are compared as an example. These product-specific meta models can be generalized to meta reference models, which helps to specify a workflow methodology. As an example, an organisational reference meta model is presented, which helps users specify their requirements for a workflow management system.

  8. Component-specific modeling. [jet engine hot section components

    Science.gov (United States)

    Mcknight, R. L.; Maffeo, R. J.; Tipton, M. T.; Weber, G.

    1992-01-01

    Accomplishments are described for a 3 year program to develop methodology for component-specific modeling of aircraft hot section components (turbine blades, turbine vanes, and burner liners). These accomplishments include: (1) engine thermodynamic and mission models, (2) geometry model generators, (3) remeshing, (4) specialty three-dimensional inelastic structural analysis, (5) computationally efficient solvers, (6) adaptive solution strategies, (7) engine performance parameters/component response variables decomposition and synthesis, (8) integrated software architecture and development, and (9) validation cases for software developed.

  9. Formal Specification of the OpenMP Memory Model

    Energy Technology Data Exchange (ETDEWEB)

    Bronevetsky, G; de Supinski, B R

    2006-05-17

    OpenMP [1] is an important API for shared memory programming, combining shared memory's potential for performance with a simple programming interface. Unfortunately, OpenMP lacks a critical tool for demonstrating whether programs are correct: a formal memory model. Instead, the current official definition of the OpenMP memory model (the OpenMP 2.5 specification [1]) is in terms of informal prose. As a result, it is impossible to verify OpenMP applications formally since the prose does not provide a formal consistency model that precisely describes how reads and writes on different threads interact. This paper focuses on the formal verification of OpenMP programs through a proposed formal memory model that is derived from the existing prose model [1]. Our formalization provides a two-step process to verify whether an observed OpenMP execution is conformant. In addition to this formalization, our contributions include a discussion of ambiguities in the current prose-based memory model description. Although our formal model may not capture the current informal memory model perfectly, in part due to these ambiguities, our model reflects our understanding of the informal model's intent. We conclude with several examples that may indicate areas of the OpenMP memory model that need further refinement, however it is specified. Our goal is to motivate the OpenMP community to adopt those refinements eventually, ideally through a formal model, in later OpenMP specifications.

  10. Integrating Platform Selection Rules in the Model-Driven Architecture Approach

    NARCIS (Netherlands)

    Tekinerdogan, B.; Bilir, S.; Abatlevi, C.; Assmann, U.; Aksit, M.; Rensink, A.

    2005-01-01

    A key issue in the MDA approach is the transformation of platform independent models to platform specific models. Before transforming to a platform specific model, however, it is necessary to select the appropriate platform. Various platforms exist with different properties and the selection of the

  11. A Principled Approach to the Specification of System Architectures for Space Missions

    Science.gov (United States)

    McKelvin, Mark L. Jr.; Castillo, Robert; Bonanne, Kevin; Bonnici, Michael; Cox, Brian; Gibson, Corrina; Leon, Juan P.; Gomez-Mustafa, Jose; Jimenez, Alejandro; Madni, Azad

    2015-01-01

    Modern space systems are increasing in complexity and scale at an unprecedented pace. Consequently, innovative methods, processes, and tools are needed to cope with the increasing complexity of architecting these systems. A key systems challenge in practice is the ability to scale processes, methods, and tools used to architect complex space systems. Traditionally, the process for specifying space system architectures has largely relied on capturing the system architecture in informal descriptions that are often embedded within loosely coupled design documents and domain expertise. Such informal descriptions often lead to misunderstandings between design teams, ambiguous specifications, difficulty in maintaining consistency as the architecture evolves throughout the system development life cycle, and costly design iterations. Therefore, traditional methods are becoming increasingly inefficient to cope with ever-increasing system complexity. We apply the principles of component-based design and platform-based design to the development of the system architecture for a practical space system to demonstrate feasibility of our approach using SysML. Our results show that we are able to apply a systematic design method to manage system complexity, thus enabling effective data management, semantic coherence and traceability across different levels of abstraction in the design chain. Just as important, our approach enables interoperability among heterogeneous tools in a concurrent engineering model based design environment.

  12. Metal Mixture Modeling Evaluation project: 2. Comparison of four modeling approaches

    Science.gov (United States)

    Farley, Kevin J.; Meyer, Joe; Balistrieri, Laurie S.; DeSchamphelaere, Karl; Iwasaki, Yuichi; Janssen, Colin; Kamo, Masashi; Lofts, Steve; Mebane, Christopher A.; Naito, Wataru; Ryan, Adam C.; Santore, Robert C.; Tipping, Edward

    2015-01-01

    As part of the Metal Mixture Modeling Evaluation (MMME) project, models were developed by the National Institute of Advanced Industrial Science and Technology (Japan), the U.S. Geological Survey (USA), HDR|HydroQual, Inc. (USA), and the Centre for Ecology and Hydrology (UK) to address the effects of metal mixtures on biological responses of aquatic organisms. A comparison of the 4 models, as they were presented at the MMME Workshop in Brussels, Belgium (May 2012), is provided herein. Overall, the models were found to be similar in structure (free ion activities computed by WHAM; specific or non-specific binding of metals/cations in or on the organism; specification of metal potency factors and/or toxicity response functions to relate metal accumulation to biological response). Major differences in modeling approaches are attributed to various modeling assumptions (e.g., single versus multiple types of binding site on the organism) and specific calibration strategies that affected the selection of model parameters. The models provided a reasonable description of additive (or nearly additive) toxicity for a number of individual toxicity test results. Less-than-additive toxicity was more difficult to describe with the available models. Because of limitations in the available datasets and the strong inter-relationships among the model parameters (log KM values, potency factors, toxicity response parameters), further evaluation of specific model assumptions and calibration strategies is needed.
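The additivity baseline against which such mixture models are judged can be sketched with toxic units under concentration addition: each metal's concentration is scaled by its single-metal EC50 and the scaled doses are summed. The EC50s and the logistic slope below are hypothetical, chosen only to illustrate the bookkeeping:

```python
def toxic_units(concs, ec50s):
    """Concentration addition: each metal's exposure scaled by its
    single-metal EC50, then summed into mixture toxic units."""
    return sum(c / e for c, e in zip(concs, ec50s))

def response(tu, slope=2.0):
    """Logistic concentration-response in toxic units; TU = 1 gives 50%."""
    return tu ** slope / (1.0 + tu ** slope)

# Hypothetical single-metal EC50s (ug/L) and a mixture with each metal
# dosed at half its EC50.
ec50s = (10.0, 40.0)
concs = (5.0, 20.0)
tu = toxic_units(concs, ec50s)
print(tu, response(tu))  # 1.0 toxic unit -> 50% effect if strictly additive
```

Observed mixture responses above or below this curve indicate more- or less-than-additive toxicity; the MMME models replace the simple EC50 scaling with mechanistic accumulation terms (free ion activities, binding sites, potency factors).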

  13. Regularization of turbulence - a comprehensive modeling approach

    Science.gov (United States)

    Geurts, B. J.

    2011-12-01

    Turbulence readily arises in numerous flows in nature and technology. The large number of degrees of freedom of turbulence poses serious challenges to numerical approaches aimed at simulating and controlling such flows. While the Navier-Stokes equations are commonly accepted to precisely describe fluid turbulence, alternative coarsened descriptions need to be developed to cope with the wide range of length and time scales. These coarsened descriptions are known as large-eddy simulations, in which one aims to capture only the primary features of a flow, at considerably reduced computational effort. Such coarsening introduces a closure problem that requires additional phenomenological modeling. A systematic approach to the closure problem, known as regularization modeling, will be reviewed. Its application to multiphase turbulence will be illustrated, in which a basic regularization principle is enforced to physically consistently approximate momentum and scalar transport. Examples of Leray and LANS-alpha regularization are discussed in some detail, as are compatible numerical strategies. We illustrate regularization modeling for turbulence under the influence of rotation and buoyancy and investigate the accuracy with which particle-laden flow can be represented. A discussion of the numerical and modeling errors incurred will be given on the basis of homogeneous isotropic turbulence.
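The core idea of Leray regularization, advecting the flow with a smoothed velocity while transporting the unfiltered field, can be sketched in one spatial dimension with viscous Burgers flow. The grid, filter, and parameters below are illustrative only, not a production large-eddy scheme:

```python
import math

def top_hat(u):
    """Three-point top-hat filter; the Leray model advects with this
    smoothed velocity instead of the raw field."""
    n = len(u)
    return [0.25 * u[i - 1] + 0.5 * u[i] + 0.25 * u[(i + 1) % n]
            for i in range(n)]

def leray_burgers_step(u, dt, dx, nu):
    """One explicit step of periodic 1-D viscous Burgers flow with Leray
    regularization: u_t + ubar * u_x = nu * u_xx, where ubar = filter(u)."""
    n = len(u)
    ubar = top_hat(u)
    new = []
    for i in range(n):
        ux = (u[(i + 1) % n] - u[i - 1]) / (2.0 * dx)
        uxx = (u[(i + 1) % n] - 2.0 * u[i] + u[i - 1]) / dx ** 2
        new.append(u[i] + dt * (-ubar[i] * ux + nu * uxx))
    return new

n = 64
dx = 2.0 * math.pi / n
dt, nu = 0.002, 0.1
u = [math.sin(i * dx) for i in range(n)]
for _ in range(200):
    u = leray_burgers_step(u, dt, dx, nu)
print(max(abs(v) for v in u))  # amplitude decays smoothly; no blow-up
```

Setting `ubar = u` recovers plain Burgers flow; comparing the two runs shows how regularizing only the advecting velocity tempers the steepening of gradients while leaving the transported field itself unfiltered.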

  14. An autologous in situ tumor vaccination approach for hepatocellular carcinoma. 2. Tumor-specific immunity and cure after radio-inducible suicide gene therapy and systemic CD40-ligand and Flt3-ligand gene therapy in an orthotopic tumor model.

    Science.gov (United States)

    Kawashita, Yujo; Deb, Niloy J; Garg, Madhur K; Kabarriti, Rafi; Fan, Zuoheng; Alfieri, Alan A; Roy-Chowdhury, Jayanta; Guha, Chandan

    2014-08-01

    Diffuse hepatocellular carcinoma (HCC) is a lethal disease in which radiation therapy (RT) currently has a limited role because of the potential for developing fatal radiation-induced liver disease. Recently, however, "radio-inducible suicide gene therapy" has been shown to enhance control of the local tumor and of residual microscopic disease within the liver for diffuse HCC, by using a combination of chemoactivation and molecular radiosensitization. We have demonstrated that the addition of recombinant adenovirus-expressing human Flt3 ligand (Adeno-Flt3L) after radio-inducible suicide gene therapy induced a Th1-biased immune response and enhanced tumor control in an ectopic model of HCC. We hypothesized that sequential administration of recombinant adenovirus-expressing CD40L (Adeno-CD40L) could further potentiate the efficacy of our trimodal therapy with RT + HSV-TK + Adeno-Flt3L. We examined our hypothesis in an orthotopic model of diffuse HCC using BNL1ME A.7R.1 (BNL) cells in Balb/c mice. BNL murine hepatoma cells (5 × 10^4) transfected with an expression vector of HSV-TK under the control of a radiation-inducible promoter were injected intraportally into BALB/cJ mice. Fourteen days after the HCC injection, mice were treated with a 25 Gy dose of radiation to the whole liver, followed by ganciclovir (GCV) treatment and systemic adenoviral cytokine gene therapy (Flt3L or CD40L or both). Untreated mice died in 27 ± 4 days. Radiation therapy alone had a marginal effect on survival (median = 35 ± 7 days) and the addition of HSV-TK/GCV gene therapy improved the median survival to 47 ± 6 days. However, the addition of Adeno-Flt3L to radiation therapy and HSV-TK/GCV therapy significantly (P = 0.0005) increased survival to a median of 63 ± 20 days with 44% (7/16) of the animals still alive 116 days after tumor implantation. The curative effect of Flt3L was completely abolished when using immunodeficient nude mice or mice depleted for CD4, CD8 and

  15. ON SOME APPROACHES TO ECONOMICMATHEMATICAL MODELING OF SMALL BUSINESS

    Directory of Open Access Journals (Sweden)

    Orlov A. I.

    2015-04-01

    Full Text Available Small business is an important part of the modern Russian economy. We present a broad panorama of possible approaches to the construction of economic-mathematical models that may be useful for describing the dynamics of small businesses, as well as for their management. Since a variety of types of economic-mathematical and econometric models can be used to describe particular problems of small business, we found it useful to consider a fairly wide range of such models, which results in rather short descriptions of the specific models. Each model is described at such a level that an experienced professional in economic-mathematical modeling could, if necessary, develop it into a specific model at the stage of design formulas and numerical results. Particular attention is paid to statistical methods for non-numeric data, currently the most pressing. We consider problems of economic-mathematical modeling in small business marketing. We have accumulated experience in applying this methodology to practical problems of small business marketing, in particular in the fields of consumer and industrial goods and educational services, as well as in the analysis and modeling of inflation, taxation and other areas. In marketing models of decision-making theory we apply rankings and ratings, and we consider the problem of comparing averages. We present several models of the life cycle of small businesses - a project flow model, a niche capture model, and a niche selection model - and discuss the development of research on economic-mathematical modeling of small businesses.

  16. Formal Specification of the OpenMP Memory Model

    Energy Technology Data Exchange (ETDEWEB)

    Bronevetsky, G; de Supinski, B

    2006-12-19

    OpenMP [2] is an important API for shared memory programming, combining shared memory's potential for performance with a simple programming interface. Unfortunately, OpenMP lacks a critical tool for demonstrating whether programs are correct: a formal memory model. Instead, the current official definition of the OpenMP memory model (the OpenMP 2.5 specification [2]) is in terms of informal prose. As a result, it is impossible to verify OpenMP applications formally since the prose does not provide a formal consistency model that precisely describes how reads and writes on different threads interact. We expand on our previous work that focused on the formal verification of OpenMP programs through a formal memory model [?]. As in that work, our formalization, which is derived from the existing prose model [2], provides a two-step process to verify whether an observed OpenMP execution is conformant. This paper extends the model to cover the entire specification. In addition to this formalization, our contributions include a discussion of ambiguities in the current prose-based memory model description. Although our formal model may not capture the current informal memory model perfectly, in part due to these ambiguities, our model reflects our understanding of the informal model's intent. We conclude with several examples that may indicate areas of the OpenMP memory model that need further refinement, however it is specified. Our goal is to motivate the OpenMP community to adopt those refinements eventually, ideally through a formal model, in later OpenMP specifications.

  17. Generic solar photovoltaic system dynamic simulation model specification

    Energy Technology Data Exchange (ETDEWEB)

    Ellis, Abraham [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Behnke, Michael Robert [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Elliott, Ryan Thomas [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2013-10-01

    This document is intended to serve as a specification for generic solar photovoltaic (PV) system positive-sequence dynamic models to be implemented by software developers and approved by the WECC MVWG for use in bulk system dynamic simulations in accordance with NERC MOD standards. Two specific dynamic models are included in the scope of this document. The first, a Central Station PV System model, is intended to capture the most important dynamic characteristics of large scale (> 10 MW) PV systems with a central Point of Interconnection (POI) at the transmission level. The second, a Distributed PV System model, is intended to represent an aggregation of smaller, distribution-connected systems that comprise a portion of a composite load that might be modeled at a transmission load bus.

  18. Polychronous Interpretation of Synoptic, a Domain Specific Modeling Language for Embedded Flight-Software

    Directory of Open Access Journals (Sweden)

    Loïc Besnard

    2010-03-01

    Full Text Available The SPaCIFY project, which aims at bringing advances in MDE to the satellite flight software industry, advocates a top-down approach built on a domain-specific modeling language named Synoptic. In line with previous approaches to real-time modeling such as Statecharts and Simulink, Synoptic features hierarchical decomposition of application and control modules in synchronous block diagrams and state machines. Its semantics is described in the polychronous model of computation, which is that of the synchronous language SIGNAL.

  19. Multilevel Models for the Analysis of Angle-Specific Torque Curves with Application to Master Athletes.

    Science.gov (United States)

    Carvalho, Humberto M

    2015-12-22

    The aim of this paper was to outline a multilevel modeling approach to fit individual angle-specific torque curves describing concentric knee extension and flexion isokinetic muscular actions in Master athletes. The potential of the analytical approach to examine between-individual differences across the angle-specific torque curves was illustrated by including between-individual variation due to gender differences at a higher level. Torques in concentric muscular actions of knee extension and flexion at 60°·s(-1) were considered within a range of motion between 5° and 85° (only torques "truly" isokinetic). Multilevel time series models with autoregressive covariance structures were superior fits for angle-specific torque curves compared with standard multilevel models for repeated measures. Third- and fourth-order polynomial models were the best fits to describe angle-specific torque curves of isokinetic knee flexion and extension concentric actions, respectively. The fixed exponents allow interpretations for the initial acceleration, the angle at peak torque and the decrement of torque after peak torque. Also, the multilevel models were flexible enough to illustrate the influence of gender differences on the shape of torque throughout the range of motion and on the shape of the curves. The presented multilevel regression models may afford a general framework to examine angle-specific moment curves by isokinetic dynamometry, and add to the understanding of the mechanisms of strength development, particularly the force-length relationship, related both to performance and to injury prevention.
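
    As a simplified, single-level illustration of the curve-fitting idea (not the paper's multilevel formulation), the sketch below fits a fourth-order polynomial to a synthetic torque-angle curve and reads off the angle at peak torque; all data values are invented.

```python
import numpy as np

# Illustrative sketch: a fourth-order polynomial fitted to a synthetic
# knee-extension torque-angle curve, then used to locate the angle at peak
# torque, echoing the paper's interpretation of the fixed polynomial terms.
# The torque values and noise level are invented.

angles = np.linspace(5, 85, 17)                    # degrees, 5..85
true_torque = 200 - 0.05 * (angles - 60) ** 2      # peak near 60 degrees
rng = np.random.default_rng(1)
observed = true_torque + rng.normal(0, 2, angles.shape)

coeffs = np.polyfit(angles, observed, deg=4)       # fourth-order fit
fit = np.poly1d(coeffs)

grid = np.linspace(5, 85, 801)
angle_at_peak = grid[np.argmax(fit(grid))]
print(round(angle_at_peak, 1))                     # close to 60 degrees
```

    A multilevel version would let the polynomial coefficients vary by athlete and by gender, as in the paper.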

  20. Modeling software with finite state machines a practical approach

    CERN Document Server

    Wagner, Ferdinand; Wagner, Thomas; Wolstenholme, Peter

    2006-01-01

    Modeling Software with Finite State Machines: A Practical Approach explains how to apply finite state machines to software development. It provides a critical analysis of using finite state machines as a foundation for executable specifications to reduce software development effort and improve quality. This book discusses the design of a state machine and of a system of state machines. It also presents a detailed analysis of development issues relating to behavior modeling with design examples and design rules for using finite state machines. This volume describes a coherent and well-tested fr
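
    As a minimal illustration of the book's theme of state machines as executable specifications, the following sketch shows a table-driven finite state machine; the door-controller states and events are hypothetical, not an example from the book.

```python
# A table-driven finite state machine: the transition table *is* the
# executable specification. The door controller below is a toy example.

class StateMachine:
    def __init__(self, initial, transitions):
        # transitions: {(state, event): next_state}
        self.state = initial
        self.transitions = transitions

    def handle(self, event):
        key = (self.state, event)
        if key not in self.transitions:
            raise ValueError(f"no transition for {event!r} in state {self.state!r}")
        self.state = self.transitions[key]
        return self.state

door = StateMachine("closed", {
    ("closed", "open"): "open",
    ("open", "close"): "closed",
    ("closed", "lock"): "locked",
    ("locked", "unlock"): "closed",
})
door.handle("lock")
print(door.state)  # -> locked
```

    Rejecting undefined (state, event) pairs, rather than silently ignoring them, is what makes the table usable as a checkable specification.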

  1. Merging Digital Surface Models Implementing Bayesian Approaches

    Science.gov (United States)

    Sadeq, H.; Drummond, J.; Li, Z.

    2016-06-01

    In this research different DSMs from different sources have been merged. The merging is based on a probabilistic model using a Bayesian Approach. The implemented data have been sourced from very high resolution satellite imagery sensors (e.g. WorldView-1 and Pleiades). It is deemed preferable to use a Bayesian Approach when the data obtained from the sensors are limited and it is difficult or very costly to obtain many measurements; the problem of the lack of data can then be solved by introducing a priori estimations of data. To infer the prior data, it is assumed that the roofs of the buildings are smooth, and for that purpose local entropy has been implemented. In addition to the a priori estimations, GNSS RTK measurements have been collected in the field, which are used as check points to assess the quality of the DSMs and to validate the merging result. The model has been applied in the West-End of Glasgow, containing different kinds of buildings, such as flat-roofed and hipped-roofed buildings. Both quantitative and qualitative methods have been employed to validate the merged DSM. The validation results have shown that the model was able to improve the quality of the DSMs and some characteristics, such as the roof surfaces, which consequently led to better representations. In addition, the developed model has been compared with the well-established Maximum Likelihood model and showed similar quantitative statistical results and better qualitative results. Although the proposed model has been applied to DSMs that were derived from satellite imagery, it can be applied to any other sourced DSMs.
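
    One simple instance of such probabilistic merging is a precision-weighted (Bayesian) update of two height estimates per grid cell; the sketch below uses synthetic stand-in grids and sensor variances, not the WorldView-1/Pleiades data or the paper's entropy-based priors.

```python
import numpy as np

# Toy sketch of merging two DSM height grids with a Bayesian
# (precision-weighted) update. Grids and variances are synthetic.

rng = np.random.default_rng(0)
truth = np.full((4, 4), 10.0)                      # flat roof at 10 m
dsm_a = truth + rng.normal(0, 0.5, truth.shape)    # sensor A, sigma = 0.5 m
dsm_b = truth + rng.normal(0, 1.0, truth.shape)    # sensor B, sigma = 1.0 m

def fuse(mu1, var1, mu2, var2):
    """Posterior mean/variance of two independent Gaussian estimates."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    return (w1 * mu1 + w2 * mu2) / (w1 + w2), 1.0 / (w1 + w2)

# Treat DSM A as the prior and update it cell-by-cell with DSM B.
merged, merged_var = fuse(dsm_a, 0.5**2, dsm_b, 1.0**2)

print(merged_var)  # 0.2 = 1 / (1/0.25 + 1/1.0): more precise than either input
```

    The fused variance is always at most the smaller of the two input variances, which is the sense in which merging "improves the quality" of the DSMs.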

  2. MERGING DIGITAL SURFACE MODELS IMPLEMENTING BAYESIAN APPROACHES

    Directory of Open Access Journals (Sweden)

    H. Sadeq

    2016-06-01

    Full Text Available In this research different DSMs from different sources have been merged. The merging is based on a probabilistic model using a Bayesian Approach. The implemented data have been sourced from very high resolution satellite imagery sensors (e.g. WorldView-1 and Pleiades). It is deemed preferable to use a Bayesian Approach when the data obtained from the sensors are limited and it is difficult or very costly to obtain many measurements; the problem of the lack of data can then be solved by introducing a priori estimations of data. To infer the prior data, it is assumed that the roofs of the buildings are smooth, and for that purpose local entropy has been implemented. In addition to the a priori estimations, GNSS RTK measurements have been collected in the field, which are used as check points to assess the quality of the DSMs and to validate the merging result. The model has been applied in the West-End of Glasgow, containing different kinds of buildings, such as flat-roofed and hipped-roofed buildings. Both quantitative and qualitative methods have been employed to validate the merged DSM. The validation results have shown that the model was able to improve the quality of the DSMs and some characteristics, such as the roof surfaces, which consequently led to better representations. In addition, the developed model has been compared with the well-established Maximum Likelihood model and showed similar quantitative statistical results and better qualitative results. Although the proposed model has been applied to DSMs that were derived from satellite imagery, it can be applied to any other sourced DSMs.

  3. Diagnosing Hybrid Systems: a Bayesian Model Selection Approach

    Science.gov (United States)

    McIlraith, Sheila A.

    2005-01-01

    In this paper we examine the problem of monitoring and diagnosing noisy complex dynamical systems that are modeled as hybrid systems: models of continuous behavior, interleaved by discrete transitions. In particular, we examine continuous systems with embedded supervisory controllers that experience abrupt, partial or full failure of component devices. Building on our previous work in this area (MBCG99; MBCG00), our specific focus in this paper is on the mathematical formulation of the hybrid monitoring and diagnosis task as a Bayesian model tracking algorithm. The nonlinear dynamics of many hybrid systems present challenges to probabilistic tracking. Further, probabilistic tracking of a system for the purposes of diagnosis is problematic because the models of the system corresponding to failure modes are numerous and generally very unlikely. To focus tracking on these unlikely models and to reduce the number of potential models under consideration, we exploit logic-based techniques for qualitative model-based diagnosis to conjecture a limited initial set of consistent candidate models. In this paper we discuss alternative tracking techniques that are relevant to different classes of hybrid systems, focusing specifically on a method for tracking multiple models of nonlinear behavior simultaneously using factored sampling and conditional density propagation. To illustrate and motivate the approach described in this paper we examine the problem of monitoring and diagnosing NASA's Sprint AERCam, a small spherical robotic camera unit with 12 thrusters that enable both linear and rotational motion.

  4. Similarity transformation approach to identifiability analysis of nonlinear compartmental models.

    Science.gov (United States)

    Vajda, S; Godfrey, K R; Rabitz, H

    1989-04-01

    Through use of the local state isomorphism theorem instead of the algebraic equivalence theorem of linear systems theory, the similarity transformation approach is extended to nonlinear models, resulting in finitely verifiable sufficient and necessary conditions for global and local identifiability. The approach requires testing of certain controllability and observability conditions, but in many practical examples these conditions prove very easy to verify. In principle the method also involves nonlinear state variable transformations, but in all of the examples presented in the paper the transformations turn out to be linear. The method is applied to an unidentifiable nonlinear model and a locally identifiable nonlinear model, and these are the first nonlinear models other than bilinear models where the reason for lack of global identifiability is nontrivial. The method is also applied to two models with Michaelis-Menten elimination kinetics, both of considerable importance in pharmacokinetics, and for both of which the complicated nature of the algebraic equations arising from the Taylor series approach has hitherto defeated attempts to establish identifiability results for specific input functions.

  5. A new approach for Bayesian model averaging

    Institute of Scientific and Technical Information of China (English)

    TIAN XiangJun; XIE ZhengHui; WANG AiHui; YANG XiaoChun

    2012-01-01

    Bayesian model averaging (BMA) is a recently proposed statistical method for calibrating forecast ensembles from numerical weather models. However, successful implementation of BMA requires accurate estimates of the weights and variances of the individual competing models in the ensemble. Two methods, namely the Expectation-Maximization (EM) and the Markov Chain Monte Carlo (MCMC) algorithms, are widely used for BMA model training. Both methods have their own respective strengths and weaknesses. In this paper, we first modify the BMA log-likelihood function with the aim of removing the additional constraint that the BMA weights add to one, and then use a limited-memory quasi-Newton algorithm for solving the nonlinear optimization problem, thereby formulating a new approach for BMA (referred to as BMA-BFGS). Several groups of multi-model soil moisture simulation experiments from three land surface models show that the performance of BMA-BFGS is similar to the MCMC method in terms of simulation accuracy, and that both are superior to the EM algorithm. On the other hand, the computational cost of the BMA-BFGS algorithm is substantially less than for MCMC and is almost equivalent to that for EM.
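
    For context, the EM baseline that the paper compares against can be sketched in a few lines: with Gaussian member densities, the E-step computes each member's responsibility for each observation and the M-step updates the weights and a shared variance. All data and parameter values below are synthetic, not the paper's soil-moisture experiments.

```python
import numpy as np

# Minimal EM sketch for BMA weight/variance training (the baseline method,
# not the paper's BMA-BFGS). Ensemble forecasts and observations are
# synthetic: member 0 is accurate, members 1-2 are biased and noisy.

rng = np.random.default_rng(2)
n, K = 400, 3
truth = rng.normal(0, 1, n)
forecasts = np.stack([truth + rng.normal(0, 0.3, n),
                      truth + rng.normal(0.5, 1.0, n),
                      truth + rng.normal(-0.5, 1.0, n)], axis=1)

w = np.full(K, 1.0 / K)
var = 1.0
for _ in range(200):
    # E-step: responsibility of member k for each observation
    dens = np.exp(-0.5 * (truth[:, None] - forecasts) ** 2 / var) \
           / np.sqrt(2 * np.pi * var)
    z = w * dens
    z /= z.sum(axis=1, keepdims=True)
    # M-step: update weights and the shared predictive variance
    w = z.mean(axis=0)
    var = np.sum(z * (truth[:, None] - forecasts) ** 2) / n

print(np.round(w, 2))   # the accurate member should get the largest weight
```

    Note that the M-step keeps the weights on the simplex by construction; the paper's reformulation removes that constraint so an unconstrained quasi-Newton solver (L-BFGS) can be applied instead.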

  6. Negative specific heat in a thermodynamic model of multifragmentation

    CERN Document Server

    Das, C B; Mekjian, A Z

    2003-01-01

    We consider a soluble model of multifragmentation which is similar in spirit to many models which have been used to fit intermediate energy heavy ion collision data. In this model $c_v$ is always positive, but for finite nuclei $c_p$ can be negative for some temperatures and pressures. Furthermore, negative values of $c_p$ can be obtained in a canonical treatment; one does not need to use the microcanonical ensemble. Negative values for $c_p$ can persist for systems as large as 200 particles, but this depends upon the parameters used in the model calculation. As expected, negative specific heats are absent in the thermodynamic limit.

  7. Patient-specific modeling of human cardiovascular system elements

    Science.gov (United States)

    Kossovich, Leonid Yu.; Kirillova, Irina V.; Golyadkina, Anastasiya A.; Polienko, Asel V.; Chelnokova, Natalia O.; Ivanov, Dmitriy V.; Murylev, Vladimir V.

    2016-03-01

    Object of study: The research is aimed at the development of personalized medical treatment. An algorithm was developed for patient-specific surgical interventions for cardiovascular system pathologies. Methods: Geometrical models of the biological objects and the initial and boundary conditions were realized from medical diagnostic data of the specific patient. Mechanical and histomorphological parameters were obtained with the help of mechanical experiments on a universal testing machine. Computer modeling of the studied processes was conducted with the help of the finite element method. Results: Results of the numerical simulation allowed evaluating the physiological processes in the studied object in the normal state, in the presence of different pathologies and after different types of surgical procedures.

  8. Bayesian Student Modeling and the Problem of Parameter Specification.

    Science.gov (United States)

    Millan, Eva; Agosta, John Mark; Perez de la Cruz, Jose Luis

    2001-01-01

    Discusses intelligent tutoring systems and the application of Bayesian networks to student modeling. Considers reasons for not using Bayesian networks, including the computational complexity of the algorithms and the difficulty of knowledge acquisition, and proposes an approach to simplify knowledge acquisition that applies causal independence to…
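
    The causal-independence simplification mentioned above is commonly realized with noisy gates, which replace a full conditional probability table over all skill combinations with one parameter per skill. The sketch below (hypothetical slip/guess parameters, not from the paper) shows a noisy-AND style estimate of P(correct | mastered skills):

```python
# Noisy-AND sketch of causal independence in student modeling: each required
# skill contributes an independent success factor, so an n-skill item needs
# n+1 parameters instead of a 2^n-entry conditional probability table.
# All parameter values here are hypothetical.

def p_correct(mastered, slip=(0.1, 0.2), guess=0.05):
    """
    mastered: tuple of booleans, one per required skill.
    slip[i]:  probability skill i fails to be applied even if mastered.
    guess:    success factor contributed by a non-mastered skill.
    """
    p = 1.0
    for m, s in zip(mastered, slip):
        p *= (1 - s) if m else guess
    return p

print(p_correct((True, True)))   # 0.9 * 0.8
```

    This is exactly the kind of parameter-count reduction that eases the knowledge-acquisition burden the abstract refers to.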

  9. Modeling the Development of Goal-Specificity in Mirror Neurons.

    Science.gov (United States)

    Thill, Serge; Svensson, Henrik; Ziemke, Tom

    2011-12-01

    Neurophysiological studies have shown that parietal mirror neurons encode not only actions but also the goal of these actions. Although some mirror neurons will fire whenever a certain action is perceived (goal-independently), most will only fire if the motion is perceived as part of an action with a specific goal. This result is important for the action-understanding hypothesis as it provides a potential neurological basis for such a cognitive ability. It is also relevant for the design of artificial cognitive systems, in particular robotic systems that rely on computational models of the mirror system in their interaction with other agents. Yet, to date, no computational model has explicitly addressed the mechanisms that give rise to both goal-specific and goal-independent parietal mirror neurons. In the present paper, we present a computational model based on a self-organizing map, which receives artificial inputs representing information about both the observed or executed actions and the context in which they were executed. We show that the map develops a biologically plausible organization in which goal-specific mirror neurons emerge. We further show that the fundamental cause for both the appearance and the number of goal-specific neurons can be found in geometric relationships between the different inputs to the map. The results are important to the action-understanding hypothesis as they provide a mechanism for the emergence of goal-specific parietal mirror neurons and lead to a number of predictions: (1) Learning of new goals may mostly reassign existing goal-specific neurons rather than recruit new ones; (2) input differences between executed and observed actions can explain observed corresponding differences in the number of goal-specific neurons; and (3) the percentage of goal-specific neurons may differ between motion primitives.
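
    A toy version of the modeling ingredient described above, a self-organizing map receiving concatenated action and context (goal) codes, can be written as follows; the encodings, map size, and training schedule are invented for illustration and are not the paper's setup.

```python
import numpy as np

# Toy self-organizing map: each input concatenates an "action" code with a
# "context" (goal) code. After training, map units tend to specialize on
# specific action+goal pairs, i.e. become "goal-specific".

rng = np.random.default_rng(3)
actions = np.eye(2)       # two motion primitives (one-hot)
contexts = np.eye(2)      # two goals (one-hot)
data = np.array([np.concatenate([a, c]) for a in actions for c in contexts])

side = 4
weights = rng.random((side * side, 4))
rows, cols = np.divmod(np.arange(side * side), side)

for t in range(500):
    x = data[rng.integers(len(data))]
    bmu = np.argmin(((weights - x) ** 2).sum(axis=1))   # best-matching unit
    # Gaussian neighborhood on the 2-D grid, shrinking over time
    d2 = (rows - rows[bmu]) ** 2 + (cols - cols[bmu]) ** 2
    sigma = 2.0 * np.exp(-t / 250)
    h = np.exp(-d2 / (2 * sigma ** 2 + 1e-9))
    weights += 0.3 * h[:, None] * (x - weights)

# Each distinct action+goal input should now have its own best-matching unit.
bmus = {tuple(x): int(np.argmin(((weights - x) ** 2).sum(axis=1))) for x in data}
print(len(set(bmus.values())))   # ideally 4 distinct "goal-specific" units
```

    The paper's geometric argument concerns exactly this mechanism: how the distances between the concatenated inputs determine how many units end up goal-specific rather than goal-independent.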

  10. AN AUTOMATIC APPROACH TO BOX & JENKINS MODELLING

    OpenAIRE

    MARCELO KRIEGER

    1983-01-01

    In spite of the general recognition of the good forecasting ability of ARIMA models in predicting univariate time series, this approach is not widely used because of the lack of automatic, computerized procedures. In this work this problem is discussed and an algorithm is proposed.

  11. Modeling in transport phenomena a conceptual approach

    CERN Document Server

    Tosun, Ismail

    2007-01-01

    Modeling in Transport Phenomena, Second Edition presents and clearly explains with example problems the basic concepts and their applications to fluid flow, heat transfer, mass transfer, chemical reaction engineering and thermodynamics. A balanced approach is presented between analysis and synthesis, students will understand how to use the solution in engineering analysis. Systematic derivations of the equations and the physical significance of each term are given in detail, for students to easily understand and follow up the material. There is a strong incentive in science and engineering to

  12. Modeling for fairness: A Rawlsian approach.

    Science.gov (United States)

    Diekmann, Sven; Zwart, Sjoerd D

    2014-06-01

    In this paper we introduce the overlapping design consensus for the construction of models in design and the related value judgments. The overlapping design consensus is inspired by Rawls' overlapping consensus. The overlapping design consensus is a well-informed, mutual agreement among all stakeholders based on fairness. Fairness is respected if all stakeholders' interests are given due and equal attention. For reaching such fair agreement, we apply Rawls' original position and reflective equilibrium to modeling. We argue that by striving for the original position, stakeholders expel invalid arguments, hierarchies, unwarranted beliefs, and bargaining effects from influencing the consensus. The reflective equilibrium requires that stakeholders' beliefs cohere with the final agreement and its justification. Therefore, the overlapping design consensus is not only an agreement to decisions, as in most other stakeholder approaches; it is also an agreement to their justification, and this justification is consistent with each stakeholder's beliefs. For supporting fairness, we argue that fairness qualifies as a maxim in modeling. We furthermore distinguish values embedded in a model from values that are implied by its context of application. Finally, we conclude that for reaching an overlapping design consensus, communication about the properties of and the values related to a model is required.

  13. A Formal Semantic Model for the Access Specification Language RASP

    Directory of Open Access Journals (Sweden)

    Mark Evered

    2015-05-01

    Full Text Available The access specification language RASP extends traditional role-based access control (RBAC concepts to provide greater expressive power often required for fine-grained access control in sensitive information systems. Existing formal models of RBAC are not sufficient to describe these extensions. In this paper, we define a new model for RBAC which formalizes the RASP concepts of controlled role appointment and transitions, object attributes analogous to subject roles and a transitive role/attribute derivation relationship.

  14. Popularity Modeling for Mobile Apps: A Sequential Approach.

    Science.gov (United States)

    Zhu, Hengshu; Liu, Chuanren; Ge, Yong; Xiong, Hui; Chen, Enhong

    2015-07-01

    The popularity information in App stores, such as chart rankings, user ratings, and user reviews, provides an unprecedented opportunity to understand user experiences with mobile Apps, learn the process of adoption of mobile Apps, and thus enables better mobile App services. While the importance of popularity information is well recognized in the literature, the use of the popularity information for mobile App services is still fragmented and under-explored. To this end, in this paper, we propose a sequential approach based on hidden Markov model (HMM) for modeling the popularity information of mobile Apps toward mobile App services. Specifically, we first propose a popularity based HMM (PHMM) to model the sequences of the heterogeneous popularity observations of mobile Apps. Then, we introduce a bipartite based method to precluster the popularity observations. This can help to learn the parameters and initial values of the PHMM efficiently. Furthermore, we demonstrate that the PHMM is a general model and can be applicable for various mobile App services, such as trend based App recommendation, rating and review spam detection, and ranking fraud detection. Finally, we validate our approach on two real-world data sets collected from the Apple Appstore. Experimental results clearly validate both the effectiveness and efficiency of the proposed popularity modeling approach.
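
    As a small illustration of the HMM machinery underlying such a model (not the PHMM itself), the forward algorithm below scores a sequence of discretized popularity observations; the hidden states and all probabilities are invented.

```python
import numpy as np

# Forward algorithm for a discrete HMM, with per-step scaling to avoid
# underflow. Hidden states could be read as "rising"/"falling" popularity;
# observations are discretized popularity levels {0: hot, 1: warm, 2: cold}.
# All parameter values are invented.

A = np.array([[0.8, 0.2],        # state transition matrix
              [0.3, 0.7]])
B = np.array([[0.7, 0.2, 0.1],   # emission: P(obs | state)
              [0.1, 0.3, 0.6]])
pi = np.array([0.5, 0.5])        # initial state distribution

def forward_loglik(obs):
    """Log-likelihood of an observation sequence under the HMM."""
    alpha = pi * B[:, obs[0]]
    c = alpha.sum()
    alpha = alpha / c
    log_lik = np.log(c)
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        c = alpha.sum()
        alpha = alpha / c
        log_lik += np.log(c)
    return log_lik

print(forward_loglik([0, 0, 1]))   # score of a "hot streak" sequence
```

    Training such a model (and the PHMM's heterogeneous extensions) adds Baum-Welch on top of this same forward pass.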

  15. Pedagogic process modeling: Humanistic-integrative approach

    Directory of Open Access Journals (Sweden)

    Boritko Nikolaj M.

    2007-01-01

    Full Text Available The paper deals with some current problems of modeling the dynamics of the development of the individual's subject features. The term "process" is considered in the context of the humanistic-integrative approach, in which the principles of self-education are regarded as criteria for efficient pedagogic activity. Four basic characteristics of the pedagogic process are pointed out: intentionality reflects the logicality and regularity of the development of the process; discreteness (stageability) indicates the qualitative stages through which the pedagogic phenomenon passes; nonlinearity explains the crisis character of pedagogic processes and reveals inner factors of self-development; situationality requires a selection of pedagogic conditions in accordance with the inner factors, which would enable steering the pedagogic process. Two steps are offered for singling out a particular stage, together with an algorithm for developing an integrative model for it. The suggested conclusions might be of use for further theoretical research, analyses of educational practices and realistic prediction of pedagogical phenomena.

  16. Nuclear level density: Shell-model approach

    Science.gov (United States)

    Sen'kov, Roman; Zelevinsky, Vladimir

    2016-06-01

    Knowledge of the nuclear level density is necessary for understanding various reactions, including those in the stellar environment. Usually the combinatorics of a Fermi gas plus pairing is used for finding the level density. Recently a practical algorithm avoiding diagonalization of huge matrices was developed for calculating the density of many-body nuclear energy levels with certain quantum numbers for a full shell-model Hamiltonian. The underlying physics is that of quantum chaos and intrinsic thermalization in a closed system of interacting particles. We briefly explain this algorithm and, when possible, demonstrate the agreement of the results with those derived from exact diagonalization. The resulting level density is much smoother than that coming from conventional mean-field combinatorics. We study the role of various components of residual interactions in the process of thermalization, stressing the influence of incoherent collision-like processes. The shell-model results for the traditionally used parameters are also compared with standard phenomenological approaches.

  17. Modeling Social Annotation: a Bayesian Approach

    CERN Document Server

    Plangprasopchok, Anon

    2008-01-01

    Collaborative tagging systems, such as del.icio.us, CiteULike, and others, allow users to annotate objects, e.g., Web pages or scientific papers, with descriptive labels called tags. The social annotations, contributed by thousands of users, can potentially be used to infer categorical knowledge, classify documents or recommend new relevant information. Traditional text inference methods do not make best use of socially-generated data, since they do not take into account variations in individual users' perspectives and vocabulary. In a previous work, we introduced a simple probabilistic model that takes interests of individual annotators into account in order to find hidden topics of annotated objects. Unfortunately, our proposed approach had a number of shortcomings, including overfitting, local maxima and the requirement to specify values for some parameters. In this paper we address these shortcomings in two ways. First, we extend the model to a fully Bayesian framework. Second, we describe an infinite ver...

  18. Age and gender specific biokinetic model for strontium in humans

    Energy Technology Data Exchange (ETDEWEB)

    Shagina, N. B.; Tolstykh, E. I.; Degteva, M. O.; Anspaugh, L. R.; Napier, Bruce A.

    2015-03-01

    A biokinetic model for strontium in humans is necessary for quantification of internal doses due to strontium radioisotopes. The ICRP-recommended biokinetic model for strontium has limitations for use in population studies, because it is not gender specific and does not cover all age ranges. The extensive Techa River data set on 90Sr in humans (tens of thousands of measurements) is a unique source of data on long-term strontium retention for men and women of all ages at intake. These, as well as published data, were used for evaluation of age- and gender-specific parameters for a new compartmental biokinetic model for strontium (the Sr-AGe model). The Sr-AGe model has a similar structure to the ICRP model for the alkaline earth elements. The following parameters were mainly reevaluated: gastro-intestinal absorption and parameters related to the processes of bone formation and resorption defining calcium and strontium transfers in skeletal compartments. The Sr-AGe model satisfactorily describes available data sets on strontium retention for different kinds of intake (dietary and intravenous) at different ages (0–80 years old) and demonstrates good agreement with data sets for different ethnic groups. The Sr-AGe model can be used for dose assessment in epidemiological studies of general populations exposed to ingested strontium radioisotopes.
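
    The flavor of such a compartment model can be sketched as a linear ODE system. The two-compartment toy below (hypothetical rate constants, not the Sr-AGe age/gender-specific parameters) integrates whole-body retention after a unit intake into plasma:

```python
import numpy as np

# Toy two-compartment retention model: plasma exchanges with bone and loses
# activity to excretion. Rate constants (per day) are hypothetical.

k_pb, k_bp = 0.2, 0.01   # plasma -> bone, bone -> plasma
k_ex = 0.1               # plasma -> excretion

# dx/dt = M @ x, with x = [plasma, bone]
M = np.array([[-(k_pb + k_ex), k_bp],
              [k_pb,           -k_bp]])

x = np.array([1.0, 0.0])          # unit intake into plasma at t = 0
dt = 0.01
for _ in range(int(100 / dt)):    # forward-Euler integration over 100 days
    x = x + dt * (M @ x)

retained = x.sum()
print(round(retained, 3))         # whole-body retention after 100 days
```

    Real biokinetic models such as Sr-AGe use many more compartments and age-dependent transfer rates, but the mathematical core, a linear system of transfer equations, is the same.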

  19. Specific heat of the simple-cubic Ising model

    NARCIS (Netherlands)

    Feng, X.; Blöte, H.W.J.

    2010-01-01

    We provide an expression quantitatively describing the specific heat of the Ising model on the simple-cubic lattice in the critical region. This expression is based on finite-size scaling of numerical results obtained by means of a Monte Carlo method. It agrees satisfactorily with series expansions.
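
    The Monte Carlo estimate behind such results uses the fluctuation relation C = (⟨E²⟩ − ⟨E⟩²)/T² (with k_B = 1). Below is a deliberately tiny Metropolis sketch on a 4×4×4 simple-cubic lattice, nowhere near the precision or system sizes of the paper's finite-size-scaling analysis:

```python
import numpy as np

# Toy Metropolis estimate of the specific heat per spin of the simple-cubic
# Ising model from energy fluctuations, C = var(E) / T^2 / N (k_B = J = 1).
# Lattice size and run length are kept tiny for illustration.

rng = np.random.default_rng(4)
L, T = 4, 4.5                        # lattice side, temperature near T_c
s = rng.choice([-1, 1], size=(L, L, L))

def local_field(s, i, j, k):
    """Sum of the six nearest neighbors (periodic boundaries)."""
    return (s[(i+1) % L, j, k] + s[(i-1) % L, j, k] +
            s[i, (j+1) % L, k] + s[i, (j-1) % L, k] +
            s[i, j, (k+1) % L] + s[i, j, (k-1) % L])

def energy(s):
    # Sum over +x, +y, +z bonds only, to avoid double counting.
    return -(s * (np.roll(s, -1, 0) + np.roll(s, -1, 1) +
                  np.roll(s, -1, 2))).sum()

samples = []
for sweep in range(400):
    for _ in range(L**3):
        i, j, k = rng.integers(0, L, 3)
        dE = 2 * s[i, j, k] * local_field(s, i, j, k)
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            s[i, j, k] *= -1
    if sweep >= 100:                  # discard burn-in sweeps
        samples.append(energy(s))

E = np.array(samples, dtype=float)
C = E.var() / T**2 / L**3             # specific heat per spin
print(round(C, 3))
```

    Serious estimates require much larger lattices, cluster updates, and the finite-size-scaling extrapolation the abstract describes.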

  20. Verifying OCL specifications of UML models : tool support and compositionality

    NARCIS (Netherlands)

    Kyas, Marcel

    2006-01-01

    The Unified Modelling Language (UML) and the Object Constraint Language (OCL) serve as specification languages for embedded and real-time systems used in a safety-critical environment. In this dissertation class diagrams, object diagrams, and OCL constraints are formalised. The formalisation serve

  1. Multilevel Models for the Analysis of Angle-Specific Torque Curves with Application to Master Athletes

    Directory of Open Access Journals (Sweden)

    Carvalho Humberto M.

    2015-12-01

    Full Text Available The aim of this paper was to outline a multilevel modeling approach to fit individual angle-specific torque curves describing concentric knee extension and flexion isokinetic muscular actions in Master athletes. The potential of the analytical approach to examine between-individual differences across the angle-specific torque curves was illustrated by including between-individual variation due to gender differences at a higher level. Torques in concentric muscular actions of knee extension and flexion at 60°·s-1 were considered within a range of motion between 5° and 85° (only torques “truly” isokinetic). Multilevel time series models with autoregressive covariance structures were superior fits for angle-specific torque curves compared with standard multilevel models for repeated measures. Third- and fourth-order polynomial models were the best fits to describe angle-specific torque curves of isokinetic knee flexion and extension concentric actions, respectively. The fixed exponents allow interpretations for the initial acceleration, the angle at peak torque and the decrement of torque after peak torque. Also, the multilevel models were flexible enough to illustrate the influence of gender differences on the shape of torque throughout the range of motion and on the shape of the curves. The presented multilevel regression models may afford a general framework to examine angle-specific moment curves by isokinetic dynamometry, and add to the understanding of the mechanisms of strength development, particularly the force-length relationship, related both to performance and to injury prevention.

  2. A Variational Approach to the Modeling of MIMO Systems

    Directory of Open Access Journals (Sweden)

    Jraifi A

    2007-01-01

    Full Text Available Motivated by the study of the optimization of the quality of service for multiple input multiple output (MIMO) systems in 3G (third generation), we develop a method for modeling the MIMO channel. This method, which uses a statistical approach, is based on a variational form of the usual channel equation with a scalar variable. The minimum distance between received vectors is used as the random variable to model the MIMO channel. This variable is of crucial importance for the performance of the transmission system, as it captures the degree of interference between neighboring vectors. We then use this approach to compute numerically the total probability of error with respect to the signal-to-noise ratio (SNR) and to predict the number of antennas. By fixing the SNR to a specific value, we extract information on the optimal number of MIMO antennas.
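    The role the minimum distance plays in error performance can be illustrated with the textbook union-bound style approximation P ≈ Q(d_min / 2σ). This is a hedged illustration of why d_min is a natural modelling variable, not the paper's variational formulation; all parameter names are illustrative.

```python
import math

def pairwise_error_prob(d_min, snr_db, signal_power=1.0):
    """Approximate pairwise error probability from the minimum distance
    between received candidate vectors, P ~= Q(d_min / (2 * sigma)),
    where Q is the Gaussian tail function.

    A textbook sketch motivated by the abstract's use of minimum distance;
    not the paper's variational channel model.
    """
    noise_power = signal_power / (10 ** (snr_db / 10))
    sigma = math.sqrt(noise_power)
    q_arg = d_min / (2.0 * sigma)
    # Q(x) = 0.5 * erfc(x / sqrt(2))
    return 0.5 * math.erfc(q_arg / math.sqrt(2))

p_10db = pairwise_error_prob(d_min=1.0, snr_db=10.0)
p_20db = pairwise_error_prob(d_min=1.0, snr_db=20.0)
```

    As expected, the error probability falls sharply as SNR grows, which is the trade-off one would sweep to choose the number of antennas.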

  3. A Learning Based Approach to Control Synthesis of Markov Decision Processes for Linear Temporal Logic Specifications

    Science.gov (United States)

    2014-09-20

    A Learning Based Approach to Control Synthesis of Markov Decision Processes for Linear Temporal Logic Specifications. Dorsa Sadigh, Eric Kim, Samuel... Report period: 2014. ABSTRACT: We propose to synthesize a control policy for a Markov decision process (MDP) such that the resulting traces of the MDP satisfy a linear temporal logic specification.

  4. Size-specific sensitivity: Applying a new structured population model

    Energy Technology Data Exchange (ETDEWEB)

    Easterling, M.R.; Ellner, S.P.; Dixon, P.M.

    2000-03-01

    Matrix population models require the population to be divided into discrete stage classes. In many cases, especially when classes are defined by a continuous variable, such as length or mass, there are no natural breakpoints, and the division is artificial. The authors introduce the integral projection model, which eliminates the need for division into discrete classes, without requiring any additional biological assumptions. Like a traditional matrix model, the integral projection model provides estimates of the asymptotic growth rate, stable size distribution, reproductive values, and sensitivities of the growth rate to changes in vital rates. However, where the matrix model represents the size distributions, reproductive value, and sensitivities as step functions (constant within a stage class), the integral projection model yields smooth curves for each of these as a function of individual size. The authors describe a method for fitting the model to data, and they apply this method to data on an endangered plant species, northern monkshood (Aconitum noveboracense), with individuals classified by stem diameter. The matrix and integral models yield similar estimates of the asymptotic growth rate, but the reproductive values and sensitivities in the matrix model are sensitive to the choice of stage classes. The integral projection model avoids this problem and yields size-specific sensitivities that are not affected by stage duration. These general properties of the integral projection model will make it advantageous for other populations where there is no natural division of individuals into stage classes.
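    The abstract's integral projection model n(y, t+1) = ∫ K(y, x) n(x, t) dx can be discretised on a fine mesh (here the midpoint rule), after which the asymptotic growth rate and stable size distribution fall out of an eigendecomposition, exactly as for a matrix model. The kernel below is an illustrative survival/growth-plus-fecundity kernel, not the fitted monkshood kernel from the paper.

```python
import numpy as np

def ipm_growth_rate(kernel, lower, upper, n=200):
    """Discretise an integral projection model with the midpoint rule and
    return the asymptotic growth rate (dominant eigenvalue), the size
    mesh, and the stable size distribution."""
    h = (upper - lower) / n
    mesh = lower + h * (np.arange(n) + 0.5)  # midpoints of size classes
    # Discretised kernel: K[y_i, x_j] * h approximates the integral operator.
    K = np.array([[kernel(y, x) for x in mesh] for y in mesh]) * h
    eigvals, eigvecs = np.linalg.eig(K)
    i = np.argmax(eigvals.real)
    lam = eigvals[i].real
    stable = np.abs(eigvecs[:, i].real)
    stable /= stable.sum()
    return lam, mesh, stable

def demo_kernel(y, x):
    # Illustrative kernel: size-dependent survival, Gaussian growth to a
    # slightly larger size, plus small-offspring fecundity.
    survival = 0.8 / (1.0 + np.exp(-(x - 2.0)))
    growth = np.exp(-((y - (0.9 * x + 0.5)) ** 2) / 0.5) / np.sqrt(0.5 * np.pi)
    fecundity = 0.05 * x * np.exp(-((y - 0.8) ** 2) / 0.5)
    return survival * growth + fecundity

lam, mesh, stable = ipm_growth_rate(demo_kernel, 0.0, 10.0)
```

    Because the mesh is a numerical device rather than a biological classification, refining it changes accuracy but not the model, which is the source of the stage-duration robustness the abstract highlights.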

  5. An Integrated Approach to Flexible Modelling and Animated Simulation

    Institute of Scientific and Technical Information of China (English)

    Li Shuliang; Wu Zhenye

    1994-01-01

    Based on the software support of SIMAN/CINEMA, this paper presents an integrated approach to flexible modelling and simulation with animation. The methodology provides a structured way of integrating mathematical and logical modelling, statistical experimentation, and statistical analysis with computer animation. Within this methodology, an animated simulation study is separated into six different activities: simulation objectives identification, system model development, simulation experiment specification, animation layout construction, real-time simulation and animation run, and output data analysis. These six activities are objectives-driven, relatively independent, and integrated through software organization and simulation files. The key ideas behind this methodology are objectives orientation, modelling flexibility, simulation and animation integration, and application tailorability. Though the methodology is closely related to SIMAN/CINEMA, it can be extended to other software environments.

  6. The importance of understanding: Model space moderates goal specificity effects.

    Science.gov (United States)

    Kistner, Saskia; Burns, Bruce D; Vollmeyer, Regina; Kortenkamp, Ulrich

    2016-01-01

    The three-space theory of problem solving predicts that the quality of a learner's model and the goal specificity of a task interact on knowledge acquisition. In Experiment 1 participants used a computer simulation of a lever system to learn about torques. They either had to test hypotheses (nonspecific goal), or to produce given values for variables (specific goal). In the good- but not in the poor-model condition they saw torque depicted as an area. Results revealed the predicted interaction. A nonspecific goal only resulted in better learning when a good model of torques was provided. In Experiment 2 participants learned to manipulate the inputs of a system to control its outputs. A nonspecific goal to explore the system helped performance when compared to a specific goal to reach certain values when participants were given a good model, but not when given a poor model that suggested the wrong hypothesis space. Our findings support the three-space theory. They emphasize the importance of understanding for problem solving and stress the need to study underlying processes.

  7. A Tool for Model-Based Language Specification

    CERN Document Server

    Quesada, Luis; Cubero, Juan-Carlos

    2011-01-01

    Formal languages let us define the textual representation of data with precision. Formal grammars, typically in the form of BNF-like productions, describe the language syntax, which is then annotated for syntax-directed translation and completed with semantic actions. When, apart from the textual representation of data, an explicit representation of the corresponding data structure is required, the language designer has to devise the mapping between the suitable data model and its proper language specification, and then develop the conversion procedure from the parse tree to the data model instance. Unfortunately, whenever the format of the textual representation has to be modified, changes have to be propagated throughout the entire language processor tool chain. These updates are time-consuming, tedious, and error-prone. Moreover, when different applications use the same language, several copies of the same language specification have to be maintained. In this paper, we introduce a model-based parser generat...

  8. Multicomponent Equilibrium Models for Testing Geothermometry Approaches

    Energy Technology Data Exchange (ETDEWEB)

    Carl D. Palmer; Robert W. Smith; Travis L. McLing

    2013-02-01

    Geothermometry is an important tool for estimating deep reservoir temperature from the geochemical composition of shallower and cooler waters. The underlying assumption of geothermometry is that the waters collected from shallow wells and seeps maintain a chemical signature that reflects equilibrium in the deeper reservoir. Many of the geothermometers used in practice are based on correlation between water temperatures and composition, or on thermodynamic calculations based on a subset (typically silica, cations, or cation ratios) of the dissolved constituents. An alternative approach is to use complete water compositions and equilibrium geochemical modeling to calculate the degree of disequilibrium (saturation index) for a large number of potential reservoir minerals as a function of temperature. We have constructed several “forward” geochemical models using The Geochemist’s Workbench to simulate the change in chemical composition of reservoir fluids as they migrate toward the surface. These models explicitly account for the formation (mass and composition) of a steam phase and equilibrium partitioning of volatile components (e.g., CO2, H2S, and H2) into the steam as a result of pressure decreases associated with upward fluid migration from depth. We use the synthetic data generated from these simulations to determine the advantages and limitations of various geothermometry and optimization approaches for estimating the likely conditions (e.g., temperature, pCO2) to which the water was exposed in the deep subsurface. We demonstrate the magnitude of errors that can result from boiling, loss of volatiles, and analytical error from sampling and instrumental analysis. The estimated reservoir temperatures for these scenarios are also compared to conventional geothermometers. These results can help improve estimation of geothermal resource temperature during exploration and early development.
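    As an example of the conventional single-constituent geothermometers the forward models are benchmarked against, the classical quartz (no steam loss) silica geothermometer is a one-line correlation. The coefficients below follow the widely cited Fournier formulation; treat them as illustrative, and note the formula is only valid within its calibration range and is subject to exactly the boiling and dilution errors the abstract quantifies.

```python
import math

def quartz_geothermometer_c(silica_mg_per_kg):
    """Estimate reservoir temperature (deg C) from dissolved silica with
    the classical quartz (no steam loss) geothermometer:
        T = 1309 / (5.19 - log10(SiO2)) - 273.15,  SiO2 in mg/kg.
    Coefficients follow the commonly cited Fournier correlation; only
    meaningful inside its ~0-250 deg C calibration range.
    """
    return 1309.0 / (5.19 - math.log10(silica_mg_per_kg)) - 273.15

t_est = quartz_geothermometer_c(300.0)  # a typical geothermal silica level
```

    Because a boiling fluid loses steam and concentrates silica, applying such a correlation naively overestimates reservoir temperature, which is one of the error modes the multicomponent equilibrium approach is designed to expose.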

  9. A Bayesian Alternative for Multi-objective Ecohydrological Model Specification

    Science.gov (United States)

    Tang, Y.; Marshall, L. A.; Sharma, A.; Ajami, H.

    2015-12-01

    Process-based ecohydrological models combine the study of hydrological, physical, biogeochemical, and ecological processes of catchments, and are usually more complex and heavily parameterized than conceptual hydrological models. Thus, appropriate calibration objectives and model uncertainty analysis are essential for ecohydrological modeling. In recent years, Bayesian inference has become one of the most popular tools for quantifying the uncertainties in hydrological modeling with the development of Markov Chain Monte Carlo (MCMC) techniques. Our study aims to develop appropriate prior distributions and likelihood functions that minimize the model uncertainties and bias within a Bayesian ecohydrological framework. In our study, a formal Bayesian approach is implemented in an ecohydrological model which combines a hydrological model (HyMOD) and a dynamic vegetation model (DVM). Simulations based on a single-objective likelihood (streamflow or LAI) and on multi-objective likelihoods (streamflow and LAI) with different weights are compared. Uniform, weakly informative, and strongly informative prior distributions are used in different simulations. The Kullback-Leibler divergence (KLD) is used to measure the (dis)similarity between different priors and corresponding posterior distributions to examine parameter sensitivity. Results show that different prior distributions can strongly influence posterior distributions for parameters, especially when the available data is limited or parameters are insensitive to the available data. We demonstrate differences in optimized parameters and uncertainty limits in different cases based on multi-objective likelihoods vs. single-objective likelihoods. We also demonstrate the importance of appropriately defining the weights of objectives in multi-objective calibration according to different data types.
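    The multi-objective likelihood idea can be sketched as a weighted sum of per-objective Gaussian log-likelihoods, one for streamflow and one for LAI. This is a minimal illustration of the weighting scheme the abstract discusses; the function name, the simple i.i.d. Gaussian error assumption, and the single weight parameter are all illustrative simplifications.

```python
import math

def multi_objective_loglik(obs_q, sim_q, obs_lai, sim_lai,
                           sigma_q, sigma_lai, w_q=0.5):
    """Weighted joint Gaussian log-likelihood for streamflow (q) and LAI.

    A minimal sketch of multi-objective Bayesian calibration: each
    objective contributes an independent Gaussian log-likelihood, and
    w_q in [0, 1] sets the relative weight of streamflow vs. LAI.
    """
    def gauss_ll(obs, sim, sigma):
        return sum(-0.5 * math.log(2 * math.pi * sigma ** 2)
                   - (o - s) ** 2 / (2 * sigma ** 2)
                   for o, s in zip(obs, sim))

    w_lai = 1.0 - w_q
    return (w_q * gauss_ll(obs_q, sim_q, sigma_q)
            + w_lai * gauss_ll(obs_lai, sim_lai, sigma_lai))

# Example: equal weighting of a short streamflow and LAI record.
ll = multi_objective_loglik([1.0, 2.0], [1.1, 1.9], [3.0], [2.8],
                            sigma_q=0.5, sigma_lai=0.3)
```

    Inside an MCMC sampler this function would be evaluated at each proposed parameter set; changing w_q shifts which observations dominate the posterior, which is the sensitivity the abstract stresses.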

  10. An approach to analyse the specific impact of rapamycin on mRNA-ribosome association

    Directory of Open Access Journals (Sweden)

    Jaquier-Gubler Pascale

    2008-08-01

    Full Text Available Abstract Background Recent work, using both cell culture model systems and tumour-derived cell lines, suggests that the differential recruitment into polysomes of mRNA populations may be sufficient to initiate and maintain tumour formation. Consequently, a major effort is underway to use high density microarray profiles to establish molecular fingerprints for cells exposed to defined drug regimes. The aim of these pharmacogenomic approaches is to provide new information on how drugs can impact on the translational read-out within a defined cellular background. Methods We describe an approach that permits the analysis of de novo mRNA-ribosome association in vivo during short drug exposures. It combines hypertonic shock, polysome fractionation, and high-throughput analysis to provide a molecular phenotype of translationally responsive transcripts. Compared to previous translational profiling studies, the procedure offers increased specificity due to the elimination of the drug's secondary effects (e.g., on the transcriptional read-out). For this pilot "proof-of-principle" assay we selected the drug rapamycin because of its extensively studied impact on translation initiation. Results High-throughput analysis on both the light and heavy polysomal fractions has identified mRNAs whose re-recruitment onto free ribosomes responded to short exposure to the drug rapamycin. The results of the microarray have been confirmed using real-time RT-PCR. The selective down-regulation of TOP transcripts is also consistent with previous translational profiling studies using this drug. Conclusion The technical advance outlined in this manuscript offers the possibility of new insights into mRNA features that impact on translation initiation and provides a molecular fingerprint for transcript-ribosome association in any cell type and in the presence of a range of drugs of interest.
Such molecular phenotypes defined pre-clinically may ultimately impact on the evaluation of

  11. The National Map seamless digital elevation model specifications

    Science.gov (United States)

    Archuleta, Christy-Ann M.; Constance, Eric W.; Arundel, Samantha T.; Lowe, Amanda J.; Mantey, Kimberly S.; Phillips, Lori A.

    2017-08-02

    This specification documents the requirements and standards used to produce the seamless elevation layers for The National Map of the United States. Seamless elevation data are available for the conterminous United States, Hawaii, Alaska, and the U.S. territories, in three different resolutions—1/3-arc-second, 1-arc-second, and 2-arc-second. These specifications include requirements and standards information about source data requirements, spatial reference system, distribution tiling schemes, horizontal resolution, vertical accuracy, digital elevation model surface treatment, georeferencing, data source and tile dates, distribution and supporting file formats, void areas, metadata, spatial metadata, and quality assurance and control.

  12. Specific heat of a non-local attractive Hubbard model

    Energy Technology Data Exchange (ETDEWEB)

    Calegari, E.J., E-mail: eleonir@ufsm.br [Laboratório de Teoria da Matéria Condensada, Departamento de Física, UFSM, 97105-900, Santa Maria, RS (Brazil); Lobo, C.O. [Laboratório de Teoria da Matéria Condensada, Departamento de Física, UFSM, 97105-900, Santa Maria, RS (Brazil); Magalhaes, S.G. [Instituto de Física, Universidade Federal Fluminense, Av. Litorânea s/n, 24210, 346, Niterói, Rio de Janeiro (Brazil); Chaves, C.M.; Troper, A. [Centro Brasileiro de Pesquisas Físicas, Rua Xavier Sigaud 150, 22290-180, Rio de Janeiro, RJ (Brazil)

    2013-10-01

    The specific heat C(T) of an attractive (interaction G<0) non-local Hubbard model is investigated within a two-pole approximation that leads to a set of correlation functions, which play an important role as a source of anomalies such as the pseudogap. For a given range of G and n{sub T} (where n{sub T}=n{sub ↑}+n{sub ↓}), the specific heat as a function of the temperature presents a two-peak structure. Nevertheless, the presence of a pseudogap eliminates the two-peak structure. The effects of second-nearest-neighbor hopping on C(T) are also investigated.

  13. A Joint Specification Test for Response Probabilities in Unordered Multinomial Choice Models

    Directory of Open Access Journals (Sweden)

    Masamune Iwasawa

    2015-09-01

    Full Text Available Estimation results obtained from parametric models may be seriously misleading when the model is misspecified or poorly approximates the true model. This study proposes a test that jointly tests the specifications of multiple response probabilities in unordered multinomial choice models. The test statistic is asymptotically chi-square distributed, consistent against a fixed alternative, and able to detect a local alternative approaching the null at a rate slower than the parametric rate. We show that rejection regions can be calculated by a simple parametric bootstrap procedure when the sample size is small. The size and power of the tests are investigated by Monte Carlo experiments.
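    The small-sample parametric bootstrap the abstract recommends follows a generic recipe: simulate data from the fitted null model, recompute the test statistic on each replicate, and take the (1 - α) quantile as the critical value. The sketch below is that generic recipe, with placeholder arguments standing in for the model-specific pieces; it is not the paper's statistic.

```python
import random
import statistics

def bootstrap_critical_value(stat_fn, simulate_fn, fitted_params,
                             n_boot=500, alpha=0.05, seed=1):
    """Parametric-bootstrap rejection region for a specification test.

    stat_fn:     computes the test statistic from a dataset.
    simulate_fn: draws a dataset from the fitted null model.
    Returns the (1 - alpha) empirical quantile of the bootstrapped
    statistics, used as the critical value.
    """
    rng = random.Random(seed)
    stats = sorted(stat_fn(simulate_fn(fitted_params, rng))
                   for _ in range(n_boot))
    k = int((1 - alpha) * n_boot) - 1
    return stats[k]

# Toy demo: statistic = sample mean of 100 standard normals under the null.
simulate = lambda params, rng: [rng.gauss(0.0, 1.0) for _ in range(100)]
statistic = statistics.mean
cv = bootstrap_critical_value(statistic, simulate, None)
```

    A one-sided test would then reject whenever the observed statistic exceeds cv; in the toy demo cv lands near the theoretical 1.645/sqrt(100) ≈ 0.16.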

  14. Tritium Specific Adsorption Simulation Utilizing the OSPREY Model

    Energy Technology Data Exchange (ETDEWEB)

    Veronica Rutledge; Lawrence Tavlarides; Ronghong Lin; Austin Ladshaw

    2013-09-01

    During the processing of used nuclear fuel, volatile radionuclides will be discharged to the atmosphere if no recovery processes are in place to limit their release. The volatile radionuclides of concern are 3H, 14C, 85Kr, and 129I. Methods are being developed, via adsorption and absorption unit operations, to capture these radionuclides. It is necessary to model these unit operations to aid in the evaluation of technologies and in the future development of an advanced used nuclear fuel processing plant. A collaboration between Fuel Cycle Research and Development Offgas Sigma Team member INL and a NEUP grant including ORNL, Syracuse University, and Georgia Institute of Technology has been formed to develop off-gas models and support off-gas research. This report discusses the development of a tritium-specific adsorption model. The model was developed by integrating the OSPREY model with a fundamental-level isotherm model developed under the NEUP grant, using experimental data provided by that grant.
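    A generic equilibrium isotherm of the kind coupled into adsorption column models can be sketched with the Langmuir form q = q_max K p / (1 + K p). This is only a stand-in for whatever isotherm the NEUP work actually developed for tritium; the function name and all parameter values are illustrative.

```python
def langmuir_loading(partial_pressure, q_max, k_eq):
    """Equilibrium adsorbed-phase loading from a Langmuir isotherm:
        q = q_max * K * p / (1 + K * p)
    q_max: saturation capacity; k_eq: equilibrium constant; p: partial
    pressure of the adsorbing species. Illustrative placeholder for the
    tritium isotherm referenced in the report.
    """
    return q_max * k_eq * partial_pressure / (1.0 + k_eq * partial_pressure)

q_low = langmuir_loading(1.0, q_max=2.0, k_eq=0.5)   # dilute regime
q_high = langmuir_loading(1.0e6, q_max=2.0, k_eq=0.5)  # near saturation
```

    In a column model such as OSPREY, this equilibrium relation supplies the local loading that the mass-balance and kinetics equations drive toward.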

  15. Mesh structure-independent modeling of patient-specific atrial fiber orientation

    Directory of Open Access Journals (Sweden)

    Wachter Andreas

    2015-09-01

    Full Text Available The fiber orientation in the atria has a significant contribution to the electrophysiologic behavior of the heart and to the genesis of arrhythmia. Atrial fiber orientation has a direct effect on excitation propagation, activation patterns and the P-wave. We present a rule-based algorithm that works robustly on different volumetric meshes composed of either isotropic hexahedra or arbitrary tetrahedra as well as on 3-dimensional triangular surface meshes in patient-specific geometric models. This method fosters the understanding of general proarrhythmic mechanisms and enhances patient-specific modeling approaches.

  16. Evaluating face trustworthiness: a model based approach.

    Science.gov (United States)

    Todorov, Alexander; Baron, Sean G; Oosterhof, Nikolaas N

    2008-06-01

    Judgments of trustworthiness from faces determine basic approach/avoidance responses and approximate the valence evaluation of faces that runs across multiple person judgments. Here, based on trustworthiness judgments and using a computer model for face representation, we built a model for representing face trustworthiness (study 1). Using this model, we generated novel faces with an increased range of trustworthiness and used these faces as stimuli in a functional Magnetic Resonance Imaging study (study 2). Although participants did not engage in explicit evaluation of the faces, the amygdala response changed as a function of face trustworthiness. An area in the right amygdala showed a negative linear response: as the untrustworthiness of faces increased, so did the amygdala response. Areas in the left and right putamen, the latter extending into the anterior insula, showed a similar negative linear response. The response in the left amygdala was quadratic, strongest for faces at both extremes of the trustworthiness dimension. The medial prefrontal cortex and precuneus also showed a quadratic response, but their response was strongest to faces in the middle range of the trustworthiness dimension.

  17. A Discipline-Specific Approach to the History of U.S. Science Education

    Science.gov (United States)

    Otero, Valerie K.; Meltzer, David E.

    2017-01-01

    Although much has been said and written about the value of using the history of science in teaching science, relatively little is available to guide educators in the various science disciplines through the educational history of their own discipline. Through a discipline-specific approach to a course on the history of science education in the…

  18. A Functional-Notional Approach for English for Specific Purposes (ESP) Programs.

    Science.gov (United States)

    Kim, Young-Min

    English for Specific Purposes (ESP) programs, characterized by the special needs of the language learners, are described and a review of the literature on a functional-notional approach to the syllabus design of ESP programs is presented. It is suggested that effective ESP programs should teach the language skills necessary to function and perform…

  20. Psychological approaches in the treatment of specific phobias: A meta-analysis

    NARCIS (Netherlands)

    Wolitzky-Taylor, K.B.; Horowitz, J.D.; Powers, M.B.; Telch, M.J.

    2008-01-01

    Data from 33 randomized treatment studies were subjected to a meta-analysis to address questions surrounding the efficacy of psychological approaches in the treatment of specific phobia. As expected, exposure-based treatment produced large effect sizes relative to no treatment. They also outperformed placebo conditions and alternative active psychotherapeutic approaches.

  1. Shaping Approach Responses as Intervention for Specific Phobia in a Child with Autism

    Science.gov (United States)

    Ricciardi, Joseph N.; Luiselli, James K.; Camare, Marianne

    2006-01-01

    We evaluated contact desensitization (reinforcing approach responses) as intervention for specific phobia with a child diagnosed with autism. During hospital-based intervention, the boy was able to encounter previously avoided stimuli. Parental report suggested that results were maintained postdischarge. (Contains 1 figure.)

  2. Scaffolding in tissue engineering: general approaches and tissue-specific considerations.

    Science.gov (United States)

    Chan, B P; Leong, K W

    2008-12-01

    Scaffolds represent important components for tissue engineering. However, researchers often encounter an enormous variety of choices when selecting scaffolds for tissue engineering. This paper aims to review the functions of scaffolds and the major scaffolding approaches as important guidelines for selecting scaffolds and discuss the tissue-specific considerations for scaffolding, using intervertebral disc as an example.

  3. Surface mesh to voxel data registration for patient-specific anatomical modeling

    Science.gov (United States)

    de Oliveira, Júlia E. E.; Giessler, Paul; Keszei, András.; Herrler, Andreas; Deserno, Thomas M.

    2016-03-01

    Virtual Physiological Human (VPH) models are frequently used for training, planning, and performing medical procedures. The Regional Anaesthesia Simulator and Assistant (RASimAs) project aims to increase the application and effectiveness of regional anaesthesia (RA) by combining a simulator of ultrasound-guided and electrical nerve-stimulated RA procedures with a subject-specific assistance system through an integration of image processing, physiological models, subject-specific data, and virtual reality. Individualized models enrich the virtual training tools for learning and improving RA skills. We therefore propose patient-specific VPH models, composed by registering general mesh-based models with patient voxel-based recordings. Specifically, we focus on the pelvis region to support the femoral nerve block. The processing pipeline is composed of different freely available toolboxes such as MatLab, the Simulation Open Framework Architecture (SOFA), and MeshLab. The approach of Gilles is applied for mesh-to-voxel registration. The personalized VPH models include anatomical as well as mechanical properties of the tissues. Two commercial VPH models (Zygote and Anatomium) were used together with 34 MRI data sets. Results are presented for the skin surface and pelvic bones. Future work will extend the registration procedure to cope with all model tissues (i.e., skin, muscle, bone, vessel, nerve, fascia) in a one-step procedure and to extrapolate the personalized models to body regions outside the captured field of view.

  4. Approaches and models of intercultural education

    Directory of Open Access Journals (Sweden)

    Iván Manuel Sánchez Fontalvo

    2013-10-01

    Full Text Available Awareness of the need to build an intercultural society must be assumed in all social spheres, and education plays a central role in this task. Its role is transcendental, since it must create educational spaces that form people with the virtues and capacities that allow them to live together in multicultural and socially diverse (and sometimes unequal) contexts in an increasingly globalized and interconnected world, and foster the development of shared feelings of civic belonging to neighborhood, city, region, and country. Such education should give people the concern and critical judgement to confront marginalization, poverty, misery, and the inequitable distribution of wealth, causes of structural violence, while at the same time wanting to work for the welfare and transformation of these scenarios. Given these premises, it is important to know the approaches and models of intercultural education that have been developed so far, analysing their impact on the educational contexts where they are applied.

  5. Psychological approaches in the treatment of specific phobias: a meta-analysis.

    Science.gov (United States)

    Wolitzky-Taylor, Kate B; Horowitz, Jonathan D; Powers, Mark B; Telch, Michael J

    2008-07-01

    Data from 33 randomized treatment studies were subjected to a meta-analysis to address questions surrounding the efficacy of psychological approaches in the treatment of specific phobia. As expected, exposure-based treatment produced large effect sizes relative to no treatment. They also outperformed placebo conditions and alternative active psychotherapeutic approaches. Treatments involving in vivo contact with the phobic target also outperformed alternative modes of exposure (e.g., imaginal exposure, virtual reality, etc.) at post-treatment but not at follow-up. Placebo treatments were significantly more effective than no treatment, suggesting that specific phobia sufferers are moderately responsive to placebo interventions. Multi-session treatments marginally outperformed single-session treatments on domain-specific questionnaire measures of phobic dysfunction, and moderator analyses revealed that more sessions predicted more favorable outcomes. Contrary to expectation, effect sizes for the major comparisons of interest were not moderated by type of specific phobia. These findings provide the first quantitative summary evidence supporting the superiority of exposure-based treatments over alternative treatment approaches for those presenting with specific phobia. Recommendations for future research are also discussed.

  6. Flexible parametric modelling of cause-specific hazards to estimate cumulative incidence functions

    Science.gov (United States)

    2013-01-01

    Background Competing risks are a common occurrence in survival analysis. They arise when a patient is at risk of more than one mutually exclusive event, such as death from different causes, and the occurrence of one of these may prevent any other event from ever happening. Methods There are two main approaches to modelling competing risks: the first is to model the cause-specific hazards and transform these to the cumulative incidence function; the second is to model directly on a transformation of the cumulative incidence function. We focus on the first approach in this paper. This paper advocates the use of the flexible parametric survival model in this competing risk framework. Results An illustrative example on the survival of breast cancer patients has shown that the flexible parametric proportional hazards model has almost perfect agreement with the Cox proportional hazards model. However, the large epidemiological data set used here shows clear evidence of non-proportional hazards. The flexible parametric model is able to adequately account for these through the incorporation of time-dependent effects. Conclusion A key advantage of using this approach is that smooth estimates of both the cause-specific hazard rates and the cumulative incidence functions can be obtained. It is also relatively easy to incorporate time-dependent effects which are commonly seen in epidemiological studies. PMID:23384310
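    The first approach described in the Methods, transforming cause-specific hazards into a cumulative incidence function via CIF_k(t) = ∫₀ᵗ h_k(u) S(u) du with S(u) the overall survival, can be sketched numerically. The constant hazards below are a deliberately minimal stand-in for the spline-based flexible parametric hazards the paper actually fits.

```python
import math

def cumulative_incidence(h1, h2, t_max, dt=0.001):
    """Build cumulative incidence functions for two competing events from
    their cause-specific hazard functions via
        CIF_k(t) = integral_0^t h_k(u) * S(u) du,
    where S(u) = exp(-integral_0^u (h1 + h2)) is overall survival.
    Simple left-endpoint Euler integration; a sketch, not the paper's
    spline-based estimator.
    """
    cif1 = cif2 = cum_haz = 0.0
    t = 0.0
    while t < t_max:
        s = math.exp(-cum_haz)       # overall survival at time t
        cif1 += h1(t) * s * dt
        cif2 += h2(t) * s * dt
        cum_haz += (h1(t) + h2(t)) * dt
        t += dt
    return cif1, cif2

# With constant hazards, CIF_k(inf) -> h_k / (h1 + h2): here 2/3 and 1/3.
c1, c2 = cumulative_incidence(lambda t: 0.2, lambda t: 0.1, t_max=50.0)
```

    The key point the abstract makes carries over: once smooth cause-specific hazards are available (however they are modelled), smooth cumulative incidence curves follow by this transformation.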

  7. Generalized framework for context-specific metabolic model extraction methods

    Directory of Open Access Journals (Sweden)

    Semidán Robaina Estévez

    2014-09-01

    Full Text Available Genome-scale metabolic models are increasingly applied to investigate the physiology not only of simple prokaryotes, but also of eukaryotes, such as plants, characterized by compartmentalized cells of multiple types. While genome-scale models aim at including the entirety of known metabolic reactions, mounting evidence indicates that only a subset of these reactions is active in a given context, determined by factors such as developmental stage, cell type, or environment. As a result, several methods have been proposed to reconstruct context-specific models from existing genome-scale models by integrating various types of high-throughput data. Here we present a mathematical framework that puts all existing methods under one umbrella and provides the means to better understand their functioning, to highlight similarities and differences, and to help users select the most suitable method for an application.
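    At its crudest, context-specific extraction reduces to keeping reactions whose associated expression evidence clears a threshold while protecting essential reactions. The snippet below is a deliberately simplified caricature of the family of methods the framework unifies (GIMME, iMAT, INIT, and relatives); real methods solve optimisation problems to keep the reduced network functional, and all identifiers here are illustrative.

```python
def extract_context_specific(reactions, expression, threshold, protected=()):
    """Keep reactions whose gene-expression score meets a threshold,
    plus any protected reactions (e.g., a biomass reaction).

    A toy caricature of context-specific model extraction; real methods
    additionally enforce network consistency via optimisation.
    """
    return {r for r in reactions
            if r in protected or expression.get(r, 0.0) >= threshold}

# Toy glycolysis fragment: PFK has low expression in this context.
core = extract_context_specific(
    reactions={"PGI", "PFK", "FBA", "BIOMASS"},
    expression={"PGI": 8.2, "PFK": 1.1, "FBA": 6.7},
    threshold=5.0,
    protected={"BIOMASS"},
)
```

    The differences between published methods lie precisely in what replaces this hard threshold: soft penalties, integer programming, or flux-consistency constraints, which is what the unifying framework makes explicit.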

  8. Analysis specifications for the CC3 biosphere model biotrac

    Energy Technology Data Exchange (ETDEWEB)

    Szekely, J.G.; Wojciechowski, L.C.; Stephens, M.E.; Halliday, H.A.

    1994-12-01

    The CC3 (Canadian Concept, generation 3) model BIOTRAC (Biosphere Transport and Consequences) describes the movement in the biosphere of releases from an underground disposal vault, and the consequent radiological dose to a reference individual. Concentrations of toxic substances in different parts of the biosphere are also calculated. BIOTRAC was created specifically for the postclosure analyses of the Environmental Impact Statement that AECL is preparing on the concept for disposal of Canada's nuclear fuel waste. The model relies on certain assumptions and constraints on the system, which are described by Davis et al. Accordingly, great care must be exercised if BIOTRAC is used for any other purpose.

  9. Computational cognitive modeling for the diagnosis of Specific Language Impairment.

    Science.gov (United States)

    Oliva, Jesus; Serrano, J Ignacio; del Castillo, M Dolores; Iglesias, Angel

    2013-01-01

    Specific Language Impairment (SLI), as many other cognitive deficits, is difficult to diagnose given its heterogeneous profile and its overlap with other impairments. Existing techniques are based on different criteria using behavioral variables on different tasks. In this paper we propose a methodology for the diagnosis of SLI that uses computational cognitive modeling in order to capture the internal mechanisms of the normal and impaired brain. We show that machine learning techniques that use the information of these models perform better than those that only use behavioral variables.

  10. Genetic Algorithm Approaches to Prebiotic Chemistry Modeling

    Science.gov (United States)

    Lohn, Jason; Colombano, Silvano

    1997-01-01

    We model an artificial chemistry comprised of interacting polymers by specifying two initial conditions: a distribution of polymers and a fixed set of reversible catalytic reactions. A genetic algorithm is used to find a set of reactions that exhibit a desired dynamical behavior. Such a technique is useful because it allows an investigator to determine whether a specific pattern of dynamics can be produced, and if it can, the reaction network found can be then analyzed. We present our results in the context of studying simplified chemical dynamics in theorized protocells - hypothesized precursors of the first living organisms. Our results show that given a small sample of plausible protocell reaction dynamics, catalytic reaction sets can be found. We present cases where this is not possible and also analyze the evolved reaction sets.
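The search loop described above can be sketched with a toy genetic algorithm: evolve rate constants for a reversible reaction A ⇌ B until its steady state matches a target. The two-species system, target fraction, and GA parameters are illustrative assumptions, not the paper's chemistry.

```python
import random

random.seed(0)

# Toy GA sketch: find rate constants (kf, kr) for A <-> B whose equilibrium
# fraction of B matches a desired dynamical target. Illustrative only.

def steady_state_fraction(kf, kr):
    # Fraction of material in B at equilibrium of A <-> B.
    return kf / (kf + kr)

TARGET = 0.75  # desired fraction in B (an assumed target behavior)

def fitness(genome):
    kf, kr = genome
    return -abs(steady_state_fraction(kf, kr) - TARGET)

def evolve(pop_size=50, generations=100):
    pop = [(random.uniform(0.01, 1.0), random.uniform(0.01, 1.0))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]      # keep the fitter half
        children = []
        while len(parents) + len(children) < pop_size:
            kf, kr = random.choice(parents)
            children.append((max(0.01, kf + random.gauss(0.0, 0.05)),
                             max(0.01, kr + random.gauss(0.0, 0.05))))
        pop = parents + children
    return max(pop, key=fitness)

kf, kr = evolve()
print(round(steady_state_fraction(kf, kr), 2))
```

In the paper's setting the genome encodes a whole catalytic reaction set and the fitness compares full time courses, but the evolve-select-mutate loop is the same.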

  11. Mathematical modelling of digit specification by a sonic hedgehog gradient

    KAUST Repository

    Woolley, Thomas E.

    2013-11-26

    Background: The three chick wing digits represent a classical example of a pattern specified by a morphogen gradient. Here we have investigated whether a mathematical model of a Shh gradient can describe the specification of the identities of the three chick wing digits and if it can be applied to limbs with more digits. Results: We have produced a mathematical model for specification of chick wing digit identities by a Shh gradient that can be extended to the four digits of the chick leg with Shh-producing cells forming a digit. This model cannot be extended to specify the five digits of the mouse limb. Conclusions: Our data suggest that the parameters of a classical-type morphogen gradient are sufficient to specify the identities of three different digits. However, to specify more digit identities, this core mechanism has to be coupled to alternative processes, one being that in the chick leg and mouse limb, Shh-producing cells give rise to digits; another that in the mouse limb, the cellular response to the Shh gradient adapts over time so that digit specification does not depend simply on Shh concentration. Developmental Dynamics 243:290-298, 2014. © 2013 Wiley Periodicals, Inc.
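The core readout mechanism of a classical morphogen gradient can be sketched in a few lines: an exponentially decaying steady-state concentration is compared against thresholds to assign identities. The decay length, threshold values, and digit labels below are illustrative assumptions, not the model's fitted parameters.

```python
import math

# Sketch of threshold readout from a steady-state exponential Shh gradient,
# C(x) = C0 * exp(-x / lam). Parameters are illustrative assumptions.

def concentration(x, c0=1.0, lam=50.0):
    """Morphogen concentration at distance x (microns) from the source."""
    return c0 * math.exp(-x / lam)

def digit_identity(x):
    c = concentration(x)
    if c > 0.6:
        return "digit 4"   # closest to the posterior Shh source
    if c > 0.3:
        return "digit 3"
    return "digit 2"       # anterior, lowest Shh exposure

print([digit_identity(x) for x in (10, 40, 100)])
# → ['digit 4', 'digit 3', 'digit 2']
```

The paper's conclusion maps onto this sketch directly: three thresholds on one static gradient suffice for three identities, but specifying five mouse digits requires extra mechanisms (Shh-producing cells becoming digits, or a response that adapts over time) that a fixed threshold readout cannot capture.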

  12. Systems pharmacology modeling: an approach to improving drug safety.

    Science.gov (United States)

    Bai, Jane P F; Fontana, Robert J; Price, Nathan D; Sangar, Vineet

    2014-01-01

    Advances in systems biology in conjunction with the expansion in knowledge of drug effects and diseases present an unprecedented opportunity to extend traditional pharmacokinetic and pharmacodynamic modeling/analysis to conduct systems pharmacology modeling. Many drugs that cause liver injury and myopathies have been studied extensively. Mitochondrion-centric systems pharmacology modeling is important since drug toxicity across a large number of pharmacological classes converges to mitochondrial injury and death. Approaches to systems pharmacology modeling of drug effects need to consider drug exposure, organelle and cellular phenotypes across all key cell types of human organs, organ-specific clinical biomarkers/phenotypes, gene-drug interaction and immune responses. Systems modeling approaches, that leverage the knowledge base constructed from curating a selected list of drugs across a wide range of pharmacological classes, will provide a critically needed blueprint for making informed decisions to reduce the rate of attrition for drugs in development and increase the number of drugs with an acceptable benefit/risk ratio.

  13. Spatiotemporal infectious disease modeling: a BME-SIR approach.

    Science.gov (United States)

    Angulo, Jose; Yu, Hwa-Lung; Langousis, Andrea; Kolovos, Alexander; Wang, Jinfeng; Madrid, Ana Esther; Christakos, George

    2013-01-01

    This paper is concerned with the modeling of infectious disease spread in a composite space-time domain under conditions of uncertainty. We focus on stochastic modeling that accounts for basic mechanisms of disease distribution and multi-sourced in situ uncertainties. Starting from the general formulation of population migration dynamics and the specification of transmission and recovery rates, the model studies the functional formulation of the evolution of the fractions of susceptible-infected-recovered individuals. The suggested approach is capable of: a) modeling population dynamics within and across localities, b) integrating the disease representation (i.e. susceptible-infected-recovered individuals) with observation time series at different geographical locations and other sources of information (e.g. hard and soft data, empirical relationships, secondary information), and c) generating predictions of disease spread and associated parameters in real time, while considering model and observation uncertainties. Key aspects of the proposed approach are illustrated by means of simulations (i.e. synthetic studies), and a real-world application using hand-foot-mouth disease (HFMD) data from China.
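The deterministic core that the BME-SIR framework builds its stochastic machinery around is the standard susceptible-infected-recovered update. A minimal discrete-time sketch, with illustrative transmission and recovery rates (the paper's spatiotemporal and uncertainty components are omitted):

```python
# Deterministic discrete-time SIR update: the mechanistic core of the
# susceptible-infected-recovered dynamics. beta and gamma are illustrative.

def sir_step(s, i, r, beta=0.3, gamma=0.1):
    new_inf = beta * s * i   # new infections this step
    new_rec = gamma * i      # new recoveries this step
    return s - new_inf, i + new_inf - new_rec, r + new_rec

s, i, r = 0.99, 0.01, 0.0    # population fractions
for _ in range(200):
    s, i, r = sir_step(s, i, r)

# Mass is conserved and the epidemic eventually burns out.
print(round(s + i + r, 6), i < 0.01)
```

The BME-SIR approach replaces the fixed parameters above with uncertain, space-time varying ones and conditions the trajectory on hard and soft observations, but each synthetic realization still follows this update.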

  14. 78 FR 3921 - Proposed Models for Plant-Specific Adoption of Technical Specifications Task Force Traveler TSTF...

    Science.gov (United States)

    2013-01-17

    ... From the Federal Register Online via the Government Publishing Office NUCLEAR REGULATORY COMMISSION Proposed Models for Plant-Specific Adoption of Technical Specifications Task Force Traveler TSTF... (SE) for plant-specific adoption of Technical Specifications (TS) Task Force (TSTF) Traveler TSTF-426...

  15. Evaluation of mesh morphing and mapping techniques in patient specific modelling of the human pelvis.

    Science.gov (United States)

    Salo, Zoryana; Beek, Maarten; Whyne, Cari Marisa

    2012-08-01

    Robust generation of pelvic finite element models is necessary to understand variation in mechanical behaviour resulting from differences in gender, aging, disease and injury. The objective of this study was to apply and evaluate mesh morphing and mapping techniques to facilitate the creation and structural analysis of specimen-specific finite element (FE) models of the pelvis. A specimen-specific pelvic FE model (source mesh) was generated following a traditional user-intensive meshing scheme. The source mesh was morphed onto a computed tomography scan generated target surface of a second pelvis using a landmark-based approach, in which exterior source nodes were shifted to target surface vertices, while constrained along a normal. A second copy of the morphed model was further refined through mesh mapping, in which surface nodes of the initial morphed model were selected in patches and remapped onto the surfaces of the target model. Computed tomography intensity-based material properties were assigned to each model. The source, target, morphed and mapped models were analyzed under axial compression using linear static FE analysis, and their strain distributions were evaluated. Morphing and mapping techniques were effectively applied to generate good quality and geometrically complex specimen-specific pelvic FE models. Mapping significantly improved strain concurrence with the target pelvis FE model.
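The node-shifting step of mesh morphing can be reduced to a simple sketch: project each source surface node onto the target surface. The study constrains each shift along the node normal; the version below simplifies that to a nearest-vertex snap, and the coordinates are toy values.

```python
# Sketch of the morphing step: shift source surface nodes onto a target
# surface. Simplified to nearest-vertex snapping (the paper additionally
# constrains the shift along the node normal). Toy coordinates.

def nearest(p, vertices):
    """Vertex of the target surface closest to point p (squared distance)."""
    return min(vertices, key=lambda v: sum((a - b) ** 2 for a, b in zip(p, v)))

def morph(source_nodes, target_vertices):
    return [nearest(p, target_vertices) for p in source_nodes]

source = [(0.0, 0.0, 0.0), (1.0, 0.1, 0.0)]
target = [(0.1, 0.0, 0.0), (0.9, 0.0, 0.05), (5.0, 5.0, 5.0)]
print(morph(source, target))
```

The mapping refinement described afterward repeats this idea patch by patch, which is what improves strain agreement with a model meshed natively on the target geometry.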

  16. Flexible parametric modelling of the cause-specific cumulative incidence function.

    Science.gov (United States)

    Lambert, Paul C; Wilkes, Sally R; Crowther, Michael J

    2016-12-22

    Competing risks arise with time-to-event data when individuals are at risk of more than one type of event and the occurrence of one event precludes the occurrence of all other events. A useful measure with competing risks is the cause-specific cumulative incidence function (CIF), which gives the probability of experiencing a particular event as a function of follow-up time, accounting for the fact that some individuals may have a competing event. When modelling the cause-specific CIF, the most common model is a semi-parametric proportional subhazards model. In this paper, we propose the use of flexible parametric survival models to directly model the cause-specific CIF where the effect of follow-up time is modelled using restricted cubic splines. The models provide smooth estimates of the cause-specific CIF with the important advantage that the approach is easily extended to model time-dependent effects. The models can be fitted using standard survival analysis tools by a combination of data expansion and introducing time-dependent weights. Various link functions are available that allow modelling on different scales and have proportional subhazards, proportional odds and relative absolute risks as particular cases. We conduct a simulation study to evaluate how well the spline functions approximate subhazard functions with complex shapes. The methods are illustrated using data from the European Blood and Marrow Transplantation Registry showing excellent agreement between parametric estimates of the cause-specific CIF and those obtained from a semi-parametric model. We also fit models relaxing the proportional subhazards assumption using alternative link functions and/or including time-dependent effects. Copyright © 2016 John Wiley & Sons, Ltd.
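The quantity being modelled above, the cause-specific CIF, has a very simple empirical counterpart when there is no censoring: the fraction of subjects who experienced event k by time t. A minimal sketch with invented data (the paper's spline-based parametric models and censoring handling are far beyond this illustration):

```python
# Empirical cause-specific cumulative incidence for uncensored
# competing-risks data: CIF_k(t) = P(event of cause k by time t).
# Times, causes, and labels below are invented for the example.

def cif(times, causes, cause, t):
    n = len(times)
    return sum(1 for ti, ci in zip(times, causes)
               if ci == cause and ti <= t) / n

times  = [2, 3, 5, 7, 8, 11]
causes = [1, 2, 1, 1, 2, 1]   # 1 = relapse, 2 = death in remission (toy)
print(cif(times, causes, cause=1, t=7),   # 3 of 6 relapsed by t = 7
      cif(times, causes, cause=2, t=7))   # 1 of 6 died by t = 7
```

Note that the two CIFs at any t sum to at most 1: experiencing one event precludes the other, which is exactly why naive cause-specific Kaplan-Meier estimates (treating competing events as censoring) overstate each risk.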

  17. A rule-based approach to model checking of UML state machines

    Science.gov (United States)

    Grobelna, Iwona; Grobelny, Michał; Stefanowicz, Łukasz

    2016-12-01

    In the paper a new approach to formal verification of control process specifications expressed by means of UML state machines in version 2.x is proposed. In contrast to other approaches from the literature, we use an abstract and universal rule-based logical model suitable both for model checking (using the nuXmv model checker) and for logical synthesis in the form of rapid prototyping. Hence, a prototype implementation in the hardware description language VHDL can be obtained that fully reflects the primary, already formally verified specification in the form of UML state machines. The presented approach increases the assurance that the implemented system meets the user-defined requirements.
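What a model checker such as nuXmv automates symbolically can be illustrated with an explicit-state sketch: exhaustively explore the reachable states of a machine and report whether a safety property ("a bad state is never reached") is violated. The states and transitions below are invented for the example.

```python
from collections import deque

# Explicit-state sketch of safety-property checking on a small state
# machine. Model checkers like nuXmv do this symbolically; the machine
# here is invented for illustration.

def violates_safety(initial, transitions, bad_states):
    """BFS over reachable states; True if any bad state is reachable."""
    seen, queue = {initial}, deque([initial])
    while queue:
        state = queue.popleft()
        if state in bad_states:
            return True
        for nxt in transitions.get(state, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

transitions = {"idle": ["running"],
               "running": ["idle", "error"],
               "error": ["idle"]}
print(violates_safety("idle", transitions, bad_states={"error"}))  # → True
```

Running the check before synthesis is the point of the approach above: the same formally verified transition relation is then carried into the VHDL prototype.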

  18. A Bayesian modeling approach for generalized semiparametric structural equation models.

    Science.gov (United States)

    Song, Xin-Yuan; Lu, Zhao-Hua; Cai, Jing-Heng; Ip, Edward Hak-Sing

    2013-10-01

    In behavioral, biomedical, and psychological studies, structural equation models (SEMs) have been widely used for assessing relationships between latent variables. Regression-type structural models based on parametric functions are often used for such purposes. In many applications, however, parametric SEMs are not adequate to capture subtle patterns in the functions over the entire range of the predictor variable. A different but equally important limitation of traditional parametric SEMs is that they are not designed to handle mixed data types: continuous, count, ordered, and unordered categorical. This paper develops a generalized semiparametric SEM that is able to handle mixed data types and to simultaneously model different functional relationships among latent variables. A structural equation of the proposed SEM is formulated using a series of unspecified smooth functions. The Bayesian P-splines approach and Markov chain Monte Carlo methods are developed to estimate the smooth functions and the unknown parameters. Moreover, we examine the relative benefits of semiparametric modeling over parametric modeling using a Bayesian model-comparison statistic, called the complete deviance information criterion (DIC). The performance of the developed methodology is evaluated using a simulation study. To illustrate the method, we used a data set derived from the National Longitudinal Survey of Youth.
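The difference-penalty idea behind P-splines can be shown in miniature with a Whittaker smoother: fit values that stay close to the data while a penalty on second differences enforces smoothness. This is a penalized least-squares sketch, not the paper's Bayesian MCMC machinery, and the data and smoothing parameter are illustrative.

```python
import numpy as np

# Whittaker smoother: the discrete second-difference penalty underlying
# P-splines, fit here by penalized least squares rather than MCMC.
# Solves (I + lam * D'D) z = y, where D takes second differences.

def whittaker_smooth(y, lam=10.0):
    n = len(y)
    D = np.diff(np.eye(n), n=2, axis=0)   # (n-2) x n second-difference matrix
    return np.linalg.solve(np.eye(n) + lam * D.T @ D, y)

y = np.array([0.0, 1.1, 1.9, 3.2, 3.9, 5.1])  # noisy, roughly linear trend
z = whittaker_smooth(y)
print(np.round(z, 2))
```

In the Bayesian P-splines formulation the same penalty reappears as a random-walk prior on the spline coefficients, with lam replaced by a variance parameter that is itself estimated.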

  19. Anomalous superconductivity in the tJ model; moment approach

    DEFF Research Database (Denmark)

    Sørensen, Mads Peter; Rodriguez-Nunez, J.J.

    1997-01-01

    By extending the moment approach of Nolting (Z. Phys. 225 (1972) 25) to the superconducting phase, we have constructed the one-particle spectral functions (diagonal and off-diagonal) for the tJ model in any dimension. We propose that both the diagonal and the off-diagonal spectral functions ... Hartree shift, which in the end enlarges the bandwidth of the free carriers, allowing us to take relatively high values of J/t and allowing superconductivity to live in the Tc-ρ phase diagram, in agreement with numerical calculations in a cluster. We have calculated the static spin susceptibility, χ(T), and the specific heat, Cv(T), within the moment approach. We find that all the relevant physical quantities show the signature of superconductivity at Tc in the form of kinks (anomalous behavior) or jumps, for low density, in agreement with recently published literature, showing a generic ...

  20. Patient-Specific Computational Modeling of Human Phonation

    Science.gov (United States)

    Xue, Qian; Zheng, Xudong; University of Maine Team

    2013-11-01

    Phonation is a common biological process resulting from the complex nonlinear coupling between glottal aerodynamics and vocal fold vibrations. In the past, simplified symmetric straight geometric models were commonly employed for experimental and computational studies. The shape of the larynx lumen and the vocal folds is, however, highly three-dimensional, and the complex realistic geometry produces profound impacts on both glottal flow and vocal fold vibrations. To elucidate the effect of geometric complexity on voice production and improve the fundamental understanding of human phonation, a full flow-structure interaction simulation is carried out on a patient-specific larynx model. To the best of our knowledge, this is the first patient-specific flow-structure interaction study of human phonation. The simulation results compare well with established human data. The effects of realistic geometry on glottal flow and vocal fold dynamics are investigated. It is found that both glottal flow and vocal fold dynamics differ substantially from the previous simplified models. This study also paves an important step toward the development of computer models for voice disease diagnosis and surgical planning. The project described was supported by Grant Number R01DC007125 from the National Institute on Deafness and Other Communication Disorders (NIDCD).

  1. Building Energy Modeling: A Data-Driven Approach

    Science.gov (United States)

    Cui, Can

    Buildings consume nearly 50% of the total energy in the United States, which drives the need to develop high-fidelity models for building energy systems. Extensive methods and techniques have been developed, studied, and applied to building energy simulation and forecasting, but most work has focused on developing dedicated modeling approaches for generic buildings. In this study, an integrated, computationally efficient and high-fidelity building energy modeling framework is proposed, with a concentration on developing a generalized modeling approach for various types of buildings. First, a number of data-driven simulation models are reviewed and assessed on various types of computationally expensive simulation problems. Motivated by the conclusion that no model outperforms the others when amortized over diverse problems, a meta-learning based recommendation system for data-driven simulation modeling is proposed. To test the feasibility of the proposed framework on the building energy system, an extended application of the recommendation system for short-term building energy forecasting is deployed on various buildings. Finally, a Kalman filter-based data fusion technique is incorporated into the building recommendation system for on-line energy forecasting. Data fusion enables model calibration to update the state estimation in real time, which filters out noise and renders more accurate energy forecasts. The framework is composed of two modules: an off-line model recommendation module and an on-line model calibration module. Specifically, the off-line model recommendation module includes 6 widely used data-driven simulation models, which are ranked by the meta-learning recommendation system for off-line energy modeling on a given building scenario. Only a selective set of building physical and operational characteristic features is needed to complete the recommendation task. The on-line calibration module effectively addresses system uncertainties, where data fusion on
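The on-line calibration idea, fusing a model forecast with noisy meter readings, can be sketched with a scalar Kalman filter. The constant-load process model and the noise variances below are illustrative assumptions, not the study's configuration.

```python
# Scalar Kalman filter sketch: fuse a building-energy forecast with noisy
# meter readings. Process/measurement variances are illustrative.

def kalman_update(x, p, z, q=0.01, r=0.25):
    """One predict/update cycle.
    x, p: state estimate (kWh) and its variance; z: new measurement;
    q: process noise variance; r: measurement noise variance."""
    p = p + q                 # predict: uncertainty grows between readings
    k = p / (p + r)           # Kalman gain: trust in the measurement
    x = x + k * (z - x)       # update the estimate toward the measurement
    p = (1 - k) * p           # uncertainty shrinks after the update
    return x, p

x, p = 100.0, 1.0             # initial model forecast and its variance
for z in (104.0, 103.0, 105.0, 104.5):
    x, p = kalman_update(x, p, z)
print(round(x, 1), p < 1.0)
```

After a few readings the estimate has moved from the biased forecast toward the metered level, and its variance has contracted, which is what "filtering out the noise" means in the framework above.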

  2. Animal models of antimuscle-specific kinase myasthenia.

    Science.gov (United States)

    Richman, David P; Nishi, Kayoko; Ferns, Michael J; Schnier, Joachim; Pytel, Peter; Maselli, Ricardo A; Agius, Mark A

    2012-12-01

    Antimuscle-specific kinase (anti-MuSK) myasthenia (AMM) differs from antiacetylcholine receptor myasthenia gravis in exhibiting more focal muscle involvement (neck, shoulder, facial, and bulbar muscles) with wasting of the involved, primarily axial, muscles. AMM is not associated with thymic hyperplasia and responds poorly to anticholinesterase treatment. Animal models of AMM have been induced in rabbits, mice, and rats by immunization with purified xenogeneic MuSK ectodomain, and by passive transfer of large quantities of purified serum IgG from AMM patients into mice. The models have confirmed the pathogenic role of the MuSK antibodies in AMM and have demonstrated the involvement of both the presynaptic and postsynaptic components of the neuromuscular junction. The observations in this human disease and its animal models demonstrate the role of MuSK not only in the formation of this synapse but also in its maintenance.

  3. Cyber-Physical Energy Systems Modeling, Test Specification, and Co-Simulation Based Testing

    DEFF Research Database (Denmark)

    van der Meer, A. A.; Palensky, P.; Heussen, Kai

    2017-01-01

    The gradual deployment of intelligent and coordinated devices in the electrical power system needs careful investigation of the interactions between the various domains involved. Especially due to the coupling between ICT and power systems, a holistic approach for testing and validating is required. Taking existing (quasi-)standardised smart grid system and test specification methods as a starting point, we are developing a holistic testing and validation approach that allows a very flexible way of assessing the system-level aspects by various types of experiments (including virtual, real ...). The presented method addresses most modeling and specification challenges in cyber-physical energy systems and is extensible for future additions such as uncertainty quantification.

  4. Agribusiness model approach to territorial food development

    Directory of Open Access Journals (Sweden)

    Murcia Hector Horacio

    2011-04-01

    Full Text Available

    Several research efforts within the academic program of Agricultural Business Management at the University De La Salle (Bogotá D.C.) have been coordinated toward the design and implementation of a sustainable agribusiness model applied to food development, with territorial projection. Rural development is considered as a process that aims to improve the current capacity and potential of the inhabitants of the sector, and refers not only to production levels and productivity of agricultural items. It takes into account the guidelines of the United Nations “Millennium Development Goals” and the concept of sustainable food and agriculture development, including food security and nutrition in an integrated interdisciplinary context, with a holistic and systemic dimension. The analysis is specified by a model with an emphasis on sustainable agribusiness production chains related to agricultural food items in a specific region. This model was correlated with farm (technical objectives), family (social purposes) and community (collective orientations) projects. Within this dimension, food development concepts and the methodologies of Participatory Action Research (PAR) are considered. Finally, it addresses the need to link the results to low-income communities, within the concepts of the “new rurality”.

  5. An approach to model validation and model-based prediction -- polyurethane foam case study.

    Energy Technology Data Exchange (ETDEWEB)

    Dowding, Kevin J.; Rutherford, Brian Milne

    2003-07-01

    Enhanced software methodology and improved computing hardware have advanced the state of simulation technology to a point where large physics-based codes can be a major contributor in many systems analyses. This shift toward the use of computational methods has brought with it new research challenges in a number of areas including characterization of uncertainty, model validation, and the analysis of computer output. It is these challenges that have motivated the work described in this report. Approaches to and methods for model validation and (model-based) prediction have been developed recently in the engineering, mathematics and statistical literatures. In this report we have provided a fairly detailed account of one approach to model validation and prediction applied to an analysis investigating thermal decomposition of polyurethane foam. A model simulates the evolution of the foam in a high temperature environment as it transforms from a solid to a gas phase. The available modeling and experimental results serve as data for a case study focusing our model validation and prediction developmental efforts on this specific thermal application. We discuss several elements of the "philosophy" behind the validation and prediction approach: (1) We view the validation process as an activity applying to the use of a specific computational model for a specific application. We do acknowledge, however, that an important part of the overall development of a computational simulation initiative is the feedback provided to model developers and analysts associated with the application. (2) We utilize information obtained for the calibration of model parameters to estimate the parameters and quantify uncertainty in the estimates. We rely, however, on validation data (or data from similar analyses) to measure the variability that contributes to the uncertainty in predictions for specific systems or units (unit-to-unit variability). (3) We perform statistical
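The calibrate-then-predict pattern in point (2) can be reduced to a one-parameter sketch: estimate a decomposition rate from "experimental" data, then carry the residual spread into the prediction uncertainty. The decay model m(t) = exp(-k t), the data points, and the grid search are illustrative assumptions, not the report's foam model.

```python
import math

# Sketch of calibration with uncertainty: fit the rate k of a toy
# decomposition model m(t) = exp(-k t) to "experimental" mass fractions,
# then summarize residual spread for prediction uncertainty. Illustrative.

data = [(0.0, 1.00), (1.0, 0.61), (2.0, 0.36), (3.0, 0.23)]  # (time, mass)

def sse(k):
    """Sum of squared residuals between model and data for rate k."""
    return sum((m - math.exp(-k * t)) ** 2 for t, m in data)

# Calibrate k by a coarse grid search over plausible rates.
k_hat = min((0.01 * j for j in range(1, 200)), key=sse)

# Residual spread around the calibrated model feeds prediction uncertainty.
resid_sd = (sse(k_hat) / (len(data) - 1)) ** 0.5
print(round(k_hat, 2), resid_sd < 0.05)
```

The report's distinction in point (2) maps onto the last two lines: k_hat comes from calibration data, while the spread used for prediction intervals should come from validation data or unit-to-unit variability, not from the calibration fit alone.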

  6. A quality risk management model approach for cell therapy manufacturing.

    Science.gov (United States)

    Lopez, Fabio; Di Bartolo, Chiara; Piazza, Tommaso; Passannanti, Antonino; Gerlach, Jörg C; Gridelli, Bruno; Triolo, Fabio

    2010-12-01

    International regulatory authorities view risk management as an essential production need for the development of innovative, somatic cell-based therapies in regenerative medicine. The available risk management guidelines, however, provide little guidance on specific risk analysis approaches and procedures applicable in clinical cell therapy manufacturing. This raises a number of problems. Cell manufacturing is a poorly automated process, prone to operator-introduced variations, and affected by heterogeneity of the processed organs/tissues and lot-dependent variability of reagent (e.g., collagenase) efficiency. In this study, the principal challenges faced in a cell-based product manufacturing context (i.e., high dependence on human intervention and absence of reference standards for acceptable risk levels) are identified and addressed, and a risk management model approach applicable to manufacturing of cells for clinical use is described for the first time. The use of the heuristic and pseudo-quantitative failure mode and effect analysis/failure mode and critical effect analysis risk analysis technique associated with direct estimation of severity, occurrence, and detection is, in this specific context, as effective as, but more efficient than, the analytic hierarchy process. Moreover, a severity/occurrence matrix and Pareto analysis can be successfully adopted to identify priority failure modes on which to act to mitigate risks. The application of this approach to clinical cell therapy manufacturing in regenerative medicine is also discussed.
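The FMEA/FMECA scoring described above, a risk priority number (RPN) from direct estimates of severity, occurrence, and detection, followed by Pareto ranking, is easy to sketch. The failure modes and scores below are invented for illustration, not the study's actual analysis.

```python
# FMEA sketch: risk priority number RPN = severity x occurrence x detection,
# then Pareto-style ranking to pick priority failure modes. Scores (1-10)
# and failure modes are invented for the example.

failure_modes = {
    "operator pipetting variation": (7, 6, 5),
    "collagenase lot variability":  (8, 5, 4),
    "incubator temperature drift":  (9, 2, 3),
}

rpn = {mode: s * o * d for mode, (s, o, d) in failure_modes.items()}
ranked = sorted(rpn.items(), key=lambda kv: kv[1], reverse=True)
for mode, score in ranked:
    print(mode, score)
```

Mitigation effort then goes to the top of the list first; a Pareto cut (e.g. the modes accounting for most of the total RPN) formalizes the "priority failure modes on which to act" mentioned above.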

  7. Stimulus-specific oscillations in a retinal model.

    Science.gov (United States)

    Kenyon, Garrett T; Travis, Bryan J; Theiler, James; George, John S; Stephens, Gregory J; Marshak, David W

    2004-09-01

    High-frequency oscillatory potentials (HFOPs) in the vertebrate retina are stimulus specific. The phases of HFOPs recorded at any given retinal location drift randomly over time, but regions activated by the same stimulus tend to remain phase locked with approximately zero lag, whereas regions activated by spatially separate stimuli are typically uncorrelated. Based on retinal anatomy, we previously postulated that HFOPs are mediated by feedback from a class of axon-bearing amacrine cells that receive excitation from neighboring ganglion cells (via gap junctions) and make inhibitory synapses back onto the surrounding ganglion cells. Using a computer model, we show here that such circuitry can account for the stimulus specificity of HFOPs in response to both high- and low-contrast features. Phase locking between pairs of model ganglion cells did not depend critically on their separation distance, but on whether the applied stimulus created a continuous path between them. The degree of phase locking between spatially separate stimuli was reduced by lateral inhibition, which created a buffer zone around strongly activated regions. Stimulating the inhibited region between spatially separate stimuli increased their degree of phase locking proportionately. Our results suggest several experimental strategies for testing the hypothesis that stimulus-specific HFOPs arise from axon-mediated feedback in the inner retina.
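The phase-locking behavior at the heart of the model can be illustrated with two Kuramoto-style oscillators: coupled units lock to a small phase difference, uncoupled units drift apart. This is a generic oscillator analogue of the gap-junction coupling, not the paper's retinal circuit, and the frequencies and coupling strength are illustrative.

```python
import math

# Two Kuramoto-style oscillators: a minimal analogue of coupling-induced
# phase locking. Natural frequencies and coupling are illustrative.

def final_phase_gap(coupling, steps=2000, dt=0.01):
    th1, th2 = 0.0, 1.0
    w1, w2 = 10.0, 10.5          # slightly detuned natural frequencies
    for _ in range(steps):
        th1 += dt * (w1 + coupling * math.sin(th2 - th1))
        th2 += dt * (w2 + coupling * math.sin(th1 - th2))
    # Phase difference wrapped to [-pi, pi].
    return abs(math.remainder(th2 - th1, 2 * math.pi))

# With coupling the pair locks near zero lag; without it the gap drifts.
print(final_phase_gap(coupling=5.0) < final_phase_gap(coupling=0.0))
```

In the retinal model the "coupling" is created by the stimulus itself: only ganglion cells connected through a continuously activated region share the amacrine-cell feedback, which is why phase locking tracks stimulus continuity rather than distance.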

  8. MODEL STUDIES OF MODE-SPECIFICITY IN UNIMOLECULAR REACTION DYNAMICS

    Energy Technology Data Exchange (ETDEWEB)

    Waite, Boyd A.; Miller, William H.

    1980-06-01

    Essentially exact quantum mechanical calculations are carried out to determine the energies and lifetimes of the quasi-bound states for a system of two (non-linearly) coupled oscillators (one of which is harmonic, the other being able to dissociate). For weak coupling the system displays mode-specificity, i.e., the unimolecular rate constants are not a monotonic function of the total energy, but increased coupling and frequency degeneracy tend to destroy mode-specificity. A somewhat surprising result is that for a given coupling the degree of mode-specificity is roughly independent of the energy, in marked contrast to the fact that there is an energetic threshold for the onset of "stochastic trajectories" of the corresponding classical system; i.e., there seems to be no relation between statistical/mode-specific behavior of the unimolecular rate constants and stochastic/regular classical trajectories. In order to be able to treat more physically relevant models, i.e., those with more than two degrees of freedom, a semiclassical model is constructed and seen to be able to reproduce the accurate quantum mechanical rates reasonably well.

  9. A bottleneck model of set-specific capture.

    Science.gov (United States)

    Moore, Katherine Sledge; Weissman, Daniel H

    2014-01-01

    Set-specific contingent attentional capture is a particularly strong form of capture that occurs when multiple attentional sets guide visual search (e.g., "search for green letters" and "search for orange letters"). In this type of capture, a potential target that matches one attentional set (e.g. a green stimulus) impairs the ability to identify a temporally proximal target that matches another attentional set (e.g. an orange stimulus). In the present study, we investigated whether set-specific capture stems from a bottleneck in working memory or from a depletion of limited resources that are distributed across multiple attentional sets. In each trial, participants searched a rapid serial visual presentation (RSVP) stream for up to three target letters (T1-T3) that could appear in any of three target colors (orange, green, or lavender). The most revealing findings came from trials in which T1 and T2 matched different attentional sets and were both identified. In these trials, T3 accuracy was lower when it did not match T1's set than when it did match, but only when participants failed to identify T2. These findings support a bottleneck model of set-specific capture in which a limited-capacity mechanism in working memory enhances only one attentional set at a time, rather than a resource model in which processing capacity is simultaneously distributed across multiple attentional sets.

  10. Mathematical Formulation Requirements and Specifications for the Process Models

    Energy Technology Data Exchange (ETDEWEB)

    Steefel, C.; Moulton, D.; Pau, G.; Lipnikov, K.; Meza, J.; Lichtner, P.; Wolery, T.; Bacon, D.; Spycher, N.; Bell, J.; Moridis, G.; Yabusaki, S.; Sonnenthal, E.; Zyvoloski, G.; Andre, B.; Zheng, L.; Davis, J.

    2010-11-01

    The Advanced Simulation Capability for Environmental Management (ASCEM) is intended to be a state-of-the-art scientific tool and approach for understanding and predicting contaminant fate and transport in natural and engineered systems. The ASCEM program is aimed at addressing critical EM program needs to better understand and quantify flow and contaminant transport behavior in complex geological systems. It will also address the long-term performance of engineered components including cementitious materials in nuclear waste disposal facilities, in order to reduce uncertainties and risks associated with DOE EM's environmental cleanup and closure activities. Building upon national capabilities developed from decades of Research and Development in subsurface geosciences, computational and computer science, modeling and applied mathematics, and environmental remediation, the ASCEM initiative will develop an integrated, open-source, high-performance computer modeling system for multiphase, multicomponent, multiscale subsurface flow and contaminant transport. This integrated modeling system will incorporate capabilities for predicting releases from various waste forms, identifying exposure pathways and performing dose calculations, and conducting systematic uncertainty quantification. The ASCEM approach will be demonstrated on selected sites, and then applied to support the next generation of performance assessments of nuclear waste disposal and facility decommissioning across the EM complex. The Multi-Process High Performance Computing (HPC) Simulator is one of three thrust areas in ASCEM. The other two are the Platform and Integrated Toolsets (dubbed the Platform) and Site Applications. The primary objective of the HPC Simulator is to provide a flexible and extensible computational engine to simulate the coupled processes and flow scenarios described by the conceptual models developed using the ASCEM Platform. The graded and iterative approach to assessments

  11. PVUSA model technical specification for a turnkey photovoltaic power system

    Energy Technology Data Exchange (ETDEWEB)

    Dows, R.N.; Gough, E.J.

    1995-11-01

    One of the five objectives of PVUSA is to offer U.S. utilities hands-on experience in designing, procuring, and operating PV systems. The procurement process included the development of a detailed set of technical requirements for a PV system. PVUSA embodied its requirements in a technical specification used as an attachment to its contracts for four utility-scale PV systems in the 200 kW to 500 kW range. The technical specification has also been adapted and used by several utilities. The PVUSA Technical Specification has now been updated and is presented here as a Model Technical Specification (MTS) for utility use. The MTS text is also furnished on a computer disk in Microsoft Word 6.0 so that it may be conveniently adapted by each user. The text includes guidance in the form of comments and by the use of parentheses to indicate where technical information must be developed and inserted. Commercial terms and conditions will reflect the procurement practice of the buyer. The reader is referred to PG&E Report Number 95-3090000. 1, PVUSA Procurement, Acceptance and Rating Practices for Photovoltaic Power Plants (1995) for PVUSA experience and practice. The MTS is regarded by PVUSA as a use-proven document, but needs to be adapted with care and attention to detail.

  12. Systematic approach to verification and validation: High explosive burn models

    Energy Technology Data Exchange (ETDEWEB)

    Menikoff, Ralph [Los Alamos National Laboratory; Scovel, Christina A. [Los Alamos National Laboratory

    2012-04-16

    Most material models used in numerical simulations are based on heuristics and empirically calibrated to experimental data. For a specific model, key questions are determining its domain of applicability and assessing its relative merits compared to other models. Answering these questions should be a part of model verification and validation (V and V). Here, we focus on V and V of high explosive models. Typically, model developers implement their model in their own hydro code and use different sets of experiments to calibrate model parameters. Rarely can one find in the literature simulation results for different models of the same experiment. Consequently, it is difficult to assess objectively the relative merits of different models. This situation results in part from the fact that experimental data are scattered through the literature (articles in journals and conference proceedings) and that the printed literature does not allow the reader to obtain, in electronic form, the data behind a figure, which is needed to make detailed comparisons among experiments and simulations. In addition, it is very time consuming to set up and run simulations to compare different models over sufficiently many experiments to cover the range of phenomena of interest. The first difficulty could be overcome if the research community were to support an online web-based database. The second difficulty can be greatly reduced by automating procedures to set up and run simulations of similar types of experiments. Moreover, automated testing would be greatly facilitated if the data files obtained from a database were in a standard format that contained key experimental parameters as meta-data in a header to the data file. To illustrate our approach to V and V, we have developed a high explosive database (HED) at LANL. It now contains a large number of shock initiation experiments. Utilizing the header information in a data file from HED, we have written scripts to generate an input file for a hydro code.

  13. An integrated approach to permeability modeling using micro-models

    Energy Technology Data Exchange (ETDEWEB)

    Hosseini, A.H.; Leuangthong, O.; Deutsch, C.V. [Society of Petroleum Engineers, Canadian Section, Calgary, AB (Canada)]|[Alberta Univ., Edmonton, AB (Canada)

    2008-10-15

    An important factor in predicting the performance of steam assisted gravity drainage (SAGD) well pairs is the spatial distribution of permeability. Complications that make the inference of a reliable porosity-permeability relationship impossible include the presence of short-scale variability in sand/shale sequences; preferential sampling of core data; and uncertainty in upscaling parameters. Micro-modelling is a simple and effective method for overcoming these complications. This paper proposed a micro-modelling approach to account for sampling bias, small laminated features with high permeability contrast, and uncertainty in upscaling parameters. The paper described the steps and challenges of micro-modelling and discussed the construction of binary mixture geo-blocks; flow simulation and upscaling; extended power law formalism (EPLF); and the application of micro-modelling and EPLF. An extended power-law formalism to account for changes in clean sand permeability as a function of macroscopic shale content was also proposed and tested against flow simulation results. There was close agreement between the model and simulation results. The proposed methodology was also applied to build the porosity-permeability relationship for laminated and brecciated facies of McMurray oil sands. The model was in good agreement with the experimental data. 8 refs., 17 figs.

  14. Construction of cell type-specific logic models of signaling networks using CellNOpt.

    Science.gov (United States)

    Morris, Melody K; Melas, Ioannis; Saez-Rodriguez, Julio

    2013-01-01

    Mathematical models are useful tools for understanding protein signaling networks because they provide an integrated view of pharmacological and toxicological processes at the molecular level. Here we describe a previously introduced approach, based on logic modeling, to generate cell-specific, mechanistic and predictive models of signal transduction. Models are derived from a network encoding prior knowledge that is trained to signaling data, and can be either binary (based on Boolean logic) or quantitative (using a recently developed formalism, constrained fuzzy logic). The approach is implemented in the freely available tool CellNetOptimizer (CellNOpt). We explain the process CellNOpt uses to train a prior knowledge network to data and illustrate its application with a toy example as well as a realistic case describing signaling networks in the HepG2 liver cancer cell line.
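The binary (Boolean) flavor of such logic models is easy to sketch outside of CellNOpt; the toy network below (node names and rules are invented for illustration, and this is not the CellNOpt API) updates all nodes synchronously until a fixed point is reached:

```python
# Toy Boolean logic model of a small signaling network (illustrative only;
# the node names and rules are invented and this is not the CellNOpt API).
# Each node's next state is a Boolean function of the current state vector.
rules = {
    "EGF":       lambda s: s["EGF"],        # ligand input holds its value
    "Inhibitor": lambda s: s["Inhibitor"],  # drug input holds its value
    "EGFR":      lambda s: s["EGF"],        # receptor active if ligand present
    "ERK":       lambda s: s["EGFR"] and not s["Inhibitor"],
}

def simulate(state, max_steps=10):
    """Synchronous updates until a fixed point (or the step limit) is reached."""
    for _ in range(max_steps):
        new = {node: bool(f(state)) for node, f in rules.items()}
        if new == state:
            break
        state = new
    return state

steady = simulate({"EGF": True, "Inhibitor": False, "EGFR": False, "ERK": False})
# With ligand present and no inhibitor, ERK ends up active at the fixed point.
```

Training in CellNOpt amounts to searching over which such rules (edges of the prior knowledge network) to keep so that the fixed points best reproduce the measured signaling data.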

  15. Development of a computationally efficient urban modeling approach

    DEFF Research Database (Denmark)

    Wolfs, Vincent; Murla, Damian; Ntegeka, Victor

    2016-01-01

    This paper presents a parsimonious and data-driven modelling approach to simulate urban floods. Flood levels simulated by detailed 1D-2D hydrodynamic models can be emulated using the presented conceptual modelling approach with a very short calculation time. In addition, the model detail can be a...

  16. Kinetics and specificity of nickel hypersensitivity in the murine model.

    Science.gov (United States)

    Siller, G M; Seymour, G J

    1994-01-01

    Nickel contact dermatitis appears to be almost exclusively a disease of females despite the increasing exposure of males to nickel. Successful murine models of nickel allergic contact dermatitis have been described. The purpose of this study was to investigate the kinetics and specificity of the response in this model and to examine whether any differences exist between males and females. Mice were sensitised epicutaneously with nickel sulphate in aqueous solutions of varying concentration, volume and duration of application. Following intradermal challenge, dose-dependent response kinetics which approximated linearity were demonstrated up to the point of toxicity. Sensitised mice were challenged with cobaltous chloride, chromic chloride and cupric sulphate and demonstrated no evidence of cross-sensitivity to cobalt or chromium. Copper produced an irritant response, making interpretation difficult. Earlier and stronger responses were observed in female mice; however, these differences fell short of statistical significance. The results of the present study therefore establish a reliable model for nickel hypersensitivity that demonstrates both specificity and dose-dependent kinetics without significant sex differences.

  17. Mark-specific hazard ratio model with missing multivariate marks.

    Science.gov (United States)

    Juraska, Michal; Gilbert, Peter B

    2016-10-01

    An objective of randomized placebo-controlled preventive HIV vaccine efficacy (VE) trials is to assess the relationship between vaccine effects to prevent HIV acquisition and continuous genetic distances of the exposing HIVs to multiple HIV strains represented in the vaccine. The set of genetic distances, only observed in failures, is collectively termed the 'mark.' The objective has motivated a recent study of a multivariate mark-specific hazard ratio model in the competing risks failure time analysis framework. Marks of interest, however, are commonly subject to substantial missingness, largely due to rapid post-acquisition viral evolution. In this article, we investigate the mark-specific hazard ratio model with missing multivariate marks and develop two inferential procedures based on (i) inverse probability weighting (IPW) of the complete cases, and (ii) augmentation of the IPW estimating functions by leveraging auxiliary data predictive of the mark. Asymptotic properties and finite-sample performance of the inferential procedures are presented. This research also provides general inferential methods for semiparametric density ratio/biased sampling models with missing data. We apply the developed procedures to data from the HVTN 502 'Step' HIV VE trial.
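The complete-case IPW idea in (i) can be sketched on toy data: each failure with an observed mark is up-weighted by the inverse of its estimated probability of having a complete mark, here estimated within strata of a single covariate. All numbers below are made up for illustration and bear no relation to the Step trial data.

```python
# Inverse probability weighting (IPW) of complete cases, on made-up data.
# Failures with a missing mark (None) are dropped, and each complete case is
# up-weighted by 1 / P(mark observed | covariate), estimated per stratum.
from collections import defaultdict

data = [  # (mark value, or None if missing; binary covariate x)
    (2.0, 0), (4.0, 0), (None, 0),
    (6.0, 1), (None, 1), (None, 1),
]

# Estimate P(complete | x) within each covariate stratum.
n_tot, n_obs = defaultdict(int), defaultdict(int)
for mark, x in data:
    n_tot[x] += 1
    if mark is not None:
        n_obs[x] += 1
p_complete = {x: n_obs[x] / n_tot[x] for x in n_tot}

# IPW estimate of the mean mark, versus the naive complete-case mean.
ipw_mean = sum(mark / p_complete[x] for mark, x in data
               if mark is not None) / len(data)
naive_mean = sum(mark for mark, _ in data if mark is not None) / 3
# ipw_mean up-weights the x = 1 stratum, where the mark is mostly missing.
```

The augmented estimator in (ii) adds a correction term built from auxiliary data predictive of the mark, which improves efficiency and adds robustness to misspecification of the missingness model.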

  18. Social fear conditioning: a novel and specific animal model to study social anxiety disorder.

    Science.gov (United States)

    Toth, Iulia; Neumann, Inga D; Slattery, David A

    2012-05-01

    Social anxiety disorder (SAD) is a major health concern with high lifetime prevalence. The current medication is rather unspecific and, despite considerable efforts, its efficacy is still unsatisfactory. However, there are no appropriate and specific animal models available to study the underlying etiology of the disorder. Therefore, we aimed to establish a model of specific social fear in mice and use this social fear conditioning (SFC) model to assess the therapeutic efficacy of the benzodiazepine diazepam and of the antidepressant paroxetine, treatments currently used for SAD patients. We show that by administering electric foot shocks (2-5, 1 s, 0.7 mA) during the investigation of a conspecific, the investigation of unfamiliar conspecifics was reduced both short- and long-term, indicating lasting social fear. The induced fear was specific to social stimuli and did not lead to other behavioral alterations, such as fear of novelty, general anxiety, depression, or impaired locomotion. We show that social fear was dose-dependently reversed by acute diazepam, at doses that were not anxiolytic in a non-social context, such as the elevated plus maze. Finally, we show that chronic paroxetine treatment reversed social fear. All in all, we demonstrated robust social fear after exposure to SFC in mice, which was reversed by both acute benzodiazepine and chronic antidepressant treatment. We propose the SFC model as an appropriate animal model for identifying the underlying etiology of SAD and possible novel treatment approaches.

  19. Much Pain, Little Gain? Paradigm-Specific Models and Methods in Experimental Psychology.

    Science.gov (United States)

    Meiser, Thorsten

    2011-03-01

    Paradigm-oriented research strategies in experimental psychology have strengths and limitations. On the one hand, experimental paradigms play a crucial epistemic and heuristic role in basic psychological research. On the other hand, empirical research is often limited to the observed effects in a certain paradigm, and theoretical models are frequently tied to the particular features of the given paradigm. A paradigm-driven research strategy therefore jeopardizes the pursuit of research questions and theoretical models that go beyond a specific paradigm. As one example of a more integrative approach, recent research on illusory and spurious correlations has attempted to overcome the limitations of paradigm-specific models in the context of biased contingency perception and social stereotyping. Last but not least, the use of statistical models for the analysis of elementary cognitive functions is a means toward a more integrative terminology and theoretical perspective across different experimental paradigms and research domains. © The Author(s) 2011.

  20. Diagnostic classification of specific phobia subtypes using structural MRI data: a machine-learning approach.

    Science.gov (United States)

    Lueken, Ulrike; Hilbert, Kevin; Wittchen, Hans-Ulrich; Reif, Andreas; Hahn, Tim

    2015-01-01

    While neuroimaging research has advanced our knowledge about fear circuitry dysfunctions in anxiety disorders, findings based on diagnostic groups do not translate into diagnostic value for the individual patient. Machine learning generates predictive information that can be used for single-subject classification. We applied Gaussian process classifiers to a sample of patients with specific phobia, as a model disorder for pathological forms of anxiety, to test for classification based on structural MRI data. Gray matter (GM) and white matter (WM) volumetric data were analyzed in 33 snake phobics (SP; animal subtype), 26 dental phobics (DP; blood-injection-injury subtype) and 37 healthy controls (HC). Results showed good accuracy rates for GM and WM data in predicting phobia subtypes (GM: 62 % phobics vs. HC, 86 % DP vs. HC, 89 % SP vs. HC, 89 % DP vs. SP; WM: 88 % phobics vs. HC, 89 % DP vs. HC, 79 % SP vs. HC, 79 % DP vs. HC). Regarding GM, classification improved when considering the subtype compared to overall phobia status. The discriminatory brain pattern was not solely based on fear circuitry structures but included widespread cortico-subcortical networks. Results demonstrate that multivariate pattern recognition represents a promising approach for the development of neuroimaging-based diagnostic markers that could support clinical decisions. Given the increasing number of fMRI studies on anxiety disorders, researchers are encouraged to use functional and structural data not only for studying phenotype characteristics at the group level, but also to evaluate their incremental value for diagnostic or prognostic purposes.

  1. On specification of initial conditions in turbulence models

    Energy Technology Data Exchange (ETDEWEB)

    Rollin, Bertrand [Los Alamos National Laboratory; Andrews, Malcolm J [Los Alamos National Laboratory

    2010-12-01

    Recent research has shown that initial conditions have a significant influence on the evolution of a flow towards turbulence. This important finding offers a unique opportunity for turbulence control, but also raises the question of how to properly specify initial conditions in turbulence models. We study this problem in the context of the Rayleigh-Taylor instability. The Rayleigh-Taylor instability is an interfacial fluid instability that leads to turbulence and turbulent mixing. It occurs when a light fluid is accelerated into a heavy fluid because of misalignment between density and pressure gradients. The Rayleigh-Taylor instability plays a key role in a wide variety of natural and man-made flows ranging from supernovae to the implosion phase of Inertial Confinement Fusion (ICF). Our approach consists of providing the turbulence models with a predicted profile of their key variables at the appropriate time, in accordance with the initial conditions of the problem.

  2. Modeling drug- and chemical- induced hepatotoxicity with systems biology approaches

    Directory of Open Access Journals (Sweden)

    Sudin Bhattacharya

    2012-12-01

    We provide an overview of computational systems biology approaches as applied to the study of chemical- and drug-induced toxicity. The concept of ‘toxicity pathways’ is described in the context of the 2007 US National Academies of Science report, Toxicity Testing in the 21st Century: A Vision and A Strategy. Pathway mapping and modeling based on network biology concepts are a key component of the vision laid out in this report for a more biologically-based analysis of dose-response behavior and the safety of chemicals and drugs. We focus on toxicity of the liver (hepatotoxicity) – a complex phenotypic response with contributions from a number of different cell types and biological processes. We describe three case studies of complementary multi-scale computational modeling approaches to understand perturbation of toxicity pathways in the human liver as a result of exposure to environmental contaminants and specific drugs. One approach involves development of a spatial, multicellular virtual tissue model of the liver lobule that combines molecular circuits in individual hepatocytes with cell-cell interactions and blood-mediated transport of toxicants through hepatic sinusoids, to enable quantitative, mechanistic prediction of hepatic dose-response for activation of the AhR toxicity pathway. Simultaneously, methods are being developed to extract quantitative maps of intracellular signaling and transcriptional regulatory networks perturbed by environmental contaminants, using a combination of gene expression and genome-wide protein-DNA interaction data. A predictive physiological model (DILIsym™) to understand drug-induced liver injury (DILI), the most common adverse event leading to termination of clinical development programs and regulatory actions on drugs, is also described. The model initially focuses on reactive metabolite-induced DILI in response to administration of acetaminophen, and spans multiple biological scales.

  3. Identification and validation of specific markers of Bacillus anthracis spores by proteomics and genomics approaches.

    Science.gov (United States)

    Chenau, Jérôme; Fenaille, François; Caro, Valérie; Haustant, Michel; Diancourt, Laure; Klee, Silke R; Junot, Christophe; Ezan, Eric; Goossens, Pierre L; Becher, François

    2014-03-01

    Bacillus anthracis is the causative bacterium of anthrax, an acute and often fatal disease in humans. The infectious agent, the spore, represents a real bioterrorism threat and its specific identification is crucial. However, because of the high genomic relatedness within the Bacillus cereus group, it is still a real challenge to identify B. anthracis spores confidently. Mass spectrometry-based tools represent a powerful approach to the efficient discovery and identification of such protein markers. Here we undertook comparative proteomics analyses of B. anthracis, B. cereus and B. thuringiensis spores to identify proteoforms unique to B. anthracis. The marker discovery pipeline developed combined peptide- and protein-centric approaches, with liquid chromatography coupled to tandem mass spectrometry experiments performed on a high-resolution/high-mass-accuracy LTQ-Orbitrap instrument. By combining these data with those from complementary bioinformatics approaches, we were able to highlight a dozen novel proteins consistently observed across all the investigated B. anthracis spores while being absent in B. cereus/thuringiensis spores. To further demonstrate the relevance of these markers and their strict specificity to B. anthracis, the number of strains studied was extended to 55, by including closely related strains such as B. thuringiensis 9727, and above all the B. cereus biovar anthracis CI and CA strains that possess pXO1- and pXO2-like plasmids. Under these conditions, the combination of proteomics and genomics approaches confirms the pertinence of 11 markers. Genes encoding these 11 markers are located on the chromosome, which provides additional targets complementary to the commonly used plasmid-encoded markers. Last but not least, we also report the development of a targeted liquid chromatography-tandem mass spectrometry method, using the selected reaction monitoring mode, for the monitoring of the 4 most suitable protein markers. Within a proof

  4. Lightning Modelling: From 3D to Circuit Approach

    Science.gov (United States)

    Moussa, H.; Abdi, M.; Issac, F.; Prost, D.

    2012-05-01

    The topic of this study is the electromagnetic environment and electromagnetic interference (EMI) effects, specifically the modelling of lightning indirect effects [1] on aircraft electrical systems present on remote and highly exposed equipment, such as the nose landing gear (NLG) and nacelle, through a circuit approach. The main goal of the presented work, funded by a French national project, PREFACE, is to propose a simple equivalent electrical circuit to represent a geometrical structure, taking into account the mutual inductances, self-inductances and resistances, which play a fundamental role in the lightning current distribution. This model is then intended to be coupled to a functional one, describing a power train chain composed of a converter, a shielded power harness and a motor or a set of resistors used as a load for the converter. The novelty here is to provide a pre-sizing qualitative approach that allows integration trade-offs to be explored in pre-design phases. This tool is intended to offer a user-friendly way of replying rapidly to calls for tender, taking the lightning constraints into account. Two cases are analysed: first, an NLG that is composed of tubular pieces that can be easily approximated by equivalent cylindrical straight conductors, so that the passive R, L, M elements of the structure can be extracted through analytical engineering formulas such as those implemented in the partial element equivalent circuit (PEEC) [2] technique. Second, the same approach is intended to be applied to an electrical de-icing nacelle sub-system.
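As a flavor of the analytical engineer formulas used for such R, L, M extraction, the classical low-frequency self-inductance of a straight round conductor (Rosa's formula for a non-magnetic wire) can be evaluated directly. The dimensions below are arbitrary examples, not values from the study:

```python
# Low-frequency self-inductance of a straight round non-magnetic conductor,
# using Rosa's classical formula L = (mu0*l / 2*pi) * (ln(2l/r) - 3/4).
# Illustrative only: dimensions are arbitrary, not taken from the paper.
import math

MU0 = 4 * math.pi * 1e-7  # vacuum permeability, H/m

def wire_self_inductance(length_m, radius_m):
    """Self-inductance in henries of a straight cylindrical wire."""
    return (MU0 * length_m / (2 * math.pi)) * (
        math.log(2 * length_m / radius_m) - 0.75)

# Example: a 1 m tube of 1 cm radius, roughly landing-gear-strut scale.
L = wire_self_inductance(1.0, 0.01)   # on the order of a microhenry
```

In a PEEC-style model, each tubular piece contributes such a partial self-inductance plus mutual terms with its neighbours, and the resulting R-L network determines how the lightning current divides among parallel paths.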

  5. Engine structures analysis software: Component Specific Modeling (COSMO)

    Science.gov (United States)

    McKnight, R. L.; Maffeo, R. J.; Schwartz, S.

    1994-08-01

    A component specific modeling software program has been developed for propulsion systems. This expert program is capable of formulating the component geometry as finite element meshes for structural analysis which, in the future, can be spun off as NURB geometry for manufacturing. COSMO currently has geometry recipes for combustors, turbine blades, vanes, and disks. Component geometry recipes for nozzles, inlets, frames, shafts, and ducts are being added. COSMO uses component recipes that work through neutral files with the Technology Benefit Estimator (T/BEST) program which provides the necessary base parameters and loadings. This report contains the users manual for combustors, turbine blades, vanes, and disks.

  6. Integrating Cloud-Computing-Specific Model into Aircraft Design

    Science.gov (United States)

    Zhimin, Tian; Qi, Lin; Guangwen, Yang

    Cloud computing is becoming increasingly relevant, as it will enable companies involved in spreading this technology to open the door to Web 3.0. The new categories of services it introduces will slowly replace many types of computational resources currently used. In this perspective, grid computing, the basic element for the large-scale supply of cloud services, will play a fundamental role in defining how those services will be provided. The paper integrates a cloud-computing-specific model into aircraft design. This work has achieved good results in sharing licenses of large-scale and expensive software, such as CFD (Computational Fluid Dynamics), UG, CATIA, and so on.

  7. Software Requirements Specification of the UIFA's UUIS -- a Team 4 COMP5541-W10 Project Approach

    CERN Document Server

    Alhazmi, Ali; Liu, Bing; Oliveira, Deyvisson; Sobh, Kanj; Mayantz, Max; de Bled, Robin; Zhang, Yu Ming

    2010-01-01

    This document presents the business requirements of the Unified University Inventory System (UUIS) in a technology-independent manner. Wherever possible, business terminology and business language are used to describe the requirements; only minimal and commonly understood technical terminology appears. A use-case approach is used to model the business requirements in this document.

  8. Quantitative versus qualitative modeling: a complementary approach in ecosystem study.

    Science.gov (United States)

    Bondavalli, C; Favilla, S; Bodini, A

    2009-02-01

    Natural disturbance or human perturbation acts upon ecosystems by changing some dynamical parameters of one or more species. Foreseeing these modifications is necessary before embarking on an intervention: predictions may help to assess management options and define hypotheses for interventions. Models become valuable tools for studying and making predictions only when they capture the types of interactions and their magnitudes. Quantitative models are more precise and specific about a system, but require a large effort in model construction. Because of this, ecological systems very often remain only partially specified, and one possible approach to their description and analysis comes from qualitative modelling. Qualitative models yield predictions as directions of change in species abundance, but in complex systems these predictions are often ambiguous, being the result of opposite actions exerted on the same species by way of multiple pathways of interactions. Again, to avoid such ambiguities one needs to know the intensity of all links in the system. One way to make link magnitude explicit in a form that can be used in qualitative analysis is described in this paper, and it takes advantage of another type of ecosystem representation: ecological flow networks. These flow diagrams contain the structure, the relative position and the connections between the components of a system, and the quantity of matter flowing along every connection. In this paper it is shown how these ecological flow networks can be used to produce a quantitative model similar to the qualitative counterpart. Analyzed through the apparatus of loop analysis, this quantitative model yields predictions that are by no means ambiguous, solving in an elegant way the basic problem of qualitative analysis. The approach adopted in this work is still preliminary and we must be careful in its application.
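The loop-analysis prediction step can be illustrated numerically: for a community matrix A of interaction strengths, the sign of entry (j, i) of −A⁻¹ predicts the direction of change of species j under a sustained (press) input to species i. A minimal sketch with a made-up 2×2 prey-predator matrix:

```python
# Loop-analysis press-perturbation predictions from a community matrix A:
# the response of species j to a sustained input to species i has the sign
# of entry (j, i) of -A^{-1}.  Made-up 2x2 prey-predator system:
#   prey self-limits (-1.0), prey is harmed by the predator (-0.5),
#   predator benefits from prey (+0.5), predator has no self-effect (0).
A = [[-1.0, -0.5],   # row: effects on prey
     [ 0.5,  0.0]]   # row: effects on predator

det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
# Closed-form inverse of a 2x2 matrix, negated to give the response matrix.
neg_inv = [[-A[1][1] / det,  A[0][1] / det],
           [ A[1][0] / det, -A[0][0] / det]]

signs = [[(v > 0) - (v < 0) for v in row] for row in neg_inv]
# Column 0 (press on prey): prey unchanged, predator increases --
# the classic result that prey enrichment flows to the predator.
```

With link magnitudes taken from a flow network, as the paper proposes, the entries of −A⁻¹ become definite numbers and the sign ambiguities of purely qualitative loop analysis disappear.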

  9. Virtuous organization: A structural equation modeling approach

    Directory of Open Access Journals (Sweden)

    Majid Zamahani

    2013-02-01

    For years, the idea of virtue was viewed unfavorably among researchers: virtues were traditionally considered culture-specific and relativistic, and they were supposed to be associated with social conservatism, religious or moral dogmatism, and scientific irrelevance. Virtue and virtuousness have only recently been taken seriously by organizational researchers. The proposed study examines the relationships between leadership, organizational culture, human resources, structure and processes, care for community, and the virtuous organization. Structural equation modeling is employed to investigate the effects of each variable on the other components. The data used in this study consist of questionnaire responses from employees of Payam e Noor University in Yazd province. A total of 250 questionnaires were sent out and 211 valid responses were received. Our results reveal that all five variables have positive and significant impacts on the virtuous organization. Among the five variables, organizational culture has the largest direct impact (0.80) and human resources have the largest total impact (0.844) on the virtuous organization.

  10. A novel proteomic approach for specific identification of tyrosine kinase substrates using [13C]tyrosine.

    Science.gov (United States)

    Ibarrola, Nieves; Molina, Henrik; Iwahori, Akiko; Pandey, Akhilesh

    2004-04-16

    Proteomic studies to find substrates of tyrosine kinases generally rely on identification of protein bands that are "pulled down" by antiphosphotyrosine antibodies from ligand-stimulated samples. One can obtain erroneous results from such experiments for two major reasons. First, some proteins might be basally phosphorylated on tyrosine residues in the absence of ligand stimulation. Second, proteins can bind non-specifically to the antibodies or the affinity matrix. Induction of phosphorylation of proteins by ligand must therefore be confirmed by a different approach, which is not always feasible. We have developed a novel proteomic approach to identify substrates of tyrosine kinases in signaling pathway studies, based on in vivo labeling of proteins with "light" (12C-labeled) or "heavy" (13C-labeled) tyrosine. This stable isotope labeling in cell culture method enables the unequivocal identification of tyrosine kinase substrates, as peptides derived from true substrates give rise to a unique signature in a mass spectrometry experiment. By using this approach, in a single experiment we successfully identified several known substrates of the insulin signaling pathway and a novel substrate, polymerase I and transcript release factor, a protein implicated in the control of RNA metabolism and the regulation of type I collagen promoters. This approach is amenable to high-throughput global studies as it simplifies the specific identification of substrates of tyrosine kinases as well as serine/threonine kinases using mass spectrometry.
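The "unique signature" arises from a fixed mass offset: tyrosine contains nine carbon atoms, so a uniformly ¹³C-labeled tyrosine shifts a peptide's mass by about 9 × 1.0034 Da per labeled residue. A quick back-of-envelope check (standard isotope masses; the helper function is ours, for illustration):

```python
# Mass shift per uniformly 13C-labeled tyrosine: each 12C -> 13C substitution
# adds the 13C-12C mass difference.  A peptide with n labeled tyrosines is
# offset from its light form by n * shift -- the signature seen in MS spectra.
# (Illustrative calculation; the helper name is ours, not from the paper.)
C13_MINUS_C12 = 13.003355 - 12.0   # Da; the 12C mass is exactly 12 by definition
CARBONS_IN_TYROSINE = 9            # tyrosine is C9H11NO3

shift_per_tyr = CARBONS_IN_TYROSINE * C13_MINUS_C12   # ~9.03 Da

def heavy_light_delta(n_tyrosines):
    """Expected light/heavy mass spacing for a peptide with n tyrosines."""
    return n_tyrosines * shift_per_tyr
```

Because the spacing is an integer multiple of ~9.03 Da, a doublet at exactly that offset marks a peptide that genuinely contains tyrosine from the labeled pool, separating true substrates from non-specific binders.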

  11. Research on the Development Approach for Reusable Model in Parallel Discrete Event Simulation

    Directory of Open Access Journals (Sweden)

    Jianbo Li

    2015-01-01

    Model reuse is an essential means of meeting the demand for model development in complex simulation. An effective approach to realizing model reusability is to establish a standard model specification, including an interface specification and a representation specification. By standardizing a model's external interfaces, the Reusable Component Model Framework (RCMF) achieves model reusability, acting as an interface specification. However, RCMF models are presently developed through manual programming alone. Besides implementing the model's business logic, the modeler must also ensure that the model strictly follows the reusable framework, which is very distracting. In addition, there is no model description information to guide model reuse or integration. To address these issues, we first explored an XML-based model description file, which complements RCMF as the model representation, and then proposed a RCMF model development tool, SuKit. The model description file describes a RCMF model and can be used for regenerating a model and guiding model integration. SuKit can generate a skeleton RCMF model together with a model-customized description file from the configured information. The modeler then just needs to concentrate on the model's processing logic. The case study indicates that SuKit is well suited to developing RCMF models and that the well-formed description file can be used for model reuse and integration.
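A model description file of the kind explored here might look like the sketch below; the element and attribute names are invented for illustration (the paper's actual schema is not given), and the file is parsed with Python's standard ElementTree:

```python
# Hypothetical RCMF-style model description file; the element and attribute
# names below are invented for illustration, not the schema used by SuKit.
import xml.etree.ElementTree as ET

DESCRIPTION = """\
<model name="RadarSensor" version="1.0">
  <interfaces>
    <input  name="targetPosition" type="Vector3"/>
    <output name="detection"      type="Boolean"/>
  </interfaces>
  <parameters>
    <parameter name="maxRange" value="50000" unit="m"/>
  </parameters>
</model>
"""

root = ET.fromstring(DESCRIPTION)
inputs = [e.get("name") for e in root.findall("./interfaces/input")]
outputs = [e.get("name") for e in root.findall("./interfaces/output")]
params = {p.get("name"): float(p.get("value"))
          for p in root.findall("./parameters/parameter")}
# A tool in the spirit of SuKit could walk such a tree to emit a skeleton
# model class and to check interface compatibility when integrating models.
```

Keeping the interface and parameter metadata in a machine-readable file is what lets a tool regenerate the skeleton code and verify that two models can be wired together before any simulation is run.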

  12. Connectivity of channelized reservoirs: a modelling approach

    Energy Technology Data Exchange (ETDEWEB)

    Larue, David K. [ChevronTexaco, Bakersfield, CA (United States); Hovadik, Joseph [ChevronTexaco, San Ramon, CA (United States)

    2006-07-01

    Connectivity represents one of the fundamental properties of a reservoir that directly affects recovery. If a portion of the reservoir is not connected to a well, it cannot be drained. Geobody or sandbody connectivity is defined as the percentage of the reservoir that is connected, and reservoir connectivity is defined as the percentage of the reservoir that is connected to wells. Previous studies have mostly considered mathematical, physical and engineering aspects of connectivity. In the current study, the stratigraphy of connectivity is characterized using simple, 3D geostatistical models. Based on these modelling studies, stratigraphic connectivity is good, usually greater than 90%, if the net:gross ratio, or sand fraction, is greater than about 30%. At net:gross values less than 30%, there is a rapid diminishment of connectivity as a function of net:gross. This behaviour between net:gross and connectivity defines a characteristic 'S-curve', in which the connectivity is high for net:gross values above 30%, then diminishes rapidly and approaches 0. Well configuration factors that can influence reservoir connectivity are well density, well orientation (vertical or horizontal; horizontal parallel to channels or perpendicular) and length of completion zones. Reservoir connectivity as a function of net:gross can be improved by several factors: presence of overbank sandy facies, deposition of channels in a channel belt, deposition of channels with high width/thickness ratios, and deposition of channels during variable floodplain aggradation rates. Connectivity can be reduced substantially in two-dimensional reservoirs, in map view or in cross-section, by volume support effects and by stratigraphic heterogeneities. It is well known that in two dimensions, the cascade zone for the 'S-curve' of net:gross plotted against connectivity occurs at about 60% net:gross. Generalizing this knowledge, any time that a reservoir can be regarded as
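The geobody-connectivity statistic itself is straightforward to compute on a gridded model: flood-fill the face-connected sand clusters and take the fraction of sand cells in the largest one. A minimal 3D sketch on a synthetic grid (grid size, seed and net:gross value are arbitrary, and independent random filling is a crude stand-in for the channelized geostatistical models discussed above):

```python
# Geobody connectivity on a synthetic 3D grid: fraction of sand cells (True)
# belonging to the largest face-connected cluster.  Independent random filling
# is a crude stand-in for a geostatistical model; parameters are arbitrary.
import random
from collections import deque

def connected_fraction(grid):
    nz, ny, nx = len(grid), len(grid[0]), len(grid[0][0])
    total = sum(grid[k][j][i] for k in range(nz) for j in range(ny) for i in range(nx))
    if total == 0:
        return 0.0
    seen, best = set(), 0
    for k in range(nz):
        for j in range(ny):
            for i in range(nx):
                if grid[k][j][i] and (k, j, i) not in seen:
                    # Breadth-first flood fill over the 6 face neighbours.
                    queue, size = deque([(k, j, i)]), 0
                    seen.add((k, j, i))
                    while queue:
                        z, y, x = queue.popleft()
                        size += 1
                        for dz, dy, dx in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                                           (0, -1, 0), (0, 0, 1), (0, 0, -1)):
                            n = (z + dz, y + dy, x + dx)
                            if (0 <= n[0] < nz and 0 <= n[1] < ny and 0 <= n[2] < nx
                                    and grid[n[0]][n[1]][n[2]] and n not in seen):
                                seen.add(n)
                                queue.append(n)
                    best = max(best, size)
    return best / total

random.seed(1)
net_gross = 0.7   # well above the ~30% cascade zone
grid = [[[random.random() < net_gross for _ in range(12)]
         for _ in range(12)] for _ in range(12)]
frac = connected_fraction(grid)   # close to 1 at this net:gross
```

Sweeping `net_gross` from low to high values on such grids reproduces the 'S-curve' behaviour: connectivity collapses below roughly 30% sand fraction in 3D and at a higher threshold in 2D slices.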

  13. A point of view on Otto cycle approach specific for an undergraduate thermodynamics course in CMU

    Science.gov (United States)

    Memet, F.; Preda, A.

    2015-11-01

    This paper describes a way of presenting to future marine engineers the analysis of the performance of an Otto cycle, in a manner that goes beyond the classic approach of the thermodynamics course in Constanta Maritime University. The conventional thermodynamics course deals with the performance analysis of the cycle of the internal combustion engine with isochoric combustion for the situation in which the working medium is treated as a perfect gas. That approach is viable only when relatively small temperature differences are considered, which is the situation in which the specific heats can be taken as constant. Practical experience has shown, however, that the assumption of small temperature differences is not realistic, so the specific heat must be evaluated as a variable. The presentation below applies when the adiabatic exponent is written as a linear function of temperature. In the section of this paper dedicated to methods and materials, the case in which the specific heat is taken as constant is not neglected; in addition, the algorithm for variable specific heat is given. For both cases, the way in which the work output is assessed is presented. The calculus is based on the cycle shown in the temperature–entropy diagram, in which the irreversible adiabatic compression and expansion are also indicated. The experience achieved after understanding this theory will allow future professionals to deal successfully with the design practice of internal combustion engines.
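
The contrast between the constant-γ textbook result and the variable-specific-heat treatment can be sketched numerically. Here γ(T) is a hypothetical linear model (the coefficients are illustrative, not taken from the paper); the adiabats are integrated from dT/dv = -(γ(T) - 1)·T/v and the isochoric heats from q = ∫cᵥ(T) dT with cᵥ = R/(γ - 1), so the efficiency is η = 1 - |q_out|/q_in. This is a minimal sketch of the idea, not the course's algorithm.

```python
import math

R = 8.314  # J/(mol K), universal gas constant

def gamma(T):
    """Hypothetical linear adiabatic exponent, gamma(T) = a - b*T."""
    return 1.40 - 6.0e-5 * T

def cv(T):
    """Molar isochoric specific heat implied by gamma(T)."""
    return R / (gamma(T) - 1.0)

def adiabat(T, v_start, v_end, steps=20000):
    """Euler-integrate dT/dx = -(gamma(T) - 1)*T along x = ln(v)."""
    dx = (math.log(v_end) - math.log(v_start)) / steps
    for _ in range(steps):
        T += -(gamma(T) - 1.0) * T * dx
    return T

def heat(T_from, T_to, steps=2000):
    """q = integral of cv(T) dT (midpoint rule), per mole."""
    dT = (T_to - T_from) / steps
    return sum(cv(T_from + (i + 0.5) * dT) * dT for i in range(steps))

def otto_efficiency_variable(T1, T3, r):
    """Otto cycle efficiency with temperature-dependent specific heat."""
    v1, v2 = 1.0, 1.0 / r
    T2 = adiabat(T1, v1, v2)   # adiabatic compression
    q_in = heat(T2, T3)        # isochoric heat addition
    T4 = adiabat(T3, v2, v1)   # adiabatic expansion
    q_out = heat(T4, T1)       # isochoric heat rejection (negative)
    return 1.0 + q_out / q_in

r = 9.0
eta_const = 1.0 - r ** (1.0 - 1.40)   # classic constant-gamma result
eta_var = otto_efficiency_variable(T1=300.0, T3=2200.0, r=r)
print(f"constant gamma: {eta_const:.3f}, variable cv: {eta_var:.3f}")
```

Because γ(T) falls below 1.40 at all operating temperatures, the variable-cᵥ efficiency comes out lower than the constant-γ value, which is the pedagogical point of the course extension.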

  14. Role of Rate of Specific Growth Rate in Different Growth Processes: A First Principle Approach

    CERN Document Server

    Biswas, Dibyendu; Patra, Sankar Nayaran

    2015-01-01

    In the present communication, an effort is made to develop a common platform that helps address several growth processes found in the literature. Based on a first-principle approach, the role of the rate of specific growth rate in different growth processes has been considered in a unified manner. It is found that different growth equations can be derived from the same rate equation of the specific growth rate. The dependence of the growth features of different growth processes on the parameters of this rate equation has been examined in detail. It is found that a competitive environment may increase the saturation level of the population size. Exponential growth can also be addressed in terms of two important factors of growth dynamics: reproduction and competition. These features have, most probably, not been reported earlier.
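
The unification the abstract describes can be illustrated with the specific growth rate r = (1/N)·dN/dt: choosing different rate equations dr/dt for the same population equation dN/dt = r·N yields exponential, Gompertz-like, or logistic-like growth. The sketch below (not the authors' formulation; parameter values are invented) Euler-integrates the pair (N, r) under three assumptions.

```python
def grow(drdt, n0=1.0, r0=0.5, t_end=40.0, dt=1e-3):
    """Euler-integrate dN/dt = r*N together with a supplied rate equation
    for the specific growth rate, dr/dt = drdt(r, N). Returns N(t_end)."""
    n, r = n0, r0
    for _ in range(int(t_end / dt)):
        n_new = n + r * n * dt
        r_new = r + drdt(r, n) * dt
        n, r = n_new, r_new
    return n

K, sigma, r0 = 100.0, 0.1, 0.5

exponential = grow(lambda r, n: 0.0, t_end=10.0)      # r constant -> N = N0*exp(r0*t)
gompertz    = grow(lambda r, n: -sigma * r)           # r decays exponentially
logistic    = grow(lambda r, n: -(r0 / K) * r * n)    # r falls as N grows

print(f"exponential N(10) = {exponential:.1f}")
print(f"Gompertz-like N(40) = {gompertz:.1f}")
print(f"logistic-like N(40) = {logistic:.1f}")
```

The exponential case grows without bound, while the other two saturate, with the saturation level set by the parameters of the rate equation of r rather than by the population equation itself.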

  15. Range-Specific High-resolution Mesoscale Model Setup

    Science.gov (United States)

    Watson, Leela R.

    2013-01-01

    This report summarizes the findings from an AMU task to determine the best model configuration for operational use at the ER and WFF to best predict winds, precipitation, and temperature. The AMU ran test cases in the warm and cool seasons at the ER and for the spring and fall seasons at WFF. For both the ER and WFF, the ARW core outperformed the NMM core. Results for the ER indicate that the Lin microphysical scheme and the YSU PBL scheme are the optimal model configuration for the ER. This configuration consistently produced the best surface and upper air forecasts, while performing fairly well for the precipitation forecasts. Both the Ferrier and Lin microphysical schemes in combination with the YSU PBL scheme performed well for WFF in the spring and fall seasons. The AMU has been tasked with a follow-on modeling effort to recommend a local data assimilation (DA) and numerical forecast model design optimized for both the ER and WFF to support space launch activities. The AMU will determine the best software and type of assimilation to use, as well as determine the best grid resolution for the initialization based on spatial and temporal availability of data and the wall clock run-time of the initialization. The AMU will transition from the WRF EMS to NU-WRF, a NASA-specific version of the WRF that takes advantage of unique NASA software and datasets.

  16. Specific Approach for Size-Control III-V Quantum/Nano LED Fabrication for Prospective White Light Source

    Science.gov (United States)

    2007-08-10

    The final report, titled "Specific approach for size-control III-V based quantum/nano LED fabrication for prospective white light source" (report title field: "Size controlled GaN based quantum dot LED for the prospective white light source"; period covered 14-06-2005 to 14-12-2005), presents the physical model of the PC LED used for optical simulation: the LED is composed of p-type GaN, an InGaN/GaN multiple quantum well (MQW) active region, and n-type GaN.

  17. The Einstein specific heat model for finite systems

    Science.gov (United States)

    Boscheto, E.; de Souza, M.; López-Castillo, A.

    2016-06-01

    The theoretical model proposed by Einstein to describe the phononic specific heat of solids as a function of temperature constitutes the very first application of the concept of energy quantization to describe the physical properties of a real system. Its central assumption lies in the consideration of a total energy distribution among N (in the thermodynamic limit N → ∞) non-interacting oscillators vibrating at the same frequency (ω). Nowadays, it is well known that most materials behave differently at the nanoscale, in some cases having physical properties with potential technological applications. Here, a version of the Einstein model composed of a finite number of particles/oscillators is proposed. The main findings obtained in the frame of the present work are: (i) a qualitative description of the specific heat in the limit of low temperatures for systems with nanometric dimensions; (ii) the observation that the corresponding chemical potential function for finite solids becomes null at finite temperatures, as observed in the Bose–Einstein condensation; and (iii) the emergence of a first-order-like phase transition driven by varying N.
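
For reference, the thermodynamic-limit baseline that the finite-N version modifies is the standard Einstein heat capacity per oscillator, C/(Nk) = x²eˣ/(eˣ - 1)² with x = θ_E/T. The sketch below implements only this textbook formula (the paper's finite-N corrections are not reproduced); the Einstein temperature used is an illustrative value.

```python
import math

def einstein_heat_capacity(T, theta_E):
    """Dimensionless heat capacity C/(N k) per oscillator for the Einstein
    model in the thermodynamic limit; theta_E is the Einstein temperature."""
    x = theta_E / T
    if x > 500:          # deep quantum regime: avoid float overflow
        return 0.0
    ex = math.exp(x)
    return x * x * ex / (ex - 1.0) ** 2

theta = 300.0  # hypothetical Einstein temperature in kelvin
for T in (10, 100, 300, 1000):
    print(T, round(einstein_heat_capacity(T, theta), 4))
```

The function vanishes exponentially as T → 0 and approaches the Dulong–Petit value of 1 per oscillator at high temperature, the two limits against which any finite-N deviation would be measured.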

  18. Specification Representation and Test Case Reduction by Analyzing the Interaction Patterns in System Model

    Directory of Open Access Journals (Sweden)

    Ashish Kumari

    2012-01-01

    Full Text Available Extended Finite State Machines use a formal description language to model the requirement specification of a system. System models are frequently changed because of specification changes, and these changes can be reflected in the model represented by the finite state machine. Selective test generation techniques are used to test the modified parts of the model; however, the regression test suites may still be very large. In this paper, we discuss a method that defines test suite reduction and uses the requirement specification for testing the main system after modifications to the requirements and implementation. An extended finite state machine uses a state transition diagram to represent the requirement specification, showing how the system changes state and which actions and variables are used during each transition. Data dependencies and control dependencies are then identified among the transitions of the state transition diagram. From these dependencies, the affecting and affected portions of the system introduced by the modification can be found. The main condition is: "If two test cases generate the same affecting and affected pattern, it is enough to implement only one test case rather than two." Using this approach, the size of the original test suite can be substantially reduced.
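
The quoted reduction rule is, operationally, a deduplication keyed on the (affecting, affected) pattern: keep one representative test case per distinct pattern. A minimal sketch, with hypothetical test-case names and patterns (the pattern extraction itself, via data/control dependency analysis, is outside this snippet):

```python
def reduce_suite(test_cases, pattern_of):
    """Keep one representative test case per dependency pattern.
    `pattern_of` maps a test case to its (affecting, affected) pattern,
    which must be hashable."""
    seen = {}
    for tc in test_cases:
        seen.setdefault(pattern_of(tc), tc)  # first case with a pattern wins
    return list(seen.values())

# Hypothetical patterns: frozensets of (affecting, affected) transition pairs
patterns = {
    "t1": frozenset({("A", "B")}),
    "t2": frozenset({("A", "B")}),           # same pattern as t1 -> redundant
    "t3": frozenset({("A", "B"), ("B", "C")}),
}
reduced = reduce_suite(["t1", "t2", "t3"], patterns.__getitem__)
print(reduced)  # -> ['t1', 't3']
```

Any refinement of the pattern (e.g. including transition ordering) only changes the key, not the reduction scheme.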

  19. A systems biology approach to the analysis of subset-specific responses to lipopolysaccharide in dendritic cells.

    Directory of Open Access Journals (Sweden)

    David G Hancock

    Full Text Available Dendritic cells (DCs) are critical for regulating CD4 and CD8 T cell immunity, controlling Th1, Th2, and Th17 commitment, generating inducible Tregs, and mediating tolerance. It is believed that distinct DC subsets have evolved to control these different immune outcomes. However, how DC subsets mount different responses to inflammatory and/or tolerogenic signals in order to accomplish their divergent functions remains unclear. Lipopolysaccharide (LPS) provides an excellent model for investigating responses in closely related splenic DC subsets, as all subsets express the LPS receptor TLR4 and respond to LPS in vitro. However, previous studies of the LPS-induced DC transcriptome have been performed only on mixed DC populations. Moreover, comparisons of the in vivo response of two closely related DC subsets to LPS stimulation have not been reported in the literature to date. We compared the transcriptomes of murine splenic CD8 and CD11b DC subsets after in vivo LPS stimulation, using RNA-Seq and systems biology approaches. We identified subset-specific gene signatures, which included multiple functional immune mediators unique to each subset. To explain the observed subset-specific differences, we used a network analysis approach. While both DC subsets used a conserved set of transcription factors and major signalling pathways, the subsets showed differential regulation of sets of genes that 'fine-tune' the network hubs expressed in common. We propose a model in which signalling through common pathway components is 'fine-tuned' by transcriptional control of subset-specific modulators, thus allowing for distinct functional outcomes in closely related DC subsets. We extend this analysis to comparable datasets from the literature and confirm that our model can account for cell subset-specific responses to LPS stimulation in multiple subpopulations in mouse and man.

  20. Effects of Sample Size, Estimation Methods, and Model Specification on Structural Equation Modeling Fit Indexes.

    Science.gov (United States)

    Fan, Xitao; Wang, Lin; Thompson, Bruce

    1999-01-01

    A Monte Carlo simulation study investigated the effects on 10 structural equation modeling fit indexes of sample size, estimation method, and model specification. Some fit indexes did not appear to be comparable, and it was apparent that estimation method strongly influenced almost all fit indexes examined, especially for misspecified models. (SLD)

  1. 78 FR 32476 - Models for Plant-Specific Adoption of Technical Specifications Task Force Traveler TSTF-426...

    Science.gov (United States)

    2013-05-30

    ... COMMISSION Models for Plant-Specific Adoption of Technical Specifications Task Force Traveler TSTF-426... of Technical Specifications (TSs) Task Force (TSTF) Traveler TSTF-426, Revision 5, ``Revise or Add... finds the proposed TS (Volume 1) and TS Bases (Volume 2) changes in Traveler TSTF-426 acceptable for...

  2. A Bayesian model of category-specific emotional brain responses.

    Directory of Open Access Journals (Sweden)

    Tor D Wager

    2015-04-01

    Full Text Available Understanding emotion is critical for a science of healthy and disordered brain function, but the neurophysiological basis of emotional experience is still poorly understood. We analyzed human brain activity patterns from 148 studies of emotion categories (2159 total participants) using a novel hierarchical Bayesian model. The model allowed us to classify which of five categories--fear, anger, disgust, sadness, or happiness--is engaged by a study with 66% accuracy (43-86% across categories). Analyses of the activity patterns encoded in the model revealed that each emotion category is associated with unique, prototypical patterns of activity across multiple brain systems including the cortex, thalamus, amygdala, and other structures. The results indicate that emotion categories are not contained within any one region or system, but are represented as configurations across multiple brain networks. The model provides a precise summary of the prototypical patterns for each emotion category, and demonstrates that a sufficient characterization of emotion categories relies on (a) differential patterns of involvement in neocortical systems that differ between humans and other species, and (b) distinctive patterns of cortical-subcortical interactions. Thus, these findings are incompatible with several contemporary theories of emotion, including those that emphasize emotion-dedicated brain systems and those that propose emotion is localized primarily in subcortical activity. They are consistent with componential and constructionist views, which propose that emotions are differentiated by a combination of perceptual, mnemonic, prospective, and motivational elements. Such brain-based models of emotion provide a foundation for new translational and clinical approaches.

  3. Generating patient-specific pulmonary vascular models for surgical planning

    Science.gov (United States)

    Murff, Daniel; Co-Vu, Jennifer; O'Dell, Walter G.

    2015-03-01

    Each year in the U.S., 7.4 million surgical procedures involving the major vessels are performed. Many of our patients require multiple surgeries, and many of the procedures include "surgical exploration". Procedures of this kind come with a significant amount of risk, carrying up to a 17.4% predicted mortality rate. This is especially concerning for our target population of pediatric patients with congenital abnormalities of the heart and major pulmonary vessels. This paper offers a novel approach to surgical planning that includes studying virtual and physical models of the pulmonary vasculature of an individual patient, obtained before the operation from conventional 3D X-ray computed tomography (CT) scans of the chest. These models would provide clinicians with a non-invasive, intricately detailed representation of patient anatomy, and could reduce the need for invasive planning procedures such as exploratory surgery. Researchers involved in the AirPROM project have already demonstrated the utility of virtual and physical models in treatment planning of the airways of the chest. Clinicians have acknowledged the potential benefit from such a technology. A method for creating patient-derived physical models is demonstrated on pulmonary vasculature extracted from a CT scan with contrast of an adult human. Using a modified version of the NIH ImageJ program, a series of image processing functions are used to extract and mathematically reconstruct the vasculature tree structures of interest. An auto-generated STL file is sent to a 3D printer to create a physical model of the major pulmonary vasculature generated from 3D CT scans of patients.

  4. Specification of advanced safety modeling requirements (Rev. 0).

    Energy Technology Data Exchange (ETDEWEB)

    Fanning, T. H.; Tautges, T. J.

    2008-06-30

    The U.S. Department of Energy's Global Nuclear Energy Partnership has led to renewed interest in liquid-metal-cooled fast reactors for the purpose of closing the nuclear fuel cycle and making more efficient use of future repository capacity. However, the U.S. has not designed or constructed a fast reactor in nearly 30 years. Accurate, high-fidelity, whole-plant dynamics safety simulations will play a crucial role by providing confidence that component and system designs will satisfy established design limits and safety margins under a wide variety of operational, design basis, and beyond design basis transient conditions. Current modeling capabilities for fast reactor safety analyses have resulted from several hundred person-years of code development effort supported by experimental validation. The broad spectrum of mechanistic and phenomenological models that have been developed represent an enormous amount of institutional knowledge that needs to be maintained. Complicating this, the existing code architectures for safety modeling evolved from programming practices of the 1970s. This has led to monolithic applications with interdependent data models which require significant knowledge of the complexities of the entire code in order for each component to be maintained. In order to develop an advanced fast reactor safety modeling capability, the limitations of the existing code architecture must be overcome while preserving the capabilities that already exist. To accomplish this, a set of advanced safety modeling requirements is defined, based on modern programming practices, that focuses on modular development within a flexible coupling framework. An approach for integrating the existing capabilities of the SAS4A/SASSYS-1 fast reactor safety analysis code into the SHARP framework is provided in order to preserve existing capabilities while providing a smooth transition to advanced modeling capabilities. In doing this, the advanced fast reactor safety models…

  5. Site-specific water quality guidelines: 1. Derivation approaches based on physicochemical, ecotoxicological and ecological data.

    Science.gov (United States)

    van Dam, R A; Humphrey, C L; Harford, A J; Sinclair, A; Jones, D R; Davies, S; Storey, A W

    2014-01-01

    Generic water quality guidelines (WQGs) are developed by countries/regions as broad scale tools to assist with the protection of aquatic ecosystems from the impacts of toxicants. However, since generic WQGs cannot adequately account for the many environmental factors that may affect toxicity at a particular site, site-specific WQGs are often needed, especially for high environmental value ecosystems. The Australian and New Zealand Guidelines for Fresh and Marine Water Quality provide comprehensive guidance on methods for refining or deriving WQGs for site-specific purposes. This paper describes three such methods for deriving site-specific WQGs, namely: (1) using local reference water quality data, (2) using biological effects data from laboratory-based toxicity testing, and (3) using biological effects data from field surveys. Two case studies related to the assessment of impacts arising from mining operations in northern Australia are used to illustrate the application of these methods. Finally, the potential of several emerging methods designed to assess thresholds of ecological change from field data for deriving site-specific WQGs is discussed. Ideally, multiple lines of evidence approaches, integrating both laboratory and field data, are recommended for deriving site-specific WQGs.

  6. Teaching Formal Models of Concurrency Specification and Analysis

    Directory of Open Access Journals (Sweden)

    N. V. Shilov

    2015-01-01

    Full Text Available There is widespread and rapidly growing interest in parallel programming nowadays. This interest is based on the availability of supercomputers, computer clusters and powerful graphics processors for computational mathematics and simulation. MPI, OpenMP, CUDA and other technologies provide the opportunity to write C and FORTRAN code for parallel speed-up of execution without races for resources. Nevertheless, concurrency issues (like races) are still very important for parallel systems in general and distributed systems in particular. For this reason, there is a need for research, study and teaching of formal models of concurrency and methods of distributed system verification. The paper presents individual experience with teaching Formal Models of Concurrency as a graduate elective course for students specializing in high-performance computing. First it sketches the course background, objectives, lecture plan and topics. Then the paper presents how to formalize (i.e. specify) a reachability puzzle in semantic, syntactic and logic formal models, namely: in Petri nets, in a dialect of the Calculus of Communicating Systems (CCS) and in Computation Tree Logic (CTL). This puzzle is a good educational example to present the specifics of different formal notations. The article is published in the author's wording.

  7. Modelling, Specification and Robustness Issues for Robotic Manipulation Tasks

    Directory of Open Access Journals (Sweden)

    Danica Kragic

    2008-11-01

    Full Text Available In this paper, a system for modeling of service robot tasks is presented. Our work is motivated by the idea that a robotic task may be represented as a set of tractable modules, each responsible for a certain part of the task. For general fetch-and-carry robotic applications, there will be varying demands for precision and degrees of freedom involved, depending on the complexity of the individual module. The particular research problem considered here is the development of a system that supports simple design of complex tasks from a set of basic primitives. The three system levels considered are: (i) task graph generation, which allows the user to easily design or model a task; (ii) task graph execution, which executes the task graph; and (iii) at the lowest level, the specification and development of primitives required for general fetch-and-carry robotic applications. In terms of robustness, we believe that one way of increasing the robustness of the whole system is by increasing the robustness of individual modules. In particular, we consider a number of different parameters that affect the performance of a model-based tracking system. Parameters such as color channels, feature detection, validation gates, outlier rejection and feature selection are considered here, and their effect on the overall system performance is discussed. Experimental evaluation shows how some of these parameters can successfully be evaluated (learned) on-line and consequently improve the performance of the system.

  8. Factors affecting forward pricing behaviour: implications of alternative regression model specifications

    Directory of Open Access Journals (Sweden)

    Henry Jordaan

    2010-12-01

    Full Text Available Price risk associated with maize production became a reason for concern in South Africa only after the deregulation of the agricultural commodities markets in the mid-1990s, when farmers became responsible for marketing their own crops. Although farmers can use, inter alia, cash forward contracting and/or the derivatives market to manage price risk, few farmers actually participate in forward pricing. A similar reluctance to use forward pricing methods is also found internationally. A number of different model specifications have been used in previous research to model forward pricing behaviour, based on the assumption that the same variables influence both the adoption and the quantity decision. This study compares the results from a model specification that models forward pricing behaviour in a single-decision framework with the results from modelling the quantity decision conditional on the adoption decision in a two-step approach. The results suggest that substantially more information is obtained by modelling forward pricing behaviour as two separate decisions rather than a single decision. Such information may be valuable in educational material compiled to educate farmers in the effective use of forward pricing methods in price risk management. Modelling forward pricing behaviour as two separate decisions is thus more effective than modelling it as a single decision.

  9. Ab initio state-specific N2 + O dissociation and exchange modeling for molecular simulations

    Science.gov (United States)

    Luo, Han; Kulakhmetov, Marat; Alexeenko, Alina

    2017-02-01

    Quasi-classical trajectory (QCT) calculations are used in this work to calculate state-specific N2(X¹Σ) + O(³P) → 2N(⁴S) + O(³P) dissociation and N2(X¹Σ) + O(³P) → NO(X²Π) + N(⁴S) exchange cross sections and rates based on the 1³A″ and 1³A′ ab initio potential energy surfaces by Gamallo et al. [J. Chem. Phys. 119, 2545-2556 (2003)]. The calculations consider translational energies up to 23 eV and temperatures between 1000 K and 20 000 K. Vibrational favoring is observed for the dissociation reaction over the whole range of collision energies and for the exchange reaction around the dissociation limit. For the same collision energy, cross sections for v = 30 are 4 to 6 times larger than those for the ground state. The exchange reaction has an effective activation energy that depends on the initial rovibrational level, unlike the dissociation reaction. In addition, the exchange cross sections have a maximum when the total collision energy (TCE) approaches the dissociation energy. The calculations are used to generate compact QCT-derived state-specific dissociation (QCT-SSD) and QCT-derived state-specific exchange (QCT-SSE) models, which describe over 1 × 10⁶ cross sections with about 150 model parameters. The models can be used directly within direct simulation Monte Carlo and computational fluid dynamics simulations. Rate constants predicted by the new models are compared to experimental measurements, direct QCT calculations, and predictions by other models, including the TCE model, the Bose-Candler QCT-based exchange model, the Macheret-Fridman dissociation model, Macheret's exchange model, and Park's two-temperature model. The new models match QCT-calculated and experimental rates within 30% under nonequilibrium conditions, while other models underpredict by over an order of magnitude under vibrationally cold conditions.

  10. Femur specific polyaffine model to regularize the log-domain demons registration

    Science.gov (United States)

    Seiler, Christof; Pennec, Xavier; Ritacco, Lucas; Reyes, Mauricio

    2011-03-01

    Osteoarticular allograft transplantation is a popular treatment method in wide surgical resections with large defects. For this reason hospitals are building bone data banks. Performing the optimal allograft selection on bone banks is crucial to the surgical outcome and patient recovery. However, current approaches are very time consuming, hindering efficient selection. We present an automatic method based on registration of femur bones to overcome this limitation. We introduce a new regularization term for the log-domain demons algorithm. This term replaces the standard Gaussian smoothing with a femur-specific polyaffine model. The polyaffine femur model is constructed with two affine (femoral head and condyles) and one rigid (shaft) transformation. Our main contribution in this paper is to show that the demons algorithm can be improved in specific cases with an appropriate model. We are not trying to find the most optimal polyaffine model of the femur, but the simplest model with a minimal number of parameters. There is no need to optimize for different numbers of regions, boundaries and choices of weights, since this fine tuning will be done automatically by a final demons relaxation step with Gaussian smoothing. The newly developed synthesis approach provides a clear anatomically motivated modeling contribution through the specific three-component transformation model, and clearly shows a performance improvement (in terms of anatomically meaningful correspondences) on 146 CT images of femurs compared to a standard multiresolution demons. In addition, this simple model improves the robustness of the demons while preserving its accuracy. The ground truth consists of manual measurements performed by medical experts.

  11. Element-specific density profiles in interacting biomembrane models

    Science.gov (United States)

    Schneck, Emanuel; Rodriguez-Loureiro, Ignacio; Bertinetti, Luca; Marin, Egor; Novikov, Dmitri; Konovalov, Oleg; Gochev, Georgi

    2017-03-01

    Surface interactions involving biomembranes, such as cell–cell interactions or membrane contacts inside cells play important roles in numerous biological processes. Structural insight into the interacting surfaces is a prerequisite to understand the interaction characteristics as well as the underlying physical mechanisms. Here, we work with simplified planar experimental models of membrane surfaces, composed of lipids and lipopolymers. Their interaction is quantified in terms of pressure–distance curves using ellipsometry at controlled dehydrating (interaction) pressures. For selected pressures, their internal structure is investigated by standing-wave x-ray fluorescence (SWXF). This technique yields specific density profiles of the chemical elements P and S belonging to lipid headgroups and polymer chains, as well as counter-ion profiles for charged surfaces.

  12. Shortening trinucleotide repeats using highly specific endonucleases: a possible approach to gene therapy?

    Science.gov (United States)

    Richard, Guy-Franck

    2015-04-01

    Trinucleotide repeat expansions are involved in more than two dozen neurological and developmental disorders. Conventional therapeutic approaches aimed at regulating the expression level of affected genes, which rely on drugs, oligonucleotides, and/or transgenes, have met with only limited success so far. An alternative approach is to shorten repeats to non-pathological lengths using highly specific nucleases. Here, I review early experiments using meganucleases, zinc-finger nucleases (ZFNs), and transcription activator-like effector nucleases (TALENs) to contract trinucleotide repeats, and discuss the possibility of using CRISPR-Cas nucleases to the same end. Although this is a nascent field, I explore the possibility of designing nucleases and effectively delivering them in the context of gene therapy.

  13. A mechanism-based approach for absorption modeling: the Gastro-Intestinal Transit Time (GITT) model.

    Science.gov (United States)

    Hénin, Emilie; Bergstrand, Martin; Standing, Joseph F; Karlsson, Mats O

    2012-06-01

    Absorption models used in the estimation of pharmacokinetic drug characteristics from plasma concentration data are generally empirical and simple, utilizing no prior information on gastro-intestinal (GI) transit patterns. Our aim was to develop and evaluate an estimation strategy based on a mechanism-based model for drug absorption, which takes into account the tablet movement through the GI transit. This work is an extension of a previous model utilizing tablet movement characteristics derived from magnetic marker monitoring (MMM) and pharmacokinetic data. The new approach, which replaces MMM data with a GI transit model, was evaluated in data sets where MMM data were available (felodipine) or not available (diclofenac). Pharmacokinetic profiles in both datasets were well described by the model according to goodness-of-fit plots. Visual predictive checks showed the model to give superior simulation properties compared with a standard empirical approach (first-order absorption rate + lag-time). This model represents a step towards an integrated mechanism-based NLME model, where the use of physiological knowledge and in vitro–in vivo correlation helps fully characterize PK and generate hypotheses for new formulations or specific populations.
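
The "standard empirical approach" that the GITT model is benchmarked against (first-order absorption rate plus lag time, in a one-compartment disposition model) has a closed form, the Bateman function with a shifted time axis. The sketch below is only that comparator, with invented parameter values; it is not the mechanism-based GITT model itself.

```python
import math

def conc(t, dose=100.0, F=1.0, V=50.0, ka=1.2, ke=0.15, tlag=0.5):
    """Plasma concentration for a one-compartment model with first-order
    absorption and a lag time (assumes ka != ke). Illustrative units:
    dose in mg, V in litres, ka/ke in 1/h, t in hours."""
    if t <= tlag:
        return 0.0  # nothing absorbed before the lag time elapses
    tt = t - tlag
    return (F * dose * ka) / (V * (ka - ke)) * (math.exp(-ke * tt) - math.exp(-ka * tt))

times = [0, 0.5, 1, 2, 4, 8, 12]
print([round(conc(t), 3) for t in times])
```

The profile is zero until t = tlag, rises to a peak at tmax = tlag + ln(ka/ke)/(ka - ke), and then decays; the GITT approach replaces this single lumped absorption rate with tablet transit through the GI tract.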

  14. Uncertainty in biology a computational modeling approach

    CERN Document Server

    Gomez-Cabrero, David

    2016-01-01

    Computational modeling of biomedical processes is gaining more and more weight in current research into the etiology of biomedical problems and potential treatment strategies. Computational modeling allows one to reduce, refine and replace animal experimentation as well as to translate findings obtained in these experiments to the human background. However, these biomedical problems are inherently complex, with a myriad of influencing factors, which strongly complicates the model building and validation process. This book addresses four main issues related to the building and validation of computational models of biomedical processes: (i) model establishment under uncertainty; (ii) model selection and parameter fitting; (iii) sensitivity analysis and model adaptation; and (iv) model predictions under uncertainty. In each of the abovementioned areas, the book discusses a number of key techniques by means of a general theoretical description followed by one or more practical examples. This book is intended for graduate students...

  15. Patient-specific biomechanical model as whole-body CT image registration tool.

    Science.gov (United States)

    Li, Mao; Miller, Karol; Joldes, Grand Roman; Doyle, Barry; Garlapati, Revanth Reddy; Kikinis, Ron; Wittek, Adam

    2015-05-01

    Whole-body computed tomography (CT) image registration is important for cancer diagnosis, therapy planning and treatment. Such registration requires accounting for large differences between source and target images caused by deformations of soft organs/tissues and articulated motion of skeletal structures. The registration algorithms relying solely on image processing methods exhibit deficiencies in accounting for such deformations and motion. We propose to predict the deformations and movements of body organs/tissues and skeletal structures for whole-body CT image registration using patient-specific non-linear biomechanical modelling. Unlike the conventional biomechanical modelling, our approach for building the biomechanical models does not require time-consuming segmentation of CT scans to divide the whole body into non-overlapping constituents with different material properties. Instead, a Fuzzy C-Means (FCM) algorithm is used for tissue classification to assign the constitutive properties automatically at integration points of the computation grid. We use only very simple segmentation of the spine when determining vertebrae displacements to define loading for biomechanical models. We demonstrate the feasibility and accuracy of our approach on CT images of seven patients suffering from cancer and aortic disease. The results confirm that accurate whole-body CT image registration can be achieved using a patient-specific non-linear biomechanical model constructed without time-consuming segmentation of the whole-body images.
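
    The Fuzzy C-Means classification step described above can be illustrated with a minimal 1-D implementation; the intensity values and cluster count in the usage are made up for illustration, and this is not the authors' code:

```python
def fcm_1d(values, c=3, m=2.0, iters=50):
    """Minimal 1-D Fuzzy C-Means: returns cluster centres and the fuzzy
    membership matrix U (one row per value; each row sums to 1).
    m > 1 is the fuzziness exponent."""
    n = len(values)
    # deterministic initialisation: spread centres over the sorted data
    srt = sorted(values)
    centers = [srt[round(i * (n - 1) / (c - 1))] for i in range(c)]
    for _ in range(iters):
        # membership update: u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1))
        U = []
        for x in values:
            d = [abs(x - v) or 1e-12 for v in centers]  # avoid zero distance
            U.append([1.0 / sum((d[i] / d[j]) ** (2.0 / (m - 1.0))
                                for j in range(c)) for i in range(c)])
        # centre update: membership-weighted means with weights u^m
        centers = [sum(U[k][i] ** m * values[k] for k in range(n)) /
                   sum(U[k][i] ** m for k in range(n)) for i in range(c)]
    return centers, U
```

    In the paper's setting the scalars would be CT intensities at integration points, and memberships would blend the constitutive properties of the candidate tissue classes.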

  16. Sex-specific patterns and differences in dementia and Alzheimer's disease using informatics approaches.

    Science.gov (United States)

    Ronquillo, Jay Geronimo; Baer, Merritt Rachel; Lester, William T

    2016-01-01

    The National Institutes of Health Office of Research on Women's Health recently highlighted the critical need for explicitly addressing sex differences in biomedical research, including Alzheimer's disease and dementia. The purpose of our study was to perform a sex-stratified analysis of cognitive impairment using diverse medical, clinical, and genetic factors of unprecedented scale and scope by applying informatics approaches to three large Alzheimer's databases. Analyses suggested females were 1.5 times more likely than males to have a documented diagnosis of probable Alzheimer's disease, and several other factors fell along sex-specific lines and were possibly associated with severity of cognitive impairment.

  17. Proteomic Approaches for Site-specific O-GlcNAcylation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Sheng; Yang, Feng; Camp, David G.; Rodland, Karin D.; Qian, Weijun; Liu, Tao; Smith, Richard D.

    2014-10-01

    O-GlcNAcylation is a dynamic protein post-translational modification of serine or threonine residues by an O-linked monosaccharide N-acetylglucosamine (O-GlcNAc). O-GlcNAcylation was discovered three decades ago, and it has been shown to contribute to various disease states, such as metabolic diseases, cancer and neurological diseases. Yet it remains technically difficult to characterize comprehensively and quantitatively, due to its exceptionally low abundance and extremely labile nature under conventional tandem mass spectrometry conditions. Herein, we review the recent efforts for tackling these challenges in developing proteomic approaches for site-specific O-GlcNAcylation analysis, such as specific enrichment of O-GlcNAc peptides/proteins, unambiguous site-determination of O-GlcNAc modification, and quantitative analysis of O-GlcNAcylation.

  18. Literature Review of Cosmetic Procedures in Men: Approaches and Techniques are Gender Specific.

    Science.gov (United States)

    Cohen, Brandon E; Bashey, Sameer; Wysong, Ashley

    2017-02-01

    The proportion of men receiving non-surgical cosmetic procedures has risen substantially in recent years. Various physiologic, anatomic, and motivational considerations differentiate the treatments for male and female patients. Nevertheless, research regarding approaches to the male cosmetic patient is scarce. We sought to provide an overview and sex-specific discussion of the most popular cosmetic dermatologic procedures pursued by men by conducting a comprehensive literature review pertaining to non-surgical cosmetic procedures in male patients. The most common and rapidly expanding non-surgical interventions in men include botulinum toxin, filler injection, chemical peels, microdermabrasion, laser resurfacing, laser hair removal, hair transplantation, and minimally invasive techniques for adipose tissue reduction. Important sex-specific factors associated with each of these procedures should be considered to best serve the male cosmetic patient.

  19. An Approach to Improve the State Scalability of Source Specific Multicast

    Directory of Open Access Journals (Sweden)

    S. A. Al-Talib

    2009-01-01

    Full Text Available Problem statement: Source Specific Multicast (SSM) is an acceptable solution for current multicast applications, since the driving applications to date are one-to-many, including Internet TV, distance learning, file distribution and streaming media. Approach: SSM is also useful for billing, address allocation and security, but it still has a serious state scalability problem when there are a large number of simultaneous on-going multicast groups in the network. Results: In this study, a scheme was devised to improve the state scalability of source specific multicast. The scheme consists of two stages: the first stage clusters the receivers based on their IP addresses, and the second stage reduces the multicast state at routers. Conclusion/Recommendations: To verify the correctness of the proposed scheme, it was applied to multicast trees built by other researchers; the results of the comparison support our claims.

  20. ALREST High Fidelity Modeling Program Approach

    Science.gov (United States)

    2011-05-18

    Topics include: gases and mixtures of Redlich-Kwong and Peng-Robinson fluids; an assumed-pdf model based on the k-ε-g model in the NASA/LaRC Vulcan code; a level-set model...; the potential attractiveness of liquid hydrocarbon engines for boost applications; the propensity of hydrocarbon engines for combustion instability; air

  1. A Variational Approach to the Modeling of MIMO Systems

    Directory of Open Access Journals (Sweden)

    A. Jraifi

    2007-05-01

    Full Text Available Motivated by the study of the optimization of the quality of service for multiple-input multiple-output (MIMO) systems in 3G (third generation), we develop a method for modeling the MIMO channel ℋ. This method, which uses a statistical approach, is based on a variational form of the usual channel equation. The proposed equation is given by δ² = ⟨δR|ℋ|δE⟩ + ⟨δR|δℋ|E⟩, with scalar variable δ = ‖δR‖. The minimum distance δmin of received vectors |R⟩ is used as the random variable to model the MIMO channel. This variable is of crucial importance for the performance of the transmission system, as it captures the degree of interference between neighboring vectors. We then use this approach to compute numerically the total probability of error with respect to the signal-to-noise ratio (SNR) and to predict the number of antennas. By fixing the SNR to a specific value, we extract information on the optimal number of MIMO antennas.

  2. Dark matter physics in neutrino specific two Higgs doublet model

    CERN Document Server

    Baek, Seungwon

    2016-01-01

    Although the seesaw mechanism is a natural explanation for the small neutrino masses, there are cases when the Majorana mass terms for the right-handed neutrinos are not allowed due to symmetry. In that case, if a neutrino-specific Higgs doublet is introduced, neutrinos become Dirac particles and their small masses can be explained by its small VEV. We show that the same symmetry, which we assume to be a global $U(1)_X$, can also be used to explain the stability of dark matter. In our model, a new singlet scalar breaks the global symmetry spontaneously down to a discrete $Z_2$ symmetry. The dark matter particle, the lightest $Z_2$-odd fermion, is thereby stabilized. We discuss the phenomenology of dark matter: relic density, direct detection, and indirect detection. We find that the relic density can be explained by a novel Goldstone boson channel or by a resonance channel. In most of the parameter space considered, direct detection is suppressed well below the current experimental bound. Our model can be further teste...

  3. General approach for in vivo recovery of cell type-specific effector gene sets.

    Science.gov (United States)

    Barsi, Julius C; Tu, Qiang; Davidson, Eric H

    2014-05-01

    Differentially expressed, cell type-specific effector gene sets hold the key to multiple important problems in biology, from theoretical aspects of developmental gene regulatory networks (GRNs) to various practical applications. Although individual cell types of interest have been recovered by various methods and analyzed, systematic recovery of multiple cell type-specific gene sets from whole developing organisms has remained problematic. Here we describe a general methodology using the sea urchin embryo, a material of choice because of the large-scale GRNs already solved for this model system. This method utilizes the regulatory states expressed by given cells of the embryo to define cell type and includes a fluorescence activated cell sorting (FACS) procedure that results in no perturbation of transcript representation. We have extensively validated the method by spatial and qualitative analyses of the transcriptome expressed in isolated embryonic skeletogenic cells and as a consequence, generated a prototypical cell type-specific transcriptome database.

  4. New Cutting Force Modeling Approach for Flat End Mill

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    A new mechanistic cutting force model for flat end milling using instantaneous cutting force coefficients is proposed. An in-depth analysis shows that the total cutting forces can be separated into two terms: a nominal component independent of the runout and a perturbation component induced by the runout. The instantaneous value of the nominal component is used to calibrate the cutting force coefficients. With the help of the perturbation component and the cutting force coefficients obtained above, the cutter runout is identified. Based on simulation and experimental results, the validity of the identification approach is demonstrated. The advantage of the proposed method lies in that the calibration performed with data from one cutting test under a specific regime can be applied over a great range of cutting conditions.

  5. Systems approaches to computational modeling of the oral microbiome

    Directory of Open Access Journals (Sweden)

    Dimiter V. Dimitrov

    2013-07-01

    Full Text Available Current microbiome research has generated tremendous amounts of data providing snapshots of molecular activity in a variety of organisms, environments, and cell types. However, turning this knowledge into a whole-system-level understanding of pathways and processes has proven to be a challenging task. In this review we highlight the applicability of bioinformatics and visualization techniques to large collections of data in order to better understand the information they contain about diet–oral microbiome–host mucosal transcriptome interactions. In particular we focus on the systems biology of Porphyromonas gingivalis in the context of high-throughput computational methods tightly integrated with translational systems medicine. These approaches have applications ranging from basic research, where we can direct specific laboratory experiments in model organisms and cell cultures, to human disease, where we can validate new mechanisms and biomarkers for the prevention and treatment of chronic disorders.

  6. Modeling Educational Content: The Cognitive Approach of the PALO Language

    Directory of Open Access Journals (Sweden)

    M. Felisa Verdejo Maíllo

    2004-01-01

    Full Text Available This paper presents a reference framework to describe educational material. It introduces the PALO Language as a cognitive based approach to Educational Modeling Languages (EML. In accordance with recent trends for reusability and interoperability in Learning Technologies, EML constitutes an evolution of the current content-centered specifications of learning material, involving the description of learning processes and methods from a pedagogical and instructional perspective. The PALO Language, thus, provides a layer of abstraction for the description of learning material, including the description of learning activities, structure and scheduling. The framework makes use of domain and pedagogical ontologies as a reusable and maintainable way to represent and store instructional content, and to provide a pedagogical level of abstraction in the authoring process.

  7. LEXICAL APPROACH IN TEACHING TURKISH: A COLLOCATIONAL STUDY MODEL

    National Research Council Canada - National Science Library

    Eser ÖRDEM

    2013-01-01

    Abstract This study intends to propose Lexical Approach (Lewis, 1998, 2002; Harwood, 2002) and a model for teaching Turkish as a foreign language so that this model can be used in classroom settings...

  8. A model-based multisensor data fusion knowledge management approach

    Science.gov (United States)

    Straub, Jeremy

    2014-06-01

    A variety of approaches exist for combining data from multiple sensors. The model-based approach combines data based on its support for or refutation of elements of the model, which in turn can be used to evaluate an experimental thesis. This paper presents a collection of algorithms for mapping various types of sensor data onto a thesis-based model and evaluating the truth or falsity of the thesis, based on the model. The use of this approach for autonomously arriving at findings and for prioritizing data is considered. Techniques for updating the model (instead of arriving at a true/false assertion) are also discussed.
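
    One way to picture the support-or-refute mapping described above is a toy evidence-aggregation routine; the element names, predicate form and weighting scheme below are assumptions for illustration, not the paper's actual algorithms:

```python
def evaluate_thesis(model, readings, threshold=0.5):
    """Illustrative sketch: each sensor reading supports (+w) or refutes
    (-w) the model element it maps to; the thesis is accepted if the net
    normalised support meets the threshold."""
    score, total = 0.0, 0.0
    for element, predicate, weight in model:      # thesis-based model
        for sensor, value in readings:
            if sensor != element:
                continue                          # reading maps elsewhere
            total += weight
            score += weight if predicate(value) else -weight
    return total > 0 and score / total >= threshold
```

    A model-updating variant would adjust the predicates or weights instead of returning a verdict.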

  9. A Mixed Approach for Modeling Blood Flow in Brain Microcirculation

    Science.gov (United States)

    Peyrounette, M.; Sylvie, L.; Davit, Y.; Quintard, M.

    2014-12-01

    We have previously demonstrated [1] that the vascular system of the healthy human brain cortex is a superposition of two structural components, each corresponding to a different spatial scale. At small scale, the vascular network has a capillary structure, which is homogeneous and space-filling over a cut-off length. At larger scale, veins and arteries conform to a quasi-fractal branched structure. This structural duality is consistent with the functional duality of the vasculature, i.e. distribution and exchange. From a modeling perspective, this can be viewed as the superposition of: (a) a continuum model describing slow transport in the small-scale capillary network, characterized by a representative elementary volume and effective properties; and (b) a discrete network approach [2] describing fast transport in the arterial and venous network, which cannot be homogenized because of its fractal nature. This problem is analogous to modeling problems encountered in geological media, e.g., in petroleum engineering, where fast conducting channels (wells or fractures) are embedded in a porous medium (reservoir rock). An efficient method to reduce the computational cost of fracture/continuum simulations is to use relatively large grid blocks for the continuum model. However, this also makes it difficult to accurately couple both structural components. In this work, we solve this issue by adapting the "well model" concept used in petroleum engineering [3] to brain-specific 3-D situations. We obtain a single linear system of equations describing the discrete network, the continuum and the well model coupling. Results are presented for realistic geometries and compared with a non-homogenized small-scale network model of an idealized periodic capillary network of known permeability. [1] Lorthois & Cassot, J. Theor. Biol. 262, 614-633, 2010. [2] Lorthois et al., Neuroimage 54 : 1031-1042, 2011. [3] Peaceman, SPE J. 18, 183-194, 1978.

  10. An Alternative Bayesian Approach to Structural Breaks in Time Series Models

    NARCIS (Netherlands)

    S. van den Hauwe (Sjoerd); R. Paap (Richard); D.J.C. van Dijk (Dick)

    2011-01-01

    textabstractWe propose a new approach to deal with structural breaks in time series models. The key contribution is an alternative dynamic stochastic specification for the model parameters which describes potential breaks. After a break new parameter values are generated from a so-called baseline pr

  11. Comparison of two novel approaches to model fibre reinforced concrete

    NARCIS (Netherlands)

    Radtke, F.K.F.; Simone, A.; Sluys, L.J.

    2009-01-01

    We present two approaches to model fibre reinforced concrete. In both approaches, discrete fibre distributions and the behaviour of the fibre-matrix interface are explicitly considered. One approach employs the reaction forces from fibre to matrix while the other is based on the partition of unity f

  12. The development of industry-specific odor impact criteria for feedlots using models.

    Science.gov (United States)

    Henry, Chris G; Watts, Peter J; Nicholas, Peter J

    2008-09-01

    Emissions from feedlot operations are known to vary by environmental conditions and few if any techniques or models exist to predict the variability of odor emission rates from feedlots. The purpose of this paper is to outline and summarize unpublished reports that are the result of a collective effort to develop industry-specific odor impact criteria for Australian feedlots. This effort used over 250 olfactometry samples collected with a wind tunnel and past research to develop emission models for pads, sediment basins, holding ponds, and manure storage areas over a range of environmental conditions and tested using dynamic olfactometry. A process was developed to integrate these emission models into odor dispersion modeling for the development of impact criteria. The approach used a feedlot hydrology model to derive daily feedlot pad moisture, temperature, and thickness. A submodel converted these daily data to hourly data. A feedlot pad emissions model was developed that predicts feedlot pad emissions as a function of temperature, moisture content, and pad depth. Emissions from sediment basins and holding ponds were predicted using a basin emissions model as a function of days since rain, inflow volume, inflow ratio (pond volume), and temperature. This is the first attempt to model all odor source emissions from a feedlot as variable hourly emissions on the basis of climate, management, and site-specific conditions. Results from the holding pond, sediment basin, and manure storage emission models performed well, but additional work on the pad emissions model may be warranted. This methodology mimics the variable odor emissions and odor impact expected from feedlots due to climate and management effects. The main outcome of the work is the recognition that an industry-specific odor impact criterion must be expressed in terms of all of the components of the assessment methodology.

  13. New Approaches in Reusable Booster System Life Cycle Cost Modeling

    Science.gov (United States)

    Zapata, Edgar

    2013-01-01

    This paper presents the results of a 2012 life cycle cost (LCC) study of hybrid Reusable Booster Systems (RBS) conducted by NASA Kennedy Space Center (KSC) and the Air Force Research Laboratory (AFRL). The work included the creation of a new cost estimating model and an LCC analysis, building on past work where applicable, but emphasizing the integration of new approaches in life cycle cost estimation. Specifically, the inclusion of industry processes/practices and indirect costs were a new and significant part of the analysis. The focus of LCC estimation has traditionally been from the perspective of technology, design characteristics, and related factors such as reliability. Technology has informed the cost related support to decision makers interested in risk and budget insight. This traditional emphasis on technology occurs even though it is well established that complex aerospace systems costs are mostly about indirect costs, with likely only partial influence in these indirect costs being due to the more visible technology products. Organizational considerations, processes/practices, and indirect costs are traditionally derived ("wrapped") only by relationship to tangible product characteristics. This traditional approach works well as long as it is understood that no significant changes, and by relation no significant improvements, are being pursued in the area of either the government acquisition or industry's indirect costs. In this sense then, most launch systems cost models ignore most costs. The alternative was implemented in this LCC study, whereby the approach considered technology and process/practices in balance, with as much detail for one as the other. This RBS LCC study has avoided point-designs, for now, instead emphasizing exploring the trade-space of potential technology advances joined with potential process/practice advances. Given the range of decisions, and all their combinations, it was necessary to create a model of the original model

  15. An algebraic approach to the Hubbard model

    CERN Document Server

    de Leeuw, Marius

    2015-01-01

    We study the algebraic structure of an integrable Hubbard-Shastry type lattice model associated with the centrally extended su(2|2) superalgebra. This superalgebra underlies Beisert's AdS/CFT worldsheet R-matrix and Shastry's R-matrix. The considered model specializes to the one-dimensional Hubbard model in a certain limit. We demonstrate that Yangian symmetries of the R-matrix specialize to the Yangian symmetry of the Hubbard model found by Korepin and Uglov. Moreover, we show that the Hubbard model Hamiltonian has an algebraic interpretation as the so-called secret symmetry. We also discuss Yangian symmetries of the A and B models introduced by Frolov and Quinn.

  16. Numerical modelling approach for mine backfill

    Indian Academy of Sciences (India)

    MUHAMMAD ZAKA EMAD

    2017-09-01

    Numerical modelling is broadly used for assessing complex scenarios in underground mines, including mining sequence and blast-induced vibrations from production blasting. Sublevel stoping mining methods with delayed backfill are extensively used to exploit steeply dipping ore bodies by Canadian hard-rock metal mines. Mine backfill is an important constituent of the mining process. Numerical modelling of mine backfill material needs special attention, as the numerical model must behave realistically and in accordance with the site conditions. This paper discusses a numerical modelling strategy for mine backfill material. The modelling strategy is studied using a case study mine from the Canadian mining industry. In the end, results of a numerical model parametric study are shown and discussed.

  17. Regularization of turbulence - a comprehensive modeling approach

    NARCIS (Netherlands)

    Geurts, Bernard J.

    2011-01-01

    Turbulence readily arises in numerous flows in nature and technology. The large number of degrees of freedom of turbulence poses serious challenges to numerical approaches aimed at simulating and controlling such flows. While the Navier-Stokes equations are commonly accepted to precisely describe fl

  18. Measuring equilibrium models: a multivariate approach

    Directory of Open Access Journals (Sweden)

    Nadji RAHMANIA

    2011-04-01

    Full Text Available This paper presents a multivariate methodology for obtaining measures of unobserved macroeconomic variables. The procedure used is the multivariate Hodrick-Prescott filter, which depends on smoothing parameters. The choice of these parameters is crucial. Our approach is based on consistent estimators of these parameters, depending only on the observed data.
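
    The univariate Hodrick-Prescott filter underlying the multivariate procedure can be written as a penalized least-squares problem; this is the standard textbook form, not the authors' multivariate extension:

```python
import numpy as np

def hp_trend(y, lam=1600.0):
    """Hodrick-Prescott trend: minimise ||y - t||^2 + lam * ||D2 t||^2,
    where D2 is the second-difference operator. The first-order condition
    gives the linear system (I + lam * D2'D2) t = y."""
    n = len(y)
    # build the (n-2) x n second-difference operator
    D2 = np.zeros((n - 2, n))
    for i in range(n - 2):
        D2[i, i:i + 3] = [1.0, -2.0, 1.0]
    return np.linalg.solve(np.eye(n) + lam * D2.T @ D2, np.asarray(y, float))
```

    The smoothing parameter lam controls the trend's stiffness (1600 is the conventional quarterly-data value); the paper's contribution is estimating such parameters consistently from the data rather than fixing them by convention.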

  19. A graphical approach to analogue behavioural modelling

    OpenAIRE

    Moser, Vincent; Nussbaum, Pascal; Amann, Hans-Peter; Astier, Luc; Pellandini, Fausto

    2007-01-01

    In order to master the growing complexity of analogue electronic systems, modelling and simulation of analogue hardware at various levels is absolutely necessary. This paper presents an original modelling method based on the graphical description of analogue electronic functional blocks. This method is intended to be automated and integrated into a design framework: specialists create behavioural models of existing functional blocks, that can then be used through high-level selection and spec...

  20. A geometrical approach to structural change modeling

    OpenAIRE

    Stijepic, Denis

    2013-01-01

    We propose a model for studying the dynamics of economic structures. The model is based on qualitative information regarding structural dynamics, in particular, (a) the information on the geometrical properties of trajectories (and their domains) which are studied in structural change theory and (b) the empirical information from stylized facts of structural change. We show that structural change is path-dependent in this model and use this fact to restrict the number of future structural cha...

  1. A Sampling Based Approach to Spacecraft Autonomous Maneuvering with Safety Specifications

    Science.gov (United States)

    Starek, Joseph A.; Barbee, Brent W.; Pavone, Marco

    2015-01-01

    This paper presents a method for safe spacecraft autonomous maneuvering that applies robotic motion-planning techniques to spacecraft control. Specifically, the scenario we consider is an in-plane rendezvous of a chaser spacecraft in proximity to a target spacecraft at the origin of the Clohessy-Wiltshire-Hill frame. The trajectory for the chaser spacecraft is generated in a receding-horizon fashion by executing a sampling-based robotic motion-planning algorithm named Fast Marching Trees (FMT), which efficiently grows a tree of trajectories over a set of probabilistically drawn samples in the state space. To enforce safety, the tree is only grown over actively safe samples, for which there exists a one-burn collision-avoidance maneuver that circularizes the spacecraft orbit along a collision-free coasting arc and that can be executed under potential thruster failures. The overall approach establishes a provably correct framework for the systematic encoding of safety specifications into the spacecraft trajectory generation process and appears amenable to real-time implementation on orbit. Simulation results are presented for a two-fault-tolerant spacecraft during autonomous approach to a single client in Low Earth Orbit.
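
    The idea of growing a tree only over safe samples can be sketched with a deliberately simplified planner; this is a nearest-neighbour sweep over pre-drawn samples, not the FMT algorithm itself, and the `is_safe` predicate merely stands in for the one-burn escape-maneuver check:

```python
import math
import random

def plan(start, goal, is_safe, n_samples=300, step=0.35, seed=1):
    """Grow a tree over pre-drawn samples in the unit square: only samples
    passing is_safe() are kept, and a sample joins the tree when it lies
    within `step` of an existing node. Returns a start-to-goal path or None."""
    rng = random.Random(seed)
    samples = [(rng.uniform(0, 1), rng.uniform(0, 1)) for _ in range(n_samples)]
    samples = [s for s in samples if is_safe(s)] + [goal]
    tree = {start: None}                 # node -> parent
    changed = True
    while changed:                       # sweep until no sample can join
        changed = False
        for s in samples:
            if s in tree:
                continue
            near = min(tree, key=lambda v: math.dist(v, s))
            if math.dist(near, s) <= step:
                tree[s] = near
                changed = True
    if goal not in tree:
        return None
    path, node = [], goal                # backtrack parent pointers
    while node is not None:
        path.append(node)
        node = tree[node]
    return path[::-1]
```

    FMT additionally orders its expansion by cost-to-come and only connects locally optimal edges; the sketch keeps just the "safe samples in, tree out" structure.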

  2. Consumer preference models: fuzzy theory approach

    Science.gov (United States)

    Turksen, I. B.; Wilson, I. A.

    1993-12-01

    Consumer preference models are widely used in new product design, marketing management, pricing and market segmentation. The purpose of this article is to develop and test a fuzzy set preference model which can represent linguistic variables in individual-level models implemented in parallel with existing conjoint models. The potential improvements in market share prediction and predictive validity can substantially improve management decisions about what to make (product design), for whom to make it (market segmentation) and how much to make (market share prediction).

  3. Evaluation of Investment in Renovation to Increase the Quality of Buildings: A Specific Discounted Cash Flow (DCF) Approach of Appraisal

    Directory of Open Access Journals (Sweden)

    Giuseppe Bonazzi

    2016-03-01

    Full Text Available The objective of this article is to develop and apply a specific discounted cash flow (DCF) approach to evaluate investment in renovation to improve building quality, thus increasing energy efficiency. We develop and apply specific net present value (NPV) and internal rate of return (IRR) measures to quantify the value created for the owners of the building by energy-saving renovation investments that produce positive externalities. The model is of applied interest because, in recent years, many real estate investments were made by owners in order to increase the green quality of buildings, and several public aid funds were provided by the government to stimulate these energy-saving investments. The model proposed here is applied to a case study of a 16-apartment building located in northern Italy; it attempts to quantify the initial investment value, the energy savings, the tax deduction of the initial investment and the terminal value of the investment as the increase in building value. The analysis shows that the model is consistent in evaluating investments to improve building quality; the investments in the specific case study considered in the research have IRRs ranging from a minimum of 4.907% to a maximum of 12.980%. It could be useful to consider a sample of cases to verify whether these results are representative beyond this specific case study. The model could represent a useful tool for consumers in evaluating their own investments in building renovation, both from a stand-alone perspective and in comparison with other types of investment. The research could be developed in the future to quantify the social welfare generated by public spending via tax deductions to reduce the costs of investment in energy savings for buildings, and could even be applied to new real estate projects in comparing different construction technologies and even
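
    The NPV/IRR machinery used in this kind of appraisal can be sketched directly; the cash-flow figures below (outlay, annual energy savings plus tax deduction, terminal value as the increase in building value) are hypothetical, not the case-study data:

```python
def npv(rate, cashflows):
    """Net present value of cashflows[t] received at the end of year t."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=1.0, tol=1e-7):
    """Internal rate of return by bisection; assumes NPV is decreasing in
    the rate (one initial outlay followed by positive inflows)."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cashflows) > 0:
            lo = mid          # rate too low: NPV still positive
        else:
            hi = mid
    return (lo + hi) / 2

# hypothetical renovation: -100,000 outlay, nine years of 9,000 savings,
# and a final year combining savings with a 40,000 terminal value uplift
flows = [-100000] + [9000] * 9 + [9000 + 40000]
```

    Comparing the resulting IRR with the owner's opportunity cost of capital gives the stand-alone accept/reject signal the article describes.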

  4. Glass Transition Temperature- and Specific Volume- Composition Models for Tellurite Glasses

    Energy Technology Data Exchange (ETDEWEB)

    Riley, Brian J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Vienna, John D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2017-09-01

    This report provides models for predicting composition-properties for tellurite glasses, namely specific gravity and glass transition temperature. Included are the partial specific coefficients for each model, the component validity ranges, and model fit parameters.
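
Composition-property models of this kind are typically linear in the component fractions, with one partial specific coefficient per component. The sketch below shows the general form with made-up coefficients; they are not the values fitted in the report.

```python
# Illustrative linear composition-property model:
#   property = sum(partial_specific_coefficient_i * mass_fraction_i)
# Coefficient values below are hypothetical placeholders, not report data.
coeffs_tg = {"TeO2": 390.0, "ZnO": 310.0, "Na2O": 150.0}  # assumed, in deg C

def predict(coeffs, composition):
    """Predict a glass property from mass fractions (which must sum to 1)."""
    assert abs(sum(composition.values()) - 1.0) < 1e-9
    return sum(coeffs[c] * x for c, x in composition.items())

glass = {"TeO2": 0.7, "ZnO": 0.2, "Na2O": 0.1}
print(predict(coeffs_tg, glass))  # hypothetical Tg estimate for this batch
```

Component validity ranges, as mentioned in the report, would be enforced by checking each mass fraction against its fitted interval before calling the predictor.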

  5. Anisotropic Finite Element Modeling Based on a Harmonic Field for Patient-Specific Sclera

    Directory of Open Access Journals (Sweden)

    Xu Jia

    2017-01-01

    Full Text Available Purpose. This study examined the influence of anisotropic material properties on the human sclera. Method. First, the individual geometry of a patient-specific sclera was reproduced from a laser scan. Then, high quality finite element modeling of the individual sclera was performed using a convenient automatic hexahedral mesh generator based on a harmonic field and integrated with an anisotropic material assignment function. Finally, comparison experiments were designed to investigate the effects of anisotropy on finite element modeling of sclera biomechanics. Results. The experimental results show that the presented approach can generate high quality anisotropic hexahedral meshes for patient-specific sclera. Conclusion. The anisotropy shows significant differences in stress and strain distributions, and careful consideration should be given to its use in biomechanical FE studies.

  6. Anisotropic Finite Element Modeling Based on a Harmonic Field for Patient-Specific Sclera.

    Science.gov (United States)

    Jia, Xu; Liao, Shenghui; Duan, Xuanchu; Zheng, Wanqiu; Zou, Beiji

    2017-01-01

    Purpose. This study examined the influence of anisotropic material properties on the human sclera. Method. First, the individual geometry of a patient-specific sclera was reproduced from a laser scan. Then, high quality finite element modeling of the individual sclera was performed using a convenient automatic hexahedral mesh generator based on a harmonic field and integrated with an anisotropic material assignment function. Finally, comparison experiments were designed to investigate the effects of anisotropy on finite element modeling of sclera biomechanics. Results. The experimental results show that the presented approach can generate high quality anisotropic hexahedral meshes for patient-specific sclera. Conclusion. The anisotropy shows significant differences in stress and strain distributions, and careful consideration should be given to its use in biomechanical FE studies.

  7. A New Approach for Magneto-Static Hysteresis Behavioral Modeling

    DEFF Research Database (Denmark)

    Astorino, Antonio; Swaminathan, Madhavan; Antonini, Giulio

    2016-01-01

    In this paper, a new behavioral modeling approach for magneto-static hysteresis is presented. Many accurate models are currently available, but none of them seems to be able to correctly reproduce all the possible B-H paths with low computational cost. By contrast, the approach proposed...... achieved when comparing the measured and simulated results....

  8. Optogenetics in the cerebellum: Purkinje cell-specific approaches for understanding local cerebellar functions.

    Science.gov (United States)

    Tsubota, Tadashi; Ohashi, Yohei; Tamura, Keita

    2013-10-15

    The cerebellum consists of the cerebellar cortex and the cerebellar nuclei. Although the basic neuronal circuitry of the cerebellar cortex is uniform everywhere, anatomical data demonstrate that the input and output relationships of the cortex are spatially segregated between different cortical areas, which suggests that there are functional distinctions between these different areas. Perturbation of cerebellar cortical functions in a spatially restricted fashion is thus essential for investigating the distinctions among different cortical areas. In the cerebellar cortex, Purkinje cells are the sole output neurons that send information to downstream cerebellar and vestibular nuclei. Therefore, selective manipulation of Purkinje cell activities, without disturbing other neuronal types and passing fibers within the cortex, is a direct approach to spatially restrict the effects of perturbations. Although this type of approach has for many years been technically difficult, recent advances in optogenetics now enable selective activation or inhibition of Purkinje cell activities, with high temporal resolution. Here we discuss the effectiveness of using Purkinje cell-specific optogenetic approaches to elucidate the functions of local cerebellar cortex regions. We also discuss what improvements to current methods are necessary for future investigations of cerebellar functions to provide further advances.

  9. Vaccination with lipid core peptides fails to induce epitope-specific T cell responses but confers non-specific protective immunity in a malaria model.

    Directory of Open Access Journals (Sweden)

    Simon H Apte

    Full Text Available Vaccines against many pathogens for which conventional approaches have failed remain an unmet public health priority. Synthetic peptide-based vaccines offer an attractive alternative to whole protein and whole organism vaccines, particularly for complex pathogens that cause chronic infection. Previously, we have reported a promising lipid core peptide (LCP) vaccine delivery system that incorporates the antigen, carrier, and adjuvant in a single molecular entity. LCP vaccines have been used to deliver several peptide subunit-based vaccine candidates and induced high titre functional antibodies and protected against Group A streptococcus in mice. Herein, we have evaluated whether LCP constructs incorporating defined CD4(+) and/or CD8(+) T cell epitopes could induce epitope-specific T cell responses and protect against pathogen challenge in a rodent malaria model. We show that LCP vaccines failed to induce an expansion of antigen-specific CD8(+) T cells following primary immunization or by boosting. We further demonstrated that the LCP vaccines induced a non-specific type 2 polarized cytokine response, rather than an epitope-specific canonical CD8(+) T cell type 1 response. Cytotoxic responses of unknown specificity were also induced. These non-specific responses were able to protect against parasite challenge. These data demonstrate that vaccination with lipid core peptides fails to induce canonical epitope-specific T cell responses, at least in our rodent model, but can nonetheless confer non-specific protective immunity against Plasmodium parasite challenge.

  10. A Contingency Model Approach to Leadership Training.

    Science.gov (United States)

    Csoka, Louis S.; Fiedler, Fred E.

    Two studies were specifically designed to test the effect which leadership training and experience would have on the performance of relationship-motivated and task-motivated leaders. In the first study it was predicted that task-relevant training and experience would make the situation more favorable in the task-structure dimension. Subjects were…

  11. Post-closure biosphere assessment modelling: comparison of complex and more stylised approaches.

    Science.gov (United States)

    Walke, Russell C; Kirchner, Gerald; Xu, Shulan; Dverstorp, Björn

    2015-10-01

    Geological disposal facilities are the preferred option for high-level radioactive waste, due to their potential to provide isolation from the surface environment (biosphere) on very long timescales. Assessments need to strike a balance between stylised models and more complex approaches that draw more extensively on site-specific information. This paper explores the relative merits of complex versus more stylised biosphere models in the context of a site-specific assessment. The more complex biosphere modelling approach was developed by the Swedish Nuclear Fuel and Waste Management Co (SKB) for the Forsmark candidate site for a spent nuclear fuel repository in Sweden. SKB's approach is built on a landscape development model, whereby radionuclide releases to distinct hydrological basins/sub-catchments (termed 'objects') are represented as they evolve through land rise and climate change. Each of the seventeen objects is represented with more than 80 site-specific parameters, about 22 of which are time-dependent, resulting in over 5000 input values per object. The more stylised biosphere models developed for this study represent releases to individual ecosystems without environmental change and include the most plausible transport processes. In the context of regulatory review of the landscape modelling approach adopted in the SR-Site assessment in Sweden, the more stylised representation has helped to build understanding of the more complex modelling approaches by providing bounding results, checking the reasonableness of the more complex modelling, highlighting uncertainties introduced through conceptual assumptions and helping to quantify the conservatisms involved. The more stylised biosphere models are also shown to be capable of reproducing the results of more complex approaches. A major recommendation is that biosphere assessments need to justify the degree of complexity in modelling approaches as well as simplifying and conservative assumptions.
In light of

  12. Nucleon Spin Content in a Relativistic Quark Potential Model Approach

    Institute of Scientific and Technical Information of China (English)

    DONG YuBing; FENG QingGuo

    2002-01-01

    Based on a relativistic quark model approach with an effective potential U(r) = (a_c/2)(1 + γ^0)r^2, the spin content of the nucleon is investigated. Pseudo-scalar interaction between quarks and Goldstone bosons is employed to calculate the couplings between the Goldstone bosons and the nucleon. Different approaches to deal with the center of mass correction in the relativistic quark potential model approach are discussed.

  13. EGFR-specific nanoprobe biodistribution in mouse models

    Science.gov (United States)

    Fashir, Samia A.; Castilho, Maiara L.; Hupman, Michael A.; Lee, Christopher L. D.; Raniero, Leandro J.; Alwayn, Ian; Hewitt, Kevin C.

    2015-06-01

    Nanotechnology offers a targeted approach to both imaging and treatment of cancer, the leading cause of death worldwide. Previous studies have found that nanoparticles with a wide variety of coatings initiate an immune response leading to sequestration in the liver and spleen. In an effort to find a nanoparticle platform which does not elicit an immune response, we created 43/44 nm gold or silver nanoparticles coated with biomolecules normally produced by the body, α-lipoic acid and the Epidermal Growth Factor (EGF), and have used mass spectrometry to determine their biodistribution in mouse models, 24 hours following tail vein injection. Relative to controls, mouse EGF (mEGF) coated silver and gold nanoprobes are found at reduced levels in the liver and spleen. mEGF coated gold nanoprobes, moreover, do not appear to elicit any immune response, as they are found at background levels in these organs. As a result, they should remain in circulation for longer and accumulate at high levels in tumors via the enhanced permeability and retention (EPR) effect.

  14. Scaled, patient-specific 3D vertebral model reconstruction based on 2D lateral fluoroscopy.

    Science.gov (United States)

    Zheng, Guoyan; Nolte, Lutz-P; Ferguson, Stephen J

    2011-05-01

    Accurate three-dimensional (3D) models of lumbar vertebrae are required for image-based 3D kinematics analysis. MRI or CT datasets are frequently used to derive 3D models but have the disadvantages that they are expensive, time-consuming or involve ionizing radiation (e.g., CT acquisition). An alternative method using 2D lateral fluoroscopy was developed. A technique was developed to reconstruct a scaled 3D lumbar vertebral model from a single two-dimensional (2D) lateral fluoroscopic image and a statistical shape model of the lumbar vertebrae. Four cadaveric lumbar spine segments and two statistical shape models were used for testing. Reconstruction accuracy was determined by comparing the surface models reconstructed from the single lateral fluoroscopic images to the ground truth data from 3D CT segmentation. For each case, two different surface-based registration techniques were used to recover the unknown scale factor and the rigid transformation between the reconstructed surface model and the ground truth model before the differences between the two discrete surface models were computed. Successful reconstruction of scaled surface models was achieved for all test lumbar vertebrae based on single lateral fluoroscopic images. The mean reconstruction error was between 0.7 and 1.6 mm. A scaled, patient-specific surface model of the lumbar vertebra can be synthesized from a single lateral fluoroscopic image using the present approach. This new method for patient-specific 3D modeling has potential applications in spine kinematics analysis, surgical planning, and navigation.
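
Recovering an unknown scale factor plus a rigid transformation, as in the registration step above, has a closed-form solution (an Umeyama-style similarity fit). The sketch below is a generic stand-in for such a surface-based registration, not the authors' exact technique.

```python
import numpy as np

def similarity_registration(src, dst):
    """Estimate scale s, rotation R, translation t minimizing
    ||s * R @ x + t - y|| over corresponding points (Umeyama-style)."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    sc, dc = src - mu_s, dst - mu_d
    cov = dc.T @ sc / len(src)              # cross-covariance of dst vs src
    U, S, Vt = np.linalg.svd(cov)
    d = np.sign(np.linalg.det(U @ Vt))      # guard against reflections
    D = np.diag([1.0] * (src.shape[1] - 1) + [d])
    R = U @ D @ Vt
    var_src = (sc ** 2).sum() / len(src)    # mean squared deviation of src
    s = np.trace(np.diag(S) @ D) / var_src
    t = mu_d - s * R @ mu_s
    return s, R, t

# Synthetic check: recover a known scale/rotation/translation.
rng = np.random.default_rng(0)
pts = rng.normal(size=(100, 3))
theta = 0.3
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
moved = 1.7 * pts @ Rz.T + np.array([5.0, -2.0, 1.0])
s, R, t = similarity_registration(pts, moved)
print(round(s, 3))  # recovered scale, 1.7
```

With noise-free correspondences the transform is recovered exactly; in practice the statistical-shape-model surface and CT surface would first be put into correspondence (e.g. by closest-point matching) before each fit.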

  15. How specific Raman spectroscopic models are: a comparative study between different cancers

    Science.gov (United States)

    Singh, S. P.; Kumar, K. Kalyan; Chowdary, M. V. P.; Maheedhar, K.; Krishna, C. Murali

    2010-02-01

    Optical spectroscopic methods are being contemplated as adjuncts/alternatives to the existing 'gold standard' of cancer diagnosis, histopathological examination. Several groups are actively pursuing diagnostic applications of Raman spectroscopy in cancers. We have developed Raman spectroscopic models for diagnosis of breast, oral, stomach, colon and larynx cancers. So far, the specificity and applicability of spectral models has been limited to a particular tissue of origin. In this study we have explicitly evaluated the specificity of spectroscopic models by analyzing spectra from already developed spectral models representing normal and malignant tissues of breast (46), cervix (52), colon (25), larynx (53), and oral (47) origin. Spectral data were analyzed by Principal Component Analysis (PCA) using factor scores, Mahalanobis distance and spectral residuals as discriminating parameters. A multiparametric limit test approach was also explored. Preliminary unsupervised PCA of the pooled data indicates that normal tissue types were always exclusive of their malignant counterparts, but when tissues of different origin were considered, large overlap among clusters was found. Supervised analysis by Mahalanobis distance and spectral residuals gave similar results. The 'limit test' approach, where classification is based on match/mismatch of a given spectrum against all available spectra, revealed that the spectral models are very exclusive and specific. For example, the breast normal spectral model shows matches only with breast normal spectra and mismatches with all other spectra; the same pattern was seen for most spectral models. The results of the study therefore indicate the exclusiveness and efficacy of Raman spectroscopic models. Prospectively, these findings might open new applications of Raman spectroscopic models in identifying a tumor as primary or metastatic.
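
The PCA-plus-Mahalanobis discrimination scheme described above can be sketched on synthetic data; the arrays below are random stand-ins, not real Raman spectra or the paper's models.

```python
import numpy as np

# Project "spectra" onto principal components, then score a probe spectrum
# by Mahalanobis distance to each class model (synthetic data throughout).
def pca_fit(X, k):
    mu = X.mean(axis=0)
    U, S, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:k]                      # mean spectrum, k loading vectors

def mahalanobis(scores, model_scores):
    m = model_scores.mean(axis=0)
    cov = np.cov(model_scores, rowvar=False)
    d = scores - m
    return float(np.sqrt(d @ np.linalg.inv(cov) @ d))

rng = np.random.default_rng(1)
normal = rng.normal(0.0, 1.0, size=(40, 50))     # "normal tissue" spectra
malignant = rng.normal(3.0, 1.0, size=(40, 50))  # "malignant" spectra
mu, W = pca_fit(np.vstack([normal, malignant]), k=3)
sn = (normal - mu) @ W.T                         # factor scores per class
sm = (malignant - mu) @ W.T
probe = (malignant[0] - mu) @ W.T
print(mahalanobis(probe, sn) > mahalanobis(probe, sm))  # True: matches malignant model
```

A limit-test variant would declare a match only when the distance (and spectral residual) falls below a preset threshold for exactly one model, mirroring the match/mismatch logic in the abstract.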

  16. A simple approach to modeling ductile failure.

    Energy Technology Data Exchange (ETDEWEB)

    Wellman, Gerald William

    2012-06-01

    Sandia National Laboratories has the need to predict the behavior of structures after the occurrence of an initial failure. In some cases determining the extent of failure, beyond initiation, is required, while in a few cases the initial failure is a design feature used to tailor the subsequent load paths. In either case, the ability to numerically simulate the initiation and propagation of failures is a highly desired capability. This document describes one approach to the simulation of failure initiation and propagation.

  17. The female gametophyte: an emerging model for cell type-specific systems biology in plant development

    Directory of Open Access Journals (Sweden)

    Marc William Schmid

    2015-11-01

    Full Text Available Systems biology, a holistic approach describing a system emerging from the interactions of its molecular components, critically depends on accurate qualitative determination and quantitative measurements of these components. Development and improvement of large-scale profiling methods (omics now facilitates comprehensive measurements of many relevant molecules. For multicellular organisms, such as animals, fungi, algae, and plants, the complexity of the system is augmented by the presence of specialized cell types and organs, and a complex interplay within and between them. Cell type-specific analyses are therefore crucial for the understanding of developmental processes and environmental responses. This review first gives an overview of current methods used for large-scale profiling of specific cell types exemplified by recent advances in plant biology. The focus then lies on suitable model systems to study plant development and cell type specification. We introduce the female gametophyte of flowering plants as an ideal model to study fundamental developmental processes. Moreover, the female reproductive lineage is of importance for the emergence of evolutionary novelties such as an unequal parental contribution to the tissue nurturing the embryo or the clonal production of seeds by asexual reproduction (apomixis. Understanding these processes is not only interesting from a developmental or evolutionary perspective, but bears great potential for further crop improvement and the simplification of breeding efforts. We finally highlight novel methods, which are already available or which will likely soon facilitate large-scale profiling of the specific cell types of the female gametophyte in both model and non-model species. We conclude that it may take only few years until an evolutionary systems biology approach toward female gametogenesis may decipher some of its biologically most interesting and economically most valuable processes.

  18. Advanced language modeling approaches, case study: Expert search

    NARCIS (Netherlands)

    Hiemstra, Djoerd

    2008-01-01

    This tutorial gives a clear and detailed overview of advanced language modeling approaches and tools, including the use of document priors, translation models, relevance models, parsimonious models and expectation maximization training. Expert search will be used as a case study to explain the
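
One of the basic building blocks such a tutorial covers, query-likelihood scoring with a smoothed document language model, can be sketched as follows; the documents are toy examples and Jelinek-Mercer (linear interpolation) smoothing is assumed.

```python
import math
from collections import Counter

# Query-likelihood retrieval: rank documents by log P(query | doc), with the
# document model linearly interpolated against the collection model.
def score(query, doc, collection, lam=0.5):
    """log P(query | doc) with Jelinek-Mercer smoothing weight lam."""
    d, c = Counter(doc), Counter(collection)
    dn, cn = len(doc), len(collection)
    return sum(math.log(lam * d[w] / dn + (1 - lam) * c[w] / cn) for w in query)

docs = [["expert", "search", "retrieval"],
        ["language", "model", "retrieval"]]
collection = [w for doc in docs for w in doc]
query = ["expert", "search"]
ranked = sorted(range(len(docs)),
                key=lambda i: score(query, docs[i], collection), reverse=True)
print(ranked[0])  # document 0 ranks first for this query
```

Document priors, as mentioned in the tutorial, would enter as an additive log P(doc) term in the same scoring function.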

  19. Central neuronal motor behaviour in skilled and less skilled novices - Approaching sports-specific movement techniques.

    Science.gov (United States)

    Vogt, Tobias; Kato, Kouki; Schneider, Stefan; Türk, Stefan; Kanosue, Kazuyuki

    2017-02-14

    Research on motor behavioural processes preceding voluntary movements often involves analysing the readiness potential (RP). For this, decades of studies have used laboratory setups with controlled sports-related actions. Further, recent applied approaches focus on athlete-non-athlete comparisons, omitting possible effects of training history on RP. However, the RP preceding real sport-specific movements in relation to skill acquisition remains to be elucidated. Therefore, after familiarization, 16 right-handed males with no experience in archery volunteered to perform repeated sports-specific movements, i.e. 40 arrow-releasing shots with 60 s rest between shots at a standard target 15 m away. Continuous, synchronised EEG and right-limb EMG recordings during arrow-releasing served to detect movement onsets for RP analyses over distinct cortical motor areas. Based on attained scores on target, the archery novices were, a posteriori, subdivided into a skilled and a less skilled group. EMG results for mean values revealed no significant changes (all p>0.05), whereas RP amplitudes and onsets differed between groups but not between motor areas. Arrow-releasing was preceded by larger RP amplitudes (p<0.05) and later RP onsets (p<0.05) in skilled compared with less skilled novices. We suggest this reflects attentional orienting and the greater effort that accompanies central neuronal preparatory states of a sports-specific movement.

  20. Position-specific isotope modeling of organic micropollutants transformations through different reaction pathways

    Science.gov (United States)

    Jin, Biao; Rolle, Massimo

    2016-04-01

    Organic compounds are produced in vast quantities for industrial and agricultural use, as well as for human and animal healthcare [1]. These chemicals and their metabolites are frequently detected at trace levels in fresh water environments, where they undergo degradation via different reaction pathways. Compound specific stable isotope analysis (CSIA) is a valuable tool to identify such degradation pathways in different environmental systems. Recent advances in analytical techniques have promoted the fast development and implementation of multi-element CSIA. However, quantitative frameworks that evaluate multi-element stable isotope data and incorporate mechanistic information on the degradation processes [2,3] are still lacking. In this study we propose a mechanism-based modeling approach to simultaneously evaluate concentration as well as bulk and position-specific multi-element isotope evolution during the transformation of organic micropollutants. The model explicitly simulates position-specific isotopologues for those atoms that experience isotope effects and thereby provides a mechanistic description of isotope fractionation occurring at different molecular positions. We validate the proposed approach with the concentration and multi-element isotope data of three selected organic micropollutants: dichlorobenzamide (BAM), isoproturon (IPU) and diclofenac (DCF). The model precisely captures the dual element isotope trends characteristic of different reaction pathways and their range of variation, consistent with observed multi-element (C, N) bulk isotope fractionation. The proposed approach can also be used as a tool to explore transformation pathways in scenarios for which position-specific isotope data are not yet available. [1] Schwarzenbach, R.P., Egli, T., Hofstetter, T.B., von Gunten, U., Wehrli, B., 2010. Global Water Pollution and Human Health. Annu. Rev. Environ. Resour. doi:10.1146/annurev-environ-100809-125342.
[2] Jin, B., Haderlein, S.B., Rolle, M
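
While the paper's model tracks position-specific isotopologues, the underlying bulk-isotope bookkeeping can be illustrated with a simple Rayleigh fractionation sketch; the δ13C value and enrichment factor below are assumed for illustration only, and this stand-alone bulk version is not the authors' model.

```python
# Rayleigh fractionation of the remaining substrate: as the reacted fraction
# grows (f = remaining fraction shrinks), the substrate becomes isotopically
# heavier. epsilon is the enrichment factor in per mil (negative for a
# normal kinetic isotope effect). Values below are assumed, not from the paper.
def delta_rayleigh(delta0, f, epsilon):
    """delta (per mil) of the remaining substrate when a fraction f remains."""
    alpha = 1.0 + epsilon / 1000.0
    return (delta0 + 1000.0) * f ** (alpha - 1.0) - 1000.0

# Assumed starting signature delta13C = -27 per mil, epsilon = -5 per mil.
for f in (1.0, 0.5, 0.1):
    print(round(delta_rayleigh(-27.0, f, -5.0), 2))
```

Dual-element plots of the kind used to diagnose reaction pathways are then slopes of one element's Rayleigh trend against another's.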

  1. A New Approach to Model Verification, Falsification and Selection

    Directory of Open Access Journals (Sweden)

    Andrew J. Buck

    2015-06-01

    Full Text Available This paper shows that a qualitative analysis, i.e., an assessment of the consistency of a hypothesized sign pattern for structural arrays with the sign pattern of the estimated reduced form, can always provide decisive insight into a model’s validity both in general and compared to other models. Qualitative analysis can show that it is impossible for some models to have generated the data used to estimate the reduced form, even though standard specification tests might show the model to be adequate. A partially specified structural hypothesis can be falsified by estimating as few as one reduced form equation. Zero restrictions in the structure can themselves be falsified. It is further shown how the information content of the hypothesized structural sign patterns can be measured using a commonly applied concept of statistical entropy. The lower the hypothesized structural sign pattern’s entropy, the more a priori information it proposes about the sign pattern of the estimated reduced form. As an hypothesized structural sign pattern has a lower entropy, it is more subject to type 1 error and less subject to type 2 error. Three cases illustrate the approach taken here.

  2. Random matrix model approach to chiral symmetry

    CERN Document Server

    Verbaarschot, J J M

    1996-01-01

    We review the application of random matrix theory (RMT) to chiral symmetry in QCD. Starting from the general philosophy of RMT we introduce a chiral random matrix model with the global symmetries of QCD. Exact results are obtained for universal properties of the Dirac spectrum: i) finite volume corrections to the valence quark mass dependence of the chiral condensate, and ii) microscopic fluctuations of Dirac spectra. Comparisons with lattice QCD simulations are made. Most notably, the variance of the number of levels in an interval containing $n$ levels on average is suppressed by a factor $(\log n)/\pi^2 n$. An extension of the random matrix model to nonzero temperatures and chemical potential provides us with a schematic model of the chiral phase transition. In particular, this elucidates the nature of the quenched approximation at nonzero chemical potential.

  3. Machine Learning Approaches for Modeling Spammer Behavior

    CERN Document Server

    Islam, Md Saiful; Islam, Md Rafiqul

    2010-01-01

    Spam is commonly known as unsolicited or unwanted email messages on the Internet, posing a potential threat to Internet security. Users spend a valuable amount of time deleting spam emails. More importantly, ever-increasing spam emails occupy server storage space and consume network bandwidth. Keyword-based spam email filtering strategies will eventually become less successful at modeling spammer behavior, as spammers constantly change their tricks to circumvent these filters. The evasive tactics that spammers use are patterns, and these patterns can be modeled to combat spam. This paper investigates the possibilities of modeling spammer behavioral patterns with well-known classification algorithms such as the Naïve Bayesian classifier (Naïve Bayes), Decision Tree Induction (DTI) and Support Vector Machines (SVMs). Preliminary experimental results demonstrate a promising detection rate of around 92%, a considerable enhancement in performance compared to similar spammer behavior modeling research.
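
A minimal from-scratch version of one of the compared classifiers, multinomial Naïve Bayes with Laplace smoothing, might look like this; the token data are toy examples, not the paper's corpus.

```python
import math
from collections import Counter, defaultdict

# Tiny multinomial Naive Bayes spam filter (toy illustration).
def train(docs):
    """docs: list of (tokens, label). Returns log priors, log likelihoods, vocab."""
    prior, counts = Counter(), defaultdict(Counter)
    for tokens, label in docs:
        prior[label] += 1
        counts[label].update(tokens)
    vocab = {w for c in counts.values() for w in c}
    n = len(docs)
    logprior = {y: math.log(prior[y] / n) for y in prior}
    loglik = {y: {w: math.log((counts[y][w] + 1) /
                              (sum(counts[y].values()) + len(vocab)))
                  for w in vocab} for y in prior}  # add-one (Laplace) smoothing
    return logprior, loglik, vocab

def classify(tokens, logprior, loglik, vocab):
    scores = {y: logprior[y] + sum(loglik[y][w] for w in tokens if w in vocab)
              for y in logprior}
    return max(scores, key=scores.get)

train_docs = [("win money now".split(), "spam"),
              ("cheap money offer".split(), "spam"),
              ("meeting agenda attached".split(), "ham"),
              ("project status meeting".split(), "ham")]
model = train(train_docs)
print(classify("money offer".split(), *model))    # spam
print(classify("status meeting".split(), *model))  # ham
```

The DTI and SVM baselines in the paper would consume the same bag-of-words features; only the decision rule differs.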

  4. Infectious disease modeling a hybrid system approach

    CERN Document Server

    Liu, Xinzhi

    2017-01-01

    This volume presents infectious diseases modeled mathematically, taking seasonality and changes in population behavior into account, using a switched and hybrid systems framework. The scope of coverage includes background on mathematical epidemiology, including classical formulations and results; a motivation for seasonal effects and changes in population behavior, an investigation into term-time forced epidemic models with switching parameters, and a detailed account of several different control strategies. The main goal is to study these models theoretically and to establish conditions under which eradication or persistence of the disease is guaranteed. In doing so, the long-term behavior of the models is determined through mathematical techniques from switched systems theory. Numerical simulations are also given to augment and illustrate the theoretical results and to help study the efficacy of the control schemes.
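
A term-time forced epidemic model of the kind the book investigates can be sketched as an SIR system whose contact rate switches between two seasonal values; all parameter values below are illustrative, not taken from the book.

```python
# Switched SIR model: the contact rate beta takes a "school term" value for
# the first 270 days of each year and a lower "vacation" value otherwise.
# Forward-Euler integration; all parameters are illustrative.
def simulate(beta_term=0.5, beta_vacation=0.2, gamma=0.1, days=365, dt=0.1):
    s, i, r = 0.99, 0.01, 0.0
    t = 0.0
    while t < days:
        beta = beta_term if (t % 365) < 270 else beta_vacation  # switching rule
        new_inf = beta * s * i
        s, i, r = s - new_inf * dt, i + (new_inf - gamma * i) * dt, r + gamma * i * dt
        t += dt
    return s, i, r

s, i, r = simulate()
print(round(s + i + r, 6))  # population fraction is conserved, 1.0
```

Eradication-versus-persistence conditions of the type proved in the book compare a time-averaged reproduction number across the switching schedule against 1.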

  5. Rotary ATPases: models, machine elements and technical specifications.

    Science.gov (United States)

    Stewart, Alastair G; Sobti, Meghna; Harvey, Richard P; Stock, Daniela

    2013-01-01

    Rotary ATPases are molecular rotary motors involved in biological energy conversion. They either synthesize or hydrolyze the universal biological energy carrier adenosine triphosphate. Recent work has elucidated the general architecture and subunit compositions of all three sub-types of rotary ATPases. Composite models of the intact F-, V- and A-type ATPases have been constructed by fitting high-resolution X-ray structures of individual subunits or sub-complexes into low-resolution electron densities of the intact enzymes derived from electron cryo-microscopy. Electron cryo-tomography has provided new insights into the supra-molecular arrangement of eukaryotic ATP synthases within mitochondria and mass-spectrometry has started to identify specifically bound lipids presumed to be essential for function. Taken together these molecular snapshots show that nano-scale rotary engines have much in common with basic design principles of man made machines from the function of individual "machine elements" to the requirement of the right "fuel" and "oil" for different types of motors.

  6. Specificity in transition state binding: the Pauling model revisited.

    Science.gov (United States)

    Amyes, Tina L; Richard, John P

    2013-03-26

    Linus Pauling proposed that the large rate accelerations for enzymes are caused by the high specificity of the protein catalyst for binding the reaction transition state. The observation that stable analogues of the transition states for enzymatic reactions often act as tight-binding inhibitors provided early support for this simple and elegant proposal. We review experimental results supporting the proposal that Pauling's model provides a satisfactory explanation for the rate accelerations of many heterolytic enzymatic reactions proceeding through high-energy reaction intermediates, such as proton transfer and decarboxylation. Specificity in transition state binding is obtained when the total intrinsic binding energy of the substrate is significantly larger than the binding energy observed at the Michaelis complex. The results of recent studies that aimed to characterize the specificity in binding of the enolate oxygen at the transition state for the 1,3-isomerization reaction catalyzed by ketosteroid isomerase are reviewed. Interactions between pig heart succinyl-coenzyme A:3-oxoacid coenzyme A transferase (SCOT) and the nonreacting portions of coenzyme A (CoA) are responsible for a rate increase of 3 × 10^12-fold, which is close to the estimated total 5 × 10^13-fold enzymatic rate acceleration. Studies that partition the interactions between SCOT and CoA into their contributing parts are reviewed. Interactions of the protein with the substrate phosphodianion group provide an ~12 kcal/mol stabilization of the transition state for the reactions catalyzed by triosephosphate isomerase, orotidine 5'-monophosphate decarboxylase, and α-glycerol phosphate dehydrogenase. The interactions of these enzymes with the substrate piece phosphite dianion provide a 6-8 kcal/mol stabilization of the transition state for reaction of the appropriate truncated substrate. Enzyme activation by phosphite dianion reflects the higher dianion affinity for binding to the enzyme

  7. Second Quantization Approach to Stochastic Epidemic Models

    CERN Document Server

    Mondaini, Leonardo

    2015-01-01

    We show how the standard field theoretical language based on creation and annihilation operators may be used for a straightforward derivation of closed master equations describing the population dynamics of multivariate stochastic epidemic models. In order to do that, we introduce an SIR-inspired stochastic model for hepatitis C virus epidemic, from which we obtain the time evolution of the mean number of susceptible, infected, recovered and chronically infected individuals in a population whose total size is allowed to change.
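
A sampling counterpart to such master equations is the Gillespie algorithm, which draws exact trajectories of the same jump process. The sketch below simulates a plain stochastic SIR model with illustrative rates; the paper's hepatitis C model adds a chronically infected compartment not shown here.

```python
import random

# Gillespie simulation of a stochastic SIR process: at each step, draw the
# waiting time to the next event, then choose infection vs recovery in
# proportion to their rates. Rates and population size are illustrative.
def gillespie_sir(s, i, r, beta, gamma, t_max, rng):
    n = s + i + r
    t = 0.0
    while i > 0 and t < t_max:
        rate_inf = beta * s * i / n      # infection rate S + I -> 2I
        rate_rec = gamma * i             # recovery rate I -> R
        total = rate_inf + rate_rec
        t += rng.expovariate(total)      # exponential waiting time
        if rng.random() < rate_inf / total:
            s, i = s - 1, i + 1          # infection event
        else:
            i, r = i - 1, r + 1          # recovery event
    return s, i, r

rng = random.Random(42)
s, i, r = gillespie_sir(990, 10, 0, beta=0.3, gamma=0.1, t_max=1000.0, rng=rng)
print(s + i + r)  # 1000: every event conserves the population
```

Averaging many such trajectories reproduces the mean-value equations obtained from the master equation in the paper.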

  8. Methodologies for Development of Patient Specific Bone Models from Human Body CT Scans

    Science.gov (United States)

    Chougule, Vikas Narayan; Mulay, Arati Vinayak; Ahuja, Bharatkumar Bhagatraj

    2016-06-01

    This work deals with the development of algorithms for the physical replication of patient-specific human bone and the construction of corresponding implant/insert RP models, using a Reverse Engineering approach on non-invasive medical images for surgical purposes. In the medical field, volumetric data, i.e. voxel and triangular facet based models, are primarily used for bio-modelling and visualization, which requires huge memory space. On the other hand, recent advances in Computer Aided Design (CAD) technology provide additional facilities/functions for the design, prototyping and manufacturing of any object having freeform surfaces, based on boundary representation techniques. This work presents a process for the physical replication of 3D rapid prototyping (RP) models of human bone using various CAD modeling techniques, developed from 3D point cloud data obtained from non-invasive CT/MRI scans in DICOM 3.0 format. This point cloud data is used for construction of a 3D CAD model by fitting B-spline curves through the points and then fitting surfaces between these curve networks using swept blend techniques. The same result can also be achieved by generating a triangular mesh directly from the 3D point cloud data, without developing any surface model in commercial CAD software. The STL file generated from the 3D point cloud data is used as the basic input for the RP process. The Delaunay tetrahedralization approach is used to process the 3D point cloud data to obtain the STL file. A CT scan of a metacarpus (human bone) is used as the case study for the generation of the 3D RP model. A 3D physical model of the human bone is generated on a rapid prototyping machine and its virtual reality model is presented for visualization. The CAD models generated by the different techniques are compared for accuracy and reliability. The results of this research work are assessed for clinical reliability in the replication of human bone in the medical field.
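
The final step of such a pipeline, emitting triangular facets as an STL file for the RP machine, can be sketched in a few lines. This is a generic ASCII STL writer with a single test facet; a real mesh would come from the tetrahedralized point cloud described above.

```python
# Minimal ASCII STL writer: each triangle becomes a facet with its unit normal.
def facet_normal(a, b, c):
    u = [b[k] - a[k] for k in range(3)]
    v = [c[k] - a[k] for k in range(3)]
    n = [u[1] * v[2] - u[2] * v[1],          # cross product u x v
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0]]
    length = sum(x * x for x in n) ** 0.5 or 1.0
    return [x / length for x in n]

def write_stl(triangles, name="bone"):
    lines = [f"solid {name}"]
    for a, b, c in triangles:
        n = facet_normal(a, b, c)
        lines.append(f"  facet normal {n[0]:.6e} {n[1]:.6e} {n[2]:.6e}")
        lines.append("    outer loop")
        for p in (a, b, c):
            lines.append(f"      vertex {p[0]:.6e} {p[1]:.6e} {p[2]:.6e}")
        lines.append("    endloop")
        lines.append("  endfacet")
    lines.append(f"endsolid {name}")
    return "\n".join(lines)

# One triangle as a smoke test.
tri = [((0, 0, 0), (1, 0, 0), (0, 1, 0))]
stl = write_stl(tri)
print(stl.splitlines()[0])  # solid bone
```

Writing the result to a `.stl` file makes it directly consumable by common RP/slicing software.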

  9. Dispersion modeling approaches for near road

    Science.gov (United States)

    Roadway design and roadside barriers can have significant effects on the dispersion of traffic-generated pollutants, especially in the near-road environment. Dispersion models that can accurately simulate these effects are needed to fully assess these impacts for a variety of applications. For example, such models can be useful for evaluating the mitigation potential of roadside barriers in reducing near-road exposures and their associated adverse health effects. Two databases, a tracer field study and a wind tunnel study, provide measurements used in the development and/or validation of algorithms to simulate dispersion in the presence of noise barriers. The tracer field study was performed in Idaho Falls, ID, USA with a 6-m noise barrier and a finite line source in a variety of atmospheric conditions. The second study was performed in the meteorological wind tunnel at the US EPA and simulated line sources at different distances from a model noise barrier to capture the effect on emissions from individual lanes of traffic. In both cases, velocity and concentration measurements characterized the effect of the barrier on dispersion. This paper presents comparisons with the two datasets of the barrier algorithms implemented in two different dispersion models: US EPA’s R-LINE (a research dispersion modelling tool under development by the US EPA’s Office of Research and Development) and CERC’s ADMS model (ADMS-Urban). In R-LINE the physical features reveal

  10. Flipped models in Trinification: A Comprehensive Approach

    CERN Document Server

    Rodríguez, Oscar; Ponce, William A; Rojas, Eduardo

    2016-01-01

    By considering the 3-3-1 and the left-right symmetric models as low-energy effective theories of the trinification group, alternative versions of these models are found. The new neutral gauge bosons in the universal 3-3-1 model and its flipped versions are considered, as are the left-right symmetric model and its two flipped variants. For these models, the couplings of the $Z'$ bosons to the standard model fermions are reported. The explicit form of the null space of the vector boson mass matrix for an arbitrary Higgs tensor and gauge group is also presented. In the general framework of the trinification gauge group, and by using the LHC experimental results and EW precision data, limits on the $Z'$ mass and the mixing angle between $Z$ and the new gauge bosons $Z'$ are imposed. The general results call for very small mixing angles, of order $10^{-3}$ radians, and $M_{Z'}$ > 2.5 TeV.

  11. Lightweight approach to model traceability in a CASE tool

    Science.gov (United States)

    Vileiniskis, Tomas; Skersys, Tomas; Pavalkis, Saulius; Butleris, Rimantas; Butkiene, Rita

    2017-07-01

    The term "model-driven" is not at all a new buzzword within the ranks of the system development community. Nevertheless, the ever-increasing complexity of model-driven approaches keeps fueling all kinds of discussions around this paradigm and pushes researchers to develop new and more effective approaches to system development. With this increasing complexity, model traceability and model management as a whole become indispensable activities of the model-driven system development process. The main goal of this paper is to present a conceptual design and implementation of a practical lightweight approach to model traceability in a CASE tool.

  12. Approaching models of nursing from a postmodernist perspective.

    Science.gov (United States)

    Lister, P

    1991-02-01

    This paper explores some questions about the use of models of nursing. These questions make various assumptions about the nature of models of nursing, in general and in particular. Underlying these assumptions are various philosophical positions which are explored through an introduction to postmodernist approaches in philosophical criticism. To illustrate these approaches, a critique of the Roper et al. model is developed, and more general attitudes towards models of nursing are examined. It is suggested that postmodernism offers a challenge to many of the assumptions implicit in models of nursing, and that a greater awareness of these assumptions should lead to nursing care being better informed where such models are in use.

  13. Effective use of integrated hydrological models in basin-scale water resources management: surrogate modeling approaches

    Science.gov (United States)

    Zheng, Y.; Wu, B.; Wu, X.

    2015-12-01

    Integrated hydrological models (IHMs) consider surface water and subsurface water as a unified system, and have been widely adopted in basin-scale water resources studies. However, due to IHMs' mathematical complexity and high computational cost, it is difficult to implement them in an iterative model evaluation process (e.g., Monte Carlo simulation, simulation-optimization analysis, etc.), which diminishes their applicability for supporting decision-making in real-world situations. Our studies investigated how to effectively use complex IHMs to address real-world water issues via surrogate modeling. Three surrogate modeling approaches were considered: 1) DYCORS (DYnamic COordinate search using Response Surface models), a well-established response surface-based optimization algorithm; 2) SOIM (Surrogate-based Optimization for Integrated surface water-groundwater Modeling), a response surface-based optimization algorithm that we developed specifically for IHMs; and 3) the Probabilistic Collocation Method (PCM), a stochastic response surface approach. Our investigation was based on a modeling case study in the Heihe River Basin (HRB), China's second largest endorheic river basin. The GSFLOW (Coupled Ground-Water and Surface-Water Flow Model) model was employed. Two decision problems were discussed: one is to optimize, both in time and in space, the conjunctive use of surface water and groundwater for agricultural irrigation in the middle HRB region; the other is to cost-effectively collect hydrological data based on a data-worth evaluation. Overall, our study results highlight the value of incorporating an IHM in making decisions on water resources management and hydrological data collection. An IHM like GSFLOW can provide great flexibility in formulating proper objective functions and constraints for various optimization problems. On the other hand, it has been demonstrated that surrogate modeling approaches can pave the way for such incorporation in real
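    The response-surface idea common to all three approaches can be sketched in a few lines: run the expensive model at a small design of points, fit a cheap surrogate, and optimize the surrogate instead. The toy "model" and the polynomial surrogate below are illustrative assumptions, not GSFLOW or DYCORS:

```python
import numpy as np

def expensive_model(x):
    # stand-in for one costly integrated-model run (assumed toy response)
    return (x - 0.3) ** 2 + 0.05 * np.sin(8 * x)

design = np.linspace(0.0, 1.0, 9)       # a small set of "simulation runs"
responses = expensive_model(design)

coeffs = np.polyfit(design, responses, deg=4)   # polynomial response surface
grid = np.linspace(0.0, 1.0, 1001)              # cheap to evaluate densely
x_best = grid[np.argmin(np.polyval(coeffs, grid))]
```

    Iterative schemes such as DYCORS refine the design around `x_best` and refit, so the expensive model is only evaluated where the surrogate looks promising.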

  14. Manufacturing Excellence Approach to Business Performance Model

    Directory of Open Access Journals (Sweden)

    Jesus Cruz Alvarez

    2015-03-01

    Full Text Available Six Sigma, lean manufacturing, total quality management, quality control, and quality function deployment are the fundamental set of tools to enhance productivity in organizations. Some research outlines the benefit of each tool in the particular context of a firm's productivity, but not in the broader context of a firm's competitiveness, which is achieved through business performance. The aim of this theoretical research paper is to contribute to this end and propose a manufacturing excellence approach that links productivity tools to the broader context of business performance.

  15. ASYMPTOTICS FOR CHANGE-POINT MODELS UNDER VARYING DEGREES OF MIS-SPECIFICATION.

    Science.gov (United States)

    Song, Rui; Banerjee, Moulinath; Kosorok, Michael R

    2016-02-01

    Change-point models are widely used by statisticians to model drastic changes in the pattern of observed data. Least squares/maximum likelihood based estimation of change-points leads to curious asymptotic phenomena. When the change-point model is correctly specified, such estimates generally converge at a fast rate (n) and are asymptotically described by minimizers of a jump process. Under complete mis-specification by a smooth curve, i.e. when a change-point model is fitted to data described by a smooth curve, the rate of convergence slows down to n^(1/3) and the limit distribution changes to that of the minimizer of a continuous Gaussian process. In this paper we provide a bridge between these two extreme scenarios by studying the limit behavior of change-point estimates under varying degrees of model mis-specification by smooth curves, which can be viewed as local alternatives. We find that the limiting regime depends on how quickly the alternatives approach a change-point model. We unravel a family of 'intermediate' limits that can transition, at least qualitatively, to the limits in the two extreme scenarios. The theoretical results are illustrated via a set of carefully designed simulations. We also demonstrate how inference for the change-point parameter can be performed in absence of knowledge of the underlying scenario by resorting to subsampling techniques that involve estimation of the convergence rate.
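    A least-squares change-point fit of the kind analyzed above can be sketched directly: for a single jump in mean, scan every split point and keep the one minimizing the residual sum of squares (toy synthetic data, not the paper's simulation design):

```python
import numpy as np

def lsq_changepoint(y):
    """Least-squares change-point estimate for a single jump in mean."""
    best_k, best_rss = 1, np.inf
    for k in range(1, len(y)):
        left, right = y[:k], y[k:]
        rss = np.sum((left - left.mean()) ** 2) + np.sum((right - right.mean()) ** 2)
        if rss < best_rss:
            best_k, best_rss = k, rss
    return best_k

rng = np.random.default_rng(42)
y = np.concatenate([rng.normal(0.0, 0.5, 50), rng.normal(2.0, 0.5, 50)])
k_hat = lsq_changepoint(y)   # should land near the true change-point at 50
```

    With a genuine jump, as here, the estimate locks on at the fast rate the abstract describes; fitting the same estimator to data from a smooth trend is what produces the slower n^(1/3) regime.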

  16. A Bayesian Model Committee Approach to Forecasting Global Solar Radiation

    CERN Document Server

    Lauret, Philippe; Muselli, Marc; David, Mathieu; Diagne, Hadja; Voyant, Cyril

    2012-01-01

    This paper proposes to use a rather new modelling approach in the realm of solar radiation forecasting. In this work, two forecasting models, Autoregressive Moving Average (ARMA) and Neural Network (NN) models, are combined to form a model committee. Bayesian inference is used to assign a probability to each model in the committee. Hence, each model's predictions are weighted by its respective probability. The models are fitted to one year of hourly Global Horizontal Irradiance (GHI) measurements. Another year (the test set) is used for making genuine one-hour-ahead (h+1) out-of-sample forecast comparisons. The proposed approach is benchmarked against the persistence model. The first results show an improvement brought by this approach.
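    The committee idea can be sketched with two toy forecasters whose weights come from a Gaussian likelihood of their past errors. The forecasters, noise level and weighting scheme below are assumptions for illustration; ARMA and NN models would take their place:

```python
import numpy as np

def persistence(history):
    return history[-1]                      # "next hour equals this hour"

def mean_reverting(history, phi=0.8):
    m = np.mean(history)
    return m + phi * (history[-1] - m)

def committee(history, past_err_a, past_err_b, sigma=1.0):
    """Weight two forecasts by posterior model probabilities derived
    from Gaussian likelihoods of their past one-step errors."""
    log_la = -np.sum(np.square(past_err_a)) / (2 * sigma ** 2)
    log_lb = -np.sum(np.square(past_err_b)) / (2 * sigma ** 2)
    m = max(log_la, log_lb)                 # stabilize the exponentials
    wa, wb = np.exp(log_la - m), np.exp(log_lb - m)
    wa, wb = wa / (wa + wb), wb / (wa + wb)
    return wa * persistence(history) + wb * mean_reverting(history)
```

    A member with systematically smaller past errors receives weight close to 1, so the committee smoothly interpolates between its models rather than picking one outright.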

  17. MDA based-approach for UML Models Complete Comparison

    CERN Document Server

    Chaouni, Samia Benabdellah; Mouline, Salma

    2011-01-01

    If a modeling task is distributed, it will frequently be necessary to integrate models developed by different team members. Problems occur in the model integration step and particularly in the comparison phase of the integration. This issue has been discussed in several domains and for various models. However, previous approaches have not correctly handled semantic comparison. In the current paper, we provide an MDA-based approach for model comparison which aims at comparing UML models. We develop a hybrid approach which takes into account syntactic, semantic and structural comparison aspects. For this purpose, we use the domain ontology as well as other resources such as dictionaries. We propose a decision support system which permits the user to validate (or not) correspondences extracted in the comparison phase. For implementation, we propose an extension of the generic correspondence metamodel AMW in order to transform UML models to the correspondence model.

  18. Theoretical approach to the phonon modes and specific heat of germanium nanowires

    Energy Technology Data Exchange (ETDEWEB)

    Trejo, A.; López-Palacios, L.; Vázquez-Medina, R.; Cruz-Irisson, M., E-mail: irisson@ipn.mx

    2014-11-15

    The phonon modes and specific heat of Ge nanowires were computed using a first principles density functional theory scheme with a generalized gradient approximation and finite-displacement supercell algorithms. The nanowires were modeled in three different directions: [001], [111], and [110], using the supercell technique. All surface dangling bonds were saturated with hydrogen atoms. The results show that the specific heat of the GeNWs at room temperature increases as the nanowire diameter decreases, regardless of the orientation, due to phonon confinement and surface passivation. Phonon confinement effects could also be observed, since the highest optical phonon modes in the Ge vibration interval shifted to a lower frequency compared to their bulk counterparts.
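    Once the phonon frequencies are in hand, the specific heat follows from the standard harmonic-oscillator expression C_v = k_B Σ x² eˣ/(eˣ−1)² with x = ħω/k_BT. A minimal sketch (the frequency list is an assumed input, e.g. from such a DFT calculation):

```python
import numpy as np

KB = 1.380649e-23        # Boltzmann constant, J/K
HBAR = 1.054571817e-34   # reduced Planck constant, J*s

def harmonic_cv(freqs_thz, temperature):
    """Harmonic heat capacity (J/K) from phonon frequencies given in THz."""
    omega = 2 * np.pi * np.asarray(freqs_thz) * 1e12   # angular frequency, rad/s
    x = HBAR * omega / (KB * temperature)
    return float(np.sum(KB * x ** 2 * np.exp(x) / np.expm1(x) ** 2))
```

    In the high-temperature limit each mode contributes k_B (the Dulong-Petit value), while low-frequency modes dominate at low temperature; this is why confinement-softened modes raise the specific heat of thin wires.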

  19. Raw Data Maximum Likelihood Estimation for Common Principal Component Models: A State Space Approach.

    Science.gov (United States)

    Gu, Fei; Wu, Hao

    2016-09-01

    The specifications of state space models for some principal component-related models are described, including the independent-group common principal component (CPC) model, the dependent-group CPC model, and principal component-based multivariate analysis of variance. Some derivations are provided to show the equivalence of the state space approach and the existing Wishart-likelihood approach. For each model, a numeric example is used to illustrate the state space approach. In addition, a simulation study is conducted to evaluate the standard error estimates under normality and nonnormality conditions. In order to cope with the nonnormality conditions, robust standard errors are also computed. Finally, other possible applications of the state space approach are discussed at the end.

  20. A modular approach for item response theory modeling with the R package flirt.

    Science.gov (United States)

    Jeon, Minjeong; Rijmen, Frank

    2016-06-01

    The new R package flirt is introduced for flexible item response theory (IRT) modeling of psychological, educational, and behavior assessment data. flirt integrates a generalized linear and nonlinear mixed modeling framework with graphical model theory. The graphical model framework allows for efficient maximum likelihood estimation. The key feature of flirt is its modular approach to facilitate convenient and flexible model specifications. Researchers can construct customized IRT models by simply selecting various modeling modules, such as parametric forms, number of dimensions, item and person covariates, person groups, link functions, etc. In this paper, we describe major features of flirt and provide examples to illustrate how flirt works in practice.
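    flirt itself is an R package, but the modular idea it describes (pick a parametric form, fix or free parameters) can be hedged in a short Python sketch of the 2PL item response function, with the Rasch model recovered as a special case. This is toy code, not flirt's API:

```python
import numpy as np

def two_pl(theta, a, b):
    """2PL item response function: P(correct) given ability theta,
    discrimination a and difficulty b."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def rasch(theta, b):
    """Rasch "module": the 2PL with discrimination fixed at 1."""
    return two_pl(theta, 1.0, b)

def log_likelihood(responses, theta, a, b):
    """Bernoulli log-likelihood of 0/1 item responses for one item."""
    p = two_pl(theta, a, b)
    return float(np.sum(responses * np.log(p) + (1 - responses) * np.log(1 - p)))
```

    Swapping the parametric form, adding covariates to `b`, or adding dimensions to `theta` are exactly the kinds of module choices the package exposes.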

  1. A consortium approach to glass furnace modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Chang, S.-L.; Golchert, B.; Petrick, M.

    1999-04-20

    Using computational fluid dynamics to model a glass furnace is a difficult task for any one glass company, laboratory, or university to accomplish. The task of building a computational model of the furnace requires knowledge and experience in modeling two dissimilar regimes (the combustion space and the liquid glass bath), along with the skill necessary to couple these two regimes. Also, a detailed set of experimental data is needed in order to evaluate the output of the code to ensure that the code is providing proper results. Since all these diverse skills are not present in any one research institution, a consortium was formed between Argonne National Laboratory, Purdue University, Mississippi State University, and five glass companies in order to marshal these skills into one three-year program. The objective of this program is to develop a fully coupled, validated simulation of a glass melting furnace that may be used by industry to optimize the performance of existing furnaces.

  2. Mixture modeling approach to flow cytometry data.

    Science.gov (United States)

    Boedigheimer, Michael J; Ferbas, John

    2008-05-01

    Flow cytometry has become a mainstay technique for measuring fluorescent and physical attributes of single cells in a suspended mixture. These data are reduced during analysis using a manual or semiautomated process of gating. Despite the need to gate data for traditional analyses, it is well recognized that analyst-to-analyst variability can impact the dataset. Moreover, cells of interest can be inadvertently excluded from the gate, and relationships between collected variables may go unappreciated because they were not included in the original analysis plan. A multivariate non-gating technique was developed and implemented that accomplishes the same goal as traditional gating while eliminating many of its weaknesses. The procedure was validated against traditional gating for analysis of circulating B cells in normal donors (n = 20) and persons with Systemic Lupus Erythematosus (n = 42). The method recapitulated relationships in the dataset while providing an automated and objective assessment of the data. Flow cytometry analyses are amenable to automated analytical techniques that are not predicated on discrete operator-generated gates. Such alternative approaches can remove subjectivity from data analysis, improve efficiency, and may ultimately enable the construction of large bioinformatics data systems for more sophisticated approaches to hypothesis testing.
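    As a hedged sketch of the non-gating idea (not the authors' implementation), a Gaussian mixture fitted to synthetic two-marker events assigns every cell a population probability instead of a hand-drawn gate; scikit-learn's `GaussianMixture` stands in for the multivariate procedure:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
# two synthetic "cell populations" in a two-marker fluorescence space
pop_a = rng.normal(loc=[2.0, 2.0], scale=0.3, size=(300, 2))
pop_b = rng.normal(loc=[5.0, 4.0], scale=0.4, size=(200, 2))
events = np.vstack([pop_a, pop_b])

gmm = GaussianMixture(n_components=2, random_state=0).fit(events)
labels = gmm.predict(events)          # soft, probability-based "gates"
fractions = np.sort(gmm.weights_)     # population frequencies, no manual gate
```

    Because the mixture weights play the role of gate frequencies, two analysts running the same fit get the same answer, which is the subjectivity argument made above.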

  3. Inhomogeneous condensation in effective models for QCD using the finite-mode approach

    CERN Document Server

    Heinz, Achim; Wagner, Marc; Rischke, Dirk H

    2016-01-01

    We use a numerical method, the finite-mode approach, to study inhomogeneous condensation in effective models for QCD in a general framework. Former limitations of considering a specific ansatz for the spatial dependence of the condensate are overcome. Different error sources are analyzed and strategies to minimize or eliminate them are outlined. The analytically known results for $1+1$ dimensional models (such as the Gross-Neveu model and extensions of it) are correctly reproduced using the finite-mode approach. Moreover, the NJL model in $3+1$ dimensions is investigated and its phase diagram is determined with particular focus on the inhomogeneous phase at high density.

  4. A model-based systems approach to pharmaceutical product-process design and analysis

    DEFF Research Database (Denmark)

    Gernaey, Krist; Gani, Rafiqul

    2010-01-01

    This is a perspective paper highlighting the need for systematic model-based design and analysis in pharmaceutical product-process development. A model-based framework is presented and the role, development and use of models of various types are discussed together with the structure of the models...... for the product and the process. The need for a systematic modelling framework is highlighted together with modelling issues related to model identification, adaptation and extension. In the area of product design and analysis, predictive models are needed with a wide application range. In the area of process...... synthesis and design, the use of generic process models from which specific process models can be generated, is highlighted. The use of a multi-scale modelling approach to extend the application range of the property models is highlighted as well. Examples of different types of process models, model...

  5. BUSINESS MODEL IN ELECTRICITY INDUSTRY USING BUSINESS MODEL CANVAS APPROACH; THE CASE OF PT. XYZ

    National Research Council Canada - National Science Library

    Wicaksono, Achmad Arief; Syarief, Rizal; Suparno, Ono

    2017-01-01

    .... This study aims to identify company's business model using Business Model Canvas approach, formulate business development strategy alternatives, and determine the prioritized business development...

  6. A New Comparative-Genomics Approach for Defining Phenotype-Specific Indicators Reveals Specific Genetic Markers in Predatory Bacteria.

    Science.gov (United States)

    Pasternak, Zohar; Ben Sasson, Tom; Cohen, Yossi; Segev, Elad; Jurkevitch, Edouard

    2015-01-01

    Predatory bacteria seek and consume other live bacteria. Although belonging to taxonomically diverse groups, relatively few bacterial predator species are known. Consequently, it is difficult to assess the impact of predation within the bacterial realm. As no genetic signatures distinguishing them from non-predatory bacteria are known, genomic resources cannot be exploited to uncover novel predators. In order to identify genes specific to predatory bacteria, we developed a bioinformatic tool called DiffGene. This tool automatically identifies marker genes that are specific to phenotypic or taxonomic groups, by mapping the complete gene content of all available fully-sequenced genomes for the presence/absence of each gene in each genome. A putative 'predator region' of ~60 amino acids in the tryptophan 2,3-dioxygenase (TDO) protein was found to probably be a predator-specific marker. This region is found in all known obligate predator and a few facultative predator genomes, and is absent from most facultative predators and all non-predatory bacteria. We designed PCR primers that uniquely amplify a ~180bp-long sequence within the predators' TDO gene, and validated them in monocultures as well as in metagenetic analysis of environmental wastewater samples. This marker, in addition to its usage in predator identification and phylogenetics, may finally permit reliable enumeration and cataloguing of predatory bacteria from environmental samples, as well as uncovering novel predators.
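    The presence/absence logic behind a tool like DiffGene can be sketched with plain set operations on a toy gene-content table (all genome and gene names here are hypothetical):

```python
# hypothetical gene content per fully-sequenced genome
genomes = {
    "predator_1":     {"tdo", "gyrA", "recA"},
    "predator_2":     {"tdo", "gyrA"},
    "non_predator_1": {"gyrA", "recA"},
    "non_predator_2": {"gyrA", "dnaK"},
}
predators = {"predator_1", "predator_2"}

# marker genes: present in every predator, absent from every non-predator
in_all_predators = set.intersection(*(genomes[g] for g in predators))
in_any_non_predator = set.union(*(genomes[g] for g in genomes if g not in predators))
markers = in_all_predators - in_any_non_predator   # -> {"tdo"}
```

    Run over all fully sequenced genomes with a real phenotype labeling, this intersection-minus-union query is what surfaces candidates like the TDO region described above.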

  7. The specific shapes of gender imbalance in scientific authorships: a network approach

    CERN Document Server

    Araújo, Tanya

    2016-01-01

    Gender differences in collaborative research have received little attention compared with the growing importance that women hold in academia and research. Unsurprisingly, most bibliometric databases lack directly available information by gender. Although empirical network approaches are often used in the study of research collaboration, studies of the influence of gender dissimilarities on the resulting topological outcomes are still scarce. Here, networks of scientific subjects are used to characterize patterns that might be associated with five categories of authorship which were built based on gender. We find enough evidence that gender imbalance in scientific authorships brings a peculiar trait to the networks induced from papers published in Web of Science (WoS) indexed journals of Economics over the period 2010-2015 and having at least one author affiliated to a Portuguese institution. Our results show the emergence of a specific pattern when the network of co-occurr...

  8. An Isomer-Specific Approach to Endocrine-Disrupting Nonylphenol in Infant Food.

    Science.gov (United States)

    Günther, Klaus; Räcker, Torsten; Böhme, Roswitha

    2017-02-15

    Nonylphenols (NPs) are persistent endocrine disruptors that are priority hazardous substances of the European Union Water Framework Directive. Their presence in the environment has caused growing concern regarding their impact on human health. Recent studies have shown that nonylphenol is ubiquitous in commercially available foodstuffs and is also present in human blood. The isomer distribution of 4-nonylphenol was analyzed by gas chromatography - mass spectrometry in 44 samples of infant food. Our study shows that the distribution of nonylphenol isomers is dependent on the foodstuff analyzed. Although some isomer groups prevail, different distributions are frequent. Variations are even found in the same food group. Nonylphenol is a complex mixture of isomers, and the estrogenic potentials of each of these isomers are very different. Consequently, to determine the potential toxicological impact of NP in food, an isomer-specific approach is necessary.

  9. Sex-specific patterns and differences in dementia and Alzheimer’s disease using informatics approaches

    Science.gov (United States)

    Ronquillo, Jay Geronimo; Baer, Merritt Rachel; Lester, William T.

    2016-01-01

    The National Institutes of Health Office of Research on Women’s Health recently highlighted the critical need for explicitly addressing sex differences in biomedical research, including Alzheimer’s disease and dementia. The purpose of our study was to perform a sex-stratified analysis of cognitive impairment using diverse medical, clinical and genetic factors of unprecedented scale and scope by applying informatics approaches to three large Alzheimer’s databases. Analyses suggested that females were 1.5 times more likely than males to have a documented diagnosis of probable Alzheimer’s disease, and several other factors fell along sex-specific lines and were possibly associated with severity of cognitive impairment. PMID:27105335

  11. and Models: A Self-Similar Approach

    Directory of Open Access Journals (Sweden)

    José Antonio Belinchón

    2013-01-01

    equations (FEs) admit self-similar solutions. The methods employed allow us to obtain general results that are valid not only for the FRW metric, but also for all the Bianchi types as well as for the Kantowski-Sachs model (under the self-similarity hypothesis and the power-law hypothesis for the scale factors).

  12. Nonperturbative approach to the modified statistical model

    Energy Technology Data Exchange (ETDEWEB)

    Magdy, M.A.; Bekmezci, A.; Sever, R. [Middle East Technical Univ., Ankara (Turkey)

    1993-12-01

    The modified form of the statistical model is used without making any perturbation. The mass spectra of the lowest S, P and D levels of the (Q{bar Q}) and the non-self-conjugate (Q{bar q}) mesons are studied with the Song-Lin potential. The authors' results are in good agreement with the experimental and theoretical findings.

  13. System Behavior Models: A Survey of Approaches

    Science.gov (United States)

    2016-06-01

    Mandana Vaziri, and Frank Tip. 2007. “Finding Bugs Efficiently with a SAT Solver.” In European Software Engineering Conference and the ACM SIGSOFT...Van Gorp. 2005. “A Taxonomy of Model Transformation.” Electronic Notes in Theoretical Computer Science 152: 125–142. Miyazawa, Alvaro, and Ana

  14. The integrative taxonomic approach reveals host specific species in an encyrtid parasitoid species complex.

    Directory of Open Access Journals (Sweden)

    Douglas Chesters

    Full Text Available Integrated taxonomy uses evidence from a number of different character types to delimit species and other natural groupings. While this approach has been advocated recently, and should be of particular utility in the case of diminutive insect parasitoids, there are relatively few examples of its application in these taxa. Here, we use an integrated framework to delimit independent lineages in Encyrtus sasakii (Hymenoptera: Chalcidoidea: Encyrtidae), a parasitoid morphospecies previously considered a host generalist. Sequence variation at the DNA barcode (cytochrome c oxidase I, COI) and nuclear 28S rDNA loci were compared to morphometric recordings and mating compatibility tests, among samples of this species complex collected from its four scale insect hosts, covering a broad geographic range of northern and central China. Our results reveal that Encyrtus sasakii comprises three lineages that, while sharing a similar morphology, are highly divergent at the molecular level. At the barcode locus, the median K2P molecular distance between individuals from three primary populations was found to be 11.3%, well outside the divergence usually observed between Chalcidoidea conspecifics (0.5%). Corroborative evidence that the genetic lineages represent independent species was found from mating tests, where compatibility was observed only within populations, and morphometric analysis, which found that despite apparent morphological homogeneity, populations clustered according to forewing shape. The independent lineages defined by the integrated analysis correspond to the three scale insect hosts, suggesting the presence of host specific cryptic species. The finding of hidden host specificity in this species complex demonstrates the critical role that DNA barcoding will increasingly play in revealing hidden biodiversity in taxa that present difficulties for traditional taxonomic approaches.
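    The K2P (Kimura two-parameter) distance quoted above separates transitions (purine↔purine, pyrimidine↔pyrimidine) from transversions and corrects for multiple hits via d = −½ ln[(1−2P−Q)√(1−2Q)]. A small sketch with made-up sequences:

```python
import math

PURINES = {"A", "G"}

def k2p_distance(seq1, seq2):
    """Kimura two-parameter distance between two aligned DNA sequences."""
    transitions = transversions = sites = 0
    for a, b in zip(seq1, seq2):
        if a == "-" or b == "-":
            continue                      # skip alignment gaps
        sites += 1
        if a == b:
            continue
        if (a in PURINES) == (b in PURINES):
            transitions += 1              # A<->G or C<->T
        else:
            transversions += 1
    p, q = transitions / sites, transversions / sites
    return -0.5 * math.log((1 - 2 * p - q) * math.sqrt(1 - 2 * q))
```

    Distances of ~11% between populations versus ~0.5% within, as reported above, are what motivate treating the lineages as separate species.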

  15. A moving approach for the Vector Hysteron Model

    Energy Technology Data Exchange (ETDEWEB)

    Cardelli, E. [Department of Engineering, University of Perugia, Via G. Duranti 93, 06125 Perugia (Italy); Faba, A., E-mail: antonio.faba@unipg.it [Department of Engineering, University of Perugia, Via G. Duranti 93, 06125 Perugia (Italy); Laudani, A. [Department of Engineering, Roma Tre University, Via V. Volterra 62, 00146 Rome (Italy); Quondam Antonio, S. [Department of Engineering, University of Perugia, Via G. Duranti 93, 06125 Perugia (Italy); Riganti Fulginei, F.; Salvini, A. [Department of Engineering, Roma Tre University, Via V. Volterra 62, 00146 Rome (Italy)

    2016-04-01

    A moving approach for the VHM (Vector Hysteron Model) is described here, to reconstruct both scalar and rotational magnetization of electrical steels with weak anisotropy, such as non-oriented grain silicon steel. The hysteron distribution is postulated to be a function of the magnetization state of the material, in order to overcome the practical limitation of the congruency property of the standard VHM approach. By using this formulation and a suitable accommodation procedure, the results obtained indicate that the model is accurate, in particular in reproducing the experimental behavior approaching the saturation region, allowing a real improvement with respect to the previous approach.

  16. Integration models: multicultural and liberal approaches confronted

    Science.gov (United States)

    Janicki, Wojciech

    2012-01-01

    European societies have been shaped by their Christian past, the upsurge of international migration, democratic rule and a liberal tradition rooted in religious tolerance. Accelerating globalization processes impose new challenges on European societies striving to protect their diversity. This struggle is especially clearly visible in the case of minorities trying to resist melting into the mainstream culture. European countries' legal systems and cultural policies respond to these efforts in many ways. Respecting identity-politics-driven group rights seems to be the most common approach, resulting in the creation of a multicultural society. However, the outcome of respecting group rights may be remarkably contradictory both to individual rights growing out of the liberal tradition, and to the reinforced concept of integration of immigrants into host societies. This paper discusses the upturn of identity politics in the context of both individual rights and the integration of European societies.

  17. A Fully Conditional Specification Approach to Multilevel Imputation of Categorical and Continuous Variables.

    Science.gov (United States)

    Enders, Craig K; Keller, Brian T; Levy, Roy

    2017-05-29

    Specialized imputation routines for multilevel data are widely available in software packages, but these methods are generally not equipped to handle a wide range of complexities that are typical of behavioral science data. In particular, existing imputation schemes differ in their ability to handle random slopes, categorical variables, differential relations at Level-1 and Level-2, and incomplete Level-2 variables. Given the limitations of existing imputation tools, the purpose of this manuscript is to describe a flexible imputation approach that can accommodate a diverse set of 2-level analysis problems that includes any of the aforementioned features. The procedure employs a fully conditional specification (also known as chained equations) approach with a latent variable formulation for handling incomplete categorical variables. Computer simulations suggest that the proposed procedure works quite well, with trivial biases in most cases. We provide a software program that implements the imputation strategy, and we use an artificial data set to illustrate its use. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
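    The chained-equations core of the approach can be sketched for continuous variables: initialize missing cells with column means, then repeatedly regress each incomplete column on all the others and refresh its missing entries. This is a deliberately simplified sketch without the latent-variable machinery for categorical data or the multilevel structure:

```python
import numpy as np

def fcs_impute(X, n_iter=10):
    """Tiny fully-conditional-specification sketch for continuous data."""
    X = np.array(X, dtype=float)
    missing = np.isnan(X)
    col_means = np.nanmean(X, axis=0)
    for j in range(X.shape[1]):
        X[missing[:, j], j] = col_means[j]          # starting values
    for _ in range(n_iter):
        for j in range(X.shape[1]):                 # one "chained equation" per column
            if not missing[:, j].any():
                continue
            predictors = np.delete(X, j, axis=1)
            A = np.column_stack([np.ones(len(X)), predictors])
            observed = ~missing[:, j]
            beta, *_ = np.linalg.lstsq(A[observed], X[observed, j], rcond=None)
            X[missing[:, j], j] = A[missing[:, j]] @ beta
    return X
```

    Proper multiple imputation additionally draws beta and residual noise at random each cycle and repeats the whole procedure M times; the deterministic loop above only shows the column-by-column structure that gives the method its name.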

  18. An approach to model validation and model-based prediction -- polyurethane foam case study.

    Energy Technology Data Exchange (ETDEWEB)

    Dowding, Kevin J.; Rutherford, Brian Milne

    2003-07-01

    Enhanced software methodology and improved computing hardware have advanced the state of simulation technology to a point where large physics-based codes can be a major contributor in many systems analyses. This shift toward the use of computational methods has brought with it new research challenges in a number of areas including characterization of uncertainty, model validation, and the analysis of computer output. It is these challenges that have motivated the work described in this report. Approaches to and methods for model validation and (model-based) prediction have been developed recently in the engineering, mathematics and statistical literatures. In this report we have provided a fairly detailed account of one approach to model validation and prediction applied to an analysis investigating thermal decomposition of polyurethane foam. A model simulates the evolution of the foam in a high temperature environment as it transforms from a solid to a gas phase. The available modeling and experimental results serve as data for a case study focusing our model validation and prediction developmental efforts on this specific thermal application. We discuss several elements of the "philosophy" behind the validation and prediction approach: (1) We view the validation process as an activity applying to the use of a specific computational model for a specific application. We do acknowledge, however, that an important part of the overall development of a computational simulation initiative is the feedback provided to model developers and analysts associated with the application. (2) We utilize information obtained for the calibration of model parameters to estimate the parameters and quantify uncertainty in the estimates. We rely, however, on validation data (or data from similar analyses) to measure the variability that contributes to the uncertainty in predictions for specific systems or units (unit-to-unit variability). (3) We perform statistical

  19. ISM Approach to Model Offshore Outsourcing Risks

    Directory of Open Access Journals (Sweden)

    Sunand Kumar

    2014-07-01

    Full Text Available In an effort to achieve a competitive advantage via cost reductions and improved market responsiveness, organizations are increasingly employing offshore outsourcing as a major component of their supply chain strategies. But as is evident from the literature, a number of risks, such as Political risk, Risk due to cultural differences, Compliance and regulatory risk, Opportunistic risk and Organization structural risk, adversely affect the performance of offshore outsourcing in a supply chain network. This also leads to dissatisfaction among different stakeholders. The main objective of this paper is to identify and understand the mutual interaction among the various risks which affect the performance of offshore outsourcing. To this effect, the authors have identified various risks through an extant review of the literature. From this information, an integrated model of the risks affecting offshore outsourcing is developed using interpretive structural modelling (ISM), and the structural relationships between these risks are modelled. Further, MICMAC analysis is done to analyze the driving power and dependence of the risks, which shall help managers identify and classify important criteria and reveal the direct and indirect effects of each criterion on offshore outsourcing. Results show that political risk and risk due to cultural differences act as strong drivers.
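
    The mechanical core of ISM and MICMAC can be sketched in a few lines: build the final reachability matrix as the transitive closure of the initial relation matrix, then read driving power (row sums) and dependence (column sums) off it. The three-risk example below is hypothetical, purely to exercise the code.

```python
def reachability(adj):
    """Final reachability matrix for ISM: Warshall's transitive
    closure of the initial binary relation matrix, with a reflexive
    diagonal (every element reaches itself)."""
    n = len(adj)
    r = [[1 if (adj[i][j] or i == j) else 0 for j in range(n)]
         for i in range(n)]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if r[i][k] and r[k][j]:
                    r[i][j] = 1
    return r

def micmac(r):
    """MICMAC coordinates: driving power is the row sum of the
    reachability matrix, dependence the column sum."""
    driving = [sum(row) for row in r]
    dependence = [sum(col) for col in zip(*r)]
    return driving, dependence

# hypothetical chain: risk 0 influences risk 1, which influences risk 2
adj = [[0, 1, 0],
       [0, 0, 1],
       [0, 0, 0]]
r = reachability(adj)
driving, dependence = micmac(r)
```

    In this toy chain the first risk gets the highest driving power and the lowest dependence, i.e. it would plot as a "driver" in the MICMAC diagram.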

  20. Experimental development based on mapping rule between requirements analysis model and web framework specific design model.

    Science.gov (United States)

    Okuda, Hirotaka; Ogata, Shinpei; Matsuura, Saeko

    2013-12-01

    Model Driven Development is a promising approach to developing high quality software systems. We have proposed a method of model-driven requirements analysis using the Unified Modeling Language (UML). The main feature of our method is to automatically generate a Web user interface prototype from the UML requirements analysis model, so that we can confirm the validity of input/output data for each page and of page transitions on the system by directly operating the prototype. We propose a mapping rule in which design information independent of each web application framework implementation is defined based on the requirements analysis model, so as to improve traceability from the validated requirements analysis model to the final product. This paper discusses the result of applying our method to the development of a Group Work Support System that is currently running in our department.

  1. Influenza virus-specific TCR-transduced T cells as a model for adoptive immunotherapy.

    Science.gov (United States)

    Berdien, Belinda; Reinhard, Henrike; Meyer, Sabrina; Spöck, Stefanie; Kröger, Nicolaus; Atanackovic, Djordje; Fehse, Boris

    2013-06-01

    Adoptive transfer of T lymphocytes equipped with tumor-antigen specific T-cell receptors (TCRs) represents a promising strategy in cancer immunotherapy, but the approach remains technically demanding. Using influenza virus (Flu)-specific T-cell responses as a model system we compared different methods for the generation of T-cell clones and isolation of antigen-specific TCRs. Altogether, we generated 12 CD8(+) T-cell clones reacting to the Flu matrix protein (Flu-M) and 6 CD4(+) T-cell clones reacting to the Flu nucleoprotein (Flu-NP) from 4 healthy donors. IFN-γ-secretion-based enrichment of antigen-specific cells, optionally combined with tetramer staining, was the most efficient way of generating T-cell clones. In contrast, the commonly used limiting dilution approach was least efficient. TCR genes were isolated from T-cell clones and cloned into both a previously used gammaretroviral LTR-vector, MP91, and the novel lentiviral self-inactivating vector LeGO-MP, which contains MP91-derived promoter and regulatory elements. To directly compare their functional efficiencies, we transduced T-cell lines and primary T cells in parallel with the two vectors encoding identical TCRs. Transduction efficiencies were approximately twice as high with the gammaretroviral vector. Secretion of high amounts of IFN-γ, IL-2 and TNF-α by transduced cells after exposure to the respective influenza target epitope proved efficient specificity transfer of the isolated TCRs to primary T cells for both vectors, at the same time indicating superior functionality of MP91-transduced cells. In conclusion, we have developed optimized strategies to obtain and transfer antigen-specific TCRs as well as designed a novel lentiviral vector for TCR-gene transfer. Our data may help to improve adoptive T-cell therapies.

  2. Quantum Machine and SR Approach: a Unified Model

    CERN Document Server

    Garola, C; Sozzo, S; Garola, Claudio; Pykacz, Jaroslav; Sozzo, Sandro

    2005-01-01

    The Geneva-Brussels approach to quantum mechanics (QM) and the semantic realism (SR) nonstandard interpretation of QM exhibit some common features and some deep conceptual differences. We discuss in this paper two elementary models provided in the two approaches as intuitive supports to general reasoning and as proofs of consistency of general assumptions, and show that Aerts' quantum machine can be embodied into a macroscopic version of the microscopic SR model, overcoming the seeming incompatibility between the two models. This result provides some hints for the construction of a unified perspective in which the two approaches can be properly placed.

  3. A cultural evolutionary programming approach to automatic analytical modeling of electrochemical phenomena through impedance spectroscopy

    CERN Document Server

    Arpaia, Pasquale

    2009-01-01

    An approach to automatic analytical modeling of electrochemical impedance spectroscopy data by evolutionary programming based on cultural algorithms is proposed. A solution-search strategy based on a cultural mechanism is exploited for defining the equivalent-circuit model automatically: information on search advance is transmitted to all potential solutions, rather than only to a small inheriting subset as in a traditional genetic approach. Moreover, unlike in the state of the art, specific information related to constraints arising from knowledge of the application physics is also transferred. Experimental results of implementing the proposed approach in impedance spectroscopy, for general-purpose electrochemical circuit analysis and for corrosion monitoring and diagnosis, are presented.
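
    The distinguishing cultural-algorithm ingredient (a shared belief space that steers the variation of *all* individuals, instead of information being inherited by only a small subset) can be shown with a deliberately tiny 1-D minimiser. This is our own toy sketch, not the paper's circuit-fitting code; the objective, bounds and parameters are all illustrative.

```python
import random

def cultural_minimize(f, lo, hi, pop_size=30, gens=50, seed=1):
    """Toy cultural algorithm for 1-D minimisation.

    The belief space is the interval spanned by the current elite;
    every individual (except the retained best) is regenerated under
    its influence each generation, so knowledge of the search advance
    is broadcast to the whole population."""
    rng = random.Random(seed)
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=f)
        elite = pop[: max(2, pop_size // 5)]
        b_lo, b_hi = min(elite), max(elite)      # update belief space
        pad = 0.1 * max(b_hi - b_lo, 1e-9)       # small exploration margin
        # keep the incumbent best, regenerate the rest from the
        # belief-space interval (cultural influence on all solutions)
        pop = [pop[0]] + [rng.uniform(b_lo - pad, b_hi + pad)
                          for _ in range(pop_size - 1)]
    return min(pop, key=f)
```

    In an equivalent-circuit setting, `f` would be the misfit between measured and simulated impedance spectra, and the belief space would also carry the physics-derived constraints the abstract mentions.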

  4. Cover signal specific steganalysis: the impact of training on the example of two selected audio steganalysis approaches

    Science.gov (United States)

    Kraetzer, Christian; Dittmann, Jana

    2008-02-01

    The main goals of this paper are to show the impact of the basic assumptions about the cover channel characteristics, as well as the impact of different training/testing set generation strategies, on the statistical detectability of exemplarily chosen audio hiding approaches known from steganography and watermarking. Here we have selected five exemplary steganography algorithms and four watermarking algorithms. The channel characteristics for two chosen audio cover channels (an application-specific exemplary scenario of VoIP steganography and universal audio steganography) are formalised, and their impact on decisions in the steganalysis process, especially on the strategies applied for training/testing set generation, is shown. Following the assumptions on the cover channel characteristics, either cover-dependent or cover-independent training and testing can be performed, using either correlated or non-correlated training and test sets. In comparison to previous work, additional frequency domain features are introduced for steganalysis, and the performance (in terms of classification accuracy) of Bayesian classifiers and multinomial logistic regression models is compared with the results of SVM classification. We show that the newly implemented frequency domain features increase the classification accuracy achieved in SVM classification. Furthermore it is shown, on the example of VoIP steganalysis, that channel-specific evaluation performs better than tests without focus on a specific channel (i.e. universal steganalysis). A comparison of test results for cover-dependent and cover-independent training and testing shows that the latter performs better for all nine algorithms evaluated here and the used SVM based classifier.

  5. Dynamic Metabolic Model Building Based on the Ensemble Modeling Approach

    Energy Technology Data Exchange (ETDEWEB)

    Liao, James C. [Univ. of California, Los Angeles, CA (United States)

    2016-10-01

    Ensemble modeling of kinetic systems addresses the challenges of kinetic model construction, with respect to parameter value selection, and still allows for the rich insights possible from kinetic models. This project aimed to show that constructing, implementing, and analyzing such models is a useful tool for the metabolic engineering toolkit, and that they can result in actionable insights from models. Key concepts are developed and deliverable publications and results are presented.

  6. A modular approach to numerical human body modeling

    NARCIS (Netherlands)

    Forbes, P.A.; Griotto, G.; Rooij, L. van

    2007-01-01

    The choice of a human body model for a simulated automotive impact scenario must take into account both accurate model response and computational efficiency as key factors. This study presents a "modular numerical human body modeling" approach which allows the creation of a customized human body model.

  7. A behavioral approach to linear exact modeling

    NARCIS (Netherlands)

    ANTOULAS, AC; WILLEMS, JC

    1993-01-01

    The behavioral approach to system theory provides a parameter-free framework for the study of the general problem of linear exact modeling and recursive modeling. The main contribution of this paper is the solution of the (continuous-time) polynomial-exponential time series modeling problem. Both re

  9. A market model for stochastic smile: a conditional density approach

    NARCIS (Netherlands)

    Zilber, A.

    2005-01-01

    The purpose of this paper is to introduce a new approach that allows us to construct no-arbitrage market models for implied volatility surfaces (in other words, stochastic smile models). That is to say, the idea presented here allows us to model prices of liquidly traded vanilla options as separate

  10. Modelling and simulating retail management practices: a first approach

    CERN Document Server

    Siebers, Peer-Olaf; Celia, Helen; Clegg, Chris

    2010-01-01

    Multi-agent systems offer a new and exciting way of understanding the world of work. We apply agent-based modeling and simulation to investigate a set of problems in a retail context. Specifically, we are working to understand the relationship between people management practices on the shop-floor and retail performance. Despite the fact that we are working within a relatively novel and complex domain, it is clear that using an agent-based approach offers great potential for improving organizational capabilities in the future. Our multi-disciplinary research team has worked closely with one of the UK's top ten retailers to collect data and build an understanding of shop-floor operations and the key actors in a department (customers, staff, and managers). Based on this case study we have built and tested our first version of a retail branch agent-based simulation model where we have focused on how we can simulate the effects of people management practices on customer satisfaction and sales. In our experiments we hav...

  11. Forecasting wind-driven wildfires using an inverse modelling approach

    Directory of Open Access Journals (Sweden)

    O. Rios

    2014-06-01

    Full Text Available A technology able to rapidly forecast wildfire dynamics would lead to a paradigm shift in the response to emergencies, providing the Fire Service with essential information about the ongoing fire. This paper presents and explores a novel methodology to forecast wildfire dynamics in wind-driven conditions, using real-time data assimilation and inverse modelling. The forecasting algorithm combines Rothermel's rate of spread theory with a perimeter expansion model based on Huygens' principle, and solves the optimisation problem with a tangent linear approach and forward automatic differentiation. Its potential is investigated using synthetic data and evaluated in different wildfire scenarios. The results show the capacity of the method to quickly predict the location of the fire front with a positive lead time (ahead of the event) of the order of 10 min for a spatial scale of 100 m. The greatest strengths of our method are lightness, speed and flexibility. We specifically tailor the forecast to be efficient and computationally cheap so it can be used in mobile systems for field deployment and operativeness. Thus, we put emphasis on producing a positive lead time and the means to maximise it.
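
    The inverse-modelling idea (estimate spread parameters from observed front positions, then run the model forward to gain lead time) can be reduced to a 1-D caricature: fit a constant rate of spread by least squares and extrapolate. This is a drastic simplification of the paper's perimeter-level assimilation, intended only to show the invert-then-forecast loop.

```python
def fit_spread_rate(times, positions):
    """Least-squares inversion of a constant rate of spread R and
    offset s0 from observed head-fire positions, assuming the
    simplistic forward model s(t) = s0 + R*t."""
    n = len(times)
    mt, mp = sum(times) / n, sum(positions) / n
    R = (sum((t - mt) * (p - mp) for t, p in zip(times, positions))
         / sum((t - mt) ** 2 for t in times))
    s0 = mp - R * mt
    return s0, R

def forecast_front(s0, R, t):
    """Run the fitted model forward to a future time t."""
    return s0 + R * t

# hypothetical observations: front positions (m) at times (min)
times = [0.0, 5.0, 10.0, 15.0]
positions = [2.0, 17.0, 32.0, 47.0]
s0, R = fit_spread_rate(times, positions)
```

    With noise-free synthetic observations the inversion recovers the generating parameters exactly, which mirrors how the authors validate their algorithm against synthetic data before real scenarios.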

  12. Automated model integration at source code level: An approach for implementing models into the NASA Land Information System

    Science.gov (United States)

    Wang, S.; Peters-Lidard, C. D.; Mocko, D. M.; Kumar, S.; Nearing, G. S.; Arsenault, K. R.; Geiger, J. V.

    2014-12-01

    Model integration bridges the data flow between modeling frameworks and models. However, models usually do not fit directly into a particular modeling environment if they were not designed for it. An example is implementing different types of models into the NASA Land Information System (LIS), a software framework for land-surface modeling and data assimilation. Model implementation requires scientific knowledge and software expertise, and it may take a developer months to learn LIS and the model's software structure. Debugging and testing of the model implementation are also time-consuming when LIS or the model is not fully understood. This time spent is costly for research and operational projects. To address this issue, an approach has been developed to automate model integration into LIS. With this in mind, a general model interface was designed to retrieve the forcing inputs, parameters, and state variables needed by the model, and to provide state variables and outputs back to LIS. Every model can be wrapped to comply with the interface, usually with a FORTRAN 90 subroutine. Development efforts need only knowledge of the model and basic programming skills. With such wrappers, the logic is the same for implementing all models. Code templates defined for this general model interface can be re-used with any specific model, so the model implementation can be done automatically. An automated model implementation toolkit was developed with Microsoft Excel and its built-in VBA language. It allows model specifications in three worksheets and contains FORTRAN 90 code templates in VBA programs. According to the model specification, the toolkit generates data structures and procedures within FORTRAN modules and subroutines, which transfer data between LIS and the model wrapper. Model implementation is standardized, and about 80-90% of the development load is reduced. In this presentation, the automated model implementation approach is described along with LIS programming
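
    The template-driven generation step can be sketched in miniature: render a FORTRAN 90 wrapper subroutine from a model specification. The toolkit itself is Excel/VBA and emits far richer data structures; the template, subroutine name and variable list below are entirely hypothetical.

```python
WRAPPER_TEMPLATE = """\
subroutine {name}_wrapper(n, {args})
  ! auto-generated glue: passes framework data to the model routine
  implicit none
  integer, intent(in) :: n
  real, dimension(n) :: {args}
  call {name}({args})
end subroutine {name}_wrapper
"""

def generate_wrapper(spec):
    """Render a FORTRAN 90 wrapper subroutine from a model
    specification dict, in the spirit of the toolkit's template-driven
    code generation (the real toolkit reads its specification from
    three Excel worksheets)."""
    return WRAPPER_TEMPLATE.format(name=spec["name"],
                                   args=", ".join(spec["variables"]))

src = generate_wrapper({"name": "mymodel",
                        "variables": ["swdown", "smc"]})
```

    Because the template is fixed and only names are substituted, the generated glue is identical in structure for every model, which is exactly what makes the implementation step automatable.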

  13. Thermoplasmonics modeling: A Green's function approach

    Science.gov (United States)

    Baffou, Guillaume; Quidant, Romain; Girard, Christian

    2010-10-01

    We extend the discrete dipole approximation (DDA) and the Green's dyadic tensor (GDT) methods, previously dedicated to all-optical simulations, to investigate the thermodynamics of illuminated plasmonic nanostructures. This extension is based on the use of the thermal Green's function and an original algorithm that we named Laplace matrix inversion. It allows for the computation of the steady-state temperature distribution throughout plasmonic systems. This hybrid photothermal numerical method is suited to investigate arbitrarily complex structures. It can take into account the presence of a dielectric planar substrate and is simple to implement in any DDA or GDT code. Using this numerical framework, different applications are discussed, such as thermal collective effects in nanoparticle assemblies, the influence of a substrate on the temperature distribution, and heat generation in a plasmonic nanoantenna. This numerical approach appears particularly suited for new applications in physics, chemistry, and biology such as plasmon-induced nanochemistry and catalysis, nanofluidics, photothermal cancer therapy, or phase-transition control at the nanoscale.
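
    The superposition principle behind the thermal Green's function can be shown in a few lines: the steady-state temperature rise from point-like heat sources in a homogeneous medium is the sum of 1/(4πκd) contributions. This sketch ignores the substrate images and the self-consistent absorption that the paper's Laplace-matrix-inversion algorithm handles; the water-like conductivity is an assumed default.

```python
import math

def temperature_rise(r, sources, kappa=0.6):
    """Steady-state temperature rise (K) at point r (3-tuple, metres)
    from point heat sources given as [(position, power_in_watts), ...],
    superposing the free-space thermal Green's function
    G(r, r') = 1 / (4*pi*kappa*|r - r'|).
    kappa: host thermal conductivity in W/(m K); 0.6 is roughly water.
    Substrate effects and self-consistent heating are ignored here."""
    return sum(q / (4 * math.pi * kappa * math.dist(r, pos))
               for pos, q in sources)
```

    Linearity of the heat equation is what makes this superposition valid, and it is also why collective heating in nanoparticle assemblies (many sources adding up) emerges naturally from such a formulation.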

  14. Coupling approaches used in atmospheric entry models

    Science.gov (United States)

    Gritsevich, M. I.

    2012-09-01

    While a planet orbits the Sun, it is subject to impact by smaller objects, ranging from tiny dust particles and space debris to much larger asteroids and comets. Such collisions have taken place frequently over geological time and played an important role in the evolution of planets and the development of life on the Earth. Though the search for near-Earth objects addresses one of the main points of the Asteroid and Comet Hazard, one should not underestimate the useful information to be gleaned from smaller atmospheric encounters, known as meteors or fireballs. Not only do these events help determine the linkages between meteorites and their parent bodies; due to their relative regularity they provide a good statistical basis for analysis. For successful cases with found meteorites, the detailed atmospheric path record is an excellent tool to test and improve existing entry models, assuring the robustness of their implementation. There are many more important scientific questions meteoroids help us to answer, among them: Where do these objects come from, and what are their origins, physical properties and chemical composition? What are the shapes and bulk densities of the space objects which fully ablate in an atmosphere and do not reach the planetary surface? Which values are directly measured and which are initially assumed as input to various models? How can both fragmentation and ablation effects be coupled in the model, taking the real size distribution of fragments into account? How can the recovery of a recently fallen meteorite be specified and sped up, without letting weathering affect the samples too much? How big is the pre-atmospheric projectile to terminal body ratio in terms of their mass/volume? Which exact parameters besides the initial mass define this ratio? More generally, how does an entering object affect the Earth's atmosphere and (if applicable) the Earth's surface? How can these impact consequences be predicted based on atmospheric trajectory data? How to describe atmospheric entry

  15. Applied Regression Modeling A Business Approach

    CERN Document Server

    Pardoe, Iain

    2012-01-01

    An applied and concise treatment of statistical regression techniques for business students and professionals who have little or no background in calculus. Regression analysis is an invaluable statistical methodology in business settings and is vital to model the relationship between a response variable and one or more predictor variables, as well as the prediction of a response value given values of the predictors. In view of the inherent uncertainty of business processes, such as the volatility of consumer spending and the presence of market uncertainty, business professionals use regression a

  16. Bayesian Approach to Neuro-Rough Models for Modelling HIV

    CERN Document Server

    Marwala, Tshilidzi

    2007-01-01

    This paper proposes a new neuro-rough model for modelling the risk of HIV from demographic data. The model is formulated using a Bayesian framework and trained using the Markov chain Monte Carlo method and the Metropolis criterion. When the model was tested to estimate the risk of HIV infection given the demographic data, it was found to give an accuracy of 62%, as opposed to 58% obtained from a Bayesian-formulated rough set model trained using Markov chain Monte Carlo and 62% obtained from a Bayesian-formulated multi-layered perceptron (MLP) model trained using hybrid Monte Carlo. The proposed model is able to combine the accuracy of the Bayesian MLP model and the transparency of the Bayesian rough set model.
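
    The training machinery referred to here, Markov chain Monte Carlo with the Metropolis criterion, reduces in its simplest form to a random-walk sampler over a log-posterior. The sketch below targets a generic 1-D density, not the paper's neuro-rough posterior; step scale and seed are arbitrary choices.

```python
import math
import random

def metropolis(log_post, x0, steps, scale=0.5, seed=0):
    """Random-walk Metropolis sampler: propose x' ~ Normal(x, scale)
    and accept via the Metropolis criterion
    accept with probability min(1, p(x')/p(x)), evaluated in logs."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    chain = []
    for _ in range(steps):
        prop = x + rng.gauss(0.0, scale)
        lp_prop = log_post(prop)
        if rng.random() < math.exp(min(0.0, lp_prop - lp)):
            x, lp = prop, lp_prop          # accept the proposal
        chain.append(x)                    # rejected moves repeat x
    return chain
```

    Sampling a standard normal target and checking the chain's mean and variance is the usual smoke test before pointing such a sampler at a real model posterior.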

  17. Development of a computationally efficient urban modeling approach

    DEFF Research Database (Denmark)

    Wolfs, Vincent; Murla, Damian; Ntegeka, Victor;

    2016-01-01

    This paper presents a parsimonious and data-driven modelling approach to simulate urban floods. Flood levels simulated by detailed 1D-2D hydrodynamic models can be emulated using the presented conceptual modelling approach with a very short calculation time. In addition, the model detail can be adjusted, allowing the modeller to focus on flood-prone locations. This results in efficiently parameterized models that can be tailored to applications. The simulated flood levels are transformed into flood extent maps using a high-resolution (0.5 m) digital terrain model in GIS. To illustrate the developed methodology, a case study for the city of Ghent in Belgium is elaborated. The configured conceptual model mimics the flood levels of a detailed 1D-2D hydrodynamic InfoWorks ICM model accurately, while the calculation time is about a factor of 10^6 shorter than that of the original highly detailed model.

  18. Implicit moral evaluations: A multinomial modeling approach.

    Science.gov (United States)

    Cameron, C Daryl; Payne, B Keith; Sinnott-Armstrong, Walter; Scheffer, Julian A; Inzlicht, Michael

    2017-01-01

    Implicit moral evaluations-i.e., immediate, unintentional assessments of the wrongness of actions or persons-play a central role in supporting moral behavior in everyday life. Yet little research has employed methods that rigorously measure individual differences in implicit moral evaluations. In five experiments, we develop a new sequential priming measure-the Moral Categorization Task-and a multinomial model that decomposes judgment on this task into multiple component processes. These include implicit moral evaluations of moral transgression primes (Unintentional Judgment), accurate moral judgments about target actions (Intentional Judgment), and a directional tendency to judge actions as morally wrong (Response Bias). Speeded response deadlines reduced Intentional Judgment but not Unintentional Judgment (Experiment 1). Unintentional Judgment was stronger toward moral transgression primes than non-moral negative primes (Experiments 2-4). Intentional Judgment was associated with increased error-related negativity, a neurophysiological indicator of behavioral control (Experiment 4). Finally, people who voted for an anti-gay marriage amendment had stronger Unintentional Judgment toward gay marriage primes (Experiment 5). Across Experiments 1-4, implicit moral evaluations converged with moral personality: Unintentional Judgment about wrong primes, but not negative primes, was negatively associated with psychopathic tendencies and positively associated with moral identity and guilt proneness. Theoretical and practical applications of formal modeling for moral psychology are discussed. Copyright © 2016 Elsevier B.V. All rights reserved.
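
    The component processes named in the abstract combine as branches of a processing tree. The parameterisation below is our illustrative simplification, not the paper's exact equations: Intentional Judgment (I) answers correctly about the target; failing that, the prime's Unintentional Judgment (U) intrudes; failing both, Response Bias (B) decides.

```python
def p_wrong(I, U, B, target_is_wrong):
    """Predicted probability of a 'morally wrong' response under a
    simplified multinomial processing tree.

    With probability I, Intentional Judgment yields the correct answer
    for the target action; otherwise, with probability U, the prime's
    Unintentional Judgment drives a 'wrong' response; otherwise
    Response Bias B applies.  Illustrative parameterisation only."""
    intentional = 1.0 if target_is_wrong else 0.0
    return I * intentional + (1 - I) * (U + (1 - U) * B)
```

    Fitting such a model means choosing I, U and B so the predicted probabilities match observed response frequencies across conditions; the speeded-deadline result then corresponds to a drop in I with U left intact.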

  19. Continuous Molecular Fields Approach Applied to Structure-Activity Modeling

    CERN Document Server

    Baskin, Igor I

    2013-01-01

    The Method of Continuous Molecular Fields is a universal approach to predict various properties of chemical compounds, in which molecules are represented by means of continuous fields (such as electrostatic, steric, electron density functions, etc). The essence of the proposed approach consists in performing statistical analysis of functional molecular data by means of joint application of kernel machine learning methods and special kernels which compare molecules by computing overlap integrals of their molecular fields. This approach is an alternative to traditional methods of building 3D structure-activity and structure-property models based on the use of fixed sets of molecular descriptors. The methodology of the approach is described in this chapter, followed by its application to building regression 3D-QSAR models and conducting virtual screening based on one-class classification models. The main directions of the further development of this approach are outlined at the end of the chapter.
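
    The core mechanism, a kernel that compares molecules by the overlap integral of their continuous fields, has a closed form when each atom contributes a Gaussian. The sketch below uses a single unweighted Gaussian field per atom; the real method uses electrostatic, steric and other fields with kernel machines on top, so treat this purely as the bare overlap-integral idea.

```python
import math

def field_overlap_kernel(mol_a, mol_b, alpha=1.0):
    """Kernel between two molecules, each represented as a continuous
    field f(r) = sum over atoms of exp(-alpha*|r - atom|^2).

    The kernel is the overlap integral int f_A(r)*f_B(r) d^3r, which
    for Gaussian contributions reduces to the closed form
    (pi/(2*alpha))^(3/2) * exp(-alpha*|a - b|^2 / 2) per atom pair."""
    pref = (math.pi / (2.0 * alpha)) ** 1.5
    return sum(pref * math.exp(-alpha * math.dist(a, b) ** 2 / 2.0)
               for a in mol_a for b in mol_b)
```

    Because it is a sum of products of integrals, the kernel is symmetric and positive semi-definite, which is what lets it plug directly into kernel machine learning methods as the abstract describes.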

  20. An Efficient Role Specification Management Model for Highly Distributed Environments

    Directory of Open Access Journals (Sweden)

    Soomi Yang

    2006-07-01

    Full Text Available Highly distributed environments, such as pervasive computing environments without global or broad control, need a different attribute certificate management technique. For efficient role-based access control using attribute certificates, we use a technique of structuring role specification certificates. It can provide more flexible and secure collaborating environments. Roles are grouped and arranged into a relation tree. This reduces the management cost and the overhead incurred when changing the specification of a role. Furthermore, we cache frequently used role specification certificates for better performance when a role is applied. Tree-structured role specification yields secure and efficient role renewal and distribution, and caching of role specifications speeds up the application of a role. For scalable distribution of role specification certificates, we use multicast packets. The performance gain of structuring role specification certificates is also quantified while taking packet loss into account. In the experimental section, it is shown that role updating and distribution are secure and efficient.
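
    The tree-plus-cache idea can be sketched with an in-memory stand-in: permissions accumulate up a role relation tree, and a memoisation cache plays the part of caching frequently used role specification certificates. The role names and permissions are entirely hypothetical, and a real system would have to invalidate the cache when a specification is renewed.

```python
from functools import lru_cache

# hypothetical role relation tree: child role -> parent role
ROLE_PARENT = {"employee": None, "engineer": "employee", "admin": "engineer"}
# permissions stated directly in each role's specification
ROLE_PERMS = {"employee": {"read"}, "engineer": {"commit"}, "admin": {"deploy"}}

@lru_cache(maxsize=None)
def effective_permissions(role):
    """Effective permissions of a role: its own permissions plus those
    inherited along the role relation tree.  The lru_cache stands in
    for caching frequently used role specification certificates; call
    effective_permissions.cache_clear() when a specification changes."""
    perms = set(ROLE_PERMS.get(role, ()))
    parent = ROLE_PARENT.get(role)
    if parent:
        perms |= effective_permissions(parent)
    return frozenset(perms)
```

    Structuring roles as a tree means changing one specification only touches that node and its descendants' cached results, which mirrors the reduced management overhead the abstract claims.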

  1. A forward modeling approach for interpreting impeller flow logs.

    Science.gov (United States)

    Parker, Alison H; West, L Jared; Odling, Noelle E; Bown, Richard T

    2010-01-01

    A rigorous and practical approach for interpretation of impeller flow log data to determine vertical variations in hydraulic conductivity is presented and applied to two well logs from a Chalk aquifer in England. Impeller flow logging involves measuring vertical flow speed in a pumped well and using changes in flow with depth to infer the locations and magnitudes of inflows into the well. However, the measured flow logs are typically noisy, which leads to spurious hydraulic conductivity values where simplistic interpretation approaches are applied. In this study, a new method for interpretation is presented, which first defines a series of physical models for hydraulic conductivity variation with depth and then fits the models to the data, using a regression technique. Some of the models will be rejected as they are physically unrealistic. The best model is then selected from the remaining models using a maximum likelihood approach. This balances model complexity against fit, for example, using Akaike's Information Criterion.
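
    The model-selection step, fit a family of candidate physical models and pick one by balancing complexity against fit, can be shown with two toy candidates (a constant and a linear trend) scored by Akaike's Information Criterion. This is our schematic of the selection logic, not the authors' hydraulic-conductivity models.

```python
import math

def aic(rss, n, k):
    """Akaike's Information Criterion for a least-squares fit with k
    parameters: n*ln(RSS/n) + 2k (additive constants dropped); lower
    is better, and the 2k term penalises model complexity."""
    return n * math.log(max(rss, 1e-12) / n) + 2 * k

def fit_constant(xs, ys):
    m = sum(ys) / len(ys)
    return sum((y - m) ** 2 for y in ys), 1

def fit_linear(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return sum((y - a - b * x) ** 2 for x, y in zip(xs, ys)), 2

def select_model(xs, ys):
    """Fit each candidate model and keep the one with the lowest AIC,
    balancing goodness of fit against the number of parameters."""
    fits = {"constant": fit_constant(xs, ys), "linear": fit_linear(xs, ys)}
    return min(fits, key=lambda m: aic(fits[m][0], len(xs), fits[m][1]))
```

    On trending data the linear model wins despite its extra parameter; on flat noisy data the complexity penalty tips the choice back to the constant model, which is exactly the trade-off the abstract describes.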

  2. An Adaptive Approach to Schema Classification for Data Warehouse Modeling

    Institute of Scientific and Technical Information of China (English)

    Hong-Ding Wang; Yun-Hai Tong; Shao-Hua Tan; Shi-Wei Tang; Dong-Qing Yang; Guo-Hui Sun

    2007-01-01

    Data warehouse (DW) modeling is a complicated task, involving both knowledge of business processes and familiarity with the structure and behavior of operational information systems. Existing DW modeling techniques suffer from the following major drawbacks: the data-driven approach requires high levels of expertise and neglects the requirements of end users, while the demand-driven approach lacks enterprise-wide vision and disregards existing models of the underlying operational systems. In order to make up for those shortcomings, a method of classifying schema elements for DW modeling is proposed in this paper. We first put forward vector space models for subjects and schema elements, then present an adaptive approach with self-tuning theory to construct context vectors of subjects, and finally classify the source schema elements into the different subjects of the DW automatically. Benefiting from the result of the schema element classification, designers can model and construct a DW more easily.
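
    The final classification step in a vector space model amounts to assigning each schema element to the subject whose context vector it is most similar to. The sketch below uses cosine similarity and hypothetical two-subject vectors; the paper's adaptive self-tuning of the context vectors is not reproduced.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def classify_element(element_vec, subject_vecs):
    """Assign a source schema element to the DW subject whose context
    vector is most similar (vector space model, cosine similarity)."""
    return max(subject_vecs,
               key=lambda s: cosine(element_vec, subject_vecs[s]))
```

    In the full method the subject context vectors would be tuned adaptively from the data, so elements that initially fall near a boundary get reclassified as the vectors sharpen.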

  3. A Networks Approach to Modeling Enzymatic Reactions.

    Science.gov (United States)

    Imhof, P

    2016-01-01

    Modeling enzymatic reactions is a demanding task due to the complexity of the system, the many degrees of freedom involved and the complex, chemical, and conformational transitions associated with the reaction. Consequently, enzymatic reactions are not determined by precisely one reaction pathway. Hence, it is beneficial to obtain a comprehensive picture of possible reaction paths and competing mechanisms. By combining individually generated intermediate states and chemical transition steps a network of such pathways can be constructed. Transition networks are a discretized representation of a potential energy landscape consisting of a multitude of reaction pathways connecting the end states of the reaction. The graph structure of the network allows an easy identification of the energetically most favorable pathways as well as a number of alternative routes.
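
    The graph-search step, identifying the energetically most favorable pathway through a transition network, can be sketched with a minimax variant of Dijkstra's algorithm: among all routes from reactant to product, find one whose highest barrier (the rate-limiting step) is minimal. The tiny network below, with states R, I1, I2, P and barriers in arbitrary energy units, is hypothetical.

```python
import heapq

def most_favorable_path(graph, start, end):
    """Minimax path search over a transition network.

    graph[u][v] is the barrier of the elementary transition u -> v.
    Returns (barrier, path) minimising the *maximum* edge barrier
    along the path, i.e. the most favorable rate-limiting step."""
    heap = [(0.0, start, [start])]
    settled = {}
    while heap:
        barrier, node, path = heapq.heappop(heap)
        if node == end:
            return barrier, path
        if node in settled and settled[node] <= barrier:
            continue
        settled[node] = barrier
        for nbr, e in graph.get(node, {}).items():
            heapq.heappush(heap, (max(barrier, e), nbr, path + [nbr]))
    return None

# hypothetical network: reactant R, intermediates I1/I2, product P
net = {"R": {"I1": 5.0, "I2": 20.0}, "I1": {"P": 8.0}, "I2": {"P": 1.0}}
barrier, path = most_favorable_path(net, "R", "P")
```

    Here the route through I1 wins even though its second step is higher than I2's, because the route through I2 must first cross the 20-unit barrier; this is the kind of competing-mechanism comparison the network representation makes easy.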

  4. Modeling Approaches for Describing Microbial Population Heterogeneity

    DEFF Research Database (Denmark)

    Lencastre Fernandes, Rita

    Although microbial populations are typically described by averaged properties, individual cells present a certain degree of variability. Indeed, initially clonal microbial populations develop into heterogeneous populations, even when growing in a homogeneous environment. A heterogeneous microbial... (...) to predict distributions of certain population properties including particle size, mass or volume, and molecular weight. Similarly, PBM allow for a mathematical description of distributed cell properties within microbial populations. Cell total protein content distributions (a measure of cell mass) have been... (...) ethanol and biomass throughout the reactor. This work has proven that the integration of CFD and population balance models, for describing the growth of a microbial population in a spatially heterogeneous reactor, is feasible, and that valuable insight on the interplay between flow and the dynamics...

  5. Hamiltonian approach to hybrid plasma models

    CERN Document Server

    Tronci, Cesare

    2010-01-01

    The Hamiltonian structures of several hybrid kinetic-fluid models are identified explicitly, upon considering collisionless Vlasov dynamics for the hot particles interacting with a bulk fluid. After presenting different pressure-coupling schemes for an ordinary fluid interacting with a hot gas, the paper extends the treatment to account for a fluid plasma interacting with an energetic ion species. Both current-coupling and pressure-coupling MHD schemes are treated extensively. In particular, pressure-coupling schemes are shown to require a transport-like term in the Vlasov kinetic equation, in order for the Hamiltonian structure to be preserved. The last part of the paper is devoted to studying the more general case of an energetic ion species interacting with a neutralizing electron background (hybrid Hall-MHD). Circulation laws and Casimir functionals are presented explicitly in each case.

  6. Patient-Specific Predictive Modeling Using Random Forests: An Observational Study for the Critically Ill

    Science.gov (United States)

    2017-01-01

    Background With a large-scale electronic health record repository, it is feasible to build a customized patient outcome prediction model specifically for a given patient. This approach involves identifying past patients who are similar to the present patient and using their data to train a personalized predictive model. Our previous work investigated a cosine-similarity patient similarity metric (PSM) for such patient-specific predictive modeling. Objective The objective of the study is to investigate the random forest (RF) proximity measure as a PSM in the context of personalized mortality prediction for intensive care unit (ICU) patients. Methods A total of 17,152 ICU admissions were extracted from the Multiparameter Intelligent Monitoring in Intensive Care II database. A number of predictor variables were extracted from the first 24 hours in the ICU. Outcome to be predicted was 30-day mortality. A patient-specific predictive model was trained for each ICU admission using an RF PSM inspired by the RF proximity measure. Death counting, logistic regression, decision tree, and RF models were studied with a hard threshold applied to RF PSM values to only include the M most similar patients in model training, where M was varied. In addition, case-specific random forests (CSRFs), which uses RF proximity for weighted bootstrapping, were trained. Results Compared to our previous study that investigated a cosine similarity PSM, the RF PSM resulted in superior or comparable predictive performance. RF and CSRF exhibited the best performances (in terms of mean area under the receiver operating characteristic curve [95% confidence interval], RF: 0.839 [0.835-0.844]; CSRF: 0.832 [0.821-0.843]). RF and CSRF did not benefit from personalization via the use of the RF PSM, while the other models did. Conclusions The RF PSM led to good mortality prediction performance for several predictive models, although it failed to induce improved performance in RF and CSRF. The distinction
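
The RF proximity idea can be sketched on synthetic data (the features, cohort size M, and the "death counting" personalized model below are illustrative choices, not the study's exact pipeline): the proximity of two patients is the fraction of trees in which they land in the same leaf, and only the M most similar past patients contribute to the personalized prediction.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic stand-in for ICU data: two features, binary 30-day mortality.
X = rng.normal(size=(300, 2))
y = (X[:, 0] + 0.5 * rng.normal(size=300) > 0).astype(int)

# Fit a random forest, then define the proximity between two patients as
# the fraction of trees in which both fall into the same leaf.
forest = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
leaves = forest.apply(X)                     # (n_samples, n_trees) leaf ids

def rf_proximity(index_patient):
    return (leaves == leaves[index_patient]).mean(axis=1)

# Personalized prediction for patient 0 by death counting: the mortality
# rate among the M most similar past patients under the RF proximity PSM.
M = 100
prox = rf_proximity(0)
similar = np.argsort(prox)[::-1][1 : M + 1]  # drop the patient itself
risk = y[similar].mean()
```

Any of the other studied models (logistic regression, decision tree, RF) could be trained on the same M-patient cohort in place of death counting.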

  7. Assessing the accuracy of subject-specific, muscle-model parameters determined by optimizing to match isometric strength.

    Science.gov (United States)

    DeSmitt, Holly J; Domire, Zachary J

    2016-12-01

    Biomechanical models are sensitive to the choice of model parameters. Therefore, determination of accurate subject specific model parameters is important. One approach to generate these parameters is to optimize the values such that the model output will match experimentally measured strength curves. This approach is attractive as it is inexpensive and should provide an excellent match to experimentally measured strength. However, given the problem of muscle redundancy, it is not clear that this approach generates accurate individual muscle forces. The purpose of this investigation is to evaluate this approach using simulated data to enable a direct comparison. It is hypothesized that the optimization approach will be able to recreate accurate muscle model parameters when information from measurable parameters is given. A model of isometric knee extension was developed to simulate a strength curve across a range of knee angles. In order to realistically recreate experimentally measured strength, random noise was added to the modeled strength. Parameters were solved for using a genetic search algorithm. When noise was added to the measurements the strength curve was reasonably recreated. However, the individual muscle model parameters and force curves were far less accurate. Based upon this examination, it is clear that very different sets of model parameters can recreate similar strength curves. Therefore, experimental variation in strength measurements has a significant influence on the results. Given the difficulty in accurately recreating individual muscle parameters, it may be more appropriate to perform simulations with lumped actuators representing similar muscles.

  8. Modeling of phase equilibria with CPA using the homomorph approach

    DEFF Research Database (Denmark)

    Breil, Martin Peter; Tsivintzelis, Ioannis; Kontogeorgis, Georgios

    2011-01-01

    For association models, like CPA and SAFT, a classical approach is often used for estimating pure-compound and mixture parameters. According to this approach, the pure-compound parameters are estimated from vapor pressure and liquid density data. Then, the binary interaction parameters, kij, are ...

  9. A Constructive Neural-Network Approach to Modeling Psychological Development

    Science.gov (United States)

    Shultz, Thomas R.

    2012-01-01

    This article reviews a particular computational modeling approach to the study of psychological development--that of constructive neural networks. This approach is applied to a variety of developmental domains and issues, including Piagetian tasks, shift learning, language acquisition, number comparison, habituation of visual attention, concept…

  11. Pattern-based approach for logical traffic isolation forensic modelling

    CSIR Research Space (South Africa)

    Dlamini, I

    2009-08-01

    Full Text Available The use of design patterns usually changes the approach of software design and makes software development relatively easy. This paper extends work on a forensic model for Logical Traffic Isolation (LTI) based on Differentiated Services (Diff...

  12. A semantic-web approach for modeling computing infrastructures

    NARCIS (Netherlands)

    M. Ghijsen; J. van der Ham; P. Grosso; C. Dumitru; H. Zhu; Z. Zhao; C. de Laat

    2013-01-01

    This paper describes our approach to modeling computing infrastructures. Our main contribution is the Infrastructure and Network Description Language (INDL) ontology. The aim of INDL is to provide technology independent descriptions of computing infrastructures, including the physical resources as w

  13. Bayesian approach to decompression sickness model parameter estimation.

    Science.gov (United States)

    Howle, L E; Weber, P W; Nichols, J M

    2017-03-01

    We examine both maximum likelihood and Bayesian approaches for estimating probabilistic decompression sickness model parameters. Maximum likelihood estimation treats parameters as fixed values and determines the best estimate through repeated trials, whereas the Bayesian approach treats parameters as random variables and determines the parameter probability distributions. We would ultimately like to know the probability that a parameter lies in a certain range rather than simply make statements about the repeatability of our estimator. Although both represent powerful methods of inference, for models with complex or multi-peaked likelihoods, maximum likelihood parameter estimates can prove more difficult to interpret than the estimates of the parameter distributions provided by the Bayesian approach. For models of decompression sickness, we show that while these two estimation methods are complementary, the credible intervals generated by the Bayesian approach are more naturally suited to quantifying uncertainty in the model parameters.
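
The contrast between the two estimation styles can be sketched on a toy one-parameter risk model (the simulated data and flat prior are invented for illustration): maximum likelihood returns a single best-fit value, while the Bayesian posterior is a full distribution over the parameter from which a credible interval is read off directly.

```python
import numpy as np

# Toy stand-in for a decompression-sickness risk model: a single
# parameter p (per-dive DCS probability) estimated from binary outcomes.
rng = np.random.default_rng(1)
outcomes = rng.random(200) < 0.1          # simulated dives, true p = 0.1
k, n = outcomes.sum(), outcomes.size

# Maximum likelihood: one point estimate.
p_mle = k / n

# Bayesian: a posterior over p (flat prior, evaluated on a grid).
grid = np.linspace(1e-4, 1 - 1e-4, 2000)
log_post = k * np.log(grid) + (n - k) * np.log(1 - grid)
post = np.exp(log_post - log_post.max())
post /= post.sum()

# 95% credible interval from the posterior cumulative distribution.
cdf = np.cumsum(post)
lo = grid[np.searchsorted(cdf, 0.025)]
hi = grid[np.searchsorted(cdf, 0.975)]
```

The statement "p lies in [lo, hi] with 95% probability" is exactly the kind of claim the abstract notes is unavailable from repeated-trial maximum likelihood alone.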

  14. Vehicle-specific emissions modeling based upon on-road measurements.

    Science.gov (United States)

    Frey, H Christopher; Zhang, Kaishan; Rouphail, Nagui M

    2010-05-01

    Vehicle-specific microscale fuel use and emissions rate models are developed based upon real-world hot-stabilized tailpipe measurements made using a portable emissions measurement system. Consecutive averaging periods of one to three multiples of the response time are used to compare two semiempirical physically based modeling schemes. One scheme is based on internally observable variables (IOVs), such as engine speed and manifold absolute pressure, while the other is based on externally observable variables (EOVs), such as speed, acceleration, and road grade. For NO, HC, and CO emission rates, the average R² ranged from 0.41 to 0.66 for the former and from 0.17 to 0.30 for the latter. The EOV models have R² for CO₂ of 0.43 to 0.79 versus 0.99 for the IOV models. The models are sensitive to episodic events in driving cycles such as high acceleration. Intervehicle and fleet-average modeling approaches are compared; the former accounts for microscale variations that might be useful for some types of assessments. EOV-based models have practical value for traffic management or simulation applications since IOVs usually are not available or not used for emission estimation.
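
The IOV-versus-EOV comparison can be mimicked on synthetic data (the variable relationships and coefficients below are invented; the study used real portable-emissions measurements): when fuel use is driven by internal engine variables, a regression on IOVs explains more of the variance than one on EOVs.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500

# Synthetic second-by-second driving data (illustrative only).
speed = rng.uniform(0, 30, n)                        # m/s, externally observable
accel = rng.normal(0, 1, n)                          # m/s^2, externally observable
rpm = 800 + 120 * speed + rng.normal(0, 200, n)      # engine speed, internal
map_kpa = 30 + 0.8 * np.clip(accel, 0, None) * speed + rng.normal(0, 3, n)

# Fuel use driven mostly by the internal engine variables.
fuel = 0.002 * rpm + 0.05 * map_kpa + rng.normal(0, 0.3, n)

def r_squared(X, y_obs):
    """R² of an ordinary least-squares fit with intercept."""
    X1 = np.column_stack([np.ones(len(y_obs)), X])
    beta, *_ = np.linalg.lstsq(X1, y_obs, rcond=None)
    resid = y_obs - X1 @ beta
    return 1.0 - resid.var() / y_obs.var()

r2_iov = r_squared(np.column_stack([rpm, map_kpa]), fuel)   # internal variables
r2_eov = r_squared(np.column_stack([speed, accel]), fuel)   # external variables
```

The gap between `r2_iov` and `r2_eov` mirrors the abstract's finding that IOV models fit better, while EOV models remain useful when engine data are unavailable.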

  15. The Demand-Control Model: Specific demands, specific Control, and well-defined groups

    NARCIS (Netherlands)

    Jonge, J. de; Dollard, M.F.; Dormann, C.; Blanc, P.M.; Houtman, I.L.D.

    2000-01-01

    The purpose of this study was to test the Demand-Control Model (DCM), accompanied by three goals. Firstly, we used alternative, more focused, and multifaceted measures of both job demands and job control that are relevant and applicable to today's working contexts. Secondly, this study intended to

  16. Predicting the functions and specificity of triterpenoid synthases: a mechanism-based multi-intermediate docking approach.

    Directory of Open Access Journals (Sweden)

    Bo-Xue Tian

    2014-10-01

    Full Text Available Terpenoid synthases construct the carbon skeletons of tens of thousands of natural products. To predict functions and specificity of triterpenoid synthases, a mechanism-based, multi-intermediate docking approach is proposed. In addition to enzyme function prediction, other potential applications of the current approach, such as enzyme mechanistic studies and enzyme redesign by mutagenesis, are discussed.

  17. Functional state modelling approach validation for yeast and bacteria cultivations

    Science.gov (United States)

    Roeva, Olympia; Pencheva, Tania

    2014-01-01

    In this paper, the functional state modelling approach is validated for modelling the cultivation of two different microorganisms: yeast (Saccharomyces cerevisiae) and bacteria (Escherichia coli). Based on the available experimental data for these fed-batch cultivation processes, three different functional states are distinguished, namely the primary product synthesis state, the mixed oxidative state and the secondary product synthesis state. Parameter identification procedures for the different local models are performed using genetic algorithms. The simulation results show a high degree of adequacy of the models describing these functional states for both the S. cerevisiae and E. coli cultivations. Thus, the local models are validated for the cultivation of both microorganisms. This is a strong structural verification of the functional state modelling theory not only for a set of yeast cultivations, but also for a bacteria cultivation. As such, the obtained results demonstrate the efficiency and efficacy of the functional state modelling approach. PMID:26740778
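
The genetic-algorithm parameter identification step can be sketched for a single local model (a Monod-type specific growth rate with invented "true" parameters; the paper's local models and GA settings differ): candidate parameter pairs are evolved by truncation selection, averaging crossover, and Gaussian mutation to minimize the squared error against the data.

```python
import random

random.seed(0)

# Synthetic "cultivation" data from a Monod specific-growth-rate model
# mu(S) = mu_max * S / (Ks + S); true parameters chosen for illustration.
MU_MAX, KS = 0.5, 0.2
substrate = [0.05 * i for i in range(1, 21)]
observed = [MU_MAX * s / (KS + s) for s in substrate]

def fitness(params):
    """Sum of squared errors of the candidate model against the data."""
    mu_max, ks = params
    return sum((mu_max * s / (ks + s) - o) ** 2
               for s, o in zip(substrate, observed))

def evolve(generations=120, pop_size=40):
    pop = [(random.uniform(0, 1), random.uniform(0.01, 1))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[: pop_size // 4]            # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            child = ((a[0] + b[0]) / 2 + random.gauss(0, 0.02),  # crossover
                     max(1e-3, (a[1] + b[1]) / 2 + random.gauss(0, 0.02)))
            children.append(child)
        pop = survivors + children
    return min(pop, key=fitness)

mu_max_fit, ks_fit = evolve()
```

In the paper one such identification is run per functional state, yielding one local model per state.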

  18. An optimization approach to kinetic model reduction for combustion chemistry

    CERN Document Server

    Lebiedz, Dirk

    2013-01-01

    Model reduction methods are relevant when the computation time of a full convection-diffusion-reaction simulation based on detailed chemical reaction mechanisms is too large. In this article, we review a model reduction approach based on optimization of trajectories and show its applicability to realistic combustion models. As most model reduction methods, it identifies points on a slow invariant manifold based on time scale separation in the dynamics of the reaction system. The numerical approximation of points on the manifold is achieved by solving a semi-infinite optimization problem, where the dynamics enter the problem as constraints. The proof of existence of a solution for an arbitrarily chosen dimension of the reduced model (slow manifold) is extended to the case of realistic combustion models including thermochemistry by considering the properties of proper maps. The model reduction approach is finally applied to three models based on realistic reaction mechanisms: 1. ozone decomposition as a small t...

  20. Molecular Modeling Approach to Cardiovascular Disease Targetting

    Directory of Open Access Journals (Sweden)

    Chandra Sekhar Akula,

    2010-05-01

    Full Text Available Cardiovascular disease, including stroke, is the leading cause of illness and death in India. A number of studies have shown that inflammation of blood vessels is one of the major factors that increase the incidence of heart disease, including arteriosclerosis (clogging of the arteries), stroke and myocardial infarction (heart attack). Studies have associated obesity and other components of the metabolic syndrome, cardiovascular risk factors, with low-grade inflammation. Furthermore, some findings suggest that drugs commonly prescribed to lower cholesterol also reduce this inflammation, suggesting an additional beneficial effect of the statins. The recent development of angiotensin II (AngII) receptor antagonists has significantly improved the tolerability profile of this group of drugs while maintaining high clinical efficacy. ACE2 is expressed predominantly in the endothelium and in renal tubular epithelium, and it thus may be an important new cardiovascular target. In the present study we modeled the structure of ACE and designed an inhibitor using ArgusLab; the drug molecule was validated on the basis of its QSAR properties and CAChe for this protein through CADD.

  1. Data Analysis A Model Comparison Approach, Second Edition

    CERN Document Server

    Judd, Charles M; Ryan, Carey S

    2008-01-01

    This completely rewritten classic text features many new examples, insights and topics including mediational, categorical, and multilevel models. Substantially reorganized, this edition provides a briefer, more streamlined examination of data analysis. Noted for its model-comparison approach and unified framework based on the general linear model, the book provides readers with a greater understanding of a variety of statistical procedures. This consistent framework, including consistent vocabulary and notation, is used throughout to develop fewer but more powerful model building techniques. T

  2. Modelling and Generating Ajax Applications: A Model-Driven Approach

    NARCIS (Netherlands)

    Gharavi, V.; Mesbah, A.; Van Deursen, A.

    2008-01-01

    Preprint of paper published in: IWWOST 2008 - 7th International Workshop on Web-Oriented Software Technologies, 14-15 July 2008 AJAX is a promising and rapidly evolving approach for building highly interactive web applications. In AJAX, user interface components and the event-based interaction betw

  4. A novel approach to modeling and diagnosing the cardiovascular system

    Energy Technology Data Exchange (ETDEWEB)

    Keller, P.E.; Kangas, L.J.; Hashem, S.; Kouzes, R.T. [Pacific Northwest Lab., Richland, WA (United States); Allen, P.A. [Life Link, Richland, WA (United States)

    1995-07-01

    A novel approach to modeling and diagnosing the cardiovascular system is introduced. A model exhibits a subset of the dynamics of the cardiovascular behavior of an individual by using a recurrent artificial neural network. Potentially, a model will be incorporated into a cardiovascular diagnostic system. This approach is unique in that each cardiovascular model is developed from physiological measurements of an individual. Any differences between the modeled variables and the variables of an individual at a given time are used for diagnosis. This approach also exploits sensor fusion to optimize the utilization of biomedical sensors. The advantage of sensor fusion has been demonstrated in applications including control and diagnostics of mechanical and chemical processes.

  5. Internal Models Support Specific Gaits in Orthotic Devices

    DEFF Research Database (Denmark)

    Matthias Braun, Jan; Wörgötter, Florentin; Manoonpong, Poramate

    2014-01-01

    Patients use orthoses and prostheses for the lower limbs to support and enable movements that they cannot perform themselves, or can perform only with difficulty. Because traditional devices support only a limited set of movements, patients are restricted in their mobility. A possible approach to overcome...

  6. Gratifications Sought and Obtained: Model Specification and Theoretical Development.

    Science.gov (United States)

    Wenner, Lawrence A.

    Uses of the gratifications sought and gratifications obtained distinction in explanations of media effects have taken two conceptually distinct forms. The discrepancy approach poses that the difference between what is sought and what is actually obtained, expressed effectively as a discrepancy score, significantly aids effects explanations. The…

  7. A systems approach to predict oncometabolites via context-specific genome-scale metabolic networks.

    Directory of Open Access Journals (Sweden)

    Hojung Nam

    2014-09-01

    Full Text Available Altered metabolism in cancer cells has been viewed as a passive response required for a malignant transformation. However, this view has changed through the recently described metabolic oncogenic factors: mutated isocitrate dehydrogenase (IDH), succinate dehydrogenase (SDH), and fumarate hydratase (FH), which produce oncometabolites that competitively inhibit epigenetic regulation. In this study, we demonstrate in silico predictions of oncometabolites that have the potential to dysregulate epigenetic controls in nine types of cancer by incorporating massive-scale genetic mutation information (collected from more than 1,700 cancer genomes), expression profiling data, and deploying Recon 2 to reconstruct context-specific genome-scale metabolic models. Our analysis predicted 15 compounds and 24 substructures of potential oncometabolites that could result from the loss-of-function and gain-of-function mutations of metabolic enzymes, respectively. These results suggest a substantial potential for discovering unidentified oncometabolites in various forms of cancer.

  8. Solvation and electrostatic model for specific electrolyte adsorption

    Energy Technology Data Exchange (ETDEWEB)

    Sahai, N.; Sverjensky, D.A. [Johns Hopkins Univ., Baltimore, MD (United States)

    1997-07-01

    A solvation and electrostatic model has been developed for estimating electrolyte adsorption from physical and chemical properties of the system, consistent with the triple-layer model. The model is calibrated on experimental surface titration data for ten oxides and hydroxides in ten electrolytes over a range of ionic strengths from 0.001 M to 2.9 M. 77 refs., 7 figs., 4 tabs.

  9. One Approach for Dynamic L-lysine Modelling of Repeated Fed-batch Fermentation

    Directory of Open Access Journals (Sweden)

    Kalin Todorov

    2007-03-01

    Full Text Available This article deals with the establishment of a dynamic unstructured model of a variable-volume fed-batch fermentation process with intensive droppings for L-lysine production. The presented approach includes the following main procedures: description of the process by generalized stoichiometric equations; preliminary data processing and calculation of the specific rates for the main kinetic variables; identification of the specific rates as second-order non-linear dynamic models; establishment and optimisation of a dynamic model of the process; and simulation studies. MATLAB is used as the research environment.

  10. A model-based combinatorial optimisation approach for energy-efficient processing of microalgae

    NARCIS (Netherlands)

    Slegers, P.M.; Koetzier, B.J.; Fasaei, F.; Wijffels, R.H.; Straten, van G.; Boxtel, van A.J.B.

    2014-01-01

    The analyses of algae biorefinery performance are commonly based on fixed performance data for each processing step. In this work, we demonstrate a model-based combinatorial approach to derive the design-specific upstream energy consumption and biodiesel yield in the production of biodiesel from mic

  11. Teaching Higher Order Thinking in the Introductory MIS Course: A Model-Directed Approach

    Science.gov (United States)

    Wang, Shouhong; Wang, Hai

    2011-01-01

    One vision of education evolution is to change the modes of thinking of students. Critical thinking, design thinking, and system thinking are higher order thinking paradigms that are specifically pertinent to business education. A model-directed approach to teaching and learning higher order thinking is proposed. An example of application of the…

  12. Comparison of data-driven and model-driven approaches to brightness temperature diurnal cycle interpolation

    CSIR Research Space (South Africa)

    Van den Bergh, F

    2006-01-01

    Full Text Available RKHS model for the first experiment, MSE = (0.5363, 0.7331). The motivation for this approach was that the amount of computation per cycle would be reduced significantly. The specific example in Figure 4 shows the RKHS model—initially fitted to cycle...

  15. A Model-Based Approach to Support Validation of Medical Cyber-Physical Systems

    Directory of Open Access Journals (Sweden)

    Lenardo C. Silva

    2015-10-01

    Full Text Available Medical Cyber-Physical Systems (MCPS are context-aware, life-critical systems with patient safety as the main concern, demanding rigorous processes for validation to guarantee user requirement compliance and specification-oriented correctness. In this article, we propose a model-based approach for early validation of MCPS, focusing on promoting reusability and productivity. It enables system developers to build MCPS formal models based on a library of patient and medical device models, and simulate the MCPS to identify undesirable behaviors at design time. Our approach has been applied to three different clinical scenarios to evaluate its reusability potential for different contexts. We have also validated our approach through an empirical evaluation with developers to assess productivity and reusability. Finally, our models have been formally verified considering functional and safety requirements and model coverage.

  16. Model-centric approaches for the development of health information systems.

    Science.gov (United States)

    Tuomainen, Mika; Mykkänen, Juha; Luostarinen, Heli; Pöyhölä, Assi; Paakkanen, Esa

    2007-01-01

    Modeling is used increasingly in healthcare to increase shared knowledge, to improve processes, and to document the requirements of solutions related to health information systems (HIS). There are numerous modeling approaches that support these aims, but a careful assessment of their strengths, weaknesses and deficiencies is needed. In this paper, we compare three model-centric approaches in the context of HIS development: the Model-Driven Architecture, Business Process Modeling with BPMN and BPEL, and the HL7 Development Framework. The comparison reveals that all these approaches are viable candidates for the development of HIS. However, they have distinct strengths and abstraction levels, they require local and project-specific adaptation, and they offer varying levels of automation. In addition, illustration of the solutions to the end users must be improved.

  18. Do Fencers Require a Weapon-Specific Approach to Strength and Conditioning Training?

    Science.gov (United States)

    Turner, Anthony N; Bishop, Chris J; Cree, Jon A; Edwards, Michael L; Chavda, Shyam; Read, Paul J; Kirby, David M J

    2017-06-01

    There are 3 types of weapons used in Olympic fencing: the épée, foil, and sabre. The aim of this study was to determine if fencers exhibited different physical characteristics across weapons. Seventy-nine male (n = 46) and female (n = 33) national standard fencers took part in this study. Fencers from each weapon (male and female), i.e., épée (n = 19 and 10), foil (n = 22 and 14), and sabre (n = 13 and 10), were (mean ± SD) 15.9 ± 0.7 years of age, 178.5 ± 7.9 cm tall, 67.4 ± 12.2 kg in mass and had 6.3 ± 2.3 years fencing experience; all were in regular training (∼4 times per week). Results revealed that across all performance tests (lower-body power, reactive strength index, change-of-direction speed, and repeat lunge ability), there was no significant main effect for weapon in male fencers (p = 0.63) or female fencers (p = 0.232), but a significant main effect for gender. These results suggest that fencers do not require a weapon-specific approach to strength and conditioning training. Each fencer should target the area they are weakest at, rather than an area that they feel best represents the unique demands of their weapon.

  19. Mathematical models for therapeutic approaches to control HIV disease transmission

    CERN Document Server

    Roy, Priti Kumar

    2015-01-01

    The book discusses different therapeutic approaches based on different mathematical models to control the HIV/AIDS disease transmission. It uses clinical data, collected from different cited sources, to formulate the deterministic as well as stochastic mathematical models of HIV/AIDS. It provides complementary approaches, from deterministic and stochastic points of view, to optimal control strategy with perfect drug adherence and also tries to seek viewpoints of the same issue from different angles with various mathematical models to computer simulations. The book presents essential methods and techniques for students who are interested in designing epidemiological models on HIV/AIDS. It also guides research scientists, working in the periphery of mathematical modeling, and helps them to explore a hypothetical method by examining its consequences in the form of a mathematical modelling and making some scientific predictions. The model equations, mathematical analysis and several numerical simulations that are...

  20. Asteroid modeling for testing spacecraft approach and landing.

    Science.gov (United States)

    Martin, Iain; Parkes, Steve; Dunstan, Martin; Rowell, Nick

    2014-01-01

    Spacecraft exploration of asteroids presents autonomous-navigation challenges that can be aided by virtual models to test and develop guidance and hazard-avoidance systems. Researchers have extended and applied graphics techniques to create high-resolution asteroid models to simulate cameras and other spacecraft sensors approaching and descending toward asteroids. A scalable model structure with evenly spaced vertices simplifies terrain modeling, avoids distortion at the poles, and enables triangle-strip definition for efficient rendering. To create the base asteroid models, this approach uses two-phase Poisson faulting and Perlin noise. It creates realistic asteroid surfaces by adding both crater models adapted from lunar terrain simulation and multiresolution boulders. The researchers evaluated the virtual asteroids by comparing them with real asteroid images, examining the slope distributions, and applying a surface-relative feature-tracking algorithm to the models.
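The "Poisson faulting" step mentioned above can be illustrated with a toy sketch (not the authors' code): each fault picks a random plane through the sphere's centre and raises one hemisphere of vertices while lowering the other, which after many faults produces fractal-looking relief without polar distortion. The Fibonacci-lattice sampling and all parameter values below are illustrative assumptions.

```python
import numpy as np

def fault_sphere(n_points=2000, n_faults=500, delta=0.001, seed=42):
    """Toy fault-formation terrain on a unit sphere: every fault displaces
    the radii on one side of a random plane up and the other side down."""
    rng = np.random.default_rng(seed)
    # Quasi-evenly spaced points on the sphere via the Fibonacci lattice.
    i = np.arange(n_points)
    phi = np.pi * (3.0 - np.sqrt(5.0)) * i         # golden-angle longitude
    z = 1.0 - 2.0 * (i + 0.5) / n_points           # even latitude bands
    r_xy = np.sqrt(1.0 - z * z)
    pts = np.stack([r_xy * np.cos(phi), r_xy * np.sin(phi), z], axis=1)

    radii = np.ones(n_points)
    for _ in range(n_faults):
        normal = rng.normal(size=3)
        normal /= np.linalg.norm(normal)
        side = pts @ normal > 0.0                  # which hemisphere?
        radii[side] += delta
        radii[~side] -= delta
    return pts * radii[:, None]                    # displaced vertices

surface = fault_sphere()
```

Crater models and multiresolution boulders would then be layered on top of this base shape.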

  1. Heuristic approaches to models and modeling in systems biology

    NARCIS (Netherlands)

    MacLeod, Miles

    2016-01-01

    Prediction and control sufficient for reliable medical and other interventions are prominent aims of modeling in systems biology. The short-term attainment of these goals has played a strong role in projecting the importance and value of the field. In this paper I identify the standard models must m

  2. A New Detection Approach Based on the Maximum Entropy Model

    Institute of Scientific and Technical Information of China (English)

    DONG Xiaomei; XIANG Guang; YU Ge; LI Xiaohua

    2006-01-01

    The maximum entropy model was introduced and a new intrusion detection approach based on the maximum entropy model was proposed. The vector space model was adopted for data presentation. The minimal entropy partitioning method was utilized for attribute discretization. Experiments on the KDD CUP 1999 standard data set were designed and the experimental results were shown. The receiver operating characteristic (ROC) curve analysis approach was utilized to analyze the experimental results. The analysis results show that the proposed approach is comparable to those based on support vector machines (SVM) and outperforms those based on C4.5 and Naive Bayes classifiers. According to the overall evaluation result, the proposed approach is a little better than those based on SVM.
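The minimal entropy partitioning used for attribute discretization can be sketched as a single-level split: scan candidate thresholds and keep the one that minimises the weighted class entropy of the two sides. This is a toy illustration with invented data, not the paper's implementation (which is applied across the KDD CUP 1999 attributes, typically recursively).

```python
import math
from collections import Counter

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def best_cut(values, labels):
    """Return the threshold minimising the weighted class entropy of the split."""
    pairs = sorted(zip(values, labels))
    xs = [v for v, _ in pairs]
    ys = [l for _, l in pairs]
    n = len(pairs)
    best = (entropy(ys), None)                 # (weighted entropy, cut point)
    for i in range(1, n):
        if xs[i] == xs[i - 1]:
            continue                           # no cut between tied values
        w = (i * entropy(ys[:i]) + (n - i) * entropy(ys[i:])) / n
        if w < best[0]:
            best = (w, (xs[i - 1] + xs[i]) / 2.0)
    return best[1]

# Perfectly separable feature: small values 'normal', large values 'attack'.
cut = best_cut([0.1, 0.2, 0.3, 5.0, 5.1, 5.2],
               ['normal'] * 3 + ['attack'] * 3)   # midpoint between 0.3 and 5.0
```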

  3. Patient Specific Modeling of Head-Up Tilt

    DEFF Research Database (Denmark)

    Williams, Nakeya; Wright, Andrew; Mehlsen, Jesper;

    2014-01-01

    blood pressure. The model contains five compartments representing arteries and veins in the upper and lower body of the systemic circulation, as well as the left ventricle facilitating pumping of the heart. A physiologically based sub-model describes gravitational effects on pooling of blood during...

  4. LEXICAL APPROACH IN TEACHING TURKISH: A COLLOCATIONAL STUDY MODEL

    Directory of Open Access Journals (Sweden)

    Eser ÖRDEM

    2013-06-01

    This study intends to propose the Lexical Approach (Lewis, 1998, 2002; Harwood, 2002) and a model for teaching Turkish as a foreign language, so that this model can be used in classroom settings. This model was created by the researcher as a result of studies carried out in applied linguistics (Hill, 2000) and memory (Murphy, 2004). Since one of the main problems of foreign language learners is retrieving what they have learnt, Lewis (1998) and Wray (2008) assume that the lexical approach offers an alternative solution to this problem. Unlike the grammar translation method, this approach supports the idea that language is composed not of general grammar but of strings of words and word combinations. In addition, the lexical approach posits that each word has its own grammatical properties, and therefore each dictionary is a potential grammar book. Foreign language learners can learn to use collocations, a basic principle of the Lexical Approach, and thus increase their level of retention. The concept of the retrieval clue (Murphy, 2004) is considered the main element in this collocational study model, because the main purpose of the model is to boost fluency and help learners gain native-like accuracy while producing the target language. Keywords: Foreign language teaching, lexical approach, collocations, retrieval clue

  5. Coalgebraic specifications and models of deterministic hybrid systems

    NARCIS (Netherlands)

    Jacobs, B.P.F.

    1996-01-01

    Coalgebraic specification and semantics, as used earlier for object-oriented programming, is extended with temporal aspects. The (non-temporal) expression ``s.meth'' expressing that method ``meth'' is applied in state s, is extended to an expression ``s.metha'', where a is a time parameter. It means

  6. Consequence Based Design. An approach for integrating computational collaborative models (Integrated Dynamic Models) in the building design phase

    DEFF Research Database (Denmark)

    Negendahl, Kristoffer

    affect the design process and collaboration between building designers and simulationists. Within the limits of applying the approach of Consequence based design to five case studies, followed by documentation based on interviews, surveys and project related documentations derived from internal reports...... that secures validity and quality assurance with a simulationist while sustaining autonomous control of building design with the building designer. Consequence based design is defined by the specific use of integrated dynamic models. These models include the parametric capabilities of a visual programming tool...... relies on various advancements in the area of integrated dynamic models. It also relies on the application and test of the approach in practice to evaluate the Consequence based design and the use of integrated dynamic models. As a result, the Consequence based design approach has been applied in five...

  7. Evaluation of site-specific lateral inclusion zone for vapor intrusion based on an analytical approach.

    Science.gov (United States)

    Yao, Yijun; Wu, Yun; Tang, Mengling; Wang, Yue; Wang, Jianjin; Suuberg, Eric M; Jiang, Lin; Liu, Jing

    2015-11-15

    In 2002, U.S. EPA proposed a general buffer zone of approximately 100 feet (30 m) laterally to determine which buildings to include in vapor intrusion (VI) investigations. However, this screening distance can be threatened by factors such as extensive surface pavements. Under such circumstances, EPA recommended investigating soil vapor migration distance on a site-specific basis. To serve this purpose, we present an analytical model (AAMLPH) as an alternative to estimate lateral VI screening distances at chlorinated compound-contaminated sites. Based on a previously introduced model (AAML), AAMLPH is developed by considering the effects of impervious surface cover and soil geology heterogeneities, providing predictions consistent with the three-dimensional (3-D) numerical simulated results. By employing risk-based and contribution-based screening levels of subslab concentrations (50 and 500 μg/m(3), respectively) and source-to-subslab attenuation factor (0.001 and 0.01, respectively), AAMLPH suggests that buildings greater than 30 m from a plume boundary can still be affected by VI in the presence of any two of the three factors, which are high source vapor concentration, shallow source and significant surface cover. This finding justifies the concern that EPA has expressed about the application of the 30 m lateral separation distance in the presence of physical barriers (e.g., asphalt covers or ice) at the ground surface.

  8. Specification Search for Identifying the Correct Mean Trajectory in Polynomial Latent Growth Models

    Science.gov (United States)

    Kim, Minjung; Kwok, Oi-Man; Yoon, Myeongsun; Willson, Victor; Lai, Mark H. C.

    2016-01-01

    This study investigated the optimal strategy for model specification search under the latent growth modeling (LGM) framework, specifically on searching for the correct polynomial mean or average growth model when there is no a priori hypothesized model in the absence of theory. In this simulation study, the effectiveness of different starting…

  9. Child human model development: a hybrid validation approach

    NARCIS (Netherlands)

    Forbes, P.A.; Rooij, L. van; Rodarius, C.; Crandall, J.

    2008-01-01

    The current study presents a development and validation approach of a child human body model that will help understand child impact injuries and improve the biofidelity of child anthropometric test devices. Due to the lack of fundamental child biomechanical data needed to fully develop such models a

  10. Modeling Alaska boreal forests with a controlled trend surface approach

    Science.gov (United States)

    Mo Zhou; Jingjing Liang

    2012-01-01

    An approach of Controlled Trend Surface was proposed to simultaneously take into consideration large-scale spatial trends and nonspatial effects. A geospatial model of the Alaska boreal forest was developed from 446 permanent sample plots, which addressed large-scale spatial trends in recruitment, diameter growth, and mortality. The model was tested on two sets of...

  11. Teaching Service Modelling to a Mixed Class: An Integrated Approach

    Science.gov (United States)

    Deng, Jeremiah D.; Purvis, Martin K.

    2015-01-01

    Service modelling has become an increasingly important area in today's telecommunications and information systems practice. We have adapted a Network Design course in order to teach service modelling to a mixed class of both the telecommunication engineering and information systems backgrounds. An integrated approach engaging mathematics teaching…

  12. Gray-box modelling approach for description of storage tunnel

    DEFF Research Database (Denmark)

    Harremoës, Poul; Carstensen, Jacob

    1999-01-01

    The dynamics of a storage tunnel is examined using a model based on on-line measured data and a combination of simple deterministic and black-box stochastic elements. This approach, called gray-box modeling, is a new promising methodology for giving an on-line state description of sewer systems...

  13. Child human model development: a hybrid validation approach

    NARCIS (Netherlands)

    Forbes, P.A.; Rooij, L. van; Rodarius, C.; Crandall, J.

    2008-01-01

    The current study presents a development and validation approach of a child human body model that will help understand child impact injuries and improve the biofidelity of child anthropometric test devices. Due to the lack of fundamental child biomechanical data needed to fully develop such models a

  14. Refining the Committee Approach and Uncertainty Prediction in Hydrological Modelling

    NARCIS (Netherlands)

    Kayastha, N.

    2014-01-01

    Due to the complexity of hydrological systems a single model may be unable to capture the full range of a catchment response and accurately predict the streamflows. The multi-modelling approach opens up possibilities for handling such difficulties and allows improving the predictive capability of mode

  15. Refining the committee approach and uncertainty prediction in hydrological modelling

    NARCIS (Netherlands)

    Kayastha, N.

    2014-01-01

    Due to the complexity of hydrological systems a single model may be unable to capture the full range of a catchment response and accurately predict the streamflows. The multi-modelling approach opens up possibilities for handling such difficulties and allows improving the predictive capability of mode

  16. Hybrid continuum-atomistic approach to model electrokinetics in nanofluidics

    Energy Technology Data Exchange (ETDEWEB)

    Amani, Ehsan, E-mail: eamani@aut.ac.ir; Movahed, Saeid, E-mail: smovahed@aut.ac.ir

    2016-06-07

    In this study, for the first time, a hybrid continuum-atomistic model is proposed for electrokinetics, electroosmosis and electrophoresis, through nanochannels. Although continuum based methods are accurate enough to model fluid flow and electric potential in nanofluidics (in dimensions larger than 4 nm), ionic concentration is too low in nanochannels for the continuum assumption to be valid. On the other hand, non-continuum based approaches are too time-consuming and are therefore limited to simple geometries in practice. Here, to propose an efficient hybrid continuum-atomistic method of modelling the electrokinetics in nanochannels, the fluid flow and electric potential are computed based on the continuum hypothesis, coupled with an atomistic Lagrangian approach for the ionic transport. The results of the model are compared to and validated by the results of the molecular dynamics technique for a couple of case studies. Then, the influences of bulk ionic concentration, external electric field, size of nanochannel, and surface electric charge on the electrokinetic flow and ionic mass transfer are investigated carefully. The hybrid continuum-atomistic method is a promising approach to model more complicated geometries and investigate more details of the electrokinetics in nanofluidics. - Highlights: • A hybrid continuum-atomistic model is proposed for electrokinetics in nanochannels. • The model is validated by molecular dynamics. • This is a promising approach to model more complicated geometries and physics.

  17. Modelling diversity in building occupant behaviour: a novel statistical approach

    DEFF Research Database (Denmark)

    Haldi, Frédéric; Calì, Davide; Andersen, Rune Korsholm

    2016-01-01

    We propose an advanced modelling framework to predict the scope and effects of behavioural diversity regarding building occupant actions on window openings, shading devices and lighting. We develop a statistical approach based on generalised linear mixed models to account for the longitudinal nat...

  18. Asteroid fragmentation approaches for modeling atmospheric energy deposition

    Science.gov (United States)

    Register, Paul J.; Mathias, Donovan L.; Wheeler, Lorien F.

    2017-03-01

    During asteroid entry, energy is deposited in the atmosphere through thermal ablation and momentum-loss due to aerodynamic drag. Analytic models of asteroid entry and breakup physics are used to compute the energy deposition, which can then be compared against measured light curves and used to estimate ground damage due to airburst events. This work assesses and compares energy deposition results from four existing approaches to asteroid breakup modeling, and presents a new model that combines key elements of those approaches. The existing approaches considered include a liquid drop or "pancake" model where the object is treated as a single deforming body, and a set of discrete fragment models where the object breaks progressively into individual fragments. The new model incorporates both independent fragments and aggregate debris clouds to represent a broader range of fragmentation behaviors and reproduce more detailed light curve features. All five models are used to estimate the energy deposition rate versus altitude for the Chelyabinsk meteor impact, and results are compared with an observationally derived energy deposition curve. Comparisons show that four of the five approaches are able to match the overall observed energy deposition profile, but the features of the combined model are needed to better replicate both the primary and secondary peaks of the Chelyabinsk curve.
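As background, the single-body baseline that these fragmentation models extend can be sketched by integrating the standard drag and ablation equations through an exponential atmosphere and binning the kinetic energy lost per kilometre of altitude. All parameter values below are illustrative assumptions, not the paper's, and drag and ablation losses are lumped into one "deposited" term.

```python
import math

def entry_energy_deposition(m=1.2e7, v=19e3, h=60e3, theta=math.radians(18),
                            density=3300.0, Cd=1.0, Ch=0.1, Q=8e6, dt=0.01):
    """Single-body (no fragmentation) meteor entry: explicit-Euler
    integration of drag and ablation, recording joules lost per km."""
    rho0, H = 1.225, 7160.0                  # sea-level density, scale height
    deposition = {}                          # altitude bin (km) -> joules
    while h > 0.0 and m > 1.0 and v > 100.0:
        rho_a = rho0 * math.exp(-h / H)
        radius = (3.0 * m / (4.0 * math.pi * density)) ** (1.0 / 3.0)
        area = math.pi * radius ** 2
        e0 = 0.5 * m * v * v
        v -= Cd * rho_a * area * v * v / (2.0 * m) * dt                # drag
        m = max(m - Ch * rho_a * area * v ** 3 / (2.0 * Q) * dt, 0.0)  # ablation
        h -= v * math.sin(theta) * dt
        km = int(h // 1000)
        deposition[km] = deposition.get(km, 0.0) + e0 - 0.5 * m * v * v
    return deposition

curve = entry_energy_deposition()
peak_km = max(curve, key=curve.get)          # altitude bin of peak deposition
```

The pancake and discrete-fragment models discussed above modify the effective area term during breakup, which is what sharpens and shifts the peaks of the deposition curve.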

  19. A Bayesian Approach for Analyzing Longitudinal Structural Equation Models

    Science.gov (United States)

    Song, Xin-Yuan; Lu, Zhao-Hua; Hser, Yih-Ing; Lee, Sik-Yum

    2011-01-01

    This article considers a Bayesian approach for analyzing a longitudinal 2-level nonlinear structural equation model with covariates, and mixed continuous and ordered categorical variables. The first-level model is formulated for measures taken at each time point nested within individuals for investigating their characteristics that are dynamically…

  20. An Empirical-Mathematical Modelling Approach to Upper Secondary Physics

    Science.gov (United States)

    Angell, Carl; Kind, Per Morten; Henriksen, Ellen K.; Guttersrud, Oystein

    2008-01-01

    In this paper we describe a teaching approach focusing on modelling in physics, emphasizing scientific reasoning based on empirical data and using the notion of multiple representations of physical phenomena as a framework. We describe modelling activities from a project (PHYS 21) and relate some experiences from implementation of the modelling…

  1. An Alternative Approach for Nonlinear Latent Variable Models

    Science.gov (United States)

    Mooijaart, Ab; Bentler, Peter M.

    2010-01-01

    In the last decades there has been an increasing interest in nonlinear latent variable models. Since the seminal paper of Kenny and Judd, several methods have been proposed for dealing with these kinds of models. This article introduces an alternative approach. The methodology involves fitting some third-order moments in addition to the means and…

  2. Refining the Committee Approach and Uncertainty Prediction in Hydrological Modelling

    NARCIS (Netherlands)

    Kayastha, N.

    2014-01-01

    Due to the complexity of hydrological systems a single model may be unable to capture the full range of a catchment response and accurately predict the streamflows. The multi-modelling approach opens up possibilities for handling such difficulties and allows improving the predictive capability of mode

  3. Refining the committee approach and uncertainty prediction in hydrological modelling

    NARCIS (Netherlands)

    Kayastha, N.

    2014-01-01

    Due to the complexity of hydrological systems a single model may be unable to capture the full range of a catchment response and accurately predict the streamflows. The multi-modelling approach opens up possibilities for handling such difficulties and allows improving the predictive capability of mode

  4. Positive Mathematical Programming Approaches – Recent Developments in Literature and Applied Modelling

    Directory of Open Access Journals (Sweden)

    Thomas Heckelei

    2012-05-01

    This paper reviews and discusses the more recent literature and application of Positive Mathematical Programming (PMP) in the context of agricultural supply models. Specifically, advances in the empirical foundation of parameter specifications as well as the economic rationalisation of PMP models – both criticized in earlier reviews – are investigated. Moreover, the paper provides an overview of a larger set of models with regular/repeated policy application that apply variants of PMP. Results show that most applications today avoid arbitrary parameter specifications and rely on exogenous information on supply responses to calibrate model parameters. However, only few approaches use multiple observations to estimate parameters, which is likely due to the still considerable technical challenges associated with it. Equally, we found only limited reflection on the behavioral or technological assumptions that could rationalise the PMP model structure while still keeping the model’s advantages.
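The calibration idea at the heart of PMP can be reduced to a stylised sketch: choose a quadratic cost term for each activity so that the model's first-order conditions reproduce the observed activity levels exactly. The numbers below are invented, and the LP dual-value phase of full PMP is collapsed into the gross margins, so this is only the arithmetic skeleton of the method.

```python
def calibrate_pmp(margins, observed):
    """Stylised PMP calibration. With per-activity profit
        profit_i(x) = margin_i * x - 0.5 * d_i * x**2,
    the unconstrained maximum sits at x* = margin_i / d_i, so setting
    d_i = margin_i / xbar_i makes the model reproduce the observed level."""
    d = [m / x for m, x in zip(margins, observed)]
    reproduced = [m / di for m, di in zip(margins, d)]   # model's optimum
    return d, reproduced

# Illustrative gross margins (per ha) and observed crop areas (ha).
d, x_star = calibrate_pmp(margins=[420.0, 380.0, 250.0],
                          observed=[120.0, 80.0, 40.0])
```

The "arbitrary parameter specification" criticised in the review is precisely this step; the surveyed advances replace it with exogenous supply-response information or estimation over multiple observations.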

  5. Specification of a STEP Based Reference Model for Exchange of Robotics Models

    DEFF Research Database (Denmark)

    Haenisch, Jochen; Kroszynski, Uri; Ludwig, Arnold

    ESPRIT Project 6457: "Interoperability of Standards for Robotics in CIME" (InterRob) belongs to the Subprogram "Computer Integrated Manufacturing and Engineering" of ESPRIT, the European Specific Programme for Research and Development in Information Technology supported by the European Commission....... InterRob aims to develop an integrated solution to precision manufacturing by combining product data and database technologies with robotic off-line programming and simulation. Benefits arise from the use of high level simulation tools and developing standards for the exchange of product model data...... combining geometric, dynamic, process and robot specific data. The growing need for accurate information about manufacturing data (models of robots and other mechanisms) in diverse industrial applications has initiated ESPRIT Project 6457: InterRob. Besides the topics associated with standards for industrial...

  6. New prediction model for probe specificity in an allele-specific extension reaction for haplotype-specific extraction (HSE) of Y chromosome mixtures.

    Science.gov (United States)

    Rothe, Jessica; Watkins, Norman E; Nagy, Marion

    2012-01-01

    Allele-specific extension reactions (ASERs) use 3' terminus-specific primers for the selective extension of completely annealed matches by polymerase. The ability of the polymerase to extend non-specific 3' terminal mismatches leads to a failure of the reaction, a process that is only partly understood and predictable, and often requires time-consuming assay design. In our studies we investigated haplotype-specific extraction (HSE) for the separation of male DNA mixtures. HSE is an ASER and provides the ability to distinguish between diploid chromosomes from one or more individuals. Here, we show that the success of HSE and allele-specific extension depend strongly on the concentration difference between complete match and 3' terminal mismatch. Using the oligonucleotide-modeling platform Visual Omp, we demonstrated the dependency of the discrimination power of the polymerase on match- and mismatch-target hybridization between different probe lengths. Therefore, the probe specificity in HSE could be predicted by performing a relative comparison of different probe designs with their simulated differences between the duplex concentration of target-probe match and mismatches. We tested this new model for probe design in more than 300 HSE reactions with 137 different probes and obtained an accordance of 88%.
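The match-versus-mismatch duplex-concentration comparison underlying this prediction can be illustrated with a minimal two-state hybridisation model: the equilibrium constant follows from the duplex free energy, and the bound fraction from mass action with the probe in excess. The ΔG values and probe concentration below are invented placeholders, not Visual Omp output; a 3' terminal mismatch typically destabilises the duplex by a few kcal/mol.

```python
import math

R = 1.987e-3  # gas constant, kcal / (mol K)

def duplex_fraction(dG, probe_conc, T=333.15):
    """Two-state hybridisation: K = exp(-dG / RT), and with probe in
    excess the fraction of target bound is f = K*C / (1 + K*C)."""
    K = math.exp(-dG / (R * T))
    return K * probe_conc / (1.0 + K * probe_conc)

# Illustrative free energies (kcal/mol) at the extension temperature.
f_match = duplex_fraction(dG=-12.0, probe_conc=1e-6)
f_mismatch = duplex_fraction(dG=-8.0, probe_conc=1e-6)
discrimination = f_match / f_mismatch
```

In the paper's terms, probe designs are ranked by the simulated difference between these match and mismatch duplex concentrations rather than by the absolute values.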

  7. Gray-box modelling approach for description of storage tunnel

    DEFF Research Database (Denmark)

    Harremoës, Poul; Carstensen, Jacob

    1999-01-01

    of the water in the overflow structures. The capacity of a pump draining the storage tunnel is estimated for two different rain events, revealing that the pump was malfunctioning during the first rain event. The proposed modeling approach can be used in automated online surveillance and control and implemented....... The model in the present paper provides on-line information on overflow volumes, pumping capacities, and remaining storage capacities. A linear overflow relation is found, differing significantly from the traditional deterministic modeling approach. The linearity of the formulas is explained by the inertia...

  8. A study of multidimensional modeling approaches for data warehouse

    Science.gov (United States)

    Yusof, Sharmila Mat; Sidi, Fatimah; Ibrahim, Hamidah; Affendey, Lilly Suriani

    2016-08-01

    Data warehouse system is used to support the process of organizational decision making. Hence, the system must extract and integrate information from heterogeneous data sources in order to uncover relevant knowledge suitable for the decision making process. However, the development of a data warehouse is a difficult and complex process, especially in its conceptual design (multidimensional modeling). Thus, there have been various approaches proposed to overcome the difficulty. This study surveys and compares the approaches of multidimensional modeling and highlights the issues, trends and solutions proposed to date. The contribution is on the state of the art of multidimensional modeling design.

  9. Meta-analysis a structural equation modeling approach

    CERN Document Server

    Cheung, Mike W-L

    2015-01-01

    Presents a novel approach to conducting meta-analysis using structural equation modeling. Structural equation modeling (SEM) and meta-analysis are two powerful statistical methods in the educational, social, behavioral, and medical sciences. They are often treated as two unrelated topics in the literature. This book presents a unified framework on analyzing meta-analytic data within the SEM framework, and illustrates how to conduct meta-analysis using the metaSEM package in the R statistical environment. Meta-Analysis: A Structural Equation Modeling Approach begins by introducing the impo

  10. Modeling and Algorithmic Approaches to Constitutively-Complex, Microstructured Fluids

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Gregory H. [Univ. of California, Davis, CA (United States); Forest, Gregory [Univ. of California, Davis, CA (United States)

    2014-05-01

    We present a new multiscale model for complex fluids based on three scales: microscopic, kinetic, and continuum. We choose the microscopic level as Kramers' bead-rod model for polymers, which we describe as a system of stochastic differential equations with an implicit constraint formulation. The associated Fokker-Planck equation is then derived, and adiabatic elimination removes the fast momentum coordinates. Approached in this way, the kinetic level reduces to a dispersive drift equation. The continuum level is modeled with a finite volume Godunov-projection algorithm. We demonstrate computation of viscoelastic stress divergence using this multiscale approach.

  11. Metamodelling Approach and Software Tools for Physical Modelling and Simulation

    Directory of Open Access Journals (Sweden)

    Vitaliy Mezhuyev

    2015-02-01

    In computer science, the metamodelling approach is becoming increasingly popular for software systems development. In this paper, we discuss the applicability of the metamodelling approach to the development of software tools for physical modelling and simulation. To define a metamodel for physical modelling, we analyse existing physical models. The analysis reveals invariant physical structures, which we propose to use as the basic abstractions of the physical metamodel: a system of geometrical objects that allows building the spatial structure of physical models and setting a distribution of physical properties. Different mathematical methods can then be applied to such a geometry of distributed physical properties. To validate the proposed metamodelling approach, we consider the software-tool prototypes we have developed.

  12. Social learning in Models and Cases - an Interdisciplinary Approach

    Science.gov (United States)

    Buhl, Johannes; De Cian, Enrica; Carrara, Samuel; Monetti, Silvia; Berg, Holger

    2016-04-01

    Our paper follows an interdisciplinary understanding of social learning. We contribute to the literature on social learning in transition research by bridging case-oriented research and modelling-oriented transition research. We start by describing selected theories on social learning in innovation, diffusion and transition research, and present theoretical understandings of social learning in techno-economic and agent-based modelling. We then elaborate on empirical research on social learning in transition case studies, identifying and synthesizing key dimensions of social learning found in them. Next, we bridge between the more formal, generalising modelling approaches to social learning processes and the more descriptive, individualising case-study approaches by distilling the case-study analysis into a visual guide to the functional forms of social learning typically identified in the cases. We then vary these functional forms of social learning, as examples, in integrated assessment models. We conclude by drawing the lessons learned from the interdisciplinary approach, both methodologically and empirically.

  13. Learning the Task Management Space of an Aircraft Approach Model

    Science.gov (United States)

    Krall, Joseph; Menzies, Tim; Davies, Misty

    2014-01-01

    Validating models of airspace operations is a particular challenge. These models are often aimed at finding and exploring safety violations, and aim to be accurate representations of real-world behavior. However, the rules governing the behavior are quite complex: nonlinear physics, operational modes, human behavior, and stochastic environmental concerns all determine the responses of the system. In this paper, we present a study on aircraft runway approaches as modeled in Georgia Tech's Work Models that Compute (WMC) simulation. We use a new learner, Genetic-Active Learning for Search-Based Software Engineering (GALE) to discover the Pareto frontiers defined by cognitive structures. These cognitive structures organize the prioritization and assignment of tasks of each pilot during approaches. We discuss the benefits of our approach, and also discuss future work necessary to enable uncertainty quantification.
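The Pareto frontiers that GALE searches for are defined by the usual non-domination test, which can be sketched as follows (toy objective values, not WMC outputs; objectives are minimised):

```python
def pareto_front(points):
    """Return the non-dominated subset of `points`, where each point is a
    tuple of objectives to minimise: q dominates p if q <= p in every
    objective and q != p."""
    return [p for p in points
            if not any(all(qo <= po for qo, po in zip(q, p)) and q != p
                       for q in points)]

# e.g. (workload, deviation-from-profile) pairs for candidate task orderings
candidates = [(1, 5), (2, 2), (5, 1), (3, 3), (4, 4)]
front = pareto_front(candidates)   # (3, 3) and (4, 4) are dominated by (2, 2)
```

GALE's contribution is reaching such frontiers with far fewer model evaluations than exhaustive enumeration, but the dominance criterion itself is the standard one above.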

  14. Formal Specifications and Verification of a Secure Communication Protocol Model

    Institute of Scientific and Technical Information of China (English)

    夏阳; 陆余良; 蒋凡

    2003-01-01

    This paper presents a secure communication protocol model, EABM, by which secure network communication can be realized easily and efficiently. First, the paper gives a thorough analysis of the protocol system, systematic construction and state transition of EABM. Then it describes the channels and the process of state transition of EABM in terms of ESTELLE. Finally, it offers a verification of the accuracy of the EABM model.

  15. Non-additive model for specific heat of electrons

    Science.gov (United States)

    Anselmo, D. H. A. L.; Vasconcelos, M. S.; Silva, R.; Mello, V. D.

    2016-10-01

    By using the non-additive Tsallis entropy we demonstrate numerically that one-dimensional quasicrystals, whose energy spectra are multifractal Cantor sets, are characterized by an entropic parameter, and we calculate the electronic specific heat considering a non-additive entropy S_q. In our method we consider energy spectra calculated using the one-dimensional tight-binding Schrödinger equation, with the bands (or levels) scaled onto the [0, 1] interval. The Tsallis formalism is applied to the energy spectra of Fibonacci and double-period one-dimensional quasiperiodic lattices. We analytically obtain an expression for the specific heat that we consider more appropriate for calculating this quantity in these quasiperiodic structures.
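For reference, the non-additive Tsallis entropy invoked here is, for a discrete distribution over W states with probabilities p_i (k the Boltzmann constant),

```latex
S_q = k \,\frac{1 - \sum_{i=1}^{W} p_i^{\,q}}{q - 1},
\qquad
\lim_{q \to 1} S_q = -k \sum_{i=1}^{W} p_i \ln p_i ,
```

so the Boltzmann-Gibbs entropy is recovered in the limit q → 1, while for q ≠ 1 the entropy of two independent subsystems A and B composes non-additively as S_q(A+B) = S_q(A) + S_q(B) + (1 - q) S_q(A) S_q(B) / k.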

  16. Patient-specific computer modelling of coronary bifurcation stenting: the John Doe programme.

    Science.gov (United States)

    Mortier, Peter; Wentzel, Jolanda J; De Santis, Gianluca; Chiastra, Claudio; Migliavacca, Francesco; De Beule, Matthieu; Louvard, Yves; Dubini, Gabriele

    2015-01-01

    John Doe, an 81-year-old patient with a significant distal left main (LM) stenosis, was treated using a provisional stenting approach. As part of an European Bifurcation Club (EBC) project, the complete stenting procedure was repeated using computational modelling. First, a tailored three-dimensional (3D) reconstruction of the bifurcation anatomy was created by fusion of multislice computed tomography (CT) imaging and intravascular ultrasound. Second, finite element analysis was employed to deploy and post-dilate the stent virtually within the generated patient-specific anatomical bifurcation model. Finally, blood flow was modelled using computational fluid dynamics. This proof-of-concept study demonstrated the feasibility of such patient-specific simulations for bifurcation stenting and has provided unique insights into the bifurcation anatomy, the technical aspects of LM bifurcation stenting, and the positive impact of adequate post-dilatation on blood flow patterns. Potential clinical applications such as virtual trials and preoperative planning seem feasible but require a thorough clinical validation of the predictive power of these computer simulations.

  17. Model-driven design using IEC 61499 a synchronous approach for embedded and automation systems

    CERN Document Server

    Yoong, Li Hsien; Bhatti, Zeeshan E; Kuo, Matthew M Y

    2015-01-01

    This book describes a novel approach for the design of embedded systems and industrial automation systems, using a unified model-driven approach that is applicable in both domains. The authors illustrate their methodology using the IEC 61499 standard as the main vehicle for specification, verification, static timing analysis and automated code synthesis. The well-known synchronous approach serves as the basis for defining an unambiguous semantics that ensures determinism and deadlock freedom. The proposed approach also yields very efficient implementations, either on small-scale embedded devices or on industry-scale programmable automation controllers (PACs), and can be used for both centralized and distributed implementations. Significantly, it can be used without the need for any run-time support. This approach, for the first time, bridges the gap between embedded systems and automation systems and can be applied in wide-ranging applications in automotive, robotics, and industri...
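    The synchronous semantics the book advocates can be caricatured in a few lines: function blocks execute in lock-step "ticks," each block reading a frozen snapshot of the previous tick's outputs before any block emits new ones, so the execution order within a tick cannot change the result. The sketch below is illustrative Python, not IEC 61499 syntax, and the block names are hypothetical.

```python
class FunctionBlock:
    """A tiny synchronous block: reads inputs at the start of a tick,
    emits outputs at the end, so execution order within a tick cannot
    change the result (determinism)."""
    def __init__(self, transfer):
        self.transfer = transfer      # pure function: inputs -> outputs
        self.outputs = {}

    def step(self, inputs):
        self.outputs = self.transfer(inputs)

def run_tick(blocks, wiring, external):
    # Snapshot every block's outputs from the *previous* tick first...
    snapshot = {name: dict(b.outputs) for name, b in blocks.items()}
    # ...then step each block against that frozen snapshot.
    for name, block in blocks.items():
        inputs = dict(external)
        for (src, port), dst_port in wiring.get(name, {}).items():
            inputs[dst_port] = snapshot[src].get(port)
        block.step(inputs)

# Two blocks wired in a feedback loop still execute deterministically,
# because each one reads the other's previous-tick output.
blocks = {
    "counter": FunctionBlock(lambda i: {"n": (i.get("n") or 0) + 1}),
    "double":  FunctionBlock(lambda i: {"y": (i.get("x") or 0) * 2}),
}
wiring = {
    "counter": {("counter", "n"): "n"},   # self-loop: read own previous count
    "double":  {("counter", "n"): "x"},
}
for _ in range(3):
    run_tick(blocks, wiring, external={})
print(blocks["counter"].outputs, blocks["double"].outputs)
```

    After three ticks the counter holds 3 while the doubler, one tick behind by construction, holds 4; reordering the blocks in the loop would not change either value, which is the determinism property the synchronous approach guarantees.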

  18. Life events in bipolar disorder: towards more specific models.

    Science.gov (United States)

    Johnson, Sheri L

    2005-12-01

    This article reviews the evidence concerning life events as a predictor of symptoms within bipolar disorder. First, key methodological issues in this area are described, and criteria used for including studies in this review are defined. Then findings that negative life events predict worse outcomes within bipolar disorder are reviewed. Beyond general studies on relapse, it is important to differentiate predictors of depression from predictors of mania. When severe negative life events occur, they appear to trigger increases in bipolar depression. Nonetheless, many depressions are unrelated to negative life events and appear to be triggered by other variables. The strongest evidence suggests that negative life events do not trigger mania, except perhaps in certain contexts. Retrospective findings for schedule-disrupting life events as a trigger for manic symptoms await further assessment within a longitudinal study. Life events involving goal attainment do appear to trigger manic symptoms. Overall, it is time to differentiate among specific types of life events, as these different forms of events point towards mechanisms linking stressors with symptom expression. These mechanisms provide clues into ways to integrate the social environment with biological vulnerability (see Monroe, S.M., & Johnson, S.L. (1990). The dimensions of life stress and the specificity of disorder. Journal of Applied Social Psychology, 20, 167-1694; Harris, T.O. (1991). Life stress and illness: the question of specificity. Annals of Behavioral Medicine, 13, 211-219).

  19. Specific and generic stem biomass and volume models of tree species in a West African tropical semi-deciduous forest

    DEFF Research Database (Denmark)

    Goussanou, Cédric A.; Guendehou, Sabin; Assogbadjo, Achille E.

    2016-01-01

    The quantification of the contribution of tropical forests to global carbon stocks and climate change mitigation requires the availability of data and tools such as allometric equations. This study made available volume and biomass models for eighteen tree species in a semi-deciduous tropical forest...... in West Africa. Generic models were also developed for the forest ecosystem, and basic wood density was determined for the tree species. A non-destructive sampling approach was carried out on five hundred and one sample trees to analyse stem volume and biomass. From the modelling of volume and biomass...... predictive ability for biomass. Given the tree species richness of tropical forests, the study supported the hypothesis that species-specific models are preferable to generic models, and concluded that further research should be oriented towards the development of specific models to cover the full range...
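    Allometric models of the kind developed in this study are typically power laws, V = a·D^b (V stem volume or biomass, D diameter at breast height), fitted by ordinary least squares after log-transformation. A generic sketch on synthetic data follows; the coefficients are hypothetical and are not those reported by the study.

```python
import math, random

def fit_allometric(diameters, volumes):
    """Fit ln(V) = ln(a) + b * ln(D) by ordinary least squares."""
    xs = [math.log(d) for d in diameters]
    ys = [math.log(v) for v in volumes]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    a = math.exp(my - b * mx)
    return a, b   # so that V is approximately a * D**b

# Synthetic sample trees generated from a known power law plus noise:
random.seed(0)
true_a, true_b = 0.0008, 2.3
d = [random.uniform(10, 60) for _ in range(200)]                      # dbh in cm
v = [true_a * x ** true_b * math.exp(random.gauss(0, 0.05)) for x in d]
a, b = fit_allometric(d, v)
print(round(b, 2))   # close to the true exponent 2.3
```

    Fitting in log space makes the multiplicative error structure additive, which is the usual justification for this transformation in forestry biometrics.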

  20. A methodical approach to performance measurement experiments : measure and measurement specification

    OpenAIRE

    Hoeksema, F.W.; Veen, A.M.; van Beijnum, B.J.F.

    1997-01-01

    This report describes a methodical approach to performance measurement experiments. This approach gives a blueprint for the whole trajectory from the notion of performance measures and how to define them via planning, instrumentation and execution of the experiments to interpretation of the results. The first stage of the approach, Measurement Initialisation, has been worked out completely. It is shown that a well-defined system description allows a procedural approach to defining performance...
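    The report's first stage, Measurement Initialisation, centres on pinning measures down precisely before any instrumentation is built. A hedged sketch of what such a machine-readable measure specification might look like follows; the class, field names, and aggregations are illustrative, not taken from the report.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass(frozen=True)
class MeasureSpec:
    """A performance measure defined up front, before experiments run."""
    name: str
    unit: str
    aggregation: str   # how samples are summarized: "mean" or "p95"

    def evaluate(self, samples):
        if self.aggregation == "mean":
            return mean(samples)
        if self.aggregation == "p95":
            # nearest-rank 95th percentile, clamped to the last sample
            s = sorted(samples)
            return s[min(len(s) - 1, int(0.95 * len(s)))]
        raise ValueError(f"unknown aggregation: {self.aggregation}")

latency = MeasureSpec(name="request_latency", unit="ms", aggregation="p95")
print(latency.evaluate([12, 15, 11, 90, 14, 13, 16, 12, 15, 14]))  # 90
```

    Freezing the definition (name, unit, aggregation) before the experiment mirrors the report's point that interpretation of results is only sound when the measures were specified ahead of instrumentation.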