WorldWideScience

Sample records for models based primarily

  1. Development and Sensitivity Analysis of a Frost Risk model based primarily on freely distributed Earth Observation data

    Science.gov (United States)

    Louka, Panagiota; Petropoulos, George; Papanikolaou, Ioannis

    2015-04-01

The ability to map the spatiotemporal distribution of extreme climatic conditions, such as frost, is a significant tool in successful agricultural management and decision making. With the development of Earth Observation (EO) technology, it is now possible to obtain accurate, timely and cost-effective information on the spatiotemporal distribution of frost conditions, particularly over large and otherwise inaccessible areas. The present study aimed at developing and evaluating a frost risk prediction model, exploiting primarily EO data from the MODIS and ASTER sensors together with ancillary ground observation data. For the evaluation of our model, a region in north-western Greece was selected as the test site and a detailed sensitivity analysis was implemented. The agreement between the model predictions and the observed (remotely sensed) frost frequency obtained by the MODIS sensor was evaluated thoroughly. Detailed comparisons of the model predictions were also performed against reference frost ground observations acquired from the Greek Agricultural Insurance Organization (ELGA) over a 10-year period (2000-2010). Overall, the results evidenced the ability of the model to reproduce frost conditions reasonably well, following largely explainable patterns with respect to the study site and local weather characteristics. Implementation of the proposed frost risk model relies primarily on satellite imagery that is now provided globally at no cost. It is also straightforward and computationally inexpensive, requiring much less effort than, for example, field surveying. Finally, the method is adjustable and can potentially be integrated with other high resolution data available from both commercial and non-commercial vendors. Keywords: Sensitivity analysis, frost risk mapping, GIS, remote sensing, MODIS, Greece

  2. Disgust sensitivity is primarily associated with purity-based moral judgments.

    Science.gov (United States)

    Wagemans, Fieke M A; Brandt, Mark J; Zeelenberg, Marcel

    2018-03-01

Individual differences in disgust sensitivity are associated with a range of judgments and attitudes related to the moral domain. Some perspectives suggest that the association between disgust sensitivity and moral judgments will be equally strong across all moral domains (i.e., purity, authority, loyalty, care, fairness, and liberty). Other perspectives predict that disgust sensitivity is primarily associated with judgments in specific moral domains (e.g., primarily purity). However, no study has systematically tested whether disgust sensitivity is associated with moral judgments of the purity domain specifically, with moral judgments of the binding moral domains more generally, or with moral judgments of all of the moral domains equally. Across 5 studies (total N = 1,104), we find consistent evidence for the notion that disgust sensitivity relates more strongly to moral condemnation of purity-based transgressions (meta-analytic r = .40) than to moral condemnation of transgressions of any of the other domains (range of meta-analytic rs: .07-.27). Our findings are in line with predictions from Moral Foundations Theory, which predicts that personality characteristics like disgust sensitivity make people more sensitive to a certain set of moral issues. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  3. Spatial agent-based models for socio-ecological systems: challenges and prospects

    NARCIS (Netherlands)

    de Filatova, T.; Verburg, P.H.; Parker, D.C.; Stannard, S.R.

    2013-01-01

    Departing from the comprehensive reviews carried out in the field, we identify the key challenges that agent-based methodology faces when modeling coupled socio-ecological systems. Focusing primarily on the papers presented in this thematic issue, we review progress in spatial agent-based models

  4. Central Puget Sound Ecopath/Ecosim model outputs - Developing food web models for ecosystem-based management applications in Puget Sound

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This project is developing food web models for ecosystem-based management applications in Puget Sound. It is primarily being done by NMFS FTEs and contractors, in...

  5. Incidence of diseases primarily affecting the skin by age group: population-based epidemiologic study in Olmsted County, Minnesota, and comparison with age-specific incidence rates worldwide.

    Science.gov (United States)

    Wessman, Laurel L; Andersen, Louise K; Davis, Mark D P

    2018-01-29

    Understanding the effects of age on the epidemiology of diseases primarily affecting the skin is important to the practice of dermatology, both for proper allocation of resources and for optimal patient-centered care. To fully appreciate the effect that age may have on the population-based calculations of incidence of diseases primarily affecting the skin in Olmsted County, Minnesota, and worldwide, we performed a review of all relevant Rochester Epidemiology Project-published data and compared them to similar reports in the worldwide English literature. Using the Rochester Epidemiology Project, population-based epidemiologic studies have been performed to estimate the incidence of specific skin diseases over the past 50 years. In older persons (>65 years), nonmelanoma skin cancer, lentigo maligna, herpes zoster, delusional infestation, venous stasis syndrome, venous ulcer, and burning mouth syndrome were more commonly diagnosed. In those younger than 65 years, atypical nevi, psoriatic arthritis, pityriasis rosea, herpes progenitalis, genital warts, alopecia areata, hidradenitis suppurativa, infantile hemangioma, Behçet's disease, and sarcoidosis (isolated cutaneous, with sarcoidosis-specific cutaneous lesions and with erythema nodosum) had a higher incidence. Many of the incidence rates by age group of diseases primarily affecting the skin derived from the Rochester Epidemiology Project were similar to those reported elsewhere. © 2018 The International Society of Dermatology.

  6. Central Puget Sound Ecopath/Ecosim model biological parameters - Developing food web models for ecosystem-based management applications in Puget Sound

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This project is developing food web models for ecosystem-based management applications in Puget Sound. It is primarily being done by NMFS FTEs and contractors, in...

  7. Admission rates in a general practitioner-based versus a hospital specialist based, hospital-at-home model

    DEFF Research Database (Denmark)

    Mogensen, Christian Backer; Ankersen, Ejnar Skytte; Lindberg, Mats J

    2018-01-01

BACKGROUND: Hospital at home (HaH) is an alternative to acute admission for elderly patients. It is unclear whether they should be cared for primarily by a hospital specialist or by the patient's own general practitioner (GP). The study assessed whether a GP based model was more effective than... Denmark, including +65 years old patients with an acute medical condition that required acute hospital in-patient care. The patients were randomly assigned to the hospital specialist based model or the GP model of HaH care. Five physical and cognitive performance tests were performed at inclusion and after 7... CONCLUSIONS: The GP based HaH model was more effective than the hospital specialist model in avoiding hospital admissions within 7 days among elderly patients with an acute medical condition, with no differences in mental or physical recovery rates or deaths between the two models. REGISTRATION: No. NCT...

  8. An empirical test of the information-motivation-behavioral skills model of ART adherence in a sample of HIV-positive persons primarily in out-of-HIV-care settings.

    Science.gov (United States)

    Horvath, Keith J; Smolenski, Derek; Amico, K Rivet

    2014-02-01

The current body of evidence supporting the Information-Motivation-Behavioral Skills (IMB) model of antiretroviral therapy (ART) adherence rests exclusively on data collected from people living with HIV (PLWH) at point-of-HIV-care services. The aims of this study were to: (1) determine whether the IMB model is a useful predictive model of ART adherence among PLWH who were primarily recruited in out-of-HIV-care settings; and (2) assess whether the theorized associations between IMB model constructs and adherence persist in the presence of depression and current drug use. PLWH (n = 312) responding to a one-time online survey completed the Life Windows IMB-ART-Adherence Questionnaire, along with demographic, depression (CES-D 10), and drug use items. Path models were used to assess the fit of a saturated versus fully mediated IMB model of adherence and examined for moderating effects of depression and current drug use. Participants were on average 43 years of age, had been living with HIV for 9 or more years, and were mostly male (84.0%), Caucasian (68.8%), and gay-identified (74.8%). The a priori measurement models for information and behavioral skills did not have acceptable fit to the data and were modified accordingly. Using the revised IMB scales, IMB constructs were associated with adherence as predicted by the theory in all but one model (i.e., the IMB model operated as predicted among nondrug users and those with and without depression). Among drug users, information exerted a direct effect on adherence but was not significantly associated with behavioral skills. Results of this study suggest that the fully or partially mediated IMB model is supported for use with samples of PLWH recruited primarily in out-of-HIV-care service settings and is robust in the presence of depression and drug use.

  9. Microinstability-based model for anomalous thermal confinement in tokamaks

    International Nuclear Information System (INIS)

    Tang, W.M.

    1986-03-01

This paper deals with the formulation of microinstability-based thermal transport coefficients (χ_j) for the purpose of modelling anomalous energy confinement properties in tokamak plasmas. Attention is primarily focused on ohmically heated discharges and the associated anomalous electron thermal transport. An appropriate expression for χ_e is developed which is consistent with reasonable global constraints on the current and electron temperature profiles as well as with the key properties of the kinetic instabilities most likely to be present. Comparisons of confinement scaling trends predicted by this model with the empirical ohmic database indicate quite favorable agreement. The subject of anomalous ion thermal transport and its implications for high density ohmic discharges and for auxiliary-heated plasmas is also addressed

  10. A Full-Body Layered Deformable Model for Automatic Model-Based Gait Recognition

    Science.gov (United States)

    Lu, Haiping; Plataniotis, Konstantinos N.; Venetsanopoulos, Anastasios N.

    2007-12-01

    This paper proposes a full-body layered deformable model (LDM) inspired by manually labeled silhouettes for automatic model-based gait recognition from part-level gait dynamics in monocular video sequences. The LDM is defined for the fronto-parallel gait with 22 parameters describing the human body part shapes (widths and lengths) and dynamics (positions and orientations). There are four layers in the LDM and the limbs are deformable. Algorithms for LDM-based human body pose recovery are then developed to estimate the LDM parameters from both manually labeled and automatically extracted silhouettes, where the automatic silhouette extraction is through a coarse-to-fine localization and extraction procedure. The estimated LDM parameters are used for model-based gait recognition by employing the dynamic time warping for matching and adopting the combination scheme in AdaBoost.M2. While the existing model-based gait recognition approaches focus primarily on the lower limbs, the estimated LDM parameters enable us to study full-body model-based gait recognition by utilizing the dynamics of the upper limbs, the shoulders and the head as well. In the experiments, the LDM-based gait recognition is tested on gait sequences with differences in shoe-type, surface, carrying condition and time. The results demonstrate that the recognition performance benefits from not only the lower limb dynamics, but also the dynamics of the upper limbs, the shoulders and the head. In addition, the LDM can serve as an analysis tool for studying factors affecting the gait under various conditions.
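The matching step named in this record can be sketched in isolation. Below is a minimal dynamic time warping (DTW) distance between two 1-D parameter sequences; the sequences, the absolute-difference local cost, and the function name are illustrative assumptions, not the paper's LDM parameter vectors or its AdaBoost.M2 combination scheme.

```python
def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) DTW with absolute-difference local cost."""
    n, m = len(a), len(b)
    INF = float("inf")
    # cost[i][j] = minimal cumulative cost of aligning a[:i] with b[:j]
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]

# Identical sequences align at zero cost; a time-shifted copy stays cheap,
# which is what makes DTW useful for matching gait cycles of unequal phase.
print(dtw_distance([0, 1, 2, 1, 0], [0, 1, 2, 1, 0]))  # 0.0
print(dtw_distance([0, 1, 2, 1, 0], [0, 0, 1, 2, 1, 0]))
```

In practice each gait sequence would be a multi-dimensional series of estimated LDM parameters, with a vector norm replacing the scalar absolute difference.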

  11. Cost Effective Community Based Dementia Screening: A Markov Model Simulation

    Directory of Open Access Journals (Sweden)

    Erin Saito

    2014-01-01

Background. Given the dementia epidemic and the increasing cost of healthcare, there is a need to assess the economic benefit of community based dementia screening programs. Materials and Methods. Markov model simulations were generated using data obtained from a community based dementia screening program over a one-year period. The models simulated yearly costs of caring for patients based on clinical transitions, beginning in predementia and extending for 10 years. Results. A total of 93 individuals (74 female, 19 male) were screened for dementia, and 12 meeting clinical criteria for either mild cognitive impairment (n=7) or dementia (n=5) were identified. Assuming early therapeutic intervention beginning during the year of dementia detection, Markov model simulations demonstrated a 9.8% reduction in the cost of dementia care over a ten-year simulation period, primarily through increased duration in mild stages and reduced time in the more costly moderate and severe stages. Discussion. Community based dementia screening can reduce healthcare costs associated with caring for individuals with dementia through earlier detection and treatment, resulting in proportionately reduced time in more costly advanced stages.
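The kind of simulation this record describes can be illustrated with a small cohort-level Markov model. All states, transition probabilities, and annual costs below are hypothetical placeholders, not the study's calibrated values.

```python
STATES = ["MCI", "mild", "moderate", "severe", "dead"]
# P[i][j]: probability of moving from state i to state j in one year.
P = [
    [0.70, 0.25, 0.03, 0.01, 0.01],   # MCI
    [0.00, 0.60, 0.30, 0.05, 0.05],   # mild
    [0.00, 0.00, 0.55, 0.35, 0.10],   # moderate
    [0.00, 0.00, 0.00, 0.80, 0.20],   # severe
    [0.00, 0.00, 0.00, 0.00, 1.00],   # dead (absorbing)
]
ANNUAL_COST = [5_000, 15_000, 40_000, 80_000, 0]  # per state, per year

def expected_cost(initial, years):
    """Total expected cost over `years` for a cohort distribution `initial`."""
    dist, total = list(initial), 0.0
    for _ in range(years):
        total += sum(p * c for p, c in zip(dist, ANNUAL_COST))
        dist = [sum(dist[i] * P[i][j] for i in range(len(P)))
                for j in range(len(P))]
    return total

# Everyone starts in MCI; an intervention that slows progression would be
# modeled as a second matrix with larger diagonal entries, and the cost
# saving is the difference between the two totals.
print(round(expected_cost([1, 0, 0, 0, 0], 10), 2))
```

The study's reported 9.8% saving corresponds to exactly this kind of comparison between a treated and an untreated transition matrix.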

  12. 29 CFR 780.607 - “Primarily employed” in agriculture.

    Science.gov (United States)

    2010-07-01

... 29 Labor 3 2010-07-01 2010-07-01 false "Primarily employed" in agriculture. 780.607 Section 780... AGRICULTURE, PROCESSING OF AGRICULTURAL COMMODITIES, AND RELATED SUBJECTS UNDER THE FAIR LABOR STANDARDS ACT Employment in Agriculture and Livestock Auction Operations Under the Section 13(b)(13) Exemption Requirements...

  13. A simplified physics-based model for nickel hydrogen battery

    Science.gov (United States)

    Liu, Shengyi; Dougal, Roger A.; Weidner, John W.; Gao, Lijun

This paper presents a simplified model of a nickel hydrogen battery based on a first approximation. The battery is assumed uniform throughout. The reversible potential is considered to be primarily due to the one-electron-transfer redox reaction of nickel hydroxide and nickel oxyhydroxide. The non-ideality due to phase reactions is characterized by two-parameter activity coefficients. The overcharge process is characterized by the oxygen reaction. The overpotentials are lumped into a tunable resistive drop to fit particular battery designs. The model is implemented in the Virtual Test Bed environment; the simulated battery characteristics are in good agreement with the experimental data within the normal operating regime. The model can be used for battery dynamic simulation and design in a satellite power system, an example of which is given.

  14. An information theory-based approach to modeling the information processing of NPP operators

    International Nuclear Information System (INIS)

    Kim, Jong Hyun; Seong, Poong Hyun

    2002-01-01

This paper proposes a quantitative approach to modeling the information processing of NPP operators. The aim of this work is to derive the amount of information processed during a certain control task. The focus is on i) developing a model for the information processing of NPP operators and ii) quantifying the model. To resolve the problems of previous approaches based on information theory, i.e., the limitations of single-channel approaches, we first develop an information processing model having multiple stages, which contains information flows. Then the uncertainty of the information is quantified using Conant's model, a kind of information theory

  15. Uncertainty modelling and analysis of volume calculations based on a regular grid digital elevation model (DEM)

    Science.gov (United States)

    Li, Chang; Wang, Qing; Shi, Wenzhong; Zhao, Sisi

    2018-05-01

    The accuracy of earthwork calculations that compute terrain volume is critical to digital terrain analysis (DTA). The uncertainties in volume calculations (VCs) based on a DEM are primarily related to three factors: 1) model error (ME), which is caused by an adopted algorithm for a VC model, 2) discrete error (DE), which is usually caused by DEM resolution and terrain complexity, and 3) propagation error (PE), which is caused by the variables' error. Based on these factors, the uncertainty modelling and analysis of VCs based on a regular grid DEM are investigated in this paper. Especially, how to quantify the uncertainty of VCs is proposed by a confidence interval based on truncation error (TE). In the experiments, the trapezoidal double rule (TDR) and Simpson's double rule (SDR) were used to calculate volume, where the TE is the major ME, and six simulated regular grid DEMs with different terrain complexity and resolution (i.e. DE) were generated by a Gauss synthetic surface to easily obtain the theoretical true value and eliminate the interference of data errors. For PE, Monte-Carlo simulation techniques and spatial autocorrelation were used to represent DEM uncertainty. This study can enrich uncertainty modelling and analysis-related theories of geographic information science.
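The trapezoidal double rule (TDR) named in this record is easy to state concretely. The sketch below computes volume under a regular grid of elevations with hypothetical spacing and values; the paper's Gauss synthetic surfaces and its truncation-error analysis are not reproduced.

```python
def tdr_volume(z, dx, dy):
    """Trapezoidal double rule over a regular grid of elevations z[i][j]."""
    n, m = len(z), len(z[0])
    vol = 0.0
    for i in range(n - 1):
        for j in range(m - 1):
            # Each cell contributes its area times the mean of its four
            # corner elevations.
            vol += dx * dy * (z[i][j] + z[i + 1][j]
                              + z[i][j + 1] + z[i + 1][j + 1]) / 4.0
    return vol

# A flat surface of height 2 over a 2x2-cell (3x3-node) grid with unit
# spacing has volume 2 * (2*2) = 8 exactly; TDR is exact for planar surfaces,
# and its truncation error grows with terrain curvature and coarser spacing.
z = [[2.0] * 3 for _ in range(3)]
print(tdr_volume(z, 1.0, 1.0))  # 8.0
```

Simpson's double rule replaces the per-cell corner average with a higher-order weighting over 3x3 node blocks, trading grid-size restrictions for a smaller truncation error.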

  16. Control volume based modelling of compressible flow in reciprocating machines

    DEFF Research Database (Denmark)

    Andersen, Stig Kildegård; Thomsen, Per Grove; Carlsen, Henrik

    2004-01-01

An approach to modelling unsteady compressible flow that is primarily one dimensional is presented. The approach was developed for creating distributed models of machines with reciprocating pistons but it is not limited to this application. The approach is based on the integral form of the unsteady conservation laws for mass, energy, and momentum applied to a staggered mesh consisting of two overlapping strings of control volumes. Loss mechanisms can be included directly in the governing equations of models by including them as terms in the conservation laws. Heat transfer, flow friction, and multidimensional effects must be calculated using empirical correlations; correlations for steady state flow can be used as an approximation. A transformation that assumes ideal gas is presented for transforming equations for masses and energies in control volumes into the corresponding pressures and temperatures...

  17. Analytical modeling of a sandwiched plate piezoelectric transformer-based acoustic-electric transmission channel.

    Science.gov (United States)

    Lawry, Tristan J; Wilt, Kyle R; Scarton, Henry A; Saulnier, Gary J

    2012-11-01

    The linear propagation of electromagnetic and dilatational waves through a sandwiched plate piezoelectric transformer (SPPT)-based acoustic-electric transmission channel is modeled using the transfer matrix method with mixed-domain two-port ABCD parameters. This SPPT structure is of great interest because it has been explored in recent years as a mechanism for wireless transmission of electrical signals through solid metallic barriers using ultrasound. The model we present is developed to allow for accurate channel performance prediction while greatly reducing the computational complexity associated with 2- and 3-dimensional finite element analysis. As a result, the model primarily considers 1-dimensional wave propagation; however, approximate solutions for higher-dimensional phenomena (e.g., diffraction in the SPPT's metallic core layer) are also incorporated. The model is then assessed by comparing it to the measured wideband frequency response of a physical SPPT-based channel from our previous work. Very strong agreement between the modeled and measured data is observed, confirming the accuracy and utility of the presented model.
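The transfer matrix method with two-port ABCD parameters mentioned in this record reduces to cascading 2x2 matrices, one per layer or element, in order of propagation. The sketch below uses simple lumped electrical elements as stand-ins; the paper's mixed-domain piezoelectric and acoustic layer matrices are not reproduced.

```python
def mat_mul(m1, m2):
    """2x2 matrix product, m1 @ m2."""
    return [[sum(m1[i][k] * m2[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def series(Z):
    """ABCD matrix of a series impedance Z."""
    return [[1.0, Z], [0.0, 1.0]]

def shunt(Y):
    """ABCD matrix of a shunt admittance Y."""
    return [[1.0, 0.0], [Y, 1.0]]

def cascade(stages):
    """Overall ABCD matrix of two-ports connected input-to-output."""
    total = [[1.0, 0.0], [0.0, 1.0]]  # identity
    for m in stages:
        total = mat_mul(total, m)
    return total

# An L-network: a series 2-ohm element followed by a 0.5-S shunt. The same
# cascade pattern composes electrical, piezoelectric, and acoustic layers
# once each layer's own ABCD matrix is known.
A = cascade([series(2.0), shunt(0.5)])
print(A)  # [[2.0, 2.0], [0.5, 1.0]]
```

The channel's end-to-end frequency response then follows from evaluating the cascaded matrix at each frequency of interest and terminating it with source and load impedances.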

  18. Modeling the interdependent network based on two-mode networks

    Science.gov (United States)

    An, Feng; Gao, Xiangyun; Guan, Jianhe; Huang, Shupei; Liu, Qian

    2017-10-01

Among heterogeneous networks, there exist obvious and close interdependent linkages. Unlike existing research, which focuses primarily on theoretical models of physical interdependent networks, we propose a two-layer interdependent network model based on two-mode networks to explore interdependent features in reality. Specifically, we construct a two-layer interdependent loan network and develop several dependency feature indices. The model is verified to enable us to capture the loan dependency features of listed companies based on loan behaviors and shared shareholders. Taking the Chinese debit and credit market as a case study, the main conclusions are: (1) only a few listed companies shoulder the main capital transmission (20% of listed companies occupy almost 70% of the dependency degree). (2) Controlling these key listed companies will be more effective in avoiding the spread of financial risks. (3) Identifying the companies with high betweenness centrality and controlling them could help monitor the spread of financial risk. (4) The capital transmission channels between Chinese financial listed companies and Chinese non-financial listed companies are relatively strong. However, under greater pressure on the demand for capital transmission (70% of edges failed), the transmission channel constructed by debit and credit behavior will eventually collapse.
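The two-mode-to-one-mode construction this record describes can be sketched with a toy projection: companies linked to shareholders form the two-mode network, and companies sharing a shareholder become linked in the projected dependency layer. The company names, shareholder sets, and shared-shareholder edge weighting below are hypothetical.

```python
from itertools import combinations
from collections import defaultdict

# Two-mode network: company -> set of shareholders (hypothetical data).
two_mode = {
    "CoA": {"S1", "S2"},
    "CoB": {"S2", "S3"},
    "CoC": {"S3"},
}

def project(two_mode):
    """One-mode projection: edge weight = number of shared shareholders."""
    weights = defaultdict(int)
    for a, b in combinations(sorted(two_mode), 2):
        shared = len(two_mode[a] & two_mode[b])
        if shared:
            weights[(a, b)] = shared
    return dict(weights)

# CoA-CoB share S2 and CoB-CoC share S3; CoA-CoC share no shareholder,
# so no dependency edge is created between them.
print(project(two_mode))  # {('CoA', 'CoB'): 1, ('CoB', 'CoC'): 1}
```

A loan layer built the same way from debit/credit links, stacked on this shareholder layer, gives the two-layer interdependent structure on which degree and betweenness indices can then be computed.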

  19. Nanoparticles affect PCR primarily via surface interactions with PCR components: using amino-modified silica-coated magnetic nanoparticles as a main model

    Science.gov (United States)

    Nanomaterials have been widely reported to affect the polymerase chain reaction (PCR). However, many studies in which these effects were observed were not comprehensive, and many of the proposed mechanisms have been primarily speculative. In this work, we used amino-modified silica-coated magnetic n...

  20. Process-Based Modeling of Constructed Wetlands

    Science.gov (United States)

    Baechler, S.; Brovelli, A.; Rossi, L.; Barry, D. A.

    2007-12-01

Constructed wetlands (CWs) are widespread facilities for wastewater treatment. In subsurface flow wetlands, contaminated wastewater flows through a porous matrix, where oxidation and detoxification phenomena occur. Despite the large number of working CWs, system design and optimization are still mainly based upon empirical equations or simplified first-order kinetics. This results from an incomplete understanding of the system functioning, and may in turn hinder the performance and effectiveness of the treatment process. As a result, CWs are often considered not suitable to meet high water-quality standards, or to treat water contaminated with recalcitrant anthropogenic contaminants. To date, only a limited number of detailed numerical models have been developed and successfully applied to simulate constructed wetland behavior. Among these, one of the most complete and powerful is CW2D, which is based on Hydrus2D. The aim of this work is to develop a comprehensive simulator tailored to model the functioning of horizontal flow constructed wetlands and in turn provide a reliable design and optimization tool. The model is based upon PHWAT, a general reactive transport code for saturated flow. PHWAT couples MODFLOW, MT3DMS and PHREEQC-2 using an operator-splitting approach. The use of PHREEQC to simulate reactions allows great flexibility in simulating biogeochemical processes. The biogeochemical reaction network is similar to that of CW2D, and is based on the Activated Sludge Model (ASM). Kinetic oxidation of carbon sources and nutrient transformations (primarily nitrogen and phosphorus) are modeled via Monod-type kinetic equations. Oxygen dissolution is accounted for via a first-order mass-transfer equation. While the ASM model only includes a limited number of kinetic equations, the new simulator permits incorporation of an unlimited number of both kinetic and equilibrium reactions. Changes in pH, redox potential and surface reactions can be easily incorporated.
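The Monod-type kinetics and first-order oxygen mass transfer described above can be sketched as a small forward-Euler simulation of one well-mixed compartment. All rate constants, half-saturation values, and initial concentrations below are hypothetical, not calibrated constructed-wetland values.

```python
def simulate(hours, dt=0.01):
    """Substrate oxidation (dual Monod) coupled to oxygen re-aeration."""
    S, O = 100.0, 2.0                  # substrate, dissolved O2 (mg/L)
    mu_max, Ks, Ko = 5.0, 20.0, 0.5    # max rate and half-saturation consts
    kla, O_sat = 2.0, 9.0              # O2 mass-transfer coeff., saturation
    Y_o = 1.2                          # O2 consumed per unit substrate
    for _ in range(int(hours / dt)):
        # Monod limitation in both substrate and oxygen.
        rate = mu_max * (S / (Ks + S)) * (O / (Ko + O))
        S = max(S - rate * dt, 0.0)
        # First-order dissolution toward saturation, minus biological demand.
        O = max(O + (kla * (O_sat - O) - Y_o * rate) * dt, 0.0)
    return S, O

S_end, O_end = simulate(hours=24)
print(round(S_end, 2), round(O_end, 2))
```

In the full simulator the same rate laws appear as source/sink terms inside the reactive-transport equations rather than in a single stirred compartment.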

  1. Primarily Experimental Results for a W Wire Array Z Pinch

    International Nuclear Information System (INIS)

    Kuai Bin; Aici, Qiu; Wang Liangping; Zeng Zhengzhong; Wang Wensheng; Cong Peitian; Gai Tongyang; Wei Fuli; Guo Ning; Zhang Zhong

    2006-01-01

Primarily experimental results are given for a W wire array Z pinch imploded with up to 2 MA in 100 ns on the Qiangguang-I pulsed power generator. The configuration and parameters of the generator, the W wire array load assembly, and the diagnostic system for the experiment are described. The total X-ray energy has been obtained, with an average X-ray radiation power of 1.28 TW

  2. Pattern-based translation of BPMN process models to BPEL web services

    NARCIS (Netherlands)

    Ouyang, C.; Dumas, M.; Hofstede, ter A.H.M.; Aalst, van der W.M.P.

    2008-01-01

    The business process modeling notation (BPMN) is a graph-oriented language primarily targeted at domain analysts and supported by many modeling tools. The business process execution language for Web services (BPEL) on the other hand is a mainly block-structured language targeted at software

  3. Alternative ways of using field-based estimates to calibrate ecosystem models and their implications for carbon cycle studies

    Science.gov (United States)

    He, Yujie; Zhuang, Qianlai; McGuire, David; Liu, Yaling; Chen, Min

    2013-01-01

Model-data fusion is a process in which field observations are used to constrain model parameters. How observations are used to constrain parameters has a direct impact on the carbon cycle dynamics simulated by ecosystem models. In this study, we present an evaluation of several options for the use of observations in modeling regional carbon dynamics and explore the implications of those options. We calibrated the Terrestrial Ecosystem Model on a hierarchy of three vegetation classification levels for the Alaskan boreal forest: species level, plant-functional-type level (PFT level), and biome level, and we examined the differences in simulated carbon dynamics. Species-specific field-based estimates were directly used to parameterize the model for species-level simulations, while weighted averages based on species percent cover were used to generate estimates for PFT- and biome-level model parameterization. We found that calibrated key ecosystem process parameters differed substantially among species and overlapped for species that are categorized into different PFTs. Our analysis of parameter sets suggests that the PFT-level parameterizations primarily reflected the dominant species and that functional information of some species was lost from the PFT-level parameterizations. The biome-level parameterization was primarily representative of the needleleaf PFT and lost information on broadleaf species or PFT function. Our results indicate that PFT-level simulations may be potentially representative of the performance of species-level simulations while biome-level simulations may result in biased estimates. Improved theoretical and empirical justifications for grouping species into PFTs or biomes are needed to adequately represent the dynamics of ecosystem functioning and structure.

  4. Temporal integration of loudness in listeners with hearing losses of primarily cochlear origin

    DEFF Research Database (Denmark)

    Buus, Søren; Florentine, Mary; Poulsen, Torben

    1999-01-01

    To investigate how hearing loss of primarily cochlear origin affects the loudness of brief tones, loudness matches between 5- and 200-ms tones were obtained as a function of level for 15 listeners with cochlear impairments and for seven age-matched controls. Three frequencies, usually 0.5, 1, and 4...... of temporal integration—defined as the level difference between equally loud short and long tones—varied nonmonotonically with level and was largest at moderate levels. No consistent effect of frequency was apparent. The impaired listeners varied widely, but most showed a clear effect of level on the amount...... of temporal integration. Overall, their results appear consistent with expectations based on knowledge of the general properties of their loudness-growth functions and the equal-loudness-ratio hypothesis, which states that the loudness ratio between equal-SPL long and brief tones is the same at all SPLs...

  5. Perceptions of Mindfulness in a Low-income, Primarily African American Treatment-Seeking Sample.

    Science.gov (United States)

    Spears, Claire Adams; Houchins, Sean C; Bamatter, Wendy P; Barrueco, Sandra; Hoover, Diana Stewart; Perskaudas, Rokas

    2017-12-01

    Individuals with low socioeconomic status (SES) and members of racial/ethnic minority groups often experience profound disparities in mental health and physical well-being. Mindfulness-based interventions show promise for improving mood and health behaviors in higher-SES and non-Latino White populations. However, research is needed to explore what types of adaptations, if any, are needed to best support underserved populations. This study used qualitative methods to gain information about a) perceptions of mindfulness, b) experiences with meditation, c) barriers to practicing mindfulness, and d) recommendations for tailoring mindfulness-based interventions in a low-income, primarily African American treatment-seeking sample. Eight focus groups were conducted with 32 adults (16 men and 16 women) currently receiving services at a community mental health center. Most participants (91%) were African American. Focus group data were transcribed and analyzed using NVivo 10. A team of coders reviewed the transcripts to identify salient themes. Relevant themes included beliefs that mindfulness practice might improve mental health (e.g., managing stress and anger more effectively) and physical health (e.g., improving sleep and chronic pain, promoting healthier behaviors). Participants also discussed ways in which mindfulness might be consistent with, and even enhance, their religious and spiritual practices. Results could be helpful in tailoring mindfulness-based treatments to optimize feasibility and effectiveness for low-SES adults receiving mental health services.

  6. A time-dependent Green's function-based model for stream ...

    African Journals Online (AJOL)

    2003-07-03

Jul 3, 2003 ... applications, this Green's function has found use primarily in linear heat transfer and flow ... based on the mathematical description of the flow with the nonlinear ... ∇ = i∂/∂x + j∂/∂y is the two-dimensional gradient operator ...

  7. Ammonia concentration modeling based on retained gas sampler data

    International Nuclear Information System (INIS)

    Terrones, G.; Palmer, B.J.; Cuta, J.M.

    1997-09-01

The vertical ammonia concentration distributions determined by the retained gas sampler (RGS) apparatus were modeled for double-shell tanks (DSTs) AW-101, AN-103, AN-104, and AN-105 and single-shell tanks (SSTs) A-101, S-106, and U-103. Models of the vertical transport of ammonia in the tanks were developed. Transport in the non-convective settled solids and floating solids layers is assumed to occur primarily via some type of diffusion process, while transport in the convective liquid layers is incorporated into the model via mass transfer coefficients based on empirical correlations. Mass transfer between the top of the waste and the tank headspace and the effects of ventilation of the headspace are also included in the models. The resulting models contain a large number of parameters, but many of them can be determined from known properties of the waste configuration or can be estimated within reasonable bounds from data on the waste samples themselves. The models are used to extract effective diffusion coefficients for transport in the non-convective layers based on the measured values of ammonia from the RGS apparatus. The modeling indicates that the higher concentrations of ammonia seen in bubbles trapped inside the waste, relative to the ammonia concentrations in the tank headspace, can be explained by a combination of slow transport of ammonia via diffusion in the non-convective layers and ventilation of the tank headspace by either passive or active means. Slow transport by diffusion causes a higher concentration of ammonia to build up deep within the waste until the concentration gradients between the interior and the top of the waste are sufficient to allow ammonia to escape at the same rate at which it is being generated in the waste
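The diffusion-plus-headspace-ventilation picture in this record can be sketched as a 1-D explicit finite-difference model: ammonia diffuses slowly through a non-convective layer and is lost at the top surface by mass transfer to the headspace. The diffusivity, layer depth, time step, and surface mass-transfer coefficient below are hypothetical values, not those extracted in the report.

```python
def diffuse(c0, D, dz, dt, steps, k_top, c_head=0.0):
    """March c(z) forward; node 0 is the layer bottom, node -1 the surface."""
    # Explicit scheme is stable only if D*dt/dz**2 <= 0.5.
    assert D * dt / dz**2 <= 0.5, "time step too large for stability"
    c = list(c0)
    for _ in range(steps):
        new = c[:]
        for i in range(1, len(c) - 1):
            new[i] = c[i] + D * dt / dz**2 * (c[i - 1] - 2 * c[i] + c[i + 1])
        new[0] = new[1]  # no-flux boundary at the layer bottom
        # Surface node loses ammonia to the (ventilated) headspace.
        new[-1] = c[-1] + dt * (D * (c[-2] - c[-1]) / dz**2
                                - k_top * (c[-1] - c_head) / dz)
        c = new
    return c

c = diffuse(c0=[1.0] * 11, D=1e-9, dz=0.1, dt=1000.0, steps=5000,
            k_top=1e-6)
# Concentration stays high at depth and drops toward the surface, mirroring
# the high in-waste bubble concentrations versus the dilute headspace.
print(round(c[0], 3), round(c[-1], 3))
```

Fitting such a model to measured profiles is, in spirit, how an effective diffusion coefficient for the non-convective layer would be extracted.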

  8. Human punishment is not primarily motivated by inequality.

    Science.gov (United States)

    Marczyk, Jesse

    2017-01-01

    Previous theorizing about punishment has suggested that humans desire to punish inequality per se. However, the research supporting such an interpretation contains important methodological confounds. The main objective of the current experiment was to remove those confounds in order to test whether generating inequality per se is punished. Participants were recruited from an online market to take part in a wealth-alteration game with an ostensible second player. The participants were given an option to deduct from the other player's payment as punishment for their behavior during the game. The results suggest that human punishment does not appear to be motivated by inequality per se, as inequality that was generated without inflicting costs on others was not reliably punished. Instead, punishment seems to respond primarily to the infliction of costs, with inequality only becoming relevant as a secondary input for punishment decisions. The theoretical significance of this finding is discussed in the context of its possible adaptive value.

  9. Human punishment is not primarily motivated by inequality

    Science.gov (United States)

    Marczyk, Jesse

    2017-01-01

    Previous theorizing about punishment has suggested that humans desire to punish inequality per se. However, the research supporting such an interpretation contains important methodological confounds. The main objective of the current experiment was to remove those confounds in order to test whether generating inequality per se is punished. Participants were recruited from an online market to take part in a wealth-alteration game with an ostensible second player. The participants were given an option to deduct from the other player’s payment as punishment for their behavior during the game. The results suggest that human punishment does not appear to be motivated by inequality per se, as inequality that was generated without inflicting costs on others was not reliably punished. Instead, punishment seems to respond primarily to the infliction of costs, with inequality only becoming relevant as a secondary input for punishment decisions. The theoretical significance of this finding is discussed in the context of its possible adaptive value. PMID:28187166

  10. Human punishment is not primarily motivated by inequality.

    Directory of Open Access Journals (Sweden)

    Jesse Marczyk

    Full Text Available Previous theorizing about punishment has suggested that humans desire to punish inequality per se. However, the research supporting such an interpretation contains important methodological confounds. The main objective of the current experiment was to remove those confounds in order to test whether generating inequality per se is punished. Participants were recruited from an online market to take part in a wealth-alteration game with an ostensible second player. The participants were given an option to deduct from the other player's payment as punishment for their behavior during the game. The results suggest that human punishment does not appear to be motivated by inequality per se, as inequality that was generated without inflicting costs on others was not reliably punished. Instead, punishment seems to respond primarily to the infliction of costs, with inequality only becoming relevant as a secondary input for punishment decisions. The theoretical significance of this finding is discussed in the context of its possible adaptive value.

  11. Analysis of Food Hub Commerce and Participation Using Agent-Based Modeling: Integrating Financial and Social Drivers.

    Science.gov (United States)

    Krejci, Caroline C; Stone, Richard T; Dorneich, Michael C; Gilbert, Stephen B

    2016-02-01

    Factors influencing long-term viability of an intermediated regional food supply network (food hub) were modeled using agent-based modeling techniques informed by interview data gathered from food hub participants. Previous analyses of food hub dynamics focused primarily on financial drivers rather than social factors and have not used mathematical models. Based on qualitative and quantitative data gathered from 22 customers and 11 vendors at a midwestern food hub, an agent-based model (ABM) was created with distinct consumer personas characterizing the range of consumer priorities. A comparison study determined if the ABM behaved differently than a model based on traditional economic assumptions. Further simulation studies assessed the effect of changes in parameters, such as producer reliability and the consumer profiles, on long-term food hub sustainability. The persona-based ABM model produced different and more resilient results than the more traditional way of modeling consumers. Reduced producer reliability significantly reduced trade; in some instances, a modest reduction in reliability threatened the sustainability of the system. Finally, a modest increase in price-driven consumers at the outset of the simulation quickly resulted in those consumers becoming a majority of the overall customer base. Results suggest that social factors, such as desire to support the community, can be more important than financial factors. An ABM of food hub dynamics, based on human factors data gathered from the field, can be a useful tool for policy decisions. Similar approaches can be used for modeling customer dynamics with other sustainable organizations. © 2015, Human Factors and Ergonomics Society.
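As a rough illustration of the persona idea (not the authors' model), the toy agent-based simulation below gives price-driven and community-driven consumers different purchase rules and lets producer reliability scale how many orders are fulfilled. All personas, weights, and probabilities are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(n_price, n_community, reliability, weeks=100):
    """Weekly purchase decisions for two invented consumer personas."""
    hub_premium = 0.2                 # price premium vs. a supermarket (invented)
    trades = 0
    for _ in range(weeks):
        for _ in range(n_price):      # price-driven: deterred by the premium
            if rng.random() > hub_premium * 2:
                trades += rng.random() < reliability
        for _ in range(n_community):  # community-driven: social utility dominates
            if rng.random() > hub_premium * 0.5:
                trades += rng.random() < reliability
    return trades

reliable = simulate(10, 10, reliability=0.95)
unreliable = simulate(10, 10, reliability=0.60)
```

Comparing the two runs shows the pattern the study reports: reduced producer reliability significantly reduces total trade, independent of the consumer mix.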

  12. Diet Quality and Nutrient Intake of Urban Overweight and Obese Primarily African American Older Adults with Osteoarthritis

    Directory of Open Access Journals (Sweden)

    Sevasti Vergis

    2018-04-01

    Full Text Available Diet quality may be a unique target for preventing and managing obesity-related osteoarthritis (OA). Using the Healthy Eating Index-2010 (HEI-2010), this study examined the nutrient intake and diet quality of 400 urban overweight and obese primarily African American older adults with self-reported lower extremity OA. Associations between sociodemographic and health-related factors and diet quality were explored. Participants (mean age 67.8 years, SD 5.9) were included. Habitual dietary intake was assessed using a food frequency questionnaire (FFQ). Nutrient intake and diet quality were calculated from the FFQ. Results indicated that diet quality needs improvement (HEI-2010: 66.3, SD 10.5). Age, body mass index, employment (multivariable model only), and OA severity (bivariate model only) were significant predictors of HEI-2010 total score in linear models. Mean intakes for fiber, calcium, and vitamin D were below recommendations, while percentage of calories as total fat exceeded recommendations. These findings can inform future dietary intervention trials and public health messaging for a sub-population at a high risk for obesity-related OA.

  13. Handling high predictor dimensionality in slope-unit-based landslide susceptibility models through LASSO-penalized Generalized Linear Model

    KAUST Repository

    Camilo, Daniela Castro; Lombardo, Luigi; Mai, Paul Martin; Dou, Jie; Huser, Raphaë l

    2017-01-01

    Grid-based landslide susceptibility models at regional scales are computationally demanding when using a fine grid resolution. Conversely, Slope-Unit (SU) based susceptibility models allow the same areas to be investigated while offering two main advantages: 1) a smaller computational burden and 2) a more geomorphologically-oriented interpretation. In this contribution, we generate SU-based landslide susceptibility for Sado Island in Japan. This island is characterized by deep-seated landslides which we assume can be only partially explained by the first two statistical moments (mean and variance) of a set of predictors within each slope unit. As a consequence, in a nested experiment, we first analyse the distributions of a set of continuous predictors within each slope unit, computing the standard deviation and quantiles from 0.05 to 0.95 with a step of 0.05. These are then used as predictors for landslide susceptibility. In addition, we combine shape indices for polygon features and the normalized extent of each class belonging to the outcropping lithology in a given SU. This procedure significantly enlarges the size of the predictor hyperspace, thus producing a high level of slope-unit characterization. In a second step, we adopt a LASSO-penalized Generalized Linear Model to shrink the predictor set back to a sensible and interpretable number, carrying only the most significant covariates in the models. As a result, we are able to document the geomorphic features (e.g., 95% quantile of Elevation and 5% quantile of Plan Curvature) that primarily control the SU-based susceptibility within the test area while producing high predictive performances. The implementation of the statistical analyses is included in a parallelized R script (LUDARA) which is here made available for the community to replicate analogous experiments.

  14. Handling high predictor dimensionality in slope-unit-based landslide susceptibility models through LASSO-penalized Generalized Linear Model

    KAUST Repository

    Camilo, Daniela Castro

    2017-08-30

    Grid-based landslide susceptibility models at regional scales are computationally demanding when using a fine grid resolution. Conversely, Slope-Unit (SU) based susceptibility models allow the same areas to be investigated while offering two main advantages: 1) a smaller computational burden and 2) a more geomorphologically-oriented interpretation. In this contribution, we generate SU-based landslide susceptibility for Sado Island in Japan. This island is characterized by deep-seated landslides which we assume can be only partially explained by the first two statistical moments (mean and variance) of a set of predictors within each slope unit. As a consequence, in a nested experiment, we first analyse the distributions of a set of continuous predictors within each slope unit, computing the standard deviation and quantiles from 0.05 to 0.95 with a step of 0.05. These are then used as predictors for landslide susceptibility. In addition, we combine shape indices for polygon features and the normalized extent of each class belonging to the outcropping lithology in a given SU. This procedure significantly enlarges the size of the predictor hyperspace, thus producing a high level of slope-unit characterization. In a second step, we adopt a LASSO-penalized Generalized Linear Model to shrink the predictor set back to a sensible and interpretable number, carrying only the most significant covariates in the models. As a result, we are able to document the geomorphic features (e.g., 95% quantile of Elevation and 5% quantile of Plan Curvature) that primarily control the SU-based susceptibility within the test area while producing high predictive performances. The implementation of the statistical analyses is included in a parallelized R script (LUDARA) which is here made available for the community to replicate analogous experiments.
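The shrinkage step described in these two records can be sketched with a generic L1-penalized regression. The snippet below implements LASSO by iterative soft-thresholding (ISTA) on synthetic slope-unit-like data; it is not the authors' R script (LUDARA), and all dimensions, coefficients, and penalty values are illustrative.

```python
import numpy as np

def lasso_ista(X, y, lam, n_iter=500):
    """L1-penalised least squares via iterative soft-thresholding (ISTA)."""
    beta = np.zeros(X.shape[1])
    step = 1.0 / np.linalg.norm(X, 2) ** 2   # 1 / Lipschitz constant of the gradient
    for _ in range(n_iter):
        z = beta - step * (X.T @ (X @ beta - y))   # gradient step
        beta = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft-threshold
    return beta

rng = np.random.default_rng(0)
n_su, n_pred = 300, 60            # slope units x enlarged predictor hyperspace
X = rng.normal(size=(n_su, n_pred))
true = np.zeros(n_pred)
true[:3] = [2.0, -1.5, 1.0]       # only three covariates truly matter
y = X @ true + 0.1 * rng.normal(size=n_su)

beta = lasso_ista(X, y, lam=50.0)
kept = np.flatnonzero(beta)       # the penalty shrinks the rest to exactly zero
```

The point of the sketch is the behaviour the records rely on: the L1 penalty drives most of the 60 coefficients to exactly zero, leaving a small interpretable covariate set.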

  15. Modeling the Internet of Things, Self-Organizing and Other Complex Adaptive Communication Networks: A Cognitive Agent-Based Computing Approach.

    Directory of Open Access Journals (Sweden)

    Samreen Laghari

    Full Text Available Computer Networks have a tendency to grow at an unprecedented scale. Modern networks involve not only computers but also a wide variety of other interconnected devices ranging from mobile phones to other household items fitted with sensors. This vision of the "Internet of Things" (IoT) implies an inherent difficulty in modeling problems. It is practically impossible to implement and test all scenarios for large-scale and complex adaptive communication networks as part of Complex Adaptive Communication Networks and Environments (CACOONS). The goal of this study is to explore the use of Agent-based Modeling as part of the Cognitive Agent-based Computing (CABC) framework to model a complex communication network problem. We use Exploratory Agent-based Modeling (EABM), as part of the CABC framework, to develop an autonomous multi-agent architecture for managing carbon footprint in a corporate network. To evaluate the application of complexity in practical scenarios, we have also introduced a company-defined computer usage policy. The conducted experiments demonstrated two important results: primarily, a CABC-based modeling approach such as Agent-based Modeling can be an effective approach to modeling complex problems in the domain of IoT; secondly, the specific problem of managing the carbon footprint can be solved using a multi-agent system approach.

  16. Modeling the Internet of Things, Self-Organizing and Other Complex Adaptive Communication Networks: A Cognitive Agent-Based Computing Approach.

    Science.gov (United States)

    Laghari, Samreen; Niazi, Muaz A

    2016-01-01

    Computer Networks have a tendency to grow at an unprecedented scale. Modern networks involve not only computers but also a wide variety of other interconnected devices ranging from mobile phones to other household items fitted with sensors. This vision of the "Internet of Things" (IoT) implies an inherent difficulty in modeling problems. It is practically impossible to implement and test all scenarios for large-scale and complex adaptive communication networks as part of Complex Adaptive Communication Networks and Environments (CACOONS). The goal of this study is to explore the use of Agent-based Modeling as part of the Cognitive Agent-based Computing (CABC) framework to model a complex communication network problem. We use Exploratory Agent-based Modeling (EABM), as part of the CABC framework, to develop an autonomous multi-agent architecture for managing carbon footprint in a corporate network. To evaluate the application of complexity in practical scenarios, we have also introduced a company-defined computer usage policy. The conducted experiments demonstrated two important results: primarily, a CABC-based modeling approach such as Agent-based Modeling can be an effective approach to modeling complex problems in the domain of IoT; secondly, the specific problem of managing the carbon footprint can be solved using a multiagent system approach.
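A minimal sketch of the kind of policy-enforcing multi-agent setup these two records describe (not the authors' EABM architecture): node agents report their own activity, a policy agent puts idle nodes to sleep, and we compare energy drawn with and without the policy. All power figures and probabilities are invented.

```python
import random

random.seed(7)
ACTIVE_W, IDLE_W, SLEEP_W = 120, 60, 5     # invented per-node power draws (watts)

def simulate(policy_on, nodes=50, hours=24):
    """Hourly energy (Wh) for a fleet of node agents under a usage policy."""
    energy_wh = 0
    for _ in range(hours):
        for _ in range(nodes):
            busy = random.random() < 0.4   # node agent reports its own state
            if busy:
                energy_wh += ACTIVE_W
            elif policy_on:
                energy_wh += SLEEP_W       # policy agent puts idle nodes to sleep
            else:
                energy_wh += IDLE_W
    return energy_wh

baseline = simulate(policy_on=False)
managed = simulate(policy_on=True)
```

The comparison mirrors the study's second result: even a simple company-defined usage policy, enforced by agents, reduces the network's energy (and hence carbon) footprint relative to the unmanaged baseline.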

  17. Exploration through Business Model Innovation

    DEFF Research Database (Denmark)

    Knab, Sebastian; Rohrbeck, René

    2015-01-01

    With this research we aim to enhance our understanding about how incumbents can explore emerging opportunities through business model innovation. Using a multiple-case, longitudinal research design spanning 2008 to 2014 we investigate exploration activities of the four largest German energy utilities in the emerging virtual power plant market. Based on the behavioral theory of the firm, we study how the cognitive and physical elements of an incumbent's strategy can be changed and how these changes affect its business model innovation activities in the exploration process. Our preliminary findings suggest that the use of synergies and probing can lead to changing physical elements and primarily increase business model maturity. CEO change and structural separation can lead to changing cognitive elements and primarily increase business model sophistication.

  18. Control-Oriented First Principles-Based Model of a Diesel Generator

    DEFF Research Database (Denmark)

    Knudsen, Jesper Viese; Bendtsen, Jan Dimon; Andersen, Palle

    2016-01-01

    This paper presents the development of a control-oriented tenth-order nonlinear model of a diesel driven generator set, using first principles modeling. The model provides physical system insight, while keeping the complexity at a level where it can be a tool for future design of improved automatic generation control (AGC), by including important nonlinearities of the machine. The nonlinearities are, as would be expected for a generator, primarily of bilinear nature. Validation of the model is done with measurements on a 60 kVA/48 kW diesel driven generator set in island operation during steps...

  19. Unifying Model-Based and Reactive Programming within a Model-Based Executive

    Science.gov (United States)

    Williams, Brian C.; Gupta, Vineet; Norvig, Peter (Technical Monitor)

    1999-01-01

    Real-time, model-based, deduction has recently emerged as a vital component in AI's tool box for developing highly autonomous reactive systems. Yet one of the current hurdles towards developing model-based reactive systems is the number of methods simultaneously employed, and their corresponding melange of programming and modeling languages. This paper offers an important step towards unification. We introduce RMPL, a rich modeling language that combines probabilistic, constraint-based modeling with reactive programming constructs, while offering a simple semantics in terms of hidden state Markov processes. We introduce probabilistic, hierarchical constraint automata (PHCA), which allow Markov processes to be expressed in a compact representation that preserves the modularity of RMPL programs. Finally, a model-based executive, called Reactive Burton is described that exploits this compact encoding to perform efficient simulation, belief state update and control sequence generation.

  20. Building v/s Exploring Models: Comparing Learning of Evolutionary Processes through Agent-based Modeling

    Science.gov (United States)

    Wagh, Aditi

    Two strands of work motivate the three studies in this dissertation. Evolutionary change can be viewed as a computational complex system in which a small set of rules operating at the individual level result in different population level outcomes under different conditions. Extensive research has documented students' difficulties with learning about evolutionary change (Rosengren et al., 2012), particularly in terms of levels slippage (Wilensky & Resnick, 1999). Second, though building and using computational models is becoming increasingly common in K-12 science education, we know little about how these two modalities compare. This dissertation adopts agent-based modeling as a representational system to compare these modalities in the conceptual context of micro-evolutionary processes. Drawing on interviews, Study 1 examines middle-school students' productive ways of reasoning about micro-evolutionary processes to find that the specific framing of traits plays a key role in whether slippage explanations are cued. Study 2, which was conducted in 2 schools with about 150 students, forms the crux of the dissertation. It compares learning processes and outcomes when students build their own models or explore a pre-built model. Analysis of Camtasia videos of student pairs reveals that builders' and explorers' ways of accessing rules, and sense-making of observed trends are of a different character. Builders notice rules through available blocks-based primitives, often bypassing their enactment while explorers attend to rules primarily through the enactment. Moreover, builders' sense-making of observed trends is more rule-driven while explorers' is more enactment-driven. Pre and posttests reveal that builders manifest a greater facility with accessing rules, providing explanations manifesting targeted assembly. Explorers use rules to construct explanations manifesting non-targeted assembly. Interviews reveal varying degrees of shifts away from slippage in both

  1. Vibration-based health monitoring and model refinement of civil engineering structures

    Energy Technology Data Exchange (ETDEWEB)

    Farrar, C.R.; Doebling, S.W.

    1997-10-01

    Damage or fault detection, as determined by changes in the dynamic properties of structures, is a subject that has received considerable attention in the technical literature beginning approximately 30 years ago. The basic idea is that changes in the structure's properties, primarily stiffness, will alter the dynamic properties of the structure such as resonant frequencies and mode shapes, and properties derived from these quantities such as modal-based flexibility. Recently, this technology has been investigated for applications to health monitoring of large civil engineering structures. This presentation will discuss such a study undertaken by engineers from New Mexico State University, Sandia National Laboratory and Los Alamos National Laboratory. Experimental modal analyses were performed on an undamaged interstate highway bridge and immediately after four successively more severe damage cases were inflicted in the main girder of the structure. Results of these tests provide insight into the abilities of modal-based damage ID methods to identify damage and the current limitations of this technology. Closely related topics that will be discussed are the use of modal properties to validate computer models of the structure, the use of these computer models in the damage detection process, and the general lack of experimental investigation of large civil engineering structures.
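The premise of this record, that stiffness loss from damage shifts resonant frequencies, can be illustrated with a single-degree-of-freedom oscillator. The numbers below are purely illustrative, not from the bridge tests.

```python
import math

def natural_freq_hz(k, m):
    """Natural frequency of a single-DOF oscillator, f = sqrt(k/m) / (2*pi)."""
    return math.sqrt(k / m) / (2.0 * math.pi)

m = 1.0e4                                   # modal mass in kg (illustrative)
k = 5.0e7                                   # stiffness in N/m (illustrative)
f_intact = natural_freq_hz(k, m)
f_damaged = natural_freq_hz(0.9 * k, m)     # 10% stiffness loss from damage
rel_drop = 1.0 - f_damaged / f_intact       # = 1 - sqrt(0.9), about 5.1%
```

Because frequency scales with the square root of stiffness, a 10% stiffness loss produces only about a 5% frequency drop, which is one reason modal-based damage identification on real structures is as hard as the record suggests.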

  2. A 3D City Model with Dynamic Behaviour Based on Geospatial Managed Objects

    DEFF Research Database (Denmark)

    Kjems, Erik; Kolář, Jan

    2014-01-01

    One of the major development efforts within the GI Science domain is pointing at real time information coming from geographically referenced features in general. At the same time, 3D city models are mostly justified as objects for visualization purposes rather than as constituting the foundation of a geographic data representation of the world. The combination of 3D city models and real time information based systems, though, can provide a whole new setup for data fusion within an urban environment and provide time critical information, preserving our limited resources in the most sustainable way. Using 3D ... occasions we have been advocating for a new and advanced formulation of real world features using the concept of Geospatial Managed Objects (GMO). This chapter presents the outcome of the InfraWorld project, a 4 million Euro project financed primarily by the Norwegian Research Council, where the concept ...

  3. A maturation model for project-based organisations – with uncertainty management as an always remaining multi-project management focus

    Directory of Open Access Journals (Sweden)

    Anna Jerbrant

    2014-02-01

    Full Text Available The classical view of multi-project management does not capture its dynamic nature. Present theory falls short in the expositive dimension of how management of project-based companies evolves because of their need to be agile and adaptable to a changing environment. The purpose of this paper is therefore to present a descriptive model that elucidates the maturation processes in a project-based organization as well as to give an enhanced understanding of multi-project management in practice. The maturation model displays how the management of project-based organizations evolves between structuring administration and managing any uncertainty, and emphasizes the importance of active individual actions and situated management actions that have to be undertaken in order to coordinate, synchronize, and communicate the required knowledge and skills. The outcomes primarily reveal that, although standardized project models are used and considerable resources are spent on effective project portfolio management, how information and communication are executed is essential for the management of project-based organizations. This is particularly true for informal and non-codified communication.

  4. Factors affecting the number and type of student research products for chemistry and physics students at primarily undergraduate institutions: A case study.

    Science.gov (United States)

    Mellis, Birgit; Soto, Patricia; Bruce, Chrystal D; Lacueva, Graciela; Wilson, Anne M; Jayasekare, Rasitha

    2018-01-01

    For undergraduate students, involvement in authentic research represents scholarship that is consistent with disciplinary quality standards and provides an integrative learning experience. In conjunction with performing research, the communication of the results via presentations or publications is a measure of the level of scientific engagement. The empirical study presented here uses generalized linear mixed models with hierarchical bootstrapping to examine the factors that impact the means of dissemination of undergraduate research results. Focusing on the research experiences in physics and chemistry of undergraduates at four Primarily Undergraduate Institutions (PUIs) from 2004-2013, statistical analysis indicates that the gender of the student does not impact the number and type of research products. However, in chemistry, the rank of the faculty advisor and the venue of the presentation do impact the number of research products by undergraduate student, whereas in physics, gender match between student and advisor has an effect on the number of undergraduate research products. This study provides a baseline for future studies of discipline-based bibliometrics and factors that affect the number of research products of undergraduate students.
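The hierarchical bootstrap mentioned in this record can be sketched generically: resample institutions with replacement, then resample students within each drawn institution, and form an interval for a group difference. The code below uses synthetic data with no true gender effect; it illustrates the resampling scheme only and is not the authors' analysis.

```python
import numpy as np

rng = np.random.default_rng(3)
data = []                       # one array per institution: columns (gender, n_products)
for _ in range(4):              # four PUIs, as in the study design
    gender = rng.integers(0, 2, size=40)
    counts = rng.poisson(2.0, size=40)      # same Poisson rate for both genders
    data.append(np.column_stack([gender, counts]))

def gender_gap(sample):
    rows = np.vstack(sample)
    g, c = rows[:, 0], rows[:, 1]
    return c[g == 1].mean() - c[g == 0].mean()

boots = []
for _ in range(2000):
    picked = rng.integers(0, len(data), size=len(data))     # resample institutions
    resampled = [data[i][rng.integers(0, len(data[i]), size=len(data[i]))]
                 for i in picked]                           # resample students within
    boots.append(gender_gap(resampled))

lo, hi = np.percentile(boots, [2.5, 97.5])   # 95% hierarchical-bootstrap interval
```

Resampling at both levels propagates the between-institution variability into the interval, which is the reason the study pairs its generalized linear mixed models with hierarchical bootstrapping rather than a flat bootstrap over students.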

  5. Factors affecting the number and type of student research products for chemistry and physics students at primarily undergraduate institutions: A case study

    Science.gov (United States)

    Soto, Patricia; Bruce, Chrystal D.; Lacueva, Graciela; Wilson, Anne M.; Jayasekare, Rasitha

    2018-01-01

    For undergraduate students, involvement in authentic research represents scholarship that is consistent with disciplinary quality standards and provides an integrative learning experience. In conjunction with performing research, the communication of the results via presentations or publications is a measure of the level of scientific engagement. The empirical study presented here uses generalized linear mixed models with hierarchical bootstrapping to examine the factors that impact the means of dissemination of undergraduate research results. Focusing on the research experiences in physics and chemistry of undergraduates at four Primarily Undergraduate Institutions (PUIs) from 2004–2013, statistical analysis indicates that the gender of the student does not impact the number and type of research products. However, in chemistry, the rank of the faculty advisor and the venue of the presentation do impact the number of research products by undergraduate student, whereas in physics, gender match between student and advisor has an effect on the number of undergraduate research products. This study provides a baseline for future studies of discipline-based bibliometrics and factors that affect the number of research products of undergraduate students. PMID:29698502

  6. Efficient transfection of DNA into primarily cultured rat sertoli cells by electroporation.

    Science.gov (United States)

    Li, Fuping; Yamaguchi, Kohei; Okada, Keisuke; Matsushita, Kei; Enatsu, Noritoshi; Chiba, Koji; Yue, Huanxun; Fujisawa, Masato

    2013-03-01

    The expression of exogenous DNA in Sertoli cells is essential for studying its functional genomics, pathway analysis, and medical applications. Electroporation is a valuable tool for nucleic acid delivery, even in primarily cultured cells, which are considered difficult to transfect. In this study, we developed an optimized protocol for electroporation-based transfection of Sertoli cells and compared its efficiency with conventional lipofection. Sertoli cells were transfected with pCMV-GFP plasmid by square-wave electroporation under different conditions. After transfection of plasmid into Sertoli cells, enhanced green fluorescent protein (EGFP) expression could be easily detected by fluorescent microscopy, and cell survival was evaluated by dye exclusion assay using Trypan blue. In terms of both cell survival and the percentage expressing EGFP, 250 V was determined to produce the greatest number of transiently transfected cells. Keeping the voltage constant (250 V), relatively high cell survival (76.5% ± 3.4%) and transfection efficiency (30.6% ± 5.6%) were observed with a pulse length of 20 μm. The number of pulses significantly affected cell survival and EGFP expression. Compared with lipofection, the transfection efficiency of electroporation (21.5% ± 5.7%) was significantly higher than those of Lipofectamine 2000 (2.9% ± 1.0%) and Effectene (1.9% ± 0.8%) in this experiment, indicating that electroporation is an efficient method for the transfection of Sertoli cells.

  7. Documentation for grants equal to tax model: Volume 3, Source code

    International Nuclear Information System (INIS)

    Boryczka, M.K.

    1986-01-01

    The GETT model is capable of forecasting the amount of tax liability associated with all property owned and all activities undertaken by the US Department of Energy (DOE) in site characterization and repository development. The GETT program is a user-friendly, menu-driven model developed using dBASE III™, a relational data base management system. The data base for GETT consists primarily of eight separate dBASE III™ files corresponding to each of the eight taxes (real property, personal property, corporate income, franchise, sales, use, severance, and excise) levied by State and local jurisdictions on business property and activity. Additional smaller files help to control model inputs and reporting options. Volume 3 of the GETT model documentation is the source code. The code is arranged primarily by the eight tax types. Other code files include those for JURISDICTION, SIMULATION, VALIDATION, TAXES, CHANGES, REPORTS, GILOT, and GETT. The code has been verified through hand calculations.

  8. 49 CFR 37.195 - Purchase or lease of OTRBs by private entities not primarily in the business of transporting people.

    Science.gov (United States)

    2010-10-01

    ... primarily in the business of transporting people. 37.195 Section 37.195 Transportation Office of the... transporting people. This section applies to all purchases or leases of new vehicles by private entities which are not primarily engaged in the business of transporting people, with respect to buses delivered to...

  9. Understanding the general packing rearrangements required for successful template based modeling of protein structure from a CASP experiment.

    Science.gov (United States)

    Day, Ryan; Joo, Hyun; Chavan, Archana C; Lennox, Kristin P; Chen, Y Ann; Dahl, David B; Vannucci, Marina; Tsai, Jerry W

    2013-02-01

    As an alternative to the common template based protein structure prediction methods based on main-chain position, a novel side-chain centric approach has been developed. Together with a Bayesian loop modeling procedure and a combination scoring function, the Stone Soup algorithm was applied to the CASP9 set of template based modeling targets. Although the method did not generate perturbations to the template structures as large as necessary, the analysis of the results gives unique insights into the differences in packing between the target structures and their templates. Considerable variation in packing is found between target and template structures even when the structures are close, and this variation is due to 2 and 3 body packing interactions. Outside the inherent restrictions in packing representation of the PDB, the first steps in correctly defining those regions of variable packing have been mapped primarily to local interactions, as the packing at the secondary and tertiary structure are largely conserved. Of the scoring functions used, a loop scoring function based on water structure exhibited some promise for discrimination. These results present a clear structural path for further development of a side-chain centered approach to template based modeling. Copyright © 2012 Elsevier Ltd. All rights reserved.

  10. Bacterial diversity shift determined by different diets in the gut of the spotted wing fly Drosophila suzukii is primarily reflected on acetic acid bacteria

    KAUST Repository

    Vacchini, Violetta

    2016-11-25

The pivotal role of diet in shaping gut microbiota has been evaluated in different animal models, including insects. Drosophila flies harbour an inconstant microbiota among which acetic acid bacteria (AAB) are important components. Here, we investigated the bacterial and AAB components of the microbiota of the invasive pest Drosophila suzukii by studying the same insect population grown separately on a fruit-based or a non-fruit artificial diet. AAB were highly prevalent in the gut under both diets (90 and 92% infection rates with fruit and artificial diet, respectively). Fluorescent in situ hybridization and recolonization experiments with green fluorescent protein (Gfp)-labelled strains showed the capability of AAB to massively colonize the insect gut. High-throughput sequencing of the 16S rRNA gene indicated that the bacterial microbiota of guts fed the two diets clustered separately. When AAB-related OTUs were excluded from the analysis, the insect bacterial communities no longer clustered by diet, suggesting that diet-based diversification of the community is primarily reflected in its AAB component. Diet also influenced AAB alpha-diversity, with separate OTU distributions under each diet. High prevalence, localization, and massive recolonization, together with the diet-dependent clustering of AAB, suggest a role for AAB in the D. suzukii gut response to diet modification. This article is protected by copyright. All rights reserved.

  11. CLIMLAB: a Python-based software toolkit for interactive, process-oriented climate modeling

    Science.gov (United States)

    Rose, B. E. J.

    2015-12-01

Global climate is a complex emergent property of the rich interactions between simpler components of the climate system. We build scientific understanding of this system by breaking it down into component process models (e.g. radiation, large-scale dynamics, boundary layer turbulence), understanding each component, and putting them back together. Hands-on experience and the freedom to tinker with climate models (whether simple or complex) are invaluable for building physical understanding. CLIMLAB is an open-ended software engine for interactive, process-oriented climate modeling. With CLIMLAB you can interactively mix and match model components, or combine simpler process models into a more comprehensive model. It was created primarily to support classroom activities, using hands-on modeling to teach fundamentals of climate science at both undergraduate and graduate levels. CLIMLAB is written in Python and ties in with the rich ecosystem of open-source scientific Python tools for numerics and graphics. The IPython notebook format provides an elegant medium for distributing interactive example code. I will give an overview of the current capabilities of CLIMLAB, the curriculum we have developed thus far, and plans for the future. Using CLIMLAB requires some basic Python coding skills. We consider this an educational asset, as we are targeting upper-level undergraduates and Python is an increasingly important language in STEM fields. However, CLIMLAB is also well suited to be deployed as a computational back-end for a graphical gaming environment based on earth-system modeling.
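The process-oriented, mix-and-match style of modeling that CLIMLAB supports can be sketched in plain Python. The classes below are hypothetical toys, not the actual climlab API: each process mutates a shared state dictionary, and the model simply steps its processes in sequence.

```python
# Toy illustration of process-oriented climate modeling: independent
# process components composed into one model that steps them in turn.
# (Hypothetical classes and parameters -- not the climlab API.)

class Radiation:
    """Relaxes temperature toward a radiative-equilibrium value."""
    def __init__(self, t_eq=255.0, rate=0.1):
        self.t_eq, self.rate = t_eq, rate
    def step(self, state):
        state["T"] += self.rate * (self.t_eq - state["T"])

class SurfaceHeating:
    """Adds a constant surface-heating increment each step."""
    def __init__(self, delta=1.0):
        self.delta = delta
    def step(self, state):
        state["T"] += self.delta

class Model:
    """Composes process components and advances them in sequence."""
    def __init__(self, processes, t0=288.0):
        self.processes = processes
        self.state = {"T": t0}
    def integrate(self, n_steps):
        for _ in range(n_steps):
            for proc in self.processes:
                proc.step(self.state)
        return self.state["T"]

model = Model([Radiation(), SurfaceHeating()])
final_t = model.integrate(100)
```

With these made-up parameters the temperature converges to the joint equilibrium of the two processes (265 K); swapping in a different process object changes the model without touching the others, which is the pedagogical point.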

  12. Large-scale Comparative Study of Hi-C-based Chromatin 3D Structure Modeling Methods

    KAUST Repository

    Wang, Cheng

    2018-05-17

Chromatin is a complex polymer molecule in eukaryotic cells, primarily consisting of DNA and histones. Many works have shown that the 3D folding of chromatin structure plays an important role in DNA expression. The recently proposed Chromosome Conformation Capture technologies, especially the Hi-C assays, provide an opportunity to study how the 3D structures of the chromatin are organized. Based on the data from Hi-C experiments, many chromatin 3D structure modeling methods have been proposed. However, there is limited ground truth to validate these methods and no robust chromatin structure alignment algorithms to evaluate their performance. In our work, we first made a thorough literature review of 25 publicly available population Hi-C-based chromatin 3D structure modeling methods. Furthermore, to evaluate and compare the performance of these methods, we proposed a novel data simulation method, which combined population Hi-C and single-cell Hi-C data without ad hoc parameters. We also designed global and local alignment algorithms to measure the similarity between the templates and the chromatin structures predicted by the different modeling methods. Finally, the results from large-scale comparative tests indicated that our alignment algorithms significantly outperform those in the literature.
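As an illustration of what a global structure alignment must do, the textbook Kabsch superposition below computes the RMSD between two conformations after an optimal rigid-body fit. This is a standard method shown as a NumPy sketch, not the alignment algorithm proposed in the thesis.

```python
import numpy as np

def kabsch_rmsd(P, Q):
    """RMSD between two (n, 3) coordinate sets after optimal rigid
    superposition (Kabsch algorithm): center both sets, find the best
    rotation via SVD, then compare point by point."""
    P = P - P.mean(axis=0)
    Q = Q - Q.mean(axis=0)
    U, _, Vt = np.linalg.svd(P.T @ Q)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T   # optimal rotation
    return float(np.sqrt(((P @ R.T - Q) ** 2).sum() / len(P)))

# A structure and a rotated-plus-translated copy should align perfectly.
rng = np.random.default_rng(0)
P = rng.standard_normal((10, 3))
c, s = np.cos(0.5), np.sin(0.5)
Rz = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
Q = P @ Rz.T + np.array([1.0, 2.0, 3.0])
rmsd = kabsch_rmsd(P, Q)
```

A full chromatin-structure comparison would add correspondence handling and local windows on top of a superposition step like this one.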

  13. Lamin A/C mutation affecting primarily the right side of the heart

    Directory of Open Access Journals (Sweden)

    Laura Ollila

    2013-04-01

Full Text Available LMNA mutations are amongst the most important causes of familial dilated cardiomyopathy, whereas the most important cause of arrhythmogenic right ventricular cardiomyopathy (ARVC) is desmosomal pathology. The aim of the study was to elucidate the role of LMNA mutations among Finnish cardiomyopathy patients. We screened 135 unrelated cardiomyopathy patients for LMNA mutations. Because of their unusual phenotype, two patients were also screened for the known Finnish ARVC-related mutations of desmosomal genes, and their Plakophilin-2b gene was sequenced. Myocardial samples from two patients were examined by immunohistochemical plakoglobin staining and in one case by electron microscopy. We found a new LMNA mutation, Phe237Ser, in a family of five affected members with a cardiomyopathy affecting primarily the right side of the heart. The phenotype resembles ARVC but does not fulfill the Task Force Criteria. The main clinical manifestations of the mutation were severe tricuspid insufficiency and right ventricular enlargement and failure. Three of the affected patients died of the heart disease, and the two living patients received heart transplants at ages 44 and 47. Electron microscopy showed nuclear blebbing compatible with laminopathy. Immunohistochemical analysis did not suggest desmosomal pathology, and no desmosomal mutations were found. The Phe237Ser LMNA mutation causes a phenotype different from traditional cardiolaminopathy. Our findings suggest that cardiomyopathy affecting primarily the right side of the heart is not always caused by desmosomal pathology. Our observations highlight the challenges in classifying cardiomyopathies, as there is often significant overlap between the traditional categories.

  14. The Serotonin Transporter Undergoes Constitutive Internalization and Is Primarily Sorted to Late Endosomes and Lysosomal Degradation*

    Science.gov (United States)

    Rahbek-Clemmensen, Troels; Bay, Tina; Eriksen, Jacob; Gether, Ulrik; Jørgensen, Trine Nygaard

    2014-01-01

The serotonin transporter (SERT) plays a critical role in regulating serotonin signaling by mediating reuptake of serotonin from the extracellular space. The molecular and cellular mechanisms controlling SERT levels in the membrane remain poorly understood. To study trafficking of the surface-resident SERT, two functional epitope-tagged variants were generated. Fusion of the FLAG-tagged one-transmembrane-segment protein Tac to the SERT N terminus generated a transporter with an extracellular epitope suited for trafficking studies (TacSERT). Likewise, a construct with an extracellular antibody epitope was generated by introducing an HA (hemagglutinin) tag in extracellular loop 2 of SERT (HA-SERT). By using TacSERT and HA-SERT in antibody-based internalization assays, we show that SERT undergoes constitutive internalization in a dynamin-dependent manner. Confocal images of constitutively internalized SERT demonstrated that SERT primarily co-localized with the late endosomal/lysosomal marker Rab7, whereas little co-localization was observed with Rab11, a marker of the “long loop” recycling pathway. This sorting pattern was distinct from that of a prototypical recycling membrane protein, the β2-adrenergic receptor. Furthermore, internalized SERT co-localized with the lysosomal marker LysoTracker and not with transferrin. The sorting pattern was further confirmed by visualizing internalization of SERT using the fluorescent cocaine analog JHC1-64 and by reversible and pulse-chase biotinylation assays showing evidence for lysosomal degradation of the internalized transporter. Finally, we found that SERT internalized in response to stimulation with phorbol 12-myristate 13-acetate co-localized primarily with Rab7- and LysoTracker-positive compartments. We conclude that SERT is constitutively internalized and that the internalized transporter is sorted mainly to degradation. PMID:24973209

  15. Modeling the impact of preflushing on CTE in proton irradiated CCD-based detectors

    Science.gov (United States)

    Philbrick, R. H.

    2002-04-01

A software model is described that performs a "real world" simulation of the operation of several types of charge-coupled device (CCD)-based detectors in order to accurately predict the impact that high-energy proton radiation has on image distortion and modulation transfer function (MTF). The model was written primarily to predict the effectiveness of vertical preflushing on the custom full-frame CCD-based detectors intended for use on the proposed Kepler Discovery mission, but it is capable of simulating many other types of CCD detectors and operating modes as well. The model keeps track of the occupancy of all phosphorus-vacancy (P-V), divacancy (V-V), and oxygen-vacancy (O-V) defect centers under every CCD electrode over the entire detector area. The integrated image is read out by simulating every electrode-to-electrode charge transfer in both the vertical and horizontal CCD registers. A signal-level dependency on the capture and emission of signal is included, and the current state of each electrode (e.g., barrier or storage) is considered when distributing integrated and emitted signal. Options for performing preflushing, preflashing, and including mini-channels are available on both the vertical and horizontal CCD registers. In addition, dark signal generation and image transfer smear can be selectively enabled or disabled. A comparison of the charge transfer efficiency (CTE) data measured on the Hubble Space Telescope Imaging Spectrograph (STIS) CCD with the CTE extracted from model simulations of the STIS CCD shows good agreement.
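The capture-and-emission bookkeeping performed at every charge transfer can be illustrated with a deliberately oversimplified deterministic sketch: one pooled trap species, a fixed capture fraction, and a fixed release fraction per transfer. All figures are hypothetical and far simpler than tracking individual P-V, V-V, and O-V centers per electrode.

```python
def transfer_with_traps(signal, n_transfers, capture_frac=0.01, release_frac=0.3):
    """Propagate a charge packet through n_transfers transfers: on each
    transfer, traps capture a fraction of the free charge and release a
    fraction of the previously trapped charge back into the packet."""
    trapped = 0.0
    for _ in range(n_transfers):
        captured = capture_frac * signal
        released = release_frac * trapped
        signal += released - captured
        trapped += captured - released
    return signal, trapped

out_signal, out_trapped = transfer_with_traps(1000.0, 100)
```

Charge is conserved between the packet and the traps; the surviving fraction per transfer plays the role of the CTE, and a signal-level-dependent capture term would replace the fixed `capture_frac` in a more faithful model.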

  16. GOLD HULL AND INTERNODE2 encodes a primarily multifunctional cinnamyl-alcohol dehydrogenase in rice.

    Science.gov (United States)

    Zhang, Kewei; Qian, Qian; Huang, Zejun; Wang, Yiqin; Li, Ming; Hong, Lilan; Zeng, Dali; Gu, Minghong; Chu, Chengcai; Cheng, Zhukuan

    2006-03-01

Lignin content and composition are two important agronomic traits for the utilization of agricultural residues. The rice (Oryza sativa) gold hull and internode phenotype is a classical morphological marker trait that has long been applied in breeding and genetics studies. In this study, we have cloned the GOLD HULL AND INTERNODE2 (GH2) gene in rice using a map-based cloning approach. The results show that the gh2 mutant is a lignin-deficient mutant, and GH2 encodes a cinnamyl-alcohol dehydrogenase (CAD). Consistent with this finding, extracts from roots, internodes, hulls, and panicles of the gh2 plants exhibited drastically reduced CAD activity and undetectable sinapyl alcohol dehydrogenase activity. When expressed in Escherichia coli, purified recombinant GH2 was found to exhibit strong catalytic ability toward coniferaldehyde and sinapaldehyde, while the mutant protein gh2 completely lost the corresponding CAD and sinapyl alcohol dehydrogenase activities. Further phenotypic analysis of the gh2 mutant plants revealed that the p-hydroxyphenyl, guaiacyl, and sinapyl monomers were reduced in almost the same ratio compared to the wild type. Our results suggest GH2 acts as a primarily multifunctional CAD to synthesize coniferyl and sinapyl alcohol precursors in rice lignin biosynthesis.

  17. Physics-Based Modeling of Meteor Entry and Breakup

    Science.gov (United States)

    Prabhu, Dinesh K.; Agrawal, Parul; Allen, Gary A., Jr.; Bauschlicher, Charles W., Jr.; Brandis, Aaron M.; Chen, Yih-Kang; Jaffe, Richard L.; Palmer, Grant E.; Saunders, David A.; Stern, Eric C.; hide

    2015-01-01

A new research effort at NASA Ames Research Center has been initiated in Planetary Defense, which integrates the disciplines of planetary science, atmospheric entry physics, and physics-based risk assessment. This paper describes work within the new program and is focused on meteor entry and breakup. Over the last six decades significant effort was expended in the US and in Europe to understand meteor entry, including ablation, fragmentation, and airburst (if any), for various types of meteors ranging from stony to iron spectral types. These efforts have produced primarily empirical mathematical models based on observations. Weaknesses of these models, apart from their empiricism, are reliance on idealized shapes (spheres, cylinders, etc.) and simplified models for the thermal response of meteoritic materials to aerodynamic and radiative heating. Furthermore, the fragmentation and energy release of meteors (airburst) is poorly understood. On the other hand, the flight of human-made atmospheric entry capsules is well understood. The capsules and their requisite heatshields are designed and margined to survive entry. However, the highest-speed Earth entry for capsules is 13 km/s (Stardust). Furthermore, Earth entry capsules have never exceeded diameters of 5 m, nor have their peak aerothermal environments exceeded 0.3 atm and 1 kW/sq cm. The aims of the current work are: (i) to define the aerothermal environments for objects with entry velocities from 13 to 20 km/s; (ii) to explore various hypotheses of fragmentation and airburst of stony meteors in the near term; (iii) to explore the possibility of performing relevant ground-based tests to verify candidate hypotheses; and (iv) to quantify the energy released in airbursts. The results of the new simulations will be used to anchor the risk assessment analyses. With these aims in mind, state-of-the-art entry capsule design tools are being extended for meteor entries. We describe: (i) applications of current simulation tools to

  18. Predictive value of EEG in postanoxic encephalopathy: A quantitative model-based approach.

    Science.gov (United States)

    Efthymiou, Evdokia; Renzel, Roland; Baumann, Christian R; Poryazova, Rositsa; Imbach, Lukas L

    2017-10-01

The majority of comatose patients after cardiac arrest do not regain consciousness due to severe postanoxic encephalopathy. Early and accurate outcome prediction is therefore essential in determining further therapeutic interventions. The electroencephalogram (EEG) is a standardized and commonly available tool used to estimate prognosis in postanoxic patients. The identification of pathological EEG patterns with poor prognosis, however, relies primarily on visual EEG scoring by experts. We introduced a model-based approach to EEG analysis (state space model) that allows for an objective and quantitative description of spectral EEG variability. We retrospectively analyzed standard EEG recordings in 83 comatose patients after cardiac arrest between 2005 and 2013 in the intensive care unit of the University Hospital Zürich. Neurological outcome was assessed one month after cardiac arrest using the Cerebral Performance Category. For a dynamic and quantitative EEG analysis, we implemented a model-based approach (state space analysis) to quantify EEG background variability independent of visual scoring of EEG epochs. Spectral variability was compared between groups and correlated with clinical outcome parameters and visual EEG patterns. Quantitative assessment of spectral EEG variability (state space velocity) revealed significant differences between patients with poor and good outcome after cardiac arrest: lower mean velocity in temporal electrodes (T4 and T5) was significantly associated with poor prognostic outcome, as were pathological EEG patterns such as generalized periodic discharges. Quantitative model-based EEG analysis (state space analysis) thus provides a novel, complementary marker for prognosis in postanoxic encephalopathy. Copyright © 2017 Elsevier B.V. All rights reserved.
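The "state space velocity" notion, i.e. how fast the EEG's spectral content wanders over time, can be illustrated as the mean step-to-step distance of a trajectory of spectral feature vectors. The function below is a minimal NumPy sketch of that idea, not the authors' implementation.

```python
import numpy as np

def spectral_velocity(spectra):
    """Mean Euclidean distance between consecutive spectral feature
    vectors (rows of `spectra`): a monotonous, low-variability EEG
    yields a low value, a variable EEG a high one."""
    steps = np.diff(spectra, axis=0)
    return float(np.mean(np.linalg.norm(steps, axis=1)))

# Trajectory of three spectral snapshots in a 2-D feature space:
# one large jump followed by no change gives a mean step of 2.5.
trajectory = np.array([[0.0, 0.0], [3.0, 4.0], [3.0, 4.0]])
velocity = spectral_velocity(trajectory)
```

In practice the feature vectors would come from per-epoch power spectra of the recorded channels rather than hand-written numbers.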

  19. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

The paper demonstrates that a wide variety of event-based modeling approaches are based on special cases of the same general event concept, and that the general event concept can be used to unify the otherwise unrelated fields of information modeling and process modeling. A set of event-based modeling approaches are analyzed and the results are used to formulate a general event concept that can be used for unifying the seemingly unrelated event concepts. Events are characterized as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms of information structures. The general event concept can be used to guide systems analysis and design and to improve modeling approaches.

  20. Recovery Act: Web-based CO{sub 2} Subsurface Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Paolini, Christopher; Castillo, Jose

    2012-11-30

The Web-based CO{sub 2} Subsurface Modeling project focused primarily on extending an existing text-only, command-line driven, isothermal and isobaric, geochemical reaction-transport simulation code, developed and donated by Sienna Geodynamics, into an easier-to-use Web-based application for simulating long-term storage of CO{sub 2} in geologic reservoirs. The Web-based interface developed through this project, publicly accessible via URL http://symc.sdsu.edu/, enables rapid prototyping of CO{sub 2} injection scenarios and allows students without advanced knowledge of geochemistry to set up a typical sequestration scenario, invoke a simulation, analyze results, and then vary one or more problem parameters and quickly re-run a simulation to answer what-if questions. symc.sdsu.edu has two 12-core AMD Opteron™ 6174 2.20 GHz processors and 16 GB RAM. The Web-based application was used to develop a new computational science course at San Diego State University, COMP 670: Numerical Simulation of CO{sub 2} Sequestration, which was taught during the fall semester of 2012. The purpose of the class was to introduce graduate students to Carbon Capture, Use and Storage (CCUS) through numerical modeling and simulation, and to teach students how to interpret simulation results to make predictions about long-term CO{sub 2} storage capacity in deep brine reservoirs. In addition to the training and education component of the project, significant software development efforts took place. Two computational science doctoral students and one geological science master's student, under the direction of the PIs, extended the original code developed by Sienna Geodynamics, named Sym.8. New capabilities were added to Sym.8 to simulate non-isothermal and non-isobaric flows of charged aqueous solutes in porous media, in addition to incorporating HPC support into the code for execution on many-core XSEDE clusters. A successful outcome of this project was the funding and training of three new computational

  1. Nonparametric modeling of dynamic functional connectivity in fmri data

    DEFF Research Database (Denmark)

    Nielsen, Søren Føns Vind; Madsen, Kristoffer H.; Røge, Rasmus

    2015-01-01

    dynamic changes. The existing approaches modeling dynamic connectivity have primarily been based on time-windowing the data and k-means clustering. We propose a nonparametric generative model for dynamic FC in fMRI that does not rely on specifying window lengths and number of dynamic states. Rooted...

  2. Hydrogen peroxide production is not primarily increased in human myotubes established from type 2 diabetic subjects.

    Science.gov (United States)

    Minet, A D; Gaster, M

    2011-09-01

Increased oxidative stress and mitochondrial dysfunction have been implicated in the development of insulin resistance in type 2 diabetes. To date, it is unknown whether the increased mitochondrial reactive oxygen species (ROS) production in skeletal muscle from patients with type 2 diabetes is a primary defect or a secondary adaptation to environmental, lifestyle, and hormonal factors. This study investigates whether ROS production is primarily increased in isolated diabetic myotubes. Mitochondrial membrane potential, hydrogen peroxide (H(2)O(2)), superoxide, and mitochondrial mass were determined in human myotubes precultured under normophysiological conditions. Furthermore, the corresponding ATP synthesis was measured in isolated mitochondria. Muscle biopsies were taken from 10 lean subjects, 10 obese subjects, and 10 subjects with type 2 diabetes; satellite cells were isolated, cultured, and differentiated to myotubes. Mitochondrial mass, membrane potential/mitochondrial mass, and superoxide production/mitochondrial mass were not different between groups. In contrast, H(2)O(2) production/mitochondrial mass and ATP production were significantly reduced in diabetic myotubes compared to lean controls. H(2)O(2) production is thus not primarily increased in diabetic myotubes but rather is reduced. Moreover, the comparable ATP/H(2)O(2) ratios indicate that the reduced ROS production in diabetic myotubes parallels the reduced ATP production, because ROS production in diabetic myotubes must be considered to be in a proportion comparable to lean controls. Thus, the increased ROS production seen in skeletal muscle of type 2 diabetic patients is an adaptation to the in vivo conditions.

  3. Alcoholics Anonymous and twelve-step recovery: a model based on social and cognitive neuroscience.

    Science.gov (United States)

    Galanter, Marc

    2014-01-01

    In the course of achieving abstinence from alcohol, longstanding members of Alcoholics Anonymous (AA) typically experience a change in their addiction-related attitudes and behaviors. These changes are reflective of physiologically grounded mechanisms which can be investigated within the disciplines of social and cognitive neuroscience. This article is designed to examine recent findings associated with these disciplines that may shed light on the mechanisms underlying this change. Literature review and hypothesis development. Pertinent aspects of the neural impact of drugs of abuse are summarized. After this, research regarding specific brain sites, elucidated primarily by imaging techniques, is reviewed relative to the following: Mirroring and mentalizing are described in relation to experimentally modeled studies on empathy and mutuality, which may parallel the experiences of social interaction and influence on AA members. Integration and retrieval of memories acquired in a setting like AA are described, and are related to studies on storytelling, models of self-schema development, and value formation. A model for ascription to a Higher Power is presented. The phenomena associated with AA reflect greater complexity than the empirical studies on which this article is based, and certainly require further elucidation. Despite this substantial limitation in currently available findings, there is heuristic value in considering the relationship between the brain-based and clinical phenomena described here. There are opportunities for the study of neuroscientific correlates of Twelve-Step-based recovery, and these can potentially enhance our understanding of related clinical phenomena. © American Academy of Addiction Psychiatry.

  4. Improving satellite-based PM2.5 estimates in China using Gaussian processes modeling in a Bayesian hierarchical setting.

    Science.gov (United States)

    Yu, Wenxi; Liu, Yang; Ma, Zongwei; Bi, Jun

    2017-08-01

Using satellite-based aerosol optical depth (AOD) measurements and statistical models to estimate ground-level PM2.5 is a promising way to fill in areas that are not covered by ground PM2.5 monitors. The statistical models used in previous studies are primarily Linear Mixed Effects (LME) and Geographically Weighted Regression (GWR) models. In this study, we developed a new regression model between PM2.5 and AOD using Gaussian processes in a Bayesian hierarchical setting. Gaussian processes model the stochastic nature of the spatial random effects, where the mean surface and the covariance function are specified. The spatial stochastic process is incorporated under the Bayesian hierarchical framework to explain the variation of PM2.5 concentrations together with other factors, such as AOD and spatial and non-spatial random effects. We evaluate the results of our model and compare them with those of other, conventional statistical models (GWR and LME) by within-sample model fitting and out-of-sample validation (cross validation, CV). The results show that our model possesses a CV result (R² = 0.81) that reflects higher accuracy than that of GWR and LME (0.74 and 0.48, respectively). Our results indicate that Gaussian process models have the potential to improve the accuracy of satellite-based PM2.5 estimates.
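The core of Gaussian-process regression, a specified covariance function turned into predictions by linear algebra, can be sketched in a few lines of NumPy. The squared-exponential kernel and hyperparameters below are illustrative; the paper's full Bayesian hierarchical model with spatial random effects is considerably richer.

```python
import numpy as np

def rbf_kernel(x1, x2, length=1.0, var=1.0):
    """Squared-exponential covariance between two 1-D input vectors."""
    d = x1[:, None] - x2[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

def gp_posterior_mean(x_train, y_train, x_test, noise=1e-2):
    """Posterior mean of a zero-mean GP at x_test given noisy observations."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_star = rbf_kernel(x_test, x_train)
    return K_star @ np.linalg.solve(K, y_train)

x = np.linspace(0.0, 5.0, 20)
y = np.sin(x)                  # toy stand-in for a PM2.5-vs-AOD relationship
pred = gp_posterior_mean(x, y, np.array([2.5]))
```

In the study's setting the inputs would be spatial coordinates and covariates such as AOD, and the hyperparameters would be inferred within the Bayesian hierarchy rather than fixed.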

  5. Airfoil Shape Optimization based on Surrogate Model

    Science.gov (United States)

    Mukesh, R.; Lingadurai, K.; Selvakumar, U.

    2018-02-01

Engineering design problems always require an enormous amount of real-time experiments and computational simulations in order to assess and ensure the design objectives of the problems subject to various constraints. In most cases, the computational resources and time required per simulation are large. In certain cases, such as sensitivity analysis and design optimisation, where thousands or millions of simulations have to be carried out, this becomes prohibitively burdensome for designers. Nowadays approximation models, otherwise called surrogate models (SMs), are widely employed to reduce the computational resources and time needed to analyse various engineering systems. Various approaches such as Kriging, neural networks, polynomials, and Gaussian processes are used to construct the approximation models. The primary intention of this work is to employ the k-fold cross-validation approach to study and evaluate the influence of various theoretical variogram models on the accuracy of the surrogate model construction. Ordinary Kriging and design of experiments (DOE) approaches are used to construct the SMs by approximating panel and viscous solution algorithms, which are primarily used to solve the flow around airfoils and aircraft wings. The method of coupling the SMs with a suitable optimisation scheme to carry out an aerodynamic design optimisation process for airfoil shapes is also discussed.
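The k-fold cross-validation scheme used to score surrogate accuracy can be sketched as follows; a cubic polynomial fit stands in for the Kriging surrogate, and the data are illustrative.

```python
import numpy as np

def k_fold_cv_rmse(x, y, k=5, degree=3, seed=0):
    """k-fold cross-validated RMSE of a polynomial surrogate: hold each
    fold out in turn, fit on the remaining folds, and pool the held-out
    squared errors."""
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(x)), k)
    sq_errors = []
    for i, test in enumerate(folds):
        train = np.concatenate([f for j, f in enumerate(folds) if j != i])
        coeffs = np.polyfit(x[train], y[train], degree)
        sq_errors.append((np.polyval(coeffs, x[test]) - y[test]) ** 2)
    return float(np.sqrt(np.concatenate(sq_errors).mean()))

x = np.linspace(-1.0, 1.0, 60)
y = x**3 - 0.5 * x              # a response the cubic surrogate can capture
rmse = k_fold_cv_rmse(x, y)
```

Replacing the polynomial fit with an ordinary Kriging predictor built from a candidate variogram model gives exactly the comparison procedure the paper describes: the variogram with the lowest cross-validated RMSE wins.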

  6. gPKPDSim: a SimBiology®-based GUI application for PKPD modeling in drug development.

    Science.gov (United States)

    Hosseini, Iraj; Gajjala, Anita; Bumbaca Yadav, Daniela; Sukumaran, Siddharth; Ramanujan, Saroja; Paxson, Ricardo; Gadkar, Kapil

    2018-04-01

Modeling and simulation (M&S) is increasingly used in drug development to characterize pharmacokinetic-pharmacodynamic (PKPD) relationships and support various efforts such as target feasibility assessment, molecule selection, human PK projection, and preclinical and clinical dose and schedule determination. While model development typically requires mathematical modeling expertise, model exploration and simulation could in many cases be performed by scientists in various disciplines to support the design, analysis, and interpretation of experimental studies. To this end, we have developed a versatile graphical user interface (GUI) application to enable easy use of any model constructed in SimBiology® to execute various common PKPD analyses. The MATLAB®-based GUI application, called gPKPDSim, has a single-screen interface and provides functionalities including simulation, data fitting (parameter estimation), population simulation (exploring the impact of parameter variability on the outputs of interest), and non-compartmental PK analysis. Further, gPKPDSim is a user-friendly tool with capabilities including interactive visualization, exporting of results, and generation of presentation-ready figures. gPKPDSim was designed primarily for use in preclinical and translational drug development, although broader applications exist. gPKPDSim is a MATLAB®-based open-source application and is publicly available to download from MATLAB® Central™. We illustrate the use and features of gPKPDSim using multiple PKPD models to demonstrate the wide applications of this tool in pharmaceutical sciences. Overall, gPKPDSim provides an integrated, multi-purpose, user-friendly GUI application to enable efficient use of PKPD models by scientists from various disciplines, regardless of their modeling expertise.
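The simplest member of the model family such a tool wraps is a one-compartment IV-bolus PK model, C(t) = (dose/V)·exp(-k_el·t). The sketch below uses hypothetical parameter values and is only an illustration of the kind of simulation gPKPDSim exposes through its GUI.

```python
import math

def one_compartment_iv(dose, volume, k_el, times):
    """Concentration-time profile for a one-compartment model after an
    IV bolus: C(t) = (dose / volume) * exp(-k_el * t)."""
    c0 = dose / volume
    return [c0 * math.exp(-k_el * t) for t in times]

# Hypothetical parameters: 100 mg dose, 10 L volume, 0.1 1/h elimination.
conc = one_compartment_iv(dose=100.0, volume=10.0, k_el=0.1,
                          times=[0.0, 7.0, 14.0])
```

Parameter estimation then amounts to adjusting `volume` and `k_el` until the predicted profile matches observed concentrations, and population simulation to re-running the profile over sampled parameter sets.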

  7. Comparative analysis of various methods for modelling permanent magnet machines

    NARCIS (Netherlands)

    Ramakrishnan, K.; Curti, M.; Zarko, D.; Mastinu, G.; Paulides, J.J.H.; Lomonova, E.A.

    2017-01-01

    In this paper, six different modelling methods for permanent magnet (PM) electric machines are compared in terms of their computational complexity and accuracy. The methods are based primarily on conformal mapping, mode matching, and harmonic modelling. In the case of conformal mapping, slotted air

  8. Rule-based decision making model

    International Nuclear Information System (INIS)

    Sirola, Miki

    1998-01-01

A rule-based decision making model is designed in the G2 environment. A theoretical and methodological frame for the model is composed and motivated. The rule-based decision making model is based on object-oriented modelling, knowledge engineering, and decision theory. The idea of a safety objective tree is utilized, and advanced rule-based methodologies are applied. A general decision making model, the 'decision element', is constructed. The strategy planning of the decision element is based on, e.g., value theory and utility theory. A hypothetical process model is built to give input data for the decision element. The basic principle of the object model in decision making is division into tasks. Probability models are used in characterizing component availabilities, and Bayes' theorem is used to recalculate the probability figures when new information is obtained. The model includes simple learning features to save the solution path. A decision analytic interpretation is given to the decision making process. (author)
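The Bayes'-theorem recalculation of component availability mentioned above can be sketched directly; the prior and likelihood figures below are hypothetical.

```python
def bayes_update(prior, likelihood):
    """Posterior P(H | E) from prior P(H) and likelihood P(E | H),
    normalized over the hypotheses in `prior`."""
    unnormalized = {h: prior[h] * likelihood[h] for h in prior}
    total = sum(unnormalized.values())
    return {h: p / total for h, p in unnormalized.items()}

# A component is believed 90% available; an alarm is observed that is
# rare when the component is available but common when it has failed.
prior = {"available": 0.9, "failed": 0.1}
likelihood = {"available": 0.05, "failed": 0.8}
posterior = bayes_update(prior, likelihood)
```

Each new observation feeds the current posterior back in as the next prior, which is exactly the "recalculate the probability figures when new information is obtained" step in the decision element.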

  9. Water-Based Pressure-Sensitive Paints

    Science.gov (United States)

    Jordan, Jeffrey D.; Watkins, A. Neal; Oglesby, Donald M.; Ingram, JoAnne L.

    2006-01-01

Water-based pressure-sensitive paints (PSPs) have been invented as alternatives to conventional organic-solvent-based pressure-sensitive paints, which are used primarily for indicating distributions of air pressure on wind-tunnel models. Typically, PSPs are sprayed onto aerodynamic models after they have been mounted in wind tunnels. When conventional organic-solvent-based PSPs are used, this practice creates the problem of removing toxic fumes from inside the wind tunnels. The use of water-based PSPs eliminates this problem. The water-based PSPs offer high performance as pressure indicators, plus all the advantages of common water-based paints (low toxicity, low concentrations of volatile organic compounds, and easy cleanup by use of water).

  10. X-ray and CT signs of connective tissue dysplasia in patients with primarily diagnosed infiltrative pulmonary tuberculosis

    International Nuclear Information System (INIS)

    Sukhanova, L.A.; Sharmazanova, O.P.

    2009-01-01

    The x-ray signs of connective tissue systemic dysplasia (CTSD) in patients with primarily diagnosed pulmonary tuberculosis were investigated. Fifty-four patients (28 men and 26 women aged 18-70) with primarily diagnosed infiltrative pulmonary tuberculosis underwent x-ray study. In patients with infiltrative pulmonary tuberculosis, CTSD manifests in the lungs as their diminished volume, deformity of the lung pattern, a high position of the diaphragm cupola, and mediastinal shift towards the side of the pathology, which is better seen on CT. The degree of the CTSD x-ray signs in the lungs depends on the number of phenotypical signs, i.e., the degree of the disease manifestation. CT allows more accurate determination of the signs of the connective tissue dysplasia in which tuberculosis develops.

  11. Introducing Model-Based System Engineering Transforming System Engineering through Model-Based Systems Engineering

    Science.gov (United States)

    2014-03-31

    (The extracted abstract text consists of table-of-contents and figure-list residue; the recoverable content is that the report discusses the terms Model Based Engineering (MBE), Model Driven Engineering (MDE), and Model-Based Systems Engineering (MBSE).)

  12. Integration of Simulink Models with Component-based Software Models

    DEFF Research Database (Denmark)

    Marian, Nicolae

    2008-01-01

    Model based development aims to facilitate the development of embedded control systems by emphasizing the separation of the design level from the implementation level. Model based design involves the use of multiple models that represent different views of a system, having different semantics...... of abstract system descriptions. Usually, in mechatronics systems, design proceeds by iterating model construction, model analysis, and model transformation. Constructing a MATLAB/Simulink model, a plant and controller behavior is simulated using graphical blocks to represent mathematical and logical...... constraints. COMDES (Component-based Design of Software for Distributed Embedded Systems) is such a component-based system framework developed by the software engineering group of Mads Clausen Institute for Product Innovation (MCI), University of Southern Denmark. Once specified, the software model has...

  13. Coach simplified structure modeling and optimization study based on the PBM method

    Science.gov (United States)

    Zhang, Miaoli; Ren, Jindong; Yin, Ying; Du, Jian

    2016-09-01

    For the coach industry, rapid modeling and efficient optimization methods are desirable for structure modeling and optimization based on simplified structures, especially for use early in the concept phase, with the capability of accurately expressing the mechanical properties of the structure and with flexible section forms. However, the present dimension-based methods cannot easily meet these requirements. To achieve these goals, the property-based modeling (PBM) beam modeling method is studied based on PBM theory and in conjunction with the characteristic of coach structures that beams are the main components. For a beam component of given length, its mechanical characteristics are primarily affected by the section properties. Four section parameters are adopted to describe the mechanical properties of a beam: the section area, the principal moments of inertia about the two principal axes, and the torsion constant of the section. Based on an equivalent stiffness strategy, expressions for the above section parameters are derived, and the PBM beam element is implemented in the HyperMesh software. A case is realized using this method, in which the structure of a passenger coach is simplified. The model precision is validated by comparing the basic performance of the total structure with that of the original structure, including the bending and torsion stiffness and the first-order bending and torsional modal frequencies. Sensitivity analysis is conducted to choose design variables. The optimal Latin hypercube experiment design is adopted to sample the test points, and polynomial response surfaces are used to fit these points. To improve the bending and torsion stiffness and the first-order torsional frequency, and taking the allowable maximum stresses of the braking and left-turning conditions as constraints, the multi-objective optimization of the structure is conducted using the NSGA-II genetic algorithm on the ISIGHT platform. The result of the

  14. Model-free and model-based reward prediction errors in EEG.

    Science.gov (United States)

    Sambrook, Thomas D; Hardwick, Ben; Wills, Andy J; Goslin, Jeremy

    2018-05-24

    Learning theorists posit two reinforcement learning systems: model-free and model-based. Model-based learning incorporates knowledge about structure and contingencies in the world to assign candidate actions with an expected value. Model-free learning is ignorant of the world's structure; instead, actions hold a value based on prior reinforcement, with this value updated by expectancy violation in the form of a reward prediction error. Because they use such different learning mechanisms, it has been previously assumed that model-based and model-free learning are computationally dissociated in the brain. However, recent fMRI evidence suggests that the brain may compute reward prediction errors to both model-free and model-based estimates of value, signalling the possibility that these systems interact. Because of its poor temporal resolution, fMRI risks confounding reward prediction errors with other feedback-related neural activity. In the present study, EEG was used to show the presence of both model-based and model-free reward prediction errors and their place in a temporal sequence of events including state prediction errors and action value updates. This demonstration of model-based prediction errors questions a long-held assumption that model-free and model-based learning are dissociated in the brain. Copyright © 2018 Elsevier Inc. All rights reserved.
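The model-free reward prediction error described above has a standard temporal-difference form. A minimal sketch follows; the learning rate and reward sequence are illustrative, not the study's task or parameters.

```python
# Minimal model-free value update: the reward prediction error (RPE)
# is the expectancy violation described in the abstract. The learning
# rate and reward history here are purely illustrative.

def td_update(value, reward, lr=0.5):
    rpe = reward - value           # reward prediction error
    return value + lr * rpe, rpe

v = 0.0
for r in [1.0, 1.0, 0.0]:          # hypothetical reward history
    v, rpe = td_update(v, r)
print(v, rpe)                      # final value estimate and last RPE
```

A model-based learner would instead derive expected value from a learned state-transition model, which is where the separate state prediction errors reported in the study arise.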

  15. Transcription-based model for the induction of chromosomal exchange events by ionising radiation

    International Nuclear Information System (INIS)

    Radford, I.A.

    2003-01-01

    The mechanistic basis for chromosomal aberration formation, following exposure of mammalian cells to ionising radiation, has long been debated. Although chromosomal aberrations are probably initiated by DNA double-strand breaks (DSB), little is understood about the mechanisms that generate and modulate DNA rearrangement. Based on results from our laboratory and data from the literature, a novel model of chromosomal aberration formation has been suggested (Radford 2002). The basic postulates of this model are that: (1) DSB, primarily those involving multiple individual damage sites (i.e. complex DSB), are the critical initiating lesion; (2) only those DSB occurring in transcription units that are associated with transcription 'factories' (complexes containing multiple transcription units) induce chromosomal exchange events; (3) such DSB are brought into contact with a DNA topoisomerase I molecule through RNA polymerase II catalysed transcription and give rise to trapped DNA-topo I cleavage complexes; and (4) trapped complexes interact with another topo I molecule on a temporarily inactive transcription unit at the same transcription factory leading to DNA cleavage and subsequent strand exchange between the cleavage complexes. We have developed a method using inverse PCR that allows the detection and sequencing of putative ionising radiation-induced DNA rearrangements involving different regions of the human genome (Forrester and Radford 1998). The sequences detected by inverse PCR can provide a test of the prediction of the transcription-based model that ionising radiation-induced DNA rearrangements occur between sequences in active transcription units. Accordingly, reverse transcriptase PCR was used to determine if sequences involved in rearrangements were transcribed in the test cells. Consistent with the transcription-based model, nearly all of the sequences examined gave a positive result to reverse transcriptase PCR (Forrester and Radford unpublished)

  16. Guide to APA-Based Models

    Science.gov (United States)

    Robins, Robert E.; Delisi, Donald P.

    2008-01-01

    In Robins and Delisi (2008), a linear decay model, a new IGE model by Sarpkaya (2006), and a series of APA-Based models were scored using data from three airports. This report is a guide to the APA-based models.

  17. Mathematical Modelling in Engineering: A Proposal to Introduce Linear Algebra Concepts

    Science.gov (United States)

    Cárcamo Bahamonde, Andrea; Gómez Urgelles, Joan; Fortuny Aymemí, Josep

    2016-01-01

    The modern dynamic world requires that basic science courses for engineering, including linear algebra, emphasise the development of mathematical abilities primarily associated with modelling and interpreting, which are not exclusively calculus abilities. Considering this, an instructional design was created based on mathematical modelling and…

  18. NASA/Air Force Cost Model: NAFCOM

    Science.gov (United States)

    Winn, Sharon D.; Hamcher, John W. (Technical Monitor)

    2002-01-01

    The NASA/Air Force Cost Model (NAFCOM) is a parametric estimating tool for space hardware. It is based on historical NASA and Air Force space projects and is primarily used in the very early phases of a development project. NAFCOM can be used at the subsystem or component levels.

  19. Traceability in Model-Based Testing

    Directory of Open Access Journals (Sweden)

    Mathew George

    2012-11-01

    The growing complexities of software and the demand for shorter time to market are two important challenges that face today’s IT industry. These challenges demand the increase of both productivity and quality of software. Model-based testing is a promising technique for meeting these challenges. Traceability modeling is a key issue and challenge in model-based testing. Relationships between the different models will help to navigate from one model to another, and trace back to the respective requirements and the design model when the test fails. In this paper, we present an approach for bridging the gaps between the different models in model-based testing. We propose relation definition markup language (RDML) for defining the relationships between models.

  20. A practical procedure for the selection of time-to-failure models based on the assessment of trends in maintenance data

    International Nuclear Information System (INIS)

    Louit, D.M.; Pascual, R.; Jardine, A.K.S.

    2009-01-01

    Reliability studies often rely on false premises, such as the assumption of independent and identically distributed times between failures (a renewal process). This can lead to erroneous model selection for the time to failure of a particular component or system, which can in turn lead to wrong conclusions and decisions. A strong statistical focus, a lack of a systematic approach, and sometimes an inadequate theoretical background seem to have made it difficult for maintenance analysts to adopt the necessary stage of data testing before the selection of a suitable model. In this paper, a framework for selecting a model to represent the failure process of a component or system is presented, based on a review of available trend tests. The paper considers only single-time-variable models and is primarily directed to analysts responsible for reliability analyses in an industrial maintenance environment. The model selection framework discriminates between the use of statistical distributions to represent the time to failure (the 'renewal approach') and the use of stochastic point processes (the 'repairable systems approach') when system ageing or reliability growth may be present. An illustrative example based on failure data from a fleet of backhoes is included.
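One widely used trend test of the kind such a framework reviews is the Laplace test. The sketch below is a generic textbook version with hypothetical failure times; it does not reproduce the paper's test battery or the backhoe data.

```python
import math

def laplace_u(times, T):
    """Laplace trend statistic for failure times observed over (0, T].
    Values near 0 are consistent with a stationary (trend-free) process;
    large positive values suggest deterioration and large negative values
    improvement, pointing towards a point-process ('repairable systems')
    model rather than a renewal model."""
    n = len(times)
    return (sum(times) / n - T / 2) / (T * math.sqrt(1.0 / (12 * n)))

# Hypothetical failure times (hours) clustered late in the window:
u = laplace_u([400, 650, 800, 900, 980], T=1000)
print(round(u, 2))
```

Here the statistic (about 1.91) exceeds the one-sided 5% critical value of roughly 1.645, so the renewal assumption would be rejected in favour of a deteriorating-system model.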

  1. Modeling complexes of modeled proteins.

    Science.gov (United States)

    Anishchenko, Ivan; Kundrotas, Petras J; Vakser, Ilya A

    2017-03-01

    Structural characterization of proteins is essential for understanding life processes at the molecular level. However, only a fraction of known proteins have experimentally determined structures. This fraction is even smaller for protein-protein complexes. Thus, structural modeling of protein-protein interactions (docking) primarily has to rely on modeled structures of the individual proteins, which typically are less accurate than the experimentally determined ones. Such "double" modeling is the Grand Challenge of structural reconstruction of the interactome. Yet it remains so far largely untested in a systematic way. We present a comprehensive validation of template-based and free docking on a set of 165 complexes, where each protein model has six levels of structural accuracy, from 1 to 6 Å Cα RMSD. Many template-based docking predictions fall into the acceptable quality category, according to the CAPRI criteria, even for highly inaccurate proteins (5-6 Å RMSD), although the number of such models (and, consequently, the docking success rate) drops significantly for models with RMSD > 4 Å. The results show that the existing docking methodologies can be successfully applied to protein models with a broad range of structural accuracy, and the template-based docking is much less sensitive to inaccuracies of protein models than the free docking. Proteins 2017; 85:470-478. © 2016 Wiley Periodicals, Inc.
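The Cα RMSD used above as the accuracy scale is simply the root-mean-square deviation over matched coordinates. A minimal sketch with toy coordinates follows; it assumes the structures are already optimally superposed (no Kabsch alignment is performed).

```python
import math

def rmsd(a, b):
    """RMSD between two equal-length, already-superposed coordinate sets."""
    n = len(a)
    total = sum((p - q) ** 2
                for pt_a, pt_b in zip(a, b)
                for p, q in zip(pt_a, pt_b))
    return math.sqrt(total / n)

# Toy 3-point "Cα traces" offset by 1 Å along z:
model  = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
native = [(0.0, 0.0, 1.0), (1.0, 0.0, 1.0), (2.0, 0.0, 1.0)]
print(rmsd(model, native))  # 1.0
```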

  2. Activity-based DEVS modeling

    DEFF Research Database (Denmark)

    Alshareef, Abdurrahman; Sarjoughian, Hessam S.; Zarrin, Bahram

    2018-01-01

    architecture and the UML concepts. In this paper, we further this work by grounding Activity-based DEVS modeling and developing a fully-fledged modeling engine to demonstrate applicability. We also detail the relevant aspects of the created metamodel in terms of modeling and simulation. A significant number......Use of model-driven approaches has been increasing to significantly benefit the process of building complex systems. Recently, an approach for specifying model behavior using UML activities has been devised to support the creation of DEVS models in a disciplined manner based on the model driven...... of the artifacts of the UML 2.5 activities and actions, from the vantage point of DEVS behavioral modeling, is covered in details. Their semantics are discussed to the extent of time-accurate requirements for simulation. We characterize them in correspondence with the specification of the atomic model behavior. We...

  3. Lévy-based growth models

    DEFF Research Database (Denmark)

    Jónsdóttir, Kristjana Ýr; Schmiegel, Jürgen; Jensen, Eva Bjørn Vedel

    2008-01-01

    In the present paper, we give a condensed review, for the nonspecialist reader, of a new modelling framework for spatio-temporal processes, based on Lévy theory. We show the potential of the approach in stochastic geometry and spatial statistics by studying Lévy-based growth modelling of planar o...... objects. The growth models considered are spatio-temporal stochastic processes on the circle. As a by-product, flexible new models for space–time covariance functions on the circle are provided. An application of the Lévy-based growth models to tumour growth is discussed....

  4. Control volume based modelling in one space dimension of oscillating, compressible flow in reciprocating machines

    DEFF Research Database (Denmark)

    Andersen, Stig Kildegård; Carlsen, Henrik; Thomsen, Per Grove

    2006-01-01

    We present an approach for modelling unsteady, primarily one-dimensional, compressible flow. The conservation laws for mass, energy, and momentum are applied to a staggered mesh of control volumes and loss mechanisms are included directly as extra terms. Heat transfer, flow friction, and multidim...... are presented. The capabilities of the approach are illustrated with an example solution and an experimental validation of a Stirling engine model....

  5. Gradient-based model calibration with proxy-model assistance

    Science.gov (United States)

    Burrows, Wesley; Doherty, John

    2016-02-01

    Use of a proxy model in gradient-based calibration and uncertainty analysis of a complex groundwater model with large run times and problematic numerical behaviour is described. The methodology is general, and can be used with models of all types. The proxy model is based on a series of analytical functions that link all model outputs used in the calibration process to all parameters requiring estimation. In enforcing history-matching constraints during the calibration and post-calibration uncertainty analysis processes, the proxy model is run for the purposes of populating the Jacobian matrix, while the original model is run when testing parameter upgrades; the latter process is readily parallelized. Use of a proxy model in this fashion dramatically reduces the computational burden of complex model calibration and uncertainty analysis. At the same time, the effect of model numerical misbehaviour on calculation of local gradients is mitigated, thus allowing access to the benefits of gradient-based analysis where a lack of integrity in finite-difference derivatives calculation would otherwise have impeded such access. Construction of a proxy model, and its subsequent use in calibration of a complex model, and in analysing the uncertainties of predictions made by that model, is implemented in the PEST suite.
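The division of labour described above (cheap proxy for derivatives, expensive model only for testing upgrades) can be caricatured in a few lines. The one-parameter functions below are toys standing in for the misfit of the groundwater model and its proxy; this is not PEST.

```python
# Toy sketch of proxy-assisted gradient calibration: derivatives come
# from a cheap, deliberately inaccurate surrogate, while each candidate
# parameter upgrade is accepted only if the expensive "original" model
# actually improves. All functions and constants are hypothetical.

def original(p):               # stand-in for the expensive model's misfit
    return (p - 3.0) ** 2

def proxy(p):                  # cheap, slightly wrong surrogate
    return (p - 3.05) ** 2

def proxy_gradient(p, h=1e-4): # finite differences run on the proxy only
    return (proxy(p + h) - proxy(p - h)) / (2 * h)

p = 0.0
for _ in range(50):
    candidate = p - 0.4 * proxy_gradient(p)
    if original(candidate) < original(p):  # upgrade tested on the original
        p = candidate
print(round(p, 2))  # settles near the original model's optimum
```

The surrogate's bias limits the final accuracy, but every accepted step is vetted against the true objective, which is the safeguard the abstract describes.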

  6. HMM-based Trust Model

    DEFF Research Database (Denmark)

    ElSalamouny, Ehab; Nielsen, Mogens; Sassone, Vladimiro

    2010-01-01

    Probabilistic trust has been adopted as an approach to taking security sensitive decisions in modern global computing environments. Existing probabilistic trust frameworks either assume fixed behaviour for the principals or incorporate the notion of ‘decay' as an ad hoc approach to cope...... with their dynamic behaviour. Using Hidden Markov Models (HMMs) for both modelling and approximating the behaviours of principals, we introduce the HMM-based trust model as a new approach to evaluating trust in systems exhibiting dynamic behaviour. This model avoids the fixed behaviour assumption which is considered...... the major limitation of existing Beta trust model. We show the consistency of the HMM-based trust model and contrast it against the well known Beta trust model with the decay principle in terms of the estimation precision....

  7. An Elaboration of a Strategic Alignment Model of University Information Systems based on SAM Model

    Directory of Open Access Journals (Sweden)

    S. Ahriz

    2018-02-01

    An information system is a guarantee of a university's ability to anticipate the functions essential to its development and durability. The alignment of the information system, one of the pillars of IT governance, has become a necessity. In this paper, we consider the problem of strategic alignment model implementation in Moroccan universities. The literature revealed that few studies have examined strategic alignment in the public sector, particularly in higher education institutions. Hence we opted for an exploratory approach that aims at a better understanding of strategic alignment and at evaluating the degree of its use within Moroccan universities. The data, gained primarily through interviews with top managers and IT managers, reveal that the alignment is not formalized and that it would be appropriate to implement an alignment model. It is found that the implementation of our proposed model can help managers to maximize returns on IT investment and to increase their efficiency.

  8. Base Station Performance Model

    OpenAIRE

    Walsh, Barbara; Farrell, Ronan

    2005-01-01

    At present the testing of power amplifiers within base station transmitters is limited to testing at component level as opposed to testing at the system level. While the detection of catastrophic failure is possible, that of performance degradation is not. This paper proposes a base station model with respect to transmitter output power with the aim of introducing system level monitoring of the power amplifier behaviour within the base station. Our model reflects the expe...

  9. Model based design introduction: modeling game controllers to microprocessor architectures

    Science.gov (United States)

    Jungwirth, Patrick; Badawy, Abdel-Hameed

    2017-04-01

    We present an introduction to model based design. Model based design is a visual representation, generally a block diagram, used to model and incrementally develop a complex system. Model based design is a commonly used design methodology for digital signal processing, control systems, and embedded systems. Model based design's philosophy is to solve a problem one step at a time. The approach can be compared to a series of steps converging to a solution. A block diagram simulation tool allows a design to be simulated with real world measurement data. For example, if an analog control system is being upgraded to a digital control system, the analog sensor input signals can be recorded. The digital control algorithm can be simulated with the real world sensor data. The output from the simulated digital control system can then be compared to the old analog based control system. Model based design can be compared to Agile software development. The Agile software development goal is to develop working software in incremental steps, with progress measured in completed and tested code units; progress in model based design is measured in completed and tested blocks. We present a concept for a video game controller and then use model based design to iterate the design towards a working system. We also describe a model based design effort to develop an OS Friendly Microprocessor Architecture based on RISC-V.

  10. The impact of design-based modeling instruction on seventh graders' spatial abilities and model-based argumentation

    Science.gov (United States)

    McConnell, William J.

    Due to the call of current science education reform for the integration of engineering practices within science classrooms, design-based instruction is receiving much attention in science education literature. Although some aspect of modeling is often included in well-known design-based instructional methods, it is not always a primary focus. The purpose of this study was to better understand how design-based instruction with an emphasis on scientific modeling might impact students' spatial abilities and their model-based argumentation abilities. In the following mixed-method multiple case study, seven seventh grade students attending a secular private school in the Mid-Atlantic region of the United States underwent an instructional intervention involving design-based instruction, modeling and argumentation. Through the course of a lesson involving students in exploring the interrelatedness of the environment and an animal's form and function, students created and used multiple forms of expressed models to assist them in model-based scientific argument. Pre/post data were collected through the use of The Purdue Spatial Visualization Test: Rotation, the Mental Rotation Test and interviews. Other data included a spatial activities survey, student artifacts in the form of models, notes, exit tickets, and video recordings of students throughout the intervention. Spatial abilities tests were analyzed using descriptive statistics while students' arguments were analyzed using the Instrument for the Analysis of Scientific Curricular Arguments and a behavior protocol. Models were analyzed using content analysis and interviews and all other data were coded and analyzed for emergent themes. Findings in the area of spatial abilities included increases in spatial reasoning for six out of seven participants, and an immense difference in the spatial challenges encountered by students when using CAD software instead of paper drawings to create models. 
Students perceived 3D printed

  11. EPR-based material modelling of soils

    Science.gov (United States)

    Faramarzi, Asaad; Alani, Amir M.

    2013-04-01

    In the past few decades, as a result of the rapid developments in computational software and hardware, alternative computer-aided pattern recognition approaches have been introduced to the modelling of many engineering problems, including constitutive modelling of materials. The main idea behind pattern recognition systems is that they learn adaptively from experience and extract various discriminants, each appropriate for its purpose. In this work an approach is presented for developing material models for soils based on evolutionary polynomial regression (EPR). EPR is a recently developed hybrid data mining technique that searches for structured mathematical equations (representing the behaviour of a system) using a genetic algorithm and the least squares method. Stress-strain data from triaxial tests are used to train and develop EPR-based material models for soil. The developed models are compared with some of the well-known conventional material models, and it is shown that EPR-based models can provide a better prediction of the behaviour of soils. The main benefits of using EPR-based material models are that they provide a unified approach to constitutive modelling of all materials (i.e., all aspects of material behaviour can be implemented within the unified environment of an EPR model) and that they do not require any arbitrary choice of constitutive (mathematical) model. In EPR-based material models there are no material parameters to be identified. As the model is trained directly on experimental data, EPR-based material models are the shortest route from experimental research (data) to numerical modelling. Another advantage of EPR-based constitutive models is that, as more experimental data become available, the quality of the EPR prediction can be improved by learning from the additional data, so the EPR model becomes more effective and robust. The developed EPR-based material models can be incorporated in finite element (FE) analysis.
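As a simplified stand-in for the EPR idea, the sketch below fits a structured polynomial to synthetic stress-strain data by least squares. Real EPR would additionally search the term structure with a genetic algorithm; here the terms are fixed and the data are fabricated for illustration.

```python
import numpy as np

# Simplified stand-in for EPR: least-squares fit of a fixed polynomial
# structure to synthetic "triaxial" stress-strain data. EPR proper would
# also evolve which terms appear, using a genetic algorithm.

strain = np.linspace(0.0, 0.05, 20)
stress = 200e3 * strain - 1.5e6 * strain**2   # synthetic data, made up

X = np.column_stack([strain, strain**2])      # candidate polynomial terms
coeffs, *_ = np.linalg.lstsq(X, stress, rcond=None)
print(np.allclose(coeffs, [200e3, -1.5e6]))   # True: structure recovered
```

Because the model is fitted directly to the data, no material parameters need to be identified separately, which is the point the abstract makes.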

  12. HYSOGs250m, global gridded hydrologic soil groups for curve-number-based runoff modeling.

    Science.gov (United States)

    Ross, C Wade; Prihodko, Lara; Anchang, Julius; Kumar, Sanath; Ji, Wenjie; Hanan, Niall P

    2018-05-15

    Hydrologic soil groups (HSGs) are a fundamental component of the USDA curve-number (CN) method for estimation of rainfall runoff; yet these data are not readily available in a format or spatial resolution suitable for regional- and global-scale modeling applications. We developed a globally consistent, gridded dataset defining HSGs from soil texture, bedrock depth, and groundwater. The resulting data product, HYSOGs250m, represents runoff potential at 250 m spatial resolution. Our analysis indicates that the global distribution of soil is dominated by moderately high runoff potential, followed by moderately low, high, and low runoff potential. Low runoff potential, sandy soils are found primarily in parts of the Sahara and Arabian Deserts. High runoff potential soils occur predominantly within tropical and sub-tropical regions. No clear pattern could be discerned for moderately low runoff potential soils, as they occur in arid and humid environments and at both high and low elevations. Potential applications of this data include CN-based runoff modeling, flood risk assessment, and use as a covariate for biogeographical analysis of vegetation distributions.
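The CN method for which HYSOGs250m supplies the soil-group input reduces to a short calculation. The equations below are the standard SCS relations; the storm depth and CN values chosen are illustrative (a CN also depends on land cover, not on the soil group alone).

```python
def scs_runoff(P, CN, ia_ratio=0.2):
    """SCS curve-number runoff depth Q (inches) for rainfall P (inches).
    S is the potential maximum retention, Ia the initial abstraction."""
    S = 1000.0 / CN - 10.0
    Ia = ia_ratio * S
    if P <= Ia:
        return 0.0
    return (P - Ia) ** 2 / (P - Ia + S)

# A higher-runoff-potential soil group maps to a higher CN, hence more
# runoff from the same hypothetical 3-inch storm:
print(round(scs_runoff(3.0, CN=85), 2), round(scs_runoff(3.0, CN=60), 2))
```

For this storm the high-CN soil yields roughly five times the runoff depth of the low-CN soil, which is why gridded HSG data matter for flood risk assessment.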

  13. Issues in practical model-based diagnosis

    NARCIS (Netherlands)

    Bakker, R.R.; van den Bempt, P.C.A.; Mars, Nicolaas; Out, D.J.; van Soest, D.C.

    1993-01-01

    The model-based diagnosis project at the University of Twente has been directed at improving the practical usefulness of model-based diagnosis. In cooperation with industrial partners, the research addressed the modeling problem and the efficiency problem in model-based reasoning. Main results of

  14. Agent-based modeling and network dynamics

    CERN Document Server

    Namatame, Akira

    2016-01-01

    The book integrates agent-based modeling and network science. It is divided into three parts, namely, foundations, primary dynamics on and of social networks, and applications. The book begins with the network origin of agent-based models, known as cellular automata, and introduces a number of classic models, such as Schelling’s segregation model and Axelrod’s spatial game. The essence of the foundation part is the network-based agent-based models in which agents follow network-based decision rules. Under the influence of the substantial progress in network science in the late 1990s, these models have been extended from using lattices to using small-world networks, scale-free networks, etc. The book also shows that modern network science, mainly driven by game-theorists and sociophysicists, has inspired agent-based social scientists to develop alternative formation algorithms, known as agent-based social networks. The book reviews a number of pioneering and representative models in this family. Upon the gi...

  15. Graph Based Models for Unsupervised High Dimensional Data Clustering and Network Analysis

    Science.gov (United States)

    2015-01-01

    A. Porter and my advisor. The text is primarily written by me. Chapter 5 is a version of [46] where my contribution is all of the analytical ...in Euclidean space, a variational method refers to using calculus of variations techniques to find the minimizer (or maximizer) of a functional (energy... geometric interpretation of modularity optimization contrasts with existing interpretations (e.g., probabilistic ones or in terms of the Potts model

  16. Principles of models based engineering

    Energy Technology Data Exchange (ETDEWEB)

    Dolin, R.M.; Hefele, J.

    1996-11-01

    This report describes a Models Based Engineering (MBE) philosophy and implementation strategy that has been developed at Los Alamos National Laboratory's Center for Advanced Engineering Technology. A major theme in this discussion is that models based engineering is an information management technology enabling the development of information driven engineering. Unlike other information management technologies, models based engineering encompasses the breadth of engineering information, from design intent through product definition to consumer application.

  17. Model-based Software Engineering

    DEFF Research Database (Denmark)

    Kindler, Ekkart

    2010-01-01

    The vision of model-based software engineering is to make models the main focus of software development and to automatically generate software from these models. Part of that idea works already today. But, there are still difficulties when it comes to behaviour. Actually, there is no lack in models...

  18. Triacylglycerol Accumulation is not primarily affected in Myotubes established from Type 2 Diabetic Subjects

    DEFF Research Database (Denmark)

    Gaster, Michael; Beck-Nielsen, Henning

    2006-01-01

    In the present study, we investigated triacylglycerol (TAG) accumulation, glucose and fatty acid (FA) uptake, and glycogen synthesis (GS) in human myotubes from healthy, lean, and obese subjects with and without type 2 diabetes (T2D), exposed to increasing palmitate (PA) and oleate (OA...... uptake (P0.05). These results indicate that (1) TAG accumulation is not primarily affected in skeletal muscle tissue of obese and T2D; (2) induced inhibition of oxidative phosphorylation is followed by TAG accumulation...... in skeletal muscle of obese and T2D subjects is adaptive....

  19. Electroejaculation functions primarily by direct activation of pelvic musculature: Perspectives from a porcine model

    Directory of Open Access Journals (Sweden)

    Adam M.R. Groh

    2018-03-01

    Full Text Available Ejaculatory dysfunction is a significant cause of infertility in men who have incurred spinal cord injury or iatrogenic lesions to the sympathetic nerves in the retroperitoneum. For such patients, electroejaculation – whereby a voltage is applied transrectally under general anesthesia – is a highly effective procedure to obtain ejaculate. At present, however, there remains uncertainty as to the physiological mechanism by which electroejaculation prompts seminal emission in males with neurogenic anejaculation. Thus, in the present study, we aimed to determine, for the first time, whether electroejaculation functions by mimicking a neurophysiological response, or by directly activating local pelvic musculature. Using electroejaculation in a novel porcine model, we monitored the strength of contraction of the internal urethral sphincter (a smooth muscle involved in ejaculation) before and after lesioning its sympathetic innervation with a combination of progressively worsening surgical and pharmacological insults in three anesthetized boars (46.1 ± 7.4 kg). Importantly, prior to this investigation, we confirmed the comparative structural anatomy of the porcine model to humans through gross dissection and histological analysis of the infrarenal retroperitoneal sympathetic nerves and ganglia in 18 unembalmed boars. Prior to sacrifice, three of these boars underwent functional testing to confirm control of the internal urethral sphincter by the hypogastric nerves. Our results demonstrate that electroejaculation-induced contraction of the internal urethral sphincter was preserved following each progressive neural insult compared to the control state (p > 0.05). In contrast, these same insults resulted in paralysis/paresis of the internal urethral sphincter when its sympathetic innervation was directly stimulated with bipolar electrodes (p < 0.05). Taken together, our results provide the first empirical evidence to suggest that

  20. Modeling Dynamic Systems with Efficient Ensembles of Process-Based Models.

    Directory of Open Access Journals (Sweden)

    Nikola Simidjievski

    Full Text Available Ensembles are a well-established machine learning paradigm, leading to accurate and robust models, predominantly applied to predictive modeling tasks. Ensemble models comprise a finite set of diverse predictive models whose combined output is expected to yield an improved predictive performance as compared to an individual model. In this paper, we propose a new method for learning ensembles of process-based models of dynamic systems. The process-based modeling paradigm employs domain-specific knowledge to automatically learn models of dynamic systems from time-series observational data. Previous work has shown that ensembles based on sampling observational data (i.e., bagging and boosting) significantly improve predictive performance of process-based models. However, this improvement comes at the cost of a substantial increase of the computational time needed for learning. To address this problem, the paper proposes a method that aims at efficiently learning ensembles of process-based models, while maintaining their accurate long-term predictive performance. This is achieved by constructing ensembles with sampling domain-specific knowledge instead of sampling data. We apply the proposed method to and evaluate its performance on a set of problems of automated predictive modeling in three lake ecosystems using a library of process-based knowledge for modeling population dynamics. The experimental results identify the optimal design decisions regarding the learning algorithm. The results also show that the proposed ensembles yield significantly more accurate predictions of population dynamics as compared to individual process-based models. Finally, while their predictive performance is comparable to the one of ensembles obtained with the state-of-the-art methods of bagging and boosting, they are substantially more efficient.
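    The bagging baseline that the paper improves on can be sketched in a few lines: train each ensemble member on a bootstrap resample of the data and average the member predictions. The toy "constant" models and the data values below are invented stand-ins, not the paper's process-based models.

```python
import random

def bootstrap(data, rng):
    """Sample the training data with replacement (the bagging step)."""
    return [rng.choice(data) for _ in data]

def train_constant_model(sample):
    """Toy 'model': always predicts the mean of its bootstrap sample."""
    mean = sum(sample) / len(sample)
    return lambda x: mean

def bag_predict(models, x):
    """Combine member predictions by simple averaging."""
    return sum(m(x) for m in models) / len(models)

rng = random.Random(0)
data = [1.0, 2.0, 3.0, 4.0]
models = [train_constant_model(bootstrap(data, rng)) for _ in range(25)]
prediction = bag_predict(models, None)
```

    The method in the abstract keeps the averaging step but replaces `bootstrap` over data with sampling over the domain-specific knowledge library, which is where the efficiency gain comes from.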

  1. GOLD HULL AND INTERNODE2 Encodes a Primarily Multifunctional Cinnamyl-Alcohol Dehydrogenase in Rice1

    Science.gov (United States)

    Zhang, Kewei; Qian, Qian; Huang, Zejun; Wang, Yiqin; Li, Ming; Hong, Lilan; Zeng, Dali; Gu, Minghong; Chu, Chengcai; Cheng, Zhukuan

    2006-01-01

    Lignin content and composition are two important agronomic traits for the utilization of agricultural residues. Rice (Oryza sativa) gold hull and internode phenotype is a classical morphological marker trait that has long been applied to breeding and genetics study. In this study, we have cloned the GOLD HULL AND INTERNODE2 (GH2) gene in rice using a map-based cloning approach. The result shows that the gh2 mutant is a lignin-deficient mutant, and GH2 encodes a cinnamyl-alcohol dehydrogenase (CAD). Consistent with this finding, extracts from roots, internodes, hulls, and panicles of the gh2 plants exhibited drastically reduced CAD activity and undetectable sinapyl alcohol dehydrogenase activity. When expressed in Escherichia coli, purified recombinant GH2 was found to exhibit strong catalytic ability toward coniferaldehyde and sinapaldehyde, while the mutant protein gh2 completely lost the corresponding CAD and sinapyl alcohol dehydrogenase activities. Further phenotypic analysis of the gh2 mutant plants revealed that the p-hydroxyphenyl, guaiacyl, and sinapyl monomers were reduced in almost the same ratio compared to the wild type. Our results suggest GH2 acts as a primarily multifunctional CAD to synthesize coniferyl and sinapyl alcohol precursors in rice lignin biosynthesis. PMID:16443696

  2. Model-based internal wave processing

    Energy Technology Data Exchange (ETDEWEB)

    Candy, J.V.; Chambers, D.H.

    1995-06-09

    A model-based approach is proposed to solve the oceanic internal wave signal processing problem that is based on state-space representations of the normal-mode vertical velocity and plane wave horizontal velocity propagation models. It is shown that these representations can be utilized to spatially propagate the modal (depth) vertical velocity functions given the basic parameters (wave numbers, Brunt-Vaisala frequency profile, etc.) developed from the solution of the associated boundary value problem as well as the horizontal velocity components. Based on this framework, investigations are made of model-based solutions to the signal enhancement problem for internal waves.
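    The state-space machinery behind such model-based processors can be illustrated with a scalar Kalman-style predict/update cycle: propagate the state through the model, then correct it with each measurement. This is a generic sketch, not the authors' internal-wave processor; all parameter values are made up.

```python
def kalman_step(x, P, z, F=1.0, Q=0.01, H=1.0, R=0.1):
    """One predict/update cycle of a scalar Kalman filter."""
    # Predict: propagate state estimate and covariance through the model
    x_pred = F * x
    P_pred = F * P * F + Q
    # Update: blend the prediction with the new measurement z
    K = P_pred * H / (H * P_pred * H + R)   # Kalman gain
    x_new = x_pred + K * (z - H * x_pred)
    P_new = (1 - K * H) * P_pred
    return x_new, P_new

x, P = 0.0, 1.0                    # initial state estimate and covariance
for z in [0.9, 1.1, 1.0, 0.95]:    # noisy measurements of a true value near 1
    x, P = kalman_step(x, P, z)
```

    In the internal-wave setting the state vector holds the modal velocity functions and the propagation model comes from the boundary value problem, but the predict/update structure is the same.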

  3. Integration of Simulink Models with Component-based Software Models

    Directory of Open Access Journals (Sweden)

    MARIAN, N.

    2008-06-01

    Full Text Available Model based development aims to facilitate the development of embedded control systems by emphasizing the separation of the design level from the implementation level. Model based design involves the use of multiple models that represent different views of a system, having different semantics of abstract system descriptions. Usually, in mechatronics systems, design proceeds by iterating model construction, model analysis, and model transformation. Constructing a MATLAB/Simulink model, a plant and controller behavior is simulated using graphical blocks to represent mathematical and logical constructs and process flow, then software code is generated. A Simulink model is a representation of the design or implementation of a physical system that satisfies a set of requirements. A software component-based system aims to organize system architecture and behavior as a means of computation, communication and constraints, using computational blocks and aggregates for both discrete and continuous behavior, different interconnection and execution disciplines for event-based and time-based controllers, and so on, to encompass the demands to more functionality, at even lower prices, and with opposite constraints. COMDES (Component-based Design of Software for Distributed Embedded Systems) is such a component-based system framework developed by the software engineering group of Mads Clausen Institute for Product Innovation (MCI), University of Southern Denmark. Once specified, the software model has to be analyzed. One way of doing that is to integrate in wrapper files the model back into Simulink S-functions, and use its extensive simulation features, thus allowing an early exploration of the possible design choices over multiple disciplines. The paper describes a safe translation of a restricted set of MATLAB/Simulink blocks to COMDES software components, both for continuous and discrete behavior, and the transformation of the software system into the S

  4. Meningiomas: outcome and analysis of prognostic factors of primarily resected tumors

    International Nuclear Information System (INIS)

    Stafford, S.L.; Perry, A.; Suman, V.; Meyer, B.; Scheithauer, B.W.; Shaw, E.G.; Earle, J.D.

    1996-01-01

    Purpose: 582 consecutive cases of primary intracranial meningioma undergoing resection at the Mayo Clinic, (Rochester, MN) were reviewed to determine overall survival (OS), progression free survival (PFS), prognostic factors predicting recurrence, and to determine the importance of radiation therapy in the management of this tumor. Materials and Methods: Between 1978-1988, 582 cases of primarily resected meningiomas were identified based on the tumor and operative registries where diagnosis was between 1978-1988 inclusive. PFS was identified by radiographic progression. Follow-up was accomplished by chart review, and a detailed questionnaire sent to patients and referring physicians. Estimation of OS and PFS distributions was done by the Kaplan-Meier method. The log rank test was used to assess which factors were associated with PFS. Proportional hazard modeling was performed to obtain a subset of independent predictors of PFS. Results: the median age was 57 (5-93). 67% were female. CT identified the tumor in 91% of cases. There was associated edema in 21% and 2% were radiographically en plaque. There were 17 patients with multiple tumors, four of whom had a known diagnosis of neurofibromatosis. Gross total resection (GTR) was accomplished in 80%, radical subtotal or subtotal resection (STR) in 20%, and biopsy in 53) cellularity, and four or more mitoses per 10 HPF. Multivariate analysis indicated young age, male sex, en plaque at surgery, were significant for decreased PFS when only patient characteristics were considered. When treatment and pathologic factors were also considered, then young age, male sex, less than GTR, and tumor sheeting were predictors for decreased PFS. 10 patients had RT after initial resection, two of whom recurred. There were 107 first recurrences. 50 were observed (no intervention within 3 months), 35 treated by surgery alone, 11 had S+RT, and 11 were treated with RT alone. 
Considering those patients treated at recurrence (n=57), PFS was at

  5. Model-based machine learning.

    Science.gov (United States)

    Bishop, Christopher M

    2013-02-13

    Several decades of research in the field of machine learning have resulted in a multitude of different algorithms for solving a broad range of problems. To tackle a new application, a researcher typically tries to map their problem onto one of these existing methods, often influenced by their familiarity with specific algorithms and by the availability of corresponding software implementations. In this study, we describe an alternative methodology for applying machine learning, in which a bespoke solution is formulated for each new application. The solution is expressed through a compact modelling language, and the corresponding custom machine learning code is then generated automatically. This model-based approach offers several major advantages, including the opportunity to create highly tailored models for specific scenarios, as well as rapid prototyping and comparison of a range of alternative models. Furthermore, newcomers to the field of machine learning do not have to learn about the huge range of traditional methods, but instead can focus their attention on understanding a single modelling environment. In this study, we show how probabilistic graphical models, coupled with efficient inference algorithms, provide a very flexible foundation for model-based machine learning, and we outline a large-scale commercial application of this framework involving tens of millions of users. We also describe the concept of probabilistic programming as a powerful software environment for model-based machine learning, and we discuss a specific probabilistic programming language called Infer.NET, which has been widely used in practical applications.
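    The smallest worked instance of this model-based philosophy is a conjugate Bayesian model, where specifying the model fully determines the inference code. The Beta-Bernoulli example below is a generic textbook sketch, not Infer.NET code: the prior-plus-data specification yields a closed-form posterior update.

```python
def beta_bernoulli_update(alpha, beta, observations):
    """Exact posterior update for a Bernoulli success rate under a Beta prior.

    Because Beta is conjugate to Bernoulli, inference reduces to
    counting successes and failures and adding them to the prior.
    """
    heads = sum(observations)
    tails = len(observations) - heads
    return alpha + heads, beta + tails

alpha, beta = 1.0, 1.0   # Beta(1, 1): uniform prior over the unknown rate
alpha, beta = beta_bernoulli_update(alpha, beta, [1, 1, 0, 1, 1])
posterior_mean = alpha / (alpha + beta)
```

    Probabilistic programming systems such as Infer.NET generalize exactly this move: the user writes the model, and the (usually approximate) inference code is derived mechanically instead of by hand.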

  6. Examining Change in K-3 Teachers' Mathematical Knowledge, Attitudes, and Beliefs: The Case of Primarily Math

    Science.gov (United States)

    Kutaka, T. S.; Ren, L.; Smith, W. M.; Beattie, H. L.; Edwards, C. P.; Green, J. L.; Chernyavskiy, P.; Stroup, W.; Heaton, R. M.; Lewis, W. J.

    2018-01-01

    This study examines the impact of the Primarily Math Elementary Mathematics Specialist program on K-3 teachers' mathematical content knowledge for teaching, attitudes toward learning mathematics, and beliefs about mathematics teaching and learning. Three cohorts of teachers participating in the program were compared to a similar group of…

  7. Can role models boost entrepreneurial attitudes?

    Science.gov (United States)

    Fellnhofer, Katharina; Puumalainen, Kaisu

    2017-01-01

    This multi-country study used role models to boost perceptions of entrepreneurial feasibility and desirability. The results of a structural equation model based on a sample comprising 426 individuals who were primarily from Austria, Finland and Greece revealed a significant positive influence on perceived entrepreneurial desirability and feasibility. These findings support the argument for embedding entrepreneurial role models in entrepreneurship education courses to promote entrepreneurial activities. This direction is not only relevant for the academic community but also essential for nascent entrepreneurs, policymakers and society at large.

  8. A practical procedure for the selection of time-to-failure models based on the assessment of trends in maintenance data

    Energy Technology Data Exchange (ETDEWEB)

    Louit, D.M. [Komatsu Chile, Av. Americo Vespucio 0631, Quilicura, Santiago (Chile)], E-mail: rpascual@ing.puc.cl; Pascual, R. [Centro de Mineria, Pontificia Universidad Catolica de Chile, Av. Vicuna Mackenna 4860, Santiago (Chile); Jardine, A.K.S. [Department of Mechanical and Industrial Engineering, University of Toronto, 5 King's College Road, Toronto, Ont., M5S 3G8 (Canada)

    2009-10-15

    Reliability studies often rest on false premises, such as the assumption of independent and identically distributed times between failures (a renewal process). This can lead to erroneous model selection for the time to failure of a particular component or system, which can in turn lead to wrong conclusions and decisions. A strong statistical focus, a lack of a systematic approach and sometimes inadequate theoretical background seem to have made it difficult for maintenance analysts to adopt the necessary stage of data testing before the selection of a suitable model. In this paper, a framework for model selection to represent the failure process for a component or system is presented, based on a review of available trend tests. The paper focuses only on single-time-variable models and is primarily directed to analysts responsible for reliability analyses in an industrial maintenance environment. The model selection framework is directed towards the discrimination between the use of statistical distributions to represent the time to failure ('renewal approach'); and the use of stochastic point processes ('repairable systems approach'), when there may be the presence of system ageing or reliability growth. An illustrative example based on failure data from a fleet of backhoes is included.
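    One of the classical trend tests such a framework reviews is the Laplace test, which checks failure arrival times against a homogeneous Poisson null: the statistic is approximately standard normal under the null, with large positive values suggesting deterioration. A sketch follows; the failure times are invented for illustration.

```python
import math

def laplace_trend_test(arrival_times, T):
    """Laplace trend statistic for failure arrival times on (0, T].

    U = (mean arrival time - T/2) / (T * sqrt(1/(12 n)))
    U ~ N(0, 1) under a homogeneous Poisson process; U >> 0 indicates
    deterioration (failures clustering late), U << 0 improvement.
    """
    n = len(arrival_times)
    mean_t = sum(arrival_times) / n
    return (mean_t - T / 2) / (T * math.sqrt(1.0 / (12 * n)))

# Failures clustered late in the window -> large positive U (ageing)
ageing = laplace_trend_test([60, 70, 80, 85, 90, 95], T=100)
# Roughly uniform failures -> U near zero (renewal assumption tenable)
stable = laplace_trend_test([15, 30, 45, 60, 75, 90], T=100)
```

    A significant trend argues for a stochastic point-process model (the 'repairable systems approach'); no trend supports fitting a time-to-failure distribution under the renewal assumption.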

  9. Event-based model diagnosis of rainfall-runoff model structures

    International Nuclear Information System (INIS)

    Stanzel, P.

    2012-01-01

    The objective of this research is a comparative evaluation of different rainfall-runoff model structures. Comparative model diagnostics facilitate the assessment of strengths and weaknesses of each model. The application of multiple models allows an analysis of simulation uncertainties arising from the selection of model structure, as compared with effects of uncertain parameters and precipitation input. Four different model structures, including conceptual and physically based approaches, are compared. In addition to runoff simulations, results for soil moisture and the runoff components of overland flow, interflow and base flow are analysed. Catchment runoff is simulated satisfactorily by all four model structures and shows only minor differences. Systematic deviations from runoff observations provide insight into model structural deficiencies. While physically based model structures capture some single runoff events better, they do not generally outperform conceptual model structures. Contributions to uncertainty in runoff simulations stemming from the choice of model structure show similar dimensions to those arising from parameter selection and the representation of precipitation input. Variations in precipitation mainly affect the general level and peaks of runoff, while different model structures lead to different simulated runoff dynamics. Large differences between the four analysed models are detected for simulations of soil moisture and, even more pronounced, runoff components. Soil moisture changes are more dynamical in the physically based model structures, which is in better agreement with observations. Streamflow contributions of overland flow are considerably lower in these models than in the more conceptual approaches. Observations of runoff components are rarely made and are not available in this study, but are shown to have high potential for an effective selection of appropriate model structures (author)
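    The conceptual end of the model spectrum compared here can be illustrated with a single linear reservoir, the simplest rainfall-runoff structure: storage fills with rainfall and drains at a rate proportional to its contents. This is a generic sketch; the recession constant `k` and the input pulse are arbitrary values, not calibrated parameters from the study.

```python
def linear_reservoir(rainfall, k=0.3, storage=0.0):
    """Conceptual single-reservoir runoff model: outflow = k * storage."""
    runoff = []
    for p in rainfall:
        storage += p        # rainfall fills the store
        q = k * storage     # outflow is proportional to current storage
        storage -= q        # drain the store (mass balance)
        runoff.append(q)
    return runoff

# A single 10 mm rainfall pulse followed by dry steps produces the
# characteristic exponential recession of the hydrograph.
q = linear_reservoir([10.0, 0.0, 0.0, 0.0])
```

    Physically based structures replace this lumped store with equations for infiltration, soil moisture and lateral flow, which is why they differ most in the simulated runoff components rather than in total catchment runoff.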

  10. Designing of fuzzy expert heuristic models with cost management ...

    Indian Academy of Sciences (India)

    In genuine industrial cases, problems are inescapable and pose enormous challenges to incorporating accurate sustainability factors into supplier selection. In this present study, three different primarily based multicriteria decision making fuzzy models have been compared with their deterministic version so as to resolve fuzzy ...

  11. Humanistic Speech Education to Create Leadership Models.

    Science.gov (United States)

    Oka, Beverley Jeanne

    A theoretical framework based primarily on the humanistic psychology of Abraham Maslow is used in developing a humanistic approach to speech education. The holistic view of human learning and behavior, inherent in this approach, is seen to be compatible with a model of effective leadership. Specific applications of this approach to speech…

  12. Agent-based modeling of sustainable behaviors

    CERN Document Server

    Sánchez-Maroño, Noelia; Fontenla-Romero, Oscar; Polhill, J; Craig, Tony; Bajo, Javier; Corchado, Juan

    2017-01-01

    Using the O.D.D. (Overview, Design concepts, Detail) protocol, this title explores the role of agent-based modeling in predicting the feasibility of various approaches to sustainability. The chapters incorporated in this volume consist of real case studies to illustrate the utility of agent-based modeling and complexity theory in discovering a path to more efficient and sustainable lifestyles. The topics covered within include: households' attitudes toward recycling, designing decision trees for representing sustainable behaviors, negotiation-based parking allocation, auction-based traffic signal control, and others. This selection of papers will be of interest to social scientists who wish to learn more about agent-based modeling as well as experts in the field of agent-based modeling.

  13. Statin Selection in Qatar Based on Multi-indication Pharmacotherapeutic Multi-criteria Scoring Model, and Clinician Preference.

    Science.gov (United States)

    Al-Badriyeh, Daoud; Fahey, Michael; Alabbadi, Ibrahim; Al-Khal, Abdullatif; Zaidan, Manal

    2015-12-01

    Statin selection for the largest hospital formulary in Qatar is not systematic, not comparative, and does not consider the multi-indication nature of statins. There are no reports in the literature of multi-indication-based comparative scoring models of statins or of statin selection criteria weights that are based primarily on local clinicians' preferences and experiences. This study sought to comparatively evaluate statins for first-line therapy in Qatar, and to quantify the economic impact of this. An evidence-based, multi-indication, multi-criteria pharmacotherapeutic model was developed for the scoring of statins from the perspective of the main health care provider in Qatar. The literature and an expert panel informed the selection criteria of statins. Relative weighting of selection criteria was based on the input of the relevant local clinician population. Statins were comparatively scored based on literature evidence, with those exceeding a defined scoring threshold being recommended for use. With 95% CI and 5% margin of error, the scoring model was successfully developed. Selection criteria comprised 28 subcriteria under the following main criteria: clinical efficacy, best published evidence and experience, adverse effects, drug interaction, dosing time, and fixed dose combination availability. Outcome measures for multiple indications were related to effects on LDL cholesterol, HDL cholesterol, triglyceride, total cholesterol, and C-reactive protein. Atorvastatin, pravastatin, and rosuvastatin exceeded defined pharmacotherapeutic thresholds. Atorvastatin and pravastatin were recommended as first-line use and rosuvastatin as a nonformulary alternative. It was estimated that this would produce a 17.6% cost savings in statins expenditure. Sensitivity analyses confirmed the robustness of the evaluation's outcomes against input uncertainties. 
Incorporating a comparative evaluation of statins in Qatari practices based on a locally developed, transparent, multi
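    The arithmetic core of such a multi-criteria scoring model is a weighted sum: each drug is rated on each criterion, ratings are multiplied by clinician-derived weights, and the totals are compared against a threshold. The criteria names, weights, and ratings below are hypothetical placeholders, far simpler than the study's 28 subcriteria.

```python
def weighted_score(ratings, weights):
    """Weighted-sum multi-criteria score, normalized by the weight total."""
    assert set(ratings) == set(weights), "every criterion needs a weight"
    total_w = sum(weights.values())
    return sum(ratings[c] * weights[c] for c in ratings) / total_w

# Hypothetical criterion weights (would come from clinician surveys)
weights = {"efficacy": 0.40, "evidence": 0.25,
           "adverse_effects": 0.20, "interactions": 0.15}

# Hypothetical 0-10 ratings for two candidate drugs
drug_a = {"efficacy": 9, "evidence": 8, "adverse_effects": 7, "interactions": 6}
drug_b = {"efficacy": 6, "evidence": 7, "adverse_effects": 9, "interactions": 8}

score_a = weighted_score(drug_a, weights)
score_b = weighted_score(drug_b, weights)
```

    A formulary decision then reduces to ranking these scores and applying the predefined threshold, with sensitivity analysis re-running the ranking under perturbed weights.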

  14. Direct healthcare costs of selected diseases primarily or partially transmitted by water.

    Science.gov (United States)

    Collier, S A; Stockman, L J; Hicks, L A; Garrison, L E; Zhou, F J; Beach, M J

    2012-11-01

    Despite US sanitation advancements, millions of waterborne disease cases occur annually, although the precise burden of disease is not well quantified. Estimating the direct healthcare cost of specific infections would be useful in prioritizing waterborne disease prevention activities. Hospitalization and outpatient visit costs per case and total US hospitalization costs for ten waterborne diseases were calculated using large healthcare claims and hospital discharge databases. The five primarily waterborne diseases in this analysis (giardiasis, cryptosporidiosis, Legionnaires' disease, otitis externa, and non-tuberculous mycobacterial infection) were responsible for over 40 000 hospitalizations at a cost of $970 million per year, including at least $430 million in hospitalization costs for Medicaid and Medicare patients. An additional 50 000 hospitalizations for campylobacteriosis, salmonellosis, shigellosis, haemolytic uraemic syndrome, and toxoplasmosis cost $860 million annually ($390 million in payments for Medicaid and Medicare patients), a portion of which can be assumed to be due to waterborne transmission.

  15. Measurement-based reliability/performability models

    Science.gov (United States)

    Hsueh, Mei-Chen

    1987-01-01

    Measurement-based models based on real error-data collected on a multiprocessor system are described. Model development from the raw error-data to the estimation of cumulative reward is also described. A workload/reliability model is developed based on low-level error and resource usage data collected on an IBM 3081 system during its normal operation in order to evaluate the resource usage/error/recovery process in a large mainframe system. Thus, both normal and erroneous behavior of the system are modeled. The results provide an understanding of the different types of errors and recovery processes. The measured data show that the holding times in key operational and error states are not simple exponentials and that a semi-Markov process is necessary to model the system behavior. A sensitivity analysis is performed to investigate the significance of using a semi-Markov process, as opposed to a Markov process, to model the measured system.
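    A quick diagnostic behind the semi-Markov conclusion is to check whether measured state-holding times look exponential: an exponential distribution has a coefficient of variation of about 1, so a CV far from 1 argues against a simple Markov model. The holding-time sample below is invented for illustration.

```python
import math

def coeff_of_variation(samples):
    """Standard deviation divided by mean (population variance)."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((s - mean) ** 2 for s in samples) / n
    return math.sqrt(var) / mean

# Nearly deterministic holding times: CV far below 1, so the exponential
# (memoryless) assumption fails and a semi-Markov model is warranted.
measured = [5.0, 5.2, 4.9, 5.1, 5.0, 4.8]
cv = coeff_of_variation(measured)
```

    In practice a formal goodness-of-fit test on the holding-time distributions would back up this screening check before committing to the semi-Markov formulation.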

  16. The "proactive" model of learning: Integrative framework for model-free and model-based reinforcement learning utilizing the associative learning-based proactive brain concept.

    Science.gov (United States)

    Zsuga, Judit; Biro, Klara; Papp, Csaba; Tajti, Gabor; Gesztelyi, Rudolf

    2016-02-01

    Reinforcement learning (RL) is a powerful concept underlying forms of associative learning governed by the use of a scalar reward signal, with learning taking place if expectations are violated. RL may be assessed using model-based and model-free approaches. Model-based reinforcement learning involves the amygdala, the hippocampus, and the orbitofrontal cortex (OFC). The model-free system involves the pedunculopontine-tegmental nucleus (PPTgN), the ventral tegmental area (VTA) and the ventral striatum (VS). Based on the functional connectivity of the VS, model-free and model-based RL systems center on the VS, which computes value by integrating model-free signals (received as reward prediction error) with model-based reward-related input. Using the concept of a reinforcement learning agent, we propose that the VS serves as the value function component of the RL agent. Regarding the model utilized for model-based computations, we turned to the proactive brain concept, which offers a ubiquitous function for the default network based on its great functional overlap with contextual associative areas. Hence, by means of the default network the brain continuously organizes its environment into context frames, enabling the formulation of analogy-based associations that are turned into predictions of what to expect. The OFC integrates reward-related information into context frames upon computing reward expectation by compiling stimulus-reward and context-reward information offered by the amygdala and hippocampus, respectively. Furthermore, we suggest that the integration of model-based expectations regarding reward into the value signal is further supported by the efferents of the OFC that reach structures canonical for model-free learning (e.g., the PPTgN, VTA, and VS).
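    The model-free half of this picture is classically captured by temporal-difference learning driven by a reward prediction error, the same quantity the abstract attributes to dopaminergic signaling. Below is a tabular Q-learning sketch on a toy chain task; the task layout and all hyperparameters are invented, purely to show the update rule.

```python
import random

def q_learning(n_states=5, episodes=300, alpha=0.2, gamma=0.9, eps=0.2, seed=1):
    """Model-free tabular Q-learning on a chain; reward 1.0 at the right end."""
    rng = random.Random(seed)
    Q = [[0.0, 0.0] for _ in range(n_states)]   # actions: 0 = left, 1 = right
    for _ in range(episodes):
        s = 0
        for _ in range(50):                      # cap steps per episode
            if rng.random() < eps:
                a = rng.randrange(2)             # explore
            else:
                a = 1 if Q[s][1] >= Q[s][0] else 0   # greedy (ties go right)
            s2 = max(s - 1, 0) if a == 0 else min(s + 1, n_states - 1)
            r = 1.0 if s2 == n_states - 1 else 0.0
            # TD update: nudge Q toward reward + discounted best next value;
            # the bracketed term is the reward prediction error.
            Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
            s = s2
            if r > 0:                            # reached the rewarded end
                break
    return Q

Q = q_learning()
```

    A model-based agent would instead learn the transition structure (the "context frame" in the proactive-brain framing) and plan over it; the VS-as-value-function proposal concerns where these two value streams are integrated.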

  17. Modal-based reduced-order model of BWR out-of phase instabilities

    International Nuclear Information System (INIS)

    Turso, J.A.; Edwards, R.M.; March-Leuba, J.

    1995-01-01

    For the past 40 yr, reduced-order modeling of boiling water reactor (BWR) dynamic behavior has been accomplished by several researchers. These models have been primarily concerned with providing insight into the so-called corewide neutron flux oscillation, where the power at each radial location in the core oscillates in unison. This is generally considered to be an illustration of the fundamental neutronic mode excited by the core thermal hydraulics. The time dependence of the fundamental mode is typically described by the point-kinetics equations, with one or more delayed-neutron groups. Thermal-hydraulic excitation of the first azimuthal harmonic mode, the so-called out-of-phase (OOP) instability, has been observed in operating BWRs. The temporal behavior of a low-order model of this phenomenon can be characterized using the modal point-kinetics formulation developed in this paper
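    The point-kinetics time dependence mentioned above can be sketched in its simplest one-delayed-group form, integrated with explicit Euler. The parameter values are generic illustrative numbers, not data for any particular BWR, and a modal OOP model would carry one such equation set per spatial mode.

```python
def point_kinetics(rho, beta=0.0065, Lambda=1e-4, lam=0.08,
                   n0=1.0, dt=1e-3, steps=2000):
    """One-delayed-group point kinetics, explicit Euler integration.

    dn/dt = ((rho - beta) / Lambda) * n + lam * c
    dc/dt = (beta / Lambda) * n - lam * c
    """
    n = n0
    c = beta * n0 / (lam * Lambda)   # equilibrium precursor concentration
    for _ in range(steps):
        dn = ((rho - beta) / Lambda) * n + lam * c
        dc = (beta / Lambda) * n - lam * c
        n += dt * dn
        c += dt * dc
    return n

n_crit = point_kinetics(rho=0.0)     # zero reactivity: power stays flat
n_up = point_kinetics(rho=0.001)     # positive reactivity: prompt jump, then rise
```

    The explicit Euler step is adequate here because dt is small relative to the prompt time constant Lambda / (beta - rho); stiffer cases would call for an implicit scheme.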

  18. The Use of Modeling-Based Text to Improve Students' Modeling Competencies

    Science.gov (United States)

    Jong, Jing-Ping; Chiu, Mei-Hung; Chung, Shiao-Lan

    2015-01-01

    This study investigated the effects of a modeling-based text on 10th graders' modeling competencies. Fifteen 10th graders read a researcher-developed modeling-based science text on the ideal gas law that included explicit descriptions and representations of modeling processes (i.e., model selection, model construction, model validation, model…

  19. The Managerial Roles of Academic Library Directors: The Mintzberg Model.

    Science.gov (United States)

    Moskowitz, Michael Ann

    1986-01-01

    A study based on a model developed by Henry Mintzberg examined the internal and external managerial roles of 126 New England college and university library directors. Survey results indicate that the 97 responding directors were primarily involved with internal managerial roles and work contacts. (CDD)

  20. Knowledge-Based Environmental Context Modeling

    Science.gov (United States)

    Pukite, P. R.; Challou, D. J.

    2017-12-01

    As we move from the oil-age to an energy infrastructure based on renewables, the need arises for new educational tools to support the analysis of geophysical phenomena and their behavior and properties. Our objective is to present models of these phenomena to make them amenable for incorporation into more comprehensive analysis contexts. Starting at the level of a college-level computer science course, the intent is to keep the models tractable and therefore practical for student use. Based on research performed via an open-source investigation managed by DARPA and funded by the Department of Interior [1], we have adapted a variety of physics-based environmental models for a computer-science curriculum. The original research described a semantic web architecture based on patterns and logical archetypal building-blocks (see figure) well suited for a comprehensive environmental modeling framework. The patterns span a range of features that cover specific land, atmospheric and aquatic domains intended for engineering modeling within a virtual environment. The modeling engine contained within the server relied on knowledge-based inferencing capable of supporting formal terminology (through NASA JPL's Semantic Web for Earth and Environmental Technology (SWEET) ontology and a domain-specific language) and levels of abstraction via integrated reasoning modules. One of the key goals of the research was to simplify models that were ordinarily computationally intensive to keep them lightweight enough for interactive or virtual environment contexts. The breadth of the elements incorporated is well-suited for learning as the trend toward ontologies and applying semantic information is vital for advancing an open knowledge infrastructure. As examples of modeling, we have covered such geophysics topics as fossil-fuel depletion, wind statistics, tidal analysis, and terrain modeling, among others. 
Techniques from the world of computer science will be necessary to promote efficient

  1. Language Implications for Advertising in International Markets: A Model for Message Content and Message Execution.

    Science.gov (United States)

    Beard, John; Yaprak, Attila

    A content analysis model for assessing advertising themes and messages generated primarily for United States markets to overcome barriers in the cultural environment of international markets was developed and tested. The model is based on three primary categories for generating, evaluating, and executing advertisements: rational, emotional, and…

  2. Hydrological models are mediating models

    Science.gov (United States)

    Babel, L. V.; Karssenberg, D.

    2013-08-01

    Despite the increasing role of models in hydrological research and decision-making processes, only a few accounts of the nature and function of models exist in hydrology. Earlier considerations have traditionally maintained a clear distinction between physically-based and conceptual models. A new philosophical account, developed primarily in the fields of physics and economics, transcends classes of models and scientific disciplines by considering models as "mediators" between theory and observations. The core of this approach lies in identifying models as (1) being only partially dependent on theory and observations, (2) integrating non-deductive elements in their construction, and (3) serving as instruments of scientific enquiry about both theory and the world. The applicability of this approach to hydrology is evaluated in the present article. Three widely used hydrological models, each showing a different degree of apparent physicality, are confronted with the main characteristics of the "mediating models" concept. We argue that irrespective of their kind, hydrological models depend on both theory and observations, rather than merely on one of these two domains. Their construction additionally involves a large number of miscellaneous external ingredients, such as past experiences, model objectives, knowledge and preferences of the modeller, and hardware and software resources. We show that hydrological models act as instruments in scientific practice by mediating between theory and the world. It follows from these considerations that the traditional distinction between physically-based and conceptual models is too simplistic and refers at best to the stage at which theory and observations steer model construction. The large variety of ingredients involved in model construction deserves closer attention, as it is rarely presented explicitly in the peer-reviewed literature. 
We believe that devoting

  3. Evidence-based quantification of uncertainties induced via simulation-based modeling

    International Nuclear Information System (INIS)

    Riley, Matthew E.

    2015-01-01

    The quantification of uncertainties in simulation-based modeling traditionally focuses upon quantifying uncertainties in the parameters input into the model, referred to as parametric uncertainties. Often neglected in such an approach are the uncertainties induced by the modeling process itself. This deficiency is often due to a lack of information regarding the problem or the models considered, which could theoretically be reduced through the introduction of additional data. Because of the nature of this epistemic uncertainty, traditional probabilistic frameworks utilized for the quantification of uncertainties are not necessarily applicable to quantify the uncertainties induced in the modeling process itself. This work develops and utilizes a methodology – incorporating aspects of Dempster–Shafer Theory and Bayesian model averaging – to quantify uncertainties of all forms for simulation-based modeling problems. The approach expands upon classical parametric uncertainty approaches, allowing for the quantification of modeling-induced uncertainties as well, ultimately providing bounds on classical probability without the loss of epistemic generality. The approach is demonstrated on two different simulation-based modeling problems: the computation of the natural frequency of a simple two degree of freedom non-linear spring mass system and the calculation of the flutter velocity coefficient for the AGARD 445.6 wing given a subset of commercially available modeling choices. - Highlights: • Modeling-induced uncertainties are often mishandled or ignored in the literature. • Modeling-induced uncertainties are epistemic in nature. • Probabilistic representations of modeling-induced uncertainties are restrictive. • Evidence theory and Bayesian model averaging are integrated. • Developed approach is applicable for simulation-based modeling problems
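
    The interval-valued bounds on classical probability mentioned above come from Dempster–Shafer theory, where belief and plausibility bracket the probability of an event. The sketch below is a generic illustration of those two measures, not the paper's implementation; the mass assignment and the model names are invented.

```python
def belief(event, masses):
    """Bel(A): total mass committed to subsets of A (lower bound on P(A))."""
    return sum(m for focal, m in masses.items() if focal <= event)

def plausibility(event, masses):
    """Pl(A): total mass of focal sets intersecting A (upper bound on P(A))."""
    return sum(m for focal, m in masses.items() if focal & event)

# Hypothetical basic mass assignment over two candidate models {m1, m2}:
# 0.5 supports m1, 0.2 supports m2, and 0.3 remains uncommitted,
# representing the epistemic (model-form) uncertainty.
masses = {
    frozenset({"m1"}): 0.5,
    frozenset({"m2"}): 0.2,
    frozenset({"m1", "m2"}): 0.3,
}
A = frozenset({"m1"})
print(belief(A, masses), plausibility(A, masses))  # 0.5 0.8
```

    The uncommitted mass is exactly what a single probability distribution cannot represent: it widens the [belief, plausibility] interval instead of being forced onto one outcome.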

  4. Convex-based void filling method for CAD-based Monte Carlo geometry modeling

    International Nuclear Information System (INIS)

    Yu, Shengpeng; Cheng, Mengyun; Song, Jing; Long, Pengcheng; Hu, Liqin

    2015-01-01

    Highlights: • We present a new void filling method named CVF for CAD-based MC geometry modeling. • We describe convex-based void description and quality-based space subdivision. • The results show that CVF improves both modeling and MC calculation efficiency. - Abstract: CAD-based automatic geometry modeling tools have been widely applied to generate Monte Carlo (MC) calculation geometry for complex systems from CAD models. Automatic void filling is one of the main functions of CAD-based MC geometry modeling tools, because the void space between parts is traditionally not modeled in CAD, while MC codes such as MCNP require the entire problem space to be described. A dedicated void filling method, named Convex-based Void Filling (CVF), is proposed in this study for efficient void filling and concise void descriptions. The method subdivides the problem space into disjoint regions using Quality-based Subdivision (QS) and describes the void space in each region with complementary descriptions of the convex volumes intersecting that region. It has been implemented in SuperMC/MCAM, the Multiple-Physics Coupling Analysis Modeling Program, and tested on the International Thermonuclear Experimental Reactor (ITER) Alite model. The results showed that the new method reduced both automatic modeling time and MC calculation time

  5. An Integrated model for Product Quality Development—A case study on Quality functions deployment and AHP based approach

    Science.gov (United States)

    Maitra, Subrata; Banerjee, Debamalya

    2010-10-01

    The present article is based on application of product quality and design improvement related to the nature of failure of machinery and plant operational problems of an industrial blower fan company. The project aims at developing the product on the basis of standardized production parameters for selling its products in the market. Special attention is also paid to blower fans that have been ordered directly by the customer on the basis of the installed capacity of air to be provided by the fan. Application of quality function deployment (QFD) is primarily a customer-oriented approach. The proposed model integrates QFD with AHP to select and rank the decision criteria on commercial and technical factors and to measure the decision parameters for selecting the best product in a competitive environment. The present AHP-QFD model justifies the selection of a blower fan with the help of a group of experts' opinions, by pairwise comparison of the customer's requirements and the ergonomics-based technical design requirements. The steps involved in implementing QFD-AHP and selecting weighted criteria may be helpful for all similar-purpose industries balancing cost and utility for a competitive product.
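
    The AHP step in such a model conventionally extracts priority weights from a pairwise comparison matrix via its principal eigenvector, with a consistency ratio to sanity-check the expert judgments. A minimal sketch under those standard conventions; the three criteria and the Saaty-scale judgment values are invented, not taken from the case study.

```python
import numpy as np

# Hypothetical 3x3 pairwise comparison matrix for three blower-fan
# criteria (cost, airflow capacity, ergonomics); reciprocal by
# construction: A[j, i] = 1 / A[i, j].
A = np.array([
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 3.0],
    [1 / 5, 1 / 3, 1.0],
])

# Priority weights = principal eigenvector, normalised to sum to 1.
vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)
w = vecs[:, k].real
w = w / w.sum()

# Consistency ratio CR = ((lambda_max - n) / (n - 1)) / RI, with the
# standard random index RI(3) = 0.58; CR < 0.1 is usually acceptable.
n = A.shape[0]
cr = ((vals[k].real - n) / (n - 1)) / 0.58
print(w, cr)
```

    The eigenvector ranks the criteria; the consistency ratio flags pairwise judgments that contradict each other too strongly to trust.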

  6. Model-based version management system framework

    International Nuclear Information System (INIS)

    Mehmood, W.

    2016-01-01

    In this paper we present a model-based version management system. A version management system (VMS), a branch of software configuration management (SCM), aims to provide a controlling mechanism for the evolution of software artifacts created during the software development process. Controlling the evolution requires many activities, such as construction and creation of versions, identification of differences between versions, conflict detection, and merging. Traditional VMS systems are file-based and consider software systems as a set of text files. File-based VMS systems are not adequate for performing software configuration management activities such as version control on software artifacts produced in earlier phases of the software life cycle. New challenges of model differencing, merging, and evolution control arise when models are the central artifact. The goal of this work is to present a generic framework for a model-based VMS which can be used to overcome the problems of traditional file-based VMS systems and provide model versioning services. (author)

  7. Base of the upper layer of the phase-three Elkhorn-Loup groundwater-flow model, north-central Nebraska

    Science.gov (United States)

    Stanton, Jennifer S.

    2013-01-01

    The Elkhorn and Loup Rivers in Nebraska provide water for irrigation, recreation, hydropower production, aquatic life, and municipal water systems for the Omaha and Lincoln metropolitan areas. Groundwater is another important resource in the region and is extracted primarily for agricultural irrigation. Water managers of the area are interested in balancing and sustaining the long-term uses of these essential surface-water and groundwater resources. Thus, a cooperative study was established in 2006 to compile reliable data describing hydrogeologic properties and water-budget components and to improve the understanding of stream-aquifer interactions in the Elkhorn and Loup River Basins. A groundwater-flow model, the Elkhorn-Loup model (ELM), was constructed as part of the first two phases of that study as a tool for understanding the effect of groundwater pumpage on stream base flow and the effects of management strategies on hydrologically connected groundwater and surface-water supplies. The third phase of the study was implemented to gain additional geologic knowledge and update the ELM with enhanced water-budget information and refined discretization of the model grid and stress periods. As part of that effort, the ELM is being reconstructed to include two vertical model layers, whereas the phase-one and phase-two simulations represented the aquifer system using one vertical model layer. This report presents a map of, and methods for developing, the elevation of the base of the upper model layer for the phase-three ELM. Digital geospatial data of elevation contours and the geologic log sites used to estimate elevation contours are available as part of this report.

  8. Universal in vivo Textural Model for Human Skin based on Optical Coherence Tomograms.

    Science.gov (United States)

    Adabi, Saba; Hosseinzadeh, Matin; Noei, Shahryar; Conforto, Silvia; Daveluy, Steven; Clayton, Anne; Mehregan, Darius; Nasiriavanaki, Mohammadreza

    2017-12-20

    Currently, diagnosis of skin diseases is based primarily on the visual pattern recognition skills and expertise of the physician observing the lesion. Even though dermatologists are trained to recognize patterns of morphology, it is still a subjective visual assessment. Tools for automated pattern recognition can provide objective information to support clinical decision-making. Noninvasive skin imaging techniques provide complementary information to the clinician. In recent years, optical coherence tomography (OCT) has become a powerful skin imaging technique. According to specific functional needs, skin architecture varies across different parts of the body, as do the textural characteristics in OCT images. There is, therefore, a critical need to systematically analyze OCT images from different body sites, to identify their significant qualitative and quantitative differences. Sixty-three optical and textural features extracted from OCT images of healthy and diseased skin are analyzed and, in conjunction with decision-theoretic approaches, used to create computational models of the diseases. We demonstrate that these models provide objective information to the clinician to assist in the diagnosis of abnormalities of cutaneous microstructure, and hence, aid in the determination of treatment. Specifically, we demonstrate the performance of this methodology on differentiating basal cell carcinoma (BCC) and squamous cell carcinoma (SCC) from healthy tissue.

  9. Culturicon model: A new model for cultural-based emoticon

    Science.gov (United States)

    Zukhi, Mohd Zhafri Bin Mohd; Hussain, Azham

    2017-10-01

    Emoticons are popular among users of distributed collective interaction for expressing their emotions, gestures, and actions. Emoticons have been shown to avoid misunderstanding of messages, save attention, and improve communication among different native speakers. However, despite the benefits that emoticons provide, study of emoticons from a cultural perspective is still lacking. As emoticons are crucial in global communication, culture should be an extensively researched aspect of distributed collective interaction. Therefore, this study attempts to explore and develop a model for cultural-based emoticons. Three cultural models that have been used in Human-Computer Interaction were studied: the Hall Culture Model, the Trompenaars and Hampden-Turner Culture Model, and the Hofstede Culture Model. The dimensions from these three models will be used in developing the proposed cultural-based emoticon model.

  10. Least-squares model-based halftoning

    Science.gov (United States)

    Pappas, Thrasyvoulos N.; Neuhoff, David L.

    1992-08-01

    A least-squares model-based approach to digital halftoning is proposed. It exploits both a printer model and a model for visual perception. It attempts to produce an 'optimal' halftoned reproduction, by minimizing the squared error between the response of the cascade of the printer and visual models to the binary image and the response of the visual model to the original gray-scale image. Conventional methods, such as clustered ordered dither, use the properties of the eye only implicitly, and resist printer distortions at the expense of spatial and gray-scale resolution. In previous work we showed that our printer model can be used to modify error diffusion to account for printer distortions. The modified error diffusion algorithm has better spatial and gray-scale resolution than conventional techniques, but produces some well-known artifacts and asymmetries because it does not make use of an explicit eye model. Least-squares model-based halftoning uses explicit eye models and relies on printer models that predict distortions and exploit them to increase, rather than decrease, both spatial and gray-scale resolution. We have shown that the one-dimensional least-squares problem, in which each row or column of the image is halftoned independently, can be implemented with Viterbi's algorithm. Unfortunately, no closed-form solution can be found in two dimensions. The two-dimensional least-squares solution is obtained by iterative techniques. Experiments show that least-squares model-based halftoning produces more gray levels and better spatial resolution than conventional techniques. We also show that the least-squares approach eliminates the problems associated with error diffusion. Model-based halftoning can be especially useful in transmission of high-quality documents using high-fidelity gray-scale image encoders. As we have shown, in such cases halftoning can be performed at the receiver, just before printing. Apart from coding efficiency, this approach
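
    The iterative least-squares idea can be caricatured in one dimension by greedy coordinate descent: flip a binary sample whenever the flip reduces the squared error between the low-pass-filtered ("eye model") halftone and the filtered gray-scale input. This toy sketch is far weaker than the Viterbi solution described above; the filter taps and the test signal are arbitrary choices, not the paper's models.

```python
import numpy as np

def blur(x, h):
    """Linear 'eye model': FIR low-pass filtering of a 1-D signal."""
    return np.convolve(x, h, mode="same")

def ls_halftone_1d(gray, h, sweeps=10):
    """Greedy least-squares halftoning of a 1-D gray-scale signal.

    Coordinate descent: flip each binary sample whenever the flip
    reduces the squared error between the blurred halftone and the
    blurred gray input; stop when a full sweep changes nothing.
    """
    b = (gray > 0.5).astype(float)          # thresholding as initial guess
    target = blur(gray, h)
    for _ in range(sweeps):
        changed = False
        for i in range(len(b)):
            err0 = np.sum((blur(b, h) - target) ** 2)
            b[i] = 1.0 - b[i]               # try the flip
            err1 = np.sum((blur(b, h) - target) ** 2)
            if err1 >= err0:
                b[i] = 1.0 - b[i]           # flip didn't help: undo
            else:
                changed = True
        if not changed:
            break
    return b

h = np.array([0.25, 0.5, 0.25])             # crude low-pass eye model
gray = np.linspace(0.0, 1.0, 32)            # gray ramp in [0, 1]
b = ls_halftone_1d(gray, h)
thresh = (gray > 0.5).astype(float)
err = lambda x: np.sum((blur(x, h) - blur(gray, h)) ** 2)
print(err(b) <= err(thresh))                # beats plain thresholding
```

    Because the search starts from the thresholded image and only accepts improving flips, the perceived (filtered) error can never be worse than thresholding, which is the essence of the least-squares formulation.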

  11. Model-based sensor diagnosis

    International Nuclear Information System (INIS)

    Milgram, J.; Dormoy, J.L.

    1994-09-01

    Running a nuclear power plant involves monitoring data provided by the installation's sensors. Operators and computerized systems then use these data to establish a diagnostic of the plant. However, the instrumentation system is complex, and is not immune to faults and failures. This paper presents a system for detecting sensor failures using a topological description of the installation and a set of component models. This model of the plant implicitly contains relations between sensor data. These relations must always be checked if all the components are functioning correctly. The failure detection task thus consists of checking these constraints. The constraints are extracted in two stages. Firstly, a qualitative model of their existence is built using structural analysis. Secondly, the models are formally handled according to the results of the structural analysis, in order to establish the constraints on the sensor data. This work constitutes an initial step in extending model-based diagnosis, as the information on which it is based is suspect. This work will be followed by surveillance of the detection system. When the instrumentation is assumed to be sound, the unverified constraints indicate errors on the plant model. (authors). 8 refs., 4 figs

  12. DEVELOPMENT OF A PHYSIOLOGICALLY BASED PHARMACOKINETIC MODEL FOR DELTAMETHRIN IN THE ADULT MALE SPRAGUE-DAWLEY RAT

    Science.gov (United States)

    Deltamethrin (DLT) is a Type II pyrethroid insecticide widely used in agriculture and public health. DLT is a potent neurotoxin that is primarily cleared from the body by metabolism. To better understand the dosimetry of DLT in the central nervous system, a physiologically based ...

  13. Ground-based Observations and Atmospheric Modelling of Energetic Electron Precipitation Effects on Antarctic Mesospheric Chemistry

    Science.gov (United States)

    Newnham, D.; Clilverd, M. A.; Horne, R. B.; Rodger, C. J.; Seppälä, A.; Verronen, P. T.; Andersson, M. E.; Marsh, D. R.; Hendrickx, K.; Megner, L. S.; Kovacs, T.; Feng, W.; Plane, J. M. C.

    2016-12-01

    The effect of energetic electron precipitation (EEP) on the seasonal and diurnal abundances of nitric oxide (NO) and ozone in the Antarctic middle atmosphere from March 2013 to July 2014 is investigated. Geomagnetic storm activity during this period, close to solar maximum, was driven primarily by impulsive coronal mass ejections. Near-continuous ground-based atmospheric measurements have been made by a passive millimetre-wave radiometer deployed at Halley station (75°37'S, 26°14'W, L = 4.6), Antarctica. This location is directly under the region of radiation-belt EEP, at the extremity of magnetospheric substorm-driven EEP, and deep within the polar vortex during Austral winter. Superposed epoch analyses of the ground-based data, together with NO observations made by the Solar Occultation For Ice Experiment (SOFIE) onboard the Aeronomy of Ice in the Mesosphere (AIM) satellite, show enhanced mesospheric NO following moderate geomagnetic storms (Dst ≤ -50 nT). Measurements by co-located 30 MHz riometers indicate simultaneous increases in ionisation at 75-90 km directly above Halley when the Kp index is ≥ 4. Direct NO production by EEP in the upper mesosphere, versus downward transport of NO from the lower thermosphere, is evaluated using a new version of the Whole Atmosphere Community Climate Model incorporating the full Sodankylä Ion Neutral Chemistry Model (WACCM-SIC). Model ionization rates are derived from the Medium Energy Proton and Electron Detector (MEPED) instrument of the second-generation Space Environment Monitor (SEM-2) on the Polar-orbiting Operational Environmental Satellites (POES). The model data are compared with observations to quantify the impact of EEP on stratospheric and mesospheric odd nitrogen (NOx), odd hydrogen (HOx), and ozone.

  14. Model-Based Learning Environment Based on The Concept IPS School-Based Management

    Directory of Open Access Journals (Sweden)

    Hamid Darmadi

    2017-03-01

    Full Text Available The results showed: (1) the environment-based IPS learning model can foster love of local cultural values as a basis for the development of national culture; (2) community participation and the role of government in implementing the environment-based IPS learning model have a positive impact on the management of school resources; (3) the environment-based IPS learning model is effective in creating a way of living together peacefully and increasing the intensity of togetherness and mutual respect; (4) the environment-based IPS learning model can improve student learning outcomes; (5) there are differences in expressed attitudes and learning outcomes between students located in the conflict area and students outside the conflict area; (6) analysis of the attitude scales shows high regard among senior high school students for the values of national unity, respect for diversity, and peaceful coexistence. It is recommended that the provincial Department of Education, as the institution responsible for fostering and developing social and cultural values, apply the environment-based IPS learning model.

  15. Model based process-product design and analysis

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    This paper gives a perspective on modelling and the important role it has within product-process design and analysis. Different modelling issues related to the development and application of systematic model-based solution approaches for product-process design are discussed, and the need for a hybrid model-based framework is highlighted. This framework should be able to manage knowledge-data, models, and associated methods and tools, integrated with design work-flows and data-flows, for specific product-process design problems. In particular, the framework needs to manage models of different types, forms, and complexity, together with their associated parameters. An example of a model-based system for design of chemicals-based formulated products is also given.

  16. A probabilistic graphical model based stochastic input model construction

    International Nuclear Information System (INIS)

    Wan, Jiang; Zabaras, Nicholas

    2014-01-01

    Model reduction techniques have been widely used in modeling of high-dimensional stochastic input in uncertainty quantification tasks. However, the probabilistic modeling of random variables projected into reduced-order spaces presents a number of computational challenges. Due to the curse of dimensionality, the underlying dependence relationships between these random variables are difficult to capture. In this work, a probabilistic graphical model based approach is employed to learn the dependence by running a number of conditional independence tests using observation data. Thus a probabilistic model of the joint PDF is obtained and the PDF is factorized into a set of conditional distributions based on the dependence structure of the variables. The estimation of the joint PDF from data is then transformed to estimating conditional distributions under reduced dimensions. To improve the computational efficiency, a polynomial chaos expansion is further applied to represent the random field in terms of a set of standard random variables. This technique is combined with both linear and nonlinear model reduction methods. Numerical examples are presented to demonstrate the accuracy and efficiency of the probabilistic graphical model based stochastic input models. - Highlights: • Data-driven stochastic input models without the assumption of independence of the reduced random variables. • The problem is transformed to a Bayesian network structure learning problem. • Examples are given in flows in random media
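
    The structure-learning step above, running conditional independence tests on observation data, is commonly realized with partial-correlation tests. The sketch below is one generic possibility rather than the paper's exact procedure; the data-generating model (two variables driven by a common parent) is invented for illustration.

```python
import numpy as np

def partial_corr(x, y, z):
    """Correlation of x and y after regressing out the conditioning
    variable z; near zero suggests conditional independence x ⊥ y | z."""
    z1 = np.column_stack([z, np.ones_like(z)])          # regressor + intercept
    rx = x - z1 @ np.linalg.lstsq(z1, x, rcond=None)[0]  # residual of x on z
    ry = y - z1 @ np.linalg.lstsq(z1, y, rcond=None)[0]  # residual of y on z
    return np.corrcoef(rx, ry)[0, 1]

rng = np.random.default_rng(1)
z = rng.normal(size=5000)                 # common parent variable
x = z + 0.1 * rng.normal(size=5000)       # both children driven by z
y = z + 0.1 * rng.normal(size=5000)
print(np.corrcoef(x, y)[0, 1], partial_corr(x, y, z))
# strongly correlated marginally, nearly independent given z
```

    A structure learner using such tests would drop the x–y edge and keep z as their common parent, which is exactly the kind of factorization of the joint PDF the abstract describes.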

  17. Web Based VRML Modelling

    NARCIS (Netherlands)

    Kiss, S.; Sarfraz, M.

    2004-01-01

    Presents a method to connect VRML (Virtual Reality Modeling Language) and Java components in a Web page using EAI (External Authoring Interface), which makes it possible to interactively generate and edit VRML meshes. The meshes used are based on regular grids, to provide an interaction and modeling

  18. A model for self-diffusion of guanidinium-based ionic liquids: a molecular simulation study.

    Science.gov (United States)

    Klähn, Marco; Seduraman, Abirami; Wu, Ping

    2008-11-06

    We propose a novel self-diffusion model for ionic liquids at an atomic level of detail. The model is derived from molecular dynamics simulations of guanidinium-based ionic liquids (GILs) as a model case. The simulations are based on an empirical molecular mechanical force field, which was developed in our preceding work, and it relies on the charge distribution in the actual liquid. The simulated GILs consist of acyclic and cyclic cations that were paired with nitrate and perchlorate anions. Self-diffusion coefficients are calculated at different temperatures, from which diffusive activation energies of 32-40 kJ/mol are derived. Vaporization enthalpies of 174-212 kJ/mol are calculated, and their strong connection with diffusive activation energies is demonstrated. An observed formation of cavities in GILs of up to 6.5% of the total volume does not facilitate self-diffusion. Instead, the diffusion of ions is found to be determined primarily by interactions with their immediate environment, via electrostatic attraction between cation hydrogen and anion oxygen atoms. The calculated average time between single diffusive transitions varies between 58 and 107 ps and determines the speed of diffusion, in contrast to diffusive displacement distances, which were found to be similar in all simulated GILs. All simulations indicate that ions diffuse by a brachiation type of movement: a diffusive transition is initiated by cleaving close contacts to a coordinated counterion, after which the ion diffuses only about 2 Å until new close contacts are formed with another counterion in its vicinity. The proposed diffusion model links all calculated energetic and dynamic properties of GILs consistently and explains their molecular origin. 
The validity of the model is confirmed by providing an explanation for the variation of measured ratios of self-diffusion coefficients of cations and paired anions over a wide range of values, encompassing various ionic liquid classes
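
    Self-diffusion coefficients such as those above are conventionally extracted from MD trajectories via the Einstein relation, MSD(t) ≈ 6Dt in three dimensions. A minimal sketch of that standard estimator on synthetic Brownian trajectories; the trajectory data and parameters are invented stand-ins, not the GIL simulations.

```python
import numpy as np

def self_diffusion_coefficient(positions, dt):
    """Estimate D from trajectories via the Einstein relation.

    positions: array (n_steps, n_ions, 3) of unwrapped coordinates.
    In 3-D, MSD(t) ~ 6 D t, so D is the slope of MSD vs t over 6.
    """
    disp = positions - positions[0]                   # displacement from t = 0
    msd = np.mean(np.sum(disp ** 2, axis=2), axis=1)  # average over ions
    t = np.arange(len(msd)) * dt
    slope = np.sum(t * msd) / np.sum(t * t)           # LSQ slope through origin
    return slope / 6.0

rng = np.random.default_rng(0)
dt, true_D = 1.0, 0.5
# Brownian toy trajectories: Gaussian steps of variance 2*D*dt per dimension.
steps = rng.normal(0.0, np.sqrt(2 * true_D * dt), size=(2000, 200, 3))
pos = np.cumsum(steps, axis=0)
print(self_diffusion_coefficient(pos, dt))  # close to 0.5
```

    In real MD analyses the fit is restricted to the diffusive (linear) regime of the MSD curve; the synthetic walk here is diffusive at all times, so the fit through the origin suffices.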

  19. Physics-Based Modeling of Compressible Turbulence

    Science.gov (United States)

    2016-11-07

    [SF 298 report documentation page; recoverable details:] AFRL-AFOSR-VA-TR-2016-0345, "Physics-Based Modeling of Compressible Turbulence," Parviz Moin, Leland Stanford Junior University, CA; Final Report, 09/13/2016. Final report on the AFOSR project (FA9550-11-1-0111) entitled "Physics based modeling of compressible turbulence"; the period of performance began June 15, 2011.

  20. Experimental models of brain ischemia: a review of techniques, magnetic resonance imaging and investigational cell-based therapies

    Directory of Open Access Journals (Sweden)

    Alessandra eCanazza

    2014-02-01

    Full Text Available Stroke continues to be a significant cause of death and disability worldwide. Although major advances have been made in the past decades in prevention, treatment and rehabilitation, enormous challenges remain in the way of translating new therapeutic approaches from bench to bedside. Thrombolysis, while routinely used for ischemic stroke, is only a viable option within a narrow time window. Recently, progress in stem cell biology has opened up avenues to therapeutic strategies aimed at supporting and replacing neural cells in infarcted areas. Realistic experimental animal models are crucial to understand the mechanisms of neuronal survival following ischemic brain injury and to develop therapeutic interventions. Current studies on experimental stroke therapies evaluate the efficiency of neuroprotective agents and cell-based approaches using primarily rodent models of permanent or transient focal cerebral ischemia. In parallel, advancements in imaging techniques permit better mapping of the spatial-temporal evolution of the lesioned cortex and its functional responses. This review provides a condensed conceptual review of the state of the art of this field, from models and magnetic resonance imaging techniques through to stem cell therapies.

  1. Probabilistic reasoning for assembly-based 3D modeling

    KAUST Repository

    Chaudhuri, Siddhartha; Kalogerakis, Evangelos; Guibas, Leonidas; Koltun, Vladlen

    2011-01-01

    Assembly-based modeling is a promising approach to broadening the accessibility of 3D modeling. In assembly-based modeling, new models are assembled from shape components extracted from a database. A key challenge in assembly-based modeling

  2. Model Based Temporal Reasoning

    Science.gov (United States)

    Rabin, Marla J.; Spinrad, Paul R.; Fall, Thomas C.

    1988-03-01

    Systems that assess the real world must cope with evidence that is uncertain, ambiguous, and spread over time. Typically, the most important function of an assessment system is to identify when activities are occurring that are unusual or unanticipated. Model-based temporal reasoning addresses both of these requirements. The differences among temporal reasoning schemes lie in the methods used to avoid computational intractability. If we had n pieces of data and we wanted to examine how they were related, the worst case would be where we had to examine every subset of these points to see if that subset satisfied the relations. This would be 2^n subsets, which is intractable. Models compress this: if several data points are all compatible with a model, then that model represents all those data points. Data points are then considered related if they lie within the same model or if they lie in models that are related. Models thus address the intractability problem. They also address the problem of determining unusual activities: if the data do not agree with models that are indicated by earlier data, then something out of the norm is taking place. The models can summarize what we know up to that time, so when they are not predicting correctly, either something unusual is happening or we need to revise our models. The model-based reasoner developed at Advanced Decision Systems is thus both intuitive and powerful. It is currently being used on one operational system and several prototype systems. It has enough power to be used in domains spanning the spectrum from manufacturing engineering and project management to low-intensity conflict and strategic assessment.
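
    The counting argument can be made concrete: n observations admit 2^n subsets, but once compatible points are summarized by a shared model, only a handful of models remain to reason over. A toy sketch in which the "models" are simply running means that absorb nearby values; the grouping rule is an invented stand-in, not the system's actual models.

```python
def group_by_model(values, tol=0.5):
    """Greedy model grouping: each model is the running mean of the
    points it covers; a value joins the first model within tol of its
    mean, otherwise it seeds a new model."""
    models = []  # list of [mean, count]
    for v in values:
        for m in models:
            if abs(m[0] - v) <= tol:
                m[0] = (m[0] * m[1] + v) / (m[1] + 1)  # update running mean
                m[1] += 1
                break
        else:
            models.append([v, 1])
    return models

data = [1.0, 1.1, 0.9, 5.0, 5.2, 1.05, 9.7]
models = group_by_model(data)
print(len(data), 2 ** len(data), len(models))  # 7 points, 128 subsets, 3 models
```

    Reasoning now ranges over 3 model summaries instead of 128 subsets, and a new point that fits no existing model (like 9.7 here) is exactly the "out of the norm" signal the abstract describes.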

  3. Disgust sensitivity is primarily associated with purity-based moral judgments

    NARCIS (Netherlands)

    Wagemans, F.M.A.; Brandt, M.J.; Zeelenberg, M.

    2018-01-01

    Individual differences in disgust sensitivity are associated with a range of judgments and attitudes related to the moral domain. Some perspectives suggest that the association between disgust sensitivity and moral judgments will be equally strong across all moral domains (i.e., purity, authority,

  4. Scientific data base for safeguards components

    International Nuclear Information System (INIS)

    Hall, R.C.; Jones, R.D.

    1978-01-01

    The need to store and maintain vast amounts of data and the desire to avoid nonfunctional redundancy have provided an impetus for modern data base technology. Large-scale data base management systems (DBMS) have emerged during the past two decades evolving from earlier generalized file processing systems. This evolution has primarily involved certain business applications (e.g., production control, payroll, order entry) because of their high volume data processing characterization. Current data base technology, however, is becoming increasingly concerned with generality. Many diverse applications, including scientific ones, are benefiting from the generalized data base management software which has resulted. The concept of a data base management system is examined. The three common models which have been proposed for organizing data and relationships are identified: the network model, the hierarchical model, and the relational model. A specific implementation using a hierarchical data base management system is described. This is the data base for safeguards components which has been developed at Sandia Laboratories using the System 2000 developed by MRI Systems Corporation. Its organization, components, and functions are presented. The various interfaces it permits to user programs (e.g., safeguards automated facility evaluation software) and interactive terminal users are described

  5. Assessment of Vegetation Variation on Primarily Creation Zones of the Dust Storms Around the Euphrates Using Remote Sensing Images

    Directory of Open Access Journals (Sweden)

    Jamil Amanollahi

    2012-06-01

Recently, the frequency and spatial extent of the dust storms that enter Iran from Iraq have increased. In this study, in addition to detecting the creation zones of the dust storms, the effect of vegetation cover variation on their creation was investigated using remote sensing. Moderate Resolution Imaging Spectroradiometer (MODIS) and Landsat Thematic Mapper (TM5) imagery were utilized to identify the primary creation zones of the dust storms and to assess vegetation cover variation, respectively. Vegetation cover variation was studied using the Normalized Difference Vegetation Index (NDVI) obtained from band 3 and band 4 of the Landsat satellite. The results showed that the surroundings of the Euphrates in Syria, the deserts in the vicinity of this river in Iraq, including the deserts of Al Anbar Province, and the northern deserts of Saudi Arabia are the primary creation zones of the dust storms entering the west and south-west of Iran. The NDVI results showed that, excluding the deserts on the border of Syria and Iraq, the areas with very weak vegetation cover increased by between 2.44% and 20.65% from 1991 to 2009. Meanwhile, the retention pond surface areas in the southern deserts of Syria and in the deserts on its border with Iraq decreased by 6320 and 4397 hectares, respectively. It can be concluded from these findings that one of the main environmental parameters initiating these dust storms is the decrease in vegetation cover in their primary creation zones.
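The NDVI used in the abstract above reduces to a simple per-pixel ratio of the red and near-infrared channels (for Landsat TM, band 3 and band 4 respectively). A minimal sketch, with illustrative reflectance values:

```python
def ndvi(red, nir):
    """Normalized Difference Vegetation Index for one pixel.

    For Landsat TM, band 3 is the red channel and band 4 the
    near-infrared channel, as used in the study above.
    """
    denom = nir + red
    if denom == 0:
        return 0.0  # guard against division by zero over dark pixels
    return (nir - red) / denom

# Dense vegetation reflects strongly in NIR and weakly in red:
print(ndvi(red=0.05, nir=0.45))  # high NDVI = 0.8
# Bare desert soil reflects both channels similarly:
print(ndvi(red=0.30, nir=0.35))  # low NDVI, about 0.08
```

Declining NDVI between two acquisition dates over the same area is the signal the study uses to flag weakening vegetation cover in the dust-storm source zones.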

  6. Distribution of lithostratigraphic units within the central block of Yucca Mountain, Nevada: A three-dimensional computer-based model, Version YMP.R2.0

    International Nuclear Information System (INIS)

    Buesch, D.C.; Nelson, J.E.; Dickerson, R.P.; Drake, R.M. II; San Juan, C.A.; Spengler, R.W.; Geslin, J.K.; Moyer, T.C.

    1996-01-01

Yucca Mountain, Nevada is underlain by 14.0 to 11.6 Ma volcanic rocks tilted eastward 3 to 20 degrees and cut by faults that were primarily active between 12.7 and 11.6 Ma. A three-dimensional computer-based model of the central block of the mountain consists of seven structural subblocks composed of six formations and the interstratified bedded tuffaceous deposits. Rocks from the 12.7 Ma Tiva Canyon Tuff, which forms most of the exposed rocks on the mountain, to the 13.1 Ma Prow Pass Tuff are modeled with 13 surfaces. Modeled units represent single formations such as the Pah Canyon Tuff, grouped units such as the combination of the Yucca Mountain Tuff with the superjacent bedded tuff, and divisions of the Topopah Spring Tuff such as the crystal-poor vitrophyre interval. The model is based on data from 75 boreholes, from which a structure contour map at the base of the Tiva Canyon Tuff and isochore maps for each unit are constructed to serve as primary input. Modeling consists of an iterative cycle that begins with the primary structure contour map, from which isochore values of the subjacent model unit are subtracted to produce the structure contour map of the base of that unit. This new structure contour map forms the input for another cycle of isochore subtraction to produce the next structure contour map. In this method of solids modeling, the model units are represented by surfaces (structure contour maps), and all surfaces are stored in the model; surfaces can be converted into volumes of model units with additional effort. This lithostratigraphic and structural model can be used for (1) storing data from, and planning future, site characterization activities, (2) defining the preliminary geometry of units for design of the Exploratory Studies Facility and potential repository, and (3) performance assessment evaluations.
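The iterative isochore-subtraction cycle described above can be sketched per borehole location: each cycle subtracts a unit's thickness (isochore) from the current structure contour surface to produce the base of the next unit down. The borehole names, elevations, and thicknesses below are hypothetical, not the model's actual data:

```python
# Hypothetical elevations (m) of the Tiva Canyon Tuff base at three
# borehole locations, plus isochore (thickness) values for two
# subjacent model units; all numbers are illustrative only.
tiva_base = {"BH-a": 1200.0, "BH-b": 1180.0, "BH-c": 1150.0}
isochores = [
    {"BH-a": 35.0, "BH-b": 40.0, "BH-c": 30.0},    # subjacent unit 1
    {"BH-a": 110.0, "BH-b": 95.0, "BH-c": 120.0},  # subjacent unit 2
]

def subtract_isochore(surface, isochore):
    """One cycle: structure contour of the base of the subjacent unit."""
    return {bh: surface[bh] - isochore[bh] for bh in surface}

# Iterate downward through the stack; every surface stays in the model.
surfaces = [tiva_base]
for iso in isochores:
    surfaces.append(subtract_isochore(surfaces[-1], iso))
```

In the real model the "surfaces" are gridded structure contour maps rather than a handful of borehole points, but the cycle is the same subtraction applied grid-node by grid-node.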

  7. Econophysics of agent-based models

    CERN Document Server

    Aoyama, Hideaki; Chakrabarti, Bikas; Chakraborti, Anirban; Ghosh, Asim

    2014-01-01

    The primary goal of this book is to present the research findings and conclusions of physicists, economists, mathematicians and financial engineers working in the field of "Econophysics" who have undertaken agent-based modelling, comparison with empirical studies and related investigations. Most standard economic models assume the existence of the representative agent, who is “perfectly rational” and applies the utility maximization principle when taking action. One reason for this is the desire to keep models mathematically tractable: no tools are available to economists for solving non-linear models of heterogeneous adaptive agents without explicit optimization. In contrast, multi-agent models, which originated from statistical physics considerations, allow us to go beyond the prototype theories of traditional economics involving the representative agent. This book is based on the Econophys-Kolkata VII Workshop, at which many such modelling efforts were presented. In the book, leading researchers in the...

  8. A model-based framework for design of intensified enzyme-based processes

    DEFF Research Database (Denmark)

    Román-Martinez, Alicia

    This thesis presents a generic and systematic model-based framework to design intensified enzyme-based processes. The development of the presented methodology was motivated by the needs of the bio-based industry for a more systematic approach to achieve intensification in its production plants...... in enzyme-based processes which have found significant application in the pharmaceutical, food, and renewable fuels sector. The framework uses model-based strategies for (bio)-chemical process design and optimization, including the use of a superstructure to generate all potential reaction......(s)-separation(s) options according to a desired performance criteria and a generic mathematical model represented by the superstructure to derive the specific models corresponding to a specific process option. In principle, three methods of intensification of bioprocess are considered in this thesis: 1. enzymatic one...

  9. ANFIS-Based Modeling for Photovoltaic Characteristics Estimation

    Directory of Open Access Journals (Sweden)

    Ziqiang Bi

    2016-09-01

Due to the high cost of photovoltaic (PV) modules, an accurate performance estimation method is significantly valuable for studying the electrical characteristics of PV generation systems. Conventional analytical PV models are usually composed of nonlinear exponential functions, and a good number of unknown parameters must be identified before use. In this paper, an adaptive-network-based fuzzy inference system (ANFIS) based modeling method is proposed to predict the current-voltage characteristics of PV modules. The effectiveness of the proposed modeling method is evaluated through comparison with Villalva’s model, a radial basis function neural network (RBFNN) based model, and a support vector regression (SVR) based model. Simulation and experimental results confirm both the feasibility and the effectiveness of the proposed method.

  10. Modeling of the service taxi

    OpenAIRE

    R. M. Bezborodnikova

    2016-01-01

Modeling and optimization of business processes are ongoing challenges. Studying business processes makes it possible to anticipate and avoid many problems in a company's operations, linked primarily to increased costs, low quality of work and products, and excessive function run times. Applying simulation tools to business processes makes it possible, at the planning stage, to assess various indicators of process effectiveness and to identify the...

  11. Analysis of the Explanatory Variables of the Differences in Perceptions of Cyberbullying: A Role-Based-Model Approach.

    Science.gov (United States)

    Fernández-Antelo, Inmaculada; Cuadrado-Gordillo, Isabel

    2018-04-01

The controversies that exist regarding the delimitation of the cyberbullying construct demonstrate the need for further research focused on determining the criteria that shape the structure of adolescents' perceptions of this phenomenon and on seeking explanations of this behavior. The objectives of this study were to (a) construct possible explanatory models of the perception of cyberbullying by identifying and relating the criteria that form this construct and (b) analyze the influence of previous cyber victimization and cyber aggression experiences on the construction of explanatory models of the perception of cyberbullying. The sample consisted of 2,148 adolescents (49.1% girls; SD = 0.5) aged from 12 to 16 years (M = 13.9 years; SD = 1.2). The results show that previous cyber victimization and cyber aggression experiences lead to major differences in the explanatory models used to interpret cyber-abusive behavior as cyberbullying episodes, as social relationship mechanisms, or as a revenge reaction. We note that the aggressors' explanatory model is based primarily on a strong reciprocal relationship between imbalance of power and intentionality, which functions as a link promoting indirect causal relationships of the anonymity and repetition factors with the cyberbullying construct. The victims' perceptual structure is based on three criteria (imbalance of power, intentionality, and publicity), where the key factor is the intention to harm. These results make it possible to design more effective prevention and intervention measures, tailored to directly address the factors considered to be predictors of risk.

  12. Using Model Replication to Improve the Reliability of Agent-Based Models

    Science.gov (United States)

    Zhong, Wei; Kim, Yushim

The basic presupposition of model replication activities for a computational model such as an agent-based model (ABM) is that, as a robust and reliable tool, it must be replicable in other computing settings. This assumption has recently gained attention in the artificial society and simulation community due to the challenges of model verification and validation. Illustrating the replication of an ABM representing fraudulent behavior in a public service delivery system, originally developed in the Java-based MASON toolkit and reimplemented in NetLogo by a different author, this paper exemplifies how model replication exercises provide unique opportunities for the model verification and validation process. At the same time, it helps accumulate best practices and patterns of model replication and contributes to the agenda of developing a standard methodological protocol for agent-based social simulation.

  13. Skull base tumor model.

    Science.gov (United States)

    Gragnaniello, Cristian; Nader, Remi; van Doormaal, Tristan; Kamel, Mahmoud; Voormolen, Eduard H J; Lasio, Giovanni; Aboud, Emad; Regli, Luca; Tulleken, Cornelius A F; Al-Mefty, Ossama

    2010-11-01

    Resident duty-hours restrictions have now been instituted in many countries worldwide. Shortened training times and increased public scrutiny of surgical competency have led to a move away from the traditional apprenticeship model of training. The development of educational models for brain anatomy is a fascinating innovation allowing neurosurgeons to train without the need to practice on real patients and it may be a solution to achieve competency within a shortened training period. The authors describe the use of Stratathane resin ST-504 polymer (SRSP), which is inserted at different intracranial locations to closely mimic meningiomas and other pathological entities of the skull base, in a cadaveric model, for use in neurosurgical training. Silicone-injected and pressurized cadaveric heads were used for studying the SRSP model. The SRSP presents unique intrinsic metamorphic characteristics: liquid at first, it expands and foams when injected into the desired area of the brain, forming a solid tumorlike structure. The authors injected SRSP via different passages that did not influence routes used for the surgical approach for resection of the simulated lesion. For example, SRSP injection routes included endonasal transsphenoidal or transoral approaches if lesions were to be removed through standard skull base approach, or, alternatively, SRSP was injected via a cranial approach if the removal was planned to be via the transsphenoidal or transoral route. The model was set in place in 3 countries (US, Italy, and The Netherlands), and a pool of 13 physicians from 4 different institutions (all surgeons and surgeons in training) participated in evaluating it and provided feedback. All 13 evaluating physicians had overall positive impressions of the model. 
The overall score on 9 components evaluated--including comparison between the tumor model and real tumor cases, perioperative requirements, general impression, and applicability--was 88% (100% being the best possible

  14. Modelling Flexible Pavement Response and Performance

    DEFF Research Database (Denmark)

    Ullidtz, Per

This textbook is primarily concerned with models for predicting the future condition of flexible pavements, as a function of traffic loading, climate, materials, etc., using analytical-empirical methods.

  15. Implant-based oral rehabilitation of a variant model of type I dentinal dysplasia: A rare case report

    Directory of Open Access Journals (Sweden)

    Sowmya Nettem

    2014-01-01

Dentin dysplasia is an exceptionally rare, autosomal-dominant, hereditary condition, primarily characterized by defective dentin formation affecting both the deciduous and permanent dentitions. The etiology remains unclear to date, in spite of the numerous hypotheses put forward and the constant updates on this condition. This case report of type I dentin dysplasia exhibits radiographic findings that are unique and distinct from the classical findings of the various subtypes of this disease reported to date. This article also depicts the implant-based oral rehabilitation of a young patient diagnosed with this variant form of dentin dysplasia type I. Early diagnosis and implementation of this preventive and curative therapy are vital for avoiding premature exfoliation of the deciduous and permanent dentition and the associated residual ridge resorption, thereby overcoming functional and esthetic deficits and protecting the remaining dentition from further harm.

  16. Firm Based Trade Models and Turkish Economy

    Directory of Open Access Journals (Sweden)

    Nilüfer ARGIN

    2015-12-01

Among all international trade models, only Firm Based Trade Models explain firms’ actions and behavior in world trade. Firm Based Trade Models focus on the trade behavior of the individual firms that actually conduct intra-industry trade, and they can genuinely explain the globalization process. These approaches also cover multinational corporations, supply chains, and outsourcing. Our paper aims to explain and analyze Turkish exports in the context of Firm Based Trade Models. We use UNCTAD data on exports under the SITC Rev. 3 categorization to examine total exports and 255 products, and we calculate the intensive and extensive margins of Turkish firms.
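The intensive and extensive margins mentioned above have a simple operational form: how many product lines a country exports (extensive) and how much it exports per line (intensive). A sketch with invented values; the product codes and figures below are placeholders, not the UNCTAD/SITC Rev. 3 data used in the paper:

```python
def trade_margins(exports):
    """Decompose exports by product line into the extensive margin
    (number of product lines actually exported) and a simple intensive
    margin (average export value per exported line)."""
    exported = [v for v in exports.values() if v > 0]
    extensive = len(exported)
    intensive = sum(exported) / extensive if extensive else 0.0
    return extensive, intensive

# Hypothetical export values (million USD) by SITC-style product code:
exports = {"7810": 120.0, "6584": 45.0, "0577": 0.0, "8451": 35.0}
print(trade_margins(exports))  # 3 exported lines, averaging 200/3 each
```

Growth along the extensive margin (new product lines) versus the intensive margin (more trade in existing lines) is exactly the firm-level distinction these models are built to capture.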

  17. Incident Duration Modeling Using Flexible Parametric Hazard-Based Models

    Directory of Open Access Journals (Sweden)

    Ruimin Li

    2014-01-01

Assessing and prioritizing the duration and effects of traffic incidents on major roads present significant challenges for road network managers. This study examines the effect of numerous factors associated with various types of incidents on their duration, and proposes an incident duration prediction model. Several parametric accelerated failure time (AFT) hazard-based models were examined, including Weibull, log-logistic, log-normal, and generalized gamma, as well as these models with gamma heterogeneity and flexible parametric hazard-based models with degrees of freedom ranging from one to ten, by analyzing a traffic incident dataset obtained from the Incident Reporting and Dispatching System in Beijing in 2008. Results show that different factors significantly affect different incident time phases, and the best-fitting distribution differs across phases. Given the best hazard-based model for each incident time phase, the predictions are reasonable for most incidents. The results of this study can aid traffic incident management agencies not only in implementing strategies that reduce incident duration, and thus congestion, secondary incidents, and the associated human and economic losses, but also in effectively predicting incident duration.
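One of the parametric AFT forms examined above, the Weibull, has a closed-form median once the covariate effects are folded into the scale parameter. The covariates and coefficients below are invented for illustration, not the values fitted on the Beijing dataset:

```python
import math

def weibull_aft_median(x, beta, shape):
    """Median duration under a Weibull accelerated failure time model:
    scale = exp(beta . x), median = scale * ln(2)**(1/shape).
    The coefficients are illustrative placeholders."""
    scale = math.exp(sum(b * xi for b, xi in zip(beta, x)))
    return scale * math.log(2) ** (1.0 / shape)

# Hypothetical covariates: [intercept, lanes_blocked, heavy_vehicle]
beta = [3.2, 0.4, 0.6]
median_minutes = weibull_aft_median([1, 1, 0], beta, shape=1.5)
print(median_minutes)
```

In an AFT model the covariates multiplicatively stretch or compress the time scale, which is why a positive coefficient (e.g. a blocked lane) lengthens the predicted duration for every quantile, not just the median.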

  18. NASA Software Cost Estimation Model: An Analogy Based Estimation Model

    Science.gov (United States)

    Hihn, Jairus; Juster, Leora; Menzies, Tim; Mathew, George; Johnson, James

    2015-01-01

The cost estimation of software development activities is increasingly critical for large-scale integrated projects such as those at DOD and NASA, especially as software systems become larger and more complex. As an example, MSL (Mars Science Laboratory), developed at the Jet Propulsion Laboratory, launched with over 2 million lines of code, making it the largest robotic spacecraft ever flown (based on the size of the software). Software development activities are also notorious for their cost growth, with NASA flight software averaging over 50% cost growth. All across the agency, estimators and analysts are increasingly being tasked to develop reliable cost estimates in support of program planning and execution. While there has been extensive work on improving parametric methods, there is very little focus on the use of models based on analogy and clustering algorithms. In this paper we summarize our findings on effort/cost model estimation and model development based on ten years of software effort estimation research using data mining and machine learning methods to develop estimation models based on analogy and clustering. The NASA Software Cost Model's performance is evaluated by comparing it to COCOMO II, linear regression, and K-nearest neighbor prediction model performance on the same data set.
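The analogy idea behind the K-nearest-neighbor baseline above can be sketched in a few lines: estimate a new project's effort as the mean effort of its most similar past projects. The feature set, history, and k below are placeholders for illustration, not the actual NASA model or data:

```python
import math

def knn_effort_estimate(project, history, k=3):
    """Analogy-based effort estimate: mean effort of the k past
    projects closest in feature space (Euclidean distance).
    A simplified sketch of the analogy/clustering idea only."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    nearest = sorted(history, key=lambda h: dist(h["features"], project))[:k]
    return sum(h["effort"] for h in nearest) / k

# Hypothetical history: features are (KSLOC, team size), effort in
# person-months; all values invented.
history = [
    {"features": (10, 3), "effort": 120.0},
    {"features": (12, 4), "effort": 150.0},
    {"features": (50, 9), "effort": 700.0},
    {"features": (11, 3), "effort": 130.0},
]
print(knn_effort_estimate((11, 4), history))  # averages the 3 closest
```

Unlike a parametric model such as COCOMO II, this makes no functional-form assumption: the "model" is the historical data set itself plus a similarity measure.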

  19. Entity-Based Landscape Modelling to Assess the Impacts of Different Incentives Mechanisms on Argan Forest Dynamics

    Directory of Open Access Journals (Sweden)

    Farid El Wahidi

    2015-11-01

Illegal occupation of argan forest parcels by local households is a new phenomenon in south-west Morocco, due primarily to the weakening of traditional common-control systems and to the boom in the argan oil price. The scope of this work is to develop a decision support system based on dynamic spatial modelling that makes it possible to anticipate land tenure dynamics and their impact on forest stand degradation under different policy scenarios. The model simulates the change of land possession by locals and the levels of forest stand degradation. The methodological approach combines a Markov chain analysis (MCA) with stakeholders’ preferences for land tenure. First, parcels’ transition probabilities are computed using the MCA. Second, the acquisition suitability map is derived from a multi-criteria evaluation procedure (AHP) using biophysical and socio-economic data. Finally, uncertainty is introduced into the simulation through probabilistic analysis, to account for socio-economic diversity and non-mechanistic human behavior. The modelling approach was successfully used to compare three scenarios: business as usual (continuation of illegal acquisition), total disengagement of the population, and private/public partnership with incentives for restoring argan parcels. The model yields geographic information about (i) the magnitude of the on-going process; (ii) the potential occurrence of land use conflicts induced by new policies; and (iii) the location of land conservation or degradation hot-spots. The outcomes of the “business as usual” and “total disengagement” models were similar over a 30-year simulation period: in both cases, the proportion of “highly degraded” parcels doubled and the number of “quite degraded” parcels increased by 50%. On the other hand, should the private/public partnership effectively work, about 40% of the parcels could be restored to a sustainable level.
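The Markov chain core of the approach above iterates a transition matrix over the distribution of parcels across degradation states. The matrix and initial distribution below are hypothetical; the study estimates its own transition probabilities from observed parcel histories:

```python
# Hypothetical one-step (annual) transition probabilities between
# degradation states of forest parcels; each row sums to 1.
states = ["good", "quite degraded", "highly degraded"]
P = [
    [0.85, 0.12, 0.03],
    [0.05, 0.80, 0.15],
    [0.00, 0.05, 0.95],
]

def step(dist, P):
    """Advance the parcel-state distribution by one Markov step."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

dist = [0.6, 0.3, 0.1]  # initial share of parcels in each state
for _ in range(30):     # 30-year simulation horizon, one step per year
    dist = step(dist, P)
print(dict(zip(states, dist)))
```

With a near-absorbing "highly degraded" state, the share of degraded parcels grows steadily under business as usual; a restoration scenario would amount to adding upward transition probabilities to the matrix.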

  20. Modeling Guru: Knowledge Base for NASA Modelers

    Science.gov (United States)

    Seablom, M. S.; Wojcik, G. S.; van Aartsen, B. H.

    2009-05-01

Modeling Guru is an on-line knowledge-sharing resource for anyone involved with or interested in NASA's scientific models or High End Computing (HEC) systems. Developed and maintained by NASA's Software Integration and Visualization Office (SIVO) and the NASA Center for Computational Sciences (NCCS), Modeling Guru's combined forums and knowledge base for research and collaboration is becoming a repository for the accumulated expertise of NASA's scientific modeling and HEC communities. All NASA modelers and associates are encouraged to participate and provide knowledge about the models and systems so that other users may benefit from their experience. Modeling Guru is divided into a hierarchy of communities, each with its own set of forums and knowledge base documents. Current modeling communities include those for space science, land and atmospheric dynamics, atmospheric chemistry, and oceanography. In addition, there are communities focused on NCCS systems, HEC tools and libraries, and programming and scripting languages. Anyone may view most of the content on Modeling Guru (available at http://modelingguru.nasa.gov/), but you must log in to post messages and subscribe to community postings. The site offers a full range of "Web 2.0" features, including discussion forums, "wiki" document generation, document uploading, RSS feeds, search tools, blogs, email notification, and "breadcrumb" links. A discussion (a.k.a. forum "thread") is used to post comments, solicit feedback, or ask questions. If marked as a question, SIVO will monitor the thread and normally respond within a day. Discussions can include embedded images, tables, and formatting through the use of the Rich Text Editor. Also, the user can add "Tags" to their thread to facilitate later searches. The "knowledge base" is comprised of documents that are used to capture and share expertise with others. The default "wiki" document lets users edit within the browser so others can easily collaborate on the

  1. Model-reduced gradient-based history matching

    NARCIS (Netherlands)

    Kaleta, M.P.

    2011-01-01

    Since the world's energy demand increases every year, the oil & gas industry makes a continuous effort to improve fossil fuel recovery. Physics-based petroleum reservoir modeling and closed-loop model-based reservoir management concept can play an important role here. In this concept measured data

  2. Stimulating Scientific Reasoning with Drawing-Based Modeling

    Science.gov (United States)

    Heijnes, Dewi; van Joolingen, Wouter; Leenaars, Frank

    2018-01-01

    We investigate the way students' reasoning about evolution can be supported by drawing-based modeling. We modified the drawing-based modeling tool SimSketch to allow for modeling evolutionary processes. In three iterations of development and testing, students in lower secondary education worked on creating an evolutionary model. After each…

  3. The effects of an educational program based on PRECEDE model on depression levels in patients with coronary artery bypass grafting

    Directory of Open Access Journals (Sweden)

    Sayyed Mohammad Mahdi Hazavei

    2012-06-01

BACKGROUND: Depression is among the most important barriers to proper treatment of cardiac patients. It causes failure in accepting their condition, decreases their motivation to follow therapeutic recommendations, and thus negatively affects their functionality and quality of life. The present study aimed to investigate the effects of an educational program based on the Predisposing, Reinforcing, and Enabling Constructs in Educational Diagnosis and Evaluation (PRECEDE) model on depression levels in coronary artery bypass grafting (CABG) surgery patients. METHODS: This was a quasi-experimental study in which 54 post-bypass surgery patients of the Isfahan Cardiovascular Research Center were investigated. The patients were randomly divided into intervention and control groups. The data were collected using two questionnaires: first, the cardiac depression scale was used to measure the degree of depression, followed by a PRECEDE model-based educational questionnaire to identify the role of the educational intervention. The PRECEDE model-based intervention was composed of nine weekly educational sessions (60-90 minutes each). The patients were followed up for two months post-intervention. RESULTS: Following the educational intervention, mean scores of predisposing, enabling, and reinforcing factors, and of self-helping behaviors, significantly increased in the intervention group compared to the control group (P < 0.001). In addition, a significant difference in mean depression scores was observed between the two groups following the educational intervention (P < 0.001). CONCLUSION: The findings of the current study confirm the practicability and effectiveness of PRECEDE model-based educational programs in preventing or decreasing depression in CABG patients. Keywords: Educational Program, PRECEDE Model, Depression, Coronary Artery Bypass Surgery.

  4. Model-Based Requirements Management in Gear Systems Design Based On Graph-Based Design Languages

    Directory of Open Access Journals (Sweden)

    Kevin Holder

    2017-10-01

For several decades, a widespread consensus concerning the enormous importance of an in-depth clarification of the specifications of a product has been observed. A weak clarification of specifications is repeatedly listed as a main cause for the failure of product development projects. Requirements, which can be defined as the purpose, goals, constraints, and criteria associated with a product development project, play a central role in the clarification of specifications. The collection of activities which ensure that requirements are identified, documented, maintained, communicated, and traced throughout the life cycle of a system, product, or service can be referred to as “requirements engineering”. These activities can be supported by a collection and combination of strategies, methods, and tools which are appropriate for the clarification of specifications. Numerous publications describe the strategy and the components of requirements management. Furthermore, recent research investigates its industrial application. Simultaneously, promising developments of graph-based design languages for a holistic digital representation of the product life cycle are presented. Current developments realize graph-based languages by means of the diagrams of the Unified Modelling Language (UML), and allow the automatic generation and evaluation of multiple product variants. The research presented in this paper seeks to combine the advantages of a conscious requirements management process and graph-based design languages. Consequently, the main objective of this paper is the investigation of a model-based integration of requirements in a product development process by means of graph-based design languages. The research method is based on an in-depth analysis of an exemplary industrial product development: a gear system for so-called “Electrical Multiple Units” (EMU). Important requirements were abstracted from a gear system

  5. Constitutive modeling of a nickel base superalloy -with a focus on gas turbine applications

    Energy Technology Data Exchange (ETDEWEB)

    Almroth, Per

    2003-05-01

Gas turbines are used where large amounts of energy are needed, typically as engines in aircraft, ferries, and power plants. From an efficiency point of view it is desirable to increase the service temperature as much as possible. One of the limiting factors is then the maximum allowable metal temperature in the turbine stages, primarily in the blades of the first stage, which are exposed to the highest gas temperatures. Specially designed materials, such as the nickel base superalloy IN792, are used to cope with these severe conditions. In order to be able to design the components for higher temperatures and tighter tolerances, a detailed understanding and computational models of the material behaviour are needed. The models presented in this work have been developed with the objective of being physically well motivated, and with the intention of avoiding excessive numbers of parameters. The influence of the parameters should also be as easy as possible to interpret. The models are to describe the behaviour of IN792 under conditions typically found for a gas turbine blade. Specifically, the high- and intermediate-temperature isothermal modelling of IN792 has been addressed. One main issue when characterising the material and calibrating the models is the use of relevant tests that are representative of component conditions. Therefore isothermal tests designed with an eye on the typical environment of a turbine blade have been planned and performed. Using numerical optimization techniques, the material parameters for the isothermal behaviour of IN792 at 650 °C and 850 °C have been estimated. The good overall calibration results for these specific temperatures, using the presented modelling concept and nonstandard constitutive tests, suggest that the model can describe the behaviour of IN792 in gas turbine hot-part applications.

  6. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2009-01-01

    The purpose of the paper is to obtain insight into and provide practical advice for event-based conceptual modeling. We analyze a set of event concepts and use the results to formulate a conceptual event model that is used to identify guidelines for creation of dynamic process models and static...... information models. We characterize events as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms of information structures. The conceptual event model is used to characterize a variety of event concepts and it is used to illustrate how events can...... be used to integrate dynamic modeling of processes and static modeling of information structures. The results are unique in the sense that no other general event concept has been used to unify a similar broad variety of seemingly incompatible event concepts. The general event concept can be used...

  7. Identification of walking human model using agent-based modelling

    Science.gov (United States)

    Shahabpoor, Erfan; Pavic, Aleksandar; Racic, Vitomir

    2018-03-01

The interaction of walking people with large vibrating structures, such as footbridges and floors, in the vertical direction is an important yet challenging phenomenon to describe mathematically. Several different models have been proposed in the literature to simulate the interaction of stationary people with vibrating structures. However, research on moving (walking) human models, explicitly identified for vibration serviceability assessment of civil structures, is still sparse. In this study, the results of a comprehensive set of FRF-based modal tests were used, in which over a hundred test subjects walked in different group sizes and walking patterns on a test structure. An agent-based model was used to simulate discrete traffic-structure interactions. The occupied-structure modal parameters found in the tests were used to identify the parameters of the walking individual's single-degree-of-freedom (SDOF) mass-spring-damper model using a 'reverse engineering' methodology. The analysis of the results suggested that a normal distribution with an average of μ = 2.85 Hz and a standard deviation of σ = 0.34 Hz can describe the human SDOF model natural frequency. Similarly, a normal distribution with μ = 0.295 and σ = 0.047 can describe the human model damping ratio. Compared to previous studies, the agent-based modelling methodology proposed in this paper offers significant flexibility in simulating multi-pedestrian walking traffic, external forces, and different mechanisms of human-structure and human-environment interaction at the same time.
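The identified distributions can be used directly to draw per-agent SDOF walking-human models. The sketch below assumes a body mass as input and uses the standard SDOF relations k = m·ω² and c = 2·ζ·m·ω; the mass value is an illustrative assumption, not from the study:

```python
import math
import random

def sample_walking_sdof(mass_kg=75.0, rng=random):
    """Draw one walking-human SDOF model using the distributions
    identified in the study: natural frequency ~ N(2.85, 0.34) Hz and
    damping ratio ~ N(0.295, 0.047). The body mass is an assumed input."""
    f_n = rng.gauss(2.85, 0.34)       # natural frequency, Hz
    zeta = rng.gauss(0.295, 0.047)    # damping ratio, dimensionless
    omega = 2 * math.pi * f_n         # angular frequency, rad/s
    k = mass_kg * omega ** 2          # spring stiffness, N/m
    c = 2 * zeta * mass_kg * omega    # damping coefficient, N*s/m
    return {"f_n": f_n, "zeta": zeta, "k": k, "c": c}

random.seed(1)  # reproducible draw
print(sample_walking_sdof())
```

In an agent-based simulation, each pedestrian agent would carry one such independently drawn (k, c) pair, so the crowd's statistical spread emerges from the per-agent sampling rather than from a single averaged model.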

  8. Ground-Based Telescope Parametric Cost Model

    Science.gov (United States)

    Stahl, H. Philip; Rowell, Ginger Holmes

    2004-01-01

    A parametric cost model for ground-based telescopes is developed using multi-variable statistical analysis. The model includes both engineering and performance parameters. While diameter continues to be the dominant cost driver, other significant factors include primary mirror radius of curvature and diffraction-limited wavelength. The model includes an explicit factor for primary mirror segmentation and/or duplication (i.e., multi-telescope phased-array systems). Additionally, single-variable models based on aperture diameter are derived. This analysis indicates that recent mirror technology advances have indeed reduced the historical telescope cost curve.
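    The single-variable form of such a model (cost driven by aperture diameter) can be sketched as a power law; the coefficient and exponent below are invented placeholders, not the fitted values from the paper:

```python
def telescope_cost(diameter_m, coeff=1.0, exponent=2.0):
    """Hypothetical single-variable parametric cost model: cost = coeff * D**exponent.

    The paper derives its coefficients by multi-variable statistical analysis
    of historical telescope data; the defaults here are purely illustrative.
    """
    if diameter_m <= 0:
        raise ValueError("diameter must be positive")
    return coeff * diameter_m ** exponent

# Under an illustrative D^2 law, doubling the aperture quadruples the cost.
ratio = telescope_cost(8.0) / telescope_cost(4.0)
```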

  9. Multiagent-Based Model For ESCM

    Directory of Open Access Journals (Sweden)

    Delia MARINCAS

    2011-01-01

    Full Text Available Web based applications for Supply Chain Management (SCM are now a necessity for every company in order to meet the increasing customer demands, to face the global competition and to make profit. Multiagent-based approach is appropriate for eSCM because it shows many of the characteristics a SCM system should have. For this reason, we have proposed a multiagent-based eSCM model which configures a virtual SC, automates the SC activities: selling, purchasing, manufacturing, planning, inventory, etc. This model will allow a better coordination of the supply chain network and will increase the effectiveness of Web and intelligent technologies employed in eSCM software.

  10. SLS Navigation Model-Based Design Approach

    Science.gov (United States)

    Oliver, T. Emerson; Anzalone, Evan; Geohagan, Kevin; Bernard, Bill; Park, Thomas

    2018-01-01

    The SLS Program chose to implement a Model-based Design and Model-based Requirements approach for managing component design information and system requirements. This approach differs from previous large-scale design efforts at Marshall Space Flight Center where design documentation alone conveyed information required for vehicle design and analysis and where extensive requirements sets were used to scope and constrain the design. The SLS Navigation Team has been responsible for the Program-controlled Design Math Models (DMMs) which describe and represent the performance of the Inertial Navigation System (INS) and the Rate Gyro Assemblies (RGAs) used by Guidance, Navigation, and Controls (GN&C). The SLS Navigation Team is also responsible for the navigation algorithms. The navigation algorithms are delivered for implementation on the flight hardware as a DMM. For the SLS Block 1-B design, the additional GPS Receiver hardware is managed as a DMM at the vehicle design level. This paper provides a discussion of the processes and methods used to engineer, design, and coordinate engineering trades and performance assessments using SLS practices as applied to the GN&C system, with a particular focus on the Navigation components. These include composing system requirements, requirements verification, model development, model verification and validation, and modeling and analysis approaches. The Model-based Design and Requirements approach does not reduce the effort associated with the design process versus previous processes used at Marshall Space Flight Center. Instead, the approach takes advantage of overlap between the requirements development and management process, and the design and analysis process by efficiently combining the control (i.e. the requirement) and the design mechanisms. The design mechanism is the representation of the component behavior and performance in design and analysis tools. 
The focus in the early design process shifts from the development and

  11. A Mesoscale Model-Based Climatography of Nocturnal Boundary-Layer Characteristics over the Complex Terrain of North-Western Utah.

    Science.gov (United States)

    Serafin, Stefano; De Wekker, Stephan F J; Knievel, Jason C

    Nocturnal boundary-layer phenomena in regions of complex topography are extremely diverse and respond to a multiplicity of forcing factors, acting primarily at the mesoscale and microscale. The interaction between different physical processes, e.g., drainage promoted by near-surface cooling and ambient flow over topography in a statically stable environment, may give rise to special flow patterns, uncommon over flat terrain. Here we present a climatography of boundary-layer flows, based on a 2-year archive of simulations from a high-resolution operational mesoscale weather modelling system, 4DWX. The geographical context is Dugway Proving Ground, in north-western Utah, USA, target area of the field campaigns of the MATERHORN (Mountain Terrain Atmospheric Modeling and Observations Program) project. The comparison between model fields and available observations in 2012-2014 shows that the 4DWX model system provides a realistic representation of wind speed and direction in the area, at least in an average sense. Regions displaying strong spatial gradients in the field variables, thought to be responsible for enhanced nocturnal mixing, are typically located in transition areas from mountain sidewalls to adjacent plains. A key dynamical process in this respect is the separation of dynamically accelerated downslope flows from the surface.

  12. Quantitative Analysis of Variability and Uncertainty in Environmental Data and Models. Volume 1. Theory and Methodology Based Upon Bootstrap Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Frey, H. Christopher [North Carolina State University, Raleigh, NC (United States); Rhodes, David S. [North Carolina State University, Raleigh, NC (United States)

    1999-04-30

    This is Volume 1 of a two-volume set of reports describing work conducted at North Carolina State University sponsored by Grant Number DE-FG05-95ER30250 by the U.S. Department of Energy. The title of the project is “Quantitative Analysis of Variability and Uncertainty in Acid Rain Assessments.” The work conducted under sponsorship of this grant pertains primarily to two main topics: (1) development of new methods for quantitative analysis of variability and uncertainty applicable to any type of model; and (2) analysis of variability and uncertainty in the performance, emissions, and cost of electric power plant combustion-based NOx control technologies. These two main topics are reported separately in Volumes 1 and 2.

  13. An acoustical model based monitoring network

    NARCIS (Netherlands)

    Wessels, P.W.; Basten, T.G.H.; Eerden, F.J.M. van der

    2010-01-01

    In this paper the approach for an acoustical model based monitoring network is demonstrated. This network is capable of reconstructing a noise map, based on the combination of measured sound levels and an acoustic model of the area. By pre-calculating the sound attenuation within the network the

  14. Model-Based Security Testing

    Directory of Open Access Journals (Sweden)

    Ina Schieferdecker

    2012-02-01

    Full Text Available Security testing aims at validating software system requirements related to security properties like confidentiality, integrity, authentication, authorization, availability, and non-repudiation. Although security testing techniques have been available for many years, there have been few approaches that allow for the specification of test cases at a higher level of abstraction, for enabling guidance on test identification and specification, as well as for automated test generation. Model-based security testing (MBST is a relatively new field and especially dedicated to the systematic and efficient specification and documentation of security test objectives, security test cases and test suites, as well as to their automated or semi-automated generation. In particular, the combination of security modelling and test generation approaches is still a challenge in research and of high interest for industrial applications. MBST includes e.g. security functional testing, model-based fuzzing, risk- and threat-oriented testing, and the usage of security test patterns. This paper provides a survey on MBST techniques and the related models as well as samples of new methods and tools that are under development in the European ITEA2-project DIAMONDS.

  15. Model-based Abstraction of Data Provenance

    DEFF Research Database (Denmark)

    Probst, Christian W.; Hansen, René Rydhof

    2014-01-01

    Identifying provenance of data provides insights into the origin of data and intermediate results, and has recently gained increased interest due to data-centric applications. In this work we extend a data-centric system view with actors handling the data and policies restricting actions. This extension is based on provenance analysis performed on system models. System models have been introduced to model and analyse spatial and organisational aspects of organisations, to identify, e.g., potential insider threats. Both the models and analyses are naturally modular; models can be combined to bigger models, and the analyses adapt accordingly. Our approach extends provenance both with the origin of data, the actors and processes involved in the handling of data, and policies applied while doing so. The model and corresponding analyses are based on a formal model of spatial and organisational...

  16. Density-Based Clustering with Geographical Background Constraints Using a Semantic Expression Model

    Directory of Open Access Journals (Sweden)

    Qingyun Du

    2016-05-01

    Full Text Available A semantics-based method for density-based clustering with constraints imposed by geographical background knowledge is proposed. In this paper, we apply an ontological approach to the DBSCAN (Density-Based Spatial Clustering of Applications with Noise) algorithm in the form of knowledge representation for constraint clustering. When used in the process of clustering geographic information, semantic reasoning based on a defined ontology and its relationships is primarily intended to overcome the lack of knowledge of the relevant geospatial data. Better constraints on the geographical knowledge yield more reasonable clustering results. This article uses an ontology to describe the four types of semantic constraints for geographical backgrounds: “No Constraints”, “Constraints”, “Cannot-Link Constraints”, and “Must-Link Constraints”. This paper also reports the implementation of a prototype clustering program. Based on the proposed approach, DBSCAN can be applied with both obstacle and non-obstacle constraints as a semi-supervised clustering algorithm and the clustering results are displayed on a digital map.
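    A minimal sketch of combining DBSCAN with cannot-link constraints as a semi-supervised check; this is plain DBSCAN with a post-hoc constraint test, not the ontology-driven machinery described in the paper:

```python
from collections import deque

def dbscan(points, eps, min_pts):
    """Plain DBSCAN: label each point with a cluster id; -1 = noise."""
    def neighbors(i):
        return [j for j, q in enumerate(points)
                if sum((a - b) ** 2 for a, b in zip(points[i], q)) <= eps ** 2]

    labels = [None] * len(points)
    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        nbrs = neighbors(i)
        if len(nbrs) < min_pts:
            labels[i] = -1  # provisionally noise; may become a border point
            continue
        cluster += 1
        labels[i] = cluster
        queue = deque(nbrs)
        while queue:
            j = queue.popleft()
            if labels[j] == -1:          # noise reached from a core: border point
                labels[j] = cluster
            if labels[j] is not None:
                continue
            labels[j] = cluster
            if len(neighbors(j)) >= min_pts:   # j is a core point: expand
                queue.extend(neighbors(j))
    return labels

def violates_cannot_link(labels, cannot_link):
    """Cannot-link check: True if any constrained pair shares a cluster."""
    return any(labels[a] == labels[b] != -1 for a, b in cannot_link)

# Two well-separated groups of 2-D points (illustrative data).
pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
labels = dbscan(pts, eps=1.5, min_pts=2)
```

    A full semi-supervised variant would re-cluster or split clusters until no constraint is violated; the check above is only the validation step.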

  17. A new equilibrium trading model with asymmetric information

    Directory of Open Access Journals (Sweden)

    Lianzhang Bao

    2018-03-01

    Full Text Available Taking arbitrage opportunities into consideration in an incomplete market, dealers will price bonds based on asymmetric information. The dealer with the best offering price wins the bid. The risk premium in a dealer's offering price is primarily determined by the dealer's add-on rate of change to the term structure. To optimize the trading strategy, a new equilibrium trading model is introduced. An optimal sequential estimation scheme for detecting the risk premium due to private information is proposed based on historical prices, and the best bond pricing formula is given with the corresponding optimal trading strategy. Numerical examples are provided to illustrate the economic insights under certain stochastic term structure interest rate models.

  18. Model Validation in Ontology Based Transformations

    Directory of Open Access Journals (Sweden)

    Jesús M. Almendros-Jiménez

    2012-10-01

    Full Text Available Model Driven Engineering (MDE is an emerging approach of software engineering. MDE emphasizes the construction of models from which the implementation should be derived by applying model transformations. The Ontology Definition Meta-model (ODM has been proposed as a profile for UML models of the Web Ontology Language (OWL. In this context, transformations of UML models can be mapped into ODM/OWL transformations. On the other hand, model validation is a crucial task in model transformation. Meta-modeling permits giving a syntactic structure to source and target models. However, semantic requirements have to be imposed on source and target models. A given transformation will be sound when source and target models fulfill the syntactic and semantic requirements. In this paper, we present an approach for model validation in ODM based transformations. Adopting a logic programming based transformational approach we will show how it is possible to transform and validate models. Properties to be validated range from structural and semantic requirements of models (pre and post conditions to properties of the transformation (invariants. The approach has been applied to a well-known example of model transformation: the Entity-Relationship (ER to Relational Model (RM transformation.

  19. Model-Based Knowing: How Do Students Ground Their Understanding About Climate Systems in Agent-Based Computer Models?

    Science.gov (United States)

    Markauskaite, Lina; Kelly, Nick; Jacobson, Michael J.

    2017-12-01

    This paper gives a grounded cognition account of model-based learning of complex scientific knowledge related to socio-scientific issues, such as climate change. It draws on the results from a study of high school students learning about the carbon cycle through computational agent-based models and investigates two questions: First, how do students ground their understanding about the phenomenon when they learn and solve problems with computer models? Second, what are common sources of mistakes in students' reasoning with computer models? Results show that students ground their understanding in computer models in five ways: direct observation, straight abstraction, generalisation, conceptualisation, and extension. Students also incorporate into their reasoning their knowledge and experiences that extend beyond phenomena represented in the models, such as attitudes about unsustainable carbon emission rates, human agency, external events, and the nature of computational models. The most common difficulties of the students relate to seeing the modelled scientific phenomenon and connecting results from the observations with other experiences and understandings about the phenomenon in the outside world. An important contribution of this study is the constructed coding scheme for establishing different ways of grounding, which helps to understand some challenges that students encounter when they learn about complex phenomena with agent-based computer models.

  20. Worldwide Diversity in Funded Pension Plans : Four Role Models on Choice and Participation

    NARCIS (Netherlands)

    Garcia Huitron, Manuel; Ponds, Eduard

    2015-01-01

    This paper provides an in-depth comparison of funded pension savings plans around the world. The large variety in plan designs is a reflection of historical, cultural and institutional diversity. We postulate a new classification of four role models of funded pension plans, primarily based on choice and participation.

  1. Data-based Non-Markovian Model Inference

    Science.gov (United States)

    Ghil, Michael

    2015-04-01

    This talk concentrates on obtaining stable and efficient data-based models for simulation and prediction in the geosciences and life sciences. The proposed model derivation relies on using a multivariate time series of partial observations from a large-dimensional system, and the resulting low-order models are compared with the optimal closures predicted by the non-Markovian Mori-Zwanzig formalism of statistical physics. Multilayer stochastic models (MSMs) are introduced as both a very broad generalization and a time-continuous limit of existing multilevel, regression-based approaches to data-based closure, in particular of empirical model reduction (EMR). We show that the multilayer structure of MSMs can provide a natural Markov approximation to the generalized Langevin equation (GLE) of the Mori-Zwanzig formalism. A simple correlation-based stopping criterion for an EMR-MSM model is derived to assess how well it approximates the GLE solution. Sufficient conditions are given for the nonlinear cross-interactions between the constitutive layers of a given MSM to guarantee the existence of a global random attractor. This existence ensures that no blow-up can occur for a very broad class of MSM applications. The EMR-MSM methodology is first applied to a conceptual, nonlinear, stochastic climate model of coupled slow and fast variables, in which only slow variables are observed. The resulting reduced model with energy-conserving nonlinearities captures the main statistical features of the slow variables, even when there is no formal scale separation and the fast variables are quite energetic. Second, an MSM is shown to successfully reproduce the statistics of a partially observed, generalized Lotka-Volterra model of population dynamics in its chaotic regime. The positivity constraint on the solutions' components replaces here the quadratic-energy-preserving constraint of fluid-flow problems and it successfully prevents blow-up. This work is based on a close...

  2. An Emotional Agent Model Based on Granular Computing

    Directory of Open Access Journals (Sweden)

    Jun Hu

    2012-01-01

    Full Text Available Affective computing has a very important significance for fulfilling intelligent information processing and harmonious communication between human beings and computers. A new model for emotional agents is proposed in this paper to give agents the ability to handle emotions, based on granular computing theory and the traditional BDI agent model. Firstly, a new emotion knowledge base based on granular computing for emotion expression is presented in the model. Secondly, a new emotional reasoning algorithm based on granular computing is proposed. Thirdly, a new emotional agent model based on granular computing is presented. Finally, based on the model, an emotional agent for patient assistance in hospital is realized; experimental results show that it handles simple emotions efficiently.

  3. CEAI: CCM based Email Authorship Identification Model

    DEFF Research Database (Denmark)

    Nizamani, Sarwat; Memon, Nasrullah

    2013-01-01

    In this paper we present a model for email authorship identification (EAI) by employing a Cluster-based Classification (CCM) technique. Traditionally, stylometric features have been successfully employed in various authorship analysis tasks; we extend the traditional feature-set to include some more interesting and effective features for email authorship identification (e.g. the last punctuation mark used in an email, the tendency of an author to use capitalization at the start of an email, or the punctuation after a greeting or farewell). We also included Info Gain feature selection... Experimental results reveal that the proposed CCM-based email authorship identification model, along with the proposed feature set, outperforms the state-of-the-art support vector machine (SVM)-based models, as well as the models proposed by Iqbal et al. [1, 2]. The proposed model attains an accuracy rate of 94% for 10...

  4. Individual-based modeling of fish: Linking to physical models and water quality.

    Energy Technology Data Exchange (ETDEWEB)

    Rose, K.A.

    1997-08-01

    The individual-based modeling approach for simulating fish population and community dynamics is gaining popularity. Individual-based modeling has been used in many other fields, such as forest succession and astronomy. The popularity of the individual-based approach is partly a result of the lack of success of the more aggregate modeling approaches traditionally used for simulating fish population and community dynamics. Also, recent recognition that it is often the atypical individual that survives has fostered interest in the individual-based approach. Two general types of individual-based models are distribution and configuration. Distribution models follow the probability distributions of individual characteristics, such as length and age. Configuration models explicitly simulate each individual; the sum over individuals being the population. DeAngelis et al. (1992) showed that, when distribution and configuration models were formulated from the same common pool of information, both approaches generated similar predictions. The distribution approach was more compact and general, while the configuration approach was more flexible. Simple biological changes, such as making growth rate dependent on the previous day's growth rate, were easy to implement in the configuration version but prevented simple analytical solution of the distribution version.
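    The configuration approach can be sketched directly: each fish is an explicit individual whose growth depends on the previous day's growth, exactly the kind of rule the abstract notes is easy here but awkward in a distribution formulation. All parameter values are invented for illustration:

```python
import random

def simulate_population(n_fish=100, days=30, seed=0):
    """Configuration-style individual-based model: every fish is tracked
    explicitly, and today's growth depends on yesterday's growth
    (an autocorrelated growth rule). Parameters are illustrative."""
    rng = random.Random(seed)
    lengths = [10.0] * n_fish       # initial length [mm], illustrative
    prev_growth = [0.1] * n_fish    # previous day's growth [mm]
    for _ in range(days):
        for i in range(n_fish):
            # Carry over 80% of yesterday's growth plus random variation.
            g = max(0.0, 0.8 * prev_growth[i] + rng.gauss(0.02, 0.01))
            lengths[i] += g
            prev_growth[i] = g
    return lengths

lengths = simulate_population()
mean_len = sum(lengths) / len(lengths)
```

    A distribution model would instead track the probability density of length over time, which has no simple closed form under this autocorrelated rule.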

  5. A general model for membrane-based separation processes

    DEFF Research Database (Denmark)

    Soni, Vipasha; Abildskov, Jens; Jonsson, Gunnar Eigil

    2009-01-01

    ... behaviour will play an important role. In this paper, modelling of membrane-based processes for separation of gas and liquid mixtures is considered. Two general models, one for membrane-based liquid separation processes (with phase change) and another for membrane-based gas separation, are presented. The separation processes covered are: membrane-based gas separation processes, pervaporation and various types of membrane distillation processes. The specific model for each type of membrane-based process is generated from the two general models by applying the specific system descriptions and the corresponding...

  6. A Recipe for implementing the Arrhenius-Shock-Temperature State Sensitive WSD (AWSD) model, with parameters for PBX 9502

    Energy Technology Data Exchange (ETDEWEB)

    Aslam, Tariq Dennis [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-10-03

    A reactive flow model for the tri-amino-tri-nitro-benzene (TATB) based plastic bonded explosive PBX 9502 is presented. This newly devised model is based primarily on the shock temperature of the material, along with local pressure, and accurately models a broader range of detonation and initiation scenarios. The equation of state for the reactants and products, as well as the thermodynamic closure of pressure and temperature equilibration, are carried over from the Wescott-Stewart-Davis (WSD) model [7, 8]. Thus, modifying an existing WSD model in a hydrocode should be rather straightforward.

  7. Anisotropy in wavelet-based phase field models

    KAUST Repository

    Korzec, Maciek; Münch, Andreas; Süli, Endre; Wagner, Barbara

    2016-01-01

    When describing the anisotropic evolution of microstructures in solids using phase-field models, the anisotropy of the crystalline phases is usually introduced into the interfacial energy by directional dependencies of the gradient energy coefficients. We consider an alternative approach based on a wavelet analogue of the Laplace operator that is intrinsically anisotropic and linear. The paper focuses on the classical coupled temperature/Ginzburg--Landau type phase-field model for dendritic growth. For the model based on the wavelet analogue, existence, uniqueness and continuous dependence on initial data are proved for weak solutions. Numerical studies of the wavelet based phase-field model show dendritic growth similar to the results obtained for classical phase-field models.

  9. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education, Volume 6, Issue 3, March 2001, pp. 46-54. Computer Based Modelling and Simulation – Modelling Deterministic Systems. N K Srinivasan. General Article.

  10. Application of model-based and knowledge-based measuring methods as analytical redundancy

    International Nuclear Information System (INIS)

    Hampel, R.; Kaestner, W.; Chaker, N.; Vandreier, B.

    1997-01-01

    The safe operation of nuclear power plants requires the application of modern and intelligent methods of signal processing for normal operation as well as for the management of accident conditions. Such modern and intelligent methods are model-based and knowledge-based ones, being founded on analytical knowledge (mathematical models) as well as experience (fuzzy information). In addition to the existing hardware redundancies, analytical redundancies will be established with the help of these modern methods. These analytical redundancies support the operating staff during decision-making. The design of a hybrid model-based and knowledge-based measuring method will be demonstrated by the example of a fuzzy-supported observer. Within the fuzzy-supported observer a classical linear observer is connected with a fuzzy-supported adaptation of the model matrices of the observer model. This application is realized for the estimation of non-measurable variables such as steam content and mixture level within pressure vessels with water-steam mixture during accidental depressurizations. For this example the existing non-linearities will be classified and the verification of the model will be explained. The advantages of the hybrid method in comparison to the classical model-based measuring methods will be demonstrated by the results of estimation. The consideration of the parameters which have an important influence on the non-linearities requires the inclusion of high-dimensional structures of fuzzy logic within the model-based measuring methods. Therefore methods will be presented which allow the conversion of these high-dimensional structures to two-dimensional structures of fuzzy logic. As an efficient solution of this problem a method based on cascaded fuzzy controllers will be presented. (author). 2 refs, 12 figs, 5 tabs
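    The classical linear observer at the core of such a hybrid scheme can be sketched for a scalar discrete-time system; the fuzzy adaptation of the model matrices is omitted here, and all numbers are illustrative:

```python
def run_observer(a=0.9, b=0.1, L=0.5, steps=50):
    """Discrete-time Luenberger observer for x[k+1] = a*x[k] + b*u[k],
    with full-state measurement y[k] = x[k]. The observer corrects its
    model prediction by the gain L times the output error; in the hybrid
    scheme of the paper the model matrices would additionally be adapted
    by fuzzy rules (not shown)."""
    x, x_hat = 1.0, 0.0   # true state and (deliberately wrong) estimate
    u = 0.2               # constant input, illustrative
    for _ in range(steps):
        y = x                                         # measurement
        x_hat = a * x_hat + b * u + L * (y - x_hat)   # predict + correct
        x = a * x + b * u                             # true plant
    return x, x_hat

x, x_hat = run_observer()
err = abs(x - x_hat)
```

    The estimation error obeys e[k+1] = (a - L) e[k], so with |a - L| < 1 the estimate converges to the true state.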

  11. How can model comparison help improving species distribution models?

    Directory of Open Access Journals (Sweden)

    Emmanuel Stephan Gritti

    Full Text Available Today, more than ever, robust projections of potential species range shifts are needed to anticipate and mitigate the impacts of climate change on biodiversity and ecosystem services. Such projections are so far provided almost exclusively by correlative species distribution models (correlative SDMs). However, concerns regarding the reliability of their predictive power are growing and several authors call for the development of process-based SDMs. Still, each of these methods presents strengths and weaknesses which have to be estimated if they are to be reliably used by decision makers. In this study we compare projections of three different SDMs (STASH, LPJ and PHENOFIT) that lie in the continuum between correlative models and process-based models for the current distribution of three major European tree species, Fagus sylvatica L., Quercus robur L. and Pinus sylvestris L. We compare the consistency of the model simulations using an innovative comparison map profile method, integrating local and multi-scale comparisons. The three models simulate relatively accurately the current distribution of the three species. The process-based model performs almost as well as the correlative model, although parameters of the former are not fitted to the observed species distributions. According to our simulations, species range limits are triggered, at the European scale, by establishment and survival through processes primarily related to phenology and resistance to abiotic stress rather than to growth efficiency. The accuracy of projections of the hybrid and process-based models could however be improved by integrating a more realistic representation of the species' resistance to water stress, for instance, advocating for pursuing efforts to understand and formulate explicitly the impact of climatic conditions and variations on these processes.

  12. A hybrid agent-based approach for modeling microbiological systems.

    Science.gov (United States)

    Guo, Zaiyi; Sloot, Peter M A; Tay, Joc Cing

    2008-11-21

    Models for systems biology commonly adopt Differential Equations or Agent-Based modeling approaches for simulating the processes as a whole. Models based on differential equations presuppose phenomenological intracellular behavioral mechanisms, while models based on Multi-Agent approach often use directly translated, and quantitatively less precise if-then logical rule constructs. We propose an extendible systems model based on a hybrid agent-based approach where biological cells are modeled as individuals (agents) while molecules are represented by quantities. This hybridization in entity representation entails a combined modeling strategy with agent-based behavioral rules and differential equations, thereby balancing the requirements of extendible model granularity with computational tractability. We demonstrate the efficacy of this approach with models of chemotaxis involving an assay of 10^3 cells and 1.2×10^6 molecules. The model produces cell migration patterns that are comparable to laboratory observations.
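    The hybrid representation (cells as discrete agents, molecules as continuous quantities) can be sketched in one dimension; the field shape and movement rule below are illustrative assumptions, not the chemotaxis model from the paper:

```python
import random

def simulate_chemotaxis(n_cells=50, n_sites=100, steps=200, seed=0):
    """Cells are agents taking biased steps up a chemoattractant gradient;
    the attractant itself is represented only as an array of quantities."""
    rng = random.Random(seed)
    # Static attractant field increasing toward the right edge (illustrative).
    field = [i / (n_sites - 1) for i in range(n_sites)]
    cells = [rng.randrange(n_sites) for _ in range(n_cells)]
    for _ in range(steps):
        for k, pos in enumerate(cells):
            left = field[max(pos - 1, 0)]
            right = field[min(pos + 1, n_sites - 1)]
            # Agent rule: move toward the higher concentration,
            # with a small chance of a purely random step.
            step = 1 if right > left else -1
            if rng.random() < 0.1:
                step = rng.choice([-1, 1])
            cells[k] = min(max(pos + step, 0), n_sites - 1)
    return cells

cells = simulate_chemotaxis()
mean_pos = sum(cells) / len(cells)
```

    In a full hybrid model the molecular field would itself evolve via differential equations (diffusion, secretion, degradation) while the cells remain rule-driven agents.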

  13. Cognitive components underpinning the development of model-based learning.

    Science.gov (United States)

    Potter, Tracey C S; Bryce, Nessa V; Hartley, Catherine A

    2017-06-01

    Reinforcement learning theory distinguishes "model-free" learning, which fosters reflexive repetition of previously rewarded actions, from "model-based" learning, which recruits a mental model of the environment to flexibly select goal-directed actions. Whereas model-free learning is evident across development, recruitment of model-based learning appears to increase with age. However, the cognitive processes underlying the development of model-based learning remain poorly characterized. Here, we examined whether age-related differences in cognitive processes underlying the construction and flexible recruitment of mental models predict developmental increases in model-based choice. In a cohort of participants aged 9-25, we examined whether the abilities to infer sequential regularities in the environment ("statistical learning"), maintain information in an active state ("working memory") and integrate distant concepts to solve problems ("fluid reasoning") predicted age-related improvements in model-based choice. We found that age-related improvements in statistical learning performance did not mediate the relationship between age and model-based choice. Ceiling performance on our working memory assay prevented examination of its contribution to model-based learning. However, age-related improvements in fluid reasoning statistically mediated the developmental increase in the recruitment of a model-based strategy. These findings suggest that gradual development of fluid reasoning may be a critical component process underlying the emergence of model-based learning. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
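    The model-free/model-based distinction the abstract builds on can be sketched on a toy environment; the two-state task below is illustrative only, not the developmental assay used in the study:

```python
def model_free_update(q, state, action, reward, alpha=0.5):
    """Model-free (TD-style) update: cache the action's value directly
    from experienced reward, without any model of the environment."""
    old = q.get((state, action), 0.0)
    q[(state, action)] = old + alpha * (reward - old)
    return q

def model_based_value(transitions, rewards, state, action):
    """Model-based evaluation: consult a model of the environment
    (here an explicit transition table) to compute the action's value."""
    next_state = transitions[(state, action)]
    return rewards[next_state]

# Tiny illustrative environment: one choice state, two outcome states.
transitions = {("s0", "left"): "s1", ("s0", "right"): "s2"}
rewards = {"s1": 1.0, "s2": 0.0}

# Learn from one rewarded 'left' choice, then devalue the outcome.
q = model_free_update({}, "s0", "left", 1.0)
rewards["s1"] = 0.0  # outcome devaluation
mb = model_based_value(transitions, rewards, "s0", "left")  # updates instantly
mf = q[("s0", "left")]                                      # still cached
```

    After devaluation, the model-based value reflects the new reward immediately, while the model-free cache still favours the previously rewarded action, which is the behavioural signature used to separate the two strategies.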

  14. An Agent-Based Monetary Production Simulation Model

    DEFF Research Database (Denmark)

    Bruun, Charlotte

    2006-01-01

    An Agent-Based Simulation Model Programmed in Objective Borland Pascal. Program and source code is downloadable.

  15. Information modelling and knowledge bases XXV

    CERN Document Server

    Tokuda, T; Jaakkola, H; Yoshida, N

    2014-01-01

    Because of our ever-increasing use of and reliance on technology and information systems, information modelling and knowledge bases continue to be important topics in those academic communities concerned with data handling and computer science. As the information itself becomes more complex, so do the levels of abstraction and the databases themselves. This book is part of the series Information Modelling and Knowledge Bases, which concentrates on a variety of themes in the important domains of conceptual modeling, design and specification of information systems, and multimedia information modeling.

  16. Evaluating Emulation-based Models of Distributed Computing Systems

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Stephen T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Cyber Initiatives; Gabert, Kasimir G. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Cyber Initiatives; Tarman, Thomas D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Emulytics Initiatives

    2017-08-01

    Emulation-based models of distributed computing systems are collections of virtual machines, virtual networks, and other emulation components configured to stand in for operational systems when performing experimental science, training, analysis of design alternatives, test and evaluation, or idea generation. As with any tool, we should carefully evaluate whether our uses of emulation-based models are appropriate and justified. Otherwise, we run the risk of using a model incorrectly and creating meaningless results. The varied uses of emulation-based models each have their own goals and deserve thoughtful evaluation. In this paper, we enumerate some of these uses and describe approaches that one can take to build an evidence-based case that a use of an emulation-based model is credible. Predictive uses of emulation-based models, where we expect a model to tell us something true about the real world, set the bar especially high, and the principal evaluation method, called validation, is commensurately rigorous. We spend the majority of our time describing and demonstrating the validation of a simple predictive model using a well-established methodology inherited from decades of development in the computational science and engineering community.

  17. Evaluation of pipeline defect's characteristic axial length via model-based parameter estimation in ultrasonic guided wave-based inspection

    International Nuclear Information System (INIS)

    Wang, Xiaojuan; Tse, Peter W; Dordjevich, Alexandar

    2011-01-01

    The reflection signal from a defect in guided wave-based pipeline inspection usually includes sufficient information to detect and define the defect. Previous research has found that the reflection of guided waves from even a complex defect results primarily from interference between reflection components generated at the front and back edges of the defect. The respective contributions of different defect parameters to the overall reflection are reflected in the features of the two primary reflection components. Identifying these components embedded in the reflection signal is therefore useful in characterizing the defect concerned. In this research, we propose a model-based parameter estimation method, aided by the Hilbert–Huang transform, to decompose a reflection signal and thereby characterize a pipeline defect. Once the two primary edge reflection components are decomposed and identified, the distance between the reflection positions, which closely relates to the axial length of the defect, can be determined easily and accurately. Considering the irregular profiles of complex pipeline defects at their two edges, which is often the case in real situations, the average of the varied axial lengths of such a defect along the circumference of the pipeline is used in this paper as the characteristic value of the actual axial length for comparison purposes. Experimental results for artificial defects and real corrosion in sample pipes demonstrate the effectiveness of the proposed method.
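To illustrate the axial-length estimation step only (a sketch, not the authors' algorithm: the Hilbert–Huang decomposition is replaced by directly synthesized envelope components, and the sampling rate, group velocity, pulse width and amplitudes are assumed values):

```python
import math

FS = 1_000_000        # sampling rate [Hz] (assumed)
V_GROUP = 3000.0      # assumed group velocity of the guided mode [m/s]
L_TRUE = 0.015        # axial defect length used to synthesize the data [m]
DELAY = 2 * L_TRUE / V_GROUP   # extra round trip of the back-edge echo [s]

def edge_echo(t, t0, width=4e-6):
    """Gaussian envelope standing in for one demodulated edge reflection."""
    return math.exp(-((t - t0) / width) ** 2)

n = 400
times = [i / FS for i in range(n)]
t_front = 100e-6
signal = [edge_echo(t, t_front) + 0.6 * edge_echo(t, t_front + DELAY)
          for t in times]

# The two local maxima give the arrival times of the edge components.
peaks = [i for i in range(1, n - 1)
         if signal[i] > signal[i - 1] and signal[i] >= signal[i + 1]]
t1, t2 = sorted(times[i] for i in peaks[:2])
# Halve the delay-distance: the back-edge echo crosses the defect twice.
l_estimate = V_GROUP * (t2 - t1) / 2
```

The estimated length recovers the value used to generate the data; in practice the decomposition step (here trivial) is the hard part the paper addresses.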

  18. Model-Based Reasoning in Humans Becomes Automatic with Training.

    Directory of Open Access Journals (Sweden)

    Marcos Economides

    2015-09-01

    Full Text Available Model-based and model-free reinforcement learning (RL) have been suggested as algorithmic realizations of goal-directed and habitual action strategies. Model-based RL is more flexible than model-free RL but requires sophisticated calculations using a learnt model of the world. This has led model-based RL to be identified with slow, deliberative processing, and model-free RL with fast, automatic processing. In support of this distinction, it has recently been shown that model-based reasoning is impaired by placing subjects under cognitive load, a hallmark of non-automaticity. Here, using the same task, we show that cognitive load does not impair model-based reasoning if subjects receive prior training on the task. This finding is replicated across two studies and a variety of analysis methods. Thus, task familiarity permits use of model-based reasoning in parallel with other cognitive demands. The ability to deploy model-based reasoning in an automatic, parallelizable fashion has widespread theoretical implications, particularly for the learning and execution of complex behaviors. It also suggests a range of important failure modes in psychiatric disorders.

  19. Modeling and simulation of complex systems: a framework for efficient agent-based modeling and simulation

    CERN Document Server

    Siegfried, Robert

    2014-01-01

    Robert Siegfried presents a framework for efficient agent-based modeling and simulation of complex systems. He compares different approaches for describing structure and dynamics of agent-based models in detail. Based on this evaluation the author introduces the "General Reference Model for Agent-based Modeling and Simulation" (GRAMS). Furthermore he presents parallel and distributed simulation approaches for execution of agent-based models, from small scale to very large scale. The author shows how agent-based models may be executed by different simulation engines that utilize underlying hardware

  20. New Temperature-based Models for Predicting Global Solar Radiation

    International Nuclear Information System (INIS)

    Hassan, Gasser E.; Youssef, M. Elsayed; Mohamed, Zahraa E.; Ali, Mohamed A.; Hanafy, Ahmed A.

    2016-01-01

    Highlights: • New temperature-based models for estimating solar radiation are investigated. • The models are validated against 20 years of measured global solar radiation data. • The new temperature-based model shows the best performance for coastal sites. • The new temperature-based model is more accurate than the sunshine-based models. • The new model is highly applicable with weather temperature forecast techniques. - Abstract: This study presents new ambient-temperature-based models for estimating global solar radiation as alternatives to the widely used sunshine-based models, owing to the unavailability of sunshine data at all locations around the world. Seventeen new temperature-based models are established, validated and compared with three other models proposed in the literature (the Annandale, Allen and Goodin models) to estimate the monthly average daily global solar radiation on a horizontal surface. These models are developed using a 20-year measured dataset of global solar radiation for the case study location (Lat. 30°51′N and long. 29°34′E), and the general formulae of the newly suggested models are then examined for ten different locations around Egypt. Moreover, local formulae for the models are established and validated for two coastal locations where the general formulae give inaccurate predictions. Commonly used statistical error measures are utilized to evaluate the performance of these models and identify the most accurate model. The obtained results show that the local formula of the most accurate new model provides good predictions for global solar radiation at different locations, especially at coastal sites. Moreover, the local and general formulae of the most accurate temperature-based model also perform better than the two most accurate sunshine-based models from the literature. The quick and accurate estimations of the global solar radiation using this approach can be employed in the design and evaluation of performance for
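For orientation, the general shape of a temperature-based estimate can be sketched with the well-known Hargreaves–Samani form, using the FAO-56 expression for extraterrestrial radiation; the coefficient and temperature inputs below are illustrative, not the seventeen models developed in the paper.

```python
import math

def extraterrestrial_radiation(lat_deg, day_of_year):
    """Daily extraterrestrial radiation Ra [MJ m-2 day-1] (FAO-56 form)."""
    phi = math.radians(lat_deg)
    dr = 1 + 0.033 * math.cos(2 * math.pi * day_of_year / 365)       # sun distance
    delta = 0.409 * math.sin(2 * math.pi * day_of_year / 365 - 1.39) # declination
    omega_s = math.acos(-math.tan(phi) * math.tan(delta))            # sunset angle
    gsc = 0.0820  # solar constant [MJ m-2 min-1]
    return (24 * 60 / math.pi) * gsc * dr * (
        omega_s * math.sin(phi) * math.sin(delta)
        + math.cos(phi) * math.cos(delta) * math.sin(omega_s))

def hargreaves_rs(ra, t_max, t_min, k_rs=0.19):
    """Hargreaves-Samani global radiation estimate from the diurnal
    temperature range; k_rs is empirical (~0.16 interior, ~0.19 coastal)."""
    return k_rs * math.sqrt(t_max - t_min) * ra

ra = extraterrestrial_radiation(30.85, 196)  # roughly the study latitude, mid-July
rs = hargreaves_rs(ra, t_max=32.0, t_min=24.0)
```

Because it needs only daily temperature extremes, a model of this family pairs naturally with the weather temperature forecasts mentioned in the highlights.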

  1. Flow Formulation-based Model for the Curriculum-based Course Timetabling Problem

    DEFF Research Database (Denmark)

    Bagger, Niels-Christian Fink; Kristiansen, Simon; Sørensen, Matias

    2015-01-01

    In this work we present a new mixed integer programming formulation for the curriculum-based course timetabling problem. We show that the model contains an underlying network model by dividing the problem into two models and then connecting the two models back into one model using a maximum flow problem. This decreases the number of integer variables significantly and improves the performance compared to the basic formulation. It also shows competitiveness with other approaches based on mixed integer programming from the literature and improves the currently best known lower bound on one data instance in the benchmark data set from the second international timetabling competition.

  2. Model-based reasoning technology for the power industry

    International Nuclear Information System (INIS)

    Touchton, R.A.; Subramanyan, N.S.; Naser, J.A.

    1991-01-01

    This paper reports on model-based reasoning, which refers to an expert system implementation methodology that uses a model of the system being reasoned about. Model-based representation and reasoning techniques offer many advantages and are highly suitable for domains where the individual components, their interconnection, and their behavior are well known. Technology Applications, Inc. (TAI), under contract to the Electric Power Research Institute (EPRI), investigated the use of model-based reasoning in the power industry, including the nuclear power industry. During this project, a model-based monitoring and diagnostic tool, called ProSys, was developed. Also, an alarm prioritization system was developed as a demonstration prototype.

  3. Linking Adverse Outcome Pathways to Dynamic Energy Budgets: A Conceptual Model

    Energy Technology Data Exchange (ETDEWEB)

    Murphy, Cheryl [Michigan State University, East Lansing; Nisbet, Roger [University of California Santa Barbara; Antczak, Philipp [University of Liverpool, UK; Reyero, Natalia [Army Corps of Engineers, Vicksburg; Gergs, Andre [Gaiac; Lika, Dina [University of Crete; Mathews, Teresa J. [ORNL; Muller, Eric [University of California, Santa Barbara; Nacci, Dianne [U.S. Environmental Protection Agency (EPA); Peace, Angela L. [ORNL; Remien, Chris [University of Idaho; Schulz, Irv [Pacific Northwest National Laboratory (PNNL); Watanabe, Karen [Arizona State University

    2018-02-01

    Ecological risk assessment quantifies the likelihood of undesirable impacts of stressors, primarily at high levels of biological organization. Data used to inform ecological risk assessments come primarily from tests on individual organisms or from suborganismal studies, indicating a disconnect between primary data and protection goals. We know how to relate individual responses to population dynamics using individual-based models, and there are emerging ideas on how to make connections to ecosystem services. However, there is no established methodology to connect effects seen at higher levels of biological organization with suborganismal dynamics, despite progress made in identifying Adverse Outcome Pathways (AOPs) that link molecular initiating events to ecologically relevant key events. This chapter is a product of a working group at the National Institute for Mathematical and Biological Synthesis (NIMBioS) that assessed the feasibility of using dynamic energy budget (DEB) models of individual organisms as a “pivot” connecting suborganismal processes to higher level ecological processes. AOP models quantify explicit molecular, cellular or organ-level processes, but do not offer a route to linking sub-organismal damage to adverse effects on individual growth, reproduction, and survival, which can be propagated to the population level through individual-based models. DEB models describe these processes, but use abstract variables with undetermined connections to suborganismal biology. We propose linking DEB and quantitative AOP models by interpreting AOP key events as measures of damage-inducing processes in a DEB model. Here, we present a conceptual model for linking AOPs to DEB models and review existing modeling tools available for both AOP and DEB.
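A toy version of the proposed pivot (every rate, name and functional form below is hypothetical, for illustration only): an AOP key event is read as a damage-inducing rate feeding a damage state that scales growth in a DEB-like budget.

```python
def simulate(conc, days=100.0, dt=0.1):
    """Euler integration of a toy DEB-with-damage model.

    conc -- stressor concentration driving the molecular initiating event
    Returns (scaled structural length L, scaled damage D) at the end.
    """
    L, D = 0.1, 0.0
    for _ in range(int(days / dt)):
        key_event = 0.05 * conc / (1.0 + conc)           # saturating activation
        D += (key_event - 0.01 * D) * dt                 # damage accrual vs. repair
        growth = 0.02 * max(1.0 - D, 0.0) * (1.0 - L)    # damage scales growth
        L += growth * dt
    return L, D

l_control, _ = simulate(0.0)
l_exposed, d_exposed = simulate(5.0)   # growth is reduced under exposure
```

Growth and reproduction trajectories produced this way are exactly the individual-level outputs that individual-based models can then propagate to populations.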

  4. Structure-Based Turbulence Model

    National Research Council Canada - National Science Library

    Reynolds, W

    2000-01-01

    .... Maire carried out this work as part of his PhD research. During the award period we began to explore ways to simplify the structure-based modeling so that it could be used in repetitive engineering calculations...

  5. Model-Based Motion Tracking of Infants

    DEFF Research Database (Denmark)

    Olsen, Mikkel Damgaard; Herskind, Anna; Nielsen, Jens Bo

    2014-01-01

    Even though motion tracking is a widely used technique to analyze and measure human movements, only a few studies focus on motion tracking of infants. In recent years, a number of studies have emerged focusing on analyzing the motion pattern of infants, using computer vision. Most of these studies...... are based on 2D images, but few are based on 3D information. In this paper, we present a model-based approach for tracking infants in 3D. The study extends a novel study on graph-based motion tracking of infants and we show that the extension improves the tracking results. A 3D model is constructed...

  6. Model-based Prognostics with Concurrent Damage Progression Processes

    Data.gov (United States)

    National Aeronautics and Space Administration — Model-based prognostics approaches rely on physics-based models that describe the behavior of systems and their components. These models must account for the several...

  7. Probabilistic Model-based Background Subtraction

    DEFF Research Database (Denmark)

    Krüger, Volker; Anderson, Jakob; Prehn, Thomas

    2005-01-01

    is the correlation between pixels. In this paper we introduce a model-based background subtraction approach which incorporates prior knowledge of pixel correlations for clearer and better results. Model knowledge is learned from good training video data, and the data is stored for fast access in a hierarchical
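For context on what such a model improves upon, here is an independent-pixel baseline (a running-average background with a fixed threshold, on invented data); the paper's contribution is precisely to add the pixel correlations this baseline ignores.

```python
# Baseline per-pixel background subtraction on a tiny synthetic "video".
# Frames are flat lists of gray values; a bright blob appears in one frame.
W, H = 8, 8
background = [50.0] * (W * H)          # running background estimate
ALPHA, THRESH = 0.05, 20.0             # learning rate, foreground threshold

def subtract(frame, background):
    """Return a foreground mask and absorb background pixels into the model."""
    mask = [abs(p - b) > THRESH for p, b in zip(frame, background)]
    for i, (p, m) in enumerate(zip(frame, mask)):
        if not m:                       # only update where nothing moved
            background[i] += ALPHA * (p - background[i])
    return mask

still = [50.0] * (W * H)
moving = list(still)
for i in (27, 28, 35, 36):             # a 2x2 bright object
    moving[i] = 200.0

quiet_mask = subtract(still, background)     # empty frame: no detections
mask = subtract(moving, background)
detected = [i for i, m in enumerate(mask) if m]
```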

  8. The development and characterization of a primarily mineral calcium phosphate - poly(epsilon-caprolactone) biocomposite

    Science.gov (United States)

    Dunkley, Ian Robert

    Orthopaedic reconstruction often involves the surgical introduction of structural implants that provide rigid fixation, skeletal stabilization, and bone integration. The high stresses incurred by these implanted devices have historically limited material choices to metallic and select polymeric formulations. While mechanical requirements are achieved, these non-degradable materials do not participate actively in the remodeling of the skeleton and present the possibility of long-term failure or rejection. This is particularly relevant in cervical fusion, an orthopaedic procedure to treat damaged, degenerative or diseased intervertebral discs. A significant improvement on the available synthetic bone replacement/regeneration options for implants to treat these conditions in the cervical spine may be achieved with the development of primarily mineral biocomposites composed of a bioactive ceramic matrix reinforced with a biodegradable polymer. Such a biocomposite may be engineered to possess the clinically required mechanical properties of a particular application while maintaining the ability to be remodeled completely by the body. A biocomposite of Si-doped calcium phosphate (Si-CaP) and poly(epsilon-caprolactone) (PCL) was developed as such a synthetic bone material for potential use as a fusion device in the cervical spine. In this thesis, a method is demonstrated by which high-mineral-content Si-CaP/PCL biocomposites with interpenetrating mineral and polymer matrices may be prepared, along with the effects of the various preparation parameters on biocomposite density, porosity and mechanical properties. This new technique for preparing dense, primarily ceramic Si-CaP/PCL biocomposites allowed the incorporation of mineral contents ranging between 45 and 97 vol%. Polymer infiltration, accomplished solely by passive capillary uptake over several days, was found to be capable of fully infiltrating the microporosity

  9. Model-based optimization of biofilm-based systems performing autotrophic nitrogen removal using the comprehensive NDHA model

    DEFF Research Database (Denmark)

    Valverde Pérez, Borja; Ma, Yunjie; Morset, Martin

    Completely autotrophic nitrogen removal (CANR) can be obtained in single-stage biofilm-based bioreactors. However, their environmental footprint is compromised due to elevated N2O emissions. We developed a novel spatially explicit biochemical process model of biofilm-based CANR systems that predicts

  10. Probabilistic reasoning for assembly-based 3D modeling

    KAUST Repository

    Chaudhuri, Siddhartha

    2011-01-01

    Assembly-based modeling is a promising approach to broadening the accessibility of 3D modeling. In assembly-based modeling, new models are assembled from shape components extracted from a database. A key challenge in assembly-based modeling is the identification of relevant components to be presented to the user. In this paper, we introduce a probabilistic reasoning approach to this problem. Given a repository of shapes, our approach learns a probabilistic graphical model that encodes semantic and geometric relationships among shape components. The probabilistic model is used to present components that are semantically and stylistically compatible with the 3D model that is being assembled. Our experiments indicate that the probabilistic model increases the relevance of presented components. © 2011 ACM.

  11. Multi-Domain Modeling Based on Modelica

    Directory of Open Access Journals (Sweden)

    Liu Jun

    2016-01-01

    Full Text Available With the application of simulation technology to large-scale, multi-field problems, multi-domain unified modeling has become an effective way to solve them. This paper introduces several basic methods and advantages of multidisciplinary modeling, and focuses on simulation based on the Modelica language. Modelica/Mworks is a newly developed simulation environment featuring an object-oriented, non-causal language for modeling large, multi-domain systems, which makes models easier to grasp, develop and maintain. This article presents a single-degree-of-freedom mechanical vibration system built in Mworks using Modelica's special connection mechanism. This multi-domain modeling approach is simple and feasible, offers high reusability, is closer to the physical system, and has many other advantages.
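Independent of Modelica, the single-degree-of-freedom vibration system mentioned obeys m·x'' + c·x' + k·x = 0 for the free response; a quick numerical sketch (illustrative parameters, semi-implicit Euler) is:

```python
import math

m, c, k = 1.0, 0.4, 25.0      # mass, damping, stiffness (illustrative values)
x, v = 1.0, 0.0               # initial displacement [m] and velocity [m/s]
dt = 1e-4

for _ in range(int(10.0 / dt)):          # simulate 10 s of free response
    a = -(c * v + k * x) / m             # from m*x'' + c*x' + k*x = 0
    v += a * dt                          # semi-implicit Euler step
    x += v * dt

# Damping ratio: zeta = c / (2*sqrt(k*m)) = 0.04, an underdamped oscillation
# that decays toward zero displacement.
zeta = c / (2 * math.sqrt(k * m))
```

In Modelica the same system would be declared acausally from component equations; the explicit time-stepping above is only to make the dynamics concrete.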

  12. Testing R&D-Based Endogenous Growth Models

    DEFF Research Database (Denmark)

    Kruse-Andersen, Peter Kjær

    2017-01-01

    R&D-based growth models are tested using US data for the period 1953-2014. A general growth model is developed which nests the model varieties of interest. The model implies a cointegrating relationship between multifactor productivity, research intensity, and employment. This relationship...... is estimated using cointegrated VAR models. The results provide evidence against the widely used fully endogenous variety and in favor of the semi-endogenous variety. Forecasts based on the empirical estimates suggest that the slowdown in US productivity growth will continue. Particularly, the annual long...

  13. PV panel model based on datasheet values

    DEFF Research Database (Denmark)

    Sera, Dezso; Teodorescu, Remus; Rodriguez, Pedro

    2007-01-01

    This work presents the construction of a model for a PV panel using the single-diode five-parameter model, based exclusively on data-sheet parameters. The model takes into account the series and parallel (shunt) resistance of the panel. The equivalent circuit and the basic equations of the PV cell....... Based on these equations, a PV panel model, which is able to predict the panel behavior in different temperature and irradiance conditions, is built and tested....
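The single-diode five-parameter equation is implicit in the current, so it must be solved numerically at each voltage; a sketch follows (parameter values are invented, not from any datasheet; the paper's contribution is extracting such parameters from datasheet quantities, which is not shown here).

```python
import math

# Illustrative single-diode parameters (NOT from a real datasheet):
I_PH, I_0 = 5.0, 1e-9       # photocurrent, diode saturation current [A]
R_S, R_SH = 0.2, 300.0      # series and shunt resistance [ohm]
N_IDEAL, N_CELLS = 1.3, 36  # diode ideality factor, cells in series
V_T = 0.025693              # thermal voltage at 25 degC [V]

def panel_current(v):
    """Solve I = Iph - I0*(exp((V+I*Rs)/(n*Ns*Vt)) - 1) - (V+I*Rs)/Rsh
    for I at terminal voltage V, by bisection on the residual."""
    def residual(i):
        v_d = v + i * R_S
        return (I_PH - I_0 * math.expm1(v_d / (N_IDEAL * N_CELLS * V_T))
                - v_d / R_SH - i)
    lo, hi = -1.0, I_PH + 1.0       # residual changes sign on this bracket
    for _ in range(80):
        mid = (lo + hi) / 2
        if residual(mid) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

i_sc = panel_current(0.0)       # short-circuit current, close to I_PH
i_mid = panel_current(20.0)     # partway toward open circuit
```

Sweeping `panel_current` over voltage traces the familiar I-V curve; temperature and irradiance dependence enter by making I_PH and I_0 functions of the operating conditions.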

  14. Springer handbook of model-based science

    CERN Document Server

    Bertolotti, Tommaso

    2017-01-01

    The handbook offers the first comprehensive reference guide to the interdisciplinary field of model-based reasoning. It highlights the role of models as mediators between theory and experimentation, and as educational devices, as well as their relevance in testing hypotheses and explanatory functions. The Springer Handbook merges philosophical, cognitive and epistemological perspectives on models with the more practical needs related to the application of this tool across various disciplines and practices. The result is a unique, reliable source of information that guides readers toward an understanding of different aspects of model-based science, such as the theoretical and cognitive nature of models, as well as their practical and logical aspects. The inferential role of models in hypothetical reasoning, abduction and creativity once they are constructed, adopted, and manipulated for different scientific and technological purposes is also discussed. Written by a group of internationally renowned experts in ...

  15. Image based 3D city modeling : Comparative study

    Directory of Open Access Journals (Sweden)

    S. P. Singh

    2014-06-01

    Full Text Available A 3D city model is a digital representation of the Earth’s surface and its related objects, such as buildings, trees, vegetation, and man-made features belonging to an urban area. The demand for 3D city modeling is increasing rapidly for various engineering and non-engineering applications. Generally, four main image-based approaches are used for virtual 3D city model generation: in the first, researchers use sketch-based modeling; the second is procedural grammar-based modeling; the third is close-range photogrammetry-based modeling; and the fourth is mainly based on computer vision techniques. SketchUp, CityEngine, Photomodeler and Agisoft Photoscan are the main software packages representing these approaches, respectively. These packages take different approaches and methods to image-based 3D city modeling. A literature study shows that, to date, no comprehensive comparative study is available on creating complete 3D city models from images. This paper gives a comparative assessment of these four image-based 3D modeling approaches, based mainly on data acquisition methods, data processing techniques and output 3D model products. For this research work, the study area is the campus of the civil engineering department, Indian Institute of Technology, Roorkee (India). This 3D campus acts as a prototype for a city. The study also explains various governing parameters, factors and work experiences, and gives a brief introduction to, and the strengths and weaknesses of, the four image-based techniques, with comments on what can and cannot be done with each software package. Finally, the study concludes that each package has advantages and limitations, and that the choice of software depends on the user's requirements for the 3D project. For normal visualization projects, SketchUp is a good option. For 3D documentation records, Photomodeler gives good

  16. Model-Based Integration and Interpretation of Data

    DEFF Research Database (Denmark)

    Petersen, Johannes

    2004-01-01

    Data integration and interpretation plays a crucial role in supervisory control. The paper defines a set of generic inference steps for the data integration and interpretation process based on a three-layer model of system representations. The three-layer model is used to clarify the combination...... of constraint and object-centered representations of the work domain throwing new light on the basic principles underlying the data integration and interpretation process of Rasmussen's abstraction hierarchy as well as other model-based approaches combining constraint and object-centered representations. Based...

  17. Modeling the Behaviour of an Advanced Material Based Smart Landing Gear System for Aerospace Vehicles

    International Nuclear Information System (INIS)

    Varughese, Byji; Dayananda, G. N.; Rao, M. Subba

    2008-01-01

    The last two decades have seen a substantial rise in the use of advanced materials such as polymer composites for aerospace structural applications. In more recent years there has been a concerted effort to integrate materials which mimic biological functions (referred to as smart materials) with polymeric composites. Prominent among smart materials are shape memory alloys, which possess both actuating and sensory functions that can be realized simultaneously. The proper characterization and modeling of advanced and smart materials holds the key to the design and development of efficient smart devices/systems. This paper focuses on the material characterization, modeling, and validation of the model in relation to the development of a Shape Memory Alloy (SMA) based smart landing gear (with high energy dissipation features) for a semi-rigid radio-controlled airship (RC-blimp). The Super Elastic (SE) SMA element is configured in such a way that it is forced into a tensile mode of high elastic deformation. The smart landing gear comprises a landing beam, an arch and a super elastic Nickel-Titanium (Ni-Ti) SMA element. The landing gear is primarily made of polymer carbon composites, which possess high specific stiffness and high specific strength compared to conventional materials, and are therefore ideally suited for the design and development of an efficient skid landing gear system with good energy dissipation characteristics. The development of the smart landing gear in relation to a conventional metal landing gear design is also dealt with.

  18. CEAI: CCM-based email authorship identification model

    Directory of Open Access Journals (Sweden)

    Sarwat Nizamani

    2013-11-01

    Full Text Available In this paper we present a model for email authorship identification (EAI) by employing a Cluster-based Classification (CCM) technique. Traditionally, stylometric features have been successfully employed in various authorship analysis tasks; we extend the traditional feature set to include some more interesting and effective features for email authorship identification (e.g., the last punctuation mark used in an email, the tendency of an author to use capitalization at the start of an email, or the punctuation after a greeting or farewell). We also included content features selected using Information Gain. It is observed that the use of such features in the authorship identification process has a positive impact on the accuracy of the authorship identification task. We performed experiments to justify our arguments and compared the results with other baseline models. Experimental results reveal that the proposed CCM-based email authorship identification model, along with the proposed feature set, outperforms the state-of-the-art support vector machine (SVM)-based models, as well as the models proposed by Iqbal et al. (2010, 2013) [1,2]. The proposed model attains an accuracy rate of 94% for 10 authors, 89% for 25 authors, and 81% for 50 authors, respectively, on the Enron dataset, while 89.5% accuracy is achieved on a real email dataset constructed by the authors. The results on the Enron dataset have been achieved on quite a large number of authors compared to the models proposed by Iqbal et al. [1,2].
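The email-specific stylometric features described can be made concrete with a small extractor (a hypothetical helper illustrating three of the listed features, not the paper's code or full feature set):

```python
import re

def email_style_features(text):
    """Extract a few email-specific stylometric features of the kind
    described above (illustrative subset only)."""
    stripped = text.strip()
    # Feature 1: last punctuation mark used in the email.
    last_punct = next((ch for ch in reversed(stripped)
                       if ch in ".!?,;:"), None)
    # Feature 2: tendency to capitalize the start of the email.
    starts_capitalized = bool(stripped) and stripped[0].isupper()
    # Feature 3: punctuation right after a greeting such as "Hi"/"Dear ...".
    greeting = re.match(r"(hi|hello|dear)\b[^,.!\n]*([,.!])",
                        stripped, re.IGNORECASE)
    return {
        "last_punctuation": last_punct,
        "starts_capitalized": starts_capitalized,
        "greeting_punctuation": greeting.group(2) if greeting else None,
    }

features = email_style_features("Hi Bob,\nPlease send the report.\nThanks!")
```

Feature vectors of this kind, concatenated with traditional stylometric and content features, are what the clustering-then-classification step would consume.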

  19. Guidelines for visualizing and annotating rule-based models.

    Science.gov (United States)

    Chylek, Lily A; Hu, Bin; Blinov, Michael L; Emonet, Thierry; Faeder, James R; Goldstein, Byron; Gutenkunst, Ryan N; Haugh, Jason M; Lipniacki, Tomasz; Posner, Richard G; Yang, Jin; Hlavacek, William S

    2011-10-01

    Rule-based modeling provides a means to represent cell signaling systems in a way that captures site-specific details of molecular interactions. For rule-based models to be more widely understood and (re)used, conventions for model visualization and annotation are needed. We have developed the concepts of an extended contact map and a model guide for illustrating and annotating rule-based models. An extended contact map represents the scope of a model by providing an illustration of each molecule, molecular component, direct physical interaction, post-translational modification, and enzyme-substrate relationship considered in a model. A map can also illustrate allosteric effects, structural relationships among molecular components, and compartmental locations of molecules. A model guide associates elements of a contact map with annotation and elements of an underlying model, which may be fully or partially specified. A guide can also serve to document the biological knowledge upon which a model is based. We provide examples of a map and guide for a published rule-based model that characterizes early events in IgE receptor (FcεRI) signaling. We also provide examples of how to visualize a variety of processes that are common in cell signaling systems but not considered in the example model, such as ubiquitination. An extended contact map and an associated guide can document knowledge of a cell signaling system in a form that is visual as well as executable. As a tool for model annotation, a map and guide can communicate the content of a model clearly and with precision, even for large models.

  20. Thermal-based modeling of coupled carbon, water, and energy fluxes using nominal light use efficiencies constrained by leaf chlorophyll observations

    KAUST Repository

    Schull, M. A.; Anderson, M. C.; Houborg, Rasmus; Gitelson, A.; Kustas, W. P.

    2015-01-01

    is nonlinearly related to βn, with variability primarily related to phenological changes during early growth and senescence. Utilizing seasonally varying βn inputs based on an empirical relationship with in situ measured Chl resulted in improvements in carbon

  1. Automated extraction of knowledge for model-based diagnostics

    Science.gov (United States)

    Gonzalez, Avelino J.; Myler, Harley R.; Towhidnejad, Massood; Mckenzie, Frederic D.; Kladke, Robin R.

    1990-01-01

    The concept of accessing computer aided design (CAD) databases and extracting a process model automatically is investigated as a possible source for the generation of knowledge bases for model-based reasoning systems. The resulting system, referred to as automated knowledge generation (AKG), uses an object-oriented programming structure and constraint techniques, as well as an internal database of component descriptions, to generate a frame-based structure that describes the model. The procedure has been designed to be general enough to be easily coupled to CAD systems that feature a database capable of providing label and connectivity data from the drawn system. The AKG system is capable of defining knowledge bases in formats required by various model-based reasoning tools.

  2. Development of a geodatabase and conceptual model of the hydrogeologic units beneath air force plant 4 and Naval Air Station-Joint Reserve Base Carswell Field, Fort Worth, Texas

    Science.gov (United States)

    Shah, Sachin D.

    2004-01-01

    Air Force Plant 4 and adjacent Naval Air Station-Joint Reserve Base Carswell Field at Fort Worth, Texas, constitute a government-owned, contractor-operated facility that has been in operation since 1942. Contaminants from AFP4, primarily volatile organic compounds and metals, have entered the ground-water-flow system through leakage from waste-disposal sites and from manufacturing processes. The U.S. Geological Survey developed a comprehensive geodatabase of temporal and spatial environmental information associated with the hydrogeologic units (alluvial aquifer, Goodland-Walnut confining unit, and Paluxy aquifer) beneath the facility and a three-dimensional conceptual model of the hydrogeologic units integrally linked to the geodatabase. The geodatabase design uses a thematic layer approach to create layers of feature data using a geographic information system. The various features are separated into relational tables in the geodatabase on the basis of how they interact and correspond to one another. Using the geodatabase, geographic data at the site are manipulated to produce maps, allow interactive queries, and perform spatial analyses. The conceptual model for the study area comprises computer-generated, three-dimensional block diagrams of the hydrogeologic units. The conceptual model provides a platform for visualization of hydrogeologic-unit sections and surfaces and for subsurface environmental analyses. The conceptual model is based on three structural surfaces and two thickness configurations of the study area. The three structural surfaces depict the altitudes of the tops of the three hydrogeologic units. The two thickness configurations are those of the alluvial aquifer and the Goodland-Walnut confining unit. The surface of the alluvial aquifer was created using a U.S. Geological Survey 10-meter digital elevation model. 
The 2,130 point altitudes of the top of the Goodland-Walnut unit were compiled from lithologic logs from existing wells, available soil

  3. Model-based accelerator controls: What, why and how

    International Nuclear Information System (INIS)

    Sidhu, S.S.

    1987-01-01

    Model-based control is defined as a gamut of techniques whose aim is to improve the reliability of an accelerator and enhance the capabilities of the operator, and therefore of the whole control system. The aim of model-based control is seen as gradually moving the function of model-reference from the operator to the computer. The role of the operator in accelerator control and the need for and application of model-based control are briefly summarized.

  4. Néron Models and Base Change

    DEFF Research Database (Denmark)

    Halle, Lars Halvard; Nicaise, Johannes

    Presenting the first systematic treatment of the behavior of Néron models under ramified base change, this book can be read as an introduction to various subtle invariants and constructions related to Néron models of semi-abelian varieties, motivated by concrete research problems and complemented...... on Néron component groups, Edixhoven’s filtration and the base change conductor of Chai and Yu, and we study these invariants using various techniques such as models of curves, sheaves on Grothendieck sites and non-archimedean uniformization. We then apply our results to the study of motivic zeta functions...

  5. Learning of Chemical Equilibrium through Modelling-Based Teaching

    Science.gov (United States)

    Maia, Poliana Flavia; Justi, Rosaria

    2009-01-01

    This paper presents and discusses students' learning process of chemical equilibrium from a modelling-based approach developed from the use of the "Model of Modelling" diagram. The investigation was conducted in a regular classroom (students 14-15 years old) and aimed at discussing how modelling-based teaching can contribute to students…

  6. Model predictive control based on reduced order models applied to belt conveyor system.

    Science.gov (United States)

    Chen, Wei; Li, Xin

    2016-11-01

    In this paper, a model predictive controller based on a reduced-order model is proposed to control a belt conveyor system, a complex electro-mechanical system with a long visco-elastic body. Firstly, in order to design a low-order controller, the balanced truncation method is used for belt conveyor model reduction. Secondly, an MPC algorithm based on the reduced-order model of the belt conveyor system is presented. Because of the error bound between the full-order and reduced-order models, two Kalman state estimators are applied in the control scheme to achieve better system performance. Finally, simulation experiments show that balanced truncation can significantly reduce the model order with high accuracy, and that model predictive control based on the reduced-order model performs well in controlling the belt conveyor system. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
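
    The balanced truncation step this abstract relies on can be sketched generically (this is a textbook Gramian-based reduction, not the authors' implementation): solve the two Lyapunov equations for the Gramians, balance them with an SVD of the Cholesky-factor product, and keep the r states with the largest Hankel singular values.

```python
import numpy as np
from scipy.linalg import cholesky, solve_continuous_lyapunov, svd

def balanced_truncation(A, B, C, r):
    """Reduce dx/dt = Ax + Bu, y = Cx to order r by balanced truncation."""
    # Controllability and observability Gramians:
    #   A P + P A' + B B' = 0,   A' Q + Q A + C' C = 0
    P = solve_continuous_lyapunov(A, -B @ B.T)
    Q = solve_continuous_lyapunov(A.T, -C.T @ C)
    # Balance the realization via an SVD of the Cholesky-factor product.
    Lp = cholesky(P, lower=True)
    Lq = cholesky(Q, lower=True)
    U, hsv, Vt = svd(Lq.T @ Lp)          # hsv: Hankel singular values
    S = np.diag(hsv ** -0.5)
    T = Lp @ Vt.T @ S                    # balancing transformation
    Tinv = S @ U.T @ Lq.T
    Ab, Bb, Cb = Tinv @ A @ T, Tinv @ B, C @ T
    return Ab[:r, :r], Bb[:r], Cb[:, :r], hsv
```

    Keeping the states with the largest Hankel singular values preserves the dominant input-output behavior; the discarded dynamics then appear as model error, which is what motivates adding state estimators in the control scheme.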

  7. Nitric oxide circulates in mammalian plasma primarily as an S-nitroso adduct of serum albumin.

    Science.gov (United States)

    Stamler, J S; Jaraki, O; Osborne, J; Simon, D I; Keaney, J; Vita, J; Singel, D; Valeri, C R; Loscalzo, J

    1992-01-01

    We have recently shown that nitric oxide or authentic endothelium-derived relaxing factor generated in a biologic system reacts in the presence of specific protein thiols to form S-nitrosoprotein derivatives that have endothelium-derived relaxing factor-like properties. The single free cysteine of serum albumin, Cys-34, is particularly reactive toward nitrogen oxides (most likely nitrosonium ion) under physiologic conditions, primarily because of its anomalously low pK; given its abundance in plasma, where it accounts for approximately 0.5 mM thiol, we hypothesized that this plasma protein serves as a reservoir for nitric oxide produced by the endothelial cell. To test this hypothesis, we developed a methodology, which involves UV photolytic cleavage of the S--NO bond before reaction with ozone for chemiluminescence detection, with which to measure free nitric oxide, S-nitrosothiols, and S-nitrosoproteins in biologic systems. We found that human plasma contains approximately 7 microM S-nitrosothiols, of which 96% are S-nitrosoproteins, 82% of which is accounted for by S-nitroso-serum albumin. By contrast, plasma levels of free nitric oxide are only in the 3-nM range. In rabbits, plasma S-nitrosothiols are present at approximately 1 microM; 60 min after administration of NG-monomethyl-L-arginine at 50 mg/ml, a selective and potent inhibitor of nitric oxide synthetases, S-nitrosothiols decreased by approximately 40% (greater than 95% of which were accounted for by S-nitrosoproteins, and approximately 80% of which was S-nitroso-serum albumin); this decrease was accompanied by a concomitant increase in mean arterial blood pressure of 22%. These data suggest that naturally produced nitric oxide circulates in plasma primarily complexed in S-nitrosothiol species, principal among which is S-nitroso-serum albumin. This abundant, relatively long-lived adduct likely serves as a reservoir with which plasma levels of highly reactive, short-lived free nitric oxide can be

  8. Multivariate statistical modelling based on generalized linear models

    CERN Document Server

    Fahrmeir, Ludwig

    1994-01-01

    This book is concerned with the use of generalized linear models for univariate and multivariate regression analysis. Its emphasis is to provide a detailed introductory survey of the subject based on the analysis of real data drawn from a variety of subjects including the biological sciences, economics, and the social sciences. Where possible, technical details and proofs are deferred to an appendix in order to provide an accessible account for non-experts. Topics covered include: models for multi-categorical responses, model checking, time series and longitudinal data, random effects models, and state-space models. Throughout, the authors have taken great pains to discuss the underlying theoretical ideas in ways that relate well to the data at hand. As a result, numerous researchers whose work relies on the use of these models will find this an invaluable account to have on their desks. "The basic aim of the authors is to bring together and review a large part of recent advances in statistical modelling of m...
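
    As a small illustration of the class of models the book treats (a generic sketch, not code from the book): a Poisson generalized linear model with log link can be fit by iteratively reweighted least squares, the standard algorithm for GLMs.

```python
import numpy as np

rng = np.random.default_rng(0)

def poisson_irls(X, y, n_iter=25):
    """Fit a Poisson GLM with log link by iteratively reweighted least squares."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        mu = np.exp(X @ beta)                 # fitted means under the current beta
        z = X @ beta + (y - mu) / mu          # working response
        W = mu                                # Poisson weights (variance = mean)
        beta = np.linalg.solve(X.T @ (X * W[:, None]), X.T @ (W * z))
    return beta

# Simulated example: recover known coefficients from Poisson counts.
X = np.column_stack([np.ones(300), rng.normal(size=300)])
beta_true = np.array([0.5, 0.8])
y = rng.poisson(np.exp(X @ beta_true))
beta_hat = poisson_irls(X, y)
```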

  9. The Design of Model-Based Training Programs

    Science.gov (United States)

    Polson, Peter; Sherry, Lance; Feary, Michael; Palmer, Everett; Alkin, Marty; McCrobie, Dan; Kelley, Jerry; Rosekind, Mark (Technical Monitor)

    1997-01-01

    This paper proposes a model-based training program for the skills necessary to operate advanced avionics systems that incorporate advanced autopilots and flight management systems. The training model is based on a formalism, the operational procedure model, that represents the mission model, the rules, and the functions of a modern avionics system. This formalism has been defined such that it can be understood and shared by pilots, the avionics software, and design engineers. Each element of the software is defined in terms of its intent (What?), the rationale (Why?), and the resulting behavior (How?). The Advanced Computer Tutoring project at Carnegie Mellon University has developed a type of model-based, computer-aided instructional technology called cognitive tutors. They summarize numerous studies showing that training to a specified level of competence can be achieved in one third the time of conventional classroom instruction. We are developing a similar model-based training program for the skills necessary to operate the avionics. The model underlying the instructional program, which simulates the effects of pilot entries and the behavior of the avionics, is based on the operational procedure model. Pilots are given a series of vertical flightpath management problems. Entries that result in violations, such as failure to make a crossing restriction or violating the speed limits, result in error messages with instruction. At any time, the flightcrew can request suggestions on the appropriate set of actions. A similar and successful training program for basic skills for the FMS on the Boeing 737-300 was developed and evaluated. The results strongly support the claim that the training methodology can be adapted to the cockpit.

  10. Quality prediction modeling for sintered ores based on mechanism models of sintering and extreme learning machine based error compensation

    Science.gov (United States)

    Tiebin, Wu; Yunlian, Liu; Xinjun, Li; Yi, Yu; Bin, Zhang

    2018-06-01

    To address the difficulty of predicting sintered ore quality, a hybrid prediction model is established based on mechanism models of sintering and time-weighted error compensation using the extreme learning machine (ELM). First, mechanism models of the drum index, total iron, and alkalinity are constructed according to the chemical reaction mechanisms and conservation of matter in the sintering process. Because the process is simplified in the mechanism models, these models cannot describe the high nonlinearity of the process, so errors are inevitable. For this reason, a time-weighted ELM-based error compensation model is established. Simulation results verify that the hybrid model has high accuracy and can meet the requirements of industrial applications.
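
    A minimal sketch of the error-compensation idea, with an invented plant and an invented simplified mechanism model (the paper's time weighting and actual sintering models are omitted): a random-feature ELM is fit to the residuals the mechanism model leaves behind, and the hybrid prediction adds the two.

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_fit(X, y, n_hidden=50):
    """Extreme learning machine: random hidden layer, least-squares output weights."""
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                        # random nonlinear features
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # only the output layer is trained
    return W, b, beta

def elm_predict(model, X):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta

# Invented "plant" and a simplified mechanism model with a systematic error:
def mechanism_model(X):
    return 2.0 * X[:, 0]                          # first-principles estimate (simplified)

X = rng.uniform(-1, 1, size=(200, 2))
y_true = 2.0 * X[:, 0] + np.sin(3 * X[:, 1])      # unmodelled nonlinearity
model = elm_fit(X, y_true - mechanism_model(X))   # learn the residual
y_hybrid = mechanism_model(X) + elm_predict(model, X)
```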

  11. Facilitating Change to a Problem-based Model

    DEFF Research Database (Denmark)

    Kolmos, Anette

    2002-01-01

    The paper presents the barriers which arise during the change process from a traditional educational system to a problem-based educational model.

  12. Energy based prediction models for building acoustics

    DEFF Research Database (Denmark)

    Brunskog, Jonas

    2012-01-01

    In order to reach robust and simplified yet accurate prediction models, energy-based principles are commonly used in many fields of acoustics, especially in building acoustics. This includes simple energy flow models, the framework of statistical energy analysis (SEA) as well as more elaborate...... principles as, e.g., wave intensity analysis (WIA). The European standards for building acoustic predictions, the EN 12354 series, are based on energy flow and SEA principles. In the present paper, different energy based prediction models are discussed and critically reviewed. Special attention is placed...... on underlying basic assumptions, such as diffuse fields, high modal overlap, resonant field being dominant, etc., and the consequences of these in terms of limitations in the theory and in the practical use of the models....

  13. Pixel-based meshfree modelling of skeletal muscles.

    Science.gov (United States)

    Chen, Jiun-Shyan; Basava, Ramya Rao; Zhang, Yantao; Csapo, Robert; Malis, Vadim; Sinha, Usha; Hodgson, John; Sinha, Shantanu

    2016-01-01

    This paper introduces the meshfree Reproducing Kernel Particle Method (RKPM) for 3D image-based modeling of skeletal muscles. This approach allows the construction of a simulation model based on pixel data obtained from medical images. The material properties and muscle fiber direction obtained from Diffusion Tensor Imaging (DTI) are input at each pixel point. The reproducing kernel (RK) approximation allows a representation of material heterogeneity with smooth transitions. A multiphase, multichannel, level-set-based segmentation framework is adopted for individual muscle segmentation using Magnetic Resonance Images (MRI) and DTI. The application of the proposed methods for modeling the human lower leg is demonstrated.

  14. Correlation between the model accuracy and model-based SOC estimation

    International Nuclear Information System (INIS)

    Wang, Qianqian; Wang, Jiao; Zhao, Pengju; Kang, Jianqiang; Yan, Few; Du, Changqing

    2017-01-01

    State-of-charge (SOC) estimation is a core technology for battery management systems. Considerable progress has been achieved in the study of SOC estimation algorithms, especially algorithms based on the Kalman filter, to meet the increasing demand of model-based battery management systems. The Kalman filter weakens the influence of white noise and initial error during SOC estimation but cannot eliminate the inherent error of the battery model itself. As such, the accuracy of SOC estimation is directly related to the accuracy of the battery model. Thus far, the quantitative relationship between model accuracy and model-based SOC estimation remains unknown. This study summarizes three equivalent-circuit lithium-ion battery models, namely, the Thevenin, PNGV, and DP models. The model parameters are identified through hybrid pulse power characterization tests. The three models are evaluated, and SOC estimation conducted by the EKF-Ah method under three operating conditions is quantitatively studied. The regression and correlation of the standard deviation and normalized RMSE are studied and compared between the model error and the SOC estimation error. These parameters exhibit a strong linear relationship. Results indicate that the model accuracy affects the SOC estimation accuracy mainly in two ways: dispersion of the frequency distribution of the error and the overall level of the error. On the basis of the relationship between model error and SOC estimation error, our study provides a strategy for selecting a suitable cell model to meet the requirements of SOC precision using the Kalman filter.
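
    A discrete-time step of the first-order Thevenin model mentioned in the abstract can be sketched as follows. The parameter values in the usage below are illustrative, not identified from any real cell, and this simple coulomb-counting form is the model only, not the EKF-Ah estimator itself.

```python
import numpy as np

def thevenin_step(soc, v_rc, i, dt, Q, R0, R1, C1, ocv):
    """One discrete step of a first-order Thevenin cell model.

    i > 0 is discharge current (A), Q is capacity in ampere-seconds,
    ocv maps SOC to open-circuit voltage.
    """
    soc_next = soc - dt * i / Q                   # coulomb counting
    a = np.exp(-dt / (R1 * C1))                   # RC branch decay factor
    v_rc_next = a * v_rc + R1 * (1.0 - a) * i     # polarization voltage
    v_term = ocv(soc_next) - R0 * i - v_rc_next   # terminal voltage
    return soc_next, v_rc_next, v_term
```

    On top of such a model, an EKF uses the mismatch between measured and predicted terminal voltage to correct the SOC estimate; the abstract's point is that the residual error of the model itself sets a floor on that correction.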

  15. 3-D model-based vehicle tracking.

    Science.gov (United States)

    Lou, Jianguang; Tan, Tieniu; Hu, Weiming; Yang, Hao; Maybank, Steven J

    2005-10-01

    This paper aims at tracking vehicles from monocular intensity image sequences and presents an efficient and robust approach to three-dimensional (3-D) model-based vehicle tracking. Under the weak perspective assumption and the ground-plane constraint, the movements of model projection in the two-dimensional image plane can be decomposed into two motions: translation and rotation. They are the results of the corresponding movements of 3-D translation on the ground plane (GP) and rotation around the normal of the GP, which can be determined separately. A new metric based on point-to-line segment distance is proposed to evaluate the similarity between an image region and an instantiation of a 3-D vehicle model under a given pose. Based on this, we provide an efficient pose refinement method to refine the vehicle's pose parameters. An improved EKF is also proposed to track and to predict vehicle motion with a precise kinematics model. Experimental results with both indoor and outdoor data show that the algorithm obtains desirable performance even under severe occlusion and clutter.
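
    The point-to-line-segment distance underlying the proposed metric can be computed by projecting the point onto the segment's supporting line and clamping the projection parameter. This is a generic geometric sketch, not the authors' code:

```python
import numpy as np

def point_to_segment_distance(p, a, b):
    """Euclidean distance from point p to the segment with endpoints a and b."""
    p, a, b = (np.asarray(v, dtype=float) for v in (p, a, b))
    ab = b - a
    t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)  # clamp to segment
    return np.linalg.norm(p - (a + t * ab))   # distance to the clamped projection
```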

  16. Re-Evaluation of Acid-Base Prediction Rules in Patients with Chronic Respiratory Acidosis

    Directory of Open Access Journals (Sweden)

    Tereza Martinu

    2003-01-01

    RATIONALE: The prediction rules for the evaluation of the acid-base status in patients with chronic respiratory acidosis, derived primarily from an experimental canine model, suggest that complete compensation should not occur. This appears to contradict frequent observations of normal or near-normal pH levels in patients with chronic hypercapnia.

  17. Y-Chromosomal Diversity in Europe Is Clinal and Influenced Primarily by Geography, Rather than by Language

    Science.gov (United States)

    Rosser, Zoë H.; Zerjal, Tatiana; Hurles, Matthew E.; Adojaan, Maarja; Alavantic, Dragan; Amorim, António; Amos, William; Armenteros, Manuel; Arroyo, Eduardo; Barbujani, Guido; Beckman, Gunhild; Beckman, Lars; Bertranpetit, Jaume; Bosch, Elena; Bradley, Daniel G.; Brede, Gaute; Cooper, Gillian; Côrte-Real, Helena B. S. M.; de Knijff, Peter; Decorte, Ronny; Dubrova, Yuri E.; Evgrafov, Oleg; Gilissen, Anja; Glisic, Sanja; Gölge, Mukaddes; Hill, Emmeline W.; Jeziorowska, Anna; Kalaydjieva, Luba; Kayser, Manfred; Kivisild, Toomas; Kravchenko, Sergey A.; Krumina, Astrida; Kučinskas, Vaidutis; Lavinha, João; Livshits, Ludmila A.; Malaspina, Patrizia; Maria, Syrrou; McElreavey, Ken; Meitinger, Thomas A.; Mikelsaar, Aavo-Valdur; Mitchell, R. John; Nafa, Khedoudja; Nicholson, Jayne; Nørby, Søren; Pandya, Arpita; Parik, Jüri; Patsalis, Philippos C.; Pereira, Luísa; Peterlin, Borut; Pielberg, Gerli; Prata, Maria João; Previderé, Carlo; Roewer, Lutz; Rootsi, Siiri; Rubinsztein, D. C.; Saillard, Juliette; Santos, Fabrício R.; Stefanescu, Gheorghe; Sykes, Bryan C.; Tolun, Aslihan; Villems, Richard; Tyler-Smith, Chris; Jobling, Mark A.

    2000-01-01

    Clinal patterns of autosomal genetic diversity within Europe have been interpreted in previous studies in terms of a Neolithic demic diffusion model for the spread of agriculture; in contrast, studies using mtDNA have traced many founding lineages to the Paleolithic and have not shown strongly clinal variation. We have used 11 human Y-chromosomal biallelic polymorphisms, defining 10 haplogroups, to analyze a sample of 3,616 Y chromosomes belonging to 47 European and circum-European populations. Patterns of geographic differentiation are highly nonrandom, and, when they are assessed using spatial autocorrelation analysis, they show significant clines for five of six haplogroups analyzed. Clines for two haplogroups, representing 45% of the chromosomes, are continentwide and consistent with the demic diffusion hypothesis. Clines for three other haplogroups each have different foci and are more regionally restricted and are likely to reflect distinct population movements, including one from north of the Black Sea. Principal-components analysis suggests that populations are related primarily on the basis of geography, rather than on the basis of linguistic affinity. This is confirmed in Mantel tests, which show a strong and highly significant partial correlation between genetics and geography but a low, nonsignificant partial correlation between genetics and language. Genetic-barrier analysis also indicates the primacy of geography in the shaping of patterns of variation. These patterns retain a strong signal of expansion from the Near East but also suggest that the demographic history of Europe has been complex and influenced by other major population movements, as well as by linguistic and geographic heterogeneities and the effects of drift. PMID:11078479
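
    A simple Mantel test of the kind used in the study correlates two distance matrices and assesses significance by jointly permuting the rows and columns of one of them. This is a generic sketch; the study's partial Mantel test, which controls for a third matrix (e.g. language given geography), is not shown.

```python
import numpy as np

def mantel(d1, d2, n_perm=999, seed=0):
    """Mantel test: correlation between two symmetric distance matrices,
    with a one-sided permutation p-value."""
    rng = np.random.default_rng(seed)
    iu = np.triu_indices(d1.shape[0], k=1)        # upper-triangle entries only
    r_obs = np.corrcoef(d1[iu], d2[iu])[0, 1]
    exceed = 0
    for _ in range(n_perm):
        p = rng.permutation(d1.shape[0])          # permute rows and columns together
        if np.corrcoef(d1[p][:, p][iu], d2[iu])[0, 1] >= r_obs:
            exceed += 1
    return r_obs, (exceed + 1) / (n_perm + 1)
```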

  18. Sandboxes for Model-Based Inquiry

    Science.gov (United States)

    Brady, Corey; Holbert, Nathan; Soylu, Firat; Novak, Michael; Wilensky, Uri

    2015-04-01

    In this article, we introduce a class of constructionist learning environments that we call Emergent Systems Sandboxes (ESSs), which have served as a centerpiece of our recent work in developing curriculum to support scalable model-based learning in classroom settings. ESSs are a carefully specified form of virtual construction environment that support students in creating, exploring, and sharing computational models of dynamic systems that exhibit emergent phenomena. They provide learners with "entity"-level construction primitives that reflect an underlying scientific model. These primitives can be directly "painted" into a sandbox space, where they can then be combined, arranged, and manipulated to construct complex systems and explore the emergent properties of those systems. We argue that ESSs offer a means of addressing some of the key barriers to adopting rich, constructionist model-based inquiry approaches in science classrooms at scale. Situating the ESS in a large-scale science modeling curriculum we are implementing across the USA, we describe how the unique "entity-level" primitive design of an ESS facilitates knowledge system refinement at both an individual and social level, we describe how it supports flexible modeling practices by providing both continuous and discrete modes of executability, and we illustrate how it offers students a variety of opportunities for validating their qualitative understandings of emergent systems as they develop.

  19. Dynamic modeling method for infrared smoke based on enhanced discrete phase model

    Science.gov (United States)

    Zhang, Zhendong; Yang, Chunling; Zhang, Yan; Zhu, Hongbo

    2018-03-01

    The dynamic modeling of infrared (IR) smoke plays an important role in IR scene simulation systems, and its accuracy directly influences system veracity. However, current IR smoke models cannot provide high veracity, because certain physical characteristics are frequently ignored in fluid simulation, such as treating the discrete phase as a continuous phase and neglecting the spinning of the IR decoy missile body. To address this defect, this paper proposes a dynamic modeling method for IR smoke, based on an enhanced discrete phase model (DPM). A mathematical simulation model based on an enhanced DPM is built and a dynamic computing fluid mesh is generated. The dynamic model of IR smoke is then established using an extended equivalent-blackbody-molecule model. Experiments demonstrate that this model realizes a dynamic method for modeling IR smoke with higher veracity.

  20. Ratio-based vs. model-based methods to correct for urinary creatinine concentrations.

    Science.gov (United States)

    Jain, Ram B

    2016-08-01

    Creatinine-corrected urinary analyte concentration is usually computed as the ratio of the observed level of analyte concentration divided by the observed level of the urinary creatinine concentration (UCR). This ratio-based method is flawed since it implicitly assumes that hydration is the only factor that affects urinary creatinine concentrations. On the contrary, it has been shown in the literature, that age, gender, race/ethnicity, and other factors also affect UCR. Consequently, an optimal method to correct for UCR should correct for hydration as well as other factors like age, gender, and race/ethnicity that affect UCR. Model-based creatinine correction in which observed UCRs are used as an independent variable in regression models has been proposed. This study was conducted to evaluate the performance of ratio-based and model-based creatinine correction methods when the effects of gender, age, and race/ethnicity are evaluated one factor at a time for selected urinary analytes and metabolites. It was observed that ratio-based method leads to statistically significant pairwise differences, for example, between males and females or between non-Hispanic whites (NHW) and non-Hispanic blacks (NHB), more often than the model-based method. However, depending upon the analyte of interest, the reverse is also possible. The estimated ratios of geometric means (GM), for example, male to female or NHW to NHB, were also compared for the two methods. When estimated UCRs were higher for the group (for example, males) in the numerator of this ratio, these ratios were higher for the model-based method, for example, male to female ratio of GMs. When estimated UCR were lower for the group (for example, NHW) in the numerator of this ratio, these ratios were higher for the ratio-based method, for example, NHW to NHB ratio of GMs. Model-based method is the method of choice if all factors that affect UCR are to be accounted for.
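
    The contrast between the two corrections can be sketched on synthetic data (all effect sizes are invented): when the analyte scales with UCR less than proportionally, dividing by UCR leaves a spurious gender difference, while the model-based residual, which regresses the analyte on UCR, removes it.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
female = rng.integers(0, 2, size=n)
# Log UCR depends on gender as well as hydration (invented effect sizes):
log_ucr = 0.2 - 0.3 * female + rng.normal(0, 0.3, n)
# The analyte scales with UCR less than proportionally (slope 0.5, not 1):
log_analyte = 1.0 + 0.5 * log_ucr + rng.normal(0, 0.2, n)

# Ratio-based correction: subtract log UCR outright (divide by UCR).
ratio_corrected = log_analyte - log_ucr

# Model-based correction: regress on log UCR and keep the residual.
X = np.column_stack([np.ones(n), log_ucr])
beta, *_ = np.linalg.lstsq(X, log_analyte, rcond=None)
model_corrected = log_analyte - X @ beta
```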

  1. Agent Based Modeling Applications for Geosciences

    Science.gov (United States)

    Stein, J. S.

    2004-12-01

    Agent-based modeling techniques have successfully been applied to systems in which complex behaviors or outcomes arise from varied interactions between individuals in the system. Each individual interacts with its environment, as well as with other individuals, by following a set of relatively simple rules. Traditionally this "bottom-up" modeling approach has been applied to problems in the fields of economics and sociology, but more recently has been introduced to various disciplines in the geosciences. This technique can help explain the origin of complex processes from a relatively simple set of rules, incorporate large and detailed datasets when they exist, and simulate the effects of extreme events on system-wide behavior. Some of the challenges associated with this modeling method include: significant computational requirements in order to keep track of thousands to millions of agents, methods and strategies of model validation are lacking, as is a formal methodology for evaluating model uncertainty. Challenges specific to the geosciences, include how to define agents that control water, contaminant fluxes, climate forcing and other physical processes and how to link these "geo-agents" into larger agent-based simulations that include social systems such as demographics economics and regulations. Effective management of limited natural resources (such as water, hydrocarbons, or land) requires an understanding of what factors influence the demand for these resources on a regional and temporal scale. Agent-based models can be used to simulate this demand across a variety of sectors under a range of conditions and determine effective and robust management policies and monitoring strategies. The recent focus on the role of biological processes in the geosciences is another example of an area that could benefit from agent-based applications. 
A typical approach to modeling the effect of biological processes in geologic media has been to represent these processes in
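
    A minimal agent-based sketch in the spirit of the resource-demand example above; every rule and parameter here is invented for illustration. Heterogeneous agents withdraw from a shared stock under equal-share rationing, and agents whose demand goes unmet adapt it downward, a simple local rule from which the system-wide trajectory emerges.

```python
import numpy as np

rng = np.random.default_rng(0)

n_agents, steps = 100, 50
resource = 1000.0                                  # shared stock, arbitrary units
demand = rng.uniform(1.0, 3.0, n_agents)           # heterogeneous agent demand
history = []
for _ in range(steps):
    withdrawals = np.minimum(demand, resource / n_agents)  # equal-share rationing
    resource += 100.0 - withdrawals.sum()                  # fixed recharge minus use
    demand[withdrawals < demand] *= 0.95           # unmet demand adapts downward
    history.append(resource)
```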

  2. Self-esteem: models and implications for management

    OpenAIRE

    Becker, Manfred R.

    1993-01-01

    Approved for public release; distribution unlimited. This thesis presents a literature review of self-esteem, primarily as it relates to organizations and management. Based on this literature review, self-esteem is defined as the emotional valuation individuals have of themselves and the degree of certainty of this valuation. Several models of self-esteem are presented. The relationship of coping and avoidance to self-esteem is considered. Coping is presented as being one of the primary so...

  3. Segment-based Eyring-Wilson viscosity model for polymer solutions

    International Nuclear Information System (INIS)

    Sadeghi, Rahmat

    2005-01-01

    A theory-based model is presented for correlating the viscosity of polymer solutions; it combines the segment-based Eyring mixture viscosity model with the segment-based Wilson model for describing deviations from ideality. The model has been applied to several polymer solutions, and the results show that it is reliable both for correlation and for prediction of the viscosity of polymer solutions at different molar masses and temperatures of the polymer.

  4. Modeling Energy and Development : An Evaluation of Models and Concepts

    NARCIS (Netherlands)

    Ruijven, Bas van; Urban, Frauke; Benders, René M.J.; Moll, Henri C.; Sluijs, Jeroen P. van der; Vries, Bert de; Vuuren, Detlef P. van

    2008-01-01

    Most global energy models are developed by institutes from developed countries, focusing primarily on issues that are important in industrialized countries. Evaluation of the results for Asia of the IPCC/SRES models shows that broad concepts of energy and development, the energy ladder and the

  5. Kinetic models of gene expression including non-coding RNAs

    Energy Technology Data Exchange (ETDEWEB)

    Zhdanov, Vladimir P., E-mail: zhdanov@catalysis.r

    2011-03-15

    In cells, genes are transcribed into mRNAs, and the latter are translated into proteins. Due to the feedbacks between these processes, the kinetics of gene expression may be complex even in the simplest genetic networks. The corresponding models have already been reviewed in the literature. A new avenue in this field is related to the recognition that the conventional scenario of gene expression is fully applicable only to prokaryotes whose genomes consist of tightly packed protein-coding sequences. In eukaryotic cells, in contrast, such sequences are relatively rare, and the rest of the genome includes numerous transcript units representing non-coding RNAs (ncRNAs). During the past decade, it has become clear that such RNAs play a crucial role in gene expression and accordingly influence a multitude of cellular processes both in the normal state and during diseases. The numerous biological functions of ncRNAs are based primarily on their abilities to silence genes via pairing with a target mRNA and subsequently preventing its translation or facilitating degradation of the mRNA-ncRNA complex. Many other abilities of ncRNAs have been discovered as well. Our review is focused on the available kinetic models describing the mRNA, ncRNA and protein interplay. In particular, we systematically present the simplest models without kinetic feedbacks, models containing feedbacks and predicting bistability and oscillations in simple genetic networks, and models describing the effect of ncRNAs on complex genetic networks. Mathematically, the presentation is based primarily on temporal mean-field kinetic equations. The stochastic and spatio-temporal effects are also briefly discussed.
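
    The simplest mean-field model of ncRNA-mediated silencing reviewed here couples mRNA (m), ncRNA (s), and protein (p) through a pairing term that removes both RNAs. The sketch below uses illustrative rate constants, not values from any specific model in the review:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative rate constants (not taken from any specific model):
k_m, k_s, k_p = 1.0, 0.8, 5.0     # mRNA, ncRNA, and protein synthesis rates
d_m, d_s, d_p = 0.1, 0.1, 0.05    # degradation rates
k_c = 2.0                         # mRNA-ncRNA pairing (mutual removal) rate

def rhs(t, y):
    m, s, p = y
    return [k_m - d_m * m - k_c * m * s,   # free mRNA
            k_s - d_s * s - k_c * m * s,   # free ncRNA
            k_p * m - d_p * p]             # protein, translated from free mRNA

sol = solve_ivp(rhs, (0.0, 500.0), [0.0, 0.0, 0.0], rtol=1e-8)
m_ss, s_ss, p_ss = sol.y[:, -1]            # steady-state levels
```

    Because pairing removes free mRNA, the steady-state protein level is lower than the value k_p k_m / (d_m d_p) reached without the ncRNA, the basic silencing effect; adding feedback (e.g. the protein regulating ncRNA synthesis) turns this skeleton into the bistable and oscillatory networks the review discusses.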

  6. Translating the foundational model of anatomy into french using knowledge-based and lexical methods

    Directory of Open Access Journals (Sweden)

    Merabti Tayeb

    2011-10-01

    Abstract Background The Foundational Model of Anatomy (FMA) is the reference ontology regarding human anatomy. FMA vocabulary was integrated into the Health Multi Terminological Portal (HMTP) developed by CISMeF based on the CISMeF Information System, which also includes 26 other terminologies and controlled vocabularies, mainly in French. However, FMA is primarily in English. In this context, the translation of FMA English terms into French could also be useful for searching and indexing French anatomy resources. Various studies have investigated automatic methods to assist the translation of medical terminologies or create multilingual medical vocabularies. The goal of this study was to facilitate the translation of FMA vocabulary into French. Methods We compare two types of approaches to translate the FMA terms into French. The first is UMLS-based, relying on the conceptual information of the UMLS metathesaurus. The second method is lexically based, relying on several Natural Language Processing (NLP) tools. Results The UMLS-based approach produced a translation of 3,661 FMA terms into French, whereas the lexical approach produced a translation of 3,129 FMA terms into French. A qualitative evaluation was made on 100 FMA terms translated by each method. For the UMLS-based approach, among the 100 translations, 52% were manually rated as "very good" and only 7% as "bad". For the lexical approach, among the 100 translations, 47% were rated as "very good" and 20% as "bad". Conclusions Overall, both methods produced a low rate of bad translations. The two approaches permitted us to semi-automatically translate 3,776 FMA terms from English into French, which were added to the existing 10,844 French FMA terms in the HMTP (4,436 FMA French terms and 6,408 FMA terms manually translated).

  7. Language acquisition is model-based rather than model-free.

    Science.gov (United States)

    Wang, Felix Hao; Mintz, Toben H

    2016-01-01

    Christiansen & Chater (C&C) propose that learning language is learning to process language. However, we believe that the general-purpose prediction mechanism they propose is insufficient to account for many phenomena in language acquisition. We argue from theoretical considerations and empirical evidence that many acquisition tasks are model-based, and that different acquisition tasks require different, specialized models.

  8. Drawing-Based Procedural Modeling of Chinese Architectures.

    Science.gov (United States)

    Fei Hou; Yue Qi; Hong Qin

    2012-01-01

    This paper presents a novel modeling framework to build 3D models of Chinese architectures from elevation drawings. Our algorithm integrates the capability of automatic drawing recognition with powerful procedural modeling to extract production rules from an elevation drawing. First, different from previous symbol-based floor plan recognition, and based on the novel concept of repetitive pattern trees, small horizontal repetitive regions of the elevation drawing are clustered in a bottom-up manner to form architectural components with maximum repetition, which collectively serve as building blocks for 3D model generation. Second, to discover the global architectural structure and its components' interdependencies, the components are structured into a shape tree in a top-down subdivision manner and recognized hierarchically at each level of the shape tree based on Markov Random Fields (MRFs). Third, shape grammar rules can be derived to construct a 3D semantic model and its possible variations with the help of a 3D component repository. The salient contribution lies in the novel integration of procedural modeling with elevation drawings, with a unique application to Chinese architectures.

  9. Analysis of survival data with dependent censoring copula-based approaches

    CERN Document Server

    Emura, Takeshi

    2018-01-01

    This book introduces readers to copula-based statistical methods for analyzing survival data involving dependent censoring. Primarily focusing on likelihood-based methods performed under copula models, it is the first book solely devoted to the problem of dependent censoring. The book demonstrates the advantages of the copula-based methods in the context of medical research, especially with regard to cancer patients’ survival data. Needless to say, the statistical methods presented here can also be applied to many other branches of science, especially in reliability, where survival analysis plays an important role. The book can be used as a textbook for graduate coursework or a short course aimed at (bio-) statisticians. To deepen readers’ understanding of copula-based approaches, the book provides an accessible introduction to basic survival analysis and explains the mathematical foundations of copula-based survival models.
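The dependence structure the book builds on can be illustrated in a few lines: under a Clayton copula, the joint survival probability of an event time and a censoring time is an explicit function of the two marginal survival probabilities. The exponential margins, rates, and copula parameter below are illustrative assumptions, not taken from the book.

```python
import numpy as np

def clayton_joint_survival(s1, s2, theta):
    """Joint survival P(T1 > t1, T2 > t2) under a Clayton copula applied
    to the marginal survival probabilities s1, s2 (theta > 0)."""
    return (s1 ** (-theta) + s2 ** (-theta) - 1.0) ** (-1.0 / theta)

# Exponential margins for event and censoring times (illustrative rates)
t = 2.0
s_event = np.exp(-0.3 * t)   # marginal survival of the event time
s_cens = np.exp(-0.1 * t)    # marginal survival of the censoring time
theta = 2.0                  # Kendall's tau = theta / (theta + 2) = 0.5

joint = clayton_joint_survival(s_event, s_cens, theta)
indep = s_event * s_cens     # what independent censoring would give
```

With positive dependence (theta > 0) the joint survival exceeds the independence product, which is precisely why likelihoods that assume independent censoring are biased in this setting.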

  10. Agent-based Modeling Methodology for Analyzing Weapons Systems

    Science.gov (United States)

    2015-03-26

    technique involves model structure, system representation and the degree of validity, coupled with the simplicity, of the overall model. ABM is best suited ... system representation of the air combat system. We feel that a simulation model that combines ABM with equation-based representation of weapons and ...

  11. Population PK modelling and simulation based on fluoxetine and norfluoxetine concentrations in milk: a milk concentration-based prediction model.

    Science.gov (United States)

    Tanoshima, Reo; Bournissen, Facundo Garcia; Tanigawara, Yusuke; Kristensen, Judith H; Taddio, Anna; Ilett, Kenneth F; Begg, Evan J; Wallach, Izhar; Ito, Shinya

    2014-10-01

    Population pharmacokinetic (pop PK) modelling can be used for PK assessment of drugs in breast milk. However, complex mechanistic modelling of a parent and an active metabolite using both blood and milk samples is challenging. We aimed to develop a simple predictive pop PK model for milk concentration-time profiles of a parent and a metabolite, using data on fluoxetine (FX) and its active metabolite, norfluoxetine (NFX), in milk. Using a previously published data set of drug concentrations in milk from 25 women treated with FX, a pop PK model predictive of milk concentration-time profiles of FX and NFX was developed. Simulation was performed with the model to generate FX and NFX concentration-time profiles in milk of 1000 mothers. This milk concentration-based pop PK model was compared with the previously validated plasma/milk concentration-based pop PK model of FX. Milk FX and NFX concentration-time profiles were described reasonably well by a one-compartment model with a FX-to-NFX conversion coefficient. Median values of the simulated relative infant dose on a weight basis (sRID: weight-adjusted daily doses of FX and NFX through breastmilk to the infant, expressed as a fraction of therapeutic FX daily dose per body weight) were 0.028 for FX and 0.029 for NFX. The FX sRID estimates were consistent with those of the plasma/milk-based pop PK model. A predictive pop PK model based on only milk concentrations can be developed for simultaneous estimation of milk concentration-time profiles of a parent (FX) and an active metabolite (NFX). © 2014 The British Pharmacological Society.
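A one-compartment parent-metabolite structure of the kind described reduces to the classical Bateman equation. The sketch below is a minimal illustration of that structure; the dose, volume, rate constants and conversion fraction are invented placeholders, not the published FX/NFX estimates.

```python
import numpy as np

def milk_profiles(dose, v, k_parent, k_met, f_conv, t):
    """One-compartment parent with first-order elimination; metabolite
    formed via a parent-to-metabolite conversion coefficient follows
    the Bateman equation (illustrative sketch)."""
    c_parent = (dose / v) * np.exp(-k_parent * t)
    c_met = (f_conv * dose / v) * k_parent / (k_met - k_parent) * (
        np.exp(-k_parent * t) - np.exp(-k_met * t))
    return c_parent, c_met

t = np.linspace(0.0, 72.0, 289)   # hours
c_fx, c_nfx = milk_profiles(dose=20.0, v=400.0, k_parent=0.05,
                            k_met=0.02, f_conv=0.5, t=t)
```

The metabolite curve starts at zero and peaks later than the parent, which is the qualitative behaviour the milk-only model has to reproduce.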

  12. Sigma models in the presence of dynamical point-like defects

    International Nuclear Information System (INIS)

    Doikou, Anastasia; Karaiskos, Nikos

    2013-01-01

    Point-like Liouville integrable dynamical defects are introduced in the context of the Landau–Lifshitz and Principal Chiral (Faddeev–Reshetikhin) models. Based primarily on the underlying quadratic algebra we identify the first local integrals of motion, the associated Lax pairs as well as the relevant sewing conditions around the defect point. The involution of the integrals of motion is shown taking into account the sewing conditions.

  13. GPU-accelerated 3-D model-based tracking

    International Nuclear Information System (INIS)

    Brown, J Anthony; Capson, David W

    2010-01-01

    Model-based approaches to tracking the pose of a 3-D object in video are effective but computationally demanding. While statistical estimation techniques, such as the particle filter, are often employed to minimize the search space, real-time performance remains unachievable on current generation CPUs. Recent advances in graphics processing units (GPUs) have brought massively parallel computational power to the desktop environment and powerful developer tools, such as NVIDIA Compute Unified Device Architecture (CUDA), have provided programmers with a mechanism to exploit it. NVIDIA GPUs' single-instruction multiple-thread (SIMT) programming model is well-suited to many computer vision tasks, particularly model-based tracking, which requires several hundred 3-D model poses to be dynamically configured, rendered, and evaluated against each frame in the video sequence. Using 6 degree-of-freedom (DOF) rigid hand tracking as an example application, this work harnesses consumer-grade GPUs to achieve real-time, 3-D model-based, markerless object tracking in monocular video.
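The render-and-score loop at the heart of such trackers can be mimicked in two dimensions: many candidate poses are rendered and evaluated independently against the frame, which is exactly the per-hypothesis parallelism a GPU exploits. The toy point model, noise level, and grid of candidate angles below are assumptions for illustration, not the paper's hand model.

```python
import numpy as np

rng = np.random.default_rng(3)

# A toy 2-D "model": an asymmetric set of points; the "frame" observes
# it rotated by an unknown angle plus measurement noise
model = np.array([[2.0, 0.0], [0.0, 1.0], [-1.0, -1.0], [0.5, -2.0]])
true_angle = 0.6
c, s = np.cos(true_angle), np.sin(true_angle)
frame = model @ np.array([[c, -s], [s, c]]).T + rng.normal(0, 0.01, model.shape)

def score(angle):
    """Render the model at a candidate pose and compare with the frame."""
    c, s = np.cos(angle), np.sin(angle)
    rendered = model @ np.array([[c, -s], [s, c]]).T
    return float(np.sum((rendered - frame) ** 2))

# Several hundred pose hypotheses evaluated independently -- the part a
# GPU would run in parallel, one thread block per hypothesis
candidates = np.linspace(-np.pi, np.pi, 629)
errors = np.array([score(a) for a in candidates])
best = candidates[int(np.argmin(errors))]
```

On a GPU each `score` call would be a kernel launch (or one thread block) rather than a Python loop iteration; the algorithmic structure is unchanged.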

  14. Towards a standard model for research in agent-based modeling and simulation

    Directory of Open Access Journals (Sweden)

    Nuno Fachada

    2015-11-01

    Agent-based modeling (ABM) is a bottom-up modeling approach, where each entity of the system being modeled is uniquely represented as an independent decision-making agent. ABMs are very sensitive to implementation details; thus, it is very easy to inadvertently introduce changes which modify model dynamics. Such problems usually arise due to the lack of transparency in model descriptions, which constrains how models are assessed, implemented and replicated. In this paper, we present PPHPC, a model which aims to serve as a standard in agent-based modeling research, namely, but not limited to, in conceptual model specification, statistical analysis of simulation output, model comparison and parallelization studies. This paper focuses on the first two aspects (conceptual model specification and statistical analysis of simulation output), also providing a canonical implementation of PPHPC. The paper serves as a complete reference to the presented model, and can be used as a tutorial for simulation practitioners who wish to improve the way they communicate their ABMs.

  15. Systemic resilience model

    International Nuclear Information System (INIS)

    Lundberg, Jonas; Johansson, Björn JE

    2015-01-01

    It has been realized that resilience as a concept involves several contradictory definitions, for instance resilience as agile adjustment and resilience as robust resistance to situations. Our analysis of resilience concepts and models suggests that, beyond simplistic definitions, it is possible to draw up a systemic resilience model (SyRes) that maintains these opposing characteristics without contradiction. We outline six functions in a systemic model, drawing primarily on resilience engineering and disaster response: anticipation, monitoring, response, recovery, learning, and self-monitoring. The model consists of four areas: Event-based constraints, Functional Dependencies, Adaptive Capacity and Strategy. The paper describes dependencies between constraints, functions and strategies. We argue that models such as SyRes should be useful both for envisioning new resilience methods and metrics, and for engineering and evaluating resilient systems. - Highlights: • The SyRes model resolves contradictions between previous resilience definitions. • SyRes is a core model for envisioning and evaluating resilience metrics and models. • SyRes describes six functions in a systemic model. • They are anticipation, monitoring, response, recovery, learning, self-monitoring. • The model describes dependencies between constraints, functions and strategies

  16. Model-based Sensor Data Acquisition and Management

    OpenAIRE

    Aggarwal, Charu C.; Sathe, Saket; Papaioannou, Thanasis G.; Jeung, Ho Young; Aberer, Karl

    2012-01-01

    In recent years, due to the proliferation of sensor networks, there has been a genuine need of researching techniques for sensor data acquisition and management. To this end, a large number of techniques have emerged that advocate model-based sensor data acquisition and management. These techniques use mathematical models for performing various, day-to-day tasks involved in managing sensor data. In this chapter, we survey the state-of-the-art techniques for model-based sensor data acquisition...

  17. INDIVIDUAL-BASED MODELS: POWERFUL OR POWER STRUGGLE?

    Science.gov (United States)

    Willem, L; Stijven, S; Hens, N; Vladislavleva, E; Broeckhove, J; Beutels, P

    2015-01-01

    Individual-based models (IBMs) offer endless possibilities to explore various research questions but come with high model complexity and computational burden. Large-scale IBMs have become feasible, but the novel hardware architectures require adapted software. The increased model complexity also requires systematic exploration to gain thorough system understanding. We elaborate on the development of IBMs for vaccine-preventable infectious diseases and on model exploration with active learning. Investment in IBM simulator code can lead to significant runtime reductions. We found large performance differences due to data locality. Sorting the population once reduced simulation time by a factor of two. Storing person attributes separately instead of using person objects also seemed more efficient. Next, we improved model performance by up to 70% by structuring potential contacts based on health status before processing disease transmission. The active learning approach we present is based on iterative surrogate modelling and model-guided experimentation. Symbolic regression is used for nonlinear response surface modelling with automatic feature selection. We illustrate our approach using an IBM for influenza vaccination. After optimizing the parameter space, we observed an inverse relationship between vaccination coverage and the clinical attack rate, reinforced by herd immunity. These insights can be used to focus and optimise research activities, and to reduce both dimensionality and decision uncertainty.
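The iterative surrogate-modelling loop can be caricatured without symbolic regression: fit two surrogates of different flexibility to the runs so far, execute the (here: stand-in) simulator where they disagree most, and refit. The attack-rate function and the polynomial surrogates are assumptions for illustration only, not the authors' influenza IBM.

```python
import numpy as np

def simulator(coverage):
    """Stand-in for an expensive IBM run: clinical attack rate as a
    decreasing function of vaccination coverage (hypothetical)."""
    return 0.4 * np.exp(-3.0 * coverage)

grid = np.linspace(0.0, 1.0, 101)
X = list(np.linspace(0.0, 1.0, 5))      # initial experimental design
y = [simulator(x) for x in X]

for _ in range(4):                      # model-guided experimentation
    lo = np.polyfit(X, y, 2)            # two surrogates of different
    hi = np.polyfit(X, y, 4)            # flexibility ...
    disagreement = np.abs(np.polyval(lo, grid) - np.polyval(hi, grid))
    x_new = grid[int(np.argmax(disagreement))]   # ... vote on next run
    X.append(float(x_new))
    y.append(simulator(x_new))

surrogate = np.polyfit(X, y, 4)
max_err = float(np.max(np.abs(np.polyval(surrogate, grid) - simulator(grid))))
```

Each loop iteration spends one expensive simulation where the cheap models are least certain, which is the economy the active-learning approach is after.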

  18. Hybrid modelling framework by using mathematics-based and information-based methods

    International Nuclear Information System (INIS)

    Ghaboussi, J; Kim, J; Elnashai, A

    2010-01-01

    Mathematics-based computational mechanics involves idealization in going from the observed behaviour of a system to mathematical equations representing the underlying mechanics of that behaviour. Idealization may lead to mathematical models that exclude certain aspects of the complex behaviour that may be significant. An alternative approach is data-centric modelling, which constitutes a fundamental shift from mathematical equations to data that contain the required information about the underlying mechanics. However, purely data-centric methods often fail for infrequent events and large state changes. In this article, a new hybrid modelling framework is proposed to improve accuracy in simulation of real-world systems. In the hybrid framework, a mathematical model is complemented by information-based components. The role of the informational components is to model aspects which the mathematical model leaves out. The missing aspects are extracted and identified through Autoprogressive Algorithms. The proposed hybrid modelling framework has a wide range of potential applications for natural and engineered systems. The potential of the hybrid methodology is illustrated through modelling the highly pinched hysteretic behaviour of beam-to-column connections in steel frames.
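The division of labour described here, a mathematical model plus an informational component for what idealization leaves out, can be sketched as a data-driven residual correction. The cubic "missing physics" and the polynomial residual model below are invented stand-ins, not the Autoprogressive Algorithm itself.

```python
import numpy as np

# "Reality": a softening spring that the idealized model does not capture
x = np.linspace(0.0, 1.0, 200)
observed = 10.0 * x - 4.0 * x ** 3

math_model = 10.0 * x                 # idealized linear mechanics
residual = observed - math_model      # what idealization leaves out

# Informational component: learn the missing behaviour from data
coeff = np.polyfit(x, residual, 3)
hybrid = math_model + np.polyval(coeff, x)

err_math = float(np.max(np.abs(observed - math_model)))
err_hybrid = float(np.max(np.abs(observed - hybrid)))
```

The mathematical part stays interpretable while the data-driven part absorbs only the residual, which is the essence of the hybrid framework.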

  19. Cluster Based Text Classification Model

    DEFF Research Database (Denmark)

    Nizamani, Sarwat; Memon, Nasrullah; Wiil, Uffe Kock

    2011-01-01

    We propose a cluster-based classification model for suspicious email detection and other text classification tasks. Text classification tasks comprise many training examples that require a complex classification model. Using clusters for classification makes the model simpler and increases the accuracy at the same time. The test example is classified using a simpler and smaller model. The training examples in a particular cluster share a common vocabulary. At the time of clustering, we do not take into account the labels of the training examples. After the clusters have been created, the classifier is trained on each cluster, having reduced dimensionality and fewer examples. The experimental results show that the proposed model outperforms the existing classification models for the task of suspicious email detection and topic categorization on the Reuters-21578 and 20 Newsgroups datasets.
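A minimal sketch of the cluster-then-classify idea (not the authors' implementation): cluster the training examples without their labels, then train a small classifier per cluster and route each test example to its nearest cluster. The toy 2-D "documents" and the nearest-class-mean classifier are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy feature vectors: two well-separated vocabularies (clusters),
# each containing examples of two classes
X = np.vstack([rng.normal(loc, 0.3, size=(50, 2))
               for loc in ([0, 0], [0, 1], [5, 0], [5, 1])])
y = np.array([0] * 50 + [1] * 50 + [0] * 50 + [1] * 50)

# Step 1: cluster without looking at labels (tiny Lloyd's algorithm,
# k = 2, deterministic far-apart initialisation)
centroids = X[[0, 100]].copy()
for _ in range(10):
    assign = np.argmin(((X[:, None] - centroids[None]) ** 2).sum(-1), axis=1)
    centroids = np.array([X[assign == k].mean(axis=0) for k in range(2)])

# Step 2: per cluster, a simple nearest-class-mean classifier
class_means = {k: {c: X[(assign == k) & (y == c)].mean(axis=0)
                   for c in (0, 1)} for k in range(2)}

def classify(v):
    k = int(np.argmin(((v - centroids) ** 2).sum(-1)))  # route to cluster
    return min(class_means[k],
               key=lambda c: float(np.sum((v - class_means[k][c]) ** 2)))

accuracy = float(np.mean([classify(v) == c for v, c in zip(X, y)]))
```

Each per-cluster model only ever sees its own cluster's vocabulary, which is what keeps the individual classifiers small.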

  20. Crowdsourcing Based 3d Modeling

    Science.gov (United States)

    Somogyi, A.; Barsi, A.; Molnar, B.; Lovas, T.

    2016-06-01

    Web-based photo albums that support organizing and viewing the users' images are widely used. These services provide a convenient solution for storing, editing and sharing images. In many cases, the users attach geotags to the images in order to enable using them e.g. in location based applications on social networks. Our paper discusses a procedure that collects open access images from a site frequently visited by tourists. Geotagged pictures showing the image of a sight or tourist attraction are selected and processed in photogrammetric processing software that produces the 3D model of the captured object. For the particular investigation we selected three attractions in Budapest. To assess the geometrical accuracy, we used laser scanner and DSLR as well as smart phone photography to derive reference values to enable verifying the spatial model obtained from the web-album images. The investigation shows how detailed and accurate models could be derived applying photogrammetric processing software, simply by using images of the community, without visiting the site.

  1. Evolutionary modeling-based approach for model errors correction

    Directory of Open Access Journals (Sweden)

    S. Q. Wan

    2012-08-01

    The inverse problem of using the information of historical data to estimate model errors is one of the science frontier research topics. In this study, we investigate such a problem using the classic Lorenz (1963) equation as a prediction model and the Lorenz equation with a periodic evolutionary function as an accurate representation of reality to generate "observational data."

    On the basis of the intelligent features of evolutionary modeling (EM), including self-organization, self-adaptation and self-learning, the dynamic information contained in the historical data can be identified and extracted automatically by computer. Thereby, a new approach to estimating model errors based on EM is proposed in the present paper. Numerical tests demonstrate the ability of the new approach to correct model structural errors. In fact, it realizes a combination of statistics and dynamics to a certain extent.
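The setup, a Lorenz-63 prediction model versus a "reality" with an added periodic term, can be reproduced in a few lines. Differencing the observed tendencies against the model's tendencies recovers exactly the error term the evolutionary-modeling step is asked to identify; the forcing amplitude and frequency below are illustrative assumptions.

```python
import numpy as np

def lorenz(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Classic Lorenz-63 right-hand side (the prediction model)."""
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def truth_rhs(state, t, amp=2.0, omega=0.5):
    """'Reality': Lorenz-63 plus a periodic model error on dx/dt."""
    return lorenz(state) + np.array([amp * np.sin(omega * t), 0.0, 0.0])

# Generate "observations" with a small Euler step
dt, n = 0.001, 5000
traj = np.empty((n + 1, 3))
traj[0] = (1.0, 1.0, 1.0)
for i in range(n):
    traj[i + 1] = traj[i] + dt * truth_rhs(traj[i], i * dt)

# Model-error proxy: observed tendency minus the model's tendency
times = np.arange(n) * dt
est_err = np.array([(traj[i + 1] - traj[i]) / dt - lorenz(traj[i])
                    for i in range(n)])
rmse = float(np.sqrt(np.mean((est_err[:, 0] - 2.0 * np.sin(0.5 * times)) ** 2)))
```

In practice the tendencies come from noisy, sparsely sampled data, which is why a learning method such as EM is needed instead of this exact differencing.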

  2. Model-Based Reconstructive Elasticity Imaging Using Ultrasound

    Directory of Open Access Journals (Sweden)

    Salavat R. Aglyamov

    2007-01-01

    Elasticity imaging is a reconstructive imaging technique where tissue motion in response to mechanical excitation is measured using modern imaging systems, and the estimated displacements are then used to reconstruct the spatial distribution of Young's modulus. Here we present an ultrasound elasticity imaging method that utilizes the model-based technique for Young's modulus reconstruction. Based on the geometry of the imaged object, only one axial component of the strain tensor is used. The numerical implementation of the method is highly efficient because the reconstruction is based on an analytic solution of the forward elastic problem. The model-based approach is illustrated using two potential clinical applications: differentiation of liver hemangioma and staging of deep venous thrombosis. Overall, these studies demonstrate that model-based reconstructive elasticity imaging can be used in applications where the geometry of the object and the surrounding tissue is somewhat known and certain assumptions about the pathology can be made.

  3. System Dynamics as Model-Based Theory Building

    OpenAIRE

    Schwaninger, Markus; Grösser, Stefan N.

    2008-01-01

    This paper introduces model-based theory building as a feature of system dynamics (SD) with large potential. It presents a systemic approach to actualizing that potential, thereby opening up a new perspective on theory building in the social sciences. The question addressed is whether and how SD enables the construction of high-quality theories. This contribution is based on field-experiment-type projects which have been focused on model-based theory building, specifically the construction of a mi...

  4. Search-based model identification of smart-structure damage

    Science.gov (United States)

    Glass, B. J.; Macalou, A.

    1991-01-01

    This paper describes the use of a combined model and parameter identification approach, based on modal analysis and artificial intelligence (AI) techniques, for identifying damage or flaws in a rotating truss structure incorporating embedded piezoceramic sensors. This smart structure example is representative of a class of structures commonly found in aerospace systems and next-generation space structures. Artificial intelligence techniques of classification, heuristic search, and an object-oriented knowledge base are used in an AI-based model identification approach. A finite model space is classified into a search tree, over which a variant of best-first search is used to identify the model whose stored response most closely matches that of the input. Newly encountered models can be incorporated into the model space. This adaptiveness demonstrates the potential for learning control. Following this output-error model identification, numerical parameter identification is used to further refine the identified model. Given the rotating truss example in this paper, noisy data corresponding to various damage configurations are input to both this approach and a conventional parameter identification method. The combination of AI-based model identification with parameter identification is shown to lead to smaller parameter corrections than required by the use of parameter identification alone.
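The output-error identification step can be sketched with a flat stand-in for the classified search tree: candidate damage models store a predicted response, and a priority queue expands them in order of output error against the measurement. The model names and modal-frequency signatures below are hypothetical, invented purely for illustration.

```python
import heapq

# Hypothetical finite model space: each candidate damage model stores
# a predicted modal-frequency signature (Hz); values are illustrative.
model_space = {
    "intact":          [12.0, 33.0, 61.0],
    "loose_joint_A":   [11.2, 33.0, 60.1],
    "cracked_strut_B": [12.0, 30.5, 58.0],
    "sensor_fault":    [12.0, 33.0, 40.0],
}

def output_error(predicted, measured):
    """Sum of squared differences between stored and observed response."""
    return sum((p - m) ** 2 for p, m in zip(predicted, measured))

def identify(measured):
    """Best-first expansion: lowest output error is popped first."""
    heap = [(output_error(sig, measured), name)
            for name, sig in model_space.items()]
    heapq.heapify(heap)
    return heapq.heappop(heap)        # (error, best-matching model)

# A noisy measurement resembling the cracked-strut case
err, best = identify([12.1, 30.4, 58.2])
```

In the paper the heap sits over a hierarchical search tree and the winner is handed to numerical parameter identification for refinement; this flat version only shows the output-error ranking.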

  5. Research on Turbofan Engine Model above Idle State Based on NARX Modeling Approach

    Science.gov (United States)

    Yu, Bing; Shu, Wenjun

    2017-03-01

    The nonlinear model of a turbofan engine above idle state based on NARX is studied. First, data sets for the JT9D engine are obtained via simulation from an existing model. Then, a nonlinear modeling scheme based on NARX is proposed and several models with different parameters are built from these data sets. Finally, simulations are performed to verify the accuracy and dynamic performance of the models; the results show that the NARX model reflects the dynamic characteristics of the turbofan engine with high accuracy.
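The NARX idea, regressing the current output on lagged outputs and inputs through nonlinear terms, fits in a short least-squares sketch. The synthetic "engine" recursion below is an invented stand-in for the JT9D data, and the polynomial regressor set is an assumption.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data: output depends nonlinearly on the lagged input
# (think fuel flow) and lagged outputs (think spool speed)
u = rng.uniform(0.0, 1.0, 600)
y = np.zeros(600)
for t in range(2, 600):
    y[t] = 0.5 * y[t - 1] - 0.2 * y[t - 2] + 0.8 * u[t - 1] ** 2 + 0.1

def narx_features(y, u, t):
    """Polynomial NARX regressors: lagged outputs, input, and a square."""
    return [y[t - 1], y[t - 2], u[t - 1], u[t - 1] ** 2, 1.0]

# Identify the model by linear least squares on the regressors
phi = np.array([narx_features(y, u, t) for t in range(2, 500)])
theta, *_ = np.linalg.lstsq(phi, y[2:500], rcond=None)

# One-step-ahead prediction on held-out samples
pred = np.array([narx_features(y, u, t) for t in range(500, 600)]) @ theta
rmse = float(np.sqrt(np.mean((pred - y[500:600]) ** 2)))
```

Because the regressors here match the generating recursion exactly, the fit is essentially perfect; with real engine data the regressor set and lag orders are the modelling choices.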

  6. Adolescent Pornography Use and Dating Violence among a Sample of Primarily Black and Hispanic, Urban-Residing, Underage Youth

    Directory of Open Access Journals (Sweden)

    Emily F. Rothman

    2015-12-01

    This cross-sectional study was designed to characterize the pornography viewing preferences of a sample of U.S.-based, urban-residing, economically disadvantaged, primarily Black and Hispanic youth (n = 72), and to assess whether pornography use was associated with experiences of adolescent dating abuse (ADA) victimization. The sample was recruited from a large, urban, safety net hospital, and participants were 53% female, 59% Black, 19% Hispanic, 14% Other race, 6% White, and 1% Native American. All were 16–17 years old. More than half (51%) had been asked to watch pornography together by a dating or sexual partner, and 44% had been asked to do something sexual that a partner saw in pornography. ADA victimization was associated with more frequent pornography use, viewing pornography in the company of others, being asked to perform a sexual act that a partner first saw in pornography, and watching pornography during or after marijuana use. Approximately 50% of ADA victims and 32% of non-victims reported that they had been asked to do a sexual act that their partner saw in pornography (p = 0.15), and 58% did not feel happy to have been asked. Results suggest that weekly pornography use among underage, urban-residing youth is common, and may be associated with ADA victimization.

  7. Adolescent Pornography Use and Dating Violence among a Sample of Primarily Black and Hispanic, Urban-Residing, Underage Youth

    Science.gov (United States)

    Rothman, Emily F.; Adhia, Avanti

    2015-01-01

    This cross-sectional study was designed to characterize the pornography viewing preferences of a sample of U.S.-based, urban-residing, economically disadvantaged, primarily Black and Hispanic youth (n = 72), and to assess whether pornography use was associated with experiences of adolescent dating abuse (ADA) victimization. The sample was recruited from a large, urban, safety net hospital, and participants were 53% female, 59% Black, 19% Hispanic, 14% Other race, 6% White, and 1% Native American. All were 16–17 years old. More than half (51%) had been asked to watch pornography together by a dating or sexual partner, and 44% had been asked to do something sexual that a partner saw in pornography. Adolescent dating abuse (ADA) victimization was associated with more frequent pornography use, viewing pornography in the company of others, being asked to perform a sexual act that a partner first saw in pornography, and watching pornography during or after marijuana use. Approximately 50% of ADA victims and 32% of non-victims reported that they had been asked to do a sexual act that their partner saw in pornography (p = 0.15), and 58% did not feel happy to have been asked. Results suggest that weekly pornography use among underage, urban-residing youth may be common, and may be associated with ADA victimization. PMID:26703744

  8. Linking agent-based models and stochastic models of financial markets.

    Science.gov (United States)

    Feng, Ling; Li, Baowen; Podobnik, Boris; Preis, Tobias; Stanley, H Eugene

    2012-05-29

    It is well-known that financial asset returns exhibit fat-tailed distributions and long-term memory. These empirical features are the main objectives of modeling efforts using (i) stochastic processes to quantitatively reproduce these features and (ii) agent-based simulations to understand the underlying microscopic interactions. After reviewing selected empirical and theoretical evidence documenting the behavior of traders, we construct an agent-based model to quantitatively demonstrate that "fat" tails in return distributions arise when traders share similar technical trading strategies and decisions. Extending our behavioral model to a stochastic model, we derive and explain a set of quantitative scaling relations of long-term memory from the empirical behavior of individual market participants. Our analysis provides a behavioral interpretation of the long-term memory of absolute and squared price returns: They are directly linked to the way investors evaluate their investments by applying technical strategies at different investment horizons, and this quantitative relationship is in agreement with empirical findings. Our approach provides a possible behavioral explanation for stochastic models for financial systems in general and provides a method to parameterize such models from market data rather than from statistical fitting.

  9. Variability in Dopamine Genes Dissociates Model-Based and Model-Free Reinforcement Learning.

    Science.gov (United States)

    Doll, Bradley B; Bath, Kevin G; Daw, Nathaniel D; Frank, Michael J

    2016-01-27

    Considerable evidence suggests that multiple learning systems can drive behavior. Choice can proceed reflexively from previous actions and their associated outcomes, as captured by "model-free" learning algorithms, or flexibly from prospective consideration of outcomes that might occur, as captured by "model-based" learning algorithms. However, differential contributions of dopamine to these systems are poorly understood. Dopamine is widely thought to support model-free learning by modulating plasticity in striatum. Model-based learning may also be affected by these striatal effects, or by other dopaminergic effects elsewhere, notably on prefrontal working memory function. Indeed, prominent demonstrations linking striatal dopamine to putatively model-free learning did not rule out model-based effects, whereas other studies have reported dopaminergic modulation of verifiably model-based learning, but without distinguishing a prefrontal versus striatal locus. To clarify the relationships between dopamine, neural systems, and learning strategies, we combine a genetic association approach in humans with two well-studied reinforcement learning tasks: one isolating model-based from model-free behavior and the other sensitive to key aspects of striatal plasticity. Prefrontal function was indexed by a polymorphism in the COMT gene, differences of which reflect dopamine levels in the prefrontal cortex. This polymorphism has been associated with differences in prefrontal activity and working memory. Striatal function was indexed by a gene coding for DARPP-32, which is densely expressed in the striatum where it is necessary for synaptic plasticity. We found evidence for our hypothesis that variations in prefrontal dopamine relate to model-based learning, whereas variations in striatal dopamine function relate to model-free learning. 
Decisions can stem reflexively from their previously associated outcomes or flexibly from deliberative consideration of potential choice outcomes.
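The distinction the study leans on can be made concrete on a toy task: a "model-free" learner caches action values directly from sampled experience, while a "model-based" learner plans by iterating over a transition model (here trivially known rather than learned). Both are generic textbook algorithms on an invented three-state chain, not the authors' tasks.

```python
import numpy as np

rng = np.random.default_rng(0)
n_states, n_actions, gamma = 3, 2, 0.9   # tiny chain; state 2 terminal

def step(s, a):
    """Action 1 moves right, action 0 stays; reward on reaching the end."""
    s2 = min(s + 1, n_states - 1) if a == 1 else s
    return s2, 1.0 if (s2 == n_states - 1 and s2 != s) else 0.0

# Model-free: tabular Q-learning from sampled one-step experience
Q = np.zeros((n_states, n_actions))
for _ in range(2000):
    s, a = int(rng.integers(0, n_states - 1)), int(rng.integers(0, n_actions))
    s2, r = step(s, a)
    Q[s, a] += 0.5 * (r + gamma * Q[s2].max() - Q[s, a])

# Model-based: plan by value iteration over the transition model
V = np.zeros(n_states)
for _ in range(50):
    for s in range(n_states - 1):
        V[s] = max(r + gamma * V[s2]
                   for s2, r in (step(s, a) for a in range(n_actions)))

def plan(s):
    return int(np.argmax([r + gamma * V[s2]
                          for s2, r in (step(s, a) for a in range(n_actions))]))

policy_free = [int(a) for a in Q.argmax(axis=1)[:n_states - 1]]
policy_based = [plan(s) for s in range(n_states - 1)]
```

Both reach the same policy here; the behavioural signature the tasks in the paper exploit is that the planner adapts immediately when the model changes, whereas the cached values must be relearned from experience.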

  10. Derivation of Continuum Models from An Agent-based Cancer Model: Optimization and Sensitivity Analysis.

    Science.gov (United States)

    Voulgarelis, Dimitrios; Velayudhan, Ajoy; Smith, Frank

    2017-01-01

    Agent-based models provide a formidable tool for exploring complex and emergent behaviour of biological systems and yield accurate results, but with the drawback of requiring substantial computational power and time for subsequent analysis. Equation-based models, on the other hand, can more easily be used for complex analysis on a much shorter timescale. This paper formulates ordinary differential equation and stochastic differential equation models that capture the behaviour of an existing agent-based model of tumour cell reprogramming, and applies them to optimization of possible treatment as well as dosage sensitivity analysis. For certain values of the parameter space, a close match between the equation-based and agent-based models is achieved. The need for division of labour between the two approaches is explored. Copyright © Bentham Science Publishers.
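The programme of matching an equation-based model to an agent-based one can be miniaturized with logistic growth: a stochastic per-agent division rule against its mean-field ODE integrated on the same clock. The division probability, carrying capacity and run counts are invented, not the tumour model's parameters.

```python
import numpy as np

rng = np.random.default_rng(7)

# Agent-based: each cell divides with probability b*(1 - N/K) per step
b, K, steps, runs = 0.1, 1000.0, 200, 30
trajectories = np.zeros((runs, steps + 1))
for r in range(runs):
    n = 10
    trajectories[r, 0] = n
    for t in range(steps):
        n += rng.binomial(n, max(b * (1.0 - n / K), 0.0))
        trajectories[r, t + 1] = n
abm_mean = trajectories.mean(axis=0)

# Equation-based counterpart: logistic ODE dN/dt = b*N*(1 - N/K),
# integrated with forward Euler at unit step to match the ABM clock
ode = np.zeros(steps + 1)
ode[0] = 10.0
for t in range(steps):
    ode[t + 1] = ode[t] + b * ode[t] * (1.0 - ode[t] / K)

rel_err = float(np.max(np.abs(abm_mean - ode) / K))
```

The mean of many stochastic runs tracks the ODE closely, which is the regime where the cheap equation-based model can stand in for the ABM in optimization and sensitivity analysis.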

  11. Holistic oil field value management: using system dynamics for 'intermediate level' and 'value-based' modelling in the oil industry

    International Nuclear Information System (INIS)

    Corben, D.; Stevenson, R.; Wolstenholme, E.F.

    1999-01-01

    System dynamics has been seen primarily as a strategic tool, most effectively used at the highest level of strategy to identify robust policy interventions under a wide range of scenarios. However, an alternative, complementary and powerful role is emerging. This is at an 'intermediate level' in organisations to coordinate and integrate policies across the value chain. It is at this level where business value, as defined by the discounted value of future free cash flow, is both created and destroyed. This paper introduces the need for 'intermediate-level' and 'value-based' modelling and emphasises the natural role of system dynamics in supporting a methodology to fulfil the need. It describes the development of an approach and its application in the oil industry to coordinate the response of people and tools within operational, financial and commercial functions across the value chain to address a variety of problems and issues. (author)

  12. Simple Models for Model-based Portfolio Load Balancing Controller Synthesis

    DEFF Research Database (Denmark)

    Edlund, Kristian Skjoldborg; Mølbak, Tommy; Bendtsen, Jan Dimon

    2010-01-01

    ... of generation units existing in an electrical power supply network, for instance in model-based predictive control or declarative control schemes. We focus on the effectuators found in the Danish power system. In particular, the paper presents models for boiler load, district heating, condensate throttling ...

  13. Socio-economic vulnerability to natural hazards - proposal for an indicator-based model

    Science.gov (United States)

    Eidsvig, U.; McLean, A.; Vangelsten, B. V.; Kalsnes, B.; Ciurean, R. L.; Argyroudis, S.; Winter, M.; Corominas, J.; Mavrouli, O. C.; Fotopoulou, S.; Pitilakis, K.; Baills, A.; Malet, J. P.

    2012-04-01

    Vulnerability assessment, with respect to natural hazards, is a complex process that must consider multiple dimensions of vulnerability, including both physical and social factors. Physical vulnerability refers to conditions of physical assets, and may be modeled by the intensity and magnitude of the hazard, the degree of physical protection provided by the natural and built environment, and the physical robustness of the exposed elements. Social vulnerability refers to the underlying factors leading to the inability of people, organizations, and societies to withstand impacts from the natural hazards. Social vulnerability models can be used in combination with physical vulnerability models to estimate both direct losses, i.e. losses that occur during and immediately after the impact, as well as indirect losses, i.e. long-term effects of the event. Direct impact of a landslide typically includes casualties and damages to buildings and infrastructure while indirect losses may e.g. include business closures or limitations in public services. The direct losses are often assessed using physical vulnerability indicators (e.g. construction material, height of buildings), while indirect losses are mainly assessed using social indicators (e.g. economical resources, demographic conditions). Within the EC-FP7 SafeLand research project, an indicator-based method was proposed to assess relative socio-economic vulnerability to landslides. The indicators represent the underlying factors which influence a community's ability to prepare for, deal with, and recover from the damage associated with landslides. The proposed model includes indicators representing demographic, economic and social characteristics as well as indicators representing the degree of preparedness and recovery capacity. Although the model focuses primarily on the indirect losses, it could easily be extended to include more physical indicators which account for the direct losses. 
Each indicator is individually
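As a concrete illustration of the indicator-based approach described above, a composite vulnerability score can be formed from normalized, weighted indicators, with preparedness-type indicators inverted so that higher values reduce vulnerability. The indicator names, bounds, and weights below are invented for illustration and are not taken from the SafeLand model.

```python
# Hypothetical indicator-based vulnerability index (a sketch, not SafeLand's).
indicators = {                       # raw values for one community
    "unemployment_rate": 0.12,       # fraction of workforce
    "elderly_fraction": 0.22,        # fraction of population
    "emergency_services_per_10k": 4.0,
}
bounds = {                           # assumed min/max for normalization
    "unemployment_rate": (0.0, 0.3),
    "elderly_fraction": (0.0, 0.4),
    "emergency_services_per_10k": (0.0, 10.0),
}
weights = {"unemployment_rate": 0.4, "elderly_fraction": 0.3,
           "emergency_services_per_10k": 0.3}
invert = {"emergency_services_per_10k"}   # preparedness lowers vulnerability

score = 0.0
for name, value in indicators.items():
    lo, hi = bounds[name]
    x = (value - lo) / (hi - lo)     # min-max normalization to [0, 1]
    if name in invert:
        x = 1.0 - x
    score += weights[name] * x       # weighted linear aggregation
```

A relative score like this only supports ranking communities against each other, which is exactly the "relative socio-economic vulnerability" framing used in the abstract.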

  14. Whole body acid-base modeling revisited.

    Science.gov (United States)

    Ring, Troels; Nielsen, Søren

    2017-04-01

The textbook account of whole body acid-base balance in terms of endogenous acid production, renal net acid excretion, and gastrointestinal alkali absorption, which is the only comprehensive model available, has never been applied in clinical practice or formally validated. To improve understanding of acid-base modeling, we reformulated this conventional model as an expression solely in terms of urine chemistry. Renal net acid excretion and endogenous acid production were already formulated in terms of urine chemistry, and from the literature we could also express gastrointestinal alkali absorption in terms of urine excretions. With a few assumptions it was possible to show that this expression of net acid balance is arithmetically identical to minus urine charge, whereby urine is predicted to acquire a net negative charge as acidosis develops. The literature already mentions unexplained negative urine charges, so we scrutinized a series of seminal papers and confirmed empirically the theoretical prediction that observed urine charge did become negative as acidosis developed. Hence, we conclude that the conventional model is problematic, since it predicts what is physiologically impossible. A new model for whole body acid-base balance is therefore needed, one without such impossible implications, and new experimental studies are needed to account for the charge imbalance in urine during development of acidosis. Copyright © 2017 the American Physiological Society.
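Using standard textbook definitions (a sketch, not the authors' full derivation), the link between net acid excretion and measured urine charge can be made explicit. Here TA is titratable acid and each ion symbol denotes its urinary excretion; the grouping of "rest" ions is a simplifying assumption.

```latex
% net acid excretion (textbook definition)
\mathrm{NAE} \;=\; \mathrm{TA} + \mathrm{NH_4^+} - \mathrm{HCO_3^-}

% urine electroneutrality (major ions; remaining ions collected in "rest")
\mathrm{Na^+} + \mathrm{K^+} + \mathrm{NH_4^+} + \mathrm{rest}^+
  \;=\; \mathrm{Cl^-} + \mathrm{rest}^-

% hence the measured urine charge (urine anion gap) tracks minus ammonium:
\mathrm{UAG} \;\equiv\; \mathrm{Na^+} + \mathrm{K^+} - \mathrm{Cl^-}
  \;\approx\; -\,\mathrm{NH_4^+}
```

As ammonium excretion rises during acidosis, the measured urine charge therefore turns negative, consistent with the empirical observation the abstract describes.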

  15. Developing a Physiologically-Based Pharmacokinetic Model Knowledgebase in Support of Provisional Model Construction

    Science.gov (United States)

    Grulke, Christopher M.; Chang, Daniel T.; Brooks, Raina D.; Leonard, Jeremy A.; Phillips, Martin B.; Hypes, Ethan D.; Fair, Matthew J.; Tornero-Velez, Rogelio; Johnson, Jeffre; Dary, Curtis C.; Tan, Yu-Mei

    2016-01-01

    Developing physiologically-based pharmacokinetic (PBPK) models for chemicals can be resource-intensive, as neither chemical-specific parameters nor in vivo pharmacokinetic data are easily available for model construction. Previously developed, well-parameterized, and thoroughly-vetted models can be a great resource for the construction of models pertaining to new chemicals. A PBPK knowledgebase was compiled and developed from existing PBPK-related articles and used to develop new models. From 2,039 PBPK-related articles published between 1977 and 2013, 307 unique chemicals were identified for use as the basis of our knowledgebase. Keywords related to species, gender, developmental stages, and organs were analyzed from the articles within the PBPK knowledgebase. A correlation matrix of the 307 chemicals in the PBPK knowledgebase was calculated based on pharmacokinetic-relevant molecular descriptors. Chemicals in the PBPK knowledgebase were ranked based on their correlation toward ethylbenzene and gefitinib. Next, multiple chemicals were selected to represent exact matches, close analogues, or non-analogues of the target case study chemicals. Parameters, equations, or experimental data relevant to existing models for these chemicals and their analogues were used to construct new models, and model predictions were compared to observed values. This compiled knowledgebase provides a chemical structure-based approach for identifying PBPK models relevant to other chemical entities. Using suitable correlation metrics, we demonstrated that models of chemical analogues in the PBPK knowledgebase can guide the construction of PBPK models for other chemicals. PMID:26871706
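The analogue-ranking step described above can be sketched as follows. The descriptor values and chemical labels are invented for illustration; the real knowledgebase uses hundreds of chemicals and a richer set of pharmacokinetically relevant molecular descriptors.

```python
import numpy as np

# Rank candidate chemicals by descriptor-space correlation with a target
# chemical (a sketch of the knowledgebase's analogue-identification idea).
names = ["target", "analogue", "dissimilar"]
descriptors = np.array([
    [2.1, 106.2, 0.80, 3.0],   # hypothetical features, e.g. logP, MW, ...
    [2.0, 110.0, 0.75, 2.9],   # close analogue of the target
    [-1.5, 450.0, 0.10, 9.5],  # structurally unrelated chemical
])

# z-score each descriptor column so no single descriptor dominates
z = (descriptors - descriptors.mean(axis=0)) / descriptors.std(axis=0)
corr = np.corrcoef(z)            # chemical-by-chemical correlation matrix
ranking = np.argsort(-corr[0])   # most similar to the target first
```

With the values above, the close analogue ranks directly after the target itself, which is the behavior used to select source models for new chemicals.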

  16. Radiotherapy planning for glioblastoma based on a tumor growth model: improving target volume delineation

    Science.gov (United States)

    Unkelbach, Jan; Menze, Bjoern H.; Konukoglu, Ender; Dittmann, Florian; Le, Matthieu; Ayache, Nicholas; Shih, Helen A.

    2014-02-01

Glioblastomas differ from many other tumors in the sense that they grow infiltratively into the brain tissue instead of forming a solid tumor mass with a defined boundary. Only the part of the tumor with high tumor cell density can be localized directly through imaging. In contrast, brain tissue infiltrated by tumor cells at low density appears normal on current imaging modalities. In current clinical practice, a uniform margin, typically two centimeters, is applied to account for microscopic spread of disease that is not directly assessable through imaging. The current treatment planning procedure can potentially be improved by accounting for the anisotropy of tumor growth, which arises from different factors: anatomical barriers such as the falx cerebri represent boundaries for migrating tumor cells, and tumor cells primarily spread in white matter, infiltrating gray matter at a lower rate. We investigate the use of a phenomenological tumor growth model for treatment planning. The model is based on the Fisher-Kolmogorov equation, which formalizes these growth characteristics and estimates the spatial distribution of tumor cells in normal-appearing regions of the brain. The target volume for radiotherapy planning can then be defined as an isoline of the simulated tumor cell density. This paper analyzes the model with respect to its implications for target volume definition and identifies its most critical components. A retrospective study involving ten glioblastoma patients treated at our institution has been performed. To illustrate the main findings of the study, a detailed case study is presented for a glioblastoma located close to the falx. In this situation, the falx represents a boundary for migrating tumor cells, whereas the corpus callosum provides a route for the tumor to spread to the contralateral hemisphere. We further discuss the sensitivity of the model with respect to the input parameters. Correct segmentation of the brain appears to be the most
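The growth characteristics described above are formalized in this family of models by a reaction-diffusion equation of Fisher-Kolmogorov type. A sketch of the standard form, with the usual symbols from this literature (c: normalized tumor cell density, ρ: proliferation rate, D: tissue-dependent diffusion tensor):

```latex
\frac{\partial c}{\partial t}
  = \nabla \cdot \bigl( D(\mathbf{x})\, \nabla c \bigr) + \rho\, c\,(1 - c),
\qquad
D(\mathbf{x}) =
\begin{cases}
  D_w & \text{in white matter (fast migration)}\\
  D_g & \text{in gray matter, with } D_g < D_w
\end{cases}
```

Anatomical barriers such as the falx enter as no-flux boundary conditions, $(D \nabla c)\cdot \mathbf{n} = 0$, and the planning target volume is taken as an isoline $\{\mathbf{x} : c(\mathbf{x}) = c_{\mathrm{th}}\}$ of the simulated cell density, as described in the abstract.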

  17. Radiotherapy planning for glioblastoma based on a tumor growth model: improving target volume delineation

    International Nuclear Information System (INIS)

    Unkelbach, Jan; Dittmann, Florian; Le, Matthieu; Shih, Helen A; Menze, Bjoern H; Ayache, Nicholas; Konukoglu, Ender

    2014-01-01

Glioblastomas differ from many other tumors in the sense that they grow infiltratively into the brain tissue instead of forming a solid tumor mass with a defined boundary. Only the part of the tumor with high tumor cell density can be localized directly through imaging. In contrast, brain tissue infiltrated by tumor cells at low density appears normal on current imaging modalities. In current clinical practice, a uniform margin, typically two centimeters, is applied to account for microscopic spread of disease that is not directly assessable through imaging. The current treatment planning procedure can potentially be improved by accounting for the anisotropy of tumor growth, which arises from different factors: anatomical barriers such as the falx cerebri represent boundaries for migrating tumor cells, and tumor cells primarily spread in white matter, infiltrating gray matter at a lower rate. We investigate the use of a phenomenological tumor growth model for treatment planning. The model is based on the Fisher–Kolmogorov equation, which formalizes these growth characteristics and estimates the spatial distribution of tumor cells in normal-appearing regions of the brain. The target volume for radiotherapy planning can then be defined as an isoline of the simulated tumor cell density. This paper analyzes the model with respect to its implications for target volume definition and identifies its most critical components. A retrospective study involving ten glioblastoma patients treated at our institution has been performed. To illustrate the main findings of the study, a detailed case study is presented for a glioblastoma located close to the falx. In this situation, the falx represents a boundary for migrating tumor cells, whereas the corpus callosum provides a route for the tumor to spread to the contralateral hemisphere. We further discuss the sensitivity of the model with respect to the input parameters. Correct segmentation of the brain appears to be the most

  18. Model-Based Power Plant Master Control

    Energy Technology Data Exchange (ETDEWEB)

    Boman, Katarina; Thomas, Jean; Funkquist, Jonas

    2010-08-15

    The main goal of the project has been to evaluate the potential of a coordinated master control for a solid fuel power plant in terms of tracking capability, stability and robustness. The control strategy has been model-based predictive control (MPC) and the plant used in the case study has been the Vattenfall power plant Idbaecken in Nykoeping. A dynamic plant model based on nonlinear physical models was used to imitate the true plant in MATLAB/SIMULINK simulations. The basis for this model was already developed in previous Vattenfall internal projects, along with a simulation model of the existing control implementation with traditional PID controllers. The existing PID control is used as a reference performance, and it has been thoroughly studied and tuned in these previous Vattenfall internal projects. A turbine model was developed with characteristics based on the results of steady-state simulations of the plant using the software EBSILON. Using the derived model as a representative for the actual process, an MPC control strategy was developed using linearization and gain-scheduling. The control signal constraints (rate of change) and constraints on outputs were implemented to comply with plant constraints. After tuning the MPC control parameters, a number of simulation scenarios were performed to compare the MPC strategy with the existing PID control structure. The simulation scenarios also included cases highlighting the robustness properties of the MPC strategy. From the study, the main conclusions are: - The proposed Master MPC controller shows excellent set-point tracking performance even though the plant has strong interactions and non-linearity, and the controls and their rate of change are bounded. - The proposed Master MPC controller is robust, stable in the presence of disturbances and parameter variations. Even though the current study only considered a very small number of the possible disturbances and modelling errors, the considered cases are
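As a toy illustration of the MPC idea described above (bounded controls, bounded rate of change, receding horizon), the sketch below controls an assumed scalar linear plant by searching over discretized control increments. It is not the Vattenfall plant model or the project's gain-scheduled controller; all parameters are invented.

```python
import numpy as np

# Receding-horizon control of a hypothetical scalar plant x+ = a*x + b*u,
# with the control increment bounded by du_max (rate-of-change constraint).
a, b = 0.9, 0.5
du_max, u_min, u_max = 0.2, -2.0, 2.0
horizon = 5
candidates = np.linspace(-du_max, du_max, 9)   # discretized increments

def mpc_step(x, u_prev, setpoint):
    """Pick the admissible control increment minimizing predicted cost."""
    best_du, best_cost = 0.0, np.inf
    for du in candidates:                      # exhaustive one-move search
        u = np.clip(u_prev + du, u_min, u_max)
        xk, cost = x, 0.0
        for _ in range(horizon):               # predict with u held constant
            xk = a * xk + b * u
            cost += (xk - setpoint) ** 2       # tracking error
        cost += 0.1 * du ** 2                  # penalize control moves
        if cost < best_cost:
            best_du, best_cost = du, cost
    return float(np.clip(u_prev + best_du, u_min, u_max))

x, u = 0.0, 0.0
for _ in range(60):                            # receding-horizon loop
    u = mpc_step(x, u, setpoint=1.0)
    x = a * x + b * u                          # apply first move, repeat
```

A real master controller would use a multivariable linearized model, gain scheduling, and a QP solver rather than enumeration, but the structure (predict, optimize under constraints, apply first move, repeat) is the same.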

  19. Semi-active control of magnetorheological elastomer base isolation system utilising learning-based inverse model

    Science.gov (United States)

    Gu, Xiaoyu; Yu, Yang; Li, Jianchun; Li, Yancheng

    2017-10-01

Magnetorheological elastomer (MRE) base isolation has attracted considerable attention over the last two decades thanks to its self-adaptability and high-authority controllability in the semi-active control realm. Due to the inherent nonlinearity and hysteresis of the devices, it is challenging to obtain a mathematical model that adequately describes the inverse dynamics of MRE base isolators and hence to realise control synthesis of the MRE base isolation system. Two aims have been achieved in this paper: i) development of an inverse model for the MRE base isolator based on an optimal general regression neural network (GRNN); ii) numerical and experimental validation of a real-time semi-active controlled MRE base isolation system utilising an LQR controller and the GRNN inverse model. The superiority of the GRNN inverse model lies in its need for fewer input variables, its faster training process and its prompt calculation response, which make it suitable for online training and real-time control. The control system is integrated with a three-storey shear building model and the control performance of the MRE base isolation system is compared with the bare building, a passive-on isolation system and a passive-off isolation system. Testing results show that the proposed GRNN inverse model is able to reproduce the desired control force accurately and that the MRE base isolation system can effectively suppress the structural responses compared to the passive isolation systems.
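A GRNN is essentially Nadaraya-Watson kernel regression: the prediction is a Gaussian-kernel-weighted mean of the stored training targets, which is why it trains in one pass and evaluates quickly. The sketch below uses synthetic data and an assumed smooth inverse mapping, not the MRE isolator data from the paper.

```python
import numpy as np

# Hedged GRNN sketch used as an inverse model: from (displacement,
# desired control force) predict the actuation input. Data is synthetic.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(1000, 2))   # [displacement, desired force]
y = 2.0 * X[:, 1] - 0.5 * X[:, 0]           # assumed inverse relation

def grnn_predict(x, X, y, sigma=0.08):
    """GRNN output: kernel-weighted mean of the training targets."""
    d2 = np.sum((X - x) ** 2, axis=1)       # squared distances to patterns
    w = np.exp(-d2 / (2.0 * sigma ** 2))    # Gaussian pattern-layer weights
    return float(np.dot(w, y) / np.sum(w))  # summation / output layer

pred = grnn_predict(np.array([0.5, 0.5]), X, y)
```

The single smoothing parameter sigma is what "optimal GRNN" schemes typically tune; everything else is just the stored training set, which is why online retraining is cheap.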

  20. Mechanics and model-based control of advanced engineering systems

    CERN Document Server

    Irschik, Hans; Krommer, Michael

    2014-01-01

    Mechanics and Model-Based Control of Advanced Engineering Systems collects 32 contributions presented at the International Workshop on Advanced Dynamics and Model Based Control of Structures and Machines, which took place in St. Petersburg, Russia in July 2012. The workshop continued a series of international workshops, which started with a Japan-Austria Joint Workshop on Mechanics and Model Based Control of Smart Materials and Structures and a Russia-Austria Joint Workshop on Advanced Dynamics and Model Based Control of Structures and Machines. In the present volume, 10 full-length papers based on presentations from Russia, 9 from Austria, 8 from Japan, 3 from Italy, one from Germany and one from Taiwan are included, which represent the state of the art in the field of mechanics and model based control, with particular emphasis on the application of advanced structures and machines.

  1. Thermal-based modeling of coupled carbon, water, and energy fluxes using nominal light use efficiencies constrained by leaf chlorophyll observations

    KAUST Repository

    Schull, M. A.

    2015-03-11

    Recent studies have shown that estimates of leaf chlorophyll content (Chl), defined as the combined mass of chlorophyll a and chlorophyll b per unit leaf area, can be useful for constraining estimates of canopy light use efficiency (LUE). Canopy LUE describes the amount of carbon assimilated by a vegetative canopy for a given amount of absorbed photosynthetically active radiation (APAR) and is a key parameter for modeling land-surface carbon fluxes. A carbon-enabled version of the remote-sensing-based two-source energy balance (TSEB) model simulates coupled canopy transpiration and carbon assimilation using an analytical sub-model of canopy resistance constrained by inputs of nominal LUE (βn), which is modulated within the model in response to varying conditions in light, humidity, ambient CO2 concentration, and temperature. Soil moisture constraints on water and carbon exchange are conveyed to the TSEB-LUE indirectly through thermal infrared measurements of land-surface temperature. We investigate the capability of using Chl estimates for capturing seasonal trends in the canopy βn from in situ measurements of Chl acquired in irrigated and rain-fed fields of soybean and maize near Mead, Nebraska. The results show that field-measured Chl is nonlinearly related to βn, with variability primarily related to phenological changes during early growth and senescence. Utilizing seasonally varying βn inputs based on an empirical relationship with in situ measured Chl resulted in improvements in carbon flux estimates from the TSEB model, while adjusting the partitioning of total water loss between plant transpiration and soil evaporation. The observed Chl-βn relationship provides a functional mechanism for integrating remotely sensed Chl into the TSEB model, with the potential for improved mapping of coupled carbon, water, and energy fluxes across vegetated landscapes.

  2. Approximation Algorithms for Model-Based Diagnosis

    NARCIS (Netherlands)

    Feldman, A.B.

    2010-01-01

    Model-based diagnosis is an area of abductive inference that uses a system model, together with observations about system behavior, to isolate sets of faulty components (diagnoses) that explain the observed behavior, according to some minimality criterion. This thesis presents greedy approximation

  3. Ionospheric forecasting model using fuzzy logic-based gradient descent method

    Directory of Open Access Journals (Sweden)

    D. Venkata Ratnam

    2017-09-01

Space weather phenomena cause satellite-to-ground or satellite-to-aircraft transmission outages over the VHF to L-band frequency range, particularly in the low-latitude region. The Global Positioning System (GPS) is primarily susceptible to this form of space weather. Faulty GPS signals are attributed to ionospheric error, which is a function of Total Electron Content (TEC). Importantly, precise forecasts of space weather conditions and the appropriate hazard warnings required for ionospheric space weather observations are limited. In this paper, a fuzzy logic-based gradient descent method is proposed to forecast ionospheric TEC values. In this technique, membership functions are tuned based on the gradient-descent-estimated values. The proposed algorithm has been tested with the TEC data of two geomagnetic storms at the low-latitude station of KL University, Guntur, India (16.44°N, 80.62°E). It has been found that the gradient descent method performs well and the predicted TEC values are close to the original TEC measurements.
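A minimal sketch of gradient-descent tuning of a fuzzy model, under assumptions that differ from the paper's exact system: a zero-order Sugeno model with fixed Gaussian membership functions whose consequent values are fitted by gradient descent to a synthetic diurnal TEC curve. The centers, widths, learning rate, and data are all invented.

```python
import numpy as np

# Zero-order Sugeno fuzzy model: y(x) = sum_i mu_i(x) c_i / sum_i mu_i(x)
centers = np.linspace(0.0, 24.0, 7)     # membership centers over local time (h)
sigma = 2.5                             # fixed membership widths
consequents = np.zeros_like(centers)    # tunable rule outputs

def fire(x):
    """Gaussian membership degrees for input x."""
    return np.exp(-((x - centers) ** 2) / (2.0 * sigma ** 2))

def predict(x):
    mu = fire(x)
    return float(np.dot(mu, consequents) / np.sum(mu))

# synthetic diurnal TEC curve (TECU), purely for demonstration
t = np.linspace(0.0, 24.0, 200)
tec = 20.0 + 15.0 * np.sin(np.pi * t / 24.0) ** 2

lr = 0.5
for _ in range(300):                    # gradient descent on squared error
    for x, target in zip(t, tec):
        w = fire(x)
        w = w / np.sum(w)               # normalized firing strengths
        err = predict(x) - target
        consequents -= lr * err * w     # dE/dc_i = err * normalized firing

rmse = float(np.sqrt(np.mean([(predict(x) - yv) ** 2 for x, yv in zip(t, tec)])))
```

The paper additionally tunes the membership functions themselves; the same chain-rule gradient extends to the centers and widths.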

  4. Random sampling or geostatistical modelling? Choosing between design-based and model-based sampling strategies for soil (with discussion)

    NARCIS (Netherlands)

    Brus, D.J.; Gruijter, de J.J.

    1997-01-01

    Classical sampling theory has been repeatedly identified with classical statistics which assumes that data are identically and independently distributed. This explains the switch of many soil scientists from design-based sampling strategies, based on classical sampling theory, to the model-based

  5. Néron Models and Base Change

    DEFF Research Database (Denmark)

    Halle, Lars Halvard; Nicaise, Johannes

Presenting the first systematic treatment of the behavior of Néron models under ramified base change, this book can be read as an introduction to various subtle invariants and constructions related to Néron models of semi-abelian varieties, motivated by concrete research problems and complemented with explicit examples. Néron models of abelian and semi-abelian varieties have become an indispensable tool in algebraic and arithmetic geometry since Néron introduced them in his seminal 1964 paper. Applications range from the theory of heights in Diophantine geometry to Hodge theory. We focus specifically on Néron component groups, Edixhoven’s filtration and the base change conductor of Chai and Yu, and we study these invariants using various techniques such as models of curves, sheaves on Grothendieck sites and non-archimedean uniformization. We then apply our results to the study of motivic zeta functions...

  6. Simple inhomogeneous cosmological (toy) models

    International Nuclear Information System (INIS)

    Isidro, Eddy G. Chirinos; Zimdahl, Winfried; Vargas, Cristofher Zuñiga

    2016-01-01

Based on the Lemaître-Tolman-Bondi (LTB) metric, we consider two flat inhomogeneous big-bang models. We aim at clarifying, as far as possible analytically, basic features of the dynamics of the simplest inhomogeneous models, and at pointing out the potential usefulness of exact inhomogeneous solutions as generalizations of the homogeneous configurations of the cosmological standard model. We discuss explicitly partial successes but also potential pitfalls of these simplest models. Although they are primarily seen as toy models, the relevant free parameters are fixed by best-fit values using the Joint Light-curve Analysis (JLA)-sample data. On the basis of a likelihood analysis we find that a local hump with an extension of almost 2 Gpc provides a better description of the observations than a local void, for which we obtain a best-fit scale of about 30 Mpc. Future redshift-drift measurements are discussed as a promising tool to discriminate between inhomogeneous configurations and the ΛCDM model.

  7. Elastoplastic cup model for cement-based materials

    Directory of Open Access Journals (Sweden)

    Yan Zhang

    2010-03-01

Based on experimental data obtained from triaxial tests and a hydrostatic test, a cup model was formulated. Two plastic mechanisms, deviatoric shearing and pore collapse, are taken into account. The model also considers the influence of confining pressure. In this paper, the calibration of the model is detailed and numerical simulations of the main mechanical behavior of cement paste over a large range of stress are described, showing good agreement with experimental results. The case study shows that this cup model has extensive applicability for cement-based materials and other quasi-brittle, high-porosity materials in a complex stress state.

  8. Towards Modeling False Memory With Computational Knowledge Bases.

    Science.gov (United States)

    Li, Justin; Kohanyi, Emma

    2017-01-01

    One challenge to creating realistic cognitive models of memory is the inability to account for the vast common-sense knowledge of human participants. Large computational knowledge bases such as WordNet and DBpedia may offer a solution to this problem but may pose other challenges. This paper explores some of these difficulties through a semantic network spreading activation model of the Deese-Roediger-McDermott false memory task. In three experiments, we show that these knowledge bases only capture a subset of human associations, while irrelevant information introduces noise and makes efficient modeling difficult. We conclude that the contents of these knowledge bases must be augmented and, more important, that the algorithms must be refined and optimized, before large knowledge bases can be widely used for cognitive modeling. Copyright © 2016 Cognitive Science Society, Inc.
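A minimal sketch of the spreading-activation idea in the DRM setting: studying several associates of an unstudied critical lure ("sleep") leaves the lure more active than any studied word, producing a false memory. The tiny network and association strengths are invented for illustration; a knowledge-base-backed model would draw its edges from a resource like WordNet instead.

```python
# Undirected association strengths (hypothetical toy network)
edges = {
    ("bed", "sleep"): 0.8, ("rest", "sleep"): 0.7,
    ("dream", "sleep"): 0.9, ("bed", "rest"): 0.3,
}

def neighbors(node):
    """Yield (associate, strength) pairs for a node."""
    for (a, b), w in edges.items():
        if a == node:
            yield b, w
        if b == node:
            yield a, w

activation = {}
for studied in ["bed", "rest", "dream"]:     # present the study list
    activation[studied] = activation.get(studied, 0.0) + 1.0
    for nb, w in neighbors(studied):         # one step of spreading
        activation[nb] = activation.get(nb, 0.0) + w

# the unstudied lure "sleep" accumulates activation from all associates
```

The abstract's point about noise follows directly: with a large knowledge base, many irrelevant neighbors also receive activation on every spread, which swamps this clean toy behavior.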

  9. Testing process predictions of models of risky choice: a quantitative model comparison approach

    Science.gov (United States)

    Pachur, Thorsten; Hertwig, Ralph; Gigerenzer, Gerd; Brandstätter, Eduard

    2013-01-01

    This article presents a quantitative model comparison contrasting the process predictions of two prominent views on risky choice. One view assumes a trade-off between probabilities and outcomes (or non-linear functions thereof) and the separate evaluation of risky options (expectation models). Another view assumes that risky choice is based on comparative evaluation, limited search, aspiration levels, and the forgoing of trade-offs (heuristic models). We derived quantitative process predictions for a generic expectation model and for a specific heuristic model, namely the priority heuristic (Brandstätter et al., 2006), and tested them in two experiments. The focus was on two key features of the cognitive process: acquisition frequencies (i.e., how frequently individual reasons are looked up) and direction of search (i.e., gamble-wise vs. reason-wise). In Experiment 1, the priority heuristic predicted direction of search better than the expectation model (although neither model predicted the acquisition process perfectly); acquisition frequencies, however, were inconsistent with both models. Additional analyses revealed that these frequencies were primarily a function of what Rubinstein (1988) called “similarity.” In Experiment 2, the quantitative model comparison approach showed that people seemed to rely more on the priority heuristic in difficult problems, but to make more trade-offs in easy problems. This finding suggests that risky choice may be based on a mental toolbox of strategies. PMID:24151472
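For simple two-outcome gain gambles, the priority heuristic of Brandstätter et al. (2006) examines reasons in a fixed order, stopping as soon as a difference exceeds an aspiration level; the sketch below follows that published description, with gamble representation and tie-breaking simplified.

```python
def priority_heuristic(g1, g2):
    """Choose between two gain gambles, each given as
    (min_gain, p_of_min_gain, max_gain). Returns index 0 or 1.
    Reasons are examined in order: minimum gains, probability of the
    minimum gain, maximum gains; the aspiration level is 1/10 of the
    largest gain (and 0.1 on the probability scale)."""
    aspiration = max(g1[2], g2[2]) / 10.0
    if abs(g1[0] - g2[0]) >= aspiration:      # reason 1: minimum gains
        return 0 if g1[0] > g2[0] else 1
    if abs(g1[1] - g2[1]) >= 0.1:             # reason 2: p of minimum gain
        return 0 if g1[1] < g2[1] else 1      # prefer the lower p(min)
    return 0 if g1[2] > g2[2] else 1          # reason 3: maximum gains

# classic illustration: a sure 100 versus a 50% chance of 200
choice = priority_heuristic((100, 1.0, 100), (0, 0.5, 200))
```

Here the first reason already decides (the difference in minimum gains, 100, exceeds the aspiration level of 20), so the sure option is chosen without any probability-outcome trade-off, which is exactly the "limited search, no trade-offs" process contrasted with expectation models above.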

  10. Testing Process Predictions of Models of Risky Choice: A Quantitative Model Comparison Approach

    Directory of Open Access Journals (Sweden)

    Thorsten ePachur

    2013-09-01

This article presents a quantitative model comparison contrasting the process predictions of two prominent views on risky choice. One view assumes a trade-off between probabilities and outcomes (or nonlinear functions thereof) and the separate evaluation of risky options (expectation models). Another view assumes that risky choice is based on comparative evaluation, limited search, aspiration levels, and the forgoing of trade-offs (heuristic models). We derived quantitative process predictions for a generic expectation model and for a specific heuristic model, namely the priority heuristic (Brandstätter, Gigerenzer, & Hertwig, 2006), and tested them in two experiments. The focus was on two key features of the cognitive process: acquisition frequencies (i.e., how frequently individual reasons are looked up) and direction of search (i.e., gamble-wise vs. reason-wise). In Experiment 1, the priority heuristic predicted direction of search better than the expectation model (although neither model predicted the acquisition process perfectly); acquisition frequencies, however, were inconsistent with both models. Additional analyses revealed that these frequencies were primarily a function of what Rubinstein (1988) called "similarity." In Experiment 2, the quantitative model comparison approach showed that people seemed to rely more on the priority heuristic in difficult problems, but to make more trade-offs in easy problems. This finding suggests that risky choice may be based on a mental toolbox of strategies.

  11. Better Patient Care At High-Quality Hospitals May Save Medicare Money And Bolster Episode-Based Payment Models.

    Science.gov (United States)

    Tsai, Thomas C; Greaves, Felix; Zheng, Jie; Orav, E John; Zinner, Michael J; Jha, Ashish K

    2016-09-01

US policy makers are making efforts to simultaneously improve the quality of and reduce spending on health care through alternative payment models such as bundled payment. Bundled payment models are predicated on the theory that aligning financial incentives for all providers across an episode of care will lower health care spending while improving quality. Whether this is true remains unknown. Using national Medicare fee-for-service claims for the period 2011-12 and data on hospital quality, we evaluated how thirty- and ninety-day episode-based spending were related to two validated measures of surgical quality: patient satisfaction and surgical mortality. We found that patients who had major surgery at high-quality hospitals cost Medicare less than those who had surgery at low-quality institutions, for both thirty- and ninety-day periods. The difference in Medicare spending between low- and high-quality hospitals was driven primarily by postacute care, which accounted for 59.5 percent of the difference in thirty-day episode spending, and readmissions, which accounted for 19.9 percent. These findings suggest that efforts to achieve value through bundled payment should focus on improving care at low-quality hospitals and reducing unnecessary use of postacute care. Project HOPE—The People-to-People Health Foundation, Inc.

  12. A technology path to tactical agent-based modeling

    Science.gov (United States)

    James, Alex; Hanratty, Timothy P.

    2017-05-01

Wargaming is a process of thinking through and visualizing events that could occur during a possible course of action. Over the past 200 years, wargaming has matured into a set of formalized processes. One area of growing interest is the application of agent-based modeling. Agent-based modeling and its supporting technologies have the potential to introduce a third-generation wargaming capability to the Army, creating a positive overmatch decision-making capability. In its simplest form, agent-based modeling is a computational technique that helps the modeler understand and simulate how the "whole of a system" responds to change over time. It provides a decentralized method of looking at situations in which individual agents are instantiated within an environment, interact with each other, and are empowered to make their own decisions. However, this technology is not without its own risks and limitations. This paper explores a technology roadmap, identifying research topics that could realize agent-based modeling within a tactical wargaming context.
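The "simplest form" described above can be made concrete with a toy model: each agent holds a binary stance, interacts with a randomly chosen peer each tick, and is sometimes persuaded to switch, so that a system-level outcome emerges from purely local decisions. The rules and parameters are invented for illustration and have nothing to do with any real Army wargaming model.

```python
import random

random.seed(42)
agents = [random.random() < 0.6 for _ in range(100)]  # True = stance A

def step(agents):
    """One tick: every agent meets one random peer and may switch."""
    nxt = agents[:]
    for i in range(len(agents)):
        j = random.randrange(len(agents))             # random interaction
        if agents[i] != agents[j] and random.random() < 0.5:
            nxt[i] = agents[j]                        # sometimes persuaded
    return nxt

for _ in range(200):
    agents = step(agents)

share_A = sum(agents) / len(agents)   # emergent, system-level quantity
```

Nothing in the per-agent rule mentions `share_A`; the aggregate trajectory is an emergent property, which is the point the abstract makes about simulating "the whole of a system."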

  13. Modeling potential Emerald Ash Borer spread through GIS/cell-based/gravity models with data bolstered by web-based inputs

    Science.gov (United States)

    Louis R. Iverson; Anantha M. Prasad; Davis Sydnor; Jonathan Bossenbroek; Mark W. Schwartz; Mark W. Schwartz

    2006-01-01

    We model the susceptibility and potential spread of the organism across the eastern United States and especially through Michigan and Ohio using Forest Inventory and Analysis (FIA) data. We are also developing a cell-based model for the potential spread of the organism. We have developed a web-based tool for public agencies and private individuals to enter the...

  14. Model Based Analysis and Test Generation for Flight Software

    Science.gov (United States)

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.

  15. Abstracting event-based control models for high autonomy systems

    Science.gov (United States)

    Luh, Cheng-Jye; Zeigler, Bernard P.

    1993-01-01

    A high autonomy system needs many models on which to base control, management, design, and other interventions. These models differ in level of abstraction and in formalism. Concepts and tools are needed to organize the models into a coherent whole. The paper deals with the abstraction processes for systematic derivation of related models for use in event-based control. The multifaceted modeling methodology is briefly reviewed. The morphism concepts needed for application to model abstraction are described. A theory for supporting the construction of DEVS models needed for event-based control is then presented. An implemented morphism on the basis of this theory is also described.

  16. LEARNING CREATIVE WRITING MODEL BASED ON NEUROLINGUISTIC PROGRAMMING

    OpenAIRE

    Rustan, Edhy

    2017-01-01

The objectives of the study are to determine: (1) the condition of creative writing instruction for high school students in Makassar, (2) the requirements of a learning model for creative writing, (3) the program planning and design of an ideal creative writing model, (4) the feasibility of a creative writing learning model based on neurolinguistic programming, and (5) the effectiveness of the creative writing learning model based on neurolinguistic programming. The method of this research uses research development of L...

  17. Simulation-Based Internal Models for Safer Robots

    Directory of Open Access Journals (Sweden)

    Christian Blum

    2018-01-01

In this paper, we explore the potential of mobile robots with simulation-based internal models for safety in highly dynamic environments. We propose a robot with a simulation of itself, other dynamic actors and its environment, inside itself. Operating in real time, this simulation-based internal model is able to look ahead and predict the consequences of both the robot’s own actions and those of the other dynamic actors in its vicinity. Hence, the robot continuously modifies its own actions in order to actively maintain its own safety while also achieving its goal. Inspired by the problem of how mobile robots could move quickly and safely through crowds of moving humans, we present experimental results which compare the performance of our internal simulation-based controller with a purely reactive approach as a proof-of-concept study for the practical use of simulation-based internal models.

  18. Problem-Oriented Corporate Knowledge Base Models on the Case-Based Reasoning Approach Basis

    Science.gov (United States)

    Gluhih, I. N.; Akhmadulin, R. K.

    2017-07-01

    One of the urgent directions of efficiency enhancement of production processes and enterprise activity management is the creation and use of corporate knowledge bases. The article suggests a concept of problem-oriented corporate knowledge bases (PO CKB), in which knowledge is arranged around possible problem situations and serves as a tool for making and implementing decisions in such situations. For knowledge representation in PO CKB, the use of a case-based reasoning approach is encouraged. Under this approach, the content of a case as a knowledge base component has been defined; based on the situation tree, a PO CKB knowledge model has been developed, in which knowledge about typical situations as well as specific examples of situations and solutions is represented. A generalized problem-oriented corporate knowledge base structural chart and possible modes of its operation have been suggested. The obtained models allow the creation and use of corporate knowledge bases to support decision making and implementation, training, staff skill upgrading, and analysis of the decisions taken. The universal interpretation of the terms “situation” and “solution” adopted in the work allows using the suggested models to develop problem-oriented corporate knowledge bases in different subject domains. It has been suggested to use the developed models to build corporate knowledge bases for enterprises that operate engineering systems and networks at large production facilities.
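    The case-based reasoning cycle described here (a case = situation attributes plus a solution; retrieval by similarity to the current problem) can be sketched minimally. The attribute names, the example cases, and the simple matching-fraction similarity measure are all our own invented illustrations, not content from the article.

```python
# A tiny case base: each case pairs a problem situation with the solution applied.
CASES = [
    {"situation": {"system": "pump", "symptom": "vibration", "load": "high"},
     "solution": "check bearing alignment"},
    {"situation": {"system": "pump", "symptom": "leak", "load": "low"},
     "solution": "replace seal"},
    {"situation": {"system": "pipeline", "symptom": "pressure drop", "load": "high"},
     "solution": "inspect for blockage"},
]

def similarity(a, b):
    """Fraction of matching attribute values (simple nearest-neighbour measure)."""
    keys = set(a) | set(b)
    return sum(a.get(k) == b.get(k) for k in keys) / len(keys)

def retrieve(problem):
    """Return the stored solution of the most similar past case."""
    best = max(CASES, key=lambda c: similarity(c["situation"], problem))
    return best["solution"]
```

    A new situation is matched against stored situations and the closest case's solution is reused, which is the "retrieve" step of the CBR cycle; a full system would also revise and retain cases.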

  19. CHIRP-Like Signals: Estimation, Detection and Processing A Sequential Model-Based Approach

    Energy Technology Data Exchange (ETDEWEB)

    Candy, J. V. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-08-04

    Chirp signals have evolved primarily from radar/sonar signal processing applications, specifically those attempting to estimate the location of a target in a surveillance/tracking volume. The chirp, which is essentially a sinusoidal signal whose instantaneous frequency changes at each time sample, has an interesting property in that its autocorrelation approximates an impulse function. It is well known that a matched-filter detector in radar/sonar estimates the target range by cross-correlating a replica of the transmitted chirp with the measurement data reflected from the target back to the radar/sonar receiver, yielding a maximum peak corresponding to the echo time and therefore enabling the desired range estimate. In this application, we perform the same operation as a radar or sonar system; that is, we transmit a “chirp-like pulse” into the target medium and attempt first to detect its presence and second to estimate its location or range. Our problem is complicated by the presence of disturbance signals from surrounding broadcast stations, extraneous sources of interference in our frequency bands, and of course the ever-present random noise from instrumentation. First, we discuss the chirp signal itself and illustrate its inherent properties, and then develop a model-based processing scheme enabling both the detection and estimation of the signal from noisy measurement data.
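    The matched-filter step described above is easy to demonstrate: correlate a replica of the transmitted chirp against a noisy record containing a delayed echo, and the correlation peak recovers the delay. The sample rate, sweep, noise level, and delay below are arbitrary illustrative choices, not the report's parameters.

```python
import math
import random

def chirp(n, f0, f1, fs):
    """Linear chirp: n samples at rate fs, sweeping f0 -> f1 Hz."""
    out = []
    duration = n / fs
    for k in range(n):
        t = k / fs
        # phase of a linear sweep: 2*pi*(f0*t + (f1 - f0)*t^2 / (2*T))
        phase = 2.0 * math.pi * (f0 * t + 0.5 * (f1 - f0) * t * t / duration)
        out.append(math.cos(phase))
    return out

def matched_filter_delay(received, replica):
    """Cross-correlate; return the lag of the largest peak (echo delay in samples)."""
    best_lag, best_val = 0, float("-inf")
    for lag in range(len(received) - len(replica) + 1):
        val = sum(received[lag + i] * replica[i] for i in range(len(replica)))
        if val > best_val:
            best_lag, best_val = lag, val
    return best_lag

# Bury an echo of the chirp, delayed by 40 samples, in instrumentation noise.
random.seed(0)
pulse = chirp(128, 50.0, 200.0, 1000.0)
received = [0.2 * random.gauss(0.0, 1.0) for _ in range(300)]
for i, v in enumerate(pulse):
    received[40 + i] += v
delay = matched_filter_delay(received, pulse)
```

    Because the chirp's autocorrelation is nearly impulsive, the peak stands well clear of both the sidelobes and the noise floor, and the recovered lag equals the true 40-sample delay.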

  20. Cultural, Human, and Social Capital as Determinants of Corporal Punishment: Toward an Integrated Theoretical Model.

    Science.gov (United States)

    Xu, Xiaohe; Tung, Yuk-Ying; Dunaway, R. Gregory

    2000-01-01

    This article constructs a model to predict the likelihood of parental use of corporal punishment on children in two-parent families. Reports that corporal punishment is primarily determined by the cultural, human, and social capital that are available to, or already acquired by, parents. Discusses an integrated, resource-based theory for predicting use…

  1. Agent-based modelling of cholera diffusion

    NARCIS (Netherlands)

    Augustijn-Beckers, Petronella; Doldersum, Tom; Useya, Juliana; Augustijn, Dionysius C.M.

    2016-01-01

    This paper introduces a spatially explicit agent-based simulation model for micro-scale cholera diffusion. The model simulates both an environmental reservoir of naturally occurring V. cholerae bacteria and hyperinfectious V. cholerae. The objective of the research is to test whether runoff from open refuse

  2. Embracing model-based designs for dose-finding trials.

    Science.gov (United States)

    Love, Sharon B; Brown, Sarah; Weir, Christopher J; Harbron, Chris; Yap, Christina; Gaschler-Markefski, Birgit; Matcham, James; Caffrey, Louise; McKevitt, Christopher; Clive, Sally; Craddock, Charlie; Spicer, James; Cornelius, Victoria

    2017-07-25

    Dose-finding trials are essential to drug development as they establish recommended doses for later-phase testing. We aim to motivate wider use of model-based designs for dose finding, such as the continual reassessment method (CRM). We carried out a literature review of dose-finding designs and conducted a survey to identify perceived barriers to their implementation. We describe the benefits of model-based designs (flexibility, superior operating characteristics, extended scope), their current uptake, and existing resources. The most prominent barriers to implementation of a model-based design were lack of suitable training, chief investigators' preference for algorithm-based designs (e.g., 3+3), and limited resources for study design before funding. We use a real-world example to illustrate how these barriers can be overcome. There is overwhelming evidence for the benefits of CRM. Many leading pharmaceutical companies routinely implement model-based designs. Our analysis identified barriers for academic statisticians and clinical academics in mirroring the progress industry has made in trial design. Unified support from funders, regulators, and journal editors could result in more accurate doses for later-phase testing, and increase the efficiency and success of clinical drug development. We give recommendations for increasing the uptake of model-based designs for dose-finding trials in academia.
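    As a rough illustration of how a model-based design such as the CRM works, the sketch below uses the standard one-parameter power model p_i(beta) = skeleton_i ** exp(beta) with a N(0,1) prior on beta, updates the posterior on a grid after each observed outcome, and recommends the dose whose estimated toxicity is closest to the target. The skeleton and target values are invented for illustration; a real trial would use calibrated values and validated software.

```python
import math

# Invented prior toxicity guesses per dose (the "skeleton") and target rate.
skeleton = [0.05, 0.12, 0.25, 0.40, 0.55]
target = 0.25

def posterior_toxicity(data, half_width=3.0, n_grid=201):
    """Grid posterior-mean toxicity per dose; data is a list of (dose_index, tox 0/1)."""
    grid = [-half_width + 2.0 * half_width * k / (n_grid - 1) for k in range(n_grid)]
    weights = []
    for b in grid:
        w = math.exp(-0.5 * b * b)            # N(0,1) prior density (up to a constant)
        for d, y in data:                     # likelihood of the observed outcomes
            p = skeleton[d] ** math.exp(b)
            w *= p if y else (1.0 - p)
        weights.append(w)
    total = sum(weights)
    return [sum(w * skeleton[i] ** math.exp(b) for b, w in zip(grid, weights)) / total
            for i in range(len(skeleton))]

def next_dose(data):
    """Recommend the dose whose posterior toxicity is closest to the target."""
    post = posterior_toxicity(data)
    return min(range(len(post)), key=lambda i: abs(post[i] - target))
```

    After toxicities are observed at the lowest dose, the posterior shifts toward higher toxicity at every dose and the recommendation stays at dose 0; this continual reassessment after each cohort is what distinguishes the CRM from algorithm-based rules such as 3+3.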

  3. New approaches in agent-based modeling of complex financial systems

    Science.gov (United States)

    Chen, Ting-Ting; Zheng, Bo; Li, Yan; Jiang, Xiong-Fei

    2017-12-01

    Agent-based modeling is a powerful simulation technique for understanding the collective behavior and microscopic interactions in complex financial systems. Recently, the concept of determining the key parameters of agent-based models from empirical data, instead of setting them artificially, was suggested. We first review several agent-based models and the new approaches to determine the key model parameters from historical market data. Based on the agents' behaviors, with heterogeneous personal preferences and interactions, these models are successful in explaining the microscopic origin of the temporal and spatial correlations of financial markets. We then present a novel paradigm combining big-data analysis with agent-based modeling. Specifically, from internet query and stock market data, we extract the information driving forces and develop an agent-based model to simulate the dynamic behaviors of complex financial systems.
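    The basic mechanics of such models, heterogeneous agents reacting to a common signal and their aggregate demand moving the price, can be sketched in a toy form. The thresholds, noise scales, and price-impact coefficient below are invented for illustration and are not the parameters of any model reviewed in the paper.

```python
import random

def simulate_market(n_agents=100, n_steps=200, seed=1):
    """Toy trend-following market: excess demand moves the log price."""
    rng = random.Random(seed)
    # heterogeneous personal thresholds for acting on the recent trend
    thresholds = [rng.uniform(0.0, 0.05) for _ in range(n_agents)]
    log_price, trend = 0.0, 0.0
    prices = []
    for _ in range(n_steps):
        demand = 0
        for th in thresholds:
            signal = trend + rng.gauss(0.0, 0.02)   # private noisy view of the trend
            if signal > th:
                demand += 1                          # buys into an up-trend
            elif signal < -th:
                demand -= 1                          # sells into a down-trend
        ret = 0.001 * demand / n_agents + rng.gauss(0.0, 0.005)
        trend = 0.9 * trend + 0.1 * ret              # smoothed recent trend
        log_price += ret
        prices.append(log_price)
    return prices
```

    Calibrating such parameters from historical data, rather than choosing them by hand as done here, is precisely the approach the review highlights.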

  4. SEP modeling based on global heliospheric models at the CCMC

    Science.gov (United States)

    Mays, M. L.; Luhmann, J. G.; Odstrcil, D.; Bain, H. M.; Schwadron, N.; Gorby, M.; Li, Y.; Lee, K.; Zeitlin, C.; Jian, L. K.; Lee, C. O.; Mewaldt, R. A.; Galvin, A. B.

    2017-12-01

    Heliospheric models provide contextual information on conditions in the heliosphere, including the background solar wind conditions and shock structures, and are used as input to SEP models, providing an essential tool for understanding SEP properties. The global 3D MHD WSA-ENLIL+Cone model provides a time-dependent background heliospheric description, into which a spherically shaped hydrodynamic CME can be inserted. ENLIL simulates solar wind parameters; additionally, one can extract the magnetic topologies of observer-connected magnetic field lines and all plasma and shock properties along those field lines. An accurate representation of the background solar wind is necessary for simulating transients. ENLIL simulations also drive SEP models such as the Solar Energetic Particle Model (SEPMOD) (Luhmann et al. 2007, 2010) and the Energetic Particle Radiation Environment Module (EPREM) (Schwadron et al. 2010). The Community Coordinated Modeling Center (CCMC) is in the process of making these SEP models available to the community and offering a system to run SEP models driven by a variety of heliospheric models available at CCMC. SEPMOD injects protons onto a sequence of observer field lines at intensities dependent on the connected shock source strength, which are then integrated at the observer to approximate the proton flux. EPREM couples with MHD models such as ENLIL and computes energetic particle distributions based on the focused transport equation along a Lagrangian grid of nodes that propagate out with the solar wind. The coupled ENLIL and SEP models allow us to derive the longitudinal distribution of SEP profiles of different types of events throughout the heliosphere. In this presentation we demonstrate several case studies of SEP event modeling at different observers based on WSA-ENLIL+Cone simulations.

  5. The evolution of process-based hydrologic models

    NARCIS (Netherlands)

    Clark, Martyn P.; Bierkens, Marc F.P.; Samaniego, Luis; Woods, Ross A.; Uijlenhoet, Remko; Bennett, Katrina E.; Pauwels, Valentijn R.N.; Cai, Xitian; Wood, Andrew W.; Peters-Lidard, Christa D.

    2017-01-01

    The diversity in hydrologic models has historically led to great controversy on the "correct" approach to process-based hydrologic modeling, with debates centered on the adequacy of process parameterizations, data limitations and uncertainty, and computational constraints on model analysis. In this

  6. Modeling soft factors in computer-based wargames

    Science.gov (United States)

    Alexander, Steven M.; Ross, David O.; Vinarskai, Jonathan S.; Farr, Steven D.

    2002-07-01

    Computer-based wargames have seen much improvement in recent years due to rapid increases in computing power. Because these games have been developed for the entertainment industry, most of these advances have centered on the graphics, sound, and user interfaces integrated into these wargames with less attention paid to the game's fidelity. However, for a wargame to be useful to the military, it must closely approximate as many of the elements of war as possible. Among the elements that are typically not modeled or are poorly modeled in nearly all military computer-based wargames are systematic effects, command and control, intelligence, morale, training, and other human and political factors. These aspects of war, with the possible exception of systematic effects, are individually modeled quite well in many board-based commercial wargames. The work described in this paper focuses on incorporating these elements from the board-based games into a computer-based wargame. This paper will also address the modeling and simulation of the systemic paralysis of an adversary that is implied by the concept of Effects Based Operations (EBO). Combining the fidelity of current commercial board wargames with the speed, ease of use, and advanced visualization of the computer can significantly improve the effectiveness of military decision making and education. Once in place, the process of converting board wargames concepts to computer wargames will allow the infusion of soft factors into military training and planning.

  7. Evaluating Water Demand Using Agent-Based Modeling

    Science.gov (United States)

    Lowry, T. S.

    2004-12-01

    The supply and demand of water resources are functions of complex, inter-related systems including hydrology, climate, demographics, economics, and policy. To assess the safety and sustainability of water resources, planners often rely on complex numerical models that relate some or all of these systems using mathematical abstractions. The accuracy of these models relies on how well the abstractions capture the true nature of the systems interactions. Typically, these abstractions are based on analyses of observations and/or experiments that account only for the statistical mean behavior of each system. This limits the approach in two important ways: 1) It cannot capture cross-system disruptive events, such as major drought, significant policy change, or terrorist attack, and 2) it cannot resolve sub-system level responses. To overcome these limitations, we are developing an agent-based water resources model that includes the systems of hydrology, climate, demographics, economics, and policy, to examine water demand during normal and extraordinary conditions. Agent-based modeling (ABM) develops functional relationships between systems by modeling the interaction between individuals (agents), who behave according to a probabilistic set of rules. ABM is a "bottom-up" modeling approach in that it defines macro-system behavior by modeling the micro-behavior of individual agents. While each agent's behavior is often simple and predictable, the aggregate behavior of all agents in each system can be complex, unpredictable, and different than behaviors observed in mean-behavior models. Furthermore, the ABM approach creates a virtual laboratory where the effects of policy changes and/or extraordinary events can be simulated. Our model, which is based on the demographics and hydrology of the Middle Rio Grande Basin in the state of New Mexico, includes agent groups of residential, agricultural, and industrial users. Each agent within each group determines its water usage

  8. CROWDSOURCING BASED 3D MODELING

    Directory of Open Access Journals (Sweden)

    A. Somogyi

    2016-06-01

    Full Text Available Web-based photo albums that support organizing and viewing users’ images are widely used. These services provide a convenient solution for storing, editing and sharing images. In many cases, users attach geotags to the images in order to enable using them, e.g., in location-based applications on social networks. Our paper discusses a procedure that collects open access images from a site frequently visited by tourists. Geotagged pictures showing a sight or tourist attraction are selected and processed in photogrammetric processing software that produces a 3D model of the captured object. For the particular investigation we selected three attractions in Budapest. To assess the geometrical accuracy, we used laser scanning as well as DSLR and smart phone photography to derive reference values for verifying the spatial model obtained from the web-album images. The investigation shows how detailed and accurate models can be derived applying photogrammetric processing software, simply by using images from the community, without visiting the site.

  9. A Collective Case Study of Secondary Students' Model-Based Inquiry on Natural Selection through Programming in an Agent-Based Modeling Environment

    Science.gov (United States)

    Xiang, Lin

    2011-01-01

    This is a collective case study seeking to develop detailed descriptions of how programming an agent-based simulation influences a group of 8th grade students' model-based inquiry (MBI) by examining students' agent-based programmable modeling (ABPM) processes and the learning outcomes. The context of the present study was a biology unit on…

  10. Nonlinear system modeling based on bilinear Laguerre orthonormal bases.

    Science.gov (United States)

    Garna, Tarek; Bouzrara, Kais; Ragot, José; Messaoud, Hassani

    2013-05-01

    This paper proposes a new representation of the discrete bilinear model by developing its coefficients associated with the input, the output and the crossed product on three independent Laguerre orthonormal bases. Compared to the classical bilinear model, the resulting model, entitled the bilinear-Laguerre model, ensures a significant reduction in the number of parameters as well as a simple recursive representation. However, such reduction is still constrained by an optimal choice of the Laguerre pole characterizing each basis. To this end, we develop a pole optimization algorithm which constitutes an extension of that proposed by Tanguy et al. The bilinear-Laguerre model as well as the proposed pole optimization algorithm are illustrated and tested on numerical simulations and validated on a Continuous Stirred Tank Reactor (CSTR) system. Copyright © 2012 ISA. Published by Elsevier Ltd. All rights reserved.
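    The Laguerre orthonormal bases underlying such models can be generated recursively: the first function is the impulse response of sqrt(1 - a^2)/(1 - a z^-1), and each subsequent one is obtained by passing the previous function through the all-pass section (z^-1 - a)/(1 - a z^-1). The sketch below shows only this standard basis construction (the pole value is arbitrary), not the paper's bilinear model or its pole optimization.

```python
import math

def laguerre_basis(a, n_funcs, n_samples):
    """Discrete-time orthonormal Laguerre functions with pole a, |a| < 1."""
    # first function: impulse response of sqrt(1 - a^2) / (1 - a z^-1)
    first = [math.sqrt(1.0 - a * a) * a ** k for k in range(n_samples)]
    basis = [first]
    for _ in range(1, n_funcs):
        prev, nxt = basis[-1], [0.0] * n_samples
        # pass through the all-pass section (z^-1 - a)/(1 - a z^-1):
        #   y[k] = a*y[k-1] + x[k-1] - a*x[k]
        for k in range(n_samples):
            x_prev = prev[k - 1] if k else 0.0
            y_prev = nxt[k - 1] if k else 0.0
            nxt[k] = a * y_prev + x_prev - a * prev[k]
        basis.append(nxt)
    return basis
```

    Truncated at enough samples, the functions are orthonormal to numerical precision, which is what makes expanding the model coefficients on these bases well conditioned.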

  11. Modeling base excision repair in Escherichia coli bacterial cells

    International Nuclear Information System (INIS)

    Belov, O.V.

    2011-01-01

    A model describing the key processes in Escherichia coli bacterial cells during base excision repair is developed. The mechanism of damaged base elimination is modeled, involving formamidopyrimidine DNA glycosylase (the Fpg protein), which possesses several types of activities. The modeling of the transitions between DNA states is based on a stochastic approach to the description of chemical reactions
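    A stochastic treatment of transitions between DNA states can be illustrated with a Gillespie-style simulation of a two-step repair chain (damaged base -> excised site -> repaired). The reaction structure is a simplification and the rate constants are invented; this stands in for the approach described, not for the paper's actual kinetic scheme.

```python
import random

def simulate_repair(n_damaged=50, k_excise=1.0, k_fill=2.0, seed=3):
    """Gillespie simulation of: damaged --k_excise--> excised --k_fill--> repaired."""
    rng = random.Random(seed)
    damaged, excised, repaired = n_damaged, 0, 0
    t = 0.0
    while damaged or excised:
        rates = [k_excise * damaged, k_fill * excised]
        total = sum(rates)
        t += rng.expovariate(total)               # waiting time to the next reaction
        if rng.random() < rates[0] / total:       # glycosylase excises a damaged base
            damaged, excised = damaged - 1, excised + 1
        else:                                     # gap is filled and sealed
            excised, repaired = excised - 1, repaired + 1
    return t, repaired
```

    Every lesion is eventually repaired; what the stochastic approach adds over a deterministic rate equation is the distribution of completion times and intermediate-state occupancies across runs.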

  12. The fractional volatility model: An agent-based interpretation

    Science.gov (United States)

    Vilela Mendes, R.

    2008-06-01

    Based on the criteria of mathematical simplicity and consistency with empirical market data, a model with volatility driven by fractional noise has been constructed which provides a fairly accurate mathematical parametrization of the data. Here, some features of the model are reviewed and extended to account for leverage effects. Using agent-based models, one tries to find which agent strategies and/or properties of the financial institutions might be responsible for the features of the fractional volatility model.

  13. Néron models and base change

    CERN Document Server

    Halle, Lars Halvard

    2016-01-01

    Presenting the first systematic treatment of the behavior of Néron models under ramified base change, this book can be read as an introduction to various subtle invariants and constructions related to Néron models of semi-abelian varieties, motivated by concrete research problems and complemented with explicit examples. Néron models of abelian and semi-abelian varieties have become an indispensable tool in algebraic and arithmetic geometry since Néron introduced them in his seminal 1964 paper. Applications range from the theory of heights in Diophantine geometry to Hodge theory. We focus specifically on Néron component groups, Edixhoven’s filtration and the base change conductor of Chai and Yu, and we study these invariants using various techniques such as models of curves, sheaves on Grothendieck sites and non-archimedean uniformization. We then apply our results to the study of motivic zeta functions of abelian varieties. The final chapter contains a list of challenging open questions. This book is a...

  14. Inference-based procedural modeling of solids

    KAUST Repository

    Biggers, Keith

    2011-11-01

    As virtual environments become larger and more complex, there is an increasing need for more automated construction algorithms to support the development process. We present an approach for modeling solids by combining prior examples with a simple sketch. Our algorithm uses an inference-based approach to incrementally fit patches together in a consistent fashion to define the boundary of an object. This algorithm samples and extracts surface patches from input models, and develops a Petri net structure that describes the relationship between patches along an imposed parameterization. Then, given a new parameterized line or curve, we use the Petri net to logically fit patches together in a manner consistent with the input model. This allows us to easily construct objects of varying sizes and configurations using arbitrary articulation, repetition, and interchanging of parts. The result of our process is a solid model representation of the constructed object that can be integrated into a simulation-based environment. © 2011 Elsevier Ltd. All rights reserved.

  15. A Coupled Simulation Architecture for Agent-Based/Geohydrological Modelling

    Science.gov (United States)

    Jaxa-Rozen, M.

    2016-12-01

    The quantitative modelling of social-ecological systems can provide useful insights into the interplay between social and environmental processes, and their impact on emergent system dynamics. However, such models should acknowledge the complexity and uncertainty of both of the underlying subsystems. For instance, the agent-based models which are increasingly popular for groundwater management studies can be made more useful by directly accounting for the hydrological processes which drive environmental outcomes. Conversely, conventional environmental models can benefit from an agent-based depiction of the feedbacks and heuristics which influence the decisions of groundwater users. From this perspective, this work describes a Python-based software architecture which couples the popular NetLogo agent-based platform with the MODFLOW/SEAWAT geohydrological modelling environment. This approach enables users to implement agent-based models in NetLogo's user-friendly platform, while benefiting from the full capabilities of MODFLOW/SEAWAT packages or reusing existing geohydrological models. The software architecture is based on the pyNetLogo connector, which provides an interface between the NetLogo agent-based modelling software and the Python programming language. This functionality is then extended and combined with Python's object-oriented features, to design a simulation architecture which couples NetLogo with MODFLOW/SEAWAT through the FloPy library (Bakker et al., 2016). The Python programming language also provides access to a range of external packages which can be used for testing and analysing the coupled models, which is illustrated for an application of Aquifer Thermal Energy Storage (ATES).
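    The coupling pattern described above can be shown schematically. In the sketch below, stub classes stand in for the two sides (NetLogo agents via pyNetLogo, and MODFLOW/SEAWAT via FloPy); the real library calls are replaced by plain methods so that only the alternating control flow, agents react to heads, the aquifer reacts to pumping, is demonstrated. All numbers and heuristics are invented.

```python
class AgentModelStub:
    """Stands in for a NetLogo model: agents decide how much water to pump."""
    def __init__(self, n_agents):
        self.demands = [1.0] * n_agents

    def step(self, water_levels):
        # agents cut demand where the aquifer is drawn down (toy heuristic)
        self.demands = [d * (0.5 if h < 0.2 else 1.0)
                        for d, h in zip(self.demands, water_levels)]
        return self.demands


class GroundwaterModelStub:
    """Stands in for the geohydrological model: pumping lowers heads, recharge restores them."""
    def __init__(self, n_cells, recharge=0.3):
        self.heads = [1.0] * n_cells
        self.recharge = recharge

    def step(self, pumping):
        self.heads = [max(0.0, h - q * 0.4 + self.recharge)
                      for h, q in zip(self.heads, pumping)]
        return self.heads


def run_coupled(n=4, steps=10):
    """Alternate the two models, exchanging heads and pumping each timestep."""
    agents, aquifer = AgentModelStub(n), GroundwaterModelStub(n)
    heads = aquifer.heads
    for _ in range(steps):
        pumping = agents.step(heads)   # social side reacts to the environment
        heads = aquifer.step(pumping)  # environment reacts to the social side
    return heads, agents.demands
```

    In the real architecture, `AgentModelStub.step` would be a NetLogo `go` command plus a reporter call through pyNetLogo, and `GroundwaterModelStub.step` a MODFLOW/SEAWAT stress-period run through FloPy, with Python orchestrating the exchange exactly as in this loop.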

  16. Model-based Clustering of High-Dimensional Data in Astrophysics

    Science.gov (United States)

    Bouveyron, C.

    2016-05-01

    The nature of data in Astrophysics has changed, as in other scientific fields, in the past decades due to the increase of measurement capabilities. As a consequence, data are nowadays frequently of high dimensionality and available in mass or as streams. Model-based techniques for clustering are popular tools which are renowned for their probabilistic foundations and their flexibility. However, classical model-based techniques show disappointing behavior in high-dimensional spaces, which is mainly due to their dramatic over-parametrization. Recent developments in model-based classification overcome these drawbacks and make it possible to efficiently classify high-dimensional data, even in the "small n / large p" situation. This work presents a comprehensive review of these recent approaches, including regularization-based techniques, parsimonious modeling, subspace classification methods and classification methods based on variable selection. The use of these model-based methods is also illustrated on real-world classification problems in Astrophysics using R packages.
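    The probabilistic core of model-based clustering, fitting a finite mixture by EM and assigning points to the component that best explains them, can be shown in one dimension before any of the high-dimensional machinery is needed. This toy two-component EM with equal weights and unit variances is our own illustration; the reviewed methods generalize exactly this scheme with constrained or subspace covariance models.

```python
import math
import random

def em_gmm(data, iters=50):
    """Fit the means of a two-component, equal-weight, unit-variance 1-D GMM by EM."""
    mu = [min(data), max(data)]                 # crude but effective initialization
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        resp = []
        for x in data:
            w = [math.exp(-0.5 * (x - m) ** 2) for m in mu]
            s = sum(w)
            resp.append([wi / s for wi in w])
        # M-step: responsibility-weighted means
        for j in range(2):
            den = sum(r[j] for r in resp)
            mu[j] = sum(r[j] * x for r, x in zip(resp, data)) / den
    return sorted(mu)

random.seed(4)
data = ([random.gauss(-3.0, 1.0) for _ in range(100)]
        + [random.gauss(3.0, 1.0) for _ in range(100)])
centers = em_gmm(data)
```

    With well-separated components the estimated means land near the true centers; the over-parametrization problem the review addresses arises when the covariance of each component must itself be estimated in hundreds of dimensions.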

  17. Fuzzy rule-based model for hydropower reservoirs operation

    Energy Technology Data Exchange (ETDEWEB)

    Moeini, R.; Afshar, A.; Afshar, M.H. [School of Civil Engineering, Iran University of Science and Technology, Tehran (Iran, Islamic Republic of)

    2011-02-15

    Real-time hydropower reservoir operation is a continuous decision-making process of determining the water level of a reservoir or the volume of water released from it. Hydropower operation is usually based on operating policies and rules defined and decided upon in strategic planning. This paper presents a fuzzy rule-based model for the operation of hydropower reservoirs. The proposed fuzzy rule-based model presents a set of suitable operating rules for release from the reservoir based on ideal or target storage levels. The model operates on an 'if-then' principle, in which the 'if' is a vector of fuzzy premises and the 'then' is a vector of fuzzy consequences. In this paper, reservoir storage, inflow, and period are used as premises and the release as the consequence. The steps involved in the development of the model include construction of membership functions for the inflow, storage and release, formulation of fuzzy rules, implication, aggregation and defuzzification. The required knowledge base for the formulation of the fuzzy rules is obtained from a stochastic dynamic programming (SDP) model with a steady-state policy. The proposed model is applied to the hydropower operation of the "Dez" reservoir in Iran and the results are presented and compared with those of the SDP model. The results indicate the ability of the method to solve hydropower reservoir operation problems. (author)
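    The if-then machinery can be sketched compactly: triangular membership functions grade the premises, each rule fires with the minimum of its premise grades, and a weighted average defuzzifies the result into a release fraction. The membership bounds, the four rules, and the normalized 0..1 scales are illustrative assumptions, not the paper's calibrated "Dez" model (which also uses the period as a premise).

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def infer_release(storage, inflow):
    """Weighted-average defuzzification over four toy if-then rules."""
    low_s, high_s = tri(storage, -0.5, 0.0, 1.0), tri(storage, 0.0, 1.0, 1.5)
    low_i, high_i = tri(inflow, -0.5, 0.0, 1.0), tri(inflow, 0.0, 1.0, 1.5)
    # e.g. "if storage is high and inflow is high then release is large"
    rules = [
        (min(high_s, high_i), 0.9),   # large release
        (min(high_s, low_i), 0.6),
        (min(low_s, high_i), 0.4),
        (min(low_s, low_i), 0.1),     # small release
    ]
    num = sum(w * r for w, r in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0
```

    A full reservoir under high inflow yields a large release and an empty reservoir under low inflow a small one, with smooth interpolation in between; this interpolation between rules is what the fuzzy formulation adds over a crisp lookup table.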

  18. A General Attribute and Rule Based Role-Based Access Control Model

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Growing numbers of users, and the many access control policies that involve many different resource attributes in service-oriented environments, bring various problems in protecting resources. This paper analyzes the relationships of resource attributes to user attributes across all policies, and proposes a general attribute and rule based role-based access control (GAR-RBAC) model to meet these security needs. The model can dynamically assign users to roles via rules to accommodate growing numbers of users. These rules use attribute expressions and permissions as part of the authorization constraints, and are defined by analyzing the relations of resource attributes to user attributes in the many access policies defined by the enterprise. The model is a general access control model that can support many access control policies, and can also be applied more widely to services. The paper also describes how to use the GAR-RBAC model in Web service environments.
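    The dynamic rule-based user-to-role assignment can be sketched minimally: rules match user attributes and grant roles, and permissions attach to roles as in ordinary RBAC. The attribute names, roles, and permissions below are invented examples in the spirit of GAR-RBAC, not the paper's formal model.

```python
# Each rule: (attribute constraints that must all match, role granted).
RULES = [
    ({"department": "engineering", "level": "senior"}, "project_admin"),
    ({"department": "engineering"}, "developer"),
    ({"department": "finance"}, "auditor"),
]
ROLE_PERMISSIONS = {
    "project_admin": {"read", "write", "manage"},
    "developer": {"read", "write"},
    "auditor": {"read"},
}

def assign_roles(user_attrs):
    """A rule grants its role when every attribute constraint matches the user."""
    return {role for cond, role in RULES
            if all(user_attrs.get(k) == v for k, v in cond.items())}

def permissions(user_attrs):
    """Union of permissions over all dynamically assigned roles."""
    perms = set()
    for role in assign_roles(user_attrs):
        perms |= ROLE_PERMISSIONS[role]
    return perms
```

    Because roles are derived from attributes at evaluation time, adding a user never requires editing role memberships by hand, which is the scaling point the paper makes.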

  19. Improving Agent Based Modeling of Critical Incidents

    Directory of Open Access Journals (Sweden)

    Robert Till

    2010-04-01

    Full Text Available Agent Based Modeling (ABM) is a powerful method that has been used to simulate potential critical incidents in the infrastructure and built environments. This paper will discuss the modeling of some critical incidents currently simulated using ABM and how they may be expanded and improved by using better physiological modeling, psychological modeling, modeling the actions of interveners, introducing Geographic Information Systems (GIS) and open source models.

  20. Model-based testing for software safety

    NARCIS (Netherlands)

    Gurbuz, Havva Gulay; Tekinerdogan, Bedir

    2017-01-01

    Testing safety-critical systems is crucial since a failure or malfunction may result in death or serious injuries to people, equipment, or environment. An important challenge in testing is the derivation of test cases that can identify the potential faults. Model-based testing adopts models of a

  1. A standard protocol for describing individual-based and agent-based models

    Science.gov (United States)

    Grimm, Volker; Berger, Uta; Bastiansen, Finn; Eliassen, Sigrunn; Ginot, Vincent; Giske, Jarl; Goss-Custard, John; Grand, Tamara; Heinz, Simone K.; Huse, Geir; Huth, Andreas; Jepsen, Jane U.; Jorgensen, Christian; Mooij, Wolf M.; Muller, Birgit; Pe'er, Guy; Piou, Cyril; Railsback, Steven F.; Robbins, Andrew M.; Robbins, Martha M.; Rossmanith, Eva; Ruger, Nadja; Strand, Espen; Souissi, Sami; Stillman, Richard A.; Vabo, Rune; Visser, Ute; DeAngelis, Donald L.

    2006-01-01

    Simulation models that describe autonomous individual organisms (individual based models, IBM) or agents (agent-based models, ABM) have become a widely used tool, not only in ecology, but also in many other disciplines dealing with complex systems made up of autonomous entities. However, there is no standard protocol for describing such simulation models, which can make them difficult to understand and to duplicate. This paper presents a proposed standard protocol, ODD, for describing IBMs and ABMs, developed and tested by 28 modellers who cover a wide range of fields within ecology. This protocol consists of three blocks (Overview, Design concepts, and Details), which are subdivided into seven elements: Purpose, State variables and scales, Process overview and scheduling, Design concepts, Initialization, Input, and Submodels. We explain which aspects of a model should be described in each element, and we present an example to illustrate the protocol in use. In addition, 19 examples are available in an Online Appendix. We consider ODD as a first step for establishing a more detailed common format of the description of IBMs and ABMs. Once initiated, the protocol will hopefully evolve as it becomes used by a sufficiently large proportion of modellers.

  2. FADD Expression as a Prognosticator in Early-Stage Glottic Squamous Cell Carcinoma of the Larynx Treated Primarily With Radiotherapy

    International Nuclear Information System (INIS)

    Schrijvers, Michiel L.; Pattje, Wouter J.; Slagter-Menkema, Lorian; Mastik, Mirjam F.; Gibcus, Johan H.; Langendijk, Johannes A.; Wal, Jacqueline E. van der; Laan, Bernard F.A.M. van der; Schuuring, E.

    2012-01-01

    Purpose: We recently reported on the identification of the Fas-associated death domain (FADD) as a possible driver of the chromosome 11q13 amplicon and the association between increased FADD expression and disease-specific survival in advanced-stage laryngeal carcinoma. The aim of this study was to examine whether expression of FADD and its Ser194-phosphorylated isoform (pFADD) predicts local control in patients with early-stage glottic carcinoma primarily treated with radiotherapy only. Methods and Materials: Immunohistochemical staining for FADD and pFADD was performed on pretreatment biopsy specimens of 92 patients with T1–T2 glottic squamous cell carcinoma primarily treated with radiotherapy between 1996 and 2005. Cox regression analysis was used to correlate expression levels with local control. Results: High levels of pFADD were associated with significantly better local control (hazard ratio, 2.40; 95% confidence interval, 1.04–5.55; p = 0.040). FADD overexpression showed a trend toward better local control (hazard ratio, 3.656; 95% confidence interval, 0.853–15.663; p = 0.081). Multivariate Cox regression analysis showed that high pFADD expression was the best predictor of local control after radiotherapy. Conclusions: This study showed that expression of phosphorylated FADD is a new prognostic biomarker for better local control after radiotherapy in patients with early-stage glottic carcinomas.

  3. A subchannel based annular flow dryout model

    International Nuclear Information System (INIS)

    Hammouda, Najmeddine; Cheng, Zhong; Rao, Yanfei F.

    2016-01-01

    Highlights: • A modified annular flow dryout model for subchannel thermalhydraulic analysis. • Implementation of the model in the Canadian subchannel code ASSERT-PV. • Assessment of the model against tube CHF experiments. • Assessment of the model against CANDU-bundle CHF experiments. - Abstract: This paper assesses a popular tube-based mechanistic critical heat flux model, Hewitt and Govan’s annular flow model (based on the model of Whalley et al.), and modifies and implements the model for bundle geometries. It describes the results of ASSERT subchannel code predictions using the modified model, as applied to a single tube and to the 28-element, 37-element and 43-element (CANFLEX) CANDU bundles. A quantitative comparison between the model predictions and experimental data indicates good agreement for a wide range of flow conditions. The comparison resulted in an overall average error of −0.15% and an overall root-mean-square (RMS) error of 5.46% with tube data representing annular film dryout type critical heat flux, and in an overall average error of −0.9% and an overall RMS error of 9.9% with Stern Laboratories’ CANDU-bundle data.

  4. Opinion dynamics model based on quantum formalism

    Energy Technology Data Exchange (ETDEWEB)

    Artawan, I. Nengah, E-mail: nengahartawan@gmail.com [Theoretical Physics Division, Department of Physics, Udayana University (Indonesia); Trisnawati, N. L. P., E-mail: nlptrisnawati@gmail.com [Biophysics, Department of Physics, Udayana University (Indonesia)

    2016-03-11

    An opinion dynamics model based on quantum formalism is proposed. The core of the quantum formalism is the spin-1/2 dynamics system. In this research the implicit time evolution operators are derived. The analogy between this model and the Deffuant and Sznajd models is discussed.

  5. The Relevance of Shopper Logistics for Consumers of Store-Based Retail Formats

    DEFF Research Database (Denmark)

    Teller, Christoph; Kotzab, Herbert; Grant, David B.

    2012-01-01

    This paper discusses and empirically evaluates the relevance of shopping-related logistics for consumers of store-based retail formats. Based on a literature review, a conceptual model was developed and subsequently tested using a survey of more than six hundred consumers in the grocery retail sector. Respondents were those primarily responsible for grocery shopping in their households, located in a highly concentrated European urban retail market. Variance-based structural equation modelling reveals that shopper logistics has a major impact on the convenience of store-based shopping and partly influences consumers' perceptions of shopping-related costs. Nevertheless, shopper logistics does not affect consumer behaviour in terms of the share of visits to a store. These results are moderated by age, hedonic shopping orientation, shopping frequency, average spending per trip and store format.

  6. Model-based testing for embedded systems

    CERN Document Server

    Zander, Justyna; Mosterman, Pieter J

    2011-01-01

    What the experts have to say about Model-Based Testing for Embedded Systems: "This book is exactly what is needed at the exact right time in this fast-growing area. From its beginnings over 10 years ago of deriving tests from UML statecharts, model-based testing has matured into a topic with both breadth and depth. Testing embedded systems is a natural application of MBT, and this book hits the nail exactly on the head. Numerous topics are presented clearly, thoroughly, and concisely in this cutting-edge book. The authors are world-class leading experts in this area and teach us well-used

  7. Research on Single Base-Station Distance Estimation Algorithm in Quasi-GPS Ultrasonic Location System

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, X C; Su, S J; Wang, Y K; Du, J B [Instrument Department, College of Mechatronics Engineering and Automation, National University of Defense Technology, ChangSha, Hunan, 410073 (China)

    2006-10-15

    In order to identify each base-station in a quasi-GPS ultrasonic location system, a unique pseudo-random code is assigned to each base-station. This article primarily studies the distance estimation problem between an Autonomous Guide Vehicle (AGV) and a single base-station, and establishes the ultrasonic spread-spectrum distance measurement Time Delay Estimation (TDE) model. Based on this model, an envelope correlation fast TDE algorithm based on the FFT is presented and analyzed. Experiments show that when the m-sequence in the received signal is the same as that of the reference signal, the envelope correlation function produced by the above algorithm exhibits a sharp correlation peak; otherwise, no prominent correlation value appears. The AGV can therefore identify each base-station easily.
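
    The sharp-peak property that makes code identification work can be illustrated with a small sketch: a pure-Python LFSR generates ±1 m-sequences, and direct circular correlation stands in for the paper's FFT-based envelope correlation. The tap choices and sequence length below are illustrative, not those of the actual system.

```python
def lfsr_mseq(taps, n):
    """Generate a +/-1 maximum-length sequence (period 2**n - 1) from an LFSR.

    `taps` are 1-indexed feedback positions; they must correspond to a
    primitive polynomial for the sequence to be maximum-length."""
    state = [1] * n
    seq = []
    for _ in range(2 ** n - 1):
        seq.append(1 if state[-1] else -1)
        fb = 0
        for t in taps:
            fb ^= state[t - 1]
        state = [fb] + state[:-1]
    return seq

def circ_corr(a, b):
    """Circular cross-correlation of two equal-length sequences."""
    n = len(a)
    return [sum(a[i] * b[(i + k) % n] for i in range(n)) for k in range(n)]

ref = lfsr_mseq([5, 2], 5)          # code assigned to "our" base-station
other = lfsr_mseq([5, 4, 3, 2], 5)  # a different station's m-sequence

auto = circ_corr(ref, ref)          # matching code: one sharp peak
cross = circ_corr(ref, other)       # mismatched code: no prominent value
print(auto[0], max(abs(v) for v in cross))
```

    The autocorrelation of a period-31 m-sequence is 31 at zero lag and exactly −1 everywhere else, which is what lets a matched receiver distinguish its base-station from the rest.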

  8. Research on Single Base-Station Distance Estimation Algorithm in Quasi-GPS Ultrasonic Location System

    International Nuclear Information System (INIS)

    Cheng, X C; Su, S J; Wang, Y K; Du, J B

    2006-01-01

    In order to identify each base-station in a quasi-GPS ultrasonic location system, a unique pseudo-random code is assigned to each base-station. This article primarily studies the distance estimation problem between an Autonomous Guide Vehicle (AGV) and a single base-station, and establishes the ultrasonic spread-spectrum distance measurement Time Delay Estimation (TDE) model. Based on this model, an envelope correlation fast TDE algorithm based on the FFT is presented and analyzed. Experiments show that when the m-sequence in the received signal is the same as that of the reference signal, the envelope correlation function produced by the above algorithm exhibits a sharp correlation peak; otherwise, no prominent correlation value appears. The AGV can therefore identify each base-station easily.

  9. Intelligent model-based diagnostics for vehicle health management

    Science.gov (United States)

    Luo, Jianhui; Tu, Fang; Azam, Mohammad S.; Pattipati, Krishna R.; Willett, Peter K.; Qiao, Liu; Kawamoto, Masayuki

    2003-08-01

    The recent advances in sensor technology, remote communication and computational capabilities, and standardized hardware/software interfaces are creating a dramatic shift in the way the health of vehicles is monitored and managed. These advances facilitate remote monitoring, diagnosis and condition-based maintenance of automotive systems. With the increased sophistication of electronic control systems in vehicles, there is a concomitant increased difficulty in the identification of the malfunction phenomena. Consequently, the current rule-based diagnostic systems are difficult to develop, validate and maintain. New intelligent model-based diagnostic methodologies that exploit the advances in sensor, telecommunications, computing and software technologies are needed. In this paper, we will investigate hybrid model-based techniques that seamlessly employ quantitative (analytical) models and graph-based dependency models for intelligent diagnosis. Automotive engineers have found quantitative simulation (e.g. MATLAB/SIMULINK) to be a vital tool in the development of advanced control systems. The hybrid method exploits this capability to improve the diagnostic system's accuracy and consistency, utilizes existing validated knowledge on rule-based methods, enables remote diagnosis, and responds to the challenges of increased system complexity. The solution is generic and has the potential for application in a wide range of systems.
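
    A tiny sketch of the graph-based dependency half of such a hybrid scheme: each fault maps to the set of tests it causes to fail, and diagnosis selects the faults whose signature matches the observed pass/fail outcomes. The fault and test names below are invented for illustration, assuming a single-fault hypothesis.

```python
def isolate(dep, outcomes):
    """Return faults whose failure signature matches the observed test outcomes.

    dep[fault] is the set of tests that the fault causes to fail
    (the dependency-model view used for fault isolation)."""
    failed = {t for t, ok in outcomes.items() if not ok}
    return [f for f, tests in dep.items() if tests == failed]

# Hypothetical single-fault dependency model (names invented for illustration)
dep = {
    "sensor_drift": {"t1"},
    "pump_leak": {"t1", "t2"},
    "valve_stuck": {"t2", "t3"},
}
diagnosis = isolate(dep, {"t1": False, "t2": False, "t3": True})
print(diagnosis)
```

    In the hybrid approach described above, the quantitative simulation models would supply the test outcomes (e.g. residual threshold checks), and the dependency model would perform the isolation step shown here.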

  10. Models for the field-based toxicity of copper and zinc salts to wheat in 11 Australian soils and comparison to laboratory-based models

    International Nuclear Information System (INIS)

    Warne, Michael St.J.; Heemsbergen, Diane; McLaughlin, Mike; Bell, Mike; Broos, Kris; Whatmuff, Mark; Barry, Glenn; Nash, David; Pritchard, Deb; Penney, Nancy

    2008-01-01

    Laboratory-based relationships that model the phytotoxicity of metals using soil properties have been developed. This paper presents the first field-based phytotoxicity relationships. Wheat (Triticum aestivum L.) was grown at 11 Australian field sites at which soil was spiked with copper (Cu) and zinc (Zn) salts. Toxicity was measured as inhibition of plant growth at 8 weeks and grain yield at harvest. The added Cu and Zn EC10 values for both endpoints ranged from approximately 3 to 4760 mg/kg. There were no relationships between field-based 8-week biomass and grain yield toxicity values for either metal. Cu toxicity was best modelled using pH and organic carbon content while Zn toxicity was best modelled using pH and the cation exchange capacity. The best relationships estimated toxicity within a factor of two of measured values. Laboratory-based phytotoxicity relationships could not accurately predict field-based phytotoxicity responses. - Field-based toxicity of Cu and Zn to wheat can be modelled using soil properties. Laboratory-based models should not be used to estimate toxicity in the field

  11. Model based energy benchmarking for glass furnace

    International Nuclear Information System (INIS)

    Sardeshpande, Vishal; Gaitonde, U.N.; Banerjee, Rangan

    2007-01-01

    Energy benchmarking of processes is important for setting energy efficiency targets and planning energy management strategies. Most approaches used for energy benchmarking are based on statistical methods that compare a plant with a sample of existing plants. This paper presents a model based approach for benchmarking of energy intensive industrial processes and illustrates this approach for industrial glass furnaces. A simulation model for a glass furnace is developed using mass and energy balances, heat loss equations for the different zones, and empirical equations based on operating practices. The model is checked with field data from end-fired industrial glass furnaces in India. The simulation model enables calculation of the energy performance of a given furnace design. The model results show the potential for improvement and the impact of different operating and design preferences on specific energy consumption. A case study for a 100 TPD end-fired furnace is presented. An achievable minimum energy consumption of about 3830 kJ/kg is estimated for this furnace. The useful heat carried by glass is about 53% of the heat supplied by the fuel. Actual furnaces operating at these production scales have a potential for reduction in energy consumption of about 20-25%
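
    The specific-energy arithmetic behind such benchmarking is simple unit conversion; the sketch below turns fuel power and pull rate into kJ/kg, with the fuel power chosen hypothetically so that the result lands near the paper's ~3830 kJ/kg figure.

```python
def specific_energy(q_fuel_kw, pull_tpd):
    """Specific energy consumption (kJ/kg) from fuel power and glass pull rate."""
    kg_per_s = pull_tpd * 1000.0 / 86400.0   # tonnes/day -> kg/s
    return q_fuel_kw / kg_per_s              # kW / (kg/s) = kJ/kg

# Hypothetical 100 TPD end-fired furnace firing ~4.43 MW of fuel
sec = specific_energy(4430.0, 100.0)
useful = 0.53 * sec   # ~53% of fuel heat carried by the glass (paper's figure)
print(round(sec), round(useful))
```

    The remaining ~47% would be apportioned among zone-wise wall losses, flue gases and structural losses in the full mass-and-energy-balance model.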

  12. Thermodynamics-based models of transcriptional regulation with gene sequence.

    Science.gov (United States)

    Wang, Shuqiang; Shen, Yanyan; Hu, Jinxing

    2015-12-01

    Quantitative models of gene regulatory activity have the potential to improve our mechanistic understanding of transcriptional regulation. However, the few models available today have been based on simplistic assumptions about the sequences being modeled or heuristic approximations of the underlying regulatory mechanisms. In this work, we have developed a thermodynamics-based model to predict gene expression driven by any DNA sequence. The proposed model relies on a continuous-time, differential-equation description of transcriptional dynamics. The sequence features of the promoter are used to derive the binding affinity, which is computed from statistical molecular thermodynamics. Experimental results show that the proposed model can effectively identify the activity levels of transcription factors and the regulatory parameters. Compared with previous models, the proposed model reveals more biological insight.
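
    As a sketch of the thermodynamic idea (not the paper's actual parameterization), promoter occupancy for a single binding site can be written as a Boltzmann weight and fed into a simple transcription/degradation ODE; all rate constants below are invented for illustration.

```python
import math

def p_bound(tf_conc, dG, beta=1.0):
    """Boltzmann-weighted occupancy of a single TF binding site (two states)."""
    w = tf_conc * math.exp(-beta * dG)
    return w / (1.0 + w)

def simulate_expression(tf_conc, dG, k_txn=2.0, k_deg=0.5, dt=0.01, steps=2000):
    """Euler-integrate dm/dt = k_txn * P_bound - k_deg * m from m(0) = 0."""
    m = 0.0
    for _ in range(steps):
        m += dt * (k_txn * p_bound(tf_conc, dG) - k_deg * m)
    return m

# mRNA level approaches the steady state k_txn * P_bound / k_deg
m_final = simulate_expression(tf_conc=1.0, dG=-1.0)
print(round(m_final, 3))
```

    A stronger site (more negative binding free energy dG) raises the occupancy and hence the steady-state expression, which is the qualitative behaviour such sequence-to-expression models capture.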

  13. Comparing model-based and model-free analysis methods for QUASAR arterial spin labeling perfusion quantification.

    Science.gov (United States)

    Chappell, Michael A; Woolrich, Mark W; Petersen, Esben T; Golay, Xavier; Payne, Stephen J

    2013-05-01

    Amongst the various implementations of arterial spin labeling MRI methods for quantifying cerebral perfusion, the QUASAR method is unique. By using a combination of labeling with and without flow suppression gradients, the QUASAR method offers the separation of macrovascular and tissue signals. This permits local arterial input functions to be defined and "model-free" analysis, using numerical deconvolution, to be used. However, it remains unclear whether arterial spin labeling data are best treated using model-free or model-based analysis. This work provides a critical comparison of these two approaches for QUASAR arterial spin labeling in the healthy brain. An existing two-component (arterial and tissue) model was extended to the mixed flow suppression scheme of QUASAR to provide an optimal model-based analysis. The model-based analysis was extended to incorporate dispersion of the labeled bolus, generally regarded as the major source of discrepancy between the two analysis approaches. Model-free and model-based analyses were compared for perfusion quantification including absolute measurements, uncertainty estimation, and spatial variation in cerebral blood flow estimates. Major sources of discrepancies between model-free and model-based analysis were attributed to the effects of dispersion and the degree to which the two methods can separate macrovascular and tissue signal. Copyright © 2012 Wiley Periodicals, Inc.
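
    A minimal sketch of the model-free idea, assuming noiseless synthetic data: the tissue signal is the arterial input function convolved with a flow-scaled residue function, and forward substitution inverts the lower-triangular convolution system. Real QUASAR analysis uses regularized (SVD-based) deconvolution of noisy data; the AIF and residue values below are made up.

```python
import math

def convolve(a, q, dt):
    """Discrete convolution c[i] = dt * sum_{j<=i} a[i-j] * q[j]."""
    n = len(a)
    return [dt * sum(a[i - j] * q[j] for j in range(i + 1)) for i in range(n)]

def deconvolve(a, c, dt):
    """Invert the lower-triangular convolution system by forward substitution."""
    n = len(c)
    q = [0.0] * n
    for i in range(n):
        s = sum(a[i - j] * q[j] for j in range(i))
        q[i] = (c[i] / dt - s) / a[0]
    return q

dt = 1.0
aif = [1.0, 2.0, 1.5, 0.8, 0.4, 0.2, 0.1, 0.05]   # assumed local arterial input
f_true = 0.6                                       # "true" perfusion (a.u.)
resid = [math.exp(-0.3 * t) for t in range(8)]     # exponential residue function
tissue = convolve(aif, [f_true * r for r in resid], dt)

q = deconvolve(aif, tissue, dt)
print(round(max(q), 6))   # perfusion estimate: peak of the deconvolved response
```

    With noise-free data the peak of the deconvolved response recovers the flow exactly; dispersion of the labeled bolus, as discussed above, distorts the effective AIF and is one reason model-free and model-based estimates diverge in practice.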

  14. Model-based monitoring of rotors with multiple coexisting faults

    International Nuclear Information System (INIS)

    Rossner, Markus

    2015-01-01

    Monitoring systems are applied to many rotors, but only few monitoring systems can separate coexisting errors and identify their quantity. This research project solves this problem using a combination of signal-based and model-based monitoring. The signal-based part performs a pre-selection of possible errors; these errors are further separated with model-based methods. This approach is demonstrated for the errors unbalance, bow, stator-fixed misalignment, rotor-fixed misalignment and roundness errors. For the model-based part, unambiguous error definitions and models are set up. The Ritz approach reduces the model order and therefore speeds up the diagnosis. Identification algorithms are developed for the different rotor faults. Hereto, reliable damage indicators and proper sub steps of the diagnosis have to be defined. For several monitoring problems, measuring both deflection and bearing force is very useful. The monitoring system is verified by experiments on an academic rotor test rig. The interpretation of the measurements requires much knowledge concerning the dynamics of the rotor. Due to the model-based approach, the system can separate errors with similar signal patterns and identify bow and roundness error online at operation speed. [de

  15. Pixel-based meshfree modelling of skeletal muscles

    OpenAIRE

    Chen, Jiun-Shyan; Basava, Ramya Rao; Zhang, Yantao; Csapo, Robert; Malis, Vadim; Sinha, Usha; Hodgson, John; Sinha, Shantanu

    2015-01-01

    This paper introduces the meshfree Reproducing Kernel Particle Method (RKPM) for 3D image-based modeling of skeletal muscles. This approach allows for construction of simulation model based on pixel data obtained from medical images. The material properties and muscle fiber direction obtained from Diffusion Tensor Imaging (DTI) are input at each pixel point. The reproducing kernel (RK) approximation allows a representation of material heterogeneity with smooth transition. A ...

  16. A CSP-Based Agent Modeling Framework for the Cougaar Agent-Based Architecture

    Science.gov (United States)

    Gracanin, Denis; Singh, H. Lally; Eltoweissy, Mohamed; Hinchey, Michael G.; Bohner, Shawn A.

    2005-01-01

    Cognitive Agent Architecture (Cougaar) is a Java-based architecture for large-scale distributed agent-based applications. A Cougaar agent is an autonomous software entity with behaviors that represent a real-world entity (e.g., a business process). A Cougaar-based Model Driven Architecture approach, currently under development, uses a description of a system's functionality (requirements) to automatically implement the system in Cougaar. The Communicating Sequential Processes (CSP) formalism is used for the formal validation of the generated system. Two main agent components, a blackboard and a plugin, are modeled as CSP processes. A set of channels represents communication between the blackboard and individual plugins. The blackboard is represented as a CSP process that communicates with every agent in the collection. The developed CSP-based Cougaar modeling framework provides a starting point for a more complete formal verification of the automatically generated Cougaar code. Currently it is used to verify the behavior of an individual agent in terms of CSP properties and to analyze the corresponding Cougaar society.
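
    The blackboard-and-plugins channel structure can be mimicked with blocking queues standing in for CSP channels (a loose analogy: queues are buffered, whereas CSP communication is synchronous); the process names and message shapes below are invented for illustration.

```python
import queue
import threading

def plugin(name, to_bb, from_bb, received):
    """Plugin process: publish one object, then consume blackboard notifications."""
    to_bb.put((name, name + "-obj"))
    for _ in range(2):                 # expect one "added" event per publication
        received.append(from_bb.get())

def blackboard(n_events, inbox, outboxes):
    """Blackboard process: receive publications, notify every subscriber channel."""
    for _ in range(n_events):
        sender, obj = inbox.get()
        for ch in outboxes:
            ch.put(("added", obj))

inbox = queue.Queue()
outboxes = [queue.Queue() for _ in range(2)]
received = [[], []]
threads = [threading.Thread(target=blackboard, args=(2, inbox, outboxes))]
threads += [threading.Thread(target=plugin,
                             args=("p%d" % i, inbox, outboxes[i], received[i]))
            for i in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print([len(r) for r in received])   # each plugin observed both additions
```

    In the CSP model proper, properties such as deadlock freedom of this publish/notify protocol would be checked formally rather than exercised by execution.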

  17. Surrogate-Based Optimization of Biogeochemical Transport Models

    Science.gov (United States)

    Prieß, Malte; Slawig, Thomas

    2010-09-01

    First approaches towards a surrogate-based optimization method for a one-dimensional marine biogeochemical model of NPZD type are presented. The model, developed by Oschlies and Garcon [1], simulates the distribution of nitrogen, phytoplankton, zooplankton and detritus in a water column and is driven by ocean circulation data. A key issue is to minimize the misfit between the model output and given observational data. Our aim is to reduce the overall optimization cost by avoiding expensive function and derivative evaluations, using a surrogate model in place of the high-fidelity model in focus. This becomes particularly important for more complex three-dimensional models. We analyse a coarsening of the discretization of the model equations as one way to create such a surrogate. Here the numerical stability crucially depends upon the discrete stepsize in time and space and on the biochemical terms. We show that for given model parameters the level of grid coarsening can be chosen accordingly, yielding a stable and satisfactory surrogate. As one example of a surrogate-based optimization method, we present results of the Aggressive Space Mapping technique (developed by John W. Bandler [2, 3]) applied to the optimization of this one-dimensional biogeochemical transport model.
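
    A one-dimensional caricature of the space-mapping idea, under stated assumptions: the fine model is an "expensive" quadratic, the surrogate a shifted cheap copy, three fine-model evaluations estimate the misalignment, and the corrected surrogate is optimized instead. Real Aggressive Space Mapping extracts the input mapping iteratively from parameter-extraction subproblems; everything below is illustrative.

```python
def fine(x):
    """'Expensive' fine model (stand-in for the full simulation)."""
    return (x - 2.0) ** 2 + 0.5

def coarse(x):
    """Cheap surrogate, misaligned with the fine model (e.g. a coarser grid)."""
    return (x - 2.3) ** 2 + 0.5

def quad_vertex(f, x0, x1, x2):
    """Argmin of the parabola interpolating f at three sample points."""
    y0, y1, y2 = f(x0), f(x1), f(x2)
    num = (x2**2 - x1**2) * y0 + (x0**2 - x2**2) * y1 + (x1**2 - x0**2) * y2
    den = 2 * ((x2 - x1) * y0 + (x0 - x2) * y1 + (x1 - x0) * y2)
    return num / den

# Three fine-model evaluations estimate the fine optimum; the surrogate is then
# shifted by the observed misalignment (a one-parameter input space mapping).
fine_opt = quad_vertex(fine, 1.8, 2.3, 2.8)
coarse_opt = quad_vertex(coarse, 1.8, 2.3, 2.8)
shift = coarse_opt - fine_opt
corrected_opt = coarse_opt - shift     # argmin of x -> coarse(x + shift)
print(round(corrected_opt, 6))
```

    The point of the construction is cost: the shifted surrogate is optimized as often as needed, while the fine model is evaluated only a handful of times to calibrate the mapping.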

  18. Structural Acoustic Physics Based Modeling of Curved Composite Shells

    Science.gov (United States)

    2017-09-19

    NUWC-NPT Technical Report 12,236, 19 September 2017: Structural Acoustic Physics-Based Modeling of Curved Composite Shells, by Rachel E. Hesse. The purpose of this study was to use physics-based modeling (PBM) to investigate wave propagation through curved shells that are subjected to acoustic excitation.

  19. Agent-Based Modeling in Systems Pharmacology.

    Science.gov (United States)

    Cosgrove, J; Butler, J; Alden, K; Read, M; Kumar, V; Cucurull-Sanchez, L; Timmis, J; Coles, M

    2015-11-01

    Modeling and simulation (M&S) techniques provide a platform for knowledge integration and hypothesis testing to gain insights into biological systems that would not be possible a priori. Agent-based modeling (ABM) is an M&S technique that focuses on describing individual components rather than homogeneous populations. This tutorial introduces ABM to systems pharmacologists, using relevant case studies to highlight how ABM-specific strengths have yielded success in the area of preclinical mechanistic modeling.
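
    To make the individual-component focus concrete, here is a toy ABM sketch (the rule and all parameters are invented, not drawn from any cited study): each agent is a cell that activates stochastically in proportion to a local antigen level, and population behaviour emerges from the per-agent rule.

```python
import random

class Cell:
    """Minimal agent: an individual cell with its own activation state."""
    def __init__(self):
        self.active = False

def step(cells, antigen_level, p_base, rng):
    """Each inactive agent activates with probability p_base * antigen_level."""
    for c in cells:
        if not c.active and rng.random() < p_base * antigen_level:
            c.active = True

rng = random.Random(42)          # fixed seed for reproducibility
cells = [Cell() for _ in range(1000)]
for _ in range(10):
    step(cells, antigen_level=2.0, p_base=0.1, rng=rng)
n_active = sum(c.active for c in cells)
print(n_active)   # roughly 1000 * (1 - 0.8**10), i.e. about 890
```

    Unlike an ODE for the mean activation fraction, the agent-level formulation makes it trivial to give each cell its own history, position or parameters, which is exactly the heterogeneity ABM is used for.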

  20. A pedagogical model for simulation-based learning in healthcare

    Directory of Open Access Journals (Sweden)

    Tuulikki Keskitalo

    2015-11-01

    The aim of this study was to design a pedagogical model for a simulation-based learning environment (SBLE) in healthcare. Currently, simulation and virtual reality are a major focus in healthcare education. However, when and how these learning environments should be applied is not well-known. The present study tries to fill that gap. We pose the following research question: What kind of pedagogical model supports and facilitates students’ meaningful learning in SBLEs? The study used design-based research (DBR) and case study approaches. We report the results from our second case study and how the pedagogical model was developed based on the lessons learned. The study involved nine facilitators and 25 students. Data were collected and analysed using mixed methods. The main result of this study is the refined pedagogical model. The model is based on the socio-cultural theory of learning and characteristics of meaningful learning as well as previous pedagogical models. The model will provide a more holistic and meaningful approach to teaching and learning in SBLEs. However, the model requires evidence and further development.

  1. Development of a materials data base for modeling

    International Nuclear Information System (INIS)

    Iwata, S.; Ashino, T.; Ishino, S.

    1988-01-01

    Materials selection for fusion reactors requires a materials data base and a set of methods to estimate material properties in a "virtual" fusion reactor. This estimation process, namely modeling, is analyzed as a compromise among design requirements, available data bases, and estimation methods, and a concept of an ideal computer system to support this modeling process is proposed. The limitations of a commercial DBMS (Data Base Management System) in handling sophisticated materials data are described on the basis of our experience. Secondly, ways to manipulate analytical expressions are discussed as the next step for computer-assisted modeling. Finally, an advanced method is presented which is able to manage models and data in the same manner, without the user having to attend to rules imposed by the constraints of using computers. (orig.)

  2. Interactive Coherence-Based Façade Modeling

    KAUST Repository

    Musialski, Przemyslaw

    2012-05-01

    We propose a novel interactive framework for modeling building facades from images. Our method is based on the notion of coherence-based editing which allows exploiting partial symmetries across the facade at any level of detail. The proposed workflow mixes manual interaction with automatic splitting and grouping operations based on unsupervised cluster analysis. In contrast to previous work, our approach leads to detailed 3D geometric models with up to several thousand regions per facade. We compare our modeling scheme to others and evaluate our approach in a user study with an experienced user and several novice users.

  3. Vehicle-specific emissions modeling based upon on-road measurements.

    Science.gov (United States)

    Frey, H Christopher; Zhang, Kaishan; Rouphail, Nagui M

    2010-05-01

    Vehicle-specific microscale fuel use and emissions rate models are developed based upon real-world hot-stabilized tailpipe measurements made using a portable emissions measurement system. Consecutive averaging periods of one to three multiples of the response time are used to compare two semiempirical physically based modeling schemes. One scheme is based on internally observable variables (IOVs), such as engine speed and manifold absolute pressure, while the other is based on externally observable variables (EOVs), such as speed, acceleration, and road grade. For NO, HC, and CO emission rates, the average R² ranged from 0.41 to 0.66 for the former and from 0.17 to 0.30 for the latter. The EOV models have R² for CO₂ of 0.43 to 0.79 versus 0.99 for the IOV models. The models are sensitive to episodic events in driving cycles such as high acceleration. Intervehicle and fleet average modeling approaches are compared; the former account for microscale variations that might be useful for some types of assessments. EOV-based models have practical value for traffic management or simulation applications since IOVs usually are not available or not used for emission estimation.
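
    The model comparison above rests on ordinary least squares and the coefficient of determination R²; a minimal sketch with made-up numbers (not the study's data) relates one IOV, manifold absolute pressure, to a fuel-use rate.

```python
def ols_fit(x, y):
    """Univariate least squares y ~ a + b*x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

def r_squared(y, yhat):
    """Coefficient of determination, the metric used to compare the schemes."""
    my = sum(y) / len(y)
    ss_res = sum((yi - yh) ** 2 for yi, yh in zip(y, yhat))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

# Made-up mini data set: fuel-use rate vs. manifold absolute pressure (an IOV)
map_kpa = [30, 45, 60, 75, 90]
fuel_g_s = [0.8, 1.3, 1.9, 2.4, 3.0]
a, b = ols_fit(map_kpa, fuel_g_s)
yhat = [a + b * x for x in map_kpa]
r2 = r_squared(fuel_g_s, yhat)
print(round(r2, 3))
```

    The study's finding, in these terms, is that regressions on IOVs yield systematically higher R² for emission rates than regressions on EOVs, because IOVs sit closer to the combustion process.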

  4. Empirical agent-based modelling challenges and solutions

    CERN Document Server

    Barreteau, Olivier

    2014-01-01

    This instructional book showcases techniques to parameterise human agents in empirical agent-based models (ABM). In doing so, it provides a timely overview of key ABM methodologies and the most innovative approaches through a variety of empirical applications.  It features cutting-edge research from leading academics and practitioners, and will provide a guide for characterising and parameterising human agents in empirical ABM.  In order to facilitate learning, this text shares the valuable experiences of other modellers in particular modelling situations. Very little has been published in the area of empirical ABM, and this contributed volume will appeal to graduate-level students and researchers studying simulation modeling in economics, sociology, ecology, and trans-disciplinary studies, such as topics related to sustainability. In a similar vein to the instruction found in a cookbook, this text provides the empirical modeller with a set of 'recipes'  ready to be implemented. Agent-based modeling (AB...

  5. Model Based Mission Assurance: Emerging Opportunities for Robotic Systems

    Science.gov (United States)

    Evans, John W.; DiVenti, Tony

    2016-01-01

    The emergence of Model Based Systems Engineering (MBSE) in a Model Based Engineering framework has created new opportunities to improve effectiveness and efficiencies across the assurance functions. The MBSE environment supports not only system architecture development, but also Systems Safety, Reliability and Risk Analysis concurrently in the same framework. Linking to detailed design will further improve assurance capabilities to support failure avoidance and mitigation in flight systems. This is also leading to new assurance functions, including model assurance and management of uncertainty in the modeling environment. Further, assurance cases, structured hierarchical arguments or models, are emerging as a basis for supporting a comprehensive viewpoint in which to support Model Based Mission Assurance (MBMA).

  6. [GSH fermentation process modeling using entropy-criterion based RBF neural network model].

    Science.gov (United States)

    Tan, Zuoping; Wang, Shitong; Deng, Zhaohong; Du, Guocheng

    2008-05-01

    The prediction accuracy and generalization of GSH fermentation process models are often deteriorated by noise in the corresponding experimental data. To avoid this problem, we present a novel RBF neural network modeling approach based on an entropy criterion. Compared with traditional MSE-criterion-based parameter learning, it considers the whole distribution structure of the training data set in the parameter learning process, and thus effectively avoids weak generalization and over-learning. The proposed approach is then applied to GSH fermentation process modeling. Our results demonstrate that the proposed method has better prediction accuracy, generalization and robustness, and thus offers potential merit for this application.
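
    For reference, the forward pass of an RBF network is just a weighted sum of Gaussian basis functions. The sketch below uses hand-picked parameters; the paper's entropy-criterion learning would instead shape centres, widths and weights from the training-data distribution.

```python
import math

def rbf_predict(x, centers, widths, weights, bias=0.0):
    """Forward pass of an RBF network: weighted sum of Gaussian basis outputs."""
    return bias + sum(
        w * math.exp(-((x - c) ** 2) / (2.0 * s ** 2))
        for w, c, s in zip(weights, centers, widths)
    )

# Hand-picked toy parameters modelling a bump around x = 5
centers, widths, weights = [4.0, 6.0], [1.0, 1.0], [0.8, 0.8]
y = rbf_predict(5.0, centers, widths, weights)
print(round(y, 3))
```

    The choice of training criterion (entropy vs. MSE) changes only how these parameters are learned, not the network structure shown here.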

  7. A community-based framework for aquatic ecosystem models

    DEFF Research Database (Denmark)

    Trolle, Didde; Hamilton, D. P.; Hipsey, M. R.

    2012-01-01

    Here, we communicate a point of departure in the development of aquatic ecosystem models, namely a new community-based framework, which supports an enhanced and transparent union between the collective expertise that exists in the communities of traditional ecologists and model developers. Through a literature survey, we document the growing importance of numerical aquatic ecosystem models while also noting the difficulties, up until now, of the aquatic scientific community to make significant advances in these models during the past two decades. Through a common forum for aquatic ecosystem modellers we aim to (i) advance collaboration within the aquatic ecosystem modelling community, (ii) enable increased use of models for research, policy and ecosystem-based management, (iii) facilitate a collective framework using common (standardised) code to ensure that model development is incremental, (iv

  8. NHPP-Based Software Reliability Models Using Equilibrium Distribution

    Science.gov (United States)

    Xiao, Xiao; Okamura, Hiroyuki; Dohi, Tadashi

    Non-homogeneous Poisson processes (NHPPs) have gained much popularity in actual software testing phases to estimate the software reliability, the number of remaining faults in software and the software release timing. In this paper, we propose a new modeling approach for the NHPP-based software reliability models (SRMs) to describe the stochastic behavior of software fault-detection processes. The fundamental idea is to apply the equilibrium distribution to the fault-detection time distribution in NHPP-based modeling. We also develop efficient parameter estimation procedures for the proposed NHPP-based SRMs. Through numerical experiments, it can be concluded that the proposed NHPP-based SRMs outperform the existing ones in many data sets from the perspective of goodness-of-fit and prediction performance.
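
    The equilibrium-distribution idea can be checked numerically: for a fault-detection time distribution with survival function S and mean μ, the equilibrium CDF is F_e(t) = (1/μ)∫₀ᵗ S(u) du. A quick sketch (with an arbitrary rate parameter) confirms the textbook fact that the exponential distribution is its own equilibrium distribution.

```python
import math

def eq_cdf(surv, mean, t, n=2000):
    """Equilibrium CDF F_e(t) = (1/mean) * integral_0^t S(u) du, trapezoid rule."""
    h = t / n
    integral = (0.5 * h * (surv(0.0) + surv(t))
                + h * sum(surv(i * h) for i in range(1, n)))
    return integral / mean

# For exponential detection times S(u) = exp(-b*u) with mean 1/b, the
# equilibrium distribution equals the original distribution (memorylessness).
b, t = 0.7, 2.0
fe = eq_cdf(lambda u: math.exp(-b * u), 1.0 / b, t)
exact = 1.0 - math.exp(-b * t)
print(round(fe, 5), round(exact, 5))
```

    For non-exponential detection-time distributions the two differ, and substituting F_e for F in the NHPP mean value function is what yields the new class of SRMs proposed above.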

  9. A knowledge representation meta-model for rule-based modelling of signalling networks

    Directory of Open Access Journals (Sweden)

    Adrien Basso-Blandin

    2016-03-01

    The study of cellular signalling pathways and their deregulation in disease states, such as cancer, is a large and extremely complex task. Indeed, these systems involve many parts and processes but are studied piecewise and their literatures and data are consequently fragmented, distributed and sometimes—at least apparently—inconsistent. This makes it extremely difficult to build significant explanatory models with the result that effects in these systems that are brought about by many interacting factors are poorly understood. The rule-based approach to modelling has shown some promise for the representation of the highly combinatorial systems typically found in signalling where many of the proteins are composed of multiple binding domains, capable of simultaneous interactions, and/or peptide motifs controlled by post-translational modifications. However, the rule-based approach requires highly detailed information about the precise conditions for each and every interaction which is rarely available from any one single source. Rather, these conditions must be painstakingly inferred and curated, by hand, from information contained in many papers—each of which contains only part of the story. In this paper, we introduce a graph-based meta-model, attuned to the representation of cellular signalling networks, which aims to ease this massive cognitive burden on the rule-based curation process. This meta-model is a generalization of that used by Kappa and BNGL which allows for the flexible representation of knowledge at various levels of granularity. In particular, it allows us to deal with information which has either too little, or too much, detail with respect to the strict rule-based meta-model. Our approach provides a basis for the gradual aggregation of fragmented biological knowledge extracted from the literature into an instance of the meta-model from which we can define an automated translation into executable Kappa programs.

  10. Intelligent Transportation and Evacuation Planning A Modeling-Based Approach

    CERN Document Server

    Naser, Arab

    2012-01-01

    Intelligent Transportation and Evacuation Planning: A Modeling-Based Approach provides a new paradigm for evacuation planning strategies and techniques. Recently, evacuation planning and modeling have increasingly attracted interest among researchers as well as government officials. This interest stems from the recent catastrophic hurricanes and weather-related events that occurred in the southeastern United States (Hurricane Katrina and Rita). The evacuation methods that were in place before and during the hurricanes did not work well and resulted in thousands of deaths. This book offers insights into the methods and techniques that allow for implementing mathematical-based, simulation-based, and integrated optimization and simulation-based engineering approaches for evacuation planning. This book also: Comprehensively discusses the application of mathematical models for evacuation and intelligent transportation modeling Covers advanced methodologies in evacuation modeling and planning Discusses principles a...

  11. Development of zircaloy deformation model to describe the zircaloy-4 cladding tube during accidents

    International Nuclear Information System (INIS)

    Raff, S.

    1978-01-01

    The development of a high-temperature deformation model for Zircaloy-4 cans is primarily based on numerous well-parametrized tensile tests that capture the material behaviour, including its statistical variance. It is shown that plastic deformation may be described by a power creep law, whose coefficients show a strong dependence on temperature in the relevant temperature region. These coefficients have been determined. A model based on them has been established which, apart from a best-estimate deformation, gives upper and lower bounds of the possible deformation. The model, derived from isothermal uniaxial tests, is being verified against isothermal and transient tube-burst tests. The influence of pre-oxidation and increased oxygen concentration during deformation is modeled on the basis of the pseudobinary Zircaloy-oxygen phase diagram. (author)
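
    A power creep law with temperature-dependent coefficients is commonly written in the Norton-Arrhenius form ε̇ = A·σⁿ·exp(−Q/RT). The sketch below uses placeholder constants (A, n and Q are illustrative, not the fitted Zircaloy-4 coefficients of this work) to show the strong temperature sensitivity such models encode.

```python
import math

def creep_rate(stress_mpa, temp_k, A=1e-5, n=5.0, Q=250e3, R=8.314):
    """Norton power-law creep rate A * sigma**n * exp(-Q/(R*T)).

    A, n and Q are illustrative placeholders, not fitted Zircaloy-4 constants."""
    return A * stress_mpa ** n * math.exp(-Q / (R * temp_k))

# At fixed stress, the rate rises steeply with temperature (Arrhenius factor)
r1 = creep_rate(50.0, 1000.0)
r2 = creep_rate(50.0, 1100.0)
print(round(r2 / r1, 2))
```

    With Q = 250 kJ/mol, a 100 K rise around 1000 K accelerates creep by roughly a factor of 15, which is why the coefficients' temperature dependence dominates cladding ballooning predictions during accident transients.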

  12. Distributed Prognostics Based on Structural Model Decomposition

    Data.gov (United States)

    National Aeronautics and Space Administration — Within systems health management, prognostics focuses on predicting the remaining useful life of a system. In the model-based prognostics paradigm, physics-based...

  13. Model-based and model-free “plug-and-play” building energy efficient control

    International Nuclear Information System (INIS)

    Baldi, Simone; Michailidis, Iakovos; Ravanis, Christos; Kosmatopoulos, Elias B.

    2015-01-01

    Highlights: • “Plug-and-play” Building Optimization and Control (BOC) driven by building data. • Ability to handle the large-scale and complex nature of the BOC problem. • Adaptation to learn the optimal BOC policy when no building model is available. • Comparisons with rule-based and advanced BOC strategies. • Simulation and real-life experiments in a ten-office building. - Abstract: Considerable research efforts in Building Optimization and Control (BOC) have been directed toward the development of “plug-and-play” BOC systems that can achieve energy efficiency without compromising thermal comfort and without the need of qualified personnel engaged in a tedious and time-consuming manual fine-tuning phase. In this paper, we report on how a recently introduced Parametrized Cognitive Adaptive Optimization – abbreviated as PCAO – can be used toward the design of both model-based and model-free “plug-and-play” BOC systems, with minimum human effort required to accomplish the design. In the model-based case, PCAO assesses the performance of its control strategy via a simulation model of the building dynamics; in the model-free case, PCAO optimizes its control strategy without relying on any model of the building dynamics. Extensive simulation and real-life experiments performed on a ten-office building demonstrate the effectiveness of the PCAO–BOC system in providing significant energy efficiency and improved thermal comfort. The mechanisms embedded within PCAO render it capable of automatically and quickly learning an efficient BOC strategy either in the presence of complex nonlinear simulation models of the building dynamics (model-based) or when no model for the building dynamics is available (model-free). Comparative studies with alternative state-of-the-art BOC systems show the effectiveness of the PCAO–BOC solution.

  14. Model-based processing for underwater acoustic arrays

    CERN Document Server

    Sullivan, Edmund J

    2015-01-01

    This monograph presents a unified approach to model-based processing for underwater acoustic arrays. The use of physical models in passive array processing is not a new idea, but it has been used on a case-by-case basis, and as such, lacks any unifying structure. This work views all such processing methods as estimation procedures, which then can be unified by treating them all as a form of joint estimation based on a Kalman-type recursive processor, which can be recursive either in space or time, depending on the application. This is done for three reasons. First, the Kalman filter provides a natural framework for the inclusion of physical models in a processing scheme. Second, it allows poorly known model parameters to be jointly estimated along with the quantities of interest. This is important, since in certain areas of array processing already in use, such as those based on matched-field processing, the so-called mismatch problem either degrades performance or, indeed, prevents any solution at all. Third...
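
The Kalman-type recursive processor at the heart of this unification can be illustrated with a minimal scalar predict/update cycle; the model and noise values below are arbitrary assumptions, not taken from the monograph.

```python
def kalman_step(x_est, p_var, z_meas, f=1.0, q=1e-4, h=1.0, r=1e-2):
    """One predict/update cycle of a scalar Kalman filter.
    x_est, p_var: prior state estimate and its variance;
    z_meas: new measurement; f, q, h, r: transition, process-noise,
    observation, and measurement-noise parameters (illustrative)."""
    # Predict step: propagate the state and its uncertainty
    x_pred = f * x_est
    p_pred = f * p_var * f + q
    # Update step: blend in the measurement via the Kalman gain
    gain = p_pred * h / (h * p_pred * h + r)
    x_new = x_pred + gain * (z_meas - h * x_pred)
    p_new = (1.0 - gain * h) * p_pred
    return x_new, p_new
```

In the monograph's setting the recursion runs over space or time across the array, and poorly known model parameters are appended to the state vector so they are estimated jointly with the quantities of interest.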

  15. DEVELOPMENT MODEL OF PATISSERIE PROJECT-BASED LEARNING

    OpenAIRE

    Ana Ana; Lutfhiyah Nurlaela

    2013-01-01

    The study aims to find a model of patisserie project-based learning with production approach that can improve effectiveness of patisserie learning. Delphi Technique, Cohen's Kappa and percentages of agreements were used to assess model of patisserie project based learning. Data collection techniques employed in the study were questionnaire, check list worksheet, observation, and interview sheets. Subjects were 13 lectures of expertise food and nutrition and 91 students of Food and Nutrition ...

  16. Model-based and model-free Pavlovian reward learning: revaluation, revision, and revelation.

    Science.gov (United States)

    Dayan, Peter; Berridge, Kent C

    2014-06-01

    Evidence supports at least two methods for learning about reward and punishment and making predictions for guiding actions. One method, called model-free, progressively acquires cached estimates of the long-run values of circumstances and actions from retrospective experience. The other method, called model-based, uses representations of the environment, expectations, and prospective calculations to make cognitive predictions of future value. Extensive attention has been paid to both methods in computational analyses of instrumental learning. By contrast, although a full computational analysis has been lacking, Pavlovian learning and prediction have typically been presumed to be solely model-free. Here, we revise that presumption and review compelling evidence from Pavlovian revaluation experiments showing that Pavlovian predictions can involve their own form of model-based evaluation. In model-based Pavlovian evaluation, prevailing states of the body and brain influence value computations, and thereby produce powerful incentive motivations that can sometimes be quite new. We consider the consequences of this revised Pavlovian view for the computational landscape of prediction, response, and choice. We also revisit differences between Pavlovian and instrumental learning in the control of incentive motivation.
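
The "cached estimates" of model-free learning are commonly formalized as temporal-difference updates; a minimal textbook sketch (not code from the paper):

```python
def td_update(values, state, reward, next_state, alpha=0.1, gamma=0.9):
    """Model-free TD(0) update of a cached value table:
    V(s) <- V(s) + alpha * (r + gamma * V(s') - V(s)).
    The bracketed prediction error is the quantity that, in the
    model-based case, would instead come from prospective simulation
    of the environment."""
    td_error = reward + gamma * values[next_state] - values[state]
    values[state] += alpha * td_error
    return values
```

Because the cached value is updated only by direct experience, it cannot react to a revaluation (e.g., a change in bodily state) until the outcome is re-experienced, which is exactly the behavioural signature the revaluation experiments probe.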

  17. Vibration-Based Damage Diagnosis in a Laboratory Cable-Stayed Bridge Model via an RCP-ARX Model Based Method

    International Nuclear Information System (INIS)

    Michaelides, P G; Apostolellis, P G; Fassois, S D

    2011-01-01

    Vibration-based damage detection and identification in a laboratory cable-stayed bridge model is addressed under inherent, environmental, and experimental uncertainties. The problem is challenging as conventional stochastic methods face difficulties due to uncertainty underestimation. A novel method is formulated based on identified Random Coefficient Pooled ARX (RCP-ARX) representations of the dynamics and statistical hypothesis testing. The method benefits from the ability of RCP models in properly capturing uncertainty. Its effectiveness is demonstrated via a high number of experiments under a variety of damage scenarios.

  18. Vibration-Based Damage Diagnosis in a Laboratory Cable-Stayed Bridge Model via an RCP-ARX Model Based Method

    Energy Technology Data Exchange (ETDEWEB)

    Michaelides, P G; Apostolellis, P G; Fassois, S D, E-mail: mixail@mech.upatras.gr, E-mail: fassois@mech.upatras.gr [Laboratory for Stochastic Mechanical Systems and Automation (SMSA), Department of Mechanical and Aeronautical Engineering, University of Patras, GR 265 00 Patras (Greece)

    2011-07-19

    Vibration-based damage detection and identification in a laboratory cable-stayed bridge model is addressed under inherent, environmental, and experimental uncertainties. The problem is challenging as conventional stochastic methods face difficulties due to uncertainty underestimation. A novel method is formulated based on identified Random Coefficient Pooled ARX (RCP-ARX) representations of the dynamics and statistical hypothesis testing. The method benefits from the ability of RCP models in properly capturing uncertainty. Its effectiveness is demonstrated via a high number of experiments under a variety of damage scenarios.

  19. Model-based security testing

    OpenAIRE

    Schieferdecker, Ina; Großmann, Jürgen; Schneider, Martin

    2012-01-01

    Security testing aims at validating software system requirements related to security properties like confidentiality, integrity, authentication, authorization, availability, and non-repudiation. Although security testing techniques have been available for many years, there have been few approaches that allow for the specification of test cases at a higher level of abstraction, for enabling guidance on test identification and specification, as well as for automated test generation. Model-based security...

  20. Bayesian Based Diagnostic Model for Condition Based Maintenance of Offshore Wind Farms

    Directory of Open Access Journals (Sweden)

    Masoud Asgarpour

    2018-01-01

    Operation and maintenance costs are a major contributor to the Levelized Cost of Energy for electricity produced by offshore wind and can be significantly reduced if existing corrective actions are performed as efficiently as possible and if future corrective actions are avoided by performing sufficient preventive actions. This paper presents an applied and generic diagnostic model for fault detection and condition based maintenance of offshore wind components. The diagnostic model is based on two probabilistic matrices: first, a confidence matrix, representing the probability of detection using each fault detection method, and second, a diagnosis matrix, representing the individual outcome of each fault detection method. Once the confidence and diagnosis matrices of a component are defined, the individual diagnoses of each fault detection method are combined into a final verdict on the fault state of that component. Furthermore, this paper introduces a Bayesian updating model, based on observations collected by inspections, to decrease the uncertainty of the initial confidence matrix. The framework and implementation of the presented diagnostic model are further explained within a case study for a wind turbine component based on vibration, temperature, and oil particle fault detection methods. The last part of the paper discusses the case study results and presents conclusions.
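
The combination of per-method diagnoses into a final verdict can be sketched as a naive-Bayes fusion. Here each method's confidence is reduced to two hypothetical numbers (probability of detection and probability of false alarm), a deliberate simplification of the paper's confidence and diagnosis matrices.

```python
def fault_posterior(prior, methods, verdicts):
    """Combine independent fault-detection methods with Bayes' rule.
    prior: prior probability that the component is faulty.
    methods: list of (pod, pfa) pairs -- probability of detection and
    probability of false alarm per method (stand-in for the paper's
    confidence matrix; values here are illustrative).
    verdicts: list of booleans, each method's alarm (diagnosis matrix)."""
    p_fault, p_healthy = prior, 1.0 - prior
    for (pod, pfa), alarm in zip(methods, verdicts):
        if alarm:
            p_fault *= pod
            p_healthy *= pfa
        else:
            p_fault *= 1.0 - pod
            p_healthy *= 1.0 - pfa
    return p_fault / (p_fault + p_healthy)
```

The Bayesian updating step the paper adds would, in this simplified picture, amount to refining each method's (pod, pfa) pair from inspection outcomes over time.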

  1. Predictor-Based Model Reference Adaptive Control

    Science.gov (United States)

    Lavretsky, Eugene; Gadient, Ross; Gregory, Irene M.

    2010-01-01

    This paper is devoted to the design and analysis of a predictor-based model reference adaptive control. Stable adaptive laws are derived using a Lyapunov framework. The proposed architecture is compared with the now classical model reference adaptive control. A simulation example is presented in which numerical evidence indicates that the proposed controller yields improved transient characteristics.
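
A scalar analogue of Lyapunov-based model reference adaptive control (without the predictor element of the paper, and with arbitrary illustrative numbers) can be simulated as follows.

```python
def simulate_mrac(a_true=2.0, gamma=2.0, dt=0.01, steps=4000, r_cmd=1.0):
    """Scalar MRAC sketch: plant x' = a*x + u with unknown a,
    reference model xm' = -xm + r, control u = -k*x + r, and the
    Lyapunov-derived adaptive law k' = gamma * e * x with tracking
    error e = x - xm. The ideal gain is k* = a + 1. Explicit-Euler
    integration with toy parameters, not the paper's architecture."""
    x = xm = k = 0.0
    for _ in range(steps):
        e = x - xm
        u = -k * x + r_cmd
        x += dt * (a_true * x + u)
        xm += dt * (-xm + r_cmd)
        k += dt * gamma * e * x
    return x, xm, k
```

The predictor-based variant the paper proposes replaces the raw tracking error in the adaptive law with a prediction error, which is what improves the transient behaviour.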

  2. Agent-based Modeling Automated: Data-driven Generation of Innovation Diffusion Models

    NARCIS (Netherlands)

    Jensen, T.; Chappin, E.J.L.

    2016-01-01

    Simulation modeling is useful to gain insights into driving mechanisms of diffusion of innovations. This study aims to introduce automation to make identification of such mechanisms with agent-based simulation modeling less costly in time and labor. We present a novel automation procedure in which

  3. Enabling Accessibility Through Model-Based User Interface Development.

    Science.gov (United States)

    Ziegler, Daniel; Peissner, Matthias

    2017-01-01

    Adaptive user interfaces (AUIs) can increase the accessibility of interactive systems. They provide personalized display and interaction modes to fit individual user needs. Most AUI approaches rely on model-based development, which is considered relatively demanding. This paper explores strategies to make model-based development more attractive for mainstream developers.

  4. Variance-based sensitivity indices for models with dependent inputs

    International Nuclear Information System (INIS)

    Mara, Thierry A.; Tarantola, Stefano

    2012-01-01

    Computational models are intensively used in engineering for risk analysis or prediction of future outcomes. Uncertainty and sensitivity analyses are of great help in these purposes. Although several methods exist to perform variance-based sensitivity analysis of model output with independent inputs, only a few have been proposed in the literature for the case of dependent inputs. This is explained by the fact that the theoretical framework for the independent case is set and a univocal set of variance-based sensitivity indices is defined. In the present work, we propose a set of variance-based sensitivity indices to perform sensitivity analysis of models with dependent inputs. These measures allow us to distinguish between the mutual dependent contribution and the independent contribution of an input to the model response variance. Their definition relies on a specific orthogonalisation of the inputs and ANOVA-representations of the model output. In the applications, we show the interest of the new sensitivity indices in a model simplification setting. - Highlights: ► Uncertainty and sensitivity analyses are of great help in engineering. ► Several methods exist to perform variance-based sensitivity analysis of model output with independent inputs. ► We define a set of variance-based sensitivity indices for models with dependent inputs. ► Inputs' mutual contributions are distinguished from their independent contributions. ► Analytical and computational tests are performed and discussed.
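
For the independent-input baseline that the paper extends, the first-order index S_i = Var(E[Y|X_i]) / Var(Y) can be estimated with the standard pick-freeze Monte Carlo scheme. This is the generic estimator for independent U(0,1) inputs, not the authors' dependent-input construction.

```python
import random

def first_order_sobol(f, dim, i, n=20000, seed=0):
    """Pick-freeze estimate of the first-order variance-based index
    S_i = Var(E[Y|X_i]) / Var(Y) for a model f of `dim` independent
    U(0,1) inputs. Uses two independent sample matrices A and B and a
    hybrid matrix AB_i whose column i is taken from B."""
    rng = random.Random(seed)
    A = [[rng.random() for _ in range(dim)] for _ in range(n)]
    B = [[rng.random() for _ in range(dim)] for _ in range(n)]
    ABi = [ra[:i] + [rb[i]] + ra[i + 1:] for ra, rb in zip(A, B)]
    yA = [f(x) for x in A]
    yB = [f(x) for x in B]
    yABi = [f(x) for x in ABi]
    mean = sum(yA) / n
    var = sum((y - mean) ** 2 for y in yA) / n
    # Saltelli-style estimator of the partial variance Var(E[Y|X_i])
    cov = sum(b * (abi - a) for a, b, abi in zip(yA, yB, yABi)) / n
    return cov / var
```

For the linear test model y = x0 + 2*x1 the analytical indices are S_0 = 0.2 and S_1 = 0.8, which the estimator recovers to Monte Carlo accuracy.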

  5. User Context Aware Base Station Power Flow Model

    OpenAIRE

    Walsh, Barbara; Farrell, Ronan

    2005-01-01

    At present the testing of power amplifiers within base station transmitters is limited to testing at component level as opposed to testing at the system level. While the detection of catastrophic failure is possible, that of performance degradation is not. This paper proposes a base station model with respect to transmitter output power with the aim of introducing system level monitoring of the power amplifier behaviour within the base station. Our model reflects the expe...

  6. Collaborative Practice Model: Improving the Delivery of Bad News.

    Science.gov (United States)

    Bowman, Pamela N; Slusser, Kim; Allen, Deborah

    2018-02-01

    Ideal bad news delivery requires skilled communication and team support. The literature has primarily focused on patient preferences, impact on care decisions, healthcare roles, and communication styles, without addressing systematic implementation. This article describes how an interdisciplinary team, led by advanced practice nurses, developed and implemented a collaborative practice model to deliver bad news on a unit that had struggled with inconsistencies. Using evidence-based practices, the authors explored current processes, role perceptions and expectations, and perceived barriers to developing the model, which is now the standard of care and an example of interprofessional team collaboration across the healthcare system. This model for delivering bad news can be easily adapted to meet the needs of other clinical units.

  7. The Challenge of Forecasting Metropolitan Growth: Urban Characteristics Based Models versus Regional Dummy Based Models

    OpenAIRE

    NA

    2005-01-01

    This paper presents a study of errors in forecasting the population of Metropolitan Statistical Areas and the Primary MSAs of Consolidated Metropolitan Statistical Areas and New England MAs. The forecasts are for the year 2000 and are based on a semi-structural model estimated by Mills and Lubelle using 1970 to 1990 census data on population, employment and relative real wages. This model allows the testing of regional effects on population and employment growth. The year 2000 forecasts are f...

  8. Process-based modelling of turbidity-current hydrodynamics and sedimentation

    NARCIS (Netherlands)

    Groenenberg, R.M.

    2007-01-01

    The production potential of deep-water reservoirs is primarily determined by rock bulk volume, porosity and permeability. Quantification of the geometry and spatial distribution of reservoir sands in deep-water deposits can provide crucial information to assess sand body volume, connectivity and the

  9. Fuzzy model-based control of a nuclear reactor

    International Nuclear Information System (INIS)

    Van Den Durpel, L.; Ruan, D.

    1994-01-01

    The fuzzy model-based control of a nuclear power reactor is an emerging research topic world-wide. SCK-CEN is addressing this research at a preliminary stage, covering two aspects: fuzzy control and fuzzy modelling. The aim is to combine both methodologies, in contrast to conventional model-based PID control techniques, and to highlight the advantages of including fuzzy parameters such as safety and operator feedback. This paper summarizes the general scheme of this new research project

  10. Documentation for Grants Equal to Tax model: Volume 1, Technical description

    International Nuclear Information System (INIS)

    1986-01-01

    A computerized model, the Grants Equal to Tax (GETT) model, was developed to assist in evaluating the amount of federal grant monies that would go to state and local jurisdictions under the provisions outlined in the Nuclear Waste Policy Act of 1982. The GETT model is capable of forecasting the amount of tax liability associated with all property owned and all activities undertaken by the US Department of Energy (DOE) in site characterization and repository development. The GETT program is a user-friendly, menu-driven model developed using dBASE III™, a relational data base management system. The data base for GETT consists primarily of eight separate dBASE III™ files corresponding to each of the eight taxes levied by state and local jurisdictions on business property and activity. Additional smaller files help to control model inputs and reporting options. Volume 1 of the GETT model documentation is a technical description of the program and its capabilities providing (1) descriptions of the data management system and its procedures; (2) formulas for calculating taxes (illustrated with flow charts); (3) descriptions of tax data base variables for the Deaf Smith County, Texas, Richton Dome, Mississippi, and Davis Canyon, Utah, salt sites; and (4) data inputs for the GETT model. 10 refs., 18 figs., 3 tabs

  11. Relational grounding facilitates development of scientifically useful multiscale models

    Directory of Open Access Journals (Sweden)

    Lam Tai

    2011-09-01

    We review grounding issues that influence the scientific usefulness of any biomedical multiscale model (MSM). Groundings are the collection of units, dimensions, and/or objects to which a variable or model constituent refers. To date, models that primarily use continuous mathematics rely heavily on absolute grounding, whereas those that primarily use discrete software paradigms (e.g., object-oriented, agent-based, actor) typically employ relational grounding. We review grounding issues and identify strategies to address them. We maintain that grounding issues should be addressed at the start of any MSM project and should be reevaluated throughout the model development process. We make the following points. Grounding decisions influence model flexibility, adaptability, and thus reusability. Grounding choices should be influenced by measures, uncertainty, system information, and the nature of available validation data. Absolute grounding complicates the process of combining models to form larger models unless all are grounded absolutely. Relational grounding facilitates referent knowledge embodiment within computational mechanisms but requires separate model-to-referent mappings. Absolute grounding can simplify integration by forcing common units and, hence, a common integration target, but context change may require model reengineering. Relational grounding enables synthesis of large, composite (multi-module) models that can be robust to context changes. Because biological components have varying degrees of autonomy, corresponding components in MSMs need to do the same. Relational grounding facilitates achieving such autonomy. Biomimetic analogues designed to facilitate translational research and development must have long lifecycles. Exploring mechanisms of normal-to-disease transition requires model components that are grounded relationally. Multi-paradigm modeling requires both hyperspatial and relational grounding.

  12. Quantitation of base substitutions in eukaryotic 5S rRNA: selection for the maintenance of RNA secondary structure.

    Science.gov (United States)

    Curtiss, W C; Vournakis, J N

    1984-01-01

    Eukaryotic 5S rRNA sequences from 34 diverse species were compared by the following method: (1) The sequences were aligned; (2) the positions of substitutions were located by comparison of all possible pairs of sequences; (3) the substitution sites were mapped to an assumed general base pairing model; and (4) the R-Y model of base stacking was used to study stacking pattern relationships in the structure. An analysis of the sequence and structure variability in each region of the molecule is presented. It was found that the degree of base substitution varies over a wide range, from absolute conservation to occurrence of over 90% of the possible observable substitutions. The substitutions are located primarily in stem regions of the 5S rRNA secondary structure. More than 88% of the substitutions in helical regions maintain base pairing. The disruptive substitutions are primarily located at the edges of helical regions, resulting in shortening of the helical regions and lengthening of the adjacent nonpaired regions. Base stacking patterns determined by the R-Y model are mapped onto the general secondary structure. Intrastrand and interstrand stacking could stabilize alternative coaxial structures and limit the conformational flexibility of nonpaired regions. Two short contiguous regions are 100% conserved in all species. This may reflect evolutionary constraints imposed at the DNA level by the requirement for binding of a 5S gene transcription initiation factor during gene expression.
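
The bookkeeping in steps (2)-(4) — locating substitutions and checking whether they preserve pairing — can be sketched with a small hypothetical helper that allows both Watson-Crick and G-U wobble pairs.

```python
# Pairs considered "maintained" in helical regions: Watson-Crick plus
# G-U wobble (an assumption consistent with standard RNA structure).
PAIRS = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"),
         ("G", "U"), ("U", "G")}

def substitutions_maintaining_pairing(helix_pairs, wild, mutant):
    """Count substitutions at paired positions and how many of them
    still form an allowed base pair. helix_pairs: list of (i, j) paired
    positions in the assumed secondary-structure model; wild, mutant:
    aligned RNA sequences. Illustrative sketch only."""
    subs = maintained = 0
    for i, j in helix_pairs:
        if wild[i] != mutant[i] or wild[j] != mutant[j]:
            subs += 1
            if (mutant[i], mutant[j]) in PAIRS:
                maintained += 1
    return subs, maintained
```

Running such a tally over all sequence pairs and all stem positions is what yields summary figures like the abstract's ">88% of substitutions in helical regions maintain base pairing."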

  13. A stream-based mathematical model for distributed information processing systems - SysLab system model

    OpenAIRE

    Klein, Cornel; Rumpe, Bernhard; Broy, Manfred

    2014-01-01

    In the SysLab project we develop a software engineering method based on a mathematical foundation. The SysLab system model serves as an abstract mathematical model for information systems and their components. It is used to formalize the semantics of all used description techniques, such as object diagrams, state automata, sequence charts, or data-flow diagrams. Based on the requirements for such a reference model, we define the system model including its different views and their relationships.

  14. Characteristic Model-Based Robust Model Predictive Control for Hypersonic Vehicles with Constraints

    Directory of Open Access Journals (Sweden)

    Jun Zhang

    2017-06-01

    Designing robust control for hypersonic vehicles in reentry is difficult due to features of the vehicles including strong coupling, non-linearity, and multiple constraints. This paper proposes a characteristic model-based robust model predictive control (MPC) for hypersonic vehicles with reentry constraints. First, the hypersonic vehicle is modeled by a characteristic model composed of a linear time-varying system and a lumped disturbance. Then, the identification data are regenerated using the accumulative sum idea from gray theory, which weakens the effect of random noise and strengthens the regularity of the identification data. Based on the regenerated data, the time-varying parameters and the disturbance are estimated online by gray identification. Finally, a mixed H2/H∞ robust predictive control law is derived based on linear matrix inequalities (LMIs) and receding horizon optimization techniques. Because MPC handles system constraints actively, the input and state constraints are satisfied in the closed-loop control system. The validity of the proposed control is verified theoretically according to Lyapunov theory and illustrated by simulation results.
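
The "accumulative sum idea" used to regenerate the identification data is the first-order accumulated generating operation (1-AGO) of gray theory; a minimal sketch:

```python
def ago(xs):
    """First-order accumulated generating operation (1-AGO):
    x1[k] = x0[0] + ... + x0[k]. Accumulation smooths zero-mean random
    noise and strengthens the regularity of the data sequence, which is
    why it is applied before gray identification."""
    out, total = [], 0.0
    for x in xs:
        total += x
        out.append(total)
    return out

def iago(xs):
    """Inverse AGO (first differencing) recovers the original sequence."""
    return [xs[0]] + [b - a for a, b in zip(xs, xs[1:])]
```

The identification itself is then performed on the accumulated sequence, and results are mapped back with the inverse operation.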

  15. A continuum based fem model for friction stir welding-model development

    Energy Technology Data Exchange (ETDEWEB)

    Buffa, G. [Ohio State University, Department of Industrial, Welding and Systems Engineering, 1971 Neil Avenue, 210 Baker Systems, Columbus, OH 43210 (United States) and Dipartimento di Tecnologia Meccanica, Produzione e Ingegneria Gestionale, Universita di Palermo, Viale delle Scienze, 90128 Palermo (Italy)]. E-mail: g.buffa@dtpm.unipa.it; Hua, J. [Ohio State University, Department of Industrial, Welding and Systems Engineering, 1971 Neil Avenue, 210 Baker Systems, Columbus, OH 43210 (United States)]. E-mail: hua.14@osu.edu; Shivpuri, R. [Ohio State University, Department of Industrial, Welding and Systems Engineering, 1971 Neil Avenue, 210 Baker Systems, Columbus, OH 43210 (United States)]. E-mail: shivpuri.1@osu.edu; Fratini, L. [Dipartimento di Tecnologia Meccanica, Produzione e Ingegneria Gestionale, Universita di Palermo, Viale delle Scienze, 90128 Palermo (Italy)]. E-mail: abaqus@dtpm.unipa.it

    2006-03-15

    Although friction stir welding (FSW) has been successfully used to join materials that are difficult to weld or unweldable by fusion welding methods, it is still in its early development stage and, therefore, a scientific knowledge-based predictive model is of significant help for thorough understanding of the FSW process. In this paper, a continuum-based FEM model for the friction stir welding process is proposed that is 3D Lagrangian implicit, coupled, and rigid-viscoplastic. This model is calibrated by comparing with experimental results of force and temperature distribution, then used to investigate the distribution of temperature and strain in the heat-affected zone and the weld nugget. The model correctly predicts the non-symmetric nature of the FSW process, and the relationships between the tool forces and the variation in the process parameters. It is found that the effective strain distribution is non-symmetric about the weld line while the temperature profile is almost symmetric in the weld zone.

  16. A continuum based fem model for friction stir welding-model development

    International Nuclear Information System (INIS)

    Buffa, G.; Hua, J.; Shivpuri, R.; Fratini, L.

    2006-01-01

    Although friction stir welding (FSW) has been successfully used to join materials that are difficult to weld or unweldable by fusion welding methods, it is still in its early development stage and, therefore, a scientific knowledge-based predictive model is of significant help for thorough understanding of the FSW process. In this paper, a continuum-based FEM model for the friction stir welding process is proposed that is 3D Lagrangian implicit, coupled, and rigid-viscoplastic. This model is calibrated by comparing with experimental results of force and temperature distribution, then used to investigate the distribution of temperature and strain in the heat-affected zone and the weld nugget. The model correctly predicts the non-symmetric nature of the FSW process, and the relationships between the tool forces and the variation in the process parameters. It is found that the effective strain distribution is non-symmetric about the weld line while the temperature profile is almost symmetric in the weld zone.

  17. Transcendental Political Systems and the Gravity Model

    Science.gov (United States)

    Lock, Connor

    2012-01-01

    This summer I have been working on an Army Deep Futures Model project named Themis. Themis is a JPL based modeling framework that anticipates possible future states for the world within the next 25 years. The goal of this framework is to determine the likelihood that the US Army will need to intervene on behalf of US strategic interests. Key elements that are modeled within this tool include the world structure and major decisions that are made by key actors. Each actor makes decisions based on their goals and within the constraints of the structure of the system in which they are located. In my research I have focused primarily on the effects of structures upon the decision-making processes of the actors within them. This research is a natural extension of my major program at Georgetown University, where I am studying the International Political Economy and the structures that make it up. My basic goal for this summer project was to be a helpful asset to the Themis modeling team, with any research done or processes learned constituting a bonus.

  18. Dissemination of Cultural Norms and Values: Agent-Based Modeling

    Directory of Open Access Journals (Sweden)

    Denis Andreevich Degterev

    2016-12-01

    This article shows how agent-based modeling allows us to explore the mechanisms of the dissemination of cultural norms and values, both within one country and across the world. In recent years, this type of simulation has become particularly prevalent in the analysis of international relations, more popular than system dynamics and discrete-event simulation. The use of agent-based modeling in international relations is connected with the agent-structure problem: structure and agents are interdependent and change dynamically as entities interact. Agent-structure interaction can be modeled by means of the theory of complex adaptive systems using agent-based modeling techniques. One of the first examples of agent-based modeling in political science is T. Schelling's model of racial segregation. On the basis of this model, the author shows how changes in behavioral patterns at the micro level impact the macro level. Patterns change due to the dynamics of cultural norms and values formed by mass media and other social institutions. The author surveys the main areas of modern application of agent-based modeling in international studies, including the analysis of ethnic conflicts and the formation of international coalitions. Particular attention is paid to Robert Axelrod's approach, based on the use of genetic algorithms, to the spread of cultural norms and values. Agent-based modeling shows how to create conditions under which norms originally not shared by a significant part of the population eventually spread everywhere. The author illustrates the practical application of these algorithms with the example of the situation in Ukraine in 2015-2016. The article also reveals the mechanisms of the international spread of cultural norms and values. The main think-tanks using agent-based modeling in international studies are
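
A bare-bones version of the Schelling-style dynamics discussed in this record can be written in a few lines: a one-dimensional ring with two groups plus empty cells, where unhappy agents relocate. All parameters are arbitrary illustrative choices, not Schelling's exact setup.

```python
import random

def schelling_step(grid, threshold=0.5, rng=None):
    """One sweep of a minimal 1-D Schelling segregation model on a ring.
    Agents are +1/-1 with empty cells encoded as 0; an agent whose
    fraction of like occupied neighbours falls below `threshold` moves
    to a random empty cell. Mutates and returns `grid`."""
    rng = rng or random.Random()
    n = len(grid)
    for i in range(n):
        agent = grid[i]
        if agent == 0:
            continue
        nbrs = [grid[(i - 1) % n], grid[(i + 1) % n]]
        occupied = [b for b in nbrs if b != 0]
        if occupied and sum(b == agent for b in occupied) / len(occupied) < threshold:
            empties = [j for j in range(n) if grid[j] == 0]
            if empties:
                j = rng.choice(empties)
                grid[j], grid[i] = agent, 0
    return grid
```

Even with a mild individual tolerance, repeated sweeps drive the population toward segregated clusters, which is the micro-to-macro effect the article highlights.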

  19. PARTICIPATION BASED MODEL OF SHIP CREW MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Toni Bielić

    2014-10-01

    This paper analyses the participation-based model on board ship as a possibly optimal leadership model in the shipping industry, with an accent on the decision-making process. The authors attempt to define the master's behaviour model and management style, identifying the drawbacks and disadvantages of a vertical, pyramidal organization with the master at the top. The paper describes the efficiency of decision making within a team organization and the optimization of a ship's organisation by introducing teamwork on board. Three ship accidents are studied and evaluated through the “Leader-participation” model. The model of participation-based management, as a model of teamwork, has been applied in studying the cause-and-effect of the accidents, with a critical review of communication and the management of human resources on a ship. The results show that the cause of all three accidents was the autocratic behaviour of the leaders and a lack of communication within the teams.

  20. Markov chain aggregation for agent-based models

    CERN Document Server

    Banisch, Sven

    2016-01-01

    This self-contained text develops a Markov chain approach that makes possible the rigorous analysis of a class of microscopic models specifying the dynamics of complex systems at the individual level. It presents a general framework of aggregation in agent-based and related computational models, one which makes use of lumpability and information theory in order to link the micro and macro levels of observation. The starting point is a microscopic Markov chain description of the dynamical process in complete correspondence with the dynamical behavior of the agent-based model (ABM), obtained by considering the set of all possible agent configurations as the state space of a huge Markov chain. An explicit formal representation of the resulting "micro-chain", including microscopic transition rates, is derived for a class of models by using the random mapping representation of a Markov process. The type of probability distribution used to implement the stochastic part of the model, which defines the upd...
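
    The micro-to-macro aggregation via lumpability described here can be illustrated with a deliberately tiny, hypothetical example — a two-agent imitation model, not one of the book's case studies: build the micro-chain over all agent configurations, then aggregate states sharing a macro observable and check that the aggregation is exact.

    ```python
    import numpy as np

    # Micro chain: two agents with binary opinions; at each step one random
    # agent copies the other's state. Micro states ordered (00, 01, 10, 11).
    P = np.array([
        [1.0, 0.0, 0.0, 0.0],
        [0.5, 0.0, 0.0, 0.5],
        [0.5, 0.0, 0.0, 0.5],
        [0.0, 0.0, 0.0, 1.0],
    ])

    # Macro observable: number of agents holding opinion 1.
    partition = [[0], [1, 2], [3]]

    def lump(P, partition):
        """Return the aggregated chain if the partition is (strongly) lumpable."""
        Q = np.zeros((len(partition), len(partition)))
        for a, block_a in enumerate(partition):
            for b, block_b in enumerate(partition):
                rows = P[np.ix_(block_a, block_b)].sum(axis=1)
                # Lumpable iff every micro state in a block has the same total
                # transition probability into each target block.
                if not np.allclose(rows, rows[0]):
                    raise ValueError("partition is not lumpable")
                Q[a, b] = rows[0]
        return Q

    Q = lump(P, partition)
    print(Q)
    ```

    Here the opinion-count observable is lumpable, so the 4-state micro-chain collapses exactly to a 3-state macro-chain; for observables that are not lumpable, the macro process is no longer Markovian, which is where the book's information-theoretic treatment comes in.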

  1. Expediting model-based optoacoustic reconstructions with tomographic symmetries

    International Nuclear Information System (INIS)

    Lutzweiler, Christian; Deán-Ben, Xosé Luís; Razansky, Daniel

    2014-01-01

    Purpose: Image quantification in optoacoustic tomography implies the use of accurate forward models of excitation, propagation, and detection of optoacoustic signals, while inversions with high spatial resolution usually involve very large matrices, leading to unreasonably long computation times. The development of fast and memory-efficient model-based approaches therefore represents an important challenge for advancing the quantitative and dynamic imaging capabilities of tomographic optoacoustic imaging. Methods: Herein, a method for the simplification and acceleration of model-based inversions, relying on inherent symmetries present in common tomographic acquisition geometries, is introduced. The method is showcased for the case of cylindrical symmetries by using a polar image discretization of the time-domain optoacoustic forward model combined with efficient storage and inversion strategies. Results: The suggested methodology is shown to render fast and accurate model-based inversions in both numerical simulations and post mortem small animal experiments. In the case of a full-view detection scheme, the memory requirements are reduced by one order of magnitude while high-resolution reconstructions are achieved at video rate. Conclusions: By considering the rotational symmetry present in many tomographic optoacoustic imaging systems, the proposed methodology allows exploiting the advantages of model-based algorithms with feasible computational requirements and fast reconstruction times, so that its convenience and general applicability in optoacoustic imaging systems with tomographic symmetries is anticipated.

  2. A model-based risk management framework

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjoern Axel; Fredriksen, Rune

    2002-08-15

    The ongoing research activity addresses these issues through two co-operative activities. The first is the IST-funded research project CORAS, in which Institutt for energiteknikk is responsible for the risk analysis work package. The main objective of the CORAS project is to develop a framework to support risk assessment of security-critical systems. The second, called the Halden Open Dependability Demonstrator (HODD), is established in cooperation between Oestfold University College, local companies and HRP. The objective of HODD is to provide an open-source test bed for testing, teaching and learning about risk analysis methods, risk analysis tools, and fault tolerance techniques. The Inverted Pendulum Control System (IPCON), whose main task is to keep a pendulum balanced and controlled, is the first system that has been established. To perform a risk assessment, one needs to know what a system does, or is intended to do. Furthermore, the risk assessment requires correct descriptions of the system, its context and all relevant features. A basic assumption is that a precise model of this knowledge, based on formal or semi-formal descriptions such as UML, will facilitate a systematic risk assessment. It is also necessary to have a framework to integrate the different risk assessment methods. The experiences so far support this hypothesis. This report presents CORAS and the CORAS model-based risk management framework, including a preliminary guideline for model-based risk assessment. The CORAS framework for model-based risk analysis offers a structured and systematic approach to identifying and assessing security issues of ICT systems. From the initial assessment of IPCON, we also believe that the framework is applicable in a safety context. Further work on IPCON, as well as the experiences from the CORAS trials, will provide insight and feedback for further improvements. (Author)

  3. Model-Based Software Testing for Object-Oriented Software

    Science.gov (United States)

    Biju, Soly Mathew

    2008-01-01

    Model-based testing is one of the best solutions for testing object-oriented software. It provides better test coverage than other testing styles. Model-based testing takes into consideration behavioural aspects of a class, which are usually unchecked in other testing methods. An increase in the complexity of software has forced the software industry…
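
    As a sketch of the idea — assuming a toy turnstile class rather than any system from the article — a behavioural model of a class can be written as a state-transition table, and every event sequence up to a fixed length can be generated from it as a test:

    ```python
    from itertools import product

    class Turnstile:
        """Implementation under test: a simple locked/unlocked turnstile."""
        def __init__(self):
            self.state = "locked"
        def coin(self):
            self.state = "unlocked"
        def push(self):
            self.state = "locked"

    # Behavioural model: (state, event) -> expected next state.
    MODEL = {
        ("locked", "coin"): "unlocked",
        ("locked", "push"): "locked",
        ("unlocked", "coin"): "unlocked",
        ("unlocked", "push"): "locked",
    }

    def run_tests(length=4):
        """Generate every event sequence up to `length` from the model and
        check that the implementation follows the modelled transitions."""
        failures = []
        for seq in product(["coin", "push"], repeat=length):
            sut, state = Turnstile(), "locked"
            for event in seq:
                getattr(sut, event)()
                state = MODEL[(state, event)]
                if sut.state != state:
                    failures.append(seq)
                    break
        return failures

    print(run_tests())  # → [] : implementation conforms to the model
    ```

    Exhaustively enumerating sequences from the model is what gives model-based testing its coverage of behavioural (state-dependent) aspects that example-based unit tests tend to miss.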

  4. Dynamic ligand-based pharmacophore modeling and virtual ...

    Indian Academy of Sciences (India)

    Five ligand-based pharmacophore models were generated from 40 different .... the Phase module of the Schrodinger program.35 Each model consisted of six types of ... ligand preparation included the OPLS_2005 force field and to retain the ...

  5. The eGo grid model: An open source approach towards a model of German high and extra-high voltage power grids

    Science.gov (United States)

    Mueller, Ulf Philipp; Wienholt, Lukas; Kleinhans, David; Cussmann, Ilka; Bunke, Wolf-Dieter; Pleßmann, Guido; Wendiggensen, Jochen

    2018-02-01

    There are several power grid modelling approaches suitable for simulations in the field of power grid planning. The restrictive policies of grid operators, regulators and research institutes concerning their original data and models have led to an increased interest in open source approaches to grid models based on open data. By including all voltage levels between 60 kV (high voltage) and 380 kV (extra-high voltage), we dissolve the common distinction between transmission and distribution grids in energy system models and utilize a single, integrated model instead. An open data set, primarily for Germany, which can be used for non-linear, linear and linear-optimal power flow methods, was developed. This data set consists of an electrically parameterised grid topology as well as allocated generation and demand characteristics for present and future scenarios at high spatial and temporal resolution. The usability of the grid model was demonstrated by performing exemplary power flow optimizations. Based on a marginal-cost-driven power plant dispatch subject to grid restrictions, congested power lines were identified. Continuous validation of the model is necessary in order to reliably model storage and grid expansion in ongoing research.
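
    The linear ("DC") power flow mentioned above can be sketched on a hypothetical 3-bus toy grid. The line data and injections below are invented for illustration and are unrelated to the eGo data set:

    ```python
    import numpy as np

    # Hypothetical 3-bus grid: lines as (from, to, susceptance in p.u.),
    # bus 0 taken as the slack bus.
    lines = [(0, 1, 10.0), (1, 2, 10.0), (0, 2, 10.0)]
    P = np.array([1.0, -0.5, -0.5])   # net injections; must sum to zero

    n = 3
    B = np.zeros((n, n))
    for i, j, b in lines:
        B[i, i] += b; B[j, j] += b
        B[i, j] -= b; B[j, i] -= b

    # Linear ("DC") power flow: solve the reduced system without the slack bus.
    theta = np.zeros(n)
    theta[1:] = np.linalg.solve(B[1:, 1:], P[1:])

    # Line loadings; comparing these to thermal limits flags congested lines.
    flows = {(i, j): b * (theta[i] - theta[j]) for i, j, b in lines}
    print(flows)
    ```

    In a linear-optimal power flow, the same flow equations become constraints of an optimization that dispatches plants by marginal cost, which is how congested lines are identified in the study.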

  6. Modeling and knowledge acquisition processes using case-based inference

    Directory of Open Access Journals (Sweden)

    Ameneh Khadivar

    2017-03-01

    Full Text Available The acquisition and presentation of organizational process knowledge has been considered by many knowledge management researchers. In this research, a model for process knowledge acquisition and presentation is presented using a case-based reasoning approach. The validity of the presented model was evaluated by an expert panel. Software was then developed based on the presented model and implemented at Eghtesad Novin Bank of Iran. Following the stages of the presented model, the knowledge-intensive processes were first identified; the process knowledge was then stored in a knowledge base in the format problem/solution/consequence. Knowledge retrieval was based on nearest-neighbour similarity. To validate the implemented system, its results were compared with the decisions of the process experts.
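
    Nearest-neighbour retrieval over a problem/solution case base, as used in the system above, can be sketched as follows. The case base, feature vectors, and solutions are invented for illustration; the bank's actual features are not described in the abstract:

    ```python
    import math

    # Hypothetical case base: each case pairs a problem description
    # (numeric features) with a recorded solution.
    case_base = [
        {"problem": (0.9, 0.1, 0.3), "solution": "escalate to credit committee"},
        {"problem": (0.2, 0.8, 0.5), "solution": "approve with standard terms"},
        {"problem": (0.1, 0.2, 0.9), "solution": "request more documents"},
    ]

    def retrieve(query, cases):
        """Return the stored case whose problem is nearest (Euclidean) to the query."""
        return min(cases, key=lambda c: math.dist(query, c["problem"]))

    best = retrieve((0.85, 0.15, 0.25), case_base)
    print(best["solution"])  # → escalate to credit committee
    ```

    In a full CBR cycle the retrieved solution would then be adapted to the new problem and the outcome stored back as a new case.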

  7. Cholinesterase-based biosensors.

    Science.gov (United States)

    Štěpánková, Šárka; Vorčáková, Katarína

    2016-01-01

    Cholinesterase-based biosensors are now widely used for assaying anticholinergic compounds. Biosensors based on enzyme inhibition in particular are useful analytical tools for the fast screening of inhibitors such as organophosphates and carbamates. The present review compiles the most important facts about cholinesterase-based biosensors: types of physico-chemical transduction, immobilization strategies and practical applications.

  8. Agent Based Modeling as an Educational Tool

    Science.gov (United States)

    Fuller, J. H.; Johnson, R.; Castillo, V.

    2012-12-01

    Motivation is a key element in high school education. One way to improve motivation and provide content, while helping address critical thinking and problem solving skills, is to have students build and study agent-based models in the classroom. This activity visually connects concepts with their applied mathematical representation. "Engaging students in constructing models may provide a bridge between frequently disconnected conceptual and mathematical forms of knowledge." (Levy and Wilensky, 2011) We wanted to discover the feasibility of implementing a model-based curriculum in the classroom given current and anticipated core and content standards. (Figures: a simulation using California GIS data; a simulation of high school student lunch popularity using an aerial photograph on top of a terrain value map.)

  9. Thermal Modelling and Design of On-board DC-DC Power Converter using Finite Element Method

    DEFF Research Database (Denmark)

    Staliulionis, Z.; Zhang, Z.; Pittini, R.

    2014-01-01

    Power electronic converters are widely used and play a pivotal role in the electronics area. Temperature causes around 54% of all power converter failures. Thermal loads are nowadays one of the bottlenecks in power system design, and the cooling efficiency of a system is primarily determined...... by numerical modelling techniques. Therefore, thermal design through thermal modelling and simulation is becoming an integral part of the design process, as it is less expensive than the experimental cut-and-try approach. Here the investigation is performed using finite element method-based modelling, and also...

  10. Thermal Modeling and Design of On-board DC-DC Power Converter using Finite Element Method

    DEFF Research Database (Denmark)

    Staliulionis, Zygimantas; Zhang, Zhe; Pittini, Riccardo

    2014-01-01

    Power electronic converters are widely used and play a pivotal role in the electronics area. Temperature causes around 54% of all power converter failures. Thermal loads are nowadays one of the bottlenecks in power system design, and the cooling efficiency of a system is primarily determined...... by numerical modeling techniques. Therefore, thermal design through thermal modeling and simulation is becoming an integral part of the design process, as it is less expensive than the experimental cut-and-try approach. Here the investigation is performed using finite element method-based modeling...

  11. Multiscale agent-based cancer modeling.

    Science.gov (United States)

    Zhang, Le; Wang, Zhihui; Sagotsky, Jonathan A; Deisboeck, Thomas S

    2009-04-01

    Agent-based modeling (ABM) is an in silico technique that is being used in a variety of research areas such as in social sciences, economics and increasingly in biomedicine as an interdisciplinary tool to study the dynamics of complex systems. Here, we describe its applicability to integrative tumor biology research by introducing a multi-scale tumor modeling platform that understands brain cancer as a complex dynamic biosystem. We summarize significant findings of this work, and discuss both challenges and future directions for ABM in the field of cancer research.

  12. Model Predictive Control based on Finite Impulse Response Models

    DEFF Research Database (Denmark)

    Prasath, Guru; Jørgensen, John Bagterp

    2008-01-01

    We develop a regularized l2 finite impulse response (FIR) predictive controller with input and input-rate constraints. Feedback is based on a simple constant output disturbance filter. The performance of the predictive controller in the face of plant-model mismatch is investigated by simulations...... and related to the uncertainty of the impulse response coefficients. The simulations can be used to benchmark l2 MPC against FIR based robust MPC as well as to estimate the maximum performance improvements by robust MPC....
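
    The unconstrained core of such a regularized l2 FIR predictive controller can be sketched as follows. The impulse response, horizon, and regularization weight below are invented; the paper's formulation additionally imposes input and input-rate constraints:

    ```python
    import numpy as np

    # Hypothetical truncated impulse response of a stable SISO plant.
    g = np.array([0.0, 0.4, 0.3, 0.15, 0.1, 0.05])
    N = 12                                   # prediction horizon

    # Toeplitz prediction matrix: y[k] = sum_j g[j] * u[k - j].
    Gamma = np.zeros((N, N))
    for i in range(N):
        for j in range(max(0, i - len(g) + 1), i + 1):
            Gamma[i, j] = g[i - j]

    r = np.ones(N)                           # unit setpoint over the horizon
    lam = 0.1                                # input-rate regularization weight
    D = np.eye(N) - np.eye(N, k=-1)          # first-difference operator

    # Regularized l2 solution of  min ||Gamma u - r||^2 + lam ||D u||^2 .
    u = np.linalg.solve(Gamma.T @ Gamma + lam * D.T @ D, Gamma.T @ r)
    y = Gamma @ u
    print(y[-1])  # predicted output approaches the setpoint late in the horizon
    ```

    Penalizing the input rate (the `D u` term) is what makes the least-squares problem well posed here and damps aggressive input moves under plant-model mismatch; adding the input and input-rate constraints turns this into a quadratic program.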

  13. Development of uncertainty-based work injury model using Bayesian structural equation modelling.

    Science.gov (United States)

    Chatterjee, Snehamoy

    2014-01-01

    This paper proposes a Bayesian method-based structural equation model (SEM) of miners' work injury for an underground coal mine in India. The environmental and behavioural variables for work injury were identified and causal relationships were developed. For Bayesian modelling, prior distributions of the SEM parameters are necessary to develop the model. In this paper, two approaches were adopted to obtain prior distributions for the factor loading parameters and structural parameters of the SEM. In the first approach, the prior distributions were considered as fixed distribution functions with specific parameter values, whereas in the second approach, prior distributions of the parameters were generated from experts' opinions. The posterior distributions of these parameters were obtained by applying Bayes' rule. Markov chain Monte Carlo sampling, in the form of Gibbs sampling, was applied for sampling from the posterior distribution. The results revealed that all coefficients of the structural and measurement model parameters are statistically significant under the experts' opinion-based priors, whereas two coefficients are not statistically significant when the fixed prior-based distributions are applied. The error statistics reveal that the Bayesian structural model provides a reasonably good fit for work injury, with a high coefficient of determination (0.91) and lower mean squared error compared with traditional SEM.
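
    Gibbs sampling draws from a joint posterior by cycling through full conditional distributions. A generic toy example — a bivariate normal target, not the paper's SEM posterior — looks like this:

    ```python
    import random

    # Target: standard bivariate normal with correlation rho. The full
    # conditionals are x | y ~ N(rho*y, 1 - rho^2) and y | x ~ N(rho*x, 1 - rho^2),
    # so we alternate draws from them.
    rng = random.Random(0)
    rho = 0.8
    sd = (1 - rho ** 2) ** 0.5
    x = y = 0.0
    xs, ys = [], []
    for _ in range(20000):
        x = rng.gauss(rho * y, sd)   # draw x from its full conditional
        y = rng.gauss(rho * x, sd)   # draw y from its full conditional
        xs.append(x); ys.append(y)

    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / n
    print(mx, my, cov)  # means near 0, covariance near rho
    ```

    In the SEM setting the same mechanism applies, except each "coordinate" is a block of factor loadings or structural coefficients whose full conditional follows from the prior (fixed or expert-elicited) and the likelihood.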

  14. Exploring Spatiotemporal Trends in Commercial Fishing Effort of an Abalone Fishing Zone: A GIS-Based Hotspot Model

    Science.gov (United States)

    Jalali, M. Ali; Ierodiaconou, Daniel; Gorfine, Harry; Monk, Jacquomo; Rattray, Alex

    2015-01-01

    Assessing patterns of fisheries activity at a scale related to resource exploitation has received particular attention in recent times. However, acquiring data about the distribution and spatiotemporal allocation of catch and fishing effort in small-scale benthic fisheries remains challenging. Here, we used GIS-based spatio-statistical models to investigate the footprint of commercial diving events on blacklip abalone (Haliotis rubra) stocks along the south-west coast of Victoria, Australia from 2008 to 2011. Using abalone catch data matched with GPS location, we found catch per unit of fishing effort (CPUE) was not uniformly distributed across the study area in space or time. Spatial autocorrelation and hotspot analysis revealed significant spatiotemporal clusters of CPUE (with distance thresholds of hundreds of meters) among years, indicating the presence of CPUE hotspots focused on specific reefs. Cumulative hotspot maps indicated that certain reef complexes were consistently targeted across years, but with varying intensity; often only a relatively small proportion of the full reef extent was targeted. Integrating CPUE with remotely sensed light detection and ranging (LiDAR) derived bathymetry data using a generalized additive mixed model corroborated that fishing pressure primarily coincided with shallow, rugose and complex components of reef structures. This study demonstrates that a geospatial approach is efficient in detecting patterns and trends in commercial fishing effort and its association with seafloor characteristics. PMID:25992800
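
    The hotspot statistic commonly used in such analyses, the Getis-Ord Gi*, can be sketched on a hypothetical one-dimensional transect of CPUE values. The numbers and the binary neighbourhood weights below are invented for illustration:

    ```python
    import math

    # Hypothetical CPUE values along a 1-D transect of reef cells; the
    # middle cells form an artificial hotspot.
    cpue = [1, 1, 2, 1, 9, 10, 9, 1, 2, 1, 1]
    n = len(cpue)
    xbar = sum(cpue) / n
    s = math.sqrt(sum(v * v for v in cpue) / n - xbar ** 2)

    def gi_star(i, radius=1):
        """Getis-Ord Gi* z-score with binary weights within `radius` cells."""
        w = [1 if abs(i - j) <= radius else 0 for j in range(n)]
        W = sum(w)
        num = sum(wj * xj for wj, xj in zip(w, cpue)) - xbar * W
        den = s * math.sqrt((n * sum(wj * wj for wj in w) - W * W) / (n - 1))
        return num / den

    z = [gi_star(i) for i in range(n)]
    print(max(range(n), key=lambda i: z[i]))  # → 5 : centre of the hotspot
    ```

    Cells with large positive z-scores are statistically significant hotspots; on real data the weights come from a spatial distance threshold rather than array indices.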

  15. Exploring Spatiotemporal Trends in Commercial Fishing Effort of an Abalone Fishing Zone: A GIS-Based Hotspot Model.

    Directory of Open Access Journals (Sweden)

    M Ali Jalali

    Full Text Available Assessing patterns of fisheries activity at a scale related to resource exploitation has received particular attention in recent times. However, acquiring data about the distribution and spatiotemporal allocation of catch and fishing effort in small scale benthic fisheries remains challenging. Here, we used GIS-based spatio-statistical models to investigate the footprint of commercial diving events on blacklip abalone (Haliotis rubra) stocks along the south-west coast of Victoria, Australia from 2008 to 2011. Using abalone catch data matched with GPS location we found catch per unit of fishing effort (CPUE) was not uniformly spatially and temporally distributed across the study area. Spatial autocorrelation and hotspot analysis revealed significant spatiotemporal clusters of CPUE (with distance thresholds of 100's of meters) among years, indicating the presence of CPUE hotspots focused on specific reefs. Cumulative hotspot maps indicated that certain reef complexes were consistently targeted across years but with varying intensity, however often a relatively small proportion of the full reef extent was targeted. Integrating CPUE with remotely-sensed light detection and ranging (LiDAR) derived bathymetry data using generalized additive mixed model corroborated that fishing pressure primarily coincided with shallow, rugose and complex components of reef structures. This study demonstrates that a geospatial approach is efficient in detecting patterns and trends in commercial fishing effort and its association with seafloor characteristics.

  16. A simple model for super critical fluid extraction of bio oils from biomass

    International Nuclear Information System (INIS)

    Patel, Rajesh N.; Bandyopadhyay, Santanu; Ganesh, Anuradda

    2011-01-01

    A simple mathematical model to characterize the supercritical extraction process is proposed in this paper. The model is primarily based on two mass transfer mechanisms: solubility and diffusion. It assumes two distinct modes of extraction: an initial constant-rate period controlled by solubility, and a falling-rate period controlled by diffusivity. The effects of extraction parameters such as pressure and temperature on the extraction of oil have also been studied. The proposed model, when compared with existing models, shows better agreement with the experimental results. It has been applied to both a high initial oil content material (cashew nut shells) and a low initial oil content material (black pepper).
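
    The two-regime structure described above can be written down directly. The rate constants and crossover yield below are placeholders, not the paper's fitted values:

    ```python
    import math

    def extraction_yield(t, k=0.05, y_c=0.6, y_inf=1.0, d=0.1):
        """Cumulative oil yield (fraction of extractable oil) versus time.

        A solubility-controlled constant-rate period at rate k up to the
        crossover yield y_c, then a diffusion-controlled falling-rate period
        approaching y_inf exponentially with rate constant d. All parameter
        values are hypothetical.
        """
        t_c = y_c / k                       # end of the constant-rate period
        if t <= t_c:
            return k * t
        return y_inf - (y_inf - y_c) * math.exp(-d * (t - t_c))

    # The two regimes join continuously at t_c = 12.
    print(extraction_yield(12.0), extraction_yield(60.0))
    ```

    Fitting such a model to experimental extraction curves amounts to estimating k from the early linear portion and d from the exponential tail, with pressure and temperature entering through their effect on solubility and diffusivity.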

  17. Thermodynamic Modeling of Savannah River Evaporators

    Energy Technology Data Exchange (ETDEWEB)

    Weber, C.F.

    2001-08-02

    A thermodynamic model based on the code SOLGASMIX is developed to calculate phase equilibrium in evaporators and related tank wastes at the Savannah River Site (SRS). This model uses the Pitzer method to calculate activity coefficients, and many of the required Pitzer parameters have been determined in the course of this work. Principal chemical species in standard SRS simulant solutions are included, and the temperature range for most parameters has been extended above 100 C. The SOLGASMIX model and calculations using the code Geochemist's Workbench are compared to actual solubility data including silicate, aluminate, and aluminosilicate solutions. In addition, SOLGASMIX model calculations are also compared to transient solubility data involving SRS simulant solutions. These comparisons indicate that the SOLGASMIX predictions closely match reliable data over the range of temperature and solution composition expected in the SRS evaporator and related tanks. Predictions using the Geochemist's Workbench may be unreliable, due primarily to the use of an inaccurate activity coefficient model.

  18. Model-based auditing using REA

    NARCIS (Netherlands)

    Weigand, H.; Elsas, P.

    2012-01-01

    The recent financial crisis has renewed interest in the value of the owner-ordered auditing tradition that starts from society's long-term interest rather than management interest. This tradition uses a model-based auditing approach in which control requirements are derived in a principled way. A

  19. Dopamine selectively remediates 'model-based' reward learning: a computational approach.

    Science.gov (United States)

    Sharp, Madeleine E; Foerde, Karin; Daw, Nathaniel D; Shohamy, Daphna

    2016-02-01

    Patients with loss of dopamine due to Parkinson's disease are impaired at learning from reward. However, it remains unknown precisely which aspect of learning is impaired. In particular, learning from reward, or reinforcement learning, can be driven by two distinct computational processes. One involves habitual stamping-in of stimulus-response associations, hypothesized to arise computationally from 'model-free' learning. The other, 'model-based' learning, involves learning a model of the world that is believed to support goal-directed behaviour. Much work has pointed to a role for dopamine in model-free learning. But recent work suggests model-based learning may also involve dopamine modulation, raising the possibility that model-based learning may contribute to the learning impairment in Parkinson's disease. To directly test this, we used a two-step reward-learning task which dissociates model-free versus model-based learning. We evaluated learning in patients with Parkinson's disease tested ON versus OFF their dopamine replacement medication and in healthy controls. Surprisingly, we found no effect of disease or medication on model-free learning. Instead, we found that patients tested OFF medication showed a marked impairment in model-based learning, and that this impairment was remediated by dopaminergic medication. Moreover, model-based learning was positively correlated with a separate measure of working memory performance, raising the possibility of common neural substrates. Our results suggest that some learning deficits in Parkinson's disease may be related to an inability to pursue reward based on complete representations of the environment. © The Author (2015). Published by Oxford University Press on behalf of the Guarantors of Brain. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  20. Model-Based Methods for Fault Diagnosis: Some Guide-Lines

    DEFF Research Database (Denmark)

    Patton, R.J.; Chen, J.; Nielsen, S.B.

    1995-01-01

    This paper provides a review of model-based fault diagnosis techniques. Starting from basic principles, the properties....

  1. Mathematical Modeling of Column-Base Connections under Monotonic Loading

    Directory of Open Access Journals (Sweden)

    Gholamreza Abdollahzadeh

    2014-12-01

    Full Text Available Considerable damage to steel structures occurred during the Hyogo-ken Nanbu Earthquake. Among it, many exposed-type column bases failed in several consistent patterns, such as brittle base plate fracture, excessive bolt elongation, unexpected early bolt failure, and inferior construction work. The lessons from these failures led to the need for an improved understanding of column base behavior. Joint behavior must be modeled when analyzing semi-rigid frames, which is associated with a mathematical model of the moment–rotation curve. The most accurate models use continuous nonlinear functions. This article presents three areas of steel joint research: (1) analysis methods for semi-rigid joints; (2) prediction methods for the mechanical behavior of joints; (3) mathematical representations of the moment–rotation curve. In the current study, a new exponential model to depict the moment–rotation relationship of column base connections is proposed. The proposed nonlinear model represents an approach to the prediction of M–θ curves, taking into account the possible failure modes and the deformation characteristics of the connection elements. The new model has three physical parameters, along with two curve-fitted factors. The physical parameters are generated from the dimensional details of the connection, as well as the material properties. The M–θ curves obtained by the model are compared with published connection tests and 3D FEM research. The proposed mathematical model closely characterizes M–θ behavior through the full range of loading/rotations. As a result, modeling column base connections using the proposed mathematical model can provide crucial information in advance, and overcome the time and cost disadvantages of experimental studies.
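
    A common two-parameter exponential form for moment–rotation curves is M(θ) = M_u(1 − e^(−k_i·θ/M_u)). The sketch below uses this generic form with invented parameter values; the article's own model adds a third physical parameter and two curve-fitting factors, which are not reproduced here:

    ```python
    import math

    def moment(theta, m_u=180.0, k_i=30000.0):
        """Generic exponential moment-rotation model for a connection.

        m_u: ultimate moment (kN*m), k_i: initial stiffness (kN*m/rad).
        Both values are hypothetical, chosen only to illustrate the shape:
        initial slope k_i, saturating at m_u for large rotations.
        """
        return m_u * (1.0 - math.exp(-k_i * theta / m_u))

    # Initial slope ~ k_i, and M saturates at M_u for large rotations.
    h = 1e-8
    print((moment(h) - moment(0.0)) / h, moment(0.05))
    ```

    The appeal of such closed forms is that the physical parameters (here M_u and k_i) can be computed from the connection's dimensions and material properties, while curve-fitting factors absorb the remaining shape mismatch against test data.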

  2. CAD-based automatic modeling method for Geant4 geometry model through MCAM

    International Nuclear Information System (INIS)

    Wang, D.; Nie, F.; Wang, G.; Long, P.; LV, Z.

    2013-01-01

    The full text of publication follows. Geant4 is a widely used Monte Carlo transport simulation package. Before calculating with Geant4, a calculation model needs to be established, described using either the Geometry Description Markup Language (GDML) or C++. However, it is time-consuming and error-prone to describe models manually in GDML. Automatic modeling methods have been developed recently, but most existing modeling programs have problems: in particular, some are not accurate or are tied to a specific CAD format. To convert complex CAD geometry models into GDML geometry models accurately, a CAD-based modeling method for Geant4 was developed. The essence of this method is translating between CAD models represented with boundary representation (B-REP) and GDML models represented with constructive solid geometry (CSG). First, the CAD model is decomposed into several simple solids, each having only one closed shell. Each simple solid is then decomposed into a set of convex shells. Corresponding GDML convex basic solids are generated from the boundary surfaces obtained from the topological characteristics of each convex shell. After the generation of these solids, the GDML model is completed with a series of Boolean operations. This method was adopted in the CAD/Image-based Automatic Modeling Program for Neutronics and Radiation Transport (MCAM) and tested with several models, including the examples in the Geant4 installation package. The results showed that this method can convert standard CAD models accurately and can be used for Geant4 automatic modeling. (authors)

  3. Model-based safety analysis of a control system using Simulink and Simscape extended models

    Directory of Open Access Journals (Sweden)

    Shao Nian

    2017-01-01

    Full Text Available The aircraft or system safety assessment process is an integral part of the overall aircraft development cycle. It is usually characterized by very high time and financial effort and can become a critical design driver in certain cases. Therefore, an increasing demand for effective methods to assist the safety assessment process arises within the aerospace community. One approach is the utilization of model-based technology, which is already well established in system development, for safety assessment purposes. This paper mainly describes a new tool for model-based safety analysis. A formal model for an example system is generated and enriched with extended models. Then, system safety analyses are performed on the model with the assistance of automation tools and compared to the results of a manual analysis. The objective of this paper is to improve the increasingly complex aircraft systems development process. This paper develops a new model-based analysis tool in the Simulink/Simscape environment.

  4. Modeling of driver's collision avoidance maneuver based on controller switching model.

    Science.gov (United States)

    Kim, Jong-Hae; Hayakawa, Soichiro; Suzuki, Tatsuya; Hayashi, Koji; Okuma, Shigeru; Tsuchida, Nuio; Shimizu, Masayuki; Kido, Shigeyuki

    2005-12-01

    This paper presents a modeling strategy for human driving behavior based on a controller switching model, focusing on the driver's collision avoidance maneuver. The driving data are collected using a three-dimensional (3-D) driving simulator based on the CAVE Automatic Virtual Environment (CAVE), which provides a stereoscopic immersive virtual environment. In our modeling, the control scenario of the human driver, that is, the mapping from the driver's sensory information to operations such as acceleration, braking, and steering, is expressed by a Piecewise Polynomial (PWP) model. Since the PWP model includes both continuous behaviors given by polynomials and discrete logical conditions, it can be regarded as a class of Hybrid Dynamical System (HDS). The identification problem for the PWP model is formulated as a Mixed Integer Linear Programming (MILP) problem by transforming the switching conditions into binary variables. From the obtained results, it is found that the driver appropriately switches the "control law" according to the sensory information. In addition, the driving characteristics of the beginner driver and the expert driver are compared and discussed. These results enable us to capture not only the physical meaning of driving skill but also the decision-making aspect (switching conditions) of the driver's collision avoidance maneuver.
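
    Controller switching of this kind can be sketched as a piecewise map from sensory information (headway gap and closing speed) to a braking command, with a polynomial control law per discrete mode. All modes, thresholds, and coefficients below are invented for illustration; the paper identifies its PWP model from driving-simulator data via MILP rather than hand-writing it:

    ```python
    def braking_command(gap_m, closing_speed_mps):
        """Toy controller-switching driver model: a logical condition selects
        the mode, and each mode applies its own polynomial control law."""
        if gap_m < 15.0 and closing_speed_mps > 2.0:
            # Emergency mode: strong braking, quadratic in the closing speed.
            return min(1.0, 0.3 + 0.05 * closing_speed_mps ** 2)
        if gap_m < 40.0:
            # Comfort mode: gentle braking, linear in the gap deficit.
            return max(0.0, 0.01 * (40.0 - gap_m))
        return 0.0  # Cruise mode: no braking.

    print(braking_command(10.0, 4.0),
          braking_command(30.0, 0.0),
          braking_command(60.0, 0.0))
    ```

    Identification turns this around: the thresholds become binary decision variables and the polynomial coefficients continuous ones, which is what makes the fitting problem a mixed-integer linear program.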

  5. Observation-Based Modeling for Model-Based Testing

    NARCIS (Netherlands)

    Kanstrén, T.; Piel, E.; Gross, H.G.

    2009-01-01

    One of the single most important reasons that modeling and model-based testing are not yet common practice in industry is the perceived difficulty of making the models up to the level of detail and quality required for their automated processing. Models unleash their full potential only through

  6. Cloud-Based Model Calibration Using OpenStudio: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Hale, E.; Lisell, L.; Goldwasser, D.; Macumber, D.; Dean, J.; Metzger, I.; Parker, A.; Long, N.; Ball, B.; Schott, M.; Weaver, E.; Brackney, L.

    2014-03-01

    OpenStudio is a free, open source Software Development Kit (SDK) and application suite for performing building energy modeling and analysis. The OpenStudio Parametric Analysis Tool has been extended to allow cloud-based simulation of multiple OpenStudio models parametrically related to a baseline model. This paper describes the new cloud-based simulation functionality and presents a model calibration case study. Calibration is initiated by entering actual monthly utility bill data into the baseline model. Multiple parameters are then varied over multiple iterations to reduce the difference between actual energy consumption and model simulation results, as calculated and visualized by billing period and by fuel type. Simulations are performed in parallel using the Amazon Elastic Cloud service. This paper highlights model parameterizations (measures) used for calibration, but the same multi-nodal computing architecture is available for other purposes, for example, recommending combinations of retrofit energy saving measures using the calibrated model as the new baseline.

  7. Models Archive and ModelWeb at NSSDC

    Science.gov (United States)

    Bilitza, D.; Papitashvili, N.; King, J. H.

    2002-05-01

In addition to its large data holdings, NASA's National Space Science Data Center (NSSDC) also maintains an archive of space physics models for public use (ftp://nssdcftp.gsfc.nasa.gov/models/). The more than 60 model entries cover a wide range of parameters from the atmosphere, to the ionosphere, to the magnetosphere, to the heliosphere. The models are primarily empirical models developed by the respective model authors based on long data records from ground and space experiments. An online model catalog (http://nssdc.gsfc.nasa.gov/space/model/) provides information about these and other models and links to the model software if available. We will briefly review the existing model holdings and highlight some of their usages and users. In response to a growing need by the user community, NSSDC began to develop web interfaces for the most frequently requested models. These interfaces enable users to compute and plot model parameters online for the specific conditions that they are interested in. Currently included in the ModelWeb system (http://nssdc.gsfc.nasa.gov/space/model/) are the following models: the International Reference Ionosphere (IRI) model, the Mass Spectrometer Incoherent Scatter (MSIS) E90 model, the International Geomagnetic Reference Field (IGRF), and the AP-8/AE-8 models for radiation belt protons and electrons. User accesses to both systems have been steadily increasing in recent years, with occasional spikes prior to large scientific meetings. The current monthly rate is between 5,000 and 10,000 accesses for either system; in February 2002, 13,872 accesses were recorded to the ModelWeb and 7,092 to the models archive.

  8. Assessing model-based reasoning using evidence-centered design a suite of research-based design patterns

    CERN Document Server

    Mislevy, Robert J; Riconscente, Michelle; Wise Rutstein, Daisy; Ziker, Cindy

    2017-01-01

    This Springer Brief provides theory, practical guidance, and support tools to help designers create complex, valid assessment tasks for hard-to-measure, yet crucial, science education standards. Understanding, exploring, and interacting with the world through models characterizes science in all its branches and at all levels of education. Model-based reasoning is central to science education and thus science assessment. Current interest in developing and using models has increased with the release of the Next Generation Science Standards, which identified this as one of the eight practices of science and engineering. However, the interactive, complex, and often technology-based tasks that are needed to assess model-based reasoning in its fullest forms are difficult to develop. Building on research in assessment, science education, and learning science, this Brief describes a suite of design patterns that can help assessment designers, researchers, and teachers create tasks for assessing aspects of model-based...

  9. Integration of Long-Term Research into a GIS Based Landscape Habitat Model for the Red-Cockaded Woodpecker

    Energy Technology Data Exchange (ETDEWEB)

    Franzreb, K.; Lloyd, F.T.

    2000-10-01

The red-cockaded woodpecker has been intensively studied since 1985, when the population was on the verge of extinction. The population decline is primarily the result of timber harvesting prior to 1950 and restricted burning. Construction of artificial cavities, translocations, competitor control, and removal of hardwood mid-story has provided suitable habitat. Since 1985, the population has increased from 4 to 99 birds. A GIS model is being developed to simulate the development of habitat at SRS in relation to management and existing vegetation.

  10. Stochastic Differential Equation-Based Flexible Software Reliability Growth Model

    Directory of Open Access Journals (Sweden)

    P. K. Kapur

    2009-01-01

Full Text Available Several software reliability growth models (SRGMs) have been developed by software developers for tracking and measuring the growth of reliability. As a software system grows large and the number of faults detected during the testing phase becomes large, the change in the number of faults detected and removed through each debugging becomes small compared with the initial fault content at the beginning of the testing phase. In such a situation, we can model the software fault detection process as a stochastic process with a continuous state space. In this paper, we propose a new software reliability growth model based on an Itô-type stochastic differential equation. We consider an SDE-based generalized Erlang model with a logistic error detection function. The model is estimated and validated on real-life data sets cited in the literature to show its flexibility. The proposed model, integrated with the concept of stochastic differential equations, performs comparatively better than the existing NHPP-based models.
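
As a rough illustration of the continuous-state view (not the paper's generalized Erlang model with logistic detection), a fault-detection process with initial fault content a can be written as a simple Itô SDE and simulated with the Euler-Maruyama scheme; all parameter values below are invented for illustration:

```python
import math
import random

def simulate_srgm(a=100.0, b=0.1, sigma=0.05, t_end=50.0, dt=0.01, seed=1):
    """Euler-Maruyama simulation of a simple SDE-based reliability
    growth model: dN = b*(a - N) dt + sigma*(a - N) dW.
    a: initial fault content, b: detection rate, sigma: noise scale."""
    random.seed(seed)
    n, t = 0.0, 0.0
    path = [(t, n)]
    while t < t_end:
        dw = random.gauss(0.0, math.sqrt(dt))   # Brownian increment
        n += b * (a - n) * dt + sigma * (a - n) * dw
        n = min(max(n, 0.0), a)   # detected faults stay in [0, a]
        t += dt
        path.append((t, n))
    return path

path = simulate_srgm()
print(f"faults detected after 50 time units: {path[-1][1]:.1f} of 100")
```

The drift term reproduces the usual exponential-growth mean curve a(1 - e^(-bt)); the diffusion term adds the per-debugging randomness that motivates the SDE formulation.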

  11. Dependability modeling and assessment in UML-based software development.

    Science.gov (United States)

    Bernardi, Simona; Merseguer, José; Petriu, Dorina C

    2012-01-01

    Assessment of software nonfunctional properties (NFP) is an important problem in software development. In the context of model-driven development, an emerging approach for the analysis of different NFPs consists of the following steps: (a) to extend the software models with annotations describing the NFP of interest; (b) to transform automatically the annotated software model to the formalism chosen for NFP analysis; (c) to analyze the formal model using existing solvers; (d) to assess the software based on the results and give feedback to designers. Such a modeling→analysis→assessment approach can be applied to any software modeling language, be it general purpose or domain specific. In this paper, we focus on UML-based development and on the dependability NFP, which encompasses reliability, availability, safety, integrity, and maintainability. The paper presents the profile used to extend UML with dependability information, the model transformation to generate a DSPN formal model, and the assessment of the system properties based on the DSPN results.

  12. Model based rib-cage unfolding for trauma CT

    Science.gov (United States)

    von Berg, Jens; Klinder, Tobias; Lorenz, Cristian

    2018-03-01

A CT rib-cage unfolding method is proposed that does not require determining rib centerlines; instead, it determines the visceral cavity surface by model-based segmentation. Image intensities are sampled across this surface, which is flattened using a model-based 3D thin-plate-spline registration. An average rib centerline model projected onto this surface serves as a reference system for the registration. The flattening registration is designed so that ribs similar to the centerline model are mapped onto parallel lines, preserving their relative length. Ribs deviating from this model accordingly appear as deviations from straight parallel ribs in the unfolded view. As the mapping is continuous, details in the intercostal space and adjacent to the ribs are also rendered well. The most beneficial application area is trauma CT, where fast detection of rib fractures is a crucial task. Specifically in trauma, automatic rib centerline detection may not be guaranteed due to fractures and dislocations. Visual assessment on the large public LIDC database of lung CT demonstrated the general feasibility of this early work.

  13. Graph Model Based Indoor Tracking

    DEFF Research Database (Denmark)

    Jensen, Christian Søndergaard; Lu, Hua; Yang, Bin

    2009-01-01

    The tracking of the locations of moving objects in large indoor spaces is important, as it enables a range of applications related to, e.g., security and indoor navigation and guidance. This paper presents a graph model based approach to indoor tracking that offers a uniform data management...

  14. Modelling Web-Based Instructional Systems

    NARCIS (Netherlands)

    Retalis, Symeon; Avgeriou, Paris

    2002-01-01

    The size and complexity of modern instructional systems, which are based on the World Wide Web, bring about great intricacy in their crafting, as there is not enough knowledge or experience in this field. This imposes the use of new instructional design models in order to achieve risk-mitigation,

  15. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

Computer Based ... universities, and later did system analysis, ... personal computers (PC) and low-cost software packages and tools. They can serve as a useful learning experience through student projects. Models are .... Let us consider a numerical example: to calculate the velocity of a trainer aircraft ...

  16. Bridging process-based and empirical approaches to modeling tree growth

    Science.gov (United States)

    Harry T. Valentine; Annikki Makela; Annikki Makela

    2005-01-01

    The gulf between process-based and empirical approaches to modeling tree growth may be bridged, in part, by the use of a common model. To this end, we have formulated a process-based model of tree growth that can be fitted and applied in an empirical mode. The growth model is grounded in pipe model theory and an optimal control model of crown development. Together, the...

  17. ARIMA-Based Time Series Model of Stochastic Wind Power Generation

    DEFF Research Database (Denmark)

    Chen, Peiyuan; Pedersen, Troels; Bak-Jensen, Birgitte

    2010-01-01

    This paper proposes a stochastic wind power model based on an autoregressive integrated moving average (ARIMA) process. The model takes into account the nonstationarity and physical limits of stochastic wind power generation. The model is constructed based on wind power measurement of one year from...... the Nysted offshore wind farm in Denmark. The proposed limited-ARIMA (LARIMA) model introduces a limiter and characterizes the stochastic wind power generation by mean level, temporal correlation and driving noise. The model is validated against the measurement in terms of temporal correlation...... and probability distribution. The LARIMA model outperforms a first-order transition matrix based discrete Markov model in terms of temporal correlation, probability distribution and model parameter number. The proposed LARIMA model is further extended to include the monthly variation of the stochastic wind power...
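
The limiter idea -- an autoregressive process hard-limited to the physical range of wind power output -- can be sketched as follows. This is a toy AR(1) with a clip, not the fitted LARIMA model; all parameters are illustrative:

```python
import random

def larima_sketch(n=1000, phi=0.95, mean=0.5, noise=0.05,
                  cap=1.0, seed=2):
    """Toy limited-AR(1) wind power series (per-unit of capacity):
    an autoregressive process around a mean level, hard-limited to
    [0, cap], mimicking the limiter idea of the LARIMA model."""
    random.seed(seed)
    x = mean
    series = []
    for _ in range(n):
        x = mean + phi * (x - mean) + random.gauss(0.0, noise)
        x = min(max(x, 0.0), cap)   # physical limits of wind power
        series.append(x)
    return series

s = larima_sketch()
print(min(s), max(s))
```

The persistence parameter phi carries the temporal correlation the paper validates against measurements; the clip enforces the nonstationary physical bounds that a plain ARIMA process ignores.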

  18. Model of climate evolution based on continental drift and polar wandering

    Science.gov (United States)

    Donn, W. L.; Shaw, D. M.

    1977-01-01

The thermodynamic meteorological model of Adem is used to trace the evolution of climate from the Triassic to the present by applying it to changing geography as described by continental drift and polar wandering. Results show that the gross changes of climate in the Northern Hemisphere can be fully explained by the strong cooling in high latitudes as continents moved poleward. High-latitude mean temperatures in the Northern Hemisphere dropped below the freezing point 10 to 15 m.y. ago, thereby accounting for the late Cenozoic glacial age. Computed meridional temperature gradients for the Northern Hemisphere steepened from 20 to 40 °C over the 200-m.y. period, an effect caused primarily by the high-latitude temperature decrease. The primary result of the work is that the cooling that has occurred since the warm Mesozoic period, culminating in glaciation, is explainable wholly by terrestrial processes.

  19. A semi-analytical bearing model considering outer race flexibility for model based bearing load monitoring

    Science.gov (United States)

    Kerst, Stijn; Shyrokau, Barys; Holweg, Edward

    2018-05-01

This paper proposes a novel semi-analytical bearing model that addresses the flexibility of the bearing outer race structure. It furthermore presents the application of this model in a bearing load condition monitoring approach. The bearing model is developed because current computationally low-cost bearing models fail to provide an accurate description of the increasingly common size- and weight-optimized flexible bearing designs due to their assumptions of rigidity. In the proposed bearing model, raceway flexibility is described by the use of static deformation shapes. The excitation of the deformation shapes is calculated based on the modelled rolling element loads and a Fourier series based compliance approximation. The resulting model is computationally low cost and provides an accurate description of the rolling element loads for flexible outer raceway structures. The latter is validated by a simulation-based comparison study with a well-established bearing simulation software tool. An experimental study finally shows the potential of the proposed model in a bearing load monitoring approach.

  20. Analysis of Future Vehicle Energy Demand in China Based on a Gompertz Function Method and Computable General Equilibrium Model

    Directory of Open Access Journals (Sweden)

    Tian Wu

    2014-11-01

Full Text Available This paper presents a model for the projection of Chinese vehicle stocks and road vehicle energy demand through 2050 based on low-, medium-, and high-growth scenarios. To derive a gross domestic product (GDP)-dependent Gompertz function, Chinese GDP is estimated using a recursive dynamic Computable General Equilibrium (CGE) model. The Gompertz function is estimated using historical data on vehicle development trends in North America, the Pacific Rim and Europe to overcome the problem of insufficient long-running data on Chinese vehicle ownership. Results indicate that the number of projected vehicle stocks for 2050 is 300, 455 and 463 million for the low-, medium-, and high-growth scenarios respectively. Furthermore, the growth in China’s vehicle stock will increase beyond the inflection point of the Gompertz curve by 2020, but will not reach saturation point during the period 2014–2050. Of the major road vehicle categories, cars are the largest energy consumers, followed by trucks and buses. Growth in Chinese vehicle demand is primarily determined by per capita GDP. Vehicle saturation levels solely influence the shape of the Gompertz curve, and population growth weakly affects vehicle demand. Projected total energy consumption of road vehicles in 2050 is 380, 575 and 586 million tonnes of oil equivalent for the respective scenarios.
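
A GDP-dependent Gompertz vehicle-ownership curve of the kind estimated in such studies has the form V(GDP) = V_sat * exp(alpha * exp(beta * GDP)) with alpha, beta < 0, so ownership rises slowly at low income, accelerates past an inflection point, and saturates at V_sat. The parameter values below are illustrative, not the paper's fitted values:

```python
import math

def gompertz_ownership(gdp_per_capita, v_sat=400.0,
                       alpha=-6.0, beta=-0.0002):
    """Gompertz curve for vehicles per 1000 people as a function of
    per-capita GDP: V = v_sat * exp(alpha * exp(beta * GDP)).
    alpha, beta < 0; v_sat is the saturation ownership level.
    All parameters here are illustrative, not the paper's fit."""
    return v_sat * math.exp(alpha * math.exp(beta * gdp_per_capita))

for gdp in (2000, 10000, 30000, 60000):
    print(gdp, round(gompertz_ownership(gdp), 1))
```

Because beta < 0, the inner exponential decays toward zero as GDP grows, pulling the outer exponential up toward 1 and hence V toward v_sat.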

  1. Annotation-based feature extraction from sets of SBML models.

    Science.gov (United States)

    Alm, Rebekka; Waltemath, Dagmar; Wolfien, Markus; Wolkenhauer, Olaf; Henkel, Ron

    2015-01-01

Model repositories such as BioModels Database provide computational models of biological systems for the scientific community. These models contain rich semantic annotations that link model entities to concepts in well-established bio-ontologies such as Gene Ontology. Consequently, thematically similar models are likely to share similar annotations. Based on this assumption, we argue that semantic annotations are a suitable tool to characterize sets of models. These characteristics improve model classification, allow the identification of additional features for model retrieval tasks, and enable the comparison of sets of models. In this paper we discuss four methods for annotation-based feature extraction from model sets. We tested all methods on sets of models in SBML format which were composed from BioModels Database. To characterize each of these sets, we analyzed and extracted concepts from three frequently used ontologies, namely Gene Ontology, ChEBI and SBO. We find that three of the four methods are suitable to determine characteristic features for arbitrary sets of models: the selected features vary depending on the underlying model set, and they are also specific to the chosen model set. We show that the identified features map onto concepts that are higher up in the hierarchy of the ontologies than the concepts used for model annotations. Our analysis also reveals that the information content of concepts in ontologies and their usage for model annotation do not correlate. Annotation-based feature extraction enables the comparison of model sets, as opposed to existing methods for model-to-keyword comparison, or model-to-model comparison.
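
One simple flavor of annotation-based feature extraction -- keeping ontology terms that are enriched in a model set relative to a background collection -- can be sketched as follows. This is a toy scheme with invented term names, not one of the paper's four methods:

```python
def characteristic_terms(model_set, background, min_ratio=2.0):
    """Toy annotation-based feature extraction: score each ontology
    term by its document frequency inside a model set versus a
    background collection, and keep terms enriched in the set.
    Both arguments are dicts mapping model id -> set of terms."""
    def freq(collection, term):
        return sum(term in anns for anns in collection.values()) / len(collection)
    terms = set().union(*model_set.values())
    return sorted(t for t in terms
                  if freq(model_set, t) >= min_ratio * max(freq(background, t), 1e-9))

# hypothetical mini-repository: two calcium models inside a set of four
calcium = {"m1": {"GO:calcium", "SBO:reaction"},
           "m2": {"GO:calcium", "GO:transport"}}
all_models = {**calcium,
              "m3": {"SBO:reaction"}, "m4": {"GO:transport"}}
print(characteristic_terms(calcium, all_models))
```

Here only the term shared by every model in the set, but rarer in the background, survives as a characteristic feature of the set.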

  2. An ontology-based approach for modelling architectural styles

    OpenAIRE

    Pahl, Claus; Giesecke, Simon; Hasselbring, Wilhelm

    2007-01-01

peer-reviewed The conceptual modelling of software architectures is of central importance for the quality of a software system. A rich modelling language is required to integrate the different aspects of architecture modelling, such as architectural styles, structural and behavioural modelling, into a coherent framework. We propose an ontological approach for architectural style modelling based on description logic as an abstract, meta-level modelling instrument. Architect...

  3. A DSM-based framework for integrated function modelling

    DEFF Research Database (Denmark)

    Eisenbart, Boris; Gericke, Kilian; Blessing, Lucienne T. M.

    2017-01-01

an integrated function modelling framework, which specifically aims at relating the different function modelling perspectives prominently addressed in different disciplines. It uses interlinked matrices based on the concept of DSM and MDM in order to facilitate cross-disciplinary modelling and analysis...... of the functionality of a system. The article further presents the application of the framework based on a product example. Finally, an empirical study in industry is presented. Therein, feedback on the potential of the proposed framework to support interdisciplinary design practice as well as on areas of further...

  4. Bayesian based Diagnostic Model for Condition based Maintenance of Offshore Wind Farms

    DEFF Research Database (Denmark)

    Asgarpour, Masoud; Sørensen, John Dalsgaard

    2018-01-01

Operation and maintenance costs are a major contributor to the Levelized Cost of Energy for electricity produced by offshore wind and can be significantly reduced if existing corrective actions are performed as efficiently as possible and if future corrective actions are avoided by performing...... sufficient preventive actions. This paper presents an applied and generic diagnostic model for fault detection and condition based maintenance of offshore wind components. The diagnostic model is based on two probabilistic matrices; first, a confidence matrix, representing the probability of detection using...... for a wind turbine component based on vibration, temperature, and oil particle fault detection methods. The last part of the paper discusses the case study results and presents conclusions....

  5. Model Based Control of Reefer Container Systems

    DEFF Research Database (Denmark)

    Sørensen, Kresten Kjær

This thesis is concerned with the development of model-based control for the Star Cool refrigerated container (reefer) with the objective of reducing energy consumption. This project has been carried out under the Danish Industrial PhD programme and has been financed by Lodam together with the Da...

6. Petri Net-Based Workflow Access Control Model

    Institute of Scientific and Technical Information of China (English)

    陈卓; 骆婷; 石磊; 洪帆

    2004-01-01

Access control is an important protection mechanism for information systems. This paper shows how to implement access control in a workflow system. We give a workflow access control model (WACM) based on several current access control models. The model supports role assignment and dynamic authorization. The paper defines the workflow using Petri nets. It first gives the definition and description of the workflow, and then analyzes the architecture of the workflow access control model (WACM). Finally, an example of an e-commerce workflow access control model is discussed in detail.

  7. Not just the norm: exemplar-based models also predict face aftereffects.

    Science.gov (United States)

    Ross, David A; Deroche, Mickael; Palmeri, Thomas J

    2014-02-01

    The face recognition literature has considered two competing accounts of how faces are represented within the visual system: Exemplar-based models assume that faces are represented via their similarity to exemplars of previously experienced faces, while norm-based models assume that faces are represented with respect to their deviation from an average face, or norm. Face identity aftereffects have been taken as compelling evidence in favor of a norm-based account over an exemplar-based account. After a relatively brief period of adaptation to an adaptor face, the perceived identity of a test face is shifted toward a face with attributes opposite to those of the adaptor, suggesting an explicit psychological representation of the norm. Surprisingly, despite near universal recognition that face identity aftereffects imply norm-based coding, there have been no published attempts to simulate the predictions of norm- and exemplar-based models in face adaptation paradigms. Here, we implemented and tested variations of norm and exemplar models. Contrary to common claims, our simulations revealed that both an exemplar-based model and a version of a two-pool norm-based model, but not a traditional norm-based model, predict face identity aftereffects following face adaptation.
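
The exemplar account the authors simulate can be sketched in GCM style, where evidence for an identity is the summed, exponentially decaying similarity of a test face to stored exemplars in a face space. The feature vectors, identity names, and sensitivity parameter below are arbitrary illustrations, not the paper's simulations:

```python
import math

def exemplar_evidence(test, exemplars, c=2.0):
    """Toy exemplar-based face model: evidence for each identity is
    the summed exponential similarity of the test face to that
    identity's stored exemplars (GCM-style). Faces are feature
    vectors in a low-dimensional face space; c sets how quickly
    similarity falls off with distance."""
    def sim(a, b):
        return math.exp(-c * math.dist(a, b))
    return {name: sum(sim(test, e) for e in faces)
            for name, faces in exemplars.items()}

faces = {"Adam": [(0.9, 0.1), (1.0, 0.2)],
         "Jim":  [(0.1, 0.9), (0.2, 1.0)]}
evidence = exemplar_evidence((0.8, 0.2), faces)
print(max(evidence, key=evidence.get))   # prints "Adam"
```

Adaptation can then be modeled by down-weighting exemplars near a recently viewed adaptor face, which shifts the evidence landscape without any explicit norm.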

  8. Perceptual decision neurosciences: a model-based review

    NARCIS (Netherlands)

    Mulder, M.J.; van Maanen, L.; Forstmann, B.U.

    2014-01-01

    In this review we summarize findings published over the past 10 years focusing on the neural correlates of perceptual decision-making. Importantly, this review highlights only studies that employ a model-based approach, i.e., they use quantitative cognitive models in combination with neuroscientific

  9. Model-based uncertainty in species range prediction

    DEFF Research Database (Denmark)

    Pearson, R. G.; Thuiller, Wilfried; Bastos Araujo, Miguel

    2006-01-01

    Aim Many attempts to predict the potential range of species rely on environmental niche (or 'bioclimate envelope') modelling, yet the effects of using different niche-based methodologies require further investigation. Here we investigate the impact that the choice of model can have on predictions...

  10. Structuring Qualitative Data for Agent-Based Modelling

    NARCIS (Netherlands)

    Ghorbani, Amineh; Dijkema, Gerard P.J.; Schrauwen, Noortje

    2015-01-01

    Using ethnography to build agent-based models may result in more empirically grounded simulations. Our study on innovation practice and culture in the Westland horticulture sector served to explore what information and data from ethnographic analysis could be used in models and how. MAIA, a

  11. Knowledge representation to support reasoning based on multiple models

    Science.gov (United States)

    Gillam, April; Seidel, Jorge P.; Parker, Alice C.

    1990-01-01

Model Based Reasoning is a powerful tool used to design and analyze systems, which are often composed of numerous interactive, interrelated subsystems. Models of the subsystems are written independently and may be used together while they are still under development. Thus the models are not static. They evolve as information becomes obsolete, as improved artifact descriptions are developed, and as system capabilities change. Researchers are using three methods to support knowledge/database growth, to track the model evolution, and to handle knowledge from diverse domains. First, the representation methodology is based on having pools, or types, of knowledge from which each model is constructed. Second, information is explicit. This includes the interactions between components, the description of the artifact structure, and the constraints and limitations of the models. The third principle we have followed is the separation of the data and knowledge from the inferencing and equation solving mechanisms. This methodology is used in two distinct knowledge-based systems: one for the design of space systems and another for the synthesis of VLSI circuits. It has facilitated the growth and evolution of our models, made accountability of results explicit, and provided credibility for the user community. These capabilities have been implemented and are being used in actual design projects.

  12. Map-based model of the cardiac action potential

    International Nuclear Information System (INIS)

    Pavlov, Evgeny A.; Osipov, Grigory V.; Chan, C.K.; Suykens, Johan A.K.

    2011-01-01

A simple, computationally efficient model which is capable of replicating the basic features of the cardiac cell action potential is proposed. The model is a four-dimensional map and demonstrates good correspondence with real cardiac cells. Various regimes of cardiac activity which can be reproduced by the proposed model are shown. Bifurcation mechanisms of these regime transitions are explained using phase space analysis. The dynamics of 1D and 2D lattices of coupled maps which model the behavior of electrically connected cells are discussed in the context of synchronization theory. -- Highlights: → Recent experimental-data-based models are complicated for analysis and simulation. → A simplified map-based model of the cardiac cell is constructed. → The model is capable of replicating different types of cardiac activity. → The spatio-temporal dynamics of ensembles of coupled maps are investigated. → The data obtained are analyzed in the context of biophysical processes in the myocardium.
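
To give the generic flavor of map-based excitable-cell modeling, here is iteration of a two-dimensional Rulkov-type map, a classic discrete-time model of excitable dynamics. This is not the authors' four-dimensional cardiac map; the parameter values are one commonly quoted illustrative choice:

```python
def rulkov_step(x, y, alpha=4.1, mu=0.001, sigma=-1.5):
    """One iteration of a Rulkov-type two-dimensional map: x is the
    fast (membrane-voltage-like) variable, y the slow recovery
    variable. Map-based models replace ODE integration with cheap
    algebraic updates, which is their computational appeal."""
    x_new = alpha / (1.0 + x * x) + y
    y_new = y - mu * (x - sigma)
    return x_new, y_new

x, y = -1.0, -3.0
trace = []
for _ in range(5000):
    x, y = rulkov_step(x, y)
    trace.append(x)
print(max(trace), min(trace))
```

Lattices of such maps, coupled through the fast variable, are the discrete analogue of the electrically connected cell ensembles discussed in the abstract.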

  13. Map-based model of the cardiac action potential

    Energy Technology Data Exchange (ETDEWEB)

    Pavlov, Evgeny A., E-mail: genie.pavlov@gmail.com [Department of Computational Mathematics and Cybernetics, Nizhny Novgorod State University, 23, Gagarin Avenue, 603950 Nizhny Novgorod (Russian Federation); Osipov, Grigory V. [Department of Computational Mathematics and Cybernetics, Nizhny Novgorod State University, 23, Gagarin Avenue, 603950 Nizhny Novgorod (Russian Federation); Chan, C.K. [Institute of Physics, Academia Sinica, 128 Sec. 2, Academia Road, Nankang, Taipei 115, Taiwan (China); Suykens, Johan A.K. [K.U. Leuven, ESAT-SCD/SISTA, Kasteelpark Arenberg 10, B-3001 Leuven (Heverlee) (Belgium)

    2011-07-25

A simple, computationally efficient model which is capable of replicating the basic features of the cardiac cell action potential is proposed. The model is a four-dimensional map and demonstrates good correspondence with real cardiac cells. Various regimes of cardiac activity which can be reproduced by the proposed model are shown. Bifurcation mechanisms of these regime transitions are explained using phase space analysis. The dynamics of 1D and 2D lattices of coupled maps which model the behavior of electrically connected cells are discussed in the context of synchronization theory. -- Highlights: → Recent experimental-data-based models are complicated for analysis and simulation. → A simplified map-based model of the cardiac cell is constructed. → The model is capable of replicating different types of cardiac activity. → The spatio-temporal dynamics of ensembles of coupled maps are investigated. → The data obtained are analyzed in the context of biophysical processes in the myocardium.

  14. Comparisons of complex network based models and real train flow model to analyze Chinese railway vulnerability

    International Nuclear Information System (INIS)

    Ouyang, Min; Zhao, Lijing; Hong, Liu; Pan, Zhezhe

    2014-01-01

Recently, numerous studies have applied complex network based models to study the performance and vulnerability of infrastructure systems under various types of attacks and hazards, but how effectively these models capture the real performance response is still a question worthy of research. Taking the Chinese railway system as an example, this paper selects three typical complex network based models, including the purely topological model (PTM), the purely shortest path model (PSPM), and the weight (link length) based shortest path model (WBSPM), to analyze railway accessibility and flow-based vulnerability and compare their results with those from the real train flow model (RTFM). The results show that the WBSPM can produce train routes with 83% of stations and 77% of railway links identical to the real routes and approaches the RTFM best for railway vulnerability under both single and multiple component failures. The correlation coefficient for accessibility vulnerability from the WBSPM and RTFM under single station failures is 0.96, while it is 0.92 for flow-based vulnerability; under multiple station failures, where each station has the same failure probability fp, the WBSPM produces almost identical vulnerability results to those from the RTFM under almost all failure scenarios when fp is larger than 0.62 for accessibility vulnerability and 0.86 for flow-based vulnerability.
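
A weight-based shortest-path model like the WBSPM boils down to shortest paths over a link-length-weighted graph, and accessibility vulnerability to the loss of reachability when stations fail. A minimal sketch with Dijkstra's algorithm (the four-station network is invented for illustration):

```python
import heapq

def shortest_path_len(graph, src, dst, removed=frozenset()):
    """Dijkstra over a weighted rail graph (adjacency dict:
    station -> {neighbor: link length in km}). Returns None if
    dst is unreachable, e.g. after the station failures listed
    in `removed`."""
    if src in removed or dst in removed:
        return None
    dist = {src: 0.0}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            return d
        if d > dist.get(u, float("inf")):
            continue          # stale queue entry
        for v, w in graph[u].items():
            if v in removed:
                continue
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return None

rail = {"A": {"B": 100}, "B": {"A": 100, "C": 150},
        "C": {"B": 150, "D": 120}, "D": {"C": 120}}
print(shortest_path_len(rail, "A", "D"))                # 370.0 (intact)
print(shortest_path_len(rail, "A", "D", removed={"C"})) # None (C fails)
```

Averaging such reachability changes over all station pairs and failure scenarios gives the accessibility vulnerability measure the paper compares across models.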

  15. [Model-based biofuels system analysis: a review].

    Science.gov (United States)

    Chang, Shiyan; Zhang, Xiliang; Zhao, Lili; Ou, Xunmin

    2011-03-01

Model-based system analysis is an important tool for evaluating the potential and impacts of biofuels, and for drafting biofuels technology roadmaps and targets. The broad reach of the biofuels supply chain requires that biofuels system analyses span a range of disciplines, including agriculture/forestry, energy, economics, and the environment. Here we review various models developed for or applied to modeling biofuels, and present a critical analysis of Agriculture/Forestry System Models, Energy System Models, Integrated Assessment Models, Micro-level Cost, Energy and Emission Calculation Models, and Specific Macro-level Biofuel Models. We focus on the models' strengths, weaknesses, and applicability, facilitating the selection of a suitable type of model for specific issues. Such an analysis is a prerequisite for future biofuels system modeling and represents a valuable resource for researchers and policy makers.

  16. Modeling analysis of pulsed magnetization process of magnetic core based on inverse Jiles-Atherton model

    Science.gov (United States)

    Liu, Yi; Zhang, He; Liu, Siwei; Lin, Fuchang

    2018-05-01

    The J-A (Jiles-Atherton) model is widely used to describe the magnetization characteristics of magnetic cores in a low-frequency alternating field. However, this model is deficient in the quantitative analysis of the eddy current loss and residual loss in a high-frequency magnetic field. Based on the decomposition of magnetization intensity, an inverse J-A model is established which uses magnetic flux density B as an input variable. Static and dynamic core losses under high frequency excitation are separated based on the inverse J-A model. Optimized parameters of the inverse J-A model are obtained based on particle swarm optimization. The platform for the pulsed magnetization characteristic test is designed and constructed. The hysteresis curves of ferrite and Fe-based nanocrystalline cores at high magnetization rates are measured. The simulated and measured hysteresis curves are presented and compared. It is found that the inverse J-A model can be used to describe the magnetization characteristics at high magnetization rates and to separate the static loss and dynamic loss accurately.
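
The separation of static and dynamic core loss rests on the fact that the energy dissipated per cycle and unit volume equals the enclosed B-H loop area, W = ∮ H dB: the extra area of a high-rate loop over the quasi-static loop is the dynamic (eddy current plus residual) contribution. A sketch of this bookkeeping with synthetic elliptical loops (not the inverse J-A model itself; amplitudes and phase lags are invented):

```python
import math

def loop_energy(points):
    """Energy loss per cycle and unit volume of a closed B-H loop,
    W = ∮ H dB, evaluated with the trapezoidal rule over the
    (H, B) samples of one full cycle."""
    w = 0.0
    for (h0, b0), (h1, b1) in zip(points, points[1:] + points[:1]):
        w += 0.5 * (h0 + h1) * (b1 - b0)
    return w

def ellipse_loop(h0, b0, phase, n=3600):
    # synthetic loop: sinusoidal B lagging H by `phase` radians,
    # enclosing the analytic area pi * h0 * b0 * sin(phase)
    return [(h0 * math.sin(t + phase), b0 * math.sin(t))
            for t in (2 * math.pi * i / n for i in range(n))]

static = loop_energy(ellipse_loop(100.0, 1.5, 0.2))   # quasi-static sweep
fast = loop_energy(ellipse_loop(100.0, 1.5, 0.6))     # high-rate sweep
print(fast - static)   # extra (dynamic) loss at the higher rate
```

In the paper this subtraction is done with loops generated by the inverse J-A model rather than synthetic ellipses, but the energy accounting is the same.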

  17. 3D virtual human rapid modeling method based on top-down modeling mechanism

    Directory of Open Access Journals (Sweden)

    LI Taotao

    2017-01-01

    Full Text Available Aiming to satisfy the vast custom-made character demand of 3D virtual human and the rapid modeling in the field of 3D virtual reality, a new virtual human top-down rapid modeling method is put for-ward in this paper based on the systematic analysis of the current situation and shortage of the virtual hu-man modeling technology. After the top-level realization of virtual human hierarchical structure frame de-sign, modular expression of the virtual human and parameter design for each module is achieved gradu-al-level downwards. While the relationship of connectors and mapping restraints among different modules is established, the definition of the size and texture parameter is also completed. Standardized process is meanwhile produced to support and adapt the virtual human top-down rapid modeling practice operation. Finally, the modeling application, which takes a Chinese captain character as an example, is carried out to validate the virtual human rapid modeling method based on top-down modeling mechanism. The result demonstrates high modelling efficiency and provides one new concept for 3D virtual human geometric mod-eling and texture modeling.

  18. Mechanics model for actin-based motility.

    Science.gov (United States)

    Lin, Yuan

    2009-02-01

    We present here a mechanics model for the force generation by actin polymerization. The possible adhesions between the actin filaments and the load surface, as well as the nucleation and capping of filament tips, are included in this model on top of the well-known elastic Brownian ratchet formulation. A closed form solution is provided from which the force-velocity relationship, summarizing the mechanics of polymerization, can be drawn. Model predictions on the velocity of moving beads driven by actin polymerization are consistent with experimental observations. This model also seems capable of explaining the enhanced actin-based motility of Listeria monocytogenes and beads in the presence of vasodilator-stimulated phosphoprotein (VASP), as observed in recent experiments.
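
    For context, the simplest (rigid-filament) Brownian ratchet gives an exponential force-velocity relation. The snippet below shows that generic textbook relation, not the paper's closed-form solution, and the numerical constants are typical illustrative values:

```python
import math

# Classic (non-elastic) Brownian ratchet force-velocity relation, shown as a
# generic illustration -- NOT the closed-form solution of the paper:
#     v(F) = v_max * exp(-F * delta / (kB * T)),
# where delta is the monomer step size projected onto the load direction.
kB_T = 4.1e-21   # thermal energy at ~300 K, in joules
delta = 2.7e-9   # actin monomer half-size, metres (typical value)

def ratchet_velocity(F, v_max=1.0):
    """Velocity (in units of v_max) against a load force F in newtons."""
    return v_max * math.exp(-F * delta / kB_T)

# Zero load gives the free polymerization speed; load slows growth exponentially.
v0 = ratchet_velocity(0.0)
v1 = ratchet_velocity(1e-12)   # 1 pN load
```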

  19. Discounted cost model for condition-based maintenance optimization

    International Nuclear Information System (INIS)

    Weide, J.A.M. van der; Pandey, M.D.; Noortwijk, J.M. van

    2010-01-01

    This paper presents methods to evaluate the reliability and optimize the maintenance of engineering systems that are damaged by shocks or transients arriving randomly in time, with overall degradation modeled as a cumulative stochastic point process. The paper presents a conceptually clear and comprehensive derivation of formulas for computing the discounted cost associated with a maintenance policy combining both condition-based and age-based criteria for preventive maintenance. The proposed discounted cost model provides a more realistic basis for optimizing maintenance policies than those based on the asymptotic, non-discounted cost rate criterion.
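
    The discounted-cost idea can be sketched with a small renewal-reward simulation: a cycle ends at whichever comes first of condition-based preventive maintenance, age-based preventive maintenance, or failure, and each cycle's cost is discounted back to the cycle start. All rates, thresholds and costs below are invented illustrative values, not the paper's:

```python
import math, random

random.seed(1)

# Shocks arrive as a Poisson process; each shock adds exponentially
# distributed damage. Preventive maintenance (PM) triggers on damage
# threshold or age limit; exceeding the failure level costs more (CM).
RATE = 0.5        # shock arrival rate per unit time
MEAN_DMG = 1.0    # mean damage per shock
PM_LEVEL = 4.0    # condition-based PM damage threshold
FAIL_LEVEL = 6.0  # failure level
AGE_LIMIT = 10.0  # age-based replacement time
C_PM, C_CM = 1.0, 5.0   # preventive vs corrective cost
R = 0.05          # continuous discount rate

def one_cycle():
    """Return (discounted cost, cycle length) for one renewal cycle."""
    t, damage = 0.0, 0.0
    while True:
        t += random.expovariate(RATE)
        if t >= AGE_LIMIT:                      # age-based PM
            return C_PM * math.exp(-R * AGE_LIMIT), AGE_LIMIT
        damage += random.expovariate(1.0 / MEAN_DMG)
        if damage >= FAIL_LEVEL:                # failure -> corrective
            return C_CM * math.exp(-R * t), t
        if damage >= PM_LEVEL:                  # condition-based PM
            return C_PM * math.exp(-R * t), t

def expected_discounted_cost(n=20000):
    # Renewal-reward with discounting: total discounted cost over an
    # infinite horizon is E[discounted cycle cost] / (1 - E[e^(-R*L)]),
    # estimated here by Monte Carlo.
    costs, decay = 0.0, 0.0
    for _ in range(n):
        c, length = one_cycle()
        costs += c
        decay += 1.0 - math.exp(-R * length)
    return costs / decay

total_cost = expected_discounted_cost()
```

    Optimizing the policy then amounts to searching over PM_LEVEL and AGE_LIMIT for the minimum of this quantity.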

  20. Promoting Model-based Definition to Establish a Complete Product Definition.

    Science.gov (United States)

    Ruemler, Shawn P; Zimmerman, Kyle E; Hartman, Nathan W; Hedberg, Thomas; Feeny, Allison Barnard

    2017-05-01

    The manufacturing industry is evolving and starting to use 3D models as the central knowledge artifact for product data and product definition, or what is known as Model-based Definition (MBD). The Model-based Enterprise (MBE) uses MBD as a way to transition away from using traditional paper-based drawings and documentation. As MBD grows in popularity, it is imperative to understand what information is needed in the transition from drawings to models so that models represent all the relevant information needed for processes to continue efficiently. Finding this information can help define what data is common amongst different models in different stages of the lifecycle, which could help establish a Common Information Model. The Common Information Model is a source that contains common information from domain specific elements amongst different aspects of the lifecycle. To help establish this Common Information Model, information about how models are used in industry within different workflows needs to be understood. To retrieve this information, a survey mechanism was administered to industry professionals from various sectors. Based on the results of the survey a Common Information Model could not be established. However, the results gave great insight that will help in further investigation of the Common Information Model.

  1. Cluster-based analysis of multi-model climate ensembles

    Science.gov (United States)

    Hyde, Richard; Hossaini, Ryan; Leeson, Amber A.

    2018-06-01

    Clustering - the automated grouping of similar data - can provide powerful and unique insight into large and complex data sets, in a fast and computationally efficient manner. While clustering has been used in a variety of fields (from medical image processing to economics), its application within atmospheric science has been fairly limited to date, and the potential benefits of applying advanced clustering techniques to climate data (both model output and observations) have yet to be fully realised. In this paper, we explore the specific application of clustering to a multi-model climate ensemble. We hypothesise that clustering techniques can provide (a) a flexible, data-driven method of testing model-observation agreement and (b) a mechanism with which to identify model development priorities. We focus our analysis on chemistry-climate model (CCM) output of tropospheric ozone - an important greenhouse gas - from the recent Atmospheric Chemistry and Climate Model Intercomparison Project (ACCMIP). Tropospheric column ozone from the ACCMIP ensemble was clustered using the Data Density based Clustering (DDC) algorithm. We find that a multi-model mean (MMM) calculated using members of the most-populous cluster identified at each location offers a reduction of up to ~20 % in the global absolute mean bias between the MMM and an observed satellite-based tropospheric ozone climatology, with respect to a simple, all-model MMM. On a spatial basis, the bias is reduced at ~62 % of all locations, with the largest bias reductions occurring in the Northern Hemisphere - where ozone concentrations are relatively large. However, the bias is unchanged at 9 % of all locations and increases at 29 %, particularly in the Southern Hemisphere. The latter demonstrates that although cluster-based subsampling acts to remove outlier model data, such data may in fact be closer to observed values in some locations. We further demonstrate that clustering can provide a viable and
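
    The cluster-then-average idea can be sketched at a single grid cell. The paper uses the Data Density based Clustering algorithm; the sketch below substitutes a much simpler gap-based 1-D grouping, and the ozone values are invented for illustration:

```python
# Group the per-model values at one location, then average only the
# most-populous cluster instead of all ensemble members.
def gap_clusters(values, eps):
    """Group sorted values into clusters wherever consecutive gaps <= eps."""
    vs = sorted(values)
    clusters, current = [], [vs[0]]
    for v in vs[1:]:
        if v - current[-1] <= eps:
            current.append(v)
        else:
            clusters.append(current)
            current = [v]
    clusters.append(current)
    return clusters

# Tropospheric column ozone (toy values, Dobson units) from 8 models at one cell.
ozone = [28.0, 29.5, 30.1, 30.4, 31.0, 38.5, 39.0, 40.0]

clusters = gap_clusters(ozone, eps=2.0)
largest = max(clusters, key=len)

all_model_mean = sum(ozone) / len(ozone)      # simple all-model MMM
cluster_mean = sum(largest) / len(largest)    # most-populous-cluster MMM

observed = 30.0                                # toy observed climatology value
bias_all = abs(all_model_mean - observed)
bias_cluster = abs(cluster_mean - observed)
```

    With the one-sided outliers above, the cluster-based mean lands much closer to the observation than the all-model mean, mirroring the bias reduction reported in the abstract.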

  2. Designing Network-based Business Model Ontology

    DEFF Research Database (Denmark)

    Hashemi Nekoo, Ali Reza; Ashourizadeh, Shayegheh; Zarei, Behrouz

    2015-01-01

    Survival in a dynamic environment is not achieved without a map. Scanning and monitoring of the market show business models to be a fruitful tool. But scholars believe that old-fashioned business models are dead, as they do not include the effects of the internet and networks. This paper...... is going to propose an e-business model ontology from the network point of view and its application in the real world. The suggested ontology for network-based businesses is composed of individuals' characteristics and what kind of resources they own, as well as their connections and pre-conceptions of connections...... such as shared mental models and trust. However, it mostly covers previous business model elements. To confirm the applicability of this ontology, it has been implemented in a business angel network to show how it works....

  3. Agent-based models in economics a toolkit

    CERN Document Server

    Fagiolo, Giorgio; Gallegati, Mauro; Richiardi, Matteo; Russo, Alberto

    2018-01-01

    In contrast to mainstream economics, complexity theory conceives the economy as a complex system of heterogeneous interacting agents characterised by limited information and bounded rationality. Agent Based Models (ABMs) are the analytical and computational tools developed by the proponents of this emerging methodology. Aimed at students and scholars of contemporary economics, this book includes a comprehensive toolkit for agent-based computational economics, now quickly becoming the new way to study evolving economic systems. Leading scholars in the field explain how ABMs can be applied fruitfully to many real-world economic examples and represent a great advancement over mainstream approaches. The essays discuss the methodological bases of agent-based approaches and demonstrate step-by-step how to build, simulate and analyse ABMs and how to validate their outputs empirically using the data. They also present a wide set of applications of these models to key economic topics, including the business cycle, lab...

  4. GENESIS - The GENEric SImulation System for Modelling State Transitions.

    Science.gov (United States)

    Gillman, Matthew S

    2017-09-20

    This software implements a discrete time Markov chain model, used to model transitions between states when the transition probabilities are known a priori. It is highly configurable; the user supplies two text files, a "state transition table" and a "config file", to the Perl script genesis.pl. Given the content of these files, the script generates a set of C++ classes based on the State design pattern, and a main program, which can then be compiled and run. The C++ code generated is based on the specification in the text files. Both multiple branching and bi-directional transitions are allowed. The software has been used to model the natural histories of colorectal cancer in Mexico. Although written primarily to model such disease processes, it can be used in any process which depends on discrete states with known transition probabilities between those states. One suitable area may be environmental modelling. A test suite is supplied with the distribution. Due to its high degree of configurability and flexibility, this software has good re-use potential. It is stored on the Figshare repository.
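
    The mechanism GENESIS implements, a discrete-time Markov chain driven by an a-priori transition table, can be sketched in a few lines. The states and probabilities below are hypothetical, loosely echoing the disease-progression use case:

```python
import random

random.seed(42)

# Hypothetical three-state "state transition table": each row maps a state
# to its successor states and a-priori transition probabilities.
TRANSITIONS = {
    "healthy": [("healthy", 0.90), ("polyp", 0.10)],
    "polyp":   [("polyp", 0.70), ("healthy", 0.10), ("cancer", 0.20)],
    "cancer":  [("cancer", 1.00)],          # absorbing state
}

def step(state):
    """One discrete-time Markov transition drawn from the table."""
    r, acc = random.random(), 0.0
    for nxt, p in TRANSITIONS[state]:
        acc += p
        if r < acc:
            return nxt
    return state  # guard against floating-point round-off

def simulate(start="healthy", steps=50):
    state, history = start, [start]
    for _ in range(steps):
        state = step(state)
        history.append(state)
    return history

history = simulate()
```

    Note the bi-directional transition (polyp back to healthy) and the multiple branching from each state, both of which the abstract says GENESIS supports.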

  5. A model-based approach to estimating forest area

    Science.gov (United States)

    Ronald E. McRoberts

    2006-01-01

    A logistic regression model based on forest inventory plot data and transformations of Landsat Thematic Mapper satellite imagery was used to predict the probability of forest for 15 study areas in Indiana, USA, and 15 in Minnesota, USA. Within each study area, model-based estimates of forest area were obtained for circular areas with radii of 5 km, 10 km, and 15 km and...
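
    The prediction step of such a model is straightforward to sketch: a logistic function maps a spectral predictor to a probability of forest, and averaging predicted probabilities over a region yields a model-based area estimate. The coefficients and pixel values below are assumptions for illustration, not fitted values:

```python
import math

# Logistic model predicting probability of forest from a single (toy)
# spectral predictor x; b0 and b1 are made-up coefficients, not fitted
# to forest inventory plot data.
def forest_probability(x, b0=-4.0, b1=0.08):
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))

# Model-based area estimate for a circular study region: mean predicted
# probability over pixels times the region's total area.
pixels = [20, 35, 50, 60, 75, 90]      # toy predictor values
region_area_ha = 7853.98               # e.g. a 5-km-radius circle, hectares

p = [forest_probability(x) for x in pixels]
forest_area_ha = region_area_ha * sum(p) / len(p)
```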

  6. A Modified Critical State Two-surface Plasticity Model for Sand

    DEFF Research Database (Denmark)

    Bakmar, Christian LeBlanc; Hededal, O.; Ibsen, Lars Bo

    This paper provides background information and documentation for the implementation of a robust plasticity model as a user-subroutine in the commercial finite difference code, FLAC3D by Itasca. The plasticity model presented is equal to the 3 dimensional critical state two-surface plasticity model...... volumetric and stress-strain behaviour under monotonic and cyclic loading and thereby related observations like accumulation of pore pressure, cyclic mobility and cyclic liquefaction. The plasticity model is implemented with an integration scheme based on the general return mapping algorithm. The integration...... scheme faces convergence difficulties, primarily at very low mean effective stresses. The convergence problems are addressed by suitable correction strategies designed to add robustness, stability and efficiency to the integration scheme. An outline of all model parameters is given with suggestions...

  7. Base Flow Model Validation, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The program focuses on turbulence modeling enhancements for predicting high-speed rocket base flows. A key component of the effort is the collection of high-fidelity...

  8. Are Integrated Portfolio Systems the Answer? An Evaluation of a Web-Based Portfolio System to Improve Preservice Teachers' Reflective Thinking Skills

    Science.gov (United States)

    Oner, Diler; Adadan, Emine

    2016-01-01

    This study investigated the effectiveness of an integrated web-based portfolio system, namely the BOUNCE System, which primarily focuses on improving preservice teachers' reflective thinking skills. BOUNCE©, the software component of the system, was designed and developed to support a teaching practice model including a cycle of activities to be…

  9. Unsteady aerodynamic modelling of wind turbines

    Energy Technology Data Exchange (ETDEWEB)

    Coton, F.N.; Galbraith, R.A. [Univ. of Glasgow, Dept. of Aerospace Engineering, Glasgow (United Kingdom)

    1997-08-01

    The following current and future work is discussed: Collaborative wind tunnel based PIV project to study wind turbine wake structures in head-on and yawed flow. Prescribed wake model has been embedded in a source panel representation of the wind tunnel walls to allow comparison with experiment; Modelling of tower shadow using high resolution but efficient vortex model in tower shadow domain; Extension of model to yawing flow; Upgrading and tuning of unsteady aerodynamic model for low speed, thick airfoil flows. Glasgow has a considerable collection of low speed dynamic stall data. Currently, the Leishman - Beddoes model is not ideally suited to such flows. For example: Range of stall onset criteria used for dynamic stall prediction including Beddoes. Wide variation of stall onset prediction. Beddoes representation was developed primarily with reference to compressible flows. Analyses of low speed data from Glasgow indicate deficiencies in the current model; Predicted versus measured response during ramp down motion. Modification of the Beddoes representation is required to obtain a fit with the measured data. (EG)

  10. Introducing Waqf Based Takaful Model in India

    Directory of Open Access Journals (Sweden)

    Syed Ahmed Salman

    2014-03-01

    Full Text Available Objective – Waqf is a unique feature of the socioeconomic system of Islam in a multi-religious and developing country like India. India is a country rich in waqf assets, and the history of waqf in India can be traced back 800 years. Most researchers suggest how waqf can be used as a tool to mitigate the poverty of Muslims. India has the third highest Muslim population after Indonesia and Pakistan. However, the majority of Muslims belong to the low income group and they are in need of help. It is believed that waqf can be utilized for the betterment of the Indian Muslim community. Among the available uses of waqf assets, the main objective of this paper is to introduce a waqf-based takaful model in India. In addition, how this proposed model can be adopted in India is highlighted. Methods – Library research is applied, since this paper relies on secondary data, by thoroughly reviewing the most relevant literature. Result – India, as a country rich in waqf assets, should fully utilize these resources to help Muslims through takaful. Conclusion – In this study, we have proposed a waqf-based takaful model for India combining the concepts of mudarabah and wakalah. We recommend this model based on the background and situation of the country. Since we have not tested the viability of this model in India, future research should continue with this testing. Keywords: Waqf, Takaful, Poverty, India

  11. TP-model transformation-based-control design frameworks

    CERN Document Server

    Baranyi, Péter

    2016-01-01

    This book covers new aspects and frameworks of control, design, and optimization based on the TP model transformation and its various extensions. The author outlines the three main steps of polytopic and LMI based control design: 1) development of the qLPV state-space model, 2) generation of the polytopic model; and 3) application of LMI to derive controller and observer. He goes on to describe why literature has extensively studied LMI design, but has not focused much on the second step, in part because the generation and manipulation of the polytopic form was not tractable in many cases. The author then shows how the TP model transformation facilitates this second step and hence reveals new directions, leading to powerful design procedures and the formulation of new questions. The chapters of this book, and the complex dynamical control tasks which they cover, are organized so as to present and analyze the beneficial aspect of the family of approaches (control, design, and optimization). Additionally, the b...

  12. Variability-Specific Abstraction Refinement for Family-Based Model Checking

    DEFF Research Database (Denmark)

    Dimovski, Aleksandar; Wasowski, Andrzej

    2017-01-01

    and property, while the number of possible scenarios is very large. In this work, we present an automatic iterative abstraction refinement procedure for family-based model checking. We use Craig interpolation to refine abstract variational models based on the obtained spurious counterexamples (traces...

  13. Augment clinical measurement using a constraint-based esophageal model

    Science.gov (United States)

    Kou, Wenjun; Acharya, Shashank; Kahrilas, Peter; Patankar, Neelesh; Pandolfino, John

    2017-11-01

    Quantifying the mechanical properties of the esophageal wall is crucial to understanding impairments of trans-esophageal flow characteristic of several esophageal diseases. However, these data are unavailable owing to technological limitations of current clinical diagnostic instruments, which instead display esophageal luminal cross sectional area based on intraluminal impedance change. In this work, we developed an esophageal model to predict bolus flow and wall properties based on clinical measurements. The model uses the constraint-based immersed-boundary method developed previously by our group. Specifically, we first approximate the time-dependent wall geometry based on impedance planimetry data on luminal cross sectional area. We then feed these, along with pressure data, into the model and compute wall tension based on the simulated pressure and flow fields, and the material property based on the strain-stress relationship. As examples, we applied this model to augment FLIP (Functional Luminal Imaging Probe) measurements in three clinical cases: a normal subject, achalasia, and eosinophilic esophagitis (EoE). Our findings suggest that wall stiffness was greatest in the EoE case, followed by the achalasia case, and then the normal subject. This work is supported by NIH Grants R01 DK56033 and R01 DK079902.

  14. Development of a robust model-based reactivity control system

    International Nuclear Information System (INIS)

    Rovere, L.A.; Otaduy, P.J.; Brittain, C.R.

    1990-01-01

    This paper describes the development and implementation of a digital model-based reactivity control system that incorporates knowledge of the plant physics into the control algorithm to improve system performance. This controller is composed of a model-based module and a modified proportional-integral-derivative (PID) module. The model-based module has an estimation component to synthesize unmeasurable process variables that are necessary for the control action computation. These estimated variables, besides being used within the control algorithm, will be used for diagnostic purposes by a supervisory control system under development. The PID module compensates for inaccuracies in model coefficients by supplementing the model-based output with a correction term that eliminates any demand tracking or steady state errors. This control algorithm has been applied to develop controllers for a simulation of liquid metal reactors in a multimodular plant. It has shown its capability to track demands in neutron power much more accurately than conventional controllers, reducing overshoots to an almost negligible value while providing a good degree of robustness to unmodeled dynamics. 10 refs., 4 figs
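
    The controller structure described, a model-based feedforward term plus a PID correction that removes residual tracking error, can be sketched against a toy first-order plant. All gains and the plant model below are illustrative assumptions, not the reactor model:

```python
# Model-based feedforward plus PID correction, exercised on a toy plant.
class ModelBasedPID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral, self.prev_err = 0.0, 0.0

    def feedforward(self, demand):
        # Stand-in for the plant-physics model: assume a steady-state
        # gain of 2.0, so the model asks for u = demand / 2.
        return demand / 2.0

    def update(self, demand, measured):
        err = demand - measured
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return (self.feedforward(demand)
                + self.kp * err + self.ki * self.integral + self.kd * deriv)

# First-order plant with gain 2.0 and unit time constant, Euler-integrated:
# dy/dt = -y + 2u. The feedforward handles the bulk of the demand; the PID
# term mops up transients and any model mismatch.
ctrl = ModelBasedPID(kp=0.5, ki=1.0, kd=0.0, dt=0.01)
y, demand = 0.0, 1.0
for _ in range(2000):
    u = ctrl.update(demand, y)
    y += 0.01 * (-y + 2.0 * u)

steady_state_error = abs(demand - y)
```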

  15. Agent-based modeling and simulation Part 3 : desktop ABMS.

    Energy Technology Data Exchange (ETDEWEB)

    Macal, C. M.; North, M. J.; Decision and Information Sciences

    2007-01-01

    Agent-based modeling and simulation (ABMS) is a new approach to modeling systems comprised of autonomous, interacting agents. ABMS promises to have far-reaching effects on the way that businesses use computers to support decision-making and researchers use electronic laboratories to support their research. Some have gone so far as to contend that ABMS 'is a third way of doing science,' in addition to traditional deductive and inductive reasoning (Axelrod 1997b). Computational advances have made possible a growing number of agent-based models across a variety of application domains. Applications range from modeling agent behavior in the stock market, supply chains, and consumer markets, to predicting the spread of epidemics, the threat of bio-warfare, and the factors responsible for the fall of ancient civilizations. This tutorial describes the theoretical and practical foundations of ABMS, identifies toolkits and methods for developing agent models, and illustrates the development of a simple agent-based model of shopper behavior using spreadsheets.
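
    A minimal desktop-style agent model in the spirit of the shopper example might look like the following; the price-adjustment rule and all numbers are invented for illustration:

```python
import random

random.seed(7)

# Shoppers are autonomous agents: each picks the cheaper of two shops,
# with occasional random exploration; shops adjust prices toward demand.
class Shopper:
    def __init__(self, explore=0.1):
        self.explore = explore

    def choose(self, prices):
        if random.random() < self.explore:       # occasional random try
            return random.randrange(len(prices))
        return min(range(len(prices)), key=lambda i: prices[i])

prices = [10.0, 12.0]
shoppers = [Shopper() for _ in range(100)]

for _ in range(50):                              # simulation ticks
    visits = [0] * len(prices)
    for s in shoppers:
        visits[s.choose(prices)] += 1
    # Shops with few customers cut prices; busy shops raise them slightly.
    for i in range(len(prices)):
        share = visits[i] / len(shoppers)
        prices[i] = max(1.0, prices[i] + 0.5 * (share - 0.5))

price_gap = abs(prices[0] - prices[1])
```

    The emergent behaviour here - prices converging as agents chase the cheaper shop - is exactly the kind of system-level property that is not written into any single agent's rule.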

  16. Model identification methodology for fluid-based inerters

    Science.gov (United States)

    Liu, Xiaofu; Jiang, Jason Zheng; Titurus, Branislav; Harrison, Andrew

    2018-06-01

    The inerter is the mechanical dual of the capacitor via the force-current analogy. It has the property that the force across its terminals is proportional to their relative acceleration. Compared with flywheel-based inerters, fluid-based forms have the advantages of improved durability, inherent damping and simplicity of design. In order to improve the understanding of the physical behaviour of this fluid-based device, especially the effects caused by the hydraulic resistance and fluid inertia in the external tube, this work proposes a comprehensive model identification methodology. Firstly, a modelling procedure is established, which allows the topological arrangement of the mechanical networks to be obtained by mapping the damping, inertance and stiffness effects directly to their respective hydraulic counterparts. Secondly, an experimental sequence is followed, which separates the identification of friction, stiffness and various damping effects. Furthermore, an experimental set-up is introduced, in which two pressure gauges are used to accurately measure the pressure drop across the external tube. Theoretical models with improved confidence are obtained using the proposed methodology for a helical-tube fluid inerter prototype. The sources of the remaining discrepancies are further analysed.

  17. Similar words analysis based on POS-CBOW language model

    Directory of Open Access Journals (Sweden)

    Dongru RUAN

    2015-10-01

    Full Text Available Similar words analysis is one of the important topics in the field of natural language processing, and it has important research and application value in text classification, machine translation and information recommendation. Focusing on the features of Sina Weibo's short texts, this paper presents a language model named POS-CBOW, a continuous bag-of-words language model with an added filtering layer and part-of-speech tagging layer. The proposed approach can adjust word similarity according to both the cosine similarity of the word vectors and their part-of-speech metrics. It can also filter the set of similar words on the basis of a statistical analysis model. The experimental results show that the similar words analysis algorithm based on the proposed POS-CBOW language model performs better than one based on the traditional CBOW language model.
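
    The core comparison in similar-word search, cosine similarity between word vectors, can be sketched as follows (the 4-dimensional vectors are toy values, not CBOW embeddings):

```python
import math

# Cosine similarity between word vectors: the dot product normalized by
# both vector lengths, so only direction matters, not magnitude.
def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

# Toy 4-dimensional "embeddings" for illustration only.
vectors = {
    "happy": [0.9, 0.1, 0.3, 0.0],
    "glad":  [0.8, 0.2, 0.4, 0.1],
    "table": [0.0, 0.9, 0.1, 0.8],
}

def most_similar(word, k=1):
    others = [(w, cosine(vectors[word], v))
              for w, v in vectors.items() if w != word]
    return sorted(others, key=lambda t: -t[1])[:k]

best = most_similar("happy")[0]
```

    A part-of-speech layer as in POS-CBOW would further reweight or filter these candidates by tag compatibility before ranking.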

  18. A 4DCT imaging-based breathing lung model with relative hysteresis

    Energy Technology Data Exchange (ETDEWEB)

    Miyawaki, Shinjiro; Choi, Sanghun [IIHR – Hydroscience & Engineering, The University of Iowa, Iowa City, IA 52242 (United States); Hoffman, Eric A. [Department of Biomedical Engineering, The University of Iowa, Iowa City, IA 52242 (United States); Department of Medicine, The University of Iowa, Iowa City, IA 52242 (United States); Department of Radiology, The University of Iowa, Iowa City, IA 52242 (United States); Lin, Ching-Long, E-mail: ching-long-lin@uiowa.edu [IIHR – Hydroscience & Engineering, The University of Iowa, Iowa City, IA 52242 (United States); Department of Mechanical and Industrial Engineering, The University of Iowa, 3131 Seamans Center, Iowa City, IA 52242 (United States)

    2016-12-01

    To reproduce realistic airway motion and airflow, the authors developed a deforming lung computational fluid dynamics (CFD) model based on four-dimensional (4D, space and time) dynamic computed tomography (CT) images. A total of 13 time points within controlled tidal volume respiration were used to account for realistic and irregular lung motion in human volunteers. Because of the irregular motion of 4DCT-based airways, we identified an optimal interpolation method for airway surface deformation during respiration, and implemented a computational solid mechanics-based moving mesh algorithm to produce smooth deforming airway mesh. In addition, we developed physiologically realistic airflow boundary conditions for both models based on multiple images and a single image. Furthermore, we examined simplified models based on one or two dynamic or static images. By comparing these simplified models with the model based on 13 dynamic images, we investigated the effects of relative hysteresis of lung structure with respect to lung volume, lung deformation, and imaging methods, i.e., dynamic vs. static scans, on CFD-predicted pressure drop. The effect of imaging method on pressure drop was 24 percentage points due to the differences in airflow distribution and airway geometry. - Highlights: • We developed a breathing human lung CFD model based on 4D-dynamic CT images. • The 4DCT-based breathing lung model is able to capture lung relative hysteresis. • A new boundary condition for lung model based on one static CT image was proposed. • The difference between lung models based on 4D and static CT images was quantified.

  19. INDIVIDUAL BASED MODELLING APPROACH TO THERMAL ...

    Science.gov (United States)

    Diadromous fish populations in the Pacific Northwest face challenges along their migratory routes from declining habitat quality, harvest, and barriers to longitudinal connectivity. Changes in river temperature regimes are producing an additional challenge for upstream migrating adult salmon and steelhead, species that are sensitive to absolute and cumulative thermal exposure. Adult salmon populations have been shown to utilize cold water patches along migration routes when mainstem river temperatures exceed thermal optimums. We are employing an individual based model (IBM) to explore the costs and benefits of spatially-distributed cold water refugia for adult migrating salmon. Our model, developed in the HexSim platform, is built around a mechanistic behavioral decision tree that drives individual interactions with their spatially explicit simulated environment. Population-scale responses to dynamic thermal regimes, coupled with other stressors such as disease and harvest, become emergent properties of the spatial IBM. Other model outputs include arrival times, species-specific survival rates, body energetic content, and reproductive fitness levels. Here, we discuss the challenges associated with parameterizing an individual based model of salmon and steelhead in a section of the Columbia River. Many rivers and streams in the Pacific Northwest are currently listed as impaired under the Clean Water Act as a result of high summer water temperatures. Adverse effec

  20. Recent Advances in Modeling of the Atmospheric Boundary Layer and Land Surface in the Coupled WRF-CMAQ Model

    Science.gov (United States)

    Advances in the land surface model (LSM) and planetary boundary layer (PBL) components of the WRF-CMAQ coupled meteorology and air quality modeling system are described. The aim of these modifications was primarily to improve the modeling of ground level concentrations of trace c...

  1. Hebbian learning in a model with dynamic rate-coded neurons: an alternative to the generative model approach for learning receptive fields from natural scenes.

    Science.gov (United States)

    Hamker, Fred H; Wiltschut, Jan

    2007-09-01

    Most computational models of coding are based on a generative model according to which the feedback signal aims to reconstruct the visual scene as closely as possible. We here explore an alternative model of feedback. It is derived from studies of attention and is thus probably more flexible with respect to attentive processing in higher brain areas. According to this model, feedback implements a gain increase of the feedforward signal. We use a dynamic model with presynaptic inhibition and Hebbian learning to simultaneously learn feedforward and feedback weights. The weights converge to localized, oriented, and bandpass filters similar to the ones found in V1. Due to presynaptic inhibition the model predicts the organization of receptive fields within the feedforward pathway, whereas feedback primarily serves to tune early visual processing according to the needs of the task.
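
    The flavour of the learning rule can be sketched with a plain Hebbian update plus weight normalization. This is a generic sketch, not the paper's dynamic model with presynaptic inhibition and feedback gain:

```python
import math, random

random.seed(3)

# Hebbian rule: delta_w = eta * y * x, with the weight vector renormalized
# to unit length after each update so the weights stay bounded.
def normalize(w):
    n = math.sqrt(sum(x * x for x in w))
    return [x / n for x in w]

def hebbian_train(patterns, dim, eta=0.1, epochs=100):
    w = normalize([random.gauss(0, 1) for _ in range(dim)])
    for _ in range(epochs):
        for x in patterns:
            y = sum(wi * xi for wi, xi in zip(w, x))   # postsynaptic rate
            w = normalize([wi + eta * y * xi for wi, xi in zip(w, x)])
    return w

# Inputs dominated by one direction: the Hebbian rule aligns w with it
# (in the limit, the principal direction of the input correlations).
patterns = [[1.0, 1.0, 0.0], [0.9, 1.1, 0.1], [1.1, 0.9, -0.1]]
w = hebbian_train(patterns, dim=3)
principal = normalize([1.0, 1.0, 0.0])
alignment = abs(sum(wi * di for wi, di in zip(w, principal)))
```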

  2. Gradient based filtering of digital elevation models

    DEFF Research Database (Denmark)

    Knudsen, Thomas; Andersen, Rune Carbuhn

    We present a filtering method for digital terrain models (DTMs). The method is based on mathematical morphological filtering within gradient (slope) defined domains. The intention with the filtering procedure is to improve the cartographic quality of height contours generated from a DTM based
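
    Morphological filtering of a height model can be sketched in one dimension: an opening (erosion followed by dilation) removes narrow spikes such as buildings while preserving wider terrain features. The profile below is a toy example and omits the gradient-defined domains used in the paper:

```python
# Grayscale morphology on a 1-D height profile with a sliding window.
def erode(h, radius):
    """Moving minimum: shrinks bright (high) features."""
    n = len(h)
    return [min(h[max(0, i - radius):min(n, i + radius + 1)]) for i in range(n)]

def dilate(h, radius):
    """Moving maximum: restores the extent of surviving features."""
    n = len(h)
    return [max(h[max(0, i - radius):min(n, i + radius + 1)]) for i in range(n)]

def opening(h, radius):
    """Erosion then dilation: removes spikes narrower than the window."""
    return dilate(erode(h, radius), radius)

# Flat terrain at 100 m with a one-cell spike (e.g. a building) at index 4.
profile = [100.0] * 9
profile[4] = 115.0
filtered = opening(profile, radius=1)
```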

  3. COMPARISON of FUZZY-BASED MODELS in LANDSLIDE HAZARD MAPPING

    Directory of Open Access Journals (Sweden)

    N. Mijani

    2017-09-01

    Full Text Available Landslide is one of the main geomorphic processes affecting development prospects in mountainous areas and causing disastrous accidents. Landslide is an event governed by various uncertain criteria such as altitude, slope, aspect, land use, vegetation density, precipitation, distance from the river and distance from the road network. This research aims to compare and evaluate different fuzzy-based models, including the Fuzzy Analytic Hierarchy Process (Fuzzy-AHP), Fuzzy Gamma and Fuzzy-OR. The main contribution of this paper is a comprehensive treatment of the criteria causing landslide hazard, considering their uncertainties, and a comparison of different fuzzy-based models. The evaluation is quantified using the Density Ratio (DR) and Quality Sum (QS) measures. The proposed methodology was implemented in Sari, a city of Iran which has faced multiple landslide accidents in recent years due to its particular environmental conditions. The accuracy assessment showed that the Fuzzy-AHP model has higher accuracy than the other two models in landslide hazard zonation. The accuracy of the zoning obtained from the Fuzzy-AHP model is 0.92 and 0.45 based on the Precision (P) and QS indicators, respectively. Based on the obtained landslide hazard maps, Fuzzy-AHP, Fuzzy Gamma and Fuzzy-OR cover respectively 13, 26 and 35 percent of the study area at a very high risk level. Based on these findings, the Fuzzy-AHP model has been selected as the most appropriate method for landslide zoning in the city of Sari, with the Fuzzy Gamma method a close second.
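
    The fuzzy overlay operators compared above have standard GIS definitions that are easy to state in code; the membership values below are illustrative, and gamma = 0.9 is an assumed setting:

```python
import math

# Standard fuzzy overlay operators used in hazard mapping.
def fuzzy_or(memberships):
    return max(memberships)

def fuzzy_product(memberships):
    return math.prod(memberships)

def fuzzy_sum(memberships):
    return 1.0 - math.prod(1.0 - m for m in memberships)

def fuzzy_gamma(memberships, gamma=0.9):
    # Gamma operator: a compromise between fuzzy sum (gamma = 1)
    # and fuzzy product (gamma = 0).
    return (fuzzy_sum(memberships) ** gamma
            * fuzzy_product(memberships) ** (1.0 - gamma))

# Membership of one cell in the "hazardous" set for three criteria
# (e.g. slope, precipitation, distance to road) -- toy values:
mu = [0.7, 0.5, 0.8]

scores = {
    "OR": fuzzy_or(mu),
    "gamma": fuzzy_gamma(mu, 0.9),
}
```

    By construction the gamma score always lies between the fuzzy product and the fuzzy sum, which is why the gamma parameter is often tuned per study area.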

  4. A physiological foundation for the nutrition-based efficiency wage model

    DEFF Research Database (Denmark)

    Dalgaard, Carl-Johan Lars; Strulik, Holger

    2011-01-01

    Drawing on recent research on allometric scaling and energy consumption, the present paper develops a nutrition-based efficiency wage model from first principles. The biologically micro-founded model allows us to address empirical criticism of the original nutrition-based efficiency wage model...

  5. Multi-Province Listeriosis Outbreak Linked to Contaminated Deli Meat Consumed Primarily in Institutional Settings, Canada, 2008.

    Science.gov (United States)

    Currie, Andrea; Farber, Jeffrey M; Nadon, Céline; Sharma, Davendra; Whitfield, Yvonne; Gaulin, Colette; Galanis, Eleni; Bekal, Sadjia; Flint, James; Tschetter, Lorelee; Pagotto, Franco; Lee, Brenda; Jamieson, Fred; Badiani, Tina; MacDonald, Diane; Ellis, Andrea; May-Hadford, Jennifer; McCormick, Rachel; Savelli, Carmen; Middleton, Dean; Allen, Vanessa; Tremblay, Francois-William; MacDougall, Laura; Hoang, Linda; Shyng, Sion; Everett, Doug; Chui, Linda; Louie, Marie; Bangura, Helen; Levett, Paul N; Wilkinson, Krista; Wylie, John; Reid, Janet; Major, Brian; Engel, Dave; Douey, Donna; Huszczynski, George; Di Lecci, Joe; Strazds, Judy; Rousseau, Josée; Ma, Kenneth; Isaac, Leah; Sierpinska, Urszula

    2015-08-01

    A multi-province outbreak of listeriosis occurred in Canada from June to November 2008. Fifty-seven persons were infected with 1 of 3 similar outbreak strains defined by pulsed-field gel electrophoresis, and 24 (42%) individuals died. Forty-one (72%) of 57 individuals were residents of long-term care facilities or hospital inpatients during their exposure period. Descriptive epidemiology, product traceback, and detection of the outbreak strains of Listeria monocytogenes in food samples and the plant environment confirmed that delicatessen meat manufactured by one establishment and purchased primarily by institutions was the source of the outbreak. The food safety investigation identified a plant environment conducive to the introduction and proliferation of L. monocytogenes and persistently contaminated with Listeria spp. This outbreak demonstrated the need for improved listeriosis surveillance, strict control of L. monocytogenes in establishments producing ready-to-eat foods, and advice to vulnerable populations and institutions serving these populations regarding which high-risk foods to avoid.

  6. Podocytes degrade endocytosed albumin primarily in lysosomes.

    Science.gov (United States)

    Carson, John M; Okamura, Kayo; Wakashin, Hidefumi; McFann, Kim; Dobrinskikh, Evgenia; Kopp, Jeffrey B; Blaine, Judith

    2014-01-01

    Albuminuria is a strong, independent predictor of chronic kidney disease progression. We hypothesize that podocyte processing of albumin via the lysosome may be an important determinant of podocyte injury and loss. A human urine-derived podocyte-like epithelial cell (HUPEC) line was used for in vitro experiments. Albumin uptake was quantified by Western blot after loading HUPECs with fluorescein-labeled (FITC) albumin. Co-localization of albumin with lysosomes was determined by confocal microscopy. Albumin degradation was measured by quantifying FITC-albumin abundance in HUPEC lysates by Western blot. Degradation experiments were repeated using HUPECs treated with chloroquine, a lysosome inhibitor, or MG-132, a proteasome inhibitor. Lysosome activity was measured by fluorescence recovery after photobleaching (FRAP). Cytokine production was measured by ELISA. Cell death was determined by trypan blue staining. In vivo, staining with lysosome-associated membrane protein-1 (LAMP-1) was performed on tissue from a Denys-Drash transgenic mouse model of nephrotic syndrome. HUPECs endocytosed albumin, which co-localized with lysosomes. Chloroquine, but not MG-132, inhibited albumin degradation, indicating that degradation occurs in lysosomes. Cathepsin B activity, measured by FRAP, significantly decreased in HUPECs exposed to albumin (12.5% of activity in controls) and chloroquine (12.8%), and declined further with exposure to albumin plus chloroquine (8.2%, p < 0.05). These results indicate that lysosomes are involved in the processing of endocytosed albumin in podocytes, and lysosomal dysfunction may contribute to podocyte injury and glomerulosclerosis in albuminuric diseases. Modifiers of lysosomal activity may have therapeutic potential in slowing the progression of glomerulosclerosis by enhancing the ability of podocytes to process and degrade albumin.

  7. Knowledge-based inspection: modelling complex processes with the integrated Safeguards Modelling Method (iSMM)

    International Nuclear Information System (INIS)

    Abazi, F.

    2011-01-01

    Increased levels of complexity in almost every discipline and operation today raise the demand for knowledge in order to successfully run an organization, whether to generate profit or to attain a non-profit mission. The traditional way of transferring knowledge to information systems rich in data structures and complex algorithms continues to hinder the ability to swiftly turn concepts into operations. Diagrammatic modelling, commonly applied in engineering to represent concepts or reality, remains an excellent way of capturing knowledge from domain experts. The nuclear verification domain is a matter of ever greater importance to world safety and security. Demand for knowledge about nuclear processes and verification activities used to offset potential misuse of nuclear technology will intensify with the growth of the subject technology. This doctoral thesis contributes a model-based approach for representing complex processes such as nuclear inspections. The work presented also applies to other domains characterized by knowledge-intensive and complex processes. Based on the characteristics of a complex process, a conceptual framework was established as the theoretical basis for creating a number of modelling languages to represent the domain. The integrated Safeguards Modelling Method (iSMM) is formalized through an integrated meta-model. The diagrammatic modelling languages represent the verification domain and relevant nuclear verification aspects. Such a meta-model conceptualizes the relation between practices of process management, knowledge management and domain-specific verification principles. This fusion is considered necessary in order to create quality processes. The study also extends the formalization achieved through a meta-model by contributing a formalization language based on Pattern Theory. Through the use of graphical and mathematical constructs of the theory, process structures are formalized enhancing

  8. Energy-based ferromagnetic material model with magnetic anisotropy

    Energy Technology Data Exchange (ETDEWEB)

    Steentjes, Simon, E-mail: simon.steentjes@iem.rwth-aachen.de [Institute of Electrical Machines - RWTH Aachen University, Schinkelstr. 4, D-52056 Aachen (Germany); Henrotte, François, E-mail: francois.henrotte@uclouvain.be [Institute of Mechanics Materials and Civil Engineering - UCL, Av. G. Lemaître 4-6, B-1348 Louvain-la-Neuve (Belgium); Hameyer, Kay [Institute of Electrical Machines - RWTH Aachen University, Schinkelstr. 4, D-52056 Aachen (Germany)

    2017-03-01

    Non-oriented soft magnetic materials are commonly assumed to be magnetically isotropic. However, due to the rolling process a preferred direction exists along the rolling direction. This uniaxial magnetic anisotropy, and the related magnetostriction effect, are critical to the accurate calculation of iron losses and magnetic forces in rotating electrical machines. This paper proposes an extension of an isotropic energy-based vector hysteresis model to account for these two effects. - Highlights: • Energy-based vector hysteresis model with magnetic anisotropy. • Two-scale model to account for pinning field distribution. • Pinning force and reluctivity are extended to anisotropic case.

  9. Structure and sensitivity analysis of individual-based predator–prey models

    International Nuclear Information System (INIS)

    Imron, Muhammad Ali; Gergs, Andre; Berger, Uta

    2012-01-01

    The expensive computational cost of sensitivity analyses has hampered the use of these techniques for analysing individual-based models in ecology. A screening technique with relatively cheap computational cost, the Morris method, was chosen to assess the relative effects of all parameters on the models' outputs and to gain insights into predator–prey systems. The structure and results of the sensitivity analysis of the Sumatran tiger model – the Panthera Population Persistence (PPP) – and the Notonecta foraging model (NFM) were compared. Both models are based on a general predation cycle and designed to understand the mechanisms behind the predator–prey interaction being considered. However, the models differ significantly in their complexity and the details of the processes involved. In the sensitivity analysis, parameters that directly contribute to the number of prey items killed were found to be most influential. These were the growth rate of prey and the hunting radius of tigers in the PPP model as well as attack rate parameters and encounter distance of backswimmers in the NFM model. Analysis of distances in both of the models revealed further similarities in the sensitivity of the two individual-based models. The findings highlight the applicability and importance of sensitivity analyses in general, and screening design methods in particular, during early development of ecological individual-based models. Comparison of model structures and sensitivity analyses provides a first step for the derivation of general rules in the design of predator–prey models for both practical conservation and conceptual understanding. - Highlights: ► Structure of predation processes is similar in tiger and backswimmer model. ► The two individual-based models (IBM) differ in space formulations. ► In both models foraging distance is among the sensitive parameters. ► Morris method is applicable for the sensitivity analysis even of complex IBMs.
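    The Morris screening idea used here, ranking parameters by their mean absolute elementary effect (often written mu*), can be sketched as follows; the three-parameter toy model, trajectory count and step size are hypothetical stand-ins, not the tiger or backswimmer models:

```python
import numpy as np

def morris_mu_star(model, bounds, r=10, delta=0.1, seed=0):
    """Crude Morris screening: at r random base points, perturb each
    parameter in turn by `delta` (in normalized units) and record the
    elementary effect; return the mean absolute effect per parameter."""
    rng = np.random.default_rng(seed)
    bounds = np.asarray(bounds, dtype=float)
    k = len(bounds)
    lo, span = bounds[:, 0], bounds[:, 1] - bounds[:, 0]
    effects = np.zeros((r, k))
    for i in range(r):
        x = rng.uniform(0.0, 1.0 - delta, size=k)   # normalized base point
        y0 = model(lo + span * x)
        for j in range(k):
            xp = x.copy()
            xp[j] += delta
            effects[i, j] = (model(lo + span * xp) - y0) / delta
    return np.abs(effects).mean(axis=0)

# Toy stand-in for a predator-prey output: only the first two
# parameters are influential, the third is inert.
toy = lambda p: 3.0 * p[0] + 2.0 * p[1] + 0.0 * p[2]
mu_star = morris_mu_star(toy, bounds=[(0.0, 1.0)] * 3)
```

    For this linear toy model the screening recovers the influence ranking exactly; stochastic IBMs additionally require replicate runs per sample point.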

  10. Excellent approach to modeling urban expansion by fuzzy cellular automata: agent base model

    Science.gov (United States)

    Khajavigodellou, Yousef; Alesheikh, Ali A.; Mohammed, Abdulrazak A. S.; Chapi, Kamran

    2014-09-01

    Recently, the interaction between humans and their environment has become one of the most important challenges in the world. Land-use/cover change (LUCC) is a complex process that includes actors and factors at different social and spatial levels. The complexity and dynamics of urban systems make the practice of urban modeling very difficult. With increased computational power and the greater availability of spatial data, micro-simulation approaches such as agent-based and cellular automata methods have been developed by geographers, planners, and scholars, and have shown great potential for representing and simulating the complexity of the dynamic processes involved in urban growth and land use change. This paper applies Fuzzy Cellular Automata (FCA) within a geospatial information system and remote sensing context to simulate and predict urban expansion patterns. These FCA-based dynamic spatial urban models provide an improved ability to forecast and assess future urban growth and to create planning scenarios, allowing us to explore the potential impacts of simulations that correspond to urban planning and management policies. In the fuzzy-inference-guided cellular automata approach, semantic or linguistic knowledge of land use change is expressed as fuzzy rules, based on which fuzzy inference determines the urban development potential for each pixel. The model integrates an ABM (agent-based model) and FCA to investigate a complex decision-making process and future urban dynamics. Based on this model, rapid development and green land protection under the influence of the behaviors and decision modes of regional authority agents, real estate developer agents, resident agents and non-resident agents and their interactions have been applied to predict the future development patterns of the Erbil metropolitan region.
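    A minimal fuzzy-CA transition rule of the kind described (development potential as a fuzzy AND of neighbourhood pressure and land suitability, then thresholded) might look like this; the grid, suitability layer and threshold are made-up illustrations, not the paper's calibrated rules:

```python
import numpy as np

def neighbor_fraction(grid):
    """Fraction of urbanized cells among each cell's 8 neighbours
    (zero-padded at the edges)."""
    padded = np.pad(grid, 1)
    counts = sum(np.roll(np.roll(padded, i, 0), j, 1)[1:-1, 1:-1]
                 for i in (-1, 0, 1) for j in (-1, 0, 1)) - grid
    return counts / 8.0

def fca_step(grid, suitability, threshold=0.5):
    """One FCA step: development potential is the fuzzy AND (minimum)
    of neighbourhood pressure and land suitability; cells whose
    potential exceeds the threshold become urban (value 1)."""
    potential = np.minimum(neighbor_fraction(grid), suitability)
    return np.where(potential > threshold, 1, grid)
```

    An agent-based layer, as in the paper, would modify `suitability` each step to reflect authority, developer and resident decisions.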

  11. Adaptive MPC based on MIMO ARX-Laguerre model.

    Science.gov (United States)

    Ben Abdelwahed, Imen; Mbarek, Abdelkader; Bouzrara, Kais

    2017-03-01

    This paper proposes a method for synthesizing an adaptive predictive controller using a reduced-complexity model. The latter is obtained by projecting the ARX model onto Laguerre bases. The resulting model is entitled MIMO ARX-Laguerre and is characterized by an easy recursive representation. The adaptive predictive control law is computed based on multi-step-ahead finite-element predictors, identified directly from experimental input/output data. The model is tuned in each iteration by an online identification algorithm for both the model parameters and the Laguerre poles. The proposed approach avoids the time-consuming numerical optimization algorithms associated with most common linear predictive control strategies, which makes it suitable for real-time implementation. The method is used to synthesize and test in numerical simulations adaptive predictive controllers for the CSTR process benchmark.
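    The Laguerre-basis reduction at the heart of an ARX-Laguerre model can be sketched by filtering a signal through a chain of discrete Laguerre filters; the pole value and filter count below are arbitrary choices for illustration:

```python
import numpy as np

def laguerre_filters(u, a=0.5, n=3):
    """Outputs of the first n discrete Laguerre filters driven by input u.
    The pole a lies in (-1, 1); column i holds the i-th filter output.
    Regressing the measured output on these columns (and on similarly
    filtered past outputs) yields a reduced ARX-Laguerre model."""
    T = len(u)
    L = np.zeros((T, n))
    gain = np.sqrt(1.0 - a * a)
    for k in range(T):
        # first section: gain / (1 - a q^-1)
        L[k, 0] = (a * L[k - 1, 0] if k > 0 else 0.0) + gain * u[k]
        # chained all-pass sections: (q^-1 - a) / (1 - a q^-1)
        for i in range(1, n):
            prev_out = L[k - 1, i] if k > 0 else 0.0
            prev_in = L[k - 1, i - 1] if k > 0 else 0.0
            L[k, i] = a * prev_out + prev_in - a * L[k, i - 1]
    return L
```

    Because the basis is orthonormal, each filter's impulse response has unit energy, and only a few coefficients are needed when the pole is close to the dominant system dynamics.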

  12. Individual-based modelling and control of bovine brucellosis

    Science.gov (United States)

    Nepomuceno, Erivelton G.; Barbosa, Alípio M.; Silva, Marcos X.; Perc, Matjaž

    2018-05-01

    We present a theoretical approach to control bovine brucellosis. We have used individual-based modelling, which is a network-type alternative to compartmental models. Our model thus considers heterogeneous populations, and spatial aspects such as migration among herds and control actions described as pulse interventions are also easily implemented. We show that individual-based modelling reproduces the mean field behaviour of an equivalent compartmental model. Details of this process, as well as flowcharts, are provided to facilitate the reproduction of the presented results. We further investigate three numerical examples using real parameters of herds in the São Paulo state of Brazil, in scenarios which explore eradication, continuous and pulsed vaccination and meta-population effects. The obtained results are in good agreement with the expected behaviour of this disease, which ultimately showcases the effectiveness of our theory.
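    The individual-based alternative to a compartmental model can be sketched with per-animal stochastic state updates; the three-state (S-I-R) structure, rates and herd size below are invented for illustration and are far simpler than the paper's brucellosis model with migration and pulsed vaccination:

```python
import random

def ibm_step(states, beta, gamma, rng):
    """One step of a minimal individual-based S-I-R update: each
    susceptible animal is infected with probability proportional to the
    infectious fraction; each infectious animal is removed/recovers
    with probability gamma."""
    frac_i = states.count("I") / len(states)
    out = []
    for s in states:
        if s == "S" and rng.random() < beta * frac_i:
            out.append("I")
        elif s == "I" and rng.random() < gamma:
            out.append("R")
        else:
            out.append(s)
    return out

rng = random.Random(42)
herd = ["I"] * 5 + ["S"] * 95        # one herd of 100 animals
for _ in range(100):
    herd = ibm_step(herd, beta=0.4, gamma=0.1, rng=rng)
```

    Averaging many such runs reproduces the mean-field behaviour of the equivalent compartmental model, which is the consistency check the paper performs.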

  13. Models and parameters for environmental radiological assessments

    International Nuclear Information System (INIS)

    Miller, C.W.

    1983-01-01

    This article reviews the forthcoming book Models and Parameters for Environmental Radiological Assessments, which presents a unified compilation of models and parameters for assessing the impact on man of radioactive discharges, both routine and accidental, into the environment. Models presented in this book include those developed for the prediction of atmospheric and hydrologic transport and deposition, for terrestrial and aquatic food-chain bioaccumulation, and for internal and external dosimetry. Summaries are presented for each of the transport and dosimetry areas previously mentioned, and details are available in the literature cited. A chapter of example problems illustrates many of the methodologies presented throughout the text. Models and parameters presented are based on the results of extensive literature reviews and evaluations performed primarily by the staff of the Health and Safety Research Division of Oak Ridge National Laboratory.

  14. High-level PC-based laser system modeling

    Science.gov (United States)

    Taylor, Michael S.

    1991-05-01

    Since the inception of the Strategic Defense Initiative (SDI) there have been a multitude of comparison studies done in an attempt to evaluate the effectiveness and relative sizes of complementary, and sometimes competitive, laser weapon systems. It became more and more apparent that what the systems analyst needed was not only a fast but also a cost-effective way to perform high-level trade studies. In the present investigation, a general procedure is presented for the development of PC-based algorithmic systems models for laser systems. This procedure points out all of the major issues that should be addressed in the design and development of such a model. Issues addressed include defining the problem to be modeled, defining a strategy for development, and finally, effective use of the model once developed. Being a general procedure, it will allow a systems analyst to develop a model to meet specific needs. To illustrate this method of model development, a description of the Strategic Defense Simulation - Design To (SDS-DT) model developed and used by Science Applications International Corporation (SAIC) is presented. SDS-DT is a menu-driven, fast executing, PC-based program that can be used to either calculate performance, weight, volume, and cost values for a particular design or, alternatively, to run parametrics on particular system parameters to perhaps optimize a design.

  15. Stochastic Watershed Models for Risk Based Decision Making

    Science.gov (United States)

    Vogel, R. M.

    2017-12-01

    Over half a century ago, the Harvard Water Program introduced the field of operational or synthetic hydrology, providing stochastic streamflow models (SSMs) which could generate ensembles of synthetic streamflow traces useful for hydrologic risk management. The application of SSMs, based on streamflow observations alone, revolutionized water resources planning activities, yet has fallen out of favor due, in part, to their inability to account for the now nearly ubiquitous anthropogenic influences on streamflow. This commentary advances the modern equivalent of SSMs, termed 'stochastic watershed models' (SWMs), useful as input to nearly all modern risk-based water resource decision-making approaches. SWMs are deterministic watershed models implemented using stochastic meteorological series, model parameters and model errors, to generate ensembles of streamflow traces that represent the variability in possible future streamflows. SWMs combine deterministic watershed models, which are ideally suited to accounting for anthropogenic influences, with recent developments in uncertainty analysis and principles of stochastic simulation.
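    The core SWM idea, a deterministic watershed model wrapped with stochastic error to produce streamflow ensembles, can be sketched as follows; the toy rainfall-runoff recursion, error magnitude and rainfall series are hypothetical:

```python
import numpy as np

def watershed_model(precip, k=0.6):
    """Toy deterministic watershed model (hypothetical): runoff as a
    damped, lagged linear-reservoir response to precipitation."""
    q = np.zeros_like(precip)
    for t in range(1, len(precip)):
        q[t] = k * q[t - 1] + (1.0 - k) * precip[t]
    return q

def swm_ensemble(precip, n_traces=100, sigma=0.2, seed=1):
    """SWM sketch: wrap the deterministic model with mean-one
    multiplicative lognormal error to produce an ensemble of synthetic
    streamflow traces representing possible future variability."""
    rng = np.random.default_rng(seed)
    base = watershed_model(precip)
    noise = rng.lognormal(mean=-sigma**2 / 2.0, sigma=sigma,
                          size=(n_traces, len(precip)))
    return base * noise            # each row is one streamflow trace

precip = np.full(50, 10.0)         # invented constant rainfall series
ensemble = swm_ensemble(precip)
```

    A fuller SWM would also perturb the meteorological inputs and model parameters, not just the output error, as the commentary describes.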

  16. Artificial emotional model based on finite state machine

    Institute of Scientific and Technical Information of China (English)

    MENG Qing-mei; WU Wei-guo

    2008-01-01

    According to basic emotion theory, an artificial emotional model based on the finite state machine (FSM) was presented. In the finite state machine model of emotion, the emotional space includes the basic emotional space and multiple composite emotional spaces. The emotion-switching diagram was defined, and the transition function was developed using a Markov chain and a linear interpolation algorithm. The simulation model was built using the Stateflow and Simulink toolboxes on the Matlab platform, and the model includes three subsystems: input, emotion and behavior. In the emotional subsystem, the responses of different personalities to external stimuli are described by defining a personal space. The model takes states from an emotional space and updates its state depending on its current state and the state of its input (also a state-emotion). The simulation model realizes the process of switching the emotion from the neutral state to the other basic emotions, and the simulation result is shown to correspond to the emotion-switching behaviour of human beings.
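    A state machine whose transitions are sampled from a Markov chain, the mechanism described above, can be sketched like this; the three states and their transition probabilities are invented for illustration, not taken from the paper:

```python
import random

# Hypothetical emotion-switching probabilities; each row must sum to 1.
TRANSITIONS = {
    "neutral": {"neutral": 0.6, "happy": 0.3, "angry": 0.1},
    "happy":   {"neutral": 0.4, "happy": 0.5, "angry": 0.1},
    "angry":   {"neutral": 0.5, "happy": 0.1, "angry": 0.4},
}

def next_state(state, rng):
    """Sample the next emotional state from the current state's row."""
    r, acc = rng.random(), 0.0
    for target, p in TRANSITIONS[state].items():
        acc += p
        if r < acc:
            return target
    return state  # guard against floating-point rounding

rng = random.Random(0)
trace = ["neutral"]
for _ in range(50):
    trace.append(next_state(trace[-1], rng))
```

    Personality could then be encoded, as in the paper's personal space, by giving each personality its own transition table.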

  17. A sediment graph model based on SCS-CN method

    Science.gov (United States)

    Singh, P. K.; Bhunya, P. K.; Mishra, S. K.; Chaube, U. C.

    2008-01-01

    This paper proposes new conceptual sediment graph models based on the coupling of popular and extensively used methods, viz., the Nash-model-based instantaneous unit sediment graph (IUSG), the soil conservation service curve number (SCS-CN) method, and the power law. These models vary in their complexity, and this paper tests their performance using data from the Nagwan watershed (area = 92.46 km²) (India). The sensitivity of total sediment yield and peak sediment flow rate computations to model parameterisation is analysed. The exponent of the power law, β, is more sensitive than the other model parameters. The models are found to have substantial potential for computing sediment graphs (temporal sediment flow rate distribution) as well as total sediment yield.
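    The SCS-CN component these models build on is a closed-form runoff formula; a minimal sketch, with an illustrative storm depth and curve number rather than the Nagwan calibration:

```python
def scs_cn_runoff(p_mm, cn, lam=0.2):
    """SCS-CN direct runoff depth (mm) for a storm of p_mm rainfall.
    S is the potential maximum retention (mm) and the initial
    abstraction is Ia = lam * S (0.2 is the customary value)."""
    s = 25400.0 / cn - 254.0
    ia = lam * s
    if p_mm <= ia:
        return 0.0                       # all rainfall abstracted
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# e.g. a 50 mm storm on a watershed with curve number 75
q = scs_cn_runoff(50.0, 75.0)            # about 9.3 mm of direct runoff
```

    A sediment graph model then routes this runoff through the IUSG and scales it to sediment flow via the power law.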

  18. A Memory-Based Model of Hick's Law

    Science.gov (United States)

    Schneider, Darryl W.; Anderson, John R.

    2011-01-01

    We propose and evaluate a memory-based model of Hick's law, the approximately linear increase in choice reaction time with the logarithm of set size (the number of stimulus-response alternatives). According to the model, Hick's law reflects a combination of associative interference during retrieval from declarative memory and occasional savings…
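    The empirical regularity the model explains is compactly stated as RT = a + b·log2(n + 1); a sketch with hypothetical intercept and slope values, not the paper's fitted estimates:

```python
import math

def hicks_rt(n, a=0.2, b=0.15):
    """Hick's law: mean choice reaction time (seconds) grows with the
    logarithm of the number of stimulus-response alternatives n.
    The intercept a and slope b are illustrative values only."""
    return a + b * math.log2(n + 1)
```

    The memory-based account attributes the log-linear slope to associative interference during declarative retrieval rather than to an information-theoretic channel.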

  19. A Causal Model of Consumer-Based Brand Equity

    Directory of Open Access Journals (Sweden)

    Szőcs Attila

    2015-12-01

    Full Text Available Branding literature suggests that consumer-based brand equity (CBBE) is a multidimensional construct. Starting from this approach and developing a conceptual multidimensional model, this study finds that CBBE is best modelled with a two-dimensional structure, and claims that it achieves this result by choosing a theoretically based causal specification. By contrast, with a reflective specification one can fit almost any valid construct because of the halo effect and common method bias. In the final model, Trust (in quality) and Advantage cause the second-order Brand Equity. The two-dimensional brand equity model is intuitive, easy to interpret and easy to measure, and may thus be a much more attractive tool for management as well.

  20. Modelling regime shifts in the southern Benguela: a frame-based ...

    African Journals Online (AJOL)

    Modelling regime shifts in the southern Benguela: a frame-based approach. MD Smith, A Jarre. Abstract. This study explores the usefulness of a frame-based modelling approach in the southern Benguela upwelling ecosystem, with four frames describing observed small pelagic fish dominance patterns. We modelled the ...

  1. Modeling Cyclic Variation of Intracranial Pressure

    National Research Council Canada - National Science Library

    Daley, M

    2001-01-01

    ...) recording during mechanical ventilation are due to cyclic extravascular compressional modulation primarily of the cerebral venous bed, an established isovolumetric model of cerebrospinal fluid...

  2. SLS Model Based Design: A Navigation Perspective

    Science.gov (United States)

    Oliver, T. Emerson; Anzalone, Evan; Park, Thomas; Geohagan, Kevin

    2018-01-01

    The SLS Program has implemented a Model-based Design (MBD) and Model-based Requirements approach for managing component design information and system requirements. This approach differs from previous large-scale design efforts at Marshall Space Flight Center where design documentation alone conveyed information required for vehicle design and analysis and where extensive requirements sets were used to scope and constrain the design. The SLS Navigation Team is responsible for the Program-controlled Design Math Models (DMMs) which describe and represent the performance of the Inertial Navigation System (INS) and the Rate Gyro Assemblies (RGAs) used by Guidance, Navigation, and Controls (GN&C). The SLS Navigation Team is also responsible for navigation algorithms. The navigation algorithms are delivered for implementation on the flight hardware as a DMM. For the SLS Block 1B design, the additional GPS Receiver hardware model is managed as a DMM at the vehicle design level. This paper describes the models, and discusses the processes and methods used to engineer, design, and coordinate engineering trades and performance assessments using SLS practices as applied to the GN&C system, with a particular focus on the navigation components.

  3. Improvement of a Robotic Manipulator Model Based on Multivariate Residual Modeling

    Directory of Open Access Journals (Sweden)

    Serge Gale

    2017-07-01

    Full Text Available A new method is presented for extending a dynamic model of a six-degrees-of-freedom robotic manipulator. A non-linear multivariate calibration of input–output training data from several typical motion trajectories is carried out with the aim of predicting the model's systematic output error at time (t + 1) from the known input reference up to and including time (t). A new partial least squares regression (PLSR)-based method, nominal PLSR with interactions, was developed and used to handle unmodelled non-linearities. The performance of the new method is compared with least squares (LS). Different cross-validation schemes were compared in order to assess the sampling of the state space based on conventional trajectories. The method developed in the paper can be used as a fault monitoring mechanism and early warning system for sensor failure. The results show that the suggested method improves the trajectory tracking performance of the robotic manipulator by extending the initial dynamic model of the manipulator.
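    The residual-modelling step can be sketched with a bare-bones PLS1 (NIPALS) loop; note this is plain PLSR, not the paper's "nominal PLSR with interactions", and the training data below are synthetic:

```python
import numpy as np

def pls1(X, y, n_comp=2):
    """Minimal PLS1 regression (NIPALS). Returns coefficients b and
    intercept b0 such that yhat = Xnew @ b + b0."""
    xm, ym = X.mean(axis=0), y.mean()
    Xc, yc = X - xm, y - ym
    W, P, Q = [], [], []
    for _ in range(n_comp):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)          # weight vector
        t = Xc @ w                      # score vector
        tt = t @ t
        p = Xc.T @ t / tt               # X loading
        q = (yc @ t) / tt               # y loading
        Xc = Xc - np.outer(t, p)        # deflate X and y
        yc = yc - q * t
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    b = W @ np.linalg.solve(P.T @ W, Q)
    return b, ym - xm @ b

# Synthetic stand-in for the calibration task: predict the systematic
# output error at t+1 from known references up to t.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
err = X @ np.array([1.0, -2.0, 0.5])
b, b0 = pls1(X, err, n_comp=3)
```

    The fitted residual predictor is then added to the nominal dynamic model's output, which is how the calibration extends the manipulator model.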

  4. The Goddard Snow Radiance Assimilation Project: An Integrated Snow Radiance and Snow Physics Modeling Framework for Snow/cold Land Surface Modeling

    Science.gov (United States)

    Kim, E.; Tedesco, M.; Reichle, R.; Choudhury, B.; Peters-Lidard C.; Foster, J.; Hall, D.; Riggs, G.

    2006-01-01

    Microwave-based retrievals of snow parameters from satellite observations have a long heritage and have so far been generated primarily by regression-based empirical "inversion" methods based on snapshots in time. Direct assimilation of microwave radiance into physical land surface models can be used to avoid errors associated with such retrieval/inversion methods, instead utilizing more straightforward forward models and temporal information. This approach has been used for years for atmospheric parameters by the operational weather forecasting community with great success. Recent developments in forward radiative transfer modeling, physical land surface modeling, and land data assimilation are converging to allow the assembly of an integrated framework for snow/cold lands modeling and radiance assimilation. The objective of the Goddard snow radiance assimilation project is to develop such a framework and explore its capabilities. The key elements of this framework include: a forward radiative transfer model (FRTM) for snow, a snowpack physical model, a land surface water/energy cycle model, and a data assimilation scheme. In fact, multiple models are available for each element enabling optimization to match the needs of a particular study. Together these form a modular and flexible framework for self-consistent, physically-based remote sensing and water/energy cycle studies. In this paper we will describe the elements and the integration plan. All modules will operate within the framework of the Land Information System (LIS), a land surface modeling framework with data assimilation capabilities running on a parallel-node computing cluster. Capabilities for assimilation of snow retrieval products are already under development for LIS. We will describe plans to add radiance-based assimilation capabilities. Plans for validation activities using field measurements will also be discussed.

  5. A continuum mechanics-based musculo-mechanical model for esophageal transport

    Science.gov (United States)

    Kou, Wenjun; Griffith, Boyce E.; Pandolfino, John E.; Kahrilas, Peter J.; Patankar, Neelesh A.

    2017-11-01

    In this work, we extend our previous esophageal transport model, which used an immersed boundary (IB) method with a discrete fiber-based structural model, to one using a continuum mechanics-based model that is approximated based on finite elements (IB-FE). To deal with the leakage of flow when the Lagrangian mesh becomes coarser than the fluid mesh, we employ adaptive interaction quadrature points in the Lagrangian-Eulerian interaction equations, following a previous work (Griffith and Luo [1]). In particular, we introduce a new anisotropic adaptive interaction quadrature rule. The new rule permits us to vary the interaction quadrature points not only at each time-step and element but also at different orientations per element. This helps to avoid the leakage issue without sacrificing computational efficiency and accuracy in dealing with the interaction equations. For the material model, we extend our previous fiber-based model to a continuum-based model. We present formulations for general fiber-reinforced material models in the IB-FE framework. The new material model can handle non-linear elasticity and fiber-matrix interactions, and thus permits us to consider more realistic material behavior of biological tissues. To validate our method, we first study a case in which a three-dimensional short tube is dilated. Results on the pressure-displacement relationship and the stress distribution match very well with those obtained from the implicit FE method. We remark that in our IB-FE case the three-dimensional tube undergoes a very large deformation and the Lagrangian mesh size becomes about 6 times the Eulerian mesh size in the circumferential orientation. To validate the performance of the method in handling fiber-matrix material models, we perform a second study on dilating a long fiber-reinforced tube. Errors are small when we compare numerical solutions with analytical solutions. The technique is then applied to the problem of esophageal transport. We use two

  6. On Process Modelling Using Physical Oriented And Phenomena Based Principles

    Directory of Open Access Journals (Sweden)

    Mihai Culea

    2000-12-01

    Full Text Available This work presents a modelling framework based on a phenomena description of the process. The approach is taken to ease the understanding and construction of process models in heterogeneous, possibly distributed, modelling and simulation environments. A simplified case study of a heat exchanger is considered, and the Modelica modelling language is used to check the proposed concept. The partial results are promising, and the research effort will be extended in a computer-aided modelling environment based on phenomena.

  7. Modelling requirements for future assessments based on FEP analysis

    International Nuclear Information System (INIS)

    Locke, J.; Bailey, L.

    1998-01-01

    This report forms part of a suite of documents describing the Nirex model development programme. The programme is designed to provide a clear audit trail from the identification of significant features, events and processes (FEPs) to the models and modelling processes employed within a detailed safety assessment. A scenario approach to performance assessment has been adopted. It is proposed that potential evolutions of a deep geological radioactive waste repository can be represented by a base scenario and a number of variant scenarios. The base scenario is chosen to be broad-ranging and to represent the natural evolution of the repository system and its surrounding environment. The base scenario is defined to include all those FEPs that are certain to occur and those which are judged likely to occur for a significant period of the assessment timescale. The structuring of FEPs on a Master Directed Diagram (MDD) provides a systematic framework for identifying those FEPs that form part of the natural evolution of the system and those, which may define alternative potential evolutions of the repository system. In order to construct a description of the base scenario, FEPs have been grouped into a series of conceptual models. Conceptual models are groups of FEPs, identified from the MDD, representing a specific component or process within the disposal system. It has been found appropriate to define conceptual models in terms of the three main components of the disposal system: the repository engineered system, the surrounding geosphere and the biosphere. For each of these components, conceptual models provide a description of the relevant subsystem in terms of its initial characteristics, subsequent evolution and the processes affecting radionuclide transport for the groundwater and gas pathways. The aim of this document is to present the methodology that has been developed for deriving modelling requirements and to illustrate the application of the methodology by

  8. Constructing rule-based models using the belief functions framework

    NARCIS (Netherlands)

    Almeida, R.J.; Denoeux, T.; Kaymak, U.; Greco, S.; Bouchon-Meunier, B.; Coletti, G.; Fedrizzi, M.; Matarazzo, B.; Yager, R.R.

    2012-01-01

    Abstract. We study a new approach to regression analysis. We propose a new rule-based regression model using the theoretical framework of belief functions. For this purpose we use the recently proposed Evidential c-means (ECM) to derive rule-based models solely from data. ECM allocates, for each
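
    The rule-based regression idea can be sketched in a few lines: each cluster found in the data becomes a rule with a local linear consequent, and a prediction blends the rules by membership. The sketch below substitutes fixed-centre Gaussian memberships for the paper's Evidential c-means partition, so the belief-function machinery itself is not shown; the data and centres are made up.

```python
import math

def fit_rules(xs, ys, centers, width=1.0):
    """One rule per cluster centre: IF x is near c THEN y = a*x + b.

    A crude stand-in for ECM-derived rules: memberships here are Gaussian
    around fixed centres rather than evidential partitions.
    """
    rules = []
    for c in centers:
        w = [math.exp(-((x - c) / width) ** 2) for x in xs]
        sw = sum(w)
        sx = sum(wi * x for wi, x in zip(w, xs))
        sy = sum(wi * y for wi, y in zip(w, ys))
        sxx = sum(wi * x * x for wi, x in zip(w, xs))
        sxy = sum(wi * x * y for wi, x, y in zip(w, xs, ys))
        a = (sw * sxy - sx * sy) / (sw * sxx - sx * sx)  # weighted LS slope
        b = (sy - a * sx) / sw                           # weighted LS intercept
        rules.append((c, a, b))
    return rules

def predict(rules, x, width=1.0):
    # Output is the membership-weighted average of the rule consequents.
    ws = [math.exp(-((x - c) / width) ** 2) for c, _, _ in rules]
    ys = [a * x + b for _, a, b in rules]
    return sum(w * y for w, y in zip(ws, ys)) / sum(ws)

xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.0 * x for x in xs]              # toy data: y = 2x
rules = fit_rules(xs, ys, centers=[1.0, 4.0])
print(round(predict(rules, 2.5), 6))    # both local rules recover y = 2x
```

    On exactly linear toy data each local rule recovers the underlying line, so the blended prediction simply reproduces it.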

  9. Group Contribution Based Process Flowsheet Synthesis, Design and Modelling

    DEFF Research Database (Denmark)

    d'Anterroches, Loïc; Gani, Rafiqul

    2004-01-01

    This paper presents a process-group-contribution method to model, simulate and synthesize a flowsheet. The process-group based representation of a flowsheet together with a process "property" model are presented. The process-group based synthesis method is developed on the basis of the computer...... aided molecular design methods and gives the ability to screen numerous process alternatives without the need to use the rigorous process simulation models. The process "property" model calculates the design targets for the generated flowsheet alternatives while a reverse modelling method (also...... developed) determines the design variables matching the target. A simple illustrative example highlighting the main features of the methodology is also presented....
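
    The group-contribution principle behind the process-group representation can be illustrated very compactly: a target property is estimated as a base constant plus the summed contributions of the groups present. The contribution values and the purely additive form below are invented for illustration; the paper's process "property" model for flowsheet design targets is more elaborate.

```python
# Group-contribution estimation: a property is the sum of contributions of the
# building blocks ("groups") present. The values below are made up for
# illustration, not taken from any published contribution table.
CONTRIB = {"CH3": -35.0, "CH2": -5.0, "OH": 44.0}  # hypothetical contributions

def estimate(groups, base=100.0):
    # property = base constant + sum over groups of (count * contribution)
    return base + sum(n * CONTRIB[g] for g, n in groups.items())

# e.g. a structure containing 2 CH3, 1 CH2 and 1 OH group
print(estimate({"CH3": 2, "CH2": 1, "OH": 1}))  # -> 69.0
```

    Screening alternatives then reduces to evaluating this cheap additive model over candidate group sets instead of running rigorous simulations.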

  10. Mathematical modelling in engineering: A proposal to introduce linear algebra concepts

    Directory of Open Access Journals (Sweden)

    Andrea Dorila Cárcamo

    2016-03-01

    Full Text Available The modern dynamic world requires that basic science courses for engineering, including linear algebra, emphasize the development of mathematical abilities primarily associated with modelling and interpreting, which are not limited to calculus abilities. Considering this, an instructional design was elaborated based on mathematical modelling and emerging heuristic models for the construction of specific linear algebra concepts: span and spanning set. This was applied to first-year engineering students. Results suggest that this type of instructional design contributes to the construction of these mathematical concepts and can also favour first-year engineering students' understanding of key linear algebra concepts and potentiate the development of higher-order skills.
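
    The two target concepts can also be checked computationally: a vector belongs to the span of a set exactly when appending it does not increase the rank of the matrix of vectors. A minimal pure-Python sketch (Gaussian elimination, intended only for small matrices):

```python
def rank(rows):
    """Rank of a small matrix (list of rows) by Gaussian elimination."""
    m = [list(map(float, r)) for r in rows]
    r = 0
    for col in range(len(m[0])):
        piv = next((i for i in range(r, len(m)) if abs(m[i][col]) > 1e-9), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(len(m)):
            if i != r and abs(m[i][col]) > 1e-9:
                f = m[i][col] / m[r][col]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def in_span(vectors, v):
    # v lies in span(vectors) iff adding v does not raise the rank.
    return rank(vectors) == rank(vectors + [v])

print(in_span([[1, 0, 0], [0, 1, 0]], [3, -2, 0]))  # True: 3*e1 - 2*e2
print(in_span([[1, 0, 0], [0, 1, 0]], [0, 0, 1]))   # False: e3 is independent
```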

  11. Landsat analysis of tropical forest succession employing a terrain model

    Science.gov (United States)

    Barringer, T. H.; Robinson, V. B.; Coiner, J. C.; Bruce, R. C.

    1980-01-01

    Landsat multispectral scanner (MSS) data have yielded a dual classification of rain forest and shadow in an analysis of a semi-deciduous forest on Mindoro Island, Philippines. Both a spatial terrain model, using a fifth-order polynomial trend surface analysis for quantitatively estimating the general spatial variation in the data set, and a spectral terrain model, based on the MSS data, have been set up. A discriminant analysis, using both sets of data, has suggested that shadowing effects may be due primarily to local variations in the spectral regions and can therefore be compensated for through the decomposition of the spatial variation in both elevation and MSS data.
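
    A trend surface is fitted by ordinary least squares on polynomial terms of the map coordinates. The sketch below fits only a first-order (planar) surface so that the normal equations stay 3x3; the study's fifth-order surface extends the same machinery with more terms. The synthetic elevation data are made up.

```python
def solve3(A, b):
    """Solve a 3x3 linear system by Gauss-Jordan elimination with pivoting."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for c in range(3):
        p = max(range(c, 3), key=lambda i: abs(M[i][c]))
        M[c], M[p] = M[p], M[c]
        for i in range(3):
            if i != c:
                f = M[i][c] / M[c][c]
                M[i] = [a - f * v for a, v in zip(M[i], M[c])]
    return [M[i][3] / M[i][i] for i in range(3)]

def planar_trend(pts):
    """Least-squares fit of z = a + b*x + c*y via the normal equations.

    A first-order stand-in for the paper's fifth-order trend surface; the
    same normal-equations machinery extends to higher-order terms.
    """
    n = len(pts)
    sx = sum(x for x, _, _ in pts); sy = sum(y for _, y, _ in pts)
    sz = sum(z for _, _, z in pts)
    sxx = sum(x * x for x, _, _ in pts); syy = sum(y * y for _, y, _ in pts)
    sxy = sum(x * y for x, y, _ in pts)
    sxz = sum(x * z for x, _, z in pts); syz = sum(y * z for _, y, z in pts)
    A = [[n, sx, sy], [sx, sxx, sxy], [sy, sxy, syy]]
    return solve3(A, [sz, sxz, syz])

# Synthetic elevations lying on the plane z = 10 + 2x - 3y are recovered.
pts = [(x, y, 10 + 2 * x - 3 * y) for x in range(4) for y in range(4)]
a, b, c = planar_trend(pts)
print(round(a), round(b), round(c))  # -> 10 2 -3
```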

  12. On Model Based Synthesis of Embedded Control Software

    OpenAIRE

    Alimguzhin, Vadim; Mari, Federico; Melatti, Igor; Salvo, Ivano; Tronci, Enrico

    2012-01-01

    Many Embedded Systems are indeed Software Based Control Systems (SBCSs), that is control systems whose controller consists of control software running on a microcontroller device. This motivates investigation on Formal Model Based Design approaches for control software. Given the formal model of a plant as a Discrete Time Linear Hybrid System and the implementation specifications (that is, number of bits in the Analog-to-Digital (AD) conversion) correct-by-construction control software can be...
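
    One of the implementation specifications mentioned, the number of bits in the AD conversion, fixes how finely the control software can observe the plant state. A minimal sketch of uniform n-bit quantization (the range and bit-width below are arbitrary choices, not values from the paper):

```python
def quantize(x, lo, hi, bits):
    """Map x in [lo, hi] onto one of 2**bits uniform levels (AD conversion)."""
    levels = 2 ** bits
    step = (hi - lo) / levels
    k = min(int((x - lo) / step), levels - 1)  # clamp the top edge
    return lo + (k + 0.5) * step               # mid-rise representative value

print(quantize(0.30, 0.0, 1.0, 2))  # 4 levels of width 0.25 -> 0.375
```

    Fewer bits mean coarser levels, which is exactly why the number of AD bits enters the correct-by-construction synthesis as a specification.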

  13. Trojan detection model based on network behavior analysis

    International Nuclear Information System (INIS)

    Liu Junrong; Liu Baoxu; Wang Wenjin

    2012-01-01

    Based on an analysis of existing Trojan detection technology, this paper presents a Trojan detection model based on network behavior analysis. First, the network behavior of Trojans is described abstractly; a library of characteristic behaviors is then established according to defined rules; finally, a support vector machine algorithm is used to decide whether a Trojan intrusion has occurred. Intrusion detection experiments show that this model can effectively detect Trojans. (authors)
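
    The detection flow, train on labelled behavior feature vectors and then classify new traffic, can be sketched as follows. A simple perceptron stands in for the paper's support vector machine, and the two features (a normalised connection rate and an upload-to-download ratio) are invented purely for illustration:

```python
# Toy behavior-based detector. The paper builds a characteristic-behavior
# library and trains an SVM; this sketch substitutes a perceptron and
# made-up, pre-normalised features just to show the train/classify flow.
def train(samples, labels, epochs=50, lr=0.1):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):              # y = +1 Trojan, -1 benign
            if y * (w[0] * x[0] + w[1] * x[1] + b) <= 0:   # misclassified
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b

def classify(model, x):
    w, b = model
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else -1

benign = [(0.05, 0.10), (0.08, 0.15), (0.03, 0.12)]  # quiet, download-heavy
trojan = [(0.80, 0.90), (0.90, 0.85), (0.85, 0.95)]  # beaconing, upload-heavy
model = train(benign + trojan, [-1] * 3 + [1] * 3)
print(classify(model, (0.82, 0.88)), classify(model, (0.06, 0.10)))  # -> 1 -1
```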

  14. Model based risk assessment - the CORAS framework

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjoern Axel; Fredriksen, Rune; Thunem, Atoosa P-J.

    2004-04-15

    Traditional risk analysis and assessment is based on failure-oriented models of the system. In contrast to this, model-based risk assessment (MBRA) utilizes success-oriented models describing all intended system aspects, including functional, operational and organizational aspects of the target. The target models are then used as input sources for complementary risk analysis and assessment techniques, as well as a basis for the documentation of the assessment results. The EU-funded CORAS project developed a tool-supported methodology for the application of MBRA in security-critical systems. The methodology has been tested with successful outcome through a series of seven trials within the telemedicine and e-commerce areas. The CORAS project in general and the CORAS application of MBRA in particular have contributed positively to the visibility of model-based risk assessment and thus to the disclosure of several potentials for further exploitation of various aspects within this important research field. In that connection, the CORAS methodology's possibilities for further improvement towards utilization in more complex architectures and also in other application domains such as the nuclear field can be addressed. The latter calls for adapting the framework to address nuclear standards such as IEC 60880 and IEC 61513. For this development we recommend applying a trial-driven approach within the nuclear field. The tool-supported approach for combining risk analysis and system development also fits well with the HRP proposal for developing an Integrated Design Environment (IDE) providing efficient methods and tools to support control room systems design. (Author)

  15. Fuzzy model-based servo and model following control for nonlinear systems.

    Science.gov (United States)

    Ohtake, Hiroshi; Tanaka, Kazuo; Wang, Hua O

    2009-12-01

    This correspondence presents servo and nonlinear model following controls for a class of nonlinear systems using the Takagi-Sugeno fuzzy model-based control approach. First, the construction method of the augmented fuzzy system for continuous-time nonlinear systems is proposed by differentiating the original nonlinear system. Second, the dynamic fuzzy servo controller and the dynamic fuzzy model following controller, which can make outputs of the nonlinear system converge to target points and to outputs of the reference system, respectively, are introduced. Finally, the servo and model following controller design conditions are given in terms of linear matrix inequalities. Design examples illustrate the utility of this approach.
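
    A Takagi-Sugeno fuzzy model represents a nonlinear plant as a membership-weighted blend of local linear models. The scalar sketch below shows only that blending step; the local gains and triangular memberships are invented, and the paper's LMI-based servo and model-following controller design is not reproduced:

```python
# Takagi-Sugeno blending: x' = sum_i h_i(x) * (a_i * x + b_i * u), sum h_i = 1.
# All numbers below are illustrative, not taken from the paper.
def membership(x, lo=-2.0, hi=2.0):
    h1 = min(max((hi - x) / (hi - lo), 0.0), 1.0)  # degree of "x is small"
    return h1, 1.0 - h1                            # degree of "x is large"

def ts_step(x, u, dt=0.01):
    a = (-1.0, -3.0)   # local A "matrices" (scalars in this sketch)
    b = (1.0, 0.5)     # local B gains
    h = membership(x)
    dx = sum(hi * (ai * x + bi * u) for hi, ai, bi in zip(h, a, b))
    return x + dt * dx  # forward-Euler step of the blended dynamics

# At x = 0 the memberships are equal, so the blend averages the two rules:
# dx = 0.5*(0 + 1.0) + 0.5*(0 + 0.5) = 0.75
print(round(ts_step(0.0, u=1.0), 4))
```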

  16. Data base on animal mortality

    International Nuclear Information System (INIS)

    Jones, T.D.

    1987-01-01

    A data base on animal mortality has been compiled. The literature on LD50 and the dose-response function for radiation-induced lethality reflects several inconsistencies, primarily due to differences in dose assignments and in the analytical methods and/or mathematical models used. Thus, in order to make the individual experiments included in the data base as consistent as possible, an estimate of the uniform dose received by the bone marrow in each treatment group was made so that interspecies differences are minimized. The LD50 was recalculated using a single estimation procedure for all studies for which sufficient experimental data are available. For small animals such as mice, the dose to the hematopoietic system is approximately equal to the treatment dose, but for large animals the marrow dose may be about half of the treatment dose
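
    An LD50 is the dose at which a fitted dose-response curve crosses 50% mortality. The sketch below fits a two-parameter logistic curve to made-up marrow-dose data by a crude grid search; it illustrates the idea of applying one uniform estimation procedure to every study, not the report's actual method:

```python
import math

# p(d) = 1 / (1 + exp(-k * (d - LD50))): two-parameter logistic dose response.
# Data and grid ranges are illustrative only.
def logistic(d, ld50, k):
    return 1.0 / (1.0 + math.exp(-k * (d - ld50)))

def fit_ld50(doses, mortality):
    best = None
    for ld50 in [x * 0.05 for x in range(40, 200)]:   # candidate LD50: 2-10 Gy
        for k in [x * 0.1 for x in range(5, 50)]:     # candidate slope: 0.5-5
            sse = sum((logistic(d, ld50, k) - m) ** 2
                      for d, m in zip(doses, mortality))
            if best is None or sse < best[0]:
                best = (sse, ld50, k)
    return best[1]

doses = [2, 4, 6, 8, 10]                    # marrow dose, Gy (made-up)
mortality = [0.02, 0.15, 0.50, 0.85, 0.98]  # fraction dead (made-up)
print(round(fit_ld50(doses, mortality), 2))  # close to 6 Gy by symmetry
```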

  17. A burnout prediction model based around char morphology

    Energy Technology Data Exchange (ETDEWEB)

    Tao Wu; Edward Lester; Michael Cloke [University of Nottingham, Nottingham (United Kingdom). School of Chemical, Environmental and Mining Engineering

    2006-05-15

    Several combustion models have been developed that can make predictions about coal burnout and burnout potential. Most of these kinetic models require standard parameters such as volatile content and particle size to make a burnout prediction. This article presents a new model called the char burnout (ChB) model, which also uses detailed information about char morphology in its prediction. The input data to the model is based on information derived from two different image analysis techniques. One technique generates characterization data from real char samples, and the other predicts char types based on characterization data from image analysis of coal particles. The pyrolyzed chars in this study were created in a drop tube furnace operating at 1300°C, 200 ms, and 1% oxygen. Modeling results were compared with a different carbon burnout kinetic model as well as the actual burnout data from refiring the same chars in a drop tube furnace operating at 1300°C, 5% oxygen, and residence times of 200, 400, and 600 ms. A good agreement between the ChB model and experimental data indicates that the inclusion of char morphology in combustion models could well improve model predictions. 38 refs., 5 figs., 6 tabs.
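
    The role of residence time in burnout can be illustrated with simple first-order kinetics, in which the unburned char fraction decays exponentially with time. The rate constant below is invented, and the ChB model's morphology-dependent kinetics are not reproduced:

```python
import math

# First-order burnout sketch: unburned char fraction decays as exp(-k*t).
# k is a made-up rate constant; the ChB model additionally conditions the
# kinetics on char morphology type, which is not shown here.
def burnout(t_ms, k=3.0):                 # k in 1/s, hypothetical
    return 1.0 - math.exp(-k * t_ms / 1000.0)

for t in (200, 400, 600):                 # drop-tube residence times, ms
    print(t, round(100 * burnout(t), 1))  # percent burnout at each time
```

    With these illustrative kinetics, burnout rises steeply between the 200 ms and 600 ms residence times, mirroring why the refiring experiments sampled exactly that range.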

  18. Characteristics of the large corporation-based, bureaucratic model among OECD countries - an FOI model analysis

    Directory of Open Access Journals (Sweden)

    Bartha Zoltán

    2014-03-01

    Full Text Available Deciding on the development path of the economy has been a delicate question in economic policy, not least because of the trade-off effects which immediately worsen certain economic indicators as steps are taken to improve others. The aim of the paper is to present a framework that helps decide on such policy dilemmas. This framework is based on an analysis conducted among OECD countries with the FOI model (focusing on future, outside and inside potentials. Several development models can be deduced by this method, out of which only the large corporation-based, bureaucratic model is discussed in detail. The large corporation-based, bureaucratic model implies a development strategy focused on the creation of domestic safe havens. Based on country studies, it is concluded that well-performing safe havens require the active participation of the state. We find that, in countries adhering to this model, business competitiveness is sustained through intensive public support, and an active role taken by the government in education, research and development, in detecting and exploiting special market niches, and in encouraging sectorial cooperation.

  19. An approach to model validation and model-based prediction -- polyurethane foam case study.

    Energy Technology Data Exchange (ETDEWEB)

    Dowding, Kevin J.; Rutherford, Brian Milne

    2003-07-01

    Enhanced software methodology and improved computing hardware have advanced the state of simulation technology to a point where large physics-based codes can be a major contributor in many systems analyses. This shift toward the use of computational methods has brought with it new research challenges in a number of areas including characterization of uncertainty, model validation, and the analysis of computer output. It is these challenges that have motivated the work described in this report. Approaches to and methods for model validation and (model-based) prediction have been developed recently in the engineering, mathematics and statistical literatures. In this report we have provided a fairly detailed account of one approach to model validation and prediction applied to an analysis investigating thermal decomposition of polyurethane foam. A model simulates the evolution of the foam in a high temperature environment as it transforms from a solid to a gas phase. The available modeling and experimental results serve as data for a case study focusing our model validation and prediction developmental efforts on this specific thermal application. We discuss several elements of the "philosophy" behind the validation and prediction approach: (1) We view the validation process as an activity applying to the use of a specific computational model for a specific application. We do acknowledge, however, that an important part of the overall development of a computational simulation initiative is the feedback provided to model developers and analysts associated with the application. (2) We utilize information obtained for the calibration of model parameters to estimate the parameters and quantify uncertainty in the estimates. We rely, however, on validation data (or data from similar analyses) to measure the variability that contributes to the uncertainty in predictions for specific systems or units (unit-to-unit variability). (3) We perform statistical
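
    Element (2) of the philosophy, calibrate parameters from one data set and quantify unit-to-unit variability from validation data, can be sketched with a toy linear mass-loss model. The model form and all numbers are invented, not taken from the foam analysis:

```python
import statistics

# Sketch of element (2): parameters are estimated from calibration data,
# while validation residuals supply the unit-to-unit spread attached to
# the prediction. Model and numbers are illustrative only.
def fit_rate(times, masses):
    """Least-squares fit of m(t) = m0 - r*t; returns (m0, r)."""
    n = len(times)
    tb = sum(times) / n
    mb = sum(masses) / n
    slope = sum((t - tb) * (m - mb) for t, m in zip(times, masses)) \
            / sum((t - tb) ** 2 for t in times)
    return mb - slope * tb, -slope        # intercept m0 and loss rate r

cal_t = [0, 1, 2, 3, 4]                   # calibration unit: time
cal_m = [10.0, 9.0, 8.1, 7.0, 5.9]        # calibration unit: mass
m0, r = fit_rate(cal_t, cal_m)

# Validation units: residuals against the calibrated model measure
# unit-to-unit variability.
val = [(2, 8.3), (2, 7.8), (2, 8.0)]
resid = [m - (m0 - r * t) for t, m in val]
spread = statistics.pstdev(resid)
pred = m0 - r * 3.0                       # model prediction at t = 3
print(round(pred, 2), round(spread, 2))   # prediction and its unit spread
```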

  20. A model-data based systems approach to process intensification

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    . Their developments, however, are largely due to experiment-based trial-and-error approaches and while they do not require validation, they can be time-consuming and resource-intensive. Also, one may ask, can a truly new intensified unit operation be obtained in this way? An alternative two-stage approach is to apply...... a model-based synthesis method to systematically generate and evaluate alternatives in the first stage and an experiment-model based validation in the second stage. In this way, the search for alternatives is done very quickly, reliably and systematically over a wide range, while resources are preserved...... for focused validation of only the promising candidates in the second stage. This approach, however, would be limited to intensification based on “known” unit operations, unless the PI process synthesis/design is considered at a lower level of aggregation, namely the phenomena level. That is, the model-based...