WorldWideScience

Sample records for modeling approach shows

  1. An integrated proteomics approach shows synaptic plasticity changes in an APP/PS1 Alzheimer's mouse model

    DEFF Research Database (Denmark)

    Kempf, Stefan J; Metaxas, Athanasios; Ibáñez-Vea, María

    2016-01-01

    The aim of this study was to elucidate the molecular signature of Alzheimer's disease-associated amyloid pathology. We used the double APPswe/PS1ΔE9 mouse, a widely used model of cerebral amyloidosis, to compare changes in the proteome, including global phosphorylation and sialylated N-linked glycosyl...

  2. Modeling antibiotic treatment in hospitals: A systematic approach shows benefits of combination therapy over cycling, mixing, and mono-drug therapies.

    Science.gov (United States)

    Tepekule, Burcu; Uecker, Hildegard; Derungs, Isabel; Frenoy, Antoine; Bonhoeffer, Sebastian

    2017-09-01

    Multiple treatment strategies are available for empiric antibiotic therapy in hospitals, but neither clinical studies nor theoretical investigations have yielded a clear picture of when each strategy is optimal and why. Extending earlier work by others and ourselves, we present a mathematical model capturing treatment strategies using two drugs, i.e., the multi-drug therapies referred to as cycling, mixing, and combination therapy, as well as monotherapy with either drug. We randomly sample a large parameter space to determine the conditions determining success or failure of these strategies. We find that combination therapy tends to outperform the other treatment strategies. By using linear discriminant analysis and particle swarm optimization, we find that the most important parameters determining success or failure of combination therapy relative to the other treatment strategies are the de novo rate of emergence of double resistance in patients infected with sensitive bacteria and the fitness costs associated with double resistance. The rate at which double resistance is imported into the hospital via patients admitted from the outside community has little influence, as all treatment strategies are affected equally. The parameter sets for which combination therapy fails tend to fall into areas of low biological plausibility, as they are characterised by very high rates of de novo emergence of resistance to both drugs compared to a single drug, combined with a cost of double resistance considerably smaller than the sum of the costs of single resistance.
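    The record above describes sampling a large parameter space and then using linear discriminant analysis to identify which parameters separate successful from failed combination therapy. The sketch below illustrates that analysis style only; the parameter names, the uniform sampling ranges, and the "success" rule are invented placeholders, not the published model.

    ```python
    # Hedged sketch (not the authors' code): rank which sampled model parameters best
    # separate runs where combination therapy succeeds from runs where it fails,
    # using linear discriminant analysis as described in the abstract.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    n_runs = 5000
    # Hypothetical parameter names; the real model has more (emergence rates,
    # fitness costs, import rates, ...).
    names = ["de_novo_double_res", "cost_double_res", "import_double_res", "treatment_rate"]
    X = rng.uniform(0.0, 1.0, size=(n_runs, len(names)))      # random parameter sample
    # Placeholder outcome standing in for "combination therapy outperforms alternatives".
    y = (X[:, 0] < 0.3) & (X[:, 1] > 0.4)

    lda = LinearDiscriminantAnalysis().fit(StandardScaler().fit_transform(X), y)
    for name, w in sorted(zip(names, np.abs(lda.coef_[0])), key=lambda t: -t[1]):
        print(f"{name:22s} |LDA weight| = {w:.2f}")   # larger weight = more discriminative
    ```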

  3. Restrained eaters show enhanced automatic approach tendencies towards food

    NARCIS (Netherlands)

    Veenstra, Esther M.; de Jong, Peter J.

    Although restrained eaters intend to limit their caloric intake, they nevertheless frequently fail and indulge in exactly the foods they want to avoid. Because automatic food-relevant approach tendencies and affective associations may both (independently) contribute to the dysregulation of food

  4. A Population Pharmacokinetic Modeling Approach Shows that Serum Penicillin G Concentrations Are Below Inhibitory Concentrations by Two Weeks after Benzathine Penicillin G Injection in the Majority of Young Adults

    Science.gov (United States)

    2014-11-01

    a proposed protective threshold against group A Streptococcus pyogenes (GAS). The final population model included linear absorption into a central...the prevention and treatment of group A Streptococcus pyogenes (GAS) infections. This minimum threshold differs among authorities but is usually set...liter is the suggested minimum protective concentration of penicillin G against group A Streptococcus. Note that the majority of measured concentrations

  5. HEDR modeling approach

    International Nuclear Information System (INIS)

    Shipler, D.B.; Napier, B.A.

    1992-07-01

    This report details the conceptual approaches to be used in calculating radiation doses to individuals throughout the various periods of operations at the Hanford Site. The report considers the major environmental transport pathways--atmospheric, surface water, and ground water--and projects an appropriate modeling technique for each. The modeling sequence chosen for each pathway depends on the available data on doses, the degree of confidence justified by such existing data, and the level of sophistication deemed appropriate for the particular pathway and time period being considered.

  6. Modeling prosody: Different approaches

    Science.gov (United States)

    Carmichael, Lesley M.

    2002-11-01

    Prosody pervades all aspects of a speech signal, both in terms of raw acoustic outcomes and linguistically meaningful units, from the phoneme to the discourse unit. It is carried in the suprasegmental features of fundamental frequency, loudness, and duration. Several models have been developed to account for the way prosody organizes speech, and they vary widely in terms of their theoretical assumptions, organizational primitives, actual procedures of application to speech, and intended use (e.g., to generate speech from text vs. to model the prosodic phonology of a language). In many cases, these models overtly contradict one another with regard to their fundamental premises or their identification of the perceptible objects of linguistic prosody. These competing models are directly compared. Each model is applied to the same speech samples. This parallel analysis allows for a critical inspection of each model and its efficacy in assessing the suprasegmental behavior of the speech. The analyses illustrate how different approaches are better equipped to account for different aspects of prosody. Viewing the models and their successes from an objective perspective allows for creative possibilities in terms of combining strengths from models which might otherwise be considered fundamentally incompatible.

  7. Showing that the race model inequality is not violated

    DEFF Research Database (Denmark)

    Gondan, Matthias; Riehl, Verena; Blurton, Steven Paul

    2012-01-01

    When participants are asked to respond in the same way to stimuli from different sources (e. g., auditory and visual), responses are often observed to be substantially faster when both stimuli are presented simultaneously (redundancy gain). Different models account for this effect, the two most...

  8. Material Modelling - Composite Approach

    DEFF Research Database (Denmark)

    Nielsen, Lauge Fuglsang

    1997-01-01

    This report is part of a research project on "Control of Early Age Cracking" which, in turn, is part of the major research programme "High Performance Concrete - The Contractor's Technology (HETEK)", coordinated by the Danish Road Directorate, Copenhagen, Denmark, 1997. A composite-rheological model of concrete is presented by which consistent predictions of creep, relaxation, and internal stresses can be made from known concrete composition, age at loading, and climatic conditions. No other existing "creep prediction method" offers these possibilities in one approach. A basic assumption of the model presented in this report is that cement paste and concrete behave practically as linear-viscoelastic materials from an age of approximately 10 hours. This is a significant age extension relative to earlier studies in the literature, where linear-viscoelastic behavior is only demonstrated from ages of a few days.

  9. Classifying Multi-Model Wheat Yield Impact Response Surfaces Showing Sensitivity to Temperature and Precipitation Change

    Science.gov (United States)

    Fronzek, Stefan; Pirttioja, Nina; Carter, Timothy R.; Bindi, Marco; Hoffmann, Holger; Palosuo, Taru; Ruiz-Ramos, Margarita; Tao, Fulu; Trnka, Miroslav; Acutis, Marco; hide

    2017-01-01

    Crop growth simulation models can differ greatly in their treatment of key processes and hence in their response to environmental conditions. Here, we used an ensemble of 26 process-based wheat models applied at sites across a European transect to compare their sensitivity to changes in temperature (minus 2 to plus 9 degrees Centigrade) and precipitation (minus 50 to plus 50 percent). Model results were analysed by plotting them as impact response surfaces (IRSs), classifying the IRS patterns of individual model simulations, describing these classes and analysing factors that may explain the major differences in model responses. The model ensemble was used to simulate yields of winter and spring wheat at four sites in Finland, Germany and Spain. Results were plotted as IRSs that show changes in yields relative to the baseline with respect to temperature and precipitation. IRSs of 30-year means and selected extreme years were classified using two approaches describing their pattern. The expert diagnostic approach (EDA) combines two aspects of IRS patterns: location of the maximum yield (nine classes) and strength of the yield response with respect to climate (four classes), resulting in a total of 36 combined classes defined using criteria pre-specified by experts. The statistical diagnostic approach (SDA) groups IRSs by comparing their pattern and magnitude, without attempting to interpret these features. It applies a hierarchical clustering method, grouping response patterns using a distance metric that combines the spatial correlation and Euclidean distance between IRS pairs. The two approaches were used to investigate whether different patterns of yield response could be related to different properties of the crop models, specifically their genealogy, calibration and process description. Although no single model property across a large model ensemble was found to explain the integrated yield response to temperature and precipitation perturbations, the
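    As an illustration of the statistical diagnostic approach (SDA) described above, the sketch below clusters synthetic impact response surfaces with a distance that mixes pattern correlation and Euclidean distance. The grid size, the random surfaces, and the equal weighting of the two distance terms are assumptions made here for brevity.

    ```python
    # Hedged sketch of the SDA step: hierarchical clustering of impact response surfaces
    # (each surface flattened to a vector of yield changes over a temperature x precipitation grid).
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import squareform

    rng = np.random.default_rng(1)
    n_models, grid = 26, 12 * 11                 # 26 models, hypothetical 12x11 (T, P) grid
    irs = rng.normal(size=(n_models, grid))      # placeholder IRS values (relative yield change)

    d = np.zeros((n_models, n_models))
    for i in range(n_models):
        for j in range(i + 1, n_models):
            corr = np.corrcoef(irs[i], irs[j])[0, 1]                 # pattern similarity
            eucl = np.linalg.norm(irs[i] - irs[j]) / np.sqrt(grid)   # magnitude difference
            d[i, j] = d[j, i] = 0.5 * (1.0 - corr) + 0.5 * eucl      # combined metric (illustrative weights)

    Z = linkage(squareform(d), method="average")   # hierarchical clustering of response patterns
    print(fcluster(Z, t=4, criterion="maxclust"))  # e.g. cut the tree into four IRS classes
    ```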

  10. Dog owners show experience-based viewing behaviour in judging dog face approachability

    OpenAIRE

    Gavin, Carla Jade; Houghton, Sarah; Guo, Kun

    2017-01-01

    Our prior visual experience plays a critical role in face perception. We show superior perceptual performance for differentiating conspecific (vs non-conspecific), own-race (vs other-race) and familiar (vs unfamiliar) faces. However, it remains unclear whether our experience with faces of other species would influence our gaze allocation for extracting salient facial information. In this eye-tracking study, we asked both dog owners and non-owners to judge the approachability of human, monkey ...

  11. Showing the Unsayable: Participatory Visual Approaches and the Constitution of 'Patient Experience' in Healthcare Quality Improvement.

    Science.gov (United States)

    Papoulias, Constantina

    2018-06-01

    This article considers the strengths and potential contributions of participatory visual methods for healthcare quality improvement research. It argues that such approaches may enable us to expand our understanding of 'patient experience' and of its potential for generating new knowledge for health systems. In particular, they may open up dimensions of people's engagement with services and treatments which exceed both the declarative nature of responses to questionnaires and the narrative sequencing of self reports gathered through qualitative interviewing. I will suggest that working with such methods may necessitate a more reflexive approach to the constitution of evidence in quality improvement work. To this end, the article will first consider the emerging rationale for the use of visual participatory methods in improvement before outlining the implications of two related approaches-photo-elicitation and PhotoVoice-for the constitution of 'experience'. It will then move to a participatory model for healthcare improvement work, Experience Based Co-Design (EBCD). It will argue that EBCD exemplifies both the strengths and the limitations of adequating visual participatory approaches to quality improvement ends. The article will conclude with a critical reflection on a small photographic study, in which the author participated, and which sought to harness service user perspectives for the design of psychiatric facilities, as a way of considering the potential contribution of visual participatory methods for quality improvement.

  12. Dog owners show experience-based viewing behaviour in judging dog face approachability.

    Science.gov (United States)

    Gavin, Carla Jade; Houghton, Sarah; Guo, Kun

    2017-01-01

    Our prior visual experience plays a critical role in face perception. We show superior perceptual performance for differentiating conspecific (vs non-conspecific), own-race (vs other-race) and familiar (vs unfamiliar) faces. However, it remains unclear whether our experience with faces of other species would influence our gaze allocation for extracting salient facial information. In this eye-tracking study, we asked both dog owners and non-owners to judge the approachability of human, monkey and dog faces, and systematically compared their behavioural performance and gaze pattern associated with the task. Compared to non-owners, dog owners assessed dog faces with shorter time and fewer fixations, but gave higher approachability ratings. The gaze allocation within local facial features was also modulated by the ownership. The averaged proportion of the fixations and viewing time directed at the dog mouth region were significantly less for the dog owners, and more experienced dog owners tended to look more at the dog eyes, suggesting the adoption of a prior experience-based viewing behaviour for assessing dog approachability. No differences in behavioural performance and gaze pattern were observed between dog owners and non-owners when judging human and monkey faces, implying that the dog owner's experience-based gaze strategy for viewing dog faces was not transferable across faces of other species.

  13. A novel statistical approach shows evidence for multi-system physiological dysregulation during aging.

    Science.gov (United States)

    Cohen, Alan A; Milot, Emmanuel; Yong, Jian; Seplaki, Christopher L; Fülöp, Tamàs; Bandeen-Roche, Karen; Fried, Linda P

    2013-03-01

    Previous studies have identified many biomarkers that are associated with aging and related outcomes, but the relevance of these markers for underlying processes and their relationship to hypothesized systemic dysregulation is not clear. We address this gap by presenting a novel method for measuring dysregulation via the joint distribution of multiple biomarkers and assessing associations of dysregulation with age and mortality. Using longitudinal data from the Women's Health and Aging Study, we selected a 14-marker subset from 63 blood measures: those that diverged from the baseline population mean with age. For the 14 markers and all combinatorial sub-subsets we calculated a multivariate distance called the Mahalanobis distance (MHBD) for all observations, indicating how "strange" each individual's biomarker profile was relative to the baseline population mean. In most models, MHBD correlated positively with age, MHBD increased within individuals over time, and higher MHBD predicted higher risk of subsequent mortality. Predictive power increased as more variables were incorporated into the calculation of MHBD. Biomarkers from multiple systems were implicated. These results support hypotheses of simultaneous dysregulation in multiple systems and confirm the need for longitudinal, multivariate approaches to understanding biomarkers in aging. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
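    The dysregulation measure described above is, at its core, a Mahalanobis distance from the baseline population mean. A minimal sketch follows, using random numbers in place of the Women's Health and Aging Study biomarkers; the 14-marker panel and covariance here are placeholders.

    ```python
    # Minimal sketch (not the authors' code): Mahalanobis distance of one individual's
    # biomarker profile from the baseline population mean, using the baseline covariance.
    import numpy as np

    rng = np.random.default_rng(2)
    baseline = rng.normal(size=(500, 14))            # hypothetical: 500 individuals x 14 biomarkers
    mu = baseline.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(baseline, rowvar=False))

    def mhbd(profile):
        """Mahalanobis distance of one biomarker profile from the baseline mean."""
        delta = profile - mu
        return float(np.sqrt(delta @ cov_inv @ delta))

    follow_up = mu + rng.normal(scale=1.5, size=14)  # a later, more "dysregulated" observation
    print(mhbd(mu), mhbd(follow_up))                 # distance grows as the profile diverges
    ```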

  14. Sequence-based typing of HLA-DQA1: comprehensive approach showed molecular heterogeneity.

    Science.gov (United States)

    Voorter, C E M; Lee, K W; Smillie, D; Tilanus, M G J; van den Berg-Loonen, E M

    2007-04-01

    Within the human leukocyte antigen-DQA1 workshop project the level of molecular heterogeneity of the DQA1 gene was investigated. An improved sequence-based typing protocol was used, enabling analysis of the complete coding sequence, comprising exons 1-4. The participating laboratories implemented the amplification and sequencing primers in their own sequence-based typing approach. The method proved to be sufficiently robust to handle the differences in protocols. All reference samples used for validation were correctly typed for DQA1 by all participating laboratories. Three different populations with a total of 736 individuals were investigated: a population of Korean origin (n= 467), a British Caucasian (n= 114), and a Dutch Caucasian (n= 155) population. Sixteen of the known 28 DQA1 alleles were detected and six new alleles were identified. All novel alleles showed a nucleotide substitution outside exon 2. Comparison of the calculated allele frequencies revealed major differences between the Korean and the Caucasian populations but also between Dutch and British Caucasians. A tight association between DQA1 and DRB1/DQB1 alleles was observed in all three populations.

  15. Hybrid approaches to physiologic modeling and prediction

    Science.gov (United States)

    Olengü, Nicholas O.; Reifman, Jaques

    2005-05-01

    This paper explores how the accuracy of a first-principles physiological model can be enhanced by integrating data-driven, "black-box" models with the original model to form a "hybrid" model system. Both linear (autoregressive) and nonlinear (neural network) data-driven techniques are separately combined with a first-principles model to predict human body core temperature. Rectal core temperature data from nine volunteers, subject to four 30/10-minute cycles of moderate exercise/rest regimen in both CONTROL and HUMID environmental conditions, are used to develop and test the approach. The results show significant improvements in prediction accuracy, with average improvements of up to 30% for prediction horizons of 20 minutes. The models developed from one subject's data are also used in the prediction of another subject's core temperature. Initial results for this approach for a 20-minute horizon show no significant improvement over the first-principles model by itself.
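    A minimal sketch of the hybrid idea described above: a first-principles prediction corrected by a data-driven model fit to its residuals. The toy temperature model, the AR(2) correction, and all constants are assumptions for illustration, not the paper's physiological model.

    ```python
    # Hedged sketch of a "hybrid" predictor: physics-based prediction plus an autoregressive
    # correction learned from the model's own residuals.
    import numpy as np

    def physical_model(t_minutes):
        """Toy first-principles prediction of core temperature (deg C) during exercise/rest."""
        return 37.0 + 0.8 * (1.0 - np.exp(-t_minutes / 60.0))

    rng = np.random.default_rng(3)
    t = np.arange(0, 160)                                   # four 30/10-minute cycles
    observed = physical_model(t) + 0.2 * np.sin(t / 15.0) + rng.normal(0, 0.02, t.size)

    residuals = observed - physical_model(t)
    # Fit a simple AR(2) correction to the residuals (least squares on lagged values).
    X = np.column_stack([residuals[1:-1], residuals[:-2]])
    a = np.linalg.lstsq(X, residuals[2:], rcond=None)[0]

    horizon = 20                                            # minutes ahead
    r1, r2 = residuals[-1], residuals[-2]
    for _ in range(horizon):                                # roll the AR correction forward
        r1, r2 = a[0] * r1 + a[1] * r2, r1
    hybrid = physical_model(t[-1] + horizon) + r1           # physics + learned correction
    print(f"hybrid 20-min-ahead prediction: {hybrid:.2f} C")
    ```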

  16. Show Me the Way: Future Faculty Prefer Directive Feedback When Trying Active Learning Approaches

    Science.gov (United States)

    Stephens, Jessica D.; Battle, David C.; Gormally, Cara L.; Brickman, Peggy

    2017-01-01

    Early training opportunities for future faculty, namely graduate students and postdoctoral researchers, can better prepare them to use active learning approaches. We know that instructional feedback supports sustained change and motivates instructors to improve teaching practices. Here, we incorporate feedback as a key component of a pedagogical…

  17. Visualizing Three-dimensional Slab Geometries with ShowEarthModel

    Science.gov (United States)

    Chang, B.; Jadamec, M. A.; Fischer, K. M.; Kreylos, O.; Yikilmaz, M. B.

    2017-12-01

    Seismic data that characterize the morphology of modern subducted slabs on Earth suggest that a two-dimensional paradigm is no longer adequate to describe the subduction process. Here we demonstrate the effect of data exploration of three-dimensional (3D) global slab geometries with the open source program ShowEarthModel. ShowEarthModel was designed specifically to support data exploration, by focusing on interactivity and real-time response using the Vrui toolkit. Sixteen movies are presented that explore the 3D complexity of modern subduction zones on Earth. The first movie provides a guided tour through the Earth's major subduction zones, comparing the global slab geometry data sets of Gudmundsson and Sambridge (1998), Syracuse and Abers (2006), and Hayes et al. (2012). Fifteen regional movies explore the individual subduction zones and regions intersecting slabs, using the Hayes et al. (2012) slab geometry models where available and the Engdahl and Villasenor (2002) global earthquake data set. Viewing the subduction zones in this way provides an improved conceptualization of the 3D morphology within a given subduction zone as well as the 3D spatial relations between the intersecting slabs. This approach provides a powerful tool for rendering earth properties and broadening capabilities in both Earth Science research and education by allowing for whole earth visualization. The 3D characterization of global slab geometries is placed in the context of 3D slab-driven mantle flow and observations of shear wave splitting in subduction zones. These visualizations contribute to the paradigm shift from a 2D to 3D subduction framework by facilitating the conceptualization of the modern subduction system on Earth in 3D space.

  18. Learning Actions Models: Qualitative Approach

    DEFF Research Database (Denmark)

    Bolander, Thomas; Gierasimczuk, Nina

    2015-01-01

    First we check two basic learnability criteria: finite identifiability (conclusively inferring the appropriate action model in finite time) and identifiability in the limit (inconclusive convergence to the right action model). We show that deterministic actions are finitely identifiable, while non-deterministic actions require more learning power—they are identifiable in the limit. We then move on to a particular learning method, which proceeds via restriction of a space of events within a learning-specific action model. This way of learning closely resembles the well-known update method from dynamic epistemic logic. We introduce several different learning methods suited for finite identifiability of particular types of deterministic actions.

  19. Rifalazil and derivative compounds show potent efficacy in a mouse model of H. pylori colonization.

    Science.gov (United States)

    Rothstein, David M; Mullin, Steve; Sirokman, Klari; Söndergaard, Karen L; Johnson, Starrla; Gwathmey, Judith K; van Duzer, John; Murphy, Christopher K

    2008-08-01

    The rifamycin rifalazil (RFZ), and derivatives (NCEs) were efficacious in a mouse model of Helicobacter pylori colonization. Select NCEs were more active in vitro and showed greater efficacy than RFZ. A systemic component contributes to efficacy.

  20. HEDR modeling approach: Revision 1

    International Nuclear Information System (INIS)

    Shipler, D.B.; Napier, B.A.

    1994-05-01

    This report is a revision of the previous Hanford Environmental Dose Reconstruction (HEDR) Project modeling approach report. This revised report describes the methods used in performing scoping studies and estimating final radiation doses to real and representative individuals who lived in the vicinity of the Hanford Site. The scoping studies and dose estimates pertain to various environmental pathways during various periods of time. The original report discussed the concepts under consideration in 1991. The methods for estimating dose have been refined as understanding of existing data, the scope of pathways, and the magnitudes of dose estimates were evaluated through scoping studies

  1. HEDR modeling approach: Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Shipler, D.B.; Napier, B.A.

    1994-05-01

    This report is a revision of the previous Hanford Environmental Dose Reconstruction (HEDR) Project modeling approach report. This revised report describes the methods used in performing scoping studies and estimating final radiation doses to real and representative individuals who lived in the vicinity of the Hanford Site. The scoping studies and dose estimates pertain to various environmental pathways during various periods of time. The original report discussed the concepts under consideration in 1991. The methods for estimating dose have been refined as understanding of existing data, the scope of pathways, and the magnitudes of dose estimates were evaluated through scoping studies.

  2. The speed of memory errors shows the influence of misleading information: Testing the diffusion model and discrete-state models.

    Science.gov (United States)

    Starns, Jeffrey J; Dubé, Chad; Frelinger, Matthew E

    2018-05-01

    In this report, we evaluate single-item and forced-choice recognition memory for the same items and use the resulting accuracy and reaction time data to test the predictions of discrete-state and continuous models. For the single-item trials, participants saw a word and indicated whether or not it was studied on a previous list. The forced-choice trials had one studied and one non-studied word that both appeared in the earlier single-item trials and both received the same response. Thus, forced-choice trials always had one word with a previous correct response and one with a previous error. Participants were asked to select the studied word regardless of whether they previously called both words "studied" or "not studied." The diffusion model predicts that forced-choice accuracy should be lower when the word with a previous error had a fast versus a slow single-item RT, because fast errors are associated with more compelling misleading memory retrieval. The two-high-threshold (2HT) model does not share this prediction because all errors are guesses, so error RT is not related to memory strength. A low-threshold version of the discrete state approach predicts an effect similar to the diffusion model, because errors are a mixture of responses based on misleading retrieval and guesses, and the guesses should tend to be slower. Results showed that faster single-trial errors were associated with lower forced-choice accuracy, as predicted by the diffusion and low-threshold models. Copyright © 2018 Elsevier Inc. All rights reserved.

  3. Modeling Approaches in Planetary Seismology

    Science.gov (United States)

    Weber, Renee; Knapmeyer, Martin; Panning, Mark; Schmerr, Nick

    2014-01-01

    Of the many geophysical means that can be used to probe a planet's interior, seismology remains the most direct. Given that the seismic data gathered on the Moon over 40 years ago revolutionized our understanding of the Moon and are still being used today to produce new insight into the state of the lunar interior, it is no wonder that many future missions, both real and conceptual, plan to take seismometers to other planets. To best facilitate the return of high-quality data from these instruments, as well as to further our understanding of the dynamic processes that modify a planet's interior, various modeling approaches are used to quantify parameters such as the amount and distribution of seismicity, tidal deformation, and seismic structure on and of the terrestrial planets. In addition, recent advances in wavefield modeling have permitted a renewed look at seismic energy transmission and the effects of attenuation and scattering, as well as the presence and effect of a core, on recorded seismograms. In this chapter, we will review these approaches.

  4. A Conceptual Modeling Approach for OLAP Personalization

    Science.gov (United States)

    Garrigós, Irene; Pardillo, Jesús; Mazón, Jose-Norberto; Trujillo, Juan

    Data warehouses rely on multidimensional models in order to provide decision makers with appropriate structures to intuitively analyze data with OLAP technologies. However, data warehouses may be potentially large and multidimensional structures become increasingly complex to be understood at a glance. Even if a departmental data warehouse (also known as data mart) is used, these structures would be also too complex. As a consequence, acquiring the required information is more costly than expected and decision makers using OLAP tools may get frustrated. In this context, current approaches for data warehouse design are focused on deriving a unique OLAP schema for all analysts from their previously stated information requirements, which is not enough to lighten the complexity of the decision making process. To overcome this drawback, we argue for personalizing multidimensional models for OLAP technologies according to the continuously changing user characteristics, context, requirements and behaviour. In this paper, we present a novel approach to personalizing OLAP systems at the conceptual level based on the underlying multidimensional model of the data warehouse, a user model and a set of personalization rules. The great advantage of our approach is that a personalized OLAP schema is provided for each decision maker contributing to better satisfy their specific analysis needs. Finally, we show the applicability of our approach through a sample scenario based on our CASE tool for data warehouse development.

  5. Branding approach and valuation models

    Directory of Open Access Journals (Sweden)

    Mamula Tatjana

    2006-01-01

    Much of the skill of marketing and branding nowadays is concerned with building equity for products whose characteristics, pricing, distribution and availability are really quite close to each other. Brands allow the consumer to shop with confidence. The real power of successful brands is that they meet the expectations of those that buy them or, to put it another way, they represent a promise kept. As such they are a contract between a seller and a buyer: if the seller keeps to its side of the bargain, the buyer will be satisfied; if not, the buyer will in future look elsewhere. Understanding consumer perceptions and associations is an important first step to understanding brand preferences and choices. In this paper, we discuss different models for measuring brand value according to a couple of well-known approaches, as requested by companies. We rely upon several empirical examples.

  6. Szekeres models: a covariant approach

    Science.gov (United States)

    Apostolopoulos, Pantelis S.

    2017-05-01

    We exploit the 1+1+2 formalism to covariantly describe the inhomogeneous and anisotropic Szekeres models. It is shown that an average scale length can be defined covariantly which satisfies a 2d equation of motion driven by the effective gravitational mass (EGM) contained in the dust cloud. The contributions to the EGM are encoded in the energy density of the dust fluid and the free gravitational field E_ab. We show that the quasi-symmetric property of the Szekeres models is justified through the existence of 3 independent intrinsic Killing vector fields (IKVFs). In addition the notions of the apparent and absolute apparent horizons are briefly discussed and we give an alternative gauge-invariant form to define them in terms of the kinematical variables of the spacelike congruences. We argue that the proposed program can be used in order to express Sachs' optical equations in a covariant form and analyze the confrontation of a spatially inhomogeneous irrotational overdense fluid model with the observational data.

  7. Interfacial Fluid Mechanics A Mathematical Modeling Approach

    CERN Document Server

    Ajaev, Vladimir S

    2012-01-01

    Interfacial Fluid Mechanics: A Mathematical Modeling Approach provides an introduction to mathematical models of viscous flow used in rapidly developing fields of microfluidics and microscale heat transfer. The basic physical effects are first introduced in the context of simple configurations and their relative importance in typical microscale applications is discussed. Then, several configurations of importance to microfluidics, most notably thin films/droplets on substrates and confined bubbles, are discussed in detail. Topics from current research on electrokinetic phenomena, liquid flow near structured solid surfaces, evaporation/condensation, and surfactant phenomena are discussed in the later chapters. This book also: discusses mathematical models in the context of actual applications such as electrowetting; includes unique material on fluid flow near structured surfaces and phase change phenomena; shows readers how to solve modeling problems related to microscale multiphase flows. Interfacial Fluid Me...

  8. Modeled hydrologic metrics show links between hydrology and the functional composition of stream assemblages.

    Science.gov (United States)

    Patrick, Christopher J; Yuan, Lester L

    2017-07-01

    Flow alteration is widespread in streams, but current understanding of the effects of differences in flow characteristics on stream biological communities is incomplete. We tested hypotheses about the effect of variation in hydrology on stream communities by using generalized additive models to relate watershed information to the values of different flow metrics at gauged sites. Flow models accounted for 54-80% of the spatial variation in flow metric values among gauged sites. We then used these models to predict flow metrics in 842 ungauged stream sites in the mid-Atlantic United States that were sampled for fish, macroinvertebrates, and environmental covariates. Fish and macroinvertebrate assemblages were characterized in terms of a suite of metrics that quantified aspects of community composition, diversity, and functional traits that were expected to be associated with differences in flow characteristics. We related modeled flow metrics to biological metrics in a series of stressor-response models. Our analyses identified both drying and base flow instability as explaining 30-50% of the observed variability in fish and invertebrate community composition. Variations in community composition were related to variations in the prevalence of dispersal traits in invertebrates and trophic guilds in fish. The results demonstrate that we can use statistical models to predict hydrologic conditions at bioassessment sites, which, in turn, we can use to estimate relationships between flow conditions and biological characteristics. This analysis provides an approach to quantify the effects of spatial variation in flow metrics using readily available biomonitoring data. © 2017 by the Ecological Society of America.
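    A compact sketch of the two-stage workflow described above, under stated assumptions: synthetic watershed covariates and flow/biology values, the pyGAM library standing in for the study's generalized additive models, and ordinary least squares for the stressor-response step. Variable names are hypothetical.

    ```python
    # Hedged sketch: (1) a GAM predicts a flow metric from watershed covariates at gauged
    # sites; (2) predictions at ungauged biomonitoring sites feed a simple stressor-response fit.
    import numpy as np
    from pygam import LinearGAM, s

    rng = np.random.default_rng(4)
    # Stage 1: hypothetical gauged sites -- covariates (drainage area, precip, % forest).
    X_gauged = rng.uniform(size=(300, 3))
    baseflow = 0.4 * X_gauged[:, 1] + 0.3 * np.sqrt(X_gauged[:, 0]) + rng.normal(0, 0.05, 300)
    gam = LinearGAM(s(0) + s(1) + s(2)).fit(X_gauged, baseflow)

    # Stage 2: predict the flow metric at ungauged bioassessment sites, then relate it
    # to a biological metric (e.g., a trait-based invertebrate score) by linear regression.
    X_ungauged = rng.uniform(size=(842, 3))
    flow_hat = gam.predict(X_ungauged)
    bio_metric = 2.0 * flow_hat + rng.normal(0, 0.2, 842)       # placeholder response
    slope, intercept = np.polyfit(flow_hat, bio_metric, 1)
    print(f"stressor-response slope: {slope:.2f}")
    ```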

  9. A physics-explicit model of bacterial conjugation shows the stabilizing role of the conjugative junction

    OpenAIRE

    Pastuszak, Jakub; Waclaw, Bartlomiej

    2017-01-01

    Conjugation is a process in which bacteria exchange DNA through a physical connection (conjugative junction) between mating cells. Despite its significance for processes such as the spread of antibiotic resistance, the role of physical forces in conjugation is poorly understood. Here we use computer models to show that the conjugative junction not only serves as a link to transfer the DNA but it also mechanically stabilises the mating pair which significantly increases the conjugation rate. W...

  10. An equilibrium approach to modelling social interaction

    Science.gov (United States)

    Gallo, Ignacio

    2009-07-01

    The aim of this work is to put forward a statistical mechanics theory of social interaction, generalizing econometric discrete choice models. After showing the formal equivalence linking econometric multinomial logit models to equilibrium statistical mechanics, a multi-population generalization of the Curie-Weiss model for ferromagnets is considered as a starting point in developing a model capable of describing sudden shifts in aggregate human behaviour. Existence of the thermodynamic limit for the model is shown by an asymptotic sub-additivity method and factorization of correlation functions is proved almost everywhere. The exact solution of the model is provided in the thermodynamical limit by finding converging upper and lower bounds for the system's pressure, and the solution is used to prove an analytic result regarding the number of possible equilibrium states of a two-population system. The work stresses the importance of linking regimes predicted by the model to real phenomena, and to this end it proposes two possible procedures to estimate the model's parameters starting from micro-level data. These are applied to three case studies based on census type data: though these studies are found to be ultimately inconclusive on an empirical level, considerations are drawn that encourage further refinements of the chosen modelling approach.
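    For readers unfamiliar with the multi-population Curie-Weiss model mentioned above, the following is a standard textbook form of its two-population mean-field equations; the notation is generic and not necessarily the author's.

    ```latex
    % Two-population Curie-Weiss mean-field equations (generic textbook form):
    % \alpha_k are relative population sizes, J_{kl} interaction strengths,
    % h_k external fields, and m_k the average choice (magnetization) in group k.
    \begin{align}
      m_1 &= \tanh\!\bigl(\beta\,(J_{11}\alpha_1 m_1 + J_{12}\alpha_2 m_2 + h_1)\bigr),\\
      m_2 &= \tanh\!\bigl(\beta\,(J_{21}\alpha_1 m_1 + J_{22}\alpha_2 m_2 + h_2)\bigr).
    \end{align}
    ```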

  11. A Bayesian approach to model uncertainty

    International Nuclear Information System (INIS)

    Buslik, A.

    1994-01-01

    A Bayesian approach to model uncertainty is taken. For the case of a finite number of alternative models, the model uncertainty is equivalent to parameter uncertainty. A derivation based on Savage's partition problem is given
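    A one-line way to see the claim above that, for finitely many candidate models, model uncertainty reduces to parameter uncertainty: treat the model index as one more discrete parameter with a prior. Sketched below in standard Bayesian notation (ours, not necessarily the report's).

    ```latex
    % Posterior over a finite set of candidate models M_1, ..., M_K given data D,
    % and the resulting averaged posterior for a quantity of interest \theta:
    \begin{align}
      P(M_k \mid D) &= \frac{P(D \mid M_k)\,P(M_k)}{\sum_{j=1}^{K} P(D \mid M_j)\,P(M_j)},\\
      P(\theta \mid D) &= \sum_{k=1}^{K} P(\theta \mid M_k, D)\,P(M_k \mid D).
    \end{align}
    ```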

  12. Microarray profiling shows distinct differences between primary tumors and commonly used preclinical models in hepatocellular carcinoma

    International Nuclear Information System (INIS)

    Wang, Weining; Iyer, N. Gopalakrishna; Tay, Hsien Ts’ung; Wu, Yonghui; Lim, Tony K. H.; Zheng, Lin; Song, In Chin; Kwoh, Chee Keong; Huynh, Hung; Tan, Patrick O. B.; Chow, Pierce K. H.

    2015-01-01

    Despite advances in therapeutics, outcomes for hepatocellular carcinoma (HCC) remain poor and there is an urgent need for efficacious systemic therapy. Unfortunately, drugs that are successful in preclinical studies often fail in the clinical setting, and we hypothesize that this is due to functional differences between primary tumors and commonly used preclinical models. In this study, we attempt to answer this question by comparing tumor morphology and gene expression profiles between primary tumors, xenografts and HCC cell lines. Hep G2 cell lines and tumor cells from patient tumor explants were subcutaneously (ectopically) injected into the flank and orthotopically into liver parenchyma of Mus Musculus SCID mice. The mice were euthanized after two weeks. RNA was extracted from the tumors, and gene expression profiling was performed using the Gene Chip Human Genome U133 Plus 2.0. Principal component analyses (PCA) and construction of dendrograms were conducted using Partek genomics suite. PCA showed that the commonly used HepG2 cell line model and its xenograft counterparts were vastly different from all fresh primary tumors. Expression profiles of primary tumors were also significantly divergent from their counterpart patient-derived xenograft (PDX) models, regardless of the site of implantation. Xenografts from the same primary tumors were more likely to cluster together regardless of site of implantation, although heat maps showed distinct differences in gene expression profiles between orthotopic and ectopic models. The data presented here challenges the utility of routinely used preclinical models. Models using HepG2 were vastly different from primary tumors and PDXs, suggesting that this is not clinically representative. Surprisingly, site of implantation (orthotopic versus ectopic) resulted in limited impact on gene expression profiles, and in both scenarios xenografts differed significantly from the original primary tumors, challenging the long

  13. Porcine Esophageal Submucosal Gland Culture Model Shows Capacity for Proliferation and Differentiation

    Directory of Open Access Journals (Sweden)

    Richard J. von Furstenberg

    2017-11-01

    Background & Aims: Although cells comprising esophageal submucosal glands (ESMGs) represent a potential progenitor cell niche, new models are needed to understand their capacity to proliferate and differentiate. By histologic appearance, ESMGs have been associated with both overlying normal squamous epithelium and columnar epithelium. Our aim was to assess ESMG proliferation and differentiation in a 3-dimensional culture model. Methods: We evaluated proliferation in human ESMGs from normal and diseased tissue by proliferating cell nuclear antigen immunohistochemistry. Next, we compared 5-ethynyl-2′-deoxyuridine labeling in porcine ESMGs in vivo before and after esophageal injury with a novel in vitro porcine organoid ESMG model. Microarray analysis of ESMGs in culture was compared with squamous epithelium and fresh ESMGs. Results: Marked proliferation was observed in human ESMGs of diseased tissue. This activated ESMG state was recapitulated after esophageal injury in an in vivo porcine model, ESMGs assumed a ductal appearance with increased proliferation compared with control. Isolated and cultured porcine ESMGs produced buds with actively cycling cells and passaged to form epidermal growth factor–dependent spheroids. These spheroids were highly proliferative and were passaged multiple times. Two phenotypes of spheroids were identified: solid squamous (P63+) and hollow/ductal (cytokeratin 7+). Microarray analysis showed spheroids to be distinct from parent ESMGs and enriched for columnar transcripts. Conclusions: Our results suggest that the activated ESMG state, seen in both human disease and our porcine model, may provide a source of cells to repopulate damaged epithelium in a normal manner (squamous) or abnormally (columnar epithelium). This culture model will allow the evaluation of factors that drive ESMGs in the regeneration of injured epithelium. The raw microarray data have been uploaded to the National Center for

  14. Porcine Esophageal Submucosal Gland Culture Model Shows Capacity for Proliferation and Differentiation.

    Science.gov (United States)

    von Furstenberg, Richard J; Li, Joy; Stolarchuk, Christina; Feder, Rachel; Campbell, Alexa; Kruger, Leandi; Gonzalez, Liara M; Blikslager, Anthony T; Cardona, Diana M; McCall, Shannon J; Henning, Susan J; Garman, Katherine S

    2017-11-01

    Although cells comprising esophageal submucosal glands (ESMGs) represent a potential progenitor cell niche, new models are needed to understand their capacity to proliferate and differentiate. By histologic appearance, ESMGs have been associated with both overlying normal squamous epithelium and columnar epithelium. Our aim was to assess ESMG proliferation and differentiation in a 3-dimensional culture model. We evaluated proliferation in human ESMGs from normal and diseased tissue by proliferating cell nuclear antigen immunohistochemistry. Next, we compared 5-ethynyl-2'-deoxyuridine labeling in porcine ESMGs in vivo before and after esophageal injury with a novel in vitro porcine organoid ESMG model. Microarray analysis of ESMGs in culture was compared with squamous epithelium and fresh ESMGs. Marked proliferation was observed in human ESMGs of diseased tissue. This activated ESMG state was recapitulated after esophageal injury in an in vivo porcine model, ESMGs assumed a ductal appearance with increased proliferation compared with control. Isolated and cultured porcine ESMGs produced buds with actively cycling cells and passaged to form epidermal growth factor-dependent spheroids. These spheroids were highly proliferative and were passaged multiple times. Two phenotypes of spheroids were identified: solid squamous (P63+) and hollow/ductal (cytokeratin 7+). Microarray analysis showed spheroids to be distinct from parent ESMGs and enriched for columnar transcripts. Our results suggest that the activated ESMG state, seen in both human disease and our porcine model, may provide a source of cells to repopulate damaged epithelium in a normal manner (squamous) or abnormally (columnar epithelium). This culture model will allow the evaluation of factors that drive ESMGs in the regeneration of injured epithelium. The raw microarray data have been uploaded to the National Center for Biotechnology Information Gene Expression Omnibus (accession number: GSE100543).

  15. Small GSK-3 Inhibitor Shows Efficacy in a Motor Neuron Disease Murine Model Modulating Autophagy.

    Directory of Open Access Journals (Sweden)

    Estefanía de Munck

    Amyotrophic lateral sclerosis (ALS) is a progressive motor neuron degenerative disease that has no effective treatment to date. Drug discovery has been hampered by the lack of knowledge of its molecular etiology together with the limited animal models available for research. Recently, a motor neuron disease animal model has been developed using β-N-methylamino-L-alanine (L-BMAA), a neurotoxic amino acid related to the appearance of ALS. In the present work, the neuroprotective role of VP2.51, a small heterocyclic GSK-3 inhibitor, is analysed in this novel murine model together with the analysis of autophagy. Daily administration of VP2.51 for two weeks, starting the first day after L-BMAA treatment, leads to total recovery of neurological symptoms and prevents the activation of autophagic processes in rats. These results show that the L-BMAA murine model can be used to test the efficacy of new drugs. In addition, the results confirm the therapeutic potential of GSK-3 inhibitors, and especially VP2.51, for the disease-modifying future treatment of motor neuron disorders like ALS.

  16. Human Commercial Models' Eye Colour Shows Negative Frequency-Dependent Selection.

    Directory of Open Access Journals (Sweden)

    Isabela Rodrigues Nogueira Forti

    In this study we investigated the eye colour of human commercial models registered in the UK (400 female and 400 male) and Brazil (400 female and 400 male) to test the hypothesis that model eye colour frequency was the result of negative frequency-dependent selection. The eye colours of the models were classified as: blue, brown or intermediate. Chi-square analyses of data for countries separated by sex showed that in the United Kingdom brown eyes and intermediate colours were significantly more frequent than expected in comparison to the general United Kingdom population (P<0.001). In Brazil, the most frequent eye colour, brown, was significantly less frequent than expected in comparison to the general Brazilian population. These results support the hypothesis that model eye colour is the result of negative frequency-dependent selection. This could be the result of people using eye colour as a marker of genetic diversity and finding rarer eye colours more attractive because of the potential advantage of the more genetically diverse offspring that could result from such a choice. Eye colour may be important because in comparison to many other physical traits (e.g., hair colour) it is hard to modify, hide or disguise, and it is highly polymorphic.

  17. Human Commercial Models' Eye Colour Shows Negative Frequency-Dependent Selection.

    Science.gov (United States)

    Forti, Isabela Rodrigues Nogueira; Young, Robert John

    2016-01-01

    In this study we investigated the eye colour of human commercial models registered in the UK (400 female and 400 male) and Brazil (400 female and 400 male) to test the hypothesis that model eye colour frequency was the result of negative frequency-dependent selection. The eye colours of the models were classified as: blue, brown or intermediate. Chi-square analyses of data for countries separated by sex showed that in the United Kingdom brown eyes and intermediate colours were significantly more frequent than expected in comparison to the general United Kingdom population (P<0.001). In Brazil, the most frequent eye colour, brown, was significantly less frequent than expected in comparison to the general Brazilian population. These results support the hypothesis that model eye colour is the result of negative frequency-dependent selection. This could be the result of people using eye colour as a marker of genetic diversity and finding rarer eye colours more attractive because of the potential advantage of the more genetically diverse offspring that could result from such a choice. Eye colour may be important because in comparison to many other physical traits (e.g., hair colour) it is hard to modify, hide or disguise, and it is highly polymorphic.

  18. Histidine decarboxylase knockout mice, a genetic model of Tourette syndrome, show repetitive grooming after induced fear.

    Science.gov (United States)

    Xu, Meiyu; Li, Lina; Ohtsu, Hiroshi; Pittenger, Christopher

    2015-05-19

    Tics, such as are seen in Tourette syndrome (TS), are common and can cause profound morbidity, but they are poorly understood. Tics are potentiated by psychostimulants, stress, and sleep deprivation. Mutations in the gene histidine decarboxylase (Hdc) have been implicated as a rare genetic cause of TS, and Hdc knockout mice have been validated as a genetic model that recapitulates phenomenological and pathophysiological aspects of the disorder. Tic-like stereotypies in this model have not been observed at baseline but emerge after acute challenge with the psychostimulant d-amphetamine. We tested the ability of an acute stressor to stimulate stereotypies in this model, using tone fear conditioning. Hdc knockout mice acquired conditioned fear normally, as manifested by freezing during the presentation of a tone 48h after it had been paired with a shock. During the 30min following tone presentation, knockout mice showed increased grooming. Heterozygotes exhibited normal freezing and intermediate grooming. These data validate a new paradigm for the examination of tic-like stereotypies in animals without pharmacological challenge and enhance the face validity of the Hdc knockout mouse as a pathophysiologically grounded model of tic disorders. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  19. MTO1-deficient mouse model mirrors the human phenotype showing complex I defect and cardiomyopathy.

    Directory of Open Access Journals (Sweden)

    Lore Becker

    Recently, mutations in the mitochondrial translation optimization factor 1 gene (MTO1) were identified as causative in children with hypertrophic cardiomyopathy, lactic acidosis and respiratory chain defect. Here, we describe an MTO1-deficient mouse model generated by gene trap mutagenesis that mirrors the human phenotype remarkably well. As in patients, the most prominent signs and symptoms were cardiovascular and included bradycardia and cardiomyopathy. In addition, the mutant mice showed a marked worsening of arrhythmias during induction and reversal of anaesthesia. The detailed morphological and biochemical workup of murine hearts indicated that the myocardial damage was due to complex I deficiency and mitochondrial dysfunction. In contrast, neurological examination was largely normal in Mto1-deficient mice. A translational consequence of this mouse model may be to caution against anaesthesia-related cardiac arrhythmias which may be fatal in patients.

  20. Visual modeling shows that avian host parents use multiple visual cues in rejecting parasitic eggs.

    Science.gov (United States)

    Spottiswoode, Claire N; Stevens, Martin

    2010-05-11

    One of the most striking outcomes of coevolution between species is egg mimicry by brood parasitic birds, resulting from rejection behavior by discriminating host parents. Yet, how exactly does a host detect a parasitic egg? Brood parasitism and egg rejection behavior provide a model system for exploring the relative importance of different visual cues used in a behavioral task. Although hosts are discriminating, we do not know exactly what cues they use, and to answer this it is crucial to account for the receiver's visual perception. Color, luminance ("perceived lightness") and pattern information have never been simultaneously quantified and experimentally tested through a bird's eye. The cuckoo finch Anomalospiza imberbis and its hosts show spectacular polymorphisms in egg appearance, providing a good opportunity for investigating visual discrimination owing to the large range of patterns and colors involved. Here we combine field experiments in Africa with modeling of avian color vision and pattern discrimination to identify the specific visual cues used by hosts in making rejection decisions. We found that disparity between host and foreign eggs in both color and several aspects of pattern (dispersion, principal marking size, and variability in marking size) were important predictors of rejection, especially color. These cues correspond exactly to the principal differences between host and parasitic eggs, showing that hosts use the most reliable available cues in making rejection decisions, and select for parasitic eggs that are increasingly mimetic in a range of visual attributes.

  1. Transchromosomic cell model of Down syndrome shows aberrant migration, adhesion and proteome response to extracellular matrix

    Directory of Open Access Journals (Sweden)

    Cotter Finbarr E

    2009-08-01

    Background: Down syndrome (DS), caused by trisomy of human chromosome 21 (HSA21), is the most common genetic birth defect. Congenital heart defects (CHD) are seen in 40% of DS children, and >50% of all atrioventricular canal defects in infancy are caused by trisomy 21, but the causative genes remain unknown. Results: Here we show that aberrant adhesion and proliferation of DS cells can be reproduced using a transchromosomic model of DS (mouse fibroblasts bearing supernumerary HSA21). We also demonstrate a decrease in cell migration in transchromosomic cells independently of their adhesion properties. We show that the cell-autonomous proteome response to the presence of Collagen VI in the extracellular matrix is strongly affected by trisomy 21. Conclusion: This set of experiments establishes a new model system for genetic dissection of the specific HSA21 gene-overdose contributions to aberrant cell migration, adhesion, proliferation and specific proteome response to collagen VI, cellular phenotypes linked to the pathogenesis of CHD.

  2. "A cigarette a day keeps the goodies away": smokers show automatic approach tendencies for smoking--but not for food-related stimuli.

    Directory of Open Access Journals (Sweden)

    Alla Machulska

    Smoking leads to the development of automatic tendencies that promote approach behavior toward smoking-related stimuli, which in turn may maintain addictive behavior. The present study examined whether automatic approach tendencies toward smoking-related stimuli can be measured by using an adapted version of the Approach-Avoidance Task (AAT). Given that progression of addictive behavior has been associated with a decreased reactivity of the brain reward system for stimuli signaling natural rewards, we also used the AAT to measure approach behavior toward natural rewarding stimuli in smokers. During the AAT, 92 smokers and 51 non-smokers viewed smoking-related vs. non-smoking-related pictures and pictures of natural rewards (i.e., highly palatable food) vs. neutral pictures. They were instructed to ignore image content and to respond to picture orientation by either pulling or pushing a joystick. Within-group comparisons revealed that smokers showed an automatic approach bias exclusively for smoking-related pictures. Contrary to our expectations, there was no difference in smokers' and non-smokers' approach bias for nicotine-related stimuli, indicating that non-smokers also showed approach tendencies for this picture category. Yet, in contrast to non-smokers, smokers did not show an approach bias for food-related pictures. Moreover, self-reported smoking attitude could not predict approach-avoidance behavior toward nicotine-related pictures in smokers or non-smokers. Our findings indicate that the AAT is suited for measuring smoking-related approach tendencies in smokers. Furthermore, we provide evidence for a diminished approach tendency toward food-related stimuli in smokers, suggesting a decreased sensitivity to natural rewards in the course of nicotine addiction. Our results indicate that in contrast to similar studies conducted in alcohol, cannabis and heroin users, the AAT might only be partially suited for measuring smoking-related approach

  3. Learning Action Models: Qualitative Approach

    NARCIS (Netherlands)

    Bolander, T.; Gierasimczuk, N.; van der Hoek, W.; Holliday, W.H.; Wang, W.-F.

    2015-01-01

    In dynamic epistemic logic, actions are described using action models. In this paper we introduce a framework for studying learnability of action models from observations. We present first results concerning propositional action models. First we check two basic learnability criteria: finite

  4. Estimating carbon and showing impacts of drought using satellite data in regression-tree models

    Science.gov (United States)

    Boyte, Stephen; Wylie, Bruce K.; Howard, Danny; Dahal, Devendra; Gilmanov, Tagir G.

    2018-01-01

    Integrating spatially explicit biogeophysical and remotely sensed data into regression-tree models enables the spatial extrapolation of training data over large geographic spaces, allowing a better understanding of broad-scale ecosystem processes. The current study presents annual gross primary production (GPP) and annual ecosystem respiration (RE) for 2000–2013 in several short-statured vegetation types using carbon flux data from towers that are located strategically across the conterminous United States (CONUS). We calculate carbon fluxes (annual net ecosystem production [NEP]) for each year in our study period, which includes 2012 when drought and higher-than-normal temperatures influence vegetation productivity in large parts of the study area. We present and analyse carbon flux dynamics in the CONUS to better understand how drought affects GPP, RE, and NEP. Model accuracy metrics show strong correlation coefficients (r) (r ≥ 94%) between training and estimated data for both GPP and RE. Overall, average annual GPP, RE, and NEP are relatively constant throughout the study period except during 2012 when almost 60% less carbon is sequestered than normal. These results allow us to conclude that this modelling method effectively estimates carbon dynamics through time and allows the exploration of impacts of meteorological anomalies and vegetation types on carbon dynamics.
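    The sketch below illustrates the general recipe described above (train a tree-based regression on flux-tower data, then extrapolate spatially); the covariates, the random-forest stand-in for the study's regression-tree models, and all values are placeholders.

    ```python
    # Hedged sketch (not the study's code): fit tree-based regressions of tower GPP and RE
    # on biogeophysical/remote-sensing covariates, then extrapolate to grid cells and take
    # NEP = GPP - RE. Covariate names and all numbers are hypothetical placeholders.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(5)
    # Training rows = tower site-years; columns = e.g. NDVI, precipitation, temperature, soil water.
    X_towers = rng.uniform(size=(400, 4))
    gpp = 600 * X_towers[:, 0] + 200 * X_towers[:, 1] - 150 * X_towers[:, 2] + rng.normal(0, 30, 400)
    re_ = 0.7 * gpp + rng.normal(0, 25, 400)              # placeholder ecosystem respiration

    gpp_model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_towers, gpp)
    re_model = RandomForestRegressor(n_estimators=200, random_state=1).fit(X_towers, re_)
    print("training r (GPP):", np.corrcoef(gpp, gpp_model.predict(X_towers))[0, 1].round(2))

    X_grid = rng.uniform(size=(10_000, 4))                # stands in for gridded covariates
    nep_map = gpp_model.predict(X_grid) - re_model.predict(X_grid)   # annual NEP per grid cell
    print("mean NEP:", nep_map.mean().round(1))
    ```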

  5. Modeling, observation and control: a multi-model approach

    OpenAIRE

    Elkhalil, Mansoura

    2011-01-01

    This thesis is devoted to the control of systems whose dynamics can be suitably described by a multimodel approach, starting from an investigation of model reference adaptive control performance enhancement. Four multimodel control approaches have been proposed. The first approach is based on an output reference model control design. A successful experimental validation involving a chemical reactor has been carried out. The second approach is based on a suitable partial state model reference ...

  6. MERGING DIGITAL SURFACE MODELS IMPLEMENTING BAYESIAN APPROACHES

    Directory of Open Access Journals (Sweden)

    H. Sadeq

    2016-06-01

    Full Text Available In this research different DSMs from different sources have been merged. The merging is based on a probabilistic model using a Bayesian Approach. The implemented data have been sourced from very high resolution satellite imagery sensors (e.g. WorldView-1 and Pleiades). It is deemed preferable to use a Bayesian Approach when the data obtained from the sensors are limited and it is difficult or very costly to obtain many measurements; thus the problem of the lack of data can be solved by introducing a priori estimations of data. To infer the prior data, it is assumed that the roofs of the buildings are specified as smooth, and for that purpose local entropy has been implemented. In addition to the a priori estimations, GNSS RTK measurements have been collected in the field which are used as check points to assess the quality of the DSMs and to validate the merging result. The model has been applied in the West-End of Glasgow containing different kinds of buildings, such as flat roofed and hipped roofed buildings. Both quantitative and qualitative methods have been employed to validate the merged DSM. The validation results have shown that the model was successfully able to improve the quality of the DSMs, improving characteristics such as the roof surfaces and consequently leading to better representations. In addition to that, the developed model has been compared with the well-established Maximum Likelihood model and showed similar quantitative statistical results and better qualitative results. Although the proposed model has been applied to DSMs that were derived from satellite imagery, it can be applied to any other sourced DSMs.
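
    To make the merging idea concrete, a per-pixel Gaussian (conjugate) update can combine a prior surface with the two observed DSMs as a precision-weighted average. The sketch below is a minimal illustration under assumed sensor variances and a simple averaged prior; the paper's entropy-based roof prior and actual accuracy figures are not reproduced here.

      # Minimal sketch of per-pixel Bayesian fusion of two DSMs with Gaussian errors.
      # Heights, variances and the prior construction are assumed example values.
      import numpy as np

      dsm_a = np.array([[102.1, 102.4], [103.0, 103.6]])  # heights from sensor A (m)
      dsm_b = np.array([[101.7, 102.9], [102.6, 104.1]])  # heights from sensor B (m)
      var_a, var_b = 0.9**2, 0.6**2                       # assumed height variances (m^2)

      prior_mean = 0.5 * (dsm_a + dsm_b)   # stand-in for a smoothness-derived prior surface
      prior_var = 2.0**2

      # Conjugate Gaussian update: precision-weighted combination of prior and observations.
      post_precision = 1.0/prior_var + 1.0/var_a + 1.0/var_b
      merged = (prior_mean/prior_var + dsm_a/var_a + dsm_b/var_b) / post_precision
      print(merged)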

  7. Etoposide incorporated into camel milk phospholipids liposomes shows increased activity against fibrosarcoma in a mouse model.

    Science.gov (United States)

    Maswadeh, Hamzah M; Aljarbou, Ahmad N; Alorainy, Mohammed S; Alsharidah, Mansour S; Khan, Masood A

    2015-01-01

    Phospholipids were isolated from camel milk and identified by using high performance liquid chromatography and gas chromatography-mass spectrometry (GC/MS). Anticancer drug etoposide (ETP) was entrapped in liposomes, prepared from camel milk phospholipids, to determine its activity against fibrosarcoma in a murine model. Fibrosarcoma was induced in mice by injecting benzopyrene (BAP) and tumor-bearing mice were treated with various formulations of etoposide, including etoposide entrapped camel milk phospholipids liposomes (ETP-Cam-liposomes) and etoposide-loaded DPPC-liposomes (ETP-DPPC-liposomes). The tumor-bearing mice treated with ETP-Cam-liposomes showed slow progression of tumors and increased survival compared to free ETP or ETP-DPPC-liposomes. These results suggest that ETP-Cam-liposomes may prove to be a better drug delivery system for anticancer drugs.

  8. Etoposide Incorporated into Camel Milk Phospholipids Liposomes Shows Increased Activity against Fibrosarcoma in a Mouse Model

    Directory of Open Access Journals (Sweden)

    Hamzah M. Maswadeh

    2015-01-01

    Full Text Available Phospholipids were isolated from camel milk and identified by using high performance liquid chromatography and gas chromatography-mass spectrometry (GC/MS). Anticancer drug etoposide (ETP) was entrapped in liposomes, prepared from camel milk phospholipids, to determine its activity against fibrosarcoma in a murine model. Fibrosarcoma was induced in mice by injecting benzopyrene (BAP) and tumor-bearing mice were treated with various formulations of etoposide, including etoposide entrapped camel milk phospholipids liposomes (ETP-Cam-liposomes) and etoposide-loaded DPPC-liposomes (ETP-DPPC-liposomes). The tumor-bearing mice treated with ETP-Cam-liposomes showed slow progression of tumors and increased survival compared to free ETP or ETP-DPPC-liposomes. These results suggest that ETP-Cam-liposomes may prove to be a better drug delivery system for anticancer drugs.

  9. Phenolic Acids from Wheat Show Different Absorption Profiles in Plasma: A Model Experiment with Catheterized Pigs

    DEFF Research Database (Denmark)

    Nørskov, Natalja; Hedemann, Mette Skou; Theil, Peter Kappel

    2013-01-01

    The concentration and absorption of the nine phenolic acids of wheat were measured in a model experiment with catheterized pigs fed whole grain wheat and wheat aleurone diets. Six pigs in a repeated crossover design were fitted with catheters in the portal vein and mesenteric artery to study … consumed. Benzoic acid derivatives showed low concentration in the plasma (… diets). The exception was p-hydroxybenzoic acid, with a plasma concentration (4 ± 0.4 μM) much higher than the other plant phenolic acids, likely because it is an intermediate in the phenolic acid metabolism. … It was concluded that plant phenolic acids undergo extensive interconversion in the colon and that their absorption profiles reflected their low bioavailability in the plant matrix.

  10. Ebola Virus Makona Shows Reduced Lethality in an Immune-deficient Mouse Model.

    Science.gov (United States)

    Smither, Sophie J; Eastaugh, Lin; Ngugi, Sarah; O'Brien, Lyn; Phelps, Amanda; Steward, Jackie; Lever, Mark Stephen

    2016-10-15

    Ebola virus Makona (EBOV-Makona; from the 2013-2016 West Africa outbreak) shows decreased virulence in an immune-deficient mouse model, compared with a strain from 1976. Unlike other filoviruses tested, EBOV-Makona may be slightly more virulent by the aerosol route than by the injected route, as 2 mice died following aerosol exposure, compared with no mortality among mice that received intraperitoneal injection of equivalent or higher doses. Although most mice did not succumb to infection, the detection of an immunoglobulin G antibody response along with observed clinical signs suggest that the mice were infected but able to clear the infection and recover. We hypothesize that this may be due to the growth rates and kinetics of the virus, which appear slower than that for other filoviruses and consequently give more time for an immune response that results in clearance of the virus. In this instance, the immune-deficient mouse model is unlikely to be appropriate for testing medical countermeasures against this EBOV-Makona stock but may provide insight into pathogenesis and the immune response to virus. © Crown copyright 2016.

  11. Global energy modeling - A biophysical approach

    Energy Technology Data Exchange (ETDEWEB)

    Dale, Michael

    2010-09-15

    This paper contrasts the standard economic approach to energy modelling with energy models using a biophysical approach. Neither of these approaches includes changing energy-returns-on-investment (EROI) due to declining resource quality or the capital intensive nature of renewable energy sources. Both of these factors will become increasingly important in the future. An extension to the biophysical approach is outlined which encompasses a dynamic EROI function that explicitly incorporates technological learning. The model is used to explore several scenarios of long-term future energy supply especially concerning the global transition to renewable energy sources in the quest for a sustainable energy system.

  12. A nationwide modelling approach to decommissioning - 16182

    International Nuclear Information System (INIS)

    Kelly, Bernard; Lowe, Andy; Mort, Paul

    2009-01-01

    In this paper we describe a proposed UK national approach to modelling decommissioning. For the first time, we shall have an insight into optimizing the safety and efficiency of a national decommissioning strategy. To do this we use the General Case Integrated Waste Algorithm (GIA), a universal model of decommissioning nuclear plant, power plant, waste arisings and the associated knowledge capture. The model scales from individual items of plant through cells, groups of cells, buildings, whole sites and then on up to a national scale. We describe the national vision for GIA which can be broken down into three levels: 1) the capture of the chronological order of activities that an experienced decommissioner would use to decommission any nuclear facility anywhere in the world - this is Level 1 of GIA; 2) the construction of an Operational Research (OR) model based on Level 1 to allow rapid what if scenarios to be tested quickly (Level 2); 3) the construction of a state of the art knowledge capture capability that allows future generations to learn from our current decommissioning experience (Level 3). We show the progress to date in developing GIA in levels 1 and 2. As part of level 1, GIA has assisted in the development of an IMechE professional decommissioning qualification. Furthermore, we describe GIA as the basis of a UK-Owned database of decommissioning norms for such things as costs, productivity, durations etc. From level 2, we report on a pilot study that has successfully tested the basic principles for the OR numerical simulation of the algorithm. We then highlight the advantages of applying the OR modelling approach nationally. In essence, a series of 'what if...' scenarios can be tested that will improve the safety and efficiency of decommissioning. (authors)

  13. A Unified Approach to Modeling and Programming

    DEFF Research Database (Denmark)

    Madsen, Ole Lehrmann; Møller-Pedersen, Birger

    2010-01-01

    SIMULA was a language for modeling and programming and provided a unified approach to modeling and programming in contrast to methodologies based on structured analysis and design. The current development seems to be going in the direction of separation of modeling and programming. The goal of this paper is to go back to the future and get inspiration from SIMULA and propose a unified approach. In addition to reintroducing the contributions of SIMULA and the Scandinavian approach to object-oriented programming, we do this by discussing a number of issues in modeling and programming and argue why we …

  14. Atovaquone Nanosuspensions Show Excellent Therapeutic Effect in a New Murine Model of Reactivated Toxoplasmosis

    Science.gov (United States)

    Schöler, Nadja; Krause, Karsten; Kayser, Oliver; Müller, Rainer H.; Borner, Klaus; Hahn, Helmut; Liesenfeld, Oliver

    2001-01-01

    Immunocompromised patients are at risk of developing toxoplasma encephalitis (TE). Standard therapy regimens (including sulfadiazine plus pyrimethamine) are hampered by severe side effects. While atovaquone has potent in vitro activity against Toxoplasma gondii, it is poorly absorbed after oral administration and shows poor therapeutic efficacy against TE. To overcome the low absorption of atovaquone, we prepared atovaquone nanosuspensions (ANSs) for intravenous (i.v.) administration. At concentrations higher than 1.0 μg/ml, ANS did not exert cytotoxicity and was as effective as free atovaquone (i.e., atovaquone suspended in medium) against T. gondii in freshly isolated peritoneal macrophages. In a new murine model of TE that closely mimics reactivated toxoplasmosis in immunocompromised hosts, using mice with a targeted mutation in the gene encoding the interferon consensus sequence binding protein, i.v.-administered ANS doses of 10.0 mg/kg of body weight protected the animals against development of TE and death. Atovaquone was detectable in the sera, brains, livers, and lungs of mice by high-performance liquid chromatography. Development of TE and mortality in mice treated with 1.0- or 0.1-mg/kg i.v. doses of ANS did not differ from that in mice treated orally with 100 mg of atovaquone/kg. In conclusion, i.v. ANSs may prove to be an effective treatment alternative for patients with TE. PMID:11353624

  15. New azole derivatives showing antimicrobial effects and their mechanism of antifungal activity by molecular modeling studies.

    Science.gov (United States)

    Doğan, İnci Selin; Saraç, Selma; Sari, Suat; Kart, Didem; Eşsiz Gökhan, Şebnem; Vural, İmran; Dalkara, Sevim

    2017-04-21

    Azole antifungals are potent inhibitors of fungal lanosterol 14α demethylase (CYP51) and have been used clinically for eradication of systemic candidiasis. Herein we report the design, synthesis, and biological evaluation of a series of 1-phenyl/1-(4-chlorophenyl)-2-(1H-imidazol-1-yl)ethanol esters. Many of these derivatives showed fungal growth inhibition at very low concentrations. The minimum inhibitory concentration (MIC) of compound 15 was 0.125 μg/mL against Candida albicans. Additionally, some of our compounds, such as compound 19 (MIC: 0.25 μg/mL), were potent against resistant C. glabrata, a fungal strain less susceptible to some first-line antifungal drugs. We confirmed their antifungal efficacy by an antibiofilm test and their safety against human monocytes by a cytotoxicity assay. To rationalize their mechanism of action, we performed computational analysis utilizing molecular docking and dynamics simulations on the C. albicans and C. glabrata CYP51 (CACYP51 and CGCYP51) homology models we built. Leu130 and T131 emerged as possible key residues for inhibition of CGCYP51 by compound 19. Copyright © 2017 Elsevier Masson SAS. All rights reserved.

  16. Showing a model's eye movements in examples does not improve learning of problem-solving tasks

    NARCIS (Netherlands)

    van Marlen, Tim; van Wermeskerken, Margot; Jarodzka, Halszka; van Gog, Tamara

    2016-01-01

    Eye movement modeling examples (EMME) are demonstrations of a computer-based task by a human model (e.g., a teacher), with the model's eye movements superimposed on the task to guide learners' attention. EMME have been shown to enhance learning of perceptual classification tasks; however, it is an

  17. Multiple Model Approaches to Modelling and Control,

    DEFF Research Database (Denmark)

    appeal in building systems which operate robustly over a wide range of operating conditions by decomposing them into a number of simpler linear modelling or control problems, even for nonlinear modelling or control problems. This appeal has been a factor in the development of increasingly popular 'local' … to problems in the process industries, biomedical applications and autonomous systems. The successful application of the ideas to demanding problems is already encouraging, but creative development of the basic framework is needed to better allow the integration of human knowledge with automated learning. … The underlying question is 'How should we partition the system - what is "local"?'. This book presents alternative ways of bringing submodels together, which lead to varying levels of performance and insight. Some are further developed for autonomous learning of parameters from data, while others have focused…

  18. Modeling software behavior a craftsman's approach

    CERN Document Server

    Jorgensen, Paul C

    2009-01-01

    A common problem with most texts on requirements specifications is that they emphasize structural models to the near exclusion of behavioral models, focusing on what the software is rather than what it does. If they do cover behavioral models, the coverage is brief and usually focused on a single model. Modeling Software Behavior: A Craftsman's Approach provides detailed treatment of various models of software behavior that support early analysis, comprehension, and model-based testing. Based on the popular and continually evolving course on requirements specification models taught by the author.

  19. System Behavior Models: A Survey of Approaches

    Science.gov (United States)

    2016-06-01

    A comparison of approaches … identical state space results. The combined state space graph of the Petri model allowed a quick assessment of all potential states but was more cumbersome to build than the MP model.

  20. Problem-based learning using patient-simulated videos showing daily life for a comprehensive clinical approach.

    Science.gov (United States)

    Ikegami, Akiko; Ohira, Yoshiyuki; Uehara, Takanori; Noda, Kazutaka; Suzuki, Shingo; Shikino, Kiyoshi; Kajiwara, Hideki; Kondo, Takeshi; Hirota, Yusuke; Ikusaka, Masatomi

    2017-02-27

    We examined whether problem-based learning tutorials using patient-simulated videos showing daily life are more practical for clinical learning, compared with traditional paper-based problem-based learning, for the consideration rate of psychosocial issues and the recall rate for experienced learning. Twenty-two groups with 120 fifth-year students were each assigned paper-based problem-based learning and video-based problem-based learning using patient-simulated videos. We compared target achievement rates in questionnaires using the Wilcoxon signed-rank test and discussion contents diversity using the Mann-Whitney U test. A follow-up survey used a chi-square test to measure students' recall of cases in three categories: video, paper, and non-experienced. Video-based problem-based learning displayed significantly higher achievement rates for imagining authentic patients (p=0.001) and for incorporating a comprehensive approach including psychosocial aspects. Problem-based tutorials using patient-simulated videos can be implemented if we create such videos for each symptom as teaching materials.

  1. Current approaches to gene regulatory network modelling

    Directory of Open Access Journals (Sweden)

    Brazma Alvis

    2007-09-01

    Full Text Available Many different approaches have been developed to model and simulate gene regulatory networks. We proposed the following categories for gene regulatory network models: network parts lists, network topology models, network control logic models, and dynamic models. Here we will describe some examples for each of these categories. We will study the topology of gene regulatory networks in yeast in more detail, comparing a direct network derived from transcription factor binding data and an indirect network derived from genome-wide expression data in mutants. Regarding the network dynamics we briefly describe discrete and continuous approaches to network modelling, then describe a hybrid model called Finite State Linear Model and demonstrate that some simple network dynamics can be simulated in this model.
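
    As a toy illustration of the "dynamic models" category, the fragment below steps a three-gene Boolean network through a few synchronous updates. The genes and rules are invented for the example; the Finite State Linear Model mentioned above additionally combines discrete states with linear production terms and is not reproduced here.

      # Toy Boolean gene regulatory network, updated synchronously; purely illustrative.
      def step(state):
          a, b, c = state["A"], state["B"], state["C"]
          return {
              "A": not c,       # C represses A
              "B": a,           # A activates B
              "C": a and b,     # A and B jointly activate C
          }

      state = {"A": True, "B": False, "C": False}
      for t in range(6):        # print the trajectory of network states
          print(t, state)
          state = step(state)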

  2. Rhetoric Versus Reality? Laboratory Surveys Show Actual Practice Differs Considerably from Proposed Models and Mandated Calculations.

    Science.gov (United States)

    Westgard, Sten A

    2017-03-01

    The scientific debate on goals, measurement uncertainty, and individualized quality control plans has diverged significantly from the reality of laboratory operation. Academic articles promoting certain approaches are being ignored; laboratories may be in compliance with new regulations, mandates, and calculations, but most of them still adhere to traditional quality management practices. Despite a considerable effort to enforce measurement uncertainty and eliminate or discredit allowable total error, laboratories continue to use these older, more practical approaches for quality management. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. An Almost Integration-free Approach to Ordered Response Models

    NARCIS (Netherlands)

    van Praag, B.M.S.; Ferrer-i-Carbonell, A.

    2006-01-01

    In this paper we propose an alternative approach to the estimation of ordered response models. We show that the Probit-method may be replaced by a simple OLS-approach, called P(robit)OLS, without any loss of efficiency. This method can be generalized to the analysis of panel data. For large-scale

  4. Distributed simulation a model driven engineering approach

    CERN Document Server

    Topçu, Okan; Oğuztüzün, Halit; Yilmaz, Levent

    2016-01-01

    Backed by substantive case studies, the novel approach to software engineering for distributed simulation outlined in this text demonstrates the potent synergies between model-driven techniques, simulation, intelligent agents, and computer systems development.

  5. Classifying multi-model wheat yield impact response surfaces showing sensitivity to temperature and precipitation change

    NARCIS (Netherlands)

    Fronzek, Stefan; Pirttioja, Nina; Carter, Timothy R.; Bindi, Marco; Hoffmann, Holger; Palosuo, Taru; Ruiz-Ramos, Margarita; Tao, Fulu; Trnka, Miroslav; Acutis, Marco; Asseng, Senthold; Baranowski, Piotr; Basso, Bruno; Bodin, Per; Buis, Samuel; Cammarano, Davide; Deligios, Paola; Destain, Marie France; Dumont, Benjamin; Ewert, Frank; Ferrise, Roberto; François, Louis; Gaiser, Thomas; Hlavinka, Petr; Jacquemin, Ingrid; Kersebaum, Kurt Christian; Kollas, Chris; Krzyszczak, Jaromir; Lorite, Ignacio J.; Minet, Julien; Minguez, M.I.; Montesino, Manuel; Moriondo, Marco; Müller, Christoph; Nendel, Claas; Öztürk, Isik; Perego, Alessia; Rodríguez, Alfredo; Ruane, Alex C.; Ruget, Françoise; Sanna, Mattia; Semenov, Mikhail A.; Slawinski, Cezary; Stratonovitch, Pierre; Supit, Iwan; Waha, Katharina; Wang, Enli; Wu, Lianhai; Zhao, Zhigan; Rötter, Reimund P.

    2018-01-01

    Crop growth simulation models can differ greatly in their treatment of key processes and hence in their response to environmental conditions. Here, we used an ensemble of 26 process-based wheat models applied at sites across a European transect to compare their sensitivity to changes in

  6. Classifying multi-model wheat yield impact response surfaces showing sensitivity to temperature and precipitation change

    Czech Academy of Sciences Publication Activity Database

    Fronzek, S.; Pirttioja, N. K.; Carter, T. R.; Bindi, M.; Hoffmann, H.; Palosuo, T.; Ruiz-Ramos, M.; Tao, F.; Trnka, Miroslav; Acutis, M.; Asseng, S.; Baranowski, P.; Basso, B.; Bodin, P.; Buis, S.; Cammarano, D.; Deligios, P.; Destain, M. F.; Dumont, B.; Ewert, F.; Ferrise, R.; Francois, L.; Gaiser, T.; Hlavinka, Petr; Jacquemin, I.; Kersebaum, K. C.; Kollas, C.; Krzyszczak, J.; Lorite, I. J.; Minet, J.; Ines Minguez, M.; Montesino, M.; Moriondo, M.; Mueller, C.; Nendel, C.; Öztürk, I.; Perego, A.; Rodriguez, A.; Ruane, A. C.; Ruget, F.; Sanna, M.; Semenov, M. A.; Slawinski, C.; Stratonovitch, P.; Supit, I.; Waha, K.; Wang, E.; Wu, L.; Zhao, Z.; Rötter, R.

    2018-01-01

    Vol. 159, Jan (2018), pp. 209-224. ISSN 0308-521X. Keywords: climate-change * crop models * probabilistic assessment * simulating impacts * british catchments * uncertainty * europe * productivity * calibration * adaptation * Classification * Climate change * Crop model * Ensemble * Sensitivity analysis * Wheat. Impact factor: 2.571, year: 2016

  7. Predictive Modeling of Influenza Shows the Promise of Applied Evolutionary Biology.

    Science.gov (United States)

    Morris, Dylan H; Gostic, Katelyn M; Pompei, Simone; Bedford, Trevor; Łuksza, Marta; Neher, Richard A; Grenfell, Bryan T; Lässig, Michael; McCauley, John W

    2018-02-01

    Seasonal influenza is controlled through vaccination campaigns. Evolution of influenza virus antigens means that vaccines must be updated to match novel strains, and vaccine effectiveness depends on the ability of scientists to predict nearly a year in advance which influenza variants will dominate in upcoming seasons. In this review, we highlight a promising new surveillance tool: predictive models. Based on data-sharing and close collaboration between the World Health Organization and academic scientists, these models use surveillance data to make quantitative predictions regarding influenza evolution. Predictive models demonstrate the potential of applied evolutionary biology to improve public health and disease control. We review the state of influenza predictive modeling and discuss next steps and recommendations to ensure that these models deliver upon their considerable biomedical promise. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Fractal approach to computer-analytical modelling of tree crown

    International Nuclear Information System (INIS)

    Berezovskaya, F.S.; Karev, G.P.; Kisliuk, O.F.; Khlebopros, R.G.; Tcelniker, Yu.L.

    1993-09-01

    In this paper we discuss three approaches to the modeling of tree crown development. These approaches are experimental (i.e. regression-based), theoretical (i.e. analytical) and simulation (i.e. computer) modeling. The common assumption is that a tree can be regarded as a fractal object, i.e. a collection of self-similar objects that combines the properties of two- and three-dimensional bodies. We show that a fractal measure of the crown can be used as the link between the mathematical models of crown growth and light propagation through the canopy. The computer approach makes it possible to visualize crown development and to calibrate the model on experimental data. In the paper the different stages of the above-mentioned approaches are described. The experimental data for spruce, the description of the computer system for modeling and a variant of the computer model are presented. (author). 9 refs, 4 figs

  9. Validation of Modeling Flow Approaching Navigation Locks

    Science.gov (United States)

    2013-08-01

    USACE, Pittsburgh District (LRP) requested that the US Army Engineer Research and Development Center, Coastal and Hydraulics … approaching the lock and dam. The second set of experiments considered a design, referred to as Plan B lock approach, which contained the weir field in … conditions and model parameters. A discharge of 1.35 cfs was set as the inflow boundary condition at the upstream end of the model. The outflow boundary was …

  10. A new approach to modeling aviation accidents

    Science.gov (United States)

    Rao, Arjun Harsha

    The proposed approach views aviation accidents as a set of hazardous states of a system (pilot and aircraft), and triggers that cause the system to move between hazardous states. I used the NTSB's accident coding manual (which contains nearly 4000 different codes) to develop a "dictionary" of hazardous states, triggers, and information codes. Then, I created the "grammar", or a set of rules, that: (1) orders the hazardous states in each accident; and (2) links the hazardous states using the appropriate triggers. This approach: (1) provides a more correct count of the causes for accidents in the NTSB database; and (2) checks for gaps or omissions in NTSB accident data, and fills in some of these gaps using logic-based rules. These rules also help identify and count causes for accidents that were not discernible from previous analyses of historical accident data. I apply the model to 6200 helicopter accidents that occurred in the US between 1982 and 2015. First, I identify the states and triggers that are most likely to be associated with fatal and non-fatal accidents. The results suggest that non-fatal accidents, which account for approximately 84% of the accidents, provide valuable opportunities to learn about the causes for accidents. Next, I investigate the causes of in-flight loss of control (LOC) using both a conventional approach and the state-based approach. The conventional analysis provides little insight into the causal mechanism for LOC. For instance, the top cause of LOC is "aircraft control/directional control not maintained", which does not provide any insight. In contrast, the state-based analysis showed that pilots' tendency to clip objects frequently triggered LOC (16.7% of LOC accidents); this finding was not directly discernible from conventional analyses. Finally, I investigate the causes for improper autorotations using both a conventional approach and the state-based approach. The conventional approach uses modifiers (e.g., "improper", "misjudged") associated with "24520

  11. Metabolic modeling of energy balances in Mycoplasma hyopneumoniae shows that pyruvate addition increases growth rate.

    Science.gov (United States)

    Kamminga, Tjerko; Slagman, Simen-Jan; Bijlsma, Jetta J E; Martins Dos Santos, Vitor A P; Suarez-Diez, Maria; Schaap, Peter J

    2017-10-01

    Mycoplasma hyopneumoniae is cultured on a large scale to produce antigen for inactivated whole-cell vaccines against respiratory disease in pigs. However, the fastidious nutrient requirements of this minimal bacterium and the low growth rate make it challenging to reach sufficient biomass yield for antigen production. In this study, we sequenced the genome of M. hyopneumoniae strain 11 and constructed a high-quality constraint-based genome-scale metabolic model of 284 chemical reactions and 298 metabolites. We validated the model with time-series data of duplicate fermentation cultures to aim for an integrated model describing the dynamic profiles measured in fermentations. The model predicted that 84% of cellular energy in a standard M. hyopneumoniae cultivation was used for non-growth associated maintenance and only 16% of cellular energy was used for growth and growth associated maintenance. Following a cycle of model-driven experimentation in dedicated fermentation experiments, we were able to increase the fraction of cellular energy used for growth through pyruvate addition to the medium. This increase in turn led to an increase in growth rate and a 2.3 times increase in the total biomass concentration reached after 3-4 days of fermentation, enhancing the productivity of the overall process. The model presented provides a solid basis to understand and further improve M. hyopneumoniae fermentation processes. Biotechnol. Bioeng. 2017;114: 2339-2347. © 2017 Wiley Periodicals, Inc.

  12. Simple solvable energy-landscape model that shows a thermodynamic phase transition and a glass transition.

    Science.gov (United States)

    Naumis, Gerardo G

    2012-06-01

    When a liquid melt is cooled, a glass or phase transition can be obtained depending on the cooling rate. Yet, this behavior has not been clearly captured in energy-landscape models. Here, a model is provided in which two key ingredients are considered in the landscape, metastable states and their multiplicity. Metastable states are considered as in two level system models. However, their multiplicity and topology allows a phase transition in the thermodynamic limit for slow cooling, while a transition to the glass is obtained for fast cooling. By solving the corresponding master equation, the minimal speed of cooling required to produce the glass is obtained as a function of the distribution of metastable states.
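
    The cooling-rate dependence described above can be illustrated with the simplest possible landscape, a single two-level system whose master equation is integrated while the temperature is ramped down. The parameter values and linear cooling schedules below are arbitrary toy choices, not the paper's model; they only show how slow cooling tracks equilibrium while fast cooling freezes the occupation in, glass-like.

      # Two-level-system master equation under linear cooling (illustrative values only).
      import numpy as np

      dE, nu = 1.0, 50.0                      # energy gap and attempt frequency (arbitrary units)

      def cool(rate, T0=2.0, Tend=0.05, dt=1e-3):
          p = 1.0 / (1.0 + np.exp(dE / T0))   # excited-state occupation, equilibrated at T0
          T = T0
          while T > Tend:
              k_up = nu * np.exp(-dE / T)     # thermally activated up-rate
              k_down = nu                     # down-rate
              p += dt * (k_up * (1.0 - p) - k_down * p)   # master equation, explicit Euler
              T -= rate * dt
          return p

      print("slow cooling:", cool(rate=0.05))   # relaxes toward the equilibrium value (~0)
      print("fast cooling:", cool(rate=500.0))  # occupation freezes near its initial value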

  13. Downscaling CMIP5 climate models shows increased tropical cyclone activity over the 21st century.

    Science.gov (United States)

    Emanuel, Kerry A

    2013-07-23

    A recently developed technique for simulating large [O(10^4)] numbers of tropical cyclones in climate states described by global gridded data is applied to simulations of historical and future climate states simulated by six Coupled Model Intercomparison Project 5 (CMIP5) global climate models. Tropical cyclones downscaled from the climate of the period 1950-2005 are compared with those of the 21st century in simulations that stipulate that the radiative forcing from greenhouse gases increases over preindustrial values. In contrast to storms that appear explicitly in most global models, the frequency of downscaled tropical cyclones increases during the 21st century in most locations. The intensity of such storms, as measured by their maximum wind speeds, also increases, in agreement with previous results. Increases in tropical cyclone activity are most prominent in the western North Pacific, but are evident in other regions except for the southwestern Pacific. The increased frequency of events is consistent with increases in a genesis potential index based on monthly mean global model output. These results are compared and contrasted with other inferences concerning the effect of global warming on tropical cyclones.

  14. Towards modeling future energy infrastructures - the ELECTRA system engineering approach

    DEFF Research Database (Denmark)

    Uslar, Mathias; Heussen, Kai

    2016-01-01

    … of the IEC 62559 use case template as well as the changes needed to cope particularly with the aspects of controller conflicts and Greenfield technology modeling. From the originally envisioned use of the standards, we show a possible transfer of how to properly deal with a Greenfield approach when modeling …

  15. ECOMOD - An ecological approach to radioecological modelling

    International Nuclear Information System (INIS)

    Sazykina, Tatiana G.

    2000-01-01

    A unified methodology is proposed to simulate the dynamic processes of radionuclide migration in aquatic food chains in parallel with their stable analogue elements. The distinguishing feature of the unified radioecological/ecological approach is the description of radionuclide migration along with dynamic equations for the ecosystem. The ability of the methodology to predict the results of radioecological experiments is demonstrated by an example of radionuclide (iron group) accumulation by a laboratory culture of the algae Platymonas viridis. Based on the unified methodology, the 'ECOMOD' radioecological model was developed to simulate dynamic radioecological processes in aquatic ecosystems. It comprises three basic modules, which are operated as a set of inter-related programs. The 'ECOSYSTEM' module solves non-linear ecological equations, describing the biomass dynamics of essential ecosystem components. The 'RADIONUCLIDE DISTRIBUTION' module calculates the radionuclide distribution in abiotic and biotic components of the aquatic ecosystem. The 'DOSE ASSESSMENT' module calculates doses to aquatic biota and doses to man from aquatic food chains. The application of the ECOMOD model to reconstruct the radionuclide distribution in the Chernobyl Cooling Pond ecosystem in the early period after the accident shows good agreement with observations

  16. Animal Models for Muscular Dystrophy Show Different Patterns of Sarcolemmal Disruption

    OpenAIRE

    Straub, Volker; Rafael, Jill A.; Chamberlain, Jeffrey S.; Campbell, Kevin P.

    1997-01-01

    Genetic defects in a number of components of the dystrophin–glycoprotein complex (DGC) lead to distinct forms of muscular dystrophy. However, little is known about how alterations in the DGC are manifested in the pathophysiology present in dystrophic muscle tissue. One hypothesis is that the DGC protects the sarcolemma from contraction-induced damage. Using tracer molecules, we compared sarcolemmal integrity in animal models for muscular dystrophy and in muscular dystrophy patient samples. Ev...

  17. The PROMETHEUS bundled payment experiment: slow start shows problems in implementing new payment models.

    Science.gov (United States)

    Hussey, Peter S; Ridgely, M Susan; Rosenthal, Meredith B

    2011-11-01

    Fee-for-service payment is blamed for many of the problems observed in the US health care system. One of the leading alternative payment models proposed in the Affordable Care Act of 2010 is bundled payment, which provides payment for all of the care a patient needs over the course of a defined clinical episode, instead of paying for each discrete service. We evaluated the initial "road test" of PROMETHEUS Payment, one of several bundled payment pilot projects. The project has faced substantial implementation challenges, and none of the three pilot sites had executed contracts or made bundled payments as of May 2011. The pilots have taken longer to set up than expected, primarily because of the complexity of the payment model and the fact that it builds on the existing fee-for-service payment system and other complexities of health care. Participants continue to see promise and value in the bundled payment model, but the pilot results suggest that the desired benefits of this and other payment reforms may take time and considerable effort to materialize.

  18. A Murine Model of Candida glabrata Vaginitis Shows No Evidence of an Inflammatory Immunopathogenic Response.

    Directory of Open Access Journals (Sweden)

    Evelyn E Nash

    Full Text Available Candida glabrata is the second most common organism isolated from women with vulvovaginal candidiasis (VVC), particularly in women with uncontrolled diabetes mellitus. However, mechanisms involved in the pathogenesis of C. glabrata-associated VVC are unknown and have not been studied at any depth in animal models. The objective of this study was to evaluate host responses to infection following efforts to optimize a murine model of C. glabrata VVC. For this, various designs were evaluated for consistent experimental vaginal colonization (i.e., type 1 and type 2 diabetic mice, exogenous estrogen, varying inocula, and co-infection with C. albicans). Upon model optimization, vaginal fungal burden and polymorphonuclear neutrophil (PMN) recruitment were assessed longitudinally over 21 days post-inoculation, together with vaginal concentrations of IL-1β, S100A8 alarmin, lactate dehydrogenase (LDH), and in vivo biofilm formation. Consistent and sustained vaginal colonization with C. glabrata was achieved in estrogenized streptozotocin-induced type 1 diabetic mice. Vaginal PMN infiltration was consistently low, with IL-1β, S100A8, and LDH concentrations similar to uninoculated mice. Biofilm formation was not detected in vivo, and co-infection with C. albicans did not induce synergistic immunopathogenic effects. This data suggests that experimental vaginal colonization of C. glabrata is not associated with an inflammatory immunopathogenic response or biofilm formation.

  19. Risk Modelling for Passages in Approach Channel

    Directory of Open Access Journals (Sweden)

    Leszek Smolarek

    2013-01-01

    Full Text Available Methods from multivariate statistics, stochastic processes, and simulation are used to identify and assess the risk measures. This paper presents the use of generalized linear models and Markov models to study risks to ships along the approach channel. These models, combined with simulation testing, are used to determine the time required for continuous monitoring of endangered objects or the period at which the level of risk should be verified.
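
    A minimal version of the Markov idea is a discrete-time chain over passage states whose transition matrix is applied repeatedly to obtain the risk after a given number of transits. The states and probabilities below are invented for illustration and do not come from the paper.

      # Toy discrete-time Markov chain over passage states (illustrative numbers only).
      import numpy as np

      states = ["normal", "near_miss", "accident"]
      P = np.array([
          [0.990, 0.009, 0.001],   # from normal
          [0.850, 0.140, 0.010],   # from near_miss
          [0.000, 0.000, 1.000],   # accident treated as absorbing
      ])

      p = np.array([1.0, 0.0, 0.0])   # a ship starts a transit in the normal state
      for _ in range(200):            # state distribution after 200 channel passages
          p = p @ P
      print(dict(zip(states, np.round(p, 4))))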

  20. Global thermal niche models of two European grasses show high invasion risks in Antarctica.

    Science.gov (United States)

    Pertierra, Luis R; Aragón, Pedro; Shaw, Justine D; Bergstrom, Dana M; Terauds, Aleks; Olalla-Tárraga, Miguel Ángel

    2017-07-01

    The two non-native grasses that have established long-term populations in Antarctica (Poa pratensis and Poa annua) were studied from a global multidimensional thermal niche perspective to address the biological invasion risk to Antarctica. These two species exhibit contrasting introduction histories and reproductive strategies and represent two referential case studies of biological invasion processes. We used a multistep process with a range of species distribution modelling techniques (ecological niche factor analysis, multidimensional envelopes, distance/entropy algorithms) together with a suite of thermoclimatic variables, to characterize the potential ranges of these species. Their native bioclimatic thermal envelopes in Eurasia, together with the different naturalized populations across continents, were compared next. The potential niche of P. pratensis was wider at the cold extremes; however, P. annua life history attributes enable it to be a more successful colonizer. We observe that particularly cold summers are a key aspect of the unique Antarctic environment. In consequence, ruderals such as P. annua can quickly expand under such harsh conditions, whereas the more stress-tolerant P. pratensis endures and persist through steady growth. Compiled data on human pressure at the Antarctic Peninsula allowed us to provide site-specific biosecurity risk indicators. We conclude that several areas across the region are vulnerable to invasions from these and other similar species. This can only be visualized in species distribution models (SDMs) when accounting for founder populations that reveal nonanalogous conditions. Results reinforce the need for strict management practices to minimize introductions. Furthermore, our novel set of temperature-based bioclimatic GIS layers for ice-free terrestrial Antarctica provide a mechanism for regional and global species distribution models to be built for other potentially invasive species. © 2017 John Wiley & Sons Ltd.
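
    The simplest member of the SDM family used in such studies, a rectilinear climate envelope, can be sketched by checking whether candidate sites fall inside the thermal limits observed across a species' occupied range. All temperatures and site names below are invented placeholders, not values from the study or its GIS layers.

      # Toy rectilinear (envelope) check of site suitability; all values are placeholders.
      occupied_summer_t = [1.8, 3.5, 6.2, 8.0, 9.5, 11.3]   # mean summer T at known occurrences, deg C
      t_min, t_max = min(occupied_summer_t), max(occupied_summer_t)

      candidate_sites = {"site_A": 1.2, "site_B": 2.4, "site_C": 12.0}   # hypothetical ice-free sites
      for name, t in candidate_sites.items():
          suitable = t_min <= t <= t_max
          print(f"{name}: summer T = {t:4.1f} C, within thermal envelope: {suitable}")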

  1. ASIC1a Deficient Mice Show Unaltered Neurodegeneration in the Subacute MPTP Model of Parkinson Disease.

    Science.gov (United States)

    Komnig, Daniel; Imgrund, Silke; Reich, Arno; Gründer, Stefan; Falkenburger, Björn H

    2016-01-01

    Inflammation contributes to the death of dopaminergic neurons in Parkinson disease and can be accompanied by acidification of extracellular pH, which may activate acid-sensing ion channels (ASIC). Accordingly, amiloride, a non-selective inhibitor of ASIC, was protective in an acute 1-methyl-4-phenyl-1,2,3,6-tetrahydropyridine (MPTP) mouse model of Parkinson disease. To complement these findings we determined MPTP toxicity in mice deficient for ASIC1a, the most common ASIC isoform in neurons. MPTP was applied i.p. in doses of 30 mg per kg on five consecutive days. We determined the number of dopaminergic neurons in the substantia nigra, assayed by stereological counting 14 days after the last MPTP injection, the number of Nissl positive neurons in the substantia nigra, and the concentration of catecholamines in the striatum. There was no difference between ASIC1a-deficient mice and wildtype controls. We are therefore not able to confirm that ASIC1a are involved in MPTP toxicity. The difference might relate to the subacute MPTP model we used, which more closely resembles the pathogenesis of Parkinson disease, or to further targets of amiloride.

  2. Progesterone treatment shows benefit in a pediatric model of moderate to severe bilateral brain injury.

    Directory of Open Access Journals (Sweden)

    Rastafa I Geddes

    Full Text Available Controlled cortical impact (CCI) models in adult and aged Sprague-Dawley (SD) rats have been used extensively to study medial prefrontal cortex (mPFC) injury and the effects of post-injury progesterone treatment, but the hormone's effects after traumatic brain injury (TBI) in juvenile animals have not been determined. In the present proof-of-concept study we investigated whether progesterone had neuroprotective effects in a pediatric model of moderate to severe bilateral brain injury. Twenty-eight-day-old (PND 28) male Sprague Dawley rats received sham (n = 24) or CCI (n = 47) injury and were given progesterone (4, 8, or 16 mg/kg per 100 g body weight) or vehicle injections on post-injury days (PID) 1-7, subjected to behavioral testing from PID 9-27, and analyzed for lesion size at PID 28. The 8 and 16 mg/kg doses of progesterone were observed to be most beneficial in reducing the effect of CCI on lesion size and behavior in PND 28 male SD rats. Our findings suggest that a midline CCI injury to the frontal cortex will reliably produce a moderate TBI comparable to what is seen in the adult male rat and that progesterone can ameliorate the injury-induced deficits.

  3. ASIC1a Deficient Mice Show Unaltered Neurodegeneration in the Subacute MPTP Model of Parkinson Disease.

    Directory of Open Access Journals (Sweden)

    Daniel Komnig

    Full Text Available Inflammation contributes to the death of dopaminergic neurons in Parkinson disease and can be accompanied by acidification of extracellular pH, which may activate acid-sensing ion channels (ASIC). Accordingly, amiloride, a non-selective inhibitor of ASIC, was protective in an acute 1-methyl-4-phenyl-1,2,3,6-tetrahydropyridine (MPTP) mouse model of Parkinson disease. To complement these findings we determined MPTP toxicity in mice deficient for ASIC1a, the most common ASIC isoform in neurons. MPTP was applied i.p. in doses of 30 mg per kg on five consecutive days. We determined the number of dopaminergic neurons in the substantia nigra, assayed by stereological counting 14 days after the last MPTP injection, the number of Nissl positive neurons in the substantia nigra, and the concentration of catecholamines in the striatum. There was no difference between ASIC1a-deficient mice and wildtype controls. We are therefore not able to confirm that ASIC1a are involved in MPTP toxicity. The difference might relate to the subacute MPTP model we used, which more closely resembles the pathogenesis of Parkinson disease, or to further targets of amiloride.

  4. Towards new approaches in phenological modelling

    Science.gov (United States)

    Chmielewski, Frank-M.; Götz, Klaus-P.; Rawel, Harshard M.; Homann, Thomas

    2014-05-01

    Modelling of phenological stages has been based on temperature sums for many decades, describing both the chilling and the forcing requirement of woody plants until the beginning of leafing or flowering. Parts of this approach go back to Reaumur (1735), who originally proposed the concept of growing degree-days. Now, there is a growing body of opinion that asks for new methods in phenological modelling and more in-depth studies on dormancy release of woody plants. This requirement is easily understandable if we consider the wide application of phenological models, which can even affect the results of climate models. To this day, a number of parameters in phenological models still need to be optimised on observations, although some basic physiological knowledge of the chilling and forcing requirement of plants is already considered in these approaches (semi-mechanistic models). What limits a fundamental improvement of these models is the lack of knowledge about the course of dormancy in woody plants, which cannot be directly observed and which is also insufficiently described in the literature. Modern metabolomic methods provide a solution for this problem and allow both the validation of currently used phenological models and the development of mechanistic approaches. In order to develop this kind of model, changes of metabolites (concentration, temporal course) must be set in relation to the variability of environmental (steering) parameters (weather, day length, etc.). This necessarily requires multi-year (3-5 yr.) and high-resolution (weekly probes between autumn and spring) data. The feasibility of this approach has already been tested in a 3-year pilot-study on sweet cherries. Our suggested methodology is not limited to the flowering of fruit trees; it can also be applied to tree species of the natural vegetation, where even greater deficits in phenological modelling exist.
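
    The temperature-sum (growing degree-day) concept that these models build on can be written down in a few lines: the daily mean temperature above a base value is accumulated until an assumed forcing requirement is met. The base temperature, forcing requirement, and weather values below are placeholders, not calibrated parameters.

      # Classical growing degree-day (thermal time) accumulation; all values are placeholders.
      def growing_degree_days(tmin, tmax, t_base=5.0):
          """Daily GDD from min/max temperature, floored at zero."""
          return max((tmin + tmax) / 2.0 - t_base, 0.0)

      daily = [(-1, 4), (2, 9), (4, 12), (6, 15), (7, 16), (9, 18)]  # (Tmin, Tmax) per day, deg C
      forcing_requirement = 20.0         # assumed GDD sum needed for budburst/flowering

      total = 0.0
      for day, (tmin, tmax) in enumerate(daily, start=1):
          total += growing_degree_days(tmin, tmax)
          if total >= forcing_requirement:
              print(f"stage predicted on day {day} (GDD sum = {total:.1f})")
              break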

  5. A zebrafish model of glucocorticoid resistance shows serotonergic modulation of the stress response

    Directory of Open Access Journals (Sweden)

    Brian eGriffiths

    2012-10-01

    Full Text Available One function of glucocorticoids is to restore homeostasis after an acute stress response by providing negative feedback to stress circuits in the brain. Loss of this negative feedback leads to elevated physiological stress and may contribute to depression, anxiety and post-traumatic stress disorder. We investigated the early, developmental effects of glucocorticoid signaling deficits on stress physiology and related behaviors using a mutant zebrafish, grs357, with non-functional glucocorticoid receptors. These mutants are morphologically inconspicuous and adult-viable. A previous study of adult grs357 mutants showed loss of glucocorticoid-mediated negative feedback and elevated physiological and behavioral stress markers. Already at five days post-fertilization, mutant larvae had elevated whole body cortisol, increased expression of pro-opiomelanocortin (POMC), the precursor of adrenocorticotropic hormone (ACTH), and failed to show normal suppression of stress markers after dexamethasone treatment. Mutant larvae had larger auditory-evoked startle responses compared to wildtype sibling controls (grwt), despite having lower spontaneous activity levels. Fluoxetine (Prozac) treatment in mutants decreased startle responding and increased spontaneous activity, making them behaviorally similar to wildtype. This result mirrors known effects of selective serotonin reuptake inhibitors (SSRIs) in modifying glucocorticoid signaling and alleviating stress disorders in human patients. Our results suggest that larval grs357 zebrafish can be used to study behavioral, physiological and molecular aspects of stress disorders. Most importantly, interactions between glucocorticoid and serotonin signaling appear to be highly conserved among vertebrates, suggesting deep homologies at the neural circuit level and opening up new avenues for research into psychiatric conditions.

  6. Metabolic remodeling agents show beneficial effects in the dystrophin-deficient mdx mouse model

    Directory of Open Access Journals (Sweden)

    Jahnke Vanessa E

    2012-08-01

    Full Text Available Background: Duchenne muscular dystrophy is a genetic disease involving severe muscle wasting that is characterized by cycles of muscle degeneration/regeneration and culminates in early death in affected boys. Mitochondria are presumed to be involved in the regulation of myoblast proliferation/differentiation; enhancing mitochondrial activity with exercise mimetics (AMPK and PPAR-delta agonists) increases muscle function and inhibits muscle wasting in healthy mice. We therefore asked whether metabolic remodeling agents that increase mitochondrial activity would improve muscle function in mdx mice. Methods: Twelve-week-old mdx mice were treated with two different metabolic remodeling agents (GW501516 and AICAR), separately or in combination, for 4 weeks. Extensive systematic behavioral, functional, histological, biochemical, and molecular tests were conducted to assess the drugs' effects. Results: We found a gain in body and muscle weight in all treated mice. Histologic examination showed a decrease in muscle inflammation and in the number of fibers with central nuclei and an increase in fibers with peripheral nuclei, with significantly fewer activated satellite cells and regenerating fibers. Together with an inhibition of FoXO1 signaling, these results indicated that the treatments reduced ongoing muscle damage. Conclusions: The three treatments produced significant improvements in disease phenotype, including an increase in overall behavioral activity and significant gains in forelimb and hind limb strength. Our findings suggest that triggering mitochondrial activity with exercise mimetics improves muscle function in dystrophin-deficient mdx mice.

  7. Male Wistar rats show individual differences in an animal model of conformity.

    Science.gov (United States)

    Jolles, Jolle W; de Visser, Leonie; van den Bos, Ruud

    2011-09-01

    Conformity refers to the act of changing one's behaviour to match that of others. Recent studies in humans have shown that individual differences exist in conformity and that these differences are related to differences in neuronal activity. To understand the neuronal mechanisms in more detail, animal tests to assess conformity are needed. Here, we used a test of conformity in rats that has previously been evaluated in female, but not male, rats and assessed the nature of individual differences in conformity. Male Wistar rats were given the opportunity to learn that two diets differed in palatability. They were subsequently exposed to a demonstrator that had consumed the less palatable food. Thereafter, they were exposed to the same diets again. Just like female rats, male rats decreased their preference for the more palatable food after interaction with demonstrator rats that had eaten the less palatable food. Individual differences existed for this shift, which were only weakly related to an interaction between their own initial preference and the amount consumed by the demonstrator rat. The data show that this conformity test in rats is a promising tool to study the neurobiology of conformity.

  8. SLS Navigation Model-Based Design Approach

    Science.gov (United States)

    Oliver, T. Emerson; Anzalone, Evan; Geohagan, Kevin; Bernard, Bill; Park, Thomas

    2018-01-01

    The SLS Program chose to implement a Model-based Design and Model-based Requirements approach for managing component design information and system requirements. This approach differs from previous large-scale design efforts at Marshall Space Flight Center where design documentation alone conveyed information required for vehicle design and analysis and where extensive requirements sets were used to scope and constrain the design. The SLS Navigation Team has been responsible for the Program-controlled Design Math Models (DMMs) which describe and represent the performance of the Inertial Navigation System (INS) and the Rate Gyro Assemblies (RGAs) used by Guidance, Navigation, and Controls (GN&C). The SLS Navigation Team is also responsible for the navigation algorithms. The navigation algorithms are delivered for implementation on the flight hardware as a DMM. For the SLS Block 1-B design, the additional GPS Receiver hardware is managed as a DMM at the vehicle design level. This paper provides a discussion of the processes and methods used to engineer, design, and coordinate engineering trades and performance assessments using SLS practices as applied to the GN&C system, with a particular focus on the Navigation components. These include composing system requirements, requirements verification, model development, model verification and validation, and modeling and analysis approaches. The Model-based Design and Requirements approach does not reduce the effort associated with the design process versus previous processes used at Marshall Space Flight Center. Instead, the approach takes advantage of overlap between the requirements development and management process, and the design and analysis process by efficiently combining the control (i.e. the requirement) and the design mechanisms. The design mechanism is the representation of the component behavior and performance in design and analysis tools. The focus in the early design process shifts from the development and

  9. Modeling serotonin uptake in the lung shows endothelial transporters dominate over cleft permeation

    Science.gov (United States)

    Bassingthwaighte, James B.

    2013-01-01

    A four-region (capillary plasma, endothelium, interstitial fluid, cell) multipath model was configured to describe the kinetics of blood-tissue exchange for small solutes in the lung, accounting for regional flow heterogeneity, permeation of cell membranes and through interendothelial clefts, and intracellular reactions. Serotonin uptake data from the Multiple indicator dilution “bolus sweep” experiments of Rickaby and coworkers (Rickaby DA, Linehan JH, Bronikowski TA, Dawson CA. J Appl Physiol 51: 405–414, 1981; Rickaby DA, Dawson CA, and Linehan JH. J Appl Physiol 56: 1170–1177, 1984) and Malcorps et al. (Malcorps CM, Dawson CA, Linehan JH, Bronikowski TA, Rickaby DA, Herman AG, Will JA. J Appl Physiol 57: 720–730, 1984) were analyzed to distinguish facilitated transport into the endothelial cells (EC) and the inhibition of tracer transport by nontracer serotonin in the bolus of injectate from the free uninhibited permeation through the clefts into the interstitial fluid space. The permeability-surface area products (PS) for serotonin via the inter-EC clefts were ∼0.3 ml·g−1·min−1, low compared with the transporter-mediated maximum PS of 13 ml·g−1·min−1 (with Km = ∼0.3 μM and Vmax = ∼4 nmol·g−1·min−1). The estimates of serotonin PS values for EC transporters from their multiple data sets were similar and were influenced only modestly by accounting for the cleft permeability in parallel. The cleft PS estimates in these Ringer-perfused lungs are less than half of those for anesthetized dogs (Yipintsoi T. Circ Res 39: 523–531, 1976) with normal hematocrits, but are compatible with passive noncarrier-mediated transport observed later in the same laboratory (Dawson CA, Linehan JH, Rickaby DA, Bronikowski TA. Ann Biomed Eng 15: 217–227, 1987; Peeters FAM, Bronikowski TA, Dawson CA, Linehan JH, Bult H, Herman AG. J Appl Physiol 66: 2328–2337, 1989) The identification and quantitation of the cleft pathway conductance from these
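
    The parameter estimates quoted above can be tied together with the Michaelis-Menten relation: expressing carrier-mediated uptake as an equivalent clearance PS = Vmax/(Km + C) gives roughly 13 ml·g−1·min−1 in the low-concentration limit, compared with the ~0.3 ml·g−1·min−1 cleft pathway. The serotonin concentrations in the sketch below are assumed example values, not data from the experiments.

      # Compare transporter-mediated and paracellular (cleft) conductances for serotonin.
      Km = 0.3        # uM (parameter estimate quoted above)
      Vmax = 4.0      # nmol g-1 min-1
      PS_cleft = 0.3  # ml g-1 min-1

      for C in (0.05, 0.3, 3.0):                # plasma serotonin concentrations, uM (assumed)
          PS_carrier = Vmax / (Km + C)          # (nmol g-1 min-1) / uM == ml g-1 min-1
          print(f"C = {C:4.2f} uM  carrier PS = {PS_carrier:5.2f}  cleft PS = {PS_cleft}")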

  10. Neural network approaches for noisy language modeling.

    Science.gov (United States)

    Li, Jun; Ouazzane, Karim; Kazemian, Hassan B; Afzal, Muhammad Sajid

    2013-11-01

    Text entry from people is not only grammatical and distinct, but also noisy. For example, a user's typing stream contains all the information about the user's interaction with a computer using a QWERTY keyboard, which may include the user's typing mistakes as well as specific vocabulary, typing habits, and typing performance. In particular, these features are obvious in disabled users' typing streams. This paper proposes a new concept called noisy language modeling by further developing information theory and applies neural networks to one of its specific applications: the typing stream. This paper experimentally uses a neural network approach to analyze disabled users' typing streams both in general and specific ways to identify their typing behaviors and, subsequently, to make typing predictions and typing corrections. In this paper, a focused time-delay neural network (FTDNN) language model, a time gap model, a prediction model based on time gap, and a probabilistic neural network model (PNN) are developed. A 38% first hitting rate (HR) and a 53% first-three HR in symbol prediction are obtained based on the analysis of a user's typing history through the FTDNN language modeling, while the modeling results using the time gap prediction model and the PNN model demonstrate that the correction rates lie predominantly between 65% and 90% with the current testing samples, and that 70% of all test scores are above the basic correction rates, respectively. The modeling process demonstrates that a neural network is a suitable and robust language modeling tool to analyze the noisy language stream. The research also paves the way for practical application development in areas such as informational analysis, text prediction, and error correction by providing a theoretical basis of neural network approaches for noisy language modeling.
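
    For readers unfamiliar with the focused time-delay idea, the sketch below shows one minimal interpretation: a fixed window of recent symbols is embedded, flattened, and mapped to a distribution over the next symbol. It is not the authors' implementation; the character-level stream, window size, and layer sizes are illustrative assumptions.

```python
# Minimal sketch (assumed character-level stream and illustrative sizes) of a
# focused time-delay style predictor for the next symbol in a typing stream.
import torch
import torch.nn as nn

class FocusedTDNN(nn.Module):
    def __init__(self, vocab_size, window=5, embed_dim=16, hidden=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.net = nn.Sequential(
            nn.Flatten(),                          # (batch, window * embed_dim)
            nn.Linear(window * embed_dim, hidden),
            nn.Tanh(),
            nn.Linear(hidden, vocab_size),         # logits over the next symbol
        )

    def forward(self, x):                          # x: (batch, window) symbol ids
        return self.net(self.embed(x))

stream = "teh quick brown fox teh quick"           # a noisy typing stream
vocab = sorted(set(stream))
idx = {ch: i for i, ch in enumerate(vocab)}
window = 5
pairs = [(stream[i:i + window], stream[i + window])
         for i in range(len(stream) - window)]
X = torch.tensor([[idx[c] for c in ctx] for ctx, _ in pairs])
y = torch.tensor([idx[nxt] for _, nxt in pairs])

model = FocusedTDNN(len(vocab), window=window)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for _ in range(200):                               # tiny full-batch training loop
    opt.zero_grad()
    loss = nn.functional.cross_entropy(model(X), y)
    loss.backward()
    opt.step()
print("first-hit accuracy:", (model(X).argmax(1) == y).float().mean().item())
```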

  11. Heat transfer modeling an inductive approach

    CERN Document Server

    Sidebotham, George

    2015-01-01

    This innovative text emphasizes a "less-is-more" approach to modeling complicated systems such as heat transfer by treating them first as "1-node lumped models" that yield simple closed-form solutions. The author develops numerical techniques for students to obtain more detail, but also trains them to use the techniques only when simpler approaches fail. Covering all essential methods offered in traditional texts, but with a different order, Professor Sidebotham stresses inductive thinking and problem solving as well as a constructive understanding of modern, computer-based practice. Readers learn to develop their own code in the context of the material, rather than just how to use packaged software, offering a deeper, intrinsic grasp behind models of heat transfer. Developed from over twenty-five years of lecture notes to teach students of mechanical and chemical engineering at The Cooper Union for the Advancement of Science and Art, the book is ideal for students and practitioners across engineering discipl...
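
    As a concrete illustration of the "1-node lumped" starting point, the sketch below (with assumed property values, not taken from the book) compares the closed-form cooling solution of a single thermal node with a simple explicit Euler check:

```python
# Minimal sketch of a 1-node lumped model: a body at uniform temperature T
# cooling to ambient T_inf. The energy balance m*c*dT/dt = -h*A*(T - T_inf)
# has the closed-form solution T(t) = T_inf + (T0 - T_inf)*exp(-t/tau).
import math

h, A = 25.0, 0.01          # W/m^2K convective coefficient, surface area (assumed)
m, c = 0.1, 900.0          # kg, J/kgK (e.g., a small aluminium part)
T0, T_inf = 150.0, 25.0    # deg C initial and ambient temperatures
tau = m * c / (h * A)      # time constant of the 1-node model, seconds

def T_lumped(t):
    return T_inf + (T0 - T_inf) * math.exp(-t / tau)

# Numerical detail only when the simple approach is not enough: explicit Euler check.
dt, T = 1.0, T0
for _ in range(int(5 * tau)):
    T += dt * (-h * A * (T - T_inf)) / (m * c)
print(f"tau = {tau:.0f} s, closed form T(5*tau) = {T_lumped(5 * tau):.1f} C, Euler = {T:.1f} C")
```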

  12. THE EFFECTIVENESS OF THE SHOW NOT TELL AND MIND MAP MODELS IN LEARNING TO WRITE EXPOSITION TEXTS BASED ON THE INTEREST OF GRADE X SMK STUDENTS

    Directory of Open Access Journals (Sweden)

    Wiwit Lili Sokhipah

    2015-03-01

    Full Text Available The aims of this study were (1) to determine the effectiveness of the show not tell model in teaching exposition text writing skills based on the interest of Grade X SMK students, (2) to determine the effectiveness of the mind map model in teaching exposition text writing skills based on the interest of Grade X SMK students, and (3) to determine the effectiveness of the interaction of show not tell and mind map in teaching exposition text writing skills based on the interest of Grade X SMK students. The study used a quasi-experimental design (pretest-posttest control group design). The design included two experimental groups: the show not tell model applied to teaching exposition text writing to students with high interest, and the mind map model applied to teaching exposition text writing to students with low interest. The results were that (1) the show not tell model is effective for teaching exposition text writing to students with high interest, (2) the mind map model is effective for teaching exposition text writing to students with low interest, and (3) the show not tell model is more effective for teaching exposition text writing to students with high interest, whereas the mind map model is effective for teaching exposition text writing to students with low interest.

  13. A multiscale modeling approach for biomolecular systems

    Energy Technology Data Exchange (ETDEWEB)

    Bowling, Alan, E-mail: bowling@uta.edu; Haghshenas-Jaryani, Mahdi, E-mail: mahdi.haghshenasjaryani@mavs.uta.edu [The University of Texas at Arlington, Department of Mechanical and Aerospace Engineering (United States)

    2015-04-15

    This paper presents a new multiscale molecular dynamics model for investigating the effects of external interactions, such as contact and impact, during stepping and docking of motor proteins and other biomolecular systems. The model retains the mass properties, ensuring that the result satisfies Newton’s second law. This idea is presented using a simple particle model to facilitate discussion of the rigid body model; however, the particle model does provide insights into particle dynamics at the nanoscale. The resulting three-dimensional model predicts a significant decrease in the effect of the random forces associated with Brownian motion. This conclusion runs contrary to the widely accepted notion that the motor protein’s movements are primarily the result of thermal effects. This work focuses on the mechanical aspects of protein locomotion; the effect of ATP hydrolysis is estimated as internal forces acting on the mechanical model. In addition, the proposed model can be numerically integrated in a reasonable amount of time. Herein, the differences between the motion predicted by the old and new modeling approaches are compared using a simplified model of myosin V.

  14. Quasirelativistic quark model in quasipotential approach

    CERN Document Server

    Matveev, V A; Savrin, V I; Sissakian, A N

    2002-01-01

    The interaction of relativistic particles is described within the framework of the quasipotential approach. The presentation is based on the so-called covariant simultaneous formulation of quantum field theory, whereby the theory is considered on a space-like three-dimensional hypersurface in Minkowski space. Special attention is paid to the methods of constructing various quasipotentials as well as to applications of the quasipotential approach to describing the characteristics of relativistic particle interactions in quark models, namely: elastic hadron scattering amplitudes, meson mass spectra and decay widths, and the cross sections of deep inelastic lepton scattering on hadrons

  15. A new approach for developing adjoint models

    Science.gov (United States)

    Farrell, P. E.; Funke, S. W.

    2011-12-01

    Many data assimilation algorithms rely on the availability of gradients of misfit functionals, which can be efficiently computed with adjoint models. However, the development of an adjoint model for a complex geophysical code is generally very difficult. Algorithmic differentiation (AD, also called automatic differentiation) offers one strategy for simplifying this task: it takes the abstraction that a model is a sequence of primitive instructions, each of which may be differentiated in turn. While extremely successful, this low-level abstraction runs into time-consuming difficulties when applied to the whole codebase of a model, such as differentiating through linear solves, model I/O, calls to external libraries, language features that are unsupported by the AD tool, and the use of multiple programming languages. While these difficulties can be overcome, it requires a large amount of technical expertise and an intimate familiarity with both the AD tool and the model. An alternative to applying the AD tool to the whole codebase is to assemble the discrete adjoint equations and use these to compute the necessary gradients. With this approach, the AD tool must be applied to the nonlinear assembly operators, which are typically small, self-contained units of the codebase. The disadvantage of this approach is that the assembly of the discrete adjoint equations is still very difficult to perform correctly, especially for complex multiphysics models that perform temporal integration; as it stands, this approach is as difficult and time-consuming as applying AD to the whole model. In this work, we have developed a library which greatly simplifies and automates the alternate approach of assembling the discrete adjoint equations. We propose a complementary, higher-level abstraction to that of AD: that a model is a sequence of linear solves. The developer annotates model source code with library calls that build a 'tape' of the operators involved and their dependencies, and
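
    The flavour of the higher-level abstraction can be illustrated on the smallest possible case, a single linear solve: if the forward model is A x = b and the misfit is J = g·x, the adjoint solve Aᵀλ = g yields the gradient dJ/db = λ. The sketch below is only a toy illustration of this pattern, not the library described in the record:

```python
# Minimal sketch (not the library in the record) of the "model as a sequence of
# linear solves" abstraction: record the forward solve, replay it transposed.
import numpy as np

A = np.array([[4.0, 1.0], [2.0, 3.0]])   # assembled linear operator
b = np.array([1.0, 2.0])                 # right-hand side / control variable
g = np.array([1.0, -1.0])                # derivative of the misfit w.r.t. x

x = np.linalg.solve(A, b)                # forward solve recorded on the "tape"
J = g @ x                                # misfit functional J(b)
lam = np.linalg.solve(A.T, g)            # adjoint solve replayed from the tape
grad_b = lam                             # gradient dJ/db

# Finite-difference check of the adjoint gradient.
eps = 1e-6
fd = np.array([(g @ np.linalg.solve(A, b + eps * e) - J) / eps for e in np.eye(2)])
print(grad_b, fd)                        # the two gradients should agree closely
```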

  16. Metamodelling Approach and Software Tools for Physical Modelling and Simulation

    Directory of Open Access Journals (Sweden)

    Vitaliy Mezhuyev

    2015-02-01

    Full Text Available In computer science, the metamodelling approach is becoming more and more popular for the purpose of software systems development. In this paper, we discuss the applicability of the metamodelling approach to the development of software tools for physical modelling and simulation. To define a metamodel for physical modelling, an analysis of physical models is carried out. The result of this analysis shows the invariant physical structures that we propose to use as the basic abstractions of the physical metamodel. It is a system of geometrical objects that allows building a spatial structure of physical models and setting a distribution of physical properties. To such a geometry of distributed physical properties, different mathematical methods can be applied. To validate the proposed metamodelling approach, we consider the developed prototypes of software tools.

  17. Dynamic Metabolic Model Building Based on the Ensemble Modeling Approach

    Energy Technology Data Exchange (ETDEWEB)

    Liao, James C. [Univ. of California, Los Angeles, CA (United States)

    2016-10-01

    Ensemble modeling of kinetic systems addresses the challenges of kinetic model construction, with respect to parameter value selection, and still allows for the rich insights possible from kinetic models. This project aimed to show that constructing, implementing, and analyzing such models is a useful tool for the metabolic engineering toolkit, and that they can result in actionable insights from models. Key concepts are developed and deliverable publications and results are presented.

  18. A Theoretical Approach to Norm Ecosystems: Two Adaptive Architectures of Indirect Reciprocity Show Different Paths to the Evolution of Cooperation

    Directory of Open Access Journals (Sweden)

    Satoshi Uchida

    2018-02-01

    Full Text Available Indirect reciprocity is one of the basic mechanisms to sustain mutual cooperation, by which beneficial acts are returned, not by the recipient, but by third parties. This mechanism relies on the ability of individuals to know the past actions of others, and to assess those actions. There are many different systems of assessing others, which can be interpreted as rudimentary social norms (i.e., views on what is “good” or “bad”). In this paper, the impacts of different adaptive architectures, i.e., ways for individuals to adapt to environments, on indirect reciprocity are investigated. We examine two representative architectures: one based on replicator dynamics and the other on a genetic algorithm. Unlike the replicator dynamics, the genetic algorithm requires describing the mixture of all possible norms in the norm space under consideration. Therefore, we also propose an analytic method to study norm ecosystems in which all possible second-order social norms potentially exist and compete. The analysis reveals that the different adaptive architectures show different paths to the evolution of cooperation. In particular, we find that so-called Stern-Judging, one of the best-studied norms in the literature, exhibits distinct behaviors in the two architectures. On the one hand, in the replicator dynamics, Stern-Judging remains alive and steadily holds a majority once the population reaches a cooperative state. On the other hand, in the genetic algorithm, it gets a majority only temporarily and becomes extinct in the end.
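
    A minimal sketch of the replicator-dynamics architecture (with an illustrative two-strategy payoff matrix, not the paper's space of second-order norms) shows the basic update in which strategy frequencies grow in proportion to their payoff advantage over the population average:

```python
# Minimal sketch of replicator dynamics: frequencies of competing strategies
# (here two, for brevity) grow with their payoff relative to the average.
import numpy as np

payoff = np.array([[3.0, 0.0],      # entry (i, j): payoff to strategy i against j
                   [5.0, 1.0]])     # illustrative values, not the paper's norms
x = np.array([0.9, 0.1])            # initial strategy frequencies
dt = 0.01

for _ in range(5000):
    f = payoff @ x                  # expected payoff of each strategy
    x = x + dt * x * (f - x @ f)    # replicator update
    x = x / x.sum()                 # guard against numerical drift
print(x)                            # long-run composition of the norm ecosystem
```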

  19. Evolutionary modeling-based approach for model errors correction

    Directory of Open Access Journals (Sweden)

    S. Q. Wan

    2012-08-01

    Full Text Available The inverse problem of using the information in historical data to estimate model errors is one of the frontier research topics in science. In this study, we investigate such a problem using the classic Lorenz (1963) equation as a prediction model and the Lorenz equation with a periodic evolutionary function as an accurate representation of reality to generate "observational data."

    On the basis of the intelligent features of evolutionary modeling (EM), including self-organization, self-adaptation and self-learning, the dynamic information contained in the historical data can be identified and extracted automatically by computer. Thereby, a new approach to estimating model errors based on EM is proposed in the present paper. Numerical tests demonstrate the ability of the new approach to correct model structural errors. In fact, it can realize the combination of statistics and dynamics to a certain extent.
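
    The basic experimental setup, the Lorenz (1963) system as the prediction model and the same system with a periodic term as the "truth" that generates observational data, can be sketched as follows (forcing amplitude, frequency, and observation noise are illustrative assumptions, not the paper's settings):

```python
# Minimal sketch of the twin-experiment setup: Lorenz (1963) as the model,
# Lorenz plus an assumed small periodic term as the "reality" producing data.
import numpy as np
from scipy.integrate import solve_ivp

sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0

def lorenz(t, s, forcing=0.0, omega=0.0):
    x, y, z = s
    return [sigma * (y - x) + forcing * np.sin(omega * t),   # periodic model error
            x * (rho - z) - y,
            x * y - beta * z]

t_eval = np.linspace(0.0, 20.0, 2001)
model = solve_ivp(lorenz, (0, 20), [1.0, 1.0, 1.0], t_eval=t_eval)
truth = solve_ivp(lorenz, (0, 20), [1.0, 1.0, 1.0], t_eval=t_eval,
                  args=(2.0, 0.5))            # "reality" with periodic forcing
obs = truth.y + np.random.default_rng(0).normal(0.0, 0.1, truth.y.shape)
model_error = obs - model.y                   # the signal an EM approach would learn
print(np.abs(model_error).mean())
```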

  20. MODELS OF TECHNOLOGY ADOPTION: AN INTEGRATIVE APPROACH

    Directory of Open Access Journals (Sweden)

    Andrei OGREZEANU

    2015-06-01

    Full Text Available The interdisciplinary study of information technology adoption has developed rapidly over the last 30 years. Various theoretical models have been developed and applied, such as the Technology Acceptance Model (TAM), Innovation Diffusion Theory (IDT), the Theory of Planned Behavior (TPB), etc. The result of these many years of research is thousands of contributions to the field, which, however, remain highly fragmented. This paper develops a theoretical model of technology adoption by integrating major theories in the field: primarily IDT, TAM, and TPB. To do so while avoiding a mess, an approach that goes back to basics in the development of independent variable types is proposed, emphasizing (1) the logic of classification and (2) the psychological mechanisms behind variable types. Once developed, these types are then populated with variables originating in empirical research. Conclusions are drawn on which types are underpopulated and present potential for future research. I end with a set of methodological recommendations for future applications of the model.

  1. Continuum modeling an approach through practical examples

    CERN Document Server

    Muntean, Adrian

    2015-01-01

    This book develops continuum modeling skills and approaches the topic from three sides: (1) derivation of global integral laws together with the associated local differential equations, (2) design of constitutive laws and (3) modeling boundary processes. The focus of this presentation lies on many practical examples covering aspects such as coupled flow, diffusion and reaction in porous media or microwave heating of a pizza, as well as traffic issues in bacterial colonies and energy harvesting from geothermal wells. The target audience comprises primarily graduate students in pure and applied mathematics as well as working practitioners in engineering who are faced with nonstandard rheological topics like those typically arising in the food industry.

  2. Regularization of quantum gravity in the matrix model approach

    International Nuclear Information System (INIS)

    Ueda, Haruhiko

    1991-02-01

    We study the divergence problem of the partition function in the matrix model approach to two-dimensional quantum gravity. We propose a new model V(φ) = (1/2)Tr φ² + (g₄/N)Tr φ⁴ + (g′/N⁴)(Tr φ⁴)² and show that in the sphere case it has no divergence problem and that the critical exponent is that of pure gravity. (author)

  3. Datamining approaches for modeling tumor control probability.

    Science.gov (United States)

    Naqa, Issam El; Deasy, Joseph O; Mu, Yi; Huang, Ellen; Hope, Andrew J; Lindsay, Patricia E; Apte, Aditya; Alaly, James; Bradley, Jeffrey D

    2010-11-01

    Tumor control probability (TCP) to radiotherapy is determined by complex interactions between tumor biology, tumor microenvironment, radiation dosimetry, and patient-related variables. The complexity of these heterogeneous variable interactions constitutes a challenge for building predictive models for routine clinical practice. We describe a datamining framework that can unravel the higher order relationships among dosimetric dose-volume prognostic variables, interrogate various radiobiological processes, and generalize to unseen data when applied prospectively. Several datamining approaches are discussed, including dose-volume metrics, equivalent uniform dose, a mechanistic Poisson model, and model building methods using statistical regression and machine learning techniques. Institutional datasets of non-small cell lung cancer (NSCLC) patients are used to demonstrate these methods. The performance of the different methods was evaluated using bivariate Spearman rank correlations (rs). Over-fitting was controlled via resampling methods. Using a dataset of 56 patients with primary NSCLC tumors and 23 candidate variables, we estimated GTV volume and V75 to be the best model parameters for predicting TCP using statistical resampling and a logistic model. Using these variables, the support vector machine (SVM) kernel method provided superior performance for TCP prediction with an rs=0.68 on leave-one-out testing compared to logistic regression (rs=0.4), Poisson-based TCP (rs=0.33), and the cell kill equivalent uniform dose model (rs=0.17). The prediction of treatment response can be improved by utilizing datamining approaches, which are able to unravel important non-linear complex interactions among model variables and have the capacity to predict on unseen data for prospective clinical applications.
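
    The evaluation pattern described above, leave-one-out predictions scored with a Spearman rank correlation for a logistic model and an SVM, can be sketched with scikit-learn on synthetic data (the two features stand in for GTV volume and V75; nothing here reproduces the institutional dataset):

```python
# Minimal sketch (synthetic data, not the NSCLC set) of leave-one-out evaluation
# of a logistic model and an SVM, scored with Spearman rank correlation.
import numpy as np
from scipy.stats import spearmanr
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = rng.normal(size=(56, 2))                 # stand-ins for GTV volume and V75
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.5, 56) > 0).astype(int)

loo = LeaveOneOut()
for name, clf in [("logistic", LogisticRegression()),
                  ("SVM (RBF)", SVC(probability=True))]:
    p = cross_val_predict(clf, X, y, cv=loo, method="predict_proba")[:, 1]
    rs, _ = spearmanr(p, y)
    print(f"{name}: leave-one-out Spearman rs = {rs:.2f}")
```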

  4. 'Leaner' approach shows its benefits.

    Science.gov (United States)

    Baillie, Jonathan

    2011-03-01

    Once viewed, especially by the architectural community, almost exclusively as temporary facilities built down to a cost, modular off-site built healthcare buildings have enjoyed increasing success in recent years, as perceptions of their quality, and recognition of their advantages over "traditional" on-site constructed buildings (especially in terms of speed of build, reduced on-site disruption, future adaptability, and lower environmental impact), have spread.

  5. Crime Modeling using Spatial Regression Approach

    Science.gov (United States)

    Saleh Ahmar, Ansari; Adiatma; Kasim Aidid, M.

    2018-01-01

    Acts of criminality in Indonesia increase in both variety and quantity every year, with cases such as murder, rape, assault, vandalism, theft, fraud, and fencing making people feel unsafe. The risk of society being exposed to crime is measured here by the number of cases reported to the police; the more cases reported to the police, the higher the level of crime in the region. In this research, criminality in South Sulawesi, Indonesia, is modeled with society's exposure to crime risk as the dependent variable. The modelling follows an areal approach using the Spatial Autoregressive (SAR) and Spatial Error Model (SEM) methods. The independent variables used are population density, the number of poor inhabitants, GDP per capita, unemployment, and the human development index (HDI). The spatial regression analysis shows that there is no spatial dependence, in either the lag or the error terms, in South Sulawesi.
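
    Before fitting SAR or SEM models, spatial dependence is usually screened with a statistic such as Moran's I; the sketch below computes it by hand for toy data (not the South Sulawesi data) with a row-standardised contiguity matrix:

```python
# Minimal sketch of the spatial-dependence diagnostic behind SAR/SEM modelling:
# Moran's I for a regional variable y with a row-standardised weight matrix W.
import numpy as np

W = np.array([[0, 1, 1, 0],              # 4 toy regions; 1 = neighbours
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
W = W / W.sum(axis=1, keepdims=True)     # row-standardise the weights
y = np.array([10.0, 12.0, 11.0, 30.0])   # e.g., crime risk per region

z = y - y.mean()
n, S0 = len(y), W.sum()
I = (n / S0) * (z @ W @ z) / (z @ z)
print(f"Moran's I = {I:.3f}")            # values near -1/(n-1) suggest no dependence
```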

  6. A Set Theoretical Approach to Maturity Models

    DEFF Research Database (Denmark)

    Lasrado, Lester; Vatrapu, Ravi; Andersen, Kim Normann

    2016-01-01

    Maturity Model research in IS has been criticized for the lack of theoretical grounding, methodological rigor, empirical validations, and ignorance of multiple and non-linear paths to maturity. To address these criticisms, this paper proposes a novel set-theoretical approach to maturity models characterized by equifinality, multiple conjunctural causation, and case diversity. We prescribe methodological guidelines consisting of a six-step procedure to systematically apply set theoretic methods to conceptualize, develop, and empirically derive maturity models, and provide a demonstration of its application on a social media maturity data-set. Specifically, we employ Necessary Condition Analysis (NCA) to identify maturity stage boundaries as necessary conditions and Qualitative Comparative Analysis (QCA) to arrive at multiple configurations that can be equally effective in progressing to higher maturity stages.

  7. Plot showing ATLAS limits on Standard Model Higgs production in the mass range 110-150 GeV

    CERN Multimedia

    ATLAS Collaboration

    2011-01-01

    The combined upper limit on the Standard Model Higgs boson production cross section divided by the Standard Model expectation as a function of mH is indicated by the solid line. This is a 95% CL limit using the CLs method in the low mass range. The dotted line shows the median expected limit in the absence of a signal and the green and yellow bands reflect the corresponding 68% and 95% expected

  8. Plot showing ATLAS limits on Standard Model Higgs production in the mass range 100-600 GeV

    CERN Multimedia

    ATLAS Collaboration

    2011-01-01

    The combined upper limit on the Standard Model Higgs boson production cross section divided by the Standard Model expectation as a function of mH is indicated by the solid line. This is a 95% CL limit using the CLs method in the entire mass range. The dotted line shows the median expected limit in the absence of a signal and the green and yellow bands reflect the corresponding 68% and 95% expected

  9. A Modeling Approach for Marine Observatory

    Directory of Open Access Journals (Sweden)

    Charbel Geryes Aoun

    2015-02-01

    Full Text Available The infrastructure of a Marine Observatory (MO) is an UnderWater Sensor Network (UW-SN) that performs collaborative monitoring tasks over a given area. This observation should take into consideration the environmental constraints, since it may require specific tools, materials and devices (cables, servers, etc.). The logical and physical components that are used in these observatories provide data exchanged between the various devices of the environment (Smart Sensor, Data Fusion). These components provide new functionalities or services due to the long running period of the network. In this paper, we present our approach to extending modeling languages to include new domain-specific concepts and constraints. Thus, we propose a meta-model that is used to generate a new design tool (ArchiMO). We illustrate our proposal with an example from the MO domain on object localization with several acoustic sensors. Additionally, we generate the corresponding simulation code for a standard network simulator using our self-developed domain-specific model compiler. Our approach helps to reduce the complexity and time of the design activity of a Marine Observatory. It provides a way to share the different viewpoints of the designers in the MO domain and to obtain simulation results to estimate the network capabilities.

  10. Wave Resource Characterization Using an Unstructured Grid Modeling Approach

    Directory of Open Access Journals (Sweden)

    Wei-Cheng Wu

    2018-03-01

    Full Text Available This paper presents a modeling study conducted on the central Oregon coast for wave resource characterization, using the unstructured grid Simulating WAve Nearshore (SWAN) model coupled with a nested grid WAVEWATCH III® (WWIII) model. The flexibility of models with various spatial resolutions and the effects of open boundary conditions simulated by a nested grid WWIII model with different physics packages were evaluated. The model results demonstrate the advantage of the unstructured grid-modeling approach for flexible model resolution and good model skills in simulating the six wave resource parameters recommended by the International Electrotechnical Commission in comparison to the observed data in Year 2009 at National Data Buoy Center Buoy 46050. Notably, spectral analysis indicates that the ST4 physics package improves upon the ST2 physics package’s ability to predict wave power density for large waves, which is important for wave resource assessment, load calculation of devices, and risk management. In addition, bivariate distributions show that the simulated sea state of maximum occurrence with the ST4 physics package matched the observed data better than with the ST2 physics package. This study demonstrated that the unstructured grid wave modeling approach, driven by regional nested grid WWIII outputs along with the ST4 physics package, can efficiently provide accurate wave hindcasts to support wave resource characterization. Our study also suggests that wind effects need to be considered if the dimension of the model domain is greater than approximately 100 km, or O(10²) km.
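
    One of the IEC wave resource parameters mentioned above, the omnidirectional wave power density, follows in deep water from the significant wave height Hm0 and energy period Te as J = ρ g² Hm0² Te / (64π). A minimal sketch (the sea-state values are illustrative, not results from the study):

```python
# Minimal sketch of the deep-water omnidirectional wave power density,
# J = rho * g**2 / (64 * pi) * Hm0**2 * Te, one of the IEC resource parameters.
import math

rho, g = 1025.0, 9.81            # seawater density (kg/m^3), gravity (m/s^2)

def wave_power_density(hm0, te):
    """kW per metre of wave crest for significant height hm0 (m) and energy period te (s)."""
    return rho * g**2 / (64.0 * math.pi) * hm0**2 * te / 1000.0

print(wave_power_density(2.5, 10.0))   # e.g., an assumed winter sea state, ~31 kW/m
```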

  11. A multiscale approach for modeling atherosclerosis progression.

    Science.gov (United States)

    Exarchos, Konstantinos P; Carpegianni, Clara; Rigas, Georgios; Exarchos, Themis P; Vozzi, Federico; Sakellarios, Antonis; Marraccini, Paolo; Naka, Katerina; Michalis, Lambros; Parodi, Oberdan; Fotiadis, Dimitrios I

    2015-03-01

    Progression of the atherosclerotic process constitutes a serious and quite common condition due to the accumulation of fatty materials in the arterial wall, consequently posing serious cardiovascular complications. In this paper, we assemble and analyze a multitude of heterogeneous data in order to model the progression of atherosclerosis (ATS) in coronary vessels. The patient's medical record, biochemical analytes, monocyte information, adhesion molecules, and therapy-related data comprise the input for the subsequent analysis. As an indicator of coronary lesion progression, two consecutive coronary computed tomography angiographies have been evaluated in the same patient. To this end, a set of 39 patients is studied using a twofold approach, namely, baseline analysis and temporal analysis. The former approach employs baseline information in order to predict the future state of the patient (in terms of progression of ATS). The latter is based on an approach encompassing dynamic Bayesian networks whereby snapshots of the patient's status over the follow-up are analyzed in order to model the evolution of ATS, taking into account the temporal dimension of the disease. The quantitative assessment of our work has resulted in 93.3% accuracy for the case of baseline analysis, and 83% overall accuracy for the temporal analysis, in terms of modeling and predicting the evolution of ATS. It should be noted that the application of the SMOTE algorithm for handling class imbalance and the subsequent evaluation procedure might have introduced an overestimation of the performance metrics, due to the employment of synthesized instances. The most prominent features found to play a substantial role in the progression of the disease are diabetes, cholesterol, and cholesterol/HDL. Among novel markers, the CD11b marker of the leukocyte integrin complex is associated with coronary plaque progression.
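
    The caveat about SMOTE noted in the abstract is commonly addressed by oversampling only inside the training folds, so that synthesized instances never reach the held-out fold. A minimal sketch of that evaluation pattern (synthetic data; assumes the imbalanced-learn package and is not the authors' pipeline):

```python
# Minimal sketch (synthetic data; assumes the imbalanced-learn package) of
# oversampling with SMOTE only inside the training folds of a cross-validation,
# so synthesized instances never leak into the held-out fold.
import numpy as np
from imblearn.over_sampling import SMOTE
from imblearn.pipeline import Pipeline
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(39, 6))                      # 39 patients, 6 candidate features
y = np.array([0] * 29 + [1] * 10)                 # imbalanced progression labels
X[y == 1] += 1.0                                  # give the minority class some signal

pipe = Pipeline([("smote", SMOTE(k_neighbors=3, random_state=0)),
                 ("clf", LogisticRegression(max_iter=1000))])
scores = cross_val_score(pipe, X, y, cv=5, scoring="accuracy")
print(scores.mean())                              # fold-wise accuracy without leakage
```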

  12. Bianchi VI0 and III models: self-similar approach

    International Nuclear Information System (INIS)

    Belinchon, Jose Antonio

    2009-01-01

    We study several cosmological models with Bianchi VI₀ and III symmetries under the self-similar approach. We find new solutions for the 'classical' perfect fluid model as well as for the vacuum model, although they are really restrictive for the equation of state. We also study a perfect fluid model with time-varying constants, G and Λ. As in other models studied, we find that the behaviours of G and Λ are related: if G behaves as a growing time function then Λ is a positive decreasing time function, but if G is decreasing then Λ₀ is negative. We end by studying a massive cosmic string model, putting special emphasis on calculating the numerical values of the equations of state. We show that there is no SS solution for a string model with time-varying constants.

  13. Comparative flood damage model assessment: towards a European approach

    Science.gov (United States)

    Jongman, B.; Kreibich, H.; Apel, H.; Barredo, J. I.; Bates, P. D.; Feyen, L.; Gericke, A.; Neal, J.; Aerts, J. C. J. H.; Ward, P. J.

    2012-12-01

    There is a wide variety of flood damage models in use internationally, differing substantially in their approaches and economic estimates. Since these models are being used more and more as a basis for investment and planning decisions on an increasingly large scale, there is a need to reduce the uncertainties involved and develop a harmonised European approach, in particular with respect to the EU Flood Risks Directive. In this paper we present a qualitative and quantitative assessment of seven flood damage models, using two case studies of past flood events in Germany and the United Kingdom. The qualitative analysis shows that modelling approaches vary strongly, and that current methodologies for estimating infrastructural damage are not as well developed as methodologies for the estimation of damage to buildings. The quantitative results show that the model outcomes are very sensitive to uncertainty in both vulnerability (i.e. depth-damage functions) and exposure (i.e. asset values), whereby the first has a larger effect than the latter. We conclude that care needs to be taken when using aggregated land use data for flood risk assessment, and that it is essential to adjust asset values to the regional economic situation and property characteristics. We call for the development of a flexible but consistent European framework that applies best practice from existing models while providing room for including necessary regional adjustments.
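
    The vulnerability component that the compared models share is a depth-damage function: the damaged fraction of an asset's value as a function of inundation depth, multiplied by a regionally adjusted asset value. A minimal sketch with an illustrative curve (the curve values are invented, not taken from any of the seven models):

```python
# Minimal sketch of the depth-damage (vulnerability) step of flood damage models:
# interpolate a damage fraction from the water depth and scale by the asset value.
import numpy as np

depths = np.array([0.0, 0.5, 1.0, 2.0, 4.0])         # metres of water above floor level
fractions = np.array([0.0, 0.15, 0.30, 0.55, 0.85])  # damaged share of asset value

def building_damage(depth_m, asset_value_eur):
    frac = np.interp(depth_m, depths, fractions)     # piecewise-linear damage curve
    return frac * asset_value_eur

print(building_damage(1.4, 250_000))                 # damage estimate for 1.4 m flooding
```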

  14. A Final Approach Trajectory Model for Current Operations

    Science.gov (United States)

    Gong, Chester; Sadovsky, Alexander

    2010-01-01

    Predicting accurate trajectories with limited intent information is a challenge faced by air traffic management decision support tools in operation today. One such tool is the FAA's Terminal Proximity Alert system which is intended to assist controllers in maintaining safe separation of arrival aircraft during final approach. In an effort to improve the performance of such tools, two final approach trajectory models are proposed; one based on polynomial interpolation, the other on the Fourier transform. These models were tested against actual traffic data and used to study effects of the key final approach trajectory modeling parameters of wind, aircraft type, and weight class, on trajectory prediction accuracy. Using only the limited intent data available to today's ATM system, both the polynomial interpolation and Fourier transform models showed improved trajectory prediction accuracy over a baseline dead reckoning model. Analysis of actual arrival traffic showed that this improved trajectory prediction accuracy leads to improved inter-arrival separation prediction accuracy for longer look ahead times. The difference in mean inter-arrival separation prediction error between the Fourier transform and dead reckoning models was 0.2 nmi for a look ahead time of 120 sec, a 33 percent improvement, with a corresponding 32 percent improvement in standard deviation.

  15. Modeling in transport phenomena a conceptual approach

    CERN Document Server

    Tosun, Ismail

    2007-01-01

    Modeling in Transport Phenomena, Second Edition presents and clearly explains with example problems the basic concepts and their applications to fluid flow, heat transfer, mass transfer, chemical reaction engineering and thermodynamics. A balanced approach between analysis and synthesis is presented, so that students will understand how to use the solution in engineering analysis. Systematic derivations of the equations and the physical significance of each term are given in detail, for students to easily understand and follow up the material. There is a strong incentive in science and engineering to

  16. Model approach brings multi-level success.

    Science.gov (United States)

    Howell, Mark

    2012-08-01

    In an article that first appeared in the US magazine Medical Construction & Design, Mark Howell, senior vice-president of Skanska USA Building, based in Seattle, describes the design and construction of a new nine-storey, 350,000 ft² extension to the Good Samaritan Hospital in Puyallup, Washington state. He explains how the use of an Integrated Project Delivery (IPD) approach by the key players, and extensive use of building information modelling (BIM), combined to deliver a healthcare facility that he believes should meet the needs of patients, families, and the clinical care team 'well into the future'.

  17. Pedagogic process modeling: Humanistic-integrative approach

    Directory of Open Access Journals (Sweden)

    Boritko Nikolaj M.

    2007-01-01

    Full Text Available The paper deals with some current problems of modeling the dynamics of the development of the subject-features of the individual. The term "process" is considered in the context of the humanistic-integrative approach, in which the principles of self-education are regarded as criteria for efficient pedagogic activity. Four basic characteristics of the pedagogic process are pointed out: intentionality reflects the logicality and regularity of the development of the process; discreteness (stageability) indicates the qualitative stages through which the pedagogic phenomenon passes; nonlinearity explains the crisis character of pedagogic processes and reveals inner factors of self-development; situationality requires a selection of pedagogic conditions in accordance with the inner factors, which would enable steering the pedagogic process. Two steps for singling out a particular stage and the algorithm for developing an integrative model for it are offered. The suggested conclusions might be of use for further theoretical research, analyses of educational practices and for realistic prediction of pedagogical phenomena.

  18. Joint Modeling of Multiple Crimes: A Bayesian Spatial Approach

    Directory of Open Access Journals (Sweden)

    Hongqiang Liu

    2017-01-01

    Full Text Available A multivariate Bayesian spatial modeling approach was used to jointly model the counts of two types of crime, i.e., burglary and non-motor vehicle theft, and explore the geographic pattern of crime risks and relevant risk factors. In contrast to the univariate model, which assumes independence across outcomes, the multivariate approach takes into account potential correlations between crimes. Six independent variables are included in the model as potential risk factors. In order to fully present this method, both the multivariate model and its univariate counterpart are examined. We fitted the two models to the data and assessed them using the deviance information criterion. A comparison of the results from the two models indicates that the multivariate model was superior to the univariate model. Our results show that population density and bar density are clearly associated with both burglary and non-motor vehicle theft risks and indicate a close relationship between these two types of crime. The posterior means and 2.5% percentile of type-specific crime risks estimated by the multivariate model were mapped to uncover the geographic patterns. The implications, limitations and future work of the study are discussed in the concluding section.

  19. Earthquake response analysis of RC bridges using simplified modeling approaches

    Science.gov (United States)

    Lee, Do Hyung; Kim, Dookie; Park, Taehyo

    2009-07-01

    In this paper, simplified modeling approaches describing the hysteretic behavior of reinforced concrete bridge piers are proposed. For this purpose, flexure-axial and shear-axial interaction models are developed and implemented into a nonlinear finite element analysis program. Comparative verifications for reinforced concrete columns prove that the analytical predictions obtained with the new formulations show good correlation with experimental results under various levels of axial forces and section types. In addition, analytical correlation studies for the inelastic earthquake response of reinforced concrete bridge structures are also carried out using the simplified modeling approaches. Relatively good agreement is observed in the results between the current modeling approach and the elaborated fiber models. It is thus encouraging that the present developments and approaches are capable of identifying the contribution of deformation mechanisms correctly. Subsequently, the present developments can be used as a simple yet effective tool for the deformation capacity evaluation of reinforced concrete columns in general and reinforced concrete bridge piers in particular.

  20. A Two Step Face Alignment Approach Using Statistical Models

    Directory of Open Access Journals (Sweden)

    Ying Cui

    2012-10-01

    Full Text Available Although face alignment using the Active Appearance Model (AAM) is relatively stable, it is known to be sensitive to initial values and not robust under inconstant circumstances. In order to strengthen the performance of AAM for face alignment, a two-step approach combining AAM and the Active Shape Model (ASM) is proposed. In the first step, AAM is used to locate the inner landmarks of the face. In the second step, the extended ASM is used to locate the outer landmarks of the face under the constraint of the inner landmarks estimated by AAM. The two kinds of landmarks are then combined to form the whole set of facial landmarks. The proposed approach is compared with the basic AAM and the progressive AAM methods. Experimental results show that the proposed approach gives a much more effective performance.

  1. Fibroblast motility on substrates with different rigidities: modeling approach

    Science.gov (United States)

    Gracheva, Maria; Dokukina, Irina

    2009-03-01

    We develop a discrete model for cell locomotion on substrates with different rigidities and simulate the experiments described in Lo, Wang, Dembo, Wang (2000) ``Cell movement is guided by the rigidity of the substrate'', Biophys. J. 79: 144-152. In these experiments, fibroblasts were plated on a substrate with a step rigidity and showed a preference for locomotion over the stiffer side of the substrate when approaching the boundary between the soft and the stiff sides. The model reproduces the experimentally observed behavior of fibroblasts. In particular, we are able to show with our model how cell characteristics (such as cell length, shape, area and speed) change during cell crawling through the ``soft-stiff'' substrate boundary. Also, our model suggests a temporary increase of both cell speed and area at the very moment when the cell leaves the soft side of the substrate.

  2. Modeling Electronic Circular Dichroism within the Polarizable Embedding Approach

    DEFF Research Database (Denmark)

    Nørby, Morten S; Olsen, Jógvan Magnus Haugaard; Steinmann, Casper

    2017-01-01

    We present a systematic investigation of the key components needed to model single chromophore electronic circular dichroism (ECD) within the polarizable embedding (PE) approach. By relying on accurate forms of the embedding potential, where especially the inclusion of local field effects...... are in focus, we show that qualitative agreement between rotatory strength parameters calculated by full quantum mechanical calculations and the more efficient embedding calculations can be obtained. An important aspect in the computation of reliable absorption parameters is the need for conformational...

  3. Approaches and models of intercultural education

    Directory of Open Access Journals (Sweden)

    Iván Manuel Sánchez Fontalvo

    2013-10-01

    Full Text Available Given the need to build an intercultural society, awareness must be assumed in all social spheres, and education plays a transcendental role in this, since it must promote educational spaces that form people with the virtues and capacities that allow them to live together in multicultural and socially diverse (sometimes unequal) contexts in an increasingly globalized and interconnected world, and foster the development of feelings of shared civic belonging to the neighborhood, city, region and country, giving them concern and critical judgement towards marginalization, poverty, misery and the inequitable distribution of wealth, the causes of structural violence, while at the same time wanting to work for the welfare and transformation of these scenarios. From these premises, it is important to know the approaches and models of intercultural education that have been developed so far, analysing their impact on the educational contexts where they are applied.

  4. Skeletal Muscle Differentiation on a Chip Shows Human Donor Mesoangioblasts' Efficiency in Restoring Dystrophin in a Duchenne Muscular Dystrophy Model.

    Science.gov (United States)

    Serena, Elena; Zatti, Susi; Zoso, Alice; Lo Verso, Francesca; Tedesco, F Saverio; Cossu, Giulio; Elvassore, Nicola

    2016-12-01

    : Restoration of the protein dystrophin on muscle membrane is the goal of many research lines aimed at curing Duchenne muscular dystrophy (DMD). Results of ongoing preclinical and clinical trials suggest that partial restoration of dystrophin might be sufficient to significantly reduce muscle damage. Different myogenic progenitors are candidates for cell therapy of muscular dystrophies, but only satellite cells and pericytes have already entered clinical experimentation. This study aimed to provide in vitro quantitative evidence of the ability of mesoangioblasts to restore dystrophin, in terms of protein accumulation and distribution, within myotubes derived from DMD patients, using a microengineered model. We designed an ad hoc experimental strategy to miniaturize on a chip the standard process of muscle regeneration independent of variables such as inflammation and fibrosis. It is based on the coculture, at different ratios, of human dystrophin-positive myogenic progenitors and dystrophin-negative myoblasts in a substrate with muscle-like physiological stiffness and cell micropatterns. Results showed that both healthy myoblasts and mesoangioblasts restored dystrophin expression in DMD myotubes. However, mesoangioblasts showed unexpected efficiency with respect to myoblasts in dystrophin production in terms of the amount of protein produced (40% vs. 15%) and length of the dystrophin membrane domain (210-240 µm vs. 40-70 µm). These results show that our microscaled in vitro model of human DMD skeletal muscle validated previous in vivo preclinical work and may be used to predict efficacy of new methods aimed at enhancing dystrophin accumulation and distribution before they are tested in vivo, reducing time, costs, and variability of clinical experimentation. This study aimed to provide in vitro quantitative evidence of the ability of human mesoangioblasts to restore dystrophin, in terms of protein accumulation and distribution, within myotubes derived from

  5. Skeletal Muscle Differentiation on a Chip Shows Human Donor Mesoangioblasts’ Efficiency in Restoring Dystrophin in a Duchenne Muscular Dystrophy Model

    Science.gov (United States)

    Serena, Elena; Zatti, Susi; Zoso, Alice; Lo Verso, Francesca; Tedesco, F. Saverio; Cossu, Giulio

    2016-01-01

    Restoration of the protein dystrophin on muscle membrane is the goal of many research lines aimed at curing Duchenne muscular dystrophy (DMD). Results of ongoing preclinical and clinical trials suggest that partial restoration of dystrophin might be sufficient to significantly reduce muscle damage. Different myogenic progenitors are candidates for cell therapy of muscular dystrophies, but only satellite cells and pericytes have already entered clinical experimentation. This study aimed to provide in vitro quantitative evidence of the ability of mesoangioblasts to restore dystrophin, in terms of protein accumulation and distribution, within myotubes derived from DMD patients, using a microengineered model. We designed an ad hoc experimental strategy to miniaturize on a chip the standard process of muscle regeneration independent of variables such as inflammation and fibrosis. It is based on the coculture, at different ratios, of human dystrophin-positive myogenic progenitors and dystrophin-negative myoblasts in a substrate with muscle-like physiological stiffness and cell micropatterns. Results showed that both healthy myoblasts and mesoangioblasts restored dystrophin expression in DMD myotubes. However, mesoangioblasts showed unexpected efficiency with respect to myoblasts in dystrophin production in terms of the amount of protein produced (40% vs. 15%) and length of the dystrophin membrane domain (210–240 µm vs. 40–70 µm). These results show that our microscaled in vitro model of human DMD skeletal muscle validated previous in vivo preclinical work and may be used to predict efficacy of new methods aimed at enhancing dystrophin accumulation and distribution before they are tested in vivo, reducing time, costs, and variability of clinical experimentation. Significance This study aimed to provide in vitro quantitative evidence of the ability of human mesoangioblasts to restore dystrophin, in terms of protein accumulation and distribution, within myotubes

  6. Systems Approaches to Modeling Chronic Mucosal Inflammation

    Science.gov (United States)

    Gao, Boning; Choudhary, Sanjeev; Wood, Thomas G.; Carmical, Joseph R.; Boldogh, Istvan; Mitra, Sankar; Minna, John D.; Brasier, Allan R.

    2013-01-01

    The respiratory mucosa is a major coordinator of the inflammatory response in chronic airway diseases, including asthma and chronic obstructive pulmonary disease (COPD). Signals produced by the chronic inflammatory process induce epithelial mesenchymal transition (EMT) that dramatically alters the epithelial cell phenotype. While the effects of EMT on epigenetic reprogramming and the activation of transcriptional networks are known, its effects on the innate inflammatory response are underexplored. We used a multiplex gene expression profiling platform to investigate the perturbations of the innate pathways induced by TGFβ in a primary airway epithelial cell model of EMT. EMT had dramatic effects on the induction of the innate pathway and the coupling interval of the canonical and noncanonical NF-κB pathways. Simulation experiments demonstrate that rapid, coordinated cap-independent translation of TRAF-1 and NF-κB2 is required to reduce the noncanonical pathway coupling interval. Experiments using amantadine confirmed the prediction that TRAF-1 and NF-κB2/p100 production is mediated by an IRES-dependent mechanism. These data indicate that the epigenetic changes produced by EMT induce dynamic state changes of the innate signaling pathway. Further applications of systems approaches will provide understanding of this complex phenotype through deterministic modeling and multidimensional (genomic and proteomic) profiling. PMID:24228254

  7. Setting conservation management thresholds using a novel participatory modeling approach.

    Science.gov (United States)

    Addison, P F E; de Bie, K; Rumpff, L

    2015-10-01

    We devised a participatory modeling approach for setting management thresholds that show when management intervention is required to address undesirable ecosystem changes. This approach was designed to be used when management thresholds: must be set for environmental indicators in the face of multiple competing objectives; need to incorporate scientific understanding and value judgments; and will be set by participants with limited modeling experience. We applied our approach to a case study where management thresholds were set for a mat-forming brown alga, Hormosira banksii, in a protected area management context. Participants, including management staff and scientists, were involved in a workshop to test the approach, and set management thresholds to address the threat of trampling by visitors to an intertidal rocky reef. The approach involved trading off the environmental objective, to maintain the condition of intertidal reef communities, with social and economic objectives to ensure management intervention was cost-effective. Ecological scenarios, developed using scenario planning, were a key feature that provided the foundation for where to set management thresholds. The scenarios developed represented declines in percent cover of H. banksii that may occur under increased threatening processes. Participants defined 4 discrete management alternatives to address the threat of trampling and estimated the effect of these alternatives on the objectives under each ecological scenario. A weighted additive model was used to aggregate participants' consequence estimates. Model outputs (decision scores) clearly expressed uncertainty, which can be considered by decision makers and used to inform where to set management thresholds. This approach encourages a proactive form of conservation, where management thresholds and associated actions are defined a priori for ecological indicators, rather than reacting to unexpected ecosystem changes in the future. © 2015 The
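
    The weighted additive aggregation used to combine participants' consequence estimates can be illustrated with a small sketch (the objectives, weights, and scores below are invented for illustration, not taken from the case study):

```python
# Minimal sketch of a weighted additive model: each management alternative gets
# a decision score as the weighted sum of its normalised consequence estimates.
objectives = ["reef condition", "visitor access", "management cost"]
weights = [0.5, 0.3, 0.2]                    # elicited importance weights, sum to 1

# Consequence estimates normalised to 0-1 (1 = best) per alternative and objective.
alternatives = {
    "do nothing":        [0.2, 1.0, 1.0],
    "education signage": [0.5, 0.9, 0.8],
    "close access":      [0.9, 0.1, 0.6],
}

for name, scores in alternatives.items():
    decision_score = sum(w * s for w, s in zip(weights, scores))
    print(f"{name:18s} {decision_score:.2f}")
```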

  8. Data-driven approach to dynamic visual attention modelling

    Science.gov (United States)

    Culibrk, Dubravko; Sladojevic, Srdjan; Riche, Nicolas; Mancas, Matei; Crnojevic, Vladimir

    2012-06-01

    Visual attention deployment mechanisms allow the Human Visual System to cope with an overwhelming amount of visual data by dedicating most of the processing power to objects of interest. The ability to automatically detect areas of the visual scene that will be attended to by humans is of interest for a large number of applications, from video coding, video quality assessment to scene understanding. Due to this fact, visual saliency (bottom-up attention) models have generated significant scientific interest in recent years. Most recent work in this area deals with dynamic models of attention that deal with moving stimuli (videos) instead of traditionally used still images. Visual saliency models are usually evaluated against ground-truth eye-tracking data collected from human subjects. However, there are precious few recently published approaches that try to learn saliency from eyetracking data and, to the best of our knowledge, no approaches that try to do so when dynamic saliency is concerned. The paper attempts to fill this gap and describes an approach to data-driven dynamic saliency model learning. A framework is proposed that enables the use of eye-tracking data to train an arbitrary machine learning algorithm, using arbitrary features derived from the scene. We evaluate the methodology using features from a state-of-the art dynamic saliency model and show how simple machine learning algorithms can be trained to distinguish between visually salient and non-salient parts of the scene.

  9. Jackiw-Pi model: A superfield approach

    Science.gov (United States)

    Gupta, Saurabh

    2014-12-01

    We derive the off-shell nilpotent and absolutely anticommuting Becchi-Rouet-Stora-Tyutin (BRST) as well as anti-BRST transformations s_(a)b corresponding to the Yang-Mills gauge transformations of the 3D Jackiw-Pi model by exploiting the "augmented" superfield formalism. We also show that the Curci-Ferrari restriction, which is a hallmark of any non-Abelian 1-form gauge theory, emerges naturally within this formalism and plays an instrumental role in providing the proof of the absolute anticommutativity of s_(a)b.

  10. Modeling energy fluxes in heterogeneous landscapes employing a mosaic approach

    Science.gov (United States)

    Klein, Christian; Thieme, Christoph; Priesack, Eckart

    2015-04-01

    Recent studies show that uncertainties in regional and global climate and weather simulations are partly due to inadequate descriptions of the energy flux exchanges between the land surface and the atmosphere. One major shortcoming is the limitation of the grid-cell resolution, which is recommended to be about at least 3x3 km² in most models due to limitations in the model physics. To represent each individual grid cell most models select one dominant soil type and one dominant land use type. This resolution, however, is often too coarse in regions where the spatial diversity of soil and land use types are high, e.g. in Central Europe. An elegant method to avoid the shortcoming of grid cell resolution is the so called mosaic approach. This approach is part of the recently developed ecosystem model framework Expert-N 5.0. The aim of this study was to analyze the impact of the characteristics of two managed fields, planted with winter wheat and potato, on the near surface soil moistures and on the near surface energy flux exchanges of the soil-plant-atmosphere interface. The simulated energy fluxes were compared with eddy flux tower measurements between the respective fields at the research farm Scheyern, North-West of Munich, Germany. To perform these simulations, we coupled the ecosystem model Expert-N 5.0 to an analytical footprint model. The coupled model system has the ability to calculate the mixing ratio of the surface energy fluxes at a given point within one grid cell (in this case at the flux tower between the two fields). This approach accounts for the differences of the two soil types, of land use managements, and of canopy properties due to footprint size dynamics. Our preliminary simulation results show that a mosaic approach can improve modeling and analyzing energy fluxes when the land surface is heterogeneous. In this case our applied method is a promising approach to extend weather and climate models on the regional and on the global scale.
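
    The mosaic idea itself reduces to weighting the per-tile fluxes by the share of the grid cell (or of the flux-tower footprint) that each tile occupies. A minimal sketch with assumed weights and fluxes, not the Expert-N 5.0 implementation:

```python
# Minimal sketch of the mosaic approach: the grid-cell (or footprint) flux is the
# weighted mixture of fluxes simulated separately for each land-use tile.
tiles = {
    # tile name: (footprint weight seen by the flux tower, latent heat flux in W/m^2)
    "winter wheat": (0.6, 180.0),   # illustrative values
    "potato":       (0.4, 120.0),
}

weighted_flux = sum(w * f for w, f in tiles.values())
total_weight = sum(w for w, _ in tiles.values())
print(weighted_flux / total_weight)   # flux mixture comparable to the tower measurement
```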

  11. Plectasin shows intracellular activity against Staphylococcus aureus in human THP-1 monocytes and in a mouse peritonitis model

    DEFF Research Database (Denmark)

    Brinch, Karoline Sidelmann; Sandberg, Anne; Baudoux, Pierre

    2009-01-01

    Antimicrobial therapy of infections with Staphylococcus aureus can pose a challenge due to slow response to therapy and recurrence of infection. These treatment difficulties can partly be explained by intracellular survival of staphylococci, which is why the intracellular activity...... was maintained (maximal relative efficacy [E(max)], 1.0- to 1.3-log reduction in CFU) even though efficacy was inferior to that of extracellular killing (E(max), >4.5-log CFU reduction). Animal studies included a novel use of the mouse peritonitis model, exploiting extra- and intracellular differentiation assays...... concentration. These findings stress the importance of performing studies of extra- and intracellular activity since these features cannot be predicted from traditional MIC and killing kinetic studies. Application of both the THP-1 and the mouse peritonitis models showed that the in vitro results were similar...

  12. A simplified GIS approach to modeling global leaf water isoscapes.

    Directory of Open Access Journals (Sweden)

    Jason B West

    Full Text Available The stable hydrogen (delta(2H and oxygen (delta(18O isotope ratios of organic and inorganic materials record biological and physical processes through the effects of substrate isotopic composition and fractionations that occur as reactions proceed. At large scales, these processes can exhibit spatial predictability because of the effects of coherent climatic patterns over the Earth's surface. Attempts to model spatial variation in the stable isotope ratios of water have been made for decades. Leaf water has a particular importance for some applications, including plant organic materials that record spatial and temporal climate variability and that may be a source of food for migrating animals. It is also an important source of the variability in the isotopic composition of atmospheric gases. Although efforts to model global-scale leaf water isotope ratio spatial variation have been made (especially of delta(18O, significant uncertainty remains in models and their execution across spatial domains. We introduce here a Geographic Information System (GIS approach to the generation of global, spatially-explicit isotope landscapes (= isoscapes of "climate normal" leaf water isotope ratios. We evaluate the approach and the resulting products by comparison with simulation model outputs and point measurements, where obtainable, over the Earth's surface. The isoscapes were generated using biophysical models of isotope fractionation and spatially continuous precipitation isotope and climate layers as input model drivers. Leaf water delta(18O isoscapes produced here generally agreed with latitudinal averages from GCM/biophysical model products, as well as mean values from point measurements. These results show global-scale spatial coherence in leaf water isotope ratios, similar to that observed for precipitation and validate the GIS approach to modeling leaf water isotopes. These results demonstrate that relatively simple models of leaf water enrichment

  13. Energy and Development. A Modelling Approach

    International Nuclear Information System (INIS)

    Van Ruijven, B.J.

    2008-01-01

    Rapid economic growth of developing countries like India and China implies that these countries become important actors in the global energy system. Examples of this impact are the present-day oil shortages and rapidly increasing emissions of greenhouse gases. Global energy models are used to explore possible future developments of the global energy system and identify policies to prevent potential problems. Such estimations of future energy use in developing countries are very uncertain. Crucial factors in the future energy use of these regions are electrification, urbanisation and income distribution, issues that are generally not included in present-day global energy models. Model simulations in this thesis show that current insights into developments in low-income regions lead to a wide range of expected energy use in 2030 for the residential and transport sectors. This is mainly caused by the many different model calibration options that result from the limited data availability for model development and calibration. We developed a method to identify the impact of model calibration uncertainty on future projections. We developed a new model for residential energy use in India, in collaboration with the Indian Institute of Science. Experiments with this model show that the impact of electrification and income distribution is less univocal than often assumed. The use of fuelwood, with related health risks, can decrease rapidly if the income of poor groups increases. However, there is a trade-off in terms of CO2 emissions because these groups gain access to electricity and the ownership of appliances increases. Another issue is the potential role of new technologies in developing countries: will they use the opportunities of leapfrogging? We explored the potential role of hydrogen, an energy carrier that might play a central role in a sustainable energy system. We found that hydrogen only plays a role before 2050 under very optimistic assumptions. Regional energy

  14. Energy and Development. A Modelling Approach

    Energy Technology Data Exchange (ETDEWEB)

    Van Ruijven, B.J.

    2008-12-17

    Rapid economic growth of developing countries like India and China implies that these countries become important actors in the global energy system. Examples of this impact are the present-day oil shortages and rapidly increasing emissions of greenhouse gases. Global energy models are used to explore possible future developments of the global energy system and identify policies to prevent potential problems. Such estimations of future energy use in developing countries are very uncertain. Crucial factors in the future energy use of these regions are electrification, urbanisation and income distribution, issues that are generally not included in present-day global energy models. Model simulations in this thesis show that current insights into developments in low-income regions lead to a wide range of expected energy use in 2030 for the residential and transport sectors. This is mainly caused by the many different model calibration options that result from the limited data availability for model development and calibration. We developed a method to identify the impact of model calibration uncertainty on future projections. We developed a new model for residential energy use in India, in collaboration with the Indian Institute of Science. Experiments with this model show that the impact of electrification and income distribution is less univocal than often assumed. The use of fuelwood, with related health risks, can decrease rapidly if the income of poor groups increases. However, there is a trade-off in terms of CO2 emissions because these groups gain access to electricity and the ownership of appliances increases. Another issue is the potential role of new technologies in developing countries: will they use the opportunities of leapfrogging? We explored the potential role of hydrogen, an energy carrier that might play a central role in a sustainable energy system. We found that hydrogen only plays a role before 2050 under very optimistic assumptions. Regional energy

  15. Climate Modelling Shows Increased Risk to Eucalyptus sideroxylon on the Eastern Coast of Australia Compared to Eucalyptus albens.

    Science.gov (United States)

    Shabani, Farzin; Kumar, Lalit; Ahmadi, Mohsen

    2017-11-24

    Aim: To identify the extent and direction of range shift of Eucalyptus sideroxylon and E. albens in Australia by 2050 through an ensemble forecast of four species distribution models (SDMs). Each was generated using four global climate models (GCMs), under two representative concentration pathways (RCPs). Location: Australia. Methods: We used four SDMs of (i) generalized linear model, (ii) MaxEnt, (iii) random forest, and (iv) boosted regression tree to construct SDMs for species E. sideroxylon and E. albens under four GCMs including (a) MRI-CGCM3, (b) MIROC5, (c) HadGEM2-AO and (d) CCSM4, under two RCPs of 4.5 and 6.0. Here, the true skill statistic (TSS) index was used to assess the accuracy of each SDM. Results: Results showed that E. albens and E. sideroxylon will lose large areas of their current suitable range by 2050 and E. sideroxylon is projected to gain in eastern and southeastern Australia. Some areas were also projected to remain suitable for each species between now and 2050. Our modelling showed that E. sideroxylon will lose suitable habitat on the western side and will not gain any on the eastern side because this region is one of the most heavily populated areas in the country, and the populated areas are moving westward. The predicted decrease in E. sideroxylon's distribution suggests that land managers should monitor its population closely, and evaluate whether it meets criteria for a protected legal status. Main conclusions: Both Eucalyptus sideroxylon and E. albens will be negatively affected by climate change and it is projected that E. sideroxylon will be at greater risk of losing habitat than E. albens.
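
    The true skill statistic used to score each SDM is simply sensitivity plus specificity minus one, computed from a presence/absence confusion matrix. The sketch below shows that calculation with made-up counts; it is not the study's data.

```python
# Sketch of the true skill statistic (TSS): TSS = sensitivity + specificity - 1,
# computed from a presence/absence confusion matrix (example counts are made up).

def true_skill_statistic(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)   # correctly predicted presences
    specificity = tn / (tn + fp)   # correctly predicted absences
    return sensitivity + specificity - 1.0

print(true_skill_statistic(tp=85, fn=15, tn=70, fp=30))  # -> 0.55
```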

  16. Climate Modelling Shows Increased Risk to Eucalyptus sideroxylon on the Eastern Coast of Australia Compared to Eucalyptus albens

    Directory of Open Access Journals (Sweden)

    Farzin Shabani

    2017-11-01

    Full Text Available Aim: To identify the extent and direction of range shift of Eucalyptus sideroxylon and E. albens in Australia by 2050 through an ensemble forecast of four species distribution models (SDMs). Each was generated using four global climate models (GCMs), under two representative concentration pathways (RCPs). Location: Australia. Methods: We used four SDMs of (i) generalized linear model, (ii) MaxEnt, (iii) random forest, and (iv) boosted regression tree to construct SDMs for species E. sideroxylon and E. albens under four GCMs including (a) MRI-CGCM3, (b) MIROC5, (c) HadGEM2-AO and (d) CCSM4, under two RCPs of 4.5 and 6.0. Here, the true skill statistic (TSS) index was used to assess the accuracy of each SDM. Results: Results showed that E. albens and E. sideroxylon will lose large areas of their current suitable range by 2050 and E. sideroxylon is projected to gain in eastern and southeastern Australia. Some areas were also projected to remain suitable for each species between now and 2050. Our modelling showed that E. sideroxylon will lose suitable habitat on the western side and will not gain any on the eastern side because this region is one of the most heavily populated areas in the country, and the populated areas are moving westward. The predicted decrease in E. sideroxylon's distribution suggests that land managers should monitor its population closely, and evaluate whether it meets criteria for a protected legal status. Main conclusions: Both Eucalyptus sideroxylon and E. albens will be negatively affected by climate change and it is projected that E. sideroxylon will be at greater risk of losing habitat than E. albens.

  17. Coadministration of doxorubicin and etoposide loaded in camel milk phospholipids liposomes showed increased antitumor activity in a murine model

    Directory of Open Access Journals (Sweden)

    Maswadeh HM

    2015-04-01

    Full Text Available Hamzah M Maswadeh,1 Ahmed N Aljarbou,1 Mohammed S Alorainy,2 Arshad H Rahmani,3 Masood A Khan3 1Department of Pharmaceutics, College of Pharmacy, 2Department of Pharmacology and Therapeutics, College of Medicine, 3College of Applied Medical Sciences, Qassim University, Buraydah, Kingdom of Saudi Arabia Abstract: Small unilamellar vesicles from a camel milk phospholipids (CML) mixture or from 1,2-dipalmitoyl-sn-glycero-3-phosphatidylcholine (DPPC) were prepared, and the anticancer drugs doxorubicin (Dox) or etoposide (ETP) were loaded. Liposomal formulations were used against fibrosarcoma in a murine model. Results showed a very high percentage of Dox encapsulation (~98%) in liposomes (Lip) prepared from CML-Lip or DPPC-Lip, whereas the percentage of encapsulation of ETP was on the lower side, 22% for CML-Lip and 18% for DPPC-Lip. Differential scanning calorimetry curves show that Dox enhances the lamellar formation in CML-Lip, whereas ETP enhances the nonlamellar formation. Differential scanning calorimetry curves also showed that the presence of Dox and ETP together in DPPC-Lip produced the interdigitation effect. The in vivo anticancer activity of liposomal formulations of Dox or ETP or a combination of both was assessed against benzopyrene (BAP)-induced fibrosarcoma in a murine model. Tumor-bearing mice treated with a combination of Dox and ETP loaded into CML-Lip showed increased survival and reduced tumor growth compared to other groups, including the combination of Dox and ETP in DPPC-Lip. Fibrosarcoma-bearing mice treated with a combination of free (Dox + ETP) showed much higher tumor growth compared to those groups treated with CML-Lip-(Dox + ETP) or DPPC-Lip-(Dox + ETP). Immunohistochemical study was also performed to show the expression of tumor-suppressor PTEN, and it was found that the tumor tissues from the group of mice treated with a combination of free (Dox + ETP) showed greater loss of cytoplasmic PTEN than tumor tissues obtained from the

  18. Connecting with The Biggest Loser: an extended model of parasocial interaction and identification in health-related reality TV shows.

    Science.gov (United States)

    Tian, Yan; Yoo, Jina H

    2015-01-01

    This study investigates audience responses to health-related reality TV shows in the setting of The Biggest Loser. It conceptualizes a model for audience members' parasocial interaction and identification with cast members and explores antecedents and outcomes of parasocial interaction and identification. Data analysis suggests the following direct relationships: (1) audience members' exposure to the show is positively associated with parasocial interaction, which in turn is positively associated with identification, (2) parasocial interaction is positively associated with exercise self-efficacy, whereas identification is negatively associated with exercise self-efficacy, and (3) exercise self-efficacy is positively associated with exercise behavior. Indirect effects of parasocial interaction and identification on exercise self-efficacy and exercise behavior are also significant. We discuss the theoretical and practical implications of these findings.

  19. Social Network Analyses and Nutritional Behavior: An Integrated Modeling Approach

    Directory of Open Access Journals (Sweden)

    Alistair McNair Senior

    2016-01-01

    Full Text Available Animals have evolved complex foraging strategies to obtain a nutritionally balanced diet and associated fitness benefits. Recent advances in nutrition research, combining state-space models of nutritional geometry with agent-based models of systems biology, show how nutrient targeted foraging behavior can also influence animal social interactions, ultimately affecting collective dynamics and group structures. Here we demonstrate how social network analyses can be integrated into such a modeling framework and provide a tangible and practical analytical tool to compare experimental results with theory. We illustrate our approach by examining the case of nutritionally mediated dominance hierarchies. First we show how nutritionally explicit agent-based models that simulate the emergence of dominance hierarchies can be used to generate social networks. Importantly the structural properties of our simulated networks bear similarities to dominance networks of real animals (where conflicts are not always directly related to nutrition). Finally, we demonstrate how metrics from social network analyses can be used to predict the fitness of agents in these simulated competitive environments. Our results highlight the potential importance of nutritional mechanisms in shaping dominance interactions in a wide range of social and ecological contexts. Nutrition likely influences social interaction in many species, and yet a theoretical framework for exploring these effects is currently lacking. Combining social network analyses with computational models from nutritional ecology may bridge this divide, representing a pragmatic approach for generating theoretical predictions for nutritional experiments.

  20. CP-809,101, a selective 5-HT2C agonist, shows activity in animal models of antipsychotic activity.

    Science.gov (United States)

    Siuciak, Judith A; Chapin, Douglas S; McCarthy, Sheryl A; Guanowsky, Victor; Brown, Janice; Chiang, Phoebe; Marala, Ravi; Patterson, Terrell; Seymour, Patricia A; Swick, Andrew; Iredale, Philip A

    2007-02-01

    CP-809,101 is a potent, functionally selective 5-HT(2C) agonist that displays approximately 100% efficacy in vitro. The aim of the present studies was to assess the efficacy of a selective 5-HT(2C) agonist in animal models predictive of antipsychotic-like efficacy and side-effect liability. Similar to currently available antipsychotic drugs, CP-809,101 dose-dependently inhibited conditioned avoidance responding (CAR, ED(50)=4.8 mg/kg, sc). The efficacy of CP-809,101 in CAR was completely antagonized by the concurrent administration of the 5-HT(2C) receptor antagonist, SB-224,282. CP-809,101 antagonized both PCP- and d-amphetamine-induced hyperactivity with ED(50) values of 2.4 and 2.9 mg/kg (sc), respectively and also reversed an apomorphine-induced deficit in prepulse inhibition. At doses up to 56 mg/kg, CP-809,101 did not produce catalepsy. Thus, the present results demonstrate that the 5-HT(2C) agonist, CP-809,101, has a pharmacological profile similar to that of the atypical antipsychotics with low extrapyramidal symptom liability. CP-809,101 was inactive in two animal models of antidepressant-like activity, the forced swim test and learned helplessness. However, CP-809,101 was active in novel object recognition, an animal model of cognitive function. These data suggest that 5-HT(2C) agonists may be a novel approach in the treatment of psychosis as well as for the improvement of cognitive dysfunction associated with schizophrenia.

  1. ISM Approach to Model Offshore Outsourcing Risks

    Directory of Open Access Journals (Sweden)

    Sunand Kumar

    2014-07-01

    Full Text Available In an effort to achieve a competitive advantage via cost reductions and improved market responsiveness, organizations are increasingly employing offshore outsourcing as a major component of their supply chain strategies. But, as evident from the literature, a number of risks such as political risk, risk due to cultural differences, compliance and regulatory risk, opportunistic risk and organizational structural risk adversely affect the performance of offshore outsourcing in a supply chain network. This also leads to dissatisfaction among different stakeholders. The main objective of this paper is to identify and understand the mutual interaction among the various risks which affect the performance of offshore outsourcing. To this effect, the authors have identified various risks through a review of the extant literature. From this information, an integrated model using interpretive structural modelling (ISM) for risks affecting offshore outsourcing is developed and the structural relationships between these risks are modeled. Further, MICMAC analysis is done to analyze the driving power and dependence of the risks, which shall be helpful to managers to identify and classify important criteria and to reveal the direct and indirect effects of each criterion on offshore outsourcing. Results show that political risk and risk due to cultural differences act as strong drivers.
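
    The ISM/MICMAC mechanics referred to above can be illustrated with a small numerical sketch: build a reachability matrix by transitive closure of a binary influence matrix, then read each risk's driving power (row sum) and dependence (column sum). The 5x5 matrix below is a hypothetical example, not the study's data.

```python
# Sketch of ISM reachability and MICMAC driving power / dependence analysis
# on a hypothetical binary influence matrix among five offshore-outsourcing risks.
import numpy as np

risks = ["political", "cultural", "compliance", "opportunistic", "structural"]
# A[i, j] = 1 if risk i influences risk j (illustrative values only).
A = np.array([[1, 1, 1, 0, 1],
              [1, 1, 0, 1, 1],
              [0, 0, 1, 1, 0],
              [0, 0, 0, 1, 1],
              [0, 0, 0, 0, 1]])

# Transitive closure (boolean Warshall algorithm) gives the reachability matrix.
R = A.astype(bool)
n = len(risks)
for k in range(n):
    R = R | (R[:, [k]] & R[[k], :])

driving = R.sum(axis=1)     # how many risks each risk can reach (driver strength)
dependence = R.sum(axis=0)  # how many risks reach it

for name, d, p in zip(risks, driving, dependence):
    print(f"{name:13s} driving power={d}  dependence={p}")
```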

  2. A fuzzy approach for modelling radionuclide in lake system

    International Nuclear Information System (INIS)

    Desai, H.K.; Christian, R.A.; Banerjee, J.; Patra, A.K.

    2013-01-01

    Radioactive liquid waste is generated during operation and maintenance of Pressurised Heavy Water Reactors (PHWRs). Generally, low level liquid waste is diluted and then discharged into the nearby water body through the blowdown water discharge line as per the standard waste management practice. The effluents from nuclear installations are treated adequately and then released in a controlled manner under strict compliance with discharge criteria. An attempt was made to predict the concentration of ³H released from Kakrapar Atomic Power Station at Ratania Regulator, about 2.5 km away from the discharge point, where human exposure is expected. Scarcity of data and the complex geometry of the lake prompted the use of a heuristic approach. Under this condition, a fuzzy rule based approach was adopted to develop a model which could predict the ³H concentration at Ratania Regulator. Three hundred data points were generated for developing the fuzzy rules, in which the input parameters were water flow from the lake and the ³H concentration at the discharge point. The output was the ³H concentration at Ratania Regulator. These data points were generated by multiple regression analysis of the original data. Using the same methodology, a further hundred data points were generated for the validation of the model and compared against the output predicted by the fuzzy rule based approach. The root mean square error of the model came out to be 1.95, which shows that the fuzzy model imitates the natural ecosystem well. -- Highlights: • Uncommon approach (Fuzzy Rule Base) of modelling radionuclide dispersion in a lake. • Predicts ³H released from Kakrapar Atomic Power Station at a point of human exposure. • RMSE of the fuzzy model is 1.95, which means it has imitated the natural ecosystem well.
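
    As an illustration of what a fuzzy rule based predictor of this general kind looks like, the sketch below uses two inputs (lake flow and tritium concentration at the discharge point), two triangular-membership rules, a weighted-average defuzzification, and a root-mean-square-error check. The membership breakpoints, rules, units and validation points are purely invented and are not the study's rule base.

```python
# Minimal zero-order Sugeno-style sketch of a fuzzy rule-based predictor
# (hypothetical rules and data; not the model developed in the study).

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def predict(flow, conc_discharge):
    # Rule strengths (AND = min) for two illustrative rules.
    r_low  = min(tri(flow, 50, 100, 150), tri(conc_discharge, 0, 5, 10))
    r_high = min(tri(flow, 0, 50, 100),  tri(conc_discharge, 5, 15, 25))
    # Rule consequents: constant downstream concentrations (made-up values).
    weights, outputs = [r_low, r_high], [1.0, 6.0]
    s = sum(weights)
    return sum(w * o for w, o in zip(weights, outputs)) / s if s else 0.0

# Root-mean-square error against a few hypothetical validation points
# (flow, concentration at discharge, observed downstream concentration).
obs = [(120, 4.0, 1.2), (60, 12.0, 5.1), (90, 8.0, 3.0)]
rmse = (sum((predict(f, c) - y) ** 2 for f, c, y in obs) / len(obs)) ** 0.5
print(f"RMSE on validation points: {rmse:.2f}")
```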

  3. Spatial Heterodyne Observations of Water (SHOW) vapour in the upper troposphere and lower stratosphere from a high altitude aircraft: Modelling and sensitivity analysis

    Science.gov (United States)

    Langille, J. A.; Letros, D.; Zawada, D.; Bourassa, A.; Degenstein, D.; Solheim, B.

    2018-04-01

    A spatial heterodyne spectrometer (SHS) has been developed to measure the vertical distribution of water vapour in the upper troposphere and the lower stratosphere with a high vertical resolution (∼500 m). The Spatial Heterodyne Observations of Water (SHOW) instrument combines an imaging system with a monolithic field-widened SHS to observe limb scattered sunlight in a vibrational band of water (1363 nm-1366 nm). The instrument has been optimized for observations from NASA's ER-2 aircraft as a proof-of-concept for a future low earth orbit satellite deployment. A robust model has been developed to simulate SHOW ER-2 limb measurements and retrievals. This paper presents the simulation of the SHOW ER-2 limb measurements along a hypothetical flight track and examines the sensitivity of the measurement and retrieval approach. Water vapour fields from an Environment and Climate Change Canada forecast model are used to represent realistic spatial variability along the flight path. High spectral resolution limb scattered radiances are simulated using the SASKTRAN radiative transfer model. It is shown that the SHOW instrument onboard the ER-2 is capable of resolving the water vapour variability in the UTLS from approximately 12 km - 18 km with ±1 ppm accuracy. Vertical resolutions between 500 m and 1 km are feasible. The along track sampling capability of the instrument is also discussed.

  4. Risk communication: a mental models approach

    National Research Council Canada - National Science Library

    Morgan, M. Granger (Millett Granger)

    2002-01-01

    ... information about risks. The procedure uses approaches from risk and decision analysis to identify the most relevant information; it also uses approaches from psychology and communication theory to ensure that its message is understood. This book is written in nontechnical terms, designed to make the approach feasible for anyone willing to try it. It is illustrat...

  5. Restless legs syndrome model Drosophila melanogaster show successful olfactory learning and 1-day retention of the acquired memory

    Directory of Open Access Journals (Sweden)

    Mika F. Asaba

    2013-09-01

    Full Text Available Restless Legs Syndrome (RLS) is a prevalent but poorly understood disorder that is characterized by uncontrollable movements during sleep, resulting in sleep disturbance. Olfactory memory in Drosophila melanogaster has proven to be a useful tool for the study of cognitive deficits caused by sleep disturbances, such as those seen in RLS. A recently generated Drosophila model of RLS exhibited disturbed sleep patterns similar to those seen in humans with RLS. This research seeks to improve understanding of the relationship between cognitive functioning and sleep disturbances in a new model for RLS. Here, we tested learning and memory in wild type and dBTBD9 mutant flies by Pavlovian olfactory conditioning, during which a shock was paired with one of two odors. Flies were then placed in a T-maze with one odor on either side, and successful associative learning was recorded when the flies chose the side with the unpaired odor. We hypothesized that due to disrupted sleep patterns, dBTBD9 mutant flies would be unable to learn the shock-odor association. However, the current study reports that the recently generated Drosophila model of RLS shows successful olfactory learning, despite disturbed sleep patterns, with learning performance levels matching or better than wild type flies.

  6. A reservoir simulation approach for modeling of naturally fractured reservoirs

    Directory of Open Access Journals (Sweden)

    H. Mohammadi

    2012-12-01

    Full Text Available In this investigation, the Warren and Root model proposed for the simulation of naturally fractured reservoirs was improved. A reservoir simulation approach was used to develop a 2D model of a synthetic oil reservoir. Main rock properties of each gridblock were defined for two different types of gridblocks called matrix and fracture gridblocks. These two gridblock types differed in porosity and permeability values, which were higher for fracture gridblocks than for matrix gridblocks. This model was solved using the implicit finite difference method. Results showed an improvement in the Warren and Root model, especially in region 2 of the semilog plot of pressure drop versus time, which indicated a linear transition zone with no inflection point as predicted by other investigators. Effects of fracture spacing, fracture permeability, fracture porosity, matrix permeability and matrix porosity on the behavior of a typical naturally fractured reservoir were also presented.

  7. A screening-level modeling approach to estimate nitrogen ...

    Science.gov (United States)

    This paper presents a screening-level modeling approach that can be used to rapidly estimate nutrient loading and assess the risk of exceeding numerical nutrient standards in surface waters, leading to potential classification as impaired for designated use. It can also be used to explore best management practice (BMP) implementation to reduce loading. The modeling framework uses a hybrid statistical and process-based approach to estimate the sources of pollutants and their transport and decay in the terrestrial and aquatic parts of watersheds. The framework is developed in the ArcGIS environment and is based on the total maximum daily load (TMDL) balance model. Nitrogen (N) is currently addressed in the framework, referred to as WQM-TMDL-N. Loading for each catchment includes non-point sources (NPS) and point sources (PS). NPS loading is estimated using export coefficient or event mean concentration methods depending on the temporal scale, i.e., annual or daily. Loading from atmospheric deposition is also included. The probability of a nutrient load exceeding a target load is evaluated using probabilistic risk assessment, by including the uncertainty associated with the export coefficients of various land uses. The computed risk data can be visualized as spatial maps which show the load exceedance probability for all stream segments. In an application of this modeling approach to the Tippecanoe River watershed in Indiana, USA, total nitrogen (TN) loading and risk of standard exce
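
    The exceedance-risk idea can be sketched with a small Monte Carlo calculation: sample uncertain export coefficients, sum the resulting non-point and point loads for a catchment, and report how often the total exceeds a target load. All areas, coefficients and the target below are hypothetical, not the Tippecanoe application.

```python
# Sketch of load-exceedance probability via Monte Carlo over uncertain
# export coefficients (all numbers are hypothetical).
import random

random.seed(0)

# Land-use areas (ha) with export coefficient mean and sd (kg N/ha/yr).
land_use = {
    "row_crop": (5000.0, 12.0, 3.0),
    "pasture":  (2000.0,  5.0, 1.5),
    "urban":    ( 800.0,  8.0, 2.0),
}
point_sources_kg = 15000.0     # annual point-source load
target_load_kg = 95000.0       # hypothetical TMDL target

def sampled_load():
    nps = sum(area * max(random.gauss(mu, sd), 0.0)
              for area, mu, sd in land_use.values())
    return nps + point_sources_kg

n = 10000
exceed = sum(sampled_load() > target_load_kg for _ in range(n))
print(f"Probability of exceeding target load: {exceed / n:.2f}")
```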

  8. A fuzzy approach for modelling radionuclide in lake system.

    Science.gov (United States)

    Desai, H K; Christian, R A; Banerjee, J; Patra, A K

    2013-10-01

    Radioactive liquid waste is generated during operation and maintenance of Pressurised Heavy Water Reactors (PHWRs). Generally, low level liquid waste is diluted and then discharged into the nearby water body through the blowdown water discharge line as per the standard waste management practice. The effluents from nuclear installations are treated adequately and then released in a controlled manner under strict compliance with discharge criteria. An attempt was made to predict the concentration of (3)H released from Kakrapar Atomic Power Station at Ratania Regulator, about 2.5 km away from the discharge point, where human exposure is expected. Scarcity of data and the complex geometry of the lake prompted the use of a heuristic approach. Under this condition, a fuzzy rule based approach was adopted to develop a model which could predict the (3)H concentration at Ratania Regulator. Three hundred data points were generated for developing the fuzzy rules, in which the input parameters were water flow from the lake and the (3)H concentration at the discharge point. The output was the (3)H concentration at Ratania Regulator. These data points were generated by multiple regression analysis of the original data. Using the same methodology, a further hundred data points were generated for the validation of the model and compared against the output predicted by the fuzzy rule based approach. The root mean square error of the model came out to be 1.95, which shows that the fuzzy model imitates the natural ecosystem well. Copyright © 2013 Elsevier Ltd. All rights reserved.

  9. Anomalous superconductivity in the tJ model; moment approach

    DEFF Research Database (Denmark)

    Sørensen, Mads Peter; Rodriguez-Nunez, J.J.

    1997-01-01

    By extending the moment approach of Nolting (Z. Phys. 225 (1972) 25) in the superconducting phase, we have constructed the one-particle spectral functions (diagonal and off-diagonal) for the tJ model in any dimensions. We propose that both the diagonal and the off-diagonal spectral functions...... Hartree shift which in the end result enlarges the bandwidth of the free carriers allowing us to take relatively high values of J/t and allowing superconductivity to live in the T-c-rho phase diagram, in agreement with numerical calculations in a cluster. We have calculated the static spin susceptibility......, chi(T), and the specific heat, C-v(T), within the moment approach. We find that all the relevant physical quantities show the signature of superconductivity at T-c in the form of kinks (anomalous behavior) or jumps, for low density, in agreement with recent published literature, showing a generic...

  10. The BACHD Rat Model of Huntington Disease Shows Specific Deficits in a Test Battery of Motor Function

    Directory of Open Access Journals (Sweden)

    Giuseppe Manfré

    2017-11-01

    Full Text Available Rationale: Huntington disease (HD) is a progressive neurodegenerative disorder characterized by motor, cognitive and neuropsychiatric symptoms. HD is usually diagnosed by the appearance of motor deficits, resulting in skilled hand use disruption, gait abnormality, muscle wasting and choreatic movements. The BACHD transgenic rat model for HD represents a well-established transgenic rodent model of HD, offering the prospect of an in-depth characterization of the motor phenotype. Objective: The present study aims to characterize different aspects of motor function in BACHD rats, combining classical paradigms with novel high-throughput behavioral phenotyping. Methods: Wild-type (WT) and transgenic animals were tested longitudinally from 2 to 12 months of age. To measure fine motor control, rats were challenged with the pasta handling test and the pellet reaching test. To evaluate gross motor function, animals were assessed by using the holding bar and the grip strength tests. Spontaneous locomotor activity and circadian rhythmicity were assessed in an automated home-cage environment, namely the PhenoTyper. We then integrated existing classical methodologies to test motor function with automated home-cage assessment of motor performance. Results: BACHD rats showed strong impairment in muscle endurance at 2 months of age. Altered circadian rhythmicity and locomotor activity were observed in transgenic animals. On the other hand, reaching behavior, forepaw dexterity and muscle strength were unaffected. Conclusions: The BACHD rat model exhibits certain features of HD patients, like muscle weakness and changes in circadian behavior. We have observed modest but clear-cut deficits in distinct motor phenotypes, thus confirming the validity of this transgenic rat model for treatment and drug discovery purposes.

  11. Genetic Algorithm Approaches to Prebiotic Chemistry Modeling

    Science.gov (United States)

    Lohn, Jason; Colombano, Silvano

    1997-01-01

    We model an artificial chemistry comprised of interacting polymers by specifying two initial conditions: a distribution of polymers and a fixed set of reversible catalytic reactions. A genetic algorithm is used to find a set of reactions that exhibit a desired dynamical behavior. Such a technique is useful because it allows an investigator to determine whether a specific pattern of dynamics can be produced, and if it can, the reaction network found can be then analyzed. We present our results in the context of studying simplified chemical dynamics in theorized protocells - hypothesized precursors of the first living organisms. Our results show that given a small sample of plausible protocell reaction dynamics, catalytic reaction sets can be found. We present cases where this is not possible and also analyze the evolved reaction sets.
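
    A genetic algorithm of the general kind described above evolves a population of candidate reaction sets toward a desired behaviour. The sketch below evolves a bitstring that selects reactions against a deliberately toy fitness function (matching a hidden target set); the chemistry-specific fitness used in the paper is not reproduced here.

```python
# Minimal genetic-algorithm sketch: evolve a bitstring selecting a subset of
# candidate reactions so that a toy fitness function is maximized.
import random

random.seed(1)
N_REACTIONS, POP, GENERATIONS = 20, 30, 50
target = [random.randint(0, 1) for _ in range(N_REACTIONS)]  # hidden "good" set

def fitness(genome):
    # Toy objective: how closely the chosen reaction set matches the target set.
    return sum(g == t for g, t in zip(genome, target))

def mutate(genome, rate=0.05):
    return [1 - g if random.random() < rate else g for g in genome]

def crossover(a, b):
    cut = random.randrange(1, N_REACTIONS)
    return a[:cut] + b[cut:]

population = [[random.randint(0, 1) for _ in range(N_REACTIONS)] for _ in range(POP)]
for generation in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    parents = population[:POP // 2]
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP - len(parents))]
    population = parents + children

best = max(population, key=fitness)
print("best fitness:", fitness(best), "of", N_REACTIONS)
```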

  12. Social Network Analysis and Nutritional Behavior: An Integrated Modeling Approach.

    Science.gov (United States)

    Senior, Alistair M; Lihoreau, Mathieu; Buhl, Jerome; Raubenheimer, David; Simpson, Stephen J

    2016-01-01

    Animals have evolved complex foraging strategies to obtain a nutritionally balanced diet and associated fitness benefits. Recent research combining state-space models of nutritional geometry with agent-based models (ABMs) shows how nutrient targeted foraging behavior can also influence animal social interactions, ultimately affecting collective dynamics and group structures. Here we demonstrate how social network analyses can be integrated into such a modeling framework and provide a practical analytical tool to compare experimental results with theory. We illustrate our approach by examining the case of nutritionally mediated dominance hierarchies. First we show how nutritionally explicit ABMs that simulate the emergence of dominance hierarchies can be used to generate social networks. Importantly the structural properties of our simulated networks bear similarities to dominance networks of real animals (where conflicts are not always directly related to nutrition). Finally, we demonstrate how metrics from social network analyses can be used to predict the fitness of agents in these simulated competitive environments. Our results highlight the potential importance of nutritional mechanisms in shaping dominance interactions in a wide range of social and ecological contexts. Nutrition likely influences social interactions in many species, and yet a theoretical framework for exploring these effects is currently lacking. Combining social network analyses with computational models from nutritional ecology may bridge this divide, representing a pragmatic approach for generating theoretical predictions for nutritional experiments.

  13. BO-1055, a novel DNA cross-linking agent with remarkably low myelotoxicity, shows potent activity in sarcoma models.

    Science.gov (United States)

    Ambati, Srikanth R; Shieh, Jae-Hung; Pera, Benet; Lopes, Eloisi Caldas; Chaudhry, Anisha; Wong, Elissa W P; Saxena, Ashish; Su, Tsann-Long; Moore, Malcolm A S

    2016-07-12

    DNA damaging agents cause rapid shrinkage of tumors and form the basis of chemotherapy for sarcomas despite significant toxicities. Drugs having superior efficacy and wider therapeutic windows are needed to improve patient outcomes. We used cell proliferation and apoptosis assays in sarcoma cell lines and benign cells; γ-H2AX expression, comet assay, immunoblot analyses and drug combination studies in vitro and in patient-derived xenograft (PDX) models. BO-1055 caused apoptosis and cell death in a concentration and time dependent manner in sarcoma cell lines. BO-1055 had potent activity (submicromolar IC50) against Ewing sarcoma and rhabdomyosarcoma, intermediate activity in DSRCT (IC50 = 2-3 μM) and very weak activity in osteosarcoma (IC50 >10 μM) cell lines. BO-1055 exhibited a wide therapeutic window compared to other DNA damaging drugs. BO-1055 induced more DNA double strand breaks and γH2AX expression in cancer cells compared to benign cells. BO-1055 showed inhibition of tumor growth in A673 xenografts and caused tumor regression in cyclophosphamide-resistant patient-derived Ewing sarcoma xenografts and A204 xenografts. Combination of BO-1055 and irinotecan demonstrated synergism in Ewing sarcoma PDX models. Potent activity on sarcoma cells and its relative lack of toxicity present a strong rationale for further development of BO-1055 as a therapeutic agent.

  14. A model approach to the thermodynamics of microemulsion systems: Estimation of adequacy of the two-phase model of microemulsions

    Science.gov (United States)

    Kartsev, V. N.; Polikhronidi, N. G.; Batov, D. V.; Shtykov, S. N.; Stepanov, G. V.

    2010-02-01

    An approach to the thermodynamics of microemulsions based on the use of a two-phase model was suggested. In this model, one phase is the dispersion medium, and the other is the sum of the disperse-phase nanodrops. Experimental estimation of the adequacy of this approach showed that the model can be used to solve microemulsion thermodynamics problems with satisfactory quantitative accuracy. The degree of inadequacy of the two-phase model did not exceed 10%.

  15. Demographical history and palaeodistribution modelling show range shift towards Amazon Basin for a Neotropical tree species in the LGM.

    Science.gov (United States)

    Vitorino, Luciana Cristina; Lima-Ribeiro, Matheus S; Terribile, Levi Carina; Collevatti, Rosane G

    2016-10-13

    We studied the phylogeography and demographical history of Tabebuia serratifolia (Bignoniaceae) to understand the disjunct geographical distribution of South American seasonally dry tropical forests (SDTFs). We specifically tested whether the multiple and isolated patches of SDTFs are current climatic relicts of a widespread and continuously distributed dry forest during the last glacial maximum (LGM), the so-called South American dry forest refugia hypothesis, using ecological niche modelling (ENM) and statistical phylogeography. We sampled 235 individuals of T. serratifolia in 17 populations in Brazil and analysed the polymorphisms at three intergenic chloroplast regions and ITS nuclear ribosomal DNA. Coalescent analyses showed a demographical expansion in the last c. 130 ka (thousand years before present). Simulations and ENM also showed that the current spatial pattern of genetic diversity is most likely due to a scenario of range expansion and range shift towards the Amazon Basin during the colder and more arid climatic conditions associated with the LGM, matching the expectations of the South American dry forest refugia hypothesis, although contrasting with the Pleistocene Arc hypothesis. Populations in more stable areas or with higher suitability through time showed higher genetic diversity. Postglacial range shift towards the Southeast and Atlantic coast may have led to spatial genome assortment due to leading-edge colonization as the species tracked suitable environments, leading to lower genetic diversity in populations at greater distance from the distribution centroid at 21 ka. Haplotype sharing or common ancestry among populations from the Caatinga in Northeast Brazil, the Atlantic Forest in the Southeast and the Cerrado biome, together with the ENM, evinces the past connection among these biomes.

  16. Ecotoxicological modelling of cosmetics for aquatic organisms: A QSTR approach.

    Science.gov (United States)

    Khan, K; Roy, K

    2017-07-01

    In this study, externally validated quantitative structure-toxicity relationship (QSTR) models were developed for the toxicity of cosmetic ingredients on three different ecotoxicologically relevant organisms, namely Pseudokirchneriella subcapitata, Daphnia magna and Pimephales promelas, following the OECD guidelines. The final models were developed by the partial least squares (PLS) regression technique, which is more robust than multiple linear regression. The obtained model for P. subcapitata shows that molecular size and complexity have significant impacts on the toxicity of cosmetics. In the case of P. promelas and D. magna, we found that the largest contributions to the toxicity were from hydrophobicity and van der Waals surface area, respectively. All models were validated using both internal and test compounds employing multiple strategies. For each QSTR model, applicability domain studies were also performed using the "Distance to Model in X-space" method. A comparison was made with the ECOSAR predictions in order to demonstrate the good predictive performance of our developed models. Finally, the individual models were applied to predict toxicity for an external set of 596 personal care products having no experimental data for at least one of the endpoints, and the compounds were ranked in decreasing order of toxicity using a scaling approach.
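
    The PLS step of such a QSTR workflow can be sketched with a standard library call: fit a partial least squares regression of a toxicity endpoint on molecular descriptors and check external predictivity on a held-out set. The descriptor matrix and response below are random stand-ins, not the study's data or descriptor set.

```python
# Sketch of PLS regression for a QSTR-style endpoint on synthetic descriptors.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 8))                       # 8 descriptors per compound
true_w = np.array([1.5, -0.8, 0.0, 0.0, 0.6, 0.0, 0.0, 0.0])
y = X @ true_w + rng.normal(scale=0.3, size=120)    # pLC50-like response

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
pls = PLSRegression(n_components=3).fit(X_tr, y_tr)

r2_ext = r2_score(y_te, pls.predict(X_te).ravel())  # external-set validation
print(f"external R^2: {r2_ext:.2f}")
```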

  17. Consensus approach for modeling HTS assays using in silico descriptors

    Directory of Open Access Journals (Sweden)

    Ahmed eAbdelaziz Sayed

    2016-02-01

    Full Text Available The need for filling information gaps while reducing toxicity testing in animals is becoming more predominant in risk assessment. Recent legislation accepts in silico approaches for predicting toxicological outcomes. This article describes the results of Quantitative Structure Activity Relationship (QSAR) modeling efforts within the Tox21 Data Challenge 2014, which achieved the best balanced accuracy across all molecular pathway endpoints as well as the highest scores for ATAD5 and mitochondrial membrane potential disruption. The automated QSPR workflow system OCHEM (http://ochem.eu), the analytics platform KNIME and the statistics software CRAN R were used to conduct the analysis and develop consensus models using ten different descriptor sets. A detailed analysis of the QSAR models for all 12 molecular pathways and the effect of the underlying models' accuracy on the quality of the consensus model are provided. The resulting consensus models yielded a balanced accuracy as high as 88.1%±0.6 for mitochondrial membrane disruptors. Such high balanced accuracy and use of the applicability domain show a promising potential for in silico modeling to complement the design of HTS screening experiments. The summary statistics of all models are publicly available online at https://github.com/amaziz/Tox21-Challenge-Publication while the developed consensus models can be accessed at http://ochem.eu/article/98009.
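
    The consensus idea can be illustrated in a few lines: average the class probabilities from several underlying models and score the result with balanced accuracy (the mean of sensitivity and specificity). The labels and probabilities below are toy values, and the unweighted average is only one simple way to form a consensus.

```python
# Sketch of a simple consensus of model probabilities scored by balanced accuracy.
import numpy as np

y_true = np.array([1, 1, 0, 0, 1, 0, 1, 0])

# "Active" probabilities for eight compounds from three hypothetical models.
model_probs = np.array([
    [0.9, 0.7, 0.2, 0.4, 0.4, 0.1, 0.8, 0.3],
    [0.8, 0.6, 0.3, 0.2, 0.3, 0.2, 0.6, 0.4],
    [0.7, 0.8, 0.1, 0.5, 0.4, 0.3, 0.9, 0.2],
])
consensus = model_probs.mean(axis=0)          # simple unweighted consensus
y_pred = (consensus >= 0.5).astype(int)

sensitivity = np.mean(y_pred[y_true == 1] == 1)
specificity = np.mean(y_pred[y_true == 0] == 0)
print(f"balanced accuracy: {(sensitivity + specificity) / 2:.2f}")
```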

  18. A Discrete Monetary Economic Growth Model with the MIU Approach

    Directory of Open Access Journals (Sweden)

    Wei-Bin Zhang

    2008-01-01

    Full Text Available This paper proposes an alternative approach to economic growth with money. The production side is the same as the Solow model, the Ramsey model, and the Tobin model. But we deal with behavior of consumers differently from the traditional approaches. The model is influenced by the money-in-the-utility (MIU approach in monetary economics. It provides a mechanism of endogenous saving which the Solow model lacks and avoids the assumption of adding up utility over a period of time upon which the Ramsey approach is based.

  19. Modeling shows that the NS5A inhibitor daclatasvir has two modes of action and yields a shorter estimate of the hepatitis C virus half-life.

    Science.gov (United States)

    Guedj, Jeremie; Dahari, Harel; Rong, Libin; Sansone, Natasha D; Nettles, Richard E; Cotler, Scott J; Layden, Thomas J; Uprichard, Susan L; Perelson, Alan S

    2013-03-05

    The nonstructural 5A (NS5A) protein is a target for drug development against hepatitis C virus (HCV). Interestingly, the NS5A inhibitor daclatasvir (BMS-790052) caused a decrease in serum HCV RNA levels by about two orders of magnitude within 6 h of administration. However, NS5A has no known enzymatic functions, making it difficult to understand daclatasvir's mode of action (MOA) and to estimate its antiviral effectiveness. Modeling viral kinetics during therapy has provided important insights into the MOA and effectiveness of a variety of anti-HCV agents. Here, we show that understanding the effects of daclatasvir in vivo requires a multiscale model that incorporates drug effects on the HCV intracellular lifecycle, and we validated this approach with in vitro HCV infection experiments. The model predicts that daclatasvir efficiently blocks two distinct stages of the viral lifecycle, namely viral RNA synthesis and virion assembly/secretion with mean effectiveness of 99% and 99.8%, respectively, and yields a more precise estimate of the serum HCV half-life, 45 min, i.e., around four times shorter than previous estimates. Intracellular HCV RNA in HCV-infected cells treated with daclatasvir and the HCV polymerase inhibitor NM107 showed a similar pattern of decline. However, daclatasvir treatment led to an immediate and rapid decline of extracellular HCV titers compared to a delayed (6-9 h) and slower decline with NM107, confirming an effect of daclatasvir on both viral replication and assembly/secretion. The multiscale modeling approach, validated with in vitro kinetic experiments, brings a unique conceptual framework for understanding the mechanism of action of a variety of agents in development for the treatment of HCV.
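
    For orientation, the sketch below integrates the classic single-scale HCV kinetic model (virion production blocked with effectiveness ε, fast virion clearance c) that is commonly used to read on-treatment decline; the paper's point is precisely that a multiscale extension is needed for daclatasvir, so this is only the baseline picture, with illustrative parameter values chosen to be broadly consistent with the numbers quoted above.

```python
# Sketch of the classic single-scale HCV viral kinetics model under therapy
# (illustrative parameters; not the multiscale model developed in the paper).
import numpy as np
from scipy.integrate import odeint

c, delta = 22.0, 0.14        # virion clearance and infected-cell loss (1/day)
eps = 0.99                   # effectiveness in blocking virion production
V0, I0 = 1.0e6, 1.0          # pre-treatment steady state (arbitrary units)
p = c * V0 / I0              # production rate consistent with that steady state

def rhs(y, t):
    V, I = y
    dV = (1.0 - eps) * p * I - c * V
    dI = -delta * I          # new infections neglected during potent therapy
    return [dV, dI]

t = np.linspace(0.0, 2.0, 201)                  # days
V = odeint(rhs, [V0, I0], t)[:, 0]

print(f"free-virion half-life ln(2)/c = {np.log(2) / c * 24:.1f} h")
print(f"log10 drop after ~6 h: {np.log10(V0 / V[np.searchsorted(t, 0.25)]):.1f}")
```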

  20. Network models of TEM β-lactamase mutations coevolving under antibiotic selection show modular structure and anticipate evolutionary trajectories.

    Science.gov (United States)

    Guthrie, Violeta Beleva; Allen, Jennifer; Camps, Manel; Karchin, Rachel

    2011-09-01

    Understanding how novel functions evolve (genetic adaptation) is a critical goal of evolutionary biology. Among asexual organisms, genetic adaptation involves multiple mutations that frequently interact in a non-linear fashion (epistasis). Non-linear interactions pose a formidable challenge for the computational prediction of mutation effects. Here we use the recent evolution of β-lactamase under antibiotic selection as a model for genetic adaptation. We build a network of coevolving residues (possible functional interactions), in which nodes are mutant residue positions and links represent two positions found mutated together in the same sequence. Most often these pairs occur in the setting of more complex mutants. Focusing on extended-spectrum resistant sequences, we use network-theoretical tools to identify triple mutant trajectories of likely special significance for adaptation. We extrapolate evolutionary paths (n = 3) that increase resistance and that are longer than the units used to build the network (n = 2). These paths consist of a limited number of residue positions and are enriched for known triple mutant combinations that increase cefotaxime resistance. We find that the pairs of residues used to build the network frequently decrease resistance compared to their corresponding singlets. This is a surprising result, given that their coevolution suggests a selective advantage. Thus, β-lactamase adaptation is highly epistatic. Our method can identify triplets that increase resistance despite the underlying rugged fitness landscape and has the unique ability to make predictions by placing each mutant residue position in its functional context. Our approach requires only sequence information, sufficient genetic diversity, and discrete selective pressures. Thus, it can be used to analyze recent evolutionary events, where coevolution analysis methods that use phylogeny or statistical coupling are not possible. Improving our ability to assess
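
    The network construction described above can be sketched directly from sequence data: each mutated residue position is a node, and an edge links two positions observed mutated together in the same sequence, weighted by how often the pair co-occurs. The mutant lists below are invented examples, not TEM β-lactamase data.

```python
# Sketch of building a co-occurrence (coevolution) network of mutated positions.
from collections import Counter
from itertools import combinations

# Each entry: the set of mutated positions reported for one resistant isolate
# (hypothetical example data).
isolates = [
    {104, 238}, {104, 238, 240}, {164, 238}, {104, 164, 238},
    {238, 240}, {104, 240}, {164}, {104, 238},
]

edges = Counter()
for mutations in isolates:
    for pair in combinations(sorted(mutations), 2):
        edges[pair] += 1

degree = Counter()
for (a, b), weight in edges.items():
    degree[a] += 1
    degree[b] += 1

print("most frequent co-mutated pairs:", edges.most_common(3))
print("node degrees (candidate functional hubs):", dict(degree))
```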

  1. Object-Oriented Approach to Modeling Units of Pneumatic Systems

    Directory of Open Access Journals (Sweden)

    Yu. V. Kyurdzhiev

    2014-01-01

    Full Text Available The article shows the relevance of object-oriented programming approaches to modeling pneumatic units (PU). Based on an analysis of the calculation schemes of pneumatic system aggregates, two basic objects were identified, namely a flow cavity and a material point. Basic interactions of the objects are defined. Cavity-cavity interaction: exchange of matter and energy with mass flows. Cavity-point interaction: force interaction, exchange of energy in the form of work. Point-point interaction: force interaction, elastic interaction, inelastic interaction, and intervals of displacement. The authors have developed mathematical models of the basic objects and interactions. The models and the interactions of elements are implemented with object-oriented programming. Mathematical models of the elements of the PU design scheme are implemented in classes derived from the base classes. These classes implement the models of flow cavity, piston, diaphragm, short channel, diaphragm opened by a given law, spring, bellows, elastic collision, inelastic collision, friction, PU stages with limited movement, etc. Numerical integration of the differential equations of the mathematical models of the PU design scheme elements is based on the fourth-order Runge-Kutta method. On request, each class performs a tact of integration, i.e. calculation of the method's coefficients. The paper presents an integration algorithm for the system of differential equations. All objects of the PU design scheme are placed in a unidirectional class list. An iterator loop initiates the integration tact of all objects in the list. Every fourth iteration makes a transition to the next step of integration. The calculation process stops when any object raises a shutdown flag. The proposed approach was tested in the calculation of a number of PU designs. Compared with traditional approaches to modeling, the proposed method features easy enhancement, code reuse, high reliability
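
    A single classical fourth-order Runge-Kutta step of the kind each object performs can be sketched on a toy example: a piston driven by cavity pressure against a spring. The masses, areas and pressures below are illustrative and the code is not the article's implementation.

```python
# Sketch of one classical RK4 integration "tact" applied to a toy piston model.

def rk4_step(f, y, t, h):
    k1 = f(t, y)
    k2 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k1)])
    k3 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k2)])
    k4 = f(t + h, [yi + h * ki for yi, ki in zip(y, k3)])
    return [yi + h / 6 * (a + 2 * b + 2 * c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

M, A, K, P = 2.0, 1e-3, 5e4, 4e5   # mass (kg), area (m^2), spring (N/m), pressure (Pa)

def piston(t, y):
    x, v = y                          # position (m), velocity (m/s)
    return [v, (P * A - K * x) / M]   # Newton's second law for the piston

y, t, h = [0.0, 0.0], 0.0, 1e-4
for _ in range(1000):                 # integrate 0.1 s of motion
    y = rk4_step(piston, y, t, h)
    t += h
print(f"piston position after {t:.2f} s: {y[0] * 1000:.1f} mm")
```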

  2. Systematic approach to verification and validation: High explosive burn models

    Energy Technology Data Exchange (ETDEWEB)

    Menikoff, Ralph [Los Alamos National Laboratory; Scovel, Christina A. [Los Alamos National Laboratory

    2012-04-16

    , run a simulation, and generate a comparison plot showing simulated and experimental velocity gauge data. These scripts are then applied to several series of experiments and to several HE burn models. The same systematic approach is applicable to other types of material models; for example, equations of state models and material strength models.

  3. A distributed approach for parameters estimation in System Biology models

    International Nuclear Information System (INIS)

    Mosca, E.; Merelli, I.; Alfieri, R.; Milanesi, L.

    2009-01-01

    Due to the lack of experimental measurements, biological variability and experimental errors, the values of many parameters of systems biology mathematical models are yet unknown or uncertain. A possible computational solution is parameter estimation, that is, the identification of the parameter values that give the best model fit with respect to experimental data. We have developed an environment to distribute each run of the parameter estimation algorithm on a different computational resource. The key feature of the implementation is a relational database that allows the user to swap the candidate solutions among the working nodes during the computations. The comparison of the distributed implementation with the parallel one showed that the presented approach enables a faster and better parameter estimation of systems biology models.

  4. Comparing large-scale computational approaches to epidemic modeling: Agent-based versus structured metapopulation models

    Directory of Open Access Journals (Sweden)

    Merler Stefano

    2010-06-01

    Full Text Available Background In recent years large-scale computational models for the realistic simulation of epidemic outbreaks have been used with increased frequency. Methodologies adapt to the scale of interest and range from very detailed agent-based models to spatially-structured metapopulation models. One major issue thus concerns to what extent the geotemporal spreading pattern found by different modeling approaches may differ and depend on the different approximations and assumptions used. Methods We provide for the first time a side-by-side comparison of the results obtained with a stochastic agent-based model and a structured metapopulation stochastic model for the progression of a baseline pandemic event in Italy, a large and geographically heterogeneous European country. The agent-based model is based on the explicit representation of the Italian population through highly detailed data on the socio-demographic structure. The metapopulation simulations use the GLobal Epidemic and Mobility (GLEaM) model, based on high-resolution census data worldwide, and integrating airline travel flow data with short-range human mobility patterns at the global scale. The model also considers age structure data for Italy. GLEaM and the agent-based models are synchronized in their initial conditions by using the same disease parameterization, and by defining the same importation of infected cases from international travels. Results The results obtained show that both models provide epidemic patterns that are in very good agreement at the granularity levels accessible by both approaches, with differences in peak timing on the order of a few days. The relative difference of the epidemic size depends on the basic reproductive ratio, R0, and on the fact that the metapopulation model consistently yields a larger incidence than the agent-based model, as expected due to the differences in the structure in the intra-population contact pattern of the approaches. The age
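
    For intuition about the structured-metapopulation class of models, the sketch below couples two SIR patches through a simple mixing matrix. It is deliberately far coarser than GLEaM or the agent-based model compared in the study, is deterministic rather than stochastic, and uses purely illustrative parameter values.

```python
# Toy two-patch deterministic SIR metapopulation sketch (illustrative only).
import numpy as np
from scipy.integrate import odeint

beta, gamma = 0.35, 0.15           # transmission and recovery rates (1/day)
N = np.array([1.0e6, 3.0e5])       # patch populations
M = np.array([[0.00, 0.02],        # M[i, j]: fraction of patch i mixing in patch j
              [0.05, 0.00]])

def sir(y, t):
    S, I = y[:2], y[2:]
    # Force of infection in each patch, including commuting visitors.
    lam = beta * (I + M.T @ I) / (N + M.T @ N)
    dS = -lam * S
    dI = lam * S - gamma * I
    return np.concatenate([dS, dI])

y0 = np.concatenate([N - [10.0, 0.0], [10.0, 0.0]])   # 10 seed cases in patch 0
t = np.linspace(0.0, 300.0, 301)
I = odeint(sir, y0, t)[:, 2:]

print("epidemic peak day per patch:", t[I.argmax(axis=0)])
```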

  5. Comparing large-scale computational approaches to epidemic modeling: agent-based versus structured metapopulation models.

    Science.gov (United States)

    Ajelli, Marco; Gonçalves, Bruno; Balcan, Duygu; Colizza, Vittoria; Hu, Hao; Ramasco, José J; Merler, Stefano; Vespignani, Alessandro

    2010-06-29

    In recent years large-scale computational models for the realistic simulation of epidemic outbreaks have been used with increased frequency. Methodologies adapt to the scale of interest and range from very detailed agent-based models to spatially-structured metapopulation models. One major issue thus concerns to what extent the geotemporal spreading pattern found by different modeling approaches may differ and depend on the different approximations and assumptions used. We provide for the first time a side-by-side comparison of the results obtained with a stochastic agent-based model and a structured metapopulation stochastic model for the progression of a baseline pandemic event in Italy, a large and geographically heterogeneous European country. The agent-based model is based on the explicit representation of the Italian population through highly detailed data on the socio-demographic structure. The metapopulation simulations use the GLobal Epidemic and Mobility (GLEaM) model, based on high-resolution census data worldwide, and integrating airline travel flow data with short-range human mobility patterns at the global scale. The model also considers age structure data for Italy. GLEaM and the agent-based models are synchronized in their initial conditions by using the same disease parameterization, and by defining the same importation of infected cases from international travels. The results obtained show that both models provide epidemic patterns that are in very good agreement at the granularity levels accessible by both approaches, with differences in peak timing on the order of a few days. The relative difference of the epidemic size depends on the basic reproductive ratio, R0, and on the fact that the metapopulation model consistently yields a larger incidence than the agent-based model, as expected due to the differences in the structure in the intra-population contact pattern of the approaches. The age breakdown analysis shows that similar attack rates are

  6. Mathematical Modelling Approach in Mathematics Education

    Science.gov (United States)

    Arseven, Ayla

    2015-01-01

    The topic of models and modeling has come to be important for science and mathematics education in recent years. The topic of "Modeling" is especially important for examinations such as PISA, which is conducted at an international level and measures a student's success in mathematics. Mathematical modeling can be defined as using…

  7. A Multivariate Approach to Functional Neuro Modeling

    DEFF Research Database (Denmark)

    Mørch, Niels J.S.

    1998-01-01

    by the application of linear and more flexible, nonlinear microscopic regression models to a real-world dataset. The dependency of model performance, as quantified by generalization error, on model flexibility and training set size is demonstrated, leading to the important realization that no uniformly optimal model......, provides the basis for a generalization theoretical framework relating model performance to model complexity and dataset size. Briefly summarized the major topics discussed in the thesis include: - An introduction of the representation of functional datasets by pairs of neuronal activity patterns...... exists. - Model visualization and interpretation techniques. The simplicity of this task for linear models contrasts the difficulties involved when dealing with nonlinear models. Finally, a visualization technique for nonlinear models is proposed. A single observation emerges from the thesis...

  8. A DYNAMICAL SYSTEM APPROACH IN MODELING TECHNOLOGY TRANSFER

    Directory of Open Access Journals (Sweden)

    Hennie Husniah

    2016-05-01

    Full Text Available In this paper we discuss a mathematical model of two-party technology transfer from a leader to a follower. The model is reconstructed via a dynamical system approach from the known standard Raz and Assa model, and we found some important conclusions which have not been discussed in the original model. The model assumes that in the absence of technology transfer from a leader to a follower, both the leader and the follower have the capability to grow independently with a known upper limit of development. We obtain a rich mathematical structure for the steady state solution of the model. We discuss a special situation in which the upper limit of the technological development of the follower is higher than that of the leader, but the leader has started earlier than the follower in implementing the technology. In this case we show that a paradox, in which the follower is unable to reach its original upper limit of technological development, could appear whenever the transfer rate is sufficiently high. We propose a new model to increase realism, so that any technological transfer rate can only have a positive effect in accelerating the growth of the follower towards its original upper limit of development.
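
    A system in the spirit of the description above can be sketched as two logistic growth equations coupled by a transfer term proportional to the leader-follower gap. The functional form and all parameter values below are illustrative assumptions, not the paper's exact equations; with this form, a strong coupling holds the follower below its own ceiling, echoing the paradox mentioned.

```python
# Sketch of a leader-follower technology-transfer system (illustrative assumptions).
import numpy as np
from scipy.integrate import odeint

rL, rF = 0.30, 0.25      # intrinsic growth rates of leader and follower
KL, KF = 1.0, 1.2        # upper limits (follower's own limit is the higher one)
alpha = 0.4              # technology transfer rate from leader to follower

def rhs(y, t):
    L, F = y
    dL = rL * L * (1.0 - L / KL)
    dF = rF * F * (1.0 - F / KF) + alpha * (L - F)
    return [dL, dF]

t = np.linspace(0.0, 80.0, 801)
L, F = odeint(rhs, [0.2, 0.01], t).T   # leader has started earlier (higher level)

# With this coupling the follower settles below its own limit KF.
print(f"leader -> {L[-1]:.2f} (limit {KL}), follower -> {F[-1]:.2f} (own limit {KF})")
```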

  9. Modeling of problems of projection: A non-countercyclic approach

    Directory of Open Access Journals (Sweden)

    Jason Ginsburg

    2016-06-01

    Full Text Available This paper describes a computational implementation of the recent Problems of Projection (POP) approach to the study of language (Chomsky 2013; 2015). While adopting the basic proposals of POP, notably with respect to how labeling occurs, we (a) attempt to formalize the basic proposals of POP, and (b) develop new proposals that overcome some problems with POP that arise with respect to cyclicity, labeling, and wh-movement operations. We show how this approach accounts for simple declarative sentences, ECM constructions, and constructions that involve long-distance movement of a wh-phrase (including the that-trace effect). We implemented these proposals with a computer model that automatically constructs step-by-step derivations of target sentences, thus making it possible to verify that these proposals work.

  10. Uncertainty in biology a computational modeling approach

    CERN Document Server

    Gomez-Cabrero, David

    2016-01-01

    Computational modeling of biomedical processes is gaining more and more weight in current research into the etiology of biomedical problems and potential treatment strategies. Computational modeling makes it possible to reduce, refine and replace animal experimentation as well as to translate findings obtained in these experiments to the human background. However, these biomedical problems are inherently complex, with a myriad of influencing factors, which strongly complicates the model building and validation process. This book addresses four main issues related to the building and validation of computational models of biomedical processes: modeling establishment under uncertainty; model selection and parameter fitting; sensitivity analysis and model adaptation; and model predictions under uncertainty. In each of the abovementioned areas, the book discusses a number of key techniques by means of a general theoretical description followed by one or more practical examples. This book is intended for graduate stude...

  11. Relaxed memory models: an operational approach

    OpenAIRE

    Boudol , Gérard; Petri , Gustavo

    2009-01-01

    Memory models define an interface between programs written in some language and their implementation, determining which behaviour the memory (and thus a program) is allowed to have in a given model. A minimal guarantee memory models should provide to the programmer is that well-synchronized, that is, data-race free code has a standard semantics. Traditionally, memory models are defined axiomatically, setting constraints on the order in which memory operations are allow...

  12. Numerical modelling approach for mine backfill

    Indian Academy of Sciences (India)

    ... of mine backfill material needs special attention as the numerical model must behave realistically and in accordance with the site conditions. This paper discusses a numerical modelling strategy for modelling mine backfill material. The modelling strategy is studied using a case study mine from the Canadian mining industry.

  13. Spintronic device modeling and evaluation using modular approach to spintronics

    Science.gov (United States)

    Ganguly, Samiran

    Spintronics technology finds itself at an exciting stage today. Riding on the back of rapid growth and impressive advances in materials and phenomena, it has started to make headway in the memory industry as solid state magnetic memories (STT-MRAM) and is considered a possible candidate to replace CMOS when its scaling reaches physical limits. It is necessary to bring all these advances together in a coherent fashion to explore and evaluate the potential of spintronic devices. This work creates a framework for this exploration and evaluation based on the Modular Approach to Spintronics, which encapsulates the physics of transport of charge and spin through materials and the phenomenology of magnetic dynamics and interaction in benchmarked elemental modules. These modules can then be combined together to form spin-circuit models of complex spintronic devices and structures which can be simulated using SPICE-like circuit simulators. In this work we demonstrate how the Modular Approach to Spintronics can be used to build spin-circuit models of functional spintronic devices of all types: memory, logic, and oscillators. We then show how the Modular Approach to Spintronics can help identify critical factors behind static and dynamic dissipation in spintronic devices and provide remedies by exploring the use of various alternative materials and phenomena. Lastly, we show the use of the Modular Approach to Spintronics in exploring new paradigms of computing enabled by the inherent physics of spintronic devices. We hope that this work will encourage more research and experiments that will establish spintronics as a viable technology for continued advancement of electronics.

  14. A new formulation of cannabidiol in cream shows therapeutic effects in a mouse model of experimental autoimmune encephalomyelitis.

    Science.gov (United States)

    Giacoppo, Sabrina; Galuppo, Maria; Pollastro, Federica; Grassi, Gianpaolo; Bramanti, Placido; Mazzon, Emanuela

    2015-10-21

    The present study was designed to investigate the efficacy of a new formulation of purified cannabidiol (CBD) (>98%) alone, the main non-psychotropic cannabinoid of Cannabis sativa, as a topical treatment in an experimental model of autoimmune encephalomyelitis (EAE), the most commonly used model for multiple sclerosis (MS). In particular, we evaluated whether administration of a topical 1% CBD-cream, given at the time of symptomatic disease onset, could affect EAE progression and whether this treatment could also recover paralysis of the hind limbs, qualifying topical CBD for the symptomatic treatment of MS. In order to obtain a 1% CBD-cream preparation, pure CBD was solubilized in propylene glycol and basic dense cream O/A. EAE was induced by immunization with myelin oligodendroglial glycoprotein peptide (MOG35-55) in C57BL/6 mice. After EAE onset, mice were allocated into several experimental groups (Naïve, EAE, EAE-1% CBD-cream, EAE-vehicle cream, CTRL-1% CBD-cream, CTRL-vehicle cream). Mice were observed daily for signs of EAE and weight loss. At the sacrifice of the animals, which occurred on the 28th day from EAE induction, spinal cord and spleen tissues were collected in order to perform histological evaluation, immunohistochemistry and western blotting analysis. The achieved results surprisingly show that daily treatment with topical 1% CBD-cream may exert neuroprotective effects against EAE, diminishing the clinical disease score (mean of 5.0 in EAE mice vs 1.5 in EAE + CBD-cream), recovering paralysis of the hind limbs and ameliorating the histological score typical of the disease (lymphocytic infiltration and demyelination) in spinal cord tissues. Also, 1% CBD-cream is able to counteract the EAE-induced damage by reducing the release of CD4 and CD8α T cells (spleen tissue localization was quantified as about 10.69% and 35.96% positive staining, respectively, in EAE mice) and the expression of the main pro-inflammatory cytokines as well as several other

  15. Vertically-integrated Approaches for Carbon Sequestration Modeling

    Science.gov (United States)

    Bandilla, K.; Celia, M. A.; Guo, B.

    2015-12-01

    Carbon capture and sequestration (CCS) is being considered as an approach to mitigate anthropogenic CO2 emissions from large stationary sources such as coal fired power plants and natural gas processing plants. Computer modeling is an essential tool for site design and operational planning as it allows prediction of the pressure response as well as the migration of both CO2 and brine in the subsurface. Many processes, such as buoyancy, hysteresis, geomechanics and geochemistry, can have important impacts on the system. While all of the processes can be taken into account simultaneously, the resulting models are computationally very expensive and require large numbers of parameters which are often uncertain or unknown. In many cases of practical interest, the computational and data requirements can be reduced by choosing a smaller domain and/or by neglecting or simplifying certain processes. This leads to a series of models with different complexity, ranging from coupled multi-physics, multi-phase three-dimensional models to semi-analytical single-phase models. Under certain conditions the three-dimensional equations can be integrated in the vertical direction, leading to a suite of two-dimensional multi-phase models, termed vertically-integrated models. These models are either solved numerically or simplified further (e.g., assumption of vertical equilibrium) to allow analytical or semi-analytical solutions. This presentation focuses on how different vertically-integrated models have been applied to the simulation of CO2 and brine migration during CCS projects. Several example sites, such as the Illinois Basin and the Wabamun Lake region of the Alberta Basin, are discussed to show how vertically-integrated models can be used to gain understanding of CCS operations.

  16. Novel approach for modeling separation forces between deformable bodies.

    Science.gov (United States)

    Mahvash, Mohsen

    2006-07-01

    Many minimally invasive surgeries (MISs) involve removing whole organs or tumors that are connected to other organs. Development of haptic simulators that reproduce separation forces between organs can help surgeons learn MIS procedures. Powerful computational approaches such as finite-element methods generally cannot simulate separation in real time. This paper presents a novel approach for real-time computation of separation forces between deformable bodies. Separation occurs either due to fracture when a tool applies extensive forces to the bodies or due to evaporation when a laser beam burns the connection between the bodies. The separation forces are generated online from precalculated force-displacement functions that depend on the local adhesion/separation states between bodies. The precalculated functions are accurately synthesized from a large number of force responses obtained through either offline simulation, measurement, or analytical approximation during the preprocessing step. The approach does not require online computation of force versus global deformation to obtain separation forces. Only online interpolation of precalculated responses is required. The states of adhesion/separation during fracture and evaporation are updated by computationally simple models, which are derived based on the law of conservation of energy. An implementation of the approach for the haptic simulation of the removal of a diseased organ is presented, showing the fidelity of the simulation.
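
    A minimal sketch of the online step described in this record, assuming hypothetical precalculated force-displacement curves keyed by a local adhesion/separation state (the curve values, state names and overall structure are illustrative assumptions, not the author's implementation):

        import numpy as np

        # Hypothetical precalculated force-displacement responses (displacement in mm, force in N),
        # one curve per local adhesion/separation state, obtained offline by simulation or measurement.
        precalculated = {
            "attached":   (np.array([0.0, 1.0, 2.0, 3.0]), np.array([0.0, 0.8, 1.9, 3.5])),
            "separating": (np.array([0.0, 1.0, 2.0, 3.0]), np.array([0.0, 0.5, 0.7, 0.2])),
            "separated":  (np.array([0.0, 1.0, 2.0, 3.0]), np.array([0.0, 0.0, 0.0, 0.0])),
        }

        def separation_force(state, displacement):
            """Online step: only interpolation of a precalculated response is required."""
            disp, force = precalculated[state]
            return float(np.interp(displacement, disp, force))

        # Example: force fed back to the haptic device at 1.5 mm tool displacement.
        print(separation_force("separating", 1.5))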

  17. Models Portability: Some Considerations about Transdisciplinary Approaches

    Science.gov (United States)

    Giuliani, Alessandro

    Some critical issues about the relative portability of models and solutions across disciplinary barriers are discussed. The risks linked to the use of models and theories coming from different disciplines are highlighted, with a particular emphasis on biology. A metaphorical use of conceptual tools coming from other fields is suggested, together with the inescapable need to judge the relative merits of a model on the basis of the number of facts it explains in its particular domain of application. Some examples of metaphorical modeling coming from biochemistry and psychobiology are briefly discussed in order to clarify the above positions.

  18. Nonlinear Modeling of the PEMFC Based On NNARX Approach

    OpenAIRE

    Shan-Jen Cheng; Te-Jen Chang; Kuang-Hsiung Tan; Shou-Ling Kuo

    2015-01-01

    The Polymer Electrolyte Membrane Fuel Cell (PEMFC) is a time-varying nonlinear dynamic system. Traditional linear modeling approaches struggle to estimate the structure of the PEMFC system correctly. For this reason, this paper presents a nonlinear model of the PEMFC using the Neural Network Auto-Regressive model with eXogenous inputs (NNARX) approach. A multilayer perceptron (MLP) network is applied to evaluate the structure of the NNARX model of the PEMFC. The validity and accurac...

  19. A New Approach to Model Verification, Falsification and Selection

    Directory of Open Access Journals (Sweden)

    Andrew J. Buck

    2015-06-01

    Full Text Available This paper shows that a qualitative analysis, i.e., an assessment of the consistency of a hypothesized sign pattern for structural arrays with the sign pattern of the estimated reduced form, can always provide decisive insight into a model’s validity both in general and compared to other models. Qualitative analysis can show that it is impossible for some models to have generated the data used to estimate the reduced form, even though standard specification tests might show the model to be adequate. A partially specified structural hypothesis can be falsified by estimating as few as one reduced form equation. Zero restrictions in the structure can themselves be falsified. It is further shown how the information content of the hypothesized structural sign patterns can be measured using a commonly applied concept of statistical entropy. The lower the hypothesized structural sign pattern’s entropy, the more a priori information it proposes about the sign pattern of the estimated reduced form. As an hypothesized structural sign pattern has a lower entropy, it is more subject to type 1 error and less subject to type 2 error. Three cases illustrate the approach taken here.
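
    The information measure referred to above can be made explicit; assuming the standard Shannon form over the admissible reduced-form sign patterns (stated here as the usual definition, not the paper's exact construction), the entropy of a hypothesized structural sign pattern is

        H = -\sum_{i} p_i \log p_i,

    where p_i is the probability the hypothesis assigns to the i-th reduced-form sign pattern; lower H means the structural hypothesis says more, a priori, about the estimated reduced form.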

  20. A visual approach for modeling spatiotemporal relations

    NARCIS (Netherlands)

    R.L. Guimarães (Rodrigo); C.S.S. Neto; L.F.G. Soares

    2008-01-01

    Textual programming languages have proven to be difficult to learn and to use effectively for many people. For this reason, visual tools can be useful to abstract the complexity of such textual languages, minimizing the specification effort. In this paper we present a visual approach for

  1. DIVERSE APPROACHES TO MODELLING THE ASSIMILATIVE ...

    African Journals Online (AJOL)

    This study evaluated the assimilative capacity of Ikpoba River using different approaches, namely: homogeneous differential equations, ANOVA/Duncan multiple range test, first and second order differential equations, correlation analysis, eigenvalues and eigenvectors, multiple linear regression, bootstrapping and far-field ...

  2. Realistic Matematic Approach through Numbered Head Together Learning Model

    Science.gov (United States)

    Sugihatno, A. C. M. S.; Budiyono; Slamet, I.

    2017-09-01

    Recently, teaching processes conducted in a teacher-centered way have affected student interaction in the class, causing students to become less interested in participating. That is why teachers should be more creative in designing learning using other types of cooperative learning models. Therefore, this research is aimed at implementing NHT with RMA in the teaching process. We utilize NHT since it is a variant of group discussion whose aim is to give students a chance to share their ideas related to the teacher's question. By using NHT in the class, a teacher can give a better understanding of the material with the help of the Realistic Mathematics Approach (RMA), which is known for its real problem contexts. Meanwhile, the researchers assume that, besides the selected teaching model, the students' Adversity Quotient (AQ) also influences their achievement. This research used a quasi-experimental design. The sample of 60 junior high school students was taken using the stratified cluster random sampling technique. The results show that NHT-RMA gives better mathematics learning achievement than the direct teaching model, and that students taught with NHT-RMA who are categorized as high AQ show different learning achievement from students categorized as moderate and low AQ.

  3. Comparison of two novel approaches to model fibre reinforced concrete

    NARCIS (Netherlands)

    Radtke, F.K.F.; Simone, A.; Sluys, L.J.

    2009-01-01

    We present two approaches to model fibre reinforced concrete. In both approaches, discrete fibre distributions and the behaviour of the fibre-matrix interface are explicitly considered. One approach employs the reaction forces from fibre to matrix while the other is based on the partition of unity

  4. Modeling Approaches for Describing Microbial Population Heterogeneity

    DEFF Research Database (Denmark)

    Lencastre Fernandes, Rita

    in a computational fluid dynamic (CFD) model. The anaerobic growth of a budding yeast population in a continuously run microbioreactor was used as an example. The proposed integrated model describes the fluid flow, the local cell size and cell cycle position distributions, as well as the local concentrations of glucose...

  5. A simplified approach to feedwater train modeling

    International Nuclear Information System (INIS)

    Ollat, X.; Smoak, R.A.

    1990-01-01

    This paper presents a method to simplify feedwater train models for power plants. A simple set of algebraic equations, based on mass and energy balances, is developed to replace complex representations of the components under certain assumptions. The method was tested and used to model the low pressure heaters of the Sequoyah Nuclear Plant in a larger simulation

  6. The workshop on ecosystems modelling approaches for South ...

    African Journals Online (AJOL)

    roles played by models in the OMP approach, and raises questions about the costs of the data collection (in particular) needed to apply a multispecies modelling approach in South African fisheries management. It then summarizes the deliberations of workshops held by the Scientific Committees of two international ma-.

  7. A simple approach to modeling ductile failure.

    Energy Technology Data Exchange (ETDEWEB)

    Wellman, Gerald William

    2012-06-01

    Sandia National Laboratories has the need to predict the behavior of structures after the occurrence of an initial failure. In some cases determining the extent of failure, beyond initiation, is required, while in a few cases the initial failure is a design feature used to tailor the subsequent load paths. In either case, the ability to numerically simulate the initiation and propagation of failures is a highly desired capability. This document describes one approach to the simulation of failure initiation and propagation.

  8. Advanced language modeling approaches, case study: Expert search

    NARCIS (Netherlands)

    Hiemstra, Djoerd

    2008-01-01

    This tutorial gives a clear and detailed overview of advanced language modeling approaches and tools, including the use of document priors, translation models, relevance models, parsimonious models and expectation maximization training. Expert search will be used as a case study to explain the

  9. Chemotaxis: A Multi-Scale Modeling Approach

    Science.gov (United States)

    Bhowmik, Arpan

    We are attempting to build a working simulation of population level self-organization in dictyostelium discoideum cells by combining existing models for chemo-attractant production and detection, along with phenomenological motility models. Our goal is to create a computationally-viable model-framework within which a population of cells can self-generate chemo-attractant waves and self-organize based on the directional cues of those waves. The work is a direct continuation of our previous work published in Physical Biology titled ``Excitable waves and direction-sensing in Dictyostelium Discoideum: steps towards a chemotaxis model''. This is a work in progress, no official draft/paper exists yet.

  10. An Integrated Approach to Modeling Evacuation Behavior

    Science.gov (United States)

    2011-02-01

    A spate of recent hurricanes and other natural disasters has drawn a lot of attention to the evacuation decision of individuals. Here we focus on evacuation models that incorporate two economic phenomena that seem to be increasingly important in exp...

  11. Infectious disease modeling a hybrid system approach

    CERN Document Server

    Liu, Xinzhi

    2017-01-01

    This volume presents infectious diseases modeled mathematically, taking seasonality and changes in population behavior into account, using a switched and hybrid systems framework. The scope of coverage includes background on mathematical epidemiology, including classical formulations and results; a motivation for seasonal effects and changes in population behavior, an investigation into term-time forced epidemic models with switching parameters, and a detailed account of several different control strategies. The main goal is to study these models theoretically and to establish conditions under which eradication or persistence of the disease is guaranteed. In doing so, the long-term behavior of the models is determined through mathematical techniques from switched systems theory. Numerical simulations are also given to augment and illustrate the theoretical results and to help study the efficacy of the control schemes.

  12. Challenges and opportunities for integrating lake ecosystem modelling approaches

    Science.gov (United States)

    Mooij, Wolf M.; Trolle, Dennis; Jeppesen, Erik; Arhonditsis, George; Belolipetsky, Pavel V.; Chitamwebwa, Deonatus B.R.; Degermendzhy, Andrey G.; DeAngelis, Donald L.; Domis, Lisette N. De Senerpont; Downing, Andrea S.; Elliott, J. Alex; Ruberto, Carlos Ruberto; Gaedke, Ursula; Genova, Svetlana N.; Gulati, Ramesh D.; Hakanson, Lars; Hamilton, David P.; Hipsey, Matthew R.; Hoen, Jochem 't; Hulsmann, Stephan; Los, F. Hans; Makler-Pick, Vardit; Petzoldt, Thomas; Prokopkin, Igor G.; Rinke, Karsten; Schep, Sebastiaan A.; Tominaga, Koji; Van Dam, Anne A.; Van Nes, Egbert H.; Wells, Scott A.; Janse, Jan H.

    2010-01-01

    A large number and wide variety of lake ecosystem models have been developed and published during the past four decades. We identify two challenges for making further progress in this field. One such challenge is to avoid developing more models largely following the concept of others ('reinventing the wheel'). The other challenge is to avoid focusing on only one type of model, while ignoring new and diverse approaches that have become available ('having tunnel vision'). In this paper, we aim at improving the awareness of existing models and knowledge of concurrent approaches in lake ecosystem modelling, without covering all possible model tools and avenues. First, we present a broad variety of modelling approaches. To illustrate these approaches, we give brief descriptions of rather arbitrarily selected sets of specific models. We deal with static models (steady state and regression models), complex dynamic models (CAEDYM, CE-QUAL-W2, Delft 3D-ECO, LakeMab, LakeWeb, MyLake, PCLake, PROTECH, SALMO), structurally dynamic models and minimal dynamic models. We also discuss a group of approaches that could all be classified as individual based: super-individual models (Piscator, Charisma), physiologically structured models, stage-structured models and trait-based models. We briefly mention genetic algorithms, neural networks, Kalman filters and fuzzy logic. Thereafter, we zoom in, as an in-depth example, on the multi-decadal development and application of the lake ecosystem model PCLake and related models (PCLake Metamodel, Lake Shira Model, IPH-TRIM3D-PCLake). In the discussion, we argue that while the historical development of each approach and model is understandable given its 'leading principle', there are many opportunities for combining approaches. We take the point of view that a single 'right' approach does not exist and should not be strived for. Instead, multiple modelling approaches, applied concurrently to a given problem, can help develop an integrative

  13. "Dispersion modeling approaches for near road | Science ...

    Science.gov (United States)

    Roadway design and roadside barriers can have significant effects on the dispersion of traffic-generated pollutants, especially in the near-road environment. Dispersion models that can accurately simulate these effects are needed to fully assess these impacts for a variety of applications. For example, such models can be useful for evaluating the mitigation potential of roadside barriers in reducing near-road exposures and their associated adverse health effects. Two databases, a tracer field study and a wind tunnel study, provide measurements used in the development and/or validation of algorithms to simulate dispersion in the presence of noise barriers. The tracer field study was performed in Idaho Falls, ID, USA with a 6-m noise barrier and a finite line source in a variety of atmospheric conditions. The second study was performed in the meteorological wind tunnel at the US EPA and simulated line sources at different distances from a model noise barrier to capture the effect on emissions from individual lanes of traffic. In both cases, velocity and concentration measurements characterized the effect of the barrier on dispersion.This paper presents comparisons with the two datasets of the barrier algorithms implemented in two different dispersion models: US EPA’s R-LINE (a research dispersion modelling tool under development by the US EPA’s Office of Research and Development) and CERC’s ADMS model (ADMS-Urban). In R-LINE the physical features reveal

  14. A parsimonious approach to modeling animal movement data.

    Directory of Open Access Journals (Sweden)

    Yann Tremblay

    Full Text Available Animal tracking is a growing field in ecology and previous work has shown that simple speed filtering of tracking data is not sufficient and that improvements of tracking location estimates are possible. To date, this has required methods that are complicated and often time-consuming (state-space models), resulting in limited application of this technique and the potential for analysis errors due to poor understanding of the fundamental framework behind the approach. We describe and test an alternative and intuitive approach consisting of bootstrapping random walks biased by forward particles. The model uses recorded data accuracy estimates, and can assimilate other sources of data such as sea-surface temperature, bathymetry and/or physical boundaries. We tested our model using ARGOS and geolocation tracks of elephant seals that also carried GPS tags in addition to PTTs, enabling true validation. Among pinnipeds, elephant seals are extreme divers that spend little time at the surface, which considerably impacts the quality of both ARGOS and light-based geolocation tracks. Despite such low overall quality tracks, our model provided location estimates within 4.0, 5.5 and 12.0 km of true location 50% of the time, and within 9, 10.5 and 20.0 km 90% of the time, for above, equal or below average elephant seal ARGOS track qualities, respectively. With geolocation data, 50% of errors were less than 104.8 km (<0.94 degrees), and 90% were less than 199.8 km (<1.80 degrees). Larger errors were due to lack of sea-surface temperature gradients. In addition we show that our model is flexible enough to solve the obstacle avoidance problem by assimilating high resolution coastline data. This reduced the number of invalid on-land locations by almost an order of magnitude. The method is intuitive, flexible and efficient, promising extensive utilization in future research.
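
    As a toy illustration of the bootstrapped, forward-biased random walks described above, the sketch below assumes a one-dimensional track, Gaussian location errors and a simple linear pull toward the next fix; the numbers and the bias rule are illustrative assumptions only, and the published model additionally assimilates data such as sea-surface temperature, bathymetry and coastlines:

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical recorded fixes (km along a 1-D track) and their accuracy estimates (SD, km).
        fixes = np.array([0.0, 12.0, 30.0, 41.0])
        errors = np.array([5.0, 8.0, 3.0, 6.0])
        steps_between_fixes = 10

        def biased_walk(start, target, n_steps, step_sd=2.0, bias=0.3):
            """Random walk whose increments are pulled toward the next fix (the 'forward particle')."""
            x, path = start, [start]
            for k in range(n_steps):
                pull = bias * (target - x) / (n_steps - k)   # stronger pull as the fix approaches
                x = x + pull + rng.normal(0.0, step_sd)
                path.append(x)
            return np.array(path)

        # Bootstrap: many candidate walks between noisy realizations of consecutive fixes,
        # then summarize the ensemble to obtain location estimates with uncertainty.
        ensemble = []
        for _ in range(200):
            noisy = rng.normal(fixes, errors)                # resample fixes within their stated accuracy
            segments = [biased_walk(noisy[i], noisy[i + 1], steps_between_fixes)
                        for i in range(len(fixes) - 1)]
            ensemble.append(np.concatenate(segments))
        ensemble = np.array(ensemble)
        print(ensemble.mean(axis=0)[:5])                     # ensemble-mean track (first few positions)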

  15. A consortium approach to glass furnace modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Chang, S.-L.; Golchert, B.; Petrick, M.

    1999-04-20

    Using computational fluid dynamics to model a glass furnace is a difficult task for any one glass company, laboratory, or university to accomplish. The task of building a computational model of the furnace requires knowledge and experience in modeling two dissimilar regimes (the combustion space and the liquid glass bath), along with the skill necessary to couple these two regimes. Also, a detailed set of experimental data is needed in order to evaluate the output of the code to ensure that the code is providing proper results. Since all these diverse skills are not present in any one research institution, a consortium was formed between Argonne National Laboratory, Purdue University, Mississippi State University, and five glass companies in order to marshal these skills into one three-year program. The objective of this program is to develop a fully coupled, validated simulation of a glass melting furnace that may be used by industry to optimize the performance of existing furnaces.

  16. A chain reaction approach to modelling gene pathways.

    Science.gov (United States)

    Cheng, Gary C; Chen, Dung-Tsa; Chen, James J; Soong, Seng-Jaw; Lamartiniere, Coral; Barnes, Stephen

    2012-08-01

    nutrient-containing diets regulate gene expression in the estrogen synthesis pathway during puberty; (II) global tests to assess an overall association of this particular pathway with the time factor by utilizing generalized linear models to analyze microarray data; and (III) a chain reaction model to simulate the pathway. This is a novel application because we are able to translate the gene pathway into the chemical reactions in which each reaction channel describes a gene-gene relationship in the pathway. In the chain reaction model, the implicit scheme is employed to efficiently solve the differential equations. Data analysis results show that the proposed model is capable of predicting gene expression changes and demonstrating the effect of nutrient-containing diets on gene expression changes in the pathway. One of the objectives of this study is to explore and develop a numerical approach for simulating the gene expression change so that it can be applied and calibrated when data for more time slices are available, and thus can be used to interpolate the expression change at a desired time point without conducting expensive experiments for a large number of time points. Hence, we are not claiming this is either essential or the most efficient way of simulating this problem, but rather a mathematical/numerical approach that can model the expression change of a large set of genes of a complex pathway. In addition, we understand the limitation of this experiment and realize that it is still far from being a complete model of predicting nutrient-gene interactions. The reason is that in the present model, the reaction rates were estimated based on available data at two time points; hence, the gene expression change is dependent upon the reaction rates and a linear function of the gene expressions. More data sets containing gene expression at various time slices are needed in order to improve the present model so that a non-linear variation of gene expression changes at different time
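
    To make the implicit-scheme step concrete, the sketch below applies a backward-Euler update to a linear chain-reaction system dx/dt = Kx, where x holds gene expression levels and K holds gene-gene reaction rates; the matrix values are made-up placeholders, whereas the paper's actual rates were estimated from the microarray data at the available time points:

        import numpy as np

        # Hypothetical rate matrix K: entry K[i, j] couples the expression of gene j to gene i.
        K = np.array([[-0.5,  0.0,  0.0],
                      [ 0.4, -0.3,  0.0],
                      [ 0.0,  0.2, -0.1]])
        x = np.array([1.0, 0.2, 0.05])      # initial expression levels (arbitrary units)
        dt, n_steps = 0.1, 50

        # Backward Euler: solve (I - dt*K) x_{n+1} = x_n at every step (unconditionally stable).
        A = np.eye(len(x)) - dt * K
        for _ in range(n_steps):
            x = np.linalg.solve(A, x)

        print(x)                            # interpolated expression levels after n_steps*dt time units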

  17. Bioavailability of particulate metal to zebra mussels: Biodynamic modelling shows that assimilation efficiencies are site-specific

    Energy Technology Data Exchange (ETDEWEB)

    Bourgeault, Adeline, E-mail: bourgeault@ensil.unilim.fr [Cemagref, Unite de Recherche Hydrosystemes et Bioprocedes, 1 rue Pierre-Gilles de Gennes, 92761 Antony (France); FIRE, FR-3020, 4 place Jussieu, 75005 Paris (France); Gourlay-France, Catherine, E-mail: catherine.gourlay@cemagref.fr [Cemagref, Unite de Recherche Hydrosystemes et Bioprocedes, 1 rue Pierre-Gilles de Gennes, 92761 Antony (France); FIRE, FR-3020, 4 place Jussieu, 75005 Paris (France); Priadi, Cindy, E-mail: cindy.priadi@eng.ui.ac.id [LSCE/IPSL CEA-CNRS-UVSQ, Avenue de la Terrasse, 91198 Gif-sur-Yvette (France); Ayrault, Sophie, E-mail: Sophie.Ayrault@lsce.ipsl.fr [LSCE/IPSL CEA-CNRS-UVSQ, Avenue de la Terrasse, 91198 Gif-sur-Yvette (France); Tusseau-Vuillemin, Marie-Helene, E-mail: Marie-helene.tusseau@ifremer.fr [IFREMER Technopolis 40, 155 rue Jean-Jacques Rousseau, 92138 Issy-Les-Moulineaux (France)

    2011-12-15

    This study investigates the ability of the biodynamic model to predict the trophic bioaccumulation of cadmium (Cd), chromium (Cr), copper (Cu), nickel (Ni) and zinc (Zn) in a freshwater bivalve. Zebra mussels were transplanted to three sites along the Seine River (France) and collected monthly for 11 months. Measurements of the metal body burdens in mussels were compared with the predictions from the biodynamic model. The exchangeable fraction of metal particles did not account for the bioavailability of particulate metals, since it did not capture the differences between sites. The assimilation efficiency (AE) parameter is necessary to take into account biotic factors influencing particulate metal bioavailability. The biodynamic model, applied with AEs from the literature, overestimated the measured concentrations in zebra mussels, the extent of overestimation being site-specific. Therefore, an original methodology was proposed for in situ AE measurements for each site and metal. - Highlights: > Exchangeable fraction of metal particles did not account for the bioavailability of particulate metals. > Need for site-specific biodynamic parameters. > Field-determined AE provide a good fit between the biodynamic model predictions and bioaccumulation measurements. - The interpretation of metal bioaccumulation in transplanted zebra mussels with biodynamic modelling highlights the need for site-specific assimilation efficiencies of particulate metals.
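
    For orientation, the biodynamic model referred to here is usually written as a first-order mass balance; a commonly cited general form (stated as background, not as the exact parameterization used in this study) is

        \frac{dC}{dt} = k_u\,C_w + AE \cdot IR \cdot C_f - (k_e + k_g)\,C,

    where C is the tissue metal concentration, C_w and C_f the dissolved and particulate (food) metal concentrations, k_u the dissolved uptake rate constant, IR the ingestion rate, AE the assimilation efficiency (here determined in situ for each site and metal), and k_e and k_g the efflux and growth-dilution rate constants.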

  18. Phytoplankton as Particles - A New Approach to Modeling Algal Blooms

    Science.gov (United States)

    2013-07-01

    ERDC/EL TR-13-13, Civil Works Basic Research Program: Phytoplankton as Particles – A New Approach to Modeling Algal Blooms. Carl F. Cerco and Mark R. Noel, Environmental Laboratory, U.S. Army Engineer Research... Phytoplankton blooms can be modeled by treating phytoplankton as discrete particles capable of self-induced transport via buoyancy regulation or other

  19. Contribution of a companion modelling approach

    African Journals Online (AJOL)

    2009-09-16

    Sep 16, 2009 ... This paper describes the role of participatory modelling and simulation as a way to provide a meaningful framework to enable actors to understand the interdependencies in peri-urban catchment management. A role-playing game, connecting the quantitative and qualitative dynamics of the resources with ...

  20. Numerical modelling approach for mine backfill

    Indian Academy of Sciences (India)

    Muhammad Zaka Emad

    2017-07-24

    Jul 24, 2017 ... Abstract. Numerical modelling is broadly used for assessing complex scenarios in underground mines, including mining sequence and blast-induced vibrations from production blasting. Sublevel stoping mining methods with delayed backfill are extensively used to exploit steeply dipping ore bodies by ...

  1. Energy and development : A modelling approach

    NARCIS (Netherlands)

    van Ruijven, B.J.|info:eu-repo/dai/nl/304834521

    2008-01-01

    Rapid economic growth of developing countries like India and China implies that these countries become important actors in the global energy system. Examples of this impact are the present-day oil shortages and rapidly increasing emissions of greenhouse gases. Global energy models are used to explore

  2. Numerical modelling approach for mine backfill

    Indian Academy of Sciences (India)

    Muhammad Zaka Emad

    2017-07-24

    Jul 24, 2017 ... pulse is applied as a stress history on the CRF stope. Blast wave data obtained from the on-site monitoring are very complex and require processing before being interpreted and used in numerical models. Generally, mining companies hire geophysics experts for interpretation of such data. The blast wave ...

  3. A new approach to model mixed hydrates

    Czech Academy of Sciences Publication Activity Database

    Hielscher, S.; Vinš, Václav; Jäger, A.; Hrubý, Jan; Breitkopf, C.; Span, R.

    2018-01-01

    Roč. 459, March (2018), s. 170-185 ISSN 0378-3812 R&D Projects: GA ČR(CZ) GA17-08218S Institutional support: RVO:61388998 Keywords: gas hydrate * mixture * modeling Subject RIV: BJ - Thermodynamics Impact factor: 2.473, year: 2016 https://www.sciencedirect.com/science/article/pii/S0378381217304983

  4. Different approach to the modeling of nonfree particle diffusion

    Science.gov (United States)

    Buhl, Niels

    2018-03-01

    A new approach to the modeling of nonfree particle diffusion is presented. The approach uses a general setup based on geometric graphs (networks of curves), which means that particle diffusion in anything from arrays of barriers and pore networks to general geometric domains can be considered and that the (free random walk) central limit theorem can be generalized to cover also the nonfree case. The latter gives rise to a continuum-limit description of the diffusive motion where the effect of partially absorbing barriers is accounted for in a natural and non-Markovian way that, in contrast to the traditional approach, quantifies the absorptivity of a barrier in terms of a dimensionless parameter in the range 0 to 1. The generalized theorem gives two general analytic expressions for the continuum-limit propagator: an infinite sum of Gaussians and an infinite sum of plane waves. These expressions entail the known method-of-images and Laplace eigenfunction expansions as special cases and show how the presence of partially absorbing barriers can lead to phenomena such as line splitting and band gap formation in the plane wave wave-number spectrum.
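
    To make the two expansions concrete, the familiar fully reflecting special case on an interval [0, L] (a standard textbook result, quoted here for orientation rather than the paper's generalized partially absorbing propagator) can be written either as an image sum of Gaussians or as an eigenfunction (plane-wave) series:

        p(x,t \mid x_0) = \sum_{n=-\infty}^{\infty} \left[ G(x - x_0 - 2nL,\,t) + G(x + x_0 - 2nL,\,t) \right],
        \qquad G(u,t) = \frac{1}{\sqrt{4\pi D t}}\, e^{-u^2/(4Dt)},

        p(x,t \mid x_0) = \frac{1}{L} + \frac{2}{L} \sum_{n=1}^{\infty} \cos\!\left(\frac{n\pi x}{L}\right) \cos\!\left(\frac{n\pi x_0}{L}\right) e^{-D (n\pi/L)^2 t}.

    The paper's contribution is to quantify partial absorptivity via a dimensionless barrier parameter in the range 0 to 1, with the fully reflecting case above as one limit.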

  5. Bioavailability of particulate metal to zebra mussels: biodynamic modelling shows that assimilation efficiencies are site-specific.

    Science.gov (United States)

    Bourgeault, Adeline; Gourlay-Francé, Catherine; Priadi, Cindy; Ayrault, Sophie; Tusseau-Vuillemin, Marie-Hélène

    2011-12-01

    This study investigates the ability of the biodynamic model to predict the trophic bioaccumulation of cadmium (Cd), chromium (Cr), copper (Cu), nickel (Ni) and zinc (Zn) in a freshwater bivalve. Zebra mussels were transplanted to three sites along the Seine River (France) and collected monthly for 11 months. Measurements of the metal body burdens in mussels were compared with the predictions from the biodynamic model. The exchangeable fraction of metal particles did not account for the bioavailability of particulate metals, since it did not capture the differences between sites. The assimilation efficiency (AE) parameter is necessary to take into account biotic factors influencing particulate metal bioavailability. The biodynamic model, applied with AEs from the literature, overestimated the measured concentrations in zebra mussels, the extent of overestimation being site-specific. Therefore, an original methodology was proposed for in situ AE measurements for each site and metal. Copyright © 2011 Elsevier Ltd. All rights reserved.

  6. Thin inclusion approach for modelling of heterogeneous conducting materials

    Science.gov (United States)

    Lavrov, Nikolay; Smirnova, Alevtina; Gorgun, Haluk; Sammes, Nigel

    Experimental data show that the heterogeneous nanostructure of solid oxide and polymer electrolyte fuel cells can be approximated as an infinite set of fiber-like or penny-shaped inclusions in a continuous medium. Inclusions can be arranged in a cluster mode and in regular or random order. In the newly proposed theoretical model of nanostructured material, most attention is paid to the small aspect ratio of the structural elements as well as to some model problems of electrostatics. The proposed integral equation for the electric potential caused by the charge distributed over a single circular or elliptic cylindrical conductor of finite length, as a single unit of a nanostructured material, has been asymptotically simplified for small aspect ratio and solved numerically. The result demonstrates that the surface density changes slightly in the middle part of the thin domain and has boundary layers localized near the edges. It is anticipated that the contribution of the boundary-layer solution to the surface density is significant and cannot be captured by the classic equation for a smooth linear charge. The role of the cross-section shape is also investigated. The proposed approach is sufficiently simple and robust, and allows extension to either regular or irregular systems of various inclusions. This approach can be used for the development of the system of conducting inclusions, which are commonly present in nanostructured materials used for solid oxide and polymer electrolyte fuel cell (PEMFC) materials.

  7. A Modeling Approach for Plastic-Metal Laser Direct Joining

    Science.gov (United States)

    Lutey, Adrian H. A.; Fortunato, Alessandro; Ascari, Alessandro; Romoli, Luca

    2017-09-01

    Laser processing has been identified as a feasible approach to direct joining of metal and plastic components without the need for adhesives or mechanical fasteners. The present work sees development of a modeling approach for conduction and transmission laser direct joining of these materials based on multi-layer optical propagation theory and numerical heat flow simulation. The scope of this methodology is to predict process outcomes based on the calculated joint interface and upper surface temperatures. Three representative cases are considered for model verification, including conduction joining of PBT and aluminum alloy, transmission joining of optically transparent PET and stainless steel, and transmission joining of semi-transparent PA 66 and stainless steel. Conduction direct laser joining experiments are performed on black PBT and 6082 anticorodal aluminum alloy, achieving shear loads of over 2000 N with specimens of 2 mm thickness and 25 mm width. Comparison with simulation results shows that consistently high strength is achieved where the peak interface temperature is above the plastic degradation temperature. Comparison of transmission joining simulations and published experimental results confirms these findings and highlights the influence of plastic layer optical absorption on process feasibility.

  8. Integration models: multicultural and liberal approaches confronted

    Science.gov (United States)

    Janicki, Wojciech

    2012-01-01

    European societies have been shaped by their Christian past, an upsurge of international migration, democratic rule and a liberal tradition rooted in religious tolerance. Accelerating globalization processes impose new challenges on European societies striving to protect their diversity. This struggle is especially clearly visible in the case of minorities trying to resist melting into the mainstream culture. European countries' legal systems and cultural policies respond to these efforts in many ways. Respecting identity-politics-driven group rights seems to be the most common approach, resulting in the creation of a multicultural society. However, the outcome of respecting group rights may be remarkably contradictory both to individual rights growing out of the liberal tradition and to the reinforced concept of integrating immigrants into host societies. This paper discusses the upturn of identity politics in the context of both individual rights and the integration of European societies.

  9. Modelling thermal plume impacts - Kalpakkam approach

    International Nuclear Information System (INIS)

    Rao, T.S.; Anup Kumar, B.; Narasimhan, S.V.

    2002-01-01

    A good understanding of temperature patterns in the receiving waters is essential to know the heat dissipation from thermal plumes originating from coastal power plants. The seasonal temperature profiles of the Kalpakkam coast near Madras Atomic Power Station (MAPS) thermal out fall site are determined and analysed. It is observed that the seasonal current reversal in the near shore zone is one of the major mechanisms for the transport of effluents away from the point of mixing. To further refine our understanding of the mixing and dilution processes, it is necessary to numerically simulate the coastal ocean processes by parameterising the key factors concerned. In this paper, we outline the experimental approach to achieve this objective. (author)

  10. Modelling approach for photochemical pollution studies

    International Nuclear Information System (INIS)

    Silibello, C.; Catenacci, G.; Calori, G.; Crapanzano, G.; Pirovano, G.

    1996-01-01

    The comprehension of the relationships between primary pollutant emissions and secondary pollutant concentrations and deposition is necessary to design policies and strategies for the maintenance of a healthy environment. The use of mathematical models is a powerful tool to assess the effect of emissions and of the physical and chemical transformations of pollutants on air quality. A photochemical model, Calgrid, developed by CARB (California Air Resources Board), has been used to test the effect of different meteorological and air quality scenarios on ozone concentration levels. This way we can evaluate the influence of these conditions and determine the most important chemical species and reactions in the atmosphere. The ozone levels are strongly related to the reactive hydrocarbon concentrations and to the solar radiation flux

  11. Colour texture segmentation using modelling approach

    Czech Academy of Sciences Publication Activity Database

    Haindl, Michal; Mikeš, Stanislav

    2005-01-01

    Roč. 3687, č. - (2005), s. 484-491 ISSN 0302-9743. [International Conference on Advances in Pattern Recognition /3./. Bath, 22.08.2005-25.08.2005] R&D Projects: GA MŠk 1M0572; GA AV ČR 1ET400750407; GA AV ČR IAA2075302 Institutional research plan: CEZ:AV0Z10750506 Keywords : colour texture segmentation * image models * segmentation benchmark Subject RIV: BD - Theory of Information

  12. Tumour resistance to cisplatin: a modelling approach

    International Nuclear Information System (INIS)

    Marcu, L; Bezak, E; Olver, I; Doorn, T van

    2005-01-01

    Although chemotherapy has revolutionized the treatment of haematological tumours, in many common solid tumours the success has been limited. Some of the reasons for the limitations are: the timing of drug delivery, resistance to the drug, repopulation between cycles of chemotherapy and the lack of complete understanding of the pharmacokinetics and pharmacodynamics of a specific agent. Cisplatin is among the most effective cytotoxic agents used in head and neck cancer treatments. When modelling cisplatin as a single agent, the properties of cisplatin only have to be taken into account, reducing the number of assumptions that are considered in the generalized chemotherapy models. The aim of the present paper is to model the biological effect of cisplatin and to simulate the consequence of cisplatin resistance on tumour control. The 'treated' tumour is a squamous cell carcinoma of the head and neck, previously grown by computer-based Monte Carlo techniques. The model maintained the biological constitution of a tumour through the generation of stem cells, proliferating cells and non-proliferating cells. Cell kinetic parameters (mean cell cycle time, cell loss factor, thymidine labelling index) were also consistent with the literature. A sensitivity study on the contribution of various mechanisms leading to drug resistance is undertaken. To quantify the extent of drug resistance, the cisplatin resistance factor (CRF) is defined as the ratio between the number of surviving cells of the resistant population and the number of surviving cells of the sensitive population, determined after the same treatment time. It is shown that there is a supra-linear dependence of CRF on the percentage of cisplatin-DNA adducts formed, and a sigmoid-like dependence between CRF and the percentage of cells killed in resistant tumours. Drug resistance is shown to be a cumulative process which eventually can overcome tumour regression leading to treatment failure
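
    The resistance measure defined above can be stated compactly: with S_res(t) and S_sens(t) denoting the numbers of surviving cells in the resistant and sensitive populations after the same treatment time t, the cisplatin resistance factor is

        CRF(t) = \frac{S_{\mathrm{res}}(t)}{S_{\mathrm{sens}}(t)},

    so CRF > 1 indicates that the resistant population retains proportionally more surviving cells than the sensitive one under identical treatment.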

  13. 14 Days of supplementation with blueberry extract shows anti-atherogenic properties and improves oxidative parameters in hypercholesterolemic rats model.

    Science.gov (United States)

    Ströher, Deise Jaqueline; Escobar Piccoli, Jacqueline da Costa; Güllich, Angélica Aparecida da Costa; Pilar, Bruna Cocco; Coelho, Ritiéle Pinto; Bruno, Jamila Benvegnú; Faoro, Debora; Manfredini, Vanusa

    2015-01-01

    The effects of supplementation with blueberry (BE) extract (Vaccinium ashei Reade) for 14 consecutive days on biochemical, hematological, histopathological and oxidative parameters in hypercholesterolemic rats were investigated. After supplementation with lyophilized BE extract, the levels of total cholesterol, low-density lipoprotein cholesterol and triglycerides were decreased. Histopathological analysis showed a significant decrease (p < 0.05) of aortic lesions in hypercholesterolemic rats. Oxidative parameters showed significant reductions (p < 0.05) in oxidative damage to lipids and proteins and an increase in the activities of antioxidant enzymes such as catalase, superoxide dismutase and glutathione peroxidase. The BE extract showed an important cardioprotective effect through improvements in the serum lipid profile and antioxidant system, particularly by reducing the oxidative stress associated with hypercholesterolemia, and an anti-atherogenic effect in rats.

  14. Advanced imaging techniques show progressive arthropathy following experimentally induced knee bleeding in a factor VIII-/- rat model

    DEFF Research Database (Denmark)

    Sorensen, K. R.; Roepstorff, K.; Petersen, M.

    2015-01-01

    Background: Joint pathology is most commonly assessed by radiography, but ultrasonography (US) is increasingly recognized for its accessibility, safety and ability to show soft tissue changes, the earliest indicators of haemophilic arthropathy (HA). US, however, lacks the ability to visualize...

  15. Automated home cage assessment shows behavioral changes in a transgenic mouse model of spinocerebellar ataxia type 17.

    Science.gov (United States)

    Portal, Esteban; Riess, Olaf; Nguyen, Huu Phuc

    2013-08-01

    Spinocerebellar Ataxia type 17 (SCA17) is an autosomal dominantly inherited, neurodegenerative disease characterized by ataxia, involuntary movements, and dementia. A novel SCA17 mouse model having a 71 polyglutamine repeat expansion in the TATA-binding protein (TBP) has shown an age-related motor deficit using a classic motor test, yet a concomitant weight increase might be a confounding factor for this measurement. In this study we used an automated home cage system to test several motor readouts for this same model to confirm pathological behavior results and evaluate the benefits of the automated home cage in behavior phenotyping. Our results confirm motor deficits in the Tbp/Q71 mice and present previously unrecognized behavioral characteristics obtained from the automated home cage, indicating its use for high-throughput screening and testing, e.g. of therapeutic compounds. Copyright © 2013 Elsevier B.V. All rights reserved.

  16. BO-1055, a novel DNA cross-linking agent with remarkable low myelotoxicity shows potent activity in sarcoma models

    OpenAIRE

    Ambati, Srikanth R.; Shieh, Jae-Hung; Pera, Benet; Lopes, Eloisi Caldas; Chaudhry, Anisha; Wong, Elissa W.P.; Saxena, Ashish; Su, Tsann-Long; Moore, Malcolm A.S.

    2016-01-01

    DNA damaging agents cause rapid shrinkage of tumors and form the basis of chemotherapy for sarcomas despite significant toxicities. Drugs having superior efficacy and wider therapeutic windows are needed to improve patient outcomes. We used cell proliferation and apoptosis assays in sarcoma cell lines and benign cells; γ-H2AX expression, comet assay, immunoblot analyses and drug combination studies in vitro and in patient derived xenograft (PDX) models. BO-1055 caused apoptosis and cell death...

  17. Betting on change: Tenet deal with Vanguard shows it's primed to try ACO effort, new payment model.

    Science.gov (United States)

    Kutscher, Beth

    2013-07-01

    Tenet Healthcare Corp.'s acquisition of Vanguard Health Systems is a sign the investor-owned chain is willing to take a chance on alternative payment models such as accountable care organizations. There's no certainty that ACOs will deliver the improvements on quality or cost savings, but Vanguard Vice Chairman Keith Pitts, left, says his system's Pioneer ACO in Detroit has already achieved some cost savings.

  18. Restless legs syndrome model Drosophila melanogaster show successful olfactory learning and 1-day retention of the acquired memory

    OpenAIRE

    Mika F. Asaba; Adrian A. Bates; Hoa M. Dao; Mika J. Maeda

    2013-01-01

    Restless Legs Syndrome (RLS) is a prevalent but poorly understood disorder that is characterized by uncontrollable movements during sleep, resulting in sleep disturbance. Olfactory memory in Drosophila melanogaster has proven to be a useful tool for the study of cognitive deficits caused by sleep disturbances, such as those seen in RLS. A recently generated Drosophila model of RLS exhibited disturbed sleep patterns similar to those seen in humans with RLS. This research seeks to improve understand...

  19. Agribusiness model approach to territorial food development

    Directory of Open Access Journals (Sweden)

    Murcia Hector Horacio

    2011-04-01

    Full Text Available

    Several research efforts have been coordinated from the academic program of Agricultural Business Management of the University De La Salle (Bogota D.C.) toward the design and implementation of a sustainable agribusiness model applied to food development, with territorial projection. Rural development is considered as a process that aims to improve the current capacity and potential of the inhabitants of the sector, which refers not only to production levels and productivity of agricultural items. It takes into account the guidelines of the United Nations "Millennium Development Goals" and considers the concept of sustainable food and agriculture development, including food security and nutrition in an integrated interdisciplinary context, with a holistic and systemic dimension. The analysis is specified by a model with an emphasis on sustainable agribusiness production chains related to agricultural food items in a specific region. This model was correlated with farm (technical objectives), family (social purposes) and community (collective orientations) projects. Within this dimension, food development concepts and methodologies of Participatory Action Research (PAR) are considered. Finally, it addresses the need to link the results to low-income communities, within the concepts of the "new rurality".

  20. Smeared crack modelling approach for corrosion-induced concrete damage

    DEFF Research Database (Denmark)

    Thybo, Anna Emilie Anusha; Michel, Alexander; Stang, Henrik

    2017-01-01

    In this paper a smeared crack modelling approach is used to simulate corrosion-induced damage in reinforced concrete. The presented modelling approach utilizes a thermal analogy to mimic the expansive nature of solid corrosion products, while taking into account the penetration of corrosion products into the surrounding concrete, non-uniform precipitation of corrosion products, and creep. To demonstrate the applicability of the presented modelling approach, numerical predictions in terms of corrosion-induced deformations as well as formation and propagation of micro- and macrocracks were...
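
    The thermal analogy mentioned above is commonly realized by prescribing the expansion of the corroding layer as an equivalent isotropic thermal eigenstrain; a generic form of this mapping (the usual modelling trick, not the specific calibration of this paper) is

        \varepsilon_{ij}^{\mathrm{exp}} = \alpha\,\Delta T\,\delta_{ij},

    with the fictitious product α ΔT chosen so that the imposed free expansion reproduces the volume increase of the solid corrosion products relative to the consumed steel.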

  1. Applied Regression Modeling A Business Approach

    CERN Document Server

    Pardoe, Iain

    2012-01-01

    An applied and concise treatment of statistical regression techniques for business students and professionals who have little or no background in calculus. Regression analysis is an invaluable statistical methodology in business settings and is vital to model the relationship between a response variable and one or more predictor variables, as well as the prediction of a response value given values of the predictors. In view of the inherent uncertainty of business processes, such as the volatility of consumer spending and the presence of market uncertainty, business professionals use regression a
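
    The model class such a treatment is built around can be written, in its standard multiple linear form (included here for orientation), as

        y_i = \beta_0 + \beta_1 x_{i1} + \cdots + \beta_p x_{ip} + \varepsilon_i, \qquad \varepsilon_i \sim N(0, \sigma^2),

    where y_i is the response, x_{i1}, ..., x_{ip} are the predictor values, and the fitted coefficients are then used to predict a response value for given values of the predictors.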

  2. A modular approach to numerical human body modeling

    NARCIS (Netherlands)

    Forbes, P.A.; Griotto, G.; Rooij, L. van

    2007-01-01

    The choice of a human body model for a simulated automotive impact scenario must take into account both accurate model response and computational efficiency as key factors. This study presents a "modular numerical human body modeling" approach which allows the creation of a customized human body

  3. Mathematical Modeling in Mathematics Education: Basic Concepts and Approaches

    Science.gov (United States)

    Erbas, Ayhan Kürsat; Kertil, Mahmut; Çetinkaya, Bülent; Çakiroglu, Erdinç; Alacaci, Cengiz; Bas, Sinem

    2014-01-01

    Mathematical modeling and its role in mathematics education have been receiving increasing attention in Turkey, as in many other countries. The growing body of literature on this topic reveals a variety of approaches to mathematical modeling and related concepts, along with differing perspectives on the use of mathematical modeling in teaching and…

  4. Implicit moral evaluations: A multinomial modeling approach.

    Science.gov (United States)

    Cameron, C Daryl; Payne, B Keith; Sinnott-Armstrong, Walter; Scheffer, Julian A; Inzlicht, Michael

    2017-01-01

    Implicit moral evaluations-i.e., immediate, unintentional assessments of the wrongness of actions or persons-play a central role in supporting moral behavior in everyday life. Yet little research has employed methods that rigorously measure individual differences in implicit moral evaluations. In five experiments, we develop a new sequential priming measure-the Moral Categorization Task-and a multinomial model that decomposes judgment on this task into multiple component processes. These include implicit moral evaluations of moral transgression primes (Unintentional Judgment), accurate moral judgments about target actions (Intentional Judgment), and a directional tendency to judge actions as morally wrong (Response Bias). Speeded response deadlines reduced Intentional Judgment but not Unintentional Judgment (Experiment 1). Unintentional Judgment was stronger toward moral transgression primes than non-moral negative primes (Experiments 2-4). Intentional Judgment was associated with increased error-related negativity, a neurophysiological indicator of behavioral control (Experiment 4). Finally, people who voted for an anti-gay marriage amendment had stronger Unintentional Judgment toward gay marriage primes (Experiment 5). Across Experiments 1-4, implicit moral evaluations converged with moral personality: Unintentional Judgment about wrong primes, but not negative primes, was negatively associated with psychopathic tendencies and positively associated with moral identity and guilt proneness. Theoretical and practical applications of formal modeling for moral psychology are discussed. Copyright © 2016 Elsevier B.V. All rights reserved.
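
    For illustration, here is a minimal Python sketch of a multinomial-processing-tree of the kind described, fitted by maximum likelihood. The tree structure, trial counts and starting values are assumptions chosen for demonstration, not the authors' exact Moral Categorization Task model.

      import numpy as np
      from scipy.optimize import minimize

      # Hedged sketch of a generic multinomial-processing-tree (MPT), not the
      # published model. Parameters:
      #   I = Intentional Judgment   (probability of judging the target correctly)
      #   U = Unintentional Judgment (probability the prime drives the response)
      #   B = Response Bias          (probability of guessing "morally wrong")
      def p_wrong(I, U, B, prime_wrong, target_wrong):
          """Predicted probability of a 'morally wrong' response for one trial type."""
          return (I * target_wrong
                  + (1 - I) * U * prime_wrong
                  + (1 - I) * (1 - U) * B)

      # toy observed counts of "wrong" responses out of 100 trials per cell,
      # cells indexed by (prime_wrong, target_wrong)
      observed = {(1, 1): 90, (1, 0): 30, (0, 1): 70, (0, 0): 10}

      def neg_log_lik(theta):
          I, U, B = theta
          ll = 0.0
          for (pw, tw), k in observed.items():
              p = np.clip(p_wrong(I, U, B, pw, tw), 1e-9, 1 - 1e-9)
              ll += k * np.log(p) + (100 - k) * np.log(1 - p)
          return -ll

      fit = minimize(neg_log_lik, x0=[0.5, 0.3, 0.5], bounds=[(0.01, 0.99)] * 3)
      print(dict(zip(["Intentional", "Unintentional", "Bias"], fit.x.round(2))))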

  5. Keyring models: An approach to steerability

    Science.gov (United States)

    Miller, Carl A.; Colbeck, Roger; Shi, Yaoyun

    2018-02-01

    If a measurement is made on one half of a bipartite system, then, conditioned on the outcome, the other half has a new reduced state. If these reduced states defy classical explanation—that is, if shared randomness cannot produce these reduced states for all possible measurements—the bipartite state is said to be steerable. Determining which states are steerable is a challenging problem even for low dimensions. In the case of two-qubit systems, a criterion is known for T-states (that is, those with maximally mixed marginals) under projective measurements. In the current work, we introduce the concept of keyring models—a special class of local hidden state models. When the measurements made correspond to real projectors, these allow us to study steerability beyond T-states. Using keyring models, we completely solve the steering problem for real projective measurements when the state arises from mixing a pure two-qubit state with uniform noise. We also give a partial solution in the case when the uniform noise is replaced by independent depolarizing channels.

  6. Dynamics and control of quadcopter using linear model predictive control approach

    Science.gov (United States)

    Islam, M.; Okasha, M.; Idres, M. M.

    2017-12-01

    This paper investigates the dynamics and control of a quadcopter using the Model Predictive Control (MPC) approach. The dynamic model is of high fidelity and nonlinear, with six degrees of freedom that include disturbances and model uncertainties. The control approach is developed based on MPC to track different reference trajectories, ranging from simple circular paths to complex helical trajectories. In this control technique, a linearized model is derived and the receding horizon method is applied to generate the optimal control sequence. Although MPC is computationally expensive, it is highly effective in dealing with different types of nonlinearities and constraints, such as actuator saturation and model uncertainties. The MPC parameters (control and prediction horizons) are selected by a trial-and-error approach. Several simulation scenarios are performed to examine and evaluate the performance of the proposed control approach using the MATLAB and Simulink environment. Simulation results show that this control approach is highly effective in tracking a given reference trajectory.
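
    As a rough illustration of the receding-horizon idea, the following Python sketch applies unconstrained linear MPC to a toy double-integrator plant; the dynamics, weights and horizon are assumed values, not the six-degree-of-freedom quadcopter model of the record.

      import numpy as np

      # Hedged sketch: linear receding-horizon MPC on a double integrator,
      # state x = [position, velocity], scalar input (all values assumed).
      A = np.array([[1.0, 0.1], [0.0, 1.0]])   # discretized dynamics, dt = 0.1 s
      B = np.array([[0.005], [0.1]])
      Q = np.diag([10.0, 1.0])                 # state tracking weight
      R = np.array([[0.1]])                    # control effort weight
      N = 20                                   # prediction horizon

      def mpc_step(x0, x_ref):
          """Solve the unconstrained finite-horizon problem, return the first input."""
          n, m = A.shape[0], B.shape[1]
          # stacked prediction: X = Sx @ x0 + Su @ U
          Sx = np.vstack([np.linalg.matrix_power(A, k + 1) for k in range(N)])
          Su = np.zeros((N * n, N * m))
          for i in range(N):
              for j in range(i + 1):
                  Su[i*n:(i+1)*n, j*m:(j+1)*m] = np.linalg.matrix_power(A, i - j) @ B
          Qbar = np.kron(np.eye(N), Q)
          Rbar = np.kron(np.eye(N), R)
          Xref = np.tile(x_ref, N)
          # the quadratic cost in U has a closed-form minimizer (no constraints here)
          H = Su.T @ Qbar @ Su + Rbar
          f = Su.T @ Qbar @ (Sx @ x0 - Xref)
          U = np.linalg.solve(H, -f)
          return U[:m]                         # receding horizon: apply only the first input

      x = np.array([0.0, 0.0])
      for _ in range(50):                      # closed-loop tracking of position 1.0
          u = mpc_step(x, np.array([1.0, 0.0]))
          x = A @ x + B @ u
      print(x)                                 # state should approach [1, 0]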

  7. Functional RG approach to the Potts model

    Science.gov (United States)

    Ben Alì Zinati, Riccardo; Codello, Alessandro

    2018-01-01

    The critical behavior of the (n+1)-state Potts model in d dimensions is studied with functional renormalization group techniques. We devise a general method to derive β-functions for continuous values of d and n, and we write the flow equation for the effective potential (LPA’) when instead n is fixed. We calculate several critical exponents, which are found to be in good agreement with Monte Carlo simulations and ɛ-expansion results available in the literature. In particular, we focus on Percolation (n → 0) and Spanning Forest (n → −1), which are the only non-trivial universality classes in d = 4, 5 and where our methods converge faster.

  8. Fusion modeling approach for novel plasma sources

    International Nuclear Information System (INIS)

    Melazzi, D; Manente, M; Pavarin, D; Cardinali, A

    2012-01-01

    The physics involved in the coupling, propagation and absorption of RF helicon waves (electronic whistler) in low temperature Helicon plasma sources is investigated by solving the 3D Maxwell-Vlasov model equations using a WKB asymptotic expansion. The reduced set of equations is formally Hamiltonian and allows for the reconstruction of the wave front of the propagating wave, monitoring along the calculation that the WKB expansion remains satisfied. This method can be fruitfully employed in a new investigation of the power deposition mechanisms involved in common Helicon low temperature plasma sources when a general confinement magnetic field configuration is allowed, unveiling new physical insight in the wave propagation and absorption phenomena and stimulating further research for the design of innovative and more efficient low temperature plasma sources. A brief overview of this methodology and its capabilities has been presented in this paper.

  9. Carbonate rock depositional models: A microfacies approach

    Energy Technology Data Exchange (ETDEWEB)

    Carozzi, A.V.

    1988-01-01

    Carbonate rocks contain more than 50% by weight carbonate minerals such as calcite, dolomite, and siderite. Understanding how these rocks form can lead to more efficient methods of petroleum exploration. Microfacies analysis techniques can be used as a method of predicting models of sedimentation for carbonate rocks. Microfacies in carbonate rocks can be seen clearly only in thin sections under a microscope. Thin section analysis of carbonate rocks is a tool that can be used to understand depositional environments, diagenetic evolution of carbonate rocks, and the formation of porosity and permeability in carbonate rocks. Microfacies analysis techniques are applied to understanding the origin and formation of carbonate ramps, carbonate platforms, and carbonate slopes and basins. This book will be of interest to students and professionals concerned with the disciplines of sedimentary petrology, sedimentology, petroleum geology, and paleontology.

  10. Wind Turbine Control: Robust Model Based Approach

    DEFF Research Database (Denmark)

    Mirzaei, Mahmood

    . This is because, on the one hand, control methods can decrease the cost of energy by keeping the turbine close to its maximum efficiency. On the other hand, they can reduce structural fatigue and therefore increase the lifetime of the wind turbine. The power produced by a wind turbine is proportional...... to the square of its rotor radius, therefore it seems reasonable to increase the size of the wind turbine in order to capture more power. However as the size increases, the mass of the blades increases by cube of the rotor size. This means in order to keep structural feasibility and mass of the whole structure...... reasonable, the ratio of mass to size should be reduced. This trend results in more flexible structures. Control of the flexible structure of a wind turbine in a wind field with stochastic nature is very challenging. In this thesis we are examining a number of robust model based methods for wind turbine...

  11. Risk prediction model: Statistical and artificial neural network approach

    Science.gov (United States)

    Paiman, Nuur Azreen; Hariri, Azian; Masood, Ibrahim

    2017-04-01

    Prediction models are increasingly gaining popularity and have been used in numerous areas of study to complement and support clinical reasoning and decision making. The adoption of such models assists physicians' decision making and individuals' behavior, and consequently improves individual outcomes and the cost-effectiveness of care. The objective of this paper is to review articles related to risk prediction models in order to understand the suitable approach, development and validation process of such models. A qualitative review of the aims, methods and significant main outcomes of nineteen published articles that developed risk prediction models in numerous fields was done. This paper also reviews how researchers develop and validate risk prediction models based on statistical and artificial neural network approaches. From the review, some methodological recommendations for developing and validating prediction models are highlighted. According to the studies reviewed, the artificial neural network approach to developing prediction models was more accurate than the statistical approach. However, only limited published literature currently discusses which approach is more accurate for risk prediction model development.
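
    The comparison described above can be sketched with off-the-shelf tools. The following Python example contrasts a statistical model (logistic regression) with a small artificial neural network on synthetic data; the dataset and hyper-parameters are illustrative assumptions, not settings taken from the reviewed articles.

      # Hedged sketch: discrimination (AUC) of two risk prediction approaches
      # on a synthetic binary-outcome dataset.
      from sklearn.datasets import make_classification
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score
      from sklearn.model_selection import train_test_split
      from sklearn.neural_network import MLPClassifier

      X, y = make_classification(n_samples=2000, n_features=12, n_informative=6,
                                 random_state=0)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

      models = {
          "logistic regression (statistical)": LogisticRegression(max_iter=1000),
          "neural network (ANN)": MLPClassifier(hidden_layer_sizes=(16, 8),
                                                max_iter=2000, random_state=0),
      }
      for name, model in models.items():
          model.fit(X_tr, y_tr)
          auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
          print(f"{name}: validation AUC = {auc:.3f}")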

  12. A dual model approach to ground water recovery trench design

    International Nuclear Information System (INIS)

    Clodfelter, C.L.; Crouch, M.S.

    1992-01-01

    The design of trenches for contaminated ground water recovery must consider several variables. This paper presents a dual-model approach for effectively recovering contaminated ground water migrating toward a trench by advection. The approach involves an analytical model to determine the vertical influence of the trench and a numerical flow model to determine the capture zone within the trench and the surrounding aquifer. The analytical model is utilized by varying trench dimensions and head values to design a trench which meets the remediation criteria. The numerical flow model is utilized to select the type of backfill and location of sumps within the trench. The dual-model approach can be used to design a recovery trench which effectively captures advective migration of contaminants in the vertical and horizontal planes

  13. Dynamical system approach to running Λ cosmological models

    International Nuclear Information System (INIS)

    Stachowski, Aleksander; Szydlowski, Marek

    2016-01-01

    We study the dynamics of cosmological models with a time dependent cosmological term. We consider five classes of models; two with the non-covariant parametrization of the cosmological term Λ: Λ(H)CDM cosmologies, Λ(a)CDM cosmologies, and three with the covariant parametrization of Λ: Λ(R)CDM cosmologies, where R(t) is the Ricci scalar, Λ(φ)-cosmologies with diffusion, Λ(X)-cosmologies, where X = ½ g^{αβ}∇_αφ∇_βφ is the kinetic part of the density of the scalar field. We also consider the case of an emergent Λ(a) relation obtained from the behaviour of trajectories in a neighbourhood of an invariant submanifold. In the study of the dynamics we used dynamical system methods for investigating how an evolutionary scenario can depend on the choice of special initial conditions. We show that the methods of dynamical systems allow one to investigate all admissible solutions of a running Λ cosmology for all initial conditions. We interpret Alcaniz and Lima's approach as a scaling cosmology. We formulate the idea of an emergent cosmological term derived directly from an approximation of the exact dynamics. We show that some non-covariant parametrization of the cosmological term like Λ(a), Λ(H) gives rise to the non-physical behaviour of trajectories in the phase space. This behaviour disappears if the term Λ(a) is emergent from the covariant parametrization. (orig.)

  14. Pridopidine, a dopamine stabilizer, improves motor performance and shows neuroprotective effects in Huntington disease R6/2 mouse model.

    Science.gov (United States)

    Squitieri, Ferdinando; Di Pardo, Alba; Favellato, Mariagrazia; Amico, Enrico; Maglione, Vittorio; Frati, Luigi

    2015-11-01

    Huntington disease (HD) is a neurodegenerative disorder for which new treatments are urgently needed. Pridopidine is a new dopaminergic stabilizer, recently developed for the treatment of motor symptoms associated with HD. The therapeutic effect of pridopidine in patients with HD has been determined in two double-blind randomized clinical trials, however, whether pridopidine exerts neuroprotection remains to be addressed. The main goal of this study was to define the potential neuroprotective effect of pridopidine, in HD in vivo and in vitro models, thus providing evidence that might support a potential disease-modifying action of the drug and possibly clarifying other aspects of pridopidine mode-of-action. Our data corroborated the hypothesis of neuroprotective action of pridopidine in HD experimental models. Administration of pridopidine protected cells from apoptosis, and resulted in highly improved motor performance in R6/2 mice. The anti-apoptotic effect observed in the in vitro system highlighted neuroprotective properties of the drug, and advanced the idea of sigma-1-receptor as an additional molecular target implicated in the mechanism of action of pridopidine. Coherent with protective effects, pridopidine-mediated beneficial effects in R6/2 mice were associated with an increased expression of pro-survival and neurostimulatory molecules, such as brain derived neurotrophic factor and DARPP32, and with a reduction in the size of mHtt aggregates in striatal tissues. Taken together, these findings support the theory of pridopidine as molecule with disease-modifying properties in HD and advance the idea of a valuable therapeutic strategy for effectively treating the disease. © 2015 The Authors. Journal of Cellular and Molecular Medicine published by John Wiley & Sons Ltd and Foundation for Cellular and Molecular Medicine.

  15. Simple queueing approach to segregation dynamics in Schelling model

    OpenAIRE

    Sobkowicz, Pawel

    2007-01-01

    A simple queueing approach for the segregation of agents in a modified one-dimensional Schelling segregation model is presented. The goal is to arrive at a simple formula for the number of unhappy agents remaining after the segregation.

  16. Virtuous organization: A structural equation modeling approach

    Directory of Open Access Journals (Sweden)

    Majid Zamahani

    2013-02-01

    Full Text Available For years, the idea of virtue was unfavorable among researchers and virtues were traditionally considered as culture-specific, relativistic and they were supposed to be associated with social conservatism, religious or moral dogmatism, and scientific irrelevance. Virtue and virtuousness have been recently considered seriously among organizational researchers. The proposed study of this paper examines the relationships between leadership, organizational culture, human resource, structure and processes, care for community and virtuous organization. Structural equation modeling is employed to investigate the effects of each variable on other components. The data used in this study consists of questionnaire responses from employees in Payam e Noor University in Yazd province. A total of 250 questionnaires were sent out and a total of 211 valid responses were received. Our results have revealed that all the five variables have positive and significant impacts on virtuous organization. Among the five variables, organizational culture has the most direct impact (0.80) and human resource has the most total impact (0.844) on virtuous organization.

  17. A systemic approach for modeling soil functions

    Science.gov (United States)

    Vogel, Hans-Jörg; Bartke, Stephan; Daedlow, Katrin; Helming, Katharina; Kögel-Knabner, Ingrid; Lang, Birgit; Rabot, Eva; Russell, David; Stößel, Bastian; Weller, Ulrich; Wiesmeier, Martin; Wollschläger, Ute

    2018-03-01

    The central importance of soil for the functioning of terrestrial systems is increasingly recognized. Critically relevant for water quality, climate control, nutrient cycling and biodiversity, soil provides more functions than just the basis for agricultural production. Nowadays, soil is increasingly under pressure as a limited resource for the production of food, energy and raw materials. This has led to an increasing demand for concepts assessing soil functions so that they can be adequately considered in decision-making aimed at sustainable soil management. The various soil science disciplines have progressively developed highly sophisticated methods to explore the multitude of physical, chemical and biological processes in soil. It is not obvious, however, how the steadily improving insight into soil processes may contribute to the evaluation of soil functions. Here, we present a new systemic modeling framework that allows for a consistent coupling of reductionist yet observable indicators for soil functions with detailed process understanding. It is based on the mechanistic relationships between soil functional attributes, each explained by a network of interacting processes as derived from scientific evidence. The non-linear character of these interactions produces stability and resilience of soil with respect to functional characteristics. We anticipate that this new conceptual framework will integrate the various soil science disciplines and help identify important future research questions at the interface between disciplines. It allows the overwhelming complexity of soil systems to be adequately coped with and paves the way for steadily improving our capability to assess soil functions based on scientific understanding.

  18. A Constructive Neural-Network Approach to Modeling Psychological Development

    Science.gov (United States)

    Shultz, Thomas R.

    2012-01-01

    This article reviews a particular computational modeling approach to the study of psychological development--that of constructive neural networks. This approach is applied to a variety of developmental domains and issues, including Piagetian tasks, shift learning, language acquisition, number comparison, habituation of visual attention, concept…

  19. Towards Translating Graph Transformation Approaches by Model Transformations

    NARCIS (Netherlands)

    Hermann, F.; Kastenberg, H.; Modica, T.; Karsai, G.; Taentzer, G.

    2006-01-01

    Recently, many researchers are working on semantics preserving model transformation. In the field of graph transformation one can think of translating graph grammars written in one approach to a behaviourally equivalent graph grammar in another approach. In this paper we translate graph grammars

  20. Optimizing technology investments: a broad mission model approach

    Science.gov (United States)

    Shishko, R.

    2003-01-01

    A long-standing problem in NASA is how to allocate scarce technology development resources across advanced technologies in order to best support a large set of future potential missions. Within NASA, two orthogonal paradigms have received attention in recent years: the real-options approach and the broad mission model approach. This paper focuses on the latter.

  1. A generalized quarter car modelling approach with frame flexibility ...

    Indian Academy of Sciences (India)

    ... mass distribution and damping. Here we propose a generalized quarter-car modelling approach, incorporating both the frame as well as other-wheel ground contacts. Our approach is linear, uses Laplace transforms, involves vertical motions of key points of interest and has intermediate complexity with improved realism.

  2. MASKED AREAS IN SHEAR PEAK STATISTICS: A FORWARD MODELING APPROACH

    Energy Technology Data Exchange (ETDEWEB)

    Bard, D. [KIPAC, SLAC National Accelerator Laboratory, 2575 Sand Hill Rd, Menlo Park, CA 94025 (United States); Kratochvil, J. M. [Astrophysics and Cosmology Research Unit, University of KwaZulu-Natal, Westville, Durban 4000 (South Africa); Dawson, W., E-mail: djbard@slac.stanford.edu [Lawrence Livermore National Laboratory, 7000 East Ave, Livermore, CA 94550 (United States)

    2016-03-10

    The statistics of shear peaks have been shown to provide valuable cosmological information beyond the power spectrum, and will be an important constraint of models of cosmology in forthcoming astronomical surveys. Surveys include masked areas due to bright stars, bad pixels etc., which must be accounted for in producing constraints on cosmology from shear maps. We advocate a forward-modeling approach, where the impacts of masking and other survey artifacts are accounted for in the theoretical prediction of cosmological parameters, rather than correcting survey data to remove them. We use masks based on the Deep Lens Survey, and explore the impact of up to 37% of the survey area being masked on LSST and DES-scale surveys. By reconstructing maps of aperture mass the masking effect is smoothed out, resulting in up to 14% smaller statistical uncertainties compared to simply reducing the survey area by the masked area. We show that, even in the presence of large survey masks, the bias in cosmological parameter estimation produced in the forward-modeling process is ≈1%, dominated by bias caused by limited simulation volume. We also explore how this potential bias scales with survey area and evaluate how much small survey areas are impacted by the differences in cosmological structure in the data and simulated volumes, due to cosmic variance.

  3. Parameter Estimation of Structural Equation Modeling Using Bayesian Approach

    Directory of Open Access Journals (Sweden)

    Dewi Kurnia Sari

    2016-05-01

    Full Text Available Leadership is a process of influencing, directing or setting an example for employees in order to achieve the objectives of the organization, and is a key element in the effectiveness of the organization. In addition to the style of leadership, the success of an organization or company in achieving its objectives can also be influenced by organizational commitment, the commitment created by each individual for the betterment of the organization. The purpose of this research is to obtain a model of leadership style and organizational commitment in relation to job satisfaction and employee performance, and to determine the factors that influence job satisfaction and employee performance, using SEM with a Bayesian approach. This research was conducted on Statistics FNI employees in Malang, with 15 people. The results showed that, in the measurement model, all indicators significantly measure their latent variables. Meanwhile, in the structural model it was concluded that Leadership Style and Organizational Commitment have a significant direct effect on Job Satisfaction, and that Job Satisfaction has a significant effect on Employee Performance. The direct influence of Leadership Style and Organizational Commitment on Employee Performance was declared insignificant.

  4. An interdisciplinary approach to modeling tritium transfer into the environment

    International Nuclear Information System (INIS)

    Galeriu, D; Melintescu, A.

    2005-01-01

    equations between soil and plants. Considering mammals, we recently showed that the simplistic models currently applied did not accurately match experimental data from rats and sheep. Specific data for many farm and wild animals are scarce. In this paper, we are advancing a different approach based on energy metabolism, which can be parameterized predominantly based on published metabolic data for mature mammals. We started with the observation that the measured dynamics of 14 C and non-exchangeable organically bound tritium (OBT) were, not surprisingly, similar. We therefore introduced a metabolic definition for the 14 C and OBT loss rate (assumed to be the same) from the whole body and specific organs. We assumed that this was given by the specific metabolic rate of the whole body or organ, divided by the enthalpy of combustion of a kilogram of fresh matter. Since basal metabolism data were taken from the literature, they were modified for energy expenditure above basal need. To keep the model simple, organs were grouped according to their metabolic activity or importance in the food chain. Pools considered were viscera (high metabolic rate organs except the brain), muscle, adipose tissue, blood, and other (all other tissues). We disregarded any detail on substrate utilization from the dietary intake and condensed the postprandial respiration in a single rate. We included considerations of net maintenance and growth needs. For tritium, the transfer between body water and organic compartments was modeled using knowledge of basic metabolism and published relations. We considered the potential influence of rumen digestion and bacterial protein in ruminants. As for model application, we focused on laboratory and farm animals, where some experimental data were available. The model performed well for rat muscle, viscera and adipose tissue, but due to the simplicity of model structure and assumptions, blood and urine data were only satisfactorily reproduced. Whilst for sheep fed

  5. Zonulin transgenic mice show altered gut permeability and increased morbidity/mortality in the DSS colitis model.

    Science.gov (United States)

    Sturgeon, Craig; Lan, Jinggang; Fasano, Alessio

    2017-06-01

    Increased small intestinal permeability (IP) has been proposed to be an integral element, along with genetic makeup and environmental triggers, in the pathogenesis of chronic inflammatory diseases (CIDs). We identified zonulin as a master regulator of intercellular tight junctions linked to the development of several CIDs. We aim to study the role of zonulin-mediated IP in the pathogenesis of CIDs. Zonulin transgenic Hp2 mice (Ztm) were subjected to dextran sodium sulfate (DSS) treatment for 7 days, followed by 4-7 days' recovery and compared to C57Bl/6 (wild-type (WT)) mice. IP was measured in vivo and ex vivo, and weight, histology, and survival were monitored. To mechanistically link zonulin-dependent impairment of small intestinal barrier function with clinical outcome, Ztm were treated with the zonulin inhibitor AT1001 added to drinking water in addition to DSS. We observed increased morbidity (more pronounced weight loss and colitis) and mortality (40-70% compared with 0% in WT) at 11 days post-DSS treatment in Ztm compared with WT mice. Both in vivo and ex vivo measurements showed an increased IP at baseline in Ztm compared to WT mice, which was exacerbated by DSS treatment and was associated with upregulation of zonulin gene expression (fourfold in the duodenum, sixfold in the jejunum). Treatment with AT1001 prevented the DSS-induced increase in IP both in vivo and ex vivo without changing zonulin gene expression and completely reverted morbidity and mortality in Ztm. Our data show that zonulin-dependent small intestinal barrier impairment is an early step leading to the break of tolerance with subsequent development of CIDs. © 2017 New York Academy of Sciences.

  6. Numerical approaches to expansion process modeling

    Directory of Open Access Journals (Sweden)

    G. V. Alekseev

    2017-01-01

    Full Text Available Forage production is currently undergoing a period of intensive renovation and introduction of the most advanced technologies and equipment. Methods such as barley toasting, grain extrusion, steaming and flattening of grain, fluidized-bed explosion, infrared treatment of cereals and legumes followed by flattening, and one-time or two-time granulation of purified whole grain without humidification in matrix presses followed by grinding of the granules are used more and more often. These methods require special apparatuses, machines and auxiliary equipment, created on the basis of mathematical models compiled by different methods. In roasting, simulation of the heat fields arising in the working chamber provides conditions under which a portion of the starch decomposes to monosaccharides, which makes the grain sweetish, although protein denaturation somewhat decreases the digestibility of the protein and the availability of amino acids. Grain is roasted mainly for young animals in order to teach them to eat feed at an early age, to stimulate the secretory activity of digestion, and to better develop the masticatory muscles. In addition, the high temperature is detrimental to bacterial contamination and various types of fungi, which largely avoids possible diseases of the gastrointestinal tract. This method has found wide application directly on farms. Legumes such as peas, soy, lupine and lentils are also used in animal feeding. These feeds are preliminarily ground and then cooked for 1 hour or steamed for 30–40 minutes in the feed mill. Such processing inactivates the anti-nutrients that reduce the effectiveness of their use. After processing, legumes are used as protein supplements in an amount of 25–30% of the total nutritional value of the diet. It is recommended to cook and steam only grain of good quality. A poor-quality grain that has been stored for a long time and damaged by pathogenic micro flora is subject to

  7. Graphical approach to model reduction for nonlinear biochemical networks.

    Science.gov (United States)

    Holland, David O; Krainak, Nicholas C; Saucerman, Jeffrey J

    2011-01-01

    Model reduction is a central challenge to the development and analysis of multiscale physiology models. Advances in model reduction are needed not only for computational feasibility but also for obtaining conceptual insights from complex systems. Here, we introduce an intuitive graphical approach to model reduction based on phase plane analysis. Timescale separation is identified by the degree of hysteresis observed in phase-loops, which guides a "concentration-clamp" procedure for estimating explicit algebraic relationships between species equilibrating on fast timescales. The primary advantages of this approach over Jacobian-based timescale decomposition are that: 1) it incorporates nonlinear system dynamics, and 2) it can be easily visualized, even directly from experimental data. We tested this graphical model reduction approach using a 25-variable model of cardiac β(1)-adrenergic signaling, obtaining 6- and 4-variable reduced models that retain good predictive capabilities even in response to new perturbations. These 6 signaling species appear to be optimal "kinetic biomarkers" of the overall β(1)-adrenergic pathway. The 6-variable reduced model is well suited for integration into multiscale models of heart function, and more generally, this graphical model reduction approach is readily applicable to a variety of other complex biological systems.
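
    A minimal Python sketch of the underlying idea, timescale separation checked through the phase-loop followed by a quasi-steady-state reduction, on a toy two-variable system; the functions and timescales are assumptions, not the 25-variable β(1)-adrenergic model.

      import numpy as np
      from scipy.integrate import solve_ivp

      # Hedged sketch: y relaxes quickly toward f(x); a nearly closed (low-hysteresis)
      # phase-loop justifies replacing dy/dt by the algebraic relation y = f(x).
      tau = 0.05                               # fast timescale for y (assumption)
      f = lambda x: x**2 / (0.2 + x**2)        # assumed fast-variable nullcline

      def full_model(t, s):
          x, y = s
          u = 0.5 * (1 + np.sin(t))            # slow periodic drive
          return [u - x, (f(x) - y) / tau]

      def reduced_model(t, s):
          x = s[0]
          u = 0.5 * (1 + np.sin(t))
          return [u - x]                       # y is recovered algebraically as f(x)

      t_eval = np.linspace(0, 20, 2000)
      full = solve_ivp(full_model, (0, 20), [0.1, 0.0], t_eval=t_eval)
      red = solve_ivp(reduced_model, (0, 20), [0.1], t_eval=t_eval)

      x_full, y_full = full.y
      print("hysteresis proxy (max deviation of y from its quasi-steady-state curve):",
            np.max(np.abs(y_full - f(x_full))))
      print("max error of the reduced model in y:",
            np.max(np.abs(f(red.y[0]) - y_full)))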

  8. Graphical approach to model reduction for nonlinear biochemical networks.

    Directory of Open Access Journals (Sweden)

    David O Holland

    Full Text Available Model reduction is a central challenge to the development and analysis of multiscale physiology models. Advances in model reduction are needed not only for computational feasibility but also for obtaining conceptual insights from complex systems. Here, we introduce an intuitive graphical approach to model reduction based on phase plane analysis. Timescale separation is identified by the degree of hysteresis observed in phase-loops, which guides a "concentration-clamp" procedure for estimating explicit algebraic relationships between species equilibrating on fast timescales. The primary advantages of this approach over Jacobian-based timescale decomposition are that: 1) it incorporates nonlinear system dynamics, and 2) it can be easily visualized, even directly from experimental data. We tested this graphical model reduction approach using a 25-variable model of cardiac β(1)-adrenergic signaling, obtaining 6- and 4-variable reduced models that retain good predictive capabilities even in response to new perturbations. These 6 signaling species appear to be optimal "kinetic biomarkers" of the overall β(1)-adrenergic pathway. The 6-variable reduced model is well suited for integration into multiscale models of heart function, and more generally, this graphical model reduction approach is readily applicable to a variety of other complex biological systems.

  9. Data Analysis A Model Comparison Approach, Second Edition

    CERN Document Server

    Judd, Charles M; Ryan, Carey S

    2008-01-01

    This completely rewritten classic text features many new examples, insights and topics including mediational, categorical, and multilevel models. Substantially reorganized, this edition provides a briefer, more streamlined examination of data analysis. Noted for its model-comparison approach and unified framework based on the general linear model, the book provides readers with a greater understanding of a variety of statistical procedures. This consistent framework, including consistent vocabulary and notation, is used throughout to develop fewer but more powerful model building techniques. T

  10. A Model Management Approach for Co-Simulation Model Evaluation

    NARCIS (Netherlands)

    Zhang, X.C.; Broenink, Johannes F.; Filipe, Joaquim; Kacprzyk, Janusz; Pina, Nuno

    2011-01-01

    Simulating formal models is a common means for validating the correctness of the system design and reduce the time-to-market. In most of the embedded control system design, multiple engineering disciplines and various domain-specific models are often involved, such as mechanical, control, software

  11. Amniotic fluid stem cells with low γ-interferon response showed behavioral improvement in Parkinsonism rat model.

    Directory of Open Access Journals (Sweden)

    Yu-Jen Chang

    Full Text Available Amniotic fluid stem cells (AFSCs) are multipotent stem cells that may be used in transplantation medicine. In this study, AFSCs established from amniocentesis were characterized on the basis of surface marker expression and differentiation potential. To further investigate the properties of AFSCs for translational applications, we examined the cell surface expression of human leukocyte antigens (HLA) of these cells and estimated the therapeutic effect of AFSCs in parkinsonian rats. The expression profiles of HLA-II and transcription factors were compared between AFSCs and bone marrow-derived mesenchymal stem cells (BMMSCs) following treatment with γ-IFN. We found that stimulation of AFSCs with γ-IFN prompted only a slight increase in the expression of HLA-Ia and HLA-E, and the rare HLA-II expression could also be observed in most AFSC samples. Consequently, the expression of CIITA and RFX5 was weakly induced by γ-IFN stimulation of AFSCs compared to that of BMMSCs. In the transplantation test, Sprague Dawley rats with 6-hydroxydopamine lesioning of the substantia nigra were used as a parkinsonian-animal model. Following injection of AFSCs with a negative γ-IFN response, apomorphine-induced rotation was reduced by 75% in AFSC-engrafted parkinsonian rats but was increased by 53% in the control group at 12 weeks post-transplantation. The implanted AFSCs were viable, and were able to migrate into the brain's circuitry and express specific proteins of dopamine neurons, such as tyrosine hydroxylase and dopamine transporter. In conclusion, the relative insensitivity of AFSCs to γ-IFN implies that AFSCs might have immune-tolerance in γ-IFN inflammatory conditions. Furthermore, the effective improvement of apomorphine-induced rotation after AFSC transplantation paves the way for clinical application in parkinsonian therapy.

  12. Box-wing model approach for solar radiation pressure modelling in a multi-GNSS scenario

    Science.gov (United States)

    Tobias, Guillermo; Jesús García, Adrián

    2016-04-01

    The solar radiation pressure force is the largest orbital perturbation after the gravitational effects and the major error source affecting GNSS satellites. A wide range of approaches have been developed over the years for modelling this non-gravitational effect as part of the orbit determination process. These approaches are commonly divided into empirical, semi-analytical and analytical, where their main difference lies in the amount of a-priori physical information about the properties of the satellites (materials and geometry) and their attitude. It has been shown in the past that pre-launch analytical models fail to achieve the desired accuracy, mainly due to difficulties in the extrapolation of the in-orbit optical and thermal properties, perturbations in the nominal attitude law and the aging of the satellite's surfaces, whereas the accuracy of empirical models strongly depends on the amount of tracking data used to derive them, and their performance is reduced as the area-to-mass ratio of the GNSS satellites increases, as happens for the upcoming constellations such as BeiDou and Galileo. This paper proposes the use of a basic box-wing model for Galileo, complemented with empirical parameters, based on the limited available information about the Galileo satellites' geometry. The satellite is modelled as a box, representing the satellite bus, and a wing representing the solar panel. The performance of the model is assessed for the GPS, GLONASS and Galileo constellations. The results of the proposed approach have been analyzed over a one-year period. In order to assess the results, two different SRP models have been used: firstly, the proposed box-wing model and, secondly, the new CODE empirical model, ECOM2. The orbit performance of both models is assessed using Satellite Laser Ranging (SLR) measurements, together with the evaluation of the orbit prediction accuracy. This comparison shows the advantages and disadvantages of
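
    For illustration, the flat-plate formulation commonly used in box-wing SRP models can be sketched as follows; the surface areas, optical coefficients and spacecraft mass are placeholder assumptions, not Galileo values.

      import numpy as np

      # Hedged sketch of a flat-plate box-wing SRP model (illustrative numbers only).
      P_SUN = 4.56e-6          # solar radiation pressure at 1 AU [N/m^2]
      MASS = 700.0             # spacecraft mass [kg] (assumption)

      # each surface: (area [m^2], outward unit normal in body frame, specular rho, diffuse delta)
      SURFACES = [
          (1.5, np.array([1.0, 0.0, 0.0]), 0.2, 0.3),    # bus +X face
          (1.5, np.array([-1.0, 0.0, 0.0]), 0.2, 0.3),   # bus -X face
          (3.0, np.array([0.0, 0.0, 1.0]), 0.1, 0.1),    # bus +Z face
          (11.0, np.array([0.0, 1.0, 0.0]), 0.1, 0.2),   # solar-panel "wing" (nominally Sun-pointing)
      ]

      def srp_acceleration(sun_dir):
          """SRP acceleration [m/s^2] in the body frame for a given Sun direction."""
          s = sun_dir / np.linalg.norm(sun_dir)          # unit vector surface -> Sun
          f_total = np.zeros(3)
          for area, n, rho, delta in SURFACES:
              cos_t = np.dot(n, s)
              if cos_t <= 0.0:                           # surface not illuminated
                  continue
              absorbed = 1.0 - rho - delta
              # standard flat-plate model: absorbed + diffuse flux pushes along -s,
              # specular reflection and diffuse re-emission push along -n
              f_total += -P_SUN * area * cos_t * (
                  (absorbed + delta) * s + 2.0 * (rho * cos_t + delta / 3.0) * n
              )
          return f_total / MASS

      print(srp_acceleration(np.array([1.0, 0.3, 0.2])))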

  13. Catching the Behavior of Stock Market: Numerical Approach to Estimate the Catalytic Chemical Model Parameters

    Directory of Open Access Journals (Sweden)

    Zaäfri Ananto Husodo

    2015-04-01

    Full Text Available This research proposes a numerical approach to estimating the trend of behavior of the stock market. The approach is applied to a model inspired by a catalytic chemical model, expressed in terms of differential equations, on four composite indices, the New York Stock Exchange, Hong Kong Hang Seng, Straits Times Index, and Jakarta Stock Exchange, as suggested by Caetano and Yoneyama (2011). The approach is used to minimize the difference between the indices estimated by the model and the actual data set. The result shows that the estimation is able to capture the trend of behavior of the stock market well.
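
    The numerical estimation idea can be sketched as a least-squares fit of ODE parameters to an observed series; the two-variable system and synthetic "index" below are illustrative assumptions, not the exact model or indices of the record.

      import numpy as np
      from scipy.integrate import odeint
      from scipy.optimize import minimize

      # Hedged sketch: fit parameters of a small catalytic-style ODE system so that
      # its first component matches an observed series in the least-squares sense.
      def model(state, t, k1, k2):
          x, y = state                       # x: observed index level, y: latent "catalyst"
          dx = k1 * x * y - k2 * x
          dy = -k1 * x * y + k2 * x
          return [dx, dy]

      t = np.linspace(0, 10, 50)
      true_params = (0.8, 0.5)
      observed = odeint(model, [1.0, 1.0], t, args=true_params)[:, 0]
      observed = observed + np.random.default_rng(0).normal(0, 0.01, observed.shape)

      def sse(params):                       # sum of squared differences to the data
          sim = odeint(model, [1.0, 1.0], t, args=tuple(params))[:, 0]
          return np.sum((sim - observed) ** 2)

      fit = minimize(sse, x0=[0.5, 0.3], method="Nelder-Mead")
      print("estimated parameters:", fit.x)  # should be close to (0.8, 0.5)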

  14. A novel approach to modeling and diagnosing the cardiovascular system

    Energy Technology Data Exchange (ETDEWEB)

    Keller, P.E.; Kangas, L.J.; Hashem, S.; Kouzes, R.T. [Pacific Northwest Lab., Richland, WA (United States); Allen, P.A. [Life Link, Richland, WA (United States)

    1995-07-01

    A novel approach to modeling and diagnosing the cardiovascular system is introduced. A model exhibits a subset of the dynamics of the cardiovascular behavior of an individual by using a recurrent artificial neural network. Potentially, a model will be incorporated into a cardiovascular diagnostic system. This approach is unique in that each cardiovascular model is developed from physiological measurements of an individual. Any differences between the modeled variables and the variables of an individual at a given time are used for diagnosis. This approach also exploits sensor fusion to optimize the utilization of biomedical sensors. The advantage of sensor fusion has been demonstrated in applications including control and diagnostics of mechanical and chemical processes.

  15. Scientific Approach and Inquiry Learning Model in the Topic of Buffer Solution: A Content Analysis

    Science.gov (United States)

    Kusumaningrum, I. A.; Ashadi, A.; Indriyanti, N. Y.

    2017-09-01

    Many concepts in buffer solution cause student misconceptions. Understanding science concepts should involve applying the scientific approach, and one learning model that is suitable for this approach is inquiry. Content analysis was used to determine textbook compatibility with the scientific approach and the inquiry learning model for the concept of buffer solution. Using scientific indicator tools (SIT) and inquiry indicator tools (IIT), we analyzed three grade-11 senior high school chemistry textbooks, labeled as P, Q, and R, and described their compatibility with the scientific approach and the inquiry learning model for the concept of buffer solution. The results show that textbooks P and Q were very poor and textbook R was sufficient, because the textbooks remain at a procedural level. Chemistry textbooks used at school need to be improved in terms of the scientific approach and the inquiry learning model. The results of these analyses might be of interest for writing future textbooks.

  16. A model-driven approach to information security compliance

    Science.gov (United States)

    Correia, Anacleto; Gonçalves, António; Teodoro, M. Filomena

    2017-06-01

    The availability, integrity and confidentiality of information are fundamental to the long-term survival of any organization. Information security is a complex issue that must be holistically approached, combining assets that support corporate systems, in an extended network of business partners, vendors, customers and other stakeholders. This paper addresses the conception and implementation of information security systems, conforming to the ISO/IEC 27000 set of standards, using the model-driven approach. The process begins with the conception of a domain level model (computation independent model) based on information security vocabulary present in the ISO/IEC 27001 standard. Based on this model, after embedding in the model mandatory rules for attaining ISO/IEC 27001 conformance, a platform independent model is derived. Finally, a platform specific model serves as the basis for testing the compliance of information security systems with the ISO/IEC 27000 set of standards.

  17. Reusable Component Model Development Approach for Parallel and Distributed Simulation

    Science.gov (United States)

    Zhu, Feng; Yao, Yiping; Chen, Huilong; Yao, Feng

    2014-01-01

    Model reuse is a key issue to be resolved in parallel and distributed simulation at present. However, component models built by different domain experts usually have diversiform interfaces, couple tightly, and bind with simulation platforms closely. As a result, they are difficult to be reused across different simulation platforms and applications. To address the problem, this paper first proposed a reusable component model framework. Based on this framework, then our reusable model development approach is elaborated, which contains two phases: (1) domain experts create simulation computational modules observing three principles to achieve their independence; (2) model developer encapsulates these simulation computational modules with six standard service interfaces to improve their reusability. The case study of a radar model indicates that the model developed using our approach has good reusability and it is easy to be used in different simulation platforms and applications. PMID:24729751

  18. Mathematical models for therapeutic approaches to control HIV disease transmission

    CERN Document Server

    Roy, Priti Kumar

    2015-01-01

    The book discusses different therapeutic approaches based on different mathematical models to control HIV/AIDS disease transmission. It uses clinical data, collected from different cited sources, to formulate deterministic as well as stochastic mathematical models of HIV/AIDS. It provides complementary approaches, from deterministic and stochastic points of view, to optimal control strategies with perfect drug adherence, and also seeks viewpoints on the same issue from different angles, ranging from various mathematical models to computer simulations. The book presents essential methods and techniques for students who are interested in designing epidemiological models of HIV/AIDS. It also guides research scientists working in the periphery of mathematical modeling and helps them to explore a hypothetical method by examining its consequences in the form of a mathematical model and making some scientific predictions. The model equations, mathematical analysis and several numerical simulations that are...

  19. BUSINESS MODEL IN ELECTRICITY INDUSTRY USING BUSINESS MODEL CANVAS APPROACH; THE CASE OF PT. XYZ

    Directory of Open Access Journals (Sweden)

    Achmad Arief Wicaksono

    2017-01-01

    Full Text Available The magnitude of opportunities and project values of the electricity system in Indonesia encourages PT. XYZ to develop its business in the electrical sector, which requires business development strategies. This study aims to identify the company's business model using the Business Model Canvas approach, formulate business development strategy alternatives, and determine the prioritized business development strategy appropriate to the manufacturing business model of PT. XYZ. This study utilized a descriptive approach and the nine elements of the Business Model Canvas. Alternative formulation and priority determination of the strategies were obtained by using Strengths, Weaknesses, Opportunities, Threats (SWOT) analysis and pairwise comparison. The results of this study are the improvement of the Business Model Canvas on the elements of key resources, key activities, key partners and customer segments. In terms of SWOT analysis on the nine elements of the Business Model Canvas for the first business development, the results show an expansion on the power plant construction project as the main contractor, an increase in sales in its core business in the supporting equipment industry of oil and gas, and a development in the second business, i.e. an investment in the electricity sector as an independent renewable energy-based power producer. For its first business development, PT. XYZ selected three Business Model Canvas elements which become the priorities of the company, i.e. key resources weighing 0.252, key activities weighing 0.240, and key partners weighing 0.231. For its second business development, the company selected three elements to become its priorities, i.e. key partners weighing 0.225, customer segments weighing 0.217, and key resources weighing 0.215. Keywords: business model canvas, SWOT, pairwise comparison, business model

  20. Cellular communication and “non-targeted effects”: Modelling approaches

    Science.gov (United States)

    Ballarini, Francesca; Facoetti, Angelica; Mariotti, Luca; Nano, Rosanna; Ottolenghi, Andrea

    2009-10-01

    During the last decade, a large number of experimental studies on the so-called "non-targeted effects", in particular bystander effects, outlined that cellular communication plays a significant role in the pathways leading to radiobiological damage. Although it is known that two main types of cellular communication (i.e. via gap junctions and/or molecular messengers diffusing in the extra-cellular environment, such as cytokines, NO etc.) play a major role, it is of utmost importance to better understand the underlying mechanisms, and how such mechanisms can be modulated by ionizing radiation. Though the "final" goal is of course to elucidate the in vivo scenario, in the meanwhile also in vitro studies can provide useful insights. In the present paper we will discuss key issues on the mechanisms underlying non-targeted effects and cell communication, for which theoretical models and simulation codes can be of great help. In this framework, we will present in detail three literature models, as well as an approach under development at the University of Pavia. More specifically, we will first focus on a version of the "State-Vector Model" including bystander-induced apoptosis of initiated cells, which was successfully fitted to in vitro data on neoplastic transformation supporting the hypothesis of a protective bystander effect mediated by apoptosis. The second analyzed model, focusing on the kinetics of bystander effects in 3D tissues, was successfully fitted to data on bystander damage in an artificial 3D skin system, indicating a signal range of the order of 0.7-1 mm. A third model for bystander effect, taking into account of spatial location, cell killing and repopulation, showed dose-response curves increasing approximately linearly at low dose rates but quickly flattening out for higher dose rates, also predicting an effect augmentation following dose fractionation. Concerning the Pavia approach, which can model the release, diffusion and depletion/degradation of

  1. Stakeholder approach, Stakeholders mental model: A visualization test with cognitive mapping technique

    Directory of Open Access Journals (Sweden)

    Garoui Nassreddine

    2012-04-01

    Full Text Available The idea of this paper is to determine the mental models of actors in the firm with respect to the stakeholder approach to corporate governance. Cognitive maps are used to visualize these mental models and to show the ways of thinking about and conceptualizing the stakeholder approach. The paper takes a corporate governance perspective and discusses the stakeholder model, using a cognitive mapping technique.

  2. "I just had to be flexible and show good patience": management of interactional approaches to enact mentoring roles by peer mentors with developmental disabilities.

    Science.gov (United States)

    Schwartz, Ariel E; Kramer, Jessica M

    2017-06-08

    Peer mentoring may be an effective approach for fostering skill development for mentors and mentees with developmental disabilities. However, little is known about how mentors with developmental disabilities perceive and enact their roles. (1) How do young adults with developmental disabilities describe their role as a peer mentor in the context of instrumental peer mentoring? (2) How do they enact their perceived roles? Thematic analysis of semi-structured reflections completed by six mentors with developmental disabilities (ages 17-35) with multiple mentoring experiences. Mentors perceived themselves as professionals with a primary role of teaching, and for some mentoring relationships, a secondary role of developing an interpersonal relationship. To enact these roles, mentors used a supportive interactional approach characterized by actions such as encouragement and sharing examples and dispositions, such as flexibility and patience. Mentors monitored mentee learning and engagement within the mentoring session and, as needed, adjusted their approach to optimize mentee learning and engagement. To successfully manage their interactional approach, mentors used supports such as peer mentoring scripts, tip sheets, and supervisors. While mentors reported several actions for teaching, they may benefit from training to learn approaches to facilitate more consistent development of interpersonal relationships. Implications for Rehabilitation Peer mentoring may be an effective approach for fostering skill development for young adult mentors and mentees with developmental disabilities. In this study, young adult peer mentors with developmental disabilities perceived themselves as professionals with a primary role of teaching and a secondary role of developing an interpersonal relationship. Peer mentors used actions and dispositions that matched their perceived roles and supported mentees with developmental disabilities to engage in instrumental mentoring. With supports and

  3. Selection of hydrologic modeling approaches for climate change assessment: A comparison of model scale and structures

    Science.gov (United States)

    Surfleet, Christopher G.; Tullos, Desirèe; Chang, Heejun; Jung, Il-Won

    2012-09-01

    A wide variety of approaches to hydrologic (rainfall-runoff) modeling of river basins confounds our ability to select, develop, and interpret models, particularly in the evaluation of prediction uncertainty associated with climate change assessment. To inform the model selection process, we characterized and compared three structurally-distinct approaches and spatial scales of parameterization to modeling catchment hydrology: a large-scale approach (using the VIC model; 671,000 km2 area), a basin-scale approach (using the PRMS model; 29,700 km2 area), and a site-specific approach (the GSFLOW model; 4700 km2 area) forced by the same future climate estimates. For each approach, we present measures of fit to historic observations and predictions of future response, as well as estimates of model parameter uncertainty, when available. While the site-specific approach generally had the best fit to historic measurements, the performance of the model approaches varied. The site-specific approach generated the best fit at unregulated sites, the large scale approach performed best just downstream of flood control projects, and model performance varied at the farthest downstream sites where streamflow regulation is mitigated to some extent by unregulated tributaries and water diversions. These results illustrate how selection of a modeling approach and interpretation of climate change projections require (a) appropriate parameterization of the models for climate and hydrologic processes governing runoff generation in the area under study, (b) understanding and justifying the assumptions and limitations of the model, and (c) estimates of uncertainty associated with the modeling approach.

  4. Interoperable transactions in business models: A structured approach

    NARCIS (Netherlands)

    Weigand, H.; Verharen, E.; Dignum, F.P.M.

    1996-01-01

    Recent database research has given much attention to the specification of "flexible" transactions that can be used in interoperable systems. Starting from a quite different angle, Business Process Modelling has approached the area of communication modelling as well (the Language/Action

  5. A Model-Driven Approach to e-Course Management

    Science.gov (United States)

    Savic, Goran; Segedinac, Milan; Milenkovic, Dušica; Hrin, Tamara; Segedinac, Mirjana

    2018-01-01

    This paper presents research on using a model-driven approach to the development and management of electronic courses. We propose a course management system which stores a course model represented as distinct machine-readable components containing domain knowledge of different course aspects. Based on this formally defined platform-independent…

  6. Modeling Alaska boreal forests with a controlled trend surface approach

    Science.gov (United States)

    Mo Zhou; Jingjing Liang

    2012-01-01

    A Controlled Trend Surface approach was proposed to simultaneously take into consideration large-scale spatial trends and nonspatial effects. A geospatial model of the Alaska boreal forest was developed from 446 permanent sample plots, which addressed large-scale spatial trends in recruitment, diameter growth, and mortality. The model was tested on two sets of...

  7. Sensitivity analysis approaches applied to systems biology models.

    Science.gov (United States)

    Zi, Z

    2011-11-01

    With the rising application of systems biology, sensitivity analysis methods have been widely applied to study biological systems, including metabolic networks, signalling pathways and genetic circuits. Sensitivity analysis can provide valuable insights into how robust the biological responses are with respect to changes in biological parameters and which model inputs are the key factors that affect the model outputs. In addition, sensitivity analysis is valuable for guiding experimental analysis, model reduction and parameter estimation. Local and global sensitivity analysis approaches are the two types of sensitivity analysis that are commonly applied in systems biology. Local sensitivity analysis is a classic method that studies the impact of small perturbations on the model outputs. On the other hand, global sensitivity analysis approaches have been applied to understand how the model outputs are affected by large variations in the model input parameters. In this review, the author introduces the basic concepts of sensitivity analysis approaches applied to systems biology models. Moreover, the author discusses the advantages and disadvantages of different sensitivity analysis methods, how to choose a proper sensitivity analysis approach, the available sensitivity analysis tools for systems biology models and the caveats in the interpretation of sensitivity analysis results.
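
    To make the local/global distinction concrete, the minimal sketch below computes normalized local sensitivity coefficients by finite differences for a hypothetical Michaelis-Menten rate law; the model, parameter values and perturbation size are illustrative assumptions, not taken from the review. A global analysis would instead sample the parameters over wide ranges and attribute output variance to each input (e.g. with Sobol indices).

```python
# Minimal sketch of local sensitivity analysis by finite differences.
# The Michaelis-Menten rate law and all parameter values are illustrative
# assumptions, not taken from the reviewed article.
import numpy as np

def model_output(params, substrate=2.0):
    """Reaction rate v = Vmax * S / (Km + S) for a fixed substrate level."""
    vmax, km = params["Vmax"], params["Km"]
    return vmax * substrate / (km + substrate)

nominal = {"Vmax": 10.0, "Km": 0.5}
baseline = model_output(nominal)
delta = 0.01  # 1 % perturbation applied to each parameter in turn

for name, value in nominal.items():
    perturbed = dict(nominal)
    perturbed[name] = value * (1.0 + delta)
    # Normalized (relative) local sensitivity: (dY/Y) / (dP/P)
    sensitivity = ((model_output(perturbed) - baseline) / baseline) / delta
    print(f"{name:>5s}: normalized local sensitivity = {sensitivity:+.3f}")
```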

  8. Child human model development: a hybrid validation approach

    NARCIS (Netherlands)

    Forbes, P.A.; Rooij, L. van; Rodarius, C.; Crandall, J.

    2008-01-01

    The current study presents a development and validation approach of a child human body model that will help understand child impact injuries and improve the biofidelity of child anthropometric test devices. Due to the lack of fundamental child biomechanical data needed to fully develop such models a

  9. Refining the Committee Approach and Uncertainty Prediction in Hydrological Modelling

    NARCIS (Netherlands)

    Kayastha, N.

    2014-01-01

    Due to the complexity of hydrological systems, a single model may be unable to capture the full range of a catchment response and accurately predict the streamflows. The multi-modelling approach opens up possibilities for handling such difficulties and allows improvement of the predictive capability of

  11. Product Trial Processing (PTP): a model approach from ...

    African Journals Online (AJOL)

    Product Trial Processing (PTP): a model approach from the consumer's perspective. ... Global Journal of Social Sciences ... Among the constructs used in the model of the consumer's processing of product trial are: experiential and non-experiential attributes, perceived validity of product trial, consumer perceived expertise, ...

  12. A MIXTURE LIKELIHOOD APPROACH FOR GENERALIZED LINEAR-MODELS

    NARCIS (Netherlands)

    WEDEL, M; DESARBO, WS

    1995-01-01

    A mixture model approach is developed that simultaneously estimates the posterior membership probabilities of observations to a number of unobservable groups or latent classes, and the parameters of a generalized linear model which relates the observations, distributed according to some member of

  13. A distributed delay approach for modeling delayed outcomes in pharmacokinetics and pharmacodynamics studies.

    Science.gov (United States)

    Hu, Shuhua; Dunlavey, Michael; Guzy, Serge; Teuscher, Nathan

    2018-04-01

    A distributed delay approach was proposed in this paper to model delayed outcomes in pharmacokinetics and pharmacodynamics studies. This approach was shown to be general enough to incorporate a wide array of pharmacokinetic and pharmacodynamic models as special cases including transit compartment models, effect compartment models, typical absorption models (either zero-order or first-order absorption), and a number of atypical (or irregular) absorption models (e.g., parallel first-order, mixed first-order and zero-order, inverse Gaussian, and Weibull absorption models). Real-life examples were given to demonstrate how to implement distributed delays in Phoenix ® NLME™ 8.0, and to numerically show the advantages of the distributed delay approach over the traditional methods.
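
    The transit compartment model named above as a special case can be written as a short ODE chain and integrated directly; the sketch below does so numerically. The number of compartments, the rate constants and the dose are illustrative assumptions, and the code is not the Phoenix NLME implementation referred to in the abstract.

```python
# Minimal sketch of a transit-compartment chain, one of the delay models the
# distributed delay approach generalizes. Compartment count, rate constants
# and the dose are illustrative assumptions.
import numpy as np
from scipy.integrate import solve_ivp

N_TRANSIT = 4      # number of transit compartments
KTR = 1.5          # transfer rate constant (1/h)
KE = 0.3           # elimination rate from the central compartment (1/h)
DOSE = 100.0

def rhs(t, y):
    """y[0..N-1] are transit compartments, y[N] is the central compartment."""
    dy = np.empty_like(y)
    dy[0] = -KTR * y[0]
    for i in range(1, N_TRANSIT):
        dy[i] = KTR * (y[i - 1] - y[i])
    dy[N_TRANSIT] = KTR * y[N_TRANSIT - 1] - KE * y[N_TRANSIT]
    return dy

y0 = np.zeros(N_TRANSIT + 1)
y0[0] = DOSE
sol = solve_ivp(rhs, (0.0, 24.0), y0, t_eval=np.linspace(0, 24, 9))
for t, central in zip(sol.t, sol.y[N_TRANSIT]):
    print(f"t = {t:5.1f} h   central amount = {central:7.2f}")
```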

  14. A New Approach to Model Pitch Perception Using Sparse Coding.

    Directory of Open Access Journals (Sweden)

    Oded Barzelay

    2017-01-01

    Full Text Available Our acoustical environment abounds with repetitive sounds, some of which are related to pitch perception. It is still unknown how the auditory system, in processing these sounds, relates a physical stimulus and its percept. Since, in mammals, all auditory stimuli are conveyed into the nervous system through the auditory nerve (AN) fibers, a model should explain the perception of pitch as a function of this particular input. However, pitch perception is invariant to certain features of the physical stimulus. For example, a missing fundamental stimulus with resolved or unresolved harmonics, or a low and a high-amplitude stimulus with the same spectral content: these all give rise to the same percept of pitch. In contrast, the AN representations for these different stimuli are not invariant to these effects. In fact, due to saturation and non-linearity of both cochlear and inner hair cell responses, these differences are enhanced by the AN fibers. Thus there is a difficulty in explaining how pitch percept arises from the activity of the AN fibers. We introduce a novel approach for extracting pitch cues from the AN population activity for a given arbitrary stimulus. The method is based on a technique known as sparse coding (SC). It is the representation of pitch cues by a few spatiotemporal atoms (templates) from among a large set of possible ones (a dictionary). The amount of activity of each atom is represented by a non-zero coefficient, analogous to an active neuron. Such a technique has been successfully applied to other modalities, particularly vision. The model is composed of a cochlear model, an SC processing unit, and a harmonic sieve. We show that the model copes with different pitch phenomena: extracting resolved and non-resolved harmonics, missing fundamental pitches, stimuli with both high and low amplitudes, iterated rippled noises, and recorded musical instruments.
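
    As a minimal illustration of the sparse coding idea (a signal represented by a few non-zero coefficients over a large dictionary of atoms), the sketch below runs a greedy matching pursuit on a synthetic signal. The random dictionary, the signal and the use of matching pursuit are illustrative assumptions; this is not the AN-population model of the paper.

```python
# Minimal sketch of sparse coding by greedy matching pursuit: a signal is
# approximated by a few atoms from an overcomplete dictionary, analogous to a
# few active spatiotemporal templates. Dictionary and signal are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_atoms, n_active = 64, 128, 3

dictionary = rng.standard_normal((n_samples, n_atoms))
dictionary /= np.linalg.norm(dictionary, axis=0)      # unit-norm atoms

true_idx = rng.choice(n_atoms, size=n_active, replace=False)
signal = dictionary[:, true_idx] @ np.array([2.0, -1.5, 1.0])

residual = signal.copy()
coefficients = np.zeros(n_atoms)
for _ in range(n_active):                              # greedy atom selection
    correlations = dictionary.T @ residual
    best = int(np.argmax(np.abs(correlations)))
    coefficients[best] += correlations[best]
    residual -= correlations[best] * dictionary[:, best]

print("selected atoms:", np.nonzero(coefficients)[0], " true atoms:", sorted(true_idx))
print("residual norm:", np.linalg.norm(residual))
```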

  15. Meta-analysis a structural equation modeling approach

    CERN Document Server

    Cheung, Mike W-L

    2015-01-01

    Presents a novel approach to conducting meta-analysis using structural equation modeling. Structural equation modeling (SEM) and meta-analysis are two powerful statistical methods in the educational, social, behavioral, and medical sciences. They are often treated as two unrelated topics in the literature. This book presents a unified framework on analyzing meta-analytic data within the SEM framework, and illustrates how to conduct meta-analysis using the metaSEM package in the R statistical environment. Meta-Analysis: A Structural Equation Modeling Approach begins by introducing the impo

  16. A study of multidimensional modeling approaches for data warehouse

    Science.gov (United States)

    Yusof, Sharmila Mat; Sidi, Fatimah; Ibrahim, Hamidah; Affendey, Lilly Suriani

    2016-08-01

    A data warehouse system is used to support the process of organizational decision making. Hence, the system must extract and integrate information from heterogeneous data sources in order to uncover relevant knowledge suitable for the decision-making process. However, the development of a data warehouse is a difficult and complex process, especially in its conceptual design (multidimensional modeling). Thus, various approaches have been proposed to overcome the difficulty. This study surveys and compares the approaches to multidimensional modeling and highlights the issues, trends and solutions proposed to date. The contribution is a state-of-the-art review of multidimensional modeling design.

  17. Numerical linked-cluster approach to quantum lattice models.

    Science.gov (United States)

    Rigol, Marcos; Bryant, Tyler; Singh, Rajiv R P

    2006-11-03

    We present a novel algorithm that allows one to obtain temperature dependent properties of quantum lattice models in the thermodynamic limit from exact diagonalization of small clusters. Our numerical linked-cluster approach provides a systematic framework to assess finite-size effects and is valid for any quantum lattice model. Unlike high temperature expansions, which have a finite radius of convergence in inverse temperature, these calculations are accurate at all temperatures provided the range of correlations is finite. We illustrate the power of our approach studying spin models on kagomé, triangular, and square lattices.
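
    The inclusion-exclusion step at the heart of a linked-cluster expansion can be stated compactly; the notation below is the standard textbook form, given only as a reminder and not quoted from the paper. P(c) is the property evaluated exactly on cluster c, L(c) its embedding multiplicity per lattice site, and the weights W_P(c) subtract the contributions of all proper subclusters.

```latex
% Extensive property P per site of lattice L from all clusters c that can be
% embedded in it, with weights defined by inclusion-exclusion over subclusters.
\begin{align}
  \frac{P(\mathcal{L})}{N} &= \sum_{c} L(c)\, W_P(c), \\
  W_P(c) &= P(c) - \sum_{s \subset c} W_P(s).
\end{align}
```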

  18. Learning the Task Management Space of an Aircraft Approach Model

    Science.gov (United States)

    Krall, Joseph; Menzies, Tim; Davies, Misty

    2014-01-01

    Validating models of airspace operations is a particular challenge. These models are often aimed at finding and exploring safety violations, and aim to be accurate representations of real-world behavior. However, the rules governing the behavior are quite complex: nonlinear physics, operational modes, human behavior, and stochastic environmental concerns all determine the responses of the system. In this paper, we present a study on aircraft runway approaches as modeled in Georgia Tech's Work Models that Compute (WMC) simulation. We use a new learner, Genetic-Active Learning for Search-Based Software Engineering (GALE) to discover the Pareto frontiers defined by cognitive structures. These cognitive structures organize the prioritization and assignment of tasks of each pilot during approaches. We discuss the benefits of our approach, and also discuss future work necessary to enable uncertainty quantification.

  19. Forecasting wind-driven wildfires using an inverse modelling approach

    Directory of Open Access Journals (Sweden)

    O. Rios

    2014-06-01

    Full Text Available A technology able to rapidly forecast wildfire dynamics would lead to a paradigm shift in the response to emergencies, providing the Fire Service with essential information about the ongoing fire. This paper presents and explores a novel methodology to forecast wildfire dynamics in wind-driven conditions, using real-time data assimilation and inverse modelling. The forecasting algorithm combines Rothermel's rate of spread theory with a perimeter expansion model based on Huygens principle and solves the optimisation problem with a tangent linear approach and forward automatic differentiation. Its potential is investigated using synthetic data and evaluated in different wildfire scenarios. The results show the capacity of the method to quickly predict the location of the fire front with a positive lead time (ahead of the event) in the order of 10 min for a spatial scale of 100 m. The greatest strengths of our method are lightness, speed and flexibility. We specifically tailor the forecast to be efficient and computationally cheap so it can be used in mobile systems for field deployment and operativeness. Thus, we put emphasis on producing a positive lead time and the means to maximise it.
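
    The assimilate-then-forecast idea can be sketched with a deliberately simple toy: the effective rate of spread of a 1-D front model is fitted to observed front positions and the fitted model is then run forward. The linear spread law, the synthetic observations and the forecast horizon are illustrative assumptions, not the Rothermel/Huygens formulation or the tangent linear machinery used in the paper.

```python
# Toy sketch of data assimilation for fire-front forecasting: fit an effective
# rate of spread to observed front positions, then forecast ahead.
import numpy as np
from scipy.optimize import minimize

t_obs = np.array([0.0, 5.0, 10.0, 15.0])       # minutes since ignition (synthetic)
x_obs = np.array([0.0, 55.0, 112.0, 171.0])    # observed front positions (m, synthetic)

def front_position(rate, t):
    """Front position for a constant effective rate of spread (m/min)."""
    return x_obs[0] + rate * t

def cost(params):
    """Least-squares mismatch between simulated and observed front positions."""
    return np.sum((front_position(params[0], t_obs) - x_obs) ** 2)

result = minimize(cost, x0=np.array([5.0]), method="Nelder-Mead")
rate_fit = result.x[0]
lead_time = 10.0                                # forecast horizon (min)
print(f"assimilated rate of spread: {rate_fit:.1f} m/min")
print(f"forecast front position at t = {t_obs[-1] + lead_time:.0f} min: "
      f"{front_position(rate_fit, t_obs[-1] + lead_time):.0f} m")
```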

  20. An integrated modeling approach to age invariant face recognition

    Science.gov (United States)

    Alvi, Fahad Bashir; Pears, Russel

    2015-03-01

    This research study proposes a novel method for face recognition based on anthropometric features that makes use of an integrated approach comprising a global and a personalized model. The system is aimed at situations where lighting, illumination, and pose variations cause problems in face recognition. A personalized model covers the individual aging patterns, while a global model captures general aging patterns in the database. We introduced a de-aging factor that de-ages each individual in the database test and training sets. We used the k-nearest-neighbor approach for building the personalized and global models. Regression analysis was applied to build the models. During the test phase, we resort to voting on different features. We used the FG-Net database for checking the results of our technique and achieved a 65 percent Rank 1 identification rate.

  1. Modeling Approaches and Systems Related to Structured Modeling.

    Science.gov (United States)

    1987-02-01

    See Lasdon and Maturana for surveys of several modern systems. ... CAMPS (Lucas and Mitra) -- Computer Assisted Mathematical Programming System. ... Maturana, S. "Comparative Analysis of Mathematical Modeling Systems," informal note, Graduate School of Management, UCLA, February

  2. Soil moisture simulations using two different modelling approaches

    Czech Academy of Sciences Publication Activity Database

    Šípek, Václav; Tesař, Miroslav

    2013-01-01

    Roč. 64, 3-4 (2013), s. 99-103 ISSN 0006-5471 R&D Projects: GA AV ČR IAA300600901; GA ČR GA205/08/1174 Institutional research plan: CEZ:AV0Z20600510 Keywords : soil moisture modelling * SWIM model * box modelling approach Subject RIV: DA - Hydrology ; Limnology http://www.boku.ac.at/diebodenkultur/volltexte/sondernummern/band-64/heft-3-4/sipek.pdf

  3. A generic approach to haptic modeling of textile artifacts

    Science.gov (United States)

    Shidanshidi, H.; Naghdy, F.; Naghdy, G.; Wood Conroy, D.

    2009-08-01

    Haptic modeling of textile has attracted significant interest over the last decade. In spite of extensive research, no generic system has been proposed. Previous work mainly assumes that textile has a 2D planar structure and requires time-consuming measurement of textile properties in the construction of the mechanical model. A novel approach for haptic modeling of textile is proposed to overcome the existing shortcomings. The method is generic, assumes a 3D structure for the textile, and deploys computational intelligence to estimate the mechanical properties of textile. The approach is designed primarily for display of textile artifacts in museums. The haptic model is constructed by superimposing the mechanical model of textile over its geometrical model. Digital image processing is applied to the still image of textile to identify its pattern and structure through a fuzzy rule-based algorithm. The 3D geometric model of the artifact is automatically generated in VRML based on the identified pattern and structure obtained from the textile image. Selected mechanical properties of the textile are estimated by an artificial neural network, deploying the textile geometric characteristics and yarn properties as inputs. The estimated mechanical properties are then deployed in the construction of the textile mechanical model. The proposed system is introduced and the developed algorithms are described. The validation of the method indicates the feasibility of the approach and its superiority to other haptic modeling algorithms.

  4. Modeling gene expression measurement error: a quasi-likelihood approach

    Directory of Open Access Journals (Sweden)

    Strimmer Korbinian

    2003-03-01

    Full Text Available Abstract Background Using suitable error models for gene expression measurements is essential in the statistical analysis of microarray data. However, the true probabilistic model underlying gene expression intensity readings is generally not known. Instead, in currently used approaches some simple parametric model is assumed (usually a transformed normal distribution) or the empirical distribution is estimated. However, both these strategies may not be optimal for gene expression data, as the non-parametric approach ignores known structural information whereas the fully parametric models run the risk of misspecification. A further related problem is the choice of a suitable scale for the model (e.g. observed vs. log-scale). Results Here a simple semi-parametric model for gene expression measurement error is presented. In this approach inference is based on an approximate likelihood function (the extended quasi-likelihood). Only partial knowledge about the unknown true distribution is required to construct this function. In case of gene expression this information is available in the form of the postulated (e.g. quadratic) variance structure of the data. As the quasi-likelihood behaves (almost) like a proper likelihood, it allows for the estimation of calibration and variance parameters, and it is also straightforward to obtain corresponding approximate confidence intervals. Unlike most other frameworks, it also allows analysis on any preferred scale, i.e. both on the original linear scale as well as on a transformed scale. It can also be employed in regression approaches to model systematic (e.g. array or dye) effects. Conclusions The quasi-likelihood framework provides a simple and versatile approach to analyze gene expression data that does not make any strong distributional assumptions about the underlying error model. For several simulated as well as real data sets it provides a better fit to the data than competing models. In an example it also

  5. Intelligent Transportation and Evacuation Planning A Modeling-Based Approach

    CERN Document Server

    Naser, Arab

    2012-01-01

    Intelligent Transportation and Evacuation Planning: A Modeling-Based Approach provides a new paradigm for evacuation planning strategies and techniques. Recently, evacuation planning and modeling have increasingly attracted interest among researchers as well as government officials. This interest stems from the recent catastrophic hurricanes and weather-related events that occurred in the southeastern United States (Hurricane Katrina and Rita). The evacuation methods that were in place before and during the hurricanes did not work well and resulted in thousands of deaths. This book offers insights into the methods and techniques that allow for implementing mathematical-based, simulation-based, and integrated optimization and simulation-based engineering approaches for evacuation planning. This book also: Comprehensively discusses the application of mathematical models for evacuation and intelligent transportation modeling Covers advanced methodologies in evacuation modeling and planning Discusses principles a...

  6. A review of function modeling: Approaches and applications

    OpenAIRE

    Erden, M.S.; Komoto, H.; Van Beek, T.J.; D'Amelio, V.; Echavarria, E.; Tomiyama, T.

    2008-01-01

    This work is aimed at establishing a common frame and understanding of function modeling (FM) for our ongoing research activities. A comparative review of the literature is performed to grasp the various FM approaches with their commonalities and differences. The relations of FM with the research fields of artificial intelligence, design theory, and maintenance are discussed. In this discussion the goals are to highlight the features of various classical approaches in relation to FM, to delin...

  7. A model-data based systems approach to process intensification

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    . Their developments, however, are largely due to experiment based trial and error approaches and while they do not require validation, they can be time consuming and resource intensive. Also, one may ask, can a truly new intensified unit operation be obtained in this way? An alternative two-stage approach is to apply...... a model-based synthesis method to systematically generate and evaluate alternatives in the first stage and an experiment-model based validation in the second stage. In this way, the search for alternatives is done very quickly, reliably and systematically over a wide range, while resources are preserved...... for focused validation of only the promising candidates in the second-stage. This approach, however, would be limited to intensification based on “known” unit operations, unless the PI process synthesis/design is considered at a lower level of aggregation, namely the phenomena level. That is, the model-based...

  8. METHODOLOGICAL APPROACHES FOR MODELING THE RURAL SETTLEMENT DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Gorbenkova Elena Vladimirovna

    2017-10-01

    Full Text Available Subject: the paper describes the research results on validation of a rural settlement developmental model. The basic methods and approaches for solving the problem of assessment of the urban and rural settlement development efficiency are considered. Research objectives: determination of methodological approaches to modeling and creating a model for the development of rural settlements. Materials and methods: domestic and foreign experience in modeling the territorial development of urban and rural settlements and settlement structures was generalized. The motivation for using the Pentagon-model for solving similar problems was demonstrated. Based on a systematic analysis of existing development models of urban and rural settlements as well as the method developed by the authors for assessing the level of agro-towns development, the systems/factors that are necessary for a rural settlement sustainable development are identified. Results: we created the rural development model which consists of five major systems that include critical factors essential for achieving a sustainable development of a settlement system: ecological system, economic system, administrative system, anthropogenic (physical) system and social system (supra-structure). The methodological approaches for creating an evaluation model of rural settlements development were revealed; the basic motivating factors that provide interrelations of systems were determined; the critical factors for each subsystem were identified and substantiated. Such an approach was justified by the composition of tasks for territorial planning of the local and state administration levels. The feasibility of applying the basic Pentagon-model, which was successfully used for solving the analogous problems of sustainable development, was shown. Conclusions: the resulting model can be used for identifying and substantiating the critical factors for rural sustainable development and also become the basis of

  9. An algebraic approach to modeling in software engineering

    International Nuclear Information System (INIS)

    Loegel, C.J.; Ravishankar, C.V.

    1993-09-01

    Our work couples the formalism of universal algebras with the engineering techniques of mathematical modeling to develop a new approach to the software engineering process. Our purpose in using this combination is twofold. First, abstract data types and their specification using universal algebras can be considered a common point between the practical requirements of software engineering and the formal specification of software systems. Second, mathematical modeling principles provide us with a means for effectively analyzing real-world systems. We first use modeling techniques to analyze a system and then represent the analysis using universal algebras. The rest of the software engineering process exploits properties of universal algebras that preserve the structure of our original model. This paper describes our software engineering process and our experience using it on both research and commercial systems. We need a new approach because current software engineering practices often deliver software that is difficult to develop and maintain. Formal software engineering approaches use universal algebras to describe ''computer science'' objects like abstract data types, but in practice software errors are often caused because ''real-world'' objects are improperly modeled. There is a large semantic gap between the customer's objects and abstract data types. In contrast, mathematical modeling uses engineering techniques to construct valid models for real-world systems, but these models are often implemented in an ad hoc manner. A combination of the best features of both approaches would enable software engineering to formally specify and develop software systems that better model real systems. Software engineering, like mathematical modeling, should concern itself first and foremost with understanding a real system and its behavior under given circumstances, and then with expressing this knowledge in an executable form

  10. Optimal design of supply chain network under uncertainty environment using hybrid analytical and simulation modeling approach

    Science.gov (United States)

    Chiadamrong, N.; Piyathanavong, V.

    2017-12-01

    Models that aim to optimize the design of supply chain networks have gained more interest in the supply chain literature. Mixed-integer linear programming and discrete-event simulation are widely used for such an optimization problem. We present a hybrid approach to support decisions for supply chain network design using a combination of analytical and discrete-event simulation models. The proposed approach is based on iterative procedures that continue until the difference between subsequent solutions satisfies the pre-determined termination criteria. The effectiveness of the proposed approach is illustrated by an example, which shows near-optimal results with a much shorter solving time than the conventional simulation-based optimization model. The efficacy of this proposed hybrid approach is promising and can be applied as a powerful tool in designing a real supply chain network. It also provides the possibility to model and solve more realistic problems, which incorporate dynamism and uncertainty.
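
    The iterate-until-convergence structure can be sketched with a deliberately small stand-in problem: an analytical step proposes a decision, a stochastic simulation evaluates it and feeds an effective parameter back, and the loop stops when successive solutions agree. The single-echelon inventory toy model and all numbers below are illustrative assumptions, not the network design model of the paper.

```python
# Sketch of a hybrid analytical/simulation loop with a termination criterion.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)
HOLDING_COST, SHORTAGE_COST = 1.0, 9.0
MEAN_DEMAND = 100.0

def analytical_order_level(sigma):
    """Analytical step: newsvendor-style target level for an assumed spread."""
    service = SHORTAGE_COST / (SHORTAGE_COST + HOLDING_COST)
    return MEAN_DEMAND + norm.ppf(service) * sigma

def simulate(order_level, n_periods=20_000):
    """Simulation step: evaluate the proposed level under skewed random demand."""
    demand = rng.gamma(shape=4.0, scale=MEAN_DEMAND / 4.0, size=n_periods)
    cost = (HOLDING_COST * np.maximum(order_level - demand, 0.0).mean()
            + SHORTAGE_COST * np.maximum(demand - order_level, 0.0).mean())
    return cost, demand.std()

sigma_estimate, previous_level = 30.0, None
for iteration in range(20):
    level = analytical_order_level(sigma_estimate)
    cost, sigma_estimate = simulate(level)          # feed simulated spread back
    print(f"iteration {iteration}: order-up-to level {level:6.1f}, cost {cost:6.2f}")
    if previous_level is not None and abs(level - previous_level) < 0.5:
        break                                       # pre-determined termination criterion
    previous_level = level
```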

  11. Injury prevention risk communication: A mental models approach

    DEFF Research Database (Denmark)

    Austin, Laurel Cecelia; Fischhoff, Baruch

    2012-01-01

    Individuals' decisions and behaviour can play a critical role in determining both the probability and severity of injury. Behavioural decision research studies peoples' decision-making processes in terms comparable to scientific models of optimal choices, providing a basis for focusing...... interventions on the most critical opportunities to reduce risks. That research often seeks to identify the ‘mental models’ that underlie individuals' interpretations of their circumstances and the outcomes of possible actions. In the context of injury prevention, a mental models approach would ask why people...... and uses examples to discuss how the approach can be used to develop scientifically validated context-sensitive injury risk communications....

  12. The Layer-Oriented Approach to Declarative Languages for Biological Modeling

    Science.gov (United States)

    Raikov, Ivan; De Schutter, Erik

    2012-01-01

    We present a new approach to modeling languages for computational biology, which we call the layer-oriented approach. The approach stems from the observation that many diverse biological phenomena are described using a small set of mathematical formalisms (e.g. differential equations), while at the same time different domains and subdomains of computational biology require that models are structured according to the accepted terminology and classification of that domain. Our approach uses distinct semantic layers to represent the domain-specific biological concepts and the underlying mathematical formalisms. Additional functionality can be transparently added to the language by adding more layers. This approach is specifically concerned with declarative languages, and throughout the paper we note some of the limitations inherent to declarative approaches. The layer-oriented approach is a way to specify explicitly how high-level biological modeling concepts are mapped to a computational representation, while abstracting away details of particular programming languages and simulation environments. To illustrate this process, we define an example language for describing models of ionic currents, and use a general mathematical notation for semantic transformations to show how to generate model simulation code for various simulation environments. We use the example language to describe a Purkinje neuron model and demonstrate how the layer-oriented approach can be used for solving several practical issues of computational neuroscience model development. We discuss the advantages and limitations of the approach in comparison with other modeling language efforts in the domain of computational biology and outline some principles for extensible, flexible modeling language design. We conclude by describing in detail the semantic transformations defined for our language. PMID:22615554

  13. Assessing risk factors for dental caries: a statistical modeling approach.

    Science.gov (United States)

    Trottini, Mario; Bossù, Maurizio; Corridore, Denise; Ierardo, Gaetano; Luzzi, Valeria; Saccucci, Matteo; Polimeni, Antonella

    2015-01-01

    The problem of identifying potential determinants and predictors of dental caries is of key importance in caries research and it has received considerable attention in the scientific literature. From the methodological side, a broad range of statistical models is currently available to analyze dental caries indices (DMFT, dmfs, etc.). These models have been applied in several studies to investigate the impact of different risk factors on the cumulative severity of dental caries experience. However, in most of the cases (i) these studies focus on a very specific subset of risk factors; and (ii) in the statistical modeling only a few candidate models are considered and model selection is at best only marginally addressed. As a result, our understanding of the robustness of the statistical inferences with respect to the choice of the model is very limited; the richness of the set of statistical models available for analysis is only marginally exploited; and inferences could be biased due to the omission of potentially important confounding variables in the model's specification. In this paper we argue that these limitations can be overcome by considering a general class of candidate models and carefully exploring the model space using standard model selection criteria and measures of global fit and predictive performance of the candidate models. Strengths and limitations of the proposed approach are illustrated with a real data set. In our illustration the model space contains more than 2.6 million models, which require inferences to be adjusted for 'optimism'.
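
    A stripped-down version of the "explore the model space" step is sketched below: every subset of a few candidate predictors is fitted and ranked by AIC. The synthetic data, the predictor names and the use of an ordinary Gaussian linear model are illustrative assumptions only (caries indices would typically call for count models), and the exhaustive search scales to millions of models only with more efficient tooling than this sketch.

```python
# Sketch of exhaustive model-space exploration ranked by a selection criterion.
import itertools
import numpy as np

rng = np.random.default_rng(1)
n = 200
predictors = {
    "sugar_intake": rng.normal(size=n),
    "brushing_freq": rng.normal(size=n),
    "fluoride_exposure": rng.normal(size=n),
    "age": rng.normal(size=n),
}
y = 1.2 * predictors["sugar_intake"] - 0.8 * predictors["brushing_freq"] + rng.normal(size=n)

def fit_aic(names):
    """Fit an OLS model with the given predictors and return its AIC."""
    X = np.column_stack([np.ones(n)] + [predictors[p] for p in names])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / n
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1.0)
    return 2 * (X.shape[1] + 1) - 2 * loglik        # parameters include the variance

results = []
for k in range(len(predictors) + 1):
    for names in itertools.combinations(predictors, k):
        results.append((fit_aic(names), names))
for aic, names in sorted(results)[:3]:
    print(f"AIC = {aic:7.1f}   predictors: {names or ('intercept only',)}")
```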

  14. A modeling approach to hospital location for effective marketing.

    Science.gov (United States)

    Cokelez, S; Peacock, E

    1993-01-01

    This paper develops a mixed integer linear programming model for locating health care facilities. The parameters of the objective function of this model are based on factor rating analysis and the grid method. Subjective and objective factors representative of real-life situations are incorporated into the model in a unique way, permitting a trade-off analysis of certain factors pertinent to the location of hospitals. This results in a unified approach and a single model whose credibility is further enhanced by the inclusion of geographical and demographic factors.
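
    The core trade-off can be shown in miniature: a grid-method term (demand-weighted distance) and a factor-rating term are combined with a single weight and the best candidate site is picked. The candidate sites, demand points, scores and weight below are hypothetical and the brute-force choice stands in for the paper's mixed integer program.

```python
# Tiny sketch combining a grid-method distance term with a factor-rating term.
import math

demand_points = [((2, 3), 120), ((8, 1), 80), ((5, 9), 150)]   # (x, y), patients served
candidates = {
    "Site A": {"xy": (4, 4), "factor_score": 78},   # subjective score on a 0-100 scale
    "Site B": {"xy": (7, 6), "factor_score": 91},
    "Site C": {"xy": (3, 8), "factor_score": 65},
}
TRADEOFF = 0.6   # weight on the distance term (0..1), assumed

def weighted_distance(xy):
    """Demand-weighted Euclidean distance from a candidate site."""
    return sum(w * math.dist(xy, p) for p, w in demand_points)

dists = {name: weighted_distance(c["xy"]) for name, c in candidates.items()}
max_dist = max(dists.values())           # normalize so the trade-off weight is meaningful
best = min(
    candidates,
    key=lambda name: TRADEOFF * dists[name] / max_dist
    + (1 - TRADEOFF) * (1 - candidates[name]["factor_score"] / 100),
)
print("chosen location:", best)
```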

  15. Mathematical and computer modeling of electro-optic systems using a generic modeling approach

    OpenAIRE

    Smith, M.I.; Murray-Smith, D.J.; Hickman, D.

    2007-01-01

    The conventional approach to modelling electro-optic sensor systems is to develop separate models for individual systems or classes of system, depending on the detector technology employed in the sensor and the application. However, this ignores commonality in design and in components of these systems. A generic approach is presented for modelling a variety of sensor systems operating in the infrared waveband that also allows systems to be modelled with different levels of detail and at diffe...

  16. Deep Appearance Models: A Deep Boltzmann Machine Approach for Face Modeling

    OpenAIRE

    Duong, Chi Nhan; Luu, Khoa; Quach, Kha Gia; Bui, Tien D.

    2016-01-01

    The "interpretation through synthesis" approach to analyzing face images, particularly the Active Appearance Models (AAMs) method, has become one of the most successful face modeling approaches over the last two decades. AAM models have the ability to represent face images through synthesis using a controllable parameterized Principal Component Analysis (PCA) model. However, the accuracy and robustness of the synthesized faces of AAM are highly dependent on the training sets and inherently on the genera...

  17. A Nonparametric Operational Risk Modeling Approach Based on Cornish-Fisher Expansion

    Directory of Open Access Journals (Sweden)

    Xiaoqian Zhu

    2014-01-01

    Full Text Available It is generally accepted that the choice of severity distribution in the loss distribution approach has a significant effect on the operational risk capital estimation. However, the commonly used parametric approaches with a predefined distribution assumption might not be able to fit the severity distribution accurately. The objective of this paper is to propose a nonparametric operational risk modeling approach based on the Cornish-Fisher expansion. In this approach, the samples of severity are generated by the Cornish-Fisher expansion and then used in the Monte Carlo simulation to sketch the annual operational loss distribution. In the experiment, the proposed approach is employed to calculate the operational risk capital charge for the Chinese banking sector as a whole. The experiment dataset is the most comprehensive operational risk dataset in China as far as we know. The results show that the proposed approach is able to use the information in higher-order moments and might be more effective and stable than the commonly used parametric approach.
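
    The two-step construction (severity samples from moments via the Cornish-Fisher expansion, then Monte Carlo aggregation into an annual loss distribution) can be sketched as below. The severity moments, the Poisson frequency and the number of simulated years are illustrative assumptions, not the Chinese banking dataset of the paper.

```python
# Sketch of the Cornish-Fisher idea: severity quantiles generated from the
# first four moments (no parametric family), aggregated by Monte Carlo.
import numpy as np

rng = np.random.default_rng(7)
MEAN, STD, SKEW, EX_KURT = 50_000.0, 80_000.0, 1.8, 5.0   # assumed severity moments
ANNUAL_FREQ, N_YEARS = 25, 50_000                         # assumed Poisson frequency

def cornish_fisher_severity(size):
    """Severity samples via the fourth-order Cornish-Fisher quantile expansion."""
    z = rng.standard_normal(size)
    w = (z
         + (z**2 - 1) * SKEW / 6
         + (z**3 - 3 * z) * EX_KURT / 24
         - (2 * z**3 - 5 * z) * SKEW**2 / 36)
    return np.maximum(MEAN + STD * w, 0.0)   # losses cannot be negative

annual_losses = np.empty(N_YEARS)
counts = rng.poisson(ANNUAL_FREQ, size=N_YEARS)
for i, k in enumerate(counts):
    annual_losses[i] = cornish_fisher_severity(k).sum()

print(f"99.9% annual loss quantile (capital charge proxy): "
      f"{np.quantile(annual_losses, 0.999):,.0f}")
```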

  18. Modeling and control approach to a distinctive quadrotor helicopter.

    Science.gov (United States)

    Wu, Jun; Peng, Hui; Chen, Qing; Peng, Xiaoyan

    2014-01-01

    The quadrotor helicopter referenced in this paper has a unique configuration. It is more complex than commonly used quadrotors because of its inaccurately known parameters, non-ideal symmetric structure and unknown nonlinear dynamics. A novel method is presented in this paper to handle its modeling and control problems: a MIMO RBF neural nets-based state-dependent ARX (RBF-ARX) model is adopted to represent its nonlinear dynamics, and a MIMO RBF-ARX model-based global LQR controller is then proposed to stabilize the quadrotor's attitude. By comparison with a physical model-based LQR controller and an ARX model-set-based gain-scheduling LQR controller, the superiority of the MIMO RBF-ARX model-based control approach was confirmed. This successful application verified the validity of the MIMO RBF-ARX modeling method for a quadrotor helicopter with complex nonlinearity. © 2013 Published by ISA. All rights reserved.
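
    The RBF-ARX identification itself is beyond a short sketch, but the LQR step it feeds can be shown in isolation: given a linearized attitude model x' = Ax + Bu, the state-feedback gain u = -Kx is obtained from the continuous-time Riccati equation. The matrices and weights below are hypothetical (a double-integrator-like roll axis), not the model of the paper.

```python
# Sketch of the LQR gain computation for an assumed linearized attitude model.
import numpy as np
from scipy.linalg import solve_continuous_are

A = np.array([[0.0, 1.0],       # state = [roll angle, roll rate]
              [0.0, 0.0]])
B = np.array([[0.0],
              [20.0]])          # assumed control effectiveness
Q = np.diag([10.0, 1.0])        # penalize angle error more than rate
R = np.array([[0.1]])

P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)          # K = R^{-1} B^T P
print("LQR gain K =", K)

# Closed-loop poles should all have negative real parts.
print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))
```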

  19. Software sensors based on the grey-box modelling approach

    DEFF Research Database (Denmark)

    Carstensen, J.; Harremoës, P.; Strube, Rune

    1996-01-01

    In recent years the grey-box modelling approach has been applied to wastewater transportation and treatment. Grey-box models are characterized by the combination of deterministic and stochastic terms to form a model where all the parameters are statistically identifiable from the on-line measurements. With respect to the development of software sensors, the grey-box models possess two important features. Firstly, the on-line measurements can be filtered according to the grey-box model in order to remove noise deriving from the measuring equipment and controlling devices. Secondly, the grey-box models may contain terms which can be estimated on-line by use of the models and measurements. In this paper, it is demonstrated that many storage basins in sewer systems can be used as an on-line flow measurement provided that the basin is monitored on-line with a level transmitter and that a grey
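
    The filtering idea can be sketched with a one-state example: a noisy on-line basin level is smoothed with a scalar Kalman update and the filtered level rise is turned into an inflow estimate through the basin mass balance. The basin area, outflow rate, noise levels and synthetic inflow profile are illustrative assumptions, not the grey-box models identified in the paper.

```python
# Minimal sketch of a grey-box software sensor: filter a noisy level signal and
# infer inflow from the mass balance of a storage basin.
import numpy as np

rng = np.random.default_rng(3)
DT, AREA, OUT_RATE = 60.0, 500.0, 0.30          # s, m^2, m^3/s (assumed constants)
Q_PROC, R_MEAS = 1e-3, 4e-2                     # process / measurement variances

n_steps = 120
true_inflow = 0.30 + 0.20 * (np.arange(n_steps) > 60)     # m^3/s, step change
true_level = np.zeros(n_steps)
for k in range(1, n_steps):
    true_level[k] = true_level[k - 1] + DT * (true_inflow[k] - OUT_RATE) / AREA
measured = true_level + rng.normal(scale=np.sqrt(R_MEAS), size=n_steps)

level_est = np.zeros(n_steps)
p_est = 1.0
for k in range(1, n_steps):
    p_pred = p_est + Q_PROC                     # level modelled as a slow random walk
    gain = p_pred / (p_pred + R_MEAS)           # scalar Kalman gain
    level_est[k] = level_est[k - 1] + gain * (measured[k] - level_est[k - 1])
    p_est = (1.0 - gain) * p_pred

# Mass balance over a window converts the filtered level rise into an inflow estimate.
window = 40
rise = level_est[-1] - level_est[-1 - window]
inflow_est = AREA * rise / (window * DT) + OUT_RATE
print(f"average inflow over the last {window} steps: {inflow_est:.2f} m^3/s "
      f"(true {true_inflow[-1]:.2f} m^3/s)")
```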

  20. Environmental Radiation Effects on Mammals A Dynamical Modeling Approach

    CERN Document Server

    Smirnova, Olga A

    2010-01-01

    This text is devoted to the theoretical studies of radiation effects on mammals. It uses the framework of developed deterministic mathematical models to investigate the effects of both acute and chronic irradiation in a wide range of doses and dose rates on vital body systems including hematopoiesis, small intestine and humoral immunity, as well as on the development of autoimmune diseases. Thus, these models can contribute to the development of the system and quantitative approaches in radiation biology and ecology. This text is also of practical use. Its modeling studies of the dynamics of granulocytopoiesis and thrombocytopoiesis in humans testify to the efficiency of employment of the developed models in the investigation and prediction of radiation effects on these hematopoietic lines. These models, as well as the properly identified models of other vital body systems, could provide a better understanding of the radiation risks to health. The modeling predictions will enable the implementation of more ef...

  1. Techniques for managing behaviour in pediatric dentistry: comparative study of live modelling and tell-show-do based on children's heart rates during treatment.

    Science.gov (United States)

    Farhat-McHayleh, Nada; Harfouche, Alice; Souaid, Philippe

    2009-05-01

    Tell-show-do is the most popular technique for managing children's behaviour in dentists' offices. Live modelling is used less frequently, despite the satisfactory results obtained in studies conducted during the 1980s. The purpose of this study was to compare the effects of these 2 techniques on children's heart rates during dental treatments, heart rate being the simplest biological parameter to measure and an increase in heart rate being the most common physiologic indicator of anxiety and fear. For this randomized, controlled, parallel-group single-centre clinical trial, children 5 to 9 years of age presenting for the first time to the Saint Joseph University dental care centre in Beirut, Lebanon, were divided into 3 groups: those in groups A and B were prepared for dental treatment by means of live modelling, the mother serving as the model for children in group A and the father as the model for children in group B. The children in group C were prepared by a pediatric dentist using the tell-show-do method. Each child's heart rate was monitored during treatment, which consisted of an oral examination and cleaning. A total of 155 children met the study criteria and participated in the study. Children who received live modelling with the mother as model had lower heart rates than those who received live modelling with the father as model and those who were prepared by the tell-show-do method (p pediatric dentistry.

  2. THE EFFECTIVENESS OF RHETORIC-BASED ESSAY WRITING TEACHING MODEL WITH CONTEXTUAL APPROACH

    OpenAIRE

    Akbar, Akbar; HP, Achmad

    2015-01-01

    This study aims to develop a rhetoric-based essay writing teaching model with a contextual approach in order to improve the essay writing skills of students in the English Department of the Education and Teaching Faculty of Lakidende University of Konawe. This instructional model was developed using a research and development method. The results show that the model can improve students' essay writing skills effectively. It was tested in an experimental class of the Education and Teaching Faculty of Lakidende...

  3. An approach to accidents modeling based on compounds road environments.

    Science.gov (United States)

    Fernandes, Ana; Neves, Jose

    2013-04-01

    The most common approach to studying the influence of certain road features on accidents has been the consideration of uniform road segments characterized by a unique feature. However, when an accident is related to the road infrastructure, its cause is usually not a single characteristic but rather a complex combination of several characteristics. The main objective of this paper is to describe a methodology developed in order to consider the road as a complete environment by using compound road environments, overcoming the limitations inherent in considering only uniform road segments. The methodology consists of: dividing a sample of roads into segments; grouping them into quite homogeneous road environments using cluster analysis; and identifying the influence of skid resistance and texture depth on road accidents in each environment by using generalized linear models. The application of this methodology is demonstrated for eight roads. Based on real data from accidents and road characteristics, three compound road environments were established where the pavement surface properties significantly influence the occurrence of accidents. Results have clearly shown that road environments where braking maneuvers are more common or those with small radii of curvature and high speeds require higher skid resistance and texture depth as an important contribution to accident prevention. Copyright © 2013 Elsevier Ltd. All rights reserved.

  4. Spatial pattern evaluation of a calibrated national hydrological model - a remote-sensing-based diagnostic approach

    Science.gov (United States)

    Mendiguren, Gorka; Koch, Julian; Stisen, Simon

    2017-11-01

    Distributed hydrological models are traditionally evaluated against discharge stations, emphasizing the temporal and neglecting the spatial component of a model. The present study widens the traditional paradigm by highlighting spatial patterns of evapotranspiration (ET), a key variable at the land-atmosphere interface, obtained from two different approaches at the national scale of Denmark. The first approach is based on a national water resources model (DK-model), using the MIKE-SHE model code, and the second approach utilizes a two-source energy balance model (TSEB) driven mainly by satellite remote sensing data. Ideally, the hydrological model simulation and remote-sensing-based approach should present similar spatial patterns and driving mechanisms of ET. However, the spatial comparison showed that the differences are significant and indicate insufficient spatial pattern performance of the hydrological model. The differences in spatial patterns can partly be explained by the fact that the hydrological model is configured to run in six domains that are calibrated independently from each other, as is often the case for large-scale multi-basin calibrations. Furthermore, the model incorporates predefined temporal dynamics of leaf area index (LAI), root depth (RD) and crop coefficient (Kc) for each land cover type. This zonal approach of model parameterization ignores the spatiotemporal complexity of the natural system. To overcome this limitation, this study features a modified version of the DK-model in which LAI, RD and Kc are empirically derived using remote sensing data and detailed soil property maps in order to generate a higher degree of spatiotemporal variability and spatial consistency between the six domains. The effects of these changes are analyzed by using empirical orthogonal function (EOF) analysis to evaluate spatial patterns. The EOF analysis shows that including remote-sensing-derived LAI, RD and Kc in the distributed hydrological model adds
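
    The EOF analysis used for the spatial-pattern comparison is, in essence, a principal component analysis of the space-time anomaly matrix; the sketch below applies it to a synthetic field. The field, its single coherent mode and the seasonal forcing are illustrative assumptions, not the Danish ET data of the study.

```python
# Sketch of empirical orthogonal function (EOF) analysis via SVD of anomalies.
import numpy as np

rng = np.random.default_rng(11)
n_times, n_cells = 120, 400                      # months x grid cells (synthetic)

# Synthetic field: one coherent spatial mode with a seasonal cycle plus noise.
spatial_mode = np.sin(np.linspace(0, np.pi, n_cells))
seasonal = np.sin(2 * np.pi * np.arange(n_times) / 12.0)
field = np.outer(seasonal, spatial_mode) + 0.3 * rng.standard_normal((n_times, n_cells))

anomalies = field - field.mean(axis=0)           # remove the temporal mean per cell
u, s, vt = np.linalg.svd(anomalies, full_matrices=False)

explained = s**2 / np.sum(s**2)
print("variance explained by first 3 EOFs:", np.round(explained[:3], 3))
eof1 = vt[0]                                     # leading spatial pattern
pc1 = anomalies @ eof1                           # its principal-component time series
print("|correlation| of PC1 with the seasonal forcing:",
      np.round(abs(np.corrcoef(pc1, seasonal)[0, 1]), 3))
```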

  5. Modelling dynamic ecosystems : venturing beyond boundaries with the Ecopath approach

    OpenAIRE

    Coll, Marta; Akoglu, E.; Arreguin-Sanchez, F.; Fulton, E. A.; Gascuel, D.; Heymans, J. J.; Libralato, S.; Mackinson, S.; Palomera, I.; Piroddi, C.; Shannon, L. J.; Steenbeek, J.; Villasante, S.; Christensen, V.

    2015-01-01

    Thirty years of progress using the Ecopath with Ecosim (EwE) approach in different fields such as ecosystem impacts of fishing and climate change, emergent ecosystem dynamics, ecosystem-based management, and marine conservation and spatial planning were showcased November 2014 at the conference "Ecopath 30 years-modelling dynamic ecosystems: beyond boundaries with EwE". Exciting new developments include temporal-spatial and end-to-end modelling, as well as novel applications to environmental ...

  6. Gray-box modelling approach for description of storage tunnel

    DEFF Research Database (Denmark)

    Harremoës, Poul; Carstensen, Jacob

    1999-01-01

    The dynamics of a storage tunnel is examined using a model based on on-line measured data and a combination of simple deterministic and black-box stochastic elements. This approach, called gray-box modeling, is a new promising methodology for giving an on-line state description of sewer systems. ...... in a SCADA system because the most important information on the specific system is provided on-line...

  7. Development of a Conservative Model Validation Approach for Reliable Analysis

    Science.gov (United States)

    2015-01-01

    A conservative output PDF and a probability of failure are selected from the predicted output PDFs at a user-specified conservativeness level for validation; to maintain that conservativeness level, the conservative probability of failure obtained from Section 4 must be maintained. The mathematical formulation of the conservative model ... (Paper DETC2015-46982 [DRAFT], CIE 2015, August 2-5, 2015, Boston, Massachusetts, USA.)

  8. Reliability assessment using degradation models: bayesian and classical approaches

    Directory of Open Access Journals (Sweden)

    Marta Afonso Freitas

    2010-04-01

    Full Text Available Traditionally, reliability assessment of devices has been based on (accelerated) life tests. However, for highly reliable products, little information about reliability is provided by life tests in which few or no failures are typically observed. Since most failures arise from a degradation mechanism at work for which there are characteristics that degrade over time, one alternative is to monitor the device for a period of time and assess its reliability from the changes in performance (degradation) observed during that period. The goal of this article is to illustrate how degradation data can be modeled and analyzed by using "classical" and Bayesian approaches. Four methods of data analysis based on classical inference are presented. Next we show how Bayesian methods can also be used to provide a natural approach to analyzing degradation data. The approaches are applied to a real data set regarding train wheel degradation.

  9. Estimating, Testing, and Comparing Specific Effects in Structural Equation Models: The Phantom Model Approach

    Science.gov (United States)

    Macho, Siegfried; Ledermann, Thomas

    2011-01-01

    The phantom model approach for estimating, testing, and comparing specific effects within structural equation models (SEMs) is presented. The rationale underlying this novel method consists in representing the specific effect to be assessed as a total effect within a separate latent variable model, the phantom model that is added to the main…

  10. The Generalised Ecosystem Modelling Approach in Radiological Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Klos, Richard

    2008-03-15

    An independent modelling capability is required by SSI in order to evaluate dose assessments carried out in Sweden by, amongst others, SKB. The main focus is the evaluation of the long-term radiological safety of radioactive waste repositories for both spent fuel and low-level radioactive waste. To meet the requirement for an independent modelling tool for use in biosphere dose assessments, SSI through its modelling team CLIMB commissioned the development of a new model in 2004, a project to produce an integrated model of radionuclides in the landscape. The generalised ecosystem modelling approach (GEMA) is the result. GEMA is a modular system of compartments representing the surface environment. It can be configured, through water and solid material fluxes, to represent local details in the range of ecosystem types found in the past, present and future Swedish landscapes. The approach is generic but fine tuning can be carried out using local details of the surface drainage system. The modular nature of the modelling approach means that GEMA modules can be linked to represent large scale surface drainage features over an extended domain in the landscape. System change can also be managed in GEMA, allowing a flexible and comprehensive model of the evolving landscape to be constructed. Environmental concentrations of radionuclides can be calculated and the GEMA dose pathway model provides a means of evaluating the radiological impact of radionuclide release to the surface environment. This document sets out the philosophy and details of GEMA and illustrates the functioning of the model with a range of examples featuring the recent CLIMB review of SKB's SR-Can assessment

  11. Habitat fragmentation and reproductive success: a structural equation modelling approach.

    Science.gov (United States)

    Le Tortorec, Eric; Helle, Samuli; Käyhkö, Niina; Suorsa, Petri; Huhta, Esa; Hakkarainen, Harri

    2013-09-01

    1. There is great interest in the effects of habitat fragmentation, whereby habitat is lost and the spatial configuration of remaining habitat patches is altered, on individual breeding performance. However, we still lack consensus on how this important process affects reproductive success, and whether its effects are mainly due to reduced fecundity or nestling survival. 2. The main reason for this may be the way that habitat fragmentation has been previously modelled. Studies have treated habitat loss and altered spatial configuration as two independent processes instead of as one hierarchical and interdependent process, and therefore have not been able to consider the relative direct and indirect effects of habitat loss and altered spatial configuration. 3. We investigated how habitat (i.e. old forest) fragmentation, caused by intense forest harvesting at the territory and landscape scales, is associated with the number of fledged offspring of an area-sensitive passerine, the Eurasian treecreeper (Certhia familiaris). We used structural equation modelling (SEM) to examine the complex hierarchical associations between habitat loss and altered spatial configuration on the number of fledged offspring, by controlling for individual condition and weather conditions during incubation. 4. Against generally held expectations, treecreeper reproductive success did not show a significant association with habitat fragmentation measured at the territory scale. Instead, our analyses suggested that an increasing amount of habitat at the landscape scale caused a significant increase in nest predation rates, leading to reduced reproductive success. This effect operated directly on nest predation rates, instead of acting indirectly through altered spatial configuration. 5. Because habitat amount and configuration are inherently strongly collinear, particularly when multiple scales are considered, our study demonstrates the usefulness of an SEM approach for hierarchical partitioning

  12. Modeling of phase equilibria with CPA using the homomorph approach

    DEFF Research Database (Denmark)

    Breil, Martin Peter; Tsivintzelis, Ioannis; Kontogeorgis, Georgios

    2011-01-01

    For association models, like CPA and SAFT, a classical approach is often used for estimating pure-compound and mixture parameters. According to this approach, the pure-compound parameters are estimated from vapor pressure and liquid density data. Then, the binary interaction parameters, kij......, are estimated from binary systems; one binary interaction parameter per system. No additional mixing rules are needed for cross-associating systems, but combining rules are required, e.g. the Elliott rule or the so-called CR-1 rule. There is a very large class of mixtures, e.g. water or glycols with aromatic...... interaction parameters are often used for solvating systems; one for the physical part (kij) and one for the association part (βcross). This limits the predictive capabilities and possibilities of generalization of the model. In this work we present an approach to reduce the number of adjustable parameters...

  13. A model-data based systems approach to process intensification

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    . Their developments, however, are largely due to experiment based trial and error approaches and while they do not require validation, they can be time consuming and resource intensive. Also, one may ask, can a truly new intensified unit operation be obtained in this way? An alternative two-stage approach is to apply...... for focused validation of only the promising candidates in the second-stage. This approach, however, would be limited to intensification based on “known” unit operations, unless the PI process synthesis/design is considered at a lower level of aggregation, namely the phenomena level. That is, the model......-based synthesis method must employ models at lower levels of aggregation and through combination rules for phenomena, generate (synthesize) new intensified unit operations. An efficient solution procedure for the synthesis problem is needed to tackle the potentially large number of options that would be obtained...

  14. Evaluation of Different Modeling Approaches to Simulate Contaminant Transport in a Fractured Limestone Aquifer

    Science.gov (United States)

    Mosthaf, K.; Rosenberg, L.; Balbarini, N.; Broholm, M. M.; Bjerg, P. L.; Binning, P. J.

    2014-12-01

    It is important to understand the fate and transport of contaminants in limestone aquifers because they are a major drinking water resource. This is challenging because they are highly heterogeneous; with micro-porous grains, flint inclusions, and being heavily fractured. Several modeling approaches have been developed to describe contaminant transport in fractured media, such as the discrete fracture (with various fracture geometries), equivalent porous media (with and without anisotropy), and dual porosity models. However, these modeling concepts are not well tested for limestone geologies. Given available field data and model purpose, this paper therefore aims to develop, examine and compare modeling approaches for transport of contaminants in fractured limestone aquifers. The model comparison was conducted for a contaminated site in Denmark, where a plume of a dissolved contaminant (PCE) has migrated through a fractured limestone aquifer. Multilevel monitoring wells have been installed at the site and available data includes information on spill history, extent of contamination, geology and hydrogeology. To describe the geology and fracture network, data from borehole logs was combined with an analysis of heterogeneities and fractures from a nearby excavation (analog site). Methods for translating the geological information and fracture mapping into each of the model concepts were examined. Each model was compared with available field data, considering both model fit and measures of model suitability. An analysis of model parameter identifiability and sensitivity is presented. Results show that there is considerable difference between modeling approaches, and that it is important to identify the right one for the actual scale and model purpose. A challenge in the use of field data is the determination of relevant hydraulic properties and interpretation of aqueous and solid phase contaminant concentration sampling data. Traditional water sampling has a bias

  15. A review of function modeling: Approaches and applications

    NARCIS (Netherlands)

    Erden, M.S.; Komoto, H.; Van Beek, T.J.; D'Amelio, V.; Echavarria, E.; Tomiyama, T.

    2008-01-01

    This work is aimed at establishing a common frame and understanding of function modeling (FM) for our ongoing research activities. A comparative review of the literature is performed to grasp the various FM approaches with their commonalities and differences. The relations of FM with the research

  16. The Bipolar Approach: A Model for Interdisciplinary Art History Courses.

    Science.gov (United States)

    Calabrese, John A.

    1993-01-01

    Describes a college level art history course based on the opposing concepts of Classicism and Romanticism. Contends that all creative work, such as film or architecture, can be categorized according to this bipolar model. Includes suggestions for objects to study and recommends this approach for art education at all education levels. (CFR)

  17. Model-independent approach for dark matter phenomenology ...

    Indian Academy of Sciences (India)

    We have studied the phenomenology of dark matter at the ILC and cosmic positron experiments based on model-independent approach. We have found a strong correlation between dark matter signatures at the ILC and those in the indirect detection experiments of dark matter. Once the dark matter is discovered in the ...

  18. A Behavioral Decision Making Modeling Approach Towards Hedging Services

    NARCIS (Netherlands)

    Pennings, J.M.E.; Candel, M.J.J.M.; Egelkraut, T.M.

    2003-01-01

    This paper takes a behavioral approach toward the market for hedging services. A behavioral decision-making model is developed that provides insight into how and why owner-managers decide the way they do regarding hedging services. Insight into those choice processes reveals information needed by

  19. Comparing State SAT Scores Using a Mixture Modeling Approach

    Science.gov (United States)

    Kim, YoungKoung Rachel

    2009-01-01

    Presented at the national conference for AERA (American Educational Research Association) in April 2009. The large variability of SAT taker population across states makes state-by-state comparisons of the SAT scores challenging. Using a mixture modeling approach, therefore, the current study presents a method of identifying subpopulations in terms…
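
    As a minimal sketch of the mixture-modeling idea (not the study's actual analysis), the snippet below fits Gaussian mixtures to synthetic score data and picks the number of subpopulations by BIC; the data and component counts are invented for illustration.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Hypothetical score data drawn from two latent subpopulations (not real SAT data)
scores = np.concatenate([rng.normal(480, 60, 3000),
                         rng.normal(590, 50, 1500)]).reshape(-1, 1)

# Fit 1..4 component mixtures and choose the number of subpopulations by BIC
models = [GaussianMixture(n_components=k, random_state=0).fit(scores) for k in range(1, 5)]
best = min(models, key=lambda m: m.bic(scores))
print("components:", best.n_components)
print("means:", best.means_.ravel(), "weights:", best.weights_)
```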

  20. Export of microplastics from land to sea. A modelling approach

    NARCIS (Netherlands)

    Siegfried, Max; Koelmans, A.A.; Besseling, E.; Kroeze, C.

    2017-01-01

    Quantifying the transport of plastic debris from river to sea is crucial for assessing the risks of plastic debris to human health and the environment. We present a global modelling approach to analyse the composition and quantity of point-source microplastic fluxes from European rivers to the sea.

  1. A novel Monte Carlo approach to hybrid local volatility models

    NARCIS (Netherlands)

    A.W. van der Stoep (Anton); L.A. Grzelak (Lech Aleksander); C.W. Oosterlee (Cornelis)

    2017-01-01

    We present, in a Monte Carlo simulation framework, a novel approach for the evaluation of hybrid local volatility [Risk, 1994, 7, 18–20], [Int. J. Theor. Appl. Finance, 1998, 1, 61–110] models. In particular, we consider the stochastic local volatility model—see e.g. Lipton et al. [Quant.
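
    The record describes a Monte Carlo evaluation of local volatility dynamics; the sketch below is only a plain Euler-Maruyama baseline under a hypothetical CEV-like local volatility surface, not the paper's novel scheme, and all market parameters are placeholders.

```python
import numpy as np

def local_vol(s, t):
    """Hypothetical local volatility surface sigma(S, t); a CEV-like shape used purely
    for illustration, not a calibrated surface."""
    return 0.25 * (s / 100.0) ** (-0.3)

def mc_call_price(s0=100.0, strike=100.0, r=0.02, T=1.0, n_steps=250, n_paths=50_000, seed=1):
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    s = np.full(n_paths, s0)
    for i in range(n_steps):
        z = rng.standard_normal(n_paths)
        # Euler-Maruyama step of dS = r*S*dt + sigma(S,t)*S*dW
        s = s + r * s * dt + local_vol(s, i * dt) * s * np.sqrt(dt) * z
        s = np.maximum(s, 1e-8)          # keep the paths positive
    payoff = np.maximum(s - strike, 0.0)
    return np.exp(-r * T) * payoff.mean()

print(mc_call_price())
```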

  2. Hidden Markov model-based approach for generation of Pitman ...

    Indian Academy of Sciences (India)

    Speech is one of the most basic means of human communication. ... human beings is carried out with the aid of communication and has facilitated the development ...

  3. A novel Monte Carlo approach to hybrid local volatility models

    NARCIS (Netherlands)

    van der Stoep, A.W.; Grzelak, L.A.; Oosterlee, C.W.

    2017-01-01

    We present, in a Monte Carlo simulation framework, a novel approach for the evaluation of hybrid local volatility [Risk, 1994, 7, 18–20], [Int. J. Theor. Appl. Finance, 1998, 1, 61–110] models. In particular, we consider the stochastic local volatility model—see e.g. Lipton et al. [Quant. Finance,

  4. Model-independent approach for dark matter phenomenology

    Indian Academy of Sciences (India)

    We have studied the phenomenology of dark matter at the ILC and cosmic positron experiments based on model-independent approach. We have found a strong correlation between dark matter signatures at the ILC and those in the indirect detection experiments of dark matter. Once the dark matter is discovered in the ...

  5. Model-independent approach for dark matter phenomenology ...

    Indian Academy of Sciences (India)

    Abstract. We have studied the phenomenology of dark matter at the ILC and cosmic positron experiments based on model-independent approach. We have found a strong correlation between dark matter signatures at the ILC and those in the indirect detection experiments of dark matter. Once the dark matter is discovered ...

  6. Using artificial neural network approach for modelling rainfall–runoff ...

    Indian Academy of Sciences (India)

    Using artificial neural network approach for modelling ... Nevertheless, water level and flow records are essential in hydrological analysis for designing related water works of flood management. Due to the complexity of the hydrological process, ...

  7. An Approach to Quality Estimation in Model-Based Development

    DEFF Research Database (Denmark)

    Holmegaard, Jens Peter; Koch, Peter; Ravn, Anders Peter

    2004-01-01

    We present an approach to estimation of parameters for design space exploration in Model-Based Development, where synthesis of a system is done in two stages. Component qualities like space, execution time or power consumption are defined in a repository by platform dependent values. Connectors...

  8. Hidden Markov model-based approach for generation of Pitman ...

    Indian Academy of Sciences (India)

    In this paper, an approach for feature extraction using Mel frequency cepstral coefficients (MFCC) and classification using hidden Markov models (HMM) for generating strokes comprising consonants and vowels (CV) in the process of production of Pitman shorthand language from spoken English is proposed.
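
    A hedged sketch of the MFCC-plus-HMM pipeline described above, assuming the librosa and hmmlearn libraries (neither is named by the paper); synthetic 13-dimensional feature sequences stand in for real MFCCs so the snippet runs without audio files.

```python
import numpy as np
from hmmlearn import hmm
# For real audio, MFCCs would be computed with e.g.:
#   mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13).T   # shape (frames, 13)

rng = np.random.default_rng(0)

def fake_utterances(center, n=20):
    """Synthetic stand-in for MFCC sequences of one CV stroke class."""
    return [center + rng.normal(0, 1, size=(rng.integers(30, 60), 13)) for _ in range(n)]

train_a, train_b = fake_utterances(0.0), fake_utterances(2.0)   # two hypothetical strokes

def train_stroke_model(utterances, n_states=3):
    X = np.vstack(utterances)
    lengths = [u.shape[0] for u in utterances]
    m = hmm.GaussianHMM(n_components=n_states, covariance_type="diag",
                        n_iter=25, random_state=0)
    m.fit(X, lengths)
    return m

models = {"stroke_a": train_stroke_model(train_a), "stroke_b": train_stroke_model(train_b)}
test = train_b[0]
# Classify by the HMM giving the highest log-likelihood
print(max(models, key=lambda name: models[name].score(test)))
```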

  9. Pruning Chinese trees : an experimental and modelling approach

    NARCIS (Netherlands)

    Zeng, Bo

    2001-01-01

    Pruning of trees, in which some branches are removed from the lower crown of a tree, has been extensively used in China in silvicultural management for many purposes. With an experimental and modelling approach, the effects of pruning on tree growth and on the harvest of plant material were studied.

  10. Non-frontal Model Based Approach to Forensic Face Recognition

    NARCIS (Netherlands)

    Dutta, A.; Veldhuis, Raymond N.J.; Spreeuwers, Lieuwe Jan

    2012-01-01

    In this paper, we propose a non-frontal model based approach which ensures that a face recognition system always gets to compare images having similar view (or pose). This requires a virtual suspect reference set that consists of non-frontal suspect images having pose similar to the surveillance

  11. Reconciliation with oneself and with others: From approach to model

    Directory of Open Access Journals (Sweden)

    Nikolić-Ristanović Vesna

    2010-01-01

    Full Text Available The paper presents the approach to dealing with war and its consequences that was developed within the Victimology Society of Serbia over the last five years, in the framework of the Association Joint Action for Truth and Reconciliation (ZAIP). First, a short review of the Association and of the process through which the ZAIP approach to dealing with the past was developed is presented. Then, a detailed description of the approach itself is given, with its most important specificities identified. In the conclusion, next steps are suggested, aimed at developing a model of reconciliation that is grounded in the ZAIP approach and appropriate to the social context of Serbia and its surroundings.

  12. EXTENDED MODEL OF COMPETITIVITY THROUGH APPLICATION OF NEW APPROACH DIRECTIVES

    Directory of Open Access Journals (Sweden)

    Slavko Arsovski

    2009-03-01

    Full Text Available The basic subject of this work is a model of the impact of the New Approach directives on product quality and safety and on the competitiveness of our companies. The work rests on a working hypothesis drawn from expert experience: the infrastructure for applying the New Approach directives has not been examined until now, it is not known which Serbian products or industries fall under the New Approach directives and the CE mark, and the effects of using the CE mark are not known. This work should indicate the existing reserves in quality and product safety, the level of possible improvement in competitiveness, and the potential for increasing profit by fulfilling the requirements of the New Approach directives.

  13. Accurate phenotyping: Reconciling approaches through Bayesian model averaging.

    Directory of Open Access Journals (Sweden)

    Carla Chia-Ming Chen

    Full Text Available Genetic research into complex diseases is frequently hindered by a lack of clear biomarkers for phenotype ascertainment. Phenotypes for such diseases are often identified on the basis of clinically defined criteria; however such criteria may not be suitable for understanding the genetic composition of the diseases. Various statistical approaches have been proposed for phenotype definition; however our previous studies have shown that differences in phenotypes estimated using different approaches have substantial impact on subsequent analyses. Instead of obtaining results based upon a single model, we propose a new method, using Bayesian model averaging to overcome problems associated with phenotype definition. Although Bayesian model averaging has been used in other fields of research, this is the first study that uses Bayesian model averaging to reconcile phenotypes obtained using multiple models. We illustrate the new method by applying it to simulated genetic and phenotypic data for Kofendred personality disorder-an imaginary disease with several sub-types. Two separate statistical methods were used to identify clusters of individuals with distinct phenotypes: latent class analysis and grade of membership. Bayesian model averaging was then used to combine the two clusterings for the purpose of subsequent linkage analyses. We found that causative genetic loci for the disease produced higher LOD scores using model averaging than under either individual model separately. We attribute this improvement to consolidation of the cores of phenotype clusters identified using each individual method.
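
    The snippet below illustrates the Bayesian model averaging principle in its simplest BIC-weighted form, applied to two hypothetical phenotype models; the paper's actual procedure averages latent class analysis and grade-of-membership clusterings, which this generic sketch only approximates, and all numbers are invented.

```python
import numpy as np

# Hypothetical output of two competing phenotype models: per-subject membership
# probabilities plus each model's maximized log-likelihood and parameter count.
n = 500
rng = np.random.default_rng(0)
models = {
    "latent_class":        {"loglik": -1510.2, "k": 9},
    "grade_of_membership": {"loglik": -1498.7, "k": 14},
}
for m in models.values():                      # fake membership probabilities for the sketch
    p = rng.random((n, 2))
    m["post"] = p / p.sum(axis=1, keepdims=True)

# BIC-based approximation of posterior model probabilities (the BMA weights)
bic = {name: -2 * m["loglik"] + m["k"] * np.log(n) for name, m in models.items()}
dmin = min(bic.values())
w = {name: np.exp(-0.5 * (b - dmin)) for name, b in bic.items()}
z = sum(w.values())
w = {name: v / z for name, v in w.items()}

# Model-averaged phenotype assignment: weight each model's membership probabilities
averaged = sum(w[name] * models[name]["post"] for name in models)
print(w, averaged[:3])
```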

  14. Modeling electricity spot and futures price dependence: A multifrequency approach

    Science.gov (United States)

    Malo, Pekka

    2009-11-01

    Electricity prices are known to exhibit multifractal properties. We accommodate this finding by investigating multifractal models for electricity prices. In this paper we propose a flexible Copula-MSM (Markov Switching Multifractal) approach for modeling spot and weekly futures price dynamics. By using a conditional copula function, the framework allows us to separately model the dependence structure, while enabling use of multifractal stochastic volatility models to characterize fluctuations in marginal returns. An empirical experiment is carried out using data from Nord Pool. A study of volatility forecasting performance for electricity spot prices reveals that multifractal techniques are a competitive alternative to GARCH models. We also demonstrate how the Copula-MSM model can be employed for finding optimal portfolios, which minimizes the Conditional Value-at-Risk.

  15. Multiphysics modeling using COMSOL a first principles approach

    CERN Document Server

    Pryor, Roger W

    2011-01-01

    Multiphysics Modeling Using COMSOL rapidly introduces the senior level undergraduate, graduate or professional scientist or engineer to the art and science of computerized modeling for physical systems and devices. It offers a step-by-step modeling methodology through examples that are linked to the Fundamental Laws of Physics through a First Principles Analysis approach. The text explores a breadth of multiphysics models in coordinate systems that range from 1D to 3D and introduces the readers to the numerical analysis modeling techniques employed in the COMSOL Multiphysics software. After readers have built and run the examples, they will have a much firmer understanding of the concepts, skills, and benefits acquired from the use of computerized modeling techniques to solve their current technological problems and to explore new areas of application for their particular technological areas of interest.

  16. Evaluation of Workflow Management Systems - A Meta Model Approach

    Directory of Open Access Journals (Sweden)

    Michael Rosemann

    1998-11-01

    Full Text Available The automated enactment of processes through the use of workflow management systems enables the outsourcing of the control flow from application systems. By now a large number of systems that follow different workflow paradigms are available. This leads to the problem of selecting the appropriate workflow management system for a given situation. In this paper we outline the benefits of a meta model approach for the evaluation and comparison of different workflow management systems. After a general introduction on the topic of meta modeling, the meta models of the workflow management systems WorkParty (Siemens Nixdorf) and FlowMark (IBM) are compared as an example. These product-specific meta models can be generalized to meta reference models, which helps to specify a workflow methodology. As an example, an organisational reference meta model is presented, which helps users in specifying their requirements for a workflow management system.

  17. A Genetic Algorithm Approach for Modeling a Grounding Electrode

    Science.gov (United States)

    Mishra, Arbind Kumar; Nagaoka, Naoto; Ametani, Akihiro

    This paper has proposed a genetic algorithm based approach to determine a grounding electrode model circuit composed of resistances, inductances and capacitances. The proposed methodology determines the model circuit parameters based on a general ladder circuit directly from a measured result. Transient voltages of some electrodes were measured when applying a step like current. An EMTP simulation of a transient voltage on the grounding electrode has been carried out by adopting the proposed model circuits. The accuracy of the proposed method has been confirmed to be high in comparison with the measured transient voltage.

  18. Polynomial Chaos Expansion Approach to Interest Rate Models

    Directory of Open Access Journals (Sweden)

    Luca Di Persio

    2015-01-01

    Full Text Available The Polynomial Chaos Expansion (PCE) technique allows us to recover a finite second-order random variable exploiting suitable linear combinations of orthogonal polynomials which are functions of a given stochastic quantity ξ, hence acting as a kind of random basis. The PCE methodology has been developed as a mathematically rigorous Uncertainty Quantification (UQ) method which aims at providing reliable numerical estimates for some uncertain physical quantities defining the dynamic of certain engineering models and their related simulations. In the present paper, we use the PCE approach in order to analyze some equity and interest rate models. In particular, we take into consideration those models which are based on, for example, the Geometric Brownian Motion, the Vasicek model, and the CIR model. We present theoretical as well as related concrete numerical approximation results considering, without loss of generality, the one-dimensional case. We also provide both an efficiency study and an accuracy study of our approach by comparing its outputs with the ones obtained adopting the Monte Carlo approach, both in its standard and its enhanced version.
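
    For the Geometric Brownian Motion case, the PCE coefficients of the lognormal terminal value in probabilists' Hermite polynomials are available in closed form; the sketch below checks a truncated expansion against the exact value (all parameter values are illustrative, not the paper's test cases).

```python
import numpy as np
from numpy.polynomial import hermite_e as He
from math import factorial

# GBM terminal value S_T = S0 * exp((mu - 0.5*sigma^2)*T + sigma*sqrt(T)*xi), xi ~ N(0,1)
S0, mu, sigma, T, order = 100.0, 0.05, 0.2, 1.0, 8     # hypothetical parameters
m = np.log(S0) + (mu - 0.5 * sigma**2) * T             # mean of log S_T
s = sigma * np.sqrt(T)                                  # std of log S_T

# Closed-form PCE coefficients of exp(m + s*xi) in probabilists' Hermite polynomials He_k:
# a_k = exp(m + s^2/2) * s^k / k!
coeffs = np.array([np.exp(m + 0.5 * s**2) * s**k / factorial(k) for k in range(order + 1)])

# Compare the truncated expansion with the exact value at a few sample points of xi
xi = np.random.default_rng(0).standard_normal(5)
exact = np.exp(m + s * xi)
approx = He.hermeval(xi, coeffs)
print(np.c_[exact, approx])   # the two columns agree closely for modest s
```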

  19. Common modelling approaches for training simulators for nuclear power plants

    International Nuclear Information System (INIS)

    1990-02-01

    Training simulators for nuclear power plant operating staff have gained increasing importance over the last twenty years. One of the recommendations of the 1983 IAEA Specialists' Meeting on Nuclear Power Plant Training Simulators in Helsinki was to organize a Co-ordinated Research Programme (CRP) on some aspects of training simulators. The goal statement was: ''To establish and maintain a common approach to modelling for nuclear training simulators based on defined training requirements''. Before adapting this goal statement, the participants considered many alternatives for defining the common aspects of training simulator models, such as the programming language used, the nature of the simulator computer system, the size of the simulation computers, the scope of simulation. The participants agreed that it was the training requirements that defined the need for a simulator, the scope of models and hence the type of computer complex that was required, the criteria for fidelity and verification, and was therefore the most appropriate basis for the commonality of modelling approaches. It should be noted that the Co-ordinated Research Programme was restricted, for a variety of reasons, to consider only a few aspects of training simulators. This report reflects these limitations, and covers only the topics considered within the scope of the programme. The information in this document is intended as an aid for operating organizations to identify possible modelling approaches for training simulators for nuclear power plants. 33 refs

  20. A predictive modeling approach to increasing the economic effectiveness of disease management programs.

    Science.gov (United States)

    Bayerstadler, Andreas; Benstetter, Franz; Heumann, Christian; Winter, Fabian

    2014-09-01

    Predictive Modeling (PM) techniques are gaining importance in the worldwide health insurance business. Modern PM methods are used for customer relationship management, risk evaluation or medical management. This article illustrates a PM approach that enables the economic potential of (cost-) effective disease management programs (DMPs) to be fully exploited by optimized candidate selection as an example of successful data-driven business management. The approach is based on a Generalized Linear Model (GLM) that is easy to apply for health insurance companies. By means of a small portfolio from an emerging country, we show that our GLM approach is stable compared to more sophisticated regression techniques in spite of the difficult data environment. Additionally, we demonstrate for this example of a setting that our model can compete with the expensive solutions offered by professional PM vendors and outperforms non-predictive standard approaches for DMP selection commonly used in the market.
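
    A minimal sketch of the GLM-based candidate ranking idea, assuming a binomial GLM (logistic regression) fitted on invented member features and a fabricated "benefits from the DMP" label; it is not the insurer's actual model or data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 5000
# Hypothetical member features: age, chronic-condition flag, prior-year cost
X = np.column_stack([rng.normal(55, 12, n),
                     rng.integers(0, 2, n),
                     rng.lognormal(7, 1, n)])
# Hypothetical label: 1 if the member would benefit from the DMP
logit = -6 + 0.04 * X[:, 0] + 1.2 * X[:, 1] + 0.0002 * X[:, 2]
y = rng.random(n) < 1 / (1 + np.exp(-logit))

# Binomial GLM with logit link (standardization only for numerical stability)
glm = make_pipeline(StandardScaler(), LogisticRegression(max_iter=200)).fit(X, y)
scores = glm.predict_proba(X)[:, 1]            # predicted benefit probability
candidates = np.argsort(scores)[::-1][:500]    # enrol the top-ranked members
print(scores[candidates[:5]])
```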

  1. Integrated design approach of the pebble bed modular using models

    International Nuclear Information System (INIS)

    Venter, P.J.

    2005-01-01

    The Pebble Bed Modular Reactor (PBMR) is the first pebble bed reactor that will be utilised in a high temperature direct Brayton cycle configuration. This implies that there are a number of unique features in the PBMR that extend from the German experience base. One of the challenges in the design of the PBMR is managing the integrated design process between the designers, the physicists and the analysts. This integrated design process is managed through model-based development work. Three-dimensional CAD models are constructed of the components and parts in the reactor. From the CAD models, CFD models, neutronic models, shielding models, FEM models and other thermodynamic models are derived. These models range from very simple models to extremely detailed and complex models. The models are used in legacy software as well as commercial off-the-shelf software. The different models are also used in code-to-code comparisons to verify the results. This paper will briefly discuss the different models and the interaction between the models, showing the iterative design process that is used in the development of the reactor at PBMR. (author)

  2. Estimating a DIF decomposition model using a random-weights linear logistic test model approach.

    Science.gov (United States)

    Paek, Insu; Fukuhara, Hirotaka

    2015-09-01

    A differential item functioning (DIF) decomposition model separates a testlet item DIF into two sources: item-specific differential functioning and testlet-specific differential functioning. This article provides an alternative model-building framework and estimation approach for a DIF decomposition model that was proposed by Beretvas and Walker (2012). Although their model is formulated under multilevel modeling with the restricted pseudolikelihood estimation method, our approach illustrates DIF decomposition modeling that is directly built upon the random-weights linear logistic test model framework with the marginal maximum likelihood estimation method. In addition to demonstrating our approach's performance, we provide detailed information on how to implement this new DIF decomposition model using an item response theory software program; using DIF decomposition may be challenging for practitioners, yet practical information on how to implement it has previously been unavailable in the measurement literature.

  3. A new multi-objective approach to finite element model updating

    Science.gov (United States)

    Jin, Seung-Seop; Cho, Soojin; Jung, Hyung-Jo; Lee, Jong-Jae; Yun, Chung-Bang

    2014-05-01

    The single objective function (SOF) has been employed for the optimization process in the conventional finite element (FE) model updating. The SOF balances the residual of multiple properties (e.g., modal properties) using weighting factors, but the weighting factors are hard to determine before the run of model updating. Therefore, the trial-and-error strategy is taken to find the most preferred model among alternative updated models resulted from varying weighting factors. In this study, a new approach to the FE model updating using the multi-objective function (MOF) is proposed to get the most preferred model in a single run of updating without trial-and-error. For the optimization using the MOF, non-dominated sorting genetic algorithm-II (NSGA-II) is employed to find the Pareto optimal front. The bend angle related to the trade-off relationship of objective functions is used to select the most preferred model among the solutions on the Pareto optimal front. To validate the proposed approach, a highway bridge is selected as a test-bed and the modal properties of the bridge are obtained from the ambient vibration test. The initial FE model of the bridge is built using SAP2000. The model is updated using the identified modal properties by the SOF approach with varying the weighting factors and the proposed MOF approach. The most preferred model is selected using the bend angle of the Pareto optimal front, and compared with the results from the SOF approach using varying the weighting factors. The comparison shows that the proposed MOF approach is superior to the SOF approach using varying the weighting factors in getting smaller objective function values, estimating better updated parameters, and taking less computational time.
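
    The sketch below illustrates the two ingredients of the approach, Pareto (non-dominated) sorting of candidate solutions and a knee-point selection; the distance-to-extreme-line rule used here is only a simple stand-in for the paper's bend-angle criterion, and the objective values are random placeholders rather than FE-updating residuals.

```python
import numpy as np

def pareto_front(F):
    """Return indices of non-dominated rows of F for a minimization problem."""
    idx = []
    for i, f in enumerate(F):
        dominated = np.any(np.all(F <= f, axis=1) & np.any(F < f, axis=1))
        if not dominated:
            idx.append(i)
    return np.array(idx)

# Hypothetical residuals of candidate updated models w.r.t. two modal properties
rng = np.random.default_rng(0)
F = rng.random((200, 2))
front = pareto_front(F)
P = F[front][np.argsort(F[front, 0])]          # front sorted along the first objective

# Simple knee selection: the front point farthest from the line joining the extremes
a, b = P[0], P[-1]
v = b - a
d = np.abs(v[0] * (P[:, 1] - a[1]) - v[1] * (P[:, 0] - a[0])) / np.linalg.norm(v)
print("preferred solution:", P[np.argmax(d)])
```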

  4. On Approaches to Analyze the Sensitivity of Simulated Hydrologic Fluxes to Model Parameters in the Community Land Model

    Directory of Open Access Journals (Sweden)

    Jie Bao

    2015-12-01

    Full Text Available Effective sensitivity analysis approaches are needed to identify important parameters or factors and their uncertainties in complex Earth system models composed of multi-phase multi-component phenomena and multiple biogeophysical-biogeochemical processes. In this study, the impacts of 10 hydrologic parameters in the Community Land Model on simulations of runoff and latent heat flux are evaluated using data from a watershed. Different metrics, including residual statistics, the Nash–Sutcliffe coefficient, and log mean square error, are used as alternative measures of the deviations between the simulated and field observed values. Four sensitivity analysis (SA approaches, including analysis of variance based on the generalized linear model, generalized cross validation based on the multivariate adaptive regression splines model, standardized regression coefficients based on a linear regression model, and analysis of variance based on support vector machine, are investigated. Results suggest that these approaches show consistent measurement of the impacts of major hydrologic parameters on response variables, but with differences in the relative contributions, particularly for the secondary parameters. The convergence behaviors of the SA with respect to the number of sampling points are also examined with different combinations of input parameter sets and output response variables and their alternative metrics. This study helps identify the optimal SA approach, provides guidance for the calibration of the Community Land Model parameters to improve the model simulations of land surface fluxes, and approximates the magnitudes to be adjusted in the parameter values during parametric model optimization.
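
    Of the four SA approaches listed, the standardized-regression-coefficient one is the easiest to sketch; the snippet below ranks invented hydrologic parameters against a synthetic output, so the parameter names and the response function are placeholders rather than Community Land Model quantities.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
names = ["fdrai", "fover", "fmax", "qdm"]            # hypothetical parameter names
X = rng.uniform(0.0, 1.0, size=(400, len(names)))    # random parameter sample

# Hypothetical stand-in for a simulated runoff/latent-heat error metric
y = 2.0 * X[:, 0] - 0.8 * X[:, 1] + 0.1 * X[:, 2] + rng.normal(0, 0.1, 400)

# Standardized regression coefficients: regress standardized output on standardized
# inputs; the absolute coefficients rank parameter importance.
Xs = (X - X.mean(0)) / X.std(0)
ys = (y - y.mean()) / y.std()
src = LinearRegression().fit(Xs, ys).coef_
for name, c in sorted(zip(names, src), key=lambda t: -abs(t[1])):
    print(f"{name}: {c:+.3f}")
```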

  5. Analysis on the crime model using dynamical approach

    Science.gov (United States)

    Mohammad, Fazliza; Roslan, Ummu'Atiqah Mohd

    2017-08-01

    Research is carried out to analyze a dynamical model of the spread of crime. A simplified two-dimensional model is used in this research. The objectives of this research are to investigate the stability of the model of the spread of crime, to summarize the stability by using a bifurcation analysis, and to study the relationship of the basic reproduction number, R0, with a parameter in the model. Our results for the stability of equilibrium points show two types of equilibria: asymptotically stable points and saddle points. The bifurcation analysis shows that the number of criminally active and incarcerated individuals increases as we increase the value of a parameter in the model. The result for the relationship of R0 with the parameter shows that as the parameter increases, R0 increases, and the rate of crime increases too.
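
    As a generic illustration of the stability analysis (not the cited model's equations), an equilibrium of a two-dimensional system can be classified from the eigenvalues of the Jacobian evaluated there; the matrix entries below are hypothetical placeholders.

```python
import numpy as np

# Jacobian of dx/dt = f(x, y), dy/dt = g(x, y) evaluated at an equilibrium point
# (placeholder coefficients, not those of the crime model)
J = np.array([[-0.4, 0.1],
              [0.3, -0.2]])
eig = np.linalg.eigvals(J)
if np.all(eig.real < 0):
    kind = "asymptotically stable"
elif np.any(eig.real > 0) and np.any(eig.real < 0):
    kind = "saddle"
else:
    kind = "non-hyperbolic / other"
print(eig, kind)
```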

  6. The FITS model: an improved Learning by Design approach

    NARCIS (Netherlands)

    Drs. Ing. Koen Michels; Prof. Dr. Marc de Vries; MEd Dave van Breukelen; MEd Frank Schure

    2016-01-01

    Learning by Design (LBD) is a project-based inquiry approach for interdisciplinary teaching that uses design contexts to learn skills and conceptual knowledge. Research around the year 2000 showed that LBD students achieved high skill performances but disappointing conceptual learning gains. A

  7. Plenary lecture: innovative modeling approaches applicable to risk assessments.

    Science.gov (United States)

    Oscar, T P

    2011-06-01

    Proper identification of safe and unsafe food at the processing plant is important for maximizing the public health benefit of food by ensuring both its consumption and safety. Risk assessment is a holistic approach to food safety that consists of four steps: 1) hazard identification; 2) exposure assessment; 3) hazard characterization; and 4) risk characterization. Risk assessments are modeled by mapping the risk pathway as a series of unit operations and associated pathogen events and then using probability distributions and a random sampling method to simulate the rare, random, variable and uncertain nature of pathogen events in the risk pathway. To model pathogen events, a rare event modeling approach is used that links a discrete distribution for incidence of the pathogen event with a continuous distribution for extent of the pathogen event. When applied to risk assessment, rare event modeling leads to the conclusion that the most highly contaminated food at the processing plant does not necessarily pose the highest risk to public health because of differences in post-processing risk factors among distribution channels and consumer populations. Predictive microbiology models for individual pathogen events can be integrated with risk assessment models using the rare event modeling method. Published by Elsevier Ltd.
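
    A minimal sketch of the rare event construction described above: a discrete (Bernoulli) incidence distribution linked to a continuous (lognormal) extent distribution, followed by a toy dose-response step; the prevalence, concentration and dose-response values are invented for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
n_servings = 100_000

# Incidence of the pathogen event: Bernoulli with a hypothetical prevalence
prevalence = 0.02
contaminated = rng.random(n_servings) < prevalence

# Extent of the pathogen event: lognormal concentration (log10 CFU per serving)
log10_cfu = rng.normal(loc=1.0, scale=0.8, size=n_servings)
dose = np.where(contaminated, 10.0 ** log10_cfu, 0.0)

# Toy exponential dose-response for risk characterization
p_illness = 1.0 - np.exp(-5e-4 * dose)
print("expected illnesses per 100k servings:", p_illness.sum())
```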

  8. A global sensitivity analysis approach for morphogenesis models

    KAUST Repository

    Boas, Sonja E. M.

    2015-11-21

    Background Morphogenesis is a developmental process in which cells organize into shapes and patterns. Complex, non-linear and multi-factorial models with images as output are commonly used to study morphogenesis. It is difficult to understand the relation between the uncertainty in the input and the output of such ‘black-box’ models, giving rise to the need for sensitivity analysis tools. In this paper, we introduce a workflow for a global sensitivity analysis approach to study the impact of single parameters and the interactions between them on the output of morphogenesis models. Results To demonstrate the workflow, we used a published, well-studied model of vascular morphogenesis. The parameters of this cellular Potts model (CPM) represent cell properties and behaviors that drive the mechanisms of angiogenic sprouting. The global sensitivity analysis correctly identified the dominant parameters in the model, consistent with previous studies. Additionally, the analysis provided information on the relative impact of single parameters and of interactions between them. This is very relevant because interactions of parameters impede the experimental verification of the predicted effect of single parameters. The parameter interactions, although of low impact, provided also new insights in the mechanisms of in silico sprouting. Finally, the analysis indicated that the model could be reduced by one parameter. Conclusions We propose global sensitivity analysis as an alternative approach to study the mechanisms of morphogenesis. Comparison of the ranking of the impact of the model parameters to knowledge derived from experimental data and from manipulation experiments can help to falsify models and to find the operand mechanisms in morphogenesis. The workflow is applicable to all ‘black-box’ models, including high-throughput in vitro models in which output measures are affected by a set of experimental perturbations.

  9. A global sensitivity analysis approach for morphogenesis models.

    Science.gov (United States)

    Boas, Sonja E M; Navarro Jimenez, Maria I; Merks, Roeland M H; Blom, Joke G

    2015-11-21

    Morphogenesis is a developmental process in which cells organize into shapes and patterns. Complex, non-linear and multi-factorial models with images as output are commonly used to study morphogenesis. It is difficult to understand the relation between the uncertainty in the input and the output of such 'black-box' models, giving rise to the need for sensitivity analysis tools. In this paper, we introduce a workflow for a global sensitivity analysis approach to study the impact of single parameters and the interactions between them on the output of morphogenesis models. To demonstrate the workflow, we used a published, well-studied model of vascular morphogenesis. The parameters of this cellular Potts model (CPM) represent cell properties and behaviors that drive the mechanisms of angiogenic sprouting. The global sensitivity analysis correctly identified the dominant parameters in the model, consistent with previous studies. Additionally, the analysis provided information on the relative impact of single parameters and of interactions between them. This is very relevant because interactions of parameters impede the experimental verification of the predicted effect of single parameters. The parameter interactions, although of low impact, provided also new insights in the mechanisms of in silico sprouting. Finally, the analysis indicated that the model could be reduced by one parameter. We propose global sensitivity analysis as an alternative approach to study the mechanisms of morphogenesis. Comparison of the ranking of the impact of the model parameters to knowledge derived from experimental data and from manipulation experiments can help to falsify models and to find the operand mechanisms in morphogenesis. The workflow is applicable to all 'black-box' models, including high-throughput in vitro models in which output measures are affected by a set of experimental perturbations.
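
    A hedged sketch of a variance-based global sensitivity analysis, assuming the SALib package and a cheap analytic stand-in for the cellular Potts model; the parameter names and bounds are hypothetical.

```python
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

# Hypothetical stand-in for a morphogenesis model: three cell-behavior parameters
# mapped to a scalar output (e.g. number of sprouts); the real CPM is far costlier.
problem = {
    "num_vars": 3,
    "names": ["adhesion", "chemotaxis", "proliferation"],
    "bounds": [[0.0, 1.0], [0.0, 1.0], [0.0, 1.0]],
}

def toy_model(x):
    return x[:, 0] + 2.0 * x[:, 1] * x[:, 2] + 0.1 * np.sin(6.28 * x[:, 0])

X = saltelli.sample(problem, 1024)        # Saltelli sampling for Sobol indices
Y = toy_model(X)
Si = sobol.analyze(problem, Y)
print(dict(zip(problem["names"], Si["S1"])))   # first-order indices
print(dict(zip(problem["names"], Si["ST"])))   # total-order indices
```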

  10. Comparison of two model approaches in the Zambezi river basin with regard to model reliability and identifiability

    Directory of Open Access Journals (Sweden)

    H. C. Winsemius

    2006-01-01

    Full Text Available Variations of water stocks in the upper Zambezi river basin have been determined by 2 different hydrological modelling approaches. The purpose was to provide preliminary terrestrial storage estimates in the upper Zambezi, which will be compared with estimates derived from the Gravity Recovery And Climate Experiment (GRACE in a future study. The first modelling approach is GIS-based, distributed and conceptual (STREAM. The second approach uses Lumped Elementary Watersheds identified and modelled conceptually (LEW. The STREAM model structure has been assessed using GLUE (Generalized Likelihood Uncertainty Estimation a posteriori to determine parameter identifiability. The LEW approach could, in addition, be tested for model structure, because computational efforts of LEW are low. Both models are threshold models, where the non-linear behaviour of the Zambezi river basin is explained by a combination of thresholds and linear reservoirs. The models were forced by time series of gauged and interpolated rainfall. Where available, runoff station data was used to calibrate the models. Ungauged watersheds were generally given the same parameter sets as their neighbouring calibrated watersheds. It appeared that the LEW model structure could be improved by applying GLUE iteratively. Eventually, it led to better identifiability of parameters and consequently a better model structure than the STREAM model. Hence, the final model structure obtained better represents the true hydrology. After calibration, both models show a comparable efficiency in representing discharge. However the LEW model shows a far greater storage amplitude than the STREAM model. This emphasizes the storage uncertainty related to hydrological modelling in data-scarce environments such as the Zambezi river basin. It underlines the need and potential for independent observations of terrestrial storage to enhance our understanding and modelling capacity of the hydrological processes. GRACE
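
    The GLUE procedure mentioned above can be sketched with a toy rainfall-runoff model: sample the parameter space, score each set with a likelihood measure (Nash-Sutcliffe efficiency here) and keep the behavioural sets; the bucket model, forcing and acceptability threshold are placeholders, not the STREAM or LEW models.

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_bucket_model(k, s_max, rainfall):
    """Very small linear-reservoir stand-in for a threshold-type rainfall-runoff model."""
    s, q = 0.0, []
    for p in rainfall:
        s = min(s + p, s_max)      # storage capped by a threshold
        out = k * s
        s -= out
        q.append(out)
    return np.array(q)

rainfall = rng.gamma(1.5, 0.8, size=120)                                   # hypothetical forcing
obs = toy_bucket_model(0.3, 4.0, rainfall) + rng.normal(0, 0.05, 120)      # synthetic "observations"

# GLUE: Monte Carlo sample the parameters, compute a likelihood, keep behavioural sets
samples = np.column_stack([rng.uniform(0.05, 0.9, 2000), rng.uniform(1.0, 10.0, 2000)])
nse = np.array([1.0 - np.sum((toy_bucket_model(k, sm, rainfall) - obs) ** 2)
                / np.sum((obs - obs.mean()) ** 2) for k, sm in samples])
behavioural = samples[nse > 0.5]                                           # acceptability threshold
print(len(behavioural), "behavioural parameter sets out of", len(samples))
```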

  11. Bayesian approach to errors-in-variables in regression models

    Science.gov (United States)

    Rozliman, Nur Aainaa; Ibrahim, Adriana Irawati Nur; Yunus, Rossita Mohammad

    2017-05-01

    In many applications and experiments, data sets are often contaminated with error or mismeasured covariates. When at least one of the covariates in a model is measured with error, Errors-in-Variables (EIV) model can be used. Measurement error, when not corrected, would cause misleading statistical inferences and analysis. Therefore, our goal is to examine the relationship of the outcome variable and the unobserved exposure variable given the observed mismeasured surrogate by applying the Bayesian formulation to the EIV model. We shall extend the flexible parametric method proposed by Hossain and Gustafson (2009) to another nonlinear regression model which is the Poisson regression model. We shall then illustrate the application of this approach via a simulation study using Markov chain Monte Carlo sampling methods.

  12. BUSINESS MODEL ROADMAPPING: A PRACTICAL APPROACH TO COME FROM AN EXISTING TO A DESIRED BUSINESS MODEL

    OpenAIRE

    MARK DE REUVER; HARRY BOUWMAN; TIMBER HAAKER

    2013-01-01

    Literature on business models deals extensively with how to design new business models, but hardly with how to make the transition from an existing to a newly designed business model. The transition to a new business model raises several practical and strategic issues, such as how to replace an existing value proposition with a new one, when to acquire new resources and capabilities, and when to start new partnerships. In this paper, we coin the term business model roadmapping as an approach ...

  13. Anthropomorphic Coding of Speech and Audio: A Model Inversion Approach

    Directory of Open Access Journals (Sweden)

    W. Bastiaan Kleijn

    2005-06-01

    Full Text Available Auditory modeling is a well-established methodology that provides insight into human perception and that facilitates the extraction of signal features that are most relevant to the listener. The aim of this paper is to provide a tutorial on perceptual speech and audio coding using an invertible auditory model. In this approach, the audio signal is converted into an auditory representation using an invertible auditory model. The auditory representation is quantized and coded. Upon decoding, it is then transformed back into the acoustic domain. This transformation converts a complex distortion criterion into a simple one, thus facilitating quantization with low complexity. We briefly review past work on auditory models and describe in more detail the components of our invertible model and its inversion procedure, that is, the method to reconstruct the signal from the output of the auditory model. We summarize attempts to use the auditory representation for low-bit-rate coding. Our approach also allows the exploitation of the inherent redundancy of the human auditory system for the purpose of multiple description (joint source-channel coding.

  14. A modal approach to modeling spatially distributed vibration energy dissipation.

    Energy Technology Data Exchange (ETDEWEB)

    Segalman, Daniel Joseph

    2010-08-01

    The nonlinear behavior of mechanical joints is a confounding element in modeling the dynamic response of structures. Though there has been some progress in recent years in modeling individual joints, modeling the full structure with myriad frictional interfaces has remained an obstinate challenge. A strategy is suggested for structural dynamics modeling that can account for the combined effect of interface friction distributed spatially about the structure. This approach accommodates the following observations: (1) At small to modest amplitudes, the nonlinearity of jointed structures is manifest primarily in the energy dissipation - visible as vibration damping; (2) Correspondingly, measured vibration modes do not change significantly with amplitude; and (3) Significant coupling among the modes does not appear to result at modest amplitudes. The mathematical approach presented here postulates the preservation of linear modes and invests all the nonlinearity in the evolution of the modal coordinates. The constitutive form selected is one that works well in modeling spatially discrete joints. When compared against a mathematical truth model, the distributed dissipation approximation performs well.

  15. A structured approach for the engineering of biochemical network models, illustrated for signalling pathways.

    Science.gov (United States)

    Breitling, Rainer; Gilbert, David; Heiner, Monika; Orton, Richard

    2008-09-01

    Quantitative models of biochemical networks (signal transduction cascades, metabolic pathways, gene regulatory circuits) are a central component of modern systems biology. Building and managing these complex models is a major challenge that can benefit from the application of formal methods adopted from theoretical computing science. Here we provide a general introduction to the field of formal modelling, which emphasizes the intuitive biochemical basis of the modelling process, but is also accessible for an audience with a background in computing science and/or model engineering. We show how signal transduction cascades can be modelled in a modular fashion, using both a qualitative approach--qualitative Petri nets, and quantitative approaches--continuous Petri nets and ordinary differential equations (ODEs). We review the major elementary building blocks of a cellular signalling model, discuss which critical design decisions have to be made during model building, and present a number of novel computational tools that can help to explore alternative modular models in an easy and intuitive manner. These tools, which are based on Petri net theory, offer convenient ways of composing hierarchical ODE models, and permit a qualitative analysis of their behaviour. We illustrate the central concepts using signal transduction as our main example. The ultimate aim is to introduce a general approach that provides the foundations for a structured formal engineering of large-scale models of biochemical networks.
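
    As a small ODE counterpart to the modular signalling models discussed above, the sketch below integrates a three-tier activation cascade with SciPy; the kinetics and rate constants are illustrative and not taken from the paper.

```python
import numpy as np
from scipy.integrate import odeint

# Minimal three-tier kinase cascade: each tier is activated by the layer above and
# deactivated with first-order kinetics (all parameters are placeholders).
def cascade(y, t, signal, k_act, k_deact):
    x1, x2, x3 = y            # fraction of active kinase at each tier
    dx1 = k_act[0] * signal * (1 - x1) - k_deact[0] * x1
    dx2 = k_act[1] * x1 * (1 - x2) - k_deact[1] * x2
    dx3 = k_act[2] * x2 * (1 - x3) - k_deact[2] * x3
    return [dx1, dx2, dx3]

t = np.linspace(0, 100, 500)
sol = odeint(cascade, y0=[0.0, 0.0, 0.0], t=t,
             args=(1.0, (0.1, 0.5, 0.5), (0.05, 0.1, 0.1)))
print(sol[-1])                # steady-state activation of the three tiers
```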

  16. A Novel Approach to Implement Takagi-Sugeno Fuzzy Models.

    Science.gov (United States)

    Chang, Chia-Wen; Tao, Chin-Wang

    2017-09-01

    This paper proposes new algorithms based on the fuzzy c-regression model algorithm for Takagi-Sugeno (T-S) fuzzy modeling of complex nonlinear systems. A fuzzy c-regression state model (FCRSM) algorithm is a T-S fuzzy model in which the functional antecedent and the state-space-model-type consequent are considered with the available input-output data. The antecedent and consequent forms of the proposed FCRSM offer two main advantages: one is that the FCRSM has a low computation load because only one input variable is considered in the antecedent part; the other is that the unknown system can be modeled not only in polynomial form but also in state-space form. Moreover, the FCRSM can be extended to the FCRSM-ND and FCRSM-Free algorithms. An algorithm, FCRSM-ND, is presented to find the T-S fuzzy state-space model of a nonlinear system when the input-output data cannot be precollected and an assumed effective controller is available. In practical applications, the mathematical model of the controller may be hard to obtain. In this case, an online tuning algorithm, FCRSM-Free, is designed such that the parameters of a T-S fuzzy controller and the T-S fuzzy state model of an unknown system can be tuned online simultaneously. Four numerical simulations are given to demonstrate the effectiveness of the proposed approach.

  17. A Composite Modelling Approach to Decision Support by the Use of the CBA-DK Model

    DEFF Research Database (Denmark)

    Barfod, Michael Bruhn; Salling, Kim Bang; Leleur, Steen

    2007-01-01

    This paper presents a decision support system for assessment of transport infrastructure projects. The composite modelling approach, COSIMA, combines a cost-benefit analysis by use of the CBA-DK model with multi-criteria analysis applying the AHP and SMARTER techniques. The modelling uncertaintie...

  18. A moni-modelling approach to manage groundwater risk to pesticide leaching at regional scale.

    Science.gov (United States)

    Di Guardo, Andrea; Finizio, Antonio

    2016-03-01

    Historically, the approach used to manage the risk of chemical contamination of water bodies is based on the use of monitoring programmes, which provide a snapshot of the presence/absence of chemicals in water bodies. Monitoring is required in the current EU regulations, such as the Water Framework Directive (WFD), as a tool to record temporal variation in the chemical status of water bodies. More recently, a number of models have been developed and used to forecast chemical contamination of water bodies. These models combine information on chemical properties, their use, and environmental scenarios. Both approaches are useful for risk assessors in decision processes. However, in our opinion, both show flaws and strengths when taken alone. This paper proposes an integrated approach (the moni-modelling approach) where monitoring data and modelling simulations work together in order to provide a common decision framework for the risk assessor. This approach would be very useful, particularly for the risk management of pesticides at a territorial level. It fulfils the requirement of the recent Sustainable Use of Pesticides Directive. In fact, the moni-modelling approach could be used to identify sensitive areas where mitigation measures or limitations on pesticide use should be implemented, and also to effectively re-design future monitoring networks or to better calibrate the pedo-climatic input data for the environmental fate models. A case study is presented, where the moni-modelling approach is applied in the Lombardy region (North of Italy) to identify groundwater areas vulnerable to pesticides. The approach has been applied to six active substances with different leaching behaviour, in order to highlight the advantages of using the proposed methodology. Copyright © 2015 Elsevier B.V. All rights reserved.

  19. Risk evaluation of uranium mining: A geochemical inverse modelling approach

    Science.gov (United States)

    Rillard, J.; Zuddas, P.; Scislewski, A.

    2011-12-01

    reactive mineral surface area. The formation of coatings on dissolving mineral surfaces significantly reduces the amount of surface available to react with fluids. Our results show that negatively charged ion complexes, responsible for U transport, decreases when alkalinity and rock buffer capacity is similarly lower. Carbonate ion pairs however, may increase U mobility when radionuclide concentration is high and rock buffer capacity is low. The present work helps to orient future monitoring of this site in Brazil as well as of other sites where uranium is linked to igneous rock formations, without the presence of sulphides. Monitoring SO4 migration (in acidic leaching uranium sites) seems to be an efficient and simple way to track different hazards, especially in tropical conditions, where the succession of dry and wet periods increases the weathering action of the residual H2SO4. Nevertheless, models of risk evaluation should take into account reactive surface areas and neogenic minerals since they determine the U ion complex formation, which in turn, controls uranium mobility in natural systems. Keywords: uranium mining, reactive mineral surface area, uranium complexes, inverse modelling approach, risk evaluation

  20. A generalized approach for historical mock-up acquisition and data modelling: Towards historically enriched 3D city models

    Science.gov (United States)

    Hervy, B.; Billen, R.; Laroche, F.; Carré, C.; Servières, M.; Van Ruymbeke, M.; Tourre, V.; Delfosse, V.; Kerouanton, J.-L.

    2012-10-01

    Museums are filled with hidden secrets. One of those secrets lies behind historical mock-ups whose signification goes far beyond a simple representation of a city. We face the challenge of designing, storing and showing knowledge related to these mock-ups in order to explain their historical value. Over the last few years, several mock-up digitalisation projects have been realised. Two of them, Nantes 1900 and Virtual Leodium, propose innovative approaches that present a lot of similarities. This paper presents a framework to go one step further by analysing their data modelling processes and extracting what could be a generalized approach to build a numerical mock-up and the associated knowledge database. Geometry modelling and knowledge modelling influence each other and are conducted in a parallel process. Our generalized approach describes a global overview of what a data modelling process can be. Our next goal is obviously to apply this global approach to other historical mock-ups, but we are also thinking about applying it to other 3D objects that need to embed semantic data, and about approaching historically enriched 3D city models.

  1. The place of quantitative energy models in a prospective approach

    International Nuclear Information System (INIS)

    Taverdet-Popiolek, N.

    2009-01-01

    Futurology above all depends on having the right mind set. Gaston Berger summarizes the prospective approach in five main thrusts: prepare for the distant future, be open-minded (have a systems and multidisciplinary approach), carry out in-depth analyses (draw out the actors which are really determinant for the future, as well as established trends), take risks (imagine risky but flexible projects) and finally think about humanity, futurology being a technique at the service of man to help him build a desirable future. On the other hand, forecasting is based on quantified models so as to deduce 'conclusions' about the future. In the field of energy, models are used to draw up scenarios which allow, for instance, measuring medium or long term effects of energy policies on greenhouse gas emissions or global welfare. Scenarios are shaped by the model's inputs (parameters, sets of assumptions) and outputs. Resorting to a model or projecting by scenario is useful in a prospective approach as it ensures coherence for most of the variables that have been identified through systems analysis and that the mind on its own has difficulty grasping. Interpretation of each scenario must be carried out in the light of the underlying framework of assumptions (the backdrop), developed during the prospective stage. When the horizon is far away (very long-term), the worlds imagined by the futurologist contain breaks (technological, behavioural and organizational) which are hard to integrate into the models. It is here that the main limit on the use of models in futurology lies. (author)

  2. Innovation Networks New Approaches in Modelling and Analyzing

    CERN Document Server

    Pyka, Andreas

    2009-01-01

    The science of graphs and networks has become by now a well-established tool for modelling and analyzing a variety of systems with a large number of interacting components. Starting from the physical sciences, applications have spread rapidly to the natural and social sciences, as well as to economics, and are now further extended, in this volume, to the concept of innovations, viewed broadly. In an abstract, systems-theoretical approach, innovation can be understood as a critical event which destabilizes the current state of the system, and results in a new process of self-organization leading to a new stable state. The contributions to this anthology address different aspects of the relationship between innovation and networks. The various chapters incorporate approaches in evolutionary economics, agent-based modeling, social network analysis and econophysics and explore the epistemic tension between insights into economics and society-related processes, and the insights into new forms of complex dynamics.

  3. Research on teacher education programs: logic model approach.

    Science.gov (United States)

    Newton, Xiaoxia A; Poon, Rebecca C; Nunes, Nicole L; Stone, Elisa M

    2013-02-01

    Teacher education programs in the United States face increasing pressure to demonstrate their effectiveness through pupils' learning gains in classrooms where program graduates teach. The link between teacher candidates' learning in teacher education programs and pupils' learning in K-12 classrooms implicit in the policy discourse suggests a one-to-one correspondence. However, the logical steps leading from what teacher candidates have learned in their programs to what they are doing in classrooms that may contribute to their pupils' learning are anything but straightforward. In this paper, we argue that the logic model approach from scholarship on evaluation can enhance research on teacher education by making explicit the logical links between program processes and intended outcomes. We demonstrate the usefulness of the logic model approach through our own work on designing a longitudinal study that focuses on examining the process and impact of an undergraduate mathematics and science teacher education program. Copyright © 2012 Elsevier Ltd. All rights reserved.

  4. Understanding complex urban systems multidisciplinary approaches to modeling

    CERN Document Server

    Gurr, Jens; Schmidt, J

    2014-01-01

    Understanding Complex Urban Systems takes as its point of departure the insight that the challenges of global urbanization and the complexity of urban systems cannot be understood – let alone ‘managed’ – by sectoral and disciplinary approaches alone. But while there has recently been significant progress in broadening and refining the methodologies for the quantitative modeling of complex urban systems, in deepening the theoretical understanding of cities as complex systems, or in illuminating the implications for urban planning, there is still a lack of well-founded conceptual thinking on the methodological foundations and the strategies of modeling urban complexity across the disciplines. Bringing together experts from the fields of urban and spatial planning, ecology, urban geography, real estate analysis, organizational cybernetics, stochastic optimization, and literary studies, as well as specialists in various systems approaches and in transdisciplinary methodologies of urban analysis, the volum...

  5. DISCRETE LATTICE ELEMENT APPROACH FOR ROCK FAILURE MODELING

    Directory of Open Access Journals (Sweden)

    Mijo Nikolić

    2017-01-01

    Full Text Available This paper presents the ‘discrete lattice model’, or, simply, the ‘lattice model’, developed for rock failure modeling. The main difficulties in numerical modeling, namely, those related to complex crack initiations and multiple crack propagations, their coalescence under the influence of natural disorder, and heterogeneities, are overcome using the approach presented in this paper. The lattice model is constructed as an assembly of Timoshenko beams, representing the cohesive links between the grains of the material, which are described by Voronoi polygons. The kinematics of the Timoshenko beams are enhanced by the embedded strong discontinuities in their axial and transversal directions so as to provide failure modes I, II, and III. The model presented is suitable for meso-scale rock simulations. The representative numerical simulations, in both 2D and 3D settings, are provided in order to illustrate the model’s capabilities.

  6. Computer Modeling of Violent Intent: A Content Analysis Approach

    Energy Technology Data Exchange (ETDEWEB)

    Sanfilippo, Antonio P.; Mcgrath, Liam R.; Bell, Eric B.

    2014-01-03

    We present a computational approach to modeling the intent of a communication source representing a group or an individual to engage in violent behavior. Our aim is to identify and rank aspects of radical rhetoric that are endogenously related to violent intent to predict the potential for violence as encoded in written or spoken language. We use correlations between contentious rhetoric and the propensity for violent behavior found in documents from radical terrorist and non-terrorist groups and individuals to train and evaluate models of violent intent. We then apply these models to unseen instances of linguistic behavior to detect signs of contention that have a positive correlation with violent intent factors. Of particular interest is the application of violent intent models to social media, such as Twitter, that have proved to serve as effective channels in furthering sociopolitical change.

  7. A fuzzy approach to the Weighted Overlap Dominance model

    DEFF Research Database (Denmark)

    Franco de los Rios, Camilo Andres; Hougaard, Jens Leth; Nielsen, Kurt

    2013-01-01

    Decision support models are required to handle the various aspects of multi-criteria decision problems in order to help the individual understand its possible solutions. In this sense, such models have to be capable of aggregating and exploiting different types of measurements and evaluations...... in an interactive way, where input data can take the form of uniquely-graded or interval-valued information. Here we explore the Weighted Overlap Dominance (WOD) model from a fuzzy perspective and its outranking approach to decision support and multidimensional interval analysis. Firstly, imprecision measures...... are introduced for characterizing the type of uncertainty being expressed by intervals, examining at the same time how the WOD model handles both non-interval as well as interval data, and secondly, relevance degrees are proposed for obtaining a ranking over the alternatives. Hence, a complete methodology...

  8. Modeling fabrication of nuclear components: An integrative approach

    Energy Technology Data Exchange (ETDEWEB)

    Hench, K.W.

    1996-08-01

    Reduction of the nuclear weapons stockpile and the general downsizing of the nuclear weapons complex has presented challenges for Los Alamos. One is to design an optimized fabrication facility to manufacture nuclear weapon primary components in an environment of intense regulation and shrinking budgets. This dissertation presents an integrative two-stage approach to modeling the casting operation for fabrication of nuclear weapon primary components. The first stage optimizes personnel radiation exposure for the casting operation layout by modeling the operation as a facility layout problem formulated as a quadratic assignment problem. The solution procedure uses an evolutionary heuristic technique. The best solutions to the layout problem are used as input to the second stage - a simulation model that assesses the impact of competing layouts on operational performance. The focus of the simulation model is to determine the layout that minimizes personnel radiation exposures and nuclear material movement, and maximizes the utilization of capacity for finished units.
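
    To make the first-stage formulation concrete, the following minimal sketch states a layout problem as a quadratic assignment objective and improves it with a simple mutate-and-select evolutionary loop. The flow and distance matrices are small hypothetical placeholders, not the facility data or the heuristic used in the dissertation.

      # Hedged sketch: quadratic assignment objective plus a toy evolutionary search.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 6                                    # number of process steps / candidate locations
      flow = rng.integers(0, 5, (n, n))        # hypothetical movement/exposure weights between steps
      dist = rng.integers(1, 10, (n, n))       # hypothetical distances between locations

      def cost(perm):
          # QAP objective: sum of flow[i, j] times the distance between the assigned locations.
          return sum(flow[i, j] * dist[perm[i], perm[j]] for i in range(n) for j in range(n))

      best = np.arange(n); best_cost = cost(best)
      for _ in range(2000):                    # mutate-and-select evolutionary loop
          cand = best.copy()
          i, j = rng.choice(n, 2, replace=False)
          cand[i], cand[j] = cand[j], cand[i]  # swap two assignments
          if (c := cost(cand)) < best_cost:
              best, best_cost = cand, c
      print("best layout:", best, "cost:", best_cost)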

  9. Hierarchical Agent-Based Integrated Modelling Approach for Microgrids with Adoption of EVs and HRES

    Directory of Open Access Journals (Sweden)

    Peng Han

    2014-01-01

    Full Text Available The large adoption of electric vehicles (EVs, hybrid renewable energy systems (HRESs, and the increasing of the loads shall bring significant challenges to the microgrid. The methodology to model microgrid with high EVs and HRESs penetrations is the key to EVs adoption assessment and optimized HRESs deployment. However, considering the complex interactions of the microgrid containing massive EVs and HRESs, any previous single modelling approaches are insufficient. Therefore in this paper, the methodology named Hierarchical Agent-based Integrated Modelling Approach (HAIMA is proposed. With the effective integration of the agent-based modelling with other advanced modelling approaches, the proposed approach theoretically contributes to a new microgrid model hierarchically constituted by microgrid management layer, component layer, and event layer. Then the HAIMA further links the key parameters and interconnects them to achieve the interactions of the whole model. Furthermore, HAIMA practically contributes to a comprehensive microgrid operation system, through which the assessment of the proposed model and the impact of the EVs adoption are achieved. Simulations show that the proposed HAIMA methodology will be beneficial for the microgrid study and EV’s operation assessment and shall be further utilized for the energy management, electricity consumption prediction, the EV scheduling control, and HRES deployment optimization.

  10. THE SIGNAL APPROACH TO MODELLING THE BALANCE OF PAYMENT CRISIS

    Directory of Open Access Journals (Sweden)

    O. Chernyak

    2016-12-01

    Full Text Available The paper presents a synthesis of theoretical models of balance of payments crises and investigates the most effective ways to model such a crisis in Ukraine. For the mathematical formalization of a balance of payments crisis, a comparative analysis of the effectiveness of different calculation methods for the Exchange Market Pressure Index was performed. A set of indicators that signal a growing likelihood of a balance of payments crisis was defined using the signal approach. Thresholds for these indicators were selected with the help of a minimization function; crossing a threshold signals an increased probability of a balance of payments crisis.
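
    The threshold-selection step of the signal approach can be illustrated with a small sketch that, for a single indicator, picks the cut-off minimising the noise-to-signal ratio. The indicator series and crisis flags below are simulated placeholders, not the Ukrainian data or the exact minimization function used in the paper.

      # Hedged sketch: choosing a signalling threshold by minimising the noise-to-signal ratio.
      import numpy as np

      rng = np.random.default_rng(1)
      indicator = rng.normal(0, 1, 200)                    # hypothetical crisis indicator values
      crisis = (indicator + rng.normal(0, 1, 200)) > 1.5   # hypothetical crisis flags

      def noise_to_signal(threshold):
          signal = indicator > threshold
          good = np.mean(signal[crisis])                   # share of crises correctly signalled
          bad = np.mean(signal[~crisis])                   # share of false alarms in tranquil periods
          return np.inf if good == 0 else bad / good

      grid = np.linspace(indicator.min(), indicator.max(), 100)
      best = min(grid, key=noise_to_signal)
      print("selected threshold:", round(float(best), 3))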

  11. Risk Modeling Approaches in Terms of Volatility Banking Transactions

    Directory of Open Access Journals (Sweden)

    Angelica Cucşa (Stratulat)

    2016-01-01

    Full Text Available The inseparability of risk and banking activity has been demonstrated ever since banking systems emerged, and the topic remains equally important for current practice and for the future development of the banking sector. The banking sector develops under constraints set by the nature and number of existing risks and of those that may arise, which act to limit banking activity. We intend to develop approaches for analysing risk through mathematical models, including a model for 10 actively traded picks on the Romanian capital market that will test investor reaction under controlled and uncontrolled risk conditions aggregated with harmonised factors.

  12. Modeling software with finite state machines a practical approach

    CERN Document Server

    Wagner, Ferdinand; Wagner, Thomas; Wolstenholme, Peter

    2006-01-01

    Modeling Software with Finite State Machines: A Practical Approach explains how to apply finite state machines to software development. It provides a critical analysis of using finite state machines as a foundation for executable specifications to reduce software development effort and improve quality. This book discusses the design of a state machine and of a system of state machines. It also presents a detailed analysis of development issues relating to behavior modeling with design examples and design rules for using finite state machines. This volume describes a coherent and well-tested fr
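
    By way of illustration only (not the book's own notation), a finite state machine can be captured as a transition table keyed by (state, event), which is often the simplest executable form of such a specification:

      # Hedged sketch: a table-driven finite state machine for a simple door controller.
      transitions = {
          ("closed",  "open_cmd"):  "opening",
          ("opening", "at_limit"):  "open",
          ("open",    "close_cmd"): "closing",
          ("closing", "at_limit"):  "closed",
      }

      def step(state, event):
          # Unknown events leave the state unchanged, mirroring a "no transition" entry.
          return transitions.get((state, event), state)

      state = "closed"
      for event in ["open_cmd", "at_limit", "close_cmd", "at_limit"]:
          state = step(state, event)
          print(event, "->", state)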

  13. Surrogate based approaches to parameter inference in ocean models

    KAUST Repository

    Knio, Omar

    2016-01-06

    This talk discusses the inference of physical parameters using model surrogates. Attention is focused on the use of sampling schemes to build suitable representations of the dependence of the model response on uncertain input data. Non-intrusive spectral projections and regularized regressions are used for this purpose. A Bayesian inference formalism is then applied to update the uncertain inputs based on available measurements or observations. To perform the update, we consider two alternative approaches, based on the application of Markov Chain Monte Carlo methods or of adjoint-based optimization techniques. We outline the implementation of these techniques to infer dependence of wind drag, bottom drag, and internal mixing coefficients.
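
    A compressed sketch of that workflow, under strong simplifications: a cheap polynomial surrogate replaces the ocean model, and a Metropolis random-walk sampler updates one uncertain parameter from a noisy observation. The model function, prior range and numbers are illustrative only, not the wind-drag or mixing setup discussed in the talk.

      # Hedged sketch: surrogate-based Bayesian parameter inference with Metropolis sampling.
      import numpy as np

      rng = np.random.default_rng(2)

      def expensive_model(theta):                 # stand-in for the costly forward model
          return np.sin(theta) + 0.1 * theta**2

      # Build a cheap surrogate from a handful of model runs (non-intrusive fit).
      train_theta = np.linspace(-2, 2, 9)
      surrogate = np.poly1d(np.polyfit(train_theta, expensive_model(train_theta), deg=4))

      obs, sigma = expensive_model(0.7) + 0.05, 0.1        # noisy observation of the response

      def log_post(theta):                        # Gaussian likelihood times a flat prior on [-2, 2]
          if not -2 <= theta <= 2:
              return -np.inf
          return -0.5 * ((obs - surrogate(theta)) / sigma) ** 2

      theta, samples = 0.0, []
      for _ in range(5000):                       # Metropolis random-walk update
          prop = theta + 0.2 * rng.normal()
          if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
              theta = prop
          samples.append(theta)
      print("posterior mean:", np.mean(samples[1000:]))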

  14. Understanding Gulf War Illness: An Integrative Modeling Approach

    Science.gov (United States)

    2017-10-01

    high-order diffusion imaging in a rat model of Gulf War Illness. §These authors contributed equally to the work. Brain Behavior and Immunity. pii...astrocyte specific transcriptome responses to neurotoxicity. §These authors contributed equally to the work. Submitted for Internal CDC-NIOSH...Antagonist: Evaluation of Beneficial Effects for Gulf War Illness 4) GW160116 (Nathanson) Genomics approach to find gender specific mechanisms of GWI

  15. An Approach for Modeling and Formalizing SOA Design Patterns

    OpenAIRE

    Tounsi , Imen; Hadj Kacem , Mohamed; Hadj Kacem , Ahmed; Drira , Khalil

    2013-01-01

    11 pages; International audience; Although design patterns have become increasingly popular, most of them are presented in an informal way, which can give rise to ambiguity and may lead to their incorrect usage. Patterns proposed by the SOA design pattern community are described with informal visual notations. Modeling SOA design patterns with a standard formal notation helps avoid misunderstanding by software architects and helps endow design methods with refinement approaches for...

  16. An approach for quantifying small effects in regression models.

    Science.gov (United States)

    Bedrick, Edward J; Hund, Lauren

    2018-04-01

    We develop a novel approach for quantifying small effects in regression models. Our method is based on variation in the mean function, in contrast to methods that focus on regression coefficients. Our idea applies in diverse settings such as testing for a negligible trend and quantifying differences in regression functions across strata. Straightforward Bayesian methods are proposed for inference. Four examples are used to illustrate the ideas.

  17. A Conditional Approach to Panel Data Models with Common Shocks

    Directory of Open Access Journals (Sweden)

    Giovanni Forchini

    2016-01-01

    Full Text Available This paper studies the effects of common shocks on the OLS estimators of the slopes’ parameters in linear panel data models. The shocks are assumed to affect both the errors and some of the explanatory variables. In contrast to existing approaches, which rely on using results on martingale difference sequences, our method relies on conditional strong laws of large numbers and conditional central limit theorems for conditionally-heterogeneous random variables.

  18. Modeling Defibrillation of the Heart: Approaches and Insights

    Science.gov (United States)

    Trayanova, Natalia; Constantino, Jason; Ashihara, Takashi; Plank, Gernot

    2012-01-01

    Cardiac defibrillation, as accomplished nowadays by automatic, implantable devices (ICDs), constitutes the most important means of combating sudden cardiac death. While ICD therapy has proved to be efficient and reliable, defibrillation is a traumatic experience. Thus, research on defibrillation mechanisms, particularly aimed at lowering defibrillation voltage, remains an important topic. Advancing our understanding towards a full appreciation of the mechanisms by which a shock interacts with the heart is the most promising approach to achieve this goal. The aim of this paper is to assess the current state-of-the-art in ventricular defibrillation modeling, focusing on both numerical modeling approaches and major insights that have been obtained using defibrillation models, primarily those of realistic ventricular geometry. The paper showcases the contributions that modeling and simulation have made to our understanding of the defibrillation process. The review thus provides an example of biophysically based computational modeling of the heart (i.e., cardiac defibrillation) that has advanced the understanding of cardiac electrophysiological interaction at the organ level and has the potential to contribute to the betterment of the clinical practice of defibrillation. PMID:22273793

  19. A multi-model ensemble approach to seabed mapping

    Science.gov (United States)

    Diesing, Markus; Stephens, David

    2015-06-01

    Seabed habitat mapping based on swath acoustic data and ground-truth samples is an emergent and active marine science discipline. Significant progress could be achieved by transferring techniques and approaches that have been successfully developed and employed in such fields as terrestrial land cover mapping. One such promising approach is the multiple classifier system, which aims at improving classification performance by combining the outputs of several classifiers. Here we present results of a multi-model ensemble applied to multibeam acoustic data covering more than 5000 km2 of seabed in the North Sea with the aim to derive accurate spatial predictions of seabed substrate. A suite of six machine learning classifiers (k-Nearest Neighbour, Support Vector Machine, Classification Tree, Random Forest, Neural Network and Naïve Bayes) was trained with ground-truth sample data classified into seabed substrate classes and their prediction accuracy was assessed with an independent set of samples. The three and five best performing models were combined to classifier ensembles. Both ensembles led to increased prediction accuracy as compared to the best performing single classifier. The improvements were however not statistically significant at the 5% level. Although the three-model ensemble did not perform significantly better than its individual component models, we noticed that the five-model ensemble did perform significantly better than three of the five component models. A classifier ensemble might therefore be an effective strategy to improve classification performance. Another advantage is the fact that the agreement in predicted substrate class between the individual models of the ensemble could be used as a measure of confidence. We propose a simple and spatially explicit measure of confidence that is based on model agreement and prediction accuracy.
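
    A minimal sketch of the ensemble idea on synthetic data: several classifiers vote on each sample, and the level of agreement among them doubles as a per-prediction confidence measure. The data and the particular classifiers below are placeholders, not the study's acoustic features or its six-model suite.

      # Hedged sketch: multi-model ensemble by majority vote, with agreement as confidence.
      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.neighbors import KNeighborsClassifier
      from sklearn.naive_bayes import GaussianNB
      from sklearn.model_selection import train_test_split

      X, y = make_classification(n_samples=400, n_features=6, n_classes=3,
                                 n_informative=4, random_state=0)
      Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

      models = [KNeighborsClassifier(), RandomForestClassifier(random_state=0), GaussianNB()]
      preds = np.array([m.fit(Xtr, ytr).predict(Xte) for m in models])   # one row per classifier

      # Majority vote across classifiers; agreement = share of models backing the winning class.
      vote = np.apply_along_axis(lambda col: np.bincount(col, minlength=3).argmax(), 0, preds)
      agreement = (preds == vote).mean(axis=0)

      print("ensemble accuracy:", (vote == yte).mean())
      print("mean model agreement:", agreement.mean())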

  20. Policy harmonized approach for the EU agricultural sector modelling

    Directory of Open Access Journals (Sweden)

    G. SALPUTRA

    2008-12-01

    Full Text Available The policy harmonized (PH) approach allows for the quantitative assessment of the impact of various elements of EU CAP direct support schemes, where the production effects of direct payments are accounted for through reaction prices formed by producer prices and policy price add-ons. Using the AGMEMOD model, the impacts of two possible EU agricultural policy scenarios upon beef production have been analysed – full decoupling with a switch from the historical to the regional Single Payment scheme, or alternatively with re-distribution of country direct payment envelopes via the introduction of an EU-wide flat area payment. The PH approach, by systematizing and harmonizing the management and use of policy data, ensures that projected differential policy impacts arising from changes in common EU policies reflect the likely actual differential impact as opposed to differences in how “common” policies are implemented within analytical models. The second section of the paper explains the AGMEMOD model’s structure. The policy harmonized evaluation method is presented in the third section. Results from an application of the PH approach are presented and discussed in the paper’s penultimate section, while section 5 concludes.

  1. Metabolic network modeling approaches for investigating the "hungry cancer".

    Science.gov (United States)

    Sharma, Ashwini Kumar; König, Rainer

    2013-08-01

    Metabolism is the functional phenotype of a cell, at a given condition, resulting from an intricate interplay of various regulatory processes. The study of these dynamic metabolic processes and their capabilities helps to identify the fundamental properties of living systems. Metabolic deregulation is an emerging hallmark of cancer cells. This deregulation results in rewiring of the metabolic circuitry, conferring an exploitative metabolic advantage on the tumor cells, which leads to a distinct benefit in survival and lays the basis for unbounded progression. Metabolism can be considered a thermodynamically open system in which source substrates of high value are processed through a well-established interconnected biochemical conversion system, strictly obeying physicochemical principles, generating useful intermediates and finally resulting in the release of byproducts. Based on this basic principle of an input-output balance, various models have been developed to interrogate metabolism and elucidate its underlying functional properties. However, only a few modeling approaches have proved computationally feasible in elucidating the metabolic nature of cancer at a systems level. In addition, statistical approaches have been set up to identify biochemical pathways that are more relevant for specific types of tumor cells. In this review, we briefly introduce the basic statistical approaches followed by the major modeling concepts. We place emphasis on the methods and their applications that have been used to a greater extent in understanding the metabolic remodeling of cancer. Copyright © 2013 Elsevier Ltd. All rights reserved.

  2. Modeling and forecasting energy consumption for heterogeneous buildings using a physical–statistical approach

    International Nuclear Information System (INIS)

    Lü, Xiaoshu; Lu, Tao; Kibert, Charles J.; Viljanen, Martti

    2015-01-01

    Highlights:
    • This paper presents a new modeling method to forecast energy demands.
    • The model is based on a physical–statistical approach to improving forecast accuracy.
    • A new method is proposed to address the heterogeneity challenge.
    • Comparison with measurements shows accurate forecasts of the model.
    • The first physical–statistical/heterogeneous building energy modeling approach is proposed and validated.
    Abstract: Energy consumption forecasting is a critical and necessary input to planning and controlling energy usage in the building sector, which accounts for 40% of the world’s energy use and the world’s greatest fraction of greenhouse gas emissions. However, due to the diversity and complexity of buildings as well as the random nature of weather conditions, energy consumption and loads are stochastic and difficult to predict. This paper presents a new methodology for energy demand forecasting that addresses the heterogeneity challenges in energy modeling of buildings. The new method is based on a physical–statistical approach designed to account for building heterogeneity to improve forecast accuracy. The physical model provides a theoretical input to characterize the underlying physical mechanism of energy flows. Then stochastic parameters are introduced into the physical model and the statistical time series model is formulated to reflect model uncertainties and individual heterogeneity in buildings. A new method of model generalization based on a convex hull technique is further derived to parameterize the individual-level model parameters for consistent model coefficients while maintaining satisfactory modeling accuracy for heterogeneous buildings. The proposed method and its validation are presented in detail for four different sports buildings with field measurements. The results show that the proposed methodology and model can provide a considerable improvement in forecasting accuracy.

  3. Dry deposition models for radionuclides dispersed in air: a new approach for deposition velocity evaluation schema

    Science.gov (United States)

    Giardina, M.; Buffa, P.; Cervone, A.; De Rosa, F.; Lombardo, C.; Casamirra, M.

    2017-11-01

    In the framework of a National Research Program funded by the Italian Minister of Economic Development, the Department of Energy, Information Engineering and Mathematical Models (DEIM) of Palermo University and ENEA Research Centre of Bologna, Italy are performing several research activities to study physical models and mathematical approaches aimed at investigating dry deposition mechanisms of radioactive pollutants. On the basis of such studies, a new approach to evaluate the dry deposition velocity for particles is proposed. Comparisons with some literature experimental data show that the proposed dry deposition scheme can capture the main phenomena involved in the dry deposition process successfully.

  4. Rescaled Local Interaction Simulation Approach for Shear Wave Propagation Modelling in Magnetic Resonance Elastography

    Directory of Open Access Journals (Sweden)

    Z. Hashemiyan

    2016-01-01

    Full Text Available Properties of soft biological tissues are increasingly used in medical diagnosis to detect various abnormalities, for example, in liver fibrosis or breast tumors. It is well known that mechanical stiffness of human organs can be obtained from organ responses to shear stress waves through Magnetic Resonance Elastography. The Local Interaction Simulation Approach is proposed for effective modelling of shear wave propagation in soft tissues. The results are validated using experimental data from Magnetic Resonance Elastography. These results show the potential of the method for shear wave propagation modelling in soft tissues. The major advantage of the proposed approach is a significant reduction of computational effort.

  5. Multiscale modeling of alloy solidification using a database approach

    Science.gov (United States)

    Tan, Lijian; Zabaras, Nicholas

    2007-11-01

    A two-scale model based on a database approach is presented to investigate alloy solidification. Appropriate assumptions are introduced to describe the behavior of macroscopic temperature, macroscopic concentration, liquid volume fraction and microstructure features. These assumptions lead to a macroscale model with two unknown functions: liquid volume fraction and microstructure features. These functions are computed using information from microscale solutions of selected problems. This work addresses the selection of sample problems relevant to the interested problem and the utilization of data from the microscale solution of the selected sample problems. A computationally efficient model, which is different from the microscale and macroscale models, is utilized to find relevant sample problems. In this work, the computationally efficient model is a sharp interface solidification model of a pure material. Similarities between the sample problems and the problem of interest are explored by assuming that the liquid volume fraction and microstructure features are functions of solution features extracted from the solution of the computationally efficient model. The solution features of the computationally efficient model are selected as the interface velocity and thermal gradient in the liquid at the time the sharp solid-liquid interface passes through. An analytical solution of the computationally efficient model is utilized to select sample problems relevant to solution features obtained at any location of the domain of the problem of interest. The microscale solution of selected sample problems is then utilized to evaluate the two unknown functions (liquid volume fraction and microstructure features) in the macroscale model. The temperature solution of the macroscale model is further used to improve the estimation of the liquid volume fraction and microstructure features. Interpolation is utilized in the feature space to greatly reduce the number of required

  6. Modeling in biopharmaceutics, pharmacokinetics and pharmacodynamics homogeneous and heterogeneous approaches

    CERN Document Server

    Macheras, Panos

    2016-01-01

    The state of the art in Biopharmaceutics, Pharmacokinetics, and Pharmacodynamics Modeling is presented in this new second edition book. It shows how advanced physical and mathematical methods can expand classical models in order to cover heterogeneous drug-biological processes and therapeutic effects in the body. The book is divided into four parts; the first deals with the fundamental principles of fractals, diffusion and nonlinear dynamics; the second with drug dissolution, release, and absorption; the third with empirical, compartmental, and stochastic pharmacokinetic models, with two new chapters, one on fractional pharmacokinetics and one on bioequivalence; and the fourth mainly with classical and nonclassical aspects of pharmacodynamics. The classical models that have relevance and application to these sciences are also considered throughout. This second edition has new information on reaction-limited models of dissolution, the non-binary biopharmaceutic classification system, time-varying models, and interf...

  7. Modeling drug- and chemical- induced hepatotoxicity with systems biology approaches

    Directory of Open Access Journals (Sweden)

    Sudin eBhattacharya

    2012-12-01

    Full Text Available We provide an overview of computational systems biology approaches as applied to the study of chemical- and drug-induced toxicity. The concept of ‘toxicity pathways’ is described in the context of the 2007 US National Academies of Science report, Toxicity testing in the 21st Century: A Vision and A Strategy. Pathway mapping and modeling based on network biology concepts are a key component of the vision laid out in this report for a more biologically-based analysis of dose-response behavior and the safety of chemicals and drugs. We focus on toxicity of the liver (hepatotoxicity) – a complex phenotypic response with contributions from a number of different cell types and biological processes. We describe three case studies of complementary multi-scale computational modeling approaches to understand perturbation of toxicity pathways in the human liver as a result of exposure to environmental contaminants and specific drugs. One approach involves development of a spatial, multicellular virtual tissue model of the liver lobule that combines molecular circuits in individual hepatocytes with cell-cell interactions and blood-mediated transport of toxicants through hepatic sinusoids, to enable quantitative, mechanistic prediction of hepatic dose-response for activation of the AhR toxicity pathway. Simultaneously, methods are being developed to extract quantitative maps of intracellular signaling and transcriptional regulatory networks perturbed by environmental contaminants, using a combination of gene expression and genome-wide protein-DNA interaction data. A predictive physiological model (DILIsymTM) to understand drug-induced liver injury (DILI), the most common adverse event leading to termination of clinical development programs and regulatory actions on drugs, is also described. The model initially focuses on reactive metabolite-induced DILI in response to administration of acetaminophen, and spans multiple biological scales.

  8. Overview of the FEP analysis approach to model development

    International Nuclear Information System (INIS)

    Bailey, L.

    1998-01-01

    This report heads a suite of documents describing the Nirex model development programme. The programme is designed to provide a clear audit trail from the identification of significant features, events and processes (FEPs) to the models and modelling processes employed within a detailed safety assessment. A five stage approach has been adopted, which provides a systematic framework for addressing uncertainty and for the documentation of all modelling decisions and assumptions. The five stages are as follows: Stage 1: FEP Analysis - compilation and structuring of a FEP database; Stage 2: Scenario and Conceptual Model Development; Stage 3: Mathematical Model Development; Stage 4: Software Development; Stage 5: Confidence Building. This report describes the development and structuring of a FEP database as a Master Directed Diagram (MDD) and explains how this may be used to identify different modelling scenarios, based upon the identification of scenario-defining FEPs. The methodology describes how the possible evolution of a repository system can be addressed in terms of a base scenario, a broad and reasonable representation of the 'natural' evolution of the system, and a number of variant scenarios, representing the effects of probabilistic events and processes. The MDD has been used to identify conceptual models to represent the base scenario and the interactions between these conceptual models have been systematically reviewed using a matrix diagram technique. This has led to the identification of modelling requirements for the base scenario, against which existing assessment software capabilities have been reviewed. A mechanism for combining probabilistic scenario-defining FEPs to construct multi-FEP variant scenarios has been proposed and trialled using the concept of a 'timeline', a defined sequence of events, from which consequences can be assessed. An iterative approach, based on conservative modelling principles, has been proposed for the evaluation of

  9. Modeling of correlated data with informative cluster sizes: An evaluation of joint modeling and within-cluster resampling approaches.

    Science.gov (United States)

    Zhang, Bo; Liu, Wei; Zhang, Zhiwei; Qu, Yanping; Chen, Zhen; Albert, Paul S

    2017-08-01

    Joint modeling and within-cluster resampling are two approaches that are used for analyzing correlated data with informative cluster sizes. Motivated by a developmental toxicity study, we examined the performances and validity of these two approaches in testing covariate effects in generalized linear mixed-effects models. We show that the joint modeling approach is robust to the misspecification of cluster size models in terms of Type I and Type II errors when the corresponding covariates are not included in the random effects structure; otherwise, statistical tests may be affected. We also evaluate the performance of the within-cluster resampling procedure and thoroughly investigate the validity of it in modeling correlated data with informative cluster sizes. We show that within-cluster resampling is a valid alternative to joint modeling for cluster-specific covariates, but it is invalid for time-dependent covariates. The two methods are applied to a developmental toxicity study that investigated the effect of exposure to diethylene glycol dimethyl ether.
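
    To make the within-cluster resampling idea concrete, the sketch below repeatedly draws one observation per cluster, fits an ordinary logistic regression to each resampled data set, and averages the coefficient estimates. The simulated data, with cluster sizes deliberately made informative, are purely illustrative and not the developmental toxicity study analysed in the paper.

      # Hedged sketch: within-cluster resampling for clustered binary data.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(3)
      clusters = []
      for cid in range(200):
          b = rng.normal()                              # cluster-level random effect
          size = 2 + rng.poisson(max(0.1, 2 + b))       # cluster size depends on b (informative)
          x = rng.normal(size=size)
          p = 1 / (1 + np.exp(-(0.5 * x + b)))
          y = rng.binomial(1, p)
          clusters.append((x, y))

      coefs = []
      for _ in range(300):                              # within-cluster resampling replicates
          idx = [rng.integers(len(x)) for x, _ in clusters]   # one observation per cluster
          X = np.array([x[i] for (x, _), i in zip(clusters, idx)]).reshape(-1, 1)
          Y = np.array([y[i] for (_, y), i in zip(clusters, idx)])
          coefs.append(LogisticRegression().fit(X, Y).coef_[0, 0])

      print("resampling-based slope estimate:", np.mean(coefs))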

  10. Quantitative versus qualitative modeling: a complementary approach in ecosystem study.

    Science.gov (United States)

    Bondavalli, C; Favilla, S; Bodini, A

    2009-02-01

    Natural disturbance or human perturbation act upon ecosystems by changing some dynamical parameters of one or more species. Foreseeing these modifications is necessary before embarking on an intervention: predictions may help to assess management options and define hypothesis for interventions. Models become valuable tools for studying and making predictions only when they capture types of interactions and their magnitude. Quantitative models are more precise and specific about a system, but require a large effort in model construction. Because of this very often ecological systems remain only partially specified and one possible approach to their description and analysis comes from qualitative modelling. Qualitative models yield predictions as directions of change in species abundance but in complex systems these predictions are often ambiguous, being the result of opposite actions exerted on the same species by way of multiple pathways of interactions. Again, to avoid such ambiguities one needs to know the intensity of all links in the system. One way to make link magnitude explicit in a way that can be used in qualitative analysis is described in this paper and takes advantage of another type of ecosystem representation: ecological flow networks. These flow diagrams contain the structure, the relative position and the connections between the components of a system, and the quantity of matter flowing along every connection. In this paper it is shown how these ecological flow networks can be used to produce a quantitative model similar to the qualitative counterpart. Analyzed through the apparatus of loop analysis this quantitative model yields predictions that are by no means ambiguous, solving in an elegant way the basic problem of qualitative analysis. The approach adopted in this work is still preliminary and we must be careful in its application.

  11. Fugacity superposition: a new approach to dynamic multimedia fate modeling.

    Science.gov (United States)

    Hertwich, E G

    2001-08-01

    The fugacities, concentrations, or inventories of pollutants in environmental compartments as determined by multimedia environmental fate models of the Mackay type can be superimposed on each other. This is true for both steady-state (level III) and dynamic (level IV) models. Any problem in multimedia fate models with linear, time-invariant transfer and transformation coefficients can be solved through a superposition of a set of n independent solutions to a set of coupled, homogeneous first-order differential equations, where n is the number of compartments in the model. For initial condition problems in dynamic models, the initial inventories can be separated, e.g. by a compartment. The solution is obtained by adding the single-compartment solutions. For time-varying emissions, a convolution integral is used to superimpose solutions. The advantage of this approach is that the differential equations have to be solved only once. No numeric integration is required. Alternatively, the dynamic model can be simplified to algebraic equations using the Laplace transform. For time-varying emissions, the Laplace transform of the model equations is simply multiplied with the Laplace transform of the emission profile. It is also shown that the time-integrated inventories of the initial conditions problems are the same as the inventories in the steady-state problem. This implies that important properties of pollutants such as potential dose, persistence, and characteristic travel distance can be derived from the steady state.
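
    A small numerical sketch of the superposition idea for a linear, time-invariant two-compartment system: the response to a unit pulse is computed once with a matrix exponential, and the response to a time-varying emission is then obtained by discrete convolution of that impulse response. The rate constants and emission profile are hypothetical, not taken from a specific Mackay-type model.

      # Hedged sketch: superposition/convolution for a linear time-invariant compartment model.
      import numpy as np
      from scipy.linalg import expm

      # Hypothetical first-order transfer/loss coefficients between two compartments (1/day).
      A = np.array([[-0.30,  0.05],
                    [ 0.10, -0.20]])

      dt, nsteps = 0.1, 600
      times = np.arange(nsteps) * dt
      prop = expm(A * dt)                               # one-step propagator, computed once

      # Impulse response: evolution of a unit pulse emitted into compartment 1.
      impulse = np.zeros((nsteps, 2)); state = np.array([1.0, 0.0])
      for k in range(nsteps):
          impulse[k] = state
          state = prop @ state

      # Time-varying emission into compartment 1 (units/day), superimposed via convolution.
      emission = np.where(times < 10, 1.0, 0.0)
      inventory = np.array([np.convolve(emission, impulse[:, i])[:nsteps] * dt
                            for i in range(2)]).T

      print("inventories at the final time step:", inventory[-1])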

  12. Bankruptcy prediction using SVM models with a new approach to combine features selection and parameter optimisation

    Science.gov (United States)

    Zhou, Ligang; Keung Lai, Kin; Yen, Jerome

    2014-03-01

    Due to the economic significance of bankruptcy prediction of companies for financial institutions, investors and governments, many quantitative methods have been used to develop effective prediction models. Support vector machine (SVM), a powerful classification method, has been used for this task; however, the performance of SVM is sensitive to model form, parameter setting and features selection. In this study, a new approach based on direct search and features ranking technology is proposed to optimise features selection and parameter setting for 1-norm and least-squares SVM models for bankruptcy prediction. This approach is also compared to the SVM models with parameter optimisation and features selection by the popular genetic algorithm technique. The experimental results on a data set with 2010 instances show that the proposed models are good alternatives for bankruptcy prediction.
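
    A stripped-down sketch of the general idea: rank the features, keep the top ones, then directly search a small grid of SVM parameters scored by cross-validation. The synthetic data and the univariate ranking statistic are placeholders for the paper's features-ranking technology and direct-search procedure.

      # Hedged sketch: feature ranking followed by a direct search over SVM hyper-parameters.
      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.feature_selection import f_classif
      from sklearn.model_selection import cross_val_score
      from sklearn.svm import SVC

      X, y = make_classification(n_samples=300, n_features=20, n_informative=5, random_state=0)

      # Rank features by a univariate F statistic and keep the top k.
      scores, _ = f_classif(X, y)
      top = np.argsort(scores)[::-1][:5]
      Xs = X[:, top]

      # Direct search over a small (C, gamma) grid, scored by cross-validated accuracy.
      best = max(((C, g) for C in [0.1, 1, 10, 100] for g in [0.01, 0.1, 1]),
                 key=lambda p: cross_val_score(SVC(C=p[0], gamma=p[1]), Xs, y, cv=5).mean())
      print("selected features:", top, "best (C, gamma):", best)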

  13. Dealing with noisy absences to optimize species distribution models: an iterative ensemble modelling approach.

    Directory of Open Access Journals (Sweden)

    Christine Lauzeral

    Full Text Available Species distribution models (SDMs) are widespread in ecology and conservation biology, but their accuracy can be lowered by non-environmental (noisy) absences that are common in species occurrence data. Here we propose an iterative ensemble modelling (IEM) method to deal with noisy absences and hence improve the predictive reliability of ensemble modelling of species distributions. In the IEM approach, outputs of a classical ensemble model (EM) were used to update the raw occurrence data. The revised data was then used as input for a new EM run. This process was iterated until the predictions stabilized. The outputs of the iterative method were compared to those of the classical EM using virtual species. The IEM process tended to converge rapidly. It increased the consensus between predictions provided by the different methods as well as between those provided by different learning data sets. Comparing IEM and EM showed that for high levels of non-environmental absences, iterations significantly increased prediction reliability measured by the Kappa and TSS indices, as well as the percentage of well-predicted sites. Compared to EM, IEM also reduced biases in estimates of species prevalence. Compared to the classical EM method, IEM improves the reliability of species predictions. It particularly deals with noisy absences that are replaced in the data matrices by simulated presences during the iterative modelling process. IEM thus constitutes a promising way to increase the accuracy of EM predictions of difficult-to-detect species, as well as of species that are not in equilibrium with their environment.
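
    A toy sketch of the iterative step described above: a model is fitted, sites recorded as absent but predicted highly suitable are relabelled as presences, and the model is refitted until the labels stop changing. A single random forest stands in for the full multi-method ensemble, and the virtual-species data are simulated placeholders.

      # Hedged sketch: iterative relabelling of suspicious absences, refit, repeat.
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      rng = np.random.default_rng(4)
      env = rng.uniform(0, 1, (1000, 2))                          # environmental predictors
      true_presence = (env[:, 0] + env[:, 1] > 1.0).astype(int)   # virtual species
      obs = true_presence.copy()
      obs[(true_presence == 1) & (rng.uniform(size=1000) < 0.3)] = 0   # noisy (false) absences

      labels = obs.copy()
      for it in range(10):
          model = RandomForestClassifier(n_estimators=100, random_state=0).fit(env, labels)
          suit = model.predict_proba(env)[:, 1]
          new = labels.copy()
          new[(labels == 0) & (suit > 0.8)] = 1                   # replace likely false absences
          if np.array_equal(new, labels):                         # iterate until labels stabilise
              break
          labels = new

      print("iterations:", it + 1, "false absences recovered:",
            int(((labels == 1) & (obs == 0) & (true_presence == 1)).sum()))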

  14. A comprehensive approach to age-dependent dosimetric modeling

    Energy Technology Data Exchange (ETDEWEB)

    Leggett, R.W.; Cristy, M.; Eckerman, K.F.

    1986-01-01

    In the absence of age-specific biokinetic models, current retention models of the International Commission on Radiological Protection (ICRP) frequently are used as a point of departure for evaluation of exposures to the general population. These models were designed and intended for estimation of long-term integrated doses to the adult worker. Their format and empirical basis preclude incorporation of much valuable physiological information and physiologically reasonable assumptions that could be used in characterizing the age-specific behavior of radioelements in humans. In this paper we discuss a comprehensive approach to age-dependent dosimetric modeling in which consideration is given not only to changes with age in masses and relative geometries of body organs and tissues but also to best available physiological and radiobiological information relating to the age-specific biobehavior of radionuclides. This approach is useful in obtaining more accurate estimates of long-term dose commitments as a function of age at intake, but it may be particularly valuable in establishing more accurate estimates of dose rate as a function of age. Age-specific dose rates are needed for a proper analysis of the potential effects on estimates or risk of elevated dose rates per unit intake in certain stages of life, elevated response per unit dose received during some stages of life, and age-specific non-radiogenic competing risks.

  15. A cascade modelling approach to flood extent estimation

    Science.gov (United States)

    Pedrozo-Acuña, Adrian; Rodríguez-Rincón, Juan Pablo; Breña-Naranjo, Agustin

    2014-05-01

    Recent efforts dedicated to the generation of new flood risk management strategies have pointed out that a possible way forward for an improvement in this field relies on the reduction and quantification of the uncertainties associated with the prediction system. With the purpose of reducing these uncertainties, this investigation follows a cascade modelling approach (meteorological - hydrological - 2D hydrodynamic) in combination with high-quality data (LiDAR, satellite imagery, precipitation), to study an extreme event registered last year in Mexico. The presented approach is useful both for the characterisation of epistemic uncertainties and for the generation of flood management strategies through probabilistic flood maps. Uncertainty is considered in both meteorological and hydrological models, and is propagated to a given flood extent as determined with a hydrodynamic model. Although the methodology does not consider all the uncertainties that may be involved in the determination of a flooded area, it enables a better understanding of the interaction between errors in the set-up of models and their propagation to a given result.

  16. A comprehensive approach to age-dependent dosimetric modeling

    International Nuclear Information System (INIS)

    Leggett, R.W.; Cristy, M.; Eckerman, K.F.

    1986-01-01

    In the absence of age-specific biokinetic models, current retention models of the International Commission on Radiological Protection (ICRP) frequently are used as a point of departure for evaluation of exposures to the general population. These models were designed and intended for estimation of long-term integrated doses to the adult worker. Their format and empirical basis preclude incorporation of much valuable physiological information and physiologically reasonable assumptions that could be used in characterizing the age-specific behavior of radioelements in humans. In this paper we discuss a comprehensive approach to age-dependent dosimetric modeling in which consideration is given not only to changes with age in masses and relative geometries of body organs and tissues but also to best available physiological and radiobiological information relating to the age-specific biobehavior of radionuclides. This approach is useful in obtaining more accurate estimates of long-term dose commitments as a function of age at intake, but it may be particularly valuable in establishing more accurate estimates of dose rate as a function of age. Age-specific dose rates are needed for a proper analysis of the potential effects on estimates or risk of elevated dose rates per unit intake in certain stages of life, elevated response per unit dose received during some stages of life, and age-specific non-radiogenic competing risks

  17. Micromechanical modeling and inverse identification of damage using cohesive approaches

    International Nuclear Information System (INIS)

    Blal, Nawfal

    2013-01-01

    In this study a micromechanical model is proposed for a collection of cohesive zone models embedded between each pair of elements of a standard cohesive-volumetric finite element method. An equivalent 'matrix-inclusions' composite is proposed as a representation of the cohesive-volumetric discretization. The overall behaviour is obtained using homogenization approaches (the Hashin–Shtrikman scheme and the P. Ponte Castaneda approach). The derived model deals with elastic, brittle and ductile materials. It is applicable whatever the loading triaxiality ratio and the shape of the cohesive law, and leads to direct relationships between the overall material properties, the local cohesive parameters and the mesh density. First, rigorous bounds on the normal and tangential cohesive stiffnesses are obtained, leading to suitable control of the inherent artificial loss of elastic stiffness induced by intrinsic cohesive models. Second, theoretical criteria on damageable and ductile cohesive parameters are established (cohesive peak stress, critical separation, cohesive failure energy, ...). These criteria allow a practical calibration of the cohesive zone parameters as a function of the overall material properties and the mesh length. The main interest of such a calibration is its promising capacity to lead to a mesh-insensitive overall response in surface damage. (author) [fr]

  18. Artificial Life of Soybean Plant Growth Modeling Using Intelligence Approaches

    Directory of Open Access Journals (Sweden)

    Atris Suyantohadi

    2010-03-01

    Full Text Available The natural process of plant growth is a complex system whose characteristics can be studied using intelligent approaches combined with an artificial life system. In this research, approaches for examining the natural growth process of soybean (Glycine max L. Merr) have been analyzed and synthesized through modeling using Artificial Neural Network (ANN) and Lindenmayer System (L-System) methods. The research aimed to design and visualize plant growth models for soybean varieties, which could help in studying plant botany as a function of fertilizer compositions of Nitrogen (N), Phosphorus (P) and Potassium (K). Soybean plant growth was analyzed based on treatments with different fertilizer compositions in experimental research in order to develop the plant growth model. Using N, P, K fertilizer compositions, the best treatment yielded the highest production of 2.074 tons/hectare. Using these models, artificial life simulations that identify and visualize the characteristics of soybean plant growth could be demonstrated and applied.
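
    To illustrate the L-System half of such an approach, the sketch below rewrites a simple bracketed string a few times; the production rules are a generic textbook branching example, not the soybean-specific grammar calibrated by the ANN in this study.

      # Hedged sketch: a Lindenmayer system rewriting step used in plant-growth visualisation.
      rules = {"X": "F[+X][-X]FX",   # generic branching rule (not the soybean-calibrated grammar)
               "F": "FF"}

      def lsystem(axiom, rules, iterations):
          s = axiom
          for _ in range(iterations):
              s = "".join(rules.get(ch, ch) for ch in s)   # apply all productions in parallel
          return s

      # Each derived string can be drawn with turtle graphics: F = grow, +/- = turn, [ ] = branch.
      for n in range(4):
          print(n, lsystem("X", rules, n)[:60])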

  19. A Succinct Approach to Static Analysis and Model Checking

    DEFF Research Database (Denmark)

    Filipiuk, Piotr

    In a number of areas software correctness is crucial, therefore it is often desirable to formally verify the presence of various properties or the absence of errors. This thesis presents a framework for concisely expressing static analysis and model checking problems. The framework facilitates...... in the classical formulation of ALFP logic. Finally, we show that the logics and the associated solvers can be used for rapid prototyping. We illustrate that by a variety of case studies from static analysis and model checking....

  20. Dual Numbers Approach in Multiaxis Machines Error Modeling

    Directory of Open Access Journals (Sweden)

    Jaroslav Hrdina

    2014-01-01

    Full Text Available Multiaxis machines error modeling is set in the context of modern differential geometry and linear algebra. We apply special classes of matrices over dual numbers and propose a generalization of such concept by means of general Weil algebras. We show that the classification of the geometric errors follows directly from the algebraic properties of the matrices over dual numbers and thus the calculus over the dual numbers is the proper tool for the methodology of multiaxis machines error modeling.
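
    A small sketch of the dual-number calculus mentioned above: numbers of the form a + b·eps with eps² = 0 propagate first-order (error) terms automatically through sums and products, which is the algebraic mechanism behind such kinematic error models. The tiny class below is illustrative only, not the paper's matrix or Weil-algebra construction.

      # Hedged sketch: dual numbers a + b*eps with eps**2 = 0 propagating first-order errors.
      class Dual:
          def __init__(self, real, eps=0.0):
              self.real, self.eps = real, eps

          def __add__(self, other):
              return Dual(self.real + other.real, self.eps + other.eps)

          def __mul__(self, other):
              # (a + b eps)(c + d eps) = ac + (ad + bc) eps, since eps**2 = 0.
              return Dual(self.real * other.real,
                          self.real * other.eps + self.eps * other.real)

          def __repr__(self):
              return f"{self.real} + {self.eps}eps"

      # Nominal axis positions with small geometric errors carried in the eps part.
      x = Dual(2.0, 0.01)       # nominal 2.0, first-order error 0.01
      y = Dual(3.0, -0.02)
      print(x * y)              # real part: nominal product; eps part: combined first-order error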

  1. Numerical modelling of carbonate platforms and reefs: approaches and opportunities

    Energy Technology Data Exchange (ETDEWEB)

    Dalmasso, H.; Montaggioni, L.F.; Floquet, M. [Universite de Provence, Marseille (France). Centre de Sedimentologie-Palaeontologie; Bosence, D. [Royal Holloway University of London, Egham (United Kingdom). Dept. of Geology

    2001-07-01

    This paper compares different computing procedures that have been utilized in simulating shallow-water carbonate platform development. Based on our geological knowledge we can usually give a rather accurate qualitative description of the mechanisms controlling geological phenomena. Further description requires the use of computer stratigraphic simulation models that allow quantitative evaluation and understanding of the complex interactions of sedimentary depositional carbonate systems. The roles of modelling include: (1) encouraging accuracy and precision in data collection and process interpretation (Watney et al., 1999); (2) providing a means to quantitatively test interpretations concerning the control of various mechanisms on producing sedimentary packages; (3) predicting or extrapolating results into areas of limited control; (4) gaining new insights regarding the interaction of parameters; (5) helping focus on future studies to resolve specific problems. This paper addresses two main questions, namely: (1) What are the advantages and disadvantages of various types of models? (2) How well do models perform? In this paper we compare and discuss the application of five numerical models: CARBONATE (Bosence and Waltham, 1990), FUZZIM (Nordlund, 1999), CARBPLAT (Bosscher, 1992), DYNACARB (Li et al., 1993), PHIL (Bowman, 1997) and SEDPAK (Kendall et al., 1991). The comparison, testing and evaluation of these models allow one to gain a better knowledge and understanding of controlling parameters of carbonate platform development, which are necessary for modelling. Evaluating numerical models, critically comparing results from models using different approaches, and pushing experimental tests to their limits, provide an effective vehicle to improve and develop new numerical models. A main feature of this paper is to closely compare the performance between two numerical models: a forward model (CARBONATE) and a fuzzy logic model (FUZZIM). These two models use common

  2. Return of investment and profitability analysis of bio-fuels production using a modeling approach

    Directory of Open Access Journals (Sweden)

    Yangyang Deng

    2016-06-01

    Full Text Available The objectives of this study were to evaluate the return of investment and profitability of a bio-gasification facility using a modeling method. Based on preliminary market analysis, the results determined that the power facilities driven by biomass gasifiers could be profitable if they consider the most sensitive cost factors such as labor, project investment, and feedstock supply. The result showed that economic feasibility of bio-gasification facility can significantly affect by its production capacity and operating modes (one shift, two shifts, or three shifts. The cost analysis modeling approach developed in this study could be a good approach for economic analysis of bio-syngas and bio-fuel products. In addition, this study demonstrated a unique modeling approach to analyze return of investment and profitability of biofuels production.

  3. Risk assessment of oil price from static and dynamic modelling approaches

    DEFF Research Database (Denmark)

    Mi, Zhi-Fu; Wei, Yi-Ming; Tang, Bao-Jun

    2017-01-01

    market circumstances and volatility of oil price require a comprehensive reestimation of risk. Therefore, this study aims to explore an integrated approach to assess the price risk in the two crude oil markets through the value at risk (VaR) model. The VaR is estimated by the extreme value theory (EVT......) and GARCH model on the basis of generalized error distribution (GED). The results show that EVT is a powerful approach to capture the risk in the oil markets. On the contrary, the traditional variance–covariance (VC) and Monte Carlo (MC) approaches tend to overestimate risk when the confidence level is 95......%, but underestimate risk at the confidence level of 99%. The VaR of WTI returns is larger than that of Brent returns at identical confidence levels. Moreover, the GED-GARCH model can estimate the downside dynamic VaR accurately for WTI and Brent oil returns....
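
    As a self-contained illustration of the EVT side of such a comparison, the sketch below fits a generalized Pareto distribution to tail losses (peaks over threshold) of simulated returns and converts it into a VaR estimate, alongside the plain empirical quantile. The simulated heavy-tailed returns are placeholders, not WTI or Brent data, and the GED-GARCH filtering step used in the paper is omitted.

      # Hedged sketch: 99% VaR from an empirical quantile and from a peaks-over-threshold EVT fit.
      import numpy as np
      from scipy.stats import genpareto

      rng = np.random.default_rng(5)
      losses = -rng.standard_t(df=4, size=2000) * 0.02        # heavy-tailed daily losses (illustrative)

      q = 0.99
      var_empirical = np.quantile(losses, q)

      u = np.quantile(losses, 0.95)                           # threshold defining the tail
      exceed = losses[losses > u] - u
      xi, _, beta = genpareto.fit(exceed, floc=0)             # fit GPD to threshold exceedances
      n, n_u = len(losses), len(exceed)
      var_evt = u + beta / xi * ((n / n_u * (1 - q)) ** (-xi) - 1)   # standard POT VaR formula

      print(f"empirical VaR(99%): {var_empirical:.4f}   EVT VaR(99%): {var_evt:.4f}")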

  4. Medical Inpatient Journey Modeling and Clustering: A Bayesian Hidden Markov Model Based Approach.

    Science.gov (United States)

    Huang, Zhengxing; Dong, Wei; Wang, Fei; Duan, Huilong

    2015-01-01

    Modeling and clustering medical inpatient journeys is useful to healthcare organizations for a number of reasons, including reorganizing inpatient journeys in a way that is more convenient to understand and browse. In this study, we present a probabilistic model-based approach to model and cluster medical inpatient journeys. Specifically, we exploit a Bayesian Hidden Markov Model based approach to transform medical inpatient journeys into a probabilistic space, which can be seen as a richer representation of the inpatient journeys to be clustered. Then, using hierarchical clustering on the matrix of similarities, inpatient journeys can be clustered into different categories with respect to their clinical and temporal characteristics. We evaluated the proposed approach on a real clinical data set pertaining to the unstable angina treatment process. The experimental results reveal that our method can identify and model latent treatment topics underlying personalized inpatient journeys, and yields impressive clustering quality.

  5. A tantalum strength model using a multiscale approach: version 2

    Energy Technology Data Exchange (ETDEWEB)

    Becker, R; Arsenlis, A; Hommes, G; Marian, J; Rhee, M; Yang, L H

    2009-09-21

    A continuum strength model for tantalum was developed in 2007 using a multiscale approach. This was our first attempt at connecting simulation results from atomistic to continuum length scales, and much was learned that we were not able to incorporate into the model at that time. The tantalum model described in this report represents a second cut at pulling together multiscale simulation results into a continuum model. Insight gained in creating previous multiscale models for tantalum and vanadium was used to guide the model construction and functional relations for the present model. While the basic approach follows that of the vanadium model, there are significant departures. Some of the recommendations from the vanadium report were followed, but not all. Results from several new analysis techniques have not yet been incorporated due to technical difficulties. Molecular dynamics simulations of single dislocation motion at several temperatures suggested that the thermal activation barrier was temperature dependent. This dependency required that additional temperature functions be included within the assumed Arrhenius relation. The combination of temperature-dependent functions created a complex model with a non-unique parameterization and extra model constants. The added complexity had no tangible benefits. The recommendation was to abandon the strict Arrhenius form and create a simpler curve fit to the molecular dynamics data for shear stress versus dislocation velocity. Functions relating dislocation velocity and applied shear stress were constructed for vanadium for both edge and screw dislocations. However, an attempt to formulate a robust continuum constitutive model for vanadium using both dislocation populations was unsuccessful; the level of coupling achieved was inadequate to constrain the dislocation evolution properly. Since the behavior of BCC materials is typically assumed to be dominated by screw dislocations, the constitutive relations were ultimately

  6. Modeling the cometary environment using a fluid approach

    Science.gov (United States)

    Shou, Yinsi

    Comets are believed to have preserved the building material of the early solar system and to hold clues to the origin of life on Earth. Abundant remote observations of comets by telescopes and the in-situ measurements by a handful of space missions reveal that the cometary environments are complicated by various physical and chemical processes among the neutral gases and dust grains released from comets, cometary ions, and the solar wind in the interplanetary space. Therefore, physics-based numerical models are in demand to interpret the observational data and to deepen our understanding of the cometary environment. In this thesis, three models using a fluid approach, which include important physical and chemical processes underlying the cometary environment, have been developed to study the plasma, neutral gas, and the dust grains, respectively. Although models based on the fluid approach have limitations in capturing all of the correct physics for certain applications, especially for very low gas density environment, they are computationally much more efficient than alternatives. In the simulations of comet 67P/Churyumov-Gerasimenko at various heliocentric distances with a wide range of production rates, our multi-fluid cometary neutral gas model and multi-fluid cometary dust model have achieved comparable results to the Direct Simulation Monte Carlo (DSMC) model, which is based on a kinetic approach that is valid in all collisional regimes. Therefore, our model is a powerful alternative to the particle-based model, especially for some computationally intensive simulations. Capable of accounting for the varying heating efficiency under various physical conditions in a self-consistent way, the multi-fluid cometary neutral gas model is a good tool to study the dynamics of the cometary coma with different production rates and heliocentric distances. The modeled H2O expansion speeds reproduce the general trend and the speed's nonlinear dependencies of production rate

  7. Validity of the Neuromuscular Recovery Scale: a measurement model approach.

    Science.gov (United States)

    Velozo, Craig; Moorhouse, Michael; Ardolino, Elizabeth; Lorenz, Doug; Suter, Sarah; Basso, D Michele; Behrman, Andrea L

    2015-08-01

    To determine how well the Neuromuscular Recovery Scale (NRS) items fit the Rasch, 1-parameter, partial-credit measurement model. Confirmatory factor analysis (CFA) and principal components analysis (PCA) of residuals were used to determine dimensionality. The Rasch, 1-parameter, partial-credit rating scale model was used to determine rating scale structure, person/item fit, point-measure item correlations, item discrimination, and measurement precision. Seven NeuroRecovery Network clinical sites. Outpatients (N=188) with spinal cord injury. Not applicable. NRS. While the NRS met 1 of 3 CFA criteria, the PCA revealed that the Rasch measurement dimension explained 76.9% of the variance. Ten of 11 items and 91% of the patients fit the Rasch model, with 9 of 11 items showing high discrimination. Sixty-nine percent of the ratings met criteria. The items showed a logical item-difficulty order, with Stand retraining as the easiest item and Walking as the most challenging item. The NRS showed no ceiling or floor effects and separated the sample into almost 5 statistically distinct strata; individuals with an American Spinal Injury Association Impairment Scale (AIS) D classification showed the most ability, and those with an AIS A classification showed the least ability. Items not meeting the rating scale criteria appear to be related to the low frequency counts. The NRS met many of the Rasch model criteria for construct validity. Copyright © 2015 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  8. International energy market dynamics: a modelling approach. Tome 2

    International Nuclear Information System (INIS)

    Nachet, S.

    1996-01-01

    This work is an attempt to model the international energy market and reproduce the behaviour of both energy demand and supply. Energy demand was represented using a sector-versus-source approach. For developing countries, the existing links between the economic and energy sectors were analysed. Energy supply is exogenous for energy sources other than oil and natural gas. For hydrocarbons, the exploration-production process was modelled and produced figures such as production yield, exploration effort index, etc. The model built is econometric and is solved using software that was constructed for this purpose. We explore the energy market future using three scenarios and obtain projections to 2010 for energy demand per source and oil and natural gas supply per region. Economic variables are used to produce different indicators such as energy intensity, energy per capita, etc. (author). 378 refs., 26 figs., 35 tabs., 11 appends

  9. International energy market dynamics: a modelling approach. Tome 1

    International Nuclear Information System (INIS)

    Nachet, S.

    1996-01-01

    This work is an attempt to model the international energy market and reproduce the behaviour of both energy demand and supply. Energy demand was represented using a sector-versus-source approach. For developing countries, existing links between the economic and energy sectors were analysed. Energy supply is exogenous for energy sources other than oil and natural gas. For hydrocarbons, the exploration-production process was modelled and produced figures such as production yield, exploration effort index, etc. The model built is econometric and is solved using software that was constructed for this purpose. We explore the energy market future using three scenarios and obtain projections by 2010 for energy demand per source and oil and natural gas supply per region. Economic variables are used to produce different indicators such as energy intensity, energy per capita, etc. (author). 378 refs., 26 figs., 35 tabs., 11 appends

  10. Bayesian Multi-Energy Computed Tomography reconstruction approaches based on decomposition models

    International Nuclear Information System (INIS)

    Cai, Caifang

    2013-01-01

    Multi-Energy Computed Tomography (MECT) makes it possible to get multiple fractions of basis materials without segmentation. In medical applications, one is the soft-tissue equivalent water fraction and the other is the hard-matter equivalent bone fraction. Practical MECT measurements are usually obtained with polychromatic X-ray beams. Existing reconstruction approaches based on linear forward models that do not account for the beam polychromaticity fail to estimate the correct decomposition fractions and result in Beam-Hardening Artifacts (BHA). The existing BHA correction approaches either need to refer to calibration measurements or suffer from the noise amplification caused by the negative-log pre-processing and the water and bone separation problem. To overcome these problems, statistical DECT reconstruction approaches based on non-linear forward models that account for the beam polychromaticity show great potential for giving accurate fraction images. This work proposes a full-spectral Bayesian reconstruction approach which allows the reconstruction of high quality fraction images from ordinary polychromatic measurements. This approach is based on a Gaussian noise model with unknown variance assigned directly to the projections without taking the negative logarithm. Referring to Bayesian inference, the decomposition fractions and observation variance are estimated by using the joint Maximum A Posteriori (MAP) estimation method. Subject to an adaptive prior model assigned to the variance, the joint estimation problem is then simplified into a single estimation problem. It transforms the joint MAP estimation problem into a minimization problem with a non-quadratic cost function. To solve it, the use of a monotone Conjugate Gradient (CG) algorithm with suboptimal descent steps is proposed. The performance of the proposed approach is analyzed with both simulated and experimental data. The results show that the proposed Bayesian approach is robust to noise and materials. It is also
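
    As a hedged illustration of the kind of optimization described above, the sketch below minimizes a toy MAP cost (quadratic data term plus a non-quadratic, edge-preserving prior) with a conjugate-gradient method; the linear forward operator, prior and parameter values are stand-ins, not the thesis's polychromatic model.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, m = 64, 96
f_true = np.zeros(n)
f_true[20:40] = 1.0                                  # piecewise-constant "fractions"
A = rng.standard_normal((m, n)) / np.sqrt(m)         # toy projection operator
sigma2 = 1e-3                                        # assumed noise variance
y = A @ f_true + np.sqrt(sigma2) * rng.standard_normal(m)

lam, delta = 0.05, 1e-2
D = np.diff(np.eye(n), axis=0)                       # finite-difference operator

def cost(f):
    r = y - A @ f
    # Gaussian data term + smooth non-quadratic (edge-preserving) penalty
    return r @ r / (2 * sigma2) + lam * np.sum(np.sqrt(delta**2 + (D @ f) ** 2))

res = minimize(cost, np.zeros(n), method="CG")       # conjugate-gradient descent
print("relative error:", np.linalg.norm(res.x - f_true) / np.linalg.norm(f_true))
```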

  11. A validated age-related normative model for male total testosterone shows increasing variance but no decline after age 40 years.

    Science.gov (United States)

    Kelsey, Thomas W; Li, Lucy Q; Mitchell, Rod T; Whelan, Ashley; Anderson, Richard A; Wallace, W Hamish B

    2014-01-01

    The diagnosis of hypogonadism in human males includes identification of low serum testosterone levels, and hence there is an underlying assumption that normal ranges of testosterone for the healthy population are known for all ages. However, to our knowledge, no such reference model exists in the literature, and hence the availability of an applicable biochemical reference range would be helpful for the clinical assessment of hypogonadal men. In this study, using model selection and validation analysis of data identified and extracted from thirteen studies, we derive and validate a normative model of total testosterone across the lifespan in healthy men. We show that total testosterone peaks [mean (2.5-97.5 percentile)] at 15.4 (7.2-31.1) nmol/L at an average age of 19 years, and falls in the average case [mean (2.5-97.5 percentile)] to 13.0 (6.6-25.3) nmol/L by age 40 years, but we find no evidence for a further fall in mean total testosterone with increasing age through to old age. However we do show that there is an increased variation in total testosterone levels with advancing age after age 40 years. This model provides the age related reference ranges needed to support research and clinical decision making in males who have symptoms that may be due to hypogonadism.

  12. Hierarchical mixture of experts and diagnostic modeling approach to reduce hydrologic model structural uncertainty: STRUCTURAL UNCERTAINTY DIAGNOSTICS

    Energy Technology Data Exchange (ETDEWEB)

    Moges, Edom [Civil and Environmental Engineering Department, Washington State University, Richland Washington USA; Demissie, Yonas [Civil and Environmental Engineering Department, Washington State University, Richland Washington USA; Li, Hong-Yi [Hydrology Group, Pacific Northwest National Laboratory, Richland Washington USA

    2016-04-01

    In most water resources applications, a single model structure might be inadequate to capture the dynamic multi-scale interactions among different hydrological processes. Calibrating single models for dynamic catchments, where multiple dominant processes exist, can result in displacement of errors from structure to parameters, which in turn leads to over-correction and biased predictions. An alternative to a single model structure is to develop local expert structures that are effective in representing the dominant components of the hydrologic process and adaptively integrate them based on an indicator variable. In this study, the Hierarchical Mixture of Experts (HME) framework is applied to integrate expert model structures representing the different components of the hydrologic process. Various signature diagnostic analyses are used to assess the presence of multiple dominant processes and the adequacy of a single model, as well as to identify the structures of the expert models. The approaches are applied for two distinct catchments, the Guadalupe River (Texas) and the French Broad River (North Carolina) from the Model Parameter Estimation Experiment (MOPEX), using different structures of the HBV model. The results show that the HME approach performs better than the single model for the Guadalupe catchment, where multiple dominant processes are evident in the diagnostic measures. In contrast, the diagnostics and aggregated performance measures show that the French Broad catchment has a homogeneous response, making the single model adequate to capture the response.
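
    The following minimal sketch shows the gist of adaptively integrating two expert structures through a gate driven by an indicator variable; the experts, gate parameters and data are illustrative, not the HBV structures used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)
wetness = rng.random(100)                    # indicator variable per time step
expert_dry = 0.2 * wetness                   # expert tuned to dry conditions
expert_wet = 1.5 * wetness - 0.4             # expert tuned to wet conditions

def gate(z, a=12.0, b=0.5):
    """Probability of trusting the 'wet' expert, as a logistic function of z."""
    return 1.0 / (1.0 + np.exp(-a * (z - b)))

g = gate(wetness)
runoff = (1.0 - g) * expert_dry + g * expert_wet   # adaptive integration
print("blended runoff, first 5 steps:", np.round(runoff[:5], 3))
```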

  13. New business models for electric cars-A holistic approach

    International Nuclear Information System (INIS)

    Kley, Fabian; Lerch, Christian; Dallinger, David

    2011-01-01

    Climate change and global resource shortages have led to rethinking traditional individual mobility services based on combustion engines. As the consequence of technological improvements, the first electric vehicles are now being introduced and greater market penetration can be expected. But any wider implementation of battery-powered electrical propulsion systems in the future will give rise to new challenges for both the traditional automotive industry and other new players, e.g. battery manufacturers, the power supply industry and other service providers. Different application cases of electric vehicles are currently being discussed, which means that numerous business models could emerge, leading to new shares in value creation and involving new players. Consequently, individual stakeholders are uncertain about which business models are really effective with regard to targeting a profitable overall concept. Therefore, this paper aims to define a holistic approach to developing business models for electric mobility, which analyzes the system as a whole on the one hand and provides decision support for affected enterprises on the other. To do so, the basic elements of electric mobility are considered and topical approaches to business models for various stakeholders are discussed. The paper concludes by presenting a systemic instrument for business models based on morphological methods. - Highlights: → We present a systemic instrument to analyze business models for electric vehicles. → Provide decision support for enterprises dealing with electric vehicle innovations. → Combine business aspects of the triad of vehicle concepts, infrastructure, and system integration. → In the market, activities in all domains have been initiated, but often with undefined or unclear structures.

  14. Authoring and verification of clinical guidelines: a model driven approach.

    Science.gov (United States)

    Pérez, Beatriz; Porres, Ivan

    2010-08-01

    The goal of this research is to provide a framework to enable authoring and verification of clinical guidelines. The framework is part of a larger research project aimed at improving the representation, quality and application of clinical guidelines in daily clinical practice. The verification process of a guideline is based on (1) model checking techniques to verify guidelines against semantic errors and inconsistencies in their definition, (2) combined with Model Driven Development (MDD) techniques, which enable us to automatically process manually created guideline specifications and the temporal-logic statements to be checked against them, making the verification process faster and more cost-effective. Particularly, we use UML statecharts to represent the dynamics of guidelines and, based on these manually defined guideline specifications, we use an MDD-based tool chain to automatically process them to generate the input model of a model checker. The model checker takes the resulting model together with the specific guideline requirements, and verifies whether the guideline fulfils such properties. The overall framework has been implemented as an Eclipse plug-in named GBDSSGenerator which, particularly, starting from the UML statechart representing a guideline, allows the verification of the guideline against specific requirements. Additionally, we have established a pattern-based approach for defining commonly occurring types of requirements in guidelines. We have successfully validated our overall approach by verifying properties in different clinical guidelines resulting in the detection of some inconsistencies in their definition. The proposed framework allows (1) the authoring and (2) the verification of clinical guidelines against specific requirements defined based on a set of property specification patterns, enabling non-experts to easily write formal specifications and thus easing the verification process. Copyright 2010 Elsevier Inc

  15. A new approach towards image based virtual 3D city modeling by using close range photogrammetry

    Science.gov (United States)

    Singh, S. P.; Jain, K.; Mandla, V. R.

    2014-05-01

    A 3D city model is a digital representation of the Earth's surface and its related objects, such as buildings, trees, vegetation, and man-made features belonging to an urban area. The demand for 3D city modeling is increasing daily for various engineering and non-engineering applications. Generally, three main image-based approaches are used for virtual 3D city model generation: sketch-based modeling, procedural-grammar-based modeling, and close-range-photogrammetry-based modeling. The literature shows that, to date, there is no complete solution available to create a complete 3D city model from images, and these image-based methods also have limitations. This paper gives a new approach towards image-based virtual 3D city modeling using close range photogrammetry. The approach is divided into three sections: the data acquisition process, 3D data processing, and the data combination process. In the data acquisition process, a multi-camera setup was developed and used for video recording of an area; image frames were created from the video data, and the minimum required and most suitable frames were selected for 3D processing. In the second section, a 3D model of the area was created based on close-range photogrammetric principles and computer vision techniques. In the third section, this 3D model was exported for adding to and merging with other pieces of the larger area; scaling and alignment of the 3D model were done; and after applying texturing and rendering, a final photo-realistic textured 3D model was created and transferred into a walk-through model or movie form. Most of the processing steps are automatic, so this method is cost effective and less laborious, and the accuracy of the model is good. For this research work, the study area is the campus of the Department of Civil Engineering, Indian Institute of Technology, Roorkee, which acts as a prototype for a city. Aerial photography is restricted in many countries
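
    As a hedged sketch of the close-range photogrammetry step, the snippet below matches features between two video frames and recovers a relative pose with OpenCV; the frame paths and camera intrinsics are placeholders, and a real pipeline would add calibration, bundle adjustment, densification and texturing.

```python
import cv2
import numpy as np

# Placeholder frame pair extracted from the multi-camera video.
img1 = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# Keep only distinctive matches (Lowe's ratio test).
matches = cv2.BFMatcher().knnMatch(des1, des2, k=2)
good = [m for m, n in matches if m.distance < 0.7 * n.distance]
pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

# Assumed pinhole intrinsics; in practice these come from camera calibration.
K = np.array([[1000.0, 0, 640], [0, 1000.0, 360], [0, 0, 1]])

E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
_, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([R, t])
pts4d = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
pts3d = (pts4d[:3] / pts4d[3]).T   # sparse 3D points for this image pair
```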

  16. A mechanical approach to mean field spin models

    Science.gov (United States)

    Genovese, Giuseppe; Barra, Adriano

    2009-05-01

    Inspired by the bridge pioneered by Guerra among statistical mechanics on lattice and analytical mechanics on 1+1 continuous Euclidean space time, we built a self-consistent method to solve for the thermodynamics of mean field models defined on lattice, whose order parameters self-average. We show the whole procedure by analyzing in full detail the simplest test case, namely, the Curie-Weiss model. Further, we report some applications also to models whose order parameters do not self-average by using the Sherrington-Kirkpatrick spin glass as a guide.
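
    For readers who want the simplest concrete handle on the test case mentioned above, the sketch below solves the standard Curie-Weiss self-consistency equation m = tanh(beta*(J*m + h)) by fixed-point iteration; it illustrates the textbook model, not the paper's mechanical derivation.

```python
import numpy as np

def curie_weiss_magnetization(beta, J=1.0, h=0.0, tol=1e-12, max_iter=10_000):
    """Solve m = tanh(beta*(J*m + h)) by damped fixed-point iteration."""
    m = 0.5                      # nonzero start so the ordered branch can be found
    for _ in range(max_iter):
        m_new = np.tanh(beta * (J * m + h))
        if abs(m_new - m) < tol:
            return m_new
        m = 0.5 * (m + m_new)    # damping helps convergence near beta*J = 1
    return m

# Below the critical point (beta*J > 1) a spontaneous magnetization appears.
for beta in (0.5, 1.0, 1.5, 2.0):
    print(beta, curie_weiss_magnetization(beta))
```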

  17. Modelling of an anaerobic plug-flow reactor. Process analysis and evaluation approaches with non-ideal mixing considerations.

    Science.gov (United States)

    Donoso-Bravo, Andrés; Sadino-Riquelme, Constanza; Gómez, Daniel; Segura, Camilo; Valdebenito, Emky; Hansen, Felipe

    2018-03-29

    This study shows the implementation of the Anaerobic Digestion Model (ADM1) in an anaerobic plug-flow reactor (PFR) with two approaches based on the use of consecutive continuous stirred tank reactors (CSTR) connected in series for considering non-ideal mixing. The two-region (TR) model splits each CSTR into two regions, while the particulate retention (PR) model adds a retention parameter. The models were calibrated and validated based on experimental data from a bench-scale reactor treating cow manure. The PFR conventional model slightly outperformed the non-ideal mixing approaches. However, the PR model showed an increase in biomass retention time when treating a high-solid-content substrate. Biogas production was not sensitive to variations of the mixing parameters. The liquid fraction content was better represented by the PR model than the PFR and TR models. The study shows how reactor modelling is useful for monitoring and supervising biogas plants. Copyright © 2018 Elsevier Ltd. All rights reserved.
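
    A minimal tanks-in-series sketch of the idea, with illustrative parameters and a single first-order removal reaction standing in for the full ADM1 biochemistry:

```python
import numpy as np
from scipy.integrate import solve_ivp

N = 5           # number of CSTRs approximating the plug-flow reactor
Q = 2.0         # flow rate [m3/d]
V = 10.0        # total reactor volume [m3]
k = 0.8         # first-order removal rate [1/d]
S_in = 20.0     # influent substrate concentration [kg/m3]
V_tank = V / N

def rhs(t, S):
    dS = np.empty(N)
    upstream = S_in
    for i in range(N):
        dS[i] = Q / V_tank * (upstream - S[i]) - k * S[i]
        upstream = S[i]
    return dS

sol = solve_ivp(rhs, (0.0, 30.0), np.zeros(N))
print("effluent concentration:", sol.y[-1, -1])   # last tank ~ PFR outlet
```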

  18. Leader communication approaches and patient safety: An integrated model.

    Science.gov (United States)

    Mattson, Malin; Hellgren, Johnny; Göransson, Sara

    2015-06-01

    Leader communication is known to influence a number of employee behaviors. When it comes to the relationship between leader communication and safety, the evidence is more scarce and ambiguous. The aim of the present study is to investigate whether and in what way leader communication relates to safety outcomes. The study examines two leader communication approaches: leader safety priority communication and feedback to subordinates. These approaches were assumed to affect safety outcomes via different employee behaviors. Questionnaire data, collected from 221 employees at two hospital wards, were analyzed using structural equation modeling. The two examined communication approaches were both positively related to safety outcomes, although leader safety priority communication was mediated by employee compliance and feedback communication by organizational citizenship behaviors. The findings suggest that leader communication plays a vital role in improving organizational and patient safety and that different communication approaches seem to positively affect different but equally essential employee safety behaviors. The results highlight the necessity for leaders to engage in one-way communication of safety values as well as in more relational feedback communication with their subordinates in order to enhance patient safety. Copyright © 2015 Elsevier Ltd. and National Safety Council. Published by Elsevier Ltd. All rights reserved.

  19. A probabilistic approach to the drag-based model

    Science.gov (United States)

    Napoletano, Gianluca; Forte, Roberta; Moro, Dario Del; Pietropaolo, Ermanno; Giovannelli, Luca; Berrilli, Francesco

    2018-02-01

    The forecast of the time of arrival (ToA) of a coronal mass ejection (CME) to Earth is of critical importance for our high-technology society and for any future manned exploration of the Solar System. As critical as the forecast accuracy is the knowledge of its precision, i.e. the error associated with the estimate. We propose a statistical approach for the computation of the ToA using the drag-based model by introducing probability distributions, rather than exact values, as input parameters, thus allowing the evaluation of the uncertainty on the forecast. We test this approach using a set of CMEs whose transit times are known, and obtain extremely promising results: the average value of the absolute differences between measurement and forecast is 9.1 h, and half of these residuals are within the estimated errors. These results suggest that this approach deserves further investigation. We are working to realize a real-time implementation which ingests the outputs of automated CME tracking algorithms as inputs to create a database of events useful for a further validation of the approach.
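
    A hedged Monte Carlo sketch of the idea: the drag-based equation of motion dv/dt = -gamma*(v - w)*|v - w| is integrated from near the Sun to 1 AU for input parameters drawn from probability distributions, yielding a ToA distribution rather than a single value (all distribution parameters below are illustrative).

```python
import numpy as np

rng = np.random.default_rng(0)
AU = 1.496e8              # [km]
R0 = 20 * 6.957e5         # start at 20 solar radii [km]

def arrival_time(v0, gamma, w, dt=600.0):
    """Transit time (hours) from R0 to 1 AU under the drag-based model."""
    r, v, t = R0, v0, 0.0
    while r < AU:
        v += -gamma * (v - w) * abs(v - w) * dt
        r += v * dt
        t += dt
    return t / 3600.0

n = 2000
v0 = rng.normal(800.0, 50.0, n)                # initial CME speed [km/s]
w = rng.normal(400.0, 30.0, n)                 # ambient solar-wind speed [km/s]
gamma = rng.lognormal(np.log(1e-7), 0.5, n)    # drag parameter [1/km]

toa = np.array([arrival_time(v, g, ww) for v, g, ww in zip(v0, gamma, w)])
print(f"ToA = {toa.mean():.1f} h +/- {toa.std():.1f} h")
```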

  20. Modeling in biopharmaceutics, pharmacokinetics, and pharmacodynamics homogeneous and heterogeneous approaches

    CERN Document Server

    Macheras, Panos

    2006-01-01

    The state of the art in Biopharmaceutics, Pharmacokinetics, and Pharmacodynamics Modeling is presented in this book. It shows how advanced physical and mathematical methods can expand classical models in order to cover heterogeneous drug-biological processes and therapeutic effects in the body. The book is divided into four parts; the first deals with the fundamental principles of fractals, diffusion and nonlinear dynamics; the second with drug dissolution, release, and absorption; the third with empirical, compartmental, and stochastic pharmacokinetic models, and the fourth mainly with nonclassical aspects of pharmacodynamics. The classical models that have relevance and application to these sciences are also considered throughout. Many examples are used to illustrate the intrinsic complexity of drug administration related phenomena in the human, justifying the use of advanced modeling methods. This timely and useful book will appeal to graduate students and researchers in pharmacology, pharmaceutical scienc...

  1. Model predictive control approach for a CPAP-device

    Directory of Open Access Journals (Sweden)

    Scheel Mathias

    2017-09-01

    Full Text Available The obstructive sleep apnoea syndrome (OSAS) is characterized by a collapse of the upper respiratory tract, resulting in a reduction of the blood oxygen concentration and an increase of the carbon dioxide (CO2) concentration, which causes repeated sleep disruptions. The gold standard to treat the OSAS is the continuous positive airway pressure (CPAP) therapy. The continuous pressure keeps the upper airway open and prevents the collapse of the upper respiratory tract and the pharynx. Most of the available CPAP devices cannot maintain the pressure reference [1]. In this work a model predictive control approach is provided. This control approach has the possibility to include the patient's breathing effort into the calculation of the control variable. Therefore a patient-individualized control strategy can be developed.
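
    A minimal model predictive control sketch in the spirit of the record above, using a hypothetical first-order mask-pressure model and a predicted breathing-effort disturbance; all dynamics, limits and horizon values are illustrative, not device parameters.

```python
import cvxpy as cp
import numpy as np

a, b = 0.9, 0.5             # assumed discrete-time pressure dynamics p+ = a*p + b*u + d
N = 20                      # prediction horizon
p_ref = 10.0                # pressure reference [cmH2O]
d_pred = 0.5 * np.sin(2 * np.pi * np.arange(N) / 10)   # assumed effort forecast
p0 = 8.0                    # current measured pressure

p = cp.Variable(N + 1)
u = cp.Variable(N)
cost = cp.sum_squares(p[1:] - p_ref) + 0.1 * cp.sum_squares(u)
constraints = [p[0] == p0, u >= 0, u <= 5]
for k in range(N):
    constraints.append(p[k + 1] == a * p[k] + b * u[k] + d_pred[k])

cp.Problem(cp.Minimize(cost), constraints).solve()
print("first blower command to apply:", u.value[0])    # receding-horizon step
```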

  2. Optimizing nitrogen fertilizer use: Current approaches and simulation models

    International Nuclear Information System (INIS)

    Baethgen, W.E.

    2000-01-01

    Nitrogen (N) is the most common limiting nutrient in agricultural systems throughout the world. Crops need sufficient available N to achieve optimum yields and adequate grain-protein content. Consequently, sub-optimal rates of N fertilizers typically cause lower economic benefits for farmers. On the other hand, excessive N fertilizer use may result in environmental problems such as nitrate contamination of groundwater and emissions of N2O and NO. In spite of the economic and environmental importance of good N fertilizer management, the development of optimum fertilizer recommendations is still a major challenge in most agricultural systems. This article reviews the approaches most commonly used for making N recommendations: expected yield level, soil testing and plant analysis (including quick tests). The paper introduces the application of simulation models that complement traditional approaches, and includes some examples of current applications in Africa and South America. (author)

  3. CM5: A pre-Swarm magnetic field model based upon the comprehensive modeling approach

    DEFF Research Database (Denmark)

    Sabaka, T.; Olsen, Nils; Tyler, Robert

    2014-01-01

    We have developed a model based upon the very successful Comprehensive Modeling (CM) approach using recent CHAMP, Ørsted, SAC-C and observatory hourly-means data from September 2000 to the end of 2013. This CM, called CM5, was derived from the algorithm that will provide a consistent line of Leve...

  4. A Complex Network Approach to Distributional Semantic Models.

    Directory of Open Access Journals (Sweden)

    Akira Utsumi

    Full Text Available A number of studies on network analysis have focused on language networks based on free word association, which reflects human lexical knowledge, and have demonstrated the small-world and scale-free properties in the word association network. Nevertheless, there have been very few attempts at applying network analysis to distributional semantic models, despite the fact that these models have been studied extensively as computational or cognitive models of human lexical knowledge. In this paper, we analyze three network properties, namely, small-world, scale-free, and hierarchical properties, of semantic networks created by distributional semantic models. We demonstrate that the created networks generally exhibit the same properties as word association networks. In particular, we show that the distribution of the number of connections in these networks follows the truncated power law, which is also observed in an association network. This indicates that distributional semantic models can provide a plausible model of lexical knowledge. Additionally, the observed differences in the network properties of various implementations of distributional semantic models are consistently explained or predicted by considering the intrinsic semantic features of a word-context matrix and the functions of matrix weighting and smoothing. Furthermore, to simulate a semantic network with the observed network properties, we propose a new growing network model based on the model of Steyvers and Tenenbaum. The idea underlying the proposed model is that both preferential and random attachments are required to reflect different types of semantic relations in network growth process. We demonstrate that this model provides a better explanation of network behaviors generated by distributional semantic models.
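
    A toy version of the network construction described above, with a random word-context matrix standing in for corpus counts: word vectors are compared by cosine similarity, similar pairs are linked, and basic network statistics are read off.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(1)
n_words, n_contexts = 500, 300
M = rng.poisson(0.2, size=(n_words, n_contexts)).astype(float)   # stand-in matrix

norms = np.linalg.norm(M, axis=1, keepdims=True) + 1e-12
sim = (M / norms) @ (M / norms).T                                 # cosine similarities

G = nx.Graph()
G.add_nodes_from(range(n_words))
rows, cols = np.where(np.triu(sim, k=1) > 0.4)                    # similarity threshold
G.add_edges_from(zip(rows.tolist(), cols.tolist()))

degrees = np.array([d for _, d in G.degree()])
print("mean degree:", degrees.mean())
print("average clustering:", nx.average_clustering(G))
```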

  5. A multi-region approach to modeling subsurface flow

    International Nuclear Information System (INIS)

    Gwo, J.P.; Yeh, G.T.; Wilson, G.V.

    1990-01-01

    In this approach the media are assumed to contain n pore-regions at any physical point. Each region has different pore size and hydrologic parameters. Inter-region exchange is approximated by a linear transfer process. Based on the mass balance principle, a system of equations governing the flow and mass exchange in structured or aggregated soils is derived. This system of equations is coupled through linear transfer terms representing the interchange among different pore regions. A numerical MUlti-Region Flow (MURF) model, using the Galerkin finite element method to facilitate the treatment of local and field-scale heterogeneities, is developed to solve the system of equations. A sparse matrix solver is used to solve the resulting matrix equation, which makes the application of MURF to large field problems feasible in terms of CPU time and storage limitations. MURF is first verified by applying it to a ponding infiltration problem over a hill slope, which is a single-region problem and has been previously simulated by a single-region model. Very good agreement is obtained between the results from the two different models. The MURF code is thus partially verified. It is then applied to a two-region fractured medium to investigate the effects of multi-region approach on the flow field. The results are comparable to that obtained by other investigators. (Author) (15 refs., 6 figs., tab.)
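
    The core multi-region idea, a linear transfer of water between pore regions, can be sketched in a few lines; the storage coefficients and exchange rate below are illustrative, and the flow terms of the full MURF model are omitted.

```python
import numpy as np
from scipy.integrate import solve_ivp

S1, S2 = 0.05, 0.30     # storage coefficients of the two pore regions [-]
alpha = 0.2             # first-order inter-region transfer coefficient [1/d]

def rhs(t, h):
    h1, h2 = h
    q = alpha * (h1 - h2)          # exchange flux from region 1 to region 2
    return [-q / S1, q / S2]

sol = solve_ivp(rhs, (0.0, 30.0), [1.0, 0.0], t_eval=np.linspace(0.0, 30.0, 7))
print(sol.y)    # the two pressure heads relax toward a common equilibrium
```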

  6. Modelling public risk evaluation of natural hazards: a conceptual approach

    Science.gov (United States)

    Plattner, Th.

    2005-04-01

    In recent years, the dealing with natural hazards in Switzerland has shifted away from being hazard-oriented towards a risk-based approach. Decreasing societal acceptance of risk, accompanied by increasing marginal costs of protective measures and decreasing financial resources cause an optimization problem. Therefore, the new focus lies on the mitigation of the hazard's risk in accordance with economical, ecological and social considerations. This modern proceeding requires an approach in which not only technological, engineering or scientific aspects of the definition of the hazard or the computation of the risk are considered, but also the public concerns about the acceptance of these risks. These aspects of a modern risk approach enable a comprehensive assessment of the (risk) situation and, thus, sound risk management decisions. In Switzerland, however, the competent authorities suffer from a lack of decision criteria, as they don't know what risk level the public is willing to accept. Consequently, there exists a need for the authorities to know what the society thinks about risks. A formalized model that allows at least a crude simulation of the public risk evaluation could therefore be a useful tool to support effective and efficient risk mitigation measures. This paper presents a conceptual approach of such an evaluation model using perception affecting factors PAF, evaluation criteria EC and several factors without any immediate relation to the risk itself, but to the evaluating person. Finally, the decision about the acceptance Acc of a certain risk i is made by a comparison of the perceived risk Ri,perc with the acceptable risk Ri,acc.

  7. Time Ordering in Frontal Lobe Patients: A Stochastic Model Approach

    Science.gov (United States)

    Magherini, Anna; Saetti, Maria Cristina; Berta, Emilia; Botti, Claudio; Faglioni, Pietro

    2005-01-01

    Frontal lobe patients reproduced a sequence of capital letters or abstract shapes. Immediate and delayed reproduction trials allowed the analysis of short- and long-term memory for time order by means of suitable Markov chain stochastic models. Patients were as proficient as healthy subjects on the immediate reproduction trial, thus showing spared…

  8. A generalized quarter car modelling approach with frame flexibility ...

    Indian Academy of Sciences (India)

    HUSAIN KANCHWALA

    A simple Matlab code is provided that enables quick parametric studies. Finally, a parametric study and wheel hop analysis are performed for a realistic numerical example. Frequency and time domain responses obtained show clearly the effects of other wheels, which are outside the scope of usual quarter-car models. The.
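
    For orientation, a sketch of the classical two-degree-of-freedom quarter-car model is given below (the paper's generalized version additionally accounts for frame flexibility and the other wheels); parameter values are illustrative and the code is in Python rather than the Matlab mentioned in the record.

```python
import numpy as np
from scipy.integrate import solve_ivp

ms, mu = 300.0, 40.0            # sprung and unsprung masses [kg]
ks, kt = 20_000.0, 180_000.0    # suspension and tyre stiffness [N/m]
cs = 1_500.0                    # suspension damping [N s/m]

def road(t):
    return 0.05 if t > 1.0 else 0.0      # 5 cm step input

def rhs(t, y):
    xs, vs, xu, vu = y
    f_s = ks * (xu - xs) + cs * (vu - vs)   # suspension force on the sprung mass
    f_t = kt * (road(t) - xu)               # tyre force on the unsprung mass
    return [vs, f_s / ms, vu, (-f_s + f_t) / mu]

sol = solve_ivp(rhs, (0.0, 5.0), [0.0, 0.0, 0.0, 0.0], max_step=1e-3)
print("peak sprung-mass displacement [m]:", sol.y[0].max())
```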

  9. Assessing the polycyclic aromatic hydrocarbon (PAH) pollution of urban stormwater runoff: a dynamic modeling approach.

    Science.gov (United States)

    Zheng, Yi; Lin, Zhongrong; Li, Hao; Ge, Yan; Zhang, Wei; Ye, Youbin; Wang, Xuejun

    2014-05-15

    Urban stormwater runoff delivers a significant amount of polycyclic aromatic hydrocarbons (PAHs), mostly of atmospheric origin, to receiving water bodies. The PAH pollution of urban stormwater runoff poses serious risk to aquatic life and human health, but has been overlooked by environmental modeling and management. This study proposed a dynamic modeling approach for assessing the PAH pollution and its associated environmental risk. A variable time-step model was developed to simulate the continuous cycles of pollutant buildup and washoff. To reflect the complex interaction among different environmental media (i.e. atmosphere, dust and stormwater), the dependence of the pollution level on antecedent weather conditions was investigated and embodied in the model. Long-term simulations of the model can be efficiently performed, and probabilistic features of the pollution level and its risk can be easily determined. The applicability of this approach and its value to environmental management was demonstrated by a case study in Beijing, China. The results showed that Beijing's PAH pollution of road runoff is relatively severe, and its associated risk exhibits notable seasonal variation. The current sweeping practice is effective in mitigating the pollution, but the effectiveness is both weather-dependent and compound-dependent. The proposed modeling approach can help identify critical timing and major pollutants for monitoring, assessing and controlling efforts to be focused on. The approach is extendable to other urban areas, as well as to other contaminants with similar fate and transport as PAHs. Copyright © 2014 Elsevier B.V. All rights reserved.
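
    A common way to express the buildup/washoff cycles mentioned above is with exponential buildup between storms and exponential washoff during them; the sketch below uses that textbook formulation with illustrative rate constants and a synthetic rainfall series, not the paper's calibrated model.

```python
import numpy as np

k_b = 0.25      # buildup rate [1/day]
B_max = 100.0   # maximum surface load [mg/m2]
k_w = 0.18      # washoff coefficient [1/mm]

rng = np.random.default_rng(3)
rain = np.where(rng.random(120) < 0.2, rng.exponential(8.0, 120), 0.0)  # daily rain [mm]

B, washed = 2.0, []
for r in rain:
    B += k_b * (B_max - B)                 # dry-weather buildup toward B_max
    w = B * (1.0 - np.exp(-k_w * r))       # fraction of the load washed off
    B -= w
    washed.append(w)

print("total load washed off over the period [mg/m2]:", round(sum(washed), 1))
```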

  10. Ensembles modeling approach to study Climate Change impacts on Wheat

    Science.gov (United States)

    Ahmed, Mukhtar; Claudio, Stöckle O.; Nelson, Roger; Higgins, Stewart

    2017-04-01

    Simulations of crop yield under climate variability are subject to uncertainties, and quantification of such uncertainties is essential for effective use of projected results in adaptation and mitigation strategies. In this study we evaluated the uncertainties related to crop-climate models using five crop growth simulation models (CropSyst, APSIM, DSSAT, STICS and EPIC) and 14 general circulation models (GCMs) for 2 representative concentration pathways (RCP) of atmospheric CO2 (4.5 and 8.5 W m-2) in the Pacific Northwest (PNW), USA. The aim was to assess how different process-based crop models could be used accurately for estimation of winter wheat growth, development and yield. Firstly, all models were calibrated for high rainfall, medium rainfall, low rainfall and irrigated sites in the PNW using 1979-2010 as the baseline period. Response variables were related to farm management and soil properties, and included crop phenology, leaf area index (LAI), biomass and grain yield of winter wheat. All five models were run from 2000 to 2100 using the 14 GCMs and 2 RCPs to evaluate the effect of future climate (rainfall, temperature and CO2) on winter wheat phenology, LAI, biomass, grain yield and harvest index. Simulated time to flowering and maturity was reduced in all models except EPIC with some level of uncertainty. All models generally predicted an increase in biomass and grain yield under elevated CO2 but this effect was more prominent under rainfed conditions than irrigation. However, there was uncertainty in the simulation of crop phenology, biomass and grain yield under 14 GCMs during three prediction periods (2030, 2050 and 2070). We concluded that to improve accuracy and consistency in simulating wheat growth dynamics and yield under a changing climate, a multimodel ensemble approach should be used.

  11. Vector-model-supported approach in prostate plan optimization

    International Nuclear Information System (INIS)

    Liu, Eva Sau Fan; Wu, Vincent Wing Cheung; Harris, Benjamin; Lehman, Margot; Pryor, David; Chan, Lawrence Wing Chi

    2017-01-01

    The lengthy time consumed in traditional manual plan optimization can limit the use of step-and-shoot intensity-modulated radiotherapy/volumetric-modulated radiotherapy (S&S IMRT/VMAT). A vector-model-based method, retrieving similar radiotherapy cases, was developed with respect to the structural and physiologic features extracted from the Digital Imaging and Communications in Medicine (DICOM) files. Planning parameters were retrieved from the selected similar reference case and applied to the test case to bypass the gradual adjustment of planning parameters. Therefore, the planning time spent on the traditional trial-and-error manual optimization approach at the beginning of optimization could be reduced. Each S&S IMRT/VMAT prostate reference database comprised 100 previously treated cases. Prostate cases were replanned with both traditional optimization and vector-model-supported optimization based on the oncologists' clinical dose prescriptions. A total of 360 plans, which consisted of 30 cases of S&S IMRT, 30 cases of 1-arc VMAT, and 30 cases of 2-arc VMAT plans including first optimization and final optimization with/without vector-model-supported optimization, were compared using the 2-sided t-test and paired Wilcoxon signed rank test, with a significance level of 0.05 and a false discovery rate of less than 0.05. For S&S IMRT, 1-arc VMAT, and 2-arc VMAT prostate plans, there was a significant reduction in the planning time and iterations with vector-model-supported optimization, by almost 50%. When the first optimization plans were compared, 2-arc VMAT prostate plans had better plan quality than 1-arc VMAT plans. The volume receiving 35 Gy in the femoral head for 2-arc VMAT plans was reduced with the vector-model-supported optimization compared with the traditional manual optimization approach. Otherwise, the quality of plans from both approaches was comparable. Vector-model-supported optimization was shown to offer much shortened planning time and iteration
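
    The retrieval step can be pictured with a toy cosine-similarity search over feature vectors; the features and reference database below are hypothetical stand-ins for the structural and physiologic quantities extracted from DICOM files.

```python
import numpy as np

rng = np.random.default_rng(7)
reference_db = rng.random((100, 12))    # 100 previously treated cases, 12 features
test_case = rng.random(12)

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

scores = np.array([cosine(test_case, ref) for ref in reference_db])
best = int(np.argmax(scores))
print(f"most similar reference case: #{best} (similarity {scores[best]:.3f})")
# Its planning parameters would then seed the optimizer instead of default values.
```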

  12. Interdependence: a new model for the global approach to disability

    Directory of Open Access Journals (Sweden)

    Nathan Grills

    2015-01-01

    Full Text Available Disability affects over 1 billion people and the WHO estimates that over 80% of individuals with disability live in low and middle income countries, where access to health and social services to respond to disability are limited 1. Compounding this poverty is that medical and technological approaches to disability, however needed, are usually very expensive. Yet, much can be done at low cost to increase the wellbeing of people with disability, and the church and Christians need to take a lead. The WHO’s definition of disability highlights the challenge to us in global health. It has been defined by the WHO as “the interaction between a person’s impairments and the attitudinal and environmental barriers that hinder their full and effective participation in society on an equal basis with others” 2. This understanding of disability requires us to go beyond mere healing and towards inclusion in our response to chronic diseases and disability. This is known as the social model and requires societal attitudinal change and modification of disabling environments in order to facilitate those with disability to be included in our community and churches. These are good responses but the church needs to consider alternative models to those that are currently promoted which strive for independence as the ultimate endpoint. In this paper I introduce some disability-related articles in this issue and outline an approach that goes beyond the Social Model towards an Interdependence Model which I think is a more Biblical model of disability and one which we Christians and churches in global health should consider. This model would go beyond changing society to accommodate for people with disabilities towards acknowledging they play an important part in our community and indeed in our church. We need those people with disability to contribute, love and bless those with and without disabilities. And of course those with disability need the love, care and

  13. A NEW APPROACH OF DIGITAL BRIDGE SURFACE MODEL GENERATION

    Directory of Open Access Journals (Sweden)

    H. Ju

    2012-07-01

    Full Text Available Bridge areas present difficulties for orthophoto generation, and to avoid “collapsed” bridges in the orthoimage, operator assistance is required to create the precise DBM (Digital Bridge Model), which is subsequently used for the orthoimage generation. In this paper, a new approach to DBM generation, based on fusing LiDAR (Light Detection And Ranging) data and aerial imagery, is proposed. No precise exterior orientation of the aerial image is required for the DBM generation. First, a coarse DBM is produced from LiDAR data. Then, a robust co-registration between LiDAR intensity and the aerial image using the orientation constraint is performed. The from-coarse-to-fine hybrid co-registration approach includes LPFFT (Log-Polar Fast Fourier Transform), Harris Corners, PDF (Probability Density Function) feature descriptor mean-shift matching, and RANSAC (RANdom Sample Consensus) as main components. After that, the bridge ROI (Region Of Interest) from the LiDAR data domain is projected to the aerial image domain as the ROI in the aerial image. Hough transform linear features are extracted in the aerial image ROI. For a straight bridge, a 1st order polynomial function is used; whereas, for a curved bridge, a 2nd order polynomial function is used to fit the endpoints of the Hough linear features. The last step is the transformation of the smooth bridge boundaries from the aerial image back to the LiDAR data domain and their merging with the coarse DBM. Based on our experiments, this new approach is capable of providing a precise DBM which can be further merged with the DTM (Digital Terrain Model) derived from LiDAR data to obtain the precise DSM (Digital Surface Model). Such a precise DSM can be used to improve the orthophoto product quality.
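
    The boundary-fitting step can be sketched with OpenCV's probabilistic Hough transform followed by a polynomial fit through the segment endpoints; the image path is a placeholder for the projected ROI and the thresholds are illustrative.

```python
import cv2
import numpy as np

roi = cv2.imread("bridge_roi.png", cv2.IMREAD_GRAYSCALE)   # placeholder ROI image
edges = cv2.Canny(roi, 50, 150)
segments = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=80,
                           minLineLength=40, maxLineGap=10)

pts = segments.reshape(-1, 4)                # (x1, y1, x2, y2) per segment
xs = np.concatenate([pts[:, 0], pts[:, 2]])
ys = np.concatenate([pts[:, 1], pts[:, 3]])

order = 2                                    # 1 for a straight bridge, 2 for a curved one
coeffs = np.polyfit(xs, ys, order)
boundary = np.poly1d(coeffs)                 # smooth boundary to project back to LiDAR space
print("fitted boundary coefficients:", coeffs)
```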

  14. Bianchi VI_0 and III models: self-similar approach

    Energy Technology Data Exchange (ETDEWEB)

    Belinchon, Jose Antonio, E-mail: abelcal@ciccp.e [Departamento de Fisica, ETS Arquitectura, UPM, Av. Juan de Herrera 4, Madrid 28040 (Spain)

    2009-09-07

    We study several cosmological models with Bianchi VI_0 and III symmetries under the self-similar approach. We find new solutions for the 'classical' perfect fluid model as well as for the vacuum model although they are really restrictive for the equation of state. We also study a perfect fluid model with time-varying constants, G and Λ. As in other studied models we find that the behaviour of G and Λ are related. If G behaves as a growing time function then Λ is a positive decreasing time function but if G is decreasing then Λ_0 is negative. We end by studying a massive cosmic string model, putting special emphasis on calculating the numerical values of the equations of state. We show that there is no SS solution for a string model with time-varying constants.

  15. Ship Impact Study: Analytical Approaches and Finite Element Modeling

    Directory of Open Access Journals (Sweden)

    Pawel Woelke

    2012-01-01

    Full Text Available The current paper presents the results of a ship impact study conducted using various analytical approaches available in the literature with the results obtained from detailed finite element analysis. Considering a typical container vessel impacting a rigid wall with an initial speed of 10 knots, the study investigates the forces imparted on the struck obstacle, the energy dissipated through inelastic deformation, penetration, local deformation patterns, and local failure of the ship elements. The main objective of the paper is to study the accuracy and generality of the predictions of the vessel collision forces, obtained by means of analytical closed-form solutions, in reference to detailed finite element analyses. The results show that significant discrepancies between simplified analytical approaches and detailed finite element analyses can occur, depending on the specific impact scenarios under consideration.

  16. A new approach for modeling dry deposition velocity of particles

    Science.gov (United States)

    Giardina, M.; Buffa, P.

    2018-05-01

    The dry deposition process is recognized as an important pathway among the various removal processes of pollutants in the atmosphere. In this field, several models reported in the literature are useful for predicting the dry deposition velocity of particles of different diameters, but many of them are not capable of representing dry deposition phenomena for several categories of pollutants and deposition surfaces. Moreover, their application is valid only for specific conditions and only if the data in a given application meet all of the assumptions used to define the model. In this paper a new dry deposition velocity model based on an electrical analogy scheme is proposed to overcome the above issues. The dry deposition velocity is evaluated by assuming that the resistances that affect the particle flux in the quasi-laminar sub-layer can be combined to take into account local features of the mutual influence of the inertial impact and turbulent processes. Comparisons with experimental data from the literature indicate that the proposed model captures the main dry deposition phenomena with good agreement for the examined environmental conditions and deposition surfaces. The proposed approach could easily be implemented within atmospheric dispersion modeling codes, efficiently addressing different deposition surfaces and several classes of particle pollution.
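
    For contrast with the record above, the classical resistance-analogy formula for particle dry deposition is easy to state; the sketch below implements that textbook expression (not the new model proposed in the paper), with illustrative resistance values.

```python
def deposition_velocity(r_a, r_b, v_g):
    """Classical resistance form v_d = v_g + 1/(r_a + r_b + r_a*r_b*v_g), where
    r_a is the aerodynamic resistance [s/m], r_b the quasi-laminar sub-layer
    resistance [s/m] and v_g the gravitational settling velocity [m/s]."""
    return v_g + 1.0 / (r_a + r_b + r_a * r_b * v_g)

# Example: r_a = 50 s/m, r_b = 300 s/m, v_g = 1e-4 m/s (sub-micron particle)
print(deposition_velocity(50.0, 300.0, 1e-4))   # roughly 2.9e-3 m/s
```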

  17. A coordination chemistry approach for modeling trace element adsorption

    International Nuclear Information System (INIS)

    Bourg, A.C.M.

    1986-01-01

    The traditional distribution coefficient, Kd, is highly dependent on the water chemistry and the surface properties of the geological system being studied and is therefore quite inappropriate for use in predictive models. Adsorption, one of the many processes included in Kd values, is described here using a coordination chemistry approach. The concept of adsorption of cationic trace elements by solid hydrous oxides can be applied to natural solids. The adsorption process is thus understood in terms of a classical complexation leading to the formation of surface (heterogeneous) ligands. Applications of this concept to some freshwater, estuarine and marine environments are discussed. (author)

  18. Stability of Rotor Systems: A Complex Modelling Approach

    DEFF Research Database (Denmark)

    Kliem, Wolfhard; Pommer, Christian; Stoustrup, Jakob

    1996-01-01

    A large class of rotor systems can be modelled by a complex matrix differential equation of second order. The angular velocity of the rotor plays the role of a parameter. We apply the Lyapunov matrix equation in a complex setting and prove two new stability results which are compared with the results of the classical approach using Rayleigh quotients. Several rotor systems are tested: a simple Laval rotor, a Laval rotor with additional elasticity and damping in the bearings, and a number of rotor systems with complex symmetric 4x4 randomly generated matrices.

  19. Modelling and simulating retail management practices: a first approach

    OpenAIRE

    Siebers, Peer-Olaf; Aickelin, Uwe; Celia, Helen; Clegg, Chris

    2010-01-01

    Multi-agent systems offer a new and exciting way of understanding the world of work. We apply agent-based modeling and simulation to investigate a set of problems in a retail context. Specifically, we are working to understand the relationship between people management practices on the shop-floor and retail performance. Despite the fact we are working within a relatively novel and complex domain, it is clear that using an agent-based approach offers great potential for improving organizati...

  20. Data mining approach to model the diagnostic service management.

    Science.gov (United States)

    Lee, Sun-Mi; Lee, Ae-Kyung; Park, Il-Su

    2006-01-01

    Korea has a National Health Insurance Program operated by the government-owned National Health Insurance Corporation, and diagnostic services are provided every two years for the insured and their family members. Developing a customer relationship management (CRM) system using data mining technology would be useful to improve the performance of diagnostic service programs. Under these circumstances, this study developed a model for diagnostic service management taking into account the characteristics of subjects using a data mining approach. This study could be further used to develop an automated CRM system contributing to an increase in the rate of receiving diagnostic services.

  1. On quantum approach to modeling of plasmon photovoltaic effect

    DEFF Research Database (Denmark)

    Kluczyk, Katarzyna; David, Christin; Jacak, Witold Aleksander

    2017-01-01

    ...e.g., upon commercial COMSOL software system). Both approaches are essentially classical ones and neglect quantum particularities related to plasmon excitations in metallic components. We demonstrate that these quantum plasmon effects are of crucial importance especially in theoretical simulations of plasmon...... to the semiconductor solar cell mediated by surface plasmons in metallic nanoparticles deposited on the top of the battery. In addition, short-ranged electron-electron interaction in metals is discussed in the framework of the semiclassical hydrodynamic model. The significance of the related quantum corrections...

  2. New Approaches in Reusable Booster System Life Cycle Cost Modeling

    Science.gov (United States)

    Zapata, Edgar

    2013-01-01

    This paper presents the results of a 2012 life cycle cost (LCC) study of hybrid Reusable Booster Systems (RBS) conducted by NASA Kennedy Space Center (KSC) and the Air Force Research Laboratory (AFRL). The work included the creation of a new cost estimating model and an LCC analysis, building on past work where applicable, but emphasizing the integration of new approaches in life cycle cost estimation. Specifically, the inclusion of industry processes/practices and indirect costs was a new and significant part of the analysis. The focus of LCC estimation has traditionally been from the perspective of technology, design characteristics, and related factors such as reliability. Technology has informed the cost related support to decision makers interested in risk and budget insight. This traditional emphasis on technology occurs even though it is well established that complex aerospace systems costs are mostly about indirect costs, with likely only partial influence in these indirect costs being due to the more visible technology products. Organizational considerations, processes/practices, and indirect costs are traditionally derived ("wrapped") only by relationship to tangible product characteristics. This traditional approach works well as long as it is understood that no significant changes, and by relation no significant improvements, are being pursued in the area of either the government acquisition or industry's indirect costs. In this sense then, most launch systems cost models ignore most costs. The alternative was implemented in this LCC study, whereby the approach considered technology and process/practices in balance, with as much detail for one as the other. This RBS LCC study has avoided point-designs, for now, instead emphasizing exploring the trade-space of potential technology advances joined with potential process/practice advances. Given the range of decisions, and all their combinations, it was necessary to create a model of the original model

  3. THE EFFECTIVENESS OF RHETORIC-BASED ESSAY WRITING TEACHING MODEL WITH CONTEXTUAL APPROACH

    Directory of Open Access Journals (Sweden)

    - Akbar

    2015-06-01

    Full Text Available This study aims to develop a rhetoric-based essay writing teaching model with a contextual approach in order to improve the essay writing skills of students in the English Department of the Education and Teaching Faculty of Lakidende University of Konawe. This instructional model was developed using a research and development method. The results show that the model can improve students' essay writing skills effectively. The experiment was conducted in an experimental class of the Education and Teaching Faculty of Lakidende University of Konawe, Southeast Sulawesi province, Indonesia, with a score of 69.80. Thus, it can be concluded that the rhetoric-based essay writing teaching model with a contextual approach that has been developed can improve the essay writing skills of students of the English Department and is appropriate for use.

  4. The influence of mathematics learning using SAVI approach on junior high school students’ mathematical modelling ability

    Science.gov (United States)

    Khusna, H.; Heryaningsih, N. Y.

    2018-01-01

    The aim of this research was to examine the mathematical modeling ability of students who learn mathematics using the SAVI approach. This research was a quasi-experimental study with a non-equivalent control group design, using a purposive sampling technique. The population of this research was state junior high school students in Lembang, while the sample consisted of two classes at the 8th grade. The instrument used in this research was a mathematical modeling ability test. Data analysis was conducted using SPSS 20 for Windows. The results showed that the mathematical modeling ability of students who learn mathematics using the SAVI approach was better than that of students who learn mathematics using conventional learning.

  5. A generalized linear factor model approach to the hierarchical framework for responses and response times.

    Science.gov (United States)

    Molenaar, Dylan; Tuerlinckx, Francis; van der Maas, Han L J

    2015-05-01

    We show how the hierarchical model for responses and response times as developed by van der Linden (2007), Fox, Klein Entink, and van der Linden (2007), Klein Entink, Fox, and van der Linden (2009), and Glas and van der Linden (2010) can be simplified to a generalized linear factor model with only the mild restriction that there is no hierarchical model at the item side. This result is valuable as it enables all well-developed modelling tools and extensions that come with these methods. We show that the restriction we impose on the hierarchical model does not influence parameter recovery under realistic circumstances. In addition, we present two illustrative real data analyses to demonstrate the practical benefits of our approach. © 2014 The British Psychological Society.

  6. Machine Learning Approach for Predicting Wall Shear Distribution for Abdominal Aortic Aneurysm and Carotid Bifurcation Models.

    Science.gov (United States)

    Jordanski, Milos; Radovic, Milos; Milosevic, Zarko; Filipovic, Nenad; Obradovic, Zoran

    2018-03-01

    Computer simulations based on the finite element method represent powerful tools for modeling blood flow through arteries. However, due to its computational complexity, this approach may be inappropriate when results are needed quickly. In order to reduce computational time, in this paper, we proposed an alternative machine learning based approach for calculation of wall shear stress (WSS) distribution, which may play an important role in mechanisms related to initiation and development of atherosclerosis. In order to capture relationships between geometric parameters, blood density, dynamic viscosity and velocity, and WSS distribution of geometrically parameterized abdominal aortic aneurysm (AAA) and carotid bifurcation models, we proposed multivariate linear regression, multilayer perceptron neural network and Gaussian conditional random fields (GCRF). Results obtained in this paper show that machine learning approaches can successfully predict WSS distribution at different cardiac cycle time points. Even though all proposed methods showed high potential for WSS prediction, GCRF achieved the highest coefficient of determination (0.930-0.948 for AAA model and 0.946-0.954 for carotid bifurcation model) demonstrating benefits of accounting for spatial correlation. The proposed approach can be used as an alternative method for real time calculation of WSS distribution.
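
    Two of the three surrogate models named above are available off the shelf in scikit-learn; the sketch below fits them to synthetic stand-in data (the GCRF, which adds spatial correlation, is not included here).

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.random((2000, 6))     # geometry, blood density, viscosity, velocity features
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] ** 2 + 0.1 * rng.standard_normal(2000)  # toy "WSS"

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

for model in (LinearRegression(),
              MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)):
    model.fit(X_tr, y_tr)
    print(type(model).__name__, "R^2 =", round(r2_score(y_te, model.predict(X_te)), 3))
```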

  7. A Multiple siRNA-Based Anti-HIV/SHIV Microbicide Shows Protection in Both In Vitro and In Vivo Models.

    Directory of Open Access Journals (Sweden)

    Sandhya Boyapalle

    Full Text Available Human immunodeficiency virus (HIV) types 1 and 2 (HIV-1 and HIV-2) are the etiologic agents of AIDS. Most HIV-1 infected individuals worldwide are women, who acquire HIV infections during sexual contact. Blocking HIV mucosal transmission and local spread in the female lower genital tract is important in preventing infection and ultimately eliminating the pandemic. Microbicides work by destroying the microbes or preventing them from establishing an infection. Thus, a number of different types of microbicides are under investigation; however, their lack of solubility and bioavailability, and their toxicity, have been major hurdles. Herein, we report the development of multifunctional chitosan-lipid nanocomplexes that can effectively deliver plasmids encoding siRNA(s) as microbicides without adverse effects and provide significant protection against HIV in both in vitro and in vivo models. Chitosan or chitosan-lipid (chlipid) was complexed with a cocktail of plasmids encoding HIV-1-specific siRNAs (psiRNAs) and evaluated for their efficacy in HEK-293 cells, PBMCs derived from nonhuman primates, a 3-dimensional human vaginal ectocervical tissue (3D-VEC) model, and also in a non-human primate model. Moreover, prophylactic administration of the chlipid to deliver a psiRNA cocktail intravaginally with a cream formulation in a non-human primate model showed substantial reduction of SHIV (simian/human immunodeficiency virus) SF162 viral titers. Taken together, these studies demonstrate the potential of chlipid-siRNA nanocomplexes as a genetic microbicide against HIV infections.

  9. A Deep Learning based Approach to Reduced Order Modeling of Fluids using LSTM Neural Networks

    Science.gov (United States)

    Mohan, Arvind; Gaitonde, Datta

    2017-11-01

    Reduced Order Modeling (ROM) can be used as a surrogate to prohibitively expensive simulations to model flow behavior for long time periods. ROM is predicated on extracting dominant spatio-temporal features of the flow from CFD or experimental datasets. We explore ROM development with a deep learning approach, which involves learning functional relationships between different variables in large datasets for predictive modeling. Although deep learning and related artificial intelligence based predictive modeling techniques have shown varied success in other fields, such approaches are in their initial stages of application to fluid dynamics. Here, we explore the application of the Long Short Term Memory (LSTM) neural network to sequential data, specifically to predict the time coefficients of Proper Orthogonal Decomposition (POD) modes of the flow for future timesteps, by training it on data at previous timesteps. The approach is demonstrated by constructing ROMs of several canonical flows. Additionally, we show that statistical estimates of stationarity in the training data can indicate a priori how amenable a given flow-field is to this approach. Finally, the potential and limitations of deep learning based ROM approaches will be elucidated and further developments discussed.
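
    A hedged sketch of the core step named in this abstract, not the authors' implementation: a small PyTorch LSTM is trained to predict the next-step POD time coefficients from a window of earlier steps, using synthetic sinusoidal coefficients in place of coefficients extracted from CFD snapshots. Layer sizes, window length and data are illustrative assumptions.

```python
# Hedged sketch of the LSTM-ROM idea: learn to predict the next-step POD time
# coefficients from a window of previous steps. Data are synthetic placeholders.
import torch
import torch.nn as nn

torch.manual_seed(0)
n_modes, window = 4, 16
t = torch.linspace(0, 20, 2000)
coeffs = torch.stack([torch.sin((k + 1) * t) for k in range(n_modes)], dim=1)  # (T, n_modes)

# Build (window -> next step) training pairs
X = torch.stack([coeffs[i:i + window] for i in range(len(t) - window)])        # (N, window, n_modes)
y = coeffs[window:]                                                            # (N, n_modes)

class PODLSTM(nn.Module):
    def __init__(self, n_modes, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(n_modes, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_modes)
    def forward(self, x):
        out, _ = self.lstm(x)
        return self.head(out[:, -1])       # last hidden state predicts the next step

model = PODLSTM(n_modes)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
for epoch in range(200):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()
print("final training MSE:", float(loss))
```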

  10. Worldline approach to the Grosse-Wulkenhaar model

    Science.gov (United States)

    Viñas, Sebastián Franchino; Pisani, Pablo

    2014-11-01

    We apply the worldline formalism to the Grosse-Wulkenhaar model and obtain an expression for the one-loop effective action which provides an efficient way for computing Schwinger functions in this theory. Using this expression we obtain the quantum corrections to the effective background and the β-functions, which are known to vanish at the self-dual point. The case of degenerate noncommutativity is also considered. Our main result can be straightforwardly applied to any polynomial self-interaction of the scalar field and we consider that the worldline approach could be useful for studying effective actions of noncommutative gauge fields as well as in other non-local models or in higher-derivative field theories.

  11. Comparison of different approaches of modelling in a masonry building

    Science.gov (United States)

    Saba, M.; Meloni, D.

    2017-12-01

    The present work has the objective of modelling a simple masonry building through two different modelling methods, in order to assess their validity in terms of the evaluation of static stresses. Two of the most widely used commercial software packages for this kind of problem were chosen: 3Muri by S.T.A. Data S.r.l. and Sismicad12 by Concrete S.r.l. While the 3Muri software adopts the Frame by Macro Elements Method (FME), which should be more schematic and more efficient, the Sismicad12 software uses the Finite Element Method (FEM), which guarantees accurate results, with greater computational burden. Remarkable differences in the static stresses between the two approaches have been found for such a simple structure, and an interesting comparison and analysis of the reasons is proposed.

  12. A Data Mining Approach to Modelling of Water Supply Assets

    DEFF Research Database (Denmark)

    Babovic, V.; Drecourt, J.; Keijzer, M.

    2002-01-01

    The economic and social costs associated with pipe bursts and associated leakage problems in modern water supply systems are rapidly rising to unacceptably high levels. Pipe burst risks depend on a number of factors which are extremely difficult to characterise. A part of the problem is that water...... supply assets are mainly situated underground, and therefore not visible and under the influence of various highly unpredictable forces. This paper proposes the use of advanced data mining methods in order to determine the risks of pipe bursts. For example, analysis of the database of already occurred...... with the choice of pipes to be replaced, the outlined approach opens completely new avenues in asset modelling. The condition of an asset such as a water supply network deteriorates with age. With reliable risk models, addressing the evolution of risk with aging assets, it is now possible to plan optimal...

  13. The Use of Modeling Approach for Teaching Exponential Functions

    Science.gov (United States)

    Nunes, L. F.; Prates, D. B.; da Silva, J. M.

    2017-12-01

    This work presents a discussion related to the teaching and learning of mathematical contents related to the study of exponential functions in a group of freshman students enrolled in the first semester of the Science and Technology Bachelor's (STB) programme of the Federal University of Jequitinhonha and Mucuri Valleys (UFVJM). As a contextualization tool strongly mentioned in the literature, the modelling approach was used as an educational teaching tool to produce contextualization in the teaching-learning process of exponential functions for these students. In this sense, some simple models elaborated with the GeoGebra software were used and, in order to obtain a qualitative evaluation of the investigation and its results, Didactic Engineering was used as the research methodology. As a consequence of this detailed research, some interesting details about the teaching and learning process were observed, discussed and described.

  14. Modelling hybrid stars in quark-hadron approaches

    Energy Technology Data Exchange (ETDEWEB)

    Schramm, S. [FIAS, Frankfurt am Main (Germany); Dexheimer, V. [Kent State University, Department of Physics, Kent, OH (United States); Negreiros, R. [Federal Fluminense University, Gragoata, Niteroi (Brazil)

    2016-01-15

    The density in the core of neutron stars can reach values of about 5 to 10 times nuclear matter saturation density. It is, therefore, a natural assumption that hadrons may have dissolved into quarks under such conditions, forming a hybrid star. This star will have an outer region of hadronic matter and a core of quark matter or even a mixed state of hadrons and quarks. In order to investigate such phases, we discuss different model approaches that can be used in the study of compact stars as well as being applicable to a wider range of temperatures and densities. One major model ingredient, the role of quark interactions in the stability of massive hybrid stars, is discussed. In this context, possible conflicts with lattice QCD simulations are investigated. (orig.)

  16. Global GPS Ionospheric Modelling Using Spherical Harmonic Expansion Approach

    Directory of Open Access Journals (Sweden)

    Byung-Kyu Choi

    2010-12-01

    Full Text Available In this study, we developed a global ionosphere model based on measurements from a worldwide network of the global positioning system (GPS). About 100 international GPS reference stations were used to develop the ionospheric model, and the spherical harmonic expansion approach was used as the mathematical method. In order to produce the ionospheric total electron content (TEC) in grid form, we defined a spatial resolution of 2.0 degrees in latitude and 5.0 degrees in longitude. Two-dimensional TEC maps were constructed at one-hour intervals, giving a higher temporal resolution than the global ionosphere maps produced by several analysis centers. As a result, we could detect the sudden increase of TEC by processing GPS observables on 29 October 2003, when a massive solar flare took place.
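
    A rough sketch of the mathematical idea behind this abstract, under stated assumptions: scattered (synthetic) vertical TEC samples are fitted by least squares to a low-degree real spherical harmonic basis, and the fitted expansion is evaluated on a 2.0-degree by 5.0-degree latitude/longitude grid. The maximum degree, the fake TEC model and the station coordinates are placeholders, not the study's actual processing chain.

```python
# Hedged sketch: least-squares fit of low-degree spherical harmonic coefficients
# to scattered TEC samples, then evaluation on a regular lat/lon grid.
import numpy as np
from scipy.special import sph_harm

def real_sh_basis(lat_deg, lon_deg, n_max):
    """Real-valued spherical-harmonic design matrix for scattered points."""
    theta = np.radians(lon_deg) % (2 * np.pi)      # azimuth
    phi = np.radians(90.0 - lat_deg)               # colatitude
    cols = []
    for n in range(n_max + 1):
        for m in range(0, n + 1):
            Y = sph_harm(m, n, theta, phi)
            cols.append(Y.real)
            if m > 0:
                cols.append(Y.imag)
    return np.column_stack(cols)

rng = np.random.default_rng(1)
lat = rng.uniform(-80, 80, 400)
lon = rng.uniform(-180, 180, 400)
tec = 20 + 15 * np.cos(np.radians(lat)) + rng.normal(0, 1, 400)   # fake TEC in TECU

A = real_sh_basis(lat, lon, n_max=4)
coef, *_ = np.linalg.lstsq(A, tec, rcond=None)

# Evaluate the fitted expansion on a 2.0 deg x 5.0 deg grid, as in the abstract
glat, glon = np.meshgrid(np.arange(-88, 89, 2.0), np.arange(-180, 180, 5.0), indexing="ij")
tec_map = real_sh_basis(glat.ravel(), glon.ravel(), n_max=4) @ coef
print("grid shape:", glat.shape, "mean TEC:", round(float(tec_map.mean()), 2))
```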

  17. Static models, recursive estimators and the zero-variance approach

    KAUST Repository

    Rubino, Gerardo

    2016-01-07

    When evaluating dependability aspects of complex systems, most models belong to the static world, where time is not an explicit variable. These models suffer from the same problems as dynamic ones (stochastic processes), such as the frequent combinatorial explosion of the state spaces. In the Monte Carlo domain, one of the most significant difficulties is the rare event situation. In this talk, we describe this context and a recent technique that appears to be at the top performance level in the area, in which we combined ideas that lead to very fast estimation procedures with another approach called zero-variance approximation. Both ideas produced a very efficient method that has the right theoretical property concerning robustness, namely the Bounded Relative Error property. Some examples illustrate the results.

  18. Simplistic approach for 2D grown-in microdefect modeling

    Energy Technology Data Exchange (ETDEWEB)

    Prostomolotov, Anatoly; Verezub, Nataliya [Institute for Problems in Mechanics, Russian Academy of Sciences, Moscow (Russian Federation)

    2009-08-15

    In the present paper the influence of cooling conditions on microdefect formation in a Si single crystal was analysed on the basis of an analytical formulation for the crystal temperature field, jointly with newly developed two-dimensional (2D) models of microdefect formation. The new mathematical model is applied to calculations of vacancy microdefect formation, in which the 2D vacancy migration process is taken into account and an approximate calculation algorithm is offered that does not require storing data for the whole defect-growth pre-history. The calculated results are discussed for the growth conditions of Cz silicon single crystals. (copyright 2009 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim) (orig.)

  19. Making the most of what we have: application of extrapolation approaches in radioecological wildlife transfer models

    International Nuclear Information System (INIS)

    Beresford, Nicholas A.; Wood, Michael D.; Vives i Batlle, Jordi; Yankovich, Tamara L.; Bradshaw, Clare; Willey, Neil

    2016-01-01

    We will never have data to populate all of the potential radioecological modelling parameters required for wildlife assessments. Therefore, we need robust extrapolation approaches which allow us to make best use of our available knowledge. This paper reviews and, in some cases, develops, tests and validates some of the suggested extrapolation approaches. The concentration ratio (CR_product-diet or CR_wo-diet) is shown to be a generic (trans-species) parameter which should enable the more abundant data for farm animals to be applied to wild species. An allometric model for predicting the biological half-life of radionuclides in vertebrates is further tested and generally shown to perform acceptably. However, to fully exploit allometry we need to understand why some elements do not scale to expected values. For aquatic ecosystems, the relationship between log10(a) (a parameter from the allometric relationship for the organism-water concentration ratio) and log(Kd) presents a potential opportunity to estimate concentration ratios using Kd values. An alternative approach to the CR_wo-media model proposed for estimating the transfer of radionuclides to freshwater fish is used to satisfactorily predict activity concentrations in fish of different species from three lakes. We recommend that this approach (REML modelling) be further investigated and developed for other radionuclides and across a wider range of organisms and ecosystems. Ecological stoichiometry shows potential as an extrapolation method in radioecology, either from one element to another or from one species to another. Although some of the approaches considered require further development and testing, we demonstrate the potential to significantly improve predictions of radionuclide transfer to wildlife by making better use of available data. - Highlights: • Robust extrapolation approaches allowing best use of available knowledge are needed. • Extrapolation approaches are reviewed, developed
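
    For readers unfamiliar with the allometric relationships named above, their generic forms are sketched below in assumed notation; the symbols and exponents are illustrative only and are not taken from the paper.

```latex
% Illustrative forms only (notation assumed): allometric scaling of biological
% half-life with body mass M, and the allometric organism-water concentration
% ratio whose intercept log10(a) the abstract relates to log(Kd).
\begin{align}
  T_{1/2,\mathrm{biol}} &= a_{T}\, M^{\,b_{T}} \\
  \mathrm{CR}_{\mathrm{org\text{-}water}} &= a\, M^{\,b}
    \quad\Longrightarrow\quad
    \log_{10}\mathrm{CR}_{\mathrm{org\text{-}water}} = \log_{10} a + b\,\log_{10} M
\end{align}
```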

  20. A discrete element modelling approach for block impacts on trees

    Science.gov (United States)

    Toe, David; Bourrier, Franck; Olmedo, Ignatio; Berger, Frederic

    2015-04-01

    In the past few years, rockfall models explicitly accounting for block shape, especially those using the Discrete Element Method (DEM), have shown a good ability to predict rockfall trajectories. Integrating forest effects into those models still remains challenging. This study aims at using a DEM approach to model impacts of blocks on trees and to identify the key parameters controlling the block kinematics after the impact on a tree. A DEM impact model of a block on a tree was developed and validated using laboratory experiments. Then, key parameters were assessed using a global sensitivity analysis. Modelling the impact of a block on a tree using DEM allows taking into account large displacements, material non-linearities and contacts between the block and the tree. Tree stems are represented by flexible cylinders modelled as plastic beams sustaining normal, shearing, bending, and twisting loading. Root-soil interactions are modelled using a rotation stiffness acting on the bending moment at the bottom of the tree and a limit bending moment to account for tree overturning. The crown is taken into account using an additional mass distributed uniformly on the upper part of the tree. The block is represented by a sphere. The contact model between the block and the stem consists of an elastic frictional model. The DEM model was validated using laboratory impact tests carried out on 41 fresh beech (Fagus Sylvatica) stems. Each stem was 1.3 m long with a diameter between 3 and 7 cm. Wood stems were clamped on a rigid structure and impacted by a 149 kg Charpy pendulum. Finally, an intensive simulation campaign of blocks impacting trees was done to identify the input parameters controlling the block kinematics after the impact on a tree. 20 input parameters were considered in the DEM simulation model: 12 parameters were related to the tree and 8 parameters to the block. The results highlight that the impact velocity, the stem diameter, and the block volume are the three input

  1. Managing the Resilience of Lakes: A Multi-agent Modeling Approach

    Directory of Open Access Journals (Sweden)

    Marco A. Janssen

    1999-12-01

    Full Text Available We demonstrate an approach for integrating social and ecological models to study ecosystem management strategies. We focus on the management of lake eutrophication. A model has been developed in which the dynamics of the lake, the learning dynamics of society, and the interactions between ecology and society are included. Analyses with the model show that active learning is important to retain the resilience of lakes. Although very low levels of phosphorus in the water will not be reached, active learning reduces the chance of catastrophically high phosphorus levels.

  2. An Approach to Model Based Testing of Multiagent Systems

    Directory of Open Access Journals (Sweden)

    Shafiq Ur Rehman

    2015-01-01

    Full Text Available Autonomous agents perform on behalf of the user to achieve defined goals or objectives. They are situated in a dynamic environment and are able to operate autonomously to achieve their goals. In a multiagent system, agents cooperate with each other to achieve a common goal. Testing of multiagent systems is a challenging task due to the autonomous and proactive behavior of agents. However, testing is required to build confidence in the working of a multiagent system. The Prometheus methodology is a commonly used approach to design multiagent systems. Systematic and thorough testing of each interaction is necessary. This paper proposes a novel approach to testing of multiagent systems based on Prometheus design artifacts. In the proposed approach, different interactions between the agent and actors are considered to test the multiagent system. These interactions include percepts and actions along with messages between the agents, which can be modeled in a protocol diagram. The protocol diagram is converted into a protocol graph, on which different coverage criteria are applied to generate test paths that cover interactions between the agents. A prototype tool has been developed to generate test paths from the protocol graph according to the specified coverage criterion.
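
    A small illustrative sketch, not the authors' prototype tool: given a hypothetical protocol graph of percepts, actions and messages, simple start-to-sink paths are generated until every edge is covered, which is one way to realise an all-edges coverage criterion. The graph and node names are made up, and the routine assumes an acyclic protocol graph.

```python
# Hedged sketch: generate test paths over a made-up protocol graph until all
# edges are covered (an "all-edges" coverage criterion). Assumes an acyclic graph.
from collections import deque

protocol_graph = {
    "start":           ["percept_order"],
    "percept_order":   ["msg_check_stock"],
    "msg_check_stock": ["action_reserve", "msg_reject"],
    "action_reserve":  ["msg_confirm"],
    "msg_confirm":     [],
    "msg_reject":      [],
}

def edge_covering_paths(graph, start):
    """Return start-to-sink paths that together traverse every edge."""
    uncovered = {(u, v) for u, vs in graph.items() for v in vs}
    paths = []
    while uncovered:
        # breadth-first search for a sink path that uses at least one uncovered edge
        queue, chosen = deque([[start]]), None
        while queue:
            path = queue.popleft()
            node = path[-1]
            if not graph[node]:                      # reached a sink
                if set(zip(path, path[1:])) & uncovered:
                    chosen = path
                    break
            for nxt in graph[node]:
                queue.append(path + [nxt])
        if chosen is None:
            break                                    # remaining edges unreachable from start
        uncovered -= set(zip(chosen, chosen[1:]))
        paths.append(chosen)
    return paths

for p in edge_covering_paths(protocol_graph, "start"):
    print(" -> ".join(p))
```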

  3. THE FAIRSHARES MODEL: AN ETHICAL APPROACH TO SOCIAL ENTERPRISE DEVELOPMENT?

    Directory of Open Access Journals (Sweden)

    Rory James Ridley-Duff

    2015-07-01

    Full Text Available This paper is based on the keynote address to the 14th International Association of Public and Non-Profit Marketing (IAPNM) conference. It explores the question "What impact do ethical values in the FairShares Model have on social entrepreneurial behaviour?" In the first part, three broad approaches to social enterprise are set out: co-operative and mutual enterprises (CMEs), social and responsible businesses (SRBs) and charitable trading activities (CTAs). The ethics that guide each approach are examined to provide a conceptual framework for examining FairShares as a case study. In the second part, findings are scrutinised in terms of the ethical values and principles that are activated when FairShares is applied in practice. The paper contributes to knowledge by giving an example of the way OpenSource technology (Loomio) has been used to translate 'espoused theories' into 'theories in use' to advance social enterprise development. The review of FairShares using the conceptual framework suggests there is a fourth approach based on multi-stakeholder co-operation to create 'associative democracy' in the workplace.

  4. Engineering approach to model and compute electric power markets settlements

    International Nuclear Information System (INIS)

    Kumar, J.; Petrov, V.

    2006-01-01

    Back-office accounting settlement activities are an important part of market operations in Independent System Operator (ISO) organizations. A potential way to measure ISO market design correctness is to analyze how well market price signals create incentives or penalties for creating an efficient market to achieve market design goals. Market settlement rules are an important tool for implementing price signals which are fed back to participants via the settlement activities of the ISO. ISOs are currently faced with the challenge of high volumes of data resulting from the increasing size of markets and ever-changing market designs, as well as the growing complexity of wholesale energy settlement business rules. This paper analyzed the problem and presented a practical engineering solution using an approach based on mathematical formulation and modeling of large-scale calculations. The paper also presented critical comments on various differences in settlement design approaches to electrical power market design, as well as further areas of development. The paper provided a brief introduction to wholesale energy market settlement systems and discussed problem formulation. An actual settlement implementation framework and a discussion of the results and conclusions were also presented. It was concluded that a proper engineering approach to this domain can yield satisfying results by formalizing wholesale energy settlements. Significant improvements were observed in the initial preparation phase, scoping and effort estimation, implementation and testing. 5 refs., 2 figs

  5. Right approach to 3D modeling using CAD tools

    Science.gov (United States)

    Baddam, Mounica Reddy

    The thesis provides a step-by-step methodology to enable an instructor dealing with CAD tools to optimally guide his/her students through an understandable 3D modeling approach which will not only enhance their knowledge about the tool's usage but also enable them to achieve their desired result in comparatively less time. In practice, there is very little information available on applying CAD skills to formal beginners' training sessions. Additionally, the advent of new software in the 3D domain makes keeping up to date an increasingly difficult task. Keeping up with the industry's advanced requirements emphasizes the importance of more skilled hands in the field of CAD development, rather than just prioritizing manufacturing in terms of complex software features. The thesis analyses different 3D modeling approaches specific to the varieties of CAD tools currently available in the market. Utilizing performance-time databases, learning curves have been generated to measure performance time, feature count, etc. Based on the results, improvement parameters have also been provided (Asperl, 2005).

  6. Integration of measurement data in the comprehensive modelling approach

    Science.gov (United States)

    Sieber, I.; Rübenach, O.

    2013-09-01

    Efficient and reliable optical design requires knowledge of the production chain, the materials used, and the environmental circumstances in the field of operation. This is realized in the comprehensive modelling approach consisting of three steps: • Design for manufacturing, i.e. the model must be adjusted to the process chain. Knowledge of design rules is required. • Robust design, i.e. optimization of the functional design with the objective of compensating the influence of tolerances on the system's performance. Knowledge of the tolerances of the individual process steps is required. • Reliable design with respect to environmental and operational effects. Coupling of an optical and a mechanical simulation tool is required to form the optical simulation environment. The availability of process knowledge such as e.g. design rules and manufacturing tolerances is ensured by coupling the optical simulation environment with a process knowledge database. Integration of measured surface data in this simulation environment enables a realistic simulation and analysis of real, manufactured optics. This approach allows e.g. for the evaluation of replication methods such as precision molding or injection molding against high-precision manufacturing methods, e.g. diamond turning.

  7. How acquainting shows verbally

    DEFF Research Database (Denmark)

    Hermann, Jesper

    2004-01-01

    A central tenet of the integrational view of language and communication is an anchoring in the acting and integrating behaviour of the communicants themselves. In this way the integrational approach has a certain phenomenological slant. What happens when this approach is combined with a 'psychology of linguistic exertion' based upon the descriptions of the stream of thought, and of conception (acquainting), by William James? Text-examples taken from an internet site for dieters illustrate the combined James'ian & integrational approach in practice. Some of the examples are also specifically related to some of William James' classical psychological observations.

  8. Replacement model of city bus: A dynamic programming approach

    Science.gov (United States)

    Arifin, Dadang; Yusuf, Edhi

    2017-06-01

    This paper aims to develop a replacement model for the city bus vehicles operated in Bandung City. The study is driven by real cases encountered by the Damri Company in its efforts to improve services to the public. The replacement model propounds two policy alternatives: first, to maintain or keep the vehicles; second, to replace them with new ones, taking into account operating costs, revenue, salvage value, and the acquisition cost of a new vehicle. A deterministic dynamic programming approach is used to solve the model. The optimization process was heuristically executed using empirical data of Perum Damri. The output of the model is the replacement schedule and the best policy once a vehicle has passed its economic life. Based on the results, the technical life of the bus is approximately 20 years, while the economic life is an average of 9 (nine) years. This means that after a bus has been operated for 9 (nine) years, managers should consider a rejuvenation policy.
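
    A minimal sketch of a deterministic dynamic-programming replacement model in the spirit of this abstract, assuming made-up cost, salvage and horizon figures rather than Damri data: at each year the policy either keeps the bus (paying an age-dependent operating cost) or replaces it (paying the acquisition cost net of salvage).

```python
# Hedged sketch: keep-or-replace dynamic program over a planning horizon.
# All figures are placeholders, not empirical Perum Damri data.
from functools import lru_cache

HORIZON = 15          # planning years
MAX_AGE = 20          # technical life
ACQUISITION = 100.0

def operating_cost(age):      # rises with age (placeholder)
    return 10.0 + 2.5 * age

def salvage(age):             # declines with age (placeholder)
    return max(ACQUISITION - 8.0 * age, 5.0)

@lru_cache(maxsize=None)
def best_cost(year, age):
    """Minimum total cost from `year` to the horizon, given a bus of `age`."""
    if year == HORIZON:
        return -salvage(age)                     # sell the bus at the end
    keep = float("inf")
    if age < MAX_AGE:                            # keeping only allowed within technical life
        keep = operating_cost(age) + best_cost(year + 1, age + 1)
    replace = ACQUISITION - salvage(age) + operating_cost(0) + best_cost(year + 1, 1)
    return min(keep, replace)

# Recover the policy: in which years should the bus be replaced?
year, age, plan = 0, 5, []
while year < HORIZON:
    keep = operating_cost(age) + best_cost(year + 1, age + 1) if age < MAX_AGE else float("inf")
    replace = ACQUISITION - salvage(age) + operating_cost(0) + best_cost(year + 1, 1)
    if replace < keep:
        plan.append(year)
        age = 1
    else:
        age += 1
    year += 1
print("replace in years:", plan, "| total cost:", round(best_cost(0, 5), 1))
```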

  9. A modelling approach to designing microstructures in thermal barrier coatings

    International Nuclear Information System (INIS)

    Gupta, M.; Nylen, P.; Wigren, J.

    2013-01-01

    Thermomechanical properties of Thermal Barrier Coatings (TBCs) are strongly influenced by coating defects, such as delaminations and pores, thus making it essential to have a fundamental understanding of microstructure-property relationships in TBCs to produce a desired coating. Object-Oriented Finite element analysis (OOF) has previously been shown to be an effective tool for evaluating thermal and mechanical material behaviour, as this method is capable of incorporating the inherent material microstructure as input to the model. In this work, OOF was used to predict the thermal conductivity and effective Young's modulus of TBC topcoats. A Design of Experiments (DoE) was conducted by varying selected parameters for spraying the Yttria-Stabilised Zirconia (YSZ) topcoat. The microstructure was assessed with SEM, and image analysis was used to characterize the porosity content. The relationships between microstructural features and properties predicted by modelling are discussed. The microstructural features having the most beneficial effect on properties were sprayed with a different spray gun so as to verify the results obtained from modelling. Characterisation of the coatings included microstructure evaluation, thermal conductivity and lifetime measurements. The modelling approach, in combination with the experiments undertaken in this study, was shown to be an effective way to achieve coatings with optimised thermo-mechanical properties.

  10. A Modeling Approach for Earthquake-Ionosphere Coupling

    Science.gov (United States)

    Meng, X.; Komjathy, A.; Verkhoglyadova, O. P.; Savastano, G.; Mannucci, A. J.

    2017-12-01

    We present a newly developed modeling approach for the earthquake-ionosphere coupling process, which extends the capability of Wave Perturbation - Global Ionosphere-Thermosphere Model (WP-GITM) developed originally for tsunami-ionosphere coupling. The new WP-GITM represents an earthquake as a point source at its epicenter, and takes the ground vertical velocity data from seismic measurements as input. The model then solves the neutral density, velocity, and temperature perturbations generated by spherical acoustic-gravity waves and the resulting perturbations in ions and electrons. We apply the model to simulate the near-field ionospheric disturbances during two earthquake events with different local times including the 2011 Tohoku-Oki (local afternoon) and the 2015 Illapel events (local evening). To validate the results, we retrieve receiver-to-satellite total electron content (TEC) perturbations from the simulations and compare them to the corresponding slant TEC perturbations from Global Positioning System (GPS) TEC observations. We find good agreement on magnitudes and arrival times between the simulations and observations and discuss directions of future research.

  11. A graphical vector autoregressive modelling approach to the analysis of electronic diary data

    Directory of Open Access Journals (Sweden)

    Zipfel Stephan

    2010-04-01

    Full Text Available Abstract Background In recent years, electronic diaries are increasingly used in medical research and practice to investigate patients' processes and fluctuations in symptoms over time. To model dynamic dependence structures and feedback mechanisms between symptom-relevant variables, a multivariate time series method has to be applied. Methods We propose to analyse the temporal interrelationships among the variables by a structural modelling approach based on graphical vector autoregressive (VAR models. We give a comprehensive description of the underlying concepts and explain how the dependence structure can be recovered from electronic diary data by a search over suitable constrained (graphical VAR models. Results The graphical VAR approach is applied to the electronic diary data of 35 obese patients with and without binge eating disorder (BED. The dynamic relationships for the two subgroups between eating behaviour, depression, anxiety and eating control are visualized in two path diagrams. Results show that the two subgroups of obese patients with and without BED are distinguishable by the temporal patterns which influence their respective eating behaviours. Conclusion The use of the graphical VAR approach for the analysis of electronic diary data leads to a deeper insight into patient's dynamics and dependence structures. An increasing use of this modelling approach could lead to a better understanding of complex psychological and physiological mechanisms in different areas of medical care and research.
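
    As a simplified stand-in for the constrained graphical VAR search described above, the sketch below fits an ordinary vector autoregressive model to synthetic diary-style series with statsmodels; variable names, lag order and data are illustrative assumptions, and significant lag coefficients would correspond to candidate directed edges in a path diagram.

```python
# Hedged sketch: fit a plain VAR model to synthetic multivariate diary-style data.
# This is a simplified stand-in for the constrained graphical VAR search.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(2)
n = 200
eating_control = rng.normal(size=n).cumsum() * 0.1
depression = 0.4 * np.roll(eating_control, 1) + rng.normal(scale=0.5, size=n)
anxiety = 0.3 * np.roll(depression, 1) + rng.normal(scale=0.5, size=n)

data = pd.DataFrame({"eating_control": eating_control,
                     "depression": depression,
                     "anxiety": anxiety})

model = VAR(data)
result = model.fit(maxlags=2, ic="aic")      # lag order chosen by AIC
print(result.summary())
# Lag coefficients whose confidence intervals exclude zero would be the
# candidate directed edges in a graphical VAR path diagram.
```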

  12. Predicting future glacial lakes in Austria using different modelling approaches

    Science.gov (United States)

    Otto, Jan-Christoph; Helfricht, Kay; Prasicek, Günther; Buckel, Johannes; Keuschnig, Markus

    2017-04-01

    Glacier retreat is one of the most apparent consequences of temperature rise in the 20th and 21st centuries in the European Alps. In Austria, more than 240 new lakes have formed in glacier forefields since the Little Ice Age. A similar signal is reported from many mountain areas worldwide. Glacial lakes can have important environmental and socio-economic impacts on high mountain systems, including water resource management, sediment delivery, natural hazards, energy production and tourism. Their development significantly modifies the landscape configuration and visual appearance of high mountain areas. Knowledge of the location, number and extent of these future lakes can be used to assess potential impacts on high mountain geo-ecosystems and upland-lowland interactions. Information on new lakes is critical to appraise emerging threats and potentials for society. The recent development of regional ice thickness models and their combination with high resolution glacier surface data allows predicting the topography below current glaciers by subtracting ice thickness from glacier surface. Analyzing these modelled glacier bed surfaces reveals overdeepenings that represent potential locations for future lakes. In order to predict the location of future glacial lakes below recent glaciers in the Austrian Alps we apply different ice thickness models using high resolution terrain data and glacier outlines. The results are compared and validated with ice thickness data from geophysical surveys. Additionally, we run the models on three different glacier extents provided by the Austrian Glacier Inventories from 1969, 1998 and 2006. Results of this historical glacier extent modelling are compared to existing glacier lakes and discussed focusing on geomorphological impacts on lake evolution. We discuss model performance and observed differences in the results in order to assess the approach for a realistic prediction of future lake locations. The presentation delivers

  13. How to interpret the results of medical time series data analysis: Classical statistical approaches versus dynamic Bayesian network modeling.

    Science.gov (United States)

    Onisko, Agnieszka; Druzdzel, Marek J; Austin, R Marshall

    2016-01-01

    Classical statistics is a well-established approach in the analysis of medical data. While the medical community seems to be familiar with the concept of a statistical analysis and its interpretation, the Bayesian approach, argued by many of its proponents to be superior to the classical frequentist approach, is still not well-recognized in the analysis of medical data. The goal of this study is to encourage data analysts to use the Bayesian approach, such as modeling with graphical probabilistic networks, as an insightful alternative to classical statistical analysis of medical data. This paper offers a comparison of two approaches to analysis of medical time series data: (1) classical statistical approaches, such as the Kaplan-Meier estimator and the Cox proportional hazards regression model, and (2) dynamic Bayesian network modeling. Our comparison is based on time series cervical cancer screening data collected at Magee-Womens Hospital, University of Pittsburgh Medical Center over 10 years. The main outcomes of our comparison are cervical cancer risk assessments produced by the three approaches. However, our analysis also discusses several aspects of the comparison, such as modeling assumptions, model building, dealing with incomplete data, individualized risk assessment, results interpretation, and model validation. Our study shows that the Bayesian approach (1) is much more flexible in terms of modeling effort and (2) offers an individualized risk assessment, which is more cumbersome to obtain with classical statistical approaches.
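
    To make the classical side of the comparison concrete, here is a plain product-limit (Kaplan-Meier) estimator written from scratch on toy right-censored data; the data are illustrative and unrelated to the screening dataset analysed in the paper.

```python
# Hedged sketch: Kaplan-Meier product-limit estimator of a survival function
# from right-censored event times. Toy data only.
import numpy as np

def kaplan_meier(times, events):
    """Return event times and the Kaplan-Meier survival estimate at those times."""
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=bool)      # True = event, False = censored
    order = np.argsort(times)
    times, events = times[order], events[order]
    uniq = np.unique(times[events])
    surv, s = [], 1.0
    for t in uniq:
        at_risk = np.sum(times >= t)             # subjects still under observation at t
        d = np.sum((times == t) & events)        # events occurring exactly at t
        s *= (1.0 - d / at_risk)
        surv.append(s)
    return uniq, np.array(surv)

t = [2, 3, 3, 5, 8, 8, 9, 12, 15, 16]
e = [1, 1, 0, 1, 1, 0, 1, 0, 1, 0]
times, surv = kaplan_meier(t, e)
for ti, si in zip(times, surv):
    print(f"t = {ti:>4.1f}   S(t) = {si:.3f}")
```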

  14. A systematic approach for model verification: application on seven published activated sludge models.

    Science.gov (United States)

    Hauduc, H; Rieger, L; Takács, I; Héduit, A; Vanrolleghem, P A; Gillot, S

    2010-01-01

    The quality of simulation results can be significantly affected by errors in the published model (typing, inconsistencies, gaps or conceptual errors) and/or in the underlying numerical model description. Seven of the most commonly used activated sludge models have been investigated to point out the typing errors, inconsistencies and gaps in the model publications: ASM1; ASM2d; ASM3; ASM3 + Bio-P; ASM2d + TUD; New General; UCTPHO+. A systematic approach to verify models by tracking typing errors and inconsistencies in model development and software implementation is proposed. Then, stoichiometry and kinetic rate expressions are checked for each model and the errors found are reported in detail. An attached spreadsheet (see http://www.iwaponline.com/wst/06104/0898.pdf) provides corrected matrices with the calculations of all stoichiometric coefficients for the discussed biokinetic models and gives an example of proper continuity checks.

  15. Resolving the cycle skip introduced by the multi-layer static model using a hybrid approach

    Science.gov (United States)

    Tawadros, Emad Ekladios Toma

    Cycle skips (breaks) in seismic data are occasionally irresolvable using conventional static correction programs. Such artificial cycle skips can be misleading for interpreters and introduce false subsurface images. After applying datum static corrections using either the single-layer or the multi-layer model, artificial cycle skips might develop in the data. Although conventional residual static correction techniques are occasionally able to solve this problem, they fail in many other cases. A new approach is introduced in this study to resolve this problem by forming a static model that is free of these artificial cycle skips, which can be used as a pilot volume for residual statics calculation. The pilot volume is formed by combining the high-frequency static component of the single-layer model, which shows a better static solution at the locations of the static problems, with the low-frequency static component of the two-layer model. This new approach is applied to a 3-D seismic data set from Haba Field of Eastern Saudi Arabia, where a major cycle skip was introduced by the multi-layer model. Results show a better image of the subsurface structure after application of the new approach.

  16. A sparse QSRR model for predicting retention indices of essential oils based on robust screening approach.

    Science.gov (United States)

    Al-Fakih, A M; Algamal, Z Y; Lee, M H; Aziz, M

    2017-08-01

    A robust screening approach and a sparse quantitative structure-retention relationship (QSRR) model for predicting retention indices (RIs) of 169 constituents of essential oils are proposed. The proposed approach proceeds in two steps. First, dimension reduction was performed using the proposed modified robust sure independence screening (MR-SIS) method. Second, prediction of RIs was made using the proposed robust sparse QSRR with a smoothly clipped absolute deviation (SCAD) penalty (RSQSRR). The RSQSRR model was internally and externally validated based on [Formula: see text], [Formula: see text], [Formula: see text], [Formula: see text], the Y-randomization test, [Formula: see text], [Formula: see text], and the applicability domain. The validation results indicate that the model is robust and not due to chance correlation. The descriptor selection and prediction performance of the RSQSRR on the training dataset outperform the other two modelling methods used. The RSQSRR shows the highest [Formula: see text], [Formula: see text], and [Formula: see text], and the lowest [Formula: see text]. For the test dataset, the RSQSRR shows a high external validation value ([Formula: see text]) and a low value of [Formula: see text] compared with the other methods, indicating its higher predictive ability. In conclusion, the results reveal that the proposed RSQSRR is an efficient approach for modelling high-dimensional QSRRs, and the method is useful for the estimation of RIs of essential oils that have not been experimentally tested.
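
    A loose sketch of the generic sparse-QSRR idea only: the SCAD penalty used in the paper is not available in scikit-learn, so cross-validated LASSO is used here as a stand-in sparse penalty on synthetic descriptors and retention indices. The descriptor count, the data and the selected-variable count are placeholders.

```python
# Hedged sketch: sparse regression of retention indices on molecular descriptors.
# LASSO is used as a stand-in for the SCAD penalty; data are synthetic.
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(3)
n_compounds, n_descriptors = 169, 300
X = rng.normal(size=(n_compounds, n_descriptors))
true_coef = np.zeros(n_descriptors)
true_coef[:8] = rng.uniform(5, 20, 8)            # only a few descriptors matter
ri = X @ true_coef + 1200 + rng.normal(0, 5, n_compounds)   # fake retention indices

model = make_pipeline(StandardScaler(), LassoCV(cv=5, random_state=0))
model.fit(X, ri)
coef = model.named_steps["lassocv"].coef_
print("descriptors selected:", int(np.sum(coef != 0)), "of", n_descriptors)
```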

  17. A Data-Based Approach for Modeling and Analysis of Vehicle Collision by LPV-ARMAX Models

    Directory of Open Access Journals (Sweden)

    Qiugang Lu

    2013-01-01

    Full Text Available Vehicle crash test is considered to be the most direct and common approach to assess the vehicle crashworthiness. However, it suffers from the drawbacks of high experiment cost and huge time consumption. Therefore, the establishment of a mathematical model of vehicle crash which can simplify the analysis process is significantly attractive. In this paper, we present the application of LPV-ARMAX model to simulate the car-to-pole collision with different initial impact velocities. The parameters of the LPV-ARMAX are assumed to have dependence on the initial impact velocities. Instead of establishing a set of LTI models for vehicle crashes with various impact velocities, the LPV-ARMAX model is comparatively simple and applicable to predict the responses of new collision situations different from the ones used for identification. Finally, the comparison between the predicted response and the real test data is conducted, which shows the high fidelity of the LPV-ARMAX model.
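
    A much simplified stand-in for the identification step described above, assuming synthetic signals: a plain ARX model (no moving-average noise term and no velocity-dependent scheduling) is fitted by least squares, just to illustrate how lagged outputs and inputs form the regression problem.

```python
# Hedged sketch: least-squares ARX identification on synthetic input/output data.
# This is a simplification of LPV-ARMAX (no MA term, no parameter scheduling).
import numpy as np

rng = np.random.default_rng(4)
N = 400
u = rng.normal(size=N)                      # synthetic "input" (e.g. excitation)
y = np.zeros(N)
for k in range(2, N):                       # simulate a stable 2nd-order reference system
    y[k] = 1.4 * y[k-1] - 0.5 * y[k-2] + 0.8 * u[k-1] + rng.normal(scale=0.05)

na, nb = 2, 1                               # ARX orders: output lags and input lags
rows, targets = [], []
for k in range(max(na, nb) + 1, N):
    phi = [-y[k - i] for i in range(1, na + 1)] + [u[k - j] for j in range(1, nb + 1)]
    rows.append(phi)
    targets.append(y[k])
Phi, Y = np.array(rows), np.array(targets)
theta, *_ = np.linalg.lstsq(Phi, Y, rcond=None)
print("estimated [a1, a2, b1]:", np.round(theta, 3))   # expect roughly [-1.4, 0.5, 0.8]
```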

  18. Modelling Approach to Assess Future Agricultural Water Demand

    Science.gov (United States)

    Spano, D.; Mancosu, N.; Orang, M.; Sarreshteh, S.; Snyder, R. L.

    2013-12-01

    The combination of long-term climate changes (e.g., warmer average temperatures) and extreme events (e.g., droughts) can have decisive impacts on water demand, with further implications for ecosystems. In countries already affected by water scarcity, water management problems are becoming increasingly serious. Sustainable management of the available water resources at the global, regional, and site-specific level is necessary. In agriculture, the first step is to compute how much water is needed by crops under the prevailing climate conditions. A modelling approach is one way to compute the crop water requirement (CWR). In this study, the improved version of the SIMETAW model was used. The model is a user-friendly soil water balance model, developed by the University of California, Davis, the California Department of Water Resources, and the University of Sassari. The SIMETAW# model assesses CWR and generates hypothetical irrigation schedules for a wide range of irrigated crops experiencing full, deficit, or no irrigation. The model computes the evapotranspiration of the applied water (ETaw), which is the sum of the net amounts of irrigation water needed to match losses due to crop evapotranspiration (ETc). ETaw is determined by first computing reference evapotranspiration (ETo) using the daily standardized Reference Evapotranspiration equation. ETaw is computed as ETaw = CETc - CEr, where CETc and CEr are the cumulative total crop ET and cumulative effective rainfall values, respectively. Crop evapotranspiration is estimated as ETc = ETo x Kc, where Kc is the corrected midseason tabular crop coefficient, adjusted for climate conditions. The net irrigation amounts are determined from a daily soil water balance, using an integrated approach that considers soil and crop management information and the daily ETc estimates. Using input information on irrigation system distribution uniformity and runoff, when appropriate, the model estimates the applied water to the low quarter of the
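
    The bookkeeping stated in this abstract (ETc = ETo x Kc and ETaw = CETc - CEr) is simple enough to show directly; the daily ETo, Kc and effective-rainfall values below are made-up placeholders, not SIMETAW# output.

```python
# Hedged sketch of the crop-water bookkeeping: ETc = ETo * Kc, ETaw = CETc - CEr.
# Daily values are illustrative placeholders only.
import numpy as np

eto = np.array([4.8, 5.1, 5.3, 5.0, 4.7, 4.9, 5.2])   # reference ET, mm/day
kc = np.array([0.6, 0.6, 0.7, 0.7, 0.8, 0.8, 0.8])    # crop coefficient (climate-corrected)
effective_rain = np.array([0.0, 2.0, 0.0, 0.0, 5.0, 0.0, 0.0])   # mm/day

etc = eto * kc                       # daily crop evapotranspiration, ETc = ETo * Kc
cetc = etc.sum()                     # cumulative crop ET over the period
cer = effective_rain.sum()           # cumulative effective rainfall
etaw = cetc - cer                    # ET of applied water (net irrigation requirement)
print(f"CETc = {cetc:.1f} mm, CEr = {cer:.1f} mm, ETaw = {etaw:.1f} mm")
```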

  19. Computational approaches for modeling human intestinal absorption and permeability.

    Science.gov (United States)

    Subramanian, Govindan; Kitchen, Douglas B

    2006-07-01

    Human intestinal absorption (HIA) is an important roadblock in the formulation of new drug substances. Computational models are needed for the rapid estimation of this property. The measurements are determined via in vivo experiments or in vitro permeability studies. We present several computational models that are able to predict the absorption of drugs by the human intestine and the permeability through human Caco-2 cells. The training and prediction sets were derived from literature sources and carefully examined to eliminate compounds that are actively transported. We compare our results to models derived by other methods and find that the statistical quality is similar. We believe that models derived from both sources of experimental data would provide greater consistency in predictions. The performance of several QSPR models that we investigated to predict outside the training set for either experimental property clearly indicates that caution should be exercised while applying any of the models for quantitative predictions. However, we are able to show that the qualitative predictions can be obtained with close to a 70% success rate.

  20. A new approach to flow simulation using hybrid models

    Science.gov (United States)

    Solgi, Abazar; Zarei, Heidar; Nourani, Vahid; Bahmani, Ramin

    2017-11-01

    The necessity of flow prediction in rivers for proper management of water resources, together with the need to determine inflows to dam reservoirs, to design efficient flood warning systems, and so forth, has always led water researchers to look for models with fast response and low error. In recent years, the development of artificial neural networks and wavelet theory, and the use of combinations of models, has helped researchers estimate river flow better and better. In this study, daily and monthly scales were used for simulating the flow of the Gamasiyab River, Nahavand, Iran. The first simulation was done using two types of models, ANN and ANFIS. Then, using wavelet theory to decompose the input signals of the parameters used, sub-signals were obtained and fed into the ANN and ANFIS to obtain the hybrid models WANN and WANFIS. In this study, in addition to precipitation and flow, the parameters of temperature and evaporation were used to analyse their effects on the simulation. The results showed that using the wavelet transform improved the performance of the models on both the monthly and the daily scale. However, it had a greater effect on the monthly scale, and the WANFIS was the best model.
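
    A hedged sketch of the hybrid wavelet-neural-network idea, assuming a synthetic flow series rather than the Gamasiyab record: the series is split into wavelet sub-signals, and lagged sub-signal values are fed to a small scikit-learn network to predict the next flow value. The wavelet ("db4"), decomposition level and lag are illustrative choices, and a real forecasting setup would decompose only data available up to the forecast origin.

```python
# Hedged sketch of a WANN-style hybrid: wavelet sub-signals as inputs to an MLP.
# Synthetic flow data only; decomposing the full series leaks future information
# and is done here purely for brevity.
import numpy as np
import pywt
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(5)
t = np.arange(600)
flow = 50 + 20 * np.sin(2 * np.pi * t / 30) + 5 * rng.normal(size=t.size)   # fake daily flow

# Split the series into approximation + detail sub-signals of the original length
coeffs = pywt.wavedec(flow, "db4", level=2)
subsignals = []
for i in range(len(coeffs)):
    keep = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
    subsignals.append(pywt.waverec(keep, "db4")[:flow.size])
subsignals = np.array(subsignals)            # shape: (n_subsignals, T)

lag = 3
X = np.array([subsignals[:, k - lag:k].ravel() for k in range(lag, flow.size)])
y = flow[lag:]                               # predict the flow value following each window

model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=3000, random_state=0)
model.fit(X[:500], y[:500])
rmse = np.sqrt(np.mean((model.predict(X[500:]) - y[500:]) ** 2))
print("hold-out RMSE:", round(float(rmse), 2))
```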

  1. Probabilistic model-based approach for heart beat detection.

    Science.gov (United States)

    Chen, Hugh; Erol, Yusuf; Shen, Eric; Russell, Stuart

    2016-09-01

    Nowadays, hospitals are ubiquitous and integral to modern society. Patients flow in and out of a veritable whirlwind of paperwork, consultations, and potential inpatient admissions, through an abstracted system that is not without flaws. One of the biggest flaws in the medical system is perhaps an unexpected one: the patient alarm system. One longitudinal study reported an 88.8% rate of false alarms, with other studies reporting numbers of similar magnitudes. These false alarm rates lead to deleterious effects that manifest in a lower standard of care across clinics. This paper discusses a model-based probabilistic inference approach to estimate physiological variables at a detection level. We design a generative model that complies with a layman's understanding of human physiology and perform approximate Bayesian inference. One primary goal of this paper is to justify a Bayesian modeling approach to increasing robustness in a physiological domain. In order to evaluate our algorithm we look at the application of heart beat detection using four datasets provided by PhysioNet, a research resource for complex physiological signals, in the form of the PhysioNet 2014 Challenge set-p1 and set-p2, the MIT-BIH Polysomnographic Database, and the MGH/MF Waveform Database. On these data sets our algorithm performs on par with the other top six submissions to the PhysioNet 2014 challenge. The overall evaluation scores in terms of sensitivity and positive predictivity values obtained were as follows: set-p1 (99.72%), set-p2 (93.51%), MIT-BIH (99.66%), and MGH/MF (95.53%). These scores are based on the averaging of gross sensitivity, gross positive predictivity, average sensitivity, and average positive predictivity.

  2. SMEs Access to Finance in Croatia – Model Approach

    OpenAIRE

    Vinko Vidučić; Ljiljana Vidučić; Damir Boras

    2014-01-01

    The goals of the research include the determination of the characteristics of SME finance in Croatia, as well as the determination of indirect growth rates of the information model of the entrepreneurs' perception of the business environment. The research results show that the cost of finance and access to finance are the most important constraining factors in setting up and running the businesses of small entrepreneurs in Croatia. Furthermore, small entrepreneurs in Croatia are significantly dissatisfied ...

  3. The Structure of Preschoolers' Emotion Knowledge: Model Equivalence and Validity Using a Structural Equation Modeling Approach

    Science.gov (United States)

    Bassett, Hideko Hamada; Denham, Susanne; Mincic, Melissa; Graling, Kelly

    2012-01-01

    Research Findings: A theory-based 2-factor structure of preschoolers' emotion knowledge (i.e., recognition of emotional expression and understanding of emotion-eliciting situations) was tested using confirmatory factor analysis. Compared to 1- and 3-factor models, the 2-factor model showed a better fit to the data. The model was found to be…

  4. Modeling Plankton Mixotrophy: A Mechanistic Model Consistent with the Shuter-Type Biochemical Approach

    Directory of Open Access Journals (Sweden)

    Caroline Ghyoot

    2017-07-01

    Full Text Available Mixotrophy, i.e., the ability to combine phototrophy and phagotrophy in one organism, is now recognized to be widespread among photic-zone protists and to potentially modify the structure and functioning of planktonic ecosystems. However, few biogeochemical/ecological models explicitly include this mode of nutrition, owing to the large diversity of observed mixotrophic types, the few data allowing the parameterization of physiological processes, and the need to make the addition of mixotrophy into existing ecosystem models as simple as possible. We here propose and discuss a flexible model that depicts the main observed behaviors of mixotrophy in microplankton. A first model version describes constitutive mixotrophy (the organism photosynthesizes by use of its own chloroplasts). This model version offers two possible configurations, allowing the description of constitutive mixotrophs (CMs) that favor either phototrophy or heterotrophy. A second version describes non-constitutive mixotrophy (the organism performs phototrophy by use of chloroplasts acquired from its prey). The model variants were described so as to be consistent with a plankton conceptualization in which the biomass is divided into separate components on the basis of their biochemical function (Shuter-approach; Shuter, 1979). The two model variants of mixotrophy can easily be implemented in ecological models that adopt the Shuter-approach, such as the MIRO model (Lancelot et al., 2005), and address the challenges associated with modeling mixotrophy.

  5. Modeling healthcare authorization and claim submissions using the openEHR dual-model approach

    Science.gov (United States)

    2011-01-01

    Background The TISS standard is a set of mandatory forms and electronic messages for healthcare authorization and claim submissions among healthcare plans and providers in Brazil. It is not based on formal models as the new generation of health informatics standards suggests. The objective of this paper is to model the TISS in terms of the openEHR archetype-based approach and integrate it into a patient-centered EHR architecture. Methods Three approaches were adopted to model TISS. In the first approach, a set of archetypes was designed using ENTRY subclasses. In the second one, a set of archetypes was designed using exclusively ADMIN_ENTRY and CLUSTERs as their root classes. In the third approach, the openEHR ADMIN_ENTRY is extended with classes designed for authorization and claim submissions, and an ISM_TRANSITION attribute is added to the COMPOSITION class. Another set of archetypes was designed based on this model. For all three approaches, templates were designed to represent the TISS forms. Results The archetypes based on the openEHR RM (Reference Model) can represent all TISS data structures. The extended model adds subclasses and an attribute to the COMPOSITION class to represent information on authorization and claim submissions. The archetypes based on all three approaches have similar structures, although rooted in different classes. The extended openEHR RM model is more semantically aligned with the concepts involved in a claim submission, but may disrupt interoperability with other systems and the current tools must be adapted to deal with it. Conclusions Modeling the TISS standard by means of the openEHR approach makes it aligned with ISO recommendations and provides a solid foundation on which the TISS can evolve. Although there are few administrative archetypes available, the openEHR RM is expressive enough to represent the TISS standard. This paper focuses on the TISS but its results may be extended to other billing processes. A complete

  6. Banking Crisis Early Warning Model based on a Bayesian Model Averaging Approach

    Directory of Open Access Journals (Sweden)

    Taha Zaghdoudi

    2016-08-01

    Full Text Available The succession of banking crises, most of which have resulted in huge economic and financial losses, has prompted several authors to study their determinants. These authors constructed early warning models to prevent such crises from occurring. It is in this same vein that our study takes its inspiration. In particular, we have developed an early warning model of banking crises based on a Bayesian approach. The results of this approach allow us to identify the involvement of declining bank profitability, deterioration of the competitiveness of traditional intermediation, banking concentration and higher real interest rates in triggering banking crises.

  7. Modeling healthcare authorization and claim submissions using the openEHR dual-model approach

    Directory of Open Access Journals (Sweden)

    Freire Sergio M

    2011-10-01

    Full Text Available Abstract Background The TISS standard is a set of mandatory forms and electronic messages for healthcare authorization and claim submissions among healthcare plans and providers in Brazil. It is not based on formal models as the new generation of health informatics standards suggests. The objective of this paper is to model the TISS in terms of the openEHR archetype-based approach and integrate it into a patient-centered EHR architecture. Methods Three approaches were adopted to model TISS. In the first approach, a set of archetypes was designed using ENTRY subclasses. In the second one, a set of archetypes was designed using exclusively ADMIN_ENTRY and CLUSTERs as their root classes. In the third approach, the openEHR ADMIN_ENTRY is extended with classes designed for authorization and claim submissions, and an ISM_TRANSITION attribute is added to the COMPOSITION class. Another set of archetypes was designed based on this model. For all three approaches, templates were designed to represent the TISS forms. Results The archetypes based on the openEHR RM (Reference Model can represent all TISS data structures. The extended model adds subclasses and an attribute to the COMPOSITION class to represent information on authorization and claim submissions. The archetypes based on all three approaches have similar structures, although rooted in different classes. The extended openEHR RM model is more semantically aligned with the concepts involved in a claim submission, but may disrupt interoperability with other systems and the current tools must be adapted to deal with it. Conclusions Modeling the TISS standard by means of the openEHR approach makes it aligned with ISO recommendations and provides a solid foundation on which the TISS can evolve. Although there are few administrative archetypes available, the openEHR RM is expressive enough to represent the TISS standard. This paper focuses on the TISS but its results may be extended to other billing

  8. Synthesis, Modelling, and Anticonvulsant Studies of New Quinazolines Showing Three Highly Active Compounds with Low Toxicity and High Affinity to the GABA-A Receptor

    Directory of Open Access Journals (Sweden)

    Mohamed F. Zayed

    2017-01-01

    Full Text Available Some novel fluorinated quinazolines (5a–j) were designed and synthesized to be evaluated for their anticonvulsant activity and their neurotoxicity. Structures of all newly synthesized compounds were confirmed by their infrared (IR) and mass spectrometry (MS) spectra, 1H nuclear magnetic resonance (NMR), 13C-NMR, and elemental analysis (CHN). The anticonvulsant activity was evaluated by a subcutaneous pentylenetetrazole (scPTZ) test and a maximal electroshock (MES)-induced seizure test, while neurotoxicity was evaluated by a rotorod test. The molecular docking was performed for all newly-synthesized compounds to assess their binding affinities to the GABA-A receptor in order to rationalize their anticonvulsant activities in a qualitative way. The data obtained from the molecular modeling were correlated with those obtained from the biological screening. These data showed considerable anticonvulsant activity for all newly-synthesized compounds. Compounds 5b, 5c, and 5d showed the highest binding affinities toward the GABA-A receptor, along with the highest anticonvulsant activities in experimental mice. These compounds also showed low neurotoxicity and low toxicity in the median lethal dose test compared to the reference drugs. A GABA enzymatic assay was performed for these highly active compounds to confirm the obtained results and explain the possible mechanism of anticonvulsant action. The most active compounds might be used as leads for future modification and optimization.

  9. Sensitivity in reflectance attributed to phytoplankton cell size: forward and inverse modelling approaches

    CSIR Research Space (South Africa)

    Evers-King, H

    2014-05-01

    Full Text Available ocean colour products To put these results in to the context of current ocean colour products, Fig. 5 shows an approx- imation of the maximum band ratio (MBR) approach used in the OC4 algorithm [37] using forward model output (ES) analogous to the data...], suggesting that variability in a∗φ (in our case, coincident with changes in size) may be obscured by agd , particularly at lower biomass, where the majority of the size related signal occurs in the blue and MBR approaches are typically applied (Fig. 1). Sauer...

  10. BioModels: expanding horizons to include more modelling approaches and formats.

    Science.gov (United States)

    Glont, Mihai; Nguyen, Tung V N; Graesslin, Martin; Hälke, Robert; Ali, Raza; Schramm, Jochen; Wimalaratne, Sarala M; Kothamachu, Varun B; Rodriguez, Nicolas; Swat, Maciej J; Eils, Jurgen; Eils, Roland; Laibe, Camille; Malik-Sheriff, Rahuman S; Chelliah, Vijayalakshmi; Le Novère, Nicolas; Hermjakob, Henning

    2018-01-04

    BioModels serves as a central repository of mathematical models representing biological processes. It offers a platform to make mathematical models easily shareable across the systems modelling community, thereby supporting model reuse. To facilitate hosting a broader range of model formats derived from diverse modelling approaches and tools, a new infrastructure for BioModels has been developed that is available at http://www.ebi.ac.uk/biomodels. This new system allows submitting and sharing of a wide range of models with improved support for formats other than SBML. It also offers a version-control backed environment in which authors and curators can work collaboratively to curate models. This article summarises the features available in the current system and discusses the potential benefit they offer to the users over the previous system. In summary, the new portal broadens the scope of models accepted in BioModels and supports collaborative model curation which is crucial for model reproducibility and sharing. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  11. Export of microplastics from land to sea. A modelling approach.

    Science.gov (United States)

    Siegfried, Max; Koelmans, Albert A; Besseling, Ellen; Kroeze, Carolien

    2017-12-15

    Quantifying the transport of plastic debris from river to sea is crucial for assessing the risks of plastic debris to human health and the environment. We present a global modelling approach to analyse the composition and quantity of point-source microplastic fluxes from European rivers to the sea. The model accounts for different types and sources of microplastics entering river systems via point sources. We combine information on these sources with information on sewage management and plastic retention during river transport for the largest European rivers. Sources of microplastics include personal care products, laundry, household dust and tyre and road wear particles (TRWP). Most of the modelled microplastics exported by rivers to seas are synthetic polymers from TRWP (42%) and plastic-based textiles abraded during laundry (29%). Smaller sources are synthetic polymers and plastic fibres in household dust (19%) and microbeads in personal care products (10%). Microplastic export differs largely among European rivers, as a result of differences in socio-economic development and technological status of sewage treatment facilities. About two-thirds of the microplastics modelled in this study flow into the Mediterranean and Black Sea. This can be explained by the relatively low microplastic removal efficiency of sewage treatment plants in the river basins draining into these two seas. Sewage treatment is generally more efficient in river basins draining into the North Sea, the Baltic Sea and the Atlantic Ocean. We use our model to explore future trends up to the year 2050. Our scenarios indicate that in the future river export of microplastics may increase in some river basins, but decrease in others. Remarkably, for many basins we calculate a reduction in river export of microplastics from point-sources, mainly due to an anticipated improvement in sewage treatment. Copyright © 2017 Elsevier Ltd. All rights reserved.
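
    A minimal sketch of the kind of point-source budget the abstract describes is given below: source loads are attenuated by sewage treatment and by retention during river transport. All parameter names and example values are assumptions for illustration, not values from the study.

```python
def river_export(source_inputs_t, sewage_connection, sewage_removal, river_retention):
    """Microplastics reaching the sea (t/yr) from point sources in one basin.

    source_inputs_t  : dict of source name -> microplastic input (t/yr)
    sewage_connection: fraction of wastewater routed through a treatment plant
    sewage_removal   : fraction of microplastics removed during treatment
    river_retention  : fraction retained during river transport
    """
    total = 0.0
    for load in source_inputs_t.values():
        treated = load * sewage_connection * (1.0 - sewage_removal)
        untreated = load * (1.0 - sewage_connection)
        total += (treated + untreated) * (1.0 - river_retention)
    return total

# Hypothetical basin with the four source categories named in the abstract.
sources = {"tyre_and_road_wear": 120.0, "laundry_fibres": 80.0,
           "household_dust": 50.0, "personal_care_products": 25.0}
print(river_export(sources, sewage_connection=0.8,
                   sewage_removal=0.9, river_retention=0.3))
```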

  12. Hybrid empirical–theoretical approach to modeling uranium adsorption

    Energy Technology Data Exchange (ETDEWEB)

    Hull, Larry C.; Grossman, Christopher; Fjeld, Robert A.; Coates, John T.; Elzerman, Alan W

    2004-05-01

    An estimated 330 metric tons of U are buried in the radioactive waste Subsurface Disposal Area (SDA) at the Idaho National Engineering and Environmental Laboratory (INEEL). An assessment of U transport parameters is being performed to decrease the uncertainty in risk and dose predictions derived from computer simulations of U fate and transport to the underlying Snake River Plain Aquifer. Uranium adsorption isotherms were measured for 14 sediment samples collected from sedimentary interbeds underlying the SDA. The adsorption data were fit with a Freundlich isotherm. The Freundlich n parameter is statistically identical for all 14 sediment samples and the Freundlich K_f parameter is correlated to sediment surface area (r² = 0.80). These findings suggest an efficient approach to material characterization and implementation of a spatially variable reactive transport model that requires only the measurement of sediment surface area. To expand the potential applicability of the measured isotherms, a model is derived from the empirical observations by incorporating concepts from surface complexation theory to account for the effects of solution chemistry. The resulting model is then used to predict the range of adsorption conditions to be expected in the vadose zone at the SDA based on the range in measured pore water chemistry. Adsorption in the deep vadose zone is predicted to be stronger than in near-surface sediments because the total dissolved carbonate decreases with depth.
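
    The sketch below illustrates the two empirical relationships used by the model: a Freundlich isotherm S = K_f·C^n and a K_f that scales with sediment surface area. The scaling constant and example values are assumptions, not the fitted INEEL parameters.

```python
import numpy as np

# Illustrative sketch only; slope and example values are assumed, not fitted.

def freundlich_sorbed(c_aq, k_f, n):
    """Freundlich isotherm: sorbed concentration S = K_f * C**n."""
    return k_f * np.power(c_aq, n)

def k_f_from_surface_area(surface_area_m2_per_g, slope=0.4):
    """Illustrative linear scaling of K_f with sediment surface area."""
    return slope * surface_area_m2_per_g

k_f = k_f_from_surface_area(25.0)                  # assumed surface area of 25 m2/g
print(freundlich_sorbed(np.array([0.1, 1.0, 10.0]), k_f, n=0.8))
```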

  13. Mobile phone use while driving: a hybrid modeling approach.

    Science.gov (United States)

    Márquez, Luis; Cantillo, Víctor; Arellana, Julián

    2015-05-01

    The analysis of the effects that mobile phone use produces while driving is a topic of great interest for the scientific community. There is consensus that using a mobile phone while driving increases the risk of exposure to traffic accidents. The purpose of this research is to evaluate the drivers' behavior when they decide whether or not to use a mobile phone while driving. For that, a hybrid modeling approach that integrates a choice model with the latent variable "risk perception" was used. It was found that workers and individuals with the highest education level are more prone to use a mobile phone while driving than others. Also, "risk perception" is higher among individuals who have been previously fined and people who have been in an accident or almost been in an accident. It was also found that the tendency to use mobile phones while driving increases when the traffic speed reduces, but it decreases when the fine increases. Even though the urgency of the phone call is the most important explanatory variable in the choice model, the cost of the fine is an important attribute in order to control mobile phone use while driving. Copyright © 2015 Elsevier Ltd. All rights reserved.
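
    The hybrid structure described above can be sketched as a structural equation for the latent variable "risk perception" feeding into a binary logit choice model, as below. Coefficient names, signs, and values are illustrative assumptions chosen to be consistent with the qualitative findings, not the paper's estimated parameters.

```python
import math

# Illustrative hybrid choice sketch: latent variable + binary logit.

def risk_perception(previously_fined, prior_accident,
                    b0=0.0, b_fined=0.6, b_accident=0.8):
    """Structural equation: latent risk perception from individual experience."""
    return b0 + b_fined * previously_fined + b_accident * prior_accident

def prob_use_phone(call_urgency, fine_amount, traffic_speed, risk,
                   asc=0.5, b_urgency=1.2, b_fine=-0.8, b_speed=-0.02, b_risk=-0.9):
    """Binary logit probability of choosing to use the phone while driving."""
    utility = (asc + b_urgency * call_urgency + b_fine * fine_amount
               + b_speed * traffic_speed + b_risk * risk)
    return 1.0 / (1.0 + math.exp(-utility))

# Example: a driver who has been fined before, receiving an urgent call in slow traffic.
risk = risk_perception(previously_fined=1, prior_accident=0)
print(prob_use_phone(call_urgency=1.0, fine_amount=0.5, traffic_speed=30.0, risk=risk))
```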

  14. Implementation of a Novel Educational Modeling Approach for Cloud Computing

    Directory of Open Access Journals (Sweden)

    Sara Ouahabi

    2014-12-01

    Full Text Available The Cloud model is cost-effective because customers pay for their actual usage without upfront costs, and scalable because it can be used more or less depending on the customers’ needs. Due to these advantages, the Cloud has been increasingly adopted in many areas, such as banking, e-commerce, retail, and academia. In education, the Cloud is used to manage the large volume of educational resources produced across many universities. Keeping content interoperable within an inter-university Cloud is not always easy. The diffusion of pedagogical content on the Cloud by different e-learning institutions leads to heterogeneous content, which influences the quality of teaching that universities offer to teachers and learners. For this reason, we propose using IMS-LD coupled with metadata in the Cloud. This paper presents the implementation of our previous educational modeling by combining a J2EE application with the Reload editor to model heterogeneous content in the Cloud. The approach focuses on keeping Educational Cloud content interoperable for teachers and learners and facilitates the identification, reuse, sharing, and adaptation of teaching and learning resources in the Cloud.

  15. Hybrid empirical–theoretical approach to modeling uranium adsorption

    International Nuclear Information System (INIS)

    Hull, Larry C.; Grossman, Christopher; Fjeld, Robert A.; Coates, John T.; Elzerman, Alan W.

    2004-01-01

    An estimated 330 metric tons of U are buried in the radioactive waste Subsurface Disposal Area (SDA) at the Idaho National Engineering and Environmental Laboratory (INEEL). An assessment of U transport parameters is being performed to decrease the uncertainty in risk and dose predictions derived from computer simulations of U fate and transport to the underlying Snake River Plain Aquifer. Uranium adsorption isotherms were measured for 14 sediment samples collected from sedimentary interbeds underlying the SDA. The adsorption data were fit with a Freundlich isotherm. The Freundlich n parameter is statistically identical for all 14 sediment samples and the Freundlich K_f parameter is correlated to sediment surface area (r² = 0.80). These findings suggest an efficient approach to material characterization and implementation of a spatially variable reactive transport model that requires only the measurement of sediment surface area. To expand the potential applicability of the measured isotherms, a model is derived from the empirical observations by incorporating concepts from surface complexation theory to account for the effects of solution chemistry. The resulting model is then used to predict the range of adsorption conditions to be expected in the vadose zone at the SDA based on the range in measured pore water chemistry. Adsorption in the deep vadose zone is predicted to be stronger than in near-surface sediments because the total dissolved carbonate decreases with depth.

  16. A Dynamic Approach to Modeling Dependence Between Human Failure Events

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald Laurids [Idaho National Laboratory]

    2015-09-01

    In practice, most HRA methods use direct dependence from THERP—the notion that error begets error, and one human failure event (HFE) may increase the likelihood of subsequent HFEs. In this paper, we approach dependence from a simulation perspective in which the effects of human errors are dynamically modeled. There are three key concepts that play into this modeling: (1) Errors are driven by performance shaping factors (PSFs). In this context, the error propagation is not a result of the presence of an HFE yielding overall increases in subsequent HFEs. Rather, it is shared PSFs that cause dependence. (2) PSFs have qualities of lag and latency. These two qualities are not currently considered in HRA methods that use PSFs. Yet, to model the effects of PSFs, it is not simply a matter of identifying the discrete effects of a particular PSF on performance. The effects of PSFs must be considered temporally, as the PSFs will have a range of effects across the event sequence. (3) Finally, there is the concept of error spilling. When PSFs are activated, they not only have temporal effects but also lateral effects on other PSFs, leading to emergent errors. This paper presents the framework for tying together these dynamic dependence concepts.
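
    A minimal simulation sketch of this dynamic-dependence idea is shown below: successive human error probabilities are coupled only through a shared, decaying PSF, so dependence emerges from the PSF dynamics rather than from a direct error-begets-error factor. The PSF name, decay rate, and multipliers are illustrative assumptions, not values from any HRA method.

```python
import random

# Illustrative sketch: dependence between HFEs arises from a shared, time-varying PSF.

def simulate_sequence(n_tasks, base_hep=0.01, spill=5.0, psf_decay=0.7, seed=1):
    """Simulate a task sequence where a decaying 'stress' PSF scales error probability."""
    random.seed(seed)
    stress = 1.0                 # PSF multiplier (1.0 = nominal conditions)
    outcomes = []
    for _ in range(n_tasks):
        hep = min(1.0, base_hep * stress)            # PSF-scaled human error probability
        failed = random.random() < hep
        outcomes.append(failed)
        if failed:
            stress += spill      # 'error spilling': a failure activates the shared PSF
        stress = 1.0 + (stress - 1.0) * psf_decay    # lag/latency: the PSF decays over tasks
    return outcomes

print(simulate_sequence(n_tasks=10))
```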

  17. Sulfur Deactivation of NOx Storage Catalysts: A Multiscale Modeling Approach

    Directory of Open Access Journals (Sweden)

    Rankovic N.

    2013-09-01

    Full Text Available Lean NOx Trap (LNT) catalysts, a promising solution for reducing noxious nitrogen oxide emissions from lean-burn and Diesel engines, are technologically limited by the presence of sulfur in the exhaust gas stream. Sulfur stemming from both fuels and lubricating oils is oxidized during the combustion event and mainly exists as SOx (SO2 and SO3) in the exhaust. Sulfur oxides interact strongly with the NOx trapping material of an LNT to form thermodynamically favored sulfate species, consequently leading to the blockage of NOx sorption sites and altering catalyst operation. Molecular and kinetic modeling represents a valuable tool for predicting system behavior and evaluating catalytic performance. The present paper demonstrates how fundamental ab initio calculations can be used as a valuable source for designing kinetic models developed in the IFP Exhaust library, intended for vehicle simulations. The concrete example we chose to illustrate our approach was SO3 adsorption on the model NOx storage material, BaO. SO3 adsorption was described for various sites (terraces, surface steps, and kinks) and for the bulk, giving a closer description of a real storage material. Additional rate and sensitivity analyses provided a deeper understanding of the poisoning phenomena.
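
    To illustrate how site-resolved results of this kind could feed a kinetic description, the sketch below tracks first-order SO3 uptake separately for terrace, step, and kink sites. The site fractions and rate constants are illustrative assumptions, not values from the ab initio calculations or the IFP Exhaust library.

```python
import math

# Illustrative, site-resolved first-order SO3 uptake; all numbers are assumed.

SITES = {            # site type: (assumed fraction of BaO surface sites, uptake rate constant 1/s)
    "terrace": (0.70, 0.02),
    "step":    (0.20, 0.05),
    "kink":    (0.10, 0.10),
}

def so3_coverage(t_seconds):
    """Fraction of each site type sulfated after exposure time t (first-order uptake)."""
    return {site: fraction * (1.0 - math.exp(-k_ads * t_seconds))
            for site, (fraction, k_ads) in SITES.items()}

# Coverage after 10 minutes of exposure under these assumed rate constants:
print(so3_coverage(600.0))
```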

  18. FRIGA, a new approach to identify isotopes and hypernuclei in n-body transport models

    Science.gov (United States)

    Le Fèvre, A.; Leifels, Y.; Aichelin, J.; Hartnack, Ch.; Kireyev, V.; Bratkovskaya, E.

    2017-11-01

    We present a new algorithm to identify fragments in computer simulations of relativistic heavy-ion collisions. It is based on the simulated annealing technique and can be applied to n-body transport models such as Quantum Molecular Dynamics. This new approach is able to predict isotope yields as well as hypernucleus production. To illustrate its predictive power, we confront the new method with experimental data and show its sensitivity to the parameters that govern cluster formation.
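
    The simulated annealing idea behind such an algorithm can be illustrated with a toy assignment problem: nucleons are reassigned between candidate fragments while a temperature parameter is lowered, accepting uphill moves with a Boltzmann probability. The energy function, coordinates, and parameters below are toy assumptions, not the physics of the actual model.

```python
import math
import random

# Toy simulated-annealing cluster assignment; not the actual fragment-energy model.

def total_energy(assignment, positions, r_bind=3.0, e_pair=-1.0, e_loose=0.5):
    """Sum a toy pair energy over nucleons assigned to the same fragment."""
    e, n = 0.0, len(positions)
    for i in range(n):
        for j in range(i + 1, n):
            if assignment[i] == assignment[j]:
                e += e_pair if abs(positions[i] - positions[j]) < r_bind else e_loose
    return e

def anneal(positions, n_steps=5000, t0=2.0, cooling=0.999, seed=0):
    random.seed(seed)
    n = len(positions)
    assignment = list(range(n))                 # start with every nucleon in its own fragment
    energy, temp = total_energy(assignment, positions), t0
    for _ in range(n_steps):
        trial = assignment[:]
        trial[random.randrange(n)] = random.randrange(n)     # move one nucleon
        e_trial = total_energy(trial, positions)
        # Accept downhill moves always, uphill moves with Boltzmann probability.
        if e_trial < energy or random.random() < math.exp((energy - e_trial) / temp):
            assignment, energy = trial, e_trial
        temp *= cooling
    return assignment, energy

positions = [0.0, 0.5, 1.0, 8.0, 8.4, 15.0]     # 1-D toy coordinates (fm)
print(anneal(positions))
```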

  19. NEW APPROACH TO MODELLING OF SAND FILTER CLOGGING BY SEPTIC TANK EFFLUENT

    Directory of Open Access Journals (Sweden)

    Jakub Nieć

    2016-04-01

    Full Text Available The deep bed filtration model elaborated by Iwasaki has many applications, e.g. solids removal from wastewater. Its main parameter, the filter coefficient, is directly related to removal efficiency and depends on filter depth and time of operation. In this paper the authors propose a new approach to modelling that describes the distribution of dry organic mass from septic tank effluent and of biomass in a sand filter. In this approach the filter coefficient is treated as a variable affected by depth and time of operation, and the live biomass concentration distribution is approximated by a logistic function. Relatively stable biomass contents in deeper bed compartments were observed in empirical studies. The Iwasaki equations associated with the logistic function can predict volatile suspended solids deposition and biomass content in sand filters. The comparison between the model and empirical data for filtration lasting 10 and 20 days showed relatively good agreement.
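
    A minimal sketch of the combination described above, an Iwasaki-type attenuation with a depth-dependent filter coefficient and a logistic biomass profile, is given below. All parameter values and the exact form of the coupling between biomass and the filter coefficient are assumptions for illustration.

```python
import math

# Illustrative Iwasaki-type filtration sketch; parameter values are assumed.

def biomass(depth_cm, b_max=5.0, k=0.15, depth_mid=20.0):
    """Logistic approximation of live-biomass content versus depth."""
    return b_max / (1.0 + math.exp(k * (depth_cm - depth_mid)))

def concentration_profile(c_in, depths_cm, lambda_0=0.05, alpha=0.02):
    """Iwasaki attenuation dC/dz = -lambda(z) * C with a biomass-modified coefficient."""
    profile, c, prev_z = [], c_in, 0.0
    for z in depths_cm:
        lam = lambda_0 * (1.0 + alpha * biomass(z))   # assumed coupling to biomass
        c *= math.exp(-lam * (z - prev_z))
        profile.append((z, c))
        prev_z = z
    return profile

for depth, c in concentration_profile(c_in=100.0, depths_cm=[5, 10, 20, 40, 60]):
    print(f"{depth:>4} cm: {c:6.1f} mg/L")
```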

  20. Towards an Ontological Learners’ Modelling Approach for Personalised e-Learning

    Directory of Open Access Journals (Sweden)

    Seyed Ali Hosseini

    2013-05-01

    Full Text Available The rapid advancement of semantic web technologies has enabled personalised learning based on the learner’s characteristics in the learning process. We have implemented a Personalised Adaptive e-Learning system (onto-PAdeL) which uses an ontological approach in designing learners’ models. Thus, this paper focuses on describing our approach for modelling learners based on their characteristics such as abilities, learning style(s), prior knowledge, and preferences. The system uses Item Response Theory (IRT) for calculating the learner’s abilities. The learning style can be represented according to different theories, each of which supports personalisation in different ways. We show that using ontologies for learner modelling, in addition to many other benefits, enables reasoning for adaptive learning.
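
    The Item Response Theory component mentioned above can be sketched with the two-parameter logistic model and a simple maximum-likelihood grid search for the learner's ability, as below. The item parameters and the estimator are illustrative assumptions, not the system's actual implementation.

```python
import math

# Illustrative IRT (2PL) sketch; item parameters and estimator are assumed.

def p_correct(theta, difficulty, discrimination=1.0):
    """2PL probability that a learner of ability theta answers an item correctly."""
    return 1.0 / (1.0 + math.exp(-discrimination * (theta - difficulty)))

def estimate_ability(responses, items, grid=None):
    """Maximum-likelihood ability estimate over a coarse grid of theta values."""
    grid = grid or [x / 10.0 for x in range(-40, 41)]
    def log_lik(theta):
        ll = 0.0
        for correct, (difficulty, discrimination) in zip(responses, items):
            p = p_correct(theta, difficulty, discrimination)
            ll += math.log(p if correct else 1.0 - p)
        return ll
    return max(grid, key=log_lik)

items = [(-1.0, 1.2), (0.0, 1.0), (1.0, 0.8)]   # (difficulty, discrimination) per item
print(estimate_ability([1, 1, 0], items))        # responses: correct, correct, incorrect
```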