WorldWideScience

Sample records for modeling approaches showed

  1. A model comparison approach shows stronger support for economic models of fertility decline.

    Science.gov (United States)

    Shenk, Mary K; Towner, Mary C; Kress, Howard C; Alam, Nurul

    2013-05-14

    The demographic transition is an ongoing global phenomenon in which high fertility and mortality rates are replaced by low fertility and mortality. Despite intense interest in the causes of the transition, especially with respect to decreasing fertility rates, the underlying mechanisms motivating it are still subject to much debate. The literature is crowded with competing theories, including causal models that emphasize (i) mortality and extrinsic risk, (ii) the economic costs and benefits of investing in self and children, and (iii) the cultural transmission of low-fertility social norms. Distinguishing between models, however, requires more comprehensive, better-controlled studies than have been published to date. We use detailed demographic data from recent fieldwork to determine which models produce the most robust explanation of the rapid, recent demographic transition in rural Bangladesh. To rigorously compare models, we use an evidence-based statistical approach using model selection techniques derived from likelihood theory. This approach allows us to quantify the relative evidence the data give to alternative models, even when model predictions are not mutually exclusive. Results indicate that fertility, measured as either total fertility or surviving children, is best explained by models emphasizing economic factors and related motivations for parental investment. Our results also suggest important synergies between models, implicating multiple causal pathways in the rapidity and degree of recent demographic transitions.
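
    The model-comparison machinery this abstract refers to (likelihood-based comparison of non-nested causal models) is commonly carried out with AIC and Akaike weights. The sketch below is purely illustrative: the candidate model names, log-likelihoods and parameter counts are hypothetical placeholders, not values from the paper.

```python
# Illustrative model comparison with AIC and Akaike weights.
# Log-likelihoods and parameter counts are hypothetical, not from the paper.
import math

candidates = {
    # model name: (maximized log-likelihood, number of fitted parameters)
    "mortality_risk":        (-1542.3, 6),
    "economic_investment":   (-1518.7, 8),
    "cultural_transmission": (-1530.1, 7),
}

aic = {name: 2 * k - 2 * ll for name, (ll, k) in candidates.items()}
best = min(aic.values())
delta = {name: a - best for name, a in aic.items()}

# Akaike weights quantify the relative evidence for each model given the data,
# even when model predictions are not mutually exclusive.
rel = {name: math.exp(-0.5 * d) for name, d in delta.items()}
total = sum(rel.values())
weights = {name: r / total for name, r in rel.items()}

for name in sorted(weights, key=weights.get, reverse=True):
    print(f"{name}: AIC={aic[name]:.1f}  dAIC={delta[name]:.1f}  weight={weights[name]:.3f}")
```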

  2. An integrated proteomics approach shows synaptic plasticity changes in an APP/PS1 Alzheimer's mouse model

    DEFF Research Database (Denmark)

    Kempf, Stefan J; Metaxas, Athanasios; Ibáñez-Vea, María

    2016-01-01

    The aim of this study was to elucidate the molecular signature of Alzheimer's disease-associated amyloid pathology. We used the double APPswe/PS1ΔE9 mouse, a widely used model of cerebral amyloidosis, to compare changes in proteome, including global phosphorylation and sialylated N-linked glycosyl...

  3. An integrated proteomics approach shows synaptic plasticity changes in an APP/PS1 Alzheimer's mouse model

    DEFF Research Database (Denmark)

    Kempf, Stefan; Metaxas, Athanasios; Vea, Maria Ibanez

    2016-01-01

    The aim of this study was to elucidate the molecular signature of Alzheimer's disease-associated amyloid pathology. We used the double APPswe/PS1ΔE9 mouse, a widely used model of cerebral amyloidosis, to compare changes in proteome, including global phosphorylation and sialylated N-linked glycos...

  4. An integrated proteomics approach shows synaptic plasticity changes in an APP/PS1 Alzheimer's mouse model

    Science.gov (United States)

    Kempf, Stefan J.; Metaxas, Athanasios; Ibáñez-Vea, María; Darvesh, Sultan; Finsen, Bente; Larsen, Martin R.

    2016-01-01

    The aim of this study was to elucidate the molecular signature of Alzheimer's disease-associated amyloid pathology. We used the double APPswe/PS1ΔE9 mouse, a widely used model of cerebral amyloidosis, to compare changes in proteome, including global phosphorylation and sialylated N-linked glycosylation patterns, pathway-focused transcriptome and neurological disease-associated miRNAome with age-matched controls in neocortex, hippocampus, olfactory bulb and brainstem. We report that signalling pathways related to synaptic functions associated with dendritic spine morphology, neurite outgrowth, long-term potentiation, CREB signalling and cytoskeletal dynamics were altered in 12 month old APPswe/PS1ΔE9 mice, particularly in the neocortex and olfactory bulb. This was associated with cerebral amyloidosis as well as formation of argyrophilic tangle-like structures and microglial clustering in all brain regions, except for brainstem. These responses may be epigenetically modulated by the interaction with a number of miRNAs regulating spine restructuring, Aβ expression and neuroinflammation. We suggest that these changes could be associated with development of cognitive dysfunction in early disease states in patients with Alzheimer's disease. PMID:27144524

  5. Exploring the Interactions of the Dietary Plant Flavonoids Fisetin and Naringenin with G-Quadruplex and Duplex DNA, Showing Contrasting Binding Behavior: Spectroscopic and Molecular Modeling Approaches.

    Science.gov (United States)

    Bhattacharjee, Snehasish; Chakraborty, Sandipan; Sengupta, Pradeep K; Bhowmik, Sudipta

    2016-09-01

    Guanine-rich sequences have the propensity to fold into a four-stranded DNA structure known as a G-quadruplex (G4). G4 forming sequences are abundant in the promoter region of several oncogenes and become a key target for anticancer drug binding. Here we have studied the interactions of two structurally similar dietary plant flavonoids fisetin and naringenin with G4 as well as double stranded (duplex) DNA by using different spectroscopic and modeling techniques. Our study demonstrates the differential binding ability of the two flavonoids with G4 and duplex DNA. Fisetin more strongly interacts with parallel G4 structure than duplex DNA, whereas naringenin shows stronger binding affinity to duplex rather than G4 DNA. Molecular docking results also corroborate our spectroscopic results, and it was found that both of the ligands are stacked externally in the G4 DNA structure. C-ring planarity of the flavonoid structure appears to be a crucial factor for preferential G4 DNA recognition of flavonoids. The goal of this study is to explore the critical effects of small differences in the structure of closely similar chemical classes of such small molecules (flavonoids) which lead to the contrasting binding properties with the two different forms of DNA. The resulting insights may be expected to facilitate the designing of the highly selective G4 DNA binders based on flavonoid scaffolds.
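
    As a concrete illustration of how spectroscopic binding data of this kind are typically quantified, the sketch below fits a simple Stern-Volmer relation to a fluorescence titration with scipy. The concentrations, intensity ratios and the linear quenching model are synthetic assumptions for illustration; they are not the authors' data or analysis.

```python
# Illustrative Stern-Volmer fit to a fluorescence titration: F0/F = 1 + Ksv*[DNA].
# All data below are synthetic, not from the study.
import numpy as np
from scipy.optimize import curve_fit

dna_conc = np.array([0.0, 2e-6, 4e-6, 6e-6, 8e-6, 10e-6])   # mol/L, hypothetical
f0_over_f = np.array([1.00, 1.09, 1.21, 1.30, 1.42, 1.49])  # synthetic intensity ratios

def stern_volmer(c, ksv):
    return 1.0 + ksv * c

(ksv_fit,), _cov = curve_fit(stern_volmer, dna_conc, f0_over_f, p0=[1e4])
print(f"Ksv ~ {ksv_fit:.2e} L/mol (larger values indicate stronger quenching/binding)")
```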

  6. A Population Pharmacokinetic Modeling Approach Shows that Serum Penicillin G Concentrations Are Below Inhibitory Concentrations by Two Weeks after Benzathine Penicillin G Injection in the Majority of Young Adults

    Science.gov (United States)

    2014-11-01

    Naval Health Research Center report. Recoverable fragments of the indexed abstract note a suggested minimum protective concentration of penicillin G against group A streptococcus (in mg/liter), that the majority of measured serum concentrations fall below inhibitory levels by two weeks after benzathine penicillin G injection in young adults, and susceptibility data for Streptococcus pneumoniae, Streptococcus pyogenes and Haemophilus influenzae collected from patients across the USA in 2001-2002 as part of the PROTEKT US study.
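
    For readers unfamiliar with the population-pharmacokinetic framing of the title, the toy calculation below shows the basic step involved: simulate serum concentration after a depot injection with a one-compartment, first-order-absorption model and find when it falls below a protective threshold. Every parameter value (dose, volume, rate constants, and the 0.02 mg/liter threshold) is a hypothetical placeholder, not an estimate from this study.

```python
# Toy one-compartment model with first-order absorption (depot injection).
# All parameter values are hypothetical placeholders, not estimates from the study.
import numpy as np

dose_mg, bioavail = 900.0, 1.0        # hypothetical dose and bioavailability
vol_l = 40.0                          # hypothetical volume of distribution (L)
ka, ke = 0.01, 0.05                   # absorption / elimination rate constants (1/h)
threshold = 0.02                      # assumed protective concentration (mg/L)

t = np.arange(0, 24 * 28, 1.0)        # hourly grid over four weeks
conc = (bioavail * dose_mg * ka) / (vol_l * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

below = t[conc < threshold]
first_below = below[below > t[conc.argmax()]][0] / 24.0   # days, after the peak
print(f"Concentration first drops below {threshold} mg/L about day {first_below:.0f}")
```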

  7. A structural approach in networks: showing patterns, possibilities and pitfalls

    Directory of Open Access Journals (Sweden)

    Carlos Alberto Alves

    2010-01-01

    Full Text Available In recent years there has been a noticeable shift in evaluation paradigms away from positivist, individualist and atomistic explanations of phenomena to those seeking a more relational, contextual and systemic understanding. This growing shift in interest to the interrelationships or networks of connections between entities is apparent in fields such as organizations in networks, knowledge transmission between social groups and so on. A growing theoretical and methodological base is providing enhanced capacities to uncover the actual topologies or patterns of connections between entities, elements, people, organizations or communities and deliver a more fine-grained analysis of their elements. In this way network analysis differs from conventional evaluation and research modes since its focus is on the interrelationships of entities, not the characteristics of individuals. In this paper, we review and analyze the emerging capacity of the network paradigm as an evaluation method and show how this model can be applied to a range of evaluation arenas. In doing so, we outline a framework to guide network evaluation, establish some key network indicators and highlight key methodological aspects and pitfalls.

  8. Which new approaches to tackling neglected tropical diseases show promise?

    Science.gov (United States)

    Spiegel, Jerry M; Dharamsi, Shafik; Wasan, Kishor M; Yassi, Annalee; Singer, Burton; Hotez, Peter J; Hanson, Christy; Bundy, Donald A P

    2010-05-18

    This PLoS Medicine Debate examines the different approaches that can be taken to tackle neglected tropical diseases (NTDs). Some commentators, like Jerry Spiegel and colleagues from the University of British Columbia, feel there has been too much focus on the biomedical mechanisms and drug development for NTDs, at the expense of attention to the social determinants of disease. Burton Singer argues that this represents another example of the inappropriate "overmedicalization" of contemporary tropical disease control. Peter Hotez and colleagues, in contrast, argue that the best return on investment will continue to be mass drug administration for NTDs.

  9. Which new approaches to tackling neglected tropical diseases show promise?

    Directory of Open Access Journals (Sweden)

    Jerry M Spiegel

    2010-05-01

    Full Text Available This PLoS Medicine Debate examines the different approaches that can be taken to tackle neglected tropical diseases (NTDs. Some commentators, like Jerry Spiegel and colleagues from the University of British Columbia, feel there has been too much focus on the biomedical mechanisms and drug development for NTDs, at the expense of attention to the social determinants of disease. Burton Singer argues that this represents another example of the inappropriate "overmedicalization" of contemporary tropical disease control. Peter Hotez and colleagues, in contrast, argue that the best return on investment will continue to be mass drug administration for NTDs.

  10. ShowFlow: A practical interface for groundwater modeling

    Energy Technology Data Exchange (ETDEWEB)

    Tauxe, J.D.

    1990-12-01

    ShowFlow was created to provide a user-friendly, intuitive environment for researchers and students who use computer modeling software. What traditionally has been a workplace available only to those familiar with command-line based computer systems is now within reach of almost anyone interested in the subject of modeling. In the case of this edition of ShowFlow, the user can easily experiment with simulations using the steady-state Gaussian plume groundwater pollutant transport model SSGPLUME, though ShowFlow can be rewritten to provide a similar interface for any computer model. Included in this thesis is all the source code for both the ShowFlow application for Microsoft® Windows™ and the SSGPLUME model, a User's Guide, and a Developer's Guide for converting ShowFlow to run other model programs. 18 refs., 13 figs.
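
    The kind of model ShowFlow wraps can be illustrated with a generic steady-state Gaussian plume approximation for a continuous source in uniform groundwater flow (lateral spreading only, no decay or retardation). This is a textbook-style sketch, not necessarily the exact formulation implemented in SSGPLUME, and all parameter values in the example call are invented.

```python
# Generic steady-state Gaussian plume approximation for a continuous source in
# uniform groundwater flow (lateral spreading only, no decay). Illustrative only;
# not necessarily the formulation implemented in SSGPLUME.
import numpy as np

def plume_conc(x, y, source_rate, velocity, porosity, thickness, alpha_t):
    """Concentration at (x, y) downstream of a continuous point source.

    source_rate : mass release rate (g/d)
    velocity    : seepage velocity in x (m/d)
    porosity    : effective porosity (-)
    thickness   : aquifer thickness (m)
    alpha_t     : transverse dispersivity (m)
    """
    sigma_y = np.sqrt(2.0 * alpha_t * x)     # lateral plume spread at distance x
    peak = source_rate / (porosity * thickness * velocity * sigma_y * np.sqrt(2 * np.pi))
    return peak * np.exp(-y**2 / (2.0 * sigma_y**2))

# Hypothetical example: concentration 100 m downstream, 5 m off the plume centerline.
print(plume_conc(x=100.0, y=5.0, source_rate=50.0, velocity=0.5,
                 porosity=0.3, thickness=10.0, alpha_t=1.0))
```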

  11. Reciprocal Ontological Models Show Indeterminism Comparable to Quantum Theory

    Science.gov (United States)

    Bandyopadhyay, Somshubhro; Banik, Manik; Bhattacharya, Some Sankar; Ghosh, Sibasish; Kar, Guruprasad; Mukherjee, Amit; Roy, Arup

    2016-12-01

    We show that within the class of ontological models due to Harrigan and Spekkens, those satisfying preparation-measurement reciprocity must allow indeterminism comparable to that in quantum theory. Our result implies that one can design a quantum random number generator, for which it is impossible, even in principle, to construct a reciprocal deterministic model.

  12. Reciprocal Ontological Models Show Indeterminism Comparable to Quantum Theory

    Science.gov (United States)

    Bandyopadhyay, Somshubhro; Banik, Manik; Bhattacharya, Some Sankar; Ghosh, Sibasish; Kar, Guruprasad; Mukherjee, Amit; Roy, Arup

    2017-02-01

    We show that within the class of ontological models due to Harrigan and Spekkens, those satisfying preparation-measurement reciprocity must allow indeterminism comparable to that in quantum theory. Our result implies that one can design a quantum random number generator, for which it is impossible, even in principle, to construct a reciprocal deterministic model.

  13. A Solved Model to Show Insufficiency of Quantitative Adiabatic Condition

    Institute of Scientific and Technical Information of China (English)

    LIU Long-Jiang; LIU Yu-Zhen; TONG Dian-Min

    2009-01-01

    The adiabatic theorem is a useful tool for treating slowly evolving quantum systems, but its practical application depends on the quantitative condition expressed in terms of the Hamiltonian's eigenvalues and eigenstates, which is usually taken as a sufficient condition. Recently, the sufficiency of the condition was questioned, and several counterexamples have been reported. Here we present a new solved model to show the insufficiency of the traditional quantitative adiabatic condition.

  14. Showing that the race model inequality is not violated

    DEFF Research Database (Denmark)

    Gondan, Matthias; Riehl, Verena; Blurton, Steven Paul

    2012-01-01

    important being race models and coactivation models. Redundancy gains consistent with the race model have an upper limit, however, which is given by the well-known race model inequality (Miller, 1982). A number of statistical tests have been proposed for testing the race model inequality in single...... participants and groups of participants. All of these tests use the race model as the null hypothesis, and rejection of the null hypothesis is considered evidence in favor of coactivation. We introduce a statistical test in which the race model prediction is the alternative hypothesis. This test controls...

  15. Showing that the race model inequality is not violated

    DEFF Research Database (Denmark)

    Gondan, Matthias; Riehl, Verena; Blurton, Steven Paul

    2012-01-01

    important being race models and coactivation models. Redundancy gains consistent with the race model have an upper limit, however, which is given by the well-known race model inequality (Miller, 1982). A number of statistical tests have been proposed for testing the race model inequality in single...... participants and groups of participants. All of these tests use the race model as the null hypothesis, and rejection of the null hypothesis is considered evidence in favor of coactivation. We introduce a statistical test in which the race model prediction is the alternative hypothesis. This test controls...... the Type I error if a theory predicts that the race model prediction holds in a given experimental condition. © 2011 Psychonomic Society, Inc....

  16. Showing that the race model inequality is not violated

    DEFF Research Database (Denmark)

    Gondan, Matthias; Riehl, Verena; Blurton, Steven Paul

    2012-01-01

    important being race models and coactivation models. Redundancy gains consistent with the race model have an upper limit, however, which is given by the well-known race model inequality (Miller, 1982). A number of statistical tests have been proposed for testing the race model inequality in single...... participants and groups of participants. All of these tests use the race model as the null hypothesis, and rejection of the null hypothesis is considered evidence in favor of coactivation. We introduce a statistical test in which the race model prediction is the alternative hypothesis. This test controls...... the Type I error if a theory predicts that the race model prediction holds in a given experimental condition. © 2011 Psychonomic Society, Inc....
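
    The bound at issue in the preceding records is Miller's race model inequality, which requires F_redundant(t) ≤ F_A(t) + F_B(t) at every time t. The sketch below simply evaluates that bound on synthetic reaction times using empirical CDFs; it illustrates the inequality itself and is not the specific statistical test (with the race model as alternative hypothesis) that the authors propose.

```python
# Illustration of Miller's race model inequality on synthetic reaction times:
# F_redundant(t) <= F_A(t) + F_B(t) at every time t. Not the authors' test,
# only the bound being tested.
import numpy as np

rng = np.random.default_rng(0)
rt_a = rng.normal(420, 60, 200)      # unimodal condition A (ms), synthetic
rt_b = rng.normal(440, 60, 200)      # unimodal condition B (ms), synthetic
rt_ab = rng.normal(380, 55, 200)     # redundant-targets condition (ms), synthetic

def ecdf(sample, t):
    """Empirical CDF of `sample` evaluated at times `t`."""
    return np.searchsorted(np.sort(sample), t, side="right") / len(sample)

t_grid = np.percentile(rt_ab, np.arange(5, 100, 5))       # probe times
violation = ecdf(rt_ab, t_grid) - (ecdf(rt_a, t_grid) + ecdf(rt_b, t_grid))

print("max violation of the race model bound:", violation.max())
print("bound violated at any probe time:", bool((violation > 0).any()))
```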

  17. Showing Automatically Generated Students' Conceptual Models to Students and Teachers

    Science.gov (United States)

    Perez-Marin, Diana; Pascual-Nieto, Ismael

    2010-01-01

    A student conceptual model can be defined as a set of interconnected concepts associated with an estimation value that indicates how well these concepts are used by the students. It can model just one student or a group of students, and can be represented as a concept map, conceptual diagram or one of several other knowledge representation…

  18. Gerber Technology Presented Modern-Day Approach at Texprocess Americas Show

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    At the Texprocess Americas show in Atlanta, Georgia, the company took a modern-day approach to expose attendees to several new products. Applying the concept of "Bring us your challenge, we can help", the company offered its latest solutions to overcome customers' challenges.

  19. Model Mapping Approach Based on Ontology Semantics

    Directory of Open Access Journals (Sweden)

    Jinkui Hou

    2013-09-01

    Full Text Available The mapping relations between different models are the foundation for model transformation in model-driven software development. On the basis of ontology semantics, model mappings between different levels are classified by using structural semantics of modeling languages. The general definition process for mapping relations is explored, and the principles of structure mapping are proposed subsequently. The approach is further illustrated by the mapping relations from the class model of an object-oriented modeling language to C program code. The application research shows that the approach provides theoretical guidance for realizing model mappings and thus effectively supports model-driven software development.

  20. Showing their true colors: a practical approach to volume rendering from serial sections

    Directory of Open Access Journals (Sweden)

    Metscher Brian D

    2010-04-01

    Full Text Available Abstract Background In comparison to more modern imaging methods, conventional light microscopy still offers a range of substantial advantages with regard to contrast options, accessible specimen size, and resolution. Currently, tomographic image data in particular is most commonly visualized in three dimensions using volume rendering. To date, this method has only very rarely been applied to image stacks taken from serial sections, whereas surface rendering is still the most prevalent method for presenting such data sets three-dimensionally. The aim of this study was to develop standard protocols for volume rendering of image stacks of serial sections, while retaining the benefits of light microscopy such as resolution and color information. Results Here we provide a set of protocols for acquiring high-resolution 3D images of diverse microscopic samples through volume rendering based on serial light microscopical sections using the 3D reconstruction software Amira (Visage Imaging Inc.. We overcome several technical obstacles and show that these renderings are comparable in quality and resolution to 3D visualizations using other methods. This practical approach for visualizing 3D micro-morphology in full color takes advantage of both the sub-micron resolution of light microscopy and the specificity of histological stains, by combining conventional histological sectioning techniques, digital image acquisition, three-dimensional image filtering, and 3D image manipulation and visualization technologies. Conclusions We show that this method can yield "true"-colored high-resolution 3D views of tissues as well as cellular and sub-cellular structures and thus represents a powerful tool for morphological, developmental, and comparative investigations. We conclude that the presented approach fills an important gap in the field of micro-anatomical 3D imaging and visualization methods by combining histological resolution and differentiation of details with

  1. Showing their true colors: a practical approach to volume rendering from serial sections.

    Science.gov (United States)

    Handschuh, Stephan; Schwaha, Thomas; Metscher, Brian D

    2010-04-21

    In comparison to more modern imaging methods, conventional light microscopy still offers a range of substantial advantages with regard to contrast options, accessible specimen size, and resolution. Currently, tomographic image data in particular is most commonly visualized in three dimensions using volume rendering. To date, this method has only very rarely been applied to image stacks taken from serial sections, whereas surface rendering is still the most prevalent method for presenting such data sets three-dimensionally. The aim of this study was to develop standard protocols for volume rendering of image stacks of serial sections, while retaining the benefits of light microscopy such as resolution and color information. Here we provide a set of protocols for acquiring high-resolution 3D images of diverse microscopic samples through volume rendering based on serial light microscopical sections using the 3D reconstruction software Amira (Visage Imaging Inc.). We overcome several technical obstacles and show that these renderings are comparable in quality and resolution to 3D visualizations using other methods. This practical approach for visualizing 3D micro-morphology in full color takes advantage of both the sub-micron resolution of light microscopy and the specificity of histological stains, by combining conventional histological sectioning techniques, digital image acquisition, three-dimensional image filtering, and 3D image manipulation and visualization technologies. We show that this method can yield "true"-colored high-resolution 3D views of tissues as well as cellular and sub-cellular structures and thus represents a powerful tool for morphological, developmental, and comparative investigations. We conclude that the presented approach fills an important gap in the field of micro-anatomical 3D imaging and visualization methods by combining histological resolution and differentiation of details with 3D rendering of whole tissue samples. We demonstrate the

  2. Model Penilaian dan Pemilihan Trade Show Bagi Industri Kreatif di Sektor Mode [A Trade Show Assessment and Selection Model for the Creative Industry in the Fashion Sector]

    Directory of Open Access Journals (Sweden)

    Afrin Fauzya Rizana

    2017-07-01

    Full Text Available The article identifies the criteria for choosing a trade show and develops a basic model of exhibition selection for creative industry players before deciding to participate in a trade show. It is necessary to ensure that expenses in terms of business, money, and time, will be worth the results. Based on literature review and interviews, six criteria were used, namely location, booth position, organizational reputation, cost estimation, prestige, and reputation of other participants. After selection criteria are identified, then calculations are performed to measure the criteria weight by using the AHP approach. Based on weight calculations, it was found that booth positions had the highest importance weight, followed by trade show location, organizers reputation, cost estimation, prestige and reputation of other participants. The weight value is then used to calculate the trade show's prediction value. The predicted value generated from the model is then compared to the value of the past data. The model has an accuracy rate of 89% and does not have a significant difference between the value generated by the model and the value of the past data.
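
    The AHP weighting step mentioned in the abstract can be sketched as follows: build a pairwise comparison matrix over the six criteria and take its principal eigenvector as the weight vector, then check consistency. The comparison judgments below are invented placeholders, not the survey data behind the paper's reported ranking.

```python
# AHP sketch: derive criterion weights from a pairwise comparison matrix via the
# principal eigenvector. The Saaty-scale judgments below are invented placeholders.
import numpy as np

criteria = ["booth position", "location", "organizer reputation",
            "cost estimate", "prestige", "other participants"]

# A[i, j] = judged importance of criterion i over criterion j (Saaty scale).
A = np.array([
    [1,   2,   3,   4,   5,   6],
    [1/2, 1,   2,   3,   4,   5],
    [1/3, 1/2, 1,   2,   3,   4],
    [1/4, 1/3, 1/2, 1,   2,   3],
    [1/5, 1/4, 1/3, 1/2, 1,   2],
    [1/6, 1/5, 1/4, 1/3, 1/2, 1],
], dtype=float)

eigvals, eigvecs = np.linalg.eig(A)
principal = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, principal].real)
weights /= weights.sum()

for name, w in sorted(zip(criteria, weights), key=lambda p: -p[1]):
    print(f"{name}: {w:.3f}")

# Consistency ratio check (random index RI = 1.24 for a 6x6 matrix in Saaty's tables).
ci = (eigvals.real.max() - len(A)) / (len(A) - 1)
print("consistency ratio:", ci / 1.24)
```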

  3. Planetarium Shows: Public Outreach and Changing the Approach to Classroom Teaching

    Science.gov (United States)

    Morris, P. A.; Reiff, P.; Sumners, C.

    2009-05-01

    For the last few years we have been successfully using the planetarium experience as a method of increasing the public's knowledge of science. Pre- and post-testing results have shown an increased level of science knowledge. With this level of achievement, it was plausible that the next step would be introducing it into the classroom, particularly at an open-admissions university with a multi-racial and ethnic population and in the non-majors science courses. Creating an atmosphere of excitement and interest in science within this group of students is difficult, particularly with those students who have minimal exposure to science and have little or no interest in the subject. They have preconceived ideas that the subject is boring, difficult and has no relation to their future career paths. Coupled with this apathy, not only in students but in segments of the public, there is an increasing need to educate all groups on environmental issues and the need to promote science and science research. Traditional teaching methods are not effective, and based on the outreach levels of success, we are introducing planetarium shows that are not only entertaining but include scientific concepts. Most colleges and universities do not have a fixed planetarium, but a portable planetarium can serve the same purpose. The planetarium does not replace the traditional classroom and laboratory experience, but augments it. The shows, including the freeware Stellarium, can be incorporated into college and university classes from the physical to the biological sciences. For example, the show Titanic includes information on the effects of solar maxima and minima on climate and on changes in shipbuilding over the last 100 years. Another show, Ice Worlds, includes information on polar geology and biology but also explains its importance as a terrestrial analogue for interpreting the remote sensing information obtained with robotic missions to the icy worlds in our outer solar

  4. Multiple Model Approaches to Modelling and Control,

    DEFF Research Database (Denmark)

    Why Multiple Models? This book presents a variety of approaches which produce complex models or controllers by piecing together a number of simpler subsystems. This divide-and-conquer strategy is a long-standing and general way of coping with complexity in engineering systems, nature and human probl...

  5. Multiple Model Approaches to Modelling and Control,

    DEFF Research Database (Denmark)

    on the ease with which prior knowledge can be incorporated. It is interesting to note that researchers in Control Theory, Neural Networks, Statistics, Artificial Intelligence and Fuzzy Logic have more or less independently developed very similar modelling methods, calling them Local Model Networks, Operating...... of introduction of existing knowledge, as well as the ease of model interpretation. This book attempts to outline much of the common ground between the various approaches, encouraging the transfer of ideas. Recent progress in algorithms and analysis is presented, with constructive algorithms for automated model...

  6. Model Construct Based Enterprise Model Architecture and Its Modeling Approach

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    In order to support enterprise integration, a kind of model construct based enterprise model architecture and its modeling approach are studied in this paper. First, the structural makeup and internal relationships of enterprise model architecture are discussed. Then, the concept of reusable model construct (MC) which belongs to the control view and can help to derive other views is proposed. The modeling approach based on model construct consists of three steps: reference model architecture synthesis, enterprise model customization, and system design and implementation. Following the MC-based modeling approach, a case study with the background of one-kind-product machinery manufacturing enterprises is illustrated. It is shown that the proposed model-construct-based enterprise model architecture and modeling approach are practical and efficient.

  7. A Multiple Model Approach to Modeling Based on LPF Algorithm

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Input-output data fitting methods are often used for unknown-structure nonlinear system modeling. Based on model-on-demand tactics, a multiple model approach to modeling for nonlinear systems is presented. The basic idea is to find out, from vast historical system input-output data sets, some data sets matching with the current working point, then to develop a local model using the Local Polynomial Fitting (LPF) algorithm. With the change of working points, multiple local models are built, which together provide exact modeling of the global system. Compared with other methods, simulation results show good performance: the estimation is simple, effective and reliable.
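
    A minimal version of the model-on-demand idea described above: for a given operating point, weight nearby historical input-output samples and fit a local first-order polynomial model just for that neighborhood. The data, kernel and bandwidth below are synthetic choices for illustration, not the LPF algorithm exactly as used in the paper.

```python
# Minimal model-on-demand sketch: for each query point, fit a locally weighted
# first-order polynomial to nearby historical samples (synthetic data).
import numpy as np

rng = np.random.default_rng(1)
u_hist = np.linspace(0, 10, 400)                                    # historical inputs
y_hist = np.sin(u_hist) + 0.1 * rng.standard_normal(u_hist.size)    # historical outputs

def local_model(u_query, bandwidth=0.8):
    """Predict y at u_query from a locally weighted linear fit around that point."""
    w = np.exp(-0.5 * ((u_hist - u_query) / bandwidth) ** 2)   # Gaussian kernel weights
    X = np.column_stack([np.ones_like(u_hist), u_hist - u_query])
    # Weighted least squares: solve (X'WX) beta = X'W y; beta[0] is the local estimate.
    beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y_hist))
    return beta[0]

print(local_model(2.0), np.sin(2.0))   # local prediction vs. the true underlying value
```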

  8. Hydraulic Modeling of Lock Approaches

    Science.gov (United States)

    2016-08-01

    Recoverable fragments of the indexed abstract describe a modification in which the guidewall design changed from a solid wall to one on pilings that allows water to flow through and/or under the wall, numerical modeling of velocity magnitudes and directions at lock approaches for open river conditions, and meshes developed using the Surface-water Modeling System.

  9. LP Approach to Statistical Modeling

    OpenAIRE

    Mukhopadhyay, Subhadeep; Parzen, Emanuel

    2014-01-01

    We present an approach to statistical data modeling and exploratory data analysis called `LP Statistical Data Science.' It aims to generalize and unify traditional and novel statistical measures, methods, and exploratory tools. This article outlines fundamental concepts along with real-data examples to illustrate how the `LP Statistical Algorithm' can systematically tackle different varieties of data types, data patterns, and data structures under a coherent theoretical framework. A fundament...

  10. Learning Actions Models: Qualitative Approach

    DEFF Research Database (Denmark)

    Bolander, Thomas; Gierasimczuk, Nina

    2015-01-01

    —they are identifiable in the limit.We then move on to a particular learning method, which proceeds via restriction of a space of events within a learning-specific action model. This way of learning closely resembles the well-known update method from dynamic epistemic logic. We introduce several different learning......In dynamic epistemic logic, actions are described using action models. In this paper we introduce a framework for studying learnability of action models from observations. We present first results concerning propositional action models. First we check two basic learnability criteria: finite...... identifiability (conclusively inferring the appropriate action model in finite time) and identifiability in the limit (inconclusive convergence to the right action model). We show that deterministic actions are finitely identifiable, while non-deterministic actions require more learning power...

  11. Approaches to Modeling of Recrystallization

    Directory of Open Access Journals (Sweden)

    Håkan Hallberg

    2011-10-01

    Full Text Available Control of the material microstructure in terms of the grain size is a key component in tailoring material properties of metals and alloys and in creating functionally graded materials. To exert this control, reliable and efficient modeling and simulation of the recrystallization process whereby the grain size evolves is vital. The present contribution is a review paper, summarizing the current status of various approaches to modeling grain refinement due to recrystallization. The underlying mechanisms of recrystallization are briefly recollected and different simulation methods are discussed. Analytical and empirical models, continuum mechanical models and discrete methods as well as phase field, vertex and level set models of recrystallization will be considered. Such numerical methods have been reviewed previously, but with the present focus on recrystallization modeling and with a rapidly increasing amount of related publications, an updated review is called for. Advantages and disadvantages of the different methods are discussed in terms of applicability, underlying assumptions, physical relevance, implementation issues and computational efficiency.
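
    As one concrete example of the analytical models surveyed in this review, the classical JMAK (Avrami) relation describes the recrystallized volume fraction as X(t) = 1 − exp(−k·t^n). The short sketch below evaluates it for illustrative parameter values only.

```python
# Classical JMAK/Avrami kinetics, an example of the analytical recrystallization
# models surveyed in the review: X(t) = 1 - exp(-k * t**n).
# Parameter values here are illustrative only.
import numpy as np

k, n = 1e-5, 2.5            # rate constant and Avrami exponent (hypothetical)
t = np.linspace(0, 120, 7)  # time in seconds

x_rex = 1.0 - np.exp(-k * t**n)
for ti, xi in zip(t, x_rex):
    print(f"t = {ti:6.1f} s  ->  recrystallized fraction = {xi:.2f}")
```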

  12. Learning Actions Models: Qualitative Approach

    DEFF Research Database (Denmark)

    Bolander, Thomas; Gierasimczuk, Nina

    2015-01-01

    identifiability (conclusively inferring the appropriate action model in finite time) and identifiability in the limit (inconclusive convergence to the right action model). We show that deterministic actions are finitely identifiable, while non-deterministic actions require more learning power......—they are identifiable in the limit.We then move on to a particular learning method, which proceeds via restriction of a space of events within a learning-specific action model. This way of learning closely resembles the well-known update method from dynamic epistemic logic. We introduce several different learning...

  13. Vortexlet models of flapping flexible wings show tuning for force production and control

    Energy Technology Data Exchange (ETDEWEB)

    Mountcastle, A M [Department of Organismic and Evolutionary Biology, Harvard University, Concord Field Station, Bedford, MA 01730 (United States); Daniel, T L, E-mail: mtcastle@u.washington.ed [Department of Biology, University of Washington, Seattle, WA 98195 (United States)

    2010-12-15

    Insect wings are compliant structures that experience deformations during flight. Such deformations have recently been shown to substantially affect induced flows, with appreciable consequences to flight forces. However, there are open questions related to the aerodynamic mechanisms underlying the performance benefits of wing deformation, as well as the extent to which such deformations are determined by the boundary conditions governing wing actuation together with mechanical properties of the wing itself. Here we explore aerodynamic performance parameters of compliant wings under periodic oscillations, subject to changes in phase between wing elevation and pitch, and magnitude and spatial pattern of wing flexural stiffness. We use a combination of computational structural mechanics models and a 2D computational fluid dynamics approach to ask how aerodynamic force production and control potential are affected by pitch/elevation phase and variations in wing flexural stiffness. Our results show that lift and thrust forces are highly sensitive to flexural stiffness distributions, with performance optima that lie in different phase regions. These results suggest a control strategy for both flying animals and engineering applications of micro-air vehicles.

  14. Decomposition approach to model smart suspension struts

    Science.gov (United States)

    Song, Xubin

    2008-10-01

    Model and simulation study is the starting point for engineering design and development, especially for developing vehicle control systems. This paper presents a methodology to build models for application of smart struts for vehicle suspension control development. The modeling approach is based on decomposition of the testing data. Per the strut functions, the data is dissected according to both control and physical variables. Then the data sets are characterized to represent different aspects of the strut working behaviors. Next different mathematical equations can be built and optimized to best fit the corresponding data sets, respectively. In this way, the model optimization can be facilitated in comparison to a traditional approach to find out a global optimum set of model parameters for a complicated nonlinear model from a series of testing data. Finally, two struts are introduced as examples for this modeling study: magneto-rheological (MR) dampers and compressible fluid (CF) based struts. The model validation shows that this methodology can truly capture macro-behaviors of these struts.

  15. Interfacial Fluid Mechanics A Mathematical Modeling Approach

    CERN Document Server

    Ajaev, Vladimir S

    2012-01-01

    Interfacial Fluid Mechanics: A Mathematical Modeling Approach provides an introduction to mathematical models of viscous flow used in rapidly developing fields of microfluidics and microscale heat transfer. The basic physical effects are first introduced in the context of simple configurations and their relative importance in typical microscale applications is discussed. Then,several configurations of importance to microfluidics, most notably thin films/droplets on substrates and confined bubbles, are discussed in detail.  Topics from current research on electrokinetic phenomena, liquid flow near structured solid surfaces, evaporation/condensation, and surfactant phenomena are discussed in the later chapters. This book also:  Discusses mathematical models in the context of actual applications such as electrowetting Includes unique material on fluid flow near structured surfaces and phase change phenomena Shows readers how to solve modeling problems related to microscale multiphase flows Interfacial Fluid Me...

  16. A Bayesian Shrinkage Approach for AMMI Models.

    Science.gov (United States)

    da Silva, Carlos Pereira; de Oliveira, Luciano Antonio; Nuvunga, Joel Jorge; Pamplona, Andrezza Kéllen Alves; Balestre, Marcio

    2015-01-01

    Linear-bilinear models, especially the additive main effects and multiplicative interaction (AMMI) model, are widely applicable to genotype-by-environment interaction (GEI) studies in plant breeding programs. These models allow a parsimonious modeling of GE interactions, retaining a small number of principal components in the analysis. However, one aspect of the AMMI model that is still debated is the selection criteria for determining the number of multiplicative terms required to describe the GE interaction pattern. Shrinkage estimators have been proposed as selection criteria for the GE interaction components. In this study, a Bayesian approach was combined with the AMMI model with shrinkage estimators for the principal components. A total of 55 maize genotypes were evaluated in nine different environments using a complete blocks design with three replicates. The results show that the traditional Bayesian AMMI model produces low shrinkage of singular values but avoids the usual pitfalls in determining the credible intervals in the biplot. On the other hand, Bayesian shrinkage AMMI models have difficulty with the credible interval for model parameters, but produce stronger shrinkage of the principal components, converging to GE matrices that have more shrinkage than those obtained using mixed models. This characteristic allowed more parsimonious models to be chosen, and resulted in models being selected that were similar to those obtained by the Cornelius F-test (α = 0.05) in traditional AMMI models and cross validation based on leave-one-out. This characteristic allowed more parsimonious models to be chosen and more GEI pattern retained on the first two components. The resulting model chosen by posterior distribution of singular value was also similar to those produced by the cross-validation approach in traditional AMMI models. Our method enables the estimation of credible interval for AMMI biplot plus the choice of AMMI model based on direct posterior

  17. A Bayesian Shrinkage Approach for AMMI Models.

    Directory of Open Access Journals (Sweden)

    Carlos Pereira da Silva

    Full Text Available Linear-bilinear models, especially the additive main effects and multiplicative interaction (AMMI model, are widely applicable to genotype-by-environment interaction (GEI studies in plant breeding programs. These models allow a parsimonious modeling of GE interactions, retaining a small number of principal components in the analysis. However, one aspect of the AMMI model that is still debated is the selection criteria for determining the number of multiplicative terms required to describe the GE interaction pattern. Shrinkage estimators have been proposed as selection criteria for the GE interaction components. In this study, a Bayesian approach was combined with the AMMI model with shrinkage estimators for the principal components. A total of 55 maize genotypes were evaluated in nine different environments using a complete blocks design with three replicates. The results show that the traditional Bayesian AMMI model produces low shrinkage of singular values but avoids the usual pitfalls in determining the credible intervals in the biplot. On the other hand, Bayesian shrinkage AMMI models have difficulty with the credible interval for model parameters, but produce stronger shrinkage of the principal components, converging to GE matrices that have more shrinkage than those obtained using mixed models. This characteristic allowed more parsimonious models to be chosen, and resulted in models being selected that were similar to those obtained by the Cornelius F-test (α = 0.05 in traditional AMMI models and cross validation based on leave-one-out. This characteristic allowed more parsimonious models to be chosen and more GEI pattern retained on the first two components. The resulting model chosen by posterior distribution of singular value was also similar to those produced by the cross-validation approach in traditional AMMI models. Our method enables the estimation of credible interval for AMMI biplot plus the choice of AMMI model based on direct
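
    The linear-bilinear structure these two records discuss can be made concrete with a small sketch: double-center a genotype-by-environment table of means and take the SVD of the residual interaction matrix; the leading singular components are the AMMI multiplicative terms. The data below are random placeholders, and the Bayesian shrinkage estimation described in the abstract is not reproduced.

```python
# AMMI sketch: additive main effects plus SVD of the genotype-by-environment
# interaction residuals. Random placeholder data; the Bayesian shrinkage machinery
# described in the records is not reproduced here.
import numpy as np

rng = np.random.default_rng(42)
Y = rng.normal(5.0, 1.0, size=(10, 4))           # 10 genotypes x 4 environments (fake means)

grand = Y.mean()
g_eff = Y.mean(axis=1, keepdims=True) - grand    # genotype main effects
e_eff = Y.mean(axis=0, keepdims=True) - grand    # environment main effects
ge_resid = Y - grand - g_eff - e_eff             # double-centered interaction matrix

U, S, Vt = np.linalg.svd(ge_resid, full_matrices=False)
explained = S**2 / np.sum(S**2)
print("share of GEI captured by each multiplicative term:", np.round(explained, 3))

# AMMI2 approximation of the table: main effects plus the first two bilinear terms.
Y_ammi2 = grand + g_eff + e_eff + (U[:, :2] * S[:2]) @ Vt[:2, :]
```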

  18. Assessing the "Rothstein Falsification Test": Does It Really Show Teacher Value-Added Models Are Biased?

    Science.gov (United States)

    Goldhaber, Dan; Chaplin, Duncan Dunbar

    2015-01-01

    In an influential paper, Jesse Rothstein (2010) shows that standard value-added models (VAMs) suggest implausible and large future teacher effects on past student achievement. This is the basis of a falsification test that "appears" to indicate bias in typical VAM estimates of teacher contributions to student learning on standardized…

  19. Validation of Modeling Flow Approaching Navigation Locks

    Science.gov (United States)

    2013-08-01

    Recoverable fragments of the indexed report list figures including: Figure 9, Tools and instrumentation, bracket attached to rail; Figure 10, Tools and instrumentation, direction vernier; Figure 11, Plan A lock approach, upstream approach.

  20. Modeled hydrologic metrics show links between hydrology and the functional composition of stream assemblages.

    Science.gov (United States)

    Patrick, Christopher J; Yuan, Lester L

    2017-07-01

    Flow alteration is widespread in streams, but current understanding of the effects of differences in flow characteristics on stream biological communities is incomplete. We tested hypotheses about the effect of variation in hydrology on stream communities by using generalized additive models to relate watershed information to the values of different flow metrics at gauged sites. Flow models accounted for 54-80% of the spatial variation in flow metric values among gauged sites. We then used these models to predict flow metrics in 842 ungauged stream sites in the mid-Atlantic United States that were sampled for fish, macroinvertebrates, and environmental covariates. Fish and macroinvertebrate assemblages were characterized in terms of a suite of metrics that quantified aspects of community composition, diversity, and functional traits that were expected to be associated with differences in flow characteristics. We related modeled flow metrics to biological metrics in a series of stressor-response models. Our analyses identified both drying and base flow instability as explaining 30-50% of the observed variability in fish and invertebrate community composition. Variations in community composition were related to variations in the prevalence of dispersal traits in invertebrates and trophic guilds in fish. The results demonstrate that we can use statistical models to predict hydrologic conditions at bioassessment sites, which, in turn, we can use to estimate relationships between flow conditions and biological characteristics. This analysis provides an approach to quantify the effects of spatial variation in flow metrics using readily available biomonitoring data. © 2017 by the Ecological Society of America.
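
    The two-stage workflow described above (predict flow metrics at ungauged sites from watershed attributes, then relate biological metrics to the predicted flow metrics) can be sketched as below. Scikit-learn's gradient boosting is used here as a generic stand-in for the paper's generalized additive models, and all data are synthetic.

```python
# Two-stage sketch of the workflow described above, with synthetic data and
# scikit-learn's gradient boosting standing in for the paper's GAMs.
import numpy as np
from sklearn.ensemble import HistGradientBoostingRegressor
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(7)

# Stage 1: train a flow-metric model on "gauged" sites (watershed attributes -> metric).
X_gauged = rng.normal(size=(300, 5))                    # e.g. area, slope, precip, ...
flow_metric = X_gauged[:, 0] - 0.5 * X_gauged[:, 1] + rng.normal(0, 0.3, 300)
flow_model = HistGradientBoostingRegressor().fit(X_gauged, flow_metric)

# Stage 2: predict the flow metric at "ungauged" bioassessment sites, then relate
# it to a biological metric (stressor-response model).
X_bio = rng.normal(size=(200, 5))
predicted_flow = flow_model.predict(X_bio)
bio_metric = -0.8 * predicted_flow + rng.normal(0, 0.5, 200)   # synthetic response

sr_model = LinearRegression().fit(predicted_flow.reshape(-1, 1), bio_metric)
print("stressor-response slope:", sr_model.coef_[0])
```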

  1. "A cigarette a day keeps the goodies away": smokers show automatic approach tendencies for smoking--but not for food-related stimuli.

    Directory of Open Access Journals (Sweden)

    Alla Machulska

    Full Text Available Smoking leads to the development of automatic tendencies that promote approach behavior toward smoking-related stimuli which in turn may maintain addictive behavior. The present study examined whether automatic approach tendencies toward smoking-related stimuli can be measured by using an adapted version of the Approach-Avoidance Task (AAT. Given that progression of addictive behavior has been associated with a decreased reactivity of the brain reward system for stimuli signaling natural rewards, we also used the AAT to measure approach behavior toward natural rewarding stimuli in smokers. During the AAT, 92 smokers and 51 non-smokers viewed smoking-related vs. non-smoking-related pictures and pictures of natural rewards (i.e. highly palatable food vs. neutral pictures. They were instructed to ignore image content and to respond to picture orientation by either pulling or pushing a joystick. Within-group comparisons revealed that smokers showed an automatic approach bias exclusively for smoking-related pictures. Contrary to our expectations, there was no difference in smokers' and non-smokers' approach bias for nicotine-related stimuli, indicating that non-smokers also showed approach tendencies for this picture category. Yet, in contrast to non-smokers, smokers did not show an approach bias for food-related pictures. Moreover, self-reported smoking attitude could not predict approach-avoidance behavior toward nicotine-related pictures in smokers or non-smokers. Our findings indicate that the AAT is suited for measuring smoking-related approach tendencies in smokers. Furthermore, we provide evidence for a diminished approach tendency toward food-related stimuli in smokers, suggesting a decreased sensitivity to natural rewards in the course of nicotine addiction. Our results indicate that in contrast to similar studies conducted in alcohol, cannabis and heroin users, the AAT might only be partially suited for measuring smoking-related approach
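
    The approach-bias score underlying these analyses is typically computed per picture category as the difference between median push and median pull reaction times, with positive values indicating an approach tendency. The sketch below computes such scores on synthetic trial data; the category means and trial counts are invented and carry no empirical content.

```python
# Approach-Avoidance Task (AAT) bias score on synthetic trial data:
# bias = median(push RT) - median(pull RT); positive values indicate approach.
import numpy as np

rng = np.random.default_rng(3)
trials = []  # (category, movement, reaction time in ms) -- synthetic
for cat, pull_mu, push_mu in [("smoking", 640, 700), ("food", 660, 665), ("neutral", 670, 668)]:
    trials += [(cat, "pull", rt) for rt in rng.normal(pull_mu, 80, 40)]
    trials += [(cat, "push", rt) for rt in rng.normal(push_mu, 80, 40)]

def bias(category):
    pull = [rt for c, m, rt in trials if c == category and m == "pull"]
    push = [rt for c, m, rt in trials if c == category and m == "push"]
    return np.median(push) - np.median(pull)

for cat in ("smoking", "food", "neutral"):
    print(f"{cat} approach bias: {bias(cat):.0f} ms")
```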

  2. Learning Action Models: Qualitative Approach

    NARCIS (Netherlands)

    Bolander, T.; Gierasimczuk, N.; van der Hoek, W.; Holliday, W.H.; Wang, W.-F.

    2015-01-01

    In dynamic epistemic logic, actions are described using action models. In this paper we introduce a framework for studying learnability of action models from observations. We present first results concerning propositional action models. First we check two basic learnability criteria: finite

  3. A new approach for Bayesian model averaging

    Institute of Scientific and Technical Information of China (English)

    TIAN XiangJun; XIE ZhengHui; WANG AiHui; YANG XiaoChun

    2012-01-01

    Bayesian model averaging (BMA) is a recently proposed statistical method for calibrating forecast ensembles from numerical weather models. However, successful implementation of BMA requires accurate estimates of the weights and variances of the individual competing models in the ensemble. Two methods, namely the Expectation-Maximization (EM) and the Markov Chain Monte Carlo (MCMC) algorithms, are widely used for BMA model training. Both methods have their own respective strengths and weaknesses. In this paper, we first modify the BMA log-likelihood function with the aim of removing the additional limitation that requires that the BMA weights add to one, and then use a limited-memory quasi-Newtonian algorithm for solving the nonlinear optimization problem, thereby formulating a new approach for BMA (referred to as BMA-BFGS). Several groups of multi-model soil moisture simulation experiments from three land surface models show that the performance of BMA-BFGS is similar to the MCMC method in terms of simulation accuracy, and that both are superior to the EM algorithm. On the other hand, the computational cost of the BMA-BFGS algorithm is substantially less than for MCMC and is almost equivalent to that for EM.
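
    A minimal sketch in the spirit of BMA-BFGS: parameterize the ensemble weights through a softmax so the sum-to-one constraint is handled implicitly, and maximize the Gaussian-mixture log-likelihood with scipy's limited-memory quasi-Newton optimizer (L-BFGS-B). The observations and ensemble members below are synthetic, and the exact likelihood modification used in the paper is not reproduced.

```python
# Minimal BMA sketch: softmax-parameterized weights plus a common variance,
# fitted by maximizing the mixture log-likelihood with L-BFGS-B.
# Observations and ensemble forecasts below are synthetic.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(11)
obs = rng.normal(0.3, 0.1, 500)                        # "observed" soil moisture
forecasts = np.stack([obs + rng.normal(b, 0.05, 500)   # three biased ensemble members
                      for b in (0.02, -0.03, 0.05)], axis=1)

def neg_log_lik(theta):
    logits, log_sigma = theta[:3], theta[3]
    w = np.exp(logits - logits.max()); w /= w.sum()    # softmax weights, sum to one
    sigma = np.exp(log_sigma)
    dens = np.exp(-0.5 * ((obs[:, None] - forecasts) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    return -np.sum(np.log(dens @ w + 1e-300))

res = minimize(neg_log_lik, x0=np.zeros(4), method="L-BFGS-B")
w_hat = np.exp(res.x[:3] - res.x[:3].max()); w_hat /= w_hat.sum()
print("estimated BMA weights:", np.round(w_hat, 3), "sigma:", np.exp(res.x[3]).round(3))
```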

  4. Geometrical approach to fluid models

    NARCIS (Netherlands)

    Kuvshinov, B. N.; Schep, T. J.

    1997-01-01

    Differential geometry based upon the Cartan calculus of differential forms is applied to investigate invariant properties of equations that describe the motion of continuous media. The main feature of this approach is that physical quantities are treated as geometrical objects. The geometrical

  5. Geometrical approach to fluid models

    NARCIS (Netherlands)

    Kuvshinov, B. N.; Schep, T. J.

    1997-01-01

    Differential geometry based upon the Cartan calculus of differential forms is applied to investigate invariant properties of equations that describe the motion of continuous media. The main feature of this approach is that physical quantities are treated as geometrical objects. The geometrical notio

  6. Model based feature fusion approach

    NARCIS (Netherlands)

    Schwering, P.B.W.

    2001-01-01

    In recent years different sensor data fusion approaches have been analyzed and evaluated in the field of mine detection. In various studies comparisons have been made between different techniques. Although claims can be made for advantages for using certain techniques, until now there has been no si

  7. Merging Digital Surface Models Implementing Bayesian Approaches

    Science.gov (United States)

    Sadeq, H.; Drummond, J.; Li, Z.

    2016-06-01

    In this research different DSMs from different sources have been merged. The merging is based on a probabilistic model using a Bayesian Approach. The implemented data have been sourced from very high resolution satellite imagery sensors (e.g. WorldView-1 and Pleiades). It is deemed preferable to use a Bayesian Approach when the data obtained from the sensors are limited and it is difficult to obtain many measurements or it would be very costly, thus the problem of the lack of data can be solved by introducing a priori estimations of data. To infer the prior data, it is assumed that the roofs of the buildings are specified as smooth, and for that purpose local entropy has been implemented. In addition to the a priori estimations, GNSS RTK measurements have been collected in the field which are used as check points to assess the quality of the DSMs and to validate the merging result. The model has been applied in the West-End of Glasgow containing different kinds of buildings, such as flat roofed and hipped roofed buildings. Both quantitative and qualitative methods have been employed to validate the merged DSM. The validation results have shown that the model was successfully able to improve the quality of the DSMs and improving some characteristics such as the roof surfaces, which consequently led to better representations. In addition to that, the developed model has been compared with the well established Maximum Likelihood model and showed similar quantitative statistical results and better qualitative results. Although the proposed model has been applied on DSMs that were derived from satellite imagery, it can be applied to any other sourced DSMs.

  8. MERGING DIGITAL SURFACE MODELS IMPLEMENTING BAYESIAN APPROACHES

    Directory of Open Access Journals (Sweden)

    H. Sadeq

    2016-06-01

    Full Text Available In this research different DSMs from different sources have been merged. The merging is based on a probabilistic model using a Bayesian Approach. The implemented data have been sourced from very high resolution satellite imagery sensors (e.g. WorldView-1 and Pleiades. It is deemed preferable to use a Bayesian Approach when the data obtained from the sensors are limited and it is difficult to obtain many measurements or it would be very costly, thus the problem of the lack of data can be solved by introducing a priori estimations of data. To infer the prior data, it is assumed that the roofs of the buildings are specified as smooth, and for that purpose local entropy has been implemented. In addition to the a priori estimations, GNSS RTK measurements have been collected in the field which are used as check points to assess the quality of the DSMs and to validate the merging result. The model has been applied in the West-End of Glasgow containing different kinds of buildings, such as flat roofed and hipped roofed buildings. Both quantitative and qualitative methods have been employed to validate the merged DSM. The validation results have shown that the model was successfully able to improve the quality of the DSMs and improving some characteristics such as the roof surfaces, which consequently led to better representations. In addition to that, the developed model has been compared with the well established Maximum Likelihood model and showed similar quantitative statistical results and better qualitative results. Although the proposed model has been applied on DSMs that were derived from satellite imagery, it can be applied to any other sourced DSMs.
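
    The core Bayesian fusion step in these two records can be illustrated as per-cell precision-weighted averaging of two noisy height estimates (the Gaussian posterior mean). The grids and standard deviations below are synthetic assumptions; the entropy-derived roof priors and GNSS RTK validation described in the abstract are not reproduced.

```python
# Per-cell Bayesian fusion of two DSMs as precision-weighted averaging of Gaussian
# height estimates. Grids and standard deviations are synthetic assumptions.
import numpy as np

rng = np.random.default_rng(5)
truth = rng.normal(30.0, 5.0, size=(100, 100))         # "true" surface heights (m)

dsm_a = truth + rng.normal(0, 0.8, truth.shape)        # e.g. a WorldView-1 derived DSM
dsm_b = truth + rng.normal(0, 1.5, truth.shape)        # e.g. a Pleiades derived DSM
sigma_a, sigma_b = 0.8, 1.5                            # assumed per-DSM uncertainties (m)

w_a, w_b = 1 / sigma_a**2, 1 / sigma_b**2              # precisions
merged = (w_a * dsm_a + w_b * dsm_b) / (w_a + w_b)     # posterior mean per cell
merged_sigma = np.sqrt(1.0 / (w_a + w_b))              # posterior std per cell

for name, dsm in [("DSM A", dsm_a), ("DSM B", dsm_b), ("merged", merged)]:
    print(name, "RMSE vs truth:", np.sqrt(np.mean((dsm - truth) ** 2)).round(3))
print("posterior sigma per cell:", round(float(merged_sigma), 3))
```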

  9. Global energy modeling - A biophysical approach

    Energy Technology Data Exchange (ETDEWEB)

    Dale, Michael

    2010-09-15

    This paper contrasts the standard economic approach to energy modelling with energy models using a biophysical approach. Neither of these approaches includes changing energy-returns-on-investment (EROI) due to declining resource quality or the capital intensive nature of renewable energy sources. Both of these factors will become increasingly important in the future. An extension to the biophysical approach is outlined which encompasses a dynamic EROI function that explicitly incorporates technological learning. The model is used to explore several scenarios of long-term future energy supply especially concerning the global transition to renewable energy sources in the quest for a sustainable energy system.

  10. Evaluating face trustworthiness: a model based approach.

    Science.gov (United States)

    Todorov, Alexander; Baron, Sean G; Oosterhof, Nikolaas N

    2008-06-01

    Judgments of trustworthiness from faces determine basic approach/avoidance responses and approximate the valence evaluation of faces that runs across multiple person judgments. Here, based on trustworthiness judgments and using a computer model for face representation, we built a model for representing face trustworthiness (study 1). Using this model, we generated novel faces with an increased range of trustworthiness and used these faces as stimuli in a functional Magnetic Resonance Imaging study (study 2). Although participants did not engage in explicit evaluation of the faces, the amygdala response changed as a function of face trustworthiness. An area in the right amygdala showed a negative linear response-as the untrustworthiness of faces increased so did the amygdala response. Areas in the left and right putamen, the latter area extended into the anterior insula, showed a similar negative linear response. The response in the left amygdala was quadratic--strongest for faces on both extremes of the trustworthiness dimension. The medial prefrontal cortex and precuneus also showed a quadratic response, but their response was strongest to faces in the middle range of the trustworthiness dimension.

  11. A POMDP approach to Affective Dialogue Modeling

    NARCIS (Netherlands)

    Bui Huu Trung, B.H.T.; Poel, Mannes; Nijholt, Antinus; Zwiers, Jakob; Keller, E.; Marinaro, M.; Bratanic, M.

    2007-01-01

    We propose a novel approach to developing a dialogue model that is able to take into account some aspects of the user's affective state and to act appropriately. Our dialogue model uses a Partially Observable Markov Decision Process approach with observations composed of the observed user's

  12. The chronic diseases modelling approach

    NARCIS (Netherlands)

    Hoogenveen RT; Hollander AEM de; Genugten MLL van; CCM

    1998-01-01

    A mathematical model structure is described that can be used to simulate the changes of the Dutch public health state over time. The model is based on the concept of demographic and epidemiologic processes (events) and is mathematically based on the lifetable method. The population is divided over s

  13. Histidine decarboxylase knockout mice, a genetic model of Tourette syndrome, show repetitive grooming after induced fear.

    Science.gov (United States)

    Xu, Meiyu; Li, Lina; Ohtsu, Hiroshi; Pittenger, Christopher

    2015-05-19

    Tics, such as are seen in Tourette syndrome (TS), are common and can cause profound morbidity, but they are poorly understood. Tics are potentiated by psychostimulants, stress, and sleep deprivation. Mutations in the gene histidine decarboxylase (Hdc) have been implicated as a rare genetic cause of TS, and Hdc knockout mice have been validated as a genetic model that recapitulates phenomenological and pathophysiological aspects of the disorder. Tic-like stereotypies in this model have not been observed at baseline but emerge after acute challenge with the psychostimulant d-amphetamine. We tested the ability of an acute stressor to stimulate stereotypies in this model, using tone fear conditioning. Hdc knockout mice acquired conditioned fear normally, as manifested by freezing during the presentation of a tone 48h after it had been paired with a shock. During the 30min following tone presentation, knockout mice showed increased grooming. Heterozygotes exhibited normal freezing and intermediate grooming. These data validate a new paradigm for the examination of tic-like stereotypies in animals without pharmacological challenge and enhance the face validity of the Hdc knockout mouse as a pathophysiologically grounded model of tic disorders.

  14. Small GSK-3 Inhibitor Shows Efficacy in a Motor Neuron Disease Murine Model Modulating Autophagy.

    Science.gov (United States)

    de Munck, Estefanía; Palomo, Valle; Muñoz-Sáez, Emma; Perez, Daniel I; Gómez-Miguel, Begoña; Solas, M Teresa; Gil, Carmen; Martínez, Ana; Arahuetes, Rosa M

    2016-01-01

    Amyotrophic lateral sclerosis (ALS) is a progressive motor neuron degenerative disease that has no effective treatment to date. Drug discovery has been hampered by the lack of knowledge of its molecular etiology together with the limited animal models available for research. Recently, a motor neuron disease animal model has been developed using β-N-methylamino-L-alanine (L-BMAA), a neurotoxic amino acid linked to the appearance of ALS. In the present work, the neuroprotective role of VP2.51, a small heterocyclic GSK-3 inhibitor, is analysed in this novel murine model, together with an analysis of autophagy. Daily administration of VP2.51 for two weeks, starting the first day after L-BMAA treatment, leads to total recovery of neurological symptoms and prevents the activation of autophagic processes in rats. These results show that the L-BMAA murine model can be used to test the efficacy of new drugs. In addition, the results confirm the therapeutic potential of GSK-3 inhibitors, and especially VP2.51, for future disease-modifying treatment of motor neuron disorders such as ALS.

  16. MTO1-deficient mouse model mirrors the human phenotype showing complex I defect and cardiomyopathy.

    Directory of Open Access Journals (Sweden)

    Lore Becker

    Full Text Available Recently, mutations in the mitochondrial translation optimization factor 1 gene (MTO1 were identified as causative in children with hypertrophic cardiomyopathy, lactic acidosis and respiratory chain defect. Here, we describe an MTO1-deficient mouse model generated by gene trap mutagenesis that mirrors the human phenotype remarkably well. As in patients, the most prominent signs and symptoms were cardiovascular and included bradycardia and cardiomyopathy. In addition, the mutant mice showed a marked worsening of arrhythmias during induction and reversal of anaesthesia. The detailed morphological and biochemical workup of murine hearts indicated that the myocardial damage was due to complex I deficiency and mitochondrial dysfunction. In contrast, neurological examination was largely normal in Mto1-deficient mice. A translational consequence of this mouse model may be to caution against anaesthesia-related cardiac arrhythmias which may be fatal in patients.

  17. A Unified Approach to Modeling and Programming

    DEFF Research Database (Denmark)

    Madsen, Ole Lehrmann; Møller-Pedersen, Birger

    2010-01-01

    SIMULA was a language for modeling and programming and provided a unified approach to modeling and programming, in contrast to methodologies based on structured analysis and design. The current development seems to be going in the direction of a separation of modeling and programming. The goal of this paper is to go back to the future and get inspiration from SIMULA and propose a unified approach. In addition to reintroducing the contributions of SIMULA and the Scandinavian approach to object-oriented programming, we do this by discussing a number of issues in modeling and programming and argue why we ...

  18. Problem-based learning using patient-simulated videos showing daily life for a comprehensive clinical approach.

    Science.gov (United States)

    Ikegami, Akiko; Ohira, Yoshiyuki; Uehara, Takanori; Noda, Kazutaka; Suzuki, Shingo; Shikino, Kiyoshi; Kajiwara, Hideki; Kondo, Takeshi; Hirota, Yusuke; Ikusaka, Masatomi

    2017-02-27

    We examined whether problem-based learning tutorials using patient-simulated videos showing daily life are more practical for clinical learning, compared with traditional paper-based problem-based learning, for the consideration rate of psychosocial issues and the recall rate for experienced learning. Twenty-two groups with 120 fifth-year students were each assigned paper-based problem-based learning and video-based problem-based learning using patient-simulated videos. We compared target achievement rates in questionnaires using the Wilcoxon signed-rank test and discussion contents diversity using the Mann-Whitney U test. A follow-up survey used a chi-square test to measure students' recall of cases in three categories: video, paper, and non-experienced. Video-based problem-based learning displayed significantly higher achievement rates for imagining authentic patients (p=0.001), incorporating a comprehensive approach including psychosocial aspects (pmaterials.

  19. Transchromosomic cell model of Down syndrome shows aberrant migration, adhesion and proteome response to extracellular matrix

    Directory of Open Access Journals (Sweden)

    Cotter Finbarr E

    2009-08-01

    Full Text Available Abstract Background Down syndrome (DS), caused by trisomy of human chromosome 21 (HSA21), is the most common genetic birth defect. Congenital heart defects (CHD) are seen in 40% of DS children, and >50% of all atrioventricular canal defects in infancy are caused by trisomy 21, but the causative genes remain unknown. Results Here we show that aberrant adhesion and proliferation of DS cells can be reproduced using a transchromosomic model of DS (mouse fibroblasts bearing supernumerary HSA21). We also demonstrate a decrease in cell migration in transchromosomic cells independently of their adhesion properties. We show that the cell-autonomous proteome response to the presence of collagen VI in the extracellular matrix is strongly affected by trisomy 21. Conclusion This set of experiments establishes a new model system for genetic dissection of the specific HSA21 gene-overdose contributions to aberrant cell migration, adhesion, proliferation and the specific proteome response to collagen VI, cellular phenotypes linked to the pathogenesis of CHD.

  20. Estimating carbon and showing impacts of drought using satellite data in regression-tree models

    Science.gov (United States)

    Boyte, Stephen; Wylie, Bruce K.; Howard, Danny; Dahal, Devendra; Gilmanov, Tagir G.

    2018-01-01

    Integrating spatially explicit biogeophysical and remotely sensed data into regression-tree models enables the spatial extrapolation of training data over large geographic spaces, allowing a better understanding of broad-scale ecosystem processes. The current study presents annual gross primary production (GPP) and annual ecosystem respiration (RE) for 2000–2013 in several short-statured vegetation types using carbon flux data from towers that are located strategically across the conterminous United States (CONUS). We calculate carbon fluxes (annual net ecosystem production [NEP]) for each year in our study period, which includes 2012, when drought and higher-than-normal temperatures influenced vegetation productivity in large parts of the study area. We present and analyse carbon flux dynamics in the CONUS to better understand how drought affects GPP, RE, and NEP. Model accuracy metrics show strong correlation coefficients (r ≥ 94%) between training and estimated data for both GPP and RE. Overall, average annual GPP, RE, and NEP are relatively constant throughout the study period except during 2012, when almost 60% less carbon was sequestered than normal. These results allow us to conclude that this modelling method effectively estimates carbon dynamics through time and allows the exploration of impacts of meteorological anomalies and vegetation types on carbon dynamics.
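
    The extrapolation step can be illustrated with a generic regression tree. scikit-learn's DecisionTreeRegressor and the synthetic predictor table below are stand-ins for the operational regression-tree software and the real tower and remote-sensing data; all coefficients are invented.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(42)

# Synthetic training table: rows = tower-site years, columns = remotely sensed
# and biogeophysical predictors (NDVI, precipitation, temperature).
n = 500
X = np.column_stack([
    rng.uniform(0.1, 0.9, n),      # growing-season NDVI
    rng.uniform(150, 900, n),      # annual precipitation (mm)
    rng.uniform(4, 18, n),         # mean annual temperature (degrees C)
])
gpp = 1200 * X[:, 0] + 0.3 * X[:, 1] + rng.normal(0, 40, n)  # toy GPP (g C m-2)

tree = DecisionTreeRegressor(max_depth=6, min_samples_leaf=10).fit(X, gpp)
r = np.corrcoef(gpp, tree.predict(X))[0, 1]
print(f"training r = {r:.2f}")

# A drought year enters the extrapolation as lower NDVI and precipitation.
normal_cell = np.array([[0.55, 450, 14]])
drought_cell = np.array([[0.25, 250, 16]])
print(tree.predict(normal_cell) - tree.predict(drought_cell))
```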

  1. NMR Metabolomics Show Evidence for Mitochondrial Oxidative Stress in a Mouse Model of Polycystic Ovary Syndrome.

    Science.gov (United States)

    Selen, Ebru Selin; Bolandnazar, Zeinab; Tonelli, Marco; Bütz, Daniel E; Haviland, Julia A; Porter, Warren P; Assadi-Porter, Fariba M

    2015-08-07

    Polycystic ovary syndrome (PCOS) is associated with metabolic and endocrine disorders in women of reproductive age. The etiology of PCOS is still unknown. Mice prenatally treated with glucocorticoids exhibit metabolic disturbances that are similar to those seen in women with PCOS. We used an untargeted nuclear magnetic resonance (NMR)-based metabolomics approach to understand the metabolic changes occurring in the plasma and kidney over time in female glucocorticoid-treated (GC-treated) mice. There are significant changes in plasma amino acid levels (valine, tyrosine, and proline) and their intermediates (2-hydroxybutyrate, 4-aminobutyrate, and taurine), whereas in kidneys, the TCA cycle metabolism (citrate, fumarate, and succinate) and the pentose phosphate (PP) pathway products (inosine and uracil) are significantly altered (p metabolic substrates in the plasma and kidneys of treated mice are associated with altered amino acid metabolism, increased cytoplasmic PP, and increased mitochondrial activity, leading to a more oxidized state. This study identifies biomarkers associated with metabolic dysfunction in kidney mitochondria of a prenatal glucocorticoid-treated mouse model of PCOS that may be used as early predictive biomarkers of oxidative stress in the PCOS metabolic disorder in women.

  2. Szekeres models: a covariant approach

    CERN Document Server

    Apostolopoulos, Pantelis S

    2016-01-01

    We exploit the 1+1+2 formalism to covariantly describe the inhomogeneous and anisotropic Szekeres models. It is shown that an \emph{average scale length} can be defined \emph{covariantly} which satisfies a 2d equation of motion driven by the \emph{effective gravitational mass} (EGM) contained in the dust cloud. The contributions to the EGM are encoded in the energy density of the dust fluid and the free gravitational field $E_{ab}$. In addition, the notions of the Apparent and Absolute Apparent Horizons are briefly discussed and we give an alternative gauge-invariant form to define them in terms of the kinematical variables of the spacelike congruences. We argue that the proposed program can be used in order to express the Sachs optical equations in a covariant form and analyze the confrontation of a spatially inhomogeneous irrotational overdense fluid model with the observational data.

  3. Matrix Model Approach to Cosmology

    CERN Document Server

    Chaney, A; Stern, A

    2015-01-01

    We perform a systematic search for rotationally invariant cosmological solutions to matrix models, or more specifically the bosonic sector of Lorentzian IKKT-type matrix models, in dimensions $d$ less than ten, specifically $d=3$ and $d=5$. After taking a continuum (or commutative) limit they yield $d-1$ dimensional space-time surfaces, with an attached Poisson structure, which can be associated with closed, open or static cosmologies. For $d=3$, we obtain recursion relations from which it is possible to generate rotationally invariant matrix solutions which yield open universes in the continuum limit. Specific examples of matrix solutions have also been found which are associated with closed and static two-dimensional space-times in the continuum limit. The solutions provide for a matrix resolution of cosmological singularities. The commutative limit reveals other desirable features, such as a solution describing a smooth transition from an initial inflation to a noninflationary era. Many of the $d=3$ soluti...

  4. A new approach to adaptive data models

    Directory of Open Access Journals (Sweden)

    Ion LUNGU

    2016-12-01

    Full Text Available Over the last decade, there has been a substantial increase in the volume and complexity of data we collect, store and process. We are now aware of the increasing demand for real time data processing in every continuous business process that evolves within the organization. We witness a shift from a traditional static data approach to a more adaptive model approach. This article aims to extend understanding in the field of data models used in information systems by examining how an adaptive data model approach for managing business processes can help organizations accommodate on the fly and build dynamic capabilities to react in a dynamic environment.

  5. Rubber particle proteins, HbREF and HbSRPP, show different interactions with model membranes.

    Science.gov (United States)

    Berthelot, Karine; Lecomte, Sophie; Estevez, Yannick; Zhendre, Vanessa; Henry, Sarah; Thévenot, Julie; Dufourc, Erick J; Alves, Isabel D; Peruch, Frédéric

    2014-01-01

    The biomembrane surrounding rubber particles from the Hevea latex is well known for its content of numerous allergen proteins. HbREF (Hevb1) and HbSRPP (Hevb3) are major components, linked on rubber particles, and they have been shown to be involved in rubber synthesis or quality (mass regulation), but their exact function is still to be determined. In this study we highlighted the different modes of interaction of both recombinant proteins with various membrane models (lipid monolayers, liposomes or supported bilayers, and multilamellar vesicles) to mimic the latex particle membrane. We combined various biophysical methods (polarization-modulation infrared reflection-absorption spectroscopy (PM-IRRAS)/ellipsometry, attenuated total reflectance Fourier-transform infrared (ATR-FTIR) spectroscopy, solid-state nuclear magnetic resonance (NMR), plasmon waveguide resonance (PWR), and fluorescence spectroscopy) to elucidate their interactions. Small rubber particle protein (SRPP) shows less affinity than rubber elongation factor (REF) for the membranes but displays a kind of "covering" effect on the lipid headgroups without disturbing the membrane integrity. Its structure is conserved in the presence of lipids. In contrast, REF demonstrates higher membrane affinity with changes in its aggregation properties; the amyloid nature of REF, which we previously reported, is not favored in the presence of lipids. REF binds and inserts into membranes. The membrane integrity is highly perturbed, and we suspect that REF is even able to remove lipids from the membrane, leading to the formation of mixed micelles. These two homologous proteins thus show affinity for all membrane models tested but differ markedly in their modes of interaction.

  6. Modeling software behavior a craftsman's approach

    CERN Document Server

    Jorgensen, Paul C

    2009-01-01

    A common problem with most texts on requirements specifications is that they emphasize structural models to the near exclusion of behavioral models, focusing on what the software is rather than what it does. If they do cover behavioral models, the coverage is brief and usually focused on a single model. Modeling Software Behavior: A Craftsman's Approach provides detailed treatment of various models of software behavior that support early analysis, comprehension, and model-based testing. Based on the popular and continually evolving course on requirements specification models taught by the auth

  7. Antiparasitic mebendazole shows survival benefit in 2 preclinical models of glioblastoma multiforme.

    Science.gov (United States)

    Bai, Ren-Yuan; Staedtke, Verena; Aprhys, Colette M; Gallia, Gary L; Riggins, Gregory J

    2011-09-01

    Glioblastoma multiforme (GBM) is the most common and aggressive brain cancer, and despite treatment advances, patient prognosis remains poor. During routine animal studies, we serendipitously observed that fenbendazole, a benzimidazole antihelminthic used to treat pinworm infection, inhibited brain tumor engraftment. Subsequent in vitro and in vivo experiments with benzimidazoles identified mebendazole as the more promising drug for GBM therapy. In GBM cell lines, mebendazole displayed cytotoxicity, with half-maximal inhibitory concentrations ranging from 0.1 to 0.3 µM. Mebendazole disrupted microtubule formation in GBM cells, and in vitro activity was correlated with reduced tubulin polymerization. Subsequently, we showed that mebendazole significantly extended mean survival up to 63% in syngeneic and xenograft orthotopic mouse glioma models. Mebendazole has been approved by the US Food and Drug Administration for parasitic infections, has a long track-record of safe human use, and was effective in our animal models with doses documented as safe in humans. Our findings indicate that mebendazole is a possible novel anti-brain tumor therapeutic that could be further tested in clinical trials.

  8. Cybrid models of Parkinson's disease show variable mitochondrial biogenesis and genotype-respiration relationships.

    Science.gov (United States)

    Keeney, Paula M; Dunham, Lisa D; Quigley, Caitlin K; Morton, Stephanie L; Bergquist, Kristen E; Bennett, James P

    2009-12-01

    Sporadic Parkinson's disease (sPD) is a nervous system-wide disease that presents with a bradykinetic movement disorder and frequently progresses to include depression and cognitive impairment. Cybrid models of sPD are based on expression of sPD platelet mitochondrial DNA (mtDNA) in neural cells and demonstrate some similarities to sPD brains. In sPD and CTL cybrids we characterized aspects of mitochondrial biogenesis, mtDNA genomics, composition of the respirasome and the relationships among isolated mitochondrial and intact cell respiration. Cybrid mtDNA levels varied and correlated with expression of PGC-1 alpha, a transcriptional co-activator regulator of mitochondrial biogenesis. Levels of mtDNA heteroplasmic mutations were asymmetrically distributed across the mitochondrial genome; numbers of heteroplasmies were more evenly distributed. Neither levels nor numbers of heteroplasmies distinguished sPD from CTL. sPD cybrid mitochondrial ETC subunit protein levels were not altered. Isolated mitochondrial complex I respiration rates showed limited correlation with whole cell complex I respiration rates in both sPD and CTL cybrids. Intact cell respiration during the normoxic-anoxic transition yielded K(m) values for oxygen that directly related to respiration rates in CTL but not in sPD cell lines. Both sPD and CTL cybrid cells are substantially heterogeneous in mitochondrial genomic and physiologic properties. Our results suggest that mtDNA depletion may occur in sPD neurons and could reflect impairment of mitochondrial biogenesis. Cybrids remain a valuable model for some aspects of sPD but their heterogeneity mitigates against a simple designation of sPD phenotype in this cell model.

  9. Fundamental mathematical model shows that applied electrical field enhances chemotherapy delivery to tumors.

    Science.gov (United States)

    Moarefian, Maryam; Pascal, Jennifer A

    2016-02-01

    Biobarriers imposed by the tumor microenvironment create a challenge to deliver chemotherapeutics effectively. Electric fields can be used to overcome these biobarriers in the form of electrochemotherapy, or by applying an electric field to tissue after chemotherapy has been delivered systemically. A fundamental understanding of the underlying physical phenomena governing tumor response to an applied electrical field is lacking. Building upon the work of Pascal et al. [1], a mathematical model that predicts the fraction of tumor killed due to a direct current (DC) applied electrical field and chemotherapy is developed here for tumor tissue surrounding a single, straight, cylindrical blood vessel. Results show the typical values of various parameters related to properties of the electrical field, tumor tissue and chemotherapy drug that have the most significant influence on the fraction of tumor killed. We show that the applied electrical field enhances tumor death due to chemotherapy and that the direction and magnitude of the applied electrical field have a significant impact on the fraction of tumor killed. Published by Elsevier Inc.

  10. Current approaches to gene regulatory network modelling

    Directory of Open Access Journals (Sweden)

    Brazma Alvis

    2007-09-01

    Full Text Available Abstract Many different approaches have been developed to model and simulate gene regulatory networks. We proposed the following categories for gene regulatory network models: network parts lists, network topology models, network control logic models, and dynamic models. Here we will describe some examples for each of these categories. We will study the topology of gene regulatory networks in yeast in more detail, comparing a direct network derived from transcription factor binding data and an indirect network derived from genome-wide expression data in mutants. Regarding the network dynamics we briefly describe discrete and continuous approaches to network modelling, then describe a hybrid model called Finite State Linear Model and demonstrate that some simple network dynamics can be simulated in this model.
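
    The discrete end of the modelling spectrum mentioned above can be illustrated with a synchronous Boolean network. The three-gene wiring below is invented for illustration; it is not the Finite State Linear Model itself, only the simpler discrete dynamics it generalizes.

```python
# Synchronous update of a tiny Boolean gene regulatory network.
def step(state):
    a, b, c = state
    return (
        not c,          # gene A is repressed by C
        a,              # gene B is activated by A
        a and not b,    # gene C requires A but is repressed by B
    )

state = (True, False, False)
trajectory = []
for t in range(8):
    trajectory.append(state)
    state = step(state)

for t, s in enumerate(trajectory):
    print(t, ["off", "on"][s[0]], ["off", "on"][s[1]], ["off", "on"][s[2]])
```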

  11. Aerobic Toluene Degraders in the Rhizosphere of a Constructed Wetland Model Show Diurnal Polyhydroxyalkanoate Metabolism.

    Science.gov (United States)

    Lünsmann, Vanessa; Kappelmeyer, Uwe; Taubert, Anja; Nijenhuis, Ivonne; von Bergen, Martin; Heipieper, Hermann J; Müller, Jochen A; Jehmlich, Nico

    2016-07-15

    Constructed wetlands (CWs) are successfully applied for the treatment of waters contaminated with aromatic compounds. In these systems, plants provide oxygen and root exudates to the rhizosphere and thereby stimulate microbial degradation processes. Root exudation of oxygen and organic compounds depends on photosynthetic activity and thus may show day-night fluctuations. While diurnal changes in CW effluent composition have been observed, information on corresponding fluctuations of bacterial activity is scarce. Using metaproteomics, we investigated microbial processes in a CW model system treating toluene-contaminated water which showed diurnal oscillations of oxygen concentrations. Quantitative real-time PCR was applied to assess diurnal expression patterns of genes involved in aerobic and anaerobic toluene degradation. We observed stable aerobic toluene turnover by Burkholderiales during the day and night. Polyhydroxyalkanoate synthesis was upregulated in these bacteria during the day, suggesting that they additionally feed on organic root exudates while reutilizing the stored carbon compounds during the night via the glyoxylate cycle. Although mRNA copies encoding the anaerobic enzyme benzylsuccinate synthase (bssA) were relatively abundant and increased slightly at night, the corresponding protein could not be detected in the CW model system. Our study provides insights into diurnal patterns of microbial processes occurring in the rhizosphere of an aquatic ecosystem. Constructed wetlands are a well-established and cost-efficient option for the bioremediation of contaminated waters. While it is commonly accepted knowledge that the function of CWs is determined by the interplay of plants and microorganisms, the detailed molecular processes are considered a black box. Here, we used a well-characterized CW model system treating toluene-contaminated water to investigate the microbial processes influenced by diurnal plant root exudation. Our results indicated stable

  12. Model Oriented Approach for Industrial Software Development

    Directory of Open Access Journals (Sweden)

    P. D. Drobintsev

    2015-01-01

    Full Text Available The article considers the specifics of a model oriented approach to software development based on the usage of Model Driven Architecture (MDA), Model Driven Software Development (MDSD) and Model Driven Development (MDD) technologies. Benefits of using this approach in the software development industry are described. The main emphasis is put on system design, automated code generation for large systems, verification, proof of system properties and reduction of bug density. Drawbacks of the approach are also considered. The approach proposed in the article is specific to industrial software systems development. These systems are characterized by the different levels of abstraction used in the modeling and code development phases. The approach allows the model to be detailed down to the level of the system code while preserving the verified model semantics, and provides checking of the whole detailed model. Steps for translating abstract data structures (including transactions, signals and their parameters) into the data structures used in the detailed system implementation are presented. The grammar of a language for specifying rules for transforming abstract model data structures into the real system's detailed data structures is also described. The results of applying the proposed method in an industrial technology are shown. The article is published in the authors' wording.

  13. Distributed simulation a model driven engineering approach

    CERN Document Server

    Topçu, Okan; Oğuztüzün, Halit; Yilmaz, Levent

    2016-01-01

    Backed by substantive case studies, the novel approach to software engineering for distributed simulation outlined in this text demonstrates the potent synergies between model-driven techniques, simulation, intelligent agents, and computer systems development.

  14. A Bayesian Model Committee Approach to Forecasting Global Solar Radiation

    CERN Document Server

    Lauret, Philippe; Muselli, Marc; David, Mathieu; Diagne, Hadja; Voyant, Cyril

    2012-01-01

    This paper proposes to use a rather new modelling approach in the realm of solar radiation forecasting. In this work, two forecasting models, an Autoregressive Moving Average (ARMA) model and a Neural Network (NN) model, are combined to form a model committee. Bayesian inference is used to assign a probability to each model in the committee. Hence, each model's predictions are weighted by its respective probability. The models are fitted to one year of hourly Global Horizontal Irradiance (GHI) measurements. Another year (the test set) is used for making genuine one hour ahead (h+1) out-of-sample forecast comparisons. The proposed approach is benchmarked against the persistence model. The very first results show an improvement brought by this approach.
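
    The committee mechanism can be sketched as follows: each member's forecasts are weighted by a posterior model probability derived from its likelihood on past data. The two trivial stand-in forecasters, the Gaussian error assumption, and the fixed noise level below replace the fitted ARMA and NN models of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
ghi = 400 + 200 * np.sin(np.linspace(0, 20, 200)) + rng.normal(0, 30, 200)

# Two stand-in forecasters for hour h+1; in the paper these would be the
# fitted ARMA and NN models.
pred_a = ghi[:-1]                               # "model A": persistence-like
pred_b = 0.7 * ghi[:-1] + 0.3 * ghi.mean()      # "model B": mean reversion
target = ghi[1:]

def log_lik(pred, obs, sigma):
    """Gaussian log-likelihood of a forecast series (constant terms dropped)."""
    r = obs - pred
    return -0.5 * np.sum((r / sigma) ** 2) - len(obs) * np.log(sigma)

sigma = 30.0
ll = np.array([log_lik(pred_a[:100], target[:100], sigma),
               log_lik(pred_b[:100], target[:100], sigma)])
post = np.exp(ll - ll.max())
post /= post.sum()                              # posterior model probabilities

committee = post[0] * pred_a[100:] + post[1] * pred_b[100:]
for name, p in zip(("A", "B", "committee"),
                   (pred_a[100:], pred_b[100:], committee)):
    print(name, round(float(np.sqrt(np.mean((p - target[100:]) ** 2))), 1))
```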

  15. SC-535, a Novel Oral Multikinase Inhibitor, Showed Potent Antitumor Activity in Human Melanoma Models

    Directory of Open Access Journals (Sweden)

    Xin Chen

    2013-07-01

    Full Text Available Background: Melanoma is considered one of the most aggressive and deadliest cancers, and current targeted therapies for melanoma often suffer from limited efficacy or drug resistance. Discovery of novel multikinase inhibitors as anti-melanoma drug candidates is still needed. Methods: In this investigation, we assessed the in vitro and in vivo anti-melanoma activities of SC-535, a novel small-molecule multikinase inhibitor recently discovered by us. We analyzed the inhibitory effects of SC-535 on various melanoma cell lines and human umbilical vein endothelial cells (HUVEC) in vitro. Tumor xenografts in athymic mice were used to examine the in vivo activity of SC-535. Results: SC-535 could efficiently inhibit vascular endothelial growth factor receptor (VEGFR) 1/2/3, B-RAF, and C-RAF kinases. It showed significant antiangiogenic potency both in vitro and in vivo and considerable anti-proliferative activity against several melanoma cell lines. Oral administration of SC-535 resulted in dose-dependent suppression of tumor growth in WM2664 and C32 xenograft mouse models. Studies of the mechanisms of action indicated that SC-535 suppresses tumor angiogenesis and induces G2/M phase cell cycle arrest in human melanoma cells. SC-535 possesses favorable pharmacokinetic properties. Conclusion: All of these results support SC-535 as a potential candidate for clinical studies in patients with melanoma.

  16. Progesterone Treatment Shows Benefit in Female Rats in a Pediatric Model of Controlled Cortical Impact Injury.

    Directory of Open Access Journals (Sweden)

    Rastafa I Geddes

    Full Text Available We recently showed that progesterone treatment can reduce lesion size and behavioral deficits after moderate-to-severe bilateral injury to the medial prefrontal cortex in immature male rats. Whether there are important sex differences in response to injury and progesterone treatment in very young subjects has not been given sufficient attention. Here we investigated progesterone's effects in the same model of brain injury but with pre-pubescent females. Twenty-eight-day-old female Sprague-Dawley rats received sham (n = 14) or controlled cortical impact (CCI; n = 21) injury, were given progesterone (8 mg/kg body weight) or vehicle injections on post-injury days (PID) 1-7, and underwent behavioral testing from PID 9-27. Brains were evaluated for lesion size at PID 28. Lesion size in vehicle-treated female rats with CCI injury was smaller than that previously reported for similarly treated age-matched male rats. Treatment with progesterone reduced the effect of CCI on the extent of damage and on behavioral deficits. Pre-pubescent female rats with midline CCI injury to the frontal cortex have reduced morphological and functional deficits following progesterone treatment. While gender differences in susceptibility to this injury were observed, progesterone treatment produced beneficial effects in young rats of both sexes following CCI.

  17. Quantum Machine and SR Approach: a Unified Model

    CERN Document Server

    Garola, C; Sozzo, S; Garola, Claudio; Pykacz, Jaroslav; Sozzo, Sandro

    2005-01-01

    The Geneva-Brussels approach to quantum mechanics (QM) and the semantic realism (SR) nonstandard interpretation of QM exhibit some common features and some deep conceptual differences. We discuss in this paper two elementary models provided in the two approaches as intuitive supports to general reasonings and as a proof of consistency of general assumptions, and show that Aerts' quantum machine can be embodied into a macroscopic version of the microscopic SR model, overcoming the seeming incompatibility between the two models. This result provides some hints for the construction of a unified perspective in which the two approaches can be properly placed.

  AEROBATIC SHOW

    Institute of Scientific and Technical Information of China (English)

    2016-01-01

    Visitors look at plane models of the Commercial Aircraft Corp. of China, developer of the country's first homegrown large passenger jet, the C919, during the Singapore Airshow on February 16. The biennial event is the largest airshow in Asia and one of the most important aviation and defense shows worldwide. A number of Chinese companies took part in the event, during which Okay Airways, the first privately owned airline in China, signed a deal to acquire 12 Boeing 737 jets.

  19. Showing a model's eye movements in examples does not improve learning of problem-solving tasks

    NARCIS (Netherlands)

    van Marlen, Tim; van Wermeskerken, Margot; Jarodzka, Halszka; van Gog, Tamara

    2016-01-01

    Eye movement modeling examples (EMME) are demonstrations of a computer-based task by a human model (e.g., a teacher), with the model's eye movements superimposed on the task to guide learners' attention. EMME have been shown to enhance learning of perceptual classification tasks; however, it is an

  1. What Can the Bohr-Sommerfeld Model Show Students of Chemistry in the 21st Century?

    Science.gov (United States)

    Niaz, Mansoor; Cardellini, Liberato

    2011-01-01

    Bohr's model of the atom is considered to be important by general chemistry textbooks. A shortcoming of this model was that it could not explain the spectra of atoms containing more than one electron. To increase the explanatory power of the model, Sommerfeld hypothesized the existence of elliptical orbits. This study aims to elaborate a framework…

  2. A Set Theoretical Approach to Maturity Models

    DEFF Research Database (Denmark)

    Lasrado, Lester; Vatrapu, Ravi; Andersen, Kim Normann

    2016-01-01

    Maturity Model research in IS has been criticized for the lack of theoretical grounding, methodological rigor, empirical validations, and ignorance of multiple and non-linear paths to maturity. To address these criticisms, this paper proposes a novel set-theoretical approach to maturity models ch...

  3. Modeling diffuse pollution with a distributed approach.

    Science.gov (United States)

    León, L F; Soulis, E D; Kouwen, N; Farquhar, G J

    2002-01-01

    The transferability of parameters for non-point source pollution models to other watersheds, especially those in remote areas without enough data for calibration, is a major problem in diffuse pollution modeling. A water quality component was developed for WATFLOOD (a flood forecast hydrological model) to deal with sediment and nutrient transport. The model uses a distributed group response unit approach for water quantity and quality modeling. Runoff, sediment yield and soluble nutrient concentrations are calculated separately for each land cover class, weighted by area and then routed downstream. The distributed approach for the water quality model for diffuse pollution in agricultural watersheds is described in this paper. Integrating the model with data extracted using GIS technology (Geographical Information Systems) for a local watershed, the model is calibrated for the hydrologic response and validated for the water quality component. With the connection to GIS and the group response unit approach used in this paper, model portability increases substantially, which will improve non-point source modeling at the watershed scale level.
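
    The grouped response unit idea can be sketched in a few lines: the per-land-cover responses are computed separately, weighted by area fraction, and summed for the grid element before routing. All runoff coefficients and sediment concentrations below are invented for illustration.

```python
# Grouped response unit (GRU) sketch: per-land-cover response, weighted by area.
land_cover = {
    #            area_fraction  runoff_coeff  sediment_mg_per_L
    "cropland":     (0.50,         0.35,          180.0),
    "forest":       (0.30,         0.10,           20.0),
    "urban":        (0.20,         0.80,           60.0),
}

rainfall_mm = 25.0          # event rainfall over the grid element
cell_area_km2 = 10.0

total_runoff_mm = 0.0
sediment_load_kg = 0.0
for name, (frac, c_runoff, sed_mg_l) in land_cover.items():
    runoff_mm = c_runoff * rainfall_mm                 # per-class runoff depth
    volume_l = runoff_mm * 1e-3 * frac * cell_area_km2 * 1e6 * 1e3  # m3 -> L
    total_runoff_mm += frac * runoff_mm                # area-weighted depth
    sediment_load_kg += volume_l * sed_mg_l * 1e-6     # mg -> kg

print(round(total_runoff_mm, 1), "mm runoff,", round(sediment_load_kg), "kg sediment")
```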

  4. MODULAR APPROACH WITH ROUGH DECISION MODELS

    Directory of Open Access Journals (Sweden)

    Ahmed T. Shawky

    2012-09-01

    Full Text Available Decision models which adopt rough set theory have been used effectively in many real world applications. However, rough decision models suffer from high computational complexity when dealing with datasets of huge size. In this research we propose a new rough decision model that allows making decisions based on a modularity mechanism. According to the proposed approach, large-size datasets can be divided into arbitrary moderate-size datasets, then a group of rough decision models can be built as separate decision modules. The overall model decision is computed as the consensus decision of all decision modules through some aggregation technique. This approach provides a flexible and quick way for extracting decision rules from large-size information tables using rough decision models.
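
    A sketch of the modularity mechanism, assuming a trivial per-module decision rule (a frequency table over attribute values) in place of proper rough-set rule induction; module decisions are then aggregated by majority vote, one possible consensus technique.

```python
from collections import Counter
import random

random.seed(0)

def make_row():
    a, b = random.randint(0, 2), random.randint(0, 2)
    return (a, b), 1 if a + b >= 3 else 0            # hidden decision rule

table = [make_row() for _ in range(3000)]            # large decision table
modules = [table[i::3] for i in range(3)]            # moderate-size splits

def train_module(rows):
    """Majority decision per attribute combination (stand-in for rough-set rules)."""
    votes = {}
    for attrs, decision in rows:
        votes.setdefault(attrs, Counter())[decision] += 1
    return {attrs: c.most_common(1)[0][0] for attrs, c in votes.items()}

rules = [train_module(m) for m in modules]

def consensus(attrs):
    decisions = [r.get(attrs, 0) for r in rules]      # default decision 0 if unseen
    return Counter(decisions).most_common(1)[0][0]

print([consensus((a, b)) for a in range(3) for b in range(3)])
```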

  6. Modeling approach suitable for energy system

    Energy Technology Data Exchange (ETDEWEB)

    Goetschel, D. V.

    1979-01-01

    Recently, increased attention has been placed on optimization problems related to the determination and analysis of operating strategies for energy systems. Presented in this paper is a nonlinear model that can be used in the formulation of certain energy-conversion system modeling problems. The model lends itself nicely to solution approaches based on nonlinear-programming algorithms and, in particular, to those methods falling into the class of variable metric algorithms for nonlinearly constrained optimization.

  7. Stormwater infiltration trenches: a conceptual modelling approach.

    Science.gov (United States)

    Freni, Gabriele; Mannina, Giorgio; Viviani, Gaspare

    2009-01-01

    In recent years, limitations linked to traditional urban drainage schemes have been pointed out and new approaches are developing, introducing more natural methods for retaining and/or disposing of stormwater. These mitigation measures are generally called Best Management Practices or Sustainable Urban Drainage Systems and they include practices such as infiltration and storage tanks in order to reduce the peak flow and retain part of the polluting components. The introduction of such practices in urban drainage systems entails an upgrade of existing modelling frameworks in order to evaluate their efficiency in mitigating the impact of urban drainage systems on receiving water bodies. While storage tank modelling approaches are quite well documented in the literature, some gaps remain for infiltration facilities, mainly owing to the complexity of the physical processes involved. In this study, a simplified conceptual modelling approach for the simulation of infiltration trenches is presented. The model enables assessment of the performance of infiltration trenches. The main goal is to develop a model that can be employed for the assessment of the mitigation efficiency of infiltration trenches in an integrated urban drainage context. Particular care was given to the simulation of infiltration structures considering the performance reduction due to clogging phenomena. The proposed model has been compared with other simplified modelling approaches and with a physically based model adopted as a benchmark. The model performed better than the other approaches, both for unclogged facilities and when the effect of clogging is considered. On the basis of a long-term simulation with six years of rain data, the performance and the effectiveness of an infiltration trench measure are assessed. The study confirmed the important role played by the clogging phenomenon in such infiltration structures.
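
    The conceptual-model idea can be sketched as a single storage reservoir whose infiltration capacity decays with cumulative infiltrated volume as a clogging proxy. The decay law and all parameter values below are illustrative assumptions, not the calibrated model of the study.

```python
# Conceptual infiltration trench: a storage reservoir whose infiltration
# capacity decreases with cumulative infiltrated volume (simple clogging proxy).
def simulate_trench(inflow, storage_max=50.0, k_inf0=2.0,
                    clogging_scale=500.0, dt=1.0):
    storage, cumulative, overflow_total = 0.0, 0.0, 0.0
    for q_in in inflow:                            # inflow volumes per time step
        k_inf = k_inf0 / (1.0 + cumulative / clogging_scale)  # clogging effect
        infiltrated = min(storage + q_in, k_inf * dt)
        storage = storage + q_in - infiltrated
        if storage > storage_max:                  # excess bypasses the trench
            overflow_total += storage - storage_max
            storage = storage_max
        cumulative += infiltrated
    return cumulative, overflow_total

# Synthetic hourly inflow: one wet day every three days over one month.
rain_series = [5.0 if (t // 24) % 3 == 0 else 0.0 for t in range(24 * 30)]
print(simulate_trench(rain_series))
```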

  8. Challenges in structural approaches to cell modeling.

    Science.gov (United States)

    Im, Wonpil; Liang, Jie; Olson, Arthur; Zhou, Huan-Xiang; Vajda, Sandor; Vakser, Ilya A

    2016-07-31

    Computational modeling is essential for structural characterization of biomolecular mechanisms across the broad spectrum of scales. Adequate understanding of biomolecular mechanisms inherently involves our ability to model them. Structural modeling of individual biomolecules and their interactions has been rapidly progressing. However, in terms of the broader picture, the focus is shifting toward larger systems, up to the level of a cell. Such modeling involves a more dynamic and realistic representation of the interactomes in vivo, in a crowded cellular environment, as well as membranes and membrane proteins, and other cellular components. Structural modeling of a cell complements computational approaches to cellular mechanisms based on differential equations, graph models, and other techniques to model biological networks, imaging data, etc. Structural modeling along with other computational and experimental approaches will provide a fundamental understanding of life at the molecular level and lead to important applications to biology and medicine. A cross section of diverse approaches presented in this review illustrates the developing shift from the structural modeling of individual molecules to that of cell biology. Studies in several related areas are covered: biological networks; automated construction of three-dimensional cell models using experimental data; modeling of protein complexes; prediction of non-specific and transient protein interactions; thermodynamic and kinetic effects of crowding; cellular membrane modeling; and modeling of chromosomes. The review presents an expert opinion on the current state-of-the-art in these various aspects of structural modeling in cellular biology, and the prospects of future developments in this emerging field. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. Evaluating Interventions with Multimethod Data: A Structural Equation Modeling Approach

    Science.gov (United States)

    Crayen, Claudia; Geiser, Christian; Scheithauer, Herbert; Eid, Michael

    2011-01-01

    In many intervention and evaluation studies, outcome variables are assessed using a multimethod approach comparing multiple groups over time. In this article, we show how evaluation data obtained from a complex multitrait-multimethod-multioccasion-multigroup design can be analyzed with structural equation models. In particular, we show how the…

  10. Building Water Models, A Different Approach

    CERN Document Server

    Izadi, Saeed; Onufriev, Alexey V

    2014-01-01

    Simplified, classical models of water are an integral part of atomistic molecular simulations, especially in biology and chemistry where hydration effects are critical. Yet, despite several decades of effort, these models are still far from perfect. Presented here is an alternative approach to constructing point charge water models, currently the most commonly used type. In contrast to the conventional approach, we do not impose any geometry constraints on the model other than symmetry. Instead, we optimize the distribution of point charges to best describe the "electrostatics" of the water molecule, which is key to many unusual properties of liquid water. The search for the optimal charge distribution is performed in the 2D parameter space of the key lowest multipole moments of the model, to find the best fit to a small set of bulk water properties at room temperature. A virtually exhaustive search is enabled via analytical equations that relate the charge distribution to the multipole moments. The resulting "optimal"...
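
    The analytical link between charge distribution and multipole moments is easiest to see in the simplest case: for a rigid three-point geometry, the hydrogen charge that reproduces a chosen dipole moment follows in closed form. The sketch below uses the experimental gas-phase geometry and a standard unit conversion; the paper's actual optimization also involves the quadrupole moment and fitting to bulk properties.

```python
import math

# Rigid 3-point water: charges on O and the two H sites. The dipole moment
# follows analytically from the geometry, so q_H can be solved for directly.
R_OH = 0.9572                   # O-H distance, Angstrom (gas-phase geometry)
THETA = math.radians(104.52)    # H-O-H angle
EA_TO_DEBYE = 4.803             # 1 elementary charge * Angstrom, in Debye (approx.)

def h_charge_for_dipole(target_debye):
    """H-site charge (in e) giving the requested dipole; q_O = -2 q_H."""
    return target_debye / (EA_TO_DEBYE * 2.0 * R_OH * math.cos(THETA / 2.0))

for mu in (1.855, 2.35):        # gas-phase value and a typical liquid-model value
    qh = h_charge_for_dipole(mu)
    print(f"dipole {mu:4.2f} D  ->  q_H = {qh:+.3f} e, q_O = {-2 * qh:+.3f} e")
```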

  11. Towards new approaches in phenological modelling

    Science.gov (United States)

    Chmielewski, Frank-M.; Götz, Klaus-P.; Rawel, Harshard M.; Homann, Thomas

    2014-05-01

    Modelling of phenological stages has been based on temperature sums for many decades, describing both the chilling and the forcing requirement of woody plants until the beginning of leafing or flowering. Parts of this approach go back to Reaumur (1735), who originally proposed the concept of growing degree-days. Now, there is a growing body of opinion that asks for new methods in phenological modelling and more in-depth studies on dormancy release of woody plants. This requirement is easily understandable if we consider the wide application of phenological models, which can even affect the results of climate models. To this day, a number of parameters in phenological models still need to be optimised on observations, although some basic physiological knowledge of the chilling and forcing requirement of plants is already considered in these approaches (semi-mechanistic models). Limiting for a fundamental improvement of these models is the lack of knowledge about the course of dormancy in woody plants, which cannot be directly observed and which is also insufficiently described in the literature. Modern metabolomic methods provide a solution for this problem and allow both the validation of currently used phenological models and the development of mechanistic approaches. In order to develop this kind of model, changes in metabolites (concentration, temporal course) must be set in relation to the variability of environmental (steering) parameters (weather, day length, etc.). This necessarily requires multi-year (3-5 yr) and high-resolution (weekly samples between autumn and spring) data. The feasibility of this approach has already been tested in a 3-year pilot study on sweet cherries. The suggested methodology is not limited to the flowering of fruit trees; it can also be applied to tree species of the natural vegetation, where even greater deficits in phenological modelling exist.
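
    For reference, the classical temperature-sum (forcing) model discussed above fits in a few lines: flowering is predicted on the day the accumulated growing degree-days exceed a threshold. The base temperature, starting date and forcing requirement are illustrative, and the chilling phase is simply assumed to be complete.

```python
def predict_bloom_day(daily_mean_temps, t_base=5.0, forcing_requirement=150.0):
    """Day index on which the growing-degree-day sum reaches the requirement."""
    gdd = 0.0
    for day, temp in enumerate(daily_mean_temps, start=1):
        gdd += max(0.0, temp - t_base)      # daily growing degree-days
        if gdd >= forcing_requirement:
            return day
    return None                             # requirement not met in this series

# Daily mean temperatures from 1 February onward (synthetic warming spring).
temps = [2.0 + 0.15 * d for d in range(120)]
print("predicted bloom on day", predict_bloom_day(temps), "after 1 February")
```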

  12. Bayesian approach to decompression sickness model parameter estimation.

    Science.gov (United States)

    Howle, L E; Weber, P W; Nichols, J M

    2017-03-01

    We examine both maximum likelihood and Bayesian approaches for estimating probabilistic decompression sickness model parameters. Maximum likelihood estimation treats parameters as fixed values and determines the best estimate through repeated trials, whereas the Bayesian approach treats parameters as random variables and determines the parameter probability distributions. We would ultimately like to know the probability that a parameter lies in a certain range rather than simply make statements about the repeatability of our estimator. Although both represent powerful methods of inference, for models with complex or multi-peaked likelihoods, maximum likelihood parameter estimates can prove more difficult to interpret than the estimates of the parameter distributions provided by the Bayesian approach. For models of decompression sickness, we show that while these two estimation methods are complementary, the credible intervals generated by the Bayesian approach are more naturally suited to quantifying uncertainty in the model parameters.
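
    The contrast can be sketched on a toy one-parameter risk model (a per-dive probability of decompression sickness treated as a Bernoulli parameter): maximum likelihood returns a point estimate, while a grid posterior yields a credible interval. Real decompression models have many parameters and far more complex likelihoods, so this is only an illustration of the two styles of inference.

```python
import numpy as np

rng = np.random.default_rng(7)
outcomes = rng.random(200) < 0.04        # 200 dives, true per-dive risk 4%
k, n = outcomes.sum(), outcomes.size

# Maximum likelihood: a single point estimate of the risk parameter.
p_mle = k / n

# Bayesian: posterior over a grid with a flat prior, then a 95% credible interval.
grid = np.linspace(1e-4, 0.3, 3000)
log_post = k * np.log(grid) + (n - k) * np.log(1.0 - grid)   # flat prior
post = np.exp(log_post - log_post.max())
post /= post.sum()
cdf = np.cumsum(post)
lo, hi = grid[np.searchsorted(cdf, 0.025)], grid[np.searchsorted(cdf, 0.975)]

print(f"MLE p = {p_mle:.3f}")
print(f"95% credible interval: [{lo:.3f}, {hi:.3f}]")
```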

  13. Simple solvable energy-landscape model that shows a thermodynamic phase transition and a glass transition.

    Science.gov (United States)

    Naumis, Gerardo G

    2012-06-01

    When a liquid melt is cooled, a glass or phase transition can be obtained depending on the cooling rate. Yet, this behavior has not been clearly captured in energy-landscape models. Here, a model is provided in which two key ingredients are considered in the landscape: metastable states and their multiplicity. Metastable states are considered as in two-level-system models. However, their multiplicity and topology allow a phase transition in the thermodynamic limit for slow cooling, while a transition to the glass is obtained for fast cooling. By solving the corresponding master equation, the minimal speed of cooling required to produce the glass is obtained as a function of the distribution of metastable states.
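
    The mechanism can be sketched with a single metastable state: integrating the master equation of a two-level landscape while lowering the temperature gives near-complete relaxation for slow cooling and trapping of the metastable population (a glass) for fast cooling. Energy scales, attempt frequency and cooling rates below are illustrative, not the paper's values.

```python
import math

def cool(rate, e_barrier=3.0, e_gap=2.0, nu=1.0, t0=5.0, t_min=0.05, dt=1e-3):
    """Euler integration of the two-level master equation under linear cooling."""
    p_meta = 1.0 / (1.0 + math.exp(e_gap / t0))   # start equilibrated at T0
    temp = t0
    while temp > t_min:
        k_out = nu * math.exp(-e_barrier / temp)            # escape metastable state
        k_in = nu * math.exp(-(e_barrier + e_gap) / temp)   # re-enter it
        p_meta += dt * (k_in * (1.0 - p_meta) - k_out * p_meta)
        temp -= rate * dt
    return p_meta

for rate in (1e-2, 1.0, 100.0):      # slow, intermediate, fast cooling
    print(f"cooling rate {rate:g}: residual metastable population = {cool(rate):.3f}")
```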

  14. Comparison of approaches for parameter estimation on stochastic models: Generic least squares versus specialized approaches.

    Science.gov (United States)

    Zimmer, Christoph; Sahle, Sven

    2016-04-01

    Parameter estimation for models with intrinsic stochasticity poses specific challenges that do not exist for deterministic models. Therefore, specialized numerical methods for parameter estimation in stochastic models have been developed. Here, we study whether dedicated algorithms for stochastic models are indeed superior to the naive approach of applying the readily available least squares algorithm designed for deterministic models. We compare the performance of the recently developed multiple shooting for stochastic systems (MSS) method designed for parameter estimation in stochastic models, a stochastic differential equation based Bayesian approach and a chemical master equation based technique with the least squares approach for parameter estimation in models of ordinary differential equations (ODE). As test data, 1000 realizations of the stochastic models are simulated. For each realization an estimation is performed with each method, resulting in 1000 estimates for each approach. These are compared with respect to their deviation from the true parameter and, for the genetic toggle switch, also their ability to reproduce the symmetry of the switching behavior. Results are shown for different sets of parameter values of a genetic toggle switch leading to symmetric and asymmetric switching behavior, as well as for an immigration-death and a susceptible-infected-recovered model. This comparison shows that it is important to choose a parameter estimation technique that can treat intrinsic stochasticity, and that among such techniques the specific choice of algorithm leads to only minor performance differences.
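
    The flavour of the comparison on the immigration-death model can be sketched as follows: one stochastic realization is generated with the Gillespie algorithm, and the parameters are then recovered by naive least squares against the deterministic mean, here via a coarse grid search standing in for a generic least-squares optimizer. The specialized MSS, Bayesian and master-equation methods of the paper are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(3)

def gillespie(lam, mu, x0, t_grid):
    """One realization of immigration (rate lam) and per-capita death (rate mu)."""
    t, x = 0.0, x0
    out, idx = np.empty(len(t_grid)), 0
    while idx < len(t_grid):
        total = lam + mu * x
        t_next = t + rng.exponential(1.0 / total)
        while idx < len(t_grid) and t_grid[idx] < t_next:
            out[idx] = x                     # state is constant until the next jump
            idx += 1
        t = t_next
        x += 1 if rng.random() < lam / total else -1
    return out

def ode_mean(lam, mu, x0, t):
    """Deterministic mean of the immigration-death process."""
    return lam / mu + (x0 - lam / mu) * np.exp(-mu * t)

t_grid = np.linspace(0.0, 10.0, 50)
data = gillespie(lam=10.0, mu=1.0, x0=0, t_grid=t_grid)

# Naive least squares: fit the deterministic mean to one noisy realization.
lams, mus = np.linspace(5, 15, 101), np.linspace(0.5, 2.0, 76)
best = min((np.sum((data - ode_mean(l, m, 0, t_grid)) ** 2), l, m)
           for l in lams for m in mus)
print(f"true (lam, mu) = (10, 1),  least-squares fit = ({best[1]:.1f}, {best[2]:.2f})")
```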

  15. Reexamination of the State of the Art Cloud Modeling Shows Real Improvements

    Energy Technology Data Exchange (ETDEWEB)

    Muehlbauer, Andreas D.; Grabowski, Wojciech W.; Malinowski, S. P.; Ackerman, Thomas P.; Bryan, George; Lebo, Zachary; Milbrandt, Jason; Morrison, H.; Ovchinnikov, Mikhail; Tessendorf, Sarah; Theriault, Julie M.; Thompson, Gregory

    2013-05-25

    Following up on an almost thirty-year-long history of International Cloud Modeling Workshops, which started with a meeting in Irsee, Germany, in 1985, the 8th International Cloud Modeling Workshop was held in July 2012 in Warsaw, Poland. The workshop, hosted by the Institute of Geophysics at the University of Warsaw, was organized by Szymon Malinowski and his local team of students and co-chaired by Wojciech Grabowski (NCAR/MMM) and Andreas Muhlbauer (University of Washington). International Cloud Modeling Workshops have traditionally been held every four years, typically during the week before the International Conference on Clouds and Precipitation (ICCP). Rooted in the World Meteorological Organization's (WMO) weather modification program, the core objectives of the Cloud Modeling Workshop have been centered on the numerical modeling of clouds, cloud microphysics, and the interactions between cloud microphysics and cloud dynamics. In particular, the goal of the workshop is to provide insight into the pertinent problems of today's state of the art of cloud modeling and to identify key deficiencies in the microphysical representation of clouds in numerical models and cloud parameterizations. In recent years, the workshop has increasingly shifted the focus toward modeling the interactions between aerosols and clouds and provided case studies to investigate both the effects of aerosols on clouds and precipitation as well as the impact of cloud and precipitation processes on aerosols. This time, about 60 (?) scientists from about 10 (?) different countries participated in the workshop and contributed with discussions, oral and poster presentations to the workshop's plenary and breakout sessions. Several case leaders contributed to the workshop by setting up five observationally based case studies covering a wide range of cloud types, namely, marine stratocumulus, mid-latitude squall lines, mid-latitude cirrus clouds, Arctic stratus and winter-time orographic

  16. Functional state modelling approach validation for yeast and bacteria cultivations

    Science.gov (United States)

    Roeva, Olympia; Pencheva, Tania

    2014-01-01

    In this paper, the functional state modelling approach is validated for modelling the cultivation of two different microorganisms: yeast (Saccharomyces cerevisiae) and bacteria (Escherichia coli). Based on the available experimental data for these fed-batch cultivation processes, three different functional states are distinguished, namely a primary product synthesis state, a mixed oxidative state and a secondary product synthesis state. Parameter identification procedures for the different local models are performed using genetic algorithms. The simulation results show a high degree of adequacy of the models describing these functional states for both S. cerevisiae and E. coli cultivations. Thus, the local models are validated for the cultivation of both microorganisms. This fact is a strong verification of the model structure of the functional state modelling theory, not only for a set of yeast cultivations but also for a bacterial cultivation. As such, the obtained results demonstrate the efficiency and efficacy of the functional state modelling approach. PMID:26740778

  17. An optimization approach to kinetic model reduction for combustion chemistry

    CERN Document Server

    Lebiedz, Dirk

    2013-01-01

    Model reduction methods are relevant when the computation time of a full convection-diffusion-reaction simulation based on detailed chemical reaction mechanisms is too large. In this article, we review a model reduction approach based on optimization of trajectories and show its applicability to realistic combustion models. Like most model reduction methods, it identifies points on a slow invariant manifold based on time scale separation in the dynamics of the reaction system. The numerical approximation of points on the manifold is achieved by solving a semi-infinite optimization problem, where the dynamics enter the problem as constraints. The proof of existence of a solution for an arbitrarily chosen dimension of the reduced model (slow manifold) is extended to the case of realistic combustion models including thermochemistry by considering the properties of proper maps. The model reduction approach is finally applied to three models based on realistic reaction mechanisms: 1. ozone decomposition as a small t...

  19. Phenolic Acids from Wheat Show Different Absorption Profiles in Plasma: A Model Experiment with Catheterized Pigs

    DEFF Research Database (Denmark)

    Nørskov, Natalja; Hedemann, Mette Skou; Theil, Peter Kappel

    2013-01-01

    The concentration and absorption of the nine phenolic acids of wheat were measured in a model experiment with catheterized pigs fed whole grain wheat and wheat aleurone diets. Six pigs in a repeated crossover design were fitted with catheters in the portal vein and mesenteric artery to study the ...

  20. Modelling Coagulation Systems: A Stochastic Approach

    CERN Document Server

    Ryazanov, V V

    2011-01-01

    A general stochastic approach to the description of coagulating aerosol systems is developed. As the object of description, one can consider arbitrary mesoscopic quantities (the number of aerosol clusters, their size, etc.). The birth-and-death formalism for the number of clusters can be regarded as a particular case of the generalized storage model. An application of the storage model to the number of monomers in a cluster is discussed.

  1. Application of the Interface Approach in Quantum Ising Models

    OpenAIRE

    Sen, Parongama

    1997-01-01

    We investigate phase transitions in the Ising model and the ANNNI model in transverse field using the interface approach. The exact result of the Ising chain in a transverse field is reproduced. We find that apart from the interfacial energy, there are two other response functions which show simple scaling behaviour. For the ANNNI model in a transverse field, the phase diagram can be fully studied in the region where a ferromagnetic to paramagnetic phase transition occurs. The other region ca...

  2. Animal Models for Muscular Dystrophy Show Different Patterns of Sarcolemmal Disruption

    OpenAIRE

    1997-01-01

    Genetic defects in a number of components of the dystrophin–glycoprotein complex (DGC) lead to distinct forms of muscular dystrophy. However, little is known about how alterations in the DGC are manifested in the pathophysiology present in dystrophic muscle tissue. One hypothesis is that the DGC protects the sarcolemma from contraction-induced damage. Using tracer molecules, we compared sarcolemmal integrity in animal models for muscular dystrophy and in muscular dystrophy patient samples. Ev...

  3. Towards a Multiscale Approach to Cybersecurity Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Hogan, Emilie A.; Hui, Peter SY; Choudhury, Sutanay; Halappanavar, Mahantesh; Oler, Kiri J.; Joslyn, Cliff A.

    2013-11-12

    We propose a multiscale approach to modeling cyber networks, with the goal of capturing a view of the network and overall situational awareness with respect to a few key properties (connectivity, distance, and centrality) for a system under an active attack. We focus on theoretical and algorithmic foundations of multiscale graphs, coming from an algorithmic perspective, with the goal of modeling cyber system defense as a specific use case scenario. We first define a notion of multiscale graphs, in contrast with their well-studied single-scale counterparts. We develop multiscale analogs of paths and distance metrics. As a simple, motivating example of a common metric, we present a multiscale analog of the all-pairs shortest-path problem, along with a multiscale analog of a well-known algorithm which solves it. From a cyber defense perspective, this metric might be used to model the distance from an attacker's position in the network to a sensitive machine. In addition, we investigate probabilistic models of connectivity. These models exploit the hierarchy to quantify the likelihood that sensitive targets might be reachable from compromised nodes. We believe that our novel multiscale approach to modeling cyber-physical systems will advance several aspects of cyber defense, specifically allowing for a more efficient and agile approach to defending these systems.
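
    As a rough illustration of the coarse-then-fine distance idea, the sketch below builds a small two-level network with networkx: machines are grouped into subnets (the coarse scale), a shortest path between subnets is found first, and the fine-grained attacker-to-target path is then restricted to those subnets. The topology, node names and grouping are invented, and this is only a loose analogue of the multiscale algorithms described in the abstract.

```python
# Illustrative two-level ("coarse then fine") shortest-path sketch with networkx.
# Topology and node names are invented; this is not the paper's actual algorithm.
import networkx as nx

G = nx.Graph()
subnets = {                      # coarse-scale grouping of fine-scale nodes
    "dmz":    ["web1", "web2"],
    "office": ["pc1", "pc2", "pc3"],
    "core":   ["db1", "hr-server"],
}
edges = [("web1", "web2"), ("web2", "pc1"), ("pc1", "pc2"), ("pc2", "pc3"),
         ("pc3", "db1"), ("db1", "hr-server"), ("web1", "pc3")]
G.add_edges_from(edges)

# Build the coarse graph: one supernode per subnet, linked if any members are linked.
coarse = nx.Graph()
owner = {n: s for s, members in subnets.items() for n in members}
for u, v in G.edges():
    if owner[u] != owner[v]:
        coarse.add_edge(owner[u], owner[v])

attacker, target = "web1", "hr-server"
coarse_path = nx.shortest_path(coarse, owner[attacker], owner[target])
print("coarse path (subnets):", coarse_path)

# Refine: restrict the fine-scale search to nodes inside the coarse path's subnets.
allowed = {n for s in coarse_path for n in subnets[s]}
fine_path = nx.shortest_path(G.subgraph(allowed), attacker, target)
print("fine path (machines):", fine_path)
```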

  4. A Model Lesson: Finland Shows Us What Equal Opportunity Looks Like

    Science.gov (United States)

    Sahlberg, Pasi

    2012-01-01

    International indicators show that Finland has one of the most educated citizenries in the world, provides educational opportunities in an egalitarian manner, and makes efficient use of resources. But at the beginning of the 1990s, education in Finland was nothing special in international terms. The performance of Finnish students on international…

  5. A Murine Model of Candida glabrata Vaginitis Shows No Evidence of an Inflammatory Immunopathogenic Response

    Science.gov (United States)

    Nash, Evelyn E.; Peters, Brian M.; Lilly, Elizabeth A.; Noverr, Mairi C.; Fidel, Paul L.

    2016-01-01

    Candida glabrata is the second most common organism isolated from women with vulvovaginal candidiasis (VVC), particularly in women with uncontrolled diabetes mellitus. However, mechanisms involved in the pathogenesis of C. glabrata-associated VVC are unknown and have not been studied at any depth in animal models. The objective of this study was to evaluate host responses to infection following efforts to optimize a murine model of C. glabrata VVC. For this, various designs were evaluated for consistent experimental vaginal colonization (i.e., type 1 and type 2 diabetic mice, exogenous estrogen, varying inocula, and co-infection with C. albicans). Upon model optimization, vaginal fungal burden and polymorphonuclear neutrophil (PMN) recruitment were assessed longitudinally over 21 days post-inoculation, together with vaginal concentrations of IL-1β, S100A8 alarmin, lactate dehydrogenase (LDH), and in vivo biofilm formation. Consistent and sustained vaginal colonization with C. glabrata was achieved in estrogenized streptozotocin-induced type 1 diabetic mice. Vaginal PMN infiltration was consistently low, with IL-1β, S100A8, and LDH concentrations similar to uninoculated mice. Biofilm formation was not detected in vivo, and co-infection with C. albicans did not induce synergistic immunopathogenic effects. These data suggest that experimental vaginal colonization of C. glabrata is not associated with an inflammatory immunopathogenic response or biofilm formation. PMID:26807975

  6. The PROMETHEUS bundled payment experiment: slow start shows problems in implementing new payment models.

    Science.gov (United States)

    Hussey, Peter S; Ridgely, M Susan; Rosenthal, Meredith B

    2011-11-01

    Fee-for-service payment is blamed for many of the problems observed in the US health care system. One of the leading alternative payment models proposed in the Affordable Care Act of 2010 is bundled payment, which provides payment for all of the care a patient needs over the course of a defined clinical episode, instead of paying for each discrete service. We evaluated the initial "road test" of PROMETHEUS Payment, one of several bundled payment pilot projects. The project has faced substantial implementation challenges, and none of the three pilot sites had executed contracts or made bundled payments as of May 2011. The pilots have taken longer to set up than expected, primarily because of the complexity of the payment model and the fact that it builds on the existing fee-for-service payment system and other complexities of health care. Participants continue to see promise and value in the bundled payment model, but the pilot results suggest that the desired benefits of this and other payment reforms may take time and considerable effort to materialize.

  7. A Murine Model of Candida glabrata Vaginitis Shows No Evidence of an Inflammatory Immunopathogenic Response.

    Directory of Open Access Journals (Sweden)

    Evelyn E Nash

    Full Text Available Candida glabrata is the second most common organism isolated from women with vulvovaginal candidiasis (VVC), particularly in women with uncontrolled diabetes mellitus. However, mechanisms involved in the pathogenesis of C. glabrata-associated VVC are unknown and have not been studied at any depth in animal models. The objective of this study was to evaluate host responses to infection following efforts to optimize a murine model of C. glabrata VVC. For this, various designs were evaluated for consistent experimental vaginal colonization (i.e., type 1 and type 2 diabetic mice, exogenous estrogen, varying inocula, and co-infection with C. albicans). Upon model optimization, vaginal fungal burden and polymorphonuclear neutrophil (PMN) recruitment were assessed longitudinally over 21 days post-inoculation, together with vaginal concentrations of IL-1β, S100A8 alarmin, lactate dehydrogenase (LDH), and in vivo biofilm formation. Consistent and sustained vaginal colonization with C. glabrata was achieved in estrogenized streptozotocin-induced type 1 diabetic mice. Vaginal PMN infiltration was consistently low, with IL-1β, S100A8, and LDH concentrations similar to uninoculated mice. Biofilm formation was not detected in vivo, and co-infection with C. albicans did not induce synergistic immunopathogenic effects. These data suggest that experimental vaginal colonization of C. glabrata is not associated with an inflammatory immunopathogenic response or biofilm formation.

  8. Post-16 Biology--Some Model Approaches?

    Science.gov (United States)

    Lock, Roger

    1997-01-01

    Outlines alternative approaches to the teaching of difficult concepts in A-level biology which may help student learning by making abstract ideas more concrete and accessible. Examples include models, posters, and poems for illustrating meiosis, mitosis, genetic mutations, and protein synthesis. (DDR)

  9. Turn-Taking Model in the Chinese Recruitment Reality show-BelongtoYou

    Institute of Scientific and Technical Information of China (English)

    AI Fan-qing

    2014-01-01

    Based on the theories of conversational analysis proposed by Sacks et al., this paper examines excerpts of candidate interviews from the Chinese recruitment reality TV show BelongtoYou on Tianjin TV. Through analysis of these excerpts, it demonstrates how the rules of turn-taking are applied in the program and summarizes the features of the turn-taking strategies used by the host, candidates and bosses.

  10. Global thermal niche models of two European grasses show high invasion risks in Antarctica.

    Science.gov (United States)

    Pertierra, Luis R; Aragón, Pedro; Shaw, Justine D; Bergstrom, Dana M; Terauds, Aleks; Olalla-Tárraga, Miguel Ángel

    2016-12-14

    The two non-native grasses that have established long-term populations in Antarctica (Poa pratensis and Poa annua) were studied from a global multidimensional thermal niche perspective to address the biological invasion risk to Antarctica. These two species exhibit contrasting introduction histories and reproductive strategies and represent two referential case studies of biological invasion processes. We used a multistep process with a range of species distribution modelling techniques (ecological niche factor analysis, multidimensional envelopes, distance/entropy algorithms) together with a suite of thermoclimatic variables, to characterize the potential ranges of these species. Their native bioclimatic thermal envelopes in Eurasia, together with the different naturalized populations across continents, were compared next. The potential niche of P. pratensis was wider at the cold extremes; however, P. annua life history attributes enable it to be a more successful colonizer. We observe that particularly cold summers are a key aspect of the unique Antarctic environment. In consequence, ruderals such as P. annua can quickly expand under such harsh conditions, whereas the more stress-tolerant P. pratensis endures and persists through steady growth. Compiled data on human pressure at the Antarctic Peninsula allowed us to provide site-specific biosecurity risk indicators. We conclude that several areas across the region are vulnerable to invasions from these and other similar species. This can only be visualized in species distribution models (SDMs) when accounting for founder populations that reveal nonanalogous conditions. Results reinforce the need for strict management practices to minimize introductions. Furthermore, our novel set of temperature-based bioclimatic GIS layers for ice-free terrestrial Antarctica provides a mechanism for regional and global species distribution models to be built for other potentially invasive species.
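
    A much-reduced version of the bioclimatic-envelope idea can be sketched as follows: characterize a species' native thermal envelope by the per-variable range of a few temperature variables at its occurrence sites, then flag candidate sites whose values fall inside (or within a small margin of) that envelope. The variables and numbers below are invented placeholders; the study itself relied on far richer techniques (ecological niche factor analysis, multidimensional envelopes, distance/entropy algorithms) and real thermoclimatic layers.

```python
# Minimal bioclimatic thermal-envelope sketch (invented numbers, two variables only).
import numpy as np

# Native-range occurrences: columns = mean summer temp (C), mean annual temp (C).
native = np.array([[12.0, 5.0], [15.0, 7.5], [10.5, 3.0], [14.0, 6.0], [11.0, 4.2]])

# Envelope = per-variable min/max over native occurrences, optionally widened.
lo, hi = native.min(axis=0), native.max(axis=0)
margin = 0.10 * (hi - lo)            # tolerate slight extrapolation at the edges
lo, hi = lo - margin, hi + margin

# Candidate ice-free sites (again, invented values).
sites = {"Site A": [2.5, -4.0], "Site B": [9.8, 2.9], "Site C": [11.2, 3.5]}

for name, vals in sites.items():
    inside = np.all((np.array(vals) >= lo) & (np.array(vals) <= hi))
    print(f"{name}: {'inside envelope, potential invasion risk' if inside else 'outside envelope'}")
```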

  11. A new bovine conjunctiva model shows that Listeria monocytogenes invasion is associated with lysozyme resistance.

    Science.gov (United States)

    Warren, Jessica; Owen, A Rhys; Glanvill, Amy; Francis, Asher; Maboni, Grazieli; Nova, Rodrigo J; Wapenaar, Wendela; Rees, Catherine; Tötemeyer, Sabine

    2015-08-31

    Listerial keratoconjunctivitis ('silage eye') is a widespread problem in ruminants, causing economic losses to farmers and impacting negatively on animal welfare. It results from direct entry of Listeria monocytogenes into the eye, often following consumption of contaminated silage. An isolation protocol for bovine conjunctival swabbing was developed and used to sample both infected and healthy bovine eyes (n=46). L. monocytogenes was isolated from only one healthy eye sample, suggesting that this organism can be present without causing disease. To initiate a study of this disease, an infection model was developed using isolated conjunctiva explants obtained from cattle eyes post slaughter. Conjunctiva explants were cultured and infected for 20 h with a range of L. monocytogenes isolates (n=11), including the healthy bovine eye isolate and also strains isolated from other bovine sources, such as milk or clinical infections. Two L. monocytogenes isolates (one from a healthy eye and one from a cattle abortion) were markedly less able to invade conjunctiva explants, but one of those was able to efficiently infect Caco2 cells, indicating that it was fully virulent. These two isolates were also significantly more sensitive to lysozyme compared to most other isolates tested, suggesting that lysozyme resistance is an important factor when infecting bovine conjunctiva. In conclusion, we present the first bovine conjunctiva explant model for infection studies and demonstrate that clinical L. monocytogenes isolates from cases of bovine keratoconjunctivitis are able to infect these tissues.

  12. Progesterone treatment shows benefit in a pediatric model of moderate to severe bilateral brain injury.

    Directory of Open Access Journals (Sweden)

    Rastafa I Geddes

    Full Text Available PURPOSE: Controlled cortical impact (CCI) models in adult and aged Sprague-Dawley (SD) rats have been used extensively to study medial prefrontal cortex (mPFC) injury and the effects of post-injury progesterone treatment, but the hormone's effects after traumatic brain injury (TBI) in juvenile animals have not been determined. In the present proof-of-concept study we investigated whether progesterone had neuroprotective effects in a pediatric model of moderate to severe bilateral brain injury. METHODS: Twenty-eight-day-old (PND 28) male Sprague-Dawley rats received sham (n = 24) or CCI (n = 47) injury and were given progesterone (4, 8, or 16 mg/kg per 100 g body weight) or vehicle injections on post-injury days (PID) 1-7, subjected to behavioral testing from PID 9-27, and analyzed for lesion size at PID 28. RESULTS: The 8 and 16 mg/kg doses of progesterone were observed to be most beneficial in reducing the effect of CCI on lesion size and behavior in PND 28 male SD rats. CONCLUSION: Our findings suggest that a midline CCI injury to the frontal cortex will reliably produce a moderate TBI comparable to what is seen in the adult male rat and that progesterone can ameliorate the injury-induced deficits.

  13. A New Detection Approach Based on the Maximum Entropy Model

    Institute of Scientific and Technical Information of China (English)

    DONG Xiaomei; XIANG Guang; YU Ge; LI Xiaohua

    2006-01-01

    The maximum entropy model was introduced and a new intrusion detection approach based on the maximum entropy model was proposed. The vector space model was adopted for data presentation, and the minimal entropy partitioning method was utilized for attribute discretization. Experiments on the KDD CUP 1999 standard data set were designed and the experimental results are reported. The receiver operating characteristic (ROC) curve analysis approach was utilized to analyze the experimental results. The analysis shows that the proposed approach is comparable to those based on support vector machines (SVM) and outperforms those based on C4.5 and naive Bayes classifiers. According to the overall evaluation, the proposed approach is slightly better than those based on SVM.
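
    In the binary case, a maximum entropy classifier over vector-space features is equivalent to logistic regression, so the overall pipeline can be sketched with scikit-learn on synthetic data: train a detector, score the test set, and summarize performance with a ROC curve, as the paper does. The feature construction, entropy-based discretization and KDD CUP 1999 data handling are omitted; everything below is an illustrative stand-in rather than the authors' system.

```python
# Sketch of a maximum-entropy (logistic regression) detector evaluated with ROC/AUC.
# Synthetic data stands in for vector-space features of the KDD CUP 1999 records.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score, roc_curve

X, y = make_classification(n_samples=2000, n_features=20, weights=[0.9, 0.1],
                           random_state=0)                 # ~10% "intrusions"
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)    # MaxEnt = logistic regression
scores = clf.predict_proba(X_te)[:, 1]

fpr, tpr, _ = roc_curve(y_te, scores)
print("AUC:", roc_auc_score(y_te, scores))
print("detection rate at <=1% false-alarm rate:", tpr[fpr <= 0.01][-1])
```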

  14. A model SN2 reaction ‘on water’ does not show rate enhancement

    Science.gov (United States)

    Nelson, Katherine V.; Benjamin, Ilan

    2011-05-01

    Molecular dynamics calculations of the benchmark nucleophilic substitution reaction (SN2) Cl- + CH3Cl are carried out at the water liquid/vapor interface. The reaction free energy profile and the activation free energy are determined as a function of the reactants' location normal to the surface. The activation free energy remains almost constant relative to that in bulk water, despite the fact that the barrier is expected to significantly decrease as the reaction is carried out near the vapor phase. We show that this is due to the combined effects of a clustering of water molecules around the nucleophile and a relatively weak hydration of the transition state.

  15. Heat transfer modeling an inductive approach

    CERN Document Server

    Sidebotham, George

    2015-01-01

    This innovative text emphasizes a "less-is-more" approach to modeling complicated systems such as heat transfer by treating them first as "1-node lumped models" that yield simple closed-form solutions. The author develops numerical techniques for students to obtain more detail, but also trains them to use the techniques only when simpler approaches fail. Covering all essential methods offered in traditional texts, but with a different order, Professor Sidebotham stresses inductive thinking and problem solving as well as a constructive understanding of modern, computer-based practice. Readers learn to develop their own code in the context of the material, rather than just how to use packaged software, offering a deeper, intrinsic grasp behind models of heat transfer. Developed from over twenty-five years of lecture notes to teach students of mechanical and chemical engineering at The Cooper Union for the Advancement of Science and Art, the book is ideal for students and practitioners across engineering discipl...

  16. In vitro and in vivo models of Huntington's disease show alterations in the endocannabinoid system.

    Science.gov (United States)

    Bari, Monica; Battista, Natalia; Valenza, Marta; Mastrangelo, Nicolina; Malaponti, Marinella; Catanzaro, Giuseppina; Centonze, Diego; Finazzi-Agrò, Alessandro; Cattaneo, Elena; Maccarrone, Mauro

    2013-07-01

    In this study, we analyzed the components of the endocannabinoid system (ECS) in R6/2 mice, a widely used model of Huntington's disease (HD). We measured the endogenous content of N-arachidonoylethanolamine and 2-arachidonoylglycerol and the activity of their biosynthetic enzymes (N-acyl-phosphatidylethanolamine-hydrolyzing phospholipase D and diacylglycerol lipase, respectively) and hydrolytic enzymes [fatty acid amide hydrolase (FAAH) and monoacylglycerol lipase, respectively] and of their target receptors (type 1 cannabinoid receptor, type 2 cannabinoid receptor, and transient receptor potential vanilloid-1) in the brains of wild-type and R6/2 mice of different ages, as well as in the striatum and cortex of 12-week-old animals. In addition, we measured FAAH activity in lymphocytes of R6/2 mice. In the whole brains of 12-week-old R6/2 mice, we found reductions in N-acyl-phosphatidylethanolamine-hydrolyzing phospholipase D activity, diacylglycerol lipase activity and cannabinoid receptor binding, mostly associated with changes in the striatum but not in the cortex, as well as an increase in 2-arachidonoylglycerol content as compared with wild-type littermates, without any other change in ECS elements. Then, our analysis was extended to HD43 cells, an inducible cellular model of HD derived from rat ST14A cells. In both induced and noninduced conditions, we demonstrated a fully functional ECS. Overall, our data suggest that the ECS is differently affected in mouse and human HD, and that HD43 cells are suitable for high-throughput screening of FAAH-oriented drugs affecting HD progression.

  17. A zebrafish model of glucocorticoid resistance shows serotonergic modulation of the stress response

    Directory of Open Access Journals (Sweden)

    Brian eGriffiths

    2012-10-01

    Full Text Available One function of glucocorticoids is to restore homeostasis after an acute stress response by providing negative feedback to stress circuits in the brain. Loss of this negative feedback leads to elevated physiological stress and may contribute to depression, anxiety and post-traumatic stress disorder. We investigated the early, developmental effects of glucocorticoid signaling deficits on stress physiology and related behaviors using a mutant zebrafish, grs357, with non-functional glucocorticoid receptors. These mutants are morphologically inconspicuous and adult-viable. A previous study of adult grs357 mutants showed loss of glucocorticoid-mediated negative feedback and elevated physiological and behavioral stress markers. Already at five days post-fertilization, mutant larvae had elevated whole-body cortisol, increased expression of pro-opiomelanocortin (POMC), the precursor of adrenocorticotropic hormone (ACTH), and failed to show normal suppression of stress markers after dexamethasone treatment. Mutant larvae had larger auditory-evoked startle responses compared to wildtype sibling controls (grwt), despite having lower spontaneous activity levels. Fluoxetine (Prozac) treatment in mutants decreased startle responding and increased spontaneous activity, making them behaviorally similar to wildtype. This result mirrors known effects of selective serotonin reuptake inhibitors (SSRIs) in modifying glucocorticoid signaling and alleviating stress disorders in human patients. Our results suggest that larval grs357 zebrafish can be used to study behavioral, physiological and molecular aspects of stress disorders. Most importantly, interactions between glucocorticoid and serotonin signaling appear to be highly conserved among vertebrates, suggesting deep homologies at the neural circuit level and opening up new avenues for research into psychiatric conditions.

  18. The atherogenic Scarb1 null mouse model shows a high bone mass phenotype.

    Science.gov (United States)

    Martineau, Corine; Martin-Falstrault, Louise; Brissette, Louise; Moreau, Robert

    2014-01-01

    Scavenger receptor class B, type I (SR-BI), the Scarb1 gene product, is a receptor associated with cholesteryl ester uptake from high-density lipoproteins (HDL), which drives cholesterol movement from peripheral tissues toward the liver for excretion; consequently, Scarb1 null mice are prone to atherosclerosis. Because studies have linked atherosclerosis incidence with osteoporosis, we characterized the bone metabolism in these mice. Bone morphometry was assessed through microcomputed tomography and histology. Marrow stromal cells (MSCs) were used to characterize the influence of endogenous SR-BI on cell functions. Total and HDL-associated cholesterol in null mice were increased by 32-60%, consistent with its role in lipoprotein metabolism. Distal metaphyses from 2- and 4-mo-old null mice showed, respectively, 46% and 37% higher bone volume fractions associated with a higher number of trabeculae. Histomorphometric analyses in 2-mo-old null male mice revealed 1.42-fold greater osteoblast surface, 1.37-fold higher percent mineralizing surface, and a 1.69-fold enhanced bone formation rate. In vitro assays for MSCs from null mice revealed a 37% higher proliferation rate, 48% more alkaline phosphatase activity, 70% greater mineralization potential and 2-fold higher osterix (Sp7) expression, yet a 0.5-fold decrease in caveolin-1 (Cav1) expression. Selective uptake levels of HDL-associated cholesteryl oleate and estradiol were similar between MSCs from wild-type and Scarb1 null mice, suggesting that its contribution to this process is not its main role in these cells. However, Scarb1 knockout stunted the HDL-dependent regulation of Cav1 gene expression. Scarb1 null mice are not prone to osteoporosis but show higher bone mass associated with enhanced bone formation.

  19. Metabolic remodeling agents show beneficial effects in the dystrophin-deficient mdx mouse model

    Directory of Open Access Journals (Sweden)

    Jahnke Vanessa E

    2012-08-01

    Full Text Available Background: Duchenne muscular dystrophy is a genetic disease involving severe muscle wasting that is characterized by cycles of muscle degeneration/regeneration and culminates in early death in affected boys. Mitochondria are presumed to be involved in the regulation of myoblast proliferation/differentiation; enhancing mitochondrial activity with exercise mimetics (AMPK and PPAR-delta agonists) increases muscle function and inhibits muscle wasting in healthy mice. We therefore asked whether metabolic remodeling agents that increase mitochondrial activity would improve muscle function in mdx mice. Methods: Twelve-week-old mdx mice were treated with two different metabolic remodeling agents (GW501516 and AICAR), separately or in combination, for 4 weeks. Extensive systematic behavioral, functional, histological, biochemical, and molecular tests were conducted to assess the drugs' effects. Results: We found a gain in body and muscle weight in all treated mice. Histologic examination showed a decrease in muscle inflammation and in the number of fibers with central nuclei and an increase in fibers with peripheral nuclei, with significantly fewer activated satellite cells and regenerating fibers. Together with an inhibition of FoXO1 signaling, these results indicated that the treatments reduced ongoing muscle damage. Conclusions: The three treatments produced significant improvements in disease phenotype, including an increase in overall behavioral activity and significant gains in forelimb and hind limb strength. Our findings suggest that triggering mitochondrial activity with exercise mimetics improves muscle function in dystrophin-deficient mdx mice.

  20. Male Wistar rats show individual differences in an animal model of conformity.

    Science.gov (United States)

    Jolles, Jolle W; de Visser, Leonie; van den Bos, Ruud

    2011-09-01

    Conformity refers to the act of changing one's behaviour to match that of others. Recent studies in humans have shown that individual differences exist in conformity and that these differences are related to differences in neuronal activity. To understand the neuronal mechanisms in more detail, animal tests to assess conformity are needed. Here, we used a test of conformity in rats that has previously been evaluated in female, but not male, rats and assessed the nature of individual differences in conformity. Male Wistar rats were given the opportunity to learn that two diets differed in palatability. They were subsequently exposed to a demonstrator that had consumed the less palatable food. Thereafter, they were exposed to the same diets again. Just like female rats, male rats decreased their preference for the more palatable food after interaction with demonstrator rats that had eaten the less palatable food. Individual differences existed for this shift, which were only weakly related to an interaction between their own initial preference and the amount consumed by the demonstrator rat. The data show that this conformity test in rats is a promising tool to study the neurobiology of conformity.

  1. Metamodelling Approach and Software Tools for Physical Modelling and Simulation

    Directory of Open Access Journals (Sweden)

    Vitaliy Mezhuyev

    2015-02-01

    Full Text Available In computer science, the metamodelling approach is becoming more and more popular for the purpose of software systems development. In this paper, we discuss the applicability of the metamodelling approach to the development of software tools for physical modelling and simulation. To define a metamodel for physical modelling, an analysis of physical models is carried out. The result of this analysis reveals the invariant physical structures, which we propose to use as the basic abstractions of the physical metamodel. This is a system of geometrical objects that allows a spatial structure of physical models to be built and a distribution of physical properties to be set. For such a geometry of distributed physical properties, different mathematical methods can be applied. To validate the proposed metamodelling approach, we consider the developed prototypes of software tools.

  2. A geometrical approach to structural change modeling

    OpenAIRE

    Stijepic, Denis

    2013-01-01

    We propose a model for studying the dynamics of economic structures. The model is based on qualitative information regarding structural dynamics, in particular, (a) the information on the geometrical properties of trajectories (and their domains) which are studied in structural change theory and (b) the empirical information from stylized facts of structural change. We show that structural change is path-dependent in this model and use this fact to restrict the number of future structural cha...

  3. Towards a CPN-Based Modelling Approach for Reconciling Verification and Implementation of Protocol Models

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge; Kristensen, Lars Michael

    2013-01-01

    Formal modelling of protocols is often aimed at one specific purpose such as verification or automatically generating an implementation. This leads to models that are useful for one purpose, but not for others. Being able to derive models for verification and implementation from a single model ... and implementation. Our approach has been developed in the context of the Coloured Petri Nets (CPNs) modelling language. We illustrate our approach by presenting a descriptive specification model of the Websocket protocol, which is currently under development by the Internet Engineering Task Force (IETF), and we show ...

  4. A combined cultivation and cultivation-independent approach shows high bacterial diversity in water-miscible metalworking fluids.

    Science.gov (United States)

    Lodders, Nicole; Kämpfer, Peter

    2012-06-01

    Ten metalworking fluid (MWF) samples and seven water preparation basis (WPB) samples were taken from five industrial plants in Germany. Total cell counts (TCC) and colony forming units (CFU) were determined, strains were isolated and their 16S rRNA genes were sequenced. Additionally, DNA was extracted directly from the samples, and clone libraries of 16S rRNA genes were built and sequenced. TCC ranged from 7.6×10^4 TCC/mL MWF to 1.6×10^8 TCC/mL MWF, and from 4.6×10^2 TCC/mL WPB to 7.8×10^7 TCC/mL WPB. The CFU showed similar but often lower results. A total of 70 isolates and 732 clones were 16S rRNA gene sequenced, and all isolates, as well as 183 of the clones, were sequenced over nearly the full length of the 16S rRNA gene. A total of 98 different genera were detected in all 17 samples. The number of genera within each sample varied highly, with 1-22 genera per sample. The dominant genera in MWF were Leucobacter, Desemzia, Sphingomonas and Wautersiella. Of these, only Sphingomonas was detected in WPB as well. This study showed that MWF can harbour a high bacterial diversity, which differs significantly from the bacterial flora of the corresponding WPB.

  5. Scientific Theories, Models and the Semantic Approach

    Directory of Open Access Journals (Sweden)

    Décio Krause

    2007-12-01

    Full Text Available According to the semantic view, a theory is characterized by a class of models. In this paper, we examine critically some of the assumptions that underlie this approach. First, we recall that models are models of something. Thus we cannot leave completely aside the axiomatization of the theories under consideration, nor can we ignore the metamathematics used to elaborate these models, for changes in the metamathematics often impose restrictions on the resulting models. Second, based on a parallel between van Fraassen's modal interpretation of quantum mechanics and Skolem's relativism regarding set-theoretic concepts, we introduce a distinction between relative and absolute concepts in the context of the models of a scientific theory. And we discuss the significance of that distinction. Finally, by focusing on contemporary particle physics, we raise the question: since there is no generally accepted unification of the parts of the standard model (namely, QED and QCD), we have no theory, in the usual sense of the term. This poses a difficulty: if there is no theory, how can we speak of its models? What are the latter models of? We conclude by noting that it is unclear that the semantic view can be applied to contemporary physical theories.

  6. Multiscale Model Approach for Magnetization Dynamics Simulations

    CERN Document Server

    De Lucia, Andrea; Tretiakov, Oleg A; Kläui, Mathias

    2016-01-01

    Simulations of magnetization dynamics in a multiscale environment enable rapid evaluation of the Landau-Lifshitz-Gilbert equation in a mesoscopic sample with nanoscopic accuracy in areas where such accuracy is required. We have developed a multiscale magnetization dynamics simulation approach that can be applied to large systems with spin structures that vary locally on small length scales. To implement this, the conventional micromagnetic simulation framework has been expanded to include a multiscale solving routine. The software selectively simulates different regions of a ferromagnetic sample according to the spin structures located within in order to employ a suitable discretization and use either a micromagnetic or an atomistic model. To demonstrate the validity of the multiscale approach, we simulate the spin wave transmission across the regions simulated with the two different models and different discretizations. We find that the interface between the regions is fully transparent for spin waves with f...
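
    The equation propagated in such simulations is the Landau-Lifshitz-Gilbert (LLG) equation; a single-macrospin sketch shows the basic update step that both the micromagnetic and the atomistic regions ultimately perform, albeit with very different discretizations. The field, damping and time-step values below are arbitrary illustrative numbers, and no multiscale coupling is attempted.

```python
# Macrospin Landau-Lifshitz-Gilbert sketch: damped precession of one unit moment
# about a static field along z. Parameter values are arbitrary illustrations.
import numpy as np

gamma = 1.76e11        # gyromagnetic ratio (rad s^-1 T^-1)
alpha = 0.1            # Gilbert damping
H = np.array([0.0, 0.0, 0.1])          # effective field (T)
m = np.array([1.0, 0.0, 0.0])          # initial unit magnetization
dt = 1e-13                             # time step (s)

for step in range(20000):
    mxH = np.cross(m, H)
    # Landau-Lifshitz form of LLG: dm/dt = -g/(1+a^2) [ m x H + a m x (m x H) ]
    dmdt = -gamma / (1 + alpha**2) * (mxH + alpha * np.cross(m, mxH))
    m = m + dt * dmdt
    m /= np.linalg.norm(m)             # re-normalize |m| = 1 after the Euler step

print("final magnetization (should relax toward +z):", m)
```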

  7. Mule duck "foie gras" shows different metabolic states according to its quality phenotype by using a proteomic approach.

    Science.gov (United States)

    François, Yoannah; Marie-Etancelin, Christel; Vignal, Alain; Viala, Didier; Davail, Stéphane; Molette, Caroline

    2014-07-23

    This study aimed at identifying the mechanisms implicated in "foie gras" quality variability through the study of the relationships between liver protein compositions and four liver quality phenotypes: liver weight, melting rate, and protein contents on crude or dry matter. Spots of soluble proteins were separated by bidimensional electrophoresis, and the relative abundance of proteins according to quality traits values was investigated. Twenty-three protein spots (19 unique identified proteins) showed different levels of abundance according to one or more of the traits' values. These abundance differences highlighted two groups of livers with opposite trends of abundance levels. Proteins of the first group, associated with low liver weight and melting rate, are involved in synthesis and anabolism processes, whereas proteins of the second group, associated with high liver weight and melting rate, are proteins involved in stress response. Altogether, these results highlight the variations in metabolic states underlying foie gras quality traits.

  8. Approaches to Working with Children, Young People and Families for Traveller, Irish Traveller, Gypsy, Roma and Show People Communities. Annotated Bibliography for the Children's Workforce Development Council

    Science.gov (United States)

    Robinson, Mark; Martin, Kerry; Wilkin, Carol

    2008-01-01

    This annotated bibliography covers a range of issues and approaches to working with Traveller, Irish Traveller, Gypsy, Roma and Show People communities. It is an accompanying document to the literature review report, ED501860.

  9. Continuum modeling an approach through practical examples

    CERN Document Server

    Muntean, Adrian

    2015-01-01

    This book develops continuum modeling skills and approaches the topic from three sides: (1) derivation of global integral laws together with the associated local differential equations, (2) design of constitutive laws and (3) modeling boundary processes. The focus of this presentation lies on many practical examples covering aspects such as coupled flow, diffusion and reaction in porous media or microwave heating of a pizza, as well as traffic issues in bacterial colonies and energy harvesting from geothermal wells. The target audience comprises primarily graduate students in pure and applied mathematics as well as working practitioners in engineering who are faced by nonstandard rheological topics like those typically arising in the food industry.

  10. A Multivariate Approach to Functional Neuro Modeling

    DEFF Research Database (Denmark)

    Mørch, Niels J.S.

    1998-01-01

    This Ph.D. thesis, A Multivariate Approach to Functional Neuro Modeling, deals with the analysis and modeling of data from functional neuro imaging experiments. A multivariate dataset description is provided which facilitates efficient representation of typical datasets and, more importantly...... and overall conditions governing the functional experiment, via associated micro- and macroscopic variables. The description facilitates an efficient microscopic re-representation, as well as a handle on the link between brain and behavior; the latter is achieved by hypothesizing variations in the micro...... a generalization theoretical framework centered around measures of model generalization error. - Only few, if any, examples of the application of generalization theory to functional neuro modeling currently exist in the literature. - Exemplification of the proposed generalization theoretical framework...

  11. Asteroid fragmentation approaches for modeling atmospheric energy deposition

    Science.gov (United States)

    Register, Paul J.; Mathias, Donovan L.; Wheeler, Lorien F.

    2017-03-01

    During asteroid entry, energy is deposited in the atmosphere through thermal ablation and momentum-loss due to aerodynamic drag. Analytic models of asteroid entry and breakup physics are used to compute the energy deposition, which can then be compared against measured light curves and used to estimate ground damage due to airburst events. This work assesses and compares energy deposition results from four existing approaches to asteroid breakup modeling, and presents a new model that combines key elements of those approaches. The existing approaches considered include a liquid drop or "pancake" model where the object is treated as a single deforming body, and a set of discrete fragment models where the object breaks progressively into individual fragments. The new model incorporates both independent fragments and aggregate debris clouds to represent a broader range of fragmentation behaviors and reproduce more detailed light curve features. All five models are used to estimate the energy deposition rate versus altitude for the Chelyabinsk meteor impact, and results are compared with an observationally derived energy deposition curve. Comparisons show that four of the five approaches are able to match the overall observed energy deposition profile, but the features of the combined model are needed to better replicate both the primary and secondary peaks of the Chelyabinsk curve.
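
    The single-body backbone of such models is a small ODE system for velocity, mass and altitude in an exponential atmosphere, from which the energy deposited per unit altitude follows as the change in kinetic energy per metre of descent. The sketch below integrates that system with forward Euler; the coefficients and initial conditions are rough, textbook-style values, and no fragmentation, pancake deformation or debris-cloud treatment is included.

```python
# Single-body meteor entry sketch: drag + ablation ODEs integrated with forward Euler,
# reporting energy deposition per unit altitude. Coefficients and initial conditions
# are rough illustrative values; no fragmentation or pancake deformation is modeled.
import numpy as np

Cd, Ch, Q = 1.0, 0.1, 8.0e6        # drag coeff., heat-transfer coeff., heat of ablation (J/kg)
rho0, Hs = 1.225, 8000.0           # sea-level air density (kg/m^3) and scale height (m)
rho_m = 3300.0                     # meteoroid density (kg/m^3)

v, m, h = 19.0e3, 1.2e7, 90.0e3    # entry speed (m/s), mass (kg), altitude (m)
theta = np.radians(18.0)           # entry angle below the horizontal
dt = 1e-3                          # time step (s)

alts, dEdh = [], []
while h > 0 and m > 1.0 and v > 100.0:
    rho_a = rho0 * np.exp(-h / Hs)
    A = np.pi * (3 * m / (4 * np.pi * rho_m)) ** (2 / 3)   # cross-section of an equivalent sphere
    E0 = 0.5 * m * v**2
    dv = -(Cd * rho_a * A * v**2) / (2 * m) * dt           # deceleration by drag
    dm = -(Ch * rho_a * A * v**3) / (2 * Q) * dt           # mass loss by ablation
    dh = -v * np.sin(theta) * dt                           # altitude drop
    v, m, h = v + dv, m + dm, h + dh
    alts.append(h / 1000.0)
    dEdh.append((E0 - 0.5 * m * v**2) / (-dh) * 1000 / 4.184e12)  # kt TNT per km of altitude

peak = int(np.argmax(dEdh))
print(f"peak energy deposition ~{dEdh[peak]:.0f} kt/km at ~{alts[peak]:.1f} km altitude")
```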

  12. Systematic approach to MIS model creation

    Directory of Open Access Journals (Sweden)

    Macura Perica

    2004-01-01

    Full Text Available In this paper-work, by application of basic principles of general theory of system (systematic approach, we have formulated a model of marketing information system. Bases for research were basic characteristics of systematic approach and marketing system. Informational base for management of marketing system, i.e. marketing instruments was presented in a way that the most important information for decision making were listed per individual marketing mix instruments. In projected model of marketing information system, information listed in this way create a base for establishing of data bases, i.e. bases of information (data bases of: product, price, distribution, promotion. This paper-work gives basic preconditions for formulation and functioning of the model. Model was presented by explication of elements of its structure (environment, data bases operators, analysts of information system, decision makers - managers, i.e. input, process, output, feedback and relations between these elements which are necessary for its optimal functioning. Beside that, here are basic elements for implementation of the model into business system, as well as conditions for its efficient functioning and development.

  13. Dynamic Metabolic Model Building Based on the Ensemble Modeling Approach

    Energy Technology Data Exchange (ETDEWEB)

    Liao, James C. [Univ. of California, Los Angeles, CA (United States)

    2016-10-01

    Ensemble modeling of kinetic systems addresses the challenges of kinetic model construction, with respect to parameter value selection, and still allows for the rich insights possible from kinetic models. This project aimed to show that constructing, implementing, and analyzing such models is a useful tool for the metabolic engineering toolkit, and that they can result in actionable insights from models. Key concepts are developed and deliverable publications and results are presented.
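
    The core ensemble-modeling move is to sample many kinetic parameter sets, keep only those consistent with reference data, and then analyze the surviving ensemble instead of a single best-fit model. The sketch below applies this screening to an invented two-step pathway using a target steady-state concentration; the real ensemble modeling framework additionally anchors all parameter sets to the same reference steady state and flux distribution, which this toy version does not attempt.

```python
# Toy ensemble-modeling sketch: sample rate constants for a two-step pathway
# (inflow -> A -k1-> B -k2-> out), keep parameter sets that reproduce a reference
# steady-state concentration of B, then inspect the spread of the ensemble.
# Pathway, numbers and tolerance are invented illustrations of the idea only.
import numpy as np
from scipy.integrate import solve_ivp

rng = np.random.default_rng(1)
v_in, B_ref = 1.0, 2.0          # constant inflow; "measured" steady-state B

def rhs(t, y, k1, k2):
    A, B = y
    return [v_in - k1 * A, k1 * A - k2 * B]

ensemble = []
for _ in range(500):
    k1, k2 = 10 ** rng.uniform(-1, 1, size=2)                # log-uniform sampling
    sol = solve_ivp(rhs, (0, 500), [0.0, 0.0], args=(k1, k2), rtol=1e-8)
    A_ss, B_ss = sol.y[:, -1]
    if abs(B_ss - B_ref) / B_ref < 0.05:                     # consistent with the data?
        ensemble.append((k1, k2, A_ss))

A_spread = np.percentile([e[2] for e in ensemble], [5, 50, 95])
print(f"{len(ensemble)} of 500 parameter sets kept; "
      f"steady-state A across the ensemble (5/50/95%): {np.round(A_spread, 2)}")
```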

  14. Modeling Electronic Circular Dichroism within the Polarizable Embedding Approach

    DEFF Research Database (Denmark)

    Nørby, Morten S; Olsen, Jógvan Magnus Haugaard; Steinmann, Casper

    2017-01-01

    We present a systematic investigation of the key components needed to model single chromophore electronic circular dichroism (ECD) within the polarizable embedding (PE) approach. By relying on accurate forms of the embedding potential, where especially the inclusion of local field effects...... sampling. We show that a significant number of snapshots are needed to avoid artifacts in the calculated electronic circular dichroism parameters due to insufficient configurational sampling, thus highlighting the efficiency of the PE model....

  15. Regularization of turbulence - a comprehensive modeling approach

    Science.gov (United States)

    Geurts, B. J.

    2011-12-01

    Turbulence readily arises in numerous flows in nature and technology. The large number of degrees of freedom of turbulence poses serious challenges to numerical approaches aimed at simulating and controlling such flows. While the Navier-Stokes equations are commonly accepted to precisely describe fluid turbulence, alternative coarsened descriptions need to be developed to cope with the wide range of length and time scales. These coarsened descriptions are known as large-eddy simulations, in which one aims to capture only the primary features of a flow at considerably reduced computational effort. Such coarsening introduces a closure problem that requires additional phenomenological modeling. A systematic approach to the closure problem, known as regularization modeling, will be reviewed. Its application to multiphase turbulent flow will be illustrated, in which a basic regularization principle is enforced to physically consistently approximate momentum and scalar transport. Examples of Leray and LANS-alpha regularization are discussed in some detail, as are compatible numerical strategies. We illustrate regularization modeling for turbulence under the influence of rotation and buoyancy and investigate the accuracy with which particle-laden flow can be represented. A discussion of the numerical and modeling errors incurred will be given on the basis of homogeneous isotropic turbulence.

  16. An algebraic approach to the Hubbard model

    CERN Document Server

    de Leeuw, Marius

    2015-01-01

    We study the algebraic structure of an integrable Hubbard-Shastry type lattice model associated with the centrally extended su(2|2) superalgebra. This superalgebra underlies Beisert's AdS/CFT worldsheet R-matrix and Shastry's R-matrix. The considered model specializes to the one-dimensional Hubbard model in a certain limit. We demonstrate that Yangian symmetries of the R-matrix specialize to the Yangian symmetry of the Hubbard model found by Korepin and Uglov. Moreover, we show that the Hubbard model Hamiltonian has an algebraic interpretation as the so-called secret symmetry. We also discuss Yangian symmetries of the A and B models introduced by Frolov and Quinn.

  17. Men and women with bisexual identities show bisexual patterns of sexual attraction to male and female "swimsuit models".

    Science.gov (United States)

    Lippa, Richard A

    2013-02-01

    Do self-identified bisexual men and women actually show bisexual patterns of sexual attraction and interest? To answer this question, I studied bisexual men's and women's sexual attraction to photographed male and female "swimsuit models" that varied in attractiveness. Participants (663 college students and gay pride attendees, including 14 self-identified bisexual men and 17 self-identified bisexual women) rated their degree of sexual attraction to 34 male and 34 female swimsuit models. Participants' viewing times to models were unobtrusively assessed. Results showed that bisexual men and women showed bisexual patterns of attraction and viewing times to photo models, which strongly distinguished them from same-sex heterosexual and homosexual participants. In contrast to other groups, which showed evidence of greater male than female category specificity, bisexual men and women did not differ in category specificity. Results suggest that there are subsets of men and women who display truly bisexual patterns of sexual attraction and interest.

  18. Current approaches to model extracellular electrical neural microstimulation

    Directory of Open Access Journals (Sweden)

    Sébastien eJoucla

    2014-02-01

    Full Text Available Nowadays, high-density microelectrode arrays provide unprecedented possibilities to precisely activate spatially well-controlled central nervous system (CNS) areas. However, this requires optimizing stimulating devices, which in turn requires a good understanding of the effects of microstimulation on cells and tissues. In this context, modeling approaches provide flexible ways to predict the outcome of electrical stimulation in terms of CNS activation. In this paper, we present state-of-the-art modeling methods with sufficient detail to allow the reader to rapidly build numerical models of neuronal extracellular microstimulation. These include (1) the computation of the electrical potential field created by the stimulation in the tissue, and (2) the response of a target neuron to this field. Two main approaches are described. First, we describe the classical hybrid approach that combines finite element modeling of the potential field with the calculation of the neuron's response in a cable-equation framework (compartmentalized neuron models). Then, we present a whole finite element approach that allows the simultaneous calculation of the extracellular and intracellular potentials by representing the neuronal membrane with a thin-film approximation. This approach was previously introduced in the frame of neural recording, but has never been implemented to determine the effect of extracellular stimulation on the neural response at a sub-compartment level. Here, we show on an example that the latter modeling scheme can reveal important sub-compartment behavior of the neural membrane that cannot be resolved using the hybrid approach. The goal of this paper is also to describe in detail the practical implementation of these methods to allow the reader to easily build new models using standard software packages. These modeling paradigms, depending on the situation, should help build more efficient high-density neural prostheses for CNS rehabilitation.
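
    For the hybrid route in particular, the first step (the extracellular potential field) has a simple closed form for a point-source electrode in an infinite homogeneous medium, and its effect on a straight fibre is often summarized by the activating function, the discrete second difference of the extracellular potential along the fibre. The sketch below computes both for an invented geometry; a full model would obtain the field from a finite element solver and feed it into a compartmental cable model instead.

```python
# Sketch: extracellular potential of a point-source microelectrode and the
# "activating function" (second spatial difference) along a straight fibre.
# Geometry, current and conductivity values are invented for illustration.
import numpy as np

sigma = 0.3          # extracellular conductivity (S/m)
I = -10e-6           # stimulation current (A), cathodic pulse
electrode = np.array([0.0, 0.0, 100e-6])     # electrode 100 um above the fibre

# Compartment centres of a straight axon along x, 10 um spacing.
dx = 10e-6
x = np.arange(-500e-6, 500e-6, dx)
compartments = np.stack([x, np.zeros_like(x), np.zeros_like(x)], axis=1)

# Point-source potential in an infinite homogeneous medium: V = I / (4 pi sigma r)
r = np.linalg.norm(compartments - electrode, axis=1)
Ve = I / (4 * np.pi * sigma * r)

# Activating function ~ d2Ve/dx2; positive values indicate likely depolarization.
f = np.zeros_like(Ve)
f[1:-1] = (Ve[:-2] - 2 * Ve[1:-1] + Ve[2:]) / dx**2

print("peak extracellular potential (mV):", 1e3 * Ve.min())
print("most depolarized compartment at x (um):", 1e6 * x[np.argmax(f)])
```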

  19. Second Quantization Approach to Stochastic Epidemic Models

    CERN Document Server

    Mondaini, Leonardo

    2015-01-01

    We show how the standard field theoretical language based on creation and annihilation operators may be used for a straightforward derivation of closed master equations describing the population dynamics of multivariate stochastic epidemic models. In order to do that, we introduce an SIR-inspired stochastic model for hepatitis C virus epidemic, from which we obtain the time evolution of the mean number of susceptible, infected, recovered and chronically infected individuals in a population whose total size is allowed to change.
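
    The quantity derived in the paper, the time evolution of the mean numbers of susceptible, infected and recovered individuals, can also be estimated by averaging stochastic simulations of the same kind of reaction scheme. Below is a minimal Gillespie-type sketch for a plain SIR model (the chronic-infection compartment of the hepatitis C model is left out); the rates and population size are arbitrary examples, not values from the paper.

```python
# Gillespie simulation of a simple stochastic SIR model, averaged over many runs
# to estimate the mean S(t), I(t), R(t). Rates and sizes are arbitrary examples.
import numpy as np

rng = np.random.default_rng(7)
beta, gamma = 0.3, 0.1          # infection and recovery rates
N, I0, t_max = 200, 5, 100.0
grid = np.linspace(0, t_max, 101)

def one_run():
    S, I, R, t = N - I0, I0, 0, 0.0
    traj = np.empty((len(grid), 3))
    k = 0
    while k < len(grid):
        a_inf = beta * S * I / N          # propensity of S + I -> 2I
        a_rec = gamma * I                 # propensity of I -> R
        a_tot = a_inf + a_rec
        t_next = t + rng.exponential(1 / a_tot) if a_tot > 0 else np.inf
        while k < len(grid) and grid[k] < t_next:   # record state on the time grid
            traj[k] = (S, I, R)
            k += 1
        if a_tot == 0:
            break
        if rng.random() < a_inf / a_tot:
            S, I = S - 1, I + 1
        else:
            I, R = I - 1, R + 1
        t = t_next
    traj[k:] = (S, I, R)                  # pad after epidemic extinction
    return traj

mean = np.mean([one_run() for _ in range(200)], axis=0)
peak = int(np.argmax(mean[:, 1]))
print(f"mean infected peaks at ~{mean[peak, 1]:.1f} around t = {grid[peak]:.0f}")
```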

  20. Show Time

    Institute of Scientific and Technical Information of China (English)

    2004-01-01

    Story: Show Time! The whole class presents the story "Under the Sea". Everyone is so excited and happy. Both Leo and Kathy show their parents the characters of the play. "Who's he?" asks Kathy's mom. "He's the prince." Kathy replies. "Who's she?" asks Leo's dad. "She's the queen." Leo replies with a smile.

  1. Snobbish Show

    Institute of Scientific and Technical Information of China (English)

    YIN PUMIN

    2010-01-01

    The State Administration of Radio, Film and Television (SARFT), China's media watchdog, issued a new set of rules on June 9 that strictly regulate TV match-making shows, which have been sweeping the country's primetime programming. "Improper social and love values such as money worship should not be presented in these shows. Humiliation, verbal attacks and sex-implied vulgar content are not allowed," the new rules said.

  2. A computational language approach to modeling prose recall in schizophrenia.

    Science.gov (United States)

    Rosenstein, Mark; Diaz-Asper, Catherine; Foltz, Peter W; Elvevåg, Brita

    2014-06-01

    Many cortical disorders are associated with memory problems. In schizophrenia, verbal memory deficits are a hallmark feature. However, the exact nature of this deficit remains elusive. Modeling aspects of language features used in memory recall have the potential to provide means for measuring these verbal processes. We employ computational language approaches to assess time-varying semantic and sequential properties of prose recall at various retrieval intervals (immediate, 30 min and 24 h later) in patients with schizophrenia, unaffected siblings and healthy unrelated control participants. First, we model the recall data to quantify the degradation of performance with increasing retrieval interval and the effect of diagnosis (i.e., group membership) on performance. Next we model the human scoring of recall performance using an n-gram language sequence technique, and then with a semantic feature based on Latent Semantic Analysis. These models show that automated analyses of the recalls can produce scores that accurately mimic human scoring. The final analysis addresses the validity of this approach by ascertaining the ability to predict group membership from models built on the two classes of language features. Taken individually, the semantic feature is most predictive, while a model combining the features improves accuracy of group membership prediction slightly above the semantic feature alone as well as over the human rating approach. We discuss the implications for cognitive neuroscience of such a computational approach in exploring the mechanisms of prose recall.
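
    The semantic scoring idea can be sketched with off-the-shelf tools: project the source passage and each recall attempt into a reduced latent semantic space (TF-IDF followed by truncated SVD) and use cosine similarity to the source as an automated recall score. The passage and recalls below are invented, and a real LSA space is trained on a large background corpus rather than on a handful of texts, so the toy scores should not be over-interpreted.

```python
# Sketch of an LSA-style automated score for prose recall: cosine similarity between
# each recall and the source passage in a reduced semantic space. Texts are invented;
# a real LSA model would be trained on a large background corpus, not four documents.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

source = ("Anna travelled to the small coastal town to deliver a letter to her "
          "brother, but a storm delayed the ferry and she arrived a day late.")
recalls = [
    "Anna went to a coastal town to give her brother a letter but a storm delayed the boat.",
    "A woman took a trip somewhere and the weather was bad.",
    "The story was about a dog finding its way home.",
]

texts = [source] + recalls
tfidf = TfidfVectorizer().fit_transform(texts)          # term-document matrix
lsa = TruncatedSVD(n_components=3, random_state=0).fit_transform(tfidf)

scores = cosine_similarity(lsa[:1], lsa[1:])[0]         # similarity of each recall to source
for text, s in zip(recalls, scores):
    print(f"{s:5.2f}  {text[:50]}...")
```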

  3. AN AUTOMATIC APPROACH TO BOX & JENKINS MODELLING

    OpenAIRE

    MARCELO KRIEGER

    1983-01-01

    In spite of the general recognition of the good forecasting ability of ARIMA models in predicting univariate time series, this approach is not widely used because of the lack of automatic, computerized procedures. In this work this problem is discussed and an algorithm is proposed.

  4. Modeling in transport phenomena a conceptual approach

    CERN Document Server

    Tosun, Ismail

    2007-01-01

    Modeling in Transport Phenomena, Second Edition presents and clearly explains with example problems the basic concepts and their applications to fluid flow, heat transfer, mass transfer, chemical reaction engineering and thermodynamics. A balanced approach between analysis and synthesis is presented, so that students will understand how to use the solution in engineering analysis. Systematic derivations of the equations and the physical significance of each term are given in detail, for students to easily understand and follow up the material. There is a strong incentive in science and engineering to

  5. A database approach to information retrieval: The remarkable relationship between language models and region models

    CERN Document Server

    Hiemstra, Djoerd

    2010-01-01

    In this report, we unify two quite distinct approaches to information retrieval: region models and language models. Region models were developed for structured document retrieval. They provide a well-defined behaviour as well as a simple query language that allows application developers to rapidly develop applications. Language models are particularly useful to reason about the ranking of search results, and for developing new ranking approaches. The unified model allows application developers to define complex language modeling approaches as logical queries on a textual database. We show a remarkable one-to-one relationship between region queries and the language models they represent for a wide variety of applications: simple ad-hoc search, cross-language retrieval, video retrieval, and web search.
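
    As a concrete reference point for the language-model side of that relationship, the sketch below ranks a few toy documents by query likelihood under Dirichlet-smoothed unigram language models; region-style structure (for example, restricting scoring to a title or section element) would enter as a filter on which tokens of each document are scored. The documents, query and smoothing parameter are illustrative only and do not reproduce the report's unified model.

```python
# Query-likelihood ranking with Dirichlet-smoothed unigram language models.
# Documents, query and smoothing parameter mu are toy illustrations.
import math
from collections import Counter

docs = {
    "d1": "language models for information retrieval rank documents by query likelihood",
    "d2": "region models support structured document retrieval with a simple query language",
    "d3": "the weather today is sunny with a light breeze",
}
query = "language models for retrieval".split()
mu = 100.0                                        # Dirichlet smoothing parameter

collection = Counter(w for text in docs.values() for w in text.split())
coll_len = sum(collection.values())

def log_score(text):
    tf, dlen = Counter(text.split()), len(text.split())
    score = 0.0
    for w in query:
        p_coll = collection[w] / coll_len                 # background (collection) model
        p = (tf[w] + mu * p_coll) / (dlen + mu)           # Dirichlet-smoothed P(w|d)
        score += math.log(p) if p > 0 else float("-inf")  # word unseen everywhere -> -inf
    return score

for doc_id, text in sorted(docs.items(), key=lambda kv: log_score(kv[1]), reverse=True):
    print(f"{log_score(text):8.3f}  {doc_id}")
```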

  6. Comparative flood damage model assessment: towards a European approach

    Directory of Open Access Journals (Sweden)

    B. Jongman

    2012-12-01

    Full Text Available There is a wide variety of flood damage models in use internationally, differing substantially in their approaches and economic estimates. Since these models are being used more and more as a basis for investment and planning decisions on an increasingly large scale, there is a need to reduce the uncertainties involved and develop a harmonised European approach, in particular with respect to the EU Flood Risks Directive. In this paper we present a qualitative and quantitative assessment of seven flood damage models, using two case studies of past flood events in Germany and the United Kingdom. The qualitative analysis shows that modelling approaches vary strongly, and that current methodologies for estimating infrastructural damage are not as well developed as methodologies for the estimation of damage to buildings. The quantitative results show that the model outcomes are very sensitive to uncertainty in both vulnerability (i.e. depth–damage functions) and exposure (i.e. asset values), whereby the former has a larger effect than the latter. We conclude that care needs to be taken when using aggregated land use data for flood risk assessment, and that it is essential to adjust asset values to the regional economic situation and property characteristics. We call for the development of a flexible but consistent European framework that applies best practice from existing models while providing room for including necessary regional adjustments.
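
    At their core, most of the compared models estimate direct damage per building as a depth-damage fraction multiplied by an exposed asset value, which is easy to make concrete. The curve and asset values below are invented placeholders; the paper's point is precisely that such curves and values differ strongly between models and regions and dominate the uncertainty.

```python
# Minimal flood damage sketch: damage = asset value x depth-damage fraction,
# summed over exposed buildings. The curve and asset values are invented examples.
import numpy as np

# Depth-damage curve: damage fraction as a function of inundation depth (m).
curve_depth = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 6.0])
curve_frac  = np.array([0.0, 0.15, 0.30, 0.55, 0.85, 1.00])

buildings = [                      # (inundation depth in m, asset value in EUR)
    (0.3, 250_000),
    (1.2, 180_000),
    (2.5, 400_000),
]

total = 0.0
for depth, value in buildings:
    frac = np.interp(depth, curve_depth, curve_frac)   # piecewise-linear interpolation
    total += frac * value
    print(f"depth {depth:.1f} m -> damage fraction {frac:.2f}, loss EUR {frac * value:,.0f}")

print(f"total estimated direct damage: EUR {total:,.0f}")
```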

  7. Modeling for fairness: A Rawlsian approach.

    Science.gov (United States)

    Diekmann, Sven; Zwart, Sjoerd D

    2014-06-01

    In this paper we introduce the overlapping design consensus for the construction of models in design and the related value judgments. The overlapping design consensus is inspired by Rawls' overlapping consensus. The overlapping design consensus is a well-informed, mutual agreement among all stakeholders based on fairness. Fairness is respected if all stakeholders' interests are given due and equal attention. For reaching such fair agreement, we apply Rawls' original position and reflective equilibrium to modeling. We argue that by striving for the original position, stakeholders expel invalid arguments, hierarchies, unwarranted beliefs, and bargaining effects from influencing the consensus. The reflective equilibrium requires that stakeholders' beliefs cohere with the final agreement and its justification. Therefore, the overlapping design consensus is not only an agreement on decisions, as in most other stakeholder approaches; it is also an agreement on their justification, and this justification is consistent with each stakeholder's beliefs. In support of fairness, we argue that fairness qualifies as a maxim in modeling. We furthermore distinguish values embedded in a model from values that are implied by its context of application. Finally, we conclude that for reaching an overlapping design consensus, communication about properties of and values related to a model is required.

  8. DISTRIBUTED APPROACH to WEB PAGE CATEGORIZATION USING MAPREDUCE PROGRAMMING MODEL

    Directory of Open Access Journals (Sweden)

    P.Malarvizhi

    2011-12-01

    Full Text Available The web is a large repository of information and to facilitate the search and retrieval of pages from it, categorization of web documents is essential. An effective means to handle the complexity of information retrieval from the internet is through automatic classification of web pages. Although many automatic classification algorithms and systems have been presented, most of the existing approaches are computationally challenging. In order to overcome this challenge, we have proposed a parallel algorithm based on the MapReduce programming model to automatically categorize web pages. This approach incorporates three concepts: a web crawler, the MapReduce programming model and the proposed web page categorization approach. Initially, we have utilized a web crawler to mine the World Wide Web, and the crawled web pages are then directly given as input to the MapReduce programming model. Here the MapReduce programming model, adapted to our proposed web page categorization approach, finds the appropriate category of each web page according to its content. The experimental results show that our proposed parallel web page categorization approach achieves satisfactory results in finding the right category for any given web page.
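    A minimal, in-process emulation of the map and reduce steps for keyword-based page categorization is sketched below; the category keyword lists and example pages are hypothetical and stand in for the crawler output and the classifier described in the abstract.

```python
from collections import defaultdict

# Hypothetical keyword lists per category; a real system would use a trained classifier.
CATEGORIES = {
    "sports": {"match", "league", "score"},
    "technology": {"software", "algorithm", "data"},
}

def map_phase(url, text):
    """Map: emit (category, url) pairs based on keyword hits in the page text."""
    words = set(text.lower().split())
    for category, keywords in CATEGORIES.items():
        if words & keywords:
            yield category, url

def reduce_phase(mapped):
    """Reduce: group URLs by category."""
    grouped = defaultdict(list)
    for category, url in mapped:
        grouped[category].append(url)
    return dict(grouped)

pages = {
    "http://example.org/a": "final match score of the league",
    "http://example.org/b": "a new data processing algorithm in software",
}
mapped = [pair for url, text in pages.items() for pair in map_phase(url, text)]
print(reduce_phase(mapped))
```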

  9. Pedagogic process modeling: Humanistic-integrative approach

    Directory of Open Access Journals (Sweden)

    Boritko Nikolaj M.

    2007-01-01

    Full Text Available The paper deals with some current problems of modeling the dynamics of the development of the individual's subject features. The term "process" is considered in the context of the humanistic-integrative approach, in which the principles of self-education are regarded as criteria for efficient pedagogic activity. Four basic characteristics of the pedagogic process are pointed out: intentionality reflects the logicality and regularity of the development of the process; discreteness (stageability) indicates the qualitative stages through which the pedagogic phenomenon passes; nonlinearity explains the crisis character of pedagogic processes and reveals inner factors of self-development; situationality requires a selection of pedagogic conditions in accordance with the inner factors, which would enable steering the pedagogic process. Offered are two steps for singling out a particular stage and the algorithm for developing an integrative model for it. The suggested conclusions might be of use for further theoretic research, analyses of educational practices and for realistic prediction of pedagogical phenomena.

  10. Nuclear level density: Shell-model approach

    Science.gov (United States)

    Sen'kov, Roman; Zelevinsky, Vladimir

    2016-06-01

    Knowledge of the nuclear level density is necessary for understanding various reactions, including those in the stellar environment. Usually the combinatorics of a Fermi gas plus pairing is used for finding the level density. Recently a practical algorithm avoiding diagonalization of huge matrices was developed for calculating the density of many-body nuclear energy levels with certain quantum numbers for a full shell-model Hamiltonian. The underlying physics is that of quantum chaos and intrinsic thermalization in a closed system of interacting particles. We briefly explain this algorithm and, when possible, demonstrate the agreement of the results with those derived from exact diagonalization. The resulting level density is much smoother than that coming from conventional mean-field combinatorics. We study the role of various components of residual interactions in the process of thermalization, stressing the influence of incoherent collision-like processes. The shell-model results for the traditionally used parameters are also compared with standard phenomenological approaches.

  11. Modeling Social Annotation: a Bayesian Approach

    CERN Document Server

    Plangprasopchok, Anon

    2008-01-01

    Collaborative tagging systems, such as del.icio.us, CiteULike, and others, allow users to annotate objects, e.g., Web pages or scientific papers, with descriptive labels called tags. The social annotations, contributed by thousands of users, can potentially be used to infer categorical knowledge, classify documents or recommend new relevant information. Traditional text inference methods do not make best use of socially-generated data, since they do not take into account variations in individual users' perspectives and vocabulary. In a previous work, we introduced a simple probabilistic model that takes interests of individual annotators into account in order to find hidden topics of annotated objects. Unfortunately, our proposed approach had a number of shortcomings, including overfitting, local maxima and the requirement to specify values for some parameters. In this paper we address these shortcomings in two ways. First, we extend the model to a fully Bayesian framework. Second, we describe an infinite ver...

  12. Joint Modeling of Multiple Crimes: A Bayesian Spatial Approach

    Directory of Open Access Journals (Sweden)

    Hongqiang Liu

    2017-01-01

    Full Text Available A multivariate Bayesian spatial modeling approach was used to jointly model the counts of two types of crime, i.e., burglary and non-motor vehicle theft, and explore the geographic pattern of crime risks and relevant risk factors. In contrast to the univariate model, which assumes independence across outcomes, the multivariate approach takes into account potential correlations between crimes. Six independent variables are included in the model as potential risk factors. In order to fully present this method, both the multivariate model and its univariate counterpart are examined. We fitted the two models to the data and assessed them using the deviance information criterion. A comparison of the results from the two models indicates that the multivariate model was superior to the univariate model. Our results show that population density and bar density are clearly associated with both burglary and non-motor vehicle theft risks and indicate a close relationship between these two types of crime. The posterior means and 2.5% percentile of type-specific crime risks estimated by the multivariate model were mapped to uncover the geographic patterns. The implications, limitations and future work of the study are discussed in the concluding section.

  13. Multicomponent Equilibrium Models for Testing Geothermometry Approaches

    Energy Technology Data Exchange (ETDEWEB)

    Carl D. Palmer; Robert W. Smith; Travis L. McLing

    2013-02-01

    Geothermometry is an important tool for estimating deep reservoir temperature from the geochemical composition of shallower and cooler waters. The underlying assumption of geothermometry is that the waters collected from shallow wells and seeps maintain a chemical signature that reflects equilibrium in the deeper reservoir. Many of the geothermometers used in practice are based on correlations between water temperature and composition, or on thermodynamic calculations using a subset (typically silica, cations or cation ratios) of the dissolved constituents. An alternative approach is to use complete water compositions and equilibrium geochemical modeling to calculate the degree of disequilibrium (saturation index) for a large number of potential reservoir minerals as a function of temperature. We have constructed several “forward” geochemical models using The Geochemist’s Workbench to simulate the change in chemical composition of reservoir fluids as they migrate toward the surface. These models explicitly account for the formation (mass and composition) of a steam phase and equilibrium partitioning of volatile components (e.g., CO2, H2S, and H2) into the steam as a result of pressure decreases associated with upward fluid migration from depth. We use the synthetic data generated from these simulations to determine the advantages and limitations of various geothermometry and optimization approaches for estimating the likely conditions (e.g., temperature, pCO2) to which the water was exposed in the deep subsurface. We demonstrate the magnitude of errors that can result from boiling, loss of volatiles, and analytical error from sampling and instrumental analysis. The estimated reservoir temperatures for these scenarios are also compared to conventional geothermometers. These results can help improve estimation of geothermal resource temperature during exploration and early development.
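    The sketch below illustrates the multicomponent equilibrium idea in its simplest form: compute saturation indices for several candidate reservoir minerals over a temperature range and take the temperature at which they jointly approach equilibrium. The log K(T) expressions and ion-activity products are placeholders, not real thermodynamic data; actual work of this kind relies on codes such as The Geochemist's Workbench.

```python
import numpy as np

# Hypothetical log K(T) relations for a few minerals (placeholders, not real data).
LOGK = {
    "quartz":    lambda T: -4.0 + 0.012 * T,
    "calcite":   lambda T: -8.5 + 0.020 * T,
    "kfeldspar": lambda T: -20.0 + 0.050 * T,
}

# Hypothetical log ion-activity products computed from a measured water analysis.
LOG_IAP = {"quartz": -2.6, "calcite": -6.2, "kfeldspar": -14.3}

def saturation_index(mineral, T_celsius):
    """SI = log10(IAP) - log10(K(T)); SI near 0 means equilibrium with the mineral."""
    return LOG_IAP[mineral] - LOGK[mineral](T_celsius)

# Reservoir temperature estimate: the T that minimizes the joint departure from equilibrium.
temps = np.linspace(25, 250, 226)
rmse = [np.sqrt(np.mean([saturation_index(m, T) ** 2 for m in LOGK])) for T in temps]
print("estimated reservoir temperature (C):", temps[int(np.argmin(rmse))])
```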

  14. A semiparametric approach to physiological flow models.

    Science.gov (United States)

    Verotta, D; Sheiner, L B; Ebling, W F; Stanski, D R

    1989-08-01

    By regarding sampled tissues in a physiological model as linear subsystems, the usual advantages of flow models are preserved while mitigating two of their disadvantages, (i) the need for assumptions regarding intratissue kinetics, and (ii) the need to simultaneously fit data from several tissues. To apply the linear systems approach, both arterial blood and (interesting) tissue drug concentrations must be measured. The body is modeled as having an arterial compartment (A) distributing drug to different linear subsystems (tissues), connected in a specific way by blood flow. The response (CA, with dimensions of concentration) of A is measured. Tissues receive input from A (and optionally from other tissues), and send output to the outside or to other parts of the body. The response (CT, total amount of drug in the tissue (T) divided by the volume of T) from the T-th, for example, of such tissues is also observed. From linear systems theory, CT can be expressed as the convolution of CA with a disposition function, F(t) (with dimensions 1/time). The function F(t) depends on the (unknown) structure of T, but has certain other constant properties: the integral ∫₀^∞ F(t) dt is the steady state ratio of CT to CA, and the point F(0) is the clearance rate of drug from A to T divided by the volume of T. A formula for the clearance rate of drug from T to outside T can be derived. To estimate F(t) empirically, and thus mitigate disadvantage (i), we suggest that, first, a nonparametric (or parametric) function be fitted to CA data, yielding predicted values of CA, and, second, the convolution integral of the predicted CA with F(t) be fitted to CT data using a deconvolution method. By so doing, each tissue's data are analyzed separately, thus mitigating disadvantage (ii). A method for system simulation is also proposed. The results of applying the approach to simulated data and to real thiopental data are reported.
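    A short numerical illustration of the central relation, CT(t) as the convolution of CA(t) with a disposition function F(t), is given below with a hypothetical monoexponential F(t); the concentration profiles and rate constants are invented for the example.

```python
import numpy as np

dt = 0.1                                  # time step (h)
t = np.arange(0, 24, dt)

# Hypothetical arterial concentration profile CA(t) (biexponential decline).
CA = 10.0 * np.exp(-0.5 * t) + 2.0 * np.exp(-0.05 * t)

# Hypothetical tissue disposition function F(t) (units 1/h): F(0) corresponds to
# clearance from blood into tissue divided by tissue volume, and its integral is
# the steady-state ratio CT/CA.
k_in, k_out = 0.8, 0.4
F = k_in * np.exp(-k_out * t)

# Tissue concentration as the discrete convolution CT(t) = (CA * F)(t).
CT = np.convolve(CA, F)[: len(t)] * dt

print("integral of F:", F.sum() * dt)     # approx. k_in / k_out = steady-state CT/CA
print("peak tissue concentration:", CT.max())
```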

  15. Plot showing ATLAS limits on Standard Model Higgs production in the mass range 110-150 GeV

    CERN Multimedia

    ATLAS Collaboration

    2011-01-01

    The combined upper limit on the Standard Model Higgs boson production cross section divided by the Standard Model expectation as a function of mH is indicated by the solid line. This is a 95% CL limit using the CLs method in the low mass range. The dotted line shows the median expected limit in the absence of a signal and the green and yellow bands reflect the corresponding 68% and 95% expected

  16. Plot showing ATLAS limits on Standard Model Higgs production in the mass range 100-600 GeV

    CERN Multimedia

    ATLAS Collaboration

    2011-01-01

    The combined upper limit on the Standard Model Higgs boson production cross section divided by the Standard Model expectation as a function of mH is indicated by the solid line. This is a 95% CL limit using the CLs method in the entire mass range. The dotted line shows the median expected limit in the absence of a signal and the green and yellow bands reflect the corresponding 68% and 95% expected

  17. MULTI MODEL DATA MINING APPROACH FOR HEART FAILURE PREDICTION

    Directory of Open Access Journals (Sweden)

    Priyanka H U

    2016-09-01

    Full Text Available Developing predictive modelling solutions for risk estimation is extremely challenging in health-care informatics. Risk estimation involves the integration of heterogeneous clinical sources having different representations from different health-care providers, making the task increasingly complex. Such sources are typically voluminous, diverse, and change significantly over time. Therefore, distributed and parallel computing tools, collectively termed big data tools, are needed to synthesize this information and assist the physician in making the right clinical decisions. In this work we propose a multi-model predictive architecture, a novel approach for combining the predictive ability of multiple models for better prediction accuracy. We demonstrate the effectiveness and efficiency of the proposed work on data from the Framingham Heart Study. Results show that the proposed multi-model predictive architecture is able to provide better accuracy than the best-model approach. By modelling the error of the predictive models we are able to choose a subset of models which yields accurate results. More information was modelled into the system by multi-level mining, which has resulted in enhanced predictive accuracy.
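    As one hedged illustration of combining several predictive models, the sketch below stacks three base classifiers with scikit-learn on synthetic data; it is not the architecture of the paper, and the Framingham data are replaced by a synthetic stand-in.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB

# Synthetic stand-in for clinical risk data (the study used the Framingham Heart Study).
X, y = make_classification(n_samples=1000, n_features=20, n_informative=8, random_state=0)

base_models = [
    ("lr", LogisticRegression(max_iter=1000)),
    ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
    ("nb", GaussianNB()),
]

# Multi-model predictor: a meta-learner combines the base models' predictions.
multi_model = StackingClassifier(estimators=base_models,
                                 final_estimator=LogisticRegression(max_iter=1000))

for name, model in base_models + [("multi-model", multi_model)]:
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {acc:.3f}")
```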

  18. a Study of Urban Stormwater Modeling Approach in Singapore Catchment

    Science.gov (United States)

    Liew, S. C.; Liong, S. Y.; Vu, M. T.

    2011-07-01

    Urbanization has the direct effect of increasing the amount of surface runoff to be discharged through man-made drainage systems. Thus, Singapore's rapid urbanization has drawn great attention to flooding issues. In view of this, a proper stormwater modeling approach is necessary for the assessment, planning, design, and control of storm and combined sewerage systems. Impacts of urbanization on surface runoff and catchment flooding in Singapore are studied in this paper. In this study, the application of SOBEK-urban 1D is introduced on model catchments and a hypothetical catchment model is created for simulation purposes. The stormwater modeling approach using SOBEK-urban offers a comprehensive modeling tool for simple or extensive urban drainage systems consisting of sewers and open channels, regardless of the size and complexity of the network. The findings from the present study show that stormwater modeling is able to identify flood-prone areas and the impact of the anticipated sea level rise on the urban drainage network. Consequently, the performance of the urban drainage system can be improved and early prevention approaches can be carried out.

  19. Approaches and models of intercultural education

    Directory of Open Access Journals (Sweden)

    Iván Manuel Sánchez Fontalvo

    2013-10-01

    Full Text Available To build an intercultural society, awareness of this need must be assumed in all social spheres, among which education plays a central role. Its role is transcendental, since it must promote educational spaces that form people with the virtues and capacities that allow them to live together in multicultural contexts and social diversities (sometimes unequal) in an increasingly globalized and interconnected world, and foster the development of shared feelings of civic belonging towards the neighborhood, city, region and country, giving them concern for, and critical judgement about, marginalization, poverty, misery and the inequitable distribution of wealth, which are causes of structural violence, while at the same time wanting to work for the welfare and transformation of these scenarios. From these premises, it is important to know the approaches and models of intercultural education that have been developed so far, analysing their impact on the educational contexts where they are applied.

  20. Hill-type muscle model parameters determined from experiments on single muscles show large animal-to-animal variation.

    Science.gov (United States)

    Blümel, Marcus; Guschlbauer, Christoph; Daun-Gruhn, Silvia; Hooper, Scott L; Büschges, Ansgar

    2012-11-01

    Models built using mean data can represent only a very small percentage, or none, of the population being modeled, and produce activity different from that of any member of it. Overcoming this "averaging" pitfall requires measuring, in single individuals in single experiments, all of the system's defining characteristics. We have developed protocols that allow all the parameters in the curves used in typical Hill-type models (passive and active force-length, series elasticity, force-activation, force-velocity) to be determined from experiments on individual stick insect muscles (Blümel et al. 2012a). Means can poorly represent the population only if the population shows large variation in its defining characteristics. We therefore used these protocols to measure extensor muscle defining parameters in multiple animals. Across-animal variability in these parameters can be very large, ranging from 1.3- to 17-fold. This large variation is consistent with earlier data in which extensor muscle responses to identical motor neuron driving showed large animal-to-animal variability (Hooper et al. 2006), and suggests that accurate modeling of extensor muscles requires modeling them individual-by-individual. These complete characterizations of individual muscles also allowed us to test for parameter correlations. Two parameter pairs significantly co-varied, suggesting that a simpler model could reproduce muscle responses equally well.
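    The sketch below shows how individually measured parameters enter a generic Hill-type force computation, and why two animals with different parameter sets produce different forces; the curve shapes and all parameter values are illustrative, not the fitted stick-insect values.

```python
import numpy as np

def hill_force(L, v, act, p):
    """Generic Hill-type muscle force:
    F = act * F_max * fl(L) * fv(v) + passive(L).
    All parameter values used below are illustrative, not fitted stick-insect data."""
    fl = np.exp(-((L - p["L_opt"]) / (p["width"] * p["L_opt"])) ** 2)         # active force-length
    fv = np.where(v <= 0,                                                      # Hill force-velocity
                  (p["v_max"] + v) / (p["v_max"] - v / p["curv"]),
                  1.3 - 0.3 * (p["v_max"] - v) / (p["v_max"] + 7.6 * v / p["curv"]))
    passive = np.where(L > p["L_slack"], p["k_pas"] * (L - p["L_slack"]) ** 2, 0.0)
    return act * p["F_max"] * fl * fv + passive

# Two "individuals" with different parameter sets, as large animal-to-animal variation implies.
animal_A = {"F_max": 150.0, "L_opt": 14.0, "width": 0.45, "v_max": 30.0, "curv": 0.25,
            "L_slack": 13.0, "k_pas": 2.0}
animal_B = dict(animal_A, F_max=60.0, width=0.30, k_pas=8.0)

L, v, act = 14.5, -5.0, 0.8   # length (mm), shortening velocity (mm/s), activation
print("animal A force:", hill_force(L, v, act, animal_A))
print("animal B force:", hill_force(L, v, act, animal_B))
```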

  1. Drifting model approach to modeling based on weighted support vector machines

    Institute of Scientific and Technical Information of China (English)

    冯瑞; 宋春林; 邵惠鹤

    2004-01-01

    This paper proposes a novel drifting modeling (DM) method. Briefly, we first employ an improved SVM algorithm named weighted support vector machines (W_SVMs), which is suitable for local learning, and then the DM method using this algorithm is proposed. By applying the proposed modeling method to a Fluidized Catalytic Cracking Unit (FCCU), the simulation results show that the performance of the proposed approach is superior to that of a global modeling method based on standard SVMs.
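    The W_SVMs algorithm itself is not reproduced here, but the core idea of weighting samples so that a local model tracks a drifting process can be sketched with scikit-learn's per-sample weights, as below; the data, weighting scheme, and hyperparameters are illustrative.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Synthetic drifting process: the input-output relation changes slowly over time.
t = np.arange(500)
X = rng.uniform(-1, 1, size=(500, 1))
drift = 0.003 * t
y = np.sin(3 * X[:, 0]) + drift * X[:, 0] + 0.05 * rng.standard_normal(500)

# Weighted (local) model: newer samples get exponentially larger weights,
# so the fit tracks the current operating regime.
weights = np.exp((t - t.max()) / 100.0)
local_model = SVR(kernel="rbf", C=10.0).fit(X, y, sample_weight=weights)

# Global model: all samples weighted equally.
global_model = SVR(kernel="rbf", C=10.0).fit(X, y)

x_new = np.array([[0.5]])
print("weighted/local prediction:", local_model.predict(x_new))
print("global prediction:        ", global_model.predict(x_new))
print("current true value:       ", np.sin(1.5) + drift[-1] * 0.5)
```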

  2. Validation of models with constant bias: an applied approach

    Directory of Open Access Journals (Sweden)

    Salvador Medina-Peralta

    2014-06-01

    Full Text Available Objective. This paper presents extensions to the statistical validation method based on the procedure of Freese when a model shows constant bias (CB) in its predictions, and illustrates the method with data from a new mechanistic model that predicts weight gain in cattle. Materials and methods. The extensions were the hypothesis tests and maximum anticipated error for the alternative approach, and the confidence interval for a quantile of the distribution of errors. Results. The model evaluated showed CB; once the CB is removed, and with a confidence level of 95%, the magnitude of the error does not exceed 0.575 kg. Therefore, the validated model can be used to predict the daily weight gain of cattle, although it will require an adjustment in its structure based on the presence of CB to increase the accuracy of its forecasts. Conclusions. The confidence interval for the 1-α quantile of the distribution of errors after correcting the constant bias allows determining the upper limit for the magnitude of the prediction error and using it to evaluate the evolution of the model in forecasting the system. The confidence interval approach to validating a model is more informative than the hypothesis tests for the same purpose.
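    A rough sketch of the underlying idea is given below: estimate the constant bias as the mean error, remove it, and compute an upper confidence bound for a high quantile of the absolute corrected errors using a distribution-free order-statistic bound. The data are synthetic and the bound shown is a generic stand-in, not the Freese-based procedure of the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Synthetic observed vs. predicted values with a constant bias (illustration only).
observed = rng.normal(1.0, 0.3, size=60)                       # e.g. daily weight gain (kg)
predicted = observed + 0.25 + rng.normal(0, 0.2, size=60)      # model over-predicts by ~0.25 kg

errors = predicted - observed
bias = errors.mean()                    # estimate of the constant bias (CB)
corrected = np.abs(errors - bias)       # absolute errors after removing CB

# Distribution-free upper 95% confidence bound for the 0.95 quantile of |error|:
# use the k-th order statistic, with k the smallest value such that
# P(Binomial(n, p) <= k - 1) >= 1 - alpha.
n, p, alpha = len(corrected), 0.95, 0.05
k = int(np.min(np.where(stats.binom.cdf(np.arange(n + 1) - 1, n, p) >= 1 - alpha)[0]))
bound = np.sort(corrected)[min(k, n) - 1]

print(f"estimated constant bias: {bias:.3f} kg")
print(f"95% upper bound on the 95th percentile of |error|: {bound:.3f} kg")
```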

  3. Value Delivery Architecture Modeling – A New Approach for Business Modeling

    Directory of Open Access Journals (Sweden)

    Joachim Metzger

    2015-08-01

    Full Text Available Complexity and uncertainty have evolved as important challenges for entrepreneurship in many industries. Value Delivery Architecture Modeling (VDAM) is a proposal for a new approach for business modeling to conquer these challenges. In addition to the creation of transparency and clarity, our approach supports the operationalization of business model ideas. VDAM is based on the combination of a new business modeling language called VDML, ontology building, and the implementation of a level of cross-company abstraction. The application of our new approach in the area of electric mobility in Germany, an industry sector with high levels of uncertainty and a lack of common understanding, shows several promising results: VDAM enables the development of an unambiguous and unbiased view on value creation. Additionally it allows for several applications leading to a more informed decision towards the implementation of new business models.

  4. Noether symmetry approach in f(R)-tachyon model

    Energy Technology Data Exchange (ETDEWEB)

    Jamil, Mubasher, E-mail: mjamil@camp.nust.edu.pk [Center for Advanced Mathematics and Physics (CAMP), National University of Sciences and Technology (NUST), H-12, Islamabad (Pakistan); Mahomed, F.M., E-mail: Fazal.Mahomed@wits.ac.za [Centre for Differential Equations, Continuum Mechanics and Applications, School of Computational and Applied Mathematics, University of the Witwatersrand, Wits 2050 (South Africa); Momeni, D., E-mail: d.momeni@yahoo.com [Department of Physics, Faculty of Sciences, Tarbiat Moa' llem University, Tehran (Iran, Islamic Republic of)

    2011-08-26

    In this Letter by utilizing the Noether symmetry approach in cosmology, we attempt to find the tachyon potential via the application of this kind of symmetry to a flat Friedmann-Robertson-Walker (FRW) metric. We reduce the system of equations to simpler ones and obtain the general class of the tachyon's potential function and f(R) functions. We have found that the Noether symmetric model results in a power law f(R) and an inverse fourth power potential for the tachyonic field. Further we investigate numerically the cosmological evolution of our model and show explicitly the behavior of the equation of state crossing the cosmological constant boundary.
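    For reference, the forms the abstract reports the Noether symmetry to select can be written schematically as below (in LaTeX); the constants f0, V0 and the exponent n are left generic, since the Letter's exact expressions are not reproduced here.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Spatially flat FRW background assumed in the abstract
\begin{equation}
  ds^{2} = -dt^{2} + a^{2}(t)\left(dx^{2} + dy^{2} + dz^{2}\right)
\end{equation}
% Forms reported to follow from the Noether symmetry requirement:
% a power-law f(R) and an inverse fourth-power tachyon potential
\begin{equation}
  f(R) = f_{0}\, R^{n}, \qquad V(\phi) = \frac{V_{0}}{\phi^{4}}
\end{equation}
\end{document}
```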

  5. A Bayesian modeling approach for generalized semiparametric structural equation models.

    Science.gov (United States)

    Song, Xin-Yuan; Lu, Zhao-Hua; Cai, Jing-Heng; Ip, Edward Hak-Sing

    2013-10-01

    In behavioral, biomedical, and psychological studies, structural equation models (SEMs) have been widely used for assessing relationships between latent variables. Regression-type structural models based on parametric functions are often used for such purposes. In many applications, however, parametric SEMs are not adequate to capture subtle patterns in the functions over the entire range of the predictor variable. A different but equally important limitation of traditional parametric SEMs is that they are not designed to handle mixed data types-continuous, count, ordered, and unordered categorical. This paper develops a generalized semiparametric SEM that is able to handle mixed data types and to simultaneously model different functional relationships among latent variables. A structural equation of the proposed SEM is formulated using a series of unspecified smooth functions. The Bayesian P-splines approach and Markov chain Monte Carlo methods are developed to estimate the smooth functions and the unknown parameters. Moreover, we examine the relative benefits of semiparametric modeling over parametric modeling using a Bayesian model-comparison statistic, called the complete deviance information criterion (DIC). The performance of the developed methodology is evaluated using a simulation study. To illustrate the method, we used a data set derived from the National Longitudinal Survey of Youth.

  6. Plectasin shows intracellular activity against Staphylococcus aureus in human THP-1 monocytes and in a mouse peritonitis model

    DEFF Research Database (Denmark)

    Brinch, Karoline Sidelmann; Sandberg, Anne; Baudoux, Pierre

    2009-01-01

    was maintained (maximal relative efficacy [E(max)], 1.0- to 1.3-log reduction in CFU) even though efficacy was inferior to that of extracellular killing (E(max), >4.5-log CFU reduction). Animal studies included a novel use of the mouse peritonitis model, exploiting extra- and intracellular differentiation assays...... concentration. These findings stress the importance of performing studies of extra- and intracellular activity since these features cannot be predicted from traditional MIC and killing kinetic studies. Application of both the THP-1 and the mouse peritonitis models showed that the in vitro results were similar...

  7. A simplified GIS approach to modeling global leaf water isoscapes.

    Directory of Open Access Journals (Sweden)

    Jason B West

    Full Text Available The stable hydrogen (δ²H) and oxygen (δ¹⁸O) isotope ratios of organic and inorganic materials record biological and physical processes through the effects of substrate isotopic composition and fractionations that occur as reactions proceed. At large scales, these processes can exhibit spatial predictability because of the effects of coherent climatic patterns over the Earth's surface. Attempts to model spatial variation in the stable isotope ratios of water have been made for decades. Leaf water has a particular importance for some applications, including plant organic materials that record spatial and temporal climate variability and that may be a source of food for migrating animals. It is also an important source of the variability in the isotopic composition of atmospheric gases. Although efforts to model global-scale leaf water isotope ratio spatial variation have been made (especially of δ¹⁸O), significant uncertainty remains in models and their execution across spatial domains. We introduce here a Geographic Information System (GIS) approach to the generation of global, spatially-explicit isotope landscapes (= isoscapes) of "climate normal" leaf water isotope ratios. We evaluate the approach and the resulting products by comparison with simulation model outputs and point measurements, where obtainable, over the Earth's surface. The isoscapes were generated using biophysical models of isotope fractionation and spatially continuous precipitation isotope and climate layers as input model drivers. Leaf water δ¹⁸O isoscapes produced here generally agreed with latitudinal averages from GCM/biophysical model products, as well as mean values from point measurements. These results show global-scale spatial coherence in leaf water isotope ratios, similar to that observed for precipitation, and validate the GIS approach to modeling leaf water isotopes. These results demonstrate that relatively simple models of leaf water enrichment

  8. Social Network Analyses and Nutritional Behavior: An Integrated Modeling Approach

    Directory of Open Access Journals (Sweden)

    Alistair McNair Senior

    2016-01-01

    Full Text Available Animals have evolved complex foraging strategies to obtain a nutritionally balanced diet and associated fitness benefits. Recent advances in nutrition research, combining state-space models of nutritional geometry with agent-based models of systems biology, show how nutrient targeted foraging behavior can also influence animal social interactions, ultimately affecting collective dynamics and group structures. Here we demonstrate how social network analyses can be integrated into such a modeling framework and provide a tangible and practical analytical tool to compare experimental results with theory. We illustrate our approach by examining the case of nutritionally mediated dominance hierarchies. First we show how nutritionally explicit agent-based models that simulate the emergence of dominance hierarchies can be used to generate social networks. Importantly the structural properties of our simulated networks bear similarities to dominance networks of real animals (where conflicts are not always directly related to nutrition. Finally, we demonstrate how metrics from social network analyses can be used to predict the fitness of agents in these simulated competitive environments. Our results highlight the potential importance of nutritional mechanisms in shaping dominance interactions in a wide range of social and ecological contexts. Nutrition likely influences social interaction in many species, and yet a theoretical framework for exploring these effects is currently lacking. Combining social network analyses with computational models from nutritional ecology may bridge this divide, representing a pragmatic approach for generating theoretical predictions for nutritional experiments.

  9. Generalized linear models with coarsened covariates: a practical Bayesian approach.

    Science.gov (United States)

    Johnson, Timothy R; Wiest, Michelle M

    2014-06-01

    Coarsened covariates are a common and sometimes unavoidable phenomenon encountered in statistical modeling. Covariates are coarsened when their values or categories have been grouped. This may be done to protect privacy or to simplify data collection or analysis when researchers are not aware of their drawbacks. Analyses with coarsened covariates based on ad hoc methods can compromise the validity of inferences. One valid method for accounting for a coarsened covariate is to use a marginal likelihood derived by summing or integrating over the unknown realizations of the covariate. However, algorithms for estimation based on this approach can be tedious to program and can be computationally expensive. These are significant obstacles to their use in practice. To overcome these limitations, we show that when expressed as a Bayesian probability model, a generalized linear model with a coarsened covariate can be posed as a tractable missing data problem where the missing data are due to censoring. We also show that this model is amenable to widely available general-purpose software for simulation-based inference for Bayesian probability models, providing researchers a very practical approach for dealing with coarsened covariates.

  10. An integrated approach to permeability modeling using micro-models

    Energy Technology Data Exchange (ETDEWEB)

    Hosseini, A.H.; Leuangthong, O.; Deutsch, C.V. [Society of Petroleum Engineers, Canadian Section, Calgary, AB (Canada)]|[Alberta Univ., Edmonton, AB (Canada)

    2008-10-15

    An important factor in predicting the performance of steam assisted gravity drainage (SAGD) well pairs is the spatial distribution of permeability. Complications that make the inference of a reliable porosity-permeability relationship impossible include the presence of short-scale variability in sand/shale sequences; preferential sampling of core data; and uncertainty in upscaling parameters. Micro-modelling is a simple and effective method for overcoming these complications. This paper proposed a micro-modeling approach to account for sampling bias, small laminated features with high permeability contrast, and uncertainty in upscaling parameters. The paper described the steps and challenges of micro-modeling and discussed the construction of binary mixture geo-blocks; flow simulation and upscaling; extended power law formalism (EPLF); and the application of micro-modeling and EPLF. An extended power-law formalism to account for changes in clean sand permeability as a function of macroscopic shale content was also proposed and tested against flow simulation results. There was close agreement between the model and simulation results. The proposed methodology was also applied to build the porosity-permeability relationship for laminated and brecciated facies of McMurray oil sands. The model results were in good agreement with the experimental data. 8 refs., 17 figs.

  11. ISM Approach to Model Offshore Outsourcing Risks

    Directory of Open Access Journals (Sweden)

    Sunand Kumar

    2014-07-01

    Full Text Available In an effort to achieve a competitive advantage via cost reductions and improved market responsiveness, organizations are increasingly employing offshore outsourcing as a major component of their supply chain strategies. But as is evident from the literature, a number of risks, such as political risk, risk due to cultural differences, compliance and regulatory risk, opportunistic risk and organizational structural risk, adversely affect the performance of offshore outsourcing in a supply chain network. This also leads to dissatisfaction among different stakeholders. The main objective of this paper is to identify and understand the mutual interaction among the various risks which affect the performance of offshore outsourcing. To this effect, the authors have identified various risks through an extant review of the literature. From this information, an integrated model using interpretive structural modelling (ISM) for risks affecting offshore outsourcing is developed and the structural relationships between these risks are modeled. Further, MICMAC analysis is done to analyze the driving power and dependency of the risks, which shall be helpful to managers to identify and classify important criteria and to reveal the direct and indirect effects of each criterion on offshore outsourcing. Results show that political risk and risk due to cultural differences act as strong drivers.
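    The mechanics of the ISM/MICMAC step can be sketched as below: starting from a binary direct-influence matrix over the risks named in the abstract, compute the transitive-closure reachability matrix, then the driving power (row sums) and dependence (column sums). The influence matrix used here is hypothetical.

```python
import numpy as np

risks = ["political", "cultural", "compliance", "opportunistic", "structural"]

# Hypothetical direct influence matrix A: A[i, j] = 1 if risk i influences risk j.
A = np.array([
    [1, 1, 1, 0, 1],   # political
    [0, 1, 0, 1, 1],   # cultural
    [0, 0, 1, 0, 1],   # compliance
    [0, 0, 0, 1, 0],   # opportunistic
    [0, 0, 0, 1, 1],   # structural
])

# Final reachability matrix: transitive closure (Warshall's algorithm).
R = A.copy().astype(bool)
for k in range(len(risks)):
    R = R | (R[:, [k]] & R[[k], :])

driving = R.sum(axis=1)      # MICMAC driving power = row sums
dependence = R.sum(axis=0)   # MICMAC dependence = column sums

for name, d, dep in zip(risks, driving, dependence):
    print(f"{name:13s} driving power = {d}, dependence = {dep}")
```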

  12. Development of a computationally efficient urban modeling approach

    DEFF Research Database (Denmark)

    Wolfs, Vincent; Murla, Damian; Ntegeka, Victor

    2016-01-01

    This paper presents a parsimonious and data-driven modelling approach to simulate urban floods. Flood levels simulated by detailed 1D-2D hydrodynamic models can be emulated using the presented conceptual modelling approach with a very short calculation time. In addition, the model detail can be a...

  13. A reservoir simulation approach for modeling of naturally fractured reservoirs

    Directory of Open Access Journals (Sweden)

    H. Mohammadi

    2012-12-01

    Full Text Available In this investigation, the Warren and Root model proposed for the simulation of naturally fractured reservoir was improved. A reservoir simulation approach was used to develop a 2D model of a synthetic oil reservoir. Main rock properties of each gridblock were defined for two different types of gridblocks called matrix and fracture gridblocks. These two gridblocks were different in porosity and permeability values which were higher for fracture gridblocks compared to the matrix gridblocks. This model was solved using the implicit finite difference method. Results showed an improvement in the Warren and Root model especially in region 2 of the semilog plot of pressure drop versus time, which indicated a linear transition zone with no inflection point as predicted by other investigators. Effects of fracture spacing, fracture permeability, fracture porosity, matrix permeability and matrix porosity on the behavior of a typical naturally fractured reservoir were also presented.

  14. Anomalous superconductivity in the tJ model; moment approach

    DEFF Research Database (Denmark)

    Sørensen, Mads Peter; Rodriguez-Nunez, J.J.

    1997-01-01

    By extending the moment approach of Nolting (Z, Phys, 225 (1972) 25) in the superconducting phase, we have constructed the one-particle spectral functions (diagonal and off-diagonal) for the tJ model in any dimensions. We propose that both the diagonal and the off-diagonal spectral functions...... Hartree shift which in the end result enlarges the bandwidth of the free carriers allowing us to take relative high values of J/t and allowing superconductivity to live in the T-c-rho phase diagram, in agreement with numerical calculations in a cluster, We have calculated the static spin susceptibility......, chi(T), and the specific heat, C-v(T), within the moment approach. We find that all the relevant physical quantities show the signature of superconductivity at T-c in the form of kinks (anomalous behavior) or jumps, for low density, in agreement with recent published literature, showing a generic...

  15. Skeletal Muscle Differentiation on a Chip Shows Human Donor Mesoangioblasts' Efficiency in Restoring Dystrophin in a Duchenne Muscular Dystrophy Model.

    Science.gov (United States)

    Serena, Elena; Zatti, Susi; Zoso, Alice; Lo Verso, Francesca; Tedesco, F Saverio; Cossu, Giulio; Elvassore, Nicola

    2016-12-01

    : Restoration of the protein dystrophin on muscle membrane is the goal of many research lines aimed at curing Duchenne muscular dystrophy (DMD). Results of ongoing preclinical and clinical trials suggest that partial restoration of dystrophin might be sufficient to significantly reduce muscle damage. Different myogenic progenitors are candidates for cell therapy of muscular dystrophies, but only satellite cells and pericytes have already entered clinical experimentation. This study aimed to provide in vitro quantitative evidence of the ability of mesoangioblasts to restore dystrophin, in terms of protein accumulation and distribution, within myotubes derived from DMD patients, using a microengineered model. We designed an ad hoc experimental strategy to miniaturize on a chip the standard process of muscle regeneration independent of variables such as inflammation and fibrosis. It is based on the coculture, at different ratios, of human dystrophin-positive myogenic progenitors and dystrophin-negative myoblasts in a substrate with muscle-like physiological stiffness and cell micropatterns. Results showed that both healthy myoblasts and mesoangioblasts restored dystrophin expression in DMD myotubes. However, mesoangioblasts showed unexpected efficiency with respect to myoblasts in dystrophin production in terms of the amount of protein produced (40% vs. 15%) and length of the dystrophin membrane domain (210-240 µm vs. 40-70 µm). These results show that our microscaled in vitro model of human DMD skeletal muscle validated previous in vivo preclinical work and may be used to predict efficacy of new methods aimed at enhancing dystrophin accumulation and distribution before they are tested in vivo, reducing time, costs, and variability of clinical experimentation. This study aimed to provide in vitro quantitative evidence of the ability of human mesoangioblasts to restore dystrophin, in terms of protein accumulation and distribution, within myotubes derived from

  16. A Bayesian Approach for Structural Learning with Hidden Markov Models

    Directory of Open Access Journals (Sweden)

    Cen Li

    2002-01-01

    Full Text Available Hidden Markov Models (HMM) have proved to be a successful modeling paradigm for dynamic and spatial processes in many domains, such as speech recognition, genomics, and general sequence alignment. Typically, in these applications, the model structures are predefined by domain experts. Therefore, the HMM learning problem focuses on the learning of the parameter values of the model to fit the given data sequences. However, when one considers other domains, such as economics and physiology, a model structure capturing the system's dynamic behavior is not available. In order to successfully apply the HMM methodology in these domains, it is important that a mechanism is available for automatically deriving the model structure from the data. This paper presents a HMM learning procedure that simultaneously learns the model structure and the maximum likelihood parameter values of a HMM from data. The HMM model structures are derived based on the Bayesian model selection methodology. In addition, we introduce a new initialization procedure for HMM parameter value estimation based on the K-means clustering method. Experimental results with artificially generated data show the effectiveness of the approach.
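    A hedged sketch of the two ingredients highlighted in the abstract, K-means initialization of the state means and model-structure selection, is given below; it assumes the hmmlearn and scikit-learn packages, uses synthetic data, and substitutes a BIC score for the paper's Bayesian model selection.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Synthetic 1-D sequence generated from 3 regimes (stand-in for real data).
means_true = [0.0, 3.0, -3.0]
states = rng.integers(0, 3, size=600)
X = (np.array(means_true)[states] + 0.5 * rng.standard_normal(600)).reshape(-1, 1)

def fit_hmm(X, n_states):
    """Fit a Gaussian HMM whose state means are initialized by K-means."""
    km = KMeans(n_clusters=n_states, n_init=10, random_state=0).fit(X)
    model = GaussianHMM(n_components=n_states, covariance_type="diag",
                        n_iter=200, init_params="stc", random_state=0)
    model.means_ = km.cluster_centers_       # "m" excluded from init_params, so this is kept
    return model.fit(X)

def bic(model, X):
    """BIC as a simple stand-in for Bayesian model-structure selection."""
    k = model.n_components
    n_params = (k - 1) + k * (k - 1) + 2 * k * X.shape[1]   # start, transition, means + vars
    return -2 * model.score(X) + n_params * np.log(len(X))

scores = {k: bic(fit_hmm(X, k), X) for k in range(2, 6)}
best = min(scores, key=scores.get)
print("BIC per number of states:", scores)
print("selected model structure:", best, "states")
```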

  17. Social Network Analysis and Nutritional Behavior: An Integrated Modeling Approach.

    Science.gov (United States)

    Senior, Alistair M; Lihoreau, Mathieu; Buhl, Jerome; Raubenheimer, David; Simpson, Stephen J

    2016-01-01

    Animals have evolved complex foraging strategies to obtain a nutritionally balanced diet and associated fitness benefits. Recent research combining state-space models of nutritional geometry with agent-based models (ABMs), show how nutrient targeted foraging behavior can also influence animal social interactions, ultimately affecting collective dynamics and group structures. Here we demonstrate how social network analyses can be integrated into such a modeling framework and provide a practical analytical tool to compare experimental results with theory. We illustrate our approach by examining the case of nutritionally mediated dominance hierarchies. First we show how nutritionally explicit ABMs that simulate the emergence of dominance hierarchies can be used to generate social networks. Importantly the structural properties of our simulated networks bear similarities to dominance networks of real animals (where conflicts are not always directly related to nutrition). Finally, we demonstrate how metrics from social network analyses can be used to predict the fitness of agents in these simulated competitive environments. Our results highlight the potential importance of nutritional mechanisms in shaping dominance interactions in a wide range of social and ecological contexts. Nutrition likely influences social interactions in many species, and yet a theoretical framework for exploring these effects is currently lacking. Combining social network analyses with computational models from nutritional ecology may bridge this divide, representing a pragmatic approach for generating theoretical predictions for nutritional experiments.

  18. Connectivity of channelized reservoirs: a modelling approach

    Energy Technology Data Exchange (ETDEWEB)

    Larue, David K. [ChevronTexaco, Bakersfield, CA (United States); Hovadik, Joseph [ChevronTexaco, San Ramon, CA (United States)

    2006-07-01

    Connectivity represents one of the fundamental properties of a reservoir that directly affects recovery. If a portion of the reservoir is not connected to a well, it cannot be drained. Geobody or sandbody connectivity is defined as the percentage of the reservoir that is connected, and reservoir connectivity is defined as the percentage of the reservoir that is connected to wells. Previous studies have mostly considered mathematical, physical and engineering aspects of connectivity. In the current study, the stratigraphy of connectivity is characterized using simple, 3D geostatistical models. Based on these modelling studies, stratigraphic connectivity is good, usually greater than 90%, if the net: gross ratio, or sand fraction, is greater than about 30%. At net: gross values less than 30%, there is a rapid diminishment of connectivity as a function of net: gross. This behaviour between net: gross and connectivity defines a characteristic 'S-curve', in which the connectivity is high for net: gross values above 30%, then diminishes rapidly and approaches 0. Well configuration factors that can influence reservoir connectivity are well density, well orientation (vertical or horizontal; horizontal parallel to channels or perpendicular) and length of completion zones. Reservoir connectivity as a function of net: gross can be improved by several factors: presence of overbank sandy facies, deposition of channels in a channel belt, deposition of channels with high width/thickness ratios, and deposition of channels during variable floodplain aggradation rates. Connectivity can be reduced substantially in two-dimensional reservoirs, in map view or in cross-section, by volume support effects and by stratigraphic heterogeneities. It is well known that in two dimensions, the cascade zone for the 'S-curve' of net: gross plotted against connectivity occurs at about 60% net: gross. Generalizing this knowledge, any time that a reservoir can be regarded as
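    The 'S-curve' behaviour can be reproduced in a few lines even with uncorrelated random sand/shale grids (no channel geometry, so not the paper's geostatistical models): label connected sand geobodies and track the fraction of sand in the largest one as net:gross varies, as sketched below.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(42)

def largest_geobody_fraction(net_to_gross, shape=(40, 40, 40)):
    """Fraction of sand cells belonging to the largest connected geobody
    (6-connectivity) in an uncorrelated random sand/shale grid."""
    sand = rng.random(shape) < net_to_gross
    labels, n = ndimage.label(sand)
    if n == 0:
        return 0.0
    sizes = np.bincount(labels.ravel())[1:]       # drop background label 0
    return sizes.max() / sand.sum()

for ng in [0.1, 0.2, 0.3, 0.4, 0.6, 0.8]:
    print(f"net:gross {ng:.1f} -> connected fraction {largest_geobody_fraction(ng):.2f}")
```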

  19. Genetic Algorithm Approaches to Prebiotic Chemistry Modeling

    Science.gov (United States)

    Lohn, Jason; Colombano, Silvano

    1997-01-01

    We model an artificial chemistry comprised of interacting polymers by specifying two initial conditions: a distribution of polymers and a fixed set of reversible catalytic reactions. A genetic algorithm is used to find a set of reactions that exhibit a desired dynamical behavior. Such a technique is useful because it allows an investigator to determine whether a specific pattern of dynamics can be produced, and if it can, the reaction network found can be then analyzed. We present our results in the context of studying simplified chemical dynamics in theorized protocells - hypothesized precursors of the first living organisms. Our results show that given a small sample of plausible protocell reaction dynamics, catalytic reaction sets can be found. We present cases where this is not possible and also analyze the evolved reaction sets.

  20. Connecting with The Biggest Loser: an extended model of parasocial interaction and identification in health-related reality TV shows.

    Science.gov (United States)

    Tian, Yan; Yoo, Jina H

    2015-01-01

    This study investigates audience responses to health-related reality TV shows in the setting of The Biggest Loser. It conceptualizes a model for audience members' parasocial interaction and identification with cast members and explores antecedents and outcomes of parasocial interaction and identification. Data analysis suggests the following direct relationships: (1) audience members' exposure to the show is positively associated with parasocial interaction, which in turn is positively associated with identification, (2) parasocial interaction is positively associated with exercise self-efficacy, whereas identification is negatively associated with exercise self-efficacy, and (3) exercise self-efficacy is positively associated with exercise behavior. Indirect effects of parasocial interaction and identification on exercise self-efficacy and exercise behavior are also significant. We discuss the theoretical and practical implications of these findings.

  1. Approaching the other: Investigation of a descriptive belief revision model

    Directory of Open Access Journals (Sweden)

    Spyridon Stelios

    2016-12-01

    Full Text Available When an individual—a hearer—is confronted with an opinion expressed by another individual—a speaker—differing from her only in terms of a degree of belief, how will she react? In trying to answer that question this paper reintroduces and investigates a descriptive belief revision model designed to measure approaches. Parameters of the model are the hearer’s credibility account of the speaker, the initial difference between the hearer’s and speaker’s degrees of belief, and the hearer’s resistance to change. Within an interdisciplinary framework, two empirical studies were conducted. A comparison was carried out between empirically recorded revisions and revisions according to the model. Results showed that the theoretical model is highly confirmed. An interesting finding is the measurement of an “unexplainable behaviour” that is not classified either as repulsion or as approach. At a second level of analysis, the model is compared to the Bayesian framework of inference. Structural differences and evidence for optimal descriptive adequacy of the former were highlighted.

  2. An implicit approach to model plant infestation by insect pests.

    Science.gov (United States)

    Lopes, Christelle; Spataro, Thierry; Doursat, Christophe; Lapchin, Laurent; Arditi, Roger

    2007-09-07

    Various spatial approaches were developed to study the effect of spatial heterogeneities on population dynamics. We present in this paper a flux-based model to describe an aphid-parasitoid system in a closed and spatially structured environment, i.e. a greenhouse. Derived from previous work and adapted to host-parasitoid interactions, our model represents the level of plant infestation as a continuous variable corresponding to the number of plants bearing a given density of pests at a given time. The variation of this variable is described by a partial differential equation. It is coupled to an ordinary differential equation and a delay-differential equation that describe the parasitized host population and the parasitoid population, respectively. We have applied our approach to the pest Aphis gossypii and to one of its parasitoids, Lysiphlebus testaceipes, in a melon greenhouse. Numerical simulations showed that, regardless of the number and distribution of hosts in the greenhouse, the aphid population is slightly larger if parasitoids display a type III rather than a type II functional response. However, the population dynamics depend on the initial distribution of hosts and the initial density of parasitoids released, which is interesting for biological control strategies. Sensitivity analysis showed that the delay in the parasitoid equation and the growth rate of the pest population are crucial parameters for predicting the dynamics. We demonstrate here that such a flux-based approach generates relevant predictions with a more synthetic formalism than a common plant-by-plant model. We also explain how this approach can be better adapted to test different management strategies and to manage crops of several greenhouses.

  3. Object-Oriented Approach to Modeling Units of Pneumatic Systems

    Directory of Open Access Journals (Sweden)

    Yu. V. Kyurdzhiev

    2014-01-01

    Full Text Available The article shows the relevance of object-oriented programming approaches when modeling pneumatic units (PU). Based on the analysis of the calculation schemes of pneumatic system aggregates, two basic objects were identified: a flow cavity and a material point. Basic interactions of objects are defined. Cavity-cavity interaction: exchange of matter and energy via mass flows. Cavity-point interaction: force interaction and exchange of energy in the form of work. Point-point interaction: force interaction, elastic interaction, inelastic interaction, and intervals of displacement. The authors have developed mathematical models of the basic objects and interactions. The models and interactions of elements are implemented using object-oriented programming. Mathematical models of the elements of a PU design scheme are implemented in classes derived from the base classes. These classes implement the models of a flow cavity, piston, diaphragm, short channel, diaphragm opened by a given law, spring, bellows, elastic collision, inelastic collision, friction, PU stages with limited movement, etc. Numerical integration of the differential equations for the mathematical models of the PU design scheme elements is based on the fourth-order Runge-Kutta method. On request, each class performs one tact of integration, i.e. calculation of one coefficient of the method. The paper presents an integration algorithm for the system of differential equations. All objects of the PU design scheme are placed in a singly linked list. An iterator loop initiates the integration tact of all the objects in the list. Every fourth iteration makes the transition to the next integration step. The calculation process stops when any object raises a shutdown flag. The proposed approach was tested in the calculation of a number of PU designs. Compared with traditional approaches to modeling, the method proposed by the authors features easy enhancement, code reuse, high reliability
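    The integration pattern described (model objects derived from a base class, kept in a list and advanced by a fourth-order Runge-Kutta scheme) can be sketched as below; a damped spring-mass material point stands in for the pneumatic cavity and piston models, and all values are illustrative.

```python
# Schematic of the pattern described in the abstract: model objects derived from a
# common base class are kept in a list and advanced with fourth-order Runge-Kutta steps.
class ModelObject:
    def state(self):
        raise NotImplementedError
    def derivative(self, state, t):
        raise NotImplementedError
    def set_state(self, state):
        raise NotImplementedError

class MaterialPoint(ModelObject):
    """Material point on a damped spring: x'' = -(k*x + c*v)/m (stand-in for a PU element)."""
    def __init__(self, m, k, c, x0, v0):
        self.m, self.k, self.c = m, k, c
        self.x, self.v = x0, v0
    def state(self):
        return (self.x, self.v)
    def derivative(self, state, t):
        x, v = state
        return (v, -(self.k * x + self.c * v) / self.m)
    def set_state(self, state):
        self.x, self.v = state

def rk4_step(objects, t, dt):
    """Advance every object in the list by one RK4 step of size dt."""
    for obj in objects:
        y0 = obj.state()
        k1 = obj.derivative(y0, t)
        k2 = obj.derivative(tuple(y + 0.5 * dt * k for y, k in zip(y0, k1)), t + 0.5 * dt)
        k3 = obj.derivative(tuple(y + 0.5 * dt * k for y, k in zip(y0, k2)), t + 0.5 * dt)
        k4 = obj.derivative(tuple(y + dt * k for y, k in zip(y0, k3)), t + dt)
        obj.set_state(tuple(y + dt / 6.0 * (a + 2 * b + 2 * c + d)
                            for y, a, b, c, d in zip(y0, k1, k2, k3, k4)))

objects = [MaterialPoint(m=1.0, k=50.0, c=0.5, x0=0.02, v0=0.0)]
t, dt = 0.0, 0.001
for _ in range(1000):
    rk4_step(objects, t, dt)
    t += dt
print("position after 1 s:", objects[0].x)
```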

  4. An interdisciplinary approach for earthquake modelling and forecasting

    Science.gov (United States)

    Han, P.; Zhuang, J.; Hattori, K.; Ogata, Y.

    2016-12-01

    Earthquakes are among the most serious disasters, and they may cause heavy casualties and economic losses. Especially in the past two decades, huge/mega earthquakes have hit many countries. Effective earthquake forecasting (including time, location, and magnitude) becomes extremely important and urgent. To date, various heuristically derived algorithms have been developed for forecasting earthquakes. Generally, they can be classified into two types: catalog-based approaches and non-catalog-based approaches. Thanks to the rapid development of statistical seismology in the past 30 years, we are now able to evaluate the performances of these earthquake forecast approaches quantitatively. Although a certain amount of precursory information is available in both earthquake catalogs and non-catalog observations, earthquake forecasting is still far from satisfactory. In most cases, the precursory phenomena were studied individually. An earthquake model that combines self-exciting and mutually exciting elements was developed by Ogata and Utsu from the Hawkes process. The core idea of this combined model is that the status of the event at present is controlled by the event itself (self-exciting) and all the external factors (mutually exciting) in the past. In essence, the conditional intensity function is a time-varying Poisson process with rate λ(t), which is composed of the background rate, the self-exciting term (the information from past seismic events), and the external excitation term (the information from past non-seismic observations). This model shows us a way to integrate the catalog-based and non-catalog-based forecasts. Against this background, we are trying to develop a new earthquake forecast model which combines catalog-based and non-catalog-based approaches.
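    A minimal numerical form of the conditional intensity described in the abstract is sketched below: a background rate plus a self-exciting sum over past earthquakes plus an external excitation term driven by past non-seismic observations. The exponential kernels and all parameter values are illustrative, not fitted ones.

```python
import numpy as np

# Illustrative parameters (not fitted values).
mu = 0.2                    # background rate (events/day)
alpha, beta = 0.8, 1.5      # self-exciting kernel: alpha * exp(-beta * dt)
gamma, delta = 0.3, 0.5     # external-excitation kernel for non-seismic anomalies

past_quakes = np.array([2.0, 5.5, 6.0])   # days of past seismic events
anomalies = np.array([4.0, 5.8])          # days of past non-seismic observations

def intensity(t):
    """Conditional intensity lambda(t) = background + self-excitation + external excitation."""
    dq = t - past_quakes[past_quakes < t]
    da = t - anomalies[anomalies < t]
    self_term = np.sum(alpha * np.exp(-beta * dq))
    external_term = np.sum(gamma * np.exp(-delta * da))
    return mu + self_term + external_term

for t in [1.0, 6.1, 10.0]:
    print(f"lambda({t:4.1f}) = {intensity(t):.3f} events/day")
```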

  5. Systematic approach to verification and validation: High explosive burn models

    Energy Technology Data Exchange (ETDEWEB)

    Menikoff, Ralph [Los Alamos National Laboratory; Scovel, Christina A. [Los Alamos National Laboratory

    2012-04-16

    , run a simulation, and generate a comparison plot showing simulated and experimental velocity gauge data. These scripts are then applied to several series of experiments and to several HE burn models. The same systematic approach is applicable to other types of material models; for example, equations of state models and material strength models.

  6. Forecasting wind-driven wildfires using an inverse modelling approach

    Directory of Open Access Journals (Sweden)

    O. Rios

    2013-12-01

    Full Text Available A technology able to rapidly forecast wildfire dynamics would lead to a paradigm shift in the response to emergencies, providing the Fire Service with essential information about the on-going fire. The article at hand presents and explores a novel methodology to forecast wildfire dynamics in wind-driven conditions, using real time data assimilation and inverse modelling. The forecasting algorithm combines Rothermel's rate of spread theory with a perimeter expansion model based on Huygens' principle and solves the optimisation problem with a tangent linear approach and forward automatic differentiation. Its potential is investigated using synthetic data and evaluated in different wildfire scenarios. The results show the high capacity of the method to quickly predict the location of the fire front with a positive lead time (ahead of the event). This work opens the door to further advances in the framework and to more sophisticated models while keeping the computational time suitable for operational use.

  7. Restless legs syndrome model Drosophila melanogaster show successful olfactory learning and 1-day retention of the acquired memory

    Directory of Open Access Journals (Sweden)

    Mika F. Asaba

    2013-09-01

    Full Text Available Restless Legs Syndrome (RLS) is a prevalent but poorly understood disorder that is characterized by uncontrollable movements during sleep, resulting in sleep disturbance. Olfactory memory in Drosophila melanogaster has proven to be a useful tool for the study of cognitive deficits caused by sleep disturbances, such as those seen in RLS. A recently generated Drosophila model of RLS exhibited disturbed sleep patterns similar to those seen in humans with RLS. This research seeks to improve understanding of the relationship between cognitive functioning and sleep disturbances in a new model for RLS. Here, we tested learning and memory in wild type and dBTBD9 mutant flies by Pavlovian olfactory conditioning, during which a shock was paired with one of two odors. Flies were then placed in a T-maze with one odor on either side, and successful associative learning was recorded when the flies chose the side with the unpaired odor. We hypothesized that due to disrupted sleep patterns, dBTBD9 mutant flies would be unable to learn the shock-odor association. However, the current study reports that the recently generated Drosophila model of RLS shows successful olfactory learning, despite disturbed sleep patterns, with learning performance levels matching or better than wild type flies.

  8. Infiltration under snow cover: Modeling approaches and predictive uncertainty

    Science.gov (United States)

    Meeks, Jessica; Moeck, Christian; Brunner, Philip; Hunkeler, Daniel

    2017-03-01

    Groundwater recharge from snowmelt represents a temporal redistribution of precipitation. This is extremely important because the rate and timing of snowpack drainage have substantial consequences for aquifer recharge patterns, which in turn affect groundwater availability throughout the rest of the year. The modeling methods developed to estimate drainage from a snowpack, which typically rely on temporally dense point measurements or temporally limited, spatially dispersed calibration data, range in complexity from the simple degree-day method to more complex and physically based energy balance approaches. While this gamut of snowmelt models is routinely used to aid in water resource management, a comparison of snowmelt models' predictive uncertainties had previously not been done. Therefore, we established a snowmelt model calibration dataset that is both temporally dense and represents the integrated snowmelt infiltration signal for the Vers Chez le Brandt research catchment, which functions as a rather unique natural lysimeter. We then evaluated the uncertainty associated with the degree-day, a modified degree-day and energy balance snowmelt model predictions using the null-space Monte Carlo approach. All three melt models underestimate total snowpack drainage, underestimate the rate of early and midwinter drainage and overestimate spring snowmelt rates. The actual rate of snowpack water loss is more constant over the course of the entire winter season than the snowmelt models would imply, indicating that mid-winter melt can contribute as significantly as springtime snowmelt to groundwater recharge in low alpine settings. Further, actual groundwater recharge could be between 2 and 31% greater than snowmelt models suggest over the total winter season. This study shows that snowmelt model predictions can have considerable uncertainty, which may be reduced by the inclusion of more data that allows for the use of more complex approaches such as the energy balance approach.
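
    For reference, the degree-day (temperature-index) method mentioned above reduces to a one-line melt law; the sketch below uses an illustrative degree-day factor and melt threshold, which in practice are site-specific calibration parameters.

```python
import numpy as np

def degree_day_melt(daily_mean_temp_c, ddf=3.0, t_threshold=0.0):
    """Daily melt [mm w.e./day] from mean air temperature [deg C].

    ddf (degree-day factor) and t_threshold are illustrative values; calibrated
    values vary by site and season.
    """
    temps = np.asarray(daily_mean_temp_c, dtype=float)
    return ddf * np.clip(temps - t_threshold, 0.0, None)

temps = [-4.2, -1.0, 0.5, 2.3, 5.1]     # hypothetical daily mean temperatures
print(degree_day_melt(temps))           # -> melt of 0, 0, 1.5, 6.9, 15.3 mm w.e./day
```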

  9. Comparing large-scale computational approaches to epidemic modeling: Agent-based versus structured metapopulation models

    Directory of Open Access Journals (Sweden)

    Merler Stefano

    2010-06-01

    Full Text Available Abstract Background In recent years large-scale computational models for the realistic simulation of epidemic outbreaks have been used with increased frequency. Methodologies adapt to the scale of interest and range from very detailed agent-based models to spatially-structured metapopulation models. One major issue thus concerns to what extent the geotemporal spreading pattern found by different modeling approaches may differ and depend on the different approximations and assumptions used. Methods We provide for the first time a side-by-side comparison of the results obtained with a stochastic agent-based model and a structured metapopulation stochastic model for the progression of a baseline pandemic event in Italy, a large and geographically heterogeneous European country. The agent-based model is based on the explicit representation of the Italian population through highly detailed data on the socio-demographic structure. The metapopulation simulations use the GLobal Epidemic and Mobility (GLEaM) model, based on high-resolution census data worldwide, and integrating airline travel flow data with short-range human mobility patterns at the global scale. The model also considers age structure data for Italy. GLEaM and the agent-based models are synchronized in their initial conditions by using the same disease parameterization, and by defining the same importation of infected cases from international travels. Results The results obtained show that both models provide epidemic patterns that are in very good agreement at the granularity levels accessible by both approaches, with differences in peak timing on the order of a few days. The relative difference of the epidemic size depends on the basic reproductive ratio, R0, and on the fact that the metapopulation model consistently yields a larger incidence than the agent-based model, as expected due to the differences in the structure in the intra-population contact pattern of the approaches. The age
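
    As a minimal illustration of the stochastic compartmental dynamics that both model classes coarse-grain in different ways, the chain-binomial SIR sketch below uses an invented population size, seeding and rates; it is not the GLEaM or agent-based configuration used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_sir(n=100_000, i0=10, beta=0.3, gamma=0.1, days=300):
    """Chain-binomial SIR: daily binomial draws for new infections/recoveries."""
    s, i, r = n - i0, i0, 0
    history = []
    for _ in range(days):
        p_inf = 1.0 - np.exp(-beta * i / n)        # per-susceptible infection prob.
        new_inf = rng.binomial(s, p_inf)
        new_rec = rng.binomial(i, 1.0 - np.exp(-gamma))
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        history.append((s, i, r))
    return history

final_size = stochastic_sir()[-1][2]
print(f"attack rate ≈ {final_size / 100_000:.2f}")  # depends on R0 = beta / gamma
```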

  10. A DYNAMICAL SYSTEM APPROACH IN MODELING TECHNOLOGY TRANSFER

    Directory of Open Access Journals (Sweden)

    Hennie Husniah

    2016-05-01

    Full Text Available In this paper we discuss a mathematical model of two-party technology transfer from a leader to a follower. The model is reconstructed via a dynamical system approach from the known standard Raz and Assa model, and we find some important conclusions that were not discussed in the original model. The model assumes that in the absence of technology transfer from the leader to the follower, both the leader and the follower have the capability to grow independently, each with a known upper limit of development. We obtain a rich mathematical structure for the steady state solution of the model. We discuss a special situation in which the upper limit of the technological development of the follower is higher than that of the leader, but the leader has started implementing the technology earlier than the follower. In this case we show that a paradox can appear whenever the transfer rate is sufficiently high: the follower is unable to reach its original upper limit of technological development. We propose a new model to increase realism, so that any technology transfer rate can only have a positive effect in accelerating the follower's growth towards its original upper limit of development.
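
    A generic sketch of this kind of leader-follower system is given below: logistic growth for each party plus a transfer term proportional to the gap between them. This is a hedged reconstruction for illustration only, not the exact Raz-Assa equations; the growth rates, upper limits and transfer rate are invented.

```python
import numpy as np
from scipy.integrate import odeint

def rhs(x, t, r_l=0.5, r_f=0.4, K_l=1.0, K_f=1.5, tau=2.0):
    """Logistic growth for leader and follower plus gap-proportional transfer.

    r_l, r_f : intrinsic growth rates (illustrative)
    K_l, K_f : upper limits of development, with K_l < K_f as in the abstract
    tau      : technology transfer rate (illustrative)
    """
    leader, follower = x
    d_leader = r_l * leader * (1.0 - leader / K_l)
    d_follower = r_f * follower * (1.0 - follower / K_f) + tau * (leader - follower)
    return [d_leader, d_follower]

t = np.linspace(0.0, 80.0, 800)
traj = odeint(rhs, [0.2, 0.01], t)            # the leader has a head start
print("long-run levels (leader, follower):", np.round(traj[-1], 3))
# With a large tau the follower is dragged towards the leader's lower limit K_l,
# mimicking the paradox in which it never reaches its own upper limit K_f.
```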

  11. Exploring a type-theoretic approach to accessibility constraint modelling

    CERN Document Server

    Pogodalla, Sylvain

    2008-01-01

    The type-theoretic modelling of DRT that [degroote06] proposed features continuations for the management of the context in which a clause has to be interpreted. This approach, while keeping the standard definitions of quantifier scope, translates the rules of the accessibility constraints of discourse referents inside the semantic recipes. In this paper, we deal with additional rules for these accessibility constraints, in particular for discourse referents introduced by proper nouns, which negation does not block, and for rhetorical relations that structure discourse. We show how this continuation-based approach applies to those accessibility constraints and how we can consider the parallel management of various principles.

  12. Uncertainty in biology a computational modeling approach

    CERN Document Server

    Gomez-Cabrero, David

    2016-01-01

    Computational modeling of biomedical processes is gaining more and more weight in current research into the etiology of biomedical problems and potential treatment strategies. Computational modeling makes it possible to reduce, refine and replace animal experimentation, as well as to translate findings obtained in these experiments to the human background. However, these biomedical problems are inherently complex, with a myriad of influencing factors, which strongly complicates the model building and validation process. This book addresses four main issues related to the building and validation of computational models of biomedical processes: model establishment under uncertainty; model selection and parameter fitting; sensitivity analysis and model adaptation; and model predictions under uncertainty. In each of the abovementioned areas, the book discusses a number of key techniques by means of a general theoretical description followed by one or more practical examples. This book is intended for graduate stude...

  13. Spintronic device modeling and evaluation using modular approach to spintronics

    Science.gov (United States)

    Ganguly, Samiran

    Spintronics technology finds itself in an exciting stage today. Riding on the backs of rapid growth and impressive advances in materials and phenomena, it has started to make headway in the memory industry as solid state magnetic memory (STT-MRAM) and is considered a possible candidate to replace CMOS when its scaling reaches physical limits. It is necessary to bring all these advances together in a coherent fashion to explore and evaluate the potential of spintronic devices. This work creates a framework for this exploration and evaluation based on the Modular Approach to Spintronics, which encapsulates the physics of transport of charge and spin through materials, and the phenomenology of magnetic dynamics and interactions, in benchmarked elemental modules. These modules can then be combined to form spin-circuit models of complex spintronic devices and structures which can be simulated using SPICE-like circuit simulators. In this work we demonstrate how the Modular Approach to Spintronics can be used to build spin-circuit models of functional spintronic devices of all types: memory, logic, and oscillators. We then show how the Modular Approach to Spintronics can help identify critical factors behind static and dynamic dissipation in spintronic devices and provide remedies by exploring the use of various alternative materials and phenomena. Lastly, we show the use of the Modular Approach to Spintronics in exploring new paradigms of computing enabled by the inherent physics of spintronic devices. We hope that this work will encourage more research and experiments that will establish spintronics as a viable technology for continued advancement of electronics.

  14. ALREST High Fidelity Modeling Program Approach

    Science.gov (United States)

    2011-05-18

    (Briefing-chart fragments only.) Recoverable topics: gases and mixtures of Redlich-Kwong and Peng-Robinson fluids; an assumed-PDF model based on the k-ε-g model in the NASA/LaRc Vulcan code; a level-set model; the potential attractiveness of liquid hydrocarbon engines for boost applications; the propensity of hydrocarbon engines for combustion instability; air...

  15. Novel AAV-based rat model of forebrain synucleinopathy shows extensive pathologies and progressive loss of cholinergic interneurons.

    Directory of Open Access Journals (Sweden)

    Patrick Aldrin-Kirk

    Full Text Available Synucleinopathies, characterized by intracellular aggregation of α-synuclein protein, share a number of features in pathology and disease progression. However, the vulnerable cell population differs significantly between the disorders, despite being caused by the same protein. While the vulnerability of dopamine cells in the substantia nigra to α-synuclein over-expression, and its link to Parkinson's disease, is well studied, animal models recapitulating the cortical degeneration in dementia with Lewy-bodies (DLB) are much less mature. The aim of this study was to develop a first rat model of widespread progressive synucleinopathy throughout the forebrain using adeno-associated viral (AAV) vector mediated gene delivery. Through bilateral injection of an AAV6 vector expressing human wild-type α-synuclein into the forebrain of neonatal rats, we were able to achieve widespread, robust α-synuclein expression with preferential expression in the frontal cortex. These animals displayed a progressive emergence of hyper-locomotion and dysregulated response to the dopaminergic agonist apomorphine. The animals receiving the α-synuclein vector displayed significant α-synuclein pathology including intra-cellular inclusion bodies, axonal pathology and elevated levels of phosphorylated α-synuclein, accompanied by significant loss of cortical neurons and a progressive reduction in both cortical and striatal ChAT positive interneurons. Furthermore, we found evidence of α-synuclein sequestered by IBA-1 positive microglia, which was coupled with a distinct change in morphology. In areas of most prominent pathology, the total α-synuclein levels were increased to, on average, two-fold, which is similar to the levels observed in patients with SNCA gene triplication, associated with cortical Lewy body pathology. This study provides a novel rat model of progressive cortical synucleinopathy, showing for the first time that cholinergic interneurons are vulnerable

  16. Pomalidomide shows significant therapeutic activity against CNS lymphoma with a major impact on the tumor microenvironment in murine models.

    Science.gov (United States)

    Li, Zhimin; Qiu, Yushi; Personett, David; Huang, Peng; Edenfield, Brandy; Katz, Jason; Babusis, Darius; Tang, Yang; Shirely, Michael A; Moghaddam, Mehran F; Copland, John A; Tun, Han W

    2013-01-01

    Primary CNS lymphoma carries a poor prognosis. Novel therapeutic agents are urgently needed. Pomalidomide (POM) is a novel immunomodulatory drug with anti-lymphoma activity. CNS pharmacokinetic analysis was performed in rats to assess the CNS penetration of POM. Preclinical evaluation of POM was performed in two murine models to assess its therapeutic activity against CNS lymphoma. The impact of POM on the CNS lymphoma immune microenvironment was evaluated by immunohistochemistry and immunofluorescence. In vitro cell culture experiments were carried out to further investigate the impact of POM on the biology of macrophages. POM crosses the blood brain barrier with CNS penetration of ~ 39%. Preclinical evaluations showed that it had significant therapeutic activity against CNS lymphoma with significant reduction in tumor growth rate and prolongation of survival, that it had a major impact on the tumor microenvironment with an increase in macrophages and natural killer cells, and that it decreased M2-polarized tumor-associated macrophages and increased M1-polarized macrophages when macrophages were evaluated based on polarization status. In vitro studies using various macrophage models showed that POM converted the polarization status of IL4-stimulated macrophages from M2 to M1, that M2 to M1 conversion by POM in the polarization status of lymphoma-associated macrophages is dependent on the presence of NK cells, that POM induced M2 to M1 conversion in the polarization of macrophages by inactivating STAT6 signaling and activating STAT1 signaling, and that POM functionally increased the phagocytic activity of macrophages. Based on our findings, POM is a promising therapeutic agent for CNS lymphoma with excellent CNS penetration, significant preclinical therapeutic activity, and a major impact on the tumor microenvironment. It can induce significant biological changes in tumor-associated macrophages, which likely play a major role in its therapeutic activity against CNS

  17. In vitro and in vivo models of cerebral ischemia show discrepancy in therapeutic effects of M2 macrophages.

    Directory of Open Access Journals (Sweden)

    Virginie Desestret

    Full Text Available The inflammatory response following ischemic stroke is dominated by innate immune cells: resident microglia and blood-derived macrophages. The ambivalent role of these cells in stroke outcome might be explained in part by the acquisition of distinct functional phenotypes: classically (M1) and alternatively activated (M2) macrophages. To shed light on the crosstalk between hypoxic neurons and macrophages, an in vitro model was set up in which bone marrow-derived macrophages were co-cultured with hippocampal slices subjected to oxygen and glucose deprivation. The results showed that macrophages provided potent protection against neuron cell loss through a paracrine mechanism, and that they expressed M2-type alternative polarization. These findings raised the possibility of using bone marrow-derived M2 macrophages in cellular therapy for stroke. Therefore, 2 million M2 macrophages (or vehicle) were intravenously administered during the subacute stage of ischemia (D4) in a model of transient middle cerebral artery occlusion. Functional neuroscores and magnetic resonance imaging endpoints (infarct volumes, blood-brain barrier integrity, phagocytic activity assessed by iron oxide uptake) were longitudinally monitored for 2 weeks. This cell-based treatment did not significantly improve any outcome measure compared with vehicle, suggesting that this strategy is not relevant to stroke therapy.

  18. LEXICAL APPROACH IN TEACHING TURKISH: A COLLOCATIONAL STUDY MODEL

    National Research Council Canada - National Science Library

    Eser ÖRDEM

    2013-01-01

    Abstract This study intends to propose the Lexical Approach (Lewis, 1998, 2002; Harwood, 2002) and a model for teaching Turkish as a foreign language so that this model can be used in classroom settings...

  19. A model-based multisensor data fusion knowledge management approach

    Science.gov (United States)

    Straub, Jeremy

    2014-06-01

    A variety of approaches exist for combining data from multiple sensors. The model-based approach combines data based on its support for or refutation of elements of a model, which in turn can be used to evaluate an experimental thesis. This paper presents a collection of algorithms for mapping various types of sensor data onto a thesis-based model and for evaluating the truth or falsity of the thesis based on the model. The use of this approach for autonomously arriving at findings and for prioritizing data is considered. Techniques for updating the model (instead of arriving at a true/false assertion) are also discussed.
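
    The toy sketch below illustrates the thesis-based fusion idea: readings are mapped onto model elements and contribute weighted support or refutation, and the thesis is evaluated from the net support. The sensors, element names, weights and decision rule are invented for illustration and are not the paper's algorithms.

```python
from collections import defaultdict

# Each reading: (sensor, model element, +1 support / -1 refutation, reliability).
# All entries below are invented for illustration only.
readings = [
    ("thermal", "surface_heating",  +1, 0.9),
    ("optical", "surface_heating",  +1, 0.7),
    ("radar",   "subsurface_void",  -1, 0.6),
]

def evaluate_thesis(readings, required_elements):
    support = defaultdict(float)
    for _sensor, element, sign, reliability in readings:
        support[element] += sign * reliability
    # The thesis is judged true only if every required element has net support.
    return all(support[e] > 0.0 for e in required_elements), dict(support)

print(evaluate_thesis(readings, ["surface_heating", "subsurface_void"]))
# -> (False, {'surface_heating': ~1.6, 'subsurface_void': -0.6})
```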

  20. Comparison of two novel approaches to model fibre reinforced concrete

    NARCIS (Netherlands)

    Radtke, F.K.F.; Simone, A.; Sluys, L.J.

    2009-01-01

    We present two approaches to model fibre reinforced concrete. In both approaches, discrete fibre distributions and the behaviour of the fibre-matrix interface are explicitly considered. One approach employs the reaction forces from fibre to matrix while the other is based on the partition of unity f

  1. Modelling the World Wool Market: A Hybrid Approach

    OpenAIRE

    2007-01-01

    We present a model of the world wool market that merges two modelling traditions: the partial-equilibrium commodity-specific approach and the computable general-equilibrium approach. The model captures the multistage nature of the wool production system, and the heterogeneous nature of raw wool, processed wool and wool garments. It also captures the important wool producing and consuming regions of the world. We illustrate the utility of the model by estimating the effects of tariff barriers o...

  2. Effective Model Approach to the Dense State of QCD Matter

    CERN Document Server

    Fukushima, Kenji

    2010-01-01

    The first-principle approach to the dense state of QCD matter, i.e. the lattice-QCD simulation at finite baryon density, is not under theoretical control for the moment. The effective model study based on QCD symmetries is a practical alternative. However the model parameters that are fixed by hadronic properties in the vacuum may have unknown dependence on the baryon chemical potential. We propose a new prescription to constrain the effective model parameters by the matching condition with the thermal Statistical Model. In the transitional region where thermal quantities blow up in the Statistical Model, deconfined quarks and gluons should smoothly take over the relevant degrees of freedom from hadrons and resonances. We use the Polyakov-loop coupled Nambu--Jona-Lasinio (PNJL) model as an effective description in the quark side and show how the matching condition is satisfied by a simple ansatz on the Polyakov loop potential. Our results favor a phase diagram with the chiral phase transition located at sligh...

  3. Numerical modelling approach for mine backfill

    Indian Academy of Sciences (India)

    MUHAMMAD ZAKA EMAD

    2017-09-01

    Numerical modelling is broadly used for assessing complex scenarios in underground mines, including mining sequence and blast-induced vibrations from production blasting. Sublevel stoping mining methods with delayed backfill are extensively used to exploit steeply dipping ore bodies in Canadian hard-rock metal mines. Mine backfill is an important constituent of the mining process. Numerical modelling of mine backfill material needs special attention, as the numerical model must behave realistically and in accordance with the site conditions. This paper discusses a numerical modelling strategy for modelling mine backfill material. The modelling strategy is studied using a case study mine from the Canadian mining industry. In the end, results of a numerical model parametric study are shown and discussed.

  4. Regularization of turbulence - a comprehensive modeling approach

    NARCIS (Netherlands)

    Geurts, Bernard J.

    2011-01-01

    Turbulence readily arises in numerous flows in nature and technology. The large number of degrees of freedom of turbulence poses serious challenges to numerical approaches aimed at simulating and controlling such flows. While the Navier-Stokes equations are commonly accepted to precisely describe fl

  5. Measuring equilibrium models: a multivariate approach

    Directory of Open Access Journals (Sweden)

    Nadji RAHMANIA

    2011-04-01

    Full Text Available This paper presents a multivariate methodology for obtaining measures of unobserved macroeconomic variables. The procedure used is the multivariate Hodrick-Prescott filter, which depends on smoothing parameters. The choice of these parameters is crucial. Our approach is based on consistent estimators of these parameters, depending only on the observed data.
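
    As a pointer to the underlying machinery, the sketch below implements the univariate Hodrick-Prescott filter as a penalised least-squares solve; the multivariate version discussed in the paper generalises this idea, and the smoothing parameter value used here is only the conventional quarterly-data example.

```python
import numpy as np

def hp_filter(y, lam=1600.0):
    """Univariate Hodrick-Prescott filter.

    The trend tau solves (I + lam * D'D) tau = y, where D is the
    second-difference operator; lam = 1600 is the usual quarterly-data choice.
    """
    y = np.asarray(y, dtype=float)
    n = len(y)
    D = np.zeros((n - 2, n))
    for i in range(n - 2):
        D[i, i:i + 3] = [1.0, -2.0, 1.0]
    trend = np.linalg.solve(np.eye(n) + lam * D.T @ D, y)
    return trend, y - trend                     # trend and cyclical component

y = np.cumsum(np.random.default_rng(1).normal(0.2, 1.0, 120))  # synthetic series
trend, cycle = hp_filter(y)
print(trend[:3], cycle[:3])
```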

  6. A graphical approach to analogue behavioural modelling

    OpenAIRE

    Moser, Vincent; Nussbaum, Pascal; Amann, Hans-Peter; Astier, Luc; Pellandini, Fausto

    2007-01-01

    In order to master the growing complexity of analogue electronic systems, modelling and simulation of analogue hardware at various levels is absolutely necessary. This paper presents an original modelling method based on the graphical description of analogue electronic functional blocks. This method is intended to be automated and integrated into a design framework: specialists create behavioural models of existing functional blocks, that can then be used through high-level selection and spec...

  7. Consumer preference models: fuzzy theory approach

    Science.gov (United States)

    Turksen, I. B.; Wilson, I. A.

    1993-12-01

    Consumer preference models are widely used in new product design, marketing management, pricing and market segmentation. The purpose of this article is to develop and test a fuzzy set preference model which can represent linguistic variables in individual-level models implemented in parallel with existing conjoint models. The potential improvements in market share prediction and predictive validity can substantially improve management decisions about what to make (product design), for whom to make it (market segmentation) and how much to make (market share prediction).

  8. Bayesian network approach for modeling local failure in lung cancer

    Science.gov (United States)

    Oh, Jung Hun; Craft, Jeffrey; Al-Lozi, Rawan; Vaidya, Manushka; Meng, Yifan; Deasy, Joseph O; Bradley, Jeffrey D; Naqa, Issam El

    2011-01-01

    Locally advanced non-small cell lung cancer (NSCLC) patients suffer from a high local failure rate following radiotherapy. Despite many efforts to develop new dose-volume models for early detection of tumor local failure, there was no reported significant improvement in their application prospectively. Based on recent studies of biomarker proteins' role in hypoxia and inflammation in predicting tumor response to radiotherapy, we hypothesize that combining physical and biological factors with a suitable framework could improve the overall prediction. To test this hypothesis, we propose a graphical Bayesian network framework for predicting local failure in lung cancer. The proposed approach was tested using two different datasets of locally advanced NSCLC patients treated with radiotherapy. The first dataset was collected retrospectively and is comprised of clinical and dosimetric variables only. The second dataset was collected prospectively, in which, in addition to clinical and dosimetric information, blood was drawn from the patients at various time points to extract candidate biomarkers as well. Our preliminary results show that the proposed method can be used as an efficient method to develop predictive models of local failure in these patients and to interpret relationships among the different variables in the models. We also demonstrate the potential use of heterogeneous physical and biological variables to improve the model prediction. With the first dataset, we achieved better performance compared with competing Bayesian-based classifiers. With the second dataset, the combined model had a slightly higher performance compared to individual physical and biological models, with the biological variables making the largest contribution. Our preliminary results highlight the potential of the proposed integrated approach for predicting post-radiotherapy local failure in NSCLC patients. PMID:21335651
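
    The toy sketch below conveys the flavour of a graphical Bayesian model that combines a dosimetric variable with a biomarker; its structure, states and probabilities are invented for illustration and are unrelated to the networks learned from the NSCLC datasets.

```python
# Two independent binary parents (a dosimetric metric and a biomarker) with a
# single child node, local failure. All probabilities are invented.
P_DOSE_HIGH = 0.4
P_MARKER_HIGH = 0.3
P_FAIL = {  # P(local failure | dose_high, marker_high)
    (True, True): 0.55, (True, False): 0.35,
    (False, True): 0.30, (False, False): 0.10,
}

def p_failure(dose_high=None, marker_high=None):
    """P(failure), marginalising over any parent that is not observed."""
    dose_states = [dose_high] if dose_high is not None else [True, False]
    marker_states = [marker_high] if marker_high is not None else [True, False]
    total = 0.0
    for d in dose_states:
        w_d = 1.0 if dose_high is not None else (P_DOSE_HIGH if d else 1 - P_DOSE_HIGH)
        for m in marker_states:
            w_m = 1.0 if marker_high is not None else (P_MARKER_HIGH if m else 1 - P_MARKER_HIGH)
            total += w_d * w_m * P_FAIL[(d, m)]
    return total

print(p_failure())                 # prior risk, ~0.26 with these numbers
print(p_failure(dose_high=True))   # risk given only the dosimetric variable, ~0.41
```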

  9. A model-based approach to selection of tag SNPs

    Directory of Open Access Journals (Sweden)

    Sun Fengzhu

    2006-06-01

    Full Text Available Abstract Background Single Nucleotide Polymorphisms (SNPs are the most common type of polymorphisms found in the human genome. Effective genetic association studies require the identification of sets of tag SNPs that capture as much haplotype information as possible. Tag SNP selection is analogous to the problem of data compression in information theory. According to Shannon's framework, the optimal tag set maximizes the entropy of the tag SNPs subject to constraints on the number of SNPs. This approach requires an appropriate probabilistic model. Compared to simple measures of Linkage Disequilibrium (LD, a good model of haplotype sequences can more accurately account for LD structure. It also provides a machinery for the prediction of tagged SNPs and thereby to assess the performances of tag sets through their ability to predict larger SNP sets. Results Here, we compute the description code-lengths of SNP data for an array of models and we develop tag SNP selection methods based on these models and the strategy of entropy maximization. Using data sets from the HapMap and ENCODE projects, we show that the hidden Markov model introduced by Li and Stephens outperforms the other models in several aspects: description code-length of SNP data, information content of tag sets, and prediction of tagged SNPs. This is the first use of this model in the context of tag SNP selection. Conclusion Our study provides strong evidence that the tag sets selected by our best method, based on Li and Stephens model, outperform those chosen by several existing methods. The results also suggest that information content evaluated with a good model is more sensitive for assessing the quality of a tagging set than the correct prediction rate of tagged SNPs. Besides, we show that haplotype phase uncertainty has an almost negligible impact on the ability of good tag sets to predict tagged SNPs. This justifies the selection of tag SNPs on the basis of haplotype
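
    The sketch below illustrates the entropy-maximisation idea with a greedy selection over a toy haplotype matrix, using plain empirical joint entropy; the paper's method instead scores tag sets under the Li and Stephens model, so this is a simplified stand-in.

```python
import numpy as np

def joint_entropy(haps, cols):
    """Empirical joint entropy (bits) of the SNP columns in `cols`."""
    _, counts = np.unique(haps[:, cols], axis=0, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

def greedy_tags(haps, k):
    """Greedily add the SNP that most increases the tag set's joint entropy."""
    chosen = []
    for _ in range(k):
        remaining = [c for c in range(haps.shape[1]) if c not in chosen]
        best = max(remaining, key=lambda c: joint_entropy(haps, chosen + [c]))
        chosen.append(best)
    return chosen

# Toy haplotype matrix: rows = haplotypes, columns = biallelic SNPs coded 0/1.
haps = np.array([[0, 0, 1, 1, 0],
                 [0, 1, 1, 0, 0],
                 [1, 1, 0, 0, 1],
                 [1, 0, 0, 1, 1]])
print(greedy_tags(haps, 2))   # indices of two informative tag SNPs on this toy data
```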

  10. A New Approach to Model Verification, Falsification and Selection

    Directory of Open Access Journals (Sweden)

    Andrew J. Buck

    2015-06-01

    Full Text Available This paper shows that a qualitative analysis, i.e., an assessment of the consistency of a hypothesized sign pattern for structural arrays with the sign pattern of the estimated reduced form, can always provide decisive insight into a model’s validity both in general and compared to other models. Qualitative analysis can show that it is impossible for some models to have generated the data used to estimate the reduced form, even though standard specification tests might show the model to be adequate. A partially specified structural hypothesis can be falsified by estimating as few as one reduced form equation. Zero restrictions in the structure can themselves be falsified. It is further shown how the information content of the hypothesized structural sign patterns can be measured using a commonly applied concept of statistical entropy. The lower the hypothesized structural sign pattern’s entropy, the more a priori information it proposes about the sign pattern of the estimated reduced form. As an hypothesized structural sign pattern has a lower entropy, it is more subject to type 1 error and less subject to type 2 error. Three cases illustrate the approach taken here.

  11. Kinetics approach to modeling of polymer additive degradation in lubricants

    Institute of Scientific and Technical Information of China (English)

    llyaI.KUDISH; RubenG.AIRAPETYAN; Michael; J.; COVITCH

    2001-01-01

    A kinetics problem for a degrading polymer additive dissolved in a base stock is studied. The polymer degradation may be caused by the combination of such lubricant flow parameters as pressure, elongational strain rate, and temperature, as well as lubricant viscosity and the polymer characteristics (dissociation energy, bead radius, bond length, etc.). A fundamental approach to the problem of modeling mechanically induced polymer degradation is proposed. The polymer degradation is modeled on the basis of a kinetic equation for the density of the statistical distribution of polymer molecules as a function of their molecular weight. The integrodifferential kinetic equation for polymer degradation is solved numerically. The effects of pressure, elongational strain rate, temperature, and lubricant viscosity on the process of lubricant degradation are considered. The increase of pressure promotes fast degradation while the increase of temperature delays degradation. A comparison of a numerically calculated molecular weight distribution with an experimental one obtained in bench tests showed that they are in excellent agreement with each other.

  12. A New Approach for Magneto-Static Hysteresis Behavioral Modeling

    DEFF Research Database (Denmark)

    Astorino, Antonio; Swaminathan, Madhavan; Antonini, Giulio

    2016-01-01

    In this paper, a new behavioral modeling approach for magneto-static hysteresis is presented. Many accurate models are currently available, but none of them seems to be able to correctly reproduce all the possible B-H paths with low computational cost. By contrast, the approach proposed...... achieved when comparing the measured and simulated results....

  13. The two capacitor problem revisited: simple harmonic oscillator model approach

    CERN Document Server

    Lee, Keeyung

    2012-01-01

    The well-known two-capacitor problem, in which exactly half the stored energy disappears when a charged capacitor is connected to an identical capacitor, is discussed based on the mechanical harmonic oscillator model approach. In the mechanical harmonic oscillator model, it is shown first that exactly half the work done by a constant applied force is dissipated, irrespective of the form of the dissipation mechanism, when the system comes to a new equilibrium after a constant force is abruptly applied. This model is then applied to the energy loss mechanism in the capacitor charging problem or the two-capacitor problem. This approach allows a simple explanation of the energy dissipation mechanism in these problems and shows that the dissipated energy should always be exactly half the supplied energy, whether that is caused by the Joule heat or by the radiation. This paper, which provides a simple treatment of the energy dissipation mechanism in the two-capacitor problem, is suitable for all undergraduate...
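
    A quick numerical check of the "exactly half" result is easy to set up; the sketch below integrates the charge-sharing dynamics for several arbitrary resistances and compares the dissipated energy with the initial stored energy (component values are arbitrary illustrations).

```python
# Explicit-Euler integration of two identical capacitors connected through a
# resistor R; one capacitor starts charged to V0. C, V0 and R are arbitrary.
C, V0 = 1e-6, 10.0
for R in (1.0, 100.0, 1e4):
    q1, q2 = C * V0, 0.0
    dt = R * C / 1e4                     # step small relative to the time constant
    dissipated = 0.0
    for _ in range(300_000):             # ~30 R*C of simulated time
        i = (q1 - q2) / (C * R)          # current through the resistor
        dissipated += i * i * R * dt
        q1 -= i * dt
        q2 += i * dt
    print(f"R = {R:g}: dissipated / initial energy = "
          f"{dissipated / (0.5 * C * V0 ** 2):.3f}")
# Each printed ratio is ~0.500 independent of R, as the oscillator analogy predicts.
```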

  14. Nucleon Spin Content in a Relativistic Quark Potential Model Approach

    Institute of Scientific and Technical Information of China (English)

    DONG YuBing; FENG QingGuo

    2002-01-01

    Based on a relativistic quark model approach with an effective potential U(r) = (ac/2)(1 + γ0)r2, the spin content of the nucleon is investigated. Pseudo-scalar interaction between quarks and Goldstone bosons is employed to calculate the couplings between the Goldstone bosons and the nucleon. Different approaches to deal with the center of mass correction in the relativistic quark potential model approach are discussed.

  15. New nonlinear multivariable model shows the relationship between central corneal thickness and HRTII topographic parameters in glaucoma patients

    Directory of Open Access Journals (Sweden)

    Dimitrios Kourkoutas

    2009-04-01

    Full Text Available Dimitrios Kourkoutas, Gerasimos Georgopoulos, Antonios Maragos, et al. Department of Ophthalmology, Medical School, Athens University, Athens, Greece; Department of Ophthalmology, 417 Hellenic Army Shared Fund Hospital, Athens, Greece. Purpose: In this paper a new nonlinear multivariable regression method is presented in order to investigate the relationship between central corneal thickness (CCT) and Heidelberg Retina Tomograph (HRT II) optic nerve head (ONH) topographic measurements in patients with established glaucoma. Methods: Forty-nine eyes of 49 patients with glaucoma were included in this study. Inclusion criteria were (a) HRT II ONH imaging of good quality (SD < 30 μm), (b) reliable Humphrey visual field tests (30-2 program), and (c) bilateral CCT measurements with ultrasonic contact pachymetry. Patients were classified as glaucomatous based on visual field and/or ONH damage. The relationship between CCT and topographic parameters was analyzed using the new nonlinear multivariable regression model. Results: In the entire group, CCT was 549.78 ± 33.08 μm (range: 484–636 μm); intraocular pressure (IOP) was 16.4 ± 2.67 mmHg (range: 11–23 mmHg); MD was −3.80 ± 4.97 dB (range: 4.04 to −20.4 dB); refraction was −0.78 ± 2.46 D (range: −6.0 D to +3.0 D). The new nonlinear multivariable regression model indicated that CCT was significantly related (R² = 0.227, p < 0.01) to rim volume nasally and type of diagnosis. Conclusions: Using the new nonlinear multivariable regression model, our data showed a statistically significant correlation between CCT and HRT II ONH structural measurements in patients with established glaucoma. Keywords: central corneal thickness, glaucoma, optic nerve head, HRT

  16. A simple approach to modeling ductile failure.

    Energy Technology Data Exchange (ETDEWEB)

    Wellman, Gerald William

    2012-06-01

    Sandia National Laboratories has the need to predict the behavior of structures after the occurrence of an initial failure. In some cases determining the extent of failure, beyond initiation, is required, while in a few cases the initial failure is a design feature used to tailor the subsequent load paths. In either case, the ability to numerically simulate the initiation and propagation of failures is a highly desired capability. This document describes one approach to the simulation of failure initiation and propagation.

  17. An approach for activity-based DEVS model specification

    DEFF Research Database (Denmark)

    Alshareef, Abdurrahman; Sarjoughian, Hessam S.; Zarrin, Bahram

    2016-01-01

    activity-based behavior modeling of parallel DEVS atomic models. We consider UML activities and actions as fundamental units of behavior modeling, especially in the presence of recent advances in the UML 2.5 specifications. We describe in detail how to approach activity modeling with a set of elemental...

  18. Advanced language modeling approaches, case study: Expert search

    NARCIS (Netherlands)

    Hiemstra, Djoerd

    2008-01-01

    This tutorial gives a clear and detailed overview of advanced language modeling approaches and tools, including the use of document priors, translation models, relevance models, parsimonious models and expectation maximization training. Expert search will be used as a case study to explain the

  19. Correlations in a generalized elastic model: fractional Langevin equation approach.

    Science.gov (United States)

    Taloni, Alessandro; Chechkin, Aleksei; Klafter, Joseph

    2010-12-01

    The generalized elastic model (GEM) provides the evolution equation which governs the stochastic motion of several many-body systems in nature, such as polymers, membranes, and growing interfaces. On the other hand, a probe (tracer) particle in these systems performs a fractional Brownian motion due to the spatial interactions with the other system's components. The tracer's anomalous dynamics can be described by a fractional Langevin equation (FLE) with a space-time correlated noise. We demonstrate that the description given in terms of the GEM coincides with that furnished by the relative FLE, by showing that the correlation functions of the stochastic field obtained within the FLE framework agree with the corresponding quantities calculated from the GEM. Furthermore we show that the Fox H-function formalism appears to be very convenient to describe the correlation properties within the FLE approach.

  20. Challenges and opportunities for integrating lake ecosystem modelling approaches

    Science.gov (United States)

    Mooij, Wolf M.; Trolle, Dennis; Jeppesen, Erik; Arhonditsis, George; Belolipetsky, Pavel V.; Chitamwebwa, Deonatus B.R.; Degermendzhy, Andrey G.; DeAngelis, Donald L.; Domis, Lisette N. De Senerpont; Downing, Andrea S.; Elliott, J. Alex; Ruberto, Carlos Ruberto; Gaedke, Ursula; Genova, Svetlana N.; Gulati, Ramesh D.; Hakanson, Lars; Hamilton, David P.; Hipsey, Matthew R.; Hoen, Jochem 't; Hulsmann, Stephan; Los, F. Hans; Makler-Pick, Vardit; Petzoldt, Thomas; Prokopkin, Igor G.; Rinke, Karsten; Schep, Sebastiaan A.; Tominaga, Koji; Van Dam, Anne A.; Van Nes, Egbert H.; Wells, Scott A.; Janse, Jan H.

    2010-01-01

    A large number and wide variety of lake ecosystem models have been developed and published during the past four decades. We identify two challenges for making further progress in this field. One such challenge is to avoid developing more models largely following the concept of others ('reinventing the wheel'). The other challenge is to avoid focusing on only one type of model, while ignoring new and diverse approaches that have become available ('having tunnel vision'). In this paper, we aim at improving the awareness of existing models and knowledge of concurrent approaches in lake ecosystem modelling, without covering all possible model tools and avenues. First, we present a broad variety of modelling approaches. To illustrate these approaches, we give brief descriptions of rather arbitrarily selected sets of specific models. We deal with static models (steady state and regression models), complex dynamic models (CAEDYM, CE-QUAL-W2, Delft 3D-ECO, LakeMab, LakeWeb, MyLake, PCLake, PROTECH, SALMO), structurally dynamic models and minimal dynamic models. We also discuss a group of approaches that could all be classified as individual based: super-individual models (Piscator, Charisma), physiologically structured models, stage-structured models and trait-based models. We briefly mention genetic algorithms, neural networks, Kalman filters and fuzzy logic. Thereafter, we zoom in, as an in-depth example, on the multi-decadal development and application of the lake ecosystem model PCLake and related models (PCLake Metamodel, Lake Shira Model, IPH-TRIM3D-PCLake). In the discussion, we argue that while the historical development of each approach and model is understandable given its 'leading principle', there are many opportunities for combining approaches. We take the point of view that a single 'right' approach does not exist and should not be strived for. Instead, multiple modelling approaches, applied concurrently to a given problem, can help develop an integrative

  1. Random matrix model approach to chiral symmetry

    CERN Document Server

    Verbaarschot, J J M

    1996-01-01

    We review the application of random matrix theory (RMT) to chiral symmetry in QCD. Starting from the general philosophy of RMT we introduce a chiral random matrix model with the global symmetries of QCD. Exact results are obtained for universal properties of the Dirac spectrum: i) finite volume corrections to the valence quark mass dependence of the chiral condensate, and ii) microscopic fluctuations of Dirac spectra. Comparisons with lattice QCD simulations are made. Most notably, the variance of the number of levels in an interval containing n levels on average is suppressed by a factor (log n)/(π²n). An extension of the random matrix model to nonzero temperatures and chemical potential provides us with a schematic model of the chiral phase transition. In particular, this elucidates the nature of the quenched approximation at nonzero chemical potential.

  2. Machine Learning Approaches for Modeling Spammer Behavior

    CERN Document Server

    Islam, Md Saiful; Islam, Md Rafiqul

    2010-01-01

    Spam is commonly known as unsolicited or unwanted email messages on the Internet posing a potential threat to Internet security. Users spend a valuable amount of time deleting spam emails. More importantly, ever increasing spam emails occupy server storage space and consume network bandwidth. Keyword-based spam email filtering strategies will eventually be less successful at modeling spammer behavior, as spammers constantly change their tricks to circumvent these filters. The evasive tactics that spammers use are patterns, and these patterns can be modeled to combat spam. This paper investigates the possibilities of modeling spammer behavioral patterns with well-known classification algorithms such as the Naïve Bayesian classifier (Naïve Bayes), Decision Tree Induction (DTI) and Support Vector Machines (SVMs). Preliminary experimental results demonstrate a promising detection rate of around 92%, which is a considerable enhancement of performance compared to similar spammer behavior modeling research.
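
    A minimal Naive Bayes sketch in the spirit of the classifiers compared in the paper is shown below; the toy messages and labels are invented, and a real experiment would use a labelled email corpus with proper train/test evaluation.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Toy spam/ham corpus, invented for illustration only.
train_msgs = ["win a free prize now", "cheap meds online offer",
              "meeting agenda attached", "lunch tomorrow with the team"]
train_labels = ["spam", "spam", "ham", "ham"]

model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(train_msgs, train_labels)

print(model.predict(["free offer win now", "please see the attached agenda"]))
# expected ['spam' 'ham'] on this toy data
```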

  3. Infectious disease modeling a hybrid system approach

    CERN Document Server

    Liu, Xinzhi

    2017-01-01

    This volume presents infectious diseases modeled mathematically, taking seasonality and changes in population behavior into account, using a switched and hybrid systems framework. The scope of coverage includes background on mathematical epidemiology, including classical formulations and results; a motivation for seasonal effects and changes in population behavior, an investigation into term-time forced epidemic models with switching parameters, and a detailed account of several different control strategies. The main goal is to study these models theoretically and to establish conditions under which eradication or persistence of the disease is guaranteed. In doing so, the long-term behavior of the models is determined through mathematical techniques from switched systems theory. Numerical simulations are also given to augment and illustrate the theoretical results and to help study the efficacy of the control schemes.

  4. MODEL-BASED PERFORMANCE EVALUATION APPROACH FOR MOBILE AGENT SYSTEMS

    Institute of Scientific and Technical Information of China (English)

    Li Xin; Mi Zhengkun; Meng Xudong

    2004-01-01

    Claimed as the next-generation programming paradigm, mobile agent technology has attracted extensive interest in recent years. However, up to now, limited research effort has been devoted to the performance study of mobile agent systems, and most of this research focuses on agent behavior analysis, with the result that the models are hard to apply to mobile agent systems. To bridge the gap, a new performance evaluation model derived from the operation mechanisms of mobile agent platforms is proposed. Details are discussed for the design of companion simulation software, which can provide system performance measures such as the response time of the platform to a mobile agent. Further investigation follows on the determination of model parameters. Finally, a comparison is made between the model-based simulation results and the measurement-based real performance of mobile agent systems. The results show that the proposed model and designed software are effective in evaluating the performance characteristics of mobile agent systems. The proposed approach can also be considered as the basis of performance analysis for large systems composed of multiple mobile agent platforms.

  5. "Dispersion modeling approaches for near road | Science ...

    Science.gov (United States)

    Roadway design and roadside barriers can have significant effects on the dispersion of traffic-generated pollutants, especially in the near-road environment. Dispersion models that can accurately simulate these effects are needed to fully assess these impacts for a variety of applications. For example, such models can be useful for evaluating the mitigation potential of roadside barriers in reducing near-road exposures and their associated adverse health effects. Two databases, a tracer field study and a wind tunnel study, provide measurements used in the development and/or validation of algorithms to simulate dispersion in the presence of noise barriers. The tracer field study was performed in Idaho Falls, ID, USA with a 6-m noise barrier and a finite line source in a variety of atmospheric conditions. The second study was performed in the meteorological wind tunnel at the US EPA and simulated line sources at different distances from a model noise barrier to capture the effect on emissions from individual lanes of traffic. In both cases, velocity and concentration measurements characterized the effect of the barrier on dispersion.This paper presents comparisons with the two datasets of the barrier algorithms implemented in two different dispersion models: US EPA’s R-LINE (a research dispersion modelling tool under development by the US EPA’s Office of Research and Development) and CERC’s ADMS model (ADMS-Urban). In R-LINE the physical features reveal

  6. Flipped models in Trinification: A Comprehensive Approach

    CERN Document Server

    Rodríguez, Oscar; Ponce, William A; Rojas, Eduardo

    2016-01-01

    By considering the 3-3-1 and the left-right symmetric models as low energy effective theories of the trinification group, alternative versions of these models are found. The new neutral gauge bosons in the universal 3-3-1 model and its flipped versions are considered; the left-right symmetric model and its two flipped variants are also studied. For these models, the couplings of the $Z'$ bosons to the standard model fermions are reported. The explicit form of the null space of the vector boson mass matrix for an arbitrary Higgs tensor and gauge group is also presented. In the general framework of the trinification gauge group, and by using the LHC experimental results and EW precision data, limits on the $Z'$ mass and the mixing angle between $Z$ and the new gauge bosons $Z'$ are imposed. The general results call for very small mixing angles, in the range of $10^{-3}$ radians, and $M_{Z'}$ > 2.5 TeV.

  7. Lightweight approach to model traceability in a CASE tool

    Science.gov (United States)

    Vileiniskis, Tomas; Skersys, Tomas; Pavalkis, Saulius; Butleris, Rimantas; Butkiene, Rita

    2017-07-01

    A term "model-driven" is not at all a new buzzword within the ranks of system development community. Nevertheless, the ever increasing complexity of model-driven approaches keeps fueling all kinds of discussions around this paradigm and pushes researchers forward to research and develop new and more effective ways to system development. With the increasing complexity, model traceability, and model management as a whole, becomes indispensable activities of model-driven system development process. The main goal of this paper is to present a conceptual design and implementation of a practical lightweight approach to model traceability in a CASE tool.

  8. Approaching models of nursing from a postmodernist perspective.

    Science.gov (United States)

    Lister, P

    1991-02-01

    This paper explores some questions about the use of models of nursing. These questions make various assumptions about the nature of models of nursing, in general and in particular. Underlying these assumptions are various philosophical positions which are explored through an introduction to postmodernist approaches in philosophical criticism. To illustrate these approaches, a critique of the Roper et al. model is developed, and more general attitudes towards models of nursing are examined. It is suggested that postmodernism offers a challenge to many of the assumptions implicit in models of nursing, and that a greater awareness of these assumptions should lead to nursing care being better informed where such models are in use.

  9. Manufacturing Excellence Approach to Business Performance Model

    Directory of Open Access Journals (Sweden)

    Jesus Cruz Alvarez

    2015-03-01

    Full Text Available Six Sigma, lean manufacturing, total quality management, quality control, and quality function deployment are the fundamental set of tools to enhance productivity in organizations. There is some research that outlines the benefit of each tool in the particular context of a firm's productivity, but not in the broader context of a firm's competitiveness, which is achieved through business performance. The aim of this theoretical research paper is to contribute to this end and propose a manufacturing excellence approach that links productivity tools to the broader context of business performance.

  10. The CONRAD approach to biokinetic modeling of DTPA decorporation therapy.

    Science.gov (United States)

    Breustedt, Bastian; Blanchardon, Eric; Bérard, Philippe; Fritsch, Paul; Giussani, Augusto; Lopez, Maria Antonia; Luciani, Andrea; Nosske, Dietmar; Piechowski, Jean; Schimmelpfeng, Jutta; Sérandour, Anne-Laure

    2010-10-01

    Diethylene Triamine Pentaacetic Acid (DTPA) is used for decorporation of plutonium because it is known to be able to enhance its urinary excretion for several days after treatment by forming stable Pu-DTPA complexes. The decorporation prevents accumulation in organs and results in a dosimetric benefit, which is difficult to quantify from bioassay data using existing models. The development of a biokinetic model describing the mechanisms of actinide decorporation by administration of DTPA was initiated as a task in the European COordinated Network on RAdiation Dosimetry (CONRAD). The systemic biokinetic model from Leggett et al. and the biokinetic model for DTPA compounds of International Commission on Radiological Protection Publication 53 were the starting points. A new model for biokinetics of administered DTPA based on physiological interpretation of 14C-labeled DTPA studies from literature was proposed by the group. Plutonium and DTPA biokinetics were modeled separately. The systems were connected by means of a second order kinetics process describing the chelation process of plutonium atoms and DTPA molecules to Pu-DTPA complexes. It was assumed that chelation only occurs in the blood and in systemic compartment ST0 (representing rapid turnover soft tissues), and that Pu-DTPA complexes and administered forms of DTPA share the same biokinetic behavior. First applications of the CONRAD approach showed that the enhancement of plutonium urinary excretion after administration of DTPA was strongly influenced by the chelation rate constant. Setting it to a high value resulted in a good fit to the observed data. However, the model was not yet satisfactory since the effects of repeated DTPA administration in a short time period cannot be predicted in a realistic way. In order to introduce more physiological knowledge into the model several questions still have to be answered. Further detailed studies of human contamination cases and experimental data will be needed in
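
    The second-order chelation step described above can be sketched as a small ODE system; the rate constants, initial amounts and single-compartment view below are illustrative assumptions only, whereas the CONRAD model couples this step to the full plutonium and DTPA biokinetic structures.

```python
import numpy as np
from scipy.integrate import odeint

def chelation(y, t, k_chel=50.0, k_dtpa_out=3.5, k_complex_out=3.5):
    """Second-order chelation of free Pu by circulating DTPA in one compartment.

    Rates (per day) and the single-compartment view are illustrative assumptions,
    not the CONRAD parameter values.
    """
    pu, dtpa, complex_ = y
    rate = k_chel * pu * dtpa                      # second-order chelation term
    return [-rate,
            -rate - k_dtpa_out * dtpa,             # DTPA also cleared renally
            rate - k_complex_out * complex_]       # Pu-DTPA complex excreted in urine

t = np.linspace(0.0, 2.0, 200)                     # days after administration
y0 = [1.0, 10.0, 0.0]                              # arbitrary initial amounts
pu, dtpa, complex_ = odeint(chelation, y0, t).T
print(f"free Pu remaining after 2 d: {pu[-1]:.3f} (fraction of initial)")
```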

  11. MDA based-approach for UML Models Complete Comparison

    CERN Document Server

    Chaouni, Samia Benabdellah; Mouline, Salma

    2011-01-01

    If a modeling task is distributed, it will frequently be necessary to integrate models developed by different team members. Problems occur in the model integration step, and particularly in the comparison phase of the integration. This issue has been discussed in several domains and for various kinds of models. However, previous approaches have not correctly handled semantic comparison. In the current paper, we provide an MDA-based approach for model comparison which aims at comparing UML models. We develop a hybrid approach which takes into account syntactic, semantic and structural comparison aspects. For this purpose, we use a domain ontology as well as other resources such as dictionaries. We propose a decision support system which permits the user to validate (or not) correspondences extracted in the comparison phase. For implementation, we propose an extension of the generic correspondence metamodel AMW in order to transform UML models to the correspondence model.

  12. A consortium approach to glass furnace modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Chang, S.-L.; Golchert, B.; Petrick, M.

    1999-04-20

    Using computational fluid dynamics to model a glass furnace is a difficult task for any one glass company, laboratory, or university to accomplish. The task of building a computational model of the furnace requires knowledge and experience in modeling two dissimilar regimes (the combustion space and the liquid glass bath), along with the skill necessary to couple these two regimes. Also, a detailed set of experimental data is needed in order to evaluate the output of the code to ensure that the code is providing proper results. Since all these diverse skills are not present in any one research institution, a consortium was formed between Argonne National Laboratory, Purdue University, Mississippi State University, and five glass companies in order to marshal these skills into one three-year program. The objective of this program is to develop a fully coupled, validated simulation of a glass melting furnace that may be used by industry to optimize the performance of existing furnaces.

  13. Mixture modeling approach to flow cytometry data.

    Science.gov (United States)

    Boedigheimer, Michael J; Ferbas, John

    2008-05-01

    Flow Cytometry has become a mainstay technique for measuring fluorescent and physical attributes of single cells in a suspended mixture. These data are reduced during analysis using a manual or semiautomated process of gating. Despite the need to gate data for traditional analyses, it is well recognized that analyst-to-analyst variability can impact the dataset. Moreover, cells of interest can be inadvertently excluded from the gate, and relationships between collected variables may go unappreciated because they were not included in the original analysis plan. A multivariate non-gating technique was developed and implemented that accomplished the same goal as traditional gating while eliminating many weaknesses. The procedure was validated against traditional gating for analysis of circulating B cells in normal donors (n = 20) and persons with Systemic Lupus Erythematosus (n = 42). The method recapitulated relationships in the dataset while providing for an automated and objective assessment of the data. Flow cytometry analyses are amenable to automated analytical techniques that are not predicated on discrete operator-generated gates. Such alternative approaches can remove subjectivity in data analysis, improve efficiency and may ultimately enable construction of large bioinformatics data systems for more sophisticated approaches to hypothesis testing.
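
    A minimal mixture-model sketch of the non-gating idea is given below, fitting a two-component Gaussian mixture to synthetic two-marker events and reading off component assignments instead of manual gates; the data, marker interpretation and component count are invented for illustration.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Synthetic two-marker events for two populations (values are invented).
rng = np.random.default_rng(0)
pop_a = rng.normal(loc=[4.0, 1.0], scale=0.4, size=(500, 2))
pop_b = rng.normal(loc=[1.0, 4.0], scale=0.4, size=(1500, 2))
events = np.vstack([pop_a, pop_b])

gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
labels = gmm.fit_predict(events)                  # component assignment per event

frac = np.bincount(labels) / len(labels)
print("estimated population fractions:", np.round(frac, 3))
# roughly 0.25 and 0.75, in some component order
```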

  14. BUSINESS MODEL IN ELECTRICITY INDUSTRY USING BUSINESS MODEL CANVAS APPROACH; THE CASE OF PT. XYZ

    National Research Council Canada - National Science Library

    Wicaksono, Achmad Arief; Syarief, Rizal; Suparno, Ono

    2017-01-01

    .... This study aims to identify company's business model using Business Model Canvas approach, formulate business development strategy alternatives, and determine the prioritized business development...

  15. A mechanism-based approach for absorption modeling: the Gastro-Intestinal Transit Time (GITT) model.

    Science.gov (United States)

    Hénin, Emilie; Bergstrand, Martin; Standing, Joseph F; Karlsson, Mats O

    2012-06-01

    Absorption models used in the estimation of pharmacokinetic drug characteristics from plasma concentration data are generally empirical and simple, utilizing no prior information on gastro-intestinal (GI) transit patterns. Our aim was to develop and evaluate an estimation strategy based on a mechanism-based model for drug absorption, which takes into account the tablet movement through the GI transit. This work is an extension of a previous model utilizing tablet movement characteristics derived from magnetic marker monitoring (MMM) and pharmacokinetic data. The new approach, which replaces MMM data with a GI transit model, was evaluated in data sets where MMM data were available (felodipine) or not available (diclofenac). Pharmacokinetic profiles in both datasets were well described by the model according to goodness-of-fit plots. Visual predictive checks showed the model to give superior simulation properties compared with a standard empirical approach (first-order absorption rate + lag-time). This model represents a step towards an integrated mechanism-based NLME model, where the use of physiological knowledge and in vitro–in vivo correlation helps fully characterize PK and generate hypotheses for new formulations or specific populations.
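
    The transit-compartment sketch below conveys the general idea of letting gastro-intestinal transit govern when drug becomes available for absorption; the chain structure and all parameter values are generic illustrations and are not the GITT model or its MMM-informed parameters.

```python
import numpy as np
from scipy.integrate import odeint

def transit_pk(y, t, n=4, ktr=1.5, ka=0.8, ke=0.2):
    """Dose moves through n transit compartments (rate ktr), is absorbed from the
    last one (rate ka) into a central compartment and eliminated (rate ke).
    Structure and rates are illustrative only."""
    transit, central = y[:n], y[n]
    d = np.zeros(n + 1)
    d[0] = -ktr * transit[0]
    for i in range(1, n - 1):
        d[i] = ktr * (transit[i - 1] - transit[i])
    d[n - 1] = ktr * transit[n - 2] - ka * transit[n - 1]
    d[n] = ka * transit[n - 1] - ke * central
    return d

t = np.linspace(0.0, 24.0, 200)                  # hours
y0 = np.zeros(5); y0[0] = 100.0                  # dose placed in the first compartment
profile = odeint(transit_pk, y0, t)[:, -1]       # central-compartment amount
print(f"peak amount {profile.max():.1f} at t = {t[profile.argmax()]:.1f} h")
```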

  16. A message passing approach for general epidemic models

    CERN Document Server

    Karrer, Brian

    2010-01-01

    In most models of the spread of disease over contact networks it is assumed that the probabilities of disease transmission and recovery from disease are constant in time. In real life, however, this is far from true. In many diseases, for instance, recovery occurs at about the same time after infection for all individuals, rather than at a constant rate. In this paper, we study a generalized version of the SIR (susceptible-infected-recovered) model of epidemic disease that allows for arbitrary nonuniform distributions of transmission and recovery times. Standard differential equation approaches cannot be used for this generalized model, but we show that the problem can be reformulated as a time-dependent message passing calculation on the appropriate contact network. The calculation is exact on trees (i.e., loopless networks) or locally tree-like networks (such as random graphs) in the large system size limit. On non-tree-like networks we show that the calculation gives a rigorous bound on the size of disease...
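
    The message-passing calculation of the paper is not reproduced here. For intuition about the scenario it addresses, the sketch below directly simulates an SIR outbreak on a random contact network with a fixed, non-exponential recovery time; the network, transmission rate and recovery time are all assumptions.

```python
# Direct stochastic simulation of SIR on a random network with a fixed
# (non-exponential) recovery time; an illustration of the scenario, not the
# message-passing calculation of Karrer & Newman. All parameters are assumed.
import heapq
import random
import networkx as nx

random.seed(1)
G = nx.gnp_random_graph(2000, 4 / 2000, seed=1)   # sparse random contact network
beta = 0.6        # per-edge transmission rate (assumed)
tau = 1.0         # fixed recovery time after infection (assumed)

status = {v: "S" for v in G}
events = [(0.0, "infect", 0, None)]               # (time, kind, node, source)

while events:
    t, kind, node, _ = heapq.heappop(events)
    if kind == "infect":
        if status[node] != "S":
            continue
        status[node] = "I"
        heapq.heappush(events, (t + tau, "recover", node, None))
        for nbr in G.neighbors(node):
            if status[nbr] == "S":
                t_trans = t + random.expovariate(beta)   # attempted transmission time
                if t_trans < t + tau:                    # must happen before recovery
                    heapq.heappush(events, (t_trans, "infect", nbr, node))
    else:
        status[node] = "R"

print("final outbreak size:", sum(1 for s in status.values() if s == "R"))
```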

  17. "Dispersion modeling approaches for near road

    Science.gov (United States)

    Roadway design and roadside barriers can have significant effects on the dispersion of traffic-generated pollutants, especially in the near-road environment. Dispersion models that can accurately simulate these effects are needed to fully assess these impacts for a variety of app...

  18. and Models: A Self-Similar Approach

    Directory of Open Access Journals (Sweden)

    José Antonio Belinchón

    2013-01-01

    equations (FEs) admit self-similar solutions. The methods employed allow us to obtain general results that are valid not only for the FRW metric, but also for all the Bianchi types as well as for the Kantowski-Sachs model (under the self-similarity hypothesis and the power-law hypothesis for the scale factors).

  19. Nonperturbative approach to the modified statistical model

    Energy Technology Data Exchange (ETDEWEB)

    Magdy, M.A.; Bekmezci, A.; Sever, R. [Middle East Technical Univ., Ankara (Turkey)

    1993-12-01

    The modified form of the statistical model is used without making any perturbation. The mass spectra of the lowest S, P and D levels of the (Q{bar Q}) and the non-self-conjugate (Q{bar q}) mesons are studied with the Song-Lin potential. The authors' results are in good agreement with the experimental and theoretical findings.

  20. System Behavior Models: A Survey of Approaches

    Science.gov (United States)

    2016-06-01

    Mandana Vaziri, and Frank Tip. 2007. “Finding Bugs Efficiently with a SAT Solver.” In European Software Engineering Conference and the ACM SIGSOFT...Van Gorp. 2005. “A Taxonomy of Model Transformation.” Electronic Notes in Theoretical Computer Science 152: 125–142. Miyazawa, Alvaro, and Ana

  1. A moving approach for the Vector Hysteron Model

    Energy Technology Data Exchange (ETDEWEB)

    Cardelli, E. [Department of Engineering, University of Perugia, Via G. Duranti 93, 06125 Perugia (Italy); Faba, A., E-mail: antonio.faba@unipg.it [Department of Engineering, University of Perugia, Via G. Duranti 93, 06125 Perugia (Italy); Laudani, A. [Department of Engineering, Roma Tre University, Via V. Volterra 62, 00146 Rome (Italy); Quondam Antonio, S. [Department of Engineering, University of Perugia, Via G. Duranti 93, 06125 Perugia (Italy); Riganti Fulginei, F.; Salvini, A. [Department of Engineering, Roma Tre University, Via V. Volterra 62, 00146 Rome (Italy)

    2016-04-01

    A moving approach for the VHM (Vector Hysteron Model) is described here, to reconstruct both scalar and rotational magnetization of electrical steels with weak anisotropy, such as non-oriented grain silicon steel. The hysteron distribution is postulated to be a function of the magnetization state of the material, in order to overcome the practical limitation of the congruency property of the standard VHM approach. By using this formulation and a suitable accommodation procedure, the results obtained indicate that the model is accurate, in particular in reproducing the experimental behavior approaching the saturation region, and represents a real improvement with respect to the previous approach.

  2. Integration models: multicultural and liberal approaches confronted

    Science.gov (United States)

    Janicki, Wojciech

    2012-01-01

    European societies have been shaped by their Christian past, the upsurge of international migration, democratic rule and a liberal tradition rooted in religious tolerance. Accelerating globalization imposes new challenges on European societies striving to protect their diversity. This struggle is especially visible in the case of minorities trying to resist melting into the mainstream culture. European countries' legal systems and cultural policies respond to these efforts in many ways. Respecting identity-politics-driven group rights seems to be the most common approach, resulting in the creation of a multicultural society. However, the outcome of respecting group rights may be remarkably contradictory both to individual rights growing out of the liberal tradition and to the reinforced concept of integration of immigrants into host societies. This paper discusses the upturn of identity politics in the context of both individual rights and the integration of European societies.

  3. A New Approach in Regression Analysis for Modeling Adsorption Isotherms

    Directory of Open Access Journals (Sweden)

    Dana D. Marković

    2014-01-01

    Full Text Available Numerous regression approaches to isotherm parameter estimation appear in the literature. Real insight into the proper modeling pattern can be achieved only by testing methods on a very large number of cases. Experimentally, this cannot be done in a reasonable time, so the Monte Carlo simulation method was applied. The objective of this paper is to introduce and compare numerical approaches that involve different levels of knowledge about the noise structure of the analytical method used for initial and equilibrium concentration determination. Six levels of homoscedastic noise and five types of heteroscedastic noise precision models were considered. Performance of the methods was statistically evaluated based on median percentage error and mean absolute relative error in parameter estimates. The present study showed a clear distinction between two cases. When equilibrium experiments are performed only once, for the homoscedastic case the winning error function is ordinary least squares, while for the case of heteroscedastic noise the use of orthogonal distance regression or Marquardt's percent standard deviation is suggested. It was found that when experiments are repeated three times, the simple method of weighted least squares performs as well as the more complicated orthogonal distance regression method.
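
    As a hedged illustration of the distinction drawn above, the sketch below fits a Langmuir isotherm to synthetic data once by ordinary least squares and once by weighted least squares, with weights matching an assumed relative-error (heteroscedastic) noise model; the data, noise level and parameter values are hypothetical.

```python
# Illustrative comparison of ordinary vs weighted least squares for a Langmuir
# isotherm q = qm*K*c/(1 + K*c); data and noise model are synthetic assumptions.
import numpy as np
from scipy.optimize import curve_fit

def langmuir(c, qm, K):
    return qm * K * c / (1.0 + K * c)

rng = np.random.default_rng(42)
c = np.linspace(0.1, 10.0, 15)                                # equilibrium concentrations
q_true = langmuir(c, qm=5.0, K=0.8)
q_obs = q_true * (1 + 0.05 * rng.standard_normal(c.size))     # ~5% relative noise

# Ordinary least squares: implicitly assumes constant-variance (homoscedastic) noise.
p_ols, _ = curve_fit(langmuir, c, q_obs, p0=[1.0, 1.0])

# Weighted least squares: weights follow the assumed relative-error noise model.
sigma = 0.05 * q_obs
p_wls, _ = curve_fit(langmuir, c, q_obs, p0=[1.0, 1.0], sigma=sigma, absolute_sigma=True)

print("OLS estimate (qm, K):", np.round(p_ols, 3))
print("WLS estimate (qm, K):", np.round(p_wls, 3))
```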

  4. Thin inclusion approach for modelling of heterogeneous conducting materials

    Energy Technology Data Exchange (ETDEWEB)

    Lavrov, Nikolay [Davenport University, 4801 Oakman Boulevard, Dearborn, MI 48126 (United States); Smirnova, Alevtina; Gorgun, Haluk; Sammes, Nigel [University of Connecticut, Department of Materials Science and Engineering, Connecticut Global Fuel Center, 44 Weaver Road, Unit 5233, Storrs, CT 06269 (United States)

    2006-04-21

    Experimental data show that the heterogeneous nanostructure of solid oxide and polymer electrolyte fuel cells could be approximated as an infinite set of fiber-like or penny-shaped inclusions in a continuous medium. Inclusions can be arranged in a cluster mode and in regular or random order. In the newly proposed theoretical model of nanostructured material, most attention is paid to the small aspect ratio of structural elements as well as to some model problems of electrostatics. The proposed integral equation for the electric potential caused by the charge distributed over a single circular or elliptic cylindrical conductor of finite length, as a single unit of a nanostructured material, has been asymptotically simplified for the small aspect ratio and solved numerically. The result demonstrates that the surface density changes slightly in the middle part of the thin domain and has boundary layers localized near the edges. It is anticipated that the contribution of the boundary-layer solution to the surface density is significant and cannot be governed by the classic equation for a smooth linear charge. The role of the cross-section shape is also investigated. The proposed approach is sufficiently simple and robust, and allows extension to either regular or irregular systems of various inclusions. This approach can be used for the development of the system of conducting inclusions, which are commonly present in nanostructured materials used for solid oxide and polymer electrolyte fuel cell (PEMFC) materials. (author)

  5. A Modeling Approach for Plastic-Metal Laser Direct Joining

    Science.gov (United States)

    Lutey, Adrian H. A.; Fortunato, Alessandro; Ascari, Alessandro; Romoli, Luca

    2017-09-01

    Laser processing has been identified as a feasible approach to direct joining of metal and plastic components without the need for adhesives or mechanical fasteners. The present work sees development of a modeling approach for conduction and transmission laser direct joining of these materials based on multi-layer optical propagation theory and numerical heat flow simulation. The scope of this methodology is to predict process outcomes based on the calculated joint interface and upper surface temperatures. Three representative cases are considered for model verification, including conduction joining of PBT and aluminum alloy, transmission joining of optically transparent PET and stainless steel, and transmission joining of semi-transparent PA 66 and stainless steel. Conduction direct laser joining experiments are performed on black PBT and 6082 anticorodal aluminum alloy, achieving shear loads of over 2000 N with specimens of 2 mm thickness and 25 mm width. Comparison with simulation results shows that consistently high strength is achieved where the peak interface temperature is above the plastic degradation temperature. Comparison of transmission joining simulations and published experimental results confirms these findings and highlights the influence of plastic layer optical absorption on process feasibility.

  6. A Workflow-Oriented Approach To Propagation Models In Heliophysics

    Directory of Open Access Journals (Sweden)

    Gabriele Pierantoni

    2014-01-01

    Full Text Available The Sun is responsible for the eruption of billions of tons of plasma and the generation of near light-speed particles that propagate throughout the solar system and beyond. If directed towards Earth, these events can be damaging to our technological infrastructure. Hence there is an effort to understand the cause of the eruptive events and how they propagate from Sun to Earth. However, the physics governing their propagation is not well understood, so there is a need to develop a theoretical description of their propagation, known as a Propagation Model, in order to predict when they may impact Earth. It is often difficult to define a single propagation model that correctly describes the physics of solar eruptive events, and even more difficult to implement models capable of catering for all these complexities and to validate them using real observational data. In this paper, we envisage that workflows offer both a theoretical and practical framework for a novel approach to propagation models. We define a mathematical framework that aims at encompassing the different modalities with which workflows can be used, and provide a set of generic building blocks written in the TAVERNA workflow language that users can use to build their own propagation models. Finally we test both the theoretical model and the composite building blocks of the workflow with a real Science Use Case that was discussed during the 4th CDAW (Coordinated Data Analysis Workshop) event held by the HELIO project. We show that generic workflow building blocks can be used to construct a propagation model that successfully describes the transit of solar eruptive events toward Earth and predicts a correct Earth-impact time.

  7. Human ESC-derived dopamine neurons show similar preclinical efficacy and potency to fetal neurons when grafted in a rat model of Parkinson's disease.

    Science.gov (United States)

    Grealish, Shane; Diguet, Elsa; Kirkeby, Agnete; Mattsson, Bengt; Heuer, Andreas; Bramoulle, Yann; Van Camp, Nadja; Perrier, Anselme L; Hantraye, Philippe; Björklund, Anders; Parmar, Malin

    2014-11-06

    Considerable progress has been made in generating fully functional and transplantable dopamine neurons from human embryonic stem cells (hESCs). Before these cells can be used for cell replacement therapy in Parkinson's disease (PD), it is important to verify their functional properties and efficacy in animal models. Here we provide a comprehensive preclinical assessment of hESC-derived midbrain dopamine neurons in a rat model of PD. We show long-term survival and functionality using clinically relevant MRI and PET imaging techniques and demonstrate efficacy in restoration of motor function with a potency comparable to that seen with human fetal dopamine neurons. Furthermore, we show that hESC-derived dopamine neurons can project sufficiently long distances for use in humans, fully regenerate midbrain-to-forebrain projections, and innervate correct target structures. This provides strong preclinical support for clinical translation of hESC-derived dopamine neurons using approaches similar to those established with fetal cells for the treatment of Parkinson's disease. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  8. Human ESC-Derived Dopamine Neurons Show Similar Preclinical Efficacy and Potency to Fetal Neurons when Grafted in a Rat Model of Parkinson’s Disease

    Science.gov (United States)

    Grealish, Shane; Diguet, Elsa; Kirkeby, Agnete; Mattsson, Bengt; Heuer, Andreas; Bramoulle, Yann; Van Camp, Nadja; Perrier, Anselme L.; Hantraye, Philippe; Björklund, Anders; Parmar, Malin

    2014-01-01

    Summary Considerable progress has been made in generating fully functional and transplantable dopamine neurons from human embryonic stem cells (hESCs). Before these cells can be used for cell replacement therapy in Parkinson’s disease (PD), it is important to verify their functional properties and efficacy in animal models. Here we provide a comprehensive preclinical assessment of hESC-derived midbrain dopamine neurons in a rat model of PD. We show long-term survival and functionality using clinically relevant MRI and PET imaging techniques and demonstrate efficacy in restoration of motor function with a potency comparable to that seen with human fetal dopamine neurons. Furthermore, we show that hESC-derived dopamine neurons can project sufficiently long distances for use in humans, fully regenerate midbrain-to-forebrain projections, and innervate correct target structures. This provides strong preclinical support for clinical translation of hESC-derived dopamine neurons using approaches similar to those established with fetal cells for the treatment of Parkinson’s disease. PMID:25517469

  9. A modular approach to numerical human body modeling

    NARCIS (Netherlands)

    Forbes, P.A.; Griotto, G.; Rooij, L. van

    2007-01-01

    The choice of a human body model for a simulated automotive impact scenario must take into account both accurate model response and computational efficiency as key factors. This study presents a "modular numerical human body modeling" approach which allows the creation of a customized human body mod

  10. A BEHAVIORAL-APPROACH TO LINEAR EXACT MODELING

    NARCIS (Netherlands)

    ANTOULAS, AC; WILLEMS, JC

    1993-01-01

    The behavioral approach to system theory provides a parameter-free framework for the study of the general problem of linear exact modeling and recursive modeling. The main contribution of this paper is the solution of the (continuous-time) polynomial-exponential time series modeling problem. Both re

  11. A modular approach to numerical human body modeling

    NARCIS (Netherlands)

    Forbes, P.A.; Griotto, G.; Rooij, L. van

    2007-01-01

    The choice of a human body model for a simulated automotive impact scenario must take into account both accurate model response and computational efficiency as key factors. This study presents a "modular numerical human body modeling" approach which allows the creation of a customized human body

  12. A market model for stochastic smile: a conditional density approach

    NARCIS (Netherlands)

    Zilber, A.

    2005-01-01

    The purpose of this paper is to introduce a new approach that allows us to construct no-arbitrage market models for implied volatility surfaces (in other words, stochastic smile models). That is to say, the idea presented here allows us to model prices of liquidly traded vanilla options as separate

  13. Thermoplasmonics modeling: A Green's function approach

    Science.gov (United States)

    Baffou, Guillaume; Quidant, Romain; Girard, Christian

    2010-10-01

    We extend the discrete dipole approximation (DDA) and the Green's dyadic tensor (GDT) methods—previously dedicated to all-optical simulations—to investigate the thermodynamics of illuminated plasmonic nanostructures. This extension is based on the use of the thermal Green's function and an original algorithm that we named Laplace matrix inversion. It allows for the computation of the steady-state temperature distribution throughout plasmonic systems. This hybrid photothermal numerical method is suited to investigate arbitrarily complex structures. It can take into account the presence of a dielectric planar substrate and is simple to implement in any DDA or GDT code. Using this numerical framework, different applications are discussed such as thermal collective effects in nanoparticle assemblies, the influence of a substrate on the temperature distribution and the heat generation in a plasmonic nanoantenna. This numerical approach appears particularly suited for new applications in physics, chemistry, and biology such as plasmon-induced nanochemistry and catalysis, nanofluidics, photothermal cancer therapy, or phase-transition control at the nanoscale.
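
    The DDA/GDT machinery and the Laplace-matrix-inversion algorithm are not reproduced here. The sketch below only illustrates the underlying superposition idea: at steady state each localized heat source contributes a thermal Green's function term q/(4*pi*kappa*|r - r0|) to the temperature rise. The geometry, powers and conductivity are assumptions.

```python
# Superposition sketch: each localized heat source q_i in a homogeneous medium
# of thermal conductivity kappa contributes dT = q_i / (4*pi*kappa*|r - r_i|)
# at steady state. Source powers, geometry and kappa are assumptions; the
# paper's self-consistent DDA/GDT calculation is not reproduced.
import numpy as np

kappa = 0.6                                        # W/(m K), roughly water (assumed)
sources = np.array([[0.0, 0.0, 0.0],
                    [50e-9, 0.0, 0.0]])            # two nanoparticles 50 nm apart
powers = np.array([1e-6, 1e-6])                    # 1 microwatt each (assumed)

def delta_T(r):
    """Steady-state temperature rise at point r from the point-like sources."""
    d = np.linalg.norm(sources - r, axis=1)
    return np.sum(powers / (4.0 * np.pi * kappa * d))

probe = np.array([25e-9, 25e-9, 0.0])              # probe point between the particles
print(f"temperature rise at probe: {delta_T(probe):.2f} K")
```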

  14. Agribusiness model approach to territorial food development

    Directory of Open Access Journals (Sweden)

    Murcia Hector Horacio

    2011-04-01

    Full Text Available

    Several research efforts within the academic program in Agricultural Business Management at the University De La Salle (Bogota D.C.) have been coordinated toward the design and implementation of a sustainable agribusiness model applied to food development, with territorial projection. Rural development is considered as a process that aims to improve the current capacity and potential of the inhabitants of the sector, which refers not only to production levels and productivity of agricultural items. It takes into account the guidelines of the United Nations "Millennium Development Goals" and considers the concept of sustainable food and agriculture development, including food security and nutrition in an integrated interdisciplinary context, with a holistic and systemic dimension. The analysis is specified by a model with an emphasis on sustainable agribusiness production chains related to agricultural food items in a specific region. This model was correlated with farm (technical objectives), family (social purposes) and community (collective orientations) projects. Within this dimension, food development concepts and the methodologies of Participatory Action Research (PAR) are considered. Finally, it addresses the need to link the results to low-income communities, within the concepts of the "new rurality".

  15. Coupling approaches used in atmospheric entry models

    Science.gov (United States)

    Gritsevich, M. I.

    2012-09-01

    While a planet orbits the Sun, it is subject to impact by smaller objects, ranging from tiny dust particles and space debris to much larger asteroids and comets. Such collisions have taken place frequently over geological time and played an important role in the evolution of planets and the development of life on the Earth. Though the search for near-Earth objects addresses one of the main points of the Asteroid and Comet Hazard, one should not underestimate the useful information to be gleaned from smaller atmospheric encounters, known as meteors or fireballs. Not only do these events help determine the linkages between meteorites and their parent bodies; due to their relative regularity they provide a good statistical basis for analysis. For successful cases with found meteorites, the detailed atmospheric path record is an excellent tool to test and improve existing entry models assuring the robustness of their implementation. There are many more important scientific questions meteoroids help us to answer, among them: Where do these objects come from, what are their origins, physical properties and chemical composition? What are the shapes and bulk densities of the space objects which fully ablate in an atmosphere and do not reach the planetary surface? Which values are directly measured and which are initially assumed as input to various models? How to couple both fragmentation and ablation effects in the model, taking the real size distribution of fragments into account? How to specify and speed up the recovery of recently fallen meteorites, not letting weathering affect the samples too much? How big is the pre-atmospheric projectile to terminal body ratio in terms of their mass/volume? Which exact parameters besides initial mass define this ratio? More generally, how does an entering object affect Earth's atmosphere and (if applicable) Earth's surface? How to predict these impact consequences based on atmospheric trajectory data? How to describe atmospheric entry

  16. Applied Regression Modeling A Business Approach

    CERN Document Server

    Pardoe, Iain

    2012-01-01

    An applied and concise treatment of statistical regression techniques for business students and professionals who have little or no background in calculus. Regression analysis is an invaluable statistical methodology in business settings and is vital to model the relationship between a response variable and one or more predictor variables, as well as the prediction of a response value given values of the predictors. In view of the inherent uncertainty of business processes, such as the volatility of consumer spending and the presence of market uncertainty, business professionals use regression a

  17. Bayesian Approach to Neuro-Rough Models for Modelling HIV

    CERN Document Server

    Marwala, Tshilidzi

    2007-01-01

    This paper proposes a new neuro-rough model for modelling the risk of HIV from demographic data. The model is formulated using a Bayesian framework and trained using the Markov Chain Monte Carlo method and the Metropolis criterion. When the model was tested on estimating the risk of HIV infection given the demographic data, it was found to give an accuracy of 62%, as opposed to 58% obtained from a Bayesian-formulated rough set model trained using the Markov chain Monte Carlo method and 62% obtained from a Bayesian-formulated multi-layered perceptron (MLP) model trained using hybrid Monte Carlo. The proposed model is able to combine the accuracy of the Bayesian MLP model and the transparency of the Bayesian rough set model.

  18. Evaluation of approaches focused on modelling of organic carbon stocks using the RothC model

    Science.gov (United States)

    Koco, Štefan; Skalský, Rastislav; Makovníková, Jarmila; Tarasovičová, Zuzana; Barančíková, Gabriela

    2014-05-01

    The aim of current efforts in the European area is the protection of soil organic matter, which is included in all relevant documents related to the protection of soil. The modelling of organic carbon stocks under anticipated climate change, or under particular land management, can significantly help in short- and long-term forecasting of the state of soil organic matter. The RothC model can be applied over time periods of several years to centuries and has been tested in long-term experiments within a large range of soil types and climatic conditions in Europe. For the initialization of the RothC model, knowledge about the carbon pool sizes is essential. Pool size characterization can be obtained from equilibrium model runs, but this approach is time consuming and tedious, especially for larger scale simulations. Due to this complexity, we searched for new ways to simplify and accelerate this process. The paper presents a comparison of two approaches for SOC stock modelling in the same area. The modelling was carried out on the basis of separate land use, management and soil data inputs for each simulation unit. We modelled 1617 simulation units of a 1x1 km grid on the territory of the agroclimatic region Žitný ostrov in the southwest of Slovakia. The first approach creates groups of simulation units based on the evaluation of results for simulation units with similar input values. The groups were created after testing and validating the modelling results for individual simulation units against the results of modelling the average input values for the whole group. Tests of the equilibrium model for intervals of 5 t.ha-1 of initial SOC stock showed minimal differences in results compared with the result for the average value of the whole interval. Management input data on plant residues and farmyard manure for modelling carbon turnover were also the same for several simulation units. Combining these groups (intervals of initial
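
    RothC itself partitions soil organic carbon into pools that decay with first-order kinetics scaled by climate and cover rate modifiers. The sketch below illustrates only that first-order pool turnover; pool sizes, inputs and the rate modifier are assumptions, and the routing of decomposed carbon into BIO/HUM/CO2 is omitted, so this is not a calibrated RothC run.

```python
# Illustrative first-order pool turnover in the spirit of RothC; pool sizes,
# inputs, the rate modifier and the DPM/RPM split are assumptions, and the
# routing of decomposed carbon into BIO, HUM and CO2 is omitted entirely.
import numpy as np

pools = {"DPM": 0.5, "RPM": 5.0, "BIO": 1.0, "HUM": 30.0, "IOM": 2.0}  # t C/ha (assumed)
k = {"DPM": 10.0, "RPM": 0.3, "BIO": 0.66, "HUM": 0.02, "IOM": 0.0}    # 1/yr (nominal order)
rho = 0.8              # combined climate/cover rate-modifying factor (assumed)
plant_input = 3.0      # t C/ha/yr of plant residues (assumed)
dpm_rpm_ratio = 1.44   # commonly quoted arable split (assumed here)

dt = 1.0 / 12.0        # monthly time step
for month in range(12 * 50):                                  # run 50 years
    pools["DPM"] += plant_input * dt * dpm_rpm_ratio / (1 + dpm_rpm_ratio)
    pools["RPM"] += plant_input * dt / (1 + dpm_rpm_ratio)
    for name in pools:
        pools[name] *= np.exp(-k[name] * rho * dt)            # first-order decay

print("SOC stock after 50 yr:", round(sum(pools.values()), 2), "t C/ha")
```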

  19. Development of a computationally efficient urban modeling approach

    DEFF Research Database (Denmark)

    Wolfs, Vincent; Murla, Damian; Ntegeka, Victor;

    2016-01-01

    This paper presents a parsimonious and data-driven modelling approach to simulate urban floods. Flood levels simulated by detailed 1D-2D hydrodynamic models can be emulated using the presented conceptual modelling approach with a very short calculation time. In addition, the model detail can be adjusted, allowing the modeller to focus on flood-prone locations. This results in efficiently parameterized models that can be tailored to applications. The simulated flood levels are transformed into flood extent maps using a high resolution (0.5-meter) digital terrain model in GIS. To illustrate the developed methodology, a case study for the city of Ghent in Belgium is elaborated. The configured conceptual model mimics the flood levels of a detailed 1D-2D hydrodynamic InfoWorks ICM model accurately, while the calculation time is of the order of 10^6 times shorter than the original highly

  20. Implicit moral evaluations: A multinomial modeling approach.

    Science.gov (United States)

    Cameron, C Daryl; Payne, B Keith; Sinnott-Armstrong, Walter; Scheffer, Julian A; Inzlicht, Michael

    2017-01-01

    Implicit moral evaluations, i.e., immediate, unintentional assessments of the wrongness of actions or persons, play a central role in supporting moral behavior in everyday life. Yet little research has employed methods that rigorously measure individual differences in implicit moral evaluations. In five experiments, we develop a new sequential priming measure, the Moral Categorization Task, and a multinomial model that decomposes judgment on this task into multiple component processes. These include implicit moral evaluations of moral transgression primes (Unintentional Judgment), accurate moral judgments about target actions (Intentional Judgment), and a directional tendency to judge actions as morally wrong (Response Bias). Speeded response deadlines reduced Intentional Judgment but not Unintentional Judgment (Experiment 1). Unintentional Judgment was stronger toward moral transgression primes than non-moral negative primes (Experiments 2-4). Intentional Judgment was associated with increased error-related negativity, a neurophysiological indicator of behavioral control (Experiment 4). Finally, people who voted for an anti-gay marriage amendment had stronger Unintentional Judgment toward gay marriage primes (Experiment 5). Across Experiments 1-4, implicit moral evaluations converged with moral personality: Unintentional Judgment about wrong primes, but not negative primes, was negatively associated with psychopathic tendencies and positively associated with moral identity and guilt proneness. Theoretical and practical applications of formal modeling for moral psychology are discussed. Copyright © 2016 Elsevier B.V. All rights reserved.
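
    The exact tree equations of the published multinomial model are not given in the record. The sketch below fits a simplified processing tree in the same spirit, with Intentional Judgment, Unintentional Judgment and Response Bias parameters estimated by maximum likelihood from hypothetical response counts; both the tree structure and the data are assumptions made for illustration.

```python
# Illustrative multinomial-processing-tree fit with Intentional Judgment (I),
# Unintentional Judgment (U) and Response Bias (B); the tree equations and the
# response counts are hypothetical, not the published model specification.
import numpy as np
from scipy.optimize import minimize

def p_wrong(I, U, B, target_wrong, prime_transgression):
    base = I if target_wrong else 0.0          # intentional judgment of the target
    u = U if prime_transgression else 0.0      # prime-driven response if I fails
    return base + (1 - I) * (u + (1 - u) * B)  # otherwise bias B decides

# Hypothetical counts: ("wrong" responses, trials) per target x prime condition.
data = {
    ("wrong", "transgression"): (170, 200),
    ("neutral", "transgression"): (80, 200),
    ("wrong", "neutral"): (150, 200),
    ("neutral", "neutral"): (40, 200),
}

def neg_log_lik(params):
    I, U, B = params
    if not all(0.001 < p < 0.999 for p in params):
        return np.inf
    ll = 0.0
    for (target, prime), (kk, n) in data.items():
        p = p_wrong(I, U, B, target == "wrong", prime == "transgression")
        ll += kk * np.log(p) + (n - kk) * np.log(1 - p)
    return -ll

fit = minimize(neg_log_lik, x0=[0.5, 0.3, 0.3], method="Nelder-Mead")
print("estimated (I, U, B):", np.round(fit.x, 3))
```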

  1. Continuous Molecular Fields Approach Applied to Structure-Activity Modeling

    CERN Document Server

    Baskin, Igor I

    2013-01-01

    The Method of Continuous Molecular Fields is a universal approach to predict various properties of chemical compounds, in which molecules are represented by means of continuous fields (such as electrostatic, steric, electron density functions, etc). The essence of the proposed approach consists in performing statistical analysis of functional molecular data by means of joint application of kernel machine learning methods and special kernels which compare molecules by computing overlap integrals of their molecular fields. This approach is an alternative to traditional methods of building 3D structure-activity and structure-property models based on the use of fixed sets of molecular descriptors. The methodology of the approach is described in this chapter, followed by its application to building regression 3D-QSAR models and conducting virtual screening based on one-class classification models. The main directions of the further development of this approach are outlined at the end of the chapter.

  2. New Cutting Force Modeling Approach for Flat End Mill

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    A new mechanistic cutting force model for flat end milling using instantaneous cutting force coefficients is proposed. An in-depth analysis shows that the total cutting forces can be separated into two terms: a nominal component independent of the runout and a perturbation component induced by the runout. The instantaneous value of the nominal component is used to calibrate the cutting force coefficients. With the help of the perturbation component and the cutting force coefficients obtained above, the cutter runout is identified. Based on simulation and experimental results, the validity of the identification approach is demonstrated. The advantage of the proposed method lies in that the calibration performed with data from one cutting test under a specific regime can be applied over a great range of cutting conditions.

  3. A forward modeling approach for interpreting impeller flow logs.

    Science.gov (United States)

    Parker, Alison H; West, L Jared; Odling, Noelle E; Bown, Richard T

    2010-01-01

    A rigorous and practical approach for interpretation of impeller flow log data to determine vertical variations in hydraulic conductivity is presented and applied to two well logs from a Chalk aquifer in England. Impeller flow logging involves measuring vertical flow speed in a pumped well and using changes in flow with depth to infer the locations and magnitudes of inflows into the well. However, the measured flow logs are typically noisy, which leads to spurious hydraulic conductivity values where simplistic interpretation approaches are applied. In this study, a new method for interpretation is presented, which first defines a series of physical models for hydraulic conductivity variation with depth and then fits the models to the data, using a regression technique. Some of the models will be rejected as they are physically unrealistic. The best model is then selected from the remaining models using a maximum likelihood approach. This balances model complexity against fit, for example, using Akaike's Information Criterion.
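
    As a hedged illustration of the selection step described above, the sketch below fits several candidate models (plain polynomials standing in for the physically motivated conductivity profiles) to a synthetic flow log and ranks them with Akaike's Information Criterion; the data and the candidate set are hypothetical.

```python
# Model selection sketch: candidate fits to a synthetic flow log are ranked by
# AIC; plain polynomials stand in for the physically motivated conductivity
# models of the paper, and all data values are invented.
import numpy as np

rng = np.random.default_rng(3)
depth = np.linspace(0, 50, 100)                        # depth below pump, m
true_flow = 10.0 - 0.15 * depth - 0.002 * depth**2     # flow decreasing with depth
flow = true_flow + rng.normal(0, 0.4, depth.size)      # noisy impeller readings

def aic(y, y_hat, n_params):
    n = y.size
    rss = np.sum((y - y_hat) ** 2)
    return n * np.log(rss / n) + 2 * n_params

scores = {}
for degree in range(1, 6):
    coeffs = np.polyfit(depth, flow, degree)
    scores[degree] = aic(flow, np.polyval(coeffs, depth), n_params=degree + 1)

best = min(scores, key=scores.get)
print({d: round(a, 1) for d, a in scores.items()})
print("AIC-preferred polynomial degree:", best)
```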

  4. An Adaptive Approach to Schema Classification for Data Warehouse Modeling

    Institute of Scientific and Technical Information of China (English)

    Hong-Ding Wang; Yun-Hai Tong; Shao-Hua Tan; Shi-Wei Tang; Dong-Qing Yang; Guo-Hui Sun

    2007-01-01

    Data warehouse (DW) modeling is a complicated task, involving both knowledge of business processes and familiarity with operational information systems structure and behavior. Existing DW modeling techniques suffer from the following major drawbacks: the data-driven approach requires high levels of expertise and neglects the requirements of end users, while the demand-driven approach lacks an enterprise-wide vision and disregards existing models of the underlying operational systems. In order to make up for those shortcomings, a method for the classification of schema elements for DW modeling is proposed in this paper. We first put forward the vector space models for subjects and schema elements, then present an adaptive approach with self-tuning theory to construct context vectors of subjects, and finally classify the source schema elements into different subjects of the DW automatically. Benefiting from the result of the schema element classification, designers can model and construct a DW more easily.

  5. A computational toy model for shallow landslides: Molecular dynamics approach

    Science.gov (United States)

    Martelloni, Gianluca; Bagnoli, Franco; Massaro, Emanuele

    2013-09-01

    The aim of this paper is to propose a 2D computational algorithm for modeling the triggering and propagation of shallow landslides caused by rainfall. We used a molecular dynamics (MD) approach, similar to the discrete element method (DEM), which is suitable for modeling granular material and for observing the trajectory of a single particle, so as to possibly identify its dynamical properties. We consider that the triggering of shallow landslides is caused by the decrease of the static friction along the sliding surface due to water infiltration by rainfall. The triggering is thus governed by the two following conditions: (a) a threshold speed of the particles and (b) a condition on the static friction, between the particles and the slope surface, based on the Mohr-Coulomb failure criterion. The latter static condition is used in the geotechnical model to estimate the possibility of landslide triggering. The interaction force between particles is modeled, in the absence of experimental data, by means of a potential similar to the Lennard-Jones one. Viscosity is also introduced in the model and, for a large range of values of the model's parameters, we observe a characteristic velocity pattern, with acceleration increments, typical of real landslides. The results of the simulations are quite promising: the energy and triggering-time distributions of local avalanches show power-law behaviour, analogous to the observed Gutenberg-Richter and Omori power-law distributions for earthquakes. Finally, it is possible to apply the method of the inverse surface displacement velocity [4] for predicting the failure time.
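
    The sketch below illustrates two ingredients mentioned above, a Lennard-Jones-like pairwise force and a Mohr-Coulomb triggering check, with cohesion and friction decaying as a crude stand-in for rainfall infiltration; all parameter values are hypothetical and the full MD propagation scheme is not reproduced.

```python
# Illustrative ingredients only: a Lennard-Jones-like pair force and a
# Mohr-Coulomb triggering check with cohesion/friction decaying as a crude
# stand-in for rainfall infiltration; every parameter value is hypothetical.
import numpy as np

epsilon, sigma = 1.0, 1.0          # LJ energy and length scales (assumed)

def lj_force(r):
    """Magnitude of the Lennard-Jones force between two particles at distance r."""
    return 24 * epsilon * (2 * (sigma / r) ** 12 - (sigma / r) ** 6) / r

def mohr_coulomb_stable(normal_stress, shear_stress, cohesion, phi_deg):
    """True while the slope element lies below the Mohr-Coulomb failure envelope."""
    return shear_stress <= cohesion + normal_stress * np.tan(np.radians(phi_deg))

print("LJ force at r = 1.2 sigma:", round(lj_force(1.2 * sigma), 3))

slope_deg, weight = 35.0, 1.0
normal = weight * np.cos(np.radians(slope_deg))
shear = weight * np.sin(np.radians(slope_deg))
for t in range(0, 11):                       # arbitrary time units of "rainfall"
    cohesion = 0.4 * np.exp(-0.3 * t)        # cohesion decays with infiltration
    phi = 30.0 - 1.0 * t                     # friction angle decays as well
    if not mohr_coulomb_stable(normal, shear, cohesion, phi):
        print(f"failure triggered at t = {t}")
        break
```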

  6. A Networks Approach to Modeling Enzymatic Reactions.

    Science.gov (United States)

    Imhof, P

    2016-01-01

    Modeling enzymatic reactions is a demanding task due to the complexity of the system, the many degrees of freedom involved and the complex, chemical, and conformational transitions associated with the reaction. Consequently, enzymatic reactions are not determined by precisely one reaction pathway. Hence, it is beneficial to obtain a comprehensive picture of possible reaction paths and competing mechanisms. By combining individually generated intermediate states and chemical transition steps a network of such pathways can be constructed. Transition networks are a discretized representation of a potential energy landscape consisting of a multitude of reaction pathways connecting the end states of the reaction. The graph structure of the network allows an easy identification of the energetically most favorable pathways as well as a number of alternative routes.
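
    As a hedged illustration of the transition-network idea, the sketch below encodes hypothetical intermediates and barrier heights as a weighted directed graph and queries it for favorable reactant-to-product routes; the states and energies are invented for illustration.

```python
# Illustrative transition network: hypothetical intermediates as nodes, barrier
# heights (kcal/mol, invented) as edge weights; graph search picks favorable routes.
import networkx as nx

G = nx.DiGraph()
steps = [
    ("reactant", "intermediate_A", 12.0),
    ("reactant", "intermediate_B", 18.0),
    ("intermediate_A", "intermediate_C", 9.0),
    ("intermediate_B", "product", 6.0),
    ("intermediate_C", "product", 14.0),
]
for u, v, barrier in steps:
    G.add_edge(u, v, weight=barrier)

# One simple criterion: the route with the lowest summed barriers.
path = nx.shortest_path(G, "reactant", "product", weight="weight")
total = nx.shortest_path_length(G, "reactant", "product", weight="weight")
print("lowest summed-barrier route:", " -> ".join(path), f"({total} kcal/mol)")

# Alternative routes can be enumerated and compared as competing mechanisms.
for p in nx.all_simple_paths(G, "reactant", "product"):
    print("candidate route:", " -> ".join(p))
```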

  7. Modeling Approaches for Describing Microbial Population Heterogeneity

    DEFF Research Database (Denmark)

    Lencastre Fernandes, Rita

    Although microbial populations are typically described by averaged properties, individual cells present a certain degree of variability. Indeed, initially clonal microbial populations develop into heterogeneous populations, even when growing in a homogeneous environment. A heterogeneous microbial... ... to predict distributions of certain population properties including particle size, mass or volume, and molecular weight. Similarly, PBM allow for a mathematical description of distributed cell properties within microbial populations. Cell total protein content distributions (a measure of cell mass) have been... ... ethanol and biomass throughout the reactor. This work has proven that the integration of CFD and population balance models, for describing the growth of a microbial population in a spatially heterogeneous reactor, is feasible, and that valuable insight on the interplay between flow and the dynamics...

  8. Hamiltonian approach to hybrid plasma models

    CERN Document Server

    Tronci, Cesare

    2010-01-01

    The Hamiltonian structures of several hybrid kinetic-fluid models are identified explicitly, upon considering collisionless Vlasov dynamics for the hot particles interacting with a bulk fluid. After presenting different pressure-coupling schemes for an ordinary fluid interacting with a hot gas, the paper extends the treatment to account for a fluid plasma interacting with an energetic ion species. Both current-coupling and pressure-coupling MHD schemes are treated extensively. In particular, pressure-coupling schemes are shown to require a transport-like term in the Vlasov kinetic equation, in order for the Hamiltonian structure to be preserved. The last part of the paper is devoted to studying the more general case of an energetic ion species interacting with a neutralizing electron background (hybrid Hall-MHD). Circulation laws and Casimir functionals are presented explicitly in each case.

  9. Modeling of phase equilibria with CPA using the homomorph approach

    DEFF Research Database (Denmark)

    Breil, Martin Peter; Tsivintzelis, Ioannis; Kontogeorgis, Georgios

    2011-01-01

    For association models, like CPA and SAFT, a classical approach is often used for estimating pure-compound and mixture parameters. According to this approach, the pure-compound parameters are estimated from vapor pressure and liquid density data. Then, the binary interaction parameters, kij, are ...

  10. A Constructive Neural-Network Approach to Modeling Psychological Development

    Science.gov (United States)

    Shultz, Thomas R.

    2012-01-01

    This article reviews a particular computational modeling approach to the study of psychological development--that of constructive neural networks. This approach is applied to a variety of developmental domains and issues, including Piagetian tasks, shift learning, language acquisition, number comparison, habituation of visual attention, concept…

  12. Modular Modelling and Simulation Approach - Applied to Refrigeration Systems

    DEFF Research Database (Denmark)

    Sørensen, Kresten Kjær; Stoustrup, Jakob

    2008-01-01

    This paper presents an approach to modelling and simulation of the thermal dynamics of a refrigeration system, specifically a reefer container. A modular approach is used and the objective is to increase the speed and flexibility of the developed simulation environment. The refrigeration system...

  13. Pattern-based approach for logical traffic isolation forensic modelling

    CSIR Research Space (South Africa)

    Dlamini, I

    2009-08-01

    Full Text Available The use of design patterns usually changes the approach of software design and makes software development relatively easy. This paper extends work on a forensic model for Logical Traffic Isolation (LTI) based on Differentiated Services (Diff...

  14. A semantic-web approach for modeling computing infrastructures

    NARCIS (Netherlands)

    M. Ghijsen; J. van der Ham; P. Grosso; C. Dumitru; H. Zhu; Z. Zhao; C. de Laat

    2013-01-01

    This paper describes our approach to modeling computing infrastructures. Our main contribution is the Infrastructure and Network Description Language (INDL) ontology. The aim of INDL is to provide technology independent descriptions of computing infrastructures, including the physical resources as w

  15. Hybrid Modelling Approach to Prairie hydrology: Fusing Data-driven and Process-based Hydrological Models

    Science.gov (United States)

    Mekonnen, B.; Nazemi, A.; Elshorbagy, A.; Mazurek, K.; Putz, G.

    2012-04-01

    Modeling the hydrological response in prairie regions, characterized by flat and undulating terrain and, thus, large non-contributing areas, is a known challenge. The hydrological response (runoff) is the combination of the traditional runoff from the hydrologically contributing area and the occasional overflow from the non-contributing area. This study provides a unique opportunity to analyze the issue of fusing the Soil and Water Assessment Tool (SWAT) and Artificial Neural Networks (ANNs) in a hybrid structure to model the hydrological response in prairie regions. A hybrid SWAT-ANN model is proposed, where the SWAT component and the ANN module deal with the effective (contributing) area and the non-contributing area, respectively. The hybrid model is applied to the case study of the Moose Jaw watershed, located in southern Saskatchewan, Canada. As an initial exploration, a comparison between the ANN and SWAT models is established based on daily runoff (streamflow) prediction accuracy using multiple error measures. This is done to identify the merits and drawbacks of each modeling approach. It was found that the SWAT model performs better during low flow periods but with degraded efficiency during periods of high flows. The case is different for the ANN model, as ANNs exhibit improved simulation during high flow periods but biased estimates during low flow periods. The modelling results show that the new hybrid SWAT-ANN model is capable of exploiting the strengths of both SWAT and ANN models in an integrated framework, as illustrated in the sketch below. The new hybrid SWAT-ANN model simulates daily runoff quite satisfactorily, with NSE measures of 0.80 and 0.83 during the calibration and validation periods, respectively. Furthermore, an experimental assessment was performed to identify the effects of the ANN training method on the performance of the hybrid model as well as the parametric identifiability. Overall, the results obtained in this study suggest that the fusion
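
    The actual SWAT-ANN coupling for the Moose Jaw watershed is not reproduced here. The sketch below only illustrates the hybrid idea: a toy process-model runoff estimate is corrected by a small neural network trained on the residual, which stands in for the non-contributing-area overflow; all data and parameters are synthetic.

```python
# Conceptual hybrid sketch: a toy "process model" runoff estimate plus a small
# neural-network correction standing in for non-contributing-area overflow.
# Data, the 0.3 runoff coefficient and the 12 mm threshold are all synthetic.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(7)
precip = rng.gamma(shape=1.5, scale=4.0, size=1000)            # daily precipitation, mm
process_runoff = 0.3 * precip                                  # toy process-model output
overflow = np.where(precip > 12.0, 0.5 * (precip - 12.0), 0.0) # threshold-like overflow
observed = process_runoff + overflow + rng.normal(0, 0.3, precip.size)

# Train the ANN on the residual the process model cannot explain.
X = np.column_stack([precip, process_runoff])
residual = observed - process_runoff
ann = MLPRegressor(hidden_layer_sizes=(16,), max_iter=3000, random_state=0)
ann.fit(X, residual)

hybrid = process_runoff + ann.predict(X)
nse = 1 - np.sum((observed - hybrid) ** 2) / np.sum((observed - observed.mean()) ** 2)
print(f"Nash-Sutcliffe efficiency of the hybrid sketch: {nse:.3f}")
```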

  16. Modelling road accidents: An approach using structural time series

    Science.gov (United States)

    Junus, Noor Wahida Md; Ismail, Mohd Tahir

    2014-09-01

    In this paper, the trend of road accidents in Malaysia for the years 2001 until 2012 was modelled using a structural time series approach. The structural time series model was identified using a stepwise method, and the residuals for each model were tested. The best-fitted model was chosen based on the smallest Akaike Information Criterion (AIC) and prediction error variance. In order to check the quality of the model, a data validation procedure was performed by predicting the monthly number of road accidents for the year 2012. Results indicate that the best specification of the structural time series model to represent road accidents is the local level with a seasonal model.
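
    As a hedged illustration of the selected model form, the sketch below fits a local-level-plus-seasonal structural time series to a simulated monthly series with statsmodels and produces a one-year forecast analogous to the validation step; the data are synthetic, not the Malaysian accident counts.

```python
# Illustrative local-level + seasonal structural time series fit on a simulated
# monthly series; a real application would use the observed accident counts.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(11)
n = 144                                              # 12 years of monthly data
level = 500 + np.cumsum(rng.normal(0, 2, n))         # slowly drifting level
season = 30 * np.sin(2 * np.pi * np.arange(n) / 12)  # annual cycle
y = level + season + rng.normal(0, 10, n)

model = sm.tsa.UnobservedComponents(y, level="local level", seasonal=12)
result = model.fit(disp=False)
print("AIC:", round(result.aic, 1))

# One-year-ahead forecast, analogous to validating against the final year.
print(np.round(result.forecast(steps=12), 1))
```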

  17. Bioavailability of particulate metal to zebra mussels: Biodynamic modelling shows that assimilation efficiencies are site-specific

    Energy Technology Data Exchange (ETDEWEB)

    Bourgeault, Adeline, E-mail: bourgeault@ensil.unilim.fr [Cemagref, Unite de Recherche Hydrosystemes et Bioprocedes, 1 rue Pierre-Gilles de Gennes, 92761 Antony (France); FIRE, FR-3020, 4 place Jussieu, 75005 Paris (France); Gourlay-France, Catherine, E-mail: catherine.gourlay@cemagref.fr [Cemagref, Unite de Recherche Hydrosystemes et Bioprocedes, 1 rue Pierre-Gilles de Gennes, 92761 Antony (France); FIRE, FR-3020, 4 place Jussieu, 75005 Paris (France); Priadi, Cindy, E-mail: cindy.priadi@eng.ui.ac.id [LSCE/IPSL CEA-CNRS-UVSQ, Avenue de la Terrasse, 91198 Gif-sur-Yvette (France); Ayrault, Sophie, E-mail: Sophie.Ayrault@lsce.ipsl.fr [LSCE/IPSL CEA-CNRS-UVSQ, Avenue de la Terrasse, 91198 Gif-sur-Yvette (France); Tusseau-Vuillemin, Marie-Helene, E-mail: Marie-helene.tusseau@ifremer.fr [IFREMER Technopolis 40, 155 rue Jean-Jacques Rousseau, 92138 Issy-Les-Moulineaux (France)

    2011-12-15

    This study investigates the ability of the biodynamic model to predict the trophic bioaccumulation of cadmium (Cd), chromium (Cr), copper (Cu), nickel (Ni) and zinc (Zn) in a freshwater bivalve. Zebra mussels were transplanted to three sites along the Seine River (France) and collected monthly for 11 months. Measurements of the metal body burdens in mussels were compared with the predictions from the biodynamic model. The exchangeable fraction of metal particles did not account for the bioavailability of particulate metals, since it did not capture the differences between sites. The assimilation efficiency (AE) parameter is necessary to take into account biotic factors influencing particulate metal bioavailability. The biodynamic model, applied with AEs from the literature, overestimated the measured concentrations in zebra mussels, the extent of overestimation being site-specific. Therefore, an original methodology was proposed for in situ AE measurements for each site and metal. - Highlights: > Exchangeable fraction of metal particles did not account for the bioavailability of particulate metals. > Need for site-specific biodynamic parameters. > Field-determined AE provide a good fit between the biodynamic model predictions and bioaccumulation measurements. - The interpretation of metal bioaccumulation in transplanted zebra mussels with biodynamic modelling highlights the need for site-specific assimilation efficiencies of particulate metals.
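
    The biodynamic model referred to above balances uptake from the dissolved phase and from ingested particles against efflux and growth dilution. The sketch below integrates that balance with hypothetical parameter values (the study's point is precisely that AE must be measured site-specifically), so it is an illustration, not the Seine River calibration.

```python
# Illustrative biodynamic bioaccumulation balance with hypothetical parameters:
# dC/dt = ku*Cw + AE*IR*Cf - (ke + g)*C. Not the Seine River calibration; the
# study's point is that AE itself must be measured site-specifically.
ku = 0.2    # dissolved uptake rate constant, L/g/d (assumed)
Cw = 0.5    # dissolved metal concentration, ug/L (assumed)
AE = 0.3    # assimilation efficiency of particulate metal (assumed)
IR = 0.1    # ingestion rate, g/g/d (assumed)
Cf = 50.0   # metal concentration on ingested particles, ug/g (assumed)
ke = 0.02   # efflux rate constant, 1/d (assumed)
g = 0.005   # growth dilution rate constant, 1/d (assumed)

dt, days = 0.1, 330            # roughly the 11-month transplantation period
C = 0.0                        # initial tissue concentration, ug/g
for _ in range(int(days / dt)):
    C += (ku * Cw + AE * IR * Cf - (ke + g) * C) * dt

C_ss = (ku * Cw + AE * IR * Cf) / (ke + g)   # analytical steady state
print(f"tissue concentration after {days} d: {C:.1f} ug/g (steady state {C_ss:.1f})")
```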

  18. Molecular Modeling Approach to Cardiovascular Disease Targetting

    Directory of Open Access Journals (Sweden)

    Chandra Sekhar Akula,

    2010-05-01

    Full Text Available Cardiovascular disease, including stroke, is the leading cause of illness and death in India. A number of studies have shown that inflammation of blood vessels is one of the major factors that increase the incidence of heart diseases, including arteriosclerosis (clogging of the arteries), stroke and myocardial infarction or heart attack. Studies have associated obesity and other components of the metabolic syndrome, which are cardiovascular risk factors, with low-grade inflammation. Furthermore, some findings suggest that drugs commonly prescribed to lower cholesterol also reduce this inflammation, suggesting an additional beneficial effect of the statins. The recent development of angiotensin II (AngII) receptor antagonists has made it possible to significantly improve the tolerability profile of this group of drugs while maintaining high clinical efficacy. ACE2 is expressed predominantly in the endothelium and in renal tubular epithelium, and it thus may be an important new cardiovascular target. In the present study we modeled the structure of ACE and designed an inhibitor using ARGUS lab, and the validation of the drug molecule was done based on QSAR properties and Cache for this protein through CADD.

  19. Virtuous organization: A structural equation modeling approach

    Directory of Open Access Journals (Sweden)

    Majid Zamahani

    2013-02-01

    Full Text Available For years, the idea of virtue was viewed unfavorably among researchers: virtues were traditionally considered culture-specific and relativistic, and they were supposed to be associated with social conservatism, religious or moral dogmatism, and scientific irrelevance. Virtue and virtuousness have recently been taken seriously among organizational researchers. The proposed study of this paper examines the relationships between leadership, organizational culture, human resources, structure and processes, care for community and the virtuous organization. Structural equation modeling is employed to investigate the effects of each variable on the other components. The data used in this study consist of questionnaire responses from employees of Payam e Noor University in Yazd province. A total of 250 questionnaires were sent out and a total of 211 valid responses were received. Our results reveal that all five variables have positive and significant impacts on the virtuous organization. Among the five variables, organizational culture has the largest direct impact (0.80) and human resources have the largest total impact (0.844) on the virtuous organization.

  20. Modeling of movement-related potentials using a fractal approach.

    Science.gov (United States)

    Uşakli, Ali Bülent

    2010-06-01

    In bio-signal applications, classification performance depends greatly on feature extraction, which is also the case for electroencephalogram (EEG) based applications. Feature extraction, and consequently classification of EEG signals, is not an easy task due to their inherent low signal-to-noise ratios and artifacts. EEG signals can be treated as the output of a non-linear dynamical (chaotic) system in the human brain and therefore they can be modeled by their dimension values. In this study, the variance fractal dimension technique is suggested for the modeling of movement-related potentials (MRPs). The experimental data sets consist of EEG signals recorded during movements of right foot up, lip pursing and a simultaneous execution of these two tasks. The experimental results and performance tests show that the proposed modeling method can efficiently be applied to MRPs, especially in binary-approach brain-computer interface applications aiming to assist severely disabled people, such as amyotrophic lateral sclerosis patients, in communication and/or controlling devices.
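
    As a hedged illustration of the variance fractal dimension idea, the sketch below estimates the Hurst exponent from the scaling of increment variance with lag on a synthetic Brownian-like signal and converts it to a fractal dimension via D = 2 - H; real MRP data and the study's exact windowing are not used.

```python
# Illustrative variance fractal dimension estimate on a synthetic signal:
# Var[x(t+dt) - x(t)] ~ dt^(2H) and D = 2 - H for a 1-D time series. Real MRP
# data and the study's exact windowing are not reproduced here.
import numpy as np

rng = np.random.default_rng(5)
x = np.cumsum(rng.standard_normal(5000))      # Brownian-like test signal (H ~ 0.5)

lags = np.array([1, 2, 4, 8, 16, 32, 64])
log_var = [np.log(np.var(x[lag:] - x[:-lag])) for lag in lags]

# Hurst exponent from the slope of log-variance vs log-lag; then D = 2 - H.
slope, _ = np.polyfit(np.log(lags), log_var, 1)
H = slope / 2.0
D = 2.0 - H
print(f"estimated Hurst exponent H = {H:.2f}, variance fractal dimension D = {D:.2f}")
```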

  1. Masked areas in shear peak statistics. A forward modeling approach

    Energy Technology Data Exchange (ETDEWEB)

    Bard, D.; Kratochvil, J. M.; Dawson, W.

    2016-03-09

    The statistics of shear peaks have been shown to provide valuable cosmological information beyond the power spectrum, and will be an important constraint of models of cosmology in forthcoming astronomical surveys. Surveys include masked areas due to bright stars, bad pixels etc., which must be accounted for in producing constraints on cosmology from shear maps. We advocate a forward-modeling approach, where the impacts of masking and other survey artifacts are accounted for in the theoretical prediction of cosmological parameters, rather than correcting survey data to remove them. We use masks based on the Deep Lens Survey, and explore the impact of up to 37% of the survey area being masked on LSST and DES-scale surveys. By reconstructing maps of aperture mass the masking effect is smoothed out, resulting in up to 14% smaller statistical uncertainties compared to simply reducing the survey area by the masked area. We show that, even in the presence of large survey masks, the bias in cosmological parameter estimation produced in the forward-modeling process is ≈1%, dominated by bias caused by limited simulation volume. We also explore how this potential bias scales with survey area and evaluate how much small survey areas are impacted by the differences in cosmological structure in the data and simulated volumes, due to cosmic variance.

  2. Experimental results showing the internal three-component velocity field and outlet temperature contours for a model gas turbine combustor

    CSIR Research Space (South Africa)

    Meyers, BC

    2011-09-01

    Full Text Available ,2]. These inconsistencies are especially great when combustion is simulated while there are already flow inconsistencies after modeling the flow in cold-flow simulations. To enable the improvement of CFD modeling and techniques, a CFD test case has been created to aid.... [7], attempts have to be made to ensure that as many as possible of the factors that influence the combustor flow are included in the tests. The combustor in which these experiments were performed is a full, non-premixed, cylindrical, can-type combustor...

  3. Data Analysis A Model Comparison Approach, Second Edition

    CERN Document Server

    Judd, Charles M; Ryan, Carey S

    2008-01-01

    This completely rewritten classic text features many new examples, insights and topics including mediational, categorical, and multilevel models. Substantially reorganized, this edition provides a briefer, more streamlined examination of data analysis. Noted for its model-comparison approach and unified framework based on the general linear model, the book provides readers with a greater understanding of a variety of statistical procedures. This consistent framework, including consistent vocabulary and notation, is used throughout to develop fewer but more powerful model building techniques. T

  4. A test of the intergenerational conflict model in Indonesia shows no evidence of earlier menopause in female-dispersing groups.

    Science.gov (United States)

    Snopkowski, Kristin; Moya, Cristina; Sear, Rebecca

    2014-08-07

    Menopause remains an evolutionary puzzle, as humans are unique among primates in having a long post-fertile lifespan. One model proposes that intergenerational conflict in patrilocal populations favours female reproductive cessation. This model predicts that women should experience menopause earlier in groups with an evolutionary history of patrilocality compared with matrilocal groups. Using data from the Indonesia Family Life Survey, we test this model at multiple timescales: deep historical time, comparing age at menopause in ancestrally patrilocal Chinese Indonesians with ancestrally matrilocal Austronesian Indonesians; more recent historical time, comparing age at menopause in ethnic groups with differing postmarital residence within Indonesia and finally, analysing age at menopause at an individual-level, assuming a woman facultatively adjusts her age at menopause based on her postmarital residence. We find a significant effect only at the intermediate timescale where, contrary to predictions, ethnic groups with a history of multilocal postnuptial residence (where couples choose where to live) have the slowest progression to menopause, whereas matrilocal and patrilocal ethnic groups have similar progression rates. Multilocal residence may reduce intergenerational conflicts between women, thus influencing reproductive behaviour, but our results provide no support for the female-dispersal model of intergenerational conflict as an explanation of menopause.

  5. Demonstrating Chirality: Using a Mirror with Physical Models To Show Non-superimposability of Chiral Molecules with Their Mirror Images.

    Science.gov (United States)

    Collins, Michael J.

    2001-01-01

    Presents a remarkable demonstration on chirality in molecules and the existence of enantiomers, also known as non-superimposable mirror images. Uses a mirror, a physical model of a molecule, and a bit of trickery involving the non-superimposable mirror image. (Author/NB)

  6. Assessing the Rothstein Test: Does It Really Show Teacher Value-Added Models Are Biased? Working Paper 5

    Science.gov (United States)

    Goldhaber, Dan; Chaplin, Duncan

    2012-01-01

    In a provocative and influential paper, Jesse Rothstein (2010) finds that standard value-added models (VAMs) suggest implausible future teacher effects on past student achievement, a finding that obviously cannot be viewed as causal. This is the basis of a falsification test (the Rothstein falsification test) that appears to indicate bias in VAM…

  7. Assessing the "Rothstein Test": Does It Really Show Teacher Value-Added Models Are Biased? Working Paper 71

    Science.gov (United States)

    Goldhaber, Dan; Chaplin, Duncan

    2012-01-01

    In a provocative and influential paper, Jesse Rothstein (2010) finds that standard value-added models (VAMs) suggest implausible future teacher effects on past student achievement, a finding that obviously cannot be viewed as causal. This is the basis of a falsification test (the Rothstein falsification test) that appears to indicate bias in VAM…

  8. The Kallikrein Inhibitor from Bauhinia bauhinioides (BbKI) shows antithrombotic properties in venous and arterial thrombosis models.

    Science.gov (United States)

    Brito, Marlon V; de Oliveira, Cleide; Salu, Bruno R; Andrade, Sonia A; Malloy, Paula M D; Sato, Ana C; Vicente, Cristina P; Sampaio, Misako U; Maffei, Francisco H A; Oliva, Maria Luiza V

    2014-05-01

    The Bauhinia bauhinioides Kallikrein Inhibitor (BbKI) is a Kunitz-type serine peptidase inhibitor of plant origin that has been shown to impair the viability of some tumor cells and to feature a potent inhibitory activity against human and rat plasma kallikrein (Kiapp 2.4 nmol/L and 5.2 nmol/L, respectively). This inhibitory activity is possibly responsible for an effect on hemostasis by prolonging the activated partial thromboplastin time (aPTT). Because the association between cancer and thrombosis is well established, we evaluated the possible antithrombotic activity of this protein in venous and arterial thrombosis models. Venous thrombosis was studied in the vena cava ligature model in Wistar rats, and arterial thrombosis in the photochemically induced endothelial lesion model in the carotid artery of C57BL/6 mice. BbKI at a dose of 2.0 mg/kg reduced the venous thrombus weight by 65% in treated rats in comparison to rats in the control group. The inhibitor prolonged the time to total artery occlusion in the carotid artery model in mice, indicating that this potent plasma kallikrein inhibitor prevented thrombosis.

  10. Modelling and Generating Ajax Applications: A Model-Driven Approach

    NARCIS (Netherlands)

    Gharavi, V.; Mesbah, A.; Van Deursen, A.

    2008-01-01

    Preprint of paper published in: IWWOST 2008 - 7th International Workshop on Web-Oriented Software Technologies, 14-15 July 2008. AJAX is a promising and rapidly evolving approach for building highly interactive web applications. In AJAX, user interface components and the event-based interaction between…

  12. A novel approach to modeling and diagnosing the cardiovascular system

    Energy Technology Data Exchange (ETDEWEB)

    Keller, P.E.; Kangas, L.J.; Hashem, S.; Kouzes, R.T. [Pacific Northwest Lab., Richland, WA (United States); Allen, P.A. [Life Link, Richland, WA (United States)]

    1995-07-01

    A novel approach to modeling and diagnosing the cardiovascular system is introduced. A model exhibits a subset of the dynamics of the cardiovascular behavior of an individual by using a recurrent artificial neural network. Potentially, a model will be incorporated into a cardiovascular diagnostic system. This approach is unique in that each cardiovascular model is developed from physiological measurements of an individual. Any differences between the modeled variables and the variables of an individual at a given time are used for diagnosis. This approach also exploits sensor fusion to optimize the utilization of biomedical sensors. The advantage of sensor fusion has been demonstrated in applications including control and diagnostics of mechanical and chemical processes.

  13. A dynamic appearance descriptor approach to facial actions temporal modeling.

    Science.gov (United States)

    Jiang, Bihan; Valstar, Michel; Martinez, Brais; Pantic, Maja

    2014-02-01

    Both the configuration and the dynamics of facial expressions are crucial for the interpretation of human facial behavior. Yet to date, the vast majority of reported efforts in the field either do not take the dynamics of facial expressions into account, or focus only on prototypic facial expressions of six basic emotions. Facial dynamics can be explicitly analyzed by detecting the constituent temporal segments in Facial Action Coding System (FACS) Action Units (AUs): onset, apex, and offset. In this paper, we present a novel approach to explicit analysis of temporal dynamics of facial actions using the dynamic appearance descriptor Local Phase Quantization from Three Orthogonal Planes (LPQ-TOP). Temporal segments are detected by combining a discriminative classifier for detecting the temporal segments on a frame-by-frame basis with Markov Models that enforce temporal consistency over the whole episode. The system is evaluated in detail in database-dependent experiments on the MMI facial expression database, the UNBC-McMaster pain database, the SAL database and the GEMEP-FERA dataset, and in cross-database experiments using the Cohn-Kanade and SEMAINE databases. The comparison with other state-of-the-art methods shows that the proposed LPQ-TOP method outperforms the other approaches for the problem of AU temporal segment detection, and that overall AU activation detection benefits from dynamic appearance information.
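
    The combination of a frame-level classifier with a Markov model that enforces temporal consistency, as described above, can be illustrated with a plain Viterbi decoder over the four temporal segments (neutral, onset, apex, offset). The per-frame scores and the transition matrix below are hypothetical placeholders, not values from the paper; a minimal Python sketch:

```python
import numpy as np

# Hypothetical per-frame class scores (rows: frames; cols: neutral, onset, apex, offset),
# e.g. probabilities produced by a frame-level classifier using LPQ-TOP features.
frame_probs = np.array([
    [0.7, 0.2, 0.05, 0.05],
    [0.4, 0.5, 0.05, 0.05],
    [0.2, 0.6, 0.15, 0.05],
    [0.1, 0.3, 0.55, 0.05],
    [0.1, 0.1, 0.70, 0.10],
    [0.1, 0.1, 0.40, 0.40],
    [0.2, 0.1, 0.10, 0.60],
    [0.6, 0.1, 0.10, 0.20],
])

# Transition matrix enforcing the natural order neutral -> onset -> apex -> offset -> neutral.
# Values are illustrative; in practice they would be estimated from annotated episodes.
A = np.array([
    [0.8, 0.2, 0.0, 0.0],   # neutral
    [0.0, 0.7, 0.3, 0.0],   # onset
    [0.0, 0.0, 0.7, 0.3],   # apex
    [0.2, 0.0, 0.0, 0.8],   # offset
])
labels = ["neutral", "onset", "apex", "offset"]

def viterbi(frame_probs, A, prior=None):
    """Return the most likely segment sequence given frame scores and transitions."""
    n_frames, n_states = frame_probs.shape
    prior = np.full(n_states, 1.0 / n_states) if prior is None else prior
    log_p, log_A = np.log(frame_probs + 1e-12), np.log(A + 1e-12)
    delta = np.log(prior + 1e-12) + log_p[0]
    backptr = np.zeros((n_frames, n_states), dtype=int)
    for t in range(1, n_frames):
        scores = delta[:, None] + log_A        # score of moving from state i to state j
        backptr[t] = scores.argmax(axis=0)
        delta = scores.max(axis=0) + log_p[t]
    path = [int(delta.argmax())]
    for t in range(n_frames - 1, 0, -1):
        path.append(int(backptr[t, path[-1]]))
    return [labels[s] for s in reversed(path)]

print(viterbi(frame_probs, A))
```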

  14. A Mouse Model of Hyperproliferative Human Epithelium Validated by Keratin Profiling Shows an Aberrant Cytoskeletal Response to Injury

    Directory of Open Access Journals (Sweden)

    Samal Zhussupbekova

    2016-07-01

    A validated animal model would assist with research on the immunological consequences of the chronic expression of stress keratins KRT6, KRT16, and KRT17, as observed in human pre-malignant hyperproliferative epithelium. Here we examine the keratin gene expression profile in skin from mice expressing the E7 oncoprotein of HPV16 (K14E7), which demonstrate persistently hyperproliferative epithelium, in nontransgenic mouse skin, and in hyperproliferative actinic keratosis lesions from human skin. We demonstrate that K14E7 mouse skin overexpresses stress keratins in a similar manner to human actinic keratoses, that overexpression is a consequence of epithelial hyperproliferation induced by E7, and that overexpression further increases in response to injury. As stress keratins modify local immunity and epithelial cell function and differentiation, the K14E7 mouse model should permit study of how continued overexpression of stress keratins impacts on epithelial tumor development and on local innate and adaptive immunity.

  15. Mathematical models for therapeutic approaches to control HIV disease transmission

    CERN Document Server

    Roy, Priti Kumar

    2015-01-01

    The book discusses different therapeutic approaches, based on different mathematical models, to control HIV/AIDS disease transmission. It uses clinical data, collected from different cited sources, to formulate deterministic as well as stochastic mathematical models of HIV/AIDS. It provides complementary approaches, from deterministic and stochastic points of view, to optimal control strategies with perfect drug adherence, and seeks viewpoints on the same issue from different angles, from various mathematical models to computer simulations. The book presents essential methods and techniques for students who are interested in designing epidemiological models of HIV/AIDS. It also guides research scientists working in the periphery of mathematical modeling and helps them to explore a hypothetical method by examining its consequences in the form of a mathematical model and making scientific predictions. The model equations, mathematical analysis and several numerical simulations that are...

  16. Asteroid modeling for testing spacecraft approach and landing.

    Science.gov (United States)

    Martin, Iain; Parkes, Steve; Dunstan, Martin; Rowell, Nick

    2014-01-01

    Spacecraft exploration of asteroids presents autonomous-navigation challenges that can be aided by virtual models to test and develop guidance and hazard-avoidance systems. Researchers have extended and applied graphics techniques to create high-resolution asteroid models to simulate cameras and other spacecraft sensors approaching and descending toward asteroids. A scalable model structure with evenly spaced vertices simplifies terrain modeling, avoids distortion at the poles, and enables triangle-strip definition for efficient rendering. To create the base asteroid models, this approach uses two-phase Poisson faulting and Perlin noise. It creates realistic asteroid surfaces by adding both crater models adapted from lunar terrain simulation and multiresolution boulders. The researchers evaluated the virtual asteroids by comparing them with real asteroid images, examining the slope distributions, and applying a surface-relative feature-tracking algorithm to the models.
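
    The random-faulting idea behind such base shapes can be sketched as follows; this is a simplified stand-in for the authors' two-phase Poisson faulting and Perlin-noise pipeline, and the grid size, fault count and step size are arbitrary. Each randomly oriented plane through the centre raises the radius on one side and lowers it on the other, producing an irregular, asteroid-like body onto which craters and boulders could later be added.

```python
import numpy as np

rng = np.random.default_rng(42)

def unit_sphere_grid(n_lat=64, n_lon=128):
    """Evenly spaced latitude/longitude vertices on a unit sphere."""
    lat = np.linspace(-np.pi / 2, np.pi / 2, n_lat)
    lon = np.linspace(0.0, 2.0 * np.pi, n_lon, endpoint=False)
    lon, lat = np.meshgrid(lon, lat)
    return np.stack([np.cos(lat) * np.cos(lon),
                     np.cos(lat) * np.sin(lon),
                     np.sin(lat)], axis=-1)

def fault_terrain(xyz, n_faults=400, step=0.003):
    """Displace the radius with randomly oriented 'faults' (planes through the origin)."""
    radius = np.ones(xyz.shape[:2])
    for _ in range(n_faults):
        normal = rng.normal(size=3)
        normal /= np.linalg.norm(normal)
        side = np.sign(xyz @ normal)        # +1 on one side of the fault plane, -1 on the other
        radius += step * side
    return radius

vertices = unit_sphere_grid()
radius = fault_terrain(vertices)
shaped = vertices * radius[..., None]       # irregular, asteroid-like surface
print("radius range:", radius.min(), radius.max())
```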

  17. A model-driven approach to information security compliance

    Science.gov (United States)

    Correia, Anacleto; Gonçalves, António; Teodoro, M. Filomena

    2017-06-01

    The availability, integrity and confidentiality of information are fundamental to the long-term survival of any organization. Information security is a complex issue that must be holistically approached, combining assets that support corporate systems, in an extended network of business partners, vendors, customers and other stakeholders. This paper addresses the conception and implementation of information security systems, conforming to the ISO/IEC 27000 set of standards, using the model-driven approach. The process begins with the conception of a domain level model (computation independent model) based on the information security vocabulary present in the ISO/IEC 27001 standard. Based on this model, and after embedding in it the mandatory rules for attaining ISO/IEC 27001 conformance, a platform independent model is derived. Finally, a platform specific model serves as the basis for testing the compliance of information security systems with the ISO/IEC 27000 set of standards.

  18. Heuristic approaches to models and modeling in systems biology

    NARCIS (Netherlands)

    MacLeod, Miles

    2016-01-01

    Prediction and control sufficient for reliable medical and other interventions are prominent aims of modeling in systems biology. The short-term attainment of these goals has played a strong role in projecting the importance and value of the field. In this paper I identify the standard models must m…

  19. A Model Management Approach for Co-Simulation Model Evaluation

    NARCIS (Netherlands)

    Zhang, X.C.; Broenink, Johannes F.; Filipe, Joaquim; Kacprzyk, Janusz; Pina, Nuno

    2011-01-01

    Simulating formal models is a common means for validating the correctness of the system design and reducing the time-to-market. In most embedded control system designs, multiple engineering disciplines and various domain-specific models are often involved, such as mechanical, control, software…

  20. LEXICAL APPROACH IN TEACHING TURKISH: A COLLOCATIONAL STUDY MODEL

    Directory of Open Access Journals (Sweden)

    Eser ÖRDEM

    2013-06-01

    This study intends to propose the Lexical Approach (Lewis, 1998, 2002; Harwood, 2002) and a model for teaching Turkish as a foreign language so that this model can be used in classroom settings. This model was created by the researcher as a result of studies carried out in applied linguistics (Hill, 2000) and memory (Murphy, 2004). Since one of the main problems of foreign language learners is to retrieve what they have learnt, Lewis (1998) and Wray (2008) assume that the lexical approach is an alternative explanation to solve this problem. Unlike the grammar translation method, this approach supports the idea that language is not composed of general grammar but of strings of words and word combinations. In addition, the lexical approach posits the idea that each word has its own grammatical properties, and therefore each dictionary is a potential grammar book. Foreign language learners can learn to use collocations, a basic principle of the Lexical Approach. Thus, learners can increase their level of retention. The concept of the retrieval clue (Murphy, 2004) is considered the main element in this collocational study model because the main purpose of this model is to boost fluency and help learners gain native-like accuracy while producing the target language. Keywords: Foreign language teaching, lexical approach, collocations, retrieval clue

  1. Optimising GPR modelling: A practical, multi-threaded approach to 3D FDTD numerical modelling

    Science.gov (United States)

    Millington, T. M.; Cassidy, N. J.

    2010-09-01

    The demand for advanced interpretational tools has led to the development of highly sophisticated, computationally demanding, 3D GPR processing and modelling techniques. Many of these methods solve very large problems with stepwise methods that utilise numerically similar functions within iterative computational loops. Problems of this nature are readily parallelised by splitting the computational domain into smaller, independent chunks for direct use on cluster-style, multi-processor supercomputers. Unfortunately, the implications of running such facilities, as well as the time investment needed to develop the parallel codes, mean that, for most researchers, the use of these advanced methods is impractical. In this paper, we propose an alternative method of parallelisation which exploits the capabilities of the modern multi-core processors (upon which today's desktop PCs are built) by multi-threading the calculation of a problem's individual sub-solutions. To illustrate the approach, we have applied it to an advanced, 3D, finite-difference time-domain (FDTD) GPR modelling tool in which the calculation of the individual vector field components is multi-threaded. To be of practical use, the FDTD scheme must be able to deliver accurate results with short execution times and we, therefore, show that the performance benefits of our approach can deliver runtimes less than half those of the more conventional, serial programming techniques. We evaluate implementations of the technique using different programming languages (e.g., Matlab, Java, C++), which will facilitate the construction of a flexible modelling tool for use in future GPR research. The implementations are compared on a variety of typical hardware platforms, having between one and eight processing cores available, and also a modern Graphical Processing Unit (GPU)-based computer. Our results show that a multi-threaded xyz modelling approach is easy to implement and delivers excellent results when implemented…
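
    The threading pattern described above can be sketched with a heavily simplified, free-space 3D Yee update in which the three electric-field (and then magnetic-field) components are dispatched to separate threads; NumPy releases the GIL during array arithmetic, so the component updates can genuinely overlap. The grid size, time step and soft source are placeholders, and boundaries and materials are omitted entirely, so this is only an illustration of the pattern, not the authors' GPR solver.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

# Hypothetical free-space grid; a real GPR tool adds sources, materials and absorbing boundaries.
nx = ny = nz = 60
c0, dx = 3e8, 0.01
dt = dx / (2.0 * c0)                                  # comfortably below the 3D Courant limit
ce, ch = dt / (8.854e-12 * dx), dt / (4e-7 * np.pi * dx)

Ex, Ey, Ez = (np.zeros((nx, ny, nz)) for _ in range(3))
Hx, Hy, Hz = (np.zeros((nx, ny, nz)) for _ in range(3))

# Interior-point Yee updates; each function touches only its own component, so the three
# updates of a given field type are independent and can be run in parallel threads.
def update_ex():
    Ex[:, 1:, 1:] += ce * ((Hz[:, 1:, 1:] - Hz[:, :-1, 1:]) - (Hy[:, 1:, 1:] - Hy[:, 1:, :-1]))

def update_ey():
    Ey[1:, :, 1:] += ce * ((Hx[1:, :, 1:] - Hx[1:, :, :-1]) - (Hz[1:, :, 1:] - Hz[:-1, :, 1:]))

def update_ez():
    Ez[1:, 1:, :] += ce * ((Hy[1:, 1:, :] - Hy[:-1, 1:, :]) - (Hx[1:, 1:, :] - Hx[1:, :-1, :]))

def update_hx():
    Hx[:, :-1, :-1] -= ch * ((Ez[:, 1:, :-1] - Ez[:, :-1, :-1]) - (Ey[:, :-1, 1:] - Ey[:, :-1, :-1]))

def update_hy():
    Hy[:-1, :, :-1] -= ch * ((Ex[:-1, :, 1:] - Ex[:-1, :, :-1]) - (Ez[1:, :, :-1] - Ez[:-1, :, :-1]))

def update_hz():
    Hz[:-1, :-1, :] -= ch * ((Ey[1:, :-1, :] - Ey[:-1, :-1, :]) - (Ex[:-1, 1:, :] - Ex[:-1, :-1, :]))

with ThreadPoolExecutor(max_workers=3) as pool:
    for step in range(200):
        Ez[nx // 2, ny // 2, nz // 2] += np.sin(2 * np.pi * step / 40.0)   # soft point source
        list(pool.map(lambda f: f(), (update_ex, update_ey, update_ez)))   # E components in parallel
        list(pool.map(lambda f: f(), (update_hx, update_hy, update_hz)))   # H components in parallel
```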

  2. A Model-Driven Approach for Telecommunications Network Services Definition

    Science.gov (United States)

    Chiprianov, Vanea; Kermarrec, Yvon; Alff, Patrick D.

    The present-day telecommunications market imposes a short concept-to-market time on service providers. To reduce it, we propose a computer-aided, model-driven, service-specific tool, with support for collaborative work and for checking properties on models. We started by defining a prototype of the Meta-model (MM) of the service domain. Using this prototype, we defined a simple graphical modeling language specific for service designers. We are currently enlarging the MM of the domain using model transformations from Network Abstractions Layers (NALs). In the future, we will investigate approaches to ensure the support for collaborative work and for checking properties on models.

  3. Box-wing model approach for solar radiation pressure modelling in a multi-GNSS scenario

    Science.gov (United States)

    Tobias, Guillermo; Jesús García, Adrián

    2016-04-01

    The solar radiation pressure force is the largest orbital perturbation after the gravitational effects and the major error source affecting GNSS satellites. A wide range of approaches have been developed over the years for the modelling of this non-gravitational effect as part of the orbit determination process. These approaches are commonly divided into empirical, semi-analytical and analytical, where their main difference lies in the amount of a-priori physical information about the properties of the satellites (materials and geometry) and their attitude. It has been shown in the past that the pre-launch analytical models fail to achieve the desired accuracy, mainly due to difficulties in the extrapolation of the in-orbit optical and thermal properties, the perturbations in the nominal attitude law and the aging of the satellite's surfaces, whereas empirical models' accuracies strongly depend on the amount of tracking data used for deriving the models, and their performance is reduced as the area-to-mass ratio of the GNSS satellites increases, as happens for upcoming constellations such as BeiDou and Galileo. This paper proposes to use a basic box-wing model for Galileo, complemented with empirical parameters, based on the limited available information about the Galileo satellite's geometry. The satellite is modelled as a box, representing the satellite bus, and a wing, representing the solar panel. The performance of the model is assessed for the GPS, GLONASS and Galileo constellations. The results of the proposed approach have been analyzed over a one-year period. In order to assess the results, two different SRP models have been used: firstly, the proposed box-wing model and, secondly, the new CODE empirical model, ECOM2. The orbit performances of both models are assessed using Satellite Laser Ranging (SLR) measurements, together with the evaluation of the orbit prediction accuracy. This comparison shows the advantages and disadvantages of…
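
    The flat-plate force expression that a box-wing model sums over the bus faces and the Sun-tracking panel can be sketched as below, using a common flat-plate formulation; the areas, optical coefficients and mass are placeholder values rather than Galileo parameters.

```python
import numpy as np

SOLAR_FLUX = 1361.0       # W/m^2 at 1 AU
C_LIGHT = 299792458.0     # m/s

def plate_srp_force(area, normal, sun_dir, spec, diff):
    """Radiation-pressure force on a flat plate (common flat-plate formulation).

    sun_dir: unit vector from the satellite towards the Sun.
    spec, diff: specular and diffuse reflectivities (absorption = 1 - spec - diff).
    """
    normal = normal / np.linalg.norm(normal)
    cos_t = float(np.dot(normal, sun_dir))
    if cos_t <= 0.0:                      # face not illuminated
        return np.zeros(3)
    absorb = 1.0 - spec - diff
    p = SOLAR_FLUX / C_LIGHT
    return -p * area * cos_t * ((absorb + diff) * sun_dir
                                + (2.0 * spec * cos_t + 2.0 * diff / 3.0) * normal)

def box_wing_accel(sun_dir, mass=700.0):
    """Sum the plate forces over a hypothetical box bus plus a Sun-pointing panel."""
    sun_dir = sun_dir / np.linalg.norm(sun_dir)
    surfaces = [              # (area m^2, body-frame normal, specular, diffuse): placeholder values
        (1.5, np.array([+1.0, 0.0, 0.0]), 0.2, 0.1),
        (1.5, np.array([-1.0, 0.0, 0.0]), 0.2, 0.1),
        (2.0, np.array([0.0, +1.0, 0.0]), 0.2, 0.1),
        (2.0, np.array([0.0, -1.0, 0.0]), 0.2, 0.1),
        (3.0, np.array([0.0, 0.0, +1.0]), 0.2, 0.1),
        (3.0, np.array([0.0, 0.0, -1.0]), 0.2, 0.1),
        (10.0, sun_dir, 0.1, 0.2),        # solar panel assumed to track the Sun
    ]
    force = sum(plate_srp_force(a, n, sun_dir, s, d) for a, n, s, d in surfaces)
    return force / mass                   # m/s^2

print(box_wing_accel(np.array([0.8, 0.5, 0.3])))
```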

  4. Stakeholder approach, Stakeholders mental model: A visualization test with cognitive mapping technique

    Directory of Open Access Journals (Sweden)

    Garoui Nassreddine

    2012-04-01

    The aim of this paper is to determine the mental models of actors in the firm with respect to the stakeholder approach to corporate governance. Cognitive maps are used to visualize these mental models and to show the ways of thinking about, and the conceptualization of, the stakeholder approach. The paper takes a corporate governance perspective, discusses the stakeholder model, and applies a cognitive mapping technique.

  5. Child human model development: a hybrid validation approach

    NARCIS (Netherlands)

    Forbes, P.A.; Rooij, L. van; Rodarius, C.; Crandall, J.

    2008-01-01

    The current study presents a development and validation approach of a child human body model that will help understand child impact injuries and improve the biofidelity of child anthropometric test devices. Due to the lack of fundamental child biomechanical data needed to fully develop such models, a…

  6. Modeling Alaska boreal forests with a controlled trend surface approach

    Science.gov (United States)

    Mo Zhou; Jingjing Liang

    2012-01-01

    A Controlled Trend Surface approach was proposed to take large-scale spatial trends and nonspatial effects into consideration simultaneously. A geospatial model of the Alaska boreal forest was developed from 446 permanent sample plots, which addressed large-scale spatial trends in recruitment, diameter growth, and mortality. The model was tested on two sets of...

  7. Teaching Service Modelling to a Mixed Class: An Integrated Approach

    Science.gov (United States)

    Deng, Jeremiah D.; Purvis, Martin K.

    2015-01-01

    Service modelling has become an increasingly important area in today's telecommunications and information systems practice. We have adapted a Network Design course in order to teach service modelling to a mixed class of both the telecommunication engineering and information systems backgrounds. An integrated approach engaging mathematics teaching…

  8. Gray-box modelling approach for description of storage tunnel

    DEFF Research Database (Denmark)

    Harremoës, Poul; Carstensen, Jacob

    1999-01-01

    The dynamics of a storage tunnel is examined using a model based on on-line measured data and a combination of simple deterministic and black-box stochastic elements. This approach, called gray-box modeling, is a new promising methodology for giving an on-line state description of sewer systems...

  10. Refining the Committee Approach and Uncertainty Prediction in Hydrological Modelling

    NARCIS (Netherlands)

    Kayastha, N.

    2014-01-01

    Due to the complexity of hydrological systems, a single model may be unable to capture the full range of a catchment response and accurately predict the streamflows. The multi-modelling approach opens up possibilities for handling such difficulties and allows improving the predictive capability of mode…

  12. Hybrid continuum-atomistic approach to model electrokinetics in nanofluidics

    Energy Technology Data Exchange (ETDEWEB)

    Amani, Ehsan, E-mail: eamani@aut.ac.ir; Movahed, Saeid, E-mail: smovahed@aut.ac.ir

    2016-06-07

    In this study, for the first time, a hybrid continuum-atomistic based model is proposed for electrokinetics, electroosmosis and electrophoresis, through nanochannels. Although continuum based methods are accurate enough to model fluid flow and electric potential in nanofluidics (in dimensions larger than 4 nm), the ionic concentration in nanochannels is too low for the continuum assumption to be valid. On the other hand, the non-continuum based approaches are too time-consuming and are therefore limited to simple geometries in practice. Here, to provide an efficient hybrid continuum-atomistic method of modelling the electrokinetics in nanochannels, the fluid flow and electric potential are computed based on the continuum hypothesis, coupled with an atomistic Lagrangian approach for the ionic transport. The results of the model are compared to and validated by the results of the molecular dynamics technique for a couple of case studies. Then, the influences of bulk ionic concentration, external electric field, size of nanochannel, and surface electric charge on the electrokinetic flow and ionic mass transfer are investigated carefully. The hybrid continuum-atomistic method is a promising approach to model more complicated geometries and investigate more details of the electrokinetics in nanofluidics. - Highlights: • A hybrid continuum-atomistic model is proposed for electrokinetics in nanochannels. • The model is validated by molecular dynamics. • This is a promising approach to model more complicated geometries and physics.

  13. Modelling diversity in building occupant behaviour: a novel statistical approach

    DEFF Research Database (Denmark)

    Haldi, Frédéric; Calì, Davide; Andersen, Rune Korsholm

    2016-01-01

    We propose an advanced modelling framework to predict the scope and effects of behavioural diversity regarding building occupant actions on window openings, shading devices and lighting. We develop a statistical approach based on generalised linear mixed models to account for the longitudinal nature...

  14. A Bayesian Approach for Analyzing Longitudinal Structural Equation Models

    Science.gov (United States)

    Song, Xin-Yuan; Lu, Zhao-Hua; Hser, Yih-Ing; Lee, Sik-Yum

    2011-01-01

    This article considers a Bayesian approach for analyzing a longitudinal 2-level nonlinear structural equation model with covariates, and mixed continuous and ordered categorical variables. The first-level model is formulated for measures taken at each time point nested within individuals for investigating their characteristics that are dynamically…

  15. An Empirical-Mathematical Modelling Approach to Upper Secondary Physics

    Science.gov (United States)

    Angell, Carl; Kind, Per Morten; Henriksen, Ellen K.; Guttersrud, Oystein

    2008-01-01

    In this paper we describe a teaching approach focusing on modelling in physics, emphasizing scientific reasoning based on empirical data and using the notion of multiple representations of physical phenomena as a framework. We describe modelling activities from a project (PHYS 21) and relate some experiences from implementation of the modelling…

  16. An Alternative Approach for Nonlinear Latent Variable Models

    Science.gov (United States)

    Mooijaart, Ab; Bentler, Peter M.

    2010-01-01

    In the last decades there has been an increasing interest in nonlinear latent variable models. Since the seminal paper of Kenny and Judd, several methods have been proposed for dealing with these kinds of models. This article introduces an alternative approach. The methodology involves fitting some third-order moments in addition to the means and…

  19. A network model shows the importance of coupled processes in the microbial N cycle in the Cape Fear River Estuary

    Science.gov (United States)

    Hines, David E.; Lisa, Jessica A.; Song, Bongkeun; Tobias, Craig R.; Borrett, Stuart R.

    2012-06-01

    Estuaries serve important ecological and economic functions including habitat provision and the removal of nutrients. Eutrophication can overwhelm the nutrient removal capacity of estuaries and poses a widely recognized threat to the health and function of these ecosystems. Denitrification and anaerobic ammonium oxidation (anammox) are microbial processes responsible for the removal of fixed nitrogen and diminish the effects of eutrophication. Both of these microbial removal processes can be influenced by direct inputs of dissolved inorganic nitrogen substrates or supported by microbial interactions with other nitrogen transforming pathways such as nitrification and dissimilatory nitrate reduction to ammonium (DNRA). The coupling of nitrogen removal pathways to other transformation pathways facilitates the removal of some forms of inorganic nitrogen; however, differentiating between direct and coupled nitrogen removal is difficult. Network modeling provides a tool to examine interactions among microbial nitrogen cycling processes and to determine the within-system history of nitrogen involved in denitrification and anammox. To examine the coupling of nitrogen cycling processes, we built a nitrogen budget mass balance network model in two adjacent 1 cm3 sections of bottom water and sediment in the oligohaline portion of the Cape Fear River Estuary, NC, USA. Pathway, flow, and environ ecological network analyses were conducted to characterize the organization of nitrogen flow in the estuary and to estimate the coupling of nitrification to denitrification and of nitrification and DNRA to anammox. Centrality analysis indicated NH4+ is the most important form of nitrogen involved in removal processes. The model analysis further suggested that direct denitrification and coupled nitrification-denitrification had similar contributions to nitrogen removal while direct anammox was dominant to coupled forms of anammox. Finally, results also indicated that partial
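
    The flow-analysis side of such a network model can be illustrated with a small, mass-balanced flow matrix and the integral flow matrix (I - G)^-1, which traces how much of each nitrogen input reaches the removal processes directly or through coupled pathways. The compartments and flow values below are hypothetical placeholders, not the Cape Fear budget.

```python
import numpy as np

# Hypothetical, mass-balanced nitrogen flows (units arbitrary); rows = source, cols = target.
comps = ["NH4+", "NO2-/NO3-", "N2 via denitrification", "N2 via anammox"]
F = np.array([
    [0.0, 1.2, 0.0, 0.4],   # NH4+ -> nitrification, NH4+ -> anammox
    [0.0, 0.0, 1.1, 0.4],   # NOx  -> denitrification, NOx -> anammox
    [0.0, 0.0, 0.0, 0.0],   # the N2 pools only export out of the system
    [0.0, 0.0, 0.0, 0.0],
])
inputs = np.array([1.6, 0.3, 0.0, 0.0])       # external loading to each compartment

throughflow = inputs + F.sum(axis=0)          # steady state: inputs + inflows = outflows + exports
G = np.divide(F, throughflow[:, None], out=np.zeros_like(F), where=throughflow[:, None] > 0)
N = np.linalg.inv(np.eye(len(comps)) - G)     # direct + indirect (coupled) pathway contributions

for i in (0, 1):
    print(f"unit of N entering as {comps[i]:>10}: "
          f"{N[i, 2]:.2f} removed via denitrification, {N[i, 3]:.2f} via anammox")
```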

  20. BUSINESS MODEL IN ELECTRICITY INDUSTRY USING BUSINESS MODEL CANVAS APPROACH; THE CASE OF PT. XYZ

    Directory of Open Access Journals (Sweden)

    Achmad Arief Wicaksono

    2017-01-01

    The magnitude of opportunities and project values of the electricity system in Indonesia encourages PT. XYZ to develop its business in the electrical sector, which requires business development strategies. This study aims to identify the company's business model using the Business Model Canvas approach, formulate business development strategy alternatives, and determine the prioritized business development strategy appropriate to the manufacturing business model of PT. XYZ. This study utilized a descriptive approach and the nine elements of the Business Model Canvas. Alternative formulation and priority determination of the strategies were obtained using Strengths, Weaknesses, Opportunities, Threats (SWOT) analysis and pairwise comparison. The results of this study are improvements to the Business Model Canvas on the elements of key resources, key activities, key partners and customer segments. In terms of SWOT analysis on the nine elements of the Business Model Canvas for the first business development, the results show an expansion of the power plant construction project as the main contractor, an increase in sales in its core business in the oil and gas supporting-equipment industry, and a development of the second business, i.e. an investment in the electricity sector as an independent renewable energy-based power producer. For its first business development, PT. XYZ selected three Business Model Canvas elements as the priorities of the company, i.e. key resources weighing 0.252, key activities weighing 0.240, and key partners weighing 0.231. For its second business development, the company selected three elements as its priorities, i.e. key partners weighing 0.225, customer segments weighing 0.217, and key resources weighing 0.215. Keywords: business model canvas, SWOT, pairwise comparison, business model

  1. A multilevel approach to modeling of porous bioceramics

    Science.gov (United States)

    Mikushina, Valentina A.; Sidorenko, Yury N.

    2015-10-01

    The paper is devoted to a discussion of multiscale modelling principles for heterogeneous materials. The specificity of the approach considered is the use of a geometrical model of the composite's representative volume, which must be generated taking the material's reinforcement structure into account. Within the framework of such a model, different physical processes that influence the effective mechanical properties of the composite may be considered, in particular the process of damage accumulation. It is shown that such an approach can be used to predict the composite's macroscopic ultimate strength. As an example, the particular problem of studying the mechanical properties of a biocomposite representing a porous ceramic matrix filled with cortical bone tissue is discussed.

  2. Gray-box modelling approach for description of storage tunnel

    DEFF Research Database (Denmark)

    Harremoës, Poul; Carstensen, Jacob

    1999-01-01

    …of the water in the overflow structures. The capacity of a pump draining the storage tunnel is estimated for two different rain events, revealing that the pump was malfunctioning during the first rain event. The proposed modeling approach can be used in automated online surveillance and control and implemented… The model in the present paper provides on-line information on overflow volumes, pumping capacities, and remaining storage capacities. A linear overflow relation is found, differing significantly from the traditional deterministic modeling approach. The linearity of the formulas is explained by the inertia…

  3. A study of multidimensional modeling approaches for data warehouse

    Science.gov (United States)

    Yusof, Sharmila Mat; Sidi, Fatimah; Ibrahim, Hamidah; Affendey, Lilly Suriani

    2016-08-01

    A data warehouse system is used to support the process of organizational decision making. Hence, the system must extract and integrate information from heterogeneous data sources in order to uncover relevant knowledge suitable for the decision making process. However, the development of a data warehouse is a difficult and complex process, especially in its conceptual design (multidimensional modeling). Thus, various approaches have been proposed to overcome the difficulty. This study surveys and compares the approaches to multidimensional modeling and highlights the issues, trends and solutions proposed to date. The contribution is a state-of-the-art review of multidimensional modeling design.

  4. Meta-analysis a structural equation modeling approach

    CERN Document Server

    Cheung, Mike W-L

    2015-01-01

    Presents a novel approach to conducting meta-analysis using structural equation modeling. Structural equation modeling (SEM) and meta-analysis are two powerful statistical methods in the educational, social, behavioral, and medical sciences. They are often treated as two unrelated topics in the literature. This book presents a unified framework on analyzing meta-analytic data within the SEM framework, and illustrates how to conduct meta-analysis using the metaSEM package in the R statistical environment. Meta-Analysis: A Structural Equation Modeling Approach begins by introducing the impo…

  5. Modeling and Algorithmic Approaches to Constitutively-Complex, Microstructured Fluids

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Gregory H. [Univ. of California, Davis, CA (United States); Forest, Gregory [Univ. of California, Davis, CA (United States)]

    2014-05-01

    We present a new multiscale model for complex fluids based on three scales: microscopic, kinetic, and continuum. We choose the microscopic level as Kramers' bead-rod model for polymers, which we describe as a system of stochastic differential equations with an implicit constraint formulation. The associated Fokker-Planck equation is then derived, and adiabatic elimination removes the fast momentum coordinates. Approached in this way, the kinetic level reduces to a dispersive drift equation. The continuum level is modeled with a finite volume Godunov-projection algorithm. We demonstrate computation of viscoelastic stress divergence using this multiscale approach.
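
    A much-reduced sketch of the microscopic/kinetic idea: in place of the constrained Kramers bead-rod chain used in the paper, a single-mode Hookean dumbbell ensemble is integrated with Euler-Maruyama under simple shear, and a Kramers-type polymer stress is estimated from the second moment of the connector vector. All parameters are dimensionless placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Dimensionless Hookean dumbbell in simple shear: dQ = (kappa . Q - Q/2) dt + dW
shear_rate = 1.0
kappa = np.array([[0.0, shear_rate, 0.0],
                  [0.0, 0.0,        0.0],
                  [0.0, 0.0,        0.0]])    # velocity-gradient tensor for u_x = shear_rate * y

n_dumbbells, n_steps, dt = 20000, 8000, 5e-4
Q = rng.normal(size=(n_dumbbells, 3))         # equilibrium initial ensemble, <Q_i Q_j> = delta_ij

for _ in range(n_steps):                      # Euler-Maruyama step for the whole ensemble
    drift = Q @ kappa.T - 0.5 * Q
    Q = Q + drift * dt + np.sqrt(dt) * rng.normal(size=Q.shape)

# Kramers-type stress estimate (up to a concentration/elastic prefactor): tau_p ~ <Q Q> - I
second_moment = (Q[:, :, None] * Q[:, None, :]).mean(axis=0)
tau_p = second_moment - np.eye(3)
print("shear stress  <QxQy>      :", tau_p[0, 1])
print("first normal stress diff. :", tau_p[0, 0] - tau_p[1, 1])
```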

  6. Raw Data Maximum Likelihood Estimation for Common Principal Component Models: A State Space Approach.

    Science.gov (United States)

    Gu, Fei; Wu, Hao

    2016-09-01

    The specifications of state space model for some principal component-related models are described, including the independent-group common principal component (CPC) model, the dependent-group CPC model, and principal component-based multivariate analysis of variance. Some derivations are provided to show the equivalence of the state space approach and the existing Wishart-likelihood approach. For each model, a numeric example is used to illustrate the state space approach. In addition, a simulation study is conducted to evaluate the standard error estimates under the normality and nonnormality conditions. In order to cope with the nonnormality conditions, the robust standard errors are also computed. Finally, other possible applications of the state space approach are discussed at the end.

  7. Social learning in Models and Cases - an Interdisciplinary Approach

    Science.gov (United States)

    Buhl, Johannes; De Cian, Enrica; Carrara, Samuel; Monetti, Silvia; Berg, Holger

    2016-04-01

    Our paper follows an interdisciplinary understanding of social learning. We contribute to the literature on social learning in transition research by bridging case-oriented research and modelling-oriented transition research. We start by describing selected theories on social learning in innovation, diffusion and transition research. We present theoretical understandings of social learning in techno-economic and agent-based modelling. Then we elaborate on empirical research on social learning in transition case studies, and we identify and synthesize key dimensions of social learning in these case studies. We then bridge between the more formal, generalising modelling approaches to social learning processes and the more descriptive, individualising case study approaches by translating the case study analysis into a visual guide to the functional forms of social learning typically identified in the cases. Finally, we vary, by way of example, these functional forms of social learning in integrated assessment models. We conclude by drawing the lessons learned from the interdisciplinary approach - methodologically and empirically.

  8. Learning the Task Management Space of an Aircraft Approach Model

    Science.gov (United States)

    Krall, Joseph; Menzies, Tim; Davies, Misty

    2014-01-01

    Validating models of airspace operations is a particular challenge. These models are often aimed at finding and exploring safety violations, and aim to be accurate representations of real-world behavior. However, the rules governing the behavior are quite complex: nonlinear physics, operational modes, human behavior, and stochastic environmental concerns all determine the responses of the system. In this paper, we present a study on aircraft runway approaches as modeled in Georgia Tech's Work Models that Compute (WMC) simulation. We use a new learner, Genetic-Active Learning for Search-Based Software Engineering (GALE) to discover the Pareto frontiers defined by cognitive structures. These cognitive structures organize the prioritization and assignment of tasks of each pilot during approaches. We discuss the benefits of our approach, and also discuss future work necessary to enable uncertainty quantification.
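
    The Pareto frontiers that the learner searches for can be illustrated with a plain non-dominated filter over candidate configurations scored on competing objectives; the candidate values below are hypothetical, and both objectives are minimised.

```python
from typing import List, Sequence

def dominates(a: Sequence[float], b: Sequence[float]) -> bool:
    """a dominates b if it is no worse on every objective and better on at least one (minimisation)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points: List[Sequence[float]]) -> List[Sequence[float]]:
    """Keep only the non-dominated points."""
    return [p for p in points if not any(dominates(q, p) for q in points if q is not p)]

# Hypothetical candidates scored on (pilot workload, deviation from the nominal approach profile).
candidates = [(0.9, 0.2), (0.5, 0.5), (0.7, 0.3), (0.4, 0.9), (0.6, 0.6), (0.3, 1.1)]
print(pareto_front(candidates))
```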

  9. Building enterprise reuse program--A model-based approach

    Institute of Scientific and Technical Information of China (English)

    梅宏; 杨芙清

    2002-01-01

    Reuse is viewed as a realistically effective approach to solving software crisis. For an organization that wants to build a reuse program, technical and non-technical issues must be considered in parallel. In this paper, a model-based approach to building systematic reuse program is presented. Component-based reuse is currently a dominant approach to software reuse. In this approach, building the right reusable component model is the first important step. In order to achieve systematic reuse, a set of component models should be built from different perspectives. Each of these models will give a specific view of the components so as to satisfy different needs of different persons involved in the enterprise reuse program. There already exist some component models for reuse from technical perspectives. But less attention is paid to the reusable components from a non-technical view, especially from the view of process and management. In our approach, a reusable component model--FLP model for reusable component--is introduced. This model describes components from three dimensions (Form, Level, and Presentation) and views components and their relationships from the perspective of process and management. It determines the sphere of reusable components, the time points of reusing components in the development process, and the needed means to present components in terms of the abstraction level, logic granularity and presentation media. Being the basis on which the management and technical decisions are made, our model will be used as the kernel model to initialize and normalize a systematic enterprise reuse program.

  10. Bamboo Leaf Flavones and Tea Polyphenols Show a Lipid-lowering Effect in a Rat Model of Hyperlipidemia.

    Science.gov (United States)

    Yang, C; Yifan, L; Dan, L; Qian, Y; Ming-yan, J

    2015-12-01

    At present, most lipid-lowering drugs are western medicines, which have many adverse reactions. Zhucha, an age-old Uyghur medicine, is made up of bamboo leaves and tea (green tea) and has good efficacy and a lipid-lowering effect. The purpose of this study was to undertake a pharmacodynamic examination of the optimal proportions of bamboo leaf flavones and tea polyphenols required to achieve lipid lowering in rats. A hyperlipidemia rat model was used to examine the lipid-lowering effects of bamboo leaf flavones and tea polyphenols. Wistar rats were divided into 13 groups, including one hyperlipidemia model group and 2 positive drug groups as well as experimental groups (9 groups dosed with different proportions of bamboo leaf flavones and tea polyphenols; the 3 dosages of bamboo leaf flavones were 75 mg/kg/d, 50 mg/kg/d and 25 mg/kg/d, respectively, and the 3 dosages of tea polyphenols were 750 mg/kg/d, 500 mg/kg/d and 250 mg/kg/d). Body weight and the levels of triglyceride (TG) and high-density lipoprotein cholesterol (HDL) were determined. A high dose of bamboo leaf flavones (75 mg/kg/d) combined with a medium dose of tea polyphenols (500 mg/kg/d) was deemed optimal for achieving a lipid-lowering effect: this group showed the smallest weight increase, and its TG and HDL levels were similar to those of the positive controls. The bamboo leaf flavones and tea polyphenols were mixed in a fixed proportion (1:6.7), and the mixture achieved a lipid-lowering effect and might prove to be useful as a natural lipid-lowering agent.

  11. Pridopidine, a dopamine stabilizer, improves motor performance and shows neuroprotective effects in Huntington disease R6/2 mouse model.

    Science.gov (United States)

    Squitieri, Ferdinando; Di Pardo, Alba; Favellato, Mariagrazia; Amico, Enrico; Maglione, Vittorio; Frati, Luigi

    2015-11-01

    Huntington disease (HD) is a neurodegenerative disorder for which new treatments are urgently needed. Pridopidine is a new dopaminergic stabilizer, recently developed for the treatment of motor symptoms associated with HD. The therapeutic effect of pridopidine in patients with HD has been determined in two double-blind randomized clinical trials; however, whether pridopidine exerts neuroprotection remains to be addressed. The main goal of this study was to define the potential neuroprotective effect of pridopidine in HD in vivo and in vitro models, thus providing evidence that might support a potential disease-modifying action of the drug and possibly clarifying other aspects of pridopidine mode-of-action. Our data corroborated the hypothesis of neuroprotective action of pridopidine in HD experimental models. Administration of pridopidine protected cells from apoptosis, and resulted in highly improved motor performance in R6/2 mice. The anti-apoptotic effect observed in the in vitro system highlighted neuroprotective properties of the drug, and advanced the idea of the sigma-1 receptor as an additional molecular target implicated in the mechanism of action of pridopidine. Consistent with these protective effects, pridopidine-mediated beneficial effects in R6/2 mice were associated with an increased expression of pro-survival and neurostimulatory molecules, such as brain derived neurotrophic factor and DARPP32, and with a reduction in the size of mHtt aggregates in striatal tissues. Taken together, these findings support the theory of pridopidine as a molecule with disease-modifying properties in HD and advance the idea of a valuable therapeutic strategy for effectively treating the disease.

  12. Actinobacteria from Termite Mounds Show Antiviral Activity against Bovine Viral Diarrhea Virus, a Surrogate Model for Hepatitis C Virus

    Directory of Open Access Journals (Sweden)

    Marina Aiello Padilla

    2015-01-01

    Extracts from termite-associated bacteria were evaluated for in vitro antiviral activity against bovine viral diarrhea virus (BVDV). Two bacterial strains were identified as active, with percentages of inhibition (IP) equal to 98%. Both strains were subjected to functional analysis via the addition of virus and extract at different time points in cell culture; the results showed that they were effective as posttreatments. Moreover, we performed MTT colorimetric assays to identify the CC50, IC50, and SI values of these strains, and strain CDPA27 was considered the most promising. In parallel, the isolates were identified as Streptomyces through 16S rRNA gene sequencing analysis. Specifically, CDPA27 was identified as S. chartreusis. The CDPA27 extract was fractionated on a C18-E SPE cartridge, and the fractions were reevaluated. A 100% methanol fraction was identified to contain the compound(s) responsible for antiviral activity, which had an SI of 262.41. GC-MS analysis showed that this activity was likely associated with the compound(s) that had a peak retention time of 5 min. Taken together, the results of the present study provide new information for antiviral research using natural sources, demonstrate the antiviral potential of Streptomyces chartreusis compounds isolated from termite mounds against BVDV, and lay the foundation for further studies on the treatment of HCV infection.

  13. Forecasting wind-driven wildfires using an inverse modelling approach

    Directory of Open Access Journals (Sweden)

    O. Rios

    2014-06-01

    A technology able to rapidly forecast wildfire dynamics would lead to a paradigm shift in the response to emergencies, providing the Fire Service with essential information about the ongoing fire. This paper presents and explores a novel methodology to forecast wildfire dynamics in wind-driven conditions, using real-time data assimilation and inverse modelling. The forecasting algorithm combines Rothermel's rate of spread theory with a perimeter expansion model based on Huygens principle and solves the optimisation problem with a tangent linear approach and forward automatic differentiation. Its potential is investigated using synthetic data and evaluated in different wildfire scenarios. The results show the capacity of the method to quickly predict the location of the fire front with a positive lead time (ahead of the event) in the order of 10 min for a spatial scale of 100 m. The greatest strengths of our method are lightness, speed and flexibility. We specifically tailor the forecast to be efficient and computationally cheap so it can be used in mobile systems for field deployment and operativeness. Thus, we put emphasis on producing a positive lead time and the means to maximise it.
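
    The forward-spread half of such a forecast (not the data-assimilation and inverse-modelling step) can be sketched by advancing each vertex of a polygonal fire perimeter along its outward normal with a simple elliptical, wind-driven rate of spread, a crude stand-in for the full Rothermel/Huygens machinery; all parameter values below are illustrative.

```python
import numpy as np

def spread_rate(normal_angle, wind_angle, head_ros=0.5, eccentricity=0.8):
    """Elliptical wind-driven rate of spread (m/s) versus the angle between the
    local outward normal and the wind direction (single-ellipse shape)."""
    theta = normal_angle - wind_angle
    return head_ros * (1.0 - eccentricity) / (1.0 - eccentricity * np.cos(theta))

def expand_perimeter(xy, wind_angle, dt):
    """Advance each vertex of a counter-clockwise polygonal fire front along its outward normal."""
    tangent = np.roll(xy, -1, axis=0) - np.roll(xy, 1, axis=0)
    normal = np.stack([tangent[:, 1], -tangent[:, 0]], axis=1)     # outward normal (CCW polygon)
    normal /= np.linalg.norm(normal, axis=1, keepdims=True)
    ros = spread_rate(np.arctan2(normal[:, 1], normal[:, 0]), wind_angle)
    return xy + ros[:, None] * normal * dt

# Ignition approximated as a small circle; wind blowing towards +x.
angles = np.linspace(0.0, 2.0 * np.pi, 100, endpoint=False)
perimeter = 5.0 * np.stack([np.cos(angles), np.sin(angles)], axis=1)

for _ in range(60):                       # one hour in 60 s steps
    perimeter = expand_perimeter(perimeter, wind_angle=0.0, dt=60.0)

print("downwind reach (m):", perimeter[:, 0].max(), " upwind reach (m):", perimeter[:, 0].min())
```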

  14. Benchmarking novel approaches for modelling species range dynamics.

    Science.gov (United States)

    Zurell, Damaris; Thuiller, Wilfried; Pagel, Jörn; Cabral, Juliano S; Münkemüller, Tamara; Gravel, Dominique; Dullinger, Stefan; Normand, Signe; Schiffers, Katja H; Moore, Kara A; Zimmermann, Niklaus E

    2016-08-01

    Increasing biodiversity loss due to climate change is one of the most vital challenges of the 21st century. To anticipate and mitigate biodiversity loss, models are needed that reliably project species' range dynamics and extinction risks. Recently, several new approaches to model range dynamics have been developed to supplement correlative species distribution models (SDMs), but applications clearly lag behind model development. Indeed, no comparative analysis has been performed to evaluate their performance. Here, we build on process-based, simulated data for benchmarking five range (dynamic) models of varying complexity including classical SDMs, SDMs coupled with simple dispersal or more complex population dynamic models (SDM hybrids), and a hierarchical Bayesian process-based dynamic range model (DRM). We specifically test the effects of demographic and community processes on model predictive performance. Under current climate, DRMs performed best, although only marginally. Under climate change, predictive performance varied considerably, with no clear winners. Yet, all range dynamic models improved predictions under climate change substantially compared to purely correlative SDMs, and the population dynamic models also predicted reasonable extinction risks for most scenarios. When benchmarking data were simulated with more complex demographic and community processes, simple SDM hybrids including only dispersal often proved most reliable. Finally, we found that structural decisions during model building can have great impact on model accuracy, but prior system knowledge on important processes can reduce these uncertainties considerably. Our results reassure the clear merit in using dynamic approaches for modelling species' response to climate change but also emphasize several needs for further model and data improvement. We propose and discuss perspectives for improving range projections through combination of multiple models and for making these approaches

  15. A Variable Flow Modelling Approach To Military End Strength Planning

    Science.gov (United States)

    2016-12-01

    …A System Dynamics (SD) model is ideal for strategic analysis as it encompasses all the behaviours of a system and how the behaviours are influenced by… Wang describes Markov chain theory as a mathematical tool used to investigate the dynamic behaviours of a system in discrete time… A Variable Flow Modelling Approach to Military End Strength Planning, by Benjamin K. Grossi, December 2016; Thesis Advisor: Kenneth Doerr; Second Reader…

  16. New Approaches in Reusable Booster System Life Cycle Cost Modeling

    Science.gov (United States)

    2012-01-01

    …Lean NPD practices (many) • Lean Production & Operations practices (many) • Supply Chain Operations Reference (SCOR) model, best practices (Make, Deliver)… New Approaches in Reusable Booster System Life Cycle Cost Modeling, Edgar Zapata, National Aeronautics and Space Administration, Kennedy Space Center… Kennedy Space Center (KSC) and the Air Force Research Laboratory (AFRL). The work included the creation of a new cost estimating model and an LCC…

  17. Multi-Model approach to reconstruct the Mediterranean Freshwater Evolution

    Science.gov (United States)

    Simon, Dirk; Marzocchi, Alice; Flecker, Rachel; Lunt, Dan; Hilgen, Frits; Meijer, Paul

    2016-04-01

    Today the Mediterranean Sea is isolated from the global ocean by the Strait of Gibraltar. This restricted nature causes the Mediterranean basin to react more sensitively to climate- and tectonics-related phenomena than the global ocean. Not just eustatic sea-level and regional river run-off, but also gateway tectonics and connectivity between sub-basins are leaving an enhanced fingerprint in its geological record. To understand its evolution, it is crucial to understand how these different effects are coupled. The Miocene-Pliocene sedimentary record of the Mediterranean shows alternations in composition and colour and has been astronomically tuned. Around the Miocene-Pliocene Boundary the most extreme changes occur in the Mediterranean Sea. About 6% of the salt in the global ocean was deposited in the Mediterranean region, forming an approximately 2 km thick salt layer, which is still present today. This extreme event is named the Messinian Salinity Crisis (MSC, 5.97-5.33 Ma). The gateway and climate evolution are not well constrained for this time, which makes it difficult to distinguish which of the above mentioned drivers might have triggered the MSC. We, therefore, decided to tackle this problem via a multi-model approach: (1) We calculate the Mediterranean freshwater evolution via 30 atmosphere-ocean-vegetation simulations (using HadCM3L), to which we fitted a function using a regression model. This allows us to directly relate the orbital curves to evaporation, precipitation and run-off. The resulting freshwater evolution can be directly correlated to other sedimentary and proxy records in the late Miocene. (2) By feeding the new freshwater evolution curve into a box/budget model we can predict the salinity and strontium evolution of the Mediterranean for a given Atlantic-Mediterranean gateway. (3) By comparing these results to the known salinity thresholds of gypsum and halite saturation of sea water, but also to the late Miocene Mediterranean strontium…

  18. An inducible transgenic mouse model for immune mediated hepatitis showing clearance of antigen expressing hepatocytes by CD8+ T cells.

    Directory of Open Access Journals (Sweden)

    Marcin Cebula

    The liver has the ability to prime immune responses against neoantigens provided upon infection. However, T cell immunity in the liver is uniquely modulated by the complex tolerogenic property of this organ, which has to also cope with foreign agents such as endotoxins or food antigens. In this respect, the nature of intrahepatic T cell responses remains to be fully characterized. To gain deeper insight into the mechanisms that regulate the CD8+ T cell responses in the liver, we established a novel OVA_X_CreER(T2) mouse model. Upon tamoxifen administration OVA antigen expression is observed in a fraction of hepatocytes, resulting in a mosaic expression pattern. To elucidate the cross-talk of CD8+ T cells with antigen-expressing hepatocytes, we adoptively transferred K(b)/OVA257-264-specific OT-I T cells to OVA_X_CreER(T2) mice or generated triple transgenic OVA_X_CreER(T2)_X_OT-I mice. OT-I T cells become activated in OVA_X_CreER(T2) mice and induce an acute and transient hepatitis accompanied by liver damage. In OVA_X_CreER(T2)_X_OT-I mice, OVA induction triggers an OT-I T cell mediated, fulminant hepatitis resulting in 50% mortality. Surviving mice manifest a long-lasting hepatitis, and recover after 9 weeks. In these experimental settings, recovery from hepatitis correlates with a complete loss of OVA expression, indicating efficient clearance of the antigen-expressing hepatocytes. Moreover, a relapse of hepatitis can be induced upon re-induction of cured OVA_X_CreER(T2)_X_OT-I mice, indicating the absence of tolerogenic mechanisms. This pathogen-free, conditional mouse model has the advantage of tamoxifen-inducible, tissue-specific antigen expression that reflects the heterogeneity of viral antigen expression and enables the study of intrahepatic immune responses to both de novo and persistent antigen. It allows following the course of intrahepatic immune responses: initiation, the acute phase and antigen clearance.

  19. Clarithromycin and dexamethasone show similar anti-inflammatory effects on distinct phenotypic chronic rhinosinusitis: an explant model study.

    Science.gov (United States)

    Zeng, Ming; Li, Zhi-Yong; Ma, Jin; Cao, Ping-Ping; Wang, Heng; Cui, Yong-Hua; Liu, Zheng

    2015-06-06

    Phenotype of chronic rhinosinusitis (CRS) may be an important determining factor of the efficacy of anti-inflammatory treatments. Although both glucocorticoids and macrolide antibiotics have been recommended for the treatment of CRS, whether they have different anti-inflammatory functions for distinct phenotypic CRS is not completely understood. The aim of this study is to compare the anti-inflammatory effects of clarithromycin and dexamethasone on sinonasal mucosal explants from different phenotypic CRS ex vivo. Ethmoid mucosal tissues from CRSsNP patients (n = 15), and polyp tissues from eosinophilic (n = 13) and non-eosinophilic (n = 12) CRSwNP patients were cultured in an ex vivo explant model with or without dexamethasone or clarithromycin treatment for 24 h. After culture, the production and/or expression of anti-inflammatory molecules, epithelial-derived cytokines, pro-inflammatory cytokines, T helper (Th)1, Th2 and Th17 cytokines, chemokines, dendritic cell relevant markers, pattern recognition receptors (PRRs), and tissue remodeling factors were detected in tissue explants or culture supernatants by RT-PCR or ELISA, respectively. We found that both clarithromycin and dexamethasone up-regulated the production of anti-inflammatory mediators (Clara cell 10-kDa protein and interleukin (IL)-10), whereas they down-regulated the production of Th2 response and eosinophilia promoting molecules (thymic stromal lymphopoietin, IL-25, IL-33, CD80, CD86, OX40 ligand, programmed cell death ligand 1, CCL17, CCL22, CCL11, CCL5, IL-5, IL-13, and eosinophilic cationic protein) and Th1 response and neutrophilia promoting molecules (CXCL8, CXCL5, CXCL10, CXCL9, interferon-γ, and IL-12), in sinonasal mucosa from distinct phenotypic CRS. In contrast, they had no effect on IL-17A production. The expression of PRRs (Toll-like receptors and melanoma differentiation-associated gene 5) was induced, and the production of tissue remodeling factors (transforming growth factor-β1

  20. Individual Diet Modeling Shows How to Balance the Diet of French Adults with or without Excessive Free Sugar Intakes

    Directory of Open Access Journals (Sweden)

    Anne Lluch

    2017-02-01

    Full Text Available Dietary changes needed to achieve nutritional adequacy for 33 nutrients were determined for 1719 adults from a representative French national dietary survey. For each individual, an iso-energy nutritionally adequate diet was generated using diet modeling, staying as close as possible to the observed diet. The French food composition table was completed with free sugar (FS) content. Results were analyzed separately for individuals with FS intakes in their observed diets ≤10% or >10% of their energy intake (named below FS-ACCEPTABLE and FS-EXCESS, respectively). The FS-EXCESS group represented 41% of the total population (average energy intake of 14.2% from FS). Compared with FS-ACCEPTABLE individuals, FS-EXCESS individuals had diets of lower nutritional quality and consumed more energy (2192 vs. 2123 kcal/day), particularly during snacking occasions (258 vs. 131 kcal/day) (all p-values < 0.01). In order to meet nutritional targets, for both FS-ACCEPTABLE and FS-EXCESS individuals, the main dietary changes in optimized diets were significant increases in fresh fruits, starchy foods, water, hot beverages and plain yogurts; and significant decreases in mixed dishes/sandwiches, meat/eggs/fish and cheese. For FS-EXCESS individuals only, the optimization process significantly increased vegetables and significantly decreased sugar-sweetened beverages, sweet products and fruit juices. The diets of French adults with excessive intakes of FS are of lower nutritional quality, but can be optimized via specific dietary changes.
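
    The diet-modeling step described above is essentially a constrained optimization that stays as close as possible to the observed diet. The sketch below poses a toy version of it as a linear program: four invented foods, two illustrative nutrient targets and a 10%-of-energy free-sugar cap stand in for the 33 nutrients and the French food composition table actually used.

```python
# Toy version of the individual diet optimization: minimise the total change to the
# observed diet subject to nutrient constraints at unchanged energy intake.
# Foods, compositions and requirements are invented for illustration only.
import numpy as np
from scipy.optimize import linprog

foods = ["fresh fruit", "starchy food", "sweet product", "cheese"]
x_obs = np.array([150.0, 200.0, 80.0, 40.0])        # observed intake, g/day

# Rows: energy (kcal/100 g), fibre (g/100 g), calcium (mg/100 g), free sugar (g/100 g).
comp = np.array([
    [ 50.0, 350.0, 400.0, 380.0],   # energy
    [  2.0,   3.0,   0.5,   0.0],   # fibre
    [ 10.0,  15.0,  20.0, 750.0],   # calcium
    [ 10.0,   1.0,  60.0,   0.0],   # free sugar
]) / 100.0                           # convert to per gram

energy_obs = comp[0] @ x_obs
req = {"fibre": 10.0, "calcium": 800.0}             # illustrative daily targets
fs_max = 0.10 * energy_obs / 4.0                    # free sugar <= 10% of energy (4 kcal/g)

n = len(foods)
# Variables: [x (n), d_plus (n), d_minus (n)] with x = x_obs + d_plus - d_minus.
c = np.concatenate([np.zeros(n), np.ones(n), np.ones(n)])   # minimise total |change|

A_eq = np.zeros((1 + n, 3 * n))
b_eq = np.zeros(1 + n)
A_eq[0, :n] = comp[0]; b_eq[0] = energy_obs                 # iso-energy constraint
for i in range(n):                                          # link x to its deviations
    A_eq[1 + i, i], A_eq[1 + i, n + i], A_eq[1 + i, 2 * n + i] = 1.0, -1.0, 1.0
    b_eq[1 + i] = x_obs[i]

A_ub = np.hstack([np.vstack([-comp[1], -comp[2], comp[3]]), np.zeros((3, 2 * n))])
b_ub = np.array([-req["fibre"], -req["calcium"], fs_max])   # fibre/calcium minima, FS cap

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * (3 * n), method="highs")
for name, old, new in zip(foods, x_obs, res.x[:n]):
    print(f"{name:13s} {old:6.0f} g/day -> {new:6.0f} g/day")
```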

  1. Individual Diet Modeling Shows How to Balance the Diet of French Adults with or without Excessive Free Sugar Intakes.

    Science.gov (United States)

    Lluch, Anne; Maillot, Matthieu; Gazan, Rozenn; Vieux, Florent; Delaere, Fabien; Vaudaine, Sarah; Darmon, Nicole

    2017-02-20

    Dietary changes needed to achieve nutritional adequacy for 33 nutrients were determined for 1719 adults from a representative French national dietary survey. For each individual, an iso-energy nutritionally adequate diet was generated using diet modeling, staying as close as possible to the observed diet. The French food composition table was completed with free sugar (FS) content. Results were analyzed separately for individuals with FS intakes in their observed diets ≤10% or >10% of their energy intake (named below FS-ACCEPTABLE and FS-EXCESS, respectively). The FS-EXCESS group represented 41% of the total population (average energy intake of 14.2% from FS). Compared with FS-ACCEPTABLE individuals, FS-EXCESS individuals had diets of lower nutritional quality and consumed more energy (2192 vs. 2123 kcal/day), particularly during snacking occasions (258 vs. 131 kcal/day) (all p-values < 0.01). In order to meet nutritional targets, for both FS-ACCEPTABLE and FS-EXCESS individuals, the main dietary changes in optimized diets were significant increases in fresh fruits, starchy foods, water, hot beverages and plain yogurts; and significant decreases in mixed dishes/sandwiches, meat/eggs/fish and cheese. For FS-EXCESS individuals only, the optimization process significantly increased vegetables and significantly decreased sugar-sweetened beverages, sweet products and fruit juices. The diets of French adults with excessive intakes of FS are of lower nutritional quality, but can be optimized via specific dietary changes.

  2. Individual Diet Modeling Shows How to Balance the Diet of French Adults with or without Excessive Free Sugar Intakes

    Science.gov (United States)

    Lluch, Anne; Maillot, Matthieu; Gazan, Rozenn; Vieux, Florent; Delaere, Fabien; Vaudaine, Sarah; Darmon, Nicole

    2017-01-01

    Dietary changes needed to achieve nutritional adequacy for 33 nutrients were determined for 1719 adults from a representative French national dietary survey. For each individual, an iso-energy nutritionally adequate diet was generated using diet modeling, staying as close as possible to the observed diet. The French food composition table was completed with free sugar (FS) content. Results were analyzed separately for individuals with FS intakes in their observed diets ≤10% or >10% of their energy intake (named below FS-ACCEPTABLE and FS-EXCESS, respectively). The FS-EXCESS group represented 41% of the total population (average energy intake of 14.2% from FS). Compared with FS-ACCEPTABLE individuals, FS-EXCESS individuals had diets of lower nutritional quality and consumed more energy (2192 vs. 2123 kcal/day), particularly during snacking occasions (258 vs. 131 kcal/day) (all p-values < 0.01). In order to meet nutritional targets, for both FS-ACCEPTABLE and FS-EXCESS individuals, the main dietary changes in optimized diets were significant increases in fresh fruits, starchy foods, water, hot beverages and plain yogurts; and significant decreases in mixed dishes/sandwiches, meat/eggs/fish and cheese. For FS-EXCESS individuals only, the optimization process significantly increased vegetables and significantly decreased sugar-sweetened beverages, sweet products and fruit juices. The diets of French adults with excessive intakes of FS are of lower nutritional quality, but can be optimized via specific dietary changes. PMID:28230722

  3. An Approach to Computer Modeling of Geological Faults in 3D and an Application

    Institute of Scientific and Technical Information of China (English)

    ZHU Liang-feng; HE Zheng; PAN Xin; WU Xin-cai

    2006-01-01

    3D geological modeling, one of the most important applications of 3D GIS in the geosciences, forms the basis and is a prerequisite for visualized representation and analysis of 3D geological data. Computer modeling of geological faults in 3D is currently a topical research area. Structural modeling techniques for complex geological entities containing reverse faults are discussed and a series of approaches are proposed. The geological concepts involved in computer modeling and visualization of geological faults in 3D are explained, the types of data on geological faults obtained from geological exploration are analyzed, and a normative database format for geological faults is designed. Two kinds of modeling approaches for faults are compared: a modeling technique based on stratum recovery and a modeling technique based on interpolation in subareas. A novel approach, called the Unified Modeling Technique for stratum and fault, is presented to solve the puzzling problems of reverse faults, syn-sedimentary faults and faults terminated within geological models. A case study of a fault model of bedrock in the Beijing Olympic Green District is presented in order to show the practical result of this method. The principle and the process of computer modeling of geological faults in 3D are discussed and a series of applied technical proposals are established. This strengthens our comprehension of geological phenomena and the modeling approach, and establishes the basic techniques of 3D geological modeling for practical applications in the field of geosciences.

  4. Amniotic fluid stem cells with low γ-interferon response showed behavioral improvement in Parkinsonism rat model.

    Directory of Open Access Journals (Sweden)

    Yu-Jen Chang

    Full Text Available Amniotic fluid stem cells (AFSCs) are multipotent stem cells that may be used in transplantation medicine. In this study, AFSCs established from amniocentesis were characterized on the basis of surface marker expression and differentiation potential. To further investigate the properties of AFSCs for translational applications, we examined the cell surface expression of human leukocyte antigens (HLA) of these cells and estimated the therapeutic effect of AFSCs in parkinsonian rats. The expression profiles of HLA-II and transcription factors were compared between AFSCs and bone marrow-derived mesenchymal stem cells (BMMSCs) following treatment with γ-IFN. We found that stimulation of AFSCs with γ-IFN prompted only a slight increase in the expression of HLA-Ia and HLA-E, and only rare HLA-II expression could be observed in most AFSC samples. Consequently, the expression of CIITA and RFX5 was weakly induced by γ-IFN stimulation of AFSCs compared to that of BMMSCs. In the transplantation test, Sprague Dawley rats with 6-hydroxydopamine lesioning of the substantia nigra were used as a parkinsonian animal model. Following injection of AFSCs with a negative γ-IFN response, apomorphine-induced rotation was reduced by 75% in AFSC-engrafted parkinsonian rats but was increased by 53% in the control group at 12 weeks post-transplantation. The implanted AFSCs were viable, and were able to migrate into the brain's circuitry and express specific proteins of dopamine neurons, such as tyrosine hydroxylase and dopamine transporter. In conclusion, the relative insensitivity of AFSCs to γ-IFN implies that AFSCs might be immune-tolerant under γ-IFN inflammatory conditions. Furthermore, the effective improvement in apomorphine-induced rotation after AFSC transplantation paves the way for clinical application in parkinsonian therapy.

  5. An orally available, small-molecule polymerase inhibitor shows efficacy against a lethal morbillivirus infection in a large animal model.

    Science.gov (United States)

    Krumm, Stefanie A; Yan, Dan; Hovingh, Elise S; Evers, Taylor J; Enkirch, Theresa; Reddy, G Prabhakar; Sun, Aiming; Saindane, Manohar T; Arrendale, Richard F; Painter, George; Liotta, Dennis C; Natchus, Michael G; von Messling, Veronika; Plemper, Richard K

    2014-04-16

    Measles virus is a highly infectious morbillivirus responsible for major morbidity and mortality in unvaccinated humans. The related, zoonotic canine distemper virus (CDV) induces morbillivirus disease in ferrets with 100% lethality. We report an orally available, shelf-stable pan-morbillivirus inhibitor that targets the viral RNA polymerase. Prophylactic oral treatment of ferrets infected intranasally with a lethal CDV dose reduced viremia and prolonged survival. Ferrets infected with the same dose of virus that received post-infection treatment at the onset of viremia showed low-grade viral loads, remained asymptomatic, and recovered from infection, whereas control animals succumbed to the disease. Animals that recovered also mounted a robust immune response and were protected against rechallenge with a lethal CDV dose. Drug-resistant viral recombinants were generated and found to be attenuated and transmission-impaired compared to the genetic parent virus. These findings may pioneer a path toward an effective morbillivirus therapy that could aid measles eradication by synergizing with vaccination to close gaps in herd immunity due to vaccine refusal.

  6. THE FAIRSHARES MODEL: AN ETHICAL APPROACH TO SOCIAL ENTERPRISE DEVELOPMENT?

    OpenAIRE

    Ridley-Duff, R.

    2015-01-01

    This paper is based on the keynote address to the 14th International Association of Public and Non-Profit Marketing (IAPNM) conference. It explores the question "What impact do ethical values in the FairShares Model have on social entrepreneurial behaviour?" In the first part, three broad approaches to social enterprise are set out: co-operative and mutual enterprises (CMEs), social and responsible businesses (SRBs) and charitable trading activities (CTAs). The ethics that guide each approach ...

  7. Intelligent Transportation and Evacuation Planning A Modeling-Based Approach

    CERN Document Server

    Naser, Arab

    2012-01-01

    Intelligent Transportation and Evacuation Planning: A Modeling-Based Approach provides a new paradigm for evacuation planning strategies and techniques. Recently, evacuation planning and modeling have increasingly attracted interest among researchers as well as government officials. This interest stems from the recent catastrophic hurricanes and weather-related events that occurred in the southeastern United States (Hurricane Katrina and Rita). The evacuation methods that were in place before and during the hurricanes did not work well and resulted in thousands of deaths. This book offers insights into the methods and techniques that allow for implementing mathematical-based, simulation-based, and integrated optimization and simulation-based engineering approaches for evacuation planning. This book also: Comprehensively discusses the application of mathematical models for evacuation and intelligent transportation modeling Covers advanced methodologies in evacuation modeling and planning Discusses principles a...

  8. Dynamical system approach to running Λ cosmological models

    Energy Technology Data Exchange (ETDEWEB)

    Stachowski, Aleksander [Jagiellonian University, Astronomical Observatory, Krakow (Poland); Szydlowski, Marek [Jagiellonian University, Astronomical Observatory, Krakow (Poland); Jagiellonian University, Mark Kac Complex Systems Research Centre, Krakow (Poland)

    2016-11-15

    We study the dynamics of cosmological models with a time-dependent cosmological term. We consider five classes of models: two with the non-covariant parametrization of the cosmological term Λ: Λ(H)CDM cosmologies, Λ(a)CDM cosmologies, and three with the covariant parametrization of Λ: Λ(R)CDM cosmologies, where R(t) is the Ricci scalar, Λ(φ)-cosmologies with diffusion, Λ(X)-cosmologies, where X = (1/2)g^{αβ}∇_{α}∇_{β}φ is a kinetic part of the density of the scalar field. We also consider the case of an emergent Λ(a) relation obtained from the behaviour of trajectories in a neighbourhood of an invariant submanifold. In the study of the dynamics we used dynamical system methods for investigating how an evolutionary scenario can depend on the choice of special initial conditions. We show that the methods of dynamical systems allow one to investigate all admissible solutions of a running Λ cosmology for all initial conditions. We interpret Alcaniz and Lima's approach as a scaling cosmology. We formulate the idea of an emergent cosmological term derived directly from an approximation of the exact dynamics. We show that some non-covariant parametrization of the cosmological term like Λ(a), Λ(H) gives rise to the non-physical behaviour of trajectories in the phase space. This behaviour disappears if the term Λ(a) is emergent from the covariant parametrization. (orig.)

  9. A model selection approach to analysis of variance and covariance.

    Science.gov (United States)

    Alber, Susan A; Weiss, Robert E

    2009-06-15

    An alternative to analysis of variance is a model selection approach where every partition of the treatment means into clusters with equal value is treated as a separate model. The null hypothesis that all treatments are equal corresponds to the partition with all means in a single cluster. The alternative hypothesis corresponds to the set of all other partitions of treatment means. A model selection approach can also be used for a treatment by covariate interaction, where the null hypothesis and each alternative correspond to a partition of treatments into clusters with equal covariate effects. We extend the partition-as-model approach to simultaneous inference for both treatment main effect and treatment interaction with a continuous covariate with separate partitions for the intercepts and treatment-specific slopes. The model space is the Cartesian product of the intercept partition and the slope partition, and we develop five joint priors for this model space. In four of these priors the intercept and slope partition are dependent. We advise on setting priors over models, and we use the model to analyze an orthodontic data set that compares the frictional resistance created by orthodontic fixtures. Copyright (c) 2009 John Wiley & Sons, Ltd.
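
    A minimal sketch of the partition-as-model idea follows: every partition of a small set of treatments into equal-mean clusters is fitted and ranked, here with AIC as a simple stand-in for the paper's Bayesian priors over the model space, and on simulated rather than orthodontic data.

```python
# Partition-as-model in miniature: enumerate every partition of four treatments into
# equal-mean clusters, fit each partition, and rank them. AIC replaces the paper's
# priors over the model space, and the data are simulated.
import numpy as np

def set_partitions(items):
    """Generate all partitions of a list (Bell-number many)."""
    if len(items) == 1:
        yield [items]
        return
    first, rest = items[0], items[1:]
    for smaller in set_partitions(rest):
        for i, block in enumerate(smaller):          # put `first` into an existing block
            yield smaller[:i] + [[first] + block] + smaller[i + 1:]
        yield [[first]] + smaller                    # or into a block of its own

def aic_for_partition(partition, data):
    """Gaussian AIC when treatments in the same block share a common mean."""
    n, rss = 0, 0.0
    for block in partition:
        y = np.concatenate([data[t] for t in block])
        rss += np.sum((y - y.mean()) ** 2)
        n += len(y)
    k = len(partition) + 1                           # one mean per block + common variance
    sigma2 = rss / n
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    return 2 * k - 2 * loglik

rng = np.random.default_rng(0)
# Four treatments; A and B truly share a mean, C and D share another.
data = {"A": rng.normal(0.0, 1, 30), "B": rng.normal(0.1, 1, 30),
        "C": rng.normal(2.0, 1, 30), "D": rng.normal(2.1, 1, 30)}

scored = sorted(((aic_for_partition(p, data), p) for p in set_partitions(list(data))),
                key=lambda item: item[0])
for aic, p in scored[:3]:
    print(f"AIC = {aic:7.1f}   partition = {p}")
```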

  10. Towards a whole-cell modeling approach for synthetic biology

    Science.gov (United States)

    Purcell, Oliver; Jain, Bonny; Karr, Jonathan R.; Covert, Markus W.; Lu, Timothy K.

    2013-06-01

    Despite rapid advances over the last decade, synthetic biology lacks the predictive tools needed to enable rational design. Unlike established engineering disciplines, the engineering of synthetic gene circuits still relies heavily on experimental trial-and-error, a time-consuming and inefficient process that slows down the biological design cycle. This reliance on experimental tuning is because current modeling approaches are unable to make reliable predictions about the in vivo behavior of synthetic circuits. A major reason for this lack of predictability is that current models view circuits in isolation, ignoring the vast number of complex cellular processes that impinge on the dynamics of the synthetic circuit and vice versa. To address this problem, we present a modeling approach for the design of synthetic circuits in the context of cellular networks. Using the recently published whole-cell model of Mycoplasma genitalium, we examined the effect of adding genes into the host genome. We also investigated how codon usage correlates with gene expression and find agreement with existing experimental results. Finally, we successfully implemented a synthetic Goodwin oscillator in the whole-cell model. We provide an updated software framework for the whole-cell model that lays the foundation for the integration of whole-cell models with synthetic gene circuit models. This software framework is made freely available to the community to enable future extensions. We envision that this approach will be critical to transforming the field of synthetic biology into a rational and predictive engineering discipline.
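
    The Goodwin oscillator mentioned above can be sketched on its own in a few lines; the version below uses the classic three-variable negative-feedback equations with illustrative parameters and does not attempt to reproduce the coupling to the Mycoplasma genitalium whole-cell framework.

```python
# A standalone Goodwin oscillator (three-variable negative-feedback loop) solved with
# SciPy; parameters are illustrative and the whole-cell coupling is not reproduced.
import numpy as np
from scipy.integrate import solve_ivp

def goodwin(t, state, a=360.0, K=1.0, n=12, b=1.0, c=1.0, d=1.0, e=1.0, f=1.0):
    x, y, z = state                   # mRNA, protein, repressor
    dx = a / (K**n + z**n) - b * x    # transcription repressed by z
    dy = c * x - d * y                # translation
    dz = e * y - f * z                # repressor production
    return [dx, dy, dz]

sol = solve_ivp(goodwin, (0.0, 100.0), [0.1, 0.2, 2.5], dense_output=True, max_step=0.05)
t = np.linspace(0.0, 100.0, 2000)
x = sol.sol(t)[0]
# Rough period estimate from successive upward crossings of the mean mRNA level.
crossings = t[1:][(x[:-1] < x.mean()) & (x[1:] >= x.mean())]
print("approximate oscillation period:", np.round(np.diff(crossings).mean(), 2))
```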

  11. A transformation approach for collaboration based requirement models

    CERN Document Server

    Harbouche, Ahmed; Mokhtari, Aicha

    2012-01-01

    Distributed software engineering is widely recognized as a complex task. Among the inherent complexities is the process of obtaining a system design from its global requirement specification. This paper deals with such a transformation process and suggests an approach to derive the behavior of a given system's components, in the form of distributed Finite State Machines, from the global system requirements, in the form of an augmented UML Activity Diagrams notation. The process of the suggested approach is summarized in three steps: the definition of the appropriate source Meta-Model (requirements Meta-Model), the definition of the target Design Meta-Model and the definition of the rules to govern the transformation during the derivation process. The derivation process transforms the global system requirements described as UML diagram activities (extended with collaborations) to system roles behaviors represented as UML finite state machines. The approach is implemented using Atlas Transformation Language (ATL).

  12. A TRANSFORMATION APPROACH FOR COLLABORATION BASED REQUIREMENT MODELS

    Directory of Open Access Journals (Sweden)

    Ahmed Harbouche

    2012-02-01

    Full Text Available Distributed software engineering is widely recognized as a complex task. Among the inherent complexities is the process of obtaining a system design from its global requirement specification. This paper deals with such a transformation process and suggests an approach to derive the behavior of a given system's components, in the form of distributed Finite State Machines, from the global system requirements, in the form of an augmented UML Activity Diagrams notation. The process of the suggested approach is summarized in three steps: the definition of the appropriate source Meta-Model (requirements Meta-Model), the definition of the target Design Meta-Model and the definition of the rules to govern the transformation during the derivation process. The derivation process transforms the global system requirements described as UML diagram activities (extended with collaborations) to system roles behaviors represented as UML finite state machines. The approach is implemented using Atlas Transformation Language (ATL).

  13. An algebraic approach to modeling in software engineering

    Energy Technology Data Exchange (ETDEWEB)

    Loegel, G.J. [Superconducting Super Collider Lab., Dallas, TX (United States)]|[Michigan Univ., Ann Arbor, MI (United States); Ravishankar, C.V. [Michigan Univ., Ann Arbor, MI (United States)

    1993-09-01

    Our work couples the formalism of universal algebras with the engineering techniques of mathematical modeling to develop a new approach to the software engineering process. Our purpose in using this combination is twofold. First, abstract data types and their specification using universal algebras can be considered a common point between the practical requirements of software engineering and the formal specification of software systems. Second, mathematical modeling principles provide us with a means for effectively analyzing real-world systems. We first use modeling techniques to analyze a system and then represent the analysis using universal algebras. The rest of the software engineering process exploits properties of universal algebras that preserve the structure of our original model. This paper describes our software engineering process and our experience using it on both research and commercial systems. We need a new approach because current software engineering practices often deliver software that is difficult to develop and maintain. Formal software engineering approaches use universal algebras to describe "computer science" objects like abstract data types, but in practice software errors are often caused because "real-world" objects are improperly modeled. There is a large semantic gap between the customer's objects and abstract data types. In contrast, mathematical modeling uses engineering techniques to construct valid models for real-world systems, but these models are often implemented in an ad hoc manner. A combination of the best features of both approaches would enable software engineering to formally specify and develop software systems that better model real systems. Software engineering, like mathematical modeling, should concern itself first and foremost with understanding a real system and its behavior under given circumstances, and then with expressing this knowledge in an executable form.

  14. Genetic and Modeling Approaches Reveal Distinct Components of Impulsive Behavior.

    Science.gov (United States)

    Nautiyal, Katherine M; Wall, Melanie M; Wang, Shuai; Magalong, Valerie M; Ahmari, Susanne E; Balsam, Peter D; Blanco, Carlos; Hen, René

    2017-01-18

    Impulsivity is an endophenotype found in many psychiatric disorders including substance use disorders, pathological gambling, and attention deficit hyperactivity disorder. Two behavioral features often considered in impulsive behavior are behavioral inhibition (impulsive action) and delayed gratification (impulsive choice). However, the extent to which these behavioral constructs represent distinct facets of behavior with discrete biological bases is unclear. To test the hypothesis that impulsive action and impulsive choice represent statistically independent behavioral constructs in mice, we collected behavioral measures of impulsivity in a single cohort of mice using well-validated operant behavioral paradigms. Mice with manipulation of serotonin 1B receptor (5-HT1BR) expression were included as a model of disordered impulsivity. A factor analysis was used to characterize correlations between the measures of impulsivity and to identify covariates. Using two approaches, we dissociated impulsive action from impulsive choice. First, the absence of 5-HT1BRs caused increased impulsive action, but not impulsive choice. Second, based on an exploratory factor analysis, a two-factor model described the data well, with measures of impulsive action and choice separating into two independent factors. A multiple-indicator multiple-causes analysis showed that 5-HT1BR expression and sex were significant covariates of impulsivity. Males displayed increased impulsivity in both dimensions, whereas 5-HT1BR expression was a predictor of increased impulsive action only. These data support the conclusion that impulsive action and impulsive choice are distinct behavioral phenotypes with dissociable biological influences that can be modeled in mice. Our work may help inform better classification, diagnosis, and treatment of psychiatric disorders, which present with disordered impulsivity.Neuropsychopharmacology advance online publication, 18 January 2017; doi:10.1038/npp.2016.277.
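
    A small sketch of the exploratory factor analysis step is given below: scores for two hypothetical "action" measures and two hypothetical "choice" measures are simulated from independent latent traits, and a two-factor model recovers the separation. The measure names and generative model are assumptions, not the study's operant data.

```python
# Two-factor exploratory factor analysis on simulated impulsivity measures: two measures
# load on an "action" trait and two on a "choice" trait. All names and loadings are invented.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(1)
n = 300
action, choice = rng.normal(size=(2, n))              # independent latent traits
X = np.column_stack([
    0.9 * action + 0.3 * rng.normal(size=n),          # premature responses
    0.8 * action + 0.4 * rng.normal(size=n),          # go/no-go errors
    0.9 * choice + 0.3 * rng.normal(size=n),          # delay-discounting rate
    0.8 * choice + 0.4 * rng.normal(size=n),          # % smaller-sooner choices
])

fa = FactorAnalysis(n_components=2, rotation="varimax").fit(X)
measures = ["premature responses", "go/no-go errors", "discounting rate", "smaller-sooner %"]
for name, loadings in zip(measures, fa.components_.T):
    print(f"{name:20s} factor loadings: {np.round(loadings, 2)}")
```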

  15. Optimal design of supply chain network under uncertainty environment using hybrid analytical and simulation modeling approach

    Science.gov (United States)

    Chiadamrong, N.; Piyathanavong, V.

    2017-04-01

    Models that aim to optimize the design of supply chain networks have gained more interest in the supply chain literature. Mixed-integer linear programming and discrete-event simulation are widely used for such an optimization problem. We present a hybrid approach to support decisions for supply chain network design using a combination of analytical and discrete-event simulation models. The proposed approach is based on iterative procedures until the difference between subsequent solutions satisfies the pre-determined termination criteria. The effectiveness of the proposed approach is illustrated by an example, which shows results closer to optimal, with much shorter solving times, than those obtained from the conventional simulation-based optimization model. The efficacy of this proposed hybrid approach is promising and it can be applied as a powerful tool in designing a real supply chain network. It also provides the possibility to model and solve more realistic problems, which incorporate dynamism and uncertainty.
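
    The iterative analytical-simulation procedure can be sketched as a simple loop: an analytical model proposes a design from current parameter estimates, a stochastic simulation re-estimates those parameters for the proposed design, and the loop stops when successive solutions agree. In the toy below the "supply chain" is reduced to a single capacity decision with made-up costs and demand; it only illustrates the structure of the hybrid approach.

```python
# Hybrid analytical/simulation loop in miniature: a closed-form model proposes a capacity
# from current demand estimates, a stochastic simulation re-estimates demand under that
# capacity, and the procedure iterates until successive solutions agree.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)
HOLDING, SHORTAGE = 2.0, 25.0                     # cost per unit (illustrative)

def analytical_capacity(mean_demand, std_demand):
    """Newsvendor-style closed form given the current demand estimates."""
    critical_ratio = SHORTAGE / (SHORTAGE + HOLDING)
    return mean_demand + std_demand * norm.ppf(critical_ratio)

def simulate_demand(capacity, n_days=20000):
    """Discrete-event stand-in: realized demand depends weakly on the service level."""
    base = rng.gamma(shape=20.0, scale=5.0, size=n_days)       # mean ~100
    served_fraction = np.minimum(1.0, capacity / base)
    realized = base * (0.9 + 0.1 * served_fraction)            # lost-sales feedback
    return realized.mean(), realized.std()

mean_d, std_d = 100.0, 20.0                       # initial analytical estimates
for it in range(20):
    capacity = analytical_capacity(mean_d, std_d)
    new_mean, new_std = simulate_demand(capacity)
    if abs(new_mean - mean_d) < 0.5 and abs(new_std - std_d) < 0.5:
        break
    mean_d, std_d = new_mean, new_std
print(f"stopped after {it + 1} iterations with capacity ~ {capacity:.1f} units")
```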

  16. Teaching Service Modelling to a Mixed Class: An Integrated Approach

    Directory of Open Access Journals (Sweden)

    Jeremiah D. DENG

    2015-04-01

    Full Text Available Service modelling has become an increasingly important area in today's telecommunications and information systems practice. We have adapted a Network Design course in order to teach service modelling to a mixed class of students from both telecommunication engineering and information systems backgrounds. An integrated approach engaging mathematics teaching with strategies such as problem-solving, visualization, and the use of examples and simulations has been developed. Assessment of student learning outcomes indicates that the proposed course delivery approach succeeded in bringing out comparable and satisfactory performance from students of different educational backgrounds.

  17. A Spatial Clustering Approach for Stochastic Fracture Network Modelling

    Science.gov (United States)

    Seifollahi, S.; Dowd, P. A.; Xu, C.; Fadakar, A. Y.

    2014-07-01

    Fracture network modelling plays an important role in many application areas in which the behaviour of a rock mass is of interest. These areas include mining, civil, petroleum, water and environmental engineering and geothermal systems modelling. The aim is to model the fractured rock to assess fluid flow or the stability of rock blocks. One important step in fracture network modelling is to estimate the number of fractures and the properties of individual fractures such as their size and orientation. Due to the lack of data and the complexity of the problem, there are significant uncertainties associated with fracture network modelling in practice. Our primary interest is the modelling of fracture networks in geothermal systems and, in this paper, we propose a general stochastic approach to fracture network modelling for this application. We focus on using the seismic point cloud detected during the fracture stimulation of a hot dry rock reservoir to create an enhanced geothermal system; these seismic points are the conditioning data in the modelling process. The seismic points can be used to estimate the geographical extent of the reservoir, the amount of fracturing and the detailed geometries of fractures within the reservoir. The objective is to determine a fracture model from the conditioning data by minimizing the sum of the distances of the points from the fitted fracture model. Fractures are represented as line segments connecting two points in two-dimensional applications or as ellipses in three-dimensional (3D) cases. The novelty of our model is twofold: (1) it comprises a comprehensive fracture modification scheme based on simulated annealing and (2) it introduces new spatial approaches, a goodness-of-fit measure for the fitted fracture model, a measure for fracture similarity and a clustering technique for proposing a locally optimal solution for fracture parameters. We use a simulated dataset to demonstrate the application of the proposed approach
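
    The core fitting idea, minimizing the summed distance of seismic points to the fitted fractures, can be sketched with a plain simulated annealing loop over the endpoints of 2D line segments. The point cloud, cooling schedule and move sizes below are arbitrary, and the paper's modification scheme, similarity measure and clustering are not reproduced.

```python
# Fit 2D line-segment "fractures" to a synthetic seismic point cloud by simulated
# annealing on the summed point-to-segment distance. Everything here is illustrative.
import numpy as np

rng = np.random.default_rng(3)

def point_segment_distance(points, a, b):
    """Distance from each point to the segment a-b."""
    ab, ap = b - a, points - a
    t = np.clip((ap @ ab) / (ab @ ab), 0.0, 1.0)
    proj = a + t[:, None] * ab
    return np.linalg.norm(points - proj, axis=1)

def misfit(points, segments):
    d = np.min([point_segment_distance(points, a, b) for a, b in segments], axis=0)
    return d.sum()

# Synthetic "seismic" cloud scattered around two true fractures.
true = [(np.array([0.0, 0.0]), np.array([5.0, 5.0])),
        (np.array([0.0, 4.0]), np.array([5.0, 1.0]))]
points = np.vstack([a + rng.uniform(0, 1, (100, 1)) * (b - a) + rng.normal(0, 0.15, (100, 2))
                    for a, b in true])

# Start from random segments and anneal their endpoints.
segments = [(rng.uniform(0, 5, 2), rng.uniform(0, 5, 2)) for _ in range(2)]
energy = misfit(points, segments)
T = 1.0
for step in range(20000):
    i = rng.integers(len(segments))
    a, b = segments[i]
    cand = (a + rng.normal(0, 0.1, 2), b + rng.normal(0, 0.1, 2))
    trial = segments[:i] + [cand] + segments[i + 1:]
    e_new = misfit(points, trial)
    if e_new < energy or rng.random() < np.exp(-(e_new - energy) / T):
        segments, energy = trial, e_new
    T *= 0.9997                                   # geometric cooling
print(f"final summed point-to-fracture distance: {energy:.1f}")
```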

  18. Comparison of the Two-Hemisphere Model-Driven Approach to Other Methods for Model-Driven Software Development

    Directory of Open Access Journals (Sweden)

    Nikiforova Oksana

    2015-12-01

    Full Text Available Models are widely used not only in the computer science field, but also in other fields. They are an effective way to present relevant information conveniently. Model-driven software development uses models and transformations as first-class citizens. That makes software development phases more closely related to each other, and those links later help to make changes or modify the software product more freely. At the moment there are a lot of methods and techniques to create those models and transform them into each other. Since 2004, the authors have been developing the so-called 2HMD approach to bridge the gap between the problem domain and software components by using models and model transformation. The goal of this research is to compare different methods positioned for performing the same tasks as the 2HMD approach and to understand the state of the art in the area of model-driven software development.

  19. A Multiple Model Approach to Modeling Based on Fuzzy Support Vector Machines

    Institute of Scientific and Technical Information of China (English)

    冯瑞; 张艳珠; 宋春林; 邵惠鹤

    2003-01-01

    A new multiple-model (MM) approach was proposed to model complex industrial processes by using Fuzzy Support Vector Machines (F_SVMs). By applying the proposed approach to a pH neutralization titration experiment, the F_SVMs MM not only provides satisfactory approximation and generalization properties, but also achieves performance superior to the USOCPN multiple-modeling method and to a single model based on standard SVMs.

  20. Software sensors based on the grey-box modelling approach

    DEFF Research Database (Denmark)

    Carstensen, J.; Harremoës, P.; Strube, Rune

    1996-01-01

    In recent years the grey-box modelling approach has been applied to wastewater transportation and treatment. Grey-box models are characterized by the combination of deterministic and stochastic terms to form a model where all the parameters are statistically identifiable from the on-line measurements. With respect to the development of software sensors, the grey-box models possess two important features. Firstly, the on-line measurements can be filtered according to the grey-box model in order to remove noise deriving from the measuring equipment and controlling devices. Secondly, the grey-box models may contain terms which can be estimated on-line by use of the models and measurements. In this paper, it is demonstrated that many storage basins in sewer systems can be used as an on-line flow measurement provided that the basin is monitored on-line with a level transmitter and that a grey-box
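
    A sketch of the software-sensor idea for a storage basin follows: with the level measured on-line, the mass balance dV/dt = Q_in - Q_out lets the unmeasured inflow be estimated, here with a simple random-walk Kalman filter standing in for the grey-box model. The basin geometry, outflow law and noise levels are invented.

```python
# Software sensor in miniature: estimate an unmeasured basin inflow from noisy on-line
# level measurements using the mass balance dV/dt = Q_in - Q_out. A random-walk Kalman
# filter plays the role of the grey-box model; geometry, weir law and noise are invented.
import numpy as np

rng = np.random.default_rng(7)
dt, area = 60.0, 500.0                            # time step [s], basin area [m^2]

def outflow(level):
    return 0.8 * np.sqrt(max(level, 0.0))         # assumed weir/orifice law [m^3/s]

# Simulate "true" basin behaviour and a noisy level transmitter.
n = 600
q_in_true = 0.3 + 0.25 * np.sin(2 * np.pi * np.arange(n) / 200)   # m^3/s
level, levels = 1.0, []
for q in q_in_true:
    level += dt / area * (q - outflow(level))
    levels.append(level + rng.normal(0.0, 0.01))

# Extended Kalman filter, state = [level, q_in]; q_in is modelled as a random walk
# (the stochastic term), the level follows the mass balance (the deterministic term).
x, P = np.array([1.0, 0.2]), np.eye(2)
Q, R = np.diag([1e-6, 1e-4]), 0.01 ** 2
estimates = []
for z in levels:
    F = np.array([[1.0 - dt / area * 0.4 / np.sqrt(max(x[0], 1e-3)), dt / area],
                  [0.0, 1.0]])                    # Jacobian of the prediction step
    x = np.array([x[0] + dt / area * (x[1] - outflow(x[0])), x[1]])
    P = F @ P @ F.T + Q
    H = np.array([[1.0, 0.0]])                    # only the level is measured
    K = P @ H.T / (H @ P @ H.T + R)
    x = x + (K * (z - x[0])).ravel()
    P = (np.eye(2) - K @ H) @ P
    estimates.append(x[1])
err = np.mean(np.abs(np.array(estimates)[100:] - q_in_true[100:]))
print(f"mean absolute inflow error after warm-up: {err:.3f} m^3/s")
```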

  1. Environmental Radiation Effects on Mammals A Dynamical Modeling Approach

    CERN Document Server

    Smirnova, Olga A

    2010-01-01

    This text is devoted to the theoretical studies of radiation effects on mammals. It uses the framework of developed deterministic mathematical models to investigate the effects of both acute and chronic irradiation in a wide range of doses and dose rates on vital body systems including hematopoiesis, small intestine and humoral immunity, as well as on the development of autoimmune diseases. Thus, these models can contribute to the development of the system and quantitative approaches in radiation biology and ecology. This text is also of practical use. Its modeling studies of the dynamics of granulocytopoiesis and thrombocytopoiesis in humans testify to the efficiency of employment of the developed models in the investigation and prediction of radiation effects on these hematopoietic lines. These models, as well as the properly identified models of other vital body systems, could provide a better understanding of the radiation risks to health. The modeling predictions will enable the implementation of more ef...

  2. The standard data model approach to patient record transfer.

    Science.gov (United States)

    Canfield, K; Silva, M; Petrucci, K

    1994-01-01

    This paper develops an approach to electronic data exchange of patient records from Ambulatory Encounter Systems (AESs). This approach assumes that the AES is based upon a standard data model. The data modeling standard used here is IDEF1X for Entity/Relationship (E/R) modeling. Each site that uses a relational database implementation of this standard data model (or a subset of it) can exchange very detailed patient data with other such sites using industry standard tools and without excessive programming efforts. This design is detailed below for a demonstration project between the research-oriented geriatric clinic at the Baltimore Veterans Affairs Medical Center (BVAMC) and the Laboratory for Healthcare Informatics (LHI) at the University of Maryland.

  3. Model selection and inference a practical information-theoretic approach

    CERN Document Server

    Burnham, Kenneth P

    1998-01-01

    This book is unique in that it covers the philosophy of model-based data analysis and an omnibus strategy for the analysis of empirical data. The book introduces information theoretic approaches and focuses critical attention on a priori modeling and the selection of a good approximating model that best represents the inference supported by the data. Kullback-Leibler information represents a fundamental quantity in science and is Hirotugu Akaike's basis for model selection. The maximized log-likelihood function can be bias-corrected to provide an estimate of expected, relative Kullback-Leibler information. This leads to Akaike's Information Criterion (AIC) and various extensions, and these are relatively simple and easy to use in practice, but little taught in statistics classes and far less understood in the applied sciences than should be the case. The information theoretic approaches provide a unified and rigorous theory, an extension of likelihood theory, an important application of information theory, and are ...
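
    The key quantities, AIC, its small-sample correction AICc and Akaike weights, take only a few lines to compute; the log-likelihoods and parameter counts below are placeholders.

```python
# AIC, its small-sample correction AICc, and Akaike weights for a set of candidate
# models. The log-likelihoods and parameter counts are placeholders.
import numpy as np

def aic(loglik, k):
    return 2 * k - 2 * loglik

def aicc(loglik, k, n):
    return aic(loglik, k) + 2 * k * (k + 1) / (n - k - 1)

# Three hypothetical candidate models fitted to the same n = 40 observations.
candidates = {"M1": (-52.3, 2), "M2": (-49.1, 4), "M3": (-48.8, 7)}
scores = {m: aicc(ll, k, n=40) for m, (ll, k) in candidates.items()}
best = min(scores.values())
weights = {m: np.exp(-0.5 * (s - best)) for m, s in scores.items()}
total = sum(weights.values())
for m, s in scores.items():
    print(f"{m}: AICc = {s:6.1f}  delta = {s - best:4.1f}  Akaike weight = {weights[m] / total:.2f}")
```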

  4. Real-space renormalization group approach to the Anderson model

    Science.gov (United States)

    Campbell, Eamonn

    Many of the most interesting electronic behaviours currently being studied are associated with strong correlations. In addition, many of these materials are disordered either intrinsically or due to doping. Solving interacting systems exactly is extremely computationally expensive, and approximate techniques developed for strongly correlated systems are not easily adapted to include disorder. As a non-interacting disordered model, it makes sense to consider the Anderson model as a first step in developing an approximate method of solution to the interacting and disordered Anderson-Hubbard model. Our renormalization group (RG) approach is modeled on that proposed by Johri and Bhatt [23]. We found an error in their work which we have corrected in our procedure. After testing the execution of the RG, we benchmarked the density of states and inverse participation ratio results against exact diagonalization. Our approach is significantly faster than exact diagonalization and is most accurate in the limit of strong disorder.

  5. Model Convolution: A Computational Approach to Digital Image Interpretation

    Science.gov (United States)

    Gardner, Melissa K.; Sprague, Brian L.; Pearson, Chad G.; Cosgrove, Benjamin D.; Bicek, Andrew D.; Bloom, Kerry; Salmon, E. D.

    2010-01-01

    Digital fluorescence microscopy is commonly used to track individual proteins and their dynamics in living cells. However, extracting molecule-specific information from fluorescence images is often limited by the noise and blur intrinsic to the cell and the imaging system. Here we discuss a method called “model-convolution,” which uses experimentally measured noise and blur to simulate the process of imaging fluorescent proteins whose spatial distribution cannot be resolved. We then compare model-convolution to the more standard approach of experimental deconvolution. In some circumstances, standard experimental deconvolution approaches fail to yield the correct underlying fluorophore distribution. In these situations, model-convolution removes the uncertainty associated with deconvolution and therefore allows direct statistical comparison of experimental and theoretical data. Thus, if there are structural constraints on molecular organization, the model-convolution method better utilizes information gathered via fluorescence microscopy, and naturally integrates experiment and theory. PMID:20461132
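
    A sketch of the model-convolution idea: a hypothesized fluorophore distribution is blurred with the (here assumed Gaussian) point-spread function and corrupted with shot and read noise, producing a simulated image that can be compared statistically with the experimental one. The PSF width and noise levels are placeholders for experimentally measured values.

```python
# Model-convolution sketch: blur a hypothesised fluorophore distribution with a Gaussian
# stand-in for the measured PSF and add shot and camera noise. All values are placeholders.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)

# Model: fluorophores placed along a 1-pixel-wide line (e.g. a microtubule segment).
truth = np.zeros((64, 64))
truth[32, 10:54] = 1.0

def model_convolve(model, psf_sigma=2.0, photons=500.0, background=100.0, read_noise=3.0):
    blurred = gaussian_filter(model * photons, psf_sigma)     # optical blur (measured PSF)
    image = rng.poisson(blurred + background).astype(float)   # shot noise
    image += rng.normal(0.0, read_noise, model.shape)         # camera read noise
    return image

simulated = model_convolve(truth)
print("peak/background ratio in simulated image:",
      round(simulated.max() / np.median(simulated), 1))
```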

  6. A new approach of high speed cutting modelling: SPH method

    OpenAIRE

    LIMIDO, Jérôme; Espinosa, Christine; Salaün, Michel; Lacome, Jean-Luc

    2006-01-01

    The purpose of this study is to introduce a new approach of high speed cutting numerical modelling. A lagrangian Smoothed Particle Hydrodynamics (SPH) based model is carried out using the Ls-Dyna software. SPH is a meshless method, thus large material distortions that occur in the cutting problem are easily managed and SPH contact control permits a “natural” workpiece/chip separation. Estimated chip morphology and cutting forces are compared to machining dedicated code results and experimenta...

  7. Schwinger boson approach to the fully screened Kondo model.

    Science.gov (United States)

    Rech, J; Coleman, P; Zarand, G; Parcollet, O

    2006-01-13

    We apply the Schwinger boson scheme to the fully screened Kondo model and generalize the method to include antiferromagnetic interactions between ions. Our approach captures the Kondo crossover from local moment behavior to a Fermi liquid with a nontrivial Wilson ratio. When applied to the two-impurity model, the mean-field theory describes the "Varma-Jones" quantum phase transition between a valence bond state and a heavy Fermi liquid.

  8. Kallen Lehman approach to 3D Ising model

    Science.gov (United States)

    Canfora, F.

    2007-03-01

    A “Kallen-Lehman” approach to Ising model, inspired by quantum field theory à la Regge, is proposed. The analogy with the Kallen-Lehman representation leads to a formula for the free-energy of the 3D model with few free parameters which could be matched with the numerical data. The possible application of this scheme to the spin glass case is shortly discussed.

  9. Modelling approaches in sedimentology: Introduction to the thematic issue

    Science.gov (United States)

    Joseph, Philippe; Teles, Vanessa; Weill, Pierre

    2016-09-01

    As an introduction to this thematic issue on "Modelling approaches in sedimentology", this paper gives an overview of the workshop held in Paris on 7 November 2013 during the 14th Congress of the French Association of Sedimentologists. A synthesis of the workshop in terms of concepts, spatial and temporal scales, constraining data, and scientific challenges is first presented, then a discussion on the possibility of coupling different models, the industrial needs, and the new potential domains of research is exposed.

  10. Computational Models of Spreadsheet Development: Basis for Educational Approaches

    CERN Document Server

    Hodnigg, Karin; Mittermeir, Roland T

    2008-01-01

    Among the multiple causes of high error rates in spreadsheets, lack of proper training and of deep understanding of the computational model upon which spreadsheet computations rest might not be the least issue. The paper addresses this problem by presenting a didactical model focussing on cell interaction, thus exceeding the atomicity of cell computations. The approach is motivated by an investigation how different spreadsheet systems handle certain computational issues implied from moving cells, copy-paste operations, or recursion.

  11. Modeling Water Shortage Management Using an Object-Oriented Approach

    Science.gov (United States)

    Wang, J.; Senarath, S.; Brion, L.; Niedzialek, J.; Novoa, R.; Obeysekera, J.

    2007-12-01

    As a result of the increasing global population and the resulting urbanization, water shortage issues have received increased attention throughout the world. Water supply has not been able to keep up with increased demand for water, especially during times of drought. The use of an object-oriented (OO) approach coupled with efficient mathematical models is an effective tool in addressing discrepancies between water supply and demand. Object-oriented modeling has been proven powerful and efficient in simulating natural behavior. This research presents a way to model water shortage management using the OO approach. Three groups of conceptual components using the OO approach are designed for the management model. The first group encompasses evaluation of natural behaviors and possible related management options. This evaluation includes assessing any discrepancy that might exist between water demand and supply. The second group is for decision making, which includes the determination of water-use cutback amount and duration using established criteria. The third group is for implementation of the management options, which are restrictions of water usage at a local or regional scale. The loop is closed through a feedback mechanism where continuity in the time domain is established. As in many other regions, drought management is very important in south Florida. The Regional Simulation Model (RSM) is a finite volume, fully integrated hydrologic model used by the South Florida Water Management District to evaluate regional response to various planning alternatives including drought management. A trigger module was developed for RSM that encapsulates the OO approach to water shortage management. Rigorous testing of the module was performed using historical south Florida conditions. Keywords: Object-oriented, modeling, water shortage management, trigger module, Regional Simulation Model
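
    The three conceptual groups and the feedback loop can be sketched with a few objects; the class names, thresholds and cutback levels below are illustrative and do not mirror the actual design of the RSM trigger module.

```python
# The three conceptual groups as objects with a feedback loop: evaluate the supply/demand
# discrepancy, decide a cutback from criteria, apply the restriction, and feed the result
# into the next step. Names, thresholds and cutback levels are illustrative only.
from dataclasses import dataclass

@dataclass
class Evaluation:                                  # group 1: assess natural behaviour
    supply: float
    demand: float
    def discrepancy(self) -> float:
        return max(0.0, self.demand - self.supply)

class Decision:                                    # group 2: choose a cutback from criteria
    phase_thresholds = (0.05, 0.15, 0.30)          # fraction of demand unmet
    cutbacks = (0.0, 0.10, 0.25, 0.45)             # imposed demand reduction
    def cutback(self, evaluation: Evaluation) -> float:
        shortfall = evaluation.discrepancy() / evaluation.demand
        phase = sum(shortfall > t for t in self.phase_thresholds)
        return self.cutbacks[phase]

class Implementation:                              # group 3: apply restriction, close the loop
    def apply(self, demand: float, cutback: float) -> float:
        return demand * (1.0 - cutback)

supply_series = [100, 90, 70, 60, 75, 95]          # drought and recovery (arbitrary units)
demand, decision, implementation = 100.0, Decision(), Implementation()
for week, supply in enumerate(supply_series):
    cut = decision.cutback(Evaluation(supply=supply, demand=demand))
    restricted = implementation.apply(demand, cut)
    print(f"week {week}: supply={supply:5.1f}  cutback={cut:.0%}  restricted demand={restricted:5.1f}")
    demand = 0.5 * demand + 0.5 * max(restricted, supply)   # feedback into the next step
```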

  12. Urban Modelling with Typological Approach. Case Study: Merida, Yucatan, Mexico

    Science.gov (United States)

    Rodriguez, A.

    2017-08-01

    In three-dimensional models of urban historical reconstruction, lost contextual architecture is difficult to represent because, in contrast to the most important monuments, it has few written references. This is the case of Merida, Yucatan, Mexico during the Colonial Era (1542-1810), which has lost much of its heritage. An alternative for offering a hypothetical view of these elements is a typological-parametric definition that allows a 3D modeling approach to the most common features of this heritage evidence.

  13. Comparison of Modeling and Experimental Approaches for Improved Modeling of Filtration in Granular and Consolidated Media

    Science.gov (United States)

    Mirabolghasemi, M.; Prodanovic, M.; DiCarlo, D. A.

    2014-12-01

    Filtration is relevant to many disciplines, from colloid transport in environmental engineering to formation damage in petroleum engineering. In this study we compare the results of novel numerical modeling of the filtration phenomenon on the pore scale with complementary experimental observations on the laboratory scale, and discuss how the results of the comparison can be used to improve macroscale filtration models for different porous media. The water suspension contains glass beads of 200 micron diameter and flows through a packing of 1 mm diameter glass beads; thus the main filtration mechanism is straining and jamming of particles. The numerical model simulates the flow of suspension through a realistic 3D structure of an imaged, disordered sphere pack, which acts as the filter medium. Particle capture through size exclusion and jamming is modeled via a coupled Discrete Element Method (DEM) and Computational Fluid Dynamics (CFD) approach. The coupled CFD-DEM approach is capable of modeling the majority of particle-particle, particle-wall, and particle-fluid interactions. Note that most traditional approaches require spherical particles both in the suspension and in the filtration medium. We adapted the interface between the pore space and the spherical grains to be represented as a triangulated surface, which allows extensions to any imaged medium. The numerical and experimental results show that the filtration coefficient of the sphere pack is a function of the flow rate and concentration of the suspension, even for constant total particle flow rate. An increase in the suspension flow rate results in a decrease in the filtration coefficient, which suggests that the hydrodynamic drag force plays the key role in hindering particle capture in random sphere packs. Further, similar simulations of suspension flow through a sandstone sample, which has a tighter pore space, show that the filtration coefficient remains almost constant at different suspension flow rates. This
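
    For reference, the filtration coefficient discussed above can be backed out of simulated or measured inlet/outlet particle counts through the depth-filtration relation C(L) = C_in·exp(-λL); the sketch below does this for invented counts chosen to mimic the reported decrease of λ with flow rate.

```python
# Back out a filtration coefficient from inlet/outlet particle counts using the
# depth-filtration relation C(L) = C_in * exp(-lambda * L). The counts are invented
# to mimic the reported trend of lambda dropping as the suspension flow rate rises.
import numpy as np

L = 0.05                                   # filter (sphere-pack) length, m
# flow rate [mL/min], particles injected, particles recovered at the outlet
runs = [(1.0, 10000, 2100),
        (5.0, 10000, 3900),
        (20.0, 10000, 6300)]

for q, n_in, n_out in runs:
    lam = -np.log(n_out / n_in) / L        # filtration coefficient, 1/m
    print(f"flow {q:5.1f} mL/min  ->  lambda = {lam:5.1f} 1/m")
```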

  14. Point-coupling models from mesonic hyper massive limit and mean-field approaches

    Energy Technology Data Exchange (ETDEWEB)

    Lourenco, O.; Dutra, M., E-mail: odilon@ita.br [Departamento de Fisica, Instituto Tecnologico da Aeronautica - CTA, Sao Jose dos Campos, SP (Brazil); Delfino, Antonio, E-mail: delfino@if.uff.br [Instituto de Fisica, Universidade Federal Fluminense, Niteroi, RJ (Brazil); Amaral, R.L.P.G. [Center for Theoretical Physics, Massachusetts Institute of Technology, Cambridge, MA (United States)

    2012-08-15

    In this work, we show how nonlinear point coupling models, described by a Lagrangian density containing only terms up to fourth order in the fermion condensate (Ψ̄Ψ), are derived from a modified meson exchange nonlinear Walecka model. We present two methods of derivation, namely the hyper massive meson limit within a functional integral approach and the mean-field approximation, in which equations of state at zero temperature of the nonlinear point-coupling models are directly obtained. (author)

  15. Similarity transformation approach to identifiability analysis of nonlinear compartmental models.

    Science.gov (United States)

    Vajda, S; Godfrey, K R; Rabitz, H

    1989-04-01

    Through use of the local state isomorphism theorem instead of the algebraic equivalence theorem of linear systems theory, the similarity transformation approach is extended to nonlinear models, resulting in finitely verifiable sufficient and necessary conditions for global and local identifiability. The approach requires testing of certain controllability and observability conditions, but in many practical examples these conditions prove very easy to verify. In principle the method also involves nonlinear state variable transformations, but in all of the examples presented in the paper the transformations turn out to be linear. The method is applied to an unidentifiable nonlinear model and a locally identifiable nonlinear model, and these are the first nonlinear models other than bilinear models where the reason for lack of global identifiability is nontrivial. The method is also applied to two models with Michaelis-Menten elimination kinetics, both of considerable importance in pharmacokinetics, and for both of which the complicated nature of the algebraic equations arising from the Taylor series approach has hitherto defeated attempts to establish identifiability results for specific input functions.

  16. The Generalised Ecosystem Modelling Approach in Radiological Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Klos, Richard

    2008-03-15

    An independent modelling capability is required by SSI in order to evaluate dose assessments carried out in Sweden by, amongst others, SKB. The main focus is the evaluation of the long-term radiological safety of radioactive waste repositories for both spent fuel and low-level radioactive waste. To meet the requirement for an independent modelling tool for use in biosphere dose assessments, SSI through its modelling team CLIMB commissioned the development of a new model in 2004, a project to produce an integrated model of radionuclides in the landscape. The generalised ecosystem modelling approach (GEMA) is the result. GEMA is a modular system of compartments representing the surface environment. It can be configured, through water and solid material fluxes, to represent local details in the range of ecosystem types found in the past, present and future Swedish landscapes. The approach is generic but fine tuning can be carried out using local details of the surface drainage system. The modular nature of the modelling approach means that GEMA modules can be linked to represent large scale surface drainage features over an extended domain in the landscape. System change can also be managed in GEMA, allowing a flexible and comprehensive model of the evolving landscape to be constructed. Environmental concentrations of radionuclides can be calculated and the GEMA dose pathway model provides a means of evaluating the radiological impact of radionuclide release to the surface environment. This document sets out the philosophy and details of GEMA and illustrates the functioning of the model with a range of examples featuring the recent CLIMB review of SKB's SR-Can assessment

  17. Reliability assessment using degradation models: bayesian and classical approaches

    Directory of Open Access Journals (Sweden)

    Marta Afonso Freitas

    2010-04-01

    Full Text Available Traditionally, reliability assessment of devices has been based on (accelerated) life tests. However, for highly reliable products, little information about reliability is provided by life tests in which few or no failures are typically observed. Since most failures arise from a degradation mechanism at work for which there are characteristics that degrade over time, one alternative is to monitor the device for a period of time and assess its reliability from the changes in performance (degradation) observed during that period. The goal of this article is to illustrate how degradation data can be modeled and analyzed by using "classical" and Bayesian approaches. Four methods of data analysis based on classical inference are presented. Next we show how Bayesian methods can also be used to provide a natural approach to analyzing degradation data. The approaches are applied to a real data set regarding train wheel degradation.

  18. A vector relational data modeling approach to Insider threat intelligence

    Science.gov (United States)

    Kelly, Ryan F.; Anderson, Thomas S.

    2016-05-01

    We address the problem of detecting insider threats before they can do harm. In many cases, co-workers notice indications of suspicious activity prior to insider threat attacks. A partial solution to this problem requires an understanding of how information can better traverse the communication network between human intelligence and insider threat analysts. Our approach employs modern mobile communications technology and scale free network architecture to reduce the network distance between human sensors and analysts. In order to solve this problem, we propose a Vector Relational Data Modeling approach to integrate human "sensors," geo-location, and existing visual analytics tools. This integration problem is known to be difficult due to quadratic increases in cost associated with complex integration solutions. A scale free network integration approach using vector relational data modeling is proposed as a method for reducing network distance without increasing cost.

  19. A discrete Lagrangian based direct approach to macroscopic modelling

    Science.gov (United States)

    Sarkar, Saikat; Nowruzpour, Mohsen; Reddy, J. N.; Srinivasa, A. R.

    2017-01-01

    A direct discrete Lagrangian based approach, designed at a length scale of interest, to characterize the response of a body is proposed. The main idea is to understand the dynamics of a deformable body via a Lagrangian corresponding to a coupled interaction of rigid particles in the reduced dimension. We argue that the usual practice of describing the laws of a deformable body in the continuum limit is redundant, because for most of the practical problems, analytical solutions are not available. Since continuum limit is not taken, the framework automatically relaxes the requirement of differentiability of field variables. The discrete Lagrangian based approach is illustrated by deriving an equivalent of the Euler-Bernoulli beam model. A few test examples are solved, which demonstrate that the derived non-local model predicts lower deflections in comparison to classical Euler-Bernoulli beam solutions. We have also included crack propagation in thin structures for isotropic and anisotropic cases using the Lagrangian based approach.

  20. Reconciliation with oneself and with others: From approach to model

    Directory of Open Access Journals (Sweden)

    Nikolić-Ristanović Vesna

    2010-01-01

    Full Text Available The paper intends to present the approach to dealing with war and its consequences that was developed within the Victimology Society of Serbia over the last five years, in the framework of the Association Joint Action for Truth and Reconciliation (ZAIP). First, a short review of the Association and of the process through which the ZAIP approach to dealing with the past was developed is presented. Then, a detailed description of the approach itself, identifying its most important specificities, is presented. In the conclusion, next steps are suggested, aimed at developing a model of reconciliation that is based on the ZAIP approach and appropriate to the social context of Serbia and its surroundings.

  1. EXTENDED MODEL OF COMPETITIVITY THROUGH APPLICATION OF NEW APPROACH DIRECTIVES

    Directory of Open Access Journals (Sweden)

    Slavko Arsovski

    2009-03-01

    The basic subject of this work is a model of the impact of the New Approach on product quality and safety and on the competitiveness of our companies. The work rests on hypotheses drawn from expert experience, since the infrastructure for applying the New Approach directives has not been examined until now: it is not known which Serbian products or industries are covered by the New Approach directives and the CE mark, nor what the effects of using the CE mark are. The work should indicate existing reserves in quality and product safety, the level of possible improvement in competitiveness, and the potential increase in profit from discharging the requirements of the New Approach directives.

  2. Vibro-acoustics of porous materials - waveguide modeling approach

    DEFF Research Database (Denmark)

    Darula, Radoslav; Sorokin, Sergey V.

    2016-01-01

    The porous material is considered as a compound multi-layered waveguide (i.e. a fluid layer surrounded with elastic layers) with traction free boundary conditions. The attenuation of the vibro-acoustic waves in such a material is assessed. This approach is compared with a conventional Biot's model...... in porous materials....

  3. A novel Monte Carlo approach to hybrid local volatility models

    NARCIS (Netherlands)

    A.W. van der Stoep (Anton); L.A. Grzelak (Lech Aleksander); C.W. Oosterlee (Cornelis)

    2017-01-01

    We present, in a Monte Carlo simulation framework, a novel approach for the evaluation of hybrid local volatility [Risk, 1994, 7, 18–20], [Int. J. Theor. Appl. Finance, 1998, 1, 61–110] models. In particular, we consider the stochastic local volatility model—see e.g. Lipton et al. [Quant.

  4. Teaching Modeling with Partial Differential Equations: Several Successful Approaches

    Science.gov (United States)

    Myers, Joseph; Trubatch, David; Winkel, Brian

    2008-01-01

    We discuss the introduction and teaching of partial differential equations (heat and wave equations) via modeling physical phenomena, using a new approach that encompasses constructing difference equations and implementing these in a spreadsheet, numerically solving the partial differential equations using the numerical differential equation…

  5. A Behavioral Decision Making Modeling Approach Towards Hedging Services

    NARCIS (Netherlands)

    Pennings, J.M.E.; Candel, M.J.J.M.; Egelkraut, T.M.

    2003-01-01

    This paper takes a behavioral approach toward the market for hedging services. A behavioral decision-making model is developed that provides insight into how and why owner-managers decide the way they do regarding hedging services. Insight into those choice processes reveals information needed by fi

  6. A fuzzy approach to the Weighted Overlap Dominance model

    DEFF Research Database (Denmark)

    Franco de los Rios, Camilo Andres; Hougaard, Jens Leth; Nielsen, Kurt

    2013-01-01

    in an interactive way, where input data can take the form of uniquely-graded or interval-valued information. Here we explore the Weighted Overlap Dominance (WOD) model from a fuzzy perspective and its outranking approach to decision support and multidimensional interval analysis. Firstly, imprecision measures...

  7. Methodological Approach for Modeling of Multienzyme in-pot Processes

    DEFF Research Database (Denmark)

    Andrade Santacoloma, Paloma de Gracia; Roman Martinez, Alicia; Sin, Gürkan;

    2011-01-01

    This paper presents a methodological approach for modeling multi-enzyme in-pot processes. The methodology is exemplified stepwise through the bi-enzymatic production of N-acetyl-D-neuraminic acid (Neu5Ac) from N-acetyl-D-glucosamine (GlcNAc). In this case study, sensitivity analysis is also used...

  8. Towards modeling future energy infrastructures - the ELECTRA system engineering approach

    DEFF Research Database (Denmark)

    Uslar, Mathias; Heussen, Kai

    2016-01-01

    Within this contribution, we provide an overview based on previous work conducted in the ELECTRA project to come up with a consistent method for modeling the ELECTRA WoC approach according to the methods established with the M/490 mandate of the European Commission. We will motivate the use of th...

  9. Pruning Chinese trees : an experimental and modelling approach

    NARCIS (Netherlands)

    Zeng, Bo

    2002-01-01

    Pruning of trees, in which some branches are removed from the lower crown of a tree, has been extensively used in China in silvicultural management for many purposes. With an experimental and modelling approach, the effects of pruning on tree growth and on the harvest of plant material were studied.

  10. Teaching Modeling with Partial Differential Equations: Several Successful Approaches

    Science.gov (United States)

    Myers, Joseph; Trubatch, David; Winkel, Brian

    2008-01-01

    We discuss the introduction and teaching of partial differential equations (heat and wave equations) via modeling physical phenomena, using a new approach that encompasses constructing difference equations and implementing these in a spreadsheet, numerically solving the partial differential equations using the numerical differential equation…

  11. A Metacognitive-Motivational Model of Surface Approach to Studying

    Science.gov (United States)

    Spada, Marcantonio M.; Moneta, Giovanni B.

    2012-01-01

    In this study, we put forward and tested a model of how surface approach to studying during examination preparation is influenced by the trait variables of motivation and metacognition and the state variables of avoidance coping and evaluation anxiety. A sample of 528 university students completed, one week before examinations, the following…

  12. A New Approach for Testing the Rasch Model

    Science.gov (United States)

    Kubinger, Klaus D.; Rasch, Dieter; Yanagida, Takuya

    2011-01-01

    Though calibration of an achievement test within psychological and educational context is very often carried out by the Rasch model, data sampling is hardly designed according to statistical foundations. However, Kubinger, Rasch, and Yanagida (2009) recently suggested an approach for the determination of sample size according to a given Type I and…

  13. Comparing State SAT Scores Using a Mixture Modeling Approach

    Science.gov (United States)

    Kim, YoungKoung Rachel

    2009-01-01

    Presented at the national conference for AERA (American Educational Research Association) in April 2009. The large variability of SAT taker population across states makes state-by-state comparisons of the SAT scores challenging. Using a mixture modeling approach, therefore, the current study presents a method of identifying subpopulations in terms…
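
    The record does not give the paper's exact mixture specification, so the sketch below uses a generic two-component Gaussian mixture on simulated scores purely to illustrate how subpopulations of test takers can be identified; the component count and the data are assumptions.

```python
# Generic illustration of identifying score subpopulations with a mixture model.
# The two-component Gaussian mixture and the simulated scores are assumptions.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
# simulated SAT-like scores drawn from two latent subpopulations
scores = np.concatenate([rng.normal(480, 60, 600),
                         rng.normal(590, 50, 400)]).reshape(-1, 1)

gmm = GaussianMixture(n_components=2, random_state=0).fit(scores)
print("component means:  ", gmm.means_.ravel().round(1))
print("component weights:", gmm.weights_.round(2))
labels = gmm.predict(scores)        # subpopulation membership per test taker
```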

  14. The Bipolar Approach: A Model for Interdisciplinary Art History Courses.

    Science.gov (United States)

    Calabrese, John A.

    1993-01-01

    Describes a college level art history course based on the opposing concepts of Classicism and Romanticism. Contends that all creative work, such as film or architecture, can be categorized according to this bipolar model. Includes suggestions for objects to study and recommends this approach for art education at all education levels. (CFR)

  15. Non-frontal model based approach to forensic face recognition

    NARCIS (Netherlands)

    Dutta, Abhishek; Veldhuis, Raymond; Spreeuwers, Luuk

    2012-01-01

    In this paper, we propose a non-frontal model based approach which ensures that a face recognition system always gets to compare images having similar view (or pose). This requires a virtual suspect reference set that consists of non-frontal suspect images having pose similar to the surveillance vie

  16. Smeared crack modelling approach for corrosion-induced concrete damage

    DEFF Research Database (Denmark)

    Thybo, Anna Emilie Anusha; Michel, Alexander; Stang, Henrik

    2017-01-01

    compared to experimental data obtained by digital image correlation and published in the literature. Excellent agreements between experimentally observed and numerically predicted crack patterns at the micro and macro scale indicate the capability of the modelling approach to accurately capture corrosion...

  17. Towards modeling future energy infrastructures - the ELECTRA system engineering approach

    DEFF Research Database (Denmark)

    Uslar, Mathias; Heussen, Kai

    2016-01-01

    Within this contribution, we provide an overview based on previous work conducted in the ELECTRA project to come up with a consistent method for modeling the ELECTRA WoC approach according to the methods established with the M/490 mandate of the European Commission. We will motivate the use...

  18. Atomistic approach for modeling metal-semiconductor interfaces

    DEFF Research Database (Denmark)

    Stradi, Daniele; Martinez, Umberto; Blom, Anders

    2016-01-01

    realistic metal-semiconductor interfaces and allows for a direct comparison between theory and experiments via the I–V curve. In particular, it will be demonstrated how doping — and bias — modifies the Schottky barrier, and how finite size models (the slab approach) are unable to describe these interfaces...

  19. Modelling individual differences in the form of Pavlovian conditioned approach responses: a dual learning systems approach with factored representations.

    Science.gov (United States)

    Lesaint, Florian; Sigaud, Olivier; Flagel, Shelly B; Robinson, Terry E; Khamassi, Mehdi

    2014-02-01

    Reinforcement Learning has greatly influenced models of conditioning, providing powerful explanations of acquired behaviour and underlying physiological observations. However, in recent autoshaping experiments in rats, variation in the form of Pavlovian conditioned responses (CRs) and associated dopamine activity, have questioned the classical hypothesis that phasic dopamine activity corresponds to a reward prediction error-like signal arising from a classical Model-Free system, necessary for Pavlovian conditioning. Over the course of Pavlovian conditioning using food as the unconditioned stimulus (US), some rats (sign-trackers) come to approach and engage the conditioned stimulus (CS) itself - a lever - more and more avidly, whereas other rats (goal-trackers) learn to approach the location of food delivery upon CS presentation. Importantly, although both sign-trackers and goal-trackers learn the CS-US association equally well, only in sign-trackers does phasic dopamine activity show classical reward prediction error-like bursts. Furthermore, neither the acquisition nor the expression of a goal-tracking CR is dopamine-dependent. Here we present a computational model that can account for such individual variations. We show that a combination of a Model-Based system and a revised Model-Free system can account for the development of distinct CRs in rats. Moreover, we show that revising a classical Model-Free system to individually process stimuli by using factored representations can explain why classical dopaminergic patterns may be observed for some rats and not for others depending on the CR they develop. In addition, the model can account for other behavioural and pharmacological results obtained using the same, or similar, autoshaping procedures. Finally, the model makes it possible to draw a set of experimental predictions that may be verified in a modified experimental protocol. We suggest that further investigation of factored representations in computational

  20. Modelling individual differences in the form of Pavlovian conditioned approach responses: a dual learning systems approach with factored representations.

    Directory of Open Access Journals (Sweden)

    Florian Lesaint

    2014-02-01

    Reinforcement Learning has greatly influenced models of conditioning, providing powerful explanations of acquired behaviour and underlying physiological observations. However, in recent autoshaping experiments in rats, variation in the form of Pavlovian conditioned responses (CRs) and associated dopamine activity, have questioned the classical hypothesis that phasic dopamine activity corresponds to a reward prediction error-like signal arising from a classical Model-Free system, necessary for Pavlovian conditioning. Over the course of Pavlovian conditioning using food as the unconditioned stimulus (US), some rats (sign-trackers) come to approach and engage the conditioned stimulus (CS) itself - a lever - more and more avidly, whereas other rats (goal-trackers) learn to approach the location of food delivery upon CS presentation. Importantly, although both sign-trackers and goal-trackers learn the CS-US association equally well, only in sign-trackers does phasic dopamine activity show classical reward prediction error-like bursts. Furthermore, neither the acquisition nor the expression of a goal-tracking CR is dopamine-dependent. Here we present a computational model that can account for such individual variations. We show that a combination of a Model-Based system and a revised Model-Free system can account for the development of distinct CRs in rats. Moreover, we show that revising a classical Model-Free system to individually process stimuli by using factored representations can explain why classical dopaminergic patterns may be observed for some rats and not for others depending on the CR they develop. In addition, the model can account for other behavioural and pharmacological results obtained using the same, or similar, autoshaping procedures. Finally, the model makes it possible to draw a set of experimental predictions that may be verified in a modified experimental protocol. We suggest that further investigation of factored representations in

  1. Habitat fragmentation and reproductive success: a structural equation modelling approach.

    Science.gov (United States)

    Le Tortorec, Eric; Helle, Samuli; Käyhkö, Niina; Suorsa, Petri; Huhta, Esa; Hakkarainen, Harri

    2013-09-01

    1. There is great interest on the effects of habitat fragmentation, whereby habitat is lost and the spatial configuration of remaining habitat patches is altered, on individual breeding performance. However, we still lack consensus of how this important process affects reproductive success, and whether its effects are mainly due to reduced fecundity or nestling survival. 2. The main reason for this may be the way that habitat fragmentation has been previously modelled. Studies have treated habitat loss and altered spatial configuration as two independent processes instead of as one hierarchical and interdependent process, and therefore have not been able to consider the relative direct and indirect effects of habitat loss and altered spatial configuration. 3. We investigated how habitat (i.e. old forest) fragmentation, caused by intense forest harvesting at the territory and landscape scales, is associated with the number of fledged offspring of an area-sensitive passerine, the Eurasian treecreeper (Certhia familiaris). We used structural equation modelling (SEM) to examine the complex hierarchical associations between habitat loss and altered spatial configuration on the number of fledged offspring, by controlling for individual condition and weather conditions during incubation. 4. Against generally held expectations, treecreeper reproductive success did not show a significant association with habitat fragmentation measured at the territory scale. Instead, our analyses suggested that an increasing amount of habitat at the landscape scale caused a significant increase in nest predation rates, leading to reduced reproductive success. This effect operated directly on nest predation rates, instead of acting indirectly through altered spatial configuration. 5. Because habitat amount and configuration are inherently strongly collinear, particularly when multiple scales are considered, our study demonstrates the usefulness of a SEM approach for hierarchical partitioning

  2. A New Algebraic Modelling Approach to Distributed Problem-Solving in MAS

    Institute of Scientific and Technical Information of China (English)

    帅典勋; 邓志东

    2002-01-01

    This paper is devoted to a new algebraic modelling approach to distributed problem-solving in multi-agent systems (MAS), which features a unified framework for describing and treating social behaviors, social dynamics and social intelligence. A conceptual architecture of algebraic modelling is presented. The algebraic modelling of typical social behaviors, social situations and social dynamics is discussed in the context of distributed problem-solving in MAS. Comparisons and simulations of distributed task allocation and resource assignment in MAS show the advantages of the algebraic approach over other conventional methods.

  3. CFD Approaches for Modelling Bubble Entrainment by an Impinging Jet

    Directory of Open Access Journals (Sweden)

    Martin Schmidtke

    2009-01-01

    This contribution presents different approaches for the modeling of gas entrainment under water by a plunging jet. Since the generation of bubbles happens on a scale smaller than the bubbles themselves, this process cannot be resolved in meso-scale simulations, which include the full length of the jet and its environment. This is why the gas entrainment has to be modeled in meso-scale simulations. In the frame of an Euler-Euler simulation, the local morphology of the phases has to be considered in the drag model. For example, the gas is a continuous phase above the water level but bubbly below the water level. Various drag models are tested and their influence on the gas void fraction below the water level is discussed. The algebraic interface area density (AIAD) model applies a drag coefficient for bubbles and a different drag coefficient for the free surface. If the AIAD model is used for the simulation of impinging jets, the gas entrainment depends on the free parameters included in this model. The calculated gas entrainment can be adapted via these parameters. Therefore, an advanced AIAD approach could be used in future for the implementation of models (e.g., correlations) for the gas entrainment.

  4. Approach for workflow modeling using π-calculus

    Institute of Scientific and Technical Information of China (English)

    杨东; 张申生

    2003-01-01

    As a variant of process algebra, π-calculus can describe the interactions between evolving processes. By modeling activity as a process interacting with other processes through ports, this paper presents a new approach: representing workflow models using π-calculus. As a result, the model can characterize the dynamic behaviors of the workflow process in terms of the LTS (Labeled Transition Semantics) semantics of π-calculus. The main advantage of the workflow model's formal semantic is that it allows for verification of the model's properties, such as deadlock-free and normal termination. Moreover, the equivalence of workflow models can be checked through weak bisimulation theorem in the π-calculus, thus facilitating the optimization of business processes.

  5. Approach for workflow modeling using π-calculus

    Institute of Scientific and Technical Information of China (English)

    杨东; 张申生

    2003-01-01

    As a variant of process algebra, π-calculus can describe the interactions between evolving processes. By modeling activity as a process interacting with other processes through ports, this paper presents a new approach: representing workflow models using π-calculus. As a result, the model can characterize the dynamic behaviors of the workflow process in terms of the LTS (Labeled Transition Semantics) semantics of π-calculus. The main advantage of the workflow model's formal semantic is that it allows for verification of the model's properties, such as deadlock-free and normal termination. Moreover, the equivalence of workflow models can be checked through weak bisimulation theorem in the π-calculus, thus facilitating the optimization of business processes.

  6. Multiphysics modeling using COMSOL a first principles approach

    CERN Document Server

    Pryor, Roger W

    2011-01-01

    Multiphysics Modeling Using COMSOL rapidly introduces the senior level undergraduate, graduate or professional scientist or engineer to the art and science of computerized modeling for physical systems and devices. It offers a step-by-step modeling methodology through examples that are linked to the Fundamental Laws of Physics through a First Principles Analysis approach. The text explores a breadth of multiphysics models in coordinate systems that range from 1D to 3D and introduces the readers to the numerical analysis modeling techniques employed in the COMSOL Multiphysics software. After readers have built and run the examples, they will have a much firmer understanding of the concepts, skills, and benefits acquired from the use of computerized modeling techniques to solve their current technological problems and to explore new areas of application for their particular technological areas of interest.

  7. Evaluation of Workflow Management Systems - A Meta Model Approach

    Directory of Open Access Journals (Sweden)

    Michael Rosemann

    1998-11-01

    The automated enactment of processes through the use of workflow management systems enables the outsourcing of the control flow from application systems. By now, a large number of systems that follow different workflow paradigms are available. This leads to the problem of selecting the appropriate workflow management system for a given situation. In this paper we outline the benefits of a meta model approach for the evaluation and comparison of different workflow management systems. After a general introduction to the topic of meta modeling, the meta models of the workflow management systems WorkParty (Siemens Nixdorf) and FlowMark (IBM) are compared as an example. These product-specific meta models can be generalized to meta reference models, which helps to specify a workflow methodology. By way of illustration, an organisational reference meta model is presented, which helps users specify their requirements for a workflow management system.

  8. Experimental and Numerical Analysis of Triaxially Braided Composites Utilizing a Modified Subcell Modeling Approach

    Science.gov (United States)

    Cater, Christopher; Xiao, Xinran; Goldberg, Robert K.; Kohlman, Lee W.

    2015-01-01

    A combined experimental and analytical approach was performed for characterizing and modeling triaxially braided composites with a modified subcell modeling strategy. Tensile coupon tests were conducted on a [0°/60°/-60°] braided composite at angles of 0°, 30°, 45°, 60° and 90° relative to the axial tow of the braid. It was found that measured coupon strength varied significantly with the angle of the applied load and each coupon direction exhibited unique final failures. The subcell modeling approach implemented into the finite element software LS-DYNA was used to simulate the various tensile coupon test angles. The modeling approach was successful in predicting both the coupon strength and reported failure mode for the 0°, 30° and 60° loading directions. The model over-predicted the strength in the 90° direction; however, the experimental results show a strong influence of free edge effects on damage initiation and failure. In the absence of these local free edge effects, the subcell modeling approach showed promise as a viable and computationally efficient analysis tool for triaxially braided composite structures. Future work will focus on validation of the approach for predicting the impact response of the braided composite against flat panel impact tests.

  9. CGC/saturation approach for soft interactions at high energy: a two channel model

    Energy Technology Data Exchange (ETDEWEB)

    Gotsman, E.; Maor, U. [Tel Aviv University, Department of Particle Physics, School of Physics and Astronomy, Raymond and Beverly Sackler Faculty of Exact Science, Tel Aviv (Israel); Levin, E. [Tel Aviv University, Department of Particle Physics, School of Physics and Astronomy, Raymond and Beverly Sackler Faculty of Exact Science, Tel Aviv (Israel); Universidad Tecnica Federico Santa Maria, Departemento de Fisica, Centro Cientifico-Tecnologico de Valparaiso, Valparaiso (Chile)

    2015-05-15

    In this paper we continue the development of a model for strong interactions at high energy, based on two ingredients: the CGC/saturation approach and the BFKL Pomeron. In our approach, the unknown mechanism of confinement of quarks and gluons is characterized by several numerical parameters, which are extracted from the experimental data. We demonstrate that the two channel model successfully describes the experimental data, including both the value of the elastic slope and the energy behavior of the single diffraction cross section. We show that the disagreement with the experimental data of our previous single channel eikonal model (Gotsman et al., Eur Phys J C 75:1-18, 2015) stems from the simplified approach used for the hadron structure and is not related to our principal theoretical input, based on the CGC/saturation approach. (orig.)

  10. A Novel Approach to Modeling and Flooding in Ad-hoc Wireless Networks

    Directory of Open Access Journals (Sweden)

    O. Issaad

    2008-01-01

    This study proposes a new modeling approach for wireless ad-hoc networks. The new approach is based on the construction of fuzzy neighborhoods and essentially consists of assigning a membership or importance degree to each network radio link, which reflects the relative quality of this link. This approach is first used to model the flooding problem, and then an algorithm is proposed to solve this problem, which is of great importance in ad-hoc wireless networks that are intrinsically subject to a certain level of node mobility. Simulations carried out in a dynamic environment show promising results and stability compared to the enhanced dominant pruning algorithm. Such an approach is suitable for taking into account the volatile nature of radio links and the physical-layer uncertainty when modeling these networks, particularly when the physical layer offers no or insufficient guarantees to higher-level protocols, as is the case for flooding.

  11. CGC/saturation approach for soft interactions at high energy: a two channel model

    CERN Document Server

    Gotsman, E; Maor, U

    2015-01-01

    In this paper we continue the development of a model for strong interactions at high energy, based on two ingredients: the CGC/saturation approach and the BFKL Pomeron. In our approach, the unknown mechanism of confinement of quarks and gluons is characterized by several numerical parameters, which are extracted from the experimental data. We demonstrate that the two channel model successfully describes the experimental data, including both the value of the elastic slope and the energy behavior of the single diffraction cross section. We show that the disagreement with experimental data of our previous single channel eikonal model [6] stems from the simplified approach used for the hadron structure, and is not related to our principal theoretical input, based on the CGC/saturation approach.

  12. A predictive modeling approach to increasing the economic effectiveness of disease management programs.

    Science.gov (United States)

    Bayerstadler, Andreas; Benstetter, Franz; Heumann, Christian; Winter, Fabian

    2014-09-01

    Predictive Modeling (PM) techniques are gaining importance in the worldwide health insurance business. Modern PM methods are used for customer relationship management, risk evaluation or medical management. This article illustrates a PM approach that enables the economic potential of (cost-) effective disease management programs (DMPs) to be fully exploited by optimized candidate selection as an example of successful data-driven business management. The approach is based on a Generalized Linear Model (GLM) that is easy to apply for health insurance companies. By means of a small portfolio from an emerging country, we show that our GLM approach is stable compared to more sophisticated regression techniques in spite of the difficult data environment. Additionally, we demonstrate, for this exemplary setting, that our model can compete with the expensive solutions offered by professional PM vendors and outperforms non-predictive standard approaches for DMP selection commonly used in the market.
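
    A minimal sketch of GLM-based candidate selection of the kind described here, assuming a logistic GLM and simulated covariates (age, prior hospitalisations, prior cost) that are not taken from the paper: members are scored by predicted benefit and the top-ranked ones are enrolled in the DMP.

```python
# Hedged sketch of GLM-based candidate selection for a disease management program.
# Covariates, coefficients and data are simulated; the paper's model is not shown here.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 5000
X = np.column_stack([rng.normal(55, 12, n),          # age (assumed covariate)
                     rng.poisson(2, n),              # prior hospitalisations
                     rng.gamma(2.0, 800.0, n)])      # prior-year cost
eta = -4.0 + 0.03 * X[:, 0] + 0.4 * X[:, 1] + 0.0004 * X[:, 2]
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-eta)))      # "benefits from DMP" label

model = sm.GLM(y, sm.add_constant(X), family=sm.families.Binomial()).fit()
scores = model.predict(sm.add_constant(X))           # predicted enrolment benefit
selected = np.argsort(scores)[::-1][:500]            # pick the 500 best candidates
print(model.params.round(4))
```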

  13. Polynomial Chaos Expansion Approach to Interest Rate Models

    Directory of Open Access Journals (Sweden)

    Luca Di Persio

    2015-01-01

    The Polynomial Chaos Expansion (PCE) technique allows us to recover a finite second-order random variable exploiting suitable linear combinations of orthogonal polynomials which are functions of a given stochastic quantity ξ, hence acting as a kind of random basis. The PCE methodology has been developed as a mathematically rigorous Uncertainty Quantification (UQ) method which aims at providing reliable numerical estimates for some uncertain physical quantities defining the dynamic of certain engineering models and their related simulations. In the present paper, we use the PCE approach in order to analyze some equity and interest rate models. In particular, we take into consideration those models which are based on, for example, the Geometric Brownian Motion, the Vasicek model, and the CIR model. We present theoretical as well as related concrete numerical approximation results considering, without loss of generality, the one-dimensional case. We also provide both an efficiency study and an accuracy study of our approach by comparing its outputs with the ones obtained adopting the Monte Carlo approach, both in its standard and its enhanced version.
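
    A minimal numerical sketch of a PCE in this setting, expanding a lognormal quantity of the form produced by a Geometric Brownian Motion in probabilists' Hermite polynomials of ξ; the truncation order and parameters are illustrative assumptions.

```python
# Minimal PCE sketch: expand Y = exp(mu + sigma*xi), xi ~ N(0,1), in probabilists'
# Hermite polynomials He_n(xi). Order and parameters are illustrative only.
import numpy as np
from numpy.polynomial import hermite_e as He
from math import factorial, sqrt, pi

mu, sigma, order = 0.05, 0.2, 6
f = lambda xi: np.exp(mu + sigma * xi)          # quantity to expand

# Gauss-Hermite (probabilists') quadrature: integral of g(x) exp(-x^2/2) dx
x, w = He.hermegauss(40)
norm = 1.0 / sqrt(2.0 * pi)

# PCE coefficients c_n = E[f(xi) He_n(xi)] / n!
coeffs = []
for n in range(order + 1):
    basis_n = He.hermeval(x, [0] * n + [1])
    coeffs.append(norm * np.sum(w * f(x) * basis_n) / factorial(n))
coeffs = np.array(coeffs)

# compare the truncated expansion with a Monte Carlo reference
xi = np.random.default_rng(3).standard_normal(100_000)
print("PCE mean   :", coeffs[0])
print("MC  mean   :", f(xi).mean())
print("max |error|:", np.abs(He.hermeval(xi, coeffs) - f(xi)).max())
```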

  14. Popularity Modeling for Mobile Apps: A Sequential Approach.

    Science.gov (United States)

    Zhu, Hengshu; Liu, Chuanren; Ge, Yong; Xiong, Hui; Chen, Enhong

    2015-07-01

    The popularity information in App stores, such as chart rankings, user ratings, and user reviews, provides an unprecedented opportunity to understand user experiences with mobile Apps, learn the process of adoption of mobile Apps, and thus enables better mobile App services. While the importance of popularity information is well recognized in the literature, the use of the popularity information for mobile App services is still fragmented and under-explored. To this end, in this paper, we propose a sequential approach based on hidden Markov model (HMM) for modeling the popularity information of mobile Apps toward mobile App services. Specifically, we first propose a popularity based HMM (PHMM) to model the sequences of the heterogeneous popularity observations of mobile Apps. Then, we introduce a bipartite based method to precluster the popularity observations. This can help to learn the parameters and initial values of the PHMM efficiently. Furthermore, we demonstrate that the PHMM is a general model and can be applicable for various mobile App services, such as trend based App recommendation, rating and review spam detection, and ranking fraud detection. Finally, we validate our approach on two real-world data sets collected from the Apple Appstore. Experimental results clearly validate both the effectiveness and efficiency of the proposed popularity modeling approach.
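
    The PHMM itself is richer than what this record shows, so the sketch below only illustrates the sequential machinery with a generic discrete HMM forward pass over coarse "rank band" observations; the states, observation alphabet and matrices are invented for the example.

```python
# Generic discrete HMM forward pass over a sequence of popularity observations
# (e.g. coarse daily rank bands). All matrices below are illustrative assumptions.
import numpy as np

pi_0 = np.array([0.6, 0.3, 0.1])                 # initial state distribution
A = np.array([[0.8,  0.15, 0.05],                # state transition matrix
              [0.2,  0.6,  0.2 ],
              [0.05, 0.25, 0.7 ]])
B = np.array([[0.7,  0.25, 0.05],                # P(observation | state)
              [0.2,  0.6,  0.2 ],
              [0.05, 0.25, 0.7 ]])

obs = [0, 0, 1, 2, 2, 1]                         # observed rank bands over time

# forward algorithm with scaling: alpha[i] ~ P(obs so far, state = i)
alpha = pi_0 * B[:, obs[0]]
c = alpha.sum(); alpha /= c
log_likelihood = np.log(c)
for o in obs[1:]:
    alpha = (alpha @ A) * B[:, o]
    c = alpha.sum(); alpha /= c
    log_likelihood += np.log(c)

print("log P(observations) =", round(log_likelihood, 3))
print("filtered state distribution:", alpha.round(3))
```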

  15. On a Markovian approach for modeling passive solar devices

    Energy Technology Data Exchange (ETDEWEB)

    Bottazzi, F.; Liebling, T.M. (Chaire de Recherche Operationelle, Ecole Polytechnique Federale de Lausanne (Switzerland)); Scartezzini, J.L.; Nygaard-Ferguson, M. (Lab. d' Energie Solaire et de Physique du Batiment, Ecole Polytechnique Federale de Lausanne (Switzerland))

    1991-01-01

    Stochastic models for the analysis of the energy and thermal comfort performances of passive solar devices have been increasingly studied for over a decade. A new approach to thermal building modeling, based on Markov chains, is proposed here to combine the accuracy of traditional dynamic simulation with the practical advantages of simplified methods. A main difficulty of the Markovian approach is the discretization of the system variables. Efficient procedures have been developed to carry out this discretization and several numerical experiments have been performed to analyze the possibilities and limitations of the Markovian model. Despite its restrictive assumptions, it will be shown that accurate results are indeed obtained by this method. However, due to discretization, computer memory requirements are more than inversely proportional to accuracy. (orig.).
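
    A toy illustration of the Markov-chain idea, assuming indoor temperature is binned into three comfort states with an invented hourly transition matrix; a real model would derive the matrix from the discretized thermal dynamics of the solar device.

```python
# Toy Markov-chain view of a discretized passive solar zone: indoor temperature
# is binned into a few states linked by a transition matrix. States and
# probabilities are invented for illustration only.
import numpy as np

states = ["cold", "comfortable", "warm"]
P = np.array([[0.6, 0.4, 0.0],      # hourly transition probabilities
              [0.1, 0.8, 0.1],
              [0.0, 0.5, 0.5]])

# long-run (stationary) distribution: left eigenvector of P for eigenvalue 1
vals, vecs = np.linalg.eig(P.T)
stat = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
stat /= stat.sum()

for s, p in zip(states, stat):
    print(f"fraction of time {s:12s}: {p:.2f}")
```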

  16. Disturbed state concept as unified constitutive modeling approach

    Directory of Open Access Journals (Sweden)

    Chandrakant S. Desai

    2016-06-01

    A unified constitutive modeling approach is highly desirable to characterize a wide range of engineering materials subjected simultaneously to the effect of a number of factors such as elastic, plastic and creep deformations, stress path, volume change, microcracking leading to fracture, failure and softening, stiffening, and mechanical and environmental forces. Few such unified models are available. The disturbed state concept (DSC) is considered to be a unified approach and is able to provide material characterization for almost all of the above factors. This paper presents a description of the DSC, and statements for determination of parameters based on triaxial, multiaxial and interface tests. Statements of DSC and validation at the specimen level and at the boundary value problem levels are also presented. An extensive list of publications by the author and others is provided at the end. The DSC is considered to be a unique and versatile procedure for modeling behaviors of engineering materials and interfaces.

  17. Disturbed state concept as unified constitutive modeling approach

    Institute of Scientific and Technical Information of China (English)

    Chandrakant S. Desai

    2016-01-01

    A unified constitutive modeling approach is highly desirable to characterize a wide range of engineering materials subjected simultaneously to the effect of a number of factors such as elastic, plastic and creep deformations, stress path, volume change, microcracking leading to fracture, failure and softening, stiffening, and mechanical and environmental forces. There are hardly available such unified models. The disturbed state concept (DSC) is considered to be a unified approach and is able to provide material characterization for almost all of the above factors. This paper presents a description of the DSC, and statements for determination of parameters based on triaxial, multiaxial and interface tests. Statements of DSC and validation at the specimen level and at the boundary value problem levels are also presented. An extensive list of publications by the author and others is provided at the end. The DSC is considered to be a unique and versatile procedure for modeling behaviors of engineering materials and interfaces.

  18. Proposal: A Hybrid Dictionary Modelling Approach for Malay Tweet Normalization

    Science.gov (United States)

    Muhamad, Nor Azlizawati Binti; Idris, Norisma; Arshi Saloot, Mohammad

    2017-02-01

    Malay Twitter messages present a special deviation from the original language. Malay Tweets are widely used by Twitter users, especially in the Malay Archipelago. It is therefore important to build a normalization system that can translate Malay Tweet language into standard Malay. Considerable research in natural language processing has focused on normalizing English Twitter messages, while few studies have addressed the normalization of Malay Tweets. This paper proposes an approach to normalize Malay Twitter messages based on hybrid dictionary modelling methods. The approach normalizes noisy Malay Twitter messages, such as colloquial language, novel words, and interjections, into standard Malay. The research uses a language model and an N-gram model.

  19. Approaches to verification of two-dimensional water quality models

    Energy Technology Data Exchange (ETDEWEB)

    Butkus, S.R. (Tennessee Valley Authority, Chattanooga, TN (USA). Water Quality Dept.)

    1990-11-01

    The verification of a water quality model is the procedure most needed by decision makers evaluating model predictions, but it is often inadequate or not done at all. The results of a properly conducted verification provide the decision makers with an estimate of the uncertainty associated with model predictions. Several statistical tests are available for quantifying the performance of a model. Six methods of verification were evaluated using an application of the BETTER two-dimensional water quality model for Chickamauga reservoir. Model predictions for ten state variables were compared to observed conditions from 1989. Spatial distributions of the verification measures showed the model predictions were generally adequate, except at a few specific locations in the reservoir. The most useful statistic was the mean standard error of the residuals. Quantifiable measures of model performance should be calculated during calibration and verification of future applications of the BETTER model. 25 refs., 5 figs., 7 tabs.
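
    The verification statistics mentioned here are straightforward to compute once paired predictions and observations are available; the sketch below uses placeholder numbers rather than BETTER model output.

```python
# Simple verification statistics comparing model predictions with observations,
# of the kind used to quantify model performance; the numbers are placeholders.
import numpy as np

observed  = np.array([8.1, 7.9, 6.5, 5.8, 6.2, 7.0])   # e.g. dissolved oxygen, mg/L
predicted = np.array([7.8, 8.2, 6.9, 5.5, 6.0, 7.4])

residuals = predicted - observed
mean_error = residuals.mean()                            # bias
mae  = np.abs(residuals).mean()                          # mean absolute error
rmse = np.sqrt((residuals ** 2).mean())                  # root mean square error
se_resid = residuals.std(ddof=1) / np.sqrt(len(residuals))   # standard error of residuals

print(f"bias={mean_error:.2f}  MAE={mae:.2f}  RMSE={rmse:.2f}  SE={se_resid:.2f}")
```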

  20. ON SOME APPROACHES TO ECONOMIC-MATHEMATICAL MODELING OF SMALL BUSINESS

    Directory of Open Access Journals (Sweden)

    Orlov A. I.

    2015-04-01

    Small business is an important part of the modern Russian economy. We give a wide panorama of possible approaches, developed by us, to the construction of economic-mathematical models that may be useful for describing the dynamics of small businesses, as well as for their management. Since a variety of types of economic-mathematical and econometric models can be used to describe particular problems of small business, we found it useful to consider a fairly wide range of such models, which resulted in rather short descriptions of the specific models. The models are nevertheless described at such a level that an experienced professional in the field of economic-mathematical modeling could, if necessary, develop a specific model to the stage of design formulas and numerical results. Particular attention is paid to the use of statistical methods for non-numeric data, currently the most pressing topic. The problems of economic-mathematical modeling in small business marketing are considered. We have accumulated some experience in applying the methodology of economic-mathematical modeling to practical problems in small business marketing, in particular in the fields of consumer and industrial goods and educational services, as well as in the analysis and modeling of inflation, taxation and others. In marketing models of decision-making theory we apply rankings and ratings. The problem of comparing averages is considered. We present some models of the life cycle of small businesses: a model of the flow of projects, a model of capturing niches, and a model of niche selection. We discuss the development of research on economic-mathematical modeling of small businesses.

  1. A validated approach for modeling collapse of steel structures

    Science.gov (United States)

    Saykin, Vitaliy Victorovich

    A civil engineering structure is faced with many hazardous conditions such as blasts, earthquakes, hurricanes, tornadoes, floods, and fires during its lifetime. Even though structures are designed for credible events that can happen during a lifetime of the structure, extreme events do happen and cause catastrophic failures. Understanding the causes and effects of structural collapse is now at the core of critical areas of national need. One factor that makes studying structural collapse difficult is the lack of full-scale structural collapse experimental test results against which researchers could validate their proposed collapse modeling approaches. The goal of this work is the creation of an element deletion strategy based on fracture models for use in validated prediction of collapse of steel structures. The current work reviews the state-of-the-art of finite element deletion strategies for use in collapse modeling of structures. It is shown that current approaches to element deletion in collapse modeling do not take into account stress triaxiality in vulnerable areas of the structure, which is important for proper fracture and element deletion modeling. The report then reviews triaxiality and its role in fracture prediction. It is shown that fracture in ductile materials is a function of triaxiality. It is also shown that, depending on the triaxiality range, different fracture mechanisms are active and should be accounted for. An approach using semi-empirical fracture models as a function of triaxiality is employed. The models to determine fracture initiation, softening and subsequent finite element deletion are outlined. This procedure allows for stress-displacement softening at an integration point of a finite element in order to subsequently remove the element. This approach avoids abrupt changes in the stress that would create dynamic instabilities, thus making the results more reliable and accurate. The calibration and validation of these models are

  2. GEOSPATIAL MODELLING APPROACH FOR 3D URBAN DENSIFICATION DEVELOPMENTS

    Directory of Open Access Journals (Sweden)

    O. Koziatek

    2016-06-01

    With growing populations, economic pressures, and the need for sustainable practices, many urban regions are rapidly densifying developments in the vertical built dimension with mid- and high-rise buildings. The location of these buildings can be projected based on key factors that are attractive to urban planners, developers, and potential buyers. Current research in this area includes various modelling approaches, such as cellular automata and agent-based modelling, but the results are mostly linked to raster grids as the smallest spatial units that operate in two spatial dimensions. Therefore, the objective of this research is to develop a geospatial model that operates on irregular spatial tessellations to model mid- and high-rise buildings in three spatial dimensions (3D). The proposed model is based on the integration of GIS, fuzzy multi-criteria evaluation (MCE), and 3D GIS-based procedural modelling. Part of the City of Surrey, within the Metro Vancouver Region, Canada, has been used to present the simulations of the generated 3D building objects. The proposed 3D modelling approach was developed using ESRI’s CityEngine software and the Computer Generated Architecture (CGA) language.

  3. Geospatial Modelling Approach for 3d Urban Densification Developments

    Science.gov (United States)

    Koziatek, O.; Dragićević, S.; Li, S.

    2016-06-01

    With growing populations, economic pressures, and the need for sustainable practices, many urban regions are rapidly densifying developments in the vertical built dimension with mid- and high-rise buildings. The location of these buildings can be projected based on key factors that are attractive to urban planners, developers, and potential buyers. Current research in this area includes various modelling approaches, such as cellular automata and agent-based modelling, but the results are mostly linked to raster grids as the smallest spatial units that operate in two spatial dimensions. Therefore, the objective of this research is to develop a geospatial model that operates on irregular spatial tessellations to model mid- and high-rise buildings in three spatial dimensions (3D). The proposed model is based on the integration of GIS, fuzzy multi-criteria evaluation (MCE), and 3D GIS-based procedural modelling. Part of the City of Surrey, within the Metro Vancouver Region, Canada, has been used to present the simulations of the generated 3D building objects. The proposed 3D modelling approach was developed using ESRI's CityEngine software and the Computer Generated Architecture (CGA) language.

  4. Peroxidases identified in a subtractive cDNA library approach show tissue-specific transcript abundance and enzyme activity during seed germination of Lepidium sativum.

    Science.gov (United States)

    Linkies, Ada; Schuster-Sherpa, Uta; Tintelnot, Stefanie; Leubner-Metzger, Gerhard; Müller, Kerstin

    2010-01-01

    The micropylar endosperm is a major regulator of seed germination in endospermic species, to which the close Brassicaceae relatives Arabidopsis thaliana and Lepidium sativum (cress) belong. Cress seeds are about 20 times larger than the seeds of Arabidopsis. This advantage was used to construct a tissue-specific subtractive cDNA library of transcripts that are up-regulated late in the germination process specifically in the micropylar endosperm of cress seeds. The library showed that a number of transcripts known to be up-regulated late during germination are up-regulated in the micropylar endosperm cap. Detailed germination kinetics of SALK lines carrying insertions in genes present in our library showed that the identified transcripts do indeed play roles during germination. Three peroxidases were present in the library. These peroxidases were identified as orthologues of Arabidopsis AtAPX01, AtPrx16, and AtPrxIIE. The corresponding SALK lines displayed significant germination phenotypes. Their transcripts were quantified in specific cress seed tissues during germination in the presence and absence of ABA and they were found to be regulated in a tissue-specific manner. Peroxidase activity, and particularly its regulation by ABA, also differed between radicles and micropylar endosperm caps. Possible implications of this tissue-specificity are discussed.

  5. A global sensitivity analysis approach for morphogenesis models

    KAUST Repository

    Boas, Sonja E. M.

    2015-11-21

    Background: Morphogenesis is a developmental process in which cells organize into shapes and patterns. Complex, non-linear and multi-factorial models with images as output are commonly used to study morphogenesis. It is difficult to understand the relation between the uncertainty in the input and the output of such ‘black-box’ models, giving rise to the need for sensitivity analysis tools. In this paper, we introduce a workflow for a global sensitivity analysis approach to study the impact of single parameters and the interactions between them on the output of morphogenesis models. Results: To demonstrate the workflow, we used a published, well-studied model of vascular morphogenesis. The parameters of this cellular Potts model (CPM) represent cell properties and behaviors that drive the mechanisms of angiogenic sprouting. The global sensitivity analysis correctly identified the dominant parameters in the model, consistent with previous studies. Additionally, the analysis provided information on the relative impact of single parameters and of interactions between them. This is very relevant because interactions of parameters impede the experimental verification of the predicted effect of single parameters. The parameter interactions, although of low impact, also provided new insights into the mechanisms of in silico sprouting. Finally, the analysis indicated that the model could be reduced by one parameter. Conclusions: We propose global sensitivity analysis as an alternative approach to study the mechanisms of morphogenesis. Comparison of the ranking of the impact of the model parameters to knowledge derived from experimental data and from manipulation experiments can help to falsify models and to find the operand mechanisms in morphogenesis. The workflow is applicable to all ‘black-box’ models, including high-throughput in vitro models in which output measures are affected by a set of experimental perturbations.
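
    A compact sketch of a variance-based global sensitivity analysis using the standard pick-and-freeze (Saltelli/Jansen) estimators, with a cheap toy function standing in for the morphogenesis simulation; the function, input ranges and sample sizes are assumptions made for the example.

```python
# Sketch of a variance-based (Sobol-type) global sensitivity analysis for a
# "black-box" model, using the standard Saltelli/Jansen pick-and-freeze estimators.
# The toy model below stands in for a far more expensive simulation.
import numpy as np

def black_box(x):                       # placeholder for the real simulation
    return x[:, 0] + 2.0 * x[:, 1] ** 2 + x[:, 0] * x[:, 2]

rng = np.random.default_rng(4)
n, d = 20000, 3
A = rng.uniform(0, 1, (n, d))
B = rng.uniform(0, 1, (n, d))
yA, yB = black_box(A), black_box(B)
var_y = np.var(np.concatenate([yA, yB]))

for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                 # swap in input i from the second matrix
    yABi = black_box(ABi)
    S1 = np.mean(yB * (yABi - yA)) / var_y          # first-order index
    ST = 0.5 * np.mean((yA - yABi) ** 2) / var_y    # total-effect index
    print(f"input {i}: S1={S1:.2f}  ST={ST:.2f}")
```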

  6. A systemic approach for modeling biological evolution using Parallel DEVS.

    Science.gov (United States)

    Heredia, Daniel; Sanz, Victorino; Urquia, Alfonso; Sandín, Máximo

    2015-08-01

    A new model for studying the evolution of living organisms is proposed in this manuscript. The proposed model is based on a non-neodarwinian systemic approach. The model is focused on considering several controversies and open discussions about modern evolutionary biology. Additionally, a simplification of the proposed model, named EvoDEVS, has been mathematically described using the Parallel DEVS formalism and implemented as a computer program using the DEVSLib Modelica library. EvoDEVS serves as an experimental platform to study different conditions and scenarios by means of computer simulations. Two preliminary case studies are presented to illustrate the behavior of the model and validate its results. EvoDEVS is freely available at http://www.euclides.dia.uned.es. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  7. Kinetic equations modelling wealth redistribution: a comparison of approaches.

    Science.gov (United States)

    Düring, Bertram; Matthes, Daniel; Toscani, Giuseppe

    2008-11-01

    Kinetic equations modelling the redistribution of wealth in simple market economies is one of the major topics in the field of econophysics. We present a unifying approach to the qualitative study for a large variety of such models, which is based on a moment analysis in the related homogeneous Boltzmann equation, and on the use of suitable metrics for probability measures. In consequence, we are able to classify the most important feature of the steady wealth distribution, namely the fatness of the Pareto tail, and the dynamical stability of the latter in terms of the model parameters. Our results apply, e.g., to the market model with risky investments [S. Cordier, L. Pareschi, and G. Toscani, J. Stat. Phys. 120, 253 (2005)], and to the model with quenched saving propensities [A. Chatterjee, B. K. Chakrabarti, and S. S. Manna, Physica A 335, 155 (2004)]. Also, we present results from numerical experiments that confirm the theoretical predictions.
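
    The paper analyses kinetic equations, but the underlying microscopic picture can be simulated directly; the sketch below runs a quenched-saving-propensity exchange economy of the Chatterjee-Chakrabarti-Manna type cited above, with purely illustrative parameters.

```python
# Direct Monte Carlo simulation of a simple wealth-exchange economy with quenched
# saving propensities (the model with quenched saving propensities cited above).
# Agent count, step count and the uniform propensity distribution are assumptions.
import numpy as np

rng = np.random.default_rng(5)
N, steps = 500, 500_000
wealth = np.ones(N)                       # start everyone with unit wealth
lam = rng.uniform(0.0, 1.0, N)            # quenched saving propensity per agent

for _ in range(steps):
    i, j = rng.integers(0, N, 2)
    if i == j:
        continue
    eps = rng.uniform()
    pool = (1 - lam[i]) * wealth[i] + (1 - lam[j]) * wealth[j]
    wealth[i] = lam[i] * wealth[i] + eps * pool          # total wealth is conserved
    wealth[j] = lam[j] * wealth[j] + (1 - eps) * pool

# a heavy (Pareto-like) tail shows up in the upper end of the distribution
print("mean wealth :", round(wealth.mean(), 3))
print("top 1% share:", round(np.sort(wealth)[-N // 100:].sum() / wealth.sum(), 3))
```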

  8. A Computationally Efficient State Space Approach to Estimating Multilevel Regression Models and Multilevel Confirmatory Factor Models.

    Science.gov (United States)

    Gu, Fei; Preacher, Kristopher J; Wu, Wei; Yung, Yiu-Fai

    2014-01-01

    Although the state space approach for estimating multilevel regression models has been well established for decades in the time series literature, it does not receive much attention from educational and psychological researchers. In this article, we (a) introduce the state space approach for estimating multilevel regression models and (b) extend the state space approach for estimating multilevel factor models. A brief outline of the state space formulation is provided and then state space forms for univariate and multivariate multilevel regression models, and a multilevel confirmatory factor model, are illustrated. The utility of the state space approach is demonstrated with either a simulated or real example for each multilevel model. It is concluded that the results from the state space approach are essentially identical to those from specialized multilevel regression modeling and structural equation modeling software. More importantly, the state space approach offers researchers a computationally more efficient alternative to fit multilevel regression models with a large number of Level 1 units within each Level 2 unit or a large number of observations on each subject in a longitudinal study.
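
    A minimal Kalman filter for a local-level state space model, shown only to illustrate the machinery the state space approach builds on; this is not the authors' multilevel formulation, and the variances and data are invented for the sketch.

```python
# Minimal Kalman filter for a local-level state space model (random-walk state
# observed with noise). Variances and data are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(6)
T = 100
true_level = np.cumsum(rng.normal(0, 0.3, T))          # latent state (random walk)
y = true_level + rng.normal(0, 1.0, T)                 # noisy observations

q, r = 0.3 ** 2, 1.0 ** 2                              # state / observation variance
a, p = 0.0, 10.0                                       # initial state mean, variance
filtered = np.empty(T)
for t in range(T):
    p = p + q                                          # prediction step
    k = p / (p + r)                                    # Kalman gain
    a = a + k * (y[t] - a)                             # update with observation y[t]
    p = (1 - k) * p
    filtered[t] = a

print("RMSE of filtered level:", np.sqrt(np.mean((filtered - true_level) ** 2)))
```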

  9. Techniques for managing behaviour in pediatric dentistry: comparative study of live modelling and tell-show-do based on children's heart rates during treatment.

    Science.gov (United States)

    Farhat-McHayleh, Nada; Harfouche, Alice; Souaid, Philippe

    2009-05-01

    Tell-show-do is the most popular technique for managing children's behaviour in dentists' offices. Live modelling is used less frequently, despite the satisfactory results obtained in studies conducted during the 1980s. The purpose of this study was to compare the effects of these 2 techniques on children's heart rates during dental treatments, heart rate being the simplest biological parameter to measure and an increase in heart rate being the most common physiologic indicator of anxiety and fear. For this randomized, controlled, parallel-group single-centre clinical trial, children 5 to 9 years of age presenting for the first time to the Saint Joseph University dental care centre in Beirut, Lebanon, were divided into 3 groups: those in groups A and B were prepared for dental treatment by means of live modelling, the mother serving as the model for children in group A and the father as the model for children in group B. The children in group C were prepared by a pediatric dentist using the tell-show-do method. Each child's heart rate was monitored during treatment, which consisted of an oral examination and cleaning. A total of 155 children met the study criteria and participated in the study. Children who received live modelling with the mother as model had lower heart rates than those who received live modelling with the father as model and those who were prepared by the tell-show-do method (p dentistry.

  10. Building Energy Modeling: A Data-Driven Approach

    Science.gov (United States)

    Cui, Can

    Buildings consume nearly 50% of the total energy in the United States, which drives the need to develop high-fidelity models for building energy systems. Extensive methods and techniques have been developed, studied, and applied to building energy simulation and forecasting, while most work has focused on developing dedicated modeling approaches for generic buildings. In this study, an integrated computationally efficient and high-fidelity building energy modeling framework is proposed, with a focus on developing a generalized modeling approach for various types of buildings. First, a number of data-driven simulation models are reviewed and assessed on various types of computationally expensive simulation problems. Motivated by the conclusion that no model outperforms others if amortized over diverse problems, a meta-learning based recommendation system for data-driven simulation modeling is proposed. To test the feasibility of the proposed framework on the building energy system, an extended application of the recommendation system for short-term building energy forecasting is deployed on various buildings. Finally, a Kalman filter-based data fusion technique is incorporated into the building recommendation system for on-line energy forecasting. Data fusion enables model calibration to update the state estimation in real time, which filters out the noise and renders a more accurate energy forecast. The framework is composed of two modules: an off-line model recommendation module and an on-line model calibration module. Specifically, the off-line model recommendation module includes 6 widely used data-driven simulation models, which are ranked by the meta-learning recommendation system for off-line energy modeling on a given building scenario. Only a selective set of building physical and operational characteristic features is needed to complete the recommendation task. The on-line calibration module effectively addresses system uncertainties, where data fusion on

  11. Anthropomorphic Coding of Speech and Audio: A Model Inversion Approach

    Directory of Open Access Journals (Sweden)

    W. Bastiaan Kleijn

    2005-06-01

    Full Text Available Auditory modeling is a well-established methodology that provides insight into human perception and that facilitates the extraction of signal features that are most relevant to the listener. The aim of this paper is to provide a tutorial on perceptual speech and audio coding using an invertible auditory model. In this approach, the audio signal is converted into an auditory representation using an invertible auditory model. The auditory representation is quantized and coded. Upon decoding, it is then transformed back into the acoustic domain. This transformation converts a complex distortion criterion into a simple one, thus facilitating quantization with low complexity. We briefly review past work on auditory models and describe in more detail the components of our invertible model and its inversion procedure, that is, the method to reconstruct the signal from the output of the auditory model. We summarize attempts to use the auditory representation for low-bit-rate coding. Our approach also allows the exploitation of the inherent redundancy of the human auditory system for the purpose of multiple description (joint source-channel coding.

  12. A modal approach to modeling spatially distributed vibration energy dissipation.

    Energy Technology Data Exchange (ETDEWEB)

    Segalman, Daniel Joseph

    2010-08-01

    The nonlinear behavior of mechanical joints is a confounding element in modeling the dynamic response of structures. Though there has been some progress in recent years in modeling individual joints, modeling the full structure with myriad frictional interfaces has remained an obstinate challenge. A strategy is suggested for structural dynamics modeling that can account for the combined effect of interface friction distributed spatially about the structure. This approach accommodates the following observations: (1) At small to modest amplitudes, the nonlinearity of jointed structures is manifest primarily in the energy dissipation - visible as vibration damping; (2) Correspondingly, measured vibration modes do not change significantly with amplitude; and (3) Significant coupling among the modes does not appear to result at modest amplitudes. The mathematical approach presented here postulates the preservation of linear modes and invests all the nonlinearity in the evolution of the modal coordinates. The constitutive form selected is one that works well in modeling spatially discrete joints. When compared against a mathematical truth model, the distributed dissipation approximation performs well.

  13. Comparison of two model approaches in the Zambezi river basin with regard to model reliability and identifiability

    Directory of Open Access Journals (Sweden)

    H. C. Winsemius

    2006-01-01

    Full Text Available Variations of water stocks in the upper Zambezi river basin have been determined by 2 different hydrological modelling approaches. The purpose was to provide preliminary terrestrial storage estimates in the upper Zambezi, which will be compared with estimates derived from the Gravity Recovery And Climate Experiment (GRACE) in a future study. The first modelling approach is GIS-based, distributed and conceptual (STREAM). The second approach uses Lumped Elementary Watersheds identified and modelled conceptually (LEW). The STREAM model structure has been assessed using GLUE (Generalized Likelihood Uncertainty Estimation) a posteriori to determine parameter identifiability. The LEW approach could, in addition, be tested for model structure, because the computational efforts of LEW are low. Both models are threshold models, where the non-linear behaviour of the Zambezi river basin is explained by a combination of thresholds and linear reservoirs. The models were forced by time series of gauged and interpolated rainfall. Where available, runoff station data were used to calibrate the models. Ungauged watersheds were generally given the same parameter sets as their neighbouring calibrated watersheds. It appeared that the LEW model structure could be improved by applying GLUE iteratively. Eventually, it led to better identifiability of parameters and consequently a better model structure than the STREAM model. Hence, the final model structure obtained better represents the true hydrology. After calibration, both models show a comparable efficiency in representing discharge. However, the LEW model shows a far greater storage amplitude than the STREAM model. This emphasizes the storage uncertainty related to hydrological modelling in data-scarce environments such as the Zambezi river basin. It underlines the need and potential for independent observations of terrestrial storage to enhance our understanding and modelling capacity of the hydrological processes. GRACE
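
    The GLUE step mentioned in this record can be illustrated with a minimal sketch. The one-parameter linear-reservoir model, the synthetic rainfall forcing and the behavioural threshold below are assumptions made for the example; they stand in for the STREAM/LEW formulations rather than reproduce them.

```python
# Illustrative GLUE run: Monte Carlo sampling of one parameter, a likelihood
# measure (Nash-Sutcliffe efficiency), a behavioural threshold and the
# resulting parameter bounds. The linear-reservoir model is a stand-in only.
import numpy as np

rng = np.random.default_rng(1)
rain = rng.gamma(2.0, 2.0, 200)                     # synthetic rainfall forcing

def linear_reservoir(k, rain):
    s, q = 0.0, []
    for p in rain:
        s += p
        out = s / k                                 # outflow proportional to storage
        s -= out
        q.append(out)
    return np.array(q)

observed = linear_reservoir(8.0, rain) + rng.normal(0, 0.2, rain.size)

samples = rng.uniform(2.0, 20.0, 5000)              # candidate parameter sets
nse = np.empty(samples.size)
for j, k in enumerate(samples):
    sim = linear_reservoir(k, rain)
    nse[j] = 1 - np.sum((sim - observed) ** 2) / np.sum((observed - observed.mean()) ** 2)

behavioural = samples[nse > 0.7]                    # assumed likelihood threshold
print("behavioural fraction:", behavioural.size / samples.size)
print("5-95% range of behavioural parameter:", np.percentile(behavioural, [5, 95]).round(2))
```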

  14. A moni-modelling approach to manage groundwater risk to pesticide leaching at regional scale.

    Science.gov (United States)

    Di Guardo, Andrea; Finizio, Antonio

    2016-03-01

    Historically, the approach used to manage the risk of chemical contamination of water bodies has been based on the use of monitoring programmes, which provide a snapshot of the presence/absence of chemicals in water bodies. Monitoring is required in the current EU regulations, such as the Water Framework Directive (WFD), as a tool to record temporal variation in the chemical status of water bodies. More recently, a number of models have been developed and used to forecast chemical contamination of water bodies. These models combine information on chemical properties, their use, and environmental scenarios. Both approaches are useful for risk assessors in decision processes. However, in our opinion, both show flaws and strengths when taken alone. This paper proposes an integrated approach (the moni-modelling approach) where monitoring data and modelling simulations work together in order to provide a common decision framework for the risk assessor. This approach would be very useful, particularly for the risk management of pesticides at a territorial level. It fulfils the requirements of the recent Sustainable Use of Pesticides Directive. In fact, the moni-modelling approach could be used to identify sensitive areas where mitigation measures or limitations on pesticide use should be implemented, as well as to effectively re-design future monitoring networks or to better calibrate the pedo-climatic input data for the environmental fate models. A case study is presented, where the moni-modelling approach is applied in the Lombardy region (North of Italy) to identify groundwater areas vulnerable to pesticides. The approach has been applied to six active substances with different leaching behaviour, in order to highlight the advantages of using the proposed methodology.

  15. Approach to Organizational Structure Modelling in Construction Companies

    Directory of Open Access Journals (Sweden)

    Ilin Igor V.

    2016-01-01

    Full Text Available An effective management system is one of the key factors of business success nowadays. Construction companies usually have a portfolio of independent projects running at the same time. Thus it is reasonable to take the project orientation of this kind of business into account while designing the construction company's management system, whose main components are the business process system and the organizational structure. The paper describes a management structure design approach based on the project-oriented nature of construction projects, and proposes a model of the organizational structure for a construction company. Application of the proposed approach will make it possible to assign responsibilities within the organizational structure of construction projects effectively, and thus to shorten the time needed to allocate projects and to ensure their smoother running. A practical case of using the approach is also provided in the paper.

  16. Parameter identification of multi-body railway vehicle models - Application of the adjoint state approach

    Science.gov (United States)

    Kraft, S.; Puel, G.; Aubry, D.; Funfschilling, C.

    2016-12-01

    For the calibration of multi-body models of railway vehicles, the identification of the model parameters from on-track measurements is required. This involves the solution of an inverse problem by minimising, with optimisation methods, the misfit function which describes the distance between model and measurement. The application of gradient-based optimisation methods is advantageous but necessitates an efficient approach for the computation of the gradients, considering the large number of model parameters and the costly evaluation of the forward model. This work shows that the application of the adjoint state approach to the nonlinear vehicle-track multi-body system is suitable, reducing the computational cost on the one hand and increasing the precision of the gradients on the other. Gradients from the adjoint state method are computed for vehicle models and validated taking into account measurement noise.

  17. An integrated modelling approach to estimate urban traffic emissions

    Science.gov (United States)

    Misra, Aarshabh; Roorda, Matthew J.; MacLean, Heather L.

    2013-07-01

    An integrated modelling approach is adopted to estimate microscale urban traffic emissions. The modelling framework consists of a traffic microsimulation model developed in PARAMICS, a microscopic emissions model (Comprehensive Modal Emissions Model), and two dispersion models, AERMOD and the Quick Urban and Industrial Complex (QUIC). This framework is applied to a traffic network in downtown Toronto, Canada to evaluate summertime morning-peak traffic emissions of carbon monoxide (CO) and nitrogen oxides (NOx) during five weekdays at a traffic intersection. The model-predicted results are validated against sensor observations, with 100% of the AERMOD-modelled CO concentrations and 97.5% of the QUIC-modelled NOx concentrations within a factor of two of the corresponding observed concentrations. Availability of local estimates of ambient concentration is useful for accurate comparisons of predicted concentrations with observed concentrations. Predicted and sensor-measured concentrations are significantly lower than the hourly threshold Maximum Acceptable Levels for CO (31 ppm, ˜90 times lower) and NO2 (0.4 mg/m3, ˜12 times lower), within the National Ambient Air Quality Objectives established by Environment Canada.
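
    The 'within a factor of two' criterion used for validation above is the standard FAC2 metric for dispersion-model evaluation. A minimal sketch, with invented concentration values rather than the Toronto measurements, is:

```python
# FAC2: fraction of predictions within a factor of two of the observations.
# The concentration arrays are placeholders, not the Toronto data.
import numpy as np

observed  = np.array([0.31, 0.45, 0.28, 0.52, 0.40])   # e.g. observed CO, ppm (assumed)
predicted = np.array([0.25, 0.60, 0.30, 1.20, 0.38])   # modelled CO, ppm (assumed)

ratio = predicted / observed
fac2 = np.mean((ratio >= 0.5) & (ratio <= 2.0))
print(f"FAC2 = {fac2:.2f}")   # 1.00 would mean every prediction within a factor of two
```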

  18. A Novel Approach to Implement Takagi-Sugeno Fuzzy Models.

    Science.gov (United States)

    Chang, Chia-Wen; Tao, Chin-Wang

    2017-09-01

    This paper proposes new algorithms based on the fuzzy c-regression model algorithm for Takagi-Sugeno (T-S) fuzzy modeling of complex nonlinear systems. The fuzzy c-regression state model (FCRSM) algorithm yields a T-S fuzzy model in which the functional antecedent and the state-space-model-type consequent are considered with the available input-output data. The antecedent and consequent forms of the proposed FCRSM offer two main advantages: first, the FCRSM has a low computational load because only one input variable is considered in the antecedent part; second, the unknown system can be modeled not only in polynomial form but also in state-space form. Moreover, the FCRSM can be extended to the FCRSM-ND and FCRSM-Free algorithms. The FCRSM-ND algorithm is presented to find the T-S fuzzy state-space model of a nonlinear system when the input-output data cannot be precollected and an assumed effective controller is available. In practical applications, the mathematical model of the controller may be hard to obtain. In this case, an online tuning algorithm, FCRSM-Free, is designed such that the parameters of a T-S fuzzy controller and the T-S fuzzy state model of an unknown system can be tuned online simultaneously. Four numerical simulations are given to demonstrate the effectiveness of the proposed approach.

  19. Diagnosing Hybrid Systems: a Bayesian Model Selection Approach

    Science.gov (United States)

    McIlraith, Sheila A.

    2005-01-01

    In this paper we examine the problem of monitoring and diagnosing noisy complex dynamical systems that are modeled as hybrid systems: models of continuous behavior, interleaved by discrete transitions. In particular, we examine continuous systems with embedded supervisory controllers that experience abrupt, partial or full failure of component devices. Building on our previous work in this area (MBCG99; MBCG00), our specific focus in this paper is on the mathematical formulation of the hybrid monitoring and diagnosis task as a Bayesian model tracking algorithm. The nonlinear dynamics of many hybrid systems present challenges to probabilistic tracking. Further, probabilistic tracking of a system for the purposes of diagnosis is problematic because the models of the system corresponding to failure modes are numerous and generally very unlikely. To focus tracking on these unlikely models and to reduce the number of potential models under consideration, we exploit logic-based techniques for qualitative model-based diagnosis to conjecture a limited initial set of consistent candidate models. In this paper we discuss alternative tracking techniques that are relevant to different classes of hybrid systems, focusing specifically on a method for tracking multiple models of nonlinear behavior simultaneously using factored sampling and conditional density propagation. To illustrate and motivate the approach described in this paper we examine the problem of monitoring and diagnosing NASA's Sprint AERCam, a small spherical robotic camera unit with 12 thrusters that enable both linear and rotational motion.

  20. An adaptive model switching approach for phase I dose-finding trials.

    Science.gov (United States)

    Daimon, Takashi; Zohar, Sarah

    2013-01-01

    Model-based phase I dose-finding designs rely on a single model throughout the study for estimating the maximum tolerated dose (MTD). Thus, one major concern is the choice of the most suitable model to be used. This is important because the dose allocation process and the MTD estimation depend on whether or not the model is reliable, or whether or not it gives a better fit to the toxicity data. The aim of our work was to propose a method that would remove the need for a model choice prior to the trial onset and would instead allow it to be made sequentially at each patient's inclusion. In this paper, we describe a model-checking approach based on the posterior predictive check and a model-comparison approach based on the deviance information criterion, in order to identify a more reliable or better model during the course of a trial and to support clinical decision making. Further, we present two model-switching designs for a phase I cancer trial that are based on the aforementioned approaches, and perform a comparison between designs with or without model switching through a simulation study. The results showed that the proposed designs have the advantage of decreasing certain risks, such as those of poor dose allocation and failure to find the MTD, which could occur if the model is misspecified. Copyright © 2013 John Wiley & Sons, Ltd.

  1. A structured approach for the engineering of biochemical network models, illustrated for signalling pathways.

    Science.gov (United States)

    Breitling, Rainer; Gilbert, David; Heiner, Monika; Orton, Richard

    2008-09-01

    Quantitative models of biochemical networks (signal transduction cascades, metabolic pathways, gene regulatory circuits) are a central component of modern systems biology. Building and managing these complex models is a major challenge that can benefit from the application of formal methods adopted from theoretical computing science. Here we provide a general introduction to the field of formal modelling, which emphasizes the intuitive biochemical basis of the modelling process, but is also accessible for an audience with a background in computing science and/or model engineering. We show how signal transduction cascades can be modelled in a modular fashion, using both a qualitative approach--qualitative Petri nets, and quantitative approaches--continuous Petri nets and ordinary differential equations (ODEs). We review the major elementary building blocks of a cellular signalling model, discuss which critical design decisions have to be made during model building, and present a number of novel computational tools that can help to explore alternative modular models in an easy and intuitive manner. These tools, which are based on Petri net theory, offer convenient ways of composing hierarchical ODE models, and permit a qualitative analysis of their behaviour. We illustrate the central concepts using signal transduction as our main example. The ultimate aim is to introduce a general approach that provides the foundations for a structured formal engineering of large-scale models of biochemical networks.

  2. A Nonhydrostatic Model Based On A New Approach

    Science.gov (United States)

    Janjic, Z. I.

    Considerable experience with nonhydrostatic models has been accumulated on the scales of convective clouds and storms. However, numerical weather prediction (NWP) deals with motions on a much wider range of temporal and spatial scales. Thus, difficulties that may not be significant on the small scales may become important in NWP applications. Having in mind these considerations, a new approach has been proposed and applied in developing nonhydrostatic models intended for NWP applications. Namely, instead of extending the cloud models to synoptic scales, the hydrostatic approximation is relaxed in a hydrostatic NWP model. In this way the model validity is extended to nonhydrostatic motions, and at the same time favorable features of the hydrostatic formulation are preserved. In order to apply this approach, the system of nonhydrostatic equations is split into two parts: (a) the part that corresponds to the hydrostatic system, except for corrections due to vertical acceleration, and (b) the system of equations that allows computation of the corrections appearing in the first system. This procedure does not require any additional approximation. In the model, "isotropic" horizontal finite differencing is employed that conserves a number of basic and derived dynamical and quadratic quantities. The hybrid pressure-sigma vertical coordinate has been chosen as the primary option. The forward-backward scheme is used for horizontally propagating fast waves, and an implicit scheme is used for vertically propagating sound waves. The Adams-Bashforth scheme is applied for the advection of the basic dynamical variables and for the Coriolis terms. In real data runs, the nonhydrostatic dynamics does not require extra computational boundary conditions at the top. The philosophy of the physical package and possible future developments of physical parameterizations are also reviewed. A two-dimensional model based on the described approach successfully reproduced classical

  3. H(infinity) output tracking control for nonlinear systems via T-S fuzzy model approach.

    Science.gov (United States)

    Lin, Chong; Wang, Qing-Guo; Lee, Tong Heng

    2006-04-01

    This paper studies the problem of H(infinity) output tracking control for nonlinear time-delay systems using Takagi-Sugeno (T-S) fuzzy model approach. An LMI-based design method is proposed for achieving the output tracking purpose. Illustrative examples are given to show the effectiveness of the present results.

  4. Comparison of data-driven and model-driven approaches to brightness temperature diurnal cycle interpolation

    CSIR Research Space (South Africa)

    Van den Bergh, F

    2006-01-01

    Full Text Available RKHS model for the first experiment. MSE = (0.5363, 0.7331). The motivation for this approach was that the amount of computation per cycle would be reduced significantly. The specific example in Figure 4 shows the RKHS model—initially fitted to cycle...

  5. Testing of kinetic models: usefulness of the multiresponse approach as applied to chlorophyll degradation in foods

    NARCIS (Netherlands)

    Boekel, van M.A.J.S.

    1999-01-01

    Cascades of reactions, in which several reactants and products take part, frequently occur in foods. This work shows that kinetic modelling of such reactions having parameters in common is much more powerful when using a multiresponse rather than a uniresponse approach (i.e. analysing more than one
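
    The multiresponse idea can be made concrete with a consecutive reaction A → B → C whose two responses share the rate constant k1. The rate constants, sampling times and noise level in the sketch below are invented for illustration; stacking both residual vectors is what distinguishes the multiresponse fit from a uniresponse fit of [A] alone.

```python
# Multiresponse fit of the consecutive reaction A -> B -> C: the residuals of
# both measured responses are stacked so the shared rate constant k1 is
# constrained by [A] and [B] simultaneously. All values are synthetic.
import numpy as np
from scipy.optimize import least_squares

k1_true, k2_true = 0.30, 0.10
t = np.linspace(0, 20, 40)

def concentrations(k1, k2, t):
    A = np.exp(-k1 * t)
    denom = k2 - k1 if abs(k2 - k1) > 1e-9 else 1e-9   # guard against k1 == k2
    B = k1 / denom * (np.exp(-k1 * t) - np.exp(-k2 * t))
    return A, B

rng = np.random.default_rng(2)
A_clean, B_clean = concentrations(k1_true, k2_true, t)
A_obs = A_clean + rng.normal(0, 0.02, t.size)
B_obs = B_clean + rng.normal(0, 0.02, t.size)

def residuals(params):
    A, B = concentrations(params[0], params[1], t)
    return np.concatenate([A - A_obs, B - B_obs])       # multiresponse: stack both series

fit = least_squares(residuals, x0=[0.5, 0.2], bounds=(1e-6, 5.0))
print("estimated k1, k2:", fit.x.round(3))
```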

  6. Social model: a new approach of the disability theme.

    Science.gov (United States)

    Bampi, Luciana Neves da Silva; Guilhem, Dirce; Alves, Elioenai Dornelles

    2010-01-01

    The experience of disability is part of the daily lives of people who have a disease, lesion or corporal limitation. Disability is still understood as personal bad luck; moreover, from the social and political points of view, the disabled are seen as a minority. The aim of this study is to contribute to knowledge about the experience of disability. The research presents a new approach to the theme: the social model. This approach appeared as an alternative to the medical model of disability, which sees the lesion as the primary cause of social inequality and of the disadvantages experienced by the disabled, ignoring the role of social structures in their oppression and marginalization. The study permits reflection on how the difficulties and barriers that society imposes on people considered different make disability a reality, and on how they portray the social injustice and vulnerability experienced by excluded groups.

  7. Lattice percolation approach to 3D modeling of tissue aging

    Science.gov (United States)

    Gorshkov, Vyacheslav; Privman, Vladimir; Libert, Sergiy

    2016-11-01

    We describe a 3D percolation-type approach to modeling the processes of aging and certain other properties of tissues analyzed as systems consisting of interacting cells. Lattice sites are designated as regular (healthy) cells, senescent cells, or vacancies left by dead (apoptotic) cells. The system is then studied dynamically, with the ongoing processes including regular cells dividing to fill vacant sites, healthy cells becoming senescent or dying, and senescent cells dying. A statistical-mechanics description can provide patterns of time dependence and snapshots of morphological system properties. The developed theoretical modeling approach is found not only to corroborate recent experimental findings that inhibition of senescence can lead to extended lifespan, but also to confirm that, unlike in 2D, in 3D senescent cells can contribute to a tissue's connectivity/mechanical stability. The latter effect occurs by senescent cells forming the second infinite cluster in the regime when the regular (healthy) cells' infinite cluster still exists.
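
    A toy version of such a lattice model is easy to sketch. The transition probabilities and lattice size below are arbitrary illustrative values, not the parameterization used in the study; the sketch only shows the healthy/senescent/vacant bookkeeping on a periodic 3D lattice.

```python
# Toy 3D lattice sketch of healthy / senescent / vacant cell dynamics with
# periodic boundaries. Rates are arbitrary, not the published parameterization.
import numpy as np

HEALTHY, SENESCENT, VACANT = 0, 1, 2
rng = np.random.default_rng(3)
L, steps = 20, 50
lattice = np.full((L, L, L), HEALTHY, dtype=np.int8)
p_senesce, p_die_healthy, p_die_senescent, p_divide = 0.01, 0.005, 0.02, 0.3

neighbours = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
for _ in range(steps):
    r = rng.random(lattice.shape)
    healthy, senescent = lattice == HEALTHY, lattice == SENESCENT
    lattice[healthy & (r < p_senesce)] = SENESCENT                       # senescence
    lattice[healthy & (r >= p_senesce) & (r < p_senesce + p_die_healthy)] = VACANT
    lattice[senescent & (r < p_die_senescent)] = VACANT                  # cell death
    for x, y, z in np.argwhere(lattice == VACANT):                       # division refills vacancies
        if rng.random() < p_divide:
            for dx, dy, dz in neighbours:
                if lattice[(x + dx) % L, (y + dy) % L, (z + dz) % L] == HEALTHY:
                    lattice[x, y, z] = HEALTHY
                    break

fractions = [float(np.mean(lattice == s)) for s in (HEALTHY, SENESCENT, VACANT)]
print("healthy / senescent / vacant fractions:", np.round(fractions, 3))
```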

  8. Research on teacher education programs: logic model approach.

    Science.gov (United States)

    Newton, Xiaoxia A; Poon, Rebecca C; Nunes, Nicole L; Stone, Elisa M

    2013-02-01

    Teacher education programs in the United States face increasing pressure to demonstrate their effectiveness through pupils' learning gains in classrooms where program graduates teach. The link between teacher candidates' learning in teacher education programs and pupils' learning in K-12 classrooms implicit in the policy discourse suggests a one-to-one correspondence. However, the logical steps leading from what teacher candidates have learned in their programs to what they are doing in classrooms that may contribute to their pupils' learning are anything but straightforward. In this paper, we argue that the logic model approach from scholarship on evaluation can enhance research on teacher education by making explicit the logical links between program processes and intended outcomes. We demonstrate the usefulness of the logic model approach through our own work on designing a longitudinal study that focuses on examining the process and impact of an undergraduate mathematics and science teacher education program.

  9. A Variational Approach to the Modeling of MIMO Systems

    Directory of Open Access Journals (Sweden)

    Jraifi A

    2007-01-01

    Full Text Available Motivated by the study of the optimization of the quality of service for multiple input multiple output (MIMO) systems in 3G (third generation), we develop a method for modeling the MIMO channel. This method, which uses a statistical approach, is based on a variational form of the usual channel equation. The proposed equation is expressed in terms of a scalar variable. The minimum distance of received vectors is used as the random variable to model the MIMO channel. This variable is of crucial importance for the performance of the transmission system, as it captures the degree of interference between neighbouring vectors. We then use this approach to compute numerically the total probability of error with respect to the signal-to-noise ratio (SNR) and then predict the number of antennas. By fixing the SNR to a specific value, we extract information on the optimal number of MIMO antennas.

  10. A relaxation-based approach to damage modeling

    Science.gov (United States)

    Junker, Philipp; Schwarz, Stephan; Makowski, Jerzy; Hackl, Klaus

    2017-01-01

    Material models, including softening effects due to, for example, damage and localizations, share the problem of ill-posed boundary value problems that yield mesh-dependent finite element results. It is thus necessary to apply regularization techniques that couple local behavior described, for example, by internal variables, at a spatial level. This can take account of the gradient of the internal variable to yield mesh-independent finite element results. In this paper, we present a new approach to damage modeling that does not use common field functions, inclusion of gradients or complex integration techniques: Appropriate modifications of the relaxed (condensed) energy hold the same advantage as other methods, but with much less numerical effort. We start with the theoretical derivation and then discuss the numerical treatment. Finally, we present finite element results that prove empirically how the new approach works.

  11. Coordination-theoretic approach to modelling grid service composition process

    Institute of Scientific and Technical Information of China (English)

    Meng Qian; Zhong Liu; Jing Wang; Li Yao; Weiming Zhang

    2010-01-01

    A grid service composite process is made up of complex coordinative activities. Developing an appropriate model of grid service coordinative activities is an important foundation for grid service composition. According to coordination theory, this paper elaborates the process of grid service composition by using UML 2.0, and proposes an approach to modelling the grid service composition process based on coordination theory. This approach helps not only to analyze accurately the task activities and the relevant dependencies among them, but also to facilitate the adaptability of the grid service orchestration so as to further realize the connectivity, timeliness, appropriateness and expansibility of the grid service composition.

  12. Innovation Networks New Approaches in Modelling and Analyzing

    CERN Document Server

    Pyka, Andreas

    2009-01-01

    The science of graphs and networks has become by now a well-established tool for modelling and analyzing a variety of systems with a large number of interacting components. Starting from the physical sciences, applications have spread rapidly to the natural and social sciences, as well as to economics, and are now further extended, in this volume, to the concept of innovations, viewed broadly. In an abstract, systems-theoretical approach, innovation can be understood as a critical event which destabilizes the current state of the system, and results in a new process of self-organization leading to a new stable state. The contributions to this anthology address different aspects of the relationship between innovation and networks. The various chapters incorporate approaches in evolutionary economics, agent-based modeling, social network analysis and econophysics and explore the epistemic tension between insights into economics and society-related processes, and the insights into new forms of complex dynamics.

  13. Understanding complex urban systems multidisciplinary approaches to modeling

    CERN Document Server

    Gurr, Jens; Schmidt, J

    2014-01-01

    Understanding Complex Urban Systems takes as its point of departure the insight that the challenges of global urbanization and the complexity of urban systems cannot be understood – let alone ‘managed’ – by sectoral and disciplinary approaches alone. But while there has recently been significant progress in broadening and refining the methodologies for the quantitative modeling of complex urban systems, in deepening the theoretical understanding of cities as complex systems, or in illuminating the implications for urban planning, there is still a lack of well-founded conceptual thinking on the methodological foundations and the strategies of modeling urban complexity across the disciplines. Bringing together experts from the fields of urban and spatial planning, ecology, urban geography, real estate analysis, organizational cybernetics, stochastic optimization, and literary studies, as well as specialists in various systems approaches and in transdisciplinary methodologies of urban analysis, the volum...

  14. A Composite Modelling Approach to Decision Support by the Use of the CBA-DK Model

    DEFF Research Database (Denmark)

    Barfod, Michael Bruhn; Salling, Kim Bang; Leleur, Steen

    2007-01-01

    This paper presents a decision support system for assessment of transport infrastructure projects. The composite modelling approach, COSIMA, combines a cost-benefit analysis by use of the CBA-DK model with multi-criteria analysis applying the AHP and SMARTER techniques. The modelling uncertainties...

  15. Risk evaluation of uranium mining: A geochemical inverse modelling approach

    Science.gov (United States)

    Rillard, J.; Zuddas, P.; Scislewski, A.

    2011-12-01

    reactive mineral surface area. The formation of coatings on dissolving mineral surfaces significantly reduces the amount of surface available to react with fluids. Our results show that the negatively charged ion complexes responsible for U transport decrease when alkalinity and rock buffer capacity are similarly lower. Carbonate ion pairs, however, may increase U mobility when the radionuclide concentration is high and the rock buffer capacity is low. The present work helps to orient future monitoring of this site in Brazil as well as of other sites where uranium is linked to igneous rock formations without the presence of sulphides. Monitoring SO4 migration (in acidic-leaching uranium sites) seems to be an efficient and simple way to track different hazards, especially in tropical conditions, where the succession of dry and wet periods increases the weathering action of the residual H2SO4. Nevertheless, models of risk evaluation should take into account reactive surface areas and neogenic minerals, since they determine the formation of U ion complexes, which in turn controls uranium mobility in natural systems. Keywords: uranium mining, reactive mineral surface area, uranium complexes, inverse modelling approach, risk evaluation

  16. CFD Modeling of Wall Steam Condensation: Two-Phase Flow Approach versus Homogeneous Flow Approach

    Directory of Open Access Journals (Sweden)

    S. Mimouni

    2011-01-01

    Full Text Available The present work is focused on the condensation heat transfer that plays a dominant role in many accident scenarios postulated to occur in the containment of nuclear reactors. The study compares a general multiphase approach implemented in NEPTUNE_CFD with a homogeneous model, of widespread use for engineering studies, implemented in Code_Saturne. The model implemented in NEPTUNE_CFD assumes that liquid droplets form along the wall within nucleation sites. Vapor condensation on the droplets makes them grow. Once the droplet diameter reaches a critical value, gravitational forces compensate the surface tension force and the droplets then slide over the wall and form a liquid film. This approach allows taking into account simultaneously the mechanical drift between the droplets and the gas, the heat and mass transfer on droplets in the core of the flow, and the condensation/evaporation phenomena on the walls. As concerns the homogeneous approach, the motion of the liquid film due to gravitational forces is neglected, as is the volume occupied by the liquid. Both condensation models and compressible procedures are validated and compared to experimental data provided by the TOSQAN ISP47 experiment (IRSN Saclay). Computational results compare favorably with experimental data, particularly for the helium and steam volume fractions.

  17. A NEW APPROACH OF DIGITAL BRIDGE SURFACE MODEL GENERATION

    OpenAIRE

    Ju, H.

    2012-01-01

    Bridge areas present difficulties for orthophoto generation, and to avoid “collapsed” bridges in the orthoimage, operator assistance is required to create a precise DBM (Digital Bridge Model), which is subsequently used for the orthoimage generation. In this paper, a new approach to DBM generation, based on fusing LiDAR (Light Detection And Ranging) data and aerial imagery, is proposed. No precise exterior orientation of the aerial image is required for the DBM generation. First, a co...

  18. A Conditional Approach to Panel Data Models with Common Shocks

    Directory of Open Access Journals (Sweden)

    Giovanni Forchini

    2016-01-01

    Full Text Available This paper studies the effects of common shocks on the OLS estimators of the slope parameters in linear panel data models. The shocks are assumed to affect both the errors and some of the explanatory variables. In contrast to existing approaches, which rely on results on martingale difference sequences, our method relies on conditional strong laws of large numbers and conditional central limit theorems for conditionally heterogeneous random variables.

  19. Modeling software with finite state machines a practical approach

    CERN Document Server

    Wagner, Ferdinand; Wagner, Thomas; Wolstenholme, Peter

    2006-01-01

    Modeling Software with Finite State Machines: A Practical Approach explains how to apply finite state machines to software development. It provides a critical analysis of using finite state machines as a foundation for executable specifications to reduce software development effort and improve quality. This book discusses the design of a state machine and of a system of state machines. It also presents a detailed analysis of development issues relating to behavior modeling with design examples and design rules for using finite state machines. This volume describes a coherent and well-tested fr

  20. Ionization coefficient approach to modeling breakdown in nonuniform geometries.

    Energy Technology Data Exchange (ETDEWEB)

    Warne, Larry Kevin; Jorgenson, Roy Eberhardt; Nicolaysen, Scott D.

    2003-11-01

    This report summarizes the work on breakdown modeling in nonuniform geometries by the ionization coefficient approach. Included are: (1) fits to primary and secondary ionization coefficients used in the modeling; (2) analytical test cases for sphere-to-sphere, wire-to-wire, corner, coaxial, and rod-to-plane geometries; (3) a compilation of experimental data with source references; and (4) comparisons between code results, test case results, and experimental data. A simple criterion is proposed to differentiate between corona and spark. The effect of a dielectric surface on avalanche growth is examined by means of Monte Carlo simulations. The presence of a clean dry surface does not appear to enhance growth.
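
    The core calculation in the ionization-coefficient approach is the integral of the effective ionization coefficient along a field line, compared against a breakdown criterion. The sketch below uses a generic A*exp(-B/(E/p)) fit, a crude rod-plane field profile and a criterion constant of about 18, all of which are textbook-style assumptions rather than the fits or geometries from this report.

```python
# Generic ionization-coefficient calculation: integrate the Townsend
# coefficient alpha(E) along a nonuniform field line and compare the avalanche
# exponent with an assumed breakdown (streamer) criterion of ~18.
import numpy as np

p_torr = 760.0                      # pressure, torr
gap_cm = 1.0                        # rod-plane gap, cm (assumed)
r_tip = 0.05                        # rod tip radius, cm (assumed)
V = 30e3                            # applied voltage, V (assumed)

x = np.linspace(1e-3, gap_cm, 2000)                      # distance from the tip, cm
E = V / (x * np.log(2 * gap_cm / r_tip) + r_tip)         # crude enhanced-field profile, V/cm

def alpha_over_p(E_over_p, A=15.0, B=365.0):
    # generic A*exp(-B/(E/p)) fit for an air-like gas; A, B are assumed constants
    return A * np.exp(-B / np.maximum(E_over_p, 1e-6))

alpha = alpha_over_p(E / p_torr) * p_torr                # ionization coefficient, 1/cm
K = np.sum(0.5 * (alpha[1:] + alpha[:-1]) * np.diff(x))  # trapezoidal integral of alpha dx
print(f"integral of alpha dx = {K:.1f} ->", "spark/streamer likely" if K > 18 else "corona only")
```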

  1. A Data Mining Approach to Modelling of Water Supply Assets

    DEFF Research Database (Denmark)

    Babovic, V.; Drecourt, J.; Keijzer, M.

    2002-01-01

    supply assets are mainly situated underground, and therefore not visible and under the influence of various highly unpredictable forces. This paper proposes the use of advanced data mining methods in order to determine the risks of pipe bursts. For example, analysis of the database of already occurred...... with the choice of pipes to be replaced, the outlined approach opens completely new avenues in asset modelling. The condition of an asset such as a water supply network deteriorates with age. With reliable risk models addressing the evolution of risk with an aging asset, it is now possible to plan optimal...

  2. AN APPROACH IN MODELING TWO-DIMENSIONAL PARTIALLY CAVITATING FLOW

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    An approach to modeling viscous, unsteady, partially cavitating flows around lifting bodies is presented. By employing a one-fluid Navier-Stokes solver, the algorithm is shown to be able to handle two-dimensional laminar cavitating flows at moderate Reynolds numbers. Based on the state equation of the water-vapor mixture, the constitutive relations between densities and pressures are established. To numerically simulate the cavity wall, different pseudo-transition density models are assumed. The finite-volume method is adopted and the algorithm can be extended to three-dimensional cavitating flows.

  3. A transformation approach to modelling multi-modal diffusions

    DEFF Research Database (Denmark)

    Forman, Julie Lyng; Sørensen, Michael

    2014-01-01

    when the diffusion is observed with additional measurement error. The new approach is applied to molecular dynamics data in the form of a reaction coordinate of the small Trp-zipper protein, from which the folding and unfolding rates of the protein are estimated. Because the diffusion coefficient...... is state-dependent, the new models provide a better fit to this type of protein folding data than the previous models with a constant diffusion coefficient, particularly when the effect of errors with a short time-scale is taken into account....

  4. THE SIGNAL APPROACH TO MODELLING THE BALANCE OF PAYMENT CRISIS

    Directory of Open Access Journals (Sweden)

    O. Chernyak

    2016-12-01

    Full Text Available The paper considers and presents a synthesis of theoretical models of the balance of payment crisis and investigates the most effective ways to model the crisis in Ukraine. For the mathematical formalization of the balance of payment crisis, a comparative analysis of the effectiveness of different calculation methods for the Exchange Market Pressure Index was performed. A set of indicators that signal a growing likelihood of a balance of payment crisis was defined using the signal approach. With the help of a minimization function, threshold values of the indicators were selected, the crossing of which signals an increase in the probability of a balance of payment crisis.

  5. Laser modeling a numerical approach with algebra and calculus

    CERN Document Server

    Csele, Mark Steven

    2014-01-01

    Offering a fresh take on laser engineering, Laser Modeling: A Numerical Approach with Algebra and Calculus presents algebraic models and traditional calculus-based methods in tandem to make concepts easier to digest and apply in the real world. Each technique is introduced alongside a practical, solved example based on a commercial laser. Assuming some knowledge of the nature of light, emission of radiation, and basic atomic physics, the text: explains how to formulate an accurate gain threshold equation as well as determine small-signal gain; discusses gain saturation and introduces a novel pass
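
    The gain-threshold relation referred to above can be evaluated numerically in a few lines. The cavity length, mirror reflectivities, internal loss and saturation intensity below are invented for the example and are not taken from the book.

```python
# Threshold gain for a two-mirror cavity plus homogeneous gain saturation.
# Cavity parameters are illustrative only.
import numpy as np

L = 0.30                 # gain-medium length, m (assumed)
R1, R2 = 0.99, 0.95      # mirror power reflectivities (assumed)
alpha_int = 0.05         # distributed internal loss, 1/m (assumed)

# threshold gain balances internal loss plus mirror (output) loss
g_th = alpha_int + (1.0 / (2 * L)) * np.log(1.0 / (R1 * R2))
print(f"threshold gain coefficient: {g_th:.3f} 1/m")

g0, I_sat = 3 * g_th, 100.0          # small-signal gain and saturation intensity (assumed)
I = np.linspace(0, 500, 6)           # intracavity intensity, arbitrary units
print("g(I)/g_th:", np.round(g0 / (1 + I / I_sat) / g_th, 2))   # gain saturates toward g_th
```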

  6. Surrogate based approaches to parameter inference in ocean models

    KAUST Repository

    Knio, Omar

    2016-01-06

    This talk discusses the inference of physical parameters using model surrogates. Attention is focused on the use of sampling schemes to build suitable representations of the dependence of the model response on uncertain input data. Non-intrusive spectral projections and regularized regressions are used for this purpose. A Bayesian inference formalism is then applied to update the uncertain inputs based on available measurements or observations. To perform the update, we consider two alternative approaches, based on the application of Markov Chain Monte Carlo methods or of adjoint-based optimization techniques. We outline the implementation of these techniques to infer dependence of wind drag, bottom drag, and internal mixing coefficients.
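
    A compact sketch of the surrogate-plus-inference workflow is given below. The 'expensive model' is replaced by a cheap stand-in function, the surrogate is a low-order polynomial fit over the prior range of a single drag coefficient, and the sampler is a plain random-walk Metropolis scheme; all of these are assumptions for illustration, not the spectral-projection or adjoint machinery discussed in the talk.

```python
# Surrogate-based Bayesian inference sketch: a polynomial surrogate of a
# model response over the prior range of one drag coefficient, then a
# random-walk Metropolis sampler run against the surrogate only.
import numpy as np

rng = np.random.default_rng(4)

def expensive_model(c_drag):                      # cheap stand-in for the ocean model
    return 2.0 * np.sqrt(c_drag) + 0.1 * c_drag

# 1) design points over the prior range and a degree-3 polynomial surrogate
c_train = np.linspace(0.5, 3.0, 12)
coeffs = np.polyfit(c_train, expensive_model(c_train), deg=3)
surrogate = lambda c: np.polyval(coeffs, c)

# 2) one synthetic, noisy observation of the response
c_true, sigma = 1.7, 0.05
y_obs = expensive_model(c_true) + rng.normal(0, sigma)

def log_post(c):
    if not 0.5 <= c <= 3.0:                       # uniform prior bounds
        return -np.inf
    return -0.5 * ((y_obs - surrogate(c)) / sigma) ** 2

# 3) Metropolis updates using surrogate evaluations only
chain, c = [], 1.0
lp = log_post(c)
for _ in range(20000):
    prop = c + rng.normal(0, 0.1)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        c, lp = prop, lp_prop
    chain.append(c)

chain = np.array(chain[5000:])                    # discard burn-in
print("posterior mean / sd of drag coefficient:", chain.mean().round(3), chain.std().round(3))
```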

  7. Modeling fabrication of nuclear components: An integrative approach

    Energy Technology Data Exchange (ETDEWEB)

    Hench, K.W.

    1996-08-01

    Reduction of the nuclear weapons stockpile and the general downsizing of the nuclear weapons complex has presented challenges for Los Alamos. One is to design an optimized fabrication facility to manufacture nuclear weapon primary components in an environment of intense regulation and shrinking budgets. This dissertation presents an integrative two-stage approach to modeling the casting operation for fabrication of nuclear weapon primary components. The first stage optimizes personnel radiation exposure for the casting operation layout by modeling the operation as a facility layout problem formulated as a quadratic assignment problem. The solution procedure uses an evolutionary heuristic technique. The best solutions to the layout problem are used as input to the second stage - a simulation model that assesses the impact of competing layouts on operational performance. The focus of the simulation model is to determine the layout that minimizes personnel radiation exposures and nuclear material movement, and maximizes the utilization of capacity for finished units.

  8. Injury prevention risk communication: A mental models approach

    DEFF Research Database (Denmark)

    Austin, Laurel Cecelia; Fischhoff, Baruch

    2012-01-01

    Individuals' decisions and behaviour can play a critical role in determining both the probability and severity of injury. Behavioural decision research studies people's decision-making processes in terms comparable to scientific models of optimal choices, providing a basis for focusing...... interventions on the most critical opportunities to reduce risks. That research often seeks to identify the ‘mental models’ that underlie individuals' interpretations of their circumstances and the outcomes of possible actions. In the context of injury prevention, a mental models approach would ask why people...... and create an expert model of the risk situation, interviewing lay people to elicit their comparable mental models, and developing and evaluating communication interventions designed to close the gaps between lay people and experts. This paper reviews the theory and method behind this research stream...

  9. An Integrated Approach to Flexible Modelling and Animated Simulation

    Institute of Scientific and Technical Information of China (English)

    Li Shuliang; Wu Zhenye

    1994-01-01

    Based on the software support of SIMAN/CINEMA, this paper presents an integrated approach to flexible modelling and simulation with animation. The methodology provides a structured way of integrating mathematical and logical models, statistical experimentation, and statistical analysis with computer animation. Within this methodology, an animated simulation study is separated into six different activities: simulation objectives identification, system model development, simulation experiment specification, animation layout construction, real-time simulation and animation run, and output data analysis. These six activities are objectives-driven, relatively independent, and integrated through software organization and simulation files. The key ideas behind this methodology are objectives orientation, modelling flexibility, simulation and animation integration, and application tailorability. Though the methodology is closely related to SIMAN/CINEMA, it can be extended to other software environments.

  10. Model-based approach for elevator performance estimation

    Science.gov (United States)

    Esteban, E.; Salgado, O.; Iturrospe, A.; Isasa, I.

    2016-02-01

    In this paper, a dynamic model for an elevator installation is presented in the state space domain. The model comprises both the mechanical and the electrical subsystems, including the electrical machine and a closed-loop field oriented control. The proposed model is employed for monitoring the condition of the elevator installation. The adopted model-based approach for monitoring employs the Kalman filter as an observer. A Kalman observer estimates the elevator car acceleration, which determines the elevator ride quality, based solely on the machine control signature and the encoder signal. Finally, five elevator key performance indicators are calculated based on the estimated car acceleration. The proposed procedure is experimentally evaluated, by comparing the key performance indicators calculated based on the estimated car acceleration and the values obtained from actual acceleration measurements in a test bench. Finally, the proposed procedure is compared with the sliding mode observer.
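
    The observer idea can be illustrated with a purely kinematic stand-in for the full electromechanical model: a constant-acceleration state-space model driven only by a noisy position encoder, from which a Kalman filter recovers velocity and acceleration. The motion profile and noise covariances below are invented for the sketch.

```python
# Minimal Kalman-observer sketch: estimate elevator-car velocity and
# acceleration from a noisy position encoder with a constant-acceleration
# kinematic model, standing in for the full electromechanical state model.
import numpy as np

dt = 0.01
t = np.arange(0, 5, dt)
acc_true = np.where(t < 2, 0.6, np.where(t < 3, 0.0, -0.6))     # assumed m/s^2 profile
vel_true = np.cumsum(acc_true) * dt
pos_true = np.cumsum(vel_true) * dt

rng = np.random.default_rng(5)
encoder = pos_true + rng.normal(0, 1e-4, t.size)                # noisy position signal

F = np.array([[1, dt, 0.5 * dt**2], [0, 1, dt], [0, 0, 1]])     # state transition
H = np.array([[1.0, 0.0, 0.0]])                                 # measure position only
Q = np.diag([1e-9, 1e-7, 1e-3])                                 # assumed process noise
R = np.array([[1e-8]])                                          # assumed encoder noise

x, P = np.zeros(3), np.eye(3)
acc_est = []
for z in encoder:
    x = F @ x                                   # predict
    P = F @ P @ F.T + Q
    y = z - H @ x                               # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)              # Kalman gain
    x = x + (K @ y).ravel()
    P = (np.eye(3) - K @ H) @ P
    acc_est.append(x[2])

print("final acceleration estimate vs true:", round(acc_est[-1], 3), acc_true[-1])
```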

  11. A Model Independent Approach to (p)Reheating

    CERN Document Server

    Özsoy, Ogan; Sinha, Kuver; Watson, Scott

    2015-01-01

    In this note we propose a model independent framework for inflationary (p)reheating. Our approach is analogous to the Effective Field Theory of Inflation; however, here the inflaton oscillations provide an additional source of (discrete) symmetry breaking. Using the Goldstone field that non-linearly realizes time diffeomorphism invariance, we construct a model independent action for both the inflaton and reheating sectors. Utilizing the hierarchy of scales present during the reheating process, we are able to recover known results in the literature in a simpler fashion, including the presence of oscillations in the primordial power spectrum. We also construct a class of models where the shift symmetry of the inflaton is preserved during reheating, which helps alleviate past criticisms of (p)reheating in models of Natural Inflation. Extensions of our framework suggest the possibility of analytically investigating non-linear effects (such as rescattering and back-reaction) during thermalization without resorting t...

  12. A model-based approach to human identification using ECG

    Science.gov (United States)

    Homer, Mark; Irvine, John M.; Wendelken, Suzanne

    2009-05-01

    Biometrics, such as fingerprint, iris scan, and face recognition, offer methods for identifying individuals based on a unique physiological measurement. Recent studies indicate that a person's electrocardiogram (ECG) may also provide a unique biometric signature. Current techniques for identification using ECG rely on empirical methods for extracting features from the ECG signal. This paper presents an alternative approach based on a time-domain model of the ECG trace. Because Auto-Regressive Integrated Moving Average (ARIMA) models form a rich class of descriptors for representing the structure of periodic time series data, they are well-suited to characterizing the ECG signal. We present a method for modeling the ECG, extracting features from the model representation, and identifying individuals using these features.
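
    A minimal sketch of the model-based idea is given below: fit an ARIMA/ARMA model to each subject's signal and use the fitted coefficients as a feature vector for nearest-neighbour matching. The synthetic sinusoid-plus-noise signals, the model order and the Euclidean matching rule are assumptions for illustration, not the paper's ECG processing.

```python
# Model-based biometric sketch: ARMA coefficients as an identity feature
# vector, matched by nearest neighbour. Signals are synthetic stand-ins.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(6)

def synthetic_ecg(freq, noise=0.05, n=600):
    t = np.arange(n)
    return np.sin(2 * np.pi * freq * t) + 0.4 * np.sin(4 * np.pi * freq * t) \
        + rng.normal(0, noise, n)

def features(signal, order=(4, 0, 1)):
    fit = ARIMA(signal, order=order).fit()
    return np.r_[fit.arparams, fit.maparams]        # AR and MA coefficients as features

enrolled = {"subject_A": features(synthetic_ecg(0.013)),
            "subject_B": features(synthetic_ecg(0.021))}

probe = features(synthetic_ecg(0.013))              # a new recording of subject A
match = min(enrolled, key=lambda name: np.linalg.norm(enrolled[name] - probe))
print("identified as:", match)
```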

  13. Computer Modeling of Violent Intent: A Content Analysis Approach

    Energy Technology Data Exchange (ETDEWEB)

    Sanfilippo, Antonio P.; Mcgrath, Liam R.; Bell, Eric B.

    2014-01-03

    We present a computational approach to modeling the intent of a communication source representing a group or an individual to engage in violent behavior. Our aim is to identify and rank aspects of radical rhetoric that are endogenously related to violent intent to predict the potential for violence as encoded in written or spoken language. We use correlations between contentious rhetoric and the propensity for violent behavior found in documents from radical terrorist and non-terrorist groups and individuals to train and evaluate models of violent intent. We then apply these models to unseen instances of linguistic behavior to detect signs of contention that have a positive correlation with violent intent factors. Of particular interest is the application of violent intent models to social media, such as Twitter, that have proved to serve as effective channels in furthering sociopolitical change.

  14. Hierarchical Agent-Based Integrated Modelling Approach for Microgrids with Adoption of EVs and HRES

    Directory of Open Access Journals (Sweden)

    Peng Han

    2014-01-01

    Full Text Available The large-scale adoption of electric vehicles (EVs), hybrid renewable energy systems (HRESs), and increasing loads will bring significant challenges to the microgrid. The methodology used to model a microgrid with high EV and HRES penetration is the key to EV adoption assessment and optimized HRES deployment. However, considering the complex interactions of a microgrid containing massive numbers of EVs and HRESs, any previous single modelling approach is insufficient. Therefore, in this paper, a methodology named the Hierarchical Agent-based Integrated Modelling Approach (HAIMA) is proposed. With the effective integration of agent-based modelling with other advanced modelling approaches, the proposed approach theoretically contributes a new microgrid model hierarchically constituted by a microgrid management layer, a component layer, and an event layer. HAIMA then further links the key parameters and interconnects them to achieve the interactions of the whole model. Furthermore, HAIMA practically contributes a comprehensive microgrid operation system, through which the assessment of the proposed model and of the impact of EV adoption is achieved. Simulations show that the proposed HAIMA methodology will be beneficial for microgrid studies and EV operation assessment and can be further utilized for energy management, electricity consumption prediction, EV scheduling control, and HRES deployment optimization.

  15. A Model-based Prognostics Approach Applied to Pneumatic Valves

    Directory of Open Access Journals (Sweden)

    Matthew J. Daigle

    2011-01-01

    Full Text Available Within the area of systems health management, the task of prognostics centers on predicting when components will fail. Model-based prognostics exploits domain knowledge of the system, its components, and how they fail by casting the underlying physical phenomena in a physics-based model that is derived from first principles. Uncertainty cannot be avoided in prediction, therefore, algorithms are employed that help in managing these uncertainties. The particle filtering algorithm has become a popular choice for model-based prognostics due to its wide applicability, ease of implementation, and support for uncertainty management. We develop a general model-based prognostics methodology within a robust probabilistic framework using particle filters. As a case study, we consider a pneumatic valve from the Space Shuttle cryogenic refueling system. We develop a detailed physics-based model of the pneumatic valve, and perform comprehensive simulation experiments to illustrate our prognostics approach and evaluate its effectiveness and robustness. The approach is demonstrated using historical pneumatic valve data from the refueling system.
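
    A toy particle-filter prognostics loop can be sketched as follows. The linear wear model, noise levels and failure threshold are invented stand-ins for the pneumatic-valve physics model; the sketch only shows the track-then-predict pattern (state tracking with resampling, followed by extrapolation of each particle to the failure threshold).

```python
# Toy particle-filter prognostics sketch: track a degrading friction parameter
# from noisy measurements and predict when it crosses a failure threshold.
# The linear wear model and all numbers stand in for the valve physics model.
import numpy as np

rng = np.random.default_rng(7)
n_particles, threshold = 1000, 10.0
wear_rate_true, meas_noise = 0.05, 0.3

friction = np.full(n_particles, 1.0)                     # tracked state per particle
wear = rng.uniform(0.0, 0.2, n_particles)                # unknown wear rate per particle
weights = np.full(n_particles, 1.0 / n_particles)

true_friction = 1.0
for k in range(60):                                      # 60 measurement cycles
    true_friction += wear_rate_true
    z = true_friction + rng.normal(0, meas_noise)

    friction += wear + rng.normal(0, 0.02, n_particles)  # propagate particles
    weights *= np.exp(-0.5 * ((z - friction) / meas_noise) ** 2)
    weights /= weights.sum()

    if 1.0 / np.sum(weights**2) < n_particles / 2:       # resample on low effective sample size
        idx = rng.choice(n_particles, n_particles, p=weights)
        friction, wear = friction[idx], wear[idx]
        weights[:] = 1.0 / n_particles

# extrapolate each particle to the failure threshold to get a remaining-life estimate
remaining = (threshold - friction) / np.maximum(wear, 1e-6)
print("median predicted remaining cycles:", int(np.median(remaining)))
print("true remaining cycles:", int((threshold - true_friction) / wear_rate_true))
```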

  16. A Model-Based Prognostics Approach Applied to Pneumatic Valves

    Science.gov (United States)

    Daigle, Matthew J.; Goebel, Kai

    2011-01-01

    Within the area of systems health management, the task of prognostics centers on predicting when components will fail. Model-based prognostics exploits domain knowledge of the system, its components, and how they fail by casting the underlying physical phenomena in a physics-based model that is derived from first principles. Uncertainty cannot be avoided in prediction, therefore, algorithms are employed that help in managing these uncertainties. The particle filtering algorithm has become a popular choice for model-based prognostics due to its wide applicability, ease of implementation, and support for uncertainty management. We develop a general model-based prognostics methodology within a robust probabilistic framework using particle filters. As a case study, we consider a pneumatic valve from the Space Shuttle cryogenic refueling system. We develop a detailed physics-based model of the pneumatic valve, and perform comprehensive simulation experiments to illustrate our prognostics approach and evaluate its effectiveness and robustness. The approach is demonstrated using historical pneumatic valve data from the refueling system.

  17. Systems pharmacology modeling: an approach to improving drug safety.

    Science.gov (United States)

    Bai, Jane P F; Fontana, Robert J; Price, Nathan D; Sangar, Vineet

    2014-01-01

    Advances in systems biology in conjunction with the expansion in knowledge of drug effects and diseases present an unprecedented opportunity to extend traditional pharmacokinetic and pharmacodynamic modeling/analysis to conduct systems pharmacology modeling. Many drugs that cause liver injury and myopathies have been studied extensively. Mitochondrion-centric systems pharmacology modeling is important since drug toxicity across a large number of pharmacological classes converges to mitochondrial injury and death. Approaches to systems pharmacology modeling of drug effects need to consider drug exposure, organelle and cellular phenotypes across all key cell types of human organs, organ-specific clinical biomarkers/phenotypes, gene-drug interaction and immune responses. Systems modeling approaches, that leverage the knowledge base constructed from curating a selected list of drugs across a wide range of pharmacological classes, will provide a critically needed blueprint for making informed decisions to reduce the rate of attrition for drugs in development and increase the number of drugs with an acceptable benefit/risk ratio.

  18. THE MODEL OF EXTERNSHIP ORGANIZATION FOR FUTURE TEACHERS: QUALIMETRIC APPROACH

    Directory of Open Access Journals (Sweden)

    Taisiya A. Isaeva

    2015-01-01

    Full Text Available The aim of the paper is to present the author's model for bachelor students – future teachers of vocational training. The model has been worked out from the standpoint of the qualimetric approach and provides pedagogical training. Methods. The process is based on an analysis of the literature on externship organization for students in higher education and includes SWOT-analysis techniques in pedagogical training. The method of group expert evaluation is the main method of pedagogical qualimetry. Structural components of the professional pedagogical competency of students – future teachers – are defined. This allows us to determine the development level and the assessment criteria for mastering the programme «Vocational training (branch-wise)». Results. This article interprets the concept «pedagogical training»; its basic organization principles during students' practice are stated. The methods of expert group formation are presented: self-assessment and personal data. Scientific novelty. The externship organization model for future teachers is developed. The model is based on pedagogical training, using the qualimetric approach and SWOT-analysis techniques. The proposed criterion-assessment procedures make it possible to determine the development levels of professional and pedagogical competency. Practical significance. The model has been introduced into the pedagogical training within the educational process of Kalashnikov Izhevsk State Technical University, and can be used in other similar educational establishments.

  19. Spatiotemporal infectious disease modeling: a BME-SIR approach.

    Science.gov (United States)

    Angulo, Jose; Yu, Hwa-Lung; Langousis, Andrea; Kolovos, Alexander; Wang, Jinfeng; Madrid, Ana Esther; Christakos, George

    2013-01-01

    This paper is concerned with the modeling of infectious disease spread in a composite space-time domain under conditions of uncertainty. We focus on stochastic modeling that accounts for basic mechanisms of disease distribution and multi-sourced in situ uncertainties. Starting from the general formulation of population migration dynamics and the specification of transmission and recovery rates, the model studies the functional formulation of the evolution of the fractions of susceptible-infected-recovered individuals. The suggested approach is capable of: a) modeling population dynamics within and across localities, b) integrating the disease representation (i.e. susceptible-infected-recovered individuals) with observation time series at different geographical locations and other sources of information (e.g. hard and soft data, empirical relationships, secondary information), and c) generating predictions of disease spread and associated parameters in real time, while considering model and observation uncertainties. Key aspects of the proposed approach are illustrated by means of simulations (i.e. synthetic studies), and a real-world application using hand-foot-mouth disease (HFMD) data from China.
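
    The mechanistic core of the approach, the susceptible-infected-recovered dynamics, can be written down compactly; the BME data-integration and uncertainty layers are not reproduced here, and the transmission and recovery rates below are assumed values.

```python
# Deterministic SIR backbone of the susceptible-infected-recovered dynamics;
# beta and gamma are assumed rates, and the stochastic BME layer is omitted.
import numpy as np
from scipy.integrate import solve_ivp

beta, gamma = 0.4, 0.15         # transmission and recovery rates per day (assumed)

def sir(t, y):
    s, i, r = y
    return [-beta * s * i, beta * s * i - gamma * i, gamma * i]

sol = solve_ivp(sir, [0, 120], [0.999, 0.001, 0.0], dense_output=True)
t = np.linspace(0, 120, 7)
s, i, r = sol.sol(t)
print("day :", t.astype(int))
print("I(t):", np.round(i, 4))          # infected fraction rises, peaks, then declines
```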

  20. Soft computing approach for modeling power plant with a once-through boiler

    Energy Technology Data Exchange (ETDEWEB)

    Ghaffari, Ali; Chaibakhsh, Ali [Department of Mechanical Engineering, K.N. Toosi University of Technology, P.O. Box 16765-3381, Tehran, (Iran); Lucas, Caro [Department of Electrical and Computer Engineering, University of Tehran, P.O. Box 14318, Tehran, (Iran)

    2007-09-15

    In this paper, a soft computing approach is presented for modeling electrical power generating plants in order to characterize the essential dynamic behavior of the plant subsystems. The structure of the soft computing method consists of fuzzy logic, neural networks and genetic algorithms. The measured data from a complete set of field experiments form the basis for training the models, including the extraction of linguistic rules and membership functions as well as the adjustment of the other parameters of the fuzzy model. The genetic algorithm is applied to the modeling approach in order to optimize the training procedure. A comparison between the responses of the proposed models and the responses of the plants validates the accuracy and performance of the modeling approach. A similar comparison between the responses of these models and those of the models obtained from the thermodynamic and physical relations of the plant shows the effectiveness and feasibility of the developed model in terms of higher accuracy and smaller deviations between the responses of the models and the corresponding subsystems. (Author)

  1. Model-driven engineering approach to design and implementation of robot control system

    OpenAIRE

    Trojanek, Piotr

    2013-01-01

    In this paper we apply a model-driven engineering approach to designing domain-specific solutions for robot control system development. We present a case study of the complete process, including identification of the domain meta-model, graphical notation definition and source code generation for subsumption architecture -- a well-known example of robot control architecture. Our goal is to show that both the definition of the robot-control architecture and its supporting tools fits well into t...

  3. Rescaled Local Interaction Simulation Approach for Shear Wave Propagation Modelling in Magnetic Resonance Elastography

    Directory of Open Access Journals (Sweden)

    Z. Hashemiyan

    2016-01-01

    Full Text Available Properties of soft biological tissues are increasingly used in medical diagnosis to detect various abnormalities, for example, in liver fibrosis or breast tumors. It is well known that mechanical stiffness of human organs can be obtained from organ responses to shear stress waves through Magnetic Resonance Elastography. The Local Interaction Simulation Approach is proposed for effective modelling of shear wave propagation in soft tissues. The results are validated using experimental data from Magnetic Resonance Elastography. These results show the potential of the method for shear wave propagation modelling in soft tissues. The major advantage of the proposed approach is a significant reduction of computational effort.

  4. Rescaled Local Interaction Simulation Approach for Shear Wave Propagation Modelling in Magnetic Resonance Elastography

    Science.gov (United States)

    Packo, P.; Staszewski, W. J.; Uhl, T.

    2016-01-01

    Properties of soft biological tissues are increasingly used in medical diagnosis to detect various abnormalities, for example, in liver fibrosis or breast tumors. It is well known that mechanical stiffness of human organs can be obtained from organ responses to shear stress waves through Magnetic Resonance Elastography. The Local Interaction Simulation Approach is proposed for effective modelling of shear wave propagation in soft tissues. The results are validated using experimental data from Magnetic Resonance Elastography. These results show the potential of the method for shear wave propagation modelling in soft tissues. The major advantage of the proposed approach is a significant reduction of computational effort. PMID:26884808
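
    In its simplest one-dimensional, homogeneous form, a local-interaction update of this kind reduces to an explicit stencil in which each node exchanges information only with its immediate neighbours, as in the sketch below; the wave speed, grid spacing and initial pulse are placeholder values, not parameters from the elastography experiments.

```python
import numpy as np

# One-dimensional shear-wave propagation with a local explicit update
# (illustrative stand-in for a LISA-type iteration; parameters are assumed).
c = 3.0            # shear wave speed [m/s], typical order for soft tissue
dx = 1e-3          # grid spacing [m]
dt = 0.4 * dx / c  # time step satisfying the CFL stability condition
n_nodes, n_steps = 400, 600

u_prev = np.zeros(n_nodes)
u_curr = np.zeros(n_nodes)
u_curr[n_nodes // 2] = 1e-6   # small initial displacement pulse

r2 = (c * dt / dx) ** 2
for step in range(n_steps):
    u_next = np.zeros_like(u_curr)
    # each interior node interacts only with its two neighbours
    u_next[1:-1] = (2 * u_curr[1:-1] - u_prev[1:-1]
                    + r2 * (u_curr[2:] - 2 * u_curr[1:-1] + u_curr[:-2]))
    u_prev, u_curr = u_curr, u_next

print("displacement field energy:", np.sum(u_curr ** 2))
```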

  5. Modeling drug- and chemical- induced hepatotoxicity with systems biology approaches

    Directory of Open Access Journals (Sweden)

    Sudin eBhattacharya

    2012-12-01

    Full Text Available We provide an overview of computational systems biology approaches as applied to the study of chemical- and drug-induced toxicity. The concept of ‘toxicity pathways’ is described in the context of the 2007 US National Academies of Science report, Toxicity Testing in the 21st Century: A Vision and a Strategy. Pathway mapping and modeling based on network biology concepts are a key component of the vision laid out in this report for a more biologically based analysis of dose-response behavior and the safety of chemicals and drugs. We focus on toxicity of the liver (hepatotoxicity) – a complex phenotypic response with contributions from a number of different cell types and biological processes. We describe three case studies of complementary multi-scale computational modeling approaches to understand perturbation of toxicity pathways in the human liver as a result of exposure to environmental contaminants and specific drugs. One approach involves development of a spatial, multicellular virtual tissue model of the liver lobule that combines molecular circuits in individual hepatocytes with cell-cell interactions and blood-mediated transport of toxicants through hepatic sinusoids, to enable quantitative, mechanistic prediction of hepatic dose-response for activation of the AhR toxicity pathway. Simultaneously, methods are being developed to extract quantitative maps of intracellular signaling and transcriptional regulatory networks perturbed by environmental contaminants, using a combination of gene expression and genome-wide protein-DNA interaction data. A predictive physiological model (DILIsym™) to understand drug-induced liver injury (DILI), the most common adverse event leading to termination of clinical development programs and regulatory actions on drugs, is also described. The model initially focuses on reactive metabolite-induced DILI in response to administration of acetaminophen, and spans multiple biological scales.

  6. A Statistical Approach For Modeling Tropical Cyclones. Synthetic Hurricanes Generator Model

    Energy Technology Data Exchange (ETDEWEB)

    Pasqualini, Donatella [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-11

    This manuscript briefly describes a statistical approach to generate synthetic tropical cyclone tracks to be used in risk evaluations. The Synthetic Hurricane Generator (SynHurG) model allows modeling hurricane risk in the United States, supporting decision makers and the implementation of adaptation strategies to extreme weather. In the literature there are mainly two approaches to modeling hurricane hazard for risk prediction: deterministic-statistical approaches, where the storm's key physical parameters are calculated using complex physical climate models and the tracks are usually determined statistically from historical data; and statistical approaches, where both variables and tracks are estimated stochastically using historical records. SynHurG falls in the second category, adopting a purely stochastic approach.
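
    A purely stochastic track generator of the kind described can be caricatured as a correlated random walk whose genesis location, heading persistence and translation-speed statistics would, in practice, be fitted to historical records; every number in the sketch below is an invented placeholder rather than a SynHurG parameter.

```python
import numpy as np

rng = np.random.default_rng(42)

def synthetic_track(n_hours=120, step_hours=6):
    """Generate one synthetic cyclone track as a correlated random walk.

    Genesis location and heading/speed statistics are illustrative
    placeholders, not parameters estimated from the historical record.
    """
    lon, lat = -45.0 + rng.normal(0, 3), 12.0 + rng.normal(0, 2)
    heading = np.deg2rad(165 + rng.normal(0, 20))  # roughly west-northwest (angle from east)
    track = [(lon, lat)]
    for _ in range(n_hours // step_hours):
        speed = max(rng.normal(0.15, 0.05), 0.02)  # translation speed [deg/hour]
        heading += rng.normal(0, 0.15)             # heading persistence + noise
        lon += speed * step_hours * np.cos(heading)
        lat += speed * step_hours * np.sin(heading)
        track.append((lon, lat))
    return np.array(track)

tracks = [synthetic_track() for _ in range(1000)]
print("example endpoint (lon, lat):", tracks[0][-1])
```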

  7. A generalized quarter car modelling approach with frame flexibility and other nonlocal effects

    Indian Academy of Sciences (India)

    HUSAIN KANCHWALA; ANINDYA CHATTERJEE

    2017-07-01

    Quarter-car models are popular, simple, unidirectional in kinematics and enable quicker computation than full-car models. However, they do not account for three other wheels and their suspensions, nor for the frame’s flexibility, mass distribution and damping. Here we propose a generalized quarter-car modelling approach, incorporating both the frame as well as other-wheel ground contacts. Our approach is linear, uses Laplace transforms, involves vertical motions of key points of interest and has intermediate complexity with improved realism. Our model uses baseline suspension parameters and responses to step force inputs at suspension attachment locations on the frame. Subsequently, new suspension parameters and unsprung mass compliance parameters can be incorporated, for which relevant formulas are given. The final expression for the transfer function, between ground displacement and body point response, is approximated using model order reduction. A simple Matlab code is provided that enables quick parametric studies. Finally, a parametric study and wheel hop analysis are performed for a realistic numerical example. Frequency and time domain responses obtained show clearly the effects of other wheels, which are outside the scope of usual quarter-car models. The displacements obtained from our model are compared against those of the usual quarter-car model and show ways in which predictions of the quarter-car model include errors that can be reduced in our approach. In summary, our approach has intermediate complexity between that of a full-car model and a quarter-car model, and offers corresponding intermediate detail and realism.
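
    For comparison with the generalized formulation, the baseline two-degree-of-freedom quarter-car that the approach extends can be evaluated in a few lines by solving its Laplace-domain equations of motion at each excitation frequency; the masses, stiffnesses and damping below are generic passenger-car magnitudes, not the paper's values.

```python
import numpy as np

# Baseline two-DOF quarter-car: sprung mass ms over unsprung mass mu.
# Parameter values are generic illustrative magnitudes, not the paper's.
ms, mu = 300.0, 40.0          # masses [kg]
ks, cs = 20e3, 1.5e3          # suspension stiffness [N/m] and damping [N s/m]
kt = 180e3                    # tyre stiffness [N/m]

def body_response(freq_hz):
    """|X_body / X_road| at a given excitation frequency."""
    s = 1j * 2 * np.pi * freq_hz
    # Laplace-domain equations of motion for road input x_r:
    #   ms*s^2*xs + cs*s*(xs-xu) + ks*(xs-xu)               = 0
    #   mu*s^2*xu + cs*s*(xu-xs) + ks*(xu-xs) + kt*(xu-x_r) = 0
    A = np.array([[ms * s**2 + cs * s + ks, -(cs * s + ks)],
                  [-(cs * s + ks), mu * s**2 + cs * s + ks + kt]])
    b = np.array([0.0, kt])   # right-hand side for unit road displacement
    xs, xu = np.linalg.solve(A, b)
    return abs(xs)

for f in (1.0, 1.5, 10.0, 12.0):
    print(f"{f:5.1f} Hz -> |X_body/X_road| = {body_response(f):.2f}")
```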

  8. ABOUT COMPLEX APPROACH TO MODELLING OF TECHNOLOGICAL MACHINES FUNCTIONING

    Directory of Open Access Journals (Sweden)

    A. A. Honcharov

    2015-01-01

    Full Text Available Problems arise in the process of designing, production and investigation of a complicated technological machine. These problems concern not only properties of some types of equipment but they have respect to regularities of control object functioning as a whole. A technological machine is thought of as such technological complex where it is possible to lay emphasis on a control system (or controlling device and a controlled object. The paper analyzes a number of existing approaches to construction of models for controlling devices and their functioning. A complex model for a technological machine operation has been proposed in the paper; in other words it means functioning of a controlling device and a controlled object of the technological machine. In this case models of the controlling device and the controlled object of the technological machine can be represented as aggregate combination (elements of these models. The paper describes a conception on realization of a complex model for a technological machine as a model for interaction of units (elements in the controlling device and the controlled object. When a control activation is given to the controlling device of the technological machine its modelling is executed at an algorithmic or logic level and the obtained output signals are interpreted as events and information about them is transferred to executive mechanisms.The proposed scheme of aggregate integration considers element models as object classes and the integration scheme is presented as a combination of object property values (combination of a great many input and output contacts and combination of object interactions (in the form of an integration operator. Spawn of parent object descendants of the technological machine model and creation of their copies in various project parts is one of the most important means of the distributed technological machine modelling that makes it possible to develop complicated models of

  9. Quantitative versus qualitative modeling: a complementary approach in ecosystem study.

    Science.gov (United States)

    Bondavalli, C; Favilla, S; Bodini, A

    2009-02-01

    Natural disturbances or human perturbations act upon ecosystems by changing some dynamical parameters of one or more species. Foreseeing these modifications is necessary before embarking on an intervention: predictions may help to assess management options and define hypotheses for interventions. Models become valuable tools for studying and making predictions only when they capture the types of interactions and their magnitudes. Quantitative models are more precise and specific about a system, but require a large effort in model construction. Because of this, ecological systems very often remain only partially specified, and one possible approach to their description and analysis comes from qualitative modelling. Qualitative models yield predictions as directions of change in species abundance, but in complex systems these predictions are often ambiguous, being the result of opposite actions exerted on the same species by way of multiple pathways of interactions. Again, to avoid such ambiguities one needs to know the intensity of all links in the system. One way to make link magnitude explicit in a way that can be used in qualitative analysis is described in this paper, and it takes advantage of another type of ecosystem representation: ecological flow networks. These flow diagrams contain the structure, the relative position and the connections between the components of a system, and the quantity of matter flowing along every connection. In this paper it is shown how these ecological flow networks can be used to produce a quantitative model similar to the qualitative counterpart. Analyzed through the apparatus of loop analysis, this quantitative model yields predictions that are by no means ambiguous, solving in an elegant way the basic problem of qualitative analysis. The approach adopted in this work is still preliminary and we must be careful in its application.
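
    The quantitative counterpart of loop analysis can be sketched as follows: for a linear community (Jacobian) matrix, the steady-state response of every variable to a sustained input on one species is read from the negative inverse of the matrix, which removes the sign ambiguity of purely qualitative predictions. The three-species interaction strengths below are invented for illustration.

```python
import numpy as np

# Hypothetical 3-species community matrix (rows = effect on, cols = effect of):
# a predator P, a consumer C and a self-limited resource R.
#              P      C      R
A = np.array([[ 0.0,  0.5,  0.0],   # predator gains from consumer
              [-0.8, -0.1,  0.6],   # consumer eaten by predator, eats resource
              [ 0.0, -0.7, -0.4]])  # resource consumed, self-limited

# For dX/dt = A X + input, the steady-state shift caused by a sustained
# positive input to species i is column i of -A^{-1}.
response = -np.linalg.inv(A)

species = ["predator", "consumer", "resource"]
print("press perturbation on the resource changes:")
for name, value in zip(species, response[:, 2]):
    print(f"  {name:9s} {value:+.2f}")
```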

  10. A multi-model approach to X-ray pulsars

    Directory of Open Access Journals (Sweden)

    Schönherr G.

    2014-01-01

    Full Text Available The emission characteristics of X-ray pulsars are governed by magnetospheric accretion within the Alfvén radius, leading to a direct coupling of accretion column properties and interactions at the magnetosphere. The complexity of the physical processes governing the formation of radiation within the accreted, strongly magnetized plasma has led to several sophisticated theoretical modelling efforts over the last decade, dedicated to either the formation of the broad band continuum, the formation of cyclotron resonance scattering features (CRSFs) or the formation of pulse profiles. While these individual approaches are powerful in themselves, they quickly reach their limits when aiming at a quantitative comparison to observational data. Too many fundamental parameters describing the formation of the accretion columns and the systems’ overall geometry are unconstrained, and different models are often based on different fundamental assumptions, while everything is intertwined in the observed, highly phase-dependent spectra and energy-dependent pulse profiles. To name just one example: the (phase-variable) line width of the CRSFs is highly dependent on the plasma temperature, the existence of B-field gradients (geometry) and observation angle, parameters which, in turn, drive the continuum radiation and are driven by the overall two-pole geometry for the light bending model respectively. This renders a parallel assessment of all available spectral and timing information by a compatible across-models approach indispensable. In a collaboration of theoreticians and observers, we have been working on a model unification project over the last years, bringing together theoretical calculations of the Comptonized continuum, Monte Carlo simulations and Radiation Transfer calculations of CRSFs as well as a General Relativity (GR) light bending model for ray tracing of the incident emission pattern from both magnetic poles. The ultimate goal is to implement a

  11. A participatory modelling approach to developing a numerical sediment dynamics model

    Science.gov (United States)

    Jones, Nicholas; McEwen, Lindsey; Parker, Chris; Staddon, Chad

    2016-04-01

    Fluvial geomorphology is recognised as an important consideration in policy and legislation in the management of river catchments. Despite this recognition, limited knowledge exchange occurs between scientific researchers and river management practitioners. An example of this can be found within the limited uptake of numerical models of sediment dynamics by river management practitioners in the United Kingdom. The uptake of these models amongst the applied community is important as they have the potential to articulate how, at the catchment-scale, the impacts of management strategies of land-use change affect sediment dynamics and resulting channel quality. This paper describes and evaluates a new approach which involves river management stakeholders in an iterative and reflexive participatory modelling process. The aim of this approach was to create an environment for knowledge exchange between the stakeholders and the research team in the process of co-constructing a model. This process adopted a multiple case study approach, involving four groups of river catchment stakeholders in the United Kingdom. These stakeholder groups were involved in several stages of the participatory modelling process including: requirements analysis, model design, model development, and model evaluation. Stakeholders have provided input into a number of aspects of the modelling process, such as: data requirements, user interface, modelled processes, model assumptions, model applications, and model outputs. This paper will reflect on this process, in particular: the innovative methods used, data generated, and lessons learnt.

  12. A Featureless Approach to 3D Polyhedral Building Modeling from Aerial Images

    Directory of Open Access Journals (Sweden)

    Karim Hammoudi

    2010-12-01

    Full Text Available This paper presents a model-based approach for reconstructing 3D polyhedral building models from aerial images. The proposed approach exploits some geometric and photometric properties resulting from the perspective projection of planar structures. Data are provided by calibrated aerial images. The novelty of the approach lies in its featurelessness and in its use of direct optimization based on image raw brightness. The proposed framework avoids feature extraction and matching. The 3D polyhedral model is directly estimated by optimizing an objective function that combines an image-based dissimilarity measure and a gradient score over several aerial images. The optimization process is carried out by the Differential Evolution algorithm. The proposed approach is intended to provide more accurate 3D reconstruction than feature-based approaches. Fast 3D model rectification and updating can take advantage of the proposed method. Several results and evaluations of performance from real and synthetic images show the feasibility and robustness of the proposed approach.
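
    The direct, featureless optimization can be mimicked with a toy objective in which a few building parameters generate a synthetic brightness profile that is compared with an "observed" one, and Differential Evolution searches the parameter space; the profile model, bounds and noise level are fabricated stand-ins for the real multi-image dissimilarity measure.

```python
import numpy as np
from scipy.optimize import differential_evolution

x = np.linspace(0.0, 20.0, 200)          # ground coordinate [m]

def roof_profile(params):
    """Toy gable-roof brightness proxy: base level + ridge position/slope."""
    base, ridge_x, slope = params
    return base + np.maximum(0.0, slope * (5.0 - np.abs(x - ridge_x)))

# "Observed" brightness profile generated from hidden true parameters + noise.
rng = np.random.default_rng(1)
observed = roof_profile([6.0, 9.0, 0.8]) + rng.normal(0, 0.05, x.size)

def dissimilarity(params):
    # image-based dissimilarity: mean squared brightness difference
    return np.mean((roof_profile(params) - observed) ** 2)

result = differential_evolution(dissimilarity,
                                bounds=[(0, 20), (0, 20), (0, 2)],
                                seed=1, tol=1e-8)
print("recovered (base level, ridge position, slope):", result.x)
```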

  13. A Quantitative Model-Driven Comparison of Command Approaches in an Adversarial Process Model

    Science.gov (United States)

    2007-06-01

    12th ICCRTS, “Adapting C2 to the 21st Century”: A Quantitative Model-Driven Comparison of Command Approaches in an Adversarial Process Model. Lenahan identified metrics and techniques for adversarial C2 process modeling. We intend to further that work by developing a set of adversarial process ...

  14. Comparison of Joint Modeling Approaches Including Eulerian Sliding Interfaces

    Energy Technology Data Exchange (ETDEWEB)

    Lomov, I; Antoun, T; Vorobiev, O

    2009-12-16

    Accurate representation of discontinuities such as joints and faults is a key ingredient for high fidelity modeling of shock propagation in geologic media. The following study was done to improve the treatment of discontinuities (joints) in the Eulerian hydrocode GEODYN (Lomov and Liu 2005). Lagrangian methods with conforming meshes and explicit inclusion of joints in the geologic model are well suited for such an analysis. Unfortunately, current meshing tools are unable to automatically generate adequate hexahedral meshes for large numbers of irregular polyhedra. Another concern is that joint stiffness in such explicit computations requires significantly reduced time steps, with negative implications for both the efficiency and quality of the numerical solution. An alternative approach is to use non-conforming meshes and embed joint information into regular computational elements. However, once slip displacement on the joints becomes comparable to the zone size, Lagrangian (even non-conforming) meshes could suffer from tangling and decreased time step problems. The use of non-conforming meshes in an Eulerian solver may alleviate these difficulties and provide a viable numerical approach for modeling the effects of faults on the dynamic response of geologic materials. We studied shock propagation in jointed/faulted media using a Lagrangian and two Eulerian approaches. To investigate the accuracy of this joint treatment, the GEODYN calculations have been compared with results from the Lagrangian code GEODYN-L, which uses an explicit treatment of joints via common-plane contact. We explore two approaches to joint treatment in the code, one for joints with finite thickness and the other for tight joints. In all cases the sliding interfaces are tracked explicitly without homogenization or blending the joint and block response into an average response. In general, rock joints will introduce an increase in normal compliance in addition to a reduction in shear strength. In the

  15. Fugacity superposition: a new approach to dynamic multimedia fate modeling.

    Science.gov (United States)

    Hertwich, E G

    2001-08-01

    The fugacities, concentrations, or inventories of pollutants in environmental compartments as determined by multimedia environmental fate models of the Mackay type can be superimposed on each other. This is true for both steady-state (level III) and dynamic (level IV) models. Any problem in multimedia fate models with linear, time-invariant transfer and transformation coefficients can be solved through a superposition of a set of n independent solutions to a set of coupled, homogeneous first-order differential equations, where n is the number of compartments in the model. For initial condition problems in dynamic models, the initial inventories can be separated, e.g. by a compartment. The solution is obtained by adding the single-compartment solutions. For time-varying emissions, a convolution integral is used to superimpose solutions. The advantage of this approach is that the differential equations have to be solved only once. No numeric integration is required. Alternatively, the dynamic model can be simplified to algebraic equations using the Laplace transform. For time-varying emissions, the Laplace transform of the model equations is simply multiplied with the Laplace transform of the emission profile. It is also shown that the time-integrated inventories of the initial conditions problems are the same as the inventories in the steady-state problem. This implies that important properties of pollutants such as potential dose, persistence, and characteristic travel distance can be derived from the steady state.
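
    The superposition argument can be reproduced for a toy two-compartment level IV model: the inventories under a time-varying emission are obtained by convolving the emission profile with the impulse (unit initial-condition) responses, so the coupled equations are solved only once. The transfer and loss coefficients below are arbitrary illustrative values, not properties of any real pollutant.

```python
import numpy as np
from scipy.linalg import expm

# Two-compartment linear fate model dM/dt = K M + E(t); rates are illustrative.
K = np.array([[-0.30,  0.05],    # losses from compartment 0, gain from 1
              [ 0.10, -0.15]])   # transfer 0 -> 1, losses from compartment 1

dt, n = 0.1, 2000
t = np.arange(n) * dt

# Impulse responses: the columns of expm(K t) are the unit initial-condition solutions.
impulse = np.array([expm(K * ti) for ti in t])        # shape (n, 2, 2)

# Time-varying emission into compartment 0 only (arbitrary pulse shape).
E = np.zeros((n, 2))
E[(t > 5) & (t < 25), 0] = 1.0

# Superposition: convolve each impulse response with the emission history.
M = np.zeros((n, 2))
for j in range(2):                                    # emitting compartment
    for i in range(2):                                # receiving compartment
        M[:, i] += np.convolve(E[:, j], impulse[:, i, j])[:n] * dt

print("inventories at the final time:", M[-1])
```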

  16. A novel approach to modeling spacecraft spectral reflectance

    Science.gov (United States)

    Willison, Alexander; Bédard, Donald

    2016-10-01

    Simulated spectrometric observations of unresolved resident space objects are required for the interpretation of quantities measured by optical telescopes. This allows for their characterization as part of regular space surveillance activity. A peer-reviewed spacecraft reflectance model is necessary to help improve the understanding of characterization measurements. With this objective in mind, a novel approach to model spacecraft spectral reflectance as an overall spectral bidirectional reflectance distribution function (sBRDF) is presented. A spacecraft's overall sBRDF is determined using its triangular-faceted computer-aided design (CAD) model and the empirical sBRDF of its homogeneous materials. The CAD model is used to determine the proportional contribution of each homogeneous material to the overall reflectance. Each empirical sBRDF is contained in look-up tables developed from measurements made over a range of illumination and reflection geometries using simple interpolation and extrapolation techniques. A demonstration of the spacecraft reflectance model is provided through simulation of an optical ground truth characterization using the Canadian Advanced Nanospace eXperiment-1 Engineering Model nanosatellite as the subject. Validation of the reflectance model is achieved through a qualitative comparison of simulated and measured quantities.
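
    The facet-weighted combination of material reflectances can be sketched as an area-weighted sum of interpolated look-up tables, one per homogeneous material; the wavelengths, table values and area fractions below are fabricated placeholders, not measurements of the CanX-1 Engineering Model.

```python
import numpy as np

# Hypothetical measured sBRDF tables (wavelength [nm] -> sr^-1) for two
# homogeneous spacecraft materials at one illumination/reflection geometry.
wavelengths = np.array([400.0, 550.0, 700.0, 850.0, 1000.0])
sbrdf_tables = {
    "solar_cell":      np.array([0.02, 0.03, 0.05, 0.08, 0.10]),
    "aluminized_film": np.array([0.20, 0.22, 0.21, 0.19, 0.18]),
}

# Proportional contribution of each material to the visible projected area,
# as would be derived from the triangular-faceted CAD model (assumed numbers).
area_fractions = {"solar_cell": 0.65, "aluminized_film": 0.35}

def overall_sbrdf(wavelength_nm):
    """Area-weighted sum of interpolated material sBRDFs at one wavelength."""
    total = 0.0
    for material, table in sbrdf_tables.items():
        value = np.interp(wavelength_nm, wavelengths, table)  # look-up table interpolation
        total += area_fractions[material] * value
    return total

for wl in (500.0, 800.0):
    print(f"overall sBRDF at {wl:.0f} nm: {overall_sbrdf(wl):.3f} sr^-1")
```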

  17. Cancer systems biology and modeling: microscopic scale and multiscale approaches.

    Science.gov (United States)

    Masoudi-Nejad, Ali; Bidkhori, Gholamreza; Hosseini Ashtiani, Saman; Najafi, Ali; Bozorgmehr, Joseph H; Wang, Edwin

    2015-02-01

    Cancer has become known as a complex and systematic disease on macroscopic, mesoscopic and microscopic scales. Systems biology employs state-of-the-art computational theories and high-throughput experimental data to model and simulate complex biological processes such as cancer, which involve genetic and epigenetic, as well as intracellular and extracellular, complex interaction networks. In this paper, different systems biology modeling techniques such as systems of differential equations, stochastic methods, Boolean networks, Petri nets, cellular automata methods and agent-based systems are concisely discussed. We have compared the mentioned formalisms and tried to address the span of applicability they can bear on emerging cancer modeling and simulation approaches. Different scales of cancer modeling, namely microscopic, mesoscopic and macroscopic, are explained, followed by an illustration of angiogenesis at the microscopic scale of cancer modeling. Then, the modeling of cancer cell proliferation and survival is examined on a microscopic scale, and the modeling of multiscale tumor growth is explained along with its advantages.
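
    As an illustration of one of the formalisms listed above, a Boolean network can be reduced to a synchronous update rule over a handful of nodes whose attractors are found by exhaustive iteration; the three-node circuit below is a toy invention, not a published cancer model.

```python
from itertools import product

# Toy three-node Boolean network (synchronous update); rules are illustrative.
def update(state):
    growth_signal, checkpoint, proliferation = state
    return (
        growth_signal,                       # treated as a fixed external input
        not proliferation,                   # checkpoint is active when cells are quiescent
        growth_signal and not checkpoint,    # proliferate if signalled and unchecked
    )

# Enumerate attractors by iterating every initial state until a repeat occurs.
for initial in product([False, True], repeat=3):
    seen, state = [], initial
    while state not in seen:
        seen.append(state)
        state = update(state)
    print(f"start {initial} -> cycle re-entered at {state}")
```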

  18. Modeling the crop transpiration using an optimality-based approach

    Institute of Scientific and Technical Information of China (English)

    Stanislaus J. Schymanski; Murugesu Sivapalan

    2008-01-01

    Evapotranspiration constitutes more than 80% of the long-term water balance in Northern China. In this area, crop transpiration due to large areas of agriculture and irrigation is responsible for the majority of evapotranspiration. A model for crop transpiration is therefore essential for estimating the agricultural water consumption and understanding its feedback to the environment. However, most existing hydrological models usually calculate transpiration by relying on parameter calibration against local observations, and do not take into account crop feedback to the ambient environment. This study presents an optimality-based ecohydrology model that couples an ecological hypothesis, the photosynthetic process, stomatal movement, water balance, root water uptake and crop senescence, with the aim of predicting crop characteristics, CO2 assimilation and water balance based only on given meteorological data. Field experiments were conducted in the Weishan Irrigation District of Northern China to evaluate performance of the model. Agreement between simulation and measurement was achieved for CO2 assimilation, evapotranspiration and soil moisture content. The vegetation optimality was proven valid for crops and the model was applicable for both C3 and C4 plants. Due to the simple scheme of the optimality-based approach as well as its capability for modeling dynamic interactions between crops and the water cycle without prior vegetation information, this methodology is potentially useful to couple with the distributed hydrological model for application at the watershed scale.
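
    The vegetation-optimality idea can be caricatured as choosing the stomatal conductance that maximizes carbon gain minus a water cost; the saturating assimilation curve, the diffusion expression and every parameter value in the sketch below are assumptions for illustration, not the model's actual formulation.

```python
import numpy as np

# Caricature of vegetation optimality: pick the stomatal conductance g that
# maximizes net benefit = CO2 assimilation(g) - lambda * transpiration(g).
# Functional forms and parameter values are illustrative assumptions.
g = np.linspace(0.01, 0.5, 500)          # stomatal conductance [mol m-2 s-1]

a_max, k = 25.0, 0.08                    # assimilation saturation parameters
assimilation = a_max * g / (g + k)       # saturating response [umol m-2 s-1]

vpd, water_cost = 1.5, 800.0             # vapour pressure deficit [kPa], cost weight
transpiration = g * vpd / 101.3          # simplified diffusion [mol m-2 s-1]

net_benefit = assimilation - water_cost * transpiration
g_opt = g[np.argmax(net_benefit)]
print(f"optimal conductance: {g_opt:.3f} mol m-2 s-1")
```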

  19. Inverse modeling approach to allogenic karst system characterization.

    Science.gov (United States)

    Dörfliger, N; Fleury, P; Ladouche, B

    2009-01-01

    Allogenic karst systems function in a particular way that is influenced by the type of water infiltrating through river water losses, by karstification processes, and by water quality. Management of this system requires a good knowledge of its structure and functioning, for which a new methodology based on an inverse modeling approach appears to be well suited. This approach requires both spring and river inflow discharge measurements and a continuous record of chemical parameters in the river and at the spring. The inverse model calculates unit hydrographs and the impulse responses of fluxes from rainfall hydraulic head at the spring or rainfall flux data, the purpose of which is hydrograph separation. Hydrograph reconstruction is done using rainfall and river inflow data as model input and enables definition at each time step of the ratio of each component. Using chemical data, representing event and pre-event water, as input, it is possible to determine the origin of spring water (either fast flow through the epikarstic zone or slow flow through the saturated zone). This study made it possible to improve a conceptual model of allogenic karst system functioning. The methodology is used to study the Bas-Agly and the Cent Font karst systems, two allogenic karst systems in Southern France.
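
    The chemically based hydrograph separation underlying this kind of analysis can be sketched with the classical two-component mixing equation, splitting spring discharge into fast event water and slow pre-event water using a conservative tracer; the discharge series, tracer concentrations and end-member signatures below are fabricated.

```python
import numpy as np

# Fabricated spring discharge [L/s] and tracer concentration [mg/L] series.
discharge = np.array([120., 150., 420., 860., 640., 380., 220., 160.])
tracer    = np.array([6.0,  5.8,  4.1,  3.0,  3.6,  4.5,  5.3,  5.7])

c_pre_event = 6.0   # tracer signature of slow, saturated-zone water (assumed)
c_event     = 1.5   # tracer signature of fast event water from river losses (assumed)

# Two-component mixing: Q_event / Q = (C_pre - C) / (C_pre - C_event)
event_fraction = np.clip((c_pre_event - tracer) / (c_pre_event - c_event), 0, 1)
q_event = event_fraction * discharge
q_pre   = discharge - q_event

for q, qe, qp in zip(discharge, q_event, q_pre):
    print(f"Q={q:6.0f} L/s  fast component={qe:6.0f}  slow component={qp:6.0f}")
```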

  20. A secured e-tendering modeling using misuse case approach

    Science.gov (United States)

    Mohd, Haslina; Robie, Muhammad Afdhal Muhammad; Baharom, Fauziah; Darus, Norida Muhd; Saip, Mohamed Ali; Yasin, Azman

    2016-08-01

    Major risk factors relating to electronic transactions may lead to destructive impacts on trust and transparency in the process of tendering. Currently, electronic tendering (e-tendering) systems still remain uncertain on issues relating to legal and security compliance and, most importantly, have an unclear security framework. In particular, the available systems are lacking in addressing integrity, confidentiality, authentication and non-repudiation in e-tendering requirements. Thus, one of the challenges in developing an e-tendering system is to ensure that the system requirements include the functions for a secured and trusted environment. Therefore, this paper aims to model a secured e-tendering system using the misuse case approach. The modeling process begins with identifying the e-tendering process, which is based on the Australian Standard Code of Tendering (AS 4120-1994). It is followed by identifying security threats and their countermeasures. Then, the e-tendering process was modelled using the misuse case approach. The model can contribute to e-tendering developers and also to other researchers or experts in the e-tendering domain.