WorldWideScience

Sample records for relevant model complexes

  1. A Peep into the Uncertainty-Complexity-Relevance Modeling Trilemma through Global Sensitivity and Uncertainty Analysis

    Science.gov (United States)

    Munoz-Carpena, R.; Muller, S. J.; Chu, M.; Kiker, G. A.; Perz, S. G.

    2014-12-01

    The model complexity resulting from the need to integrate environmental system components cannot be overstated. In particular, additional emphasis is urgently needed on rational approaches to guide decision making through uncertainties surrounding the integrated system across decision-relevant scales. However, in spite of the difficulties that the consideration of modeling uncertainty represents for the decision process, it should not be avoided, or the value and science behind the models will be undermined. These two issues, i.e. the need for coupled models that can answer the pertinent questions and the need for models that do so with sufficient certainty, are the key indicators of a model's relevance. Model relevance is inextricably linked with model complexity. Although model complexity has advanced greatly in recent years, there has been little work to rigorously characterize the threshold of relevance in integrated and complex models. Formally assessing the relevance of the model in the face of increasing complexity would be valuable because there is growing unease among developers and users of complex models about the cumulative effects of various sources of uncertainty on model outputs. In particular, this issue has prompted doubt over whether the considerable effort going into further elaborating complex models will in fact yield the expected payback. New approaches have been proposed recently to evaluate the uncertainty-complexity-relevance modeling trilemma (Muller, Muñoz-Carpena and Kiker, 2011) by incorporating state-of-the-art global sensitivity and uncertainty analysis (GSA/UA) in every step of the model development, so as to quantify not only the uncertainty introduced by the addition of new environmental components, but also the effect that these new components have on existing components (interactions, non-linear responses).
Outputs from the analysis can also be used to quantify system resilience (stability, alternative states, thresholds or tipping
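    A minimal sketch of the variance-based GSA referred to above: first-order Sobol indices estimated with the classic two-matrix Monte Carlo scheme, applied here to the standard Ishigami test function. The function names, sample sizes and the (-π, π) sampling range are illustrative assumptions, not details from the record.

```python
import numpy as np

def sobol_first_order(model, n_params, n_samples=8192, seed=0):
    """First-order Sobol indices S_i = V_i / V(Y) via the two-matrix
    (Saltelli-style) Monte Carlo estimator."""
    rng = np.random.default_rng(seed)
    A = rng.uniform(-np.pi, np.pi, (n_samples, n_params))
    B = rng.uniform(-np.pi, np.pi, (n_samples, n_params))
    yA, yB = model(A), model(B)
    var = np.var(np.concatenate([yA, yB]))
    S = np.empty(n_params)
    for i in range(n_params):
        ABi = A.copy()
        ABi[:, i] = B[:, i]              # vary only input i between the runs
        S[i] = np.mean(yB * (model(ABi) - yA)) / var
    return S

def ishigami(x, a=7.0, b=0.1):
    """Classic GSA benchmark with known analytic sensitivities."""
    return (np.sin(x[:, 0]) + a * np.sin(x[:, 1]) ** 2
            + b * x[:, 2] ** 4 * np.sin(x[:, 0]))

S = sobol_first_order(ishigami, 3)
```

    For the Ishigami function the analytic values are S1 ≈ 0.31, S2 ≈ 0.44 and S3 = 0; the Monte Carlo estimates converge on these as the sample size grows.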

  2. Developing predictive systems models to address complexity and relevance for ecological risk assessment.

    Science.gov (United States)

    Forbes, Valery E; Calow, Peter

    2013-07-01

    Ecological risk assessments (ERAs) are not used as well as they could be in risk management. Part of the problem is that they often lack ecological relevance; that is, they fail to grasp necessary ecological complexities. Adding realism and complexity can be difficult and costly. We argue that predictive systems models (PSMs) can provide a way of capturing complexity and ecological relevance cost-effectively. However, addressing complexity and ecological relevance is only part of the problem. Ecological risk assessments often fail to meet the needs of risk managers by not providing assessments that relate to protection goals and by expressing risk in ratios that cannot be weighed against the costs of interventions. Here again, PSMs can be designed to provide outputs in terms of value-relevant effects that are modulated against exposure and that can provide a better basis for decision making than arbitrary ratios or threshold values. Recent developments in modeling, and its potential for implementation by risk assessors and risk managers, are beginning to demonstrate how PSMs can be practically applied in risk assessment and the advantages that doing so could have. Copyright © 2013 SETAC.

  3. Parsimonious relevance models

    NARCIS (Netherlands)

    Meij, E.; Weerkamp, W.; Balog, K.; de Rijke, M.; Myang, S.-H.; Oard, D.W.; Sebastiani, F.; Chua, T.-S.; Leong, M.-K.

    2008-01-01

    We describe a method for applying parsimonious language models to re-estimate the term probabilities assigned by relevance models. We apply our method to six topic sets from test collections in five different genres. Our parsimonious relevance models (i) improve retrieval effectiveness in terms of
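    The re-estimation step can be sketched as an EM loop that strips away probability mass already explained by a background corpus model, in the spirit of parsimonious language models; the function below, the mixing weight lam and the pruning threshold are illustrative assumptions, not the authors' implementation.

```python
def parsimonize(term_counts, background, lam=0.5, iters=50, eps=1e-6):
    """EM re-estimation of a language model that concentrates probability
    on terms poorly explained by a background corpus model."""
    total = sum(term_counts.values())
    p = {w: c / total for w, c in term_counts.items()}  # initial MLE
    for _ in range(iters):
        # E-step: expected count of w being drawn from the specific model
        e = {w: term_counts[w] * lam * p[w]
                / (lam * p[w] + (1 - lam) * background.get(w, 1e-9))
             for w in term_counts}
        z = sum(e.values())          # M-step: renormalize expected counts
        p = {w: ew / z for w, ew in e.items()}
    return {w: pw for w, pw in p.items() if pw > eps}  # prune tiny terms
```

    With lam = 0.5 the model shifts mass away from corpus-frequent words (e.g. "the") toward terms that are distinctive for the feedback documents.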

  4. Structural zinc(II) thiolate complexes relevant to the modeling of Ada repair protein: Application toward alkylation reactions

    Directory of Open Access Journals (Sweden)

    Mohamed M. Ibrahim

    2014-11-01

    Full Text Available The Zn(II)-bound perchlorate complex [TtZn–OClO3] (1) (Ttxylyl = hydrotris[N-xylyl-thioimidazolyl]borate) was used for the synthesis of the zinc(II)-bound ethanethiolate complex [TtZn–SCH2CH3] (2) and its hydrogen-bond-containing analog [TtZn–SCH2CH2–NH(CO)OC(CH3)3] (3). These thiolate complexes were examined as structural models for the active site of the Ada repair protein in methylation reactions. The Zn[S3O] coordination sphere in complex 1 comprises three thione donors from the ligand Ttxylyl and one oxygen donor from the perchlorate coligand in an ideally tetrahedral arrangement around the zinc center. The average Zn(1)–S(thione) bond length is 2.344 Å, and the Zn(1)–O(1) bond length is 1.917 Å.

  5. Other relevant numerical modelling papers

    International Nuclear Information System (INIS)

    Chartier, M.

    1989-01-01

    Ocean modelling is a rapidly evolving science and a large number of results have been published. Several categories of papers are of particular interest for this review: the papers published by the international atomic institutions, such as the NEA (for the CRESP or Subseabed Programs), the IAEA (for example the Safety Series, the Technical Report Series or the TECDOC), and the ICRP, and the papers concerned with more fundamental research, which are published in the specific scientific literature. This paper aims to list some of the most relevant publications for CRESP purposes. It is by no means exhaustive, but it is informative about the incontestable progress recently achieved in the field. One should note that some of these papers are so recent that their final version has not yet been published.

  6. Controlling complexity: the clinical relevance of mouse complex genetics

    Czech Academy of Sciences Publication Activity Database

    Forejt, Jiří

    2013-01-01

    Vol. 21, No. 11 (2013), pp. 1191-1196. ISSN 1018-4813. Institutional support: RVO:68378050. Keywords: Mouse model * Forward genetics * Review. Subject RIV: EB - Genetics; Molecular Biology. OECD field: Genetics and heredity (medical genetics to be 3). Impact factor: 4.225, year: 2013

  7. Nuclear models relevant to evaluation

    International Nuclear Information System (INIS)

    Arthur, E.D.; Chadwick, M.B.; Hale, G.M.; Young, P.G.

    1991-01-01

    The widespread use of nuclear models continues in the creation of data evaluations. The reasons include the extension of data evaluations to higher energies, the creation of data libraries for isotopic components of natural materials, and the production of evaluations for radioactive target species. In these cases, experimental data are often sparse or nonexistent. As this trend continues, the nuclear models employed in evaluation work move towards more microscopically-based theoretical methods, prompted in part by the availability of increasingly powerful computational resources. Advances in nuclear models applicable to evaluation will be reviewed. These include advances in optical model theory, microscopic and phenomenological state and level density theory, unified models that consistently describe both equilibrium and nonequilibrium reaction mechanisms, and improved methodologies for the calculation of prompt radiation from fission. 84 refs., 8 figs

  8. Structural Model Error and Decision Relevancy

    Science.gov (United States)

    Goldsby, M.; Lusk, G.

    2017-12-01

    The extent to which climate models can underwrite specific climate policies has long been a contentious issue. Skeptics frequently deny that climate models are trustworthy in an attempt to undermine climate action, whereas policy makers often desire information that exceeds the capabilities of extant models. While not skeptics, a group of mathematicians and philosophers [Frigg et al. (2014)] recently argued that even tiny differences between the structure of a complex dynamical model and its target system can lead to dramatic predictive errors, possibly resulting in disastrous consequences when policy decisions are based upon those predictions. They call this result the Hawkmoth effect (HME), and seemingly use it to rebuke right-wing proposals to forgo mitigation in favor of adaptation. However, a vigorous debate has emerged between Frigg et al. on one side and another philosopher-mathematician pair [Winsberg and Goodwin (2016)] on the other. On one hand, Frigg et al. argue that their result shifts the burden to climate scientists to demonstrate that their models do not fall prey to the HME. On the other hand, Winsberg and Goodwin suggest that arguments like those asserted by Frigg et al. can be, if taken seriously, "dangerous": they fail to consider the variety of purposes for which models can be used, and thus too hastily undermine large swaths of climate science. They put the burden back on Frigg et al. to show their result has any effect on climate science. This paper seeks to attenuate this debate by establishing an irenic middle position; we find that there is more agreement between the sides than first appears. We distinguish a 'decision standard' from a 'burden of proof', which helps clarify the contributions to the debate from both sides. In making this distinction, we argue that scientists bear the burden of assessing the consequences of the HME, but that the standard Frigg et al. adopt for decision relevancy is too strict.

  9. Passage relevance models for genomics search

    Directory of Open Access Journals (Sweden)

    Ophir Frieder

    2009-03-01

    Full Text Available We present a passage relevance model for integrating syntactic and semantic evidence of biomedical concepts and topics using a probabilistic graphical model. Component models of topics, concepts, terms, and document are represented as potential functions within a Markov Random Field. The probability of a passage being relevant to a biologist's information need is represented as the joint distribution across all potential functions. Relevance model feedback of top ranked passages is used to improve distributional estimates of query concepts and topics in context, and a dimensional indexing strategy is used for efficient aggregation of concept and term statistics. By integrating multiple sources of evidence including dependencies between topics, concepts, and terms, we seek to improve genomics literature passage retrieval precision. Using this model, we are able to demonstrate statistically significant improvements in retrieval precision using a large genomics literature corpus.
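    A stripped-down sketch of the joint-scoring idea: if each clique potential is reduced to a weighted feature value, the passage's relevance score under the Markov Random Field is the exponentiated weighted sum. The feature names and weights below are invented for illustration and are not the authors' component models.

```python
import math

def passage_score(potentials, weights):
    """Unnormalized joint relevance of a passage under a log-linear MRF:
    P(relevant | passage) proportional to exp(sum_c lambda_c * f_c),
    one potential f_c per clique of term/concept/topic/document evidence."""
    return math.exp(sum(weights[c] * f for c, f in potentials.items()))

# rank two hypothetical passages by their joint score
weights = {"term": 0.5, "concept": 0.3, "topic": 0.2}
on_topic = passage_score({"term": 2.0, "concept": 1.5, "topic": 1.0}, weights)
off_topic = passage_score({"term": 0.4, "concept": 0.1, "topic": 0.2}, weights)
```

    Ranking by this unnormalized score is equivalent to ranking by the joint distribution, since the partition function is constant across passages for a fixed query.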

  10. The complexity of DNA damage: relevance to biological consequences

    International Nuclear Information System (INIS)

    Ward, J.F.

    1994-01-01

    Ionizing radiation causes both singly and multiply damaged sites in DNA when the range of radical migration is limited by the presence of hydroxyl radical scavengers (e.g. within cells). Multiply damaged sites are considered to be more biologically relevant because of the challenges they present to cellular repair mechanisms. These sites occur in the form of DNA double-strand breaks (dsb) but also as other multiple damages that can be converted to dsb during attempted repair. The presence of a dsb can lead to loss of base sequence information and/or can permit the two ends of a break to separate and rejoin with the wrong partner. (Multiply damaged sites may also be the biologically relevant type of damage caused by other agents, such as UVA, B and/or C light, and some antitumour antibiotics). The quantitative data available from radiation studies of DNA are shown to support the proposed mechanisms for the production of complex damage in cellular DNA, i.e. via scavengable and non-scavengable mechanisms. The yields of complex damages can in turn be used to support the conclusion that cellular mutations are a consequence of the presence of these damages within a gene. (Author)

  11. THE COMPLEX OF EMOTIONAL EXPERIENCES, RELEVANT MANIFESTATIONS OF INSPIRATION

    Directory of Open Access Journals (Sweden)

    Pavel A. Starikov

    2015-01-01

    Full Text Available The aim of the study is to investigate the structure of emotional experiences relevant to manifestations of inspiration in the creative activities of students. Methods. Methods of mathematical statistics (correlation analysis, factor analysis, multidimensional scaling) are applied. Results and scientific novelty. The use of factor analysis and multidimensional scaling revealed a consistent set of positive experiences of the students relevant to the experience of inspiration in creative activities. In accordance with the study results, this complex includes the «operational» flow experiences described by M. Csikszentmihalyi («a feeling of full involvement and dissolution in what you do», «a feeling of concentration, perfect clarity of purpose, complete control and total immersion in a job that does not require special effort») and experiences of a «spiritual» nature, closer to the peak experiences of A. Maslow («a feeling of love for all that exists, all life»; «a deep sense of self-importance, an inner feeling of self-approval»; «a feeling of unity with the whole world»; «an acute perception of the beauty of the natural world, a "beautiful instant"»; «a feeling of lightness, of flowing»). The interrelation of the degree of expression of this complex of experiences with the experience of inspiration is considered. Practical significance. The results of the study show the structure of emotional experiences relevant to manifestations of inspiration. The research materials can be useful both to psychologists and to experts in the field of the pedagogy of creative activity.

  12. Towards increased policy relevance in energy modeling

    Energy Technology Data Exchange (ETDEWEB)

    Worrell, Ernst; Ramesohl, Stephan; Boyd, Gale

    2003-07-29

    Historically, most energy models were reasonably equipped to assess the impact of a subsidy or change in taxation, but were often insufficient to assess the impact of more innovative policy instruments. We evaluate the models used to assess future energy use, focusing on industrial energy use. We explore approaches to engineering-economic analysis that could help improve the realism and policy relevance of engineering-economic modeling frameworks. We also explore solutions to strengthen the policy usefulness of engineering-economic analysis that can be built from a framework of multi-disciplinary cooperation. We focus on the so-called "engineering-economic" (or "bottom-up") models, as they include the amount of detail that is commonly needed to model policy scenarios. We identify research priorities for the modeling framework, technology representation in models, policy evaluation and modeling of decision-making behavior.

  13. Epidemic modeling in complex realities.

    Science.gov (United States)

    Colizza, Vittoria; Barthélemy, Marc; Barrat, Alain; Vespignani, Alessandro

    2007-04-01

    In our global world, the increasing complexity of social relations and transport infrastructures are key factors in the spread of epidemics. In recent years, the increasing availability of computer power has made it possible both to obtain reliable data that quantify the complexity of the networks on which epidemics may propagate and to envision computational tools able to tackle the analysis of such propagation phenomena. These advances have exposed the limits of homogeneous assumptions and simple spatial diffusion approaches, and have stimulated the inclusion of complex features and heterogeneities relevant to the description of epidemic diffusion. In this paper, we review recent progress that integrates complex systems and network analysis with epidemic modelling, and focus on the impact of the various complex features of real systems on the dynamics of epidemic spreading.
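    The move beyond homogeneous mixing can be made concrete with a toy simulation: a discrete-time stochastic SIR process run directly on a contact network, so that degree heterogeneity shapes the outbreak. The parameter names and the simple per-edge transmission rule are illustrative assumptions, not the authors' models.

```python
import random

def sir_on_network(adj, beta, gamma, patient_zero=0, seed=1, max_steps=1000):
    """Discrete-time stochastic SIR on an arbitrary contact network
    given as an adjacency list {node: [neighbors]}."""
    rng = random.Random(seed)
    state = {n: "S" for n in adj}
    state[patient_zero] = "I"
    for _ in range(max_steps):
        infected = [n for n, s in state.items() if s == "I"]
        if not infected:
            break                     # epidemic has burned out
        for n in infected:
            for m in adj[n]:
                if state[m] == "S" and rng.random() < beta:
                    state[m] = "I"    # transmission along an edge
            if rng.random() < gamma:
                state[n] = "R"        # recovery
    return state
```

    On a star network with beta = gamma = 1 the hub infects every leaf in one step and the epidemic burns out, illustrating how a node's degree, not just an average contact rate, drives the outcome.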

  14. Modeling Complex Time Limits

    Directory of Open Access Journals (Sweden)

    Oleg Svatos

    2013-01-01

    Full Text Available In this paper we analyze the complexity of the time limits found especially in regulated processes of public administration. First we review the most popular process modeling languages. An example scenario based on current Czech legislation is defined and then captured in the discussed process modeling languages. The analysis shows that contemporary process modeling languages support the capture of time limits only partially. This causes trouble for analysts and unnecessary complexity in the models. Given these unsatisfactory results, we analyze the complexity of time limits in greater detail and outline the lifecycles of a time limit using the multiple dynamic generalizations pattern. As an alternative to the popular process modeling languages, we present the PSD process modeling language, which supports the defined lifecycles of a time limit natively and therefore keeps the models simple and easy to understand.
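    One way to make the lifecycle of a time limit concrete is a small state machine; the states, events and transitions below are a hypothetical illustration, not the lifecycles the paper defines via the multiple dynamic generalizations pattern.

```python
from enum import Enum, auto

class TimeLimitState(Enum):
    """Hypothetical lifecycle states of a statutory time limit."""
    SCHEDULED = auto()   # limit defined but clock not yet running
    RUNNING = auto()     # clock is ticking
    SUSPENDED = auto()   # clock paused (e.g. awaiting a submission)
    EXPIRED = auto()     # deadline passed without completion
    MET = auto()         # obligation fulfilled in time

# illustrative transition table: (state, event) -> next state
TRANSITIONS = {
    (TimeLimitState.SCHEDULED, "start"): TimeLimitState.RUNNING,
    (TimeLimitState.RUNNING, "suspend"): TimeLimitState.SUSPENDED,
    (TimeLimitState.SUSPENDED, "resume"): TimeLimitState.RUNNING,
    (TimeLimitState.RUNNING, "deadline"): TimeLimitState.EXPIRED,
    (TimeLimitState.RUNNING, "complete"): TimeLimitState.MET,
}

def step(state, event):
    """Apply one event; events illegal in the current state are ignored."""
    return TRANSITIONS.get((state, event), state)
```

    A modeling language that supports such lifecycles natively can represent suspension and resumption of a deadline without auxiliary gateways and timers.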

  15. Complex matrix model duality

    International Nuclear Information System (INIS)

    Brown, T.W.

    2010-11-01

    The same complex matrix model calculates both tachyon scattering for the c=1 non-critical string at the self-dual radius and certain correlation functions of half-BPS operators in N=4 super- Yang-Mills. It is dual to another complex matrix model where the couplings of the first model are encoded in the Kontsevich-like variables of the second. The duality between the theories is mirrored by the duality of their Feynman diagrams. Analogously to the Hermitian Kontsevich- Penner model, the correlation functions of the second model can be written as sums over discrete points in subspaces of the moduli space of punctured Riemann surfaces. (orig.)

  17. Mathematical Properties Relevant to Geomagnetic Field Modeling

    DEFF Research Database (Denmark)

    Sabaka, Terence J.; Hulot, Gauthier; Olsen, Nils

    2010-01-01

    Geomagnetic field modeling consists in converting large numbers of magnetic observations into a linear combination of elementary mathematical functions that best describes those observations. The set of numerical coefficients defining this linear combination is then what one refers … be directly measured. In this chapter, the mathematical foundation of global (as opposed to regional) geomagnetic field modeling is reviewed, and the spatial modeling of the field in spherical coordinates is the focus. Time can be dealt with as an independent variable and is not explicitly considered … The relevant elementary mathematical functions are introduced, their properties are reviewed, and how they can be used to describe the magnetic field in a source-free (such as the Earth’s neutral atmosphere) or source-dense (such as the ionosphere) environment is explained. Completeness and uniqueness …

  18. Mathematical Properties Relevant to Geomagnetic Field Modeling

    DEFF Research Database (Denmark)

    Sabaka, Terence J.; Hulot, Gauthier; Olsen, Nils

    2014-01-01

    Geomagnetic field modeling consists in converting large numbers of magnetic observations into a linear combination of elementary mathematical functions that best describes those observations. The set of numerical coefficients defining this linear combination is then what one refers … be directly measured. In this chapter, the mathematical foundation of global (as opposed to regional) geomagnetic field modeling is reviewed, and the spatial modeling of the field in spherical coordinates is the focus. Time can be dealt with as an independent variable and is not explicitly considered … The relevant elementary mathematical functions are introduced, their properties are reviewed, and how they can be used to describe the magnetic field in a source-free (such as the Earth’s neutral atmosphere) or source-dense (such as the ionosphere) environment is explained. Completeness and uniqueness …

  19. Simulation in Complex Modelling

    DEFF Research Database (Denmark)

    Nicholas, Paul; Ramsgaard Thomsen, Mette; Tamke, Martin

    2017-01-01

    This paper will discuss the role of simulation in extended architectural design modelling. As a framing paper, the aim is to present and discuss the role of integrated design simulation and feedback between design and simulation in a series of projects under the Complex Modelling framework. Complex … performance, engage with high degrees of interdependency and allow the emergence of design agency and feedback between the multiple scales of architectural construction. This paper presents examples of integrated design simulation from a series of projects including Lace Wall, A Bridge Too Far and Inflated Restraint, developed for the research exhibition Complex Modelling, Meldahls Smedie Gallery, Copenhagen, in 2016. Where the direct project aims and outcomes have been reported elsewhere, the aim of this paper is to discuss overarching strategies for working with design-integrated simulation.

  20. Modeling Complex Systems

    CERN Document Server

    Boccara, Nino

    2010-01-01

    Modeling Complex Systems, 2nd Edition, explores the process of modeling complex systems, providing examples from such diverse fields as ecology, epidemiology, sociology, seismology, and economics. It illustrates how models of complex systems are built and provides indispensable mathematical tools for studying their dynamics. This vital introductory text is useful for advanced undergraduate students in various scientific disciplines, and serves as an important reference book for graduate students and young researchers. This enhanced second edition includes: recent research results and bibliographic references; extra footnotes which provide biographical information on cited scientists who have made significant contributions to the field; new and improved worked-out examples to aid a student’s comprehension of the content; and exercises to challenge the reader and complement the material. Nino Boccara is also the author of Essentials of Mathematica: With Applications to Mathematics and Physics (Springer, 2007).

  1. Modeling Complex Systems

    International Nuclear Information System (INIS)

    Schreckenberg, M

    2004-01-01

    This book by Nino Boccara presents a compilation of model systems commonly termed as 'complex'. It starts with a definition of the systems under consideration and how to build up a model to describe the complex dynamics. The subsequent chapters are devoted to various categories of mean-field type models (differential and recurrence equations, chaos) and of agent-based models (cellular automata, networks and power-law distributions). Each chapter is supplemented by a number of exercises and their solutions. The table of contents looks a little arbitrary, but the author took the most prominent model systems investigated over the years (and up until now there has been no unified theory covering the various aspects of complex dynamics). The model systems are explained by looking at a number of applications in various fields. The book is written as a textbook for interested students as well as serving as a comprehensive reference for experts. It is an ideal source for topics to be presented in a lecture on dynamics of complex systems. This is the first book on this 'wide' topic and I have long awaited such a book (in fact I planned to write it myself but this is much better than I could ever have written it!). Only section 6 on cellular automata is a little too limited to the author's point of view, and one would have expected more about the famous Domany-Kinzel model (and more accurate citations!). In my opinion this is one of the best textbooks published during the last decade and even experts can learn a lot from it. Hopefully there will be an updated edition after, say, five years, since this field is growing so quickly. The price is too high for students but this, unfortunately, is the normal case today. Nevertheless I think it will be a great success! (book review)

  2. Modeling complexes of modeled proteins.

    Science.gov (United States)

    Anishchenko, Ivan; Kundrotas, Petras J; Vakser, Ilya A

    2017-03-01

    Structural characterization of proteins is essential for understanding life processes at the molecular level. However, only a fraction of known proteins have experimentally determined structures. This fraction is even smaller for protein-protein complexes. Thus, structural modeling of protein-protein interactions (docking) primarily has to rely on modeled structures of the individual proteins, which typically are less accurate than the experimentally determined ones. Such "double" modeling is the Grand Challenge of structural reconstruction of the interactome. Yet it remains so far largely untested in a systematic way. We present a comprehensive validation of template-based and free docking on a set of 165 complexes, where each protein model has six levels of structural accuracy, from 1 to 6 Å Cα RMSD. Many template-based docking predictions fall into the acceptable quality category, according to the CAPRI criteria, even for highly inaccurate proteins (5-6 Å RMSD), although the number of such models (and, consequently, the docking success rate) drops significantly for models with RMSD > 4 Å. The results show that the existing docking methodologies can be successfully applied to protein models with a broad range of structural accuracy, and that template-based docking is much less sensitive to inaccuracies of protein models than free docking. Proteins 2017; 85:470-478. © 2016 Wiley Periodicals, Inc.
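    The Cα RMSD used above as the accuracy scale is conventionally computed after optimal rigid-body superposition; a compact sketch of the standard Kabsch algorithm (a textbook method, not the authors' code):

```python
import numpy as np

def kabsch_rmsd(P, Q):
    """RMSD between two N x 3 coordinate sets after optimal superposition
    (Kabsch algorithm): center both, find the best rotation via SVD."""
    P = P - P.mean(axis=0)
    Q = Q - Q.mean(axis=0)
    U, S, Vt = np.linalg.svd(P.T @ Q)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T  # optimal proper rotation
    P_rot = P @ R.T
    return float(np.sqrt(((P_rot - Q) ** 2).sum() / len(P)))
```

    Centering both coordinate sets and correcting the determinant's sign ensures the reported value is the minimum RMSD over all rigid-body fits, so a rotated and translated copy of a structure scores (numerically) zero.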

  3. Prototypes and matrix relevance learning in complex fourier space

    NARCIS (Netherlands)

    Straat, M.; Kaden, M.; Gay, M.; Villmann, T.; Lampe, Alexander; Seiffert, U.; Biehl, M.; Melchert, F.

    2017-01-01

    In this contribution, we consider the classification of time-series and similar functional data which can be represented in complex Fourier coefficient space. We apply versions of Learning Vector Quantization (LVQ) which are suitable for complex-valued data, based on the so-called Wirtinger
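    The simplest complex-valued variant can be sketched as LVQ1 with prototypes living directly in complex Fourier-coefficient space and the squared distance |x − w|²; this is an illustrative reduction, not the Wirtinger-calculus matrix-relevance scheme of the paper.

```python
import numpy as np

def lvq1_complex(X, y, n_epochs=30, lr=0.05, seed=0):
    """LVQ1 with complex-valued prototypes: the usual winner update
    w += lr * (x - w), sign flipped on mislabel, applied in complex space."""
    rng = np.random.default_rng(seed)
    classes = np.unique(y)
    protos = np.array([X[y == c][0].astype(complex) for c in classes])
    for _ in range(n_epochs):
        for i in rng.permutation(len(X)):
            d = np.sum(np.abs(X[i] - protos) ** 2, axis=1)  # |x - w|^2
            k = int(np.argmin(d))                           # winning prototype
            sign = 1.0 if classes[k] == y[i] else -1.0
            protos[k] += sign * lr * (X[i] - protos[k])
    return protos, classes

def predict(x, protos, classes):
    """Nearest-prototype classification under squared complex distance."""
    return classes[int(np.argmin(np.sum(np.abs(x - protos) ** 2, axis=1)))]
```

    Because complex arithmetic supports the same convex update, the only change from real-valued LVQ1 is the modulus in the distance; relevance-matrix versions additionally learn a metric via Wirtinger derivatives.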

  4. Evolution of disorder in Mediator complex and its functional relevance.

    Science.gov (United States)

    Nagulapalli, Malini; Maji, Sourobh; Dwivedi, Nidhi; Dahiya, Pradeep; Thakur, Jitendra K

    2016-02-29

    Mediator, an important component of the eukaryotic transcriptional machinery, is a huge multisubunit complex. Though the complex is known to be conserved across all the eukaryotic kingdoms, the evolutionary topology of its subunits has never been studied. In this study, we profiled disorder in the Mediator subunits of 146 eukaryotes belonging to three kingdoms, viz. metazoans, plants and fungi, and attempted to find a correlation between the evolution of the Mediator complex and its disorder. Our analysis suggests that disorder in the Mediator complex has played a crucial role in the evolutionary diversification of the complexity of eukaryotic organisms. Conserved intrinsically disordered regions (IDRs) were identified in only six subunits in the three kingdoms, whereas unique patterns of IDRs were identified in the other Mediator subunits. Acquisition of novel molecular recognition features (MoRFs) through the evolution of new subunits or through elongation of existing subunits was evident in metazoans and plants. A new concept of the 'junction-MoRF' is introduced. An evolutionary link between CBP and Med15 is provided, which explains the evolution of the extended IDR in CBP from the Med15 KIX-IDR junction-MoRF, suggesting a role of junction-MoRFs in the evolution and modulation of the protein-protein interaction repertoire. This study can be informative and helpful in understanding the conserved and flexible nature of the Mediator complex across eukaryotic kingdoms. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  5. Predictive Surface Complexation Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Sverjensky, Dimitri A. [Johns Hopkins Univ., Baltimore, MD (United States). Dept. of Earth and Planetary Sciences

    2016-11-29

    Surface complexation plays an important role in the equilibria and kinetics of processes controlling the compositions of soilwaters and groundwaters, the fate of contaminants in groundwaters, and the subsurface storage of CO2 and nuclear waste. Over the last several decades, many dozens of individual experimental studies have addressed aspects of surface complexation that have contributed to an increased understanding of its role in natural systems. However, there has been no previous attempt to develop a model of surface complexation that can be used to link all the experimental studies in order to place them on a predictive basis. Overall, my research has successfully integrated the results of the work of many experimentalists published over several decades. For the first time in studies of the geochemistry of the mineral-water interface, a practical predictive capability for modeling has become available. The predictive correlations developed in my research now enable extrapolations of experimental studies to provide estimates of surface chemistry for systems not yet studied experimentally and for natural and anthropogenically perturbed systems.

  6. Polystochastic Models for Complexity

    CERN Document Server

    Iordache, Octavian

    2010-01-01

    This book is devoted to complexity understanding and management, considered as the main source of efficiency and prosperity for the coming decades. Divided into six chapters, the book begins with a presentation of basic concepts such as complexity, emergence and closure. The second chapter turns to methods and introduces polystochastic models, the wave equation, possibilities and entropy. The third chapter, focusing on physical and chemical systems, analyzes flow-sheet synthesis, cyclic operations of separation, drug delivery systems and entropy production. Biomimetic systems are the main objective of the fourth chapter. Case studies refer to bio-inspired calculation methods, to the role of artificial genetic codes, neural networks and neural codes for evolutionary calculus and for evolvable circuits as biomimetic devices. The fifth chapter, taking its inspiration from systems sciences and cognitive sciences, looks to engineering design, case-based reasoning methods, failure analysis, and multi-agent manufacturing...

  7. Cementitious Barriers Partnership Accomplishments And Relevance To The DOE Complex

    International Nuclear Information System (INIS)

    Burns, H.; Langton, C.; Flach, G.; Kosson, D.

    2010-01-01

    The Cementitious Barriers Partnership (CBP) was initiated to reduce risk and uncertainties in the performance assessments that directly impact U.S. Department of Energy (DOE) environmental cleanup and closure programs. The CBP is supported by the DOE Office of Environmental Management (DOE-EM) and has been specifically addressing the following critical EM program needs: (i) the long-term performance of cementitious barriers and materials in nuclear waste disposal facilities and (ii) increased understanding of contaminant transport behavior within cementitious barrier systems to support the development and deployment of adequate closure technologies. To accomplish this, the CBP has two initiatives: (1) an experimental initiative to increase understanding of changes in cementitious materials over long times (> 1000 years) and under changing conditions and (2) a modeling initiative to enhance and integrate a set of computational tools, validated by laboratory and field experimental data, to improve understanding and prediction of the long-term performance of cementitious barriers and waste forms used in nuclear applications. In FY10, the CBP developed the initial phase of an integrated modeling tool to serve as a screening tool that could help in making decisions concerning disposal and tank closure. CBP experimental programs are underway to validate this tool and provide increased understanding of how cementitious materials change over time and under changing conditions. These initial CBP products, which will eventually be enhanced, are anticipated to reduce the uncertainties of current methodologies for assessing cementitious barrier performance and to increase the consistency and transparency of the DOE assessment process. These tools have application to low-activity waste forms, high-level waste tank closure, D&D and entombment of major nuclear facilities, landfill waste acceptance criteria, and in-situ grouting and immobilization of vadose zone contamination. This paper

  8. Mouse models of ageing and their relevance to disease.

    Science.gov (United States)

    Kõks, Sulev; Dogan, Soner; Tuna, Bilge Guvenc; González-Navarro, Herminia; Potter, Paul; Vandenbroucke, Roosmarijn E

    2016-12-01

    Ageing is a process that gradually increases the organism's vulnerability to death. It affects different biological pathways, and the underlying cellular mechanisms are complex. In view of the growing disease burden of ageing populations, increasing efforts are being invested in understanding the pathways and mechanisms of ageing. We review some mouse models commonly used in studies on ageing, highlight the advantages and disadvantages of the different strategies, and discuss their relevance to disease susceptibility. In addition to addressing the genetics and phenotypic analysis of mice, we discuss examples of models of delayed or accelerated ageing and their modulation by caloric restriction. Copyright © 2016 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  9. Towards policy relevant environmental modeling: contextual validity and pragmatic models

    Science.gov (United States)

    Miles, Scott B.

    2000-01-01

    "What makes for a good model?" In various forms, this is a question that, undoubtedly, many people, businesses, and institutions ponder with regard to their particular domain of modeling. One particular domain that is wrestling with this question is the multidisciplinary field of environmental modeling. Examples of environmental models range from models of contaminated ground water flow to the economic impact of natural disasters, such as earthquakes. One of the distinguishing claims of the field is the relevance of environmental modeling to policy and environment-related decision-making in general. A pervasive view among both scientists and decision-makers is that a "good" model is one that is an accurate predictor. Thus, determining whether a model is "accurate" or "correct" is done by comparing model output to empirical observations. The expected outcome of this process, usually referred to as "validation" or "ground truthing," is a stamp on the model in question of "valid" or "not valid" that serves to indicate whether or not the model will be reliable before it is put into service in a decision-making context. In this paper, I begin by elaborating on the prevailing view of model validation and why this view must change. Drawing from concepts coming out of the studies of science and technology, I go on to propose a contextual view of validity that can overcome the problems associated with "ground truthing" models as an indicator of model goodness. The problem of how we talk about and determine model validity has much to do with how we perceive the utility of environmental models. In the remainder of the paper, I argue that we should adopt ideas of pragmatism in judging what makes for a good model and, in turn, developing good models. From such a perspective of model goodness, good environmental models should facilitate communication, convey—not bury or "eliminate"—uncertainties, and, thus, afford the active building of consensus decisions, instead

  10. Are invertebrates relevant models in ageing research?

    DEFF Research Database (Denmark)

    Erdogan, Cihan Suleyman; Hansen, Benni Winding; Vang, Ole

    2016-01-01

    is an evolutionarily conserved key protein kinase in the TOR pathway that regulates growth, proliferation and cell metabolism in response to nutrients, growth factors and stress. Comparing the ageing process in invertebrate model organisms with relatively short lifespans to that in mammals provides valuable information about...... the molecular mechanisms underlying the ageing process faster than mammalian systems do. Inhibition of TOR pathway activity, via either genetic manipulation or rapamycin, increases lifespan profoundly in most invertebrate model organisms. This contribution will review the recent findings in invertebrates concerning...... the TOR pathway and the effects of TOR inhibition by rapamycin on lifespan. Despite some contradictory results, the majority point out that rapamycin induces longevity. This suggests that administration of rapamycin in invertebrates is a promising tool for pursuing the scientific puzzle of lifespan...

  11. Complex Systems and Self-organization Modelling

    CERN Document Server

    Bertelle, Cyrille; Kadri-Dahmani, Hakima

    2009-01-01

    The concern of this book is the use of emergent computing and self-organization modelling within various applications of complex systems. The authors focus both on the innovative concepts and implementations used to model self-organization, and on the relevant application domains in which they can be used efficiently. This book is the outcome of a workshop meeting within ESM 2006 (Eurosis), held in Toulouse, France in October 2006.

  12. On relevant boundary perturbations of unitary minimal models

    International Nuclear Information System (INIS)

    Recknagel, A.; Roggenkamp, D.; Schomerus, V.

    2000-01-01

    We consider unitary Virasoro minimal models on the disk with Cardy boundary conditions and discuss deformations by certain relevant boundary operators, analogous to tachyon condensation in string theory. Concentrating on the least relevant boundary field, we can perform a perturbative analysis of renormalization group fixed points. We find that the systems always flow towards stable fixed points which admit no further (non-trivial) relevant perturbations. The new conformal boundary conditions are in general given by superpositions of 'pure' Cardy boundary conditions

  13. Appropriate complexity landscape modeling

    NARCIS (Netherlands)

    Larsen, Laurel G.; Eppinga, Maarten B.; Passalacqua, Paola; Getz, Wayne M.; Rose, Kenneth A.; Liang, Man

    Advances in computing technology, new and ongoing restoration initiatives, concerns about climate change's effects, and the increasing interdisciplinarity of research have encouraged the development of landscape-scale mechanistic models of coupled ecological-geophysical systems. However,

  14. X-ray and vibrational spectroscopy of manganese complexes relevant to the oxygen-evolving complex of photosynthesis

    Energy Technology Data Exchange (ETDEWEB)

    Visser, Hendrik [Univ. of California, Berkeley, CA (United States)

    2001-01-01

    Manganese model complexes, relevant to the oxygen-evolving complex (OEC) in photosynthesis, were studied with Mn K-edge X-ray absorption near-edge spectroscopy (XANES), Mn Kβ X-ray emission spectroscopy (XES), and vibrational spectroscopy. A more detailed understanding was obtained of the influence of nuclearity, overall structure, oxidation state, and ligand environment of the Mn atoms on the spectra from these methods. This refined understanding is necessary for improving the interpretation of spectra of the OEC. Mn XANES and Kβ XES were used to study a di-μ-oxo and a mono-μ-oxo dinuclear Mn compound in the (III,III), (III,IV), and (IV,IV) oxidation states. XANES spectra show energy shifts of 0.8 - 2.2 eV for 1-electron oxidation-state changes and 0.4 - 1.8 eV for ligand-environment changes. The shifts observed for Mn XES spectra were approximately 0.21 eV for oxidation-state changes and only approximately 0.04 eV for ligand-environment changes. This indicates that Mn Kβ XES is more sensitive to the oxidation state and less sensitive to the ligand environment of the Mn atoms than XANES. These complementary methods provide information about the oxidation state and the ligand environment of Mn atoms in model compounds and biological systems. A versatile spectroelectrochemical apparatus was designed to aid the interpretation of IR spectra of Mn compounds in different oxidation states. The design, based on an attenuated total reflection device, permits the study of a wide spectral range: 16,700 (600 nm) - 225

  15. A Compositional Relevance Model for Adaptive Information Retrieval

    Science.gov (United States)

    Mathe, Nathalie; Chen, James; Lu, Henry, Jr. (Technical Monitor)

    1994-01-01

    There is a growing need for rapid and effective access to information in large electronic documentation systems. Access can be facilitated if information relevant in the current problem solving context can be automatically supplied to the user. This includes information relevant to particular user profiles, tasks being performed, and problems being solved. However most of this knowledge on contextual relevance is not found within the contents of documents, and current hypermedia tools do not provide any easy mechanism to let users add this knowledge to their documents. We propose a compositional relevance network to automatically acquire the context in which previous information was found relevant. The model records information on the relevance of references based on user feedback for specific queries and contexts. It also generalizes such information to derive relevant references for similar queries and contexts. This model lets users filter information by context of relevance, build personalized views of documents over time, and share their views with other users. It also applies to any type of multimedia information. Compared to other approaches, it is less costly and doesn't require any a priori statistical computation, nor an extended training period. It is currently being implemented into the Computer Integrated Documentation system which enables integration of various technical documents in a hypertext framework.

  16. Complex mixtures: Relevance of combined exposure to substances at low dose levels

    NARCIS (Netherlands)

    Leeman, W.R.; Krul, L.; Houben, G.F.

    2013-01-01

    Upon analysis of chemically complex food matrices, a forest of peaks is likely to be found. Identification of these peaks and concurrent determination of their toxicological relevance upon exposure are very time-consuming, expensive and often require animal studies. Recently, a safety assessment

  17. A study of ruthenium complexes of some biologically relevant α-N ...

    Indian Academy of Sciences (India)

    Journal of Chemical Sciences, Volume 112, Issue 3. A study of ruthenium complexes of some biologically relevant α-N-heterocyclic ... Author affiliations: P Sengupta, S Ghosh, Department of Inorganic Chemistry, Indian Association for the Cultivation of Science, Jadavpur, Calcutta 700 032, India ...

  18. Stress and adaptation : Toward ecologically relevant animal models

    NARCIS (Netherlands)

    Koolhaas, Jaap M.; Boer, Sietse F. de; Buwalda, Bauke

    Animal models have contributed considerably to the current understanding of mechanisms underlying the role of stress in health and disease. Despite the progress already made, much more can be achieved by more carefully exploiting the shared biology of animals and humans, using ecologically relevant models.

  19. A Hybrid Approach to Finding Relevant Social Media Content for Complex Domain Specific Information Needs.

    Science.gov (United States)

    Cameron, Delroy; Sheth, Amit P; Jaykumar, Nishita; Thirunarayan, Krishnaprasad; Anand, Gaurish; Smith, Gary A

    2014-12-01

    While contemporary semantic search systems offer to improve classical keyword-based search, they are not always adequate for complex domain specific information needs. The domain of prescription drug abuse, for example, requires knowledge of both ontological concepts and "intelligible constructs" not typically modeled in ontologies. These intelligible constructs convey essential information that include notions of intensity, frequency, interval, dosage and sentiments, which could be important to the holistic needs of the information seeker. In this paper, we present a hybrid approach to domain specific information retrieval that integrates ontology-driven query interpretation with synonym-based query expansion and domain specific rules, to facilitate search in social media on prescription drug abuse. Our framework is based on a context-free grammar (CFG) that defines the query language of constructs interpretable by the search system. The grammar provides two levels of semantic interpretation: 1) a top-level CFG that facilitates retrieval of diverse textual patterns, which belong to broad templates and 2) a low-level CFG that enables interpretation of specific expressions belonging to such textual patterns. These low-level expressions occur as concepts from four different categories of data: 1) ontological concepts, 2) concepts in lexicons (such as emotions and sentiments), 3) concepts in lexicons with only partial ontology representation, called lexico-ontology concepts (such as side effects and routes of administration (ROA)), and 4) domain specific expressions (such as date, time, interval, frequency and dosage) derived solely through rules. Our approach is embodied in a novel Semantic Web platform called PREDOSE, which provides search support for complex domain specific information needs in prescription drug abuse epidemiology. When applied to a corpus of over 1 million drug abuse-related web forum posts, our search framework proved effective in retrieving

  20. The Kuramoto model in complex networks

    Science.gov (United States)

    Rodrigues, Francisco A.; Peron, Thomas K. DM.; Ji, Peng; Kurths, Jürgen

    2016-01-01

    Synchronization of an ensemble of oscillators is an emergent phenomenon present in several complex systems, ranging from social and physical to biological and technological systems. The most successful approach to describe how coherent behavior emerges in these complex systems is given by the paradigmatic Kuramoto model. This model has been traditionally studied in complete graphs. However, besides being intrinsically dynamical, complex systems present very heterogeneous structure, which can be represented as complex networks. This report is dedicated to review main contributions in the field of synchronization in networks of Kuramoto oscillators. In particular, we provide an overview of the impact of network patterns on the local and global dynamics of coupled phase oscillators. We cover many relevant topics, which encompass a description of the most used analytical approaches and the analysis of several numerical results. Furthermore, we discuss recent developments on variations of the Kuramoto model in networks, including the presence of noise and inertia. The rich potential for applications is discussed for special fields in engineering, neuroscience, physics and Earth science. Finally, we conclude by discussing problems that remain open after the last decade of intensive research on the Kuramoto model and point out some promising directions for future research.
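The dynamics this review surveys can be made concrete with a minimal sketch (illustrative only, not code from the report): Euler integration of the Kuramoto equations dθᵢ/dt = ωᵢ + K Σⱼ Aᵢⱼ sin(θⱼ − θᵢ) on a complete graph, tracking the order parameter r = |⟨e^{iθ}⟩|, which rises toward 1 when the coupling K exceeds its critical value.

```python
import numpy as np

def kuramoto_step(theta, omega, A, K, dt):
    """One Euler step of dtheta_i/dt = omega_i + K * sum_j A_ij sin(theta_j - theta_i)."""
    # element (i, j) of the matrix below is sin(theta_j - theta_i)
    coupling = (A * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
    return theta + dt * (omega + K * coupling)

def order_parameter(theta):
    """r = |<exp(i*theta)>| in [0, 1]: 0 for incoherence, 1 for full phase locking."""
    return float(np.abs(np.exp(1j * theta).mean()))

rng = np.random.default_rng(0)
N = 100
A = np.ones((N, N)) - np.eye(N)        # complete graph, the classic setting
omega = rng.normal(0.0, 0.5, N)        # Gaussian natural frequencies
theta = rng.uniform(0.0, 2 * np.pi, N)

K = 4.0 / N                            # well above the critical coupling for this omega spread
for _ in range(4000):
    theta = kuramoto_step(theta, omega, A, K, dt=0.01)
print(order_parameter(theta))          # r near 1 once the ensemble locks
```

Replacing `A` with the adjacency matrix of a heterogeneous network is the setting the report reviews.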

  1. Computational models of complex systems

    CERN Document Server

    Dabbaghian, Vahid

    2014-01-01

    Computational and mathematical models provide us with opportunities to investigate the complexities of real-world problems. They allow us to apply our best analytical methods to define problems in a clearly mathematical manner and exhaustively test our solutions before committing expensive resources. This is made possible by assuming parameter(s) in a bounded environment, allowing for controllable experimentation that is not always possible in live scenarios. For example, simulation of computational models allows the testing of theories in a manner that is both fundamentally deductive and experimental in nature. The main ingredients for such research ideas come from multiple disciplines, and the importance of interdisciplinary research is well recognized by the scientific community. This book provides a window onto the novel endeavours of the research communities, highlighting the value of computational modelling as a research tool when investigating complex systems. We hope that the reader...

  2. Relevant criteria for testing the quality of turbulence models

    DEFF Research Database (Denmark)

    Frandsen, Sten Tronæs; Ejsing Jørgensen, Hans; Sørensen, J.D.

    2007-01-01

    Seeking relevant criteria for testing the quality of turbulence models, the scale of turbulence and the gust factor have been estimated from data and compared with predictions from first-order models of these two quantities. It is found that the mean of the measured length scales is approx. 10% smaller than the IEC model, for wind turbine hub height levels. The mean is only marginally dependent on trends in time series. It is also found that the coefficient of variation of the measured length scales is about 50%. 3sec and 10sec pre-averaging of wind speed data are relevant for MW-size wind turbines when seeking wind characteristics that correspond to one blade and the entire rotor, respectively. For heights exceeding 50-60m the gust factor increases with wind speed. For heights larger than 60-80m, present assumptions on the value of the gust factor are significantly conservative, both for 3

  3. Complexity-aware simple modeling.

    Science.gov (United States)

    Gómez-Schiavon, Mariana; El-Samad, Hana

    2018-02-26

    Mathematical models continue to be essential for deepening our understanding of biology. On one extreme, simple or small-scale models help delineate general biological principles. However, the parsimony of detail in these models, as well as their assumption of modularity and insulation, makes them inaccurate for describing quantitative features. On the other extreme, large-scale and detailed models can quantitatively recapitulate a phenotype of interest, but have to rely on many unknown parameters, often making them difficult to parse mechanistically and to use for extracting general principles. We discuss some examples of a new approach, complexity-aware simple modeling, that can bridge the gap between the small-scale and large-scale approaches. Copyright © 2018 Elsevier Ltd. All rights reserved.

  4. Complex Networks in Psychological Models

    Science.gov (United States)

    Wedemann, R. S.; Carvalho, L. S. A. V. D.; Donangelo, R.

    We develop schematic, self-organizing, neural-network models to describe mechanisms associated with mental processes through a neurocomputational substrate. These models are examples of real-world complex networks with interesting general topological structures. Considering dopaminergic signal-to-noise neuronal modulation in the central nervous system, we propose neural network models to explain the development of cortical map structure and the dynamics of memory access, and to unify different mental processes into a single neurocomputational substrate. Based on our neural network models, neurotic behavior may be understood as an associative memory process in the brain, and the linguistic, symbolic associative process involved in psychoanalytic working-through can be mapped onto a corresponding process of reconfiguration of the neural network. The models are illustrated through computer simulations, where we varied dopaminergic modulation and observed the self-organizing emergent patterns in the resulting semantic map, interpreting them as different manifestations of mental functioning, from psychotic through normal to neurotic behavior, and creativity.

  5. Theoretical Relevance of Neuropsychological Data for Connectionist Modelling

    Directory of Open Access Journals (Sweden)

    Mauricio Iza

    2011-05-01

    The symbolic information-processing paradigm in cognitive psychology has met a growing challenge from neural network models over the past two decades. While neuropsychological evidence has been of great utility to theories concerned with information processing, the real question is whether the less rigid connectionist models provide valid, or enough, information concerning complex cognitive structures. In this work, we discuss the theoretical implications that neuropsychological data poses for modelling cognitive systems.

  6. Complex fluids modeling and algorithms

    CERN Document Server

    Saramito, Pierre

    2016-01-01

    This book presents a comprehensive overview of the modeling of complex fluids, including many common substances, such as toothpaste, hair gel, mayonnaise, liquid foam, cement and blood, which cannot be described by Navier-Stokes equations. It also offers an up-to-date mathematical and numerical analysis of the corresponding equations, as well as several practical numerical algorithms and software solutions for the approximation of the solutions. It discusses industrial (molten plastics, forming process), geophysical (mud flows, volcanic lava, glaciers and snow avalanches), and biological (blood flows, tissues) modeling applications. This book is a valuable resource for undergraduate students and researchers in applied mathematics, mechanical engineering and physics.

  7. Models of the Economic Growth and their Relevance

    Directory of Open Access Journals (Sweden)

    Nicolae MOROIANU

    2012-06-01

    Until a few years ago, economic growth was something perfectly normal, part of an era marked by the speed of transformation. Normality itself has been transformed, and we are currently influenced by other rules, as yet unknown, which should answer the question: “How do we return to economic growth?” Economic growth and the models aiming to solve this problem have concerned economic history since its very beginnings. In this paper we would like to find out what relevance the well-known macroeconomic models still have, and what their level of applicability might be in a framework created by a black-swan-type event.

  8. Macroscale hydrologic modeling of ecologically relevant flow metrics

    Science.gov (United States)

    Wenger, Seth J.; Luce, Charles H.; Hamlet, Alan F.; Isaak, Daniel J.; Neville, Helen M.

    2010-09-01

    Stream hydrology strongly affects the structure of aquatic communities. Changes to air temperature and precipitation driven by increased greenhouse gas concentrations are shifting the timing and volume of streamflows, potentially affecting these communities. The variable infiltration capacity (VIC) macroscale hydrologic model has been employed at regional scales to describe and forecast hydrologic changes, but has been calibrated and applied mainly to large rivers. An important question is how well VIC runoff simulations serve to answer questions about hydrologic changes in smaller streams, which are important habitat for many fish species. To answer this question, we aggregated gridded VIC outputs within the drainage basins of 55 streamflow gages in the Pacific Northwest United States and compared modeled hydrographs and summary metrics to observations. For most streams, several ecologically relevant aspects of the hydrologic regime were accurately modeled, including center of flow timing, mean annual and summer flows, and frequency of winter floods. Frequencies of high and low flows in the summer were not well predicted, however. Predictions were worse for sites with strong groundwater influence, and some sites showed errors that may result from limitations in the forcing climate data. Higher-resolution (1/16th degree) modeling provided small improvements over lower resolution (1/8th degree). Despite some limitations, the VIC model appears capable of representing several ecologically relevant hydrologic characteristics in streams, making it a useful tool for understanding the effects of hydrology in delimiting species distributions and predicting the potential effects of climate shifts on aquatic organisms.
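One of the metrics mentioned above, center of flow timing, is straightforward to compute from a daily hydrograph: it is the day of the water year by which half of the annual flow volume has passed. The sketch below is illustrative only; the synthetic snowmelt hydrograph and the function name are assumptions, not taken from the paper.

```python
import numpy as np

def center_of_flow_timing(daily_flow):
    """Day index by which half of the annual flow volume has passed --
    a timing metric sensitive to snowmelt-driven shifts in runoff."""
    cum = np.cumsum(daily_flow)
    return int(np.searchsorted(cum, 0.5 * cum[-1]))

# synthetic water year: constant baseflow plus a snowmelt peak near day 200
days = np.arange(365)
flow = 1.0 + 10.0 * np.exp(-0.5 * ((days - 200.0) / 30.0) ** 2)
print(center_of_flow_timing(flow))   # a day or two before the day-200 peak
```

Applied to both a modeled and an observed hydrograph, the difference between the two values gives a simple timing-error measure.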

  9. Model complexity control for hydrologic prediction

    NARCIS (Netherlands)

    Schoups, G.; Van de Giesen, N.C.; Savenije, H.H.G.

    2008-01-01

    A common concern in hydrologic modeling is overparameterization of complex models given limited and noisy data. This leads to problems of parameter nonuniqueness and equifinality, which may negatively affect prediction uncertainties. A systematic way of controlling model complexity is therefore

  10. Bioprinting towards Physiologically Relevant Tissue Models for Pharmaceutics.

    Science.gov (United States)

    Peng, Weijie; Unutmaz, Derya; Ozbolat, Ibrahim T

    2016-09-01

    Improving the ability to predict the efficacy and toxicity of drug candidates earlier in the drug discovery process will speed up the introduction of new drugs into clinics. 3D in vitro systems have significantly advanced the drug screening process as 3D tissue models can closely mimic native tissues and, in some cases, the physiological response to drugs. Among various in vitro systems, bioprinting is a highly promising technology possessing several advantages such as tailored microarchitecture, high-throughput capability, coculture ability, and low risk of cross-contamination. In this opinion article, we discuss the currently available tissue models in pharmaceutics along with their limitations and highlight the possibilities of bioprinting physiologically relevant tissue models, which hold great potential in drug testing, high-throughput screening, and disease modeling. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. On sampling and modeling complex systems

    International Nuclear Information System (INIS)

    Marsili, Matteo; Mastromatteo, Iacopo; Roudi, Yasser

    2013-01-01

    The study of complex systems is limited by the fact that only a few variables are accessible for modeling and sampling, which are not necessarily the most relevant ones to explain the system behavior. In addition, empirical data typically undersample the space of possible states. We study a generic framework where a complex system is seen as a system of many interacting degrees of freedom, which are known only in part, that optimize a given function. We show that the underlying distribution with respect to the known variables has the Boltzmann form, with a temperature that depends on the number of unknown variables. In particular, when the influence of the unknown degrees of freedom on the known variables is not too irregular, the temperature decreases as the number of variables increases. This suggests that models can be predictable only when the number of relevant variables is less than a critical threshold. Concerning sampling, we argue that the information that a sample contains on the behavior of the system is quantified by the entropy of the frequency with which different states occur. This allows us to characterize the properties of maximally informative samples: within a simple approximation, the most informative frequency size distributions have power law behavior and Zipf’s law emerges at the crossover between the undersampled regime and the regime where the sample contains enough statistics to make inferences on the behavior of the system. These ideas are illustrated in some applications, showing that they can be used to identify relevant variables or to select the most informative representations of data, e.g. in data clustering. (paper)
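The quantity the authors use, the entropy of the frequency with which different states occur in a sample, is easy to compute; here is a toy sketch (my own example, not the paper's code) contrasting a sample dominated by one state with a maximally diverse one.

```python
from collections import Counter
import math

def sample_entropy_bits(observations):
    """Shannon entropy (bits) of the empirical frequencies of observed states."""
    counts = Counter(observations)
    n = len(observations)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# ten observations dominated by state "a": low entropy, little information
print(round(sample_entropy_bits("aabacabaad"), 3))   # → 1.571

# ten distinct states, each seen once: maximal entropy, log2(10) bits
print(round(sample_entropy_bits("abcdefghij"), 3))   # → 3.322
```

In the undersampled regime the second case is typical: almost every state is seen once, so the sample entropy saturates at log2 of the sample size.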

  12. Equation of state experiments and theory relevant to planetary modelling

    International Nuclear Information System (INIS)

    Ross, M.; Graboske, H.C. Jr.; Nellis, W.J.

    1981-01-01

    In recent years there have been a number of static and shock-wave experiments on the properties of planetary materials. The highest-pressure measurements, and the ones most relevant to planetary modelling, have been obtained by shock compression. Of particular interest to the Jovian group are results for H₂, H₂O, CH₄ and NH₃. Although the properties of metallic hydrogen have not been measured, they have been the subject of extensive calculations. In addition, recent shock-wave experiments on iron are reported to have detected melting under Earth-core conditions. From these data, theoretical models have been developed for computing the equations of state of materials used in planetary studies. A compelling feature that has followed from the use of improved material properties is a simplification in the planetary models. (author)

  13. Effect of tDCS on task relevant and irrelevant perceptual learning of complex objects.

    Science.gov (United States)

    Van Meel, Chayenne; Daniels, Nicky; de Beeck, Hans Op; Baeck, Annelies

    2016-01-01

    During perceptual learning, the visual representations in the brain are altered, but the causal role of these changes has not yet been fully characterized. We used transcranial direct current stimulation (tDCS) to investigate the role of higher visual regions in lateral occipital cortex (LO) in perceptual learning with complex objects. We also investigated whether object learning depends on the relevance of the objects for the learning task. Participants were trained in two tasks: object recognition using a backward masking paradigm, and an orientation judgment task. During both tasks, an object with a red line on top of it was presented in each trial. The crucial difference between the tasks was the relevance of the object: the object was relevant for the object recognition task, but not for the orientation judgment task. During training, half of the participants received anodal tDCS stimulation targeted at the lateral occipital cortex (LO). Afterwards, participants were tested on how well they recognized the trained objects, the irrelevant objects presented during the orientation judgment task, and a set of completely new objects. Participants stimulated with tDCS during training showed larger improvements in performance compared to participants in the sham condition. No learning effect was found for the objects presented during the orientation judgment task. To conclude, this study suggests a causal role of LO in relevant object learning, but given the rather low spatial resolution of tDCS, more research on the specificity of this effect is needed. Further, mere exposure is not sufficient to train object recognition in our paradigm.

  14. A multiple relevance feedback strategy with positive and negative models.

    Directory of Open Access Journals (Sweden)

    Yunlong Ma

    A commonly used strategy to improve search accuracy is through feedback techniques. Most existing work on feedback relies on positive information and has been extensively studied in information retrieval. However, when a query topic is difficult and the results from the first-pass retrieval are very poor, it is impossible to extract enough useful terms from a few positive documents. The positive feedback strategy is therefore incapable of improving retrieval in this situation. In contrast, there is a relatively large number of negative documents at the top of the result list, and several recent studies have confirmed that negative feedback is an important and useful way of adapting to this scenario. In this paper, we consider a scenario where the search results are so poor that there are at most three relevant documents in the top twenty. We then conduct a novel study of multiple strategies for relevance feedback, using both positive and negative examples from the first-pass retrieval, to improve retrieval accuracy for such difficult queries. Experimental results on TREC collections show that the proposed language-model-based multiple-model feedback method is generally more effective than both the baseline method and methods using only a positive or a negative model.
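The paper's method is language-model based; as a generic illustration of combining positive and negative feedback, here is the classic Rocchio update over term-weight vectors. This is a well-known stand-in for the idea, not the authors' model, and the toy term space is my own assumption.

```python
import numpy as np

def rocchio(query, pos_docs, neg_docs, alpha=1.0, beta=0.75, gamma=0.25):
    """Move the query vector toward relevant documents and away from
    non-relevant ones; negative term weights are clipped to zero."""
    q = alpha * np.asarray(query, dtype=float)
    if len(pos_docs):
        q = q + beta * np.mean(pos_docs, axis=0)
    if len(neg_docs):
        q = q - gamma * np.mean(neg_docs, axis=0)
    return np.maximum(q, 0.0)

# toy term space: ["java", "island", "coffee"]
query = [1.0, 0.0, 0.0]
pos = [[1.0, 0.0, 1.0]]          # relevant doc mentions java + coffee
neg = [[1.0, 1.0, 0.0]]          # non-relevant doc mentions java + island
print(rocchio(query, pos, neg))  # "coffee" is boosted, "island" suppressed
```

With only negative examples available (the difficult-query case above), the gamma term alone still reshapes the query, which is the intuition behind negative feedback.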

  15. XAS Investigation of bio-relevant cobalt complexes in aqueous media

    International Nuclear Information System (INIS)

    Bresson, C.; Lamouroux, C.; Esnouf, S.; Solari, P.L.; Den Auwer, C.

    2006-01-01

    Cobalt is an essential element of biological cycles involved in numerous metallo-biomolecules, but it becomes a toxic element at high concentration or a radio-toxic element because of its use in the nuclear industry. 'Molecular speciation' in biological media is an essential prerequisite to evaluate its chemical behaviour as well as its toxic or beneficial effects. In this scheme, we have focused on the coordination properties of the thiol-containing amino acid cysteine (Cys) and the pseudo-peptide N-(2-mercapto-propionyl) glycine (MPG) towards the Co²⁺ cation in aqueous media. XAS at the Co K edge and traditional spectroscopic techniques have been coupled in order to structurally characterize the cobalt coordination sphere. Oxidation states and geometries of the bis- and tris-cysteinato Co(III) complexes are in agreement with the literature data. In addition, bond lengths between the metallic centre and the donor atoms have been determined. The structure of a new dimeric N-(2-mercapto-propionyl) glycinato Co(II) complex in solution is also reported. The coordination of MPG to Co(II) through the thiolate and carboxylate functions is ascertained. This work provides fundamental structural information about bio-relevant complexes of cobalt, which will contribute to our understanding of the chemical behaviour and the biological role of this radionuclide. (authors)

  16. A review of models relevant to road safety.

    Science.gov (United States)

    Hughes, B P; Newstead, S; Anund, A; Shu, C C; Falkmer, T

    2015-01-01

    It is estimated that more than 1.2 million people die worldwide as a result of road traffic crashes and some 50 million are injured per annum. At present some Western countries' road safety strategies and countermeasures claim to have developed into 'Safe Systems' models to address the effects of road-related crashes. Well-constructed models encourage effective strategies to improve road safety. This review aimed to identify and summarise concise descriptions, or 'models', of safety. The review covers information from a wide variety of fields and contexts including transport, occupational safety, the food industry, education, construction and health. Information from 2620 candidate references was selected and summarised in 121 examples of different model types and contents. The language of safety models and systems was found to be inconsistent. Each model provided additional information regarding style, purpose, complexity and diversity. In total, seven types of models were identified. The categorisation of models was done at a high level, with a variation of details in each group and without a complete, simple and rational description. The models identified in this review are likely to be adaptable to road safety, and some of them have previously been used. None of systems theory, safety management systems, the risk management approach, or safety culture has been commonly or thoroughly applied to road safety. It is concluded that these approaches have the potential to reduce road trauma. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. Culturally relevant model program to prevent and reduce agricultural injuries.

    Science.gov (United States)

    Helitzer, D L; Hathorn, G; Benally, J; Ortega, C

    2014-07-01

    Limited research has explored pesticide injury prevention among American Indian farmers. In a five-year agricultural intervention, a university-community partnership, including the University of New Mexico School of Medicine, New Mexico State University, Shiprock Area Cooperative Extension Service, and Navajo Nation communities, used a culturally relevant model to introduce and maintain safe use of integrated pest management techniques. We applied the Diffusion of Innovations theory and community-based approaches to tailor health promotion strategies for our intervention. In a longitudinal study with repeated measures, we trained six "model farmers" to be crop management experts in pesticide safety, application, and control. Subsequently, these model farmers worked with 120 farm families randomized into two groups: intervention (Group 1) and delayed intervention (Group 2). Measurements included a walk-through analysis, test of knowledge and attitudes, and yield analysis. Both groups demonstrated improvements in pesticide storage behaviors after training. Test scores regarding safety practices improved significantly: from 57.3 to 72.4 for Group 1 and from 52.6 to 76.3 for Group 2. Group 1 maintained their knowledge and safety practices after the intervention. Attitudes about pesticides and communication of viewpoints changed across the study years. With pesticides and fertilizer, the number of corn ears increased by 56.3% and yield (kg m⁻²) of alfalfa increased by 41.2%. The study combined traditional farming practices with culturally relevant approaches and behavior change theory to affect knowledge, safety practices, attitudes, communication channels, and crop yield. Storage behaviors, use of pesticides and safety and application equipment, and safety practice knowledge changed significantly, as did attitudes about social networking, social support, and the compatibility and relative advantage of pesticides for farms.

  18. Nonparametric Bayesian Modeling of Complex Networks

    DEFF Research Database (Denmark)

    Schmidt, Mikkel Nørgaard; Mørup, Morten

    2013-01-01

    Modeling structure in complex networks using Bayesian nonparametrics makes it possible to specify flexible model structures and infer the adequate model complexity from the observed data. This article provides a gentle introduction to nonparametric Bayesian modeling of complex networks: Using an infinite mixture model as a running example, we go through the steps of deriving the model as an infinite limit of a finite parametric model, inferring the model parameters by Markov chain Monte Carlo, and checking the model's fit and predictive performance. We explain how advanced nonparametric models ...
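
    The derivation route sketched in the abstract (a finite mixture pushed to its infinite limit) leads to priors such as the Chinese restaurant process; a minimal illustrative sampler for that prior (not the authors' code) could look like:

    ```python
    import random

    def crp_partition(n, alpha, seed=0):
        """Sample a partition of n items from a Chinese restaurant process
        with concentration alpha: item i joins an existing table with
        probability proportional to its size, or opens a new table with
        probability proportional to alpha."""
        rng = random.Random(seed)
        tables = []      # tables[k] = number of items seated at table k
        assignment = []  # assignment[i] = table index of item i
        for i in range(n):
            weights = tables + [alpha]
            r = rng.uniform(0, i + alpha)  # sum of all weights is i + alpha
            acc, k = 0.0, 0
            for k, w in enumerate(weights):
                acc += w
                if r < acc:
                    break
            if k == len(tables):
                tables.append(1)  # new table
            else:
                tables[k] += 1
            assignment.append(k)
        return assignment, tables

    assignment, tables = crp_partition(100, alpha=2.0)
    ```

    The number of occupied tables grows with alpha, which is how the prior lets the data determine the effective model complexity.
    
    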

  19. Mining Very High Resolution INSAR Data Based On Complex-GMRF Cues And Relevance Feedback

    Science.gov (United States)

    Singh, Jagmal; Popescu, Anca; Soccorsi, Matteo; Datcu, Mihai

    2012-01-01

    With the increase in the number of remote sensing satellites, the number of image-data scenes in our repositories is also increasing, and a large share of these scenes is never retrieved and used. Automatic retrieval of desired image data using query by image content, so as to fully utilize the huge repository volume, is therefore becoming of great interest. Different users are generally interested in scenes containing different kinds of objects and structures, so it is important to analyze all image information mining (IIM) methods to make it easier for a user to select a method depending upon his/her requirements. We concentrate our study only on high-resolution SAR images, and we propose to use InSAR observations instead of single look complex (SLC) images alone for mining scenes containing coherent objects such as high-rise buildings. However, in the case of objects with less coherence, such as areas with vegetation cover, SLC images exhibit better performance. We demonstrate an IIM performance comparison using complex Gauss-Markov random fields as texture descriptors for image patches and SVM relevance feedback.
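
    As a rough sketch of one relevance-feedback round over image-patch descriptors (using a nearest-centroid ranking as an assumed stand-in for the SVM, and plain vectors in place of complex-GMRF texture parameters):

    ```python
    import numpy as np

    def feedback_rerank(features, pos_idx, neg_idx):
        """One relevance-feedback round: score each image patch by its
        distance to the negative centroid minus its distance to the
        positive centroid, so patches near the liked examples rank first.
        (Illustrative nearest-centroid stand-in for the SVM in the paper.)"""
        pos_c = features[pos_idx].mean(axis=0)
        neg_c = features[neg_idx].mean(axis=0)
        scores = (np.linalg.norm(features - neg_c, axis=1)
                  - np.linalg.norm(features - pos_c, axis=1))
        return np.argsort(-scores)  # best first

    # four toy patch descriptors; user marked patch 0 relevant, patch 2 not
    feats = np.array([[0.0, 0.0], [0.1, 0.0], [1.0, 1.0], [0.9, 1.1]])
    ranking = feedback_rerank(feats, pos_idx=[0], neg_idx=[2])
    ```

    In the actual system the user would label a few top-ranked patches per round, and an SVM would be retrained on the accumulated labels before re-ranking.
    
    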

  20. Genetic mouse models relevant to schizophrenia: taking stock and looking forward.

    Science.gov (United States)

    Harrison, Paul J; Pritchett, David; Stumpenhorst, Katharina; Betts, Jill F; Nissen, Wiebke; Schweimer, Judith; Lane, Tracy; Burnet, Philip W J; Lamsa, Karri P; Sharp, Trevor; Bannerman, David M; Tunbridge, Elizabeth M

    2012-03-01

    Genetic mouse models relevant to schizophrenia complement, and have to a large extent supplanted, pharmacological and lesion-based rat models. The main attraction is that they potentially have greater construct validity; however, they share the fundamental limitations of all animal models of psychiatric disorder, and must also be viewed in the context of the uncertain and complex genetic architecture of psychosis. Some of the key issues, including the choice of gene to target, the manner of its manipulation, gene-gene and gene-environment interactions, and phenotypic characterization, are briefly considered in this commentary, illustrated by the relevant papers reported in this special issue. Copyright © 2011 Elsevier Ltd. All rights reserved.

  1. Location Criteria Relevant for Sustainability of Social Housing Model

    Directory of Open Access Journals (Sweden)

    Petković-Grozdanović Nataša

    2016-01-01

    Full Text Available Social housing models, which began to develop during the last century, originally had as their sole objective overcoming the housing problems of socially vulnerable categories. However, numerous studies have shown that these social categories, because of their low social status, are highly susceptible to various psychological and sociological problems. On the other hand, the low level of quality common in social housing dwellings has further aggravated these problems by initiating problem behaviours among tenants and contributing to social exclusion and segregation. Contemporary social housing models are therefore conceptualized in a way that provides a positive psycho-sociological impact on their tenants. The planning approach in social housing should therefore: support important functions in daily life routines; promote tolerance and cooperation; foster a sense of social order and belonging; support the socialization of tenants and their integration into the wider community; and improve social cohesion. The analysis of influential location parameters of the immediate and wider social housing environment strives to define those relevant to the life quality of social housing tenants, which therefore influence the sustainability of the social housing model.

  2. Social aggravation: Understanding the complex role of social relationships on stress and health-relevant physiology.

    Science.gov (United States)

    Birmingham, Wendy C; Holt-Lunstad, Julianne

    2018-04-05

    There is a rich literature on social support and physical health, but research has focused primarily on the protective effects of social relationships. The stress buffering model asserts that relationships may be protective by serving as a source of support when coping with stress, thereby blunting health-relevant physiological responses. Research also indicates that relationships can be a source of stress, which likewise influences health. In other words, the social buffering influence may have a counterpart: a social aggravating influence with an opposite or opposing effect. Drawing upon existing conceptual models, we expand these to delineate how social relationships may influence stress processes and ultimately health. This review summarizes the existing literature that points to the potential deleterious physiological effects of our relationships when they are sources of stress or exacerbate stress. Copyright © 2018 Elsevier B.V. All rights reserved.

  3. Alkali Metal Ion Complexes with Phosphates, Nucleotides, Amino Acids, and Related Ligands of Biological Relevance. Their Properties in Solution.

    Science.gov (United States)

    Crea, Francesco; De Stefano, Concetta; Foti, Claudia; Lando, Gabriele; Milea, Demetrio; Sammartano, Silvio

    2016-01-01

    Alkali metal ions play very important roles in all biological systems, and some of them are essential for life. Their concentration depends on several physiological factors and is very variable. For example, sodium concentrations in human fluids vary from quite low (e.g., 8.2 mmol dm⁻³ in mature maternal milk) to high values (0.14 mol dm⁻³ in blood plasma). While many data on the concentration of Na⁺ and K⁺ in various fluids are available, the information on other alkali metal cations is scarce. Since many vital functions depend on the network of interactions occurring in various biofluids, this chapter reviews their complex formation with phosphates, nucleotides, amino acids, and related ligands of biological relevance. Literature data on this topic are quite rare if compared to other cations. Generally, the stability of alkali metal ion complexes of organic and inorganic ligands is rather low (usually log K < 1), following the trend Li⁺ > Na⁺ > K⁺ > Rb⁺ > Cs⁺. For example, for citrate it is: log K(ML) = 0.88, 0.80, 0.48, 0.38, and 0.13, respectively, at 25 °C and infinite dilution. Some considerations are made on the main aspects related to the difficulties in the determination of weak complexes. The importance of the alkali metal ion complexes was also studied in the light of modelling natural fluids and of the use of these cations as probes for different processes. Some empirical relationships are proposed for the dependence of the stability constants of Na⁺ complexes on the ligand charge, as well as for correlations among log K values of NaL, KL, or LiL species (L = generic ligand).

  4. Relevance of separation science and technology to nuclear fuel complex operations

    International Nuclear Information System (INIS)

    Rao, S.M.; Ojha, P.B.; Rajashri, M.; Mirji, K.V.; Kalidas, R.

    2004-01-01

    During the last three decades at Nuclear Fuel Complex (NFC), Hyderabad, the science and technology of separation has been practiced to produce various reactor-grade materials in tonnage quantities in the fields of Zr/Hf, U and Nb/Ta. Apart from this, separation science is also used in the production of various high-purity materials and in the analytical field. The separation science and technology used in the production and characterisation of reactor-grade materials differs strikingly from that of the common metals. The relevance and significance of separation science in the field of nuclear materials arises mainly from the harmful effects, with respect to corrosion properties and neutron absorption, caused by the presence of impurities, which must be brought down to ppm or sub-ppm levels. In many cases low separation factors, often in multi-component systems, call for effective process control at every stage of bulk production so as to obtain a quality product consistently. This article brings out the importance of separation science and technology and the various process standardisations/developments that have been carried out at NFC, from laboratory scale to pilot scale and up to industrial-scale production, in the case of (i) uranium refining, (ii) Zr-Hf separation, (iii) Ta-Nb separation and (iv) high-purity materials production. (author)

  5. Polycomb repressive complex 2 regulates MiR-200b in retinal endothelial cells: potential relevance in diabetic retinopathy.

    Directory of Open Access Journals (Sweden)

    Michael Anthony Ruiz

    Full Text Available Glucose-induced augmented vascular endothelial growth factor (VEGF) production is a key event in diabetic retinopathy. We have previously demonstrated that downregulation of miR-200b increases VEGF, mediating structural and functional changes in the retina in diabetes. However, the mechanisms regulating miR-200b in diabetes are not known. The histone methyltransferase complex Polycomb Repressive Complex 2 (PRC2) has been shown to repress miRNAs in neoplastic processes. We hypothesized that, in diabetes, PRC2 represses miR-200b through its histone H3 lysine-27 trimethylation mark. We show that human retinal microvascular endothelial cells exposed to high levels of glucose regulate miR-200b repression through histone methylation and that inhibition of PRC2 increases miR-200b while reducing VEGF. Furthermore, retinal tissue from animal models of diabetes showed increased expression of major PRC2 components, demonstrating in vivo relevance. This research established a repressive relationship between PRC2 and miR-200b, providing evidence of a novel mechanism of miRNA regulation through histone methylation.

  6. Generalized complex geometry, generalized branes and the Hitchin sigma model

    International Nuclear Information System (INIS)

    Zucchini, Roberto

    2005-01-01

    Hitchin's generalized complex geometry has been shown to be relevant in compactifications of superstring theory with fluxes and is expected to lead to a deeper understanding of mirror symmetry. Gualtieri's notion of generalized complex submanifold seems to be a natural candidate for the description of branes in this context. Recently, we introduced a Batalin-Vilkovisky field theoretic realization of generalized complex geometry, the Hitchin sigma model, extending the well-known Poisson sigma model. In this paper, exploiting Gualtieri's formalism, we incorporate branes into the model. A detailed study of the boundary conditions obeyed by the world sheet fields is provided. Finally, it is found that, when branes are present, the classical Batalin-Vilkovisky cohomology contains an extra sector that is related non-trivially to a novel cohomology associated with the branes as generalized complex submanifolds. (author)

  7. Osteosarcoma models : understanding complex disease

    NARCIS (Netherlands)

    Mohseny, Alexander Behzad

    2012-01-01

    A mesenchymal stem cell (MSC) based osteosarcoma model was established. The model provided evidence for a MSC origin of osteosarcoma. Normal MSCs transformed spontaneously to osteosarcoma-like cells which was always accompanied by genomic instability and loss of the Cdkn2a locus. Accordingly loss of

  8. Relevance Theory as model for analysing visual and multimodal communication

    NARCIS (Netherlands)

    Forceville, C.; Machin, D.

    2014-01-01

    Elaborating on my earlier work (Forceville 1996: chapter 5, 2005, 2009; see also Yus 2008), I will here sketch how discussions of visual and multimodal discourse can be embedded in a more general theory of communication and cognition: Sperber and Wilson’s Relevance Theory/RT (Sperber and Wilson

  9. Modelling low energy electron and positron tracks in biologically relevant media

    International Nuclear Information System (INIS)

    Blanco, F.; Munoz, A.; Almeida, D.; Ferreira da Silva, F.; Limao-Vieira, P.; Fuss, M.C.; Sanz, A.G.; Garcia, G.

    2013-01-01

    This colloquium describes an approach to incorporate into radiation damage models the effect of low- and intermediate-energy (0-100 eV) electrons and positrons slowing down in biologically relevant materials (water and representative biomolecules). The core of the modelling procedure is a C++ computing programme named 'Low Energy Particle Track Simulation (LEPTS)', which is compatible with available general-purpose Monte Carlo packages. Input parameters are carefully selected from theoretical and experimental cross section data and energy loss distribution functions. Data sources used for this purpose are reviewed, showing examples of electron and positron cross section and energy loss data for interactions with different media of increasing complexity: atoms, molecules, clusters and condensed matter. Finally, we show how such a model can be used to develop an effective dosimetric tool at the molecular level (i.e. nanodosimetry). Recent experimental developments to study the fragmentation induced in biological material by charge transfer from neutrals and negative ions are also included. (authors)

  10. Mathematical approaches for complexity/predictivity trade-offs in complex system models : LDRD final report.

    Energy Technology Data Exchange (ETDEWEB)

    Goldsby, Michael E.; Mayo, Jackson R.; Bhattacharyya, Arnab (Massachusetts Institute of Technology, Cambridge, MA); Armstrong, Robert C.; Vanderveen, Keith

    2008-09-01

    The goal of this research was to examine foundational methods, both computational and theoretical, that can improve the veracity of entity-based complex system models and increase confidence in their predictions for emergent behavior. The strategy was to seek insight and guidance from simplified yet realistic models, such as cellular automata and Boolean networks, whose properties can be generalized to production entity-based simulations. We have explored the usefulness of renormalization-group methods for finding reduced models of such idealized complex systems. We have prototyped representative models that are both tractable and relevant to Sandia mission applications, and quantified the effect of computational renormalization on the predictive accuracy of these models, finding good predictivity from renormalized versions of cellular automata and Boolean networks. Furthermore, we have theoretically analyzed the robustness properties of certain Boolean networks, relevant for characterizing organic behavior, and obtained precise mathematical constraints on systems that are robust to failures. In combination, our results provide important guidance for more rigorous construction of entity-based models, which currently are often devised in an ad-hoc manner. Our results can also help in designing complex systems with the goal of predictable behavior, e.g., for cybersecurity.
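
    The robustness experiments described can be illustrated, in spirit, by perturbing a single node of a random Boolean network and measuring how far the trajectories diverge (a toy sketch, not the report's actual models):

    ```python
    import random

    def random_boolean_network(n, k, seed=1):
        """Build a random Boolean network: each of the n nodes reads k
        randomly chosen inputs through a random truth table."""
        rng = random.Random(seed)
        inputs = [rng.sample(range(n), k) for _ in range(n)]
        tables = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]

        def step(state):
            out = []
            for i in range(n):
                idx = 0
                for j in inputs[i]:
                    idx = (idx << 1) | state[j]  # encode inputs as table index
                out.append(tables[i][idx])
            return tuple(out)

        return step

    def hamming_after(step, state, flip, steps=5):
        """Flip one node, run both copies forward, and return the Hamming
        distance between the perturbed and unperturbed trajectories."""
        a, b = tuple(state), list(state)
        b[flip] ^= 1
        b = tuple(b)
        for _ in range(steps):
            a, b = step(a), step(b)
        return sum(x != y for x, y in zip(a, b))

    step = random_boolean_network(n=20, k=2)
    d = hamming_after(step, (0,) * 20, flip=0)
    ```

    Averaging this divergence over many initial states and perturbed nodes gives a simple robustness score; networks that damp perturbations (small d) correspond to the failure-robust systems the abstract refers to.
    
    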

  11. Thermodynamic modeling of complex systems

    DEFF Research Database (Denmark)

    Liang, Xiaodong

    ... after an oil spill. Engineering thermodynamics could be applied in state-of-the-art sonar products through advanced artificial technology, if the speed of sound, solubility and density of oil-seawater systems could be satisfactorily modelled. The addition of methanol or glycols into unprocessed well ... is successfully applied to model the phase behaviour of water-, chemical- and hydrocarbon (oil)-containing systems with newly developed pure-component parameters for water and chemicals and characterization procedures for petroleum fluids. The performance of the PC-SAFT EOS on liquid-liquid equilibria of water with hydrocarbons has been under debate for some years. An interactive step-wise procedure is proposed to fit the model parameters for small associating fluids by taking liquid-liquid equilibrium data into account. It is still far from a simple task to apply PC-SAFT in routine PVT simulations and phase ...

  12. Integrating retention soil filters into urban hydrologic models - Relevant processes and important parameters

    Science.gov (United States)

    Bachmann-Machnik, Anna; Meyer, Daniel; Waldhoff, Axel; Fuchs, Stephan; Dittmer, Ulrich

    2018-04-01

    Retention Soil Filters (RSFs), a form of vertical flow constructed wetlands specifically designed for combined sewer overflow (CSO) treatment, have proven to be an effective tool to mitigate negative impacts of CSOs on receiving water bodies. Long-term hydrologic simulations are used to predict the emissions from urban drainage systems during planning of stormwater management measures. So far no universally accepted model for RSF simulation exists. When simulating hydraulics and water quality in RSFs, an appropriate level of detail must be chosen for reasonable balancing between model complexity and model handling, considering the model input's level of uncertainty. The most crucial parameters determining the resultant uncertainties of the integrated sewer system and filter bed model were identified by evaluating a virtual drainage system with a Retention Soil Filter for CSO treatment. To determine reasonable parameter ranges for RSF simulations, data of 207 events from six full-scale RSF plants in Germany were analyzed. Data evaluation shows that even though different plants with varying loading and operation modes were examined, a simple model is sufficient to assess relevant suspended solids (SS), chemical oxygen demand (COD) and NH4 emissions from RSFs. Two conceptual RSF models with different degrees of complexity were assessed. These models were developed based on evaluation of data from full scale RSF plants and column experiments. Incorporated model processes are ammonium adsorption in the filter layer and degradation during subsequent dry weather period, filtration of SS and particulate COD (XCOD) to a constant background concentration and removal of solute COD (SCOD) by a constant removal rate during filter passage as well as sedimentation of SS and XCOD in the filter overflow. XCOD, SS and ammonium loads as well as ammonium concentration peaks are discharged primarily via RSF overflow not passing through the filter bed. Uncertainties of the integrated
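
    The simple conceptual model the authors argue is sufficient (filtration of particulates down to a constant background concentration; a constant removal rate for solute COD during filter passage) can be sketched as a single filter passage; the parameter values below are illustrative assumptions, not values from the paper:

    ```python
    def rsf_pass(c_in_ss, c_in_scod, ss_background=10.0, scod_removal=0.5):
        """One filter passage under the simple conceptual RSF model:
        suspended solids (and particulate COD) are filtered down to a
        constant background concentration, while solute COD is reduced
        by a constant fractional removal rate.
        Concentrations in mg/L; parameter values are illustrative."""
        c_out_ss = min(c_in_ss, ss_background)       # filtration to background
        c_out_scod = c_in_scod * (1.0 - scod_removal)  # constant removal rate
        return c_out_ss, c_out_scod

    ss, scod = rsf_pass(c_in_ss=120.0, c_in_scod=40.0)
    ```

    A full model would add the ammonium adsorption/regeneration terms and route the filter overflow (which, per the abstract, carries most of the emitted load) around this passage step.
    
    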

  13. Role models for complex networks

    Science.gov (United States)

    Reichardt, J.; White, D. R.

    2007-11-01

    We present a framework for automatically decomposing (“block-modeling”) the functional classes of agents within a complex network. These classes are represented by the nodes of an image graph (“block model”) depicting the main patterns of connectivity and thus functional roles in the network. Using a first principles approach, we derive a measure for the fit of a network to any given image graph allowing objective hypothesis testing. From the properties of an optimal fit, we derive how to find the best fitting image graph directly from the network and present a criterion to avoid overfitting. The method can handle both two-mode and one-mode data, directed and undirected as well as weighted networks and allows for different types of links to be dealt with simultaneously. It is non-parametric and computationally efficient. The concepts of structural equivalence and modularity are found as special cases of our approach. We apply our method to the world trade network and analyze the roles individual countries play in the global economy.

  14. Possible self-complexity and affective reactions to goal-relevant evaluation.

    Science.gov (United States)

    Niedenthal, P M; Setterlund, M B; Wherry, M B

    1992-07-01

    The complexity of people's self-concept appears to be inversely related to the intensity of their reactions to evaluative feedback about present goals and abilities (Linville, 1985, 1987). The idea that the complexity of individuals' possible self-concept similarly mediates reactions to feedback regarding future goals was investigated. Two preliminary studies suggested that complexity of the actual self only explains 20% to 30% of the variance in possible self-complexity. Three studies were conducted. Support was found for the idea that possible self-complexity mediates affective reactions to evaluative feedback about future goals and actual self-complexity mediates affective reactions to evaluative feedback about present goals. The findings underscore the independent roles of the organization of actual and possible self-concepts in affective processes.

  15. A content relevance model for social media health information.

    Science.gov (United States)

    Prybutok, Gayle Linda; Koh, Chang; Prybutok, Victor R

    2014-04-01

    Consumer health informatics includes the development and implementation of Internet-based systems to deliver health risk management information and health intervention applications to the public. The application of consumer health informatics to educational and interventional efforts such as smoking reduction and cessation has garnered attention from both consumers and health researchers in recent years. Scientists believe that smoking avoidance or cessation before the age of 30 years can prevent more than 90% of smoking-related cancers and that individuals who stop smoking fare as well in preventing cancer as those who never start. The goal of this study was to determine factors that were most highly correlated with content relevance for health information provided on the Internet for a study group of 18- to 30-year-old college students. Data analysis showed that the opportunity for convenient entertainment, social interaction, health information-seeking behavior, time spent surfing on the Internet, the importance of available activities on the Internet (particularly e-mail), and perceived site relevance for Internet-based sources of health information were significantly correlated with content relevance for 18- to 30-year-old college students, an educated subset of this population segment.

  16. Modelling the structure of complex networks

    DEFF Research Database (Denmark)

    Herlau, Tue

    ... networks has been independently studied as mathematical objects in their own right. As such, there has been both an increased demand for statistical methods for complex networks as well as a quickly growing mathematical literature on the subject. In this dissertation we explore aspects of modelling complex networks ... The next chapters will treat some of the various symmetries, representer theorems and probabilistic structures often deployed in the modelling of complex networks, the construction of sampling methods, and various network models. The introductory chapters will serve to provide context for the included written ...

  17. Structural characterisation of medically relevant protein assemblies by integrating mass spectrometry with computational modelling.

    Science.gov (United States)

    Politis, Argyris; Schmidt, Carla

    2018-03-20

    Structural mass spectrometry with its various techniques is a powerful tool for the structural elucidation of medically relevant protein assemblies. It delivers information on the composition, stoichiometries, interactions and topologies of these assemblies. Most importantly it can deal with heterogeneous mixtures and assemblies which makes it universal among the conventional structural techniques. In this review we summarise recent advances and challenges in structural mass spectrometric techniques. We describe how the combination of the different mass spectrometry-based methods with computational strategies enable structural models at molecular levels of resolution. These models hold significant potential for helping us in characterizing the function of protein assemblies related to human health and disease. In this review we summarise the techniques of structural mass spectrometry often applied when studying protein-ligand complexes. We exemplify these techniques through recent examples from literature that helped in the understanding of medically relevant protein assemblies. We further provide a detailed introduction into various computational approaches that can be integrated with these mass spectrometric techniques. Last but not least we discuss case studies that integrated mass spectrometry and computational modelling approaches and yielded models of medically important protein assembly states such as fibrils and amyloids. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.

  18. The Relevance of Using Mathematical Models in Macroeconomic Policies Theory

    Directory of Open Access Journals (Sweden)

    Nora Mihail

    2006-11-01

    Full Text Available The article presents an overview of the principal mathematical models – starting with the work of Theil, Hansen and Tinbergen – and their results, used to analyse and design macroeconomic policies. In the modeling field, changes are very fast, both in the theoretical aspects of modeling the many problems of macroeconomic policies and in the practical use of different policy models. The article points out the problems of the static and dynamic theory used in macro-policy modeling.

  20. The Complexity Turn in Studies of Organisations and Leadership: Relevance and Implications

    Science.gov (United States)

    Johannessen, Stig O.

    2009-01-01

    The widespread experience of complexity is the experience of radical unpredictability and loss of clear connections between cause and effect. The typical response from leaders and researchers is to suggest that more complex contexts require new ways of management control and that particular ways of organising and leading are better than others in…

  1. Computational Modeling of Complex Protein Activity Networks

    NARCIS (Netherlands)

    Schivo, Stefano; Leijten, Jeroen; Karperien, Marcel; Post, Janine N.; Prignet, Claude

    2017-01-01

    Because of the numerous entities interacting, the complexity of the networks that regulate cell fate makes it impossible to analyze and understand them using the human brain alone. Computational modeling is a powerful method to unravel complex systems. We recently described the development of a

  2. Exploring the potential relevance of human-specific genes to complex disease

    Directory of Open Access Journals (Sweden)

    Cooper David N

    2011-01-01

    Full Text Available Although human disease genes generally tend to be evolutionarily more ancient than non-disease genes, complex disease genes appear to be represented more frequently than Mendelian disease genes among genes of more recent evolutionary origin. It is therefore proposed that the analysis of human-specific genes might provide new insights into the genetics of complex disease. Cross-comparison with the Human Gene Mutation Database (http://www.hgmd.org) revealed a number of examples of disease-causing and disease-associated mutations in putatively human-specific genes. A sizeable proportion of these were missense polymorphisms associated with complex disease. Since both human-specific genes and genes associated with complex disease have often experienced particularly rapid rates of evolutionary change, either due to weaker purifying selection or positive selection, it is proposed that a significant number of human-specific genes may play a role in complex disease.

  3. Models of complex attitude systems

    DEFF Research Database (Denmark)

    Sørensen, Bjarne Taulo

Existing research on public attitudes towards agricultural production systems is largely descriptive, abstracting from the processes through which members of the general public generate their evaluations of such systems. The present paper adopts a systems perspective on such evaluations... that evaluative affect propagates through the system in such a way that the system becomes evaluatively consistent and operates as a schema for the generation of evaluative judgments. In the empirical part of the paper, the causal structure of an attitude system from which people derive their evaluations of pork... search algorithms and structural equation models. The results suggest that evaluative judgments of the importance of production system attributes are generated in a schematic manner, driven by personal value orientations. The effect of personal value orientations was strong and largely unmediated...

  4. Extracting the relevant delays in time series modelling

    DEFF Research Database (Denmark)

    Goutte, Cyril

    1997-01-01

In this contribution, we suggest a convenient way to use generalisation error to extract the relevant delays from a time-varying process, i.e. the delays that lead to the best prediction performance. We design a generalisation-based algorithm that takes its inspiration from traditional variable selection, and more precisely stepwise forward selection. The method is compared to other forward selection schemes, as well as to a nonparametric test aimed at estimating the embedding dimension of time series. The final application extends these results to the efficient estimation of FIR filters on some...
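The stepwise forward selection of delays described in this abstract can be sketched as follows. This is a minimal illustration under assumed choices (a linear least-squares predictor and a single train/validation split standing in for the generalisation-error estimate), not the article's actual algorithm; the function name and parameters are hypothetical.

```python
import numpy as np

def forward_delay_selection(x, max_delay=10, horizon=1, val_frac=0.3):
    """Greedily add the delay that most reduces validation error of a
    linear predictor of x[t + horizon]; stop when no delay improves it."""
    T = len(x)
    rows = range(max_delay, T - horizon)
    # Lagged design matrix: column d holds x[t - (d + 1)].
    X = np.array([[x[t - d] for d in range(1, max_delay + 1)] for t in rows])
    y = np.array([x[t + horizon] for t in rows])
    split = int(len(y) * (1 - val_frac))
    selected, best_err = [], np.inf
    while True:
        gains = {}
        for d in range(max_delay):
            if d in selected:
                continue
            cols = selected + [d]
            w, *_ = np.linalg.lstsq(X[:split, cols], y[:split], rcond=None)
            gains[d] = np.mean((X[split:, cols] @ w - y[split:]) ** 2)
        if not gains:
            break
        d_best = min(gains, key=gains.get)
        if gains[d_best] >= best_err:  # no further generalisation gain
            break
        best_err = gains[d_best]
        selected.append(d_best)
    return [d + 1 for d in selected], best_err
```

On a toy AR process such as x[t] = 0.8 x[t-2] + noise, the procedure picks out the single informative lag and ignores the uninformative ones.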

  5. Foundations for Streaming Model Transformations by Complex Event Processing.

    Science.gov (United States)

    Dávid, István; Ráth, István; Varró, Dániel

    2018-01-01

Streaming model transformations represent a novel class of transformations to manipulate models whose elements are continuously produced or modified in high volume and with rapid rate of change. Executing streaming transformations requires efficient techniques to recognize activated transformation rules over a live model and a potentially infinite stream of events. In this paper, we propose foundations of streaming model transformations by innovatively integrating incremental model query, complex event processing (CEP) and reactive (event-driven) transformation techniques. Complex event processing makes it possible to identify relevant patterns and sequences of events over an event stream. Our approach enables event streams to include model change events which are automatically and continuously populated by incremental model queries. Furthermore, a reactive rule engine carries out transformations on identified complex event patterns. We provide an integrated domain-specific language with precise semantics for capturing complex event patterns and streaming transformations together with an execution engine, all of which is now part of the Viatra reactive transformation framework. We demonstrate the feasibility of our approach with two case studies: one in an advanced model engineering workflow; and one in the context of on-the-fly gesture recognition.

  6. Modeling Musical Complexity: Commentary on Eerola (2016)

    Directory of Open Access Journals (Sweden)

    Joshua Albrecht

    2016-07-01

In his paper, "Expectancy violation and information-theoretic models of melodic complexity," Eerola compares a number of models that correlate musical features of monophonic melodies with participant ratings of perceived melodic complexity. He finds that fairly strong results can be achieved using several different approaches to modeling perceived melodic complexity. The data used in this study are gathered from several previously published studies that use widely different types of melodies, including isochronous folk melodies, isochronous 12-tone rows, and rhythmically complex African folk melodies. This commentary first briefly reviews the article's method and main findings, then suggests a rethinking of the theoretical framework of the study. Finally, some of the methodological issues of the study are discussed.

  7. Relevance of a Managerial Decision-Model to Educational Administration.

    Science.gov (United States)

Lundin, Edward; Welty, Gordon

    The rational model of classical economic theory assumes that the decision maker has complete information on alternatives and consequences, and that he chooses the alternative that maximizes expected utility. This model does not allow for constraints placed on the decision maker resulting from lack of information, organizational pressures,…

  8. On the empirical relevance of the transient in opinion models

    Energy Technology Data Exchange (ETDEWEB)

    Banisch, Sven, E-mail: sven.banisch@universecity.d [Mathematical Physics, Physics Department, Bielefeld University, 33501 Bielefeld (Germany); Institute for Complexity Science (ICC), 1249-078 Lisbon (Portugal); Araujo, Tanya, E-mail: tanya@iseg.utl.p [Research Unit on Complexity in Economics (UECE), ISEG, TULisbon, 1249-078 Lisbon (Portugal); Institute for Complexity Science (ICC), 1249-078 Lisbon (Portugal)

    2010-07-12

While the number and variety of models to explain opinion exchange dynamics is huge, attempts to justify the model results using empirical data are relatively rare. As linking to real data is essential for establishing model credibility, this Letter develops an empirical confirmation experiment by which an opinion model is related to real election data. The model is based on a representation of opinions as a vector of k bits. Individuals interact according to the principle that similarity leads to interaction and interaction leads to still more similarity. In the comparison to real data we concentrate on the transient opinion profiles that form during the dynamic process. An artificial election procedure is introduced which allows transient opinion configurations to be related to the electoral performance of candidates for whom data are available. The election procedure, based on the well-established principle of proximity voting, is repeatedly performed during the transient period, and remarkable statistical agreement with the empirical data is observed.
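The dynamics described in this abstract can be sketched in a few lines. This is a toy illustration, not the Letter's exact model: the interaction probability proportional to bit-agreement and the copying of a single differing bit are assumptions consistent with the "similarity leads to interaction, interaction leads to more similarity" principle, and all names are hypothetical.

```python
import random

def simulate(n_agents=100, k=8, steps=20000, seed=1):
    """Toy k-bit opinion model: pick two agents, interact with probability
    proportional to their bit-agreement, and copy one differing bit."""
    rng = random.Random(seed)
    ops = [[rng.randint(0, 1) for _ in range(k)] for _ in range(n_agents)]
    for _ in range(steps):
        i, j = rng.sample(range(n_agents), 2)
        same = sum(a == b for a, b in zip(ops[i], ops[j]))
        if rng.random() < same / k:  # similarity leads to interaction ...
            diff = [b for b in range(k) if ops[i][b] != ops[j][b]]
            if diff:                 # ... and interaction to more similarity
                b = rng.choice(diff)
                ops[i][b] = ops[j][b]
    return ops

def proximity_vote(opinions, candidates):
    """Proximity voting: each agent votes for the candidate whose
    bit-vector is closest in Hamming distance."""
    votes = [0] * len(candidates)
    for o in opinions:
        dists = [sum(a != b for a, b in zip(o, c)) for c in candidates]
        votes[dists.index(min(dists))] += 1
    return votes
```

Running `proximity_vote` repeatedly on snapshots of `simulate` during the transient period mimics the artificial election procedure the authors compare against election data.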

  9. On the empirical relevance of the transient in opinion models

    International Nuclear Information System (INIS)

    Banisch, Sven; Araujo, Tanya

    2010-01-01

While the number and variety of models to explain opinion exchange dynamics is huge, attempts to justify the model results using empirical data are relatively rare. As linking to real data is essential for establishing model credibility, this Letter develops an empirical confirmation experiment by which an opinion model is related to real election data. The model is based on a representation of opinions as a vector of k bits. Individuals interact according to the principle that similarity leads to interaction and interaction leads to still more similarity. In the comparison to real data we concentrate on the transient opinion profiles that form during the dynamic process. An artificial election procedure is introduced which allows transient opinion configurations to be related to the electoral performance of candidates for whom data are available. The election procedure, based on the well-established principle of proximity voting, is repeatedly performed during the transient period, and remarkable statistical agreement with the empirical data is observed.

  10. Bioinorganic Relevance of Some Cobalt(II) Complexes with Thiophene-2-glyoxal Derived Schiff Bases

    Directory of Open Access Journals (Sweden)

    Prashant Singh

    2009-01-01

Complexes of Co(II) with two new Schiff bases, TEAB [2-hydroxy-4-{[2-oxo-2-(thiophen-2-yl)ethylidene]amino}benzoic acid] and TEPC [N-[2-oxo-2-(thiophen-2-yl)ethylidene]pyridine-3-carboxamide], have been synthesized and characterized with the help of elemental analysis, magnetic, mass, 1H-NMR, 13C-NMR, IR and electronic spectral data. IR spectra manifest the coordination of the ligand to the metal ion through the carbonyl oxygen, azomethine nitrogen and thienyl sulphur atoms. With the help of electronic spectral data various ligand field parameters were also calculated. All these studies reveal distorted octahedral Co(II) complexes. The synthesized compounds have also been screened against some microorganisms, viz. Escherichia coli, Proteus vulgaris, Aspergillus niger and Aspergillus flavus, with the help of the ‘filter paper disc’ technique. It has been observed that the antimicrobial activities of the metal complexes are higher than that of the free ligand.

  11. Clinical relevance of voltage-gated potassium channel–complex antibodies in children.

    Science.gov (United States)

    Hacohen, Yael; Singh, Rahul; Rossi, Meghan; Lang, Bethan; Hemingway, Cheryl; Lim, Ming; Vincent, Angela

    2015-09-15

To assess the clinical and immunologic findings in children with voltage-gated potassium channel (VGKC)-complex antibodies (Abs). Thirty-nine of 363 sera, referred from 2 pediatric centers from 2007 to 2013, had been reported positive (>100 pM) for VGKC-complex Abs. Medical records were reviewed retrospectively and the patients’ conditions were independently classified as inflammatory (n = 159) or noninflammatory (n = 204). Positive sera (>100 pM) were tested/retested for the VGKC-complex proteins LGI1 and CASPR2, screened for binding to live hippocampal neurons, and 12 high-titer sera (>400 pM) tested by radioimmunoassay for binding to VGKC Kv1 subunits with or without intracellular postsynaptic density proteins. VGKC-complex Abs were found in 39 children, including 20% of encephalopathies and 7.6% of other conditions (p = 0.001). Thirty children had inflammatory conditions and 9 had noninflammatory etiologies, but titers >400 pM (n = 12) were found only in inflammatory diseases (p < 0.0001). Four sera, including those from 2 children with coexisting NMDA receptor Abs and one with Guillain-Barré syndrome and Abs to both LGI1 and CASPR2, bound to hippocampal neurons. None of the sera bound detectably to VGKC Kv1 subunits on live HEK cells, but 4 of 12 >400 pM sera immunoprecipitated VGKC Kv1 subunits, with or without postsynaptic densities, extracted from transfected cells. Positive VGKC-complex Abs cannot be taken to indicate a specific clinical syndrome in children, but appear to be a nonspecific biomarker of inflammatory neurologic diseases, particularly of encephalopathy. Some of the Abs may bind to intracellular epitopes on the VGKC subunits, or to the intracellular interacting proteins, but in many the targets remain undefined.

  12. Perceived Service Quality models: Are They Still Relevant?

    OpenAIRE

    Polyakova, Olga; Mirza, Mohammed T.

    2015-01-01

    This paper reviews the concept of perceived service quality and provides an update to the body of service quality knowledge. It consolidates the pathway of perceived service quality concept, from its emergence to the research model’s development. It also critically reviews service characteristics as prerequisites of perceived service quality conceptualisation. The examination of six perceived service quality models is intended to identify a superior model that could be used by further researc...

  13. Modeling complex work systems - method meets reality

    NARCIS (Netherlands)

    van der Veer, Gerrit C.; Hoeve, Machteld; Lenting, Bert

    1996-01-01

    Modeling an existing task situation is often a first phase in the (re)design of information systems. For complex systems design, this model should consider both the people and the organization involved, the work, and situational aspects. Groupware Task Analysis (GTA) as part of a method for the

  14. Fatigue modeling of materials with complex microstructures

    DEFF Research Database (Denmark)

    Qing, Hai; Mishnaevsky, Leon

    2011-01-01

    with the phenomenological model of fatigue damage growth. As a result, the fatigue lifetime of materials with complex structures can be determined as a function of the parameters of their structures. As an example, the fatigue lifetimes of wood modeled as a cellular material with multilayered, fiber reinforced walls were...

  15. Updating the debate on model complexity

    Science.gov (United States)

    Simmons, Craig T.; Hunt, Randall J.

    2012-01-01

    As scientists who are trying to understand a complex natural world that cannot be fully characterized in the field, how can we best inform the society in which we live? This founding context was addressed in a special session, “Complexity in Modeling: How Much is Too Much?” convened at the 2011 Geological Society of America Annual Meeting. The session had a variety of thought-provoking presentations—ranging from philosophy to cost-benefit analyses—and provided some areas of broad agreement that were not evident in discussions of the topic in 1998 (Hunt and Zheng, 1999). The session began with a short introduction during which model complexity was framed borrowing from an economic concept, the Law of Diminishing Returns, and an example of enjoyment derived by eating ice cream. Initially, there is increasing satisfaction gained from eating more ice cream, to a point where the gain in satisfaction starts to decrease, ending at a point when the eater sees no value in eating more ice cream. A traditional view of model complexity is similar—understanding gained from modeling can actually decrease if models become unnecessarily complex. However, oversimplified models—those that omit important aspects of the problem needed to make a good prediction—can also limit and confound our understanding. Thus, the goal of all modeling is to find the “sweet spot” of model sophistication—regardless of whether complexity was added sequentially to an overly simple model or collapsed from an initial highly parameterized framework that uses mathematics and statistics to attain an optimum (e.g., Hunt et al., 2007). Thus, holistic parsimony is attained, incorporating “as simple as possible,” as well as the equally important corollary “but no simpler.”

  16. Differential Effects of Munc18s on Multiple Degranulation-Relevant Trans-SNARE Complexes.

    Directory of Open Access Journals (Sweden)

    Hao Xu

Mast cell exocytosis, which includes compound degranulation and vesicle-associated piecemeal degranulation, requires multiple Q- and R-SNAREs. It is not clear how these SNAREs pair to form functional trans-SNARE complexes and how these trans-SNARE complexes are selectively regulated for fusion. Here we undertake a comprehensive examination of the capacity of two Q-SNARE subcomplexes (syntaxin3/SNAP-23 and syntaxin4/SNAP-23) to form fusogenic trans-SNARE complexes with each of the four granule-borne R-SNAREs (VAMP2, 3, 7, 8). We report the identification of at least six distinct trans-SNARE complexes under enhanced tethering conditions: (i) VAMP2/syntaxin3/SNAP-23, (ii) VAMP2/syntaxin4/SNAP-23, (iii) VAMP3/syntaxin3/SNAP-23, (iv) VAMP3/syntaxin4/SNAP-23, (v) VAMP8/syntaxin3/SNAP-23, and (vi) VAMP8/syntaxin4/SNAP-23. We show for the first time that Munc18a operates synergistically with SNAP-23-based non-neuronal SNARE complexes ((i) to (iv)) in lipid mixing, in contrast to Munc18b and c, which exhibit no positive effect on any SNARE combination tested. Pre-incubation with Munc18a renders the SNARE-dependent fusion reactions insensitive to the otherwise inhibitory R-SNARE cytoplasmic domains, suggesting a protective role of Munc18a for its cognate SNAREs. Our findings substantiate the recently discovered but unexpected requirement for Munc18a in mast cell exocytosis, and implicate post-translational modifications in Munc18b/c activation.

  17. Assessing the value relevance of current mandatory business model disclosures

    DEFF Research Database (Denmark)

    Schaper, Stefan; Nielsen, Christian; Simoni, Lorenzo

Recent regulations have introduced the requirement for large companies to disclose information about their business model (BM) in the annual reports. The objective of these disclosures is to allow external users to understand better how companies create, deliver and capture value. This study aims... reports. Ad-hoc created disclosure indexes are based on the taxonomy of business model (BM) configurations developed by Taran et al. (2016), complemented by a frame of reference based on the nine BM canvas elements from Osterwalder and Pigneur (2010). After the classification of companies... the model developed by Ohlson (1995). Our results show no significant association between BM disclosure and share prices. The main reason behind this finding can be associated with the low level of disclosure (i.e. the low number of value drivers disclosed on average) by companies as part of their BM...

  18. Complexity, Modeling, and Natural Resource Management

    Directory of Open Access Journals (Sweden)

    Paul Cilliers

    2013-09-01

This paper contends that natural resource management (NRM) issues are, by their very nature, complex and that both scientists and managers in this broad field will benefit from a theoretical understanding of complex systems. It starts off by presenting the core features of a view of complexity that not only deals with the limits to our understanding, but also points toward a responsible and motivating position. Everything we do involves explicit or implicit modeling, and as we can never have comprehensive access to any complex system, we need to be aware both of what we leave out as we model and of the implications of the choice of our modeling framework. One vantage point is never sufficient, as complexity necessarily implies that multiple (independent) conceptualizations are needed to engage the system adequately. We use two South African cases as examples of complex systems - restricting the case narratives mainly to the biophysical domain associated with NRM issues - that make the point that even the behavior of the biophysical subsystems themselves is already complex. From the insights into complex systems discussed in the first part of the paper and the lessons emerging from the way these cases have been dealt with in reality, we extract five interrelated generic principles for practicing science and management in complex NRM environments. These principles are then further elucidated using four further South African case studies - organized as two contrasting pairs - and now focusing on the more difficult organizational and social side, comparing the human organizational endeavors in managing such systems.

  19. Multifaceted Modelling of Complex Business Enterprises.

    Science.gov (United States)

    Chakraborty, Subrata; Mengersen, Kerrie; Fidge, Colin; Ma, Lin; Lassen, David

    2015-01-01

    We formalise and present a new generic multifaceted complex system approach for modelling complex business enterprises. Our method has a strong focus on integrating the various data types available in an enterprise which represent the diverse perspectives of various stakeholders. We explain the challenges faced and define a novel approach to converting diverse data types into usable Bayesian probability forms. The data types that can be integrated include historic data, survey data, and management planning data, expert knowledge and incomplete data. The structural complexities of the complex system modelling process, based on various decision contexts, are also explained along with a solution. This new application of complex system models as a management tool for decision making is demonstrated using a railway transport case study. The case study demonstrates how the new approach can be utilised to develop a customised decision support model for a specific enterprise. Various decision scenarios are also provided to illustrate the versatility of the decision model at different phases of enterprise operations such as planning and control.

  20. Multifaceted Modelling of Complex Business Enterprises

    Science.gov (United States)

    2015-01-01

    We formalise and present a new generic multifaceted complex system approach for modelling complex business enterprises. Our method has a strong focus on integrating the various data types available in an enterprise which represent the diverse perspectives of various stakeholders. We explain the challenges faced and define a novel approach to converting diverse data types into usable Bayesian probability forms. The data types that can be integrated include historic data, survey data, and management planning data, expert knowledge and incomplete data. The structural complexities of the complex system modelling process, based on various decision contexts, are also explained along with a solution. This new application of complex system models as a management tool for decision making is demonstrated using a railway transport case study. The case study demonstrates how the new approach can be utilised to develop a customised decision support model for a specific enterprise. Various decision scenarios are also provided to illustrate the versatility of the decision model at different phases of enterprise operations such as planning and control. PMID:26247591

  1. Modeling OPC complexity for design for manufacturability

    Science.gov (United States)

    Gupta, Puneet; Kahng, Andrew B.; Muddu, Swamy; Nakagawa, Sam; Park, Chul-Hong

    2005-11-01

Increasing design complexity in sub-90nm designs results in increased mask complexity and cost. Resolution enhancement techniques (RET) such as assist feature addition, phase shifting (attenuated PSM) and aggressive optical proximity correction (OPC) help in preserving feature fidelity in silicon but increase mask complexity and cost. Data volume increase with rise in mask complexity is becoming prohibitive for manufacturing. Mask cost is determined by mask write time and mask inspection time, which are directly related to the complexity of features printed on the mask. Aggressive RET increase complexity by adding assist features and by modifying existing features. Passing design intent to OPC has been identified as a solution for reducing mask complexity and cost in several recent works. The goal of design-aware OPC is to relax OPC tolerances of layout features to minimize mask cost, without sacrificing parametric yield. To convey optimal OPC tolerances for manufacturing, design optimization should drive OPC tolerance optimization using models of mask cost for devices and wires. Design optimization should be aware of impact of OPC correction levels on mask cost and performance of the design. This work introduces mask cost characterization (MCC) that quantifies OPC complexity, measured in terms of fracture count of the mask, for different OPC tolerances. MCC with different OPC tolerances is a critical step in linking design and manufacturing. In this paper, we present a MCC methodology that provides models of fracture count of standard cells and wire patterns for use in design optimization. MCC cannot be performed by designers as they do not have access to foundry OPC recipes and RET tools. To build a fracture count model, we perform OPC and fracturing on a limited set of standard cells and wire configurations with all tolerance combinations. Separately, we identify the characteristics of the layout that impact fracture count. Based on the fracture count (FC) data

  2. Relevance of information warfare models to critical infrastructure

    African Journals Online (AJOL)

    ismith

    Critical infrastructure models, strategies and policies should take information ... gain an advantage over a competitor or adversary through the use of one's own .... digital communications system, where the vehicles are analogous to bits or packets, ..... performance degraded, causing an increase in traffic finding a new route.

  3. Automatic Relevance Determination for multi-way models

    DEFF Research Database (Denmark)

    Mørup, Morten; Hansen, Lars Kai

    2009-01-01

    of components of data within the Tucker and CP structure. For the Tucker and CP model the approach performs better than heuristics such as the Bayesian Information Criterion, Akaikes Information Criterion, DIFFIT and the numerical convex hull (NumConvHull) while operating only at the cost of estimating...... is available for download at www.erpwavelab.org....

  4. Sutherland models for complex reflection groups

    International Nuclear Information System (INIS)

    Crampe, N.; Young, C.A.S.

    2008-01-01

There are known to be integrable Sutherland models associated to every real root system, or, which is almost equivalent, to every real reflection group. Real reflection groups are special cases of complex reflection groups. In this paper we associate certain integrable Sutherland models to the classical family of complex reflection groups. Internal degrees of freedom are introduced, defining dynamical spin chains, and the freezing limit is taken to obtain static chains of Haldane-Shastry type. By considering the relation of these models to the usual BC_N case, we are led to systems with both real and complex reflection groups as symmetries. We demonstrate their integrability by means of new Dunkl operators, associated to wreath products of dihedral groups

  5. Canine intrahepatic vasculature: is a functional anatomic model relevant to the dog?

    Science.gov (United States)

    Hall, Jon L; Mannion, Paddy; Ladlow, Jane F

    2015-01-01

    To clarify canine intrahepatic portal and hepatic venous system anatomy using corrosion casting and advanced imaging and to devise a novel functional anatomic model of the canine liver to investigate whether this could help guide the planning and surgical procedure of partial hepatic lobectomy and interventional radiological procedures. Prospective experimental study. Adult Greyhound cadavers (n = 8). Portal and hepatic vein corrosion casts of healthy livers were assessed using computed tomography (CT). The hepatic lobes have a consistent hilar hepatic and portal vein supply with some variation in the number of intrahepatic branches. For all specimens, 3 surgically resectable areas were identified in the left lateral lobe and 2 surgically resectable areas were identified in the right medial lobe as defined by a functional anatomic model. CT of detailed acrylic casts allowed complex intrahepatic vascular relationships to be investigated and compared with previous studies. Improving understanding of the intrahepatic vascular supply facilitates interpretation of advanced images in clinical patients, the planning and performance of surgical procedures, and may facilitate interventional vascular procedures, such as intravenous embolization of portosystemic shunts. Functional division of the canine liver similar to human models is possible. The left lateral and right medial lobes can be consistently divided into surgically resectable functional areas and partial lobectomies can be performed following a functional model; further study in clinically affected animals would be required to investigate the relevance of this functional model in the dog. © Copyright 2014 by The American College of Veterinary Surgeons.

  6. Deciphering the complexity of acute inflammation using mathematical models.

    Science.gov (United States)

    Vodovotz, Yoram

    2006-01-01

    Various stresses elicit an acute, complex inflammatory response, leading to healing but sometimes also to organ dysfunction and death. We constructed both equation-based models (EBM) and agent-based models (ABM) of various degrees of granularity--which encompass the dynamics of relevant cells, cytokines, and the resulting global tissue dysfunction--in order to begin to unravel these inflammatory interactions. The EBMs describe and predict various features of septic shock and trauma/hemorrhage (including the response to anthrax, preconditioning phenomena, and irreversible hemorrhage) and were used to simulate anti-inflammatory strategies in clinical trials. The ABMs that describe the interrelationship between inflammation and wound healing yielded insights into intestinal healing in necrotizing enterocolitis, vocal fold healing during phonotrauma, and skin healing in the setting of diabetic foot ulcers. Modeling may help in understanding the complex interactions among the components of inflammation and response to stress, and therefore aid in the development of novel therapies and diagnostics.
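An equation-based model (EBM) of the kind this abstract describes can be illustrated with a deliberately minimal two-variable sketch: a pathogen population driving an inflammatory response that in turn clears the pathogen. This is a generic textbook-style caricature, not the authors' model; the function name, equations and parameter values are all illustrative assumptions.

```python
def simulate_inflammation(p0=0.1, n0=0.0, dt=0.01, t_end=50.0,
                          k_p=1.0, k_pn=2.0, k_np=1.5, k_n=0.5):
    """Euler integration of a minimal acute-inflammation EBM:
    pathogen P grows logistically and is cleared by the inflammatory
    response N; N is activated by P and decays on its own."""
    p, n = p0, n0
    traj = [(0.0, p, n)]
    steps = round(t_end / dt)
    for i in range(1, steps + 1):
        dp = k_p * p * (1 - p) - k_pn * p * n            # growth - clearance
        dn = k_np * p * n / (1 + p) + 0.1 * p - k_n * n  # activation - decay
        p = max(p + dt * dp, 0.0)  # clamp: populations stay non-negative
        n = max(n + dt * dn, 0.0)
        traj.append((i * dt, p, n))
    return traj
```

Even this caricature reproduces the qualitative behaviors the abstract alludes to: depending on the parameters, the response either resolves the insult (healing) or settles into a persistently elevated state (sustained inflammation).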

  7. Deterministic ripple-spreading model for complex networks.

    Science.gov (United States)

    Hu, Xiao-Bing; Wang, Ming; Leeson, Mark S; Hines, Evor L; Di Paolo, Ezequiel

    2011-04-01

    This paper proposes a deterministic complex network model, which is inspired by the natural ripple-spreading phenomenon. The motivations and main advantages of the model are the following: (i) The establishment of many real-world networks is a dynamic process, where it is often observed that the influence of a few local events spreads out through nodes, and then largely determines the final network topology. Obviously, this dynamic process involves many spatial and temporal factors. By simulating the natural ripple-spreading process, this paper reports a very natural way to set up a spatial and temporal model for such complex networks. (ii) Existing relevant network models are all stochastic models, i.e., with a given input, they cannot output a unique topology. Differently, the proposed ripple-spreading model can uniquely determine the final network topology, and at the same time, the stochastic feature of complex networks is captured by randomly initializing ripple-spreading related parameters. (iii) The proposed model can use an easily manageable number of ripple-spreading related parameters to precisely describe a network topology, which is more memory efficient when compared with traditional adjacency matrix or similar memory-expensive data structures. (iv) The ripple-spreading model has a very good potential for both extensions and applications.
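The determinism claimed in point (ii) can be illustrated with a drastically simplified sketch: the paper simulates ripples spreading over time, whereas the version below collapses that process into a closed-form amplitude test (ripple energy falling off with travelled distance). The function name, the decay law and all parameters are illustrative assumptions, not the authors' formulation.

```python
import math

def ripple_network(points, energies, decay=1.0, threshold=0.2):
    """Deterministic toy ripple-spreading graph: node i's ripple expands
    outward and loses energy with distance; an undirected edge i-j is
    created if the ripple still carries at least `threshold` energy when
    it reaches node j. The same inputs always yield the same topology."""
    edges = set()
    for i, (xi, yi) in enumerate(points):
        for j, (xj, yj) in enumerate(points):
            if i == j:
                continue
            r = math.hypot(xi - xj, yi - yj)       # distance travelled
            if energies[i] / (1.0 + decay * r) >= threshold:
                edges.add(tuple(sorted((i, j))))   # undirected edge
    return edges
```

As in the paper's model, the mapping from parameters to topology is unique; the stochastic character of real networks would enter only through random initialization of the node positions and ripple energies.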

  8. Minimum-complexity helicopter simulation math model

    Science.gov (United States)

    Heffley, Robert K.; Mnich, Marc A.

    1988-01-01

An example of a minimal complexity simulation helicopter math model is presented. Motivating factors are the computational delays, cost, and inflexibility of the very sophisticated math models now in common use. A helicopter model form is given which addresses each of these factors and provides better engineering understanding of the specific handling qualities features which are apparent to the simulator pilot. The technical approach begins with specification of features which are to be modeled, followed by a build up of individual vehicle components and definition of equations. Model matching and estimation procedures are given which enable the modeling of specific helicopters from basic data sources such as flight manuals. Checkout procedures are given which provide for total model validation. A number of possible model extensions and refinements are discussed. Math model computer programs are defined and listed.

  9. The hamster flank organ model: Is it relevant to man?

    International Nuclear Information System (INIS)

    Franz, T.J.; Lehman, P.A.; Pochi, P.; Odland, G.F.; Olerud, J.

    1989-01-01

The critical role that androgens play in the etiology of acne has led to a search for topically active antiandrogens and the frequent use of the flank organ of the golden Syrian hamster as an animal model. 17-alpha-propyltestosterone (17-PT) has been identified as having potent antiandrogenic activity in the hamster model, and this report describes its clinical evaluation. Two double-blind placebo-controlled studies comparing 4% 17-PT in 80% alcohol versus vehicle alone were conducted. One study examined 17-PT sebosuppressive activity in 20 subjects. The second study examined its efficacy in 44 subjects having mild to moderate acne. A third study measured percutaneous absorption of 17-PT through hamster flank and monkey skin in vitro, and human face skin in vivo, using radioactive drug. 17-PT was found to be ineffective in reducing either the sebum excretion rate or the number of inflammatory acne lesions. Failure of 17-PT to show clinical activity was not a result of poor percutaneous absorption. Total absorption in man was 7.7% of the dose and only 1.0% in the hamster. The sebaceous gland of the hamster flank organ is apparently more sensitive to antiandrogens than the human sebaceous gland.

  10. Determinants of dermal exposure relevant for exposure modelling in regulatory risk assessment.

    Science.gov (United States)

    Marquart, J; Brouwer, D H; Gijsbers, J H J; Links, I H M; Warren, N; van Hemmen, J J

    2003-11-01

    Risk assessment of chemicals requires assessment of the exposure levels of workers. In the absence of adequate specific measured data, models are often used to estimate exposure levels. For dermal exposure only a few models exist, none of which has been validated externally. In the scope of a large European research programme, an analysis of potential dermal exposure determinants was made, based on the available studies and models and on the expert judgement of the authors of this publication. Only a few potential determinants appear to have been studied in depth. Several studies have included clusters of determinants in vaguely defined parameters, such as 'task' or 'cleaning and maintenance of clothing'. Other studies include several highly correlated parameters, such as 'amount of product handled', 'duration of task' and 'area treated', and separation of these parameters to study their individual influence is not possible. However, based on the available information, a number of determinants could clearly be defined as proven or highly plausible determinants of dermal exposure in one or more exposure situations. This information was combined with expert judgement on the scientific plausibility of the influence of parameters that have not been extensively studied, and on the possibilities to gather relevant information during a risk assessment process. The result of this effort is a list of determinants relevant for dermal exposure models in the scope of regulatory risk assessment. The determinants have been divided into the major categories 'substance and product characteristics', 'task done by the worker', 'process technique and equipment', 'exposure control measures', 'worker characteristics and habits' and 'area and situation'. To account for the complex nature of the dermal exposure processes, a further subdivision was made into the three major processes 'direct contact', 'surface contact' and 'deposition'.

  11. RF modeling of the ITER-relevant lower hybrid antenna

    International Nuclear Information System (INIS)

    Hillairet, J.; Ceccuzzi, S.; Belo, J.; Marfisi, L.; Artaud, J.F.; Bae, Y.S.; Berger-By, G.; Bernard, J.M.; Cara, Ph.; Cardinali, A.; Castaldo, C.; Cesario, R.; Decker, J.; Delpech, L.; Ekedahl, A.; Garcia, J.; Garibaldi, P.; Goniche, M.; Guilhem, D.; Hoang, G.T.

    2011-01-01

    In the framework of the EFDA task HCD-08-03-01, a 5 GHz Lower Hybrid system intended to deliver 20 MW CW on ITER and sustain the expected high heat fluxes has been reviewed. The design and overall dimensions of the key RF elements of the launcher and its subsystems have been updated from the 2001 design in collaboration with the ITER Organization. Modeling of LH wave propagation and absorption in the plasma shows that the optimal parallel index must be chosen between 1.9 and 2.0 for the ITER steady-state scenario. The present study has been made with n|| = 2.0 but can be adapted for n|| = 1.9. Individual components have been studied separately, giving confidence in the global RF design of the whole antenna.

  12. Models relevant to radiation effects on stem cell pools

    Energy Technology Data Exchange (ETDEWEB)

    Lajtha, L G

    1971-04-01

    The available evidence clearly indicates the existence of a pluripotential primitive stem cell population (CFU). In the normal animal a large proportion of this population will 'sit' in the G₀ state. At any time a small proportion of these cells will differentiate into one or more 'precursor' populations. One such precursor population is the erythropoietin-responsive cell (ERC), and it is important to realize that this population has a significant number of mitoses in it, i.e. it is capable of considerable, although not indefinite, proliferation. This is indicated by the colony growth during which, from a single colony former, large numbers of differentiated cells can be formed, in excess of several millions, at a time when the CFU numbers in the colony are still very small. To understand the effects of radiation on this complex series of populations, the following must be borne in mind. The ultimate stem cells are, at least in the small rodent, the CFU. It is their quantity that is essential for regeneration. However, the normal rate of differentiation of CFU in the small rodent is very low. This low rate of differentiation gives rise to committed precursor cell populations, such as the ERC or possibly the agar-forming 'GRC', precursor populations with considerable proliferative potential in them. Therefore, after relatively small doses of radiation, or even during a regime of continuous irradiation, the proliferative potential of these transit precursor populations will be able to cope with near-normal haemopoiesis at a time when the number of colony formers is gradually depleted. It is also significant that the rate of seeding of the primitive colony former is not related to its numbers and, therefore, under conditions of partial irradiation, at least in the small rodent, greatly increased relative migration and to some extent increased absolute migration of the CFU is possible, with consequent enhancement of the

  13. From research excellence to brand relevance: A model for higher education reputation building

    Directory of Open Access Journals (Sweden)

    Nina Overton-de Klerk

    2016-05-01

    Full Text Available In this article we propose a novel approach to reputation development at higher education institutions. Global reputation development at higher education institutions is largely driven by research excellence, is predominantly measured by research output, and is predominantly reflected in hierarchical university rankings. The ranking becomes equated with brand equity. We argue that the current approach to reputation development in higher education institutions is modernist and linear. This is strangely out-of-kilter with the complexities of a transforming society in flux, the demands of a diversity of stakeholders, and the drive towards transdisciplinarity, laterality, reflexivity and relevance in science. Good research clearly remains an important ingredient of a university's brand value. However, a case can be made for brand relevance, co-created in collaboration with stakeholders, as an alternative and non-linear way of differentiation. This approach is appropriate in light of challenges in strategic science globally as well as trends and shifts in the emerging paradigm of strategic communication. In applying strategic communication principles to current trends and issues in strategic science and the communication thereof, an alternative model for strategic reputation building at higher education institutions is developed.

  14. Geometric Modelling with α-Complexes

    NARCIS (Netherlands)

    Gerritsen, B.H.M.; Werff, K. van der; Veltkamp, R.C.

    2001-01-01

    The shape of real objects can be so complicated that only a sampled data point set can accurately represent them. Analytic descriptions are too complicated or impossible. Natural objects, for example, can be vague and rough, with many holes. For this kind of modelling, α-complexes offer advantages

  15. A cognitive model for software architecture complexity

    NARCIS (Netherlands)

    Bouwers, E.; Lilienthal, C.; Visser, J.; Van Deursen, A.

    2010-01-01

    Evaluating the complexity of the architecture of a softwaresystem is a difficult task. Many aspects have to be considered to come to a balanced assessment. Several architecture evaluation methods have been proposed, but very few define a quality model to be used during the evaluation process. In

  16. Spectroscopic investigation of complexation of Cm(III) and Eu(III) with partitioning-relevant N-donor ligands

    International Nuclear Information System (INIS)

    Bremer, Antje

    2014-01-01

    The separation of trivalent actinides and lanthanides is an essential part of the development of improved nuclear fuel cycles. Liquid-liquid extraction is an applicable technique to achieve this separation. Due to the chemical similarity and the almost identical ionic radii of trivalent actinides and lanthanides, this separation is, however, only feasible with highly selective extracting agents. It has been proven that molecules with soft sulphur or nitrogen donor atoms have a higher affinity for trivalent actinides. In the present work, the complexation of Cm(III) and Eu(III) with N-donor ligands relevant for partitioning has been studied by time-resolved laser fluorescence spectroscopy (TRLFS). This work aims at a better understanding of the molecular origin of the selectivity of these ligands. In this context, enormous effort has been and is still put into detailed investigations on BTP and BTBP ligands, which are the most successful N-donor ligands for the selective extraction of trivalent actinides to date. Additionally, the complexation and extraction behavior of molecules which are structurally related to these ligands is studied. The ligand C5-BPP (2,6-bis(5-(2,2-dimethylpropyl)-1H-pyrazol-3-yl)pyridine), in which the triazine rings of the aromatic backbone of the BTP ligands have been replaced by pyrazole rings, is one of these molecules. Laser fluorescence spectroscopic investigation of the complexation of Cm(III) with this ligand revealed stepwise formation of three (Cm(C5-BPP)ₙ)³⁺ complexes (n = 1–3). The stability constant of the 1:3 complex was determined (log β₃ = 14.8 ± 0.4). Extraction experiments have shown that, in contrast to BTP and BTBP ligands, C5-BPP needs an additional lipophilic anion source, such as a 2-bromocarboxylic acid, to selectively extract trivalent actinides from nitric acid solutions. The comparison of the stability constant of the (Cm(C5-BPP)₃)³⁺ complex with the stability constant of the (Cm(nPr-BTP)₃)³⁺ complex
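    For reference, stability constants such as the log β₃ quoted in this record follow the standard solution-chemistry definition for stepwise formation of a 1:n metal-ligand complex (generic notation below, not reproduced from the thesis):

```latex
% Overall stability constant beta_n for formation of the ML_n complex
\mathrm{M} + n\,\mathrm{L} \;\rightleftharpoons\; \mathrm{ML}_n,
\qquad
\beta_n = \frac{[\mathrm{ML}_n]}{[\mathrm{M}][\mathrm{L}]^n},
\qquad
\log\beta_3 = 14.8 \pm 0.4
\ \ \text{for } \mathrm{M}=\mathrm{Cm}^{3+},\ \mathrm{L}=\text{C5-BPP}
```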

  17. Comparing flood loss models of different complexity

    Science.gov (United States)

    Schröter, Kai; Kreibich, Heidi; Vogel, Kristin; Riggelsen, Carsten; Scherbaum, Frank; Merz, Bruno

    2013-04-01

    Any deliberation on flood risk requires the consideration of potential flood losses. In particular, reliable flood loss models are needed to evaluate the cost-effectiveness of mitigation measures, to assess vulnerability, and for comparative risk analysis and financial appraisal during and after floods. In recent years, considerable improvements have been made both in the data basis and in the methodological approaches used for the development of flood loss models. Despite this, flood loss models remain an important source of uncertainty. Likewise, the temporal and spatial transferability of flood loss models is still limited. This contribution investigates the predictive capability of different flood loss models in a split-sample, cross-regional validation approach. For this purpose, flood loss models of different complexity, i.e. based on different numbers of explanatory variables, are learned from a set of damage records obtained from a survey after the Elbe flood in 2002. The validation of model predictions is carried out for different flood events in the Elbe and Danube river basins in 2002, 2005 and 2006, for which damage records are available from surveys after the flood events. The models investigated are a stage-damage model, the rule-based model FLEMOps+r, as well as novel model approaches derived using the data mining techniques of regression trees and Bayesian networks. The Bayesian network approach to flood loss modelling provides attractive additional information concerning the probability distribution of both model predictions and explanatory variables.
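    The idea of comparing loss models of different complexity on held-out records can be illustrated with a toy sketch. Everything below is invented for illustration (synthetic data, made-up coefficients, least-squares fits through the origin) and is far simpler than the stage-damage and data-mining models the abstract names; it only shows the split-sample validation pattern: fit on one part of the records, score by RMSE on the rest.

```python
# Toy split-sample comparison of two flood loss models of different
# complexity on synthetic damage records (depth in m, area in m^2,
# loss in arbitrary units). All numbers are invented for illustration.
import random

random.seed(42)

def true_loss(depth, area):
    # synthetic "ground truth": loss grows with depth and building area
    return 0.1 * depth * area + random.gauss(0, 5)

pairs = [(random.uniform(0.2, 3.0), random.uniform(50, 300)) for _ in range(200)]
data = [(d, a, true_loss(d, a)) for d, a in pairs]
train, holdout = data[:150], data[150:]   # split-sample validation

def fit_depth_only(train_set):
    # one-variable "stage-damage" style model: loss = c * depth
    c = sum(d * y for d, _, y in train_set) / sum(d * d for d, _, _ in train_set)
    return lambda d, a: c * d

def fit_depth_area(train_set):
    # two-variable model: loss = c * depth * area
    c = sum(d * a * y for d, a, y in train_set) / sum((d * a) ** 2 for d, a, _ in train_set)
    return lambda d, a: c * d * a

def rmse(model, records_):
    return (sum((model(d, a) - y) ** 2 for d, a, y in records_) / len(records_)) ** 0.5

simple_model = fit_depth_only(train)
richer_model = fit_depth_area(train)
print(rmse(simple_model, holdout), rmse(richer_model, holdout))
```

On this synthetic data the richer model wins because it matches the generative form; the abstract's point is that such a gap must be demonstrated on independent flood events, not assumed.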

  18. Complex scaling in the cluster model

    International Nuclear Information System (INIS)

    Kruppa, A.T.; Lovas, R.G.; Gyarmati, B.

    1987-01-01

    To find the positions and widths of resonances, a complex scaling of the intercluster relative coordinate is introduced into the resonating-group model. In the generator-coordinate technique used to solve the resonating-group equation, the complex scaling requires only minor changes in the formulae and code. Finding the resonances does not need any preliminary guess or explicit reference to any asymptotic prescription. The procedure is applied to the resonances in the relative motion of two ground-state α clusters in ⁸Be, but is appropriate for any system consisting of two clusters. (author) 23 refs.; 5 figs
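    As a reminder of the technique named here (the standard complex-scaling relations, not equations taken from the paper): rotating the relative coordinate by a complex phase turns a resonance into an isolated, θ-independent complex eigenvalue of the scaled Hamiltonian.

```latex
% Complex scaling of the intercluster coordinate and the resulting
% resonance eigenvalue (E_r: resonance position, Gamma: width)
r \;\to\; r\,e^{i\theta},
\qquad
H(\theta) = e^{-2i\theta}\,T + V\!\left(r\,e^{i\theta}\right),
\qquad
H(\theta)\,\psi = \left(E_r - \tfrac{i\Gamma}{2}\right)\psi
```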

  19. Modeling of anaerobic digestion of complex substrates

    International Nuclear Information System (INIS)

    Keshtkar, A. R.; Abolhamd, G.; Meyssami, B.; Ghaforian, H.

    2003-01-01

    A structured mathematical model of anaerobic conversion of complex organic materials in non-ideally mixed cyclic-batch reactors for biogas production has been developed. The model is based on multiple-reaction stoichiometry (enzymatic hydrolysis, acidogenesis, acetogenesis and methanogenesis), microbial growth kinetics, conventional material balances in the liquid and gas phases for a cyclic-batch reactor, liquid-gas interactions, liquid-phase equilibrium reactions and a simple mixing model which considers the reactor volume in two separate sections: the flow-through and the retention regions. The dynamic model describes the effects of the reactants' distribution resulting from the mixing conditions, the time interval of feeding, the hydraulic retention time and the mixing parameters on the process performance. The model is applied in the simulation of anaerobic digestion of cattle manure under different operating conditions. The model is compared with experimental data and good correlations are obtained.
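    The microbial growth kinetics mentioned in this record are typically Monod-type. As a minimal sketch only (a single substrate-biomass pair with invented parameters, integrated with explicit Euler; the paper's structured model couples many such reactions with gas-phase balances and a mixing model), the basic building block looks like:

```python
# Minimal Monod growth/consumption step, the kinetic building block of
# anaerobic digestion models. All parameter values are assumed, not
# taken from the paper.

MU_MAX = 0.4   # maximum specific growth rate (1/d), assumed
KS = 0.5       # half-saturation constant (g/L), assumed
Y = 0.1        # biomass yield (g biomass / g substrate), assumed

def simulate(s0=10.0, x0=0.1, dt=0.01, days=30.0):
    """Euler-integrate dX/dt = mu(S) X and dS/dt = -mu(S) X / Y."""
    s, x = s0, x0
    for _ in range(int(days / dt)):
        mu = MU_MAX * s / (KS + s)   # Monod specific growth rate
        ds = -mu * x / Y * dt        # substrate consumed
        dx = mu * x * dt             # biomass grown
        s = max(s + ds, 0.0)
        x += dx
    return s, x

s_end, x_end = simulate()
print(s_end, x_end)  # substrate nearly depleted, biomass grown
```

The exact mass balance X_end = X_0 + Y(S_0 - S_end) holds in this discretization, which is a useful check when such terms are embedded in larger reactor models.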

  20. The temporal-relevance temporal-uncertainty model of prospective duration judgment.

    Science.gov (United States)

    Zakay, Dan

    2015-12-15

    A model aimed at explaining prospective duration judgments in real-life settings (as well as in the laboratory) is presented. The model is based on the assumption that situational meaning is continuously being extracted by humans' perceptual and cognitive information processing systems. Time is one of the important dimensions of situational meaning. Based on the situational meaning, a value for Temporal Relevance is set. Temporal Relevance reflects the importance of temporal aspects for enabling adaptive behavior at a specific moment in time. When Temporal Relevance is above a certain threshold, a prospective duration judgment process is evoked automatically. In addition, a search for relevant temporal information takes place, and its outcomes determine the level of Temporal Uncertainty, which reflects the degree of knowledge one has regarding temporal aspects of the task to be performed. The levels of Temporal Relevance and Temporal Uncertainty determine the amount of attentional resources allocated for timing by the executive system. The merit of the model is in connecting timing processes with the ongoing general information processing stream. The model rests on findings in various domains which indicate that cognitive relevance and self-relevance are powerful determinants of resource allocation policy. The feasibility of the model is demonstrated by analyzing various temporal phenomena. Suggestions for further empirical validation of the model are presented. Copyright © 2015 Elsevier Inc. All rights reserved.

  1. Relevant teaching in higher education: an exercise from complexity theory in the social work profession

    Directory of Open Access Journals (Sweden)

    Maribel Molina Correa

    2013-12-01

    Full Text Available The demands of our globalized world and the advancement of the science of teaching present didactics as a fundamental category, defined as the scientific discipline with principles, laws, and theoretical and methodological frameworks that creatively model pedagogical intervention in the academic environment. The implementation of the research "Teaching focused on the development of higher-order thinking and meaningful learning in students of the first semester of the Social Work Program" set the goal of qualifying the personal and student life projects from the acknowledgement of the potentials of the subjects, for the development of competences meaningful to life. This research experience has been developed since 2009 at Simon Bolivar University in the District of Barranquilla. The didactics was based on the development of higher-order thinking centred on cognitive processes: the processing of information, creativity, readings of the reality of contexts, the voiced subjectivities of the students' life projects, and the incorporation of ICT, in order to approach a humanizing and contextualized pedagogical practice. Critical theory formed part of the epistemological basis of this research for understanding and building a new academic scenario. The methodology used is action research, with techniques such as mind mapping, dialogues, life stories, field work, and content analysis, among others. The data analysis was guided by hermeneutics as a possibility for understanding and interpreting the events that occurred in the classroom.

  2. A Practical Philosophy of Complex Climate Modelling

    Science.gov (United States)

    Schmidt, Gavin A.; Sherwood, Steven

    2014-01-01

    We give an overview of the practice of developing and using complex climate models, as seen from experiences in a major climate modelling center and through participation in the Coupled Model Intercomparison Project (CMIP). We discuss the construction and calibration of models; their evaluation, especially through use of out-of-sample tests; and their exploitation in multi-model ensembles to identify biases and make predictions. We stress that the adequacy or utility of climate models is best assessed via their skill against more naive predictions. The framework we use for making inferences about reality using simulations is naturally Bayesian (in an informal sense), and has many points of contact with more familiar examples of scientific epistemology. While the use of complex simulations in science is a development that changes much in how science is done in practice, we argue that the concepts being applied fit very much into traditional practices of the scientific method, albeit those more often associated with laboratory work.

  3. Intrinsic Uncertainties in Modeling Complex Systems.

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, Curtis S; Bramson, Aaron L.; Ames, Arlo L.

    2014-09-01

    Models are built to understand and predict the behaviors of both natural and artificial systems. Because it is always necessary to abstract away aspects of any non-trivial system being modeled, we know models can potentially leave out important, even critical elements. This reality of the modeling enterprise forces us to consider the prospective impacts of those effects completely left out of a model, whether intentionally or unconsidered. Insensitivity to new structure is an indication of diminishing returns. In this work, we represent a hypothetical unknown effect on a validated model as a finite perturbation whose amplitude is constrained within a control region. We find robustly that without further constraints, no meaningful bounds can be placed on the amplitude of a perturbation outside of the control region. Thus, forecasting into unsampled regions is a very risky proposition. We also present inherent difficulties with proper time discretization of models and with representing inherently discrete quantities. We point out potentially worrisome uncertainties, arising from mathematical formulation alone, which modelers can inadvertently introduce into models of complex systems. Acknowledgements This work has been funded under early-career LDRD project #170979, entitled "Quantifying Confidence in Complex Systems Models Having Structural Uncertainties", which ran from 04/2013 to 09/2014. We wish to express our gratitude to the many researchers at Sandia who contributed ideas to this work, as well as feedback on the manuscript. In particular, we would like to mention George Barr, Alexander Outkin, Walt Beyeler, Eric Vugrin, and Laura Swiler for providing invaluable advice and guidance through the course of the project. We would also like to thank Steven Kleban, Amanda Gonzales, Trevor Manzanares, and Sarah Burwell for their assistance in managing project tasks and resources.

  4. Building Better Ecological Machines: Complexity Theory and Alternative Economic Models

    Directory of Open Access Journals (Sweden)

    Jess Bier

    2016-12-01

    Full Text Available Computer models of the economy are regularly used to predict economic phenomena and set financial policy. However, the conventional macroeconomic models are currently being reimagined after they failed to foresee the current economic crisis, the outlines of which began to be understood only in 2007-2008. In this article we analyze the most prominent of these reimaginings: agent-based models (ABMs). ABMs are an influential alternative to standard economic models, and they are one focus of complexity theory, a discipline that is a more open successor to the conventional chaos and fractal modeling of the 1990s. The modelers who create ABMs claim that their models depict markets as ecologies, and that they are more responsive than conventional models that depict markets as machines. We challenge this presentation, arguing instead that recent modeling efforts amount to the creation of models as ecological machines. Our paper aims to contribute to an understanding of the organizing metaphors of macroeconomic models, which we argue is relevant conceptually and politically, e.g., when models are used for regulatory purposes.

  5. Different Epidemic Models on Complex Networks

    International Nuclear Information System (INIS)

    Zhang Haifeng; Small, Michael; Fu Xinchu

    2009-01-01

    Models for disease spreading are not limited to SIS or SIR. For instance, for the spreading of AIDS/HIV, the susceptible individuals can be classified into different cases according to their immunity and, similarly, the infected individuals can be sorted into different classes according to their infectivity. Moreover, some diseases may develop through several stages. Many authors have shown that the relations among individuals can be viewed as a complex network. So in this paper, in order to better explain the dynamical behavior of epidemics, we consider different epidemic models on complex networks and obtain the epidemic threshold for each case. Finally, we present numerical simulations for each case to verify our results.
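    The epidemic thresholds obtained in papers of this kind are network-dependent. As a concrete sketch (the standard degree-based mean-field result for SIS on an uncorrelated network, not a formula taken from this particular paper), the threshold λ_c = ⟨k⟩/⟨k²⟩ can be computed directly from a degree sequence, showing how hubs lower it:

```python
# Degree-based mean-field SIS epidemic threshold on an uncorrelated
# complex network: lambda_c = <k> / <k^2>. Degree sequences are
# invented for illustration.

def sis_threshold(degrees):
    """Return the mean-field SIS epidemic threshold <k>/<k^2>."""
    n = len(degrees)
    k_mean = sum(degrees) / n
    k2_mean = sum(d * d for d in degrees) / n
    return k_mean / k2_mean

homogeneous = [4] * 100                 # every node has degree 4
heterogeneous = [1] * 90 + [40] * 10    # a few highly connected hubs

print(sis_threshold(homogeneous))    # 4/16 = 0.25
print(sis_threshold(heterogeneous))  # hubs push the threshold far down
```

For the homogeneous network the threshold is ⟨k⟩/⟨k⟩² = 1/⟨k⟩; the heavy-tailed sequence drives ⟨k²⟩ up and the threshold toward zero, the qualitative point behind threshold results on complex networks.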

  6. A User-Centered Approach to Adaptive Hypertext Based on an Information Relevance Model

    Science.gov (United States)

    Mathe, Nathalie; Chen, James

    1994-01-01

    Rapid and effective access to information in large electronic documentation systems can be facilitated if information relevant in an individual user's context can be automatically supplied to that user. However, most of this knowledge on contextual relevance is not found within the contents of documents; it is rather established incrementally by users during information access. We propose a new model for interactively learning contextual relevance during information retrieval, and incrementally adapting retrieved information to individual user profiles. The model, called a relevance network, records the relevance of references based on user feedback for specific queries and user profiles. It also generalizes such knowledge to later derive relevant references for similar queries and profiles. The relevance network lets users filter information by context of relevance. Compared to other approaches, it does not require any prior knowledge or training. More importantly, our approach to adaptivity is user-centered. It facilitates acceptance and understanding by users by giving them shared control over the adaptation without disturbing their primary task. Users easily control when to adapt and when to use the adapted system. Lastly, the model is independent of the particular application used to access information, and supports sharing of adaptations among users.
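    The core bookkeeping of such a relevance network can be sketched as feedback-driven scores kept per (query, profile) context and reused for ranking. The class and method names below are invented for illustration, and the generalization to similar queries and profiles that the paper describes is deliberately omitted:

```python
# Hypothetical sketch of relevance-network bookkeeping: user feedback
# per (query, profile) context updates reference scores, which later
# rank retrieved candidates. Names are illustrative, not from the paper.
from collections import defaultdict

class RelevanceNetwork:
    def __init__(self):
        # (query, profile) -> reference -> accumulated relevance score
        self.scores = defaultdict(lambda: defaultdict(float))

    def feedback(self, query, profile, reference, relevant):
        """Record one piece of user feedback (+1 relevant, -1 not)."""
        self.scores[(query, profile)][reference] += 1.0 if relevant else -1.0

    def rank(self, query, profile, candidates):
        """Order candidate references by learned contextual relevance."""
        ctx = self.scores[(query, profile)]
        return sorted(candidates, key=lambda r: ctx[r], reverse=True)

net = RelevanceNetwork()
net.feedback("engine start", "pilot", "doc_checklist", True)
net.feedback("engine start", "pilot", "doc_history", False)
print(net.rank("engine start", "pilot", ["doc_history", "doc_checklist"]))
```

Because scores start at zero and only feedback moves them, the model needs no prior knowledge or training, matching the property the abstract emphasizes.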

  7. FRAM Modelling Complex Socio-technical Systems

    CERN Document Server

    Hollnagel, Erik

    2012-01-01

    There has not yet been a comprehensive method that goes behind 'human error' and beyond the failure concept, and various complicated accidents have accentuated the need for one. The Functional Resonance Analysis Method (FRAM) fulfils that need. This book presents a detailed and tested method that can be used to model how complex and dynamic socio-technical systems work, and to understand not only why things sometimes go wrong but also why they normally succeed.

  8. Complex Constructivism: A Theoretical Model of Complexity and Cognition

    Science.gov (United States)

    Doolittle, Peter E.

    2014-01-01

    Education has long been driven by its metaphors for teaching and learning. These metaphors have influenced both educational research and educational practice. Complexity and constructivism are two theories that provide functional and robust metaphors. Complexity provides a metaphor for the structure of myriad phenomena, while constructivism…

  9. Present status on atomic and molecular data relevant to fusion plasma diagnostics and modeling

    International Nuclear Information System (INIS)

    Tawara, H.

    1997-01-01

    This issue is a collection of papers presenting the status of atomic and molecular data relevant to fusion plasma diagnostics and modeling. Ten of the presented papers are indexed individually. (J.P.N.)

  10. Complex networks under dynamic repair model

    Science.gov (United States)

    Chaoqi, Fu; Ying, Wang; Kun, Zhao; Yangjun, Gao

    2018-01-01

    Invulnerability is not the only factor of importance when considering complex networks' security. It is also critical to have an effective and reasonable repair strategy. Existing research on network repair is confined to the static model. The dynamic model makes better use of the redundant capacity of repaired nodes and repairs the damaged network more efficiently than the static model; however, the dynamic repair model is complex and polytropic. In this paper, we construct a dynamic repair model and systematically describe the energy-transfer relationships between nodes in the repair process of the failure network. Nodes are divided into three types, corresponding to three structures. We find that the strong coupling structure is responsible for secondary failure of the repaired nodes and propose an algorithm that can select the most suitable targets (nodes or links) to repair the failure network with minimal cost. Two types of repair strategies are identified, with different effects under the two energy-transfer rules. The research results enable a more flexible approach to network repair.

  11. From complex to simple: interdisciplinary stochastic models

    International Nuclear Information System (INIS)

    Mazilu, D A; Zamora, G; Mazilu, I

    2012-01-01

    We present two simple, one-dimensional, stochastic models that lead to a qualitative understanding of very complex systems from biology, nanoscience and social sciences. The first model explains the complicated dynamics of microtubules, stochastic cellular highways. Using the theory of random walks in one dimension, we find analytical expressions for certain physical quantities, such as the time dependence of the length of the microtubules, and diffusion coefficients. The second one is a stochastic adsorption model with applications in surface deposition, epidemics and voter systems. We introduce the ‘empty interval method’ and show sample calculations for the time-dependent particle density. These models can serve as an introduction to the field of non-equilibrium statistical physics, and can also be used as a pedagogical tool to exemplify standard statistical physics concepts, such as random walks or the kinetic approach of the master equation. (paper)
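    The first model described above treats microtubule length as a one-dimensional random walk. A toy version (rates and step counts invented; the paper derives analytical expressions rather than simulating) makes the drift of the mean length easy to see:

```python
# Toy sketch: microtubule length as a biased one-dimensional random
# walk with a reflecting boundary at zero. Growth probability and step
# counts are invented for illustration.
import random

random.seed(1)

def walk_length(p_grow=0.6, steps=1000):
    """Length after `steps` growth/shrink events; length stays >= 0."""
    length = 0
    for _ in range(steps):
        if random.random() < p_grow:
            length += 1
        elif length > 0:
            length -= 1
    return length

lengths = [walk_length() for _ in range(200)]
mean_len = sum(lengths) / len(lengths)
print(mean_len)  # for p_grow > 0.5 the mean drifts like (2p - 1) * steps
```

The simulated mean tracks the analytical drift (2p − 1) per step, the kind of closed-form quantity the paper obtains from random-walk theory.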

  12. Highly Relevant Mentoring (HRM) as a Faculty Development Model for Web-Based Instruction

    Science.gov (United States)

    Carter, Lorraine; Salyers, Vincent; Page, Aroha; Williams, Lynda; Albl, Liz; Hofsink, Clarence

    2012-01-01

    This paper describes a faculty development model called the highly relevant mentoring (HRM) model; the model includes a framework as well as some practical strategies for meeting the professional development needs of faculty who teach web-based courses. The paper further emphasizes the need for faculty and administrative buy-in for HRM and…

  13. Political economy models and agricultural policy formation : empirical applicability and relevance for the CAP

    NARCIS (Netherlands)

    Zee, van der F.A.

    1997-01-01

    This study explores the relevance and applicability of political economy models for the explanation of agricultural policies. Part I (chapters 4-7) takes a general perspective and evaluates the empirical applicability of voting models and interest group models to agricultural policy

  14. A SIMULATION MODEL OF THE GAS COMPLEX

    Directory of Open Access Journals (Sweden)

    Sokolova G. E.

    2016-06-01

    Full Text Available The article considers the dynamics of gas production in Russia, the structure of sales in the different market segments, and the comparative dynamics of selling prices in these segments. It addresses the problem of modelling the development of a gas complex with a simulation model that makes it possible to estimate the efficiency of the project and to determine the stability region of the obtained solutions. The presented model takes into account scheduled repayment of the loan, so that the possibility of repaying the loan can be assessed from the first year of the simulation. The modelled object is a group of gas fields, characterized by the minimum flow rate above which the project is cost-effective. In determining the minimum source flow rate, the discount rate is taken as a generalized weighted average of the cost of debt and equity, taking risk premiums into account. It also serves as the lower barrier for the internal rate of return, below which the project is rejected as ineffective. Analysis of the dynamics, together with expert evaluation, makes it possible to determine the intervals of variation of the simulated parameters, such as the gas price and the time at which the gas complex reaches its projected capacity. Values of the parameters simulated with the Monte Carlo method for each random realization of the model yield a set of realization-specific optimal values of the minimum well flow rate, and also allow the stability region of the solution to be determined.
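    The Monte Carlo evaluation described above can be sketched in miniature. All figures below are invented (a single sampled parameter, a flat production profile, a made-up hurdle rate standing in for the weighted average cost of debt and equity), so this only illustrates the mechanism: sample the uncertain parameter, compute project NPV at the hurdle rate, and count the share of realizations in which the project is effective.

```python
# Toy Monte Carlo project appraisal in the spirit of the abstract.
# All numbers are invented for illustration.
import random

random.seed(7)

def npv(rate, cashflows):
    """Net present value of yearly cashflows, year 0 first."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

HURDLE = 0.12        # assumed weighted cost of debt and equity incl. risk premium
INVESTMENT = 100.0   # assumed upfront capital expenditure
VOLUME = 10.0        # assumed constant yearly gas output over 10 years

def one_realization():
    price = random.uniform(1.0, 3.0)  # sampled gas price (uncertain parameter)
    cashflows = [-INVESTMENT] + [price * VOLUME] * 10
    return npv(HURDLE, cashflows)

runs = [one_realization() for _ in range(10000)]
p_viable = sum(v > 0 for v in runs) / len(runs)
print(p_viable)  # share of realizations in which the project is effective
```

For a conventional cashflow like this (one sign change), NPV > 0 at the hurdle rate is equivalent to the internal rate of return clearing the hurdle, which is the rejection criterion the article describes.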

  15. Structured analysis and modeling of complex systems

    Science.gov (United States)

    Strome, David R.; Dalrymple, Mathieu A.

    1992-01-01

    The Aircrew Evaluation Sustained Operations Performance (AESOP) facility at Brooks AFB, Texas, combines the realism of an operational environment with the control of a research laboratory. In recent studies we collected extensive data from the Airborne Warning and Control Systems (AWACS) Weapons Directors subjected to high and low workload Defensive Counter Air Scenarios. A critical and complex task in this environment involves committing a friendly fighter against a hostile fighter. Structured Analysis and Design techniques and computer modeling systems were applied to this task as tools for analyzing subject performance and workload. This technology is being transferred to the Man-Systems Division of NASA Johnson Space Center for application to complex mission related tasks, such as manipulating the Shuttle grappler arm.

  16. Glass Durability Modeling, Activated Complex Theory (ACT)

    International Nuclear Information System (INIS)

    Jantzen, Carol

    2005-01-01

    The most important requirement for high-level waste glass acceptance for disposal in a geological repository is the chemical durability, expressed as a glass dissolution rate. During the early stages of glass dissolution in near static conditions that represent a repository disposal environment, a gel layer resembling a membrane forms on the glass surface through which ions exchange between the glass and the leachant. The hydrated gel layer exhibits acid/base properties which are manifested as the pH dependence of the thickness and nature of the gel layer. The gel layer has been found to age into either clay mineral assemblages or zeolite mineral assemblages. The formation of one phase preferentially over the other has been experimentally related to changes in the pH of the leachant and to the relative amounts of Al+3 and Fe+3 in a glass. The formation of clay mineral assemblages on the leached glass surface layers (lower pH and Fe+3-rich glasses) causes the dissolution rate to slow to a long-term steady state rate. The formation of zeolite mineral assemblages (higher pH and Al+3-rich glasses) on leached glass surface layers causes the dissolution rate to increase and return to the initial high forward rate. The return to the forward dissolution rate is undesirable for long-term performance of glass in a disposal environment. An investigation into the role of glass stoichiometry, in terms of the quasi-crystalline mineral species in a glass, has shown that the chemistry and structure in the parent glass appear to control the activated surface complexes that form in the leached layers, and these mineral complexes (some Fe+3-rich and some Al+3-rich) play a role in whether clays or zeolites are the dominant species formed on the leached glass surface. The chemistry and structure, in terms of Q distributions of the parent glass, are well represented by the atomic ratios of the glass forming components. 
Thus, glass dissolution modeling using simple

  17. Analysis of Evolutionarily Independent Protein-RNA Complexes Yields a Criterion to Evaluate the Relevance of Prebiotic Scenarios.

    Science.gov (United States)

    Blanco, Celia; Bayas, Marco; Yan, Fu; Chen, Irene A

    2018-02-19

    A central difficulty facing study of the origin of life on Earth is evaluating the relevance of different proposed prebiotic scenarios. Perhaps the most established feature of the origin of life was the progression through an RNA World, a prebiotic stage dominated by functional RNA. We use the appearance of proteins in the RNA World to understand the prebiotic milieu and develop a criterion to evaluate proposed synthetic scenarios. Current consensus suggests that the earliest amino acids of the genetic code were anionic or small hydrophobic or polar amino acids. However, the ability to interact with the RNA World would have been a crucial feature of early proteins. To determine which amino acids would be important for the RNA World, we analyze non-biological protein-aptamer complexes in which the RNA or DNA is the result of in vitro evolution. This approach avoids confounding effects of biological context and evolutionary history. We use bioinformatic analysis and molecular dynamics simulations to characterize these complexes. We find that positively charged and aromatic amino acids are over-represented whereas small hydrophobic amino acids are under-represented. Binding enthalpy is found to be primarily electrostatic, with positively charged amino acids contributing cooperatively to binding enthalpy. Arginine dominates all modes of interaction at the interface. These results suggest that proposed prebiotic syntheses must be compatible with cationic amino acids, particularly arginine or a biophysically similar amino acid, in order to be relevant to the invention of protein by the RNA World. Copyright © 2018 Elsevier Ltd. All rights reserved.

  18. Application of all relevant feature selection for failure analysis of parameter-induced simulation crashes in climate models

    Science.gov (United States)

    Paja, W.; Wrzesień, M.; Niemiec, R.; Rudnicki, W. R.

    2015-07-01

    Climate models are extremely complex pieces of software. They reflect the best knowledge on the physical components of the climate; nevertheless, they contain several parameters which are too weakly constrained by observations and can potentially lead to a crash of the simulation. Recently a study by Lucas et al. (2013) has shown that machine learning methods can be used to predict which combinations of parameters lead to a crash of the simulation, and hence which processes described by these parameters need refined analyses. In the current study we reanalyse the dataset used in that research using a different methodology. We confirm the main conclusion of the original study concerning the suitability of machine learning for the prediction of crashes. We show that only three of the eight parameters indicated in the original study as relevant for prediction of the crash are indeed strongly relevant, three others are relevant but redundant, and two are not relevant at all. We also show that the variance due to the split of data between training and validation sets has a large influence both on the accuracy of predictions and on the relative importance of variables; hence only a cross-validated approach can deliver a robust prediction of performance and relevance of variables.

  19. Application of all-relevant feature selection for the failure analysis of parameter-induced simulation crashes in climate models

    Science.gov (United States)

    Paja, Wiesław; Wrzesien, Mariusz; Niemiec, Rafał; Rudnicki, Witold R.

    2016-03-01

    Climate models are extremely complex pieces of software. They reflect the best knowledge on the physical components of the climate; nevertheless, they contain several parameters, which are too weakly constrained by observations, and can potentially lead to a simulation crashing. Recently a study by Lucas et al. (2013) has shown that machine learning methods can be used for predicting which combinations of parameters can lead to the simulation crashing and hence which processes described by these parameters need refined analyses. In the current study we reanalyse the data set used in this research using different methodology. We confirm the main conclusion of the original study concerning the suitability of machine learning for the prediction of crashes. We show that only three of the eight parameters indicated in the original study as relevant for prediction of the crash are indeed strongly relevant, three others are relevant but redundant and two are not relevant at all. We also show that the variance due to the split of data between training and validation sets has a large influence both on the accuracy of predictions and on the relative importance of variables; hence only a cross-validated approach can deliver a robust prediction of performance and relevance of variables.
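    The strongly-relevant / redundant / irrelevant distinction drawn in this record comes from all-relevant feature selection in the spirit of Boruta: each feature must beat shuffled "shadow" copies of the data to count as relevant. The sketch below is a deliberate simplification, not the authors' method: plain correlation stands in for random-forest importance, and the data are synthetic.

```python
import random

def corr(xs, ys):
    """Pearson correlation, plain Python."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sx = sum((a - mx) ** 2 for a in xs) ** 0.5
    sy = sum((b - my) ** 2 for b in ys) ** 0.5
    return cov / (sx * sy)

def all_relevant(X, y, n_rounds=50, seed=0):
    """Flag feature j as relevant when its |corr| with y beats the best
    shuffled 'shadow' feature in a majority of rounds (Boruta's idea,
    with correlation standing in for random-forest importance)."""
    rng = random.Random(seed)
    cols = [[row[j] for row in X] for j in range(len(X[0]))]
    wins = [0] * len(cols)
    for _ in range(n_rounds):
        shadow_scores = []
        for col in cols:
            s = col[:]
            rng.shuffle(s)  # shadow: same values, link to y destroyed
            shadow_scores.append(abs(corr(s, y)))
        threshold = max(shadow_scores)
        for j, col in enumerate(cols):
            if abs(corr(col, y)) > threshold:
                wins[j] += 1
    return [j for j, w in enumerate(wins) if w > n_rounds // 2]

rng = random.Random(1)
X = [[rng.gauss(0, 1) for _ in range(3)] for _ in range(200)]
y = [2 * row[0] - row[1] + rng.gauss(0, 0.1) for row in X]  # feature 2 carries no signal
print(all_relevant(X, y))
```

Repeating the contest over many rounds plays the same role as the cross-validated resampling the paper argues for: a single split can flag a noise feature by chance.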

  20. Chaos from simple models to complex systems

    CERN Document Server

    Cencini, Massimo; Vulpiani, Angelo

    2010-01-01

    Chaos: from simple models to complex systems aims to guide science and engineering students through chaos and nonlinear dynamics from classical examples to the most recent fields of research. The first part, intended for undergraduate and graduate students, is a gentle and self-contained introduction to the concepts and main tools for the characterization of deterministic chaotic systems, with emphasis on statistical approaches. The second part can be used as a reference by researchers as it focuses on more advanced topics including the characterization of chaos with tools of information theory.

  1. Main Features of a 3d GIS for a Monumental Complex with AN Historical-Cultural Relevance

    Science.gov (United States)

    Scianna, A.; La Guardia, M.

    2017-05-01

    The latest achievements in geomatics technologies, especially in the survey and restitution of 3D models (UAV/drone and laser scanner technologies), have generated new procedures and higher standards of quality in the representation of archaeological sites. Together with geomatics, the recent development of Information and Communication Technologies (ICT) strongly contributes to the documentation of Cultural Heritage (CH). The representation and documentation of CH using these new technologies has become necessary in order to satisfy different needs: - for restorers, to acquire a deep knowledge of the cultural good and to define possible strategies of restoration; - for the conservation of information, making it possible to preserve the 3D geometry of the monumental complex together with descriptions of its architectural elements; - for touristic aims, giving the opportunity of sharing CH information on the web, allowing users to virtually visit and explore monumental complexes and to acquire detailed information about architectural elements or the history of the monumental complex. In these new scenarios, the development of a 3D Geographic Information System (GIS) applied to a cultural good can today provide added value of fundamental importance for the full description and data management of monumental complexes. In this work, the main features necessary for the correct construction of a 3D GIS of a monumental complex are analyzed, with a particular focus on the possibilities for creating a standardized procedure to follow.

  2. The relevance of non-human primate and rodent malaria models for humans

    OpenAIRE

    Langhorne, Jean; Buffet, Pierre; Galinski, Mary; Good, Michael; Harty, John; Leroy, Didier; Mota, Maria M; Pasini, Erica; Renia, Laurent; Riley, Eleanor; Stins, Monique; Duffy, Patrick

    2011-01-01

    Abstract At the 2010 Keystone Symposium on "Malaria: new approaches to understanding Host-Parasite interactions", an extra scientific session to discuss animal models in malaria research was convened at the request of participants. This was prompted by the concern of investigators that skepticism in the malaria community about the use and relevance of animal models, particularly rodent models of severe malaria, has impacted on funding decisions and publication of research using animal models....

  3. Political economy models and agricultural policy formation: empirical applicability and relevance for the CAP

    OpenAIRE

    Zee, van der, F.A.

    1997-01-01

    This study explores the relevance and applicability of political economy models for the explanation of agricultural policies. Part I (chapters 4-7) takes a general perspective and evaluates the empirical applicability of voting models and interest group models to agricultural policy formation in industrialised market economies. Part II (chapters 8-11) focuses on the empirical applicability of political economy models to agricultural policy formation and agricultural policy developmen...

  4. Clinical relevance of positive voltage-gated potassium channel (VGKC)-complex antibodies: experience from a tertiary referral centre.

    Science.gov (United States)

    Paterson, Ross W; Zandi, Michael S; Armstrong, Richard; Vincent, Angela; Schott, Jonathan M

    2014-06-01

    Voltage-gated potassium channel (VGKC)-complex antibodies can be associated with a range of immunotherapy-responsive clinical presentations including limbic encephalitis, Morvan's syndrome and acquired neuromyotonia. However, there are patients with positive levels in whom the significance is uncertain. To evaluate the clinical significance associated with positive (>100 pM) VGKC-complex antibodies. Over a 4-year period, 1053 samples were sent for testing of which 55 were positive. The clinical presentations, final diagnoses and responses to immunotherapies, when given, were assessed retrospectively and the likelihood of autoimmunity was categorised as definite, possible, unlikely or undetermined (modified from Zuliani et al 2012). Only 4 of the 32 patients with low-positive (100-400 pM) levels were considered definitely autoimmune, 3 with peripheral nerve hyperexcitability and 1 with a thymoma; 3 were given immunotherapies. Of the remaining 28 with low-positive levels, 13 (3 of whom had tumours) were considered possibly autoimmune, and 15 were unlikely or undetermined; 1 was given immunotherapy unsuccessfully. Of the 23 patients with high-positive (>400 pM) levels, 12 were given immunotherapies, 11 of whom showed a good response. 11 were considered definitely autoimmune, 10 with limbic encephalitis (antibody specificity: 5 LGI1, 1 contactin2, 2 negative, 2 untested) and 1 with a tumour. In the remaining 12, autoimmunity was considered possible (n=9; most had not received immunotherapies), or unlikely (n=3). As antibody testing becomes more widely available, and many samples are referred from patients with less clear-cut diagnoses, it is important to assess the utility of the results. VGKC-complex antibodies in the range of 100-400 pM (0.1-0.4 nM) were considered clinically relevant in rare conditions with peripheral nerve hyperexcitability and appeared to associate with tumours (12.5%). 
By contrast high-positive (>400 pM; >0.4 nM) levels were considered definitely

  5. Complex singlet extension of the standard model

    International Nuclear Information System (INIS)

    Barger, Vernon; McCaskey, Mathew; Langacker, Paul; Ramsey-Musolf, Michael; Shaughnessy, Gabe

    2009-01-01

    We analyze a simple extension of the standard model (SM) obtained by adding a complex singlet to the scalar sector (cxSM). We show that the cxSM can contain one or two viable cold dark matter candidates and analyze the conditions on the parameters of the scalar potential that yield the observed relic density. When the cxSM potential contains a global U(1) symmetry that is both softly and spontaneously broken, it contains both a viable dark matter candidate and the ingredients necessary for a strong first order electroweak phase transition as needed for electroweak baryogenesis. We also study the implications of the model for discovery of a Higgs boson at the Large Hadron Collider.

  6. Extension of association models to complex chemicals

    DEFF Research Database (Denmark)

    Avlund, Ane Søgaard

    Summary of “Extension of association models to complex chemicals”. Ph.D. thesis by Ane Søgaard Avlund The subject of this thesis is application of SAFT type equations of state (EoS). Accurate and predictive thermodynamic models are important in many industries including the petroleum industry......; CPA and sPC-SAFT. Phase equilibrium and monomer fraction calculations with sPC-SAFT for methanol are used in the thesis to illustrate the importance of parameter estimation when using SAFT. Different parameter sets give similar pure component vapor pressure and liquid density results, whereas very...... association is presented in the thesis, and compared to the corresponding lattice theory. The theory for intramolecular association is then applied in connection with sPC-SAFT for mixtures containing glycol ethers. Calculations with sPC-SAFT (without intramolecular association) are presented for comparison...

  7. Functionally relevant climate variables for arid lands: A climatic water deficit approach for modelling desert shrub distributions

    Science.gov (United States)

    Thomas E. Dilts; Peter J. Weisberg; Camie M. Dencker; Jeanne C. Chambers

    2015-01-01

    We have three goals. (1) To develop a suite of functionally relevant climate variables for modelling vegetation distribution on arid and semi-arid landscapes of the Great Basin, USA. (2) To compare the predictive power of vegetation distribution models based on mechanistically proximate factors (water deficit variables) and factors that are more mechanistically removed...

  8. Towards Increased Relevance: Context-Adapted Models of the Learning Organization

    Science.gov (United States)

    Örtenblad, Anders

    2015-01-01

    Purpose: The purposes of this paper are to take a closer look at the relevance of the idea of the learning organization for organizations in different generalized organizational contexts; to open up for the existence of multiple, context-adapted models of the learning organization; and to suggest a number of such models.…

  9. Control-relevant modeling and simulation of a SOFC-GT hybrid system

    Directory of Open Access Journals (Sweden)

    Rambabu Kandepu

    2006-07-01

    Full Text Available In this paper, control-relevant models of the most important components in a SOFC-GT hybrid system are described. Dynamic simulations are performed on the overall hybrid system. The model is used to develop a simple control structure, but the simulations show that more elaborate control is needed.

  10. Control-relevant modeling and simulation of a SOFC-GT hybrid system

    OpenAIRE

    Rambabu Kandepu; Lars Imsland; Christoph Stiller; Bjarne A. Foss; Vinay Kariwala

    2006-01-01

    In this paper, control-relevant models of the most important components in a SOFC-GT hybrid system are described. Dynamic simulations are performed on the overall hybrid system. The model is used to develop a simple control structure, but the simulations show that more elaborate control is needed.

  11. The Model of Complex Structure of Quark

    Science.gov (United States)

    Liu, Rongwu

    2017-09-01

    In Quantum Chromodynamics, the quark is known as a point-like fundamental particle which carries mass, charge, color, and flavor; strong interaction takes place between quarks by means of exchanging intermediate particles, the gluons. An important consequence of this theory is that strong interaction is a short-range force with the features of ``asymptotic freedom'' and ``quark confinement''. In order to reveal the nature of strong interaction, the ``bag'' model of vacuum and the ``string'' model of string theory were proposed in the context of quantum mechanics, but neither of them provides a clear interaction mechanism. This article formulates a new mechanism by proposing a model of the complex structure of the quark, outlined as follows: (1) The quark (as well as the electron, etc.) is a complex structure composed of a fundamental particle (fundamental matter: mass and electricity) and a fundamental volume field (fundamental matter: flavor and color) which exists in the form of a limited volume; the fundamental particle lies in the center of the fundamental volume field and forms the ``nucleus'' of the quark. (2) Like the static electric force, the color field force between quarks has a classical form: it is proportional to the square of the color quantity carried by each color field, and inversely proportional to the area of the cross section of the overlapping color fields along the force direction; it has the properties of overlap, saturation, non-centrality, and constancy. (3) Any volume field undergoes deformation when interacting with another volume field; the deformation force follows Hooke's law. (4) The phenomena of ``asymptotic freedom'' and ``quark confinement'' are the result of the color field force and the deformation force.

  12. Clinical Complexity in Medicine: A Measurement Model of Task and Patient Complexity.

    Science.gov (United States)

    Islam, R; Weir, C; Del Fiol, G

    2016-01-01

    Complexity in medicine needs to be reduced to simple components in a way that is comprehensible to researchers and clinicians. Few studies in the current literature propose a measurement model that addresses both task and patient complexity in medicine. The objective of this paper is to develop an integrated approach to understand and measure clinical complexity by incorporating both task and patient complexity components focusing on the infectious disease domain. The measurement model was adapted and modified for the healthcare domain. Three clinical infectious disease teams were observed, audio-recorded and transcribed. Each team included an infectious diseases expert, one infectious diseases fellow, one physician assistant and one pharmacy resident fellow. The transcripts were parsed and the authors independently coded complexity attributes. This baseline measurement model of clinical complexity was modified in an initial set of coding processes and further validated in a consensus-based iterative process that included several meetings and email discussions by three clinical experts from diverse backgrounds from the Department of Biomedical Informatics at the University of Utah. Inter-rater reliability was calculated using Cohen's kappa. The proposed clinical complexity model consists of two separate components. The first is a clinical task complexity model with 13 clinical complexity-contributing factors and 7 dimensions. The second is the patient complexity model with 11 complexity-contributing factors and 5 dimensions. The measurement model for complexity encompassing both task and patient complexity will be a valuable resource for future researchers and industry to measure and understand complexity in healthcare.
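    The inter-rater reliability statistic used in this study, Cohen's kappa, corrects raw agreement for the agreement expected by chance and is straightforward to compute. The two coder label sequences below are hypothetical, standing in for the independently coded complexity attributes.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Inter-rater agreement corrected for chance: (p_o - p_e) / (1 - p_e)."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    expected = sum((ca[l] / n) * (cb[l] / n) for l in set(ca) | set(cb))
    return (observed - expected) / (1 - expected)

# Two hypothetical coders labelling 10 transcript segments as task- or
# patient-complexity attributes (invented data).
a = ["task", "task", "patient", "task", "patient", "task", "task", "patient", "task", "patient"]
b = ["task", "task", "patient", "task", "task",    "task", "task", "patient", "task", "patient"]
print(round(cohens_kappa(a, b), 3))  # → 0.783
```

Here raw agreement is 0.9 but chance agreement is 0.54, so kappa lands at 0.783; reporting kappa rather than raw agreement is exactly why the study uses it.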

  13. Reducing Spatial Data Complexity for Classification Models

    International Nuclear Information System (INIS)

    Ruta, Dymitr; Gabrys, Bogdan

    2007-01-01

    Intelligent data analytics is gradually becoming a day-to-day reality of today's businesses. However, despite rapidly increasing storage and computational power, current state-of-the-art predictive models still cannot handle massive and noisy corporate data warehouses. What is more, adaptive and real-time operational environments require multiple models to be frequently retrained, which further hinders their use. Various data reduction techniques, ranging from data sampling up to density retention models, attempt to address this challenge by capturing a summarised data structure, yet they either do not account for labelled data or degrade the classification performance of the model trained on the condensed dataset. Our response is to propose a new general framework for reducing the complexity of labelled data by means of controlled spatial redistribution of class densities in the input space. On the example of the Parzen Labelled Data Compressor (PLDC) we demonstrate a simulatory data condensation process directly inspired by electrostatic field interaction, where the data are moved and merged following the attracting and repelling interactions with the other labelled data. The process is controlled by the class density function built on the original data, which acts as a class-sensitive potential field ensuring preservation of the original class density distributions, yet allowing data to rearrange and merge, joining together their soft class partitions. As a result we achieve a model that reduces labelled datasets much further than any competitive approach, yet with maximum retention of the original class densities and hence of the classification performance. 
PLDC leaves the reduced dataset with soft accumulative class weights, allowing for efficient online updates, and, as shown in a series of experiments, when coupled with the Parzen Density Classifier (PDC) it significantly outperforms competitive data condensation methods in terms of classification performance at the

  14. Reducing Spatial Data Complexity for Classification Models

    Science.gov (United States)

    Ruta, Dymitr; Gabrys, Bogdan

    2007-11-01

    Intelligent data analytics is gradually becoming a day-to-day reality of today's businesses. However, despite rapidly increasing storage and computational power, current state-of-the-art predictive models still cannot handle massive and noisy corporate data warehouses. What is more, adaptive and real-time operational environments require multiple models to be frequently retrained, which further hinders their use. Various data reduction techniques, ranging from data sampling up to density retention models, attempt to address this challenge by capturing a summarised data structure, yet they either do not account for labelled data or degrade the classification performance of the model trained on the condensed dataset. Our response is to propose a new general framework for reducing the complexity of labelled data by means of controlled spatial redistribution of class densities in the input space. On the example of the Parzen Labelled Data Compressor (PLDC) we demonstrate a simulatory data condensation process directly inspired by electrostatic field interaction, where the data are moved and merged following the attracting and repelling interactions with the other labelled data. The process is controlled by the class density function built on the original data, which acts as a class-sensitive potential field ensuring preservation of the original class density distributions, yet allowing data to rearrange and merge, joining together their soft class partitions. As a result we achieve a model that reduces labelled datasets much further than any competitive approach, yet with maximum retention of the original class densities and hence of the classification performance. 
PLDC leaves the reduced dataset with soft accumulative class weights, allowing for efficient online updates, and, as shown in a series of experiments, when coupled with the Parzen Density Classifier (PDC) it significantly outperforms competitive data condensation methods in terms of classification performance at the
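    The classifier the condensed data feed into, the Parzen Density Classifier, can be sketched directly. This is an illustrative toy, not the paper's implementation: the data, bandwidth, and the use of point counts as the "class weights" are invented assumptions (PLDC would replace the raw point counts with the accumulated soft class weights of the merged prototypes).

```python
import math

def parzen_density(x, points, h):
    """Parzen window density estimate with a Gaussian kernel of bandwidth h."""
    gauss = lambda u: math.exp(-0.5 * u * u) / math.sqrt(2 * math.pi)
    return sum(gauss((x - p) / h) for p in points) / (len(points) * h)

def pdc_classify(x, data, h=0.5):
    """Parzen Density Classifier: choose the class whose unnormalised
    posterior (class-conditional density times class weight, here the
    point count) is largest at x."""
    best, best_score = None, -1.0
    for label, points in data.items():
        score = parzen_density(x, points, h) * len(points)
        if score > best_score:
            best, best_score = label, score
    return best

# Toy 1-D training set with two classes (invented data).
data = {"A": [0.0, 0.3, 0.5, 0.8], "B": [2.0, 2.2, 2.5, 3.0]}
print(pdc_classify(0.4, data), pdc_classify(2.4, data))  # → A B
```

Because the decision depends only on the class density functions, a condensed dataset that preserves those densities (the PLDC goal) preserves the decision boundary.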

  15. A Tissue Relevance and Meshing Method for Computing Patient-Specific Anatomical Models in Endoscopic Sinus Surgery Simulation

    Science.gov (United States)

    Audette, M. A.; Hertel, I.; Burgert, O.; Strauss, G.

    This paper presents on-going work on a method for determining which subvolumes of a patient-specific tissue map, extracted from CT data of the head, are relevant to simulating endoscopic sinus surgery of that individual, and for decomposing these relevant tissues into triangles and tetrahedra whose mesh size is well controlled. The overall goal is to limit the complexity of the real-time biomechanical interaction while ensuring the clinical relevance of the simulation. Relevant tissues are determined as the union of the pathology present in the patient, of critical tissues deemed to be near the intended surgical path or pathology, and of bone and soft tissue near the intended path, pathology or critical tissues. The processing of tissues, prior to meshing, is based on the Fast Marching method applied under various guises, in a conditional manner that is related to tissue classes. The meshing is based on an adaptation of a meshing method of ours, which combines the Marching Tetrahedra method and the discrete Simplex mesh surface model to produce a topologically faithful surface mesh with well controlled edge and face size as a first stage, and Almost-regular Tetrahedralization of the same prescribed mesh size as a last stage.
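    The Fast Marching step applied "in a conditional manner related to tissue classes" can be approximated with a Dijkstra-style front propagation on a grid, where the tissue map supplies a speed per cell. This is a coarse first-order stand-in, not the authors' method (true Fast Marching solves the Eikonal equation with an upwind update), and the grid below is invented.

```python
import heapq

def march(speed, start):
    """Dijkstra-style front propagation: arrival time grows by 1/speed per
    step, a coarse stand-in for the Fast Marching method. Cells with speed 0
    (e.g. a tissue class excluded as irrelevant) are never entered."""
    rows, cols = len(speed), len(speed[0])
    time = {start: 0.0}
    heap = [(0.0, start)]
    while heap:
        t, (r, c) = heapq.heappop(heap)
        if t > time.get((r, c), float("inf")):
            continue  # stale heap entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and speed[nr][nc] > 0:
                nt = t + 1.0 / speed[nr][nc]
                if nt < time.get((nr, nc), float("inf")):
                    time[(nr, nc)] = nt
                    heapq.heappush(heap, (nt, (nr, nc)))
    return time

# 1 = soft tissue (traversable), 0 = excluded class: the front flows around the wall.
speed = [[1, 1, 1],
         [0, 0, 1],
         [1, 1, 1]]
t = march(speed, (0, 0))
print(t[(2, 0)])  # → 6.0 (reached via the right-hand gap)
```

Conditioning on tissue class then amounts to building the speed map from the tissue labels: relevant tissues get positive speeds, everything else zero, so arrival times (and hence distances to the surgical path) are only computed where they matter.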

  16. Principle of an operational complexity index for the characterization of the human factor relevance of future reactors concepts

    International Nuclear Information System (INIS)

    Papin, Bernard

    2004-01-01

    With the increasing reliability of modern technological systems, the human contribution to the global risk in the operation of industrial systems is becoming more and more significant: in nuclear reactor operation, for example, a recent PSA estimate puts this contribution at about 25% of the risk of core melting, all situations considered. This urges the designers of future nuclear reactors to consider minimising this Human Factor (HF) contribution at the very early stage of their design: experience feedback shows that it is indeed at this stage that the fundamental design options that most impact human reliability in operation are fixed. The problem is that at these early design stages it is also quite impossible to apply formal human reliability methods to support this HF optimisation, since the precise operating conditions of the reactor are not yet known in enough detail. In this paper, another approach to HF evaluation during design, based on functional and operational complexity assessment, is proposed. As an illustration, this approach is used to compare various concepts of Pressurized Water Reactors from the point of view of Human Factor relevance. (Author)

  17. Complex Environmental Data Modelling Using Adaptive General Regression Neural Networks

    Science.gov (United States)

    Kanevski, Mikhail

    2015-04-01

    The research deals with the adaptation and application of Adaptive General Regression Neural Networks (GRNN) to high-dimensional environmental data. GRNN [1,2,3] are efficient modelling tools for both spatial and temporal data and are based on nonparametric kernel methods closely related to the classical Nadaraya-Watson estimator. Adaptive GRNN, using anisotropic kernels, can also be applied to feature selection tasks when working with high-dimensional data [1,3]. In the present research Adaptive GRNN are used to study geospatial data predictability and relevant feature selection using both simulated and real data case studies. The original raw data were either three-dimensional monthly precipitation data or monthly wind speeds embedded into a 13-dimensional space constructed from geographical coordinates and geo-features calculated from a digital elevation model. GRNN were applied in two different ways: 1) adaptive GRNN with the resulting list of features ordered according to their relevancy; and 2) adaptive GRNN applied to evaluate all N possible models [in the case of the wind fields N = 2^13 - 1 = 8191] and rank them according to the cross-validation error. In both cases training was carried out applying a leave-one-out procedure. An important result of the study is that the set of the most relevant features depends on the month (strong seasonal effect) and year. The predictabilities of precipitation and wind field patterns, estimated using the cross-validation and testing errors of raw and shuffled data, were studied in detail. The results of both approaches were qualitatively and quantitatively compared. In conclusion, Adaptive GRNN, with their ability to select features and efficiently model complex high-dimensional data, can be widely used in automatic/on-line mapping and as an integrated part of environmental decision support systems. 1. Kanevski M., Pozdnoukhov A., Timonin V. Machine Learning for Spatial Environmental Data. Theory, applications and software. EPFL Press
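    The core estimator here is compact: a GRNN is the Nadaraya-Watson kernel regression, and anisotropic bandwidths provide the feature-selection mechanism, since a very large bandwidth effectively switches a dimension off. The sketch below (toy data and invented bandwidths, not the study's precipitation or wind data) ranks two candidate models by leave-one-out error, mirroring the second way GRNN were applied in the abstract.

```python
import math

def grnn_predict(x, X, y, sigmas, skip=None):
    """GRNN = Nadaraya-Watson estimator with an anisotropic Gaussian kernel:
    one bandwidth per input dimension, so a huge sigma switches a feature off."""
    num = den = 0.0
    for i, (xi, yi) in enumerate(zip(X, y)):
        if i == skip:  # leave-one-out: exclude the query point itself
            continue
        d2 = sum(((a - b) / s) ** 2 for a, b, s in zip(x, xi, sigmas))
        w = math.exp(-0.5 * d2)
        num += w * yi
        den += w
    return num / den if den else 0.0

def loo_error(X, y, sigmas):
    """Leave-one-out cross-validation MSE, used to rank candidate models."""
    return sum((grnn_predict(X[i], X, y, sigmas, skip=i) - y[i]) ** 2
               for i in range(len(X))) / len(X)

# Toy data: the target depends on the first coordinate only.
X = [(i / 10.0, (i * 7 % 11) / 10.0) for i in range(30)]
y = [math.sin(3 * a) for a, _ in X]
relevant   = loo_error(X, y, sigmas=(0.2, 1e6))  # feature 2 switched off
irrelevant = loo_error(X, y, sigmas=(1e6, 0.2))  # feature 1 switched off
print(relevant < irrelevant)  # → True
```

Evaluating all 2^d - 1 on/off bandwidth combinations and ranking them by `loo_error` is the exhaustive model search the abstract describes for the wind-field case.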

  18. Modeling the Structure and Complexity of Engineering Routine Design Problems

    NARCIS (Netherlands)

    Jauregui Becker, Juan Manuel; Wits, Wessel Willems; van Houten, Frederikus J.A.M.

    2011-01-01

    This paper proposes a model to structure routine design problems as well as a model of its design complexity. The idea is that having a proper model of the structure of such problems enables understanding its complexity, and likewise, a proper understanding of its complexity enables the development

  19. Facing urban complexity : towards cognitive modelling. Part 1. Modelling as a cognitive mediator

    Directory of Open Access Journals (Sweden)

    Sylvie Occelli

    2002-03-01

    Full Text Available Over the last twenty years, complexity issues have been a central theme of enquiry for the modelling field. While complexity has contributed both to a critical revisiting of existing methods and to opening new ways of reasoning, the effectiveness (and sense) of modelling activity was rarely questioned. The acknowledgment of complexity has been a fruitful spur to new and more sophisticated methods that improve understanding and advance the geographical sciences. However, its contribution to tackling urban problems in everyday life has been rather poor and mainly limited to rhetorical claims about the potentialities of the new approach. We argue that although complexity has put classical modelling activity in serious distress, it is disclosing new potentialities which are still largely unnoticed. These are primarily related to what the authors have called the structural cognitive shift, which involves both the contents and the role of modelling activity. This paper is the first part of a work aimed at illustrating the main features of this shift and discussing its main consequences for modelling activity. We contend that the most relevant aspect of novelty lies in the new role of modelling as a cognitive mediator, i.e. as a kind of interface between the various components of a modelling process and the external environment to which a model application belongs.

  20. The LAILAPS search engine: a feature model for relevance ranking in life science databases.

    Science.gov (United States)

    Lange, Matthias; Spies, Karl; Colmsee, Christian; Flemming, Steffen; Klapperstück, Matthias; Scholz, Uwe

    2010-03-25

    Efficient and effective information retrieval in the life sciences is one of the most pressing challenges in bioinformatics. The incredible growth of life science databases into a vast network of interconnected information systems is both a big challenge and a great chance for life science research. The knowledge found on the Web, in particular in life-science databases, is a valuable major resource. In order to bring it to the scientist's desktop, well-performing search engines are essential. Here, neither the response time nor the number of results is the crucial factor: for millions of query results, it is the relevance ranking. In this paper, we present a feature model for relevance ranking in life science databases and its implementation in the LAILAPS search engine. Motivated by observing how users inspect search engine results, we condensed a set of nine relevance-discriminating features. These features are intuitively used by scientists who briefly screen database entries for potential relevance. The features are both sufficient to estimate potential relevance and efficiently quantifiable. Deriving a relevance prediction function that computes relevance from these features constitutes a regression problem. To solve it, we used artificial neural networks trained with a reference set of relevant database entries for 19 protein queries. Supporting a flexible text index and a simple data import format, these concepts are implemented in the LAILAPS search engine. It can easily be used both as a search engine for comprehensive integrated life science databases and for small in-house project databases. LAILAPS is publicly available for SWISSPROT data at http://lailaps.ipk-gatersleben.de.
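    A minimal sketch of the idea behind the feature model: combine a handful of quickly computable, relevance-discriminating features into a single score and rank entries by it. The feature names, values, weights, and logistic form below are invented for illustration; LAILAPS learns its actual prediction function with an artificial neural network trained on reference queries.

```python
import math

# Hypothetical feature vectors for two database entries, each value in [0, 1]
# (e.g. query-term frequency in the entry title, in the description, and a
# citation-based score). Names and values are illustrative only.
entries = {
    "entry_a": [0.9, 0.7, 0.4],
    "entry_b": [0.2, 0.3, 0.8],
}

weights = [0.5, 0.3, 0.2]  # in LAILAPS these are learned, not hand-set
bias = -0.5

def relevance(features):
    """Logistic score as a simple stand-in for the trained prediction function."""
    z = bias + sum(w * f for w, f in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))

# Rank entries by decreasing predicted relevance.
ranked = sorted(entries, key=lambda name: relevance(entries[name]), reverse=True)
```

    Entry A, strong on the two text features, outranks entry B despite B's higher citation-based score, mirroring how a weighted feature model trades off individually weak signals.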

  1. On Feature Relevance in Image-Based Prediction Models: An Empirical Study

    DEFF Research Database (Denmark)

    Konukoglu, E.; Ganz, Melanie; Van Leemput, Koen

    2013-01-01

    Determining disease-related variations of the anatomy and function is an important step in better understanding diseases and developing early diagnostic systems. In particular, image-based multivariate prediction models and the “relevant features” they produce are attracting attention from the co...

  2. Stem cell therapy for joint problems using the horse as a clinically relevant animal model

    DEFF Research Database (Denmark)

    Koch, Thomas Gadegaard; Betts, Dean H.

    2007-01-01

    of experimentally induced lesions. The horse lends itself as a good animal model of spontaneous joint disorders that are clinically relevant to similar human disorders. Equine stem cell and tissue engineering studies may be financially feasible to principal investigators and small biotechnology companies...

  3. Experimental Models of Vaginal Candidiasis and Their Relevance to Human Candidiasis

    Science.gov (United States)

    Sobel, Jack D.

    2016-01-01

    Vulvovaginal candidiasis (VVC) is a high-incidence disease seriously affecting the quality of life of women worldwide, particularly in its chronic, recurrent forms (RVVC), and with no definitive cure or preventive measure. Experimental studies in currently used rat and mouse models of vaginal candidiasis have generated a large mass of data on pathogenicity determinants and inflammation and immune responses of potential importance for the control of human pathology. However, reflection is necessary about the relevance of these rodent models to RVVC. Here we examine the chemical, biochemical, and biological factors that determine or contrast the forms of the disease in rodent models and in women and highlight the differences between them. We also appeal for approaches to improve or replace the current models in order to enhance their relevance to human infection. PMID:26883592

  4. Infrared spectra of complex organic molecules in astronomically relevant ice matrices. I. Acetaldehyde, ethanol, and dimethyl ether

    Science.gov (United States)

    Terwisscha van Scheltinga, J.; Ligterink, N. F. W.; Boogert, A. C. A.; van Dishoeck, E. F.; Linnartz, H.

    2018-03-01

    Context. The number of identified complex organic molecules (COMs) in inter- and circumstellar gas-phase environments is steadily increasing. Recent laboratory studies show that many such species form on icy dust grains. At present only smaller molecular species have been directly identified in space in the solid state. Accurate spectroscopic laboratory data of frozen COMs, embedded in ice matrices containing ingredients related to their formation scheme, are still largely lacking. Aim. This work provides infrared reference spectra of acetaldehyde (CH3CHO), ethanol (CH3CH2OH), and dimethyl ether (CH3OCH3) recorded in a variety of ice environments and for astronomically relevant temperatures, as needed to guide or interpret astronomical observations, specifically for upcoming James Webb Space Telescope observations. Methods: Fourier transform transmission spectroscopy (500-4000 cm-1/20-2.5 μm, 1.0 cm-1 resolution) was used to investigate solid acetaldehyde, ethanol and dimethyl ether, pure or mixed with water, CO, methanol, or CO:methanol. These species were deposited on a cryogenically cooled infrared transmissive window at 15 K. A heating ramp was applied, during which IR spectra were recorded until all ice constituents were thermally desorbed. Results: We present a large number of reference spectra that can be compared with astronomical data. Accurate band positions and band widths are provided for the studied ice mixtures and temperatures. Special efforts have been put into those bands of each molecule that are best suited for identification. For acetaldehyde the 7.427 and 5.803 μm bands are recommended, for ethanol the 11.36 and 7.240 μm bands are good candidates, and for dimethyl ether bands at 9.141 and 8.011 μm can be used. All spectra are publicly available in the Leiden Database for Ice.

  5. Equation-free model reduction for complex dynamical systems

    International Nuclear Information System (INIS)

    Le Maitre, O. P.; Mathelin, L.

    2010-01-01

    This paper presents a reduced model strategy for the simulation of complex physical systems. A classical reduced basis is first constructed, relying on proper orthogonal decomposition of the system. Then, unlike alternative approaches such as Galerkin projection schemes, an equation-free reduced model is constructed. It consists of determining an explicit transformation, or mapping, for the evolution over a coarse time-step of the projection coefficients of the system state on the reduced basis. The mapping is expressed as an explicit polynomial transformation of the projection coefficients and is computed once and for all in a pre-processing stage using the detailed model equations of the system. The reduced system can then be advanced in time by successive applications of the mapping. The CPU cost of the method lies essentially in the mapping approximation, which is performed offline, in parallel, and only once. Subsequent application of the mapping to perform a time-integration is carried out at low cost thanks to its explicit character. Application of the method is considered for the 2-D flow around a circular cylinder. We investigate the effectiveness of the reduced model in rendering the dynamics for both the asymptotic state and the transient stages. It is shown that the method leads to a stable and accurate time-integration for only a fraction of the cost of a detailed simulation, provided that the mapping is properly approximated and the reduced basis remains relevant for the dynamics investigated. (authors)
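    The first stage of the approach, extracting a reduced basis by proper orthogonal decomposition (POD), can be sketched on toy data. The snapshot matrix below is synthetic and exactly rank 2 by construction, so two POD modes capture it fully; the equation-free polynomial mapping that advances the projection coefficients is not shown.

```python
import numpy as np

# Synthetic snapshots of a "detailed model": 50-dimensional states that, by
# construction, live on a 2-dimensional subspace (a toy stand-in for the
# cylinder-flow application).
rng = np.random.default_rng(0)
n, m = 50, 200                        # state dimension, number of snapshots
modes = rng.standard_normal((n, 2))   # two hidden spatial structures
amplitudes = rng.standard_normal((2, m))
snapshots = modes @ amplitudes        # n x m snapshot matrix

# POD: the leading left singular vectors of the snapshot matrix form the basis.
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
basis = U[:, :2]                      # keep the two dominant modes

# The projection coefficients are the quantities the equation-free mapping
# would evolve over coarse time-steps; here we only project and reconstruct.
a = basis.T @ snapshots[:, 0]
reconstruction = basis @ a
error = np.linalg.norm(reconstruction - snapshots[:, 0])
```

    Because the data are exactly rank 2, the reconstruction error is at machine precision; for real flow data one instead truncates the basis where the singular values `s` drop off.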

  6. Modeling competitive substitution in a polyelectrolyte complex

    International Nuclear Information System (INIS)

    Peng, B.; Muthukumar, M.

    2015-01-01

    We have simulated the invasion of a polyelectrolyte complex made of a polycation chain and a polyanion chain by another, longer polyanion chain, using the coarse-grained united atom model for the chains and the Langevin dynamics methodology. Our simulations reveal many intricate details of the substitution reaction in terms of conformational changes of the chains and competition between the invading chain and the chain being displaced for the common complementary chain. We show that the invading chain is required to be sufficiently longer than the chain being displaced for effecting the substitution. Yet, having the invading chain be longer than a certain threshold value does not reduce the substitution time much further. While most of the simulations were carried out in salt-free conditions, we show that the presence of salt facilitates the substitution reaction and reduces the substitution time. Analysis of our data shows that the dominant driving force for the substitution process involving polyelectrolytes lies in the release of counterions during the substitution.

  7. Modelling of information processes management of educational complex

    Directory of Open Access Journals (Sweden)

    Oksana Nikolaevna Romashkova

    2014-12-01

    This work concerns an information model of an educational complex comprising several schools. A classification of the educational complexes formed in Moscow is given. The existing organizational structure of the educational complex is examined and a matrix management structure is suggested. The basic management information processes of the educational complex are conceptualized.

  8. Sandpile model for relaxation in complex systems

    International Nuclear Information System (INIS)

    Vazquez, A.; Sotolongo-Costa, O.; Brouers, F.

    1997-10-01

    Relaxation in complex systems is, in general, nonexponential. After an initial rapid decay the system relaxes slowly, following a long-time tail. In the present paper a sandpile model of relaxation in complex systems is analysed. Complexity is introduced by a process of avalanches in the Bethe lattice and a feedback mechanism which leads to slower decay with increasing time. In this way, some features of relaxation in complex systems, namely long-time-tail relaxation, aging, and a fractal distribution of characteristic times, are obtained by simple computer simulations. (author)
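    The long-time tail can be illustrated numerically: a superposition of exponential decays with a broad spectrum of characteristic times relaxes far more slowly at long times than a single exponential with the same mean time. The geometric spectrum below is purely illustrative; it stands in for the fractal distribution of characteristic times produced by the avalanche-and-feedback mechanism.

```python
import math

# Broad spectrum of characteristic times: tau = 1, 2, 4, ..., 512.
taus = [2.0 ** k for k in range(10)]
mean_tau = sum(taus) / len(taus)

def relax(t):
    """Relaxation function: equal-weight superposition of exponential decays."""
    return sum(math.exp(-t / tau) for tau in taus) / len(taus)

# At long times the slowest components dominate, producing a long-time tail
# that lies far above the single-exponential curve exp(-t / mean_tau).
tail = relax(1000.0)
single = math.exp(-1000.0 / mean_tau)
```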

  9. Intertwining personal and reward relevance: evidence from the drift-diffusion model.

    Science.gov (United States)

    Yankouskaya, A; Bührle, R; Lugt, E; Stolte, M; Sui, J

    2018-01-24

    In their seminal paper 'Is our self nothing but reward?', Northoff and Hayes (Biological Psychiatry 69(11):1019-1025, 2011) proposed three models of the relationship between self and reward and opened a continuing debate about how these different fields can be linked. To date, none of the proposed models has received strong empirical support. The present study tested common and distinct effects of personal relevance and reward value by decomposing different stages of perceptual decision making using a drift-diffusion approach. We employed a recently developed associative matching paradigm in which participants (N = 40) formed mental associations between five geometric shapes and five labels referring to personal relevance in the personal task, or five shape-label pairings with different reward values in the reward task, and then performed a matching task by indicating whether a displayed shape-label pairing was correct or incorrect. We found that the common effects of personal relevance and monetary reward were manifested in the facilitation of behavioural performance for high personal relevance and high reward value as socially important signals. The differential effects between personal and monetary relevance were reflected in the non-decision time of the perceptual decision process and in task-specific prioritization of stimuli. Our findings support the parallel processing model (Northoff & Hayes, 2011) and suggest that self-specific processing occurs in parallel with high reward processing. Limitations and further directions are discussed.
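    A single trial of the drift-diffusion process underlying such analyses can be sketched as follows: noisy evidence accumulates at a drift rate until it crosses a decision boundary, and a non-decision time is added to the response time. All parameter values here are illustrative, not the estimates from the study.

```python
import random

def simulate_ddm(drift, threshold=1.0, ndt=0.3, dt=0.001, noise=1.0, seed=0):
    """Simulate one drift-diffusion trial. Evidence x accumulates at rate
    `drift` plus Gaussian noise until it crosses +threshold ("match") or
    -threshold ("non-match"); `ndt` is the non-decision time component."""
    rng = random.Random(seed)
    x, t = 0.0, 0.0
    while abs(x) < threshold:
        x += drift * dt + noise * rng.gauss(0.0, dt ** 0.5)
        t += dt
    return ("match" if x > 0 else "non-match", t + ndt)
```

    In this framework, conditions that raise the drift rate (for example, high personal relevance or high reward pairings) yield faster and more accurate responses, while condition differences in `ndt` capture the non-decisional effects reported in the study.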

  10. The relevance of non-human primate and rodent malaria models for humans

    Directory of Open Access Journals (Sweden)

    Riley Eleanor

    2011-02-01

    At the 2010 Keystone Symposium on "Malaria: new approaches to understanding Host-Parasite interactions", an extra scientific session to discuss animal models in malaria research was convened at the request of participants. This was prompted by investigators' concern that skepticism in the malaria community about the use and relevance of animal models, particularly rodent models of severe malaria, has affected funding decisions and the publication of research using animal models. Several speakers took the opportunity to demonstrate the similarities between findings in rodent models and human severe disease, as well as points of difference. The variety of malaria presentations in the different experimental models parallels the wide diversity of human malaria disease and might therefore be viewed as a strength. Many of the key features of human malaria can be replicated in a variety of non-human primate models, which are greatly under-utilized. The importance of animal models in the discovery of new anti-malarial drugs was emphasized. The major conclusions of the session were that experimental and human studies should be more closely linked so that they inform each other, and that there should be wider access to relevant clinical material.

  11. Modeling Complex Chemical Systems: Problems and Solutions

    Science.gov (United States)

    van Dijk, Jan

    2016-09-01

    Non-equilibrium plasmas in complex gas mixtures are at the heart of numerous contemporary technologies. They typically contain dozens to hundreds of species, involved in hundreds to thousands of reactions. Chemists and physicists have always been interested in what are now called chemical reduction techniques (CRTs). The idea of such CRTs is that they reduce the number of species that need to be considered explicitly without compromising the validity of the model. This is usually achieved on the basis of an analysis of the reaction time scales of the system under study, which identifies species that are in partial equilibrium after a given time span. The first such CRT to be widely used in plasma physics was developed in the 1960s and resulted in the concept of effective ionization and recombination rates. It was later generalized to systems in which multiple levels are affected by transport. In recent years there has been a renewed interest in tools for chemical reduction and reaction pathway analysis. An example of the latter is the PumpKin tool. Another trend is that techniques previously developed in other fields of science are being adapted to handle the plasma state of matter. Examples are the Intrinsic Low-Dimensional Manifold (ILDM) method and its derivatives, which originate from combustion engineering, and the general-purpose Principal Component Analysis (PCA) technique. In this contribution we provide an overview of the most common reduction techniques, then critically assess the pros and cons of the methods that have gained most popularity in recent years. Examples are provided for plasmas in argon and carbon dioxide.

  12. Modeling the Chemical Complexity in Titan's Atmosphere

    Science.gov (United States)

    Vuitton, Veronique; Yelle, Roger; Klippenstein, Stephen J.; Horst, Sarah; Lavvas, Panayotis

    2018-06-01

    Titan's atmospheric chemistry is extremely complicated because of the multiplicity of chemical as well as physical processes involved. Chemical processes begin with the dissociation and ionization of the most abundant species, N2 and CH4, by a variety of energy sources, i.e. solar UV and X-ray photons and suprathermal electrons. They continue with reactions involving radicals as well as positive and negative ions, all possibly in some excited electronic and vibrational state. Heterogeneous chemistry at the surface of the aerosols could also play a significant role. The efficiency and outcome of these reactions depend strongly on the physical characteristics of the atmosphere, namely pressure and temperature, ranging from 1.5×10³ to 10⁻¹⁰ mbar and from 70 to 200 K, respectively. Moreover, the distribution of the species is affected by molecular diffusion and winds, as well as by escape from the top of the atmosphere and condensation in the lower stratosphere. Photochemical and microphysical models are the keystones of our understanding of Titan's atmospheric chemistry. Their main objective is to compute the distribution and nature of minor chemical species (typically containing up to 6 carbon atoms) and haze particles, respectively. Density profiles are compared with the available observations, allowing us to identify important processes and to highlight those that remain to be constrained in the laboratory, experimentally and/or theoretically. We argue that positive ion chemistry is at the origin of complex organic molecules such as benzene, ammonia, and hydrogen isocyanide, while neutral-neutral radiative association reactions are a significant source of alkanes. We find that negatively charged macromolecules (m/z ~100) attract the abundant positive ions, which ultimately leads to the formation of the aerosols. We also discuss the possibility that an incoming flux of oxygen from Enceladus, another of Saturn's satellites, is responsible for the presence of oxygen-bearing species in Titan's reducing atmosphere.

  13. Modelling the complex dynamics of vegetation, livestock and rainfall ...

    African Journals Online (AJOL)

    In this paper, we present mathematical models that incorporate ideas from complex systems theory to integrate several strands of rangeland theory in a hierarchical framework. ... Keywords: catastrophe theory; complexity theory; disequilibrium; hysteresis; moving attractors

  14. Model Predictive Engine Air-Ratio Control Using Online Sequential Relevance Vector Machine

    Directory of Open Access Journals (Sweden)

    Hang-cheong Wong

    2012-01-01

    Engine power, brake-specific fuel consumption, and emissions relate closely to the air ratio (i.e., lambda) among all the engine variables. An accurate and adaptive model for lambda prediction is essential to effective long-term lambda control. This paper utilizes an emerging technique, the relevance vector machine (RVM), to build a reliable time-dependent lambda model which can be continually updated whenever a sample is added to, or removed from, the estimated lambda model. The paper also presents a new model predictive control (MPC) algorithm for air-ratio regulation based on RVM. This study shows that the accuracy, training time, and updating time of the RVM model are superior to those of the latest modelling methods, such as the diagonal recurrent neural network (DRNN) and the decremental least-squares support vector machine (DLSSVM). Moreover, the control algorithm has been implemented on a real car for testing. Experimental results reveal that the control performance of the proposed relevance vector machine model predictive controller (RVMMPC) is also superior to DRNNMPC, support vector machine-based MPC, and the conventional proportional-integral (PI) controller in production cars. Therefore, the proposed RVMMPC is a promising scheme to replace the conventional PI controller for engine air-ratio control.
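    The essential ingredient, a prediction model that can be refreshed one sample at a time, can be illustrated with a much simpler stand-in: a recursive least-squares (RLS) update for a linear lambda predictor. This is not the RVM machinery of the paper, only a sketch of the add-a-sample update pattern; the regressors and data are invented.

```python
def rls_update(theta, P, x, y):
    """One recursive least-squares step: fold the new sample (x, y) into the
    parameter estimate `theta` and its (symmetric) covariance matrix `P`."""
    n = len(x)
    Px = [sum(P[i][j] * x[j] for j in range(n)) for i in range(n)]
    denom = 1.0 + sum(x[i] * Px[i] for i in range(n))
    gain = [v / denom for v in Px]
    err = y - sum(theta[i] * x[i] for i in range(n))   # prediction error
    theta = [theta[i] + gain[i] * err for i in range(n)]
    P = [[P[i][j] - gain[i] * Px[j] for j in range(n)] for i in range(n)]
    return theta, P

# Recover the hypothetical relation lambda = 2.0 * u + 1.0 from a stream of
# noiseless samples, starting from a vague prior.
theta = [0.0, 0.0]
P = [[100.0, 0.0], [0.0, 100.0]]
for t in range(1, 31):
    u = t / 10.0
    theta, P = rls_update(theta, P, [u, 1.0], 2.0 * u + 1.0)
```

    Each update costs a few vector operations, which is why such recursive schemes (and the RVM's online variant in the paper) suit in-car, sample-by-sample adaptation.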

  15. Systematic Assessment of Neutron and Gamma Backgrounds Relevant to Operational Modeling and Detection Technology Implementation

    Energy Technology Data Exchange (ETDEWEB)

    Archer, Daniel E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hornback, Donald Eric [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Johnson, Jeffrey O. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Nicholson, Andrew D. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Patton, Bruce W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Peplow, Douglas E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Miller, Thomas Martin [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Ayaz-Maierhafer, Birsen [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-01-01

    This report summarizes the findings of a two year effort to systematically assess neutron and gamma backgrounds relevant to operational modeling and detection technology implementation. The first year effort focused on reviewing the origins of background sources and their impact on measured rates in operational scenarios of interest. The second year has focused on the assessment of detector and algorithm performance as they pertain to operational requirements against the various background sources and background levels.

  16. Generative complexity of Gray-Scott model

    Science.gov (United States)

    Adamatzky, Andrew

    2018-03-01

    In the Gray-Scott reaction-diffusion system one reactant is constantly fed into the system, while another reactant is reproduced by consuming the supplied reactant and is also converted to an inert product. The rate of feeding one reactant into the system and the rate of removing the other from the system determine the configurations of the concentration profiles: stripes, spots, waves. We calculate the generative complexity, a morphological complexity of concentration profiles grown from a point-wise perturbation of the medium, of the Gray-Scott system for a range of feeding and removal rates. The morphological complexity is evaluated using Shannon entropy, Simpson diversity, an approximation of Lempel-Ziv complexity, and expressivity (Shannon entropy divided by space-filling). We analyse the behaviour of the systems with the highest values of generative morphological complexity and show that the Gray-Scott systems expressing the highest levels of complexity are composed of wave-fragments (similar to wave-fragments in sub-excitable media) and travelling localisations (similar to quasi-dissipative solitons and gliders in Conway's Game of Life).
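    Two of the measures named above can be sketched directly on a discretized concentration profile. The toy configurations are invented; in the paper these measures are applied to profiles grown by the Gray-Scott dynamics from a point-wise perturbation.

```python
import math
from collections import Counter

def shannon_entropy(cells):
    """Shannon entropy (bits) of the distribution of discretized cell states."""
    counts = Counter(cells)
    n = len(cells)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def simpson_diversity(cells):
    """Simpson diversity: probability that two random cells differ in state."""
    counts = Counter(cells)
    n = len(cells)
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

# A uniform profile has zero complexity under both measures; a mixed
# stripes/spots-like configuration scores higher.
uniform = [0] * 64
mixed = [0, 1, 2, 1] * 16
```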

  17. Model complexity and choice of model approaches for practical simulations of CO2 injection, migration, leakage and long-term fate

    Energy Technology Data Exchange (ETDEWEB)

    Celia, Michael A. [Princeton Univ., NJ (United States)

    2016-12-30

    This report documents the accomplishments achieved during the project titled “Model complexity and choice of model approaches for practical simulations of CO2 injection, migration, leakage and long-term fate” funded by the US Department of Energy, Office of Fossil Energy. The objective of the project was to investigate modeling approaches of various levels of complexity relevant to geologic carbon storage (GCS) modeling, with the goal of establishing guidelines on the choice of modeling approach.

  18. Microscale Synthesis, Reactions, and 1H NMR Spectroscopic Investigations of Square Planar Macrocyclic, Tetraamido-N Co(III) Complexes Relevant to Green Chemistry

    Science.gov (United States)

    Watson, Tanya T.; Uffelman, Erich S.; Lee, Daniel W., III; Doherty, Jonathan R.; Schulze, Carl; Burke, Amy L.; Bonnema, Kristen R.

    2004-01-01

    The microscale preparation, characterization, and reactivity of a square planar Co(III) complex are presented; the experiments have grown out of a program to introduce experiments of relevance to green chemistry into the undergraduate curriculum. The experiments illustrate the remarkable redox and aqueous acid-base stability that make the macrocycles very…

  19. Theories, models and frameworks used in capacity building interventions relevant to public health: a systematic review.

    Science.gov (United States)

    Bergeron, Kim; Abdi, Samiya; DeCorby, Kara; Mensah, Gloria; Rempel, Benjamin; Manson, Heather

    2017-11-28

    There is limited research on capacity building interventions that include theoretical foundations. The purpose of this systematic review is to identify the underlying theories, models and frameworks used to support capacity building interventions relevant to public health practice. The aim is to inform and improve the capacity building practices and services offered by public health organizations. Four search strategies were used: 1) electronic database searching; 2) reference lists of included papers; 3) key informant consultation; and 4) grey literature searching. Inclusion and exclusion criteria are outlined. Included papers focus on capacity building, learning plans, or professional development plans in combination with tools, resources, processes, procedures, steps, models, frameworks or guidelines; are set in public health, healthcare, or non-government, government, or community organizations as they relate to healthcare; and explicitly or implicitly mention a theory, model and/or framework that grounds the type of capacity building approach developed. Quality assessments were performed on all included articles. Data analysis included a process for synthesizing, analyzing and presenting descriptive summaries, and for categorizing theoretical foundations according to which theory, model and/or framework was used and whether it was implied or explicitly identified. Nineteen articles were included in this review. A total of 28 theories, models and frameworks were identified. Of this number, two theories (Diffusion of Innovations and Transformational Learning), two models (Ecological and Interactive Systems Framework for Dissemination and Implementation) and one framework (Bloom's Taxonomy of Learning) were identified as the most frequently cited. This review identifies specific theories, models and frameworks to support capacity building interventions relevant to public health organizations. It provides public health practitioners…

  20. Direct nano ESI time-of-flight mass spectrometric investigations on lanthanide BTP complexes in the extraction-relevant diluent 1-octanol

    International Nuclear Information System (INIS)

    Steppert, M.; Walther, C.; Geist, A.; Fanghanel, Th.

    2009-01-01

    The present work focuses on investigations of a highly selective ligand for Am(III)/Ln(III) separation: bis-triazinyl-pyridine (BTP). By means of nano-electrospray mass spectrometry, the complex formation of BTP with selected elements of the lanthanide series is investigated. We show that the diluent drastically influences complex speciation. Measurements obtained in the extraction-relevant diluent 1-octanol show the occurrence of Ln(BTP)i (i = 1-3) species in different relative abundances, depending on the lanthanide used. Here, the relative abundances of the Ln(BTP)3 complexes correlate with the distribution ratios for extraction to the organic phase of the respective lanthanide. (authors)

  1. Model fit versus biological relevance: Evaluating photosynthesis-temperature models for three tropical seagrass species

    OpenAIRE

    Matthew P. Adams; Catherine J. Collier; Sven Uthicke; Yan X. Ow; Lucas Langlois; Katherine R. O’Brien

    2017-01-01

    When several models can describe a biological process, the equation that best fits the data is typically considered the best. However, models are most useful when they also possess biologically meaningful parameters. In particular, model parameters should be stable, physically interpretable, and transferable to other contexts, e.g. for direct indication of system state, or for usage in other model types. As an example of implementing these recommended requirements for model parameters, we evaluat...

  2. Family influences on mania-relevant cognitions and beliefs: a cognitive model of mania and reward.

    Science.gov (United States)

    Chen, Stephen H; Johnson, Sheri L

    2012-07-01

    The present study proposed and tested a cognitive model of mania and reward. Undergraduates (N = 284; 68.4% female; mean age = 20.99 years, standard deviation ± 3.37) completed measures of family goal setting and achievement values, personal reward-related beliefs, cognitive symptoms of mania, and risk for mania. Correlational analyses and structural equation modeling supported two distinct, but related facets of mania-relevant cognition: stably present reward-related beliefs and state-dependent cognitive symptoms in response to success and positive emotion. Results also indicated that family emphasis on achievement and highly ambitious extrinsic goals were associated with these mania-relevant cognitions. Finally, controlling for other factors, cognitive symptoms in response to success and positive emotion were uniquely associated with lifetime propensity towards mania symptoms. Results support the merit of distinguishing between facets of mania-relevant cognition and the importance of the family in shaping both aspects of cognition. © 2012 Wiley Periodicals, Inc.

  3. Hierarchical Bayesian nonparametric mixture models for clustering with variable relevance determination.

    Science.gov (United States)

    Yau, Christopher; Holmes, Chris

    2011-07-01

    We propose a hierarchical Bayesian nonparametric mixture model for clustering when some of the covariates are assumed to be of varying relevance to the clustering problem. This can be thought of as an issue in variable selection for unsupervised learning. We demonstrate that by defining a hierarchical, population-based nonparametric prior on the cluster locations scaled by the inverse covariance matrices of the likelihood we arrive at a 'sparsity prior' representation which admits a conditionally conjugate prior. This allows us to perform full Gibbs sampling to obtain posterior distributions over parameters of interest, including an explicit measure of each covariate's relevance and a distribution over the number of potential clusters present in the data. This also allows for individual cluster-specific variable selection. We demonstrate improved inference on a number of canonical problems.

  4. New Perspectives on Rodent Models of Advanced Paternal Age: Relevance to Autism

    Directory of Open Access Journals (Sweden)

    Claire J Foldi

    2011-06-01

    Offspring of older fathers have an increased risk of various adverse health outcomes, including autism and schizophrenia. With respect to biological mechanisms for this association, there are many more germline cell divisions in the life history of a sperm relative to that of an oocyte. This leads to more opportunities for copy-error mutations in germ cells from older fathers. Evidence also suggests that epigenetic patterning in the sperm of older men is altered. Rodent models provide an experimental platform for examining the association between paternal age and brain development. Several rodent models of advanced paternal age (APA) have been published with relevance to intermediate phenotypes related to autism. All four published APA models vary in key features, creating a lack of consistency with respect to behavioural phenotypes. A consideration of the common phenotypes that emerge from these APA-related mouse models may be informative in the exploration of the molecular and neurobiological correlates of APA.

  5. Modeling Complex Workflow in Molecular Diagnostics

    Science.gov (United States)

    Gomah, Mohamed E.; Turley, James P.; Lu, Huimin; Jones, Dan

    2010-01-01

    One of the hurdles to achieving personalized medicine has been implementing the laboratory processes for performing and reporting complex molecular tests. The rapidly changing test rosters and complex analysis platforms in molecular diagnostics have meant that many clinical laboratories still use labor-intensive manual processing and testing without the level of automation seen in high-volume chemistry and hematology testing. We provide here a discussion of design requirements and the results of implementation of a suite of lab management tools that incorporate the many elements required for use of molecular diagnostics in personalized medicine, particularly in cancer. These applications provide the functionality required for sample accessioning and tracking, material generation, and testing that are particular to the evolving needs of individualized molecular diagnostics. On implementation, the applications described here resulted in improvements in the turn-around time for reporting of more complex molecular test sets, and significant changes in the workflow. Therefore, careful mapping of workflow can permit design of software applications that simplify even the complex demands of specialized molecular testing. By incorporating design features for order review, software tools can permit a more personalized approach to sample handling and test selection without compromising efficiency. PMID:20007844

  6. Complex systems modeling by cellular automata

    NARCIS (Netherlands)

    Kroc, J.; Sloot, P.M.A.; Rabuñal Dopico, J.R.; Dorado de la Calle, J.; Pazos Sierra, A.

    2009-01-01

    In recent years, the notion of complex systems proved to be a very useful concept to define, describe, and study various natural phenomena observed in a vast number of scientific disciplines. Examples of scientific disciplines that highly benefit from this concept range from physics, mathematics,

  7. Modeling pitch perception of complex tones

    NARCIS (Netherlands)

    Houtsma, A.J.M.

    1986-01-01

    When one listens to a series of harmonic complex tones that have no acoustic energy at their fundamental frequencies, one usually still hears a melody that corresponds to those missing fundamentals. Since it has become evident some two decades ago that neither Helmholtz's difference tone theory nor

  8. Use of an ecologically relevant modelling approach to improve remote sensing-based schistosomiasis risk profiling

    Directory of Open Access Journals (Sweden)

    Yvonne Walz

    2015-11-01

    Schistosomiasis is a widespread water-based disease that puts close to 800 million people at risk of infection, with more than 250 million infected, mainly in sub-Saharan Africa. Transmission is governed by the spatial distribution of specific freshwater snails that act as intermediate hosts and by the frequency, duration and extent of human contact with infested water sources. Remote sensing data have been utilized for spatially explicit risk profiling of schistosomiasis. However, such risk profiling inherits a conceptual drawback if school-based disease prevalence data are related directly to remote sensing measurements extracted at the location of the school, because disease transmission usually does not occur at the school itself. We therefore took the local environment around the schools into account by explicitly linking ecologically relevant environmental information of potential disease transmission sites to survey measurements of disease prevalence. Our models were validated at two sites with different landscapes in Côte d’Ivoire using high- and moderate-resolution remote sensing data based on random forest and partial least squares regression. We found that the ecologically relevant modelling approach explained up to 70% of the variation in Schistosoma infection prevalence and performed better than a purely pixel-based modelling approach. Furthermore, our study showed that model performance increased as a function of enlarging the school catchment area, confirming the hypothesis that suitable environments for schistosomiasis transmission rarely occur at the location of survey measurements.
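The comparison described above can be sketched in a few lines: a random forest regressor scored by cross-validation on environmental predictors aggregated over a school catchment buffer rather than a single pixel. Everything below (layer count, sample size, coefficients) is a synthetic illustration, not the study's data.

```python
# Hedged sketch: predict infection prevalence from catchment-averaged
# remote-sensing layers with a random forest, scored by cross-validated R^2.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n_schools = 80
env = rng.normal(size=(n_schools, 5))               # catchment-averaged layers
prevalence = 0.4 * env[:, 0] - 0.2 * env[:, 1] + rng.normal(0, 0.3, n_schools)

rf = RandomForestRegressor(n_estimators=200, random_state=0)
r2 = cross_val_score(rf, env, prevalence, cv=5, scoring="r2").mean()
print(round(r2, 2))   # explained-variance metric, analogous to the paper's
```

In the study, the features would be the remote-sensing measurements averaged over buffers of increasing radius around each school, so the same scoring loop can be repeated per buffer size.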

  9. Model fit versus biological relevance: Evaluating photosynthesis-temperature models for three tropical seagrass species.

    Science.gov (United States)

    Adams, Matthew P; Collier, Catherine J; Uthicke, Sven; Ow, Yan X; Langlois, Lucas; O'Brien, Katherine R

    2017-01-04

    When several models can describe a biological process, the equation that best fits the data is typically considered the best. However, models are most useful when they also possess biologically-meaningful parameters. In particular, model parameters should be stable, physically interpretable, and transferable to other contexts, e.g. for direct indication of system state, or usage in other model types. As an example of implementing these recommended requirements for model parameters, we evaluated twelve published empirical models for temperature-dependent tropical seagrass photosynthesis, based on two criteria: (1) goodness of fit, and (2) how easily biologically-meaningful parameters can be obtained. All models were formulated in terms of parameters characterising the thermal optimum (Topt) for maximum photosynthetic rate (Pmax). These parameters indicate the upper thermal limits of seagrass photosynthetic capacity, and hence can be used to assess the vulnerability of seagrass to temperature change. Our study exemplifies an approach to model selection which optimises the usefulness of empirical models for both modellers and ecologists alike.

  10. Model fit versus biological relevance: Evaluating photosynthesis-temperature models for three tropical seagrass species

    Science.gov (United States)

    Adams, Matthew P.; Collier, Catherine J.; Uthicke, Sven; Ow, Yan X.; Langlois, Lucas; O'Brien, Katherine R.

    2017-01-01

    When several models can describe a biological process, the equation that best fits the data is typically considered the best. However, models are most useful when they also possess biologically-meaningful parameters. In particular, model parameters should be stable, physically interpretable, and transferable to other contexts, e.g. for direct indication of system state, or usage in other model types. As an example of implementing these recommended requirements for model parameters, we evaluated twelve published empirical models for temperature-dependent tropical seagrass photosynthesis, based on two criteria: (1) goodness of fit, and (2) how easily biologically-meaningful parameters can be obtained. All models were formulated in terms of parameters characterising the thermal optimum (Topt) for maximum photosynthetic rate (Pmax). These parameters indicate the upper thermal limits of seagrass photosynthetic capacity, and hence can be used to assess the vulnerability of seagrass to temperature change. Our study exemplifies an approach to model selection which optimises the usefulness of empirical models for both modellers and ecologists alike.
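The parameter requirements discussed above can be made concrete by fitting one simple empirical photosynthesis-temperature model. A Gaussian response curve is used here purely as an illustration (one plausible form, not necessarily among the twelve the study evaluated); its fitted parameters are directly the biologically meaningful Pmax and Topt. The data are synthetic.

```python
# Fit a Gaussian photosynthesis-temperature curve so that the fitted
# parameters are directly Pmax (peak rate) and Topt (thermal optimum).
import numpy as np
from scipy.optimize import curve_fit

def gaussian_pt(temp, p_max, t_opt, width):
    """Photosynthetic rate as a Gaussian function of temperature."""
    return p_max * np.exp(-((temp - t_opt) ** 2) / (2 * width ** 2))

# Synthetic "measurements" around a 30 degC optimum with small noise.
temps = np.linspace(15, 40, 11)
rates = gaussian_pt(temps, 12.0, 30.0, 5.0) \
        + np.random.default_rng(1).normal(0, 0.2, 11)

(p_max, t_opt, width), _ = curve_fit(gaussian_pt, temps, rates, p0=[10, 25, 4])
print(round(p_max, 1), round(t_opt, 1))   # recovered Pmax and Topt
```

Because the fitted quantities are themselves the quantities of ecological interest, no post-hoc algebra is needed to extract them, which is exactly the "easily obtained biologically-meaningful parameters" criterion.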

  11. Filtered selection coupled with support vector machines generate a functionally relevant prediction model for colorectal cancer

    Directory of Open Access Journals (Sweden)

    Gabere MN

    2016-06-01

    Musa Nur Gabere,1 Mohamed Aly Hussein,1 Mohammad Azhar Aziz2 1Department of Bioinformatics, King Abdullah International Medical Research Center/King Saud bin Abdulaziz University for Health Sciences, Riyadh, Saudi Arabia; 2Colorectal Cancer Research Program, Department of Medical Genomics, King Abdullah International Medical Research Center, Riyadh, Saudi Arabia. Purpose: There has been considerable interest in using whole-genome expression profiles for the classification of colorectal cancer (CRC). The selection of important features is a crucial step before training a classifier. Methods: In this study, we built a model that uses a support vector machine (SVM) to classify cancer and normal samples using Affymetrix exon microarray data obtained from 90 samples of 48 patients diagnosed with CRC. From the 22,011 genes, we selected the 20, 30, 50, 100, 200, 300, and 500 genes most relevant to CRC using the minimum-redundancy-maximum-relevance (mRMR) technique. With these gene sets, an SVM model was designed using four different kernel types (linear, polynomial, radial basis function [RBF], and sigmoid). Results: The best model, which used 30 genes and the RBF kernel, outperformed other combinations; it had an accuracy of 84% for both tenfold and leave-one-out cross-validations in discriminating the cancer samples from the normal samples. With this 30-gene set from mRMR, six classifiers were trained using random forest (RF), Bayes net (BN), multilayer perceptron (MLP), naïve Bayes (NB), reduced error pruning tree (REPT), and SVM. Two hybrids, mRMR + SVM and mRMR + BN, were the best models when tested on other datasets, achieving prediction accuracies of 95.27% and 91.99%, respectively, compared to the other mRMR hybrid models (mRMR + RF, mRMR + NB, mRMR + REPT, and mRMR + MLP). Ingenuity pathway analysis was used to analyze the functions of the 30 genes selected for this model and their potential association with CRC: CDH3, CEACAM7, CLDN1, IL8, IL6R, MMP1
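The pipeline described (select a small gene subset, then train an RBF-kernel SVM and score it with cross-validation) can be sketched with scikit-learn. Since mRMR is not part of scikit-learn, a mutual-information ranking stands in for it here, and the data are synthetic, not the exon microarray data.

```python
# Sketch of the paper's pipeline: feature selection + RBF-kernel SVM,
# scored by ten-fold cross-validation. Mutual information substitutes
# for mRMR; the dataset is synthetic.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in: 90 samples, many "genes", as in the study's sample size.
X, y = make_classification(n_samples=90, n_features=500, n_informative=30,
                           random_state=0)

pipe = make_pipeline(
    StandardScaler(),
    SelectKBest(mutual_info_classif, k=30),   # keep the 30 top-ranked features
    SVC(kernel="rbf"),                        # RBF kernel, as in the best model
)
scores = cross_val_score(pipe, X, y, cv=10)   # ten-fold cross-validation
print(round(scores.mean(), 2))
```

Putting the selector inside the pipeline matters: it is refit on each training fold, so the test fold never leaks into the gene ranking.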

  12. Decision-relevant evaluation of climate models: A case study of chill hours in California

    Science.gov (United States)

    Jagannathan, K. A.; Jones, A. D.; Kerr, A. C.

    2017-12-01

    The past decade has seen a proliferation of different climate datasets, with over 60 climate models currently in use. Comparative evaluation and validation of models can assist practitioners in choosing the most appropriate models for adaptation planning. However, such assessments are usually conducted for 'climate metrics' such as seasonal temperature, while sectoral decisions are often based on 'decision-relevant outcome metrics' such as growing degree days or chill hours. Since climate models predict different metrics with varying skill, the goal of this research is to conduct a bottom-up evaluation of model skill for 'outcome-based' metrics. Using chill hours (the number of hours in winter months during which temperature is less than 45 °F) in Fresno, CA as a case, we assess how well different GCMs predict the historical mean and slope of chill hours, and whether and to what extent projections differ based on model selection. We then compare our results with other climate-based evaluations of the region, to identify similarities and differences. For the model skill evaluation, historically observed chill hours were compared with simulations from 27 GCMs (and multiple ensembles). Model skill scores were generated based on a statistical hypothesis test of the comparative assessment. Future projections from RCP 8.5 runs were evaluated, and a simple bias correction was also conducted. Our analysis indicates that model skill in predicting chill hour slope is dependent on skill in predicting mean chill hours, which results from the non-linear nature of the chill metric. However, there was no clear relationship between the models that performed well for the chill hour metric and those that performed well in other temperature-based evaluations (such as winter minimum temperature or diurnal temperature range). Further, contrary to conclusions from other studies, we also found that the multi-model mean or large ensemble mean results may not always be most appropriate for this
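The outcome metric itself is simple to compute, which is part of why such bottom-up evaluations are feasible: given hourly temperatures and their months, chill hours are just a masked count. The series below is synthetic, not Fresno data.

```python
# Chill hours: count of hours below 45 degF during winter months,
# the decision-relevant metric the study evaluates GCMs against.
import numpy as np

def chill_hours(temps_f, months, winter=(11, 12, 1, 2)):
    """Count hours with temperature < 45 degF during winter months."""
    temps_f = np.asarray(temps_f, dtype=float)
    months = np.asarray(months)
    in_winter = np.isin(months, winter)
    return int(np.sum((temps_f < 45.0) & in_winter))

# Synthetic example: 24 hourly readings in January, 24 in June.
temps = [40, 44, 46, 50] * 12
months = [1] * 24 + [6] * 24
print(chill_hours(temps, months))  # → 12 (only January hours below 45 degF count)
```

Evaluating a GCM against this metric then means running the same function on observed and simulated hourly series and comparing the resulting means and trends.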

  13. Looking for a relevant potential evapotranspiration model at the watershed scale

    Science.gov (United States)

    Oudin, L.; Hervieu, F.; Michel, C.; Perrin, C.; Anctil, F.; Andréassian, V.

    2003-04-01

    In this paper, we try to identify the most relevant approach to calculating potential evapotranspiration (PET) for use in a daily watershed model, in order to answer the following question: "how can we use commonly available atmospheric parameters to represent the evaporative demand at the catchment scale?". Hydrologists generally regard the Penman model as the ideal model, owing to its good agreement with lysimeter measurements and its physically based formulation. However, in real-world engineering situations, where meteorological stations are scarce, hydrologists are often constrained to use other PET formulae with lower data requirements and/or long-term averages of PET values (the rationale being that PET is an inherently conservative variable). We chose to test 28 commonly used PET models coupled with 4 different daily watershed models. For each test, we compare both PET input options: actual data and long-term average data. The comparison is made in terms of streamflow simulation efficiency over a large sample of 308 watersheds. The watersheds are located in France, Australia and the United States of America and represent varied climates. Strikingly, we find no systematic improvement of watershed model efficiency when using actual PET series instead of long-term averages. This suggests either that watershed models may not conveniently use the climatic information contained in PET values or that formulae are only awkward indicators of the real PET that watershed models need.
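A minimal temperature-based PET formula of the kind compared in such studies can be written in a few lines. The form below follows the simple formula later proposed by Oudin et al. (2005), PET = Ra/(λρ)·(Ta+5)/100 for Ta+5 > 0; the extraterrestrial-radiation value is an assumed constant here, whereas in practice it varies with latitude and day of year.

```python
# Simple temperature-based daily PET (mm/day), in the form of the
# Oudin et al. (2005) formula; ra_mj_m2_day is an assumed constant.
def pet_mm_per_day(temp_c, ra_mj_m2_day=25.0):
    """Daily PET (mm/day) from mean air temperature; zero below -5 degC."""
    lam, rho = 2.45, 1000.0   # latent heat (MJ/kg), water density (kg/m3)
    if temp_c + 5.0 <= 0.0:
        return 0.0
    # Ra/(lam*rho) is in m/day; multiply by 1000 for mm/day.
    return 1000.0 * ra_mj_m2_day / (lam * rho) * (temp_c + 5.0) / 100.0

print(round(pet_mm_per_day(15.0), 2))  # → 2.04
```

Formulae like this need only air temperature, which is why they remain usable where the full radiation, humidity and wind data required by Penman are unavailable.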

  14. The utility of Earth system Models of Intermediate Complexity

    NARCIS (Netherlands)

    Weber, S.L.

    2010-01-01

    Intermediate-complexity models are models which describe the dynamics of the atmosphere and/or ocean in less detail than conventional General Circulation Models (GCMs). At the same time, they go beyond the approach taken by atmospheric Energy Balance Models (EBMs) or ocean box models by

  15. Advances in dynamic network modeling in complex transportation systems

    CERN Document Server

    Ukkusuri, Satish V

    2013-01-01

    This book focuses on the latest in dynamic network modeling, including route guidance and traffic control in transportation systems and other complex infrastructure networks. Covers dynamic traffic assignment, flow modeling, mobile sensor deployment and more.

  16. Crack propagation rate modelling for 316SS exposed to PWR-relevant conditions

    International Nuclear Information System (INIS)

    Vankeerberghen, M.; Weyns, G.; Gavrilov, S.; Martens, B.; Deconinck, J.

    2009-01-01

    The crack propagation rate of Type 316 stainless steel in boric acid-lithium hydroxide solutions under PWR-relevant conditions was modelled. A film rupture/dissolution/repassivation mechanism is assumed and extended to cold-worked materials by including a stress-dependent bare-metal dissolution current density. The chemical and electrochemical conditions within the crack are calculated by finite element calculations, an analytical expression is used for the crack-tip strain rate, and the crack-tip stress is assumed equal to 2.5 times the yield stress (plane strain). First, the model was calibrated against a data set published in the literature. Afterwards, the influence of various variables (dissolved hydrogen, boric acid and lithium hydroxide content, stress intensity, crack length, temperature, flow rate) was studied. Finally, other published crack growth rate tests were modelled, and the calculated crack growth rates were found to be in reasonable agreement with the reported ones.

  17. The relevance of existing health communication models in the email age: An

    Science.gov (United States)

    Fage-Butler, Antoinette Mary; Jensen, Matilde Nisbeth

    2015-01-01

    Email communication is being integrated relatively slowly into doctor–patient communication. Patients have expressed enthusiasm for the medium, while doctors are generally more reluctant. As existing health communication models have characteristically assumed the co-presence of doctor and patient and primarily reflect medical practitioners’ perspectives, their suitability in relation to email communication and patients’ perspectives warrants further investigation. Following a two-step process and using the methodology of the integrative literature review, 29 articles from 2004–2014 are analysed with the aim of investigating the advantages and disadvantages of the medium of email from the patient’s perspective. The findings are compared to the health communication models of biomedicine, patient-centeredness, patient education and patient empowerment to investigate these models’ relevance for doctor–patient email communication. Results show that patients identify numerous advantages with email communication, including improved convenience and access, more detailed informational exchanges, greater reflection opportunities, freedom from the medical gaze and the potential to level out power imbalances, as well as a number of primarily medium-related disadvantages. The findings indicate that email can counteract some of the communicative problems associated with biomedicine and suggest the ongoing relevance of aspects of the models of patient empowerment, patient-centeredness and patient education for email communication.

  18. Relevant Criteria for Testing the Quality of Models for Turbulent Wind Speed Fluctuations

    DEFF Research Database (Denmark)

    Frandsen, Sten Tronæs; Ejsing Jørgensen, Hans; Sørensen, John Dalsgaard

    2008-01-01

    Seeking relevant criteria for testing the quality of turbulence models, the scale of turbulence and the gust factor have been estimated from data and compared with predictions from first-order models of these two quantities. It is found that the mean of the measured length scales is approximately 10% smaller than the IEC model for wind turbine hub height levels. The mean is only marginally dependent on trends in time series. It is also found that the coefficient of variation of the measured length scales is about 50%. 3 s and 10 s preaveraging of wind speed data are relevant for megawatt-size wind turbines when seeking wind characteristics that correspond to one blade and the entire rotor, respectively. For heights exceeding 50-60 m, the gust factor increases with wind speed. For heights larger than 60-80 m, present assumptions on the value of the gust factor are significantly…
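The two operations mentioned above (pre-averaging a wind-speed series over a short window and forming the gust factor as the peak of the averaged series over the record mean) can be sketched as follows; the 1 Hz series is synthetic, and the 3-sample window stands in for 3 s pre-averaging.

```python
# Gust factor from a 1 Hz wind-speed series after moving-average
# pre-averaging; series and parameters are illustrative only.
import numpy as np

def gust_factor(speeds, window):
    """Max of the `window`-sample moving average divided by the record mean."""
    speeds = np.asarray(speeds, dtype=float)
    kernel = np.ones(window) / window
    smoothed = np.convolve(speeds, kernel, mode="valid")  # pre-averaging
    return smoothed.max() / speeds.mean()

rng = np.random.default_rng(0)
u = 10.0 + rng.normal(0.0, 1.5, size=600)   # 10 min of synthetic 1 Hz data
print(round(gust_factor(u, window=3), 2))
```

A longer window (e.g. 10 samples for 10 s) smooths more of the turbulence away, which is why the rotor-scale gust factor is computed with the longer pre-averaging time.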

  19. Construct Relevant or Irrelevant? The Role of Linguistic Complexity in the Assessment of English Language Learners' Science Knowledge

    Science.gov (United States)

    Avenia-Tapper, Brianna; Llosa, Lorena

    2015-01-01

    This article addresses the issue of language-related construct-irrelevant variance on content area tests from the perspective of systemic functional linguistics. We propose that the construct relevance of language used in content area assessments, and consequent claims of construct-irrelevant variance and bias, should be determined according to…

  20. Promoting culturally competent chronic pain management using the clinically relevant continuum model.

    Science.gov (United States)

    Monsivais, Diane B

    2011-06-01

    This article reviews the culture of biomedicine and current practices in pain management education, which often merge to create a hostile environment for effective chronic pain care. Areas of cultural tensions in chronic pain frequently involve the struggle to achieve credibility regarding one's complaints of pain (or being believed that the pain is real) and complying with pain medication protocols. The clinically relevant continuum model is presented as a framework allowing providers to approach care from an evidence-based, culturally appropriate (patient centered) perspective that takes into account the highest level of evidence available, provider expertise, and patient preferences and values.

  1. Narrowing the gap between network models and real complex systems

    OpenAIRE

    Viamontes Esquivel, Alcides

    2014-01-01

    Simple network models that focus only on graph topology or, at best, basic interactions are often insufficient to capture all the aspects of a dynamic complex system. In this thesis, I explore those limitations, and some concrete methods of resolving them. I argue that, in order to succeed at interpreting and influencing complex systems, we need to take into account slightly more complex parts, interactions and information flows in our models. This thesis supports that affirmation with five a...

  2. Uncertainty and Complexity in Mathematical Modeling

    Science.gov (United States)

    Cannon, Susan O.; Sanders, Mark

    2017-01-01

    Modeling is an effective tool to help students access mathematical concepts. Finding a math teacher who has not drawn a fraction bar or pie chart on the board would be difficult, as would finding students who have not been asked to draw models and represent numbers in different ways. In this article, the authors will discuss: (1) the properties of…

  3. Information, complexity and efficiency: The automobile model

    Energy Technology Data Exchange (ETDEWEB)

    Allenby, B. [Lucent Technologies (United States); Lawrence Livermore National Lab., CA (United States)]

    1996-08-08

    The new, rapidly evolving field of industrial ecology - the objective, multidisciplinary study of industrial and economic systems and their linkages with fundamental natural systems - provides strong ground for believing that a more environmentally and economically efficient economy will be more information intensive and complex. Information and intellectual capital will be substituted for the more traditional inputs of materials and energy in producing a desirable, yet sustainable, quality of life. While at this point this remains a strong hypothesis, the evolution of the automobile industry can be used to illustrate how such substitution may, in fact, already be occurring in an environmentally and economically critical sector.

  4. Control Relevant Modeling and Design of Scramjet-Powered Hypersonic Vehicles

    Science.gov (United States)

    Dickeson, Jeffrey James

    This report provides an overview of scramjet-powered hypersonic vehicle modeling and control challenges. Such vehicles are characterized by unstable non-minimum phase dynamics with significant coupling and low thrust margins. Recent trends in hypersonic vehicle research are summarized. To illustrate control-relevant design issues and tradeoffs, a generic nonlinear 3DOF longitudinal dynamics model capturing aero-elastic-propulsive interactions for a wedge-shaped vehicle is used. Limitations of the model are discussed and numerous modifications have been made to address control-relevant needs. Two different baseline configurations are examined over a two-stage-to-orbit ascent trajectory. The report highlights how vehicle level-flight static (trim) and dynamic properties change over the trajectory. Thermal choking constraints are imposed on control system design as a direct consequence of having a finite FER margin. The implications of this state-dependent nonlinear FER margin constraint, the right half-plane (RHP) zero, and lightly damped flexible modes for control system bandwidth (BW) and FPA tracking are discussed. A control methodology has been proposed that addresses the above dynamics while providing some robustness to modeling uncertainty. Vehicle closure (the ability to fly a trajectory segment subject to constraints) is provided through a proposed vehicle design methodology. The design method attempts to use open-loop metrics whenever possible to design the vehicle. The design method is applied to a vehicle/control law closed-loop nonlinear simulation for validation. The 3DOF longitudinal modeling results are validated against a newly released NASA 6DOF code.
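The bandwidth limitation from a right half-plane zero mentioned above can be seen in the step response of any non-minimum-phase system: the output initially moves the wrong way before settling. The transfer function below is a generic second-order example chosen for illustration, not the vehicle model from the report.

```python
# Step response of a system with an RHP zero at s = +2: initial
# undershoot, then settling to the positive DC gain. Generic example.
import numpy as np
from scipy import signal

sys = signal.TransferFunction([-1, 2], [1, 3, 2])   # (-s + 2) / (s^2 + 3s + 2)
t, y = signal.step(sys, T=np.linspace(0, 6, 300))
print(bool(y[:20].min() < 0), bool(y[-1] > 0))       # undershoot, positive settle
```

That inverse response is what fundamentally caps achievable closed-loop bandwidth: pushing the loop faster amplifies the wrong-way transient, independent of the controller chosen.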

  5. Realistic modelling of observed seismic motion in complex sedimentary basins

    International Nuclear Information System (INIS)

    Faeh, D.; Panza, G.F.

    1994-03-01

    Three applications of a numerical technique are illustrated to model realistically the seismic ground motion for complex two-dimensional structures. First, we consider a sedimentary basin in the Friuli region, and we model strong motion records from an aftershock of the 1976 earthquake. Then we simulate the ground motion caused in Rome by the 1915 Fucino (Italy) earthquake, and we compare our modelling with the damage distribution observed in the town. Finally, we deal with the interpretation of ground motion recorded in Mexico City as a consequence of earthquakes in the Mexican subduction zone. The synthetic signals explain the major characteristics (relative amplitudes, spectral amplification, frequency content) of the considered seismograms, and the space distribution of the available macroseismic data. For the sedimentary basin in the Friuli area, parametric studies demonstrate the relevant sensitivity of the computed ground motion to small changes in the subsurface topography of the sedimentary basin, and in the velocity and quality factor of the sediments. The total energy of ground motion, determined from our numerical simulation in Rome, is in very good agreement with the distribution of damage observed during the Fucino earthquake. For epicentral distances in the range 50 km to 100 km, the source location, and not only the local soil conditions, controls the local effects. For Mexico City, the observed ground motion can be explained as resonance effects and as excitation of local surface waves, and the theoretical and the observed maximum spectral amplifications are very similar. In general, our numerical simulations permit the estimate of the maximum and average spectral amplification for specific sites, i.e. they are a very powerful tool for accurate micro-zonation. (author). 38 refs, 19 figs, 1 tab

  6. Modeling Power Systems as Complex Adaptive Systems

    Energy Technology Data Exchange (ETDEWEB)

    Chassin, David P.; Malard, Joel M.; Posse, Christian; Gangopadhyaya, Asim; Lu, Ning; Katipamula, Srinivas; Mallow, J V.

    2004-12-30

    Physical analogs have shown considerable promise for understanding the behavior of complex adaptive systems, including macroeconomics, biological systems, social networks, and electric power markets. Many of today's most challenging technical and policy questions can be reduced to a distributed economic control problem. Indeed, economically based control of large-scale systems is founded on the conjecture that the price-based regulation (e.g., auctions, markets) results in an optimal allocation of resources and emergent optimal system control. This report explores the state-of-the-art physical analogs for understanding the behavior of some econophysical systems and deriving stable and robust control strategies for using them. We review and discuss applications of some analytic methods based on a thermodynamic metaphor, according to which the interplay between system entropy and conservation laws gives rise to intuitive and governing global properties of complex systems that cannot be otherwise understood. We apply these methods to the question of how power markets can be expected to behave under a variety of conditions.

  7. Linking Complexity and Sustainability Theories: Implications for Modeling Sustainability Transitions

    Directory of Open Access Journals (Sweden)

    Camaren Peter

    2014-03-01

    In this paper, we deploy complexity theory as the foundation for integrating different theoretical approaches to sustainability and develop a rationale for a complexity-based framework for modeling transitions to sustainability. We propose a framework based on a comparison of the complex systems' properties that characterize the different theories dealing with transitions to sustainability. We argue that adopting a complexity-theory-based approach to modeling transitions requires going beyond deterministic frameworks, by adopting a probabilistic, integrative, inclusive and adaptive approach that can support transitions. We also illustrate how this complexity-based modeling framework can be implemented, i.e., how it can be used to select modeling techniques that address particular properties of complex systems that we need to understand in order to model transitions to sustainability. In doing so, we establish a complexity-based approach to modeling sustainability transitions that caters for the broad range of complex systems' properties required to model transitions to sustainability.

  8. Lab-on-a-brane: A novel physiologically relevant planar arterial model to study transendothelial transport

    Science.gov (United States)

    Budhwani, Karim Ismail

    The tremendous quality of life impact notwithstanding, cardiovascular diseases and cancer add up to over US$ 700bn each year in financial costs alone. Aging and population growth are expected to further expand the problem space while drug research and development remain expensive. However, preclinical costs can be substantially mitigated by substituting animal models with in vitro devices that accurately model human cardiovascular transport. Here we present a novel physiologically relevant lab-on-a-brane that simulates in vivo pressure, flow, strain, and shear waveforms associated with normal and pathological conditions in large and small blood vessels for studying molecular transport across the endothelial monolayer. The device builds upon previously demonstrated integrated microfluidic loop design by: (a) introducing nanoscale pores in the substrate membrane to enable transmembrane molecular transport, (b) transforming the substrate membrane into a nanofibrous matrix for 3D smooth muscle cell (SMC) tissue culture, (c) integrating electrospinning fabrication methods, (d) engineering an invertible sandwich cell culture device architecture, and (e) devising a healthy co-culture mechanism for human arterial endothelial cell (HAEC) monolayer and multiple layers of human smooth muscle cells (HSMC) to accurately mimic arterial anatomy. Structural and mechanical characterization was conducted using confocal microscopy, SEM, stress/strain analysis, and infrared spectroscopy. Transport was characterized using a FITC-dextran hydraulic permeability protocol. Structure and transport characterization successfully demonstrate device viability as a physiologically relevant arterial mimic for testing transendothelial transport. Thus, our lab-on-a-brane provides a highly effective and efficient, yet considerably inexpensive, physiologically relevant alternative for pharmacokinetic evaluation; possibly reducing animals used in pre-clinical testing, clinical trials cost from false
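The FITC-dextran permeability protocol mentioned above reduces, in its standard form, to an apparent-permeability calculation, Papp = (dQ/dt) / (A * C0), with flux in the receiver compartment normalized by membrane area and donor concentration. The numbers below are hypothetical, purely to show the unit bookkeeping.

```python
# Standard apparent-permeability calculation; input values are hypothetical.
def apparent_permeability(flux_ug_per_s, area_cm2, c0_ug_per_cm3):
    """Papp (cm/s) = (dQ/dt) / (A * C0); note 1 ml = 1 cm3 for C0."""
    return flux_ug_per_s / (area_cm2 * c0_ug_per_cm3)

# e.g. 2e-4 ug/s across 1 cm2 from a 100 ug/ml donor solution
print(apparent_permeability(2e-4, 1.0, 100.0))  # → 2e-06 cm/s
```

Values of that order are typical for endothelial monolayers, which is why the linear flux regime (sink conditions in the receiver) is assumed in the formula.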

  9. Review: To be or not to be an identifiable model. Is this a relevant question in animal science modelling?

    Science.gov (United States)

    Muñoz-Tamayo, R; Puillet, L; Daniel, J B; Sauvant, D; Martin, O; Taghipoor, M; Blavy, P

    2018-04-01

    What is a good (useful) mathematical model in animal science? For models constructed for prediction purposes, the question of model adequacy (usefulness) has been traditionally tackled by statistical analysis applied to observed experimental data relative to model-predicted variables. However, little attention has been paid to analytic tools that exploit the mathematical properties of the model equations. For example, in the context of model calibration, before attempting a numerical estimation of the model parameters, we might want to know if we have any chance of success in estimating a unique best value of the model parameters from available measurements. This question of uniqueness is referred to as structural identifiability; a mathematical property that is defined on the sole basis of the model structure within a hypothetical ideal experiment determined by a setting of model inputs (stimuli) and observable variables (measurements). Structural identifiability analysis applied to dynamic models described by ordinary differential equations (ODEs) is a common practice in control engineering and system identification. This analysis demands mathematical technicalities that are beyond the academic background of animal science, which might explain the lack of pervasiveness of identifiability analysis in animal science modelling. To fill this gap, in this paper we address the analysis of structural identifiability from a practitioner perspective by capitalizing on the use of dedicated software tools. Our objectives are (i) to provide a comprehensive explanation of the structural identifiability notion for the community of animal science modelling, (ii) to assess the relevance of identifiability analysis in animal science modelling and (iii) to motivate the community to use identifiability analysis in the modelling practice (when the identifiability question is relevant). We focus our study on ODE models. By using illustrative examples that include published
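The structural identifiability notion described above can be illustrated numerically with a deliberately non-identifiable toy model: for dx/dt = -(a+b)x with y = x observed, only the sum a+b is structurally identifiable, so two different (a, b) pairs produce exactly the same observable trajectory and no amount of data can separate them.

```python
# Minimal non-identifiability demonstration: dx/dt = -(a+b)x, y = x.
# Two parameter pairs with the same sum give identical observations.
import numpy as np

def simulate(a, b, x0=1.0, t=np.linspace(0, 5, 50)):
    """Observable y(t) = x(t), solved analytically."""
    return x0 * np.exp(-(a + b) * t)

y1 = simulate(a=0.3, b=0.7)   # a + b = 1.0
y2 = simulate(a=0.9, b=0.1)   # a + b = 1.0, different individual parameters
print(bool(np.allclose(y1, y2)))  # → True: a and b cannot be estimated separately
```

A calibration routine fed such a model would report some pair (a, b) with spurious confidence; detecting the problem beforehand, from the equations alone, is exactly what the dedicated identifiability software tools the paper discusses are for.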

  10. Relevance of Discretionary Accruals in Ohlson Model: the Case of Mexico

    Directory of Open Access Journals (Sweden)

    Rocío Durán-Vázquez

    2012-01-01

    This study applied the modified Jones model (1991) to selected companies in Mexico. This model aims to assess the impact of discretionary accrual information (DAI) on financial reporting statements, in order to identify the value relevance of "earnings quality". We applied the methodological criteria of Chung et al (2005) and Mukit & Iskandar (2009). We analyzed financial information for the 35 stocks included in the Index of Prices and Quotations (IPC) of the Mexican Stock Exchange (BMV) for the period 2000 to 2011. Nineteen companies met the specifications of the model, for 48 quarters of information. The analysis was done in three parts: first, an analysis of the modified Jones model under panel data considerations, using fixed effects and adjustments for autocorrelation of order 1; second, a correlation analysis between the residuals of the modified Jones model and the stock price return at three annual closings during the study period: 2007, 2008 and 2009; and third, incorporation of this variable (DAI) into the Ohlson model (of the financial and corporate accounting literature), tested with panel data analysis under fixed effects throughout the study period.
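The core mechanic behind the discretionary-accruals measure described above is a regression whose residuals are the quantity of interest: total accruals are regressed on the modified Jones model's explanatory terms, and what the regression cannot explain is taken as the discretionary part (DAI). The sketch below uses plain OLS on synthetic data with hypothetical coefficients; the study itself uses a fixed-effects panel estimator with an AR(1) correction.

```python
# OLS sketch of the modified Jones decomposition: residuals = discretionary
# accruals. Data, coefficients and scalings below are synthetic/hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n = 200
inv_assets = 1.0 / rng.uniform(100, 1000, n)       # 1 / lagged total assets
d_rev_recv = rng.normal(0.05, 0.02, n)             # (change in revenue) - (change in receivables)
ppe = rng.uniform(0.2, 0.8, n)                     # gross PP&E, scaled by assets
accruals = 0.5 * inv_assets + 0.3 * d_rev_recv - 0.1 * ppe \
           + rng.normal(0, 0.01, n)                # total accruals, scaled

X = np.column_stack([np.ones(n), inv_assets, d_rev_recv, ppe])
beta, *_ = np.linalg.lstsq(X, accruals, rcond=None)
dai = accruals - X @ beta                          # residuals = discretionary part
print(round(float(abs(dai.mean())), 6))            # ~0 by construction (intercept included)
```

In the second step of the study, these per-firm, per-period residuals are the series correlated with stock returns and then inserted as an extra explanatory variable in the Ohlson valuation regression.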

  11. More Realistic Face Model Surface Improves Relevance of Pediatric In-Vitro Aerosol Studies.

    Science.gov (United States)

    Amirav, Israel; Halamish, Asaf; Gorenberg, Miguel; Omar, Hamza; Newhouse, Michael T

    2015-01-01

    Various hard face models are commonly used to evaluate the efficiency of aerosol face masks. Softer, more realistic "face" surface materials, like skin, deform upon mask application and should provide more relevant in-vitro tests. Studies that simultaneously take into consideration many of the factors characteristic of the in vivo face are lacking. These include airways, various application forces, comparison of various devices, comparison with a hard-surface model and use of a more representative model face based on large numbers of actual faces. The aim was to compare mask-to-"face" seal and aerosol delivery of two pediatric masks using a soft versus a hard, appropriately representative, pediatric face model under various applied forces. Two identical face models and upper airway replicas were constructed, the only difference being the suppleness and compressibility of the surface layer of the "face." The integrity of the seal and aerosol delivery of two different masks [AeroChamber (AC) and SootherMask (SM)] were compared using a breath simulator, filter collection and realistic applied forces. The soft "face" significantly increased the delivery efficiency and the sealing characteristics of both masks. Aerosol delivery with the soft "face" was significantly greater for the SM than for the AC, whereas no significant difference between the masks was observed with the hard "face." The material and pliability of the model "face" surface have a significant influence on both the seal and the delivery efficiency of face masks. This finding should be taken into account during in-vitro aerosol studies.

  12. Mathematical modeling and optimization of complex structures

    CERN Document Server

    Repin, Sergey; Tuovinen, Tero

    2016-01-01

    This volume contains selected papers in three closely related areas: mathematical modeling in mechanics, numerical analysis, and optimization methods. The papers are based upon talks presented at the International Conference for Mathematical Modeling and Optimization in Mechanics, held in Jyväskylä, Finland, on March 6-7, 2014, dedicated to Prof. N. Banichuk on the occasion of his 70th birthday. The articles are written by well-known scientists working in computational mechanics and in the optimization of complicated technical models. The volume also contains papers discussing the historical development, the state of the art, new ideas, and open problems arising in modern continuum mechanics and applied optimization problems. Several papers are concerned with mathematical problems in numerical analysis, which are also closely related to important mechanical models. The main topics treated include: * Computer simulation methods in mechanics, physics, and biology; * Variational problems and methods; minimiz...

  13. Hierarchical Models of the Nearshore Complex System

    National Research Council Canada - National Science Library

    Werner, Brad

    2004-01-01

    This grant was termination funding for the Werner group, specifically aimed at finishing up and publishing research related to synoptic imaging of nearshore bathymetry, testing models for beach cusp...

  14. Linking electricity and water models to assess electricity choices at water-relevant scales

    International Nuclear Information System (INIS)

    Sattler, S; Rogers, J; Macknick, J; Lopez, A; Yates, D; Flores-Lopez, F

    2012-01-01

    Hydrology/water management and electricity generation projections have been modeled separately, but there has been little effort in intentionally and explicitly linking the two sides of the water–energy nexus. This paper describes a platform for assessing power plant cooling water withdrawals and consumption under different electricity pathways at geographic and time scales appropriate for both electricity and hydrology/water management. This platform uses estimates of regional electricity generation by the Regional Energy Deployment System (ReEDS) as input to a hydrologic and water management model—the Water Evaluation and Planning (WEAP) system. In WEAP, this electricity use represents thermoelectric cooling water withdrawals and consumption within the broader, regional water resource context. Here we describe linking the electricity and water models, including translating electricity generation results from ReEDS-relevant geographies to the water-relevant geographies of WEAP. The result of this analysis is water use by the electric sector at the regional watershed level, which is used to examine the water resource implications of these electricity pathways. (letter)
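    The geographic translation step described above can be sketched with a toy crosswalk; the region and watershed names, values, and weights here are invented, not ReEDS or WEAP data: regional generation is allocated to watersheds through a fixed weight matrix whose rows sum to one, so total generation is conserved across the two geographies.

```python
# Hedged sketch of translating generation from electricity-model regions
# to watersheds via a crosswalk of (e.g. area- or capacity-based) weights.
import numpy as np

gen_by_region = np.array([100.0, 250.0, 80.0])  # TWh per region (invented)

# weights[i, j] = fraction of region i's generation assigned to watershed j;
# each row sums to 1 so nothing is lost or double-counted.
weights = np.array([
    [0.6, 0.4, 0.0],
    [0.0, 0.7, 0.3],
    [0.2, 0.0, 0.8],
])

gen_by_watershed = gen_by_region @ weights
print(gen_by_watershed)        # per-watershed totals: 76, 215, 139
print(gen_by_watershed.sum())  # total conserved (approx. 430)
```

    Cooling water withdrawal and consumption factors per technology would then be applied to the watershed-level generation to obtain the water-use quantities the hydrologic model needs.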

  15. Complex models of nodal nuclear data

    International Nuclear Information System (INIS)

    Dufek, Jan

    2011-01-01

    During core simulations, nuclear data are required at various nodal thermal-hydraulic and fuel burnup conditions. The nodal data are also partially affected by the thermal-hydraulic and fuel burnup conditions in surrounding nodes, as these change the neutron energy spectrum in the node. The nodal data are therefore functions of many parameters (state variables), and the more state variables the nodal data models consider, the more accurate and flexible the models become. The existing table and polynomial regression models, however, cannot reflect the data dependences on many state variables. For the table models, the number of mesh points (and necessary lattice calculations) grows exponentially with the number of variables. For the polynomial regression models, the number of possible multivariate polynomials exceeds the limits of existing selection algorithms, which should identify the few dozen most important polynomials. Moreover, the standard scheme of lattice calculations is not convenient for modelling the data dependences on various burnup conditions, since it performs only a single or a few burnup calculations at fixed nominal conditions. We suggest a new efficient algorithm for selecting the most important multivariate polynomials for the polynomial regression models, so that dependences on many state variables can be considered. We also present a new scheme for lattice calculations in which a large number of burnup histories are carried out at varied nodal conditions. The number of lattice calculations performed and the number of polynomials analysed are controlled and minimised while building nodal data models of the required accuracy. (author)
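    The selection problem described above can be illustrated with a simple greedy forward-selection sketch; this is not the paper's algorithm, and the state variables and target are synthetic: among all monomials up to degree 2 in a few state variables, pick the few terms that best explain the data by repeatedly selecting the candidate most correlated with the current residual.

```python
# Hedged sketch of greedy selection of multivariate polynomial terms
# (forward selection on residual correlation). Data are synthetic.
import itertools
import numpy as np

rng = np.random.default_rng(1)
n, n_vars = 500, 4
X = rng.uniform(-1, 1, (n, n_vars))  # state variables

# Candidate terms: all monomials of total degree 1 or 2 in the variables
powers = [p for p in itertools.product(range(3), repeat=n_vars) if 0 < sum(p) <= 2]
candidates = np.column_stack([np.prod(X ** np.array(p), axis=1) for p in powers])

# Synthetic "nodal data": two true terms plus a little noise
y = 1.5 * X[:, 0] - 2.0 * X[:, 1] * X[:, 2] + 0.01 * rng.normal(size=n)

selected = []
residual = y.copy()
for _ in range(2):  # pick the 2 most important terms
    best = int(np.argmax(np.abs(candidates.T @ residual)))
    selected.append(best)
    A = candidates[:, selected]
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    residual = y - A @ coef

print([powers[i] for i in selected])  # exponent tuples of the chosen monomials
```

    With many state variables the candidate set explodes combinatorially, which is why an efficient, controlled selection procedure of the kind the paper proposes is needed.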

  16. Integrated Modeling of Complex Optomechanical Systems

    Science.gov (United States)

    Andersen, Torben; Enmark, Anita

    2011-09-01

    Mathematical modeling and performance simulation are playing an increasing role in large, high-technology projects. There are two reasons: first, projects are now larger than they were before, and the high cost calls for detailed performance prediction before construction. Second, in particular for space-related designs, it is often difficult to test systems under realistic conditions beforehand, and mathematical modeling is then needed to verify in advance that a system will work as planned. Computers have become much more powerful, permitting calculations that were not possible before. At the same time, mathematical tools have been further developed and have found acceptance in the community. Particular progress has been made in the fields of structural mechanics, optics and control engineering, where new methods have gained importance over the last few decades. Also, methods for combining optical, structural and control system models into global models have found widespread use. Such combined models are usually called integrated models and were the subject of this symposium. The objective was to bring together people working in the fields of ground-based optical telescopes, ground-based radio telescopes, and space telescopes. We succeeded in doing so and had 39 interesting presentations and many fruitful discussions during coffee and lunch breaks and social arrangements. We are grateful that so many top-ranked specialists found their way to Kiruna, and we believe that these proceedings will prove valuable during much future work.

  17. Malaria in pregnancy: the relevance of animal models for vaccine development.

    Science.gov (United States)

    Doritchamou, Justin; Teo, Andrew; Fried, Michal; Duffy, Patrick E

    2017-10-06

    Malaria during pregnancy due to Plasmodium falciparum or P. vivax is a major public health problem in endemic areas, with P. falciparum causing the greatest burden of disease. Increasing resistance of parasites and mosquitoes to existing tools, such as preventive antimalarial treatments and insecticide-treated bed nets, respectively, is eroding the partial protection that they offer to pregnant women. Thus, development of effective vaccines against malaria during pregnancy is an urgent priority. Relevant animal models that recapitulate key features of the pathophysiology and immunology of malaria in pregnant women could be used to accelerate vaccine development. This review summarizes available rodent and nonhuman primate models of malaria in pregnancy, and discusses their suitability for studies of biologics intended to prevent or treat malaria in this vulnerable population.

  18. Expanding the scope and relevance of health interventions: Moving beyond clinical trials and behavior change models

    Directory of Open Access Journals (Sweden)

    Khary K. Rigg

    2014-07-01

    Full Text Available An overemphasis on clinical trials and behavior change models has narrowed the knowledge base that can be used to design interventions. The overarching point is that overanalyzing variables impedes gaining insight into the everyday experiences that shape how people define health and seek treatment. This claim is especially important to health decision-making and behavior change because subtle interpretations often influence the decisions that people make. This manuscript provides a critique of traditional approaches to developing health interventions and theoretically justifies what changes are warranted and why. The limited scope of these models is also discussed, and an argument is made to adopt a strategy that includes the perceptions of people as necessary for understanding health and health-related decision-making. Three practical strategies are suggested for use alongside the more standard approaches to assessing the effectiveness and relevance of health interventions.

  19. Impact relevance and usability of high resolution climate modeling and data

    Energy Technology Data Exchange (ETDEWEB)

    Arnott, James C. [Aspen Global Change Inst., Basalt, CO (United States)

    2016-10-30

    The Aspen Global Change Institute hosted a technical science workshop entitled “Impact Relevance and Usability of High-Resolution Climate Modeling and Datasets” on August 2-7, 2015 in Aspen, CO. Kate Calvin (Pacific Northwest National Laboratory), Andrew Jones (Lawrence Berkeley National Laboratory) and Jean-François Lamarque (NCAR) served as co-chairs for the workshop. The meeting included the participation of 29 scientists, for a total of 145 participant-days. Following the workshop, the co-chairs authored a meeting report published in Eos on April 27, 2016. Insights from the workshop directly contributed to the formation of a new DOE-supported project co-led by workshop co-chair Andy Jones. A subset of meeting participants continues to work on a publication on institutional innovations that can support the usability of high-resolution modeling, among other sources of climate information.

  20. Smart modeling and simulation for complex systems practice and theory

    CERN Document Server

    Ren, Fenghui; Zhang, Minjie; Ito, Takayuki; Tang, Xijin

    2015-01-01

    This book provides a description of new Artificial Intelligence technologies and approaches to the modeling and simulation of complex systems, as well as an overview of the latest scientific efforts in this field, such as platforms and/or software tools for the smart modeling and simulation of complex systems. These tasks are difficult to accomplish using traditional computational approaches due to the complex relationships among components and the distributed features of resources, as well as dynamic work environments. To model complex systems effectively, intelligent technologies such as multi-agent systems and smart grids are employed to model and simulate complex systems in the areas of ecosystems, social and economic organization, web-based grid services, transportation systems, power systems and evacuation systems.

  1. The Design, Synthesis and Study of Mixed-Metal Ru,Rh and Os, Rh Complexes with Biologically Relevant Reactivity

    OpenAIRE

    Wang, Jing

    2013-01-01

    A series of mixed-metal bimetallic complexes [(TL)2M(dpp)RhCl2(TL)]3+ (M = Ru and Os; terminal ligands (TL) = phen, Ph2phen, Me2phen and bpy; terminal ligands (TL) = phen, bpy and Me2bpy), which couple one Ru or Os polyazine light absorber (LA) to a cis-Rh(III)Cl2 center through a dpp bridging ligand (BL), were synthesized using a building-block method. These are related to the previously studied trimetallic systems [{(TL)2M(dpp)}2RhCl2]5+, but the bimetallics are synthetically more complex to prepa...

  2. Clinical and Neurobiological Relevance of Current Animal Models of Autism Spectrum Disorders

    Science.gov (United States)

    Kim, Ki Chan; Gonzales, Edson Luck; Lázaro, María T.; Choi, Chang Soon; Bahn, Geon Ho; Yoo, Hee Jeong; Shin, Chan Young

    2016-01-01

    Autism spectrum disorder (ASD) is a neurodevelopmental disorder characterized by social and communication impairments, as well as repetitive and restrictive behaviors. The phenotypic heterogeneity of ASD has made it overwhelmingly difficult to determine the exact etiology and pathophysiology underlying the core symptoms, which are often accompanied by comorbidities such as hyperactivity, seizures, and sensorimotor abnormalities. To our benefit, the advent of animal models has allowed us to assess and test diverse risk factors of ASD, both genetic and environmental, and measure their contribution to the manifestation of autistic symptoms. At a broader scale, rodent models have helped consolidate molecular pathways and unify the neurophysiological mechanisms underlying each of the various etiologies. This approach will potentially enable the stratification of ASD into clinical, molecular, and neurophenotypic subgroups, further proving their translational utility. It is therefore paramount to establish a common ground of mechanistic theories from complementing results in preclinical research. In this review, we cluster the ASD animal models into lesion and genetic models and further classify them based on the corresponding environmental, epigenetic and genetic factors. Finally, we summarize the symptoms and neuropathological highlights for each model and make critical comparisons that elucidate their clinical and neurobiological relevance. PMID:27133257

  3. Towards an Effective Financial Management: Relevance of Dividend Discount Model in Stock Price Valuation

    Directory of Open Access Journals (Sweden)

    Ana Mugoša

    2015-06-01

    Full Text Available The aim of this paper is to analyze the relevance of the dividend discount model, i.e. its specific form for stock price estimation known as the Gordon growth model. The expected dividends can be a measure of the cash flows returned to the stockholder. In this context, the model is useful for assessing how risk factors, such as interest rates and changing inflation rates, affect stock returns. This is especially important in the case when investors are value-oriented, i.e. when expected dividends are their main investing drivers. We compared the estimated with the actual stock price values and tested the statistical significance of price differences in 199 publicly traded European companies for the period 2010-2013. The statistical difference between pairs of price series (actual and estimated) was tested using the Wilcoxon and Kruskal-Wallis tests of median and distribution equality. The hypothesis that the Gordon growth model cannot be a reliable measure of stock price valuation on the European equity market over the period 2010-2013, due to the influence of the global financial crisis, was rejected with 95% confidence. The Gordon growth model proved to be a reliable measure of stock price valuation even over a period of strong global financial crisis influence.
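    The valuation formula at the core of the study can be sketched as follows; the numbers are illustrative, not drawn from the paper: the Gordon growth model prices a stock as the present value of a perpetually growing dividend stream, P0 = D1 / (r - g), which is only defined for required return r strictly greater than growth rate g.

```python
# Gordon growth model: price = next year's dividend / (r - g),
# valid only when the required return r exceeds the growth rate g.
def gordon_price(next_dividend, required_return, growth_rate):
    if required_return <= growth_rate:
        raise ValueError("model requires r > g")
    return next_dividend / (required_return - growth_rate)

# Illustrative inputs: D1 = 2.0, r = 8%, g = 3%
print(gordon_price(2.0, 0.08, 0.03))  # approx. 40.0
```

    The study's comparison then amounts to computing such model prices per company and testing whether their distribution differs significantly from observed market prices.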

  4. On the hyperporous non-linear elasticity model for fusion-relevant pebble beds

    International Nuclear Information System (INIS)

    Di Maio, P.A.; Giammusso, R.; Vella, G.

    2010-01-01

    Packed pebble beds are particular granular systems composed of a large number of small particles, arranged in irregular lattices and surrounded by a gas filling the interstitial spaces. Due to their heterogeneous structure, pebble beds have non-linear and strongly coupled thermal and mechanical behaviours, whose constitutive models remain limited and are not suitable for fusion-relevant, design-oriented applications. Within the framework of the modelling activities promoted for the lithiated ceramics and beryllium pebble beds foreseen in the Helium-Cooled Pebble Bed breeding blanket concept of DEMO, a thermo-mechanical constitutive model has been set up at the Department of Nuclear Engineering of the University of Palermo (DIN), assuming that pebble beds can be considered continuous, homogeneous and isotropic media. The present paper deals with the DIN non-linear elasticity constitutive model, based on the assumption that during the reversible straining of a pebble bed its effective logarithmic bulk modulus depends on the equivalent pressure according to a modified power law, while its effective Poisson modulus remains constant. Under these hypotheses, the functional dependences of the effective tangential and secant bed deformation moduli on either the equivalent pressure or the volumetric strain have been derived in closed analytical form. A procedure has then been defined to assess the model parameters for a given pebble bed from its oedometric test results, and it has been applied to both polydisperse lithium orthosilicate and single-size beryllium pebble beds.
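    The tangent/secant modulus construction can be sketched numerically; note this assumes, for illustration only, a plain power law K_t(p) = K0·p^m for the tangent bulk modulus (the paper's "modified power law" is more elaborate, and the constants here are invented): integrating dp/dε = K0·p^m gives the pressure-strain curve in closed form, from which both moduli follow.

```python
# Hedged sketch (illustrative constants, simplified power law):
# tangent bulk modulus K_t(p) = dp/d(eps) = K0 * p**m, 0 < m < 1.
# Separating variables: p**(1-m) / (1-m) = K0 * eps, so
#     p(eps) = ((1 - m) * K0 * eps)**(1 / (1 - m)).
import numpy as np

K0, m = 200.0, 0.5                  # invented material constants
eps = np.linspace(1e-6, 0.05, 601)  # volumetric strain

p = ((1 - m) * K0 * eps) ** (1.0 / (1 - m))  # pressure-strain curve

K_tangent = K0 * p ** m  # dp/d(eps), from the assumed power law
K_secant = p / eps       # secant modulus p / eps

# Consistency check: numerical derivative matches the tangent modulus
num = np.gradient(p, eps)
print(np.allclose(num[2:-2], K_tangent[2:-2], rtol=0.05))  # True
```

    The paper's procedure is the inverse of this sketch: K0 and m (and the refinements of the modified law) are fitted from measured oedometric pressure-strain data rather than assumed.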

  5. The sigma model on complex projective superspaces

    Energy Technology Data Exchange (ETDEWEB)

    Candu, Constantin; Mitev, Vladimir; Schomerus, Volker [DESY, Hamburg (Germany). Theory Group; Quella, Thomas [Amsterdam Univ. (Netherlands). Inst. for Theoretical Physics; Saleur, Hubert [CEA Saclay, 91 - Gif-sur-Yvette (France). Inst. de Physique Theorique; USC, Los Angeles, CA (United States). Physics Dept.

    2009-08-15

    The sigma model on projective superspaces CP^{S-1|S} gives rise to a continuous family of interacting 2D conformal field theories which are parametrized by the curvature radius R and the theta angle θ. Our main goal is to determine the spectrum of the model, non-perturbatively, as a function of both parameters. We succeed in doing so for all open boundary conditions preserving the full global symmetry of the model. In string theory parlance, these correspond to volume-filling branes that are equipped with a monopole line bundle and connection. The paper consists of two parts. In the first part, we approach the problem within the continuum formulation. Combining combinatorial arguments with perturbative studies and some simple free field calculations, we determine a closed formula for the partition function of the theory. This is then tested numerically in the second part. There we propose a spin chain regularization of the CP^{S-1|S} model with open boundary conditions and use it to determine the spectrum at the conformal fixed point. The numerical results are in remarkable agreement with the continuum analysis. (orig.)

  6. The sigma model on complex projective superspaces

    International Nuclear Information System (INIS)

    Candu, Constantin; Mitev, Vladimir; Schomerus, Volker; Quella, Thomas; Saleur, Hubert; USC, Los Angeles, CA

    2009-08-01

    The sigma model on projective superspaces CP^{S-1|S} gives rise to a continuous family of interacting 2D conformal field theories which are parametrized by the curvature radius R and the theta angle θ. Our main goal is to determine the spectrum of the model, non-perturbatively, as a function of both parameters. We succeed in doing so for all open boundary conditions preserving the full global symmetry of the model. In string theory parlance, these correspond to volume-filling branes that are equipped with a monopole line bundle and connection. The paper consists of two parts. In the first part, we approach the problem within the continuum formulation. Combining combinatorial arguments with perturbative studies and some simple free field calculations, we determine a closed formula for the partition function of the theory. This is then tested numerically in the second part. There we propose a spin chain regularization of the CP^{S-1|S} model with open boundary conditions and use it to determine the spectrum at the conformal fixed point. The numerical results are in remarkable agreement with the continuum analysis. (orig.)

  7. Regional and global modeling estimates of policy relevant background ozone over the United States

    Science.gov (United States)

    Emery, Christopher; Jung, Jaegun; Downey, Nicole; Johnson, Jeremiah; Jimenez, Michele; Yarwood, Greg; Morris, Ralph

    2012-02-01

    Policy Relevant Background (PRB) ozone, as defined by the US Environmental Protection Agency (EPA), refers to ozone concentrations that would occur in the absence of all North American anthropogenic emissions. PRB enters into the calculation of health risk benefits, and as the US ozone standard approaches background levels, PRB is increasingly important in determining the feasibility and cost of compliance. As PRB is a hypothetical construct, modeling is a necessary tool. Since 2006 EPA has relied on global modeling to establish PRB for their regulatory analyses. Recent assessments with higher resolution global models exhibit improved agreement with remote observations and modest upward shifts in PRB estimates. This paper shifts the paradigm to a regional model (CAMx) run at 12 km resolution, for which North American boundary conditions were provided by a low-resolution version of the GEOS-Chem global model. We conducted a comprehensive model inter-comparison, from which we elucidate differences in predictive performance against ozone observations and differences in temporal and spatial background variability over the US. In general, CAMx performed better in replicating observations at remote monitoring sites, and performance remained better at higher concentrations. While spring and summer mean PRB predicted by GEOS-Chem ranged 20-45 ppb, CAMx predicted PRB ranged 25-50 ppb and reached well over 60 ppb in the west due to event-oriented phenomena such as stratospheric intrusion and wildfires. CAMx showed a higher correlation between modeled PRB and total observed ozone, which is significant for health risk assessments. A case study during April 2006 suggests that stratospheric exchange of ozone is underestimated in both models on an event basis. 
We conclude that wildfires, lightning NOx and stratospheric intrusions contribute a significant level of uncertainty in estimating PRB, and that PRB will require careful consideration in the ozone standard-setting process.

  8. A complex autoregressive model and application to monthly temperature forecasts

    Directory of Open Access Journals (Sweden)

    X. Gu

    2005-11-01

    Full Text Available A complex autoregressive model was established based on a mathematical derivation of least squares for the complex number domain, referred to as complex least squares. The model differs from the conventional approach, in which the real and imaginary parts are calculated separately. An application of this new model yields better forecasts of monthly temperature anomalies in July at 160 meteorological stations in mainland China than forecasts from other conventional statistical models. The conventional statistical models include an autoregressive model in which the real and imaginary parts are treated separately, an autoregressive model in the real number domain, and a persistence-forecast model.
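    The core idea can be sketched with synthetic data (the coefficient and noise level below are invented): treat a pair of series as one complex-valued series z_t and fit an AR(1) model z_t = a·z_{t-1} + e_t by least squares directly in the complex domain, rather than fitting the real and imaginary parts separately; numpy's lstsq handles complex arrays natively.

```python
# Hedged sketch of complex least squares for an AR(1) model on
# synthetic complex-valued data.
import numpy as np

rng = np.random.default_rng(42)
n = 2000
a_true = 0.6 + 0.3j  # |a_true| < 1 keeps the AR(1) process stable
z = np.zeros(n, dtype=complex)
for t in range(1, n):
    z[t] = a_true * z[t - 1] + (rng.normal(scale=0.1) + 1j * rng.normal(scale=0.1))

# Complex least squares: minimize sum |z_t - a * z_{t-1}|^2 over complex a
A = z[:-1].reshape(-1, 1)
a_hat, *_ = np.linalg.lstsq(A, z[1:], rcond=None)
print(a_hat[0])  # close to a_true = 0.6 + 0.3j
```

    Solving in the complex domain estimates the coupling between the two components jointly, which is exactly what is lost when real and imaginary parts are regressed separately.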

  9. Understanding complex urban systems integrating multidisciplinary data in urban models

    CERN Document Server

    Gebetsroither-Geringer, Ernst; Atun, Funda; Werner, Liss

    2016-01-01

    This book is devoted to the modeling and understanding of complex urban systems. This second volume of Understanding Complex Urban Systems focuses on the challenges of the modeling tools concerning, e.g., the quality and quantity of data and the selection of an appropriate modeling approach. It is meant to support urban decision-makers—including municipal politicians, spatial planners, and citizen groups—in choosing an appropriate modeling approach for their particular modeling requirements. The contributors to this volume are from different disciplines, but all share the same goal: optimizing the representation of complex urban systems. They present and discuss a variety of approaches for dealing with data-availability problems and finding appropriate modeling approaches—and not only in terms of computer modeling. The selection of articles featured in this volume reflects a broad variety of new and established modeling approaches such as: - An argument for using Big Data methods in conjunction with Age...

  10. Kinetic studies on the oxidation of oxyhemoglobin by biologically active iron thiosemicarbazone complexes: relevance to iron-chelator-induced methemoglobinemia.

    Science.gov (United States)

    Basha, Maram T; Rodríguez, Carlos; Richardson, Des R; Martínez, Manuel; Bernhardt, Paul V

    2014-03-01

    The oxidation of oxyhemoglobin to methemoglobin has been found to be facilitated by low-molecular-weight iron(III) thiosemicarbazone complexes. This deleterious reaction, which produces hemoglobin protein units unable to bind dioxygen and occurs during the administration of iron chelators such as the well-known 3-aminopyridine-2-pyridinecarbaldehyde thiosemicarbazone (3-AP; Triapine), has been observed in the reaction with Fe(III) complexes of some members of the thiosemicarbazone ligands structurally related to 3-AP and derived from di-2-pyridyl ketone (the HDpxxT series). We have studied the kinetics of this oxidation reaction in vitro using human hemoglobin and found that the reaction proceeds in two distinct time-resolved steps. These have been associated with sequential oxidation of the two different oxyheme cofactors in the α and β protein chains. Unexpected steric and hydrogen-bonding effects on the Fe(III) complexes appear to be responsible for the observed differences in the reaction rate across the series of HDpxxT ligand complexes used in this study.
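    Two time-resolved steps of the kind reported above produce a signal that decays as the sum of two exponentials, one per heme environment; a classic "curve peeling" analysis can recover both rates. The sketch below uses synthetic data and invented rate constants, not the paper's measurements.

```python
# Hedged sketch of biexponential (two-step) kinetics and curve peeling
# on a synthetic decay signal.
import numpy as np

t = np.linspace(0, 60, 601)    # time, arbitrary units
k_fast, k_slow = 0.30, 0.05    # invented pseudo-first-order rate constants
signal = 0.5 * np.exp(-k_fast * t) + 0.5 * np.exp(-k_slow * t)

# Step 1: fit the slow phase on the tail, where the fast phase has died out
slope, intercept = np.polyfit(t[t > 30], np.log(signal[t > 30]), 1)
k_slow_hat, amp_slow = -slope, np.exp(intercept)

# Step 2: subtract ("peel off") the slow phase, fit the fast phase early on
fast_part = signal - amp_slow * np.exp(-k_slow_hat * t)
slope_f, _ = np.polyfit(t[t < 5], np.log(fast_part[t < 5]), 1)
k_fast_hat = -slope_f

print(round(k_slow_hat, 3), round(k_fast_hat, 3))  # approx. 0.05 and 0.30
```

    In practice a nonlinear biexponential fit is usually preferred over peeling, but the two-phase structure it exposes is the same one attributed here to the α- and β-chain oxyheme cofactors.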

  11. Towards Measures to Establish the Relevance of Climate Model Output for Decision Support

    Science.gov (United States)

    Clarke, L.; Smith, L. A.

    2007-12-01

    How exactly can decision support and policy making benefit from the use of multiple climate model experiments in coping with the uncertainties in climate change projections? Climate modelling faces challenges beyond those of weather forecasting or even seasonal forecasting, since with climate we are now (and will probably always be) required to extrapolate to regimes for which we have no relevant forecast-verification archive. This suggests a very different approach from traditional methods of mixing models and skill-based weighting to obtain profitable probabilistic information when a large forecast-verification archive is at hand. In the case of climate, it may prove more rational to search for agreement between our models (in distribution), the aim being to determine the space and time scales on which, given our current understanding, the details of the simulation models are unimportant. This suggestion and others from Smith (2002, Proc. Natl. Acad. Sci. USA 99(4): 2487-2492) are interpreted in the light of recent advances. Climate models are large nonlinear dynamical systems which insightfully but imperfectly reflect the evolving weather patterns of the Earth. Their use in policy making and decision support assumes both that they contain sufficient information regarding reality to inform the decision, and that this information can be effectively communicated to the decision makers. There is nothing unique about climate modeling in these constraints; they apply in all cases where scientific modeling is applied to real-world actions (other than, perhaps, the action of improving our models). Starting with the issue of communication, figures from the 2007 IPCC Summary for Policy Makers will be constructively criticized from the perspective of decision makers, specifically those of the energy sector and the insurance/reinsurance sector.
More information on basic questions of reliability and robustness would be of significant value when determining how heavily

  12. Models of policy-making and their relevance for drug research.

    Science.gov (United States)

    Ritter, Alison; Bammer, Gabriele

    2010-07-01

    Researchers are often frustrated by their inability to influence policy. We describe models of policy-making to provide new insights and a more realistic assessment of research impacts on policy. We describe five prominent models of policy-making and illustrate them with examples from the alcohol and drugs field, before drawing lessons for researchers. Policy-making is a complex and messy process, with different models describing different elements. We start with the incrementalist model, which highlights small amendments to policy, as occurs in school-based drug education. A technical/rational approach then outlines the key steps in a policy process from identification of problems and their causes, through to examination and choice of response options, and subsequent implementation and evaluation. There is a clear role for research, as we illustrate with the introduction of new medications, but this model largely ignores the dominant political aspects of policy-making. Such political aspects include the influence of interest groups, and we describe models about power and pressure groups, as well as advocacy coalitions, and the challenges they pose for researchers. These are illustrated with reference to the alcohol industry, and interest group conflicts in establishing a Medically Supervised Injecting Centre. Finally, we describe the multiple streams framework, which alerts researchers to 'windows of opportunity', and we show how these were effectively exploited in policy for cannabis law reform in Western Australia. Understanding models of policy-making can help researchers maximise the uptake of their work and advance evidence-informed policy.

  13. Fluid flow modeling in complex areas

    Directory of Open Access Journals (Sweden)

    Poullet Pascal

    2012-04-01

    Full Text Available We show first results of a 3D simulation of sea currents in a realistic context. We use the full Navier-Stokes equations for an incompressible viscous fluid. The problem is solved using a second-order incremental projection method combined with a finite-volume staggered (MAC) scheme for the spatial discretization. After validation on classical cases, it is used in a numerical simulation of the Pointe-à-Pitre harbour area. The use of the fictitious domain method permits us to take into account the complexity of the bathymetric data and allows us to work with regular meshes, thus preserving the efficiency essential for a 3D code. In this study, we present the first results of simulating the flow of an incompressible viscous fluid in a real environmental context. The approach uses a fictitious domain method to take into account a highly irregular three-dimensional physical domain. The numerical scheme combines an incremental projection scheme with finite volumes using control volumes adapted to a staggered mesh. Validation tests were carried out on the two-sided lid-driven cavity test case as well as on the flow in a channel with an asymmetrically placed obstacle.

  14. Gene Expression Analysis to Assess the Relevance of Rodent Models to Human Lung Injury.

    Science.gov (United States)

    Sweeney, Timothy E; Lofgren, Shane; Khatri, Purvesh; Rogers, Angela J

    2017-08-01

    The relevance of animal models to human diseases is an area of intense scientific debate. The degree to which mouse models of lung injury recapitulate human lung injury has never been assessed. Integrating data from both human and animal expression studies allows for increased statistical power and identification of conserved differential gene expression across organisms and conditions. We sought comprehensive integration of gene expression data in experimental acute lung injury (ALI) in rodents compared with humans. We performed two separate gene expression multicohort analyses to determine differential gene expression in experimental animal and human lung injury. We used correlational and pathway analyses combined with external in vitro gene expression data to identify both potential drivers of underlying inflammation and therapeutic drug candidates. We identified 21 animal lung tissue datasets and three human lung injury bronchoalveolar lavage datasets. We show that the metasignatures of animal and human experimental ALI are significantly correlated despite these widely varying experimental conditions. The gene expression changes among mice and rats across diverse injury models (ozone, ventilator-induced lung injury, LPS) are significantly correlated with those of human lung injury (Pearson r = 0.33-0.45). Predicted therapeutic targets, peptide ligand signatures, and pathway analyses are also all highly overlapping. Gene expression changes are similar in animal and human experimental ALI, and provide several physiologic and therapeutic insights to the disease.
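    The metasignature comparison described above reduces, at its core, to correlating per-gene effect sizes across species. A minimal sketch, with purely illustrative gene names and log2 fold-change values (not the study's data), assuming orthologs have already been matched:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length vectors."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Illustrative log2 fold-changes for matched ortholog pairs in a rodent
# lung-tissue meta-analysis and a human BAL meta-analysis (invented).
rodent = {"Il6": 2.1, "Socs3": 1.8, "Cxcl2": 2.5, "Fmo2": -1.2, "Cyp2e1": -0.8}
human = {"IL6": 1.4, "SOCS3": 1.1, "CXCL2": 1.9, "FMO2": -0.5, "CYP2E1": -1.1}

# Genes are aligned positionally here, as a stand-in for ortholog mapping.
r = pearson(list(rodent.values()), list(human.values()))
print(f"metasignature correlation r = {r:.2f}")
```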

  15. Relevance of the Lin's and Host hydropedological models to predict grape yield and wine quality

    Directory of Open Access Journals (Sweden)

    E. A. C. Costantini

    2009-09-01

    The adoption of precision agriculture in viticulture could be greatly enhanced by the diffusion of straightforward, easy-to-apply hydropedological models able to predict the spatial variability of available soil water. The Lin's and Host hydropedological models were applied to standard soil series descriptions and hillslope position to predict the distribution of hydrological functional units in two vineyards and their relevance for grape yield and wine quality. A three-year trial was carried out in Chianti (Central Italy) on Sangiovese. The soils of the vineyards differed in structure, porosity and related hydropedological characteristics, as well as in salinity. Soil spatial variability was deeply affected by earth movement carried out before vine plantation. Six plots were selected in the different hydrological functional units of the two vineyards, that is, at summit, backslope and footslope morphological positions, to monitor soil hydrology, grape production and wine quality. Plot selection was based upon a cluster analysis of local slope, topographic wetness index (TWI), and cumulative moisture up to the root-limiting layer, assessed by means of a detailed combined geophysical survey. Water content, redox processes and temperature were monitored, as well as yield, phenological phases, and chemical analyses of grapes. The isotopic ratio δ13C was measured in the wine ethanol upon harvesting to evaluate the degree of stress suffered by vines. The grapes in each plot were collected for wine making in small barrels. The wines obtained were analysed and submitted to a blind organoleptic testing.

    The results demonstrated that the combined application of the two hydropedological models can be used to predict the moisture status of soils cultivated with grape during summertime in a Mediterranean climate. As correctly foreseen by the models, the amount of mean daily transpirable soil water (TSW) during

  16. Cumulative complexity: a functional, patient-centered model of patient complexity can improve research and practice.

    Science.gov (United States)

    Shippee, Nathan D; Shah, Nilay D; May, Carl R; Mair, Frances S; Montori, Victor M

    2012-10-01

    To design a functional, patient-centered model of patient complexity with practical applicability to analytic design and clinical practice. Existing literature on patient complexity has mainly identified its components descriptively and in isolation, lacking clarity as to their combined functions in disrupting care or to how complexity changes over time. The authors developed a cumulative complexity model, which integrates existing literature and emphasizes how clinical and social factors accumulate and interact to complicate patient care. A narrative literature review is used to explicate the model. The model emphasizes a core, patient-level mechanism whereby complicating factors impact care and outcomes: the balance between patient workload of demands and patient capacity to address demands. Workload encompasses the demands on the patient's time and energy, including demands of treatment, self-care, and life in general. Capacity concerns ability to handle work (e.g., functional morbidity, financial/social resources, literacy). Workload-capacity imbalances comprise the mechanism driving patient complexity. Treatment and illness burdens serve as feedback loops, linking negative outcomes to further imbalances, such that complexity may accumulate over time. With its components largely supported by existing literature, the model has implications for analytic design, clinical epidemiology, and clinical practice. Copyright © 2012 Elsevier Inc. All rights reserved.
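    A toy numerical sketch may make the model's central feedback loop concrete. All quantities and parameter values below are hypothetical, chosen only to show how a workload-capacity imbalance can compound over time:

```python
def simulate_complexity(workload, capacity, months, burden_gain=0.1, recovery=0.05):
    """Toy sketch of the cumulative complexity mechanism: when workload
    exceeds capacity, care is disrupted; treatment/illness burden feeds
    back, raising workload and eroding capacity. All units are abstract
    and all parameters hypothetical."""
    history = []
    for _ in range(months):
        imbalance = max(0.0, workload - capacity)  # unmet demands
        # Feedback loop: negative outcomes add workload and erode capacity
        workload += burden_gain * imbalance
        capacity = max(0.0, capacity - burden_gain * imbalance + recovery)
        history.append((workload, capacity))
    return history

balanced = simulate_complexity(workload=5.0, capacity=6.0, months=24)
overloaded = simulate_complexity(workload=8.0, capacity=6.0, months=24)
print("balanced final (workload, capacity):  ", balanced[-1])
print("overloaded final (workload, capacity):", overloaded[-1])
```

    The balanced patient stays stable while the overloaded patient diverges, which is the model's claim that complexity accumulates rather than averaging out.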

  17. Activity, polypeptide and gene identification of thylakoid Ndh complex in trees: potential physiological relevance of fluorescence assays.

    Science.gov (United States)

    Serrot, Patricia H; Sabater, Bartolomé; Martín, Mercedes

    2012-09-01

    Three evergreen (Laurus nobilis, Viburnum tinus and Thuja plicata) and two autumnal abscission deciduous trees (Cydonia oblonga and Prunus domestica) have been investigated for the presence (zymogram and immunodetection) and functionality (post-illumination chlorophyll fluorescence) of the thylakoid Ndh complex. The presence of encoding ndh genes has also been investigated in T. plicata. Western assays allowed tentative identification of zymogram NADH dehydrogenase bands corresponding to the Ndh complex after native electrophoresis of solubilized fractions from L. nobilis, V. tinus, C. oblonga and P. domestica leaves, but not in those of T. plicata. However, Ndh subunits were detected after SDS-PAGE of thylakoid solubilized proteins of T. plicata. The leaves of the five plants showed the post-illumination chlorophyll fluorescence increase dependent on the presence of active Ndh complex. The fluorescence increase was higher in autumn in deciduous, but not in evergreen trees, which suggests that the thylakoid Ndh complex could be involved in autumnal leaf senescence. Two ndhB genes were sequenced from T. plicata that differ at the 350 bp 3' end sequence. Comparison with the mRNA revealed that ndhB genes have a 707-bp type II intron between exons 1 (723 bp) and 2 (729 bp) and that the UCA 259th codon is edited to UUA in mRNA. Phylogenetically, the ndhB genes of T. plicata group close to those of Metasequoia, Cryptomeria, Taxodium, Juniperus and Widdringtonia in the Cupressaceae branch and are 5' end shortened by 18 codons with respect to that of angiosperms. Copyright © Physiologia Plantarum 2012.

  18. The big seven model of personality and its relevance to personality pathology.

    Science.gov (United States)

    Simms, Leonard J

    2007-02-01

    Proponents of the Big Seven model of personality have suggested that Positive Valence (PV) and Negative Valence (NV) are independent of the Big Five personality dimensions and may be particularly relevant to personality disorder. These hypotheses were tested with 403 undergraduates who completed a Big Seven measure and markers of the Big Five and personality pathology. Results revealed that PV and NV incrementally predicted personality pathology dimensions beyond those predicted by multiple markers of the Big Five. However, factor analyses suggested that PV and NV might be best understood as specific, maladaptive aspects of positive emotionality and low agreeableness, respectively, as opposed to independent factors of personality. Implications for the description of normal and abnormal personality are discussed.

  19. Method for extracting relevant electrical parameters from graphene field-effect transistors using a physical model

    International Nuclear Information System (INIS)

    Boscá, A.; Pedrós, J.; Martínez, J.; Calle, F.

    2015-01-01

    Due to its intrinsic high mobility, graphene has proved to be a suitable material for high-speed electronics, where the graphene field-effect transistor (GFET) has shown excellent properties. In this work, we present a method for extracting relevant electrical parameters from GFET devices using a simple electrical characterization and a model fitting. With experimental data from the device output characteristics, the method allows one to calculate parameters such as the mobility, the contact resistance, and the fixed charge. Differentiated electron and hole mobilities and a direct connection with intrinsic material properties are some of the key aspects of this method. Moreover, the method output values can be correlated with several issues during key fabrication steps, such as the graphene growth and transfer, the lithographic steps, or the metallization processes, providing a flexible tool for quality control in GFET fabrication, as well as valuable feedback for improving the material-growth process.
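    The extraction idea can be sketched generically. The drain-resistance model below (contact resistance plus a channel term with a residual carrier density) is a common GFET parametrization, not necessarily the exact model of this paper; the device geometry, gate capacitance, and crude grid-search fit are all illustrative stand-ins for a proper least-squares extraction.

```python
import math

# Illustrative constants; geometry and capacitance are assumptions.
Q = 1.602e-19        # elementary charge (C)
COX = 1.15e-8        # gate capacitance per area (F/cm^2), ~300 nm SiO2
L, W = 10e-4, 20e-4  # channel length and width (cm)
V_DIRAC = 0.0        # Dirac voltage (V)

def total_resistance(vg, mu, r_c, n0):
    """R_tot(Vg) = 2*Rc + channel term; the carrier density combines a
    residual density n0 with the gate-induced density (the electron/hole
    mobility asymmetry is ignored here for brevity)."""
    n_gate = COX * abs(vg - V_DIRAC) / Q        # cm^-2
    n = math.sqrt(n0 ** 2 + n_gate ** 2)
    return 2 * r_c + L / (W * Q * mu * n)

# Synthetic "measured" transfer curve from known ground-truth parameters
TRUE = dict(mu=2500.0, r_c=300.0, n0=5e11)     # cm^2/Vs, ohm, cm^-2
vgs = [v / 10 for v in range(-30, 31)]
measured = [total_resistance(v, **TRUE) for v in vgs]

# Crude grid-search least squares; a real extraction would run a proper
# nonlinear fit (e.g. scipy.optimize.curve_fit) on measured data.
best, best_err = None, float("inf")
for mu in (1500.0, 2000.0, 2500.0, 3000.0):
    for r_c in (100.0, 300.0, 500.0):
        for n0 in (1e11, 5e11, 1e12):
            err = sum((total_resistance(v, mu, r_c, n0) - m) ** 2
                      for v, m in zip(vgs, measured))
            if err < best_err:
                best, best_err = (mu, r_c, n0), err

print("extracted (mu, Rc, n0):", best)
```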

  20. Method for extracting relevant electrical parameters from graphene field-effect transistors using a physical model

    Energy Technology Data Exchange (ETDEWEB)

    Boscá, A., E-mail: alberto.bosca@upm.es [Instituto de Sistemas Optoelectrónicos y Microtecnología, Universidad Politécnica de Madrid, Madrid 28040 (Spain); Dpto. de Ingeniería Electrónica, E.T.S.I. de Telecomunicación, Universidad Politécnica de Madrid, Madrid 28040 (Spain); Pedrós, J. [Instituto de Sistemas Optoelectrónicos y Microtecnología, Universidad Politécnica de Madrid, Madrid 28040 (Spain); Campus de Excelencia Internacional, Campus Moncloa UCM-UPM, Madrid 28040 (Spain); Martínez, J. [Instituto de Sistemas Optoelectrónicos y Microtecnología, Universidad Politécnica de Madrid, Madrid 28040 (Spain); Dpto. de Ciencia de Materiales, E.T.S.I de Caminos, Canales y Puertos, Universidad Politécnica de Madrid, Madrid 28040 (Spain); Calle, F. [Instituto de Sistemas Optoelectrónicos y Microtecnología, Universidad Politécnica de Madrid, Madrid 28040 (Spain); Dpto. de Ingeniería Electrónica, E.T.S.I. de Telecomunicación, Universidad Politécnica de Madrid, Madrid 28040 (Spain); Campus de Excelencia Internacional, Campus Moncloa UCM-UPM, Madrid 28040 (Spain)

    2015-01-28

    Due to its intrinsic high mobility, graphene has proved to be a suitable material for high-speed electronics, where the graphene field-effect transistor (GFET) has shown excellent properties. In this work, we present a method for extracting relevant electrical parameters from GFET devices using a simple electrical characterization and a model fitting. With experimental data from the device output characteristics, the method allows one to calculate parameters such as the mobility, the contact resistance, and the fixed charge. Differentiated electron and hole mobilities and a direct connection with intrinsic material properties are some of the key aspects of this method. Moreover, the method output values can be correlated with several issues during key fabrication steps, such as the graphene growth and transfer, the lithographic steps, or the metallization processes, providing a flexible tool for quality control in GFET fabrication, as well as valuable feedback for improving the material-growth process.

  1. Mitochondrial metabolism in early neural fate and its relevance for neuronal disease modeling.

    Science.gov (United States)

    Lorenz, Carmen; Prigione, Alessandro

    2017-12-01

    Modulation of energy metabolism is emerging as a key aspect associated with cell fate transition. The establishment of a correct metabolic program is particularly relevant for neural cells given their high bioenergetic requirements. Accordingly, diseases of the nervous system commonly involve mitochondrial impairment. Recent studies in animals and in neural derivatives of human pluripotent stem cells (PSCs) highlighted the importance of mitochondrial metabolism for neural fate decisions in health and disease. The mitochondria-based metabolic program of early neurogenesis suggests that PSC-derived neural stem cells (NSCs) may be used for modeling neurological disorders. Understanding how metabolic programming is orchestrated during neural commitment may provide important information for the development of therapies against conditions affecting neural functions, including aging and mitochondrial disorders. Copyright © 2017. Published by Elsevier Ltd.

  2. Passengers, Crowding and Complexity : Models for passenger oriented public transport

    NARCIS (Netherlands)

    P.C. Bouman (Paul)

    2017-01-01

    Passengers, Crowding and Complexity was written as part of the Complexity in Public Transport (ComPuTr) project funded by the Netherlands Organisation for Scientific Research (NWO). This thesis studies in three parts how microscopic data can be used in models that have the potential

  3. Stability of Rotor Systems: A Complex Modelling Approach

    DEFF Research Database (Denmark)

    Kliem, Wolfhard; Pommer, Christian; Stoustrup, Jakob

    1996-01-01

    A large class of rotor systems can be modelled by a complex matrix differential equation of second order. The angular velocity of the rotor plays the role of a parameter. We apply the Lyapunov matrix equation in a complex setting and prove two new stability results which are compared...

  4. Complex versus simple models: ion-channel cardiac toxicity prediction.

    Science.gov (United States)

    Mistry, Hitesh B

    2018-01-01

    There is growing interest in applying detailed mathematical models of the heart for ion-channel related cardiac toxicity prediction. However, a debate exists as to whether such complex models are required. Here an assessment of the predictive performance of two established large-scale biophysical cardiac models and a simple linear model, Bnet, was conducted. Three ion-channel data-sets were extracted from the literature. Each compound was designated a cardiac risk category using two different classification schemes based on information within CredibleMeds. The predictive performance of each model within each data-set for each classification scheme was assessed via a leave-one-out cross validation. Overall the Bnet model performed equally as well as the leading cardiac models in two of the data-sets and outperformed both cardiac models on the latest. These results highlight the importance of benchmarking complex versus simple models but also encourage the development of simple models.
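    The leave-one-out protocol is easy to make concrete. In the sketch below the block fractions, risk labels, and threshold-learning rule are all invented for illustration, and the linear score is only a stand-in in the spirit of a net-block metric, not the published Bnet definition.

```python
# Hypothetical block fractions at a reference concentration for an
# outward (hERG) and a pooled inward (CaV1.2/NaV1.5) current, with
# label 1 = high torsadogenic risk. Invented values, not study data.
drugs = [
    # (hERG block, inward block, risk label)
    (0.80, 0.10, 1),
    (0.70, 0.15, 1),
    (0.60, 0.50, 0),
    (0.20, 0.10, 0),
    (0.90, 0.30, 1),
    (0.30, 0.40, 0),
]

def net_block(herg, inward):
    """Simple linear score: outward minus inward block. This is only in
    the spirit of a net-block metric, not the published Bnet formula."""
    return herg - inward

def loocv_accuracy(data):
    """Leave-one-out cross validation: for each held-out drug, learn a
    decision threshold from the remaining drugs (midpoint between the
    class mean scores), then classify the held-out drug."""
    correct = 0
    for i, (herg, inward, label) in enumerate(data):
        train = data[:i] + data[i + 1:]
        risky = [net_block(h, n) for h, n, l in train if l == 1]
        safe = [net_block(h, n) for h, n, l in train if l == 0]
        threshold = (sum(risky) / len(risky) + sum(safe) / len(safe)) / 2
        pred = 1 if net_block(herg, inward) > threshold else 0
        correct += pred == label
    return correct / len(data)

print("LOOCV accuracy:", loocv_accuracy(drugs))
```

    Each drug is scored by a model that never saw it, which is exactly the property the paper relies on when comparing the simple model against the biophysical ones.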

  5. Complex versus simple models: ion-channel cardiac toxicity prediction

    Directory of Open Access Journals (Sweden)

    Hitesh B. Mistry

    2018-02-01

    There is growing interest in applying detailed mathematical models of the heart for ion-channel related cardiac toxicity prediction. However, a debate exists as to whether such complex models are required. Here an assessment of the predictive performance of two established large-scale biophysical cardiac models and a simple linear model, Bnet, was conducted. Three ion-channel data-sets were extracted from the literature. Each compound was designated a cardiac risk category using two different classification schemes based on information within CredibleMeds. The predictive performance of each model within each data-set for each classification scheme was assessed via a leave-one-out cross validation. Overall the Bnet model performed equally as well as the leading cardiac models in two of the data-sets and outperformed both cardiac models on the latest. These results highlight the importance of benchmarking complex versus simple models but also encourage the development of simple models.

  6. Modeling Air-Quality in Complex Terrain Using Mesoscale and ...

    African Journals Online (AJOL)

    Air-quality in a complex terrain (Colorado-River-Valley/Grand-Canyon Area, Southwest U.S.) is modeled using a higher-order closure mesoscale model and a higher-order closure dispersion model. Non-reactive tracers have been released in the Colorado-River valley, during winter and summer 1992, to study the ...

  7. Surface-complexation models for sorption onto heterogeneous surfaces

    International Nuclear Information System (INIS)

    Harvey, K.B.

    1997-10-01

    This report provides a description of the discrete-logK spectrum model, together with a description of its derivation, and of its place in the larger context of surface-complexation modelling. The tools necessary to apply the discrete-logK spectrum model are discussed, and background information appropriate to this discussion is supplied as appendices. (author)

  8. On spin and matrix models in the complex plane

    International Nuclear Information System (INIS)

    Damgaard, P.H.; Heller, U.M.

    1993-01-01

    We describe various aspects of statistical mechanics defined in the complex temperature or coupling-constant plane. Using exactly solvable models, we analyse such aspects as renormalization group flows in the complex plane, the distribution of partition function zeros, and the question of new coupling-constant symmetries of complex-plane spin models. The double-scaling form of matrix models is shown to be exactly equivalent to finite-size scaling of two-dimensional spin systems. This is used to show that the string susceptibility exponents derived from matrix models can be obtained numerically with very high accuracy from the scaling of finite-N partition function zeros in the complex plane. (orig.)
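    For the simplest exactly solvable case, partition-function zeros in a complex coupling can be written down in closed form and checked numerically. The sketch below uses the zero-field 1D Ising chain with periodic boundaries, a standard textbook example rather than the matrix-model calculation of the paper:

```python
import cmath

# Zeros of the zero-field 1D Ising chain with periodic boundaries:
#   Z_N(K) = (2 cosh K)^N + (2 sinh K)^N,
# so Z_N = 0 exactly where tanh(K)^N = -1, i.e.
#   tanh K = exp(i*pi*(2k+1)/N), k = 0..N-1.
# (N even keeps tanh K away from the atanh branch points at +/-1.)
N = 8

def partition_function(K):
    return (2 * cmath.cosh(K)) ** N + (2 * cmath.sinh(K)) ** N

zeros = [cmath.atanh(cmath.exp(1j * cmath.pi * (2 * k + 1) / N))
         for k in range(N)]

for K in zeros:
    print(f"K = {K:.4f}, |Z_N(K)| = {abs(partition_function(K)):.2e}")
```

    As the abstract suggests, tracking how such finite-N zeros in the complex plane approach the real axis with growing N is one way to extract scaling exponents numerically.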

  9. A Framework for Modeling and Analyzing Complex Distributed Systems

    National Research Council Canada - National Science Library

    Lynch, Nancy A; Shvartsman, Alex Allister

    2005-01-01

    Report developed under STTR contract for topic AF04-T023. This Phase I project developed a modeling language and laid a foundation for computational support tools for specifying, analyzing, and verifying complex distributed system designs...

  10. Modelling the self-organization and collapse of complex networks

    Indian Academy of Sciences (India)

    Modelling the self-organization and collapse of complex networks. Sanjay Jain, Department of Physics and Astrophysics, University of Delhi; Jawaharlal Nehru Centre for Advanced Scientific Research, Bangalore; Santa Fe Institute, Santa Fe, New Mexico.

  11. Relevance of the ICRP biokinetic model for dietary organically bound tritium

    International Nuclear Information System (INIS)

    Trivedi, A.

    1999-10-01

    Ingested dietary tritium can participate in metabolic processes and become incorporated as organically bound tritium in tissues and organs. The distribution and retention of organically bound tritium throughout the body differ markedly from those of tritium in the body water. International Commission on Radiological Protection (ICRP) Publication 56 (1989) provides a biokinetic model to calculate dose from the ingestion of organically bound dietary tritium. The model predicts that the dose from the ingestion of organically bound dietary tritium is about 2.3 times higher than from the ingestion of the same activity of tritiated water. Under steady-state conditions, the calculated dose rate (using a first-principles approach) from the ingestion of dietary organically bound tritium can be twice that from the ingestion of tritiated water. For an adult, the upper-bound dose estimate for the ingestion of dietary organically bound tritium is estimated to be close to 2.3 times higher than that of tritiated water. Therefore, given the uncertainty in the dose calculation with respect to the actual relevant dose, the ICRP biokinetic model for organically bound tritium is sufficient for adult dosimetry. (author)
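    The factor of about 2.3 quoted above follows directly from the ratio of ingestion dose coefficients. A minimal sketch, using the commonly cited adult coefficients for tritiated water (HTO) and organically bound tritium (OBT), quoted from memory; verify against the ICRP publications before relying on them:

```python
# Adult ingestion dose coefficients (Sv/Bq); the HTO and OBT values are
# the commonly cited ICRP figures, quoted from memory, not from this
# report -- verify against the ICRP publications before use.
DOSE_COEFF_HTO = 1.8e-11  # tritiated water
DOSE_COEFF_OBT = 4.2e-11  # organically bound tritium

def committed_dose_sv(intake_bq, coeff):
    """Committed effective dose from a single ingestion intake."""
    return intake_bq * coeff

intake = 1.0e6  # Bq, hypothetical dietary intake
dose_hto = committed_dose_sv(intake, DOSE_COEFF_HTO)
dose_obt = committed_dose_sv(intake, DOSE_COEFF_OBT)

print(f"HTO: {dose_hto * 1e6:.1f} uSv, OBT: {dose_obt * 1e6:.1f} uSv, "
      f"ratio: {dose_obt / dose_hto:.2f}")  # ratio ~2.3, as in the text
```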

  12. On the complexity of measuring forests microclimate and interpreting its relevance in habitat ecology: the example of Ixodes ricinus ticks

    Directory of Open Access Journals (Sweden)

    Denise Boehnke

    2017-11-01

    Background: Ecological field research on the influence of meteorological parameters on a forest inhabiting species is confronted with the complex relations between measured data and the real conditions the species is exposed to. This study highlights this complexity for the example of Ixodes ricinus. This species lives mainly in forest habitats near the ground, but field research on impacts of meteorological conditions on population dynamics is often based on data from nearby official weather stations or occasional in situ measurements. In addition, studies use very different data approaches to analyze comparable research questions. This study is an extensive examination of the methodology used to analyze the impact of meteorological parameters on Ixodes ricinus and proposes a methodological approach that tackles the underlying complexity. Methods: Our specifically developed measurement concept was implemented at 25 forest study sites across Baden-Württemberg, Germany. Meteorological weather stations recorded data in situ and continuously between summer 2012 and autumn 2015, including relative humidity measures in the litter layer and different heights above it (50 cm, 2 m). Hourly averages of relative humidity were calculated and compared with data from the nearest official weather station. Results: Data measured directly in the forest can differ dramatically from conditions recorded at official weather stations. In general, data indicate a remarkable relative humidity decrease from inside to outside the forest and from ground to atmosphere. Relative humidity values measured in the litter layer were, on average, 24% higher than the official data and were much more balanced, especially in summer. Conclusions: The results illustrate the need for, and benefit of, continuous in situ measurements to grasp the complex relative humidity conditions in forests. Data from official weather stations do not accurately represent actual humidity conditions in

  13. On the complexity of measuring forests microclimate and interpreting its relevance in habitat ecology: the example of Ixodes ricinus ticks.

    Science.gov (United States)

    Boehnke, Denise; Gebhardt, Reiner; Petney, Trevor; Norra, Stefan

    2017-11-06

    Ecological field research on the influence of meteorological parameters on a forest inhabiting species is confronted with the complex relations between measured data and the real conditions the species is exposed to. This study highlights this complexity for the example of Ixodes ricinus. This species lives mainly in forest habitats near the ground, but field research on impacts of meteorological conditions on population dynamics is often based on data from nearby official weather stations or occasional in situ measurements. In addition, studies use very different data approaches to analyze comparable research questions. This study is an extensive examination of the methodology used to analyze the impact of meteorological parameters on Ixodes ricinus and proposes a methodological approach that tackles the underlying complexity. Our specifically developed measurement concept was implemented at 25 forest study sites across Baden-Württemberg, Germany. Meteorological weather stations recorded data in situ and continuously between summer 2012 and autumn 2015, including relative humidity measures in the litter layer and different heights above it (50 cm, 2 m). Hourly averages of relative humidity were calculated and compared with data from the nearest official weather station. Data measured directly in the forest can differ dramatically from conditions recorded at official weather stations. In general, data indicate a remarkable relative humidity decrease from inside to outside the forest and from ground to atmosphere. Relative humidity values measured in the litter layer were, on average, 24% higher than the official data and were much more balanced, especially in summer. The results illustrate the need for, and benefit of, continuous in situ measurements to grasp the complex relative humidity conditions in forests. Data from official weather stations do not accurately represent actual humidity conditions in forest stands and the explanatory power of short period and
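    The processing step described (hourly averaging per sensor, then comparison against the nearest official station) is straightforward to sketch. The readings below are invented, chosen only to mimic the reported litter-layer versus station contrast:

```python
from datetime import datetime, timedelta
from statistics import mean

# Hypothetical 10-minute relative-humidity readings (%) over one hour,
# for a litter-layer logger and the nearest official station.
start = datetime(2014, 7, 1, 12, 0)
litter = [(start + timedelta(minutes=10 * i), rh)
          for i, rh in enumerate([96, 95, 97, 94, 96, 95])]
official = [(start + timedelta(minutes=10 * i), rh)
            for i, rh in enumerate([70, 68, 71, 69, 70, 72])]

def hourly_average(readings):
    """Group readings by the hour they fall in and average each group."""
    buckets = {}
    for t, rh in readings:
        buckets.setdefault(t.replace(minute=0, second=0), []).append(rh)
    return {hour: mean(vals) for hour, vals in buckets.items()}

litter_avg = hourly_average(litter)
official_avg = hourly_average(official)
for hour in litter_avg:
    diff = litter_avg[hour] - official_avg[hour]
    print(f"{hour:%Y-%m-%d %H:00}  litter {litter_avg[hour]:.1f}%  "
          f"official {official_avg[hour]:.1f}%  diff {diff:+.1f}%")
```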

  14. Size and complexity in model financial systems

    Science.gov (United States)

    Arinaminpathy, Nimalan; Kapadia, Sujit; May, Robert M.

    2012-01-01

    The global financial crisis has precipitated an increasing appreciation of the need for a systemic perspective toward financial stability. For example: What role do large banks play in systemic risk? How should capital adequacy standards recognize this role? How is stability shaped by concentration and diversification in the financial system? We explore these questions using a deliberately simplified, dynamic model of a banking system that combines three different channels for direct transmission of contagion from one bank to another: liquidity hoarding, asset price contagion, and the propagation of defaults via counterparty credit risk. Importantly, we also introduce a mechanism for capturing how swings in “confidence” in the system may contribute to instability. Our results highlight that the importance of relatively large, well-connected banks in system stability scales more than proportionately with their size: the impact of their collapse arises not only from their connectivity, but also from their effect on confidence in the system. Imposing tougher capital requirements on larger banks than smaller ones can thus enhance the resilience of the system. Moreover, these effects are more pronounced in more concentrated systems, and continue to apply, even when allowing for potential diversification benefits that may be realized by larger banks. We discuss some tentative implications for policy, as well as conceptual analogies in ecosystem stability and in the control of infectious diseases. PMID:23091020
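    Of the three transmission channels, the counterparty default channel is the easiest to sketch in isolation. The toy cascade below uses invented balance sheets and omits liquidity hoarding, asset-price contagion, and the paper's confidence mechanism:

```python
# Toy default-cascade sketch of the counterparty credit channel only.
# Balance sheets are illustrative; all other channels are omitted.
capital = {"A": 10.0, "B": 4.0, "C": 3.0, "D": 8.0}
# exposures[x][y]: amount bank x has lent to bank y (lost if y defaults)
exposures = {
    "A": {"B": 6.0},
    "B": {"C": 5.0},
    "C": {"D": 1.0},
    "D": {"A": 2.0},
}

def cascade(initial_failure):
    """Propagate defaults: a bank fails once its interbank losses from
    already-failed counterparties reach its capital buffer."""
    failed = {initial_failure}
    changed = True
    while changed:
        changed = False
        for bank, k in capital.items():
            if bank in failed:
                continue
            loss = sum(amount
                       for debtor, amount in exposures.get(bank, {}).items()
                       if debtor in failed)
            if loss >= k:
                failed.add(bank)
                changed = True
    return failed

print("failure of C drags down:", sorted(cascade("C")))
print("failure of D drags down:", sorted(cascade("D")))
```

    In the full model the failure of a large bank would additionally depress asset prices and confidence, which is why its impact scales more than proportionately with size.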

  15. Algebraic computability and enumeration models recursion theory and descriptive complexity

    CERN Document Server

    Nourani, Cyrus F

    2016-01-01

    This book, Algebraic Computability and Enumeration Models: Recursion Theory and Descriptive Complexity, presents new techniques with functorial models to address important areas on pure mathematics and computability theory from the algebraic viewpoint. The reader is first introduced to categories and functorial models, with Kleene algebra examples for languages. Functorial models for Peano arithmetic are described toward important computational complexity areas on a Hilbert program, leading to computability with initial models. Infinite language categories are also introduced to explain descriptive complexity with recursive computability with admissible sets and urelements. Algebraic and categorical realizability is staged on several levels, addressing new computability questions with omitting types realizably. Further applications to computing with ultrafilters on sets and Turing degree computability are examined. Functorial models computability is presented with algebraic trees realizing intuitionistic type...

  16. Modeling of Complex Life Cycle Prediction Based on Cell Division

    Directory of Open Access Journals (Sweden)

    Fucheng Zhang

    2017-01-01

    Effective fault diagnosis and reasonable life expectancy are of great significance and practical engineering value for the safety, reliability, and maintenance cost of equipment and working environments. At present, equipment life prediction methods include condition-monitoring-based prediction, combined forecasting models, and data-driven approaches, most of which require large amounts of data. To address this issue, we propose learning from the mechanism of cell division in living organisms. By studying the complex multifactor correlation life model, we have established a life prediction model of moderate complexity. In this paper, we model life prediction on cell division. Experiments show that our model can effectively simulate the state of cell division. Using this model as a reference, we will apply it to complex equipment life prediction.

  17. Cadmium toxicity investigated at the physiological and biophysical levels under environmentally relevant conditions using the aquatic model plant Ceratophyllum demersum

    Czech Academy of Sciences Publication Activity Database

    Andresen, Elisa; Kappel, S.; Stärk, H.-J.; Riegger, U.; Borovec, Jakub; Mattusch, J.; Heinz, A.; Schmelzer, C.E.H.; Matoušková, Šárka; Dickinson, B.; Küpper, Hendrik

    2016-01-01

    Vol. 210, No. 4 (2016), pp. 1244-1258 ISSN 0028-646X Institutional support: RVO:60077344 ; RVO:67985831 Keywords: Ceratophyllum demersum * Environmentally relevant * Light-harvesting complexes (LHCs) * Toxic metals Subject RIV: CE - Biochemistry; DD - Geochemistry (GLU-S) Impact factor: 7.330, year: 2016

  18. Applications of Nonlinear Dynamics Model and Design of Complex Systems

    CERN Document Server

    In, Visarath; Palacios, Antonio

    2009-01-01

    This edited book is aimed at interdisciplinary, device-oriented applications of nonlinear science theory and methods in complex systems, in particular applications directed to nonlinear phenomena with space and time characteristics. Examples include: complex networks of magnetic sensor systems, coupled nano-mechanical oscillators, nano-detectors, microscale devices, stochastic resonance in multi-dimensional chaotic systems, biosensors, and stochastic signal quantization. "Applications of Nonlinear Dynamics: Model and Design of Complex Systems" brings together the work of scientists and engineers that are applying ideas and methods from nonlinear dynamics to design and fabricate complex systems.

  19. Coping with Complexity Model Reduction and Data Analysis

    CERN Document Server

    Gorban, Alexander N

    2011-01-01

    This volume contains the extended version of selected talks given at the international research workshop 'Coping with Complexity: Model Reduction and Data Analysis', Ambleside, UK, August 31 - September 4, 2009. This book is deliberately broad in scope and aims at promoting new ideas and methodological perspectives. The topics of the chapters range from theoretical analysis of complex and multiscale mathematical models to applications in e.g., fluid dynamics and chemical kinetics.

  20. Mathematical Models to Determine Stable Behavior of Complex Systems

    Science.gov (United States)

    Sumin, V. I.; Dushkin, A. V.; Smolentseva, T. E.

    2018-05-01

    The paper analyzes the possibility of predicting the functioning of a complex dynamic system with a significant amount of circulating information and a large number of random factors impacting its functioning. The functioning of such a system is described in terms of chaotic states, self-organized criticality and bifurcations. This problem may be resolved by modeling such systems as dynamic ones, without applying stochastic models, and by taking strange attractors into account.

  1. Relevant pH and lipase for in vitro models of gastric digestion.

    Science.gov (United States)

    Sams, Laura; Paume, Julie; Giallo, Jacqueline; Carrière, Frédéric

    2016-01-01

    The development of in vitro digestion models relies on the availability of in vivo data such as digestive enzyme levels and pH values recorded in the course of meal digestion. The variations of these parameters along the GI tract are important for designing dynamic digestion models but also static models for which the choice of representative conditions of the gastric and intestinal conditions is critical. Simulating gastric digestion with a static model and a single set of parameters is particularly challenging because the variations in pH and enzyme concentration occurring in the stomach are much broader than those occurring in the small intestine. A review of the literature on this topic reveals that most models of gastric digestion use very low pH values that are not representative of the fed conditions. This is illustrated here by showing the variations in gastric pH as a function of meal gastric emptying instead of time. This representation highlights those pH values that are the most relevant for testing meal digestion in the stomach. Gastric lipolysis is still largely ignored or is performed with microbial lipases. In vivo data on gastric lipase and lipolysis have however been collected in humans and dogs during test meals. The biochemical characterization of gastric lipase has shown that this enzyme is rather unique among lipases: (i) stability and activity in the pH range 2 to 7 with an optimum at pH 4-5.4; (ii) high tensioactivity that allows resistance to bile salts and penetration into phospholipid layers covering TAG droplets; (iii) sn-3 stereospecificity for TAG hydrolysis; and (iv) resistance to pepsin. Most of these properties have been known for more than two decades and should provide a rational basis for the replacement of gastric lipase by other lipases when gastric lipase is not available.
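
    The representation advocated above, gastric pH as a function of meal fraction emptied rather than time, can be sketched with a simple piecewise-linear interpolation. The anchor points below are invented for the illustration and are NOT the values reported in the review:

```python
def gastric_ph(fraction_emptied, anchors=((0.0, 6.5), (0.5, 4.0), (1.0, 2.0))):
    """Piecewise-linear pH as a function of meal fraction emptied (0..1).
    The anchor points are hypothetical placeholders, not measured data."""
    pts = sorted(anchors)
    if fraction_emptied <= pts[0][0]:
        return pts[0][1]
    for (f0, p0), (f1, p1) in zip(pts, pts[1:]):
        if fraction_emptied <= f1:
            w = (fraction_emptied - f0) / (f1 - f0)
            return p0 + w * (p1 - p0)
    return pts[-1][1]
```

With real in vivo anchor points, such a curve would let a static model pick a pH representative of a chosen stage of gastric emptying instead of a fixed fasting-state value.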

  2. On the continuing relevance of Mandelbrot's non-ergodic fractional renewal models of 1963 to 1967

    Science.gov (United States)

    Watkins, Nicholas W.

    2017-12-01

    The problem of "1/f" noise has been with us for about a century. Because it is so often framed in Fourier spectral language, the most famous solutions have tended to be the stationary long range dependent (LRD) models such as Mandelbrot's fractional Gaussian noise. In view of the increasing importance to physics of non-ergodic fractional renewal models, and their links to the CTRW, I present preliminary results of my research into the history of Mandelbrot's very little known work in that area from 1963 to 1967. I speculate about how the lack of awareness of this work in the physics and statistics communities may have affected the development of complexity science, and I discuss the differences between the Hurst effect, "1/f" noise and LRD, concepts which are often treated as equivalent. Contribution to the "Topical Issue: Continuous Time Random Walk Still Trendy: Fifty-year History, Current State and Outlook", edited by Ryszard Kutner and Jaume Masoliver.
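
    Since the "1/f" problem is usually framed in Fourier spectral language, a minimal stdlib sketch of the standard diagnostic, the log-log slope of the periodogram, may be useful; the random-walk input (spectral slope near -2) is an illustration only, not one of Mandelbrot's fractional renewal models:

```python
import cmath, math, random

def periodogram_slope(x):
    """Least-squares slope of log power vs log frequency for a real signal,
    using a naive O(N^2) DFT (fine for short illustrative series)."""
    n = len(x)
    mean = sum(x) / n
    xs = [v - mean for v in x]
    pts = []
    for k in range(1, n // 2):
        s = sum(xs[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
        pts.append((math.log(k / n), math.log(abs(s) ** 2 / n)))
    mf = sum(f for f, _ in pts) / len(pts)
    mp = sum(p for _, p in pts) / len(pts)
    num = sum((f - mf) * (p - mp) for f, p in pts)
    den = sum((f - mf) ** 2 for f, _ in pts)
    return num / den

random.seed(1)
steps = [random.gauss(0, 1) for _ in range(256)]  # white noise: slope near 0
walk, s = [], 0.0
for v in steps:                                   # random walk: slope near -2
    s += v
    walk.append(s)
```

The point of the record above is precisely that such a spectral exponent, the Hurst effect, and long range dependence are distinct concepts, even though a single slope estimate like this is often used interchangeably for all three.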

  3. A Coordination Chemistry Approach to Fine-Tune the Physicochemical Parameters of Lanthanide Complexes Relevant to Medical Applications.

    Science.gov (United States)

    Le Fur, Mariane; Molnár, Enikő; Beyler, Maryline; Kálmán, Ferenc K; Fougère, Olivier; Esteban-Gómez, David; Rousseaux, Olivier; Tripier, Raphaël; Tircsó, Gyula; Platas-Iglesias, Carlos

    2018-03-02

    The geometric features of two pyclen-based ligands possessing identical donor atoms but different site organization have a profound impact on their complexation properties toward lanthanide ions. The ligand containing two acetate groups and a picolinate arm arranged in a symmetrical fashion (L1) forms a Gd3+ complex that is two orders of magnitude less stable than its dissymmetric analogue GdL2. Besides, GdL1 experiences much faster dissociation following the acid-catalyzed mechanism than GdL2. On the contrary, GdL1 exhibits a lower exchange rate of the coordinated water molecule compared to GdL2. These very different properties are related to the different strengths of the Gd-ligand bonds associated with steric effects, which hinder the coordination of a water molecule in GdL2 and the binding of acetate groups in GdL1. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
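
    The stability comparison can be made arithmetic: "two orders of magnitude less stable" corresponds to a difference of about 2 in log K between the two complexes. A trivial sketch (the actual log K values of GdL1 and GdL2 are not reproduced here):

```python
def stability_ratio(delta_log_k):
    """Ratio of thermodynamic stability constants K1/K2
    given the difference in their log K values."""
    return 10 ** delta_log_k

# a delta log K of about 2 means one complex is ~100x more stable
```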

  4. Translational research in immune senescence: Assessing the relevance of current models

    Science.gov (United States)

    High, Kevin P.; Akbar, Arne N.; Nikolich-Zugich, Janko

    2014-01-01

    Advancing age is accompanied by profound changes in immune function; some are induced by the loss of critical niches that support development of naïve cells (e.g. thymic involution), others by the intrinsic physiology of long-lived cells attempting to maintain homeostasis, still others by extrinsic effects such as oxidative stress or long-term exposure to antigen due to persistent viral infections. Once compensatory mechanisms can no longer maintain a youthful phenotype the end result is the immune senescent milieu – one characterized by chronic, low grade, systemic inflammation and impaired responses to immune challenge, particularly when encountering new antigens. This state is associated with progression of chronic illnesses like atherosclerosis and dementia, and an increased risk of acute illness, disability and death in older adults. The complex interaction between immune senescence and chronic illness provides an ideal landscape for translational research with the potential to greatly affect human health. However, current animal models and even human investigative strategies for immune senescence have marked limitations, and the reductionist paradigm itself may be poorly suited to meet these challenges. A new paradigm, one that embraces complexity as a core feature of research in older adults is required to address the critical health issues facing the burgeoning senior population, the group that consumes the majority of healthcare resources. In this review, we outline the major advantages and limitations of current models and offer suggestions for how to move forward. PMID:22633440

  5. Understanding complex urban systems multidisciplinary approaches to modeling

    CERN Document Server

    Gurr, Jens; Schmidt, J

    2014-01-01

    Understanding Complex Urban Systems takes as its point of departure the insight that the challenges of global urbanization and the complexity of urban systems cannot be understood – let alone ‘managed’ – by sectoral and disciplinary approaches alone. But while there has recently been significant progress in broadening and refining the methodologies for the quantitative modeling of complex urban systems, in deepening the theoretical understanding of cities as complex systems, or in illuminating the implications for urban planning, there is still a lack of well-founded conceptual thinking on the methodological foundations and the strategies of modeling urban complexity across the disciplines. Bringing together experts from the fields of urban and spatial planning, ecology, urban geography, real estate analysis, organizational cybernetics, stochastic optimization, and literary studies, as well as specialists in various systems approaches and in transdisciplinary methodologies of urban analysis, the volum...

  6. Dynamic complexities in a parasitoid-host-parasitoid ecological model

    International Nuclear Information System (INIS)

    Yu Hengguo; Zhao Min; Lv Songjuan; Zhu Lili

    2009-01-01

    Chaotic dynamics have been observed in a wide range of population models. In this study, the complex dynamics in a discrete-time ecological model of parasitoid-host-parasitoid are presented. The model shows that the superiority coefficient not only stabilizes the dynamics, but may strongly destabilize them as well. Many forms of complex dynamics were observed, including pitchfork bifurcation with quasi-periodicity, period-doubling cascade, chaotic crisis, chaotic bands with narrow or wide periodic window, intermittent chaos, and supertransient behavior. Furthermore, computation of the largest Lyapunov exponent demonstrated the chaotic dynamic behavior of the model
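
    The largest-Lyapunov-exponent computation mentioned above can be sketched with the two-trajectory (Benettin-style) method: follow two orbits a tiny distance apart and average the log of their separation growth, renormalising each step. The density-dependent Nicholson-Bailey map below is a generic stand-in, NOT the exact parasitoid-host-parasitoid model of the study:

```python
import math

def nb_step(h, p, r=3.0, a=1.0, k=1.0):
    """One step of a density-dependent Nicholson-Bailey host-parasitoid map
    (a textbook stand-in for the paper's three-species model)."""
    h_next = h * math.exp(r * (1 - h / k) - a * p)
    p_next = h * (1 - math.exp(-a * p))
    return h_next, p_next

def largest_lyapunov(n=4000, d0=1e-8):
    """Benettin two-trajectory estimate of the largest Lyapunov exponent."""
    h, p = 0.5, 0.2
    h2, p2 = h + d0, p
    total = 0.0
    for _ in range(n):
        h, p = nb_step(h, p)
        h2, p2 = nb_step(h2, p2)
        d = math.hypot(h2 - h, p2 - p)
        if d == 0.0:                 # guard against exact coincidence
            h2, p2 = h + d0, p
            continue
        total += math.log(d / d0)
        h2 = h + (h2 - h) * d0 / d   # renormalise separation back to d0
        p2 = p + (p2 - p) * d0 / d
    return total / n
```

A positive estimate indicates chaotic dynamics, which is the criterion the authors apply to their own model.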

  7. Dynamic complexities in a parasitoid-host-parasitoid ecological model

    Energy Technology Data Exchange (ETDEWEB)

    Yu Hengguo [School of Mathematic and Information Science, Wenzhou University, Wenzhou, Zhejiang 325035 (China); Zhao Min [School of Life and Environmental Science, Wenzhou University, Wenzhou, Zhejiang 325027 (China)], E-mail: zmcn@tom.com; Lv Songjuan; Zhu Lili [School of Mathematic and Information Science, Wenzhou University, Wenzhou, Zhejiang 325035 (China)

    2009-01-15

    Chaotic dynamics have been observed in a wide range of population models. In this study, the complex dynamics in a discrete-time ecological model of parasitoid-host-parasitoid are presented. The model shows that the superiority coefficient not only stabilizes the dynamics, but may strongly destabilize them as well. Many forms of complex dynamics were observed, including pitchfork bifurcation with quasi-periodicity, period-doubling cascade, chaotic crisis, chaotic bands with narrow or wide periodic window, intermittent chaos, and supertransient behavior. Furthermore, computation of the largest Lyapunov exponent demonstrated the chaotic dynamic behavior of the model.

  8. Relevance of the c-statistic when evaluating risk-adjustment models in surgery.

    Science.gov (United States)

    Merkow, Ryan P; Hall, Bruce L; Cohen, Mark E; Dimick, Justin B; Wang, Edward; Chow, Warren B; Ko, Clifford Y; Bilimoria, Karl Y

    2012-05-01

    The measurement of hospital quality based on outcomes requires risk adjustment. The c-statistic is a popular tool used to judge model performance, but can be limited, particularly when evaluating specific operations in focused populations. Our objectives were to examine the interpretation and relevance of the c-statistic when used in models with increasingly similar case mix and to consider an alternative perspective on model calibration based on a graphical depiction of model fit. From the American College of Surgeons National Surgical Quality Improvement Program (2008-2009), patients were identified who underwent a general surgery procedure, and procedure groups were increasingly restricted: colorectal-all, colorectal-elective cases only, and colorectal-elective cancer cases only. Mortality and serious morbidity outcomes were evaluated using logistic regression-based risk adjustment, and model c-statistics and calibration curves were used to compare model performance. During the study period, 323,427 general, 47,605 colorectal-all, 39,860 colorectal-elective, and 21,680 colorectal cancer patients were studied. Mortality ranged from 1.0% in general surgery to 4.1% in the colorectal-all group, and serious morbidity ranged from 3.9% in general surgery to 12.4% in the colorectal-all procedural group. As case mix was restricted, c-statistics progressively declined from the general to the colorectal cancer surgery cohorts for both mortality and serious morbidity (mortality: 0.949 to 0.866; serious morbidity: 0.861 to 0.668). Calibration was evaluated graphically by examining predicted vs observed number of events over risk deciles. For both mortality and serious morbidity, there was no qualitative difference in calibration identified between the procedure groups. 
In the present study, we demonstrate how the c-statistic can become less informative and, in certain circumstances, can lead to incorrect model-based conclusions, as case mix is restricted and patients become
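
    The two diagnostics compared in this study can be sketched in a few lines of stdlib Python; this is a generic illustration of the c-statistic and decile-based calibration, not the ACS NSQIP risk-adjustment models themselves:

```python
def c_statistic(y_true, y_score):
    """Probability that a randomly chosen event outranks a non-event
    (ties count one half): the ROC area, i.e. the c-statistic."""
    pos = [s for y, s in zip(y_true, y_score) if y == 1]
    neg = [s for y, s in zip(y_true, y_score) if y == 0]
    wins = sum((p > q) + 0.5 * (p == q) for p in pos for q in neg)
    return wins / (len(pos) * len(neg))

def calibration_deciles(y_true, y_score, bins=10):
    """(expected events, observed events) per risk decile, the quantities
    plotted in a graphical calibration check."""
    order = sorted(range(len(y_score)), key=lambda i: y_score[i])
    n, out = len(order), []
    for b in range(bins):
        idx = order[b * n // bins:(b + 1) * n // bins]
        out.append((sum(y_score[i] for i in idx),
                    sum(y_true[i] for i in idx)))
    return out
```

The study's point follows directly from the first function: the c-statistic is a rank statistic over case mix, so restricting the cohort to increasingly similar patients shrinks it even when the decile calibration (second function) is unchanged.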

  9. Constructing an everywhere and locally relevant predictive model of the West-African critical zone

    Science.gov (United States)

    Hector, B.; Cohard, J. M.; Pellarin, T.; Maxwell, R. M.; Cappelaere, B.; Demarty, J.; Grippa, M.; Kergoat, L.; Lebel, T.; Mamadou, O.; Mougin, E.; Panthou, G.; Peugeot, C.; Vandervaere, J. P.; Vischel, T.; Vouillamoz, J. M.

    2017-12-01

    Considering water resources and hydrologic hazards, West Africa is among the regions most vulnerable to both climatic change (e.g. the observed intensification of precipitation) and anthropogenic change. With a demographic growth rate of around +3% per year, the region is experiencing rapid land use changes and increased pressure on surface water and groundwater resources, with observed consequences for the hydrological cycle (water table rise resulting from the Sahelian paradox, increase in flood occurrence, etc.). Managing large hydrosystems (such as transboundary aquifers or river basins like the Niger) requires anticipating such changes. However, the region significantly lacks the observations needed to construct and validate critical zone (CZ) models able to predict the future hydrologic regime, and it comprises hydrosystems that span strong environmental gradients (e.g. geological, climatic, ecological) with highly different dominant hydrological processes. We address these issues by constructing a high-resolution (1 km²) regional-scale physically-based model using ParFlow-CLM, which allows modeling a wide range of processes without prior knowledge of their relative dominance. Our approach combines modeling at multiple scales, from local to meso and regional, within the same theoretical framework. The local- and meso-scale models are evaluated against the rich AMMA-CATCH CZ observation database, which covers 3 supersites with contrasted environments in Benin (lat. 9.8°N), Niger (lat. 13.3°N) and Mali (lat. 15.3°N). At the regional scale, the lack of a relevant map of soil hydrodynamic parameters is addressed using remote sensing data assimilation. Our first results show the model's ability to reproduce the known dominant hydrological processes (runoff generation, ET, groundwater recharge…) across the major West-African regions and allow us to conduct virtual experiments exploring the impact of global changes on the hydrosystems. This approach is a first step toward the construction of

  10. Short ensembles: An Efficient Method for Discerning Climate-relevant Sensitivities in Atmospheric General Circulation Models

    Energy Technology Data Exchange (ETDEWEB)

    Wan, Hui; Rasch, Philip J.; Zhang, Kai; Qian, Yun; Yan, Huiping; Zhao, Chun

    2014-09-08

    This paper explores the feasibility of an experimentation strategy for investigating sensitivities in fast components of atmospheric general circulation models. The basic idea is to replace the traditional serial-in-time long-term climate integrations by representative ensembles of shorter simulations. The key advantage of the proposed method lies in its efficiency: since fewer days of simulation are needed, the computational cost is lower, and because individual realizations are independent and can be integrated simultaneously, the new dimension of parallelism can dramatically reduce the turnaround time in benchmark tests, sensitivity studies, and model tuning exercises. The strategy is not appropriate for exploring the sensitivity of all model features, but it is very effective in many situations. Two examples are presented using the Community Atmosphere Model version 5. The first example demonstrates that the method is capable of characterizing the model cloud and precipitation sensitivity to time step length. A nudging technique is also applied to an additional set of simulations to help understand the contribution of physics-dynamics interaction to the detected time step sensitivity. In the second example, multiple empirical parameters related to cloud microphysics and aerosol lifecycle are perturbed simultaneously in order to explore which parameters have the largest impact on the simulated global mean top-of-atmosphere radiation balance. Results show that in both examples, short ensembles are able to correctly reproduce the main signals of model sensitivities revealed by traditional long-term climate simulations for fast processes in the climate system. The efficiency of the ensemble method makes it particularly useful for the development of high-resolution, costly and complex climate models.
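
    The idea of replacing one long serial integration by an ensemble of independent short runs can be illustrated on a toy stationary process; the AR(1) model below is a stand-in for a "fast" model component, and nothing in it comes from CAM5 itself:

```python
import random
import statistics

def ar1(n, phi=0.6, sigma=1.0, x0=0.0):
    """Simulate an AR(1) process x_t = phi * x_{t-1} + noise."""
    x, out = x0, []
    for _ in range(n):
        x = phi * x + random.gauss(0.0, sigma)
        out.append(x)
    return out

random.seed(0)

# one long "serial-in-time" integration
long_mean = statistics.fmean(ar1(6000))

# ensemble of independent short runs; each discards its own 50-step spin-up,
# and all runs could be integrated in parallel
ens_means = [statistics.fmean(ar1(150)[50:]) for _ in range(60)]
ens_mean = statistics.fmean(ens_means)

# both estimates should be close to the true stationary mean (zero here)
```

The ensemble spends extra samples on spin-up, but the runs are embarrassingly parallel, which is exactly the turnaround-time argument made in the paper.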

  11. Comparison of Structurally Related Alkoxide, Amine, and Thiolate-Ligated M(II) (M = Fe, Co) Complexes: the Influence of Thiolates on the Properties of Biologically Relevant Metal Complexes

    Science.gov (United States)

    Brines, Lisa M.; Villar-Acevedo, Gloria; Kitagawa, Terutaka; Swartz, Rodney D.; Lugo-Mas, Priscilla; Kaminsky, Werner; Benedict, Jason B.; Kovacs, Julie A.

    2009-01-01

    Mechanistic pathways of metalloenzymes are controlled by the metal ion's electronic and magnetic properties, which are tuned by the coordinated ligands. The functional advantage gained by incorporating cysteinates into the active site of non-heme iron enzymes such as superoxide reductase (SOR) is not entirely understood. Herein we compare the structural and redox properties of a series of structurally related thiolate-, alkoxide-, and amine-ligated Fe(II) complexes in order to determine how the thiolate influences properties critical to function. Thiolates are shown to reduce metal ion Lewis acidity relative to alkoxides and amines, and have a strong trans influence, thereby helping to maintain an open coordination site. Comparison of the redox potentials of the structurally analogous compounds described herein indicates that alkoxide ligands favor the higher-valent Fe3+ oxidation state, amine ligands favor the reduced Fe2+ oxidation state, and thiolates fall somewhere in between. These properties provide a functional advantage for substrate-reducing enzymes in that they provide a site at the metal ion for substrate to bind, and a moderate potential that facilitates both substrate reduction and regeneration of the catalytically active reduced state. Redox potentials for structurally related Co(II) complexes are shown to be cathodically shifted relative to their Fe(II) analogues, making them ineffective reducing agents for substrates such as superoxide. PMID:21731109

  12. A marketing mix model for a complex and turbulent environment

    Directory of Open Access Journals (Sweden)

    R. B. Mason

    2007-12-01

    Full Text Available Purpose: This paper is based on the proposition that the choice of marketing tactics is determined, or at least significantly influenced, by the nature of the company’s external environment. It aims to illustrate the type of marketing mix tactics that are suggested for a complex and turbulent environment when marketing and the environment are viewed through a chaos and complexity theory lens. Design/Methodology/Approach: Since chaos and complexity theories are proposed as a good means of understanding the dynamics of complex and turbulent markets, a comprehensive review and analysis of literature on the marketing mix and marketing tactics from a chaos and complexity viewpoint was conducted. From this literature review, a marketing mix model was conceptualised. Findings: A marketing mix model considered appropriate for success in complex and turbulent environments was developed. In such environments, the literature suggests destabilising marketing activities are more effective, whereas stabilising type activities are more effective in simple, stable environments. Therefore the model proposes predominantly destabilising type tactics as appropriate for a complex and turbulent environment such as is currently being experienced in South Africa. Implications: This paper is of benefit to marketers by emphasising a new way to consider the future marketing activities of their companies. How this model can assist marketers and suggestions for research to develop and apply this model are provided. It is hoped that the model suggested will form the basis of empirical research to test its applicability in the turbulent South African environment. Originality/Value: Since businesses and markets are complex adaptive systems, using complexity theory to understand how to cope in complex, turbulent environments is necessary, but has not been widely researched. 
In fact, most chaos and complexity theory work in marketing has concentrated on marketing strategy, with

  13. A conscious mouse model of gastric ileus using clinically relevant endpoints

    Directory of Open Access Journals (Sweden)

    Shao Yuanlin

    2005-06-01

    Full Text Available Abstract Background Gastric ileus is an unsolved clinical problem and current treatment is limited to supportive measures. Models of ileus using anesthetized animals, muscle strips or isolated smooth muscle cells do not adequately reproduce the clinical situation. Thus, previous studies using these techniques have not led to a clear understanding of the pathophysiology of ileus. The feasibility of using food intake and fecal output as simple, clinically relevant endpoints for monitoring ileus in a conscious mouse model was evaluated by assessing the severity and time course of various insults known to cause ileus. Methods Delayed food intake and fecal output associated with ileus was monitored after intraperitoneal injection of endotoxin, laparotomy with bowel manipulation, thermal injury or cerulein induced acute pancreatitis. The correlation of decreased food intake after endotoxin injection with gastric ileus was validated by measuring gastric emptying. The effect of endotoxin on general activity level and feeding behavior was also determined. Small bowel transit was measured using a phenol red marker. Results Each insult resulted in a transient and comparable decrease in food intake and fecal output consistent with the clinical picture of ileus. The endpoints were highly sensitive to small changes in low doses of endotoxin, the extent of bowel manipulation, and cerulein dose. The delay in food intake directly correlated with delayed gastric emptying. Changes in general activity and feeding behavior were insufficient to explain decreased food intake. Intestinal transit remained unchanged at the times measured. Conclusion Food intake and fecal output are sensitive markers of gastric dysfunction in four experimental models of ileus. In the mouse, delayed gastric emptying appears to be the major cause of the anorexic effect associated with ileus. Gastric dysfunction is more important than small bowel dysfunction in this model. Recovery of

  14. Translational relevance of rodent models of hypothalamic-pituitary-adrenal function and stressors in adolescence

    Directory of Open Access Journals (Sweden)

    Cheryl M. McCormick

    2017-02-01

    Full Text Available Elevations in glucocorticoids that result from environmental stressors can have programming effects on brain structure and function when the exposure occurs during sensitive periods that involve heightened neural development. In recent years, adolescence has gained increasing attention as another sensitive period of development, a period in which pubertal transitions may increase the vulnerability to stressors. There are similarities in physical and behavioural development between humans and rats, and rats have been used effectively as an animal model of adolescence and the unique plasticity of this period of ontogeny. This review focuses on benefits and challenges of rats as a model for translational research on hypothalamic-pituitary-adrenal (HPA) function and stressors in adolescence, highlighting important parallels and contrasts between adolescent rats and humans, and we review the main stress procedures that are used in investigating HPA stress responses and their consequences in adolescence in rats. We conclude that a greater focus on timing of puberty as a factor in research in adolescent rats may increase the translational relevance of the findings.

  15. Upper Limb Immobilisation: A Neural Plasticity Model with Relevance to Poststroke Motor Rehabilitation

    Directory of Open Access Journals (Sweden)

    Leonardo Furlan

    2016-01-01

    Full Text Available Advances in our understanding of the neural plasticity that occurs after hemiparetic stroke have contributed to the formulation of theories of poststroke motor recovery. These theories, in turn, have underpinned contemporary motor rehabilitation strategies for treating motor deficits after stroke, such as upper limb hemiparesis. However, a relative drawback has been that, in general, these strategies are most compatible with the recovery profiles of relatively high-functioning stroke survivors and therefore do not easily translate into benefit to those individuals sustaining low-functioning upper limb hemiparesis, who otherwise have poorer residual function. For these individuals, alternative motor rehabilitation strategies are currently needed. In this paper, we will review upper limb immobilisation studies that have been conducted with healthy adult humans and animals. Then, we will discuss how the findings from these studies could inspire the creation of a neural plasticity model that is likely to be of particular relevance to the context of motor rehabilitation after stroke. For instance, as will be elaborated, such a model could contribute to the development of alternative motor rehabilitation strategies for treating poststroke upper limb hemiparesis. The implications of the findings from those immobilisation studies for contemporary motor rehabilitation strategies will also be discussed and perspectives for future research in this arena will be provided as well.

  16. Reassessing Geophysical Models of the Bushveld Complex in 3D

    Science.gov (United States)

    Cole, J.; Webb, S. J.; Finn, C.

    2012-12-01

    Conceptual geophysical models of the Bushveld Igneous Complex show three possible geometries for its mafic component: 1) separate intrusions with vertical feeders for the eastern and western lobes (Cousins, 1959); 2) separate dipping sheets for the two lobes (Du Plessis and Kleywegt, 1987); 3) a single saucer-shaped unit connected at depth in the central part between the two lobes (Cawthorn et al, 1998). Model three incorporates isostatic adjustment of the crust in response to the weight of the dense mafic material. The model was corroborated by results of a broadband seismic array over southern Africa, known as the Southern African Seismic Experiment (SASE) (Nguuri, et al, 2001; Webb et al, 2004). This new information about the crustal thickness only became available in the last decade and could not be considered in the earlier models. Nevertheless, there is still ongoing debate as to which model is correct. All of the models published up to now have been done in 2 or 2.5 dimensions, which is not well suited to modelling the complex geometry of the Bushveld intrusion. 3D modelling takes into account the effects of variations in the geometry and geophysical properties of lithologies in a full three-dimensional sense and therefore affects the shape and amplitude of the calculated fields. The main question is how the new knowledge of the increased crustal thickness, as well as the complexity of the Bushveld Complex, will impact the gravity fields calculated for the existing conceptual models when modelling in 3D. The three published geophysical models were remodelled using full 3D potential field modelling software, including the crustal thickness obtained from the SASE. The aim was not to construct very detailed models, but to test the existing conceptual models in an equally conceptual way. Firstly, a specific 2D model was recreated in 3D, without crustal thickening, to establish the difference between 2D and 3D results. Then the thicker crust was added. Including the less
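
    As a zeroth-order sanity check on any of the three conceptual geometries, the Bouguer infinite-slab formula g = 2πGΔρt gives the order of magnitude of the gravity anomaly a dense mafic sheet should produce before any full 3D forward modelling. The density contrast and thickness below are illustrative round numbers, not values from the published models:

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def slab_anomaly_mgal(density_contrast, thickness_m):
    """Gravity anomaly of an infinite horizontal slab (Bouguer formula),
    g = 2*pi*G*drho*t, converted from m/s^2 to mGal (1 mGal = 1e-5 m/s^2)."""
    return 2 * math.pi * G * density_contrast * thickness_m * 1e5

# e.g. a 7 km thick mafic sheet 200 kg/m^3 denser than the country rock
# produces an anomaly on the order of tens of mGal
```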

  17. Electromagnetic modelling of Ground Penetrating Radar responses to complex targets

    Science.gov (United States)

    Pajewski, Lara; Giannopoulos, Antonis

    2014-05-01

    This work deals with the electromagnetic modelling of composite structures for Ground Penetrating Radar (GPR) applications. It was developed within the Short-Term Scientific Mission ECOST-STSM-TU1208-211013-035660, funded by COST Action TU1208 "Civil Engineering Applications of Ground Penetrating Radar". The authors define a set of test concrete structures, hereinafter called cells. The size of each cell is 60 x 100 x 18 cm and the content varies with growing complexity, from a simple cell with a few rebars of different diameters embedded in concrete at increasing depths, to a final cell with a quite complicated pattern, including a layer of tendons between two overlying meshes of rebars. Other cells, of intermediate complexity, contain PVC ducts (air-filled or hosting rebars), steel objects commonly used in civil engineering (such as a pipe, an angle bar, a box section and a u-channel), as well as void and honeycombing defects. One of the cells has a steel mesh embedded in it, overlying two rebars placed diagonally across the corners of the structure. Two cells include a couple of rebars bent into a right angle and placed on top of each other, with a square/round circle lying at the base of the concrete slab. Inspiration for some of these cells is taken from the very interesting experimental work presented in Ref. [1]. For each cell, a subset of models with growing complexity is defined, starting from a simple representation of the cell and ending with a more realistic one. In particular, the model's complexity increases from the geometrical point of view, as well as in terms of how the constitutive parameters of the involved media and GPR antennas are described. Some cells can be simulated in both two and three dimensions; the concrete slab can be approximated as a finite-thickness layer having infinite extension on the transverse plane, thus neglecting how edges affect radargrams, or else its finite size can be fully taken into account.
The permittivity of concrete can be
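
    For models like these, a quick hand check on simulated radargrams is the two-way travel time to a buried target, t = 2d·√εr/c for a lossless, non-magnetic medium. A minimal sketch; the permittivity value in the example is an assumed round number for concrete, not one fitted to the cells:

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def two_way_time_ns(depth_m, eps_r):
    """Two-way travel time (ns) to a reflector at depth depth_m in a medium
    with relative permittivity eps_r (lossless, non-magnetic approximation)."""
    v = C / math.sqrt(eps_r)   # wave velocity in the medium
    return 2 * depth_m * 1e9 / v

# a rebar 10 cm deep in concrete with eps_r ~ 8 returns after roughly 1.9 ns
```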

  18. Planting increases the abundance and structure complexity of soil core functional genes relevant to carbon and nitrogen cycling.

    Science.gov (United States)

    Wang, Feng; Liang, Yuting; Jiang, Yuji; Yang, Yunfeng; Xue, Kai; Xiong, Jinbo; Zhou, Jizhong; Sun, Bo

    2015-09-23

    Plants have an important impact on soil microbial communities and their functions. However, how plants determine the microbial composition and network interactions is still poorly understood. During a four-year field experiment, we investigated the functional gene composition of three types of soils (Phaeozem, Cambisols and Acrisol) under maize planting and bare fallow regimes located in cold temperate, warm temperate and subtropical regions, respectively. The core genes were identified using a high-throughput functional gene microarray (GeoChip 3.0), and functional molecular ecological networks (fMENs) were subsequently developed with the random matrix theory (RMT)-based conceptual framework. Our results demonstrated that planting significantly (P < 0.05) altered the functional gene composition of the soils, and 83.5% of microbial alpha-diversity can be explained by the plant factor. Moreover, planting had significant impacts on the microbial community structure and the network interactions of the microbial communities. The calculated network complexity was higher under maize planting than under bare fallow regimes. The increase in functional genes led to an increase in both soil respiration and nitrification potential with maize planting, indicating that changes in the soil microbial communities and network interactions influenced ecological functioning.
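
    The network-construction step can be sketched generically: correlate functional-gene abundance profiles across samples and keep pairs whose similarity exceeds a cutoff. In the RMT-based fMEN framework the cutoff is derived automatically from random matrix theory; the hand-set threshold and toy profiles below are illustrative only:

```python
import math

def pearson(a, b):
    """Pearson correlation coefficient of two equal-length profiles."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = math.sqrt(sum((x - ma) ** 2 for x in a))
    vb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (va * vb)

def network_edges(profiles, threshold=0.8):
    """Edges of a co-occurrence network: gene pairs whose |Pearson r|
    meets the cutoff (hand-set here; RMT chooses it in the fMEN workflow)."""
    names = list(profiles)
    edges = []
    for i, u in enumerate(names):
        for v in names[i + 1:]:
            if abs(pearson(profiles[u], profiles[v])) >= threshold:
                edges.append((u, v))
    return edges
```

Network complexity measures such as average connectivity then follow directly from the edge list, which is the quantity the study reports as higher under planting.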

  19. Models of Stress in Nonhuman Primates and Their Relevance for Human Psychopathology and Endocrine Dysfunction

    Science.gov (United States)

  20. Models of stress in nonhuman primates and their relevance for human psychopathology and endocrine dysfunction.

    Science.gov (United States)

    Meyer, Jerrold S; Hamel, Amanda F

    2014-01-01

    Stressful life events have been linked to the onset of severe psychopathology and endocrine dysfunction in many patients. Moreover, vulnerability to the later development of such disorders can be increased by stress or adversity during development (e.g., childhood neglect, abuse, or trauma). This review discusses the methodological features and results of various models of stress in nonhuman primates in the context of their potential relevance for human psychopathology and endocrine dysfunction, particularly mood disorders and dysregulation of the hypothalamic-pituitary-adrenocortical (HPA) system. Such models have typically examined the effects of stress on the animals' behavior, endocrine function (primarily the HPA and hypothalamic-pituitary-gonadal systems), and, in some cases, immune status. Manipulations such as relocation and/or removal of an animal from its current social group or, alternatively, formation of a new social group can have adverse effects on all of these outcome measures that may be either transient or more persistent depending on the species, sex, and other experimental conditions. Social primates may also experience significant stress associated with their rank in the group's dominance hierarchy. Finally, stress during prenatal development or during the early postnatal period may have long-lasting neurobiological and endocrine effects that manifest in an altered ability to cope behaviorally and physiologically with later challenges. Whereas early exposure to severe stress usually results in deficient coping abilities, certain kinds of milder stressors can promote subsequent resilience in the animal. We conclude that studies of stress in nonhuman primates can model many features of stress exposure in human populations and that such studies can play a valuable role in helping to elucidate the mechanisms underlying the role of stress in human psychopathology and endocrine dysfunction. © The Author 2014. 
Published by Oxford University Press.

  1. Complexation and molecular modeling studies of europium(III)-gallic acid-amino acid complexes.

    Science.gov (United States)

    Taha, Mohamed; Khan, Imran; Coutinho, João A P

    2016-04-01

    With many metal-based drugs extensively used today in the treatment of cancer, attention has focused on the development of new coordination compounds with antitumor activity, with europium(III) complexes recently introduced as novel anticancer drugs. The aim of this work is to design new Eu(III) complexes with gallic acid, an antioxidant phenolic compound. Gallic acid was chosen because it shows anticancer activity without harming healthy cells. As an antioxidant, it helps to protect human cells against the oxidative damage that is implicated in DNA damage, cancer, and accelerated cell aging. In this work, the formation of binary and ternary complexes of Eu(III) with gallic acid as the primary ligand and the amino acids alanine, leucine, isoleucine, and tryptophan was studied by glass electrode potentiometry in aqueous solution containing 0.1 M NaNO3 at (298.2 ± 0.1) K. Their overall stability constants were evaluated and the concentration distributions of the complex species in solution were calculated. The protonation constants of gallic acid and the amino acids were also determined under our experimental conditions and compared with those predicted by the conductor-like screening model for realistic solvation (COSMO-RS). The geometries of the Eu(III)-gallic acid complexes were characterized by density functional theory (DFT). Spectroscopic UV-visible and photoluminescence measurements were carried out to confirm the formation of Eu(III)-gallic acid complexes in aqueous solutions. Copyright © 2016 Elsevier Inc. All rights reserved.

  2. Modeling Complex Nesting Structures in International Business Research

    DEFF Research Database (Denmark)

    Nielsen, Bo Bernhard; Nielsen, Sabina

    2013-01-01

    While hierarchical random coefficient models (RCM) are often used for the analysis of multilevel phenomena, IB issues often result in more complex nested structures. This paper illustrates how cross-nested multilevel modeling allows for predictor variables and cross-level interactions at multiple (crossed) levels...

  3. Short ensembles: an efficient method for discerning climate-relevant sensitivities in atmospheric general circulation models

    Directory of Open Access Journals (Sweden)

    H. Wan

    2014-09-01

    high-resolution, costly, and complex climate models.

  4. Universal correlators for multi-arc complex matrix models

    International Nuclear Information System (INIS)

    Akemann, G.

    1997-01-01

    The correlation functions of the multi-arc complex matrix model are shown to be universal for any finite number of arcs. The universality classes are characterized by the support of the eigenvalue density and are conjectured to fall into the same classes as the ones recently found for the Hermitian model. This is explicitly shown to be true for the case of two arcs, apart from the known result for one arc. The basic tool is the iterative solution of the loop equation for the complex matrix model with multiple arcs, which provides all multi-loop correlators up to an arbitrary genus. Explicit results for genus one are given for any number of arcs. The two-arc solution is investigated in detail, including the double-scaling limit. In addition universal expressions for the string susceptibility are given for both the complex and Hermitian model. (orig.)

  5. The Anti-Inflammatory Effects of Acupuncture and Their Relevance to Allergic Rhinitis: A Narrative Review and Proposed Model

    Directory of Open Access Journals (Sweden)

    John L. McDonald

    2013-01-01

    Full Text Available Classical literature indicates that acupuncture has been used for millennia to treat numerous inflammatory conditions, including allergic rhinitis. Recent research has examined some of the mechanisms underpinning acupuncture's anti-inflammatory effects, which include mediation by sympathetic and parasympathetic pathways. The hypothalamus-pituitary-adrenal (HPA) axis has been reported to mediate the antioedema effects of acupuncture, but not antihyperalgesic actions during inflammation. Other reported anti-inflammatory effects of acupuncture include an antihistamine action and downregulation of proinflammatory cytokines (such as TNF-α, IL-1β, IL-6, and IL-10), proinflammatory neuropeptides (such as SP, CGRP, and VIP), and neurotrophins (such as NGF and BDNF), which can enhance and prolong the inflammatory response. Acupuncture has been reported to suppress the expression of COX-1, COX-2, and iNOS during experimentally induced inflammation. Downregulation of the expression and sensitivity of the transient receptor potential vanilloid 1 (TRPV1) after acupuncture has been reported. In summary, acupuncture may exert anti-inflammatory effects through a complex neuro-endocrino-immunological network of actions. Many of these generic anti-inflammatory effects of acupuncture are of direct relevance to allergic rhinitis; however, more research is needed to elucidate specifically how immune mechanisms might be modulated by acupuncture in allergic rhinitis, and to this end a proposed model is offered to guide further research.

  6. Cisplatin Resistant Spheroids Model Clinically Relevant Survival Mechanisms in Ovarian Tumors.

    Directory of Open Access Journals (Sweden)

    Winyoo Chowanadisai

    Full Text Available The majority of ovarian tumors eventually recur in a drug resistant form. Using cisplatin sensitive and resistant cell lines assembled into 3D spheroids, we profiled gene expression and identified candidate mechanisms and biological pathways associated with cisplatin resistance. OVCAR-8 human ovarian carcinoma cells were exposed to sub-lethal concentrations of cisplatin to create a matched cisplatin-resistant cell line, OVCAR-8R. Genome-wide gene expression profiling of sensitive and resistant ovarian cancer spheroids identified 3,331 significantly differentially expressed probesets coding for 3,139 distinct protein-coding genes (Fc > 2, FDR < 0.05) (S2 Table). Despite significant expression changes in some transporters including MDR1, cisplatin resistance was not associated with differences in intracellular cisplatin concentration. Cisplatin resistant cells were significantly enriched for a mesenchymal gene expression signature. OVCAR-8R resistance-derived gene sets were significantly more biased to patients with shorter survival. From the most differentially expressed genes, we derived a 17-gene expression signature that identifies ovarian cancer patients with shorter overall survival in three independent datasets. We propose that the use of cisplatin resistant cell lines in 3D spheroid models is a viable approach to gain insight into resistance mechanisms relevant to ovarian tumors in patients. Our data support the emerging concept that ovarian cancers can acquire drug resistance through an epithelial-to-mesenchymal transition.

  7. Automatic Query Generation and Query Relevance Measurement for Unsupervised Language Model Adaptation of Speech Recognition

    Directory of Open Access Journals (Sweden)

    Suzuki Motoyuki

    2009-01-01

    Full Text Available We are developing a method of Web-based unsupervised language model adaptation for recognition of spoken documents. The proposed method chooses keywords from the preliminary recognition result and retrieves Web documents using the chosen keywords. A problem is that the selected keywords tend to contain misrecognized words. The proposed method introduces two new ideas for avoiding the effects of keywords derived from misrecognized words. The first idea is to compose multiple queries from selected keyword candidates so that the misrecognized words and correct words do not fall into one query. The second idea is that the number of Web documents downloaded for each query is determined according to the "query relevance." Combining these two ideas, we can alleviate the bad effects of misrecognized keywords by decreasing the number of Web documents downloaded from queries that contain misrecognized keywords. Finally, we examine a method of determining the number of iterative adaptations based on the recognition likelihood. Experiments have shown that the proposed stopping criterion can determine almost the optimum number of iterations. In the final experiment, the word accuracy without adaptation (55.29%) was improved to 60.38%, which was 1.13 points better than the result of the conventional unsupervised adaptation method (59.25%).
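
    The two ideas above, splitting keyword candidates across several small queries and sizing each query's download quota by its relevance, can be sketched in a few lines. The keyword list, the misrecognized token "bujet", and the relevance scores below are hypothetical placeholders, not values from the paper.

```python
from itertools import combinations

def make_queries(keywords, query_size=2):
    # Idea 1: many small queries, so a single misrecognized keyword
    # cannot contaminate every retrieval request.
    return list(combinations(keywords, query_size))

def allocate_downloads(queries, relevance, total_docs=100):
    # Idea 2: download more Web documents for queries judged more
    # relevant to the preliminary recognition result.
    total = sum(relevance[q] for q in queries)
    return {q: round(total_docs * relevance[q] / total) for q in queries}

# "bujet" stands in for a misrecognized keyword (illustrative only)
queries = make_queries(["budget", "parliament", "bujet"])
scores = {("budget", "parliament"): 0.8,
          ("budget", "bujet"): 0.1,
          ("parliament", "bujet"): 0.1}
quota = allocate_downloads(queries, scores)
```

    With these illustrative scores, the query containing only correctly recognized words receives 80 of the 100 downloads, while the two queries polluted by "bujet" receive 10 each.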

  9. Risk-adjusted Outcomes of Clinically Relevant Pancreatic Fistula Following Pancreatoduodenectomy: A Model for Performance Evaluation.

    Science.gov (United States)

    McMillan, Matthew T; Soi, Sameer; Asbun, Horacio J; Ball, Chad G; Bassi, Claudio; Beane, Joal D; Behrman, Stephen W; Berger, Adam C; Bloomston, Mark; Callery, Mark P; Christein, John D; Dixon, Elijah; Drebin, Jeffrey A; Castillo, Carlos Fernandez-Del; Fisher, William E; Fong, Zhi Ven; House, Michael G; Hughes, Steven J; Kent, Tara S; Kunstman, John W; Malleo, Giuseppe; Miller, Benjamin C; Salem, Ronald R; Soares, Kevin; Valero, Vicente; Wolfgang, Christopher L; Vollmer, Charles M

    2016-08-01

    To evaluate surgical performance in pancreatoduodenectomy using clinically relevant postoperative pancreatic fistula (CR-POPF) occurrence as a quality indicator. Accurate assessment of surgeon and institutional performance requires (1) standardized definitions for the outcome of interest and (2) a comprehensive risk-adjustment process to control for differences in patient risk. This multinational, retrospective study of 4301 pancreatoduodenectomies involved 55 surgeons at 15 institutions. Risk for CR-POPF was assessed using the previously validated Fistula Risk Score, and pancreatic fistulas were stratified by International Study Group criteria. CR-POPF variability was evaluated and hierarchical regression analysis assessed individual surgeon and institutional performance. There was considerable variability in both CR-POPF risk and occurrence. Factors increasing the risk for CR-POPF development included increasing Fistula Risk Score (odds ratio 1.49 per point, P ratio 3.30, P performance outliers were identified at the surgeon and institutional levels. Of the top 10 surgeons (≥15 cases) for nonrisk-adjusted performance, only 6 remained in this high-performing category following risk adjustment. This analysis of pancreatic fistulas following pancreatoduodenectomy demonstrates considerable variability in both the risk and occurrence of CR-POPF among surgeons and institutions. Disparities in patient risk between providers reinforce the need for comprehensive, risk-adjusted modeling when assessing performance based on procedure-specific complications. Furthermore, beyond inherent patient risk factors, surgical decision-making influences fistula outcomes.

  10. Using language models to identify relevant new information in inpatient clinical notes.

    Science.gov (United States)

    Zhang, Rui; Pakhomov, Serguei V; Lee, Janet T; Melton, Genevieve B

    2014-01-01

    Redundant information in clinical notes within electronic health record (EHR) systems is ubiquitous and may negatively impact the use of these notes by clinicians, and, potentially, the efficiency of patient care delivery. Automated methods to identify redundant versus relevant new information may provide a valuable tool for clinicians to better synthesize patient information and navigate to clinically important details. In this study, we investigated the use of language models for identification of new information in inpatient notes, and evaluated our methods using expert-derived reference standards. The best method achieved precision of 0.743, recall of 0.832 and F1-measure of 0.784. The average proportion of redundant information was similar between inpatient and outpatient progress notes (76.6% (SD=17.3%) and 76.7% (SD=14.0%), respectively). Advanced practice providers tended to have higher rates of redundancy in their notes compared to physicians. Future investigation includes the addition of semantic components and visualization of new information.
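
    The abstract does not specify the modeling details, but the core idea, scoring a note's text against a language model trained on the patient's prior notes so that well-predicted (redundant) passages can be separated from poorly predicted (new) ones, can be sketched with a toy add-one-smoothed bigram model. The clinical snippets below are invented illustrations.

```python
import math
from collections import Counter

def train_bigram(corpus_tokens):
    """Return an average log-probability scorer built from prior notes,
    using add-one-smoothed bigram estimates."""
    unigrams = Counter(corpus_tokens)
    bigrams = Counter(zip(corpus_tokens, corpus_tokens[1:]))
    vocab = len(unigrams) + 1
    def avg_logprob(tokens):
        lp = sum(math.log((bigrams[(a, b)] + 1) / (unigrams[a] + vocab))
                 for a, b in zip(tokens, tokens[1:]))
        return lp / max(len(tokens) - 1, 1)
    return avg_logprob

# model of "old information" built from earlier notes (toy corpus)
score = train_bigram("patient stable on metformin blood pressure controlled".split())
redundant = score("patient stable on metformin".split())   # copied text scores high
novel = score("new rash on left arm".split())              # new information scores low
```

    Thresholding such scores (here, simply comparing them) is the mechanism by which redundant passages can be de-emphasized and genuinely new information highlighted.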

  11. Evaluation of relevant information for optimal reflector modeling through data assimilation procedures

    International Nuclear Information System (INIS)

    Argaud, J.P.; Bouriquet, B.; Clerc, T.; Lucet-Sanchez, F.; Poncot, A.

    2015-01-01

    The goal of this study is to determine the amount of information required to obtain a relevant parameter optimisation by data assimilation for physical models in neutronic diffusion calculations, and to identify which information reaches the optimum of accuracy at the lowest cost. To evaluate the quality of the optimisation, we study the covariance matrix that represents the accuracy of the optimised parameter. This matrix is a classical output of the data assimilation procedure, and it is the main information about the accuracy and sensitivity of the optimal parameter determination. We present some results collected in the field of neutronic simulation for PWR-type reactors. We seek to optimise the reflector parameters that characterise the neutronic reflector surrounding the whole reactive core. On the basis of the configuration studies, it has been shown that with data assimilation we can determine a global strategy to optimise the quality of the result with respect to the amount of information provided. The consequence of this is a cost reduction in terms of measurement and/or computing time with respect to the basic approach. Another result is that using multi-campaign data rather than data from a single campaign significantly improves the efficiency of the parameters optimisation
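
    The covariance matrix discussed here quantifies how each additional measurement improves the optimised parameter. In a scalar analogue of the standard data-assimilation analysis equations (a generic sketch, not the authors' neutronics setup), the analysis-error variance is A = (1/B + n/R)^-1 for prior variance B and n independent observations of variance R, which makes the diminishing return on extra measurements explicit.

```python
def analysis_variance(prior_var, obs_var, n_obs):
    # Scalar analogue of the analysis-error covariance
    # A = (B^-1 + H^T R^-1 H)^-1 for n_obs identical, independent observations.
    return 1.0 / (1.0 / prior_var + n_obs / obs_var)

# accuracy improves with every extra observation, but ever more slowly,
# which is the trade-off behind "optimum accuracy at the cheapest cost"
variances = [analysis_variance(1.0, 0.5, n) for n in (1, 2, 4, 8)]
```

    Inspecting the sequence shows the variance falling from 1/3 toward zero with a shrinking payoff per added observation, mirroring the cost/accuracy trade-off studied in the abstract.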

  12. THM-coupled modeling of selected processes in argillaceous rock relevant to rock mechanics

    International Nuclear Information System (INIS)

    Czaikowski, Oliver

    2012-01-01

    Scientific investigations in European countries other than Germany concentrate not only on granite formations (Switzerland, Sweden) but also on argillaceous rock formations (France, Switzerland, Belgium) to assess their suitability as host and barrier rock for the final storage of radioactive waste. In Germany, rock salt has been under thorough study as a host rock over the past few decades. According to a study by the German Federal Institute for Geosciences and Natural Resources, however, not only salt deposits but also argillaceous rock deposits are available at relevant depths and of extensions in space which make final storage of high-level radioactive waste basically possible in Germany. Equally qualified findings about the suitability/unsuitability of non-saline rock formations require fundamental studies to be conducted nationally because of the comparatively low level of knowledge. The article presents basic analyses of coupled mechanical and hydraulic properties of argillaceous rock formations as host rock for a repository. The interaction of various processes is explained on the basis of knowledge derived from laboratory studies, and open problems are deduced. For modeling coupled processes, a simplified analytical computation method is proposed and compared with the results of numerical simulations, and the limits to its application are outlined. (orig.)

  13. Bim Automation: Advanced Modeling Generative Process for Complex Structures

    Science.gov (United States)

    Banfi, F.; Fai, S.; Brumana, R.

    2017-08-01

    The new paradigm of the complexity of modern and historic structures, which are characterised by complex forms, morphological and typological variables, is one of the greatest challenges for building information modelling (BIM). Generation of complex parametric models needs new scientific knowledge concerning new digital technologies. These elements are helpful to store a vast quantity of information during the life cycle of buildings (LCB). The latest developments of parametric applications do not provide advanced tools, resulting in time-consuming work for the generation of models. This paper presents a method capable of processing and creating complex parametric Building Information Models (BIM) with Non-Uniform Rational Basis Splines (NURBS) with multiple levels of detail (Mixed and Reverse LoD) based on accurate 3D photogrammetric and laser scanning surveys. Complex 3D elements are converted into parametric BIM software and finite element applications (BIM to FEA) using specific exchange formats and new modelling tools. The proposed approach has been applied to different case studies: the BIM of a modern structure for the courtyard of West Block on Parliament Hill in Ottawa (Ontario) and the BIM of Masegra Castel in Sondrio (Italy), encouraging the dissemination and interaction of scientific results without losing information during the generative process.

  14. Systems Engineering Metrics: Organizational Complexity and Product Quality Modeling

    Science.gov (United States)

    Mog, Robert A.

    1997-01-01

    Innovative organizational complexity and product quality models applicable to performance metrics for NASA-MSFC's Systems Analysis and Integration Laboratory (SAIL) missions and objectives are presented. An intensive research effort focuses on the synergistic combination of stochastic process modeling, nodal and spatial decomposition techniques, organizational and computational complexity, systems science and metrics, chaos, and proprietary statistical tools for accelerated risk assessment. This is followed by the development of a preliminary model, which is uniquely applicable and robust for quantitative purposes. Exercise of the preliminary model using a generic system hierarchy and the AXAF-I architectural hierarchy is provided. The Kendall test for positive dependence provides an initial verification and validation of the model. Finally, the research and development of the innovation is revisited, prior to peer review. This research and development effort results in near-term, measurable SAIL organizational and product quality methodologies, enhanced organizational risk assessment and evolutionary modeling results, and improved statistical quantification of SAIL productivity interests.

  15. Complex-plane strategy for computing rotating polytropic models - efficiency and accuracy of the complex first-order perturbation theory

    International Nuclear Information System (INIS)

    Geroyannis, V.S.

    1988-01-01

    In this paper, a numerical method is developed for determining the structure distortion of a polytropic star which rotates either uniformly or differentially. This method carries out the required numerical integrations in the complex plane. The method is implemented to compute indicative quantities, such as the critical perturbation parameter which represents an upper limit in the rotational behavior of the star. From such indicative results, it is inferred that this method achieves impressive improvement over other relevant methods; most importantly, it is comparable to some of the most elaborate and accurate techniques on the subject. It is also shown that the use of this method with Chandrasekhar's first-order perturbation theory yields an immediate drastic improvement of the results. Thus, there is no need, for most applications concerning rotating polytropic models, to proceed to the further use of the method with higher-order techniques, unless the maximum accuracy of the method is required. 31 references

  16. Anatomical masking of pressure footprints based on the Oxford Foot Model: validation and clinical relevance.

    Science.gov (United States)

    Giacomozzi, Claudia; Stebbins, Julie A

    2017-03-01

    Plantar pressure analysis is widely used in the assessment of foot function. In order to assess regional loading, a mask is applied to the footprint to sub-divide it into regions of interest (ROIs). The most common masking method is based on geometric features of the footprint (GM). Footprint masking based on anatomical landmarks of the foot has been implemented more recently, and involves the integration of a 3D motion capture system, plantar pressure measurement device, and a multi-segment foot model. However, thorough validation of anatomical masking (AM) using pathological footprints has not yet been presented. In the present study, an AM method based on the Oxford Foot Model (OFM) was compared to an equivalent GM. Pressure footprints from 20 young healthy subjects (HG) and 20 patients with clubfoot (CF) were anatomically divided into 5 ROIs using a subset of the OFM markers. The same foot regions were also identified by using a standard GM method. Comparisons of intra-subject coefficient of variation (CV) showed that the OFM-based AM was at least as reliable as the GM for all investigated pressure parameters in all foot regions. Clinical relevance of AM was investigated by comparing footprints from HG and CF groups. Contact time, maximum force, force-time integral and contact area proved to be sensitive parameters that were able to distinguish HG and CF groups using both AM and GM methods. However, the AM method revealed statistically significant differences between groups in 75% of measured variables, compared to 62% using a standard GM method, indicating that the AM method is more sensitive for revealing differences between groups. Copyright © 2017 Elsevier B.V. All rights reserved.
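
    Whichever masking method produces the regions, the downstream computation is the same: overlay an integer ROI mask on the pressure grid and aggregate each region. A minimal sketch, with a hypothetical 2x3 footprint and ROI labels (not data from the study):

```python
def roi_peak_pressure(pressure, mask):
    """Peak pressure per region of interest; 'mask' assigns each grid
    cell an integer ROI label (0 = outside the footprint)."""
    peaks = {}
    for p_row, m_row in zip(pressure, mask):
        for p, roi in zip(p_row, m_row):
            if roi:
                peaks[roi] = max(peaks.get(roi, 0.0), p)
    return peaks

pressure = [[120.0,  80.0,  0.0],
            [200.0, 150.0, 90.0]]   # kPa, hypothetical footprint grid
mask     = [[1, 1, 0],
            [2, 2, 3]]              # ROI labels produced by AM or GM masking
peaks = roi_peak_pressure(pressure, mask)
```

    The AM-versus-GM comparison in the study amounts to feeding the same pressure grid through two different mask grids and comparing the resulting regional parameters.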

  17. New clinically relevant, orthotopic mouse models of human chondrosarcoma with spontaneous metastasis

    Directory of Open Access Journals (Sweden)

    Dass Crispin R

    2010-06-01

    Full Text Available Abstract Background Chondrosarcoma responds poorly to adjuvant therapy and new, clinically relevant animal models are required to test targeted therapy. Methods Two human chondrosarcoma cell lines, JJ012 and FS090, were evaluated for proliferation, colony formation, invasion, angiogenesis and osteoclastogenesis. Cell lines were also investigated for VEGF, MMP-2, MMP-9, and RECK expression. JJ012 and FS090 were injected separately into the mouse tibia intramedullary canal or tibial periosteum. Animal limbs were measured, and x-rayed for evidence of tumour take and progression. Tibias and lungs were harvested to determine the presence of tumour and lung metastases. Results JJ012 demonstrated significantly higher proliferative capacity, invasion, and colony formation in collagen I gel. JJ012 conditioned medium stimulated endothelial tube formation and osteoclastogenesis with a greater potency than FS090 conditioned medium, perhaps related to the effects of VEGF and MMP-9. In vivo, tumours formed in intratibial and periosteal groups injected with JJ012; however, no mice injected with FS090 developed tumours. JJ012 periosteal tumours grew to 3 times the non-injected limb size by 7 weeks, whereas intratibial injected limbs required 10 weeks to achieve a similar tumour size. Sectioned tumour tissue demonstrated features of grade III chondrosarcoma. All JJ012 periosteal tumours (5/5) resulted in lung micro-metastases, while only 2/4 JJ012 intratibial tumours demonstrated metastases. Conclusions The established JJ012 models replicate the site, morphology, and many behavioural characteristics of human chondrosarcoma. Local tumour invasion of bone and spontaneous lung metastasis offer valuable assessment tools to test the potential of novel agents for future chondrosarcoma therapy.

  18. Relevance of aerodynamic modelling for load reduction control strategies of two-bladed wind turbines

    Science.gov (United States)

    Luhmann, B.; Cheng, P. W.

    2014-06-01

    A new load reduction concept is being developed for the two-bladed prototype of the Skywind 3.5MW wind turbine. Due to transport and installation advantages both offshore and in complex terrain, two-bladed turbine designs are potentially more cost-effective than comparable three-bladed configurations. A disadvantage of two-bladed wind turbines is the increased fatigue loading, which is a result of asymmetrically distributed rotor forces. The innovative load reduction concept of the Skywind prototype consists of a combination of cyclic pitch control and tumbling rotor kinematics to mitigate periodic structural loading. Aerodynamic design tools must be able to correctly model the advanced dynamics of the rotor. In this paper the impact of the aerodynamic modelling approach is investigated for critical operational modes of a two-bladed wind turbine. Using a lifting line free wake vortex code (FVM), the physical limitations of the classical blade element momentum theory (BEM) can be evaluated. During regular operation, vertical shear and yawed inflow are the main contributors to periodic blade load asymmetry. It is shown that the near wake interaction of the blades under such conditions is not fully captured by the correction models of the BEM approach. The differing prediction of local induction causes a high fatigue load uncertainty, especially for two-bladed turbines. The implementation of both cyclic pitch control and a tumbling rotor can mitigate the fatigue loading by increasing the aerodynamic and structural damping. The influence of the time and space variant vorticity distribution in the near wake is evaluated in detail for different cyclic pitch control functions and tumble dynamics respectively. It is demonstrated that dynamic inflow as well as wake blade interaction have a significant impact on the calculated blade forces and need to be accounted for by the aerodynamic modelling approach. Aeroelastic simulations are carried out using the high fidelity multi body

  19. Complex groundwater flow systems as traveling agent models

    Directory of Open Access Journals (Sweden)

    Oliver López Corona

    2014-10-01

    Full Text Available Analyzing field data from pumping tests, we show that, as with many other natural phenomena, groundwater flow exhibits complex dynamics described by a 1/f power spectrum. This result is theoretically studied within an agent perspective. Using a traveling agent model, we prove that this statistical behavior emerges when the medium is complex. Some heuristic reasoning is provided to justify both spatial and dynamic complexity, as the result of the superposition of an infinite number of stochastic processes. Moreover, we show that this implies that non-Kolmogorovian probability is needed for its study, and provide a set of new partial differential equations for groundwater flow.
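
    A 1/f spectrum is usually diagnosed by fitting the slope of log power against log frequency; a slope near -1 indicates pink-noise dynamics of the kind reported for the pumping-test data. The sketch below checks such an estimator on a synthetic signal whose Fourier amplitudes fall as 1/sqrt(f); the signal and all parameters are illustrative, not field data.

```python
import cmath, math, random

def spectral_slope(x):
    """Least-squares slope of log(periodogram power) vs log(frequency);
    roughly -1 for 1/f ('pink') dynamics."""
    n = len(x)
    logf, logp = [], []
    for k in range(1, n // 2):
        c = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
        logf.append(math.log(k))
        logp.append(math.log(abs(c) ** 2))
    mf, mp = sum(logf) / len(logf), sum(logp) / len(logp)
    return (sum((a - mf) * (b - mp) for a, b in zip(logf, logp))
            / sum((a - mf) ** 2 for a in logf))

# synthetic pink-like series: amplitude 1/sqrt(k) at harmonic k, random phase
random.seed(42)
n = 256
phases = [random.uniform(0.0, 2.0 * math.pi) for _ in range(1, n // 2)]
series = [sum(math.cos(2.0 * math.pi * k * t / n + phases[k - 1]) / math.sqrt(k)
              for k in range(1, n // 2))
          for t in range(n)]
slope = spectral_slope(series)
```

    Because the synthetic harmonics carry power proportional to 1/k by construction, the fitted slope comes out very close to -1, which is the signature the abstract attributes to the field data.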

  20. Synchronization Experiments With A Global Coupled Model of Intermediate Complexity

    Science.gov (United States)

    Selten, Frank; Hiemstra, Paul; Shen, Mao-Lin

    2013-04-01

    In the super modeling approach, an ensemble of imperfect models is connected through nudging terms that nudge the solution of each model toward the solutions of all other models in the ensemble. The goal is to obtain a synchronized state, through a proper choice of connection strengths, that closely tracks the trajectory of the true system. For the super modeling approach to be successful, the connections should be dense and strong enough for synchronization to occur. In this study we analyze the behavior of an ensemble of connected global atmosphere-ocean models of intermediate complexity. All atmosphere models are connected to the same ocean model through the surface fluxes of heat, water and momentum, and the ocean is integrated using weighted-average surface fluxes. In particular, we analyze the degree of synchronization between the atmosphere models and the characteristics of the ensemble mean solution. The results are interpreted using a low-order atmosphere-ocean toy model.
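
    The nudging construction can be illustrated with two toy "imperfect" Lorenz-63 models (different rho values standing in for model error): with the connection terms switched on, their trajectories stay far closer together than when unconnected. The parameter values and the simple Euler integration are illustrative choices, not the study's configuration.

```python
def lorenz_step(state, sigma, rho, beta, dt=0.005):
    # one explicit-Euler step of the Lorenz-63 equations
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

def mean_mismatch(nudge, steps=4000):
    """Integrate two imperfect models, nudging each toward the other,
    and return their mean state mismatch over the second half."""
    a, b = (1.0, 1.0, 1.0), (1.1, 0.9, 1.2)
    total = 0.0
    for i in range(steps):
        a_next = lorenz_step(a, 10.0, 28.0, 8.0 / 3.0)   # imperfect model 1
        b_next = lorenz_step(b, 10.0, 32.0, 8.0 / 3.0)   # imperfect model 2
        # nudging terms pull each model toward the other (supermodeling)
        a = tuple(u + nudge * (v - u) for u, v in zip(a_next, b_next))
        b = tuple(u + nudge * (v - u) for u, v in zip(b_next, a_next))
        if i >= steps // 2:
            total += sum(abs(u - v) for u, v in zip(a, b))
    return total / (steps - steps // 2)
```

    Comparing `mean_mismatch(0.0)` with `mean_mismatch(0.2)` shows the unconnected models decorrelating chaotically while the nudged pair remains nearly synchronized, the behavior the connection strengths must achieve in the full atmosphere-ocean ensemble.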

  1. The effects of model complexity and calibration period on groundwater recharge simulations

    Science.gov (United States)

    Moeck, Christian; Van Freyberg, Jana; Schirmer, Mario

    2017-04-01

    A significant number of groundwater recharge models exist that vary in terms of complexity (i.e., structure and parametrization). Typically, model selection and conceptualization is very subjective and can be a key source of uncertainty in the recharge simulations. Another source of uncertainty is the implicit assumption that model parameters, calibrated over historical periods, are also valid for the simulation period. To the best of our knowledge, there is no systematic evaluation of the effect of model complexity and calibration strategy on the performance of recharge models. To address this gap, we utilized a long-term recharge data set (20 years) from a large weighing lysimeter. We performed a differential split-sample test with four groundwater recharge models that vary in terms of complexity. They were calibrated using six calibration periods with climatically contrasting conditions in a constrained Monte Carlo approach. Despite the climatically contrasting conditions, all models performed similarly well during the calibration. However, during validation a clear effect of the model structure on model performance was evident. The more complex, physically based models predicted recharge best, even when the calibration and prediction periods had very different climatic conditions. In contrast, the simpler soil-water balance and lumped models performed poorly under such conditions. For these models we found a strong dependency on the chosen calibration period. In particular, our analysis showed that this can have relevant implications when using recharge models as decision-making tools in a broad range of applications (e.g., water availability, climate change impact studies, water resource management).
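
    The constrained Monte Carlo calibration step can be sketched with a deliberately simple one-parameter bucket recharge model: sample the parameter, keep the best fit to the observed recharge series. The model structure, forcing series, and "true" capacity below are hypothetical illustrations, not the lysimeter data.

```python
import random

def bucket_recharge(precip, et, capacity):
    """Toy soil-water balance: storage gains precipitation, loses ET,
    and spills to recharge whenever it exceeds 'capacity' (mm)."""
    storage, recharge = 0.0, []
    for p, e in zip(precip, et):
        storage = max(storage + p - e, 0.0)
        spill = max(storage - capacity, 0.0)
        storage -= spill
        recharge.append(spill)
    return recharge

def rmse(sim, obs):
    return (sum((s - o) ** 2 for s, o in zip(sim, obs)) / len(obs)) ** 0.5

def calibrate(precip, et, obs, n_samples=2000, seed=0):
    """Constrained Monte Carlo: sample 'capacity' uniformly within
    plausible bounds and keep the best-fitting value."""
    rng = random.Random(seed)
    samples = [rng.uniform(1.0, 200.0) for _ in range(n_samples)]
    return min(samples, key=lambda c: rmse(bucket_recharge(precip, et, c), obs))

# synthetic calibration period generated with a known capacity of 50 mm
precip = [20.0, 0.0, 30.0, 5.0, 40.0, 0.0] * 10
et = [5.0] * len(precip)
observed = bucket_recharge(precip, et, 50.0)
best = calibrate(precip, et, observed)
```

    A differential split-sample test then repeats this calibration on climatically contrasting sub-periods and checks whether the calibrated parameter still predicts recharge well outside its calibration period, which is where the simpler models in the study broke down.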

  2. Are invertebrates relevant models in ageing research? Focus on the effects of rapamycin on TOR.

    Science.gov (United States)

    Erdogan, Cihan Suleyman; Hansen, Benni Winding; Vang, Ole

    2016-01-01

    Ageing is the organism's increased susceptibility to death, linked to accumulated damage in cells and tissues. It is a complex process regulated by crosstalk among various cellular pathways, and it is strongly regulated by the activity of the Target of Rapamycin (TOR) pathway. TOR is an evolutionarily conserved key protein kinase in this pathway that regulates growth, proliferation and cell metabolism in response to nutrients, growth factors and stress. Comparing the ageing process in invertebrate model organisms, with their relatively short lifespans, with that of mammals yields information about the molecular mechanisms underlying ageing more rapidly than mammalian systems can. Inhibition of TOR pathway activity, via either genetic manipulation or rapamycin, profoundly increases lifespan in most invertebrate model organisms. This contribution reviews recent findings in invertebrates concerning the TOR pathway and the effects of TOR inhibition by rapamycin on lifespan. Despite some contradictory results, the majority point to rapamycin inducing longevity. This suggests that administration of rapamycin in invertebrates is a promising tool for pursuing the scientific puzzle of lifespan prolongation. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  3. Simulation of groundwater flow in the glacial aquifer system of northeastern Wisconsin with variable model complexity

    Science.gov (United States)

    Juckem, Paul F.; Clark, Brian R.; Feinstein, Daniel T.

    2017-05-04

    The U.S. Geological Survey, National Water-Quality Assessment seeks to map the estimated intrinsic susceptibility of the glacial aquifer system of the conterminous United States. Improved understanding of the hydrogeologic characteristics that explain spatial patterns of intrinsic susceptibility, commonly inferred from estimates of groundwater age distributions, is sought so that the methods used for the estimation process are properly equipped. An important step beyond identifying relevant hydrogeologic datasets, such as glacial geology maps, is to evaluate how incorporating these resources into process-based models at differing levels of detail could affect the resulting simulations of groundwater age distributions and, thus, estimates of intrinsic susceptibility. This report describes the construction and calibration of three groundwater-flow models of northeastern Wisconsin, developed with differing levels of complexity to provide a framework for subsequent evaluations of the effects of process-based model complexity on estimates of groundwater age distributions for withdrawal wells and streams. Preliminary assessments, which focused on the effects of model complexity on simulated water levels and base flows in the glacial aquifer system, illustrate that simulating vertical gradients with multiple model layers improves simulated heads more in low-permeability units than in high-permeability units. Moreover, simulating heterogeneous hydraulic conductivity fields in coarse-grained and some fine-grained glacial materials improved simulated water levels in the glacial aquifer system more than simulating uniform hydraulic conductivity within zones. The relation between base flows and model complexity was less clear; however, it generally seemed to follow a pattern similar to that for water levels.
Although increased model complexity resulted in improved calibrations, future application of the models using simulated particle

  4. Translating hydrologically-relevant variables from the ice sheet model SICOPOLIS to the Greenland Analog Project hydrologic modeling domain

    Science.gov (United States)

    Vallot, Dorothée; Applegate, Patrick; Pettersson, Rickard

    2013-04-01

    Projecting future climate and ice sheet development requires sophisticated models and extensive field observations. Given the present state of our knowledge, it is very difficult to say with certainty what will happen. Despite the ongoing increase in atmospheric greenhouse gas concentrations, the possibility that a new ice sheet might form over Scandinavia in the far distant future cannot be excluded. The growth of a new Scandinavian ice sheet would have important consequences for buried nuclear waste repositories. The Greenland Analogue Project (GAP), initiated by the Swedish Nuclear Fuel and Waste Management Company (SKB), is working to assess the effects of a possible future ice sheet on groundwater flow by studying a constrained domain in western Greenland through field measurements (including deep bedrock drilling in front of the ice sheet) combined with numerical modeling. To address the needs of the GAP, we interpolated results from an ensemble of ice sheet model runs onto the smaller and more finely resolved domain used in the project's hydrologic modeling. Three runs were chosen, with fairly different positive degree-day factors, from among those that reproduced the modern ice margin at the borehole position. The interpolated results describe changes in hydrologically-relevant variables over two time periods, 115 ka to 80 ka and 20 ka to 1 ka. In the first of these periods, the ice margin advances over the model domain; in the second, it retreats. The spatially- and temporally-dependent variables that we treated include ice thickness, basal melting rate, surface mass balance, basal temperature, basal thermal regime (frozen or thawed), surface temperature, and basal water pressure. The melt flux is also calculated.

  5. ANS main control complex three-dimensional computer model development

    International Nuclear Information System (INIS)

    Cleaves, J.E.; Fletcher, W.M.

    1993-01-01

    A three-dimensional (3-D) computer model of the Advanced Neutron Source (ANS) main control complex is being developed. The main control complex includes the main control room, the technical support center, the materials irradiation control room, computer equipment rooms, communications equipment rooms, cable-spreading rooms, and some support offices and breakroom facilities. The model will be used to provide facility designers and operations personnel with capabilities for fit-up/interference analysis, visual "walk-throughs" for optimizing maintainability, and human factors and operability analyses. It will be used to determine performance design characteristics, to generate construction drawings, and to integrate control room layout, equipment mounting, grounding equipment, electrical cabling, and utility services into ANS building designs. This paper describes the development of the initial phase of the 3-D computer model for the ANS main control complex and plans for its development and use.

  6. Nostradamus 2014 prediction, modeling and analysis of complex systems

    CERN Document Server

    Suganthan, Ponnuthurai; Chen, Guanrong; Snasel, Vaclav; Abraham, Ajith; Rössler, Otto

    2014-01-01

    The prediction of the behavior of complex systems and the analysis and modeling of their structure are vitally important problems in engineering, economics, and science generally. Examples of such systems can be seen in the world around us (including our own bodies) and in almost every scientific discipline, including such "exotic" domains as the earth's atmosphere, turbulent fluids, economics (exchange rates and stock markets), population growth, physics (control of plasma), information flow in social networks and its dynamics, chemistry, and complex networks. To understand such complex dynamics, which often exhibit strange behavior, and to use them in research or industrial applications, it is paramount to create models of them. For this purpose there exists a rich spectrum of methods, from classical ones such as ARMA models or the Box-Jenkins method to modern ones such as evolutionary computation, neural networks, fuzzy logic, geometry, and deterministic chaos, among others. This proceedings book is a collection of accepted ...

  7. The effects of model and data complexity on predictions from species distributions models

    DEFF Research Database (Denmark)

    García-Callejas, David; Bastos, Miguel

    2016-01-01

    How complex a model needs to be to provide useful predictions is a matter of continuous debate across the environmental sciences. In the species distributions modelling literature, studies have demonstrated that more complex models tend to provide better fits. However, studies have also shown...... that predictive performance does not always increase with complexity. Testing of species distributions models is challenging because independent data for testing are often lacking, but a more general problem is that model complexity has never been formally described in such studies. Here, we systematically...

  8. Modeling geophysical complexity: a case for geometric determinism

    Directory of Open Access Journals (Sweden)

    C. E. Puente

    2007-01-01

    It has been customary in the last few decades to employ stochastic models to represent complex data sets encountered in geophysics, particularly in hydrology. This article reviews a deterministic geometric procedure for data modeling, one that represents whole data sets as derived distributions of simple multifractal measures via fractal functions. It is shown how such a procedure may lead to faithful holistic representations of existing geophysical data sets that, while complementing existing representations via stochastic methods, may also provide a compact language for geophysical complexity. The implications of these ideas, both scientific and philosophical, are stressed.

  9. Parameterization and Sensitivity Analysis of a Complex Simulation Model for Mosquito Population Dynamics, Dengue Transmission, and Their Control

    Science.gov (United States)

    Ellis, Alicia M.; Garcia, Andres J.; Focks, Dana A.; Morrison, Amy C.; Scott, Thomas W.

    2011-01-01

    Models can be useful tools for understanding the dynamics and control of mosquito-borne disease. More detailed models may be more realistic and better suited for understanding local disease dynamics; however, evaluating model suitability, accuracy, and performance becomes increasingly difficult with greater model complexity. Sensitivity analysis is a technique that permits exploration of complex models by evaluating the sensitivity of the model to changes in parameters. Here, we present results of sensitivity analyses of two interrelated complex simulation models of mosquito population dynamics and dengue transmission. We found that dengue transmission may be influenced most by survival in each life stage of the mosquito, mosquito biting behavior, and duration of the infectious period in humans. The importance of these biological processes for vector-borne disease models and the overwhelming lack of knowledge about them make acquisition of relevant field data on these biological processes a top research priority. PMID:21813844
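The conclusion that mosquito survival dominates transmission can be illustrated with a far simpler stand-in than the paper's simulation models: normalized one-at-a-time sensitivities (elasticities) of the classical Ross-Macdonald R0 formula. All parameter values below are illustrative assumptions, not values from the study:

```python
import numpy as np

def r0(m, a, b, c, mu, tau, r):
    """Ross-Macdonald basic reproduction number for a vector-borne disease:
    m mosquitoes per human, biting rate a, transmission probabilities b and c,
    mosquito death rate mu, extrinsic incubation period tau, human recovery rate r."""
    return (m * a ** 2 * b * c * np.exp(-mu * tau)) / (r * mu)

def elasticity(params, name, eps=1e-6):
    """Normalized sensitivity d(ln R0)/d(ln p) by finite differences."""
    base = r0(**params)
    bumped = dict(params)
    bumped[name] *= 1.0 + eps
    return (r0(**bumped) - base) / (base * eps)

# illustrative parameter values (hypothetical)
params = dict(m=10.0, a=0.3, b=0.5, c=0.5, mu=0.15, tau=12.0, r=0.14)
sens = {p: elasticity(params, p) for p in params}
```

Because the mosquito death rate mu enters both the expected infectious lifespan (1/mu) and survival through the incubation period (exp(-mu*tau)), its elasticity magnitude, 1 + mu*tau, exceeds that of every other parameter — echoing the sensitivity ranking reported above.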

  10. Two clinically relevant pressures of carbon dioxide pneumoperitoneum cause hepatic injury in a rabbit model

    OpenAIRE

    Li, Jun; Liu, Ying-Hai; Ye, Zhan-Yong; Liu, He-Nian; Ou, Shan; Tian, Fu-Zhou

    2011-01-01

    AIM: To observe the hepatic injury induced by carbon dioxide pneumoperitoneum (CDP) in rabbits, to compare the effects of low- and high-pressure pneumoperitoneum, and to determine the degree of hepatic injury induced by these two clinically relevant CDP pressures.

  11. Knowledge-based inspection: modelling complex processes with the integrated Safeguards Modelling Method (iSMM)

    International Nuclear Information System (INIS)

    Abazi, F.

    2011-01-01

    The increased level of complexity in almost every discipline and operation today raises the demand for knowledge in order to successfully run an organization, whether to generate profit or to attain a non-profit mission. The traditional way of transferring knowledge to information systems, rich in data structures and complex algorithms, continues to hinder the ability to swiftly turn concepts into operations. Diagrammatic modelling, commonly applied in engineering to represent concepts or reality, remains an excellent way of capturing knowledge from domain experts. The nuclear verification domain is ever more a matter of great importance to world safety and security. Demand for knowledge about nuclear processes and the verification activities used to offset potential misuse of nuclear technology will intensify as the technology grows. This doctoral thesis contributes a model-based approach for representing complex processes such as nuclear inspections, and the work presented carries over to other domains characterized by knowledge-intensive and complex processes. Based on the characteristics of a complex process, a conceptual framework was established as the theoretical basis for creating a number of modelling languages to represent the domain. The integrated Safeguards Modelling Method (iSMM) is formalized through an integrated meta-model. The diagrammatic modelling languages represent the verification domain and relevant nuclear verification aspects. Such a meta-model conceptualizes the relation between practices of process management, knowledge management and domain-specific verification principles; this fusion is considered necessary in order to create quality processes. The study also extends the formalization achieved through the meta-model by contributing a formalization language based on Pattern Theory.
Through the use of graphical and mathematical constructs of the theory, process structures are formalized enhancing

  12. Modelling of interplanetary pickup ion fluxes and relevance for LISM parameters

    International Nuclear Information System (INIS)

    Fahr, H.J.; Rucinski, D.

    1989-01-01

    It has been known for many years that neutral interstellar atoms enter the solar system from the upwind side and penetrate deep into the inner heliosphere. Helium atoms, in particular, advance to very small solar distances before they are ionized and then convected outwards with the solar wind as He+ pickup ions. Since these ions were recently detected in space, we concentrate here on calculations of He+ production rates and He+ fluxes. It is shown that inside 1 a.u., He+ production is essentially determined both by solar e.u.v. photoionization and by electron impact ionization. We calculate He+ production rates as a function of space coordinates, taking into account the core-halo structure of the energy distribution of solar wind electrons and their temperature variation with distance according to relevant solar wind models. For this purpose, a newly developed program to compute He densities was used. In contrast to the production of H+, the He+ production rates are found to be higher on the downwind axis than on the upwind axis by a factor of 5. We also determine partial and total He+ ion fluxes as a function of solar distance and longitude. It is interesting to note that only the values for total fluxes agree well with the integrated He+ fluxes measured by the SULEICA experiment aboard the AMPTE satellite. This indicates that pickup ions, under the influence of the intrinsic MHD wave turbulence in the solar wind, change their primary seed distribution function by rapid pitch-angle scattering and subsequent adiabatic cooling. To interpret the He+ intensity profile along the orbit of the Earth in terms of LISM helium parameters, electron impact ionization must be taken into account carefully in order to prevent misinterpretations. (author)

  13. Macro-economic model of aggregate market in the Albanian economy, and relevant problems thereto

    Directory of Open Access Journals (Sweden)

    Dr.Sc. Alqi Naqellari

    2011-12-01

    This paper uses concrete data on the Albanian economy to analyse the positions of the aggregate demand and aggregate supply curves. As examples from microeconomics, we take the models of Walras and Marshall to examine the possibilities of achieving economic equilibrium. Data from the Albanian economy, and from global economic trends generally, show that the curves' positions differ in their inclination, while the classic aggregate demand curve with a negative slope, as studied in macroeconomic theory, is unique. Our objective is therefore to show scholars in the field that macroeconomic problems must be viewed in this light, and not through the static scheme used so far. Equilibrium occurs not only when the aggregate demand and aggregate supply curves intersect, i.e. when aggregate expenditure equals aggregate production; it exists at every moment, whether or not it is consistent, while prices continue to trend upwards along with the two other aggregates. Understanding this situation should enable governments and other policy-making institutions to review their positions on monetary and fiscal indicators, with a view to making the organic connection between them and increasing their effectiveness. The paper aims to show how one can define the relation between monetary and fiscal policies needed to understand their role and relevance in a country's economic growth.

  14. Predictive modelling of complex agronomic and biological systems.

    Science.gov (United States)

    Keurentjes, Joost J B; Molenaar, Jaap; Zwaan, Bas J

    2013-09-01

    Biological systems are tremendously complex in their functioning and regulation. Studying the multifaceted behaviour and describing the performance of such complexity has challenged the scientific community for years. The reduction of real-world intricacy into simple descriptive models has therefore convinced many researchers of the usefulness of introducing mathematics into the biological sciences. Predictive modelling takes such an approach a step further in that it exploits existing knowledge to project the performance of a system under alternative scenarios. The ever-growing amounts of data generated by assessing biological systems at increasingly high detail provide unique opportunities for future modelling and experiment design. Here we aim to provide an overview of the progress made in modelling over time and the currently prevalent approaches for iterative modelling cycles in modern biology. We further argue for the importance of versatility in modelling approaches, including parameter estimation, model reduction and network reconstruction. Finally, we discuss the difficulties in overcoming the mathematical interpretation of in vivo complexity and address some of the future challenges lying ahead. © 2013 John Wiley & Sons Ltd.

  15. The relevance of the IFPE Database to the modelling of WWER-type fuel behaviour

    International Nuclear Information System (INIS)

    Killeen, J.; Sartori, E.

    2006-01-01

    The aim of the International Fuel Performance Experimental Database (IFPE Database) is to provide, in the public domain, a comprehensive and well-qualified database on zircaloy-clad UO2 fuel for model development and code validation. The data encompass both normal and off-normal operation and include prototypic commercial irradiations as well as experiments performed in material testing reactors. To date, the Database contains over 800 individual cases, providing data on fuel centreline temperatures, dimensional changes and fission gas release (FGR), either from in-pile pressure measurements or from PIE techniques, including puncturing, Electron Probe Micro Analysis (EPMA) and X-ray Fluorescence (XRF) measurements. The work of assembling and disseminating the Database is carried out in close co-operation and co-ordination between the OECD/NEA and the IAEA. The majority of data sets are dedicated to fuel behaviour under LWR irradiation, and every effort has been made to obtain data representative of BWR, PWR and WWER conditions. In each case, the data set contains information on the pre-characterisation of the fuel, cladding and fuel rod geometry, the irradiation history presented in as much detail as the source documents allow, and finally any in-pile or PIE measurements that were made. The purpose of this paper is to highlight data that are relevant specifically to WWER applications. To this end, the NEA and IAEA have been successful in obtaining appropriate data for both WWER-440 and WWER-1000-type reactors. These are: 1) twelve rods from the Finnish-Russian co-operative SOFIT programme; 2) the Kola-3 WWER-440 irradiation; 3) MIR ramp tests on Kola-3 rods; 4) the Zaporozskaya WWER-1000 irradiation; 5) the Novovoronezh WWER-1000 irradiation. Before reviewing these data sets and their usefulness, the paper touches briefly on recent, more novel additions to the Database and on progress made in the use of the Database for the current IAEA FUMEX II Project. Finally, the paper describes the Computer

  16. Assessing Complexity in Learning Outcomes--A Comparison between the SOLO Taxonomy and the Model of Hierarchical Complexity

    Science.gov (United States)

    Stålne, Kristian; Kjellström, Sofia; Utriainen, Jukka

    2016-01-01

    An important aspect of higher education is to educate students who can manage complex relationships and solve complex problems. Teachers need to be able to evaluate course content with regard to complexity, as well as evaluate students' ability to assimilate complex content and express it in the form of a learning outcome. One model for evaluating…

  17. Experimental and theoretical data on ion-molecule-reactions relevant for plasma modelling

    International Nuclear Information System (INIS)

    Hansel, A.; Praxmarer, C.; Lindinger, W.

    1995-01-01

    Despite the fact that the rate coefficients of hundreds of ion-molecule reactions have been published in the literature, much more data are required for the purpose of plasma modelling. Many ion-molecule reactions have rate coefficients, k, as large as the collisional limiting value, k_c, i.e. the rate coefficients k_c at which ion-neutral collision complexes are formed are close to the actual rate coefficients observed. In the case of the interaction of an ion with a non-polar molecule, k_c is determined by the Langevin limiting value k_L, typically 10^-9 cm^3 s^-1. However, when ions react with polar molecules, k_c is predicted by the average dipole orientation (ADO) theory. These classical theories yield accurate rate coefficients at thermal and elevated temperatures for practically all proton transfer reactions as well as for many charge transfer and hydrogen abstraction reactions. The agreement between experimental and calculated values is usually better than ±20%, and in the case of proton transfer reactions the agreement seems to be even better, as recent investigations have shown. Even the interaction of the permanent dipole of the ion with non-polar and polar neutrals can be taken into account to predict reaction rate coefficients, as has been shown very recently in reactions of the highly polar ion ArH3+ with various neutrals

  18. Modelling, Estimation and Control of Networked Complex Systems

    CERN Document Server

    Chiuso, Alessandro; Frasca, Mattia; Rizzo, Alessandro; Schenato, Luca; Zampieri, Sandro

    2009-01-01

    The paradigm of complexity is pervading both science and engineering, leading to the emergence of novel approaches oriented at the development of a systemic view of the phenomena under study; the definition of powerful tools for modelling, estimation, and control; and the cross-fertilization of different disciplines and approaches. This book is devoted to networked systems, one of the most promising paradigms of complexity. It is demonstrated that complex, dynamical networks are powerful tools to model, estimate, and control many interesting phenomena, such as agent coordination, synchronization, social and economic events, networks of critical infrastructures, resource allocation, information processing, and control over communication networks. Moreover, it is shown how recent technological advances in wireless communication and the decreasing cost and size of electronic devices are promoting the appearance of large inexpensive interconnected systems, each with computational, sensing and mobile cap...

  19. Infinite Multiple Membership Relational Modeling for Complex Networks

    DEFF Research Database (Denmark)

    Mørup, Morten; Schmidt, Mikkel Nørgaard; Hansen, Lars Kai

    Learning latent structure in complex networks has become an important problem fueled by many types of networked data originating from practically all fields of science. In this paper, we propose a new non-parametric Bayesian multiple-membership latent feature model for networks. Contrary to existing...... multiple-membership models that scale quadratically in the number of vertices, the proposed model scales linearly in the number of links, admitting multiple-membership analysis in large-scale networks. We demonstrate a connection between the single-membership relational model and multiple-membership models and show...

  20. Modeling data irregularities and structural complexities in data envelopment analysis

    CERN Document Server

    Zhu, Joe

    2007-01-01

    In a relatively short period of time, Data Envelopment Analysis (DEA) has grown into a powerful quantitative, analytical tool for measuring and evaluating performance. It has been successfully applied to a wide variety of problems in many different contexts worldwide. This book deals with the micro aspects of handling and modeling data issues in DEA problems. DEA's use has grown with its capability of dealing with the complex "service industry" and "public service domain" types of problems that require modeling of both qualitative and quantitative data. This handbook treatment deals with specific data problems, including: imprecise or inaccurate data; missing data; qualitative data; outliers; undesirable outputs; quality data; statistical analysis; software; and other data aspects of modeling complex DEA problems. In addition, the book demonstrates how to visualize DEA results when the data are more than 3-dimensional, and how to identify efficient units quickly and accurately.

  1. Modeling the propagation of mobile malware on complex networks

    Science.gov (United States)

    Liu, Wanping; Liu, Chao; Yang, Zheng; Liu, Xiaoyang; Zhang, Yihao; Wei, Zuxue

    2016-08-01

    In this paper, the spreading behavior of malware across mobile devices is addressed. By introducing complex networks, which follow a power-law degree distribution, to model mobile networks, a novel epidemic model for mobile malware propagation is proposed. The spreading threshold that governs the dynamics of the model is calculated. Theoretically, the asymptotic stability of the malware-free equilibrium is confirmed when the threshold is below unity, and global stability is further proved under some sufficient conditions. The influences of the model parameters as well as the network topology on malware propagation are also analyzed. Our theoretical studies and numerical simulations show that networks with higher heterogeneity are more conducive to the diffusion of malware, and that complex networks with lower power-law exponents benefit malware spreading.
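For heterogeneous mean-field (annealed) epidemic models of the SIS type, the spreading threshold takes the classical form <k>/<k^2>; the sketch below assumes that form and a truncated power-law degree distribution (the exact model and threshold expression in the paper may differ):

```python
import numpy as np

def spreading_threshold(exponent, k_min=3, k_max=1000):
    """Heterogeneous mean-field SIS threshold <k>/<k^2> for a truncated
    power-law degree distribution P(k) ~ k^-exponent."""
    k = np.arange(k_min, k_max + 1, dtype=float)
    p = k ** (-exponent)
    p /= p.sum()                 # normalize the degree distribution
    mean_k = np.sum(p * k)
    mean_k2 = np.sum(p * k * k)
    return mean_k / mean_k2

# lower exponents mean more heterogeneous networks and lower thresholds
thresholds = {g: spreading_threshold(g) for g in (2.2, 2.8, 3.5)}
```

Lowering the power-law exponent fattens the degree tail, inflating <k^2> and driving the threshold toward zero — which is why more heterogeneous networks favor malware spreading.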

  2. Development of a novel, physiologically relevant cytotoxicity model: Application to the study of chemotherapeutic damage to mesenchymal stromal cells

    Energy Technology Data Exchange (ETDEWEB)

    May, Jennifer E., E-mail: Jennifer2.May@uwe.ac.uk; Morse, H. Ruth, E-mail: Ruth.Morse@uwe.ac.uk; Xu, Jinsheng, E-mail: Jinsheng.Xu@uwe.ac.uk; Donaldson, Craig, E-mail: Craig.Donaldson@uwe.ac.uk

    2012-09-15

    There is an increasing need for the development of physiologically relevant in-vitro models for testing toxicity; however, determining the toxic effects of agents which undergo extensive hepatic metabolism can be particularly challenging. If a source of such metabolic enzymes is inadequate within a model system, toxicity from prodrugs may be grossly underestimated. Conversely, the vast majority of agents are detoxified by the liver; consequently, toxicity from such agents may be overestimated. In this study we describe the development of a novel in-vitro model, which could be adapted for any toxicology setting. The model utilises HepG2 liver spheroids as a source of metabolic enzymes, which have been shown to more closely resemble human liver than traditional monolayer cultures. A co-culture model has been developed enabling the effect of any metabolised agent on another cell type to be assessed. This has been optimised to enable the study of the damaging effects of chemotherapy on mesenchymal stem cells (MSC), the supportive stem cells of the bone marrow. Several optimisation steps were undertaken, including determining optimal culture conditions, confirming hepatic P450 enzyme activity and ensuring that physiologically relevant doses of chemotherapeutic agents were appropriate for use within the model. The developed model was subsequently validated using several chemotherapeutic agents, both prodrugs and active drugs, with the resulting MSC damage closely resembling effects seen in patients following chemotherapy. Minimal modifications would enable this novel co-culture model to be utilised as a general toxicity model, contributing to the drive to reduce animal safety testing and enabling physiologically relevant in-vitro study. -- Highlights: ► An in vitro model was developed for study of drugs requiring hepatic metabolism ► HepG2 spheroids were utilised as a physiologically relevant source of liver enzymes ► The model was optimised to enable study of chemotherapeutic

  3. Development of a novel, physiologically relevant cytotoxicity model: Application to the study of chemotherapeutic damage to mesenchymal stromal cells

    International Nuclear Information System (INIS)

    May, Jennifer E.; Morse, H. Ruth; Xu, Jinsheng; Donaldson, Craig

    2012-01-01

    There is an increasing need for the development of physiologically relevant in-vitro models for testing toxicity; however, determining the toxic effects of agents which undergo extensive hepatic metabolism can be particularly challenging. If a source of such metabolic enzymes is inadequate within a model system, toxicity from prodrugs may be grossly underestimated. Conversely, the vast majority of agents are detoxified by the liver; consequently, toxicity from such agents may be overestimated. In this study we describe the development of a novel in-vitro model, which could be adapted for any toxicology setting. The model utilises HepG2 liver spheroids as a source of metabolic enzymes, which have been shown to more closely resemble human liver than traditional monolayer cultures. A co-culture model has been developed enabling the effect of any metabolised agent on another cell type to be assessed. This has been optimised to enable the study of the damaging effects of chemotherapy on mesenchymal stem cells (MSC), the supportive stem cells of the bone marrow. Several optimisation steps were undertaken, including determining optimal culture conditions, confirming hepatic P450 enzyme activity and ensuring that physiologically relevant doses of chemotherapeutic agents were appropriate for use within the model. The developed model was subsequently validated using several chemotherapeutic agents, both prodrugs and active drugs, with the resulting MSC damage closely resembling effects seen in patients following chemotherapy. Minimal modifications would enable this novel co-culture model to be utilised as a general toxicity model, contributing to the drive to reduce animal safety testing and enabling physiologically relevant in-vitro study. -- Highlights: ► An in vitro model was developed for study of drugs requiring hepatic metabolism ► HepG2 spheroids were utilised as a physiologically relevant source of liver enzymes ► The model was optimised to enable study of chemotherapeutic

  4. Uncertainty and validation. Effect of model complexity on uncertainty estimates

    International Nuclear Information System (INIS)

    Elert, M.

    1996-09-01

    In the Model Complexity subgroup of BIOMOVS II, models of varying complexity have been applied to the problem of downward transport of radionuclides in soils. A scenario describing a case of surface contamination of a pasture soil was defined. Three radionuclides with different environmental behavior and radioactive half-lives were considered: Cs-137, Sr-90 and I-129. The intention was to give a detailed specification of the parameters required by the different kinds of model, together with reasonable values for the parameter uncertainty. A total of seven modelling teams participated in the study, using 13 different models. Four of the modelling groups performed uncertainty calculations using nine different modelling approaches. The models used range in complexity from analytical solutions of a 2-box model using annual average data to numerical models coupling hydrology and transport using data varying on a daily basis. The complex models needed to consider all aspects of radionuclide transport in a soil with a variable hydrology are often impractical to use in safety assessments; instead, simpler models, often box models, are preferred. The comparison of predictions made with the complex and the simple models for this scenario shows that the predictions are in many cases very similar, e.g. in the predicted evolution of the root zone concentration. In other cases, however, differences of many orders of magnitude can appear; one example is the predicted flux to the groundwater of radionuclides being transported through the soil column. Some issues have come into focus in this study: there are large differences in the predicted soil hydrology and, as a consequence, also in the radionuclide transport, which suggests that there are large uncertainties in the calculation of effective precipitation and evapotranspiration.
The approach used for modelling the water transport in the root zone has an impact on the predictions of the decline in root
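
As an aside for readers, the 2-box approach that the study contrasts with hydrology-coupled codes can be sketched in a few lines. The transfer rate, half-life and time horizon below are illustrative placeholders, not BIOMOVS II scenario parameters:

```python
import math

def two_box_transport(root, deep, k_down, half_life_y, years):
    """Annual-step 2-box model: root zone -> deeper soil, plus radioactive decay.

    k_down is a first-order downward transfer rate (1/y); all values used here
    are illustrative placeholders, not BIOMOVS II scenario parameters.
    """
    lam = math.log(2) / half_life_y           # decay constant (1/y)
    history = []
    for _ in range(years):
        flux = k_down * root                  # annual transfer out of the root zone
        root = (root - flux) * math.exp(-lam)
        deep = (deep + flux) * math.exp(-lam)
        history.append((root, deep))
    return history

# Unit initial inventory in the root zone, Cs-137-like 30-year half-life.
hist = two_box_transport(1.0, 0.0, k_down=0.05, half_life_y=30.0, years=10)
```

Even this minimal sketch shows why box models are attractive for safety assessment: the entire parameterisation is two rates, at the cost of suppressing the within-year hydrology that drives the large inter-model differences noted above.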

  5. Modelling and simulating in-stent restenosis with complex automata

    NARCIS (Netherlands)

    Hoekstra, A.G.; Lawford, P.; Hose, R.

    2010-01-01

    In-stent restenosis, the maladaptive response of a blood vessel to injury caused by the deployment of a stent, is a multiscale system involving a large number of biological and physical processes. We describe a Complex Automata Model for in-stent restenosis, coupling bulk flow, drug diffusion, and

  6. The Complexity of Developmental Predictions from Dual Process Models

    Science.gov (United States)

    Stanovich, Keith E.; West, Richard F.; Toplak, Maggie E.

    2011-01-01

    Drawing developmental predictions from dual-process theories is more complex than is commonly realized. Overly simplified predictions drawn from such models may lead to premature rejection of the dual process approach as one of many tools for understanding cognitive development. Misleading predictions can be avoided by paying attention to several…

  7. Constructive Lower Bounds on Model Complexity of Shallow Perceptron Networks

    Czech Academy of Sciences Publication Activity Database

    Kůrková, Věra

    2018-01-01

    Roč. 29, č. 7 (2018), s. 305-315 ISSN 0941-0643 R&D Projects: GA ČR GA15-18108S Institutional support: RVO:67985807 Keywords : shallow and deep networks * model complexity and sparsity * signum perceptron networks * finite mappings * variational norms * Hadamard matrices Subject RIV: IN - Informatics, Computer Science Impact factor: 2.505, year: 2016

  8. Complexity effects in choice experiments-based models

    NARCIS (Netherlands)

    Dellaert, B.G.C.; Donkers, B.; van Soest, A.H.O.

    2012-01-01

    Many firms rely on choice experiment–based models to evaluate future marketing actions under various market conditions. This research investigates choice complexity (i.e., number of alternatives, number of attributes, and utility similarity between the most attractive alternatives) and individual

  9. Kolmogorov complexity, pseudorandom generators and statistical models testing

    Czech Academy of Sciences Publication Activity Database

    Šindelář, Jan; Boček, Pavel

    2002-01-01

    Roč. 38, č. 6 (2002), s. 747-759 ISSN 0023-5954 R&D Projects: GA ČR GA102/99/1564 Institutional research plan: CEZ:AV0Z1075907 Keywords : Kolmogorov complexity * pseudorandom generators * statistical models testing Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.341, year: 2002

  10. Framework for Modelling Multiple Input Complex Aggregations for Interactive Installations

    DEFF Research Database (Denmark)

    Padfield, Nicolas; Andreasen, Troels

    2012-01-01

    on fuzzy logic and provides a method for variably balancing interaction and user input with the intention of the artist or director. An experimental design is presented, demonstrating an intuitive interface for parametric modelling of a complex aggregation function. The aggregation function unifies...

  11. Model-based safety architecture framework for complex systems

    NARCIS (Netherlands)

    Schuitemaker, Katja; Rajabali Nejad, Mohammadreza; Braakhuis, J.G.; Podofillini, Luca; Sudret, Bruno; Stojadinovic, Bozidar; Zio, Enrico; Kröger, Wolfgang

    2015-01-01

    The shift to transparency and rising need of the general public for safety, together with the increasing complexity and interdisciplinarity of modern safety-critical Systems of Systems (SoS) have resulted in a Model-Based Safety Architecture Framework (MBSAF) for capturing and sharing architectural

  12. A binary logistic regression model with complex sampling design of ...

    African Journals Online (AJOL)

    2017-09-03

    Bi-variable and multi-variable binary logistic regression models with complex sampling design were fitted. Data were entered into STATA-12 and analyzed using SPSS-21.

  13. On the general procedure for modelling complex ecological systems

    International Nuclear Information System (INIS)

    He Shanyu.

    1987-12-01

    In this paper, the principle of a general procedure for modelling complex ecological systems, i.e. the Adaptive Superposition Procedure (ASP), is briefly stated. The results of applying ASP in a national project for ecological regionalization are also described. (author). 3 refs

  14. The dynamic complexity of a three species food chain model

    International Nuclear Information System (INIS)

    Lv Songjuan; Zhao Min

    2008-01-01

    In this paper, a three-species food chain model is investigated analytically, using theories of ecology, and through numerical simulation. Bifurcation diagrams are obtained for biologically feasible parameters. The results show that the system exhibits rich complexity features such as stable, periodic and chaotic dynamics.
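
For readers unfamiliar with such systems, the classic Hastings-Powell three-species chain illustrates how compact food chain equations generate the stable, periodic and chaotic regimes described. The parameters below are the standard textbook values for that model, not necessarily those analysed by Lv and Zhao:

```python
def food_chain(state, a1=5.0, b1=3.0, a2=0.1, b2=2.0, d1=0.4, d2=0.01):
    """Hastings-Powell (1991) three-species chain with Holling type-II responses.

    Standard textbook parameter values; not necessarily those of Lv and Zhao.
    """
    x, y, z = state                      # prey, predator, top predator
    f1 = a1 * x / (1 + b1 * x)
    f2 = a2 * y / (1 + b2 * y)
    return (x * (1 - x) - f1 * y,        # logistic prey growth minus predation
            f1 * y - f2 * z - d1 * y,    # predator gain, loss to top predator, death
            f2 * z - d2 * z)             # top-predator gain and death

def rk4_step(f, state, h):
    """One classical Runge-Kutta step for an autonomous ODE system."""
    k1 = f(state)
    k2 = f(tuple(s + 0.5 * h * k for s, k in zip(state, k1)))
    k3 = f(tuple(s + 0.5 * h * k for s, k in zip(state, k2)))
    k4 = f(tuple(s + h * k for s, k in zip(state, k3)))
    return tuple(s + h * (a + 2 * b + 2 * c + d) / 6
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

state = (0.8, 0.2, 8.0)
for _ in range(5000):                    # integrate to t = 50 with h = 0.01
    state = rk4_step(food_chain, state, 0.01)
```

Sweeping one parameter (e.g. b1) over repeated integrations like this, and recording the local extrema of z, is exactly how the bifurcation diagrams mentioned in the abstract are produced.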

  15. Modeling Stochastic Complexity in Complex Adaptive Systems: Non-Kolmogorov Probability and the Process Algebra Approach.

    Science.gov (United States)

    Sulis, William H

    2017-10-01

    Walter Freeman III pioneered the application of nonlinear dynamical systems theories and methodologies in his work on mesoscopic brain dynamics. Sadly, mainstream psychology and psychiatry still cling to linear, correlation-based data analysis techniques, which threaten to subvert the process of experimentation and theory building. In order to progress, it is necessary to develop tools capable of managing the stochastic complexity of complex biopsychosocial systems, which includes multilevel feedback relationships, nonlinear interactions, chaotic dynamics and adaptability. In addition, however, these systems exhibit intrinsic randomness, non-Gaussian probability distributions, non-stationarity, contextuality, and non-Kolmogorov probabilities, as well as the absence of mean and/or variance and conditional probabilities. These properties and their implications for statistical analysis are discussed. An alternative approach, the Process Algebra approach, is described. It is a generative model, capable of generating non-Kolmogorov probabilities. It has proven useful in addressing fundamental problems in quantum mechanics and in the modeling of developing psychosocial systems.

  16. BROA: An agent-based model to recommend relevant Learning Objects from Repository Federations adapted to learner profile

    Directory of Open Access Journals (Sweden)

    Paula A. Rodríguez

    2013-03-01

    Learning Objects (LOs) are distinguished from traditional educational resources by their quick and easy availability through Web-based repositories, where they are accessed through their metadata. In addition, a user profile allows an educational recommender system to help learners find the most relevant LOs for their needs and preferences. The aim of this paper is to propose an agent-based model, called BROA, to recommend relevant LOs recovered from repository federations and adapted to the learner profile. The proposed model uses the role and service models of the GAIA methodology, and the analysis models of the MAS-CommonKADS methodology. A prototype was built based on this model and validated, and the resulting assessment results are presented.

  17. Adapting APSIM to model the physiology and genetics of complex adaptive traits in field crops.

    Science.gov (United States)

    Hammer, Graeme L; van Oosterom, Erik; McLean, Greg; Chapman, Scott C; Broad, Ian; Harland, Peter; Muchow, Russell C

    2010-05-01

    Progress in molecular plant breeding is limited by the ability to predict plant phenotype based on its genotype, especially for complex adaptive traits. Suitably constructed crop growth and development models have the potential to bridge this predictability gap. A generic cereal crop growth and development model is outlined here. It is designed to exhibit reliable predictive skill at the crop level while also introducing sufficient physiological rigour for complex phenotypic responses to become emergent properties of the model dynamics. The approach quantifies capture and use of radiation, water, and nitrogen within a framework that predicts the realized growth of major organs based on their potential and whether the supply of carbohydrate and nitrogen can satisfy that potential. The model builds on existing approaches within the APSIM software platform. Experiments on diverse genotypes of sorghum that underpin the development and testing of the adapted crop model are detailed. Genotypes differing in height were found to differ in biomass partitioning among organs and a tall hybrid had significantly increased radiation use efficiency: a novel finding in sorghum. Introducing these genetic effects associated with plant height into the model generated emergent simulated phenotypic differences in green leaf area retention during grain filling via effects associated with nitrogen dynamics. The relevance to plant breeding of this capability in complex trait dissection and simulation is discussed.

  18. Modelling small-angle scattering data from complex protein-lipid systems

    DEFF Research Database (Denmark)

    Kynde, Søren Andreas Røssell

    This thesis consists of two parts. The first part is divided into five chapters. Chapter 1 gives a general introduction to the bio-molecular systems that have been studied. These are membrane proteins and their lipid environments in the form of phospholipid nanodiscs. Membrane proteins...... the techniques very well suited for the study of the nanodisc system. Chapter 3 explains two different modelling approaches that can be used in the analysis of small-angle scattering data from lipid-protein complexes. These are the continuous approach where the system of interest is modelled as a few regular...... combine the benefits of each of the methods and give unique structural information about relevant bio-molecular complexes in solution. Chapter 4 describes the work behind a proposal of a small-angle neutron scattering instrument for the European Spallation Source under construction in Lund. The instrument

  19. Uncertainty and validation. Effect of model complexity on uncertainty estimates

    Energy Technology Data Exchange (ETDEWEB)

    Elert, M. [Kemakta Konsult AB, Stockholm (Sweden)] [ed.]

    1996-09-01

    In the Model Complexity subgroup of BIOMOVS II, models of varying complexity have been applied to the problem of downward transport of radionuclides in soils. A scenario describing a case of surface contamination of a pasture soil was defined. Three radionuclides with different environmental behavior and radioactive half-lives were considered: Cs-137, Sr-90 and I-129. The intention was to give a detailed specification of the parameters required by the different kinds of model, together with reasonable values for the parameter uncertainty. A total of seven modelling teams participated in the study, using 13 different models. Four of the modelling groups performed uncertainty calculations using nine different modelling approaches. The models used range in complexity from analytical solutions of a 2-box model using annual average data to numerical models coupling hydrology and transport using data varying on a daily basis. The complex models needed to consider all aspects of radionuclide transport in a soil with a variable hydrology are often impractical to use in safety assessments; instead, simpler models, often box models, are preferred. The comparison of predictions made with the complex and the simple models for this scenario shows that the predictions are in many cases very similar, e.g. in the predicted evolution of the root zone concentration. In other cases, however, differences of many orders of magnitude can appear; one example is the predicted flux to the groundwater of radionuclides being transported through the soil column. Some issues have come into focus in this study: there are large differences in the predicted soil hydrology and, as a consequence, also in the radionuclide transport, which suggests that there are large uncertainties in the calculation of effective precipitation and evapotranspiration.
The approach used for modelling the water transport in the root zone has an impact on the predictions of the decline in root

  20. How fear-relevant illusory correlations might develop and persist in anxiety disorders: A model of contributing factors.

    Science.gov (United States)

    Wiemer, Julian; Pauli, Paul

    2016-12-01

    Fear-relevant illusory correlations (ICs) are defined as the overestimation of the relationship between a fear-relevant stimulus and aversive consequences. ICs reflect biased cognitions affecting the learning and unlearning of fear in anxiety disorders, and a deeper understanding might help to improve treatment. A model for the maintenance of ICs is proposed that highlights the importance of the amplified aversiveness and salience of fear-relevant outcomes, impaired executive contingency monitoring and an availability heuristic. The model explains why ICs are enhanced in highly fearful individuals and suggests implications that might be applied to augment the effectiveness of cognitive behavior therapy, such as emotion regulation and the direction of attention to non-aversive experiences. Finally, we suggest possible future research directions and an alternative measure of ICs. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Rethinking the Psychogenic Model of Complex Regional Pain Syndrome: Somatoform Disorders and Complex Regional Pain Syndrome

    Science.gov (United States)

    Hill, Renee J.; Chopra, Pradeep; Richardi, Toni

    2012-01-01

    Explaining the etiology of Complex Regional Pain Syndrome (CRPS) from the psychogenic model is exceedingly unsophisticated, because neurocognitive deficits, neuroanatomical abnormalities, and distortions in cognitive mapping are features of CRPS pathology. More importantly, many people who have developed CRPS have no history of mental illness. The psychogenic model offers comfort to physicians and mental health practitioners (MHPs) who have difficulty understanding pain maintained by newly uncovered neuroinflammatory processes. With increased education about CRPS through a biopsychosocial perspective, both physicians and MHPs can better diagnose, treat, and manage CRPS symptomatology. PMID:24223338

  2. Functional Relevance of Different Basal Ganglia Pathways Investigated in a Spiking Model with Reward Dependent Plasticity

    Directory of Open Access Journals (Sweden)

    Pierre Berthet

    2016-07-01

    The brain enables animals to behaviourally adapt in order to survive in a complex and dynamic environment, but how reward-oriented behaviours are achieved and computed by its underlying neural circuitry is an open question. To address this concern, we have developed a spiking model of the basal ganglia (BG) that learns to dis-inhibit the action leading to a reward despite ongoing changes in the reward schedule. The architecture of the network features the two pathways commonly described in the BG, the direct (denoted D1) and the indirect (denoted D2) pathway, as well as a loop involving the striatum and the dopaminergic system. The activity of these dopaminergic neurons conveys the reward prediction error (RPE), which determines the magnitude of synaptic plasticity within the different pathways. All plastic connections implement a versatile four-factor learning rule derived from Bayesian inference that depends upon pre- and postsynaptic activity, receptor type and dopamine level. Synaptic weight updates occur in the D1 or D2 pathways depending on the sign of the RPE, and an efference copy informs upstream nuclei about the action selected. We demonstrate successful performance of the system in a multiple-choice learning task with a transiently changing reward schedule. We simulate lesioning of the various pathways and show that a condition without the D2 pathway fares worse than one without D1. Additionally, we simulate the degeneration observed in Parkinson's disease (PD) by decreasing the number of dopaminergic neurons during learning. The results suggest that the D1 pathway impairment in PD might have been overlooked. Furthermore, an analysis of the alterations in the synaptic weights shows that using the absolute reward value instead of the RPE leads to a larger change in D1.
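
The sign-dependent routing of plasticity between pathways can be sketched with a toy update rule. This is illustrative only: the paper uses a Bayesian four-factor rule in a spiking network, not the simplified Hebbian scheme and learning rate assumed here:

```python
def update_weights(w_d1, w_d2, pre, post, rpe, lr=0.05):
    """Toy dopamine-gated Hebbian update (illustrative only; the paper derives
    a Bayesian four-factor rule, not this simplified scheme).

    A positive reward prediction error (RPE) potentiates the direct (D1)
    pathway; a negative RPE potentiates the indirect (D2) pathway.
    """
    hebb = pre * post                    # coincidence of pre/post activity
    if rpe > 0:
        w_d1 += lr * rpe * hebb
    elif rpe < 0:
        w_d2 += lr * (-rpe) * hebb
    return w_d1, w_d2

w1 = w2 = 0.5
w1, w2 = update_weights(w1, w2, pre=1.0, post=1.0, rpe=0.8)    # rewarded action
w1, w2 = update_weights(w1, w2, pre=1.0, post=1.0, rpe=-0.3)   # omitted reward
```

The key design point the abstract describes is visible even here: rewarded outcomes strengthen action promotion (D1) while disappointing outcomes strengthen action suppression (D2), so learning from reward and from punishment is carried by different synapses.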

  3. Experimental and modelling studies of iodine oxide formation and aerosol behaviour relevant to nuclear reactor accidents

    International Nuclear Information System (INIS)

    Dickinson, S.; Auvinen, A.; Ammar, Y.; Bosland, L.; Clément, B.; Funke, F.; Glowa, G.; Kärkelä, T.; Powers, D.A.; Tietze, S.; Weber, G.; Zhang, S.

    2014-01-01

    Highlights: • Radiolytic reactions can influence iodine volatility following a nuclear accident. • Kinetic models have been developed based on atmospheric chemistry studies. • Properties of iodine oxide aerosols produced by radiation have been measured. • Decomposition of iodine oxides by the action of heat or radiation has been observed. - Abstract: Plant assessments have shown that iodine contributes significantly to the source term for a range of accident scenarios. Iodine has a complex chemistry that determines its chemical form and, consequently, its volatility in the containment. If volatile iodine species are formed by reactions in the containment, they will be subject to radiolytic reactions in the atmosphere, resulting in the conversion of the gaseous species into involatile iodine oxides, which may deposit on surfaces or re-dissolve in water pools. The concentration of airborne iodine in the containment will, therefore, be determined by the balance between the reactions contributing to the formation and destruction of volatile species, as well as by the physico-chemical properties of the iodine oxide aerosols which will influence their longevity in the atmosphere. This paper summarises the work that has been done in the framework of the EC SARNET (Severe Accident Research Network) to develop a greater understanding of the reactions of gaseous iodine species in irradiated air/steam atmospheres, and the nature and behaviour of the reaction products. This work has mainly been focussed on investigating the nature and behaviour of iodine oxide aerosols, but earlier work by members of the SARNET group on gaseous reaction rates is also discussed to place the more recent work into context

  4. ASYMMETRIC PRICE TRANSMISSION MODELING: THE IMPORTANCE OF MODEL COMPLEXITY AND THE PERFORMANCE OF THE SELECTION CRITERIA

    Directory of Open Access Journals (Sweden)

    Henry de-Graft Acquah

    2013-01-01

    Information criteria provide an attractive basis for selecting the best model from a set of competing asymmetric price transmission models or theories. However, little is understood about the sensitivity of these model selection methods to model complexity. This study therefore fits competing asymmetric price transmission models that differ in complexity to simulated data and evaluates the ability of the model selection methods to recover the true model. The results of Monte Carlo experimentation suggest that, in general, BIC, CAIC and DIC were superior to AIC when the true data-generating process was the standard error correction model, whereas AIC was more successful when the true model was the complex error correction model. It is also shown that the model selection methods performed better in large samples for a complex asymmetric data-generating process than for a standard asymmetric data-generating process. Except for complex models, AIC's performance did not make substantial gains in recovery rates as sample size increased. The research findings demonstrate the influence of model complexity on asymmetric price transmission model comparison and selection.
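
The recovery experiment the abstract describes follows a standard pattern: simulate from a known process, fit candidates of different complexity, and check which criterion picks the truth. A minimal sketch with two nested linear candidates (not the error correction models of the study; the data-generating values are invented):

```python
import math
import random

def aic_bic(rss, n, k):
    """Gaussian AIC/BIC up to an additive constant: n*ln(RSS/n) + penalty."""
    base = n * math.log(rss / n)
    return base + 2 * k, base + k * math.log(n)

random.seed(1)
n = 200
x = [random.uniform(0, 10) for _ in range(n)]
y = [2.0 + 0.5 * xi + random.gauss(0, 1.0) for xi in x]  # true process has a slope

# Candidate A: intercept only (1 parameter).
ybar = sum(y) / n
rss_a = sum((yi - ybar) ** 2 for yi in y)

# Candidate B: intercept + slope, ordinary least squares (2 parameters).
xbar = sum(x) / n
slope = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
         / sum((xi - xbar) ** 2 for xi in x))
intercept = ybar - slope * xbar
rss_b = sum((yi - (intercept + slope * xi)) ** 2 for xi, yi in zip(x, y))

aic_a, bic_a = aic_bic(rss_a, n, 1)
aic_b, bic_b = aic_bic(rss_b, n, 2)
# Lower is better: both criteria should prefer candidate B here.
```

Repeating this over many simulated data sets and tallying how often each criterion selects the true candidate gives the recovery rates that the Monte Carlo comparison above is built on; BIC's heavier ln(n) penalty is why it favours the simpler candidate more often than AIC does.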

  5. Higher genus correlators for the complex matrix model

    International Nuclear Information System (INIS)

    Ambjorn, J.; Kristhansen, C.F.; Makeenko, Y.M.

    1992-01-01

    In this paper, the authors describe an iterative scheme which allows us to calculate any multi-loop correlator for the complex matrix model to any genus using only the first in the chain of loop equations. The method works for a completely general potential and the results contain no explicit reference to the couplings. The genus g contribution to the m-loop correlator depends on a finite number of parameters, namely at most 4g - 2 + m. The authors find the generating functional explicitly up to genus three. The authors show as well that the model is equivalent to an external field problem for the complex matrix model with a logarithmic potential

  6. Reduced Complexity Volterra Models for Nonlinear System Identification

    Directory of Open Access Journals (Sweden)

    Hacıoğlu Rıfat

    2001-01-01

    A broad class of nonlinear systems and filters can be modeled by the Volterra series representation. However, its practical use in nonlinear system identification is sometimes limited due to the large number of parameters associated with the Volterra filter's structure. The parametric complexity also complicates design procedures based upon such a model. This limitation for system identification is addressed in this paper using a Fixed Pole Expansion Technique (FPET) within the Volterra model structure. The FPET approach employs orthonormal basis functions derived from fixed (real or complex) pole locations to expand the Volterra kernels and reduce the number of estimated parameters. That FPET can considerably reduce the number of estimated parameters is demonstrated by a digital satellite channel example in which we use the proposed method to identify the channel dynamics. Furthermore, a gradient-descent procedure that adaptively selects the pole locations in the FPET structure is developed in the paper.
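
The parameter reduction is easy to see in a sketch. Below, the kernels of a second-order Volterra model are expanded on impulse responses of first-order filters with fixed real poles; this is a simplified stand-in for the paper's orthonormal basis, with invented pole locations and coefficients:

```python
def fixed_pole_basis(poles, memory):
    """Impulse responses of first-order filters with fixed real poles |a| < 1.

    A simplified stand-in for the orthonormal basis functions of the FPET
    approach (illustrative, not the paper's exact construction).
    """
    return [[(1 - a) * a ** n for n in range(memory)] for a in poles]

def volterra_fpet_output(x, poles, c1, c2, memory=32):
    """Second-order Volterra model with kernels expanded on the pole basis:
    y[n] = sum_p c1[p]*s_p[n] + sum_{p<=q} c2[p][q]*s_p[n]*s_q[n],
    where s_p is the input filtered by basis function p.
    """
    basis = fixed_pole_basis(poles, memory)
    s = [[sum(b[k] * x[n - k] for k in range(min(memory, n + 1)))
          for n in range(len(x))] for b in basis]
    P = len(poles)
    y = []
    for n in range(len(x)):
        lin = sum(c1[p] * s[p][n] for p in range(P))
        quad = sum(c2[p][q] * s[p][n] * s[q][n]
                   for p in range(P) for q in range(p, P))
        y.append(lin + quad)
    return y

# 3 basis functions need 3 + 6 = 9 coefficients, versus 32 + 528 for a raw
# memory-32 second-order Volterra filter.
poles = [0.3, 0.6, 0.9]
c1 = [1.0, -0.5, 0.2]
c2 = [[0.1, 0.0, 0.0], [0.0, 0.05, 0.0], [0.0, 0.0, 0.02]]
x = [1.0] + [0.0] * 63                  # impulse input
y = volterra_fpet_output(x, poles, c1, c2)
```

Only the c1/c2 coefficients are estimated once the poles are fixed, which is precisely the complexity reduction the abstract claims; the paper's gradient-descent procedure then tunes the pole locations themselves.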

  7. Nonlinear model of epidemic spreading in a complex social network.

    Science.gov (United States)

    Kosiński, Robert A; Grabowski, A

    2007-10-01

    The epidemic spreading in a human society is a complex process, which can be described on the basis of a nonlinear mathematical model. In such an approach the complex and hierarchical structure of the social network (which has implications for the spreading of pathogens and can be treated as a complex network) can be taken into account. In our model each individual has one of five permitted states: susceptible, infected, infective, unsusceptible, or dead. This refers to the SEIR model used in epidemiology. The state of an individual changes in time, depending on the previous state and the interactions with other individuals. The description of the interpersonal contacts is based on experimental observations of the social relations in the community; it includes spatial localization of the individuals and the hierarchical structure of interpersonal interactions. Numerical simulations were performed for different types of epidemics, giving the progress of a spreading process and typical relationships (e.g. range of epidemic in time, the epidemic curve). The spreading process has a complex and spatially chaotic character. The time dependence of the number of infective individuals shows the nonlinear character of the spreading process. We investigate the influence of preventive vaccinations on the spreading process; in particular, for a critical value of preventively vaccinated individuals the percolation threshold is observed and the epidemic is suppressed.
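
A stripped-down version of such a network SEIR simulation can be sketched as follows. The contact structure, transmission probability and timing parameters are invented for illustration; the paper's model additionally includes spatial localization and the hierarchical contact structure described above:

```python
import random

def simulate_seir(n=300, k=6, p_inf=0.1, t_latent=3, t_infective=5, steps=60, seed=7):
    """Toy discrete-time SEIR dynamics on random contact lists.

    States: 0=susceptible, 1=exposed (infected, not yet infective),
    2=infective, 3=removed. All parameters are illustrative placeholders.
    """
    rng = random.Random(seed)
    contacts = [rng.sample([j for j in range(n) if j != i], k) for i in range(n)]
    state = [0] * n
    timer = [0] * n
    state[0] = 2                      # a single infective index case
    curve = []
    for _ in range(steps):
        new_state = state[:]
        for i in range(n):
            if state[i] == 0:         # susceptible: exposure via infective contacts
                if any(state[j] == 2 and rng.random() < p_inf for j in contacts[i]):
                    new_state[i], timer[i] = 1, 0
            elif state[i] == 1:       # exposed becomes infective after latency
                timer[i] += 1
                if timer[i] >= t_latent:
                    new_state[i], timer[i] = 2, 0
            elif state[i] == 2:       # infective is removed after the infective period
                timer[i] += 1
                if timer[i] >= t_infective:
                    new_state[i] = 3
        state = new_state
        curve.append(sum(1 for s in state if s == 2))  # the epidemic curve
    return curve

curve = simulate_seir()
```

The returned list is the epidemic curve mentioned in the abstract; initialising a fraction of nodes to the removed state before the run is the simplest way to probe the vaccination threshold effect.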

  8. Elastic Network Model of a Nuclear Transport Complex

    Science.gov (United States)

    Ryan, Patrick; Liu, Wing K.; Lee, Dockjin; Seo, Sangjae; Kim, Young-Jin; Kim, Moon K.

    2010-05-01

    RanGTP plays an important role in both nuclear protein import and export cycles. In the nucleus, RanGTP releases macromolecular cargoes from importins and conversely facilitates cargo binding to exportins. Although the crystal structure of the nuclear import complex formed by importin Kap95p and RanGTP was recently identified, its molecular mechanism still remains unclear. To understand the relationship between structure and function of a nuclear transport complex, a structure-based mechanical model of the Kap95p:RanGTP complex is introduced; the structure of Kap95p was obtained from the Protein Data Bank (www.pdb.org). In this model, a protein structure is simply modeled as an elastic network in which a set of coarse-grained point masses are connected by linear springs representing biochemical interactions at the atomic level. Harmonic normal mode analysis (NMA) and anharmonic elastic network interpolation (ENI) are performed to predict the modes of vibration and a feasible pathway between the locked and unlocked conformations of Kap95p, respectively. Simulation results imply that the binding of RanGTP to Kap95p induces the release of the cargo in the nucleus as well as preventing any new cargo from attaching to the Kap95p:RanGTP complex.

  9. Entropy, complexity, and Markov diagrams for random walk cancer models.

    Science.gov (United States)

    Newton, Paul K; Mason, Jeremy; Hurt, Brian; Bethel, Kelly; Bazhenova, Lyudmila; Nieva, Jorge; Kuhn, Peter

    2014-12-19

    The notion of entropy is used to compare the complexity associated with 12 common cancers based on metastatic tumor distribution autopsy data. We characterize power-law distributions, entropy, and Kullback-Leibler divergence associated with each primary cancer as compared with data for all cancer types aggregated. We then correlate entropy values with other measures of complexity associated with Markov chain dynamical systems models of progression. The Markov transition matrix associated with each cancer is associated with a directed graph model where nodes are anatomical locations where a metastatic tumor could develop, and edge weightings are transition probabilities of progression from site to site. The steady-state distribution corresponds to the autopsy data distribution. Entropy correlates well with the overall complexity of the reduced directed graph structure for each cancer and with a measure of systemic interconnectedness of the graph, called graph conductance. The models suggest that grouping cancers according to their entropy values, with skin, breast, kidney, and lung cancers being prototypical high entropy cancers, stomach, uterine, pancreatic and ovarian being mid-level entropy cancers, and colorectal, cervical, bladder, and prostate cancers being prototypical low entropy cancers, provides a potentially useful framework for viewing metastatic cancer in terms of predictability, complexity, and metastatic potential.
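
The core computation linking the Markov model to the entropy measure can be sketched on a hypothetical three-site chain (the paper's chains use many anatomical sites and fitted transition probabilities; the matrix below is invented):

```python
import math

def steady_state(P, iters=2000):
    """Power-iterate a row-stochastic matrix to its stationary distribution."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

def shannon_entropy(dist):
    """Entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

# Hypothetical transition probabilities among three metastatic sites.
P = [[0.5, 0.3, 0.2],
     [0.2, 0.6, 0.2],
     [0.1, 0.3, 0.6]]
pi = steady_state(P)
H = shannon_entropy(pi)   # higher H: tumor burden spread more evenly over sites
```

In the paper's framework the stationary distribution is matched to the autopsy site-frequency data, so the entropy of that distribution quantifies how evenly a primary cancer spreads its metastases, which is exactly the basis of the high/mid/low entropy grouping above.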

  10. BlenX-based compositional modeling of complex reaction mechanisms

    Directory of Open Access Journals (Sweden)

    Judit Zámborszky

    2010-02-01

    Molecular interactions are wired in a fascinating way, resulting in the complex behavior of biological systems. Theoretical modeling provides a useful framework for understanding the dynamics and the function of such networks. The complexity of biological networks calls for conceptual tools that manage the combinatorial explosion of the set of possible interactions. A suitable conceptual tool to attack complexity is compositionality, already successfully used in the process algebra field to model computer systems. We rely on the BlenX programming language, which originated from the beta-binders process calculus, to specify and simulate high-level descriptions of biological circuits. The Gillespie stochastic framework of BlenX requires the decomposition of phenomenological functions into basic elementary reactions. Systematic unpacking of complex reaction mechanisms into BlenX templates is shown in this study. The estimation/derivation of missing parameters and the challenges emerging from compositional model building in stochastic process algebras are discussed. A biological example of the circadian clock is presented as a case study of BlenX compositionality.
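
For context, the Gillespie framework that BlenX models are simulated under works on exactly such sets of elementary reactions. A generic sketch of Gillespie's direct method on a toy birth-death system (this is not BlenX code, and the rates are invented):

```python
import math
import random

def gillespie(x0, rates, stoich, t_end, seed=42):
    """Gillespie's direct method for a set of elementary reactions.

    A generic illustration of the stochastic framework the abstract refers to,
    applied to a toy birth-death system; not BlenX itself.
    """
    rng = random.Random(seed)
    t, x = 0.0, list(x0)
    trace = [(t, tuple(x))]
    while t < t_end:
        a = [rxn(x) for rxn in rates]       # propensity of each reaction
        a0 = sum(a)
        if a0 == 0:
            break
        t += -math.log(rng.random()) / a0   # exponential waiting time
        u, cum, j = rng.random() * a0, 0.0, 0
        for j, aj in enumerate(a):          # choose a reaction proportionally
            cum += aj
            if u < cum:
                break
        x = [xi + s for xi, s in zip(x, stoich[j])]
        trace.append((t, tuple(x)))
    return trace

# Birth-death: 0 -> A at rate 5.0; A -> 0 at rate 0.1 per molecule.
trace = gillespie(x0=[0],
                  rates=[lambda x: 5.0, lambda x: 0.1 * x[0]],
                  stoich=[(1,), (-1,)],
                  t_end=100.0)
final = trace[-1][1][0]   # fluctuates around the deterministic mean 5.0/0.1 = 50
```

Because every propensity must be an elementary mass-action rate, phenomenological expressions such as Hill or Michaelis-Menten functions have to be unpacked into chains of such reactions first, which is the decomposition step the abstract highlights.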

  11. A Hybrid Model of Mathematics Support for Science Students Emphasizing Basic Skills and Discipline Relevance

    Science.gov (United States)

    Jackson, Deborah C.; Johnson, Elizabeth D.

    2013-01-01

    The problem of students entering university lacking basic mathematical skills is a critical issue in the Australian higher-education sector and relevant globally. The Maths Skills programme at La Trobe University has been developed to address under-preparation in the first-year science cohort in the absence of an institutional mathematics support…

  12. Some New Theoretical Issues in Systems Thinking Relevant for Modelling Corporate Learning

    Science.gov (United States)

    Minati, Gianfranco

    2007-01-01

    Purpose: The purpose of this paper is to describe fundamental concepts and theoretical challenges with regard to systems, and to build on these in proposing new theoretical frameworks relevant to learning, for example in so-called learning organizations. Design/methodology/approach: The paper focuses on some crucial fundamental aspects introduced…

  13. Relevance of bone graft viability in a goat transverse process model

    NARCIS (Netherlands)

    Kruyt, Moyo C.; Delawi, Diyar; Habibovic, Pamela; Oner, F. Cumhur; van Blitterswijk, Clemens; Dhert, Wouter J.A.

    2009-01-01

    Little is known about the mechanism by which autologous bone grafts are so successful. The relevance of viable osteogenic cells, which is a prominent difference between autologous bone graft and conventional alternatives, is especially controversial. With the emergence of bone tissue engineering,

  14. Multiscale modeling of complex materials phenomenological, theoretical and computational aspects

    CERN Document Server

    Trovalusci, Patrizia

    2014-01-01

    The papers in this volume deal with materials science, theoretical mechanics, and experimental and computational techniques at multiple scales, providing a sound base and a framework for many applications which have hitherto been treated only in a phenomenological sense. The basic principles of multiscale modeling strategies are formulated for modern complex multiphase materials subjected to various types of mechanical and thermal loading and environmental effects. The focus is on problems where mechanics is highly coupled with other concurrent physical phenomena. Attention is also given to the historical origins of multiscale modeling and to the foundations of continuum mechanics currently adopted to model non-classical continua with substructure, for which internal length scales play a crucial role.

  15. The Relevance Voxel Machine (RVoxM): A Self-Tuning Bayesian Model for Informative Image-Based Prediction

    DEFF Research Database (Denmark)

    Sabuncu, Mert R.; Van Leemput, Koen

    2012-01-01

    This paper presents the relevance voxel machine (RVoxM), a dedicated Bayesian model for making predictions based on medical imaging data. In contrast to the generic machine learning algorithms that have often been used for this purpose, the method is designed to utilize a small number of spatially...

  16. Receptor–Receptor Interactions in Multiple 5-HT1A Heteroreceptor Complexes in Raphe-Hippocampal 5-HT Transmission and Their Relevance for Depression and Its Treatment.

    Science.gov (United States)

    Borroto-Escuela, Dasiel O; Narváez, Manuel; Ambrogini, Patrizia; Ferraro, Luca; Brito, Ismel; Romero-Fernandez, Wilber; Andrade-Talavera, Yuniesky; Flores-Burgess, Antonio; Millon, Carmelo; Gago, Belen; Narvaez, Jose Angel; Odagaki, Yuji; Palkovits, Miklos; Diaz-Cabiale, Zaida; Fuxe, Kjell

    2018-06-03

    Due to the binding of a number of proteins to the receptor protomers in receptor heteromers in the brain, the term "heteroreceptor complexes" was introduced. A number of serotonin 5-HT1A heteroreceptor complexes were recently found to be linked to the ascending 5-HT pathways known to have a significant role in depression. The 5-HT1A–FGFR1 heteroreceptor complexes were involved in synergistically enhancing neuroplasticity in the hippocampus and in the dorsal raphe 5-HT nerve cells. The 5-HT1A protomer significantly increased FGFR1 protomer signaling in wild-type rats. Disturbances in the 5-HT1A–FGFR1 heteroreceptor complexes in the raphe-hippocampal 5-HT system were found in a genetic rat model of depression (Flinders sensitive line (FSL) rats). Deficits in FSL rats were observed in the ability of combined FGFR1 and 5-HT1A agonist cotreatment to produce antidepressant-like effects. It may in part reflect a failure of FGFR1 treatment to uncouple the 5-HT1A postjunctional receptors and autoreceptors from the hippocampal and dorsal raphe GIRK channels, respectively. This may result in maintained inhibition of hippocampal pyramidal nerve cell and dorsal raphe 5-HT nerve cell firing. Also, 5-HT1A–5-HT2A isoreceptor complexes were recently demonstrated to exist in the hippocampus and limbic cortex. They may play a role in depression through an ability of 5-HT2A protomer signaling to inhibit the 5-HT1A protomer recognition and signaling. Finally, galanin (1–15) was reported to enhance the antidepressant effects of fluoxetine through the putative formation of GalR1–GalR2–5-HT1A heteroreceptor complexes. Taken together, these novel 5-HT1A receptor complexes offer new targets for treatment of depression.

  17. Simulating Complex, Cold-region Process Interactions Using a Multi-scale, Variable-complexity Hydrological Model

    Science.gov (United States)

    Marsh, C.; Pomeroy, J. W.; Wheater, H. S.

    2017-12-01

    Accurate management of water resources is necessary for social, economic, and environmental sustainability worldwide. In locations with seasonal snowcovers, the accurate prediction of these water resources is further complicated due to frozen soils, solid-phase precipitation, blowing snow transport, and snowcover-vegetation-atmosphere interactions. Complex process interactions and feedbacks are a key feature of hydrological systems and may result in emergent phenomena, i.e., the arising of novel and unexpected properties within a complex system. One example is the feedback associated with blowing snow redistribution, which can lead to drifts that cause locally increased soil moisture, thus increasing plant growth that in turn impacts subsequent snow redistribution, creating larger drifts. Attempting to simulate these emergent behaviours is a significant challenge, however, and there is concern that process conceptualizations within current models are too incomplete to represent the needed interactions. An improved understanding of the role of emergence in hydrological systems often requires high-resolution distributed numerical hydrological models that incorporate the relevant process dynamics. The Canadian Hydrological Model (CHM) provides a novel tool for examining cold-region hydrological systems. Key features include efficient terrain representation, allowing simulations at various spatial scales, reduced computational overhead, and a modular process representation allowing for an alternative-hypothesis framework. Using both physics-based and conceptual process representations, sourced from long-term process studies and the current cold-regions literature, allows for comparison of process representations and, importantly, their ability to produce emergent behaviours. Examining the system in a holistic, process-based manner will hopefully yield important insights and aid in the development of improved process representations.

  18. Modelling and simulation of gas explosions in complex geometries

    Energy Technology Data Exchange (ETDEWEB)

    Saeter, Olav

    1998-12-31

    This thesis presents a three-dimensional Computational Fluid Dynamics (CFD) code (EXSIM94) for the modelling and simulation of gas explosions in complex geometries. It gives the theory and validates the following sub-models: (1) the flow resistance and turbulence generation model for densely packed regions, (2) the flow resistance and turbulence generation model for single objects, and (3) the quasi-laminar combustion model. It is found that a simple model for flow resistance and turbulence generation in densely packed beds is able to reproduce the medium- and large-scale MERGE explosion experiments of the Commission of European Communities (CEC) within a factor of 2. The single-object representation model is found to predict explosion pressures in better agreement with the experiments when a modified k-ε model is used. This modification also gives slightly improved grid independence for realistic gas explosion approaches. One laminar model is found unsuitable for gas explosion modelling because of strong grid dependence. Another laminar model is found to be relatively grid independent and to work well in harmony with the turbulent combustion model. The code is validated against 40 realistic gas explosion experiments. It is relatively grid independent in predicting explosion pressure in different offshore geometries. It can predict the influence of ignition point location, vent arrangements, different geometries, scaling effects and gas reactivity. The validation study concludes with statistical and uncertainty analyses of the code performance. 98 refs., 96 figs, 12 tabs.
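
    The "within a factor of 2" acceptance band used in the validation above can be expressed as a simple metric; a sketch with invented overpressure values (not the MERGE data):

```python
def within_factor(pred, obs, factor=2.0):
    """Fraction of predicted/measured pairs agreeing within a
    multiplicative band of `factor` (here: a factor of 2)."""
    ok = sum(1 for p, o in zip(pred, obs) if o / factor <= p <= o * factor)
    return ok / len(obs)

# Illustrative peak overpressures in bar (hypothetical, not the MERGE data).
measured  = [0.10, 0.25, 0.40, 0.80, 1.50]
predicted = [0.15, 0.20, 0.70, 0.90, 3.50]
print(within_factor(predicted, measured))
```

    The last pair (3.50 vs 1.50) falls outside the band, so four of the five illustrative predictions pass.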

  19. Parametric Linear Hybrid Automata for Complex Environmental Systems Modeling

    Directory of Open Access Journals (Sweden)

    Samar Hayat Khan Tareen

    2015-07-01

    Full Text Available Environmental systems, whether they be weather patterns or predator-prey relationships, are dependent on a number of different variables, each directly or indirectly affecting the system at large. Since not all of these factors are known, these systems take on non-linear dynamics, making it difficult to accurately predict meaningful behavioral trends far into the future. However, such dynamics do not warrant complete ignorance of different efforts to understand and model close approximations of these systems. Towards this end, we have applied a logical modeling approach to model and analyze the behavioral trends and systematic trajectories that these systems exhibit without delving into their quantification. This approach, formalized by René Thomas for discrete logical modeling of Biological Regulatory Networks (BRNs) and further extended in our previous studies as parametric biological linear hybrid automata (Bio-LHA), has been previously employed for the analyses of different molecular regulatory interactions occurring across various cells and microbial species. As relationships between different interacting components of a system can be simplified as positive or negative influences, we can employ the Bio-LHA framework to represent different components of the environmental system as positive or negative feedbacks. In the present study, we highlight the benefits of hybrid (discrete/continuous) modeling, which lead to refinements among the forecasted behaviors in order to find out which ones are actually possible. We have taken two case studies: an interaction of three microbial species in a freshwater pond, and a more complex atmospheric system, to show the applications of the Bio-LHA methodology for the timed hybrid modeling of environmental systems. Results show that the approach using the Bio-LHA is a viable method for behavioral modeling of complex environmental systems by finding timing constraints while keeping the complexity of the model
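
    The discrete (René Thomas-style) core of this formalism, before any hybrid extension, can be illustrated with a two-variable Boolean loop in which X activates Y and Y represses X; the synchronous state-transition graph then exhibits the cycle expected of a negative feedback loop. The network and update rule here are invented for illustration, not one of the paper's case studies:

```python
from itertools import product

# Hypothetical two-variable Boolean regulatory network:
# X activates Y, and Y represses X (a negative feedback loop).
def step(state):
    x, y = state
    return (int(not y),   # X is on unless repressed by Y
            int(x))       # Y is on when activated by X

# Synchronous state-transition graph over all Boolean states.
graph = {s: step(s) for s in product((0, 1), repeat=2)}
print(graph)

# Follow the trajectory from (0, 0); the negative loop yields a cycle,
# the discrete counterpart of sustained oscillation.
s, seen = (0, 0), []
while s not in seen:
    seen.append(s)
    s = graph[s]
print(seen)
```

    Every state here lies on a single four-state cycle, which is why a negative loop is the discrete prerequisite for oscillatory behaviour.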

  20. Building confidence and credibility amid growing model and computing complexity

    Science.gov (United States)

    Evans, K. J.; Mahajan, S.; Veneziani, C.; Kennedy, J. H.

    2017-12-01

    As global Earth system models are developed to answer an ever-wider range of science questions, software products that provide robust verification, validation, and evaluation must evolve in tandem. Measuring the degree to which these new models capture past behavior, predict the future, and provide the certainty of predictions is becoming ever more challenging for reasons that are generally well known, yet are still challenging to address. Two specific and divergent needs for analysis of the Accelerated Climate Model for Energy (ACME) model - but with a similar software philosophy - are presented to show how a model developer-based focus can address analysis needs during expansive model changes to provide greater fidelity and execute on multi-petascale computing facilities. A-PRIME is a python script-based quick-look overview of a fully-coupled global model configuration to determine quickly if it captures specific behavior before significant computer time and expense is invested. EVE is an ensemble-based software framework that focuses on verification of performance-based ACME model development, such as compiler or machine settings, to determine the equivalence of relevant climate statistics. The challenges and solutions for analysis of multi-petabyte output data are highlighted from the aspect of the scientist using the software, with the aim of fostering discussion and further input from the community about improving developer confidence and community credibility.
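
    Verifying "equivalence of relevant climate statistics" between ensembles, as described for EVE above, amounts to a two-sample statistical test. A minimal sketch with a hand-rolled Kolmogorov-Smirnov statistic on synthetic Gaussian "temperature" ensembles (not ACME output, and not EVE's actual test suite):

```python
import random

def ks_statistic(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum gap
    between the empirical CDFs of the two samples."""
    a, b = sorted(a), sorted(b)
    d = 0.0
    for x in sorted(set(a + b)):
        ca = sum(v <= x for v in a) / len(a)
        cb = sum(v <= x for v in b) / len(b)
        d = max(d, abs(ca - cb))
    return d

random.seed(0)
n = 400
base    = [random.gauss(15.0, 2.0) for _ in range(n)]  # control ensemble
same    = [random.gauss(15.0, 2.0) for _ in range(n)]  # e.g. new compiler flags
shifted = [random.gauss(16.0, 2.0) for _ in range(n)]  # a genuine climate change

# Approximate 5% critical value for equal sample sizes n; statistical
# equivalence is rejected when the KS statistic exceeds it.
crit = 1.36 * (2.0 / n) ** 0.5
print(round(ks_statistic(base, same), 3),
      round(ks_statistic(base, shifted), 3),
      round(crit, 3))
```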

  1. Bridging Mechanistic and Phenomenological Models of Complex Biological Systems.

    Science.gov (United States)

    Transtrum, Mark K; Qiu, Peng

    2016-05-01

    The inherent complexity of biological systems gives rise to complicated mechanistic models with a large number of parameters. On the other hand, the collective behavior of these systems can often be characterized by a relatively small number of phenomenological parameters. We use the Manifold Boundary Approximation Method (MBAM) as a tool for deriving simple phenomenological models from complicated mechanistic models. The resulting models are not black boxes, but remain expressed in terms of the microscopic parameters. In this way, we explicitly connect the macroscopic and microscopic descriptions, characterize the equivalence class of distinct systems exhibiting the same range of collective behavior, and identify the combinations of components that function as tunable control knobs for the behavior. We demonstrate the procedure for adaptation behavior exhibited by the EGFR pathway. From a 48 parameter mechanistic model, the system can be effectively described by a single adaptation parameter τ characterizing the ratio of time scales for the initial response and recovery time of the system which can in turn be expressed as a combination of microscopic reaction rates, Michaelis-Menten constants, and biochemical concentrations. The situation is not unlike modeling in physics in which microscopically complex processes can often be renormalized into simple phenomenological models with only a few effective parameters. The proposed method additionally provides a mechanistic explanation for non-universal features of the behavior.
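
    The adaptation behaviour described above, a fast response followed by slow recovery governed by a ratio of time scales, can be illustrated with a generic two-variable sketch. This is not the 48-parameter EGFR model; the equations and coefficients are assumed purely for illustration:

```python
def simulate(tau_fast=0.1, tau_slow=2.0, dt=0.001, t_end=20.0):
    """Minimal adaptation ('sniffer') sketch: y responds quickly to a
    step input u, then a slow internal variable x cancels it, so y
    returns to baseline. The single phenomenological knob is the
    time-scale ratio tau = tau_slow / tau_fast."""
    x = y = 0.0
    u = 1.0                                  # step input applied at t = 0
    ys = []
    for _ in range(int(t_end / dt)):         # forward-Euler integration
        dy = ((u - x) - y) / tau_fast        # fast response toward u - x
        dx = (u - x) / tau_slow              # slow adaptation variable
        y += dy * dt
        x += dx * dt
        ys.append(y)
    return ys

ys = simulate()
peak, final = max(ys), ys[-1]
print(round(peak, 3), round(final, 3))
```

    The output shows a large transient peak followed by near-perfect return to baseline; only the ratio of the two time constants shapes the response, mirroring how many microscopic parameters collapse into one effective parameter.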

  2. Lewis basicity of relevant monoanions in a non-protogenic organic solvent using a zinc(ii) Schiff-base complex as a reference Lewis acid.

    Science.gov (United States)

    Oliveri, Ivan Pietro; Di Bella, Santo

    2017-09-12

    Anions are ubiquitous species playing a primary role in chemistry, whose reactivity is essentially dominated by their Lewis basicity. However, no Lewis basicity data, in terms of Gibbs energy, are reported in the literature. We report here the first Lewis basicity data for relevant monoanions through the determination of binding constants for the formation of stable 1 : 1 adducts, using a Zn(II) Schiff-base complex, 1, as a reference Lewis acid. Binding constants for the equilibrium reactions were obtained through a nonlinear regression analysis of the binding isotherms from spectrophotometric titration data. The Lewis acidic complex 1 is a proper reference species because it forms stable adducts with both neutral and charged Lewis bases, thus allowing their Lewis basicity to be ranked. Binding constants generally indicate a strong Lewis basicity for all of the anions involved, rivalling or exceeding that of the stronger neutral bases, such as primary amines or pyridine. The cyanide anion proves to be the strongest Lewis base, while nitrate is the weakest base within the present anion series. Moreover, even the weaker base anions behave as stronger bases than the most common non-protogenic coordinating solvents.
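
    The nonlinear regression of 1 : 1 binding isotherms described above can be sketched as follows, with synthetic titration data and a crude grid search standing in for a proper optimizer; the concentrations and the "true" binding constant are invented for illustration:

```python
import math

def complex_conc(h0, g0, K):
    """Equilibrium [HG] for 1:1 binding H + G <-> HG (exact quadratic root)."""
    b = h0 + g0 + 1.0 / K
    return 0.5 * (b - math.sqrt(b * b - 4.0 * h0 * g0))

def isotherm(g_totals, h0, K, dA_max):
    """Observed response assumed proportional to the bound fraction."""
    return [dA_max * complex_conc(h0, g, K) / h0 for g in g_totals]

# Hypothetical titration: host at 1e-5 M; synthetic "observed" data
# generated with K = 1e6 M^-1 (all numbers illustrative).
h0, K_true, dA_true = 1e-5, 1e6, 0.8
g = [i * 2e-6 for i in range(1, 16)]
obs = isotherm(g, h0, K_true, dA_true)

# Crude nonlinear regression: scan log10(K); at each K the amplitude
# dA_max is estimated by linear least squares.
best = None
for i in range(501):                       # K from 1e3 to 1e8
    K = 10 ** (3 + 0.01 * i)
    model = isotherm(g, h0, K, 1.0)
    dA = sum(m * o for m, o in zip(model, obs)) / sum(m * m for m in model)
    sse = sum((dA * m - o) ** 2 for m, o in zip(model, obs))
    if best is None or sse < best[0]:
        best = (sse, K, dA)
print(best[1], round(best[2], 3))
```

    In practice a dedicated nonlinear least-squares routine would replace the grid scan, but the structure of the fit is the same.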

  3. Using Gene Ontology to describe the role of the neurexin-neuroligin-SHANK complex in human, mouse and rat and its relevance to autism.

    Science.gov (United States)

    Patel, Sejal; Roncaglia, Paola; Lovering, Ruth C

    2015-06-06

    People with an autistic spectrum disorder (ASD) display a variety of characteristic behavioral traits, including impaired social interaction, communication difficulties and repetitive behavior. This complex neurodevelopment disorder is known to be associated with a combination of genetic and environmental factors. Neurexins and neuroligins play a key role in synaptogenesis and neurexin-neuroligin adhesion is one of several processes that have been implicated in autism spectrum disorders. In this report we describe the manual annotation of a selection of gene products known to be associated with autism and/or the neurexin-neuroligin-SHANK complex and demonstrate how a focused annotation approach leads to the creation of more descriptive Gene Ontology (GO) terms, as well as an increase in both the number of gene product annotations and their granularity, thus improving the data available in the GO database. The manual annotations we describe will impact on the functional analysis of a variety of future autism-relevant datasets. Comprehensive gene annotation is an essential aspect of genomic and proteomic studies, as the quality of gene annotations incorporated into statistical analysis tools affects the effective interpretation of data obtained through genome wide association studies, next generation sequencing, proteomic and transcriptomic datasets.

  4. Mathematical modelling of complex contagion on clustered networks

    Science.gov (United States)

    O'Sullivan, David J.; O'Keeffe, Gary; Fennell, Peter; Gleeson, James

    2015-09-01

    The spreading of behavior, such as the adoption of a new innovation, is influenced by the structure of social networks that interconnect the population. In the experiments of Centola (Science, 2010), adoption of new behavior was shown to spread further and faster across clustered-lattice networks than across corresponding random networks. This implies that the “complex contagion” effects of social reinforcement are important in such diffusion, in contrast to “simple” contagion models of disease-spread which predict that epidemics would grow more efficiently on random networks than on clustered networks. To accurately model complex contagion on clustered networks remains a challenge because the usual assumptions (e.g. of mean-field theory) regarding tree-like networks are invalidated by the presence of triangles in the network; the triangles are, however, crucial to the social reinforcement mechanism, which posits an increased probability of a person adopting behavior that has been adopted by two or more neighbors. In this paper we modify the analytical approach that was introduced by Hebert-Dufresne et al. (Phys. Rev. E, 2010), to study disease-spread on clustered networks. We show how the approximation method can be adapted to a complex contagion model, and confirm the accuracy of the method with numerical simulations. The analytical results of the model enable us to quantify the level of social reinforcement that is required to observe—as in Centola’s experiments—faster diffusion on clustered topologies than on random networks.
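
    The social-reinforcement mechanism, adoption once two or more neighbours have adopted, is easy to demonstrate directly: on a clustered ring lattice the cascade can propagate globally, while on a degree-matched random graph it tends to stall. This deterministic threshold sketch is far simpler than the paper's analytical framework and is meant only to illustrate the qualitative effect:

```python
import random

def spread(adj, seeds, threshold=2):
    """Deterministic complex contagion: a node adopts once at least
    `threshold` of its neighbours have adopted."""
    adopted = set(seeds)
    changed = True
    while changed:
        changed = False
        for node, nbrs in adj.items():
            if node not in adopted and sum(n in adopted for n in nbrs) >= threshold:
                adopted.add(node)
                changed = True
    return adopted

n = 200
# Clustered ring lattice: each node linked to its two nearest neighbours
# on each side, so consecutive nodes share neighbours (triangles).
ring = {i: [(i - 2) % n, (i - 1) % n, (i + 1) % n, (i + 2) % n] for i in range(n)}

# Degree-matched random graph via random pairing of edge stubs
# (self-loops and multi-edges are tolerated in this rough sketch).
random.seed(0)
stubs = [i for i in range(n) for _ in range(4)]
random.shuffle(stubs)
rand = {i: [] for i in range(n)}
for a, b in zip(stubs[::2], stubs[1::2]):
    rand[a].append(b)
    rand[b].append(a)

seeds = {0, 1, 2}
print(len(spread(ring, seeds)), len(spread(rand, seeds)))
```

    On the lattice, each frontier node sees two adopted neighbours thanks to the shared triangles, so the cascade wraps around the whole ring; on the tree-like random graph almost no node has two seed neighbours, so the threshold is rarely met.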

  5. Mathematical modelling of complex contagion on clustered networks

    Directory of Open Access Journals (Sweden)

    David J. P. O'Sullivan

    2015-09-01

    Full Text Available The spreading of behavior, such as the adoption of a new innovation, is influenced by the structure of social networks that interconnect the population. In the experiments of Centola (Science, 2010), adoption of new behavior was shown to spread further and faster across clustered-lattice networks than across corresponding random networks. This implies that the “complex contagion” effects of social reinforcement are important in such diffusion, in contrast to “simple” contagion models of disease-spread which predict that epidemics would grow more efficiently on random networks than on clustered networks. To accurately model complex contagion on clustered networks remains a challenge because the usual assumptions (e.g. of mean-field theory) regarding tree-like networks are invalidated by the presence of triangles in the network; the triangles are, however, crucial to the social reinforcement mechanism, which posits an increased probability of a person adopting behavior that has been adopted by two or more neighbors. In this paper we modify the analytical approach that was introduced by Hebert-Dufresne et al. (Phys. Rev. E, 2010), to study disease-spread on clustered networks. We show how the approximation method can be adapted to a complex contagion model, and confirm the accuracy of the method with numerical simulations. The analytical results of the model enable us to quantify the level of social reinforcement that is required to observe—as in Centola’s experiments—faster diffusion on clustered topologies than on random networks.

  6. The semiotics of control and modeling relations in complex systems.

    Science.gov (United States)

    Joslyn, C

    2001-01-01

    We provide a conceptual analysis of ideas and principles from the systems theory discourse which underlie Pattee's semantic or semiotic closure, which is itself foundational for a school of theoretical biology derived from systems theory and cybernetics, and is now being related to biological semiotics and explicated in the relational biological school of Rashevsky and Rosen. Atomic control systems and models are described as the canonical forms of semiotic organization, sharing measurement relations, but differing topologically in that control systems are circularly and models linearly related to their environments. Computation in control systems is introduced, motivating hierarchical decomposition, hybrid modeling and control systems, and anticipatory or model-based control. The semiotic relations in complex control systems are described in terms of relational constraints, and rules and laws are distinguished as contingent and necessary functional entailments, respectively. Finally, selection as a meta-level of constraint is introduced as the necessary condition for semantic relations in control systems and models.

  7. Predicting the future completing models of observed complex systems

    CERN Document Server

    Abarbanel, Henry

    2013-01-01

    Predicting the Future: Completing Models of Observed Complex Systems provides a general framework for the discussion of model building and validation across a broad spectrum of disciplines. This is accomplished through the development of an exact path integral for use in transferring information from observations to a model of the observed system. Through many illustrative examples drawn from models in neuroscience, fluid dynamics, geosciences, and nonlinear electrical circuits, the concepts are exemplified in detail. Practical numerical methods for approximate evaluations of the path integral are explored, and their use in designing experiments and determining a model's consistency with observations is investigated. Using highly instructive examples, the problems of data assimilation and the means to treat them are clearly illustrated. This book will be useful for students and practitioners of physics, neuroscience, regulatory networks, meteorology and climate science, network dynamics, fluid dynamics, and o...

  8. Uranium(VI) speciation: modelling, uncertainty and relevance to bioavailability models. Application to uranium uptake by the gills of a freshwater bivalve

    International Nuclear Information System (INIS)

    Denison, F.H.

    2004-07-01

    The effects of varying solution composition on the interactions between uranium(VI) and excised gills of the freshwater bivalve Corbicula fluminea have been investigated in well defined solution media. A significant reduction in the uptake of uranium was observed on increasing the concentrations of the uranium complexing ligands citrate and carbonate. Saturation kinetics as a function of uranium concentration at a pH value of 5.0 were observed, indicating that the uptake of uranium is a facilitated process, probably involving one or several trans-membrane transport systems. A relatively small change in the uptake of uranium was found as a function of pH (factor of ca. 2), despite the extremely large changes to the solution speciation of uranium within the range of pH investigated (5.0 - 7.5). A comprehensive review of the thermodynamic data relevant to the solution composition domain employed for this study was performed. Estimates of the uncertainties for the formation constants of aqueous uranium(VI) species were integrated into a thermodynamic database. A computer program was written to predict the equilibrium distribution of uranium(VI) in simple aqueous systems, using thermodynamic parameter mean-values. The program was extended to perform Monte Carlo and Quasi Monte Carlo uncertainty analyses, incorporating the thermodynamic database uncertainty estimates, to quantitatively predict the uncertainties inherent in predicting the solution speciation of uranium. The use of thermodynamic equilibrium modelling as a tool for interpreting the bioavailability of uranium(VI) was investigated. Observed uranium(VI) uptake behaviour was interpreted as a function of the predicted changes to the solution speciation of uranium. Different steady-state or pre-equilibrium approaches to modelling uranium uptake were tested. Alternative modelling approaches were also tested, considering the potential changes to membrane transport system activity or sorption characteristics on
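
    The Monte Carlo propagation of formation-constant uncertainty described above can be sketched for a single 1 : 1 equilibrium. The log β mean and spread, the total concentrations, and the generic ligand are all illustrative values, not data from the thesis:

```python
import math
import random
import statistics

def frac_complexed(logB, U_tot, L_tot):
    """Fraction of total U present as the 1:1 complex UL, from the exact
    quadratic solution of U + L <-> UL with formation constant B."""
    B = 10.0 ** logB
    b = U_tot + L_tot + 1.0 / B
    UL = 0.5 * (b - math.sqrt(b * b - 4.0 * U_tot * L_tot))
    return UL / U_tot

# Hypothetical inputs: log beta = 6.0 +/- 0.3 (illustrative), total
# uranium 1e-6 M, total ligand 1e-5 M. Each draw samples the formation
# constant from its uncertainty distribution and re-solves the speciation.
random.seed(42)
samples = [frac_complexed(random.gauss(6.0, 0.3), 1e-6, 1e-5)
           for _ in range(10_000)]
print(round(statistics.mean(samples), 3), round(statistics.stdev(samples), 3))
```

    The spread of the sampled speciation fractions is the quantity of interest: it shows how thermodynamic-database uncertainty translates directly into uncertainty in the predicted bioavailable fraction.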

  9. On the "early-time" evolution of variables relevant to turbulence models for Rayleigh-Taylor instability

    Energy Technology Data Exchange (ETDEWEB)

    Rollin, Bertrand [Los Alamos National Laboratory; Andrews, Malcolm J [Los Alamos National Laboratory

    2010-01-01

    We present our progress toward setting initial conditions in variable density turbulence models. In particular, we concentrate our efforts on the BHR turbulence model for turbulent Rayleigh-Taylor instability. Our approach is to predict profiles of relevant parameters before the fully turbulent regime and use them as initial conditions for the turbulence model. We use an idealized model of the mixing between two interpenetrating fluids to define the initial profiles for the turbulence model parameters. Velocities and volume fractions used in the idealized mixing model are obtained respectively from a set of ordinary differential equations modeling the growth of the Rayleigh-Taylor instability and from an idealization of the density profile in the mixing layer. A comparison between predicted initial profiles for the turbulence model parameters and initial profiles of the parameters obtained from low Atwood number three dimensional simulations show reasonable agreement.
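
    A buoyancy-drag ordinary differential equation is one common idealization of pre-turbulent Rayleigh-Taylor growth of the kind referred to above. The sketch below is not the BHR model or the authors' ODE set; the form of the drag term and its coefficient are assumed for illustration:

```python
# Idealized buoyancy-drag sketch for single-mode Rayleigh-Taylor bubble
# growth: dh/dt = v, dv/dt = A*g - C*v**2/h (coefficients assumed).
A, g, C = 0.5, 9.81, 2.0
h, v = 1e-3, 0.0           # small initial perturbation at rest
dt, t = 1e-4, 0.0
while t < 5.0:             # forward-Euler integration
    dv = A * g - C * v * v / h
    h += v * dt
    v += dv * dt
    t += dt

# Late-time self-similar growth h ~ alpha*A*g*t**2; substituting that
# ansatz into the ODE gives alpha = 1/(2 + 4*C) = 0.1 for C = 2.
alpha_est = h / (A * g * t * t)
print(round(alpha_est, 3))
```

    The numerically recovered growth coefficient approaches the analytic self-similar value, which is the kind of "early-time" profile one might hand to a turbulence model as an initial condition.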

  10. On the "early-time" evolution of variables relevant to turbulence models for the Rayleigh-Taylor instability

    Energy Technology Data Exchange (ETDEWEB)

    Rollin, Bertrand [Los Alamos National Laboratory; Andrews, Malcolm J [Los Alamos National Laboratory

    2010-01-01

    We present our progress toward setting initial conditions in variable density turbulence models. In particular, we concentrate our efforts on the BHR turbulence model for turbulent Rayleigh-Taylor instability. Our approach is to predict profiles of relevant variables before the fully turbulent regime and use them as initial conditions for the turbulence model. We use an idealized model of the mixing between two interpenetrating fluids to define the initial profiles for the turbulence model variables. Velocities and volume fractions used in the idealized mixing model are obtained respectively from a set of ordinary differential equations modeling the growth of the Rayleigh-Taylor instability and from an idealization of the density profile in the mixing layer. A comparison between predicted profiles for the turbulence model variables and profiles of the variables obtained from low Atwood number three dimensional simulations shows reasonable agreement.

  11. An Ontology for Modeling Complex Inter-relational Organizations

    Science.gov (United States)

    Wautelet, Yves; Neysen, Nicolas; Kolp, Manuel

    This paper presents an ontology for organizational modeling through multiple complementary aspects. The primary goal of the ontology is to provide an adequate set of related concepts for studying complex organizations involved in many relationships at the same time. In this paper, we define complex organizations as networked organizations involved in a market eco-system that are playing several roles simultaneously. In such a context, traditional approaches focus on the macro-analytic level of transactions; this is supplemented here with a micro-analytic study of the actors' rationale. First, the paper reviews the enterprise-ontology literature to position our proposal and exposes its contributions and limitations. The ontology is then brought to an advanced level of formalization: a meta-model in the form of a UML class diagram gives an overview of the ontology concepts and their relationships, which are formally defined. Finally, the paper presents the case study on which the ontology has been validated.

  12. Receptor–Receptor Interactions in Multiple 5-HT1A Heteroreceptor Complexes in Raphe-Hippocampal 5-HT Transmission and Their Relevance for Depression and Its Treatment

    Directory of Open Access Journals (Sweden)

    Dasiel O. Borroto-Escuela

    2018-06-01

    Full Text Available Due to the binding of a number of proteins to the receptor protomers in receptor heteromers in the brain, the term “heteroreceptor complexes” was introduced. A number of serotonin 5-HT1A heteroreceptor complexes were recently found to be linked to the ascending 5-HT pathways known to have a significant role in depression. The 5-HT1A–FGFR1 heteroreceptor complexes were involved in synergistically enhancing neuroplasticity in the hippocampus and in the dorsal raphe 5-HT nerve cells. The 5-HT1A protomer significantly increased FGFR1 protomer signaling in wild-type rats. Disturbances in the 5-HT1A–FGFR1 heteroreceptor complexes in the raphe-hippocampal 5-HT system were found in a genetic rat model of depression (Flinders sensitive line (FSL) rats). Deficits in FSL rats were observed in the ability of combined FGFR1 and 5-HT1A agonist cotreatment to produce antidepressant-like effects. It may in part reflect a failure of FGFR1 treatment to uncouple the 5-HT1A postjunctional receptors and autoreceptors from the hippocampal and dorsal raphe GIRK channels, respectively. This may result in maintained inhibition of hippocampal pyramidal nerve cell and dorsal raphe 5-HT nerve cell firing. Also, 5-HT1A–5-HT2A isoreceptor complexes were recently demonstrated to exist in the hippocampus and limbic cortex. They may play a role in depression through an ability of 5-HT2A protomer signaling to inhibit the 5-HT1A protomer recognition and signaling. Finally, galanin (1–15) was reported to enhance the antidepressant effects of fluoxetine through the putative formation of GalR1–GalR2–5-HT1A heteroreceptor complexes. Taken together, these novel 5-HT1A receptor complexes offer new targets for treatment of depression.

  13. A computational framework for modeling targets as complex adaptive systems

    Science.gov (United States)

    Santos, Eugene; Santos, Eunice E.; Korah, John; Murugappan, Vairavan; Subramanian, Suresh

    2017-05-01

    Modeling large military targets is a challenge as they can be complex systems encompassing myriad combinations of human, technological, and social elements that interact, leading to complex behaviors. Moreover, such targets have multiple components and structures, extending across multiple spatial and temporal scales, and are in a state of change, either in response to events in the environment or changes within the system. Complex adaptive system (CAS) theory can help in capturing the dynamism, interactions, and more importantly various emergent behaviors, displayed by the targets. However, a key stumbling block is incorporating information from various intelligence, surveillance and reconnaissance (ISR) sources, while dealing with the inherent uncertainty, incompleteness and time criticality of real world information. To overcome these challenges, we present a probabilistic reasoning network based framework called complex adaptive Bayesian Knowledge Base (caBKB). caBKB is a rigorous, overarching and axiomatic framework that models two key processes, namely information aggregation and information composition. While information aggregation deals with the union, merger and concatenation of information and takes into account issues such as source reliability and information inconsistencies, information composition focuses on combining information components where such components may have well defined operations. Since caBKBs can explicitly model the relationships between information pieces at various scales, it provides unique capabilities such as the ability to de-aggregate and de-compose information for detailed analysis. Using a scenario from the Network Centric Operations (NCO) domain, we will describe how our framework can be used for modeling targets with a focus on methodologies for quantifying NCO performance metrics.

  14. Fundamentals of complex networks: models, structures and dynamics

    CERN Document Server

    Chen, Guanrong; Li, Xiang

    2014-01-01

    Complex networks such as the Internet, WWW, transportation networks, power grids, biological neural networks, and scientific cooperation networks of all kinds provide challenges for future technological development. In particular, advanced societies have become dependent on large infrastructural networks to an extent beyond our capability to plan (modeling) and to operate (control). The recent spate of collapses in power grids and ongoing virus attacks on the Internet illustrate the need for knowledge about modeling, analysis of behaviors, optimized planning and performance control in such networks.

  15. Model Complexities of Shallow Networks Representing Highly Varying Functions

    Czech Academy of Sciences Publication Activity Database

    Kůrková, Věra; Sanguineti, M.

    2016-01-01

    Roč. 171, 1 January (2016), s. 598-604 ISSN 0925-2312 R&D Projects: GA MŠk(CZ) LD13002 Grant - others:grant for Visiting Professors(IT) GNAMPA-INdAM Institutional support: RVO:67985807 Keywords : shallow networks * model complexity * highly varying functions * Chernoff bound * perceptrons * Gaussian kernel units Subject RIV: IN - Informatics, Computer Science Impact factor: 3.317, year: 2016

  16. Experiments and models of MHD jets and their relevance to astrophysics and solar physics

    Science.gov (United States)

    Bellan, Paul

    2017-10-01

    simulations. Upon attaining a critical length, laboratory jets develop a complex but resolvable sequence of instabilities which is effectively a cascade from the large-scale MHD regime to the small-scale two-fluid and kinetic regimes. This cascade involves kinking, Rayleigh-Taylor instabilities, magnetic reconnection, whistler waves, ion and electron heating, and generation of hard X-rays. An extended model shows how clumps of particles in a weakly ionized accretion disk move like a metaparticle whose charge-to-mass ratio is reduced from that of an ion by the fractional ionization. These weakly charged metaparticles follow an inward spiral trajectory that is neither a cyclotron nor a Kepler orbit and accumulate at small radius, where they produce a disk-plane radial EMF that drives astrophysical jets. Supported by DOE, NSF, and AFOSR.

  17. Coupled variable selection for regression modeling of complex treatment patterns in a clinical cancer registry.

    Science.gov (United States)

    Schmidtmann, I; Elsäßer, A; Weinmann, A; Binder, H

    2014-12-30

    For determining a manageable set of covariates potentially influential with respect to a time-to-event endpoint, Cox proportional hazards models can be combined with variable selection techniques, such as stepwise forward selection or backward elimination based on p-values, or regularized regression techniques such as component-wise boosting. Cox regression models have also been adapted for dealing with more complex event patterns, for example, for competing risks settings with separate, cause-specific hazard models for each event type, or for determining the prognostic effect pattern of a variable over different landmark times, with one conditional survival model for each landmark. Motivated by a clinical cancer registry application, where complex event patterns have to be dealt with and variable selection is needed at the same time, we propose a general approach for linking variable selection between several Cox models. Specifically, we combine score statistics for each covariate across models by Fisher's method as a basis for variable selection. This principle is implemented for a stepwise forward selection approach as well as for a regularized regression technique. In an application to data from hepatocellular carcinoma patients, the coupled stepwise approach is seen to facilitate joint interpretation of the different cause-specific Cox models. In conditional survival models at landmark times, which address updates of prediction as time progresses and both treatment and other potential explanatory variables may change, the coupled regularized regression approach identifies potentially important, stably selected covariates together with their effect time pattern, despite having only a small number of events. These results highlight the promise of the proposed approach for coupling variable selection between Cox models, which is particularly relevant for modeling for clinical cancer registries with their complex event patterns. 
Copyright © 2014 John Wiley & Sons
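The core step described above, combining per-covariate score-test p-values across several Cox models with Fisher's method, is simple enough to sketch directly. This is a generic illustration of Fisher's method, not the authors' implementation; the p-values below are invented. The chi-square tail probability is computed in closed form, which is exact for the even degrees of freedom (2k) that Fisher's method produces.

```python
import math

def fisher_combine(pvals):
    """Combine independent p-values with Fisher's method.

    Returns (statistic, combined p-value). The statistic
    X = -2 * sum(ln p_i) follows a chi-square distribution
    with 2k degrees of freedom under the global null.
    """
    k = len(pvals)
    x = -2.0 * sum(math.log(p) for p in pvals)
    # Chi-square survival function for even df = 2k (closed form):
    # P(X > x) = exp(-x/2) * sum_{i=0}^{k-1} (x/2)^i / i!
    half = x / 2.0
    tail = math.exp(-half) * sum(half**i / math.factorial(i) for i in range(k))
    return x, tail

# Example: score-test p-values for one covariate across three
# cause-specific Cox models (numbers are illustrative only).
stat, p = fisher_combine([0.04, 0.20, 0.01])
```

A covariate would then be selected when the combined p-value crosses the chosen threshold, which links the selection decision across all models at once.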

  18. Perspectives on creating clinically relevant blast models for mild traumatic brain injury and post traumatic stress disorder symptoms

    Directory of Open Access Journals (Sweden)

    Lisa Brenner

    2012-03-01

    Military personnel are returning from Iraq and Afghanistan and reporting non-specific physical (somatic), behavioral, psychological, and cognitive symptoms. Many of these symptoms are frequently associated with mild traumatic brain injury (mTBI) and/or post-traumatic stress disorder (PTSD). Despite significant attention and advances in assessment and intervention for these two conditions, challenges persist. To address this, clinically relevant blast models are essential for the full characterization of this type of injury, as well as for the testing and identification of potential treatment strategies. In this publication, existing diagnostic challenges and current treatment practices for mTBI and/or PTSD are summarized, along with suggestions regarding how lessons learned from existing models of PTSD and traditional-mechanism (e.g., non-blast) TBI can be used to facilitate the development of clinically relevant blast models.

  19. Complexity and agent-based modelling in urban research

    DEFF Research Database (Denmark)

    Fertner, Christian

    Urbanisation processes are results of a broad variety of actors or actor groups and their behaviour and decisions based on different experiences, knowledge, resources, values etc. The decisions made are often on a micro/individual level but result in macro/collective behaviour. In urban research... influence on the bigger system. Traditional scientific methods or theories often tried to simplify, not accounting for the complex relations of actors and decision-making. The introduction of computers in simulation made new approaches in modelling possible, as for example agent-based modelling (ABM), dealing...

  20. Methodology and Results of Mathematical Modelling of Complex Technological Processes

    Science.gov (United States)

    Mokrova, Nataliya V.

    2018-03-01

    The methodology of system analysis allows us to draw up a mathematical model of a complex technological process. A mathematical description of the plasma-chemical process is proposed. The importance of the quenching rate and of the initial temperature decrease time for producing the maximum amount of the target product is confirmed. The results of numerical integration of the system of differential equations can be used to describe reagent concentrations, plasma jet rate and temperature in order to achieve the optimal hardening mode. Such models are applicable both for solving control problems and for predicting future states of sophisticated technological systems.

  1. The complex sine-Gordon model on a half line

    International Nuclear Information System (INIS)

    Tzamtzis, Georgios

    2003-01-01

    In this thesis, we study the complex sine-Gordon model on a half line. The model in the bulk is an integrable (1+1) dimensional field theory which is U(1) gauge invariant and comprises a generalisation of the sine-Gordon theory. It accepts soliton and breather solutions. By introducing suitably selected boundary conditions we may consider the model on a half line. Through such conditions the model can be shown to remain integrable and various aspects of the boundary theory can be examined. The first chapter serves as a brief introduction to some basic concepts of integrability and soliton solutions. As an example of an integrable system with soliton solutions, the sine-Gordon model is presented both in the bulk and on a half line. These results will serve as a useful guide for the model at hand. The introduction finishes with a brief overview of the two methods that will be used in the fourth chapter in order to obtain the quantum spectrum of the boundary complex sine-Gordon model. In the second chapter the model is properly introduced along with a brief literature review. Different realisations of the model and their connexions are discussed. The vacuum of the theory is investigated. Soliton solutions are given and a discussion on the existence of breathers follows. Finally the collapse of breather solutions to single solitons is demonstrated and the chapter concludes with a different approach to the breather problem. In the third chapter, we construct the lowest conserved currents and through them we find suitable boundary conditions that allow for their conservation in the presence of a boundary. The boundary term is added to the Lagrangian and the vacuum is reexamined in the half line case. The reflection process of solitons from the boundary is studied and the time-delay is calculated. Finally we address the existence of boundary-bound states. In the fourth chapter we study the quantum complex sine-Gordon model.
We begin with a brief overview of the theory in

  2. Extending a configuration model to find communities in complex networks

    International Nuclear Information System (INIS)

    Jin, Di; Hu, Qinghua; He, Dongxiao; Yang, Bo; Baquero, Carlos

    2013-01-01

    Discovery of communities in complex networks is a fundamental data analysis task in various domains. Generative models are a promising class of techniques for identifying modular properties from networks, which has been actively discussed recently. However, most of them cannot preserve the degree sequence of networks, which will distort the community detection results. Rather than using a blockmodel as most current works do, here we generalize a configuration model, namely, a null model of modularity, to solve this problem. Towards decomposing and combining sub-graphs according to the soft community memberships, our model incorporates the ability to describe community structures, something the original model does not have. Also, it has the property, as with the original model, that it fixes the expected degree sequence to be the same as that of the observed network. We combine both the community property and degree sequence preserving into a single unified model, which gives better community results compared with other models. Thereafter, we learn the model using a technique of nonnegative matrix factorization and determine the number of communities by applying consensus clustering. We test this approach both on synthetic benchmarks and on real-world networks, and compare it with two similar methods. The experimental results demonstrate the superior performance of our method over competing methods in detecting both disjoint and overlapping communities. (paper)
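The configuration-model null that the abstract builds on fixes the expected number of edges between nodes i and j at k_i k_j / 2m, preserving the degree sequence in expectation. This is the same null model that underlies Newman-Girvan modularity, which can be sketched directly (a toy graph for illustration, not the paper's NMF-based method):

```python
def modularity(adj, communities):
    """Newman-Girvan modularity with the configuration-model null.

    adj: symmetric 0/1 adjacency matrix (list of lists).
    communities: community label per node.
    Expected edge weight between i and j under the null model is
    k_i * k_j / (2m), which preserves the degree sequence in
    expectation.
    """
    n = len(adj)
    deg = [sum(row) for row in adj]
    two_m = sum(deg)  # 2m = sum of degrees
    q = 0.0
    for i in range(n):
        for j in range(n):
            if communities[i] == communities[j]:
                q += adj[i][j] - deg[i] * deg[j] / two_m
    return q / two_m

# Two triangles joined by one bridge edge, split into two communities.
adj = [
    [0, 1, 1, 0, 0, 0],
    [1, 0, 1, 0, 0, 0],
    [1, 1, 0, 1, 0, 0],
    [0, 0, 1, 0, 1, 1],
    [0, 0, 0, 1, 0, 1],
    [0, 0, 0, 1, 1, 0],
]
q = modularity(adj, [0, 0, 0, 1, 1, 1])
```

A positive q means more intra-community edges than the degree-preserving null would predict; the paper's contribution is to carry this same degree-preserving property into a generative, soft-membership model.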

  3. Health behavior change models and their socio-cultural relevance for breast cancer screening in African American women.

    Science.gov (United States)

    Ashing-Giwa, K

    1999-01-01

    Models of health behavior provide the conceptual bases for most of the breast cancer screening intervention studies. These models were not designed for and have not been adequately tested with African American women. The models discussed in this paper are: The Health Belief Model, the Theory of Reasoned Action/Theory of Planned Behavior, and the Transtheoretical Model. This paper will examine the socio-cultural relevance of these health behavior models, and discuss specific socio-cultural dimensions that are not accounted for by these paradigms. It is critical that researchers include socio-cultural dimensions, such as interconnectedness, health socialization, ecological factors and health care system factors into their intervention models with African American women. Comprehensive and socio-culturally based investigations are necessary to guide the scientific and policy challenge for reducing breast cancer mortality in African American women.

  4. Modelling the fate of ciprofloxacin in activated sludge systems - The relevance of the sorption process

    DEFF Research Database (Denmark)

    Polesel, Fabio; Lehnberg, Kai; Dott, Wolfgang

    The sorption process can impact the removal of specific pharmaceuticals in municipal wastewater treatment plants (WWTPs). Ionic interactions (e.g., pH-driven equilibria and complexation), rather than hydrophobic interactions, are known to affect the sorption of zwitterionic pharmaceuticals...

  5. A model of negotiation scenarios based on time, relevance and control used to define advantageous positions in a negotiation

    Directory of Open Access Journals (Sweden)

    Omar Guillermo Rojas Altamirano

    2016-04-01

    Models that apply to negotiation are based on different perspectives that range from the relationship between the actors, to game theory, to the steps in a procedure. This research proposes a model of negotiation scenarios that considers three factors (time, relevance and control), which emerge as the most important in a negotiation. These factors interact with each other and create different scenarios for each of the actors involved in a negotiation. The proposed model facilitates not only the creation of a negotiation strategy but also an ideal choice of effective tactics.

  6. Internationalization through business model innovation: In search of relevant design dimensions and elements

    DEFF Research Database (Denmark)

    Rask, Morten

    2014-01-01

    Internationalization through business model innovation involves the creation, or reinvention, of the business itself. This paper aims to integrate basic insight from the literature on business model innovation, internationalization of the firm, international entrepreneurship and global marketing...

  7. Using model complexes to augment and advance metalloproteinase inhibitor design.

    Science.gov (United States)

    Jacobsen, Faith E; Cohen, Seth M

    2004-05-17

    The tetrahedral zinc complex [(Tp(Ph,Me))ZnOH] (Tp(Ph,Me) = hydrotris(3,5-phenylmethylpyrazolyl)borate) was combined with 2-thenylmercaptan, ethyl 4,4,4-trifluoroacetoacetate, salicylic acid, salicylamide, thiosalicylic acid, thiosalicylamide, methyl salicylate, methyl thiosalicylate, and 2-hydroxyacetophenone to form the corresponding [(Tp(Ph,Me))Zn(ZBG)] complexes (ZBG = zinc-binding group). X-ray crystal structures of these complexes were obtained to determine the mode of binding for each ZBG, several of which had been previously studied with SAR by NMR (structure-activity relationship by nuclear magnetic resonance) as potential ligands for use in matrix metalloproteinase inhibitors. The [(Tp(Ph,Me))Zn(ZBG)] complexes show that hydrogen bonding and donor atom acidity have a pronounced effect on the mode of binding for this series of ligands. The results of these studies give valuable insight into how ligand protonation state and intramolecular hydrogen bonds can influence the coordination mode of metal-binding proteinase inhibitors. The findings here suggest that model-based approaches can be used to augment drug discovery methods applied to metalloproteins and can aid second-generation drug design.

  8. INFORMATION MODEL OF MAJOR DEPRESSION TREATMENT COST - RELEVANCE OF QUALITY MANAGEMENT OF HEALTH SYSTEM

    Directory of Open Access Journals (Sweden)

    Danijela Tadić

    2010-09-01

    This paper develops a multirelational database for major depression costs. It describes how data are collected and stored in the fact base and dimension base. Uncertain data are described linguistically and modelled by fuzzy sets. Linguistic expressions are stored in the dimension base. Models of major depression treatment costs are developed for each patient and for the whole population. On the basis of this model and the multirelational database, an MD-OLAP model for major depression treatment costs is developed.
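Modelling a linguistic cost term by a fuzzy set, as the abstract describes, can be sketched with a triangular membership function. The support and peak values below are toy numbers; the paper's actual fuzzy model is not given in the abstract.

```python
def triangular(x, a, b, c):
    """Membership grade of x in a triangular fuzzy number (a, b, c):
    0 outside [a, c], rising linearly to 1 at the peak b, then
    falling linearly back to 0."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Hypothetical linguistic term "moderate treatment cost" as a fuzzy
# number over cost in EUR (illustrative support and peak).
moderate = lambda cost: triangular(cost, 500.0, 1000.0, 1500.0)
```

A cost of 750 EUR would then belong to "moderate" with grade 0.5, letting imprecise clinical cost data enter the OLAP model without forcing crisp category boundaries.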

  9. Fidelity in Animal Modeling: Prerequisite for a Mechanistic Research Front Relevant to the Inflammatory Incompetence of Acute Pediatric Malnutrition.

    Science.gov (United States)

    Woodward, Bill

    2016-04-11

    Inflammatory incompetence is characteristic of acute pediatric protein-energy malnutrition, but its underlying mechanisms remain obscure. Perhaps substantially because the research front lacks the driving force of a scholarly unifying hypothesis, it is adrift and research activity is declining. A body of animal-based research points to a unifying paradigm, the Tolerance Model, with some potential to offer coherence and a mechanistic impetus to the field. However, reasonable skepticism prevails regarding the relevance of animal models of acute pediatric malnutrition; consequently, the fundamental contributions of the animal-based component of this research front are largely overlooked. Design-related modifications to improve the relevance of animal modeling in this research front include, most notably, prioritizing essential features of pediatric malnutrition pathology rather than dietary minutiae specific to infants and children, selecting windows of experimental animal development that correspond to targeted stages of pediatric immunological ontogeny, and controlling for ontogeny-related confounders. In addition, important opportunities are presented by newer tools including the immunologically humanized mouse and outbred stocks exhibiting a magnitude of genetic heterogeneity comparable to that of human populations. Sound animal modeling is within our grasp to stimulate and support a mechanistic research front relevant to the immunological problems that accompany acute pediatric malnutrition.

  10. Fidelity in Animal Modeling: Prerequisite for a Mechanistic Research Front Relevant to the Inflammatory Incompetence of Acute Pediatric Malnutrition

    Science.gov (United States)

    Woodward, Bill

    2016-01-01

    Inflammatory incompetence is characteristic of acute pediatric protein-energy malnutrition, but its underlying mechanisms remain obscure. Perhaps substantially because the research front lacks the driving force of a scholarly unifying hypothesis, it is adrift and research activity is declining. A body of animal-based research points to a unifying paradigm, the Tolerance Model, with some potential to offer coherence and a mechanistic impetus to the field. However, reasonable skepticism prevails regarding the relevance of animal models of acute pediatric malnutrition; consequently, the fundamental contributions of the animal-based component of this research front are largely overlooked. Design-related modifications to improve the relevance of animal modeling in this research front include, most notably, prioritizing essential features of pediatric malnutrition pathology rather than dietary minutiae specific to infants and children, selecting windows of experimental animal development that correspond to targeted stages of pediatric immunological ontogeny, and controlling for ontogeny-related confounders. In addition, important opportunities are presented by newer tools including the immunologically humanized mouse and outbred stocks exhibiting a magnitude of genetic heterogeneity comparable to that of human populations. Sound animal modeling is within our grasp to stimulate and support a mechanistic research front relevant to the immunological problems that accompany acute pediatric malnutrition. PMID:27077845

  11. A Sensitivity Analysis Method to Study the Behavior of Complex Process-based Models

    Science.gov (United States)

    Brugnach, M.; Neilson, R.; Bolte, J.

    2001-12-01

    The use of process-based models as a tool for scientific inquiry is becoming increasingly relevant in ecosystem studies. Process-based models are artificial constructs that simulate the system by mechanistically mimicking the functioning of its component processes. Structurally, a process-based model can be characterized in terms of its processes and the relationships established among them. Each process comprises a set of functional relationships among several model components (e.g., state variables, parameters and input data). While not encoded explicitly, the dynamics of the model emerge from this set of components and interactions organized in terms of processes. It is the task of the modeler to guarantee that the dynamics generated are appropriate and semantically equivalent to the phenomena being modeled. Despite the availability of techniques to characterize and understand model behavior, they do not suffice to completely and easily understand how a complex process-based model operates. For example, sensitivity analysis studies model behavior by determining the rate of change in model output as parameters or input data are varied. One of the problems with this approach is that it considers the model as a "black box", and it focuses on explaining model behavior by analyzing the input-output relationship. Since these models have a high degree of non-linearity, understanding how the input affects an output can be an extremely difficult task. Operationally, the application of this technique may constitute a challenging task because complex process-based models are generally characterized by a large parameter space. In order to overcome some of these difficulties, we propose a method of sensitivity analysis applicable to complex process-based models. This method focuses sensitivity analysis at the process level, and it aims to determine how sensitive the model output is to variations in the processes.
Once the processes that exert the major influence in
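A minimal illustration of shifting sensitivity analysis from parameters to processes is a one-at-a-time perturbation of each process's output and a comparison of the resulting relative changes in the model output. This is a toy sketch under invented model and process names, not the authors' method:

```python
def process_sensitivity(model, base_inputs, perturb=0.10):
    """One-at-a-time sensitivity at the process level (toy sketch).

    `model` maps a dict of process outputs to a scalar model output;
    each process output is scaled by (1 + perturb) in turn, and the
    normalized (elasticity-like) change in the output is recorded.
    """
    base_out = model(base_inputs)
    sens = {}
    for name, value in base_inputs.items():
        bumped = dict(base_inputs, **{name: value * (1 + perturb)})
        sens[name] = (model(bumped) - base_out) / (perturb * base_out)
    return sens

# Hypothetical toy model whose output is dominated by the "runoff"
# process; process names and coefficients are invented.
toy = lambda proc: 3.0 * proc["runoff"] + 0.5 * proc["evap"]
s = process_sensitivity(toy, {"runoff": 2.0, "evap": 1.0})
```

Ranking the entries of `s` identifies the processes that exert the major influence on the output, which is the starting point the abstract describes for focusing further analysis.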

  12. Semiotic aspects of control and modeling relations in complex systems

    Energy Technology Data Exchange (ETDEWEB)

    Joslyn, C.

    1996-08-01

    A conceptual analysis of the semiotic nature of control is provided with the goal of elucidating its nature in complex systems. Control is identified as a canonical form of semiotic relation of a system to its environment. As a form of constraint between a system and its environment, its necessary and sufficient conditions are established, and the stabilities resulting from control are distinguished from other forms of stability. These result from the presence of semantic coding relations, and thus the class of control systems is hypothesized to be equivalent to that of semiotic systems. Control systems are contrasted with models, which, while they have the same measurement functions as control systems, do not necessarily require semantic relations because of the lack of the requirement of an interpreter. A hybrid construction of models in control systems is detailed. Towards the goal of considering the nature of control in complex systems, the possible relations among collections of control systems are considered. Powers' arguments on conflict among control systems and the possible nature of control in social systems are reviewed, and reconsidered based on our observations about hierarchical control. Finally, we discuss the necessary semantic functions which must be present in complex systems for control in this sense to be present at all.

  13. Stability of rotor systems: A complex modelling approach

    DEFF Research Database (Denmark)

    Kliem, Wolfhard; Pommer, Christian; Stoustrup, Jakob

    1998-01-01

    The dynamics of a large class of rotor systems can be modelled by a linearized complex matrix differential equation of second order, Mz̈ + (D + iG)ż + (K + iN)z = 0, where the system matrices M, D, G, K and N are real symmetric. Moreover M and K are assumed to be positive definite and D... approach applying bounds of appropriate Rayleigh quotients. The rotor systems tested are: a simple Laval rotor, a Laval rotor with additional elasticity and damping in the bearings, and a number of rotor systems with complex symmetric 4 x 4 randomly generated matrices.
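Stability of the complex second-order equation in the abstract can also be checked numerically by rewriting it in first-order form and inspecting eigenvalue real parts. This is a sketch with illustrative 1x1 matrices, a brute-force complement to the Rayleigh-quotient bounds the abstract describes:

```python
import numpy as np

def rotor_stable(M, D, G, K, N):
    """Check asymptotic stability of M z'' + (D + iG) z' + (K + iN) z = 0.

    Rewrites the system in first-order form d/dt [z, z'] = A [z, z']
    and tests whether all eigenvalues of A have negative real part.
    """
    n = M.shape[0]
    Minv = np.linalg.inv(M)
    C = D + 1j * G  # damping + gyroscopic terms
    S = K + 1j * N  # stiffness + circulatory terms
    A = np.block([
        [np.zeros((n, n)), np.eye(n)],
        [-Minv @ S, -Minv @ C],
    ])
    return bool(np.all(np.linalg.eigvals(A).real < 0))

# Illustrative scalar (1x1) example: damped, with a gyroscopic term.
M = np.array([[1.0]])
D = np.array([[0.2]])
G = np.array([[0.5]])
K = np.array([[4.0]])
N = np.array([[0.0]])
is_stable = rotor_stable(M, D, G, K, N)
```

For large systems this eigenvalue check is expensive, which is one motivation for the cheaper Rayleigh-quotient bounds studied in the paper.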

  14. Rhenium(V) oxo complexes relevant to technetium renal imaging agents derived from mercaptoacetylglycylglycylaminobenzoic acid isomers. Structural and molecular mechanics studies

    International Nuclear Information System (INIS)

    Hansen, L.; Taylor, A. Jr; Marzilli, L.G.; Cini, R.

    1992-01-01

    The synthesis and characterization of three rhenium(V) oxo complexes derived from isomers of mercaptoacetylglycylglycylaminobenzoic acid (MAG2-ABAH5) are reported. The isomers were synthesized from o-, m- and p-aminobenzoic acid and differed in the position of the terminal carboxyl group. The anions of 8-10, [ReO(MAG2-*ABAH)]− (* = para (8), meta (9), ortho (10)), contained the tetraanionic form of the ligands with the carboxyl group protonated. Compounds 8, 9, and 10 were synthesized by exchange reactions of ReOCl3(Me2SO)(Ph3P) under moderate conditions and were isolated as [Ph4P]+, [Bu4N]+, and [Ph4P]+ salts, respectively. The structures of 8 and 10 were determined by X-ray diffraction methods; except for the location of the carboxyl group, the structures are similar. The coordination geometry is pseudo square pyramidal, with nitrogen and sulfur donor atoms forming a square base and the oxo ligand at the apex. The orientation of the carboxyl group in 10 is anti to the Re=O group. Since the carboxyl groups are protonated in 8 and 10 and in other relevant structures from this class of radiopharmaceuticals, including [Ph4As][TcO(MAG3H)] (MAG3H = tetraanionic form of mercaptoacetyltriglycine), the authors developed molecular mechanics parameters that allowed them to calculate the structures of 8, 10, and [TcO(MAG3H)]−. They then extended the calculations to all three isomeric complexes in their deprotonated forms and to [TcO(MAG3)]2− in order to approximate their solution-phase structures. They conclude that the [TcO(MAG3)]2− species is conformationally flexible, and they have made an initial assessment of structures vs renal clearance

  15. Surface complexation modeling of zinc sorption onto ferrihydrite.

    Science.gov (United States)

    Dyer, James A; Trivedi, Paras; Scrivner, Noel C; Sparks, Donald L

    2004-02-01

    A previous study involving lead(II) [Pb(II)] sorption onto ferrihydrite over a wide range of conditions highlighted the advantages of combining molecular- and macroscopic-scale investigations with surface complexation modeling to predict Pb(II) speciation and partitioning in aqueous systems. In this work, an extensive collection of new macroscopic and spectroscopic data was used to assess the ability of the modified triple-layer model (TLM) to predict single-solute zinc(II) [Zn(II)] sorption onto 2-line ferrihydrite in NaNO3 solutions as a function of pH, ionic strength, and concentration. Regression of constant-pH isotherm data, together with potentiometric titration and pH edge data, was a much more rigorous test of the modified TLM than fitting pH edge data alone. When coupled with valuable input from spectroscopic analyses, good fits of the isotherm data were obtained with a one-species, one-Zn-sorption-site model using the bidentate-mononuclear surface complex (≡FeO)2Zn; however, surprisingly, both the density of Zn(II) sorption sites and the value of the best-fit equilibrium "constant" for the bidentate-mononuclear complex had to be adjusted with pH to adequately fit the isotherm data. Although spectroscopy provided some evidence for multinuclear surface complex formation at surface loadings approaching site saturation at pH ≥ 6.5, the assumption of a bidentate-mononuclear surface complex provided acceptable fits of the sorption data over the entire range of conditions studied. Regressing edge data in the absence of isotherm and spectroscopic data resulted in a fair number of surface-species/site-type combinations that provided acceptable fits of the edge data, but unacceptable fits of the isotherm data. A linear relationship between log K(≡FeO)2Zn and pH was found, given by log K(≡FeO)2Zn (at 1 g/L) = 2.058(pH) − 6.131.
In addition, a surface activity coefficient term was introduced to the model to reduce the ionic strength
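The fitted pH dependence reported in the abstract, log K = 2.058(pH) − 6.131 for the bidentate-mononuclear complex at 1 g/L ferrihydrite, is straightforward to encode. This is a sketch for evaluating the reported fit; the function and parameter names are ours:

```python
def log_k_feo2zn(pH, slope=2.058, intercept=-6.131):
    """pH-dependent equilibrium constant for the bidentate-mononuclear
    Zn surface complex reported in the abstract (1 g/L ferrihydrite):
        log K = 2.058 * pH - 6.131
    """
    return slope * pH + intercept

# Equilibrium constant itself at pH 6.5 (illustrative usage).
K = 10 ** log_k_feo2zn(6.5)
```

The strong slope (about 2 log units per pH unit) is what forces the "constant" to be re-fit at each pH, the surprising behavior the abstract highlights.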

  16. Sex and gonadal hormones in mouse models of Alzheimer’s disease: what is relevant to the human condition?

    Directory of Open Access Journals (Sweden)

    Dubal Dena B

    2012-11-01

    Biologic sex and gonadal hormones matter in human aging and diseases of aging such as Alzheimer's, and the importance of studying their influences relates directly to human health. The goal of this article is to review the literature to date on sex and hormones in mouse models of Alzheimer's disease (AD), with an exclusive focus on interpreting the relevance of findings to the human condition. To this end, we highlight advances in AD and in sex and hormone biology, discuss what these advances mean for merging the two fields, review the current mouse model literature, raise major unresolved questions, and offer a research framework that incorporates human reproductive aging for future studies aimed at translational discoveries in this important area. Unraveling human-relevant pathways in sex and hormone-based biology may ultimately pave the way to novel and urgently needed treatments for AD and other neurodegenerative diseases.

  17. Modelling of the quenching process in complex superconducting magnet systems

    International Nuclear Information System (INIS)

    Hagedorn, D.; Rodriguez-Mateos, F.

    1992-01-01

    This paper reports that the superconducting twin bore dipole magnet for the proposed Large Hadron Collider (LHC) at CERN shows a complex winding structure consisting of eight compact layers, each of them electromagnetically and thermally coupled with the others. This magnet is only one part of an electrical circuit; test and operation conditions are characterized by different circuits. In order to study the quenching process in this complex system, design adequate protection schemes, and provide a basis for the dimensioning of protection devices such as heaters, current breakers and dump resistors, a general simulation tool called QUABER has been developed using the analog system analysis program SABER. A complete set of electro-thermal models has been created for the propagation of normal regions. Any network extension or modification is easy to implement without rewriting the whole set of differential equations.

  18. A Primer for Model Selection: The Decisive Role of Model Complexity

    Science.gov (United States)

    Höge, Marvin; Wöhling, Thomas; Nowak, Wolfgang

    2018-03-01

    Selecting a "best" model among several competing candidate models poses an often encountered problem in water resources modeling (and other disciplines which employ models). For a modeler, the best model fulfills a certain purpose best (e.g., flood prediction), which is typically assessed by comparing model simulations to data (e.g., stream flow). Model selection methods find the "best" trade-off between good fit with data and model complexity. In this context, the interpretations of model complexity implied by different model selection methods are crucial, because they represent different underlying goals of modeling. Over the last decades, numerous model selection criteria have been proposed, but modelers who primarily want to apply a model selection criterion often face a lack of guidance for choosing the right criterion that matches their goal. We propose a classification scheme for model selection criteria that helps to find the right criterion for a specific goal, i.e., which employs the correct complexity interpretation. We identify four model selection classes which seek to achieve high predictive density, low predictive error, high model probability, or shortest compression of data. These goals can be achieved by following either nonconsistent or consistent model selection and by either incorporating a Bayesian parameter prior or not. We allocate commonly used criteria to these four classes, analyze how they represent model complexity and what this means for the model selection task. Finally, we provide guidance on choosing the right type of criteria for specific model selection tasks. (A quick guide through all key points is given at the end of the introduction.)
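The fit-versus-complexity trade-off that all such criteria formalize can be illustrated with the two most familiar ones, AIC and BIC. This is a generic sketch; the paper's four-class scheme covers many more criteria and the numbers below are invented:

```python
import math

def aic(log_likelihood, k):
    """Akaike information criterion: each parameter costs 2.
    Non-consistent; targets low predictive error."""
    return -2.0 * log_likelihood + 2.0 * k

def bic(log_likelihood, k, n):
    """Bayesian information criterion: the penalty grows with sample
    size n, making BIC consistent (it selects the data-generating
    model as n grows, if it is among the candidates)."""
    return -2.0 * log_likelihood + k * math.log(n)

# A more complex model must gain enough likelihood to justify its
# extra parameters (illustrative values; lower score wins).
simple = aic(-120.0, k=2)     # 244.0
complex_ = aic(-118.5, k=4)   # 245.0 -> the simpler model wins
```

The two penalties embody different complexity interpretations, which is exactly the distinction the proposed classification scheme organizes: AIC-type criteria aim at predictive performance, BIC-type criteria at model probability.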

  19. Spectroscopic studies of molybdenum complexes as models for nitrogenase

    International Nuclear Information System (INIS)

    Walker, T.P.

    1981-05-01

    Because biological nitrogen fixation requires Mo, there is an interest in inorganic Mo complexes which mimic the reactions of nitrogen-fixing enzymes. Two such complexes are the dimer Mo₂O₄(cysteine)₂²⁻ and trans-Mo(N₂)₂(dppe)₂ (dppe = 1,2-bis(diphenylphosphino)ethane). The ¹H and ¹³C NMR of solutions of Mo₂O₄(cys)₂²⁻ are described. It is shown that in aqueous solution the cysteine ligands assume at least three distinct configurations. A step-wise dissociation of the cysteine ligand is proposed to explain the data. The extended X-ray absorption fine structure (EXAFS) of trans-Mo(N₂)₂(dppe)₂ is described and compared to the EXAFS of MoH₄(dppe)₂. The spectra are fitted to amplitude and phase parameters developed at Bell Laboratories. On the basis of this analysis, one can determine (1) that the dinitrogen complex contains nitrogen and the hydride complex does not, and (2) the correct Mo-N distance. This is significant because the Mo in both complexes is coordinated by four P atoms which dominate the EXAFS. A similar sort of interference is present in nitrogenase due to S coordination of the Mo in the enzyme. This model experiment indicates that, given adequate signal-to-noise ratios, the presence or absence of dinitrogen coordination to Mo in the enzyme may be determined by EXAFS using existing data analysis techniques. A new reaction between Mo₂O₄(cys)₂²⁻ and acetylene is described to the extent it is presently understood. A strong EPR signal is observed, suggesting the production of stable Mo(V) monomers. EXAFS studies support this suggestion. The Mo K-edge is described. The edge data suggest that Mo(VI) is also produced in the reaction. Ultraviolet spectra suggest that cysteine is released in the course of the reaction.

  20. Diffusion in higher dimensional SYK model with complex fermions

    Science.gov (United States)

    Cai, Wenhe; Ge, Xian-Hui; Yang, Guo-Hong

    2018-01-01

    We construct a new higher-dimensional SYK model with complex fermions on bipartite lattices. As an extension of the original zero-dimensional SYK model, we focus on the one-dimensional case; a similar Hamiltonian can be obtained in higher dimensions. This model has a conserved U(1) fermion number Q and a conjugate chemical potential μ. We evaluate the thermal and charge diffusion constants via a large-q expansion in the low-temperature limit. The results show that the diffusivity depends on the ratio of free Majorana fermions to Majorana fermions with SYK interactions. The transport properties and the butterfly velocity are accordingly calculated at low temperature. The specific heat and the thermal conductivity are proportional to the temperature. The electrical resistivity also has a linear temperature-dependent term.

  1. 3D model of amphioxus steroid receptor complexed with estradiol

    Energy Technology Data Exchange (ETDEWEB)

    Baker, Michael E., E-mail: mbaker@ucsd.edu [Department of Medicine, University of California, San Diego, 9500 Gilman Drive, La Jolla, CA 92093-0693 (United States); Chang, David J. [Department of Biology, University of California, San Diego, 9500 Gilman Drive, La Jolla, CA 92093-0693 (United States)

    2009-08-28

    The origins of signaling by vertebrate steroids are not fully understood. An important advance was the report that an estrogen-binding steroid receptor [SR] is present in amphioxus, a basal chordate with a body plan similar to that of vertebrates. To investigate the evolution of estrogen binding to steroid receptors, we constructed a 3D model of the amphioxus SR complexed with estradiol. This 3D model indicates that although the SR is activated by estradiol, some interactions between estradiol and human ERα are not conserved in the SR, which can explain the low affinity of estradiol for the SR. These differences between the SR and ERα in the steroid-binding domain are sufficient to suggest that another steroid is the physiological regulator of the SR. The 3D model predicts that mutation of Glu-346 to Gln will increase the affinity of testosterone for the amphioxus SR and help elucidate the evolution of steroid binding to nuclear receptors.

  2. Bliss and Loewe interaction analyses of clinically relevant drug combinations in human colon cancer cell lines reveal complex patterns of synergy and antagonism.

    Science.gov (United States)

    Kashif, Muhammad; Andersson, Claes; Mansoori, Sharmineh; Larsson, Rolf; Nygren, Peter; Gustafsson, Mats G

    2017-11-28

    We analyzed survival effects for 15 different pairs of clinically relevant anti-cancer drugs in three isogenic pairs of human colorectal carcinoma cell lines, applying for the first time our novel software (R package) called COMBIA. In our experiments isogenic pairs of cell lines were used, differing only with respect to a single clinically important KRAS or BRAF mutation. Frequently, concentration-dependent but mutation-independent joint Bliss and Loewe synergy/antagonism was found statistically significant. Four combinations were found synergistic/antagonistic specifically to the parental (harboring the KRAS or BRAF mutation) cell line of the corresponding isogenic cell line pair. COMBIA offers considerable improvements over established software for synergy analysis such as MacSynergy™ II, as it includes both Bliss (independence) and Loewe (additivity) analyses, together with a tailored non-parametric statistical analysis employing heteroscedasticity-controlled resampling and global (omnibus) testing. In many cases Loewe analysis found significant synergistic as well as antagonistic effects in a cell line at different concentrations of a tested drug combination. By contrast, Bliss analysis found only one type of significant effect per cell line. In conclusion, the integrated Bliss and Loewe interaction analysis based on non-parametric statistics may provide more robust interaction analyses and reveal complex patterns of synergy and antagonism.
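The two interaction references combined here are easy to state: Bliss independence compares the observed combination effect with the effect expected if the drugs act independently, while Loewe additivity compares doses via a combination index. A minimal sketch (illustrative only, not COMBIA's implementation; all effect and dose values are hypothetical):

```python
def bliss_expected(e_a, e_b):
    """Expected fractional effect (0..1) if two drugs act independently."""
    return e_a + e_b - e_a * e_b

def bliss_excess(e_obs, e_a, e_b):
    """Positive values indicate synergy, negative antagonism, under Bliss."""
    return e_obs - bliss_expected(e_a, e_b)

def loewe_ci(d1, D1, d2, D2):
    """Loewe additivity combination index: d_i is drug i's dose in the mixture,
    D_i the dose of drug i alone producing the same effect as the mixture.
    CI < 1 suggests synergy, CI = 1 additivity, CI > 1 antagonism."""
    return d1 / D1 + d2 / D2

# Hypothetical readouts: 40% and 30% kill alone, 70% kill in combination.
expected = bliss_expected(0.4, 0.3)       # expected kill under independence, ~0.58
excess = bliss_excess(0.70, 0.4, 0.3)     # > 0, i.e. synergy at this concentration pair
```

The abstract's observation that Loewe can flag both synergy and antagonism in one cell line follows from the concentration dependence: the comparison is repeated at each tested concentration pair, and the sign of the deviation may change across the dose matrix.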

  3. Using advanced surface complexation models for modelling soil chemistry under forests: Solling forest, Germany

    International Nuclear Information System (INIS)

    Bonten, Luc T.C.; Groenenberg, Jan E.; Meesenburg, Henning; Vries, Wim de

    2011-01-01

    Various dynamic soil chemistry models have been developed to gain insight into impacts of atmospheric deposition of sulphur, nitrogen and other elements on soil and soil solution chemistry. Sorption parameters for anions and cations are generally calibrated for each site, which hampers extrapolation in space and time. On the other hand, recently developed surface complexation models (SCMs) have been successful in predicting ion sorption for static systems using generic parameter sets. This study reports the inclusion of an assemblage of these SCMs in the dynamic soil chemistry model SMARTml and applies this model to a spruce forest site in Solling, Germany. Parameters for SCMs were taken from generic datasets and not calibrated. Nevertheless, modelling results for major elements matched observations well. Further, trace metals were included in the model, also using the existing framework of SCMs. The model predicted sorption for most trace elements well. - Highlights: → Surface complexation models can be well applied in field studies. → Soil chemistry under a forest site is adequately modelled using generic parameters. → The model is easily extended with extra elements within the existing framework. → Surface complexation models can show the linkages between major soil chemistry and trace element behaviour. - Surface complexation models with generic parameters make calibration of sorption superfluous in dynamic modelling of deposition impacts on soil chemistry under nature areas.

  4. Using advanced surface complexation models for modelling soil chemistry under forests: Solling forest, Germany

    Energy Technology Data Exchange (ETDEWEB)

    Bonten, Luc T.C., E-mail: luc.bonten@wur.nl [Alterra-Wageningen UR, Soil Science Centre, P.O. Box 47, 6700 AA Wageningen (Netherlands); Groenenberg, Jan E. [Alterra-Wageningen UR, Soil Science Centre, P.O. Box 47, 6700 AA Wageningen (Netherlands); Meesenburg, Henning [Northwest German Forest Research Station, Abt. Umweltkontrolle, Sachgebiet Intensives Umweltmonitoring, Goettingen (Germany); Vries, Wim de [Alterra-Wageningen UR, Soil Science Centre, P.O. Box 47, 6700 AA Wageningen (Netherlands)

    2011-10-15

    Various dynamic soil chemistry models have been developed to gain insight into impacts of atmospheric deposition of sulphur, nitrogen and other elements on soil and soil solution chemistry. Sorption parameters for anions and cations are generally calibrated for each site, which hampers extrapolation in space and time. On the other hand, recently developed surface complexation models (SCMs) have been successful in predicting ion sorption for static systems using generic parameter sets. This study reports the inclusion of an assemblage of these SCMs in the dynamic soil chemistry model SMARTml and applies this model to a spruce forest site in Solling, Germany. Parameters for SCMs were taken from generic datasets and not calibrated. Nevertheless, modelling results for major elements matched observations well. Further, trace metals were included in the model, also using the existing framework of SCMs. The model predicted sorption for most trace elements well. - Highlights: → Surface complexation models can be well applied in field studies. → Soil chemistry under a forest site is adequately modelled using generic parameters. → The model is easily extended with extra elements within the existing framework. → Surface complexation models can show the linkages between major soil chemistry and trace element behaviour. - Surface complexation models with generic parameters make calibration of sorption superfluous in dynamic modelling of deposition impacts on soil chemistry under nature areas.

  5. Hot electron transport modelling in fast ignition relevant targets with non-Spitzer resistivity

    Energy Technology Data Exchange (ETDEWEB)

    Chapman, D A; Hoarty, D J; Swatton, D J R [Plasma Physics Department, AWE, Aldermaston, Reading, Berkshire, RG7 4PR (United Kingdom); Hughes, S J, E-mail: david.chapman@awe.co.u [Computational Physics Group, AWE, Aldermaston, Reading, Berkshire, RG7 4PR (United Kingdom)

    2010-08-01

    The simple Lee-More model for electrical resistivity is implemented in the hybrid fast electron transport code THOR. The model is shown to reproduce experimental data across a wide range of temperatures using a small number of parameters. The effect of this model on the heating of simple Al targets by a short-pulse laser is studied and compared to the predictions of the classical Spitzer-Haerm resistivity. The model is then used in simulations of hot electron transport experiments using buried layer targets.

  6. Consistent robustness analysis (CRA) identifies biologically relevant properties of regulatory network models.

    Science.gov (United States)

    Saithong, Treenut; Painter, Kevin J; Millar, Andrew J

    2010-12-16

    A number of studies have previously demonstrated that "goodness of fit" is insufficient in reliably classifying the credibility of a biological model. Robustness and/or sensitivity analysis is commonly employed as a secondary method for evaluating the suitability of a particular model. The results of such analyses invariably depend on the particular parameter set tested, yet many parameter values for biological models are uncertain. Here, we propose a novel robustness analysis that aims to determine the "common robustness" of the model with multiple, biologically plausible parameter sets, rather than the local robustness for a particular parameter set. Our method is applied to two published models of the Arabidopsis circadian clock (the one-loop [1] and two-loop [2] models). The results reinforce current findings suggesting the greater reliability of the two-loop model and pinpoint the crucial role of TOC1 in the circadian network. Consistent Robustness Analysis can indicate both the relative plausibility of different models and also the critical components and processes controlling each model.
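The idea of "common" versus local robustness can be sketched generically: score the output change under small perturbations for many biologically plausible parameter sets and aggregate, rather than trusting a single set. A toy illustration with invented names and a toy model, not the authors' CRA procedure:

```python
import random

def robustness(model, params, scale=0.05, trials=20, rng=random):
    """Local robustness: mean relative output change under small random
    multiplicative perturbations of one parameter set (smaller = more robust)."""
    base = model(params)
    changes = []
    for _ in range(trials):
        perturbed = {k: v * (1 + rng.uniform(-scale, scale)) for k, v in params.items()}
        changes.append(abs(model(perturbed) - base) / abs(base))
    return sum(changes) / trials

def common_robustness(model, param_sets, **kw):
    """'Common' robustness in the spirit of CRA: aggregate local robustness
    over many plausible parameter sets, not just one calibrated set."""
    return sum(robustness(model, p, **kw) for p in param_sets) / len(param_sets)

# Toy model: a clock period as a function of two rate constants (purely illustrative).
model = lambda p: p["k_deg"] / p["k_syn"] + 24.0
random.seed(1)
sets = [{"k_syn": random.uniform(0.5, 2.0), "k_deg": random.uniform(0.5, 2.0)}
        for _ in range(10)]
score = common_robustness(model, sets)
```

Comparing this aggregate score between two candidate models mirrors the paper's use of CRA to rank the one-loop and two-loop clock models, without ever committing to a single "correct" parameterization.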

  7. Pharmacokinetic models relevant to toxicity and metabolism for uranium in humans and animals

    International Nuclear Information System (INIS)

    Wrenn, M.E.

    1989-01-01

    Models to predict short- and long-term accumulation of uranium in the human kidney are reviewed and summarised. These are generally first-order linear compartmental models or pseudo-pharmacokinetic models such as the retention model of the ICRP. Pharmacokinetic models account not only for transfer from blood to organs, but also for recirculation from the organs to blood. The most recent information on mammalian and human metabolism of uranium is used to establish a revised model. The model is applied to the short-term accumulation of uranium in the human kidney after a single rapid dose to the blood, such as that obtained by inhaling UF6 or its hydrolysis products. It is shown that the maximum accumulation in the kidney under these conditions is less than the fraction of the material distributed from the blood to the kidney if a true pharmacokinetic model is used. The best coefficients applicable to man, in the authors' view, are summarised in model V. For a half-time of two days in the mammalian kidney, the maximum concentration in the kidney is 75% of that predicted by a retention model such as that used by the ICRP following a single acute intake. We conclude that one must use true pharmacokinetic models, which incorporate recirculation from the organs to the blood, in order to realistically predict time-dependent uptake in the kidneys and other organs. Information is presented showing that the half-time for urinary excretion of soluble uranium in man after inhalation of UF6 is about one quarter of a day. (author)
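The distinction the author draws — a true pharmacokinetic model with recirculation versus an ICRP-style retention model — can be sketched with a linear two-compartment system. All rate constants below are invented for illustration (the text's "model V" coefficients are not reproduced here); the point is that the kidney peak stays below the fraction a retention model would park in the kidney:

```python
import math

def peak_kidney(k_bk, k_kb, k_bu, k_ku, dose=1.0, dt=0.001, t_end=10.0):
    """Euler integration of a linear two-compartment model with recirculation:
    blood (B) <-> kidney (K), with excretion to urine from both compartments.
    Rate constants are in 1/day; returns the peak kidney content."""
    B, K, peak, t = dose, 0.0, 0.0, 0.0
    while t < t_end:
        dB = -(k_bk + k_bu) * B + k_kb * K   # blood loses to kidney/urine, regains from kidney
        dK = k_bk * B - (k_kb + k_ku) * K    # kidney gains from blood, loses to blood/urine
        B += dB * dt
        K += dK * dt
        peak = max(peak, K)
        t += dt
    return peak

# Invented rates: blood clears at 2/day with 12% routed to the kidney;
# the kidney empties with a 2-day half-time, split evenly between blood and urine.
retention_fraction = 0.12            # a pure retention model parks all of this in the kidney
k_bk, k_bu = 0.12 * 2.0, 0.88 * 2.0
k_out = math.log(2) / 2.0
k_kb = k_ku = 0.5 * k_out

peak = peak_kidney(k_bk, k_kb, k_bu, k_ku)   # stays below retention_fraction
```

With these invented rates the peak works out to roughly 70% of the 12% a retention model would assign, because the kidney is already clearing while it fills — the same qualitative effect as the 75% figure quoted in the abstract.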

  8. An evolutionary-game model of tumour-cell interactions: possible relevance to gene therapy

    DEFF Research Database (Denmark)

    Bach, L.A.; Bentzen, S.M.; Alsner, Jan

    2001-01-01

    Evolutionary games have been applied as simple mathematical models of populations where interactions between individuals control the dynamics. Recently, it has been proposed to use this type of model to describe the evolution of tumour cell populations with interactions between cells. We extend...

  9. A Biblical-Theological Model of Cognitive Dissonance Theory: Relevance for Christian Educators

    Science.gov (United States)

    Bowen, Danny Ray

    2012-01-01

    The purpose of this content analysis research was to develop a biblical-theological model of Cognitive Dissonance Theory applicable to pedagogy. Evidence of cognitive dissonance found in Scripture was used to infer a purpose for the innate drive toward consonance. This inferred purpose was incorporated into a model that improves the descriptive…

  10. Determinants of Dermal Exposure Relevant for Exposure Modelling in Regulatory Risk Assessment

    NARCIS (Netherlands)

    Marquart, J.; Brouwer, D.H.; Gijsbers, J.H.J.; Links, I.H.M.; Warren, N.; Hemmen, J.J. van

    2003-01-01

    Risk assessment of chemicals requires assessment of the exposure levels of workers. In the absence of adequate specific measured data, models are often used to estimate exposure levels. For dermal exposure only a few models exist, and these have not been validated externally. In the scope of a large European

  11. A physiologically-based pharmacokinetic(PB-PK) model for ethylene dibromide : relevance of extrahepatic metabolism

    NARCIS (Netherlands)

    Hissink, A M; Wormhoudt, L.W.; Sherratt, P.J.; Hayes, D.J.; Commandeur, J N; Vermeulen, N P; van Bladeren, P.J.

    A physiologically-based pharmacokinetic (PB-PK) model was developed for ethylene dibromide (1,2-dibromoethane, EDB) for rats and humans, partly based on previously published in vitro data (Ploemen et al., 1997). In the present study, this PB-PK model has been validated for the rat. In addition, new

  12. A physiologically-based pharmacokinetic (PB-PK) model for ethylene dibromide : relevance of extrahepatic metabolism

    NARCIS (Netherlands)

    Hissink, A.M.; Wormhoudt, L.W.; Sherratt, P.J.; Hayes, J.D.; Commandeur, J.N.M.; Vermeulen, N.P.E.; Bladeren, P.J. van

    2000-01-01

    A physiologically-based pharmacokinetic (PB-PK) model was developed for ethylene dibromide (1,2-dibromoethane, EDB) for rats and humans, partly based on previously published in vitro data (Ploemen et al., 1997). In the present study, this PB-PK model has been validated for the rat. In addition, new

  13. On agent cooperation : the relevance of cognitive plausibility for multiagent simulation models of organizations

    NARCIS (Netherlands)

    van den Broek, J.

    2001-01-01

    Human organizations and computational multiagent systems both are social systems because they are both made up of a large number of interacting parts. Since human organizations are arrangements of distributed real intelligence, any DAI model is in some sense a model of an organization. This

  14. On agent cooperation : The relevance of cognitive plausibility for multiagent simulation models of organizations

    NARCIS (Netherlands)

    Broek, J. van den

    2001-01-01

    Human organizations and computational multiagent systems both are social systems because they are both made up of a large number of interacting parts. Since human organizations are arrangements of distributed real intelligence, any DAI model is in some sense a model of an organization. This

  15. A forest model relevant to red-cockaded woodpeckers (Picoides Borealis)

    Science.gov (United States)

    J.C.G. Goelz; C.C. Rewerts; N.J. Hess

    2005-01-01

    Most forest models are created with timber production as the implied primary objective. For many land managers, timber production is less important than production of habitat for wildlife. Red-cockaded woodpeckers (RCW) are one of the priorities for management of the forests at the Ft. Benning army installation. To aid management, a model that integrates a red-cockaded woodpecker...

  16. Multiagent model and mean field theory of complex auction dynamics

    Science.gov (United States)

    Chen, Qinghua; Huang, Zi-Gang; Wang, Yougui; Lai, Ying-Cheng

    2015-09-01

    Recent years have witnessed a growing interest in analyzing a variety of socio-economic phenomena using methods from statistical and nonlinear physics. We study a class of complex systems arising from economics, the lowest unique bid auction (LUBA) systems, which is a recently emerged class of online auction game systems. Through analyzing large, empirical data sets of LUBA, we identify a general feature of the bid price distribution: an inverted J-shaped function with exponential decay in the large bid price region. To account for the distribution, we propose a multi-agent model in which each agent bids stochastically in the field of winner’s attractiveness, and develop a theoretical framework to obtain analytic solutions of the model based on mean field analysis. The theory produces bid-price distributions that are in excellent agreement with those from the real data. Our model and theory capture the essential features of human behaviors in the competitive environment as exemplified by LUBA, and may provide significant quantitative insights into complex socio-economic phenomena.
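The winner-determination rule of a LUBA round, together with a stochastic bidding rule whose exponential tail echoes the decay reported for the empirical bid-price distribution, can be sketched as follows (a generic simulation with invented parameters, not the authors' mean-field model):

```python
import random
from collections import Counter

def lowest_unique_bid(bids):
    """Winner determination in one LUBA round: the smallest bid placed by exactly one agent."""
    counts = Counter(bids)
    unique = [b for b, c in counts.items() if c == 1]
    return min(unique) if unique else None

def run_rounds(n_agents=50, max_bid=100, rounds=2000, rng=random):
    """Agents bid stochastically with a bias toward low prices and an
    exponential tail; returns the bid distribution and the winning bids."""
    dist, winners = Counter(), Counter()
    for _ in range(rounds):
        bids = [min(int(rng.expovariate(0.1)) + 1, max_bid) for _ in range(n_agents)]
        dist.update(bids)
        w = lowest_unique_bid(bids)
        if w is not None:
            winners[w] += 1
    return dist, winners

random.seed(0)
dist, winners = run_rounds()
```

Even this crude rule reproduces the key tension of the game: very low bids are attractive but rarely unique, so winning bids cluster above the most popular prices — the strategic feedback that the paper's mean-field treatment captures analytically.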

  17. Multiagent model and mean field theory of complex auction dynamics

    International Nuclear Information System (INIS)

    Chen, Qinghua; Wang, Yougui; Huang, Zi-Gang; Lai, Ying-Cheng

    2015-01-01

    Recent years have witnessed a growing interest in analyzing a variety of socio-economic phenomena using methods from statistical and nonlinear physics. We study a class of complex systems arising from economics, the lowest unique bid auction (LUBA) systems, which is a recently emerged class of online auction game systems. Through analyzing large, empirical data sets of LUBA, we identify a general feature of the bid price distribution: an inverted J-shaped function with exponential decay in the large bid price region. To account for the distribution, we propose a multi-agent model in which each agent bids stochastically in the field of winner’s attractiveness, and develop a theoretical framework to obtain analytic solutions of the model based on mean field analysis. The theory produces bid-price distributions that are in excellent agreement with those from the real data. Our model and theory capture the essential features of human behaviors in the competitive environment as exemplified by LUBA, and may provide significant quantitative insights into complex socio-economic phenomena. (paper)

  18. Complex Coronary Hemodynamics - Simple Analog Modelling as an Educational Tool.

    Science.gov (United States)

    Parikh, Gaurav R; Peter, Elvis; Kakouros, Nikolaos

    2017-01-01

    Invasive coronary angiography remains the cornerstone for evaluation of coronary stenoses, despite a poor correlation between luminal loss assessed by coronary luminography and myocardial ischemia. This is especially true for coronary lesions deemed moderate by visual assessment. Coronary pressure-derived fractional flow reserve (FFR) has emerged as the gold standard for the evaluation of the hemodynamic significance of coronary artery stenosis; it is cost-effective and leads to improved patient outcomes. There are, however, several limitations to the use of FFR, including the evaluation of serial stenoses. In this article, we discuss the electronic-hydraulic analogy and the utility of simple electrical modelling to mimic the coronary circulation and coronary stenoses. We exemplify the effect of tandem coronary lesions on the FFR by modelling a patient with sequential disease segments and complex anatomy. We believe that such computational modelling can serve as a powerful educational tool to help clinicians better understand the complexity of coronary hemodynamics and improve patient care.
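The electronic-hydraulic analogy maps pressure to voltage, flow to current, and each stenosis to a series resistor, so tandem lesions behave like a voltage divider. A minimal sketch (resistance values are invented; units are arbitrary):

```python
def ffr_tandem(p_aorta, r_stenoses, r_micro, p_venous=0.0):
    """Electrical analog of tandem coronary stenoses at maximal hyperemia:
    stenoses are series resistors feeding a fixed microvascular resistance.
    Returns the pressure distal to each stenosis and the overall FFR."""
    r_total = sum(r_stenoses) + r_micro
    flow = (p_aorta - p_venous) / r_total    # Ohm's law analogue: Q = dP / R
    pressures = []
    p = p_aorta
    for r in r_stenoses:
        p -= flow * r                        # pressure drop across each lesion
        pressures.append(p)
    ffr = pressures[-1] / p_aorta            # FFR = distal pressure / aortic pressure
    return pressures, ffr

# Hypothetical example: Pa = 100 mmHg, two moderate lesions, microvascular R dominant.
pressures, ffr = ffr_tandem(100.0, r_stenoses=[0.2, 0.3], r_micro=1.0)
```

The interdependence of serial stenoses falls out directly: removing one resistor raises the flow and therefore deepens the pressure drop across the remaining lesion, which is why each lesion's individual FFR cannot be read off independently.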

  19. Electrochemical dehalogenisation of chlorinated aromatics - from model substances to practice-relevant ''real life'' samples

    International Nuclear Information System (INIS)

    Voss, I.; Altrogge, M.; Francke, W.

    1993-01-01

    Building on methods for the dehalogenisation of chlorinated benzenes known from the literature, an investigation was carried out into whether polychlorinated biphenyls, dibenzofurans and dibenzo-p-dioxins can be dehalogenated electrochemically. The experiments were carried out with pure substances and then transferred to mixtures (''real life'' samples). The investigations showed that both pure substances and complex mixtures can be dehalogenated without problems. Even in the presence of an oil matrix (e.g. oil that has trickled through a deposit), dehalogenisation of the xenobiotics present is possible. First attempts at scaling up show that the method is, in principle, also suitable for the disposal of large quantities of contaminated liquids. (BBR) [de

  20. Are more complex physiological models of forest ecosystems better choices for plot and regional predictions?

    Science.gov (United States)

    Wenchi Jin; Hong S. He; Frank R. Thompson

    2016-01-01

    Process-based forest ecosystem models range from simple physiological and complex physiological to hybrid empirical-physiological models. Previous studies indicate that complex models provide the best predictions at the plot scale over temporal extents of less than 10 years; however, it is largely untested whether complex models outperform the other two types of models...

  1. Modelling study of sea breezes in a complex coastal environment

    Science.gov (United States)

    Cai, X.-M.; Steyn, D. G.

    This study investigates mesoscale modelling of sea breezes blowing from a narrow strait into the lower Fraser valley (LFV), British Columbia, Canada, during the period 17-20 July 1985. Without a nudging scheme in the inner grid, the CSU-RAMS model produces satisfactory wind and temperature fields during the daytime. In comparison with observations, the agreement indices for surface wind and temperature during daytime reach about 0.6 and 0.95, respectively, while the agreement indices drop to 0.4 at night. In the vertical, profiles of modelled wind and temperature generally agree with tethersonde data collected on 17 and 19 July. The study demonstrates that in late afternoon, the model does not capture the advection of an elevated warm layer which originated from land surfaces outside of the inner grid. Mixed layer depth (MLD) is calculated from the model output of the turbulent kinetic energy field. Comparison of MLD results with observations shows that the method generates a reliable MLD during the daytime, and that accurate estimates of MLD near the coast require the correct simulation of wind conditions over the sea. The study has shown that for a complex coastal environment like the LFV, a reliable modelling study depends not only on local surface fluxes but also on elevated layers transported from remote land surfaces. This dependence is especially important when local forcings are weak, for example during late afternoon and at night.

  2. Physical modelling of flow and dispersion over complex terrain

    Science.gov (United States)

    Cermak, J. E.

    1984-09-01

    Atmospheric motion and dispersion over topography characterized by irregular (or regular) hill-valley or mountain-valley distributions are strongly dependent upon three general sets of variables. These are variables that describe topographic geometry, synoptic-scale winds and surface-air temperature distributions. In addition, pollutant concentration distributions also depend upon location and physical characteristics of the pollutant source. Overall fluid-flow complexity and variability from site to site have stimulated the development and use of physical modelling for determination of flow and dispersion in many wind-engineering applications. Models with length scales as small as 1:12,000 have been placed in boundary-layer wind tunnels to study flows in which forced convection by synoptic winds is of primary significance. Flows driven primarily by forces arising from temperature differences (gravitational or free convection) have been investigated by small-scale physical models placed in an isolated space (gravitational convection chamber). Similarity criteria and facilities for both forced and gravitational-convection flow studies are discussed. Forced-convection modelling is illustrated by application to dispersion of air pollutants by unstable flow near a paper mill in the state of Maryland and by stable flow over Point Arguello, California. Gravitational-convection modelling is demonstrated by a study of drainage flow and pollutant transport from a proposed mining operation in the Rocky Mountains of Colorado. Other studies in which field data are available for comparison with model data are reviewed.

  3. Modeling the Propagation of Mobile Phone Virus under Complex Network

    Science.gov (United States)

    Yang, Wei; Wei, Xi-liang; Guo, Hao; An, Gang; Guo, Lei

    2014-01-01

    A mobile phone virus is a rogue program written to propagate from one phone to another, which can take control of a mobile device by exploiting its vulnerabilities. In this paper the propagation of mobile phone viruses is modelled to understand how particular factors affect it and to design effective containment strategies that suppress such viruses. Two different propagation models of mobile phone viruses under a complex network are proposed. One is intended to describe the propagation of the user-tricking virus, and the other describes the propagation of the vulnerability-exploiting virus. Based on traditional epidemic models, the characteristics of mobile phone viruses and the network topology structure are incorporated into our models. A detailed analysis of the propagation models is conducted. Through this analysis, the stable infection-free equilibrium point and the stability condition are derived. Finally, considering the network topology, numerical and simulation experiments are carried out. Results indicate that both models are correct and suitable for describing the spread of the two different mobile phone viruses, respectively. PMID:25133209
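The "stable infection-free equilibrium" result belongs to the standard epidemic-model machinery the paper builds on. A minimal SIR-type sketch under homogeneous mixing (not the paper's network-topology models; all rates are invented) shows the threshold behaviour: an outbreak takes off only when the ratio β/γ exceeds 1:

```python
def simulate_sir(beta, gamma, s0=0.999, i0=0.001, dt=0.01, steps=20000):
    """Euler integration of SIR fractions: susceptible, infected, recovered phones.
    beta is the contact/infection rate, gamma the recovery (patching) rate."""
    s, i, r = s0, i0, 0.0
    peak_i = i
    for _ in range(steps):
        new_inf = beta * s * i * dt    # newly infected phones this step
        new_rec = gamma * i * dt       # newly patched/recovered phones this step
        s -= new_inf
        i += new_inf - new_rec
        r += new_rec
        peak_i = max(peak_i, i)
    return i, peak_i

# beta/gamma = 2.5 > 1: the virus spreads widely before burning out
i_final_high, peak_high = simulate_sir(beta=0.5, gamma=0.2)
# beta/gamma = 0.5 < 1: the infection-free equilibrium is stable, the virus dies out
i_final_low, peak_low = simulate_sir(beta=0.1, gamma=0.2)
```

The paper's contribution is to derive the analogous stability condition when the homogeneous-mixing assumption is replaced by an explicit complex-network topology and by the two distinct infection mechanisms (user-tricking versus vulnerability-exploiting).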

  4. Animal Models of Lymphangioleiomyomatosis (LAM) and Tuberous Sclerosis Complex (TSC)

    Science.gov (United States)

    2010-01-01

    Animal models of lymphangioleiomyomatosis (LAM) and tuberous sclerosis complex (TSC) are highly desired to enable detailed investigation of the pathogenesis of these diseases. Multiple rat and mouse models have been generated in which a mutation similar to that occurring in TSC patients is present in an allele of Tsc1 or Tsc2. Unfortunately, these mice do not develop pathologic lesions that match those seen in LAM or TSC. However, these Tsc rodent models have been useful in confirming the two-hit model of tumor development in TSC, and in providing systems in which therapeutic trials (e.g., rapamycin) can be performed. In addition, conditional alleles of both Tsc1 and Tsc2 have provided the opportunity to target loss of these genes to specific tissues and organs, to probe the in vivo function of these genes, and to attempt to generate better models. Efforts to generate an authentic LAM model are impeded by a lack of understanding of the cell of origin of this process. However, ongoing studies provide hope that such a model will be generated in the coming years. PMID:20235887

  5. Complex Data Modeling and Computationally Intensive Statistical Methods

    CERN Document Server

    Mantovan, Pietro

    2010-01-01

    The last years have seen the advent and development of many devices able to record and store an ever-increasing amount of complex and high-dimensional data: 3D images generated by medical scanners or satellite remote sensing, DNA microarrays, real-time financial data, system control datasets. The analysis of these data poses new challenging problems and requires the development of novel statistical models and computational methods, fueling many fascinating and fast-growing research areas of modern statistics. The book offers a wide variety of statistical methods and is addressed to statisticians...

  6. Complex Dynamics of an Adnascent-Type Game Model

    Directory of Open Access Journals (Sweden)

    Baogui Xin

    2008-01-01

    The paper presents a nonlinear discrete game model for two oligopolistic firms whose products are adnascent. (In biology, the term adnascent has only one sense, "growing to or on something else," e.g., "moss is an adnascent plant." See Webster's Revised Unabridged Dictionary, published in 1913 by C. & G. Merriam Co., edited by Noah Porter.) The bifurcation of its Nash equilibrium is analyzed with the Schwarzian derivative and normal form theory. Its complex dynamics is demonstrated by means of the largest Lyapunov exponents, fractal dimensions, bifurcation diagrams, and phase portraits. Finally, bifurcation and chaos anticontrol of this system are studied.

  7. Importance and variability in processes relevant to environmental tritium ingestion dose models

    International Nuclear Information System (INIS)

    Raskob, W.; Barry, P.

    1997-01-01

    The Aiken List was devised in 1990 to help decide which transport processes should be investigated experimentally so as to derive the greatest improvement in the performance of environmental tritium assessment models. Each process was rated high, medium or low on each of two criteria. These were ''Importance'', which rated processes by how much each contributed to ingestion doses, and ''State of Modelling'', which rated the adequacy of the knowledge base on which models were built. Ratings, though unanimous, were nevertheless qualitative and subjective opinions. This paper describes how we have tried to quantify the ratings. To do this, we use, as measures of ''Importance'', the sensitivities of predicted ingestion doses to changes in values of parameters in mathematical descriptions of individual processes. Measures of ''State of Modelling'' were taken from a recently completed BIOMOVS study of HTO transport model performance and based either on how much predicted transport by individual processes differed amongst participating modellers or on the variety of different ways that modellers chose to describe individual processes. The tritium transport model UFOTRI was used, and because environmental transport of HTO varies according to the weather at and after release time, sensitivities were measured in a sample of all conditions likely to arise in central Europe. (Author)

  8. Bleomycin induces molecular changes directly relevant to idiopathic pulmonary fibrosis: a model for "active" disease.

    Science.gov (United States)

    Peng, Ruoqi; Sridhar, Sriram; Tyagi, Gaurav; Phillips, Jonathan E; Garrido, Rosario; Harris, Paul; Burns, Lisa; Renteria, Lorena; Woods, John; Chen, Leena; Allard, John; Ravindran, Palanikumar; Bitter, Hans; Liang, Zhenmin; Hogaboam, Cory M; Kitson, Chris; Budd, David C; Fine, Jay S; Bauer, Carla M T; Stevenson, Christopher S

    2013-01-01

    The preclinical model of bleomycin-induced lung fibrosis, used to investigate mechanisms related to idiopathic pulmonary fibrosis (IPF), has incorrectly predicted efficacy for several candidate compounds suggesting that it may be of limited value. As an attempt to improve the predictive nature of this model, integrative bioinformatic approaches were used to compare molecular alterations in the lungs of bleomycin-treated mice and patients with IPF. Using gene set enrichment analysis we show for the first time that genes differentially expressed during the fibrotic phase of the single challenge bleomycin model were significantly enriched in the expression profiles of IPF patients. The genes that contributed most to the enrichment were largely involved in mitosis, growth factor, and matrix signaling. Interestingly, these same mitotic processes were increased in the expression profiles of fibroblasts isolated from rapidly progressing, but not slowly progressing, IPF patients relative to control subjects. The data also indicated that TGFβ was not the sole mediator responsible for the changes observed in this model since the ALK-5 inhibitor SB525334 effectively attenuated some but not all of the fibrosis associated with this model. Although some would suggest that repetitive bleomycin injuries may more effectively model IPF-like changes, our data do not support this conclusion. Together, these data highlight that a single bleomycin instillation effectively replicates several of the specific pathogenic molecular changes associated with IPF, and may be best used as a model for patients with active disease.
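
    The gene set enrichment analysis mentioned above rests on a running-sum statistic. A minimal unweighted sketch follows (the gene names and gene set are hypothetical; the published method also weights hits by correlation and assesses significance by permutation):

```python
def enrichment_score(ranked_genes, gene_set):
    """Unweighted GSEA-style running-sum enrichment score: walk down
    the ranked list, stepping up on gene-set members and down
    otherwise; ES is the maximum deviation from zero."""
    hits = [g in gene_set for g in ranked_genes]
    n_hit = sum(hits)
    n_miss = len(ranked_genes) - n_hit
    up, down = 1.0 / n_hit, 1.0 / n_miss
    running, es = 0.0, 0.0
    for h in hits:
        running += up if h else -down
        if abs(running) > abs(es):
            es = running
    return es

# Hypothetical ranked list and "mitosis" gene set
ranked = ["g1", "g2", "g3", "g4", "g5", "g6"]
mitosis_set = {"g1", "g2", "g5"}
es = enrichment_score(ranked, mitosis_set)
```

A positive ES means set members cluster near the top of the ranking, i.e. the set is enriched in the differentially expressed profile.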

  9. Bleomycin induces molecular changes directly relevant to idiopathic pulmonary fibrosis: a model for "active" disease.

    Directory of Open Access Journals (Sweden)

    Ruoqi Peng

    Full Text Available The preclinical model of bleomycin-induced lung fibrosis, used to investigate mechanisms related to idiopathic pulmonary fibrosis (IPF), has incorrectly predicted efficacy for several candidate compounds suggesting that it may be of limited value. As an attempt to improve the predictive nature of this model, integrative bioinformatic approaches were used to compare molecular alterations in the lungs of bleomycin-treated mice and patients with IPF. Using gene set enrichment analysis we show for the first time that genes differentially expressed during the fibrotic phase of the single challenge bleomycin model were significantly enriched in the expression profiles of IPF patients. The genes that contributed most to the enrichment were largely involved in mitosis, growth factor, and matrix signaling. Interestingly, these same mitotic processes were increased in the expression profiles of fibroblasts isolated from rapidly progressing, but not slowly progressing, IPF patients relative to control subjects. The data also indicated that TGFβ was not the sole mediator responsible for the changes observed in this model since the ALK-5 inhibitor SB525334 effectively attenuated some but not all of the fibrosis associated with this model. Although some would suggest that repetitive bleomycin injuries may more effectively model IPF-like changes, our data do not support this conclusion. Together, these data highlight that a single bleomycin instillation effectively replicates several of the specific pathogenic molecular changes associated with IPF, and may be best used as a model for patients with active disease.

  10. An overview of structurally complex network-based modeling of public opinion in the “We the Media” era

    Science.gov (United States)

    Wang, Guanghui; Wang, Yufei; Liu, Yijun; Chi, Yuxue

    2018-05-01

    As the transmission of public opinion on the Internet in the “We the Media” era tends to be supraterritorial, concealed and complex, the traditional “point-to-surface” transmission of information has been transformed into “point-to-point” reciprocal transmission. A foundation for studies of the evolution of public opinion and its transmission on the Internet in the “We the Media” era can be laid by converting the massive amounts of fragmented information on public opinion that exists on “We the Media” platforms into structurally complex networks of information. This paper describes studies of structurally complex network-based modeling of public opinion on the Internet in the “We the Media” era from the perspective of the development and evolution of complex networks. The progress that has been made in research projects relevant to the structural modeling of public opinion on the Internet is comprehensively summarized. The review considers aspects such as regular grid-based modeling of the rules that describe the propagation of public opinion on the Internet in the “We the Media” era, social network modeling, dynamic network modeling, and supernetwork modeling. Moreover, an outlook for future studies that address complex network-based modeling of public opinion on the Internet is put forward as a summary from the perspective of modeling conducted using the techniques mentioned above.

  11. Inverse analyses of effective diffusion parameters relevant for a two-phase moisture model of cementitious materials

    DEFF Research Database (Denmark)

    Addassi, Mouadh; Johannesson, Björn; Wadsö, Lars

    2018-01-01

    Here we present an inverse analyses approach to determining the two-phase moisture transport properties relevant to concrete durability modeling. The proposed moisture transport model was based on a continuum approach with two truly separate equations for the liquid and gas phase being connected...... test, and, (iv) capillary suction test. Mass change over time, as obtained from the drying test, the two different cup test intervals and the capillary suction test, was used to obtain the effective diffusion parameters using the proposed inverse analyses approach. The moisture properties obtained....

  12. Boundary effects relevant for the string interpretation of σ-models

    International Nuclear Information System (INIS)

    Behrndt, K.; Dorn, H.

    1991-01-01

    At first a short discussion of the on/off boundary position dependence of the renormalization counter terms and β-functions for generalized σ-models on manifolds with boundary is given. Treating the energy-momentum tensor of such models as a two-dimensional distribution one can show that contrary to the first impression this does not imply any obstruction for the string interpretation of such models. The analysis is extended to the effect of dual loop corrections to string induced equations of motion, too. (orig.)

  13. Socio-Environmental Resilience and Complex Urban Systems Modeling

    Science.gov (United States)

    Deal, Brian; Petri, Aaron; Pan, Haozhi; Goldenberg, Romain; Kalantari, Zahra; Cvetkovic, Vladimir

    2017-04-01

    The increasing pressure of climate change has inspired two normative agendas: socio-technical transitions and socio-ecological resilience, both sharing a complex-systems epistemology (Gillard et al. 2016). Socio-technical solutions include a continuous, massive data gathering exercise now underway in urban places under the guise of developing a 'smart'(er) city. This has led to the creation of data-rich environments where large data sets have become central to monitoring and forming a response to anomalies. Some have argued that these kinds of data sets can help in planning for resilient cities (Norberg and Cumming 2008; Batty 2013). In this paper, we focus on a more nuanced, ecologically based, socio-environmental perspective of resilience planning that is often given less consideration. Here, we broadly discuss (and model) the tightly linked, mutually influenced social and biophysical subsystems that are critical for understanding urban resilience. We argue for the need to incorporate these subsystem linkages into the resilience planning lexicon through the integration of systems models and planning support systems. We make our case by first providing a context for urban resilience from a socio-ecological and planning perspective. We highlight the data needs for this type of resilience planning and compare them to currently collected data streams in various smart city efforts. This helps to define an approach for operationalizing socio-environmental resilience planning using robust systems models and planning support systems. For this, we draw from our experiences in coupling a spatio-temporal land use model (the Landuse Evolution and impact Assessment Model (LEAM)) with water quality and quantity models in Stockholm, Sweden. We describe the coupling of these systems models using a robust Planning Support System (PSS) structural framework. 
We use the coupled model simulations and PSS to analyze the connection between urban land use transformation (social) and water

  14. Mechanisms of Bone Metastasis from Breast Cancer Using a Clinically Relevant Model

    National Research Council Canada - National Science Library

    Anderson, Robin

    2001-01-01

    .... We have developed a murine model of breast cancer that actively mimics the human disease. After implantation of tumor cells into the mammary gland, a primary tumour develops and subsequently metastasises to the lymph nodes, lung and bone...

  15. A Chemically Relevant Model for Teaching the Second Law of Thermodynamics.

    Science.gov (United States)

    Williamson, Bryce E.; Morikawa, Tetsuo

    2002-01-01

    Introduces a chemical model illustrating aspects of the second law of thermodynamics, explaining concepts such as reversibility, path dependence, and extrapolation in terms of electrochemistry and calorimetry. Presents a thought experiment using an ideal galvanic electrochemical cell. (YDS)

  16. Relevance of workplace social mixing during influenza pandemics: an experimental modelling study of workplace cultures.

    Science.gov (United States)

    Timpka, T; Eriksson, H; Holm, E; Strömgren, M; Ekberg, J; Spreco, A; Dahlström, Ö

    2016-07-01

    Workplaces are one of the most important regular meeting places in society. The aim of this study was to use simulation experiments to examine the impact of different workplace cultures on influenza dissemination during pandemics. The impact is investigated by experiments with defined social-mixing patterns at workplaces using semi-virtual models based on authentic sociodemographic and geographical data from a North European community (population 136 000). A simulated pandemic outbreak was found to affect 33% of the total population in the community with the reference academic-creative workplace culture; virus transmission at the workplace accounted for 10·6% of the cases. A model with a prevailing industrial-administrative workplace culture generated 11% lower incidence than the reference model, while the model with a self-employed workplace culture (also corresponding to a hypothetical scenario with all workplaces closed) produced 20% fewer cases. The model representing an academic-creative workplace culture with restricted workplace interaction generated 12% lower cumulative incidence compared to the reference model. The results display important theoretical associations between workplace social-mixing cultures and community-level incidence rates during influenza pandemics. Social interaction patterns at workplaces should be taken into consideration when analysing virus transmission patterns during influenza pandemics.
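
    The qualitative effect reported, lower community incidence when workplace transmission is reduced or removed, can be illustrated with a deterministic SIR model in which the transmission rate is split into a workplace component and an "all other contacts" component. The rates below are assumed illustrative values, not parameters of the semi-virtual model:

```python
def sir_attack_rate(beta_work, beta_other, gamma=0.2, days=500, n=136_000, i0=10):
    """Deterministic daily-step SIR; transmission is the sum of a
    workplace component and an 'all other contacts' component.
    Returns the cumulative attack rate (fraction ever infected)."""
    s, i = float(n - i0), float(i0)
    beta = beta_work + beta_other
    for _ in range(days):
        new_inf = beta * s * i / n   # new infections this day
        s -= new_inf
        i += new_inf - gamma * i     # recoveries leave the I pool
    return 1.0 - s / n

# Assumed illustrative per-day rates
base = sir_attack_rate(beta_work=0.03, beta_other=0.25)
closed = sir_attack_rate(beta_work=0.0, beta_other=0.25)  # workplaces "closed"
```

Removing the workplace term lowers the effective reproduction number and hence the final attack rate, the direction of the effect the study quantifies with far richer social-mixing structure.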

  17. Pharmacokinetic models relevant to toxicity and metabolism for uranium in humans and animals

    International Nuclear Information System (INIS)

    Wrenn, M.E.; Lipsztein, J.; Bertelli, L.

    1988-01-01

    The aim of this paper is to summarize pharmacokinetic models of uranium metabolism. Fortunately, others have recently reviewed metabolic models of all types, not just pharmacokinetic models. Their papers should be consulted for greater biological detail than is possible here. Improvements in the models since these other papers are noted. Models for assessing the biological consequences of exposure should account for the kinetics of intake by ingestion, inhalation, and injection, and the chemical form of uranium; predict the time dependent concentration in red blood cells, plasma, urine, kidney, bone and other organs (or compartments); and be adaptable to calculating these concentrations for varying regimens of intake. The biological parameters in the models come from metabolic data in humans and animals. Some of these parameters are reasonably well defined. For example, the cumulative urinary excretion at 24 hours post injection of soluble uranium in man is about 70%, the absorbed fraction for soluble uranium ingested by man in drinking water during normal dietary conditions is about 1%, and the half time in the mammalian kidney is several days. 17 refs., 8 figs
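
    The benchmark figures quoted above (about 70% cumulative urinary excretion at 24 h after injection, about 1% gut absorption from drinking water, a kidney half-time of several days) can anchor a minimal single-exponential sketch. This is a deliberate simplification of multi-compartment pharmacokinetic models; the kidney deposit fraction and the 3-day half-time are assumed values:

```python
import math

# Benchmark values quoted in the record (soluble uranium in humans)
F_URINE_24H = 0.70        # cumulative urinary excretion 24 h after injection
F_GI_ABSORBED = 0.01      # fraction absorbed when ingested in drinking water
KIDNEY_HALF_TIME_D = 3.0  # "several days": 3 d is an assumed value

def urine_cumulative(t_days, k_u):
    """Cumulative fraction excreted in urine by time t, assuming a
    single-exponential clearance of the fast plasma pool."""
    return 1.0 - math.exp(-k_u * t_days)

# Calibrate the urinary rate constant to the 24-hour benchmark ...
k_u = -math.log(1.0 - F_URINE_24H)  # per day
# ... then extrapolate to other times
frac_48h = urine_cumulative(2.0, k_u)

def kidney_retention(t_days, deposit=0.12):
    """First-order clearance of the kidney deposit; the initial
    deposit fraction 0.12 is hypothetical, for illustration only."""
    lam = math.log(2.0) / KIDNEY_HALF_TIME_D
    return deposit * math.exp(-lam * t_days)

# Systemic intake from 1 Bq ingested in drinking water
systemic_bq = 1.0 * F_GI_ABSORBED
```

The calibrated single-exponential reproduces the 24-hour benchmark by construction; real models in the papers reviewed here use several coupled compartments (plasma, red blood cells, kidney, bone) instead.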

  18. Relevance of behavioral and social models to the study of consumer energy decision making and behavior

    Energy Technology Data Exchange (ETDEWEB)

    Burns, B.A.

    1980-11-01

    This report reviews social and behavioral science models and techniques for their possible use in understanding and predicting consumer energy decision making and behaviors. A number of models and techniques have been developed that address different aspects of the decision process, use different theoretical bases and approaches, and have been aimed at different audiences. Three major areas of discussion were selected: (1) models of adaptation to social change, (2) decision making and choice, and (3) diffusion of innovation. Within these three areas, the contributions of psychologists, sociologists, economists, marketing researchers, and others were reviewed. Five primary components of the models were identified and compared. The components are: (1) situational characteristics, (2) product characteristics, (3) individual characteristics, (4) social influences, and (5) the interaction or decision rules. The explicit use of behavioral and social science models in energy decision-making and behavior studies has been limited. Examples are given of a small number of energy studies which applied and tested existing models in studying the adoption of energy conservation behaviors and technologies, and solar technology.

  19. Phase diagram, thermodynamic investigations, and modelling of systems relevant to lithium-ion batteries

    International Nuclear Information System (INIS)

    Fuertauer, Siegfried; Beutl, Alexander; Flanorfer, Hans; Henriques, David; Giel, Hans; Markus, Thorsten

    2017-01-01

    This article reports on two consecutive joint projects titled ''Experimental Thermodynamics and Phase Relations of New Electrode Materials for Lithium-Ion Batteries'', which were performed in the framework of the WenDeLIB 1473 priority program ''Materials with new Design for Lithium Ion Batteries''. Hundreds of samples were synthesized using experimental techniques specifically developed to deal with highly reactive lithium and lithium-containing compounds to generate electrochemical, phase diagram and crystal structure data in the Cu-Li, Li-Sn, Li-Sb, Cu-Li-Sn, Cu-Li-Sb and selected oxide systems. The thermochemical and phase diagram data were subsequently used to develop self-consistent thermodynamic descriptions of several binary systems. In the present contribution, the experimental techniques, working procedures, results and their relevance to the development of new electrode materials for lithium ion batteries are discussed and summarized. The collaboration between the three groups has resulted in more than fifteen (15) published articles during the six-year funding period.

  20. How pharmacist-patient communication determines pharmacy loyalty? Modeling relevant factors.

    Science.gov (United States)

    Patrícia Antunes, Liliana; Gomes, João José; Cavaco, Afonso Miguel

    2015-01-01

    Portuguese community pharmacies provide pharmaceutical services, such as therapeutic outcomes follow-up, supplemented by relevant point-of-care testing, that require continuity of provision to be effective. The aim was to identify factors of a technical and communicational nature that, during a patient interview, contribute to patients' loyalty. A cross-sectional descriptive study, with a purposive sample of community pharmacies providing pharmaceutical care, was conducted. Patient interviews were taped and transcribed verbatim. Duration, segments and utterances were identified and time-stamped using a previously validated coding scheme. To identify predictors of loyalty, logistic regression analyses were performed. From 59 interviews, participants' average age was 65.7 years and 42 (71.2%) were female; 45 (76.3%) interviews were classified as outcomes measurements and 14 (23.7%) as pharmaceutical consultations, with 33.2% of the patients booking a following appointment. The significant items explaining loyalty were associated with lifestyle and psychosocial exchange, the age of the patient, and the presence of all interview segments (i.e. a complete consultation). Contrary to common professional beliefs and practice orientation, it would appear that pharmacists' technical skills are not the essential factors that promote the patients' loyalty needed for continuity of care, at least not to the same extent as the social and lifestyle-related content of the exchange. Pharmaceutical care education should focus on relational skills as much as on medication-related competencies. Copyright © 2015 Elsevier Inc. All rights reserved.

  1. Phase diagram, thermodynamic investigations, and modelling of systems relevant to lithium-ion batteries

    Energy Technology Data Exchange (ETDEWEB)

    Fuertauer, Siegfried; Beutl, Alexander; Flanorfer, Hans [Vienna Univ. (Austria). Dept. of Inorganic Chemistry - Functional Materials; Li, Dajian; Cupid, Damian [Karlsruhe Institute of Technology (KIT), Eggenstein-Leopoldshafen (Germany). Inst. for Applied Materials - Applied Materials Physics (IAM-AWP); Henriques, David; Giel, Hans; Markus, Thorsten [Mannheim Univ. of Applied Sciences (Germany). Inst. for Thermo- and Fluiddynamics

    2017-11-15

    This article reports on two consecutive joint projects titled ''Experimental Thermodynamics and Phase Relations of New Electrode Materials for Lithium-Ion Batteries'', which were performed in the framework of the WenDeLIB 1473 priority program ''Materials with new Design for Lithium Ion Batteries''. Hundreds of samples were synthesized using experimental techniques specifically developed to deal with highly reactive lithium and lithium-containing compounds to generate electrochemical, phase diagram and crystal structure data in the Cu-Li, Li-Sn, Li-Sb, Cu-Li-Sn, Cu-Li-Sb and selected oxide systems. The thermochemical and phase diagram data were subsequently used to develop self-consistent thermodynamic descriptions of several binary systems. In the present contribution, the experimental techniques, working procedures, results and their relevance to the development of new electrode materials for lithium ion batteries are discussed and summarized. The collaboration between the three groups has resulted in more than fifteen (15) published articles during the six-year funding period.

  2. a Range Based Method for Complex Facade Modeling

    Science.gov (United States)

    Adami, A.; Fregonese, L.; Taffurelli, L.

    2011-09-01

    3d modelling of Architectural Heritage does not follow a very well-defined way, but it goes through different algorithms and digital form according to the shape complexity of the object, to the main goal of the representation and to the starting data. Even if the process starts from the same data, such as a pointcloud acquired by laser scanner, there are different possibilities to realize a digital model. In particular we can choose between two different attitudes: the mesh and the solid model. In the first case the complexity of architecture is represented by a dense net of triangular surfaces which approximates the real surface of the object. In the other -opposite- case the 3d digital model can be realized by the use of simple geometrical shapes, by the use of sweeping algorithm and the Boolean operations. Obviously these two models are not the same and each one is characterized by some peculiarities concerning the way of modelling (the choice of a particular triangulation algorithm or the quasi-automatic modelling by known shapes) and the final results (a more detailed and complex mesh versus an approximate and more simple solid model). Usually the expected final representation and the possibility of publishing lead to one way or the other. In this paper we want to suggest a semiautomatic process to build 3d digital models of the facades of complex architecture to be used for example in city models or in other large scale representations. This way of modelling guarantees also to obtain small files to be published on the web or to be transmitted. The modelling procedure starts from laser scanner data which can be processed in the well known way. Usually more than one scan is necessary to describe a complex architecture and to avoid some shadows on the facades. 
These have to be registered in a single reference system by the use of targets which are surveyed by topography and then to be filtered in order to obtain a well controlled and homogeneous point cloud of

  3. A RANGE BASED METHOD FOR COMPLEX FACADE MODELING

    Directory of Open Access Journals (Sweden)

    A. Adami

    2012-09-01

    Full Text Available 3d modelling of Architectural Heritage does not follow a very well-defined way, but it goes through different algorithms and digital form according to the shape complexity of the object, to the main goal of the representation and to the starting data. Even if the process starts from the same data, such as a pointcloud acquired by laser scanner, there are different possibilities to realize a digital model. In particular we can choose between two different attitudes: the mesh and the solid model. In the first case the complexity of architecture is represented by a dense net of triangular surfaces which approximates the real surface of the object. In the other -opposite- case the 3d digital model can be realized by the use of simple geometrical shapes, by the use of sweeping algorithm and the Boolean operations. Obviously these two models are not the same and each one is characterized by some peculiarities concerning the way of modelling (the choice of a particular triangulation algorithm or the quasi-automatic modelling by known shapes) and the final results (a more detailed and complex mesh versus an approximate and more simple solid model). Usually the expected final representation and the possibility of publishing lead to one way or the other. In this paper we want to suggest a semiautomatic process to build 3d digital models of the facades of complex architecture to be used for example in city models or in other large scale representations. This way of modelling guarantees also to obtain small files to be published on the web or to be transmitted. The modelling procedure starts from laser scanner data which can be processed in the well known way. Usually more than one scan is necessary to describe a complex architecture and to avoid some shadows on the facades. These have to be registered in a single reference system by the use of targets which are surveyed by topography and then to be filtered in order to obtain a well controlled and

  4. Sensitivity of the coastal tsunami simulation to the complexity of the 2011 Tohoku earthquake source model

    Science.gov (United States)

    Monnier, Angélique; Loevenbruck, Anne; Gailler, Audrey; Hébert, Hélène

    2016-04-01

    The 11 March 2011 Tohoku-Oki event, whether earthquake or tsunami, is exceptionally well documented. A wide range of onshore and offshore data has been recorded from seismic, geodetic, ocean-bottom pressure and sea-level sensors. Along with these numerous observations, advances in inversion techniques and computing facilities have led to many source studies. Inversion of rupture parameters such as slip distribution and rupture history makes it possible to estimate the complex coseismic seafloor deformation. From the numerous published seismic source studies, the most relevant coseismic source models are tested. Comparing the signals predicted using both static and kinematic ruptures against the offshore and coastal measurements helps determine which source model should be used to obtain the most consistent coastal tsunami simulations. This work is funded by the TANDEM project, reference ANR-11-RSNR-0023-01 of the French Programme Investissements d'Avenir (PIA 2014-2018).
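
    A common way to test candidate source models against gauge records, as described above, is to rank them by waveform misfit. A minimal RMS-misfit sketch, with a hypothetical gauge record and hypothetical model names:

```python
def rms_misfit(observed, predicted):
    """Root-mean-square misfit between an observed sea-level record
    and the waveform simulated from one candidate source model."""
    assert len(observed) == len(predicted)
    n = len(observed)
    return (sum((o - p) ** 2 for o, p in zip(observed, predicted)) / n) ** 0.5

def rank_sources(observed, simulations):
    """Order candidate source models (name -> waveform) by misfit,
    best-fitting first."""
    return sorted(simulations, key=lambda name: rms_misfit(observed, simulations[name]))

# Hypothetical gauge record and two hypothetical candidate sources
obs = [0.0, 0.5, 1.2, 0.8, 0.1]
sims = {
    "static_slip":    [0.0, 0.4, 1.0, 0.9, 0.2],
    "kinematic_slip": [0.0, 0.5, 1.1, 0.8, 0.1],
}
best = rank_sources(obs, sims)[0]
```

In practice the comparison spans many gauges and offshore sensors, but the ranking principle is the same.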

  5. A subsurface model of the beaver meadow complex

    Science.gov (United States)

    Nash, C.; Grant, G.; Flinchum, B. A.; Lancaster, J.; Holbrook, W. S.; Davis, L. G.; Lewis, S.

    2015-12-01

    Wet meadows are a vital component of arid and semi-arid environments. These valley spanning, seasonally inundated wetlands provide critical habitat and refugia for wildlife, and may potentially mediate catchment-scale hydrology in otherwise "water challenged" landscapes. In the last 150 years, these meadows have begun incising rapidly, causing the wetlands to drain and much of the ecological benefit to be lost. The mechanisms driving this incision are poorly understood, with proposed means ranging from cattle grazing to climate change, to the removal of beaver. There is considerable interest in identifying cost-effective strategies to restore the hydrologic and ecological conditions of these meadows at a meaningful scale, but effective process based restoration first requires a thorough understanding of the constructional history of these ubiquitous features. There is emerging evidence to suggest that the North American beaver may have had a considerable role in shaping this landscape through the building of dams. This "beaver meadow complex hypothesis" posits that as beaver dams filled with fine-grained sediments, they became large wet meadows on which new dams, and new complexes, were formed, thereby aggrading valley bottoms. A pioneering study done in Yellowstone indicated that 32-50% of the alluvial sediment was deposited in ponded environments. The observed aggradation rates were highly heterogeneous, suggesting spatial variability in the depositional process - all consistent with the beaver meadow complex hypothesis (Polvi and Wohl, 2012). To expand on this initial work, we have probed deeper into these meadow complexes using a combination of geophysical techniques, coring methods and numerical modeling to create a 3-dimensional representation of the subsurface environments. This imaging has given us a unique view into the patterns and processes responsible for the landforms, and may shed further light on the role of beaver in shaping these landscapes.

  6. Relevance of multiple spatial scales in habitat models: A case study with amphibians and grasshoppers

    Science.gov (United States)

    Altmoos, Michael; Henle, Klaus

    2010-11-01

    Habitat models for animal species are important tools in conservation planning. We assessed the need to consider several scales in a case study for three amphibian and two grasshopper species in the post-mining landscapes near Leipzig (Germany). The two species groups were selected because habitat analyses for grasshoppers are usually conducted on one scale only whereas amphibians are thought to depend on more than one spatial scale. First, we analysed how the preference to single habitat variables changed across nested scales. Most environmental variables were only significant for a habitat model on one or two scales, with the smallest scale being particularly important. On larger scales, other variables became significant, which cannot be recognized on lower scales. Similar preferences across scales occurred in only 13 out of 79 cases and in 3 out of 79 cases the preference and avoidance for the same variable were even reversed among scales. Second, we developed habitat models by using a logistic regression on every scale and for all combinations of scales and analysed how the quality of habitat models changed with the scales considered. To achieve a sufficient accuracy of the habitat models with a minimum number of variables, at least two scales were required for all species except for Bufo viridis, for which a single scale, the microscale, was sufficient. Only for the European tree frog (Hyla arborea) were at least three scales required. The results indicate that the quality of habitat models increases with the number of surveyed variables and with the number of scales, but costs increase too. Searching for simplifications in multi-scaled habitat models, we suggest that 2 or 3 scales should be a suitable trade-off when attempting to define a suitable microscale.
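
    The trade-off described, model quality improving with added scales while costs grow, is commonly formalized with an information criterion that penalizes extra parameters. A sketch with hypothetical log-likelihoods and parameter counts (not the study's data):

```python
def aic(log_likelihood, n_params):
    """Akaike information criterion: lower is better; each extra
    parameter must buy at least one unit of log-likelihood."""
    return 2 * n_params - 2 * log_likelihood

# Hypothetical fits of habitat models built from 1, 2 and 3 nested scales
candidates = {
    ("micro",):                 {"loglik": -120.0, "k": 4},
    ("micro", "meso"):          {"loglik": -104.0, "k": 8},
    ("micro", "meso", "macro"): {"loglik": -101.0, "k": 12},
}
best = min(candidates, key=lambda c: aic(candidates[c]["loglik"], candidates[c]["k"]))
```

Here the third scale improves the fit only marginally, so the two-scale model wins, mirroring the suggested 2-3 scale compromise.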

  7. Simple models for studying complex spatiotemporal patterns of animal behavior

    Science.gov (United States)

    Tyutyunov, Yuri V.; Titova, Lyudmila I.

    2017-06-01

    Minimal mathematical models able to explain complex patterns of animal behavior are essential parts of simulation systems describing large-scale spatiotemporal dynamics of trophic communities, particularly those with wide-ranging species, such as occur in pelagic environments. We present results obtained with three different modelling approaches: (i) an individual-based model of animal spatial behavior; (ii) a continuous taxis-diffusion-reaction system of partial differential equations; (iii) a 'hybrid' approach combining the individual-based algorithm of organism movements with explicit description of decay and diffusion of the movement stimuli. Though the models are based on extremely simple rules, they all allow description of spatial movements of animals in a predator-prey system within a closed habitat, reproducing some typical patterns of the pursuit-evasion behavior observed in natural populations. In all three models, at each spatial position the animal movements are determined by local conditions only, so the pattern of collective behavior emerges due to self-organization. The movement velocities of animals are proportional to the density gradients of specific cues emitted by individuals of the antagonistic species (pheromones, exometabolites or mechanical waves of the medium, e.g., sound). These cues play a role of taxis stimuli: prey attract predators, while predators repel prey. Depending on the nature and the properties of the movement stimulus we propose using either a simplified individual-based model, a continuous taxis pursuit-evasion system, or a little more detailed 'hybrid' approach that combines simulation of the individual movements with the continuous model describing diffusion and decay of the stimuli in an explicit way. These can be used to improve movement models for many species, including large marine predators.
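
    The individual-based pursuit-evasion idea can be sketched in a deliberately stripped-down form. Here movement follows the distance to the nearest antagonist on a one-dimensional ring, rather than the gradient of an emitted cue field that the paper's models use:

```python
def ring_delta(a, b, size):
    """Signed shortest displacement from a to b on a ring of `size` cells."""
    d = (b - a) % size
    return d if d <= size // 2 else d - size

def step(predators, prey, size):
    """One synchronous pursuit-evasion update: each predator steps one
    cell toward the nearest prey, each prey steps one cell away from
    the nearest predator (both decide from the current positions)."""
    def move(pos, targets, sign):
        nearest = min(targets, key=lambda t: abs(ring_delta(pos, t, size)))
        d = ring_delta(pos, nearest, size)
        if d == 0:
            return pos
        return (pos + sign * (1 if d > 0 else -1)) % size
    new_pred = [move(p, prey, +1) for p in predators]   # pursuit
    new_prey = [move(q, predators, -1) for q in prey]   # evasion
    return new_pred, new_prey

pred, py = [0], [3]
for _ in range(2):
    pred, py = step(pred, py, size=10)
```

With one predator and one prey the pursuit and evasion steps cancel, so the gap between them is preserved, a minimal analogue of sustained pursuit.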

  8. Sparse Estimation Using Bayesian Hierarchical Prior Modeling for Real and Complex Linear Models

    DEFF Research Database (Denmark)

    Pedersen, Niels Lovmand; Manchón, Carles Navarro; Badiu, Mihai Alin

    2015-01-01

    In sparse Bayesian learning (SBL), Gaussian scale mixtures (GSMs) have been used to model sparsity-inducing priors that realize a class of concave penalty functions for the regression task in real-valued signal models. Motivated by the relative scarcity of formal tools for SBL in complex...... error, and robustness in low and medium signal-to-noise ratio regimes....
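
    GSM-based priors in SBL are typically fitted with automatic relevance determination (ARD) fixed-point updates. A sketch for the special case of an orthonormal design, where the posterior decouples per coefficient; the measurements `z` and the noise variance are hypothetical:

```python
def sbl_orthonormal(z, sigma2=0.1, iters=50):
    """Tipping-style ARD fixed-point updates for a linear model with
    orthonormal design columns. Per coefficient i:
      mu_i      = z_i / (1 + sigma2 * alpha_i)
      Sigma_ii  = sigma2 / (1 + sigma2 * alpha_i)
      alpha_i  <- 1 / (mu_i**2 + Sigma_ii)
    where z_i is the correlation of column i with the data and alpha_i
    is the prior precision of coefficient i."""
    alphas = [1.0] * len(z)
    for _ in range(iters):
        mus, sigmas = [], []
        for zi, a in zip(z, alphas):
            denom = 1.0 + sigma2 * a
            mus.append(zi / denom)
            sigmas.append(sigma2 / denom)
        alphas = [1.0 / (m * m + s) for m, s in zip(mus, sigmas)]
    return mus, alphas

# One strong coefficient and two at noise level
mu, alpha = sbl_orthonormal([2.0, 0.05, 0.01])
```

Coefficients whose correlation falls below the noise level are driven to very large precision alpha and effectively pruned, which is the sparsity mechanism the abstract refers to.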

  9. Automated sensitivity analysis: New tools for modeling complex dynamic systems

    International Nuclear Information System (INIS)

    Pin, F.G.

    1987-01-01

    Sensitivity analysis is an established methodology used by researchers in almost every field to gain essential insight into design and modeling studies and into performance assessments of complex systems. Conventional sensitivity analysis methodologies, however, have not enjoyed the widespread use they deserve considering the wealth of information they can provide, partly because of their prohibitive cost or the large initial analytical investment they require. Automated systems have recently been developed at ORNL to eliminate these drawbacks. Compilers such as GRESS and EXAP now allow automatic and cost-effective calculation of sensitivities in FORTRAN computer codes. In this paper, these and other related tools are described, and their impact and applicability in the general areas of modeling, performance assessment and decision making for radioactive waste isolation problems are discussed.

  10. Complex system modelling and control through intelligent soft computations

    CERN Document Server

    Azar, Ahmad

    2015-01-01

    The book offers a snapshot of the theories and applications of soft computing in the area of complex systems modeling and control. It presents the most important findings discussed during the 5th International Conference on Modelling, Identification and Control, held in Cairo, from August 31-September 2, 2013. The book consists of twenty-nine selected contributions, which have been thoroughly reviewed and extended before their inclusion in the volume. The different chapters, written by active researchers in the field, report on both current theories and important applications of soft-computing. Besides providing the readers with soft-computing fundamentals, and soft-computing based inductive methodologies/algorithms, the book also discusses key industrial soft-computing applications, as well as multidisciplinary solutions developed for a variety of purposes, like windup control, waste management, security issues, biomedical applications and many others. It is a perfect reference guide for graduate students, r...

  11. A comparison of non-local electron transport models relevant to inertial confinement fusion

    Science.gov (United States)

    Sherlock, Mark; Brodrick, Jonathan; Ridgers, Christopher

    2017-10-01

    We compare the reduced non-local electron transport model developed by Schurtz et al. to Vlasov-Fokker-Planck simulations. Two new test cases are considered: the propagation of a heat wave through a high density region into a lower density gas, and a 1-dimensional hohlraum ablation problem. We find the reduced model reproduces the peak heat flux well in the ablation region but significantly over-predicts the coronal preheat. The suitability of the reduced model for computing non-local transport effects other than thermal conductivity is considered by comparing the computed distribution function to the Vlasov-Fokker-Planck distribution function. It is shown that even when the reduced model reproduces the correct heat flux, the distribution function is significantly different to the Vlasov-Fokker-Planck prediction. Two simple modifications are considered which improve agreement between models in the coronal region. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  12. Rheology and FTIR studies of model waxy crude oils with relevance to gelled pipeline restart

    Energy Technology Data Exchange (ETDEWEB)

    Magda, J.J.; Guimeraes, K.; Deo, M.D. [Utah Univ., Salt Lake City, UT (United States). Dept. of Chemical Engineering; Venkatesan, R.; Montesi, A. [Chevron Energy Technology Co., Houston, TX (United States)

    2008-07-01

    Gels composed of wax crystals may sometimes form when crude oils are transported in pipelines at low ambient temperatures. The gels may stop the pipe flow, making it difficult or even impossible to restart the flow without breaking the pipe. Rheology and FTIR techniques were used to study the problem and to characterize transparent model waxy crude oils in pipeline flow experiments. These model oils were formulated without any highly volatile components to enhance the reproducibility of the rheology tests. Results were presented for the time- and temperature-dependent rheology of the model waxy crude oils as obtained in linear oscillatory shear and in creep-recovery experiments. The model oils were shown to exhibit many of the rheological features reported for real crude oils, including three distinct apparent yield stresses: the static, dynamic, and elastic-limit yield stresses. It was concluded that of the three, the static yield stress value, particularly its time dependence, can best be used to predict the restart behaviour observed for the same gel in model pipelines.

  13. Does model performance improve with complexity? A case study with three hydrological models

    Science.gov (United States)

    Orth, Rene; Staudinger, Maria; Seneviratne, Sonia I.; Seibert, Jan; Zappa, Massimiliano

    2015-04-01

    In recent decades considerable progress has been made in climate model development. Following the massive increase in computational power, models became more sophisticated. At the same time, simple conceptual models have also advanced. In this study we validate and compare three hydrological models of different complexity to investigate whether their performance varies accordingly. For this purpose we use runoff and also soil moisture measurements, which allow a truly independent validation, from several sites across Switzerland. The models are calibrated in similar ways with the same runoff data. Our results show that the more complex models HBV and PREVAH outperform the simple water balance model (SWBM) for runoff but not for soil moisture. Furthermore, the most sophisticated model, PREVAH, shows added value compared to the HBV model only for soil moisture. Focusing on extreme events, we find generally improved performance of the SWBM during drought conditions and degraded agreement with observations during wet extremes. For the more complex models we find the opposite behavior, probably because they were primarily developed for prediction of runoff extremes. As expected given their complexity, HBV and PREVAH have more problems with over-fitting. All models show a tendency towards better performance at lower altitudes as opposed to (pre-)alpine sites. The results vary considerably across the investigated sites. In contrast, the different metrics we consider to estimate the agreement between models and observations lead to similar conclusions, indicating that the performance of the considered models is similar at different time scales as well as for anomalies and long-term means. We conclude that added complexity does not necessarily lead to improved performance of hydrological models, and that performance can vary greatly depending on the considered hydrological variable (e.g. runoff vs. soil moisture) or hydrological conditions (floods vs. droughts).
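The record does not name the agreement metrics used; as one hedged illustration, the Nash-Sutcliffe efficiency is a standard way to score simulated runoff or soil moisture against observations:

```python
import numpy as np

def nse(sim, obs):
    """Nash-Sutcliffe efficiency: 1 = perfect fit, 0 = no better than
    predicting the observed mean, negative = worse than the mean."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

obs = np.array([1.0, 2.0, 4.0, 3.0, 2.0])        # e.g. daily runoff (mm)
print(nse(obs, obs))                             # perfect model -> 1.0
print(nse(np.full_like(obs, obs.mean()), obs))   # mean benchmark -> 0.0
```

Comparing such scores per variable (runoff vs. soil moisture) is the kind of evaluation the study's conclusions rest on.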

  14. Accurate modeling and evaluation of microstructures in complex materials

    Science.gov (United States)

    Tahmasebi, Pejman

    2018-02-01

    Accurate characterization of heterogeneous materials is of great importance for different fields of science and engineering. Such a goal can be achieved through imaging. Acquiring three- or two-dimensional images under different conditions is not, however, always plausible. On the other hand, accurate characterization of complex and multiphase materials requires various digital images (I) under different conditions. An ensemble method is presented that can take one single (or a set of) I(s) and stochastically produce several similar models of the given disordered material. The method is based on successive calculation of a conditional probability by which the initial stochastic models are produced. Then, a graph formulation is utilized for removing unrealistic structures. For Is with highly connected microstructure and long-range features, a distance transform function is considered, which results in a new I that is more informative. Reproduction of the I is also considered through a histogram matching approach in an iterative framework. Such an iterative algorithm avoids reproduction of unrealistic structures. Furthermore, a multiscale approach, based on pyramid representation of the large Is, is presented that can produce materials with millions of pixels in a matter of seconds. Finally, the nonstationary systems, i.e. those for which the distribution of data varies spatially, are studied using two different methods. The method is tested on several complex and large examples of microstructures. The produced results are all in excellent agreement with the utilized Is and the similarities are quantified using various correlation functions.
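The histogram-matching step mentioned above has a standard CDF-based construction; the sketch below shows that generic construction only, not the paper's iterative, conditional-probability pipeline, and the test images are synthetic:

```python
import numpy as np

def match_histogram(image, reference):
    """Monotonically remap `image` grey levels so that its histogram
    matches that of `reference` (classical CDF matching)."""
    shape = image.shape
    src = image.ravel()
    # rank of each source pixel within the sorted source distribution
    ranks = np.argsort(np.argsort(src))
    quantiles = (ranks + 0.5) / src.size
    # read off the reference values at the corresponding quantiles
    matched = np.quantile(reference.ravel(), quantiles)
    return matched.reshape(shape)

rng = np.random.default_rng(2)
img = rng.normal(0.3, 0.1, (64, 64))    # stand-in for a stochastic model
ref = rng.beta(2, 5, (64, 64))          # stand-in for the utilized image I
out = match_histogram(img, ref)
```

Because the mapping is monotone, the spatial structure of the stochastic model is preserved while its grey-level distribution is pulled onto that of the reference image.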

  15. Green IT engineering concepts, models, complex systems architectures

    CERN Document Server

    Kondratenko, Yuriy; Kacprzyk, Janusz

    2017-01-01

    This volume provides a comprehensive state-of-the-art overview of a series of advanced trends and concepts that have recently been proposed in the area of green information technologies engineering, as well as of design and development methodologies for models and complex systems architectures and their intelligent components. The contributions included in the volume have their roots in the authors' presentations, and the vivid discussions that followed, at a series of workshops and seminars held within the international TEMPUS GreenCo project in the United Kingdom, Italy, Portugal, Sweden and Ukraine during 2013-2015, and at the 1st-5th Workshops on Green and Safe Computing (GreenSCom) held in Russia, Slovakia and Ukraine. The book presents a systematic exposition of research on principles, models, components and complex systems and a description of industry- and society-oriented aspects of green IT engineering. A chapter-oriented structure has been adopted for this book ...

  16. Expansion of the Kano model to identify relevant customer segments and functional requirements

    DEFF Research Database (Denmark)

    Atlason, Reynir Smari; Stefansson, Arnaldur Smari; Wietz, Miriam

    2017-01-01

    The Kano model of customer satisfaction has been widely used to analyse perceived needs of customers. The model provides product developers valuable information about if, and then how much, a given functional requirement (FR) will impact customer satisfaction if implemented within a product, system or service. A current limitation of the Kano model is that it does not allow developers to visualise which combined sets of FRs would provide the highest satisfaction between different customer segments. In this paper, a stepwise method to address this particular shortcoming is presented. First, a traditional Kano analysis is conducted for the different segments of interest. Second, for each FR, relationship functions are integrated between x=0 and x=1. Third, integrals are inserted into a combination matrix crossing segments and FRs, where FRs with the highest sum across the chosen segments ...
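The three steps can be sketched numerically. The relationship functions below (attractive, one-dimensional, and must-be curves) and the segment/FR assignment are invented for illustration; only the integrate-then-sum mechanics follows the method described:

```python
import numpy as np

# Hypothetical Kano relationship functions s(x): satisfaction as a function
# of the degree of implementation x in [0, 1].
def attractive(x): return np.expm1(2 * x) / np.expm1(2)   # delight grows late but fast
def one_dim(x):    return x                               # linear pay-off
def must_be(x):    return -np.exp(-4 * x)                 # absence hurts, presence is neutral

segments = ["students", "professionals"]
frs = ["FR1", "FR2", "FR3"]
curves = [[attractive, one_dim, must_be],                 # illustrative assignment of a
          [one_dim, must_be, attractive]]                 # curve to each (segment, FR)

x = np.linspace(0.0, 1.0, 2001)
# Step 2: integrate each relationship function over [0, 1] (mean of samples
# on a uniform grid approximates the integral); Step 3: assemble the
# segments x FRs combination matrix and sum down each column.
M = np.array([[f(x).mean() for f in row] for row in curves])
totals = M.sum(axis=0)
best = frs[int(np.argmax(totals))]
```

The FR with the largest column sum is the one whose implementation is expected to raise satisfaction most across the chosen segments.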

  17. Modelling methodology for engineering of complex sociotechnical systems

    CSIR Research Space (South Africa)

    Oosthuizen, R

    2014-10-01

    Full Text Available Different systems engineering techniques and approaches are applied to design and develop complex sociotechnical systems for complex problems. In a complex sociotechnical system cognitive and social humans use information technology to make sense...

  18. Exposure Modeling Tools and Databases for Consideration for Relevance to the Amended TSCA (ISES)

    Science.gov (United States)

    The Agency’s Office of Research and Development (ORD) has a number of ongoing exposure modeling tools and databases. These efforts are anticipated to be useful in supporting ongoing implementation of the amended Toxic Substances Control Act (TSCA). Under ORD’s Chemic...

  19. Hydrothermal germination models: Improving experimental efficiency by limiting data collection to the relevant hydrothermal range

    Science.gov (United States)

    Hydrothermal models used to predict germination response in the field are usually parameterized with data from laboratory experiments that examine the full range of germination response to temperature and water potential. Inclusion of low water potential and high and low-temperature treatments, how...

  20. ABOUT THE RELEVANCE AND METHODOLOGY ASPECTS OF TEACHING THE MATHEMATICAL MODELING TO PEDAGOGICAL STUDENTS

    Directory of Open Access Journals (Sweden)

    Y. A. Perminov

    2014-01-01

    Full Text Available The paper substantiates the need for profile training in mathematical modeling for pedagogical students, caused by the total penetration of mathematics into different sciences, including the humanities; the fast development of information communications technologies; and the growing importance of mathematical modeling, combining the informal scientific and formal mathematical languages with the unique opportunities of computer programming. The author singles out the reasons for mastering and using the mathematical apparatus by teachers in every discipline. Indeed, among all the modern mathematical methods and ideas, mathematical modeling retains its priority in all professional spheres. Therefore, the discipline of “Mathematical Modeling” can play an important role in integrating different components of specialist training in various profiles. By mastering the basics of mathematical modeling, students acquire skills of methodological thinking; learn the principles of analysis, synthesis, and generalization of ideas and methods in different disciplines and scientific spheres; and achieve general cultural competences. In conclusion, the author recommends incorporating “Methods of Profile Training in Mathematical Modeling” into the pedagogical magistracy curricula.

  1. Development of two phase turbulent mixing model for subchannel analysis relevant to BWR

    International Nuclear Information System (INIS)

    Sharma, M.P.; Nayak, A.K.; Kannan, Umasankari

    2014-01-01

    A two-phase flow model is presented which predicts both liquid- and gas-phase turbulent mixing rates between adjacent subchannels of reactor rod bundles. The model applies to the slug-churn flow regime, which is dominant compared with the bubbly and annular flow regimes since the turbulent mixing rate is highest in slug-churn flow. In this paper, we define new dimensionless parameters for two-phase turbulent mixing: a liquid mixing number and a gas mixing number. The liquid mixing number is a function of the mixture Reynolds number, whereas the gas mixing number is a function of both the mixture Reynolds number and the volumetric fraction of gas. The effects of pressure and subchannel geometry are also included in the model. The present model has been tested against low-pressure, low-temperature air-water and high-pressure, high-temperature steam-water experimental data and shows good agreement with the available data. (author)

  2. Role of Stat3 in Skin Carcinogenesis: Insights Gained from Relevant Mouse Models

    International Nuclear Information System (INIS)

    Macias, E.; Rao, D.; DiGiovanni, J.

    2013-01-01

    Signal transducer and activator of transcription 3 (Stat3) is a cytoplasmic protein that is activated in response to cytokines and growth factors and acts as a transcription factor. Stat3 plays critical roles in various biological activities including cell proliferation, migration, and survival. Studies using keratinocyte-specific Stat3-deficient mice have revealed that Stat3 plays an important role in skin homeostasis including keratinocyte migration, wound healing, and hair follicle growth. Use of both constitutive and inducible keratinocyte-specific Stat3-deficient mouse models has demonstrated that Stat3 is required for both the initiation and promotion stages of multistage skin carcinogenesis. Further studies using a transgenic mouse model with a gain-of-function mutant of Stat3 (Stat3C) expressed in the basal layer of the epidermis revealed a novel role for Stat3 in skin tumor progression. Studies using similar Stat3-deficient and gain-of-function mouse models have indicated similar roles in ultraviolet B (UVB) radiation-mediated skin carcinogenesis. This paper summarizes the use of these various mouse models for studying the role and underlying mechanisms of Stat3 function in skin carcinogenesis. Given its significant role throughout the skin carcinogenesis process, Stat3 is an attractive target for skin cancer prevention and treatment.

  3. Quantifying the relevance of adaptive thermal comfort models in moderate thermal climate zones

    Energy Technology Data Exchange (ETDEWEB)

    Hoof, Joost van; Hensen, Jan L.M. [Faculty of Architecture, Building and Planning, Technische Universiteit Eindhoven, Vertigo 6.18, P.O. Box 513, 5600 MB Eindhoven (Netherlands)

    2007-01-15

    Standards governing thermal comfort evaluation are on a constant cycle of revision and public review. One of the main topics discussed in the latest round was the introduction of an adaptive thermal comfort model, which now forms an optional part of ASHRAE Standard 55. Adaptive thermal comfort guidelines are also emerging at the national level, for instance in the Netherlands. This paper discusses two implementations of the adaptive comfort model in terms of usability and energy use for moderate maritime climate zones by means of a literature study, a case study comprising temperature measurements, and building performance simulation. It is concluded that for moderate climate zones the adaptive model is only applicable during summer months, and can reduce energy use in naturally conditioned buildings. However, the adaptive thermal comfort model has very limited application potential for such climates. Additionally, we suggest a temperature parameter with a gradual course to replace the mean monthly outdoor air temperature, to avoid step changes in optimum comfort temperatures. (author)

  4. Quantifying the relevance of adaptive thermal comfort models in moderate thermal climate zones

    NARCIS (Netherlands)

    Hoof, van J.; Hensen, J.L.M.

    2007-01-01

    Standards governing thermal comfort evaluation are on a constant cycle of revision and public review. One of the main topics being discussed in the latest round was the introduction of an adaptive thermal comfort model, which now forms an optional part of ASHRAE Standard 55. Also on a national

  5. Optimization of a Clinically Relevant Model of White Matter Stroke in Mice: Histological and Functional Evidences

    Science.gov (United States)

    Ahmad, Abdullah S.; Satriotomo, Irawan; Fazal, Jawad A.; Nadeau, Stephen E.; Doré, Sylvain

    2015-01-01

    Background and Purpose: White matter (WM) injury during stroke increases the risk of disability and worsens the prognosis of post-stroke rehabilitation. However, modeling of WM loss in rodents has proven to be challenging. Methods: We report improved WM injury models in male C57BL/6 mice. Mice were given either endothelin-1 (ET-1) or L-N5-(1-iminoethyl)ornithine (L-NIO) into the periventricular white matter (PVWM), the corpus callosum (CC), or the posterior limb of the internal capsule (PLIC). Anatomical and functional outcomes were quantified on day 7 post injection. Results: Injection of ET-1 or L-NIO caused a small focal lesion at the injection site in the PVWM. No significant motor function deficits were observed in the PVWM lesion model. We next targeted the PLIC by using single or double injections of L-NIO and found that this strategy induced small focal infarction. Interestingly, injection of L-NIO in the PLIC also resulted in gliosis and significant motor function deficits. Conclusions: By employing different agents, doses, and locations, this study shows the feasibility of inducing brain WM injury accompanied by functional deficits in mice. Selective targeting of the injury location, behavioral testing, and the agents chosen to induce WM injury are all keys to successfully developing a mouse model and subsequently testing therapeutic interventions against WM injury. PMID:27512724

  6. Statistical modeling of static strengths of nuclear graphites with relevance to structural design

    International Nuclear Information System (INIS)

    Arai, Taketoshi

    1992-02-01

    Use of graphite materials for structural members poses the problem of how to take into account the statistical properties of static strength, especially tensile fracture stresses, in component structural design. The present study comprises comprehensive examinations of the statistical database and modeling of nuclear graphites. First, the report provides individual samples and their analyses on strengths of IG-110 and PGX graphites for HTTR components. Statistical characteristics of other HTGR graphites are also exemplified from the literature. Most statistical distributions of individual samples are found to be approximately normal. The goodness of fit to normal distributions is more satisfactory with larger sample sizes. Molded and extruded graphites, however, possess a variety of statistical properties depending on samples from different within-log locations and/or different orientations. Second, the previous statistical models, including the Weibull theory, are assessed from the viewpoint of applicability to design procedures. This leads to the conclusion that the Weibull theory and its modified versions are satisfactory only for limited parts of tensile fracture behavior; they are not consistent with the whole set of observations. Only normal statistics are justifiable as practical approaches to discuss specified minimum ultimate strengths as statistical confidence limits for individual samples. Third, the assessment of various statistical models emphasizes the need to develop advanced analytical ones which involve modeling of the microstructural features of actual graphite materials. Improvements of other structural design methodologies are also presented. (author)
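Under a normal model, a specified minimum ultimate strength can be read off as a one-sided lower limit of the fitted distribution. The strength sample and the k-factor below are hypothetical; a proper design value would use tabulated one-sided tolerance factors for the actual sample size and confidence level:

```python
import numpy as np

# Hypothetical tensile-strength sample (MPa) for a fine-grained graphite.
strengths = np.array([24.1, 25.3, 23.8, 26.0, 24.7,
                      25.5, 23.9, 24.9, 25.1, 24.4])

mean = strengths.mean()
s = strengths.std(ddof=1)          # sample standard deviation

# Specified minimum ultimate strength as a one-sided lower limit under a
# normal model: mean - k*s, with k = 2.326 excluding roughly the lowest 1%
# of the fitted distribution (a simplification of a tolerance-factor table).
k = 2.326
s_min = mean - k * s
```

The fit-to-normal check the report describes (goodness of fit improving with sample size) is what justifies reading design limits off the fitted normal in this way.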

  7. Radiation-Induced Leukemia at Doses Relevant to Radiation Therapy: Modeling Mechanisms and Estimating Risks

    Science.gov (United States)

    Shuryak, Igor; Sachs, Rainer K.; Hlatky, Lynn; Little, Mark P.; Hahnfeldt, Philip; Brenner, David J.

    2006-01-01

    Because many cancer patients are diagnosed earlier and live longer than in the past, second cancers induced by radiation therapy have become a clinically significant issue. An earlier biologically based model that was designed to estimate risks of high-dose radiation induced solid cancers included initiation of stem cells to a premalignant state, inactivation of stem cells at high radiation doses, and proliferation of stem cells during cellular repopulation after inactivation. This earlier model predicted the risks of solid tumors induced by radiation therapy but overestimated the corresponding leukemia risks. Methods: To extend the model to radiation-induced leukemias, we analyzed in addition to cellular initiation, inactivation, and proliferation a repopulation mechanism specific to the hematopoietic system: long-range migration through the blood stream of hematopoietic stem cells (HSCs) from distant locations. Parameters for the model were derived from HSC biologic data in the literature and from leukemia risks among atomic bomb survivors who were subjected to much lower radiation doses. Results: Proliferating HSCs that migrate from sites distant from the high-dose region include few preleukemic HSCs, thus decreasing the high-dose leukemia risk. The extended model for leukemia provides risk estimates that are consistent with epidemiologic data for leukemia risk associated with radiation therapy over a wide dose range. For example, when applied to an earlier case-control study of 110,000 women undergoing radiotherapy for uterine cancer, the model predicted an excess relative risk (ERR) of 1.9 for leukemia among women who received a large inhomogeneous fractionated external beam dose to the bone marrow (mean = 14.9 Gy), consistent with the measured ERR (2.0, 95% confidence interval [CI] = 0.2 to 6.4; from 3.6 cases expected and 11 cases observed). 
As a corresponding example for brachytherapy, the predicted ERR of 0.80 among women who received an inhomogeneous low
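The counts quoted for the uterine-cancer cohort allow a quick sanity check of the reported excess relative risk, using the standard definition ERR = observed/expected - 1:

```python
# Leukemia cases after radiotherapy for uterine cancer, as quoted above.
observed, expected = 11, 3.6
err = observed / expected - 1.0        # excess relative risk
print(round(err, 2))                   # -> 2.06, consistent with the reported ERR of 2.0
```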

  8. Identifying diagnostically-relevant resting state brain functional connectivity in the ventral posterior complex via genetic data mining in autism spectrum disorder.

    Science.gov (United States)

    Baldwin, Philip R; Curtis, Kaylah N; Patriquin, Michelle A; Wolf, Varina; Viswanath, Humsini; Shaw, Chad; Sakai, Yasunari; Salas, Ramiro

    2016-05-01

    Exome sequencing and copy number variation analyses continue to provide novel insight into the biological bases of autism spectrum disorder (ASD). The growing speed at which massive genetic data are produced causes serious lags in analysis and interpretation of the data. Thus, there is a need to develop systematic genetic data-mining processes that facilitate efficient analysis of large datasets. We report a new genetic data-mining system, ProcessGeneLists, which integrated a list of ASD-related genes with currently available resources in gene expression and functional connectivity of the human brain. Our data-mining program successfully identified three primary regions of interest (ROIs) in the mouse brain: the inferior colliculus, the ventral posterior complex of the thalamus (VPC), and the parafascicular nucleus (PFn). To understand their pathogenic relevance in ASD, we examined the resting-state functional connectivity (RSFC) of the homologous ROIs in the human brain with other brain regions previously implicated in the neuropsychiatric features of ASD. Among them, the RSFC of the VPC with the medial frontal gyrus (MFG) was significantly more anticorrelated, whereas the RSFC of the PFn with the globus pallidus was significantly increased, in children with ASD compared with healthy children. Moreover, greater values of RSFC between VPC and MFG were correlated with severity index and repetitive behaviors in children with ASD. No significant RSFC differences were detected in adults with ASD. Together, these data demonstrate the utility of our data-mining program through identifying the aberrant connectivity of thalamo-cortical circuits in children with ASD. Autism Res 2016, 9: 553-562. © 2015 International Society for Autism Research, Wiley Periodicals, Inc.

  9. Analysis and evaluation of the ASTEC model basis. Relevant experiments. Technical report

    International Nuclear Information System (INIS)

    Koppers, V.; Koch, M.K.

    2015-12-01

    The present report is a Technical Report within the research project "ASMO", funded by the German Federal Ministry of Economics and Technology (BMWi 1501433) and carried out at the Reactor Simulation and Safety Group, Chair of Energy Systems and Energy Economics (LEE) at the Ruhr-Universitaet Bochum (RUB). The project deals with the analysis of the model basis of the Accident Source Term Evaluation Code (ASTEC). This report focuses on the containment part of ASTEC (CPA) and presents the simulation results of the experiment TH20.7. The experimental series TH20 was performed in the test vessel THAI (Thermal-hydraulics, Aerosols, Iodine) to investigate the erosion of a helium layer by a blower-generated air jet, with helium used as a substitute for hydrogen. In the experiment TH20.7 a light-gas layer is established and eroded by a momentum-driven jet. The simulation of momentum-driven jets is challenging for CPA because there is no model for the kinetic momentum transfer. The subject of this report is the analysis of the capability of the code, with its current model basis, to model momentum-driven phenomena. The jet is modelled using virtual ventilation systems, so-called FAN-Systems, which are adapted to the erosion velocity. The simulation results are compared to the experimental results and to a basic calculation using FAN-Systems without any adjustments. For further improvement, different variation calculations are performed. First, the vertical nodalization is refined. Subsequently, the resistance coefficients are adjusted to support the jet flow pattern and the number of FAN-Systems is reduced. The analysis shows that the simulation of a momentum-driven light-gas layer erosion is possible using adjusted FAN-Systems. A finely selected vertical nodalization and adaptation of the resistance coefficients improve the simulation results.

  10. A modeling process to understand complex system architectures

    Science.gov (United States)

    Robinson, Santiago Balestrini

    2009-12-01

    In recent decades, several tools have been developed by the armed forces, and their contractors, to test the capability of a force. These campaign-level analysis tools, often characterized as constructive simulations, are generally expensive to create and execute, and at best they are extremely difficult to verify and validate. This central observation, that analysts are relying more and more on constructive simulations to predict the performance of future networks of systems, leads to the two central objectives of this thesis: (1) to enable the quantitative comparison of architectures in terms of their ability to satisfy a capability without resorting to constructive simulations, and (2) when constructive simulations must be created, to quantitatively determine how to spend the modeling effort amongst the different system classes. The first objective led to Hypothesis A, the first main hypothesis, which states that by studying the relationships between the entities that compose an architecture, one can infer how well it will perform a given capability. The method used to test the hypothesis is based on two assumptions: (1) the capability can be defined as a cycle of functions, and (2) it must be possible to estimate the probability that a function-based relationship occurs between any two types of entities. If these two requirements are met, then by creating random functional networks, different architectures can be compared in terms of their ability to satisfy a capability. In order to test this hypothesis, a novel process for creating representative functional networks of large-scale system architectures was developed. The process, named Digraph Modeling for Architectures (DiMA), was tested by comparing its results to those of complex constructive simulations. Results indicate that if the inputs assigned to DiMA are correct (in the tests they were based on time-averaged data obtained from the ABM), DiMA is able to identify which of any two
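The random-functional-network idea can be sketched in a toy form: given per-pair probabilities that a function-based relationship exists between entity types, Monte Carlo sampling estimates how often an architecture closes a capability cycle. The entity classes, probabilities, and the three-stage chain below are invented for illustration; this is not the DiMA implementation:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical entity classes (with instance counts) and the probability
# that a functional link exists between instances of each ordered pair,
# forming a sensor -> decider -> effector capability chain.
classes = {"sensor": 4, "decider": 2, "effector": 3}
p_link = {("sensor", "decider"): 0.6, ("decider", "effector"): 0.7}

def sample_network():
    """Draw one random functional digraph; return True if at least one
    complete sensor -> decider -> effector chain exists."""
    sd = rng.random((classes["sensor"], classes["decider"])) < p_link[("sensor", "decider")]
    de = rng.random((classes["decider"], classes["effector"])) < p_link[("decider", "effector")]
    # a chain exists iff some decider has both an incoming and an outgoing link
    return bool(np.any(sd.any(axis=0) & de.any(axis=1)))

# Monte Carlo estimate of the probability that the architecture can close
# the capability cycle -- the quantity used to compare architectures.
trials = 10_000
p_capable = sum(sample_network() for _ in range(trials)) / trials
```

Two candidate architectures can then be compared by their `p_capable` values without running a constructive simulation.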

  11. Green roof rainfall-runoff modelling: is the comparison between conceptual and physically based approaches relevant?

    Science.gov (United States)

    Versini, Pierre-Antoine; Tchiguirinskaia, Ioulia; Schertzer, Daniel

    2017-04-01

    Green roofs are commonly considered efficient tools to mitigate urban runoff, as they can store precipitation and consequently provide retention and detention performance. Designed as a compromise between water-holding capacity, weight and hydraulic conductivity, their substrate is usually an artificial medium that differs significantly from a traditional soil. To assess the hydrological performance of green roofs, many models have been developed. Classified into two categories (conceptual and physically based), they are usually applied to reproduce the discharge of a particular monitored green roof considered as homogeneous. Although the resulting simulations can be satisfactory, the question of the robustness and consistency of the calibrated parameters is often not addressed. Here, a modelling framework has been developed to assess the efficiency and robustness of both modelling approaches (conceptual and physically based) in reproducing green roof hydrological behaviour. The SWMM and VS2DT models have been used for this purpose. This work also benefits from an experimental setup where several green roofs, differentiated by their substrate thickness and vegetation cover, are monitored. Based on the data collected for several rainfall events, it has been studied how the calibrated parameters are effectively linked to their physical properties and how they can vary from one green roof configuration to another. Although both models reproduce the observed discharges correctly in most cases, their calibrated parameters exhibit a high inconsistency. For a given green roof configuration, these parameters can vary significantly from one rainfall event to another, even though they are supposed to be linked to the green roof characteristics (roughness and residual moisture content, for instance). They can also differ from one green roof configuration to another although the implemented substrate is the same. Finally, it appears very difficult to find any

  12. Isolated working heart: description of models relevant to radioisotopic and pharmacological assessments

    International Nuclear Information System (INIS)

    Depre, Christophe

    1998-01-01

Isolated heart preparations are used to study physiological and metabolic parameters of the heart independently of its environment. Several preparations of the isolated perfused heart are currently used, mainly the retrograde perfusion system and the working heart model. Both models allow investigation of the metabolic regulation of the heart under various physiological conditions (changes in workload, hormonal influences, substrate competition). These systems can also reproduce different pathological conditions, such as ischemia, reperfusion and hypoxia. Quantitation of metabolic activity can be performed with specific radioactive tracers. Finally, the effects of various drugs on cardiac performance and resistance to ischemia can be studied as well. Heart perfusion has also proved an efficient method for determining the tracer/tracee relation of radioisotopic analogues used in Positron Emission Tomography.

  13. Retrieval-time properties of the Little-Hopfield model and their physiological relevance

    International Nuclear Information System (INIS)

    Risau-Gusman, Sebastian; Idiart, Marco A.P.

    2005-01-01

We perform an extensive numerical investigation of the retrieval dynamics of the synchronous Hopfield model, also known as the Little-Hopfield model, up to sizes of 2^18 neurons. Our results correct and extend much of the early simulation work on the model. We find that the average convergence time has a power-law behavior for a wide range of system sizes, whose exponent depends both on the network loading and on the initial overlap with the memory to be retrieved. Surprisingly, we also find that the variance of the convergence time grows as fast as its average, making it a non-self-averaging quantity. Based on the simulation data we differentiate between two definitions of memory retrieval time: one that is mathematically strict, τ_c, the number of updates needed to reach the attractor whose properties we just described; and a second corresponding to the time τ_η at which the network stabilizes within a tolerance threshold η, such that the difference between two consecutive overlaps with a stored memory is smaller than η. We show that the scaling relationships between τ_c or τ_η and typical network parameters, such as the memory load α or the network size N, differ greatly, with τ_η being relatively insensitive to system size and loading. We propose τ_η as the physiologically realistic measure of the typical attractor network response.
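The synchronous (parallel) update rule and the strict retrieval time τ_c described above can be sketched in a few lines. This is an illustrative toy, not the paper's setup: the network size, loading, initial overlap and seed are arbitrary choices, and 2-cycles (a known feature of parallel Hopfield dynamics) are treated as attractors.

```python
import numpy as np

def hopfield_retrieval_time(N=256, alpha=0.05, m0=0.8, seed=0):
    """Count synchronous (Little-Hopfield) updates, tau_c, until a fixed
    point or a 2-cycle is reached, starting at overlap ~m0 with pattern 0."""
    rng = np.random.default_rng(seed)
    P = max(1, int(alpha * N))                 # number of stored memories
    xi = rng.choice([-1, 1], size=(P, N))      # random binary patterns
    W = (xi.T @ xi) / N                        # Hebbian couplings
    np.fill_diagonal(W, 0.0)

    # initial state: flip a fraction of pattern 0 so the overlap is ~m0
    s = xi[0].copy()
    flips = rng.choice(N, size=int(N * (1 - m0) / 2), replace=False)
    s[flips] *= -1

    prev = None
    for t in range(1, 10_000):
        s_new = np.where(W @ s >= 0, 1, -1)    # parallel update of all neurons
        if np.array_equal(s_new, s) or (prev is not None
                                        and np.array_equal(s_new, prev)):
            return t, float(s_new @ xi[0]) / N  # tau_c and final overlap
        prev, s = s, s_new
    return None, float(s @ xi[0]) / N

tau_c, overlap = hopfield_retrieval_time()
```

Below the storage capacity (here α = 0.05), retrieval from a large initial overlap converges in a handful of parallel sweeps, with the final overlap close to 1.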

  14. Different concepts and models of information for family-relevant genetic findings: comparison and ethical analysis.

    Science.gov (United States)

    Lenk, Christian; Frommeld, Debora

    2015-08-01

Genetic predispositions often concern not only individual persons, but also other family members. Advances in the development of genetic tests are leading to a growing number of genetic diagnoses in medical practice and to an increasing importance of genetic counseling. In the present article, a number of ethical foundations and preconditions for this issue are discussed. Four different models for the handling of genetic information are presented and analyzed, including a discussion of their practical implications. The models range from a strictly autonomous position, through self-governed arrangements in genetic counseling practice, to the involvement of official bodies and committees. The different models contain a number of elements that seem very useful, from an ethical perspective, for the handling of genetic data in families. In contrast, the limitations of the standard medical approach to confidentiality and personal autonomy in the context of genetic information in the family are described. Finally, recommendations for further ethical research and for the development of genetic counseling in families are given.

  15. Effects of quark structure on NN scattering: relevance to current data and bag models

    International Nuclear Information System (INIS)

    Lomon, E.L.

    1984-01-01

The applicability of the R-matrix method to the transition from asymptotic freedom to confinement depends on the overlap of the region in which asymptotic freedom is a good approximation with the region well described by hadronic field theory. This enables a quantitative description of hadron-hadron interactions at low and intermediate energies. ''Compound'' and ''Cloudy'' bag models and the P-matrix method are shown to be special or approximate versions of the R-matrix method in its f-matrix form. The f-matrix condition is applied to S-state nucleon-nucleon scattering, where it (i) overcomes the deficiencies of the P-matrix applications, (ii) shows that some of the bag models which have had some success in describing mesons and baryons are inconsistent when applied to nucleon-nucleon scattering, and (iii) shows that the bag models which are consistent with those data predict inelastic resonant structures of 50-100 MeV width at barycentric energies between 2.3 GeV and 3.5 GeV.

  16. Environmentally-relevant concentrations of Al(III) and Fe(III) cations induce aggregation of free DNA by complexation with phosphate group.

    Science.gov (United States)

    Qin, Chao; Kang, Fuxing; Zhang, Wei; Shou, Weijun; Hu, Xiaojie; Gao, Yanzheng

    2017-10-15

Environmental persistence of free DNA is influenced by its complexation with other chemical species and by its aggregation mechanisms. However, it is not well known how naturally abundant metal ions, e.g., Al(III) and Fe(III), influence DNA aggregation. This study investigated the aggregation behavior of model DNA from salmon testes as influenced by metal cations, and elucidated the predominant mechanism responsible for DNA aggregation. Compared to monovalent (K⁺ and Na⁺) and divalent (Ca²⁺ and Mg²⁺) cations, Al(III) and Fe(III) species in aqueous solution caused rapid DNA aggregation. Maximal DNA aggregation occurred at 0.05 mmol/L Al(III) and 0.075 mmol/L Fe(III), respectively. A combination of atomic force microscopy, transmission electron microscopy, Fourier transform infrared spectroscopy, and X-ray photoelectron spectroscopy revealed that Al(III) and Fe(III) complexed with negatively charged phosphate groups to neutralize DNA charges, resulting in decreased electrostatic repulsion and subsequent DNA aggregation. Zeta potential measurements and molecular computation further support this mechanism. Furthermore, DNA aggregation was enhanced at higher temperature and near-neutral pH. DNA aggregation is therefore collectively determined by environmental factors such as ion species, temperature, and solution pH. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Data-assisted reduced-order modeling of extreme events in complex dynamical systems.

    Directory of Open Access Journals (Sweden)

    Zhong Yi Wan

Full Text Available The prediction of extreme events, from avalanches and droughts to tsunamis and epidemics, depends on the formulation and analysis of relevant, complex dynamical systems. Such dynamical systems are characterized by high intrinsic dimensionality, with extreme events having the form of rare transitions that are several standard deviations away from the mean. Such systems are not amenable to classical order-reduction methods through projection of the governing equations, due to the large intrinsic dimensionality of the underlying attractor as well as the complexity of the transient events. Alternatively, data-driven techniques aim to quantify the dynamics of specific, critical modes by utilizing data-streams and by expanding the dimensionality of the reduced-order model using delayed coordinates. In turn, these methods have major limitations in regions of the phase space with sparse data, which is the case for extreme events. In this work, we develop a novel hybrid framework that complements an imperfect reduced-order model with data-streams that are integrated through a recurrent neural network (RNN) architecture. The reduced-order model has the form of equations projected onto a low-dimensional subspace that still contains important dynamical information about the system, and it is expanded by a long short-term memory (LSTM) regularization. The LSTM-RNN is trained by analyzing the mismatch between the imperfect model and the data-streams, projected onto the reduced-order space. The data-driven model assists the imperfect model in regions where data is available, while for locations where data is sparse the imperfect model still provides a baseline for the prediction of the system state. We assess the developed framework on two challenging prototype systems exhibiting extreme events. We show that the blended approach has improved performance compared with methods that use either data streams or the imperfect model alone. Notably the improvement is more

  18. Data-assisted reduced-order modeling of extreme events in complex dynamical systems.

    Science.gov (United States)

    Wan, Zhong Yi; Vlachas, Pantelis; Koumoutsakos, Petros; Sapsis, Themistoklis

    2018-01-01

The prediction of extreme events, from avalanches and droughts to tsunamis and epidemics, depends on the formulation and analysis of relevant, complex dynamical systems. Such dynamical systems are characterized by high intrinsic dimensionality, with extreme events having the form of rare transitions that are several standard deviations away from the mean. Such systems are not amenable to classical order-reduction methods through projection of the governing equations, due to the large intrinsic dimensionality of the underlying attractor as well as the complexity of the transient events. Alternatively, data-driven techniques aim to quantify the dynamics of specific, critical modes by utilizing data-streams and by expanding the dimensionality of the reduced-order model using delayed coordinates. In turn, these methods have major limitations in regions of the phase space with sparse data, which is the case for extreme events. In this work, we develop a novel hybrid framework that complements an imperfect reduced-order model with data-streams that are integrated through a recurrent neural network (RNN) architecture. The reduced-order model has the form of equations projected onto a low-dimensional subspace that still contains important dynamical information about the system, and it is expanded by a long short-term memory (LSTM) regularization. The LSTM-RNN is trained by analyzing the mismatch between the imperfect model and the data-streams, projected onto the reduced-order space. The data-driven model assists the imperfect model in regions where data is available, while for locations where data is sparse the imperfect model still provides a baseline for the prediction of the system state. We assess the developed framework on two challenging prototype systems exhibiting extreme events. We show that the blended approach has improved performance compared with methods that use either data streams or the imperfect model alone. Notably the improvement is more significant in
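The blending idea above, an imperfect reduced-order model corrected by a data-driven term trained on the model/data mismatch, can be illustrated with a deliberately simplified stand-in: a linear least-squares corrector replaces the LSTM-RNN, and the one-dimensional dynamics are entirely hypothetical.

```python
import numpy as np

def true_step(x):          # hypothetical full dynamics (stands in for the data)
    return 0.9 * x - 0.1 * x**3

def rom_step(x):           # imperfect reduced-order model: linear part only
    return 0.9 * x

# generate a "data-stream": a trajectory of the true system
x = np.empty(500)
x[0] = 1.5
for t in range(499):
    x[t + 1] = true_step(x[t])

# learn the ROM's one-step residual from data: r_t = x_{t+1} - rom_step(x_t)
feats = np.column_stack([x[:-1], x[:-1]**3])   # simple polynomial features
resid = x[1:] - rom_step(x[:-1])
coef, *_ = np.linalg.lstsq(feats, resid, rcond=None)

def hybrid_step(x0):
    """Blended prediction: imperfect model plus learned correction."""
    return rom_step(x0) + np.array([x0, x0**3]) @ coef

x0 = 1.2
err_rom = abs(rom_step(x0) - true_step(x0))
err_hybrid = abs(hybrid_step(x0) - true_step(x0))
```

Because the corrector's features happen to span the missing cubic term, the hybrid error collapses to numerical precision here; with a real LSTM and sparse data the correction is only partial, which is exactly why the imperfect model is kept as a baseline.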

  19. How relevant is the deposition of mercury onto snowpacks? – Part 2: A modeling study

    Directory of Open Access Journals (Sweden)

    D. Durnford

    2012-10-01

Full Text Available An unknown fraction of the mercury that is deposited onto snowpacks is revolatilized to the atmosphere. Determining the revolatilized fraction is important, since mercury that enters the snowpack meltwater may be converted to highly toxic, bioaccumulating methylmercury. In this study, we present a new dynamic, physically based snowpack/meltwater model for mercury that is suitable for large-scale atmospheric mercury models. It represents the primary physical and chemical processes that determine the fate of mercury deposited onto snowpacks. The snowpack/meltwater model was implemented in Environment Canada's atmospheric mercury model GRAHM. For the first time, observed snowpack-related mercury concentrations are used to evaluate and constrain an atmospheric mercury model. We find that simulated concentrations of mercury in both snowpacks and the atmosphere's surface layer agree closely with observations. The simulated concentration of mercury in both the top 30 cm and the top 150 cm of the snowpack, averaged over 2005–2009, is predominantly below 6 ng L⁻¹ over land south of 66.5° N but exceeds 18 ng L⁻¹ over sea ice in extensive areas of the Arctic Ocean and Hudson Bay. The average simulated concentration of mercury in snowpack meltwater runoff tends to be higher on the Russian/European side of the Arctic Ocean (>20 ng L⁻¹) than on the Canadian side (<10 ng L⁻¹). The correlation coefficient between observed and simulated monthly mean atmospheric surface-level gaseous elemental mercury (GEM) concentrations increased significantly with the inclusion of the new snowpack/meltwater model at two of the three stations studied (midlatitude, subarctic) and remained constant at the third (arctic). Oceanic emissions are postulated to produce the observed summertime maximum in concentrations of surface-level atmospheric GEM at Alert in the Canadian Arctic and to generate the summertime volatility observed in

  20. Proceedings of the meeting on computational and experimental studies for modeling of radionuclide migration in complex aquatic ecosystems

    International Nuclear Information System (INIS)

    Matsunaga, Takeshi; Hakanson, Lars

    2010-09-01

The Research Group for Environmental Science of JAEA held a meeting on computational and experimental studies for modeling of radionuclide migration in complex aquatic ecosystems during November 16-20, 2009. The aim was to discuss the relevance of various computational and experimental approaches to that modeling. The meeting was attended by a Swedish researcher, Prof. Dr. Lars Hakanson of Uppsala University. It included a joint talk at the Institute for Environmental Sciences, in addition to a field and facility survey of the JNFL commercial reprocessing plant located in Rokkasho, Aomori. The meeting demonstrated that it is crucial 1) to make the model structure strictly relevant to the target objectives of a study and 2) to account for the inherent fluctuations of target values in nature by means of qualitative parameterization. Moreover, it was confirmed that multiple different modeling approaches (e.g. detailed or simplified) can each be relevant to the objectives of a study. These discussions should be considered in model integration for complex aquatic ecosystems consisting of catchments, rivers, lakes and coastal oceans, which can interact with the atmosphere. This report compiles the research subjects and lectures presented at the meeting, with the associated discussions. The 10 presented papers are indexed individually. (J.P.N.)

  1. Educational complex of light-colored modeling of urban environment

    Directory of Open Access Journals (Sweden)

    Karpenko Vladimir E.

    2018-01-01

Full Text Available Mechanisms, methodological tools and the structure of a training complex of light-colored modeling of the urban environment are developed in this paper. The following results of the students' practical work are presented: light composition and installation, media facades, and the lighting of building facades, city streets and an embankment. As a result of modeling, the structure of the light form is determined; light-transmitting materials, the characteristic optical illusions they cause, light-visual and light-dynamic effects (video-dynamics and photostatics), and the basic compositional techniques of light form are revealed. The main elements of the light installation are studied, including the light projection, the electronic device, the interactivity and relationality of the installation, and the mechanical device which becomes part of the installation's composition. The essence of modern media-facade technology is the transformation of external building structures and their facades into a changing information cover, a translator of media content using LED technology. Light tectonics and the light rhythm of the plastics of the architectural object are built up through point and local illumination; modeling of the urban ensemble assumes the structural interaction of several light building models with special light-composition techniques. When modeling the social and pedestrian environment, the lighting parameters depend on the scale of the chosen space and are adapted to the visual perception of the pedestrian, and the atmospheric effects of comfort and safety of the environment are achieved with the help of special light-compositional techniques. To realize the tasks of light modeling, a methodology has been created that includes the mechanisms of models, variability and complementarity.
The perspectives of light modeling in the context of the structural elements of the city, neuropsychology, and wireless and bioluminescence technologies are proposed.

  2. Differential Effectiveness of Clinically-Relevant Analgesics in a Rat Model of Chemotherapy-Induced Mucositis.

    Directory of Open Access Journals (Sweden)

    Alexandra L Whittaker

Full Text Available Chemotherapy-induced intestinal mucositis is characterized by pain and a pro-inflammatory tissue response. Rat models are frequently used in mucositis disease investigations, yet little is known about the presence of pain in these animals, the ability of analgesics to ameliorate the condition, or the effect that analgesic administration may have on study outcomes. This study investigated different classes of analgesics with the aim of determining their analgesic effects and their impact on research outcomes of interest in a rat model of mucositis. Female DA rats were allocated to 8 groups, including saline and chemotherapy controls (n = 8). Analgesics included opioid derivatives (buprenorphine, 0.05 mg/kg; tramadol, 12.5 mg/kg) and an NSAID (carprofen, 15 mg/kg), each in combination with either saline or 5-Fluorouracil (5-FU, 150 mg/kg). Research outcome measures included daily clinical parameters, pain score and gut histology. A myeloperoxidase assay was performed to determine gut inflammation. At the dosages employed, all agents had an analgesic effect based on behavioural pain scores. Jejunal myeloperoxidase activity was significantly reduced by buprenorphine and tramadol in comparison to 5-FU control animals (53%, p = 0.0004, and 58%, p = 0.0001, respectively). Carprofen had no ameliorating effect on myeloperoxidase levels. None of the agents reduced the histological damage caused by 5-FU administration, although tramadol tended to increase villus length even when administered to healthy animals. These data provide evidence that carprofen offers potential as an analgesic in this animal model due to its pain-relieving efficacy and minimal effect on measured parameters. This study also supports further investigation into the mechanism and utility of opioid agents in the treatment of chemotherapy-induced mucositis.

  3. Drought-associated changes in climate and their relevance for ecosystem experiments and models

    Directory of Open Access Journals (Sweden)

    H. J. De Boeck

    2011-05-01

Full Text Available Drought periods can have important impacts on plant productivity and ecosystem functioning, but the climatic conditions other than the lack of precipitation during droughts have never been quantified and have therefore not been considered explicitly in either experimental or modeling studies. Here, we identify which climatic characteristics deviate from normal during droughts and how these deviations could affect plant responses. Analysis of 609 years of daily data from nine Western European meteorological stations reveals that droughts in the studied region are consistently associated with more sunshine (+45%), increased mean (+1.6 °C) and maximum (+2.8 °C) air temperatures, and vapour pressure deficits 51% higher than under normal conditions. These deviations from normal increase significantly as droughts progress. Using the process model ORCHIDEE, we simulated droughts consistent with the results of the dataset analysis and compared the water and carbon exchange of three different vegetation types during such natural droughts and during droughts in which only the precipitation was affected. The comparison revealed contrasting responses: carbon loss was higher under natural drought in grasslands, while increased carbon uptake was found especially in deciduous forests. This difference was attributed to better access to water reserves in forest ecosystems, which prevented drought stress. This demonstrates that the warmer and sunnier conditions naturally associated with droughts can either improve growth or aggravate drought-related stress, depending on water reserves. As the impacts of including or excluding climatic parameters that correlate with drought are substantial, we propose that both experimental and modeling efforts should take into account environmental factors other than merely precipitation.

  4. Expression of presynaptic markers in a neurodevelopmental animal model with relevance to schizophrenia

    DEFF Research Database (Denmark)

    Karlsen, Anna S; Kaalund, Sanne Simone; Møller, Morten

    2013-01-01

Administration of the N-methyl-D-aspartate receptor antagonist phencyclidine (PCP) to rat pups at postnatal days (PND) 7, 9, and 11 [neonatal PCP (neoPCP) model] induces cognitive deficits similar to those observed in schizophrenia. Expression of the presynaptic SNARE protein synaptosomal-associated protein of 25 kDa (Snap25) has been shown to be downregulated in postmortem brains from patients with schizophrenia. The present study was designed to investigate the long-term effects of neoPCP administration on the expression of presynaptic markers altered in schizophrenia. Using radioactive in

  5. A meta-analysis of the abscopal effect in preclinical models: Is the biologically effective dose a relevant physical trigger?

    Directory of Open Access Journals (Sweden)

    Raffaella Marconi

Full Text Available Preclinical in vivo studies using small animals are considered crucial in translational cancer research and the clinical implementation of novel treatments. This is of paramount relevance in radiobiology, especially for technological developments that permit the delivery of high doses in single or oligo-fractionated regimens, such as stereotactic ablative radiotherapy (SABR). In this context, clinical success in cancer treatment needs to be guaranteed while sparing normal tissue and preventing the potential spread of disease or local recurrence. In this work we introduce a new dose-response relationship based on relevant publications concerning preclinical models, with regard to the delivered dose, the fractionation schedule and the occurrence of biological effects on non-irradiated tissue: abscopal effects. We reviewed relevant publications on murine models and the abscopal effect in radiation cancer research following the PRISMA methodology. In particular, through a log-likelihood method, we evaluated whether the occurrence of abscopal effects may be related to the biologically effective dose (BED). To this aim, studies involving different tumor histotypes were considered in our analysis, including breast, colon, lung, fibrosarcoma, pancreas, melanoma and head and neck cancer. For all the tumors, the α/β ratio was assumed to be 10 Gy, as generally adopted for neoplastic cells. Our results support the hypothesis that the occurrence rate of abscopal effects in preclinical models increases with BED; in particular, the probability of revealing abscopal effects is 50% when a BED of 60 Gy is generated. Our study provides evidence that SABR treatments associated with high BEDs could be considered an effective strategy for triggering the abscopal effect, shedding light on the promising outcomes revealed in clinical practice.
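The BED used in the analysis above follows the standard linear-quadratic expression BED = n·d·(1 + d/(α/β)). A minimal sketch with the paper's assumed α/β = 10 Gy; the two fractionation schedules below are illustrative examples, not schedules drawn from the reviewed studies:

```python
def bed(n_fractions, dose_per_fraction, alpha_beta=10.0):
    """Biologically effective dose (linear-quadratic model):
    BED = n * d * (1 + d / (alpha/beta)), all doses in Gy."""
    return n_fractions * dose_per_fraction * (1 + dose_per_fraction / alpha_beta)

# a single 20 Gy SABR-like fraction reaches the ~60 Gy BED threshold:
single = bed(1, 20.0)        # 20 * (1 + 20/10) = 60.0 Gy
# a conventional 2 Gy/fraction course needs 25 fractions for the same BED:
conventional = bed(25, 2.0)  # 50 * (1 + 2/10) = 60.0 Gy
```

This illustrates why hypofractionated high-dose regimens reach a given BED with far less total physical dose than conventional fractionation.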

  6. New statistical methodology, mathematical models, and data bases relevant to the assessment of health impacts of energy technologies

    International Nuclear Information System (INIS)

    Ginevan, M.E.; Collins, J.J.; Brown, C.D.; Carnes, B.A.; Curtiss, J.B.; Devine, N.

    1981-01-01

The present research develops new statistical methodology, mathematical models, and databases relevant to the assessment of the health impacts of energy technologies, and uses these to identify, quantify, and predict adverse health effects of energy-related pollutants. Efforts fall into five related areas: (1) evaluation and development of statistical procedures for the analysis of death rate data, disease incidence data, and large-scale data sets; (2) development of dose-response and demographic models useful in predicting the health effects of energy technologies; (3) application of our methods and models to analyses of the health risks of energy production; (4) a reanalysis of the Tri-State leukemia survey data, focusing on the relationship between myelogenous leukemia risk and diagnostic X-ray exposure; and (5) investigation of human birth weights as a possible early warning system for the effects of environmental pollution.

  7. A clinically relevant in vivo model for the assessment of scaffold efficacy in abdominal wall reconstruction

    Directory of Open Access Journals (Sweden)

    Jeffrey CY Chan

    2016-12-01

    Full Text Available An animal model that allows for assessment of the degree of stretching or contraction of the implant area and the in vivo degradation properties of biological meshes is required to evaluate their performance in vivo. Adult New Zealand rabbits underwent full thickness subtotal unilateral rectus abdominis muscle excision and were reconstructed with the non-biodegradable Peri-Guard®, Prolene® or biodegradable Surgisis® meshes. Following 8 weeks of recovery, the anterior abdominal wall tissue samples were collected for measurement of the implant dimensions. The Peri-Guard and Prolene meshes showed a slight and obvious shrinkage, respectively, whereas the Surgisis mesh showed stretching, resulting in hernia formation. Surgisis meshes showed in vivo biodegradation and increased collagen formation. This surgical rabbit model for abdominal wall defects is advantageous for evaluating the in vivo behaviour of surgical meshes. Implant area stretching and shrinkage were detected corresponding to mesh properties, and histological analysis and stereological methods supported these findings.

  8. Experimental Animal Models of Pancreatic Carcinogenesis for Prevention Studies and Their Relevance to Human Disease

    Directory of Open Access Journals (Sweden)

    Hitoshi Nakagama

    2011-02-01

Full Text Available Pancreatic cancer is difficult to cure, so its prevention is very important. For this purpose, animal model studies are necessary to develop effective methods. Injection of N-nitrosobis(2-oxopropyl)amine (BOP) into Syrian golden hamsters is known to induce pancreatic ductal adenocarcinomas, the histology of which is similar to human tumors. Moreover, K-ras activation by point mutations and p16 inactivation by aberrant methylation of 5’ CpG islands or by homozygous deletions have been frequently observed in common in both the hamster and humans. Thus, this chemical carcinogenesis model has the advantage of histopathological and genetic similarity to human pancreatic cancer, and it is useful for studying promotive and suppressive factors. Syrian golden hamsters are in a hyperlipidemic state even under normal dietary conditions, and a ligand of peroxisome proliferator-activated receptor gamma was found to improve the hyperlipidemia and suppress pancreatic carcinogenesis. Chronic inflammation is a known important risk factor, and selective inhibitors of inducible nitric oxide synthase and cyclooxygenase-2 also have protective effects against pancreatic cancer development. Anti-inflammatory and anti-hyperlipidemic agents can thus be considered candidate chemopreventive agents deserving more attention.

  9. Experimental Animal Models of Pancreatic Carcinogenesis for Prevention Studies and Their Relevance to Human Disease

    International Nuclear Information System (INIS)

    Takahashi, Mami; Hori, Mika; Mutoh, Michihiro; Wakabayashi, Keiji; Nakagama, Hitoshi

    2011-01-01

Pancreatic cancer is difficult to cure, so its prevention is very important. For this purpose, animal model studies are necessary to develop effective methods. Injection of N-nitrosobis(2-oxopropyl)amine (BOP) into Syrian golden hamsters is known to induce pancreatic ductal adenocarcinomas, the histology of which is similar to human tumors. Moreover, K-ras activation by point mutations and p16 inactivation by aberrant methylation of 5′ CpG islands or by homozygous deletions have been frequently observed in common in both the hamster and humans. Thus, this chemical carcinogenesis model has an advantage of histopathological and genetic similarity to human pancreatic cancer, and it is useful to study promotive and suppressive factors. Syrian golden hamsters are in a hyperlipidemic state even under normal dietary conditions, and a ligand of peroxisome proliferator-activated receptor gamma was found to improve the hyperlipidemia and suppress pancreatic carcinogenesis. Chronic inflammation is a known important risk factor, and selective inhibitors of inducible nitric oxide synthase and cyclooxygenase-2 also have protective effects against pancreatic cancer development. Anti-inflammatory and anti-hyperlipidemic agents can thus be considered candidate chemopreventive agents deserving more attention

  10. Experimental Animal Models of Pancreatic Carcinogenesis for Prevention Studies and Their Relevance to Human Disease

    Energy Technology Data Exchange (ETDEWEB)

    Takahashi, Mami, E-mail: mtakahas@ncc.go.jp; Hori, Mika; Mutoh, Michihiro [Division of Cancer Development System, Carcinogenesis Research Group, National Cancer Center Research Institute, 1-1, Tsukiji 5-chome, Chuo-ku, Tokyo 104-0045 (Japan); Wakabayashi, Keiji [Graduate School of Nutritional and Environmental Sciences, University of Shizuoka, Yada 52-1, Suruga-ku, Shizuoka 422-8526 (Japan); Nakagama, Hitoshi [Division of Cancer Development System, Carcinogenesis Research Group, National Cancer Center Research Institute, 1-1, Tsukiji 5-chome, Chuo-ku, Tokyo 104-0045 (Japan)

    2011-02-09

Pancreatic cancer is difficult to cure, so its prevention is very important. For this purpose, animal model studies are necessary to develop effective methods. Injection of N-nitrosobis(2-oxopropyl)amine (BOP) into Syrian golden hamsters is known to induce pancreatic ductal adenocarcinomas, the histology of which is similar to human tumors. Moreover, K-ras activation by point mutations and p16 inactivation by aberrant methylation of 5′ CpG islands or by homozygous deletions have been frequently observed in common in both the hamster and humans. Thus, this chemical carcinogenesis model has an advantage of histopathological and genetic similarity to human pancreatic cancer, and it is useful to study promotive and suppressive factors. Syrian golden hamsters are in a hyperlipidemic state even under normal dietary conditions, and a ligand of peroxisome proliferator-activated receptor gamma was found to improve the hyperlipidemia and suppress pancreatic carcinogenesis. Chronic inflammation is a known important risk factor, and selective inhibitors of inducible nitric oxide synthase and cyclooxygenase-2 also have protective effects against pancreatic cancer development. Anti-inflammatory and anti-hyperlipidemic agents can thus be considered candidate chemopreventive agents deserving more attention.

  11. The relevance of parametric U-uptake models in ESR age calculations

    International Nuclear Information System (INIS)

    Gruen, Rainer

    2009-01-01

In ESR dating, three basic parametric U-uptake models have been applied for dating teeth: early U-uptake (EU: closed system), linear U-uptake (LU) and recent U-uptake (RU: the dose rate contribution of U in the dental tissues is assumed to be zero). In many ESR dating publications it is still assumed that samples comply with one or the other parametric U-uptake model, or that their correct age lies somewhere between the EU and LU estimates. Observations of the spatial distribution of uranium in dental tissues show that it is difficult to predict any relationship between the relative uptake in the different dental tissues. Combined U-series/ESR age estimates can give insights into the actual U-uptake. An evaluation of published data shows that for cave sites a significant number of results fall outside the EU-LU bracket, while for open-air sites the majority of data lie outside this bracket, in particular showing greatly delayed U-uptake. This may be due to changes in the hydrological system, leading to the erosion which exposes the open-air site. U-leaching has also been observed in samples from open-air sites, in which case any reasonable age calculation is impossible.
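The effect of the three parametric uptake assumptions on a calculated age can be illustrated with a deliberately simplified sketch. All numerical values below (equivalent dose, internal and external dose rates) are hypothetical, and real ESR age calculations involve many further corrections; the sketch only shows why EU, LU and RU bracket very different ages for the same measured dose.

```python
# Illustrative comparison of the three parametric U-uptake models (EU, LU, RU).
# The dose values are hypothetical examples, not data from the paper above.

def esr_age(d_e, dint_today, dext, model):
    """Solve D_E = (effective internal + external dose rate) * T for the age T (ka).

    EU: U present since burial  -> full present-day internal dose rate over T.
    LU: U accumulates linearly  -> average internal dose rate is half today's.
    RU: U arrived only recently -> internal contribution approximately zero.
    """
    effective_internal = {"EU": dint_today, "LU": dint_today / 2.0, "RU": 0.0}[model]
    return d_e / (effective_internal + dext)

d_e = 200.0   # equivalent dose (Gy), hypothetical
dint = 0.5    # present-day internal dose rate from U in dental tissue (Gy/ka), hypothetical
dext = 1.0    # external (sediment + cosmic) dose rate (Gy/ka), hypothetical

ages = {m: esr_age(d_e, dint, dext, m) for m in ("EU", "LU", "RU")}
# EU yields the youngest age and RU the oldest: the more delayed the uptake,
# the less internal dose has accumulated, so more time is needed to reach D_E.
```

Greatly delayed uptake of the kind reported for open-air sites pushes the true age even beyond the RU estimate, which is why results falling outside the EU-LU bracket are diagnostic.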

  12. Adaptive Surface Modeling of Soil Properties in Complex Landforms

    Directory of Open Access Journals (Sweden)

    Wei Liu

    2017-06-01

Spatial discontinuity often causes poor accuracy when a single model is used for the surface modeling of soil properties in complex geomorphic areas. Here we present a method for adaptive surface modeling of combined secondary variables to improve prediction accuracy during the interpolation of soil properties (ASM-SP). Using various secondary variables and multiple base interpolation models, ASM-SP was used to interpolate soil K+ in a typical complex geomorphic area (Qinghai Lake Basin, China). Five methods, including inverse distance weighting (IDW), ordinary kriging (OK), and OK combined with different secondary variables (e.g., OK-Landuse, OK-Geology, and OK-Soil), were used to validate the proposed method. The mean error (ME), mean absolute error (MAE), root mean square error (RMSE), mean relative error (MRE), and accuracy (AC) were used as evaluation indicators. Results showed that: (1) the OK interpolation result is spatially smooth with a weak bull's-eye effect, while IDW shows a relatively stronger one; both have obvious deficiencies in depicting the spatial variability of soil K+. (2) The methods incorporating combinations of different secondary variables (e.g., ASM-SP, OK-Landuse, OK-Geology, and OK-Soil) were associated with lower estimation bias. Compared with IDW, OK, OK-Landuse, OK-Geology, and OK-Soil, the accuracy of ASM-SP increased by 13.63%, 10.85%, 9.98%, 8.32%, and 7.66%, respectively. Furthermore, ASM-SP was more stable, with lower MEs, MAEs, RMSEs, and MREs. (3) ASM-SP presents more detail than the other methods at abrupt boundaries, keeping the result consistent with the true secondary variables. In conclusion, ASM-SP can not only consider the nonlinear relationship between secondary variables and soil properties, but can also adaptively combine the advantages of multiple models, which contributes to making the spatial interpolation of soil K+ more reasonable.
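IDW, one of the base interpolators compared above, is simple enough to sketch directly. The sample coordinates, soil K+ values and power parameter below are hypothetical illustrations; this is a minimal sketch of plain IDW, not of the ASM-SP method itself.

```python
import math

# Minimal inverse distance weighting (IDW) interpolator.
# Sample data and the power parameter are hypothetical.

def idw(samples, x, y, power=2.0):
    """samples: list of (x, y, value). Returns the IDW estimate at (x, y)."""
    num, den = 0.0, 0.0
    for sx, sy, v in samples:
        d = math.hypot(x - sx, y - sy)
        if d == 0.0:
            return v                # exact hit: return the observed value
        w = 1.0 / d ** power        # closer samples receive larger weights
        num += w * v
        den += w
    return num / den

soil_k = [(0.0, 0.0, 120.0), (1.0, 0.0, 150.0), (0.0, 1.0, 135.0)]
estimate = idw(soil_k, 0.5, 0.5)
# The estimate is a convex combination of the sample values, so it always
# stays within their range -- one source of the smooth bull's-eye pattern
# around sample points noted in the abstract above.
```

Because the weights depend only on distance, plain IDW cannot react to land use, geology or soil-type boundaries, which is exactly the deficiency the secondary-variable methods address.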

  13. Modeling Cu{sup 2+}-Aβ complexes from computational approaches

    Energy Technology Data Exchange (ETDEWEB)

    Alí-Torres, Jorge [Departamento de Química, Universidad Nacional de Colombia- Sede Bogotá, 111321 (Colombia); Mirats, Andrea; Maréchal, Jean-Didier; Rodríguez-Santiago, Luis; Sodupe, Mariona, E-mail: Mariona.Sodupe@uab.cat [Departament de Química, Universitat Autònoma de Barcelona, 08193 Bellaterra, Barcelona (Spain)

    2015-09-15

Amyloid plaque formation and oxidative stress are two key events in the pathology of Alzheimer's disease (AD), in which metal cations have been shown to play an important role. In particular, the interaction of the redox-active Cu{sup 2+} metal cation with Aβ has been found to interfere with amyloid aggregation and to lead to reactive oxygen species (ROS). A detailed knowledge of the electronic and molecular structure of Cu{sup 2+}-Aβ complexes is thus important for a better understanding of the role of these complexes in the development and progression of AD. The computational treatment of these systems requires a combination of several available computational methodologies, because two fundamental aspects have to be addressed: the metal coordination sphere and the conformation adopted by the peptide upon copper binding. In this paper we review the main computational strategies used to deal with Cu{sup 2+}-Aβ coordination and to build plausible Cu{sup 2+}-Aβ models that will afterwards allow the determination of physicochemical properties of interest, such as their redox potential.

  14. Modeling and simulation of complex systems a framework for efficient agent-based modeling and simulation

    CERN Document Server

    Siegfried, Robert

    2014-01-01

Robert Siegfried presents a framework for efficient agent-based modeling and simulation of complex systems. He compares different approaches for describing the structure and dynamics of agent-based models in detail. Based on this evaluation, the author introduces the "General Reference Model for Agent-based Modeling and Simulation" (GRAMS). Furthermore, he presents parallel and distributed simulation approaches for the execution of agent-based models - from small scale to very large scale. The author shows how agent-based models may be executed by different simulation engines that utilize underlying hard
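The separation between an agent-based model and the simulation engine that executes it can be sketched with a minimal time-stepped loop. The random-walk agents, class names and step count below are hypothetical illustrations of the general pattern, not of GRAMS itself.

```python
import random

# Minimal time-stepped agent-based simulation: the model (agents) is kept
# separate from the engine that advances it, echoing the model/engine
# separation discussed above. All names here are hypothetical.

class WalkerAgent:
    def __init__(self, rng):
        self.x, self.y = 0, 0
        self.rng = rng

    def step(self):
        # Each tick the agent moves one unit in a random lattice direction.
        dx, dy = self.rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        self.x += dx
        self.y += dy

class Engine:
    """A trivial sequential engine; a parallel engine could run the same model."""
    def __init__(self, agents):
        self.agents = agents
        self.time = 0

    def run(self, ticks):
        for _ in range(ticks):
            for agent in self.agents:
                agent.step()
            self.time += 1

rng = random.Random(42)  # fixed seed so the run is reproducible
engine = Engine([WalkerAgent(rng) for _ in range(10)])
engine.run(100)
# After the run, each agent holds its final position and engine.time == 100.
```

Because agents only expose a `step()` method, the same model could be handed to a differently scheduled (e.g. parallel or distributed) engine, which is the point of standardizing the model description.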

  15. Wind Tunnel Modeling Of Wind Flow Over Complex Terrain

    Science.gov (United States)

    Banks, D.; Cochran, B.

    2010-12-01

This presentation will describe the findings of an atmospheric boundary layer (ABL) wind tunnel study conducted as part of the Bolund Experiment. This experiment was sponsored by Risø DTU (National Laboratory for Sustainable Energy, Technical University of Denmark) during the fall of 2009 to enable a blind comparison of various air flow models in an attempt to validate their performance in predicting airflow over complex terrain. Bolund hill sits 12 m above the water level at the end of a narrow isthmus. The island features a steep escarpment on one side, over which the airflow can be expected to separate. The island was equipped with several anemometer towers, and the approach flow over the water was well characterized. This study was one of only two physical model studies included in the blind model comparison, the other being a water plume study. The remainder were computational fluid dynamics (CFD) simulations, including both RANS and LES. Physical modeling of air flow over topographical features has been used since the middle of the 20th century, and the methods required are well understood and well documented. Several books have been written describing how to properly perform ABL wind tunnel studies, including ASCE manual of engineering practice 67. Boundary layer wind tunnel tests are the only modelling method deemed acceptable in ASCE 7-10, the most recent edition of the American Society of Civil Engineers standard that provides wind loads for buildings and other structures for building codes across the US. Since the 1970s, most tall structures have undergone testing in a boundary layer wind tunnel to accurately determine the wind-induced loading. When compared to CFD, the US EPA considers a properly executed wind tunnel study to be equivalent to a CFD model with infinitesimal grid resolution and near-infinite memory. One key reason for this widespread acceptance is that properly executed ABL wind tunnel studies will accurately simulate flow separation

  16. Toxicological risk assessment of complex mixtures through the Wtox model

    Directory of Open Access Journals (Sweden)

    William Gerson Matias

    2015-01-01

Mathematical models are important tools for environmental management and risk assessment. Predictions about the toxicity of chemical mixtures must be enhanced due to the complexity of effects that can be caused to living species. In this work, the environmental risk was assessed by addressing the need to study the relationship between the organism and xenobiotics. Therefore, five toxicological endpoints were applied through the WTox Model, and with this methodology we obtained the risk classification of potentially toxic substances. Acute and chronic toxicity, cytotoxicity and genotoxicity were observed in the organisms Daphnia magna, Vibrio fischeri and Oreochromis niloticus. A case study was conducted with solid wastes from the textile, metal-mechanic, and pulp and paper industries. The results showed that several industrial wastes induced mortality, reproductive effects, micronucleus formation and increases in the rate of lipid peroxidation and DNA methylation in the organisms tested. These results, analyzed together through the WTox Model, allowed classification of the environmental risk of the industrial wastes. The evaluation showed that the toxicological environmental risk of the samples analyzed can be classified as significant or critical.

  17. Integrated modeling tool for performance engineering of complex computer systems

    Science.gov (United States)

    Wright, Gary; Ball, Duane; Hoyt, Susan; Steele, Oscar

    1989-01-01

    This report summarizes Advanced System Technologies' accomplishments on the Phase 2 SBIR contract NAS7-995. The technical objectives of the report are: (1) to develop an evaluation version of a graphical, integrated modeling language according to the specification resulting from the Phase 2 research; and (2) to determine the degree to which the language meets its objectives by evaluating ease of use, utility of two sets of performance predictions, and the power of the language constructs. The technical approach followed to meet these objectives was to design, develop, and test an evaluation prototype of a graphical, performance prediction tool. The utility of the prototype was then evaluated by applying it to a variety of test cases found in the literature and in AST case histories. Numerous models were constructed and successfully tested. The major conclusion of this Phase 2 SBIR research and development effort is that complex, real-time computer systems can be specified in a non-procedural manner using combinations of icons, windows, menus, and dialogs. Such a specification technique provides an interface that system designers and architects find natural and easy to use. In addition, PEDESTAL's multiview approach provides system engineers with the capability to perform the trade-offs necessary to produce a design that meets timing performance requirements. Sample system designs analyzed during the development effort showed that models could be constructed in a fraction of the time required by non-visual system design capture tools.

  18. Atmospheric dispersion modelling over complex terrain at small scale

    Science.gov (United States)

    Nosek, S.; Janour, Z.; Kukacka, L.; Jurcakova, K.; Kellnerova, R.; Gulikova, E.

    2014-03-01

A previous study, concerned with qualitative modelling of neutrally stratified flow over an open-cut coal mine and the important surrounding topography at meso-scale (1:9000), revealed an important area for quantitative modelling of atmospheric dispersion at small scale (1:3300). The selected area includes a necessary part of the coal mine topography with respect to its future expansion, as well as the surrounding populated areas. At this small scale, simultaneous measurements of velocity components and concentrations at specified points of vertical and horizontal planes were performed by two-dimensional Laser Doppler Anemometry (LDA) and a Fast-Response Flame Ionization Detector (FFID), respectively. The impact of the complex terrain on passive pollutant dispersion with respect to the prevailing wind direction was observed, and the prediction of air quality in the populated areas is discussed. The measured data will be used for comparison with another model taking the future coal mine transformation into account. Thus, the impact of the coal mine transformation on pollutant dispersion can be assessed.

  19. Complex accident scenarios modelled and analysed by Stochastic Petri Nets

    International Nuclear Information System (INIS)

    Nývlt, Ondřej; Haugen, Stein; Ferkl, Lukáš

    2015-01-01

This paper is focused on the use of Petri nets for effective modelling and simulation of complicated accident scenarios, where the order of events can vary and some events may occur anywhere in an event chain. Such cases are hard to manage with traditional methods such as event trees – e.g. one pivotal event must often be inserted several times into one branch of the tree. Our approach is based on Stochastic Petri Nets with Predicates and Assertions and on an idea that comes from the area of Programmable Logic Controllers: an accident scenario is described as a net of interconnected blocks, which represent parts of the scenario. The scenario is first divided into parts, which are then modelled by Petri nets. Every block can easily be interconnected with other blocks through input/output variables to create complex ones. In the presented approach, every event or part of a scenario is modelled only once, independently of the number of its occurrences in the scenario. The final model is much more transparent than the corresponding event tree. The method is shown in two case studies, of which the advanced one contains dynamic behavior. - Highlights: • Event and fault trees have problems with scenarios where the order of events can vary. • The paper presents a method for modelling and analysis of dynamic accident scenarios. • The presented method is based on Petri nets. • The proposed method solves the mentioned problems of traditional approaches. • The method is shown in two case studies: simple and advanced (with dynamic behavior)
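The block idea described above rests on the basic Petri net firing rule: places hold tokens, and a transition fires only when every input place is marked. The following minimal sketch shows that rule; the two-block "leak then ignition" scenario and all names in it are hypothetical illustrations, not the paper's case studies, and the sketch omits the stochastic timing, predicates and assertions of the full formalism.

```python
# Minimal (untimed) Petri net: places hold tokens; a transition is enabled
# when all its input places carry at least one token. The scenario names
# below are hypothetical.

class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)     # place -> token count
        self.transitions = {}            # name -> (input places, output places)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= 1 for p in inputs)

    def fire(self, name):
        inputs, outputs = self.transitions[name]
        if not self.enabled(name):
            raise ValueError(f"transition {name!r} is not enabled")
        for p in inputs:
            self.marking[p] -= 1         # consume one token per input place
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

# A scenario block modelled once: a leak followed by ignition. The block
# could be wired to other blocks via shared places, wherever the events
# happen to occur in the overall chain.
net = PetriNet({"normal_operation": 1})
net.add_transition("leak_occurs", ["normal_operation"], ["leak_present"])
net.add_transition("ignition", ["leak_present"], ["fire_event"])

net.fire("leak_occurs")
assert net.enabled("ignition")   # ignition becomes possible only after the leak
net.fire("ignition")
```

Because enabling depends only on the current marking, the same block never has to be duplicated along different branches, which is the advantage over event trees claimed above.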

  20. Surface Generation Modeling in Ball Nose End Milling: a review of relevant literature

    DEFF Research Database (Denmark)

    Bissacco, Giuliano

    One of the most common metal removal operation used in industry is the milling process. This machining process is well known since the beginning of last century and has experienced, along the years, many improvements of the basic technology, as concerns tools, machine tools, coolants...... to be adjusted afterwards. Nevertheless, many efforts have been done during the last 50 years in order to realize prediction tools for machining processes and particularly for conventional turning and milling operations. Most of these models aim at prediction of cutting forces tool wear and tool life. However...... been addressed in this direction. Among all the machining operations, ball nose end milling has shown great potentials, particularly in machining of sculptured surfaces with high requirements in terms of surface finish; this is due to the good spatial agreement of the mill shape with the geometry...