WorldWideScience

Sample records for relevant model complexes

  1. A Peep into the Uncertainty-Complexity-Relevance Modeling Trilemma through Global Sensitivity and Uncertainty Analysis

    Science.gov (United States)

    Munoz-Carpena, R.; Muller, S. J.; Chu, M.; Kiker, G. A.; Perz, S. G.

    2014-12-01

    Model complexity resulting from the need to integrate environmental system components cannot be overstated. In particular, additional emphasis is urgently needed on rational approaches to guide decision making through uncertainties surrounding the integrated system across decision-relevant scales. However, in spite of the difficulties that the consideration of modeling uncertainty presents for the decision process, it should not be avoided, or the value and science behind the models will be undermined. These two issues, i.e., the need for coupled models that can answer the pertinent questions and the need for models that do so with sufficient certainty, are the key indicators of a model's relevance. Model relevance is inextricably linked with model complexity. Although model complexity has advanced greatly in recent years, there has been little work to rigorously characterize the threshold of relevance in integrated and complex models. Formally assessing the relevance of the model in the face of increasing complexity would be valuable because there is growing unease among developers and users of complex models about the cumulative effects of various sources of uncertainty on model outputs. In particular, this issue has prompted doubt over whether the considerable effort going into further elaborating complex models will in fact yield the expected payback. New approaches have been proposed recently to evaluate the uncertainty-complexity-relevance modeling trilemma (Muller, Muñoz-Carpena and Kiker, 2011) by incorporating state-of-the-art global sensitivity and uncertainty analysis (GSA/UA) in every step of the model development so as to quantify not only the uncertainty introduced by the addition of new environmental components, but also the effect that these new components have on existing components (interactions, non-linear responses). Outputs from the analysis can also be used to quantify system resilience (stability, alternative states, thresholds or tipping
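
    A minimal sketch of the variance-based GSA/UA step the abstract invokes, using the open-source SALib package; the three-parameter toy function below is a stand-in for an integrated environmental model, not the authors' code:

    ```python
    # Sketch: Sobol sensitivity indices with SALib; interactions between
    # newly added components show up as total-order minus first-order.
    import numpy as np
    from SALib.sample import saltelli
    from SALib.analyze import sobol

    problem = {
        "num_vars": 3,
        "names": ["x1", "x2", "x3"],
        "bounds": [[0.0, 1.0]] * 3,
    }

    X = saltelli.sample(problem, 1024)      # quasi-random parameter sample
    Y = X[:, 0] + 2.0 * X[:, 1] * X[:, 2]   # toy coupled-model output
    Si = sobol.analyze(problem, Y)

    for name, s1, st in zip(problem["names"], Si["S1"], Si["ST"]):
        print(f"{name}: first-order={s1:.2f}, total={st:.2f}")
    # ST - S1 > 0 flags interaction effects introduced by coupling.
    ```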

  2. Developing predictive systems models to address complexity and relevance for ecological risk assessment.

    Science.gov (United States)

    Forbes, Valery E; Calow, Peter

    2013-07-01

    Ecological risk assessments (ERAs) are not used as well as they could be in risk management. Part of the problem is that they often lack ecological relevance; that is, they fail to grasp necessary ecological complexities. Adding realism and complexity can be difficult and costly. We argue that predictive systems models (PSMs) can provide a way of capturing complexity and ecological relevance cost-effectively. However, addressing complexity and ecological relevance is only part of the problem. Ecological risk assessments often fail to meet the needs of risk managers by not providing assessments that relate to protection goals and by expressing risk in ratios that cannot be weighed against the costs of interventions. Here again, PSMs can be designed to provide outputs in terms of value-relevant effects that are modulated against exposure and that can provide a better basis for decision making than arbitrary ratios or threshold values. Recent developments in modeling, and its potential for implementation by risk assessors and risk managers, are beginning to demonstrate how PSMs can be practically applied in risk assessment and the advantages that doing so could have. Copyright © 2013 SETAC.

  3. Parsimonious relevance models

    NARCIS (Netherlands)

    Meij, E.; Weerkamp, W.; Balog, K.; de Rijke, M.; Myang, S.-H.; Oard, D.W.; Sebastiani, F.; Chua, T.-S.; Leong, M.-K.

    2008-01-01

    We describe a method for applying parsimonious language models to re-estimate the term probabilities assigned by relevance models. We apply our method to six topic sets from test collections in five different genres. Our parsimonious relevance models (i) improve retrieval effectiveness in terms of
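
    Although the abstract is truncated, the method it names is the standard EM-based parsimonization of a term distribution against a background model; a hedged sketch with invented toy probabilities, not the authors' implementation:

    ```python
    # Sketch: parsimonious re-estimation of a relevance-model term
    # distribution against a background corpus model via EM; terms the
    # background already explains are pushed toward zero probability.
    import numpy as np

    def parsimonize(p_doc, p_bg, lam=0.5, iters=50, eps=1e-12):
        p = p_doc.copy()
        for _ in range(iters):
            e = p_doc * (lam * p) / (lam * p + (1 - lam) * p_bg + eps)  # E-step
            p = e / e.sum()                                             # M-step
        return p

    terms = ["the", "of", "relevance", "feedback", "model"]
    p_doc = np.array([0.30, 0.25, 0.20, 0.15, 0.10])   # relevance-model estimate
    p_bg  = np.array([0.45, 0.40, 0.05, 0.05, 0.05])   # background corpus model
    for t, a, b in zip(terms, p_doc, parsimonize(p_doc, p_bg)):
        print(f"{t:10s} {a:.2f} -> {b:.2f}")
    # Stopwords "the"/"of" shrink; topical terms gain relative weight.
    ```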

  4. Genetic mouse models relevant to schizophrenia: taking stock and looking forward.

    Science.gov (United States)

    Harrison, Paul J; Pritchett, David; Stumpenhorst, Katharina; Betts, Jill F; Nissen, Wiebke; Schweimer, Judith; Lane, Tracy; Burnet, Philip W J; Lamsa, Karri P; Sharp, Trevor; Bannerman, David M; Tunbridge, Elizabeth M

    2012-03-01

    Genetic mouse models relevant to schizophrenia complement, and have to a large extent supplanted, pharmacological and lesion-based rat models. The main attraction is that they potentially have greater construct validity; however, they share the fundamental limitations of all animal models of psychiatric disorder, and must also be viewed in the context of the uncertain and complex genetic architecture of psychosis. Some of the key issues, including the choice of gene to target, the manner of its manipulation, gene-gene and gene-environment interactions, and phenotypic characterization, are briefly considered in this commentary, illustrated by the relevant papers reported in this special issue. Copyright © 2011 Elsevier Ltd. All rights reserved.

  5. Mathematical approaches for complexity/predictivity trade-offs in complex system models: LDRD final report.

    Energy Technology Data Exchange (ETDEWEB)

    Goldsby, Michael E.; Mayo, Jackson R.; Bhattacharyya, Arnab (Massachusetts Institute of Technology, Cambridge, MA); Armstrong, Robert C.; Vanderveen, Keith

    2008-09-01

    The goal of this research was to examine foundational methods, both computational and theoretical, that can improve the veracity of entity-based complex system models and increase confidence in their predictions for emergent behavior. The strategy was to seek insight and guidance from simplified yet realistic models, such as cellular automata and Boolean networks, whose properties can be generalized to production entity-based simulations. We have explored the usefulness of renormalization-group methods for finding reduced models of such idealized complex systems. We have prototyped representative models that are both tractable and relevant to Sandia mission applications, and quantified the effect of computational renormalization on the predictive accuracy of these models, finding good predictivity from renormalized versions of cellular automata and Boolean networks. Furthermore, we have theoretically analyzed the robustness properties of certain Boolean networks, relevant for characterizing organic behavior, and obtained precise mathematical constraints on systems that are robust to failures. In combination, our results provide important guidance for more rigorous construction of entity-based models, which currently are often devised in an ad-hoc manner. Our results can also help in designing complex systems with the goal of predictable behavior, e.g., for cybersecurity.
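
    As a flavor of the idealized models mentioned (this is an illustration, not the report's code), a random Boolean network with a one-bit damage-spreading probe of its robustness:

    ```python
    # Sketch: random Boolean network (N nodes, K inputs each) and a
    # damage-spreading test: perturb one node and track divergence.
    import numpy as np

    rng = np.random.default_rng(0)
    N, K, T = 64, 2, 50
    inputs = rng.integers(0, N, size=(N, K))          # wiring: K inputs per node
    tables = rng.integers(0, 2, size=(N, 2 ** K))     # random Boolean functions

    def step(state):
        idx = np.zeros(N, dtype=int)
        for j in range(K):                            # encode the K input bits
            idx = idx * 2 + state[inputs[:, j]]
        return tables[np.arange(N), idx]

    x = rng.integers(0, 2, size=N)
    y = x.copy()
    y[0] ^= 1                                         # flip one node ("damage")
    for _ in range(T):
        x, y = step(x), step(y)
    print("Hamming distance after", T, "steps:", int((x != y).sum()))
    # A small final distance suggests robustness to single-node failures.
    ```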

  6. Epidemic modeling in complex realities.

    Science.gov (United States)

    Colizza, Vittoria; Barthélemy, Marc; Barrat, Alain; Vespignani, Alessandro

    2007-04-01

    In our global world, the increasing complexity of social relations and transport infrastructures is a key factor in the spread of epidemics. In recent years, the increasing availability of computer power has made it possible both to obtain reliable data quantifying the complexity of the networks on which epidemics may propagate and to envision computational tools able to tackle the analysis of such propagation phenomena. These advances have exposed the limits of homogeneous assumptions and simple spatial diffusion approaches, and have stimulated the inclusion of complex features and heterogeneities relevant to the description of epidemic diffusion. In this paper, we review recent progress in integrating complex systems and network analysis with epidemic modelling, and focus on the impact of the various complex features of real systems on the dynamics of epidemic spreading.
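
    The contrast with homogeneous mixing is easy to reproduce with a stochastic SIR process on a heavy-tailed network; a minimal sketch in which the networkx Barabási-Albert graph and all rates are illustrative choices, not the paper's:

    ```python
    # Sketch: discrete-time stochastic SIR on a heterogeneous network.
    # Hubs in the scale-free graph accelerate spread relative to a
    # homogeneous-mixing approximation with the same mean degree.
    import random
    import networkx as nx

    random.seed(1)
    G = nx.barabasi_albert_graph(n=2000, m=3)
    beta, mu = 0.05, 0.2                     # per-contact infection, recovery
    state = {v: "S" for v in G}
    state[0] = "I"

    infected = {0}
    for t in range(100):
        new_inf, new_rec = set(), set()
        for v in infected:
            for w in G.neighbors(v):
                if state[w] == "S" and random.random() < beta:
                    new_inf.add(w)
            if random.random() < mu:
                new_rec.add(v)
        for w in new_inf: state[w] = "I"
        for v in new_rec: state[v] = "R"
        infected = (infected - new_rec) | new_inf
        if not infected:
            break
    print("final size:", sum(1 for s in state.values() if s == "R"))
    ```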

  7. Complex Systems and Self-organization Modelling

    CERN Document Server

    Bertelle, Cyrille; Kadri-Dahmani, Hakima

    2009-01-01

    The concern of this book is the use of emergent computing and self-organization modelling within various applications of complex systems. The authors focus not only on innovative concepts and implementations for modelling self-organization, but also on the application domains in which these can be used efficiently. This book is the outcome of a workshop meeting within ESM 2006 (Eurosis), held in Toulouse, France in October 2006.

  8. The complexity of DNA damage: relevance to biological consequences

    International Nuclear Information System (INIS)

    Ward, J.F.

    1994-01-01

    Ionizing radiation causes both singly and multiply damaged sites in DNA when the range of radical migration is limited by the presence of hydroxyl radical scavengers (e.g. within cells). Multiply damaged sites are considered to be more biologically relevant because of the challenges they present to cellular repair mechanisms. These sites occur in the form of DNA double-strand breaks (dsb) but also as other multiple damages that can be converted to dsb during attempted repair. The presence of a dsb can lead to loss of base sequence information and/or can permit the two ends of a break to separate and rejoin with the wrong partner. (Multiply damaged sites may also be the biologically relevant type of damage caused by other agents, such as UVA, B and/or C light, and some antitumour antibiotics). The quantitative data available from radiation studies of DNA are shown to support the proposed mechanisms for the production of complex damage in cellular DNA, i.e. via scavengable and non-scavengable mechanisms. The yields of complex damages can in turn be used to support the conclusion that cellular mutations are a consequence of the presence of these damages within a gene. (Author)

  9. THE COMPLEX OF EMOTIONAL EXPERIENCES RELEVANT TO MANIFESTATIONS OF INSPIRATION

    Directory of Open Access Journals (Sweden)

    Pavel A. Starikov

    2015-01-01

    The aim of the study is to investigate the structure of the emotional experiences relevant to manifestations of inspiration in students' creative activities. Methods. Methods of mathematical statistics (correlation analysis, factor analysis, multidimensional scaling) are applied. Results and scientific novelty. The use of factor analysis and multidimensional scaling revealed a consistent complex of positive experiences of the students relevant to the experience of inspiration in creative activities. In accordance with the study results, this complex includes the «operational» experiences described by M. Csikszentmihalyi («feeling of full involvement and dissolution in what you do», «feeling of concentration, perfect clarity of purpose, complete control and a feeling of total immersion in a job that does not require special efforts») together with experiences of a more «spiritual» nature, closer to the peak experiences of A. Maslow («feeling of love for all existing, all life»; «a deep sense of self-importance, the inner feeling of approval of self»; «feeling of unity with the whole world»; «acute perception of the beauty of the world of nature, “beautiful instant”»; «feeling of lightness, flowing»). The interrelation between the degree of expressiveness of this complex of experiences and the experience of inspiration is considered. Practical significance. The results of the study show the structure of the emotional experiences relevant to manifestations of inspiration. The research materials can be useful both to psychologists and to experts in the field of pedagogy of creative activity.

  10. The Kuramoto model in complex networks

    Science.gov (United States)

    Rodrigues, Francisco A.; Peron, Thomas K. DM.; Ji, Peng; Kurths, Jürgen

    2016-01-01

    Synchronization of an ensemble of oscillators is an emergent phenomenon present in several complex systems, ranging from social and physical to biological and technological systems. The most successful approach to describing how coherent behavior emerges in these complex systems is given by the paradigmatic Kuramoto model. This model has been traditionally studied in complete graphs. However, besides being intrinsically dynamical, complex systems present very heterogeneous structure, which can be represented as complex networks. This report is dedicated to reviewing the main contributions in the field of synchronization in networks of Kuramoto oscillators. In particular, we provide an overview of the impact of network patterns on the local and global dynamics of coupled phase oscillators. We cover many relevant topics, which encompass a description of the most used analytical approaches and the analysis of several numerical results. Furthermore, we discuss recent developments on variations of the Kuramoto model in networks, including the presence of noise and inertia. The rich potential for applications is discussed for special fields in engineering, neuroscience, physics and Earth science. Finally, we conclude by discussing problems that remain open after the last decade of intensive research on the Kuramoto model and point out some promising directions for future research.
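
    For orientation, the model under review is compact enough to simulate directly; a minimal Euler-integration sketch of networked Kuramoto oscillators, with all parameter values illustrative (coupling conventions vary across the literature):

    ```python
    # Sketch: Kuramoto oscillators on a network,
    #   dtheta_i/dt = omega_i + K * sum_j A_ij * sin(theta_j - theta_i),
    # with the global order parameter r = |mean(exp(i*theta))|.
    import numpy as np

    rng = np.random.default_rng(0)
    N, K, dt, steps = 200, 0.5, 0.01, 5000
    A = (rng.random((N, N)) < 0.05).astype(float)   # random network
    A = np.triu(A, 1); A = A + A.T                  # undirected, no self-loops
    omega = rng.standard_normal(N)                  # natural frequencies
    theta = rng.uniform(0, 2 * np.pi, N)

    for _ in range(steps):
        coupling = (A * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
        theta += dt * (omega + K * coupling)

    r = abs(np.exp(1j * theta).mean())
    print(f"order parameter r = {r:.2f}")           # r ~ 1 means synchrony
    ```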

  11. Passage relevance models for genomics search

    Directory of Open Access Journals (Sweden)

    Frieder Ophir

    2009-03-01

    We present a passage relevance model for integrating syntactic and semantic evidence of biomedical concepts and topics using a probabilistic graphical model. Component models of topics, concepts, terms, and document are represented as potential functions within a Markov Random Field. The probability of a passage being relevant to a biologist's information need is represented as the joint distribution across all potential functions. Relevance model feedback of top ranked passages is used to improve distributional estimates of query concepts and topics in context, and a dimensional indexing strategy is used for efficient aggregation of concept and term statistics. By integrating multiple sources of evidence including dependencies between topics, concepts, and terms, we seek to improve genomics literature passage retrieval precision. Using this model, we are able to demonstrate statistically significant improvements in retrieval precision using a large genomics literature corpus.

  12. Mouse models of ageing and their relevance to disease.

    Science.gov (United States)

    Kõks, Sulev; Dogan, Soner; Tuna, Bilge Guvenc; González-Navarro, Herminia; Potter, Paul; Vandenbroucke, Roosmarijn E

    2016-12-01

    Ageing is a process that gradually increases the organism's vulnerability to death. It affects different biological pathways, and the underlying cellular mechanisms are complex. In view of the growing disease burden of ageing populations, increasing efforts are being invested in understanding the pathways and mechanisms of ageing. We review some mouse models commonly used in studies on ageing, highlight the advantages and disadvantages of the different strategies, and discuss their relevance to disease susceptibility. In addition to addressing the genetics and phenotypic analysis of mice, we discuss examples of models of delayed or accelerated ageing and their modulation by caloric restriction. Copyright © 2016 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  13. A study of ruthenium complexes of some biologically relevant α-N ...

    Indian Academy of Sciences (India)

    Journal of Chemical Sciences, Volume 112, Issue 3. A study of ruthenium complexes of some biologically relevant α-N-heterocyclic ... P Sengupta and S Ghosh, Department of Inorganic Chemistry, Indian Association for the Cultivation of Science, Jadavpur, Calcutta 700 032, India ...

  14. Deterministic ripple-spreading model for complex networks.

    Science.gov (United States)

    Hu, Xiao-Bing; Wang, Ming; Leeson, Mark S; Hines, Evor L; Di Paolo, Ezequiel

    2011-04-01

    This paper proposes a deterministic complex network model, which is inspired by the natural ripple-spreading phenomenon. The motivations and main advantages of the model are the following: (i) The establishment of many real-world networks is a dynamic process, where it is often observed that the influence of a few local events spreads out through nodes and then largely determines the final network topology. Obviously, this dynamic process involves many spatial and temporal factors. By simulating the natural ripple-spreading process, this paper reports a very natural way to set up a spatial and temporal model for such complex networks. (ii) Existing relevant network models are all stochastic models, i.e., with a given input they cannot output a unique topology. In contrast, the proposed ripple-spreading model can uniquely determine the final network topology, while the stochastic feature of complex networks is captured by randomly initializing the ripple-spreading-related parameters. (iii) The proposed model can use an easily manageable number of ripple-spreading-related parameters to precisely describe a network topology, which is more memory-efficient than a traditional adjacency matrix or similar memory-expensive data structures. (iv) The ripple-spreading model has very good potential for both extensions and applications.
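
    A loose, hypothetical sketch of the idea as the abstract describes it (a guess at a minimal variant, not the authors' algorithm): ripples of randomly initialized energy spread from nodes and create links wherever they reach another node before decaying:

    ```python
    # Hypothetical minimal ripple-spreading network generator: given the
    # same seed parameters, the construction below is fully deterministic.
    import numpy as np

    rng = np.random.default_rng(42)
    N = 30
    pos = rng.random((N, 2))              # node locations (random initialization)
    strength = rng.uniform(0.5, 1.0, N)   # initial ripple energy per node
    decay = 2.0                           # energy lost per unit distance

    edges = set()
    for i in range(N):
        d = np.linalg.norm(pos - pos[i], axis=1)
        reached = np.where(strength[i] - decay * d > 0)[0]
        for j in reached:
            if j != i:
                edges.add((min(i, j), max(i, j)))

    print(len(edges), "edges; top degrees:",
          sorted(sum(1 for e in edges if v in e) for v in range(N))[-5:])
    ```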

  15. X-ray and vibrational spectroscopy of manganese complexes relevant to the oxygen-evolving complex of photosynthesis

    Energy Technology Data Exchange (ETDEWEB)

    Visser, Hendrik [Univ. of California, Berkeley, CA (United States)]

    2001-01-01

    Manganese model complexes, relevant to the oxygen-evolving complex (OEC) in photosynthesis, were studied with Mn K-edge X-ray absorption near-edge spectroscopy (XANES), Mn Kβ X-ray emission spectroscopy (XES), and vibrational spectroscopy. A more detailed understanding was obtained of the influence of nuclearity, overall structure, oxidation state, and ligand environment of the Mn atoms on the spectra from these methods. This refined understanding is necessary for improving the interpretation of spectra of the OEC. Mn XANES and Kβ XES were used to study a di-μ-oxo and a mono-μ-oxo dinuclear Mn compound in the (III,III), (III,IV), and (IV,IV) oxidation states. XANES spectra show energy shifts of 0.8 - 2.2 eV for 1-electron oxidation-state changes and 0.4 - 1.8 eV for ligand-environment changes. The shifts observed for Mn XES spectra were approximately 0.21 eV for oxidation-state changes and only approximately 0.04 eV for ligand-environment changes. This indicates that Mn Kβ XES is more sensitive to the oxidation state and less sensitive to the ligand environment of the Mn atoms than XANES. These complementary methods provide information about the oxidation state and the ligand environment of Mn atoms in model compounds and biological systems. A versatile spectroelectrochemical apparatus was designed to aid the interpretation of IR spectra of Mn compounds in different oxidation states. The design, based on an attenuated total reflection device, permits the study of a wide spectral range: 16,700 (600 nm) - 225

  16. Foundations for Streaming Model Transformations by Complex Event Processing.

    Science.gov (United States)

    Dávid, István; Ráth, István; Varró, Dániel

    2018-01-01

    Streaming model transformations represent a novel class of transformations to manipulate models whose elements are continuously produced or modified in high volume and with a rapid rate of change. Executing streaming transformations requires efficient techniques to recognize activated transformation rules over a live model and a potentially infinite stream of events. In this paper, we propose foundations of streaming model transformations by innovatively integrating incremental model query, complex event processing (CEP) and reactive (event-driven) transformation techniques. Complex event processing makes it possible to identify relevant patterns and sequences of events over an event stream. Our approach enables event streams to include model change events which are automatically and continuously populated by incremental model queries. Furthermore, a reactive rule engine carries out transformations on identified complex event patterns. We provide an integrated domain-specific language with precise semantics for capturing complex event patterns and streaming transformations, together with an execution engine, all of which is now part of the Viatra reactive transformation framework. We demonstrate the feasibility of our approach with two case studies: one in an advanced model engineering workflow, and one in the context of on-the-fly gesture recognition.

  17. On sampling and modeling complex systems

    International Nuclear Information System (INIS)

    Marsili, Matteo; Mastromatteo, Iacopo; Roudi, Yasser

    2013-01-01

    The study of complex systems is limited by the fact that only a few variables are accessible for modeling and sampling, and these are not necessarily the most relevant ones for explaining the system's behavior. In addition, empirical data typically undersample the space of possible states. We study a generic framework where a complex system is seen as a system of many interacting degrees of freedom, which are known only in part, that optimize a given function. We show that the underlying distribution with respect to the known variables has the Boltzmann form, with a temperature that depends on the number of unknown variables. In particular, when the influence of the unknown degrees of freedom on the known variables is not too irregular, the temperature decreases as the number of variables increases. This suggests that models can be predictable only when the number of relevant variables is less than a critical threshold. Concerning sampling, we argue that the information that a sample contains on the behavior of the system is quantified by the entropy of the frequency with which different states occur. This allows us to characterize the properties of maximally informative samples: within a simple approximation, the most informative frequency-size distributions have power-law behavior, and Zipf's law emerges at the crossover between the undersampled regime and the regime where the sample contains enough statistics to make inferences on the behavior of the system. These ideas are illustrated in some applications, showing that they can be used to identify relevant variables or to select the most informative representations of data, e.g. in data clustering. (paper)
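
    The sample-information measure described, the entropy of the frequency with which different states occur, is simple to compute; a small sketch on synthetic Zipf-distributed draws, with all specifics assumed for illustration:

    ```python
    # Sketch: entropy of observed state frequencies as a measure of how
    # informative a sample is about the underlying system.
    import numpy as np

    rng = np.random.default_rng(0)
    n_states, n_samples = 1000, 5000
    p = 1.0 / np.arange(1, n_states + 1)       # Zipf-like state probabilities
    p /= p.sum()
    draws = rng.choice(n_states, size=n_samples, p=p)

    counts = np.bincount(draws, minlength=n_states)
    freq = counts[counts > 0] / n_samples      # empirical state frequencies
    H = -(freq * np.log(freq)).sum()           # entropy of the sample
    print(f"sample entropy = {H:.2f} nats "
          f"(max possible = {np.log(n_states):.2f})")
    ```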

  18. Generalized complex geometry, generalized branes and the Hitchin sigma model

    International Nuclear Information System (INIS)

    Zucchini, Roberto

    2005-01-01

    Hitchin's generalized complex geometry has been shown to be relevant in compactifications of superstring theory with fluxes and is expected to lead to a deeper understanding of mirror symmetry. Gualtieri's notion of a generalized complex submanifold seems to be a natural candidate for the description of branes in this context. Recently, we introduced a Batalin-Vilkovisky field-theoretic realization of generalized complex geometry, the Hitchin sigma model, extending the well-known Poisson sigma model. In this paper, exploiting Gualtieri's formalism, we incorporate branes into the model. A detailed study of the boundary conditions obeyed by the world-sheet fields is provided. Finally, it is found that, when branes are present, the classical Batalin-Vilkovisky cohomology contains an extra sector that is related non-trivially to a novel cohomology associated with the branes as generalized complex submanifolds. (author)

  19. Complex mixtures: Relevance of combined exposure to substances at low dose levels

    NARCIS (Netherlands)

    Leeman, W.R.; Krul, L.; Houben, G.F.

    2013-01-01

    Upon analysis of chemically complex food matrices a forest of peaks is likely to be found. Identification of these peaks and concurrent determination of the toxicological relevance upon exposure is very time consuming, expensive and often requires animal studies. Recently, a safety assessment

  20. A Compositional Relevance Model for Adaptive Information Retrieval

    Science.gov (United States)

    Mathe, Nathalie; Chen, James; Lu, Henry, Jr. (Technical Monitor)

    1994-01-01

    There is a growing need for rapid and effective access to information in large electronic documentation systems. Access can be facilitated if information relevant in the current problem solving context can be automatically supplied to the user. This includes information relevant to particular user profiles, tasks being performed, and problems being solved. However most of this knowledge on contextual relevance is not found within the contents of documents, and current hypermedia tools do not provide any easy mechanism to let users add this knowledge to their documents. We propose a compositional relevance network to automatically acquire the context in which previous information was found relevant. The model records information on the relevance of references based on user feedback for specific queries and contexts. It also generalizes such information to derive relevant references for similar queries and contexts. This model lets users filter information by context of relevance, build personalized views of documents over time, and share their views with other users. It also applies to any type of multimedia information. Compared to other approaches, it is less costly and doesn't require any a priori statistical computation, nor an extended training period. It is currently being implemented into the Computer Integrated Documentation system which enables integration of various technical documents in a hypertext framework.
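
    A toy sketch of the mechanism described: relevance feedback stored per (query, context) pair and generalized to similar contexts. The data shapes and the Jaccard similarity are invented for illustration, not taken from the Computer Integrated Documentation system:

    ```python
    # Hypothetical sketch: context-sensitive relevance feedback store with
    # generalization to similar contexts via Jaccard similarity.
    from collections import defaultdict

    store = defaultdict(lambda: defaultdict(float))  # (query, context) -> doc -> score

    def record_feedback(query, context, doc, vote):
        store[(query, frozenset(context))][doc] += vote

    def jaccard(a, b):
        return len(a & b) / len(a | b) if a | b else 0.0

    def relevant_docs(query, context, threshold=0.3):
        ranked = defaultdict(float)
        ctx = frozenset(context)
        for (q, c), docs in store.items():
            if q != query:
                continue
            w = jaccard(ctx, c)                      # generalize across contexts
            if w >= threshold:
                for doc, score in docs.items():
                    ranked[doc] += w * score
        return sorted(ranked, key=ranked.get, reverse=True)

    record_feedback("pump failure", {"task:maintenance", "role:engineer"}, "doc17", 1.0)
    print(relevant_docs("pump failure", {"task:maintenance", "role:technician"}))
    ```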

  1. Model complexity and choice of model approaches for practical simulations of CO2 injection, migration, leakage and long-term fate

    Energy Technology Data Exchange (ETDEWEB)

    Celia, Michael A. [Princeton Univ., NJ (United States)

    2016-12-30

    This report documents the accomplishments achieved during the project titled “Model complexity and choice of model approaches for practical simulations of CO2 injection, migration, leakage and long-term fate” funded by the US Department of Energy, Office of Fossil Energy. The objective of the project was to investigate modeling approaches of various levels of complexity relevant to geologic carbon storage (GCS) modeling, with the goal of establishing guidelines on the choice of modeling approach.

  2. Facing urban complexity : towards cognitive modelling. Part 1. Modelling as a cognitive mediator

    Directory of Open Access Journals (Sweden)

    Sylvie Occelli

    2002-03-01

    Over the last twenty years, complexity issues have been a central theme of enquiry for the modelling field. While contributing both to a critical revisiting of the existing methods and to opening new ways of reasoning, the effectiveness (and sense) of modelling activity was rarely questioned. Acknowledgment of complexity, however, has been a fruitful spur to new and more sophisticated methods intended to improve understanding and advance the geographical sciences. However, its contribution to tackling urban problems in everyday life has been rather poor and mainly limited to rhetorical claims about the potentialities of the new approach. We argue that although complexity has put the classical modelling activity in serious distress, it is disclosing new potentialities, which are still largely unnoticed. These are primarily related to what the authors have called the structural cognitive shift, which involves both the contents and the role of modelling activity. This paper is the first part of a work aimed at illustrating the main features of this shift and discussing its main consequences for the modelling activity. We contend that a most relevant aspect of novelty lies in the new role of modelling as a cognitive mediator, i.e. as a kind of interface between the various components of a modelling process and the external environment to which a model application belongs.

  3. Structural zinc(II) thiolate complexes relevant to the modeling of Ada repair protein: Application toward alkylation reactions

    Directory of Open Access Journals (Sweden)

    Mohamed M. Ibrahim

    2014-11-01

    The Tt(xylyl)Zn(II)-bound perchlorate complex [Tt(xylyl)Zn–OClO3] (1) (Tt(xylyl) = hydrotris[N-xylyl-thioimidazolyl]borate) was used for the synthesis of the zinc(II)-bound ethanethiolate complex [Tt(xylyl)Zn–SCH2CH3] (2) and its hydrogen-bond-containing analog [Tt(xylyl)Zn–SCH2CH2–NH(CO)OC(CH3)3] (3). These thiolate complexes were examined as structural models for the active sites of the Ada repair protein toward methylation reactions. The Zn[S3O] coordination sphere in complex 1 includes three thione donors from the ligand Tt(xylyl) and one oxygen donor from the perchlorate coligand in an ideally tetrahedral arrangement around the zinc center. The average Zn(1)–S(thione) bond length is 2.344 Å, and the Zn(1)–O(1) bond length is 1.917 Å.

  4. The temporal-relevance temporal-uncertainty model of prospective duration judgment.

    Science.gov (United States)

    Zakay, Dan

    2015-12-15

    A model aimed at explaining prospective duration judgments in real-life settings (as well as in the laboratory) is presented. The model is based on the assumption that situational meaning is continuously being extracted by humans' perceptual and cognitive information-processing systems. Time is one of the important dimensions of situational meaning. Based on the situational meaning, a value for Temporal Relevance is set. Temporal Relevance reflects the importance of temporal aspects for enabling adaptive behavior at a specific moment in time. When Temporal Relevance is above a certain threshold, a prospective duration judgment process is evoked automatically. In addition, a search for relevant temporal information takes place, and its outcome determines the level of Temporal Uncertainty, which reflects the degree of knowledge one has regarding the temporal aspects of the task to be performed. The levels of Temporal Relevance and Temporal Uncertainty determine the amount of attentional resources allocated to timing by the executive system. The merit of the model lies in connecting timing processes with the ongoing general information-processing stream. The model rests on findings in various domains which indicate that cognitive relevance and self-relevance are powerful determinants of resource-allocation policy. The feasibility of the model is demonstrated by analyzing various temporal phenomena. Suggestions for further empirical validation of the model are presented. Copyright © 2015 Elsevier Inc. All rights reserved.

  5. Structural Model Error and Decision Relevancy

    Science.gov (United States)

    Goldsby, M.; Lusk, G.

    2017-12-01

    The extent to which climate models can underwrite specific climate policies has long been a contentious issue. Skeptics frequently deny that climate models are trustworthy in an attempt to undermine climate action, whereas policy makers often desire information that exceeds the capabilities of extant models. While not skeptics, a group of mathematicians and philosophers [Frigg et al. (2014)] recently argued that even tiny differences between the structure of a complex dynamical model and its target system can lead to dramatic predictive errors, possibly resulting in disastrous consequences when policy decisions are based upon those predictions. They call this result the Hawkmoth effect (HME), and seemingly use it to rebuke right-wing proposals to forgo mitigation in favor of adaptation. However, a vigorous debate has emerged between Frigg et al. on one side and another philosopher-mathematician pair [Winsberg and Goodwin (2016)] on the other. On one hand, Frigg et al. argue that their result shifts the burden to climate scientists to demonstrate that their models do not fall prey to the HME. On the other hand, Winsberg and Goodwin suggest that arguments like those asserted by Frigg et al. can be, if taken seriously, "dangerous": they fail to consider the variety of purposes for which models can be used, and thus too hastily undermine large swaths of climate science. They put the burden back on Frigg et al. to show that their result has any effect on climate science. This paper seeks to attenuate this debate by establishing an irenic middle position; we find that there is more agreement between the sides than it first seems. We distinguish a `decision standard' from a `burden of proof', which helps clarify the contributions to the debate from both sides. In making this distinction, we argue that scientists bear the burden of assessing the consequences of HME, but that the standard Frigg et al. adopt for decision relevancy is too strict.

  6. XAS Investigation of bio-relevant cobalt complexes in aqueous media

    International Nuclear Information System (INIS)

    Bresson, C.; Lamouroux, C.; Esnouf, S.; Solari, P.L.; Den Auwer, C.

    2006-01-01

    Cobalt is an essential element of biological cycles involved in numerous metallo-biomolecules, but it becomes a toxic element at high concentration, or a radio-toxic element because of its use in the nuclear industry. 'Molecular speciation' in biological media is an essential prerequisite to evaluating its chemical behaviour as well as its toxic or beneficial effects. In this scheme, we have focused on the coordination properties of the thiol-containing amino acid cysteine (Cys) and the pseudo-peptide N-(2-mercapto-propionyl)glycine (MPG) towards the Co2+ cation in aqueous media. XAS at the Co K edge and traditional spectroscopic techniques have been coupled in order to structurally characterize the cobalt coordination sphere. The oxidation states and geometries of the bis- and tris-cysteinato Co(III) complexes are in agreement with the literature data. In addition, the bond lengths between the metallic centre and the donor atoms have been determined. The structure of a new dimeric N-(2-mercapto-propionyl)glycinato Co(II) complex in solution is also reported. The coordination of MPG to Co(II) through the thiolate and carboxylate functions is ascertained. This work provides fundamental structural information about bio-relevant complexes of cobalt, which will contribute to our understanding of the chemical behaviour and the biological role of this radionuclide. (authors)

  7. Deciphering the complexity of acute inflammation using mathematical models.

    Science.gov (United States)

    Vodovotz, Yoram

    2006-01-01

    Various stresses elicit an acute, complex inflammatory response, leading to healing but sometimes also to organ dysfunction and death. We constructed both equation-based models (EBM) and agent-based models (ABM) of various degrees of granularity--which encompass the dynamics of relevant cells, cytokines, and the resulting global tissue dysfunction--in order to begin to unravel these inflammatory interactions. The EBMs describe and predict various features of septic shock and trauma/hemorrhage (including the response to anthrax, preconditioning phenomena, and irreversible hemorrhage) and were used to simulate anti-inflammatory strategies in clinical trials. The ABMs that describe the interrelationship between inflammation and wound healing yielded insights into intestinal healing in necrotizing enterocolitis, vocal fold healing during phonotrauma, and skin healing in the setting of diabetic foot ulcers. Modeling may help in understanding the complex interactions among the components of inflammation and response to stress, and therefore aid in the development of novel therapies and diagnostics.

  8. Modelling low energy electron and positron tracks in biologically relevant media

    International Nuclear Information System (INIS)

    Blanco, F.; Munoz, A.; Almeida, D.; Ferreira da Silva, F.; Limao-Vieira, P.; Fuss, M.C.; Sanz, A.G.; Garcia, G.

    2013-01-01

    This colloquium describes an approach to incorporate into radiation damage models the effect of low- and intermediate-energy (0-100 eV) electrons and positrons slowing down in biologically relevant materials (water and representative biomolecules). The core of the modelling procedure is a C++ computing programme named 'Low Energy Particle Track Simulation (LEPTS)', which is compatible with available general-purpose Monte Carlo packages. Input parameters are carefully selected from theoretical and experimental cross section data and energy loss distribution functions. Data sources used for this purpose are reviewed, showing examples of electron and positron cross section and energy loss data for interactions with different media of increasing complexity: atoms, molecules, clusters and condensed matter. Finally, we show how such a model can be used to develop an effective dosimetric tool at the molecular level (i.e. nanodosimetry). Recent experimental developments to study the fragmentation induced in biological material by charge transfer from neutrals and negative ions are also included. (authors)

  9. On relevant boundary perturbations of unitary minimal models

    International Nuclear Information System (INIS)

    Recknagel, A.; Roggenkamp, D.; Schomerus, V.

    2000-01-01

    We consider unitary Virasoro minimal models on the disk with Cardy boundary conditions and discuss deformations by certain relevant boundary operators, analogous to tachyon condensation in string theory. Concentrating on the least relevant boundary field, we can perform a perturbative analysis of renormalization group fixed points. We find that the systems always flow towards stable fixed points which admit no further (non-trivial) relevant perturbations. The new conformal boundary conditions are in general given by superpositions of 'pure' Cardy boundary conditions

  10. Thermochemical data for environmentally-relevant elements

    International Nuclear Information System (INIS)

    Markich, S.J.; Brown, P.L.

    1999-01-01

    This study provides an extensive stability constant (log K) database suitable for calculating the speciation of selected environmentally-relevant elements (H, Na, K, Ca, Mg, Fe, Mn, U, Al, Pb, Zn, Cu and Cd) in an aqueous system, where a model fulvic acid (comprising aspartic, citric, malonic, salicylic and tricarballylic acids) is used to simulate metal binding by dissolved organic material. Stability constants for inorganic metal complexes and minerals were selected primarily from critical literature compilations and/or reviews. In contrast, few critically evaluated data were available for metal complexes with aspartic, citric, malonic, salicylic and tricarballylic acids. Consequently, data from original research articles were carefully evaluated and compiled as part of the study, following defined selection criteria. To meet the objective of compiling a comprehensive and reliable database of stability constants, all relevant equilibria and species, ranging from simple binary metal complexes to more complex ternary and even quaternary metal complexes, were included where possible. In addition to the selection of stability constants from empirical sources, estimates of stability constants were made when this could be done reliably, based on the unified theory of metal ion complexation and/or linear free energy relationships. The stability constants are given as common logarithms (log10) in the form required by the HARPHRQ geochemical code and refer to the standard state, i.e. 298.15 K (25 deg C), 10^5 Pa (1 atm) and, for all species, infinite dilution (ionic strength = 0 mol L^-1). In addition to the compilation of stability constant data, an overview is given of geochemical speciation modelling in aqueous systems and available conceptual models of metal binding by humic substances. (authors)
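
    For context, the tabulated log K values enter speciation calculations through mass-action relations; a one-complex worked example with illustrative numbers, not values from the database:

    ```latex
    % Mass-action relation for a 1:1 complex M + L <-> ML (infinite dilution):
    % the database supplies log K for each such equilibrium.
    % Worked example with illustrative numbers: log K = 5, free [M] = 1e-6 M,
    % free [L] = 1e-4 M gives [ML] = 1e5 * 1e-6 * 1e-4 = 1e-5 M, i.e. the
    % complexed form exceeds the free metal tenfold.
    \[
      K = \frac{[\mathrm{ML}]}{[\mathrm{M}]\,[\mathrm{L}]}, \qquad
      [\mathrm{ML}] = 10^{\log K}\,[\mathrm{M}]\,[\mathrm{L}]
    \]
    ```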

  11. Modelling small-angle scattering data from complex protein-lipid systems

    DEFF Research Database (Denmark)

    Kynde, Søren Andreas Røssell

    This thesis consists of two parts. The first part is divided into five chapters. Chapter 1 gives a general introduction to the bio-molecular systems that have been studied. These are membrane proteins and their lipid environments in the form of phospholipid nanodiscs. Membrane proteins ... the techniques very well suited for the study of the nanodisc system. Chapter 3 explains two different modelling approaches that can be used in the analysis of small-angle scattering data from lipid-protein complexes. These are the continuous approach, where the system of interest is modelled as a few regular ... combine the benefits of each of the methods and give unique structural information about relevant bio-molecular complexes in solution. Chapter 4 describes the work behind a proposal of a small-angle neutron scattering instrument for the European Spallation Source under construction in Lund. The instrument ...

  12. Structural characterisation of medically relevant protein assemblies by integrating mass spectrometry with computational modelling.

    Science.gov (United States)

    Politis, Argyris; Schmidt, Carla

    2018-03-20

    Structural mass spectrometry, with its various techniques, is a powerful tool for the structural elucidation of medically relevant protein assemblies. It delivers information on the composition, stoichiometries, interactions and topologies of these assemblies. Most importantly, it can deal with heterogeneous mixtures and assemblies, which makes it unique among the conventional structural techniques. In this review we summarise recent advances and challenges in structural mass spectrometric techniques. We describe how the combination of the different mass spectrometry-based methods with computational strategies enables structural models at molecular levels of resolution. These models hold significant potential for helping us characterize the function of protein assemblies related to human health and disease. In this review we summarise the techniques of structural mass spectrometry often applied when studying protein-ligand complexes. We exemplify these techniques through recent examples from the literature that helped in the understanding of medically relevant protein assemblies. We further provide a detailed introduction into the various computational approaches that can be integrated with these mass spectrometric techniques. Last but not least, we discuss case studies that integrated mass spectrometry and computational modelling approaches and yielded models of medically important protein assembly states such as fibrils and amyloids. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.

  13. Knowledge-based inspection: modelling complex processes with the integrated Safeguards Modelling Method (iSMM)

    International Nuclear Information System (INIS)

    Abazi, F.

    2011-01-01

    The increased level of complexity in almost every discipline and operation today raises the demand for knowledge in order to successfully run an organization, whether to generate profit or to attain a non-profit mission. The traditional way of transferring knowledge to information systems rich in data structures and complex algorithms continues to hinder the ability to swiftly turn concepts into operations. Diagrammatic modelling, commonly applied in engineering to represent concepts or reality, remains an excellent way of converging knowledge from domain experts. The nuclear verification domain is ever more a matter of great importance to world safety and security. Demand for knowledge about the nuclear processes and verification activities used to offset potential misuse of nuclear technology will intensify with the growth of the subject technology. This doctoral thesis contributes a model-based approach for representing complex processes such as nuclear inspections. The work presented also applies to other domains characterized by knowledge-intensive and complex processes. Based on the characteristics of a complex process, a conceptual framework was established as the theoretical basis for creating a number of modelling languages to represent the domain. The integrated Safeguards Modelling Method (iSMM) is formalized through an integrated meta-model. The diagrammatic modelling languages represent the verification domain and relevant nuclear verification aspects. Such a meta-model conceptualizes the relation between practices of process management, knowledge management and domain-specific verification principles. This fusion is considered necessary in order to create quality processes. The study also extends the formalization achieved through the meta-model by contributing a formalization language based on Pattern Theory. Through the use of graphical and mathematical constructs of the theory, process structures are formalized enhancing

  14. Stress and adaptation : Toward ecologically relevant animal models

    NARCIS (Netherlands)

    Koolhaas, Jaap M.; Boer, Sietse F. de; Buwalda, Bauke

    Animal models have contributed considerably to the current understanding of the mechanisms underlying the role of stress in health and disease. Despite the progress already made, much more can be achieved by more carefully exploiting the shared biology of animals and humans, using ecologically relevant models.

  15. Modeling Complex Time Limits

    Directory of Open Access Journals (Sweden)

    Oleg Svatos

    2013-01-01

    In this paper we analyze the complexity of time limits as found especially in the regulated processes of public administration. First we review the most popular process modeling languages. We define an example scenario based on current Czech legislation, which is then captured in the discussed process modeling languages. The analysis shows that contemporary process modeling languages support capturing of time limits only partially. This causes trouble for analysts and unnecessary complexity in the models. Given the unsatisfying results of the contemporary process modeling languages, we analyze the complexity of time limits in greater detail and outline the lifecycles of a time limit using the multiple dynamic generalizations pattern. As an alternative to the popular process modeling languages we present the PSD process modeling language, which supports the defined lifecycles of a time limit natively and therefore allows keeping the models simple and easy to understand.

  16. Modeling Complex Systems

    CERN Document Server

    Boccara, Nino

    2010-01-01

    Modeling Complex Systems, 2nd Edition, explores the process of modeling complex systems, providing examples from such diverse fields as ecology, epidemiology, sociology, seismology, and economics. It illustrates how models of complex systems are built and provides indispensable mathematical tools for studying their dynamics. This vital introductory text is useful for advanced undergraduate students in various scientific disciplines, and serves as an important reference book for graduate students and young researchers. This enhanced second edition includes: recent research results and bibliographic references; extra footnotes which provide biographical information on cited scientists who have made significant contributions to the field; new and improved worked-out examples to aid a student's comprehension of the content; and exercises to challenge the reader and complement the material. Nino Boccara is also the author of Essentials of Mathematica: With Applications to Mathematics and Physics (Springer, 2007).

  17. The effects of model complexity and calibration period on groundwater recharge simulations

    Science.gov (United States)

    Moeck, Christian; Van Freyberg, Jana; Schirmer, Mario

    2017-04-01

    A significant number of groundwater recharge models exist that vary in terms of complexity (i.e., structure and parametrization). Typically, model selection and conceptualization is very subjective and can be a key source of uncertainty in recharge simulations. Another source of uncertainty is the implicit assumption that model parameters, calibrated over historical periods, are also valid for the simulation period. To the best of our knowledge there is no systematic evaluation of the effect of model complexity and calibration strategy on the performance of recharge models. To address this gap, we utilized a long-term recharge data set (20 years) from a large weighing lysimeter. We performed a differential split-sample test with four groundwater recharge models that vary in terms of complexity. They were calibrated using six calibration periods with climatically contrasting conditions in a constrained Monte Carlo approach. Despite the climatically contrasting conditions, all models performed similarly well during calibration. However, during validation a clear effect of model structure on model performance was evident. The more complex, physically-based models predicted recharge best, even when the calibration and prediction periods had very different climatic conditions. In contrast, the more simplistic soil-water balance and lumped models performed poorly under such conditions. For these models we found a strong dependency on the chosen calibration period. In particular, our analysis showed that this can have relevant implications when recharge models are used as decision-making tools in a broad range of applications (e.g. water availability, climate change impact studies, water resource management, etc.).
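
    A schematic of the constrained Monte Carlo calibration and split-sample validation described; the one-parameter bucket model and all limits below are invented for illustration:

    ```python
    # Hypothetical sketch: calibrate a one-bucket soil-water balance
    # recharge model by constrained Monte Carlo, then validate on a
    # different period (differential split-sample idea).
    import numpy as np

    rng = np.random.default_rng(0)

    def recharge(P, ET, smax):
        """Daily bucket model: storage above capacity smax spills to recharge."""
        s, out = 0.0, np.zeros_like(P)
        for t in range(len(P)):
            s = max(s + P[t] - ET[t], 0.0)
            out[t] = max(s - smax, 0.0)
            s = min(s, smax)
        return out

    # Synthetic forcing and "observed" recharge for two contrasting periods
    P = rng.gamma(0.6, 6.0, 730); ET = np.full(730, 1.5)
    obs = recharge(P, ET, smax=80.0)
    cal, val = slice(0, 365), slice(365, 730)

    best, best_rmse = None, np.inf
    for smax in rng.uniform(20.0, 200.0, 2000):      # constrained MC sampling
        rmse = np.sqrt(np.mean((recharge(P[cal], ET[cal], smax) - obs[cal])**2))
        if rmse < best_rmse:
            best, best_rmse = smax, rmse

    val_rmse = np.sqrt(np.mean((recharge(P[val], ET[val], best) - obs[val])**2))
    print(f"calibrated smax={best:.0f} mm, validation RMSE={val_rmse:.2f} mm/d")
    ```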

  18. Modeling Complex Systems

    International Nuclear Information System (INIS)

    Schreckenberg, M

    2004-01-01

    This book by Nino Boccara presents a compilation of model systems commonly termed 'complex'. It starts with a definition of the systems under consideration and how to build up a model to describe the complex dynamics. The subsequent chapters are devoted to various categories of mean-field type models (differential and recurrence equations, chaos) and of agent-based models (cellular automata, networks and power-law distributions). Each chapter is supplemented by a number of exercises and their solutions. The table of contents looks a little arbitrary, but the author took the most prominent model systems investigated over the years (and up until now there has been no unified theory covering the various aspects of complex dynamics). The model systems are explained by looking at a number of applications in various fields. The book is written as a textbook for interested students as well as a comprehensive reference for experts. It is an ideal source for topics to be presented in a lecture on the dynamics of complex systems. This is the first book on this 'wide' topic and I have long awaited such a book (in fact I planned to write it myself but this is much better than I could ever have written it!). Only section 6 on cellular automata is a little too limited to the author's point of view, and one would have expected more about the famous Domany-Kinzel model (and more accurate citation!). In my opinion this is one of the best textbooks published during the last decade and even experts can learn a lot from it. Hopefully there will be an updated edition after, say, five years, since this field is growing so quickly. The price is too high for students but this, unfortunately, is the normal case today. Nevertheless I think it will be a great success! (book review)

  19. Proceedings of the meeting on computational and experimental studies for modeling of radionuclide migration in complex aquatic ecosystems

    International Nuclear Information System (INIS)

    Matsunaga, Takeshi; Hakanson, Lars

    2010-09-01

    The Research Group for Environmental Science of JAEA held a meeting on computational and experimental studies for modeling of radionuclide migration in complex aquatic ecosystems during November 16-20, 2009. The aim was to discuss the relevance of various computational and experimental approaches to that modeling. The meeting was attended by a Swedish researcher, Prof. Dr. Lars Hakanson of Uppsala University. It included a joint talk at the Institute for Environmental Sciences, in addition to a field and facility survey of the JNFL commercial reprocessing plant located in Rokkasho, Aomori. The meeting demonstrated that it is crucial 1) to make a model structure strictly relevant to the target objectives of a study and 2) to account for inherent fluctuations in target values in nature through qualitative parameterization. Moreover, it was confirmed that there can be multiple different modeling approaches (e.g. detailed or simplified) with relevance for the objectives of a study. These discussions should be considered in model integration for complex aquatic ecosystems consisting of catchments, rivers, lakes and coastal oceans, which can interact with the atmosphere. This report compiles the research subjects and lectures presented at the meeting with the associated discussions. The 10 presented papers are indexed individually. (J.P.N.)

  20. Application of all relevant feature selection for failure analysis of parameter-induced simulation crashes in climate models

    Science.gov (United States)

    Paja, W.; Wrzesień, M.; Niemiec, R.; Rudnicki, W. R.

    2015-07-01

    Climate models are extremely complex pieces of software. They reflect the best knowledge of the physical components of the climate; nevertheless, they contain several parameters which are too weakly constrained by observations and can potentially lead to a crash of the simulation. Recently, a study by Lucas et al. (2013) showed that machine learning methods can be used to predict which combinations of parameters lead to crashes, and hence which processes described by these parameters need refined analysis. In the current study we reanalyse the dataset used in that research with a different methodology. We confirm the main conclusion of the original study concerning the suitability of machine learning for the prediction of crashes. We show that only three of the eight parameters indicated in the original study as relevant for crash prediction are indeed strongly relevant, three others are relevant but redundant, and two are not relevant at all. We also show that the variance due to the split of data between training and validation sets has a large influence both on the accuracy of predictions and on the relative importance of variables; hence only a cross-validated approach can deliver a robust estimate of performance and of variable relevance.
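
    A small sketch of the cross-validated relevance assessment described, using scikit-learn's random forest over repeated splits; the synthetic data stands in for the climate-parameter dataset:

    ```python
    # Sketch: estimate variable relevance and its split-to-split variance
    # with a random forest across repeated train/validation splits.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import StratifiedShuffleSplit

    rng = np.random.default_rng(0)
    n, p = 540, 8                                  # mimic: 8 model parameters
    X = rng.random((n, p))
    y = ((X[:, 0] + X[:, 1] * X[:, 2]) > 1.0).astype(int)   # crash / no crash

    splitter = StratifiedShuffleSplit(n_splits=20, test_size=0.2, random_state=0)
    acc, imp = [], []
    for tr, te in splitter.split(X, y):
        rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X[tr], y[tr])
        acc.append(rf.score(X[te], y[te]))
        imp.append(rf.feature_importances_)

    imp = np.array(imp)
    print(f"accuracy: {np.mean(acc):.2f} +/- {np.std(acc):.2f}")
    for j in range(p):
        print(f"param {j}: importance {imp[:, j].mean():.3f} "
              f"(sd {imp[:, j].std():.3f})")   # large sd -> split-dependent relevance
    ```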

  1. Complex Environmental Data Modelling Using Adaptive General Regression Neural Networks

    Science.gov (United States)

    Kanevski, Mikhail

    2015-04-01

    The research deals with an adaptation and application of Adaptive General Regression Neural Networks (GRNN) to high-dimensional environmental data. GRNN [1,2,3] are efficient modelling tools for both spatial and temporal data and are based on nonparametric kernel methods closely related to the classical Nadaraya-Watson estimator. Adaptive GRNN, using anisotropic kernels, can also be applied to feature selection tasks when working with high-dimensional data [1,3]. In the present research, Adaptive GRNN are used to study geospatial data predictability and relevant feature selection using both simulated and real data case studies. The original raw data were either three-dimensional monthly precipitation data or monthly wind speeds embedded into a 13-dimensional space constructed from geographical coordinates and geo-features calculated from a digital elevation model. GRNN were applied in two different ways: 1) adaptive GRNN with the resulting list of features ordered according to their relevance; and 2) adaptive GRNN applied to evaluate all possible models N [in the case of wind fields, N=(2^13 - 1)=8191] and rank them according to the cross-validation error. In both cases training was carried out using a leave-one-out procedure. An important result of the study is that the set of the most relevant features depends on the month (a strong seasonal effect) and the year. The predictability of precipitation and wind field patterns, estimated using the cross-validation and testing errors of raw and shuffled data, was studied in detail. The results of both approaches were qualitatively and quantitatively compared. In conclusion, Adaptive GRNN, with their ability to select features and efficiently model complex high-dimensional data, can be widely used in automatic/on-line mapping and as an integrated part of environmental decision support systems. 1. Kanevski M., Pozdnoukhov A., Timonin V. Machine Learning for Spatial Environmental Data. Theory, applications and software. EPFL Press
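
    The estimator behind GRNN is compact; a minimal anisotropic Nadaraya-Watson sketch with a leave-one-out error for one candidate bandwidth vector (the adaptive bandwidth search itself is omitted, and all data are synthetic):

    ```python
    # Sketch: General Regression Neural Network = Nadaraya-Watson kernel
    # regression; one bandwidth per input dimension makes it anisotropic,
    # and a very large bandwidth effectively switches a feature off.
    import numpy as np

    def grnn_predict(Xtr, ytr, Xte, sigmas):
        # squared distances scaled per dimension by the anisotropic bandwidths
        d2 = (((Xte[:, None, :] - Xtr[None, :, :]) / sigmas) ** 2).sum(-1)
        w = np.exp(-0.5 * d2)                       # Gaussian kernel weights
        return (w @ ytr) / w.sum(axis=1)

    rng = np.random.default_rng(0)
    X = rng.random((200, 3))                        # e.g. lon, lat, elevation
    y = np.sin(4 * X[:, 0]) + 0.1 * rng.standard_normal(200)  # only dim 0 matters

    # leave-one-out error for one candidate bandwidth vector
    sig = np.array([0.1, 10.0, 10.0])               # large sigma ~ feature ignored
    loo = [(grnn_predict(np.delete(X, i, 0), np.delete(y, i), X[i:i+1], sig)[0]
            - y[i]) ** 2 for i in range(len(y))]
    print(f"LOO MSE = {np.mean(loo):.4f}")
    ```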

  2. Polycomb repressive complex 2 regulates MiR-200b in retinal endothelial cells: potential relevance in diabetic retinopathy.

    Directory of Open Access Journals (Sweden)

    Michael Anthony Ruiz

    Full Text Available Glucose-induced augmented vascular endothelial growth factor (VEGF) production is a key event in diabetic retinopathy. We have previously demonstrated that downregulation of miR-200b increases VEGF, mediating structural and functional changes in the retina in diabetes. However, the mechanisms regulating miR-200b in diabetes are not known. The histone methyltransferase complex Polycomb Repressive Complex 2 (PRC2) has been shown to repress miRNAs in neoplastic processes. We hypothesized that, in diabetes, PRC2 represses miR-200b through its histone H3 lysine-27 trimethylation mark. We show that human retinal microvascular endothelial cells exposed to high levels of glucose regulate miR-200b repression through histone methylation and that inhibition of PRC2 increases miR-200b while reducing VEGF. Furthermore, retinal tissue from animal models of diabetes showed increased expression of major PRC2 components, demonstrating in vivo relevance. This research established a repressive relationship between PRC2 and miR-200b, providing evidence of a novel mechanism of miRNA regulation through histone methylation.

  3. Nonparametric Bayesian Modeling of Complex Networks

    DEFF Research Database (Denmark)

    Schmidt, Mikkel Nørgaard; Mørup, Morten

    2013-01-01

    Modeling structure in complex networks using Bayesian nonparametrics makes it possible to specify flexible model structures and infer the adequate model complexity from the observed data. This article provides a gentle introduction to nonparametric Bayesian modeling of complex networks: Using an infinite mixture model as running example, we go through the steps of deriving the model as an infinite limit of a finite parametric model, inferring the model parameters by Markov chain Monte Carlo, and checking the model's fit and predictive performance. We explain how advanced nonparametric models...
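    The "infinite limit of a finite parametric model" step has a compact generative picture: as the number of mixture components grows, the induced partition of nodes follows the Chinese restaurant process (CRP). A minimal Python sampler of that prior (illustrative only; the article's inference is full MCMC over network models):

        import numpy as np

        def crp_partition(n_nodes, alpha, seed=0):
            """Sample a node partition from a CRP with concentration alpha."""
            rng = np.random.default_rng(seed)
            assignments, counts = [0], [1]            # first node opens cluster 0
            for _ in range(1, n_nodes):
                probs = np.array(counts + [alpha], dtype=float)
                k = rng.choice(len(probs), p=probs / probs.sum())
                if k == len(counts):
                    counts.append(1)                  # open a new cluster
                else:
                    counts[k] += 1
                assignments.append(k)
            return assignments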

  4. Clinical Complexity in Medicine: A Measurement Model of Task and Patient Complexity.

    Science.gov (United States)

    Islam, R; Weir, C; Del Fiol, G

    2016-01-01

    Complexity in medicine needs to be reduced to simple components in a way that is comprehensible to researchers and clinicians. Few studies in the current literature propose a measurement model that addresses both task and patient complexity in medicine. The objective of this paper is to develop an integrated approach to understand and measure clinical complexity by incorporating both task and patient complexity components focusing on the infectious disease domain. The measurement model was adapted and modified for the healthcare domain. Three clinical infectious disease teams were observed, audio-recorded and transcribed. Each team included an infectious diseases expert, one infectious diseases fellow, one physician assistant and one pharmacy resident fellow. The transcripts were parsed and the authors independently coded complexity attributes. This baseline measurement model of clinical complexity was modified in an initial set of coding processes and further validated in a consensus-based iterative process that included several meetings and email discussions by three clinical experts from diverse backgrounds from the Department of Biomedical Informatics at the University of Utah. Inter-rater reliability was calculated using Cohen's kappa. The proposed clinical complexity model consists of two separate components. The first is a clinical task complexity model with 13 clinical complexity-contributing factors and 7 dimensions. The second is the patient complexity model with 11 complexity-contributing factors and 5 dimensions. The measurement model for complexity encompassing both task and patient complexity will be a valuable resource for future researchers and industry to measure and understand complexity in healthcare.
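    Cohen's kappa, used above for inter-rater reliability, corrects raw agreement for the agreement expected by chance: kappa = (p_o - p_e) / (1 - p_e). A self-contained Python example with made-up complexity codes:

        from collections import Counter

        def cohens_kappa(coder1, coder2):
            n = len(coder1)
            p_o = sum(a == b for a, b in zip(coder1, coder2)) / n   # observed agreement
            c1, c2 = Counter(coder1), Counter(coder2)
            p_e = sum((c1[c] / n) * (c2[c] / n) for c in set(c1) | set(c2))
            return (p_o - p_e) / (1 - p_e)

        # Two hypothetical coders labelling five transcript segments:
        print(cohens_kappa(list("AABBC"), list("ABBBC")))  # 0.6875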

  5. Complex matrix model duality

    International Nuclear Information System (INIS)

    Brown, T.W.

    2010-11-01

    The same complex matrix model calculates both tachyon scattering for the c=1 non-critical string at the self-dual radius and certain correlation functions of half-BPS operators in N=4 super-Yang-Mills. It is dual to another complex matrix model where the couplings of the first model are encoded in the Kontsevich-like variables of the second. The duality between the theories is mirrored by the duality of their Feynman diagrams. Analogously to the Hermitian Kontsevich-Penner model, the correlation functions of the second model can be written as sums over discrete points in subspaces of the moduli space of punctured Riemann surfaces. (orig.)

  6. Complex matrix model duality

    Energy Technology Data Exchange (ETDEWEB)

    Brown, T.W.

    2010-11-15

    The same complex matrix model calculates both tachyon scattering for the c=1 non-critical string at the self-dual radius and certain correlation functions of half-BPS operators in N=4 super-Yang-Mills. It is dual to another complex matrix model where the couplings of the first model are encoded in the Kontsevich-like variables of the second. The duality between the theories is mirrored by the duality of their Feynman diagrams. Analogously to the Hermitian Kontsevich-Penner model, the correlation functions of the second model can be written as sums over discrete points in subspaces of the moduli space of punctured Riemann surfaces. (orig.)

  7. Determinants of dermal exposure relevant for exposure modelling in regulatory risk assessment.

    Science.gov (United States)

    Marquart, J; Brouwer, D H; Gijsbers, J H J; Links, I H M; Warren, N; van Hemmen, J J

    2003-11-01

    Risk assessment of chemicals requires assessment of the exposure levels of workers. In the absence of adequate specific measured data, models are often used to estimate exposure levels. For dermal exposure only a few models exist, none of which have been validated externally. In the scope of a large European research programme, an analysis of potential dermal exposure determinants was made based on the available studies and models and on the expert judgement of the authors of this publication. Only a few potential determinants appear to have been studied in depth. Several studies have included clusters of determinants in vaguely defined parameters, such as 'task' or 'cleaning and maintenance of clothing'. Other studies include several highly correlated parameters, such as 'amount of product handled', 'duration of task' and 'area treated', and separation of these parameters to study their individual influence is not possible. However, based on the available information, a number of determinants could clearly be defined as proven or highly plausible determinants of dermal exposure in one or more exposure situations. This information was combined with expert judgement on the scientific plausibility of the influence of parameters that have not been extensively studied and on the possibilities to gather relevant information during a risk assessment process. The result of this effort is a list of determinants relevant for dermal exposure models in the scope of regulatory risk assessment. The determinants have been divided into the major categories 'substance and product characteristics', 'task done by the worker', 'process technique and equipment', 'exposure control measures', 'worker characteristics and habits' and 'area and situation'. To account for the complex nature of dermal exposure processes, a further subdivision was made into the three major processes 'direct contact', 'surface contact' and 'deposition'.

  8. The value and cost of complexity in predictive modelling: role of tissue anisotropic conductivity and fibre tracts in neuromodulation.

    Science.gov (United States)

    Shahid, Syed Salman; Bikson, Marom; Salman, Humaira; Wen, Peng; Ahfock, Tony

    2014-06-01

    Computational methods are increasingly used to optimize transcranial direct current stimulation (tDCS) dose strategies and yet complexities of existing approaches limit their clinical access. Since predictive modelling indicates the relevance of subject/pathology based data and hence the need for subject specific modelling, the incremental clinical value of increasingly complex modelling methods must be balanced against the computational and clinical time and costs. For example, the incorporation of multiple tissue layers and measured diffusion tensor (DTI) based conductivity estimates increase model precision but at the cost of clinical and computational resources. Costs related to such complexities aggregate when considering individual optimization and the myriad of potential montages. Here, rather than considering if additional details change current-flow prediction, we consider when added complexities influence clinical decisions. Towards developing quantitative and qualitative metrics of value/cost associated with computational model complexity, we considered field distributions generated by two 4 × 1 high-definition montages (m1 = 4 × 1 HD montage with anode at C3 and m2 = 4 × 1 HD montage with anode at C1) and a single conventional (m3 = C3-Fp2) tDCS electrode montage. We evaluated statistical methods, including residual error (RE) and relative difference measure (RDM), to consider the clinical impact and utility of increased complexities, namely the influence of skull, muscle and brain anisotropic conductivities in a volume conductor model. Anisotropy modulated current-flow in a montage and region dependent manner. However, significant statistical changes, produced within montage by anisotropy, did not change qualitative peak and topographic comparisons across montages. Thus for the examples analysed, clinical decision on which dose to select would not be altered by the omission of anisotropic brain conductivity. Results illustrate the need to rationally
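    The two statistical comparison metrics named in the abstract have widely used forms, sketched below in Python (numpy assumed; the paper's exact definitions may differ in normalisation). RE measures overall magnitude-plus-shape error, while RDM compares field topographies after normalising out magnitude:

        import numpy as np

        def relative_error(e1, e2):
            """RE between two field vectors, relative to the reference e1."""
            return np.linalg.norm(e1 - e2) / np.linalg.norm(e1)

        def rdm(e1, e2):
            """Relative difference measure: 0 (same topography) to 2 (opposite)."""
            return np.linalg.norm(e1 / np.linalg.norm(e1) - e2 / np.linalg.norm(e2))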

  9. Towards increased policy relevance in energy modeling

    Energy Technology Data Exchange (ETDEWEB)

    Worrell, Ernst; Ramesohl, Stephan; Boyd, Gale

    2003-07-29

    Historically, most energy models were reasonably equipped to assess the impact of a subsidy or change in taxation, but are often insufficient to assess the impact of more innovative policy instruments. We evaluate the models used to assess future energy use, focusing on industrial energy use. We explore approaches to engineering-economic analysis that could help improve the realism and policy relevance of engineering-economic modeling frameworks. We also explore solutions to strengthen the policy usefulness of engineering-economic analysis that can be built from a framework of multi-disciplinary cooperation. We focus on the so-called 'engineering-economic' (or 'bottom-up') models, as they include the amount of detail that is commonly needed to model policy scenarios. We identify research priorities for the modeling framework, technology representation in models, policy evaluation and modeling of decision-making behavior.

  10. Building Better Ecological Machines: Complexity Theory and Alternative Economic Models

    Directory of Open Access Journals (Sweden)

    Jess Bier

    2016-12-01

    Full Text Available Computer models of the economy are regularly used to predict economic phenomena and set financial policy. However, the conventional macroeconomic models are currently being reimagined after they failed to foresee the current economic crisis, the outlines of which began to be understood only in 2007-2008. In this article we analyze the most prominent example of this reimagining: agent-based models (ABMs). ABMs are an influential alternative to standard economic models, and they are one focus of complexity theory, a discipline that is a more open successor to the conventional chaos and fractal modeling of the 1990s. The modelers who create ABMs claim that their models depict markets as ecologies, and that they are more responsive than conventional models that depict markets as machines. We challenge this presentation, arguing instead that recent modeling efforts amount to the creation of models as ecological machines. Our paper aims to contribute to an understanding of the organizing metaphors of macroeconomic models, which we argue is relevant conceptually and politically, e.g., when models are used for regulatory purposes.

  11. Simulation in Complex Modelling

    DEFF Research Database (Denmark)

    Nicholas, Paul; Ramsgaard Thomsen, Mette; Tamke, Martin

    2017-01-01

    This paper will discuss the role of simulation in extended architectural design modelling. As a framing paper, the aim is to present and discuss the role of integrated design simulation and feedback between design and simulation in a series of projects under the Complex Modelling framework. Complex ... performance, engage with high degrees of interdependency and allow the emergence of design agency and feedback between the multiple scales of architectural construction. This paper presents examples for integrated design simulation from a series of projects including Lace Wall, A Bridge Too Far and Inflated Restraint, developed for the research exhibition Complex Modelling, Meldahls Smedie Gallery, Copenhagen in 2016. Where the direct project aims and outcomes have been reported elsewhere, the aim for this paper is to discuss overarching strategies for working with design-integrated simulation.

  12. A User-Centered Approach to Adaptive Hypertext Based on an Information Relevance Model

    Science.gov (United States)

    Mathe, Nathalie; Chen, James

    1994-01-01

    Rapid and effective access to information in large electronic documentation systems can be facilitated if information relevant in an individual user's context can be automatically supplied to that user. However, most of this knowledge on contextual relevance is not found within the contents of documents; rather, it is established incrementally by users during information access. We propose a new model for interactively learning contextual relevance during information retrieval, and incrementally adapting retrieved information to individual user profiles. The model, called a relevance network, records the relevance of references based on user feedback for specific queries and user profiles. It also generalizes such knowledge to later derive relevant references for similar queries and profiles. The relevance network lets users filter information by context of relevance. Compared to other approaches, it does not require any prior knowledge or training. More importantly, our approach to adaptivity is user-centered. It facilitates acceptance and understanding by users by giving them shared control over the adaptation without disturbing their primary task. Users easily control when to adapt and when to use the adapted system. Lastly, the model is independent of the particular application used to access information, and supports sharing of adaptations among users.
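    The core data structure described, a relevance network keyed by query and user profile, can be sketched in a few lines of Python. All names here are hypothetical, and the similarity function is left abstract (e.g. cosine similarity over query terms); the point is the record-then-generalise mechanism:

        from collections import defaultdict

        class RelevanceNetwork:
            def __init__(self, similarity):
                self.similarity = similarity          # similarity(query_a, query_b) -> [0, 1]
                self.feedback = defaultdict(list)     # (profile, query) -> [(reference, score)]

            def record(self, profile, query, reference, score):
                """Store explicit user feedback for a specific query and profile."""
                self.feedback[(profile, query)].append((reference, score))

            def rank(self, profile, query):
                """Generalise: similarity-weighted vote over past feedback."""
                scores = defaultdict(float)
                for (prof, past_query), pairs in self.feedback.items():
                    if prof != profile:
                        continue
                    w = self.similarity(query, past_query)
                    for reference, score in pairs:
                        scores[reference] += w * score
                return sorted(scores, key=scores.get, reverse=True)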

  13. Parameterization and Sensitivity Analysis of a Complex Simulation Model for Mosquito Population Dynamics, Dengue Transmission, and Their Control

    Science.gov (United States)

    Ellis, Alicia M.; Garcia, Andres J.; Focks, Dana A.; Morrison, Amy C.; Scott, Thomas W.

    2011-01-01

    Models can be useful tools for understanding the dynamics and control of mosquito-borne disease. More detailed models may be more realistic and better suited for understanding local disease dynamics; however, evaluating model suitability, accuracy, and performance becomes increasingly difficult with greater model complexity. Sensitivity analysis is a technique that permits exploration of complex models by evaluating the sensitivity of the model to changes in parameters. Here, we present results of sensitivity analyses of two interrelated complex simulation models of mosquito population dynamics and dengue transmission. We found that dengue transmission may be influenced most by survival in each life stage of the mosquito, mosquito biting behavior, and duration of the infectious period in humans. The importance of these biological processes for vector-borne disease models and the overwhelming lack of knowledge about them make acquisition of relevant field data on these biological processes a top research priority. PMID:21813844
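    A cheap screening form of such a sensitivity analysis can be sketched in Python (numpy and scipy assumed): sample parameter sets within bounds, run the simulator, and rank parameters by the magnitude of their rank correlation with the output. The `run_model` stand-in is hypothetical; the paper's models and sensitivity design are far richer:

        import numpy as np
        from scipy.stats import spearmanr

        def screen_sensitivity(run_model, bounds, n_samples=500, seed=0):
            """|Spearman rho| between each parameter and the model output."""
            rng = np.random.default_rng(seed)
            lo, hi = np.array(bounds).T
            X = lo + (hi - lo) * rng.random((n_samples, len(bounds)))
            y = np.array([run_model(x) for x in X])
            return [abs(spearmanr(X[:, j], y)[0]) for j in range(len(bounds))]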

  14. Integrating retention soil filters into urban hydrologic models - Relevant processes and important parameters

    Science.gov (United States)

    Bachmann-Machnik, Anna; Meyer, Daniel; Waldhoff, Axel; Fuchs, Stephan; Dittmer, Ulrich

    2018-04-01

    Retention Soil Filters (RSFs), a form of vertical flow constructed wetlands specifically designed for combined sewer overflow (CSO) treatment, have proven to be an effective tool to mitigate negative impacts of CSOs on receiving water bodies. Long-term hydrologic simulations are used to predict the emissions from urban drainage systems during planning of stormwater management measures. So far no universally accepted model for RSF simulation exists. When simulating hydraulics and water quality in RSFs, an appropriate level of detail must be chosen for reasonable balancing between model complexity and model handling, considering the level of uncertainty of the model input. The most crucial parameters determining the resultant uncertainties of the integrated sewer system and filter bed model were identified by evaluating a virtual drainage system with a Retention Soil Filter for CSO treatment. To determine reasonable parameter ranges for RSF simulations, data from 207 events at six full-scale RSF plants in Germany were analyzed. Data evaluation shows that even though different plants with varying loading and operation modes were examined, a simple model is sufficient to assess relevant suspended solids (SS), chemical oxygen demand (COD) and NH4 emissions from RSFs. Two conceptual RSF models with different degrees of complexity were assessed. These models were developed based on evaluation of data from full-scale RSF plants and column experiments. Incorporated model processes are ammonium adsorption in the filter layer and degradation during the subsequent dry weather period, filtration of SS and particulate COD (XCOD) to a constant background concentration, and removal of solute COD (SCOD) at a constant removal rate during filter passage, as well as sedimentation of SS and XCOD in the filter overflow. XCOD, SS and ammonium loads as well as ammonium concentration peaks are discharged primarily via RSF overflow, not passing through the filter bed. Uncertainties of the integrated
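    The simple conceptual model described above reduces to a handful of rules per event; a minimal Python sketch (placeholder parameter values, not calibrated ones) makes the structure explicit:

        def rsf_effluent(ss_in, scod_in, nh4_in, adsorption_left,
                         ss_background=5.0, scod_removal=0.5):
            """One filter passage: filtration to background, constant SCOD removal,
            NH4 adsorption limited by the remaining capacity. Units: mg/L."""
            ss_out = min(ss_in, ss_background)          # SS/XCOD filtered to background
            scod_out = scod_in * (1.0 - scod_removal)   # constant solute-COD removal rate
            adsorbed = min(nh4_in, adsorption_left)     # NH4 held until capacity is used
            return ss_out, scod_out, nh4_in - adsorbed, adsorption_left - adsorbed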

  15. Modeling complexes of modeled proteins.

    Science.gov (United States)

    Anishchenko, Ivan; Kundrotas, Petras J; Vakser, Ilya A

    2017-03-01

    Structural characterization of proteins is essential for understanding life processes at the molecular level. However, only a fraction of known proteins have experimentally determined structures. This fraction is even smaller for protein-protein complexes. Thus, structural modeling of protein-protein interactions (docking) primarily has to rely on modeled structures of the individual proteins, which typically are less accurate than the experimentally determined ones. Such "double" modeling is the Grand Challenge of structural reconstruction of the interactome. Yet it remains so far largely untested in a systematic way. We present a comprehensive validation of template-based and free docking on a set of 165 complexes, where each protein model has six levels of structural accuracy, from 1 to 6 Å C-alpha RMSD. Many template-based docking predictions fall into the acceptable quality category, according to the CAPRI criteria, even for highly inaccurate proteins (5-6 Å RMSD), although the number of such models (and, consequently, the docking success rate) drops significantly for models with RMSD > 4 Å. The results show that the existing docking methodologies can be successfully applied to protein models with a broad range of structural accuracy, and that template-based docking is much less sensitive to inaccuracies of protein models than free docking. Proteins 2017; 85:470-478. © 2016 Wiley Periodicals, Inc.
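    The C-alpha RMSD used above to grade model accuracy is computed after optimal superposition, classically via the Kabsch algorithm. A minimal numpy sketch for two equal-length (n_atoms x 3) coordinate arrays:

        import numpy as np

        def kabsch_rmsd(P, Q):
            P = P - P.mean(axis=0)                    # centre both structures
            Q = Q - Q.mean(axis=0)
            U, S, Vt = np.linalg.svd(P.T @ Q)
            d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against reflections
            R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T   # optimal rotation
            return np.sqrt(((P @ R.T - Q) ** 2).sum() / len(P))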

  16. From research excellence to brand relevance: A model for higher education reputation building

    Directory of Open Access Journals (Sweden)

    Nina Overton-de Klerk

    2016-05-01

    Full Text Available In this article we propose a novel approach to reputation development at higher education institutions. Global reputation development at higher education institutions is largely driven by research excellence, is predominantly measured by research output, and is predominantly reflected in hierarchical university rankings. The ranking becomes equated with brand equity. We argue that the current approach to reputation development in higher education institutions is modernist and linear. This is strangely out-of-kilter with the complexities of a transforming society in flux, the demands of a diversity of stakeholders, and the drive towards transdisciplinarity, laterality, reflexivity and relevance in science. Good research clearly remains an important ingredient of a university's brand value. However, a case can be made for brand relevance, co-created in collaboration with stakeholders, as an alternative and non-linear way of differentiation. This approach is appropriate in light of challenges in strategic science globally as well as trends and shifts in the emerging paradigm of strategic communication. In applying strategic communication principles to current trends and issues in strategic science and the communication thereof, an alternative model for strategic reputation building at higher education institutions is developed.

  17. Effect of tDCS on task relevant and irrelevant perceptual learning of complex objects.

    Science.gov (United States)

    Van Meel, Chayenne; Daniels, Nicky; de Beeck, Hans Op; Baeck, Annelies

    2016-01-01

    During perceptual learning, the visual representations in the brain are altered, but the causal role of these changes has not yet been fully characterized. We used transcranial direct current stimulation (tDCS) to investigate the role of higher visual regions in lateral occipital cortex (LO) in perceptual learning with complex objects. We also investigated whether object learning is dependent on the relevance of the objects for the learning task. Participants were trained in two tasks: object recognition using a backward masking paradigm, and an orientation judgment task. During both tasks, an object with a red line on top of it was presented in each trial. The crucial difference between the tasks was the relevance of the object: the object was relevant for the object recognition task, but not for the orientation judgment task. During training, half of the participants received anodal tDCS stimulation targeted at LO. Afterwards, participants were tested on how well they recognized the trained objects, the irrelevant objects presented during the orientation judgment task and a set of completely new objects. Participants stimulated with tDCS during training showed larger improvements in performance compared to participants in the sham condition. No learning effect was found for the objects presented during the orientation judgment task. To conclude, this study suggests a causal role of LO in relevant object learning, but given the rather low spatial resolution of tDCS, more research on the specificity of this effect is needed. Further, mere exposure is not sufficient to train object recognition in our paradigm.

  18. Simulating Complex, Cold-region Process Interactions Using a Multi-scale, Variable-complexity Hydrological Model

    Science.gov (United States)

    Marsh, C.; Pomeroy, J. W.; Wheater, H. S.

    2017-12-01

    Accurate management of water resources is necessary for social, economic, and environmental sustainability worldwide. In locations with seasonal snowcovers, the accurate prediction of these water resources is further complicated by frozen soils, solid-phase precipitation, blowing snow transport, and snowcover-vegetation-atmosphere interactions. Complex process interactions and feedbacks are a key feature of hydrological systems and may result in emergent phenomena, i.e., the arising of novel and unexpected properties within a complex system. One example is the feedback associated with blowing snow redistribution, which can lead to drifts that cause locally increased soil moisture, thus increasing plant growth that in turn subsequently impacts snow redistribution, creating larger drifts. Attempting to simulate these emergent behaviours is a significant challenge, however, and there is concern that process conceptualizations within current models are too incomplete to represent the needed interactions. An improved understanding of the role of emergence in hydrological systems often requires high-resolution distributed numerical hydrological models that incorporate the relevant process dynamics. The Canadian Hydrological Model (CHM) provides a novel tool for examining cold region hydrological systems. Key features include efficient terrain representation, allowing simulations at various spatial scales, reduced computational overhead, and a modular process representation allowing for an alternative-hypothesis framework. Using both physics-based and conceptual process representations sourced from long-term process studies and the current cold regions literature allows for comparison of process representations and, importantly, their ability to produce emergent behaviours. Examining the system in a holistic, process-based manner can hopefully yield important insights and aid in the development of improved process representations.

  19. Relevant criteria for testing the quality of turbulence models

    DEFF Research Database (Denmark)

    Frandsen, Sten Tronæs; Ejsing Jørgensen, Hans; Sørensen, J.D.

    2007-01-01

    Seeking relevant criteria for testing the quality of turbulence models, the scale of turbulence and the gust factor have been estimated from data and compared with predictions from first-order models of these two quantities. It is found that the mean of the measured length scales is approx. 10% smaller than the IEC model for wind turbine hub height levels. The mean is only marginally dependent on trends in time series. It is also found that the coefficient of variation of the measured length scales is about 50%. 3 sec and 10 sec pre-averaging of wind speed data are relevant for MW-size wind turbines when seeking wind characteristics that correspond to one blade and the entire rotor, respectively. For heights exceeding 50-60 m the gust factor increases with wind speed. For heights larger than 60-80 m, present assumptions on the value of the gust factor are significantly conservative, both for 3
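    The gust factor itself is simple to compute once the pre-averaging window is fixed; a minimal Python sketch (numpy assumed, synthetic data, window lengths as argued above):

        import numpy as np

        def gust_factor(u, window_s, sample_rate_hz):
            """Max of the moving-average (pre-averaged) speed over the mean speed."""
            n = int(window_s * sample_rate_hz)
            kernel = np.ones(n) / n
            u_avg = np.convolve(u, kernel, mode="valid")   # pre-averaged series
            return u_avg.max() / u.mean()

        u = np.random.default_rng(0).normal(10.0, 1.5, 600 * 20)  # synthetic 10 min at 20 Hz
        print(gust_factor(u, window_s=3, sample_rate_hz=20))       # one-blade scale
        print(gust_factor(u, window_s=10, sample_rate_hz=20))      # whole-rotor scale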

  20. Model complexity control for hydrologic prediction

    NARCIS (Netherlands)

    Schoups, G.; Van de Giesen, N.C.; Savenije, H.H.G.

    2008-01-01

    A common concern in hydrologic modeling is overparameterization of complex models given limited and noisy data. This leads to problems of parameter nonuniqueness and equifinality, which may negatively affect prediction uncertainties. A systematic way of controlling model complexity is therefore

  1. The value and cost of complexity in predictive modelling: role of tissue anisotropic conductivity and fibre tracts in neuromodulation

    Science.gov (United States)

    Salman Shahid, Syed; Bikson, Marom; Salman, Humaira; Wen, Peng; Ahfock, Tony

    2014-06-01

    Objectives. Computational methods are increasingly used to optimize transcranial direct current stimulation (tDCS) dose strategies and yet complexities of existing approaches limit their clinical access. Since predictive modelling indicates the relevance of subject/pathology based data and hence the need for subject specific modelling, the incremental clinical value of increasingly complex modelling methods must be balanced against the computational and clinical time and costs. For example, the incorporation of multiple tissue layers and measured diffusion tensor (DTI) based conductivity estimates increase model precision but at the cost of clinical and computational resources. Costs related to such complexities aggregate when considering individual optimization and the myriad of potential montages. Here, rather than considering if additional details change current-flow prediction, we consider when added complexities influence clinical decisions. Approach. Towards developing quantitative and qualitative metrics of value/cost associated with computational model complexity, we considered field distributions generated by two 4 × 1 high-definition montages (m1 = 4 × 1 HD montage with anode at C3 and m2 = 4 × 1 HD montage with anode at C1) and a single conventional (m3 = C3-Fp2) tDCS electrode montage. We evaluated statistical methods, including residual error (RE) and relative difference measure (RDM), to consider the clinical impact and utility of increased complexities, namely the influence of skull, muscle and brain anisotropic conductivities in a volume conductor model. Main results. Anisotropy modulated current-flow in a montage and region dependent manner. However, significant statistical changes, produced within montage by anisotropy, did not change qualitative peak and topographic comparisons across montages. Thus for the examples analysed, clinical decision on which dose to select would not be altered by the omission of anisotropic brain conductivity

  2. Application of all-relevant feature selection for the failure analysis of parameter-induced simulation crashes in climate models

    Science.gov (United States)

    Paja, Wiesław; Wrzesien, Mariusz; Niemiec, Rafał; Rudnicki, Witold R.

    2016-03-01

    Climate models are extremely complex pieces of software. They reflect the best knowledge on the physical components of the climate; nevertheless, they contain several parameters, which are too weakly constrained by observations, and can potentially lead to a simulation crashing. Recently a study by Lucas et al. (2013) has shown that machine learning methods can be used for predicting which combinations of parameters can lead to the simulation crashing and hence which processes described by these parameters need refined analyses. In the current study we reanalyse the data set used in this research using different methodology. We confirm the main conclusion of the original study concerning the suitability of machine learning for the prediction of crashes. We show that only three of the eight parameters indicated in the original study as relevant for prediction of the crash are indeed strongly relevant, three others are relevant but redundant and two are not relevant at all. We also show that the variance due to the split of data between training and validation sets has a large influence both on the accuracy of predictions and on the relative importance of variables; hence only a cross-validated approach can deliver a robust prediction of performance and relevance of variables.

  3. Multifaceted Modelling of Complex Business Enterprises.

    Science.gov (United States)

    Chakraborty, Subrata; Mengersen, Kerrie; Fidge, Colin; Ma, Lin; Lassen, David

    2015-01-01

    We formalise and present a new generic multifaceted complex system approach for modelling complex business enterprises. Our method has a strong focus on integrating the various data types available in an enterprise, which represent the diverse perspectives of various stakeholders. We explain the challenges faced and define a novel approach to converting diverse data types into usable Bayesian probability forms. The data types that can be integrated include historic data, survey data, management planning data, expert knowledge, and incomplete data. The structural complexities of the complex system modelling process, based on various decision contexts, are also explained along with a solution. This new application of complex system models as a management tool for decision making is demonstrated using a railway transport case study. The case study demonstrates how the new approach can be utilised to develop a customised decision support model for a specific enterprise. Various decision scenarios are also provided to illustrate the versatility of the decision model at different phases of enterprise operations such as planning and control.

  4. Multifaceted Modelling of Complex Business Enterprises

    Science.gov (United States)

    2015-01-01

    We formalise and present a new generic multifaceted complex system approach for modelling complex business enterprises. Our method has a strong focus on integrating the various data types available in an enterprise, which represent the diverse perspectives of various stakeholders. We explain the challenges faced and define a novel approach to converting diverse data types into usable Bayesian probability forms. The data types that can be integrated include historic data, survey data, management planning data, expert knowledge, and incomplete data. The structural complexities of the complex system modelling process, based on various decision contexts, are also explained along with a solution. This new application of complex system models as a management tool for decision making is demonstrated using a railway transport case study. The case study demonstrates how the new approach can be utilised to develop a customised decision support model for a specific enterprise. Various decision scenarios are also provided to illustrate the versatility of the decision model at different phases of enterprise operations such as planning and control. PMID:26247591

  5. Mathematical Properties Relevant to Geomagnetic Field Modeling

    DEFF Research Database (Denmark)

    Sabaka, Terence J.; Hulot, Gauthier; Olsen, Nils

    2010-01-01

    Geomagnetic field modeling consists in converting large numbers of magnetic observations into a linear combination of elementary mathematical functions that best describes those observations. The set of numerical coefficients defining this linear combination is then what one refers to as a geomagnetic field model ... be directly measured. In this chapter, the mathematical foundation of global (as opposed to regional) geomagnetic field modeling is reviewed, focusing on the spatial modeling of the field in spherical coordinates. Time can be dealt with as an independent variable and is not explicitly considered. The relevant elementary mathematical functions are introduced, their properties are reviewed, and how they can be used to describe the magnetic field in a source-free (such as the Earth's neutral atmosphere) or source-dense (such as the ionosphere) environment is explained. Completeness and uniqueness...
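    For reference, the standard spherical-harmonic form of the internal-field potential in a source-free region (conventional notation; the chapter's own development is more general) is, in LaTeX notation:

        V(r,\theta,\phi) = a \sum_{n=1}^{N} \left(\frac{a}{r}\right)^{n+1}
            \sum_{m=0}^{n} \left( g_n^m \cos m\phi + h_n^m \sin m\phi \right)
            P_n^m(\cos\theta), \qquad \mathbf{B} = -\nabla V,

    where a is the reference radius, P_n^m are Schmidt semi-normalised associated Legendre functions, and the Gauss coefficients g_n^m, h_n^m are the "set of numerical coefficients" the abstract refers to.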

  6. Mathematical Properties Relevant to Geomagnetic Field Modeling

    DEFF Research Database (Denmark)

    Sabaka, Terence J.; Hulot, Gauthier; Olsen, Nils

    2014-01-01

    Geomagnetic field modeling consists in converting large numbers of magnetic observations into a linear combination of elementary mathematical functions that best describes those observations. The set of numerical coefficients defining this linear combination is then what one refers to as a geomagnetic field model ... be directly measured. In this chapter, the mathematical foundation of global (as opposed to regional) geomagnetic field modeling is reviewed, focusing on the spatial modeling of the field in spherical coordinates. Time can be dealt with as an independent variable and is not explicitly considered. The relevant elementary mathematical functions are introduced, their properties are reviewed, and how they can be used to describe the magnetic field in a source-free (such as the Earth's neutral atmosphere) or source-dense (such as the ionosphere) environment is explained. Completeness and uniqueness...

  7. A Tissue Relevance and Meshing Method for Computing Patient-Specific Anatomical Models in Endoscopic Sinus Surgery Simulation

    Science.gov (United States)

    Audette, M. A.; Hertel, I.; Burgert, O.; Strauss, G.

    This paper presents on-going work on a method for determining which subvolumes of a patient-specific tissue map, extracted from CT data of the head, are relevant to simulating endoscopic sinus surgery of that individual, and for decomposing these relevant tissues into triangles and tetrahedra whose mesh size is well controlled. The overall goal is to limit the complexity of the real-time biomechanical interaction while ensuring the clinical relevance of the simulation. Relevant tissues are determined as the union of the pathology present in the patient, of critical tissues deemed to be near the intended surgical path or pathology, and of bone and soft tissue near the intended path, pathology or critical tissues. The processing of tissues, prior to meshing, is based on the Fast Marching method applied under various guises, in a conditional manner that is related to tissue classes. The meshing is based on an adaptation of a meshing method of ours, which combines the Marching Tetrahedra method and the discrete Simplex mesh surface model to produce a topologically faithful surface mesh with well controlled edge and face size as a first stage, and Almost-regular Tetrahedralization of the same prescribed mesh size as a last stage.

  8. Macroscale hydrologic modeling of ecologically relevant flow metrics

    Science.gov (United States)

    Wenger, Seth J.; Luce, Charles H.; Hamlet, Alan F.; Isaak, Daniel J.; Neville, Helen M.

    2010-09-01

    Stream hydrology strongly affects the structure of aquatic communities. Changes to air temperature and precipitation driven by increased greenhouse gas concentrations are shifting the timing and volume of streamflows, potentially affecting these communities. The variable infiltration capacity (VIC) macroscale hydrologic model has been employed at regional scales to describe and forecast hydrologic changes but has been calibrated and applied mainly to large rivers. An important question is how well VIC runoff simulations serve to answer questions about hydrologic changes in smaller streams, which are important habitat for many fish species. To answer this question, we aggregated gridded VIC outputs within the drainage basins of 55 streamflow gages in the Pacific Northwest United States and compared modeled hydrographs and summary metrics to observations. For most streams, several ecologically relevant aspects of the hydrologic regime were accurately modeled, including center of flow timing, mean annual and summer flows, and frequency of winter floods. Frequencies of high and low flows in the summer were not well predicted, however. Predictions were worse for sites with strong groundwater influence, and some sites showed errors that may result from limitations in the forcing climate data. Higher-resolution (1/16th degree) modeling provided small improvements over lower-resolution (1/8th degree) modeling. Despite some limitations, the VIC model appears capable of representing several ecologically relevant hydrologic characteristics in streams, making it a useful tool for understanding the effects of hydrology in delimiting species distributions and predicting the potential effects of climate shifts on aquatic organisms.
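    One of the metrics named above, center of flow timing, is simply the flow-weighted mean day of the water year; a minimal Python sketch with a synthetic hydrograph:

        import numpy as np

        def center_of_timing(daily_flow):
            """Flow-weighted mean day of the water year (CT metric)."""
            days = np.arange(1, len(daily_flow) + 1)
            return (days * daily_flow).sum() / daily_flow.sum()

        q = np.concatenate([np.full(180, 5.0), np.full(185, 20.0)])  # late-season runoff
        print(center_of_timing(q))   # a later CT indicates a later runoff season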

  9. Alkali Metal Ion Complexes with Phosphates, Nucleotides, Amino Acids, and Related Ligands of Biological Relevance. Their Properties in Solution.

    Science.gov (United States)

    Crea, Francesco; De Stefano, Concetta; Foti, Claudia; Lando, Gabriele; Milea, Demetrio; Sammartano, Silvio

    2016-01-01

    Alkali metal ions play very important roles in all biological systems, and some of them are essential for life. Their concentration depends on several physiological factors and is very variable. For example, sodium concentrations in human fluids vary from quite low (e.g., 8.2 mmol dm(-3) in mature maternal milk) to high values (0.14 mol dm(-3) in blood plasma). While many data on the concentration of Na(+) and K(+) in various fluids are available, the information on other alkali metal cations is scarce. Since many vital functions depend on the network of interactions occurring in various biofluids, this chapter reviews their complex formation with phosphates, nucleotides, amino acids, and related ligands of biological relevance. Literature data on this topic are quite rare if compared to other cations. Generally, the stability of alkali metal ion complexes of organic and inorganic ligands is rather low (usually log K < 1), decreasing in the order Li(+) > Na(+) > K(+) > Rb(+) > Cs(+). For example, for citrate it is: log K ML = 0.88, 0.80, 0.48, 0.38, and 0.13 at 25 °C and infinite dilution. Some considerations are made on the main aspects related to the difficulties in the determination of weak complexes. The importance of the alkali metal ion complexes was also studied in the light of modelling natural fluids and in the use of these cations as probes for different processes. Some empirical relationships are proposed for the dependence of the stability constants of Na(+) complexes on the ligand charge, as well as for correlations among log K values of NaL, KL or LiL species (L = generic ligand).

  10. Other relevant numerical modelling papers

    International Nuclear Information System (INIS)

    Chartier, M.

    1989-01-01

    Ocean modelling is a rapidly evolving science and a large number of results have been published. Several categories of papers are of particular interest for this review: the papers published by the international atomic institutions, such as the NEA (for the CRESP or Subseabed Programs), the IAEA (for example the Safety Series, the Technical Report Series or the TECDOC), and the ICRP, and the papers concerned with more fundamental research, which are published in specific scientific literature. This paper aims to list some of the most relevant publications for CRESP purposes. It is by no means meant to be exhaustive, but rather informative on the incontestable progress recently achieved in that field. One should note that some of these papers are so recent that their final version has not yet been published

  11. Intertwining personal and reward relevance: evidence from the drift-diffusion model.

    Science.gov (United States)

    Yankouskaya, A; Bührle, R; Lugt, E; Stolte, M; Sui, J

    2018-01-24

    In their seminal paper 'Is our self nothing but reward?', Northoff and Hayes (Biological Psychiatry 69(11):1019-1025, 2011) proposed three models of the relationship between self and reward and opened a continuing debate about how these different fields can be linked. To date, none of the proposed models has received strong empirical support. The present study tested common and distinct effects of personal relevance and reward values by decomposing different stages of perceptual decision making using a drift-diffusion approach. We employed a recently developed associative matching paradigm where participants (N = 40) formed mental associations between five geometric shapes and five labels referring to personal relevance in the personal task, or five shape-label pairings with different reward values in the reward task, and then performed a matching task by indicating whether a displayed shape-label pairing was correct or incorrect. We found that common effects of personal relevance and monetary reward were manifested in the facilitation of behavioural performance for high personal relevance and high reward value as socially important signals. The differential effects between personal and monetary relevance reflected non-decision time in the perceptual decision process and task-specific prioritization of stimuli. Our findings support the parallel processing model (Northoff & Hayes, Biological Psychiatry 69(11):1019-1025, 2011) and suggest that self-specific processing occurs in parallel with high reward processing. Limitations and further directions are discussed.
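    The drift-diffusion decomposition used here has a compact simulation form: evidence accumulates with drift v and Gaussian noise until it hits one of two boundaries, and non-decision time t0 (the component on which the personal/reward difference loaded) is added on top. A minimal Python sketch with placeholder parameters, not the paper's estimates:

        import numpy as np

        def simulate_ddm(v, a, t0, z=0.5, dt=0.001, sigma=1.0, seed=0):
            """One trial: returns (response, reaction time in seconds)."""
            rng = np.random.default_rng(seed)
            x, t = z * a, 0.0                       # start between boundaries 0 and a
            while 0.0 < x < a:
                x += v * dt + sigma * np.sqrt(dt) * rng.standard_normal()
                t += dt
            return ("match" if x >= a else "nonmatch"), t + t0

        print(simulate_ddm(v=1.2, a=1.0, t0=0.3))   # higher drift -> faster, more accurate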

  12. Canine intrahepatic vasculature: is a functional anatomic model relevant to the dog?

    Science.gov (United States)

    Hall, Jon L; Mannion, Paddy; Ladlow, Jane F

    2015-01-01

    To clarify canine intrahepatic portal and hepatic venous system anatomy using corrosion casting and advanced imaging and to devise a novel functional anatomic model of the canine liver to investigate whether this could help guide the planning and surgical procedure of partial hepatic lobectomy and interventional radiological procedures. Prospective experimental study. Adult Greyhound cadavers (n = 8). Portal and hepatic vein corrosion casts of healthy livers were assessed using computed tomography (CT). The hepatic lobes have a consistent hilar hepatic and portal vein supply with some variation in the number of intrahepatic branches. For all specimens, 3 surgically resectable areas were identified in the left lateral lobe and 2 surgically resectable areas were identified in the right medial lobe as defined by a functional anatomic model. CT of detailed acrylic casts allowed complex intrahepatic vascular relationships to be investigated and compared with previous studies. Improving understanding of the intrahepatic vascular supply facilitates interpretation of advanced images in clinical patients, the planning and performance of surgical procedures, and may facilitate interventional vascular procedures, such as intravenous embolization of portosystemic shunts. Functional division of the canine liver similar to human models is possible. The left lateral and right medial lobes can be consistently divided into surgically resectable functional areas and partial lobectomies can be performed following a functional model; further study in clinically affected animals would be required to investigate the relevance of this functional model in the dog. © Copyright 2014 by The American College of Veterinary Surgeons.

  13. Experimental Models of Vaginal Candidiasis and Their Relevance to Human Candidiasis

    Science.gov (United States)

    Sobel, Jack D.

    2016-01-01

    Vulvovaginal candidiasis (VVC) is a high-incidence disease seriously affecting the quality of life of women worldwide, particularly in its chronic, recurrent forms (RVVC), and with no definitive cure or preventive measure. Experimental studies in currently used rat and mouse models of vaginal candidiasis have generated a large mass of data on pathogenicity determinants and inflammation and immune responses of potential importance for the control of human pathology. However, reflection is necessary about the relevance of these rodent models to RVVC. Here we examine the chemical, biochemical, and biological factors that determine or contrast the forms of the disease in rodent models and in women and highlight the differences between them. We also appeal for approaches to improve or replace the current models in order to enhance their relevance to human infection. PMID:26883592

  14. Modelling the structure of complex networks

    DEFF Research Database (Denmark)

    Herlau, Tue

    Complex networks have been independently studied as mathematical objects in their own right. As such, there has been both an increased demand for statistical methods for complex networks as well as a quickly growing mathematical literature on the subject. In this dissertation we explore aspects of modelling complex networks. The next chapters will treat some of the various symmetries, representer theorems and probabilistic structures often deployed in the modelling of complex networks, the construction of sampling methods and various network models. The introductory chapters will serve to provide context for the included written work.

  15. Towards Increased Relevance: Context-Adapted Models of the Learning Organization

    Science.gov (United States)

    Örtenblad, Anders

    2015-01-01

    Purpose: The purposes of this paper are to take a closer look at the relevance of the idea of the learning organization for organizations in different generalized organizational contexts; to open up for the existence of multiple, context-adapted models of the learning organization; and to suggest a number of such models.…

  16. Updating the debate on model complexity

    Science.gov (United States)

    Simmons, Craig T.; Hunt, Randall J.

    2012-01-01

    As scientists who are trying to understand a complex natural world that cannot be fully characterized in the field, how can we best inform the society in which we live? This founding context was addressed in a special session, “Complexity in Modeling: How Much is Too Much?” convened at the 2011 Geological Society of America Annual Meeting. The session had a variety of thought-provoking presentations—ranging from philosophy to cost-benefit analyses—and provided some areas of broad agreement that were not evident in discussions of the topic in 1998 (Hunt and Zheng, 1999). The session began with a short introduction during which model complexity was framed borrowing from an economic concept, the Law of Diminishing Returns, and an example of enjoyment derived by eating ice cream. Initially, there is increasing satisfaction gained from eating more ice cream, to a point where the gain in satisfaction starts to decrease, ending at a point when the eater sees no value in eating more ice cream. A traditional view of model complexity is similar—understanding gained from modeling can actually decrease if models become unnecessarily complex. However, oversimplified models—those that omit important aspects of the problem needed to make a good prediction—can also limit and confound our understanding. Thus, the goal of all modeling is to find the “sweet spot” of model sophistication—regardless of whether complexity was added sequentially to an overly simple model or collapsed from an initial highly parameterized framework that uses mathematics and statistics to attain an optimum (e.g., Hunt et al., 2007). Thus, holistic parsimony is attained, incorporating “as simple as possible,” as well as the equally important corollary “but no simpler.”

  17. Role of calibration, validation, and relevance in multi-level uncertainty integration

    International Nuclear Information System (INIS)

    Li, Chenzhao; Mahadevan, Sankaran

    2016-01-01

    Calibration of model parameters is an essential step in predicting the response of a complicated system, but the lack of data at the system level makes it impossible to conduct this quantification directly. In such a situation, system model parameters are estimated using tests at lower levels of complexity which share the same model parameters with the system. For such a multi-level problem, this paper proposes a methodology to quantify the uncertainty in the system level prediction by integrating calibration, validation and sensitivity analysis at different levels. The proposed approach considers the validity of the models used for parameter estimation at lower levels, as well as the relevance at the lower level to the prediction at the system level. The model validity is evaluated using a model reliability metric, and models with multivariate output are considered. The relevance is quantified by comparing Sobol indices at the lower level and system level, thus measuring the extent to which a lower level test represents the characteristics of the system so that the calibration results can be reliably used in the system level. Finally the results of calibration, validation and relevance analysis are integrated in a roll-up method to predict the system output. - Highlights: • Relevance analysis to quantify the closeness of two models. • Stochastic model reliability metric to integrate multiple validation experiments. • Extend the model reliability metric to deal with multivariate output. • Roll-up formula to integrate calibration, validation, and relevance.
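    The relevance step, comparing Sobol indices across levels, can be sketched with the SALib library (assumed available; the models below are hypothetical stand-ins for the lower-level test and the system response):

        import numpy as np
        from SALib.sample import saltelli
        from SALib.analyze import sobol

        problem = {"num_vars": 2, "names": ["theta1", "theta2"],
                   "bounds": [[0.0, 1.0], [0.0, 1.0]]}

        def first_order_indices(model):
            X = saltelli.sample(problem, 1024)             # Saltelli sampling design
            Y = np.array([model(x) for x in X])
            return sobol.analyze(problem, Y)["S1"]         # first-order Sobol indices

        lower = lambda x: x[0] + 0.1 * x[1]                # hypothetical lower-level test
        system = lambda x: np.sin(x[0]) + 0.5 * x[1]       # hypothetical system output
        print(first_order_indices(lower), first_order_indices(system))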

  18. An overview of structurally complex network-based modeling of public opinion in the “We the Media” era

    Science.gov (United States)

    Wang, Guanghui; Wang, Yufei; Liu, Yijun; Chi, Yuxue

    2018-05-01

    As the transmission of public opinion on the Internet in the “We the Media” era tends to be supraterritorial, concealed and complex, the traditional “point-to-surface” transmission of information has been transformed into “point-to-point” reciprocal transmission. A foundation for studies of the evolution of public opinion and its transmission on the Internet in the “We the Media” era can be laid by converting the massive amounts of fragmented information on public opinion that exists on “We the Media” platforms into structurally complex networks of information. This paper describes studies of structurally complex network-based modeling of public opinion on the Internet in the “We the Media” era from the perspective of the development and evolution of complex networks. The progress that has been made in research projects relevant to the structural modeling of public opinion on the Internet is comprehensively summarized. The review considers aspects such as regular grid-based modeling of the rules that describe the propagation of public opinion on the Internet in the “We the Media” era, social network modeling, dynamic network modeling, and supernetwork modeling. Moreover, an outlook for future studies that address complex network-based modeling of public opinion on the Internet is put forward as a summary from the perspective of modeling conducted using the techniques mentioned above.

  19. Mining Very High Resolution INSAR Data Based On Complex-GMRF Cues And Relevance Feedback

    Science.gov (United States)

    Singh, Jagmal; Popescu, Anca; Soccorsi, Matteo; Datcu, Mihai

    2012-01-01

    With the increase in the number of remote sensing satellites, the number of image-data scenes in our repositories is also increasing, and a large quantity of these scenes are never retrieved and used. Thus automatic retrieval of desired image-data using query by image content, to fully utilize the huge repository volume, is becoming of great interest. Generally, different users are interested in scenes containing different kinds of objects and structures, so it is important to analyze all the image information mining (IIM) methods so that it is easier for a user to select a method depending upon his/her requirement. We concentrate our study only on high-resolution SAR images, and we propose to use InSAR observations instead of single look complex (SLC) images alone for mining scenes containing coherent objects such as high-rise buildings. However, in the case of objects with less coherence, like areas with vegetation cover, SLC images exhibit better performance. We demonstrate an IIM performance comparison using complex Gauss-Markov random fields as texture descriptors for image patches and SVM relevance feedback.

  20. Bioprinting towards Physiologically Relevant Tissue Models for Pharmaceutics.

    Science.gov (United States)

    Peng, Weijie; Unutmaz, Derya; Ozbolat, Ibrahim T

    2016-09-01

    Improving the ability to predict the efficacy and toxicity of drug candidates earlier in the drug discovery process will speed up the introduction of new drugs into clinics. 3D in vitro systems have significantly advanced the drug screening process as 3D tissue models can closely mimic native tissues and, in some cases, the physiological response to drugs. Among various in vitro systems, bioprinting is a highly promising technology possessing several advantages such as tailored microarchitecture, high-throughput capability, coculture ability, and low risk of cross-contamination. In this opinion article, we discuss the currently available tissue models in pharmaceutics along with their limitations and highlight the possibilities of bioprinting physiologically relevant tissue models, which hold great potential in drug testing, high-throughput screening, and disease modeling. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Theoretical Relevance of Neuropsychological Data for Connectionist Modelling

    Directory of Open Access Journals (Sweden)

    Mauricio Iza

    2011-05-01

    Full Text Available The symbolic information-processing paradigm in cognitive psychology has met a growing challenge from neural network models over the past two decades. While neuropsychological evidence has been of great utility to theories concerned with information processing, the real question is whether the less rigid connectionist models provide valid, or enough, information concerning complex cognitive structures. In this work, we will discuss the theoretical implications that neuropsychological data posit for modelling cognitive systems.

  2. Complexity, Modeling, and Natural Resource Management

    Directory of Open Access Journals (Sweden)

    Paul Cilliers

    2013-09-01

    Full Text Available This paper contends that natural resource management (NRM) issues are, by their very nature, complex and that both scientists and managers in this broad field will benefit from a theoretical understanding of complex systems. It starts off by presenting the core features of a view of complexity that not only deals with the limits to our understanding, but also points toward a responsible and motivating position. Everything we do involves explicit or implicit modeling, and as we can never have comprehensive access to any complex system, we need to be aware both of what we leave out as we model and of the implications of the choice of our modeling framework. One vantage point is never sufficient, as complexity necessarily implies that multiple (independent) conceptualizations are needed to engage the system adequately. We use two South African cases as examples of complex systems - restricting the case narratives mainly to the biophysical domain associated with NRM issues - to make the point that even the behavior of the biophysical subsystems themselves is already complex. From the insights into complex systems discussed in the first part of the paper and the lessons emerging from the way these cases have been dealt with in reality, we extract five interrelated generic principles for practicing science and management in complex NRM environments. These principles are then further elucidated using four further South African case studies - organized as two contrasting pairs - now focusing on the more difficult organizational and social side, and comparing the human organizational endeavors in managing such systems.

  3. Adapting APSIM to model the physiology and genetics of complex adaptive traits in field crops.

    Science.gov (United States)

    Hammer, Graeme L; van Oosterom, Erik; McLean, Greg; Chapman, Scott C; Broad, Ian; Harland, Peter; Muchow, Russell C

    2010-05-01

    Progress in molecular plant breeding is limited by the ability to predict plant phenotype based on its genotype, especially for complex adaptive traits. Suitably constructed crop growth and development models have the potential to bridge this predictability gap. A generic cereal crop growth and development model is outlined here. It is designed to exhibit reliable predictive skill at the crop level while also introducing sufficient physiological rigour for complex phenotypic responses to become emergent properties of the model dynamics. The approach quantifies capture and use of radiation, water, and nitrogen within a framework that predicts the realized growth of major organs based on their potential and whether the supply of carbohydrate and nitrogen can satisfy that potential. The model builds on existing approaches within the APSIM software platform. Experiments on diverse genotypes of sorghum that underpin the development and testing of the adapted crop model are detailed. Genotypes differing in height were found to differ in biomass partitioning among organs and a tall hybrid had significantly increased radiation use efficiency: a novel finding in sorghum. Introducing these genetic effects associated with plant height into the model generated emergent simulated phenotypic differences in green leaf area retention during grain filling via effects associated with nitrogen dynamics. The relevance to plant breeding of this capability in complex trait dissection and simulation is discussed.

  4. Direct nano ESI time-of-flight mass spectrometric investigations on lanthanide BTP complexes in the extraction-relevant diluent 1-octanol

    International Nuclear Information System (INIS)

    Steppert, M.; Walther, C.; Geist, A.; Fanghanel, Th.

    2009-01-01

    The present work focuses on investigations of a highly selective ligand for Am(III)/Ln(III) separation: bis-triazinyl-pyridine (BTP). By means of nano-electrospray mass spectrometry, complex formation of BTP with selected elements of the lanthanide series is investigated. We show that the diluent drastically influences complex speciation. Measurements obtained in the extraction-relevant diluent 1-octanol show the occurrence of Ln(BTP)_i (i = 1-3) species in different relative abundances, depending on the lanthanide used. Here, the relative abundances of the Ln(BTP)_3 complexes correlate with the distribution ratios for extraction to the organic phase of the respective lanthanide. (authors)

  5. Sutherland models for complex reflection groups

    International Nuclear Information System (INIS)

    Crampe, N.; Young, C.A.S.

    2008-01-01

    There are known to be integrable Sutherland models associated to every real root system, or, which is almost equivalent, to every real reflection group. Real reflection groups are special cases of complex reflection groups. In this paper we associate certain integrable Sutherland models to the classical family of complex reflection groups. Internal degrees of freedom are introduced, defining dynamical spin chains, and the freezing limit taken to obtain static chains of Haldane-Shastry type. By considering the relation of these models to the usual BC_N case, we are led to systems with both real and complex reflection groups as symmetries. We demonstrate their integrability by means of new Dunkl operators, associated to wreath products of dihedral groups

  6. Models of the Economic Growth and their Relevance

    Directory of Open Access Journals (Sweden)

    Nicolae MOROIANU

    2012-06-01

    Full Text Available Until a few years ago, economic growth was something perfectly normal, part of an era marked by the speed of transformation. Normality itself has since been transformed, and we are currently influenced by other, as yet unknown, rules which should answer the question: “How do we return to economic growth?” Economic growth, and the models aiming to explain and restore it, have concerned economic history ever since its beginnings. In this paper we investigate the relevance that the well-known macroeconomic models still have, and the extent to which they remain applicable, in a framework created by a black swan event.

  7. Simulation of groundwater flow in the glacial aquifer system of northeastern Wisconsin with variable model complexity

    Science.gov (United States)

    Juckem, Paul F.; Clark, Brian R.; Feinstein, Daniel T.

    2017-05-04

    The U.S. Geological Survey, National Water-Quality Assessment seeks to map estimated intrinsic susceptibility of the glacial aquifer system of the conterminous United States. Improved understanding of the hydrogeologic characteristics that explain spatial patterns of intrinsic susceptibility, commonly inferred from estimates of groundwater age distributions, is sought so that methods used for the estimation process are properly equipped. An important step beyond identifying relevant hydrogeologic datasets, such as glacial geology maps, is to evaluate how incorporation of these resources into process-based models using differing levels of detail could affect resulting simulations of groundwater age distributions and, thus, estimates of intrinsic susceptibility. This report describes the construction and calibration of three groundwater-flow models of northeastern Wisconsin that were developed with differing levels of complexity to provide a framework for subsequent evaluations of the effects of process-based model complexity on estimations of groundwater age distributions for withdrawal wells and streams. Preliminary assessments, which focused on the effects of model complexity on simulated water levels and base flows in the glacial aquifer system, illustrate that simulation of vertical gradients using multiple model layers improves simulated heads more in low-permeability units than in high-permeability units. Moreover, simulation of heterogeneous hydraulic conductivity fields in coarse-grained and some fine-grained glacial materials produced a larger improvement in simulated water levels in the glacial aquifer system compared with simulation of uniform hydraulic conductivity within zones. The relation between base flows and model complexity was less clear; however, it generally seemed to follow a similar pattern to that of water levels. Although increased model complexity resulted in improved calibrations, future application of the models using simulated particle

  8. Predictive Surface Complexation Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Sverjensky, Dimitri A. [Johns Hopkins Univ., Baltimore, MD (United States). Dept. of Earth and Planetary Sciences

    2016-11-29

    Surface complexation plays an important role in the equilibria and kinetics of processes controlling the compositions of soil waters and groundwaters, the fate of contaminants in groundwaters, and the subsurface storage of CO2 and nuclear waste. Over the last several decades, many dozens of individual experimental studies have addressed aspects of surface complexation that have contributed to an increased understanding of its role in natural systems. However, there has been no previous attempt to develop a model of surface complexation that can be used to link all the experimental studies in order to place them on a predictive basis. Overall, my research has successfully integrated the results of the work of many experimentalists published over several decades. For the first time in studies of the geochemistry of the mineral-water interface, a practical predictive capability for modeling has become available. The predictive correlations developed in my research now enable extrapolations of experimental studies to provide estimates of surface chemistry for systems not yet studied experimentally and for natural and anthropogenically perturbed systems.

  9. Control-relevant modeling and simulation of a SOFC-GT hybrid system

    OpenAIRE

    Rambabu Kandepu; Lars Imsland; Christoph Stiller; Bjarne A. Foss; Vinay Kariwala

    2006-01-01

    In this paper, control-relevant models of the most important components in a SOFC-GT hybrid system are described. Dynamic simulations are performed on the overall hybrid system. The model is used to develop a simple control structure, but the simulations show that more elaborate control is needed.

  10. Are Model Transferability And Complexity Antithetical? Insights From Validation of a Variable-Complexity Empirical Snow Model in Space and Time

    Science.gov (United States)

    Lute, A. C.; Luce, Charles H.

    2017-11-01

    The related challenges of predictions in ungauged basins and predictions in ungauged climates point to the need to develop environmental models that are transferable across both space and time. Hydrologic modeling has historically focused on modeling one or only a few basins using highly parameterized conceptual or physically based models. However, model parameters and structures have been shown to change significantly when calibrated to new basins or time periods, suggesting that model complexity and model transferability may be antithetical. Empirical space-for-time models provide a framework within which to assess model transferability and any tradeoff with model complexity. Using 497 SNOTEL sites in the western U.S., we develop space-for-time models of April 1 SWE and Snow Residence Time based on mean winter temperature and cumulative winter precipitation. The transferability of the models to new conditions (in both space and time) is assessed using non-random cross-validation tests, with consideration of the influence of model complexity on transferability. As others have noted, the algorithmic empirical models transfer best when minimal extrapolation in input variables is required. Temporal split-sample validations use pseudoreplicated samples, resulting in the selection of overly complex models, which has implications for the design of hydrologic model validation tests. Finally, we show that low to moderate complexity models transfer most successfully to new conditions in space and time, providing empirical confirmation of the parsimony principle.
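
    The space-for-time modeling and non-random validation strategy can be sketched compactly. The snippet below uses synthetic site data in place of the SNOTEL observations; the climate-stratified train/test split illustrates the kind of non-random cross-validation the study uses to probe transferability.

```python
# Sketch of a space-for-time model of April 1 SWE from winter temperature
# and precipitation, with a non-random (climate-stratified) validation split.
# All data is synthetic; SNOTEL observations would replace it in practice.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n = 497                                   # number of sites, as in the study
tmean = rng.uniform(-10, 4, n)            # mean winter temperature (deg C)
precip = rng.uniform(200, 2000, n)        # cumulative winter precipitation (mm)
swe = np.maximum(0, 0.6 * precip - 40 * tmean + rng.normal(0, 80, n))

X = np.column_stack([tmean, precip])
# Non-random split: train on warm sites, test on cold sites, so the test
# probes transferability to climates outside the calibration range.
train, test = tmean > -3, tmean <= -3
model = LinearRegression().fit(X[train], swe[train])
print(f"R^2 when extrapolating to colder climates: {model.score(X[test], swe[test]):.2f}")
```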

  11. A Hybrid Approach to Finding Relevant Social Media Content for Complex Domain Specific Information Needs.

    Science.gov (United States)

    Cameron, Delroy; Sheth, Amit P; Jaykumar, Nishita; Thirunarayan, Krishnaprasad; Anand, Gaurish; Smith, Gary A

    2014-12-01

    While contemporary semantic search systems offer to improve classical keyword-based search, they are not always adequate for complex domain-specific information needs. The domain of prescription drug abuse, for example, requires knowledge of both ontological concepts and "intelligible constructs" not typically modeled in ontologies. These intelligible constructs convey essential information, including notions of intensity, frequency, interval, dosage and sentiment, which can be important to the holistic needs of the information seeker. In this paper, we present a hybrid approach to domain-specific information retrieval that integrates ontology-driven query interpretation with synonym-based query expansion and domain-specific rules, to facilitate search in social media on prescription drug abuse. Our framework is based on a context-free grammar (CFG) that defines the query language of constructs interpretable by the search system. The grammar provides two levels of semantic interpretation: 1) a top-level CFG that facilitates retrieval of diverse textual patterns, which belong to broad templates, and 2) a low-level CFG that enables interpretation of specific expressions belonging to such textual patterns. These low-level expressions occur as concepts from four different categories of data: 1) ontological concepts, 2) concepts in lexicons (such as emotions and sentiments), 3) concepts in lexicons with only partial ontology representation, called lexico-ontology concepts (such as side effects and routes of administration (ROA)), and 4) domain-specific expressions (such as date, time, interval, frequency and dosage) derived solely through rules. Our approach is embodied in a novel Semantic Web platform called PREDOSE, which provides search support for complex domain-specific information needs in prescription drug abuse epidemiology. When applied to a corpus of over 1 million drug abuse-related web forum posts, our search framework proved effective in retrieving
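
    A toy fragment in the spirit of the two-level grammar may clarify the design. The productions below are invented for illustration and are not the PREDOSE grammar; NLTK's CFG utilities are used for parsing.

```python
# A toy two-level context-free grammar: the top level (QUERY) defines broad
# templates, the lower levels expand into domain categories such as drug,
# route of administration (ROA), dosage and sentiment. Illustrative only.
import nltk

grammar = nltk.CFG.fromstring("""
  QUERY  -> DRUG ROA | DRUG DOSAGE SENT
  DRUG   -> 'oxycodone' | 'buprenorphine'
  ROA    -> 'snorted' | 'injected'
  DOSAGE -> '8' 'mg'
  SENT   -> 'works' | 'awful'
""")
parser = nltk.ChartParser(grammar)
for tree in parser.parse(['buprenorphine', '8', 'mg', 'works']):
    print(tree)   # prints the parse tree for the matched template
```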

  12. Control-relevant modeling and simulation of a SOFC-GT hybrid system

    Directory of Open Access Journals (Sweden)

    Rambabu Kandepu

    2006-07-01

    Full Text Available In this paper, control-relevant models of the most important components in a SOFC-GT hybrid system are described. Dynamic simulations are performed on the overall hybrid system. The model is used to develop a simple control structure, but the simulations show that more elaborate control is needed.

  13. Modeling Musical Complexity: Commentary on Eerola (2016)

    Directory of Open Access Journals (Sweden)

    Joshua Albrecht

    2016-07-01

    Full Text Available In his paper, "Expectancy violation and information-theoretic models of melodic complexity," Eerola compares a number of models that correlate musical features of monophonic melodies with participant ratings of perceived melodic complexity. He finds that fairly strong results can be achieved using several different approaches to modeling perceived melodic complexity. The data used in this study are gathered from several previously published studies that use widely different types of melodies, including isochronous folk melodies, isochronous 12-tone rows, and rhythmically complex African folk melodies. This commentary first briefly reviews the article's method and main findings, then suggests a rethinking of the theoretical framework of the study. Finally, some of the methodological issues of the study are discussed.

  14. The effects of model and data complexity on predictions from species distributions models

    DEFF Research Database (Denmark)

    García-Callejas, David; Bastos, Miguel

    2016-01-01

    How complex does a model need to be to provide useful predictions is a matter of continuous debate across environmental sciences. In the species distributions modelling literature, studies have demonstrated that more complex models tend to provide better fits. However, studies have also shown...... that predictive performance does not always increase with complexity. Testing of species distributions models is challenging because independent data for testing are often lacking, but a more general problem is that model complexity has never been formally described in such studies. Here, we systematically...

  15. A Primer for Model Selection: The Decisive Role of Model Complexity

    Science.gov (United States)

    Höge, Marvin; Wöhling, Thomas; Nowak, Wolfgang

    2018-03-01

    Selecting a "best" model among several competing candidate models poses an often encountered problem in water resources modeling (and other disciplines which employ models). For a modeler, the best model fulfills a certain purpose best (e.g., flood prediction), which is typically assessed by comparing model simulations to data (e.g., stream flow). Model selection methods find the "best" trade-off between good fit with data and model complexity. In this context, the interpretations of model complexity implied by different model selection methods are crucial, because they represent different underlying goals of modeling. Over the last decades, numerous model selection criteria have been proposed, but modelers who primarily want to apply a model selection criterion often face a lack of guidance for choosing the right criterion that matches their goal. We propose a classification scheme for model selection criteria that helps to find the right criterion for a specific goal, i.e., which employs the correct complexity interpretation. We identify four model selection classes which seek to achieve high predictive density, low predictive error, high model probability, or shortest compression of data. These goals can be achieved by following either nonconsistent or consistent model selection and by either incorporating a Bayesian parameter prior or not. We allocate commonly used criteria to these four classes, analyze how they represent model complexity and what this means for the model selection task. Finally, we provide guidance on choosing the right type of criteria for specific model selection tasks. (A quick guide through all key points is given at the end of the introduction.)
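
    As a concrete illustration of how criteria weigh fit against complexity, the sketch below scores nested polynomial models with AIC (a predictive-density-oriented, nonconsistent criterion) and BIC (a consistent criterion aimed at model probability) on synthetic data. It is not the authors' classification scheme itself, only an example of the trade-off being classified.

```python
# AIC vs BIC for nested polynomial models: both penalize complexity, but
# with different strength and different underlying goals. Synthetic data.
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0, 1, 50)
y = 1.0 + 2.0 * x + rng.normal(0, 0.3, x.size)   # true model is linear

for degree in (1, 2, 5):
    coeffs = np.polyfit(x, y, degree)
    resid = y - np.polyval(coeffs, x)
    k = degree + 2                    # polynomial terms plus noise variance
    n = x.size
    loglik = -0.5 * n * (np.log(2 * np.pi * resid.var()) + 1)
    aic = 2 * k - 2 * loglik
    bic = k * np.log(n) - 2 * loglik
    print(f"degree {degree}: AIC={aic:7.1f}  BIC={bic:7.1f}")
```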

  16. The relevance of non-human primate and rodent malaria models for humans

    Directory of Open Access Journals (Sweden)

    Riley Eleanor

    2011-02-01

    Full Text Available At the 2010 Keystone Symposium on "Malaria: new approaches to understanding Host-Parasite interactions", an extra scientific session to discuss animal models in malaria research was convened at the request of participants. This was prompted by the concern of investigators that skepticism in the malaria community about the use and relevance of animal models, particularly rodent models of severe malaria, has impacted funding decisions and the publication of research using animal models. Several speakers took the opportunity to demonstrate the similarities between findings in rodent models and human severe disease, as well as points of difference. The variety of malaria presentations in the different experimental models parallels the wide diversity of human malaria disease and might, therefore, be viewed as a strength. Many of the key features of human malaria can be replicated in a variety of non-human primate models, which are very under-utilized. The importance of animal models in the discovery of new anti-malarial drugs was emphasized. The major conclusions of the session were that experimental and human studies should be more closely linked so that they inform each other, and that there should be wider access to relevant clinical material.

  17. Towards policy relevant environmental modeling: contextual validity and pragmatic models

    Science.gov (United States)

    Miles, Scott B.

    2000-01-01

    "What makes for a good model?" In various forms, this question is a question that, undoubtedly, many people, businesses, and institutions ponder with regards to their particular domain of modeling. One particular domain that is wrestling with this question is the multidisciplinary field of environmental modeling. Examples of environmental models range from models of contaminated ground water flow to the economic impact of natural disasters, such as earthquakes. One of the distinguishing claims of the field is the relevancy of environmental modeling to policy and environment-related decision-making in general. A pervasive view by both scientists and decision-makers is that a "good" model is one that is an accurate predictor. Thus, determining whether a model is "accurate" or "correct" is done by comparing model output to empirical observations. The expected outcome of this process, usually referred to as "validation" or "ground truthing," is a stamp on the model in question of "valid" or "not valid" that serves to indicate whether or not the model will be reliable before it is put into service in a decision-making context. In this paper, I begin by elaborating on the prevailing view of model validation and why this view must change. Drawing from concepts coming out of the studies of science and technology, I go on to propose a contextual view of validity that can overcome the problems associated with "ground truthing" models as an indicator of model goodness. The problem of how we talk about and determine model validity has much to do about how we perceive the utility of environmental models. In the remainder of the paper, I argue that we should adopt ideas of pragmatism in judging what makes for a good model and, in turn, developing good models. From such a perspective of model goodness, good environmental models should facilitate communication, convey—not bury or "eliminate"—uncertainties, and, thus, afford the active building of consensus decisions, instead

  18. Model Predictive Engine Air-Ratio Control Using Online Sequential Relevance Vector Machine

    Directory of Open Access Journals (Sweden)

    Hang-cheong Wong

    2012-01-01

    Full Text Available Engine power, brake-specific fuel consumption, and emissions relate closely to air ratio (i.e., lambda) among all the engine variables. An accurate and adaptive model for lambda prediction is essential for effective long-term lambda control. This paper utilizes an emerging technique, the relevance vector machine (RVM), to build a reliable time-dependent lambda model which can be continually updated whenever a sample is added to, or removed from, the estimated lambda model. The paper also presents a new model predictive control (MPC) algorithm for air-ratio regulation based on RVM. This study shows that the accuracy, training time, and updating time of the RVM model are superior to those of the latest modelling methods, such as the diagonal recurrent neural network (DRNN) and the decremental least-squares support vector machine (DLSSVM). Moreover, the control algorithm has been implemented on a real car for testing. Experimental results reveal that the control performance of the proposed relevance vector machine model predictive controller (RVMMPC) is also superior to that of DRNNMPC, support vector machine-based MPC, and the conventional proportional-integral (PI) controller in production cars. Therefore, the proposed RVMMPC is a promising scheme to replace the conventional PI controller for engine air-ratio control.
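
    A rough sketch of the idea, under stated substitutions: full RVM implementations are not part of scikit-learn's core, so kernel ridge regression stands in for the lambda model, and a naive one-step search over candidate control inputs stands in for the MPC optimizer. The plant model below is entirely synthetic.

```python
# Data-driven model predictive air-ratio control, sketched. Kernel ridge
# regression is a stand-in for the relevance vector machine; the "plant"
# lambda = f(u, lambda_prev) is a toy response, not a real engine model.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(3)

def plant(u, lam_prev):                 # toy engine response with noise
    return 0.7 * lam_prev + 0.5 * np.tanh(u) + rng.normal(0, 0.01)

# Excite the plant to collect training data, then fit the lambda model.
U = rng.uniform(-1, 1, 200)
lam = np.empty(201); lam[0] = 1.0
for t in range(200):
    lam[t + 1] = plant(U[t], lam[t])
X = np.column_stack([U, lam[:-1]])
model = KernelRidge(kernel="rbf", alpha=1e-3).fit(X, lam[1:])

# One-step MPC: pick the control that drives predicted lambda to the target.
target, lam_now = 1.0, 0.8
candidates = np.linspace(-1, 1, 101)
pred = model.predict(np.column_stack([candidates, np.full(101, lam_now)]))
u_best = candidates[np.argmin((pred - target) ** 2)]
print(f"chosen control input: {u_best:.2f}")
```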

  19. Linking Complexity and Sustainability Theories: Implications for Modeling Sustainability Transitions

    Directory of Open Access Journals (Sweden)

    Camaren Peter

    2014-03-01

    Full Text Available In this paper, we deploy complexity theory as the foundation for integrating different theoretical approaches to sustainability and develop a rationale for a complexity-based framework for modeling transitions to sustainability. We propose a framework based on a comparison of the complex systems’ properties that characterize the different theories dealing with transitions to sustainability. We argue that adopting a complexity-theory-based approach for modeling transitions requires going beyond deterministic frameworks, by adopting a probabilistic, integrative, inclusive and adaptive approach that can support transitions. We also illustrate how this complexity-based modeling framework can be implemented; i.e., how it can be used to select modeling techniques that address particular properties of complex systems that we need to understand in order to model transitions to sustainability. In doing so, we establish a complexity-based approach towards modeling sustainability transitions that caters for the broad range of complex systems’ properties that are required to model transitions to sustainability.

  20. A Sensitivity Analysis Method to Study the Behavior of Complex Process-based Models

    Science.gov (United States)

    Brugnach, M.; Neilson, R.; Bolte, J.

    2001-12-01

    The use of process-based models as a tool for scientific inquiry is becoming increasingly relevant in ecosystem studies. Process-based models are artificial constructs that simulate the system by mechanistically mimicking the functioning of its component processes. Structurally, a process-based model can be characterized in terms of its processes and the relationships established among them. Each process comprises a set of functional relationships among several model components (e.g., state variables, parameters and input data). While not encoded explicitly, the dynamics of the model emerge from this set of components and interactions organized in terms of processes. It is the task of the modeler to guarantee that the dynamics generated are appropriate and semantically equivalent to the phenomena being modeled. Despite the availability of techniques to characterize and understand model behavior, they do not suffice to completely and easily understand how a complex process-based model operates. For example, sensitivity analysis studies model behavior by determining the rate of change in model output as parameters or input data are varied. One of the problems with this approach is that it considers the model as a "black box" and focuses on explaining model behavior by analyzing the input-output relationship. Since these models have a high degree of non-linearity, understanding how the input affects an output can be an extremely difficult task. Operationally, the application of this technique may constitute a challenging task because complex process-based models are generally characterized by a large parameter space. In order to overcome some of these difficulties, we propose a method of sensitivity analysis applicable to complex process-based models. This method focuses sensitivity analysis at the process level, and it aims to determine how sensitive the model output is to variations in the processes. Once the processes that exert the major influence in
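
    A minimal sketch of process-level sensitivity analysis is given below. It uses a simple one-at-a-time perturbation of process multipliers on a toy two-process model, which illustrates the focus on processes rather than individual parameters; it is not the authors' exact method.

```python
# One-at-a-time process-level sensitivity: scale each process in a toy
# two-process ecosystem model and record the change in output.
import numpy as np

def growth(biomass, light):        # process 1: photosynthetic growth (toy)
    return 0.1 * biomass * light

def respiration(biomass):          # process 2: respiration loss (toy)
    return 0.05 * biomass

def run_model(g_scale=1.0, r_scale=1.0, steps=100):
    biomass = 1.0
    for t in range(steps):
        light = 0.5 * (1 + np.sin(2 * np.pi * t / 24))   # diurnal forcing
        biomass += g_scale * growth(biomass, light) - r_scale * respiration(biomass)
    return biomass

base = run_model()
for name, kwargs in [("growth", dict(g_scale=1.1)), ("respiration", dict(r_scale=1.1))]:
    delta = (run_model(**kwargs) - base) / base
    print(f"+10% in {name} process -> {delta:+.1%} change in final biomass")
```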

  1. Integrating modelling and phenotyping approaches to identify and screen complex traits - Illustration for transpiration efficiency in cereals.

    Science.gov (United States)

    Chenu, K; van Oosterom, E J; McLean, G; Deifel, K S; Fletcher, A; Geetika, G; Tirfessa, A; Mace, E S; Jordan, D R; Sulman, R; Hammer, G L

    2018-02-21

    Following advances in genetics, genomics, and phenotyping, trait selection in breeding is limited by our ability to understand interactions within the plants and with their environments, and to target traits of most relevance for the target population of environments. We propose an integrated approach that combines insights from crop modelling, physiology, genetics, and breeding to identify traits valuable for yield gain in the target population of environments, develop relevant high-throughput phenotyping platforms, and identify genetic controls and their values in production environments. This paper uses transpiration efficiency (biomass produced per unit of water used) as an example of a complex trait of interest to illustrate how the approach can guide modelling, phenotyping, and selection in a breeding program. We believe that this approach, by integrating insights from diverse disciplines, can increase the resource use efficiency of breeding programs for improving yield gains in target populations of environments.

  2. Performance evaluation of functioning of natural-industrial system of mining-processing complex with help of analytical and mathematical models

    Science.gov (United States)

    Bosikov, I. I.; Klyuev, R. V.; Revazov, V. Ch; Pilieva, D. E.

    2018-03-01

    The article presents research and analysis of hazardous processes occurring in a natural-industrial system, and an assessment of the effectiveness of its functioning using mathematical models. Studies of the regularities governing the functioning of natural-industrial systems are becoming increasingly relevant in connection with the task of modernizing production and the economy of Russia as a whole. Because a significant amount of the available data is poorly structured, it is difficult to formulate regulations for the effective functioning of production processes and social and natural complexes under which sustainable development of the natural-industrial system of the mining and processing complex would be ensured. Therefore, scientific and applied problems whose solution makes it possible to formalize the hidden structural patterns in the functioning of the natural-industrial system, and to make organizational and technological management decisions that improve the efficiency of the system, are highly relevant.

  3. A multiple relevance feedback strategy with positive and negative models.

    Directory of Open Access Journals (Sweden)

    Yunlong Ma

    Full Text Available A commonly used strategy to improve search accuracy is through feedback techniques. Most existing work on feedback relies on positive information and has been extensively studied in information retrieval. However, when a query topic is difficult and the results from the first-pass retrieval are very poor, it is impossible to extract enough useful terms from the few positive documents, so the positive feedback strategy is incapable of improving retrieval in this situation. In contrast, there is a relatively large number of negative documents at the top of the result list, and several recent studies have confirmed that negative feedback is an important and useful strategy for this scenario. In this paper, we consider a scenario where the search results are so poor that there are at most three relevant documents in the top twenty. We then conduct a novel study of multiple strategies for relevance feedback, using both positive and negative examples from the first-pass retrieval, to improve retrieval accuracy for such difficult queries. Experimental results on TREC collections show that the proposed language-model-based multiple-model feedback method is generally more effective than both the baseline method and methods using only a positive or a negative model.
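
    The combined use of positive and negative examples can be illustrated with a vector-space analogue, the classic Rocchio update. The paper's method is language-model based, so this sketch only conveys the shared intuition of moving the query toward relevant and away from non-relevant documents; all vectors are synthetic.

```python
# Rocchio-style feedback combining a positive and a negative model: push
# the query toward judged-relevant documents and away from non-relevant
# ones, then re-rank. Synthetic TF-IDF-like vectors stand in for documents.
import numpy as np

rng = np.random.default_rng(4)
docs = rng.random((100, 50))                      # document vectors
query = rng.random(50)

first_pass = np.argsort(-docs @ query)[:20]       # initial top-20 ranking
positives = docs[first_pass[:2]]                  # the few judged relevant
negatives = docs[first_pass[2:10]]                # many judged non-relevant

alpha, beta, gamma = 1.0, 0.75, 0.25              # classic Rocchio weights
query_new = (alpha * query
             + beta * positives.mean(axis=0)
             - gamma * negatives.mean(axis=0))

second_pass = np.argsort(-docs @ query_new)[:20]
print("re-ranked top-5:", second_pass[:5])
```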

  4. Complexity-aware simple modeling.

    Science.gov (United States)

    Gómez-Schiavon, Mariana; El-Samad, Hana

    2018-02-26

    Mathematical models continue to be essential for deepening our understanding of biology. On one extreme, simple or small-scale models help delineate general biological principles. However, the parsimony of detail in these models, as well as their assumption of modularity and insulation, makes them inaccurate for describing quantitative features. On the other extreme, large-scale and detailed models can quantitatively recapitulate a phenotype of interest, but have to rely on many unknown parameters, making them often difficult to parse mechanistically and to use for extracting general principles. We discuss some examples of a new approach - complexity-aware simple modeling - that can bridge the gap between the small-scale and large-scale approaches. Copyright © 2018 Elsevier Ltd. All rights reserved.

  5. Computational models of complex systems

    CERN Document Server

    Dabbaghian, Vahid

    2014-01-01

    Computational and mathematical models provide us with the opportunities to investigate the complexities of real world problems. They allow us to apply our best analytical methods to define problems in a clearly mathematical manner and exhaustively test our solutions before committing expensive resources. This is made possible by assuming parameter(s) in a bounded environment, allowing for controllable experimentation, not always possible in live scenarios. For example, simulation of computational models allows the testing of theories in a manner that is both fundamentally deductive and experimental in nature. The main ingredients for such research ideas come from multiple disciplines and the importance of interdisciplinary research is well recognized by the scientific community. This book provides a window to the novel endeavours of the research communities to present their works by highlighting the value of computational modelling as a research tool when investigating complex systems. We hope that the reader...

  6. Equation-free model reduction for complex dynamical systems

    International Nuclear Information System (INIS)

    Le Maitre, O. P.; Mathelin, L.

    2010-01-01

    This paper presents a reduced model strategy for simulation of complex physical systems. A classical reduced basis is first constructed relying on proper orthogonal decomposition of the system. Then, unlike the alternative approaches, such as Galerkin projection schemes for instance, an equation-free reduced model is constructed. It consists in the determination of an explicit transformation, or mapping, for the evolution over a coarse time-step of the projection coefficients of the system state on the reduced basis. The mapping is expressed as an explicit polynomial transformation of the projection coefficients and is computed once and for all in a pre-processing stage using the detailed model equation of the system. The reduced system can then be advanced in time by successive applications of the mapping. The CPU cost of the method lies essentially in the mapping approximation which is performed offline, in a parallel fashion, and only once. Subsequent application of the mapping to perform a time-integration is carried out at a low cost thanks to its explicit character. Application of the method is considered for the 2-D flow around a circular cylinder. We investigate the effectiveness of the reduced model in rendering the dynamics for both asymptotic state and transient stages. It is shown that the method leads to a stable and accurate time-integration for only a fraction of the cost of a detailed simulation, provided that the mapping is properly approximated and the reduced basis remains relevant for the dynamics investigated. (authors)
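
    The two ingredients of the method - a POD basis extracted from snapshots and an explicit polynomial mapping that advances the projection coefficients over a coarse time step - can be sketched as follows. A toy decaying traveling wave replaces the 2-D cylinder-flow data, and the quadratic feature set is an illustrative choice.

```python
# POD basis via SVD, then an explicit quadratic mapping a(t+1) = M(a(t))
# fitted once offline and applied repeatedly to advance the reduced model.
import numpy as np

x = np.linspace(0, 2 * np.pi, 128)
times = np.arange(0, 200)
snapshots = np.array([np.sin(x - 0.05 * t) * np.exp(-0.001 * t) for t in times])

# POD basis: leading right-singular vectors of the snapshot matrix.
U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
basis = Vt[:2]                                    # two POD modes
coeffs = snapshots @ basis.T                      # projection coefficients a(t)

# Fit the explicit mapping on quadratic features, computed once offline.
A = coeffs[:-1]
feats = np.column_stack([A, A ** 2, A[:, :1] * A[:, 1:], np.ones(len(A))])
M, *_ = np.linalg.lstsq(feats, coeffs[1:], rcond=None)

# Advance the reduced system by successive applications of the mapping.
a = coeffs[0]
for _ in range(100):
    f = np.concatenate([a, a ** 2, [a[0] * a[1], 1.0]])
    a = f @ M
err = np.linalg.norm(a - coeffs[100]) / np.linalg.norm(coeffs[100])
print(f"relative error after 100 coarse steps: {err:.3f}")
```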

  7. Possible self-complexity and affective reactions to goal-relevant evaluation.

    Science.gov (United States)

    Niedenthal, P M; Setterlund, M B; Wherry, M B

    1992-07-01

    The complexity of people's self-concept appears to be inversely related to the intensity of their reactions to evaluative feedback about present goals and abilities (Linville, 1985, 1987). The idea that the complexity of individuals' possible self-concept similarly mediates reactions to feedback regarding future goals was investigated. Two preliminary studies suggested that complexity of the actual self only explains 20% to 30% of the variance in possible self-complexity. Three studies were conducted. Support was found for the idea that possible self-complexity mediates affective reactions to evaluative feedback about future goals and actual self-complexity mediates affective reactions to evaluative feedback about present goals. The findings underscore the independent roles of the organization of actual and possible self-concepts in affective processes.

  8. The relevance of non-human primate and rodent malaria models for humans

    OpenAIRE

    Langhorne, Jean; Buffet, Pierre; Galinski, Mary; Good, Michael; Harty, John; Leroy, Didier; Mota, Maria M; Pasini, Erica; Renia, Laurent; Riley, Eleanor; Stins, Monique; Duffy, Patrick

    2011-01-01

    At the 2010 Keystone Symposium on "Malaria: new approaches to understanding Host-Parasite interactions", an extra scientific session to discuss animal models in malaria research was convened at the request of participants. This was prompted by the concern of investigators that skepticism in the malaria community about the use and relevance of animal models, particularly rodent models of severe malaria, has impacted funding decisions and the publication of research using animal models....

  9. Control Relevant Modeling and Design of Scramjet-Powered Hypersonic Vehicles

    Science.gov (United States)

    Dickeson, Jeffrey James

    This report provides an overview of scramjet-powered hypersonic vehicle modeling and control challenges. Such vehicles are characterized by unstable non-minimum phase dynamics with significant coupling and low thrust margins. Recent trends in hypersonic vehicle research are summarized. To illustrate control-relevant design issues and tradeoffs, a generic nonlinear 3DOF longitudinal dynamics model capturing aero-elastic-propulsive interactions for a wedge-shaped vehicle is used. Limitations of the model are discussed, and numerous modifications have been made to address control-relevant needs. Two different baseline configurations are examined over a two-stage-to-orbit ascent trajectory. The report highlights how vehicle level-flight static (trim) and dynamic properties change over the trajectory. Thermal choking constraints are imposed on control system design as a direct consequence of having a finite fuel equivalence ratio (FER) margin. The implications of this state-dependent nonlinear FER margin constraint, the right half plane (RHP) zero, and lightly damped flexible modes for control system bandwidth (BW) and flight path angle (FPA) tracking are discussed. A control methodology is proposed that addresses the above dynamics while providing some robustness to modeling uncertainty. Vehicle closure (the ability to fly a trajectory segment subject to constraints) is provided through a proposed vehicle design methodology. The design method attempts to use open-loop metrics whenever possible to design the vehicle, and is applied to a vehicle/control-law closed-loop nonlinear simulation for validation. The 3DOF longitudinal modeling results are validated against a newly released NASA 6DOF code.

  10. On Feature Relevance in Image-Based Prediction Models: An Empirical Study

    DEFF Research Database (Denmark)

    Konukoglu, E.; Ganz, Melanie; Van Leemput, Koen

    2013-01-01

    Determining disease-related variations of the anatomy and function is an important step in better understanding diseases and developing early diagnostic systems. In particular, image-based multivariate prediction models and the “relevant features” they produce are attracting attention from the co...

  11. Elements of complexity in subsurface modeling, exemplified with three case studies

    Energy Technology Data Exchange (ETDEWEB)

    Freedman, Vicky L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Truex, Michael J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Rockhold, Mark [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Bacon, Diana H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Freshley, Mark D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wellman, Dawn M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2017-04-03

    There are complexity elements to consider when applying subsurface flow and transport models to support environmental analyses. Modelers balance the benefits and costs of modeling along the spectrum of complexity, taking into account the attributes of more simple models (e.g., lower cost, faster execution, easier to explain, less mechanistic) and the attributes of more complex models (higher cost, slower execution, harder to explain, more mechanistic and technically defensible). In this paper, modeling complexity is examined with respect to considering this balance. The discussion of modeling complexity is organized into three primary elements: 1) modeling approach, 2) description of process, and 3) description of heterogeneity. Three examples are used to examine these complexity elements. Two of the examples use simulations generated from a complex model to develop simpler models for efficient use in model applications. The first example is designed to support performance evaluation of soil vapor extraction remediation in terms of groundwater protection. The second example investigates the importance of simulating different categories of geochemical reactions for carbon sequestration and selecting appropriate simplifications for use in evaluating sequestration scenarios. In the third example, the modeling history for a uranium-contaminated site demonstrates that conservative parameter estimates were inadequate surrogates for complex, critical processes and there is discussion on the selection of more appropriate model complexity for this application. All three examples highlight how complexity considerations are essential to create scientifically defensible models that achieve a balance between model simplification and complexity.

  12. Prototypes and matrix relevance learning in complex Fourier space

    NARCIS (Netherlands)

    Straat, M.; Kaden, M.; Gay, M.; Villmann, T.; Lampe, Alexander; Seiffert, U.; Biehl, M.; Melchert, F.

    2017-01-01

    In this contribution, we consider the classification of time series and similar functional data which can be represented in complex Fourier coefficient space. We apply versions of Learning Vector Quantization (LVQ) which are suitable for complex-valued data, based on the so-called Wirtinger calculus.
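
    A bare-bones illustration of prototype learning on complex Fourier coefficients is given below. It uses a plain LVQ1-style update and omits the matrix relevance learning that is the subject of the paper; the waveform classes and all parameters are invented.

```python
# Prototype-based classification in complex Fourier space: time series are
# mapped to a few complex Fourier coefficients and LVQ1-style updates are
# applied. Distances use complex moduli; matrix relevance learning omitted.
import numpy as np

rng = np.random.default_rng(5)
t = np.linspace(0, 1, 64, endpoint=False)

def sample(cls):                            # two classes of noisy waveforms
    freq, phase = (3, 0.0) if cls == 0 else (5, np.pi / 4)
    jitter = rng.normal(0, 0.1)
    return np.sin(2 * np.pi * freq * t + phase + jitter) + rng.normal(0, 0.2, t.size)

X = np.array([np.fft.rfft(sample(c))[:8] for c in (0, 1) * 100])   # complex features
y = np.array([0, 1] * 100)

protos = X[:2].copy()                       # one prototype per class
for lr in np.linspace(0.1, 0.01, 30):       # decaying learning rate
    for xi, yi in zip(X, y):
        d = np.abs(protos - xi).sum(axis=1)        # distance via complex moduli
        w = int(np.argmin(d))                      # winning prototype
        protos[w] += (lr if w == yi else -lr) * (xi - protos[w])

pred = np.argmin(np.abs(X[:, None, :] - protos).sum(axis=2), axis=1)
print(f"training accuracy: {(pred == y).mean():.2f}")
```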

  13. Coupled variable selection for regression modeling of complex treatment patterns in a clinical cancer registry.

    Science.gov (United States)

    Schmidtmann, I; Elsäßer, A; Weinmann, A; Binder, H

    2014-12-30

    For determining a manageable set of covariates potentially influential with respect to a time-to-event endpoint, Cox proportional hazards models can be combined with variable selection techniques, such as stepwise forward selection or backward elimination based on p-values, or regularized regression techniques such as component-wise boosting. Cox regression models have also been adapted for dealing with more complex event patterns, for example, for competing risks settings with separate, cause-specific hazard models for each event type, or for determining the prognostic effect pattern of a variable over different landmark times, with one conditional survival model for each landmark. Motivated by a clinical cancer registry application, where complex event patterns have to be dealt with and variable selection is needed at the same time, we propose a general approach for linking variable selection between several Cox models. Specifically, we combine score statistics for each covariate across models by Fisher's method as a basis for variable selection. This principle is implemented for a stepwise forward selection approach as well as for a regularized regression technique. In an application to data from hepatocellular carcinoma patients, the coupled stepwise approach is seen to facilitate joint interpretation of the different cause-specific Cox models. In conditional survival models at landmark times, which address updates of prediction as time progresses and both treatment and other potential explanatory variables may change, the coupled regularized regression approach identifies potentially important, stably selected covariates together with their effect time pattern, despite having only a small number of events. These results highlight the promise of the proposed approach for coupling variable selection between Cox models, which is particularly relevant for modeling for clinical cancer registries with their complex event patterns. Copyright © 2014 John Wiley & Sons
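
    Fisher's method itself is simple to apply once per-model score-test p-values for a covariate are available; scipy.stats.combine_pvalues implements it directly. The p-values below are invented for illustration.

```python
# Fisher's method for combining, per covariate, the score-test p-values
# obtained from several Cox models (e.g., one per competing risk or
# landmark time). The p-values are made up for illustration.
import numpy as np
from scipy import stats

pvals = [0.04, 0.20, 0.08]          # one covariate across three models
stat, p_combined = stats.combine_pvalues(pvals, method="fisher")
print(f"chi2 = {stat:.2f}, combined p = {p_combined:.4f}")

# Equivalently: -2 * sum(log p) follows chi-squared with 2k degrees of freedom.
stat_manual = -2 * np.log(pvals).sum()
p_manual = stats.chi2.sf(stat_manual, df=2 * len(pvals))
assert abs(p_manual - p_combined) < 1e-10
```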

  14. Investigating the need for complex vs. simple scenarios to improve predictions of aquatic ecosystem exposure with the SoilPlus model

    International Nuclear Information System (INIS)

    Ghirardello, Davide; Morselli, Melissa; Otto, Stefan; Zanin, Giuseppe; Di Guardo, Antonio

    2014-01-01

    A spatially-explicit version of the recent multimedia fate model SoilPlus was developed and applied to predict the runoff of three pesticides in a small agricultural watershed in north-eastern Italy. In order to evaluate the model response to increasing spatial resolution, a tiered simulation approach was adopted, also using a dynamic model for surface water (the DynA model), to predict the fate of pesticides in runoff water and sediment, and concentrations in river water. Simulation outputs were compared to water concentrations measured in the basin. Results showed that high spatial resolution and scenario complexity improved model predictions of metolachlor and terbuthylazine in runoff to an acceptable performance (R^2 = 0.64-0.70). The importance of a field-based database of properties (i.e. soil texture and organic carbon, rainfall and water flow, pesticide half-lives in soil) in reducing the distance between predicted and measured surface water concentrations, and its relevance for risk assessment, was also shown.
    Highlights:
    • A GIS-based model was developed to predict pesticide fate in soil and water.
    • The spatial scenario was obtained at field level for a small agricultural basin.
    • A tiered strategy was applied to test the performance gain with complexity.
    • Increased scenario detail, as well as the role of surface water, is relevant.
    -- In order to obtain more ecologically realistic predictions of pulse exposure in aquatic ecosystems, detailed information about the scenario is required

  15. Analysis of Evolutionarily Independent Protein-RNA Complexes Yields a Criterion to Evaluate the Relevance of Prebiotic Scenarios.

    Science.gov (United States)

    Blanco, Celia; Bayas, Marco; Yan, Fu; Chen, Irene A

    2018-02-19

    A central difficulty facing study of the origin of life on Earth is evaluating the relevance of different proposed prebiotic scenarios. Perhaps the most established feature of the origin of life was the progression through an RNA World, a prebiotic stage dominated by functional RNA. We use the appearance of proteins in the RNA World to understand the prebiotic milieu and develop a criterion to evaluate proposed synthetic scenarios. Current consensus suggests that the earliest amino acids of the genetic code were anionic or small hydrophobic or polar amino acids. However, the ability to interact with the RNA World would have been a crucial feature of early proteins. To determine which amino acids would be important for the RNA World, we analyze non-biological protein-aptamer complexes in which the RNA or DNA is the result of in vitro evolution. This approach avoids confounding effects of biological context and evolutionary history. We use bioinformatic analysis and molecular dynamics simulations to characterize these complexes. We find that positively charged and aromatic amino acids are over-represented whereas small hydrophobic amino acids are under-represented. Binding enthalpy is found to be primarily electrostatic, with positively charged amino acids contributing cooperatively to binding enthalpy. Arginine dominates all modes of interaction at the interface. These results suggest that proposed prebiotic syntheses must be compatible with cationic amino acids, particularly arginine or a biophysically similar amino acid, in order to be relevant to the invention of protein by the RNA World. Copyright © 2018 Elsevier Ltd. All rights reserved.

  16. Cumulative complexity: a functional, patient-centered model of patient complexity can improve research and practice.

    Science.gov (United States)

    Shippee, Nathan D; Shah, Nilay D; May, Carl R; Mair, Frances S; Montori, Victor M

    2012-10-01

    To design a functional, patient-centered model of patient complexity with practical applicability to analytic design and clinical practice. Existing literature on patient complexity has mainly identified its components descriptively and in isolation, lacking clarity as to their combined functions in disrupting care or to how complexity changes over time. The authors developed a cumulative complexity model, which integrates existing literature and emphasizes how clinical and social factors accumulate and interact to complicate patient care. A narrative literature review is used to explicate the model. The model emphasizes a core, patient-level mechanism whereby complicating factors impact care and outcomes: the balance between patient workload of demands and patient capacity to address demands. Workload encompasses the demands on the patient's time and energy, including demands of treatment, self-care, and life in general. Capacity concerns ability to handle work (e.g., functional morbidity, financial/social resources, literacy). Workload-capacity imbalances comprise the mechanism driving patient complexity. Treatment and illness burdens serve as feedback loops, linking negative outcomes to further imbalances, such that complexity may accumulate over time. With its components largely supported by existing literature, the model has implications for analytic design, clinical epidemiology, and clinical practice. Copyright © 2012 Elsevier Inc. All rights reserved.

  17. Highly Relevant Mentoring (HRM) as a Faculty Development Model for Web-Based Instruction

    Science.gov (United States)

    Carter, Lorraine; Salyers, Vincent; Page, Aroha; Williams, Lynda; Albl, Liz; Hofsink, Clarence

    2012-01-01

    This paper describes a faculty development model called the highly relevant mentoring (HRM) model; the model includes a framework as well as some practical strategies for meeting the professional development needs of faculty who teach web-based courses. The paper further emphasizes the need for faculty and administrative buy-in for HRM and…

  18. Is Model-Based Development a Favorable Approach for Complex and Safety-Critical Computer Systems on Commercial Aircraft?

    Science.gov (United States)

    Torres-Pomales, Wilfredo

    2014-01-01

    A system is safety-critical if its failure can endanger human life or cause significant damage to property or the environment. State-of-the-art computer systems on commercial aircraft are highly complex, software-intensive, functionally integrated, and network-centric systems of systems. Ensuring that such systems are safe and comply with existing safety regulations is costly and time-consuming as the level of rigor in the development process, especially the validation and verification activities, is determined by considerations of system complexity and safety criticality. A significant degree of care and deep insight into the operational principles of these systems is required to ensure adequate coverage of all design implications relevant to system safety. Model-based development methodologies, methods, tools, and techniques facilitate collaboration and enable the use of common design artifacts among groups dealing with different aspects of the development of a system. This paper examines the application of model-based development to complex and safety-critical aircraft computer systems. Benefits and detriments are identified and an overall assessment of the approach is given.

  19. Clinical relevance of positive voltage-gated potassium channel (VGKC)-complex antibodies: experience from a tertiary referral centre.

    Science.gov (United States)

    Paterson, Ross W; Zandi, Michael S; Armstrong, Richard; Vincent, Angela; Schott, Jonathan M

    2014-06-01

    Voltage-gated potassium channel (VGKC)-complex antibodies can be associated with a range of immunotherapy-responsive clinical presentations including limbic encephalitis, Morvan's syndrome and acquired neuromyotonia. However, there are patients with positive levels in whom the significance is uncertain. To evaluate the clinical significance associated with positive (>100 pM) VGKC-complex antibodies. Over a 4-year period, 1053 samples were sent for testing of which 55 were positive. The clinical presentations, final diagnoses and responses to immunotherapies, when given, were assessed retrospectively and the likelihood of autoimmunity was categorised as definite, possible, unlikely or undetermined (modified from Zuliani et al 2012). Only 4 of the 32 patients with low-positive (100-400 pM) levels were considered definitely autoimmune, 3 with peripheral nerve hyperexcitability and 1 with a thymoma; 3 were given immunotherapies. Of the remaining 28 with low-positive levels, 13 (3 of whom had tumours) were considered possibly autoimmune, and 15 were unlikely or undetermined; 1 was given immunotherapy unsuccessfully. Of the 23 patients with high-positive (>400 pM) levels, 12 were given immunotherapies, 11 of whom showed a good response. 11 were considered definitely autoimmune, 10 with limbic encephalitis (antibody specificity: 5 LGI1, 1 contactin2, 2 negative, 2 untested) and 1 with a tumour. In the remaining 12, autoimmunity was considered possible (n=9; most had not received immunotherapies), or unlikely (n=3). As antibody testing becomes more widely available, and many samples are referred from patients with less clear-cut diagnoses, it is important to assess the utility of the results. VGKC-complex antibodies in the range of 100-400 pM (0.1-0.4 nM) were considered clinically relevant in rare conditions with peripheral nerve hyperexcitability and appeared to associate with tumours (12.5%). By contrast high-positive (>400 pM; >0.4 nM) levels were considered definitely

  20. On spin and matrix models in the complex plane

    International Nuclear Information System (INIS)

    Damgaard, P.H.; Heller, U.M.

    1993-01-01

    We describe various aspects of statistical mechanics defined in the complex temperature or coupling-constant plane. Using exactly solvable models, we analyse such aspects as renormalization group flows in the complex plane, the distribution of partition function zeros, and the question of new coupling-constant symmetries of complex-plane spin models. The double-scaling form of matrix models is shown to be exactly equivalent to finite-size scaling of two-dimensional spin systems. This is used to show that the string susceptibility exponents derived from matrix models can be obtained numerically with very high accuracy from the scaling of finite-N partition function zeros in the complex plane. (orig.)

  1. Relevance of separation science and technology to nuclear fuel complex operations

    International Nuclear Information System (INIS)

    Rao, S.M.; Ojha, P.B.; Rajashri, M.; Mirji, K.V.; Kalidas, R.

    2004-01-01

    During the last three decades at the Nuclear Fuel Complex (NFC), Hyderabad, the science and technology of separation has been practiced to produce various reactor-grade materials in tonnage quantities in the fields of Zr/Hf, U and Nb/Ta. Apart from this, separation science is also used in the production of various high-purity materials and in the analytical field. The separation science and technology used in the production and characterisation of reactor-grade materials has many striking differences from that of the common metals. The relevance and significance of separation science in the field of nuclear materials arises mainly from the harmful effects, with respect to corrosion behaviour and neutron absorption, caused by the presence of impurities, which must be brought down to ppm or sub-ppm levels. In many cases, low separation factors, often in multi-component systems, call for effective process control at every stage of bulk production so as to obtain a consistent quality product. This article brings out the importance of separation science and technology and the various process standardisations and developments that have been carried out at NFC, from laboratory scale to pilot scale and up to industrial-scale production, in the case of (i) uranium refining, (ii) Zr-Hf separation, (iii) Ta-Nb separation and (iv) high-purity materials production. (author)

  2. Modeling OPC complexity for design for manufacturability

    Science.gov (United States)

    Gupta, Puneet; Kahng, Andrew B.; Muddu, Swamy; Nakagawa, Sam; Park, Chul-Hong

    2005-11-01

    Increasing design complexity in sub-90nm designs results in increased mask complexity and cost. Resolution enhancement techniques (RET) such as assist feature addition, phase shifting (attenuated PSM) and aggressive optical proximity correction (OPC) help preserve feature fidelity in silicon but increase mask complexity and cost. The growth in data volume with rising mask complexity is becoming prohibitive for manufacturing. Mask cost is determined by mask write time and mask inspection time, which are directly related to the complexity of the features printed on the mask. Aggressive RETs increase complexity by adding assist features and by modifying existing features. Passing design intent to OPC has been identified in several recent works as a solution for reducing mask complexity and cost. The goal of design-aware OPC is to relax the OPC tolerances of layout features to minimize mask cost without sacrificing parametric yield. To convey optimal OPC tolerances for manufacturing, design optimization should drive OPC tolerance optimization using models of mask cost for devices and wires. Design optimization should be aware of the impact of OPC correction levels on mask cost and on the performance of the design. This work introduces mask cost characterization (MCC), which quantifies OPC complexity, measured in terms of the fracture count of the mask, for different OPC tolerances. MCC with different OPC tolerances is a critical step in linking design and manufacturing. In this paper, we present an MCC methodology that provides models of the fracture count of standard cells and wire patterns for use in design optimization. MCC cannot be performed by designers, as they do not have access to foundry OPC recipes and RET tools. To build a fracture count model, we perform OPC and fracturing on a limited set of standard cells and wire configurations with all tolerance combinations. Separately, we identify the characteristics of the layout that impact fracture count. Based on the fracture count (FC) data
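
    The truncated final sentence points at the regression step: fitting fracture count to layout characteristics and OPC tolerance. A schematic stand-in is sketched below; the features, counts and linear form are invented for illustration, since real fracture-count data comes from the foundry's OPC and fracturing tools:

```python
import numpy as np

# Hypothetical training table: one row per (cell, OPC tolerance) pair,
# with simple layout features and the fracture count (FC) observed after
# OPC and fracturing.  All numbers are invented for illustration.
# columns: edge count, corner count, OPC tolerance (nm)
X = np.array([[120, 34, 2], [120, 34, 4], [220, 61, 2],
              [220, 61, 4], [310, 95, 2], [310, 95, 4]], dtype=float)
fc = np.array([480, 310, 900, 560, 1320, 800], dtype=float)

# Linear FC model: FC ~ b0 + b1*edges + b2*corners + b3*tolerance
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, fc, rcond=None)

def predict_fc(edges, corners, tol_nm):
    """Predicted fracture count for a cell at a given OPC tolerance."""
    return float(coef @ np.array([1.0, edges, corners, tol_nm]))

print(f"predicted FC at tolerance 3 nm: {predict_fc(220, 61, 3):.0f}")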

  3. Spectroscopic investigation of complexation of Cm(III) und Eu(III) with partitioning-relevant N-donor ligands

    International Nuclear Information System (INIS)

    Bremer, Antje

    2014-01-01

    The separation of trivalent actinides and lanthanides is an essential part of the development of improved nuclear fuel cycles. Liquid-liquid extraction is an applicable technique to achieve this separation. Due to the chemical similarity and the almost identical ionic radii of trivalent actinides and lanthanides, this separation is, however, only feasible with highly selective extracting agents. It has been proven that molecules with soft sulphur or nitrogen donor atoms have a higher affinity for trivalent actinides. In the present work, the complexation of Cm(III) and Eu(III) with N-donor ligands relevant for partitioning has been studied by time-resolved laser fluorescence spectroscopy (TRLFS). This work aims at a better understanding of the molecular origin of the selectivity of these ligands. In this context, enormous effort has been and is still put into detailed investigations of BTP and BTBP ligands, which are the most successful N-donor ligands for the selective extraction of trivalent actinides to date. Additionally, the complexation and extraction behavior of molecules which are structurally related to these ligands is studied. The ligand C5-BPP (2,6-bis(5-(2,2-dimethylpropyl)-1H-pyrazol-3-yl)pyridine), in which the triazine rings of the aromatic backbone of the BTP ligands have been replaced by pyrazole rings, is one of these molecules. Laser fluorescence spectroscopic investigation of the complexation of Cm(III) with this ligand revealed the stepwise formation of three (Cm(C5-BPP)ₙ)³⁺ complexes (n = 1-3). The stability constant of the 1:3 complex was determined (log β₃ = 14.8 ± 0.4). Extraction experiments have shown that, in contrast to BTP and BTBP ligands, C5-BPP needs an additional lipophilic anion source, such as a 2-bromocarboxylic acid, to selectively extract trivalent actinides from nitric acid solutions. The comparison of the stability constant of the (Cm(C5-BPP)₃)³⁺ complex with the stability constant of the (Cm(nPr-BTP)₃)³⁺ complex
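
    The reported stepwise 1:1, 1:2 and 1:3 complexes can be turned into a species distribution by simple mass action. The sketch below uses the quoted log β₃ = 14.8 but assumes placeholder values for log β₁ and log β₂ (not given above) and ignores activity corrections:

```python
import numpy as np

# Species fractions of Cm(3+), Cm(L)^3+, Cm(L)2^3+ and Cm(L)3^3+ versus the
# free ligand concentration [L], from overall stability constants
# beta_n = [CmL_n] / ([Cm][L]^n).  log(beta3) = 14.8 is quoted in the
# abstract; log(beta1) and log(beta2) below are placeholders.
log_beta = np.array([0.0, 5.0, 10.0, 14.8])   # n = 0..3, with beta0 = 1
L_free = np.logspace(-8, -3, 6)               # mol/L of free C5-BPP

for L in L_free:
    terms = 10.0 ** log_beta * L ** np.arange(4)
    fractions = terms / terms.sum()
    print(f"[L] = {L:.0e} M:", " ".join(f"{f:.2f}" for f in fractions))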

  4. Political economy models and agricultural policy formation : empirical applicability and relevance for the CAP

    NARCIS (Netherlands)

    Zee, van der F.A.

    1997-01-01

    This study explores the relevance and applicability of political economy models for the explanation of agricultural policies. Part I (chapters 4-7) takes a general perspective and evaluates the empirical applicability of voting models and interest group models to agricultural policy formation in industrialised market economies.

  5. Modeling the Structure and Complexity of Engineering Routine Design Problems

    NARCIS (Netherlands)

    Jauregui Becker, Juan Manuel; Wits, Wessel Willems; van Houten, Frederikus J.A.M.

    2011-01-01

    This paper proposes a model for structuring routine design problems, as well as a model of their design complexity. The idea is that having a proper model of the structure of such problems enables understanding of their complexity and, likewise, a proper understanding of their complexity enables the development

  6. Documentation Driven Development for Complex Real-Time Systems

    Science.gov (United States)

    2004-12-01

    This paper presents a novel approach for the development of complex real-time systems, called the documentation-driven development (DDD) approach. … DDD will also support automated software generation based on a computational model and some relevant techniques. DDD includes two main … stakeholders to be easily involved in development processes and, therefore, significantly improve the agility of software development for complex real-time systems.

  7. ASYMMETRIC PRICE TRANSMISSION MODELING: THE IMPORTANCE OF MODEL COMPLEXITY AND THE PERFORMANCE OF THE SELECTION CRITERIA

    Directory of Open Access Journals (Sweden)

    Henry de-Graft Acquah

    2013-01-01

    Information criteria provide an attractive basis for selecting the best model from a set of competing asymmetric price transmission models or theories. However, little is understood about the sensitivity of these model selection methods to model complexity. This study therefore fits competing asymmetric price transmission models that differ in complexity to simulated data and evaluates the ability of the model selection methods to recover the true model. The results of Monte Carlo experimentation suggest that, in general, BIC, CAIC and DIC were superior to AIC when the true data-generating process was the standard error correction model, whereas AIC was more successful when the true model was the complex error correction model. It is also shown that the model selection methods performed better in large samples for a complex asymmetric data-generating process than for a standard asymmetric data-generating process. Except for complex models, AIC's recovery rates did not improve substantially as sample size increased. The research findings demonstrate the influence of model complexity on asymmetric price transmission model comparison and selection.
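
    The comparison described above reduces to computing information criteria from each fitted model's residual sum of squares and counting how often the true data-generating model is recovered. A schematic version follows; two nested linear models stand in for the paper's asymmetric error-correction specifications, which are not reproduced here:

```python
import numpy as np

def fit_rss(X, y):
    """Ordinary least squares; return the residual sum of squares."""
    beta, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return float(resid @ resid)

def aic(rss, n, k):   # Akaike information criterion (Gaussian errors)
    return n * np.log(rss / n) + 2 * k

def bic(rss, n, k):   # Bayesian information criterion
    return n * np.log(rss / n) + k * np.log(n)

rng = np.random.default_rng(0)
n, trials = 200, 500
wins = {"AIC": 0, "BIC": 0}
for _ in range(trials):
    x = rng.normal(size=n)
    y = 1.0 + 0.5 * x + rng.normal(size=n)   # true process is the simple model
    X_simple = np.column_stack([np.ones(n), x])
    X_complex = np.column_stack([np.ones(n), x, x**2, x**3])  # over-specified
    rss_s, rss_c = fit_rss(X_simple, y), fit_rss(X_complex, y)
    wins["AIC"] += aic(rss_s, n, 2) < aic(rss_c, n, 4)
    wins["BIC"] += bic(rss_s, n, 2) < bic(rss_c, n, 4)

# BIC's heavier complexity penalty recovers the simple truth more often,
# mirroring the qualitative pattern reported above.
print({c: w / trials for c, w in wins.items()})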

  8. Modelling of information processes management of educational complex

    Directory of Open Access Journals (Sweden)

    Оксана Николаевна Ромашкова

    2014-12-01

    This work concerns the information model of an educational complex comprising several schools. A classification of the educational complexes formed in Moscow is given. The existing organizational structure of the educational complex is examined and a matrix management structure is proposed. The basic management information processes of the educational complex are conceptualized.

  9. Complex networks under dynamic repair model

    Science.gov (United States)

    Chaoqi, Fu; Ying, Wang; Kun, Zhao; Yangjun, Gao

    2018-01-01

    Invulnerability is not the only factor of importance when considering complex networks' security. It is also critical to have an effective and reasonable repair strategy. Existing research on network repair is confined to the static model. The dynamic model makes better use of the redundant capacity of repaired nodes and repairs the damaged network more efficiently than the static model; however, the dynamic repair model is complex and polytropic. In this paper, we construct a dynamic repair model and systematically describe the energy-transfer relationships between nodes in the repair process of the failure network. Nodes are divided into three types, corresponding to three structures. We find that the strong coupling structure is responsible for secondary failure of the repaired nodes and propose an algorithm that can select the most suitable targets (nodes or links) to repair the failure network with minimal cost. Two types of repair strategies are identified, with different effects under the two energy-transfer rules. The research results enable a more flexible approach to network repair.

  10. Predictive modelling of complex agronomic and biological systems.

    Science.gov (United States)

    Keurentjes, Joost J B; Molenaar, Jaap; Zwaan, Bas J

    2013-09-01

    Biological systems are tremendously complex in their functioning and regulation. Studying the multifaceted behaviour and describing the performance of such complexity has challenged the scientific community for years. The reduction of real-world intricacy into simple descriptive models has therefore convinced many researchers of the usefulness of introducing mathematics into the biological sciences. Predictive modelling takes such an approach another step further in that it takes advantage of existing knowledge to project the performance of a system under alternative scenarios. The ever-growing amounts of available data generated by assessing biological systems at increasingly high detail provide unique opportunities for future modelling and experiment design. Here we aim to provide an overview of the progress made in modelling over time and the currently prevalent approaches for iterative modelling cycles in modern biology. We will further argue for the importance of versatility in modelling approaches, including parameter estimation, model reduction and network reconstruction. Finally, we will discuss the difficulties in overcoming the mathematical interpretation of in vivo complexity and address some of the future challenges lying ahead. © 2013 John Wiley & Sons Ltd.

  11. Does model performance improve with complexity? A case study with three hydrological models

    Science.gov (United States)

    Orth, Rene; Staudinger, Maria; Seneviratne, Sonia I.; Seibert, Jan; Zappa, Massimiliano

    2015-04-01

    In recent decades considerable progress has been made in climate model development. Following the massive increase in computational power, models became more sophisticated. At the same time, simple conceptual models have also advanced. In this study we validate and compare three hydrological models of different complexity to investigate whether their performance varies accordingly. For this purpose we use runoff and also soil moisture measurements, which allow a truly independent validation, from several sites across Switzerland. The models are calibrated in similar ways with the same runoff data. Our results show that the more complex models HBV and PREVAH outperform the simple water balance model (SWBM) for runoff but not for soil moisture. Furthermore, the most sophisticated model, PREVAH, shows added value compared with the HBV model only for soil moisture. Focusing on extreme events, we find generally improved performance of the SWBM during drought conditions and degraded agreement with observations during wet extremes. For the more complex models we find the opposite behavior, probably because they were primarily developed for the prediction of runoff extremes. As expected given their complexity, HBV and PREVAH have more problems with over-fitting. All models show a tendency towards better performance at lower altitudes as opposed to (pre-)alpine sites. The results vary considerably across the investigated sites. In contrast, the different metrics we consider to estimate the agreement between models and observations lead to similar conclusions, indicating that the performance of the considered models is similar at different time scales as well as for anomalies and long-term means. We conclude that added complexity does not necessarily lead to improved performance of hydrological models, and that performance can vary greatly depending on the considered hydrological variable (e.g. runoff vs. soil moisture) or hydrological conditions (floods vs. droughts).
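
    The simple end of the complexity spectrum can be illustrated with a one-bucket water balance scheme: storage is updated with precipitation, moisture-limited evaporation and saturation-dependent runoff. This is a generic sketch, not the SWBM configuration used in the study; the parameter values and forcing are invented:

```python
import numpy as np

def simple_water_balance(precip, pet, capacity=400.0, gamma=2.0):
    """One-bucket conceptual model returning daily runoff (mm/day).

    precip, pet -- daily precipitation and potential ET (mm/day)
    capacity    -- bucket water-holding capacity (mm); invented value
    gamma       -- runoff nonlinearity exponent; invented value
    """
    storage = capacity / 2.0
    runoff = np.zeros_like(precip)
    for t in range(len(precip)):
        wetness = storage / capacity
        runoff[t] = precip[t] * wetness ** gamma   # saturation-dependent runoff
        et = pet[t] * wetness                      # moisture-limited evaporation
        storage = min(max(storage + precip[t] - runoff[t] - et, 0.0), capacity)
    return runoff

# two years of synthetic forcing
rng = np.random.default_rng(1)
p = rng.gamma(shape=0.4, scale=8.0, size=730)   # intermittent rainfall, mm/day
e = np.full(730, 2.0)                           # constant potential ET, mm/day
q = simple_water_balance(p, e)
print(f"mean runoff {q.mean():.2f} mm/day from {p.mean():.2f} mm/day rainfall")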

  12. Computational Modeling of Complex Protein Activity Networks

    NARCIS (Netherlands)

    Schivo, Stefano; Leijten, Jeroen; Karperien, Marcel; Post, Janine N.; Prignet, Claude

    2017-01-01

    Because of the numerous entities interacting, the complexity of the networks that regulate cell fate makes it impossible to analyze and understand them using the human brain alone. Computational modeling is a powerful method to unravel complex systems. We recently described the development of a

  13. Different Epidemic Models on Complex Networks

    International Nuclear Information System (INIS)

    Zhang Haifeng; Small, Michael; Fu Xinchu

    2009-01-01

    Models of disease spreading are not limited to SIS or SIR. For instance, for the spreading of AIDS/HIV, susceptible individuals can be classified into different cases according to their immunity and, similarly, infected individuals can be sorted into different classes according to their infectivity. Moreover, some diseases may develop through several stages. Many authors have shown that the relations between individuals can be viewed as a complex network. So in this paper, in order to better explain the dynamical behavior of epidemics, we consider different epidemic models on complex networks and obtain the epidemic threshold for each case. Finally, we present numerical simulations for each case to verify our results.
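
    For SIS dynamics on an uncorrelated network, the standard mean-field result places the epidemic threshold at λ_c = ⟨k⟩/⟨k²⟩, which can be evaluated for any degree sequence. The sketch below uses that textbook formula, not the specific multi-stage models of the paper, and the degree distributions are invented:

```python
import numpy as np

def sis_threshold(degrees):
    """Mean-field SIS epidemic threshold lambda_c = <k> / <k^2>
    for an uncorrelated network with the given degree sequence."""
    k = np.asarray(degrees, dtype=float)
    return k.mean() / (k ** 2).mean()

rng = np.random.default_rng(0)
# fairly homogeneous network: Poisson degree distribution
poisson_k = rng.poisson(lam=8, size=100_000)
# heterogeneous network: heavy-tailed (approximately power-law) degrees
powerlaw_k = np.round((rng.pareto(a=1.5, size=100_000) + 1.0) * 3.0)

print("Poisson threshold:  ", sis_threshold(poisson_k))
print("power-law threshold:", sis_threshold(powerlaw_k))  # much smaller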

  14. Microscale Synthesis, Reactions, and ¹H NMR Spectroscopic Investigations of Square Planar Macrocyclic, Tetramido-N Co(III) Complexes Relevant to Green Chemistry

    Science.gov (United States)

    Watson, Tanya T.; Uffelman, Erich S.; Lee, Daniel W., III; Doherty, Jonathan R.; Schulze, Carl; Burke, Amy L.; Bonnema, Kristen R.

    2004-01-01

    The microscale preparation, characterization, and reactivity of a square planar Co(III) complex that has grown out of a program to introduce experiments of relevance to green chemistry into the undergraduate curriculum are presented. The given experiments illustrate the remarkable redox and aqueous acid-base stability that make the macrocycles very…

  15. Equation of state experiments and theory relevant to planetary modelling

    International Nuclear Information System (INIS)

    Ross, M.; Graboske, H.C. Jr.; Nellis, W.J.

    1981-01-01

    In recent years there have been a number of static and shockwave experiments on the properties of planetary materials. The highest pressure measurements, and the ones most relevant to planetary modelling, have been obtained by shock compression. Of particular interest to the Jovian group are results for H₂, H₂O, CH₄ and NH₃. Although the properties of metallic hydrogen have not been measured, they have been the subject of extensive calculations. In addition, recent shock wave experiments on iron are reported to have detected melting under Earth core conditions. From this data, theoretical models have been developed for computing the equations of state of materials used in planetary studies. A compelling feature that has followed from the use of improved material properties is a simplification in the planetary models. (author)

  16. Data-assisted reduced-order modeling of extreme events in complex dynamical systems.

    Science.gov (United States)

    Wan, Zhong Yi; Vlachas, Pantelis; Koumoutsakos, Petros; Sapsis, Themistoklis

    2018-01-01

    The prediction of extreme events, from avalanches and droughts to tsunamis and epidemics, depends on the formulation and analysis of relevant, complex dynamical systems. Such dynamical systems are characterized by high intrinsic dimensionality, with extreme events having the form of rare transitions that are several standard deviations away from the mean. Such systems are not amenable to classical order-reduction methods through projection of the governing equations, due to the large intrinsic dimensionality of the underlying attractor as well as the complexity of the transient events. Alternatively, data-driven techniques aim to quantify the dynamics of specific, critical modes by utilizing data-streams and by expanding the dimensionality of the reduced-order model using delayed coordinates. In turn, these methods have major limitations in regions of the phase space with sparse data, which is the case for extreme events. In this work, we develop a novel hybrid framework that complements an imperfect reduced-order model with data-streams that are integrated through a recurrent neural network (RNN) architecture. The reduced-order model has the form of equations projected onto a low-dimensional subspace that still contains important dynamical information about the system, and it is expanded by a long short-term memory (LSTM) regularization. The LSTM-RNN is trained by analyzing the mismatch between the imperfect model and the data-streams, projected onto the reduced-order space. The data-driven model assists the imperfect model in regions where data are available, while for locations where data are sparse the imperfect model still provides a baseline for the prediction of the system state. We assess the developed framework on two challenging prototype systems exhibiting extreme events. We show that the blended approach has improved performance compared with methods that use either data streams or the imperfect model alone. Notably the improvement is more significant in
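
    The hybrid logic (an imperfect reduced-order step corrected by a recurrent network trained on the model-data mismatch) can be sketched compactly. This is a schematic reconstruction, not the authors' architecture: the imperfect model, the data and the network sizes below are all placeholders.

```python
import torch
import torch.nn as nn

class MismatchLSTM(nn.Module):
    """LSTM that learns the residual between an imperfect reduced-order
    model (ROM) step and the observed next state."""
    def __init__(self, dim, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, dim)

    def forward(self, x):              # x: (batch, time, dim)
        h, _ = self.lstm(x)
        return self.head(h[:, -1])     # predicted residual at the next step

def rom_step(x):                       # placeholder "imperfect model":
    return 0.95 * x                    # an over-damped linear map

dim, T = 3, 64
net = MismatchLSTM(dim)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
history = torch.randn(16, T, dim)      # stand-in for observed trajectories
target_next = torch.randn(16, dim)     # stand-in for observed next states

for _ in range(200):                   # train on the ROM/data mismatch
    residual = target_next - rom_step(history[:, -1])
    loss = nn.functional.mse_loss(net(history), residual)
    opt.zero_grad(); loss.backward(); opt.step()

# blended forecast: ROM baseline plus the learned data-driven correction
x_next = rom_step(history[:, -1]) + net(history)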

  17. Data-assisted reduced-order modeling of extreme events in complex dynamical systems.

    Directory of Open Access Journals (Sweden)

    Zhong Yi Wan

    The prediction of extreme events, from avalanches and droughts to tsunamis and epidemics, depends on the formulation and analysis of relevant, complex dynamical systems. Such dynamical systems are characterized by high intrinsic dimensionality, with extreme events having the form of rare transitions that are several standard deviations away from the mean. Such systems are not amenable to classical order-reduction methods through projection of the governing equations due to the large intrinsic dimensionality of the underlying attractor as well as the complexity of the transient events. Alternatively, data-driven techniques aim to quantify the dynamics of specific, critical modes by utilizing data-streams and by expanding the dimensionality of the reduced-order model using delayed coordinates. In turn, these methods have major limitations in regions of the phase space with sparse data, which is the case for extreme events. In this work, we develop a novel hybrid framework that complements an imperfect reduced-order model with data-streams that are integrated through a recurrent neural network (RNN) architecture. The reduced-order model has the form of equations projected onto a low-dimensional subspace that still contains important dynamical information about the system, and it is expanded by a long short-term memory (LSTM) regularization. The LSTM-RNN is trained by analyzing the mismatch between the imperfect model and the data-streams, projected onto the reduced-order space. The data-driven model assists the imperfect model in regions where data are available, while for locations where data are sparse the imperfect model still provides a baseline for the prediction of the system state. We assess the developed framework on two challenging prototype systems exhibiting extreme events. We show that the blended approach has improved performance compared with methods that use either data streams or the imperfect model alone. Notably the improvement is more

  18. Relevant Criteria for Testing the Quality of Models for Turbulent Wind Speed Fluctuations

    DEFF Research Database (Denmark)

    Frandsen, Sten Tronæs; Ejsing Jørgensen, Hans; Sørensen, John Dalsgaard

    2008-01-01

    Seeking relevant criteria for testing the quality of turbulence models, the scale of turbulence and the gust factor have been estimated from data and compared with predictions from first-order models of these two quantities. It is found that the mean of the measured length scales is approximately 10% smaller than the IEC model for wind turbine hub height levels. The mean is only marginally dependent on trends in time series. It is also found that the coefficient of variation of the measured length scales is about 50%. 3 s and 10 s preaveraging of wind speed data are relevant for megawatt-size wind turbines when seeking wind characteristics that correspond to one blade and the entire rotor, respectively. For heights exceeding 50-60 m, the gust factor increases with wind speed. For heights larger than 60-80 m, present assumptions on the value of the gust factor are significantly...
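
    Both quantities tested above are straightforward to compute from a high-frequency record: block-average the raw series over 3 s or 10 s, then take the ratio of the maximum preaveraged speed to the 10-min mean to obtain the gust factor. A sketch follows; the sampling rate and the synthetic "turbulence" (white noise around a mean) are invented stand-ins:

```python
import numpy as np

def gust_factor(u, fs, window_s=3.0, period_s=600.0):
    """Ratio of the maximum window_s-preaveraged speed within one
    period_s block to the block-mean speed."""
    n_win = int(window_s * fs)
    u = u[: int(period_s * fs)]
    u_avg = np.convolve(u, np.ones(n_win) / n_win, mode="valid")
    return u_avg.max() / u.mean()

fs = 10.0                                           # Hz; invented sampling rate
rng = np.random.default_rng(2)
u = 10.0 + 1.5 * rng.normal(size=int(600 * fs))     # crude turbulence stand-in

print(f"3 s gust factor:  {gust_factor(u, fs, 3.0):.2f}")
print(f"10 s gust factor: {gust_factor(u, fs, 10.0):.2f}")  # smoother, smaller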

  19. Élaboration d'entrepôts de données complexes

    OpenAIRE

    Teste, Olivier

    2010-01-01

    In this paper, we study the data warehouse modelling used in decision support systems. We provide an object-oriented data warehouse model allowing data warehouse description as a central repository of relevant, complex and temporal data. Our model integrates three concepts: warehouse object, environment and warehouse class. Each warehouse object is composed of one current state, several past states (modelling its detailed evolutions) and several archive states (modelling its evolutions...

  20. Complex-plane strategy for computing rotating polytropic models - efficiency and accuracy of the complex first-order perturbation theory

    International Nuclear Information System (INIS)

    Geroyannis, V.S.

    1988-01-01

    In this paper, a numerical method is developed for determining the structure distortion of a polytropic star which rotates either uniformly or differentially. This method carries out the required numerical integrations in the complex plane. The method is implemented to compute indicative quantities, such as the critical perturbation parameter which represents an upper limit on the rotational behavior of the star. From such indicative results, it is inferred that this method achieves impressive improvement over other relevant methods; most importantly, it is comparable to some of the most elaborate and accurate techniques on the subject. It is also shown that the use of this method with Chandrasekhar's first-order perturbation theory yields an immediate, drastic improvement of the results. Thus, there is no need, for most applications concerning rotating polytropic models, to proceed to the further use of the method with higher order techniques, unless the maximum accuracy of the method is required. 31 references

  1. Smart modeling and simulation for complex systems practice and theory

    CERN Document Server

    Ren, Fenghui; Zhang, Minjie; Ito, Takayuki; Tang, Xijin

    2015-01-01

    This book aims to describe new Artificial Intelligence technologies and approaches for the modeling and simulation of complex systems, as well as to give an overview of the latest scientific efforts in this field, such as platforms and/or software tools for the smart modeling and simulation of complex systems. These tasks are difficult to accomplish using traditional computational approaches due to the complex relationships between components, the distributed nature of resources, and dynamic work environments. In order to model complex systems effectively, intelligent technologies such as multi-agent systems and smart grids are employed to model and simulate complex systems in areas such as ecosystems, social and economic organizations, web-based grid services, transportation systems, power systems and evacuation systems.

  2. Evolution of disorder in Mediator complex and its functional relevance.

    Science.gov (United States)

    Nagulapalli, Malini; Maji, Sourobh; Dwivedi, Nidhi; Dahiya, Pradeep; Thakur, Jitendra K

    2016-02-29

    Mediator, an important component of the eukaryotic transcriptional machinery, is a huge multisubunit complex. Though the complex is known to be conserved across all the eukaryotic kingdoms, the evolutionary topology of its subunits has never been studied. In this study, we profiled disorder in the Mediator subunits of 146 eukaryotes belonging to three kingdoms, viz. metazoans, plants and fungi, and attempted to find a correlation between the evolution of the Mediator complex and its disorder. Our analysis suggests that disorder in the Mediator complex has played a crucial role in the evolutionary diversification of the complexity of eukaryotic organisms. Conserved intrinsically disordered regions (IDRs) were identified in only six subunits in the three kingdoms, whereas unique patterns of IDRs were identified in the other Mediator subunits. Acquisition of novel molecular recognition features (MoRFs) through the evolution of new subunits or through elongation of existing subunits was evident in metazoans and plants. A new concept of the 'junction-MoRF' has been introduced. An evolutionary link between CBP and Med15 has been provided, which explains the evolution of the extended IDR in CBP from the Med15 KIX-IDR junction-MoRF, suggesting a role of junction-MoRFs in the evolution and modulation of the protein-protein interaction repertoire. This study can be informative and helpful in understanding the conserved and flexible nature of the Mediator complex across eukaryotic kingdoms. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  3. Universal correlators for multi-arc complex matrix models

    International Nuclear Information System (INIS)

    Akemann, G.

    1997-01-01

    The correlation functions of the multi-arc complex matrix model are shown to be universal for any finite number of arcs. The universality classes are characterized by the support of the eigenvalue density and are conjectured to fall into the same classes as the ones recently found for the Hermitian model. This is explicitly shown to be true for the case of two arcs, apart from the known result for one arc. The basic tool is the iterative solution of the loop equation for the complex matrix model with multiple arcs, which provides all multi-loop correlators up to an arbitrary genus. Explicit results for genus one are given for any number of arcs. The two-arc solution is investigated in detail, including the double-scaling limit. In addition universal expressions for the string susceptibility are given for both the complex and Hermitian model. (orig.)

  4. A Practical Philosophy of Complex Climate Modelling

    Science.gov (United States)

    Schmidt, Gavin A.; Sherwood, Steven

    2014-01-01

    We give an overview of the practice of developing and using complex climate models, as seen from experiences in a major climate modelling center and through participation in the Coupled Model Intercomparison Project (CMIP). We discuss the construction and calibration of models; their evaluation, especially through use of out-of-sample tests; and their exploitation in multi-model ensembles to identify biases and make predictions. We stress that the adequacy or utility of climate models is best assessed via their skill against more naive predictions. The framework we use for making inferences about reality using simulations is naturally Bayesian (in an informal sense), and has many points of contact with more familiar examples of scientific epistemology. While the use of complex simulations in science is a development that changes much in how science is done in practice, we argue that the concepts being applied fit very much into traditional practices of the scientific method, albeit those more often associated with laboratory work.

  5. Understanding complex urban systems multidisciplinary approaches to modeling

    CERN Document Server

    Gurr, Jens; Schmidt, J

    2014-01-01

    Understanding Complex Urban Systems takes as its point of departure the insight that the challenges of global urbanization and the complexity of urban systems cannot be understood – let alone ‘managed’ – by sectoral and disciplinary approaches alone. But while there has recently been significant progress in broadening and refining the methodologies for the quantitative modeling of complex urban systems, in deepening the theoretical understanding of cities as complex systems, or in illuminating the implications for urban planning, there is still a lack of well-founded conceptual thinking on the methodological foundations and the strategies of modeling urban complexity across the disciplines. Bringing together experts from the fields of urban and spatial planning, ecology, urban geography, real estate analysis, organizational cybernetics, stochastic optimization, and literary studies, as well as specialists in various systems approaches and in transdisciplinary methodologies of urban analysis, the volum...

  6. A complex autoregressive model and application to monthly temperature forecasts

    Directory of Open Access Journals (Sweden)

    X. Gu

    2005-11-01

    A complex autoregressive model was established based on a mathematical derivation of least squares in the complex number domain, referred to as complex least squares. The model differs from the conventional approach in which the real and imaginary parts are calculated separately. An application of this new model shows better forecasts than those from other conventional statistical models in predicting monthly temperature anomalies in July at 160 meteorological stations in mainland China. The conventional statistical models include an autoregressive model in which the real and imaginary parts are treated separately, an autoregressive model in the real number domain, and a persistence-forecast model.
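
    The key move, solving the least-squares problem directly in the complex domain rather than fitting real and imaginary parts separately, maps onto a standard complex least-squares solve. A minimal sketch with synthetic data follows; the AR order and coefficients are invented:

```python
import numpy as np

def fit_complex_ar(z, p):
    """Fit z_t = a_1 z_(t-1) + ... + a_p z_(t-p) by complex least squares."""
    X = np.array([z[t - p:t][::-1] for t in range(p, len(z))])  # complex design
    y = z[p:]
    a, *_ = np.linalg.lstsq(X, y, rcond=None)   # solved in the complex domain
    return a

rng = np.random.default_rng(0)
true_a = np.array([0.6 + 0.3j, -0.2 + 0.1j])    # invented stable AR(2) coefficients
z = np.zeros(500, dtype=complex)
for t in range(2, 500):
    noise = rng.normal(scale=0.1) + 1j * rng.normal(scale=0.1)
    z[t] = true_a[0] * z[t - 1] + true_a[1] * z[t - 2] + noise

a_hat = fit_complex_ar(z, p=2)
print("estimated coefficients:", np.round(a_hat, 3))  # close to true_a
forecast = a_hat @ z[-1:-3:-1]                        # complex one-step forecast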

  7. Reassessing Geophysical Models of the Bushveld Complex in 3D

    Science.gov (United States)

    Cole, J.; Webb, S. J.; Finn, C.

    2012-12-01

    Conceptual geophysical models of the Bushveld Igneous Complex show three possible geometries for its mafic component: 1) separate intrusions with vertical feeders for the eastern and western lobes (Cousins, 1959); 2) separate dipping sheets for the two lobes (Du Plessis and Kleywegt, 1987); 3) a single saucer-shaped unit connected at depth in the central part between the two lobes (Cawthorn et al., 1998). Model three incorporates isostatic adjustment of the crust in response to the weight of the dense mafic material. The model was corroborated by results of a broadband seismic array over southern Africa, known as the Southern African Seismic Experiment (SASE) (Nguuri et al., 2001; Webb et al., 2004). This new information about the crustal thickness only became available in the last decade and could not be considered in the earlier models. Nevertheless, there is still on-going debate as to which model is correct. All of the models published up to now have been done in 2 or 2.5 dimensions. This is not well suited to modelling the complex geometry of the Bushveld intrusion. 3D modelling takes into account the effects of variations in geometry and geophysical properties of lithologies in a full three-dimensional sense and therefore affects the shape and amplitude of the calculated fields. The main question is how the new knowledge of the increased crustal thickness, as well as the complexity of the Bushveld Complex, will impact the gravity fields calculated for the existing conceptual models when modelling in 3D. The three published geophysical models were remodelled using full 3D potential-field modelling software, including the crustal thickness obtained from the SASE. The aim was not to construct very detailed models, but to test the existing conceptual models in an equally conceptual way. Firstly, a specific 2D model was recreated in 3D, without crustal thickening, to establish the difference between 2D and 3D results. Then the thicker crust was added. Including the less

  8. Geometric Modelling with α-Complexes

    NARCIS (Netherlands)

    Gerritsen, B.H.M.; Werff, K. van der; Veltkamp, R.C.

    2001-01-01

    The shape of real objects can be so complicated that only a sampled set of data points can accurately represent them. Analytic descriptions are too complicated or impossible. Natural objects, for example, can be vague and rough, with many holes. For this kind of modelling, α-complexes offer advantages

  9. Neptunium (V) Adsorption to a Halophilic Bacterium Under High Ionic Strength Conditions: A Surface Complexation Modeling Approach

    Energy Technology Data Exchange (ETDEWEB)

    Ams, David A [Los Alamos National Laboratory

    2012-06-11

    Rationale for experimental design: Np(V) -- important as an analog for Pu(V) and for HLW scenarios; high ionic strength -- relevant to salt-based repositories such as the WIPP; halophilic microorganisms -- representative of high ionic strength environments. For the first time, this work showed: significant adsorption of Np(V) to halophilic microorganisms over the entire pH range under high ionic strength conditions; a strong influence of ionic strength, with adsorption increasing with increasing ionic strength (in contrast to the trends of previous low ionic strength studies); an effect of aqueous Np(V) and bacterial surface site speciation on adsorption; and the development of thermodynamic models that can be incorporated into geochemical speciation models to aid in the prediction of the fate and transport of Np(V) in more complex systems.

  10. Bim Automation: Advanced Modeling Generative Process for Complex Structures

    Science.gov (United States)

    Banfi, F.; Fai, S.; Brumana, R.

    2017-08-01

    The new paradigm of the complexity of modern and historic structures, which are characterised by complex forms and morphological and typological variables, is one of the greatest challenges for building information modelling (BIM). Generation of complex parametric models needs new scientific knowledge concerning new digital technologies. These elements are helpful for storing a vast quantity of information during the life cycle of buildings (LCB). The latest developments of parametric applications do not provide advanced tools, resulting in time-consuming work for the generation of models. This paper presents a method capable of processing and creating complex parametric building information models (BIM) with non-uniform rational basis splines (NURBS) and multiple levels of detail (Mixed and ReverseLoD), based on accurate 3D photogrammetric and laser scanning surveys. Complex 3D elements are converted into parametric BIM software and finite element applications (BIM to FEA) using specific exchange formats and new modelling tools. The proposed approach has been applied to different case studies: the BIM of a modern structure for the courtyard of the West Block on Parliament Hill in Ottawa (Ontario) and the BIM of Masegra Castel in Sondrio (Italy), encouraging the dissemination and interaction of scientific results without losing information during the generative process.

  11. Hierarchical Bayesian nonparametric mixture models for clustering with variable relevance determination.

    Science.gov (United States)

    Yau, Christopher; Holmes, Chris

    2011-07-01

    We propose a hierarchical Bayesian nonparametric mixture model for clustering when some of the covariates are assumed to be of varying relevance to the clustering problem. This can be thought of as an issue in variable selection for unsupervised learning. We demonstrate that by defining a hierarchical population based nonparametric prior on the cluster locations scaled by the inverse covariance matrices of the likelihood we arrive at a 'sparsity prior' representation which admits a conditionally conjugate prior. This allows us to perform full Gibbs sampling to obtain posterior distributions over parameters of interest including an explicit measure of each covariate's relevance and a distribution over the number of potential clusters present in the data. This also allows for individual cluster specific variable selection. We demonstrate improved inference on a number of canonical problems.

  12. Coping with Complexity Model Reduction and Data Analysis

    CERN Document Server

    Gorban, Alexander N

    2011-01-01

    This volume contains the extended version of selected talks given at the international research workshop 'Coping with Complexity: Model Reduction and Data Analysis', Ambleside, UK, August 31 - September 4, 2009. This book is deliberately broad in scope and aims at promoting new ideas and methodological perspectives. The topics of the chapters range from theoretical analysis of complex and multiscale mathematical models to applications in e.g., fluid dynamics and chemical kinetics.

  13. Using advanced surface complexation models for modelling soil chemistry under forests: Solling forest, Germany

    Energy Technology Data Exchange (ETDEWEB)

    Bonten, Luc T.C., E-mail: luc.bonten@wur.nl [Alterra-Wageningen UR, Soil Science Centre, P.O. Box 47, 6700 AA Wageningen (Netherlands); Groenenberg, Jan E. [Alterra-Wageningen UR, Soil Science Centre, P.O. Box 47, 6700 AA Wageningen (Netherlands); Meesenburg, Henning [Northwest German Forest Research Station, Abt. Umweltkontrolle, Sachgebiet Intensives Umweltmonitoring, Goettingen (Germany); Vries, Wim de [Alterra-Wageningen UR, Soil Science Centre, P.O. Box 47, 6700 AA Wageningen (Netherlands)

    2011-10-15

    Various dynamic soil chemistry models have been developed to gain insight into impacts of atmospheric deposition of sulphur, nitrogen and other elements on soil and soil solution chemistry. Sorption parameters for anions and cations are generally calibrated for each site, which hampers extrapolation in space and time. On the other hand, recently developed surface complexation models (SCMs) have been successful in predicting ion sorption for static systems using generic parameter sets. This study reports the inclusion of an assemblage of these SCMs in the dynamic soil chemistry model SMARTml and applies this model to a spruce forest site in Solling, Germany. Parameters for SCMs were taken from generic datasets and not calibrated. Nevertheless, modelling results for major elements matched observations well. Further, trace metals were included in the model, also using the existing framework of SCMs. The model predicted sorption for most trace elements well. - Highlights: > Surface complexation models can be well applied in field studies. > Soil chemistry under a forest site is adequately modelled using generic parameters. > The model is easily extended with extra elements within the existing framework. > Surface complexation models can show the linkages between major soil chemistry and trace element behaviour. - Surface complexation models with generic parameters make calibration of sorption superfluous in dynamic modelling of deposition impacts on soil chemistry under nature areas.
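
    At its core, a surface complexation model is a set of mass-action laws for surface sites solved together with the relevant balances. The toy sketch below computes the speciation of a single amphoteric site (SOH₂⁺/SOH/SO⁻) as a function of pH, ignoring the electrostatic correction terms that full SCMs include; the log K values are illustrative, not the generic dataset used in the study:

```python
import numpy as np

# Amphoteric surface site with two intrinsic constants (illustrative values):
#   SOH2+  <->  SOH + H+    (logK1)
#   SOH    <->  SO-  + H+   (logK2)
logK1, logK2 = -5.0, -9.0
pH = np.linspace(3, 11, 9)
H = 10.0 ** (-pH)

soh2 = H / 10.0 ** logK1        # [SOH2+] / [SOH] from mass action
so = 10.0 ** logK2 / H          # [SO-]  / [SOH]
total = soh2 + 1.0 + so         # normalisation over the three site species

for p, a, b, c in zip(pH, soh2 / total, 1.0 / total, so / total):
    print(f"pH {p:4.1f}:  SOH2+ {a:.2f}  SOH {b:.2f}  SO- {c:.2f}")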

  14. Using advanced surface complexation models for modelling soil chemistry under forests: Solling forest, Germany

    International Nuclear Information System (INIS)

    Bonten, Luc T.C.; Groenenberg, Jan E.; Meesenburg, Henning; Vries, Wim de

    2011-01-01

    Various dynamic soil chemistry models have been developed to gain insight into impacts of atmospheric deposition of sulphur, nitrogen and other elements on soil and soil solution chemistry. Sorption parameters for anions and cations are generally calibrated for each site, which hampers extrapolation in space and time. On the other hand, recently developed surface complexation models (SCMs) have been successful in predicting ion sorption for static systems using generic parameter sets. This study reports the inclusion of an assemblage of these SCMs in the dynamic soil chemistry model SMARTml and applies this model to a spruce forest site in Solling, Germany. Parameters for SCMs were taken from generic datasets and not calibrated. Nevertheless, modelling results for major elements matched observations well. Further, trace metals were included in the model, also using the existing framework of SCMs. The model predicted sorption for most trace elements well. - Highlights: → Surface complexation models can be well applied in field studies. → Soil chemistry under a forest site is adequately modelled using generic parameters. → The model is easily extended with extra elements within the existing framework. → Surface complexation models can show the linkages between major soil chemistry and trace element behaviour. - Surface complexation models with generic parameters make calibration of sorption superfluous in dynamic modelling of deposition impacts on soil chemistry under nature areas.

  15. A marketing mix model for a complex and turbulent environment

    Directory of Open Access Journals (Sweden)

    R. B. Mason

    2007-12-01

    Purpose: This paper is based on the proposition that the choice of marketing tactics is determined, or at least significantly influenced, by the nature of the company's external environment. It aims to illustrate the type of marketing mix tactics that are suggested for a complex and turbulent environment when marketing and the environment are viewed through a chaos and complexity theory lens. Design/Methodology/Approach: Since chaos and complexity theories are proposed as a good means of understanding the dynamics of complex and turbulent markets, a comprehensive review and analysis of the literature on the marketing mix and marketing tactics from a chaos and complexity viewpoint was conducted. From this literature review, a marketing mix model was conceptualised. Findings: A marketing mix model considered appropriate for success in complex and turbulent environments was developed. In such environments, the literature suggests that destabilising marketing activities are more effective, whereas stabilising activities are more effective in simple, stable environments. The model therefore proposes predominantly destabilising tactics as appropriate for a complex and turbulent environment such as is currently being experienced in South Africa. Implications: This paper is of benefit to marketers by emphasising a new way to consider the future marketing activities of their companies. How this model can assist marketers and suggestions for research to develop and apply this model are provided. It is hoped that the model suggested will form the basis of empirical research to test its applicability in the turbulent South African environment. Originality/Value: Since businesses and markets are complex adaptive systems, using complexity theory to understand how to cope in complex, turbulent environments is necessary, but has not been widely researched. In fact, most chaos and complexity theory work in marketing has concentrated on marketing strategy, with

  16. Political economy models and agricultural policy formation : empirical applicability and relevance for the CAP

    OpenAIRE

    Zee, van der, F.A.

    1997-01-01

    This study explores the relevance and applicability of political economy models for the explanation of agricultural policies. Part I (chapters 4-7) takes a general perspective and evaluates the empirical applicability of voting models and interest group models to agricultural policy formation in industrialised market economies. Part II (chapters 8-11) focuses on the empirical applicability of political economy models to agricultural policy formation and agricultural policy development…

  17. Bourbaki's structure theory in the problem of complex systems simulation models synthesis and model-oriented programming

    Science.gov (United States)

    Brodsky, Yu. I.

    2015-01-01

    The work is devoted to the application of Bourbaki's structure theory to substantiate the synthesis of simulation models of complex multicomponent systems, where every component may be a complex system itself. Applying Bourbaki's structure theory offers a new approach to the design and computer implementation of simulation models of complex multicomponent systems: model synthesis and model-oriented programming. It differs from the traditional object-oriented approach. The central concept of this new approach, and at the same time the basic building block for the construction of more complex structures, is the model-component. A model-component is endowed with a more elaborate structure than, for example, an object in object-oriented analysis. This structure provides the model-component with independent behavior: the ability to respond in a standard way to standard requests from its internal and external environment. At the same time, the computer implementation of a model-component's behavior is invariant under the integration of model-components into complexes. This allows one, firstly, to construct fractal models of any complexity and, secondly, to implement the computational process for such constructions uniformly, by a single universal program. In addition, the proposed paradigm allows one to exclude imperative programming and to generate computer code with a high degree of parallelism.
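
    The central notion, a model-component whose standard behavioural interface is invariant under composition so that a complex of components is itself a component, is essentially a uniform-protocol composite. A speculative rendering of the idea follows; the class names and the single standard request are invented for illustration:

```python
class ModelComponent:
    """Unit with standard behaviour: it answers a simulation-step request
    the same way whether it is elementary or a complex of components."""
    def step(self, t):
        raise NotImplementedError

class Elementary(ModelComponent):
    def __init__(self, name, rate):
        self.name, self.state, self.rate = name, 0.0, rate
    def step(self, t):
        self.state += self.rate            # trivial internal dynamics

class Complex(ModelComponent):
    """A complex of model-components is itself a model-component, so
    composition nests to any depth (the 'fractal' property)."""
    def __init__(self, parts):
        self.parts = list(parts)
    def step(self, t):
        for part in self.parts:            # one universal driver suffices
            part.step(t)

system = Complex([Elementary("a", 0.1),
                  Complex([Elementary("b", 0.2), Elementary("c", 0.3)])])
for t in range(10):
    system.step(t)                         # same call at every nesting level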

  18. A review of models relevant to road safety.

    Science.gov (United States)

    Hughes, B P; Newstead, S; Anund, A; Shu, C C; Falkmer, T

    2015-01-01

    It is estimated that more than 1.2 million people die worldwide as a result of road traffic crashes and some 50 million are injured per annum. At present, some Western countries' road safety strategies and countermeasures claim to have developed into 'Safe Systems' models to address the effects of road-related crashes. Well-constructed models encourage effective strategies to improve road safety. This review aimed to identify and summarise concise descriptions, or 'models', of safety. The review covers information from a wide variety of fields and contexts, including transport, occupational safety, the food industry, education, construction and health. Information from 2620 candidate references was selected and summarised into 121 examples of different types of models and contents. The language of safety models and systems was found to be inconsistent. Each model provided additional information regarding style, purpose, complexity and diversity. In total, seven types of models were identified. The categorisation of models was done at a high level, with varying detail in each group and without a complete, simple and rational description. The models identified in this review are likely to be adaptable to road safety, and some of them have previously been used. None of systems theory, safety management systems, the risk management approach, or safety culture was commonly or thoroughly applied to road safety. It is concluded that these approaches have the potential to reduce road trauma. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. Polystochastic Models for Complexity

    CERN Document Server

    Iordache, Octavian

    2010-01-01

    This book is devoted to complexity understanding and management, considered as the main source of efficiency and prosperity for the next decades. Divided into six chapters, the book begins with a presentation of basic concepts such as complexity, emergence and closure. The second chapter looks to methods and introduces polystochastic models, the wave equation, possibilities and entropy. The third chapter, focusing on physical and chemical systems, analyzes flow-sheet synthesis, cyclic operations of separation, drug delivery systems and entropy production. Biomimetic systems represent the main objective of the fourth chapter. Case studies refer to bio-inspired calculation methods, to the role of artificial genetic codes, neural networks and neural codes for evolutionary calculus and for evolvable circuits as biomimetic devices. The fifth chapter, taking its inspiration from systems sciences and cognitive sciences, looks to engineering design, case-based reasoning methods, failure analysis, and multi-agent manufacturing...

  20. Culturally relevant model program to prevent and reduce agricultural injuries.

    Science.gov (United States)

    Helitzer, D L; Hathorn, G; Benally, J; Ortega, C

    2014-07-01

    Limited research has explored pesticide injury prevention among American Indian farmers. In a five-year agricultural intervention, a university-community partnership, including the University of New Mexico School of Medicine, New Mexico State University, Shiprock Area Cooperative Extension Service, and Navajo Nation communities, used a culturally relevant model to introduce and maintain safe use of integrated pest management techniques. We applied the Diffusion of Innovations theory and community-based approaches to tailor health promotion strategies for our intervention. In a longitudinal study with repeated measures, we trained six "model farmers" to be crop management experts in pesticide safety, application, and control. Subsequently, these model farmers worked with 120 farm families randomized into two groups: intervention (Group 1) and delayed intervention (Group 2). Measurements included a walk-through analysis, test of knowledge and attitudes, and yield analysis. Both groups demonstrated improvements in pesticide storage behaviors after training. Test scores regarding safety practices improved significantly: from 57.3 to 72.4 for Group 1 and from 52.6 to 76.3 for Group 2. Group 1 maintained their knowledge and safety practices after the intervention. Attitudes about pesticides and communication of viewpoints changed across the study years. With pesticides and fertilizer, the number of corn ears increased by 56.3% and yield (kg m⁻²) of alfalfa increased by 41.2%. The study combined traditional farming practices with culturally relevant approaches and behavior change theory to affect knowledge, safety practices, attitudes, communication channels, and crop yield. Storage behaviors, use of pesticides and safety and application equipment, and safety practice knowledge changed significantly, as did attitudes about social networking, social support, and the compatibility and relative advantage of pesticides for farms.

  1. Complexation and molecular modeling studies of europium(III)-gallic acid-amino acid complexes.

    Science.gov (United States)

    Taha, Mohamed; Khan, Imran; Coutinho, João A P

    2016-04-01

    With many metal-based drugs extensively used today in the treatment of cancer, attention has focused on the development of new coordination compounds with antitumor activity, with europium(III) complexes recently introduced as novel anticancer drugs. The aim of this work is to design new Eu(III) complexes with gallic acid, an antioxidant phenolic compound. Gallic acid was chosen because it shows anticancer activity without harming healthy cells. As an antioxidant, it helps to protect human cells against the oxidative damage implicated in DNA damage, cancer, and accelerated cell aging. In this work, the formation of binary and ternary complexes of Eu(III) with gallic acid as primary ligand and the amino acids alanine, leucine, isoleucine, and tryptophan was studied by glass electrode potentiometry in aqueous solution containing 0.1 M NaNO₃ at (298.2 ± 0.1) K. Their overall stability constants were evaluated and the concentration distributions of the complex species in solution were calculated. The protonation constants of gallic acid and the amino acids were also determined at our experimental conditions and compared with those predicted using the conductor-like screening model for realistic solvation (COSMO-RS). The geometries of the Eu(III)-gallic acid complexes were characterized by density functional theory (DFT). Spectroscopic UV-visible and photoluminescence measurements were carried out to confirm the formation of Eu(III)-gallic acid complexes in aqueous solutions. Copyright © 2016 Elsevier Inc. All rights reserved.

  2. Microbial decomposition of keratin in nature—a new hypothesis of industrial relevance

    DEFF Research Database (Denmark)

    Lange, Lene; Huang, Yuhong; Kamp Busk, Peter

    2016-01-01

    … with the keratinases to loosen the molecular structure, thus giving the enzymes access to their substrate, the protein structure. With such complexity, it is relevant to compare microbial keratin decomposition with the microbial decomposition of well-studied polymers such as cellulose and chitin. Interestingly … enzymatic and boosting factors needed for keratin breakdown have been used to formulate a hypothesis for the mode of action of the LPMOs in keratin decomposition and for a model for degradation of keratin in nature. Testing such hypotheses and models still needs to be done. Even now, the hypothesis can serve…

  3. The relevance of existing health communication models in the email age: An integrative literature review

    Science.gov (United States)

    Fage-Butler, Antoinette Mary; Jensen, Matilde Nisbeth

    2015-01-01

    Email communication is being integrated relatively slowly into doctor–patient communication. Patients have expressed enthusiasm for the medium, while doctors are generally more reluctant. As existing health communication models have characteristically assumed the co-presence of doctor and patient and primarily reflect medical practitioners’ perspectives, their suitability in relation to email communication and patients’ perspectives warrants further investigation. Following a two-step process and using the methodology of the integrative literature review, 29 articles from 2004–2014 are analysed with the aim of investigating the advantages and disadvantages of the medium of email from the patient’s perspective. The findings are compared to the health communication models of biomedicine, patient-centeredness, patient education and patient empowerment to investigate these models’ relevance for doctor–patient email communication. Results show that patients identify numerous advantages with email communication, including improved convenience and access, more detailed informational exchanges, greater reflection opportunities, freedom from the medical gaze and the potential to level out power imbalances, as well as a number of primarily medium-related disadvantages. The findings indicate that email can counteract some of the communicative problems associated with biomedicine and suggest the ongoing relevance of aspects of the models of patient empowerment, patient-centeredness and patient education for email communication.

  4. Controlling complexity: the clinical relevance of mouse complex genetics

    Czech Academy of Sciences Publication Activity Database

    Forejt, Jiří

    2013-01-01

    Roč. 21, č. 11 (2013), s. 1191-1196 ISSN 1018-4813 Institutional support: RVO:68378050 Keywords: Mouse model * Forward genetics * Review Subject RIV: EB - Genetics; Molecular Biology OBOR OECD: Genetics and heredity (medical genetics to be 3) Impact factor: 4.225, year: 2013

  5. Simulation and Analysis of Complex Biological Processes: an Organisation Modelling Perspective

    NARCIS (Netherlands)

    Bosse, T.; Jonker, C.M.; Treur, J.

    2005-01-01

    This paper explores how the dynamics of complex biological processes can be modelled and simulated as an organisation of multiple agents. This modelling perspective identifies organisational structure occurring in complex decentralised processes and handles complexity of the analysis of the dynamics

  6. Exploring the potential relevance of human-specific genes to complex disease

    Directory of Open Access Journals (Sweden)

    Cooper David N

    2011-01-01

    Although human disease genes generally tend to be evolutionarily more ancient than non-disease genes, complex disease genes appear to be represented more frequently than Mendelian disease genes among genes of more recent evolutionary origin. It is therefore proposed that the analysis of human-specific genes might provide new insights into the genetics of complex disease. Cross-comparison with the Human Gene Mutation Database (http://www.hgmd.org) revealed a number of examples of disease-causing and disease-associated mutations in putatively human-specific genes. A sizeable proportion of these were missense polymorphisms associated with complex disease. Since both human-specific genes and genes associated with complex disease have often experienced particularly rapid rates of evolutionary change, either due to weaker purifying selection or positive selection, it is proposed that a significant number of human-specific genes may play a role in complex disease.

  7. Dynamic complexities in a parasitoid-host-parasitoid ecological model

    International Nuclear Information System (INIS)

    Yu Hengguo; Zhao Min; Lv Songjuan; Zhu Lili

    2009-01-01

    Chaotic dynamics have been observed in a wide range of population models. In this study, the complex dynamics in a discrete-time ecological model of parasitoid-host-parasitoid are presented. The model shows that the superiority coefficient not only stabilizes the dynamics, but may strongly destabilize them as well. Many forms of complex dynamics were observed, including pitchfork bifurcation with quasi-periodicity, period-doubling cascade, chaotic crisis, chaotic bands with narrow or wide periodic window, intermittent chaos, and supertransient behavior. Furthermore, computation of the largest Lyapunov exponent demonstrated the chaotic dynamic behavior of the model.

  8. Dynamic complexities in a parasitoid-host-parasitoid ecological model

    Energy Technology Data Exchange (ETDEWEB)

    Yu Hengguo [School of Mathematic and Information Science, Wenzhou University, Wenzhou, Zhejiang 325035 (China); Zhao Min [School of Life and Environmental Science, Wenzhou University, Wenzhou, Zhejiang 325027 (China)], E-mail: zmcn@tom.com; Lv Songjuan; Zhu Lili [School of Mathematic and Information Science, Wenzhou University, Wenzhou, Zhejiang 325035 (China)

    2009-01-15

    Chaotic dynamics have been observed in a wide range of population models. In this study, the complex dynamics in a discrete-time ecological model of parasitoid-host-parasitoid are presented. The model shows that the superiority coefficient not only stabilizes the dynamics, but may strongly destabilize them as well. Many forms of complex dynamics were observed, including pitchfork bifurcation with quasi-periodicity, period-doubling cascade, chaotic crisis, chaotic bands with narrow or wide periodic window, intermittent chaos, and supertransient behavior. Furthermore, computation of the largest Lyapunov exponent demonstrated the chaotic dynamic behavior of the model.
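    As a hedged illustration of the diagnostic used above: the record does not give the model equations, so the sketch below estimates the largest Lyapunov exponent for a generic one-dimensional discrete-time map, with the logistic map standing in for the parasitoid-host-parasitoid system, by averaging log|f'(x)| along a trajectory.

```python
import numpy as np

def largest_lyapunov(f, df, x0, n_transient=1000, n_iter=100000):
    """Estimate the largest Lyapunov exponent of a 1-D map x -> f(x)
    by averaging log|f'(x)| along a long trajectory."""
    x = x0
    for _ in range(n_transient):          # discard transient behavior
        x = f(x)
    acc = 0.0
    for _ in range(n_iter):
        acc += np.log(abs(df(x)))
        x = f(x)
    return acc / n_iter

# Logistic map as a stand-in for the (unspecified) ecological equations
r = 4.0
f = lambda x: r * x * (1.0 - x)
df = lambda x: r * (1.0 - 2.0 * x)
print(largest_lyapunov(f, df, x0=0.2))    # ~ln 2 ≈ 0.693 > 0, i.e. chaotic
```

    For the multi-dimensional ecological map itself one would instead track the growth of a tangent vector under the Jacobian, but the averaging principle is the same.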

  9. Family influences on mania-relevant cognitions and beliefs: a cognitive model of mania and reward.

    Science.gov (United States)

    Chen, Stephen H; Johnson, Sheri L

    2012-07-01

    The present study proposed and tested a cognitive model of mania and reward. Undergraduates (N = 284; 68.4% female; mean age = 20.99 years, standard deviation ± 3.37) completed measures of family goal setting and achievement values, personal reward-related beliefs, cognitive symptoms of mania, and risk for mania. Correlational analyses and structural equation modeling supported two distinct, but related facets of mania-relevant cognition: stably present reward-related beliefs and state-dependent cognitive symptoms in response to success and positive emotion. Results also indicated that family emphasis on achievement and highly ambitious extrinsic goals were associated with these mania-relevant cognitions. Finally, controlling for other factors, cognitive symptoms in response to success and positive emotion were uniquely associated with lifetime propensity towards mania symptoms. Results support the merit of distinguishing between facets of mania-relevant cognition and the importance of the family in shaping both aspects of cognition. © 2012 Wiley Periodicals, Inc.

  10. What do we gain from simplicity versus complexity in species distribution models?

    Science.gov (United States)

    Merow, Cory; Smith, Matthew J.; Edwards, Thomas C.; Guisan, Antoine; McMahon, Sean M.; Normand, Signe; Thuiller, Wilfried; Wuest, Rafael O.; Zimmermann, Niklaus E.; Elith, Jane

    2014-01-01

    Species distribution models (SDMs) are widely used to explain and predict species ranges and environmental niches. They are most commonly constructed by inferring species' occurrence–environment relationships using statistical and machine-learning methods. The variety of methods that can be used to construct SDMs (e.g. generalized linear/additive models, tree-based models, maximum entropy, etc.), and the variety of ways that such models can be implemented, permits substantial flexibility in SDM complexity. Building models with an appropriate amount of complexity for the study objectives is critical for robust inference. We characterize complexity as the shape of the inferred occurrence–environment relationships and the number of parameters used to describe them, and search for insights into whether additional complexity is informative or superfluous. By building ‘under fit’ models, having insufficient flexibility to describe observed occurrence–environment relationships, we risk misunderstanding the factors shaping species distributions. By building ‘over fit’ models, with excessive flexibility, we risk inadvertently ascribing pattern to noise or building opaque models. However, model selection can be challenging, especially when comparing models constructed under different modeling approaches. Here we argue for a more pragmatic approach: researchers should constrain the complexity of their models based on study objective, attributes of the data, and an understanding of how these interact with the underlying biological processes. We discuss guidelines for balancing under fitting with over fitting and consequently how complexity affects decisions made during model building. Although some generalities are possible, our discussion reflects differences in opinions that favor simpler versus more complex models. We conclude that combining insights from both simple and complex SDM building approaches best advances our knowledge of current and future species

  11. Algebraic computability and enumeration models recursion theory and descriptive complexity

    CERN Document Server

    Nourani, Cyrus F

    2016-01-01

    This book, Algebraic Computability and Enumeration Models: Recursion Theory and Descriptive Complexity, presents new techniques with functorial models to address important areas on pure mathematics and computability theory from the algebraic viewpoint. The reader is first introduced to categories and functorial models, with Kleene algebra examples for languages. Functorial models for Peano arithmetic are described toward important computational complexity areas on a Hilbert program, leading to computability with initial models. Infinite language categories are also introduced to explain descriptive complexity with recursive computability with admissible sets and urelements. Algebraic and categorical realizability is staged on several levels, addressing new computability questions with omitting types realizably. Further applications to computing with ultrafilters on sets and Turing degree computability are examined. Functorial models computability is presented with algebraic trees realizing intuitionistic type...

  12. Modeling complex work systems - method meets reality

    NARCIS (Netherlands)

    van der Veer, Gerrit C.; Hoeve, Machteld; Lenting, Bert

    1996-01-01

    Modeling an existing task situation is often a first phase in the (re)design of information systems. For complex systems design, this model should consider both the people and the organization involved, the work, and situational aspects. Groupware Task Analysis (GTA) as part of a method for the

  13. Location Criteria Relevant for Sustainability of Social Housing Model

    Directory of Open Access Journals (Sweden)

    Petković-Grozdanović Nataša

    2016-01-01

    Social housing models, which began to develop during the last century, had as their sole objective the need to overcome the housing problems of socially vulnerable categories. However, numerous studies have shown that these social categories, because of their low social status, are highly susceptible to various psychological and sociological problems. On the other hand, the low level of quality common to social housing dwellings has further aggravated these problems by triggering problem behaviours among tenants and fostering social exclusion and segregation. Contemporary social housing models are therefore conceptualized in a way that provides a positive psycho-sociological impact on their tenants. The planning approach in social housing should therefore: support important functions in daily life routines; promote tolerance and cooperation; foster a sense of social order and belonging; encourage the socialization of tenants and their integration into the wider community; and improve social cohesion. The analysis of the influential location parameters of the immediate and wider social housing environment strives to define those relevant to the life quality of social housing tenants, which therefore influence the sustainability of the social housing model.

  14. Present status on atomic and molecular data relevant to fusion plasma diagnostics and modeling

    International Nuclear Information System (INIS)

    Tawara, H.

    1997-01-01

    This issue is a collection of papers on the present status of atomic and molecular data relevant to fusion plasma diagnostics and modeling. Ten of the presented papers are indexed individually. (J.P.N.)

  15. Fast multi-output relevance vector regression

    OpenAIRE

    Ha, Youngmin

    2017-01-01

    This paper aims to decrease the time complexity of multi-output relevance vector regression from O(VM^3) to O(V^3+M^3), where V is the number of output dimensions, M is the number of basis functions, and V

  16. Intrinsic Uncertainties in Modeling Complex Systems.

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, Curtis S; Bramson, Aaron L.; Ames, Arlo L.

    2014-09-01

    Models are built to understand and predict the behaviors of both natural and artificial systems. Because it is always necessary to abstract away aspects of any non-trivial system being modeled, we know models can potentially leave out important, even critical elements. This reality of the modeling enterprise forces us to consider the prospective impacts of those effects completely left out of a model - either intentionally or unconsidered. Insensitivity to new structure is an indication of diminishing returns. In this work, we represent a hypothetical unknown effect on a validated model as a finite perturbation whose amplitude is constrained within a control region. We find robustly that without further constraints, no meaningful bounds can be placed on the amplitude of a perturbation outside of the control region. Thus, forecasting into unsampled regions is a very risky proposition. We also present inherent difficulties with proper time discretization of models and representing inherently discrete quantities. We point out potentially worrisome uncertainties, arising from mathematical formulation alone, which modelers can inadvertently introduce into models of complex systems. Acknowledgements: This work has been funded under early-career LDRD project #170979, entitled "Quantifying Confidence in Complex Systems Models Having Structural Uncertainties", which ran from 04/2013 to 09/2014. We wish to express our gratitude to the many researchers at Sandia who contributed ideas to this work, as well as feedback on the manuscript. In particular, we would like to mention George Barr, Alexander Outkin, Walt Beyeler, Eric Vugrin, and Laura Swiler for providing invaluable advice and guidance through the course of the project. We would also like to thank Steven Kleban, Amanda Gonzales, Trevor Manzanares, and Sarah Burwell for their assistance in managing project tasks and resources.

  17. Fatigue modeling of materials with complex microstructures

    DEFF Research Database (Denmark)

    Qing, Hai; Mishnaevsky, Leon

    2011-01-01

    with the phenomenological model of fatigue damage growth. As a result, the fatigue lifetime of materials with complex structures can be determined as a function of the parameters of their structures. As an example, the fatigue lifetimes of wood modeled as a cellular material with multilayered, fiber reinforced walls were...

  18. Reaction of cyanide with Pt-nucleobase complexes: preparative, spectroscopic, and structural studies. Unexpected stability of Pt-thymine and Pt-uracil complexes

    International Nuclear Information System (INIS)

    Raudaschl-Sieber, G.; Lippert, B.

    1985-01-01

    In order to improve the understanding of the nature of the strongly bound cisplatin on DNA, the reactivity of a large number of complexes of cis-(NH3)2Pt(II) with the model nucleobases 9-ethylguanine, 9-methyladenine, 1-methylcytosine, 1-methylthymine, and 1-methyluracil toward a large excess of cyanide was studied. The behavior of Pt-nucleobase complexes toward CN- is compared with that of simple Pt-amine complexes, and reactions of thiourea with two selected nucleobase complexes are reported. The relevance of these findings with respect to substitution reactions of Pt-nucleobase complexes and the nature of the tightly DNA-bound Pt, which cannot be removed by excess KCN, is discussed.

  19. Stability of Rotor Systems: A Complex Modelling Approach

    DEFF Research Database (Denmark)

    Kliem, Wolfhard; Pommer, Christian; Stoustrup, Jakob

    1996-01-01

    A large class of rotor systems can be modelled by a complex matrix differential equation of secondorder. The angular velocity of the rotor plays the role of a parameter. We apply the Lyapunov matrix equation in a complex setting and prove two new stability results which are compared...

  20. Understanding complex urban systems integrating multidisciplinary data in urban models

    CERN Document Server

    Gebetsroither-Geringer, Ernst; Atun, Funda; Werner, Liss

    2016-01-01

    This book is devoted to the modeling and understanding of complex urban systems. This second volume of Understanding Complex Urban Systems focuses on the challenges of the modeling tools, concerning, e.g., the quality and quantity of data and the selection of an appropriate modeling approach. It is meant to support urban decision-makers—including municipal politicians, spatial planners, and citizen groups—in choosing an appropriate modeling approach for their particular modeling requirements. The contributors to this volume are from different disciplines, but all share the same goal: optimizing the representation of complex urban systems. They present and discuss a variety of approaches for dealing with data-availability problems and finding appropriate modeling approaches—and not only in terms of computer modeling. The selection of articles featured in this volume reflects a broad variety of new and established modeling approaches such as: - An argument for using Big Data methods in conjunction with Age...

  1. Minimum-complexity helicopter simulation math model

    Science.gov (United States)

    Heffley, Robert K.; Mnich, Marc A.

    1988-01-01

    An example of a minimal-complexity simulation helicopter math model is presented. Motivating factors are the computational delays, cost, and inflexibility of the very sophisticated math models now in common use. A helicopter model form is given which addresses each of these factors and provides better engineering understanding of the specific handling qualities features which are apparent to the simulator pilot. The technical approach begins with specification of features which are to be modeled, followed by a build-up of individual vehicle components and definition of equations. Model matching and estimation procedures are given which enable the modeling of specific helicopters from basic data sources such as flight manuals. Checkout procedures are given which provide for total model validation. A number of possible model extensions and refinements are discussed. Math model computer programs are defined and listed.

  2. Social aggravation: Understanding the complex role of social relationships on stress and health-relevant physiology.

    Science.gov (United States)

    Birmingham, Wendy C; Holt-Lunstad, Julianne

    2018-04-05

    There is a rich literature on social support and physical health, but research has focused primarily on the protective effects of social relationships. The stress-buffering model asserts that relationships may be protective by serving as a source of support when coping with stress, thereby blunting health-relevant physiological responses. Research also indicates that relationships can be a source of stress, likewise influencing health. In other words, the social buffering influence may have a counterpart: a social aggravating influence with an opposing effect. Drawing upon existing conceptual models, we expand these to delineate how social relationships may influence stress processes and ultimately health. This review summarizes the existing literature that points to the potential deleterious physiological effects of our relationships when they are sources of stress or exacerbate stress. Copyright © 2018 Elsevier B.V. All rights reserved.

  3. Elastic Network Model of a Nuclear Transport Complex

    Science.gov (United States)

    Ryan, Patrick; Liu, Wing K.; Lee, Dockjin; Seo, Sangjae; Kim, Young-Jin; Kim, Moon K.

    2010-05-01

    The structure of Kap95p was obtained from the Protein Data Bank (www.pdb.org) and analyzed. RanGTP plays an important role in both nuclear protein import and export cycles. In the nucleus, RanGTP releases macromolecular cargoes from importins and conversely facilitates cargo binding to exportins. Although the crystal structure of the nuclear import complex formed by importin Kap95p and RanGTP was recently identified, its molecular mechanism still remains unclear. To understand the relationship between structure and function of a nuclear transport complex, a structure-based mechanical model of the Kap95p:RanGTP complex is introduced. In this model, a protein structure is simply modeled as an elastic network in which a set of coarse-grained point masses are connected by linear springs representing biochemical interactions at the atomic level. Harmonic normal mode analysis (NMA) and anharmonic elastic network interpolation (ENI) are performed to predict the modes of vibration and a feasible pathway between locked and unlocked conformations of Kap95p, respectively. Simulation results imply that the binding of RanGTP to Kap95p induces the release of the cargo in the nucleus as well as prevents any new cargo from attaching to the Kap95p:RanGTP complex.
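    A minimal sketch of the elastic-network construction this abstract describes, under assumed parameters (an anisotropic network model with a 15 Å cutoff and uniform spring constant): coarse-grained point masses within the cutoff are joined by linear springs, the resulting Hessian is diagonalized, and the six rigid-body zero modes are discarded. A real application would feed in the C-alpha coordinates of the Kap95p:RanGTP structure from the PDB.

```python
import numpy as np

def anm_modes(coords, cutoff=15.0, k=1.0):
    """Harmonic normal modes of a coarse-grained elastic network (ANM).
    coords: (N, 3) positions of the point masses (e.g. C-alpha atoms)."""
    n = len(coords)
    hessian = np.zeros((3 * n, 3 * n))
    for i in range(n):
        for j in range(i + 1, n):
            d = coords[j] - coords[i]
            r2 = d @ d
            if r2 > cutoff ** 2:
                continue                       # no spring beyond the cutoff
            block = -k * np.outer(d, d) / r2   # 3x3 off-diagonal super-element
            hessian[3*i:3*i+3, 3*j:3*j+3] = block
            hessian[3*j:3*j+3, 3*i:3*i+3] = block
            hessian[3*i:3*i+3, 3*i:3*i+3] -= block
            hessian[3*j:3*j+3, 3*j:3*j+3] -= block
    evals, evecs = np.linalg.eigh(hessian)
    return evals[6:], evecs[:, 6:]             # drop six rigid-body modes

coords = np.random.rand(50, 3) * 30.0          # toy structure for illustration
freqs2, modes = anm_modes(coords)
print(freqs2[:5])                              # softest, most collective modes
```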

  4. Sparse Estimation Using Bayesian Hierarchical Prior Modeling for Real and Complex Linear Models

    DEFF Research Database (Denmark)

    Pedersen, Niels Lovmand; Manchón, Carles Navarro; Badiu, Mihai Alin

    2015-01-01

    In sparse Bayesian learning (SBL), Gaussian scale mixtures (GSMs) have been used to model sparsity-inducing priors that realize a class of concave penalty functions for the regression task in real-valued signal models. Motivated by the relative scarcity of formal tools for SBL in complex-valued models, ... error, and robustness in low and medium signal-to-noise ratio regimes.

  5. Complex Networks in Psychological Models

    Science.gov (United States)

    Wedemann, R. S.; Carvalho, L. S. A. V. D.; Donangelo, R.

    We develop schematic, self-organizing, neural-network models to describe mechanisms associated with mental processes, by a neurocomputational substrate. These models are examples of real world complex networks with interesting general topological structures. Considering dopaminergic signal-to-noise neuronal modulation in the central nervous system, we propose neural network models to explain development of cortical map structure and dynamics of memory access, and unify different mental processes into a single neurocomputational substrate. Based on our neural network models, neurotic behavior may be understood as an associative memory process in the brain, and the linguistic, symbolic associative process involved in psychoanalytic working-through can be mapped onto a corresponding process of reconfiguration of the neural network. The models are illustrated through computer simulations, where we varied dopaminergic modulation and observed the self-organizing emergent patterns at the resulting semantic map, interpreting them as different manifestations of mental functioning, from psychotic through to normal and neurotic behavior, and creativity.

  6. Complex scaling in the cluster model

    International Nuclear Information System (INIS)

    Kruppa, A.T.; Lovas, R.G.; Gyarmati, B.

    1987-01-01

    To find the positions and widths of resonances, a complex scaling of the intercluster relative coordinate is introduced into the resonating-group model. In the generator-coordinate technique used to solve the resonating-group equation the complex scaling requires minor changes in the formulae and code. The finding of the resonances does not need any preliminary guess or explicit reference to any asymptotic prescription. The procedure is applied to the resonances in the relative motion of two ground-state α clusters in ⁸Be, but is appropriate for any systems consisting of two clusters. (author) 23 refs.; 5 figs

  7. Predictive-property-ranked variable reduction in partial least squares modelling with final complexity adapted models: comparison of properties for ranking.

    Science.gov (United States)

    Andries, Jan P M; Vander Heyden, Yvan; Buydens, Lutgarde M C

    2013-01-14

    The calibration performance of partial least squares regression for one response (PLS1) can be improved by eliminating uninformative variables. Many variable-reduction methods are based on so-called predictor-variable properties or predictive properties, which are functions of various PLS-model parameters, and which may change during the steps of the variable-reduction process. Recently, a new predictive-property-ranked variable reduction method with final complexity adapted models, denoted as PPRVR-FCAM or simply FCAM, was introduced. It is a backward variable elimination method applied on the predictive-property-ranked variables. The variable number is first reduced, with constant PLS1 model complexity A, until A variables remain, followed by a further decrease in PLS complexity, allowing the final selection of small numbers of variables. In this study for three data sets the utility and effectiveness of six individual and nine combined predictor-variable properties are investigated, when used in the FCAM method. The individual properties include the absolute value of the PLS1 regression coefficient (REG), the significance of the PLS1 regression coefficient (SIG), the norm of the loading weight (NLW) vector, the variable importance in the projection (VIP), the selectivity ratio (SR), and the squared correlation coefficient of a predictor variable with the response y (COR). The selective and predictive performances of the models resulting from the use of these properties are statistically compared using the one-tailed Wilcoxon signed rank test. The results indicate that the models, resulting from variable reduction with the FCAM method, using individual or combined properties, have similar or better predictive abilities than the full spectrum models. After mean-centring of the data, REG and SIG, provide low numbers of informative variables, with a meaning relevant to the response, and lower than the other individual properties, while the predictive abilities are
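    As a hedged sketch of the first FCAM stage summarized above: variables are repeatedly re-ranked by the REG property (absolute PLS1 regression coefficient) and a fraction of the lowest-ranked ones is eliminated while the model complexity A is held constant, until A variables remain. The function name, drop fraction, and cross-validation setup are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

def fcam_first_stage(X, y, A=3, drop_frac=0.5):
    """Backward elimination on REG-ranked variables at constant PLS1
    complexity A (first stage of the FCAM procedure)."""
    keep = np.arange(X.shape[1])
    history = []
    while len(keep) > A:
        pls = PLSRegression(n_components=A).fit(X[:, keep], y)
        reg = np.abs(pls.coef_).ravel()              # REG ranking property
        rmse = -cross_val_score(PLSRegression(n_components=A), X[:, keep], y,
                                cv=5,
                                scoring="neg_root_mean_squared_error").mean()
        history.append((keep.copy(), rmse))
        n_drop = max(1, int(drop_frac * (len(keep) - A)))
        keep = keep[np.argsort(reg)[n_drop:]]        # drop least informative
    return history

# Toy spectra: 40 samples, 200 variables, 3 of them truly informative
rng = np.random.default_rng(0)
X = rng.random((40, 200))
y = X[:, :3] @ np.array([1.0, 2.0, 3.0]) + 0.1 * rng.standard_normal(40)
print(len(fcam_first_stage(X, y)))                   # number of reduction steps
```

    The second FCAM stage, decreasing the PLS complexity below A for the surviving variables, follows the same pattern with n_components reduced step by step.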

  8. Higher genus correlators for the complex matrix model

    International Nuclear Information System (INIS)

    Ambjorn, J.; Kristhansen, C.F.; Makeenko, Y.M.

    1992-01-01

    In this paper, the authors describe an iterative scheme which allows us to calculate any multi-loop correlator for the complex matrix model to any genus using only the first in the chain of loop equations. The method works for a completely general potential and the results contain no explicit reference to the couplings. The genus g contribution to the m-loop correlator depends on a finite number of parameters, namely at most 4g - 2 + m. The authors find the generating functional explicitly up to genus three. The authors show as well that the model is equivalent to an external field problem for the complex matrix model with a logarithmic potential

  9. Assessing Complexity in Learning Outcomes--A Comparison between the SOLO Taxonomy and the Model of Hierarchical Complexity

    Science.gov (United States)

    Stålne, Kristian; Kjellström, Sofia; Utriainen, Jukka

    2016-01-01

    An important aspect of higher education is to educate students who can manage complex relationships and solve complex problems. Teachers need to be able to evaluate course content with regard to complexity, as well as evaluate students' ability to assimilate complex content and express it in the form of a learning outcome. One model for evaluating…

  10. Functional Diversity of AAA+ Protease Complexes in Bacillus subtilis

    Science.gov (United States)

    Elsholz, Alexander K. W.; Birk, Marlene S.; Charpentier, Emmanuelle; Turgay, Kürşad

    2017-01-01

    Here, we review the diverse roles and functions of AAA+ protease complexes in protein homeostasis, control of stress response and cellular development pathways by regulatory and general proteolysis in the Gram-positive model organism Bacillus subtilis. We discuss in detail the intricate involvement of AAA+ protein complexes in controlling sporulation, the heat shock response and the role of adaptor proteins in these processes. The investigation of these protein complexes and their adaptor proteins has revealed their relevance for Gram-positive pathogens and their potential as targets for new antibiotics. PMID:28748186

  11. Functional Diversity of AAA+ Protease Complexes in Bacillus subtilis.

    Science.gov (United States)

    Elsholz, Alexander K W; Birk, Marlene S; Charpentier, Emmanuelle; Turgay, Kürşad

    2017-01-01

    Here, we review the diverse roles and functions of AAA+ protease complexes in protein homeostasis, control of stress response and cellular development pathways by regulatory and general proteolysis in the Gram-positive model organism Bacillus subtilis. We discuss in detail the intricate involvement of AAA+ protein complexes in controlling sporulation, the heat shock response and the role of adaptor proteins in these processes. The investigation of these protein complexes and their adaptor proteins has revealed their relevance for Gram-positive pathogens and their potential as targets for new antibiotics.

  12. Modeling Networks and Dynamics in Complex Systems: from Nano-Composites to Opinion Formation

    Science.gov (United States)

    Shi, Feng

    Complex networks are ubiquitous in systems of physical, biological, social or technological origin. Components in those systems range from as large as cities in power grids, to as small as molecules in metabolic networks. Since the dawn of network science, significant attention has focused on the implications of dynamics in establishing network structure and the impact of structural properties on dynamics on those networks. The first part of the thesis follows this direction, studying the network formed by conductive nanorods in nano-materials, and focuses on the electrical response of the composite to the structure change of the network. New scaling laws for the shear-induced anisotropic percolation are introduced and a robust exponential tail of the current distribution across the network is identified. These results are relevant especially to "active" composite materials where materials are exposed to mechanical loading and strain deformations. However, in many real-world networks the evolution of the network topology is tied to the states of the vertices and vice versa. Networks that exhibit such a feedback are called adaptive or coevolutionary networks. The second part of the thesis examines two closely related variants of a simple, abstract model for coevolution of a network and the opinions of its members. As a representative model for adaptive networks, it displays the feature of self-organization of the system into a stable configuration due to the interplay between the network topology and the dynamics on the network. This simple model yields interesting dynamics and the slight change in the rewiring strategy results in qualitatively different behaviors of the system. In conclusion, the dissertation aims to develop new network models and tools which enable insights into the structure and dynamics of various systems, and seeks to advance network algorithms which provide approaches to coherently articulated questions in real-world complex systems such as
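    To make the adaptive-network idea concrete, below is a minimal sketch of a coevolving opinion model in the spirit of the abstract; the specific rewire-or-adopt rule (Holme-Newman style) and all parameter values are assumptions for illustration, since the record does not spell out the thesis's exact model variants.

```python
import random
import networkx as nx

def coevolving_opinions(n=200, k=4, opinions=10, phi=0.5, steps=20000):
    """Toy adaptive network: with probability phi a discordant edge is
    rewired to a like-minded node, otherwise one endpoint adopts the
    other's opinion, so topology and states evolve together."""
    g = nx.random_regular_graph(k, n)
    state = {v: random.randrange(opinions) for v in g}
    for _ in range(steps):
        u, v = random.choice(list(g.edges()))
        if state[u] == state[v]:
            continue                                  # edge already concordant
        if random.random() < phi:                     # rewire step
            peers = [w for w in g
                     if state[w] == state[u] and w != u and not g.has_edge(u, w)]
            if peers:
                g.remove_edge(u, v)
                g.add_edge(u, random.choice(peers))
        else:                                         # adoption step
            state[v] = state[u]
    return g, state

g, state = coevolving_opinions()
print(nx.number_connected_components(g))   # opinion communities self-organize
```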

  13. Modeling of Complex Life Cycle Prediction Based on Cell Division

    Directory of Open Access Journals (Sweden)

    Fucheng Zhang

    2017-01-01

    Effective fault diagnosis and reasonable life expectancy are of great significance and practical engineering value for the safety, reliability, and maintenance cost of equipment and working environments. At present, equipment life prediction methods include prediction based on condition monitoring, combined forecasting models, and data-driven approaches, most of which require large amounts of data. To address this issue, we propose learning from the mechanism of cell division in organisms. By studying the complex multifactor-correlation life model, we have established a life prediction model of moderate complexity. In this paper, we model life prediction on cell division. Experiments show that our model can effectively simulate the state of cell division. We then apply the model to complex equipment life prediction.

  14. Mathematical Models to Determine Stable Behavior of Complex Systems

    Science.gov (United States)

    Sumin, V. I.; Dushkin, A. V.; Smolentseva, T. E.

    2018-05-01

    The paper analyzes the possibility of predicting the functioning of a complex dynamic system with a significant amount of circulating information and a large number of random factors impacting its functioning. The functioning of the complex dynamic system is described in terms of chaotic states, self-organized criticality, and bifurcation. This problem may be resolved by modeling such systems as dynamic ones, without applying stochastic models and taking strange attractors into account.

  15. Understanding psychiatric disorder by capturing ecologically relevant features of learning and decision-making.

    Science.gov (United States)

    Scholl, Jacqueline; Klein-Flügge, Miriam

    2017-09-28

    Recent research in cognitive neuroscience has begun to uncover the processes underlying increasingly complex voluntary behaviours, including learning and decision-making. This success has been made possible partly by progressing from simple experimental tasks to paradigms that incorporate more ecological features. More specifically, the premise is that to understand cognitions and brain functions relevant for real life, we need to introduce some of the ecological challenges that we have evolved to solve. This often entails an increase in task complexity, which can be managed by using computational models to help parse complex behaviours into specific component mechanisms. Here we propose that using computational models with tasks that capture ecologically relevant learning and decision-making processes may provide a critical advantage for capturing the mechanisms underlying symptoms of disorders in psychiatry. As a result, it may help develop mechanistic approaches towards diagnosis and treatment. We begin this review by mapping out the basic concepts and models of learning and decision-making. We then move on to consider specific challenges that emerge in realistic environments and describe how they can be captured by tasks. These include changes of context, uncertainty, reflexive/emotional biases, cost-benefit decision-making, and balancing exploration and exploitation. Where appropriate we highlight future or current links to psychiatry. We particularly draw examples from research on clinical depression, a disorder that greatly compromises motivated behaviours in real life, but where simpler paradigms have yielded mixed results. Finally, we highlight several paradigms that could be used to help provide new insights into the mechanisms of psychiatric disorders. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
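    As one concrete instance of the "basic concepts and models of learning and decision-making" the review maps out, the sketch below implements a standard Rescorla-Wagner prediction-error update with a softmax choice rule on a two-armed bandit; the learning rate and inverse temperature are illustrative assumptions.

```python
import numpy as np

def rw_softmax_agent(rewards, alpha=0.2, beta=5.0, seed=0):
    """Rescorla-Wagner value learning with softmax choice on a two-armed
    bandit; rewards is a (trials, 2) array of potential payoffs."""
    q = np.zeros(2)                                    # learned action values
    rng = np.random.default_rng(seed)
    choices, loglik = [], 0.0
    for r in rewards:
        p = np.exp(beta * q) / np.exp(beta * q).sum()  # softmax policy
        c = rng.choice(2, p=p)
        q[c] += alpha * (r[c] - q[c])                  # prediction-error update
        choices.append(c)
        loglik += np.log(p[c])
    return np.array(choices), loglik

rewards = np.random.default_rng(1).binomial(1, [0.8, 0.2], size=(200, 2))
choices, loglik = rw_softmax_agent(rewards)
print(1 - choices.mean())    # fraction choosing the better arm (arm 0)
```

    Fitting alpha and beta to observed choices by maximizing loglik is the usual route to the mechanistic, parameter-level group comparisons the review advocates.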

  16. Working Toward Policy-Relevant Air Quality Emissions Scenarios

    Science.gov (United States)

    Holloway, T.

    2010-12-01

    Though much work has been done to develop accurate chemical emission inventories, few publicly available inventories are appropriate for realistic policy analysis. Emissions from the electricity and transportation sectors, in particular, respond in complex ways to policy, technology, and energy use change. Many widely used inventories, such as the EPA National Emissions Inventory, are well-suited for modeling current air quality, but do not have the specificity needed to address "what if?" questions. Changes in electricity demand, fuel prices, new power sources, and emission controls all influence the emissions from regional power production, requiring a plant-by-plant assessment to capture the spatially explicit impacts. Similarly, land use, freight distribution, or driving behavior will yield differentiated transportation emissions for urban areas, suburbs, and rural highways. We here present results from three recent research projects at the University of Wisconsin—Madison, where bottom-up emission inventories for electricity, freight transport, and urban vehicle use were constructed to support policy-relevant air quality research. These three studies include: 1) Using the MyPower electricity dispatch model to calculate emissions and air quality impacts of Renewable Portfolio Standards and other carbon-management strategies; 2) Using advanced vehicle and commodity flow data from the Federal Highway Administration to evaluate the potential to shift commodities from truck to rail (assuming expanded infrastructure), and assess a range of alternative fuel suggestions; and 3) Working with urban planners to connect urban density with vehicle use to evaluate the air quality impacts of smart-growth in major Midwest cities. Drawing on the results of these three studies, and on challenges overcome in their execution, we discuss the current state of policy-relevant emission dataset generation, as well as techniques and attributes that need to be further refined in order

  17. BROA: An agent-based model to recommend relevant Learning Objects from Repository Federations adapted to learner profile

    Directory of Open Access Journals (Sweden)

    Paula A. Rodríguez

    2013-03-01

    Learning Objects (LOs) are distinguished from traditional educational resources by their easy and quick availability through Web-based repositories, from which they are accessed through their metadata. In addition, having a user profile allows an educational recommender system to help the learner find the most relevant LOs based on their needs and preferences. The aim of this paper is to propose an agent-based model, so-called BROA, to recommend relevant LOs recovered from Repository Federations as well as LOs adapted to the learner profile. The proposed model uses both the role and service models of the GAIA methodology, and the analysis models of the MAS-CommonKADS methodology. A prototype was built based on this model and validated to obtain some assessment results that are finally presented.

  18. BlenX-based compositional modeling of complex reaction mechanisms

    Directory of Open Access Journals (Sweden)

    Judit Zámborszky

    2010-02-01

    Molecular interactions are wired in a fascinating way, resulting in complex behavior of biological systems. Theoretical modeling provides a useful framework for understanding the dynamics and the function of such networks. The complexity of the biological networks calls for conceptual tools that manage the combinatorial explosion of the set of possible interactions. A suitable conceptual tool to attack complexity is compositionality, already successfully used in the process algebra field to model computer systems. We rely on the BlenX programming language, which originated from the beta-binders process calculus, to specify and simulate high-level descriptions of biological circuits. The Gillespie stochastic framework of BlenX requires the decomposition of phenomenological functions into basic elementary reactions. Systematic unpacking of complex reaction mechanisms into BlenX templates is shown in this study. The estimation/derivation of missing parameters and the challenges emerging from compositional model building in stochastic process algebras are discussed. A biological example on the circadian clock is presented as a case study of BlenX compositionality.

  19. Complex fluids modeling and algorithms

    CERN Document Server

    Saramito, Pierre

    2016-01-01

    This book presents a comprehensive overview of the modeling of complex fluids, including many common substances, such as toothpaste, hair gel, mayonnaise, liquid foam, cement and blood, which cannot be described by Navier-Stokes equations. It also offers an up-to-date mathematical and numerical analysis of the corresponding equations, as well as several practical numerical algorithms and software solutions for the approximation of the solutions. It discusses industrial (molten plastics, forming process), geophysical (mud flows, volcanic lava, glaciers and snow avalanches), and biological (blood flows, tissues) modeling applications. This book is a valuable resource for undergraduate students and researchers in applied mathematics, mechanical engineering and physics.

  20. The utility of Earth system Models of Intermediate Complexity

    NARCIS (Netherlands)

    Weber, S.L.

    2010-01-01

    Intermediate-complexity models are models which describe the dynamics of the atmosphere and/or ocean in less detail than conventional General Circulation Models (GCMs). At the same time, they go beyond the approach taken by atmospheric Energy Balance Models (EBMs) or ocean box models by

  1. Modelling, Estimation and Control of Networked Complex Systems

    CERN Document Server

    Chiuso, Alessandro; Frasca, Mattia; Rizzo, Alessandro; Schenato, Luca; Zampieri, Sandro

    2009-01-01

    The paradigm of complexity is pervading both science and engineering, leading to the emergence of novel approaches oriented at the development of a systemic view of the phenomena under study; the definition of powerful tools for modelling, estimation, and control; and the cross-fertilization of different disciplines and approaches. This book is devoted to networked systems which are one of the most promising paradigms of complexity. It is demonstrated that complex, dynamical networks are powerful tools to model, estimate, and control many interesting phenomena, like agent coordination, synchronization, social and economics events, networks of critical infrastructures, resources allocation, information processing, or control over communication networks. Moreover, it is shown how the recent technological advances in wireless communication and the decreasing cost and size of electronic devices are promoting the appearance of large inexpensive interconnected systems, each with computational, sensing and mobile cap...

  2. EVALUATING THE NOVEL METHODS ON SPECIES DISTRIBUTION MODELING IN COMPLEX FOREST

    Directory of Open Access Journals (Sweden)

    C. H. Tu

    2012-07-01

    The prediction of species distribution has become a focus in ecology. For predicting a result more effectively and accurately, some novel methods have been proposed recently, like support vector machine (SVM) and maximum entropy (MAXENT). However, high complexity in the forest, like that in Taiwan, makes the modeling even harder. In this study, we aim to explore which method is more applicable to species distribution modeling in complex forest. Castanopsis carlesii (long-leaf chinkapin, LLC), growing widely in Taiwan, was chosen as the target species because its seeds are an important food source for animals. We overlaid the tree samples on the layers of altitude, slope, aspect, terrain position, and vegetation index derived from SPOT-5 images, and developed three models, MAXENT, SVM, and decision tree (DT), to predict the potential habitat of LLCs. We evaluated these models by two sets of independent samples in different sites and assessed the effect of forest complexity by changing the background sample size (BSZ). In the forest with low complexity (small BSZ), the accuracies of the SVM (kappa = 0.87) and DT (0.86) models were slightly higher than that of MAXENT (0.84). In the more complex situation (large BSZ), MAXENT kept a high kappa value (0.85), whereas the SVM (0.61) and DT (0.57) models dropped significantly due to limiting the habitat close to samples. Therefore, the MAXENT model was more applicable for predicting species' potential habitat in the complex forest, whereas the SVM and DT models tended to underestimate the potential habitat of LLCs.
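    A hedged sketch of the comparison protocol described above, using scikit-learn SVM and decision-tree classifiers scored with Cohen's kappa on simulated presence/background data; MAXENT is omitted because it requires dedicated software, and the five predictors only mimic the study's SPOT-5-derived layers.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import cohen_kappa_score
from sklearn.model_selection import train_test_split

# Hypothetical predictors: altitude, slope, aspect, terrain position,
# vegetation index; y: 1 = species presence, 0 = background sample.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))
y = (X[:, 0] + 0.5 * X[:, 4] + rng.normal(0, 0.5, 1000) > 0).astype(int)

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)
for name, clf in [("SVM", SVC(kernel="rbf")),
                  ("DT", DecisionTreeClassifier(max_depth=5))]:
    clf.fit(Xtr, ytr)
    kappa = cohen_kappa_score(yte, clf.predict(Xte))
    print(f"{name}: kappa = {kappa:.3f}")
```

    Varying the number of background rows in X would mirror the study's BSZ manipulation of forest complexity.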

  3. Narrowing the gap between network models and real complex systems

    OpenAIRE

    Viamontes Esquivel, Alcides

    2014-01-01

    Simple network models that focus only on graph topology or, at best, basic interactions are often insufficient to capture all the aspects of a dynamic complex system. In this thesis, I explore those limitations, and some concrete methods of resolving them. I argue that, in order to succeed at interpreting and influencing complex systems, we need to take into account slightly more complex parts, interactions and information flows in our models. This thesis supports that affirmation with five a...

  4. The LAILAPS search engine: a feature model for relevance ranking in life science databases.

    Science.gov (United States)

    Lange, Matthias; Spies, Karl; Colmsee, Christian; Flemming, Steffen; Klapperstück, Matthias; Scholz, Uwe

    2010-03-25

    Efficient and effective information retrieval in life sciences is one of the most pressing challenges in bioinformatics. The incredible growth of life science databases into a vast network of interconnected information systems is both a big challenge and a great opportunity for life science research. The knowledge found on the Web, in particular in life-science databases, is a valuable major resource. In order to bring it to the scientist's desktop, it is essential to have well-performing search engines. Thereby, neither the response time nor the number of results is the most important factor; the most crucial factor for millions of query results is the relevance ranking. In this paper, we present a feature model for relevance ranking in life science databases and its implementation in the LAILAPS search engine. Motivated by the observation of user behavior during their inspection of search engine results, we condensed a set of 9 relevance-discriminating features. These features are intuitively used by scientists, who briefly screen database entries for potential relevance. The features are both sufficient to estimate the potential relevance and efficiently quantifiable. The derivation of a relevance prediction function that computes the relevance from these features constitutes a regression problem. To solve this problem, we used artificial neural networks that have been trained with a reference set of relevant database entries for 19 protein queries. Supporting a flexible text index and a simple data import format, these concepts are implemented in the LAILAPS search engine. It can easily be used both as a search engine for comprehensive integrated life science databases and for small in-house project databases. LAILAPS is publicly available for SWISSPROT data at http://lailaps.ipk-gatersleben.de.
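    A minimal sketch of the ranking pipeline described above: a small neural network is regressed on one feature vector per database entry, and its predicted relevance orders the result list. The nine feature values and training targets here are simulated placeholders, not LAILAPS data.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Nine hypothetical relevance-discriminating features per entry, with an
# expert-assigned relevance in [0, 1] as the regression target.
rng = np.random.default_rng(42)
X = rng.random((500, 9))
y = np.clip(X @ rng.random(9) / 4.0 + rng.normal(0, 0.05, 500), 0.0, 1.0)

net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                   random_state=0).fit(X, y)

candidates = rng.random((10, 9))            # feature vectors of query hits
order = np.argsort(-net.predict(candidates))
print(order)                                # best-first presentation order
```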

  5. Modeling geophysical complexity: a case for geometric determinism

    Directory of Open Access Journals (Sweden)

    C. E. Puente

    2007-01-01

    It has been customary in the last few decades to employ stochastic models to represent complex data sets encountered in geophysics, particularly in hydrology. This article reviews a deterministic geometric procedure to data modeling, one that represents whole data sets as derived distributions of simple multifractal measures via fractal functions. It is shown how such a procedure may lead to faithful holistic representations of existing geophysical data sets that, while complementing existing representations via stochastic methods, may also provide a compact language for geophysical complexity. The implications of these ideas, both scientific and philosophical, are stressed.

  6. Complex versus simple models: ion-channel cardiac toxicity prediction.

    Science.gov (United States)

    Mistry, Hitesh B

    2018-01-01

    There is growing interest in applying detailed mathematical models of the heart for ion-channel related cardiac toxicity prediction. However, a debate as to whether such complex models are required exists. Here an assessment of the predictive performance of two established large-scale biophysical cardiac models and a simple linear model, Bnet, was conducted. Three ion-channel data-sets were extracted from literature. Each compound was designated a cardiac risk category using two different classification schemes based on information within CredibleMeds. The predictive performance of each model within each data-set for each classification scheme was assessed via a leave-one-out cross validation. Overall the Bnet model performed equally as well as the leading cardiac models in two of the data-sets and outperformed both cardiac models on the latest. These results highlight the importance of benchmarking complex versus simple models but also encourage the development of simple models.

  7. Complex versus simple models: ion-channel cardiac toxicity prediction

    Directory of Open Access Journals (Sweden)

    Hitesh B. Mistry

    2018-02-01

    There is growing interest in applying detailed mathematical models of the heart for ion-channel related cardiac toxicity prediction. However, a debate as to whether such complex models are required exists. Here an assessment of the predictive performance of two established large-scale biophysical cardiac models and a simple linear model, Bnet, was conducted. Three ion-channel data-sets were extracted from literature. Each compound was designated a cardiac risk category using two different classification schemes based on information within CredibleMeds. The predictive performance of each model within each data-set for each classification scheme was assessed via a leave-one-out cross validation. Overall the Bnet model performed equally as well as the leading cardiac models in two of the data-sets and outperformed both cardiac models on the latest. These results highlight the importance of benchmarking complex versus simple models but also encourage the development of simple models.
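    Since Bnet is described only as a simple linear model, the sketch below reproduces just the evaluation protocol: leave-one-out cross validation of a linear classifier on per-compound ion-channel features. The simulated data, the choice of features, and the use of logistic regression as the linear stand-in are all assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut, cross_val_score

# Hypothetical per-compound ion-channel block features (e.g. scaled hERG,
# Na and Ca potencies) with a binary torsadogenic-risk label derived from
# CredibleMeds-style categories.
rng = np.random.default_rng(3)
X = rng.normal(size=(60, 3))
y = (X[:, 0] - 0.5 * X[:, 1] + rng.normal(0, 0.7, 60) > 0).astype(int)

acc = cross_val_score(LogisticRegression(), X, y, cv=LeaveOneOut()).mean()
print(f"leave-one-out accuracy of the simple linear model: {acc:.2f}")
```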

  8. Network-oriented modeling addressing complexity of cognitive, affective and social interactions

    CERN Document Server

    Treur, Jan

    2016-01-01

    This book presents a new approach that can be applied to complex, integrated individual and social human processes. It provides an alternative means of addressing complexity, better suited for its purpose than, and effectively complementing, traditional strategies involving isolation and separation assumptions. Network-oriented modeling allows high-level cognitive, affective and social models in the form of (cyclic) graphs to be constructed, which can be automatically transformed into executable simulation models. The modeling format used makes it easy to take into account theories and findings about complex cognitive and social processes, which often involve dynamics based on interrelating cycles. Accordingly, it makes it possible to address complex phenomena such as the integration of emotions within cognitive processes of all kinds, of internal simulations of the mental processes of others, and of social phenomena such as shared understandings and collective actions. A variety of sample models – including ...

  9. Building a pseudo-atomic model of the anaphase-promoting complex

    International Nuclear Information System (INIS)

    Kulkarni, Kiran; Zhang, Ziguo; Chang, Leifu; Yang, Jing; Fonseca, Paula C. A. da; Barford, David

    2013-01-01

    This article describes an example of molecular replacement in which atomic models are used to interpret electron-density maps determined using single-particle electron-microscopy data. The anaphase-promoting complex (APC/C) is a large E3 ubiquitin ligase that regulates progression through specific stages of the cell cycle by coordinating the ubiquitin-dependent degradation of cell-cycle regulatory proteins. Depending on the species, the active form of the APC/C consists of 14–15 different proteins that assemble into a 20-subunit complex with a mass of approximately 1.3 MDa. A hybrid approach of single-particle electron microscopy and protein crystallography of individual APC/C subunits has been applied to generate pseudo-atomic models of various functional states of the complex. Three approaches for assigning regions of the EM-derived APC/C density map to specific APC/C subunits are described. This information was used to dock atomic models of APC/C subunits, determined either by protein crystallography or homology modelling, to specific regions of the APC/C EM map, allowing the generation of a pseudo-atomic model corresponding to 80% of the entire complex

  10. Modelling the complex dynamics of vegetation, livestock and rainfall ...

    African Journals Online (AJOL)

    In this paper, we present mathematical models that incorporate ideas from complex systems theory to integrate several strands of rangeland theory in a hierarchical framework. ... Keywords: catastrophe theory; complexity theory; disequilibrium; hysteresis; moving attractors

  11. Cadmium toxicity investigated at the physiological and biophysical levels under environmentally relevant conditions using the aquatic model plant Ceratophyllum demersum

    Czech Academy of Sciences Publication Activity Database

    Andresen, Elisa; Kappel, S.; Stärk, H.-J.; Riegger, U.; Borovec, Jakub; Mattusch, J.; Heinz, A.; Schmelzer, C.E.H.; Matoušková, Šárka; Dickinson, B.; Küpper, Hendrik

    2016-01-01

    Roč. 210, č. 4 (2016), s. 1244-1258 ISSN 0028-646X Institutional support: RVO:60077344 ; RVO:67985831 Keywords : Ceratophyllum demersum * Environmentally relevant * Light-harvesting complexes (LHCs) * Toxic metals Subject RIV: CE - Biochemistry; DD - Geochemistry (GLU-S) Impact factor: 7.330, year: 2016

  12. Malaria in pregnancy: the relevance of animal models for vaccine development.

    Science.gov (United States)

    Doritchamou, Justin; Teo, Andrew; Fried, Michal; Duffy, Patrick E

    2017-10-06

    Malaria during pregnancy due to Plasmodium falciparum or P. vivax is a major public health problem in endemic areas, with P. falciparum causing the greatest burden of disease. Increasing resistance of parasites and mosquitoes to existing tools, such as preventive antimalarial treatments and insecticide-treated bed nets respectively, is eroding the partial protection that they offer to pregnant women. Thus, development of effective vaccines against malaria during pregnancy is an urgent priority. Relevant animal models that recapitulate key features of the pathophysiology and immunology of malaria in pregnant women could be used to accelerate vaccine development. This review summarizes available rodent and nonhuman primate models of malaria in pregnancy, and discusses their suitability for studies of biologics intended to prevent or treat malaria in this vulnerable population.

  13. Relevance of Discrecionary Accruals in Ohlson Model: the Case of Mexico

    Directory of Open Access Journals (Sweden)

    Rocío Durán-Vázquez

    2012-01-01

    This study applied the modified Jones model (1991) to selected companies in Mexico. The model aims to assess the impact of Discretionary Accrual Information (DAI) on financial reporting statements, in order to identify the value relevance of "earnings quality". We applied the methodological criteria of Chung et al. (2005) and Mukit & Iskandar (2009). We analyzed financial information for the 35 stocks included in the Index of Prices and Quotations (IPC) of the Mexican Stock Exchange (BMV) for the period 2000 to 2011. Nineteen companies met the specifications of the model, over 48 quarters of information. The analysis was done in three parts: first, an analysis of the modified Jones model under panel data considerations, using fixed effects and adjustments for autocorrelation of order 1; second, a correlation analysis between the residuals of the modified Jones model and the stock price return at three annual closings of the study period: 2007, 2008 and 2009; and third, we incorporated this variable (DAI) into the Ohlson model (of the financial and corporate accounting literature) and tested it with panel data analysis, under fixed effects, throughout the study period.
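    For readers unfamiliar with the modified Jones model, a hedged sketch of the core regression follows: total accruals scaled by lagged assets are regressed on the inverse of lagged assets, the receivables-adjusted revenue change, and gross PPE, and the residuals serve as the discretionary-accrual estimates (DAI). The pooled OLS below simplifies the study's fixed-effects, AR(1)-adjusted panel estimation, and all variable values are simulated.

```python
import numpy as np
import statsmodels.api as sm

def discretionary_accruals(ta, assets_lag, d_rev, d_rec, ppe):
    """Modified Jones (1991) model: regress scaled total accruals on
    1/A(t-1), (dREV - dREC)/A(t-1) and PPE/A(t-1); the OLS residuals
    are the discretionary-accrual estimates (DAI)."""
    X = np.column_stack([1.0 / assets_lag,
                         (d_rev - d_rec) / assets_lag,
                         ppe / assets_lag])
    res = sm.OLS(ta / assets_lag, sm.add_constant(X)).fit()
    return res.resid

# Toy firm-quarter observations; the study pools 19 firms over 48 quarters.
rng = np.random.default_rng(7)
n = 200
assets_lag = rng.uniform(100.0, 1000.0, n)
dai = discretionary_accruals(ta=rng.normal(0, 10, n), assets_lag=assets_lag,
                             d_rev=rng.normal(0, 20, n),
                             d_rec=rng.normal(0, 5, n),
                             ppe=rng.uniform(50, 500, n))
print(dai[:5])    # residuals: the DAI series later fed into the Ohlson model
```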

  14. Modeling Complex Nesting Structures in International Business Research

    DEFF Research Database (Denmark)

    Nielsen, Bo Bernhard; Nielsen, Sabina

    2013-01-01

    hierarchical random coefficient models (RCM) are often used for the analysis of multilevel phenomena, IB issues often result in more complex nested structures. This paper illustrates how cross-nested multilevel modeling allowing for predictor variables and cross-level interactions at multiple (crossed) levels...

  15. Linking electricity and water models to assess electricity choices at water-relevant scales

    International Nuclear Information System (INIS)

    Sattler, S; Rogers, J; Macknick, J; Lopez, A; Yates, D; Flores-Lopez, F

    2012-01-01

    Hydrology/water management and electricity generation projections have been modeled separately, but there has been little effort in intentionally and explicitly linking the two sides of the water–energy nexus. This paper describes a platform for assessing power plant cooling water withdrawals and consumption under different electricity pathways at geographic and time scales appropriate for both electricity and hydrology/water management. This platform uses estimates of regional electricity generation by the Regional Energy Deployment System (ReEDS) as input to a hydrologic and water management model—the Water Evaluation and Planning (WEAP) system. In WEAP, this electricity use represents thermoelectric cooling water withdrawals and consumption within the broader, regional water resource context. Here we describe linking the electricity and water models, including translating electricity generation results from ReEDS-relevant geographies to the water-relevant geographies of WEAP. The result of this analysis is water use by the electric sector at the regional watershed level, which is used to examine the water resource implications of these electricity pathways. (letter)
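    A minimal sketch of the hand-off the letter describes, translating regional generation into watershed-level cooling-water withdrawals; the withdrawal factors, technology keys, and region-to-watershed shares below are illustrative assumptions, not ReEDS or WEAP values.

```python
# Hypothetical cooling-water withdrawal factors, gallons per MWh
WITHDRAWAL_GAL_PER_MWH = {
    ("coal", "once-through"): 27000.0,
    ("coal", "recirculating"): 600.0,
    ("ngcc", "recirculating"): 250.0,
    ("wind", None): 0.0,
}

# Assumed share of each electricity region's generation per watershed
REGION_TO_WATERSHED = {"reeds_region_42": {"basin_A": 0.7, "basin_B": 0.3}}

def water_withdrawals(generation_mwh):
    """generation_mwh: {(region, tech, cooling): MWh} -> gallons per basin."""
    out = {}
    for (region, tech, cooling), mwh in generation_mwh.items():
        gallons = mwh * WITHDRAWAL_GAL_PER_MWH[(tech, cooling)]
        for basin, share in REGION_TO_WATERSHED[region].items():
            out[basin] = out.get(basin, 0.0) + gallons * share
    return out

print(water_withdrawals({("reeds_region_42", "coal", "recirculating"): 1e6}))
```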

  16. Electromagnetic modelling of Ground Penetrating Radar responses to complex targets

    Science.gov (United States)

    Pajewski, Lara; Giannopoulos, Antonis

    2014-05-01

    This work deals with the electromagnetic modelling of composite structures for Ground Penetrating Radar (GPR) applications. It was developed within the Short-Term Scientific Mission ECOST-STSM-TU1208-211013-035660, funded by COST Action TU1208 "Civil Engineering Applications of Ground Penetrating Radar". The Authors define a set of test concrete structures, hereinafter called cells. The size of each cell is 60 x 100 x 18 cm and the content varies with growing complexity, from a simple cell with few rebars of different diameters embedded in concrete at increasing depths, to a final cell with a quite complicated pattern, including a layer of tendons between two overlying meshes of rebars. Other cells, of intermediate complexity, contain PVC ducts (air-filled or hosting rebars), steel objects commonly used in civil engineering (such as a pipe, an angle bar, a box section and a u-channel), as well as void and honeycombing defects. One of the cells has a steel mesh embedded in it, overlying two rebars placed diagonally across the corners of the structure. Two cells include a couple of rebars bent into a right angle and placed on top of each other, with a square/round circle lying at the base of the concrete slab. Inspiration for some of these cells is taken from the very interesting experimental work presented in Ref. [1]. For each cell, a subset of models with growing complexity is defined, starting from a simple representation of the cell and ending with a more realistic one. In particular, the model's complexity increases from the geometrical point of view, as well as in terms of how the constitutive parameters of the involved media and GPR antennas are described. Some cells can be simulated in both two and three dimensions; the concrete slab can be approximated as a finite-thickness layer having infinite extension on the transverse plane, thus neglecting how edges affect radargrams, or else its finite size can be fully taken into account. The permittivity of concrete can be

  17. Main Features of a 3d GIS for a Monumental Complex with AN Historical-Cultural Relevance

    Science.gov (United States)

    Scianna, A.; La Guardia, M.

    2017-05-01

    The latest achievements of geomatics technologies, especially in the survey and restitution of 3D models (UAV/drone and laser scanner technologies), have generated new procedures and higher standards of quality in the representation of archaeological sites. Together with geomatics, the recent development of Information and Communication Technologies (ICT) strongly contributes to the documentation of Cultural Heritage (CH). The representation and documentation of CH using these new technologies has become necessary in order to satisfy different needs: - for restorers, to acquire a deep knowledge of the cultural good and to define possible strategies of restoration; - for the conservation of information, making it possible to preserve the 3D geometry of the monumental complex together with descriptions of its architectural elements; - for tourism, giving the opportunity to share CH information on the web, allowing users to visit and explore monumental complexes virtually and to acquire detailed information about architectural elements or the history of the monumental complex. In these new scenarios, the development of a 3D Geographic Information System (GIS) applied to a cultural good could today be an added value of fundamental importance for the full description and data management of monumental complexes. In this work, the main features necessary for the correct construction of a 3D GIS of a monumental complex are analyzed, with a particular focus on the possibilities for creating a standardized procedure to follow.

  18. ANS main control complex three-dimensional computer model development

    International Nuclear Information System (INIS)

    Cleaves, J.E.; Fletcher, W.M.

    1993-01-01

    A three-dimensional (3-D) computer model of the Advanced Neutron Source (ANS) main control complex is being developed. The main control complex includes the main control room, the technical support center, the materials irradiation control room, computer equipment rooms, communications equipment rooms, cable-spreading rooms, and some support offices and breakroom facilities. The model will be used to provide facility designers and operations personnel with capabilities for fit-up/interference analysis, visual "walk-throughs" for optimizing maintainability, and human factors and operability analyses. It will be used to determine performance design characteristics, to generate construction drawings, and to integrate control room layout, equipment mounting, grounding equipment, electrical cabling, and utility services into ANS building designs. This paper describes the development of the initial phase of the 3-D computer model for the ANS main control complex and plans for its further development and use.

  19. Surface-complexation models for sorption onto heterogeneous surfaces

    International Nuclear Information System (INIS)

    Harvey, K.B.

    1997-10-01

    This report provides a description of the discrete-logK spectrum model, its derivation, and its place in the larger context of surface-complexation modelling. The tools necessary to apply the discrete-logK spectrum model are discussed, and background information appropriate to this discussion is supplied as appendices. (author)
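
    As a rough illustration of the discrete-logK idea (a simplified reading, not the report's exact formulation), the sketch below superposes Langmuir-type contributions from a small set of site classes, each with its own binding constant; all numbers are invented placeholders.

        import numpy as np

        logK = np.array([2.0, 4.0, 6.0])   # assumed discrete spectrum of binding constants
        sites = np.array([0.5, 0.3, 0.2])  # fraction of sites in each class
        total_sites = 1e-4                 # mol of sites per litre (illustrative)

        def sorbed(c_free):
            """Total sorbed concentration at free solute concentration c_free (mol/L)."""
            K = 10.0 ** logK
            return total_sites * np.sum(sites * K * c_free / (1.0 + K * c_free))

        for c in (1e-8, 1e-6, 1e-4):
            print(f"c_free = {c:.0e} M  ->  sorbed = {sorbed(c):.3e} M")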

  20. "What is relevant in a text document?": An interpretable machine learning approach.

    Directory of Open Access Journals (Sweden)

    Leila Arras

    Full Text Available Text documents can be described by a number of abstract concepts such as semantic category, writing style, or sentiment. Machine learning (ML) models have been trained to automatically map documents to these abstract concepts, making it possible to annotate very large text collections, more than could be processed by a human in a lifetime. Besides predicting the text's category very accurately, it is also highly desirable to understand how and why the categorization process takes place. In this paper, we demonstrate that such understanding can be achieved by tracing the classification decision back to individual words using layer-wise relevance propagation (LRP), a recently developed technique for explaining the predictions of complex non-linear classifiers. We train two word-based ML models, a convolutional neural network (CNN) and a bag-of-words SVM classifier, on a topic categorization task and adapt the LRP method to decompose the predictions of these models onto words. The resulting scores indicate how much individual words contribute to the overall classification decision. This enables one to distill relevant information from text documents without an explicit semantic information extraction step. We further use the word-wise relevance scores to generate novel vector-based document representations which capture semantic information. Based on these document vectors, we introduce a measure of model explanatory power and show that, although the SVM and CNN models perform similarly in terms of classification accuracy, the latter exhibits a higher level of explainability, which makes it more comprehensible for humans and potentially more useful for other applications.
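
    A minimal numpy sketch of the epsilon rule of LRP on a toy two-layer ReLU network may clarify the word-level decomposition; the weights and the "document" vector are random placeholders, not a trained CNN.

        import numpy as np

        # Toy two-layer ReLU "classifier": 8 word features -> 5 hidden -> 1 score.
        rng = np.random.default_rng(0)
        W1, b1 = rng.normal(size=(8, 5)), np.zeros(5)
        W2, b2 = rng.normal(size=(5, 1)), np.zeros(1)
        x = rng.random(8)                      # stand-in bag-of-words document

        a1 = np.maximum(0, x @ W1 + b1)        # forward pass
        score = a1 @ W2 + b2

        def lrp_eps(a, W, b, R, eps=1e-6):
            """Epsilon rule: redistribute relevance R to the layer below."""
            z = a @ W + b                      # pre-activations of the upper layer
            s = R / (z + eps * np.sign(z))     # stabilised relevance per upper unit
            return a * (W @ s)                 # contribution-weighted redistribution

        R_hidden = lrp_eps(a1, W2, b2, score)
        R_words = lrp_eps(x, W1, b1, R_hidden)
        print("score:", score[0], "sum of word relevances:", R_words.sum())

    With zero biases, the word relevances approximately sum to the output score, which is the conservation property that makes the scores interpretable as contributions.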

  1. Clinical relevance of voltage-gated potassium channel–complex antibodies in children.

    Science.gov (United States)

    Hacohen, Yael; Singh, Rahul; Rossi, Meghan; Lang, Bethan; Hemingway, Cheryl; Lim, Ming; Vincent, Angela

    2015-09-15

    To assess the clinical and immunologic findings in children with voltage-gated potassium channel (VGKC)-complex antibodies (Abs). Thirty-nine of 363 sera, referred from 2 pediatric centers from 2007 to 2013, had been reported positive (>100 pM) for VGKC-complex Abs. Medical records were reviewed retrospectively and the patients' conditions were independently classified as inflammatory (n = 159) or noninflammatory (n = 204). Positive sera (>100 pM) were tested/retested for Abs to the VGKC-complex proteins LGI1 and CASPR2, screened for binding to live hippocampal neurons, and 12 high-titer sera (>400 pM) were tested by radioimmunoassay for binding to VGKC Kv1 subunits with or without intracellular postsynaptic density proteins. VGKC-complex Abs were found in 39 children, including 20% of encephalopathies and 7.6% of other conditions (p = 0.001). Thirty children had inflammatory conditions and 9 had noninflammatory etiologies, but titers >400 pM (n = 12) were found only in inflammatory diseases (p < 0.0001). Four sera, including those from 2 children with coexisting NMDA receptor Abs and one with Guillain-Barré syndrome and Abs to both LGI1 and CASPR2, bound to hippocampal neurons. None of the sera bound detectably to VGKC Kv1 subunits on live HEK cells, but 4 of 12 >400 pM sera immunoprecipitated VGKC Kv1 subunits, with or without postsynaptic densities, extracted from transfected cells. Positive VGKC-complex Abs cannot be taken to indicate a specific clinical syndrome in children, but appear to be a nonspecific biomarker of inflammatory neurologic diseases, particularly of encephalopathy. Some of the Abs may bind to intracellular epitopes on the VGKC subunits, or to the intracellular interacting proteins, but in many cases the targets remain undefined.

  2. Volterra representation enables modeling of complex synaptic nonlinear dynamics in large-scale simulations.

    Science.gov (United States)

    Hu, Eric Y; Bouteiller, Jean-Marie C; Song, Dong; Baudry, Michel; Berger, Theodore W

    2015-01-01

    Chemical synapses comprise a wide collection of intricate signaling pathways involving complex dynamics. These mechanisms are often reduced to simple spike or exponential representations in order to enable computer simulations at higher spatial levels of complexity. However, such representations cannot capture important nonlinear dynamics found in synaptic transmission. Here, we propose an input-output (IO) synapse model capable of generating complex nonlinear dynamics while maintaining low computational complexity. This IO synapse model is an extension of a detailed mechanistic glutamatergic synapse model, capturing the input-output relationships of the mechanistic model using the Volterra functional power series. We demonstrate that the IO synapse model is able to successfully track the nonlinear dynamics of the synapse up to the third order with high accuracy. We also evaluate the accuracy of the IO synapse model at different input frequencies and compare its performance with that of kinetic models in compartmental neuron models. Our results demonstrate that the IO synapse model can efficiently replicate the complex nonlinear dynamics represented in the original mechanistic model, and they provide a method for replicating complex and diverse synaptic transmission within neuron network simulations.
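
    The following sketch evaluates a discrete second-order Volterra input-output model, the kind of functional power series the abstract refers to; the kernels are arbitrary decaying placeholders, not kernels fitted to a mechanistic synapse model.

        import numpy as np

        M = 20                                    # kernel memory length (samples)
        tau = np.arange(M)
        k0 = 0.0
        k1 = np.exp(-tau / 5.0)                   # assumed first-order kernel
        k2 = 0.05 * np.outer(np.exp(-tau / 3.0), np.exp(-tau / 3.0))  # second-order kernel

        def volterra2(x):
            """y(n) = k0 + sum k1*x(n-t) + double sum k2*x(n-t1)*x(n-t2)."""
            y = np.full(len(x), k0)
            for n in range(len(x)):
                past = x[max(0, n - M + 1): n + 1][::-1]   # x(n), x(n-1), ...
                m = len(past)
                y[n] += k1[:m] @ past + past @ k2[:m, :m] @ past
            return y

        x = (np.random.default_rng(1).random(200) < 0.1).astype(float)  # sparse input train
        print("response at the last sample:", volterra2(x)[-1])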

  3. How fear-relevant illusory correlations might develop and persist in anxiety disorders: A model of contributing factors.

    Science.gov (United States)

    Wiemer, Julian; Pauli, Paul

    2016-12-01

    Fear-relevant illusory correlations (ICs) are defined as the overestimation of the relationship between a fear-relevant stimulus and aversive consequences. ICs reflect biased cognitions affecting the learning and unlearning of fear in anxiety disorders, and a deeper understanding might help to improve treatment. A model for the maintenance of ICs is proposed that highlights the importance of the amplified aversiveness and salience of fear-relevant outcomes, impaired executive contingency monitoring and an availability heuristic. The model explains why ICs are enhanced in highly fearful individuals and yields implications that might be applied to augment the effectiveness of cognitive behavior therapy, such as emotion regulation and the direction of attention to non-aversive experiences. Finally, we suggest possible future research directions and an alternative measure of ICs. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Model-Based Approach to the Evaluation of Task Complexity in Nuclear Power Plant

    International Nuclear Information System (INIS)

    Ham, Dong Han

    2007-02-01

    This study developed a model-based method for evaluating task complexity and examined ways of evaluating the complexity of tasks designed for abnormal situations and daily task situations in NPPs. The main results of this study can be summarised as follows. First, this study developed a conceptual framework for studying complexity factors and a model of complexity factors that classifies them according to the types of knowledge that human operators use. Second, this study developed a more practical model of task complexity factors and identified twenty-one complexity factors based on the model. The model emphasizes that a task is a system to be designed and that its complexity has several dimensions. Third, we developed a method of identifying task complexity factors and evaluating task complexity qualitatively based on the developed model of task complexity factors. This method can be widely used in various task situations. Fourth, this study examined the applicability of the TACOM measure to abnormal situations and daily task situations, such as maintenance, and confirmed that it can reasonably be used in those situations. Fifth, we developed application examples to demonstrate the use of the theoretical results of this study. Lastly, this study reinterpreted well-known principles for designing information displays in NPPs in terms of task complexity and suggested a way of evaluating the conceptual design of displays analytically by using the concept of task complexity. All of these results will be used as a basis when evaluating the complexity of tasks designed in procedures or information displays and when designing ways of improving human performance in NPPs.

  5. Repository environmental parameters and models/methodologies relevant to assessing the performance of high-level waste packages in basalt, tuff, and salt

    Energy Technology Data Exchange (ETDEWEB)

    Claiborne, H.C.; Croff, A.G.; Griess, J.C.; Smith, F.J.

    1987-09-01

    This document provides specifications for models/methodologies that could be employed in determining postclosure repository environmental parameters relevant to the performance of high-level waste packages for the Basalt Waste Isolation Project (BWIP) at Richland, Washington, the tuff at Yucca Mountain near the Nevada Test Site, and the bedded salt in Deaf Smith County, Texas. Guidance is provided on the identity of the relevant repository environmental parameters, the models/methodologies employed to determine them, and the input database for these models/methodologies. Supporting studies included are an analysis of potential waste package failure modes leading to identification of the relevant repository environmental parameters, an evaluation of the credible range of the repository environmental parameters, and a summary of the review of existing models/methodologies currently employed in determining repository environmental parameters relevant to waste package performance. 327 refs., 26 figs., 19 tabs.

  6. Repository environmental parameters and models/methodologies relevant to assessing the performance of high-level waste packages in basalt, tuff, and salt

    International Nuclear Information System (INIS)

    Claiborne, H.C.; Croff, A.G.; Griess, J.C.; Smith, F.J.

    1987-09-01

    This document provides specifications for models/methodologies that could be employed in determining postclosure repository environmental parameters relevant to the performance of high-level waste packages for the Basalt Waste Isolation Project (BWIP) at Richland, Washington, the tuff at Yucca Mountain near the Nevada Test Site, and the bedded salt in Deaf Smith County, Texas. Guidance is provided on the identity of the relevant repository environmental parameters, the models/methodologies employed to determine them, and the input database for these models/methodologies. Supporting studies included are an analysis of potential waste package failure modes leading to identification of the relevant repository environmental parameters, an evaluation of the credible range of the repository environmental parameters, and a summary of the review of existing models/methodologies currently employed in determining repository environmental parameters relevant to waste package performance. 327 refs., 26 figs., 19 tabs.

  7. Sensitivity of the coastal tsunami simulation to the complexity of the 2011 Tohoku earthquake source model

    Science.gov (United States)

    Monnier, Angélique; Loevenbruck, Anne; Gailler, Audrey; Hébert, Hélène

    2016-04-01

    The 11 March 2011 Tohoku-Oki event, both the earthquake and the ensuing tsunami, is exceptionally well documented. A wide range of onshore and offshore data has been recorded by seismic, geodetic, ocean-bottom pressure and sea level sensors. Along with these numerous observations, advances in inversion techniques and computing facilities have led to many source studies. Inversions for rupture parameters such as slip distribution and rupture history permit estimation of the complex coseismic seafloor deformation. The most relevant coseismic source models from the numerous published seismic source studies are tested here. Comparing the predicted signals generated using both static and kinematic ruptures to the offshore and coastal measurements helps determine which source model should be used to obtain the most consistent coastal tsunami simulations. This work is funded by the TANDEM project, reference ANR-11-RSNR-0023-01 of the French Programme Investissements d'Avenir (PIA 2014-2018).

  8. Understanding the implementation of complex interventions in health care: the normalization process model

    Directory of Open Access Journals (Sweden)

    Rogers Anne

    2007-09-01

    Full Text Available Background: The Normalization Process Model is a theoretical model that assists in explaining the processes by which complex interventions become routinely embedded in health care practice. It offers a framework for process evaluation and also for comparative studies of complex interventions. It focuses on the factors that promote or inhibit the routine embedding of complex interventions in health care practice. Methods: A formal theory structure is used to define the model, and its internal causal relations and mechanisms. The model is broken down to show that it is consistent and adequate in generating accurate description, systematic explanation, and the production of rational knowledge claims about the workability and integration of complex interventions. Results: The model explains the normalization of complex interventions by reference to four factors demonstrated to promote or inhibit the operationalization and embedding of complex interventions (interactional workability, relational integration, skill-set workability, and contextual integration). Conclusion: The model is consistent and adequate. Repeated calls for theoretically sound process evaluations in randomized controlled trials of complex interventions, and policy-makers who call for a proper understanding of implementation processes, emphasize the value of conceptual tools like the Normalization Process Model.

  9. Complexity, parameter sensitivity and parameter transferability in the modelling of floodplain inundation

    Science.gov (United States)

    Bates, P. D.; Neal, J. C.; Fewtrell, T. J.

    2012-12-01

    In this paper we consider two related questions. First, we address the issue of how much physical complexity is necessary in a model in order to simulate floodplain inundation to within validation data error. This is achieved through development of a single-code/multiple-physics hydraulic model (LISFLOOD-FP) in which different degrees of complexity can be switched on or off. Different configurations of this code are applied to four benchmark test cases and compared to the results of a number of industry-standard models. Second, we address the issue of how parameter sensitivity and transferability change with increasing complexity, using numerical experiments with models of different physical and geometric intricacy. Hydraulic models are a good example system with which to address such generic modelling questions as: (1) they have a strong physical basis; (2) there is only one set of equations to solve; (3) they require only topography and boundary conditions as input data; and (4) they typically require only a single free parameter, namely boundary friction. In terms of the complexity required, we show that for the problem of sub-critical floodplain inundation a number of codes of different dimensionality and resolution can be found to fit uncertain model validation data equally well, and that in this situation Occam's razor emerges as a useful logic to guide model selection. We also find that model skill usually improves more rapidly with increases in model spatial resolution than with increases in physical complexity, and that standard approaches to testing hydraulic models against laboratory data or analytical solutions may fail to identify this important fact. Lastly, we find that in benchmark testing studies significant differences can exist between codes with identical numerical solution techniques as a result of auxiliary choices regarding the specifics of model implementation that are frequently unreported by code developers. As a consequence, making sound
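
    For a feel of the simplest physics option in such codes, here is a one-dimensional sketch of the simplified inertial update popularised for storage-cell floodplain models (after Bates et al., 2010); geometry, roughness and inflow are illustrative assumptions, not one of the benchmark test cases.

        import numpy as np

        g, n_mann, dx, dt = 9.81, 0.03, 50.0, 5.0
        z = np.linspace(1.0, 0.0, 40)          # bed elevation (m), gentle slope
        h = np.full(40, 0.01)                  # initial water depth (m)
        q = np.zeros(39)                       # unit discharge at cell faces (m2/s)

        for step in range(2000):
            eta = z + h                        # free-surface elevation
            hflow = np.maximum(np.maximum(eta[:-1], eta[1:])
                               - np.maximum(z[:-1], z[1:]), 1e-6)
            slope = (eta[1:] - eta[:-1]) / dx
            q = (q - g * hflow * dt * slope) / (
                1.0 + g * dt * n_mann ** 2 * np.abs(q) / hflow ** (7.0 / 3.0))
            h[1:-1] += dt / dx * (q[:-1] - q[1:])   # mass balance, interior cells
            h[0] += dt / dx * (0.2 - q[0])          # constant upstream inflow 0.2 m2/s
            h[-1] += dt / dx * q[-1]                # water ponds at the closed end
            h = np.maximum(h, 0.0)
        print(f"depths after {2000 * dt:.0f} s: min {h.min():.3f} m, max {h.max():.3f} m")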

  10. Neutrophil programming dynamics and its disease relevance.

    Science.gov (United States)

    Ran, Taojing; Geng, Shuo; Li, Liwu

    2017-11-01

    Neutrophils are traditionally considered first responders to infection and provide antimicrobial host defense. However, recent advances indicate that neutrophils are also critically involved in the modulation of host immune environments by dynamically adopting distinct functional states. Functionally diverse neutrophil subsets are increasingly recognized as critical components mediating host pathophysiology. Despite their emerging significance, the molecular mechanisms and functional relevance of dynamically programmed neutrophils remain to be better defined. The increasing complexity of neutrophil functions may require integrative studies that address the programming dynamics of neutrophils and their pathophysiological relevance. This review aims to provide an update on the emerging topic of neutrophil programming dynamics as well as their functional relevance in diseases.

  11. Modeling the propagation of mobile malware on complex networks

    Science.gov (United States)

    Liu, Wanping; Liu, Chao; Yang, Zheng; Liu, Xiaoyang; Zhang, Yihao; Wei, Zuxue

    2016-08-01

    In this paper, the spreading behavior of malware across mobile devices is addressed. By introducing complex networks, which follow a power-law degree distribution, to model mobile networks, a novel epidemic model for mobile malware propagation is proposed. The spreading threshold that governs the dynamics of the model is calculated. Theoretically, the asymptotic stability of the malware-free equilibrium is confirmed when the threshold is below unity, and global stability is further proved under some sufficient conditions. The influences of different model parameters as well as the network topology on malware propagation are also analyzed. Our theoretical studies and numerical simulations show that networks with higher heterogeneity are more conducive to the diffusion of malware, and that complex networks with lower power-law exponents benefit malware spreading.
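
    A hedged sketch of the kind of experiment the abstract describes: an SIS-type process simulated on a synthetic scale-free network, with the heterogeneous mean-field threshold <k>/<k^2> printed for comparison. Rates and sizes are arbitrary illustrative values, not the paper's parameterisation.

        import numpy as np
        import networkx as nx

        rng = np.random.default_rng(42)
        G = nx.barabasi_albert_graph(2000, 3, seed=42)      # power-law-like network
        deg = np.array([d for _, d in G.degree()])
        print("mean-field threshold <k>/<k^2> =", deg.mean() / (deg ** 2).mean())

        beta, mu, steps = 0.05, 0.1, 200         # per-contact infection / recovery rates
        infected = set(rng.choice(G.number_of_nodes(), 20, replace=False))
        for _ in range(steps):
            new_inf = {v for u in infected for v in G.neighbors(u)
                       if v not in infected and rng.random() < beta}
            recovered = {u for u in infected if rng.random() < mu}
            infected = (infected | new_inf) - recovered
        print("stationary infected fraction ~", len(infected) / G.number_of_nodes())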

  12. Nostradamus 2014 prediction, modeling and analysis of complex systems

    CERN Document Server

    Suganthan, Ponnuthurai; Chen, Guanrong; Snasel, Vaclav; Abraham, Ajith; Rössler, Otto

    2014-01-01

    The prediction of the behavior of complex systems and the analysis and modeling of their structure are vitally important problems in engineering, economics and science generally. Examples of such systems can be seen in the world around us (including our bodies) and of course in almost every scientific discipline, including such “exotic” domains as the earth's atmosphere, turbulent fluids, economics (exchange rates and stock markets), population growth, physics (control of plasma), information flow in social networks and its dynamics, chemistry and complex networks. To understand such complex dynamics, which often exhibit strange behavior, and to use them in research or industrial applications, it is paramount to create models of them. For this purpose there exists a rich spectrum of methods, from classical ones such as ARMA models or the Box-Jenkins method to modern ones like evolutionary computation, neural networks, fuzzy logic, geometry, deterministic chaos and others. This proceedings book is a collection of accepted ...

  13. Glass Durability Modeling, Activated Complex Theory (ACT)

    International Nuclear Information System (INIS)

    CAROL, JANTZEN

    2005-01-01

    The most important requirement for high-level waste glass acceptance for disposal in a geological repository is the chemical durability, expressed as a glass dissolution rate. During the early stages of glass dissolution in near-static conditions that represent a repository disposal environment, a gel layer resembling a membrane forms on the glass surface through which ions exchange between the glass and the leachant. The hydrated gel layer exhibits acid/base properties which are manifested as the pH dependence of the thickness and nature of the gel layer. The gel layer has been found to age into either clay mineral assemblages or zeolite mineral assemblages. The formation of one phase preferentially over the other has been experimentally related to changes in the pH of the leachant and to the relative amounts of Al3+ and Fe3+ in a glass. The formation of clay mineral assemblages (lower pH and Fe3+-rich glasses) on the leached glass surface layers causes the dissolution rate to slow to a long-term steady-state rate. The formation of zeolite mineral assemblages (higher pH and Al3+-rich glasses) on leached glass surface layers causes the dissolution rate to increase and return to the initial high forward rate. The return to the forward dissolution rate is undesirable for long-term performance of glass in a disposal environment. An investigation into the role of glass stoichiometry, in terms of the quasi-crystalline mineral species in a glass, has shown that the chemistry and structure of the parent glass appear to control the activated surface complexes that form in the leached layers, and these mineral complexes (some Fe3+ rich and some Al3+ rich) play a role in whether clays or zeolites are the dominant species formed on the leached glass surface. The chemistry and structure, in terms of Q distributions of the parent glass, are well represented by the atomic ratios of the glass-forming components. Thus, glass dissolution modeling using simple

  14. Diversification versus specialization in complex ecosystems.

    Directory of Open Access Journals (Sweden)

    Riccardo Di Clemente

    Full Text Available By analyzing the distribution of revenues across the production sectors of quoted firms, we suggest a novel dimension that drives firms' diversification processes at the country level. The data show a nontrivial macro-regional clustering of the diversification process, which underlines the relevance of geopolitical environments in determining the microscopic dynamics of economic entities. These findings demonstrate the possibility of singling out in complex ecosystems those micro-features that emerge at macro-levels, which could be of particular relevance for decision-makers in selecting the appropriate parameters to be acted upon in order to achieve desirable results. The understanding of this micro-macro information exchange is further deepened through the introduction of a simplified dynamic model.

  15. Stability of rotor systems: A complex modelling approach

    DEFF Research Database (Denmark)

    Kliem, Wolfhard; Pommer, Christian; Stoustrup, Jakob

    1998-01-01

    The dynamics of a large class of rotor systems can be modelled by a linearized complex matrix differential equation of second order, Mz̈ + (D + iG)ż + (K + iN)z = 0, where the system matrices M, D, G, K and N are real symmetric. Moreover, M and K are assumed to be positive definite and D ... approach applying bounds of appropriate Rayleigh quotients. The rotor systems tested are: a simple Laval rotor, a Laval rotor with additional elasticity and damping in the bearings, and a number of rotor systems with complex symmetric 4 x 4 randomly generated matrices.
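
    A stability check of this kind can be sketched by companion linearization of the quadratic problem and inspection of eigenvalue real parts; the matrices below are random symmetric placeholders satisfying the stated definiteness assumptions, not an actual rotor model.

        import numpy as np

        rng = np.random.default_rng(7)
        n = 4

        def sym(scale=1.0):
            a = rng.normal(size=(n, n))
            return scale * (a + a.T) / 2       # random real symmetric matrix

        M = sym() @ sym().T + n * np.eye(n)    # positive definite (as assumed)
        K = sym() @ sym().T + n * np.eye(n)    # positive definite (as assumed)
        D = sym() + 2 * np.eye(n)              # damping
        G, N = sym(), sym(0.1)                 # gyroscopic and circulatory parts

        # Companion linearization of  M z'' + (D + iG) z' + (K + iN) z = 0:
        # asymptotically stable iff all eigenvalues have negative real part.
        Minv = np.linalg.inv(M)
        A = np.block([[np.zeros((n, n)), np.eye(n)],
                      [-Minv @ (K + 1j * N), -Minv @ (D + 1j * G)]])
        alpha = np.linalg.eigvals(A).real.max()
        print("largest real part:", alpha, "->", "stable" if alpha < 0 else "unstable")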

  16. Risk Modeling of Interdependent Complex Systems of Systems: Theory and Practice.

    Science.gov (United States)

    Haimes, Yacov Y

    2018-01-01

    The emergence of the complexity characterizing our systems of systems (SoS) requires a reevaluation of the way we model, assess, manage, communicate, and analyze the risk thereto. Current models for risk analysis of emergent complex SoS are insufficient because too often they rely on the same risk functions and models used for single systems. These models commonly fail to incorporate the complexity derived from the networks of interdependencies and interconnectedness (I-I) characterizing SoS. There is a need to reevaluate currently practiced risk analysis to respond to this reality by examining, and thus comprehending, what makes emergent SoS complex. The key to evaluating the risk to SoS lies in understanding the genesis of the I-I characterizing systems, manifested through shared states and other essential entities within and among the systems that constitute SoS. The term "essential entities" includes shared decisions, resources, functions, policies, decision makers, stakeholders, organizational setups, and others. This undertaking can be accomplished by building on state-space theory, which is fundamental to systems engineering and process control. This article presents a theoretical and analytical framework for modeling the risk to SoS, with two case studies performed with the MITRE Corporation, and demonstrates the pivotal contributions made by shared states and other essential entities to the modeling and analysis of the risk to complex SoS. A third case study highlights the multifarious representations of SoS, which require harmonizing the risk analysis process currently applied to single systems when it is applied to complex SoS. © 2017 Society for Risk Analysis.

  17. Using small XML elements to support relevance

    NARCIS (Netherlands)

    G. Ramirez Camps (Georgina); T.H.W. Westerveld (Thijs); A.P. de Vries (Arjen)

    2006-01-01

    Small XML elements are often estimated relevant by the retrieval model, but they are not desirable retrieval units. This paper presents a generic model that exploits the information obtained from small elements. We identify relationships between small and relevant elements and use this

  18. Complexity effects in choice experiments-based models

    NARCIS (Netherlands)

    Dellaert, B.G.C.; Donkers, B.; van Soest, A.H.O.

    2012-01-01

    Many firms rely on choice experiment–based models to evaluate future marketing actions under various market conditions. This research investigates choice complexity (i.e., number of alternatives, number of attributes, and utility similarity between the most attractive alternatives) and individual

  19. Increasing process understanding by analyzing complex interactions in experimental data

    DEFF Research Database (Denmark)

    Naelapaa, Kaisa; Allesø, Morten; Kristensen, Henning Gjelstrup

    2009-01-01

    There is a recognized need for new approaches to understand unit operations with pharmaceutical relevance. A method for analyzing complex interactions in experimental data is introduced. Higher-order interactions do exist between process parameters, which complicate the interpretation ... understanding of a coating process. It was possible to model the response, that is, the amount of drug released, using both mentioned techniques. However, the ANOVA model was difficult to interpret as several interactions between process parameters existed. In contrast to ANOVA, GEMANOVA is especially suited for modeling complex interactions and making easily understandable models of these. GEMANOVA modeling allowed a simple visualization of the entire experimental space. Furthermore, information was obtained on how relative changes in the settings of process parameters influence the film quality and thereby drug ...

  20. A multi-element cosmological model with a complex space-time topology

    Science.gov (United States)

    Kardashev, N. S.; Lipatova, L. N.; Novikov, I. D.; Shatskiy, A. A.

    2015-02-01

    Wormhole models with a complex topology having one entrance and two exits into the same space-time of another universe are considered, as well as models with two entrances from the same space-time and one exit to another universe. These models are used to build a model of a multi-sheeted universe (a multi-element model of the "Multiverse") with a complex topology. Spherical symmetry is assumed in all the models. A Reissner–Nordström black-hole model having no singularity beyond the horizon is constructed. The strength of the central singularity of the black hole is analyzed.

  1. Systems Engineering Metrics: Organizational Complexity and Product Quality Modeling

    Science.gov (United States)

    Mog, Robert A.

    1997-01-01

    Innovative organizational complexity and product quality models applicable to performance metrics for NASA-MSFC's Systems Analysis and Integration Laboratory (SAIL) missions and objectives are presented. An intensive research effort focuses on the synergistic combination of stochastic process modeling, nodal and spatial decomposition techniques, organizational and computational complexity, systems science and metrics, chaos, and proprietary statistical tools for accelerated risk assessment. This is followed by the development of a preliminary model, which is uniquely applicable and robust for quantitative purposes. Exercise of the preliminary model using a generic system hierarchy and the AXAF-I architectural hierarchy is provided. The Kendall test for positive dependence provides an initial verification and validation of the model. Finally, the research and development of the innovation is revisited prior to peer review. This research and development effort results in near-term, measurable SAIL organizational and product quality methodologies, enhanced organizational risk assessment and evolutionary modeling results, and improved statistical quantification of SAIL productivity interests.

  2. Post-closure biosphere assessment modelling: comparison of complex and more stylised approaches.

    Science.gov (United States)

    Walke, Russell C; Kirchner, Gerald; Xu, Shulan; Dverstorp, Björn

    2015-10-01

    Geological disposal facilities are the preferred option for high-level radioactive waste, due to their potential to provide isolation from the surface environment (biosphere) on very long timescales. Assessments need to strike a balance between stylised models and more complex approaches that draw more extensively on site-specific information. This paper explores the relative merits of complex versus more stylised biosphere models in the context of a site-specific assessment. The more complex biosphere modelling approach was developed by the Swedish Nuclear Fuel and Waste Management Co (SKB) for the Forsmark candidate site for a spent nuclear fuel repository in Sweden. SKB's approach is built on a landscape development model, whereby radionuclide releases to distinct hydrological basins/sub-catchments (termed 'objects') are represented as they evolve through land rise and climate change. Each of seventeen of these objects is represented with more than 80 site-specific parameters, about 22 of which are time-dependent, resulting in over 5000 input values per object. The more stylised biosphere models developed for this study represent releases to individual ecosystems without environmental change and include the most plausible transport processes. In the context of regulatory review of the landscape modelling approach adopted in the SR-Site assessment in Sweden, the more stylised representation has helped to build understanding of the more complex modelling approaches by providing bounding results, checking the reasonableness of the more complex modelling, highlighting uncertainties introduced through conceptual assumptions and helping to quantify the conservatisms involved. The more stylised biosphere models are also shown to be capable of reproducing the results of more complex approaches. A major recommendation is that biosphere assessments need to justify the degree of complexity in modelling approaches as well as simplifying and conservative assumptions. In light of

  3. DEVELOPING INDUSTRIAL ROBOT SIMULATION MODEL TUR10-K USING “UNIVERSAL MECHANISM” SOFTWARE COMPLEX

    Directory of Open Access Journals (Sweden)

    Vadim Vladimirovich Chirkov

    2018-02-01

    Full Text Available Manipulation robots are complex spatial mechanical systems with five or six degrees of freedom, and sometimes more. For this reason, modeling the movement of manipulation robots, even in the kinematic formulation, is a complex mathematical task. If one moves from kinematic to dynamic modeling, then the inertial properties of the modeled object must also be taken into account. In this case, analytically constructing the mathematical model of such a complex object as a manipulation robot becomes practically impossible. Therefore, special computer-aided engineering systems, called CAE systems, are used for modeling complex mechanical systems. The aim of this work is to construct a simulation model of a complex mechanical system, the industrial robot TUR10-K, in order to obtain its dynamic characteristics. Developing such models makes it possible to reduce the effort of designing complex systems and to obtain the necessary characteristics. Purpose: Develop a simulation model of the industrial robot TUR10-K and obtain the dynamic characteristics of the mechanism. Methodology: the article uses a computer simulation method. Results: A simulation model of the robot and its dynamic characteristics are obtained. Practical implications: the results can be used in the design of mechanical systems and in various simulation models.

  4. System Testability Analysis for Complex Electronic Devices Based on Multisignal Model

    International Nuclear Information System (INIS)

    Long, B; Tian, S L; Huang, J G

    2006-01-01

    It is necessary to consider system testability problems for electronic devices during their early design phase, because modern electronic devices are becoming smaller and more highly integrated while their functions and structures grow more complex. The multisignal model, which combines the advantages of structure models and dependency models, is used to describe the fault dependency relationships of complex electronic devices, and the main testability indexes used to evaluate testability (including optimal test program, fault detection rate, fault isolation rate, etc.) and the corresponding algorithms are given. The system testability analysis process is illustrated for a USB-GPIB interface circuit with the TEAMS toolbox. The experimental results show that the modelling method is simple and the computation rapid, and that this method is of practical significance for improving the diagnostic capability of complex electronic devices.
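
    As a simple illustration (not the paper's exact algorithm), the following sketch computes fault detection and isolation rates from a small fault-test dependency matrix of the kind a multisignal model induces; the matrix itself is invented.

        import numpy as np

        # D[i, j] = 1 if fault i is observable by test j (invented example).
        D = np.array([[1, 0, 1, 0],
                      [1, 0, 1, 0],      # same signature as fault 0 -> not isolable
                      [0, 1, 0, 0],
                      [0, 0, 0, 0],      # never detected by any test
                      [1, 1, 0, 1]])

        detected = D.any(axis=1)
        fdr = detected.mean()                        # fault detection rate
        rows = [tuple(r) for r in D[detected]]
        fir = sum(rows.count(r) == 1 for r in rows) / len(rows)   # isolation rate
        print(f"FDR = {fdr:.2f}, FIR = {fir:.2f}")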

  5. Epidemic processes in complex networks

    Science.gov (United States)

    Pastor-Satorras, Romualdo; Castellano, Claudio; Van Mieghem, Piet; Vespignani, Alessandro

    2015-07-01

    In recent years the research community has accumulated overwhelming evidence for the emergence of complex and heterogeneous connectivity patterns in a wide range of biological and sociotechnical systems. The complex properties of real-world networks have a profound impact on the behavior of equilibrium and nonequilibrium phenomena occurring in various systems, and the study of epidemic spreading is central to our understanding of the unfolding of dynamical processes in complex networks. The theoretical analysis of epidemic spreading in heterogeneous networks requires the development of novel analytical frameworks, and it has produced results of conceptual and practical relevance. A coherent and comprehensive review of the vast research activity concerning epidemic processes is presented, detailing the successful theoretical approaches as well as making their limits and assumptions clear. Physicists, mathematicians, epidemiologists, computer, and social scientists share a common interest in studying epidemic spreading and rely on similar models for the description of the diffusion of pathogens, knowledge, and innovation. For this reason, while focusing on the main results and the paradigmatic models in infectious disease modeling, the major results concerning generalized social contagion processes are also presented. Finally, the research activity at the forefront in the study of epidemic spreading in coevolving, coupled, and time-varying networks is reported.

  6. New approaches in agent-based modeling of complex financial systems

    Science.gov (United States)

    Chen, Ting-Ting; Zheng, Bo; Li, Yan; Jiang, Xiong-Fei

    2017-12-01

    Agent-based modeling is a powerful simulation technique for understanding collective behavior and microscopic interactions in complex financial systems. Recently, the concept of determining the key parameters of agent-based models from empirical data, instead of setting them artificially, was suggested. We first review several agent-based models and the new approaches for determining their key parameters from historical market data. Based on agents' behaviors with heterogeneous personal preferences and interactions, these models successfully explain the microscopic origin of the temporal and spatial correlations of financial markets. We then present a novel paradigm combining big-data analysis with agent-based modeling. Specifically, from internet query and stock market data, we extract the information driving forces and develop an agent-based model to simulate the dynamic behaviors of complex financial systems.
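
    A toy agent-based market sketch may make the idea concrete: agents imitate the prevailing mood when they reconsider their position, and the change in aggregate demand drives the return. All parameters are illustrative placeholders, not values determined from empirical data as the reviewed approaches advocate.

        import numpy as np

        rng = np.random.default_rng(3)
        n_agents, steps = 500, 2000
        s = rng.choice([-1, 1], n_agents)          # buy (+1) / sell (-1) attitudes
        returns, prev_m = np.zeros(steps), s.mean()

        for t in range(steps):
            flip = rng.random(n_agents) < 0.1      # 10% of agents reconsider each step
            p_buy = 1 / (1 + np.exp(-2 * prev_m))  # herd toward the current mood
            s[flip] = np.where(rng.random(flip.sum()) < p_buy, 1, -1)
            m = s.mean()
            returns[t] = (m - prev_m) + 0.001 * rng.normal()  # demand change + noise
            prev_m = m

        excess_kurt = ((returns - returns.mean()) ** 4).mean() / returns.var() ** 2 - 3
        print("excess kurtosis of simulated returns:", round(float(excess_kurt), 2))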

  7. Profiles of Dialogue for Relevance

    Directory of Open Access Journals (Sweden)

    Douglas Walton

    2016-12-01

    Full Text Available This paper uses argument diagrams, argumentation schemes, and some tools from formal argumentation systems developed in artificial intelligence to build a graph-theoretic model of relevance, shown to be applicable (with some extensions) as a practical method for helping a third party judge issues of relevance or irrelevance of an argument in real examples. Examples used to illustrate how the method works are drawn from disputes about relevance in natural language discourse, including a criminal trial and a parliamentary debate.

  8. Mathematical modelling of complex contagion on clustered networks

    Science.gov (United States)

    O'sullivan, David J.; O'Keeffe, Gary; Fennell, Peter; Gleeson, James

    2015-09-01

    The spreading of behavior, such as the adoption of a new innovation, is influenced by the structure of the social networks that interconnect the population. In the experiments of Centola (Science, 2010), adoption of new behavior was shown to spread further and faster across clustered-lattice networks than across corresponding random networks. This implies that the “complex contagion” effects of social reinforcement are important in such diffusion, in contrast to “simple” contagion models of disease spread, which predict that epidemics would grow more efficiently on random networks than on clustered networks. To accurately model complex contagion on clustered networks remains a challenge because the usual assumptions (e.g. of mean-field theory) regarding tree-like networks are invalidated by the presence of triangles in the network; the triangles are, however, crucial to the social reinforcement mechanism, which posits an increased probability of a person adopting behavior that has been adopted by two or more neighbors. In this paper we modify the analytical approach that was introduced by Hebert-Dufresne et al. (Phys. Rev. E, 2010) to study disease spread on clustered networks. We show how the approximation method can be adapted to a complex contagion model, and confirm the accuracy of the method with numerical simulations. The analytical results of the model enable us to quantify the level of social reinforcement that is required to observe—as in Centola’s experiments—faster diffusion on clustered topologies than on random networks.
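
    The reinforcement mechanism is easy to simulate directly. The sketch below compares a threshold-2 contagion on a clustered ring lattice with its randomly rewired counterpart; network size, degree and seeding are arbitrary assumptions, much simpler than the paper's analytical treatment.

        import networkx as nx

        def spread(p, n=1000, k=6, theta=2, seed=1):
            """Final adoption fraction of a threshold-theta contagion on a WS graph."""
            G = nx.watts_strogatz_graph(n, k, p, seed=seed)
            adopted = set(range(k))                # seed a contiguous neighbourhood
            new = adopted
            while new:
                new = {v for v in G if v not in adopted
                       and sum(u in adopted for u in G.neighbors(v)) >= theta}
                adopted |= new
            return len(adopted) / n

        print("final adoption, clustered lattice (p=0):", spread(0.0))
        print("final adoption, rewired random (p=1):  ", spread(1.0))

    On the clustered lattice the cascade completes, while on the rewired network almost no node ever sees two adopted neighbours, reproducing the qualitative contrast described above.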

  9. Mathematical modelling of complex contagion on clustered networks

    Directory of Open Access Journals (Sweden)

    David J. P. O'Sullivan

    2015-09-01

    Full Text Available The spreading of behavior, such as the adoption of a new innovation, is influenced by the structure of the social networks that interconnect the population. In the experiments of Centola (Science, 2010), adoption of new behavior was shown to spread further and faster across clustered-lattice networks than across corresponding random networks. This implies that the complex contagion effects of social reinforcement are important in such diffusion, in contrast to simple contagion models of disease spread, which predict that epidemics would grow more efficiently on random networks than on clustered networks. To accurately model complex contagion on clustered networks remains a challenge because the usual assumptions (e.g. of mean-field theory) regarding tree-like networks are invalidated by the presence of triangles in the network; the triangles are, however, crucial to the social reinforcement mechanism, which posits an increased probability of a person adopting behavior that has been adopted by two or more neighbors. In this paper we modify the analytical approach that was introduced by Hebert-Dufresne et al. (Phys. Rev. E, 2010) to study disease spread on clustered networks. We show how the approximation method can be adapted to a complex contagion model, and confirm the accuracy of the method with numerical simulations. The analytical results of the model enable us to quantify the level of social reinforcement that is required to observe—as in Centola’s experiments—faster diffusion on clustered topologies than on random networks.

  10. Review: To be or not to be an identifiable model. Is this a relevant question in animal science modelling?

    Science.gov (United States)

    Muñoz-Tamayo, R; Puillet, L; Daniel, J B; Sauvant, D; Martin, O; Taghipoor, M; Blavy, P

    2018-04-01

    What is a good (useful) mathematical model in animal science? For models constructed for prediction purposes, the question of model adequacy (usefulness) has been traditionally tackled by statistical analysis applied to observed experimental data relative to model-predicted variables. However, little attention has been paid to analytic tools that exploit the mathematical properties of the model equations. For example, in the context of model calibration, before attempting a numerical estimation of the model parameters, we might want to know if we have any chance of success in estimating a unique best value of the model parameters from available measurements. This question of uniqueness is referred to as structural identifiability: a mathematical property that is defined on the sole basis of the model structure within a hypothetical ideal experiment determined by a setting of model inputs (stimuli) and observable variables (measurements). Structural identifiability analysis applied to dynamic models described by ordinary differential equations (ODEs) is a common practice in control engineering and system identification. This analysis demands mathematical technicalities that are beyond the academic background of animal science, which might explain the lack of pervasiveness of identifiability analysis in animal science modelling. To fill this gap, in this paper we address the analysis of structural identifiability from a practitioner perspective by capitalizing on the use of dedicated software tools. Our objectives are (i) to provide a comprehensive explanation of the structural identifiability notion for the community of animal science modelling, (ii) to assess the relevance of identifiability analysis in animal science modelling and (iii) to motivate the community to use identifiability analysis in the modelling practice (when the identifiability question is relevant). We focus our study on ODE models. By using illustrative examples that include published
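
    A toy symbolic calculation can convey the notion (dedicated tools such as DAISY or GenSSI are used in practice): for dx/dt = -(k1 + k2)x observed through y = x, the output derivatives constrain only the sum k1 + k2, so the individual rates are structurally non-identifiable.

        import sympy as sp

        k1, k2, x0, t, s = sp.symbols('k1 k2 x0 t s', positive=True)
        x = x0 * sp.exp(-(k1 + k2) * t)        # closed-form solution of dx/dt = -(k1+k2)x

        # Output derivatives at t = 0 depend on the parameters only through k1 + k2.
        derivs = [sp.simplify(sp.diff(x, t, n).subs(t, 0)) for n in range(4)]
        print(derivs)                          # [x0, -(k1+k2)*x0, (k1+k2)**2*x0, ...]

        # Knowing the observable slope fixes k1 + k2 = s, but k1 only up to s - k2:
        print(sp.solve(sp.Eq(derivs[1], -s * x0), k1))   # -> [s - k2]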

  11. Uncertainty and validation. Effect of model complexity on uncertainty estimates

    Energy Technology Data Exchange (ETDEWEB)

    Elert, M. [Kemakta Konsult AB, Stockholm (Sweden)] [ed.

    1996-09-01

    In the Model Complexity subgroup of BIOMOVS II, models of varying complexity have been applied to the problem of downward transport of radionuclides in soils. A scenario describing a case of surface contamination of a pasture soil was defined. Three different radionuclides with different environmental behavior and radioactive half-lives were considered: Cs-137, Sr-90 and I-129. The intention was to give a detailed specification of the parameters required by different kinds of model, together with reasonable values for the parameter uncertainty. A total of seven modelling teams participated in the study, using 13 different models. Four of the modelling groups performed uncertainty calculations using nine different modelling approaches. The models used range in complexity from analytical solutions of a 2-box model using annual average data to numerical models coupling hydrology and transport using data varying on a daily basis. The complex models needed to consider all aspects of radionuclide transport in a soil with a variable hydrology are often impractical to use in safety assessments. Instead simpler models, often box models, are preferred. The comparison of predictions made with the complex models and the simple models for this scenario shows that the predictions in many cases are very similar, e.g. in the predictions of the evolution of the root zone concentration. However, in other cases differences of many orders of magnitude can appear. One example is the prediction of the flux to the groundwater of radionuclides being transported through the soil column. Some issues that have come into focus in this study: There are large differences in the predicted soil hydrology and, as a consequence, also in the radionuclide transport, which suggests that there are large uncertainties in the calculation of effective precipitation and evapotranspiration. The approach used for modelling the water transport in the root zone has an impact on the predictions of the decline in root
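
    For orientation, a two-box model of the simplest kind used in the exercise can be sketched in a few lines; the leaching rates below are illustrative placeholders, not values from the BIOMOVS II scenario.

        import numpy as np

        half_life = 30.2                 # years (Cs-137)
        lam = np.log(2) / half_life      # radioactive decay constant (1/y)
        k12, k2out = 0.05, 0.02          # assumed leaching rates (1/y)
        dt, years = 0.01, 100.0
        c1, c2, to_gw = 1.0, 0.0, 0.0    # unit inventory starts in the root zone

        for _ in range(int(years / dt)):
            d1 = -(lam + k12) * c1                     # root zone: decay + leaching
            d2 = k12 * c1 - (lam + k2out) * c2         # deep soil box
            to_gw += k2out * c2 * dt                   # cumulative flux to groundwater
            c1, c2 = c1 + d1 * dt, c2 + d2 * dt
        print(f"after {years:.0f} y: root zone {c1:.3e}, deep soil {c2:.3e}, "
              f"to groundwater {to_gw:.3e}")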

  12. Advances in dynamic network modeling in complex transportation systems

    CERN Document Server

    Ukkusuri, Satish V

    2013-01-01

    This book focuses on the latest in dynamic network modeling, including route guidance and traffic control in transportation systems and other complex infrastructure networks. Covers dynamic traffic assignment, flow modeling, mobile sensor deployment and more.

  13. A relativistic density functional study of uranyl hydrolysis and complexation by carboxylic acids in aqueous solution

    International Nuclear Information System (INIS)

    Ray, Rupashree Shyama

    2009-01-01

    In this work, the complexation of uranium in its most stable oxidation state, VI, in aqueous solution was studied computationally within the framework of density functional (DF) theory. The thesis is divided into the following parts: Chapter 2 briefly summarizes the relevant general aspects of actinide chemistry and then focuses on actinide environmental chemistry. Experimental results on hydrolysis, actinide complexation by carboxylic acids, and humic substances are presented to establish a background for the subsequent discussion. Chapter 3 describes the computational method used in this work and the relevant features of the parallel quantum chemistry code PARAGAUSS employed. First, the most relevant basics of the applied density functional approach are presented, focusing on relativistic effects. Then the treatment of solvent effects, essential for an adequate modeling of actinide species in aqueous solution, is introduced. At the end of this chapter, computational parameters and procedures are summarized. Chapter 4 presents the computational results, including a comparison to available experimental data. In the beginning, the mononuclear hydrolysis product of UO2^2+, [UO2OH]^+, is discussed. The second part deals with actinide complexation by carboxylate ligands. First, the coordination number of uranyl acetate is discussed with respect to implications for the complexation of actinides by humic substances, followed by uranyl complexation by aromatic carboxylic acids in comparison with earlier results for aliphatic ones. Finally, ternary uranyl-hydroxo-acetate complexes are discussed as models of uranyl humate complexation at ambient conditions.

  14. Developing an agent-based model on how different individuals solve complex problems

    Directory of Open Access Journals (Sweden)

    Ipek Bozkurt

    2015-01-01

    Full Text Available Purpose: Research that focuses on the emotional, mental, behavioral and cognitive capabilities of individuals has been abundant within disciplines such as psychology, sociology, and anthropology, among others. However, when facing complex problems, a new perspective for understanding individuals is necessary. The main purpose of this paper is to develop an agent-based model and simulation to gain understanding of the decision-making and problem-solving abilities of individuals. Design/Methodology/approach: The micro-level analysis paradigm of Agent-Based Modeling and simulation is used. Through the use of Agent-Based Modeling, insight is gained into how different individuals with different profiles deal with complex problems. Using previous literature from different bodies of knowledge, established theories and certain assumptions as input parameters, a model is built and executed through a computer simulation. Findings: The results indicate that individuals with certain profiles have better capabilities to deal with complex problems. Moderate profiles could solve the entire complex problem, whereas profiles at the extremes could not. This indicates that having a strong predisposition is not ideal when approaching complex problems; there should always be a component from the other perspective. The probability that an individual may use the capabilities provided by the opposite predisposition proves to be a useful option. Originality/value: The originality of the present research stems from how individuals are profiled, and from the model and simulation built to understand how they solve complex problems. The development of the agent-based model adds value to the existing body of knowledge within both the social sciences and modeling and simulation.

  15. New Perspectives on Rodent Models of Advanced Paternal Age: Relevance to Autism

    Directory of Open Access Journals (Sweden)

    Claire J Foldi

    2011-06-01

    Full Text Available Offspring of older fathers have an increased risk of various adverse health outcomes, including autism and schizophrenia. With respect to biological mechanisms for this association, there are many more germline cell divisions in the life history of a sperm relative to that of an oocyte. This leads to more opportunities for copy-error mutations in germ cells from older fathers. Evidence also suggests that epigenetic patterning in the sperm of older men is altered. Rodent models provide an experimental platform to examine the association between paternal age and brain development. Several rodent models of advanced paternal age (APA) have been published with relevance to intermediate phenotypes related to autism. All four published APA models vary in key features, creating a lack of consistency with respect to behavioural phenotypes. A consideration of common phenotypes that emerge from these APA-related mouse models may be informative in the exploration of the molecular and neurobiological correlates of APA.

  16. Nostalgia's place among self-relevant emotions.

    Science.gov (United States)

    van Tilburg, Wijnand A P; Wildschut, Tim; Sedikides, Constantine

    2017-07-24

    How is nostalgia positioned among self-relevant emotions? We tested, in six studies, which self-relevant emotions are perceived as most similar versus least similar to nostalgia, and what underlies these similarities/differences. We used multidimensional scaling to chart the perceived similarities/differences among self-relevant emotions, resulting in two-dimensional models. The results were revealing. Nostalgia is positioned among self-relevant emotions characterised by positive valence, an approach orientation, and low arousal. Nostalgia most resembles pride and self-compassion, and least resembles embarrassment and shame. Our research pioneered the integration of nostalgia among self-relevant emotions.

  17. Between Complexity and Parsimony: Can Agent-Based Modelling Resolve the Trade-off

    DEFF Research Database (Denmark)

    Nielsen, Helle Ørsted; Malawska, Anna Katarzyna

    2013-01-01

    While Herbert Simon espoused the development of general models of behavior, he also strongly advocated that these models be based on realistic assumptions about humans and therefore reflect the complexity of human cognition and social systems (Simon 1997). Hence, the model of bounded rationality ... to BR-based policy studies would be to couple research on bounded rationality with agent-based modeling. Agent-based models (ABMs) are computational models for simulating the behavior and interactions of any number of decision makers in a dynamic system. Agent-based models are better suited than general equilibrium models for capturing behavior patterns of complex systems. ABMs may have the potential to represent complex systems without oversimplifying them. At the same time, research in bounded rationality and behavioral economics has already yielded many insights that could inform the modeling ...

  18. Generation of Complex Karstic Conduit Networks with a Hydro-chemical Model

    Science.gov (United States)

    De Rooij, R.; Graham, W. D.

    2016-12-01

    The discrete-continuum approach is very well suited to simulating flow and solute transport within karst aquifers. Using this approach, discrete one-dimensional conduits are embedded within a three-dimensional continuum representative of the porous limestone matrix. Typically, however, little is known about the geometry of the karstic conduit network, so the discrete-continuum approach is rarely used for practical applications. It may be argued, however, that the uncertainty associated with the geometry of the network could be handled by modeling an ensemble of possible karst conduit networks within a stochastic framework. We propose to generate stochastically realistic karst conduit networks by simulating the widening of conduits caused by the dissolution of limestone over geologically relevant timescales. We illustrate that advanced numerical techniques make it possible to solve the non-linear, coupled hydro-chemical processes efficiently, such that relatively large and complex networks can be generated in acceptable time frames. Instead of specifying flow boundary conditions on conduit cells to recharge the network, as is typically done in classical speleogenesis models, we specify an effective rainfall rate over the land surface and let the model physics determine the amount of water entering the network. This is advantageous since the amount of water entering the network is extremely difficult to reconstruct, whereas the effective rainfall rate may be quantified using paleoclimatic data. Furthermore, we show that poorly known flow conditions may be constrained by requiring a realistic flow field. Using our speleogenesis model we have investigated factors that influence the geometry of simulated conduit networks. We illustrate that our model generates typical branchwork, network and anastomotic conduit systems. Flow, solute transport and water ages in karst aquifers are simulated using a few illustrative networks.
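
    The feedback at the heart of such speleogenesis models (wider conduits capture more flow and therefore dissolve faster) can be caricatured in a few lines; the constants and the two-conduit geometry below are illustrative assumptions, not the paper's coupled 3-D model.

      import numpy as np

      # Two parallel proto-conduits compete for the same head difference.
      # Laminar (Hagen-Poiseuille) discharge scales as r**4, and the wall
      # retreat rate is taken proportional to each conduit's share of the flow.
      r = np.array([1.00e-3, 1.05e-3])   # initial radii (m), nearly equal
      k_widen = 5e-5                     # widening per step (illustrative)

      for step in range(2000):
          Q = r**4                       # relative discharge for a fixed head loss
          r = r + k_widen * Q / Q.sum()  # dissolution favours the larger conduit

      print("final radii (m):", r)       # the initially wider conduit runs away

    This runaway selection of a few favoured pathways is the basic ingredient behind the branchwork geometries such models produce.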

  19. Complex groundwater flow systems as traveling agent models

    Directory of Open Access Journals (Sweden)

    Oliver López Corona

    2014-10-01

    Full Text Available Analyzing field data from pumping tests, we show that, as with many other natural phenomena, groundwater flow exhibits complex dynamics described by a 1/f power spectrum. This result is studied theoretically from an agent perspective. Using a traveling agent model, we prove that this statistical behavior emerges when the medium is complex. Some heuristic reasoning is provided to justify both spatial and dynamic complexity as the result of the superposition of an infinite number of stochastic processes. Moreover, we show that this implies that non-Kolmogorovian probability is needed for its study, and we provide a set of new partial differential equations for groundwater flow.
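
    The superposition heuristic mentioned in the abstract is easy to reproduce numerically: summing AR(1) processes whose relaxation times are spread log-uniformly yields an approximately 1/f spectrum. The series length, time constants and frequency band below are arbitrary choices for illustration.

      import numpy as np

      rng = np.random.default_rng(0)
      n = 2**14
      signal = np.zeros(n)
      # Superpose unit-variance AR(1) processes with log-spaced relaxation times;
      # a log-uniform spread of time constants is the classic route to 1/f noise.
      for tau in np.logspace(0.5, 3.5, 20):
          a = np.exp(-1.0 / tau)
          eps = rng.standard_normal(n) * np.sqrt(1 - a**2)
          x = np.zeros(n)
          for t in range(1, n):
              x[t] = a * x[t - 1] + eps[t]
          signal += x

      # Periodogram and log-log slope over a mid-frequency band
      freqs = np.fft.rfftfreq(n, d=1.0)[1:]
      power = np.abs(np.fft.rfft(signal - signal.mean()))[1:] ** 2
      band = (freqs > 3e-4) & (freqs < 3e-2)
      slope, _ = np.polyfit(np.log(freqs[band]), np.log(power[band]), 1)
      print(f"estimated spectral exponent: {-slope:.2f} (1.0 would be pure 1/f)")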

  20. Impact relevance and usability of high resolution climate modeling and data

    Energy Technology Data Exchange (ETDEWEB)

    Arnott, James C. [Aspen Global Change Inst., Basalt, CO (United States)

    2016-10-30

    The Aspen Global Change Institute hosted a technical science workshop entitled, “Impact Relevance and Usability of High-Resolution Climate Modeling and Datasets,” on August 2-7, 2015 in Aspen, CO. Kate Calvin (Pacific Northwest National Laboratory), Andrew Jones (Lawrence Berkeley National Laboratory) and Jean-François Lamarque (NCAR) served as co-chairs for the workshop. The meeting included the participation of 29 scientists for a total of 145 participant days. Following the workshop, workshop co-chairs authored a meeting report published in Eos on April 27, 2016. Insights from the workshop directly contributed to the formation of a new DOE-supported project co-led by workshop co-chair Andy Jones. A subset of meeting participants continue to work on a publication on institutional innovations that can support the usability of high resolution modeling, among other sources of climate information.

  1. Complex data modeling and computationally intensive methods for estimation and prediction

    CERN Document Server

    Secchi, Piercesare; Advances in Complex Data Modeling and Computational Methods in Statistics

    2015-01-01

    The book is addressed to statisticians working at the forefront of the statistical analysis of complex and high dimensional data and offers a wide variety of statistical models, computer intensive methods and applications: network inference from the analysis of high dimensional data; new developments for bootstrapping complex data; regression analysis for measuring the downsize reputational risk; statistical methods for research on the human genome dynamics; inference in non-Euclidean settings and for shape data; Bayesian methods for reliability and the analysis of complex data; methodological issues in using administrative data for clinical and epidemiological research; regression models with differential regularization; geostatistical methods for mobility analysis through mobile phone data exploration. This volume is the result of a careful selection among the contributions presented at the conference "S.Co.2013: Complex data modeling and computationally intensive methods for estimation and prediction" held...

  2. Functionally relevant climate variables for arid lands: A climatic water deficit approach for modelling desert shrub distributions

    Science.gov (United States)

    Thomas E. Dilts; Peter J. Weisberg; Camie M. Dencker; Jeanne C. Chambers

    2015-01-01

    We have three goals. (1) To develop a suite of functionally relevant climate variables for modelling vegetation distribution on arid and semi-arid landscapes of the Great Basin, USA. (2) To compare the predictive power of vegetation distribution models based on mechanistically proximate factors (water deficit variables) and factors that are more mechanistically removed...

  3. Development of a novel, physiologically relevant cytotoxicity model: Application to the study of chemotherapeutic damage to mesenchymal stromal cells

    International Nuclear Information System (INIS)

    May, Jennifer E.; Morse, H. Ruth; Xu, Jinsheng; Donaldson, Craig

    2012-01-01

    There is an increasing need for the development of physiologically relevant in-vitro models for testing toxicity; however, determining the toxic effects of agents which undergo extensive hepatic metabolism can be particularly challenging. If a source of such metabolic enzymes is inadequate within a model system, toxicity from prodrugs may be grossly underestimated. Conversely, the vast majority of agents are detoxified by the liver; consequently, toxicity from such agents may be overestimated. In this study we describe the development of a novel in-vitro model, which could be adapted for any toxicology setting. The model utilises HepG2 liver spheroids as a source of metabolic enzymes, which have been shown to more closely resemble human liver than traditional monolayer cultures. A co-culture model has been developed enabling the effect of any metabolised agent on another cell type to be assessed. This has been optimised to enable the study of the damaging effects of chemotherapy on mesenchymal stem cells (MSC), the supportive stem cells of the bone marrow. Several optimisation steps were undertaken, including determining optimal culture conditions, confirming hepatic P450 enzyme activity and ensuring physiologically relevant doses of chemotherapeutic agents were appropriate for use within the model. The developed model was subsequently validated using several chemotherapeutic agents, both prodrugs and active drugs, with the resulting MSC damage closely resembling effects seen in patients following chemotherapy. Minimal modifications would enable this novel co-culture model to be utilised as a general toxicity model, contributing to the drive to reduce animal safety testing and enabling physiologically relevant in-vitro study. -- Highlights: ► An in vitro model was developed for the study of drugs requiring hepatic metabolism ► HepG2 spheroids were utilised as a physiologically relevant source of liver enzymes ► The model was optimised to enable study of chemotherapeutic

  4. Development of a novel, physiologically relevant cytotoxicity model: Application to the study of chemotherapeutic damage to mesenchymal stromal cells

    Energy Technology Data Exchange (ETDEWEB)

    May, Jennifer E., E-mail: Jennifer2.May@uwe.ac.uk; Morse, H. Ruth, E-mail: Ruth.Morse@uwe.ac.uk; Xu, Jinsheng, E-mail: Jinsheng.Xu@uwe.ac.uk; Donaldson, Craig, E-mail: Craig.Donaldson@uwe.ac.uk

    2012-09-15

    There is an increasing need for the development of physiologically relevant in-vitro models for testing toxicity; however, determining the toxic effects of agents which undergo extensive hepatic metabolism can be particularly challenging. If a source of such metabolic enzymes is inadequate within a model system, toxicity from prodrugs may be grossly underestimated. Conversely, the vast majority of agents are detoxified by the liver; consequently, toxicity from such agents may be overestimated. In this study we describe the development of a novel in-vitro model, which could be adapted for any toxicology setting. The model utilises HepG2 liver spheroids as a source of metabolic enzymes, which have been shown to more closely resemble human liver than traditional monolayer cultures. A co-culture model has been developed enabling the effect of any metabolised agent on another cell type to be assessed. This has been optimised to enable the study of the damaging effects of chemotherapy on mesenchymal stem cells (MSC), the supportive stem cells of the bone marrow. Several optimisation steps were undertaken, including determining optimal culture conditions, confirming hepatic P450 enzyme activity and ensuring physiologically relevant doses of chemotherapeutic agents were appropriate for use within the model. The developed model was subsequently validated using several chemotherapeutic agents, both prodrugs and active drugs, with the resulting MSC damage closely resembling effects seen in patients following chemotherapy. Minimal modifications would enable this novel co-culture model to be utilised as a general toxicity model, contributing to the drive to reduce animal safety testing and enabling physiologically relevant in-vitro study. -- Highlights: ► An in vitro model was developed for the study of drugs requiring hepatic metabolism ► HepG2 spheroids were utilised as a physiologically relevant source of liver enzymes ► The model was optimised to enable study of chemotherapeutic

  5. Strategies to reduce the complexity of hydrologic data assimilation for high-dimensional models

    Science.gov (United States)

    Hernandez, F.; Liang, X.

    2017-12-01

    Probabilistic forecasts in the geosciences offer invaluable information by allowing estimation of the uncertainty of predicted conditions (including threats like floods and droughts). However, while forecast systems based on modern data assimilation algorithms are capable of producing multi-variate probability distributions of future conditions, the computational resources required to fully characterize the dependencies between the model's state variables render their application impractical for high-resolution cases. This occurs because of the quadratic space complexity of storing the covariance matrices that encode these dependencies and the cubic time complexity of performing inference operations with them. In this work we introduce two complementary strategies to reduce the size of the covariance matrices that are at the heart of Bayesian assimilation methods (such as some variants of ensemble Kalman filters and particle filters) and variational methods. The first strategy involves the optimized grouping of state variables by clustering individual cells of the model into "super-cells." A dynamic fuzzy clustering approach is used to take into account the states (e.g., soil moisture) and forcings (e.g., precipitation) of each cell at each time step. The second strategy consists in finding a compressed representation of the covariance matrix that still encodes the most relevant information but that can be more efficiently stored and processed. A learning algorithm and a belief-propagation inference algorithm are developed to take advantage of this modified low-rank representation. The two proposed strategies are incorporated into OPTIMISTS, a state-of-the-art hybrid Bayesian/variational data assimilation algorithm, and comparative streamflow forecasting tests are performed using two watersheds modeled with the Distributed Hydrology Soil Vegetation Model (DHSVM). Contrasts are made between the efficiency gains and forecast accuracy losses of each strategy used in
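
    The second strategy, a compressed covariance representation, can be sketched with a truncated SVD of ensemble anomalies, which stores and applies the covariance without ever forming the full matrix. The state and ensemble sizes below are illustrative, and this is a generic low-rank sketch rather than OPTIMISTS' actual learning and belief-propagation machinery.

      import numpy as np

      rng = np.random.default_rng(42)
      n_state, n_ens = 2000, 50          # state size vs. ensemble size (illustrative)
      ens = rng.standard_normal((n_state, n_ens))
      anomalies = ens - ens.mean(axis=1, keepdims=True)

      # Rank-k factorisation P ~ U diag(s^2) U^T from the SVD of the anomalies,
      # avoiding the full n_state x n_state covariance entirely.
      k = 10
      U, s, _ = np.linalg.svd(anomalies / np.sqrt(n_ens - 1), full_matrices=False)
      U_k, s_k = U[:, :k], s[:k]

      def apply_cov(v):
          """Matrix-vector product P v using only the low-rank factors."""
          return U_k @ (s_k**2 * (U_k.T @ v))

      v = rng.standard_normal(n_state)
      print("full covariance entries :", n_state * n_state)
      print("low-rank entries        :", U_k.size + s_k.size)
      print("P v (first 3 entries)   :", apply_cov(v)[:3])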

  6. Modeling Air-Quality in Complex Terrain Using Mesoscale and ...

    African Journals Online (AJOL)

    Air-quality in a complex terrain (Colorado-River-Valley/Grand-Canyon Area, Southwest U.S.) is modeled using a higher-order closure mesoscale model and a higher-order closure dispersion model. Non-reactive tracers have been released in the Colorado-River valley, during winter and summer 1992, to study the ...

  7. Near-infrared 808 nm light boosts complex IV-dependent respiration and rescues a Parkinson-related pink1 model.

    Directory of Open Access Journals (Sweden)

    Melissa Vos

    Full Text Available Mitochondrial electron transport chain (ETC) defects are observed in Parkinson's disease (PD) patients and in PD fly- and mouse-models; however, it remains to be tested if acute improvement of ETC function alleviates PD-relevant defects. We tested the hypothesis that 808 nm infrared light that effectively penetrates tissues rescues pink1 mutants. We show that irradiating isolated fly or mouse mitochondria with 808 nm light that is absorbed by ETC-Complex IV acutely improves Complex IV-dependent oxygen consumption and ATP production, a feature that is wavelength-specific. Irradiating Drosophila pink1 mutants using a single dose of 808 nm light results in a rescue of major systemic and mitochondrial defects. Time-course experiments indicate mitochondrial membrane potential defects are rescued prior to mitochondrial morphological defects, also in dopaminergic neurons, suggesting mitochondrial functional defects precede mitochondrial swelling. Thus, our data indicate that improvement of mitochondrial function using infrared light stimulation is a viable strategy to alleviate pink1-related defects.

  8. Stem cell therapy for joint problems using the horse as a clinically relevant animal model

    DEFF Research Database (Denmark)

    Koch, Thomas Gadegaard; Betts, Dean H.

    2007-01-01

    of experimentally induced lesions. The horse lends itself as a good animal model of spontaneous joint disorders that are clinically relevant to similar human disorders. Equine stem cell and tissue engineering studies may be financially feasible to principal investigators and small biotechnology companies...

  9. a Range Based Method for Complex Facade Modeling

    Science.gov (United States)

    Adami, A.; Fregonese, L.; Taffurelli, L.

    2011-09-01

    3d modelling of Architectural Heritage does not follow a single well-defined path, but proceeds through different algorithms and digital forms according to the shape complexity of the object, the main goal of the representation and the starting data. Even if the process starts from the same data, such as a pointcloud acquired by laser scanner, there are different possibilities for realizing a digital model. In particular we can choose between two different attitudes: the mesh and the solid model. In the first case the complexity of architecture is represented by a dense net of triangular surfaces which approximates the real surface of the object. In the other, opposite, case the 3d digital model can be realized by the use of simple geometrical shapes, by the use of sweeping algorithms and by Boolean operations. Obviously these two models are not the same, and each one is characterized by some peculiarities concerning the way of modelling (the choice of a particular triangulation algorithm or the quasi-automatic modelling by known shapes) and the final results (a more detailed and complex mesh versus an approximate and simpler solid model). Usually the expected final representation and the possibility of publishing lead to one way or the other. In this paper we want to suggest a semiautomatic process to build 3d digital models of the facades of complex architecture to be used, for example, in city models or in other large scale representations. This way of modelling also guarantees small files to be published on the web or to be transmitted. The modelling procedure starts from laser scanner data which can be processed in the well known way. Usually more than one scan is necessary to describe a complex architecture and to avoid shadows on the facades. These have to be registered in a single reference system by the use of targets which are surveyed by topography, and then filtered in order to obtain a well controlled and homogeneous point cloud of

  10. A RANGE BASED METHOD FOR COMPLEX FACADE MODELING

    Directory of Open Access Journals (Sweden)

    A. Adami

    2012-09-01

    Full Text Available 3d modelling of Architectural Heritage does not follow a single well-defined path, but proceeds through different algorithms and digital forms according to the shape complexity of the object, the main goal of the representation and the starting data. Even if the process starts from the same data, such as a pointcloud acquired by laser scanner, there are different possibilities for realizing a digital model. In particular we can choose between two different attitudes: the mesh and the solid model. In the first case the complexity of architecture is represented by a dense net of triangular surfaces which approximates the real surface of the object. In the other, opposite, case the 3d digital model can be realized by the use of simple geometrical shapes, by the use of sweeping algorithms and by Boolean operations. Obviously these two models are not the same, and each one is characterized by some peculiarities concerning the way of modelling (the choice of a particular triangulation algorithm or the quasi-automatic modelling by known shapes) and the final results (a more detailed and complex mesh versus an approximate and simpler solid model). Usually the expected final representation and the possibility of publishing lead to one way or the other. In this paper we want to suggest a semiautomatic process to build 3d digital models of the facades of complex architecture to be used, for example, in city models or in other large scale representations. This way of modelling also guarantees small files to be published on the web or to be transmitted. The modelling procedure starts from laser scanner data which can be processed in the well known way. Usually more than one scan is necessary to describe a complex architecture and to avoid shadows on the facades. These have to be registered in a single reference system by the use of targets which are surveyed by topography, and then filtered in order to obtain a well controlled and

  11. Relevant Subspace Clustering

    DEFF Research Database (Denmark)

    Müller, Emmanuel; Assent, Ira; Günnemann, Stephan

    2009-01-01

    Subspace clustering aims at detecting clusters in any subspace projection of a high dimensional space. As the number of possible subspace projections is exponential in the number of dimensions, the result is often tremendously large. Recent approaches fail to reduce results to relevant subspace...... clusters. Their results are typically highly redundant, i.e. many clusters are detected multiple times in several projections. In this work, we propose a novel model for relevant subspace clustering (RESCU). We present a global optimization which detects the most interesting non-redundant subspace clusters...... achieves top clustering quality while competing approaches show greatly varying performance....

  12. Looking for a relevant potential evapotranspiration model at the watershed scale

    Science.gov (United States)

    Oudin, L.; Hervieu, F.; Michel, C.; Perrin, C.; Anctil, F.; Andréassian, V.

    2003-04-01

    In this paper, we try to identify the most relevant approach to calculating Potential Evapotranspiration (PET) for use in a daily watershed model, in order to answer the following question: "how can we use commonly available atmospheric parameters to represent the evaporative demand at the catchment scale?". Hydrologists generally see the Penman model as the ideal model, given its good agreement with lysimeter measurements and its physically-based formulation. However, in real-world engineering situations, where meteorological stations are scarce, hydrologists are often constrained to use other PET formulae with lower data requirements and/or long-term averages of PET values (the rationale being that PET is an inherently conservative variable). We chose to test 28 commonly used PET models coupled with 4 different daily watershed models. For each test, we compare both PET input options: actual data and long-term average data. The comparison is made in terms of streamflow simulation efficiency, over a large sample of 308 watersheds. The watersheds are located in France, Australia and the United States of America and represent varied climates. Strikingly, we find no systematic improvement of watershed model efficiency when using actual PET series instead of long-term averages. This suggests either that watershed models may not conveniently use the climatic information contained in PET values or that the formulae are only awkward indicators of the real PET which watershed models need.
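
    Streamflow simulation efficiency in such comparisons is typically scored with the Nash-Sutcliffe efficiency; a minimal sketch, with placeholder series standing in for one watershed (the study repeats this over 308 watersheds and 4 models):

      import numpy as np

      def nash_sutcliffe(obs, sim):
          """Nash-Sutcliffe efficiency: 1 is perfect, 0 matches the mean benchmark."""
          obs, sim = np.asarray(obs, float), np.asarray(sim, float)
          return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

      # Placeholder streamflow values (illustrative, not data from the study)
      obs = np.array([1.2, 3.4, 2.1, 0.9, 5.6, 4.2, 2.8])
      sim_actual_pet = np.array([1.1, 3.0, 2.4, 1.0, 5.1, 4.5, 2.6])  # daily PET input
      sim_mean_pet = np.array([1.0, 3.1, 2.3, 1.1, 5.2, 4.4, 2.7])    # long-term mean PET

      print("NSE with actual PET    :", round(nash_sutcliffe(obs, sim_actual_pet), 3))
      print("NSE with long-term PET :", round(nash_sutcliffe(obs, sim_mean_pet), 3))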

  13. A cognitive model for software architecture complexity

    NARCIS (Netherlands)

    Bouwers, E.; Lilienthal, C.; Visser, J.; Van Deursen, A.

    2010-01-01

    Evaluating the complexity of the architecture of a software system is a difficult task. Many aspects have to be considered to come to a balanced assessment. Several architecture evaluation methods have been proposed, but very few define a quality model to be used during the evaluation process. In

  14. Effect of 2-chloro-substitution of adenine moiety in mixed-ligand gold(I) triphenylphosphine complexes on anti-inflammatory activity: the discrepancy between the in vivo and in vitro models.

    Directory of Open Access Journals (Sweden)

    Jan Hošek

    Full Text Available A series of gold(I) triphenylphosphine (PPh3) complexes (1-9) involving 2-chloro-N6-(substituted-benzyl)adenine derivatives as N-donor ligands was synthesized and thoroughly characterized by relevant methods, including electrospray-ionization (ESI) mass spectrometry and multinuclear NMR spectroscopy. The anti-inflammatory and antiedematous effects of three representatives (1, 5 and 9) were evaluated by means of an in vitro model based on the expression of pro- and anti-inflammatory cytokines and the influence of the complexes on selected forms of matrix metalloproteinases secreted by LPS-activated THP-1 monocytes, and an in vivo model evaluating the antiedematous effect of the complexes in the carrageenan-induced rat hind-paw edema model. In addition to the pharmacological observations, the affected hind paws were post mortem subjected to histological and immunohistochemical evaluations. The results of both the in vivo and ex vivo methods revealed low antiedematous and anti-inflammatory effects of the complexes, even though the in vitro model identified them as promising anti-inflammatory compounds. The reason for this discrepancy probably lies in the low stability of the studied complexes in a biological environment, as demonstrated by solution interaction studies with sulfur-containing biomolecules (cysteine and reduced glutathione) using ESI mass spectrometry.

  15. Reduced Complexity Volterra Models for Nonlinear System Identification

    Directory of Open Access Journals (Sweden)

    Hacıoğlu Rıfat

    2001-01-01

    Full Text Available A broad class of nonlinear systems and filters can be modeled by the Volterra series representation. However, its practical use in nonlinear system identification is sometimes limited due to the large number of parameters associated with the Volterra filter's structure. The parametric complexity also complicates design procedures based upon such a model. This limitation for system identification is addressed in this paper using a Fixed Pole Expansion Technique (FPET) within the Volterra model structure. The FPET approach employs orthonormal basis functions derived from fixed (real or complex) pole locations to expand the Volterra kernels and reduce the number of estimated parameters. The ability of the FPET to considerably reduce the number of estimated parameters is demonstrated by a digital satellite channel example, in which we use the proposed method to identify the channel dynamics. Furthermore, a gradient-descent procedure that adaptively selects the pole locations in the FPET structure is developed in the paper.
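
    A minimal sketch of the idea, using a discrete Laguerre filter bank (one common orthonormal basis generated from a single fixed real pole) and a least-squares fit of a second-order Volterra expansion; the toy system, pole value and basis size are illustrative assumptions, not the paper's satellite channel.

      import numpy as np
      from scipy.signal import lfilter

      def laguerre_outputs(u, pole, n_basis):
          """Discrete Laguerre filter bank with a fixed real pole: a low-pass
          stage followed by repeated all-pass sections."""
          outs = [lfilter([np.sqrt(1 - pole**2)], [1, -pole], u)]
          for _ in range(1, n_basis):
              outs.append(lfilter([-pole, 1], [1, -pole], outs[-1]))
          return np.column_stack(outs)

      rng = np.random.default_rng(0)
      u = rng.standard_normal(2000)
      y_lin = lfilter([0.5], [1, -0.6], u)      # toy linear dynamics
      y = y_lin + 0.3 * y_lin**2                # plus a quadratic nonlinearity

      V = laguerre_outputs(u, pole=0.6, n_basis=4)
      # Regressors: constant, linear terms, and products of basis outputs
      # (the expanded second-order kernel).
      quad = np.column_stack([V[:, i] * V[:, j]
                              for i in range(V.shape[1])
                              for j in range(i, V.shape[1])])
      Phi = np.column_stack([np.ones_like(u), V, quad])
      theta, *_ = np.linalg.lstsq(Phi, y, rcond=None)
      rms = np.sqrt(np.mean((Phi @ theta - y) ** 2))
      print("estimated parameters:", theta.size)   # 15 here, vs ~N**2 raw kernel taps
      print("fit residual RMS:", rms)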

  16. Infinite Multiple Membership Relational Modeling for Complex Networks

    DEFF Research Database (Denmark)

    Mørup, Morten; Schmidt, Mikkel Nørgaard; Hansen, Lars Kai

    Learning latent structure in complex networks has become an important problem fueled by many types of networked data originating from practically all fields of science. In this paper, we propose a new non-parametric Bayesian multiple-membership latent feature model for networks. Contrary to existing...... multiple-membership models that scale quadratically in the number of vertices the proposed model scales linearly in the number of links admitting multiple-membership analysis in large scale networks. We demonstrate a connection between the single membership relational model and multiple membership models and show...

  17. Crack propagation rate modelling for 316SS exposed to PWR-relevant conditions

    International Nuclear Information System (INIS)

    Vankeerberghen, M.; Weyns, G.; Gavrilov, S.; Martens, B.; Deconinck, J.

    2009-01-01

    The crack propagation rate of Type 316 stainless steel in boric acid-lithium hydroxide solutions under PWR-relevant conditions was modelled. A film rupture/dissolution/repassivation mechanism is assumed and extended to cold-worked materials by including a stress-dependent bare metal dissolution current density. The chemical and electrochemical conditions within the crack are calculated by finite element calculations, an analytical expression is used for the crack-tip strain rate, and the crack-tip stress is assumed equal to 2.5 times the yield stress (plane strain). First, the model was calibrated against a data set published in the literature. Afterwards, the influence of various variables (dissolved hydrogen, boric acid and lithium hydroxide content, stress intensity, crack length, temperature, and flow rate) was studied. Finally, other published crack growth rate tests were modelled, and the calculated crack growth rates were found to be in reasonable agreement with the reported ones.

  18. Some Comparisons of Complexity in Dictionary-Based and Linear Computational Models

    Czech Academy of Sciences Publication Activity Database

    Gnecco, G.; Kůrková, Věra; Sanguineti, M.

    2011-01-01

    Roč. 24, č. 2 (2011), s. 171-182 ISSN 0893-6080 R&D Projects: GA ČR GA201/08/1744 Grant - others: CNR - AV ČR project 2010-2012 (XE) Complexity of Neural-Network and Kernel Computational Models Institutional research plan: CEZ:AV0Z10300504 Keywords: linear approximation schemes * variable-basis approximation schemes * model complexity * worst-case errors * neural networks * kernel models Subject RIV: IN - Informatics, Computer Science Impact factor: 2.182, year: 2011

  19. Applicability of surface complexation modelling in TVO's studies on sorption of radionuclides

    International Nuclear Information System (INIS)

    Carlsson, T.

    1994-03-01

    The report focuses on the possibility of applying surface complexation theories to the conditions at a potential repository site in Finland, and on doing proper experimental work to determine the necessary constants for the models. The report provides background information on: (1) what type of experiments should be carried out in order to produce data for surface complexation modelling of sorption phenomena under potential Finnish repository conditions, and (2) how to design and perform such experiments properly, in order to gather data, develop models, or both. The report does not describe in detail how proper surface complexation experiments or modelling should be carried out. The work contains several examples of information that may be valuable in both modelling and experimental work. (51 refs., 6 figs., 4 tabs.)

  20. Thinking about complexity in health: A systematic review of the key systems thinking and complexity ideas in health.

    Science.gov (United States)

    Rusoja, Evan; Haynie, Deson; Sievers, Jessica; Mustafee, Navonil; Nelson, Fred; Reynolds, Martin; Sarriot, Eric; Swanson, Robert Chad; Williams, Bob

    2018-01-30

    As the Sustainable Development Goals are rolled out worldwide, development leaders will be looking to the experiences of the past to improve implementation in the future. Systems thinking and complexity science (ST/CS) propose that health and the health system are composed of dynamic actors constantly evolving in response to each other and their context. While offering practical guidance for steering the next development agenda, there is no consensus as to how these important ideas are discussed in relation to health. This systematic review sought to identify and describe some of the key terms, concepts, and methods in recent ST/CS literature. Using the search terms "systems thinkin* AND health OR complexity theor* AND health OR complex adaptive system* AND health," we identified 516 relevant full texts out of 3982 titles across the search period (2002-2015). The peak number of articles (83) was published in 2014, with journals specifically focused on medicine/healthcare (265), and particularly the Journal of Evaluation in Clinical Practice (37), representing the largest number by volume. Dynamic/dynamical systems (n = 332), emergence (n = 294), complex adaptive system(s) (n = 270), and interdependent/interconnected (n = 263) were the most common terms, with system dynamics modelling (58) and agent-based modelling (43) the most common methods. The review offered several important conclusions. First, while there was no core ST/CS "canon," certain terms appeared frequently across the reviewed texts. Second, even as these ideas are gaining traction in academic and practitioner communities, most work remains concentrated in a few journals. Finally, articles on ST/CS remain largely theoretical, illustrating the need for further study and practical application. Given the challenge posed by the next phase of development, gaining a better understanding of ST/CS ideas and their use may lead to improvements in the implementation and practice of the Sustainable Development

  1. Agent-Based and Macroscopic Modeling of the Complex Socio-Economic Systems

    Directory of Open Access Journals (Sweden)

    Aleksejus Kononovičius

    2013-08-01

    Full Text Available Purpose – The focus of this contribution is the correspondence between collective behavior and inter-individual interactions in complex socio-economic systems. Currently there is a wide selection of papers proposing various models for both collective behavior and inter-individual interactions in complex socio-economic systems, yet papers directly relating these two concepts are still quite rare. By studying this correspondence we discuss a cutting-edge approach to the modeling of complex socio-economic systems. Design/methodology/approach – Collective behavior is often modeled using stochastic and ordinary calculus, while inter-individual interactions are modeled using agent-based models. In order to obtain the ideal model, one should start from one of these frameworks and build a bridge to reach the other. This is a formidable task if we consider the top-down approach, namely starting from the collective behavior and moving towards inter-individual interactions. The bottom-up approach also fails if complex inter-individual interaction models are considered, yet in this case we can start with simple models and increase the complexity as needed. Findings – The bottom-up approach, considering a simple agent-based herding model as a model for the inter-individual interactions, allows us to derive certain macroscopic models of complex socio-economic systems from the agent-based perspective. This provides interesting insights into the collective behavior patterns observed in complex socio-economic systems. Research limitations/implications – The simplicity of the agent-based herding model might be considered somewhat limiting. Yet this simplicity implies that the model is highly universal. It reproduces universal features of social behavior and can also be further extended to fit different socio-economic scenarios. Practical implications – Insights provided in this contribution might be used to modify existing
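
    A common minimal choice for such an agent-based herding ingredient is a Kirman-style model, whose macroscopic limit is known in closed form (the stationary opinion fraction follows a Beta(sigma/h, sigma/h) distribution); the rates below are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(7)
      N = 100         # agents, n of which currently hold opinion "1"
      sigma = 0.02    # idiosyncratic switching rate
      h = 0.05        # herding (imitation) rate; sigma/h < 1 gives bimodality
      n = N // 2
      fractions = []

      for _ in range(200_000):
          p_up = (N - n) * (sigma + h * n) / N**2      # a "0" agent switches to "1"
          p_down = n * (sigma + h * (N - n)) / N**2    # a "1" agent switches to "0"
          u = rng.random()
          if u < p_up:
              n += 1
          elif u < p_up + p_down:
              n -= 1
          fractions.append(n / N)

      f = np.array(fractions)
      print(f"mean fraction {f.mean():.2f}, std {f.std():.2f}")
      print("time near the extremes:", np.mean((f < 0.2) | (f > 0.8)).round(2))

    With herding dominating (sigma/h < 1), the population spends long stretches near the extremes, the microscopic signature of the macroscopic bimodal stationary distribution.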

  2. The simplicity complex: exploring simplified health messages in a complex world.

    Science.gov (United States)

    Zarcadoolas, Christina

    2011-09-01

    A challenge in individual and public health at the start of the 21st century is to effectively communicate health and science information about disease and complex emergencies. The low health literacy of millions of adults in the USA has been referred to as a 'silent killer'. A popular approach to improving health communication and health promotion for low-health-literate consumers has been to simplify the language of health information. The expected result has been that individuals and groups will better understand information and will then make informed decisions about their health and behaviors. This expectation has grown to include the belief that the public will be better prepared to take appropriate action in complex natural and man-made emergencies. Demonstrating the efficacy of this approach remains, in large part, uninvestigated. And it is becoming more evident that health literacy itself is complex and multifaceted. This article applies linguistic and sociolinguistic models in order to better articulate the role of simplification in health communication and health promotion. Focusing on two models from sociolinguistics (pragmatics and text theory), the article discusses their usefulness in rethinking message simplification. The discussion proposes that a richer, more theory-based understanding of text structures and functions, along with other powerful constructs, including cultural appropriateness, relevancy and context, is needed to close the gaps between health messages, health messengers and patients/the public. The article concludes by making recommendations for future study to empirically test the strengths and limitations of these models and constructs.

  3. Realistic modelling of observed seismic motion in complex sedimentary basins

    International Nuclear Information System (INIS)

    Faeh, D.; Panza, G.F.

    1994-03-01

    Three applications of a numerical technique are illustrated to realistically model seismic ground motion for complex two-dimensional structures. First we consider a sedimentary basin in the Friuli region, and we model strong motion records from an aftershock of the 1976 earthquake. Then we simulate the ground motion caused in Rome by the 1915 Fucino (Italy) earthquake, and we compare our modelling with the damage distribution observed in the town. Finally we deal with the interpretation of ground motion recorded in Mexico City as a consequence of earthquakes in the Mexican subduction zone. The synthetic signals explain the major characteristics (relative amplitudes, spectral amplification, frequency content) of the considered seismograms, and the spatial distribution of the available macroseismic data. For the sedimentary basin in the Friuli area, parametric studies demonstrate the marked sensitivity of the computed ground motion to small changes in the subsurface topography of the sedimentary basin, and in the velocity and quality factor of the sediments. The total energy of ground motion, determined from our numerical simulation in Rome, is in very good agreement with the distribution of damage observed during the Fucino earthquake. For epicentral distances in the range 50 km-100 km, the source location, and not only the local soil conditions, controls the local effects. For Mexico City, the observed ground motion can be explained as resonance effects and as excitation of local surface waves, and the theoretical and observed maximum spectral amplifications are very similar. In general, our numerical simulations permit the estimation of the maximum and average spectral amplification for specific sites, i.e., they are a very powerful tool for accurate micro-zonation. (author). 38 refs, 19 figs, 1 tab

  4. A low-complexity interacting multiple model filter for maneuvering target tracking

    KAUST Repository

    Khalid, Syed Safwan; Abrar, Shafayat

    2017-01-01

    In this work, we address the target tracking problem for a coordinate-decoupled Markovian jump-mean-acceleration based maneuvering mobility model. A novel low-complexity alternative to the conventional interacting multiple model (IMM) filter is proposed for this class of mobility models. The proposed tracking algorithm utilizes a bank of interacting filters where the interactions are limited to the mixing of the mean estimates, and it exploits a fixed off-line computed Kalman gain matrix for the entire filter bank. Consequently, the proposed filter does not require matrix inversions during on-line operation which significantly reduces its complexity. Simulation results show that the performance of the low-complexity proposed scheme remains comparable to that of the traditional (highly-complex) IMM filter. Furthermore, we derive analytical expressions that iteratively evaluate the transient and steady-state performance of the proposed scheme, and establish the conditions that ensure the stability of the proposed filter. The analytical findings are in close accordance with the simulated results.

  5. A low-complexity interacting multiple model filter for maneuvering target tracking

    KAUST Repository

    Khalid, Syed Safwan

    2017-01-22

    In this work, we address the target tracking problem for a coordinate-decoupled Markovian jump-mean-acceleration based maneuvering mobility model. A novel low-complexity alternative to the conventional interacting multiple model (IMM) filter is proposed for this class of mobility models. The proposed tracking algorithm utilizes a bank of interacting filters where the interactions are limited to the mixing of the mean estimates, and it exploits a fixed off-line computed Kalman gain matrix for the entire filter bank. Consequently, the proposed filter does not require matrix inversions during on-line operation which significantly reduces its complexity. Simulation results show that the performance of the low-complexity proposed scheme remains comparable to that of the traditional (highly-complex) IMM filter. Furthermore, we derive analytical expressions that iteratively evaluate the transient and steady-state performance of the proposed scheme, and establish the conditions that ensure the stability of the proposed filter. The analytical findings are in close accordance with the simulated results.
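
    The key trick in both records above, replacing online gain computation with a gain fixed offline, can be sketched for a single filter: iterate the Riccati recursion to convergence once, then run the tracking loop with the constant gain and no matrix inversions. The constant-velocity model and noise levels below are illustrative; the paper's scheme additionally mixes the mean estimates of a bank of such filters.

      import numpy as np

      # Constant-velocity tracking model (illustrative): state = [pos, vel]
      dt = 1.0
      F = np.array([[1.0, dt], [0.0, 1.0]])
      H = np.array([[1.0, 0.0]])
      Q = 0.01 * np.array([[dt**3 / 3, dt**2 / 2], [dt**2 / 2, dt]])
      R = np.array([[1.0]])

      # Offline: iterate the Riccati recursion to the steady-state gain.
      P = np.eye(2)
      for _ in range(500):
          P = F @ P @ F.T + Q
          S = H @ P @ H.T + R
          K = P @ H.T @ np.linalg.inv(S)      # inversion happens only offline
          P = (np.eye(2) - K @ H) @ P
      print("steady-state gain:", K.ravel())

      # Online: fixed-gain loop with no matrix inversions.
      rng = np.random.default_rng(3)
      x_true, x_est = np.array([0.0, 1.0]), np.zeros(2)
      for _ in range(50):
          x_true = F @ x_true + rng.multivariate_normal([0, 0], Q)
          z = H @ x_true + rng.normal(0.0, 1.0)
          x_pred = F @ x_est
          x_est = x_pred + K @ (z - H @ x_pred)
      print("final position error:", abs(x_true[0] - x_est[0]))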

  6. Exploring the Impact of Visual Complexity Levels in 3d City Models on the Accuracy of Individuals' Orientation and Cognitive Maps

    Science.gov (United States)

    Rautenbach, V.; Çöltekin, A.; Coetzee, S.

    2015-08-01

    In this paper we report results from a qualitative user experiment (n=107) designed to contribute to understanding the impact of various levels of complexity (mainly based on levels of detail, i.e., LoD) in 3D city models, specifically on the participants' orientation and cognitive (mental) maps. The experiment consisted of a number of tasks motivated by spatial cognition theory where participants (among other things) were given orientation tasks, and in one case also produced sketches of a path they 'travelled' in a virtual environment. The experiments were conducted in groups, where individuals provided responses on an answer sheet. The preliminary results based on descriptive statistics and qualitative sketch analyses suggest that very little information (i.e., a low LoD model of a smaller area) might have a negative impact on the accuracy of cognitive maps constructed based on a virtual experience. Building an accurate cognitive map is an inherently desired effect of the visualizations in planning tasks, thus the findings are important for understanding how to develop better-suited 3D visualizations such as 3D city models. In this study, we specifically discuss the suitability of different levels of visual complexity for development planning (urban planning), one of the domains where 3D city models are most relevant.

  7. Modeling of anaerobic digestion of complex substrates

    International Nuclear Information System (INIS)

    Keshtkar, A. R.; Abolhamd, G.; Meyssami, B.; Ghaforian, H.

    2003-01-01

    A structured mathematical model of the anaerobic conversion of complex organic materials to biogas in non-ideally mixed cyclic-batch reactors has been developed. The model is based on multiple-reaction stoichiometry (enzymatic hydrolysis, acidogenesis, acetogenesis and methanogenesis), microbial growth kinetics, conventional material balances in the liquid and gas phases for a cyclic-batch reactor, liquid-gas interactions, liquid-phase equilibrium reactions and a simple mixing model which considers the reactor volume in two separate sections: the flow-through and the retention regions. The dynamic model describes the effects of reactant distribution resulting from the mixing conditions, the time interval of feeding, the hydraulic retention time and the mixing parameters on process performance. The model is applied to the simulation of anaerobic digestion of cattle manure under different operating conditions. The model is compared with experimental data and good correlations are obtained.
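
    As a much-reduced sketch of the kinetic core of such models, the single-substrate Monod growth stage below (solved with scipy's ODE integrator) stands in for the full multi-reaction, two-region structure; all constants are illustrative, not fitted to cattle manure data.

      import numpy as np
      from scipy.integrate import solve_ivp

      # One substrate S consumed by one methanogenic biomass X with Monod
      # kinetics; cumulative biogas G is taken proportional to substrate uptake.
      mu_max, K_s, Y, k_d, y_gas = 0.4, 2.0, 0.1, 0.02, 0.35  # illustrative

      def rhs(t, state):
          S, X, G = state
          mu = mu_max * S / (K_s + S)      # Monod specific growth rate (1/d)
          dS = -mu * X / Y                 # substrate consumption
          dX = (mu - k_d) * X              # growth minus decay
          dG = y_gas * mu * X / Y          # biogas production (arbitrary units)
          return [dS, dX, dG]

      sol = solve_ivp(rhs, (0.0, 30.0), [20.0, 0.5, 0.0])
      S, X, G = sol.y[:, -1]
      print(f"day 30: substrate={S:.2f}, biomass={X:.2f}, biogas={G:.2f}")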

  8. Parametric Linear Hybrid Automata for Complex Environmental Systems Modeling

    Directory of Open Access Journals (Sweden)

    Samar Hayat Khan Tareen

    2015-07-01

    Full Text Available Environmental systems, whether they be weather patterns or predator-prey relationships, are dependent on a number of different variables, each directly or indirectly affecting the system at large. Since not all of these factors are known, these systems take on non-linear dynamics, making it difficult to accurately predict meaningful behavioral trends far into the future. However, such dynamics do not warrant abandoning efforts to understand and model close approximations of these systems. Towards this end, we have applied a logical modeling approach to model and analyze the behavioral trends and systematic trajectories that these systems exhibit, without delving into their quantification. This approach, formalized by René Thomas for discrete logical modeling of Biological Regulatory Networks (BRNs) and further extended in our previous studies as parametric biological linear hybrid automata (Bio-LHA), has been previously employed for the analyses of different molecular regulatory interactions occurring across various cells and microbial species. As relationships between different interacting components of a system can be simplified as positive or negative influences, we can employ the Bio-LHA framework to represent different components of the environmental system as positive or negative feedbacks. In the present study, we highlight the benefits of hybrid (discrete/continuous) modeling, which leads to refinements among the forecast behaviors in order to find out which ones are actually possible. We have taken two case studies: an interaction of three microbial species in a freshwater pond, and a more complex atmospheric system, to show the applications of the Bio-LHA methodology for the timed hybrid modeling of environmental systems. Results show that the approach using the Bio-LHA is a viable method for behavioral modeling of complex environmental systems by finding timing constraints while keeping the complexity of the model

  9. Modeling and complexity of stochastic interacting Lévy type financial price dynamics

    Science.gov (United States)

    Wang, Yiduan; Zheng, Shenzhou; Zhang, Wei; Wang, Jun; Wang, Guochao

    2018-06-01

    In an attempt to reproduce and investigate the nonlinear dynamics of security markets, a novel nonlinear random interacting price dynamics, considered as a Lévy-type process, is developed and investigated through the combination of lattice-oriented percolation and Potts dynamics, which concern the intrinsic random fluctuation and the fluctuation caused by the spread of the investors' trading attitudes, respectively. To better understand the fluctuation complexity properties of the proposed model, complexity analyses of the random logarithmic price return and the corresponding volatility series are performed, including power-law distribution, Lempel-Ziv complexity and fractional sample entropy. In order to verify the rationality of the proposed model, corresponding studies of actual security market datasets are also implemented for comparison. The empirical results reveal that this financial price model can reproduce some important complexity features of actual security markets to some extent. The complexity of returns decreases with the increase of parameters γ1 and β respectively; furthermore, the volatility series exhibit lower complexity than the return series.
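
    Of the complexity measures listed, Lempel-Ziv complexity is the most self-contained to sketch: binarize a return series and count the distinct phrases in its LZ76 parsing. The Gaussian returns below are a placeholder standing in for model or market output.

      import numpy as np

      def lempel_ziv_complexity(bits):
          """Number of distinct phrases in the LZ76 parsing of a binary string."""
          s = "".join(map(str, bits))
          i, c, n = 0, 0, len(s)
          while i < n:
              length = 1
              # extend the phrase while it already occurs in the prefix before it
              while i + length <= n and s[i:i + length] in s[:i + length - 1]:
                  length += 1
              c += 1
              i += length
          return c

      rng = np.random.default_rng(0)
      returns = rng.standard_normal(2000)       # placeholder return series
      bits = (returns > np.median(returns)).astype(int)
      c, n = lempel_ziv_complexity(bits), len(bits)
      # Normalised complexity: ~1 for i.i.d. coin flips, lower for regular series
      print("normalised LZ complexity:", c * np.log2(n) / n)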

  10. The Complexity Turn in Studies of Organisations and Leadership: Relevance and Implications

    Science.gov (United States)

    Johannessen, Stig O.

    2009-01-01

    The widespread experience of complexity is the experience of radical unpredictability and loss of clear connections between cause and effect. The typical response from leaders and researchers is to suggest that more complex contexts require new ways of management control and that particular ways of organising and leading are better than others in…

  11. Surface complexation modeling of zinc sorption onto ferrihydrite.

    Science.gov (United States)

    Dyer, James A; Trivedi, Paras; Scrivner, Noel C; Sparks, Donald L

    2004-02-01

    A previous study involving lead(II) [Pb(II)] sorption onto ferrihydrite over a wide range of conditions highlighted the advantages of combining molecular- and macroscopic-scale investigations with surface complexation modeling to predict Pb(II) speciation and partitioning in aqueous systems. In this work, an extensive collection of new macroscopic and spectroscopic data was used to assess the ability of the modified triple-layer model (TLM) to predict single-solute zinc(II) [Zn(II)] sorption onto 2-line ferrihydrite in NaNO3 solutions as a function of pH, ionic strength, and concentration. Regression of constant-pH isotherm data, together with potentiometric titration and pH edge data, was a much more rigorous test of the modified TLM than fitting pH edge data alone. When coupled with valuable input from spectroscopic analyses, good fits of the isotherm data were obtained with a one-species, one-Zn-sorption-site model using the bidentate-mononuclear surface complex (≡FeO)2Zn; however, surprisingly, both the density of Zn(II) sorption sites and the value of the best-fit equilibrium "constant" for the bidentate-mononuclear complex had to be adjusted with pH to adequately fit the isotherm data. Although spectroscopy provided some evidence for multinuclear surface complex formation at surface loadings approaching site saturation at pH ≥ 6.5, the assumption of a bidentate-mononuclear surface complex provided acceptable fits of the sorption data over the entire range of conditions studied. Regressing edge data in the absence of isotherm and spectroscopic data resulted in a fair number of surface-species/site-type combinations that provided acceptable fits of the edge data, but unacceptable fits of the isotherm data. A linear relationship between log K for (≡FeO)2Zn and pH was found, given by log K (at 1 g/L ferrihydrite) = 2.058 pH - 6.131. In addition, a surface activity coefficient term was introduced to the model to reduce the ionic strength
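
    The reported regression is straightforward to apply; a small helper under the abstract's own stated conditions (1 g/L ferrihydrite, modified TLM fit):

      def log_k_zn_bidentate(pH):
          """pH-dependent best-fit equilibrium constant for the (=FeO)2Zn
          bidentate-mononuclear complex at 1 g/L ferrihydrite, from the
          regression reported in the abstract: log K = 2.058*pH - 6.131."""
          return 2.058 * pH - 6.131

      for pH in (5.0, 6.0, 7.0):
          print(f"pH {pH}: log K = {log_k_zn_bidentate(pH):.2f}")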

  12. Applications of Nonlinear Dynamics Model and Design of Complex Systems

    CERN Document Server

    In, Visarath; Palacios, Antonio

    2009-01-01

    This edited book is aimed at interdisciplinary, device-oriented applications of nonlinear science theory and methods in complex systems, in particular applications directed to nonlinear phenomena with space and time characteristics. Examples include: complex networks of magnetic sensor systems, coupled nano-mechanical oscillators, nano-detectors, microscale devices, stochastic resonance in multi-dimensional chaotic systems, biosensors, and stochastic signal quantization. "Applications of Nonlinear Dynamics: Model and Design of Complex Systems" brings together the work of scientists and engineers that are applying ideas and methods from nonlinear dynamics to design and fabricate complex systems.

  13. A relativistic density functional study of uranyl hydrolysis and complexation by carboxylic acids in aqueous solution

    Energy Technology Data Exchange (ETDEWEB)

    Ray, Rupashree Shyama

    2009-02-10

    In this work, the complexation of uranium in its most stable oxidation state, VI, in aqueous solution was studied computationally, within the framework of density functional (DF) theory. The thesis is divided into the following parts: Chapter 2 briefly summarizes the relevant general aspects of actinide chemistry and then focuses on actinide environmental chemistry. Experimental results on hydrolysis, actinide complexation by carboxylic acids, and humic substances are presented to establish a background for the subsequent discussion. Chapter 3 describes the computational method used in this work and the relevant features of the parallel quantum chemistry code PARAGAUSS employed. First, the most relevant basics of the applied density functional approach are presented, focusing on relativistic effects. Then the treatment of solvent effects, essential for an adequate modeling of actinide species in aqueous solution, is introduced. At the end of this chapter, computational parameters and procedures are summarized. Chapter 4 presents the computational results, including a comparison to available experimental data. In the beginning, the mononuclear hydrolysis product of UO₂²⁺, [UO₂OH]⁺, is discussed. The second part deals with actinide complexation by carboxylate ligands. First, the coordination number of uranyl acetate is discussed with respect to implications for the complexation of actinides by humic substances, followed by the uranyl complexation of aromatic carboxylic acids in comparison to earlier results for aliphatic ones. In the end, ternary uranyl-hydroxo-acetate complexes are discussed as models of uranyl humate complexation at ambient conditions.

  14. Electronic patient records in action: Transforming information into professionally relevant knowledge.

    Science.gov (United States)

    Winman, Thomas; Rystedt, Hans

    2011-03-01

    The implementation of generic models for organizing information in complex institutions like those in healthcare creates a gap between standardization and the need for locally relevant knowledge. The present study addresses how this gap can be bridged by focusing on the practical work of healthcare staff in transforming information in EPRs into knowledge that is useful for everyday work. Video recording of shift handovers on a rehabilitation ward serves as the empirical case. The results show how extensive selections and reorganizations of information in EPRs are carried out in order to transform information into professionally relevant accounts. We argue that knowledge about the institutional obligations and professional ways of construing information are fundamental for these transitions. The findings point to the need to consider the role of professional knowledge inherent in unpacking information in efforts to develop information systems intended to bridge between institutional and professional boundaries in healthcare. © The Author(s) 2011.

  15. Relevant Scatterers Characterization in SAR Images

    Science.gov (United States)

    Chaabouni, Houda; Datcu, Mihai

    2006-11-01

    Recognizing scenes in single-look, meter-resolution Synthetic Aperture Radar (SAR) images requires the capability to identify relevant signal signatures under conditions of variable image acquisition geometry and arbitrary object poses and configurations. Among the methods to detect relevant scatterers in SAR images, we can mention internal coherence. The SAR spectrum, split in azimuth, generates a series of images which preserve high coherence only for particular object scattering. The detection of relevant scatterers can be done by correlation studies or Independent Component Analysis (ICA) methods. The present article reviews the state of the art in SAR internal correlation analysis and proposes further extensions using elements of inference based on information theory applied to complex-valued signals. The set of azimuth-look images is analyzed using mutual information measures, and an equivalent channel capacity is derived. The localization of the "target" requires analysis in a small image window, thus resulting in imprecise estimation of the second-order statistics of the signal. For better precision, a Hausdorff measure is introduced. The method is applied to detect and characterize relevant objects in urban areas.
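
    The sub-look idea can be sketched directly: split the azimuth spectrum of a single-look complex image into two halves and measure local coherence between the resulting looks. The toy scene, window size and band split below are illustrative assumptions, not the article's estimator.

      import numpy as np
      from scipy.signal import convolve2d

      def sublook_coherence(slc, window=9):
          """Two azimuth sub-looks (lower/upper spectral halves) and their local
          coherence magnitude; deterministic scatterers stay correlated across
          sub-looks while distributed clutter decorrelates."""
          spec = np.fft.fft(slc, axis=0)
          half = spec.shape[0] // 2
          s1, s2 = spec.copy(), spec.copy()
          s1[half:] = 0.0
          s2[:half] = 0.0
          l1, l2 = np.fft.ifft(s1, axis=0), np.fft.ifft(s2, axis=0)
          k = np.ones((window, window)) / window**2

          def smooth(img):
              return convolve2d(img, k, mode="same")

          num = np.abs(smooth(l1 * np.conj(l2)))
          den = np.sqrt(smooth(np.abs(l1) ** 2) * smooth(np.abs(l2) ** 2))
          return num / (den + 1e-12)

      rng = np.random.default_rng(1)
      scene = rng.standard_normal((128, 128)) + 1j * rng.standard_normal((128, 128))
      scene[64, 64] += 50.0                   # one bright deterministic scatterer
      coh = sublook_coherence(scene)
      print("background coherence ~", coh[10:30, 10:30].mean().round(2))
      print("scatterer coherence  ~", coh[64, 64].round(2))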

  16. Topics in Complexity: Dynamical Patterns in the Cyberworld

    Science.gov (United States)

    Qi, Hong

    Quantitative understanding of mechanism in complex systems is a common "difficult" problem across many fields, such as the physical, biological, social and economic sciences. Investigation of the underlying dynamics of complex systems and the building of individual-based models have recently been fueled by the big data resulting from advancing information technology. This thesis investigates complex systems in social science, focusing on civil unrest on the streets and the relevant activities online. The investigation consists of collecting unrest data from open digital sources, characterizing the underlying dynamical patterns, making predictions and constructing models. A simple law governing the progress of two-sided confrontations is proposed using data on activity at the micro-level. Unraveling the connections between the activity of organizing online and the outburst of unrest on the streets gives rise to a further meso-level pattern of human behavior, through which adversarial groups evolve online and hyper-escalate ahead of real-world uprisings. Based on the patterns found, a noticeable improvement in the prediction of civil unrest is achieved. Meanwhile, a novel model created by combining mobility dynamics in the cyberworld with a traditional contagion model can better capture the characteristics of modern civil unrest and other contagion-like phenomena than the original one.

  17. Passengers, Crowding and Complexity : Models for passenger oriented public transport

    NARCIS (Netherlands)

    P.C. Bouman (Paul)

    2017-01-01

    Passengers, Crowding and Complexity was written as part of the Complexity in Public Transport (ComPuTr) project funded by the Netherlands Organisation for Scientific Research (NWO). This thesis studies in three parts how microscopic data can be used in models that have the potential

  18. FRAM Modelling Complex Socio-technical Systems

    CERN Document Server

    Hollnagel, Erik

    2012-01-01

    There has not yet been a comprehensive method that goes behind 'human error' and beyond the failure concept, and various complicated accidents have accentuated the need for it. The Functional Resonance Analysis Method (FRAM) fulfils that need. This book presents a detailed and tested method that can be used to model how complex and dynamic socio-technical systems work, and understand both why things sometimes go wrong but also why they normally succeed.

  19. Are more complex physiological models of forest ecosystems better choices for plot and regional predictions?

    Science.gov (United States)

    Wenchi Jin; Hong S. He; Frank R. Thompson

    2016-01-01

    Process-based forest ecosystem models vary from simple physiological, complex physiological, to hybrid empirical-physiological models. Previous studies indicate that complex models provide the best prediction at plot scale with a temporal extent of less than 10 years, however, it is largely untested as to whether complex models outperform the other two types of models...

  20. Decision-relevant evaluation of climate models: A case study of chill hours in California

    Science.gov (United States)

    Jagannathan, K. A.; Jones, A. D.; Kerr, A. C.

    2017-12-01

    The past decade has seen a proliferation of different climate datasets, with over 60 climate models currently in use. Comparative evaluation and validation of models can assist practitioners in choosing the most appropriate models for adaptation planning. However, such assessments are usually conducted for 'climate metrics' such as seasonal temperature, while sectoral decisions are often based on 'decision-relevant outcome metrics' such as growing degree days or chill hours. Since climate models predict different metrics with varying skill, the goal of this research is to conduct a bottom-up evaluation of model skill for 'outcome-based' metrics. Using chill hours (the number of hours in winter months during which the temperature is less than 45 °F) in Fresno, CA as a case, we assess how well different GCMs predict the historical mean and slope of chill hours, and whether and to what extent projections differ based on model selection. We then compare our results with other climate-based evaluations of the region to identify similarities and differences. For the model skill evaluation, historically observed chill hours were compared with simulations from 27 GCMs (and multiple ensembles). Model skill scores were generated based on a statistical hypothesis test of the comparative assessment. Future projections from RCP 8.5 runs were evaluated, and a simple bias correction was also conducted. Our analysis indicates that model skill in predicting chill hour slope depends on skill in predicting mean chill hours, which results from the non-linear nature of the chill metric. However, there was no clear relationship between the models that performed well for the chill hour metric and those that performed well in other temperature-based evaluations (such as winter minimum temperature or diurnal temperature range). Further, contrary to conclusions from other studies, we also found that the multi-model mean or large ensemble mean results may not always be most appropriate for this
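
    The outcome metric itself is easy to state in code. Below is a minimal sketch of the chill-hours computation as defined above (hours below 45 °F in winter months), assuming pandas and an hourly temperature series; the exact month window used in the study is an assumption here.

```python
import pandas as pd

def chill_hours(hourly_temps_f: pd.Series, winter_months=(11, 12, 1, 2)) -> int:
    """Count hours with temperature below 45 °F in winter months.

    hourly_temps_f: hourly temperatures (°F) indexed by timestamp.
    The 45 °F threshold follows the abstract; the month window is an
    illustrative assumption.
    """
    winter = hourly_temps_f[hourly_temps_f.index.month.isin(winter_months)]
    return int((winter < 45.0).sum())

# Example: synthetic hourly series spanning one winter
idx = pd.date_range("2016-11-01", "2017-02-28 23:00", freq="h")
temps = pd.Series(40.0, index=idx)  # constant 40 °F, so every hour counts
print(chill_hours(temps))
```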

  1. The Amygdala and the Relevance Detection Theory of Autism: An Evolutionary Perspective

    Directory of Open Access Journals (Sweden)

    Tiziana Zalla

    2013-12-01

    In the last few decades, there has been increasing interest in the role of the amygdala in psychiatric disorders and in particular its contribution to the socio-emotional impairments in autism spectrum disorders (ASDs). Given that the amygdala is a component structure of the social brain, several theoretical explanations compatible with amygdala dysfunction have been proposed to account for socio-emotional impairments in ASDs, including abnormal eye contact, gaze monitoring, face processing, mental state understanding and empathy. Nevertheless, many theoretical accounts, based on the Amygdala Theory of Autism, fail to elucidate the complex pattern of impairments observed in this population, which extends beyond the social domain. As posited by the Relevance Detector theory (Sander, Grafman and Zalla, 2003), the human amygdala is a critical component of a brain circuit involved in the appraisal of self-relevant events that include, but are not restricted to, social stimuli. Here, we propose that the behavioral and social-emotional features of ASDs may be better understood in terms of a disruption in a 'Relevance Detector Network' affecting the processing of stimuli that are relevant for the organism's self-regulating functions. In the present review, we will first summarize the main literature supporting the involvement of the amygdala in socio-emotional disturbances in ASDs. Next, we will present a revised version of the amygdala Relevance Detector hypothesis and we will show that this theoretical framework can provide a better understanding of the heterogeneity of the impairments and symptomatology of ASDs. Finally, we will discuss some predictions of our model, and suggest new directions in the investigation of the role of the amygdala within the more generally disrupted cortical connectivity framework as a model of neural organization of the autistic brain.

  2. A SIMULATION MODEL OF THE GAS COMPLEX

    Directory of Open Access Journals (Sweden)

    Sokolova G. E.

    2016-06-01

    The article considers the dynamics of gas production in Russia, the structure of sales in different market segments, and the comparative dynamics of selling prices across these segments. It addresses the problem of designing a gas complex using a simulation model that makes it possible to estimate the efficiency of the project and determine the stability region of the obtained solutions. The presented model takes into account repayment of the loan, making it possible to determine, from the first year of simulation, whether the loan can be repaid. The model object is a group of gas fields, characterized by the minimum flow rate above which the project is cost-effective. In determining the minimum flow rate, the discount rate is taken as a generalized weighted average of the interest on debt and equity, with risk premiums taken into account. It also serves as the lower barrier for the internal rate of return, below which the project is rejected as ineffective. Analysis of the dynamics and expert evaluation methods make it possible to determine the intervals of variation of the simulated parameters, such as the gas price and the time for the gas complex to reach projected capacity. Monte Carlo simulation of these parameters yields, for each random realization of the model, the optimal minimum well flow rate, and also makes it possible to determine the stability region of the solution.
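
    A toy sketch of the Monte Carlo idea described above: sample the uncertain gas price, and for each realization find the minimum well flow rate at which the project's NPV turns non-negative. All economic numbers below are placeholders, not values from the article.

```python
import random

def npv(flow_rate, gas_price, discount_rate, capex=100.0,
        opex_per_unit=0.5, years=15):
    """Net present value for a given well flow rate; all economic
    parameters here are illustrative placeholders."""
    value = -capex
    for t in range(1, years + 1):
        cash = flow_rate * (gas_price - opex_per_unit)
        value += cash / (1.0 + discount_rate) ** t
    return value

def min_viable_flow(gas_price, discount_rate, lo=0.0, hi=100.0, tol=1e-3):
    """Smallest flow rate with non-negative NPV, found by bisection
    (NPV is monotone in flow rate in this toy model)."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if npv(mid, gas_price, discount_rate) >= 0.0:
            hi = mid
        else:
            lo = mid
    return hi

random.seed(1)
# Monte Carlo over an uncertain gas price; the discount rate stands in
# for the weighted average cost of debt and equity plus risk premiums.
rates = [min_viable_flow(random.uniform(2.0, 4.0), 0.12) for _ in range(1000)]
print(min(rates), max(rates))  # spread of the minimum viable flow rate
```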

  3. Understanding large multiprotein complexes: applying a multiple allosteric networks model to explain the function of the Mediator transcription complex.

    Science.gov (United States)

    Lewis, Brian A

    2010-01-15

    The regulation of transcription and of many other cellular processes involves large multi-subunit protein complexes. In the context of transcription, it is known that these complexes serve as regulatory platforms that connect activator DNA-binding proteins to a target promoter. However, there is still a lack of understanding regarding the function of these complexes. Why do multi-subunit complexes exist? What is the molecular basis of the function of their constituent subunits, and how are these subunits organized within a complex? What is the reason for physical connections between certain subunits and not others? In this article, I address these issues through a model of network allostery and its application to the eukaryotic RNA polymerase II Mediator transcription complex. The multiple allosteric networks model (MANM) suggests that protein complexes such as Mediator exist not only as physical but also as functional networks of interconnected proteins through which information is transferred from subunit to subunit by the propagation of an allosteric state known as conformational spread. Additionally, there are multiple distinct sub-networks within the Mediator complex that can be defined by their connections to different subunits; these sub-networks have discrete functions that are activated when specific subunits interact with other activator proteins.

  4. Models of human operators

    International Nuclear Information System (INIS)

    Knee, H.E.; Schryver, J.C.

    1991-01-01

    Models of human behavior and cognition (HB and C) are necessary for understanding the total response of complex systems. Many such models have become available over the past thirty years for various applications. Unfortunately, many potential model users remain skeptical about their practicality, acceptability, and usefulness. Such hesitancy stems in part from disbelief in the ability to model complex cognitive processes, and from a belief that relevant human behavior can be adequately accounted for through the use of commonsense heuristics. This paper highlights several models of HB and C and identifies existing and potential applications in an attempt to dispel such notions. (author)

  5. Vibrational and vibronic coherences in the dynamics of the FMO complex

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Xiaomeng; Kühn, Oliver, E-mail: oliver.kuehn@uni-rostock.de

    2016-12-20

    The coupled exciton–vibrational dynamics of a seven site Frenkel exciton model of the Fenna–Matthews–Olson (FMO) complex is investigated using a Quantum Master Equation approach. Thereby, one vibrational mode per monomer is treated explicitly as being part of the relevant system. Emphasis is put on the comparison of this model with that of a purely excitonic relevant system. Further, the effects of two different approximations to the exciton–vibrational basis are investigated, namely the one- and two-particle description. Analysis of the vibronic and vibrational density matrix in the site basis points to the importance of on- and inter-site coherences for the exciton transfer. Here, one- and two-particle approximations give rise to qualitatively different results.

  6. Analysis on complex structure stability under different bar angle with BIM technology

    Directory of Open Access Journals (Sweden)

    Wang Xiongjue

    2016-03-01

    Sun Valley, the landmark building of the World Expo in Shanghai, is a typical complex structure: a free-form surface with a single-layer reticulated shell. An integrated CAD/CAM information system was used for its design; however, that is a very rigorous process and difficult to apply widely. Because the relevant technology of the Sun Valley is not open to the public at present, we try to use BIM technology to model the Sun Valley, including architectural modelling and structural analysis. Analysis of the Sun Valley structure using this method shows that the modelling problems can be solved by writing script code in the Rhino software and that the stability of the model can also be analyzed. The new approach, combining different software packages such as Rhino, Revit, and Midas, is viable and effective for modelling and calculating complex free-form surface structures.

  7. Fast and Low-Complexity Simulations of the Inquiry Time in Bluetooth

    DEFF Research Database (Denmark)

    Figueiras, Joao; Schwefel, Hans-Peter

    2006-01-01

    The timing behavior of the Inquiry Procedure in Bluetooth is relevant for several important functionalities, in particular topology formation and localization. The detailed Inquiry procedure is rather complex and simulation models may become inefficient if they implement the full detailed...... specification. This paper presents an abstracted model to approximate the distribution of Bluetooth inquiry time for scenarios in which multiple Bluetooth nodes perform the inquiry procedure. The abstracted model leads to a simple algorithm which can be used in simulation models to generate samples from...

  8. Entropy, complexity, and Markov diagrams for random walk cancer models.

    Science.gov (United States)

    Newton, Paul K; Mason, Jeremy; Hurt, Brian; Bethel, Kelly; Bazhenova, Lyudmila; Nieva, Jorge; Kuhn, Peter

    2014-12-19

    The notion of entropy is used to compare the complexity associated with 12 common cancers based on metastatic tumor distribution autopsy data. We characterize power-law distributions, entropy, and Kullback–Leibler divergence associated with each primary cancer as compared with data for all cancer types aggregated. We then correlate entropy values with other measures of complexity associated with Markov chain dynamical systems models of progression. The Markov transition matrix associated with each cancer corresponds to a directed graph model where nodes are anatomical locations where a metastatic tumor could develop, and edge weightings are transition probabilities of progression from site to site. The steady-state distribution corresponds to the autopsy data distribution. Entropy correlates well with the overall complexity of the reduced directed graph structure for each cancer and with a measure of systemic interconnectedness of the graph, called graph conductance. The models suggest that grouping cancers according to their entropy values, with skin, breast, kidney, and lung cancers being prototypical high-entropy cancers, stomach, uterine, pancreatic, and ovarian being mid-level entropy cancers, and colorectal, cervical, bladder, and prostate cancers being prototypical low-entropy cancers, provides a potentially useful framework for viewing metastatic cancer in terms of predictability, complexity, and metastatic potential.
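
    A minimal sketch of the entropy calculation on a Markov chain model of progression, assuming a toy 4-site transition matrix in place of the anatomical-site matrices estimated from autopsy data.

```python
import numpy as np

# Toy 4-site transition matrix (rows sum to 1); the paper's models use
# many more anatomical sites, with probabilities fit to autopsy data.
P = np.array([
    [0.10, 0.50, 0.30, 0.10],
    [0.20, 0.20, 0.40, 0.20],
    [0.30, 0.30, 0.20, 0.20],
    [0.25, 0.25, 0.25, 0.25],
])

# Steady-state distribution: left eigenvector of P for eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi = pi / pi.sum()

# Shannon entropy (base 2) of the steady state; higher entropy means
# metastatic tumors are spread more evenly across sites.
entropy = -np.sum(pi * np.log2(pi))
print(pi, entropy)
```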

  9. Relevance of few-nucleon problems to nuclear power

    International Nuclear Information System (INIS)

    Divatia, A.S.

    1976-01-01

    It is well known that the study of few-nucleon problems did not specifically start because they were relevant to nuclear power. However, as the need for power has become more urgent and the systems which may generate nuclear power in the future are likely to be highly complex, it has become necessary to examine the question of the relevance of few-nucleon problems to nuclear power. The nuclear data needs for nuclear power have been studied exhaustively by many groups all over the world, and the International Atomic Energy Agency, operating through the International Nuclear Data Committee and its Nuclear Data Section, has compiled and evaluated these nuclear data needs. It is therefore possible to draw upon the various studies and compilations of the IAEA in examining the question of relevance. The relevant nuclear data needs for fission reactors, fusion reactors and nuclear safeguards programmes are examined. (Auth.)

  10. Relevance of the ICRP biokinetic model for dietary organically bound tritium

    International Nuclear Information System (INIS)

    Trivedi, A.

    1999-10-01

    Ingested dietary tritium can participate in metabolic processes and become synthesized into organically bound tritium in tissues and organs. The distribution and retention of organically bound tritium throughout the body differ considerably from those of tritium in the body water. The International Commission on Radiological Protection (ICRP) Publication 56 (1989) provides a biokinetic model to calculate dose from the ingestion of organically bound dietary tritium. The model predicts that the dose from the ingestion of organically bound dietary tritium is about 2.3 times higher than from the ingestion of the same activity of tritiated water. Under steady-state conditions, the calculated dose rate (using a first-principles approach) from the ingestion of dietary organically bound tritium can be twice that from the ingestion of tritiated water. For an adult, the upper-bound dose estimate for the ingestion of dietary organically bound tritium is estimated to be close to 2.3 times that of tritiated water. Therefore, given the uncertainty in the dose calculation with respect to the actual relevant dose, the ICRP biokinetic model for organically bound tritium is sufficient for adult dosimetry. (author)

  11. Relevance in the science classroom: A multidimensional analysis

    Science.gov (United States)

    Hartwell, Matthew F.

    While perceived relevance is considered a fundamental component of adaptive learning, the experience of relevance and its conceptual definition have not been well described. The mixed-methods research presented in this dissertation aimed to clarify the conceptual meaning of relevance by focusing on its phenomenological experience from the students' perspective. Following a critical literature review, I propose an identity-based model of perceived relevance that includes three components: a contextual target, an identity target, and a connection type, or lens. An empirical investigation of this model that consisted of two general phases was implemented in four 9th grade-biology classrooms. Participants in Phase 1 (N = 118) completed a series of four open-ended writing activities focused on eliciting perceived personal connections to academic content. Exploratory qualitative content analysis of a 25% random sample of the student responses was used to identify the main meaning-units of the proposed model as well as different dimensions of student relevance perceptions. These meaning-units and dimensions provided the basis for the construction of a conceptual mapping sentence capturing students' perceived relevance, which was then applied in a confirmatory analysis to all other student responses. Participants in Phase 2 (N = 139) completed a closed survey designed based on the mapping sentence to assess their perceived relevance of a biology unit. The survey also included scales assessing other domain-level motivational processes. Exploratory factor analysis and non-metric multidimensional scaling indicated a coherent conceptual structure, which included a primary interpretive relevance dimension. Comparison of the conceptual structure across various groups (randomly-split sample, gender, academic level, domain-general motivational profiles) provided support for its ubiquity and insight into variation in the experience of perceived relevance among students of different

  12. Methodology for Measuring the Complexity of Enterprise Information Systems

    Directory of Open Access Journals (Sweden)

    Ilja Holub

    2016-07-01

    The complexity of enterprise information systems is currently a challenge faced not only by IT professionals and project managers, but also by the users of such systems. Current methodologies and frameworks used to design and implement information systems do not specifically deal with the issue of their complexity and, apart from a few exceptions, do not attempt to simplify it at all. This article presents the author's own methodology for managing complexity, which can be used to complement any other methodology and which helps limit the growth of complexity. It introduces its own definition and metric of complexity, defined as the sum of entities of the individual UML models of the given system, selected according to the MMDIS methodology so as to consistently describe all relevant content dimensions of the system. The main objective is to propose a methodology to manage information system complexity and to verify it in practice on a real-life SAP implementation project.

  13. Why relevance theory is relevant for lexicography

    DEFF Research Database (Denmark)

    Bothma, Theo; Tarp, Sven

    2014-01-01

    This article starts by providing a brief summary of relevance theory in information science in relation to the function theory of lexicography, explaining the different types of relevance, viz. objective system relevance and the subjective types of relevance, i.e. topical, cognitive, situational...... that is very important for lexicography as well as for information science, viz. functional relevance. Since all lexicographic work is ultimately aimed at satisfying users’ information needs, the article then discusses why the lexicographer should take note of all these types of relevance when planning a new...... dictionary project, identifying new tasks and responsibilities of the modern lexicographer. The article furthermore discusses how relevance theory impacts on teaching dictionary culture and reference skills. By integrating insights from lexicography and information science, the article contributes to new...

  14. Estimating the complexity of 3D structural models using machine learning methods

    Science.gov (United States)

    Mejía-Herrera, Pablo; Kakurina, Maria; Royer, Jean-Jacques

    2016-04-01

    Quantifying the complexity of 3D geological structural models can play a major role in natural resource exploration surveys, in predicting environmental hazards, and in forecasting fossil resources. This paper proposes a structural complexity index which can be used to help define the degree of effort necessary to build a 3D model for a given degree of confidence, and also to identify locations where additional effort is required to meet a given acceptable risk of uncertainty. In this work, it is considered that the structural complexity index can be estimated using machine learning methods on raw geo-data. More precisely, the metric for measuring complexity can be approximated as the degree of difficulty associated with predicting the distribution of geological objects from partial information on the actual structural distribution of materials. The proposed methodology is tested on a set of 3D synthetic structural models for which the degree of effort during their construction is assessed using various parameters (such as the number of faults, the number of parts in a surface object, the number of borders, ...), the rank of geological elements contained in each model, and, finally, their level of deformation (folding and faulting). The results show how the estimated complexity of a 3D model can be approximated by the quantity of partial data necessary to reproduce the actual 3D model at a given precision using machine learning algorithms.
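
    One way to read the proposed index is "how badly a learner predicts the geology from partial data". The sketch below is a guess at that recipe using a random forest on synthetic point data; the paper's actual features, learners, and scoring are not specified here.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

def complexity_index(coords, labels, frac=0.2, seed=0):
    """Complexity proxy: misclassification rate of a classifier that
    predicts geological unit labels from coordinates after seeing only
    a fraction of the data. Harder-to-predict models score higher."""
    X_tr, X_te, y_tr, y_te = train_test_split(
        coords, labels, train_size=frac, random_state=seed, stratify=labels)
    clf = RandomForestClassifier(n_estimators=200, random_state=seed)
    clf.fit(X_tr, y_tr)
    return 1.0 - clf.score(X_te, y_te)

# Synthetic example: a flat layered model should score lower (simpler)
# than a folded one.
rng = np.random.default_rng(0)
pts = rng.uniform(0, 1, size=(2000, 3))
flat = (pts[:, 2] > 0.5).astype(int)                              # layered
folded = (np.sin(6 * pts[:, 0]) + 2 * pts[:, 2] > 1).astype(int)  # folded
print(complexity_index(pts, flat), complexity_index(pts, folded))
```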

  15. Assessment of wear dependence parameters in complex model of cutting tool wear

    Science.gov (United States)

    Antsev, A. V.; Pasko, N. I.; Antseva, N. V.

    2018-03-01

    This paper addresses the wear dependence of the generic efficient life period of cutting tools, taken as an aggregate of the law of tool wear rate distribution and the dependence of this law's parameters on the cutting mode, factoring in the random factor as exemplified by the complex model of wear. The complex model of wear takes into account the variance of cutting properties within one batch of tools, the variance in machinability within one batch of workpieces, and the stochastic nature of the wear process itself. A technique for assessing the wear dependence parameters in a complex model of cutting tool wear is provided. The technique is supported by a numerical example.

  16. Uranium(VI) speciation: modelling, uncertainty and relevance to bioavailability models. Application to uranium uptake by the gills of a freshwater bivalve

    International Nuclear Information System (INIS)

    Denison, F.H.

    2004-07-01

    The effects of varying solution composition on the interactions between uranium(VI) and excised gills of the freshwater bivalve Corbicula fluminea have been investigated in well defined solution media. A significant reduction in the uptake of uranium was observed on increasing the concentrations of the uranium-complexing ligands citrate and carbonate. Saturation kinetics as a function of uranium concentration at a pH value of 5.0 were observed, indicating that the uptake of uranium is a facilitated process, probably involving one or several trans-membrane transport systems. A relatively small change in the uptake of uranium was found as a function of pH (factor of ca. 2), despite the extremely large changes to the solution speciation of uranium within the range of pH investigated (5.0–7.5). A comprehensive review of the thermodynamic data relevant to the solution composition domain employed for this study was performed. Estimates of the uncertainties for the formation constants of aqueous uranium(VI) species were integrated into a thermodynamic database. A computer program was written to predict the equilibrium distribution of uranium(VI) in simple aqueous systems, using thermodynamic parameter mean values. The program was extended to perform Monte Carlo and Quasi Monte Carlo uncertainty analyses, incorporating the thermodynamic database uncertainty estimates, to quantitatively predict the uncertainties inherent in predicting the solution speciation of uranium. The use of thermodynamic equilibrium modelling as a tool for interpreting the bioavailability of uranium(VI) was investigated. Observed uranium(VI) uptake behaviour was interpreted as a function of the predicted changes to the solution speciation of uranium. Different steady-state or pre-equilibrium approaches to modelling uranium uptake were tested. Alternative modelling approaches were also tested, considering the potential changes to membrane transport system activity or sorption characteristics on
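
    A stripped-down sketch of the Monte Carlo uncertainty propagation described above, reduced to a single hypothetical 1:1 uranium-ligand complex; the real program handled a full multicomponent speciation system, and the logK value and its uncertainty below are placeholders, not reviewed data.

```python
import numpy as np

rng = np.random.default_rng(42)

def complexed_fraction(log_k, ligand_free_molar):
    """Fraction of uranium bound in a 1:1 complex, assuming the free
    ligand concentration is fixed (trace-metal approximation)."""
    k = 10.0 ** log_k
    return k * ligand_free_molar / (1.0 + k * ligand_free_molar)

# Hypothetical formation constant logK = 7.4 +/- 0.3 (1 sigma) for a
# U(VI) complex; propagate the uncertainty by sampling.
samples = rng.normal(7.4, 0.3, size=10_000)
fractions = complexed_fraction(samples, ligand_free_molar=1e-6)
print(np.percentile(fractions, [2.5, 50, 97.5]))  # uncertainty band
```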

  17. Performance and evaluation of a coupled prognostic model TAPM over a mountainous complex terrain industrial area

    Science.gov (United States)

    Matthaios, Vasileios N.; Triantafyllou, Athanasios G.; Albanis, Triantafyllos A.; Sakkas, Vasileios; Garas, Stelios

    2018-05-01

    Atmospheric modeling is considered an important tool with several applications, such as prediction of air pollution levels, air quality management, and environmental impact assessment studies. Evaluation studies must therefore be conducted continuously in order to improve the accuracy and the approaches of air quality models. In the present work, an attempt is made to examine the efficiency of The Air Pollution Model (TAPM) in simulating the surface meteorology, as well as SO2 concentrations, in a mountainous complex terrain industrial area. Three configurations, the first with default datasets, the second with data assimilation, and the third with updated land use, were run to investigate the surface meteorology over a 3-year period (2009-2011), and one configuration was applied to predict SO2 concentration levels for the year 2011. The modeled hourly averaged meteorological and SO2 concentration values were statistically compared with those from five monitoring stations across the domain to evaluate the model's performance. Statistical measures showed that surface temperature and relative humidity are predicted well in all three simulations, with an index of agreement (IOA) higher than 0.94 and 0.70 respectively at all monitoring sites, while an overprediction of extreme low temperature values is noted, with mountain altitudes playing an important role. However, the results also showed that the model's performance with respect to wind is related to the configuration. The default dataset predicted the wind variables better in the center of the simulation domain than at the boundaries, while the updated land use improved the horizontal winds at the boundaries. TAPM with assimilation predicted the wind variables fairly well over the whole domain, with IOA higher than 0.83 for the wind speed and higher than 0.85 for the horizontal wind components. Finally, the SO2 concentrations were assessed by the model with IOA varied from 0
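
    The IOA statistic quoted throughout is Willmott's index of agreement; a minimal implementation for paired model/observation series follows (the data below are made up for illustration).

```python
import numpy as np

def index_of_agreement(pred, obs):
    """Willmott's index of agreement: 1 means perfect agreement,
    0 means no agreement."""
    pred, obs = np.asarray(pred, float), np.asarray(obs, float)
    obar = obs.mean()
    num = np.sum((pred - obs) ** 2)
    den = np.sum((np.abs(pred - obar) + np.abs(obs - obar)) ** 2)
    return 1.0 - num / den

obs = np.array([278.0, 280.5, 283.1, 285.0, 282.2])   # observed temp, K
pred = np.array([277.2, 281.0, 282.5, 286.1, 281.8])  # modeled temp, K
print(index_of_agreement(pred, obs))
```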

  18. Cut Based Method for Comparing Complex Networks.

    Science.gov (United States)

    Liu, Qun; Dong, Zhishan; Wang, En

    2018-03-23

    Revealing the underlying similarity of various complex networks has become both a popular and interdisciplinary topic, with a plethora of relevant application domains. The essence of the similarity here is that network features of the same network type are highly similar, while the features of different kinds of networks present low similarity. In this paper, we introduce and explore a new method for comparing various complex networks based on the cut distance. We show correspondence between the cut distance and the similarity of two networks. This correspondence allows us to consider a broad range of complex networks and explicitly compare various networks with high accuracy. Various machine learning technologies such as genetic algorithms, nearest neighbor classification, and model selection are employed during the comparison process. Our cut method is shown to be suited for comparisons of undirected networks and directed networks, as well as weighted networks. In the model selection process, the results demonstrate that our approach outperforms other state-of-the-art methods with respect to accuracy.
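
    A randomized sketch of the cut-distance idea for two labelled graphs on the same vertex set: the distance maximizes the normalized edge-weight discrepancy over pairs of vertex subsets, which is intractable to compute exactly, so sampling subsets gives a lower bound. This illustrates the notion only; it is not the paper's algorithm.

```python
import numpy as np

def cut_distance_lb(A, B, trials=2000, seed=0):
    """Randomized lower bound on the labelled cut distance between two
    graphs given as n x n adjacency matrices: the maximum, over sampled
    vertex subsets S and T, of |e_A(S,T) - e_B(S,T)| / n^2."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    D = A.astype(float) - B.astype(float)
    best = 0.0
    for _ in range(trials):
        s = rng.random(n) < 0.5   # random subset S
        t = rng.random(n) < 0.5   # random subset T
        best = max(best, abs(D[np.ix_(s, t)].sum()) / n**2)
    return best

rng = np.random.default_rng(1)
A = (rng.random((50, 50)) < 0.3).astype(int)  # sparser random graph
B = (rng.random((50, 50)) < 0.5).astype(int)  # denser random graph
print(cut_distance_lb(A, B))
```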

  19. Diverse complexities, complex diversities: Resisting ‘normal science’ in pedagogical and research methodologies. A perspective from Aotearoa (New Zealand)

    Directory of Open Access Journals (Sweden)

    Ritchie Jenny

    2016-06-01

    This paper offers an overview of the complexities of the contexts for education in Aotearoa, which include the need to recognise and include Māori (Indigenous) perspectives, but also to extend this inclusion to the context of increasing ethnic diversity. These complexities include the situation of worsening disparities between rich and poor, which disproportionately position Māori and those from Pacific Island backgrounds in situations of poverty. It then offers a brief critique of government policies before providing some examples of models that resist ‘normal science’ categorisations. These include: the Māori values underpinning the effective teachers’ profile of the Kotahitanga project and of the Māori assessment model for early childhood education; the dispositions identified in a Samoan model for assessing young children’s learning; and the approach developed for assessing Māori children’s literacy and numeracy within schools where Māori language is the medium of instruction. These models all position learning within culturally relevant frames that are grounded in non-Western onto-epistemologies which include spiritual, cultural, and collective aspirations.

  20. Complexity: a potential paradigm for a health promotion discipline.

    Science.gov (United States)

    Tremblay, Marie-Claude; Richard, Lucie

    2014-06-01

    Health promotion underpins a distancing from narrow, simplifying health approaches associated with the biomedical model. However, it has not yet succeeded in formally establishing its theoretical, epistemological and methodological foundations on a single paradigm. The complexity paradigm, which it has yet to broach head-on, might provide it with a disciplinary matrix in line with its implicit stances and basic values. This article seeks to establish complexity's relevance as a paradigm that can contribute to the development of a health promotion discipline. The relevance of complexity is justified primarily by its matching with several implicit epistemological and methodological/theoretical stances found in the cardinal concepts and principles of health promotion. The transcendence of ontological realism and determinism as well as receptiveness in respect of the reflexivity that complexity encompasses are congruent with the values of social justice, participation, empowerment and the concept of positive health that the field promotes. Moreover, from a methodological and theoretical standpoint, complexity assumes a holistic, contextual and transdisciplinary approach, toward which health promotion is tending through its emphasis on ecology and interdisciplinary action. In a quest to illustrate our position, developmental evaluation is presented as an example of practice stemming from a complexity paradigm that can be useful in the evaluation of health promotion initiatives. In short, we argue that it would be advantageous for health promotion to integrate this paradigm, which would provide it with a formal framework appropriate to its purposes and concerns.

  1. On the ""early-time"" evolution of variables relevant to turbulence models for Rayleigh-Taylor instability

    Energy Technology Data Exchange (ETDEWEB)

    Rollin, Bertrand [Los Alamos National Laboratory; Andrews, Malcolm J [Los Alamos National Laboratory

    2010-01-01

    We present our progress toward setting initial conditions in variable density turbulence models. In particular, we concentrate our efforts on the BHR turbulence model for turbulent Rayleigh-Taylor instability. Our approach is to predict profiles of relevant parameters before the fully turbulent regime and use them as initial conditions for the turbulence model. We use an idealized model of the mixing between two interpenetrating fluids to define the initial profiles for the turbulence model parameters. Velocities and volume fractions used in the idealized mixing model are obtained, respectively, from a set of ordinary differential equations modeling the growth of the Rayleigh-Taylor instability and from an idealization of the density profile in the mixing layer. A comparison between the predicted initial profiles for the turbulence model parameters and initial profiles of the parameters obtained from low-Atwood-number three-dimensional simulations shows reasonable agreement.
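
    For intuition, a minimal sketch of the kind of ODE-based mixing-layer growth model referred to above, integrating the self-similar law h = alpha*A*g*t^2 in its differential form; the alpha coefficients are typical literature values, not those of the BHR study.

```python
import numpy as np

def rt_mix_widths(atwood=0.5, g=9.81, alpha_b=0.06, alpha_s=0.07,
                  h0=1e-4, t_end=1.0, dt=1e-4):
    """Bubble/spike mixing-layer half-widths from the self-similar
    Rayleigh-Taylor growth law h = alpha*A*g*t^2, integrated as
    dh/dt = 2*sqrt(alpha*A*g*h) from a small seed amplitude h0."""
    hb, hs = h0, h0
    for _ in range(int(t_end / dt)):
        hb += dt * 2.0 * np.sqrt(alpha_b * atwood * g * hb)  # bubble side
        hs += dt * 2.0 * np.sqrt(alpha_s * atwood * g * hs)  # spike side
    return hb, hs

print(rt_mix_widths())  # half-widths at t = 1 s
```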

  2. Software Helps Retrieve Information Relevant to the User

    Science.gov (United States)

    Mathe, Natalie; Chen, James

    2003-01-01

    The Adaptive Indexing and Retrieval Agent (ARNIE) is a code library, designed to be used by an application program, that assists human users in retrieving desired information in a hypertext setting. Using ARNIE, the program implements a computational model for interactively learning what information each human user considers relevant in context. The model, called a "relevance network," incrementally adapts retrieved information to users' individual profiles on the basis of feedback from the users regarding specific queries. The model also generalizes such knowledge for subsequent derivation of relevant references for similar queries and profiles, thereby assisting users in filtering information by relevance. ARNIE thus enables users to categorize and share information of interest in various contexts. ARNIE encodes the relevance and structure of information in a neural network dynamically configured with a genetic algorithm. ARNIE maintains an internal database, wherein it saves associations, and from which it returns associated items in response to a query. A C++ compiler for the platform on which ARNIE will be utilized is necessary for creating the ARNIE library but is not necessary for the execution of the software.
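
    ARNIE's relevance-network internals (a GA-configured neural network) are specific to the library, but the adaptation loop it implements can be illustrated with a much simpler Rocchio-style profile update; the vectors, feedback scheme, and learning rate below are invented for illustration and are not ARNIE's API.

```python
import numpy as np

def update_profile(profile, doc_vec, relevant, lr=0.2):
    """Pull the user profile toward documents rated relevant and away
    from those rated irrelevant; a stand-in for the relevance network's
    incremental adaptation, not its actual mechanism."""
    sign = 1.0 if relevant else -1.0
    return profile + lr * sign * (doc_vec - profile)

profile = np.zeros(5)
docs = [np.array([1.0, 0.0, 1.0, 0.0, 0.5]),
        np.array([0.0, 1.0, 0.0, 1.0, 0.0])]
profile = update_profile(profile, docs[0], relevant=True)
profile = update_profile(profile, docs[1], relevant=False)

# Rank documents by cosine similarity to the adapted profile
scores = [float(profile @ d /
                (np.linalg.norm(profile) * np.linalg.norm(d) + 1e-12))
          for d in docs]
print(scores)
```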

  3. Using Models to Inform Policy: Insights from Modeling the Complexities of Global Polio Eradication

    Science.gov (United States)

    Thompson, Kimberly M.

    Drawing on over 20 years of experience modeling risks in complex systems, this talk will challenge SBP participants to develop models that provide timely and useful answers to critical policy questions when decision makers need them. The talk will include reflections on the opportunities and challenges associated with developing integrated models for complex problems and communicating their results effectively. Dr. Thompson will focus the talk largely on collaborative modeling related to global polio eradication and the application of system dynamics tools. After successful global eradication of wild polioviruses, live polioviruses will still present risks that could potentially lead to paralytic polio cases. This talk will present the insights of efforts to use integrated dynamic, probabilistic risk, decision, and economic models to address critical policy questions related to managing global polio risks. Using a dynamic disease transmission model combined with probabilistic model inputs that characterize uncertainty for a stratified world to account for variability, we find that global health leaders will face some difficult choices, but that they can take actions that will manage the risks effectively. The talk will emphasize the need for true collaboration between modelers and subject matter experts, and the importance of working with decision makers as partners to ensure the development of useful models that actually get used.

  4. Uncertainty and validation. Effect of model complexity on uncertainty estimates

    International Nuclear Information System (INIS)

    Elert, M.

    1996-09-01

    In the Model Complexity subgroup of BIOMOVS II, models of varying complexity have been applied to the problem of downward transport of radionuclides in soils. A scenario describing a case of surface contamination of a pasture soil was defined. Three different radionuclides with different environmental behavior and radioactive half-lives were considered: Cs-137, Sr-90 and I-129. The intention was to give a detailed specification of the parameters required by different kinds of model, together with reasonable values for the parameter uncertainty. A total of seven modelling teams participated in the study, using 13 different models. Four of the modelling groups performed uncertainty calculations using nine different modelling approaches. The models used range in complexity from analytical solutions of a 2-box model using annual average data to numerical models coupling hydrology and transport using data varying on a daily basis. The complex models needed to consider all aspects of radionuclide transport in a soil with a variable hydrology are often impractical to use in safety assessments; instead, simpler models, often box models, are preferred. The comparison of predictions made with the complex models and the simple models for this scenario shows that the predictions are in many cases very similar, e.g. in the predictions of the evolution of the root zone concentration. However, in other cases differences of many orders of magnitude can appear. One example is the prediction of the flux to the groundwater of radionuclides being transported through the soil column. Some issues have come into focus in this study. There are large differences in the predicted soil hydrology, and as a consequence also in the radionuclide transport, which suggests that there are large uncertainties in the calculation of effective precipitation and evapotranspiration. The approach used for modelling the water transport in the root zone has an impact on the predictions of the decline in root
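
    For intuition, a minimal sketch of the simple end of the model spectrum above: a 2-box root-zone/deep-soil model with first-order leaching and radioactive decay. Parameter values are generic illustrations (Cs-137 half-life, nominal soil properties), not the BIOMOVS II specification.

```python
import numpy as np

def two_box_soil(c0=1.0, years=50, infiltration=0.3, theta=0.3,
                 depth1=0.25, depth2=0.75, kd=1.0, rho=1.4,
                 half_life=30.2):
    """Annual-average two-box model of downward radionuclide transport:
    box 1 is the root zone, box 2 the deeper soil. Leaching rates use
    the usual Kd retardation; parameters are illustrative only."""
    lam = np.log(2.0) / half_life                 # decay constant, 1/yr

    def leach(depth):                             # leaching rate, 1/yr
        return infiltration / (depth * (theta + rho * kd))

    k1, k2 = leach(depth1), leach(depth2)
    c1, c2, flux = c0, 0.0, []
    dt = 0.01                                     # explicit Euler step, yr
    for _ in range(int(years / dt)):
        d1 = -(lam + k1) * c1                     # root zone balance
        d2 = k1 * c1 - (lam + k2) * c2            # deep soil balance
        c1, c2 = c1 + dt * d1, c2 + dt * d2
        flux.append(k2 * c2)                      # flux to groundwater
    return c1, c2, max(flux)

print(two_box_soil())  # final inventories and peak groundwater flux
```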

  5. Complex Road Intersection Modelling Based on Low-Frequency GPS Track Data

    Science.gov (United States)

    Huang, J.; Deng, M.; Zhang, Y.; Liu, H.

    2017-09-01

    It is widely accepted that digital maps have become an indispensable guide for human daily travel. Traditional road network maps are produced in time-consuming and labour-intensive ways, such as digitizing printed maps and extraction from remote sensing images. At present, the large volume of GPS trajectory data collected by floating vehicles makes it practical to extract highly detailed and up-to-date road network information. Road intersections are often accident-prone areas and are critical to route planning, and the connectivity of road networks is mainly determined by the topological geometry of road intersections. A few studies have addressed detecting complex road intersections and mining the attached traffic information (e.g., connectivity, topology and turning restrictions) from massive GPS traces. To the authors' knowledge, recent studies have mainly used high-frequency (1 s sampling rate) trajectory data to detect crossroad regions or extract rough intersection models. It is still difficult to use low-frequency (20-100 s), easily available trajectory data to model complex road intersections geometrically and semantically. This paper thus attempts to construct precise models of complex road intersections using low-frequency GPS traces. We propose to first extract the complex road intersections by an LCSS-based (Longest Common Subsequence) trajectory clustering method, then delineate the geometric shapes of complex road intersections by a K-segment principal curve algorithm, and finally infer the traffic constraint rules inside the complex intersections.
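
    The LCSS similarity underlying the clustering step is a standard dynamic program; below is a minimal planar-coordinate sketch, where the matching threshold eps and the normalization are illustrative assumptions.

```python
def lcss(traj_a, traj_b, eps=50.0):
    """Longest Common Subsequence similarity between two GPS tracks:
    points match when they lie within eps meters (simple planar
    coordinates; real traces would be projected first)."""
    n, m = len(traj_a), len(traj_b)
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            (xa, ya), (xb, yb) = traj_a[i - 1], traj_b[j - 1]
            if ((xa - xb) ** 2 + (ya - yb) ** 2) ** 0.5 <= eps:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[n][m] / min(n, m)   # normalized to [0, 1]

a = [(0, 0), (100, 0), (200, 10), (300, 20)]
b = [(5, 5), (95, 10), (210, 15)]
print(lcss(a, b))
```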

  6. Perspectives on creating clinically relevant blast models for mild traumatic brain injury and post traumatic stress disorder symptoms

    Directory of Open Access Journals (Sweden)

    Lisa Brenner

    2012-03-01

    Military personnel are returning from Iraq and Afghanistan and reporting non-specific physical (somatic), behavioral, psychological, and cognitive symptoms. Many of these symptoms are frequently associated with mild traumatic brain injury (mTBI) and/or post-traumatic stress disorder (PTSD). Despite significant attention and advances in assessment and intervention for these two conditions, challenges persist. To address this, clinically relevant blast models are essential for the full characterization of this type of injury, as well as for the testing and identification of potential treatment strategies. In this publication, existing diagnostic challenges and current treatment practices for mTBI and/or PTSD are summarized, along with suggestions regarding how lessons learned from existing models of PTSD and traditional-mechanism (e.g., non-blast) TBI can be used to facilitate the development of clinically relevant blast models.

  7. Computational disease modeling – fact or fiction?

    Directory of Open Access Journals (Sweden)

    Stephan Klaas

    2009-06-01

    Background: Biomedical research is changing due to the rapid accumulation of experimental data at an unprecedented scale, revealing increasing degrees of complexity of biological processes. The Life Sciences are facing a transition from a descriptive to a mechanistic approach that reveals principles of cells, cellular networks, organs, and their interactions across several spatial and temporal scales. There are two conceptual traditions in biological computational modeling. The bottom-up approach emphasizes complex intracellular molecular models and is well represented within the systems biology community. On the other hand, the physics-inspired top-down modeling strategy identifies and selects features of (presumably) essential relevance to the phenomena of interest and combines available data in models of modest complexity. Results: The workshop "ESF Exploratory Workshop on Computational Disease Modeling" examined the challenges that computational modeling faces in contributing to the understanding and treatment of complex multi-factorial diseases. Participants at the meeting agreed on two general conclusions. First, we identified the critical importance of developing analytical tools for dealing with model and parameter uncertainty. Second, the development of predictive hierarchical models spanning several scales beyond intracellular molecular networks was identified as a major objective. This contrasts with the current focus within the systems biology community on complex molecular modeling. Conclusion: During the workshop it became obvious that diverse scientific modeling cultures (from computational neuroscience, theory, data-driven machine-learning approaches, agent-based modeling, network modeling and stochastic-molecular simulations) would benefit from intense cross-talk on shared theoretical issues in order to make progress on clinically relevant problems.

  8. Structured analysis and modeling of complex systems

    Science.gov (United States)

    Strome, David R.; Dalrymple, Mathieu A.

    1992-01-01

    The Aircrew Evaluation Sustained Operations Performance (AESOP) facility at Brooks AFB, Texas, combines the realism of an operational environment with the control of a research laboratory. In recent studies we collected extensive data from the Airborne Warning and Control Systems (AWACS) Weapons Directors subjected to high and low workload Defensive Counter Air Scenarios. A critical and complex task in this environment involves committing a friendly fighter against a hostile fighter. Structured Analysis and Design techniques and computer modeling systems were applied to this task as tools for analyzing subject performance and workload. This technology is being transferred to the Man-Systems Division of NASA Johnson Space Center for application to complex mission related tasks, such as manipulating the Shuttle grappler arm.

  9. Modeling complexity in engineered infrastructure system: Water distribution network as an example

    Science.gov (United States)

    Zeng, Fang; Li, Xiang; Li, Ke

    2017-02-01

    The complex topology and adaptive behavior of infrastructure systems are driven both by self-organization of demand and by rigid engineering solutions. Engineering complex systems therefore requires a method balancing holism and reductionism. To model the growth of water distribution networks, a complex network model was developed by combining local optimization rules with engineering considerations. The demand node generation is dynamic and follows the scaling law of urban growth. The proposed model can generate a water distribution network (WDN) similar to reported real-world WDNs in some structural properties. Comparison with different modeling approaches indicates that a realistic demand node distribution and the co-evolution of demand nodes and network are important for the simulation of real complex networks. The simulation results indicate that the efficiency of water distribution networks is exponentially affected by the urban growth pattern. By contrast, the improvement of efficiency by engineering optimization is limited and relatively insignificant. The redundancy and robustness, on the other hand, can be significantly improved through engineering methods.
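
    A toy sketch of growth by local optimization: demand nodes appear following a clustered spatial pattern, and each connects to the nearest (cheapest) existing node. The published model adds engineering constraints and looped connections; this only illustrates the growth mechanism.

```python
import math
import random

def grow_wdn(n_nodes=100, seed=0):
    """Toy growth model for a water distribution network: demand nodes
    arrive one at a time and connect to the existing node that
    minimizes pipe length (a purely local optimization rule)."""
    random.seed(seed)
    nodes = [(0.5, 0.5)]            # source at the center
    edges = []
    for _ in range(n_nodes - 1):
        # Radially clustered demand, mimicking denser growth near the core
        r, ang = random.random() ** 2, random.uniform(0, 2 * math.pi)
        p = (0.5 + 0.5 * r * math.cos(ang), 0.5 + 0.5 * r * math.sin(ang))
        nearest = min(range(len(nodes)),
                      key=lambda i: math.dist(p, nodes[i]))
        nodes.append(p)
        edges.append((nearest, len(nodes) - 1))
    return nodes, edges

nodes, edges = grow_wdn()
print(len(nodes), len(edges))  # a tree: n nodes, n - 1 pipes
```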

  10. Fidelity in Animal Modeling: Prerequisite for a Mechanistic Research Front Relevant to the Inflammatory Incompetence of Acute Pediatric Malnutrition

    Science.gov (United States)

    Woodward, Bill

    2016-01-01

    Inflammatory incompetence is characteristic of acute pediatric protein-energy malnutrition, but its underlying mechanisms remain obscure. Perhaps substantially because the research front lacks the driving force of a scholarly unifying hypothesis, it is adrift and research activity is declining. A body of animal-based research points to a unifying paradigm, the Tolerance Model, with some potential to offer coherence and a mechanistic impetus to the field. However, reasonable skepticism prevails regarding the relevance of animal models of acute pediatric malnutrition; consequently, the fundamental contributions of the animal-based component of this research front are largely overlooked. Design-related modifications to improve the relevance of animal modeling in this research front include, most notably, prioritizing essential features of pediatric malnutrition pathology rather than dietary minutiae specific to infants and children, selecting windows of experimental animal development that correspond to targeted stages of pediatric immunological ontogeny, and controlling for ontogeny-related confounders. In addition, important opportunities are presented by newer tools including the immunologically humanized mouse and outbred stocks exhibiting a magnitude of genetic heterogeneity comparable to that of human populations. Sound animal modeling is within our grasp to stimulate and support a mechanistic research front relevant to the immunological problems that accompany acute pediatric malnutrition. PMID:27077845

  12. Evaluation of soil flushing of complex contaminated soil: An experimental and modeling simulation study

    Energy Technology Data Exchange (ETDEWEB)

    Yun, Sung Mi; Kang, Christina S. [Department of Environmental Engineering, Konkuk University, 120 Neungdong-ro, Gwangjin-gu, Seoul 143-701 (Korea, Republic of); Kim, Jonghwa [Department of Industrial Engineering, Konkuk University, 120 Neungdong-ro, Gwangjin-gu, Seoul 143-701 (Korea, Republic of); Kim, Han S., E-mail: hankim@konkuk.ac.kr [Department of Environmental Engineering, Konkuk University, 120 Neungdong-ro, Gwangjin-gu, Seoul 143-701 (Korea, Republic of)

    2015-04-28

    Highlights: • Remediation of complex contaminated soil achieved by sequential soil flushing. • Removal of Zn, Pb, and heavy petroleum oils using 0.05 M citric acid and 2% SDS. • Unified desorption distribution coefficients modeled and experimentally determined. • Nonequilibrium models for the transport behavior of complex contaminants in soils. - Abstract: The removal of heavy metals (Zn and Pb) and heavy petroleum oils (HPOs) from a soil with complex contamination was examined by soil flushing. Desorption and transport behaviors of the complex contaminants were assessed by batch and continuous-flow reactor experiments and through modeling simulations. Flushing a one-dimensional flow column packed with complex contaminated soil sequentially with citric acid and then a surfactant resulted in the removal of 85.6% of Zn, 62% of Pb, and 31.6% of HPO. The desorption distribution coefficients, K_U,batch and K_L,batch, converged to constant values as C_e increased. An equilibrium model (ADR) and nonequilibrium models (TSNE and TRNE) were used to predict the desorption and transport of complex contaminants. The nonequilibrium models demonstrated better fits with the experimental values obtained from the column test than the equilibrium model. The ranges of K_U,batch and K_L,batch were very close to those of K_U,fit and K_L,fit determined from model simulations. The parameters (R, β, ω, α, and f) determined from model simulations were useful for characterizing the transport of contaminants within the soil matrix. The results of this study provide useful information for the operational parameters of the flushing process for soils with complex contamination.

  13. A computational framework for modeling targets as complex adaptive systems

    Science.gov (United States)

    Santos, Eugene; Santos, Eunice E.; Korah, John; Murugappan, Vairavan; Subramanian, Suresh

    2017-05-01

    Modeling large military targets is a challenge as they can be complex systems encompassing myriad combinations of human, technological, and social elements that interact, leading to complex behaviors. Moreover, such targets have multiple components and structures, extending across multiple spatial and temporal scales, and are in a state of change, either in response to events in the environment or changes within the system. Complex adaptive system (CAS) theory can help in capturing the dynamism, interactions, and more importantly various emergent behaviors, displayed by the targets. However, a key stumbling block is incorporating information from various intelligence, surveillance and reconnaissance (ISR) sources, while dealing with the inherent uncertainty, incompleteness and time criticality of real world information. To overcome these challenges, we present a probabilistic reasoning network based framework called complex adaptive Bayesian Knowledge Base (caBKB). caBKB is a rigorous, overarching and axiomatic framework that models two key processes, namely information aggregation and information composition. While information aggregation deals with the union, merger and concatenation of information and takes into account issues such as source reliability and information inconsistencies, information composition focuses on combining information components where such components may have well defined operations. Since caBKBs can explicitly model the relationships between information pieces at various scales, it provides unique capabilities such as the ability to de-aggregate and de-compose information for detailed analysis. Using a scenario from the Network Centric Operations (NCO) domain, we will describe how our framework can be used for modeling targets with a focus on methodologies for quantifying NCO performance metrics.

  14. Model-based safety architecture framework for complex systems

    NARCIS (Netherlands)

    Schuitemaker, Katja; Rajabali Nejad, Mohammadreza; Braakhuis, J.G.; Podofillini, Luca; Sudret, Bruno; Stojadinovic, Bozidar; Zio, Enrico; Kröger, Wolfgang

    2015-01-01

    The shift to transparency and rising need of the general public for safety, together with the increasing complexity and interdisciplinarity of modern safety-critical Systems of Systems (SoS) have resulted in a Model-Based Safety Architecture Framework (MBSAF) for capturing and sharing architectural

  15. Contrasting model complexity under a changing climate in a headwaters catchment.

    Science.gov (United States)

    Foster, L.; Williams, K. H.; Maxwell, R. M.

    2017-12-01

    Alpine, snowmelt-dominated catchments are the source of water for more than 1/6th of the world's population. These catchments are topographically complex, leading to steep weather gradients and nonlinear relationships between water and energy fluxes. Recent evidence suggests that alpine systems are more sensitive to climate warming, but these regions are vastly simplified in climate models and operational water management tools due to computational limitations. Simultaneously, point-scale observations are often extrapolated to larger regions where feedbacks can both exacerbate or mitigate locally observed changes. It is critical to determine whether projected climate impacts are robust to different methodologies, including model complexity. Using high performance computing and an integrated model of a representative headwater catchment we determined the hydrologic response from 30 projected climate changes to precipitation, temperature and vegetation for the Rocky Mountains. Simulations were run with 100m and 1km resolution, and with and without lateral subsurface flow in order to vary model complexity. We found that model complexity alters nonlinear relationships between water and energy fluxes. Higher-resolution models predicted larger changes per degree of temperature increase than lower resolution models, suggesting that reductions to snowpack, surface water, and groundwater due to warming may be underestimated in simple models. Increases in temperature were found to have a larger impact on water fluxes and stores than changes in precipitation, corroborating previous research showing that mountain systems are significantly more sensitive to temperature changes than to precipitation changes and that increases in winter precipitation are unlikely to compensate for increased evapotranspiration in a higher energy environment. These numerical experiments help to (1) bracket the range of uncertainty in published literature of climate change impacts on headwater

  16. Clinical and Neurobiological Relevance of Current Animal Models of Autism Spectrum Disorders

    Science.gov (United States)

    Kim, Ki Chan; Gonzales, Edson Luck; Lázaro, María T.; Choi, Chang Soon; Bahn, Geon Ho; Yoo, Hee Jeong; Shin, Chan Young

    2016-01-01

    Autism spectrum disorder (ASD) is a neurodevelopmental disorder characterized by social and communication impairments, as well as repetitive and restrictive behaviors. The phenotypic heterogeneity of ASD has made it overwhelmingly difficult to determine the exact etiology and pathophysiology underlying the core symptoms, which are often accompanied by comorbidities such as hyperactivity, seizures, and sensorimotor abnormalities. To our benefit, the advent of animal models has allowed us to assess and test diverse risk factors of ASD, both genetic and environmental, and measure their contribution to the manifestation of autistic symptoms. At a broader scale, rodent models have helped consolidate molecular pathways and unify the neurophysiological mechanisms underlying each one of the various etiologies. This approach will potentially enable the stratification of ASD into clinical, molecular, and neurophenotypic subgroups, further proving their translational utility. It is henceforth paramount to establish a common ground of mechanistic theories from complementing results in preclinical research. In this review, we cluster the ASD animal models into lesion and genetic models and further classify them based on the corresponding environmental, epigenetic and genetic factors. Finally, we summarize the symptoms and neuropathological highlights for each model and make critical comparisons that elucidate their clinical and neurobiological relevance. PMID:27133257

  17. Modeling and simulation for fewer-axis grinding of complex surface

    Science.gov (United States)

    Li, Zhengjian; Peng, Xiaoqiang; Song, Ci

    2017-10-01

    As the basis of fewer-axis grinding of complex surfaces, the grinding mathematical model is of great importance. A mathematical model of the grinding wheel was established, from which the coordinates and normal vector of the wheel profile can be calculated. Through normal vector matching at the cutter contact point and a coordinate system transformation, the grinding mathematical model was established to work out the coordinates of the cutter location point. Based on the model, an interference analysis was simulated to find the correct position and posture of the workpiece for grinding. Positioning errors of the workpiece, including the translation positioning error and the rotation positioning error, were then analyzed, and the main locating datum was obtained. According to the analysis results, the grinding tool path was planned and generated to grind the complex surface, and good form accuracy was obtained. The grinding mathematical model is simple, feasible and can be widely applied.
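
    Normal vector matching at the cutter contact point amounts to finding the rotation that aligns the wheel-profile normal with the surface normal; the cutter location then follows from the corresponding coordinate transformation. Below is a generic numpy sketch using Rodrigues' rotation formula, an illustrative helper rather than the authors' code.

        # Rotation matrix R such that R @ a is parallel to b (Rodrigues' formula),
        # illustrating the "normal vector matching" step of the record above.
        import numpy as np

        def rotation_aligning(a, b):
            a = a / np.linalg.norm(a)
            b = b / np.linalg.norm(b)
            v = np.cross(a, b)
            c = float(np.dot(a, b))
            if np.isclose(c, -1.0):                      # opposite normals: rotate by pi
                axis = np.eye(3)[int(np.argmin(np.abs(a)))]
                v = np.cross(a, axis)
                v /= np.linalg.norm(v)
                return 2.0 * np.outer(v, v) - np.eye(3)  # rotation by pi about v
            K = np.array([[0.0, -v[2], v[1]], [v[2], 0.0, -v[0]], [-v[1], v[0], 0.0]])
            return np.eye(3) + K + K @ K / (1.0 + c)

        wheel_normal = np.array([0.0, 0.0, 1.0])
        surface_normal = np.array([0.3, -0.2, 0.93])
        R = rotation_aligning(wheel_normal, surface_normal)
        print(R @ wheel_normal)  # parallel to surface_normal (after normalization)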

  18. Nonlinear model of epidemic spreading in a complex social network.

    Science.gov (United States)

    Kosiński, Robert A; Grabowski, A

    2007-10-01

    The epidemic spreading in a human society is a complex process, which can be described on the basis of a nonlinear mathematical model. In such an approach the complex and hierarchical structure of the social network (which has implications for the spreading of pathogens and can be treated as a complex network) can be taken into account. In our model each individual is in one of four permitted states: susceptible, infected, infective, or unsusceptible/dead. This corresponds to the SEIR model used in epidemiology. The state of an individual changes in time, depending on the previous state and the interactions with other individuals. The description of interpersonal contacts is based on experimental observations of social relations in the community; it includes the spatial localization of individuals and the hierarchical structure of interpersonal interactions. Numerical simulations were performed for different types of epidemics, giving the progress of the spreading process and typical relationships (e.g. the range of the epidemic in time, the epidemic curve). The spreading process has a complex and spatially chaotic character. The time dependence of the number of infective individuals shows the nonlinear character of the spreading process. We investigate the influence of preventive vaccinations on the spreading process; in particular, at a critical fraction of preventively vaccinated individuals a percolation threshold is observed and the epidemic is suppressed.
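
    A minimal discrete-time sketch of such a four-state spreading process on a random contact network is given below; the parameters and network are illustrative only and are not taken from the paper.

        # Four-state (S -> infected -> infective -> removed) spreading process on
        # a random contact network, illustrating the kind of model the record
        # describes; all parameters are illustrative.
        import random

        random.seed(1)
        N, k, p_transmit, latent, infectious = 500, 6, 0.05, 2, 5
        contacts = [random.sample([j for j in range(N) if j != i], k) for i in range(N)]
        state = ["S"] * N            # S, E (infected), I (infective), R (removed)
        clock = [0] * N
        state[0] = "I"               # one initial infective individual

        curve = []
        for t in range(60):
            new_infected = []
            for i in range(N):
                if state[i] == "I":
                    for j in contacts[i]:
                        if state[j] == "S" and random.random() < p_transmit:
                            new_infected.append(j)
            for i in range(N):
                if state[i] == "E" and clock[i] >= latent:
                    state[i], clock[i] = "I", 0          # becomes infective
                elif state[i] == "I" and clock[i] >= infectious:
                    state[i], clock[i] = "R", 0          # removed/unsusceptible
                elif state[i] in ("E", "I"):
                    clock[i] += 1
            for j in new_infected:
                state[j], clock[j] = "E", 0
            curve.append(sum(s == "I" for s in state))
        print(curve)  # epidemic curve: number of infective individuals over time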

  19. Polymer-bound oxidovanadium(IV) and dioxidovanadium(V) complexes as catalysts for the oxidative desulfurization of model fuel diesel.

    Science.gov (United States)

    Maurya, Mannar R; Arya, Aarti; Kumar, Amit; Kuznetsov, Maxim L; Avecilla, Fernando; Costa Pessoa, João

    2010-07-19

    The Schiff base (Hfsal-dmen) derived from 3-formylsalicylic acid and N,N-dimethyl ethylenediamine has been covalently bonded to chloromethylated polystyrene to give the polymer-bound ligand, PS-Hfsal-dmen (I). Treatment of PS-Hfsal-dmen with [V(IV)O(acac)(2)] in the presence of MeOH gave the oxidovanadium(IV) complex PS-[V(IV)O(fsal-dmen)(MeO)] (1). On aerial oxidation in methanol, complex 1 was oxidized to PS-[V(V)O(2)(fsal-dmen)] (2). The corresponding neat complexes, [V(IV)O(sal-dmen)(acac)] (3) and [V(V)O(2)(sal-dmen)] (4), were similarly prepared. All these complexes were characterized by various spectroscopic techniques (IR, electronic, NMR, and electron paramagnetic resonance (EPR)), by thermal studies and by field-emission scanning electron microscopy (FE-SEM), and the molecular structures of 3 and 4 were determined by single-crystal X-ray diffraction. The EPR spectrum of the polymer-supported V(IV)O-complex 1 is characteristic of magnetically diluted V(IV)O-complexes, the resolved EPR pattern indicating that the V(IV)O-centers are well dispersed in the polymer matrix. A good (51)V NMR spectrum could also be measured with 4 suspended in dimethyl sulfoxide (DMSO), the chemical shift (-503 ppm) being compatible with a VO(2)(+)-center and a N,O binding set. The catalytic oxidative desulfurization of the organosulfur compounds thiophene, dibenzothiophene, benzothiophene, and 2-methyl thiophene (as models of fuel diesel) was carried out using complexes 1 and 2. The sulfur in the model organosulfur compounds oxidizes to the corresponding sulfone in the presence of H(2)O(2). The systems 1 and 2 do not lose efficiency for sulfoxidation at least up to the third reaction cycle, indicating that they preserve their integrity under the conditions used. Plausible intermediates involved in these catalytic processes are established by UV-vis, EPR, (51)V NMR, and density functional theory (DFT) studies, and an outline of the mechanism is proposed. The (51)V NMR spectra

  20. Models of policy-making and their relevance for drug research.

    Science.gov (United States)

    Ritter, Alison; Bammer, Gabriele

    2010-07-01

    Researchers are often frustrated by their inability to influence policy. We describe models of policy-making to provide new insights and a more realistic assessment of research impacts on policy. We describe five prominent models of policy-making and illustrate them with examples from the alcohol and drugs field, before drawing lessons for researchers. Policy-making is a complex and messy process, with different models describing different elements. We start with the incrementalist model, which highlights small amendments to policy, as occurs in school-based drug education. A technical/rational approach then outlines the key steps in a policy process from identification of problems and their causes, through to examination and choice of response options, and subsequent implementation and evaluation. There is a clear role for research, as we illustrate with the introduction of new medications, but this model largely ignores the dominant political aspects of policy-making. Such political aspects include the influence of interest groups, and we describe models about power and pressure groups, as well as advocacy coalitions, and the challenges they pose for researchers. These are illustrated with reference to the alcohol industry, and interest group conflicts in establishing a Medically Supervised Injecting Centre. Finally, we describe the multiple streams framework, which alerts researchers to 'windows of opportunity', and we show how these were effectively exploited in policy for cannabis law reform in Western Australia. Understanding models of policy-making can help researchers maximise the uptake of their work and advance evidence-informed policy.

  1. Use of an ecologically relevant modelling approach to improve remote sensing-based schistosomiasis risk profiling

    Directory of Open Access Journals (Sweden)

    Yvonne Walz

    2015-11-01

    Full Text Available Schistosomiasis is a widespread water-based disease that puts close to 800 million people at risk of infection, with more than 250 million infected, mainly in sub-Saharan Africa. Transmission is governed by the spatial distribution of specific freshwater snails that act as intermediate hosts and by the frequency, duration and extent of human bodies exposed to infested water sources during water contact. Remote sensing data have been utilized for spatially explicit risk profiling of schistosomiasis. However, schistosomiasis risk profiling based on remote sensing data inherits a conceptual drawback if school-based disease prevalence data are directly related to remote sensing measurements extracted at the location of the school, because disease transmission usually does not occur at the school itself. We therefore took the local environment around the schools into account by explicitly linking ecologically relevant environmental information of potential disease transmission sites to survey measurements of disease prevalence. Our models were validated at two sites with different landscapes in Côte d’Ivoire using high- and moderate-resolution remote sensing data based on random forest and partial least squares regression. We found that the ecologically relevant modelling approach explained up to 70% of the variation in Schistosoma infection prevalence and performed better than a purely pixel-based modelling approach. Furthermore, our study showed that model performance increased as a function of enlarging the school catchment area, confirming the hypothesis that suitable environments for schistosomiasis transmission rarely occur at the location of survey measurements.
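
    The regression step can be pictured with a short scikit-learn sketch: catchment-level covariates (synthetic stand-ins here) regressed against prevalence with a random forest and assessed by cross-validation. This is an illustration under assumed data, not the authors' pipeline.

        # Sketch of the ecologically relevant modelling idea: summarize remote
        # sensing covariates over a catchment buffer around each school (random
        # numbers stand in for the real features) and regress prevalence with a
        # random forest, as in the record. Entirely synthetic illustration.
        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        n_schools = 120
        X = rng.normal(size=(n_schools, 6))   # e.g. NDVI, wetness, distance to water
        # synthetic response standing in for measured infection prevalence
        prevalence = 0.3 * X[:, 0] - 0.2 * X[:, 1] + rng.normal(0, 0.1, n_schools)

        rf = RandomForestRegressor(n_estimators=300, random_state=0)
        r2 = cross_val_score(rf, X, prevalence, cv=5, scoring="r2").mean()
        print(f"cross-validated R^2: {r2:.2f}")  # the record reports up to ~0.70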

  2. Advances in complex societal, environmental and engineered systems

    CERN Document Server

    Essaaidi, Mohammad

    2017-01-01

    This book addresses recent technological progress that has led to an increased complexity in many natural and artificial systems. The resulting complexity research due to the emergence of new properties and spatio-temporal interactions among a large number of system elements - and between the system and its environment - is the primary focus of this text. This volume is divided into three parts: Part one focuses on societal and ecological systems, Part two deals with approaches for understanding, modeling, predicting and mastering socio-technical systems, and Part three includes real-life examples. Each chapter has its own special features; it is a self-contained contribution of distinguished experts working on different fields of science and technology relevant to the study of complex systems. Advances in Complex Systems of Contemporary Reality: Societal, Environmental and Engineered Systems will provide postgraduate students, researchers and managers with qualitative and quantitative methods for handling th...

  3. Bridging Mechanistic and Phenomenological Models of Complex Biological Systems.

    Science.gov (United States)

    Transtrum, Mark K; Qiu, Peng

    2016-05-01

    The inherent complexity of biological systems gives rise to complicated mechanistic models with a large number of parameters. On the other hand, the collective behavior of these systems can often be characterized by a relatively small number of phenomenological parameters. We use the Manifold Boundary Approximation Method (MBAM) as a tool for deriving simple phenomenological models from complicated mechanistic models. The resulting models are not black boxes, but remain expressed in terms of the microscopic parameters. In this way, we explicitly connect the macroscopic and microscopic descriptions, characterize the equivalence class of distinct systems exhibiting the same range of collective behavior, and identify the combinations of components that function as tunable control knobs for the behavior. We demonstrate the procedure for adaptation behavior exhibited by the EGFR pathway. From a 48 parameter mechanistic model, the system can be effectively described by a single adaptation parameter τ characterizing the ratio of time scales for the initial response and recovery time of the system which can in turn be expressed as a combination of microscopic reaction rates, Michaelis-Menten constants, and biochemical concentrations. The situation is not unlike modeling in physics in which microscopically complex processes can often be renormalized into simple phenomenological models with only a few effective parameters. The proposed method additionally provides a mechanistic explanation for non-universal features of the behavior.

  4. Low frequency complex dielectric (conductivity) response of dilute clay suspensions: Modeling and experiments.

    Science.gov (United States)

    Hou, Chang-Yu; Feng, Ling; Seleznev, Nikita; Freed, Denise E

    2018-04-11

    In this work, we establish an effective medium model to describe the low-frequency complex dielectric (conductivity) dispersion of dilute clay suspensions. We use previously obtained low-frequency polarization coefficients for a charged oblate spheroidal particle immersed in an electrolyte as the building block for the Maxwell Garnett mixing formula to model the dilute clay suspension. The complex conductivity phase dispersion exhibits a near-resonance peak when the clay grains have a narrow size distribution. The peak frequency is associated with the size distribution as well as the shape of clay grains and is often referred to as the characteristic frequency. In contrast, if the size of the clay grains has a broad distribution, the phase peak is broadened and can disappear into the background of the canonical phase response of the brine. To benchmark our model, the low-frequency dispersion of the complex conductivity of dilute clay suspensions is measured using a four-point impedance measurement, which can be reliably calibrated in the frequency range between 0.1 Hz and 10 kHz. By using a minimal number of fitting parameters when reliable information is available as input for the model and carefully examining the issue of potential over-fitting, we found that our model can be used to fit the measured dispersion of the complex conductivity with reasonable parameters. The good match between the modeled and experimental complex conductivity dispersion allows us to argue that our simplified model captures the essential physics for describing the low-frequency dispersion of the complex conductivity of dilute clay suspensions. Copyright © 2018 Elsevier Inc. All rights reserved.
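
    For reference, the textbook Maxwell Garnett mixing rule for spherical inclusions of (complex) permittivity eps_i at volume fraction f in a host medium eps_m reads as below; the record generalizes this relation through polarization coefficients for charged oblate spheroids.

        % Maxwell Garnett mixing formula (spherical-inclusion special case)
        \frac{\varepsilon_{\mathrm{eff}} - \varepsilon_m}{\varepsilon_{\mathrm{eff}} + 2\varepsilon_m}
          = f\,\frac{\varepsilon_i - \varepsilon_m}{\varepsilon_i + 2\varepsilon_m}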

  5. Using Structured Knowledge Representation for Context-Sensitive Probabilistic Modeling

    National Research Council Canada - National Science Library

    Sakhanenko, Nikita A; Luger, George F

    2008-01-01

    We propose a context-sensitive probabilistic modeling system (COSMOS) that reasons about a complex, dynamic environment through a series of applications of smaller, knowledge-focused models representing contextually relevant information...

  6. Green IT engineering concepts, models, complex systems architectures

    CERN Document Server

    Kondratenko, Yuriy; Kacprzyk, Janusz

    2017-01-01

    This volume provides a comprehensive state-of-the-art overview of a series of advanced trends and concepts that have recently been proposed in the area of green information technologies engineering, as well as of design and development methodologies for models and complex systems architectures and their intelligent components. The contributions included in the volume have their roots in the authors’ presentations, and the vivid discussions that followed the presentations, at a series of workshops and seminars held within the international TEMPUS project GreenCo in the United Kingdom, Italy, Portugal, Sweden and Ukraine during 2013-2015, and at the 1st-5th Workshops on Green and Safe Computing (GreenSCom) held in Russia, Slovakia and Ukraine. The book presents a systematic exposition of research on principles, models, components and complex systems and a description of industry- and society-oriented aspects of green IT engineering. A chapter-oriented structure has been adopted for this book ...

  7. An Ontology for Modeling Complex Inter-relational Organizations

    Science.gov (United States)

    Wautelet, Yves; Neysen, Nicolas; Kolp, Manuel

    This paper presents an ontology for organizational modeling through multiple complementary aspects. The primary goal of the ontology is to provide an adequate set of related concepts for studying complex organizations involved in many relationships at the same time. In this paper, we define complex organizations as networked organizations involved in a market ecosystem and playing several roles simultaneously. In such a context, traditional approaches focus on the macro-analytic level of transactions; this is supplemented here with a micro-analytic study of the actors' rationale. The paper first overviews the enterprise-ontology literature to position our proposal and exposes its contributions and limitations. The ontology is then brought to an advanced level of formalization: a meta-model in the form of a UML class diagram gives an overview of the ontology concepts and their relationships, which are formally defined. Finally, the paper presents the case study on which the ontology has been validated.

  8. Efficient Simulation Modeling of an Integrated High-Level-Waste Processing Complex

    International Nuclear Information System (INIS)

    Gregory, Michael V.; Paul, Pran K.

    2000-01-01

    An integrated computational tool named the Production Planning Model (ProdMod) has been developed to simulate the operation of the entire high-level-waste complex (HLW) at the Savannah River Site (SRS) over its full life cycle. ProdMod is used to guide SRS management in operating the waste complex in an economically efficient and environmentally sound manner. SRS HLW operations are modeled using coupled algebraic equations. The dynamic nature of plant processes is modeled in the form of a linear construct in which the time dependence is implicit. Batch processes are modeled in discrete event-space, while continuous processes are modeled in time-space. The ProdMod methodology maps between event-space and time-space such that the inherent mathematical discontinuities in batch process simulation are avoided without sacrificing any of the necessary detail in the batch recipe steps. Modeling the processes separately in event- and time-space using linear constructs, and then coupling the two spaces, has accelerated the speed of simulation compared to a typical dynamic simulation. The ProdMod simulator models have been validated against operating data and other computer codes. Case studies have demonstrated the usefulness of the ProdMod simulator in developing strategies that demonstrate significant cost savings in operating the SRS HLW complex and in verifying the feasibility of newly proposed processes

  9. Modeling data irregularities and structural complexities in data envelopment analysis

    CERN Document Server

    Zhu, Joe

    2007-01-01

    In a relatively short period of time, Data Envelopment Analysis (DEA) has grown into a powerful quantitative, analytical tool for measuring and evaluating performance. It has been successfully applied to a whole variety of problems in many different contexts worldwide. This book deals with the micro aspects of handling and modeling data issues in modeling DEA problems. DEA's use has grown with its capability of dealing with complex "service industry" and the "public service domain" types of problems that require modeling of both qualitative and quantitative data. This handbook treatment deals with specific data problems including: imprecise or inaccurate data; missing data; qualitative data; outliers; undesirable outputs; quality data; statistical analysis; software and other data aspects of modeling complex DEA problems. In addition, the book will demonstrate how to visualize DEA results when the data is more than 3-dimensional, and how to identify efficiency units quickly and accurately.

  10. Filtered selection coupled with support vector machines generate a functionally relevant prediction model for colorectal cancer

    Directory of Open Access Journals (Sweden)

    Gabere MN

    2016-06-01

    Full Text Available Musa Nur Gabere,1 Mohamed Aly Hussein,1 Mohammad Azhar Aziz2 1Department of Bioinformatics, King Abdullah International Medical Research Center/King Saud bin Abdulaziz University for Health Sciences, Riyadh, Saudi Arabia; 2Colorectal Cancer Research Program, Department of Medical Genomics, King Abdullah International Medical Research Center, Riyadh, Saudi Arabia Purpose: There has been considerable interest in using whole-genome expression profiles for the classification of colorectal cancer (CRC). The selection of important features is a crucial step before training a classifier. Methods: In this study, we built a model that uses a support vector machine (SVM) to classify cancer and normal samples using Affymetrix exon microarray data obtained from 90 samples of 48 patients diagnosed with CRC. From the 22,011 genes, we selected the 20, 30, 50, 100, 200, 300, and 500 genes most relevant to CRC using the minimum-redundancy-maximum-relevance (mRMR) technique. With these gene sets, an SVM model was designed using four different kernel types (linear, polynomial, radial basis function [RBF], and sigmoid). Results: The best model, which used 30 genes and the RBF kernel, outperformed other combinations; it had an accuracy of 84% for both tenfold and leave-one-out cross-validations in discriminating the cancer samples from the normal samples. With this 30-gene set from mRMR, six classifiers were trained using random forest (RF), Bayes net (BN), multilayer perceptron (MLP), naïve Bayes (NB), reduced error pruning tree (REPT), and SVM. Two hybrids, mRMR + SVM and mRMR + BN, were the best models when tested on other datasets, and they achieved a prediction accuracy of 95.27% and 91.99%, respectively, compared to other mRMR hybrid models (mRMR + RF, mRMR + NB, mRMR + REPT, and mRMR + MLP). Ingenuity pathway analysis was used to analyze the functions of the 30 genes selected for this model and their potential association with CRC: CDH3, CEACAM7, CLDN1, IL8, IL6R, MMP1
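
    The workflow can be sketched in scikit-learn as follows. Note that scikit-learn has no built-in mRMR, so a univariate F-test selector stands in for it here, and the data are synthetic; this illustrates the pipeline shape, not the study itself.

        # Sketch of the record's pipeline: select 30 features, then train an
        # RBF-kernel SVM and estimate accuracy by 10-fold cross-validation.
        # SelectKBest (univariate F-test) is a simple substitute for mRMR.
        from sklearn.datasets import make_classification
        from sklearn.feature_selection import SelectKBest, f_classif
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        X, y = make_classification(n_samples=90, n_features=2000, n_informative=40,
                                   random_state=0)  # stand-in for exon microarray data
        model = make_pipeline(StandardScaler(),
                              SelectKBest(f_classif, k=30),
                              SVC(kernel="rbf", C=1.0, gamma="scale"))
        scores = cross_val_score(model, X, y, cv=10)
        print(f"10-fold CV accuracy: {scores.mean():.3f}")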

  11. A content relevance model for social media health information.

    Science.gov (United States)

    Prybutok, Gayle Linda; Koh, Chang; Prybutok, Victor R

    2014-04-01

    Consumer health informatics includes the development and implementation of Internet-based systems to deliver health risk management information and health intervention applications to the public. The application of consumer health informatics to educational and interventional efforts such as smoking reduction and cessation has garnered attention from both consumers and health researchers in recent years. Scientists believe that smoking avoidance or cessation before the age of 30 years can prevent more than 90% of smoking-related cancers and that individuals who stop smoking fare as well in preventing cancer as those who never start. The goal of this study was to determine factors that were most highly correlated with content relevance for health information provided on the Internet for a study group of 18- to 30-year-old college students. Data analysis showed that the opportunity for convenient entertainment, social interaction, health information-seeking behavior, time spent surfing on the Internet, the importance of available activities on the Internet (particularly e-mail), and perceived site relevance for Internet-based sources of health information were significantly correlated with content relevance for 18- to 30-year-old college students, an educated subset of this population segment.

  12. The Relevance Voxel Machine (RVoxM): A Self-Tuning Bayesian Model for Informative Image-Based Prediction

    DEFF Research Database (Denmark)

    Sabuncu, Mert R.; Van Leemput, Koen

    2012-01-01

    This paper presents the relevance voxel machine (RVoxM), a dedicated Bayesian model for making predictions based on medical imaging data. In contrast to the generic machine learning algorithms that have often been used for this purpose, the method is designed to utilize a small number of spatially...

  13. Model-based identification and use of task complexity factors of human integrated systems

    International Nuclear Information System (INIS)

    Ham, Dong-Han; Park, Jinkyun; Jung, Wondea

    2012-01-01

    Task complexity is one of the conceptual constructs that are critical to explain and predict human performance in human-integrated systems. A basic approach to evaluating the complexity of tasks is to identify task complexity factors and measure them. Although a great many task complexity factors have been studied, there is still a lack of conceptual frameworks for identifying and organizing them analytically in a way that can be used generally, irrespective of the types of domains and tasks. This study proposes a model-based approach to identifying and using task complexity factors, which has two facets: the design aspects of a task and complexity dimensions. Three levels of design abstraction, namely the functional, behavioral, and structural aspects of a task, characterize the design aspect. The behavioral aspect is further classified into five types of cognitive processing activity. The complexity dimensions explain task complexity from different perspectives: size, variety, and order/organization. Twenty-one task complexity factors are identified by combining the attributes of each facet. Identification and evaluation of task complexity factors based on this model are believed to give insights for improving the design quality of tasks. The model can also be used as a referential framework for allocating tasks and designing information aids. The proposed approach is applied to procedure-based tasks of nuclear power plants (NPPs) as a case study to demonstrate its use. Lastly, we compare the proposed approach with other studies and suggest some future research directions.

  14. Capturing complexity in work disability research: application of system dynamics modeling methodology.

    Science.gov (United States)

    Jetha, Arif; Pransky, Glenn; Hettinger, Lawrence J

    2016-01-01

    Work disability (WD) is characterized by variable and occasionally undesirable outcomes. The underlying determinants of WD outcomes include patterns of dynamic relationships among health, personal, organizational and regulatory factors that have been challenging to characterize, and inadequately represented by contemporary WD models. System dynamics modeling (SDM) methodology applies a sociotechnical systems thinking lens to view WD systems as comprising a range of influential factors linked by feedback relationships. SDM can potentially overcome limitations in contemporary WD models by uncovering causal feedback relationships, and conceptualizing dynamic system behaviors. It employs a collaborative and stakeholder-based model building methodology to create a visual depiction of the system as a whole. SDM can also enable researchers to run dynamic simulations to provide evidence of anticipated or unanticipated outcomes that could result from policy and programmatic intervention. SDM may advance rehabilitation research by providing greater insights into the structure and dynamics of WD systems while helping to understand inherent complexity. Challenges related to data availability, determining validity, and the extensive time and technical skill requirements for model building may limit SDM's use in the field and should be considered. Contemporary work disability (WD) models provide limited insight into complexity associated with WD processes. System dynamics modeling (SDM) has the potential to capture complexity through a stakeholder-based approach that generates a simulation model consisting of multiple feedback loops. SDM may enable WD researchers and practitioners to understand the structure and behavior of the WD system as a whole, and inform development of improved strategies to manage straightforward and complex WD cases.

  15. Low-complexity Behavioral Model for Predictive Maintenance of Railway Turnouts

    DEFF Research Database (Denmark)

    Barkhordari, Pegah; Galeazzi, Roberto; Tejada, Alejandro de Miguel

    2017-01-01

    Maintenance of railway infrastructures represents a major cost driver for any infrastructure manager, since reliability and dependability must be guaranteed at all times. Implementation of predictive maintenance policies relies on the availability of condition monitoring systems able to assess the infrastructure health state. The core of any condition monitoring system is the a-priori knowledge about the process to be monitored, in the form of either mathematical models of different complexity or signal features characterizing the healthy/faulty behavior. This study investigates the identification ... together with the Eigensystem Realization Algorithm – a type of subspace identification – to identify a fourth-order model of the infrastructure. The robustness and predictive capability of the low-complexity behavioral model to reproduce track responses under different types of train excitations have been ...
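
    The Eigensystem Realization Algorithm itself is compact: stack impulse-response (Markov) parameters into a Hankel matrix, truncate its SVD at the desired order, and read off a state-space realization. Below is a minimal SISO numpy sketch on synthetic data, not the study's track measurements.

        # Minimal SISO Eigensystem Realization Algorithm (ERA): form Hankel
        # matrices from impulse-response samples, truncate the SVD at order n,
        # and recover a discrete-time realization (A, B, C). Synthetic data.
        import numpy as np

        def era(h, n, rows=20, cols=20):
            H0 = np.array([[h[i + j + 1] for j in range(cols)] for i in range(rows)])
            H1 = np.array([[h[i + j + 2] for j in range(cols)] for i in range(rows)])
            U, s, Vt = np.linalg.svd(H0)
            U, S, Vt = U[:, :n], np.diag(np.sqrt(s[:n])), Vt[:n, :]
            Sinv = np.diag(1.0 / np.sqrt(s[:n]))
            A = Sinv @ U.T @ H1 @ Vt.T @ Sinv
            B = (S @ Vt)[:, :1]
            C = (U @ S)[:1, :]
            return A, B, C

        # synthetic fourth-order impulse response (two damped oscillatory modes)
        t = np.arange(60)
        h = 0.8**t * np.cos(0.9 * t) + 0.6 * 0.9**t * np.cos(0.3 * t)
        A, B, C = era(h, n=4)
        # identified pole magnitudes should recover the decay rates 0.8 and 0.9
        print(np.abs(np.linalg.eigvals(A)))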

  16. Tuberous sclerosis complex surveillance and management: recommendations of the 2012 International Tuberous Sclerosis Complex Consensus Conference.

    Science.gov (United States)

    Krueger, Darcy A; Northrup, Hope

    2013-10-01

    Tuberous sclerosis complex is a genetic disorder affecting every organ system, but disease manifestations vary significantly among affected individuals. The diverse and varied presentations and progression can be life-threatening, with significant impact on cost and quality of life. Current surveillance and management practices are highly variable across regions and countries, reflecting the fact that the last consensus recommendations were issued in 1998 and that an updated, comprehensive standard incorporating the latest scientific evidence and current best clinical practices is lacking. The 2012 International Tuberous Sclerosis Complex Consensus Group, comprising 79 specialists from 14 countries, was organized into 12 separate subcommittees, each led by a clinician with advanced expertise in tuberous sclerosis complex and the relevant medical subspecialty. Each subcommittee focused on a specific disease area with important clinical management implications and was charged with formulating key clinical questions to address within its focus area, reviewing relevant literature, evaluating the strength of data, and providing a recommendation accordingly. The updated consensus recommendations for clinical surveillance and management in tuberous sclerosis complex are summarized here. The recommendations are relevant to the entire lifespan of the patient, from infancy to adulthood, including both individuals in whom the diagnosis is newly made and individuals in whom the diagnosis is already established. The 2012 International Tuberous Sclerosis Complex Consensus Recommendations provide an evidence-based, standardized approach for optimal clinical care of individuals with tuberous sclerosis complex. Copyright © 2013 The Authors. Published by Elsevier Inc. All rights reserved.

  17. Modelling the self-organization and collapse of complex networks

    Indian Academy of Sciences (India)

    Modelling the self-organization and collapse of complex networks. Sanjay Jain Department of Physics and Astrophysics, University of Delhi Jawaharlal Nehru Centre for Advanced Scientific Research, Bangalore Santa Fe Institute, Santa Fe, New Mexico.

  18. Segmentation of Image Data from Complex Organotypic 3D Models of Cancer Tissues with Markov Random Fields.

    Science.gov (United States)

    Robinson, Sean; Guyon, Laurent; Nevalainen, Jaakko; Toriseva, Mervi; Åkerfelt, Malin; Nees, Matthias

    2015-01-01

    Organotypic, three dimensional (3D) cell culture models of epithelial tumour types such as prostate cancer recapitulate key aspects of the architecture and histology of solid cancers. Morphometric analysis of multicellular 3D organoids is particularly important when additional components such as the extracellular matrix and tumour microenvironment are included in the model. The complexity of such models has so far limited their successful implementation. There is a great need for automatic, accurate and robust image segmentation tools to facilitate the analysis of such biologically relevant 3D cell culture models. We present a segmentation method based on Markov random fields (MRFs) and illustrate our method using 3D stack image data from an organotypic 3D model of prostate cancer cells co-cultured with cancer-associated fibroblasts (CAFs). The 3D segmentation output suggests that these cell types are in physical contact with each other within the model, which has important implications for tumour biology. Segmentation performance is quantified using ground truth labels and we show how each step of our method increases segmentation accuracy. We provide the ground truth labels along with the image data and code. Using independent image data we show that our segmentation method is also more generally applicable to other types of cellular microscopy and not only limited to fluorescence microscopy.
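
    The MRF idea can be illustrated with a tiny two-class example: a Gaussian data term plus a Potts smoothness prior over 4-neighbours, minimized by iterated conditional modes (ICM). The sketch below uses synthetic data and is far simpler than the paper's method.

        # Tiny 2D, two-class MRF segmentation via iterated conditional modes.
        import numpy as np

        rng = np.random.default_rng(0)
        truth = np.zeros((40, 40), int)
        truth[10:30, 10:30] = 1                      # a square "organoid"
        image = truth + rng.normal(0, 0.6, truth.shape)

        mu, sigma, beta = np.array([0.0, 1.0]), 0.6, 1.5
        labels = (image > 0.5).astype(int)           # initial guess by thresholding

        for _ in range(10):                          # ICM sweeps
            for i in range(40):
                for j in range(40):
                    costs = []
                    for k in (0, 1):
                        data = (image[i, j] - mu[k]) ** 2 / (2 * sigma**2)
                        nbrs = [labels[x, y] for x, y in
                                ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                                if 0 <= x < 40 and 0 <= y < 40]
                        smooth = beta * sum(k != n for n in nbrs)  # Potts prior
                        costs.append(data + smooth)
                    labels[i, j] = int(np.argmin(costs))
        print("pixel accuracy:", (labels == truth).mean())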

  20. From complex to simple: interdisciplinary stochastic models

    International Nuclear Information System (INIS)

    Mazilu, D A; Zamora, G; Mazilu, I

    2012-01-01

    We present two simple, one-dimensional, stochastic models that lead to a qualitative understanding of very complex systems from biology, nanoscience and social sciences. The first model explains the complicated dynamics of microtubules, stochastic cellular highways. Using the theory of random walks in one dimension, we find analytical expressions for certain physical quantities, such as the time dependence of the length of the microtubules, and diffusion coefficients. The second one is a stochastic adsorption model with applications in surface deposition, epidemics and voter systems. We introduce the ‘empty interval method’ and show sample calculations for the time-dependent particle density. These models can serve as an introduction to the field of non-equilibrium statistical physics, and can also be used as a pedagogical tool to exemplify standard statistical physics concepts, such as random walks or the kinetic approach of the master equation. (paper)
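
    The random-walk core of the first model can be sketched in a few lines: a biased ±1 walk whose mean drifts roughly as (2p-1)t and whose variance grows diffusively, roughly as 4p(1-p)t, ignoring the reflecting boundary at zero. Rates are illustrative, not the paper's.

        # Microtubule length as a biased 1D random walk (growth probability p,
        # shrinkage 1 - p); parameters are illustrative only.
        import random

        random.seed(0)
        p, steps, walkers = 0.55, 1000, 2000
        finals = []
        for _ in range(walkers):
            length = 0
            for _ in range(steps):
                length += 1 if random.random() < p else -1
                length = max(length, 0)      # a microtubule length cannot go negative
            finals.append(length)

        mean = sum(finals) / walkers
        var = sum((x - mean) ** 2 for x in finals) / walkers
        print(f"mean length {mean:.1f} (drift predicts ~{(2*p-1)*steps:.1f})")
        print(f"variance {var:.1f} (diffusion predicts ~{4*p*(1-p)*steps:.1f})")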

  1. Scintillometry in urban and complex environments: a review

    International Nuclear Information System (INIS)

    Ward, Helen C

    2017-01-01

    Knowledge of turbulent exchange in complex environments is relevant to a wide range of hydro-meteorological applications. Observations are required to improve understanding and inform model parameterisations but the very nature of complex environments presents challenges for measurements. Scintillometry offers several advantages as a technique for providing spatially-integrated turbulence data (structure parameters and fluxes), particularly in areas that would be impracticable to monitor using eddy covariance, such as across a valley, above a city or over heterogeneous landscapes. Despite much of scintillometry theory assuming flat, homogeneous surfaces and ideal conditions, over the last 20 years scintillometers have been deployed in increasingly complex locations, including urban and mountainous areas. This review draws together fundamental and applied research in complex environments, to assess what has been learnt, summarise the state-of-the-art and identify key areas for future research. Particular attention is given to evidence, or relative lack thereof, of the impact of complex environments on scintillometer data. Practical and theoretical considerations to account for the effects of complexity are discussed, with the aim of developing measurement capability towards more reliable and accurate observations in future. The usefulness of structure parameter measurements (in addition to fluxes, which must be derived using similarity theory) should not be overlooked, particularly when comparing or combining scintillometry with other measurement techniques and model simulations. (paper)

  2. Real-Time Emulation of Nonstationary Channels in Safety-Relevant Vehicular Scenarios

    Directory of Open Access Journals (Sweden)

    Golsa Ghiaasi

    2018-01-01

    Full Text Available This paper proposes and discusses the architecture for a real-time vehicular channel emulator capable of reproducing the input/output behavior of nonstationary time-variant radio propagation channels in safety-relevant vehicular scenarios. The architecture aims at a hardware implementation requiring minimal hardware complexity for emulating channels with the varying delay-Doppler characteristics of safety-relevant vehicular scenarios; these characteristics require real-time updates of the multipath propagation model for each local stationarity region. The vehicular channel emulator is used for benchmarking the packet error performance of commercial off-the-shelf (COTS) vehicular IEEE 802.11p modems and a fully software-defined-radio-based IEEE 802.11p modem stack. The packet error ratio (PER), estimated from temporal averaging over a single virtual drive, and the packet error probability (PEP), estimated from ensemble averaging over repeated virtual drives, are evaluated and compared for the same vehicular scenario. The proposed architecture is realized as a virtual instrument in National Instruments™ LabVIEW. The National Instruments universal software radio peripheral with reconfigurable input/output (USRP-Rio 2953R) is used as the software-defined radio platform for the implementation; however, the results and considerations reported are general and can be applied to other platforms. Finally, we discuss the PER performance of the modem for two categories of vehicular channel models: a nonstationary vehicular channel model derived from the urban single-lane street-crossing scenario of the DRIVEWAY’09 measurement campaign, and the stationary ETSI models.
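
    The PER/PEP distinction can be illustrated with a toy two-state (good/bad) Markov packet-error channel: PER is a temporal average over one virtual drive, while PEP is an ensemble average across repeated drives at a fixed packet index. All parameters below are invented for illustration.

        # Toy illustration of PER (temporal average, one drive) versus PEP
        # (ensemble average over repeated drives) on a Gilbert-Elliott-style
        # two-state packet-error channel; parameters are illustrative.
        import random

        random.seed(42)
        p_gb, p_bg = 0.02, 0.2          # good->bad and bad->good transition probs
        pe = {"good": 0.01, "bad": 0.5}  # packet error probability per state

        def drive(n_packets=500):
            s, errors = "good", []
            for _ in range(n_packets):
                if s == "good" and random.random() < p_gb: s = "bad"
                elif s == "bad" and random.random() < p_bg: s = "good"
                errors.append(random.random() < pe[s])
            return errors

        single = drive()
        per = sum(single) / len(single)              # temporal average, one drive
        ensemble = [drive() for _ in range(200)]     # repeated virtual drives
        pep_at_100 = sum(d[100] for d in ensemble) / len(ensemble)
        print(f"PER (one drive): {per:.3f}; PEP at packet 100: {pep_at_100:.3f}")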

  3. Mathematical modeling of complexing in the scandium-salicylic acid-isoamyl alcohol system

    International Nuclear Information System (INIS)

    Evseev, A.M.; Smirnova, N.S.; Fadeeva, V.I.; Tikhomirova, T.I.; Kir'yanov, Yu.A.

    1984-01-01

    Mathematical modeling of an equilibrium multicomponent physicochemical system for the extraction of Sc salicylate complexes by isoamyl alcohol was conducted. To calculate the equilibrium concentrations of Sc complexes differing in content and composition, the system of nonlinear algebraic mass-balance equations was solved. Experimental data on the extraction of Sc salicylates by isoamyl alcohol versus the pH of the solution, at a constant Sc concentration and different concentrations of salicylate ions, were used for construction of the mathematical model. The stability constants of the ScHSal2+, Sc(HSal)3, ScOH(HSal)2 and ScOH(HSal)2 complexes were calculated.
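
    The mass-balance approach reduces to solving a small nonlinear system. Below is a hedged SciPy sketch for a single hypothetical 1:1 complex; the stability constant and total concentrations are illustrative, not the paper's values.

        # Solve for free [Sc] and free ligand [L] given total concentrations and
        # a stability constant, for a single 1:1 complex ScL with
        # beta = [ScL] / ([Sc][L]). Illustrative constants only.
        from scipy.optimize import fsolve

        beta, Sc_total, L_total = 1.0e4, 1.0e-3, 5.0e-3

        def balances(x):
            sc, l = x
            complex_ = beta * sc * l
            return (sc + complex_ - Sc_total,    # scandium mass balance
                    l + complex_ - L_total)      # ligand mass balance

        sc_free, l_free = fsolve(balances, x0=(Sc_total, L_total))
        print(f"free Sc: {sc_free:.3e} M, free L: {l_free:.3e} M, "
              f"complex: {beta * sc_free * l_free:.3e} M")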

  4. Promoting culturally competent chronic pain management using the clinically relevant continuum model.

    Science.gov (United States)

    Monsivais, Diane B

    2011-06-01

    This article reviews the culture of biomedicine and current practices in pain management education, which often merge to create a hostile environment for effective chronic pain care. Areas of cultural tensions in chronic pain frequently involve the struggle to achieve credibility regarding one's complaints of pain (or being believed that the pain is real) and complying with pain medication protocols. The clinically relevant continuum model is presented as a framework allowing providers to approach care from an evidence-based, culturally appropriate (patient centered) perspective that takes into account the highest level of evidence available, provider expertise, and patient preferences and values. Copyright © 2011 Elsevier Inc. All rights reserved.

  5. Kolmogorov complexity, pseudorandom generators and statistical models testing

    Czech Academy of Sciences Publication Activity Database

    Šindelář, Jan; Boček, Pavel

    2002-01-01

    Roč. 38, č. 6 (2002), s. 747-759 ISSN 0023-5954 R&D Projects: GA ČR GA102/99/1564 Institutional research plan: CEZ:AV0Z1075907 Keywords : Kolmogorov complexity * pseudorandom generators * statistical models testing Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.341, year: 2002

  6. Achieving Complex Learning Outcomes through Adoption of a Pedagogical Perspective: A Model for Computer Technology Delivered Instruction

    Science.gov (United States)

    Bellard, Breshanica

    2018-01-01

    Professionals responsible for the delivery of education and training using technology systems and platforms can facilitate complex learning through the application of relevant strategies, principles and theories that support how learners learn and how curricula should be designed in technology-based learning environments. Technological…

  7. Coupled economic-ecological models for ecosystem-based fishery management: Exploration of trade-offs between model complexity and management needs

    DEFF Research Database (Denmark)

    Thunberg, Eric; Holland, Dan; Nielsen, J. Rasmus

    2012-01-01

    Ecosystem-based fishery management has moved beyond rhetorical statements calling for a more holistic approach to resource management, to implementing decisions on resource use that are compatible with goals of maintaining ecosystem health and resilience. Coupled economic-ecological models are a primary tool for informing these decisions. Recognizing the importance of these models, the International Council for the Exploration of the Sea (ICES) formed a Study Group on Integration of Economics, Stock Assessment and Fisheries Management (SGIMM) to explore alternative modelling approaches. Because economic and ecological systems are inherently complex, models are abstractions of these systems, incorporating varying levels of complexity depending on available data and the management issues to be addressed. The objective of this special session was to assess the pros and cons of increasing model complexity

  8. GEOQUIMICO : an interactive tool for comparing sorption conceptual models (surface complexation modeling versus K[D])

    International Nuclear Information System (INIS)

    Hammond, Glenn E.; Cygan, Randall Timothy

    2007-01-01

    Within reactive geochemical transport, several conceptual models exist for simulating sorption processes in the subsurface. Historically, the K_D approach has been the method of choice due to its ease of implementation within a reactive transport model and straightforward comparison with experimental data. However, for modeling complex sorption phenomena (e.g. sorption of radionuclides onto mineral surfaces), this approach does not systematically account for variations in location, time, or chemical conditions, and more sophisticated methods such as a surface complexation model (SCM) must be utilized. It is critical to determine which conceptual model to use, that is, when the material variation becomes important to regulatory decisions. The geochemical transport tool GEOQUIMICO has been developed to assist in this decision-making process. GEOQUIMICO provides a user-friendly framework for comparing the accuracy and performance of sorption conceptual models. The tool currently supports the K_D and SCM conceptual models. The code is written in the object-oriented Java programming language to facilitate model development and improve code portability. The basic theory underlying geochemical transport and the sorption conceptual models noted above is presented in this report. Explanations are provided of how these physicochemical processes are implemented in GEOQUIMICO, and a brief verification study comparing GEOQUIMICO results to data found in the literature is given
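
    The conceptual difference can be seen in a few lines: the K_D model is linear in aqueous concentration, whereas a Langmuir-type isotherm, used here as a simple stand-in for one limiting behaviour of surface complexation (which in full also depends on pH and solution chemistry), saturates as surface sites fill. All constants are illustrative.

        # Contrast of the two sorption conceptual models at equilibrium.
        K_D = 2.0                # linear distribution coefficient (L/kg)
        q_max, K_L = 1.0, 5.0    # site capacity (mol/kg), Langmuir constant (L/mol)

        for c in (0.01, 0.1, 0.5, 1.0, 5.0):   # aqueous concentration (mol/L)
            q_kd = K_D * c                               # linear in concentration
            q_langmuir = q_max * K_L * c / (1.0 + K_L * c)  # saturates at q_max
            print(f"C={c:5.2f}  q_KD={q_kd:6.2f}  q_SCM-like={q_langmuir:6.3f}")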

  9. A coupled mass transfer and surface complexation model for uranium (VI) removal from wastewaters

    International Nuclear Information System (INIS)

    Lenhart, J.; Figueroa, L.A.; Honeyman, B.D.

    1994-01-01

    A remediation technique has been developed for removing uranium (VI) from complex contaminated groundwater using flake chitin as a biosorbent in batch and continuous flow configurations. With this system, U(VI) removal efficiency can be predicted using a model that integrates surface complexation models, mass transport limitations and sorption kinetics. This integration allows the reactor model to predict removal efficiencies for complex groundwaters with variable U(VI) concentrations and other constituents. The system has been validated using laboratory-derived kinetic data in batch and CSTR systems to verify the model predictions of U(VI) uptake from simulated contaminated groundwater

  10. Silk-polypyrrole biocompatible actuator performance under biologically relevant conditions

    Science.gov (United States)

    Hagler, Jo'elen; Peterson, Ben; Murphy, Amanda; Leger, Janelle

    Biocompatible actuators that are capable of controlled movement and can function under biologically relevant conditions are of significant interest in biomedical fields. Previously, we have demonstrated that a composite material of silk biopolymer and the conducting polymer polypyrrole (PPy) can be formed into a bilayer device that bends under an applied voltage. Further, these silk-PPy composites can generate forces comparable to human muscle (>0.1 MPa), making them ideal candidates for interfacing with biological tissues. Here, silk-PPy composite films are tested for performance under biologically relevant conditions, including exposure to a complex protein serum and biologically relevant temperatures. Free-end bending actuation performance, current response, force generation and mass degradation were investigated. Preliminary results show that when exposed to proteins and biologically relevant temperatures, these silk-PPy composites show minimal degradation and are able to generate forces and conduct currents comparable to devices tested under standard conditions. Supported by NSF.

  11. A Framework for Modeling and Analyzing Complex Distributed Systems

    National Research Council Canada - National Science Library

    Lynch, Nancy A; Shvartsman, Alex Allister

    2005-01-01

    Report developed under STTR contract for topic AF04-T023. This Phase I project developed a modeling language and laid a foundation for computational support tools for specifying, analyzing, and verifying complex distributed system designs...

  12. Numerical Modeling of Fluid-Structure Interaction with Rheologically Complex Fluids

    OpenAIRE

    Chen, Xingyuan

    2014-01-01

    In the present work the interaction between rheologically complex fluids and elastic solids is studied by means of numerical modeling. The investigated complex fluids are non-Newtonian viscoelastic fluids. Fluid-structure interaction (FSI) of this kind is frequently encountered in injection molding, food processing, pharmaceutical engineering and biomedicine. Investigation via experiments is costly, difficult or, in some cases, even impossible. Therefore, research is increasingly aided...

  13. Modelling the dynamics of the health-production complex in livestock herds

    DEFF Research Database (Denmark)

    Sørensen, J.T.; Enevoldsen, Carsten

    1992-01-01

    This paper reviews how the dynamics of the health-production complex in livestock herds is mimicked by livestock herd simulation models. Twelve models simulating the dynamics of dairy, beef, sheep and sow herds were examined. All models basically included options to alter input and output...

  14. DETERMINATION OF RELEVANT FEATURES OF A SCALE MODEL FOR A 55 000 DWT BULK CARRIER NECESSARY TO STUDY THE SHIP MANEUVERABILITY

    Directory of Open Access Journals (Sweden)

    ALECU TOMA

    2016-06-01

    Full Text Available The study of ship behavior based on practical tests on scale models is widely used by scientists, engineers, naval architects and researchers in the naval field. In this paper we propose to determine the ship handling characteristics relevant to the study of a 55,000 dwt bulk carrier using a scale model. The scientific background for the practical experimentation techniques needed to build a scale model ship consists in applying the principles of similarity, or "similitude". The scale model obtained by applying the laws of similarity must allow, through approximations valid in certain circumstances, the determination of the relevant parameters needed to simplify and solve the Navier-Stokes equations. These parameters are necessary for modeling the interaction between the hull of the real ship and the fluid motion.
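
    Manoeuvring tests on scale models are normally run under Froude similarity, so speeds scale with the square root of the geometric scale. A short sketch with illustrative dimensions (not the paper's actual model particulars) for a 55,000 dwt bulk carrier:

        # Froude similarity: the model runs so that Fr = V / sqrt(g * L) matches
        # the ship's, hence V_model = V_ship / sqrt(lambda) for geometric scale
        # lambda. Ship dimensions and speed below are illustrative.
        from math import sqrt

        g = 9.81
        L_ship, V_ship = 190.0, 7.2   # length (m) and service speed (m/s, ~14 kn)
        scale = 50.0                  # lambda = L_ship / L_model

        L_model = L_ship / scale
        V_model = V_ship / sqrt(scale)
        Fr = V_ship / sqrt(g * L_ship)
        print(f"model length {L_model:.2f} m, model speed {V_model:.3f} m/s, "
              f"Froude number {Fr:.3f} (identical for ship and model)")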

  15. Modeling Stochastic Complexity in Complex Adaptive Systems: Non-Kolmogorov Probability and the Process Algebra Approach.

    Science.gov (United States)

    Sulis, William H

    2017-10-01

    Walter Freeman III pioneered the application of nonlinear dynamical systems theories and methodologies in his work on mesoscopic brain dynamics. Sadly, mainstream psychology and psychiatry still cling to linear, correlation-based data analysis techniques, which threaten to subvert the process of experimentation and theory building. In order to progress, it is necessary to develop tools capable of managing the stochastic complexity of complex biopsychosocial systems, which includes multilevel feedback relationships, nonlinear interactions, chaotic dynamics and adaptability. In addition, however, these systems exhibit intrinsic randomness, non-Gaussian probability distributions, non-stationarity, contextuality, and non-Kolmogorov probabilities, as well as the absence of mean and/or variance and conditional probabilities. These properties and their implications for statistical analysis are discussed. An alternative approach, the Process Algebra approach, is described. It is a generative model, capable of generating non-Kolmogorov probabilities. It has proven useful in addressing fundamental problems in quantum mechanics and in the modeling of developing psychosocial systems.

  16. Modeling the surface tension of complex, reactive organic-inorganic mixtures

    Science.gov (United States)

    Schwier, A. N.; Viglione, G. A.; Li, Z.; McNeill, V. Faye

    2013-11-01

    Atmospheric aerosols can contain thousands of organic compounds which impact aerosol surface tension, affecting aerosol properties such as heterogeneous reactivity, ice nucleation, and cloud droplet formation. We present new experimental data for the surface tension of complex, reactive organic-inorganic aqueous mixtures mimicking tropospheric aerosols. Each solution contained 2-6 organic compounds, including methylglyoxal, glyoxal, formaldehyde, acetaldehyde, oxalic acid, succinic acid, leucine, alanine, glycine, and serine, with and without ammonium sulfate. We test two semi-empirical surface tension models and find that most reactive, complex, aqueous organic mixtures which do not contain salt are well described by a weighted Szyszkowski-Langmuir (S-L) model which was first presented by Henning et al. (2005). Two approaches for modeling the effects of salt were tested: (1) the Tuckermann approach (an extension of the Henning model with an additional explicit salt term), and (2) a new implicit method proposed here which employs experimental surface tension data obtained for each organic species in the presence of salt used with the Henning model. We recommend the use of method (2) for surface tension modeling of aerosol systems because the Henning model (using data obtained from organic-inorganic systems) and Tuckermann approach provide similar modeling results and goodness-of-fit (χ2) values, yet the Henning model is a simpler and more physical approach to modeling the effects of salt, requiring less empirically determined parameters.
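
    For reference, the Szyszkowski-Langmuir form that such fits build on is, in its standard textbook version, given below; the parameter names here are generic and are not necessarily the paper's notation.

        % Standard Szyszkowski-Langmuir surface tension equation:
        % sigma_w: solvent surface tension, Gamma_max: maximum surface excess,
        % b: adsorption coefficient, C: bulk concentration.
        \sigma(C) = \sigma_w - R\,T\,\Gamma_{\max}\,\ln\!\left(1 + \frac{C}{b}\right)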

  17. The Integrin Receptor in Biologically Relevant Bilayers

    DEFF Research Database (Denmark)

    Kalli, Antreas C.; Róg, Tomasz; Vattulainen, Ilpo

    2017-01-01

    The integrin/talin complex was inserted in biologically relevant bilayers that resemble the cell plasma membrane, containing zwitterionic and charged phospholipids, cholesterol and sphingolipids, to study the dynamics of the integrin receptor and its effect on bilayer structure and dynamics. The results of this study demonstrate the dynamic nature of the integrin receptor and suggest that the presence of the integrin receptor alters the lipid organization between the two leaflets of the bilayer. In particular, our results suggest an elevated density of cholesterol and of phosphatidylserine lipids around the integrin/talin complex, and a slowing down of lipids in an annulus of ~30 Å around the protein due to interactions between the lipids and the integrin/talin F2–F3 complex. This may in part regulate the interactions of integrins with other related proteins or integrin clustering, thus facilitating signal transduction...

  18. Translational research in immune senescence: Assessing the relevance of current models

    Science.gov (United States)

    High, Kevin P.; Akbar, Arne N.; Nikolich-Zugich, Janko

    2014-01-01

    Advancing age is accompanied by profound changes in immune function; some are induced by the loss of critical niches that support development of naïve cells (e.g. thymic involution), others by the intrinsic physiology of long-lived cells attempting to maintain homeostasis, still others by extrinsic effects such as oxidative stress or long-term exposure to antigen due to persistent viral infections. Once compensatory mechanisms can no longer maintain a youthful phenotype the end result is the immune senescent milieu – one characterized by chronic, low grade, systemic inflammation and impaired responses to immune challenge, particularly when encountering new antigens. This state is associated with progression of chronic illnesses like atherosclerosis and dementia, and an increased risk of acute illness, disability and death in older adults. The complex interaction between immune senescence and chronic illness provides an ideal landscape for translational research with the potential to greatly affect human health. However, current animal models and even human investigative strategies for immune senescence have marked limitations, and the reductionist paradigm itself may be poorly suited to meet these challenges. A new paradigm, one that embraces complexity as a core feature of research in older adults is required to address the critical health issues facing the burgeoning senior population, the group that consumes the majority of healthcare resources. In this review, we outline the major advantages and limitations of current models and offer suggestions for how to move forward. PMID:22633440

  19. A comprehensive model of anaerobic bioconversion of complex substrates to biogas

    DEFF Research Database (Denmark)

    Angelidaki, Irini; Ellegaard, Lars; Ahring, Birgitte Kiær

    1999-01-01

    A dynamic model describing the anaerobic degradation of complex material, and codigestion of different types of wastes, was developed based on a model previously described (Angelidaki et al., 1993). In the model, the substrate is described by its composition of basic organic components, i.e., car...

  20. On the ""early-time"" evolution of variables relevant to turbulence models for the Rayleigh-Taylor instability

    Energy Technology Data Exchange (ETDEWEB)

    Rollin, Bertrand [Los Alamos National Laboratory; Andrews, Malcolm J [Los Alamos National Laboratory

    2010-01-01

    We present our progress toward setting initial conditions in variable density turbulence models. In particular, we concentrate our efforts on the BHR turbulence model for the turbulent Rayleigh-Taylor instability. Our approach is to predict profiles of the relevant variables before the flow becomes fully turbulent and to use them as initial conditions for the turbulence model. We use an idealized model of mixing between two interpenetrating fluids to define the initial profiles for the turbulence model variables. Velocities and volume fractions used in the idealized mixing model are obtained, respectively, from a set of ordinary differential equations modeling the growth of the Rayleigh-Taylor instability and from an idealization of the density profile in the mixing layer. A comparison between the predicted profiles for the turbulence model variables and profiles of the variables obtained from low-Atwood-number three-dimensional simulations shows reasonable agreement.
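
    The ordinary differential equations mentioned above can be illustrated, in much reduced form, by the classical self-similar growth law for the Rayleigh-Taylor mixing-layer half-width, h = alpha*A*g*t^2, recast as an ODE and integrated numerically. This is a hedged sketch of the idea, not the BHR initialization itself; alpha, the Atwood number A and the initial amplitude below are illustrative values.

        import numpy as np

        # Self-similar RT growth h = alpha*A*g*t^2, equivalently
        # dh/dt = 2*sqrt(alpha*A*g*h), integrated with forward Euler.
        alpha, A, g = 0.06, 0.5, 9.81   # growth constant, Atwood number, gravity
        h, t, dt = 1e-4, 0.0, 1e-3      # initial amplitude (m), time (s), step (s)

        while t < 2.0:
            h += 2.0 * np.sqrt(alpha * A * g * h) * dt
            t += dt

        print(f"mixing-layer half-width at t = {t:.2f} s: {h:.4f} m")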

  1. Interaction of dermatologically relevant nanoparticles with skin cells and skin

    Directory of Open Access Journals (Sweden)

    Annika Vogt

    2014-12-01

    Full Text Available The investigation of nanoparticle interactions with tissues is complex. High levels of standardization, ideally testing of different material types in the same biological model, and combinations of sensitive imaging and detection methods are required. Here, we present our studies on nanoparticle interactions with skin, skin cells, and biological media. Silica, titanium dioxide and silver particles were chosen as representative examples for different types of skin exposure to nanomaterials, e.g., unintended environmental exposure (silica) versus intended exposure through application of sunscreen (titanium dioxide) or antiseptics (silver). Because each particle type exhibits specific physicochemical properties, we were able to apply different combinations of methods to examine skin penetration and cellular uptake, including optical microscopy, electron microscopy, X-ray microscopy on cells and tissue sections, flow cytometry of isolated skin cells as well as Raman microscopy on whole tissue blocks. In order to assess the biological relevance of such findings, cell viability and free radical production were monitored on cells and in whole tissue samples. The combination of technologies and the joint discussion of results enabled us to look at nanoparticle–skin interactions and the biological relevance of our findings from different angles.

  2. Socio-Environmental Resilience and Complex Urban Systems Modeling

    Science.gov (United States)

    Deal, Brian; Petri, Aaron; Pan, Haozhi; Goldenberg, Romain; Kalantari, Zahra; Cvetkovic, Vladimir

    2017-04-01

    The increasing pressure of climate change has inspired two normative agendas: socio-technical transitions and socio-ecological resilience, both sharing a complex-systems epistemology (Gillard et al. 2016). Socio-technical solutions include a continuous, massive data gathering exercise now underway in urban places under the guise of developing a 'smart'(er) city. This has led to the creation of data-rich environments where large data sets have become central to monitoring and forming a response to anomalies. Some have argued that these kinds of data sets can help in planning for resilient cities (Norberg and Cumming 2008; Batty 2013). In this paper, we focus on a more nuanced, ecologically based, socio-environmental perspective of resilience planning that is often given less consideration. Here, we broadly discuss (and model) the tightly linked, mutually influenced, social and biophysical subsystems that are critical for understanding urban resilience. We argue for the need to incorporate these subsystem linkages into the resilience planning lexicon through the integration of systems models and planning support systems. We make our case by first providing a context for urban resilience from a socio-ecological and planning perspective. We highlight the data needs for this type of resilience planning and compare them to currently collected data streams in various smart city efforts. This helps to define an approach for operationalizing socio-environmental resilience planning using robust systems models and planning support systems. For this, we draw from our experiences in coupling a spatio-temporal land use model (the Landuse Evolution and impact Assessment Model (LEAM)) with water quality and quantity models in Stockholm, Sweden. We describe the coupling of these systems models using a robust Planning Support System (PSS) structural framework. We use the coupled model simulations and PSS to analyze the connection between urban land use transformation (social) and water

  3. The value relevance of environmental emissions

    Directory of Open Access Journals (Sweden)

    Melinda Lydia Nelwan

    2016-07-01

    Full Text Available This study examines whether environmental performance has value relevance by investigating the relations between environmental emissions and stock prices for U.S. public companies. Previous studies argued that the conjectured relations between accounting performance measures and environmental performance do not have a strong theoretical basis, and that the modeling of relations between market performance measures and environmental performance does not adequately consider the relevance of accounting performance to market value. Therefore, this study examines whether publicly reported environmental emissions provide incremental information to accounting earnings in pricing companies' stocks. This is done for the complete set of industries covered by Toxics Release Inventory (TRI) reporting for the period 2007 to 2010. Using the Ohlson model, modified to include different types of emissions, it is found that ground emissions (underground injection and land emissions) are value relevant, while other emission types (air and water) and transferred-out emissions appear not to provide incremental information in the valuation model. The results of this study raise concerns that different types of emissions are assessed differently by the market, confirming that studies should not aggregate such measures.
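
    The valuation design described above can be sketched as an Ohlson-style regression of price on book value and earnings per share, augmented with emission variables; incremental value relevance then shows up as a significant coefficient on an emission term. The Python fragment below uses synthetic placeholder data and illustrative variable names, not the paper's sample or exact specification.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 200
        bvps = rng.uniform(5, 50, n)        # book value per share
        eps = rng.uniform(0.1, 5, n)        # earnings per share
        ground = rng.exponential(1.0, n)    # ground emissions (scaled)
        airwater = rng.exponential(1.0, n)  # air/water emissions (scaled)
        # Synthetic prices: only ground emissions carry incremental information.
        price = 2 + 1.1 * bvps + 6 * eps - 0.8 * ground + rng.normal(0, 3, n)

        X = sm.add_constant(np.column_stack([bvps, eps, ground, airwater]))
        fit = sm.OLS(price, X).fit()
        print(fit.summary())  # inspect emission coefficients and t-statistics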

  4. A range of complex probabilistic models for RNA secondary structure prediction that includes the nearest-neighbor model and more.

    Science.gov (United States)

    Rivas, Elena; Lang, Raymond; Eddy, Sean R

    2012-02-01

    The standard approach for single-sequence RNA secondary structure prediction uses a nearest-neighbor thermodynamic model with several thousand experimentally determined energy parameters. An attractive alternative is to use statistical approaches with parameters estimated from growing databases of structural RNAs. Good results have been reported for discriminative statistical methods using complex nearest-neighbor models, including CONTRAfold, Simfold, and ContextFold. Little work has been reported on generative probabilistic models (stochastic context-free grammars [SCFGs]) of comparable complexity, although probabilistic models are generally easier to train and to use. To explore a range of probabilistic models of increasing complexity, and to directly compare probabilistic, thermodynamic, and discriminative approaches, we created TORNADO, a computational tool that can parse a wide spectrum of RNA grammar architectures (including the standard nearest-neighbor model and more) using a generalized super-grammar that can be parameterized with probabilities, energies, or arbitrary scores. By using TORNADO, we find that probabilistic nearest-neighbor models perform comparably to (but not significantly better than) discriminative methods. We find that complex statistical models are prone to overfitting RNA structure and that evaluations should use structurally nonhomologous training and test data sets. Overfitting has affected at least one published method (ContextFold). The most important barrier to improving statistical approaches for RNA secondary structure prediction is the lack of diversity of well-curated single-sequence RNA secondary structures in current RNA databases.
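
    For readers unfamiliar with single-sequence structure prediction, the dynamic-programming skeleton that thermodynamic, probabilistic and discriminative methods all elaborate can be illustrated with the far simpler Nussinov base-pair maximization algorithm. The sketch below is not TORNADO or a nearest-neighbor model; it merely shows the recursion shape that grammar-based parsers generalize.

        # Nussinov-style base-pair maximization over an RNA sequence.
        PAIRS = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"),
                 ("G", "U"), ("U", "G")}

        def nussinov(seq, min_loop=3):
            """Maximum number of nested base pairs, with hairpin loops
            of at least min_loop unpaired bases."""
            n = len(seq)
            dp = [[0] * n for _ in range(n)]
            for span in range(min_loop + 1, n):
                for i in range(n - span):
                    j = i + span
                    best = dp[i + 1][j]  # i left unpaired
                    if (seq[i], seq[j]) in PAIRS:
                        best = max(best, dp[i + 1][j - 1] + 1)  # pair i with j
                    for k in range(i + 1, j):  # bifurcation into two subproblems
                        best = max(best, dp[i][k] + dp[k + 1][j])
                    dp[i][j] = best
            return dp[0][n - 1]

        print(nussinov("GGGAAAUCCC"))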

  5. The Semireduced Mechanism for Nitric Oxide Reduction by Non-Heme Diiron Complexes: Modeling Flavodiiron Nitric Oxide Reductases.

    Science.gov (United States)

    White, Corey J; Speelman, Amy L; Kupper, Claudia; Demeshko, Serhiy; Meyer, Franc; Shanahan, James P; Alp, E Ercan; Hu, Michael; Zhao, Jiyong; Lehnert, Nicolai

    2018-02-21

    Flavodiiron nitric oxide reductases (FNORs) are a subclass of flavodiiron proteins (FDPs) capable of preferential binding and subsequent reduction of NO to N₂O. FNORs are found in certain pathogenic bacteria, equipping them with resistance to nitrosative stress, generated as a part of the immune defense in humans, and allowing them to proliferate. Here, we report the spectroscopic characterization and detailed reactivity studies of the diiron dinitrosyl model complex [Fe₂(BPMP)(OPr)(NO)₂](OTf)₂ for the FNOR active site that is capable of reducing NO to N₂O [Zheng et al., J. Am. Chem. Soc. 2013, 135, 4902-4905]. Using UV-vis spectroscopy, cyclic voltammetry, and spectro-electrochemistry, we show that one reductive equivalent is in fact sufficient for the quantitative generation of N₂O, following a semireduced reaction mechanism. This reaction is very efficient and produces N₂O with a first-order rate constant k > 10² s⁻¹. Further isotope labeling studies confirm an intramolecular N-N coupling mechanism, consistent with the rapid time scale of the reduction and a very low barrier for N-N bond formation. Accordingly, the reaction proceeds at -80 °C, allowing for the direct observation of the mixed-valent product of the reaction. At higher temperatures, the initial reaction product is unstable and decays, ultimately generating the diferrous complex [Fe₂(BPMP)(OPr)₂](OTf) and an unidentified ferric product. These results combined offer deep insight into the mechanism of NO reduction by the relevant model complex [Fe₂(BPMP)(OPr)(NO)₂]²⁺ and provide direct evidence that the semireduced mechanism would constitute a highly efficient pathway to accomplish NO reduction to N₂O in FNORs and in synthetic catalysts.

  6. A model for integrating clinical care and basic science research, and pitfalls of performing complex research projects for addressing a clinical challenge.

    Science.gov (United States)

    Steck, R; Epari, D R; Schuetz, M A

    2010-07-01

    The collaboration of clinicians with basic science researchers is crucial for addressing clinically relevant research questions. In order to initiate such mutually beneficial relationships, we propose a model where early career clinicians spend a designated time embedded in established basic science research groups, in order to pursue a postgraduate qualification. During this time, clinicians become integral members of the research team, fostering long-term relationships and opening up opportunities for continuing collaboration. However, for these collaborations to be successful there are pitfalls to be avoided. Limited time and funding can lead to attempts to answer clinical challenges with highly complex research projects characterised by a large number of "clinical" factors being introduced in the hope that the research outcomes will be more clinically relevant. As a result, the complexity of such studies and the variability of their outcomes may lead to difficulties in drawing scientifically justified and clinically useful conclusions. Consequently, we stress that it is the obligation of both the basic science researcher and the clinician to be mindful of the limitations and challenges of such multi-factorial research projects. A systematic step-by-step approach to addressing clinical research questions with limited, but highly targeted and well defined research projects provides the solid foundation which may lead to the development of a longer term research program for addressing more challenging clinical problems. Ultimately, we believe that it is such models, encouraging the vital collaboration between clinicians and researchers for the work on targeted, well defined research projects, which will result in answers to the important clinical challenges of today. Copyright (c) 2010 Elsevier Ltd. All rights reserved.

  7. Summarizing Simulation Results using Causally-relevant States

    Science.gov (United States)

    Parikh, Nidhi; Marathe, Madhav; Swarup, Samarth

    2016-01-01

    As increasingly large-scale multiagent simulations are being implemented, new methods are becoming necessary to make sense of the results of these simulations. Even concisely summarizing the results of a given simulation run is a challenge. Here we pose this as the problem of simulation summarization: how to extract the causally-relevant descriptions of the trajectories of the agents in the simulation. We present a simple algorithm to compress agent trajectories through state space by identifying the state transitions which are relevant to determining the distribution of outcomes at the end of the simulation. We present a toy-example to illustrate the working of the algorithm, and then apply it to a complex simulation of a major disaster in an urban area. PMID:28042620
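
    The core idea, identifying state transitions that shift the distribution of final outcomes, can be sketched compactly. The Python fragment below scores each observed transition by the total-variation distance between the outcome distribution conditioned on that transition and the overall outcome distribution; high-scoring transitions would be retained in the compressed trajectories. This is a simplified illustration of the concept, not the authors' exact algorithm, and the toy data are invented.

        from collections import Counter, defaultdict

        def transition_scores(trajectories, outcomes):
            """trajectories: list of state sequences; outcomes: one final
            outcome per trajectory. Returns a relevance score per transition."""
            overall = Counter(outcomes)
            total = len(outcomes)
            by_transition = defaultdict(Counter)
            for traj, out in zip(trajectories, outcomes):
                for a, b in set(zip(traj, traj[1:])):
                    by_transition[(a, b)][out] += 1
            scores = {}
            for tr, cnt in by_transition.items():
                n = sum(cnt.values())
                # Total-variation distance: conditional vs overall outcomes.
                scores[tr] = 0.5 * sum(
                    abs(cnt[o] / n - overall[o] / total) for o in overall
                )
            return scores

        trajs = [["home", "work", "shelter"], ["home", "road", "hospital"],
                 ["home", "work", "shelter"], ["home", "road", "shelter"]]
        outs = ["safe", "injured", "safe", "safe"]
        print(transition_scores(trajs, outs))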

  8. RATING MODELS AND INFORMATION TECHNOLOGIES APPLICATION FOR MANAGEMENT OF ADMINISTRATIVE-TERRITORIAL COMPLEXES

    Directory of Open Access Journals (Sweden)

    O. M. Pshinko

    2016-12-01

    Full Text Available Purpose. The paper aims to develop rating models and related information technologies for the strategic planning of the development of administrative-territorial units, as well as for the multi-criteria management of inhomogeneous multiparameter objects. Methodology. To solve the problems of strategic planning of administrative-territorial development and of managing heterogeneous classes of controlled objects, a set of coordinated methods is used: multi-criteria analysis of the properties of the objects of planning and management, diagnostics of state parameters, and forecasting and management of complex systems of different classes, whose states are estimated by sets of heterogeneous quality indicators and represented by individual models of the operating process. A new information technology, implemented with procedures for solving typical tasks in MS SQL Server, is proposed and created to carry out the strategic planning and management tasks. Findings. A new rating-based approach to the analysis and management of classes of complex systems has been proposed, and rating models for the analysis of multi-criteria, multiparameter systems have been developed. Management of these systems is performed on the basis of current and predicted state parameters through a non-uniform distribution of resources. A procedure for analysing the sensitivity of the rating model to changes in the parameters of the inhomogeneous resource distribution has been developed, and an information technology for strategic planning and management of heterogeneous classes of objects based on the rating model has been created. Originality. This article proposes a new approach that uses rating indicators as a general model for the strategic planning of development and the management of heterogeneous objects that can be characterized by sets of parameters measured on different scales

  9. The complex formation-partition and partition-association models of solvent extraction of ions

    International Nuclear Information System (INIS)

    Siekierski, S.

    1976-01-01

    Two models of the extraction process have been proposed. In the first model it is assumed that the partitioning neutral species is at first formed in the aqueous phase and then transferred into the organic phase. The second model is based on the assumption that equivalent amounts of cations are at first transferred from the aqueous into the organic phase and then associated to form a neutral molecule. The role of the solubility parameter in extraction and the relation between the solubility of liquid organic substances in water and the partition of complexes have been discussed. The extraction of simple complexes and complexes with organic ligands has been discussed using the first model. Partition coefficients have been calculated theoretically and compared with experimental values in some very simple cases. The extraction of ion pairs has been discussed using the partition-association model and the concept of single-ion partition coefficients. (author)

  10. Developing and Modeling Complex Social Interventions: Introducing the Connecting People Intervention

    Science.gov (United States)

    Webber, Martin; Reidy, Hannah; Ansari, David; Stevens, Martin; Morris, David

    2016-01-01

    Objectives: Modeling the processes involved in complex social interventions is important in social work practice, as it facilitates their implementation and translation into different contexts. This article reports the process of developing and modeling the connecting people intervention (CPI), a model of practice that supports people with mental…

  11. Are our dynamic water quality models too complex? A comparison of a new parsimonious phosphorus model, SimplyP, and INCA-P

    Science.gov (United States)

    Jackson-Blake, L. A.; Sample, J. E.; Wade, A. J.; Helliwell, R. C.; Skeffington, R. A.

    2017-07-01

    Catchment-scale water quality models are increasingly popular tools for exploring the potential effects of land management, land use change and climate change on water quality. However, the dynamic, catchment-scale nutrient models in common usage are complex, with many uncertain parameters requiring calibration, limiting their usability and robustness. A key question is whether this complexity is justified. To explore this, we developed a parsimonious phosphorus model, SimplyP, incorporating a rainfall-runoff model and a biogeochemical model able to simulate daily streamflow, suspended sediment, and particulate and dissolved phosphorus dynamics. The model's complexity was compared to one popular nutrient model, INCA-P, and the performance of the two models was compared in a small rural catchment in northeast Scotland. For three land use classes, fewer than six SimplyP parameters must be determined through calibration, the rest may be based on measurements, while INCA-P has around 40 unmeasurable parameters. Despite substantially simpler process representation, SimplyP performed comparably to INCA-P in both calibration and validation and produced similar long-term projections in response to changes in land management. Results support the hypothesis that INCA-P is overly complex for the study catchment. We hope our findings will help prompt wider model comparison exercises, as well as debate among the water quality modeling community as to whether today's models are fit for purpose. Simpler models such as SimplyP have the potential to be useful management and research tools, building blocks for future model development (prototype code is freely available), or benchmarks against which more complex models could be evaluated.
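
    To give a feel for what "parsimonious" means here, the sketch below implements a toy daily model in the same spirit: a single linear soil-water store generates flow, and stream dissolved phosphorus is tied to one land-use-dependent equilibrium concentration. These are not SimplyP's actual equations (its prototype code is freely available, per the abstract); parameter names and values are illustrative.

        import numpy as np

        def toy_p_model(precip, pet, tc=5.0, fc=100.0, epc0=0.08):
            """precip, pet: daily series (mm); tc: soil-water time constant
            (days); fc: field capacity (mm); epc0: equilibrium P conc. (mg/l)."""
            storage, q_out, tdp = fc, [], []
            for p, e in zip(precip, pet):
                storage = max(storage + p - e, 0.0)
                q = max(storage - fc, 0.0) / tc  # drainage above field capacity
                storage -= q
                q_out.append(q)
                tdp.append(epc0 * q)             # dissolved-P flux proxy
            return np.array(q_out), np.array(tdp)

        rng = np.random.default_rng(1)
        q, p_flux = toy_p_model(precip=rng.gamma(0.7, 6.0, 365),
                                pet=np.full(365, 1.5))
        print(q.mean(), p_flux.mean())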

  12. Cyclodextrin--piroxicam inclusion complexes: analyses by mass spectrometry and molecular modelling

    Science.gov (United States)

    Gallagher, Richard T.; Ball, Christopher P.; Gatehouse, Deborah R.; Gates, Paul J.; Lobell, Mario; Derrick, Peter J.

    1997-11-01

    Mass spectrometry has been used to investigate the nature of the non-covalent complexes formed between the anti-inflammatory drug piroxicam and α-, β- and γ-cyclodextrins. Energies of these complexes have been calculated by means of molecular modelling. There is a correlation between peak intensities in the mass spectra and the calculated energies.

  13. Complex singlet extension of the standard model

    International Nuclear Information System (INIS)

    Barger, Vernon; McCaskey, Mathew; Langacker, Paul; Ramsey-Musolf, Michael; Shaughnessy, Gabe

    2009-01-01

    We analyze a simple extension of the standard model (SM) obtained by adding a complex singlet to the scalar sector (cxSM). We show that the cxSM can contain one or two viable cold dark matter candidates and analyze the conditions on the parameters of the scalar potential that yield the observed relic density. When the cxSM potential contains a global U(1) symmetry that is both softly and spontaneously broken, it contains both a viable dark matter candidate and the ingredients necessary for a strong first order electroweak phase transition as needed for electroweak baryogenesis. We also study the implications of the model for discovery of a Higgs boson at the Large Hadron Collider.
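
    For orientation, a commonly quoted form of the cxSM scalar potential is sketched below in LaTeX notation; normalization conventions vary between papers, so this should be read as an indication of the structure rather than as the paper's exact expression. The a_1 and b_1 terms softly break the global U(1), which can also be broken spontaneously when the singlet acquires a vacuum expectation value.

        V(H,S) = \frac{m^{2}}{2}\, H^{\dagger}H
               + \frac{\lambda}{4}\,\bigl(H^{\dagger}H\bigr)^{2}
               + \frac{\delta_{2}}{2}\, H^{\dagger}H\,\lvert S\rvert^{2}
               + \frac{b_{2}}{2}\,\lvert S\rvert^{2}
               + \frac{d_{2}}{4}\,\lvert S\rvert^{4}
               + \Bigl( a_{1}\,S + \frac{b_{1}}{4}\,S^{2} + \mathrm{c.c.} \Bigr)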

  14. Complexity and agent-based modelling in urban research

    DEFF Research Database (Denmark)

    Fertner, Christian

    influence on the bigger system. Traditional scientific methods or theories often tried to simplify, not accounting for the complex relations of actors and decision-making. The introduction of computers in simulation made new approaches in modelling, as for example agent-based modelling (ABM), possible, dealing......Urbanisation processes are results of a broad variety of actors or actor groups and their behaviour and decisions based on different experiences, knowledge, resources, values etc. The decisions made are often at a micro/individual level but result in macro/collective behaviour. In urban research...

  15. A structural model of the E. coli PhoB Dimer in the transcription initiation complex

    Directory of Open Access Journals (Sweden)

    Tung Chang-Shung

    2012-03-01

    Full Text Available Abstract Background There exist > 78,000 protein and/or nucleic acid structures that were determined experimentally. Only a small portion of these structures corresponds to those of protein complexes. While homology modeling is able to exploit knowledge-based potentials of side-chain rotamers and backbone motifs to infer structures for new proteins, no such general method exists to extend our understanding of protein interaction motifs to novel protein complexes. Results We use a Motif Binding Geometries (MBG) approach to infer the structure of a protein complex from the database of complexes of homologous proteins taken from other contexts (such as the helix-turn-helix motif binding double-stranded DNA), and demonstrate its utility on one of the more important regulatory complexes in biology, that of the RNA polymerase initiating transcription under conditions of phosphate starvation. The modeled PhoB/RNAP/σ-factor/DNA complex is stereo-chemically reasonable, has sufficient interfacial Solvent Excluded Surface Areas (SESAs) to provide adequate binding strength, is physically meaningful for transcription regulation, and is consistent with a variety of known experimental constraints. Conclusions Based on a straightforward and easy to comprehend concept, "proteins and protein domains that fold similarly could interact similarly", a structural model of the PhoB dimer in the transcription initiation complex has been developed. This approach could be extended to enable structural modeling and prediction of other bio-molecular complexes. Just as models of individual proteins provide insight into molecular recognition, catalytic mechanism, and substrate specificity, models of protein complexes will provide understanding into the combinatorial rules of cellular regulation and signaling.

  16. A framework for modelling the complexities of food and water security under globalisation

    Science.gov (United States)

    Dermody, Brian J.; Sivapalan, Murugesu; Stehfest, Elke; van Vuuren, Detlef P.; Wassen, Martin J.; Bierkens, Marc F. P.; Dekker, Stefan C.

    2018-01-01

    We present a new framework for modelling the complexities of food and water security under globalisation. The framework sets out a method to capture regional and sectoral interdependencies and cross-scale feedbacks within the global food system that contribute to emergent water use patterns. The framework integrates aspects of existing models and approaches in the fields of hydrology and integrated assessment modelling. The core of the framework is a multi-agent network of city agents connected by infrastructural trade networks. Agents receive socio-economic and environmental constraint information from integrated assessment models and hydrological models respectively and simulate complex, socio-environmental dynamics that operate within those constraints. The emergent changes in food and water resources are aggregated and fed back to the original models with minimal modification of the structure of those models. It is our conviction that the framework presented can form the basis for a new wave of decision tools that capture complex socio-environmental change within our globalised world. In doing so they will contribute to illuminating pathways towards a sustainable future for humans, ecosystems and the water they share.

  17. Task relevant variables are encoded in OFC neurons

    Directory of Open Access Journals (Sweden)

    Ramon Nogueira

    2015-04-01

    Our results demonstrate that the OFC in rats is not only involved in reward processing but also conveys a wide variety of task-relevant variables. Our hypothesis is that the OFC acts as a hub for complex decision-making tasks, where all possible information is processed and conveyed to other brain regions responsible for decision execution.

  18. Protein Complex Production from the Drug Discovery Standpoint.

    Science.gov (United States)

    Moarefi, Ismail

    2016-01-01

    Small molecule drug discovery critically depends on the availability of meaningful in vitro assays to guide medicinal chemistry programs that are aimed at optimizing drug potency and selectivity. As is becoming increasingly evident, most disease-relevant drug targets do not act as single proteins. In the body, they are instead generally found in complex with protein cofactors that are highly relevant for their correct function and regulation. This review highlights selected examples of the increasing trend to use biologically relevant protein complexes for rational drug discovery, with the aim of reducing costly late-phase attrition due to lack of efficacy or toxicity.

  19. Comparing flood loss models of different complexity

    Science.gov (United States)

    Schröter, Kai; Kreibich, Heidi; Vogel, Kristin; Riggelsen, Carsten; Scherbaum, Frank; Merz, Bruno

    2013-04-01

    Any deliberation on flood risk requires the consideration of potential flood losses. In particular, reliable flood loss models are needed to evaluate the cost-effectiveness of mitigation measures, to assess vulnerability, and for comparative risk analysis and financial appraisal during and after floods. In recent years, considerable improvements have been made both in the data basis and in the methodological approaches used for the development of flood loss models. Despite this, flood loss models remain an important source of uncertainty, and their temporal and spatial transferability is still limited. This contribution investigates the predictive capability of flood loss models of different complexity in a split-sample, cross-regional validation approach. For this purpose, flood loss models of different complexity, i.e. based on different numbers of explanatory variables, are learned from a set of damage records obtained from a survey after the Elbe flood in 2002. The validation of model predictions is carried out for different flood events in the Elbe and Danube river basins in 2002, 2005 and 2006, for which damage records are available from post-event surveys. The models investigated are a stage-damage model, the rule-based model FLEMOps+r, as well as novel model approaches derived using the data mining techniques of regression trees and Bayesian networks. The Bayesian network approach to flood loss modelling provides attractive additional information concerning the probability distribution of both model predictions and explanatory variables.
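
    Of the approaches listed, the regression-tree variant is the easiest to sketch. The Python fragment below fits a shallow decision tree that predicts relative building loss from water depth and two further explanatory variables; the data are synthetic placeholders, not the Elbe or Danube damage records, and the variable set is illustrative.

        import numpy as np
        from sklearn.tree import DecisionTreeRegressor

        rng = np.random.default_rng(42)
        n = 500
        depth = rng.uniform(0, 3, n)        # inundation depth (m)
        duration = rng.uniform(1, 14, n)    # inundation duration (days)
        precaution = rng.integers(0, 2, n)  # private precaution indicator
        loss = np.clip(0.2 * depth + 0.01 * duration - 0.1 * precaution
                       + rng.normal(0, 0.05, n), 0, 1)  # relative loss in [0, 1]

        X = np.column_stack([depth, duration, precaution])
        model = DecisionTreeRegressor(max_depth=4).fit(X, loss)
        print(model.predict([[1.5, 7, 0]]))  # predicted relative loss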

  20. The big seven model of personality and its relevance to personality pathology.

    Science.gov (United States)

    Simms, Leonard J

    2007-02-01

    Proponents of the Big Seven model of personality have suggested that Positive Valence (PV) and Negative Valence (NV) are independent of the Big Five personality dimensions and may be particularly relevant to personality disorder. These hypotheses were tested with 403 undergraduates who completed a Big Seven measure and markers of the Big Five and personality pathology. Results revealed that PV and NV incrementally predicted personality pathology dimensions beyond those predicted by multiple markers of the Big Five. However, factor analyses suggested that PV and NV might be best understood as specific, maladaptive aspects of positive emotionality and low agreeableness, respectively, as opposed to independent factors of personality. Implications for the description of normal and abnormal personality are discussed.

  1. Nuclear models relevant to evaluation

    International Nuclear Information System (INIS)

    Arthur, E.D.; Chadwick, M.B.; Hale, G.M.; Young, P.G.

    1991-01-01

    The widespread use of nuclear models continues in the creation of data evaluations. The reasons include the extension of data evaluations to higher energies, the creation of data libraries for isotopic components of natural materials, and the production of evaluations for radioactive target species. In these cases, experimental data are often sparse or nonexistent. As this trend continues, the nuclear models employed in evaluation work move towards more microscopically-based theoretical methods, prompted in part by the availability of increasingly powerful computational resources. Advances in nuclear models applicable to evaluation will be reviewed. These include advances in optical model theory, microscopic and phenomenological state and level density theory, unified models that consistently describe both equilibrium and nonequilibrium reaction mechanisms, and improved methodologies for the calculation of prompt radiation from fission. 84 refs., 8 figs

  2. A multi-modal geological investigation framework for subsurface modeling and kinematic monitoring of a slow-moving landslide complex in Colorado, United States

    Science.gov (United States)

    Lowry, B. W.; Zhou, W.; Smartgeo

    2010-12-01

    The Muddy Creek landslide complex is a large area of active and reactivating landslides that impact the operation of both a state highway and Paonia Reservoir in Gunnison County, Colorado, United States. Historically, the monitoring of this slide has been investigated using disparate techniques, leading to protracted analysis and project knowledge attrition. We present an integrated, data-driven investigation framework that supports continued kinematic monitoring, document cataloging, and subsurface modeling of the landslide complex. A geospatial information system (GIS) was integrated with a visual-programming-based subsurface model to facilitate modular integration of monitoring data with borehole information. Subsurface modeling was organized by material type and activity state based on multiple sources of kinematic measurement. The framework is constructed to modularly integrate remotely sensed imagery and other spatial datasets such as ASTER, InSAR, and LiDAR-derived elevation products as more precise datasets become available. The framework allows for terrestrial LiDAR survey error estimation, borehole siting, and placement of wireless sensor (GPS, accelerometer, geophysical) networks for optimized spatial relevance and utility. Coordinated spatial referencing within the GIS facilitates geotechnical and hydrogeological modeling input generation and common display of modeling outputs. Kinematic data fusion techniques are accomplished with integration of instrumentation, surficial feature tracking, subsurface classification, and 3D interpolation. The framework includes dynamic decision support, including landslide dam failure estimates and back-flooding scenario planning, that can be accessed by multiple agencies and stakeholders.

  3. Modeling and simulation of complex systems a framework for efficient agent-based modeling and simulation

    CERN Document Server

    Siegfried, Robert

    2014-01-01

    Robert Siegfried presents a framework for efficient agent-based modeling and simulation of complex systems. He compares different approaches for describing structure and dynamics of agent-based models in detail. Based on this evaluation the author introduces the "General Reference Model for Agent-based Modeling and Simulation" (GRAMS). Furthermore he presents parallel and distributed simulation approaches for execution of agent-based models -from small scale to very large scale. The author shows how agent-based models may be executed by different simulation engines that utilize underlying hard

  4. Post-closure biosphere assessment modelling: comparison of complex and more stylised approaches

    Energy Technology Data Exchange (ETDEWEB)

    Walke, Russell C. [Quintessa Limited, The Hub, 14 Station Road, Henley-on-Thames (United Kingdom); Kirchner, Gerald [University of Hamburg, ZNF, Beim Schlump 83, 20144 Hamburg (Germany); Xu, Shulan; Dverstorp, Bjoern [Swedish Radiation Safety Authority, SE-171 16 Stockholm (Sweden)

    2014-07-01

    Geological facilities are the preferred option for disposal of high-level radioactive waste, due to their potential to provide isolation from the surface environment (biosphere) on very long time scales. Safety cases developed in support of geological disposal include assessment of potential impacts on humans and wildlife in order to demonstrate compliance with regulatory criteria. As disposal programmes move from site-independent/generic assessments through site selection to applications for construction/operation and closure, the degree of understanding of the present-day site increases, together with increased site-specific information. Assessments need to strike a balance between simple models and more complex approaches that draw more extensively on this site-specific information. This paper explores the relative merits of complex versus more stylised biosphere models in the context of a site-specific assessment. The complex biosphere model was developed by the Swedish Nuclear Fuel and Waste Management Co (SKB) for the Forsmark candidate site for a spent nuclear fuel repository in Sweden. SKB's model is built on a landscape evolution model, whereby radionuclide releases to distinct hydrological basins/sub-catchments (termed 'objects') are represented as they evolve through land rise and climate change. The site is located on the Baltic coast with a terrestrial landscape including lakes, mires, forest and agriculture. The land at the site is projected to continue to rise due to post-glacial uplift, leading to ecosystem transitions in excess of ten thousand years. The simple biosphere models developed for this study include the most plausible transport processes and represent various types of ecosystem. The complex biosphere models adopt a relatively coarse representation of the near-surface strata, which is shown to be conservative, but also to under-estimate the time scale required for potential doses to reach equilibrium with radionuclide fluxes

  5. PeTTSy: a computational tool for perturbation analysis of complex systems biology models.

    Science.gov (United States)

    Domijan, Mirela; Brown, Paul E; Shulgin, Boris V; Rand, David A

    2016-03-10

    Over the last decade sensitivity analysis techniques have been shown to be very useful to analyse complex and high dimensional Systems Biology models. However, many of the currently available toolboxes have either used parameter sampling, been focused on a restricted set of model observables of interest, studied optimisation of an objective function, or have not dealt with multiple simultaneous model parameter changes where the changes can be permanent or temporary. Here we introduce our new, freely downloadable toolbox, PeTTSy (Perturbation Theory Toolbox for Systems). PeTTSy is a package for MATLAB which implements a wide array of techniques for the perturbation theory and sensitivity analysis of large and complex ordinary differential equation (ODE) based models. PeTTSy is a comprehensive modelling framework that introduces a number of new approaches and that fully addresses analysis of oscillatory systems. It examines sensitivity analysis of the models to perturbations of parameters, where the perturbation timing, strength, length and overall shape can be controlled by the user. This can be done in a system-global setting, namely, the user can determine how many parameters to perturb, by how much and for how long. PeTTSy also offers the user the ability to explore the effect of the parameter perturbations on many different types of outputs: period, phase (timing of peak) and model solutions. PeTTSy can be employed on a wide range of mathematical models including free-running and forced oscillators and signalling systems. To enable experimental optimisation using the Fisher Information Matrix it efficiently allows one to combine multiple variants of a model (i.e. a model with multiple experimental conditions) in order to determine the value of new experiments. It is especially useful in the analysis of large and complex models involving many variables and parameters. PeTTSy is a comprehensive tool for analysing large and complex models of regulatory and
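
    A much-reduced version of this kind of analysis, finite-difference sensitivity of an oscillator's trajectory to a permanent parameter change, can be sketched in a few lines. This is not PeTTSy itself; the model (a van der Pol oscillator) and all settings are illustrative.

        import numpy as np
        from scipy.integrate import solve_ivp

        def vdp(t, y, mu):
            """van der Pol oscillator with stiffness parameter mu."""
            return [y[1], mu * (1 - y[0] ** 2) * y[1] - y[0]]

        def trajectory(mu):
            sol = solve_ivp(vdp, (0, 20), [2.0, 0.0], args=(mu,),
                            t_eval=np.linspace(0, 20, 400))
            return sol.y[0]

        mu0, h = 1.0, 1e-4
        # Central finite difference: d x(t) / d mu along the whole trajectory.
        sens = (trajectory(mu0 + h) - trajectory(mu0 - h)) / (2 * h)
        print(f"max |dx/dmu| over the trajectory: {np.abs(sens).max():.3f}")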

  6. Differential Effects of Munc18s on Multiple Degranulation-Relevant Trans-SNARE Complexes.

    Directory of Open Access Journals (Sweden)

    Hao Xu

    Full Text Available Mast cell exocytosis, which includes compound degranulation and vesicle-associated piecemeal degranulation, requires multiple Q- and R-SNAREs. It is not clear how these SNAREs pair to form functional trans-SNARE complexes and how these trans-SNARE complexes are selectively regulated for fusion. Here we undertake a comprehensive examination of the capacity of two Q-SNARE subcomplexes (syntaxin3/SNAP-23 and syntaxin4/SNAP-23) to form fusogenic trans-SNARE complexes with each of the four granule-borne R-SNAREs (VAMP2, 3, 7, 8). We report the identification of at least six distinct trans-SNARE complexes under enhanced tethering conditions: (i) VAMP2/syntaxin3/SNAP-23, (ii) VAMP2/syntaxin4/SNAP-23, (iii) VAMP3/syntaxin3/SNAP-23, (iv) VAMP3/syntaxin4/SNAP-23, (v) VAMP8/syntaxin3/SNAP-23, and (vi) VAMP8/syntaxin4/SNAP-23. We show for the first time that Munc18a operates synergistically with SNAP-23-based non-neuronal SNARE complexes (i to iv) in lipid mixing, in contrast to Munc18b and c, which exhibit no positive effect on any SNARE combination tested. Pre-incubation with Munc18a renders the SNARE-dependent fusion reactions insensitive to the otherwise inhibitory R-SNARE cytoplasmic domains, suggesting a protective role of Munc18a for its cognate SNAREs. Our findings substantiate the recently discovered but unexpected requirement for Munc18a in mast cell exocytosis, and implicate post-translational modifications in Munc18b/c activation.

  7. The complex sine-Gordon model on a half line

    International Nuclear Information System (INIS)

    Tzamtzis, Georgios

    2003-01-01

    In this thesis, we study the complex sine-Gordon model on a half line. The model in the bulk is an integrable (1+1)-dimensional field theory which is U(1) gauge invariant and constitutes a generalisation of the sine-Gordon theory. It admits soliton and breather solutions. By introducing suitably selected boundary conditions we may consider the model on a half line. Through such conditions the model can be shown to remain integrable, and various aspects of the boundary theory can be examined. The first chapter serves as a brief introduction to some basic concepts of integrability and soliton solutions. As an example of an integrable system with soliton solutions, the sine-Gordon model is presented both in the bulk and on a half line. These results will serve as a useful guide for the model at hand. The introduction finishes with a brief overview of the two methods that will be used in the fourth chapter in order to obtain the quantum spectrum of the boundary complex sine-Gordon model. In the second chapter the model is properly introduced along with a brief literature review. Different realisations of the model and their connections are discussed. The vacuum of the theory is investigated. Soliton solutions are given and a discussion on the existence of breathers follows. Finally the collapse of breather solutions to single solitons is demonstrated and the chapter concludes with a different approach to the breather problem. In the third chapter, we construct the lowest conserved currents and through them we find suitable boundary conditions that allow for their conservation in the presence of a boundary. The boundary term is added to the Lagrangian and the vacuum is reexamined in the half-line case. The reflection of solitons from the boundary is studied and the time delay is calculated. Finally we address the existence of boundary-bound states. In the fourth chapter we study the quantum complex sine-Gordon model. We begin with a brief overview of the theory in

  8. On the general procedure for modelling complex ecological systems

    International Nuclear Information System (INIS)

    He Shanyu.

    1987-12-01

    In this paper, the principle of a general procedure for modelling complex ecological systems, i.e. the Adaptive Superposition Procedure (ASP), is briefly stated. The result of applying ASP in a national project for ecological regionalization is also described. (author). 3 refs

  9. On the hyperporous non-linear elasticity model for fusion-relevant pebble beds

    International Nuclear Information System (INIS)

    Di Maio, P.A.; Giammusso, R.; Vella, G.

    2010-01-01

    Packed pebble beds are particular granular systems composed of a large amount of small particles, arranged in irregular lattices and surrounded by a gas filling the interstitial spaces. Due to their heterogeneous structure, pebble beds have non-linear and strongly coupled thermal and mechanical behaviours, for which the available constitutive models seem limited and not suitable for fusion-relevant, design-oriented applications. Within the framework of the modelling activities promoted for the lithiated ceramics and beryllium pebble beds foreseen in the Helium-Cooled Pebble Bed breeding blanket concept of DEMO, a thermo-mechanical constitutive model has been set up at the Department of Nuclear Engineering of the University of Palermo (DIN), assuming that pebble beds can be considered continuous, homogeneous and isotropic media. The present paper deals with the DIN non-linear elasticity constitutive model, based on the assumption that during the reversible straining of a pebble bed its effective logarithmic bulk modulus depends on the equivalent pressure according to a modified power law and its effective Poisson modulus remains constant. Under these hypotheses, the functional dependences of the effective tangential and secant bed deformation moduli on either the equivalent pressure or the volumetric strain have been derived in closed analytical form. A procedure has then been defined to assess the model parameters for a given pebble bed from its oedometric test results, and it has been applied to both polydisperse lithium orthosilicate and single-size beryllium pebble beds.
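
    The closed-form relations alluded to above can be sketched as follows, in LaTeX notation; the symbols (K_0, p_0, n, initial pressure p_i, volumetric strain theta) and the exact shape of the modified power law are illustrative assumptions rather than the paper's notation. With a constant Poisson modulus, integrating the tangent bulk modulus over the volumetric strain gives the pressure-strain relation in closed form:

        K_{t}(p) = K_{0}\left(\frac{p}{p_{0}}\right)^{n}, \qquad \nu = \text{const},

        \frac{\mathrm{d}p}{\mathrm{d}\theta} = K_{t}(p)
        \;\Longrightarrow\;
        p(\theta) = \left[(1-n)\,\frac{K_{0}}{p_{0}^{\,n}}\,\theta
                    + p_{i}^{\,1-n}\right]^{\tfrac{1}{1-n}}, \qquad n \neq 1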

  10. QMU as an approach to strengthening the predictive capabilities of complex models.

    Energy Technology Data Exchange (ETDEWEB)

    Gray, Genetha Anne.; Boggs, Paul T.; Grace, Matthew D.

    2010-09-01

    Complex systems are made up of multiple interdependent parts, and the behavior of the entire system cannot always be directly inferred from the behavior of the individual parts. They are nonlinear, and system responses are not necessarily additive. Examples of complex systems include energy, cyber and telecommunication infrastructures, human and animal social structures, and biological structures such as cells. To meet the goals of infrastructure development, maintenance, and protection for cyber-related complex systems, novel modeling and simulation technology is needed. Sandia has shown success using M&S in the nuclear weapons (NW) program. However, complex systems represent a significant challenge and relative departure from the classical M&S exercises, and many of the scientific and mathematical M&S processes must be re-envisioned. Specifically, in the NW program, requirements and acceptable margins for performance, resilience, and security are well-defined and given quantitatively from the start. The Quantification of Margins and Uncertainties (QMU) process helps to assess whether or not these safety, reliability and performance requirements have been met after a system has been developed. In this sense, QMU is used as a sort of check that requirements have been met once the development process is completed. In contrast, performance requirements and margins may not have been defined a priori for many complex systems (e.g., the Internet, electrical distribution grids), particularly not in quantitative terms. This project addresses this fundamental difference by investigating the use of QMU at the start of the design process for complex systems. Three major tasks were completed. First, the characteristics of the cyber infrastructure problem were collected and considered in the context of QMU-based tools. Second, UQ methodologies for the quantification of model discrepancies were considered in the context of statistical models of cyber activity. Third

  11. Theories, models and frameworks used in capacity building interventions relevant to public health: a systematic review.

    Science.gov (United States)

    Bergeron, Kim; Abdi, Samiya; DeCorby, Kara; Mensah, Gloria; Rempel, Benjamin; Manson, Heather

    2017-11-28

    There is limited research on capacity building interventions that include theoretical foundations. The purpose of this systematic review is to identify the underlying theories, models and frameworks used to support capacity building interventions relevant to public health practice. The aim is to inform and improve the capacity building practices and services offered by public health organizations. Four search strategies were used: 1) electronic database searching; 2) reference lists of included papers; 3) key informant consultation; and 4) grey literature searching. Inclusion and exclusion criteria are outlined; included papers focus on capacity building, learning plans or professional development plans in combination with tools, resources, processes, procedures, steps, models, frameworks or guidelines, are described in a public health or healthcare setting, or in non-government, government, or community organizations as they relate to healthcare, and explicitly or implicitly mention a theory, model and/or framework that grounds the type of capacity building approach developed. Quality assessment was performed on all included articles. Data analysis included a process for synthesizing, analyzing and presenting descriptive summaries, categorizing theoretical foundations according to which theory, model and/or framework was used and whether the theory, model or framework was implied or explicitly identified. Nineteen articles were included in this review. A total of 28 theories, models and frameworks were identified. Of this number, two theories (Diffusion of Innovations and Transformational Learning), two models (Ecological and Interactive Systems Framework for Dissemination and Implementation) and one framework (Bloom's Taxonomy of Learning) were identified as the most frequently cited. This review identifies specific theories, models and frameworks to support capacity building interventions relevant to public health organizations. It provides public health practitioners

  12. First results from the International Urban Energy Balance Model Comparison: Model Complexity

    Science.gov (United States)

    Blackett, M.; Grimmond, S.; Best, M.

    2009-04-01

    A great variety of urban energy balance models has been developed. These vary in complexity from simple schemes that represent the city as a slab, through those which model various facets (i.e. road, walls and roof), to more complex urban forms (including street canyons with intersections) and features (such as vegetation cover and anthropogenic heat fluxes). Some schemes also incorporate detailed representations of momentum and energy fluxes distributed throughout various layers of the urban canopy layer. The models differ in the parameters they require to describe the site and in the demands they make on computational processing power. Many of these models have been evaluated using observational datasets, but to date no controlled comparisons have been conducted. Urban surface energy balance models provide a means to predict the energy exchange processes which influence factors such as urban temperature, humidity, atmospheric stability and winds. These all need to be modelled accurately to capture features such as the urban heat island effect and to provide key information for dispersion and air quality modelling. A comparison of the various models available will assist in improving current and future models and will assist in formulating research priorities for future observational campaigns within urban areas. In this presentation we will summarise the initial results of this international urban energy balance model comparison. In particular, the relative performance of the models involved will be compared based on their degree of complexity. These results will inform us on ways in which we can improve the modelling of air quality within, and climate impacts of, global megacities. The methodology employed in conducting this comparison followed that used in PILPS (the Project for Intercomparison of Land-Surface Parameterization Schemes), which is also endorsed by the GEWEX Global Land Atmosphere System Study (GLASS) panel. In all cases, models were run

  13. Dynamical Behaviors in Complex-Valued Love Model With or Without Time Delays

    Science.gov (United States)

    Deng, Wei; Liao, Xiaofeng; Dong, Tao

    2017-12-01

    In this paper, a novel version of a nonlinear model, i.e. a complex-valued love model with two time delays between two individuals in a love affair, has been proposed. A notable feature of this model is that we separate the emotion of one individual into real and imaginary parts to represent the variation and complexity of psychophysiological emotion in a romantic relationship, instead of restricting it to the real domain, making our model much closer to reality. This is because love is a complicated cognitive and social phenomenon, full of complexity, diversity and unpredictability, which refers to the coexistence of different aspects of feelings, states and attitudes ranging from joy and trust to sadness and disgust. By analyzing the associated characteristic equation of the linearized equations for our model, it is found that a Hopf bifurcation occurs when the sum of the time delays passes through a sequence of critical values. The stability of the bifurcating cyclic love dynamics is also derived by applying the normal form theory and the center manifold theorem. In addition, it is shown that, for some appropriately chosen parameters, chaotic behaviors can appear even without time delay.
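
    A toy version of such a system is easy to simulate. The Python sketch below integrates a linear complex-valued pair with one shared delay by forward Euler; the equations and all parameter values are illustrative stand-ins, not the paper's model, with Re(z) and Im(z) playing the role of the two emotion components.

        import numpy as np

        # dz1/dt = a1*z1(t) + b1*z2(t - tau)
        # dz2/dt = a2*z2(t) + b2*z1(t - tau)
        a1 = a2 = -0.1 + 0.0j
        b1, b2 = 0.9j, -0.9j
        tau, dt, T = 1.0, 0.01, 60.0
        lag, n = int(tau / dt), int(T / dt)

        z1 = np.zeros(n, dtype=complex)
        z2 = np.zeros(n, dtype=complex)
        z1[:lag + 1] = 0.5 + 0.1j   # constant history on [-tau, 0]
        z2[:lag + 1] = -0.2 + 0.3j

        for k in range(lag, n - 1):
            z1[k + 1] = z1[k] + dt * (a1 * z1[k] + b1 * z2[k - lag])
            z2[k + 1] = z2[k] + dt * (a2 * z2[k] + b2 * z1[k - lag])

        print(abs(z1[-1]), abs(z2[-1]))  # amplitudes after the transient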

  14. Inverse analyses of effective diffusion parameters relevant for a two-phase moisture model of cementitious materials

    DEFF Research Database (Denmark)

    Addassi, Mouadh; Johannesson, Björn; Wadsö, Lars

    2018-01-01

    Here we present an inverse analysis approach to determining the two-phase moisture transport properties relevant to concrete durability modeling. The proposed moisture transport model was based on a continuum approach with two truly separate equations for the liquid and gas phases being connected...... test, and (iv) a capillary suction test. Mass change over time, as obtained from the drying test, the two different cup test intervals and the capillary suction test, was used to obtain the effective diffusion parameters using the proposed inverse analysis approach. The moisture properties obtained

  15. MODELS AND METHODS OF SAFETY-ORIENTED PROJECT MANAGEMENT OF DEVELOPMENT OF COMPLEX SYSTEMS: METHODOLOGICAL APPROACH

    Directory of Open Access Journals (Sweden)

    Олег Богданович ЗАЧКО

    2016-03-01

    Full Text Available Methods and models for safety-oriented project management of the development of complex systems are proposed, resulting from the convergence of existing project management approaches, in contrast to mechanisms of value-oriented management. A cognitive model of safety-oriented project management of the development of complex systems is developed, which provides a synergistic effect: moving the system from its original (pre-project) condition to a state that is optimal from the viewpoint of life safety, the post-project state. An approach to assessing project complexity is proposed which takes into account the seasonal component of the time characteristics of the life cycles of complex organizational and technical systems with occupancy. This made it possible to include the seasonal component in simulation models of the life cycle of product operation in complex organizational and technical systems and to model the critical points of operation of systems with occupancy, forming a new methodology for safety-oriented management of projects, programs and portfolios of projects with a formalization of the elements of complexity.

  16. Where to from here? Future applications of mental models of complex performance

    International Nuclear Information System (INIS)

    Hahn, H.A.; Nelson, W.R.; Blackman, H.S.

    1988-01-01

    The purpose of this paper is to raise issues for discussion regarding the applications of mental models in the study of complex performance. Applications for training, expert systems and decision aids, job selection, workstation design, and other complex environments are considered. 1 ref

  17. Expectancy-Violation and Information-Theoretic Models of Melodic Complexity

    Directory of Open Access Journals (Sweden)

    Tuomas Eerola

    2016-07-01

    Full Text Available The present study assesses two types of models for melodic complexity: one based on expectancy violations and the other related to an information-theoretic account of redundancy in music. Seven different datasets spanning artificial sequences, folk and pop songs were used to refine and assess the models. The refinement eliminated unnecessary components from both types of models. The final analysis pitted three variants of the two model types against each other and could explain 46-74% of the variance in the ratings across the datasets. The most parsimonious models were identified with an information-theoretic criterion. This suggested that the simplified expectancy-violation models were the most efficient for these sets of data. However, the differences between all optimized models were subtle in terms of both performance and simplicity.
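
    The redundancy side of the comparison can be illustrated with a minimal information-theoretic index: the Shannon entropy of a melody's pitch-interval distribution. This toy measure is in the spirit of the information-theoretic account but is not one of the paper's fitted models; the example melody is illustrative.

        import math
        from collections import Counter

        def interval_entropy(pitches):
            """Shannon entropy (bits) of the pitch-interval distribution."""
            intervals = [b - a for a, b in zip(pitches, pitches[1:])]
            counts = Counter(intervals)
            total = len(intervals)
            return -sum((c / total) * math.log2(c / total)
                        for c in counts.values())

        melody = [60, 60, 67, 67, 69, 69, 67]  # MIDI pitches, illustrative
        print(f"interval entropy: {interval_entropy(melody):.2f} bits")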

  18. Per Aspera ad Astra: Through Complex Population Modeling to Predictive Theory.

    Science.gov (United States)

    Topping, Christopher J; Alrøe, Hugo Fjelsted; Farrell, Katharine N; Grimm, Volker

    2015-11-01

    Population models in ecology are often not good at predictions, even if they are complex and seem to be realistic enough. The reason for this might be that Occam's razor, which is key for minimal models exploring ideas and concepts, has been too uncritically adopted for more realistic models of systems. This can tie models too closely to certain situations, thereby preventing them from predicting the response to new conditions. We therefore advocate a new kind of parsimony to improve the application of Occam's razor. This new parsimony balances two contrasting strategies for avoiding errors in modeling: avoiding inclusion of nonessential factors (false inclusions) and avoiding exclusion of sometimes-important factors (false exclusions). It involves a synthesis of traditional modeling and analysis, used to describe the essentials of mechanistic relationships, with elements that are included in a model because they have been reported to be or can arguably be assumed to be important under certain conditions. The resulting models should be able to reflect how the internal organization of populations change and thereby generate representations of the novel behavior necessary for complex predictions, including regime shifts.

  19. Holocene glacier variability: three case studies using an intermediate-complexity climate model

    NARCIS (Netherlands)

    Weber, S.L.; Oerlemans, J.

    2003-01-01

    Synthetic glacier length records are generated for the Holocene epoch using a process-based glacier model coupled to the intermediate-complexity climate model ECBilt. The glacier model consists of a mass-balance component and an ice-flow component. The climate model is forced by the insolation change

  20. Modeling, Simulation and Analysis of Complex Networked Systems: A Program Plan for DOE Office of Advanced Scientific Computing Research

    Energy Technology Data Exchange (ETDEWEB)

    Brown, D L

    2009-05-01

    networked systems, and (4) design, situational awareness and control of complex networks. The program elements consist of a group of Complex Networked Systems Research Institutes (CNSRI), tightly coupled to an associated individual-investigator-based Complex Networked Systems Basic Research (CNSBR) program. The CNSRIs will be principally located at the DOE National Laboratories and are responsible for identifying research priorities, developing and maintaining a networked systems modeling and simulation software infrastructure, operating summer schools, workshops and conferences, and coordinating with the CNSBR individual investigators. The CNSBR individual investigator projects will focus on specific challenges for networked systems. Relevancy of CNSBR research to DOE needs will be assured through the strong coupling provided between the CNSBR grants and the CNSRIs.

  2. Inclusion Complexes of Sunscreen Agents with β-Cyclodextrin: Spectroscopic and Molecular Modeling Studies

    Directory of Open Access Journals (Sweden)

    Nathir A. F. Al-Rawashdeh

    2013-01-01

    Full Text Available The inclusion complexes of selected sunscreen agents, namely oxybenzone (Oxy), octocrylene (Oct), and ethylhexyl-methoxycinnamate (Cin), with β-cyclodextrin (β-CD) were studied by UV-Vis spectroscopy, differential scanning calorimetry (DSC), ¹³C NMR techniques, and molecular mechanics (MM) calculations and modeling. A molecular modeling study of the entire process of the formation of 1:1 stoichiometry sunscreen agent/β-cyclodextrin structures has been used to contribute to the understanding and rationalization of the experimental results. Molecular mechanics calculations, together with ¹³C NMR measurements, for the complex with β-CD have been used to describe details of the structural, energetic, and dynamic features of the host-guest complex. Accurate structures of CD inclusion complexes have been derived from MM calculations and modeling. The photodegradation reaction of the sunscreen agents' molecules in lotion was explored using UV-Vis spectroscopy. It has been demonstrated that the photostability of these selected sunscreen agents is enhanced upon forming inclusion complexes with β-CD in lotion. The results of this study demonstrate that β-CD can be utilized as a photostabilizer additive for enhancing the photostability of the selected sunscreen agents' molecules.

  3. The Model of Complex Structure of Quark

    Science.gov (United States)

    Liu, Rongwu

    2017-09-01

    In Quantum Chromodynamics, the quark is known as a kind of point-like fundamental particle which carries mass, charge, color, and flavor; strong interaction takes place between quarks by means of exchanging intermediate particles, the gluons. An important consequence of this theory is that strong interaction is a kind of short-range force with the features of ``asymptotic freedom'' and ``quark confinement''. In order to reveal the nature of strong interaction, the ``bag'' model of vacuum and the ``string'' model of string theory were proposed in the context of quantum mechanics, but neither of them can provide a clear interaction mechanism. This article formulates a new mechanism by proposing a model of complex structure of the quark, which can be outlined as follows: (1) The quark (as well as the electron, etc.) is a kind of complex structure composed of a fundamental particle (fundamental matter with mass and electricity) and a fundamental volume field (fundamental matter with flavor and color) which exists in the form of a limited volume; the fundamental particle lies in the center of the fundamental volume field and forms the ``nucleus'' of the quark. (2) Like the static electric force, the color field force between quarks has a classical form: it is proportional to the square of the color quantity carried by each color field and inversely proportional to the cross-sectional area of the overlapping color fields along the force direction; it has the properties of overlap, saturation, non-centrality, and constancy. (3) Any volume field undergoes deformation when interacting with another volume field; the deformation force follows Hooke's law. (4) The phenomena of ``asymptotic freedom'' and ``quark confinement'' are the result of the color field force and the deformation force.
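
    In compact form, the force laws stated in points (2) and (3) above can be written as follows (our illustrative symbols, not the author's: q_c the color quantity, A_o the cross-sectional area of the overlapping color fields, k a stiffness constant):

        \[
          F_{\text{color}} \propto \frac{q_c^{\,2}}{A_o}, \qquad
          F_{\text{deform}} = -k\,\Delta x .
        \]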

  4. Lab-on-a-brane: A novel physiologically relevant planar arterial model to study transendothelial transport

    Science.gov (United States)

    Budhwani, Karim Ismail

    The tremendous quality of life impact notwithstanding, cardiovascular diseases and cancer add up to over US$ 700bn each year in financial costs alone. Aging and population growth are expected to further expand the problem space while drug research and development remain expensive. However, preclinical costs can be substantially mitigated by substituting animal models with in vitro devices that accurately model human cardiovascular transport. Here we present a novel physiologically relevant lab-on-a-brane that simulates in vivo pressure, flow, strain, and shear waveforms associated with normal and pathological conditions in large and small blood vessels for studying molecular transport across the endothelial monolayer. The device builds upon previously demonstrated integrated microfluidic loop design by: (a) introducing nanoscale pores in the substrate membrane to enable transmembrane molecular transport, (b) transforming the substrate membrane into a nanofibrous matrix for 3D smooth muscle cell (SMC) tissue culture, (c) integrating electrospinning fabrication methods, (d) engineering an invertible sandwich cell culture device architecture, and (e) devising a healthy co-culture mechanism for human arterial endothelial cell (HAEC) monolayer and multiple layers of human smooth muscle cells (HSMC) to accurately mimic arterial anatomy. Structural and mechanical characterization was conducted using confocal microscopy, SEM, stress/strain analysis, and infrared spectroscopy. Transport was characterized using FITC-Dextran hydraulic permeability protocol. Structure and transport characterization successfully demonstrate device viability as a physiologically relevant arterial mimic for testing transendothelial transport. Thus, our lab-on-a-brane provides a highly effective and efficient, yet considerably inexpensive, physiologically relevant alternative for pharmacokinetic evaluation; possibly reducing animals used in pre-clinical testing, clinical trials cost from false

  5. Complex Coronary Hemodynamics - Simple Analog Modelling as an Educational Tool.

    Science.gov (United States)

    Parikh, Gaurav R; Peter, Elvis; Kakouros, Nikolaos

    2017-01-01

    Invasive coronary angiography remains the cornerstone for evaluation of coronary stenoses despite there being a poor correlation between luminal loss assessment by coronary luminography and myocardial ischemia. This is especially true for coronary lesions deemed moderate by visual assessment. Coronary pressure-derived fractional flow reserve (FFR) has emerged as the gold standard for the evaluation of hemodynamic significance of coronary artery stenosis, which is cost effective and leads to improved patient outcomes. There are, however, several limitations to the use of FFR including the evaluation of serial stenoses. In this article, we discuss the electronic-hydraulic analogy and the utility of simple electrical modelling to mimic the coronary circulation and coronary stenoses. We exemplify the effect of tandem coronary lesions on the FFR by modelling of a patient with sequential disease segments and complex anatomy. We believe that such computational modelling can serve as a powerful educational tool to help clinicians better understand the complexity of coronary hemodynamics and improve patient care.
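
    A minimal sketch of the electronic-hydraulic analogy described above, assuming pressure maps to voltage, flow to current, and each stenosis to a series resistor; all numbers are illustrative, not patient data:

        def ffr_series_stenoses(p_aortic, q_flow, stenosis_resistances):
            """Distal pressure and FFR for tandem stenoses modeled as series
            resistors: pressure ~ voltage, flow ~ current, Pd = Pa - Q * R_total.
            """
            r_total = sum(stenosis_resistances)
            p_distal = p_aortic - q_flow * r_total
            return p_distal, p_distal / p_aortic

        # Pa = 100 mmHg, hyperemic flow 4 mL/s, two lesions of 3 and 4 mmHg*s/mL.
        pd, ffr = ffr_series_stenoses(100.0, 4.0, [3.0, 4.0])
        print(f"Pd = {pd:.0f} mmHg, FFR = {ffr:.2f}")  # FFR = 0.72 < 0.80

    Even this toy model shows the clinically relevant point about tandem lesions: the resistances add, so two individually moderate stenoses can jointly push the FFR below the treatment threshold.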

  6. EVALUATING PREDICTIVE ERRORS OF A COMPLEX ENVIRONMENTAL MODEL USING A GENERAL LINEAR MODEL AND LEAST SQUARE MEANS

    Science.gov (United States)

    A General Linear Model (GLM) was used to evaluate the deviation of predicted values from expected values for a complex environmental model. For this demonstration, we used the default level interface of the Regional Mercury Cycling Model (R-MCM) to simulate epilimnetic total mer...

  7. The Algebra of Complex Numbers.

    Science.gov (United States)

    LePage, Wilbur R.

    This programed text is an introduction to the algebra of complex numbers for engineering students, particularly because of its relevance to important problems of applications in electrical engineering. It is designed for a person who is well experienced with the algebra of real numbers and calculus, but who has no experience with complex number…

  8. Systematic Assessment of Neutron and Gamma Backgrounds Relevant to Operational Modeling and Detection Technology Implementation

    Energy Technology Data Exchange (ETDEWEB)

    Archer, Daniel E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hornback, Donald Eric [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Johnson, Jeffrey O. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Nicholson, Andrew D. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Patton, Bruce W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Peplow, Douglas E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Miller, Thomas Martin [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Ayaz-Maierhafer, Birsen [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-01-01

    This report summarizes the findings of a two year effort to systematically assess neutron and gamma backgrounds relevant to operational modeling and detection technology implementation. The first year effort focused on reviewing the origins of background sources and their impact on measured rates in operational scenarios of interest. The second year has focused on the assessment of detector and algorithm performance as they pertain to operational requirements against the various background sources and background levels.

  9. A viscoelastic-viscoplastic model for short-fibre reinforced polymers with complex fibre orientations

    Directory of Open Access Journals (Sweden)

    Nciri M.

    2015-01-01

    Full Text Available This paper presents an innovative approach for the modelling of viscous behaviour of short-fibre reinforced composites (SFRC with complex distributions of fibre orientations and for a wide range of strain rates. As an alternative to more complex homogenisation methods, the model is based on an additive decomposition of the state potential for the computation of composite’s macroscopic behaviour. Thus, the composite material is seen as the assembly of a matrix medium and several linear elastic fibre media. The division of short fibres into several families means that complex distributions of orientation or random orientation can be easily modelled. The matrix behaviour is strain-rate sensitive, i.e. viscoelastic and/or viscoplastic. Viscoelastic constitutive laws are based on a generalised linear Maxwell model and the modelling of the viscoplasticity is based on an overstress approach. The model is tested for the case of a polypropylene reinforced with short-glass fibres with distributed orientations and subjected to uniaxial tensile tests, in different loading directions and under different strain rates. Results demonstrate the efficiency of the model over a wide range of strain rates.
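
    The additive decomposition of the state potential can be sketched as follows (our notation, assuming N fibre families with volume fractions v_i; the paper's exact variables may differ):

        \[
          \rho\psi \;=\; \rho\psi_m\!\left(\boldsymbol{\varepsilon}^{ve},\,\boldsymbol{\varepsilon}^{vp}\right)
                   \;+\; \sum_{i=1}^{N} v_i\,\rho\psi_f^{(i)}(\boldsymbol{\varepsilon}),
        \]

    where psi_m is the viscoelastic-viscoplastic matrix potential and each psi_f^(i) is the linear elastic potential of one fibre orientation family.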

  10. A framework for modelling the complexities of food and water security under globalisation

    Directory of Open Access Journals (Sweden)

    B. J. Dermody

    2018-01-01

    Full Text Available We present a new framework for modelling the complexities of food and water security under globalisation. The framework sets out a method to capture regional and sectoral interdependencies and cross-scale feedbacks within the global food system that contribute to emergent water use patterns. The framework integrates aspects of existing models and approaches in the fields of hydrology and integrated assessment modelling. The core of the framework is a multi-agent network of city agents connected by infrastructural trade networks. Agents receive socio-economic and environmental constraint information from integrated assessment models and hydrological models respectively and simulate complex, socio-environmental dynamics that operate within those constraints. The emergent changes in food and water resources are aggregated and fed back to the original models with minimal modification of the structure of those models. It is our conviction that the framework presented can form the basis for a new wave of decision tools that capture complex socio-environmental change within our globalised world. In doing so they will contribute to illuminating pathways towards a sustainable future for humans, ecosystems and the water they share.

  11. A density-based clustering model for community detection in complex networks

    Science.gov (United States)

    Zhao, Xiang; Li, Yantao; Qu, Zehui

    2018-04-01

    Network clustering (or graph partitioning) is an important technique for uncovering the underlying community structures in complex networks, which has been widely applied in various fields including astronomy, bioinformatics, sociology, and bibliometric. In this paper, we propose a density-based clustering model for community detection in complex networks (DCCN). The key idea is to find group centers with a higher density than their neighbors and a relatively large integrated-distance from nodes with higher density. The experimental results indicate that our approach is efficient and effective for community detection of complex networks.
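
    A toy sketch of the density-peak idea described above, with density approximated by node degree and the integrated-distance by the hop distance to the nearest higher-density node; this illustrates the general principle, not the authors' exact DCCN algorithm:

        from collections import deque

        def bfs_dist(adj, src):
            """Hop distances from src in an unweighted graph (adjacency dict)."""
            dist, q = {src: 0}, deque([src])
            while q:
                u = q.popleft()
                for v in adj[u]:
                    if v not in dist:
                        dist[v] = dist[u] + 1
                        q.append(v)
            return dist

        def density_peak_communities(adj, n_centers=2):
            """Centers have high density (degree) and lie far from any
            higher-density node; other nodes join their nearest such node."""
            rho = {u: len(vs) for u, vs in adj.items()}
            dist = {u: bfs_dist(adj, u) for u in adj}
            delta, parent = {}, {}
            for u in adj:
                higher = [v for v in dist[u] if rho[v] > rho[u]]
                if higher:
                    parent[u] = min(higher, key=lambda v: dist[u][v])
                    delta[u] = dist[u][parent[u]]
                else:
                    parent[u], delta[u] = None, max(dist[u].values())
            centers = sorted(adj, key=lambda u: rho[u] * delta[u])[-n_centers:]
            label = {c: i for i, c in enumerate(centers)}
            def assign(u):
                if u in label:
                    return label[u]
                p = parent[u]
                label[u] = assign(p) if p is not None else len(label)
                return label[u]
            return {u: assign(u) for u in adj}

        # Two triangles joined by a single bridge edge.
        adj = {1: [2, 3], 2: [1, 3], 3: [1, 2, 4], 4: [3, 5, 6], 5: [4, 6], 6: [4, 5]}
        print(density_peak_communities(adj))  # {1: 0, 2: 0, 3: 0, 4: 1, 5: 1, 6: 1}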

  12. Applying complexity theory: A primer for identifying and modeling firm anomalies

    Directory of Open Access Journals (Sweden)

    Arch G. Woodside

    2018-01-01

    Full Text Available This essay elaborates on the usefulness of embracing complexity theory, modeling outcomes rather than directionality, and modeling complex rather than simple outcomes in strategic management. Complexity theory includes the tenet that most antecedent conditions are neither sufficient nor necessary for the occurrence of a specific outcome. Identifying a firm by individual antecedents (i.e., non-innovative versus highly innovative, small versus large size in sales or number of employees, or serving local versus international markets) provides shallow information in modeling specific outcomes (e.g., high sales growth or high profitability)—even if directional analyses (e.g., regression analysis, including structural equation modeling) indicate that the independent (main) effects of the individual antecedents relate to outcomes directionally—because firm (case) anomalies almost always occur to main effects. Examples: a number of highly innovative firms have low sales while others have high sales, and a number of non-innovative firms have low sales while others have high sales. Breaking away from the current dominant logic of directionality testing—null hypothesis statistical testing (NHST)—to embrace somewhat precise outcome testing (SPOT) is necessary for extracting highly useful information about the causes of anomalies—associations opposite to expected and "statistically significant" main effects. The study of anomalies extends to identifying the occurrences of four-corner strategy outcomes: firms doing well in favorable circumstances, firms doing badly in favorable circumstances, firms doing well in unfavorable circumstances, and firms doing badly in unfavorable circumstances. Models of four-corner strategy outcomes advance strategic management beyond the current dominant logic of directional modeling of single outcomes.

  13. Chromate adsorption on selected soil minerals: Surface complexation modeling coupled with spectroscopic investigation

    Energy Technology Data Exchange (ETDEWEB)

    Veselská, Veronika, E-mail: veselskav@fzp.czu.cz [Department of Environmental Geosciences, Faculty of Environmental Sciences, Czech University of Life Sciences Prague, Kamýcka 129, CZ-16521, Prague (Czech Republic); Fajgar, Radek [Department of Analytical and Material Chemistry, Institute of Chemical Process Fundamentals of the CAS, v.v.i., Rozvojová 135/1, CZ-16502, Prague (Czech Republic); Číhalová, Sylva [Department of Environmental Geosciences, Faculty of Environmental Sciences, Czech University of Life Sciences Prague, Kamýcka 129, CZ-16521, Prague (Czech Republic); Bolanz, Ralph M. [Institute of Geosciences, Friedrich-Schiller-University Jena, Carl-Zeiss-Promenade 10, DE-07745, Jena (Germany); Göttlicher, Jörg; Steininger, Ralph [ANKA Synchrotron Radiation Facility, Karlsruhe Institute of Technology, Hermann-von-Helmholtz-Platz 1, DE-76344, Eggenstein-Leopoldshafen (Germany); Siddique, Jamal A.; Komárek, Michael [Department of Environmental Geosciences, Faculty of Environmental Sciences, Czech University of Life Sciences Prague, Kamýcka 129, CZ-16521, Prague (Czech Republic)

    2016-11-15

    Highlights: • Study of Cr(VI) adsorption on soil minerals over a large range of conditions. • Combined surface complexation modeling and spectroscopic techniques. • Diffuse-layer and triple-layer models used to obtain fits to experimental data. • Speciation of Cr(VI) and Cr(III) was assessed. - Abstract: This study investigates the mechanisms of Cr(VI) adsorption on natural clay (illite and kaolinite) and synthetic (birnessite and ferrihydrite) minerals, including its speciation changes, combining quantitative thermodynamically based mechanistic surface complexation models (SCMs) with spectroscopic measurements. A series of adsorption experiments has been performed at different pH values (3–10), ionic strengths (0.001–0.1 M KNO₃), sorbate concentrations (10⁻⁴, 10⁻⁵, and 10⁻⁶ M Cr(VI)), and sorbate/sorbent ratios (50–500). Fourier transform infrared spectroscopy, X-ray photoelectron spectroscopy, and X-ray absorption spectroscopy were used to determine the surface complexes, including surface reactions. Adsorption of Cr(VI) is strongly ionic strength dependent. For ferrihydrite at pH <7, a simple diffuse-layer model provides a reasonable prediction of adsorption. For birnessite, bidentate inner-sphere complexes of chromate and dichromate resulted in a better diffuse-layer model fit. For kaolinite, outer-sphere complexation prevails mainly at lower Cr(VI) loadings. Dissolution of solid phases needs to be considered for better SCM fits. The coupled SCM and spectroscopic approach is thus useful for investigating individual minerals responsible for Cr(VI) retention in soils, and improving the handling and remediation processes.

  14. Principle of an operational complexity index for the characterization of the human factor relevance of future reactors concepts

    International Nuclear Information System (INIS)

    Papin, Bernard

    2004-01-01

    With the increasing reliability of modern technological systems, the human contribution to the global risk in the operation of industrial systems is becoming more and more significant: in nuclear reactor operation, for example, a recent PSA estimate puts this contribution at about 25% of the risk of core melting, all situations considered. This urges the designers of future nuclear reactors to consider minimising this Human Factor (HF) contribution at the very early stages of design: experience feedback shows that it is indeed at this stage that the fundamental design options, which most affect human reliability in operation, are fixed. The problem is that at these early design stages it is quite impossible to apply formal human reliability methods to support this HF optimisation, since the precise operating conditions of the reactor are not yet known in enough detail. In this paper, another approach to HF evaluation during design, based on functional and operational complexity assessment, is proposed. As an illustration, this approach is used to compare various concepts of Pressurized Water Reactors from the point of view of Human Factor relevance. (Author)

  15. The Limits to Relevance

    Science.gov (United States)

    Averill, M.; Briggle, A.

    2006-12-01

    Science policy and knowledge production lately have taken a pragmatic turn. Funding agencies increasingly are requiring scientists to explain the relevance of their work to society. This stems in part from mounting critiques of the "linear model" of knowledge production in which scientists operating according to their own interests or disciplinary standards are presumed to automatically produce knowledge that is of relevance outside of their narrow communities. Many contend that funded scientific research should be linked more directly to societal goals, which implies a shift in the kind of research that will be funded. While both authors support the concept of useful science, we question the exact meaning of "relevance" and the wisdom of allowing it to control research agendas. We hope to contribute to the conversation by thinking more critically about the meaning and limits of the term "relevance" and the trade-offs implicit in a narrow utilitarian approach. The paper will consider which interests tend to be privileged by an emphasis on relevance and address issues such as whose goals ought to be pursued and why, and who gets to decide. We will consider how relevance, narrowly construed, may actually limit the ultimate utility of scientific research. The paper also will reflect on the worthiness of research goals themselves and their relationship to a broader view of what it means to be human and to live in society. Just as there is more to being human than the pragmatic demands of daily life, there is more at issue with knowledge production than finding the most efficient ways to satisfy consumer preferences or fix near-term policy problems. We will conclude by calling for a balanced approach to funding research that addresses society's most pressing needs but also supports innovative research with less immediately apparent application.

  16. Happiness: origins, forms, and technical relevance.

    Science.gov (United States)

    Akhtar, Salman

    2010-09-01

    By critically reviewing Freud's views on happiness, and also those of Helene Deutsch, Bertram Lewin, Melanie Klein, and Heinz Kohut, the author evolves a complex and multilayered perspective on the phenomenon. He categorizes happiness into four related and occasionally overlapping varieties: pleasure-based happiness (elation), assertion-based happiness (joy), merger-based happiness (ecstasy), and fulfillment-based happiness (contentment). After entering some caveats and drawing from his clinical experience, the author then demonstrates the relevance of these ideas to the conduct of psychotherapy and psychoanalysis.

  17. Synergy and destructive interferences between local magnetic anisotropies in binuclear complexes

    Energy Technology Data Exchange (ETDEWEB)

    Guihéry, Nathalie; Ruamps, Renaud [Laboratoire de Chimie et Physique Quantiques, UMR5625, University of Toulouse 3, Paul Sabatier, 118 route de Narbonne, 31062 Toulouse (France); Maurice, Rémi [SUBATECH, IN2P3/EMN Nantes/University of Nantes, 4 rue Alfred Kastler, BP 20722 44307, Nantes, Cedex 3 (France); Graaf, Coen de [University Rovira i Virgili, Marcelli Domingo s/n, 43007 Tarragona (Spain)

    2015-12-31

    Magnetic anisotropy is responsible for the single molecule magnet behavior of transition metal complexes. This behavior is characterized by a slow relaxation of the magnetization for low enough temperatures, and thus for a possible blocking of the magnetization. This bistable behavior can lead to possible technological applications in the domain of data storage or quantum computing. Therefore, the understanding of the microscopic origin of magnetic anisotropy has received a considerable interest during the last two decades. The presentation focuses on the determination of the anisotropy parameters of both mono-nuclear and bi-nuclear types of complexes and on the control and optimization of the anisotropic properties. The validity of the model Hamiltonians commonly used to characterize such complexes has been questioned and it is shown that neither the standard multispin Hamiltonian nor the giant spin Hamiltonian are appropriate for weakly coupled ions. Alternative models have been proposed and used to properly extract the relevant parameters. Rationalizations of the magnitude and nature of both local anisotropies of single ions and the molecular anisotropy of polynuclear complexes are provided. The synergy and interference effects between local magnetic anisotropies are studied in a series of binuclear complexes.
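
    For orientation, the two model Hamiltonians whose validity is questioned here are conventionally written, for a binuclear complex, as follows (standard molecular-magnetism notation, not quoted from this record):

        \[
          \hat{H}_{\text{multispin}} \;=\; J\,\hat{\mathbf{S}}_1\!\cdot\!\hat{\mathbf{S}}_2
            \;+\; \hat{\mathbf{S}}_1\mathbf{D}_1\hat{\mathbf{S}}_1
            \;+\; \hat{\mathbf{S}}_2\mathbf{D}_2\hat{\mathbf{S}}_2
            \;+\; \hat{\mathbf{S}}_1\mathbf{D}_{12}\hat{\mathbf{S}}_2,
          \qquad
          \hat{H}_{\text{giant spin}} \;=\; \hat{\mathbf{S}}\mathbf{D}\hat{\mathbf{S}},
        \]

    where the D tensors carry the local and exchange-induced anisotropies. The giant-spin form presupposes the strong-exchange limit, which is precisely what breaks down for the weakly coupled ions discussed above.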

  18. Encoding of complexity, shape and curvature by macaque infero-temporal neurons

    Directory of Open Access Journals (Sweden)

    Greet Kayaert

    2011-07-01

    Full Text Available We recorded responses of macaque infero-temporal (IT) neurons to a stimulus set of Fourier Boundary Descriptor shapes wherein complexity, general shape and curvature were systematically varied. We analyzed the response patterns of the neurons to the different stimuli using multi-dimensional scaling. The resulting neural shape space differed in important ways from the physical, image-based shape space. We found a particular sensitivity for the presence of curved versus straight contours that existed only for the simple but not for the medium and highly complex shapes. Also, IT neurons could linearly separate the simple and the complex shapes within a low-dimensional neural shape space, but no distinction was found between the medium and high levels of complexity. None of these effects could be derived from physical image metrics, either directly or by comparing the neural data with similarities yielded by two models of low-level visual processing (one using wavelet-based filters and one that models position and size invariant object selectivity through four hierarchically organized neural layers). This study highlights the relevance of complexity to IT neural encoding, both as a neurally independently represented shape property and through its influence on curvature detection.

  19. arXiv Spin models in complex magnetic fields: a hard sign problem

    CERN Document Server

    de Forcrand, Philippe

    2018-01-01

    Coupling spin models to complex external fields can give rise to interesting phenomena like zeroes of the partition function (Lee-Yang zeroes, edge singularities) or oscillating propagators. Unfortunately, it usually also leads to a severe sign problem that can be overcome only in special cases; if the partition function has zeroes, the sign problem is even representation-independent at these points. In this study, we couple the N-state Potts model in different ways to a complex external magnetic field and discuss the above mentioned phenomena and their relations based on analytic calculations (1D) and results obtained using a modified cluster algorithm (general D) that in many cases either cures or at least drastically reduces the sign problem induced by the complex external field.
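
    Schematically, in standard Potts conventions (our notation, not the paper's), the partition function with a complex field h = h_R + i h_I coupled to one of the N spin states reads

        \[
          Z(h) \;=\; \sum_{\{s\}} \exp\!\Big[\,\beta \sum_{\langle ij\rangle} \delta_{s_i s_j}
                 \;+\; (h_R + i\,h_I) \sum_i \delta_{s_i,1}\Big],
        \]

    so each configuration carries a complex phase factor and the sampling weights are no longer positive, which is the sign problem; the Lee-Yang zeroes are the points in the complex h plane where Z(h) vanishes.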

  20. Arbitrary protein−protein docking targets biologically relevant interfaces

    International Nuclear Information System (INIS)

    Martin, Juliette; Lavery, Richard

    2012-01-01

    Protein-protein recognition is of fundamental importance in the vast majority of biological processes. However, it has already been demonstrated that it is very hard to distinguish true complexes from false complexes in so-called cross-docking experiments, where binary protein complexes are separated and the isolated proteins are all docked against each other and scored. Does this result, at least in part, reflect a physical reality? False complexes could reflect possible nonspecific or weak associations. In this paper, we investigate the twilight zone of protein-protein interactions, building on an interesting outcome of cross-docking experiments: false complexes seem to favor residues from the true interaction site, suggesting that randomly chosen partners dock in a non-random fashion on protein surfaces. Here, we carry out arbitrary docking of a non-redundant data set of 198 proteins, with more than 300 randomly chosen "probe" proteins. We investigate the tendency of arbitrary partners to aggregate at localized regions of the protein surfaces, the shape and compositional bias of the generated interfaces, and the potential of this property to predict biologically relevant binding sites. We show that the non-random localization of arbitrary partners after protein-protein docking is a generic feature of protein structures. The interfaces generated in this way are not systematically planar or curved, but tend to be closer than average to the center of the proteins. These results can be used to predict biological interfaces with an AUC value up to 0.69 alone, and 0.72 when used in combination with evolutionary information. An appropriate choice of random partners and number of docking models make this method computationally practical. It is also noted that nonspecific interfaces can point to alternate interaction sites in the case of proteins with multiple interfaces. We illustrate the usefulness of arbitrary docking using PEBP (Phosphatidylethanolamine binding

  2. Complex Constructivism: A Theoretical Model of Complexity and Cognition

    Science.gov (United States)

    Doolittle, Peter E.

    2014-01-01

    Education has long been driven by its metaphors for teaching and learning. These metaphors have influenced both educational research and educational practice. Complexity and constructivism are two theories that provide functional and robust metaphors. Complexity provides a metaphor for the structure of myriad phenomena, while constructivism…

  3. FACET: A simulation software framework for modeling complex societal processes and interactions

    Energy Technology Data Exchange (ETDEWEB)

    Christiansen, J. H.

    2000-06-02

    FACET, the Framework for Addressing Cooperative Extended Transactions, was developed at Argonne National Laboratory to address the need for a simulation software architecture in the style of an agent-based approach, but with sufficient robustness, expressiveness, and flexibility to be able to deal with the levels of complexity seen in real-world social situations. FACET is an object-oriented software framework for building models of complex, cooperative behaviors of agents. It can be used to implement simulation models of societal processes such as the complex interplay of participating individuals and organizations engaged in multiple concurrent transactions in pursuit of their various goals. These transactions can be patterned on, for example, clinical guidelines and procedures, business practices, government and corporate policies, etc. FACET can also address other complex behaviors such as biological life cycles or manufacturing processes. To date, for example, FACET has been applied to such areas as land management, health care delivery, avian social behavior, and interactions between natural and social processes in ancient Mesopotamia.

  4. Development of structural model of adaptive training complex in ergatic systems for professional use

    Science.gov (United States)

    Obukhov, A. D.; Dedov, D. L.; Arkhipov, A. E.

    2018-03-01

    The article considers the structural model of an adaptive training complex (ATC), which reflects the interrelations between the hardware, the software and the mathematical model of the ATC and describes the processes in this subject area. A description of the main components of the software and hardware complex, their interaction and their functioning within the common system is given. The article also gives a brief description of the mathematical models of personnel activity, the technical system and the influences, whose interactions formalize the regularities of ATC functioning. Studies of the main objects of training complexes and the connections between them will make it possible to implement the ATC in ergatic systems for professional use.

  5. Towards a Unified Theory of Health-Disease: I. Health as a complex model-object

    Directory of Open Access Journals (Sweden)

    Naomar Almeida-Filho

    2013-06-01

    Full Text Available Theory building is one of the most crucial challenges faced by basic, clinical and population research, which form the scientific foundations of health practices in contemporary societies. The objective of this study is to propose a Unified Theory of Health-Disease as a conceptual tool for modeling health-disease-care in the light of complexity approaches. With this aim, the epistemological basis of theoretical work in the health field and concepts of complexity theory as they relate to health problems are discussed. Secondly, the concepts of model-object, multi-planes of occurrence, modes of health and the disease-illness-sickness complex are introduced and integrated into a unified theoretical framework. Finally, in the light of recent epistemological developments, the concept of Health-Disease-Care Integrals is updated as a complex reference object fit for modeling health-related processes and phenomena.

  6. Modelling and simulating in-stent restenosis with complex automata

    NARCIS (Netherlands)

    Hoekstra, A.G.; Lawford, P.; Hose, R.

    2010-01-01

    In-stent restenosis, the maladaptive response of a blood vessel to injury caused by the deployment of a stent, is a multiscale system involving a large number of biological and physical processes. We describe a Complex Automata Model for in-stent restenosis, coupling bulk flow, drug diffusion, and

  7. Surface complexation modeling of the effects of phosphate on uranium(VI) adsorption

    Energy Technology Data Exchange (ETDEWEB)

    Romero-Gonzalez, M.R.; Cheng, T.; Barnett, M.O. [Auburn Univ., AL (United States). Dept. of Civil Engeneering; Roden, E.E. [Wisconsin Univ., Madison, WI (United States). Dept. of Geology and Geophysics

    2007-07-01

    Previously published data for the adsorption of U(VI) and/or phosphate onto amorphous Fe(III) oxides (hydrous ferric oxide, HFO) and crystalline Fe(III) oxides (goethite) were examined. These data were then used to test the ability of a commonly-used surface complexation model (SCM) to describe the adsorption of U(VI) and phosphate onto pure amorphous and crystalline Fe(III) oxides and synthetic goethite-coated sand, a surrogate for a natural Fe(III)-coated material, using the component additivity (CA) approach. Our modeling results show that this model was able to describe U(VI) adsorption onto both amorphous and crystalline Fe(III) oxides and also goethite-coated sand quite well in the absence of phosphate. However, because phosphate adsorption exhibits a stronger dependence on Fe(III) oxide type than U(VI) adsorption, we could not use this model to consistently describe phosphate adsorption onto both amorphous and crystalline Fe(III) oxides and goethite-coated sand. However, the effects of phosphate on U(VI) adsorption could be incorporated into the model to describe U(VI) adsorption to both amorphous and crystalline Fe(III) oxides and goethite-coated sand, at least for an initial approximation. These results illustrate both the potential and limitations of using surface complexation models developed from pure systems to describe metal/radionuclide adsorption under more complex conditions. (orig.)
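
    As a generic, textbook-style illustration of the kind of reaction such a surface complexation model contains (not the specific reaction set fitted in this study), uranyl sorption to a ferrol surface site can be written

        \[
          \equiv\!\mathrm{FeOH} \;+\; \mathrm{UO_2^{2+}}
          \;\rightleftharpoons\; \equiv\!\mathrm{FeOUO_2^{+}} \;+\; \mathrm{H^{+}},
          \qquad
          K = \frac{[\equiv\!\mathrm{FeOUO_2^{+}}]\,[\mathrm{H^{+}}]}
                   {[\equiv\!\mathrm{FeOH}]\,[\mathrm{UO_2^{2+}}]},
        \]

    and the component additivity approach then predicts sorption to a composite material, such as goethite-coated sand, as the sum of the contributions of its pure mineral components weighted by their abundances.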

  9. Are Models Easier to Understand than Code? An Empirical Study on Comprehension of Entity-Relationship (ER) Models vs. Structured Query Language (SQL) Code

    Science.gov (United States)

    Sanchez, Pablo; Zorrilla, Marta; Duque, Rafael; Nieto-Reyes, Alicia

    2011-01-01

    Models in Software Engineering are considered as abstract representations of software systems. Models highlight relevant details for a certain purpose, whereas irrelevant ones are hidden. Models are supposed to make system comprehension easier by reducing complexity. Therefore, models should play a key role in education, since they would ease the…

  10. COMPLEX OF NUMERICAL MODELS FOR COMPUTATION OF AIR ION CONCENTRATION IN PREMISES

    Directory of Open Access Journals (Sweden)

    M. M. Biliaiev

    2016-04-01

    Full Text Available Purpose. The article addresses the creation of a complex of numerical models for calculating ion concentration fields in premises of various purposes and in work areas. The complex should take into account the main physical factors influencing the formation of the ion concentration field: the aerodynamics of air jets in the room, the presence of furniture and equipment, the placement of ventilation openings, the ventilation mode, the location of ionization sources, the transfer of ions under the effect of the electric field, and other factors determining the intensity and shape of the ion concentration field. In addition, the complex of numerical models has to support express calculation of the ion concentration in premises, allowing quick screening of possible variants and an "enlarged" evaluation of air ion concentration. Methodology. A complex of numerical models for calculating the air ion regime in premises is developed. The CFD numerical model is based on the aerodynamics, electrostatics and mass transfer equations, and takes into account the effect of air flows caused by the ventilation operation, diffusion, electric field effects, as well as the interaction of ions of different polarities with each other and with dust particles. The proposed balance model for computation of the air ion regime indoors allows operative calculation of the ion concentration field considering pulsed operation of the ionizer. Findings. Calculated data are obtained, on the basis of which one can estimate the ion concentration anywhere in a premise with artificial air ionization. An example of calculating the negative ion concentration on the basis of the CFD numerical model in premises undergoing reengineering transformations is given. On the basis of the developed balance model the air ion concentration in the room volume was calculated. Originality. Results of the air ion regime computation in a premise, which
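
    One standard way to write the ion transport equation underlying such a CFD model (our formulation; the record does not reproduce the authors' equations) is, for the concentrations n± of positive and negative ions,

        \[
          \frac{\partial n_{\pm}}{\partial t}
          \;+\; \nabla\!\cdot\!\big(\mathbf{u}\,n_{\pm} \;\pm\; \mu_{\pm}\mathbf{E}\,n_{\pm}\big)
          \;=\; \nabla\!\cdot\!\big(D_{\pm}\nabla n_{\pm}\big)
          \;-\; \alpha\,n_{+}n_{-} \;-\; \beta\,n_{\pm}N_{p} \;+\; S_{\pm},
        \]

    with u the ventilation-driven air velocity, mu± the ion mobilities in the electric field E, D± the diffusion coefficients, alpha the ion-ion recombination coefficient, beta the attachment rate to dust particles of concentration N_p, and S± the ionizer source term.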

  11. Towards an Effective Financial Management: Relevance of Dividend Discount Model in Stock Price Valuation

    Directory of Open Access Journals (Sweden)

    Ana Mugoša

    2015-06-01

    Full Text Available The aim of this paper is to analyze the relevance of the dividend discount model, i.e. its specific form in stock price estimation known as the Gordon growth model. The expected dividends can be a measure of the cash flows returned to the stockholder. In this context, the model is useful for assessing how risk factors, such as interest rates and changing inflation rates, affect stock returns. This is especially important when investors are value oriented, i.e. when expected dividends are their main investing drivers. We compared the estimated with the actual stock price values and tested the statistical significance of price differences in 199 publicly traded European companies for the period 2010-2013. Statistical difference between pairs of price series (actual and estimated) was tested using Wilcoxon and Kruskal-Wallis tests of median and distribution equality. The hypothesis that the Gordon growth model cannot be a reliable measure of stock price valuation on the European equity market over the period 2010-2013, due to the influence of the global financial crisis, was rejected with 95% confidence. The Gordon growth model has proven to be a reliable measure of stock price valuation even over a period of strong global financial crisis influence.
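
    For reference, the Gordon growth model prices a stock as the present value of a dividend stream growing at a constant rate g and discounted at the required return r:

        \[
          P_0 \;=\; \frac{D_1}{r - g} \;=\; \frac{D_0\,(1+g)}{r - g},
          \qquad r > g .
        \]

    With an illustrative current dividend D_0 = 2, growth g = 3% and required return r = 8% (numbers invented for this example, not from the paper), the model gives P_0 = 2 x 1.03 / 0.05 = 41.2.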

  12. Interacting price model and fluctuation behavior analysis from Lempel–Ziv complexity and multi-scale weighted-permutation entropy

    Energy Technology Data Exchange (ETDEWEB)

    Li, Rui, E-mail: lirui1401@bjtu.edu.cn; Wang, Jun

    2016-01-08

    A financial price model is developed based on the voter interacting system in this work. The Lempel–Ziv complexity is introduced to analyze the complex behaviors of the stock market. Some stock market stylized facts, including fat tails, absence of autocorrelation and volatility clustering, are first investigated for the proposed price model. Then the complexity of fluctuation behaviors of the real stock markets and the proposed price model are explored mainly by Lempel–Ziv complexity (LZC) analysis and multi-scale weighted-permutation entropy (MWPE) analysis. A series of LZC analyses of the returns and the absolute returns of daily closing prices and moving average prices are performed. Moreover, the complexity of the returns, the absolute returns and their corresponding intrinsic mode functions (IMFs) derived from the empirical mode decomposition (EMD) with MWPE is also investigated. The numerical empirical study shows similar statistical and complex behaviors between the proposed price model and the real stock markets, which shows that the proposed model is feasible to some extent. - Highlights: • A financial price dynamical model is developed based on the voter interacting system. • Lempel–Ziv complexity is applied for the first time to investigate stock market dynamics. • MWPE is employed to explore the complexity of fluctuation behaviors of the stock market. • Empirical results show the feasibility of the proposed financial model.
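
    A minimal sketch of a Lempel–Ziv (1976) complexity computation on a binarized return series; the binarization rule and the normalization below follow common practice and are our assumptions, not necessarily the paper's exact scheme:

        import math
        import random

        def lz76_complexity(s):
            """Count the distinct phrases in the Lempel-Ziv (1976) parsing of s."""
            i, c, n = 0, 0, len(s)
            while i < n:
                l = 1
                # extend the phrase while it already occurs earlier in the string
                while i + l <= n and s[i:i + l] in s[:i + l - 1]:
                    l += 1
                c += 1
                i += l
            return c

        random.seed(1)
        returns = [random.gauss(0.0, 1.0) for _ in range(400)]  # stand-in returns
        median = sorted(returns)[len(returns) // 2]
        symbols = ''.join('1' if r > median else '0' for r in returns)
        c, n = lz76_complexity(symbols), len(symbols)
        print(c, c * math.log2(n) / n)  # normalized LZC is near 1 for random data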

  14. A computational approach to modeling cellular-scale blood flow in complex geometry

    Science.gov (United States)

    Balogh, Peter; Bagchi, Prosenjit

    2017-04-01

    We present a computational methodology for modeling cellular-scale blood flow in arbitrary and highly complex geometry. Our approach is based on immersed-boundary methods, which allow modeling flows in arbitrary geometry while resolving the large deformation and dynamics of every blood cell with high fidelity. The present methodology seamlessly integrates different modeling components dealing with stationary rigid boundaries of complex shape, moving rigid bodies, and highly deformable interfaces governed by nonlinear elasticity. Thus it enables us to simulate 'whole' blood suspensions flowing through physiologically realistic microvascular networks that are characterized by multiple bifurcating and merging vessels, as well as geometrically complex lab-on-chip devices. The focus of the present work is on the development of a versatile numerical technique that is able to consider deformable cells and rigid bodies flowing in three-dimensional arbitrarily complex geometries over a diverse range of scenarios. After describing the methodology, a series of validation studies are presented against analytical theory, experimental data, and previous numerical results. Then, the capability of the methodology is demonstrated by simulating flows of deformable blood cells and heterogeneous cell suspensions in both physiologically realistic microvascular networks and geometrically intricate microfluidic devices. It is shown that the methodology can predict several complex microhemodynamic phenomena observed in vascular networks and microfluidic devices. The present methodology is robust and versatile, and has the potential to scale up to very large microvascular networks at organ levels.
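
    The immersed-boundary coupling at the heart of such methods is conventionally written as follows (standard notation, not quoted from the paper): elastic forces F at the Lagrangian membrane markers X_k are spread onto the fluid grid, and grid velocities are interpolated back so the interface moves with the local flow, both via a regularized delta function delta_h:

        \[
          \mathbf{f}(\mathbf{x}) \;=\; \sum_{k} \mathbf{F}(\mathbf{X}_k)\,\delta_h(\mathbf{x}-\mathbf{X}_k),
          \qquad
          \frac{d\mathbf{X}_k}{dt} \;=\; \sum_{\mathbf{x}} \mathbf{u}(\mathbf{x})\,\delta_h(\mathbf{x}-\mathbf{X}_k)\,h^3 .
        \]

    This is what lets deformable cells and geometrically intricate boundaries be handled on a fixed grid, without a body-fitted mesh.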

  15. A mouse model of mitochondrial complex III dysfunction induced by myxothiazol

    Energy Technology Data Exchange (ETDEWEB)

    Davoudi, Mina [Pediatrics, Department of Clinical Sciences, Lund, Lund University, Lund 22185 (Sweden); Kallijärvi, Jukka; Marjavaara, Sanna [Folkhälsan Research Center, Biomedicum Helsinki, University of Helsinki, 00014 (Finland); Kotarsky, Heike; Hansson, Eva [Pediatrics, Department of Clinical Sciences, Lund, Lund University, Lund 22185 (Sweden); Levéen, Per [Pediatrics, Department of Clinical Sciences, Lund, Lund University, Lund 22185 (Sweden); Folkhälsan Research Center, Biomedicum Helsinki, University of Helsinki, 00014 (Finland); Fellman, Vineta, E-mail: Vineta.Fellman@med.lu.se [Pediatrics, Department of Clinical Sciences, Lund, Lund University, Lund 22185 (Sweden); Folkhälsan Research Center, Biomedicum Helsinki, University of Helsinki, 00014 (Finland); Children’s Hospital, Helsinki University Hospital, University of Helsinki, Helsinki 00029 (Finland)

    2014-04-18

    Highlights: • Reversible chemical inhibition of complex III in wild type mouse. • Myxothiazol causes decreased complex III activity in mouse liver. • The model is useful for therapeutic trials to improve mitochondrial function. - Abstract: Myxothiazol is a respiratory chain complex III (CIII) inhibitor that binds to the ubiquinol oxidation site Qo of CIII. It blocks electron transfer from ubiquinol to cytochrome b and thus inhibits CIII activity. It has been utilized as a tool in studies of respiratory chain function in in vitro and cell culture models. We developed a mouse model of biochemically induced and reversible CIII inhibition using myxothiazol. We administered myxothiazol intraperitoneally at a dose of 0.56 mg/kg to C57Bl/J6 mice every 24 h and assessed CIII activity, histology, lipid content, supercomplex formation, and gene expression in the livers of the mice. A reversible CIII activity decrease to 50% of control value occurred at 2 h post-injection. At 74 h only minor histological changes in the liver were found, supercomplex formation was preserved and no significant changes in the expression of genes indicating hepatotoxicity or inflammation were found. Thus, myxothiazol-induced CIII inhibition can be induced in mice for four days in a row without overt hepatotoxicity or lethality. This model could be utilized in further studies of respiratory chain function and pharmacological approaches to mitochondrial hepatopathies.

  16. Using multi-criteria analysis of simulation models to understand complex biological systems

    Science.gov (United States)

    Maureen C. Kennedy; E. David. Ford

    2011-01-01

    Scientists frequently use computer-simulation models to help solve complex biological problems. Typically, such models are highly integrated, they produce multiple outputs, and standard methods of model analysis are ill suited for evaluating them. We show how multi-criteria optimization with Pareto optimality allows for model outputs to be compared to multiple system...
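
    The core of such a multi-criteria comparison is Pareto dominance over the model's multiple output errors; a minimal sketch with invented data (lower error is better on every criterion):

        def dominates(a, b):
            """True if a is at least as good as b everywhere and better somewhere."""
            return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

        def pareto_front(candidates):
            """Parameter sets that no other candidate dominates."""
            return [c for c in candidates
                    if not any(dominates(o, c) for o in candidates if o is not c)]

        # Each tuple = (error on output 1, error on output 2) for one parameter set.
        runs = [(0.9, 0.1), (0.2, 0.8), (0.5, 0.5), (0.6, 0.6), (0.3, 0.9)]
        print(pareto_front(runs))  # [(0.9, 0.1), (0.2, 0.8), (0.5, 0.5)]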

  17. Experimental determination and modeling of arsenic complexation with humic and fulvic acids.

    Science.gov (United States)

    Fakour, Hoda; Lin, Tsair-Fuh

    2014-08-30

    The complexation of humic acid (HA) and fulvic acid (FA) with arsenic (As) in water was studied. Experimental results indicate that arsenic may form complexes with HA and FA with a higher affinity for arsenate than for arsenite. With the presence of iron oxide based adsorbents, binding of arsenic to HA/FA in water was significantly suppressed, probably due to adsorption of As and HA/FA. A two-site ligand binding model, considering only strong and weak site types of binding affinity, was successfully developed to describe the complexation of arsenic on the two natural organic fractions. The model showed that the numbers of weak sites were more than 10 times those of strong sites on both HA and FA for both arsenic species studied. The numbers of both types of binding sites were found to be proportional to the HA concentrations, while the apparent stability constants, defined for describing binding affinity between arsenic and the sites, are independent of the HA concentrations. To the best of our knowledge, this is the first study to characterize the impact of HA concentrations on the applicability of the ligand binding model, and to extrapolate the model to FA. The obtained results may give insights on the complexation of arsenic in HA/FA laden groundwater and on the selection of more effective adsorption-based treatment methods for natural waters. Copyright © 2014 Elsevier B.V. All rights reserved.
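
    A common functional form for such a two-site ligand-binding model, shown for orientation in our symbols rather than as the authors' exact parameterization, is

        \[
          [\mathrm{As}]_{\mathrm{bound}}
          \;=\; \frac{B_{s}\,K_{s}\,[\mathrm{As}]}{1 + K_{s}[\mathrm{As}]}
          \;+\; \frac{B_{w}\,K_{w}\,[\mathrm{As}]}{1 + K_{w}[\mathrm{As}]},
        \]

    where B_s and B_w are the strong- and weak-site capacities (proportional to the HA or FA concentration, per the abstract) and K_s, K_w are the apparent stability constants (independent of it).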

  18. Generalizing a complex model for gully threshold identification in the Mediterranean environment

    Science.gov (United States)

    Torri, D.; Borselli, L.; Iaquinta, P.; Iovine, G.; Poesen, J.; Terranova, O.

    2012-04-01

    Among the physical processes leading to land degradation, soil erosion by water is the most important, and gully erosion may contribute, at places, up to 70% of the total soil loss. Nevertheless, gully erosion has often been neglected in water soil erosion modeling, whilst more prominence has been given to rill and interrill erosion. Both to facilitate the processing by agricultural machinery and to take advantage of all the arable land, gullies are commonly removed at each crop cycle, with significant soil losses due to the repeated excavation of the channel by successive rainstorms. When the erosive forces of overland flow exceed the resistance of the soil particles to detachment and displacement, water erosion occurs and usually a channel is formed. As runoff is proportional to the local catchment area, a relationship between local slope, S, and contributing area, A, is supposed to exist. A "geomorphologic threshold" scheme is therefore suitable to interpret the physical process of gully initiation: accordingly, a gully is formed when a hydraulic threshold for incision exceeds the resistance of the soil particles to detachment and transport. Similarly, it appears reasonable that a gully ends when there is a reduction of slope, or the concentrated flow meets more resistant soil-vegetation complexes. This study aims to predict the location of the beginning of gullies in the Mediterranean environment, based on an evaluation of S and A by means of a mathematical model. For the identification of the areas prone to gully erosion, the model employs two empirical thresholds relevant to the head (Thead) and to the end (Tend) of the gullies (of the type SA^b > Thead for gully heads and SA^b < Tend for gully ends). Data from [...] situations (usually after abandonment), and c) databases for cropland have been merged. Selected data have been examined and interpreted mathematically to assess a value to be taken as a constant for the exponent "b" of the above equation. Literature data on the problem of topological thresholds Tend are
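
    A minimal sketch of how such S-A thresholds classify points along a flow path; the exponent and threshold values are invented for illustration, not the calibrated Mediterranean ones:

        def classify_gully(points, b, t_head, t_end):
            """Flag where concentrated flow can incise: S * A**b above the head
            threshold starts a gully; dropping below the end threshold stops it."""
            in_gully, labels = False, []
            for slope, area in points:  # points ordered downslope along a flow path
                index = slope * area ** b
                if not in_gully and index > t_head:
                    in_gully = True
                elif in_gully and index < t_end:
                    in_gully = False
                labels.append(in_gully)
            return labels

        # Illustrative path: (local slope, upslope contributing area in m^2).
        path = [(0.05, 300), (0.09, 800), (0.10, 1500), (0.03, 1600), (0.02, 1700)]
        print(classify_gully(path, b=0.4, t_head=1.5, t_end=0.7))
        # -> [False, False, True, False, False]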

  19. Beam model for seismic analysis of complex shear wall structure based on the strain energy equivalence

    International Nuclear Information System (INIS)

    Reddy, G.R.; Mahajan, S.C.; Suzuki, Kohei

    1997-01-01

    A nuclear reactor building structure consists of shear walls with complex geometry, beams and columns. The complexity of the structure is explained in the section Introduction. Seismic analysis of the complex reactor building structure using the continuum mechanics approach may produce good results, but this method is very difficult to apply. Hence, the finite element approach is found to be a useful technique for solving the dynamic equations of the reactor building structure. In this approach, a model built from finite elements such as brick, plate and shell elements may produce accurate results. However, this model also poses some difficulties, which are explained in the section Modeling Techniques. Therefore, seismic analysis of complex structures is generally carried out using a lumped mass beam model. This model is preferred because of its simplicity and economy. Nevertheless, mathematical modeling of a shear wall structure as a beam requires specialized skill and a thorough understanding of the structure. For accurate seismic analysis, it is necessary to model the stiffness, mass and damping realistically. In linear seismic analysis, modeling of the mass and damping poses fewer problems than modeling the stiffness. When used to represent a complex structure, the stiffness of the beam is directly related to the shear wall section properties such as area, shear area and moment of inertia. Various beam models, classified by the method of stiffness evaluation, are also explained in the section Modeling Techniques. In the section Case Studies the accuracy and simplicity of the beam models are examined. Among the various beam models, the one which evaluates the stiffness using strain energy equivalence proves to be the simplest and most accurate for modeling the complex shear wall structure. (author)
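
    To make the lumped-mass idea concrete, the sketch below assembles the mass and stiffness matrices of a small stick model and extracts its natural frequencies from the generalized eigenproblem K phi = w^2 M phi. The storey masses and stiffnesses are invented for illustration; they are not taken from the paper.

```python
import numpy as np
from scipy.linalg import eigh

# Illustrative 4-storey lumped-mass "stick" model; masses in kg,
# storey (shear) stiffnesses in N/m (hypothetical values)
m = np.array([2.0e6, 2.0e6, 2.0e6, 1.5e6])
k = np.array([8.0e9, 8.0e9, 6.0e9, 4.0e9])

M = np.diag(m)
K = np.zeros((4, 4))
for i in range(4):
    K[i, i] += k[i]
    if i + 1 < 4:
        K[i, i] += k[i + 1]
        K[i, i + 1] = K[i + 1, i] = -k[i + 1]

w2, _ = eigh(K, M)                    # generalized eigenproblem
freqs = np.sqrt(w2) / (2 * np.pi)     # natural frequencies, Hz
print("modal frequencies [Hz]:", np.round(freqs, 2))
```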

  20. The relation between geometry, hydrology and stability of complex hillslopes examined using low-dimensional hydrological models

    NARCIS (Netherlands)

    Talebi, A.

    2008-01-01

    Key words: Hillslope geometry, Hillslope hydrology, Hillslope stability, Complex hillslopes, Modeling shallow landslides, HSB model, HSB-SM model.

    The hydrologic response of a hillslope to rainfall involves a complex, transient saturated-unsaturated interaction that usually leads to a

  1. The Complexity of Developmental Predictions from Dual Process Models

    Science.gov (United States)

    Stanovich, Keith E.; West, Richard F.; Toplak, Maggie E.

    2011-01-01

    Drawing developmental predictions from dual-process theories is more complex than is commonly realized. Overly simplified predictions drawn from such models may lead to premature rejection of the dual process approach as one of many tools for understanding cognitive development. Misleading predictions can be avoided by paying attention to several…

  2. Development and evaluation of a musculoskeletal model of the elbow joint complex

    Science.gov (United States)

    Gonzalez, Roger V.; Hutchins, E. L.; Barr, Ronald E.; Abraham, Lawrence D.

    1993-01-01

    This paper describes the development and evaluation of a musculoskeletal model that represents human elbow flexion-extension and forearm pronation-supination. The length, velocity, and moment arm for each of the eight musculotendon actuators were based on skeletal anatomy and position. Musculotendon parameters were determined for each actuator and verified by comparing analytical torque-angle curves with experimental joint torque data. The parameters and skeletal geometry were also utilized in the musculoskeletal model for the analysis of ballistic elbow joint complex movements. The key objective was to develop a computational model, guided by parameterized optimal control, to investigate the relationship among patterns of muscle excitation, individual muscle forces, and movement kinematics. The model was verified using experimental kinematic, torque, and electromyographic data from volunteer subjects performing ballistic elbow joint complex movements.
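
    The torque-angle construction described, summing moment arm times force across actuators, can be sketched compactly. The two-muscle reduction, the cosine moment-arm shape, and all parameter values below are invented simplifications for illustration; the paper's model uses eight anatomy-based musculotendon actuators.

```python
import numpy as np

def moment_arm(theta, r0, theta_peak):
    # Smooth angle dependence; real models use anatomy-based curves
    return r0 * np.cos(theta - theta_peak)

def muscle_force(activation, f_max):
    # Ignoring force-length and force-velocity effects in this sketch
    return activation * f_max

theta = np.linspace(0.0, 2.4, 50)     # elbow flexion angle, rad
flexor = moment_arm(theta, 0.04, 1.3) * muscle_force(0.8, 1200.0)
extensor = -moment_arm(theta, 0.02, 1.0) * muscle_force(0.1, 2000.0)
net_torque = flexor + extensor        # analytical torque-angle curve
print("peak net flexion torque [N m]:", round(float(net_torque.max()), 1))
```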

  3. Using Gene Ontology to describe the role of the neurexin-neuroligin-SHANK complex in human, mouse and rat and its relevance to autism.

    Science.gov (United States)

    Patel, Sejal; Roncaglia, Paola; Lovering, Ruth C

    2015-06-06

    People with an autistic spectrum disorder (ASD) display a variety of characteristic behavioral traits, including impaired social interaction, communication difficulties and repetitive behavior. This complex neurodevelopmental disorder is known to be associated with a combination of genetic and environmental factors. Neurexins and neuroligins play a key role in synaptogenesis and neurexin-neuroligin adhesion is one of several processes that have been implicated in autism spectrum disorders. In this report we describe the manual annotation of a selection of gene products known to be associated with autism and/or the neurexin-neuroligin-SHANK complex and demonstrate how a focused annotation approach leads to the creation of more descriptive Gene Ontology (GO) terms, as well as an increase in both the number of gene product annotations and their granularity, thus improving the data available in the GO database. The manual annotations we describe will impact on the functional analysis of a variety of future autism-relevant datasets. Comprehensive gene annotation is an essential aspect of genomic and proteomic studies, as the quality of gene annotations incorporated into statistical analysis tools affects the effective interpretation of data obtained through genome-wide association studies, next-generation sequencing, proteomic and transcriptomic datasets.

  4. Lewis basicity of relevant monoanions in a non-protogenic organic solvent using a zinc(ii) Schiff-base complex as a reference Lewis acid.

    Science.gov (United States)

    Oliveri, Ivan Pietro; Di Bella, Santo

    2017-09-12

    Anions are ubiquitous species playing a primary role in chemistry, whose reactivity is essentially dominated by their Lewis basicity. However, no Lewis basicity data, in terms of Gibbs energy, are reported in the literature. We report here the first Lewis basicity of relevant monoanions, obtained through the determination of binding constants for the formation of stable 1:1 adducts, using a Zn(II) Schiff-base complex, 1, as a reference Lewis acid. Binding constants for equilibrium reactions were obtained through nonlinear regression analysis of the binding isotherms from spectrophotometric titration data. The Lewis acidic complex 1 is a proper reference species because it forms stable adducts with both neutral and charged Lewis bases, thus allowing their Lewis basicity to be ranked. Binding constants indicate a generally strong Lewis basicity for all the anions involved, rivalling or exceeding that of the stronger neutral bases, such as primary amines or pyridine. The cyanide anion proves to be the strongest Lewis base, while nitrate is the weakest base within the present anion series. Moreover, even the weaker base anions behave as stronger bases than the most common non-protogenic coordinating solvents.
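
    Binding constants of this kind are typically extracted by nonlinear regression of a 1:1 binding isotherm against titration data. The sketch below uses the exact (quadratic) 1:1 isotherm and SciPy's curve_fit on synthetic absorbances; the host concentration, the K value and the absorptivities are invented for illustration and do not come from the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

H0 = 2.0e-5  # total concentration of the host complex, mol/L (assumed)

def isotherm(G0, K, a_free, a_bound):
    """Absorbance for 1:1 host-guest binding; exact quadratic solution."""
    s = H0 + G0 + 1.0 / K
    HG = 0.5 * (s - np.sqrt(s * s - 4.0 * H0 * G0))   # adduct concentration
    return a_free * (H0 - HG) / H0 + a_bound * HG / H0

# Synthetic titration with an anion (illustrative numbers only)
G0 = np.linspace(0, 2.0e-4, 20)
obs = isotherm(G0, 5.0e5, 0.30, 0.85)
obs += 0.002 * np.random.default_rng(7).standard_normal(G0.size)

(K, a0, ab), _ = curve_fit(isotherm, G0, obs, p0=[1e5, 0.3, 0.8])
print(f"log K = {np.log10(K):.2f}")
```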

  5. Application of proteomics in the study of rodent models of cancer

    DEFF Research Database (Denmark)

    Terp, Mikkel Green; Ditzel, Henrik J

    2014-01-01

    The molecular and cellular mechanisms underlying the multistage processes of cancer progression and metastasis are complex and strictly depend on the interplay between tumor cells and surrounding tissues. Identification of protein aberrations in cancer pathophysiology requires a physiologically relevant experimental model. The mouse offers such a model to identify protein changes associated with tumor initiation and progression, metastasis development, tumor/microenvironment interplay, and treatment responses, and it offers the ability to collect samples at any stage and to monitor cancer progression and treatment response. Central to such studies is the ability to ensure at an early stage that the identified proteins are of clinical relevance by examining relevant specimens from larger cohorts of cancer patients.

  6. More Realistic Face Model Surface Improves Relevance of Pediatric In-Vitro Aerosol Studies.

    Science.gov (United States)

    Amirav, Israel; Halamish, Asaf; Gorenberg, Miguel; Omar, Hamza; Newhouse, Michael T

    2015-01-01

    Various hard face models are commonly used to evaluate the efficiency of aerosol face masks. Softer, more realistic "face" surface materials, like skin, deform upon mask application and should provide more relevant in-vitro tests. Studies that simultaneously take into consideration many of the factors characteristic of the in vivo face are lacking. These include airways, various application forces, comparison of various devices, comparison with a hard-surface model, and use of a representative model face based on large numbers of actual faces. The aim was to compare mask-to-"face" seal and aerosol delivery of two pediatric masks using a soft vs. a hard, appropriately representative, pediatric face model under various applied forces. Two identical face models and upper airway replicas were constructed, the only difference being the suppleness and compressibility of the surface layer of the "face." Integrity of the seal and aerosol delivery of two different masks [AeroChamber (AC) and SootherMask (SM)] were compared using a breath simulator, filter collection, and realistic applied forces. The soft "face" significantly increased the delivery efficiency and the sealing characteristics of both masks. Aerosol delivery with the soft "face" was significantly greater for the SM than for the AC (p < 0.05); no significant difference between the masks was observed with the hard "face." The material and pliability of the model "face" surface have a significant influence on both the seal and the delivery efficiency of face masks. This finding should be taken into account during in-vitro aerosol studies.

  7. Logic-based hierarchies for modeling behavior of complex dynamic systems with applications

    International Nuclear Information System (INIS)

    Hu, Y.S.; Modarres, M.

    2000-01-01

    Most complex systems are best represented in the form of a hierarchy. The Goal Tree Success Tree and Master Logic Diagram (GTST-MLD) are proven, powerful hierarchic methods to represent a complex snapshot of plant knowledge. To represent the dynamic behavior of complex systems, fuzzy logic is applied in place of binary logic to extend the power of GTST-MLD. Such a fuzzy-logic-based hierarchy is called a Dynamic Master Logic Diagram (DMLD). This chapter discusses the use of GTST-DMLD as a modeling tool for systems whose relationships are physical, binary logical or fuzzy logical. This is shown by applying GTST-DMLD to the Direct Containment Heating (DCH) phenomenon at pressurized water reactors, an important safety issue being addressed by the nuclear industry. (orig.)
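
    The core move, replacing crisp AND/OR gates with fuzzy ones, is easy to show in a few lines. The sketch below evaluates a toy goal tree with min/max fuzzy operators; the tree structure and the degrees assigned to the leaves are invented for illustration and are unrelated to the DCH application.

```python
# Minimal fuzzy goal-tree evaluation in the spirit of GTST-DMLD:
# node truth values lie in [0, 1], AND maps to min, OR maps to max.
def fuzzy_and(*degrees):
    return min(degrees)

def fuzzy_or(*degrees):
    return max(degrees)

# Leaf "support" degrees, e.g. partial availability of subsystems (assumed)
pump_a, pump_b, power, heat_sink = 0.9, 0.4, 0.95, 0.7

coolant_flow = fuzzy_or(pump_a, pump_b)              # redundant trains
goal_heat_removal = fuzzy_and(coolant_flow, power, heat_sink)
print("degree of goal achievement:", goal_heat_removal)
```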

  8. Comparison of new generation low-complexity flood inundation mapping tools with a hydrodynamic model

    Science.gov (United States)

    Afshari, Shahab; Tavakoly, Ahmad A.; Rajib, Mohammad Adnan; Zheng, Xing; Follum, Michael L.; Omranian, Ehsan; Fekete, Balázs M.

    2018-01-01

    The objective of this study is to compare two new generation low-complexity tools, AutoRoute and Height Above the Nearest Drainage (HAND), with a two-dimensional hydrodynamic model (Hydrologic Engineering Center-River Analysis System, HEC-RAS 2D). The assessment was conducted on two hydrologically different and geographically distant test cases in the United States: the 16,900 km2 Cedar River (CR) watershed in Iowa and a 62 km2 domain along the Black Warrior River (BWR) in Alabama. For BWR, twelve different configurations were set up for each of the models, including four different terrain setups (e.g. with and without channel bathymetry and a levee) and three flooding conditions representing moderate to extreme hazards at 10-, 100-, and 500-year return periods. For the CR watershed, models were compared with a simplistic terrain setup (without bathymetry and any form of hydraulic controls) and one flooding condition (100-year return period). Input streamflow forcing data representing these hypothetical events were constructed by applying a new fusion approach to National Water Model outputs. Simulated inundation extent and depth from AutoRoute, HAND, and HEC-RAS 2D were compared with one another and with the corresponding FEMA reference estimates. Irrespective of the configurations, the low-complexity models were able to produce inundation extents similar to HEC-RAS 2D, with AutoRoute showing slightly higher accuracy than the HAND model. Among the four terrain setups, the one including both levee and channel bathymetry showed the lowest fitness score for spatial agreement of inundation extent, due to the weaker physical representation of low-complexity models compared to a hydrodynamic model. For inundation depth, the low-complexity models showed an overestimating tendency, especially in the deeper segments of the channel. Based on such reasonably good prediction skills, low-complexity flood models can be considered as a suitable alternative for fast
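
    Spatial agreement between a modeled and a reference inundation extent is commonly scored with a fitness index of the form |A intersect B| / |A union B|. A minimal sketch with toy binary rasters (the arrays below are invented, not study data):

```python
import numpy as np

def fitness_score(model, reference):
    """Spatial agreement of two binary inundation rasters."""
    a, b = model.astype(bool), reference.astype(bool)
    return (a & b).sum() / (a | b).sum()

# Toy 1D "rasters" standing in for a low-complexity vs. HEC-RAS 2D extent
hecras = np.array([0, 1, 1, 1, 1, 0, 0, 1, 1, 0])
lowcx  = np.array([0, 1, 1, 1, 0, 0, 0, 1, 1, 1])
print(f"fitness = {fitness_score(lowcx, hecras):.2f}")
```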

  9. Health behavior change models and their socio-cultural relevance for breast cancer screening in African American women.

    Science.gov (United States)

    Ashing-Giwa, K

    1999-01-01

    Models of health behavior provide the conceptual bases for most of the breast cancer screening intervention studies. These models were not designed for and have not been adequately tested with African American women. The models discussed in this paper are: The Health Belief Model, the Theory of Reasoned Action/Theory of Planned Behavior, and the Transtheoretical Model. This paper will examine the socio-cultural relevance of these health behavior models, and discuss specific socio-cultural dimensions that are not accounted for by these paradigms. It is critical that researchers include socio-cultural dimensions, such as interconnectedness, health socialization, ecological factors and health care system factors into their intervention models with African American women. Comprehensive and socio-culturally based investigations are necessary to guide the scientific and policy challenge for reducing breast cancer mortality in African American women.

  10. A model of negotiation scenarios based on time, relevance and control used to define advantageous positions in a negotiation

    Directory of Open Access Journals (Sweden)

    Omar Guillermo Rojas Altamirano

    2016-04-01

    Models that apply to negotiation are based on different perspectives, ranging from the relationship between the actors, to game theory, to the steps in a procedure. This research proposes a model of negotiation scenarios that considers three factors (time, relevance and control), which emerge as the most important in a negotiation. These factors interact with each other and create different scenarios for each of the actors involved in a negotiation. The proposed model facilitates not only the creation of a negotiation strategy but also an ideal choice of effective tactics.

  11. Chaos from simple models to complex systems

    CERN Document Server

    Cencini, Massimo; Vulpiani, Angelo

    2010-01-01

    Chaos: from simple models to complex systems aims to guide science and engineering students through chaos and nonlinear dynamics from classical examples to the most recent fields of research. The first part, intended for undergraduate and graduate students, is a gentle and self-contained introduction to the concepts and main tools for the characterization of deterministic chaotic systems, with emphasis on statistical approaches. The second part can be used as a reference by researchers as it focuses on more advanced topics including the characterization of chaos with tools of information theory

  12. Sex and gonadal hormones in mouse models of Alzheimer’s disease: what is relevant to the human condition?

    Directory of Open Access Journals (Sweden)

    Dubal Dena B

    2012-11-01

    Biologic sex and gonadal hormones matter in human aging and diseases of aging such as Alzheimer’s – and the importance of studying their influences relates directly to human health. The goal of this article is to review the literature to date on sex and hormones in mouse models of Alzheimer’s disease (AD) with an exclusive focus on interpreting the relevance of findings to the human condition. To this end, we highlight advances in AD and in sex and hormone biology, discuss what these advances mean for merging the two fields, review the current mouse model literature, raise major unresolved questions, and offer a research framework that incorporates human reproductive aging for future studies aimed at translational discoveries in this important area. Unraveling human relevant pathways in sex and hormone-based biology may ultimately pave the way to novel and urgently needed treatments for AD and other neurodegenerative diseases.

  13. COMPLEX SIMULATION MODEL OF TRAIN BREAKING-UP PROCESS AT THE HUMPS

    Directory of Open Access Journals (Sweden)

    E. B. Demchenko

    2015-11-01

    Purpose. One of the priorities in improving the functioning of a station sorting complex is reducing the energy consumed in the breaking-up process, namely the fuel consumed in train pushing and the electric energy consumed in cut braking. An effective reduction of energy consumption in the breaking-up subsystem therefore requires comprehensive handling of the train pushing and cut rolling-down processes. The analysis showed, however, that the tasks of improving the pushing process and increasing the effectiveness of cut rolling-down are currently solved separately. To solve this problem it is necessary to develop a complex simulation model of the train breaking-up process at humps. Methodology. The pushing process was simulated using traction calculations adapted to shunting conditions. In addition, the features of shunting locomotive operation at the humps were taken into account. To realize the specified pushing mode, a special hump locomotive control algorithm was applied which, along with the requirements of safe shunting operation, takes into account behavioral factors associated with the engineer's control actions. This algorithm provides smooth train acceleration and further movement at a speed close to the set speed. Hump locomotive fuel consumption was determined from the mechanical work performed by locomotive traction. Findings. The simulation model of the train pushing process was developed and combined with the existing cut rolling-down model. The cut's initial velocity is determined during the simulation and used for further modeling of the rolling process. In addition, the modeling resulted in a sufficiently accurate determination of the fuel consumed in train breaking-up. Originality. The simulation model of the train breaking-up process at humps, in contrast to existing models, allows all the elements of this process to be reproduced in detail
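
    The flavor of such a pushing simulation can be captured in a few lines: integrate the train's motion under a simple "engineer" control law and account fuel as proportional to the mechanical work of traction. All numbers below (masses, forces, the specific fuel rate) are invented placeholders, not values from the study.

```python
# Minimal sketch: push a train toward a set hump speed and charge fuel
# in proportion to the mechanical work of the traction force.
mass = 3.5e6          # train + locomotive, kg (assumed)
v_set = 1.4           # prescribed pushing speed, m/s (assumed)
f_max = 4.0e5         # maximum traction force, N (assumed)
resist = 1.2e4        # constant motion resistance, N (assumed)
sfc = 0.21 / 3.6e6    # kg of fuel per joule of traction work (assumed)

v, t, dt, work = 0.0, 0.0, 0.1, 0.0
while t < 120.0:
    # crude control law: full traction below set speed, then hold speed
    f = f_max if v < v_set else resist
    a = (f - resist) / mass
    v = max(0.0, v + a * dt)
    work += f * v * dt          # traction power integrated over time
    t += dt

print(f"final speed {v:.2f} m/s, fuel used {work * sfc:.2f} kg")
```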

  14. Multiagent model and mean field theory of complex auction dynamics

    Science.gov (United States)

    Chen, Qinghua; Huang, Zi-Gang; Wang, Yougui; Lai, Ying-Cheng

    2015-09-01

    Recent years have witnessed a growing interest in analyzing a variety of socio-economic phenomena using methods from statistical and nonlinear physics. We study a class of complex systems arising from economics, the lowest unique bid auction (LUBA) systems, a recently emerged class of online auction game systems. Through analyzing large empirical data sets of LUBA, we identify a general feature of the bid price distribution: an inverted J-shaped function with exponential decay in the large bid price region. To account for the distribution, we propose a multi-agent model in which each agent bids stochastically in the field of the winner’s attractiveness, and we develop a theoretical framework to obtain analytic solutions of the model based on mean field analysis. The theory produces bid-price distributions that are in excellent agreement with those from the real data. Our model and theory capture the essential features of human behavior in the competitive environment exemplified by LUBA, and may provide significant quantitative insights into complex socio-economic phenomena.
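
    A toy agent-based LUBA simulation makes the mechanism concrete: agents draw integer bids from a decaying propensity (standing in for the attractiveness field), and the lowest bid placed exactly once wins. The population size, bid range and decay constant below are arbitrary choices for illustration, not fitted quantities.

```python
import numpy as np

rng = np.random.default_rng(42)
n_agents, max_bid, rounds = 200, 100, 2000

prices = np.arange(1, max_bid + 1)
propensity = np.exp(-prices / 25.0)      # assumed decaying bid propensity
propensity /= propensity.sum()

wins = np.zeros(max_bid + 1, dtype=int)
for _ in range(rounds):
    bids = rng.choice(prices, size=n_agents, p=propensity)
    counts = np.bincount(bids, minlength=max_bid + 1)
    unique = np.flatnonzero(counts == 1)  # bids placed exactly once
    if unique.size:
        wins[unique.min()] += 1           # lowest unique bid wins

print("most frequent winning bid:", wins.argmax())
```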

  15. Multiagent model and mean field theory of complex auction dynamics

    International Nuclear Information System (INIS)

    Chen, Qinghua; Wang, Yougui; Huang, Zi-Gang; Lai, Ying-Cheng

    2015-01-01

    Recent years have witnessed a growing interest in analyzing a variety of socio-economic phenomena using methods from statistical and nonlinear physics. We study a class of complex systems arising from economics, the lowest unique bid auction (LUBA) systems, a recently emerged class of online auction game systems. Through analyzing large empirical data sets of LUBA, we identify a general feature of the bid price distribution: an inverted J-shaped function with exponential decay in the large bid price region. To account for the distribution, we propose a multi-agent model in which each agent bids stochastically in the field of the winner’s attractiveness, and we develop a theoretical framework to obtain analytic solutions of the model based on mean field analysis. The theory produces bid-price distributions that are in excellent agreement with those from the real data. Our model and theory capture the essential features of human behavior in the competitive environment exemplified by LUBA, and may provide significant quantitative insights into complex socio-economic phenomena. (paper)

  16. Spectroscopy of plutonium-organic complexes

    International Nuclear Information System (INIS)

    Richmann, M.K.; Reed, D.T.

    1995-01-01

    Information on the spectroscopy of plutonium-organic complexes is needed to help establish the speciation of these complexes under environmentally relevant conditions. Laser photoacoustic spectroscopy (LPAS) and absorption spectrometry were used to characterize the Pu(IV)-citrate and Pu(IV)-nitrilotriacetic acid (NTA) complexes at concentrations of 10^-3 to 10^-7 M in aqueous solution. Good agreement was observed between the band shapes of the LPAS and absorption spectra for the Pu(IV)-NTA complex. Agreement for the Pu(IV)-citrate complex was not quite as good. In both cases, a linear dependence of the LPAS signal on laser power and total concentration of the complexes was noted. This work is part of an ongoing research effort to study key subsurface interactions of plutonium-organic complexes.

  17. Comparisons of complex network based models and real train flow model to analyze Chinese railway vulnerability

    International Nuclear Information System (INIS)

    Ouyang, Min; Zhao, Lijing; Hong, Liu; Pan, Zhezhe

    2014-01-01

    Recently numerous studies have applied complex network based models to study the performance and vulnerability of infrastructure systems under various types of attacks and hazards. But how effectively these models capture real performance response is still a question worthy of research. Taking the Chinese railway system as an example, this paper selects three typical complex network based models, including the purely topological model (PTM), the purely shortest path model (PSPM), and the weight (link length) based shortest path model (WBSPM), to analyze railway accessibility and flow-based vulnerability, and compares their results with those from the real train flow model (RTFM). The results show that the WBSPM can produce train routes with 83% of stations and 77% of railway links identical to the real routes, and that it approaches the RTFM best for railway vulnerability under both single and multiple component failures. The correlation coefficient for accessibility vulnerability from WBSPM and RTFM under single station failures is 0.96, while it is 0.92 for flow-based vulnerability; under multiple station failures, where each station has the same failure probability fp, the WBSPM produces almost identical vulnerability results to those from the RTFM under almost all failure scenarios when fp is larger than 0.62 for accessibility vulnerability and 0.86 for flow-based vulnerability
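
    A weight-based shortest path model reduces, in essence, to Dijkstra routing over a length-weighted graph plus an accessibility measure recomputed after component removals. The sketch below does this with networkx on a toy graph; the stations, link lengths and the accessibility definition (sum of inverse shortest-path lengths) are illustrative assumptions, not the paper's data.

```python
import networkx as nx

# Tiny illustrative rail graph; edge weights are link lengths in km
G = nx.Graph()
G.add_weighted_edges_from([
    ("A", "B", 120), ("B", "C", 90), ("C", "D", 150),
    ("A", "E", 200), ("E", "D", 80), ("B", "E", 60),
])

def accessibility(g):
    """Sum of inverse shortest-path lengths over all station pairs."""
    total = 0.0
    for src, lengths in nx.all_pairs_dijkstra_path_length(g, weight="weight"):
        total += sum(1.0 / d for dst, d in lengths.items() if dst != src)
    return total

base = accessibility(G)
H = G.copy()
H.remove_node("B")                      # single-station failure
print("accessibility vulnerability:", 1 - accessibility(H) / base)
```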

  18. THE MODEL OF LIFELONG EDUCATION IN A TECHNICAL UNIVERSITY AS A MULTILEVEL EDUCATIONAL COMPLEX

    Directory of Open Access Journals (Sweden)

    Svetlana V. Sergeyeva

    2016-06-01

    Introduction: the current leading trend of educational development is its continuity. Institutions of higher education, as multi-level educational complexes, nurture favourable conditions for realisation of the strategy of lifelong education. Today a technical university offering training of future engineers faces the topical issue of creating a multilevel educational complex. Materials and Methods: this paper is based on modern Russian and foreign scientific literature on lifelong education. The authors used theoretical methods of scientific research: system-structural analysis, synthesis, modeling, and the analysis and generalisation of concepts. Results: the paper presents a model of lifelong education developed by the authors for a technical university as a multilevel educational complex. It is realised through a set of principles: multi-level structure and continuity, integration, conformity and quality, mobility, anticipation, openness, social partnership and feedback. In accordance with the purpose, objectives and principles, the content part of the model is formed. The syllabi following the described model are run in accordance with the training levels undertaken by a technical university as a multilevel educational complex. All syllabi are based on the gradual nature of their implementation. In this regard, the authors highlight three phases: diagnostic; constructive and transformative; and assessing. Discussion and Conclusions: the expected result of the created model of lifelong education development in a technical university as a multilevel educational complex is a graduate trained for effective professional activity, competitive, and prepared for and sought-after in the regional labour market.

  19. Decomposition studies of group 6 hexacarbonyl complexes. Pt. 2. Modelling of the decomposition process

    Energy Technology Data Exchange (ETDEWEB)

    Usoltsev, Ilya; Eichler, Robert; Tuerler, Andreas [Paul Scherrer Institut (PSI), Villigen (Switzerland); Bern Univ. (Switzerland)

    2016-11-01

    The decomposition behavior of group 6 metal hexacarbonyl complexes (M(CO)6) in a tubular flow reactor is simulated. A microscopic Monte-Carlo based model is presented for assessing the first bond dissociation enthalpy of M(CO)6 complexes. The suggested approach superimposes a microscopic model of gas adsorption chromatography with a first-order heterogeneous decomposition model. The experimental data on the decomposition of Mo(CO)6 and W(CO)6 are successfully simulated by introducing available thermodynamic data. Thermodynamic data predicted by relativistic density functional theory are used in our model to deduce the most probable experimental behavior of the corresponding Sg carbonyl complex. Thus, the design of a chemical experiment with Sg(CO)6 is suggested, which is sensitive enough to benchmark our theoretical understanding of the bond stability in carbonyl compounds of the heaviest elements.
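
    The superposition of adsorption chromatography and first-order decomposition lends itself to a compact Monte-Carlo sketch: each molecule makes many adsorption stops whose residence times are exponentially distributed around tau = tau0 * exp(-dH_ads / RT), and may decompose during each stop. Every parameter value below (temperature, enthalpy, rate constant, number of stops) is an invented placeholder, not a fitted quantity from the paper.

```python
import numpy as np

R, T = 8.314, 373.0        # gas constant; column temperature, K (assumed)
dH_ads = -45e3             # adsorption enthalpy, J/mol (assumed)
tau0 = 1e-13               # period of surface vibration, s (typical value)
k_dec = 5.0                # first-order surface decomposition rate, 1/s
n_stops = 1000             # adsorption events per column passage (assumed)
trials = 2000

rng = np.random.default_rng(3)
tau_mean = tau0 * np.exp(-dH_ads / (R * T))      # mean residence time per stop
t_ads = rng.exponential(tau_mean, size=(trials, n_stops))
p_keep = np.exp(-k_dec * t_ads)                  # survive one adsorption stop
survived = (rng.random((trials, n_stops)) < p_keep).all(axis=1)
print("transmission through the column:", survived.mean())
```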

  20. A conscious mouse model of gastric ileus using clinically relevant endpoints

    Directory of Open Access Journals (Sweden)

    Shao Yuanlin

    2005-06-01

    Background: Gastric ileus is an unsolved clinical problem and current treatment is limited to supportive measures. Models of ileus using anesthetized animals, muscle strips or isolated smooth muscle cells do not adequately reproduce the clinical situation. Thus, previous studies using these techniques have not led to a clear understanding of the pathophysiology of ileus. The feasibility of using food intake and fecal output as simple, clinically relevant endpoints for monitoring ileus in a conscious mouse model was evaluated by assessing the severity and time course of various insults known to cause ileus. Methods: Delayed food intake and fecal output associated with ileus were monitored after intraperitoneal injection of endotoxin, laparotomy with bowel manipulation, thermal injury, or cerulein-induced acute pancreatitis. The correlation of decreased food intake after endotoxin injection with gastric ileus was validated by measuring gastric emptying. The effect of endotoxin on general activity level and feeding behavior was also determined. Small bowel transit was measured using a phenol red marker. Results: Each insult resulted in a transient and comparable decrease in food intake and fecal output consistent with the clinical picture of ileus. The endpoints were highly sensitive to small changes in low doses of endotoxin, the extent of bowel manipulation, and cerulein dose. The delay in food intake correlated directly with delayed gastric emptying. Changes in general activity and feeding behavior were insufficient to explain decreased food intake. Intestinal transit remained unchanged at the times measured. Conclusion: Food intake and fecal output are sensitive markers of gastric dysfunction in four experimental models of ileus. In the mouse, delayed gastric emptying appears to be the major cause of the anorexic effect associated with ileus. Gastric dysfunction is more important than small bowel dysfunction in this model. Recovery of

  1. Surface-Sensitive and Bulk Studies on the Complexation and Photosensitized Degradation of Catechol by Iron(III) as a Model for Multicomponent Aerosol Systems

    Science.gov (United States)

    Al-abadleh, H. A.; Tofan-Lazar, J.; Situm, A.; Ruffolo, J.; Slikboer, S.

    2013-12-01

    Surface water plays a crucial role in facilitating or inhibiting surface reactions in atmospheric aerosols. Little is known about the role of surface water in the complexation of organic molecules to transition metals in multicomponent aerosol systems. We will show results from real-time diffuse reflectance infrared Fourier transform spectroscopy (DRIFTS) experiments for the in situ complexation of catechol to Fe(III) and its photosensitized degradation under dry and humid conditions. Catechol was chosen as a simple model for humic-like substances (HULIS) in aerosols and aged polyaromatic hydrocarbons (PAH). It has also been detected in secondary organic aerosols (SOA) formed from the reaction of hydroxyl radicals with benzene. Given the importance of the iron content in aerosols and its biogeochemistry, our studies were conducted using FeCl3. For comparison, these surface-sensitive studies were complemented with bulk aqueous ATR-FTIR, UV-vis, and HPLC measurements for structural, quantitative and qualitative information about complexes in the bulk, and potential degradation products. The implications of our studies for understanding interfacial and condensed-phase chemistry relevant to multicomponent aerosols, thin water islands on buildings, and ocean surfaces containing transition metals will be discussed.

  2. The development and application of composite complexity models and a relative complexity metric in a software maintenance environment

    Science.gov (United States)

    Hops, J. M.; Sherif, J. S.

    1994-01-01

    A great deal of effort is now being devoted to the study, analysis, prediction, and minimization of expected software maintenance cost, long before software is delivered to users or customers. It has been estimated that, on average, the effort spent on software maintenance is as costly as the effort spent on all other software costs. Software design methods should be the starting point for alleviating the problems of software maintenance complexity and high costs. Two aspects of maintenance deserve attention: (1) protocols for locating and rectifying defects, and for ensuring that no new defects are introduced in the development phase of the software process; and (2) protocols for modification, enhancement, and upgrading. This article focuses primarily on the second aspect, the development of protocols to help increase the quality and reduce the costs associated with modifications, enhancements, and upgrades of existing software. This study developed parsimonious models and a relative complexity metric for complexity measurement of software, which were used to rank the modules in the system relative to one another. Some success was achieved in using the models and the relative metric to identify maintenance-prone modules.
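
    A relative complexity metric of this kind can be as simple as standardizing several raw metrics per module and summing them, so that modules are ranked against one another rather than on an absolute scale. The module names, metrics and equal weights below are invented for illustration; the paper's actual metric composition may differ.

```python
import numpy as np

modules = ["io.c", "parse.c", "sched.c", "util.c"]
raw = np.array([
    # LOC, cyclomatic complexity, fan-out (hypothetical measurements)
    [1200, 35, 12],
    [2400, 80, 20],
    [800,  60,  7],
    [300,  10,  3],
], dtype=float)

z = (raw - raw.mean(axis=0)) / raw.std(axis=0)   # standardize each metric
relative = z.sum(axis=1)                         # equal-weight composite

for name, score in sorted(zip(modules, relative), key=lambda x: -x[1]):
    print(f"{name:8s} relative complexity {score:+.2f}")
```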

  3. A binary logistic regression model with complex sampling design of ...

    African Journals Online (AJOL)

    2017-09-03

    Bi-variable and multi-variable binary logistic regression models with complex sampling design were fitted. Data were entered into STATA-12 and analyzed using SPSS-21.

  4. Extension of association models to complex chemicals

    DEFF Research Database (Denmark)

    Avlund, Ane Søgaard

    Summary of “Extension of association models to complex chemicals”, Ph.D. thesis by Ane Søgaard Avlund. The subject of this thesis is the application of SAFT-type equations of state (EoS). Accurate and predictive thermodynamic models are important in many industries, including the petroleum industry. Two association models are considered: CPA and sPC-SAFT. Phase equilibrium and monomer fraction calculations with sPC-SAFT for methanol are used in the thesis to illustrate the importance of parameter estimation when using SAFT: different parameter sets give similar pure-component vapor pressure and liquid density results, whereas very different monomer fraction results are obtained. A theory for intramolecular association is presented in the thesis and compared to the corresponding lattice theory. The theory for intramolecular association is then applied in connection with sPC-SAFT for mixtures containing glycol ethers. Calculations with sPC-SAFT (without intramolecular association) are presented for comparison
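
    The monomer fraction mentioned here comes from the association mass balance; for a symmetric two-site (2B-type) scheme it reduces to X = 1/(1 + rho*Delta*X), which has a closed-form positive root. A minimal sketch with invented rho and Delta values (not fitted methanol parameters):

```python
import math

def monomer_fraction(rho, delta):
    """Fraction of association sites not bonded, from the mass-action
    balance X = 1/(1 + rho*delta*X) for a symmetric 2B-type scheme."""
    return (-1.0 + math.sqrt(1.0 + 4.0 * rho * delta)) / (2.0 * rho * delta)

rho = 24000.0      # molar density, mol/m^3 (assumed)
delta = 2.5e-4     # association strength, m^3/mol (assumed)
print(f"monomer fraction X = {monomer_fraction(rho, delta):.3f}")
```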

  5. A dynamic simulation model of the Savannah River Site high level waste complex

    International Nuclear Information System (INIS)

    Gregory, M.V.; Aull, J.E.; Dimenna, R.A.

    1994-01-01

    A detailed, dynamic simulation of the entire high level radioactive waste complex at the Savannah River Site has been developed using SPEEDUP(tm) software. The model represents the mass transfer, evaporation, precipitation, sludge washing, effluent treatment, and vitrification unit operation processes through the solution of 7800 coupled differential and algebraic equations. Twenty-seven discrete chemical constituents are tracked through the unit operations. The simultaneous simulation of concurrent batch and continuous processes is achieved by several novel, customized SPEEDUP(tm) algorithms. Due to the model's computational burden, a high-end workstation is required: simulation of a year's operation of the complex requires approximately three CPU hours on an IBM RS/6000 Model 590 processor. The model will be used to develop optimal high level waste (HLW) processing strategies over a thirty-year time horizon. It will be employed to better understand the dynamic inter-relationships between different HLW unit operations, and to suggest strategies that maximize available working tank space during the early years of operation and minimize overall waste processing cost over the long-term history of the complex. Model validation runs are currently underway, with comparisons against actual plant operating data providing an excellent match
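
    Flowsheet models of this kind are, at bottom, large systems of coupled differential and algebraic balance equations. The sketch below integrates the mass balance of a single, highly simplified evaporator stage with SciPy; the flow rates, concentrations and flowsheet topology are invented stand-ins, not Savannah River data.

```python
from scipy.integrate import solve_ivp

# One toy unit: an evaporator fed at q_in with solute concentration c_in,
# boiling off water at q_evap and discharging liquid at q_out.
q_in, c_in, q_evap, q_out = 2.0, 50.0, 1.5, 0.5   # m3/h, kg/m3, m3/h, m3/h

def rhs(t, y):
    V, m = y                            # liquid volume, dissolved mass
    c = m / V
    dV = q_in - q_evap - q_out          # water leaves by evaporation too
    dm = q_in * c_in - q_out * c        # solute leaves only with liquid
    return [dV, dm]

sol = solve_ivp(rhs, (0.0, 100.0), [400.0, 8000.0])
V_end, m_end = sol.y[:, -1]
print(f"final concentration {m_end / V_end:.1f} kg/m3")
```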

  6. Rethinking the Psychogenic Model of Complex Regional Pain Syndrome: Somatoform Disorders and Complex Regional Pain Syndrome

    Science.gov (United States)

    Hill, Renee J.; Chopra, Pradeep; Richardi, Toni

    2012-01-01

    Explaining the etiology of Complex Regional Pain Syndrome (CRPS) from the psychogenic model is exceedingly unsophisticated, because neurocognitive deficits, neuroanatomical abnormalities, and distortions in cognitive mapping are features of CRPS pathology. More importantly, many people who have developed CRPS have no history of mental illness. The psychogenic model offers comfort to physicians and mental health practitioners (MHPs) who have difficulty understanding pain maintained by newly uncovered neuroinflammatory processes. With increased education about CRPS through a biopsychosocial perspective, both physicians and MHPs can better diagnose, treat, and manage CRPS symptomatology. PMID:24223338

  7. Relevance of the c-statistic when evaluating risk-adjustment models in surgery.

    Science.gov (United States)

    Merkow, Ryan P; Hall, Bruce L; Cohen, Mark E; Dimick, Justin B; Wang, Edward; Chow, Warren B; Ko, Clifford Y; Bilimoria, Karl Y

    2012-05-01

    The measurement of hospital quality based on outcomes requires risk adjustment. The c-statistic is a popular tool used to judge model performance, but can be limited, particularly when evaluating specific operations in focused populations. Our objectives were to examine the interpretation and relevance of the c-statistic when used in models with increasingly similar case mix and to consider an alternative perspective on model calibration based on a graphical depiction of model fit. From the American College of Surgeons National Surgical Quality Improvement Program (2008-2009), patients were identified who underwent a general surgery procedure, and procedure groups were increasingly restricted: colorectal-all, colorectal-elective cases only, and colorectal-elective cancer cases only. Mortality and serious morbidity outcomes were evaluated using logistic regression-based risk adjustment, and model c-statistics and calibration curves were used to compare model performance. During the study period, 323,427 general, 47,605 colorectal-all, 39,860 colorectal-elective, and 21,680 colorectal cancer patients were studied. Mortality ranged from 1.0% in general surgery to 4.1% in the colorectal-all group, and serious morbidity ranged from 3.9% in general surgery to 12.4% in the colorectal-all procedural group. As case mix was restricted, c-statistics progressively declined from the general to the colorectal cancer surgery cohorts for both mortality and serious morbidity (mortality: 0.949 to 0.866; serious morbidity: 0.861 to 0.668). Calibration was evaluated graphically by examining predicted vs observed number of events over risk deciles. For both mortality and serious morbidity, there was no qualitative difference in calibration identified between the procedure groups. In the present study, we demonstrate how the c-statistic can become less informative and, in certain circumstances, can lead to incorrect model-based conclusions, as case mix is restricted and patients become
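
    For readers who want to reproduce the two diagnostics discussed: the c-statistic is the ROC AUC of the risk model, and calibration can be inspected as observed versus predicted events per risk decile. The sketch below does both on simulated data with scikit-learn; the covariates, coefficients and event rate are invented, not NSQIP data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 4))                       # stand-in risk factors
logit = -3.0 + X @ np.array([0.8, 0.5, 0.3, 0.1])
y = rng.random(5000) < 1 / (1 + np.exp(-logit))      # simulated mortality

model = LogisticRegression().fit(X, y)
p = model.predict_proba(X)[:, 1]
print(f"c-statistic = {roc_auc_score(y, p):.3f}")

# Calibration: observed vs. predicted event counts per risk decile
deciles = np.quantile(p, np.linspace(0, 1, 11))
for lo, hi in zip(deciles[:-1], deciles[1:]):
    sel = (p >= lo) & (p <= hi)
    print(f"predicted {p[sel].sum():6.1f}  observed {int(y[sel].sum()):4d}")
```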

  8. The evolution model of Uppsala in light of the complex adaptive systems approach

    Directory of Open Access Journals (Sweden)

    Rennaly Alves da Silva

    2013-11-01

    The behavioral approach to the internationalization of companies explains that movements toward external markets occur in accordance with an increasing commitment of resources, undertaken to mitigate the effects of uncertainty and reduce the perception of risk. Evidence indicates that theories and practices developed for the domestic market may not be able to explain the reality of companies that operate in international markets. Thus, the Paradigm of Complexity presents itself as a comprehensive alternative for understanding the relationships within organizations and markets. Accordingly, the aim of this theoretical paper is to analyze the evolution of the Uppsala Model between 1975 and 2010, understanding companies in the process of internationalization as Complex Adaptive Systems, in accordance with the model of Kelly and Allison (1998). Four propositions are presented that show the links between the approaches. The most striking is the perception that the conceptual evolution of the Uppsala Model seems to accompany the evolution of complexity levels presented in the Kelly and Allison model.

  9. Computational model of dose response for low-LET-induced complex chromosomal aberrations

    International Nuclear Information System (INIS)

    Eidelman, Y.A.; Andreev, S.G.

    2015-01-01

    Experiments with full-colour mFISH chromosome painting have revealed a high yield of radiation-induced complex chromosomal aberrations (CAs). The ratio of complex to simple aberrations depends on cell type and linear energy transfer. Theoretical analysis has demonstrated that the mechanism of CA formation as a result of interaction between lesions at the surfaces of chromosome territories does not explain the high complexes-to-simples ratio in human lymphocytes. The possible origin of high yields of γ-induced complex CAs was investigated in the present work by computer simulation. CAs were studied on the basis of chromosome structure and dynamics modelling and the hypothesis of CA formation at nuclear centres. The spatial organisation of all chromosomes in a human interphase nucleus was predicted by simulation of the mitosis-to-interphase chromosome structure transition. Two scenarios of CA formation were analysed: 'static' centres (existing in a nucleus prior to irradiation) and 'dynamic' centres (formed in response to irradiation). The modelling results reveal that under certain conditions, both scenarios explain quantitatively the dose-response relationships for both simple and complex γ-induced inter-chromosomal exchanges observed by mFISH chromosome painting in the first post-irradiation mitosis in human lymphocytes. (authors)

  10. The solution of a chiral random matrix model with complex eigenvalues

    International Nuclear Information System (INIS)

    Akemann, G

    2003-01-01

    We describe in detail the solution of the extension of the chiral Gaussian unitary ensemble (chGUE) into the complex plane. The correlation functions of the model are first calculated for a finite number N of complex eigenvalues, where we exploit the existence of orthogonal Laguerre polynomials in the complex plane. When taking the large-N limit we derive new correlation functions in the cases of weak and strong non-Hermiticity, thus describing the transition from the chGUE to a generalized Ginibre ensemble. We briefly discuss applications to the Dirac operator eigenvalue spectrum in quantum chromodynamics with non-vanishing chemical potential. This is an extended version of hep-th/0204068

  11. A study of the logical model of capital market complexity theories

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    This paper analyzes the shortcomings of the classic capital market theories based on the EMH and discloses the essential complexity of the capital market. Considering the capital market a complicated, interactive and adaptive dynamic system, and taking complexity science as the method for researching the operating laws of the capital market, the paper constructs a nonlinear logical model to analyze the domain of application, focal point and interrelationship of such theories as dissipative structure theory, chaos theory, fractal theory, synergetics theory, catastrophe theory and scale theory, and summarizes and discusses the achievements and problems of each theory. Based on this research, the paper anticipates the future direction of complexity science in capital market research.

  12. A Simple Model for Complex Fabrication of MEMS based Pressure Sensor: A Challenging Approach

    Directory of Open Access Journals (Sweden)

    Himani SHARMA

    2010-08-01

    In this paper we present a simple model for the complex fabrication of a MEMS-based absolute micro pressure sensor. This kind of modeling is extremely useful for determining the complexity of the fabrication steps and provides complete information about the process sequence to be followed during manufacturing. Therefore, the need for test iterations decreases, and cost and time can be reduced significantly. Using the DevEdit tool (part of the SILVACO suite), a behavioral model of the pressure sensor has been developed and implemented.

  13. Predicting the future completing models of observed complex systems

    CERN Document Server

    Abarbanel, Henry

    2013-01-01

    Predicting the Future: Completing Models of Observed Complex Systems provides a general framework for the discussion of model building and validation across a broad spectrum of disciplines. This is accomplished through the development of an exact path integral for use in transferring information from observations to a model of the observed system. Through many illustrative examples drawn from models in neuroscience, fluid dynamics, geosciences, and nonlinear electrical circuits, the concepts are exemplified in detail. Practical numerical methods for approximate evaluations of the path integral are explored, and their use in designing experiments and determining a model's consistency with observations is investigated. Using highly instructive examples, the problems of data assimilation and the means to treat them are clearly illustrated. This book will be useful for students and practitioners of physics, neuroscience, regulatory networks, meteorology and climate science, network dynamics, fluid dynamics, and o...

  14. Stability of Naturally Relevant Ternary Phases in the Cu–Sn–S system in Contact with an Aqueous Solution

    Directory of Open Access Journals (Sweden)

    Andrea Giaccherini

    2016-07-01

    A considerable research effort is devoted to the synthesis and characterization of phases belonging to the ternary system Cu–Sn–S, mainly for their possible applications in semiconductor technology. Among all ternary phases, kuramite, Cu3SnS4, mohite, Cu2SnS3, and Cu4Sn7S16 have attracted the greatest interest. Numerous studies have been carried out claiming the description of new phases in the ternary compositional field. In this study, we revise the existing literature on this ternary system, with a special focus on the phases stable at 25 °C. The only two ternary phases observed in nature are mohite and kuramite, and their occurrence is described as very rare. Numerical modelling of the stable solid phases in contact with an aqueous solution was undertaken to define the stability relationships of the relevant phases of the system. The numerical modelling of the Eh-pH diagrams was carried out with the phreeqc software and the llnl.dat thermodynamic database. Owing to the complexity of this task, the subsystems Cu–O–H, Sn–O–H, Cu–S–O–H and Sn–S–O–H were considered first. The first Pourbaix diagram for the two naturally relevant ternary phases is then proposed.

  15. The Anti-Inflammatory Effects of Acupuncture and Their Relevance to Allergic Rhinitis: A Narrative Review and Proposed Model

    Directory of Open Access Journals (Sweden)

    John L. McDonald

    2013-01-01

    Classical literature indicates that acupuncture has been used for millennia to treat numerous inflammatory conditions, including allergic rhinitis. Recent research has examined some of the mechanisms underpinning acupuncture's anti-inflammatory effects, which include mediation by sympathetic and parasympathetic pathways. The hypothalamus-pituitary-adrenal (HPA) axis has been reported to mediate the antioedema effects of acupuncture, but not its antihyperalgesic actions during inflammation. Other reported anti-inflammatory effects of acupuncture include an antihistamine action and downregulation of proinflammatory cytokines (such as TNF-α, IL-1β, IL-6, and IL-10), proinflammatory neuropeptides (such as SP, CGRP, and VIP), and neurotrophins (such as NGF and BDNF), which can enhance and prolong the inflammatory response. Acupuncture has been reported to suppress the expression of COX-1, COX-2, and iNOS during experimentally induced inflammation. Downregulation of the expression and sensitivity of the transient receptor potential vanilloid 1 (TRPV1) after acupuncture has been reported. In summary, acupuncture may exert anti-inflammatory effects through a complex neuro-endocrino-immunological network of actions. Many of these generic anti-inflammatory effects of acupuncture are of direct relevance to allergic rhinitis; however, more research is needed to elucidate specifically how immune mechanisms might be modulated by acupuncture in allergic rhinitis, and to this end a proposed model is offered to guide further research.

  16. Deep learning relevance

    DEFF Research Database (Denmark)

    Lioma, Christina; Larsen, Birger; Petersen, Casper

    2016-01-01

    Given a query, we train a Recurrent Neural Network (RNN) on existing information relevant to that query. We then use the RNN to "deep learn" a single, synthetic, and, we assume, relevant document for that query. We design a crowdsourcing experiment to assess how relevant the "deep learned" document is compared to existing relevant documents. Users are shown a query and four wordclouds (of three existing relevant documents and our deep-learned synthetic document). The synthetic document is, on average, ranked the most relevant of all.

  17. Framework for Modelling Multiple Input Complex Aggregations for Interactive Installations

    DEFF Research Database (Denmark)

    Padfield, Nicolas; Andreasen, Troels

    2012-01-01

    The framework is based on fuzzy logic and provides a method for variably balancing interaction and user input with the intention of the artist or director. An experimental design is presented, demonstrating an intuitive interface for parametric modelling of a complex aggregation function. The aggregation function unifies
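
    One common way to make an aggregation function "parametric" in this sense is a power mean, which sweeps continuously from AND-like to OR-like behavior as its exponent changes. This is an assumption about the kind of aggregation meant, not the authors' published function; a minimal sketch (inputs must lie in (0, 1]):

```python
import numpy as np

def power_mean(inputs, p):
    """Parametric aggregation: p -> -inf acts like AND (min),
    p = 1 is the average, p -> +inf acts like OR (max)."""
    x = np.asarray(inputs, dtype=float)
    if abs(p) < 1e-9:
        return float(np.exp(np.log(x).mean()))   # geometric mean at p = 0
    return float(np.mean(x ** p) ** (1.0 / p))

sensors = [0.9, 0.2, 0.4]            # normalized multi-input readings (toy)
for p in (-10, 0, 1, 10):
    print(f"p={p:>4}: aggregate = {power_mean(sensors, p):.2f}")
```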

  18. “Zebrafishing” for Novel Genes Relevant to the Glomerular Filtration Barrier

    Directory of Open Access Journals (Sweden)

    Nils Hanke

    2013-01-01

    Data on genes relevant to glomerular filtration barrier function or proteinuria are continually increasing in an era of microarrays, genome-wide association studies, and quantitative trait locus analysis. Researchers are limited to published literature searches when selecting the most relevant genes to investigate. High-throughput cell cultures and other in vitro systems ultimately need to demonstrate proof in an in vivo model. Generating mammalian models for the genes of interest is costly and time intensive, and yields only a small number of test subjects. These models also have many pitfalls, such as possible embryonic mortality and failure to generate phenotypes or generation of non-kidney-specific phenotypes. Here we describe an in vivo zebrafish model as a simple vertebrate screening system to identify genes relevant to glomerular filtration barrier function. Using our technology, we are able to screen entirely novel genes in 4–6 weeks in hundreds of live test subjects at a fraction of the cost of a mammalian model. Our system produces consistent and reliable evidence for gene relevance in glomerular kidney disease; the results then provide merit for further analysis in mammalian models.

  19. Business model innovation in electricity supply markets: The role of complex value in the United Kingdom

    International Nuclear Information System (INIS)

    Hall, Stephen; Roelich, Katy

    2016-01-01

    This research investigates the new opportunities that business model innovations are creating in electricity supply markets at the sub-national scale. These local supply business models can offer significant benefits to the electricity system, but also generate economic, social, and environmental values that are not well accounted for in current policy or regulation. This paper uses the UK electricity supply market to investigate new business models which rely on more complex value propositions than the incumbent utility model. Nine archetypal local supply business models are identified and their value propositions, value capture methods, and barriers to market entry are analysed. This analysis defines 'complex value' as a key concept in understanding business model innovation in the energy sector. The process of complex value identification poses a challenge to energy researchers, commercial firms and policymakers in liberalised markets: to investigate the opportunities for system efficiency and diverse outcomes that new supplier business models can offer to the electricity system. - Highlights: •Business models of energy supply markets shape energy transitions. •The British system misses four opportunities of local electricity supply. •Nine new business model archetypes of local supply are analysed. •New electricity business models have complex value propositions. •A process for policy response to business model innovation is presented.

  20. Historical and idealized climate model experiments: an intercomparison of Earth system models of intermediate complexity

    DEFF Research Database (Denmark)

    Eby, M.; Weaver, A. J.; Alexander, K.

    2013-01-01

    Both historical and idealized climate model experiments are performed with a variety of Earth system models of intermediate complexity (EMICs) as part of a community contribution to the Intergovernmental Panel on Climate Change Fifth Assessment Report. Historical simulations start at 850 CE...... and continue through to 2005. The standard simulations include changes in forcing from solar luminosity, Earth's orbital configuration, CO2, additional greenhouse gases, land use, and sulphate and volcanic aerosols. In spite of very different modelled pre-industrial global surface air temperatures, overall 20...