WorldWideScience

Sample records for explicit modeling tool

  1. Spatially-explicit LCIA model for marine eutrophication as a tool for sustainability assessment

    DEFF Research Database (Denmark)

    Cosme, Nuno Miguel Dias; Hauschild, Michael Zwicky

    2014-01-01

The increasing emissions from human activities are overrunning the ecosystems’ natural capacity to absorb them. Nutrient emissions, mostly nitrogen- and phosphorus-forms (N, P) from e.g. agricultural runoff and combustion processes, may lead to socio-economic impacts and environmental quality … -enrichment to impacts on marine ecosystems. Emitted nitrogen reaches marine coastal waters where it promotes the growth of phytoplankton biomass in the surface photic zone, from where it eventually sinks to bottom waters. This downward flux of organic matter is respired there by bacteria, resulting in the consumption of dissolved oxygen. An excessive depletion of oxygen affects the exposed organisms, and loss of species diversity may be expected. A model framework was built to estimate the potential impacts arising from N-emissions (see figure). It combines the fate of N in rivers and coastal waters, the exposure …

  2. Skeeter Buster: a stochastic, spatially explicit modeling tool for studying Aedes aegypti population replacement and population suppression strategies.

    Directory of Open Access Journals (Sweden)

    Krisztian Magori

    2009-09-01

Full Text Available Dengue is the most important mosquito-borne viral disease affecting humans. The only prevention measure currently available is the control of its vectors, primarily Aedes aegypti. Recent advances in genetic engineering have opened the possibility for a new range of control strategies based on genetically modified mosquitoes. Assessing the potential efficacy of genetic (and conventional) strategies requires the availability of modeling tools that accurately describe the dynamics and genetics of Ae. aegypti populations. We describe in this paper a new modeling tool of Ae. aegypti population dynamics and genetics named Skeeter Buster. This model operates at the scale of individual water-filled containers for immature stages and individual properties (houses) for adults. The biology of cohorts of mosquitoes is modeled based on the algorithms used in the non-spatial Container Inhabiting Mosquitoes Simulation Model (CIMSiM). Additional features incorporated into Skeeter Buster include stochasticity, spatial structure, and detailed population genetics. We observe that the stochastic modeling of individual containers in Skeeter Buster is associated with a strongly reduced temporal variation in stage-specific population densities. We show that heterogeneity in container composition of individual properties has a major impact on spatial heterogeneity in population density between properties. We detail how adult dispersal reduces this spatial heterogeneity. Finally, we present the predicted genetic structure of the population by calculating FST values and isolation-by-distance patterns, and examine the effects of adult dispersal and container movement between properties. We demonstrate that the incorporated stochasticity and level of spatial detail have major impacts on the simulated population dynamics, which could potentially impact predictions in terms of control measures.
The capacity to describe population genetics confers the ability to model the outcome
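The container-level stochasticity described above can be sketched in miniature: treat each immature mosquito in a water-filled container as surviving a day independently, so daily survival is a binomial draw and small containers fluctuate strongly. This is an illustrative sketch under assumed parameters (the survival probability and time step are invented), not CIMSiM's or Skeeter Buster's actual algorithms.

```python
import random

def step_container(larvae, daily_survival=0.9, rng=random):
    """One day in one container: each larva survives independently
    with probability daily_survival (a binomial draw)."""
    return sum(1 for _ in range(larvae) if rng.random() < daily_survival)

def simulate(larvae=100, days=10, seed=42):
    """Run one container for `days` days with a seeded RNG so that
    stochastic runs are reproducible."""
    rng = random.Random(seed)
    for _ in range(days):
        larvae = step_container(larvae, 0.9, rng)
    return larvae
```

Repeating `simulate` with different seeds gives the run-to-run variation that a deterministic cohort model, which tracks only a single expected-value trajectory, cannot produce.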

  3. Philosophical Reflections made explicit as a Tool for Mathematical Reasoning

    DEFF Research Database (Denmark)

    Frølund, Sune; Andresen, Mette

    2009-01-01

    A new construct, ‘multidisciplinarity', has been prescribed in the curricula of Danish Upper Secondary Schools by governmental regulations since 2006. Multidisciplinarity offers a good chance to introduce philosophical tools or methods in mathematics with the aim of improving the students' learning of both subjects, and of studying the students' reactions and signs of progressive mathematizing. Based on realistic mathematics education (RME), which is rooted in Hans Freudenthal's idea of mathematics as a human activity, we decided to centre our work on the concept of reflection and to build a model for making students' reflections in the mathematics class explicit to themselves. In our paper, we present a combination of two stratifications of reflections which were developed recently in works by other authors. The paper outlines our model and exemplifies its use in the teaching of mathematical models …

  4. Modeling Agricultural Watersheds with the Soil and Water Assessment Tool (SWAT): Calibration and Validation with a Novel Procedure for Spatially Explicit HRUs.

    Science.gov (United States)

    Teshager, Awoke Dagnew; Gassman, Philip W; Secchi, Silvia; Schoof, Justin T; Misgna, Girmaye

    2016-04-01

Applications of the Soil and Water Assessment Tool (SWAT) model typically involve delineation of a watershed into subwatersheds/subbasins that are then further subdivided into hydrologic response units (HRUs), which are homogeneous areas of aggregated soil, landuse, and slope and are the smallest modeling units used within the model. In a given standard SWAT application, multiple potential HRUs (farm fields) in a subbasin are usually aggregated into a single HRU feature. In other words, the standard version of the model combines multiple potential HRUs (farm fields) with the same landuse/landcover, soil, and slope, but located at different places of a subbasin (spatially non-unique), and considers them as one HRU. In this study, ArcGIS pre-processing procedures were developed to spatially define a one-to-one match between farm fields and HRUs (spatially unique HRUs) within a subbasin prior to SWAT simulations to facilitate input processing, input/output mapping, and further analysis at the individual farm field level. Model input data such as landuse/landcover (LULC), soil, crop rotation, and other management data were processed through these HRUs. The SWAT model was then calibrated/validated for the Raccoon River watershed in Iowa for 2002-2010 and the Big Creek River watershed in Illinois for 2000-2003. SWAT was able to replicate annual, monthly, and daily streamflow, as well as sediment, nitrate, and mineral phosphorus within recommended accuracy in most cases. The one-to-one match between farm fields and HRUs created and used in this study is a first step in performing LULC change, climate change impact, and other analyses in a more spatially explicit manner.
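The difference between standard aggregated HRUs and the spatially unique HRUs introduced here can be shown with a toy grouping; the field attributes below are hypothetical, not taken from the study watersheds.

```python
from collections import defaultdict

# Hypothetical farm fields: (field_id, landuse, soil, slope_class)
fields = [
    (1, "corn", "clay", "0-2%"),
    (2, "corn", "clay", "0-2%"),   # same attribute combination as field 1
    (3, "soy",  "loam", "2-5%"),
]

# Standard SWAT: fields with identical landuse/soil/slope collapse into one HRU.
aggregated = defaultdict(list)
for fid, lu, soil, slope in fields:
    aggregated[(lu, soil, slope)].append(fid)

# Spatially unique HRUs (this study): a one-to-one match, one HRU per field.
spatially_unique = {fid: (lu, soil, slope) for fid, lu, soil, slope in fields}

print(len(aggregated))        # 2 HRUs in the standard approach
print(len(spatially_unique))  # 3 HRUs, one per field
```

In the standard approach fields 1 and 2 lose their separate identities; the one-to-one mapping preserves them, which is what enables input assignment and output mapping at the individual-field level.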

  5. Modeling Implicit and Explicit Memory.

    NARCIS (Netherlands)

    Raaijmakers, J.G.W.; Ohta, N.; Izawa, C.

    2005-01-01

    Mathematical models of memory are useful for describing basic processes of memory in a way that enables generalization across a number of experimental paradigms. Models that have these characteristics do not just engage in empirical curve-fitting, but may also provide explanations for puzzling

  6. Spatially explicit modeling of greater sage-grouse (Centrocercus urophasianus) habitat in Nevada and northeastern California: a decision-support tool for management

    Science.gov (United States)

    Coates, Peter S.; Casazza, Michael L.; Brussee, Brianne E.; Ricca, Mark A.; Gustafson, K. Benjamin; Overton, Cory T.; Sanchez-Chopitea, Erika; Kroger, Travis; Mauch, Kimberly; Niell, Lara; Howe, Kristy; Gardner, Scott; Espinosa, Shawn; Delehanty, David J.

    2014-01-01

    Greater sage-grouse (Centrocercus urophasianus, hereafter referred to as “sage-grouse”) populations are declining throughout the sagebrush (Artemisia spp.) ecosystem, including millions of acres of potential habitat across the West. Habitat maps derived from empirical data are needed given impending listing decisions that will affect both sage-grouse population dynamics and human land-use restrictions. This report presents the process for developing spatially explicit maps describing relative habitat suitability for sage-grouse in Nevada and northeastern California. Maps depicting habitat suitability indices (HSI) values were generated based on model-averaged resource selection functions informed by more than 31,000 independent telemetry locations from more than 1,500 radio-marked sage-grouse across 12 project areas in Nevada and northeastern California collected during a 15-year period (1998–2013). Modeled habitat covariates included land cover composition, water resources, habitat configuration, elevation, and topography, each at multiple spatial scales that were relevant to empirically observed sage-grouse movement patterns. We then present an example of how the HSI can be delineated into categories. Specifically, we demonstrate that the deviation from the mean can be used to classify habitat suitability into three categories of habitat quality (high, moderate, and low) and one non-habitat category. The classification resulted in an agreement of 93–97 percent for habitat versus non-habitat across a suite of independent validation datasets. Lastly, we provide an example of how space use models can be integrated with habitat models to help inform conservation planning. In this example, we combined probabilistic breeding density with a non-linear probability of occurrence relative to distance to nearest lek (traditional breeding ground) using count data to calculate a composite space use index (SUI). The SUI was then classified into two categories of use
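The deviation-from-the-mean classification of HSI values described above can be sketched as follows; the cut-points (mean ± 0.5 SD) and the non-habitat threshold used here are illustrative assumptions, not the report's actual values.

```python
import statistics

def classify_hsi(values, nonhabitat_threshold=0.05):
    """Classify habitat suitability index (HSI) values into three habitat
    categories plus non-habitat, using deviation from the mean.
    Cut-points are illustrative, not those of the report."""
    mean = statistics.mean(values)
    sd = statistics.stdev(values)
    labels = []
    for v in values:
        if v < nonhabitat_threshold:
            labels.append("non-habitat")
        elif v >= mean + 0.5 * sd:
            labels.append("high")
        elif v >= mean - 0.5 * sd:
            labels.append("moderate")
        else:
            labels.append("low")
    return labels
```

In the report the resulting habitat/non-habitat split was then checked against independent validation datasets (93–97 percent agreement); a sketch like this would be validated the same way before use.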

  7. Spatially explicit modelling of cholera epidemics

    Science.gov (United States)

    Finger, F.; Bertuzzo, E.; Mari, L.; Knox, A. C.; Gatto, M.; Rinaldo, A.

    2013-12-01

    Epidemiological models can provide crucial understanding about the dynamics of infectious diseases. Possible applications range from real-time forecasting and allocation of health care resources to testing alternative intervention mechanisms such as vaccines, antibiotics or the improvement of sanitary conditions. We apply a spatially explicit model to the cholera epidemic that struck Haiti in October 2010 and is still ongoing. The dynamics of susceptibles as well as symptomatic and asymptomatic infectives are modelled at the scale of local human communities. Dissemination of Vibrio cholerae through hydrological transport and human mobility along the road network is explicitly taken into account, as well as the effect of rainfall as a driver of increasing disease incidence. The model is calibrated using a dataset of reported cholera cases. We further model the long term impact of several types of interventions on the disease dynamics by varying parameters appropriately. Key epidemiological mechanisms and parameters which affect the efficiency of treatments such as antibiotics are identified. Our results lead to conclusions about the influence of different intervention strategies on the overall epidemiological dynamics.
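A minimal, non-spatial sketch of the susceptible-infected-bacteria dynamics underlying such cholera models (in the spirit of Codeço-type formulations) is shown below. All parameter values are illustrative; the actual Haiti model adds spatial coupling along rivers and roads, asymptomatic infections, and rainfall forcing.

```python
def sib_step(S, I, B, dt=0.1, beta=0.5, K=1.0, gamma=0.2, mu_B=0.3, p=1.0):
    """One Euler step of a minimal SIB cholera model: the force of
    infection saturates as beta * B / (K + B); infected people shed
    bacteria at rate p; bacteria decay at rate mu_B. Illustrative only."""
    foi = beta * B / (K + B)          # force of infection from contaminated water
    dS = -foi * S
    dI = foi * S - gamma * I          # new cases minus recoveries
    dB = p * I - mu_B * B             # shedding minus bacterial decay
    return S + dt * dS, I + dt * dI, B + dt * dB

# Start with 1% of the population infected and clean water.
S, I, B = 0.99, 0.01, 0.0
for _ in range(1000):
    S, I, B = sib_step(S, I, B)
```

Interventions are modeled by varying parameters, exactly as the abstract describes: vaccination lowers the initial S, antibiotics raise the recovery rate gamma, and improved sanitation lowers the shedding rate p.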

  8. Spatially explicit modeling in ecology: A review

    Science.gov (United States)

    DeAngelis, Donald L.; Yurek, Simeon

    2017-01-01

    The use of spatially explicit models (SEMs) in ecology has grown enormously in the past two decades. One major advancement has been that fine-scale details of landscapes, and of spatially dependent biological processes, such as dispersal and invasion, can now be simulated with great precision, due to improvements in computer technology. Many areas of modeling have shifted toward a focus on capturing these fine-scale details, to improve mechanistic understanding of ecosystems. However, spatially implicit models (SIMs) have played a dominant role in ecology, and arguments have been made that SIMs, which account for the effects of space without specifying spatial positions, have an advantage of being simpler and more broadly applicable, perhaps contributing more to understanding. We address this debate by comparing SEMs and SIMs in examples from the past few decades of modeling research. We argue that, although SIMs have been the dominant approach in the incorporation of space in theoretical ecology, SEMs have unique advantages for addressing pragmatic questions concerning species populations or communities in specific places, because local conditions, such as spatial heterogeneities, organism behaviors, and other contingencies, produce dynamics and patterns that usually cannot be incorporated into simpler SIMs. SEMs are also able to describe mechanisms at the local scale that can create amplifying positive feedbacks at that scale, creating emergent patterns at larger scales, and therefore are important to basic ecological theory. We review the use of SEMs at the level of populations, interacting populations, food webs, and ecosystems and argue that SEMs are not only essential in pragmatic issues, but must play a role in the understanding of causal relationships on landscapes.

  9. Explicitly represented polygon wall boundary model for the explicit MPS method

    Science.gov (United States)

    Mitsume, Naoto; Yoshimura, Shinobu; Murotani, Kohei; Yamada, Tomonori

    2015-05-01

This study presents an accurate and robust boundary model, the explicitly represented polygon (ERP) wall boundary model, to treat arbitrarily shaped wall boundaries in the explicit moving particle simulation (E-MPS) method, which is a mesh-free particle method for strong-form partial differential equations. The ERP model expresses wall boundaries as polygons, which are represented explicitly without using a distance function. The boundary conditions are derived so that, for viscous fluids and at lower computational cost, they satisfy the Neumann boundary condition for pressure and the slip/no-slip condition on the wall surface. The proposed model is verified and validated by comparing computed results with the theoretical solution, results obtained by other models, and experimental results. Two simulations with complex boundary movements are conducted to demonstrate the applicability of the ERP model within the E-MPS method.

  10. Integrating remote sensing and spatially explicit epidemiological modeling

    Science.gov (United States)

    Finger, Flavio; Knox, Allyn; Bertuzzo, Enrico; Mari, Lorenzo; Bompangue, Didier; Gatto, Marino; Rinaldo, Andrea

    2015-04-01

Spatially explicit epidemiological models are a crucial tool for the prediction of epidemiological patterns in time and space as well as for the allocation of health care resources. In addition, they can provide valuable information about epidemiological processes and allow for the identification of environmental drivers of disease spread. Most epidemiological models rely on environmental data as inputs. These can either be measured in the field by means of conventional instruments or estimated using remote sensing techniques that measure suitable proxies of the variables of interest. The latter benefits from several advantages over conventional methods, including data availability, which can be an issue especially in developing countries, and the spatial as well as temporal resolution of the data, which is particularly crucial for spatially explicit models. Here we present the case study of a spatially explicit, semi-mechanistic model applied to recurring cholera outbreaks in the Lake Kivu area (Democratic Republic of the Congo). The model describes the cholera incidence in eight health zones on the shore of the lake. Remotely sensed datasets of chlorophyll a concentration in the lake, precipitation, and indices of global climate anomalies are used as environmental drivers. Human mobility and its effect on the disease spread is also taken into account. Several model configurations are tested on a dataset of reported cases. The best models, accounting for different environmental drivers and selected using the Akaike information criterion, are formally compared via cross-validation. The best performing model accounts for seasonality, the El Niño Southern Oscillation, precipitation, and human mobility.
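Ranking model configurations with the Akaike information criterion, as described above, is a one-line computation (AIC = 2k − 2 ln L, lower is better). The log-likelihoods and parameter counts below are invented for illustration, not the study's fitted values.

```python
def aic(log_likelihood, n_params):
    """Akaike information criterion: 2k - 2*ln(L); lower is better."""
    return 2 * n_params - 2 * log_likelihood

# Hypothetical fits of three model configurations to the same case data:
candidates = {
    "rain only":              aic(-120.4, 3),
    "rain + ENSO":            aic(-115.1, 4),
    "rain + ENSO + mobility": aic(-114.8, 6),
}
best = min(candidates, key=candidates.get)
print(best)  # -> "rain + ENSO"
```

Note how AIC penalizes the extra parameters of the richest model when they barely improve the fit; the abstract's final cross-validation step then compares the surviving candidates out of sample.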

  11. Making Culturally Responsive Mathematics Teaching Explicit: A Lesson Analysis Tool

    Science.gov (United States)

    Aguirre, Julia M.; Zavala, Maria del Rosario

    2013-01-01

    In the United States, there is a need for pedagogical tools that help teachers develop essential pedagogical content knowledge and practices to meet the mathematical education needs of a growing culturally and linguistically diverse student population. In this article, we introduce an innovative lesson analysis tool that focuses on integrating…

  12. Explicit Nonlinear Model Predictive Control Theory and Applications

    CERN Document Server

    Grancharova, Alexandra

    2012-01-01

Nonlinear Model Predictive Control (NMPC) has become the accepted methodology to solve complex control problems related to process industries. The main motivation behind explicit NMPC is that an explicit state feedback law avoids the need for executing a numerical optimization algorithm in real time. The benefits of an explicit solution, in addition to the efficient on-line computations, include also verifiability of the implementation and the possibility to design embedded control systems with low software and hardware complexity. This book considers the multi-parametric Nonlinear Programming (mp-NLP) approaches to explicit approximate NMPC of constrained nonlinear systems, developed by the authors, as well as their applications to various NMPC problem formulations and several case studies. The following types of nonlinear systems are considered, resulting in different NMPC problem formulations: • Nonlinear systems described by first-principles models and nonlinear systems described by black-box models; …
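The core idea of explicit MPC, replacing online optimization with evaluation of a precomputed piecewise-affine feedback law, can be sketched as follows. The regions and gains are invented for illustration, and real mp-NLP solutions partition the state space into polyhedral (not interval) regions in higher dimensions.

```python
# A (hypothetical) precomputed explicit MPC solution for a scalar state:
# each region carries an affine law u = K*x + g. Online, the controller
# only locates the region and evaluates the law -- no optimizer runs
# in real time, which is the motivation described in the abstract.
regions = [
    # (x_min, x_max, K, g) -- illustrative numbers only
    (-10.0, -1.0, -0.5,  0.2),
    ( -1.0,  1.0, -1.0,  0.0),
    (  1.0, 10.0, -0.5, -0.2),
]

def explicit_mpc(x):
    """Point location followed by affine evaluation."""
    for lo, hi, K, g in regions:
        if lo <= x <= hi:
            return K * x + g
    raise ValueError("state outside the explored region")

print(explicit_mpc(0.5))  # -0.5
```

The table lookup is cheap and auditable, which is why explicit solutions suit embedded controllers with limited hardware, at the cost of storing and searching the region table.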

  13. Spatially explicit modeling of annual and seasonal habitat for greater sage-grouse (Centrocercus urophasianus) in Nevada and Northeastern California—An updated decision-support tool for management

    Science.gov (United States)

    Coates, Peter S.; Casazza, Michael L.; Brussee, Brianne E.; Ricca, Mark A.; Gustafson, K. Benjamin; Sanchez-Chopitea, Erika; Mauch, Kimberly; Niell, Lara; Gardner, Scott; Espinosa, Shawn; Delehanty, David J.

    2016-05-20

Successful adaptive management hinges largely upon integrating new and improved sources of information as they become available. As a timely example of this tenet, we updated a management decision support tool that was previously developed for greater sage-grouse (Centrocercus urophasianus, hereinafter referred to as “sage-grouse”) populations in Nevada and California. Specifically, recently developed spatially explicit habitat maps derived from empirical data played a key role in the conservation of this species facing listing under the Endangered Species Act. This report provides an updated process for mapping relative habitat suitability and management categories for sage-grouse in Nevada and northeastern California (Coates and others, 2014, 2016). These updates include: (1) adding radio and GPS telemetry locations from sage-grouse monitored at multiple sites during 2014 to the original location dataset beginning in 1998; (2) integrating output from high resolution maps (1–2 m²) of sagebrush and pinyon-juniper cover as covariates in resource selection models; (3) modifying the spatial extent of the analyses to match newly available vegetation layers; (4) explicit modeling of relative habitat suitability during three seasons (spring, summer, winter) that corresponded to critical life history periods for sage-grouse (breeding, brood-rearing, over-wintering); (5) accounting for differences in habitat availability between more mesic sagebrush steppe communities in the northern part of the study area and drier Great Basin sagebrush in more southerly regions by categorizing continuous region-wide surfaces of habitat suitability index (HSI) with independent locations falling within two hydrological zones; (6) integrating the three seasonal maps into a composite map of annual relative habitat suitability; (7) deriving updated land management categories based on previously determined cut-points for intersections of habitat suitability and an updated index of sage

  14. Recent Advances in Explicit Multiparametric Nonlinear Model Predictive Control

    KAUST Repository

    Domínguez, Luis F.

    2011-01-19

    In this paper we present recent advances in multiparametric nonlinear programming (mp-NLP) algorithms for explicit nonlinear model predictive control (mp-NMPC). Three mp-NLP algorithms for NMPC are discussed, based on which novel mp-NMPC controllers are derived. The performance of the explicit controllers are then tested and compared in a simulation example involving the operation of a continuous stirred-tank reactor (CSTR). © 2010 American Chemical Society.

  15. Modelling conflict management in design: an explicit approach

    NARCIS (Netherlands)

    Brazier, F.M.; van Langen, P.H.G.; Treur, J.

    1995-01-01

    This paper focusses on how conflicts that arise during a design process and the management of conflicts can be modelled. A number of possible conflict types are distinguished and it is described how each of them can be detected during the design process, using an explicit meta-representation.

  16. Modeling single versus multiple systems in implicit and explicit memory.

    Science.gov (United States)

    Starns, Jeffrey J; Ratcliff, Roger; McKoon, Gail

    2012-04-01

    It is currently controversial whether priming on implicit tasks and discrimination on explicit recognition tests are supported by a single memory system or by multiple, independent systems. In a Psychological Review article, Berry and colleagues used mathematical modeling to address this question and provide compelling evidence against the independent-systems approach. Copyright © 2012 Elsevier Ltd. All rights reserved.

  17. A new parallelization algorithm of ocean model with explicit scheme

    Science.gov (United States)

    Fu, X. D.

    2017-08-01

This paper focuses on the parallelization of an ocean model with an explicit scheme, one of the most commonly used schemes in the discretization of the governing equations of ocean models. The characteristic of an explicit scheme is that the calculation is simple and that the value at a given grid point depends only on values from the previous time step, which means that one does not need to solve sparse linear equations when integrating the governing equations of the ocean model. Exploiting these characteristics of the explicit scheme, this paper designs a parallel algorithm, named halo cells update, that requires only tiny modifications of the original ocean model and little change to its space step and time step, and that parallelizes the ocean model by adding a transmission module between sub-domains. This paper takes the Global Reduced Gravity Ocean model (GRGO) as an example to implement the parallelization with halo update. The result demonstrates that higher speedups can be achieved at different problem sizes.
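A serial sketch of the halo-cells-update idea for an explicit scheme: split a 1-D field into sub-domains padded with one ghost ("halo") cell per side, refresh each halo from the neighbour's edge value after every step, and each sub-domain then needs only local data for its explicit update. Here the exchange is a direct copy; in a real ocean model it would be an MPI message between processes. The diffusion stencil and sizes are illustrative, not GRGO's discretization.

```python
def diffuse(u, alpha=0.1):
    """One explicit time step on the interior of a padded array;
    the first and last (halo/boundary) entries are left unchanged."""
    return [u[0]] + [u[i] + alpha * (u[i-1] - 2*u[i] + u[i+1])
                     for i in range(1, len(u) - 1)] + [u[-1]]

def step_with_halos(left, right):
    """Halo update, then independent explicit steps on each sub-domain."""
    left[-1] = right[1]   # left's right halo <- right's first interior cell
    right[0] = left[-2]   # right's left halo <- left's last interior cell
    return diffuse(left), diffuse(right)

# A global field [0, 1..6, 0] split into two padded sub-domains:
left  = [0.0, 1.0, 2.0, 3.0, 0.0]   # halo, interior..., halo
right = [0.0, 4.0, 5.0, 6.0, 0.0]
left, right = step_with_halos(left, right)
```

After the exchange, the two sub-domain updates reproduce the single-domain result exactly, which is why only a thin transmission module has to be added to the original serial model.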

  18. Modeling the Explicit Chemistry of Anthropogenic and Biogenic Organic Aerosols

    Energy Technology Data Exchange (ETDEWEB)

    Madronich, Sasha [Univ. Corporation for Atmospheric Research, Boulder, CO (United States)

    2015-12-09

    The atmospheric burden of Secondary Organic Aerosols (SOA) remains one of the most important yet uncertain aspects of the radiative forcing of climate. This grant focused on improving our quantitative understanding of SOA formation and evolution, by developing, applying, and improving a highly detailed model of atmospheric organic chemistry, the Generation of Explicit Chemistry and Kinetics of Organics in the Atmosphere (GECKO-A) model. Eleven (11) publications have resulted from this grant.

  19. Explicit chiral symmetry breaking in Gross-Neveu type models

    Energy Technology Data Exchange (ETDEWEB)

    Boehmer, Christian

    2011-07-25

This thesis is devoted to the study of a 1+1-dimensional, fermionic quantum field theory with Lagrangian L = ψ̄iγ^μ∂_μψ − m₀ψ̄ψ + (g²/2)(ψ̄ψ)² + (G²/2)(ψ̄iγ₅ψ)² in the limit of an infinite number of flavors, using semiclassical methods. The main goal of the present work was to see what changes if we allow for explicit chiral symmetry breaking, either by a bare mass term, or a splitting of the scalar and pseudo-scalar coupling constants, or both. In the first case, this becomes the massive NJL₂ model. In the second and third cases we are dealing with a model largely unexplored so far. The first half of this thesis deals with the massive NJL₂ model. Before attacking the phase diagram, it was necessary to determine the baryons of the model. We have carried out full numerical Hartree-Fock calculations including the Dirac sea. The most important result is the first complete phase diagram of the massive NJL₂ model in (μ, T, γ) space, where γ arises from m₀ through mass renormalization. In the second half of the thesis we have studied a generalization of the massless NJL₂ model with two different (scalar and pseudoscalar) coupling constants, first in the massless version. Renormalization of the two coupling constants leads to the usual dynamical mass by dimensional transmutation, but in addition yields a novel parameter ξ, interpreted as a chiral quenching parameter. As far as baryon structure is concerned, the most interesting result is the fact that the new baryons interpolate between the kink of the GN model and the massless baryon of the NJL₂ model, always carrying fractional baryon number 1/2. The phase diagram of the massless model with two coupling constants has again been determined numerically. At zero temperature we have also investigated the massive, generalized GN model with three parameters. It is well…

  20. Explicit chiral symmetry breaking in Gross-Neveu type models

    International Nuclear Information System (INIS)

    Boehmer, Christian

    2011-01-01

This thesis is devoted to the study of a 1+1-dimensional, fermionic quantum field theory with Lagrangian L = ψ̄iγ^μ∂_μψ − m₀ψ̄ψ + (g²/2)(ψ̄ψ)² + (G²/2)(ψ̄iγ₅ψ)² in the limit of an infinite number of flavors, using semiclassical methods. The main goal of the present work was to see what changes if we allow for explicit chiral symmetry breaking, either by a bare mass term, or a splitting of the scalar and pseudo-scalar coupling constants, or both. In the first case, this becomes the massive NJL₂ model. In the second and third cases we are dealing with a model largely unexplored so far. The first half of this thesis deals with the massive NJL₂ model. Before attacking the phase diagram, it was necessary to determine the baryons of the model. We have carried out full numerical Hartree-Fock calculations including the Dirac sea. The most important result is the first complete phase diagram of the massive NJL₂ model in (μ, T, γ) space, where γ arises from m₀ through mass renormalization. In the second half of the thesis we have studied a generalization of the massless NJL₂ model with two different (scalar and pseudoscalar) coupling constants, first in the massless version. Renormalization of the two coupling constants leads to the usual dynamical mass by dimensional transmutation, but in addition yields a novel parameter ξ, interpreted as a chiral quenching parameter. As far as baryon structure is concerned, the most interesting result is the fact that the new baryons interpolate between the kink of the GN model and the massless baryon of the NJL₂ model, always carrying fractional baryon number 1/2. The phase diagram of the massless model with two coupling constants has again been determined numerically. At zero temperature we have also investigated the massive, generalized GN model with three parameters. It is well known that the massless NJL₂ model can be solved analytically. The same is true for the GN model, be it massless or massive. Here, the…

  1. Explicit estimating equations for semiparametric generalized linear latent variable models

    KAUST Repository

    Ma, Yanyuan

    2010-07-05

We study generalized linear latent variable models without requiring a distributional assumption of the latent variables. Using a geometric approach, we derive consistent semiparametric estimators. We demonstrate that these models have a property which is similar to that of a sufficient complete statistic, which enables us to simplify the estimating procedure and to formulate the semiparametric estimating equations explicitly. We further show that the explicit estimators have the usual root-n consistency and asymptotic normality. We explain the computational implementation of our method and illustrate the numerical performance of the estimators in finite sample situations via extensive simulation studies. The advantage of our estimators over the existing likelihood approach is also shown via numerical comparison. We employ the method to analyse a real data example from economics. © 2010 Royal Statistical Society.

  2. SPATIALLY-EXPLICIT BAT IMPACT SCREENING TOOL FOR WIND TURBINE SITING

    Energy Technology Data Exchange (ETDEWEB)

    Versar, Inc.; Exponent, Inc.

    2013-10-28

As the U.S. seeks to increase energy production from renewable energy sources, development of wind power resources continues to grow. One of the most important ecological issues restricting wind energy development, especially the siting of wind turbines, is the potential adverse effect on bats. High levels of bat fatality have been recorded at a number of wind energy facilities, especially in the eastern United States. The U.S. Department of Energy contracted with Versar, Inc., and Exponent to develop a spatially explicit site screening tool to evaluate the mortality of bats resulting from interactions (collisions or barotrauma) with wind turbines. The resulting Bat Vulnerability Assessment Tool (BVAT) presented in this report integrates spatial information about turbine locations, bat habitat features, and bat behavior as it relates to possible interactions with turbines. A model demonstration was conducted that focuses on two bat species, the eastern red bat (Lasiurus borealis) and the Indiana bat (Myotis sodalis). The eastern red bat is a relatively common tree-roosting species that ranges broadly during migration in the eastern U.S., whereas the Indiana bat is a regional species that migrates between a summer range and cave hibernacula. Moreover, Indiana bats are listed as endangered, and so the impacts to this species are of particular interest. The model demonstration used conditions at the Mountaineer Wind Energy Center (MWEC), which consists of 44 wind turbines arranged in a linear array near Thomas, West Virginia (Tucker County), to illustrate model functions and not to represent actual or potential impacts of the facility. The turbines at MWEC are erected on the ridge of Backbone Mountain with a nacelle height of 70 meters and a blade-swept (collision) zone 72 meters in diameter, or approximately 4,071 square meters. The habitat surrounding the turbines is an Appalachian mixed mesophytic forest.
Model sensitivity runs showed that bat mortality in the model was most sensitive to
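The reported collision area is consistent with simple geometry: treating the record's 72-meter figure as the rotor's blade-swept diameter, the swept area πr² comes to about 4,071 square meters. (That the 72 meters is the swept diameter is an inference from this arithmetic, not stated explicitly in the record.)

```python
import math

# Swept (collision) area of a rotor from its blade-swept diameter:
# area = pi * (d/2)**2, matching the record's 4,071 m^2 figure.
rotor_diameter_m = 72.0
swept_area_m2 = math.pi * (rotor_diameter_m / 2) ** 2
print(int(swept_area_m2))  # 4071
```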

  3. Modeling Active Aging and Explicit Memory: An Empirical Study.

    Science.gov (United States)

    Ponce de León, Laura Ponce; Lévy, Jean Pierre; Fernández, Tomás; Ballesteros, Soledad

    2015-08-01

    The rapid growth of the population of older adults and their concomitant psychological status and health needs have captured the attention of researchers and health professionals. To help fill the void of literature available to social workers interested in mental health promotion and aging, the authors provide a model for active aging that uses psychosocial variables. Structural equation modeling was used to examine the relationships among the latent variables of the state of explicit memory, the perception of social resources, depression, and the perception of quality of life in a sample of 184 older adults. The results suggest that explicit memory is not a direct indicator of the perception of quality of life, but it could be considered an indirect indicator as it is positively correlated with perception of social resources and negatively correlated with depression. These last two variables influenced the perception of quality of life directly, the former positively and the latter negatively. The main outcome suggests that the perception of social support improves explicit memory and quality of life and reduces depression in active older adults. The findings also suggest that gerontological professionals should design memory training programs, improve available social resources, and offer environments with opportunities to exercise memory.

  4. Explicit estimating equations for semiparametric generalized linear latent variable models

    KAUST Repository

    Ma, Yanyuan; Genton, Marc G.

    2010-01-01

    which is similar to that of a sufficient complete statistic, which enables us to simplify the estimating procedure and explicitly to formulate the semiparametric estimating equations. We further show that the explicit estimators have the usual root n

  5. Quantifying multiple telecouplings using an integrated suite of spatially-explicit tools

    Science.gov (United States)

    Tonini, F.; Liu, J.

    2016-12-01

    Telecoupling is an interdisciplinary research umbrella concept that enables natural and social scientists to understand and generate information for managing how humans and nature can sustainably coexist worldwide. To systematically study telecoupling, it is essential to build a comprehensive set of spatially-explicit tools for describing and quantifying multiple reciprocal socioeconomic and environmental interactions between a focal area and other areas. Here we introduce the Telecoupling Toolbox, a new free and open-source set of tools developed to map and identify the five major interrelated components of the telecoupling framework: systems, flows, agents, causes, and effects. The modular design of the toolbox allows the integration of existing tools and software (e.g. InVEST) to assess synergies and tradeoffs associated with policies and other local to global interventions. We show applications of the toolbox using a number of representative studies that address a variety of scientific and management issues related to telecouplings throughout the world. The results suggest that the toolbox can thoroughly map and quantify multiple telecouplings under various contexts while providing users with an easy-to-use interface. It provides a powerful platform to address globally important issues, such as land use and land cover change, species invasion, migration, flows of ecosystem services, and international trade of goods and products.

  6. The SOA/VOC/NOx system: an explicit model of secondary organic aerosol formation

    Directory of Open Access Journals (Sweden)

    S. Madronich

    2007-11-01

    Full Text Available Our current understanding of secondary organic aerosol (SOA) formation is limited by our knowledge of the gaseous secondary organics involved in gas/particle partitioning. The objective of this study is to explore (i) the potential for products of multiple oxidation steps contributing to SOA, and (ii) the evolution of the SOA/VOC/NOx system. We developed an explicit model based on the coupling of detailed gas-phase oxidation schemes with a thermodynamic condensation module. Such a model allows prediction of SOA mass and speciation on the basis of first principles. The SOA/VOC/NOx system is studied for the oxidation of 1-octene under atmospherically relevant concentrations. In this study, gaseous oxidation of octene is simulated to lead to SOA formation. Contributors to SOA formation are shown to be formed via multiple oxidation steps of the parent hydrocarbon. The behaviour of the SOA/VOC/NOx system simulated using the explicit model agrees with general tendencies observed during laboratory chamber experiments. This explicit modelling of SOA formation appears to be a useful exploratory tool to (i) support interpretations of SOA formation observed in laboratory chamber experiments, (ii) give some insight into SOA formation under atmospherically relevant conditions and (iii) investigate implications for the regional/global lifetimes of the SOA.

  7. Recent Advances in Explicit Multiparametric Nonlinear Model Predictive Control

    KAUST Repository

    Domínguez, Luis F.; Pistikopoulos, Efstratios N.

    2011-01-01

    are derived. The performance of the explicit controllers are then tested and compared in a simulation example involving the operation of a continuous stirred-tank reactor (CSTR). © 2010 American Chemical Society.

  8. A web-tool to find spatially explicit climate-smart solutions for the sector agriculture

    Science.gov (United States)

    Verzandvoort, Simone; Kuikman, Peter; Walvoort, Dennis

    2017-04-01

    Europe faces the challenge to produce more food and more biomass for the bio-economy, to adapt its agricultural sector to negative consequences of climate change, and to reduce greenhouse gas emissions from agriculture. Climate-smart agriculture (CSA) solutions and technologies improve agriculture's productivity and provide economic growth and stability, increase resilience, and help to reduce GHG emissions from agricultural activities. The Climate Smart Agriculture Booster (CSAb) (http://csabooster.climate-kic.org/) is a Flagship Program under Climate-KIC, aiming to facilitate the adoption of CSA solutions and technologies in the European agro-food sector. This adoption requires spatially explicit, contextual information on farming activities and risks and opportunities related to climate change in regions across Europe. Other spatial information supporting adoption includes information on where successful implementations already exist, on where CSA would profit from enabling policy conditions, and on where markets or business opportunities for selling or purchasing technology and knowledge are located or emerging. The Spatial Solution Finder is a web-based spatial tool aiming to help agri-food companies (supply and processing), authorities or agricultural organisations find CSA solutions and technologies that fit local farmers and regions, and to demonstrate examples of successful implementations as well as expected impact at the farm and regional level. The tool is based on state-of-the-art (geo)datasets of environmental and socio-economic conditions (partly open access, partly derived from previous research) and open-source web technology. The philosophy of the tool is that combining existing datasets with contextual information on the region of interest and personalized information entered by the user provides a suitable basis for offering a basket of options for CSA solutions and technologies.
Solutions and technologies are recommended to the user based on

  9. Fuselage Versus Subcomponent Panel Response Correlation Based on ABAQUS Explicit Progressive Damage Analysis Tools

    Science.gov (United States)

    Gould, Kevin E.; Satyanarayana, Arunkumar; Bogert, Philip B.

    2016-01-01

    Analysis performed in this study substantiates the need for high fidelity vehicle level progressive damage analyses (PDA) structural models for use in the verification and validation of proposed sub-scale structural models and to support required full-scale vehicle level testing. PDA results are presented that capture and correlate the responses of sub-scale 3-stringer and 7-stringer panel models and an idealized 8-ft diameter fuselage model, which provides a vehicle level environment for the 7-stringer sub-scale panel model. Two unique skin-stringer attachment assumptions are considered and correlated in the models analyzed: the TIE constraint interface versus the cohesive element (COH3D8) interface. Evaluating different interfaces allows for assessing a range of predicted damage modes, including delamination and crack propagation responses. Damage models considered in this study are the ABAQUS built-in Hashin procedure and the COmplete STress Reduction (COSTR) damage procedure implemented through a VUMAT user subroutine using the ABAQUS/Explicit code.

  10. Modelling the Hydraulic Behaviour of Growing Media with the Explicit Finite Volume Solution

    Directory of Open Access Journals (Sweden)

    Marco Carbone

    2015-02-01

    Full Text Available The increasing imperviousness of urban areas reduces the infiltration and evapotranspiration capacity of urban catchments and results in increased runoff. In the last few decades, several solutions and techniques have been proposed to prevent such impacts by restoring the hydrological cycle. A limiting factor in spreading the use of such systems is the lack of proper modelling tools for design, especially for the infiltration processes in a growing medium. In this research, a physically-based model, employing the explicit Finite Volume Method (FVM), is proposed for modelling infiltration into growing media. The model solves a modified version of the Richards equation using a formulation which takes into account the main characteristics of green infrastructure substrates. The proposed model was verified against the HYDRUS-1D software and the comparison of results confirmed the suitability of the proposed model for correctly describing the hydraulic behaviour of soil substrates.
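The explicit finite-volume idea can be illustrated with a much simpler problem than the paper's modified Richards equation: a 1D moisture-diffusion equation advanced with an explicit FV update and zero-flux boundaries. The diffusivity law, grid, and initial condition below are hypothetical, chosen only to keep the scheme stable:

```python
import numpy as np

# Explicit finite-volume sketch for simplified 1D moisture redistribution,
#   d(theta)/dt = d/dz( D(theta) * d(theta)/dz ),  zero-flux boundaries.
# D(theta) is a hypothetical diffusivity law, not a calibrated substrate model.
nz, dz, dt = 50, 0.01, 0.05          # cells, cell size [m], time step [s]
theta = np.full(nz, 0.10)            # volumetric water content [-]
theta[:5] = 0.35                     # initially wet surface layer

def D(th):
    return 1e-6 + 1e-4 * th**2       # [m^2/s]; keeps dt*D/dz^2 well below 0.5

mass0 = theta.sum() * dz
for _ in range(2000):
    F = np.zeros(nz + 1)                            # face fluxes; F[0] = F[-1] = 0
    Df = 0.5 * (D(theta[:-1]) + D(theta[1:]))       # diffusivity at interior faces
    F[1:-1] = -Df * (theta[1:] - theta[:-1]) / dz   # Fickian face flux
    theta -= dt / dz * (F[1:] - F[:-1])             # FV update, cell by cell

print(abs(theta.sum() * dz - mass0) < 1e-9)         # → True: water mass conserved
print(theta.min() > 0.099 and theta.max() < 0.351)  # → True: bounds preserved
```

Because the update is written in flux form, interior fluxes cancel in the column sum, so mass conservation holds by construction; that property is one of the main reasons to prefer an FV discretization for infiltration problems.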

  11. Generalized Heteroskedasticity ACF for Moving Average Models in Explicit Forms

    OpenAIRE

    Samir Khaled Safi

    2014-01-01

    The autocorrelation function (ACF) measures the correlation between observations at different distances apart. We derive explicit equations for the generalized heteroskedasticity ACF for moving averages of order q, MA(q). We consider two cases. Firstly: when the disturbance terms follow the general covariance matrix structure Cov(wi, wj) = Σ with σij ≠ 0 ∀ i ≠ j. Secondly: when the diagonal elements of Σ are not all identical but σij = 0 ∀ i ≠ j, i.e. Σ = diag(σ11, σ22, …, σtt).

  12. Generalized Heteroskedasticity ACF for Moving Average Models in Explicit Forms

    Directory of Open Access Journals (Sweden)

    Samir Khaled Safi

    2014-02-01

    Full Text Available The autocorrelation function (ACF) measures the correlation between observations at different distances apart. We derive explicit equations for the generalized heteroskedasticity ACF for moving averages of order q, MA(q). We consider two cases. Firstly: when the disturbance terms follow the general covariance matrix structure Cov(wi, wj) = Σ with σij ≠ 0 ∀ i ≠ j. Secondly: when the diagonal elements of Σ are not all identical but σij = 0 ∀ i ≠ j, i.e. Σ = diag(σ11, σ22, …, σtt). The forms of the explicit equations depend essentially on the moving average coefficients and the covariance structure of the disturbance terms.
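For the MA(1) special case of the second scenario (diagonal Σ with unequal variances), the explicit forms reduce to Var(X_t) = σ_tt + θ² σ_{t-1,t-1} and Cov(X_t, X_{t-1}) = θ σ_{t-1,t-1}, which can be checked directly against the exact covariance A Σ Aᵀ of X = A w. The numbers below are hypothetical:

```python
import numpy as np

# MA(1) with heteroskedastic, uncorrelated disturbances:
#   X_t = w_t + theta * w_{t-1},   Sigma = diag(s_11, ..., s_TT).
# Explicit forms (illustrative case of the general MA(q) result):
#   Var(X_t)          = s_tt + theta^2 * s_{t-1,t-1}
#   Cov(X_t, X_{t-1}) = theta * s_{t-1,t-1}
theta, T = 0.7, 6
s = np.array([1.0, 2.0, 0.5, 1.5, 3.0, 1.0])   # hypothetical diagonal of Sigma
Sigma = np.diag(s)

A = np.eye(T) + theta * np.eye(T, k=-1)        # X = A w (lag-1 MA filter matrix)
CovX = A @ Sigma @ A.T                         # exact covariance of X

t = 3
print(np.isclose(CovX[t, t], s[t] + theta**2 * s[t - 1]))   # → True
print(np.isclose(CovX[t, t - 1], theta * s[t - 1]))         # → True
```

The same matrix construction extends to MA(q) by adding θ_k on the k-th subdiagonal, and to the first scenario by replacing the diagonal Σ with a full covariance matrix.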

  13. Novel application of explicit dynamics occupancy models to ongoing aquatic invasions

    Science.gov (United States)

    Sepulveda, Adam J.

    2018-01-01

    Identification of suitable habitats, where invasive species can establish, is an important step towards controlling their spread. Accurate identification is difficult for new or slow invaders because unoccupied habitats may be suitable, given enough time for dispersal, while occupied habitats may prove to be unsuitable for establishment. To identify the suitable habitat of a recent invader, I used an explicit dynamics occupancy modelling framework to evaluate habitat covariates related to successful and failed establishments of American bullfrogs (Lithobates catesbeianus) within the Yellowstone River floodplain of Montana, USA from 2012-2016. During this five-year period, bullfrogs failed to establish at most sites they colonized. Bullfrog establishment was most likely to occur and least likely to fail at sites closest to human-modified ponds and lakes and those with emergent vegetation. These habitat covariates were generally associated with the presence of permanent water. Suitable habitat for bullfrog establishment is abundant in the Yellowstone River floodplain, though many sites with suitable habitat remain uncolonized. Thus, the maximum distribution of bullfrogs is much greater than their current distribution. Synthesis and applications: Focused control efforts on habitats with or proximate to permanent waters are most likely to reduce the potential for invasive bullfrog establishment and spread in the Yellowstone River. The novel application of explicit dynamics occupancy models is a useful and widely applicable tool for guiding management efforts towards those habitats where new or slow invaders are most likely to establish and persist.

  14. Explicit equilibria in a kinetic model of gambling

    Science.gov (United States)

    Bassetti, F.; Toscani, G.

    2010-06-01

    We introduce and discuss a nonlinear kinetic equation of Boltzmann type which describes the evolution of wealth in a pure gambling process, where the entire sum of the wealths of two agents is up for gambling, and randomly shared between the agents. For this equation the analytical form of the steady states is found for various realizations of the random fraction of the sum that is shared between the agents. Among others, the exponential distribution appears as the steady state in the case of a uniformly distributed random fraction, while a Gamma distribution appears for a random fraction which is Beta distributed. The case in which the gambling game is only conservative-in-the-mean is shown to lead to an explicit heavy-tailed distribution.
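The uniform-fraction case can be checked by direct Monte Carlo simulation of the binary gambling rule: pool two agents' wealth and split it by a uniform random fraction. An exponential steady state with unit mean has second moment E[X²] = 2, which the simulated population should approach; population size and iteration count here are arbitrary choices:

```python
import numpy as np

# Monte Carlo sketch of the pure-gambling kinetic model: at each interaction,
# two agents pool their wealth and split it by a uniform random fraction.
# Theory predicts an exponential steady state, for which E[X^2] = 2 * E[X]^2.
rng = np.random.default_rng(42)
N = 2000
w = np.ones(N)                        # unit mean wealth; total wealth conserved

for _ in range(150_000):
    i, j = rng.choice(N, size=2, replace=False)
    r = rng.random()                  # uniformly distributed sharing fraction
    total = w[i] + w[j]
    w[i], w[j] = r * total, (1 - r) * total

print(round(w.mean(), 3))             # → 1.0 (mean wealth exactly conserved)
m2 = (w ** 2).mean()                  # approaches 2 for an exponential law
print(m2)
```

Replacing `rng.random()` with a Beta-distributed fraction would let the same loop probe the Gamma steady state mentioned in the abstract.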

  15. Depletion benchmarks calculation of random media using explicit modeling approach of RMC

    International Nuclear Information System (INIS)

    Liu, Shichang; She, Ding; Liang, Jin-gang; Wang, Kan

    2016-01-01

    Highlights: • Explicit modeling of RMC is applied to depletion benchmark for HTGR fuel element. • Explicit modeling can provide detailed burnup distribution and burnup heterogeneity. • The results would serve as a supplement for the HTGR fuel depletion benchmark. • The method of adjacent burnup regions combination is proposed for full-core problems. • The combination method can reduce memory footprint, keeping the computing accuracy. - Abstract: Monte Carlo method plays an important role in accurate simulation of random media, owing to its advantages of the flexible geometry modeling and the use of continuous-energy nuclear cross sections. Three stochastic geometry modeling methods including Random Lattice Method, Chord Length Sampling and explicit modeling approach with mesh acceleration technique, have been implemented in RMC to simulate the particle transport in the dispersed fuels, in which the explicit modeling method is regarded as the best choice. In this paper, the explicit modeling method is applied to the depletion benchmark for HTGR fuel element, and the method of combination of adjacent burnup regions has been proposed and investigated. The results show that the explicit modeling can provide detailed burnup distribution of individual TRISO particles, and this work would serve as a supplement for the HTGR fuel depletion benchmark calculations. The combination of adjacent burnup regions can effectively reduce the memory footprint while keeping the computational accuracy.

  16. Cholera in the Lake Kivu region (DRC): Integrating remote sensing and spatially explicit epidemiological modeling

    Science.gov (United States)

    Finger, Flavio; Knox, Allyn; Bertuzzo, Enrico; Mari, Lorenzo; Bompangue, Didier; Gatto, Marino; Rodriguez-Iturbe, Ignacio; Rinaldo, Andrea

    2014-07-01

    Mathematical models of cholera dynamics can not only help in identifying environmental drivers and processes that influence disease transmission, but may also represent valuable tools for the prediction of the epidemiological patterns in time and space as well as for the allocation of health care resources. Cholera outbreaks have been reported in the Democratic Republic of the Congo since the 1970s. They have been ravaging the shore of Lake Kivu in the east of the country repeatedly during the last decades. Here we employ a spatially explicit, inhomogeneous Markov chain model to describe cholera incidence in eight health zones on the shore of the lake. Remotely sensed data sets of chlorophyll a concentration in the lake, precipitation and indices of global climate anomalies are used as environmental drivers in addition to baseline seasonality. The effect of human mobility is also modelled mechanistically. We test several models on a multiyear data set of reported cholera cases. The best fourteen models, accounting for different environmental drivers, and selected using the Akaike information criterion, are formally compared via proper cross validation. Among these, the one accounting for seasonality, El Niño Southern Oscillation, precipitation and human mobility outperforms the others in cross validation. Some drivers (such as human mobility and rainfall) are retained only by a few models, possibly indicating that the mechanisms through which they influence cholera dynamics in the area will have to be investigated further.
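The Akaike information criterion step described above amounts to ranking candidate models by AIC = 2k - 2 ln(L), where k is the number of parameters and L the maximized likelihood. The sketch below shows the bookkeeping; the log-likelihoods, parameter counts, and model names are hypothetical, not the study's fitted values:

```python
import numpy as np

# Rank candidate epidemiological models by AIC = 2k - 2*ln(L).
# (loglik, n_parameters) pairs below are illustrative only.
candidates = {
    "seasonality only":                      (-512.4, 3),
    "seasonality + ENSO":                    (-498.1, 4),
    "seasonality + ENSO + rain + mobility":  (-489.7, 6),
}

def aic(loglik, k):
    """Akaike information criterion; lower is better."""
    return 2 * k - 2 * loglik

ranked = sorted(candidates.items(), key=lambda kv: aic(*kv[1]))
best = ranked[0][0]
print(best)    # → seasonality + ENSO + rain + mobility
```

AIC penalizes the extra parameters of the richer model, so the richer model wins only when its likelihood gain outweighs the 2-per-parameter penalty; the abstract's subsequent cross-validation is the stronger, out-of-sample check on the same shortlist.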

  17. Explicit Modeling of Solid Ocean Floor in Shallow Underwater Explosions

    Directory of Open Access Journals (Sweden)

    A.P. Walters

    2013-01-01

    Full Text Available Current practices for modeling the ocean floor in underwater explosion simulations call for application of an inviscid fluid with soil properties. A method for modeling the ocean floor as a Lagrangian solid, vice an Eulerian fluid, was developed in order to determine its effects on underwater explosions in shallow water using the DYSMAS solver. The Lagrangian solid bottom model utilized transmitting boundary segments, exterior nodal forces acting as constraints, and the application of prestress to minimize any distortions into the fluid domain. For simplicity, elastic materials were used in this current effort, though multiple constitutive soil models can be applied to improve the overall accuracy of the model. Even though this method is unable to account for soil cratering effects, it does however provide the distinct advantage of modeling contoured ocean floors such as dredged channels and sloped bottoms absent in Eulerian formulations. The study conducted here showed significant differences among the initial bottom reflections for the different solid bottom contours that were modeled. The most important bottom contour effect was the distortion to the gas bubble and its associated first pulse timing. In addition to its utility in bottom modeling, implementation of the non-reflecting boundary along with realistic material models can be used to drastically reduce the size of current fluid domains.

  18. Making decision process knowledge explicit using the product data model

    NARCIS (Netherlands)

    Petrusel, R.; Vanderfeesten, I.T.P.; Dolean, Cristina; Mican, D.

    2011-01-01

    In this paper, we present a new knowledge acquisition and formalization method: the decision mining approach. Basically, we aim to produce a model of the workflow of mental actions performed by decision makers during the decision process. We show that through the use of a Product Data Model (PDM) we

  19. Quantum decay model with exact explicit analytical solution

    Science.gov (United States)

    Marchewka, Avi; Granot, Er'El

    2009-01-01

    A simple decay model is introduced. The model comprises a point potential well, which experiences an abrupt change. Due to the temporal variation, the initial quantum state can either escape from the well or stay localized as a new bound state. The model allows for an exact analytical solution while having the necessary features of a decay process. The results show that the decay is never exponential, as classical dynamics predicts. Moreover, at short times the decay has a fractional power law, which differs from perturbation quantum method predictions. At long times the decay includes oscillations with an envelope that decays algebraically. This is a model where the final state can be either continuous or localized, and that has an exact analytical solution.

  20. Are mixed explicit/implicit solvation models reliable for studying phosphate hydrolysis? A comparative study of continuum, explicit and mixed solvation models.

    Energy Technology Data Exchange (ETDEWEB)

    Kamerlin, Shina C. L.; Haranczyk, Maciej; Warshel, Arieh

    2009-05-01

    Phosphate hydrolysis is ubiquitous in biology. However, despite intensive research on this class of reactions, the precise nature of the reaction mechanism remains controversial. In this work, we have examined the hydrolysis of three homologous phosphate diesters. The solvation free energy was simulated by means of either an implicit solvation model (COSMO), hybrid quantum mechanical/molecular mechanical free energy perturbation (QM/MM-FEP) or a mixed solvation model in which N water molecules were explicitly included in the ab initio description of the reacting system (where N = 1-3), with the remainder of the solvent being implicitly modelled as a continuum. Here, both COSMO and QM/MM-FEP reproduce ΔGobs within an error of about 2 kcal/mol. However, we demonstrate that in order to obtain any form of reliable results from a mixed model, it is essential to carefully select the explicit water molecules from short QM/MM runs that act as a model for the true infinite system. Additionally, the mixed models tend to become increasingly inaccurate the more explicit water molecules are placed into the system. Thus, our analysis indicates that this approach provides an unreliable way of modelling phosphate hydrolysis in solution.

  1. Spatially explicit models, generalized reproduction numbers and the prediction of patterns of waterborne disease

    Science.gov (United States)

    Rinaldo, A.; Gatto, M.; Mari, L.; Casagrandi, R.; Righetto, L.; Bertuzzo, E.; Rodriguez-Iturbe, I.

    2012-12-01

    still lacking. Here, we show that the requirement that all the local reproduction numbers R0 be larger than unity is neither necessary nor sufficient for outbreaks to occur when local settlements are connected by networks of primary and secondary infection mechanisms. To determine onset conditions, we derive general analytical expressions for a reproduction matrix G0 explicitly accounting for spatial distributions of human settlements and pathogen transmission via hydrological and human mobility networks. At disease onset, a generalized reproduction number Λ0 (the dominant eigenvalue of G0) must be larger than unity. We also show that geographical outbreak patterns in complex environments are linked to the dominant eigenvector and to spectral properties of G0. Tests against data and computations for the 2010 Haiti and 2000 KwaZulu-Natal cholera outbreaks, as well as against computations for metapopulation networks, demonstrate that eigenvectors of G0 provide a synthetic and effective tool for predicting the disease course in space and time. Networked connectivity models, describing the interplay between hydrology, epidemiology and social behavior sustaining human mobility, thus prove to be key tools for emergency management of waterborne infections.
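The onset criterion can be illustrated numerically: even when every local reproduction number R0 is below one, spatial coupling can push the dominant eigenvalue Λ0 of the reproduction matrix G0 above one, and the dominant eigenvector then ranks the nodes' involvement in the outbreak. The 3-node G0 below is hypothetical (local R0 values on the diagonal, symmetric mobility/hydrological coupling off-diagonal), not a matrix from the cited case studies:

```python
import numpy as np

# Sketch of the spatial onset criterion: outbreak iff Lambda_0(G0) > 1.
R0_local = np.array([0.8, 0.9, 0.7])        # every local R0 < 1
coupling = 0.25                             # hypothetical inter-node coupling
G0 = np.diag(R0_local) + coupling * (np.ones((3, 3)) - np.eye(3))

eigvals, eigvecs = np.linalg.eig(G0)
k = np.argmax(eigvals.real)
Lambda0 = eigvals.real[k]                   # generalized reproduction number
print(Lambda0 > 1)                          # → True: outbreak despite all R0 < 1

dominant = np.abs(eigvecs.real[:, k])
dominant /= dominant.sum()                  # normalized dominant eigenvector:
print(dominant)                             # relative involvement of each node
```

This is exactly the sense in which "all local R0 > 1" is not necessary for an outbreak: the coupling terms let sub-critical settlements sustain each other, and the eigenvector identifies where the epidemic concentrates.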

  2. SOMPROF: A vertically explicit soil organic matter model

    NARCIS (Netherlands)

    Braakhekke, M.C.; Beer, M.; Hoosbeek, M.R.; Kruijt, B.; Kabat, P.

    2011-01-01

    Most current soil organic matter (SOM) models represent the soil as a bulk without specification of the vertical distribution of SOM in the soil profile. However, the vertical SOM profile may be of great importance for soil carbon cycling, both on short (hours to years) time scale, due to

  3. Explicit versus Implicit Solvent Modeling of Raman Optical Activity Spectra

    Czech Academy of Sciences Publication Activity Database

    Hopmann, K. H.; Ruud, K.; Pecul, M.; Kudelski, A.; Dračínský, Martin; Bouř, Petr

    2011-01-01

    Vol. 115, No. 14 (2011), pp. 4128-4137 ISSN 1520-6106 R&D Projects: GA MŠk(CZ) LH11033; GA ČR GAP208/11/0105 Grant - others: AV ČR(CZ) M200550902 Institutional research plan: CEZ:AV0Z40550506 Keywords: Raman optical activity * lactamide * solvent models Subject RIV: CF - Physical; Theoretical Chemistry Impact factor: 3.696, year: 2011

  4. Explicit prediction of ice clouds in general circulation models

    Science.gov (United States)

    Kohler, Martin

    1999-11-01

    Although clouds play extremely important roles in the radiation budget and hydrological cycle of the Earth, there are large quantitative uncertainties in our understanding of their generation, maintenance and decay mechanisms, representing major obstacles in the development of reliable prognostic cloud water schemes for General Circulation Models (GCMs). Recognizing their relative neglect in the past, both observationally and theoretically, this work places special focus on ice clouds. A recent version of the UCLA - University of Utah Cloud Resolving Model (CRM) that includes interactive radiation is used to perform idealized experiments to study ice cloud maintenance and decay mechanisms under various conditions in terms of: (1) background static stability, (2) background relative humidity, (3) rate of cloud ice addition over a fixed initial time-period and (4) radiation: daytime, nighttime and no-radiation. Radiation is found to have major effects on the lifetime of layer clouds. Optically thick ice clouds decay significantly more slowly than expected from pure microphysical crystal fall-out (τcld = 0.9-1.4 h as opposed to the no-motion τmicro = 0.5-0.7 h). This is explained by the upward turbulent fluxes of water induced by IR destabilization, which partially balance the downward transport of water by snowfall. Solar radiation further slows the ice-water decay by destruction of the inversion above cloud top and the resulting upward transport of water. Optically thin ice clouds, on the other hand, may exhibit even longer lifetimes (>1 day) in the presence of radiative cooling. The resulting reduction in the saturation mixing ratio provides a constant cloud ice source. These CRM results are used to develop a prognostic cloud water scheme for the UCLA-GCM. The framework is based on the bulk water phase model of Ose (1993). The model predicts cloud liquid water and cloud ice separately, and is extended to split the ice phase into suspended cloud ice (predicted

  5. Model of high-tech businesses management under the trends of explicit and implicit knowledge markets: classification and business model

    OpenAIRE

    Guzel Isayevna Gumerova; Elmira Shamilevna Shaimieva

    2015-01-01

    Objective: to define the notion of "high-tech business"; to elaborate a classification of high-tech businesses; to elaborate a business model for high-tech business management. Methods: general scientific methods of theoretical and empirical cognition. Results: the research presents a business model for high-tech business management based on the trends of the explicit and implicit knowledge markets, with the implicit knowledge market dominating; a classification of high-tech business...

  6. A spatially explicit scenario-driven model of adaptive capacity to global change in Europe

    NARCIS (Netherlands)

    Acosta, L.; Klein, R.J.T.; Reidsma, P.; Metzger, M.J.; Rounsevell, M.D.A.; Leemans, R.

    2013-01-01

    Traditional impact models combine exposure in the form of scenarios and sensitivity in the form of parameters, providing potential impacts of global change as model outputs. However, adaptive capacity is rarely addressed in these models. This paper presents the first spatially explicit

  7. Assessing Bioenergy Harvest Risks: Geospatially Explicit Tools for Maintaining Soil Productivity in Western US Forests

    Directory of Open Access Journals (Sweden)

    Deborah Page-Dumroese

    2011-09-01

    Full Text Available Biomass harvesting for energy production and forest health can impact the soil resource by altering inherent chemical, physical and biological properties. These impacts raise concern about damaging sensitive forest soils, even with the prospect of maintaining vigorous forest growth through biomass harvesting operations. Current forest biomass harvesting research concurs that harvest impacts to the soil resource are region- and site-specific, although generalized knowledge from decades of research can be incorporated into management activities. Based upon the most current forest harvesting research, we compiled information on harvest activities that decrease, maintain or increase soil-site productivity. We then developed a soil chemical and physical property risk assessment within a geographic information system for a timber producing region within the Northern Rocky Mountain ecoregion. Digital soil and geology databases were used to construct geospatially explicit best management practices to maintain or enhance soil-site productivity. The proposed risk assessments could aid in identifying resilient soils for forest land managers considering biomass operations, policy makers contemplating expansion of biomass harvesting and investors deliberating where to locate bioenergy conversion facilities.

  8. Assessing Sustainability of Coral Reef Ecosystem Services using a Spatially-Explicit Decision Support Tool

    Science.gov (United States)

    Forecasting and communicating the potential outcomes of decision options requires support tools that aid in evaluating alternative scenarios in a user-friendly context and that highlight variables relevant to the decision options and valuable stakeholders. Envision is a GIS-base...

  9. Making dilemmas explicit through the use of a cognitive mapping collaboration tool

    NARCIS (Netherlands)

    Matos Castano, Julieta; van Amstel, Frederick; Hartmann, Timo; Dewulf, Geert

    2017-01-01

    Dilemmas are pervasive in decision making. Although they offer the potential of reflecting on issues at stake from different perspectives, dilemmas often lead to paralysis for those encountering them. This study presents a three dimensional collaboration tool specifically developed to surface

  10. DEFINING RECOVERY GOALS AND STRATEGIES FOR ENDANGERED SPECIES USING SPATIALLY-EXPLICIT POPULATION MODELS

    Science.gov (United States)

    We used a spatially explicit population model of wolves (Canis lupus) to propose a framework for defining rangewide recovery priorities and finer-scale strategies for regional reintroductions. The model predicts that Yellowstone and central Idaho, where wolves have recently been ...

  11. Flood vulnerability assessment of residential buildings by explicit damage process modelling

    DEFF Research Database (Denmark)

    Custer, Rocco; Nishijima, Kazuyoshi

    2015-01-01

    The present paper introduces a vulnerability modelling approach for residential buildings in flood. The modelling approach explicitly considers relevant damage processes, i.e. water infiltration into the building, mechanical failure of components in the building envelope and damage from water...

  12. An Explicit Formula for Symmetric Polynomials Related to the Eigenfunctions of Calogero-Sutherland Models

    Directory of Open Access Journals (Sweden)

    Martin Hallnäs

    2007-03-01

    Full Text Available We review a recent construction of an explicit analytic series representation for symmetric polynomials which, up to a ground-state factor, are eigenfunctions of Calogero-Sutherland type models. We also indicate a generalisation of this result to polynomials which give the eigenfunctions of so-called 'deformed' Calogero-Sutherland type models.

  13. Comparison of explicit and effective models for calculating ionic populations in argon plasmas

    International Nuclear Information System (INIS)

    Abdallah, J. Jr.; Clark, R.E.H.

    1994-01-01

    Calculations have been performed to model the state populations of argon plasmas at electron densities at and above those required for the validity of coronal equilibrium. Both effective and explicit models have been used, and both are based on the same set of atomic cross sections. The effective model includes ground and singly excited states explicitly, while the effect of autoionizing states is accounted for by branching factors which describe their depopulation into the various non-autoionizing states. The explicit model considers both autoionizing and non-autoionizing states explicitly. The effective model requires a significantly reduced amount of computer time and memory. Good agreement between the two models can be obtained up to moderate densities if the branching factors include electron-density-dependent terms which describe the collisional stabilization of each autoionizing state. The effective model breaks down as density is increased because the populations of individual autoionizing states become significant. Results for both ionization balance and radiated power loss are presented. (Author)

  14. High Performance Programming Using Explicit Shared Memory Model on Cray T3D

    Science.gov (United States)

    Simon, Horst D.; Saini, Subhash; Grassi, Charles

    1994-01-01

    The Cray T3D system is the first-phase system in Cray Research, Inc.'s (CRI) three-phase massively parallel processing (MPP) program. This system features a heterogeneous architecture that closely couples DEC's Alpha microprocessors and CRI's parallel-vector technology, i.e., the Cray Y-MP and Cray C90. An overview of the Cray T3D hardware and available programming models is presented. Under the Cray Research adaptive Fortran (CRAFT) model, four programming methods (data parallel, work sharing, message passing using PVM, and explicit shared memory) are available to users. However, at this time the data parallel and work sharing programming models are not available to the user community. The differences between standard PVM and CRI's PVM are highlighted with performance measurements such as latencies and communication bandwidths. We have found that neither standard PVM nor CRI's PVM exploits the hardware capabilities of the T3D. The reasons for the poor performance of PVM as a native message-passing library are presented and illustrated by the performance of the NAS Parallel Benchmarks (NPB) programmed in the explicit shared memory model on the Cray T3D. In general, the performance of standard PVM is about 4 to 5 times lower than that obtained with the explicit shared memory model. A similar degradation is seen on the CM-5, where applications using the native message-passing library CMMD also perform about 4 to 5 times worse than those using data parallel methods. The issues involved in programming in the explicit shared memory model (such as barriers, synchronization, invalidating the data cache, aligning the data cache, etc.) are discussed. Comparative performance of the NPB using the explicit shared memory programming model on the Cray T3D and other highly parallel systems such as the TMC CM-5, Intel Paragon, Cray C90, IBM-SP1, etc. is presented.

  15. An Efficient Explicit-time Description Method for Timed Model Checking

    Directory of Open Access Journals (Sweden)

    Hao Wang

    2009-12-01

    Full Text Available Timed model checking, the method to formally verify real-time systems, is attracting increasing attention from both the model checking community and the real-time community. Explicit-time description methods verify real-time systems using general model constructs found in standard un-timed model checkers. Lamport proposed an explicit-time description method using a clock-ticking process (Tick to simulate the passage of time together with a group of global variables to model time requirements. Two methods, the Sync-based Explicit-time Description Method using rendezvous synchronization steps and the Semaphore-based Explicit-time Description Method using only one global variable were proposed; they both achieve better modularity than Lamport's method in modeling the real-time systems. In contrast to timed automata based model checkers like UPPAAL, explicit-time description methods can access and store the current time instant for future calculations necessary for many real-time systems, especially those with pre-emptive scheduling. However, the Tick process in the above three methods increments the time by one unit in each tick; the state spaces therefore grow relatively fast as the time parameters increase, a problem when the system's time period is relatively long. In this paper, we propose a more efficient method which enables the Tick process to leap multiple time units in one tick. Preliminary experimental results in a high performance computing environment show that this new method significantly reduces the state space and improves both the time and memory efficiency.
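The leaping-Tick idea can be sketched in a few lines (a simulation analogy with invented names, not the authors' model-checker encoding): rather than incrementing time by one unit per step, the Tick process jumps straight to the next pending deadline, so far fewer Tick states are explored while every time instant that matters is still visited.

```python
# Contrast unit-tick semantics with a "leaping" Tick that jumps to the next
# timer deadline. timers holds the expiry instants the system cares about.

def run_ticks(timers, horizon, leap=True):
    """Return the number of Tick steps needed to reach `horizon` while
    still visiting every timer expiry instant up to the horizon."""
    now, steps = 0, 0
    pending = sorted(t for t in timers if t <= horizon)
    while now < horizon:
        if leap:
            future = [t for t in pending if t > now]
            now = min(future) if future else horizon  # leap to next deadline
        else:
            now += 1                                  # one time unit per tick
        steps += 1
    return steps
```

For timers at 3, 250, and 900 with a horizon of 1000, the leaping version needs 4 Tick steps where unit ticking needs 1000, which is exactly the state-space saving the abstract describes for systems with long time periods.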

  16. Scaling-up spatially-explicit ecological models using graphics processors

    NARCIS (Netherlands)

    Koppel, Johan van de; Gupta, Rohit; Vuik, Cornelis

    2011-01-01

    How the properties of ecosystems relate to spatial scale is a prominent topic in current ecosystem research. Despite this, spatially explicit models typically include only a limited range of spatial scales, mostly because of computing limitations. Here, we describe the use of graphics processors to

  17. Dynamic optimization and robust explicit model predictive control of hydrogen storage tank

    KAUST Repository

    Panos, C.

    2010-09-01

    We present a general framework for the optimal design and control of a metal-hydride bed under hydrogen desorption operation. The framework features: (i) a detailed two-dimensional dynamic process model, (ii) a design and operational dynamic optimization step, and (iii) an explicit/multi-parametric model predictive controller design step. For the controller design, a reduced order approximate model is obtained, based on which nominal and robust multi-parametric controllers are designed. © 2010 Elsevier Ltd.

  18. Dynamic optimization and robust explicit model predictive control of hydrogen storage tank

    KAUST Repository

    Panos, C.; Kouramas, K.I.; Georgiadis, M.C.; Pistikopoulos, E.N.

    2010-01-01

    We present a general framework for the optimal design and control of a metal-hydride bed under hydrogen desorption operation. The framework features: (i) a detailed two-dimensional dynamic process model, (ii) a design and operational dynamic optimization step, and (iii) an explicit/multi-parametric model predictive controller design step. For the controller design, a reduced order approximate model is obtained, based on which nominal and robust multi-parametric controllers are designed. © 2010 Elsevier Ltd.

  19. Dynamic modeling and explicit/multi-parametric MPC control of pressure swing adsorption systems

    KAUST Repository

    Khajuria, Harish

    2011-01-01

    Pressure swing adsorption (PSA) is a flexible, albeit complex gas separation system. Due to its inherent nonlinear nature and discontinuous operation, the design of a model based PSA controller, especially with varying operating conditions, is a challenging task. This work focuses on the design of an explicit/multi-parametric model predictive controller for a PSA system. Based on a system involving four adsorbent beds separating 70% H2, 30% CH4 mixture into high purity hydrogen, the key controller objective is to fast track H2 purity to a set point value of 99.99%. To perform this task, a rigorous and systematic framework is employed. First, a high fidelity detailed dynamic model is built to represent the system's real operation, and understand its dynamic behavior. The model is then used to derive appropriate linear models by applying suitable system identification techniques. For the reduced models, a model predictive control (MPC) step is formulated, where latest developments in multi-parametric programming and control are applied to derive a novel explicit MPC controller. To test the performance of the designed controller, closed loop simulations are performed where the dynamic model is used as the virtual plant. Comparison studies of the derived explicit MPC controller are also performed with conventional PID controllers. © 2010 Elsevier Ltd. All rights reserved.
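At run time, an explicit/multi-parametric MPC controller reduces to a lookup over precomputed critical regions, each carrying an affine law u = Kx + c. A minimal one-dimensional sketch of that online step (regions and gains invented for illustration, not the PSA controller of this record):

```python
# Explicit MPC online evaluation: the expensive optimization was solved
# offline over the parameter (state) space, leaving piecewise-affine laws.
# Regions and gains below are invented for a scalar state x.

REGIONS = [
    # (lower, upper, K, c): law u = K*x + c valid for lower <= x < upper
    (-1.0, 0.0, -2.0,  0.0),
    ( 0.0, 0.5, -1.0,  0.0),
    ( 0.5, 1.0,  0.0, -0.5),
]

def explicit_mpc(x):
    """Return the control input by locating the critical region holding x."""
    for lo, hi, K, c in REGIONS:
        if lo <= x < hi:
            return K * x + c
    raise ValueError("state outside the explored parameter space")
```

The design choice this illustrates is the trade-off the abstract relies on: all optimization effort moves offline, so the controller itself is cheap and predictable enough for fast closed-loop use.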

  20. Spatially explicit modeling of conflict zones between wildlife and snow sports: prioritizing areas for winter refuges.

    Science.gov (United States)

    Braunisch, Veronika; Patthey, Patrick; Arlettaz, Raphaël

    2011-04-01

    Outdoor winter recreation exerts an increasing pressure upon mountain ecosystems, with unpredictable, free-ranging activities (e.g., ski mountaineering, snowboarding, and snowshoeing) representing a major source of stress for wildlife. Mitigating anthropogenic disturbance requires the spatially explicit prediction of the interference between the activities of humans and wildlife. We applied spatial modeling to localize conflict zones between wintering Black Grouse (Tetrao tetrix), a declining species of Alpine timberline ecosystems, and two free-ranging winter sports (off-piste skiing [including snow-boarding] and snowshoeing). Track data (snow-sports and birds' traces) obtained from aerial photographs taken over a 585-km transect running along the timberline, implemented within a maximum entropy model, were used to predict the occurrence of snow sports and Black Grouse as a function of landscape characteristics. By modeling Black Grouse presence in the theoretical absence of free-ranging activities and ski infrastructure, we first estimated the amount of habitat reduction caused by these two factors. The models were then extrapolated to the altitudinal range occupied by Black Grouse, while the spatial extent and intensity of potential conflict were assessed by calculating the probability of human-wildlife co-occurrence. The two snow-sports showed different distribution patterns. Skiers' occurrence was mainly determined by ski-lift presence and a smooth terrain, while snowshoers' occurrence was linked to hiking or skiing routes and moderate slopes. Wintering Black Grouse avoided ski lifts and areas frequented by free-ranging snow sports. According to the models, Black Grouse have faced a substantial reduction of suitable wintering habitat along the timberline transect: 12% due to ski infrastructure and another 16% when adding free-ranging activities. 
Extrapolating the models over the whole study area results in an overall habitat loss due to ski infrastructure of

  1. Predicting continental-scale patterns of bird species richness with spatially explicit models

    DEFF Research Database (Denmark)

    Rahbek, Carsten; Gotelli, Nicholas J; Colwell, Robert K

    2007-01-01

    The causes of global variation in species richness have been debated for nearly two centuries with no clear resolution in sight. Competing hypotheses have typically been evaluated with correlative models that do not explicitly incorporate the mechanisms responsible for biotic diversity gradients. Here, we employ a fundamentally different approach that uses spatially explicit Monte Carlo models of the placement of cohesive geographical ranges in an environmentally heterogeneous landscape. These models predict species richness of endemic South American birds (2248 species) measured... the extraordinary diversity of avian species in the montane tropics, the most species-rich region on Earth. Our findings imply that correlative climatic models substantially underestimate the importance of historical factors and small-scale niche-driven assembly processes in shaping contemporary species-richness...

  2. Fire Propagation Tracing Model in the Explicit Treatment of Events of Fire PSA

    International Nuclear Information System (INIS)

    Lim, Ho Gon; Han, Sang Hoon; Yang, Jun Eon

    2010-01-01

    The fire propagation model in a fire PSA has not previously been treated analytically; instead, a simplified analyst's intuition was used to define the fire propagation paths. A fire propagation equation is developed to trace all the propagation paths in the fire area, in which a zone is defined to identify the various fire ignition sources. An initiation of fire is assumed to take place in a zone. Then, the propagation is modeled with a Boolean equation. Since explicit fire PSA modeling requires an exclusive event set to sum up the..., exclusive event sets are derived from the fire propagation equation. As an example, we show the exclusive set for a 2x3 rectangular fire zone. The applicability of the developed fire propagation equation as the number of zones increases is also discussed, including the limitations of the explicit fire PSA modeling method
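The exclusive-set construction this record refers to can be sketched for independent zone events: non-exclusive path events P1, P2, ... are replaced by the mutually exclusive events P1, ¬P1∧P2, ¬P1∧¬P2∧P3, ..., whose probabilities may simply be summed. A brute-force sketch (zone labels and probabilities invented; real fire PSA models would not enumerate states like this):

```python
# Sum the probabilities of the mutually exclusive events
# E_i = P_i and not(P_1 .. P_{i-1}), where each path P_i is a set of zones
# that must all burn, and zones ignite independently with probability p[z].

from itertools import product

def union_prob_exclusive(paths, p):
    zones = sorted(set().union(*paths))
    total = 0.0
    for i, path in enumerate(paths):
        for outcome in product([False, True], repeat=len(zones)):
            state = dict(zip(zones, outcome))
            hit_i = all(state[z] for z in path)
            hit_earlier = any(all(state[z] for z in q) for q in paths[:i])
            if hit_i and not hit_earlier:        # exclusive event E_i occurred
                w = 1.0
                for z in zones:
                    w *= p[z] if state[z] else 1.0 - p[z]
                total += w
    return total
```

Because the E_i are disjoint, their probabilities add directly to P(P1 or P2 or ...), which is the property the explicit PSA model needs; for two independent single-zone paths with probabilities 0.1 and 0.2 the sum is 1 - 0.9·0.8 = 0.28.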

  3. Testing the cognitive catalyst model of rumination with explicit and implicit cognitive content.

    Science.gov (United States)

    Sova, Christopher C; Roberts, John E

    2018-06-01

    The cognitive catalyst model posits that rumination and negative cognitive content, such as negative schema, interact to predict depressive affect. Past research has found support for this model using explicit measures of negative cognitive content such as self-report measures of trait self-esteem and dysfunctional attitudes. The present study tested whether these findings would extend to implicit measures of negative cognitive content such as implicit self-esteem, and whether effects would depend on initial mood state and history of depression. Sixty-one undergraduate students selected on the basis of depression history (27 previously depressed; 34 never depressed) completed explicit and implicit measures of negative cognitive content prior to random assignment to a rumination induction followed by a distraction induction or vice versa. Dysphoric affect was measured both before and after these inductions. Analyses revealed that explicit measures, but not implicit measures, interacted with rumination to predict change in dysphoric affect, and these interactions were further moderated by baseline levels of dysphoria. Limitations include the small nonclinical sample and use of a self-report measure of depression history. These findings suggest that rumination amplifies the association between explicit negative cognitive content and depressive affect primarily among people who are already experiencing sad mood. Copyright © 2018 Elsevier Ltd. All rights reserved.

  4. Implicit-explicit (IMEX) Runge-Kutta methods for non-hydrostatic atmospheric models

    Science.gov (United States)

    Gardner, David J.; Guerra, Jorge E.; Hamon, François P.; Reynolds, Daniel R.; Ullrich, Paul A.; Woodward, Carol S.

    2018-04-01

    The efficient simulation of non-hydrostatic atmospheric dynamics requires time integration methods capable of overcoming the explicit stability constraints on time step size arising from acoustic waves. In this work, we investigate various implicit-explicit (IMEX) additive Runge-Kutta (ARK) methods for evolving acoustic waves implicitly to enable larger time step sizes in a global non-hydrostatic atmospheric model. The IMEX formulations considered include horizontally explicit - vertically implicit (HEVI) approaches as well as splittings that treat some horizontal dynamics implicitly. In each case, the impact of solving nonlinear systems in each implicit ARK stage in a linearly implicit fashion is also explored. The accuracy and efficiency of the IMEX splittings, ARK methods, and solver options are evaluated on a gravity wave and baroclinic wave test case. HEVI splittings that treat some vertical dynamics explicitly do not show a benefit in solution quality or run time over the most implicit HEVI formulation. While splittings that implicitly evolve some horizontal dynamics increase the maximum stable step size of a method, the gains are insufficient to overcome the additional cost of solving a globally coupled system. Solving implicit stage systems in a linearly implicit manner limits the solver cost but this is offset by a reduction in step size to achieve the desired accuracy for some methods. Overall, the third-order ARS343 and ARK324 methods performed the best, followed by the second-order ARS232 and ARK232 methods.

  5. Implicit–explicit (IMEX) Runge–Kutta methods for non-hydrostatic atmospheric models

    Directory of Open Access Journals (Sweden)

    D. J. Gardner

    2018-04-01

    Full Text Available The efficient simulation of non-hydrostatic atmospheric dynamics requires time integration methods capable of overcoming the explicit stability constraints on time step size arising from acoustic waves. In this work, we investigate various implicit–explicit (IMEX) additive Runge–Kutta (ARK) methods for evolving acoustic waves implicitly to enable larger time step sizes in a global non-hydrostatic atmospheric model. The IMEX formulations considered include horizontally explicit – vertically implicit (HEVI) approaches as well as splittings that treat some horizontal dynamics implicitly. In each case, the impact of solving nonlinear systems in each implicit ARK stage in a linearly implicit fashion is also explored. The accuracy and efficiency of the IMEX splittings, ARK methods, and solver options are evaluated on a gravity wave and baroclinic wave test case. HEVI splittings that treat some vertical dynamics explicitly do not show a benefit in solution quality or run time over the most implicit HEVI formulation. While splittings that implicitly evolve some horizontal dynamics increase the maximum stable step size of a method, the gains are insufficient to overcome the additional cost of solving a globally coupled system. Solving implicit stage systems in a linearly implicit manner limits the solver cost but this is offset by a reduction in step size to achieve the desired accuracy for some methods. Overall, the third-order ARS343 and ARK324 methods performed the best, followed by the second-order ARS232 and ARK232 methods.

  6. Explicit calculation of indirect global warming potentials for halons using atmospheric models

    Directory of Open Access Journals (Sweden)

    D. J. Wuebbles

    2009-11-01

    Full Text Available The concept of Global Warming Potentials (GWPs) has been extensively used in policy consideration as a relative index for comparing the climate impact of an emitted greenhouse gas (GHG) relative to carbon dioxide with equal mass emissions. Ozone depletion due to emission of chlorinated or brominated halocarbons leads to cooling of the climate system in the opposite direction to the direct warming contribution by halocarbons as GHGs. This cooling is a key indirect effect of the halocarbons on climatic radiative forcing, which is accounted for by indirect GWPs. With respect to climate, it is critical to understand net influences considering direct warming and indirect cooling effects, especially for Halons, due to the greater ozone-depleting efficiency of bromine over chlorine. Until now, the indirect GWPs have been calculated using a parameterized approach based on the concept of Equivalent Effective Stratospheric Chlorine (EESC) and the observed ozone depletion over the last few decades. As a step towards obtaining indirect GWPs through a more robust approach, we use atmospheric models to explicitly calculate the indirect GWPs of Halon-1211 and Halon-1301 for a 100-year time horizon. State-of-the-art global chemistry-transport models (CTMs) were used as the computational tools to derive more realistic ozone depletion changes caused by an added pulse emission of the two major Halons at the surface. The radiative forcings on climate from the ozone changes have been calculated for indirect GWPs using an atmospheric radiative transfer model (RTM). The simulated temporal variations of global average total column Halons after a pulse perturbation follow an exponential decay with an e-folding time which is consistent with the expected chemical lifetimes of the Halons. Our calculated indirect GWPs for the two Halons are much smaller than those from past studies but are within a single standard deviation of WMO (2007) values and the direct GWP values derived

  7. Explicit Modeling of Ancestry Improves Polygenic Risk Scores and BLUP Prediction.

    Science.gov (United States)

    Chen, Chia-Yen; Han, Jiali; Hunter, David J; Kraft, Peter; Price, Alkes L

    2015-09-01

    Polygenic prediction using genome-wide SNPs can provide high prediction accuracy for complex traits. Here, we investigate the question of how to account for genetic ancestry when conducting polygenic prediction. We show that the accuracy of polygenic prediction in structured populations may be partly due to genetic ancestry. However, we hypothesized that explicitly modeling ancestry could improve polygenic prediction accuracy. We analyzed three GWAS of hair color (HC), tanning ability (TA), and basal cell carcinoma (BCC) in European Americans (sample size from 7,440 to 9,822) and considered two widely used polygenic prediction approaches: polygenic risk scores (PRSs) and best linear unbiased prediction (BLUP). We compared polygenic prediction without correction for ancestry to polygenic prediction with ancestry as a separate component in the model. In 10-fold cross-validation using the PRS approach, the R(2) for HC increased by 66% (0.0456-0.0755; P ancestry, which prevents ancestry effects from entering into each SNP effect and being overweighted. Surprisingly, explicitly modeling ancestry produces a similar improvement when using the BLUP approach, which fits all SNPs simultaneously in a single variance component and causes ancestry to be underweighted. We validate our findings via simulations, which show that the differences in prediction accuracy will increase in magnitude as sample sizes increase. In summary, our results show that explicitly modeling ancestry can be important in both PRS and BLUP prediction. © 2015 WILEY PERIODICALS, INC.
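The core claim here, that fitting ancestry as its own component can raise prediction accuracy, can be reproduced on synthetic data (all effect sizes and the single ancestry axis below are invented for illustration, not the paper's GWAS data): a phenotype depends on both a polygenic score and an ancestry axis, and adding the ancestry component to the regression raises in-sample R².

```python
# Compare R^2 of phenotype ~ PRS versus phenotype ~ PRS + ancestry PC on
# synthetic data. Pure-Python OLS on centered variables via Cramer's rule.

import random

def center(v):
    m = sum(v) / len(v)
    return [x - m for x in v]

def r2_fit(y, xs):
    """In-sample R^2 of OLS of centered y on 1 or 2 centered predictors."""
    y = center(y)
    xs = [center(x) for x in xs]
    if len(xs) == 1:
        x, = xs
        b = sum(a * c for a, c in zip(x, y)) / sum(a * a for a in x)
        yhat = [b * a for a in x]
    else:
        x1, x2 = xs
        s11 = sum(a * a for a in x1)
        s22 = sum(a * a for a in x2)
        s12 = sum(a * c for a, c in zip(x1, x2))
        s1y = sum(a * c for a, c in zip(x1, y))
        s2y = sum(a * c for a, c in zip(x2, y))
        det = s11 * s22 - s12 * s12
        b1 = (s1y * s22 - s2y * s12) / det
        b2 = (s11 * s2y - s12 * s1y) / det
        yhat = [b1 * a + b2 * c for a, c in zip(x1, x2)]
    ss_res = sum((a - c) ** 2 for a, c in zip(y, yhat))
    ss_tot = sum(a * a for a in y)
    return 1.0 - ss_res / ss_tot

random.seed(0)
n = 2000
pc = [random.gauss(0, 1) for _ in range(n)]              # ancestry axis
prs = [random.gauss(0, 1) + 0.5 * a for a in pc]          # score partly tags ancestry
y = [0.4 * s + 0.6 * a + random.gauss(0, 1) for s, a in zip(prs, pc)]
```

Fitting ancestry separately prevents the ancestry signal from being smeared across (and mis-weighted within) the SNP effects, which is the mechanism the abstract describes for both PRS and BLUP.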

  8. A unitary signal-detection model of implicit and explicit memory.

    Science.gov (United States)

    Berry, Christopher J; Shanks, David R; Henson, Richard N A

    2008-10-01

    Do dissociations imply independent systems? In the memory field, the view that there are independent implicit and explicit memory systems has been predominantly supported by dissociation evidence. Here, we argue that many of these dissociations do not necessarily imply distinct memory systems. We review recent work with a single-system computational model that extends signal-detection theory (SDT) to implicit memory. SDT has had a major influence on research in a variety of domains. The current work shows that it can be broadened even further in its range of application. Indeed, the single-system model that we present does surprisingly well in accounting for some key dissociations that have been taken as evidence for independent implicit and explicit memory systems.
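The single-system logic can be sketched with a toy simulation (parameters invented, not the authors' fitted model): one latent memory strength feeds both measures through independent readout noise, so explicit and implicit tests differ in sensitivity, and can dissociate, without positing separate memory systems.

```python
# One latent strength f per item; the explicit (recognition) and implicit
# (priming) measures each read f through their own independent noise.

import random, math

def dprime(old, new):
    """Sensitivity: mean difference over pooled standard deviation."""
    def sd(v):
        m = sum(v) / len(v)
        return math.sqrt(sum((x - m) ** 2 for x in v) / len(v))
    pooled = math.sqrt((sd(old) ** 2 + sd(new) ** 2) / 2)
    return (sum(old) / len(old) - sum(new) / len(new)) / pooled

def simulate(n=4000, d=1.0, s_expl=1.0, s_impl=3.0, seed=1):
    random.seed(seed)
    e_old, e_new, i_old, i_new = [], [], [], []
    for studied in [True] * n + [False] * n:
        f = (d if studied else 0.0) + random.gauss(0, 1)  # single memory signal
        e = f + random.gauss(0, s_expl)                   # explicit readout noise
        i = f + random.gauss(0, s_impl)                   # implicit readout noise
        (e_old if studied else e_new).append(e)
        (i_old if studied else i_new).append(i)
    return dprime(e_old, e_new), dprime(i_old, i_new)
```

With these settings the explicit d' comes out roughly twice the implicit d', a dissociation in measured sensitivity produced by a single underlying signal, which is the kind of result the model uses against the independent-systems inference.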

  9. An Explicit Approach Toward Modeling Thermo-Coupled Deformation Behaviors of SMPs

    Directory of Open Access Journals (Sweden)

    Hao Li

    2017-03-01

    Full Text Available A new elastoplastic J2-flow model with thermal effects is proposed for simulating thermo-coupled finite deformation behaviors of shape memory polymers. In this new model, an elastic potential evolving with the development of plastic flow is incorporated to characterize the stress-softening effect at unloading and, moreover, thermo-induced plastic flow is introduced to represent the strain recovery effect at heating. It is shown that any given test data for both effects may be accurately simulated by means of direct and explicit procedures. Numerical examples of model predictions compare well with test data in the literature.

  10. Explicit all-atom modeling of realistically sized ligand-capped nanocrystals

    KAUST Repository

    Kaushik, Ananth P.

    2012-01-01

    We present a study of an explicit all-atom representation of nanocrystals of experimentally relevant sizes (up to 6 nm), capped with alkyl chain ligands, in vacuum. We employ all-atom molecular dynamics simulation methods in concert with a well-tested intermolecular potential model, MM3 (molecular mechanics 3), for the studies presented here. These studies include determining the preferred conformation of an isolated single nanocrystal (NC), pairs of isolated NCs, and (presaging studies of superlattice arrays) unit cells of NC superlattices. We observe that very small NCs (3 nm) behave differently in a superlattice as compared to larger NCs (6 nm and above) due to the conformations adopted by the capping ligands on the NC surface. Short ligands adopt a uniform distribution of orientational preferences, including some that lie against the face of the nanocrystal. In contrast, longer ligands prefer to interdigitate. We also study the effect of changing ligand length and ligand coverage on the NCs on the preferred ligand configurations. Since explicit all-atom modeling constrains the maximum system size that can be studied, we discuss issues related to coarse-graining the representation of the ligands, including a comparison of two commonly used coarse-grained models. We find that care has to be exercised in the choice of coarse-grained model. The data provided by these realistically sized ligand-capped NCs, determined using explicit all-atom models, should serve as a reference standard for future models of coarse-graining ligands using united atom models, especially for self-assembly processes. © 2012 American Institute of Physics.

  11. Latin hypercube sampling and geostatistical modeling of spatial uncertainty in a spatially explicit forest landscape model simulation

    Science.gov (United States)

    Chonggang Xu; Hong S. He; Yuanman Hu; Yu Chang; Xiuzhen Li; Rencang Bu

    2005-01-01

    Geostatistical stochastic simulation is always combined with the Monte Carlo method to quantify the uncertainty in spatial model simulations. However, due to the relatively long running time of spatially explicit forest models, a result of their complexity, it is often infeasible to generate hundreds or thousands of Monte Carlo simulations. Thus, it is of great...
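Latin hypercube sampling itself, the device this record uses to get by with far fewer model runs than plain Monte Carlo, is brief: each of n samples occupies a distinct equal-probability stratum in every dimension. A pure-Python sketch of the sampler alone (no geostatistical coupling):

```python
# Latin hypercube sample of n points in [0, 1)^dims: per dimension, one
# point is drawn uniformly inside each stratum [i/n, (i+1)/n), and the
# stratum order is shuffled independently per dimension.

import random

def latin_hypercube(n, dims, seed=42):
    random.seed(seed)
    cols = []
    for _ in range(dims):
        strata = list(range(n))
        random.shuffle(strata)                       # pair strata across dimensions
        cols.append([(i + random.random()) / n for i in strata])
    return list(zip(*cols))                          # n points of dimension dims
```

Because every stratum of every marginal is hit exactly once, a small sample already covers each input's range, which is why far fewer runs of a slow forest landscape model suffice.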

  12. Probabilistic modelling in urban drainage – two approaches that explicitly account for temporal variation of model errors

    DEFF Research Database (Denmark)

    Löwe, Roland; Del Giudice, Dario; Mikkelsen, Peter Steen

    of input uncertainties observed in the models. The explicit inclusion of such variations in the modelling process will lead to a better fulfilment of the assumptions made in formal statistical frameworks, thus reducing the need to resolve to informal methods. The two approaches presented here...

  13. Multiscale modeling of a rectifying bipolar nanopore: explicit-water versus implicit-water simulations.

    Science.gov (United States)

    Ható, Zoltán; Valiskó, Mónika; Kristóf, Tamás; Gillespie, Dirk; Boda, Dezsö

    2017-07-21

    In a multiscale modeling approach, we present computer simulation results for a rectifying bipolar nanopore at two modeling levels. In an all-atom model, we use explicit water to simulate ion transport directly with the molecular dynamics technique. In a reduced model, we use implicit water and apply the Local Equilibrium Monte Carlo method together with the Nernst-Planck transport equation. This hybrid method makes the fast calculation of ion transport possible at the price of lost details. We show that the implicit-water model is an appropriate representation of the explicit-water model when we look at the system at the device (i.e., input vs. output) level. The two models produce qualitatively similar behavior of the electrical current for different voltages and model parameters. Looking at the details of concentration and potential profiles, we find profound differences between the two models. These differences, however, do not influence the basic behavior of the model as a device because they do not influence the z-dependence of the concentration profiles which are the main determinants of current. These results then address an old paradox: how do reduced models, whose assumptions should break down in a nanoscale device, predict experimental data? Our simulations show that reduced models can still capture the overall device physics correctly, even though they get some important aspects of the molecular-scale physics quite wrong; reduced models work because they include the physics that is necessary from the point of view of device function. Therefore, reduced models can suffice for general device understanding and device design, but more detailed models might be needed for molecular level understanding.

  14. An improved risk-explicit interval linear programming model for pollution load allocation for watershed management.

    Science.gov (United States)

    Xia, Bisheng; Qian, Xin; Yao, Hong

    2017-11-01

    Although the risk-explicit interval linear programming (REILP) model has solved the problem of having interval solutions, it has an equity problem, which can lead to unbalanced allocation between different decision variables. Therefore, an improved REILP model is proposed. This model adds an equity objective function and three constraint conditions to overcome the equity problem. In this case, pollution reduction is in proportion to pollutant load, which supports balanced development between different regional economies. The model is used to solve the problem of pollution load allocation in a small transboundary watershed. Compared with the result of the original REILP model, our model achieves equity between the upstream and downstream pollutant loads; it also avoids assigning the greatest pollution reduction to the sources nearest the control section. The model provides a better solution to the problem of pollution load allocation than previous versions.
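The proportionality rule described here is simple to state: under a total reduction target, each source's cut scales with its load. A minimal sketch of that equity rule in isolation (loads and target invented; the actual model embeds this as constraints inside the interval program):

```python
# Allocate a total pollutant-load cut across sources in proportion to each
# source's current load, so every source reduces by the same fraction.

def proportional_allocation(loads, total_cut):
    total = sum(loads.values())
    return {src: total_cut * load / total for src, load in loads.items()}
```

For loads of 60 and 40 and a total cut of 20, both sources reduce by the same 20% of their load, which is the balanced upstream/downstream outcome the added constraints enforce.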

  15. Short-Range Prediction of Monsoon Precipitation by NCMRWF Regional Unified Model with Explicit Convection

    Science.gov (United States)

    Mamgain, Ashu; Rajagopal, E. N.; Mitra, A. K.; Webster, S.

    2018-03-01

    There are increasing efforts towards the prediction of high-impact weather systems and understanding of the related dynamical and physical processes. High-resolution numerical model simulations can be used directly to model the impact at fine-scale detail, and improvements in forecast accuracy can help in disaster management planning and execution. The National Centre for Medium Range Weather Forecasting (NCMRWF) has implemented a high-resolution regional unified modeling system with explicit convection, embedded within a coarser-resolution global model with parameterized convection. The model configurations are based on the UK Met Office unified seamless modeling system. Recent land use/land cover data (2012-2013) obtained from the Indian Space Research Organisation (ISRO) are also used in the model simulations. Results based on a month of short-range forecasts over India from both the global and regional models indicate that the convection-permitting high-resolution regional model reduces the dry bias over the southern parts of the West Coast and the monsoon trough zone, with more intense rainfall mainly towards the northern parts of the monsoon trough zone. The regional model with explicit convection has significantly improved the phase of the diurnal cycle of rainfall as compared to the global model. Results from two monsoon depression cases during the study period show substantial improvement in the details of the rainfall pattern. Many rainfall categories defined for operational forecast purposes by Indian forecasters are also well represented by the convection-permitting high-resolution simulations. For the statistics of the number of days within a range of rain categories between 'No-Rain' and 'Heavy Rain', the regional model outperforms the global model in all the ranges. In the very heavy and extremely heavy categories, the regional simulations show an overestimation of rainfall days. The global model with parameterized convection has a tendency to overestimate the light rainfall days and

  16. An explicit formula for the interface tension of the 2D Potts model

    Science.gov (United States)

    Borgs, Christian; Janke, Wolfhard

    1992-11-01

    We consider the exact correlation length calculations for the two-dimensional Potts model at the transition point β_t by Klümper, Schadschneider and Zittartz, and by Buffenoir and Wallon. We argue that the correlation length calculated by the latter authors is the correlation length in the disordered phase and then combine their result with duality and the assumption of complete wetting to give an explicit formula for the order-disorder interface tension σ_od of this model. The result is used to clarify a controversy stemming from different numerical simulations of σ_od.

  17. Modeling mixed retention and early arrivals in multidimensional heterogeneous media using an explicit Lagrangian scheme

    Science.gov (United States)

    Zhang, Yong; Meerschaert, Mark M.; Baeumer, Boris; LaBolle, Eric M.

    2015-08-01

    This study develops an explicit two-step Lagrangian scheme based on the renewal-reward process to capture transient anomalous diffusion with mixed retention and early arrivals in multidimensional media. The resulting 3-D anomalous transport simulator provides a flexible platform for modeling transport. The first step explicitly models retention due to mass exchange between one mobile zone and any number of parallel immobile zones. The mobile component of the renewal process can be calculated as either an exponential random variable or a preassigned time step, and the subsequent random immobile time follows a Hyper-exponential distribution for finite immobile zones or a tempered stable distribution for infinite immobile zones with an exponentially tempered power-law memory function. The second step describes well-documented early arrivals which can follow streamlines due to mechanical dispersion using the method of subordination to regional flow. Applicability and implementation of the Lagrangian solver are further checked against transport observed in various media. Results show that, although the time-nonlocal model parameters are predictable for transport with retention in alluvial settings, the standard time-nonlocal model cannot capture early arrivals. Retention and early arrivals observed in porous and fractured media can be efficiently modeled by our Lagrangian solver, allowing anomalous transport to be incorporated into 2-D/3-D models with irregular flow fields. Extensions of the particle-tracking approach are also discussed for transport with parameters conditioned on local aquifer properties, as required by transient flow and nonstationary media.
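
    The alternating mobile/immobile cycle described here can be sketched for a single particle in one dimension. This is a minimal illustration under assumed parameter values (the velocity, dispersion coefficient, mixing proportions, and exchange rates are hypothetical, not fitted to any site), pairing an exponential mobile time with a hyperexponential immobile time for a finite number of immobile zones:

```python
import math
import random

def hyperexp(probs, rates, rng):
    """Immobile time drawn from a hyperexponential mixture
    (one exponential rate per immobile zone)."""
    r = rng.random()
    acc = 0.0
    for p, lam in zip(probs, rates):
        acc += p
        if r <= acc:
            return rng.expovariate(lam)
    return rng.expovariate(rates[-1])

def track_particle(t_end, v=1.0, D=0.1, lam_mobile=2.0,
                   probs=(0.7, 0.3), rates=(5.0, 0.5), seed=1):
    """Alternate mobile (exponential) and immobile (hyperexponential) periods.

    Displacement accrues only during mobile time: advection v*dt plus a
    Gaussian dispersive step with variance 2*D*dt. Returns the final
    particle position."""
    rng = random.Random(seed)
    t, x = 0.0, 0.0
    while t < t_end:
        dt_m = rng.expovariate(lam_mobile)            # mobile period
        x += v * dt_m + rng.gauss(0.0, math.sqrt(2.0 * D * dt_m))
        t += dt_m
        t += hyperexp(probs, rates, rng)              # immobile period: no motion
    return x
```

Because the particle is trapped for a large fraction of total time, its mean displacement is retarded well below the advective distance v*t_end.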

  18. A Bidirectional Subsurface Remote Sensing Reflectance Model Explicitly Accounting for Particle Backscattering Shapes

    Science.gov (United States)

    He, Shuangyan; Zhang, Xiaodong; Xiong, Yuanheng; Gray, Deric

    2017-11-01

    The subsurface remote sensing reflectance (rrs, sr-1), particularly its bidirectional reflectance distribution function (BRDF), depends fundamentally on the angular shape of the volume scattering functions (VSFs, m-1 sr-1). Recent technological advancement has greatly expanded the collection, and the knowledge of natural variability, of the VSFs of oceanic particles. This allows us to test Zaneveld's theoretical rrs model, which explicitly accounts for particle VSF shapes. We parameterized the rrs model based on HydroLight simulations using 114 VSFs measured in three coastal waters around the United States and in oceanic waters of the North Atlantic Ocean. With the absorption coefficient (a), backscattering coefficient (bb), and VSF shape as inputs, the parameterized model is able to predict rrs with a root mean square relative error of ˜4% for solar zenith angles from 0 to 75°, viewing zenith angles from 0 to 60°, and viewing azimuth angles from 0 to 180°. A test with the field data indicates that the performance of our model, when using only a and bb as inputs and selecting the VSF shape using bb, is comparable to or slightly better than the currently used models by Morel et al. and Lee et al. Explicitly expressing VSF shapes in rrs modeling has great potential to further constrain the uncertainty in ocean color studies as our knowledge of the VSFs of natural particles continues to improve. Our study represents a first effort in this direction.

  19. Speech Enhancement Using Gaussian Mixture Models, Explicit Bayesian Estimation and Wiener Filtering

    Directory of Open Access Journals (Sweden)

    M. H. Savoji

    2014-09-01

    Full Text Available Gaussian Mixture Models (GMMs) of power spectral densities of speech and noise are used with explicit Bayesian estimations in Wiener filtering of noisy speech. No assumption is made on the nature or stationarity of the noise. No voice activity detection (VAD) or any other means is employed to estimate the input SNR. The GMM mean vectors are used to form sets of over-determined systems of equations whose solutions lead to the first estimates of speech and noise power spectra. The noise source is also identified and the input SNR estimated in this first step. These first estimates are then refined using approximate but explicit MMSE and MAP estimation formulations. The refined estimates are then used in a Wiener filter to reduce noise and enhance the noisy speech. The proposed schemes show good results. Nevertheless, it is shown that the MAP explicit solution, introduced here for the first time, reduces the computation time to less than one third, with a slightly higher improvement in SNR and PESQ score and less distortion, in comparison to the MMSE solution.
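
    The final filtering stage lends itself to a short sketch. The GMM-based Bayesian estimation of the speech and noise power spectra is omitted here; assuming those estimates are already available, a minimal frequency-domain Wiener filter applied to one frame might look like this (the function names, spectral floor, and regularization constant are illustrative, not the paper's implementation):

```python
import numpy as np

def wiener_gain(speech_psd, noise_psd, floor=1e-3):
    """Frequency-domain Wiener gain H = Ps / (Ps + Pn), floored to limit
    musical noise; the small epsilon guards against empty bins."""
    gain = speech_psd / (speech_psd + noise_psd + 1e-12)
    return np.maximum(gain, floor)

def enhance_frame(noisy_frame, speech_psd, noise_psd):
    """Apply the Wiener gain to one windowed frame by FFT filtering;
    the PSD estimates would come from the GMM-based Bayesian step."""
    spectrum = np.fft.rfft(noisy_frame)
    filtered = spectrum * wiener_gain(speech_psd, noise_psd)
    return np.fft.irfft(filtered, n=len(noisy_frame))
```

With a perfect speech PSD estimate and zero noise PSD the gain is unity and the frame passes through unchanged; as the estimated noise PSD grows in a bin, that bin is attenuated toward the floor.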

  20. Towards Linking 3D SAR and Lidar Models with a Spatially Explicit Individual Based Forest Model

    Science.gov (United States)

    Osmanoglu, B.; Ranson, J.; Sun, G.; Armstrong, A. H.; Fischer, R.; Huth, A.

    2017-12-01

    In this study, we present a parameterization of the FORMIND individual-based gap model (IBGM) for old-growth Atlantic lowland rainforest in La Selva, Costa Rica, for the purpose of informing multisensor remote sensing techniques for aboveground biomass estimation. The model was successfully parameterized and calibrated for the study site; results show that the simulated forest reproduces the structural complexity of the Costa Rican rainforest based on comparisons with CARBONO inventory plot data. Though the simulated stem numbers (378) slightly underestimated the plot data (418), particularly for canopy-dominant intermediate shade-tolerant trees and shade-tolerant understory trees, overall there was a 9.7% difference. Aboveground biomass (kg/ha) showed a 0.1% difference between the simulated forest and the inventory plot dataset. The Costa Rica FORMIND simulation was then used to parameterize spatially explicit (3D) SAR and lidar backscatter models. The simulated forest stands were used to generate a Look Up Table (LUT) as a tractable means to estimate aboveground forest biomass for these complex forests. Various combinations of lidar and radar variables were evaluated in the LUT inversion. To test the capability of future data for estimation of forest height and biomass, we considered 1) L- (or P-) band polarimetric data (backscattering coefficients of HH, HV and VV); 2) L-band dual-pol repeat-pass InSAR data (HH/HV backscattering coefficients and coherences, height of scattering phase center at HH and HV using DEM or surface height from lidar data as reference); 3) P-band polarimetric InSAR data (canopy height from inversion of PolInSAR data, or the coherences and height of scattering phase center at HH, HV and VV); 4) various height indices from waveform lidar data; and 5) surface and canopy top height from photon-counting lidar data. The methods for parameterizing the remote sensing models with the IBGM and developing Look Up Tables will be discussed. Results

  1. A new method for explicit modelling of single failure event within different common cause failure groups

    International Nuclear Information System (INIS)

    Kančev, Duško; Čepin, Marko

    2012-01-01

    Redundancy and diversity are the main principles of the safety systems in the nuclear industry. Implementation of safety component redundancy has been acknowledged as an effective approach for assuring high levels of system reliability. The existence of redundant components, identical in most cases, implies a probability of their simultaneous failure due to a shared cause: a common cause failure. This paper presents a new method for the explicit modelling of a single component failure event within multiple common cause failure groups simultaneously. The method is based on a modification of the frequently utilised Beta Factor parametric model. The motivation for the development of this method lies in the fact that one of the most widespread software packages for fault tree and event tree modelling as part of probabilistic safety assessment does not offer the option of simultaneous assignment of a single failure event to multiple common cause failure groups. In that sense, the proposed method can be seen as an advantage of the explicit modelling of common cause failures. A standard standby safety system is selected as a case study for application and study of the proposed methodology. The results and insights yield improved, more transparent and more comprehensive models within probabilistic safety assessment.
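
    The paper's modification is not reproduced here, but the standard Beta Factor split it builds on is easy to sketch: a component's total failure probability is divided into an independent part and a common-cause part, and for a two-train redundant system the common-cause part typically dominates the unavailability. A minimal sketch with illustrative numbers, using the usual rare-event approximation:

```python
def beta_factor_split(q_total, beta):
    """Split a component's total failure probability into an independent
    part and a common-cause part (standard Beta Factor model)."""
    return (1.0 - beta) * q_total, beta * q_total

def two_train_system_unavailability(q_total, beta):
    """Unavailability of a 1-out-of-2 redundant system: both trains fail
    independently, or the shared common cause fails both at once
    (rare-event approximation, cross terms neglected)."""
    q_ind, q_ccf = beta_factor_split(q_total, beta)
    return q_ind * q_ind + q_ccf
```

For q_total = 1e-3 and beta = 0.05 the independent contribution is below 1e-6, while the common-cause term alone is 5e-5, which is why CCF modelling dominates redundant-system risk estimates.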

  2. Explicit ions/implicit water generalized Born model for nucleic acids

    Science.gov (United States)

    Tolokh, Igor S.; Thomas, Dennis G.; Onufriev, Alexey V.

    2018-05-01

    The ion atmosphere around highly charged nucleic acid molecules plays a significant role in their dynamics, structure, and interactions. Here we utilized the implicit solvent framework to develop a model for the explicit treatment of ions interacting with nucleic acid molecules. The proposed explicit ions/implicit water model is based on a significantly modified generalized Born (GB) model and utilizes a non-standard approach to define the solute/solvent dielectric boundary. Specifically, the model includes modifications to the GB interaction terms for the case of multiple interacting solutes: a disconnected dielectric boundary around the solute-ion or ion-ion pairs. A fully analytical description of all energy components for charge-charge interactions is provided. The effectiveness of the approach is demonstrated by calculating the potential of mean force for the Na+-Cl- ion pair and by carrying out a set of Monte Carlo (MC) simulations of mono- and trivalent ions interacting with DNA and RNA duplexes. The monovalent (Na+) and trivalent (CoHex3+) counterion distributions predicted by the model are in close quantitative agreement with the all-atom explicit water molecular dynamics simulations used as reference. Expressed in units of energy, the maximum deviations of local ion concentrations from the reference are within kBT. The proposed explicit ions/implicit water GB model is able to resolve subtle features and differences of CoHex distributions around DNA and RNA duplexes. These features include preferential CoHex binding inside the major groove of the RNA duplex, in contrast to CoHex binding at the "external" surface of the sugar-phosphate backbone of the DNA duplex; these differences in the counterion binding patterns were earlier shown to be responsible for the observed drastic differences in condensation propensities between short DNA and RNA duplexes. MC simulations of CoHex ions interacting with the homopolymeric poly(dA.dT) DNA duplex with modified (de
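
    The canonical GB machinery that such models build on can be sketched briefly. This is the standard Still et al. effective-distance form for a pair of charges, not the modified GB interaction terms proposed in this work; the Coulomb constant and dielectric values follow the usual kcal/mol, elementary-charge, Angstrom conventions:

```python
import math

def f_gb(r, Ri, Rj):
    """Still et al. effective distance for the generalized Born model:
    f_GB = sqrt(r^2 + Ri*Rj*exp(-r^2 / (4*Ri*Rj))),
    where Ri, Rj are effective Born radii."""
    return math.sqrt(r * r + Ri * Rj * math.exp(-r * r / (4.0 * Ri * Rj)))

def gb_pair_energy(qi, qj, r, Ri, Rj, eps_in=1.0, eps_out=78.5):
    """Cross-term of the GB polarization (solvent screening) energy in
    kcal/mol; 332.06 is the electrostatic conversion constant."""
    return -332.06 * (1.0 / eps_in - 1.0 / eps_out) * qi * qj / f_gb(r, Ri, Rj)
```

At r = 0 the effective distance reduces to sqrt(Ri*Rj), giving Born-like self-screening, while at large r it approaches the plain separation, recovering screened Coulomb behavior.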

  3. Spatially-Explicit Bayesian Information Entropy Metrics for Calibrating Landscape Transformation Models

    Directory of Open Access Journals (Sweden)

    Kostas Alexandridis

    2013-06-01

    Full Text Available Assessing spatial model performance often presents challenges related to the choice and suitability of traditional statistical methods in capturing the true validity and dynamics of the predicted outcomes. The stochastic nature of many of our contemporary spatial models of land use change necessitates the testing and development of new and innovative methodologies in statistical spatial assessment. In many cases, spatial model performance depends critically on the spatially-explicit prior distributions, characteristics, availability and prevalence of the variables and factors under study. This study explores the spatial characteristics of statistical model assessment for modeling land use change dynamics in a seven-county study area in South-Eastern Wisconsin during the historical period of 1963–1990. The artificial neural network-based Land Transformation Model (LTM) predictions are used to compare simulated with historical land use transformations in urban/suburban landscapes. We introduce a range of Bayesian information entropy statistical spatial metrics for assessing the model performance across multiple simulation testing runs. Bayesian entropic estimates of model performance are compared against information-theoretic stochastic entropy estimates and theoretically-derived accuracy assessments. We argue for the critical role of informational uncertainty across different scales of spatial resolution in informing spatial landscape model assessment. Our analysis reveals how incorporation of spatial and landscape information asymmetry estimates can improve our stochastic assessments of spatial model predictions. Finally, our study shows how spatially-explicit entropic classification accuracy estimates can work closely with dynamic modeling methodologies in improving our scientific understanding of landscape change as a complex adaptive system and process.

  4. A spatially explicit model for an Allee effect: why wolves recolonize so slowly in Greater Yellowstone.

    Science.gov (United States)

    Hurford, Amy; Hebblewhite, Mark; Lewis, Mark A

    2006-11-01

    A reduced probability of finding mates at low densities is a frequently hypothesized mechanism for a component Allee effect. At low densities dispersers are less likely to find mates and establish new breeding units. However, many mathematical models for an Allee effect do not make a distinction between breeding group establishment and subsequent population growth. Our objective is to derive a spatially explicit mathematical model, where dispersers have a reduced probability of finding mates at low densities, and parameterize the model for wolf recolonization in the Greater Yellowstone Ecosystem (GYE). In this model, only the probability of establishing new breeding units is influenced by the reduced probability of finding mates at low densities. We analytically and numerically solve the model to determine the effect of a decreased probability in finding mates at low densities on population spread rate and density. Our results suggest that a reduced probability of finding mates at low densities may slow recolonization rate.
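
    The core mechanism, a mate-finding probability that saturates with density and depresses breeding-unit establishment at low density, can be sketched with a hypothetical saturating functional form. Both the form and the parameter values below are illustrative only, not the ones fitted to the GYE wolf data:

```python
def pair_formation_prob(density, theta=0.2):
    """Probability that a disperser finds a mate: saturating in local
    density and small at low density (hypothetical functional form)."""
    return density / (density + theta)

def per_capita_establishment(density, r=0.5, theta=0.2):
    """Establishment-limited per-capita growth: only dispersers that find
    mates found new breeding units, so low density depresses growth,
    producing a component Allee effect."""
    return r * pair_formation_prob(density, theta)
```

Because the per-capita rate rises with density, invasion fronts at low density advance more slowly than a model without the mate-finding term would predict, which is the qualitative point of the abstract.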

  5. An explicit solution of the mathematical model for osmotic desalination process

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Do Yeon; Gu, Boram; Yang, Dae Ryook [Korea University, Seoul (Korea, Republic of)

    2013-09-15

    Membrane processes such as reverse osmosis and forward osmosis for seawater desalination have gained attention in recent years. Mathematical models have been used to interpret the mechanism of membrane processes. The membrane process model, consisting of flux and concentration polarization (CP) models, is coupled with balance equations and solved simultaneously. This set of model equations is, however, implicit and nonlinear; consequently, the model must be solved iteratively and numerically, which is time- and cost-intensive. We suggest a method to transform implicit equations to their explicit form, in order to avoid an iterative procedure. In addition, the performance of five solving methods, including the method that we suggest, is tested and compared for accuracy, computation time, and robustness based on input conditions. Our proposed method shows the best performance based on the robustness of various simulation conditions, accuracy, and a cost-effective computation time.
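
    The implicit coupling that motivates the paper can be seen in a minimal RO flux model: the membrane-wall osmotic pressure depends on the flux through the film-model concentration polarization factor exp(Jw/k), while the flux depends on the wall osmotic pressure. A hedged sketch with illustrative seawater-like values, solved here by damped fixed-point iteration (the paper's contribution is precisely an explicit closed form that removes such a loop):

```python
import math

def ro_flux_iterative(A=3e-12, dP=50e5, pi_b=28e5, k=2e-5,
                      sigma=1.0, tol=1e-12, max_iter=200):
    """Water flux Jw (m/s) satisfying Jw = A*(dP - sigma*pi_b*exp(Jw/k)):
    bulk osmotic pressure pi_b amplified by concentration polarization.
    A (m/s/Pa), dP and pi_b (Pa), and mass transfer coefficient k (m/s)
    are illustrative values, not from the paper."""
    jw = A * (dP - sigma * pi_b)          # initial guess: no polarization
    for _ in range(max_iter):
        jw_new = A * (dP - sigma * pi_b * math.exp(jw / k))
        if abs(jw_new - jw) < tol:
            return jw_new
        jw = 0.5 * (jw + jw_new)          # damped update for stability
    return jw
```

The converged flux is noticeably below the polarization-free guess, which is exactly the effect the CP model captures; an explicit solution would return the same root without any iteration loop.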

  6. Pedagogical Model for Explicit Teaching of Reading Comprehension to English Language Learners

    Directory of Open Access Journals (Sweden)

    Al Tiyb Al Khaiyali

    2017-09-01

    Full Text Available Reading comprehension instruction is considered one of the major challenges that most English language teachers and students encounter. Therefore, providing a systematic, explicit, and flexible model for teaching reading comprehension strategies could help resolve some of these challenges and increase the possibility of teaching reading comprehension, particularly in language learners' classrooms. Consequently, the purpose of this paper is to provide a model for teaching reading comprehension strategies in language learning classrooms. The proposed instructional model is divided into three systematic phases through which strategies are taught before reading, during reading, and after reading. Each phase is explained and elaborated using recommended models for teachers. Finally, suggested considerations to consolidate this model are provided.

  7. Modeling nitrous oxide production and reduction in soil through explicit representation of denitrification enzyme kinetics.

    Science.gov (United States)

    Zheng, Jianqiu; Doskey, Paul V

    2015-02-17

    An enzyme-explicit denitrification model with representations for pre- and de novo synthesized enzymes was developed to improve predictions of nitrous oxide (N2O) accumulations in soil and emissions from the surface. The metabolic model of denitrification is based on dual-substrate utilization and Monod growth kinetics. Enzyme synthesis/activation was incorporated into each sequential reduction step of denitrification to regulate dynamics of the denitrifier population and the active enzyme pool, which controlled the rate function. Parameterizations were developed from observations of the dynamics of N2O production and reduction in soil incubation experiments. The model successfully reproduced the dynamics of N2O and N2 accumulation in the incubations and revealed an important regulatory effect of denitrification enzyme kinetics on the accumulation of denitrification products. Pre-synthesized denitrification enzymes contributed 20, 13, 43, and 62% of N2O that accumulated in 48 h incubations of soil collected from depths of 0-5, 5-10, 10-15, and 15-25 cm, respectively. An enzyme activity function (E) was defined to estimate the relative concentration of active enzymes and variation in response to environmental conditions. The value of E allows for activities of pre-synthesized denitrification enzymes to be differentiated from de novo synthesized enzymes. Incorporating explicit representations of denitrification enzyme kinetics into biogeochemical models is a promising approach for accurately simulating dynamics of the production and reduction of N2O in soils.
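
    A single reduction step of such a model can be sketched as dual-substrate Monod kinetics scaled by an explicit active-enzyme pool. All parameter values, units, and the forward-Euler update below are illustrative, not the calibrated values from the incubation experiments:

```python
def reduction_rate(c_substrate, c_donor, enzyme, vmax=2.0, ks=0.5, kd=0.25):
    """Dual-substrate Monod rate for one denitrification step, scaled by
    the active enzyme pool (dimensionless, 0..1). Pre-synthesized enzymes
    enter simply as a nonzero initial pool."""
    return vmax * enzyme * (c_substrate / (ks + c_substrate)) \
                         * (c_donor / (kd + c_donor))

def step_n2o(n2o, no2, c_donor, e_prod, e_cons, dt=0.01):
    """Explicit Euler update for N2O: produced by NO2- reduction and
    consumed by N2O reductase, each with its own enzyme pool."""
    production = reduction_rate(no2, c_donor, e_prod)
    consumption = reduction_rate(n2o, c_donor, e_cons)
    return n2o + dt * (production - consumption)
```

The net accumulation is the imbalance between the two enzyme pools, which is how a large pre-synthesized producing pool and a small consuming pool yield the transient N2O peaks described in the abstract.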

  8. Green Infrastructure Models and Tools

    Science.gov (United States)

    The objective of this project is to modify and refine existing models and develop new tools to support decision making for the complete green infrastructure (GI) project lifecycle, including the planning and implementation of stormwater control in urban and agricultural settings,...

  9. A Risk Assessment Example for Soil Invertebrates Using Spatially Explicit Agent-Based Models

    DEFF Research Database (Denmark)

    Reed, Melissa; Alvarez, Tania; Chelinho, Sonia

    2016-01-01

    Current risk assessment methods for measuring the toxicity of plant protection products (PPPs) on soil invertebrates use standardized laboratory conditions to determine acute effects on mortality and sublethal effects on reproduction. If an unacceptable risk is identified at the lower tier...... population models for ubiquitous soil invertebrates (collembolans and earthworms) as refinement options in current risk assessment. Both are spatially explicit agent-based models (ABMs), incorporating individual and landscape variability. The models were used to provide refined risk assessments for different...... application scenarios of a hypothetical pesticide applied to potato crops (full-field spray onto the soil surface [termed “overall”], in-furrow, and soil-incorporated pesticide applications). In the refined risk assessment, the population models suggest that soil invertebrate populations would likely recover...

  10. Simulation of a severe convective storm using a numerical model with explicitly incorporated aerosols

    Science.gov (United States)

    Lompar, Miloš; Ćurić, Mladjen; Romanic, Djordje

    2017-09-01

    Despite the important role aerosols play in all stages of the cloud lifecycle, their representation in numerical weather prediction models is often rather crude. This paper investigates the effects that the explicit versus implicit inclusion of aerosols in a microphysics parameterization scheme of the Weather Research and Forecasting (WRF) - Advanced Research WRF (WRF-ARW) model has on cloud dynamics and microphysics. The testbed selected for this study is a severe mesoscale convective system with supercells that struck the west and central parts of Serbia in the afternoon of July 21, 2014. Numerical products of two model runs, one with aerosols explicitly included (WRF-AE) and another with aerosols implicitly assumed (WRF-AI), are compared against precipitation measurements from a surface network of rain gauges, as well as against radar and satellite observations. The WRF-AE model accurately captured the transport of dust from north Africa over the Mediterranean and into the Balkan region. On smaller scales, both models displaced the locations of clouds situated above west and central Serbia towards the southeast and under-predicted the maximum values of composite radar reflectivity. Similar to the satellite images, WRF-AE shows the mesoscale convective system as a merged cluster of cumulonimbus clouds. Both models over-predicted the precipitation amounts; WRF-AE over-predictions are particularly pronounced in the zones of light rain, while WRF-AI gave larger outliers. Unlike WRF-AI, the WRF-AE approach enables the modelling of the time evolution and influx of aerosols into the cloud, which could be of practical importance in weather forecasting and weather modification. Several likely causes for the discrepancies between the models and observations are discussed, and prospects for further research in this field are outlined.

  11. [Application of spatially explicit landscape model in soil loss study in Huzhong area].

    Science.gov (United States)

    Xu, Chonggang; Hu, Yuanman; Chang, Yu; Li, Xiuzhen; Bu, Renchang; He, Hongshi; Leng, Wenfang

    2004-10-01

    Universal Soil Loss Equation (USLE) has been widely used to estimate the average annual soil loss. In most of the previous work on soil loss evaluation on forestland, the cover management factor was calculated from a static forest landscape. The advent of spatially explicit forest landscape models in the last decade, which explicitly simulate forest succession dynamics under natural and anthropogenic disturbances (fire, wind, harvest and so on) on a heterogeneous landscape, makes it possible to take the change of forest cover into consideration and to dynamically simulate soil loss in different years (e.g., 10 years and 20 years after the current year). In this study, we linked a spatially explicit landscape model (LANDIS) with the USLE to simulate soil loss dynamics under two scenarios: fire and no harvest, and fire and harvest. We also simulated soil loss with no fire and no harvest as a control. The results showed that soil loss varied periodically with simulation year, and the amplitude of change was the lowest under the control scenario and the highest under the fire and no harvest scenario. The effect of harvest on soil loss could not be easily identified on the map; however, the cumulative effect of harvest on soil loss was larger than that of fire. Decreasing the harvest area and the percentage of bare soil created by harvest could significantly reduce soil loss, but had no significant effect on the dynamics of soil loss. Although harvest increased the annual soil loss, it tended to decrease the variability of soil loss between different simulation years.
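
    The coupling itself is straightforward to sketch: USLE is a product of factors, and the landscape model's contribution is to update the cover-management factor C each simulated year as forest cover changes. The R, K, LS, and C values below are hypothetical, chosen only to contrast a closed-canopy stand with a recently harvested one:

```python
def usle_soil_loss(R, K, LS, C, P=1.0):
    """Average annual soil loss A = R * K * LS * C * P (USLE), where R is
    rainfall erosivity, K soil erodibility, LS slope length/steepness,
    C cover-management, and P the support-practice factor. C is the term
    a landscape model such as LANDIS can update each simulated year."""
    return R * K * LS * C * P

# Hypothetical comparison: closed-canopy forest vs. recently harvested stand.
loss_forest = usle_soil_loss(R=1200.0, K=0.25, LS=1.8, C=0.003)
loss_harvested = usle_soil_loss(R=1200.0, K=0.25, LS=1.8, C=0.12)
```

Holding the site factors fixed, the forty-fold difference in C alone drives a forty-fold difference in predicted loss, which is why a dynamically simulated forest cover changes the soil-loss trajectory so strongly.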

  12. Interaction of the model alkyltrimethylammonium ions with alkali halide salts: an explicit water molecular dynamics study

    Directory of Open Access Journals (Sweden)

    M. Druchok

    2013-01-01

    Full Text Available We present an explicit water molecular dynamics simulation of dilute solutions of model alkyltrimethylammonium surfactant ions (the number of methylene groups in the tail is 3, 5, 8, 10, and 12) in mixture with NaF, NaCl, NaBr, and NaI salts. The SPC/E model is used to describe water molecules. Results of the simulation at 298 K are presented in the form of the radial distribution functions between nitrogen and carbon atoms of CH2 groups on the alkyltrimethylammonium ion and the counterion species in the solution. The running coordination numbers between carbon atoms of the surfactants and the counterions are also calculated. We show that the I- counterion exhibits the highest, and F- the lowest, affinity to "bind" to the model surfactants. The results are discussed in view of the available experimental and simulation data for this and similar solutions.

  13. Analysis of explicit model predictive control for path-following control

    Science.gov (United States)

    2018-01-01

    In this paper, explicit Model Predictive Control (MPC) is employed for automated lane-keeping systems. MPC has been regarded as the key to handling such constrained systems. However, the massive computational complexity of MPC, which employs online optimization, has been a major drawback that limits the range of its target applications to relatively small and/or slow problems. Explicit MPC can reduce this computational burden using a multi-parametric quadratic programming technique (mp-QP). The control objective is to derive an optimal front steering wheel angle at each sampling time so that autonomous vehicles travel along desired paths, including straight, circular, and clothoid parts, at high entry speeds. In terms of the design of the proposed controller, a method of choosing the weighting matrices in the optimization problem and the range of horizons for path-following control are described through simulations. For the verification of the proposed controller, simulation results obtained using other control methods such as MPC, Linear-Quadratic Regulator (LQR), and a driver model are employed, and CarSim, which reflects the features of a vehicle more realistically than MATLAB/Simulink, is used for reliable demonstration. PMID:29534080
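
    The online phase that makes explicit MPC cheap can be sketched directly: the offline mp-QP solution partitions the state space into polyhedral critical regions, each carrying an affine control law, so the controller at runtime only performs point location and one affine evaluation. The toy one-dimensional partition below (a saturated linear law) is purely illustrative, not the lane-keeping controller of the paper:

```python
import numpy as np

def explicit_mpc_control(x, regions):
    """Online phase of explicit MPC: find the critical region containing
    the state x and evaluate its precomputed affine law u = K x + k.
    Each region is (A, b, K, k) with membership test A @ x <= b; the
    table itself would come from the offline mp-QP solution."""
    for A, b, K, k in regions:
        if np.all(A @ x <= b + 1e-9):
            return K @ x + k
    raise ValueError("state outside the explored polyhedral partition")

# Toy partition of a scalar state into three intervals with saturation:
regions = [
    (np.array([[1.0]]), np.array([-1.0]),
     np.zeros((1, 1)), np.array([1.0])),                     # x <= -1: u = 1
    (np.array([[1.0], [-1.0]]), np.array([1.0, 1.0]),
     np.array([[-1.0]]), np.array([0.0])),                   # -1 <= x <= 1: u = -x
    (np.array([[-1.0]]), np.array([-1.0]),
     np.zeros((1, 1)), np.array([-1.0])),                    # x >= 1: u = -1
]
```

Real implementations replace the linear scan with a binary search tree over the regions, which is what keeps the worst-case evaluation time deterministic and small.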

  14. Modeling of fatigue crack induced nonlinear ultrasonics using a highly parallelized explicit local interaction simulation approach

    Science.gov (United States)

    Shen, Yanfeng; Cesnik, Carlos E. S.

    2016-04-01

    This paper presents a parallelized modeling technique for the efficient simulation of nonlinear ultrasonics introduced by the wave interaction with fatigue cracks. The elastodynamic wave equations with contact effects are formulated using an explicit Local Interaction Simulation Approach (LISA). The LISA formulation is extended to capture the contact-impact phenomena during the wave damage interaction based on the penalty method. A Coulomb friction model is integrated into the computation procedure to capture the stick-slip contact shear motion. The LISA procedure is coded using the Compute Unified Device Architecture (CUDA), which enables the highly parallelized supercomputing on powerful graphic cards. Both the explicit contact formulation and the parallel feature facilitates LISA's superb computational efficiency over the conventional finite element method (FEM). The theoretical formulations based on the penalty method is introduced and a guideline for the proper choice of the contact stiffness is given. The convergence behavior of the solution under various contact stiffness values is examined. A numerical benchmark problem is used to investigate the new LISA formulation and results are compared with a conventional contact finite element solution. Various nonlinear ultrasonic phenomena are successfully captured using this contact LISA formulation, including the generation of nonlinear higher harmonic responses. Nonlinear mode conversion of guided waves at fatigue cracks is also studied.
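
    The penalty treatment of contact is easy to sketch in one dimension: a stiff spring force is switched on only while penetration occurs, so the explicit time march needs no equation solving at the contact. A minimal sketch of a point mass bouncing off a rigid wall, using a semi-implicit (symplectic) Euler update and an illustrative penalty stiffness (not the guideline value derived in the paper):

```python
def bounce_off_wall(m=1.0, k_contact=1e6, v0=1.0, x0=-0.01,
                    dt=1e-5, steps=3000):
    """Semi-implicit Euler integration of a point mass approaching a
    rigid wall at x = 0. The penalty method applies a stiff spring force
    -k_contact * x only while the mass penetrates (x > 0), so contact is
    handled explicitly, step by step. Returns final position, final
    velocity, and the maximum penetration depth."""
    x, v = x0, v0
    max_penetration = 0.0
    for _ in range(steps):
        f = -k_contact * x if x > 0.0 else 0.0   # penalty force on contact
        v += (f / m) * dt                        # update velocity first...
        x += v * dt                              # ...then position (symplectic)
        max_penetration = max(max_penetration, x)
    return x, v, max_penetration
```

The mass rebounds with nearly its incoming speed while the penetration stays on the order of v0*sqrt(m/k_contact); too soft a stiffness gives large spurious penetration, too stiff a value forces a smaller stable time step, which is the trade-off behind the contact-stiffness guideline discussed in the paper.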

  15. Explicit state representation and the ATLAS event data model: theory and practice

    International Nuclear Information System (INIS)

    Nowak, M; Snyder, S; Cranmer, K; Malon, D; Gemmeren, P v; Schaffer, A; Binet, S

    2008-01-01

    In anticipation of data taking, ATLAS has undertaken a program of work to develop an explicit state representation of the experiment's complex transient event data model. This effort has provided both an opportunity to consider explicitly the structure, organization, and content of the ATLAS persistent event store before writing tens of petabytes of data (replacing simple streaming, which uses the persistent store as a core dump of transient memory), and a locus for support of event data model evolution, including significant refactoring, beyond the automatic schema evolution capabilities of the underlying persistence technologies. ATLAS has encountered the need for such non-trivial schema evolution on several occasions already. This paper describes the state representation strategy (transient/persistent separation) and its implementation, including both the payoffs that ATLAS has seen (significant and sometimes surprising space and performance improvements, the extra layer notwithstanding, and extremely general schema evolution support) and the costs (additional and relatively pervasive infrastructure development and maintenance). The paper further discusses how these costs are mitigated, and how ATLAS is able to implement this strategy without losing the ability to take advantage of the (improving!) automatic schema evolution capabilities of underlying technology layers when appropriate. Implications of state representations for direct ROOT browsability, and current strategies for associating physics analysis views with such state representations, are also described.

  16. Analysis of explicit model predictive control for path-following control.

    Science.gov (United States)

    Lee, Junho; Chang, Hyuk-Jun

    2018-01-01

    In this paper, explicit Model Predictive Control (MPC) is employed for automated lane-keeping systems. MPC has been regarded as the key to handling such constrained systems. However, the massive computational complexity of MPC, which employs online optimization, has been a major drawback that limits the range of its target applications to relatively small and/or slow problems. Explicit MPC can reduce this computational burden using a multi-parametric quadratic programming technique (mp-QP). The control objective is to derive an optimal front steering wheel angle at each sampling time so that autonomous vehicles travel along desired paths, including straight, circular, and clothoid parts, at high entry speeds. In terms of the design of the proposed controller, a method of choosing the weighting matrices in the optimization problem and the range of horizons for path-following control are described through simulations. For the verification of the proposed controller, simulation results obtained using other control methods such as MPC, Linear-Quadratic Regulator (LQR), and a driver model are employed, and CarSim, which reflects the features of a vehicle more realistically than MATLAB/Simulink, is used for reliable demonstration.

  17. Model of high-tech businesses management under the trends of explicit and implicit knowledge markets: classification and business model

    Directory of Open Access Journals (Sweden)

    Guzel Isayevna Gumerova

    2015-03-01

    Full Text Available Objective: to define the notion of "high-tech business"; to elaborate a classification of high-tech businesses; to elaborate a business model for high-tech business management. Methods: general scientific methods of theoretical and empirical cognition. Results: the research presents a business model of high-tech business management based on the trends of the explicit and implicit knowledge markets, with the implicit knowledge market dominating; a classification of high-tech businesses that takes into consideration the three types of economic activity; and possibilities to manage a high-tech business based on its market cost, technological innovation costs, and business indicators. Scientific novelty: the interpretation of the notion of "high-tech business" has been renewed; a classification of high-tech businesses has been elaborated for the first time, allocating three groups of enterprises. Practical value: theoretical significance lies in the development of the notional apparatus of high-tech business management; practical significance lies in grounding the necessity to manage enterprises, under the development of explicit and implicit knowledge markets in Russia, as a complex of capital and non-capital assets with the dominating indicators of "market value" and "life span of a company".

  18. Design and application of a technologically explicit hybrid energy-economy policy model with micro and macro economic dynamics

    Science.gov (United States)

    Bataille, Christopher G. F.

    2005-11-01

    autonomous energy efficiency indices (AEEI) from the model, parameters that could be used in long-run computable general equilibrium (CGE) analysis. The thesis concludes with a summary of the strengths and weaknesses of the new model as a policy tool, a work plan for its further improvement, and a discussion of the general potential for technologically explicit general equilibrium modelling.

  19. Explicit Solution of Reinsurance-Investment Problem for an Insurer with Dynamic Income under Vasicek Model

    Directory of Open Access Journals (Sweden)

    De-Lei Sheng

    2016-01-01

    Full Text Available Unlike traditionally used reserve models, this paper focuses on a reserve process with dynamic income to study the reinsurance-investment problem for an insurer under the Vasicek stochastic interest rate model. The insurer's dynamic income is given by the remainder after a dynamic reward budget is subtracted from the insurer's net premium, which is calculated according to the expected premium principle. Applying stochastic control techniques, a Hamilton-Jacobi-Bellman equation is established and the explicit solution is obtained under the objective of maximizing the insurer's power utility of terminal wealth. Some economic interpretations of the obtained results are explained in detail. In addition, numerical analyses and several graphics are given to illustrate our results in more detail.

  20. From explicit to implicit normal mode initialization of a limited-area model

    Energy Technology Data Exchange (ETDEWEB)

    Bijlsma, S.J.

    2013-02-15

    In this note the implicit normal mode initialization of a limited-area model is discussed from a different point of view. To that end it is shown that the equations describing explicit normal mode initialization, applied to the shallow water equations in differentiated form on the sphere, can readily be derived in normal mode space if the model equations are separable, but can be transformed into the implicit equations in physical space only in the case of stationary Rossby modes. This is a consequence of the simple relations between the components of the different modes in that case. In addition a simple eigenvalue problem is given for the frequencies of the gravity waves. (orig.)

  1. Nano-colloid electrophoretic transport: Fully explicit modelling via dissipative particle dynamics

    Science.gov (United States)

    Hassanzadeh Afrouzi, Hamid; Farhadi, Mousa; Sedighi, Kurosh; Moshfegh, Abouzar

    2018-02-01

    In the present study, a novel fully explicit approach using the dissipative particle dynamics (DPD) method is introduced for modelling electrophoretic transport of nano-colloids in an electrolyte solution. A Slater-type charge smearing function included in the 3D Ewald summation method is employed to treat electrostatic interactions. Moreover, the capabilities of different thermostats are challenged to control the system temperature and to study the dynamic response of the colloidal electrophoretic mobility under practical ranges of external electric field in nanoscale applications (0.072 600 in DPD units regardless of electric field intensity. The Nosé-Hoover-Lowe-Andersen and Lowe-Andersen thermostats are found to function more effectively under high electric fields (E > 0.145 [v/nm]) while thermal equilibrium is maintained. Reasonable agreement is achieved by benchmarking the radial distribution function against available electrolyte structure modellings, as well as by comparing the reduced mobility against the conventional Smoluchowski and Hückel theories and a numerical solution of the Poisson-Boltzmann equation.

  2. Charged patchy particle models in explicit salt: Ion distributions, electrostatic potentials, and effective interactions.

    Science.gov (United States)

    Yigit, Cemil; Heyda, Jan; Dzubiella, Joachim

    2015-08-14

    We introduce a set of charged patchy particle models (CPPMs) in order to systematically study the influence of electrostatic charge patchiness and multipolarity on macromolecular interactions by means of implicit-solvent, explicit-ion Langevin dynamics simulations employing the Gromacs software. We consider well-defined zero-, one-, and two-patched spherical globules, each of the same net charge and (nanometer) size, which are composed of discrete atoms. The studied mono- and multipole moments of the CPPMs are comparable to those of globular proteins of similar size. We first characterize ion distributions and electrostatic potentials around a single CPPM. Although angle-resolved radial distribution functions reveal the expected local accumulation and depletion of counter- and co-ions around the patches, respectively, the orientation-averaged electrostatic potential shows only a small variation among the various CPPMs due to space charge cancellations. Furthermore, we study the orientation-averaged potential of mean force (PMF), the number of accumulated ions on the patches, as well as the CPPM orientations along the center-to-center distance of a pair of CPPMs. We compare the PMFs to the classical Derjaguin-Landau-Verwey-Overbeek (DLVO) theory and previously introduced orientation-averaged Debye-Hückel pair potentials including dipolar interactions. Our simulations confirm the adequacy of the theories in their respective regimes of validity, while low salt concentrations and large multipolar interactions remain a challenge for tractable theoretical descriptions.
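    For orientation, the monopole term of the Debye-Hückel pair potentials mentioned above is a screened Coulomb (Yukawa) interaction. A minimal sketch in reduced units (monopole only, without the dipolar corrections the study includes; the Bjerrum length of water at room temperature, ~0.71 nm, is an assumed default):

    ```python
    import math

    def debye_hueckel_energy(r, z1, z2, kappa, l_bjerrum=0.71):
        """Screened Coulomb (Yukawa) pair energy in units of kT.
        r and the Bjerrum length are in nm; kappa is the inverse Debye
        length in 1/nm (kappa = 0 recovers the bare Coulomb limit)."""
        return z1 * z2 * l_bjerrum * math.exp(-kappa * r) / r
    ```

    Increasing the salt concentration raises kappa and damps the interaction exponentially, which is why low salt concentrations stress the theory hardest.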

  3. Explicit modeling of volatile organic compounds partitioning in the atmospheric aqueous phase

    Directory of Open Access Journals (Sweden)

    C. Mouchel-Vallon

    2013-01-01

    Full Text Available The gas-phase oxidation of organic species is a multigenerational process involving a large number of secondary compounds. Most secondary organic species are water-soluble multifunctional oxygenated molecules. The fully explicit chemical mechanism GECKO-A (Generator of Explicit Chemistry and Kinetics of Organics in the Atmosphere) is used to describe the oxidation of organics in the gas phase and their mass transfer to the aqueous phase. The oxidation of three hydrocarbons of atmospheric interest (isoprene, octane and α-pinene) is investigated for various NOx conditions. The simulated oxidative trajectories are examined in a new two-dimensional space defined by the mean oxidation state and the solubility. The amount of dissolved organic matter was found to be very low (yield less than 2% on a carbon atom basis) under a water content typical of deliquescent aerosols. For cloud water contents, 50% (isoprene oxidation) to 70% (octane oxidation) of the carbon atoms are found in the aqueous phase after the removal of the parent hydrocarbons for low NOx conditions. For high NOx conditions, this ratio is only 5% in the isoprene oxidation case, but remains large for the α-pinene and octane oxidation cases (40% and 60%, respectively). Although the model does not yet include chemical reactions in the aqueous phase, much of this dissolved organic matter should be processed in cloud drops, modifying both oxidation rates and the speciation of organic species.
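    The gas-aqueous partitioning behind such yields follows Henry's law equilibrium: the aqueous fraction of a species depends on its effective Henry's law constant and the liquid water content of the air. A minimal sketch of that standard relation (the numerical inputs in the usage are illustrative, not values from the study):

    ```python
    def aqueous_fraction(h_eff, lwc, temp=298.15):
        """Equilibrium fraction of a gas-phase species dissolved in the
        aqueous phase. h_eff: effective Henry's law constant (M atm-1);
        lwc: dimensionless liquid water volume fraction (m3 water per
        m3 air, ~1e-7 for clouds, far smaller for deliquescent aerosols)."""
        R = 0.08205  # gas constant, L atm mol-1 K-1
        x = h_eff * R * temp * lwc  # dimensionless partitioning factor
        return x / (1.0 + x)
    ```

    The strong contrast the abstract reports between aerosol and cloud water contents falls out of this relation: shrinking lwc by several orders of magnitude drives the aqueous fraction toward zero for all but the most soluble species.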

  4. Integrating a Decision Management Tool with UML Modeling Tools

    DEFF Research Database (Denmark)

    Könemann, Patrick

    by proposing potential subsequent design issues. In model-based software development, many decisions directly affect the structural and behavioral models used to describe and develop a software system and its architecture. However, these decisions are typically not connected to the models created during...... integration of formerly disconnected tools improves tool usability as well as decision maker productivity....

  5. Comparison of Explicitly Simulated and Downscaled Tropical Cyclone Activity in a High-Resolution Global Climate Model

    Directory of Open Access Journals (Sweden)

    Hirofumi Tomita

    2010-01-01

    Full Text Available The response of tropical cyclone activity to climate change is a matter of great inherent interest and practical importance. Most current global climate models are not, however, capable of adequately resolving tropical cyclones; this has led to the development of downscaling techniques designed to infer tropical cyclone activity from the large-scale fields produced by climate models. Here we compare the statistics of tropical cyclones simulated explicitly in a very high resolution (~14 km grid mesh global climate model to the results of one such downscaling technique driven by the same global model. This is done for a simulation of the current climate and also for a simulation of a climate warmed by the addition of carbon dioxide. The explicitly simulated and downscaled storms are similarly distributed in space, but the intensity distribution of the downscaled events has a somewhat longer high-intensity tail, owing to the higher resolution of the downscaling model. Both explicitly simulated and downscaled events show large increases in the frequency of events at the high-intensity ends of their respective intensity distributions, but the downscaled storms also show increases in low-intensity events, whereas the explicitly simulated weaker events decline in number. On the regional scale, there are large differences in the responses of the explicitly simulated and downscaled events to global warming. In particular, the power dissipation of downscaled events shows a 175% increase in the Atlantic, while the power dissipation of explicitly simulated events declines there.

  6. A novel explicit approach to model bromide and pesticide transport in soils containing macropores

    Science.gov (United States)

    Klaus, J.; Zehe, E.

    2011-01-01

    The present study tests whether an explicit treatment of worm burrows is feasible for simulating water flow, bromide and pesticide transport in structured heterogeneous soils. The essence is to represent worm burrows as morphologically connected paths of low flow resistance in the spatially highly resolved model domain. A recent Monte Carlo study (Klaus and Zehe, 2010) revealed that this approach allowed successful reproduction of tile drain event discharge recorded during an irrigation experiment at a tile drained field site. However, several "hillslope architectures" that were all consistent with the available extensive data base allowed a good reproduction of tile drain flow response. Our second objective was thus to find out whether this "equifinality" in spatial model setups may be reduced when including bromide tracer data in the model falsification process. We thus simulated transport of bromide and Isoproturon (IPU) for the 13 spatial model setups that performed best with respect to reproducing tile drain event discharge, without any further calibration. All model setups allowed a very good prediction of the temporal dynamics of cumulated bromide leaching into the tile drain, while only four of them matched the accumulated water balance and accumulated bromide loss into the tile drain. The number of behavioural model architectures could thus be reduced to four. One of those setups was used for simulating transport of IPU, using different parameter combinations to characterise adsorption according to the Footprint database. Simulations could, however, only reproduce the observed leaching behaviour when we allowed for retardation coefficients that were very close to one.

  7. Three Dimensional Explicit Model for Cometary Tail Ions Interactions with Solar Wind

    Science.gov (United States)

    Al Bermani, M. J. F.; Alhamed, S. A.; Khalaf, S. Z.; Ali, H. Sh.; Selman, A. A.

    2009-06-01

    The different interactions between cometary tail and solar wind ions are studied in the present paper based on the three-dimensional Lax explicit method. The model used in this research is based on the continuity equations describing the cometary tail-solar wind interactions, considered here in three dimensions. Simulation of the physical system was achieved using a computer code written in Matlab 7.0. The parameters studied here assume a Halley-type comet and include the particle density rho, the particle velocity v, the magnetic field strength B, the dynamic pressure p and the internal energy E. The results of the present research showed that the interaction near the cometary nucleus is mainly affected by the new ions added to the plasma of the solar wind, which increase the average molecular weight and result in many unique characteristics of the cometary tail. These characteristics were explained in the presence of the IMF.

  8. The Importance of Representing Certain Key Vegetation Canopy Processes Explicitly in a Land Surface Model

    Science.gov (United States)

    Napoly, A.; Boone, A. A.; Martin, E.; Samuelsson, P.

    2015-12-01

    Land surface models are moving to more detailed vegetation canopy descriptions in order to better represent certain key processes, such as carbon dynamics and snowpack evolution. Since such models are usually applied within coupled numerical weather prediction or spatially distributed hydrological models, these improvements must strike a balance between computational cost and complexity. The consequences of simplified or composite canopy approaches can be manifested in terms of increased errors with respect to soil temperatures, estimates of the diurnal cycle of the turbulent fluxes, or snow canopy interception and melt. Vegetated areas, and particularly forests, are modeled in a quite simplified manner in the ISBA land surface model. However, continuing development of surface processes now requires a more accurate description of the canopy. A new version of the model now includes a multi-energy balance (MEB) option to explicitly represent the canopy and the forest floor. It will be shown that certain newly included processes, such as the shading effect of the vegetation, the explicit heat capacity of the canopy, and the insulating effect of the forest floor, turn out to be essential. A detailed study has been done for four French forested sites. It was found that the MEB option significantly improves the ground heat flux (RMSE decrease from 50 W/m2 to 10 W/m2 on average) and soil temperatures when compared against measurements. The sensible heat flux calculation was also improved, primarily owing to a better phasing with the solar insolation due to a lower vegetation heat capacity. However, the total latent heat flux is less modified compared to the classical ISBA simulation, since it is more related to water uptake and the formulation of the stomatal resistance (which are unchanged). Next, a benchmark over 40 Fluxnet sites (116 cumulated years) was performed and compared with results from the default composite soil-vegetation version of ISBA. The results show

  9. A risk explicit interval linear programming model for uncertainty-based environmental economic optimization in the Lake Fuxian watershed, China.

    Science.gov (United States)

    Zhang, Xiaoling; Huang, Kai; Zou, Rui; Liu, Yong; Yu, Yajuan

    2013-01-01

    The conflict between water environment protection and economic development has brought severe water pollution and restricted sustainable development in the watershed. A risk explicit interval linear programming (REILP) method was used to solve the integrated watershed environmental-economic optimization problem. Interval linear programming (ILP) and REILP models for uncertainty-based environmental economic optimization at the watershed scale were developed for the management of the Lake Fuxian watershed, China. Scenario analysis was introduced into the model solution process to ensure the practicality and operability of the optimization schemes. Decision makers' preferences for risk levels can be expressed by inputting different discrete aspiration level values into the REILP model in three periods under two scenarios. By balancing the optimal system returns and corresponding system risks, decision makers can develop an efficient industrial restructuring scheme based directly on the window of "low risk and high return efficiency" in the trade-off curve. The representative schemes at the turning points of the two scenarios were interpreted and compared to identify a preferable planning alternative, which has relatively low risks and nearly maximum benefits. This study provides new insights and proposes a tool, REILP, for decision makers to develop an effective environmental economic optimization scheme in integrated watershed management.

  10. A Risk Explicit Interval Linear Programming Model for Uncertainty-Based Environmental Economic Optimization in the Lake Fuxian Watershed, China

    Directory of Open Access Journals (Sweden)

    Xiaoling Zhang

    2013-01-01

    Full Text Available The conflict between water environment protection and economic development has brought severe water pollution and restricted sustainable development in the watershed. A risk explicit interval linear programming (REILP) method was used to solve the integrated watershed environmental-economic optimization problem. Interval linear programming (ILP) and REILP models for uncertainty-based environmental economic optimization at the watershed scale were developed for the management of the Lake Fuxian watershed, China. Scenario analysis was introduced into the model solution process to ensure the practicality and operability of the optimization schemes. Decision makers’ preferences for risk levels can be expressed by inputting different discrete aspiration level values into the REILP model in three periods under two scenarios. By balancing the optimal system returns and corresponding system risks, decision makers can develop an efficient industrial restructuring scheme based directly on the window of “low risk and high return efficiency” in the trade-off curve. The representative schemes at the turning points of the two scenarios were interpreted and compared to identify a preferable planning alternative, which has relatively low risks and nearly maximum benefits. This study provides new insights and proposes a tool, REILP, for decision makers to develop an effective environmental economic optimization scheme in integrated watershed management.
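    Interval linear programming of the kind underlying REILP is commonly handled by decomposing the interval model into an optimistic and a pessimistic submodel, whose optima bound the system return. A deliberately minimal one-variable sketch of that decomposition in closed form (an illustration of the ILP idea only, not the REILP risk formulation of the study):

    ```python
    def interval_lp_1d(c_lo, c_hi, a_lo, a_hi, b_lo, b_hi):
        """Best/worst-case submodels of the 1-D interval LP
        maximize c*x subject to a*x <= b, x >= 0, with c in [c_lo, c_hi],
        a in [a_lo, a_hi], b in [b_lo, b_hi] and all bounds positive.
        The optimum of each crisp submodel is x* = b / a."""
        f_best = c_hi * (b_hi / a_lo)   # optimistic: loose constraint, high return
        f_worst = c_lo * (b_lo / a_hi)  # pessimistic: tight constraint, low return
        return f_worst, f_best

    bounds = interval_lp_1d(2.0, 3.0, 1.0, 2.0, 4.0, 6.0)  # -> (4.0, 18.0)
    ```

    The width of this interval is what REILP then trades off against risk: aspiration levels pick operating points between the pessimistic and optimistic extremes.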

  11. Explicit simulation of ice particle habits in a Numerical Weather Prediction Model

    Science.gov (United States)

    Hashino, Tempei

    2007-05-01

    This study developed a scheme for explicit simulation of ice particle habits in Numerical Weather Prediction (NWP) models. The scheme is called the Spectral Ice Habit Prediction System (SHIPS), and the goal is to retain the growth history of ice particles in the Eulerian dynamics framework. It diagnoses characteristics of ice particles based on a series of particle property variables (PPVs) that reflect the history of microphysical processes and the transport between mass bins and air parcels in space. Therefore, the categorization of ice particles typically used in bulk microphysical parameterizations and traditional bin models is not necessary, so that errors that stem from the categorization can be avoided. SHIPS predicts polycrystals as well as hexagonal monocrystals based on empirically derived habit frequency and growth rates, and simulates the habit-dependent aggregation and riming processes by use of the stochastic collection equation with predicted PPVs. Idealized two-dimensional simulations were performed with SHIPS in a NWP model. The predicted spatial distribution of ice particle habits and types, and the evolution of particle size distributions, showed good quantitative agreement with observations. This comprehensive model of ice particle properties, distributions, and evolution in clouds can be used to better understand problems facing a wide range of research disciplines, including microphysical processes, radiative transfer in a cloudy atmosphere, data assimilation, and weather modification.

  12. A spatially explicit model for the future progression of the current Haiti cholera epidemic

    Science.gov (United States)

    Bertuzzo, E.; Mari, L.; Righetto, L.; Gatto, M.; Casagrandi, R.; Rodriguez-Iturbe, I.; Rinaldo, A.

    2011-12-01

    As a major cholera epidemic progresses in Haiti, and the figures of the infection, up to July 2011, climb to 385,000 cases and 5,800 deaths, the development of general models to track and predict the evolution of the outbreak, so as to guide the allocation of medical supplies and staff, is gaining notable urgency. We propose here a spatially explicit epidemic model that accounts for the dynamics of susceptible and infected individuals as well as the redistribution of Vibrio cholerae, the causative agent of the disease, among different human communities. In particular, we model two spreading pathways: the advection of pathogens through hydrologic connections and the dissemination due to human mobility, described by means of a gravity-like model. To this end the country has been divided into hydrologic units based on drainage directions derived from a digital terrain model. Moreover, the population of each unit has been estimated from census data downscaled to 1 km x 1 km resolution via remotely sensed geomorphological information (LandScan™ project). The model directly accounts for the role of rainfall patterns in driving the seasonality of cholera outbreaks. The two main outbreaks in fact occurred during the rainy seasons (October and May), when extensive floodings severely worsened the sanitation conditions and, in turn, raised the risk of infection. The model's capability to reproduce the spatiotemporal features of the epidemic to date lends robustness to the projected future developments. In this context, the duration of acquired immunity, a hotly debated topic in the scientific community, emerges as a controlling factor for the progression of the epidemic in the near future. The framework presented here can straightforwardly be used to evaluate the effectiveness of alternative intervention strategies like mass vaccinations, clean water supply and educational campaigns, thus emerging as an essential component of the control of future cholera
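    A gravity-like mobility model of the kind mentioned above couples communities in proportion to the product of their populations, damped by a power of the distance between them. A minimal sketch (the deterrence exponent of 2 and the row normalization are illustrative assumptions, not the calibrated values of the study):

    ```python
    import numpy as np

    def gravity_fluxes(pop, dist, deterrence=2.0):
        """Gravity-like human mobility: coupling between communities i and j
        proportional to P_i * P_j / d_ij**deterrence, zero on the diagonal,
        rows normalized to give the probability of moving from i to j."""
        P = np.asarray(pop, dtype=float)
        d = np.asarray(dist, dtype=float)
        with np.errstate(divide="ignore"):  # d_ii = 0 produces inf, zeroed below
            flux = np.outer(P, P) / d ** deterrence
        np.fill_diagonal(flux, 0.0)
        return flux / flux.sum(axis=1, keepdims=True)

    # Three communities on a line at unit spacing, equal populations.
    M = gravity_fluxes([1.0, 1.0, 1.0], [[0, 1, 2], [1, 0, 1], [2, 1, 0]])
    ```

    In the epidemic model, such a matrix redistributes both infective contacts and pathogens among hydrologic units alongside the downstream hydrologic transport.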

  13. Explicit Nonlinear Model Predictive Control for a Saucer-Shaped Unmanned Aerial Vehicle

    Directory of Open Access Journals (Sweden)

    Zhihui Xing

    2013-01-01

    Full Text Available A lifting body unmanned aerial vehicle (UAV) generates lift with its body and shows many significant advantages due to its particular shape, such as huge loading space, small wetted area, high-strength fuselage structure, and large lifting area. However, designing the control law for a lifting body UAV is quite challenging because it has strong nonlinearity and coupling, and usually lacks rudders. In this paper, an explicit nonlinear model predictive control (ENMPC) strategy is employed to design a control law for a saucer-shaped UAV which can be adequately modeled with a rigid 6-degrees-of-freedom (DOF) representation. In ENMPC, the control signal is calculated by approximating the tracking error in the receding horizon by its Taylor-series expansion to any specified order. This retains the advantages of nonlinear model predictive control while eliminating the time-consuming online optimization. The simulation results show that ENMPC is an appropriate strategy for controlling lifting body UAVs and can compensate for the insufficient control surface area.

  14. Predicting drought propagation within peat layers using a three dimensionally explicit voxel based model

    Science.gov (United States)

    Condro, A. A.; Pawitan, H.; Risdiyanto, I.

    2018-05-01

    Peatlands are very vulnerable to widespread fires during dry seasons, due to the availability of aboveground fuel biomass on the surface and belowground fuel biomass in the sub-surface. Hence, understanding drought propagation within peat layers is crucial for disaster mitigation activities on peatlands. Using a three-dimensionally explicit voxel-based model of peatland hydrology, this study predicted time lags for drought propagation into sub-surface peat layers, after drought occurrence on the surface, of about 1 month during La-Niña and 2.5 months during El-Niño. The study was carried out in a high-conservation-value area of an oil palm plantation in West Kalimantan. The validity of the model was evaluated and its applicability for disaster mitigation was discussed. Animations of the simulated voxels are available at goo.gl/HDRMYN (El-Niño 2015 episode) and goo.gl/g1sXPl (La-Niña 2016 episode). The model is available at goo.gl/RiuMQz.

  15. A stage-structured, spatially explicit migration model for Myotis bats: mortality location affects system dynamics

    Science.gov (United States)

    Erickson, Richard A.; Thogmartin, Wayne E.; Russell, Robin E.; Diffendorfer, James E.; Szymanski, Jennifer A.

    2014-01-01

    Bats are ecologically and economically important species because they consume insects, transport nutrients, and pollinate flowers. Many species of bats, including those in the genus Myotis, are facing population decline and increased extinction risk. Despite these conservation concerns, few models exist for providing insight into the population dynamics of bats in a spatially explicit context. We developed a model for bats by combining the stage-structured colonial life history of Myotis bats with their annual migration behavior. This model provided insight into network dynamics. We specifically focused on two Myotis species living in the eastern United States: the Indiana bat (M. sodalis), which is a Federally listed endangered species, and the little brown bat (M. lucifugus), which is under consideration for listing as an endangered species. We found that multiple equilibria exist for the local, migratory subpopulations even though the total population was constant. These equilibria suggest that the location and magnitude of stressors such as White-nose Syndrome, meteorological phenomena, or impacts of wind turbines on survival influence system dynamics and the risk of population extirpation in difficult-to-predict ways.
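    The stage-structured core of such a model is a matrix projection: a stage matrix advances the vector of juveniles and adults one year at a time, and migration couples the local matrices into a network. A minimal single-colony sketch with hypothetical vital rates (the two-stage matrix below is illustrative, not a parameterization from the study):

    ```python
    import numpy as np

    def project(stage_matrix, n0, years):
        """Project a stage-structured population, n_{t+1} = A n_t."""
        A = np.asarray(stage_matrix, dtype=float)
        n = np.asarray(n0, dtype=float)
        for _ in range(years):
            n = A @ n
        return n

    # Hypothetical two-stage (juvenile, adult) Lefkovitch matrix:
    # adults produce 0.8 female pups per year, juveniles reach
    # adulthood with probability 0.4, adult annual survival is 0.85.
    A = np.array([[0.0, 0.8],
                  [0.4, 0.85]])
    n10 = project(A, [100.0, 200.0], 10)
    ```

    Stressors such as White-nose Syndrome enter by perturbing the survival entries of A at particular network nodes, which is how mortality location can shift the system between equilibria.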

  16. Spatially Explicit Estimation of Optimal Light Use Efficiency for Improved Satellite Data Driven Ecosystem Productivity Modeling

    Science.gov (United States)

    Madani, N.; Kimball, J. S.; Running, S. W.

    2014-12-01

    Remote sensing based light use efficiency (LUE) models, including the MODIS (MODerate resolution Imaging Spectroradiometer) MOD17 algorithm, are commonly used for regional estimation and monitoring of vegetation gross primary production (GPP) and photosynthetic carbon (CO2) uptake. A common model assumption is that plants in a biome matrix operate at their photosynthetic capacity under optimal climatic conditions. A prescribed biome maximum light use efficiency parameter defines the maximum photosynthetic carbon conversion rate under prevailing climate conditions and is a large source of model uncertainty. Here, we used tower (FLUXNET) eddy covariance measurement based carbon flux data for estimating optimal LUE (LUEopt) over a North American domain. LUEopt was first estimated using tower-observed daily carbon fluxes, meteorology and satellite (MODIS) observed fraction of photosynthetically active radiation (FPAR). LUEopt was then spatially interpolated over the domain using empirical models derived from independent geospatial data including global plant traits, surface soil moisture, terrain aspect, land cover type and percent tree cover. The derived LUEopt maps were then used as primary inputs to the MOD17 LUE algorithm for regional GPP estimation; these results were evaluated against tower observations and alternate MOD17 GPP estimates determined using biome-specific LUEopt constants. Estimated LUEopt shows large spatial variability within and among different land cover classes indicated from a sparse North American tower network. Leaf nitrogen content and soil moisture are two important factors explaining LUEopt spatial variability. GPP estimated from spatially explicit LUEopt inputs shows significantly improved model accuracy against independent tower observations (R2 = 0.76; Mean RMSE plant trait information can explain spatial heterogeneity in LUEopt, leading to improved GPP estimates from satellite based LUE models.
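    The LUE model core used here multiplies a maximum light use efficiency by environmental stress scalars and absorbed radiation. A minimal sketch of that MOD17-style relation (the scalar inputs in the usage are illustrative values, not results from the study):

    ```python
    def gpp_lue(par, fpar, lue_max, f_tmin, f_vpd):
        """MOD17-style light use efficiency model:
        GPP = LUE_max * f(Tmin) * f(VPD) * FPAR * PAR.
        par: incident PAR (MJ m-2 d-1); fpar: fraction absorbed (0-1);
        lue_max: maximum LUE (g C MJ-1); f_tmin, f_vpd: 0-1 stress
        scalars for minimum temperature and vapor pressure deficit.
        Returns GPP in g C m-2 d-1."""
        return lue_max * f_tmin * f_vpd * fpar * par

    gpp = gpp_lue(par=10.0, fpar=0.5, lue_max=1.2, f_tmin=1.0, f_vpd=0.8)  # -> 4.8
    ```

    The study's contribution sits entirely in the lue_max term: replacing a biome-wide constant with a spatially interpolated LUEopt map while leaving the rest of the algorithm unchanged.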

  17. A novel explicit approach to model bromide and pesticide transport in connected soil structures

    Directory of Open Access Journals (Sweden)

    J. Klaus

    2011-07-01

    Full Text Available The present study tests whether an explicit treatment of worm burrows and tile drains as connected structures is feasible for simulating water flow, bromide and pesticide transport in structured heterogeneous soils at hillslope scale. The essence is to represent worm burrows as morphologically connected paths of low flow resistance in a hillslope model. A recent Monte Carlo study (Klaus and Zehe, 2010, Hydrological Processes, 24, p. 1595–1609) revealed that this approach allowed successful reproduction of tile drain event discharge recorded during an irrigation experiment at a tile drained field site. However, several "hillslope architectures" that were all consistent with the available extensive data base allowed a good reproduction of tile drain flow response. Our second objective was thus to find out whether this "equifinality" in spatial model setups may be reduced when including bromide tracer data in the model falsification process. We thus simulated transport of bromide for the 13 spatial model setups that performed best with respect to reproducing tile drain event discharge, without any further calibration. All model setups allowed a very good prediction of the temporal dynamics of cumulated bromide leaching into the tile drain, while only four of them matched the accumulated water balance and accumulated bromide loss into the tile drain. The number of behavioural model architectures could thus be reduced to four. One of those setups was used for simulating transport of Isoproturon, using different parameter combinations to characterise adsorption according to the Footprint database. Simulations could, however, only reproduce the observed leaching behaviour when we allowed for retardation coefficients that were very close to one.

  18. A novel explicit approach to model bromide and pesticide transport in connected soil structures

    Science.gov (United States)

    Klaus, J.; Zehe, E.

    2011-07-01

    The present study tests whether an explicit treatment of worm burrows and tile drains as connected structures is feasible for simulating water flow, bromide and pesticide transport in structured heterogeneous soils at hillslope scale. The essence is to represent worm burrows as morphologically connected paths of low flow resistance in a hillslope model. A recent Monte Carlo study (Klaus and Zehe, 2010, Hydrological Processes, 24, p. 1595-1609) revealed that this approach allowed successful reproduction of tile drain event discharge recorded during an irrigation experiment at a tile drained field site. However, several "hillslope architectures" that were all consistent with the available extensive data base allowed a good reproduction of tile drain flow response. Our second objective was thus to find out whether this "equifinality" in spatial model setups may be reduced when including bromide tracer data in the model falsification process. We thus simulated transport of bromide for the 13 spatial model setups that performed best with respect to reproducing tile drain event discharge, without any further calibration. All model setups allowed a very good prediction of the temporal dynamics of cumulated bromide leaching into the tile drain, while only four of them matched the accumulated water balance and accumulated bromide loss into the tile drain. The number of behavioural model architectures could thus be reduced to four. One of those setups was used for simulating transport of Isoproturon, using different parameter combinations to characterise adsorption according to the Footprint database. Simulations could, however, only reproduce the observed leaching behaviour when we allowed for retardation coefficients that were very close to one.

  19. Modeling the oxidation of ebselen and other organoselenium compounds using explicit solvent networks.

    Science.gov (United States)

    Bayse, Craig A; Antony, Sonia

    2009-05-14

    The oxidation of dimethylselenide, dimethyldiselenide, S-methylselenenyl-methylmercaptan, and truncated and full models of ebselen (N-phenyl-1,2-benzisoselenazol-3(2H)-one) by methyl hydrogen peroxide has been modeled using density functional theory (DFT) and solvent-assisted proton exchange (SAPE), a method of microsolvation that employs explicit solvent networks to facilitate proton transfer reactions. The calculated activation barriers for these systems were substantially lower in energy (DeltaG(double dagger) + DeltaG(solv) = 13 to 26 kcal/mol) than models that neglect the participation of solvent in proton exchange. The comparison of two- and three-water SAPE networks showed a reduction in the strain in the model system but without a substantial reduction in the activation barriers. Truncating the ebselen model to N-methylisoselenazol-3(2H)-one gave a larger activation barrier than ebselen or N-methyl-1,2-benzisoselenazol-3(2H)-one but provided an efficient means of determining an initial guess for larger transition-state models. The similar barriers obtained for ebselen and Me(2)Se(2) (DeltaG(double dagger) + DeltaG(solv) = 20.65 and 20.40 kcal/mol, respectively) were consistent with experimentally determined rate constants. The activation barrier for MeSeSMe (DeltaG(double dagger) + DeltaG(solv) = 21.25 kcal/mol) was similar to that of ebselen and Me(2)Se(2) despite its significantly lower experimental rate for oxidation of an ebselen selenenyl sulfide by hydrogen peroxide relative to ebselen and ebselen diselenide. The disparity is attributed to intramolecular Se-O interactions, which decrease the nucleophilicity of the selenium center of the selenenyl sulfide.

  20. Biomass supply from alternative cellulosic crops and crop residues: A spatially explicit bioeconomic modeling approach

    International Nuclear Information System (INIS)

    Egbendewe-Mondzozo, Aklesso; Swinton, Scott M.; Izaurralde, César R.; Manowitz, David H.; Zhang, Xuesong

    2011-01-01

    This paper introduces a spatially-explicit bioeconomic model for the study of potential cellulosic biomass supply. For biomass crops to begin to replace current crops, farmers must earn more from them than from current crops. Using weather, topographic and soil data, the terrestrial ecosystem model, EPIC, dynamically simulates multiple cropping systems that vary by crop rotation, tillage, fertilization and residue removal rate. EPIC generates predicted crop yield and environmental outcomes over multiple watersheds. These EPIC results are used to parameterize a regional profit-maximization mathematical programming model that identifies profitable cropping system choices. The bioeconomic model is calibrated to 2007–09 crop production in a 9-county region of southwest Michigan. A simulation of biomass supply in response to rising biomass prices shows that cellulosic residues from corn stover and wheat straw begin to be supplied at minimum delivered biomass:corn grain price ratios of 0.15 and 0.18, respectively. At the mean corn price of $162.6/Mg ($4.13 per bushel) at commercial moisture content during 2007–2009, these ratios correspond to stover and straw prices of $24 and $29 per dry Mg. Perennial bioenergy crops begin to be supplied at price levels 2–3 times higher. Average biomass transport costs to the biorefinery plant range from $6 to $20/Mg. Compared to conventional crop production practices in the area, biomass supply from annual crop residues increased greenhouse gas emissions and reduced water quality through increased nutrient loss. By contrast, perennial cellulosic biomass crop production reduced greenhouse gas emissions and improved water quality. -- Highlights: ► A new bioeconomic model predicts biomass supply and its environmental impacts. ► The model captures the opportunity cost of switching to new cellulosic crops. ► Biomass from crop residues is supplied at lower biomass price than cellulosic crops. ► Biomass from cellulosic crops has

  1. Explicit modelling of SOA formation from α-pinene photooxidation: sensitivity to vapour pressure estimation

    Directory of Open Access Journals (Sweden)

    R. Valorso

    2011-07-01

    Full Text Available The sensitivity of the formation of secondary organic aerosol (SOA) to the estimated vapour pressures of the condensable oxidation products is explored. A highly detailed reaction scheme was generated for α-pinene photooxidation using the Generator for Explicit Chemistry and Kinetics of Organics in the Atmosphere (GECKO-A). Vapour pressures (Pvap) were estimated with three commonly used structure-activity relationships. The values of Pvap were compared for the set of secondary species generated by GECKO-A to describe α-pinene oxidation. Discrepancies in the predicted vapour pressures were found to increase with the number of functional groups borne by the species. For semi-volatile organic compounds (i.e. organic species of interest for SOA formation), differences in the predicted Pvap range from a factor of 5 to 200 on average. The simulated SOA concentrations were compared to SOA observations in the Caltech chamber during three experiments performed under a range of NOx conditions. While the model captures the qualitative features of SOA formation for the chamber experiments, SOA concentrations are systematically overestimated. For the conditions simulated, the modelled SOA speciation appears to be rather insensitive to the Pvap estimation method.
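The practical consequence of a factor of 5 to 200 spread in estimated Pvap can be sketched with standard absorptive-partitioning theory (this is not part of GECKO-A itself, and the numbers are illustrative assumptions): a compound's particle-phase fraction is F = 1/(1 + C*/C_OA), and its saturation concentration C* scales linearly with the estimated vapour pressure.

```python
# Hedged sketch: how vapour-pressure spread propagates into gas/particle
# partitioning. C* (saturation concentration) is proportional to Pvap, and
# the particle-phase fraction is F = 1 / (1 + C*/C_OA). Values illustrative.
def particle_fraction(c_star, c_oa):
    """Equilibrium particle-phase fraction in absorptive partitioning."""
    return 1.0 / (1.0 + c_star / c_oa)

c_oa = 10.0          # ug/m^3 of absorbing organic aerosol (assumed)
c_star_base = 1.0    # ug/m^3 from one structure-activity relationship (assumed)

# Multiplying Pvap (hence C*) by the factors reported between SAR methods:
fractions = {f: particle_fraction(c_star_base * f, c_oa)
             for f in (1.0, 5.0, 200.0)}
```

A factor of 200 in Pvap moves this hypothetical compound from roughly 91% in the particle phase to roughly 5%, which is why SOA mass predictions hinge on the Pvap estimation method.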

  2. Modelling rapid subsurface flow at the hillslope scale with explicit representation of preferential flow paths

    Science.gov (United States)

    Wienhöfer, J.; Zehe, E.

    2012-04-01

    Rapid lateral flow processes via preferential flow paths are widely accepted to play a key role for rainfall-runoff response in temperate humid headwater catchments. A quantitative description of these processes, however, is still a major challenge in hydrological research, not least because detailed information about the architecture of subsurface flow paths is often impossible to obtain at a natural site without disturbing the system. Our study combines physically based modelling and field observations with the objective to better understand how flow network configurations influence the hydrological response of hillslopes. The system under investigation is a forested hillslope with a small perennial spring at the study area Heumöser, a headwater catchment of the Dornbirnerach in Vorarlberg, Austria. In-situ point measurements of field-saturated hydraulic conductivity and dye staining experiments at the plot scale revealed that shrinkage cracks and biogenic macropores function as preferential flow paths in the fine-textured soils of the study area, and these preferential flow structures were active in fast subsurface transport of artificial tracers at the hillslope scale. For modelling of water and solute transport, we followed the approach of implementing preferential flow paths as spatially explicit structures of high hydraulic conductivity and low retention within the 2D process-based model CATFLOW. Many potential configurations of the flow path network were generated as realisations of a stochastic process informed by macropore characteristics derived from the plot scale observations. Together with different realisations of soil hydraulic parameters, this approach results in a Monte Carlo study. The model setups were used for short-term simulation of a sprinkling and tracer experiment, and the results were evaluated against measured discharges and tracer breakthrough curves. 
Although both criteria were taken for model evaluation, still several model setups

  3. SPATIALLY EXPLICIT MICRO-LEVEL MODELLING OF LAND USE CHANGE AT THE RURAL-URBAN INTERFACE. (R828012)

    Science.gov (United States)

    This paper describes micro-economic models of land use change applicable to the rural–urban interface in the US. Use of a spatially explicit micro-level modelling approach permits the analysis of regional patterns of land use as the aggregate outcomes of many, disparate...

  4. Prediction of strongly-heated gas flows in a vertical tube using explicit algebraic stress/heat-flux models

    International Nuclear Information System (INIS)

    Baek, Seong Gu; Park, Seung O.

    2003-01-01

    This paper provides the assessment of prediction performance of explicit algebraic stress and heat-flux models under conditions of mixed convective gas flows in a strongly-heated vertical tube. Two explicit algebraic stress models and four algebraic heat-flux models are selected for assessment. Eight combinations of explicit algebraic stress and heat-flux models are used in predicting the flows experimentally studied by Shehata and McEligot (IJHMT 41 (1998) p. 4333) in which property variation was significant. Among the various model combinations, the combination of the Wallin and Johansson (JFM 403 (2000) p. 89) explicit algebraic stress model with the Abe, Kondo and Nagano (IJHFF 17 (1996) p. 228) algebraic heat-flux model is found to perform best. We also found that the dimensionless wall distance y+ should be calculated based on the local property rather than the property at the wall for property-variation flows. When the buoyancy or the property variation effects are so strong that the flow may relaminarize, the choice of the basic platform two-equation model is a most important factor in improving the predictions.

  5. Human Mobility Patterns and Cholera Epidemics: a Spatially Explicit Modeling Approach

    Science.gov (United States)

    Mari, L.; Bertuzzo, E.; Righetto, L.; Casagrandi, R.; Gatto, M.; Rodriguez-Iturbe, I.; Rinaldo, A.

    2010-12-01

    Cholera is an acute enteric disease caused by the ingestion of water or food contaminated by the bacterium Vibrio cholerae. Although most infected individuals do not develop severe symptoms, their stool may contain huge quantities of V. cholerae cells. Therefore, while traveling or commuting, asymptomatic carriers can be responsible for the long-range dissemination of the disease. As a consequence, human mobility is an alternative and efficient driver for the spread of cholera, whose primary propagation pathway is hydrological transport through river networks. We present a multi-layer network model that accounts for the interplay between epidemiological dynamics, hydrological transport and long-distance dissemination of V. cholerae due to human movement. In particular, building on top of state-of-the-art spatially explicit models for cholera spread through surface waters, we describe human movement and its effects on the propagation of the disease by means of a gravity-model approach borrowed from transportation theory. Gravity-like contact processes have been widely used in epidemiology, because they can satisfactorily depict human movement when data on actual mobility patterns are not available. We test our model against epidemiological data recorded during the cholera outbreak that occurred in the KwaZulu-Natal province of South Africa during 2000-2001. We show that human mobility does actually play an important role in the formation of the spatiotemporal patterns of cholera epidemics. In particular, long-range human movement may determine inter-catchment dissemination of V. cholerae cells, thus in turn explaining the emergence of epidemic patterns that cannot be produced by hydrological transport alone. We also show that particular attention has to be devoted to studying how heterogeneously distributed drinking water supplies and sanitation conditions may affect cholera transmission.
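A gravity-model contact matrix of the kind described above can be sketched as follows (a generic illustration, not the authors' calibrated model; the population sizes, distances and deterrence exponent are made-up):

```python
import numpy as np

def gravity_weights(pop, dist, deterrence=2.0):
    """Row-stochastic gravity-model matrix: the probability that an individual
    from community i travels to community j grows with the destination
    population and decays with a power of the distance."""
    n = len(pop)
    W = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j:
                W[i, j] = pop[j] / dist[i, j] ** deterrence
        W[i] /= W[i].sum()  # normalise each row to a probability distribution
    return W

pop = np.array([1000.0, 500.0, 200.0])   # community sizes (assumed)
dist = np.array([[0.0, 10.0, 20.0],      # pairwise distances, km (assumed)
                 [10.0, 0.0, 15.0],
                 [20.0, 15.0, 0.0]])
W = gravity_weights(pop, dist)
```

In a multi-layer epidemic model, row i of W would redistribute a fraction of the force of infection generated by community i among the communities its members visit.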

  6. Large eddy simulations of round free jets using explicit filtering with/without dynamic Smagorinsky model

    International Nuclear Information System (INIS)

    Bogey, Christophe; Bailly, Christophe

    2006-01-01

    Large eddy simulations (LES) of round free jets at Mach number M = 0.9 with Reynolds numbers over the range 2.5 × 10^3 ≤ Re_D ≤ 4 × 10^5 are performed using explicit selective/high-order filtering with or without dynamic Smagorinsky model (DSM). Features of the flows and of the turbulent kinetic energy budgets in the turbulent jets are reported. The contributions of molecular viscosity, filtering and DSM to energy dissipation are also presented. Using filtering alone, the results are independent of the filtering strength, and the effects of the Reynolds number on jet development are successfully calculated. Using DSM, the effective jet Reynolds number is found to be artificially decreased by the eddy viscosity. The results are also not appreciably modified when subgrid-scale kinetic energy is used. Moreover, unlike filtering which does not significantly affect the larger computed scales, the eddy viscosity is shown to dissipate energy through all the turbulent scales, in the same way as molecular viscosity at lower Reynolds numbers.

  7. Resolution and Energy Dissipation Characteristics of Implicit LES and Explicit Filtering Models for Compressible Turbulence

    Directory of Open Access Journals (Sweden)

    Romit Maulik

    2017-04-01

    Full Text Available Solving two-dimensional compressible turbulence problems up to a resolution of 16,384^2, this paper investigates the characteristics of two promising computational approaches: (i) an implicit or numerical large eddy simulation (ILES) framework using an upwind-biased fifth-order weighted essentially non-oscillatory (WENO) reconstruction algorithm equipped with several Riemann solvers, and (ii) a central sixth-order reconstruction framework combined with various linear and nonlinear explicit low-pass spatial filtering processes. Our primary aim is to quantify the dissipative behavior, resolution characteristics, shock capturing ability and computational expenditure for each approach utilizing a systematic analysis with respect to its modeling parameters or parameterizations. The relative advantages and disadvantages of both approaches are addressed for solving a stratified Kelvin-Helmholtz instability shear layer problem as well as a canonical Riemann problem with the interaction of four shocks. The comparisons are both qualitative and quantitative, using visualizations of the spatial structure of the flow and energy spectra, respectively. We observe that the central scheme, with relaxation filtering, offers a competitive approach to ILES and is much more computationally efficient than WENO-based schemes.
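The relaxation-filtering idea can be illustrated with a one-parameter, three-point explicit low-pass filter (an assumed minimal stencil, not the sixth-order scheme of the paper): each pass removes grid-scale oscillations while barely touching well-resolved scales.

```python
import numpy as np

def relax_filter(u, alpha=0.25):
    """One pass of an explicit 3-point low-pass filter on a periodic field:
    u_filtered = u + alpha * (discrete Laplacian of u). With alpha = 0.25 the
    grid-to-grid (Nyquist) mode is removed exactly."""
    return u + alpha * (np.roll(u, 1) - 2.0 * u + np.roll(u, -1))

x = np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False)
smooth = np.sin(x)                        # well-resolved mode
noisy = smooth + 0.1 * np.cos(32.0 * x)   # add a grid-scale oscillation
filtered = relax_filter(noisy)            # Nyquist mode damped to zero
```

Applied once per time step, filtering of this kind supplies the small-scale dissipation that an ILES approach instead delegates to the upwind-biased numerics.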

  8. Spatially Explicit Modelling of the Belgian Major Endurance Event 'The 100 km Dodentocht'.

    Directory of Open Access Journals (Sweden)

    Steffie Van Nieuland

    Full Text Available 'The 100 km Dodentocht', which takes place annually and has its start in Bornem, Belgium, is a long distance march where participants have to cover a 100 km trail in at most 24 hours. The approximately 11 000 marchers per edition are tracked by making use of passive radio-frequency identification (RFID). These tracking data were analyzed to build a spatially explicit marching model that gives insights into the dynamics of the event and allows evaluation of the effect of changes in the starting procedure of the event. For building the model, the empirical distribution functions (edf) of the marching speeds at every section of the trail in between two consecutive checkpoints and of the checkpoints where marchers retire are determined, taking into account age, gender, and marching speeds at previous sections. These distribution functions are then used to sample the consecutive speeds and retirement, and as such to simulate the times when individual marchers pass by the consecutive checkpoints. We concluded that the data-driven model simulates the event reliably. Furthermore, we tested three scenarios to reduce the crowdiness along the first part of the trail and in this way were able to conclude that either the start should be moved to a location outside the town center where the streets are at least 25% wider, or that the marchers should start in two groups at two different locations, and that these groups should ideally merge at about 20 km after the start. The crowdiness at the start might also be reduced by installing a bottleneck at the start in order to limit the number of marchers that can pass per unit of time. Consequently, the operating hours of the consecutive checkpoints would be longer. The developed framework can likewise be used to analyze and improve the operation of other endurance events if sufficient tracking data are available.
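The core sampling loop of such a data-driven model can be sketched as follows (the speeds, section lengths and number of marchers are made-up stand-ins for the RFID-derived empirical distributions, which in the real model are also conditioned on age, gender and previous speeds):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical empirical speed samples (km/h) per section between checkpoints.
section_speeds = [
    np.array([5.0, 5.5, 6.0, 4.5]),   # section 1
    np.array([4.0, 4.8, 5.2, 3.9]),   # section 2
]
section_lengths = [12.0, 10.0]        # km (assumed)

def simulate_marcher():
    """Draw one speed per section from its empirical distribution and
    accumulate the passing time (hours) at each checkpoint."""
    t, passing_times = 0.0, []
    for speeds, length in zip(section_speeds, section_lengths):
        v = rng.choice(speeds)        # sample from the edf
        t += length / v
        passing_times.append(t)
    return passing_times

times = [simulate_marcher() for _ in range(1000)]
```

Aggregating the simulated passing times per checkpoint gives the crowdiness profiles needed to compare starting-procedure scenarios.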

  9. Modeling the Bergeron-Findeisen Process Using PDF Methods With an Explicit Representation of Mixing

    Science.gov (United States)

    Jeffery, C.; Reisner, J.

    2005-12-01

    Currently, the accurate prediction of cloud droplet and ice crystal number concentration in cloud-resolving, numerical weather prediction and climate models is a formidable challenge. The Bergeron-Findeisen process, in which ice crystals grow by vapor deposition at the expense of super-cooled droplets, is expected to be inhomogeneous in nature (some droplets will evaporate completely in centimeter-scale filaments of sub-saturated air during turbulent mixing while others remain unchanged [Baker et al., QJRMS, 1980]) and is unresolved at even cloud-resolving scales. Despite the large body of observational evidence in support of the inhomogeneous mixing process affecting cloud droplet number [most recently, Brenguier et al., JAS, 2000], it is poorly understood and has yet to be parameterized and incorporated into a numerical model. In this talk, we investigate the Bergeron-Findeisen process using a new approach based on simulations of the probability density function (PDF) of relative humidity during turbulent mixing. PDF methods offer a key advantage over Eulerian (spatial) models of cloud mixing and evaporation: the low probability (cm-scale) filaments of entrained air are explicitly resolved (in probability space) during the mixing event even though their spatial shape, size and location remain unknown. Our PDF approach reveals the following features of the inhomogeneous mixing process during the isobaric turbulent mixing of two parcels containing super-cooled water and ice, respectively: (1) The scavenging of super-cooled droplets is inhomogeneous in nature; some droplets evaporate completely at early times while others remain unchanged. (2) The degree of total droplet evaporation during the initial mixing period depends linearly on the mixing fractions of the two parcels and logarithmically on the Damköhler number (Da), the ratio of turbulent to evaporative time-scales. 
(3) Our simulations predict that the PDF of Lagrangian (time-integrated) subsaturation (S) goes as

  10. Spatially explicit models for inference about density in unmarked or partially marked populations

    Science.gov (United States)

    Chandler, Richard B.; Royle, J. Andrew

    2013-01-01

    Recently developed spatial capture–recapture (SCR) models represent a major advance over traditional capture–recapture (CR) models because they yield explicit estimates of animal density instead of population size within an unknown area. Furthermore, unlike nonspatial CR methods, SCR models account for heterogeneity in capture probability arising from the juxtaposition of animal activity centers and sample locations. Although the utility of SCR methods is gaining recognition, the requirement that all individuals can be uniquely identified excludes their use in many contexts. In this paper, we develop models for situations in which individual recognition is not possible, thereby allowing SCR concepts to be applied in studies of unmarked or partially marked populations. The data required for our model are spatially referenced counts made on one or more sample occasions at a collection of closely spaced sample units such that individuals can be encountered at multiple locations. Our approach includes a spatial point process for the animal activity centers and uses the spatial correlation in counts as information about the number and location of the activity centers. Camera-traps, hair snares, track plates, sound recordings, and even point counts can yield spatially correlated count data, and thus our model is widely applicable. A simulation study demonstrated that while the posterior mean exhibits frequentist bias on the order of 5–10% in small samples, the posterior mode is an accurate point estimator as long as adequate spatial correlation is present. Marking a subset of the population substantially increases posterior precision and is recommended whenever possible. We applied our model to avian point count data collected on an unmarked population of the northern parula (Parula americana) and obtained a density estimate (posterior mode) of 0.38 (95% CI: 0.19–1.64) birds/ha. Our paper challenges sampling and analytical conventions in ecology by demonstrating
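The data-generating process that such unmarked spatial count models invert can be sketched as follows (all parameter values are illustrative; the model in the paper additionally places priors on the abundance and activity-center locations and fits them by MCMC):

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate spatially correlated counts from latent activity centers:
# N centers on a unit square, with expected encounters at each sample
# location decaying with distance via a half-normal encounter-rate function.
N, lam0, sigma = 30, 0.5, 0.1          # abundance, baseline rate, scale (assumed)
centers = rng.uniform(0.0, 1.0, size=(N, 2))

g = np.linspace(0.1, 0.9, 5)           # 5 x 5 grid of closely spaced locations
traps = np.array([(xx, yy) for xx in g for yy in g])

# Squared distances between every location (25) and every center (30):
d2 = ((traps[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
rate = lam0 * np.exp(-d2 / (2.0 * sigma ** 2))   # per-individual encounter rate
counts = rng.poisson(rate.sum(axis=1))           # unmarked data: counts only
```

Because nearby locations share exposure to the same activity centers, the counts are spatially correlated, and it is exactly this correlation that carries the information about the number and location of the centers.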

  11. Alien wavelength modeling tool and field trial

    DEFF Research Database (Denmark)

    Sambo, N.; Sgambelluri, A.; Secondini, M.

    2015-01-01

    A modeling tool is presented for pre-FEC BER estimation of PM-QPSK alien wavelength signals. A field trial is demonstrated and used as validation of the tool's correctness. A very close correspondence between the performance of the field trial and the one predicted by the modeling tool has been...

  12. Development of a reactive burn model based upon an explicit visco-plastic pore collapse model

    Science.gov (United States)

    Bouton, Eric; Lefrançois, Alexandre; Belmas, Robert

    2015-06-01

    Our aim in this study is to develop a reactive burn model based upon a microscopic hot spot model to compute the initiation and shock-to-detonation transition of pressed TATB explosives. For the sake of simplicity, the hot spots are supposed to result from the viscoplastic collapse of spherical micro-voids inside the composition. Such a model has been incorporated in a Lagrangian hydrodynamic code. In our calculations, 8 different pore diameters, ranging from 100 nm to 1.2 μm, have been taken into account and the porosity associated with each pore size has been deduced from the PBX-9502 void distribution derived from SAXS. The last ingredient of our model is the burn rate, which depends on two main variables. The first one is the shock pressure, as proposed by the developers of the CREST model. The second one is the number of effective chemical reaction sites calculated by the microscopic model. Furthermore, the function of the reaction progress variable in the burn rate is similar to that in the SURF model proposed by Menikoff. Our burn rate has been calibrated by using pressure profiles, material velocity waveforms obtained with embedded particle-velocity gauges, and run distances to detonation. The comparison between the numerical and experimental results is very good and sufficient to perform a wide variety of simulations including single and double shock waves and the desensitization phenomenon. In conclusion, future works are described.

  13. Modeling Loop Reorganization Free Energies of Acetylcholinesterase: A Comparison of Explicit and Implicit Solvent Models

    National Research Council Canada - National Science Library

    Olson, Mark

    2004-01-01

    ... screening of charge-charge interactions. This paper compares different solvent models applied to the problem of estimating the free-energy difference between two loop conformations in acetylcholinesterase...

  14. Explicit Interaction

    DEFF Research Database (Denmark)

    Löwgren, Jonas; Eriksen, Mette Agger; Linde, Per

    2006-01-01

    We report an ongoing study of palpable computing to support surgical rehabilitation, in the general field of interaction design for ubiquitous computing. Through explorative design, fieldwork and participatory design techniques, we explore the design principle of explicit interaction as an interp...

  15. Computer-Aided Modelling Methods and Tools

    DEFF Research Database (Denmark)

    Cameron, Ian; Gani, Rafiqul

    2011-01-01

    The development of models for a range of applications requires methods and tools. In many cases a reference model is required that allows the generation of application specific models that are fit for purpose. There are a range of computer aided modelling tools available that help to define the m...

  16. A new approach to spatially explicit modelling of forest dynamics: spacing, ageing and neighbourhood competition of mangrove trees

    NARCIS (Netherlands)

    Berger, U.; Hildenbrandt, H.

    2000-01-01

    This paper presents a new approach to spatially explicit modelling that enables the influence of neighbourhood effects on the dynamics of forests and plant communities to be analysed. We refer to this approach as 'field of neighbourhood' (FON). It combines the 'neighbourhood philosophy' of

  17. Development and Validation of Spatially Explicit Habitat Models for Cavity-nesting Birds in Fishlake National Forest, Utah

    Science.gov (United States)

    Randall A. Schultz, Jr.; Thomas C. Edwards, Jr.; Gretchen G. Moisen; Tracey S. Frescino

    2005-01-01

    The ability of USDA Forest Service Forest Inventory and Analysis (FIA) generated spatial products to increase the predictive accuracy of spatially explicit, macroscale habitat models was examined for nest-site selection by cavity-nesting birds in Fishlake National Forest, Utah. One FIA-derived variable (percent basal area of aspen trees) was significant in the habitat...

  18. Hydroclimatology of Dual Peak Cholera Incidence in Bengal Region: Inferences from a Spatial Explicit Model

    Science.gov (United States)

    Bertuzzo, E.; Mari, L.; Righetto, L.; Casagrandi, R.; Gatto, M.; Rodriguez-Iturbe, I.; Rinaldo, A.

    2010-12-01

    The seasonality of cholera and its relation with environmental drivers are receiving increasing interest and research efforts, yet they remain unsatisfactorily understood. A striking example is the observed annual cycle of cholera incidence in the Bengal region, which exhibits two peaks even though the main environmental drivers that have been linked to the disease (air and sea surface temperature, zooplankton density, river discharge) follow a synchronous single-peak annual pattern. A first outbreak, mainly affecting the coastal regions, occurs in spring and is followed, after a period of low incidence during summer, by a second, usually larger, peak in autumn also involving regions situated farther inland. A hydroclimatological explanation for this unique seasonal cycle has been recently proposed: the low river spring flows favor the intrusion of brackish water (the natural environment of the causative agent of the disease) which, in turn, triggers the first outbreak. The summer rising river discharges have a temporary dilution effect and prompt the repulsion of contaminated water, which lowers the disease incidence. However, the monsoon flooding, together with the induced crowding of the population and the failure of the sanitation systems, can possibly facilitate the spatial transmission of the disease and promote the autumn outbreak. We test this hypothesis using a mechanistic, spatially explicit model of cholera epidemics. The framework directly accounts for the role of the river network in transporting and redistributing cholera bacteria among human communities as well as for the annual fluctuation of the river flow. The model is forced with the actual environmental drivers of the region, namely river flow and temperature. Our results show that these two drivers, both having a single peak in the summer, can generate a double peak cholera incidence pattern. 
Besides temporal patterns, the model is also able to qualitatively reproduce spatial patterns characterized

  19. Spatially Explicit Modeling Reveals Cephalopod Distributions Match Contrasting Trophic Pathways in the Western Mediterranean Sea.

    Directory of Open Access Journals (Sweden)

    Patricia Puerta

    Full Text Available Populations of the same species can experience different responses to the environment throughout their distributional range as a result of spatial and temporal heterogeneity in habitat conditions. This highlights the importance of understanding the processes governing species distribution at local scales. However, research on species distribution often averages environmental covariates across large geographic areas, missing variability in population-environment interactions within geographically distinct regions. We used spatially explicit models to identify interactions between species and environmental (chlorophyll a, Chla, and sea surface temperature, SST) and trophic (prey density) conditions, along with processes governing the distribution of two cephalopods with contrasting life histories (octopus and squid) across the western Mediterranean Sea. This approach is relevant for cephalopods, since their population dynamics are especially sensitive to variations in habitat conditions and rarely stable in abundance and location. The regional distributions of the two cephalopod species matched two different trophic pathways present in the western Mediterranean Sea, associated with the Gulf of Lion upwelling and the Ebro river discharges respectively. The effects of the studied environmental and trophic conditions were spatially variant in both species, with usually stronger effects along their distributional boundaries. We identify areas where prey availability limited the abundance of cephalopod populations as well as contrasting effects of temperature in the warmest regions. Despite distributional patterns matching productive areas, a general negative effect of Chla on cephalopod densities suggests that competition pressure is common in the study area. Additionally, results highlight the importance of trophic interactions, beyond other common environmental factors, in shaping the distribution of cephalopod populations. 
Our study presents

  20. Performability Modelling Tools, Evaluation Techniques and Applications

    NARCIS (Netherlands)

    Haverkort, Boudewijn R.H.M.

    1990-01-01

    This thesis deals with three aspects of quantitative evaluation of fault-tolerant and distributed computer and communication systems: performability evaluation techniques, performability modelling tools, and performability modelling applications. Performability modelling is a relatively new

  1. Explicit all-atom modeling of realistically sized ligand-capped nanocrystals

    KAUST Repository

    Kaushik, Ananth P.; Clancy, Paulette

    2012-01-01

    We present a study of an explicit all-atom representation of nanocrystals of experimentally relevant sizes (up to 6 nm), capped with alkyl chain ligands, in vacuum. We employ all-atom molecular dynamics simulation methods in concert with a well

  2. Explicit time integration of finite element models on a vectorized, concurrent computer with shared memory

    Science.gov (United States)

    Gilbertsen, Noreen D.; Belytschko, Ted

    1990-01-01

    The implementation of a nonlinear explicit program on a vectorized, concurrent computer with shared memory is described and studied. The conflict between vectorization and concurrency is described and some guidelines are given for optimal block sizes. Several example problems are summarized to illustrate the types of speed-ups which can be achieved by reprogramming as compared to compiler optimization.
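The kind of explicit scheme being vectorized can be illustrated on a single degree of freedom (a generic central-difference/velocity-Verlet sketch, not the program described in the paper): each update uses only quantities already in hand, which is why the loop over elements or nodes vectorizes and parallelizes so readily.

```python
import numpy as np

# Explicit time integration of an undamped spring-mass oscillator.
# Stability requires dt < 2 / omega; explicit FE codes apply the same
# update node-by-node, which is what makes them easy to vectorize.
m, k = 1.0, 100.0            # mass and stiffness (assumed)
omega = np.sqrt(k / m)       # natural frequency
dt = 0.1 * (2.0 / omega)     # well inside the stability limit

u, v = 1.0, 0.0              # initial displacement and velocity
a = -k * u / m
for _ in range(1000):
    v_half = v + 0.5 * dt * a    # half-step velocity
    u += dt * v_half             # displacement update
    a = -k * u / m               # new acceleration (internal force / mass)
    v = v_half + 0.5 * dt * a    # complete the velocity step
energy = 0.5 * m * v ** 2 + 0.5 * k * u ** 2   # should stay near 50.0
```

Exceeding the stability limit makes the scheme diverge, which is why explicit codes tie the time step to the highest element frequency in the mesh.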

  3. A Conceptual Model for the Design and Delivery of Explicit Thinking Skills Instruction

    Science.gov (United States)

    Kassem, Cherrie L.

    2005-01-01

    Developing student thinking skills is an important goal for most educators. However, due to time constraints and weighty content standards, thinking skills instruction is often embedded in subject matter, implicit and incidental. For best results, thinking skills instruction requires a systematic design and explicit teaching strategies. The…

  4. Large scale spatially explicit modeling of blue and green water dynamics in a temperate mid-latitude basin

    Science.gov (United States)

    Du, Liuying; Rajib, Adnan; Merwade, Venkatesh

    2018-07-01

    Looking only at climate change impacts provides partial information about a changing hydrologic regime. Understanding the spatio-temporal nature of change in hydrologic processes, and the explicit contributions from both climate and land use drivers, holds more practical value for water resources management and policy intervention. This study presents a comprehensive assessment on the spatio-temporal trend of Blue Water (BW) and Green Water (GW) in a 490,000 km2 temperate mid-latitude basin (Ohio River Basin) over the past 80 years (1935-2014), and from thereon, quantifies the combined as well as relative contributions of climate and land use changes. The Soil and Water Assessment Tool (SWAT) is adopted to simulate hydrologic fluxes. Mann-Kendall and Theil-Sen statistical tests are performed on the modeled outputs to detect respectively the trend and magnitude of changes at three different spatial scales - the entire basin, regional level, and sub-basin level. Despite the overall volumetric increase of both BW and GW in the entire basin, changes in their annual average values during the period of simulation reveal a distinctive spatial pattern. GW has increased significantly in the upper and lower parts of the basin, which can be related to the prominent land use change in those areas. BW has increased significantly only in the lower part, likely being associated with the notable precipitation change there. Furthermore, the simulation under a time-varying climate but constant land use scenario identifies climate change in the Ohio River Basin to be influential on BW, while the impact is relatively nominal on GW; whereas, land use change increases GW remarkably, but is counterproductive on BW. The approach to quantify combined/relative effects of climate and land use change as shown in this study can be replicated to understand BW-GW dynamics in similar large basins around the globe.
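The two trend tests named above are distribution-free: Mann-Kendall tests the sign of pairwise differences, and Theil-Sen takes the median of pairwise slopes. A compact sketch of both (illustrative, not the study's implementation):

```python
import itertools
import numpy as np

def mann_kendall_s(x):
    """Mann-Kendall S statistic: sum of signs over all ordered pairs.
    S > 0 indicates an upward trend, S < 0 a downward one."""
    return sum(np.sign(x[j] - x[i])
               for i, j in itertools.combinations(range(len(x)), 2))

def theil_sen_slope(x):
    """Theil-Sen estimator: median of all pairwise slopes."""
    slopes = [(x[j] - x[i]) / (j - i)
              for i, j in itertools.combinations(range(len(x)), 2)]
    return float(np.median(slopes))

# A short annual series with an increasing tendency
series = np.array([2.0, 2.5, 2.3, 3.1, 3.4, 3.2, 4.0])
s = mann_kendall_s(series)          # positive S: upward trend
slope = theil_sen_slope(series)     # positive median slope (units/yr)
```

In practice the S statistic is normalized and compared against a significance level; the sketch shows only the core quantities the tests are built from.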

  5. EPS Mid-Career Award 2011. Are there multiple memory systems? Tests of models of implicit and explicit memory.

    Science.gov (United States)

    Shanks, David R; Berry, Christopher J

    2012-01-01

    This article reviews recent work aimed at developing a new framework, based on signal detection theory, for understanding the relationship between explicit (e.g., recognition) and implicit (e.g., priming) memory. Within this framework, different assumptions about sources of memorial evidence can be framed. Application to experimental results provides robust evidence for a single-system model in preference to multiple-systems models. This evidence comes from several sources including studies of the effects of amnesia and ageing on explicit and implicit memory. The framework allows a range of concepts in current memory research, such as familiarity, recollection, fluency, and source memory, to be linked to implicit memory. More generally, this work emphasizes the value of modern computational modelling techniques in the study of learning and memory.
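The single-system idea can be caricatured in a few lines: one latent memory-strength variable feeds both the explicit (recognition) and implicit (priming) measures, each through its own independent noise channel. A toy simulation under assumed parameter values (not the authors' fitted model):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# One latent strength per item; studied ("old") items are shifted by mu.
mu = 1.0
strength_old = rng.normal(mu, 1.0, n)
strength_new = rng.normal(0.0, 1.0, n)

# Both tasks read the SAME signal through independent noise channels;
# recognition is assumed to be the less noisy readout.
sigma_r, sigma_p = 0.5, 1.5
recog_old = strength_old + rng.normal(0, sigma_r, n)
recog_new = strength_new + rng.normal(0, sigma_r, n)
prime_old = strength_old + rng.normal(0, sigma_p, n)
prime_new = strength_new + rng.normal(0, sigma_p, n)

# Old/new effect sizes: larger for the less noisy (explicit) measure,
# yet both effects stem from a single memory source.
d_recog = (recog_old.mean() - recog_new.mean()) / recog_old.std()
d_prime = (prime_old.mean() - prime_new.mean()) / prime_old.std()
```

The point of the caricature is that dissociations in sensitivity between explicit and implicit tasks need not imply separate memory systems; different readout noise on a shared signal produces them too.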

  6. CPsuperH2.3: an Updated Tool for Phenomenology in the MSSM with Explicit CP Violation

    CERN Document Server

    Lee, J.S.; Ellis, J.; Pilaftsis, A.; Wagner, C.E.M.

    2013-01-01

We describe the Fortran code CPsuperH2.3, which incorporates the following updates compared with its predecessor CPsuperH2.0. It implements improved calculations of the Higgs-boson masses and mixing including stau contributions and finite threshold effects on the tau-lepton Yukawa coupling. It incorporates the LEP limits on the processes e^+ e^- to H_i Z, H_i H_j and the CMS limits on H_i to tau^+ tau^- obtained from 4.6/fb of data at a centre-of-mass energy of 7 TeV. It also includes the decay mode H_i to Z gamma and the Schiff-moment contributions to the electric dipole moments of Mercury and Radium-225, with several calculational options for the case of Mercury. These additions make CPsuperH2.3 a suitable tool for analyzing possible CP-violating effects in the MSSM in the era of the LHC and a new generation of EDM experiments

  7. Improvement, Verification, and Refinement of Spatially-Explicit Exposure Models in Risk Assessment - FishRand Spatially-Explicit Bioaccumulation Model Demonstration

    Science.gov (United States)

    2015-08-01

[Abstract garbled in extraction: table-of-contents and figure-list residue. Recoverable fragments reference unaccounted dynamic habitats and resultant changes in wildlife usage, simplified foraging strategies, sediment and water exposures, PCB uptake, and comparison of FishRand model predictions to site data at the Tyndall AFB site.]

  8. An evaluation of BPMN modeling tools

    NARCIS (Netherlands)

    Yan, Z.; Reijers, H.A.; Dijkman, R.M.; Mendling, J.; Weidlich, M.

    2010-01-01

    Various BPMN modeling tools are available and it is close to impossible to understand their functional differences without simply trying them out. This paper presents an evaluation framework and presents the outcomes of its application to a set of five BPMN modeling tools. We report on various

  9. Spatially explicit modeling of blackbird abundance in the Prairie Pothole Region

    Science.gov (United States)

    Forcey, Greg M.; Thogmartin, Wayne E.; Linz, George M.; McKann, Patrick C.; Crimmins, Shawn M.

    2015-01-01

Knowledge of factors influencing animal abundance is important to wildlife biologists developing management plans. This is especially true for economically important species such as blackbirds (Icteridae), which cause more than $100 million in crop damages annually in the United States. Using data from the North American Breeding Bird Survey, the National Land Cover Dataset, and the National Climatic Data Center, we modeled effects of regional environmental variables on relative abundance of 3 blackbird species (red-winged blackbird, Agelaius phoeniceus; yellow-headed blackbird, Xanthocephalus xanthocephalus; common grackle, Quiscalus quiscula) in the Prairie Pothole Region of the central United States. We evaluated landscape covariates at 3 logarithmically related spatial scales (1,000 ha, 10,000 ha, and 100,000 ha) and modeled weather variables at the 100,000-ha scale. We constructed models a priori using information from published habitat associations. We fit models with WinBUGS using Markov chain Monte Carlo techniques. Both landscape and weather variables contributed strongly to predicting blackbird relative abundance (95% credibility interval did not overlap 0). Variables with the strongest associations with blackbird relative abundance were the percentage of wetland area and precipitation amount from the year before bird surveys were conducted. The influence of spatial scale appeared small—models with the same variables expressed at different scales were often in the best model subset. This large-scale study elucidated regional effects of weather and landscape variables, suggesting that management strategies aimed at reducing damages caused by these species should consider the broader landscape, including weather effects, because such factors may outweigh the influence of localized conditions or site-specific management actions. The regional species distributional models we developed for blackbirds provide a tool for understanding these broader

  10. Methods used to parameterize the spatially-explicit components of a state-and-transition simulation model

    Directory of Open Access Journals (Sweden)

    Rachel R. Sleeter

    2015-06-01

Spatially-explicit state-and-transition simulation models of land use and land cover (LULC) increase our ability to assess regional landscape characteristics and associated carbon dynamics across multiple scenarios. By characterizing appropriate spatial attributes such as forest age and land-use distribution, a state-and-transition model can more effectively simulate the pattern and spread of LULC changes. This manuscript describes the methods and input parameters of the Land Use and Carbon Scenario Simulator (LUCAS), a customized state-and-transition simulation model utilized to assess the relative impacts of LULC on carbon stocks for the conterminous U.S. The methods and input parameters are spatially explicit and describe initial conditions (strata, state classes and forest age), spatial multipliers, and carbon stock density. Initial conditions were derived from harmonization of multi-temporal data characterizing changes in land use as well as land cover. Harmonization combines numerous national-level datasets through a cell-based data fusion process to generate maps of primary LULC categories. Forest age was parameterized using data from the North American Carbon Program and spatially-explicit maps showing the locations of past disturbances (i.e. wildfire and harvest). Spatial multipliers were developed to spatially constrain the location of future LULC transitions. Based on distance-decay theory, maps were generated to guide the placement of changes related to forest harvest, agricultural intensification/extensification, and urbanization. We analyze the spatially-explicit input parameters with a sensitivity analysis, by showing how LUCAS responds to variations in the model input. This manuscript uses Mediterranean California as a regional subset to highlight local to regional aspects of land change, which demonstrates the utility of LUCAS at many scales and applications.

  11. Methods used to parameterize the spatially-explicit components of a state-and-transition simulation model

    Science.gov (United States)

    Sleeter, Rachel; Acevedo, William; Soulard, Christopher E.; Sleeter, Benjamin M.

    2015-01-01

    Spatially-explicit state-and-transition simulation models of land use and land cover (LULC) increase our ability to assess regional landscape characteristics and associated carbon dynamics across multiple scenarios. By characterizing appropriate spatial attributes such as forest age and land-use distribution, a state-and-transition model can more effectively simulate the pattern and spread of LULC changes. This manuscript describes the methods and input parameters of the Land Use and Carbon Scenario Simulator (LUCAS), a customized state-and-transition simulation model utilized to assess the relative impacts of LULC on carbon stocks for the conterminous U.S. The methods and input parameters are spatially explicit and describe initial conditions (strata, state classes and forest age), spatial multipliers, and carbon stock density. Initial conditions were derived from harmonization of multi-temporal data characterizing changes in land use as well as land cover. Harmonization combines numerous national-level datasets through a cell-based data fusion process to generate maps of primary LULC categories. Forest age was parameterized using data from the North American Carbon Program and spatially-explicit maps showing the locations of past disturbances (i.e. wildfire and harvest). Spatial multipliers were developed to spatially constrain the location of future LULC transitions. Based on distance-decay theory, maps were generated to guide the placement of changes related to forest harvest, agricultural intensification/extensification, and urbanization. We analyze the spatially-explicit input parameters with a sensitivity analysis, by showing how LUCAS responds to variations in the model input. This manuscript uses Mediterranean California as a regional subset to highlight local to regional aspects of land change, which demonstrates the utility of LUCAS at many scales and applications.
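A spatial multiplier built on distance-decay theory is essentially a raster that down-weights the probability of a transition with distance from existing development, harvest units, or other seed features. A minimal sketch, assuming an exponential half-distance kernel (the actual LUCAS kernels may differ):

```python
import numpy as np

def distance_decay_multiplier(dist_km, half_distance_km=2.0):
    """Map a distance raster to a [0, 1] multiplier that halves every
    `half_distance_km`; cells at distance 0 keep full weight 1."""
    return 0.5 ** (np.asarray(dist_km, dtype=float) / half_distance_km)

# Distance (km) of each grid cell from existing urban land
dist = np.array([[0.0, 1.0, 4.0],
                 [2.0, 3.0, 8.0]])
mult = distance_decay_multiplier(dist)
# e.g. a cell 2 km away gets weight 0.5, a cell 4 km away gets 0.25
```

The state-and-transition simulator then multiplies each cell's base transition probability by this surface, concentrating simulated change near existing features.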

  12. Software Engineering Tools for Scientific Models

    Science.gov (United States)

    Abrams, Marc; Saboo, Pallabi; Sonsini, Mike

    2013-01-01

Software tools were constructed to address issues the NASA Fortran development community faces, and they were tested on real models currently in use at NASA. These proof-of-concept tools address the High-End Computing Program and the Modeling, Analysis, and Prediction Program. Two examples are the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5) atmospheric model in Cell Fortran on the Cell Broadband Engine, and the Goddard Institute for Space Studies (GISS) coupled atmosphere-ocean model called ModelE, written in fixed-format Fortran.

  13. Explicit solution of the quantum three-body Calogero-Sutherland model

    CERN Document Server

    Perelomov, A.M.; Zaugg, P.

    1998-01-01

    Quantum integrable systems generalizing Calogero-Sutherland systems were introduced by Olshanetsky and Perelomov (1977). Recently, it was proved that for systems with trigonometric potential, the series in the product of two wave functions is a deformation of the Clebsch-Gordan series. This yields recursion relations for the wave functions of those systems. In this note, this approach is used to compute the explicit expressions for the three-body Calogero-Sutherland wave functions, which are the Jack polynomials. We conjecture that similar results are also valid for the more general two-parameters deformation introduced by Macdonald.
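For orientation, the trigonometric (Sutherland) Hamiltonian whose eigenfunctions are the Jack polynomials can be written, in one common convention (not quoted from the paper), as

```latex
H \;=\; -\frac{1}{2}\sum_{i=1}^{N}\frac{\partial^{2}}{\partial x_i^{2}}
\;+\; \sum_{1 \le i < j \le N}\frac{g(g-1)}{\sin^{2}(x_i - x_j)} ,
```

with the three-body case corresponding to N = 3 and the coupling g related to the Jack-polynomial parameter; the two-parameter (q, t) deformation mentioned in the abstract replaces the Jack polynomials by Macdonald polynomials.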

  14. Explicit and implicit springback simulation in sheet metal forming using fully coupled ductile damage and distortional hardening model

    Science.gov (United States)

    Yetna n'jock, M.; Houssem, B.; Labergere, C.; Saanouni, K.; Zhenming, Y.

    2018-05-01

Springback is an important phenomenon which accompanies the forming of metallic sheets, especially for high-strength materials. A quantitative prediction of springback becomes very important for newly developed materials with high mechanical characteristics. In this work, a numerical methodology is developed to quantify this undesirable phenomenon. This methodology is based on the use of both the explicit and implicit finite element solvers of Abaqus®. The most important ingredient of this methodology is the use of a highly predictive mechanical model: a thermodynamically-consistent, non-associative and fully anisotropic elastoplastic constitutive model, strongly coupled with isotropic ductile damage and accounting for distortional hardening. An algorithm for local integration of the complete set of constitutive equations is developed. This algorithm uses the rotated frame formulation (RFF) to ensure the incremental objectivity of the model in the framework of finite strains, and is implemented in both the explicit (Abaqus/Explicit®) and implicit (Abaqus/Standard®) solvers of Abaqus® through the user routines VUMAT and UMAT, respectively. The implicit solver of Abaqus® has been used to study springback, as it is generally a quasi-static unloading. In order to compare the methods' efficiency, the explicit dynamic relaxation method proposed by Rayleigh has also been used for springback prediction. The results obtained on the U-draw/bending benchmark are studied, discussed and compared with experimental results as reference. Finally, the purpose of this work is to evaluate how reliably the different methods predict springback in sheet metal forming.
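Dynamic relaxation, mentioned above as the explicit alternative for the quasi-static springback step, recovers a static equilibrium by integrating a fictitious damped dynamic problem until the motion dies out. A one-degree-of-freedom toy sketch with illustrative parameters (not the Abaqus implementation):

```python
def dynamic_relaxation(k, f_ext, m=1.0, c=0.8, dt=0.05, tol=1e-10):
    """Explicitly integrate m*a + c*v + k*u = f_ext until velocity and
    residual vanish; the rest state is the static solution u = f_ext/k."""
    u, v = 0.0, 0.0
    for _ in range(100_000):
        a = (f_ext - c * v - k * u) / m   # damped pseudo-dynamic step
        v += dt * a                        # semi-implicit Euler update
        u += dt * v
        if abs(v) < tol and abs(f_ext - k * u) < tol:
            break
    return u

u_static = dynamic_relaxation(k=4.0, f_ext=2.0)
# converges to the static equilibrium u = f_ext / k = 0.5
```

The fictitious mass and damping are tuning parameters: they do not affect the converged (static) answer, only how quickly the pseudo-dynamics settle.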

  15. The explicit treatment of model uncertainties in the presence of aleatory and epistemic parameter uncertainties in risk and reliability analysis

    International Nuclear Information System (INIS)

    Ahn, Kwang Il; Yang, Joon Eon

    2003-01-01

In the risk and reliability analysis of complex technological systems, the primary concern of formal uncertainty analysis is to understand why uncertainties arise, and to evaluate how they impact the results of the analysis. In recent times, many of the uncertainty analyses have focused on parameters of the risk and reliability analysis models, whose values are uncertain in an aleatory or an epistemic way. As the field of parametric uncertainty analysis matures, however, more attention is being paid to the explicit treatment of uncertainties that are addressed in the predictive model itself as well as the accuracy of the predictive model. The essential steps for evaluating impacts of these model uncertainties in the presence of parameter uncertainties are to determine rigorously various sources of uncertainties to be addressed in an underlying model itself and in turn model parameters, based on our state-of-knowledge and relevant evidence. Answering clearly the question of how to characterize and treat explicitly the foregoing different sources of uncertainty is particularly important for practical aspects such as risk and reliability optimization of systems as well as more transparent risk information and decision-making under various uncertainties. The main purpose of this paper is to provide practical guidance for quantitatively treating various model uncertainties that would often be encountered in the risk and reliability modeling process of complex technological systems
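A standard way of keeping the two kinds of parameter uncertainty separate is a nested ("two-loop") Monte Carlo: the outer loop samples epistemic parameters (state-of-knowledge), the inner loop samples aleatory variability, and the result is a family of risk estimates rather than a single number. A schematic sketch with hypothetical distributions (not from the paper):

```python
import numpy as np

rng = np.random.default_rng(42)

def nested_monte_carlo(n_epistemic=200, n_aleatory=1000):
    """Outer loop: epistemic uncertainty about a failure rate (lognormal).
    Inner loop: aleatory randomness in yearly demands (Poisson)."""
    failure_probs = []
    for _ in range(n_epistemic):
        lam = rng.lognormal(mean=np.log(1e-4), sigma=0.5)  # rate per hour
        demands = rng.poisson(lam * 8760, n_aleatory)      # events per year
        failure_probs.append(np.mean(demands >= 1))        # P(>=1 event/yr)
    return np.percentile(failure_probs, [5, 50, 95])

p5, p50, p95 = nested_monte_carlo()
# The p5..p95 spread reflects epistemic uncertainty; each inner
# estimate already averages over aleatory variability.
```

Model uncertainty can be layered on top of this by making the outer loop also sample over alternative model structures, weighted by their credibility.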

  16. Model Checking Markov Chains: Techniques and Tools

    NARCIS (Netherlands)

    Zapreev, I.S.

    2008-01-01

    This dissertation deals with four important aspects of model checking Markov chains: the development of efficient model-checking tools, the improvement of model-checking algorithms, the efficiency of the state-space reduction techniques, and the development of simulation-based model-checking
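At the core of most probabilistic model checkers is little more than repeated vector-matrix products over the state space. A minimal sketch of transient analysis for a discrete-time Markov chain, using a toy three-state chain (not one of the dissertation's case studies):

```python
import numpy as np

def transient_distribution(P, p0, n_steps):
    """Distribution over states after n_steps of the DTMC with
    row-stochastic transition matrix P, starting from p0."""
    p = np.asarray(p0, dtype=float)
    for _ in range(n_steps):
        p = p @ P
    return p

# Three-state chain where state 2 is absorbing ("failure")
P = np.array([[0.90, 0.05, 0.05],
              [0.10, 0.80, 0.10],
              [0.00, 0.00, 1.00]])
p = transient_distribution(P, [1.0, 0.0, 0.0], 50)
# p[2] approximates the probability of reaching "failure" within
# 50 steps, the shape of a bounded-until query in PCTL
```

State-space reduction techniques such as bisimulation minimization shrink P before this iteration, which is where much of the efficiency discussed in the dissertation comes from.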

  17. Modeling spatially- and temporally-explicit water stress indices for use in life cycle assessment

    Science.gov (United States)

    Scherer, L.; Venkatesh, A.; Karuppiah, R.; Usadi, A.; Pfister, S.; Hellweg, S.

    2013-12-01

Water scarcity is a regional issue in many areas across the world, and can affect human health and ecosystems locally. Water stress indices (WSIs) have been developed as quantitative indicators of such scarcities - examples include the Falkenmark indicator, Social Water Stress Index, and the Water Supply Stress Index [1]. Application of these indices helps us understand water supply and demand risks for multiple users, including those in the agricultural, industrial, residential and commercial sectors. Pfister et al. [2] developed a method to calculate WSIs that were used to estimate characterization factors (CFs) in order to quantify environmental impacts of freshwater consumption within a life cycle assessment (LCA) framework. Global WSIs were based on data from the WaterGAP model [3], and presented as annual averages for watersheds. Since water supply and demand varies regionally and temporally, the resolution used in Pfister et al. does not effectively differentiate between seasonal and permanent water scarcity. This study aims to improve the temporal and spatial resolution of the water scarcity calculations used to estimate WSIs and CFs. We used the Soil and Water Assessment Tool (SWAT) [4] hydrological model to properly simulate water supply in different world regions with high spatial and temporal resolution, and coupled it with water use data from WaterGAP [3] and Pfister et al. [5]. Input data to SWAT included weather, land use, soil characteristics and a digital elevation model (DEM), all from publicly available global data sets. Potential evapotranspiration, which affects water supply, was determined using an improved Priestley-Taylor approach. In contrast to most other hydrological studies, large reservoirs, water consumption and major water transfers were simulated. The model was calibrated against observed monthly discharge, actual evapotranspiration, and snow water equivalents wherever appropriate. Based on these simulations, monthly WSIs were calculated for a few
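In the Pfister et al. (2009) method this study builds on, the WSI maps a (variability-adjusted) withdrawal-to-availability ratio WTA* onto the interval (0.01, 1) with a logistic curve. A sketch of that published formula (the study's monthly variant may be calibrated differently):

```python
import math

def water_stress_index(wta_star):
    """Logistic WSI of Pfister et al. (2009): maps the adjusted
    withdrawal-to-availability ratio WTA* onto (0.01, 1)."""
    return 1.0 / (1.0 + math.exp(-6.4 * wta_star) * (1.0 / 0.01 - 1.0))

# The index rises steeply once withdrawals approach availability
low, moderate, severe = (water_stress_index(w) for w in (0.1, 0.5, 1.0))
```

Computing this at monthly rather than annual resolution is precisely what lets seasonal scarcity show up instead of being averaged away.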

  18. Probing the role of interfacial waters in protein-DNA recognition using a hybrid implicit/explicit solvation model

    Science.gov (United States)

    Li, Shen; Bradley, Philip

    2013-01-01

When proteins bind to their DNA target sites, ordered water molecules are often present at the protein-DNA interface bridging protein and DNA through hydrogen bonds. What is the role of these ordered interfacial waters? Are they important determinants of the specificity of DNA sequence recognition, or do they act in binding in a primarily non-specific manner, by improving packing of the interface, shielding unfavorable electrostatic interactions, and solvating unsatisfied polar groups that are inaccessible to bulk solvent? When modeling details of structure and binding preferences, can fully implicit solvent models be fruitfully applied to protein-DNA interfaces, or must the individualistic properties of these interfacial waters be accounted for? To address these questions, we have developed a hybrid implicit/explicit solvation model that specifically accounts for the locations and orientations of small numbers of DNA-bound water molecules while treating the majority of the solvent implicitly. Comparing the performance of this model to its fully implicit counterpart, we find that explicit treatment of interfacial waters results in a modest but significant improvement in protein sidechain placement and DNA sequence recovery. Base-by-base comparison of the performance of the two models highlights DNA sequence positions whose recognition may be dependent on interfacial water. Our study offers large-scale statistical evidence for the role of ordered water in protein-DNA recognition, together with detailed examination of several well-characterized systems. In addition, our approach provides a template for modeling explicit water molecules at interfaces that should be extensible to other systems. PMID:23444044

  19. Spatial Modeling Tools for Cell Biology

    National Research Council Canada - National Science Library

    Przekwas, Andrzej; Friend, Tom; Teixeira, Rodrigo; Chen, Z. J; Wilkerson, Patrick

    2006-01-01

The scientific potential and military relevance of computational biology and bioinformatics have inspired DARPA/IPTO's visionary BioSPICE project to develop a computational framework and modeling tools for cell biology...

  20. Spatially explicit habitat models for 28 fishes from the Upper Mississippi River System (AHAG 2.0)

    Science.gov (United States)

    Ickes, Brian S.; Sauer, J.S.; Richards, N.; Bowler, M.; Schlifer, B.

    2014-01-01

    species; occurrences at sampling sites as observed in day electrofishing samples) using multiple logistic regression with presence/absence responses. Each species’ probability of occurrence, at each sample site, was modeled as a function of 17 environmental variables observed at each sample site by LTRMP standardized protocols. The modeling methods used (1) a forward-selection process to identify the most important predictors and their relative contributions to predictions; (2) partial methods on the predictor set to control variance inflation; and (3) diagnostics for LTRMP design elements that may influence model fits. Models were fit for 28 species, representing 3 habitat guilds (Lentic, Lotic, and Generalist). We intended to develop “systemic models” using data from all six LTRMP study reaches simultaneously; however, this proved impossible. Thus, we “regionalized” the models, creating two models for each species: “Upper Reach” models, using data from Pools 4, 8, and 13; and “Lower Reach” models, using data from Pool 26, the Open River Reach of the Mississippi River, and the La Grange reach of the Illinois River. A total of 56 models were attempted. For any given site-scale prediction, each model used data from the three LTRMP study reaches comprising the regional model to make predictions. For example, a site-scale prediction in Pool 8 was made using data from Pools 4, 8, and 13. This is the fundamental nature and trade-off of regionalizing these models for broad management application. Model fits were deemed “certifiably good” using the Hosmer and Lemeshow Goodness-of-Fit statistic (Hosmer and Lemeshow, 2000). This test post-partitions model predictions into 10 groups and conducts inferential tests on correspondences between observed and expected probability of occurrence across all partitions, under Chi-square distributional assumptions. This permits an inferential test of how well the models fit and a tool for reporting when they did not (and
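The Hosmer-Lemeshow check described above partitions predictions into groups (typically deciles of predicted probability) and compares observed with expected counts via a chi-square statistic. A sketch of the statistic itself (illustrative; the report's exact grouping may differ):

```python
import numpy as np

def hosmer_lemeshow(y_true, p_pred, n_groups=10):
    """Hosmer-Lemeshow chi-square: sort cases by predicted probability,
    split into n_groups, and accumulate (O - E)^2 / (E * (1 - E/n))."""
    order = np.argsort(p_pred)
    y, p = np.asarray(y_true)[order], np.asarray(p_pred)[order]
    chi2 = 0.0
    for grp_y, grp_p in zip(np.array_split(y, n_groups),
                            np.array_split(p, n_groups)):
        n = len(grp_y)
        obs, exp = grp_y.sum(), grp_p.sum()
        var = exp * (1.0 - exp / n)       # binomial variance of the group
        chi2 += (obs - exp) ** 2 / var
    return chi2  # compare against chi-square with n_groups - 2 d.o.f.

rng = np.random.default_rng(1)
p = rng.uniform(0.05, 0.95, 2000)
y = rng.binomial(1, p)                    # well-calibrated predictions
chi2 = hosmer_lemeshow(y, p)
# For calibrated predictions chi2 stays near its degrees of freedom
```

A large chi-square (small p-value) flags poor calibration, which is how models failing the "certifiably good" criterion would be reported.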

  1. A unified framework to integrate biotic, abiotic processes and human activities in spatially explicit models of agricultural landscapes

    Directory of Open Access Journals (Sweden)

    Fabrice Vinatier

    2016-02-01

    Recent concern over possible ways to sustain ecosystem services has triggered important research worldwide on ecosystem processes at the landscape scale. Understanding this complexity of landscape functioning calls for coupled and spatially-explicit modelling approaches. However, disciplinary boundaries have limited the number of multi-process studies at the landscape scale, and current progress in coupling processes at this scale often reveals a strong imbalance between biotic and abiotic processes, depending on the core discipline of the modellers. We propose a spatially-explicit, unified conceptual framework that allows researchers from different fields to develop a shared view of agricultural landscapes. In particular, we distinguish landscape elements that are mobile in space and represent biotic or abiotic objects (for example water, fauna or flora populations), and elements that are immobile and represent fixed landscape elements with a given geometry (for example a ditch section or plot). The shared representation of these elements allows setting common objects and spatio-temporal process boundaries that may otherwise differ between disciplines. We present guidelines and an assessment of the applicability of this framework to a virtual landscape system with realistic properties. This framework allows the complex system to be represented with a limited set of concepts but leaves the possibility to include current modelling strategies specific to biotic or abiotic disciplines. Future operational challenges include model design, space and time discretization, and the availability of both landscape modelling platforms and data.
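The mobile/immobile distinction can be captured with two element types that share one representation, which is essentially what lets biotic and abiotic modellers meet in a common framework. A schematic sketch with hypothetical class names (not the authors' implementation):

```python
from dataclasses import dataclass, field

@dataclass
class ImmobileElement:
    """Fixed landscape element with a geometry, e.g. a plot or a
    ditch section; it hosts mobile elements and local processes."""
    name: str
    geometry: tuple                        # e.g. polygon vertices
    occupants: list = field(default_factory=list)

@dataclass
class MobileElement:
    """Biotic or abiotic object that moves between immobile elements,
    e.g. a water parcel or an insect population."""
    kind: str
    location: ImmobileElement

    def move_to(self, target: ImmobileElement):
        self.location.occupants.remove(self)
        target.occupants.append(self)
        self.location = target

plot = ImmobileElement("plot A", geometry=((0, 0), (1, 0), (1, 1), (0, 1)))
ditch = ImmobileElement("ditch 1", geometry=((1, 0), (1, 1)))
water = MobileElement("water", location=plot)
plot.occupants.append(water)
water.move_to(ditch)   # e.g. runoff from the plot into the ditch section
```

The same `move_to` interface serves a water parcel, a pest population, or a seed bank, so process boundaries are set by the shared objects rather than by discipline.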

  2. Comparison of two different modelling tools

    DEFF Research Database (Denmark)

    Brix, Wiebke; Elmegaard, Brian

    2009-01-01

In this paper a test case is solved using two different modelling tools, Engineering Equation Solver (EES) and WinDali, in order to compare the tools. The system of equations solved is a static model of an evaporator used for refrigeration. The evaporator consists of two parallel channels, and it is investigated how a non-uniform airflow influences the refrigerant mass flow rate distribution and the total cooling capacity of the heat exchanger. It is shown that the cooling capacity decreases significantly with increasing maldistribution of the airflow. Comparing the two simulation tools it is found...

  3. Cockpit System Situational Awareness Modeling Tool

    Science.gov (United States)

    Keller, John; Lebiere, Christian; Shay, Rick; Latorella, Kara

    2004-01-01

    This project explored the possibility of predicting pilot situational awareness (SA) using human performance modeling techniques for the purpose of evaluating developing cockpit systems. The Improved Performance Research Integration Tool (IMPRINT) was combined with the Adaptive Control of Thought-Rational (ACT-R) cognitive modeling architecture to produce a tool that can model both the discrete tasks of pilots and the cognitive processes associated with SA. The techniques for using this tool to predict SA were demonstrated using the newly developed Aviation Weather Information (AWIN) system. By providing an SA prediction tool to cockpit system designers, cockpit concepts can be assessed early in the design process while providing a cost-effective complement to the traditional pilot-in-the-loop experiments and data collection techniques.

  4. Low Cloud Feedback to Surface Warming in the World's First Global Climate Model with Explicit Embedded Boundary Layer Turbulence

    Science.gov (United States)

    Parishani, H.; Pritchard, M. S.; Bretherton, C. S.; Wyant, M. C.; Khairoutdinov, M.; Singh, B.

    2017-12-01

    Biases and parameterization formulation uncertainties in the representation of boundary layer clouds remain a leading source of possible systematic error in climate projections. Here we show the first results of cloud feedback to +4K SST warming in a new experimental climate model, the "Ultra-Parameterized (UP)" Community Atmosphere Model, UPCAM. We have developed UPCAM as an unusually high-resolution implementation of cloud superparameterization (SP) in which a global set of cloud resolving arrays is embedded in a host global climate model. In UP, the cloud-resolving scale includes sufficient internal resolution to explicitly generate the turbulent eddies that form marine stratocumulus and trade cumulus clouds. This is computationally costly but complements other available approaches for studying low clouds and their climate interaction, by avoiding parameterization of the relevant scales. In a recent publication we have shown that UP, while not without its own complexity trade-offs, can produce encouraging improvements in low cloud climatology in multi-month simulations of the present climate and is a promising target for exascale computing (Parishani et al. 2017). Here we show results of its low cloud feedback to warming in multi-year simulations for the first time. References: Parishani, H., M. S. Pritchard, C. S. Bretherton, M. C. Wyant, and M. Khairoutdinov (2017), Toward low-cloud-permitting cloud superparameterization with explicit boundary layer turbulence, J. Adv. Model. Earth Syst., 9, doi:10.1002/2017MS000968.

  5. Formulation of an explicit-multiple-time-step time integration method for use in a global primitive equation grid model

    Science.gov (United States)

    Chao, W. C.

    1982-01-01

    With appropriate modifications, a recently proposed explicit-multiple-time-step scheme (EMTSS) is incorporated into the UCLA model. In this scheme, the linearized terms in the governing equations that generate the gravity waves are split into different vertical modes. Each mode is integrated with an optimal time step, and at periodic intervals these modes are recombined. The other terms are integrated with a time step dictated by the CFL condition for low-frequency waves. This large time step requires a special modification of the advective terms in the polar region to maintain stability. Test runs for 72 h show that EMTSS is a stable, efficient and accurate scheme.
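The efficiency gain of the scheme comes from letting each vertical gravity-wave mode advance at its own CFL-limited step instead of forcing every mode to the smallest one. A sketch of how per-mode steps might be chosen (illustrative wave speeds and grid spacing, not the UCLA model's actual values):

```python
def stable_time_steps(wave_speeds, dx, courant=0.7):
    """Largest stable explicit step per vertical mode from the CFL
    condition dt <= C * dx / c, for phase speed c (m/s) and spacing dx (m)."""
    return [courant * dx / c for c in wave_speeds]

# External mode (~300 m/s) down to slower internal gravity-wave modes
speeds = [300.0, 100.0, 50.0, 20.0]
dts = stable_time_steps(speeds, dx=400e3)   # ~400 km grid spacing
# fastest mode: dt ~ 933 s; slowest mode: dt ~ 14000 s; integrating
# each mode at its own step avoids the single worst-case restriction
```

The modes are recombined at the least common interval of their steps, which is the "periodic recombination" the abstract refers to.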

  6. Graph and model transformation tools for model migration : empirical results from the transformation tool contest

    NARCIS (Netherlands)

    Rose, L.M.; Herrmannsdoerfer, M.; Mazanek, S.; Van Gorp, P.M.E.; Buchwald, S.; Horn, T.; Kalnina, E.; Koch, A.; Lano, K.; Schätz, B.; Wimmer, M.

    2014-01-01

    We describe the results of the Transformation Tool Contest 2010 workshop, in which nine graph and model transformation tools were compared for specifying model migration. The model migration problem—migration of UML activity diagrams from version 1.4 to version 2.2—is non-trivial and practically

  7. System level modelling with open source tools

    DEFF Research Database (Denmark)

    Jakobsen, Mikkel Koefoed; Madsen, Jan; Niaki, Seyed Hosein Attarzadeh

, called ForSyDe. ForSyDe is available under an open-source approach, which allows small and medium enterprises (SMEs) to get easy access to advanced modeling capabilities and tools. We give an introduction to the design methodology through the system level modeling of a simple industrial use case, and we...

  8. The european Trans-Tools transport model

    NARCIS (Netherlands)

    Rooijen, T. van; Burgess, A.

    2008-01-01

The paper presents the use of ArcGIS in the Transtools Transport Model, TRANS-TOOLS, created by an international consortium for the European Commission. The model describes passenger as well as freight transport in Europe with all medium and long distance modes (cars, vans, trucks, train, inland

  9. An Explicit Structural Model of Root Hair and Soil Interactions Parameterised by Synchrotron X-ray Computed Tomography.

    Science.gov (United States)

    Keyes, Samuel David; Zygalakis, Konstantinos C; Roose, Tiina

    2017-12-01

    The rhizosphere is a zone of fundamental importance for understanding the dynamics of nutrient acquisition by plant roots. The canonical difficulty of experimentally investigating the rhizosphere led long ago to the adoption of mathematical models, the most sophisticated of which now incorporate explicit representations of root hairs and rhizosphere soil. Mathematical upscaling regimes, such as homogenisation, offer the possibility of incorporating into larger-scale models the important mechanistic processes occurring at the rhizosphere scale. However, we lack concrete descriptions of all the features required to fully parameterise models at the rhizosphere scale. By combining synchrotron X-ray computed tomography (SRXCT) and a novel root growth assay, we derive a three-dimensional description of rhizosphere soil structure suitable for use in multi-scale modelling frameworks. We describe an approach to mitigate sub-optimal root hair detection via structural root hair growth modelling. The growth model is explicitly parameterised with SRXCT data and simulates three-dimensional root hair ideotypes in silico, which are suitable for both ideotypic analysis and parameterisation of 3D geometry in mathematical models. The study considers different hypothetical conditions governing root hair interactions with soil matrices, with their respective effects on hair morphology being compared between idealised and image-derived soil/root geometries. The studies in idealised geometries suggest that packing arrangement of soil affects hair tortuosity more than the particle diameter. Results in field-derived soil suggest that hair access to poorly mobile nutrients is particularly sensitive to the physical interaction between the growing hairs and the phase of the soil in which soil water is present (i.e. the hydrated textural phase). The general trends in fluid-coincident hair length with distance from the root, and their dependence on hair/soil interaction mechanisms, are

  10. BETR-World: a geographically explicit model of chemical fate: application to transport of α-HCH to the Arctic

    International Nuclear Information System (INIS)

    Toose, L.; Woodfine, D.G.; MacLeod, M.; Mackay, D.; Gouin, J.

    2004-01-01

    The Berkeley-Trent (BETR)-World model, a 25 compartment, geographically explicit fugacity-based model is described and applied to evaluate the transport of chemicals from temperate source regions to receptor regions (such as the Arctic). The model was parameterized using GIS and an array of digital data on weather, oceans, freshwater, vegetation and geo-political boundaries. This version of the BETR model framework includes modification of atmospheric degradation rates by seasonally variable hydroxyl radical concentrations and temperature. Degradation rates in all other compartments vary with seasonally changing temperature. Deposition to the deep ocean has been included as a loss mechanism. A case study was undertaken for α-HCH. Dynamic emission scenarios were estimated for each of the 25 regions. Predicted environmental concentrations showed good agreement with measured values for the northern regions in air, and fresh and oceanic water and with the results from a previous model of global chemical fate. Potential for long-range transport and deposition to the Arctic region was assessed using a Transfer Efficiency combined with estimated emissions. European regions and the Orient including China have a high potential to contribute α-HCH contamination in the Arctic due to high rates of emission in these regions despite low Transfer Efficiencies. Sensitivity analyses reveal that the performance and reliability of the model is strongly influenced by parameters controlling degradation rates. - A geographically explicit multi-compartment model is applied to the transport of α-HCH to the Arctic, showing Europe and the Orient are key sources

  11. Diagnosis of dynamic systems based on explicit and implicit behavioural models: an application to gas turbines in Esprit Project Tiger

    Energy Technology Data Exchange (ETDEWEB)

    Trave-Massuyes, L. [Centre National de la Recherche Scientifique (CNRS), 31 - Toulouse (France); Milne, R.

    1995-12-31

We are interested in the monitoring and diagnosis of dynamic systems. In our work, we combine explicit temporal models of the behaviour of a dynamic system with implicit behavioural models supporting model-based approaches. This work is driven by the needs of, and applied to, two gas turbines of very different size and power. In this paper we describe the problems of building systems for these domains and illustrate how we have developed a system in which these two approaches complement each other to provide a comprehensive fault detection and diagnosis system. We also explore the strengths and weaknesses of each approach. The system described here is currently running continuously, on-line, on a gas turbine in a major chemical plant. (author) 24 refs.

  12. The Explicit Wake Parametrisation V1.0: a wind farm parametrisation in the mesoscale model WRF

    Directory of Open Access Journals (Sweden)

    P. J. H. Volker

    2015-11-01

We describe the theoretical basis, implementation, and validation of a new parametrisation that accounts for the effect of large offshore wind farms on the atmosphere and can be used in mesoscale and large-scale atmospheric models. This new parametrisation, referred to as the Explicit Wake Parametrisation (EWP), uses classical wake theory to describe the unresolved wake expansion. The EWP scheme is validated for a neutral atmospheric boundary layer against filtered in situ measurements from two meteorological masts situated a few kilometres away from the Danish offshore wind farm Horns Rev I. The simulated velocity deficit in the wake of the wind farm compares well to that observed in the measurements, and the velocity profile is qualitatively similar to that simulated with large eddy simulation models and from wind tunnel studies. At the same time, the validation process highlights the challenges in verifying such models with real observations.
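
The classical wake theory that such parametrisations build on can be illustrated with the top-hat (Jensen-type) wake model below. This is a generic sketch with made-up turbine parameters, not the EWP formulation itself.

```python
import math

def jensen_deficit(u0, ct, d0, x, k=0.05):
    """Velocity deficit a distance x (m) downstream of a turbine with
    rotor diameter d0 (m), thrust coefficient ct, and ambient wind
    speed u0 (m/s). Classical top-hat (Jensen) wake: the wake diameter
    expands linearly with rate k and the deficit decays accordingly."""
    dw = d0 + 2.0 * k * x                      # expanded wake diameter
    a = 1.0 - math.sqrt(max(0.0, 1.0 - ct))    # initial velocity deficit factor
    return u0 * a * (d0 / dw) ** 2             # deficit in m/s

# Illustrative values: 10 m/s ambient wind, Ct = 0.8, 80 m rotor
u0, ct, d0 = 10.0, 0.8, 80.0
for x in (200.0, 500.0, 1000.0):
    u = u0 - jensen_deficit(u0, ct, d0, x)
    print(f"x = {x:6.0f} m: wake wind speed = {u:5.2f} m/s")
```

The deficit decays monotonically with downstream distance, which is the qualitative behaviour the mesoscale scheme must reproduce at unresolved scales.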

  14. Explicit modeling the progressive interface damage in fibrous composite: Analytical vs. numerical approach

    DEFF Research Database (Denmark)

    Kushch, V.I.; Shmegera, S.V.; Mishnaevsky, Leon

    2011-01-01

    of the multiple inclusion problem by means of complex potentials. The second, finite element model of FRC is based on the cohesive zone model of interface. Simulation of progressive debonding in FRC using the many-fiber models of composite has been performed. The advantageous features and applicability areas...... of both models are discussed. It has been shown that the developed models provide detailed analysis of the progressive debonding phenomena including the interface crack cluster formation, overall stiffness reduction and induced anisotropy of the effective elastic moduli of composite....

  15. A Regional Model for Malaria Vector Developmental Habitats Evaluated Using Explicit, Pond-Resolving Surface Hydrology Simulations.

    Directory of Open Access Journals (Sweden)

    Ernest Ohene Asare

Dynamical malaria models can relate precipitation to the availability of vector breeding sites using simple models of surface hydrology. Here, a revised scheme is developed for the VECTRI malaria model, which is evaluated alongside the default scheme using a two-year simulation by HYDREMATS, a 10-metre-resolution, village-scale model that explicitly simulates individual ponds. Despite the simplicity of the two VECTRI surface hydrology parametrization schemes, they can reproduce the sub-seasonal evolution of fractional water coverage. Calibration of the model parameters is required to simulate the mean pond fraction correctly. The default VECTRI model tended to overestimate water fraction in periods subject to light rainfall events and underestimate it during periods of intense rainfall. This systematic error was improved in the revised scheme by including a parametrization for surface run-off, such that light rainfall below the initial abstraction threshold does not contribute to ponds. After calibration of the pond model, the VECTRI model was able to simulate vector densities that compared well to the detailed agent-based model contained in HYDREMATS without further parameter adjustment. Substituting local rain-gauge data with satellite-retrieved precipitation gave a reasonable approximation, raising the prospects for regional malaria simulations even in data-sparse regions. However, further improvements could be made if a method can be derived to calibrate the key hydrology parameters of the pond model in each grid cell location, possibly also incorporating slope and soil texture.

  16. A Regional Model for Malaria Vector Developmental Habitats Evaluated Using Explicit, Pond-Resolving Surface Hydrology Simulations.

    Science.gov (United States)

    Asare, Ernest Ohene; Tompkins, Adrian Mark; Bomblies, Arne

    2016-01-01

Dynamical malaria models can relate precipitation to the availability of vector breeding sites using simple models of surface hydrology. Here, a revised scheme is developed for the VECTRI malaria model, which is evaluated alongside the default scheme using a two-year simulation by HYDREMATS, a 10-metre-resolution, village-scale model that explicitly simulates individual ponds. Despite the simplicity of the two VECTRI surface hydrology parametrization schemes, they can reproduce the sub-seasonal evolution of fractional water coverage. Calibration of the model parameters is required to simulate the mean pond fraction correctly. The default VECTRI model tended to overestimate water fraction in periods subject to light rainfall events and underestimate it during periods of intense rainfall. This systematic error was improved in the revised scheme by including a parametrization for surface run-off, such that light rainfall below the initial abstraction threshold does not contribute to ponds. After calibration of the pond model, the VECTRI model was able to simulate vector densities that compared well to the detailed agent-based model contained in HYDREMATS without further parameter adjustment. Substituting local rain-gauge data with satellite-retrieved precipitation gave a reasonable approximation, raising the prospects for regional malaria simulations even in data-sparse regions. However, further improvements could be made if a method can be derived to calibrate the key hydrology parameters of the pond model in each grid cell location, possibly also incorporating slope and soil texture.
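
The role of the initial abstraction threshold can be sketched as a minimal daily pond-fraction update. This is illustrative only: the parameter names and values are invented and do not reproduce VECTRI's actual formulation.

```python
def update_pond_fraction(w, rain, k_fill=5e-4, ia=1.0, evap=0.01):
    """Advance fractional pond coverage w (0..1) by one day.
    Rain below the initial-abstraction threshold ia (mm/day) produces
    no run-off and does not feed ponds; the excess fills ponds toward
    saturation, while an evaporation/infiltration term drains them."""
    runoff = max(0.0, rain - ia)          # mm/day of run-off reaching ponds
    w += k_fill * runoff * (1.0 - w)      # filling, limited near saturation
    w -= evap * w                         # linear drainage term
    return min(max(w, 0.0), 1.0)

# A month of light drizzle (below the threshold) vs a month of heavy rain
w_light = w_heavy = 0.1
for _ in range(30):
    w_light = update_pond_fraction(w_light, rain=0.8)   # below ia: ponds shrink
    w_heavy = update_pond_fraction(w_heavy, rain=20.0)  # above ia: ponds grow
```

This captures the qualitative fix described in the abstract: without the threshold, persistent light rain would spuriously keep ponds filled.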

  17. Explicit/multi-parametric model predictive control (MPC) of linear discrete-time systems by dynamic and multi-parametric programming

    KAUST Repository

Kouramas, K.I.; Faísca, N.P.; Panos, C.; Pistikopoulos, E.N.

    2011-01-01

    This work presents a new algorithm for solving the explicit/multi- parametric model predictive control (or mp-MPC) problem for linear, time-invariant discrete-time systems, based on dynamic programming and multi-parametric programming techniques
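
In explicit MPC, the optimization is solved offline and the control law is stored as an affine function per polyhedral critical region; the online controller only has to locate the region containing the current state. A minimal sketch with a hypothetical 1-D partition (the regions below are invented for illustration, not produced by the paper's algorithm):

```python
import numpy as np

def explicit_mpc_control(x, regions):
    """Online evaluation of an explicit MPC law: find the polyhedral
    critical region A x <= b containing the state x, then apply the
    affine law u = F x + g precomputed offline by mp-programming."""
    for A, b, F, g in regions:
        if np.all(A @ x <= b + 1e-9):
            return F @ x + g
    raise ValueError("state outside the explored state-space partition")

# Hypothetical partition of a 1-D state space: two saturated regions
# around an unsaturated one (u = -x clipped to [-1, 1]).
regions = [
    (np.array([[1.0]]), np.array([-1.0]),
     np.array([[0.0]]), np.array([1.0])),                 # x <= -1: u = 1
    (np.array([[1.0], [-1.0]]), np.array([1.0, 1.0]),
     np.array([[-1.0]]), np.array([0.0])),                # |x| <= 1: u = -x
    (np.array([[-1.0]]), np.array([-1.0]),
     np.array([[0.0]]), np.array([-1.0])),                # x >= 1: u = -1
]
u = explicit_mpc_control(np.array([0.4]), regions)        # -> -0.4
```

The point: all online computation is table lookup plus one matrix-vector product, which is what makes MPC feasible on fast or embedded systems.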

  18. An online model composition tool for system biology models.

    Science.gov (United States)

    Coskun, Sarp A; Cicek, A Ercument; Lai, Nicola; Dash, Ranjan K; Ozsoyoglu, Z Meral; Ozsoyoglu, Gultekin

    2013-09-05

There are multiple representation formats for Systems Biology computational models, and the Systems Biology Markup Language (SBML) is one of the most widely used. SBML is used to capture, store, and distribute computational models by Systems Biology data sources (e.g., the BioModels Database) and researchers. Therefore, there is a need for all-in-one web-based solutions that support advanced SBML functionalities such as uploading, editing, composing, visualizing, simulating, querying, and browsing computational models. We present the design and implementation of the Model Composition Tool (Interface) within the PathCase-SB (PathCase Systems Biology) web portal. The tool helps users compose systems biology models to facilitate the complex process of merging systems biology models. We also present three tools that support the model composition tool, namely, (1) the Model Simulation Interface, which generates a visual plot of the simulation according to the user's input; (2) the iModel Tool, a platform for users to upload their own models to compose; and (3) the SimCom Tool, which provides a side-by-side comparison of models being composed in the same pathway. Finally, we provide a web site that hosts BioModels Database models and a separate web site that hosts SBML Test Suite models. The model composition tool (and the other three tools) can be used with little or no knowledge of the SBML document structure. For this reason, students or anyone who wants to learn about systems biology will benefit from the described functionalities. SBML Test Suite models will be a nice starting point for beginners, and, for more advanced purposes, users will be able to access and employ models of the BioModels Database as well.
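
The essence of model composition — merging species and reactions while resolving name clashes — can be sketched on toy dictionary-based models. This is an illustration only, not the PathCase-SB implementation or the SBML format.

```python
def compose_models(m1, m2):
    """Merge two toy models (species dict + reaction list). Species
    present in both models with identical definitions become the
    junction points; clashing species that differ are renamed so the
    merge never silently overwrites a definition."""
    species = dict(m1["species"])
    rename = {}
    for sid, defn in m2["species"].items():
        if sid in species and species[sid] != defn:
            rename[sid] = f"{sid}_m2"          # disambiguate the clash
            species[rename[sid]] = defn
        else:
            species[sid] = defn

    def fix(r):
        # apply renames inside reactant/product lists
        return {k: [rename.get(s, s) for s in v] if isinstance(v, list) else v
                for k, v in r.items()}

    reactions = m1["reactions"] + [fix(r) for r in m2["reactions"]]
    return {"species": species, "reactions": reactions}

# Two toy pathway fragments joined through the shared species "pyr"
glycolysis = {"species": {"glc": "glucose", "pyr": "pyruvate"},
              "reactions": [{"id": "r1", "in": ["glc"], "out": ["pyr"]}]}
tca = {"species": {"pyr": "pyruvate", "cit": "citrate"},
       "reactions": [{"id": "r2", "in": ["pyr"], "out": ["cit"]}]}
merged = compose_models(glycolysis, tca)
```

Real SBML composition must additionally reconcile compartments, units, parameters, and rules, which is exactly the complexity the web tool is designed to hide.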

  19. Aggregate and Individual Replication Probability within an Explicit Model of the Research Process

    Science.gov (United States)

    Miller, Jeff; Schwarz, Wolf

    2011-01-01

    We study a model of the research process in which the true effect size, the replication jitter due to changes in experimental procedure, and the statistical error of effect size measurement are all normally distributed random variables. Within this model, we analyze the probability of successfully replicating an initial experimental result by…
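
The model's three normally distributed components can be explored with a small Monte Carlo sketch. Parameter values here are arbitrary, and "success" is simplified to a positive observed effect rather than a significance test, so this only illustrates the structure of the setup.

```python
import random

def replication_probability(mu, s_true, s_jit, s_err, n=200_000, seed=1):
    """Monte Carlo sketch of the setup described in the abstract:
    a true effect Delta ~ N(mu, s_true) is shared by both studies;
    each study adds its own replication jitter N(0, s_jit) and
    measurement error N(0, s_err). Returns the estimated probability
    that the replication is positive given a positive initial result."""
    rng = random.Random(seed)
    hits = trials = 0
    for _ in range(n):
        delta = rng.gauss(mu, s_true)                       # shared truth
        d1 = delta + rng.gauss(0, s_jit) + rng.gauss(0, s_err)
        if d1 <= 0.0:
            continue                                        # initial study "failed"
        d2 = delta + rng.gauss(0, s_jit) + rng.gauss(0, s_err)
        trials += 1
        hits += d2 > 0.0
    return hits / trials

p = replication_probability(mu=0.2, s_true=0.3, s_jit=0.1, s_err=0.3)
```

Because both studies share the same true effect, their outcomes are positively correlated, so the conditional replication probability exceeds the unconditional one.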

  20. Explicit validation of a surface shortwave radiation balance model over snow-covered complex terrain

    Science.gov (United States)

    Helbig, N.; Löwe, H.; Mayer, B.; Lehning, M.

    2010-09-01

    A model that computes the surface radiation balance for all sky conditions in complex terrain is presented. The spatial distribution of direct and diffuse sky radiation is determined from observations of incident global radiation, air temperature, and relative humidity at a single measurement location. Incident radiation under cloudless sky is spatially derived from a parameterization of the atmospheric transmittance. Direct and diffuse sky radiation for all sky conditions are obtained by decomposing the measured global radiation value. Spatial incident radiation values under all atmospheric conditions are computed by adjusting the spatial radiation values obtained from the parametric model with the radiation components obtained from the decomposition model at the measurement site. Topographic influences such as shading are accounted for. The radiosity approach is used to compute anisotropic terrain reflected radiation. Validations of the shortwave radiation balance model are presented in detail for a day with cloudless sky. For a day with overcast sky a first validation is presented. Validation of a section of the horizon line as well as of individual radiation components is performed with high-quality measurements. A new measurement setup was designed to determine terrain reflected radiation. There is good agreement between the measurements and the modeled terrain reflected radiation values as well as with incident radiation values. A comparison of the model with a fully three-dimensional radiative transfer Monte Carlo model is presented. That validation reveals a good agreement between modeled radiation values.
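
A common way to decompose measured global radiation into direct and diffuse components is via the clearness index. The sketch below uses the Erbs correlation as a stand-in; the paper's own decomposition model may differ.

```python
def erbs_diffuse_fraction(kt):
    """Diffuse fraction of global horizontal radiation as a function of
    the clearness index kt = G / (extraterrestrial horizontal
    irradiance), using the Erbs correlation."""
    if kt <= 0.22:
        return 1.0 - 0.09 * kt
    if kt <= 0.80:
        return (0.9511 - 0.1604 * kt + 4.388 * kt**2
                - 16.638 * kt**3 + 12.336 * kt**4)
    return 0.165

g = 600.0          # measured global horizontal radiation, W/m2 (illustrative)
g_extra = 1000.0   # extraterrestrial horizontal irradiance, W/m2 (illustrative)
kt = g / g_extra
diffuse = erbs_diffuse_fraction(kt) * g
direct = g - diffuse
```

The split matters in complex terrain because the direct component is redistributed by slope, aspect and shading, while the diffuse component is weighted by the visible sky fraction.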

  1. BETR global - A geographically-explicit global-scale multimedia contaminant fate model

    International Nuclear Information System (INIS)

    MacLeod, Matthew; Waldow, Harald von; Tay, Pascal; Armitage, James M.; Woehrnschimmel, Henry; Riley, William J.; McKone, Thomas E.; Hungerbuhler, Konrad

    2011-01-01

We present two new software implementations of the BETR Global multimedia contaminant fate model. The model uses steady-state or non-steady-state mass-balance calculations to describe the fate and transport of persistent organic pollutants using a desktop computer. The global environment is described using a database of long-term average monthly conditions on a 15° × 15° grid. We demonstrate BETR Global by modeling the global sources, transport, and removal of decamethylcyclopentasiloxane (D5). - Two new software implementations of the Berkeley-Trent Global Contaminant Fate Model are available. The new model software is illustrated using a case study of the global fate of decamethylcyclopentasiloxane (D5).
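
The steady-state mass-balance calculation at the heart of such multimedia fate models reduces to a linear solve. Below is a minimal sketch with an invented three-compartment system; the rate constants are illustrative and are not BETR's.

```python
import numpy as np

# Steady-state mass balance for a toy 3-compartment fate model.
# K[i, j] is the first-order rate (1/h) of transfer from compartment j
# to compartment i; diagonal entries collect all losses from each
# compartment (outgoing transfers plus degradation). With emission
# vector e (kg/h), steady state means K m + e = 0, so m = solve(K, -e).
k12, k21, k23, deg = 0.02, 0.01, 0.005, 0.001   # illustrative rates
K = np.array([
    [-(k12 + deg),  k21,                0.0 ],
    [ k12,         -(k21 + k23 + deg),  0.0 ],
    [ 0.0,          k23,               -deg ],
])
e = np.array([1.0, 0.0, 0.0])        # continuous emission into box 1, kg/h
m = np.linalg.solve(K, -e)           # steady-state masses, kg
```

A consistency check: since degradation is the only ultimate sink, the total steady-state mass must equal emission divided by the degradation rate (1.0 / 0.001 = 1000 kg here).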

  2. Spatially explicit modeling of particulate nutrient flux in Large global rivers

    Science.gov (United States)

    Cohen, S.; Kettner, A.; Mayorga, E.; Harrison, J. A.

    2017-12-01

    Water, sediment, nutrient and carbon fluxes along river networks have undergone considerable alterations in response to anthropogenic and climatic changes, with significant consequences to infrastructure, agriculture, water security, ecology and geomorphology worldwide. However, in a global setting, these changes in fluvial fluxes and their spatial and temporal characteristics are poorly constrained, due to the limited availability of continuous and long-term observations. We present results from a new global-scale particulate modeling framework (WBMsedNEWS) that combines the Global NEWS watershed nutrient export model with the spatially distributed WBMsed water and sediment model. We compare the model predictions against multiple observational datasets. The results indicate that the model is able to accurately predict particulate nutrient (Nitrogen, Phosphorus and Organic Carbon) fluxes on an annual time scale. Analysis of intra-basin nutrient dynamics and fluxes to global oceans is presented.

  3. An open and extensible framework for spatially explicit land use change modelling: the lulcc R package

    Science.gov (United States)

    Moulds, S.; Buytaert, W.; Mijic, A.

    2015-10-01

    We present the lulcc software package, an object-oriented framework for land use change modelling written in the R programming language. The contribution of the work is to resolve the following limitations associated with the current land use change modelling paradigm: (1) the source code for model implementations is frequently unavailable, severely compromising the reproducibility of scientific results and making it impossible for members of the community to improve or adapt models for their own purposes; (2) ensemble experiments to capture model structural uncertainty are difficult because of fundamental differences between implementations of alternative models; and (3) additional software is required because existing applications frequently perform only the spatial allocation of change. The package includes a stochastic ordered allocation procedure as well as an implementation of the CLUE-S algorithm. We demonstrate its functionality by simulating land use change at the Plum Island Ecosystems site, using a data set included with the package. It is envisaged that lulcc will enable future model development and comparison within an open environment.

  4. BETR Global - A geographically explicit global-scale multimedia contaminant fate model

    Energy Technology Data Exchange (ETDEWEB)

    Macleod, M.; Waldow, H. von; Tay, P.; Armitage, J. M.; Wohrnschimmel, H.; Riley, W.; McKone, T. E.; Hungerbuhler, K.

    2011-04-01

We present two new software implementations of the BETR Global multimedia contaminant fate model. The model uses steady-state or non-steady-state mass-balance calculations to describe the fate and transport of persistent organic pollutants using a desktop computer. The global environment is described using a database of long-term average monthly conditions on a 15° × 15° grid. We demonstrate BETR Global by modeling the global sources, transport, and removal of decamethylcyclopentasiloxane (D5).

  5. Deterministic Compilation of Temporal Safety Properties in Explicit State Model Checking

    Data.gov (United States)

    National Aeronautics and Space Administration — The translation of temporal logic specifications constitutes an essen- tial step in model checking and a major influence on the efficiency of formal verification via...

  6. Using Satellite Remote Sensing Data in a Spatially Explicit Price Model

    Science.gov (United States)

    Brown, Molly E.; Pinzon, Jorge E.; Prince, Stephen D.

    2007-01-01

Famine early warning organizations use data from multiple disciplines to assess the food insecurity of communities and regions in less-developed parts of the world. In this paper we integrate several available indicators to enhance the information used in preparing for and responding to food security emergencies. The assessment uses a price model based on the relationship between the suitability of the growing season and market prices for coarse grain. The model is then used to create spatially continuous maps of millet prices. The model is applied to the dry central and northern areas of West Africa, using satellite-derived vegetation indices for the entire region. By coupling the model with vegetation data estimated for one to four months into the future, maps are created of a leading indicator of potential price movements. It is anticipated that these maps can be used to enable early warning of famine and for planning appropriate responses.
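
The core of the price model — an inverse relationship between growing-season vegetation and grain prices — can be sketched as a simple log-linear fit. All numbers below are made up for illustration; the paper's actual model specification may differ.

```python
import numpy as np

# Hypothetical seasonal NDVI integrals and millet prices for one market.
ndvi = np.array([0.42, 0.35, 0.50, 0.30, 0.45, 0.38])    # invented seasons
price = np.array([110., 160., 90., 200., 105., 140.])    # invented prices

# Fit log(price) = a + b * ndvi; b should be negative (good seasons
# mean better supply and lower prices).
a, b = np.polyfit(ndvi, np.log(price), 1)[::-1]          # intercept, slope

def predict_price(ndvi_forecast):
    """Map a forecast vegetation index to an expected grain price."""
    return float(np.exp(a + b * ndvi_forecast))
```

Coupled with one-to-four-month vegetation forecasts, such a fitted relationship yields the leading price indicator described in the abstract.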

  7. Communication: Role of explicit water models in the helix folding/unfolding processes

    Science.gov (United States)

    Palazzesi, Ferruccio; Salvalaglio, Matteo; Barducci, Alessandro; Parrinello, Michele

    2016-09-01

In recent years, it has become evident that computer simulations can assume a relevant role in modelling protein dynamical motions, thanks to their ability to provide a fully atomistic image of the processes under investigation. The ability of current protein force fields to reproduce the correct thermodynamic and kinetic behaviour of systems is thus an essential ingredient for improving our understanding of many relevant biological functionalities. In this work, employing the latest developments of the metadynamics framework, we compare the ability of state-of-the-art all-atom empirical functions and water models to consistently reproduce the folding and unfolding of a helix turn motif in a model peptide. This theoretical study puts in evidence that the choice of the water model can influence the thermodynamics and the kinetics of the system under investigation, and for this reason cannot be considered trivial.

  8. A spatially explicit model of functional connectivity for the endangered Przewalski's gazelle (Procapra przewalskii) in a patchy landscape.

    Directory of Open Access Journals (Sweden)

    Chunlin Li

Habitat fragmentation, associated with human population expansion, impedes dispersal, reduces gene flow and aggravates inbreeding in species on the brink of extinction. Both scientific and conservation communities increasingly realize that maintaining and restoring landscape connectivity is of vital importance in biodiversity conservation. Prior to any conservation initiatives, it is helpful to present conservation practitioners with a spatially explicit model of functional connectivity for the target species or landscape. Using Przewalski's gazelle (Procapra przewalskii) as a model of an endangered ungulate species in a highly fragmented landscape, we present a model providing spatially explicit information to inform the long-term preservation of well-connected metapopulations. We employed a Geographic Information System (GIS) and expert-literature method to create a habitat suitability map, to identify potential habitats and to delineate a functional connectivity network (least-cost movement corridors and paths) for the gazelle. Results indicated that there were limited suitable habitats for the gazelle, mainly found to the north and northwest of the Qinghai Lake, where four of five potential habitat patches were identified. Fifteen pairs of least-cost corridors and paths were mapped connecting eleven extant populations and two neighboring potential patches. The least-cost paths ranged from 0.2 km to 26.8 km in length (averaging 12.4 km) and were all longer than corresponding Euclidean distances. The model outputs were validated and supported by the latest findings in landscape genetics of the species, and may provide impetus for connectivity conservation programs. Dispersal barriers were examined and appropriate mitigation strategies were suggested. This study provides conservation practitioners with thorough and visualized information to preserve the landscape connectivity for Przewalski's gazelle. In a general sense, we proposed a heuristic framework
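
Least-cost corridor delineation of this kind is typically computed with Dijkstra's algorithm over a resistance surface. A minimal sketch on a toy grid follows; the resistance values are invented, and real analyses use GIS rasters with many resistance classes.

```python
import heapq

def least_cost_path(cost, start, goal):
    """Dijkstra least-cost route across a resistance grid using
    4-neighbour moves; cost[r][c] is the per-cell resistance to
    movement. The returned total includes both endpoint cells."""
    rows, cols = len(cost), len(cost[0])
    dist = {start: cost[start[0]][start[1]]}
    pq = [(dist[start], start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            return d
        if d > dist.get((r, c), float("inf")):
            continue                                 # stale queue entry
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(pq, (nd, (nr, nc)))
    return float("inf")

# Toy landscape: low resistance in habitat, a high-resistance barrier
# (e.g. a road) splitting it, with a gap in the bottom row.
grid = [[1, 1, 9, 1],
        [1, 1, 9, 1],
        [1, 1, 1, 1]]
d = least_cost_path(grid, (0, 0), (0, 3))    # detours around the barrier
```

As in the gazelle study, the least-cost route (total 8 here) is longer than the straight-line crossing, because cutting through the barrier would cost more.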

  9. Explicit representation and parametrised impacts of under ice shelf seas in the z∗ coordinate ocean model NEMO 3.6

    Directory of Open Access Journals (Sweden)

    P. Mathiot

    2017-07-01

Ice-shelf–ocean interactions are a major source of freshwater on the Antarctic continental shelf and have a strong impact on ocean properties, ocean circulation and sea ice. However, climate models based on the ocean–sea ice model NEMO (Nucleus for European Modelling of the Ocean) currently do not include these interactions in any detail. The capability of explicitly simulating the circulation beneath ice shelves is introduced in the non-linear free surface model NEMO. Its implementation into the NEMO framework and its assessment in an idealised and realistic circum-Antarctic configuration is described in this study. Compared with the current prescription of ice shelf melting (i.e. at the surface), inclusion of open sub-ice-shelf cavities leads to a decrease in sea ice thickness along the coast, a weakening of the ocean stratification on the shelf, a decrease in salinity of high-salinity shelf water on the Ross and Weddell sea shelves and an increase in the strength of the gyres that circulate within the over-deepened basins on the West Antarctic continental shelf. Mimicking the overturning circulation under the ice shelves by introducing a prescribed meltwater flux over the depth range of the ice shelf base, rather than at the surface, is also assessed. It yields similar improvements in the simulated ocean properties and circulation over the Antarctic continental shelf to those from the explicit ice shelf cavity representation. With the ice shelf cavities opened, the widely used three-equation ice shelf melting formulation, which enables an interactive computation of melting, is tested. Comparison with observational estimates of ice shelf melting indicates realistic results for most ice shelves. However, melting rates for the Amery, Getz and George VI ice shelves are considerably overestimated.

  10. Tacit to explicit knowledge conversion.

    Science.gov (United States)

    Cairó Battistutti, Osvaldo; Bork, Dominik

    2017-11-01

    The ability to create, use and transfer knowledge may allow the creation or improvement of new products or services. But knowledge is often tacit: It lives in the minds of individuals, and therefore, it is difficult to transfer it to another person by means of the written word or verbal expression. This paper addresses this important problem by introducing a methodology, consisting of a four-step process that facilitates tacit to explicit knowledge conversion. The methodology utilizes conceptual modeling, thus enabling understanding and reasoning through visual knowledge representation. This implies the possibility of understanding concepts and ideas, visualized through conceptual models, without using linguistic or algebraic means. The proposed methodology is conducted in a metamodel-based tool environment whose aim is efficient application and ease of use.

  11. Dynamic modeling and explicit/multi-parametric MPC control of pressure swing adsorption systems

    KAUST Repository

    Khajuria, Harish; Pistikopoulos, Efstratios N.

    2011-01-01

    objective is to fast track H2 purity to a set point value of 99.99%. To perform this task, a rigorous and systematic framework is employed. First, a high fidelity detailed dynamic model is built to represent the system's real operation, and understand its

  12. Modelling explicit tides in the Indonesian seas: An important process for surface sea water properties.

    Science.gov (United States)

    Nugroho, Dwiyoga; Koch-Larrouy, Ariane; Gaspar, Philippe; Lyard, Florent; Reffray, Guillaume; Tranchant, Benoit

    2017-06-16

Very intense internal tides take place in the Indonesian seas. They dissipate and affect the vertical distribution of temperature and currents, which in turn influence the survival rates and transports of most planktonic organisms at the base of the whole marine ecosystem. This study uses the INDESO physical model to characterize the spatio-temporal patterns of internal tides in the Indonesian Seas. The model reproduced internal tide dissipation in agreement with previous fine-structure and microstructure in-situ observations at the generation sites. The model also produced water mass transformation similar to the previous parameterization of Koch-Larrouy et al. (2007) and shows good agreement with observations. The resulting cooling at the surface is 0.3°C, with maxima of 0.8°C at the locations of internal tide energy, and with stronger cooling in austral winter. The cycle of spring and neap tides modulates this impact by 0.1°C to 0.3°C. These results suggest that mixing due to internal tides might also upwell nutrients to the surface at frequencies similar to the tidal frequencies. Implications for biogeochemical modelling are important.

  13. Explicit Foreground and Background Modeling in The Classification of Text Blocks in Scene Images

    NARCIS (Netherlands)

    Sriman, Bowornrat; Schomaker, Lambertus

    2015-01-01

Achieving high accuracy in classifying foreground and background is an interesting challenge in the field of scene image analysis because of the wide range of illumination, complex backgrounds, and scale changes. Classifying foreground and background using a bag-of-features model gives good results.

  14. An explicit statistical model of learning lexical segmentation using multiple cues

    NARCIS (Netherlands)

    Çöltekin, Çağrı; Nerbonne, John; Lenci, Alessandro; Padró, Muntsa; Poibeau, Thierry; Villavicencio, Aline

    2014-01-01

    This paper presents an unsupervised and incremental model of learning segmentation that combines multiple cues whose use by children and adults was attested by experimental studies. The cues we exploit in this study are predictability statistics, phonotactics, lexical stress and partial lexical ...
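    The predictability cue mentioned in this record can be illustrated with the classic transitional-probability heuristic: word boundaries tend to fall where the probability of the next syllable dips. The sketch below implements that single cue only (an illustrative reconstruction, not the authors' multi-cue model):

    ```python
    from collections import defaultdict

    def segment_by_predictability(utterances):
        """Insert word boundaries at local minima of syllable-to-syllable
        transitional probability (predictability is one of several cues
        the paper combines; this sketch uses it alone)."""
        # Count syllable bigrams and left-context unigrams over the corpus.
        uni, bi = defaultdict(int), defaultdict(int)
        for utt in utterances:
            for a, b in zip(utt, utt[1:]):
                uni[a] += 1
                bi[(a, b)] += 1

        def tp(a, b):  # transitional probability P(b | a)
            return bi[(a, b)] / uni[a] if uni[a] else 0.0

        segmented = []
        for utt in utterances:
            tps = [tp(a, b) for a, b in zip(utt, utt[1:])]
            words, start = [], 0
            for i in range(1, len(tps)):
                # A dip in predictability suggests a word boundary here.
                if tps[i] < tps[i - 1] and (i + 1 == len(tps) or tps[i] < tps[i + 1]):
                    words.append(utt[start:i + 1])
                    start = i + 1
            words.append(utt[start:])
            segmented.append(words)
        return segmented
    ```

    On a toy corpus where "ba-by" and "do-gy" recur as units, the low-probability transition between them is detected as a boundary.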

  15. Explicit model predictive control applications in power systems: an AGC study for an isolated industrial system

    DEFF Research Database (Denmark)

    Jiang, Hao; Lin, Jin; Song, Yonghua

    2016-01-01

    Model predictive control (MPC), which can take system constraints into account, is one of the most advanced control technologies in use today. In power systems, MPC is typically applied by having an online MPC controller compute an optimal control sequence at every step. The main drawback is that the control law...

  16. Spatially explicit Schistosoma infection risk in eastern Africa using Bayesian geostatistical modelling

    DEFF Research Database (Denmark)

    Schur, Nadine; Hürlimann, Eveline; Stensgaard, Anna-Sofie

    2013-01-01

    are currently infected with either S. mansoni, or S. haematobium, or both species concurrently. Country-specific population-adjusted prevalence estimates range between 12.9% (Uganda) and 34.5% (Mozambique) for S. mansoni and between 11.9% (Djibouti) and 40.9% (Mozambique) for S. haematobium. Our models revealed...

  17. Tool wear modeling using abductive networks

    Science.gov (United States)

    Masory, Oren

    1992-09-01

    A tool wear model based on abductive networks, which consist of a network of 'polynomial' nodes, is described. The model relates the cutting parameters, components of the cutting force, and machining time to flank wear, so that real-time measurements of the cutting force can be used to monitor the machining process. The model is obtained by a training process in which the connectivity between the network's nodes and the polynomial coefficients of each node are determined by optimizing a performance criterion. Actual wear measurements of coated and uncoated carbide inserts were used for training and evaluating the established model.
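    The building block of such a network is a low-order polynomial node fitted by least squares, in the GMDH tradition from which abductive networks derive. The sketch below fits one quadratic node mapping two inputs (e.g. a force component and machining time) to wear; it is an illustrative reconstruction, not the paper's trained network:

    ```python
    import numpy as np

    def fit_polynomial_node(x1, x2, y):
        """Least-squares fit of one quadratic 'polynomial' node,
            y ~ a0 + a1*x1 + a2*x2 + a3*x1^2 + a4*x2^2 + a5*x1*x2,
        the unit that abductive (GMDH-style) networks stack and prune
        during training.  Returns the coefficients and fitted values."""
        X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
        coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
        return coeffs, X @ coeffs
    ```

    Training the full network then amounts to selecting which node outputs feed which downstream nodes, guided by the performance criterion mentioned in the abstract.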

  18. The “Destabilizing” Effect of Cannibalism in a Spatially Explicit Three-Species Age Structured Predator-Prey Model

    Directory of Open Access Journals (Sweden)

    Aladeen Al Basheer

    2017-01-01

    Full Text Available Cannibalism, the act of killing and consumption of conspecifics, is generally considered to be a stabilising process in ODE models of predator-prey systems. On the other hand, Sun et al. were the first to show that cannibalism can cause Turing instability in the classical Rosenzweig-MacArthur two-species PDE model, which is an impossibility without cannibalism. Magnússon's classic work is the first to show that cannibalism in a structured three-species predator-prey ODE model can actually be destabilising. In the current manuscript we consider the PDE form of the three-species model proposed in Magnússon's classic work. We prove that, in the absence of cannibalism, Turing instability is an impossibility in this model, for any range of parameters. However, the inclusion of cannibalism can cause Turing instability. Thus, to the best of our knowledge, we report the first cannibalism-induced Turing instability result in spatially explicit three-species age structured predator-prey systems. We also show that, in the classical ODE model proposed by Magnússon, cannibalism can act as a lifeboat mechanism for the prey.

  19. HABSEED: a Simple Spatially Explicit Meta-Populations Model Using Remote Sensing Derived Habitat Quality Data

    Science.gov (United States)

    Heumann, B. W.; Guichard, F.; Seaquist, J. W.

    2005-05-01

    The HABSEED model uses remote sensing derived NPP as a surrogate for habitat quality as the driving mechanism for population growth and local seed dispersal. The model has been applied to the Sahel region of Africa. Results show that the functional response of plants to habitat quality alters population distribution. Plants more tolerant of medium quality habitat have greater distributions to the North while plants requiring only the best habitat are limited to the South. For all functional response types, increased seed production results in diminishing returns. Functional response types have been related to life history tradeoffs and r-K strategies based on the results. Results are compared to remote sensing derived vegetation land cover.

  20. Simulated x-ray scattering of protein solutions using explicit-solvent models

    International Nuclear Information System (INIS)

    Park, Sanghyun; Bardhan, Jaydeep P.; Makowski, Lee; Roux, Benoit

    2009-01-01

    X-ray solution scattering shows new promise for the study of protein structures, complementing crystallography and nuclear magnetic resonance. In order to realize the full potential of solution scattering, it is necessary to not only improve experimental techniques but also develop accurate and efficient computational schemes to relate atomistic models to measurements. Previous computational methods, based on continuum models of water, have been unable to calculate scattering patterns accurately, especially in the wide-angle regime which contains most of the information on the secondary, tertiary, and quaternary structures. Here we present a novel formulation based on the atomistic description of water, in which scattering patterns are calculated from atomic coordinates of protein and water. Without any empirical adjustments, this method produces scattering patterns of unprecedented accuracy in the length scale between 5 and 100 Å, as we demonstrate by comparing simulated and observed scattering patterns for myoglobin and lysozyme.
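    The standard way to go from explicit atomic coordinates to an orientationally averaged solution scattering curve is the Debye formula, I(q) = Σᵢⱼ fᵢfⱼ sin(q·rᵢⱼ)/(q·rᵢⱼ). The sketch below evaluates it directly (with q-independent form factors for simplicity); it illustrates the principle, not the paper's full explicit-solvent treatment:

    ```python
    import numpy as np

    def debye_intensity(coords, f, q_values):
        """Orientationally averaged scattering intensity from explicit
        atomic coordinates via the Debye formula.
        coords: (N, 3) positions; f: (N,) scattering factors (taken
        q-independent here for simplicity)."""
        diff = coords[:, None, :] - coords[None, :, :]
        r = np.sqrt((diff ** 2).sum(-1))      # pairwise distances, (N, N)
        ff = np.outer(f, f)
        intensity = []
        for q in q_values:
            x = q * r
            # sin(x)/x with the x -> 0 limit handled as 1 (diagonal terms).
            sinc = np.where(x > 0, np.sin(x) / np.where(x > 0, x, 1.0), 1.0)
            intensity.append((ff * sinc).sum())
        return np.array(intensity)
    ```

    For a protein-plus-water box the same double sum runs over all atoms, which is why efficient schemes matter at wide angles.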

  1. An explicit canopy BRDF model and inversion. [Bidirectional Reflectance Distribution Function

    Science.gov (United States)

    Liang, Shunlin; Strahler, Alan H.

    1992-01-01

    Based on a rigorous canopy radiative transfer equation, the multiple scattering radiance is approximated by asymptotic theory, and the single scattering radiance calculation, which requires a numerical integration to account for the hotspot effect, is simplified. A new formulation is presented to obtain a more exact angular dependence of the sky radiance distribution. The unscattered solar radiance and single scattering radiance are calculated exactly, and the multiple scattering is approximated by the delta two-stream atmospheric radiative transfer model. Numerical tests show that the parametric canopy model is very accurate, especially when the viewing angles are smaller than 55 deg. The Powell algorithm is used to retrieve biospheric parameters from ground-measured multiangle observations.

  2. Web tools for predictive toxicology model building.

    Science.gov (United States)

    Jeliazkova, Nina

    2012-07-01

    The development and use of web tools in chemistry now spans more than 15 years. Powered by advances in Internet technologies, the current generation of web systems is starting to expand into areas traditionally reserved for desktop applications. The web platforms integrate data storage, cheminformatics and data analysis tools. The ease of use and the collaborative potential of the web are compelling, despite the challenges. The topic of this review is a set of recently published web tools that facilitate predictive toxicology model building. The focus is on software platforms offering web access to chemical structure-based methods, although some of the frameworks could also provide bioinformatics or hybrid data analysis functionalities. A number of historical and current developments are cited. In order to provide a comparable assessment, the following characteristics are considered: support for workflows, descriptor calculations, visualization, modeling algorithms, data management and data sharing capabilities, availability of GUI or programmatic access, and implementation details. The success of the Web is largely due to its highly decentralized, yet sufficiently interoperable model for information access. The expected future convergence between cheminformatics and bioinformatics databases poses new challenges for the management and analysis of large data sets. The web tools in predictive toxicology will likely continue to evolve toward the right mix of flexibility, performance, scalability, interoperability, sets of unique features offered, friendly user interfaces, programmatic access for advanced users, platform independence, results reproducibility, curation and crowdsourcing utilities, collaborative sharing and secure access.

  3. Spatially explicit modeling of lesser prairie-chicken lek density in Texas

    Science.gov (United States)

    Timmer, Jennifer M.; Butler, M.J.; Ballard, Warren; Boal, Clint W.; Whitlaw, Heather A.

    2014-01-01

    As with many other grassland birds, lesser prairie-chickens (Tympanuchus pallidicinctus) have experienced population declines in the Southern Great Plains. Currently they are proposed for federal protection under the Endangered Species Act. In addition to a history of land-uses that have resulted in habitat loss, lesser prairie-chickens now face a new potential disturbance from energy development. We estimated lek density in the occupied lesser prairie-chicken range of Texas, USA, and modeled anthropogenic and vegetative landscape features associated with lek density. We used an aerial line-transect survey method to count lesser prairie-chicken leks in spring 2010 and 2011 and surveyed 208 randomly selected 51.84-km² blocks. We divided each survey block into 12.96-km² quadrats and summarized landscape variables within each quadrat. We then used hierarchical distance-sampling models to examine the relationship between lek density and anthropogenic and vegetative landscape features and predict how lek density may change in response to changes on the landscape, such as an increase in energy development. Our best models indicated lek density was related to percent grassland, region (i.e., the northeast or southwest region of the Texas Panhandle), total percentage of grassland and shrubland, paved road density, and active oil and gas well density. Predicted lek density peaked at 0.39 leks/12.96 km² (SE = 0.09) and 2.05 leks/12.96 km² (SE = 0.56) in the northeast and southwest region of the Texas Panhandle, respectively, which corresponds to approximately 88% and 44% grassland in the northeast and southwest region. Lek density increased with an increase in total percentage of grassland and shrubland and was greatest in areas with lower densities of paved roads and lower densities of active oil and gas wells. We used the 2 most competitive models to predict lek abundance and estimated 236 leks (CV = 0.138, 95% CI = 177-306 leks) for our sampling area. Our results suggest that ...
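    The core of a distance-sampling density estimate is a detection function fitted to the observed perpendicular distances; a common choice is the half-normal g(x) = exp(-x²/(2σ²)). The sketch below uses the no-truncation maximum-likelihood estimate of σ and converts counts to density via the effective strip half-width (a simplified single-level version, not the hierarchical covariate models of the paper):

    ```python
    import math

    def halfnormal_density(distances, w, n_detected, line_length):
        """Line-transect density estimate with a half-normal detection
        function g(x) = exp(-x^2 / (2 sigma^2)).
        sigma^2 is estimated as the mean squared detection distance
        (the no-truncation MLE); the effective strip half-width is
        mu = integral of g from 0 to truncation distance w, and
        D = n / (2 * mu * L)."""
        sigma2 = sum(x * x for x in distances) / len(distances)
        sigma = math.sqrt(sigma2)
        mu = sigma * math.sqrt(math.pi / 2) * math.erf(w / (sigma * math.sqrt(2)))
        return n_detected / (2.0 * mu * line_length)
    ```

    Tighter clustering of detections near the line implies a narrower effective strip and hence a higher density for the same count, which is the behavior the assertion below checks.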

  4. High Performance Programming Using Explicit Shared Memory Model on the Cray T3D

    Science.gov (United States)

    Saini, Subhash; Simon, Horst D.; Lasinski, T. A. (Technical Monitor)

    1994-01-01

    The Cray T3D is the first-phase system in Cray Research Inc.'s (CRI) three-phase massively parallel processing program. In this report we describe the architecture of the T3D, as well as the CRAFT (Cray Research Adaptive Fortran) programming model, and contrast it with PVM, which is also supported on the T3D. We present some performance data based on the NAS Parallel Benchmarks to illustrate both architectural and software features of the T3D.

  5. Asymptotic analysis for a simple explicit estimator in Barndorff-Nielsen and Shephard stochastic volatility models

    DEFF Research Database (Denmark)

    Hubalek, Friedrich; Posedel, Petra

    expressions for the asymptotic covariance matrix. We develop in detail the martingale estimating function approach for a bivariate model that is not a diffusion but admits jumps. We do not use ergodicity arguments. We assume that both logarithmic returns and instantaneous variance are observed...... on a discrete grid of fixed width, and the observation horizon tends to infinity. This analysis is a starting point and benchmark for further developments concerning optimal martingale estimating functions, and for theoretical and empirical investigations, that replace the (actually unobserved) variance process...

  6. Stable explicit coupling of the Yee scheme with a linear current model in fluctuating magnetized plasmas

    International Nuclear Information System (INIS)

    Silva, Filipe da; Pinto, Martin Campos; Després, Bruno; Heuraux, Stéphane

    2015-01-01

    This work analyzes the stability of the Yee scheme for non-stationary Maxwell's equations coupled with a linear current model with density fluctuations. We show that the usual procedure may yield an unstable scheme for physical situations corresponding to strongly magnetized plasmas in X-mode (TE) polarization. We propose instead a first-order clustered discretization of the vector product, which restores a stable coupling. We validate the schemes on test cases representative of direct numerical simulations of X-mode in a magnetic fusion plasma, including turbulence.
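    The stability question arises on top of the standard Yee staggered leapfrog, which in one dimension (normalized units, c = 1) reads as below. This sketch shows only the uncoupled update that the paper extends with a linear current term; it is not the authors' coupled scheme:

    ```python
    import numpy as np

    def yee_1d(nz=200, nt=400, courant=0.5):
        """Leapfrog update of the 1D Yee scheme in normalized units.
        E and B live on staggered grids and are advanced half a step
        apart; the instability studied in the paper appears when a
        linear current J is added to the E update (omitted here)."""
        E = np.zeros(nz)
        B = np.zeros(nz)           # B[i] sits between E[i] and E[i+1]
        E[nz // 2] = 1.0           # initial pulse
        for _ in range(nt):
            B[:-1] -= courant * (E[1:] - E[:-1])   # dB/dt = -dE/dz
            E[1:]  -= courant * (B[1:] - B[:-1])   # dE/dt = -dB/dz
        return E, B
    ```

    With a Courant number at or below 1 the uncoupled scheme stays bounded; the paper's point is that a naive collocation of the current term can break this property in strongly magnetized X-mode settings.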

  7. Analyzing key constraints to biogas production from crop residues and manure in the EU—A spatially explicit model

    Science.gov (United States)

    Persson, U. Martin

    2017-01-01

    This paper presents a spatially explicit method for making regional estimates of the potential for biogas production from crop residues and manure, accounting for key technical, biochemical, environmental and economic constraints. Methods for making such estimates are important as biofuels from agricultural residues are receiving increasing policy support from the EU and major biogas producers, such as Germany and Italy, in response to concerns over unintended negative environmental and social impacts of conventional biofuels. This analysis comprises a spatially explicit estimate of crop residue and manure production for the EU at 250 m resolution, and a biogas production model accounting for local constraints such as the sustainable removal of residues, transportation of substrates, and the substrates’ biochemical suitability for anaerobic digestion. In our base scenario, the EU biogas production potential from crop residues and manure is about 0.7 EJ/year, nearly double the current EU production of biogas from agricultural substrates, most of which does not come from residues or manure. An extensive sensitivity analysis of the model shows that the potential could easily be 50% higher or lower, depending on the stringency of economic, technical and biochemical constraints. We find that the potential is particularly sensitive to constraints on the substrate mixtures’ carbon-to-nitrogen ratio and dry matter concentration. Hence, the potential to produce biogas from crop residues and manure in the EU depends to a large extent on the possibility to overcome the challenges associated with these substrates, either by complementing them with suitable co-substrates (e.g. household waste and energy crops), or through further development of biogas technology (e.g. pretreatment of substrates and recirculation of effluent). PMID:28141827
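    The two constraints the sensitivity analysis singles out, carbon-to-nitrogen ratio and dry-matter concentration, are simple mixture-level checks. The sketch below evaluates them for a substrate mix; the numeric bounds and substrate compositions are illustrative assumptions, not the paper's calibrated values:

    ```python
    def mixture_feasible(substrates, cn_range=(15.0, 30.0), max_dm=0.12):
        """Check a substrate mixture against two binding constraints on
        anaerobic digestion: carbon-to-nitrogen (C/N) ratio and
        dry-matter (DM) concentration.  Bounds are illustrative.
        substrates: list of tuples
            (mass_kg, dm_fraction, c_per_kg_dm, n_per_kg_dm)."""
        mass = sum(m for m, *_ in substrates)
        dm   = sum(m * d for m, d, *_ in substrates)
        c    = sum(m * d * cc for m, d, cc, _ in substrates)
        n    = sum(m * d * nn for m, d, _, nn in substrates)
        cn = c / n
        dm_conc = dm / mass
        return cn_range[0] <= cn <= cn_range[1] and dm_conc <= max_dm
    ```

    A straw-only feed fails on C/N (too carbon-rich), while blending a small amount of straw into manure brings the mixture inside both bounds, which is the co-substrate logic the abstract describes.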

  8. Analyzing key constraints to biogas production from crop residues and manure in the EU-A spatially explicit model.

    Science.gov (United States)

    Einarsson, Rasmus; Persson, U Martin

    2017-01-01

    This paper presents a spatially explicit method for making regional estimates of the potential for biogas production from crop residues and manure, accounting for key technical, biochemical, environmental and economic constraints. Methods for making such estimates are important as biofuels from agricultural residues are receiving increasing policy support from the EU and major biogas producers, such as Germany and Italy, in response to concerns over unintended negative environmental and social impacts of conventional biofuels. This analysis comprises a spatially explicit estimate of crop residue and manure production for the EU at 250 m resolution, and a biogas production model accounting for local constraints such as the sustainable removal of residues, transportation of substrates, and the substrates' biochemical suitability for anaerobic digestion. In our base scenario, the EU biogas production potential from crop residues and manure is about 0.7 EJ/year, nearly double the current EU production of biogas from agricultural substrates, most of which does not come from residues or manure. An extensive sensitivity analysis of the model shows that the potential could easily be 50% higher or lower, depending on the stringency of economic, technical and biochemical constraints. We find that the potential is particularly sensitive to constraints on the substrate mixtures' carbon-to-nitrogen ratio and dry matter concentration. Hence, the potential to produce biogas from crop residues and manure in the EU depends to a large extent on the possibility to overcome the challenges associated with these substrates, either by complementing them with suitable co-substrates (e.g. household waste and energy crops), or through further development of biogas technology (e.g. pretreatment of substrates and recirculation of effluent).

  9. Analyzing key constraints to biogas production from crop residues and manure in the EU-A spatially explicit model.

    Directory of Open Access Journals (Sweden)

    Rasmus Einarsson

    Full Text Available This paper presents a spatially explicit method for making regional estimates of the potential for biogas production from crop residues and manure, accounting for key technical, biochemical, environmental and economic constraints. Methods for making such estimates are important as biofuels from agricultural residues are receiving increasing policy support from the EU and major biogas producers, such as Germany and Italy, in response to concerns over unintended negative environmental and social impacts of conventional biofuels. This analysis comprises a spatially explicit estimate of crop residue and manure production for the EU at 250 m resolution, and a biogas production model accounting for local constraints such as the sustainable removal of residues, transportation of substrates, and the substrates' biochemical suitability for anaerobic digestion. In our base scenario, the EU biogas production potential from crop residues and manure is about 0.7 EJ/year, nearly double the current EU production of biogas from agricultural substrates, most of which does not come from residues or manure. An extensive sensitivity analysis of the model shows that the potential could easily be 50% higher or lower, depending on the stringency of economic, technical and biochemical constraints. We find that the potential is particularly sensitive to constraints on the substrate mixtures' carbon-to-nitrogen ratio and dry matter concentration. Hence, the potential to produce biogas from crop residues and manure in the EU depends to a large extent on the possibility to overcome the challenges associated with these substrates, either by complementing them with suitable co-substrates (e.g. household waste and energy crops), or through further development of biogas technology (e.g. pretreatment of substrates and recirculation of effluent).

  10. Decadal shifts of East Asian summer monsoon in a climate model free of explicit GHGs and aerosols

    Science.gov (United States)

    Lin, Renping; Zhu, Jiang; Zheng, Fei

    2016-12-01

    The East Asian summer monsoon (EASM) experienced decadal transitions over the past few decades, and the associated "wetter-South-drier-North" shifts in rainfall patterns significantly affected social and economic development in China. Two viewpoints stand out to explain these decadal shifts, attributing them either to internal variability of the climate system or to external forcings (e.g. greenhouse gases (GHGs) and anthropogenic aerosols). However, most climate models, for example the Atmospheric Model Intercomparison Project (AMIP)-type simulations and the Coupled Model Intercomparison Project (CMIP)-type simulations, fail to reproduce these variation patterns, leaving the mechanisms responsible for the shifts open to dispute. In this study, we successfully simulated these decadal transitions in a coupled model by applying ocean data assimilation in a model free of explicit aerosol and GHG forcing. The associated decadal shifts of the three-dimensional spatial structure in the 1990s, including the eastward retreat and northward shift of the western Pacific subtropical high (WPSH) and the south-cool-north-warm pattern of upper-tropospheric temperature, were all well captured. Our simulation supports the argument that variations of the oceanic fields are the dominant factor responsible for the EASM decadal transitions.

  11. Water transport through tall trees: A vertically-explicit, analytical model of xylem hydraulic conductance in stems.

    Science.gov (United States)

    Couvreur, Valentin; Ledder, Glenn; Manzoni, Stefano; Way, Danielle A; Muller, Erik B; Russo, Sabrina E

    2018-05-08

    Trees grow by vertically extending their stems, so accurate stem hydraulic models are fundamental to understanding the hydraulic challenges faced by tall trees. Using a literature survey, we showed that many tree species exhibit continuous vertical variation in hydraulic traits. To examine the effects of this variation on hydraulic function, we developed a spatially-explicit, analytical water transport model for stems. Our model allows Huber ratio, stem-saturated conductivity, pressure at 50% loss of conductivity, leaf area, and transpiration rate to vary continuously along the hydraulic path. Predictions from our model differ from a matric flux potential model parameterized with uniform traits. Analyses show that cavitation is a whole-stem emergent property resulting from nonlinear pressure-conductivity feedbacks that, with gravity, cause impaired water transport to accumulate along the path. Because of the compounding effects of vertical trait variation on hydraulic function, growing proportionally more sapwood and building tapered xylem with height, as well as reducing xylem vulnerability only at branch tips while maintaining transport capacity at the stem base, can compensate for these effects. We therefore conclude that the adaptive significance of vertical variation in stem hydraulic traits is to allow trees to grow tall and tolerate operating near their hydraulic limits. This article is protected by copyright. All rights reserved.
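    The pressure-conductivity feedback described here can be made concrete by integrating xylem pressure up a stem whose conductivity declines sigmoidally with tension. The sketch below is a numerical counterpart to the kind of analytical model the paper develops; all parameter values are illustrative assumptions, not the paper's:

    ```python
    import numpy as np

    G_RHO = 0.0098  # gravitational pressure gradient of water, MPa per metre

    def stem_pressure_profile(height=30.0, n=300, E=1e-5, kmax=5e-4,
                              p50=-2.0, slope=2.0, p_base=-0.5):
        """Integrate xylem pressure upward along a stem,
            dP/dz = -E/k(P) - rho*g,
        where the conductivity k falls sigmoidally with tension
        (k = kmax/2 at P = p50).  Illustrative units: MPa, metres.
        As P drops, k shrinks, steepening the gradient further: the
        nonlinear feedback that makes cavitation a whole-stem property."""
        def k(P):  # vulnerability curve: ~kmax near P = 0, kmax/2 at p50
            return kmax / (1.0 + np.exp(slope * (p50 - P)))
        dz = height / n
        P = np.empty(n + 1)
        P[0] = p_base
        for i in range(n):
            P[i + 1] = P[i] - (E / k(P[i]) + G_RHO) * dz
        return P
    ```

    With these parameters the pressure declines monotonically and ends well below the gravity-only prediction, showing how friction losses compound with height.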

  12. Animal models: an important tool in mycology.

    Science.gov (United States)

    Capilla, Javier; Clemons, Karl V; Stevens, David A

    2007-12-01

    Animal models of fungal infections are, and will remain, a key tool in the advancement of medical mycology. Many different types of animal models of fungal infection have been developed, with murine models the most frequently used, for studies of pathogenesis, virulence, immunology, diagnosis, and therapy. The ability to control numerous variables when running a model allows us to mimic human disease states and quantitatively monitor the course of the disease. However, no single model can answer all questions, and different animal species or different routes of infection can show somewhat different results. Thus, the choice of which animal model to use must be made carefully, addressing the type of human disease to mimic, the parameters to follow, and collection of the appropriate data to answer the questions being asked. This review addresses a variety of uses for animal models in medical mycology. It focuses on the most clinically important diseases affecting humans and cites various examples of the different types of studies that have been performed. Overall, animal models of fungal infection will continue to be valuable tools in addressing questions concerning fungal infections and will contribute to our deeper understanding of how these infections occur, progress, and can be controlled and eliminated.

  13. Modelling of friction anisotropy of deep-drawing sheet in ABAQUS/EXPLICIT

    Directory of Open Access Journals (Sweden)

    F. Stachowicz

    2010-07-01

    Full Text Available This paper presents the experimental and numerical results of rectangular cup drawing of steel sheets. The aim of the experimental study was to analyze material behavior under deformation. The results were then used to verify numerical simulations that take friction and material anisotropy into consideration. A 3D parametric finite element (FE) model was built using the FE package ABAQUS/Standard. ABAQUS allows analyzing physical models of real processes, putting special emphasis on geometrical non-linearities caused by large deformations, material non-linearities and complex friction conditions. Frictional properties of the deep-drawing-quality steel sheet were determined using a pin-on-disc tribometer. The friction coefficient depends on the angle measured from the rolling direction and corresponds to the surface topography. A quadratic Hill anisotropic yield criterion was compared with the Huber-Mises yield criterion with isotropic hardening. Plastic anisotropy is the result of the distortion of the yield surface shape due to the material's microstructural state. The sensitivity of constitutive laws to the initial data characterizing material behavior is also presented. It is found that plastic anisotropy of the matrix in ductile sheet metal influences the deformation behavior of the material. If material and friction anisotropy are taken into account, the finite element analysis gives numerical results closest to the real process. This paper is the first part of a study of numerical investigation using ABAQUS and mainly deals with the parameters that most influence a forming process when simulating the sheet metal forming of a rectangular cup.

  14. The Explicit-Cloud Parameterized-Pollutant hybrid approach for aerosol-cloud interactions in multiscale modeling framework models: tracer transport results

    International Nuclear Information System (INIS)

    Gustafson Jr, William I; Berg, Larry K; Easter, Richard C; Ghan, Steven J

    2008-01-01

    All estimates of aerosol indirect effects on the global energy balance have either completely neglected the influence of aerosol on convective clouds or treated the influence in a highly parameterized manner. Embedding cloud-resolving models (CRMs) within each grid cell of a global model provides a multiscale modeling framework for treating both the influence of aerosols on convective as well as stratiform clouds and the influence of clouds on the aerosol, but treating the interactions explicitly by simulating all aerosol processes in the CRM is computationally prohibitive. An alternate approach is to use horizontal statistics (e.g., cloud mass flux, cloud fraction, and precipitation) from the CRM simulation to drive a single-column parameterization of cloud effects on the aerosol and then use the aerosol profile to simulate aerosol effects on clouds within the CRM. Here, we present results from the first component of the Explicit-Cloud Parameterized-Pollutant parameterization to be developed, which handles vertical transport of tracers by clouds. A CRM with explicit tracer transport serves as a benchmark. We show that this parameterization, driven by the CRM's cloud mass fluxes, reproduces the CRM tracer transport significantly better than a single-column model that uses a conventional convective cloud parameterization.

  15. The Explicit-Cloud Parameterized-Pollutant hybrid approach for aerosol-cloud interactions in multiscale modeling framework models: tracer transport results

    Energy Technology Data Exchange (ETDEWEB)

    Gustafson Jr, William I; Berg, Larry K; Easter, Richard C; Ghan, Steven J [Atmospheric Science and Global Change Division, Pacific Northwest National Laboratory, PO Box 999, MSIN K9-30, Richland, WA (United States)], E-mail: William.Gustafson@pnl.gov

    2008-04-15

    All estimates of aerosol indirect effects on the global energy balance have either completely neglected the influence of aerosol on convective clouds or treated the influence in a highly parameterized manner. Embedding cloud-resolving models (CRMs) within each grid cell of a global model provides a multiscale modeling framework for treating both the influence of aerosols on convective as well as stratiform clouds and the influence of clouds on the aerosol, but treating the interactions explicitly by simulating all aerosol processes in the CRM is computationally prohibitive. An alternate approach is to use horizontal statistics (e.g., cloud mass flux, cloud fraction, and precipitation) from the CRM simulation to drive a single-column parameterization of cloud effects on the aerosol and then use the aerosol profile to simulate aerosol effects on clouds within the CRM. Here, we present results from the first component of the Explicit-Cloud Parameterized-Pollutant parameterization to be developed, which handles vertical transport of tracers by clouds. A CRM with explicit tracer transport serves as a benchmark. We show that this parameterization, driven by the CRM's cloud mass fluxes, reproduces the CRM tracer transport significantly better than a single-column model that uses a conventional convective cloud parameterization.

  16. A spatially explicit hydro-ecological modeling framework (BEPS-TerrainLab V2.0): Model description and test in a boreal ecosystem in Eastern North America

    Science.gov (United States)

    Govind, Ajit; Chen, Jing Ming; Margolis, Hank; Ju, Weimin; Sonnentag, Oliver; Giasson, Marc-André

    2009-04-01

    A spatially explicit, process-based hydro-ecological model, BEPS-TerrainLab V2.0, was developed to improve the representation of ecophysiological, hydro-ecological and biogeochemical processes of boreal ecosystems in a tightly coupled manner. Several processes unique to boreal ecosystems were implemented, including sub-surface lateral water fluxes, stratification of vegetation into distinct layers for explicit ecophysiological representation, novel spatial upscaling strategies and biogeochemical processes. To account for preferential water fluxes common in humid boreal ecosystems, a novel scheme was introduced based on laboratory analyses. Leaf-scale ecophysiological processes were upscaled to canopy scale by explicitly considering leaf physiological conditions as affected by light and water stress. The modified model was tested with 2 years of continuous measurements taken at the Eastern Old Black Spruce Site of the Fluxnet-Canada Research Network, located in a humid boreal watershed in eastern Canada. Comparison of the simulated and measured ET, water-table depth (WTD), volumetric soil water content (VSWC) and gross primary productivity (GPP) revealed that BEPS-TerrainLab V2.0 simulates hydro-ecological processes with reasonable accuracy. The model was able to explain 83% of the ET variability, 92% of the GPP variability and 72% of the WTD dynamics. The model suggests that in humid ecosystems such as eastern North American boreal watersheds, topographically driven sub-surface baseflow is the main mechanism of soil water partitioning, which significantly affects local-scale hydrological conditions.

  17. Modeling the fate of nitrogen on the catchment scale using a spatially explicit hydro-biogeochemical simulation system

    Science.gov (United States)

    Klatt, S.; Butterbach-Bahl, K.; Kiese, R.; Haas, E.; Kraus, D.; Molina-Herrera, S. W.; Kraft, P.

    2015-12-01

    The continuous growth of the human population demands an equally growing supply of fresh water and food. As a result, available land for efficient agriculture is constantly diminishing, which forces farmers to cultivate inferior croplands and intensify agricultural practices, e.g., increase the use of synthetic fertilizers. This intensification of marginal areas in particular will cause a dangerous rise in nitrate discharge into open waters or even drinking water resources. Buffer strips have proved to be a valuable means of reducing the amount of nitrate lost by surface runoff or lateral subsurface transport. Current laws, however, promote rather static designs (i.e., width and usage), even though a multitude of factors, e.g., soil type, slope, vegetation and the nearby agricultural management, determines their effectiveness. We propose a spatially explicit modeling approach that enables assessment of the effects of those factors on nitrate discharge from arable lands, using the fully distributed hydrology model CMF coupled to the complex biogeochemical model LandscapeDNDC. Such a modeling scheme allows the displacement of dissolved nutrients to be observed in both vertical and horizontal directions and serves to estimate both their uptake by the vegetated buffer strip and their loss to the environment. First results indicate a significant reduction of nitrate loss in the presence of a buffer strip (2.5 m). We show the effects induced by various buffer strip widths and plant cover on nitrate retention.

  18. A High-Resolution Spatially Explicit Monte-Carlo Simulation Approach to Commercial and Residential Electricity and Water Demand Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Morton, April M [ORNL; McManamay, Ryan A [ORNL; Nagle, Nicholas N [ORNL; Piburn, Jesse O [ORNL; Stewart, Robert N [ORNL; Surendran Nair, Sujithkumar [ORNL

    2016-01-01

    Abstract As urban areas continue to grow and evolve in a world of increasing environmental awareness, the need for high-resolution, spatially explicit estimates of energy and water demand has become increasingly important. Though current modeling efforts mark significant progress in the effort to better understand the spatial distribution of energy and water consumption, many are provided at a coarse spatial resolution or rely on techniques which depend on detailed region-specific data sources that are not publicly available for many parts of the U.S. Furthermore, many existing methods do not account for errors in input data sources and may therefore not accurately reflect inherent uncertainties in model outputs. We propose an alternative and more flexible Monte-Carlo simulation approach to high-resolution residential and commercial electricity and water consumption modeling that relies primarily on publicly available data sources. The method's flexible data requirements and statistical framework ensure that the model is both applicable to a wide range of regions and reflective of uncertainties in model results. Key words: Energy Modeling, Water Modeling, Monte-Carlo Simulation, Uncertainty Quantification. Acknowledgment: This manuscript has been authored by employees of UT-Battelle, LLC, under contract DE-AC05-00OR22725 with the U.S. Department of Energy. Accordingly, the United States Government retains and the publisher, by accepting the article for publication, acknowledges that the United States Government retains a non-exclusive, paid-up, irrevocable, world-wide license to publish or reproduce the published form of this manuscript, or allow others to do so, for United States Government purposes.
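    The kind of Monte-Carlo propagation the abstract describes can be sketched in a few lines: sample the uncertain inputs, compute demand for each draw, and report an interval instead of a point estimate. The distributions and numbers below are invented for illustration and do not reproduce the paper's data sources or model structure.

```python
# Minimal Monte-Carlo sketch: propagate uncertainty in building counts and
# per-building usage rates into a distribution of total electricity demand.
# All numbers are illustrative, not from the paper.
import random

random.seed(42)

N_DRAWS = 10_000
totals = []
for _ in range(N_DRAWS):
    # Uncertain number of households in a grid cell (survey error)
    households = random.gauss(mu=1200, sigma=60)
    # Uncertain mean annual use per household, kWh (regional variation)
    kwh_per_household = random.gauss(mu=10_500, sigma=900)
    totals.append(households * kwh_per_household)

totals.sort()
median = totals[N_DRAWS // 2]
p05, p95 = totals[int(0.05 * N_DRAWS)], totals[int(0.95 * N_DRAWS)]
print(f"median demand with a 90% interval: {median/1e6:.1f} "
      f"[{p05/1e6:.1f}, {p95/1e6:.1f}] GWh")
```

    Reporting the interval rather than a single number is what makes the outputs "reflective of uncertainties in model results".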

  19. Analytical explicit formulas of average run length for long memory process with ARFIMA model on CUSUM control chart

    Directory of Open Access Journals (Sweden)

    Wilasinee Peerajit

    2017-12-01

    This paper derives explicit formulas for the exact Average Run Length (ARL) of a CUSUM control chart, via an integral equation, when the observations follow a long-memory ARFIMA(p,d,q) process with exponential white noise. The Banach fixed point theorem was used to guarantee the existence and uniqueness of the ARL solution. To verify accuracy, the authors compared the ARL values obtained from the explicit formulas with those from a numerical integral equation (NIE) method, in terms of the percentage of absolute difference. Results showed that the two methods are in good agreement, with a percentage of absolute difference of less than 0.23%. The explicit formulas are therefore an efficient alternative for implementation in real applications, because their computational CPU time is about 1 second, making them preferable to the NIE method.

  20. Modeling spatially explicit fire impact on gross primary production in interior Alaska using satellite images coupled with eddy covariance

    Science.gov (United States)

    Huang, Shengli; Liu, Heping; Dahal, Devendra; Jin, Suming; Welp, Lisa R.; Liu, Jinxun; Liu, Shuguang

    2013-01-01

    In interior Alaska, wildfires change gross primary production (GPP) after the initial disturbance. The impact of fires on GPP is spatially heterogeneous, which is difficult to evaluate with limited point-based comparisons and cannot be adequately assessed from a satellite vegetation index alone. Direct prefire and postfire comparison is widely used, but recovery identification may be biased by interannual climate variability. The objective of this study is to propose a method to quantify the spatially explicit GPP change caused by fires and succession. We collected three Landsat images acquired on 13 July 2004, 5 August 2004, and 6 September 2004 to examine the GPP recovery of areas burned from 1987 to 2004. A prefire Landsat image acquired in 1986 was used to reconstruct satellite images under the assumption that the fires of 1987–2004 had not occurred. We used a light-use efficiency model to estimate GPP, driven by maximum light-use efficiency (Emax) and the fraction of photosynthetically active radiation absorbed by vegetation (FPAR). We applied this model to two scenarios (an actual postfire scenario and an assumed no-fire scenario), in which the changes in Emax and FPAR were taken into account. Changes in Emax were represented by the change in land cover among evergreen needleleaf forest, deciduous broadleaf forest, and mixed shrub/grass, whose Emax values were determined from three fire-chronosequence flux towers as 1.1556, 1.3336, and 0.5098 gC/MJ PAR, respectively. Changes in FPAR were inferred from the NDVI difference between the actual postfire NDVI and the reconstructed NDVI. After quantifying GPP for July, August, and September 2004, we calculated the difference between the two scenarios in absolute and percent GPP changes. Our results showed rapid postfire recovery of GPP, with 24% recovery immediately after burning and 43% one year later. For fire scars with an age range of 2–17 years, the recovery rate ranged from 54% to 95%. In addition to the averaging
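    The two-scenario bookkeeping reduces, per pixel, to GPP = Emax × FPAR × PAR. The sketch below uses the Emax values quoted in the abstract, but the linear NDVI-to-FPAR relation, the NDVI values, and the PAR total are invented for illustration; the study's actual FPAR retrieval is not reproduced here.

```python
# Per-pixel GPP under the actual postfire scenario vs. the assumed no-fire
# scenario, following GPP = Emax * FPAR * PAR.  Emax values (gC/MJ PAR) are
# those quoted in the abstract; the FPAR(NDVI) mapping and inputs are invented.
EMAX = {"evergreen": 1.1556, "deciduous": 1.3336, "shrub_grass": 0.5098}

def fpar_from_ndvi(ndvi, a=1.24, b=-0.168):
    """Illustrative linear NDVI-to-FPAR mapping, clamped to [0, 0.95]."""
    return min(0.95, max(0.0, a * ndvi + b))

def gpp(cover, ndvi, par_mj):
    return EMAX[cover] * fpar_from_ndvi(ndvi) * par_mj

par = 250.0  # MJ PAR reaching the pixel over the month (invented)
gpp_postfire = gpp("shrub_grass", ndvi=0.45, par_mj=par)  # burned, regrowing
gpp_no_fire = gpp("evergreen", ndvi=0.75, par_mj=par)     # reconstructed
recovery = 100.0 * gpp_postfire / gpp_no_fire
print(f"postfire GPP is {recovery:.0f}% of the no-fire scenario")
```

    The percent-change map in the study is this ratio evaluated pixel by pixel, with the no-fire NDVI taken from the reconstructed 1986-based imagery.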

  1. Self-Dual Configurations in a Generalized Abelian Chern-Simons-Higgs Model with Explicit Breaking of the Lorentz Covariance

    International Nuclear Information System (INIS)

    Sourrouille, Lucas; Casana, Rodolfo

    2016-01-01

    We have studied the existence of self-dual solitonic solutions in a generalization of the Abelian Chern-Simons-Higgs model. Such a generalization introduces two different nonnegative functions, ω₁(|ϕ|) and ω(|ϕ|), which split the kinetic term of the Higgs field, |D_μϕ|² → ω₁(|ϕ|)|D₀ϕ|² − ω(|ϕ|)|D_kϕ|², breaking the Lorentz covariance explicitly. We have shown that a clean implementation of the Bogomolnyi procedure is only possible when ω(|ϕ|) ∝ β|ϕ|^(2β−2) with β ≥ 1. The self-dual or Bogomolnyi equations produce an infinite number of soliton solutions by conveniently choosing the generalizing function ω₁(|ϕ|), which must be able to provide a finite magnetic field. Also, we have shown that by properly choosing the generalizing functions it is possible to reproduce the Bogomolnyi equations of the Abelian Maxwell-Higgs and Chern-Simons-Higgs models. Finally, some new self-dual |ϕ|⁶-vortex solutions have been analyzed from both the theoretical and numerical points of view.
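    Stripped of the extraction damage, the central relations quoted in the abstract read as follows (a transcription of the stated relations only, not a new derivation):

```latex
% Split of the Higgs kinetic term, breaking Lorentz covariance explicitly
|D_\mu \phi|^2 \;\longrightarrow\; \omega_1(|\phi|)\,|D_0 \phi|^2 \;-\; \omega(|\phi|)\,|D_k \phi|^2
% Condition under which the Bogomolnyi procedure goes through cleanly
\omega(|\phi|) \;\propto\; \beta\,|\phi|^{2\beta-2}, \qquad \beta \ge 1
```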

  2. Reducing fertilizer-nitrogen losses from rowcrop landscapes: Insights and implications from a spatially explicit watershed model

    Science.gov (United States)

    McLellan, Eileen; Schilling, Keith; Robertson, Dale M.

    2015-01-01

    We present conceptual and quantitative models that predict changes in fertilizer-derived nitrogen delivery from rowcrop landscapes caused by agricultural conservation efforts implemented to reduce nutrient inputs and transport and increase nutrient retention in the landscape. To evaluate the relative importance of changes in the sources, transport, and sinks of fertilizer-derived nitrogen across a region, we use the spatially explicit SPAtially Referenced Regression On Watershed attributes (SPARROW) model to map the distribution, at the small watershed scale within the Upper Mississippi-Ohio River Basin (UMORB), of: (1) fertilizer inputs; (2) nutrient attenuation during delivery of those inputs to the UMORB outlet; and (3) nitrogen export from the UMORB outlet. Comparing these spatial distributions suggests that the amount of fertilizer input and the degree of nutrient attenuation are both important in determining the extent of nitrogen export. From a management perspective, this means that agricultural conservation efforts to reduce nitrogen export would benefit by: (1) expanding their focus to include activities that restore and enhance nutrient processing in these highly altered landscapes; and (2) targeting specific types of best management practices to watersheds where they will be most valuable. Doing so successfully may result in a shift in current approaches to conservation planning, outreach, and funding.
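    The source-transport-sink accounting behind this kind of model can be caricatured as first-order decay of each watershed's input along its flow path to the basin outlet. The network, rate constant, and loads below are invented for illustration and are far simpler than the actual SPARROW formulation:

```python
# Toy SPARROW-style accounting: delivered load = input * exp(-k * travel_time),
# summed over upstream watersheds.  All numbers are invented.
import math

K_STREAM = 0.05          # first-order in-stream loss rate (1/day), invented

# (fertilizer N input in kt/yr, travel time to basin outlet in days)
watersheds = {
    "headwater_A": (12.0, 20.0),
    "headwater_B": (30.0, 12.0),
    "near_outlet": (18.0, 2.0),
}

delivered = {
    name: load * math.exp(-K_STREAM * t)
    for name, (load, t) in watersheds.items()
}
export = sum(delivered.values())
share = {name: d / export for name, d in delivered.items()}
```

    Even in this caricature, attenuation means near-outlet sources contribute disproportionately to export per unit input, which is why mapping inputs and attenuation together matters for targeting conservation practices.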

  3. Inferring the past and present connectivity across the range of a North American leaf beetle: combining ecological niche modeling and a geographically explicit model of coalescence.

    Science.gov (United States)

    Dellicour, Simon; Fearnley, Shannon; Lombal, Anicée; Heidl, Sarah; Dahlhoff, Elizabeth P; Rank, Nathan E; Mardulyn, Patrick

    2014-08-01

    The leaf beetle Chrysomela aeneicollis occurs across Western North America, either at high elevation or in small, isolated populations along the coast, and thus has a highly fragmented distribution. DNA sequence data (three loci) were collected from five regions across the species range. Population connectivity was examined using traditional ecological niche modeling, which suggested that gene flow could occur among regions now and in the past. We developed geographically explicit coalescence models of sequence evolution that incorporated a two-dimensional representation of the hypothesized ranges suggested by the niche-modeling estimates. We simulated sequence data according to these models and compared them to observed sequences to identify most probable scenarios regarding the migration history of C. aeneicollis. Our results disagreed with initial niche-modeling estimates by clearly rejecting recent connectivity among regions, and were instead most consistent with a long period of range fragmentation, extending well beyond the last glacial maximum. This application of geographically explicit models of coalescence has highlighted some limitations of the use of climatic variables for predicting the present and past range of a species and has explained aspects of the Pleistocene evolutionary history of a cold-adapted organism in Western North America. © 2014 The Author(s). Evolution © 2014 The Society for the Study of Evolution.

  4. WMT: The CSDMS Web Modeling Tool

    Science.gov (United States)

    Piper, M.; Hutton, E. W. H.; Overeem, I.; Syvitski, J. P.

    2015-12-01

    The Community Surface Dynamics Modeling System (CSDMS) has a mission to enable model use and development for research in earth surface processes. CSDMS strives to expand the use of quantitative modeling techniques, promotes best practices in coding, and advocates for the use of open-source software. To streamline and standardize access to models, CSDMS has developed the Web Modeling Tool (WMT), a RESTful web application with a client-side graphical interface and a server-side database and API that allows users to build coupled surface dynamics models in a web browser on a personal computer or a mobile device, and run them in a high-performance computing (HPC) environment. With WMT, users can design a model from a set of components, edit component parameters, save models to a web-accessible server, share saved models with the community, submit runs to an HPC system, and download simulation results. The WMT client is an Ajax application written in Java with GWT, which allows developers to employ object-oriented design principles and development tools such as Ant, Eclipse and JUnit. For deployment on the web, the GWT compiler translates Java code to optimized and obfuscated JavaScript. The WMT client is supported on Firefox, Chrome, Safari, and Internet Explorer. The WMT server, written in Python and SQLite, is a layered system, with each layer exposing a web service API: wmt-db, a database of component, model, and simulation metadata and output; wmt-api, which configures and connects components; and wmt-exe, which launches simulations on remote execution servers. The database server provides, as JSON-encoded messages, the metadata for users to couple model components, including descriptions of component exchange items, uses and provides ports, and input parameters. Execution servers are network-accessible computational resources, ranging from HPC systems to desktop computers, containing the CSDMS software stack for running a simulation.
Once a simulation completes, its output, in NetCDF, is packaged
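    The "uses/provides ports" coupling that the metadata supports can be illustrated with a small stand-alone sketch. The metadata layout here is a simplified stand-in, not WMT's actual JSON schema, and the port name is chosen for illustration:

```python
# Illustrative sketch of matching "provides" and "uses" ports between two
# model components, in the spirit of the JSON metadata WMT serves.
# The metadata layout is hypothetical, not WMT's actual schema.

def matched_ports(provider, consumer):
    """Return the ports the consumer uses that the provider provides."""
    return sorted(set(provider["provides"]) & set(consumer["uses"]))

hydrotrend = {"name": "HydroTrend", "provides": ["river_discharge"], "uses": []}
cem = {"name": "CEM", "provides": ["coastline"], "uses": ["river_discharge"]}

links = matched_ports(hydrotrend, cem)
print(links)  # ['river_discharge']
```

    A coupling is valid only when every "uses" port of a component is satisfied by some upstream "provides" port, which is exactly the check a model-building client can run against the served metadata.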

  5. Comparing implicit and explicit semantic access of direct and indirect word pairs in schizophrenia to evaluate models of semantic memory.

    Science.gov (United States)

    Neill, Erica; Rossell, Susan Lee

    2013-02-28

    Semantic memory deficits in schizophrenia (SZ) are profound, yet there is no research comparing implicit and explicit semantic processing in the same participant sample. In the current study, both implicit and explicit priming are investigated using direct (LION-TIGER) and indirect (LION-STRIPES; where tiger is not displayed) stimuli comparing SZ to healthy controls. Based on a substantive review (Rossell and Stefanovic, 2007) and meta-analysis (Pomarol-Clotet et al., 2008), it was predicted that SZ would be associated with increased indirect priming implicitly. Further, it was predicted that SZ would be associated with abnormal indirect priming explicitly, replicating earlier work (Assaf et al., 2006). No specific hypotheses were made for implicit direct priming due to the heterogeneity of the literature. It was hypothesised that explicit direct priming would be intact based on the structured nature of this task. The pattern of results suggests (1) intact reaction time (RT) and error performance implicitly in the face of abnormal direct priming and (2) impaired RT and error performance explicitly. This pattern confirms general findings regarding implicit/explicit memory impairments in SZ whilst highlighting the unique pattern of performance specific to semantic priming. Finally, priming performance is discussed in relation to thought disorder and length of illness. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  6. Comparison of BrainTool to other UML modeling and model transformation tools

    Science.gov (United States)

    Nikiforova, Oksana; Gusarovs, Konstantins

    2017-07-01

    Over the last 30 years, numerous model-driven software development approaches have been offered to address problems with development productivity and the resulting software quality. CASE tools developed to date are advertised as having "complete code-generation capabilities". Nowadays the Object Management Group (OMG) makes similar claims regarding Unified Modeling Language (UML) models at different levels of abstraction. It is said that software development automation using CASE tools enables a significant level of automation. Today's CASE tools usually offer a combination of several features: starting with a model editor and a model repository for the traditional ones, and ending with a code generator (which may use a scripting or domain-specific language (DSL)), a transformation tool to produce new artifacts from manually created ones, and a transformation definition editor to define new transformations for the most advanced ones. The present paper contains the results of a comparison of CASE tools (mainly UML editors) against the level of automation they offer.

  7. An explicit asymptotic model for the surface wave in a viscoelastic half-space based on applying Rabotnov's fractional exponential integral operators

    Science.gov (United States)

    Wilde, M. V.; Sergeeva, N. V.

    2018-05-01

    An explicit asymptotic model extracting the contribution of a surface wave to the dynamic response of a viscoelastic half-space is derived. Rabotnov's fractional exponential integral operators are used for describing the material properties. The model is derived by extracting the principal part of the poles corresponding to the surface waves after applying the Laplace and Fourier transforms. The simplified equations for the originals are written by using power series expansions. A Padé approximation is constructed to unite the short-time and long-time models. The form of this approximation makes it possible to formulate the explicit model using a fractional exponential Rabotnov integral operator with parameters depending on the properties of the surface wave. The applicability of the derived models is studied by comparison with the exact solutions of a model problem. It is revealed that the model based on the Padé approximation is highly effective for all the possible time domains.

  8. Wave and Wind Model Performance Metrics Tools

    Science.gov (United States)

    Choi, J. K.; Wang, D. W.

    2016-02-01

    Continual improvements and upgrades of Navy ocean wave and wind models are essential to the assurance of battlespace environment predictability of ocean surface wave and surf conditions in support of Naval global operations. Thus, constant verification and validation of model performance is equally essential to assure the progress of model developments and maintain confidence in the predictions. Global and regional scale model evaluations may require large areas and long periods of time. For observational data to compare against, altimeter winds and waves along the tracks from past and current operational satellites as well as moored/drifting buoys can be used for global and regional coverage. Using data and model runs in previous trials such as the planned experiment, the Dynamics of the Adriatic in Real Time (DART), we demonstrated the use of accumulated altimeter wind and wave data over several years to obtain an objective evaluation of the performance of the SWAN (Simulating Waves Nearshore) model running in the Adriatic Sea. The assessment provided detailed performance of the wind and wave models using maps of cell-averaged statistical variables, with spatial statistics including slope, correlation, and scatter index to summarize model performance. Such a methodology is easily generalized to other regions and at global scales. Operational technology currently used by subject matter experts evaluating the Navy Coastal Ocean Model and the Hybrid Coordinate Ocean Model can be expanded to evaluate wave and wind models using tools developed for ArcMAP, a GIS application developed by ESRI. Recent inclusion of altimeter and buoy data into a format through the Naval Oceanographic Office's (NAVOCEANO) quality control system and the netCDF standards applicable to all model output makes it possible for the fusion of these data and direct model verification.
Also, procedures were developed for the accumulation of match-ups of modelled and observed parameters to form a data base
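    The cell statistics named above (slope, correlation, scatter index) have common textbook forms that can be computed directly from match-ups of modeled and observed values. The definitions below are typical ones from the wave-verification literature, not necessarily NAVOCEANO's exact conventions, and the sample values are invented:

```python
# Common wave/wind verification statistics computed from model-observation
# match-ups.  Definitions vary slightly in the literature; these are typical
# forms, not necessarily the exact operational conventions.
import numpy as np

def verification_stats(model, obs):
    model, obs = np.asarray(model, float), np.asarray(obs, float)
    bias = np.mean(model - obs)
    rmse = np.sqrt(np.mean((model - obs) ** 2))
    si = rmse / np.mean(obs)                       # scatter index
    r = np.corrcoef(model, obs)[0, 1]              # linear correlation
    slope = np.sum(model * obs) / np.sum(obs**2)   # through-origin regression
    return {"bias": bias, "rmse": rmse, "si": si, "r": r, "slope": slope}

obs = [1.2, 2.0, 1.6, 2.4, 1.8]   # e.g. buoy significant wave heights (m)
mod = [1.1, 2.2, 1.5, 2.5, 1.9]   # co-located model values
cell_stats = verification_stats(mod, obs)
```

    Evaluating these per grid cell over accumulated altimeter tracks yields exactly the kind of cell-averaged statistics maps described above.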

  9. Surface Protonation at the Rutile (110) Interface: Explicit Incorporation of Solvation Structure within the Refined MUSIC Model Framework

    Energy Technology Data Exchange (ETDEWEB)

    Machesky, Michael L. [Illinois State Water Survey, Champaign, IL; Predota, M. [University of South Bohemia, Czech Republic; Wesolowski, David J [ORNL

    2008-01-01

    The detailed solvation structure at the (110) surface of rutile (α-TiO2) in contact with bulk liquid water has been obtained primarily from experimentally verified classical molecular dynamics (CMD) simulations of the ab initio-optimized surface in contact with SPC/E water. The results are used to explicitly quantify H-bonding interactions, which are then used within the refined MUSIC model framework to predict surface oxygen protonation constants. Quantum mechanical molecular dynamics (QMD) simulations in the presence of freely dissociable water molecules produced H-bond distributions around deprotonated surface oxygens very similar to those obtained by CMD with nondissociable SPC/E water, thereby confirming that the less computationally intensive CMD simulations provide accurate H-bond information. Utilizing this H-bond information within the refined MUSIC model, along with manually adjusted Ti-O surface bond lengths that are nonetheless within 0.05 Å of those obtained from static density functional theory (DFT) calculations and measured in X-ray reflectivity experiments (as well as bulk crystal values), gives surface protonation constants that result in a calculated zero net proton charge pH value (pHznpc) at 25 °C that agrees quantitatively with the experimentally determined value (5.4 ± 0.2) for a specific rutile powder dominated by the (110) crystal face. Moreover, the predicted pHznpc values agree to within 0.1 pH unit with those measured at all temperatures between 10 and 250 °C. A slightly smaller manual adjustment of the DFT-derived Ti-O surface bond lengths was sufficient to bring the predicted pHznpc value of the rutile (110) surface at 25 °C into quantitative agreement with the experimental value (4.8 ± 0.3) obtained from a polished and annealed rutile (110) single crystal surface in contact with dilute sodium nitrate solutions using second harmonic generation (SHG) intensity measurements as a function of ionic

  10. Surface Protonation at the Rutile (110) Interface: Explicit Incorporation of Solvation Structure within the Refined MUSIC Model Framework

    International Nuclear Information System (INIS)

    Machesky, Michael L.; Predota, M.; Wesolowski, David J.

    2008-01-01

    The detailed solvation structure at the (110) surface of rutile (α-TiO2) in contact with bulk liquid water has been obtained primarily from experimentally verified classical molecular dynamics (CMD) simulations of the ab initio-optimized surface in contact with SPC/E water. The results are used to explicitly quantify H-bonding interactions, which are then used within the refined MUSIC model framework to predict surface oxygen protonation constants. Quantum mechanical molecular dynamics (QMD) simulations in the presence of freely dissociable water molecules produced H-bond distributions around deprotonated surface oxygens very similar to those obtained by CMD with nondissociable SPC/E water, thereby confirming that the less computationally intensive CMD simulations provide accurate H-bond information. Utilizing this H-bond information within the refined MUSIC model, along with manually adjusted Ti-O surface bond lengths that are nonetheless within 0.05 Å of those obtained from static density functional theory (DFT) calculations and measured in X-ray reflectivity experiments (as well as bulk crystal values), gives surface protonation constants that result in a calculated zero net proton charge pH value (pHznpc) at 25 °C that agrees quantitatively with the experimentally determined value (5.4 ± 0.2) for a specific rutile powder dominated by the (110) crystal face. Moreover, the predicted pHznpc values agree to within 0.1 pH unit with those measured at all temperatures between 10 and 250 °C. A slightly smaller manual adjustment of the DFT-derived Ti-O surface bond lengths was sufficient to bring the predicted pHznpc value of the rutile (110) surface at 25 °C into quantitative agreement with the experimental value (4.8 ± 0.3) obtained from a polished and annealed rutile (110) single crystal surface in contact with dilute sodium nitrate solutions using second harmonic generation (SHG) intensity measurements as a function of ionic strength. Additionally, the H

  11. Collaboro: a collaborative (meta)modeling tool

    Directory of Open Access Journals (Sweden)

    Javier Luis Cánovas Izquierdo

    2016-10-01

    Software development is becoming more and more collaborative, emphasizing the role of end-users in the development process to make sure the final product will satisfy customer needs. This is especially relevant when developing Domain-Specific Modeling Languages (DSMLs), which are modeling languages specifically designed to carry out the tasks of a particular domain. While end-users are actually the experts of the domain for which a DSML is developed, their participation in the DSML specification process is still rather limited nowadays. In this paper, we propose a more community-aware language development process by enabling the active participation of all community members (both developers and end-users) from the very beginning. Our proposal, called Collaboro, is based on a DSML itself, enabling the representation of change proposals during the language design and the discussion (and traceability) of possible solutions, comments and decisions arising during the collaboration. Collaboro also incorporates a metric-based recommender system to help community members define high-quality notations for the DSMLs. We also show how Collaboro can be used at the model level to facilitate the collaborative specification of software models. Tool support is available both as an Eclipse plug-in and as a web-based solution.

  12. Numerical study on two-point contact by an explicit integration finite element method : A contribution to the modeling of flange squeal

    NARCIS (Netherlands)

    Yang, Z.; Li, Z.; Dollevoet, R.P.B.J.; Tournay, H; Grassie, S

    2015-01-01

    The precise mechanism which activates squeal, especially flange squeal has not been fully explained. The complex non-Hertzian contact and the broad-band high frequency feature bring great challenges to the modelling work of flange squeal. In this paper, an explicit integration finite element method

  13. CDMetaPOP: An individual-based, eco-evolutionary model for spatially explicit simulation of landscape demogenetics

    Science.gov (United States)

    Landguth, Erin L; Bearlin, Andrew; Day, Casey; Dunham, Jason B.

    2016-01-01

    (1) Combining landscape demographic and genetics models offers powerful methods for addressing questions in eco-evolutionary applications. (2) Using two illustrative examples, we present Cost–Distance Meta-POPulation (CDMetaPOP), a program to simulate changes in neutral and/or selection-driven genotypes through time as a function of individual-based movement, complex spatial population dynamics, and multiple and changing landscape drivers. (3) CDMetaPOP provides a novel tool for questions in landscape genetics by incorporating population viability analysis, while linking directly to conservation applications.

  14. Collaborative Inquiry Learning: Models, tools, and challenges

    Science.gov (United States)

    Bell, Thorsten; Urhahne, Detlef; Schanze, Sascha; Ploetzner, Rolf

    2010-02-01

    Collaborative inquiry learning is one of the most challenging and exciting ventures for today's schools. It aims at bringing a new and promising culture of teaching and learning into the classroom where students in groups engage in self-regulated learning activities supported by the teacher. It is expected that this way of learning fosters students' motivation and interest in science, that they learn to perform steps of inquiry similar to scientists and that they gain knowledge on scientific processes. Starting from general pedagogical reflections and science standards, the article reviews some prominent models of inquiry learning. This comparison results in a set of inquiry processes being the basis for cooperation in the scientific network NetCoIL. Inquiry learning is conceived in several ways with emphasis on different processes. For an illustration of the spectrum, some main conceptions of inquiry and their focuses are described. In the next step, the article describes exemplary computer tools and environments from within and outside the NetCoIL network that were designed to support processes of collaborative inquiry learning. These tools are analysed by describing their functionalities as well as effects on student learning known from the literature. The article closes with challenges for further developments elaborated by the NetCoIL network.

  15. Metabolic engineering tools in model cyanobacteria.

    Science.gov (United States)

    Carroll, Austin L; Case, Anna E; Zhang, Angela; Atsumi, Shota

    2018-03-26

    Developing sustainable routes for producing chemicals and fuels is one of the most important challenges in metabolic engineering. Photoautotrophic hosts are particularly attractive because of their potential to utilize light as an energy source and CO2 as a carbon substrate through photosynthesis. Cyanobacteria are unicellular organisms capable of photosynthesis and CO2 fixation. While engineering in heterotrophs, such as Escherichia coli, has resulted in a plethora of tools for strain development and hosts capable of producing valuable chemicals efficiently, these techniques are not always directly transferable to cyanobacteria. However, recent efforts have led to an increase in the scope and scale of chemicals that cyanobacteria can produce. Adaptations of important metabolic engineering tools have also been optimized to function in photoautotrophic hosts, including Clustered Regularly Interspaced Short Palindromic Repeats (CRISPR)-Cas9, 13C Metabolic Flux Analysis (MFA), and Genome-Scale Modeling (GSM). This review explores innovations in cyanobacterial metabolic engineering, and highlights how photoautotrophic metabolism has shaped their development. Copyright © 2018 International Metabolic Engineering Society. Published by Elsevier Inc. All rights reserved.

  16. Fuzzy risk explicit interval linear programming model for end-of-life vehicle recycling planning in the EU.

    Science.gov (United States)

    Simic, Vladimir

    2015-01-01

    End-of-life vehicles (ELVs) are vehicles that have reached the end of their useful lives and are no longer registered or licensed for use. The ELV recycling problem has become very serious in the last decade, and more and more efforts are made to reduce the impact of ELVs on the environment. This paper proposes a fuzzy risk-explicit interval linear programming model for ELV recycling planning in the EU. It has advantages in reflecting uncertainties presented in terms of intervals in ELV recycling systems and fuzziness in decision makers' preferences. The formulated model has been applied to a numerical study in which different decision maker types and several ELV types under two EU ELV Directive legislative cases were examined. This study is conducted in order to examine the influences of the decision maker type, the α-cut level, the EU ELV Directive and the ELV type on decisions about procuring vehicle hulks, storing unprocessed hulks, sorting generated material fractions, allocating sorted waste flows and allocating sorted metals. The decision maker type can influence the quantity of vehicle hulks kept in storage. The EU ELV Directive and the decision maker type have no influence on which vehicle hulk type is kept in storage. Vehicle hulk type, the EU ELV Directive and decision maker type do not influence the creation of metal allocation plans, since each isolated metal has its regular destination. The valid EU ELV Directive eco-efficiency quotas can be reached even when advanced thermal treatment plants are excluded from the ELV recycling process. The introduction of the stringent eco-efficiency quotas will significantly reduce the quantities of land-filled waste fractions regardless of the type of decision makers who will manage the vehicle recycling system. In order to reach these stringent quotas, significant quantities of sorted waste need to be processed in advanced thermal treatment plants. The proposed model can serve as support for the European

  17. SMART: a spatially explicit bio-economic model for assessing and managing demersal fisheries, with an application to Italian trawlers in the Strait of Sicily.

    Directory of Open Access Journals (Sweden)

    Tommaso Russo

    Full Text Available Management of catches, effort and exploitation pattern are considered the most effective measures to control fishing mortality and ultimately ensure productivity and sustainability of fisheries. Despite the growing concerns about the spatial dimension of fisheries, the distribution of resources and fishing effort in space is seldom considered in assessment and management processes. Here we propose SMART (Spatial MAnagement of demersal Resources for Trawl fisheries), a tool for assessing bio-economic feedback in different management scenarios. SMART combines information from different tasks gathered within the European Data Collection Framework on fisheries and is composed of: 1) spatial models of fishing effort, environmental characteristics and distribution of demersal resources; 2) an Artificial Neural Network which captures the relationships among these aspects in a spatially explicit way and uses them to predict resources abundances; 3) a deterministic module which analyzes the size structure of catches and the associated revenues, according to different spatially-based management scenarios. SMART is applied to demersal fishery in the Strait of Sicily, one of the most productive fisheries of the Mediterranean Sea. Three of the main target species are used as proxies for the whole range exploited by trawlers. After training, SMART is used to evaluate different management scenarios, including spatial closures, using a simulation approach that mimics the recent exploitation patterns. Results evidence good model performance, with a noteworthy coherence and reliability of outputs for the different components. Among others, the main finding is that a partial improvement in resource conditions can be achieved by means of nursery closures, even if the overall fishing effort in the area remains stable. Accordingly, a series of strategically designed areas of trawling closures could significantly improve the resource conditions of demersal fisheries in

  18. SMART: a spatially explicit bio-economic model for assessing and managing demersal fisheries, with an application to Italian trawlers in the Strait of Sicily.

    Science.gov (United States)

    Russo, Tommaso; Parisi, Antonio; Garofalo, Germana; Gristina, Michele; Cataudella, Stefano; Fiorentino, Fabio

    2014-01-01

    Management of catches, effort and exploitation pattern are considered the most effective measures to control fishing mortality and ultimately ensure productivity and sustainability of fisheries. Despite the growing concerns about the spatial dimension of fisheries, the distribution of resources and fishing effort in space is seldom considered in assessment and management processes. Here we propose SMART (Spatial MAnagement of demersal Resources for Trawl fisheries), a tool for assessing bio-economic feedback in different management scenarios. SMART combines information from different tasks gathered within the European Data Collection Framework on fisheries and is composed of: 1) spatial models of fishing effort, environmental characteristics and distribution of demersal resources; 2) an Artificial Neural Network which captures the relationships among these aspects in a spatially explicit way and uses them to predict resources abundances; 3) a deterministic module which analyzes the size structure of catches and the associated revenues, according to different spatially-based management scenarios. SMART is applied to demersal fishery in the Strait of Sicily, one of the most productive fisheries of the Mediterranean Sea. Three of the main target species are used as proxies for the whole range exploited by trawlers. After training, SMART is used to evaluate different management scenarios, including spatial closures, using a simulation approach that mimics the recent exploitation patterns. Results evidence good model performance, with a noteworthy coherence and reliability of outputs for the different components. Among others, the main finding is that a partial improvement in resource conditions can be achieved by means of nursery closures, even if the overall fishing effort in the area remains stable. Accordingly, a series of strategically designed areas of trawling closures could significantly improve the resource conditions of demersal fisheries in the Strait of
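    The deterministic revenue module described above reduces, at its core, to pricing a size-structured catch. A minimal sketch, with invented size classes and prices rather than SMART's actual data:

    ```python
    def revenue(catch_by_size, price_by_size):
        # Revenue of a size-structured catch: kilograms landed in each size
        # class times the price per kilogram that class fetches.
        return sum(kg * price_by_size[size] for size, kg in catch_by_size.items())

    # Hypothetical size classes for one target species under a given scenario
    catch = {"small": 120.0, "medium": 300.0, "large": 80.0}   # kg landed
    price = {"small": 2.0, "medium": 5.0, "large": 9.0}        # EUR/kg
    total = revenue(catch, price)
    ```

    A management scenario (e.g. a nursery closure) that shifts the same total weight toward larger size classes raises revenue; this catch-structure-to-revenue trade-off is what the deterministic module quantifies for each spatial scenario.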

  19. Evaluation of clinical information modeling tools.

    Science.gov (United States)

    Moreno-Conde, Alberto; Austin, Tony; Moreno-Conde, Jesús; Parra-Calderón, Carlos L; Kalra, Dipak

    2016-11-01

    Clinical information models are formal specifications for representing the structure and semantics of the clinical content within electronic health record systems. This research aims to define, test, and validate evaluation metrics for software tools designed to support the processes associated with the definition, management, and implementation of these models. The proposed framework builds on previous research that focused on obtaining agreement on the essential requirements in this area. A set of 50 conformance criteria were defined based on the 20 functional requirements agreed by that consensus and applied to evaluate the currently available tools. Of the 11 initiatives identified that are developing tools for clinical information modeling, 9 were evaluated according to their performance on the evaluation metrics. Results show that functionalities related to management of data types, specifications, metadata, and terminology or ontology bindings have a good level of adoption. Improvements can be made in other areas focused on information modeling and associated processes. Other criteria related to displaying semantic relationships between concepts and communication with terminology servers had low levels of adoption. The proposed evaluation metrics were successfully tested and validated against a representative sample of existing tools. The results identify the need to improve tool support for information modeling and software development processes, especially in those areas related to governance, clinician involvement, and optimizing the technical validation of testing processes. This research confirmed the potential of these evaluation metrics to support decision makers in identifying the most appropriate tool for their organization. Clinical information models are specifications for representing the structure and semantic characteristics of clinical content in electronic health record systems. This research defines, tests and validates
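    An adoption-level analysis of this kind amounts to scoring each tool against each conformance criterion and aggregating per criterion. A small sketch with invented tool data and criterion names:

    ```python
    def adoption_level(tools):
        # Fraction of evaluated tools meeting each conformance criterion.
        # Each tool is a dict mapping criterion name -> pass/fail.
        criteria = set().union(*(t.keys() for t in tools))
        return {c: sum(t.get(c, False) for t in tools) / len(tools)
                for c in sorted(criteria)}

    # Hypothetical evaluation of three tools against three criteria
    tools = [
        {"data_types": True,  "terminology_binding": True,  "semantic_links": False},
        {"data_types": True,  "terminology_binding": False, "semantic_links": False},
        {"data_types": True,  "terminology_binding": True,  "semantic_links": True},
    ]
    levels = adoption_level(tools)
    ```

    High fractions flag well-supported functionality (here the invented "data_types"), while low ones (the invented "semantic_links") mark areas where tool support lags, mirroring the paper's finding on semantic relationships and terminology servers.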

  20. Modeling mind-wandering: a tool to better understand distraction

    NARCIS (Netherlands)

    van Vugt, Marieke; Taatgen, Niels; Sackur, Jerome; Bastian, Mikael; Borst, Jelmer; Mehlhorn, Katja

    2015-01-01

    When we get distracted, we may engage in mind-wandering, or task-unrelated thinking, which impairs performance on cognitive tasks. Yet, we do not have cognitive models that make this process explicit. On the basis of both recent experiments that have started to investigate mind-wandering and

  1. Numerical modelling of tool wear in turning with cemented carbide cutting tools

    Science.gov (United States)

    Franco, P.; Estrems, M.; Faura, F.

    2007-04-01

    A numerical model is proposed for analysing the flank and crater wear resulting from the loss of material on the cutting tool surface in turning processes due to the wear mechanisms of adhesion, abrasion and fracture. By means of this model, the material loss along the cutting tool surface can be analysed, and the worn surface shape during workpiece machining can be determined. The proposed model analyses the gradual degradation of the cutting tool during the turning operation, and tool wear can be estimated as a function of cutting time. Wear-land width (VB) and crater depth (KT) can be obtained to describe the material loss on the cutting tool surface, and the effects of the distinct wear mechanisms on the surface shape can be studied. The parameters required for the tool wear model are obtained from the literature and from experimental observation for AISI 4340 steel turning with WC-Co cutting tools.
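    Estimating wear as a function of cutting time typically means integrating a wear-rate law. A minimal sketch of that idea, with a generic rate combining a constant abrasive term and a thermally activated (Usui-like) adhesive term; all coefficients are illustrative assumptions, not the paper's fitted values:

    ```python
    import math

    def flank_wear(t_end, dt=0.01, k_abr=0.002, k_adh=0.04, b=2000.0, temp=900.0):
        # Euler integration of a generic flank-wear rate (mm/min): a constant
        # abrasive contribution plus an adhesive contribution with an
        # exp(-B/T) temperature dependence. Coefficients are invented.
        vb = 0.0
        for _ in range(int(round(t_end / dt))):
            rate = k_abr + k_adh * math.exp(-b / temp)
            vb += rate * dt
        return vb

    vb_1min = flank_wear(1.0)   # wear-land width VB after 1 min of cutting
    vb_5min = flank_wear(5.0)
    ```

    With the temperature held fixed the rate is constant and VB grows linearly; feeding in a temperature that evolves with the cut (as a thermo-mechanical model would supply) bends the wear curve, which is where a numerical rather than closed-form treatment pays off.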

  2. Numerical modelling of tool wear in turning with cemented carbide cutting tools

    International Nuclear Information System (INIS)

    Franco, P.; Estrems, M.; Faura, F.

    2007-01-01

    A numerical model is proposed for analysing the flank and crater wear resulting from the loss of material on the cutting tool surface in turning processes due to the wear mechanisms of adhesion, abrasion and fracture. By means of this model, the material loss along the cutting tool surface can be analysed, and the worn surface shape during workpiece machining can be determined. The proposed model analyses the gradual degradation of the cutting tool during the turning operation, and tool wear can be estimated as a function of cutting time. Wear-land width (VB) and crater depth (KT) can be obtained to describe the material loss on the cutting tool surface, and the effects of the distinct wear mechanisms on the surface shape can be studied. The parameters required for the tool wear model are obtained from the literature and from experimental observation for AISI 4340 steel turning with WC-Co cutting tools.

  3. Large scale experiments as a tool for numerical model development

    DEFF Research Database (Denmark)

    Kirkegaard, Jens; Hansen, Erik Asp; Fuchs, Jesper

    2003-01-01

    Experimental modelling is an important tool for study of hydrodynamic phenomena. The applicability of experiments can be expanded by the use of numerical models and experiments are important for documentation of the validity of numerical tools. In other cases numerical tools can be applied...

  4. Computer system for identification of tool wear model in hot forging

    Directory of Open Access Journals (Sweden)

    Wilkus Marek

    2016-01-01

    Full Text Available The aim of the research was to create a methodology that will enable effective and reliable prediction of tool wear. The idea of a hybrid model, which accounts for various mechanisms of tool material deterioration, is proposed in the paper. The mechanisms considered include abrasive wear, adhesive wear, thermal fatigue, mechanical fatigue, oxidation and plastic deformation. Individual models of various complexity were used for the separate phenomena, and a strategy for combining these models into one hybrid system was developed to account for the synergy of the various mechanisms. The complex hybrid model was built on the basis of these individual models for the various wear mechanisms. The individual models ranged from phenomenological ones for abrasive wear to multi-scale methods for modelling micro-crack initiation and propagation utilizing virtual representations of granular microstructures. The latter have been intensively developed recently and form a potentially powerful tool that allows modelling of thermal and mechanical fatigue, accounting explicitly for the tool material microstructure.
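    The combination strategy at the heart of such a hybrid system can be caricatured as summing per-mechanism wear increments with a synergy factor. The numbers and the single multiplicative synergy term below are invented for illustration; the paper's individual sub-models are far richer:

    ```python
    def hybrid_wear(increments, synergy=1.0):
        # Combine per-mechanism wear increments (mm per forging cycle) into one
        # depth. A synergy factor >= 1 crudely amplifies the sum when mechanisms
        # reinforce each other; real hybrid models couple mechanisms explicitly.
        return synergy * sum(increments.values())

    # Hypothetical per-cycle contributions of each deterioration mechanism
    step = {"abrasion": 0.010, "adhesion": 0.004, "thermal_fatigue": 0.002,
            "mechanical_fatigue": 0.001, "oxidation": 0.0005,
            "plastic_deformation": 0.0015}
    total = hybrid_wear(step, synergy=1.2)
    ```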

  5. Precision tools and models to narrow in on the 750 GeV diphoton resonance

    International Nuclear Information System (INIS)

    Staub, Florian; Athron, Peter; Basso, Lorenzo

    2016-02-01

    The hints for a new resonance at 750 GeV from ATLAS and CMS have triggered a significant amount of attention. Since the simplest extensions of the standard model cannot accommodate the observation, many alternatives have been considered to explain the excess. Here we focus on several proposed renormalisable weakly-coupled models and revisit results given in the literature. We point out that physically important subtleties are often missed or neglected. To facilitate the study of the excess we have created a collection of 40 model files, selected from recent literature, for the Mathematica package SARAH. With SARAH one can generate files to perform numerical studies using the tailor-made spectrum generators FlexibleSUSY and SPheno. These have been extended to automatically include crucial higher order corrections to the diphoton and digluon decay rates for both CP-even and CP-odd scalars. Additionally, we have extended the UFO and CalcHep interfaces of SARAH, to pass the precise information about the effective vertices from the spectrum generator to a Monte-Carlo tool. Finally, as an example to demonstrate the power of the entire setup, we present a new supersymmetric model that accommodates the diphoton excess, explicitly demonstrating how a large width can be obtained. We explicitly show several steps in detail to elucidate the use of these public tools in the precision study of this model.

  6. Precision tools and models to narrow in on the 750 GeV diphoton resonance

    Energy Technology Data Exchange (ETDEWEB)

    Staub, Florian [CERN, Theoretical Physics Department, Geneva (Switzerland); Athron, Peter [Monash University, ARC Centre of Excellence for Particle Physics at the Terascale, School of Physics, Melbourne, VIC (Australia); Basso, Lorenzo [CPPM, Aix-Marseille Universite, CNRS-IN2P3, UMR 7346, Marseille Cedex 9 (France); Goodsell, Mark D. [Sorbonne Universites, LPTHE, UMR 7589, CNRS and Universite Pierre et Marie Curie, Paris Cedex 05 (France); Harries, Dylan [The University of Adelaide, Department of Physics, ARC Centre of Excellence for Particle Physics at the Terascale, Adelaide, SA (Australia); Krauss, Manuel E.; Nickel, Kilian; Opferkuch, Toby [Bethe Center for Theoretical Physics and Physikalisches Institut der Universitaet Bonn, Bonn (Germany); Ubaldi, Lorenzo [Tel-Aviv University, Raymond and Beverly Sackler School of Physics and Astronomy, Tel Aviv (Israel); Vicente, Avelino [Instituto de Fisica Corpuscular (CSIC-Universitat de Valencia), Valencia (Spain); Voigt, Alexander [Deutsches Elektronen-Synchrotron DESY, Hamburg (Germany)

    2016-09-15

    The hints for a new resonance at 750 GeV from ATLAS and CMS have triggered a significant amount of attention. Since the simplest extensions of the standard model cannot accommodate the observation, many alternatives have been considered to explain the excess. Here we focus on several proposed renormalisable weakly-coupled models and revisit results given in the literature. We point out that physically important subtleties are often missed or neglected. To facilitate the study of the excess we have created a collection of 40 model files, selected from recent literature, for the Mathematica package SARAH. With SARAH one can generate files to perform numerical studies using the tailor-made spectrum generators FlexibleSUSY and SPheno. These have been extended to automatically include crucial higher order corrections to the diphoton and digluon decay rates for both CP-even and CP-odd scalars. Additionally, we have extended the UFO and CalcHep interfaces of SARAH, to pass the precise information about the effective vertices from the spectrum generator to a Monte-Carlo tool. Finally, as an example to demonstrate the power of the entire setup, we present a new supersymmetric model that accommodates the diphoton excess, explicitly demonstrating how a large width can be obtained. We explicitly show several steps in detail to elucidate the use of these public tools in the precision study of this model. (orig.)

  7. Precision tools and models to narrow in on the 750 GeV diphoton resonance

    Energy Technology Data Exchange (ETDEWEB)

    Staub, Florian [CERN, Geneva (Switzerland). Theoretical Physics Dept.; Athron, Peter [Monash Univ., Melbourne (Australia). ARC Center of Excellence for Particle Physics at the Terascale; Basso, Lorenzo [Aix-Marseille Univ., CNRS-IN2P3, UMR 7346 (France). CPPM; and others

    2016-02-15

    The hints for a new resonance at 750 GeV from ATLAS and CMS have triggered a significant amount of attention. Since the simplest extensions of the standard model cannot accommodate the observation, many alternatives have been considered to explain the excess. Here we focus on several proposed renormalisable weakly-coupled models and revisit results given in the literature. We point out that physically important subtleties are often missed or neglected. To facilitate the study of the excess we have created a collection of 40 model files, selected from recent literature, for the Mathematica package SARAH. With SARAH one can generate files to perform numerical studies using the tailor-made spectrum generators FlexibleSUSY and SPheno. These have been extended to automatically include crucial higher order corrections to the diphoton and digluon decay rates for both CP-even and CP-odd scalars. Additionally, we have extended the UFO and CalcHep interfaces of SARAH, to pass the precise information about the effective vertices from the spectrum generator to a Monte-Carlo tool. Finally, as an example to demonstrate the power of the entire setup, we present a new supersymmetric model that accommodates the diphoton excess, explicitly demonstrating how a large width can be obtained. We explicitly show several steps in detail to elucidate the use of these public tools in the precision study of this model.

  8. Precision tools and models to narrow in on the 750 GeV diphoton resonance

    International Nuclear Information System (INIS)

    Staub, Florian; Athron, Peter; Basso, Lorenzo; Goodsell, Mark D.; Harries, Dylan; Krauss, Manuel E.; Nickel, Kilian; Opferkuch, Toby; Ubaldi, Lorenzo; Vicente, Avelino; Voigt, Alexander

    2016-01-01

    The hints for a new resonance at 750 GeV from ATLAS and CMS have triggered a significant amount of attention. Since the simplest extensions of the standard model cannot accommodate the observation, many alternatives have been considered to explain the excess. Here we focus on several proposed renormalisable weakly-coupled models and revisit results given in the literature. We point out that physically important subtleties are often missed or neglected. To facilitate the study of the excess we have created a collection of 40 model files, selected from recent literature, for the Mathematica package SARAH. With SARAH one can generate files to perform numerical studies using the tailor-made spectrum generators FlexibleSUSY and SPheno. These have been extended to automatically include crucial higher order corrections to the diphoton and digluon decay rates for both CP-even and CP-odd scalars. Additionally, we have extended the UFO and CalcHep interfaces of SARAH, to pass the precise information about the effective vertices from the spectrum generator to a Monte-Carlo tool. Finally, as an example to demonstrate the power of the entire setup, we present a new supersymmetric model that accommodates the diphoton excess, explicitly demonstrating how a large width can be obtained. We explicitly show several steps in detail to elucidate the use of these public tools in the precision study of this model. (orig.)

  9. Modeling, methodologies and tools for molecular and nano-scale communications

    CERN Document Server

    Nakano, Tadashi; Moore, Michael

    2017-01-01

    (Preliminary) The book presents the state of the art in the emerging field of molecular and nanoscale communication. It gives special attention to fundamental models, and advanced methodologies and tools used in the field. It covers a wide range of applications, e.g. nanomedicine, nanorobot communication, bioremediation and environmental management. It addresses advanced graduate students, academics and professionals working at the forefront of their fields and at the interfaces between different areas of research, such as engineering, computer science, biology and nanotechnology.

  10. Developing a Modeling Tool Using Eclipse

    NARCIS (Netherlands)

    Kirtley, Nick; Waqas Kamal, Ahmad; Avgeriou, Paris

    2008-01-01

    Tool development using an open source platform provides autonomy to users to change, use, and develop cost-effective software with freedom from licensing requirements. However, open source tool development poses a number of challenges, such as poor documentation and continuous evolution. In this

  11. Novel Multiscale Modeling Tool Applied to Pseudomonas aeruginosa Biofilm Formation

    OpenAIRE

    Biggs, Matthew B.; Papin, Jason A.

    2013-01-01

    Multiscale modeling is used to represent biological systems with increasing frequency and success. Multiscale models are often hybrids of different modeling frameworks and programming languages. We present the MATLAB-NetLogo extension (MatNet) as a novel tool for multiscale modeling. We demonstrate the utility of the tool with a multiscale model of Pseudomonas aeruginosa biofilm formation that incorporates both an agent-based model (ABM) and constraint-based metabolic modeling. The hybrid mod...

  12. Memory Efficient Data Structures for Explicit Verification of Timed Systems

    DEFF Research Database (Denmark)

    Taankvist, Jakob Haahr; Srba, Jiri; Larsen, Kim Guldstrand

    2014-01-01

    Timed analysis of real-time systems can be performed using continuous (symbolic) or discrete (explicit) techniques. The explicit state-space exploration can be considerably faster for models with moderately small constants, however, at the expense of high memory consumption. In the setting of timed-arc Petri nets, we explore new data structures for lowering the used memory: PTries for efficient storing of configurations and time darts for semi-symbolic description of the state-space. Both methods are implemented as a part of the tool TAPAAL and the experiments document at least one order of magnitude of memory savings while preserving comparable verification times.
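    The memory-saving idea behind a prefix-tree store of configurations can be sketched minimally: configurations sharing a prefix share nodes. This toy trie only illustrates the principle; TAPAAL's actual PTrie packs several symbols per node and is far more compact:

    ```python
    class PTrie:
        # Minimal prefix tree for explored configurations, each encoded as a
        # tuple of small integers (e.g. token counts per place). Shared prefixes
        # are stored once, which is the source of the memory savings.
        def __init__(self):
            self.root = {}

        def insert(self, config):
            node = self.root
            for sym in config:
                node = node.setdefault(sym, {})
            node[None] = True  # end-of-configuration marker

        def __contains__(self, config):
            node = self.root
            for sym in config:
                if sym not in node:
                    return False
                node = node[sym]
            return None in node

    t = PTrie()
    t.insert((1, 4, 0, 2))
    t.insert((1, 4, 0, 3))  # shares the (1, 4, 0) prefix with the first entry
    ```

    During explicit exploration the visited-set membership test (`config in t`) is what keeps the search from re-expanding states.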

  13. Towards anatomic scale agent-based modeling with a massively parallel spatially explicit general-purpose model of enteric tissue (SEGMEnT_HPC).

    Science.gov (United States)

    Cockrell, Robert Chase; Christley, Scott; Chang, Eugene; An, Gary

    2015-01-01

    Perhaps the greatest challenge currently facing the biomedical research community is the ability to integrate highly detailed cellular and molecular mechanisms to represent clinical disease states as a pathway to engineer effective therapeutics. This is particularly evident in the representation of organ-level pathophysiology in terms of abnormal tissue structure, which, through histology, remains a mainstay in disease diagnosis and staging. As such, being able to generate anatomic scale simulations is a highly desirable goal. While computational limitations have previously constrained the size and scope of multi-scale computational models, advances in the capacity and availability of high-performance computing (HPC) resources have greatly expanded the ability of computational models of biological systems to achieve anatomic, clinically relevant scale. Diseases of the intestinal tract are prime examples of pathophysiological processes that manifest at multiple scales of spatial resolution, with structural abnormalities present at the microscopic, macroscopic and organ levels. In this paper, we describe a novel, massively parallel computational model of the gut, the Spatially Explicit General-purpose Model of Enteric Tissue_HPC (SEGMEnT_HPC), which extends an existing model of the gut epithelium, SEGMEnT, in order to create cell-for-cell anatomic scale simulations. We present an example implementation of SEGMEnT_HPC that simulates the pathogenesis of ileal pouchitis, an important clinical entity that affects patients following remedial surgery for ulcerative colitis.
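    The core loop of a cell-for-cell agent-based tissue model is a synchronous update of a spatial grid of cell agents. The toy 1-D sheet below only conveys that structure (invented rule and parameters); SEGMEnT_HPC applies far richer epithelial rules on anatomic-scale geometry in parallel:

    ```python
    import random

    def step(grid, rng, p_divide=0.5):
        # One update of a toy 1-D epithelial sheet: each occupied site may place
        # a daughter cell into an adjacent empty site with probability p_divide.
        nxt = list(grid)
        for i, occupied in enumerate(grid):
            if occupied and rng.random() < p_divide:
                for j in (i - 1, i + 1):
                    if 0 <= j < len(grid) and not nxt[j]:
                        nxt[j] = True
                        break
        return nxt

    rng = random.Random(0)   # fixed seed for a reproducible run
    g = [False] * 11
    g[5] = True              # a single founder cell mid-tissue
    for _ in range(20):
        g = step(g, rng)
    ```

    Scaling this pattern to billions of agents is largely a matter of partitioning the grid across HPC nodes and exchanging boundary rows each step, which is the kind of parallelisation the paper's HPC extension provides.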

  14. Development and reliability of the explicit professional oral communication observation tool to quantify the use of non-technical skills in healthcare.

    Science.gov (United States)

    Kemper, Peter F; van Noord, Inge; de Bruijne, Martine; Knol, Dirk L; Wagner, Cordula; van Dyck, Cathy

    2013-07-01

    A lack of non-technical skills is increasingly recognised as an important underlying cause of adverse events in healthcare. The nature and number of things professionals communicate to each other can be perceived as a product of their use of non-technical skills. This paper describes the development and reliability of an instrument to measure and quantify the use of non-technical skills by direct observations of explicit professional oral communication (EPOC) in the clinical situation. In an iterative process we translated, tested and refined an existing checklist from the aviation industry, called self, human interaction, aircraft, procedures and environment, in the context of healthcare, notably emergency departments (ED) and intensive care units (ICU). The EPOC comprises six dimensions: assertiveness; working with others; task-oriented leadership; people-oriented leadership; situational awareness; and planning and anticipation. Each dimension is specified into several concrete items reflecting verbal behaviours. The EPOC was evaluated in four EDs and six ICUs, where respectively 378 and 1144 individual observations and 51 and 68 contemporaneous observations of individual staff members were conducted. All EPOC dimensions occur frequently, apart from assertiveness, which was hardly observed. Intraclass correlations for the overall EPOC score ranged between 0.85 and 0.91 and for underlying EPOC dimensions between 0.53 and 0.95. The EPOC is a new instrument for evaluating the use of non-technical skills in healthcare, which is reliable in two highly different settings. By quantifying professional behaviour the instrument facilitates measurement of behavioural change over time. The results suggest that EPOC can also be translated to other settings.
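    The reported intraclass correlations quantify how much of the score variance lies between subjects rather than between observers. As an illustration, a one-way random-effects ICC(1,1) on invented rating data (the paper's exact variance model may differ):

    ```python
    def icc_oneway(scores):
        # One-way random-effects ICC(1,1): `scores` holds one row per observed
        # subject, each row containing the ratings of the same k observers.
        n, k = len(scores), len(scores[0])
        grand = sum(sum(row) for row in scores) / (n * k)
        means = [sum(row) / k for row in scores]
        ms_between = k * sum((m - grand) ** 2 for m in means) / (n - 1)
        ms_within = sum((x - m) ** 2
                        for row, m in zip(scores, means) for x in row) / (n * (k - 1))
        return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

    perfect = icc_oneway([[3, 3], [5, 5], [8, 8]])  # observers agree exactly
    noisy = icc_oneway([[3, 4], [5, 4], [8, 7]])    # one-point disagreements
    ```

    Perfect agreement yields 1.0; small disagreements pull the coefficient down, which is why the 0.85-0.91 range for the overall EPOC score indicates strong inter-observer reliability.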

  15. Modelling of Tool Wear and Residual Stress during Machining of AISI H13 Tool Steel

    Science.gov (United States)

    Outeiro, José C.; Umbrello, Domenico; Pina, José C.; Rizzuti, Stefania

    2007-05-01

    Residual stresses can enhance or impair the ability of a component to withstand loading conditions in service (fatigue, creep, stress corrosion cracking, etc.), depending on their nature: compressive or tensile, respectively. This poses enormous problems in structural assembly, since it affects the structural integrity of the whole part. In addition, tool wear issues are of critical importance in manufacturing, since they affect component quality, tool life and machining cost. Therefore, prediction and control of both tool wear and residual stresses in machining are absolutely necessary. In this work, a two-dimensional finite element (FE) model using an implicit Lagrangian formulation with automatic remeshing was applied to simulate the orthogonal cutting process of AISI H13 tool steel. To validate this model, the predicted and experimentally measured chip geometry, cutting forces, temperatures, tool wear and residual stresses in the machined affected layers were compared. The proposed FE model allowed us to investigate the influence of tool geometry, cutting regime parameters and tool wear on the residual stress distribution in the machined surface and subsurface of AISI H13 tool steel. The results obtained lead to the conclusion that, in order to reduce the magnitude of surface residual stresses, the cutting speed should be increased, the uncut chip thickness (or feed) should be reduced, and machining with honed tools having large cutting edge radii produces better results than with chamfered tools. Moreover, increasing tool wear increases the magnitude of surface residual stresses.

  16. SIMULATION TOOLS FOR ELECTRICAL MACHINES MODELLING ...

    African Journals Online (AJOL)

    Dr Obe

    ABSTRACT. Simulation tools are used both for research and teaching to allow a good ... The solution provides an easy way of determining the dynamic .... incorporate an in-built numerical algorithm, ... to learn, versatile in application, enhanced.

  17. Integrating decision management with UML modeling concepts and tools

    DEFF Research Database (Denmark)

    Könemann, Patrick

    2009-01-01

    , but also for guiding the user by proposing subsequent decisions. In model-based software development, many decisions directly affect the structural and behavioral models used to describe and develop a software system and its architecture. However, the decisions are typically not connected to these models...... of formerly disconnected tools could improve tool usability as well as decision maker productivity....

  18. Dynamic wind turbine models in power system simulation tool

    DEFF Research Database (Denmark)

    Hansen, A.; Jauch, Clemens; Soerensen, P.

    The present report describes the dynamic wind turbine models implemented in the power system simulation tool DIgSILENT. The developed models are a part of the results of a national research project, whose overall objective is to create a model database in different simulation tools. The report...

  19. Modeling and Tool Wear in Routing of CFRP

    International Nuclear Information System (INIS)

    Iliescu, D.; Fernandez, A.; Gutierrez-Orrantia, M. E.; Lopez de Lacalle, L. N.; Girot, F.

    2011-01-01

    This paper presents the prediction and evaluation of feed force in routing of carbon composite material. In order to extend tool life and improve the quality of the machined surface, a better understanding of uncoated and coated tool behaviors is required. This work describes (1) the optimization of the geometry of multiple-teeth tools, minimizing the tool wear and the feed force, (2) the optimization of the tool coating and (3) the development of a phenomenological model relating the feed force to the routing parameters and the tool wear. The experimental results indicate that the feed rate, the cutting speed and the tool wear are the most significant factors affecting the feed force. In the case of multiple-teeth tools, a particular geometry with 14 teeth right helix right cut and 11 teeth left helix right cut gives the best results. A thick AlTiN coating or a diamond coating can dramatically improve the tool life while minimizing the axial force, roughness and delamination. A wear model has then been developed based on an abrasive behavior of the tool. The model links the feed force to the tool geometry parameters (tool diameter), to the process parameters (feed rate, cutting speed and depth of cut) and to the wear. The model presented has been verified by experimental tests.

  20. Explicit/multi-parametric model predictive control (MPC) of linear discrete-time systems by dynamic and multi-parametric programming

    KAUST Repository

    Kouramas, K.I.

    2011-08-01

    This work presents a new algorithm for solving the explicit/multi-parametric model predictive control (or mp-MPC) problem for linear, time-invariant discrete-time systems, based on dynamic programming and multi-parametric programming techniques. The algorithm features two key steps: (i) a dynamic programming step, in which the mp-MPC problem is decomposed into a set of smaller subproblems in which only the current control, state variables, and constraints are considered, and (ii) a multi-parametric programming step, in which each subproblem is solved as a convex multi-parametric programming problem, to derive the control variables as an explicit function of the states. The key feature of the proposed method is that it overcomes potential limitations of previous methods for solving multi-parametric programming problems with dynamic programming, such as the need for global optimization for each subproblem of the dynamic programming step. © 2011 Elsevier Ltd. All rights reserved.
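    The output of any explicit MPC method is a piecewise-affine control law over polyhedral critical regions: offline, the mp-MPC solver computes the regions; online, the controller only looks up the region containing the current state and applies its affine law. A minimal online-evaluation sketch with an invented 1-D example (a saturated u = -2x law), not the paper's algorithm itself:

    ```python
    def explicit_mpc(regions, x):
        # Evaluate an explicit MPC law. `regions` is a list of (A, b, K, g),
        # where A x <= b (componentwise) defines a polyhedral critical region
        # and u = K x + g is its affine control law.
        for A, b, K, g in regions:
            if all(sum(a_ij * x_j for a_ij, x_j in zip(a_i, x)) <= b_i
                   for a_i, b_i in zip(A, b)):
                return [sum(k_ij * x_j for k_ij, x_j in zip(k_i, x)) + g_i
                        for k_i, g_i in zip(K, g)]
        raise ValueError("state outside the explored state space")

    # Hypothetical scalar law: u = -2x, clipped to the input bounds [-1, 1]
    regions = [
        ([[1.0], [-1.0]], [0.5, 0.5], [[-2.0]], [0.0]),  # |x| <= 0.5: u = -2x
        ([[1.0]], [-0.5], [[0.0]], [1.0]),               # x <= -0.5: u = +1
        ([[-1.0]], [-0.5], [[0.0]], [-1.0]),             # x >= 0.5: u = -1
    ]
    u_mid = explicit_mpc(regions, [0.25])
    u_hi = explicit_mpc(regions, [2.0])
    ```

    Because all optimisation is done offline, the online controller reduces to this cheap point-location step, which is what makes explicit MPC attractive for fast or embedded systems.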

  1. Requirements Validation: Execution of UML Models with CPN Tools

    DEFF Research Database (Denmark)

    Machado, Ricardo J.; Lassen, Kristian Bisgaard; Oliveira, Sérgio

    2007-01-01

    Requirements validation is a critical task in any engineering project. The confrontation of stakeholders with static requirements models is not enough, since stakeholders with non-computer science education are not able to discover all the inter-dependencies between the elicited requirements. Eve...... requirements, where the system to be built must explicitly support the interaction between people within a pervasive cooperative workflow execution. A case study from a real project is used to illustrate the proposed approach....

  2. Student Model Tools Code Release and Documentation

    DEFF Research Database (Denmark)

    Johnson, Matthew; Bull, Susan; Masci, Drew

    of its strengths and areas of improvement (Section 6). Several key appendices are attached to this report including user manuals for teacher and students (Appendix 3). Fundamentally, all relevant information is included in the report for those wishing to do further development work with the tool...

  3. DYNAMO-HIA--a Dynamic Modeling tool for generic Health Impact Assessments.

    Directory of Open Access Journals (Sweden)

    Stefan K Lhachimi

    Full Text Available BACKGROUND: Currently, no standard tool is publicly available that allows researchers or policy-makers to quantify the impact of policies using epidemiological evidence within the causal framework of Health Impact Assessment (HIA). A standard tool should comply with three technical criteria (real-life population, dynamic projection, explicit risk-factor states) and three usability criteria (modest data requirements, rich model output, generally accessible) to be useful in the applied setting of HIA. With DYNAMO-HIA (Dynamic Modeling for Health Impact Assessment), we introduce such a generic software tool specifically designed to facilitate quantification in the assessment of the health impacts of policies. METHODS AND RESULTS: DYNAMO-HIA quantifies the impact of user-specified risk-factor changes on multiple diseases and in turn on overall population health, comparing one reference scenario with one or more intervention scenarios. The Markov-based modeling approach allows for explicit risk-factor states and simulation of a real-life population. A built-in parameter estimation module ensures that only standard population-level epidemiological evidence is required, i.e. data on incidence, prevalence, relative risks, and mortality. DYNAMO-HIA provides a rich output of summary measures--e.g. life expectancy and disease-free life expectancy--and detailed data--e.g. prevalences and mortality/survival rates--by age, sex, and risk-factor status over time. DYNAMO-HIA is controlled via a graphical user interface and is publicly available from the internet, ensuring general accessibility. We illustrate the use of DYNAMO-HIA with two example applications: a policy causing an overall increase in alcohol consumption and quantifying the disease-burden of smoking. 
CONCLUSION: By combining modest data needs with general accessibility and user friendliness within the causal framework of HIA, DYNAMO-HIA is a potential standard tool for health impact assessment based
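A minimal sketch of the Markov-with-explicit-risk-factor-states idea behind such tools (all rates below are invented; DYNAMO-HIA itself estimates its parameters from incidence, prevalence, relative-risk and mortality data):

```python
import numpy as np

# Toy Markov cohort: states = healthy, diseased, dead. A risk factor
# multiplies disease incidence by a relative risk; scenarios differ
# only in risk-factor prevalence. All rates are illustrative.
def simulate(prevalence_rf, rr=2.0, inc=0.01, mort_h=0.01,
             mort_d=0.05, years=50):
    eff_inc = inc * ((1 - prevalence_rf) + prevalence_rf * rr)
    pop = np.array([1.0, 0.0, 0.0])      # healthy, diseased, dead
    alive_years = 0.0
    for _ in range(years):
        h, d, dead = pop
        pop = np.array([
            h * (1 - eff_inc - mort_h),
            d * (1 - mort_d) + h * eff_inc,
            dead + h * mort_h + d * mort_d,
        ])
        alive_years += pop[0] + pop[1]   # crude life-years this cycle
    return alive_years

reference = simulate(prevalence_rf=0.3)     # 30% exposed
intervention = simulate(prevalence_rf=0.1)  # policy lowers exposure
assert intervention > reference             # policy gains life-years
```

Comparing scenarios that differ only in risk-factor prevalence, as here, is exactly the reference-versus-intervention comparison the abstract describes, just with one disease instead of many.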

  4. A spatially explicit whole-system model of the lignocellulosic bioethanol supply chain: an assessment of decentralised processing potential

    Directory of Open Access Journals (Sweden)

    Shah Nilay

    2008-07-01

    Full Text Available Abstract Background Lignocellulosic bioethanol technologies exhibit significant capacity for performance improvement across the supply chain through the development of high-yielding energy crops, integrated pretreatment, hydrolysis and fermentation technologies and the application of dedicated ethanol pipelines. The impact of such developments on cost-optimal plant location, scale and process composition within multiple plant infrastructures is poorly understood. A combined production and logistics model has been developed to investigate cost-optimal system configurations for a range of technological, system scale, biomass supply and ethanol demand distribution scenarios specific to European agricultural land and population densities. Results Ethanol production costs for current technologies decrease significantly from $0.71 to $0.58 per litre with increasing economies of scale, up to a maximum single-plant capacity of 550 × 10⁶ l year⁻¹. The development of high-yielding energy crops and consolidated bio-processing realises significant cost reductions, with production costs ranging from $0.33 to $0.36 per litre. Increased feedstock yields result in systems of eight fully integrated plants operating within a 500 × 500 km² region, each producing between 1.24 and 2.38 × 10⁹ l year⁻¹ of pure ethanol. A limited potential for distributed processing and centralised purification systems is identified, requiring developments in modular, ambient pretreatment and fermentation technologies and the pipeline transport of pure ethanol. Conclusion The conceptual and mathematical modelling framework developed provides a valuable tool for the assessment and optimisation of the lignocellulosic bioethanol supply chain. In particular, it can provide insight into the optimal configuration of multiple plant systems. This information is invaluable in ensuring (near-)cost-optimal strategic development within the sector at the regional and national scale. The framework
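The scale trade-off described in the abstract can be caricatured in a few lines. The cost curves below are invented placeholders, not the model's, but they show why an interior cost-optimal plant capacity emerges:

```python
import math

# Toy trade-off behind cost-optimal plant scale: processing cost per
# litre falls with capacity (economies of scale) while feedstock
# transport cost rises, since the biomass catchment radius grows
# roughly with sqrt(capacity). Coefficients are illustrative only.
def cost_per_litre(capacity_Ml):
    processing = 0.50 * (capacity_Ml / 100.0) ** -0.3   # scale economies
    transport = 0.02 * math.sqrt(capacity_Ml)           # catchment radius
    return processing + transport

# Scan candidate capacities (in megalitres/year) for the cheapest.
best = min(range(50, 1000, 10), key=cost_per_litre)
assert cost_per_litre(best) < cost_per_litre(50)        # U-shaped curve
assert cost_per_litre(best) < cost_per_litre(990)
```

The same opposing terms, resolved simultaneously over many candidate sites and demand points, are what the paper's combined production-and-logistics optimisation evaluates.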

  5. Advanced reach tool (ART) : Development of the mechanistic model

    NARCIS (Netherlands)

    Fransman, W.; Tongeren, M. van; Cherrie, J.W.; Tischer, M.; Schneider, T.; Schinkel, J.; Kromhout, H.; Warren, N.; Goede, H.; Tielemans, E.

    2011-01-01

    This paper describes the development of the mechanistic model within a collaborative project, referred to as the Advanced REACH Tool (ART) project, to develop a tool to model inhalation exposure for workers sharing similar operational conditions across different industries and locations in Europe.

  6. Storm Water Management Model Climate Adjustment Tool (SWMM-CAT)

    Science.gov (United States)

    The US EPA’s newest tool, the Stormwater Management Model (SWMM) – Climate Adjustment Tool (CAT) is meant to help municipal stormwater utilities better address potential climate change impacts affecting their operations. SWMM, first released in 1971, models hydrology and hydrauli...

  7. Information Theoretic Tools for Parameter Fitting in Coarse Grained Models

    KAUST Repository

    Kalligiannaki, Evangelia; Harmandaris, Vagelis; Katsoulakis, Markos A.; Plechac, Petr

    2015-01-01

    We study the application of information theoretic tools for model reduction in the case of systems driven by stochastic dynamics out of equilibrium. The model/dimension reduction is considered by proposing parametrized coarse grained dynamics

  8. MARKET EVALUATION MODEL: TOOL FOR BUSINESS DECISIONS

    OpenAIRE

    Porlles Loarte, José; Yenque Dedios, Julio; Lavado Soto, Aurelio

    2014-01-01

    In the present work the concepts of potential market and global market are analyzed as the basis for long-term strategic market decisions when the establishment of a business in a certain geographic area is evaluated. On this conceptual frame, a methodological tool is proposed for evaluating a commercial decision, taking as reference the case of the brewing industry in Peru, considering that this industry faces in the region entrepreneurial reorderings withi...

  9. Assessment of the Simulated Molecular Composition with the GECKO-A Modeling Tool Using Chamber Observations for α-Pinene.

    Science.gov (United States)

    Aumont, B.; Camredon, M.; Isaacman-VanWertz, G. A.; Karam, C.; Valorso, R.; Madronich, S.; Kroll, J. H.

    2016-12-01

    Gas phase oxidation of VOC is a gradual process leading to the formation of multifunctional organic compounds, i.e., typically species with higher oxidation state, high water solubility and low volatility. These species contribute to the formation of secondary organic aerosols (SOA) via multiphase processes involving a myriad of organic species that evolve through thousands of reactions and gas/particle mass exchanges. Explicit chemical mechanisms reflect the understanding of these multigenerational oxidation steps. These mechanisms rely directly on elementary reactions to describe the chemical evolution and track the identity of organic carbon through various phases down to ultimate oxidation products. The development, assessment and improvement of such explicit schemes is a key issue, as major uncertainties remain on the chemical pathways involved during atmospheric oxidation of organic matter. An array of mass spectrometric techniques (CIMS, PTRMS, AMS) was recently used to track the composition of organic species during α-pinene oxidation in the MIT environmental chamber, providing an experimental database to evaluate and improve explicit mechanisms. In this study, the GECKO-A tool (Generator for Explicit Chemistry and Kinetics of Organics in the Atmosphere) is used to generate fully explicit oxidation schemes for α-pinene multiphase oxidation simulating the MIT experiment. The ability of the GECKO-A chemical scheme to explain the organic molecular composition in the gas and the condensed phases is explored. First results of this model/observation comparison at the molecular level will be presented.
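To make "explicit mechanism" concrete, here is a toy two-step oxidation chain integrated species by species; a generator like GECKO-A emits the same structure with thousands of species and reactions. Rate constants here are arbitrary.

```python
import numpy as np

# Toy explicit mechanism: VOC -k1-> P1 -k2-> P2, every species and
# elementary reaction tracked individually, integrated with forward
# Euler. Rates and timestep are illustrative.
k1, k2, dt = 1e-4, 5e-5, 1.0    # s^-1, s^-1, s
c = np.array([1.0, 0.0, 0.0])   # [VOC, P1, P2], arbitrary units

for _ in range(20000):           # ~5.5 h of simulated time
    r1, r2 = k1 * c[0], k2 * c[1]
    c = c + dt * np.array([-r1, r1 - r2, r2])

assert abs(c.sum() - 1.0) < 1e-9   # carbon conserved along the chain
assert c[2] > 0.1                  # later-generation product has formed
```

Tracking carbon through every intermediate in this way is what lets an explicit scheme be compared species-by-species against CIMS/PTR-MS/AMS observations, as the study does.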

  10. Applying Modeling Tools to Ground System Procedures

    Science.gov (United States)

    Di Pasquale, Peter

    2012-01-01

    As part of a long-term effort to revitalize the Ground Systems (GS) Engineering Section practices, Systems Modeling Language (SysML) and Business Process Model and Notation (BPMN) have been used to model existing GS products and the procedures GS engineers use to produce them.

  11. Shape: A 3D Modeling Tool for Astrophysics.

    Science.gov (United States)

    Steffen, Wolfgang; Koning, Nicholas; Wenger, Stephan; Morisset, Christophe; Magnor, Marcus

    2011-04-01

    We present a flexible interactive 3D morpho-kinematical modeling application for astrophysics. Compared to other systems, our application reduces the restrictions on the physical assumptions, data type, and amount that is required for a reconstruction of an object's morphology. It is one of the first publicly available tools to apply interactive graphics to astrophysical modeling. The tool allows astrophysicists to provide a priori knowledge about the object by interactively defining 3D structural elements. By direct comparison of model prediction with observational data, model parameters can then be automatically optimized to fit the observation. The tool has already been successfully used in a number of astrophysical research projects.

  12. GPGPU-based explicit finite element computations for applications in biomechanics: the performance of material models, element technologies, and hardware generations.

    Science.gov (United States)

    Strbac, V; Pierce, D M; Vander Sloten, J; Famaey, N

    2017-12-01

    Finite element (FE) simulations are increasingly valuable in assessing and improving the performance of biomedical devices and procedures. Due to high computational demands such simulations may become difficult or even infeasible, especially when considering nearly incompressible and anisotropic material models prevalent in analyses of soft tissues. Implementations of GPGPU-based explicit FEs predominantly cover isotropic materials, e.g. the neo-Hookean model. To elucidate the computational expense of anisotropic materials, we implement the Gasser-Ogden-Holzapfel dispersed, fiber-reinforced model and compare solution times against the neo-Hookean model. Implementations of GPGPU-based explicit FEs conventionally rely on single-point (under) integration. To elucidate the expense of full and selective-reduced integration (more reliable) we implement both and compare corresponding solution times against those generated using underintegration. To better understand the advancement of hardware, we compare results generated using representative Nvidia GPGPUs from three recent generations: Fermi (C2075), Kepler (K20c), and Maxwell (GTX980). We explore scaling by solving the same boundary value problem (an extension-inflation test on a segment of human aorta) with progressively larger FE meshes. Our results demonstrate substantial improvements in simulation speeds relative to two benchmark FE codes (up to 300× while maintaining accuracy), and thus open many avenues to novel applications in biomechanics and medicine.
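The element-level parallelism that makes explicit FE attractive on GPGPUs can be sketched with a vectorized 1D bar under single-point integration (NumPy standing in for the GPU; material, mesh and load values are illustrative, not the paper's):

```python
import numpy as np

# Explicit (leapfrog/central-difference style) FE update for a 1D bar
# of linear elements: strain, stress and force scatter are element-wise
# and embarrassingly parallel -- the structure GPGPU codes exploit.
n_el, L, E, A, rho = 100, 1.0, 1e6, 1e-4, 1000.0
h = L / n_el
m = np.full(n_el + 1, rho * A * h)            # lumped nodal mass
m[[0, -1]] /= 2
u = np.zeros(n_el + 1)
v = np.zeros(n_el + 1)
dt = 0.5 * h / np.sqrt(E / rho)               # below the CFL limit
f_ext = np.zeros(n_el + 1)
f_ext[-1] = 1.0                               # end load, left end fixed

for _ in range(1000):
    strain = np.diff(u) / h                   # one integration pt/element
    stress = E * strain
    f_int = np.zeros(n_el + 1)
    f_int[:-1] -= stress * A                  # scatter element forces
    f_int[1:] += stress * A
    accel = (f_ext - f_int) / m
    accel[0] = 0.0                            # fixed boundary
    v += dt * accel
    u += dt * v

assert u.max() > 0.0                          # bar stretches under load
```

The anisotropic materials and higher-order integration rules studied in the paper change only the per-element stress computation inside the loop, which is why their relative cost can be measured cleanly on successive GPU generations.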

  13. Evaluating the effect of corridors and landscape heterogeneity on dispersal probability: a comparison of three spatially explicit modelling approaches

    DEFF Research Database (Denmark)

    Jepsen, J. U.; Baveco, J. M.; Topping, C. J.

    2004-01-01

    preferences of the modeller, rather than by a critical evaluation of model performance. We present a comparison of three common spatial simulation approaches (patch-based incidence-function model (IFM), individual-based movement model (IBMM), individual-based population model including detailed behaviour...

  14. Finite Element Modelling of the effect of tool rake angle on tool temperature and cutting force during high speed machining of AISI 4340 steel

    International Nuclear Information System (INIS)

    Sulaiman, S; Roshan, A; Ariffin, M K A

    2013-01-01

    In this paper, a Finite Element Method (FEM) based on the ABAQUS explicit software which involves the Johnson-Cook material model was used to simulate cutting force and tool temperature during high speed machining (HSM) of AISI 4340 steel. In this simulation work, a tool rake angle ranging from 0° to 20° and a range of cutting speeds between 300 to 550 m/min were investigated. The purpose of this simulation analysis was to find the optimum tool rake angle at which the cutting force is smallest and the tool temperature is lowest during high speed machining. It was found that cutting forces have a decreasing trend as the rake angle increases in the positive direction. The optimum rake angle was observed between 10° and 18°, where the cutting force decreased by about 20% for all simulated cutting speeds. In addition, increasing the cutting tool rake angle beyond its optimum value had a negative influence on the tool's performance and led to an increase in cutting temperature. The results give a better understanding of cutting tool design for high speed machining processes.
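For reference, the Johnson-Cook flow stress used in such simulations has the form σ = (A + Bεⁿ)(1 + C ln(ε̇/ε̇₀))(1 − T*ᵐ) with T* the homologous temperature. The constants below are commonly quoted values for AISI 4340 but should be checked against the paper before reuse:

```python
import math

# Johnson-Cook flow stress: strain hardening x strain-rate hardening
# x thermal softening. Constants are typical published values for
# AISI 4340 (A, B in MPa); temperatures in degrees C.
A, B, n, C, m = 792.0, 510.0, 0.26, 0.014, 1.03
T_room, T_melt, epsdot0 = 25.0, 1520.0, 1.0

def jc_stress(eps, epsdot, T):
    T_star = (T - T_room) / (T_melt - T_room)   # homologous temperature
    return ((A + B * eps ** n)
            * (1 + C * math.log(epsdot / epsdot0))
            * (1 - T_star ** m))

# Flow stress softens as the cutting temperature rises.
assert jc_stress(0.5, 1e4, 600.0) < jc_stress(0.5, 1e4, 300.0)
```

The thermal-softening term is why the paper's rake-angle sweep couples cutting force and tool temperature: a hotter cutting zone lowers the flow stress but degrades the tool.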

  15. Novel multiscale modeling tool applied to Pseudomonas aeruginosa biofilm formation.

    Directory of Open Access Journals (Sweden)

    Matthew B Biggs

    Full Text Available Multiscale modeling is used to represent biological systems with increasing frequency and success. Multiscale models are often hybrids of different modeling frameworks and programming languages. We present the MATLAB-NetLogo extension (MatNet as a novel tool for multiscale modeling. We demonstrate the utility of the tool with a multiscale model of Pseudomonas aeruginosa biofilm formation that incorporates both an agent-based model (ABM and constraint-based metabolic modeling. The hybrid model correctly recapitulates oxygen-limited biofilm metabolic activity and predicts increased growth rate via anaerobic respiration with the addition of nitrate to the growth media. In addition, a genome-wide survey of metabolic mutants and biofilm formation exemplifies the powerful analyses that are enabled by this computational modeling tool.

  16. Novel multiscale modeling tool applied to Pseudomonas aeruginosa biofilm formation.

    Science.gov (United States)

    Biggs, Matthew B; Papin, Jason A

    2013-01-01

    Multiscale modeling is used to represent biological systems with increasing frequency and success. Multiscale models are often hybrids of different modeling frameworks and programming languages. We present the MATLAB-NetLogo extension (MatNet) as a novel tool for multiscale modeling. We demonstrate the utility of the tool with a multiscale model of Pseudomonas aeruginosa biofilm formation that incorporates both an agent-based model (ABM) and constraint-based metabolic modeling. The hybrid model correctly recapitulates oxygen-limited biofilm metabolic activity and predicts increased growth rate via anaerobic respiration with the addition of nitrate to the growth media. In addition, a genome-wide survey of metabolic mutants and biofilm formation exemplifies the powerful analyses that are enabled by this computational modeling tool.

  17. New tools for aquatic habitat modeling

    Science.gov (United States)

    D. Tonina; J. A. McKean; C. Tang; P. Goodwin

    2011-01-01

    Modeling of aquatic microhabitat in streams has been typically done over short channel reaches using one-dimensional simulations, partly because of a lack of high-resolution subaqueous topographic data to better define model boundary conditions. The Experimental Advanced Airborne Research Lidar (EAARL) is an airborne aquatic-terrestrial sensor that allows simultaneous...

  18. The impact of convection in the West African monsoon region on global weather forecasts - explicit vs. parameterised convection simulations using the ICON model

    Science.gov (United States)

    Pante, Gregor; Knippertz, Peter

    2017-04-01

    The West African monsoon is the driving element of weather and climate during summer in the Sahel region. It interacts with mesoscale convective systems (MCSs) and the African easterly jet and African easterly waves. Poor representation of convection in numerical models, particularly its organisation on the mesoscale, can result in unrealistic forecasts of the monsoon dynamics. Arguably, the parameterisation of convection is one of the main deficiencies in models over this region. Overall, this has negative impacts on forecasts over West Africa itself but may also affect remote regions, as waves originating from convective heating are badly represented. Here we investigate those remote forecast impacts based on daily initialised 10-day forecasts for July 2016 using the ICON model. One set of simulations employs the default setup of the global model with a horizontal grid spacing of 13 km. It is compared with simulations using the 2-way nesting capability of ICON. A second model domain over West Africa (the nest) with 6.5 km grid spacing is sufficient to explicitly resolve MCSs in this region. In the 2-way nested simulations, the prognostic variables of the global model are influenced by the results of the nest through relaxation. The nest with explicit convection is able to reproduce single MCSs much more realistically compared to the stand-alone global simulation with parameterised convection. Explicit convection leads to cooler temperatures in the lower troposphere (below 500 hPa) over the northern Sahel due to stronger evaporational cooling. Overall, the feedback of dynamic variables from the nest to the global model shows clear positive effects when evaluating the output of the global domain of the 2-way nesting simulation and the output of the stand-alone global model with ERA-Interim re-analyses. 
Averaged over the 2-way nested region, bias and root mean squared error (RMSE) of temperature, geopotential, wind and relative humidity are significantly reduced in

  19. Linking land use change to recreational fishery valuation with a spatially explicit behavior model: A case study from Tampa Bay, FL USA

    Science.gov (United States)

    Drawing a link between habitat change and production and delivery of ecosystem services is a priority in coastal estuarine ecosystems. This link is needed to fully understand how human communities can influence ecosystem sustainability. Mechanistic modeling tools are highly fun...

  20. Modeling Behavior by Coastal River Otter (Lontra canadensis) in Response to Prey Availability in Prince William Sound, Alaska: A Spatially-Explicit Individual-Based Approach.

    Directory of Open Access Journals (Sweden)

    Shannon E Albeke

    Full Text Available Effects of climate change on animal behavior and cascading ecosystem responses are rarely evaluated. In coastal Alaska, social river otters (Lontra canadensis), largely males, cooperatively forage on schooling fish and use latrine sites to communicate group associations and dominance. Conversely, solitary otters, mainly females, feed on intertidal-demersal fish and display mutual avoidance via scent marking. This behavioral variability creates "hotspots" of nutrient deposition and affects plant productivity and diversity on the terrestrial landscape. Because the abundance of schooling pelagic fish is predicted to decline with climate change, we developed a spatially-explicit individual-based model (IBM) of otter behavior and tested six scenarios based on potential shifts to distribution patterns of schooling fish. Emergent patterns from the IBM closely mimicked observed otter behavior and landscape use in the absence of explicit rules of intraspecific attraction or repulsion. Model results were most sensitive to rules regarding spatial memory and activity state following an encounter with a fish school. With declining availability of schooling fish, the number of social groups and the time simulated otters spent in the company of conspecifics declined. Concurrently, model results suggested an elevation of defecation rate, a 25% increase in nitrogen transport to the terrestrial landscape, and significant changes to the spatial distribution of "hotspots" with declines in schooling fish availability. However, reductions in availability of schooling fish could lead to declines in otter density over time.

  1. ‘Safety Matters Have Become Too Important for Management to Leave it Up to the Workers’ – The Nordic OSH Model Between Implicit and Explicit Frameworks

    Directory of Open Access Journals (Sweden)

    Johnny Dyreborg

    2011-01-01

    Full Text Available In a globalized economy it is relevant to question whether the Nordic Working Environment (WE) model will remain as the basic and implicit framework for the governance of the WE. This paper explores institutional changes in the governance of the WE, and critically examines how a more explicit and market-oriented framework might influence the governance of the WE in the Nordic countries. Firstly, the paper examines the changes in the governance of the WE at the societal level (Denmark) for the period 1954-2007, and identifies institutional logics informing these changes. Secondly, the paper examines changes in the governance of the WE at the level of the construction sector, using case material from four of the largest construction projects completed in Denmark in recent years. The analyses reveal three discrete periods, representing distinct logics influencing the governance of the WE, i.e., the logic of the state, the logic of democracy and the logic of the market. The logic of the state and the logic of democracy represent an implicit framework, whereas the logic of the market entails a shift to a more explicit framework. The shift to a more explicit framework for the governance of the WE is also identified at the level of the construction sector. This leads to a pivotal shift in the clients' and the construction companies' relationship with the institutional environment in the four large construction projects: from worker representatives being the primary stakeholders, to the fulcrum of WE development lying between management, the state and stakeholders in the companies' environment. This shift opens up a range of new and more market-oriented approaches to the governance of the WE that seem to challenge the extant Nordic WE model.

  2. Jack Human Modelling Tool: A Review

    Science.gov (United States)

    2010-01-01

    design and evaluation [8] and evolved into the Computerised Biomechanical Man Model (Combiman), shown in Figure 2. Combiman was developed at the...unrealistic arrangement of tetrahedra (Figure 7) to a highly realistic human model based on current anthropometric, anatomical and biomechanical data...has long legs and a short torso may find it difficult to adjust the seat and rudder pedals to achieve the required over the nose vision, reach to

  3. A comparison of tools for modeling freshwater ecosystem services.

    Science.gov (United States)

    Vigerstol, Kari L; Aukema, Juliann E

    2011-10-01

    Interest in ecosystem services has grown tremendously among a wide range of sectors, including government agencies, NGOs and the business community. Ecosystem services entailing freshwater (e.g. flood control, the provision of hydropower, and water supply), as well as carbon storage and sequestration, have received the greatest attention in both scientific and on-the-ground applications. Given the newness of the field and the variety of tools for predicting water-based services, it is difficult to know which tools to use for different questions. There are two types of freshwater-related tools--traditional hydrologic tools and newer ecosystem services tools. Here we review two of the most prominent tools of each type and their possible applications. In particular, we compare the data requirements, ease of use, questions addressed, and interpretability of results among the models. We discuss the strengths, challenges and most appropriate applications of the different models. Traditional hydrological tools provide more detail whereas ecosystem services tools tend to be more accessible to non-experts and can provide a good general picture of these ecosystem services. We also suggest gaps in the modeling toolbox that would provide the greatest advances by improving existing tools. Copyright © 2011 Elsevier Ltd. All rights reserved.

  4. Aligning building information model tools and construction management methods

    NARCIS (Netherlands)

    Hartmann, Timo; van Meerveld, H.J.; Vossebeld, N.; Adriaanse, Adriaan Maria

    2012-01-01

    Few empirical studies exist that can explain how different Building Information Model (BIM) based tool implementation strategies work in practical contexts. To help overcoming this gap, this paper describes the implementation of two BIM based tools, the first, to support the activities at an

  5. Scratch as a Computational Modelling Tool for Teaching Physics

    Science.gov (United States)

    Lopez, Victor; Hernandez, Maria Isabel

    2015-01-01

    The Scratch online authoring tool, which features a simple programming language that has been adapted to primary and secondary students, is being used more and more in schools as it offers students and teachers the opportunity to use a tool to build scientific models and evaluate their behaviour, just as can be done with computational modelling…

  6. Advanced REACH tool: A Bayesian model for occupational exposure assessment

    NARCIS (Netherlands)

    McNally, K.; Warren, N.; Fransman, W.; Entink, R.K.; Schinkel, J.; Van Tongeren, M.; Cherrie, J.W.; Kromhout, H.; Schneider, T.; Tielemans, E.

    2014-01-01

    This paper describes a Bayesian model for the assessment of inhalation exposures in an occupational setting; the methodology underpins a freely available web-based application for exposure assessment, the Advanced REACH Tool (ART). The ART is a higher tier exposure tool that combines disparate

  7. Agent Based Modeling as an Educational Tool

    Science.gov (United States)

    Fuller, J. H.; Johnson, R.; Castillo, V.

    2012-12-01

    Motivation is a key element in high school education. One way to improve motivation and provide content, while helping address critical thinking and problem solving skills, is to have students build and study agent based models in the classroom. This activity visually connects concepts with their applied mathematical representation. "Engaging students in constructing models may provide a bridge between frequently disconnected conceptual and mathematical forms of knowledge" (Levy and Wilensky, 2011). We wanted to discover the feasibility of implementing a model based curriculum in the classroom given current and anticipated core and content standards. Example simulations included one using California GIS data, and a simulation of high-school student lunch popularity using an aerial photograph on top of a terrain value map.
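A classroom-scale example of the kind of model meant here (entirely illustrative, not one of the simulations named in the record): agents react to an aggregate that their own positions create, and clustering emerges without being programmed in.

```python
import random

# Toy agent-based model: agents on a 1-D "schoolyard" drift one step
# toward the currently most popular spot, and popularity is just the
# mean of the agents' own positions -- a feedback loop students can
# inspect and modify.
random.seed(42)
positions = [random.randint(0, 20) for _ in range(50)]

for step in range(100):
    center = sum(positions) / len(positions)   # emergent "popular spot"
    positions = [
        p + (1 if p < center else -1 if p > center else 0)
        for p in positions
    ]

spread = max(positions) - min(positions)
assert spread <= 3   # agents have clustered around the emergent center
```

Varying the movement rule (e.g. moving away from the center instead of toward it) gives students an immediate, visual connection between a one-line rule and a qualitatively different emergent pattern.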

  8. Spatially explicit models of full-season productivity and implications for landscape management of Golden-winged Warblers in the western Great Lakes Region: Chapter 9

    Science.gov (United States)

    Peterson, Sean M.; Streby, Henry M.; Andersen, David E.

    2016-01-01

    The relationship between landscape structure and composition and full-season productivity (FSP) is poorly understood for most birds. For species of high conservation concern, insight into how productivity is related to landscape structure and composition can be used to develop more effective conservation strategies that increase recruitment. We monitored nest productivity and fledgling survival of Golden-winged Warblers (Vermivora chrysoptera), a species of high conservation concern, in managed forest landscapes at two sites in northern Minnesota, and one site in southeastern Manitoba, Canada from 2010 to 2012. We used logistic exposure models to identify the influence of landscape structure and composition on nest productivity and fledgling survival. We used the models to predict spatially explicit FSP across our study sites to identify areas of low relative productivity that could be targeted for management. We then used our models of spatially explicit FSP to simulate the impact of potential management actions on our study sites with the goal of increasing total population productivity. Unlike previous studies that suggested wetland cover types provide higher quality breeding habitat for Golden-winged Warblers, our models predicted 14% greater productivity in upland cover types. Simulated succession of a 9-ha grassland patch to a shrubby upland suitable for nesting increased the total number of fledglings produced by that patch and adjacent upland shrublands by 30%, despite decreasing individual productivity by 13%. Further simulated succession of the same patch described above into deciduous forest reduced the total number of fledglings produced to independence on a landscape by 18% because of a decrease in the area available for nesting. Simulated reduction in the cumulative length of shrubby edge within a 50-m radius of any location in our landscapes from 0.6 to 0.3 km increased FSP by 5%. Our models demonstrated that the effects of any single management
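The logistic-exposure bookkeeping behind FSP can be sketched as follows. The covariate, coefficients and period lengths are invented for illustration; the chapter fits its own covariates to field data.

```python
import math

# Logistic-exposure style sketch: a landscape covariate (here, shrubby
# edge length in km within some radius) sets daily survival on a logit
# scale; period survival is daily survival compounded over the period;
# FSP chains the nest and fledgling periods.
def daily_survival(edge_km, b0=3.5, b1=-0.8):
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * edge_km)))

def fsp(edge_km, nest_days=25, fledgling_days=24, young=4.0):
    s_nest = daily_survival(edge_km) ** nest_days
    s_fledge = daily_survival(edge_km) ** fledgling_days
    return young * s_nest * s_fledge   # expected independent young

# With a negative edge coefficient, reducing edge density raises
# predicted productivity, qualitatively matching the chapter's result.
assert fsp(0.3) > fsp(0.6)
```

Evaluating such a function at every location in a raster is what turns the fitted survival models into the spatially explicit FSP surfaces used to rank management actions.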

  9. Exploring the Dynamic Mechanisms of Farmland Abandonment Based on a Spatially Explicit Economic Model for Environmental Sustainability: A Case Study in Jiangxi Province, China

    Directory of Open Access Journals (Sweden)

    Hualin Xie

    2014-03-01

    Full Text Available Farmland abandonment has important impacts on biodiversity and ecosystem recovery, as well as food security and rural sustainable development. Due to rapid urbanization and industrialization, farmland abandonment has become an increasingly important problem in many countries, particularly in China. To promote sustainable land-use management and environmental sustainability, it is important to understand the socioeconomic causes and spatial patterns of farmland abandonment. In this study, we explored the dynamic mechanisms of farmland abandonment in Jiangxi province of China using a spatially explicit economic model. The results show that the variables associated with the agricultural product yield are significantly correlated with farmland abandonment. The increasing opportunity cost of farming labor is the main factor in farmland abandonment in conjunction with a rural labor shortage due to rural-to-urban population migration and regional industrialization. Farmlands are more likely to be abandoned in areas located far from the villages and towns due to higher transportation costs. Additionally, farmers with more land but lower net income are more likely to abandon poor-quality farmland. Our results support the hypothesis that farmland abandonment takes place in locations in which the costs of cultivation are high and the potential crop yield is low. In addition, our study also demonstrates that a spatially explicit economic model is necessary to distinguish between the main driving forces of farmland abandonment. Policy implications are also provided for potential future policy decisions.
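The economic decision rule at the heart of such spatially explicit models can be illustrated in a few lines (all prices, costs and wages below are hypothetical):

```python
# Parcel-level decision sketch: a plot is abandoned when its expected
# net return falls below zero once transport costs (distance to town)
# and the opportunity cost of farm labor (off-farm wage) are counted.
def net_return(yield_t_ha, price=300.0, input_cost=40.0,
               dist_km=0.0, transport_per_km=8.0,
               labor_days=30, off_farm_wage=10.0):
    revenue = yield_t_ha * price
    costs = (input_cost
             + dist_km * transport_per_km       # remoteness penalty
             + labor_days * off_farm_wage)      # forgone off-farm income
    return revenue - costs

def abandoned(yield_t_ha, dist_km, off_farm_wage):
    return net_return(yield_t_ha, dist_km=dist_km,
                      off_farm_wage=off_farm_wage) < 0.0

# Remote, low-yield parcels flip first as off-farm wages rise.
assert not abandoned(yield_t_ha=2.0, dist_km=2.0, off_farm_wage=8.0)
assert abandoned(yield_t_ha=1.0, dist_km=15.0, off_farm_wage=12.0)
```

Applying this rule per grid cell, with yield and distance read from spatial layers, reproduces the study's qualitative finding that high cultivation cost plus low potential yield drives abandonment.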

  10. Direct versus Indirect Explicit Methods of Enhancing EFL Students' English Grammatical Competence: A Concept Checking-Based Consciousness-Raising Tasks Model

    Science.gov (United States)

    Dang, Trang Thi Doan; Nguyen, Huong Thu

    2013-01-01

    Two approaches to grammar instruction are often discussed in the ESL literature: direct explicit grammar instruction (DEGI) (deduction) and indirect explicit grammar instruction (IEGI) (induction). This study aims to explore the effects of indirect explicit grammar instruction on EFL learners' mastery of English tenses. Ninety-four…

  11. Predictions of titanium alloy properties using thermodynamic modeling tools

    Science.gov (United States)

    Zhang, F.; Xie, F.-Y.; Chen, S.-L.; Chang, Y. A.; Furrer, D.; Venkatesh, V.

    2005-12-01

    Thermodynamic modeling tools have become essential in understanding the effect of alloy chemistry on the final microstructure of a material. Implementation of such tools to improve titanium processing via parameter optimization has resulted in significant cost savings through the elimination of shop/laboratory trials and tests. In this study, a thermodynamic modeling tool developed at CompuTherm, LLC, is being used to predict β transus, phase proportions, phase chemistries, partitioning coefficients, and phase boundaries of multicomponent titanium alloys. This modeling tool includes Pandat, software for multicomponent phase equilibrium calculations, and PanTitanium, a thermodynamic database for titanium alloys. Model predictions are compared with experimental results for one α-β alloy (Ti-64) and two near-β alloys (Ti-17 and Ti-10-2-3). The alloying elements, especially the interstitial elements O, N, H, and C, have been shown to have a significant effect on the β transus temperature, and are discussed in more detail herein.

  12. Integrated and spatially explicit modelling of the economic value of complex environmental change and its indirect effects

    OpenAIRE

    Bateman, Ian; Binner, Amy; Coombes, Emma; Day, Brett; Ferrini, Silvia; Fezzi, Carlo; Hutchins, Michael; Posen, Paulette

    2012-01-01

    Arguably the greatest challenge to contemporary research is to capture the inter-relatedness and complexity of the real-world environment within models so as to better inform decision makers of the accurate and complete consequences of differing options. The paper presents an integrated model of the consequences of climate change upon land use and the indirect effects that arise subsequently. The model predicts the shift in land use which climate change is likely to induce and the...

  13. Spatially explicit integrated modeling and economic valuation of climate driven land use change and its indirect effects.

    OpenAIRE

    Bateman, Ian; Agarwala, M.; Binner, A.; Coombes, E.; Day, B.; Ferrini, Silvia; Fezzi, C.; Hutchins, M.; Lovett, A.; Posen, P.

    2016-01-01

    We present an integrated model of the direct consequences of climate change on land use, and the indirect effects of induced land use change upon the natural environment. The model predicts climate-driven shifts in the profitability of alternative uses of agricultural land. Both the direct impact of climate change and the induced shift in land use patterns will cause secondary effects on the water environment, for which agriculture is the major source of diffuse pollution. We model the impact...

  14. A tool for model based diagnostics of the AGS Booster

    International Nuclear Information System (INIS)

    Luccio, A.

    1993-01-01

    A model-based algorithmic tool was developed to search for lattice errors by a systematic analysis of orbit data in the AGS Booster synchrotron. The algorithm employs transfer matrices calculated with MAD between points in the ring. Iterative model fitting of the data allows one to find and eventually correct magnet displacements and angles or field errors. The tool, implemented on an HP-Apollo workstation system, has proved very general and allows immediate physical interpretation.

  15. An open and extensible framework for spatially explicit land use change modelling in R: the lulccR package (0.1.0)

    Science.gov (United States)

    Moulds, S.; Buytaert, W.; Mijic, A.

    2015-04-01

    Land use change has important consequences for biodiversity and the sustainability of ecosystem services, as well as for global environmental change. Spatially explicit land use change models improve our understanding of the processes driving change and make predictions about the quantity and location of future and past change. Here we present the lulccR package, an object-oriented framework for land use change modelling written in the R programming language. The contribution of the work is to resolve the following limitations associated with the current land use change modelling paradigm: (1) the source code for model implementations is frequently unavailable, severely compromising the reproducibility of scientific results and making it impossible for members of the community to improve or adapt models for their own purposes; (2) ensemble experiments to capture model structural uncertainty are difficult because of fundamental differences between implementations of different models; (3) different aspects of the modelling procedure must be performed in different environments because existing applications usually only perform the spatial allocation of change. The package includes a stochastic ordered allocation procedure as well as an implementation of the widely used CLUE-S algorithm. We demonstrate its functionality by simulating land use change at the Plum Island Ecosystems site, using a dataset included with the package. It is envisaged that lulccR will enable future model development and comparison within an open environment.
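    The "stochastic ordered allocation procedure" mentioned above can be sketched conceptually. Note that lulccR itself is an R package; the following Python sketch illustrates only the allocation idea (suitability scores, demands, and the random class ordering are hypothetical, not the lulccR API).

```python
import numpy as np

def ordered_allocation(suitability, demand, rng=None):
    """Stochastic ordered allocation sketch: classes are visited in random
    order, and each claims its most suitable still-free cells.

    suitability: (n_classes, n_cells) score of each land-use class per cell.
    demand: number of cells each class must receive; must sum to n_cells.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    n_classes, n_cells = suitability.shape
    allocation = np.full(n_cells, -1)          # -1 marks an unallocated cell
    for cls in rng.permutation(n_classes):
        free = np.flatnonzero(allocation == -1)
        # take the top 'demand[cls]' free cells by suitability for this class
        best = free[np.argsort(suitability[cls, free])[::-1][:demand[cls]]]
        allocation[best] = cls
    return allocation
```

    Randomizing the class order between runs is one simple way to generate the ensemble variability the authors argue current models make difficult to study.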

  16. Fluid Survival Tool: A Model Checker for Hybrid Petri Nets

    NARCIS (Netherlands)

    Postema, Björn Frits; Remke, Anne Katharina Ingrid; Haverkort, Boudewijn R.H.M.; Ghasemieh, Hamed

    2014-01-01

    Recently, algorithms for model checking Stochastic Time Logic (STL) on Hybrid Petri nets with a single general one-shot transition (HPNG) have been introduced. This paper presents a tool for model checking HPNG models against STL formulas. A graphical user interface (GUI) not only helps to

  17. LANDIS 4.0 users guide. LANDIS: a spatially explicit model of forest landscape disturbance, management, and succession

    Science.gov (United States)

    Hong S. He; Wei Li; Brian R. Sturtevant; Jian Yang; Bo Z. Shang; Eric J. Gustafson; David J. Mladenoff

    2005-01-01

    LANDIS 4.0 is new-generation software that simulates forest landscape change over large spatial and temporal scales. It is used to explore how disturbances, succession, and management interact to determine forest composition and pattern. The guide also describes the software architecture and model assumptions, and provides detailed instructions on the use of the model.

  18. Application of parameters space analysis tools for empirical model validation

    Energy Technology Data Exchange (ETDEWEB)

    Paloma del Barrio, E. [LEPT-ENSAM UMR 8508, Talence (France); Guyon, G. [Electricite de France, Moret-sur-Loing (France)

    2004-01-01

    A new methodology for empirical model validation has been proposed in the framework of Task 22 (Building Energy Analysis Tools) of the International Energy Agency. It involves two main steps: checking model validity and diagnosis. Both steps, as well as the underlying methods, were presented in the first part of the paper. In this part, they are applied to test modelling hypotheses in the framework of the thermal analysis of an actual building. Sensitivity analysis tools are first used to identify the parts of the model that can really be tested with the available data. A preliminary diagnosis is then supplied by principal components analysis. Useful information for improving model behaviour is finally obtained by optimisation techniques. This example of application shows how analysis of the model parameter space is a powerful tool for empirical validation. In particular, diagnosis possibilities are greatly increased in comparison with residual-analysis techniques. (author)
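    The combination of sensitivity analysis and principal components analysis described above can be illustrated with a small numerical sketch: build a finite-difference sensitivity matrix of model outputs with respect to parameters, then use its SVD (the computation underlying PCA) to find which parameter directions the data can actually test. The function names and tolerances are illustrative assumptions.

```python
import numpy as np

def identifiable_directions(model, p0, eps=1e-6):
    """Finite-difference sensitivity of model outputs to parameters, then SVD:
    large singular values mark parameter combinations the available data can
    actually test; near-zero ones mark directions that cannot be validated."""
    y0 = np.asarray(model(p0), dtype=float)
    J = np.empty((y0.size, p0.size))
    for j in range(p0.size):
        p = p0.astype(float).copy()
        p[j] += eps
        J[:, j] = (np.asarray(model(p), dtype=float) - y0) / eps
    u, s, vt = np.linalg.svd(J, full_matrices=False)
    return s, vt  # singular values and the associated parameter directions
```

    In an empirical-validation setting, a tiny singular value flags a model part that the measured data cannot constrain, which is exactly the screening step the abstract describes.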

  19. Transparent Model Transformation: Turning Your Favourite Model Editor into a Transformation Tool

    DEFF Research Database (Denmark)

    Acretoaie, Vlad; Störrle, Harald; Strüber, Daniel

    2015-01-01

    Current model transformation languages are supported by dedicated editors, often closely coupled to a single execution engine. We introduce Transparent Model Transformation, a paradigm enabling modelers to specify transformations using a familiar tool: their model editor. We also present VMTL, th...... model transformation tool sharing the model editor’s benefits, transparently....

  20. Building an explicit de Sitter

    International Nuclear Information System (INIS)

    Louis, Jan; Hamburg Univ.; Rummel, Markus; Valandro, Roberto; Westphal, Alexander

    2012-11-01

    We construct an explicit example of a de Sitter vacuum in type IIB string theory that realizes the proposal of Kaehler uplifting. As the large volume limit in this method depends on the rank of the largest condensing gauge group we carry out a scan of gauge group ranks over the Kreuzer-Skarke set of toric Calabi-Yau threefolds. We find large numbers of models with the largest gauge group factor easily exceeding a rank of one hundred. We construct a global model with Kaehler uplifting on a two-parameter model on CP^4_{11169}, by an explicit analysis from both the type IIB and F-theory point of view. The explicitness of the construction lies in the realization of a D7 brane configuration, gauge flux and RR and NS flux choices, such that all known consistency conditions are met and the geometric moduli are stabilized in a metastable de Sitter vacuum with spontaneous GUT scale supersymmetry breaking driven by an F-term of the Kaehler moduli.

  1. Building an explicit de Sitter

    Energy Technology Data Exchange (ETDEWEB)

    Louis, Jan [Hamburg Univ. (Germany). 2. Inst. fuer Theoretische Physik; Hamburg Univ. (Germany). Zentrum fuer Mathematische Physik; Rummel, Markus; Valandro, Roberto [Hamburg Univ. (Germany). 2. Inst. fuer Theoretische Physik; Westphal, Alexander [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany). Gruppe Theorie

    2012-11-15

    We construct an explicit example of a de Sitter vacuum in type IIB string theory that realizes the proposal of Kaehler uplifting. As the large volume limit in this method depends on the rank of the largest condensing gauge group we carry out a scan of gauge group ranks over the Kreuzer-Skarke set of toric Calabi-Yau threefolds. We find large numbers of models with the largest gauge group factor easily exceeding a rank of one hundred. We construct a global model with Kaehler uplifting on a two-parameter model on CP^4_{11169}, by an explicit analysis from both the type IIB and F-theory point of view. The explicitness of the construction lies in the realization of a D7 brane configuration, gauge flux and RR and NS flux choices, such that all known consistency conditions are met and the geometric moduli are stabilized in a metastable de Sitter vacuum with spontaneous GUT scale supersymmetry breaking driven by an F-term of the Kaehler moduli.

  2. Explicit treatment for Dirichlet, Neumann and Cauchy boundary conditions in POD-based reduction of groundwater models

    Science.gov (United States)

    Gosses, Moritz; Nowak, Wolfgang; Wöhling, Thomas

    2018-05-01

    In recent years, proper orthogonal decomposition (POD) has become a popular model reduction method in the field of groundwater modeling. It is used to mitigate the problem of long run times that are often associated with physically-based modeling of natural systems, especially for parameter estimation and uncertainty analysis. POD-based techniques reproduce groundwater head fields sufficiently accurate for a variety of applications. However, no study has investigated how POD techniques affect the accuracy of different boundary conditions found in groundwater models. We show that the current treatment of boundary conditions in POD causes inaccuracies for these boundaries in the reduced models. We provide an improved method that splits the POD projection space into a subspace orthogonal to the boundary conditions and a separate subspace that enforces the boundary conditions. To test the method for Dirichlet, Neumann and Cauchy boundary conditions, four simple transient 1D-groundwater models, as well as a more complex 3D model, are set up and reduced both by standard POD and POD with the new extension. We show that, in contrast to standard POD, the new method satisfies both Dirichlet and Neumann boundary conditions. It can also be applied to Cauchy boundaries, where the flux error of standard POD is reduced by its head-independent contribution. The extension essentially shifts the focus of the projection towards the boundary conditions. Therefore, we see a slight trade-off between errors at model boundaries and overall accuracy of the reduced model. The proposed POD extension is recommended where exact treatment of boundary conditions is required.
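    The subspace-splitting idea described above can be illustrated with a minimal sketch: compute a plain snapshot POD via SVD, but remove the Dirichlet boundary rows from the fluctuation space so the reduced coefficients cannot perturb the prescribed heads. This is a simplified illustration of the splitting concept under assumed details (mean-plus-fluctuation decomposition, boundary handling by zeroing rows), not the authors' exact formulation.

```python
import numpy as np

def pod_basis_with_bc(snapshots, bc_nodes, rank):
    """POD basis split so Dirichlet boundary nodes are reproduced exactly.

    snapshots: (n_nodes, n_snapshots) head fields from the full model.
    bc_nodes:  node indices carrying prescribed (Dirichlet) heads.
    The mean field carries the boundary values; the fluctuation modes are
    forced to zero there, so no reduced coefficient can perturb them.
    """
    mean = snapshots.mean(axis=1, keepdims=True)
    fluct = snapshots - mean
    fluct[bc_nodes, :] = 0.0               # boundary rows leave the POD space
    u, s, _ = np.linalg.svd(fluct, full_matrices=False)
    return u[:, :rank], mean
```

    Any reconstruction `mean[:, 0] + basis @ coeffs` then returns the prescribed head at the boundary nodes exactly, mirroring the paper's point that the extension shifts the projection's focus toward the boundary conditions.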

  3. Multidisciplinary Modelling Tools for Power Electronic Circuits

    DEFF Research Database (Denmark)

    Bahman, Amir Sajjad

    in reliability assessment of power modules, a three-dimensional lumped thermal network is proposed to be used for fast, accurate and detailed temperature estimation of power module in dynamic operation and different boundary conditions. Since an important issue in the reliability of power electronics...... environment to be used for optimization of cooling system layout with respect to thermal resistance and pressure drop reductions. Finally extraction of electrical parasitics in the multi-chip power modules will be investigated. As the switching frequency of power devices increases, the size of passive...... components are reduced considerably that leads to increase of power density and cost reduction. However, electrical parasitics become more challenging with increasing the switching frequency and paralleled chips in the integrated and denser packages. Therefore, electrical parasitic models are analyzed based...

  4. Modeling Tools for Drilling, Reservoir Navigation, and Formation Evaluation

    Directory of Open Access Journals (Sweden)

    Sushant Dutta

    2012-06-01

    Full Text Available The oil and gas industry routinely uses borehole tools for measuring or logging rock and fluid properties of geologic formations to locate hydrocarbons and maximize their production. Pore fluids in formations of interest are usually hydrocarbons or water. Resistivity logging is based on the fact that oil and gas have a substantially higher resistivity than water. The first resistivity log was acquired in 1927, and resistivity logging is still the foremost measurement used for drilling and evaluation. However, the acquisition and interpretation of resistivity logging data has grown in complexity over the years. Resistivity logging tools operate in a wide range of frequencies (from DC to GHz) and encounter extremely high (several orders of magnitude) conductivity contrast between the metal mandrel of the tool and the geologic formation. Typical challenges include arbitrary angles of tool inclination, full tensor electric and magnetic field measurements, and interpretation of complicated anisotropic formation properties. These challenges combine to form some of the most intractable computational electromagnetic problems in the world. Reliable, fast, and convenient numerical modeling of logging tool responses is critical for tool design, sensor optimization, virtual prototyping, and log data inversion. This spectrum of applications necessitates both depth and breadth of modeling software—from blazing fast one-dimensional (1-D) modeling codes to advanced three-dimensional (3-D) modeling software, and from in-house developed codes to commercial modeling packages. In this paper, with the help of several examples, we demonstrate our approach for using different modeling software to address different drilling and evaluation applications. In one example, fast 1-D modeling provides proactive geosteering information from a deep-reading azimuthal propagation resistivity measurement. In the second example, a 3-D model with multiple vertical resistive fractures

  5. Exploring spatial change and gravity center movement for ecosystem services value using a spatially explicit ecosystem services value index and gravity model.

    Science.gov (United States)

    He, Yingbin; Chen, Youqi; Tang, Huajun; Yao, Yanmin; Yang, Peng; Chen, Zhongxin

    2011-04-01

    Spatially explicit ecosystem services valuation and change is a newly developing area of research in the field of ecology. Using the Beijing region as a study area, the authors have developed a spatially explicit ecosystem services value index and implemented this to quantify and spatially differentiate ecosystem services value at 1-km grid resolution. A gravity model was developed to trace spatial change in the total ecosystem services value of the Beijing study area from a holistic point of view. Study results show that the total value of ecosystem services for the study area decreased by 19.75% during the period 1996-2006 (3,226.2739 US$ × 10^6 in 1996, 2,589.0321 US$ × 10^6 in 2006). However, 27.63% of the total area of the Beijing study area increased in ecosystem services value. Spatial differences in ecosystem services values for both 1996 and 2006 are very clear. The center of gravity of total ecosystem services value for the study area moved 32.28 km northwestward over the 10 years due to intensive human intervention taking place in southeast Beijing. The authors suggest that policy-makers should pay greater attention to ecological protection under conditions of rapid socio-economic development and increase the area of green belt in the southeastern part of Beijing.
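    The "center of gravity" movement reported above is, in essence, a value-weighted centroid computed on the raster of grid-cell values. A minimal sketch of that computation (the coordinate layout and function names are illustrative, not the authors' gravity-model implementation):

```python
import numpy as np

def value_centroid(values, xs, ys):
    """Value-weighted center of gravity of a raster of ecosystem-service
    values; xs, ys hold the cell-center coordinates (e.g. on a 1-km grid)."""
    w = values / values.sum()
    return float((w * xs).sum()), float((w * ys).sum())

def centroid_shift(values_t1, values_t2, xs, ys):
    """Distance the center of gravity moved between two dates."""
    x1, y1 = value_centroid(values_t1, xs, ys)
    x2, y2 = value_centroid(values_t2, xs, ys)
    return float(np.hypot(x2 - x1, y2 - y1))
```

    Applied to the 1996 and 2006 value rasters, `centroid_shift` would yield a displacement like the 32.28 km northwestward movement the study reports.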

  6. Model based methods and tools for process systems engineering

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    need to be integrated with work-flows and data-flows for specific product-process synthesis-design problems within a computer-aided framework. The framework therefore should be able to manage knowledge-data, models and the associated methods and tools needed by specific synthesis-design work...... of model based methods and tools within a computer aided framework for product-process synthesis-design will be highlighted.......Process systems engineering (PSE) provides means to solve a wide range of problems in a systematic and efficient manner. This presentation will give a perspective on model based methods and tools needed to solve a wide range of problems in product-process synthesis-design. These methods and tools...

  7. Pre-Processing and Modeling Tools for Bigdata

    Directory of Open Access Journals (Sweden)

    Hashem Hadi

    2016-09-01

    Full Text Available Modeling tools and operators help the user/developer identify the relevant processing field at the top of the sequence and send into the computing module only the data related to the requested result. The remaining data are not relevant and would only slow down the processing. The biggest challenge nowadays is to obtain high-quality processing results with reduced computing time and costs. To do so, we must review the processing sequence by adding several modeling tools. The existing processing models do not take this aspect into consideration and focus on raw calculation performance, which increases computing time and costs. In this paper we provide a study of the main modeling tools for BigData and a new model based on pre-processing.

  8. A review of a method for dynamic load distribution, dynamical modeling, and explicit internal force control when two manipulators mutually lift and transport a rigid body object

    International Nuclear Information System (INIS)

    Unseren, M.A.

    1997-01-01

    The paper reviews a method for modeling and controlling two serial link manipulators which mutually lift and transport a rigid body object in a three dimensional workspace. A new vector variable is introduced which parameterizes the internal contact force controlled degrees of freedom. A technique for dynamically distributing the payload between the manipulators is suggested which yields a family of solutions for the contact forces and torques the manipulators impart to the object. A set of rigid body kinematic constraints which restrict the values of the joint velocities of both manipulators is derived. A rigid body dynamical model for the closed chain system is first developed in the joint space. The model is obtained by generalizing the previous methods for deriving the model. The joint velocity and acceleration variables in the model are expressed in terms of independent pseudovariables. The pseudospace model is transformed to obtain reduced order equations of motion and a separate set of equations governing the internal components of the contact forces and torques. A theoretic control architecture is suggested which explicitly decouples the two sets of equations comprising the model. The controller enables the designer to develop independent, non-interacting control laws for the position control and internal force control of the system

  9. A review of a method for dynamic load distribution, dynamical modeling, and explicit internal force control when two manipulators mutually lift and transport a rigid body object

    Energy Technology Data Exchange (ETDEWEB)

    Unseren, M.A.

    1997-04-20

    The paper reviews a method for modeling and controlling two serial link manipulators which mutually lift and transport a rigid body object in a three dimensional workspace. A new vector variable is introduced which parameterizes the internal contact force controlled degrees of freedom. A technique for dynamically distributing the payload between the manipulators is suggested which yields a family of solutions for the contact forces and torques the manipulators impart to the object. A set of rigid body kinematic constraints which restrict the values of the joint velocities of both manipulators is derived. A rigid body dynamical model for the closed chain system is first developed in the joint space. The model is obtained by generalizing the previous methods for deriving the model. The joint velocity and acceleration variables in the model are expressed in terms of independent pseudovariables. The pseudospace model is transformed to obtain reduced order equations of motion and a separate set of equations governing the internal components of the contact forces and torques. A theoretic control architecture is suggested which explicitly decouples the two sets of equations comprising the model. The controller enables the designer to develop independent, non-interacting control laws for the position control and internal force control of the system.

  10. The role of spatially explicit models in land-use change research: a case study for cropping patterns in China

    NARCIS (Netherlands)

    Verburg, P.H.; Veldkamp, A.

    2001-01-01

    Single research methodologies do not suffice for a complete analysis of land-use change. Instead, a sequence of methodologies is needed that link up and integrate disciplinary components over a range of spatial and temporal scales. In this paper, a modelling methodology is presented aiming at the

  11. Spatially explicit integrated modeling and economic valuation of climate driven land use change and its indirect effects.

    Science.gov (United States)

    Bateman, Ian; Agarwala, Matthew; Binner, Amy; Coombes, Emma; Day, Brett; Ferrini, Silvia; Fezzi, Carlo; Hutchins, Michael; Lovett, Andrew; Posen, Paulette

    2016-10-01

    We present an integrated model of the direct consequences of climate change on land use, and the indirect effects of induced land use change upon the natural environment. The model predicts climate-driven shifts in the profitability of alternative uses of agricultural land. Both the direct impact of climate change and the induced shift in land use patterns will cause secondary effects on the water environment, for which agriculture is the major source of diffuse pollution. We model the impact of changes in such pollution on riverine ecosystems showing that these will be spatially heterogeneous. Moreover, we consider further knock-on effects upon the recreational benefits derived from water environments, which we assess using revealed preference methods. This analysis permits a multi-layered examination of the economic consequences of climate change, assessing the sequence of impacts from climate change through farm gross margins, land use, water quality and recreation, both at the individual and catchment scale. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. An advanced modelling tool for simulating complex river systems.

    Science.gov (United States)

    Trancoso, Ana Rosa; Braunschweig, Frank; Chambel Leitão, Pedro; Obermann, Matthias; Neves, Ramiro

    2009-04-01

    The present paper describes MOHID River Network (MRN), a 1D hydrodynamic model for river networks that is part of the MOHID Water Modelling System, a modular system for the simulation of water bodies (hydrodynamics and water constituents). MRN is capable of simulating water quality in the aquatic and benthic phases, and its development was especially focused on reproducing processes occurring in temporary river networks (flush events, pool formation, and transmission losses). Further, unlike many other models, it allows the quantification of settled materials at the channel bed, also over periods when the river falls dry. These features are very important to secure mass conservation in the highly varying flows of temporary rivers. The water quality models existing in MOHID are based on well-known ecological models, such as WASP and ERSEM, the latter allowing explicit parameterization of the C, N, P, Si, and O cycles. MRN can be coupled to the basin model MOHID Land, which computes runoff and porous-media transport, allowing for the dynamic exchange of water and materials between the river and its surroundings, or it can be used as a standalone model, receiving discharges at any specified nodes (ASCII files of time series with arbitrary time step). These features account for spatial gradients in precipitation, which can be significant in Mediterranean-like basins. An interface has already been developed for the SWAT basin model.

  13. Designer Modeling for Personalized Game Content Creation Tools

    DEFF Research Database (Denmark)

    Liapis, Antonios; Yannakakis, Georgios N.; Togelius, Julian

    2013-01-01

    With the growing use of automated content creation and computer-aided design tools in game development, there is potential for enhancing the design process through personalized interactions between the software and the game developer. This paper proposes designer modeling for capturing the designer's preferences, goals and processes from their interaction with a computer-aided design tool, and suggests methods and domains within game development where such a model can be applied. We describe how designer modeling could be integrated with current work on automated and mixed-initiative content creation, and envision future directions which focus on personalizing the processes to a designer's particular wishes.

  14. Reconstructing the 2003/2004 H3N2 influenza epidemic in Switzerland with a spatially explicit, individual-based model

    Science.gov (United States)

    2011-01-01

    Background Simulation models of influenza spread play an important role in pandemic preparedness. However, as the world has not faced a severe pandemic for decades, except the rather mild H1N1 one in 2009, pandemic influenza models are inherently hypothetical and validation is, thus, difficult. We aim at reconstructing a recent seasonal influenza epidemic that occurred in Switzerland and deem this to be a promising validation strategy for models of influenza spread. Methods We present a spatially explicit, individual-based simulation model of influenza spread. The simulation model is based upon (i) simulated human travel data, (ii) data on human contact patterns and (iii) empirical knowledge on the epidemiology of influenza. For model validation we compare the simulation outcomes with empirical knowledge regarding (i) the shape of the epidemic curve, overall infection rate and reproduction number, (ii) age-dependent infection rates and time of infection, (iii) spatial patterns. Results The simulation model is capable of reproducing the shape of the 2003/2004 H3N2 epidemic curve of Switzerland and generates an overall infection rate (14.9 percent) and reproduction numbers (between 1.2 and 1.3) which are realistic for seasonal influenza epidemics. Age and spatial patterns observed in empirical data are also reflected by the model: the highest infection rates are in children between 5 and 14, and the disease spreads along the main transport axes from west to east. Conclusions We show that finding evidence for the validity of simulation models of influenza spread by challenging them with seasonal influenza outbreak data is possible and promising. Simulation models for pandemic spread gain more credibility if they are able to reproduce seasonal influenza outbreaks. For more robust modelling of seasonal influenza, serological data complementing sentinel information would be beneficial. PMID:21554680

  15. Fish habitat simulation models and integrated assessment tools

    International Nuclear Information System (INIS)

    Harby, A.; Alfredsen, K.

    1999-01-01

    Because of human development, water use is increasing in importance, and this worldwide trend is leading to a growing number of user conflicts, with a strong need for assessment tools to measure the impacts both on the ecosystem and on the different users and user groups. The quantitative tools must allow a comparison of alternatives, different user groups, etc., and the tools must be integrated, since impact assessment involves different disciplines. Fish species, especially young ones, are indicators of the environmental state of a riverine system, and monitoring them is a way to follow environmental changes. The direct and indirect impacts on the ecosystem itself are measured; impacts on user groups are not included. The paper concentrates on fish habitat simulation models, with methods and examples from Norway, and includes some ideas on integrated modelling tools for impact assessment studies. One-dimensional hydraulic models are rapidly calibrated and do not require any expert knowledge in hydraulics. Two- and three-dimensional models require somewhat more skilled users, especially if the topography is very heterogeneous. The advantages of using two- and three-dimensional models include: they do not need any calibration, just validation; they are predictive; and they can be more cost effective than traditional habitat hydraulic models when combined with modern data acquisition systems and tailored in a multi-disciplinary study. The choice of model should be based on the available data and possible data acquisition, the available manpower, computer, and software resources, and the needed output and accuracy. 58 refs

  16. Simulation Tools for Electrical Machines Modelling: Teaching and ...

    African Journals Online (AJOL)

    Simulation tools are used both for research and teaching to allow a good comprehension of the systems under study before practical implementation. This paper illustrates the way MATLAB is used to model non-linearities in a synchronous machine. The machine is modeled in the rotor reference frame with currents as state ...

  17. An interactive modelling tool for understanding hydrological processes in lowland catchments

    Science.gov (United States)

    Brauer, Claudia; Torfs, Paul; Uijlenhoet, Remko

    2016-04-01

    Recently, we developed the Wageningen Lowland Runoff Simulator (WALRUS), a rainfall-runoff model for catchments with shallow groundwater (Brauer et al., 2014ab). WALRUS explicitly simulates processes which are important in lowland catchments, such as feedbacks between the saturated and unsaturated zones and between groundwater and surface water. WALRUS has a simple model structure and few parameters with physical connotations. Some default functions (which can be changed easily for research purposes) are implemented to facilitate application by practitioners and students. The effect of water management on hydrological variables can be simulated explicitly. The model description and applications are published in open access journals (Brauer et al., 2014ab). The open source code (provided as an R package) and manual can be downloaded freely (www.github.com/ClaudiaBrauer/WALRUS). We organised a short course for Dutch water managers and consultants to become acquainted with WALRUS. We are now adapting this course as a stand-alone tutorial suitable for a varied, international audience. In addition, simple models can help teachers explain hydrological principles effectively. We used WALRUS to generate examples for simple interactive tools, which we will present at the EGU General Assembly. C.C. Brauer, A.J. Teuling, P.J.J.F. Torfs, R. Uijlenhoet (2014a): The Wageningen Lowland Runoff Simulator (WALRUS): a lumped rainfall-runoff model for catchments with shallow groundwater, Geosci. Model Dev., 7, 2313-2332. C.C. Brauer, P.J.J.F. Torfs, A.J. Teuling, R. Uijlenhoet (2014b): The Wageningen Lowland Runoff Simulator (WALRUS): application to the Hupsel Brook catchment and Cabauw polder, Hydrol. Earth Syst. Sci., 18, 4007-4028.
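    To illustrate the lumped-reservoir idea behind models like WALRUS, the following toy couples a single groundwater store to a surface-water store through a linear exchange term. It is an invented minimal example (structure, parameter values, and units are ours for illustration), not the WALRUS code:

```python
# Toy lumped rainfall-runoff sketch: one groundwater store and one
# surface-water store with a linear exchange term between them.
# Illustrative only -- WALRUS itself couples soil, groundwater, and
# surface-water reservoirs (see Brauer et al., 2014a).

def simulate(rain, k_gw=0.2, k_exch=0.1, storage0=10.0):
    """Return a discharge series (mm/step) for a rainfall series (mm/step)."""
    storage, surface = storage0, 0.0   # groundwater and surface-water stores
    discharge = []
    for P in rain:
        exchange = k_exch * (storage - surface)  # groundwater <-> surface water
        storage += P - exchange
        surface += exchange
        Q = k_gw * surface                       # linear surface-water outflow
        surface -= Q
        discharge.append(Q)
    return discharge

# 20 wet steps followed by 30 dry steps: discharge rises, then recedes.
Q = simulate([5.0] * 20 + [0.0] * 30)
print(round(max(Q), 2), round(Q[-1], 2))
```

A real lowland model such as WALRUS adds a vadose zone, groundwater-surface water feedback depending on water depths, and seepage terms on top of this kind of simple water balance.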

  18. Tool Efficiency Analysis model research in SEMI industry

    Directory of Open Access Journals (Sweden)

    Lei Ma

    2018-01-01

    Full Text Available One of the key goals in the SEMI industry is to improve equipment throughput and ensure that equipment production efficiency is maximized. This paper, based on SEMI standards for semiconductor equipment control, defines the transition rules between different tool states and presents a TEA system model that analyzes tool performance automatically based on a finite state machine. The system was applied to fab tools and its effectiveness was verified successfully; it yields the parameter values used to measure equipment performance, together with suggestions for improvement.
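    The finite-state-machine idea underlying such a tool-efficiency analysis can be sketched as follows; the state names and transition rules here are hypothetical (loosely in the spirit of SEMI E10 equipment states), not the paper's TEA model:

```python
# Minimal finite-state-machine sketch of equipment-state tracking.
# State names and transition rules are invented for illustration.

PRODUCTIVE, STANDBY, ENGINEERING, SCHEDULED_DOWN, UNSCHEDULED_DOWN = (
    "productive", "standby", "engineering", "scheduled_down", "unscheduled_down")

# Allowed transitions between states (hypothetical rules).
TRANSITIONS = {
    STANDBY: {PRODUCTIVE, ENGINEERING, SCHEDULED_DOWN, UNSCHEDULED_DOWN},
    PRODUCTIVE: {STANDBY, UNSCHEDULED_DOWN},
    ENGINEERING: {STANDBY},
    SCHEDULED_DOWN: {STANDBY},
    UNSCHEDULED_DOWN: {STANDBY},
}

class ToolFSM:
    def __init__(self):
        self.state = STANDBY
        self.time_in_state = {}          # accumulated seconds per state

    def advance(self, new_state, elapsed):
        """Record `elapsed` seconds in the current state, then transition."""
        self.time_in_state[self.state] = (
            self.time_in_state.get(self.state, 0.0) + elapsed)
        if new_state not in TRANSITIONS[self.state]:
            raise ValueError(f"illegal transition {self.state} -> {new_state}")
        self.state = new_state

    def utilization(self):
        """Fraction of tracked time spent in the productive state."""
        total = sum(self.time_in_state.values())
        return self.time_in_state.get(PRODUCTIVE, 0.0) / total if total else 0.0

fsm = ToolFSM()
fsm.advance(PRODUCTIVE, 600)   # 10 min standby, then start production
fsm.advance(STANDBY, 3000)     # 50 min producing, then back to standby
print(round(fsm.utilization(), 2))  # -> 0.83
```

Accumulating time per state like this is what lets a tool-efficiency system derive utilization-style metrics directly from logged state transitions.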

  19. Multi-scale modelling and simulation of the thermo-hydro-mechanical behavior of concrete with explicit representation of cracking

    International Nuclear Information System (INIS)

    Tognevi, Amen

    2012-01-01

    The concrete structures of nuclear power plants can be subjected to moderate thermo-hydric loadings, characterized by temperatures of the order of a hundred degrees, in service conditions as well as in accidental ones. These loadings can be at the origin of important disorders, in particular cracking, which accelerates hydric transfers in the structure. In the framework of the study of the durability of these structures, a coupled thermo-hydro-mechanical model denoted THMs has been developed at the Laboratoire d'Etude du Comportement des Betons et des Argiles (LECBA) of CEA Saclay in order to simulate the behavior of concrete submitted to such loadings. In this work, we focus on improving, within the THMs model, on the one hand the assessment of the mechanical and hydro-mechanical parameters of the unsaturated micro-cracked material and, on the other hand, the description of cracking in terms of opening and propagation. The first part is devoted to the development of a model based on a multi-scale description of cement-based materials, starting from the scale of the main hydrated products (portlandite, ettringite, C-S-H, etc.) up to the macroscopic scale of the cracked material. The investigated parameters are obtained at each scale of the description by applying analytical homogenization techniques. The second part concerns a fine numerical description of cracking. To this end, we choose to use combined finite element and discrete element methods. This procedure is presented and illustrated through a series of mechanical tests in order to show the feasibility of the method and to proceed to its validation. Finally, we apply the procedure to a heated wall, and the proposed method for estimating the permeability shows the interest of taking into account an anisotropic permeability tensor when dealing with mass transfers in cracked concrete structures. (author) [fr

  20. Model tool to describe chemical structures in XML format utilizing structural fragments and chemical ontology.

    Science.gov (United States)

    Sankar, Punnaivanam; Krief, Alain; Aghila, Gnanasekaran

    2010-05-24

    We have developed a model structure-editing tool, ChemEd, programmed in Java, which allows drawing chemical structures on a graphical user interface (GUI) by selecting appropriate structural fragments defined in a fragment library. The terms representing the structural fragments are organized in a fragment ontology to provide conceptual support. ChemEd describes the chemical structure in an XML document (ChemFul) with rich semantics, explicitly encoding the details of the chemical bonding, the hybridization status, and the electron environment around each atom. The document can be further processed through suitable algorithms, and with the support of external chemical ontologies, to generate understandable reports about the functional groups present in the structure and their specific environment.
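    The kind of explicit per-atom encoding described above can be illustrated with a tiny XML fragment; the element and attribute names below are invented stand-ins, not the actual ChemFul schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical mini-document in the spirit of ChemFul: each atom carries
# explicit hybridization/electron attributes, each bond its order.
# All element and attribute names are invented for illustration.
mol = ET.Element("molecule", name="carbonyl-fragment")
atoms = ET.SubElement(mol, "atoms")
ET.SubElement(atoms, "atom", id="a1", symbol="C",
              hybridization="sp2", lonePairs="0")
ET.SubElement(atoms, "atom", id="a2", symbol="O",
              hybridization="sp2", lonePairs="2")
bonds = ET.SubElement(mol, "bonds")
ET.SubElement(bonds, "bond", begin="a1", end="a2", order="2")

xml_text = ET.tostring(mol, encoding="unicode")
print(xml_text)
```

Encoding hybridization and electron environment as explicit attributes (rather than leaving them implicit in connectivity) is what lets downstream algorithms reason about functional groups without re-deriving chemistry.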

  1. The Cryosphere Model Comparison Tool (CmCt): Ice Sheet Model Validation and Comparison Tool for Greenland and Antarctica

    Science.gov (United States)

    Simon, E.; Nowicki, S.; Neumann, T.; Tyahla, L.; Saba, J. L.; Guerber, J. R.; Bonin, J. A.; DiMarzio, J. P.

    2017-12-01

    The Cryosphere model Comparison tool (CmCt) is a web-based ice sheet model validation tool being developed by NASA to facilitate direct comparison between observational data and various ice sheet models. The CmCt allows the user to take advantage of several decades' worth of observations from Greenland and Antarctica. Currently, the CmCt can be used to compare user-provided ice sheet models with remotely sensed satellite data from ICESat (Ice, Cloud, and land Elevation Satellite) laser altimetry, the GRACE (Gravity Recovery and Climate Experiment) satellite, and radar altimetry (ERS-1, ERS-2, and Envisat). One or more models can be uploaded through the CmCt website and compared with observational data, or with each other or with other models. The CmCt calculates statistics on the differences between the model and observations, along with other quantitative and qualitative metrics, which can be used to evaluate the different model simulations against the observations. The qualitative metrics consist of a range of visual outputs, and the quantitative metrics consist of several whole-ice-sheet scalar values that can be used to assign an overall score to a particular simulation. The comparison results from CmCt are useful in quantifying improvements within a specific model (or within a class of models) resulting from differences in model dynamics (e.g., shallow vs. higher-order dynamics approximations), model physics (e.g., representations of ice sheet rheological or basal processes), or model resolution (mesh resolution and/or changes in the spatial resolution of input datasets). The framework and metrics could also be used as a model-to-model intercomparison tool, simply by swapping in outputs from another model in place of the observational datasets. Future versions of the tool will include comparisons with other datasets of interest to the modeling community, such as ice velocity, ice thickness, and surface mass balance.
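    The whole-ice-sheet scalar metrics mentioned above can be sketched as simple statistics over gridded model-minus-observation differences; the metric choices (bias, RMSE) and data layout below are illustrative assumptions, not CmCt's actual definitions:

```python
import math

def comparison_stats(model, obs):
    """Scalar metrics from gridded model-minus-observation differences,
    in the spirit of CmCt's quantitative outputs. The specific metrics
    here (bias, RMSE over valid cells) are illustrative choices."""
    diffs = [m - o for m, o in zip(model, obs) if o is not None]
    n = len(diffs)
    bias = sum(diffs) / n
    rmse = math.sqrt(sum(d * d for d in diffs) / n)
    return {"n": n, "bias": bias, "rmse": rmse}

# Toy surface-elevation fields (m); None marks cells with no observation.
model_h = [101.0, 99.5, 100.2, 98.8]
obs_h   = [100.0, None, 100.0, 99.0]
print(comparison_stats(model_h, obs_h))
```

Collapsing a difference grid to a few scalars like these is what allows an overall score to be assigned to a simulation, and the same machinery works unchanged for model-to-model intercomparison.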

  2. Proc. of the Workshop on Agent Simulation : Applications, Models, and Tools, Oct. 15-16, 1999

    International Nuclear Information System (INIS)

    Macal, C. M.; Sallach, D.

    2000-01-01

    The many motivations for employing agent-based computation in the social sciences are reviewed. It is argued that there are three distinct uses of agent modeling techniques. One such use, the simplest, is conceptually quite close to traditional simulation in operations research. It arises when equations can be formulated that completely describe a social process, and these equations are explicitly soluble, either analytically or numerically. In the former case, the agent model is merely a tool for presenting results, while in the latter it is a novel kind of Monte Carlo analysis. A second, more commonplace usage of computational agent models arises when mathematical models can be written down but not completely solved. In this case the agent-based model can shed significant light on the solution structure, illustrate dynamical properties of the model, serve to test the dependence of results on parameters and assumptions, and be a source of counter-examples. Finally, there are important classes of problems for which writing down equations is not a useful activity. In such circumstances, resort to agent-based computational models may be the only way available to explore such processes systematically, and this constitutes a third distinct usage of such models.

  3. Business intelligence tools for radiology: creating a prototype model using open-source tools.

    Science.gov (United States)

    Prevedello, Luciano M; Andriole, Katherine P; Hanson, Richard; Kelly, Pauline; Khorasani, Ramin

    2010-04-01

    Digital radiology departments could benefit from the ability to integrate and visualize data (e.g., information reflecting complex workflow states) from all of their imaging and information management systems in one composite presentation view. Leveraging data warehousing tools developed in the business world may be one way to achieve this capability. Collectively, the management of the information available in such a data repository is known as Business Intelligence (BI). This paper describes the concepts used in Business Intelligence, their importance to modern radiology, and the steps used in the creation of a prototype model of a data warehouse for BI using open-source tools.

  4. Systematic Methods and Tools for Computer Aided Modelling

    DEFF Research Database (Denmark)

    Fedorova, Marina

    and processes can be faster, cheaper and very efficient. The developed modelling framework involves five main elements: 1) a modelling tool, that includes algorithms for model generation; 2) a template library, which provides building blocks for the templates (generic models previously developed); 3) computer......-format and COM-objects, are incorporated to allow the export and import of mathematical models; 5) a user interface that provides the work-flow and data-flow to guide the user through the different modelling tasks....

  5. Lightweight approach to model traceability in a CASE tool

    Science.gov (United States)

    Vileiniskis, Tomas; Skersys, Tomas; Pavalkis, Saulius; Butleris, Rimantas; Butkiene, Rita

    2017-07-01

    The term "model-driven" is not at all a new buzzword within the ranks of the system development community. Nevertheless, the ever increasing complexity of model-driven approaches keeps fueling all kinds of discussions around this paradigm and pushes researchers to develop new and more effective approaches to system development. With this increasing complexity, model traceability and model management as a whole become indispensable activities of the model-driven system development process. The main goal of this paper is to present the conceptual design and implementation of a practical lightweight approach to model traceability in a CASE tool.

  6. OISI dynamic end-to-end modeling tool

    Science.gov (United States)

    Kersten, Michael; Weidler, Alexander; Wilhelm, Rainer; Johann, Ulrich A.; Szerdahelyi, Laszlo

    2000-07-01

    The OISI Dynamic end-to-end modeling tool is tailored to end-to-end modeling and dynamic simulation of Earth- and space-based actively controlled optical instruments, such as optical stellar interferometers. `End-to-end modeling' here denotes the feature that the overall model comprises not only optical sub-models but also structural, sensor, actuator, controller, and disturbance sub-models influencing the optical transmission, so that system-level instrument performance under disturbances and active optics can be simulated. The tool has been developed to support performance analysis and prediction as well as control loop design and fine-tuning for OISI, Germany's preparatory program for optical/infrared spaceborne interferometry initiated in 1994 by Dornier Satellitensysteme GmbH in Friedrichshafen.

  7. Hybrid Neural Network Approach Based Tool for the Modelling of Photovoltaic Panels

    Directory of Open Access Journals (Sweden)

    Antonino Laudani

    2015-01-01

    Full Text Available A hybrid neural-network-based tool for identifying the photovoltaic one-diode model is presented. The generalization capabilities of neural networks are used together with the robustness of the reduced form of the one-diode model. Indeed, from studies performed by the authors and from works in the literature, it was found that direct computation of the five parameters via a multiple-input, multiple-output neural network is a very difficult task. The reduced form consists of a series of explicit formulae supporting the neural network, which, in our case, is aimed at predicting just two of the five parameters identifying the model; the other three parameters are computed by the reduced form. The present hybrid approach is efficient from the computational-cost point of view and accurate in the estimation of the five parameters. It constitutes a complete and extremely easy tool, suitable for implementation in a microcontroller-based architecture. Validations are made on about 10000 PV panels belonging to the California Energy Commission database.
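    For context, the one-diode model referred to above is the standard implicit five-parameter equation I = Iph - I0*(exp((V + I*Rs)/(n*Ns*Vt)) - 1) - (V + I*Rs)/Rsh. A minimal sketch of solving it numerically is given below; the panel parameters are invented, and this is unrelated to the paper's neural network or its specific reduced-form formulae:

```python
import math

def one_diode_current(V, Iph, I0, Rs, Rsh, n, Vt=0.025693, cells=36):
    """Solve the implicit one-diode equation
        I = Iph - I0*(exp((V + I*Rs)/(n*Ns*Vt)) - 1) - (V + I*Rs)/Rsh
    for the terminal current I at voltage V, by bisection."""
    nVt = n * cells * Vt
    def f(I):  # residual; decreasing in I for physical parameters
        return (Iph - I0 * (math.exp((V + I * Rs) / nVt) - 1.0)
                - (V + I * Rs) / Rsh - I)
    lo, hi = -Iph, 2.0 * Iph   # bracket assumed to contain the root
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if f(mid) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical parameters of a small 36-cell panel (illustration only).
I_sc = one_diode_current(0.0, Iph=5.0, I0=1e-9, Rs=0.2, Rsh=300.0, n=1.3)
print(round(I_sc, 3))  # short-circuit current, slightly below Iph
```

The difficulty the paper addresses is the inverse problem: recovering (Iph, I0, Rs, Rsh, n) from measured data, which is why a hybrid of explicit reduced-form formulae and a neural network is attractive.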

  8. Modeling the dielectric logging tool at high frequency

    International Nuclear Information System (INIS)

    Chew, W.C.

    1987-01-01

    The high frequency dielectric logging tool has been used widely in electromagnetic well logging because, by measuring the dielectric constant at high frequencies (1 GHz), the water saturation of rocks can be determined without measuring the water salinity in the rocks. As such, it can be used to delineate fresh-water-bearing zones, as the dielectric constant of fresh water is much higher than that of oil while the two may have the same resistivity. The authors present a computer model, based on electromagnetic field analysis, of the response of such a measurement tool in a well logging environment. As the measurement is performed at high frequency, usually with a small separation between the transmitter and receivers, some small geological features can be measured by such a tool. The authors use the computer model to study the behavior of the tool across geological bed boundaries and across thin geological beds. Such a study can be very useful in understanding the limitations on the resolution of the tool. Furthermore, the standoff effect and the depth of investigation of the tool can be studied, which delineates the range of usefulness of the measurement.

  9. Exploring a multi-scale method for molecular simulation in continuum solvent model: Explicit simulation of continuum solvent as an incompressible fluid.

    Science.gov (United States)

    Xiao, Li; Luo, Ray

    2017-12-07

    We explored a multi-scale algorithm for the Poisson-Boltzmann continuum solvent model for more robust simulations of biomolecules. In this method, the continuum solvent/solute interface is explicitly simulated with a numerical fluid dynamics procedure, which is tightly coupled to the solute molecular dynamics simulation. There are multiple benefits to adopting such a strategy, as presented below. At this stage of the development, only nonelectrostatic interactions, i.e., van der Waals and hydrophobic interactions, are included in the algorithm, to assess the quality of the solvent-solute interface generated by the new method. Nevertheless, numerical challenges exist in accurately interpolating the highly nonlinear van der Waals term when solving the finite-difference fluid dynamics equations. We were able to bypass the challenge rigorously by merging the van der Waals potential and pressure together when solving the fluid dynamics equations and by considering its contribution in the free-boundary condition analytically. The multi-scale simulation method was first validated by reproducing the solute-solvent interface of a single atom with an analytical solution. Next, we performed a relaxation simulation of a restrained symmetrical monomer and observed a symmetrical solvent interface at equilibrium, with detailed surface features resembling those found on the solvent excluded surface. Four typical small molecular complexes were then tested, with both volume and force balancing analyses showing that these simple complexes can reach equilibrium within the simulation time window. Finally, we studied the quality of the multi-scale solute-solvent interfaces for the four tested dimer complexes and found that they agree well with the boundaries as sampled in explicit water simulations.

  10. Modeling with data tools and techniques for scientific computing

    CERN Document Server

    Klemens, Ben

    2009-01-01

    Modeling with Data fully explains how to execute computationally intensive analyses on very large data sets, showing readers how to determine the best methods for solving a variety of different problems, how to create and debug statistical models, and how to run an analysis and evaluate the results. Ben Klemens introduces a set of open and unlimited tools, and uses them to demonstrate data management, analysis, and simulation techniques essential for dealing with large data sets and computationally intensive procedures. He then demonstrates how to easily apply these tools to the many threads of statistical technique, including classical, Bayesian, maximum likelihood, and Monte Carlo methods

  11. Open source Modeling and optimization tools for Planning

    Energy Technology Data Exchange (ETDEWEB)

    Peles, S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)]

    2017-02-10

    The existing tools and software used for planning and analysis in California are either expensive, difficult to use, or not generally accessible to a large number of participants. These limitations restrict the availability of participants for larger scale energy and grid studies in the state. The proposed initiative would build upon federal and state investments in open source software, and create and improve open source tools for use in state planning and analysis activities. Computational analysis and simulation frameworks in development at national labs and universities can be brought forward to complement existing tools. An open source platform would provide a path for novel techniques and strategies to be brought into the larger community and reviewed by a broad set of stakeholders.

  12. Analytical Modelling Of Milling For Tool Design And Selection

    International Nuclear Information System (INIS)

    Fontaine, M.; Devillez, A.; Dudzinski, D.

    2007-01-01

    This paper presents an efficient analytical model which allows the simulation of a wide range of milling operations. A geometrical description of common end mills and of their engagement in the workpiece material is proposed. The internal radius of the rounded part of the tool envelope is used to define the type of mill considered. The cutting edge position is described for a constant lead helix and for a constant local helix angle. A thermomechanical approach to oblique cutting is applied to predict the forces acting on the tool, and these results are compared with experimental data obtained from milling tests on a 42CrMo4 steel for three classical types of mills. The influence of some of the tool's geometrical parameters on the predicted cutting forces is presented in order to propose optimisation criteria for the design and selection of cutting tools.

  13. A tool model for predicting atmospheric kinetics with sensitivity analysis

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    A package (a tool model) for predicting atmospheric chemical kinetics with sensitivity analysis is presented. The new direct method of calculating first-order sensitivity coefficients, which applies sparse matrix technology to chemical kinetics, is included in the tool model; it is only necessary to triangularize the matrix related to the Jacobian matrix of the model equation. A Gear-type procedure is used to integrate the model equation and its coupled auxiliary sensitivity-coefficient equations. The FORTRAN subroutines for the model equation, the sensitivity-coefficient equations, and their analytical Jacobian expressions are generated automatically from a chemical mechanism. The kinetic representation of the model equation, its sensitivity-coefficient equations, and their Jacobian matrix is presented. Various FORTRAN subroutines and packages, such as SLODE, the modified MA28, and the Gear package, with which the program runs in conjunction, are recommended. The photo-oxidation of dimethyl disulfide is used for illustration.
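    The direct method couples the model ODE with its sensitivity equation dS/dt = J*S + df/dk, where S = dy/dk and J is the Jacobian. A toy sketch for a single first-order decay reaction is shown below; it uses a fixed-step RK4 integrator instead of the paper's Gear solver and sparse-matrix machinery, and all numbers are illustrative:

```python
import math

# Direct-method sketch: integrate a model ODE together with its coupled
# first-order sensitivity equation dS/dt = J*S + df/dk.
# Toy example: first-order decay y' = -k*y, so J = -k and df/dk = -y.

def rk4_step(f, t, state, h):
    """One classical Runge-Kutta step for a system given as a list."""
    k1 = f(t, state)
    k2 = f(t + h/2, [s + h/2 * d for s, d in zip(state, k1)])
    k3 = f(t + h/2, [s + h/2 * d for s, d in zip(state, k2)])
    k4 = f(t + h,   [s + h * d  for s, d in zip(state, k3)])
    return [s + h/6 * (a + 2*b + 2*c + d)
            for s, a, b, c, d in zip(state, k1, k2, k3, k4)]

k = 0.5  # rate constant

def rhs(t, state):
    y, S = state
    dy = -k * y          # model equation
    dS = -k * S - y      # sensitivity equation: J*S + df/dk
    return [dy, dS]

y, S = 1.0, 0.0
t, h = 0.0, 0.01
while t < 2.0 - 1e-9:
    y, S = rk4_step(rhs, t, [y, S], h)
    t += h

# Analytic check: y = exp(-k*t), S = dy/dk = -t*exp(-k*t)
print(round(S, 4), round(-2.0 * math.exp(-k * 2.0), 4))
```

For a real mechanism the state grows to one sensitivity column per parameter, which is why the paper's sparse-matrix treatment of the Jacobian matters.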

  14. Development Life Cycle and Tools for XML Content Models

    Energy Technology Data Exchange (ETDEWEB)

    Kulvatunyou, Boonserm [ORNL]; Morris, Katherine [National Institute of Standards and Technology (NIST)]; Buhwan, Jeong [POSTECH University, South Korea]; Goyal, Puja [National Institute of Standards and Technology (NIST)]

    2004-11-01

    Many integration projects today rely on shared semantic models based on standards represented using Extensible Markup Language (XML) technologies. Shared semantic models typically evolve and require maintenance. In addition, to promote interoperability and reduce integration costs, the shared semantics should be reused as much as possible. Semantic components must be consistent and valid in terms of agreed-upon standards and guidelines. In this paper, we describe an activity model for the creation, use, and maintenance of a shared semantic model that is coherent and supports efficient enterprise integration. We then use this activity model to frame our research and the development of tools to support those activities. We provide overviews of these tools, primarily in the context of the W3C XML Schema. At present, we focus our work on the W3C XML Schema as the representation of choice, due to its extensive adoption by industry.

  15. Using the IEA ETSAP modelling tools for Denmark

    DEFF Research Database (Denmark)

    Grohnheit, Poul Erik

    signed the agreement and contributed to some early annexes. This project is motivated by an invitation to participate in ETSAP Annex X, "Global Energy Systems and Common Analyses: Climate friendly, Secure and Productive Energy Systems" for the period 2005 to 2007. The main activity is semi-annual workshops focusing on presentations of model analyses and use of the ETSAP tools (the MARKAL/TIMES family of models). The project was also planned to benefit from the EU project "NEEDS - New Energy Externalities Developments for Sustainability". ETSAP is contributing to a part of NEEDS that develops......, Environment and Health (CEEH), starting from January 2007. This report summarises the activities under ETSAP Annex X and related projects, emphasising the development of modelling tools that will be useful for modelling the Danish energy system. It is also a status report for the development of a model

  16. Designing tools for oil exploration using nuclear modeling

    Directory of Open Access Journals (Sweden)

    Mauborgne Marie-Laure

    2017-01-01

    Full Text Available When designing nuclear tools for oil exploration, one of the first steps is typically nuclear modeling for concept evaluation and initial characterization. Having an accurate model, including the availability of accurate cross sections, is essential to reduce or avoid time consuming and costly design iterations. During tool response characterization, modeling is benchmarked with experimental data and then used to complement and to expand the database to make it more detailed and inclusive of more measurement environments which are difficult or impossible to reproduce in the laboratory. We present comparisons of our modeling results obtained using the ENDF/B-VI and ENDF/B-VII cross section data bases, focusing on the response to a few elements found in the tool, borehole and subsurface formation. For neutron-induced inelastic and capture gamma ray spectroscopy, major obstacles may be caused by missing or inaccurate cross sections for essential materials. We show examples of the benchmarking of modeling results against experimental data obtained during tool characterization and discuss observed discrepancies.

  17. Designing tools for oil exploration using nuclear modeling

    Science.gov (United States)

    Mauborgne, Marie-Laure; Allioli, Françoise; Manclossi, Mauro; Nicoletti, Luisa; Stoller, Chris; Evans, Mike

    2017-09-01

    When designing nuclear tools for oil exploration, one of the first steps is typically nuclear modeling for concept evaluation and initial characterization. Having an accurate model, including the availability of accurate cross sections, is essential to reduce or avoid time consuming and costly design iterations. During tool response characterization, modeling is benchmarked with experimental data and then used to complement and to expand the database to make it more detailed and inclusive of more measurement environments which are difficult or impossible to reproduce in the laboratory. We present comparisons of our modeling results obtained using the ENDF/B-VI and ENDF/B-VII cross section data bases, focusing on the response to a few elements found in the tool, borehole and subsurface formation. For neutron-induced inelastic and capture gamma ray spectroscopy, major obstacles may be caused by missing or inaccurate cross sections for essential materials. We show examples of the benchmarking of modeling results against experimental data obtained during tool characterization and discuss observed discrepancies.

  18. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT: A GIS-BASED HYDROLOGIC MODELING TOOL

    Science.gov (United States)

    Planning and assessment in land and water resource management are evolving toward complex, spatially explicit regional assessments. These problems have to be addressed with distributed models that can compute runoff and erosion at different spatial and temporal scales. The extens...

  19. Spatially-explicit modeling of multi-scale drivers of aboveground forest biomass and water yield in watersheds of the Southeastern United States.

    Science.gov (United States)

    Ajaz Ahmed, Mukhtar Ahmed; Abd-Elrahman, Amr; Escobedo, Francisco J; Cropper, Wendell P; Martin, Timothy A; Timilsina, Nilesh

    2017-09-01

    Understanding ecosystem processes and the influence of regional scale drivers can provide useful information for managing forest ecosystems. Examining more local scale drivers of forest biomass and water yield can also provide insights for identifying and better understanding the effects of climate change and management on forests. We used diverse multi-scale datasets, functional models, and Geographically Weighted Regression (GWR) to model ecosystem processes at the watershed scale and to interpret the influence of ecological drivers across the Southeastern United States (SE US). Aboveground forest biomass (AGB) was determined from available geospatial datasets, and water yield was estimated using the Water Supply and Stress Index (WaSSI) model at the watershed level. Our geostatistical model examined the spatial variation in the relationships between ecosystem processes and climate, biophysical, and forest management variables at the watershed level across the SE US. Ecological and management drivers at the watershed level were analyzed locally to identify whether they contribute positively or negatively to aboveground forest biomass and water yield, thus identifying potential synergies and tradeoffs across the SE US region. Although the AGB and water yield drivers varied geographically across the study area, they were generally significantly influenced by climate (rainfall and temperature), land-cover factor 1 (water and barren), land-cover factor 2 (wetland and forest), high organic matter content, rock depth, available water content, stand age, elevation, and LAI. These drivers were positively or negatively associated with biomass or water yield, contributing to ecosystem interactions or tradeoffs/synergies. Our study introduces a spatially-explicit modelling framework for analyzing the effect of ecosystem drivers on forest ecosystem structure, function, and provision of services. This integrated model approach facilitates
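    Geographically Weighted Regression fits a separate, distance-weighted regression at each location, so the estimated coefficients can vary across space. The single-predictor sketch below (Gaussian kernel, synthetic watershed data, invented bandwidth) illustrates only the mechanic, not the study's actual model:

```python
import math

def gwr_slope(locs, x, y, target, bandwidth):
    """Local slope from a geographically weighted simple regression:
    Gaussian-kernel weights by distance to `target`, then weighted
    least squares. A minimal GWR illustration."""
    w = [math.exp(-0.5 * (math.dist(l, target) / bandwidth) ** 2) for l in locs]
    sw = sum(w)
    xb = sum(wi * xi for wi, xi in zip(w, x)) / sw   # weighted means
    yb = sum(wi * yi for wi, yi in zip(w, y)) / sw
    num = sum(wi * (xi - xb) * (yi - yb) for wi, xi, yi in zip(w, x, y))
    den = sum(wi * (xi - xb) ** 2 for wi, xi in zip(w, x))
    return num / den

# Synthetic watersheds: the rainfall -> biomass slope strengthens eastward.
locs = [(float(i), 0.0) for i in range(10)]
rain = [float(i % 5) for i in range(10)]
biomass = [(1.0 + 0.2 * lx) * r for (lx, _), r in zip(locs, rain)]
west = gwr_slope(locs, rain, biomass, target=(0.0, 0.0), bandwidth=2.0)
east = gwr_slope(locs, rain, biomass, target=(9.0, 0.0), bandwidth=2.0)
print(west < east)  # True: the local coefficient varies across space
```

Mapping such local coefficients over the watersheds is what reveals where a driver acts as a synergy and where as a tradeoff.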

  20. Hydrologic Drivers of Soil Organic Carbon Erosion and Burial: Insights from a Spatially-explicit Model of a Degraded Landscape at the Calhoun Critical Zone Observatory

    Science.gov (United States)

    Dialynas, Y. G.; Bras, R. L.; Richter, D. D., Jr.

    2017-12-01

    Soil erosion and burial of organic material may constitute a substantial sink of atmospheric CO2. Attempts to quantify the impacts of soil erosion on soil-atmosphere C exchange are limited by difficulties in accounting for the fate of eroded soil organic carbon (SOC), a key factor in estimating the net effect of erosion on the C cycle. Processes that transport SOC are still inadequately represented in terrestrial carbon (C) cycle models. This study investigates hydrologic controls on SOC redistribution across the landscape, focusing on dynamic feedbacks between watershed hydrology, soil erosional processes, and SOC burial. We use tRIBS-ECO (Triangulated Irregular Network-based Real-time Integrated Basin Simulator-Erosion and Carbon Oxidation), a spatially-explicit model of SOC dynamics coupled with a physically-based hydro-geomorphic model. tRIBS-ECO systematically accounts for the fate of eroded SOC across the watershed: rainsplash erosion and sheet erosion redistribute SOC from upland sites to depositional environments, altering depth-dependent soil biogeochemical properties in diverse soil profiles. Eroded organic material is transferred with sediment and can be partially oxidized upon transport, or preserved from decomposition by burial. The model was applied in the Calhoun Critical Zone Observatory (CZO), a site that is recovering from some of the most serious agricultural erosion in North America. Soil biogeochemical characteristics at multiple soil horizons were used to initialize the model and test performance. Remotely sensed soil moisture data (NASA SMAP) were used for model calibration. Results show significant rates of hydrologically-induced burial of SOC at the Calhoun CZO. We find that organic material at upland eroding soil profiles is largely mobilized by rainsplash erosion. Sheet erosion mainly drives C transport in lower elevation clayey soils. While SOC erosion and deposition rates declined with recent reforestation at the study site, the

  1. BPMNDiffViz : a tool for BPMN models comparison

    NARCIS (Netherlands)

    Ivanov, S.Y.; Kalenkova, A.A.; Aalst, van der W.M.P.; Daniel, F.; Zugal, S.

    2015-01-01

    Automatic comparison of business processes plays an important role in their analysis and optimization. In this paper we present the web-based tool BPMNDiffViz, which finds business process discrepancies and visualizes them. BPMN (Business Process Model and Notation) 2.0 - one of the most commonly

  2. Explicit dissipative structures

    International Nuclear Information System (INIS)

    Roessler, O.E.

    1987-01-01

    Dissipative structures consisting of a few macrovariables arise out of a sea of reversible microvariables. Unexpected residual effects of the massive underlying reversibility, on the macrolevel, cannot therefore be excluded. In the age of molecular-dynamics simulations, explicit dissipative structures like excitable systems (explicit observers) can be generated in a computer from first reversible principles. A class of classical, 1-D Hamiltonian systems of chaotic type is considered which has the asset that the trajectorial behavior in phase space can be understood geometrically. If, as is natural, the number of particle types is much smaller than that of particles, the Gibbs symmetry must be taken into account. The permutation invariance drastically changes the behavior in phase space (quasi-periodization). The explicit observer becomes effectively reversible on a short time scale. In consequence, his ability to measure microscopic motions is suspended in a characteristic fashion. Unlike quantum mechanics whose holistic nature cannot be transcended, the present holistic (internal-interface) effects - mimicking the former to some extent - can be understood fully in principle

  3. HMMEditor: a visual editing tool for profile hidden Markov model

    Directory of Open Access Journals (Sweden)

    Cheng Jianlin

    2008-03-01

    Full Text Available Abstract Background Profile Hidden Markov Model (HMM) is a powerful statistical model to represent a family of DNA, RNA, and protein sequences. Profile HMM has been widely used in bioinformatics research such as sequence alignment, gene structure prediction, motif identification, protein structure prediction, and biological database search. However, few comprehensive, visual editing tools for profile HMM are publicly available. Results We develop a visual editor for profile Hidden Markov Models (HMMEditor). HMMEditor can visualize the profile HMM architecture, transition probabilities, and emission probabilities. Moreover, it provides functions to edit and save HMM and parameters. Furthermore, HMMEditor allows users to align a sequence against the profile HMM and to visualize the corresponding Viterbi path. Conclusion HMMEditor provides a set of unique functions to visualize and edit a profile HMM. It is a useful tool for biological sequence analysis and modeling. Both HMMEditor software and web service are freely available.
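The Viterbi alignment that HMMEditor visualizes can be sketched in a few lines. The following is a generic log-space Viterbi decoder over a toy two-state model; it is an illustration of the algorithm, not HMMEditor's own code, and all probabilities are invented:

```python
import numpy as np

def viterbi(obs, start_p, trans_p, emit_p):
    """Most likely state path for an observation sequence (log-space)."""
    n_states = len(start_p)
    T = len(obs)
    logv = np.full((T, n_states), -np.inf)     # best log-probability per state
    back = np.zeros((T, n_states), dtype=int)  # backpointers
    logv[0] = np.log(start_p) + np.log(emit_p[:, obs[0]])
    for t in range(1, T):
        for s in range(n_states):
            scores = logv[t - 1] + np.log(trans_p[:, s])
            back[t, s] = np.argmax(scores)
            logv[t, s] = scores[back[t, s]] + np.log(emit_p[s, obs[t]])
    # Trace back the optimal path from the best final state
    path = [int(np.argmax(logv[-1]))]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# Toy two-state profile: match (0) vs. insert (1), observations over {A=0, C=1}
start = np.array([0.8, 0.2])
trans = np.array([[0.9, 0.1], [0.5, 0.5]])
emit = np.array([[0.9, 0.1], [0.5, 0.5]])
print(viterbi([0, 0, 1], start, trans, emit))
```

A real profile HMM adds match/insert/delete states per column, but the decoding step is this same dynamic program.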

  4. Modeling flow and solute transport at a tile drain field site by explicit representation of preferential flow structures: Equifinality and uncertainty

    Science.gov (United States)

    Zehe, E.; Klaus, J.

    2011-12-01

    Rapid flow in connected preferential flow paths is crucial for fast transport of water and solutes through soils, especially at tile drained field sites. The present study tests whether an explicit treatment of worm burrows is feasible for modeling water flow, bromide and pesticide transport in structured heterogeneous soils with a 2-dimensional Richards based model. The essence is to represent worm burrows as morphologically connected paths of low flow resistance and low retention capacity in the spatially highly resolved model domain. The underlying extensive database to test this approach was collected during an irrigation experiment, which investigated transport of bromide and the herbicide Isoproturon at a 900 sqm tile drained field site. In a first step we investigated whether the inherent uncertainty in key data causes equifinality, i.e. whether there are several spatial model setups that reproduce tile drain event discharge in an acceptable manner. We found a considerable equifinality in the spatial setup of the model, when key parameters such as the area density of worm burrows and the maximum volumetric water flows inside these macropores were varied within the ranges of either our measurement errors or measurements reported in the literature. Thirteen model runs yielded a Nash-Sutcliffe coefficient of more than 0.9. Also, the flow volumes were in good accordance and peak timing errors were less than or equal to 20 min. In the second step we thus investigated whether this "equifinality" in spatial model setups may be reduced when including the bromide tracer data into the model falsification process. We simulated transport of bromide for the 13 spatial model setups, which performed best with respect to reproducing tile drain event discharge, without any further calibration. Four of these 13 model setups allowed modeling bromide transport within fixed limits of acceptability. Parameter uncertainty and equifinality could thus be reduced.
Thirdly, we selected
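The acceptability criterion used above, the Nash-Sutcliffe coefficient, compares the model error against the variance of the observations. A minimal sketch (the discharge values below are hypothetical, not data from the field site):

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of observations.

    NSE = 1 means a perfect fit; NSE <= 0 means the model is no better
    than simply predicting the mean of the observations.
    """
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / ss_tot

obs = [0.2, 0.5, 1.4, 0.9, 0.3]   # hypothetical tile drain discharge (L/s)
sim = [0.25, 0.45, 1.30, 0.95, 0.35]
print(round(nash_sutcliffe(obs, sim), 3))
```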

  5. Rapid State Space Modeling Tool for Rectangular Wing Aeroservoelastic Studies

    Science.gov (United States)

    Suh, Peter M.; Conyers, Howard Jason; Mavris, Dimitri N.

    2015-01-01

    This report introduces a modeling and simulation tool for aeroservoelastic analysis of rectangular wings with trailing-edge control surfaces. The inputs to the code are planform design parameters such as wing span, aspect ratio, and number of control surfaces. Using this information, the generalized forces are computed using the doublet-lattice method. Using Roger's method, a rational function approximation of the generalized forces is computed. The output, computed in a few seconds, is a state space aeroservoelastic model which can be used for analysis and control design. The tool is fully parameterized with default information so there is little required interaction with the model developer. All parameters can be easily modified if desired. The focus of this report is on tool presentation, verification, and validation. These processes are carried out in stages throughout the report. The rational function approximation is verified against computed generalized forces for a plate model. A model composed of finite element plates is compared to a modal analysis from commercial software and an independently conducted experimental ground vibration test analysis. Aeroservoelastic analysis is the ultimate goal of this tool; therefore, the flutter speed and frequency for a clamped plate are computed using damping-versus-velocity and frequency-versus-velocity analysis. The computational results are compared to a previously published computational analysis and wind-tunnel results for the same structure. A case study of a generic wing model with a single control surface is presented. Verification of the state space model is presented in comparison to damping-versus-velocity and frequency-versus-velocity analysis, including the analysis of the model in response to a 1-cos gust.

  6. QUALITY SERVICES EVALUATION MODEL BASED ON DEDICATED SOFTWARE TOOL

    Directory of Open Access Journals (Sweden)

    ANDREEA CRISTINA IONICĂ

    2012-10-01

    Full Text Available In this paper we introduce a new model, called Service Quality (SQ), which combines the QFD and SERVQUAL methods. This model takes the five dimensions of requirements and three dimensions of characteristics from the SERVQUAL method, and the application methodology from the QFD method. The originality of the SQ model consists in computing a global index that reflects how well the quality characteristics fulfil the customers' requirements. In order to prove the viability of the SQ model, a software tool was developed and applied to the evaluation of a health care services provider.
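The abstract does not give the exact formula for the global index; a plausible minimal sketch is a weighted average of requirement accomplishment levels, with weights expressing customer-assigned importance. All names and numbers below are hypothetical:

```python
def global_quality_index(weights, scores):
    """Weighted accomplishment index in [0, 1]: sum(w_i * s_i) / sum(w_i).

    weights: customer-assigned importance of each requirement
    scores:  accomplishment level of each requirement (0..1)
    """
    assert len(weights) == len(scores)
    return sum(w * s for w, s in zip(weights, scores)) / sum(weights)

# Hypothetical SERVQUAL-style dimensions: tangibles, reliability,
# responsiveness, assurance, empathy
weights = [2, 5, 4, 4, 3]
scores = [0.9, 0.7, 0.8, 0.85, 0.75]
print(round(global_quality_index(weights, scores), 3))
```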

  7. AgMIP Training in Multiple Crop Models and Tools

    Science.gov (United States)

    Boote, Kenneth J.; Porter, Cheryl H.; Hargreaves, John; Hoogenboom, Gerrit; Thornburn, Peter; Mutter, Carolyn

    2015-01-01

    The Agricultural Model Intercomparison and Improvement Project (AgMIP) has the goal of using multiple crop models to evaluate climate impacts on agricultural production and food security in developed and developing countries. There are several major limitations that must be overcome to achieve this goal, including the need to train AgMIP regional research team (RRT) crop modelers to use models other than the ones they are currently familiar with, plus the need to harmonize and interconvert the disparate input file formats used for the various models. Two activities were followed to address these shortcomings among AgMIP RRTs to enable them to use multiple models to evaluate climate impacts on crop production and food security. We designed and conducted courses in which participants trained on two different sets of crop models, with emphasis on the model of least experience. In a second activity, the AgMIP IT group created templates for inputting data on soils, management, weather, and crops into AgMIP harmonized databases, and developed translation tools for converting the harmonized data into files that are ready for multiple crop model simulations. The strategies for creating and conducting the multi-model course and developing entry and translation tools are reviewed in this chapter.

  8. Model-based setup assistant for progressive tools

    Science.gov (United States)

    Springer, Robert; Gräler, Manuel; Homberg, Werner; Henke, Christian; Trächtler, Ansgar

    2018-05-01

    In the field of production systems, globalization and technological progress lead to increasing requirements regarding part quality, delivery time and costs. Hence, today's production is challenged much more than a few years ago: it has to be very flexible and produce small batch sizes economically to satisfy consumers' demands and avoid unnecessary stock. Furthermore, a trend towards increasing functional integration continues to lead to an ongoing miniaturization of sheet metal components. In the electric connectivity industry, for example, the miniaturized connectors are manufactured by progressive tools, which are usually used for very large batches. These tools are installed in mechanical presses and then set up by a technician, who has to manually adjust a wide range of punch-bending operations. Disturbances like material thickness, temperatures, lubrication or tool wear complicate the setup procedure. In view of the increasing demand for production flexibility, this time-consuming process has to be handled more and more often. In this paper, a new approach for a model-based setup assistant is proposed as a solution, which is exemplarily applied in combination with a progressive tool. First, progressive tools and, more specifically, their setup process are described, and based on that, the challenges are pointed out. As a result, a systematic process to set up the machines is introduced. Next, the process is investigated with an FE analysis regarding the effects of the disturbances. In the next step, design of experiments is used to systematically develop a regression model of the system's behaviour. This model is integrated within an optimization in order to calculate optimal machine parameters and the subsequent necessary adjustment of the progressive tool due to the disturbances. Finally, the assistant is tested in a production environment and the results are discussed.
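The pipeline described (design of experiments, then a regression model, then optimization of machine parameters) can be sketched with a one-variable quadratic response surface. The stroke adjustments and deviations below are invented for illustration and are not from the paper's experiments:

```python
import numpy as np

# Hypothetical DoE data: press stroke adjustment x (mm) vs. measured
# part deviation y (µm) under a given disturbance (e.g. material thickness).
x = np.array([-0.2, -0.1, 0.0, 0.1, 0.2])
y = np.array([35.0, 18.0, 9.0, 7.0, 13.0])

# Fit a quadratic regression model y ≈ a*x^2 + b*x + c by least squares
a, b, c = np.polyfit(x, y, deg=2)

# "Optimization": the stroke adjustment minimizing the predicted deviation
x_opt = -b / (2 * a)
y_opt = np.polyval([a, b, c], x_opt)
print(f"optimal adjustment: {x_opt:.3f} mm, predicted deviation: {y_opt:.1f} µm")
```

In the paper's setting the regression would have several inputs (disturbances and machine parameters) and the optimization would be multivariate, but the structure is the same.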

  9. Continued development of modeling tools and theory for RF heating

    International Nuclear Information System (INIS)

    1998-01-01

    Mission Research Corporation (MRC) is pleased to present the Department of Energy (DOE) with its renewal proposal to the Continued Development of Modeling Tools and Theory for RF Heating program. The objective of the program is to continue and extend the earlier work done by the proposed principal investigator in the field of modeling radio frequency (RF) heating experiments in the large tokamak fusion experiments, particularly the Tokamak Fusion Test Reactor (TFTR) device located at Princeton Plasma Physics Laboratory (PPPL). An integral part of this work is the investigation and, in some cases, resolution of theoretical issues which pertain to accurate modeling. MRC is nearing the successful completion of the specified tasks of the Continued Development of Modeling Tools and Theory for RF Heating project. The following tasks are either completed or nearing completion. (1) Anisotropic temperature and rotation upgrades; (2) Modeling for relativistic ECRH; (3) Further documentation of SHOOT and SPRUCE. As a result of the progress achieved under this project, MRC has been urged to continue this effort. Specifically, during the performance of this project two topics were identified by PPPL personnel as new applications of the existing RF modeling tools. These two topics concern (a) future fast-wave current drive experiments on the large tokamaks including TFTR and (b) the interpretation of existing and future RF probe data from TFTR. Addressing each of these topics requires some modification or enhancement of the existing modeling tools, and the first topic requires resolution of certain theoretical issues to produce self-consistent results. This work falls within the scope of the original project and is more suited to the project's renewal than to the initiation of a new project

  10. Programming Models and Tools for Intelligent Embedded Systems

    DEFF Research Database (Denmark)

    Sørensen, Peter Verner Bojsen

    Design automation and analysis tools targeting embedded platforms, developed using a component-based design approach, must be able to reason about the capabilities of the platforms. In the general case where nothing is assumed about the components comprising a platform or the platform topology...... is used for checking the consistency of a design with respect to the availablity of services and resources. In the second application, a tool for automatically implementing the communication infrastructure of a process network application, the Service Relation Model is used for analyzing the capabilities...

  11. Evaluating EML Modeling Tools for Insurance Purposes: A Case Study

    Directory of Open Access Journals (Sweden)

    Mikael Gustavsson

    2010-01-01

    Full Text Available As with any situation that involves economic risk, refineries may share their risk with insurers. The decision process generally includes modelling to determine to what extent the process area can be damaged. On the extreme end of modelling the so-called Estimated Maximum Loss (EML) scenarios are found. These scenarios predict the maximum loss a particular installation can sustain. Unfortunately no standard model for this exists. Thus the insurers reach different results due to applying different models and different assumptions. Therefore, a study has been conducted on a case in a Swedish refinery where several scenarios previously had been modelled by two different insurance brokers using two different software packages, ExTool and SLAM. This study reviews the concept of EML and analyses the models used to see which parameters are most uncertain. Also a third model, EFFECTS, was employed in an attempt to reach a conclusion with higher reliability.

  12. The Role of Explicit and Implicit Self-Esteem in Peer Modeling of Palatable Food Intake: A Study on Social Media Interaction among Youngsters

    Science.gov (United States)

    Bevelander, Kirsten E.; Anschütz, Doeschka J.; Creemers, Daan H. M.; Kleinjan, Marloes; Engels, Rutger C. M. E.

    2013-01-01

    Objective This experimental study investigated the impact of peers on palatable food intake of youngsters within a social media setting. To determine whether this effect was moderated by self-esteem, the present study examined the roles of global explicit self-esteem (ESE), body esteem (BE) and implicit self-esteem (ISE). Methods Participants (N = 118; 38.1% boys; M age 11.14±.79) were asked to play a computer game while they believed to interact online with a same-sex normal-weight remote confederate (i.e., instructed peer) who ate either nothing, a small or large amount of candy. Results Participants modeled the candy intake of peers via a social media interaction, but this was qualified by their self-esteem. Participants with higher ISE adjusted their candy intake to that of a peer more closely than those with lower ISE when the confederate ate nothing compared to when eating a modest (β = .26, p = .05) or considerable amount of candy (kcal) (β = .32, p = .001). In contrast, participants with lower BE modeled peer intake more than those with higher BE when eating nothing compared to a considerable amount of candy (kcal) (β = .21, p = .02); ESE did not moderate social modeling behavior. In addition, participants with higher discrepant or “damaged” self-esteem (i.e., high ISE and low ESE) modeled peer intake more when the peer ate nothing or a modest amount compared to a substantial amount of candy (kcal) (β = −.24, p = .004; β = −.26, p<.0001, respectively). Conclusion Youngsters conform to the amount of palatable food eaten by peers through social media interaction. Those with lower body esteem or damaged self-esteem may be more at risk to peer influences on food intake. PMID:24015251

  13. The role of explicit and implicit self-esteem in peer modeling of palatable food intake: a study on social media interaction among youngsters.

    Science.gov (United States)

    Bevelander, Kirsten E; Anschütz, Doeschka J; Creemers, Daan H M; Kleinjan, Marloes; Engels, Rutger C M E

    2013-01-01

    This experimental study investigated the impact of peers on palatable food intake of youngsters within a social media setting. To determine whether this effect was moderated by self-esteem, the present study examined the roles of global explicit self-esteem (ESE), body esteem (BE) and implicit self-esteem (ISE). Participants (N = 118; 38.1% boys; M age 11.14±.79) were asked to play a computer game while they believed to interact online with a same-sex normal-weight remote confederate (i.e., instructed peer) who ate either nothing, a small or large amount of candy. Participants modeled the candy intake of peers via a social media interaction, but this was qualified by their self-esteem. Participants with higher ISE adjusted their candy intake to that of a peer more closely than those with lower ISE when the confederate ate nothing compared to when eating a modest (β = .26, p = .05) or considerable amount of candy (kcal) (β = .32, p = .001). In contrast, participants with lower BE modeled peer intake more than those with higher BE when eating nothing compared to a considerable amount of candy (kcal) (β = .21, p = .02); ESE did not moderate social modeling behavior. In addition, participants with higher discrepant or "damaged" self-esteem (i.e., high ISE and low ESE) modeled peer intake more when the peer ate nothing or a modest amount compared to a substantial amount of candy (kcal) (β = -.24, p = .004; β = -.26, p<.0001, respectively). Youngsters conform to the amount of palatable food eaten by peers through social media interaction. Those with lower body esteem or damaged self-esteem may be more at risk to peer influences on food intake.

  14. The role of explicit and implicit self-esteem in peer modeling of palatable food intake: a study on social media interaction among youngsters.

    Directory of Open Access Journals (Sweden)

    Kirsten E Bevelander

    Full Text Available OBJECTIVE: This experimental study investigated the impact of peers on palatable food intake of youngsters within a social media setting. To determine whether this effect was moderated by self-esteem, the present study examined the roles of global explicit self-esteem (ESE), body esteem (BE) and implicit self-esteem (ISE). METHODS: Participants (N = 118; 38.1% boys; M age 11.14±.79) were asked to play a computer game while they believed to interact online with a same-sex normal-weight remote confederate (i.e., instructed peer) who ate either nothing, a small or large amount of candy. RESULTS: Participants modeled the candy intake of peers via a social media interaction, but this was qualified by their self-esteem. Participants with higher ISE adjusted their candy intake to that of a peer more closely than those with lower ISE when the confederate ate nothing compared to when eating a modest (β = .26, p = .05) or considerable amount of candy (kcal) (β = .32, p = .001). In contrast, participants with lower BE modeled peer intake more than those with higher BE when eating nothing compared to a considerable amount of candy (kcal) (β = .21, p = .02); ESE did not moderate social modeling behavior. In addition, participants with higher discrepant or "damaged" self-esteem (i.e., high ISE and low ESE) modeled peer intake more when the peer ate nothing or a modest amount compared to a substantial amount of candy (kcal) (β = -.24, p = .004; β = -.26, p<.0001, respectively). CONCLUSION: Youngsters conform to the amount of palatable food eaten by peers through social media interaction. Those with lower body esteem or damaged self-esteem may be more at risk to peer influences on food intake.

  15. High accuracy navigation information estimation for inertial system using the multi-model EKF fusing adams explicit formula applied to underwater gliders.

    Science.gov (United States)

    Huang, Haoqian; Chen, Xiyuan; Zhang, Bo; Wang, Jian

    2017-01-01

    The underwater navigation system, mainly consisting of MEMS inertial sensors, is a key technology for the wide application of underwater gliders and plays an important role in achieving high accuracy navigation and positioning for a long period of time. However, the navigation errors will accumulate over time because of the inherent errors of inertial sensors, especially for the MEMS grade IMU (Inertial Measurement Unit) generally used in gliders. The dead reckoning module is added to compensate for the errors. In the complicated underwater environment, the performance of MEMS sensors is degraded sharply and the errors become much larger. It is difficult to establish an accurate and fixed error model for the inertial sensor. Therefore, it is very hard to improve the accuracy of navigation information calculated by sensors. In order to solve this problem, a more suitable filter that integrates the multi-model method with an EKF approach can be designed according to different error models to give the optimal estimate of the state. The key parameters of the error models can be used to determine the corresponding filter. The Adams explicit formula, which has the advantage of high-precision prediction, is simultaneously fused into the above filter to further improve attitude estimation accuracy. The proposed algorithm has been proved through theoretical analyses and has been tested by both vehicle experiments and lake trials. Results show that the proposed method has better accuracy and effectiveness in terms of attitude estimation compared with other methods mentioned in the paper for inertial navigation applied to underwater gliders. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
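The "Adams explicit formula" refers to the Adams-Bashforth family of explicit multistep integrators used here for high-precision state prediction. A generic two-step variant is sketched below; this is the textbook scheme on a toy ODE, not the authors' filter implementation:

```python
import math

def adams_bashforth2(f, t0, y0, h, steps):
    """Two-step explicit Adams-Bashforth integrator:
    y[n+1] = y[n] + h * (3/2 * f(t[n], y[n]) - 1/2 * f(t[n-1], y[n-1])).
    The first step is bootstrapped with a single Euler step.
    """
    ts = [t0, t0 + h]
    ys = [y0, y0 + h * f(t0, y0)]  # Euler bootstrap
    for n in range(1, steps):
        fn = f(ts[n], ys[n])
        fnm1 = f(ts[n - 1], ys[n - 1])
        ys.append(ys[n] + h * (1.5 * fn - 0.5 * fnm1))
        ts.append(ts[n] + h)
    return ts, ys

# Test problem y' = -y, y(0) = 1, whose exact solution is e^{-t}
ts, ys = adams_bashforth2(lambda t, y: -y, 0.0, 1.0, 0.01, 100)
print(abs(ys[-1] - math.exp(-ts[-1])))  # small global error
```

Inside a filter, the same predictor would propagate the state estimate between measurement updates.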

  16. Scenario Evaluator for Electrical Resistivity survey pre-modeling tool

    Science.gov (United States)

    Terry, Neil; Day-Lewis, Frederick D.; Robinson, Judith L.; Slater, Lee D.; Halford, Keith J.; Binley, Andrew; Lane, John W.; Werkema, Dale D.

    2017-01-01

    Geophysical tools have much to offer users in environmental, water resource, and geotechnical fields; however, techniques such as electrical resistivity imaging (ERI) are often oversold and/or overinterpreted due to a lack of understanding of the limitations of the techniques, such as the appropriate depth intervals or resolution of the methods. The relationship between ERI data and resistivity is nonlinear; therefore, these limitations depend on site conditions and survey design and are best assessed through forward and inverse modeling exercises prior to field investigations. In this approach, proposed field surveys are first numerically simulated given the expected electrical properties of the site, and the resulting hypothetical data are then analyzed using inverse models. Performing ERI forward/inverse modeling, however, requires substantial expertise and can take many hours to implement. We present a new spreadsheet-based tool, the Scenario Evaluator for Electrical Resistivity (SEER), which features a graphical user interface that allows users to manipulate a resistivity model and instantly view how that model would likely be interpreted by an ERI survey. The SEER tool is intended for use by those who wish to determine the value of including ERI to achieve project goals, and is designed to have broad utility in industry, teaching, and research.
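The forward relationship at the core of such pre-modeling maps subsurface resistivity to the apparent resistivity measured by an electrode array through the array's geometric factor. A minimal sketch for a Wenner array over a homogeneous half-space follows; it illustrates the physics only and says nothing about SEER's spreadsheet internals, which the abstract does not describe:

```python
import math

def wenner_apparent_resistivity(spacing, delta_v, current):
    """Apparent resistivity (ohm-m) for a Wenner array with electrode
    spacing a (m): rho_a = 2 * pi * a * (dV / I).

    Over a homogeneous half-space this recovers the true resistivity;
    over layered ground it varies with spacing, which is what ERI
    inversion exploits.
    """
    return 2.0 * math.pi * spacing * delta_v / current

# Hypothetical sounding: 100 ohm-m half-space, 1 A injected, a = 5 m
rho_true = 100.0
a = 5.0
dv = rho_true * 1.0 / (2.0 * math.pi * a)  # voltage such a half-space would produce
print(wenner_apparent_resistivity(a, dv, 1.0))
```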

  17. Making the Tacit Explicit

    DEFF Research Database (Denmark)

    Blasco, Maribel

    2015-01-01

    The article proposes an approach, broadly inspired by culturally inclusive pedagogy, to facilitate international student academic adaptation based on rendering tacit aspects of local learning cultures explicit to international full degree students, rather than adapting them. Preliminary findings...... are presented from a focus group-based exploratory study of international student experiences at different stages of their studies at a Danish business school, one of Denmark’s most international universities. The data show how a major source of confusion for these students has to do with the tacit logics...... and expectations that shape how the formal steps of the learning cycle are understood and enacted locally, notably how learning and assessment moments are defined and related to one another. Theoretically, the article draws on tacit knowledge and sense-making theories to analyse student narratives...

  18. DsixTools: the standard model effective field theory toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Celis, Alejandro [Ludwig-Maximilians-Universitaet Muenchen, Fakultaet fuer Physik, Arnold Sommerfeld Center for Theoretical Physics, Munich (Germany); Fuentes-Martin, Javier; Vicente, Avelino [Universitat de Valencia-CSIC, Instituto de Fisica Corpuscular, Valencia (Spain); Virto, Javier [University of Bern, Albert Einstein Center for Fundamental Physics, Institute for Theoretical Physics, Bern (Switzerland)

    2017-06-15

    We present DsixTools, a Mathematica package for the handling of the dimension-six standard model effective field theory. Among other features, DsixTools allows the user to perform the full one-loop renormalization group evolution of the Wilson coefficients in the Warsaw basis. This is achieved thanks to the SMEFTrunner module, which implements the full one-loop anomalous dimension matrix previously derived in the literature. In addition, DsixTools also contains modules devoted to the matching to the ΔB = ΔS = 1, 2 and ΔB = ΔC = 1 operators of the Weak Effective Theory at the electroweak scale, and their QCD and QED Renormalization group evolution below the electroweak scale. (orig.)
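The one-loop renormalization group evolution performed by the SMEFTrunner module solves dC/d ln(mu) = gamma/(16 pi^2) * C for the Wilson coefficients. In the single-coefficient approximation (the full anomalous dimension is a matrix), the leading-log solution can be sketched as below; the anomalous dimension value is invented for illustration and is not a number from DsixTools:

```python
import math

def run_wilson_coefficient(c_high, gamma, mu_high, mu_low):
    """Leading-log solution of the one-loop RGE
    dC/dln(mu) = (gamma / (16 pi^2)) * C, i.e.
    C(mu_low) = C(mu_high) * exp(gamma / (16 pi^2) * ln(mu_low / mu_high)).
    gamma is a hypothetical anomalous dimension for a single coefficient.
    """
    exponent = gamma / (16.0 * math.pi ** 2) * math.log(mu_low / mu_high)
    return c_high * math.exp(exponent)

# Evolve a coefficient from 1 TeV down to the electroweak scale (~160 GeV)
print(run_wilson_coefficient(1.0, gamma=8.0, mu_high=1000.0, mu_low=160.0))
```

With a positive anomalous dimension the coefficient shrinks when run down in scale, as the printed value shows.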

  19. Open Tools for Integrated Modelling to Understand SDG development - The OPTIMUS program

    Science.gov (United States)

    Howells, Mark; Zepeda, Eduardo; Rogner, H. Holger; Sanchez, Marco; Roehrl, Alexander; Cicowiez, Matrin; Mentis, Dimitris; Korkevelos, Alexandros; Taliotis, Constantinos; Broad, Oliver; Alfstad, Thomas

    2016-04-01

    The recently adopted Sustainable Development Goals (SDGs) - a set of 17 measurable and time-bound goals with 169 associated targets for 2030 - are highly inclusive challenges before the world community, ranging from eliminating poverty to human rights, inequality, a secure world and protection of the environment. Each individual goal or target by itself presents an enormous task; taken together they are overwhelming. There are strong and weak interlinkages, hence trade-offs and complementarities, among goals and targets. Some targets may affect several goals while other goals and targets may conflict or be mutually exclusive (Ref). Meeting each of these requires the judicious exploitation of resources, with energy playing an important role. Such complexity must be addressed in an integrated way using systems analysis tools to support informed policy formulation, planning, allocation of scarce resources, monitoring progress, effectiveness and review at different scales. There is no one-size-fits-all methodology that could conceivably include all goals and targets simultaneously. But there are methodologies encapsulating critical subsets of the goals and targets with strong interlinkages, with a 'soft' reflection on the weak interlinkages. Universal food security or sustainable energy for all inherently support goals and targets on human rights and equality, but possibly at the cost of biodiversity or desertification. Integrated analysis and planning tools are not yet commonplace at national universities - or indeed in many policy making organs. What is needed is a fundamental realignment of institutions and integration of their planning processes and decision making. We introduce a series of open source tools to support the SDG planning and implementation process. The Global User-friendly CLEW Open Source (GLUCOSE) tool optimizes resource interactions and constraints; The Global Electrification Tool kit (GETit) provides the first global spatially explicit

  20. Model Fusion Tool - the Open Environmental Modelling Platform Concept

    Science.gov (United States)

    Kessler, H.; Giles, J. R.

    2010-12-01

    The vision of an Open Environmental Modelling Platform - seamlessly linking geoscience data, concepts and models to aid decision making in times of environmental change. Governments and their executive agencies across the world are facing increasing pressure to make decisions about the management of resources in light of population growth and environmental change. In the UK, for example, groundwater is becoming a scarce resource for large parts of its most densely populated areas. At the same time, river and groundwater flooding resulting from high rainfall events is increasing in scale and frequency, and sea level rise is threatening the defences of coastal cities. There is also a need for affordable housing, improved transport infrastructure and waste disposal as well as sources of renewable energy and sustainable food production. These challenges can only be resolved if solutions are based on sound scientific evidence. Although we have knowledge and understanding of many individual processes in the natural sciences, it is clear that a single science discipline is unable to answer these questions and their inter-relationships. Modern science increasingly employs computer models to simulate the natural, economic and human system. Management and planning requires scenario modelling, forecasts and ‘predictions’. Although the outputs are often impressive in terms of apparent accuracy and visualisation, they are inherently not suited to simulate the response to feedbacks from other models of the earth system, such as the impact of human actions. Geological Survey Organisations (GSO) are increasingly employing advances in Information Technology to visualise and improve their understanding of geological systems. Instead of 2-dimensional paper maps and reports, many GSOs now produce 3-dimensional geological framework models and groundwater flow models as their standard output. Additionally, the British Geological Survey has developed standard routines to link geological

  1. ISAC: A tool for aeroservoelastic modeling and analysis

    Science.gov (United States)

    Adams, William M., Jr.; Hoadley, Sherwood Tiffany

    1993-01-01

    The capabilities of the Interaction of Structures, Aerodynamics, and Controls (ISAC) system of program modules is discussed. The major modeling, analysis, and data management components of ISAC are identified. Equations of motion are displayed for a Laplace-domain representation of the unsteady aerodynamic forces. Options for approximating a frequency-domain representation of unsteady aerodynamic forces with rational functions of the Laplace variable are shown. Linear time invariant state-space equations of motion that result are discussed. Model generation and analyses of stability and dynamic response characteristics are shown for an aeroelastic vehicle which illustrates some of the capabilities of ISAC as a modeling and analysis tool for aeroelastic applications.

  2. Graphite-MicroMégas, a tool for DNA modeling

    OpenAIRE

    Hornus , Samuel; Larivière , Damien

    2011-01-01

    National audience; MicroMégas is the current state of an ongoing effort to develop tools for modeling biological assemblies of molecules. We present here its DNA modeling part. MicroMégas is implemented as a plug-in to Graphite, a research platform for computer graphics, 3D modeling and numerical geometry developed by members of the ALICE team at INRIA.; We describe the MicroMégas tool and the techniques it employs for the modeling of assemblies of molecules, in par...

  3. Models as Tools of Analysis of a Network Organisation

    Directory of Open Access Journals (Sweden)

    Wojciech Pająk

    2013-06-01

    Full Text Available The paper presents models which may be applied as tools of analysis of a network organisation. The starting point of the discussion is defining the following terms: supply chain and network organisation. Further parts of the paper present the basic assumptions of the analysis of a network organisation. The study then characterises the best-known models utilised in the analysis of a network organisation. The purpose of the article is to define the notion and the essence of network organisations and to present the models used for their analysis.

  4. Using an Explicit Emission Tagging Method in Global Modeling of Source-Receptor Relationships for Black Carbon in the Arctic: Variations, Sources and Transport Pathways

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Hailong; Rasch, Philip J.; Easter, Richard C.; Singh, Balwinder; Zhang, Rudong; Ma, Po-Lun; Qian, Yun; Ghan, Steven J.; Beagley, Nathaniel

    2014-11-27

    We introduce an explicit emission tagging technique in the Community Atmosphere Model to quantify source-region-resolved characteristics of black carbon (BC), focusing on the Arctic. Explicit tagging of BC source regions without perturbing the emissions makes it straightforward to establish source-receptor relationships and transport pathways, providing a physically consistent and computationally efficient approach to produce a detailed characterization of the destiny of regional BC emissions and the potential for mitigation actions. Our analysis shows that the contributions of major source regions to the global BC burden are not proportional to the respective emissions due to strong region-dependent removal rates and lifetimes, while the contributions to BC direct radiative forcing show a near-linear dependence on their respective contributions to the burden. Distant sources contribute to BC in remote regions mostly in the mid- and upper troposphere, having much less impact on lower-level concentrations (and deposition) than on burden. Arctic BC concentrations, deposition and source contributions all have strong seasonal variations. Eastern Asia contributes the most to the wintertime Arctic burden. Northern Europe emissions are more important to both surface concentration and deposition in winter than in summer. The largest contribution to Arctic BC in the summer is from Northern Asia. Although local emissions contribute less than 10% to the annual mean BC burden and deposition within the Arctic, the per-emission efficiency is much higher than for major non-Arctic sources. The interannual variability (1996-2005) due to meteorology is small in annual mean BC burden and radiative forcing but is significant in yearly seasonal means over the Arctic. When a slow aging treatment of BC is introduced, the increase of BC lifetime and burden is source-dependent. Global BC forcing-per-burden efficiency also increases primarily due to changes in BC vertical distributions. 
The
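The source-receptor bookkeeping described above can be illustrated with a toy tagged-tracer calculation. This is a minimal sketch, not the Community Atmosphere Model implementation: each region's BC is carried as a separate tracer with its own first-order removal rate, so burden shares need not be proportional to emission shares. Region names, emissions and removal rates are invented for illustration.

```python
# Minimal sketch (not the CAM tagging code): tagged-tracer burdens with
# region-dependent first-order removal.
def steady_state_burden(emission, removal_rate):
    """Steady-state burden of a tagged tracer: dB/dt = E - k*B = 0 -> B = E/k."""
    return emission / removal_rate

regions = {
    # region: (emission in Tg/yr, removal rate in 1/yr); hypothetical values
    "EastAsia": (2.0, 60.0),
    "Europe":   (0.8, 80.0),
    "Arctic":   (0.05, 40.0),
}

burdens = {r: steady_state_burden(e, k) for r, (e, k) in regions.items()}
total = sum(burdens.values())
shares = {r: b / total for r, b in burdens.items()}

# Because removal rates differ by region, burden shares are not proportional
# to emission shares; the tagging technique makes this gap quantifiable.
```

The same ledger extends naturally to deposition and forcing contributions, which is what makes per-region mitigation efficiency comparable.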

  5. Spatially-Explicit Simulation Modeling of Ecological Response to Climate Change: Methodological Considerations in Predicting Shifting Population Dynamics of Infectious Disease Vectors

    Directory of Open Access Journals (Sweden)

    Justin V. Remais

    2013-07-01

    Full Text Available Poikilothermic disease vectors can respond to altered climates through spatial changes in both population size and phenology. Quantitative descriptors to characterize, analyze and visualize these dynamic responses are lacking, particularly across large spatial domains. In order to demonstrate the value of a spatially explicit, dynamic modeling approach, we assessed spatial changes in the population dynamics of Ixodes scapularis, the Lyme disease vector, using a temperature-forced population model simulated across a grid of 4 × 4 km cells covering the eastern United States, using both modeled (Weather Research and Forecasting (WRF) 3.2.1) baseline/current (2001–2004) and projected (Representative Concentration Pathway (RCP) 4.5 and RCP 8.5; 2057–2059) climate data. Ten dynamic population features (DPFs) were derived from simulated populations and analyzed spatially to characterize the regional population response to current and future climate across the domain. Each DPF under the current climate was assessed for its ability to discriminate observed Lyme disease risk and known vector presence/absence, using data from the US Centers for Disease Control and Prevention. Peak vector population and month of peak vector population were the DPFs that performed best as predictors of current Lyme disease risk. When examined under baseline and projected climate scenarios, the spatial and temporal distributions of DPFs shift and the seasonal cycle of key questing life stages is compressed under some scenarios. Our results demonstrate the utility of spatial characterization, analysis and visualization of dynamic population responses—including altered phenology—of disease vectors to altered climate.

  6. Spatially-Explicit Simulation Modeling of Ecological Response to Climate Change: Methodological Considerations in Predicting Shifting Population Dynamics of Infectious Disease Vectors.

    Science.gov (United States)

    Dhingra, Radhika; Jimenez, Violeta; Chang, Howard H; Gambhir, Manoj; Fu, Joshua S; Liu, Yang; Remais, Justin V

    2013-09-01

    Poikilothermic disease vectors can respond to altered climates through spatial changes in both population size and phenology. Quantitative descriptors to characterize, analyze and visualize these dynamic responses are lacking, particularly across large spatial domains. In order to demonstrate the value of a spatially explicit, dynamic modeling approach, we assessed spatial changes in the population dynamics of Ixodes scapularis, the Lyme disease vector, using a temperature-forced population model simulated across a grid of 4 × 4 km cells covering the eastern United States, using both modeled (Weather Research and Forecasting (WRF) 3.2.1) baseline/current (2001-2004) and projected (Representative Concentration Pathway (RCP) 4.5 and RCP 8.5; 2057-2059) climate data. Ten dynamic population features (DPFs) were derived from simulated populations and analyzed spatially to characterize the regional population response to current and future climate across the domain. Each DPF under the current climate was assessed for its ability to discriminate observed Lyme disease risk and known vector presence/absence, using data from the US Centers for Disease Control and Prevention. Peak vector population and month of peak vector population were the DPFs that performed best as predictors of current Lyme disease risk. When examined under baseline and projected climate scenarios, the spatial and temporal distributions of DPFs shift and the seasonal cycle of key questing life stages is compressed under some scenarios. Our results demonstrate the utility of spatial characterization, analysis and visualization of dynamic population responses, including altered phenology, of disease vectors to altered climate.
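The abstract's two best-performing DPFs, peak vector population and month of peak, can be read off from any temperature-forced simulation. The sketch below is an illustrative stand-in, not the published I. scapularis model: a discrete logistic population in one grid cell whose net growth rate is a hypothetical unimodal function of monthly temperature.

```python
# Toy temperature-forced population model for one grid cell; all parameters
# and the temperature response are invented for illustration.
import math

def net_rate(temp_c, opt=25.0, width=10.0, r_max=0.9, mortality=0.25):
    """Hypothetical unimodal temperature response minus a flat mortality."""
    return r_max * math.exp(-((temp_c - opt) / width) ** 2) - mortality

def simulate_cell(monthly_temps, n0=10.0, carrying_capacity=1000.0):
    n, series = n0, []
    for t in monthly_temps:
        r = net_rate(t)
        n = max(n + r * n * (1.0 - n / carrying_capacity), 0.0)  # logistic step
        series.append(n)
    return series

temps = [-2, 0, 5, 12, 18, 24, 27, 26, 20, 13, 6, 0]  # Jan-Dec (deg C)
series = simulate_cell(temps)
peak = max(series)                        # DPF 1: peak vector population
month_of_peak = series.index(peak) + 1    # DPF 2: month of peak (1 = January)
```

Running the same recurrence in every 4 × 4 km cell, with that cell's climate series, yields maps of each DPF that can then be compared across climate scenarios.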

  7. Theoretical Modeling of Rock Breakage by Hydraulic and Mechanical Tool

    Directory of Open Access Journals (Sweden)

    Hongxiang Jiang

    2014-01-01

    Full Text Available Rock breakage by coupled mechanical and hydraulic action has been developed over the past several decades, but theoretical study of rock fragmentation by a mechanical tool with water pressure assistance was still lacking. The theoretical model of rock breakage by a mechanical tool was developed based on rock fracture mechanics and the solution of Boussinesq’s problem, and it could explain the process of rock fragmentation as well as predict the peak reacting force. The theoretical model of rock breakage by coupled mechanical and hydraulic action was developed according to the superposition principle of intensity factors at the crack tip, and the reacting force of the mechanical tool assisted by hydraulic action could be reduced considerably if a crack with a critical length could be produced by mechanical or hydraulic impact. The experimental results indicated that the peak reacting force could be reduced by about 15% when assisted by medium water pressure, and the quick reduction of the reacting force after the peak value decreased the specific energy consumption of rock fragmentation by the mechanical tool. Crack formation by mechanical or hydraulic impact was the prerequisite for improving the effectiveness of combined breakage.
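The superposition principle the model rests on can be sketched numerically. Assuming the standard pressurised-crack expression K = p·sqrt(πa) and a deliberately simplified point-load scaling for the tool (both illustrative, not the paper's Boussinesq-based solution), water pressure carries part of the stress-intensity budget and lowers the mechanical force needed to reach the fracture toughness K_Ic.

```python
# Hedged sketch of stress-intensity superposition at the crack tip.
import math

def k_from_point_load(force, crack_length, c=0.5):
    """Illustrative K ~ c * F / sqrt(pi * a) scaling for the tool load."""
    return c * force / math.sqrt(math.pi * crack_length)

def k_from_water_pressure(pressure, crack_length):
    """Uniformly pressurised crack: K = p * sqrt(pi * a)."""
    return pressure * math.sqrt(math.pi * crack_length)

def required_force(k_ic, crack_length, water_pressure=0.0, c=0.5):
    """Mechanical force at which K_mech + K_hyd reaches K_Ic."""
    k_needed = k_ic - k_from_water_pressure(water_pressure, crack_length)
    return max(k_needed, 0.0) * math.sqrt(math.pi * crack_length) / c

a, k_ic = 0.01, 2.0e6          # 10 mm crack, K_Ic = 2 MPa*sqrt(m); illustrative
f_dry = required_force(k_ic, a)
f_wet = required_force(k_ic, a, water_pressure=1.0e6)  # 1 MPa assist
# f_wet < f_dry: hydraulic action reduces the peak reacting force.
```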

  8. Using the IEA ETSAP modelling tools for Denmark

    Energy Technology Data Exchange (ETDEWEB)

    Grohnheit, Poul Erik

    2008-12-15

    An important part of the cooperation within the IEA (International Energy Agency) is organised through national contributions to 'Implementation Agreements' on energy technology and energy analyses. One of them is ETSAP (Energy Technology Systems Analysis Programme), started in 1976. Denmark has signed the agreement and contributed to some early annexes. This project is motivated by an invitation to participate in ETSAP Annex X, 'Global Energy Systems and Common Analyses: Climate friendly, Secure and Productive Energy Systems' for the period 2005 to 2007. The main activity is semi-annual workshops focusing on presentations of model analyses and use of the ETSAP tools (the MARKAL/TIMES family of models). The project was also planned to benefit from the EU project 'NEEDS - New Energy Externalities Developments for Sustainability'. ETSAP is contributing to a part of NEEDS that develops the TIMES model for 29 European countries with assessment of future technologies. An additional project 'Monitoring and Evaluation of the RES directives: implementation in EU27 and policy recommendations for 2020' (RES2020) under Intelligent Energy Europe was added, as well as the Danish 'Centre for Energy, Environment and Health' (CEEH), starting from January 2007. This report summarises the activities under ETSAP Annex X and related projects, emphasising the development of modelling tools that will be useful for modelling the Danish energy system. It is also a status report for the development of a model for Denmark, focusing on the tools and features that allow comparison with other countries and, particularly, evaluation of assumptions and results in international models covering Denmark. (au)

  9. ADAS tools for collisional–radiative modelling of molecules

    Energy Technology Data Exchange (ETDEWEB)

    Guzmán, F., E-mail: francisco.guzman@cea.fr [Department of Physics, University of Strathclyde, Glasgow G4 0NG (United Kingdom); CEA, IRFM, Saint-Paul-lez-Durance 13108 (France); O’Mullane, M.; Summers, H.P. [Department of Physics, University of Strathclyde, Glasgow G4 0NG (United Kingdom)

    2013-07-15

    New theoretical and computational tools for molecular collisional–radiative models are presented. An application to the hydrogen molecule system has been made. At the same time, a structured database has been created where fundamental cross sections and rates for individual processes as well as derived data (effective coefficients) are stored. Relative populations for the vibrational states of the ground electronic state of H{sub 2} are presented, and this vibronic resolution model is compared with an electronic resolution model in which vibronic transitions are summed over vibrational sub-states. Some new reaction rates are calculated by means of the impact parameter approximation. Computational tools have been developed to automate the process and simplify the data assembly. Effective (collisional–radiative) rate coefficients versus temperature and density are presented.

  10. ADAS tools for collisional-radiative modelling of molecules

    Science.gov (United States)

    Guzmán, F.; O'Mullane, M.; Summers, H. P.

    2013-07-01

    New theoretical and computational tools for molecular collisional-radiative models are presented. An application to the hydrogen molecule system has been made. At the same time, a structured database has been created where fundamental cross sections and rates for individual processes as well as derived data (effective coefficients) are stored. Relative populations for the vibrational states of the ground electronic state of H2 are presented, and this vibronic resolution model is compared with an electronic resolution model in which vibronic transitions are summed over vibrational sub-states. Some new reaction rates are calculated by means of the impact parameter approximation. Computational tools have been developed to automate the process and simplify the data assembly. Effective (collisional-radiative) rate coefficients versus temperature and density are presented.

  11. Introduction to genetic algorithms as a modeling tool

    International Nuclear Information System (INIS)

    Wildberger, A.M.; Hickok, K.A.

    1990-01-01

    Genetic algorithms are search and classification techniques modeled on natural adaptive systems. This is an introduction to their use as a modeling tool with emphasis on prospects for their application in the power industry. It is intended to provide enough background information for its audience to begin to follow technical developments in genetic algorithms and to recognize those which might impact on electric power engineering. Beginning with a discussion of genetic algorithms and their origin as a model of biological adaptation, their advantages and disadvantages are described in comparison with other modeling tools such as simulation and neural networks in order to provide guidance in selecting appropriate applications. In particular, their use is described for improving expert systems from actual data and they are suggested as an aid in building mathematical models. Using the Thermal Performance Advisor as an example, it is suggested how genetic algorithms might be used to make a conventional expert system and mathematical model of a power plant adapt automatically to changes in the plant's characteristics
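A minimal genetic algorithm of the kind the article introduces can be written in a few dozen lines: bitstring individuals, tournament selection, one-point crossover and bit-flip mutation. All parameters below are illustrative, and the toy objective is the classic "one-max" problem (maximise the number of ones).

```python
# Minimal GA sketch: not tied to any power-industry application.
import random

def evolve(fitness, n_bits=16, pop_size=30, generations=60,
           p_cross=0.9, p_mut=0.02, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]

    def tournament():
        a, b = rng.choice(pop), rng.choice(pop)
        return a if fitness(a) >= fitness(b) else b

    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = tournament()[:], tournament()[:]
            if rng.random() < p_cross:                  # one-point crossover
                cut = rng.randrange(1, n_bits)
                p1, p2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            for child in (p1, p2):
                for i in range(n_bits):                 # bit-flip mutation
                    if rng.random() < p_mut:
                        child[i] ^= 1
                nxt.append(child)
        pop = nxt[:pop_size]
    return max(pop, key=fitness)

best = evolve(fitness=sum)   # one-max: fitness is the bit count
```

Swapping `fitness=sum` for a function that scores, say, candidate rule sets of an expert system is exactly the kind of application the article suggests.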

  12. A new tool for accelerator system modeling and analysis

    International Nuclear Information System (INIS)

    Gillespie, G.H.; Hill, B.W.; Jameson, R.A.

    1994-01-01

    A novel computer code is being developed to generate system level designs of radiofrequency ion accelerators. The goal of the Accelerator System Model (ASM) code is to create a modeling and analysis tool that is easy to use, automates many of the initial design calculations, supports trade studies used in assessing alternate designs and yet is flexible enough to incorporate new technology concepts as they emerge. Hardware engineering parameters and beam dynamics are modeled at comparable levels of fidelity. Existing scaling models of accelerator subsystems were sued to produce a prototype of ASM (version 1.0) working within the Shell for Particle Accelerator Related Codes (SPARC) graphical user interface. A small user group has been testing and evaluating the prototype for about a year. Several enhancements and improvements are now being developed. The current version (1.1) of ASM is briefly described and an example of the modeling and analysis capabilities is illustrated

  13. Surviving the present: Modeling tools for organizational change

    International Nuclear Information System (INIS)

    Pangaro, P.

    1992-01-01

    The nuclear industry, like the rest of modern American business, is beset by a confluence of economic, technological, competitive, regulatory, and political pressures. For better or worse, business schools and management consultants have leapt to the rescue, offering the most modern conveniences that they can purvey. Recent advances in the study of organizations have led to new tools for their analysis, revision, and repair. There are two complementary tools that do not impose values or injunctions in themselves. One, called the organization modeler, captures the hierarchy of purposes that organizations and their subparts carry out. Any deficiency or pathology is quickly illuminated, and requirements for repair are made clear. The second, called THOUGHTSTICKER, is used to capture the semantic content of the conversations that occur across the interactions of parts of an organization. The distinctions and vocabulary in the language of an organization, and the relations within that domain, are elicited from the participants so that all three are available for debate and refinement. The product of the applications of these modeling tools is not the resulting models but rather the enhancement of the organization as a consequence of the process of constructing them

  14. Modelling stillbirth mortality reduction with the Lives Saved Tool

    Directory of Open Access Journals (Sweden)

    Hannah Blencowe

    2017-11-01

    Full Text Available Abstract Background The worldwide burden of stillbirths is large, with an estimated 2.6 million babies stillborn in 2015 including 1.3 million dying during labour. The Every Newborn Action Plan set a stillbirth target of ≤12 per 1000 in all countries by 2030. Planning tools will be essential as countries set policy and plan investment to scale up interventions to meet this target. This paper summarises the approach taken for modelling the impact of scaling-up health interventions on stillbirths in the Lives Saved tool (LiST), and potential future refinements. Methods The specific application to stillbirths of the general method for modelling the impact of interventions in LiST is described. The evidence for the effectiveness of potential interventions to reduce stillbirths is reviewed, and the assumptions on the affected fraction of stillbirths who could potentially benefit from these interventions are presented. The current assumptions and their effects on stillbirth reduction are described and potential future improvements discussed. Results High quality evidence is not available for all parameters in the LiST stillbirth model. Cause-specific mortality data are not available for stillbirths; therefore, stillbirths are modelled in LiST using an attributable fraction approach by timing of stillbirths (antepartum/intrapartum). Of 35 potential interventions to reduce stillbirths identified, eight interventions are currently modelled in LiST. These include childbirth care, induction for prolonged pregnancy, multiple micronutrient and balanced energy supplementation, malaria prevention and detection and management of hypertensive disorders of pregnancy, diabetes and syphilis. For three of the interventions (childbirth care, detection and management of hypertensive disorders of pregnancy, and diabetes), the estimate of effectiveness is based on expert opinion through a Delphi process. Only for malaria is coverage information available, with coverage
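The attributable-fraction accounting described above reduces, in its simplest form, to a product of baseline burden, affected fraction, intervention effectiveness and coverage change. The sketch below is a simplified illustration with invented numbers, not the LiST engine:

```python
# Simplified LiST-style accounting for one intervention in one country-year.
def stillbirths_averted(baseline, affected_fraction, effectiveness,
                        coverage_before, coverage_after):
    """Averted deaths = burden x reachable fraction x efficacy x coverage gain."""
    return (baseline * affected_fraction * effectiveness
            * (coverage_after - coverage_before))

baseline = 10_000               # annual stillbirths (hypothetical)
averted = stillbirths_averted(
    baseline,
    affected_fraction=0.5,      # e.g. the intrapartum share the intervention reaches
    effectiveness=0.4,          # relative reduction (trial or Delphi estimate)
    coverage_before=0.3,
    coverage_after=0.8,
)
# If the baseline rate were 12 per 1000, the post-scale-up rate would be:
rate_after = (baseline - averted) / baseline * 12.0
```

A full projection sums such terms over interventions and years while guarding against double-counting, which is where the real LiST machinery differs from this sketch.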

  15. Producing Distribution Maps for a Spatially-Explicit Ecosystem Model Using Large Monitoring and Environmental Databases and a Combination of Interpolation and Extrapolation

    Directory of Open Access Journals (Sweden)

    Arnaud Grüss

    2018-01-01

    Full Text Available To be able to simulate spatial patterns of predator-prey interactions, many spatially-explicit ecosystem modeling platforms, including Atlantis, need to be provided with distribution maps defining the annual or seasonal spatial distributions of functional groups and life stages. We developed a methodology combining extrapolation and interpolation of the predictions made by statistical habitat models to produce distribution maps for the fish and invertebrates represented in the Atlantis model of the Gulf of Mexico (GOM) Large Marine Ecosystem (LME) (“Atlantis-GOM”). This methodology consists of: (1) compiling a large monitoring database, gathering all the fisheries-independent and fisheries-dependent data collected in the northern (U.S.) GOM since 2000; (2) compiling a large environmental database, storing all the environmental parameters known to influence the spatial distribution patterns of fish and invertebrates of the GOM; (3) fitting binomial generalized additive models (GAMs) to the large monitoring and environmental databases, and geostatistical binomial generalized linear mixed models (GLMMs) to the large monitoring database; and (4) employing GAM predictions to infer spatial distributions in the southern GOM, and GLMM predictions to infer spatial distributions in the U.S. GOM. Thus, our methodology allows for reasonable extrapolation in the southern GOM based on a large amount of monitoring and environmental data, and for interpolation in the U.S. GOM accurately reflecting the probability of encountering fish and invertebrates in that region. We used an iterative cross-validation procedure to validate GAMs. When a GAM did not pass the validation test, we employed a GAM for a related functional group/life stage to generate distribution maps for the southern GOM. In addition, no geostatistical GLMMs were fit for the functional groups and life stages whose depth, longitudinal and latitudinal ranges within the U.S. GOM are not entirely covered by
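Step (3) fits binomial GAMs and GLMMs; as a pure-Python stand-in for that idea, the sketch below fits a plain logistic regression of presence/absence on a single invented environmental covariate (depth) by gradient descent, then predicts encounter probabilities at new locations. The real workflow uses smooth terms and spatial random effects, which this deliberately omits.

```python
# Logistic-regression stand-in for a binomial habitat model; data invented.
import math

def fit_logistic(xs, ys, lr=0.5, steps=5000):
    b0, b1 = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += (p - y) / n          # gradient of the mean log-loss
            g1 += (p - y) * x / n
        b0 -= lr * g0
        b1 -= lr * g1
    return b0, b1

def predict(b0, b1, x):
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))

depth_m = [5, 10, 15, 20, 30, 40, 60, 80, 100, 120]   # hypothetical survey
present = [1, 1, 1, 1, 1, 0, 0, 0, 0, 0]              # presence/absence
xs = [d / 100.0 for d in depth_m]                     # scale the covariate
b0, b1 = fit_logistic(xs, present)
p_shallow, p_deep = predict(b0, b1, 0.10), predict(b0, b1, 1.00)
```

Predicting on a grid of covariate values is the interpolation/extrapolation step; the methodology's caution about ranges "not entirely covered by" the data is exactly a warning against extrapolating such a fit too far.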

  16. Information Theoretic Tools for Parameter Fitting in Coarse Grained Models

    KAUST Repository

    Kalligiannaki, Evangelia

    2015-01-07

    We study the application of information theoretic tools for model reduction in the case of systems driven by stochastic dynamics out of equilibrium. The model/dimension reduction is considered by proposing parametrized coarse grained dynamics and finding the optimal parameter set for which the relative entropy rate with respect to the atomistic dynamics is minimized. The minimization problem leads to a generalization of the force matching methods to non equilibrium systems. A multiplicative noise example reveals the importance of the diffusion coefficient in the optimization problem.
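The optimisation described, choosing coarse-grained parameters that minimise a relative entropy, can be shown in a deliberately tiny setting. Here both the reference and the coarse model are unit-variance Gaussians, for which KL(p || q_theta) = (mu_p - theta)^2 / 2, so a brute-force search recovers theta = mu_p. The numbers are illustrative and none of this reflects the paper's nonequilibrium formalism:

```python
# Toy relative-entropy minimisation over a one-parameter coarse model.
def kl_gaussians_unit_var(mu_p, theta):
    """KL between N(mu_p, 1) and N(theta, 1)."""
    return 0.5 * (mu_p - theta) ** 2

mu_p = 1.3                                   # "atomistic" reference mean
thetas = [i / 100.0 for i in range(-300, 301)]
best = min(thetas, key=lambda th: kl_gaussians_unit_var(mu_p, th))
# The minimiser matches the reference mean, the analogue of a matched force field.
```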

  17. An ensemble model of QSAR tools for regulatory risk assessment.

    Science.gov (United States)

    Pradeep, Prachi; Povinelli, Richard J; White, Shannon; Merrill, Stephen J

    2016-01-01

    Quantitative structure activity relationships (QSARs) are theoretical models that relate a quantitative measure of chemical structure to a physical property or a biological effect. QSAR predictions can be used for chemical risk assessment for protection of human and environmental health, which makes them interesting to regulators, especially in the absence of experimental data. For compatibility with regulatory use, QSAR models should be transparent, reproducible and optimized to minimize the number of false negatives. In silico QSAR tools are gaining wide acceptance as a faster alternative to otherwise time-consuming clinical and animal testing methods. However, different QSAR tools often make conflicting predictions for a given chemical and may also vary in their predictive performance across different chemical datasets. In a regulatory context, conflicting predictions raise interpretation, validation and adequacy concerns. To address these concerns, ensemble learning techniques in the machine learning paradigm can be used to integrate predictions from multiple tools. By leveraging various underlying QSAR algorithms and training datasets, the resulting consensus prediction should yield better overall predictive ability. We present a novel ensemble QSAR model using Bayesian classification. The model allows a cut-off parameter to be varied, selecting the desired trade-off between model sensitivity and specificity. The predictive performance of the ensemble model is compared with four in silico tools (Toxtree, Lazar, OECD Toolbox, and Danish QSAR) to predict carcinogenicity for a dataset of air toxins (332 chemicals) and a subset of the gold carcinogenic potency database (480 chemicals). Leave-one-out cross validation results show that the ensemble model achieves the best trade-off between sensitivity and specificity (accuracy: 83.8 % and 80.4 %, and balanced accuracy: 80.6 % and 80.8 %) and highest inter-rater agreement [kappa (κ): 0
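One simple way to realise such an ensemble, shown below as a hedged sketch rather than the paper's exact Bayesian model, is to weight each tool's binary call by its log likelihood ratio (computed from assumed per-tool sensitivity and specificity) and compare the summed score against a tunable cutoff. Lowering the cutoff trades specificity for sensitivity, i.e. fewer false negatives, which is the regulatory priority the abstract names.

```python
# Naive-Bayes-style vote over conflicting tool predictions; stats invented.
import math

def log_lr(sens, spec, predicted_positive):
    """Log likelihood ratio contributed by one tool's binary call."""
    if predicted_positive:
        return math.log(sens / (1.0 - spec))
    return math.log((1.0 - sens) / spec)

def ensemble_score(calls, tool_stats):
    return sum(log_lr(s, sp, c) for c, (s, sp) in zip(calls, tool_stats))

# Hypothetical (sensitivity, specificity) for three tools:
tool_stats = [(0.80, 0.70), (0.70, 0.80), (0.60, 0.90)]
calls = [True, True, False]          # the tools' calls for one chemical
score = ensemble_score(calls, tool_stats)

lenient, strict = -1.0, 2.0          # cutoffs: lower favours sensitivity
label_lenient = score >= lenient
label_strict = score >= strict
```

Sweeping the cutoff over the scored validation set traces out the sensitivity/specificity curve from which a regulator can pick an operating point.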

  18. Models and Modelling Tools for Chemical Product and Process Design

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    2016-01-01

    The design, development and reliability of a chemical product and the process to manufacture it, need to be consistent with the end-use characteristics of the desired product. One of the common ways to match the desired product-process characteristics is through trial and error based experiments......-based framework is that in the design, development and/or manufacturing of a chemical product-process, the knowledge of the applied phenomena together with the product-process design details can be provided with diverse degrees of abstractions and details. This would allow the experimental resources...... to be employed for validation and fine-tuning of the solutions from the model-based framework, thereby, removing the need for trial and error experimental steps. Also, questions related to economic feasibility, operability and sustainability, among others, can be considered in the early stages of design. However...

  19. Modeling as a tool for process control: alcoholic fermentation

    Energy Technology Data Exchange (ETDEWEB)

    Tayeb, A M; Ashour, I A; Mostafa, N A [El-Minia Univ. (EG). Faculty of Engineering

    1991-01-01

    The results of the alcoholic fermentation of beet sugar molasses and wheat milling residues (Akalona) were fed into a computer program. Consequently, the kinetic parameters for these fermentation reactions were determined. These parameters were put into a kinetic model. Next, the model was tested, and the results obtained were compared with the experimental results of both beet molasses and Akalona. The deviation of the experimental results from the results obtained from the model was determined. An acceptable deviation of 1.2% for beet sugar molasses and 3.69% for Akalona was obtained. Thus, the present model could be a tool for chemical engineers working in fermentation processes both with respect to the control of the process and the design of the fermentor. (Author).
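A kinetic model of the sort fitted here is typically built on Monod growth with a constant biomass yield. The sketch below uses invented parameters, not the values fitted for beet molasses or Akalona, and integrates the balances with a simple Euler step:

```python
# Toy Monod fermentation model: biomass x grows on substrate s.
def ferment(hours, x0=0.5, s0=100.0, mu_max=0.3, ks=5.0, yield_xs=0.45, dt=0.1):
    x, s = x0, s0
    for _ in range(int(hours / dt)):
        mu = mu_max * s / (ks + s)                 # Monod specific growth rate
        growth = min(mu * x * dt, s * yield_xs)    # capped by remaining substrate
        x += growth
        s -= growth / yield_xs                     # substrate consumed via yield
    return x, s

x_end, s_end = ferment(24.0)
# Biomass gained and substrate consumed stay linked through the yield
# coefficient, the mass balance a fitted model must respect.
```

Fitting mu_max, ks and the yield to measured time courses, then checking the percentage deviation from experiment, mirrors the procedure the abstract reports.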

  20. Logic flowgraph methodology - A tool for modeling embedded systems

    Science.gov (United States)

    Muthukumar, C. T.; Guarro, S. B.; Apostolakis, G. E.

    1991-01-01

    The logic flowgraph methodology (LFM), a method for modeling hardware in terms of its process parameters, has been extended to form an analytical tool for the analysis of integrated (hardware/software) embedded systems. In the software part of a given embedded system model, timing and the control flow among different software components are modeled by augmenting LFM with modified Petri net structures. The objective of the use of such an augmented LFM model is to uncover possible errors and the potential for unanticipated software/hardware interactions. This is done by backtracking through the augmented LFM model according to established procedures which allow the semiautomated construction of fault trees for any chosen state of the embedded system (top event). These fault trees, in turn, produce the possible combinations of lower-level states (events) that may lead to the top event.
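The backtracking step can be illustrated on a toy AND/OR causality graph: starting from a chosen top event, recursion enumerates the combinations of basic events (cut sets) that can produce it. The system below is hypothetical and far simpler than an LFM model:

```python
# Simplified fault-tree backtracking over an AND/OR graph; system invented.
gates = {
    # node: ("AND" | "OR", [inputs]); nodes absent from this dict are basic events
    "system_fail": ("OR", ["hw_fault", "sw_path"]),
    "sw_path": ("AND", ["timing_overrun", "bad_input"]),
}

def cut_sets(node):
    if node not in gates:                      # basic event: a singleton cut set
        return [{node}]
    kind, inputs = gates[node]
    if kind == "OR":                           # any input alone suffices
        return [cs for child in inputs for cs in cut_sets(child)]
    combos = [set()]                           # AND: all inputs must combine
    for child in inputs:
        combos = [c | cs for c in combos for cs in cut_sets(child)]
    return combos

sets = cut_sets("system_fail")
# Two cut sets: the hardware fault alone, or both software events together.
```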

  1. Advances in Intelligent Modelling and Simulation Simulation Tools and Applications

    CERN Document Server

    Oplatková, Zuzana; Carvalho, Marco; Kisiel-Dorohinicki, Marek

    2012-01-01

    The human capacity to abstract complex systems and phenomena into simplified models has played a critical role in the rapid evolution of our modern industrial processes and scientific research. As a science and an art, Modelling and Simulation have been one of the core enablers of this remarkable human trait, and have become a topic of great importance for researchers and practitioners. This book was created to compile some of the most recent concepts, advances, challenges and ideas associated with Intelligent Modelling and Simulation frameworks, tools and applications. The first chapter discusses the important aspects of human interaction and the correct interpretation of results during simulations. The second chapter gets to the heart of the analysis of entrepreneurship by means of agent-based modelling and simulations. The following three chapters bring together the central theme of simulation frameworks, first describing an agent-based simulation framework, then a simulator for electrical machines, and...

  2. Minimizing Erosion and Agro-Pollutants Transport from Furrow Irrigated Fields to the Nearby Water Body Using Spatially-Explicit Agent Based Model and Decision Optimization Platform

    Science.gov (United States)

    Ghoveisi, H.; Al Dughaishi, U.; Kiker, G.

    2017-12-01

    Maintaining water quality in agricultural watersheds is a worldwide challenge, especially where furrow irrigation is being practiced. The Yakima River Basin watershed in south central Washington State, (USA) is an example of these impacted areas with elevated load of sediments and other agricultural products due to runoff from furrow-irrigated fields. Within the Yakima basin, the Granger Drain watershed (area of 75 km2) is particularly challenged in this regard with more than 400 flood-irrigated individual parcels (area of 21 km2) growing a variety of crops from maize to grapes. Alternatives for improving water quality from furrow-irrigated parcels include vegetated filter strip (VFS) implementation, furrow water application efficiency, polyacrylamide (PAM) application and irrigation scheduling. These alternatives were simulated separately and in combinations to explore potential Best Management Practices (BMPs) for runoff-related-pollution reduction in a spatially explicit, agent based modeling system (QnD:GrangerDrain). Two regulatory scenarios were tested for BMP adoption within individual parcels. A blanket-style regulatory scenario simulated a total of 60 BMP combinations implemented in all 409 furrow-irrigated parcels. A second regulatory scenario simulated the BMPs in 119 furrow-irrigated parcels designated as "hotspots" based on a standard 12 Mg ha-1 seasonal sediment load. The simulated cumulative runoff and sediment loading from all BMP alternatives were ranked using Multiple Criteria Decision Analysis (MCDA), specifically the Stochastic Multi-Attribute Acceptability Analysis (SMAA) method. Several BMP combinations proved successful in reducing loads below a 25 NTU (91 mg L-1) regulatory sediment concentration. The QnD:GrangerDrain simulations and subsequent MCDA ranking revealed that the BMP combinations of 5 m-VFS and high furrow water efficiency were highly ranked alternatives for both the blanket and hotspot scenarios.
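The SMAA step can be sketched as a small Monte Carlo: sample criterion weights uniformly from the simplex, score each alternative as a weighted sum, and count how often each one ranks first (its rank-1 acceptability). The alternatives and utility scores below are invented, not the QnD:GrangerDrain results:

```python
# Minimal SMAA-style rank-1 acceptability; three criteria, invented utilities
# where higher is better (less runoff, less sediment, lower cost).
import math
import random

alternatives = {
    "5m_VFS+high_efficiency": (0.9, 0.9, 0.5),
    "PAM_only":               (0.5, 0.7, 0.8),
    "no_action":              (0.1, 0.1, 1.0),
}

def rank1_acceptability(alts, samples=20000, seed=7):
    rng = random.Random(seed)
    wins = {name: 0 for name in alts}
    for _ in range(samples):
        raw = [-math.log(1.0 - rng.random()) for _ in range(3)]
        total = sum(raw)
        w = [r / total for r in raw]          # uniform on the weight simplex
        best = max(alts, key=lambda n: sum(wi * xi for wi, xi in zip(w, alts[n])))
        wins[best] += 1
    return {n: c / samples for n, c in wins.items()}

acc = rank1_acceptability(alternatives)
```

An alternative that wins across most sampled weightings, as the VFS-plus-efficiency combination does here, is "highly ranked" in the sense the abstract uses.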

  3. Modeling in the Classroom: An Evolving Learning Tool

    Science.gov (United States)

    Few, A. A.; Marlino, M. R.; Low, R.

    2006-12-01

    Among the early programs (early 1990s) focused on teaching Earth System Science were the Global Change Instruction Program (GCIP) funded by NSF through UCAR and the Earth System Science Education Program (ESSE) funded by NASA through USRA. These two programs introduced modeling as a learning tool from the beginning, and they provided workshops, demonstrations and lectures for their participating universities. These programs were aimed at university-level education. Recently, classroom modeling is experiencing a revival of interest. Drs John Snow and Arthur Few conducted two workshops on modeling at the ESSE21 meeting in Fairbanks, Alaska, in August 2005. The Digital Library for Earth System Education (DLESE) at http://www.dlese.org provides web access to STELLA models and tutorials, and UCAR's Education and Outreach (EO) program holds workshops that include training in modeling. An important innovation to the STELLA modeling software by isee systems, http://www.iseesystems.com, is "isee Player", available as a free download. The Player allows users to view and run STELLA models, change model parameters, share models with colleagues and students, and make working models available on the web. This is important because the expert can create models, and the user can learn how the modeled system works. Another aspect of this innovation is that the educational benefits of modeling concepts can be extended throughout most of the curriculum. The procedure for building a working computer model of an Earth Science System follows this general format: (1) carefully define the question(s) for which you seek the answer(s); (2) identify the interacting system components and inputs contributing to the system's behavior; (3) collect the information and data that will be required to complete the conceptual model; (4) construct a system diagram (graphic) of the system that displays all of the system's central questions, components, relationships and required inputs.
At this stage
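The four-step procedure above maps naturally onto a stock-and-flow simulation. Below is a minimal, hypothetical sketch of the kind of model a STELLA diagram encodes: a single stock driven by a constant inflow and an outflow proportional to the stock, integrated with forward Euler. All names and numbers are illustrative, not taken from the GCIP/ESSE materials.

```python
# Minimal stock-and-flow simulation in the STELLA style (hypothetical
# example): one stock with a constant inflow and a proportional outflow.

def simulate_stock(initial, inflow, uptake_rate, dt=1.0, steps=100):
    """Euler-integrate dS/dt = inflow - uptake_rate * S."""
    stock = initial
    history = [stock]
    for _ in range(steps):
        stock += dt * (inflow - uptake_rate * stock)
        history.append(stock)
    return history

# The stock relaxes toward the equilibrium inflow / uptake_rate = 500.
trajectory = simulate_stock(initial=800.0, inflow=10.0, uptake_rate=0.02)
```

Tools like STELLA generate exactly this kind of update loop from the graphical system diagram, so the modeler works with the diagram rather than the code.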

  4. Analysis of Sequence Diagram Layout in Advanced UML Modelling Tools

    Directory of Open Access Journals (Sweden)

    Ņikiforova Oksana

    2016-05-01

    Full Text Available System modelling using the Unified Modelling Language (UML) is a task that must be solved during software development. As software becomes more complex, the requirements for demonstrating the system to be developed grow, especially for its dynamic aspect, which UML captures in the sequence diagram. To solve this task, the main attention is devoted to the graphical presentation of the system, where diagram layout plays the central role in information perception. The UML sequence diagram, due to its specific structure, is selected for a deeper analysis of element layout. The authors' research examines the abilities of modern UML modelling tools to lay out UML sequence diagrams automatically and analyses them against criteria required for diagram perception.

  5. MODERN TOOLS FOR MODELING ACTIVITY IT-COMPANIES

    Directory of Open Access Journals (Sweden)

    Марина Петрівна ЧАЙКОВСЬКА

    2015-05-01

    Full Text Available Increasing competition in the market for web-based applications raises the importance of service quality and of optimising the processes of interaction with customers. The purpose of the article is to develop recommendations for improving the business processes of IT enterprises in the web-application segment based on technological tools for business modelling; to shape requirements for the development of an information system (IS) for customer interaction; and to analyse effective means of implementation and evaluate the economic effect of introduction. A scheme of the business process of developing and launching a website was built based on an analysis of business-process models and "swim lane" models, and requirements for a customer-relationship-management IS for a web studio were established. The market of software for creating such an IS was analysed, and the products matching the requirements were selected. The IS was developed, tested, and implemented in the company, and the economic effect of its introduction was appraised.

  6. Tools and Methods for RTCP-Nets Modeling and Verification

    Directory of Open Access Journals (Sweden)

    Szpyrka Marcin

    2016-09-01

    Full Text Available RTCP-nets are high-level Petri nets similar to timed colored Petri nets, but with a different time model and some structural restrictions. The paper deals with practical aspects of using RTCP-nets for modelling and verification of real-time systems. It contains a survey of software tools developed to support RTCP-nets. Verification of RTCP-nets is based on coverability graphs, which represent the set of reachable states in the form of a directed graph. Two approaches to verification of RTCP-nets are considered in the paper. The former is oriented towards states and is based on translating a coverability graph into a nuXmv (NuSMV) finite-state model. The latter is oriented towards transitions and uses the CADP toolkit to check whether requirements given as μ-calculus formulae hold for a given coverability graph. All presented concepts are discussed using illustrative examples.
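A reachability graph of the kind the verification approaches above start from can be sketched for a plain place/transition net. This is a much-simplified stand-in for RTCP-net coverability graphs: no colours, no time model, and no omega-acceleration for unbounded places.

```python
# Reachability-graph construction for a plain place/transition net --
# a simplified stand-in for the coverability graphs used with RTCP-nets.

def enabled(marking, pre):
    return all(marking[p] >= n for p, n in pre.items())

def fire(marking, pre, post):
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

def reachability_graph(initial, transitions):
    """transitions: {name: (pre, post)}; returns reachable states and edges."""
    frontier = [initial]
    seen = {tuple(sorted(initial.items()))}
    edges = []
    while frontier:
        m = frontier.pop()
        for name, (pre, post) in transitions.items():
            if enabled(m, pre):
                m2 = fire(m, pre, post)
                key = tuple(sorted(m2.items()))
                edges.append((m, name, m2))
                if key not in seen:
                    seen.add(key)
                    frontier.append(m2)
    return seen, edges

# A two-place net: transition t moves the single token from p1 to p2.
states, edges = reachability_graph(
    {"p1": 1, "p2": 0},
    {"t": ({"p1": 1}, {"p2": 1})},
)
```

The state-oriented approach described in the paper would then translate such a graph into a nuXmv model; the transition-oriented one would feed the labelled edges to CADP.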

  7. Dynamic wind turbine models in power system simulation tool

    DEFF Research Database (Denmark)

    Hansen, Anca D.; Iov, Florin; Sørensen, Poul

    This report presents a collection of models and control strategies developed and implemented in the power system simulation tool PowerFactory DIgSILENT for different wind turbine concepts. It is the second edition of Risø-R-1400(EN), and it gathers and describes a whole wind turbine model database … connection of the wind turbine at different types of grid and storage systems. Different control strategies have been developed and implemented for these wind turbine concepts, their performance in normal or fault operation being assessed and discussed by means of simulations. The described control … of the interaction between the mechanical structure of the wind turbine and the electrical grid during different operational modes. The report thus provides a description of the wind turbine modelling, both at a component level and at a system level. The report contains both the description of DIgSILENT built…

  8. Mathematical modeling of physiological systems: an essential tool for discovery.

    Science.gov (United States)

    Glynn, Patric; Unudurthi, Sathya D; Hund, Thomas J

    2014-08-28

    Mathematical models are invaluable tools for understanding the relationships between components of a complex system. In the biological context, mathematical models help us understand the complex web of interrelations between various components (DNA, proteins, enzymes, signaling molecules etc.) in a biological system, gain better understanding of the system as a whole, and in turn predict its behavior in an altered state (e.g. disease). Mathematical modeling has enhanced our understanding of multiple complex biological processes like enzyme kinetics, metabolic networks, signal transduction pathways, gene regulatory networks, and electrophysiology. With recent advances in high throughput data generation methods, computational techniques and mathematical modeling have become even more central to the study of biological systems. In this review, we provide a brief history and highlight some of the important applications of modeling in biological systems with an emphasis on the study of excitable cells. We conclude with a discussion about opportunities and challenges for mathematical modeling going forward. In a larger sense, the review is designed to help answer a simple but important question that theoreticians frequently face from interested but skeptical colleagues on the experimental side: "What is the value of a model?" Copyright © 2014 Elsevier Inc. All rights reserved.
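The review's emphasis on excitable cells can be made concrete with the classic FitzHugh-Nagumo model, a two-variable reduction of the Hodgkin-Huxley equations. The sketch below integrates it with forward Euler; the parameter values are conventional textbook choices, not taken from the review.

```python
# FitzHugh-Nagumo excitable-cell model integrated with forward Euler.
# dv/dt = v - v^3/3 - w + I ;  dw/dt = (v + a - b*w) / tau
# With I = 0.5 the model sits in its oscillatory (spiking) regime.

def fitzhugh_nagumo(I=0.5, a=0.7, b=0.8, tau=12.5, dt=0.01, steps=30000):
    v, w = -1.0, 1.0
    vs = []
    for _ in range(steps):
        dv = v - v**3 / 3 - w + I
        dw = (v + a - b * w) / tau
        v += dt * dv
        w += dt * dw
        vs.append(v)
    return vs

vs = fitzhugh_nagumo()  # membrane-potential-like trace with repeated spikes
```

Even this two-equation caricature reproduces the qualitative behaviour (threshold, spike, refractory return) that the review credits mathematical models with explaining.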

  9. The Role of Explicit and Implicit Self-Esteem in Peer Modeling of Palatable Food Intake: A Study on Social Media Interaction among Youngsters

    NARCIS (Netherlands)

    Bevelander, K.E.; Anschutz, D.J.; Creemers, D.H.M.; Kleinjan, M.; Engels, R.C.M.E.

    2013-01-01

    Objective: This experimental study investigated the impact of peers on palatable food intake of youngsters within a social media setting. To determine whether this effect was moderated by self-esteem, the present study examined the roles of global explicit self-esteem (ESE), body esteem (BE) and

  10. Materials modelling - a possible design tool for advanced nuclear applications

    International Nuclear Information System (INIS)

    Hoffelner, W.; Samaras, M.; Bako, B.; Iglesias, R.

    2008-01-01

    The design of components for power plants is usually based on codes, standards and design rules or code cases. However, it is very difficult to get the necessary experimental data to prove these lifetime assessment procedures for long-term applications in environments where complex damage interactions (temperature, stress, environment, irradiation) can occur. The rules used are often very simple and do not have a basis which takes physical damage into consideration. The linear life-fraction rule for creep and fatigue interaction can be taken as a prominent example. Materials modelling based on a multi-scale approach in principle provides a tool to convert microstructural findings into mechanical response and therefore has the capability of providing a set of tools for the improvement of design life assessments. The strength of current multi-scale modelling efforts is the insight they offer as regards experimental phenomena. To obtain an understanding of these phenomena it is important to focus on issues which are important at the various time and length scales of the modelling code. In this presentation the multi-scale path will be demonstrated with a few recent examples which focus on VHTR applications. (authors)

  11. Integrated modeling tool for performance engineering of complex computer systems

    Science.gov (United States)

    Wright, Gary; Ball, Duane; Hoyt, Susan; Steele, Oscar

    1989-01-01

    This report summarizes Advanced System Technologies' accomplishments on the Phase 2 SBIR contract NAS7-995. The technical objectives of the report are: (1) to develop an evaluation version of a graphical, integrated modeling language according to the specification resulting from the Phase 2 research; and (2) to determine the degree to which the language meets its objectives by evaluating ease of use, utility of two sets of performance predictions, and the power of the language constructs. The technical approach followed to meet these objectives was to design, develop, and test an evaluation prototype of a graphical, performance prediction tool. The utility of the prototype was then evaluated by applying it to a variety of test cases found in the literature and in AST case histories. Numerous models were constructed and successfully tested. The major conclusion of this Phase 2 SBIR research and development effort is that complex, real-time computer systems can be specified in a non-procedural manner using combinations of icons, windows, menus, and dialogs. Such a specification technique provides an interface that system designers and architects find natural and easy to use. In addition, PEDESTAL's multiview approach provides system engineers with the capability to perform the trade-offs necessary to produce a design that meets timing performance requirements. Sample system designs analyzed during the development effort showed that models could be constructed in a fraction of the time required by non-visual system design capture tools.

  12. Performance Analysis, Modeling and Scaling of HPC Applications and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bhatele, Abhinav [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-01-13

    Efficient use of supercomputers at DOE centers is vital for maximizing system throughput, minimizing energy costs and enabling science breakthroughs faster. This requires complementary efforts along several directions to optimize the performance of scientific simulation codes and the underlying runtimes and software stacks. This in turn requires providing scalable performance analysis tools and modeling techniques that can provide feedback to physicists and computer scientists developing the simulation codes and runtimes respectively. The PAMS project is using time allocations on supercomputers at ALCF, NERSC and OLCF to further the goals described above by performing research along the following fronts: 1. Scaling Study of HPC applications; 2. Evaluation of Programming Models; 3. Hardening of Performance Tools; 4. Performance Modeling of Irregular Codes; and 5. Statistical Analysis of Historical Performance Data. We are a team of computer and computational scientists funded by both DOE/NNSA and DOE/ASCR programs such as ECRP, XStack (Traleika Glacier, PIPER), ExaOSR (ARGO), SDMAV II (MONA) and PSAAP II (XPACC). This allocation will enable us to study big data issues when analyzing performance on leadership computing class systems and to assist the HPC community in making the most effective use of these resources.

  13. Landscape capability models as a tool to predict fine-scale forest bird occupancy and abundance

    Science.gov (United States)

    Loman, Zachary G.; DeLuca, William; Harrison, Daniel J.; Loftin, Cynthia S.; Rolek, Brian W.; Wood, Petra B.

    2018-01-01

    Context: Species-specific models of landscape capability (LC) can inform landscape conservation design. Landscape capability is “the ability of the landscape to provide the environment […] and the local resources […] needed for survival and reproduction […] in sufficient quantity, quality and accessibility to meet the life history requirements of individuals and local populations.” Landscape capability incorporates species’ life histories, ecologies, and distributions to model habitat for current and future landscapes and climates as a proactive strategy for conservation planning. Objectives: We tested the ability of a set of LC models to explain variation in point occupancy and abundance for seven bird species representative of spruce-fir, mixed conifer-hardwood, and riparian and wooded wetland macrohabitats. Methods: We compiled point count data sets used for biological inventory, species monitoring, and field studies across the northeastern United States to create an independent validation data set. Our validation explicitly accounted for underestimation in validation data using joint distance and time removal sampling. Results: Blackpoll warbler (Setophaga striata), wood thrush (Hylocichla mustelina), and Louisiana (Parkesia motacilla) and northern waterthrush (P. noveboracensis) models were validated as predicting variation in abundance, although this varied from not biologically meaningful (1%) to strongly meaningful (59%). We verified all seven species models [including ovenbird (Seiurus aurocapilla), blackburnian (Setophaga fusca) and cerulean warbler (Setophaga cerulea)], as all were positively related to occupancy data. Conclusions: LC models represent a useful tool for conservation planning owing to their predictive ability over a regional extent. As improved remote-sensed data become available, LC layers are updated, which will improve predictions.

  14. Modeling energy technology choices. Which investment analysis tools are appropriate?

    International Nuclear Information System (INIS)

    Johnson, B.E.

    1994-01-01

    A variety of tools from modern investment theory appear to hold promise for unraveling observed energy technology investment behavior that often appears anomalous when analyzed using traditional investment analysis methods. This paper reviews the assumptions and important insights of the investment theories most commonly suggested as candidates for explaining the apparent ''energy technology investment paradox''. The applicability of each theory is considered in the light of important aspects of energy technology investment problems, such as sunk costs, uncertainty and imperfect information. The theories addressed include the capital asset pricing model, the arbitrage pricing theory, and the theory of irreversible investment. Enhanced net present value methods are also considered. (author)
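The baseline that all the "enhanced" methods discussed above extend is the ordinary net present value of a cash-flow stream. A minimal sketch, with an illustrative energy-efficiency project and an assumed 8% discount rate:

```python
# Net present value of a cash-flow stream: the baseline investment
# criterion that real-options and enhanced-NPV methods extend.

def npv(rate, cashflows):
    """cashflows[0] is at time 0 (e.g. the negative up-front investment)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Illustrative retrofit: 1000 invested up front, 300/year saved for 5 years.
project = npv(0.08, [-1000, 300, 300, 300, 300, 300])
```

The "energy technology investment paradox" shows up when observed behaviour implies hurdle rates far above such a market discount rate; the irreversible-investment theory reviewed in the paper attributes the gap to the option value of waiting, which plain NPV ignores.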

  15. Evaluation and comparison of models and modelling tools simulating nitrogen processes in treatment wetlands

    DEFF Research Database (Denmark)

    Edelfeldt, Stina; Fritzson, Peter

    2008-01-01

    In this paper, two ecological models of nitrogen processes in treatment wetlands have been evaluated and compared. These models were implemented, simulated, and visualized using the Modelica modelling and simulation language [P. Fritzson, Principles of Object-Oriented Modelling and Simulation with Modelica 2.1 (Wiley-IEEE Press, USA, 2004)] and an associated tool. The differences and similarities between the MathModelica Model Editor and three other ecological modelling tools have also been evaluated. The results show that the models can well be modelled and simulated in the MathModelica Model Editor, and that nitrogen decrease in a constructed treatment wetland should be described and simulated using the Nitrification/Denitrification model, as this model has the highest overall quality score and provides a more variable environment.

  16. CDPOP: A spatially explicit cost distance population genetics program

    Science.gov (United States)

    Erin L. Landguth; S. A. Cushman

    2010-01-01

    Spatially explicit simulation of gene flow in complex landscapes is essential to explain observed population responses and provide a foundation for landscape genetics. To address this need, we wrote a spatially explicit, individual-based population genetics model (CDPOP). The model implements individual-based population modelling with Mendelian inheritance and k-allele...
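The Mendelian, individual-based bookkeeping at the heart of CDPOP can be illustrated at a single locus with k alleles. This sketch deliberately omits the spatial/cost-distance component that is CDPOP's actual contribution; names and parameters are illustrative.

```python
# Individual-based Mendelian inheritance at one locus with k alleles,
# in the spirit of CDPOP's genetic bookkeeping (no spatial component).
import random

def mate(parent1, parent2, rng):
    """Each parent passes one of its two alleles at random (Mendelian)."""
    return (rng.choice(parent1), rng.choice(parent2))

def next_generation(population, rng):
    """Random mating with constant population size."""
    return [mate(rng.choice(population), rng.choice(population), rng)
            for _ in range(len(population))]

rng = random.Random(42)
k = 4  # k-allele model: alleles labelled 0..k-1
pop = [(rng.randrange(k), rng.randrange(k)) for _ in range(200)]
for _ in range(10):
    pop = next_generation(pop, rng)
```

In CDPOP, the random choice of mates would instead be weighted by cost-distance between individuals on the landscape, which is what produces isolation-by-distance patterns.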

  17. ExEP yield modeling tool and validation test results

    Science.gov (United States)

    Morgan, Rhonda; Turmon, Michael; Delacroix, Christian; Savransky, Dmitry; Garrett, Daniel; Lowrance, Patrick; Liu, Xiang Cate; Nunez, Paul

    2017-09-01

    EXOSIMS is an open-source simulation tool for parametric modeling of the detection yield and characterization of exoplanets. EXOSIMS has been adopted by the Exoplanet Exploration Program's Standards Definition and Evaluation Team (ExSDET) as a common mechanism for comparison of exoplanet mission concept studies. To ensure trustworthiness of the tool, we developed a validation test plan that leverages the Python-language unit-test framework, utilizes integration tests for selected module interactions, and performs end-to-end cross-validation with other yield tools. This paper presents the test methods and results, with the physics-based tests such as photometry and integration time calculation treated in detail and the functional tests treated summarily. The test case utilized a 4-m unobscured telescope with an idealized coronagraph and an exoplanet population from the IPAC radial velocity (RV) exoplanet catalog. The known RV planets were set at quadrature to allow deterministic validation of the calculation of physical parameters, such as working angle, photon counts and integration time. The observing keepout region was tested by generating plots and movies of the targets and the keepout zone over a year. Although the keepout integration test required the interpretation of a user, the test revealed problems in the L2 halo orbit and the parameterization of keepout applied to some solar system bodies, which the development team was able to address. The validation testing of EXOSIMS was performed iteratively with the developers of EXOSIMS and resulted in a more robust, stable, and trustworthy tool that the exoplanet community can use to simulate exoplanet direct-detection missions from probe class, to WFIRST, up to large mission concepts such as HabEx and LUVOIR.
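The kind of integration-time calculation validated above can be illustrated with the generic shot-noise relation SNR = C_p·t / √((C_p + C_b)·t), which inverts to t = SNR²·(C_p + C_b)/C_p². This is a textbook back-of-the-envelope relation, not EXOSIMS's actual (coronagraph-specific) calculation, and the count rates below are made up for illustration.

```python
# Generic shot-noise integration-time estimate:
#   SNR = C_p * t / sqrt((C_p + C_b) * t)  =>  t = SNR^2 * (C_p + C_b) / C_p^2
# C_p: planet photon count rate, C_b: background count rate (photons/s).
# NOT the EXOSIMS implementation -- an illustrative textbook relation only.

def integration_time(snr_target, planet_rate, background_rate):
    """Seconds of integration needed to reach snr_target."""
    return snr_target**2 * (planet_rate + background_rate) / planet_rate**2

# Illustrative rates: 0.01 planet photons/s against 0.05 background photons/s.
t = integration_time(snr_target=5.0, planet_rate=0.01, background_rate=0.05)
```

Deterministic formulas like this are exactly what makes the quadrature test case in the paper checkable: with the planets' parameters fixed, the expected photon counts and times can be computed independently of the tool.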

  18. Conceptual Models as Tools for Communication Across Disciplines

    Directory of Open Access Journals (Sweden)

    Marieke Heemskerk

    2003-12-01

    Full Text Available To better understand and manage complex social-ecological systems, social scientists and ecologists must collaborate. However, issues related to language and research approaches can make it hard for researchers in different fields to work together. This paper suggests that researchers can improve interdisciplinary science through the use of conceptual models as a communication tool. The authors share lessons from a workshop in which interdisciplinary teams of young scientists developed conceptual models of social-ecological systems using data sets and metadata from Long-Term Ecological Research sites across the United States. Both the process of model building and the models that were created are discussed. The exercise revealed that the presence of social scientists in a group influenced the place and role of people in the models. This finding suggests that the participation of both ecologists and social scientists in the early stages of project development may produce better questions and more accurate models of interactions between humans and ecosystems. Although the participants agreed that a better understanding of human intentions and behavior would advance ecosystem science, they felt that interdisciplinary research might gain more by training strong disciplinarians than by merging ecology and social sciences into a new field. It is concluded that conceptual models can provide an inspiring point of departure and a guiding principle for interdisciplinary group discussions. Jointly developing a model not only helped the participants to formulate questions, clarify system boundaries, and identify gaps in existing data, but also revealed the thoughts and assumptions of fellow scientists. Although the use of conceptual models will not serve all purposes, the process of model building can help scientists, policy makers, and resource managers discuss applied problems and theory among themselves and with those in other areas.

  19. Implicit, explicit and speculative knowledge

    NARCIS (Netherlands)

    van Ditmarsch, H.; French, T.; Velázquez-Quesada, F.R.; Wáng, Y.N.

    We compare different epistemic notions in the presence of awareness of propositional variables: the logic of implicit knowledge (in which explicit knowledge is definable), the logic of explicit knowledge, and the logic of speculative knowledge. Speculative knowledge is a novel epistemic notion that

  20. System dynamics models as decision-making tools in agritourism

    Directory of Open Access Journals (Sweden)

    Jere Jakulin Tadeja

    2016-12-01

    Full Text Available Agritourism, as a type of niche tourism, is a complex and softly defined phenomenon. The demand for fast, integrated decisions regarding agritourism and its interconnections with the environment, the economy (investments, traffic) and social factors (tourists) is urgent. Many different methodologies and methods address softly structured questions and dilemmas with global and local properties. Here we present the methods of systems thinking and system dynamics, which were first applied in education and training in the form of computer simulations and later as tools for decision-making and organisational re-engineering. We develop system dynamics models in order to demonstrate the accuracy of the methodology. These models are essentially simple and serve only to describe the activity of basic mutual influences among variables. We pay particular attention to the methodology for determining model parameter values and to the so-called mental model, which is the basis of the causal connections among model variables. At the end, we restore the connection between qualitative and quantitative models within the frame of system dynamics.
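A causal feedback loop of the kind a mental model encodes can be turned into a small system dynamics simulation. The hypothetical sketch below models visitor numbers that grow by word of mouth but are damped as utilisation of a fixed accommodation capacity rises; every parameter value is illustrative, not drawn from the article.

```python
# Toy system-dynamics model of agritourism demand: reinforcing
# word-of-mouth growth balanced by a capacity-utilisation loop,
# i.e. discrete logistic growth. All parameters are illustrative.

def simulate_visitors(v0=50.0, growth=0.3, capacity=1000.0, dt=1.0, steps=60):
    v = v0
    out = [v]
    for _ in range(steps):
        v += dt * growth * v * (1 - v / capacity)  # balancing feedback
        out.append(v)
    return out

series = simulate_visitors()  # rises from 50 and saturates near capacity
```

The value of writing the loop down is exactly what the abstract claims for system dynamics: the mental model ("more guests attract more guests, until the farm is full") becomes a quantitative, testable trajectory.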

  1. Implicit and explicit ethnocentrism: revisiting the ideologies of prejudice.

    Science.gov (United States)

    Cunningham, William A; Nezlek, John B; Banaji, Mahzarin R

    2004-10-01

    Two studies investigated relationships among individual differences in implicit and explicit prejudice, right-wing ideology, and rigidity in thinking. The first study examined these relationships focusing on White Americans' prejudice toward Black Americans. The second study provided the first test of implicit ethnocentrism and its relationship to explicit ethnocentrism by studying the relationship between attitudes toward five social groups. Factor analyses found support for both implicit and explicit ethnocentrism. In both studies, mean explicit attitudes toward out groups were positive, whereas implicit attitudes were negative, suggesting that implicit and explicit prejudices are distinct; however, in both studies, implicit and explicit attitudes were related (r = .37, .47). Latent variable modeling indicates a simple structure within this ethnocentric system, with variables organized in order of specificity. These results lead to the conclusion that (a) implicit ethnocentrism exists and (b) it is related to and distinct from explicit ethnocentrism.

  2. Explicit Finite Element Modeling of Multilayer Composite Fabric for Gas Turbine Engine Containment Systems, Phase II. Part 3; Material Model Development and Simulation of Experiments

    Science.gov (United States)

    Simmons, J.; Erlich, D.; Shockey, D.

    2009-01-01

    A team consisting of Arizona State University, Honeywell Engines, Systems & Services, the National Aeronautics and Space Administration Glenn Research Center, and SRI International collaborated to develop computational models and verification testing for designing and evaluating turbine engine fan blade fabric containment structures. This research was conducted under the Federal Aviation Administration Airworthiness Assurance Center of Excellence and was sponsored by the Aircraft Catastrophic Failure Prevention Program. The research was directed toward improving the modeling of a turbine engine fabric containment structure for an engine blade-out containment demonstration test required for certification of aircraft engines. The research conducted in Phase II began a new level of capability to design and develop fan blade containment systems for turbine engines. Significant progress was made in three areas: (1) further development of the ballistic fabric model to increase confidence and robustness in the material models for the Kevlar(TradeName) and Zylon(TradeName) material models developed in Phase I, (2) the capability was improved for finite element modeling of multiple layers of fabric using multiple layers of shell elements, and (3) large-scale simulations were performed. This report concentrates on the material model development and simulations of the impact tests.

  3. Modelling thermomechanical conditions at the tool/matrix interface in Friction Stir Welding

    DEFF Research Database (Denmark)

    Schmidt, Henrik Nikolaj Blich; Hattel, Jesper

    2004-01-01

    … is obtained. A fully coupled thermo-mechanical 3D FE model has been developed in ABAQUS/Explicit using the ALE formulation and the Johnson-Cook material law. The contact forces are modelled by Coulomb’s law of friction, making the contact condition highly solution dependent. The heat is generated by both …

  4. Introducing Modeling Transition Diagrams as a Tool to Connect Mathematical Modeling to Mathematical Thinking

    Science.gov (United States)

    Czocher, Jennifer A.

    2016-01-01

    This study contributes a methodological tool to reconstruct the cognitive processes and mathematical activities carried out by mathematical modelers. Represented as Modeling Transition Diagrams (MTDs), individual modeling routes were constructed for four engineering undergraduate students. Findings stress the importance and limitations of using…

  5. 3D-Printed Craniosynostosis Model: New Simulation Surgical Tool.

    Science.gov (United States)

    Ghizoni, Enrico; de Souza, João Paulo Sant Ana Santos; Raposo-Amaral, Cassio Eduardo; Denadai, Rafael; de Aquino, Humberto Belém; Raposo-Amaral, Cesar Augusto; Joaquim, Andrei Fernandes; Tedeschi, Helder; Bernardes, Luís Fernando; Jardini, André Luiz

    2018-01-01

    Craniosynostosis is a complex disease, as its treatment demands deep anatomical perception, and a minor mistake during surgery can be fatal. The objective of this report is to present novel 3-dimensional-printed polyamide craniosynostosis models that can improve the understanding and treatment of complex pathologies. The software InVesalius was used for segmentation of the anatomical images (from 3 patients between 6 and 9 months old). Afterward, the file was transferred to a 3-dimensional printing system and, with the use of an infrared laser, slices of powder PA 2200 were consecutively added to build a polyamide model of cranial bone. The 3 craniosynostosis models allowed fronto-orbital advancement, the Pi procedure, and posterior distraction in the operating room environment. All aspects of the craniofacial anatomy could be shown on the models, as well as the most common craniosynostosis pathologic variations (sphenoid wing elevation, shallow orbits, jugular foramen stenosis). Another advantage of our model is its low cost, about 100 U.S. dollars or even less when several models are produced. Simulation is becoming an essential part of medical education for surgical training and for improving surgical safety with adequate planning. This new polyamide craniosynostosis model allowed the surgeons to have realistic tactile feedback on manipulating a child's bone and permitted execution of the main procedures for anatomic correction. It is a low-cost model. Therefore our model is an excellent option for training purposes and is potentially a new important tool to improve the quality of the management of patients with craniosynostosis. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    Science.gov (United States)

    Joshi, D. M.; Patel, H. K.

    2015-10-01

    Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, for the design, analysis, and optimization of process plants, has features that accommodate the special requirements and therefore can be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure which is below the critical pressure. Cryogenic processes require special attention in terms of the integration of various components like heat exchangers, Joule-Thompson Valve, Turbo expander and Compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for getting the maximum liquefaction of the plant considering different constraints of other parameters. The analysis result so obtained gives clear idea in deciding various parameter values before implementation of the actual plant in the field. It also gives an idea about the productivity and profitability of the given configuration plant which leads to the design of an efficient productive plant.
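For the Linde cycle analysed above, the quantity the optimisation targets — the liquid yield — follows from a first-law balance over the heat exchanger, Joule-Thompson valve and separator: h1 = y·hf + (1 − y)·h2, hence y = (h2 − h1)/(h2 − hf). The sketch below evaluates it with placeholder enthalpy values; in practice these would come from a property package such as the one inside Aspen HYSYS.

```python
# First-law liquid-yield estimate for a Linde-Hampson liquefier.
# Energy balance over HX + J-T valve + separator:
#   h1 = y*hf + (1 - y)*h2   =>   y = (h2 - h1) / (h2 - hf)
# Enthalpies below are illustrative placeholders, not HYSYS outputs.

def linde_yield(h1, h2, hf):
    """h1: high-pressure gas in, h2: low-pressure gas out, hf: liquid.

    All enthalpies in the same units (e.g. kJ/kg); returns the liquid
    fraction produced per kg of gas compressed."""
    return (h2 - h1) / (h2 - hf)

y = linde_yield(h1=430.0, h2=460.0, hf=30.0)  # fraction liquefied
```

Raising the compressor pressure lowers h1 (real-gas effect) and so raises the yield, which is the trade-off a flowsheet simulator explores when searching for the optimum operating point.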

  7. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    International Nuclear Information System (INIS)

    Joshi, D.M.; Patel, H.K.

    2015-01-01

    Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, for the design, analysis, and optimization of process plants, has features that accommodate the special requirements and therefore can be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure which is below the critical pressure. Cryogenic processes require special attention in terms of the integration of various components like heat exchangers, Joule-Thompson Valve, Turbo expander and Compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for getting the maximum liquefaction of the plant considering different constraints of other parameters. The analysis result so obtained gives clear idea in deciding various parameter values before implementation of the actual plant in the field. It also gives an idea about the productivity and profitability of the given configuration plant which leads to the design of an efficient productive plant.

  8. Transposons As Tools for Functional Genomics in Vertebrate Models.

    Science.gov (United States)

    Kawakami, Koichi; Largaespada, David A; Ivics, Zoltán

    2017-11-01

    Genetic tools and mutagenesis strategies based on transposable elements are currently under development with a vision to link primary DNA sequence information to gene functions in vertebrate models. By virtue of their inherent capacity to insert into DNA, transposons can be developed into powerful tools for chromosomal manipulations. Transposon-based forward mutagenesis screens have numerous advantages including high throughput, easy identification of mutated alleles, and providing insight into genetic networks and pathways based on phenotypes. For example, the Sleeping Beauty transposon has become highly instrumental to induce tumors in experimental animals in a tissue-specific manner with the aim of uncovering the genetic basis of diverse cancers. Here, we describe a battery of mutagenic cassettes that can be applied in conjunction with transposon vectors to mutagenize genes, and highlight versatile experimental strategies for the generation of engineered chromosomes for loss-of-function as well as gain-of-function mutagenesis for functional gene annotation in vertebrate models, including zebrafish, mice, and rats. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Automated sensitivity analysis: New tools for modeling complex dynamic systems

    International Nuclear Information System (INIS)

    Pin, F.G.

    1987-01-01

    Sensitivity analysis is an established methodology used by researchers in almost every field to gain essential insight in design and modeling studies and in performance assessments of complex systems. Conventional sensitivity analysis methodologies, however, have not enjoyed the widespread use they deserve considering the wealth of information they can provide, partly because of their prohibitive cost or the large initial analytical investment they require. Automated systems have recently been developed at ORNL to eliminate these drawbacks. Compilers such as GRESS and EXAP now allow automatic and cost-effective calculation of sensitivities in FORTRAN computer codes. In this paper, these and other related tools are described and their impact and applicability in the general areas of modeling, performance assessment and decision making for radioactive waste isolation problems are discussed.
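    Source-transformation tools such as GRESS work by augmenting a code so that derivatives propagate alongside values. The same idea can be sketched in a few lines with operator overloading (a toy forward-mode example for illustration, not the ORNL implementation):

```python
class Dual:
    """Minimal forward-mode AD value: carries f and df/dx together,
    mimicking what source-transformation compilers automate."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (fg)' = f'g + fg'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def model(x):
    # hypothetical performance model: y = 3*x^2 + 2*x
    return 3 * x * x + 2 * x

x = Dual(2.0, 1.0)   # seed dx/dx = 1
y = model(x)
print(y.val, y.der)  # value and sensitivity dy/dx = 6x + 2
```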

  10. Development of tools and models for computational fracture assessment

    International Nuclear Information System (INIS)

    Talja, H.; Santaoja, K.

    1998-01-01

    The aim of the work presented in this paper has been to develop and test new computational tools and theoretically more sound methods for fracture mechanical analysis. The applicability of the engineering integrity assessment system MASI for evaluation of piping components has been extended. The most important motivation for the theoretical development has been the well-known fundamental limitations in the validity of the J-integral, which limit its applicability in many important practical safety assessment cases. Examples are extensive plastic deformation, multimaterial structures and ascending loading paths (especially warm prestress, WPS). Further, the micromechanical Gurson model has been applied to several reactor pressure vessel materials. Special attention is paid to the transferability of Gurson model parameters from tensile test results to prediction of ductile failure behaviour of cracked structures. (author)

  11. Edge effect modeling of small tool polishing in planetary movement

    Science.gov (United States)

    Li, Qi-xin; Ma, Zhen; Jiang, Bo; Yao, Yong-sheng

    2018-03-01

    As one of the most challenging problems in Computer Controlled Optical Surfacing (CCOS), the edge effect greatly affects polishing accuracy and efficiency. CCOS relies on a stable tool influence function (TIF); however, at the edge of the mirror surface, with the grinding head partly off the mirror, the contact area and pressure distribution change, resulting in a non-linear change of the TIF and leading to tilting or sagging at the edge of the mirror. In order to reduce these adverse effects and improve polishing accuracy and efficiency, we used finite element simulation to analyze the pressure distribution at the mirror edge and combined it with an improved traditional method to establish a new model. The new method fully considers the non-uniformity of the pressure distribution. After modeling the TIFs in different locations, the edge effects can be described and predicted, which has positive significance for the control and suppression of edge effects.
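    A common starting point for TIF modeling in CCOS is Preston's equation, in which removal is proportional to pressure, relative velocity and dwell time. The sketch below (with made-up numbers, not values from the paper) illustrates why a pressure spike where the head overhangs the edge distorts the removal profile:

```python
def preston_removal(k, pressure, velocity, dwell):
    """Preston's equation: removed depth dz = k * P * V * t,
    the standard first-order removal model in CCOS.
    k is an empirical process constant."""
    return k * pressure * velocity * dwell

# Hypothetical mid-mirror vs. edge conditions: same dwell and speed,
# but edge pressure rises as the contact area shrinks off the mirror.
mid  = preston_removal(k=1e-7, pressure=1.0e4, velocity=0.5, dwell=2.0)
edge = preston_removal(k=1e-7, pressure=1.6e4, velocity=0.5, dwell=2.0)
print(mid, edge)  # deeper removal at the edge for equal dwell
```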

  12. MTK: An AI tool for model-based reasoning

    Science.gov (United States)

    Erickson, William K.; Schwartz, Mary R.

    1987-01-01

    A 1988 goal for the Systems Autonomy Demonstration Project Office of the NASA Ames Research Center is to apply model-based representation and reasoning techniques in a knowledge-based system that will provide monitoring, fault diagnosis, control and trend analysis of the space station Thermal Management System (TMS). A number of issues raised during the development of the first prototype system inspired the design and construction of a model-based reasoning tool called MTK, which was used in the building of the second prototype. These issues are outlined, along with examples from the thermal system to highlight the motivating factors behind them. An overview of the capabilities of MTK is given.

  13. Empirical flow parameters : a tool for hydraulic model validity

    Science.gov (United States)

    Asquith, William H.; Burley, Thomas E.; Cleveland, Theodore G.

    2013-01-01

    The objectives of this project were (1) to determine and present, from existing data in Texas, relations between observed stream flow, topographic slope, mean section velocity, and other hydraulic factors, to produce charts such as Figure 1 and to produce empirical distributions of the various flow parameters to provide a methodology to "check if model results are way off!"; (2) to produce a statistical regional tool to estimate mean velocity or other selected parameters for storm flows or other conditional discharges at ungauged locations (most bridge crossings) in Texas, to provide a secondary way to compare such values to a conventional hydraulic modeling approach; and (3) to present ancillary values such as Froude number, stream power, Rosgen channel classification, sinuosity, and other selected characteristics (readily determinable from existing data) to provide additional information to engineers concerned with the hydraulic-soil-foundation component of transportation infrastructure.
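    Two of the ancillary values named in objective (3) are direct formulas; a short sketch with hypothetical cross-section values (not data from the project):

```python
import math

def froude_number(velocity, depth, g=9.81):
    """Fr = V / sqrt(g * D); Fr < 1 subcritical, Fr > 1 supercritical."""
    return velocity / math.sqrt(g * depth)

def stream_power(discharge, slope, rho=1000.0, g=9.81):
    """Total stream power per unit channel length, rho*g*Q*S [W/m]."""
    return rho * g * discharge * slope

# hypothetical storm-flow cross-section: 1.2 m/s, 2.5 m deep,
# 85 m^3/s discharge on a 0.001 m/m energy slope
Fr = froude_number(velocity=1.2, depth=2.5)
P  = stream_power(discharge=85.0, slope=0.001)
print(round(Fr, 3), round(P, 1))  # dimensionless, W/m
```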

  14. Isotopes as validation tools for global climate models

    International Nuclear Information System (INIS)

    Henderson-Sellers, A.

    2001-01-01

    Global Climate Models (GCMs) are the predominant tool with which we predict the future climate. In order that people can have confidence in such predictions, GCMs require validation. As almost every available item of meteorological data has been exploited in the construction and tuning of GCMs to date, independent validation is very difficult. This paper explores the use of isotopes as a novel and fully independent means of evaluating GCMs. The focus is the Amazon Basin which has a long history of isotope collection and analysis and also of climate modelling: both having been reported for over thirty years. Careful consideration of the results of GCM simulations of Amazonian deforestation and climate change suggests that the recent stable isotope record is more consistent with the predicted effects of greenhouse warming, possibly combined with forest removal, than with GCM predictions of the effects of deforestation alone

  15. Port performance evaluation tool based on microsimulation model

    Directory of Open Access Journals (Sweden)

    Tsavalista Burhani Jzolanda

    2017-01-01

    Full Text Available As port performance is becoming correlated with national competitiveness, the issue of port performance evaluation has gained significant attention. Port performance can simply be indicated by port service levels to the ship (e.g., throughput, waiting time for berthing, etc.) as well as by the utilization level of equipment and facilities within a certain period. The performance evaluation can then be used as a tool to develop related policies for making the port's performance more effective and efficient. However, the evaluation is frequently conducted with a deterministic approach, which hardly captures the natural variations of port parameters. Therefore, this paper presents a stochastic microsimulation model for investigating the impacts of port parameter variations on port performance. The variations are derived from actual data in order to provide more realistic results. The model is further developed using MATLAB and Simulink based on queuing theory.
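    A single-berth version of such a microsimulation reduces to an M/M/1 queue. The sketch below (hypothetical arrival and service rates, plain Python rather than the paper's MATLAB/Simulink implementation) estimates the mean waiting time before berthing:

```python
import random

def simulate_berth(arrival_rate, service_rate, n_ships, seed=42):
    """Single-berth port as an M/M/1 queue: exponential inter-arrival
    and service times. Returns the mean wait before berthing."""
    rng = random.Random(seed)
    t_arrive = 0.0   # arrival clock
    berth_free = 0.0 # time the berth next becomes available
    waits = []
    for _ in range(n_ships):
        t_arrive += rng.expovariate(arrival_rate)
        start = max(t_arrive, berth_free)      # wait if berth occupied
        waits.append(start - t_arrive)
        berth_free = start + rng.expovariate(service_rate)
    return sum(waits) / len(waits)

# hypothetical rates: 2 arrivals/day, capacity 3 ships/day
print(round(simulate_berth(2.0, 3.0, 10000), 2))  # mean wait in days
```

    For these rates, queuing theory gives an expected wait of lambda/(mu*(mu-lambda)) = 2/3 day, a useful sanity check on the simulated value.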

  16. Assessment of fine-scale resource selection and spatially explicit habitat suitability modelling for a re-introduced tiger (Panthera tigris population in central India

    Directory of Open Access Journals (Sweden)

    Mriganka Shekhar Sarkar

    2017-11-01

    Full Text Available Background Large carnivores influence ecosystem functions at various scales. Thus, their local extinction is not only a species-specific conservation concern, but also reflects on the overall habitat quality and ecosystem value. Species-habitat relationships at fine scale reflect the individuals’ ability to procure resources and negotiate intraspecific competition. Such fine scale habitat choices are more pronounced in large carnivores such as tiger (Panthera tigris, which exhibits competitive exclusion in habitat and mate selection strategies. Although landscape level policies and conservation strategies are increasingly promoted for tiger conservation, specific management interventions require knowledge of the habitat correlates at fine scale. Methods We studied nine radio-collared individuals of a successfully reintroduced tiger population in Panna Tiger Reserve, central India, focussing on the species-habitat relationship at fine scales. With 16 eco-geographical variables, we performed Manly’s selection ratio and K-select analyses to define population-level and individual-level variation in resource selection, respectively. We analysed the data obtained during the exploratory period of six tigers and during the settled period of eight tigers separately, and compared the consequent results. We further used the settled period characteristics to model and map habitat suitability based on the Mahalanobis D2 method and the Boyce index. Results There was a clear difference in habitat selection by tigers between the exploratory and the settled period. During the exploratory period, tigers selected dense canopy and bamboo forests, but also spent time near villages and relocated village sites. However, settled tigers predominantly selected bamboo forests in complex terrain, riverine forests and teak-mixed forest, and totally avoided human settlements and agriculture areas. There were individual variations in habitat selection between exploratory

  17. Non-equilibrium reaction and relaxation dynamics in a strongly interacting explicit solvent: F + CD{sub 3}CN treated with a parallel multi-state EVB model

    Energy Technology Data Exchange (ETDEWEB)

    Glowacki, David R., E-mail: drglowacki@gmail.com [School of Chemistry, University of Bristol, Bristol BS8 1TS (United Kingdom); Department of Computer Science, University of Bristol, Bristol BS8 1UB (United Kingdom); PULSE Institute and Department of Chemistry, Stanford University, Stanford, California 94305 (United States); SLAC National Accelerator Laboratory, Menlo Park, California 94025 (United States); Orr-Ewing, Andrew J. [School of Chemistry, University of Bristol, Bristol BS8 1TS (United Kingdom); Harvey, Jeremy N. [Department of Chemistry, KU Leuven, Celestijnenlaan 200F, B-3001 Heverlee (Belgium)

    2015-07-28

    We describe a parallelized linear-scaling computational framework developed to implement arbitrarily large multi-state empirical valence bond (MS-EVB) calculations within CHARMM and TINKER. Forces are obtained using the Hellmann-Feynman relationship, giving continuous gradients, and good energy conservation. Utilizing multi-dimensional Gaussian coupling elements fit to explicitly correlated coupled cluster theory, we built a 64-state MS-EVB model designed to study the F + CD{sub 3}CN → DF + CD{sub 2}CN reaction in CD{sub 3}CN solvent (recently reported in Dunning et al. [Science 347(6221), 530 (2015)]). This approach allows us to build a reactive potential energy surface whose balanced accuracy and efficiency considerably surpass what we could achieve otherwise. We ran molecular dynamics simulations to examine a range of observables which follow in the wake of the reactive event: energy deposition in the nascent reaction products, vibrational relaxation rates of excited DF in CD{sub 3}CN solvent, equilibrium power spectra of DF in CD{sub 3}CN, and time dependent spectral shifts associated with relaxation of the nascent DF. Many of our results are in good agreement with time-resolved experimental observations, providing evidence for the accuracy of our MS-EVB framework in treating both the solute and solute/solvent interactions. The simulations provide additional insight into the dynamics at sub-picosecond time scales that are difficult to resolve experimentally. In particular, the simulations show that (immediately following deuterium abstraction) the nascent DF finds itself in a non-equilibrium regime in two different respects: (1) it is highly vibrationally excited, with ∼23 kcal mol{sup −1} localized in the stretch and (2) its post-reaction solvation environment, in which it is not yet hydrogen-bonded to CD{sub 3}CN solvent molecules, is intermediate between the non-interacting gas-phase limit and the solution-phase equilibrium limit. Vibrational

  18. Assessment of fine-scale resource selection and spatially explicit habitat suitability modelling for a re-introduced tiger (Panthera tigris) population in central India.

    Science.gov (United States)

    Sarkar, Mriganka Shekhar; Krishnamurthy, Ramesh; Johnson, Jeyaraj A; Sen, Subharanjan; Saha, Goutam Kumar

    2017-01-01

    Large carnivores influence ecosystem functions at various scales. Thus, their local extinction is not only a species-specific conservation concern, but also reflects on the overall habitat quality and ecosystem value. Species-habitat relationships at fine scale reflect the individuals' ability to procure resources and negotiate intraspecific competition. Such fine scale habitat choices are more pronounced in large carnivores such as tiger (Panthera tigris), which exhibits competitive exclusion in habitat and mate selection strategies. Although landscape level policies and conservation strategies are increasingly promoted for tiger conservation, specific management interventions require knowledge of the habitat correlates at fine scale. We studied nine radio-collared individuals of a successfully reintroduced tiger population in Panna Tiger Reserve, central India, focussing on the species-habitat relationship at fine scales. With 16 eco-geographical variables, we performed Manly's selection ratio and K-select analyses to define population-level and individual-level variation in resource selection, respectively. We analysed the data obtained during the exploratory period of six tigers and during the settled period of eight tigers separately, and compared the consequent results. We further used the settled period characteristics to model and map habitat suitability based on the Mahalanobis D2 method and the Boyce index. There was a clear difference in habitat selection by tigers between the exploratory and the settled period. During the exploratory period, tigers selected dense canopy and bamboo forests, but also spent time near villages and relocated village sites. However, settled tigers predominantly selected bamboo forests in complex terrain, riverine forests and teak-mixed forest, and totally avoided human settlements and agriculture areas. There were individual variations in habitat selection between exploratory and settled periods. Based on threshold limits

  19. Non-equilibrium reaction and relaxation dynamics in a strongly interacting explicit solvent: F + CD3CN treated with a parallel multi-state EVB model.

    Science.gov (United States)

    Glowacki, David R; Orr-Ewing, Andrew J; Harvey, Jeremy N

    2015-07-28

    We describe a parallelized linear-scaling computational framework developed to implement arbitrarily large multi-state empirical valence bond (MS-EVB) calculations within CHARMM and TINKER. Forces are obtained using the Hellmann-Feynman relationship, giving continuous gradients, and good energy conservation. Utilizing multi-dimensional Gaussian coupling elements fit to explicitly correlated coupled cluster theory, we built a 64-state MS-EVB model designed to study the F + CD3CN → DF + CD2CN reaction in CD3CN solvent (recently reported in Dunning et al. [Science 347(6221), 530 (2015)]). This approach allows us to build a reactive potential energy surface whose balanced accuracy and efficiency considerably surpass what we could achieve otherwise. We ran molecular dynamics simulations to examine a range of observables which follow in the wake of the reactive event: energy deposition in the nascent reaction products, vibrational relaxation rates of excited DF in CD3CN solvent, equilibrium power spectra of DF in CD3CN, and time dependent spectral shifts associated with relaxation of the nascent DF. Many of our results are in good agreement with time-resolved experimental observations, providing evidence for the accuracy of our MS-EVB framework in treating both the solute and solute/solvent interactions. The simulations provide additional insight into the dynamics at sub-picosecond time scales that are difficult to resolve experimentally. In particular, the simulations show that (immediately following deuterium abstraction) the nascent DF finds itself in a non-equilibrium regime in two different respects: (1) it is highly vibrationally excited, with ∼23 kcal mol(-1) localized in the stretch and (2) its post-reaction solvation environment, in which it is not yet hydrogen-bonded to CD3CN solvent molecules, is intermediate between the non-interacting gas-phase limit and the solution-phase equilibrium limit. Vibrational relaxation of the nascent DF results in a spectral

  20. MODEL CAR TRANSPORT SYSTEM - MODERN ITS EDUCATION TOOL

    Directory of Open Access Journals (Sweden)

    Karel Bouchner

    2017-12-01

    Full Text Available The model car transport system is a laboratory intended for practical development in the area of motor traffic. It is also an important education tool for students’ hands-on training, enabling students to test the results of their own studies. The main part of the model car transportation network is a model at a scale of 1:87 (HO), based on component units of the FALLER Car system, e.g. cars, traffic lights, carriageway, parking spaces, stop sections, branch-off junctions, sensors and control sections. The model makes it possible to simulate real traffic situations. It includes motor traffic in a city and in a small village, and on a carriageway between the city and the village including a railway crossing. The traffic infrastructure includes different kinds of intersections, such as T-junctions, a classic four-way crossroad and a four-way traffic circle, with and without traffic-light control. Another important part of the model is a segment of highway which includes an elevated crossing with highway approaches and exits.

  1. Explicit Instruction Elements in Core Reading Programs

    Science.gov (United States)

    Child, Angela R.

    2012-01-01

    Classroom teachers are provided instructional recommendations for teaching reading by their adopted core reading programs (CRPs). Explicit instruction elements, also called instructional moves, including direct explanation, modeling, guided practice, independent practice, discussion, feedback, and monitoring, were examined within CRP…

  2. An artificial intelligence tool for complex age-depth models

    Science.gov (United States)

    Bradley, E.; Anderson, K. A.; de Vesine, L. R.; Lai, V.; Thomas, M.; Nelson, T. H.; Weiss, I.; White, J. W. C.

    2017-12-01

    CSciBox is an integrated software system for age modeling of paleoenvironmental records. It incorporates an array of data-processing and visualization facilities, ranging from 14C calibrations to sophisticated interpolation tools. Using CSciBox's GUI, a scientist can build custom analysis pipelines by composing these built-in components or adding new ones. Alternatively, she can employ CSciBox's automated reasoning engine, Hobbes, which uses AI techniques to perform an in-depth, autonomous exploration of the space of possible age-depth models and presents the results—both the models and the reasoning that was used in constructing and evaluating them—to the user for her inspection. Hobbes accomplishes this using a rulebase that captures the knowledge of expert geoscientists, which was collected over the course of more than 100 hours of interviews. It works by using these rules to generate arguments for and against different age-depth model choices for a given core. Given a marine-sediment record containing uncalibrated 14C dates, for instance, Hobbes tries CALIB-style calibrations using a choice of IntCal curves, with reservoir age correction values chosen from the 14CHRONO database using the lat/long information provided with the core, and finally composes the resulting age points into a full age model using different interpolation methods. It evaluates each model—e.g., looking for outliers or reversals—and uses that information to guide the next steps of its exploration, and presents the results to the user in human-readable form. The most powerful of CSciBox's built-in interpolation methods is BACON, a Bayesian sedimentation-rate algorithm—a powerful but complex tool that can be difficult to use. Hobbes adjusts BACON's many parameters autonomously to match the age model to the expectations of expert geoscientists, as captured in its rulebase. It then checks the model against the data and iteratively re-calculates until it is a good fit to the data.
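    The simplest of the interpolation choices such an engine weighs is piecewise-linear interpolation between dated control points. A minimal sketch with hypothetical calibrated dates for a sediment core (illustrative only, not CSciBox's BACON or Hobbes logic):

```python
def age_model(control_points, depths):
    """Piecewise-linear age-depth model through dated control points
    given as (depth_cm, age_yr) pairs. Returns interpolated ages for
    the requested depths (each must lie between two control points)."""
    cps = sorted(control_points)
    ages = []
    for d in depths:
        for (d0, a0), (d1, a1) in zip(cps, cps[1:]):
            if d0 <= d <= d1:
                # linear interpolation within this dated segment
                ages.append(a0 + (a1 - a0) * (d - d0) / (d1 - d0))
                break
    return ages

# hypothetical dates: core top deposited ~1950 (age -50 in yr BP terms)
print(age_model([(0, -50), (100, 2100), (250, 8400)], [50, 175]))
```

    A real age-model engine must additionally handle outliers, reversals, and calibration uncertainty, which is exactly the reasoning Hobbes automates.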

  3. Watershed modeling tools and data for prognostic and diagnostic

    Science.gov (United States)

    Chambel-Leitao, P.; Brito, D.; Neves, R.

    2009-04-01

    When eutrophication is considered an important process to control, this can be accomplished by reducing nitrogen and phosphorus losses from both point and nonpoint sources and by assessing the effectiveness of the pollution reduction strategy. The HARP-NUT guidelines (Guidelines on Harmonized Quantification and Reporting Procedures for Nutrients) (Borgvang & Selvik, 2000) are presented by OSPAR as the best common quantification and reporting procedures for calculating the reduction of nutrient inputs. In 2000, OSPAR adopted the HARP-NUT guidelines on a trial basis. They were intended to serve as a tool for OSPAR Contracting Parties to report, in a harmonized manner, their different commitments, present or future, with regard to nutrients under the OSPAR Convention, in particular the "Strategy to Combat Eutrophication". The HARP-NUT Guidelines (Borgvang and Selvik, 2000; Schoumans, 2003) were developed to quantify and report on the individual sources of nitrogen and phosphorus discharges/losses to surface waters (Source Orientated Approach). These results can be compared with the total riverine nitrogen and phosphorus loads measured at downstream monitoring points (Load Orientated Approach), as load reconciliation. Nitrogen and phosphorus retention in river systems represents the connecting link between the "Source Orientated Approach" and the "Load Orientated Approach". Both approaches are necessary for verification purposes and both may be needed to provide the information required for the various commitments. Guidelines 2, 3, 4 and 5 are mainly concerned with source estimation. They present a set of simple calculations that allow the estimation of the origin of loads. Guideline 6 is a particular case where the application of a model is advised, in order to estimate the nutrient loads from diffuse sources associated with land use/land cover. The model chosen for this was the SWAT model (Arnold & Fohrer, 2005), because it is suggested in guideline 6 and because it

  4. A Model-Driven Visualization Tool for Use with Model-Based Systems Engineering Projects

    Science.gov (United States)

    Trase, Kathryn; Fink, Eric

    2014-01-01

    Model-Based Systems Engineering (MBSE) promotes increased consistency between a system's design and its design documentation through the use of an object-oriented system model. The creation of this system model facilitates data presentation by providing a mechanism from which information can be extracted by automated manipulation of model content. Existing MBSE tools enable model creation, but are often too complex for the unfamiliar model viewer to easily use. These tools do not yet provide many opportunities for easing into the development and use of a system model when system design documentation already exists. This study creates a Systems Modeling Language (SysML) Document Traceability Framework (SDTF) for integrating design documentation with a system model, and develops an Interactive Visualization Engine for SysML Tools (InVEST), that exports consistent, clear, and concise views of SysML model data. These exported views are each meaningful to a variety of project stakeholders with differing subjects of concern and depth of technical involvement. InVEST allows a model user to generate multiple views and reports from a MBSE model, including wiki pages and interactive visualizations of data. System data can also be filtered to present only the information relevant to the particular stakeholder, resulting in a view that is both consistent with the larger system model and other model views. Viewing the relationships between system artifacts and documentation, and filtering through data to see specialized views improves the value of the system as a whole, as data becomes information

  5. Slab2 - Updated Subduction Zone Geometries and Modeling Tools

    Science.gov (United States)

    Moore, G.; Hayes, G. P.; Portner, D. E.; Furtney, M.; Flamme, H. E.; Hearne, M. G.

    2017-12-01

    The U.S. Geological Survey database of global subduction zone geometries (Slab1.0), is a highly utilized dataset that has been applied to a wide range of geophysical problems. In 2017, these models have been improved and expanded upon as part of the Slab2 modeling effort. With a new data driven approach that can be applied to a broader range of tectonic settings and geophysical data sets, we have generated a model set that will serve as a more comprehensive, reliable, and reproducible resource for three-dimensional slab geometries at all of the world's convergent margins. The newly developed framework of Slab2 is guided by: (1) a large integrated dataset, consisting of a variety of geophysical sources (e.g., earthquake hypocenters, moment tensors, active-source seismic survey images of the shallow slab, tomography models, receiver functions, bathymetry, trench ages, and sediment thickness information); (2) a dynamic filtering scheme aimed at constraining incorporated seismicity to only slab related events; (3) a 3-D data interpolation approach which captures both high resolution shallow geometries and instances of slab rollback and overlap at depth; and (4) an algorithm which incorporates uncertainties of contributing datasets to identify the most probable surface depth over the extent of each subduction zone. Further layers will also be added to the base geometry dataset, such as historic moment release, earthquake tectonic providence, and interface coupling. Along with access to several queryable data formats, all components have been wrapped into an open source library in Python, such that suites of updated models can be released as further data becomes available. This presentation will discuss the extent of Slab2 development, as well as the current availability of the model and modeling tools.

  6. Using Modeling Tools to Better Understand Permafrost Hydrology

    Directory of Open Access Journals (Sweden)

    Clément Fabre

    2017-06-01

    Full Text Available Modification of the hydrological cycle and, subsequently, of other global cycles is expected in Arctic watersheds owing to global change. Future climate scenarios imply widespread permafrost degradation caused by an increase in air temperature, and the expected effect on permafrost hydrology is immense. This study aims at analyzing and quantifying the daily water transfer in the largest Arctic river system, the Yenisei River in central Siberia, Russia, which is partially underlain by permafrost. The semi-distributed SWAT (Soil and Water Assessment Tool) hydrological model has been calibrated and validated at a daily time step against historical discharge for the 2003–2014 period. The model parameters have been adjusted to embrace the hydrological features of permafrost. SWAT is shown to be capable of estimating water fluxes at a daily time step, especially during unfrozen periods, once specific climatic and soil conditions adapted to a permafrost watershed are considered. The model simulates an average annual contribution to runoff of 263 millimeters per year (mm yr−1), distributed as 152 mm yr−1 (58%) of surface runoff, 103 mm yr−1 (39%) of lateral flow and 8 mm yr−1 (3%) of return flow from the aquifer. These results are integrated over a reduced basin area downstream of large dams and are closer to observations than previous modeling exercises.
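    The reported partition can be checked arithmetically; the sketch below reproduces the 263 mm yr−1 total and the 58/39/3 percent split from the three component fluxes given in the abstract:

```python
def runoff_partition(surface, lateral, return_flow):
    """Sum the annual runoff components (mm/yr) and recover each
    component's percentage share of the total."""
    total = surface + lateral + return_flow
    shares = {name: round(100 * q / total)
              for name, q in [("surface", surface),
                              ("lateral", lateral),
                              ("return", return_flow)]}
    return total, shares

# component fluxes from the Yenisei SWAT simulation
total, shares = runoff_partition(152, 103, 8)
print(total, shares)  # 263 mm/yr and the 58/39/3 % split
```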

  7. Unleashing spatially distributed ecohydrology modeling using Big Data tools

    Science.gov (United States)

    Miles, B.; Idaszak, R.

    2015-12-01

    Physically based spatially distributed ecohydrology models are useful for answering science and management questions related to the hydrology and biogeochemistry of prairie, savanna, forested, as well as urbanized ecosystems. However, these models can produce hundreds of gigabytes of spatial output for a single model run over decadal time scales when run at regional spatial scales and moderate spatial resolutions (~100-km2+ at 30-m spatial resolution) or when run for small watersheds at high spatial resolutions (~1-km2 at 3-m spatial resolution). Numerical data formats such as HDF5 can store arbitrarily large datasets. However even in HPC environments, there are practical limits on the size of single files that can be stored and reliably backed up. Even when such large datasets can be stored, querying and analyzing these data can suffer from poor performance due to memory limitations and I/O bottlenecks, for example on single workstations where memory and bandwidth are limited, or in HPC environments where data are stored separately from computational nodes. The difficulty of storing and analyzing spatial data from ecohydrology models limits our ability to harness these powerful tools. Big Data tools such as distributed databases have the potential to surmount the data storage and analysis challenges inherent to large spatial datasets. Distributed databases solve these problems by storing data close to computational nodes while enabling horizontal scalability and fault tolerance. Here we present the architecture of and preliminary results from PatchDB, a distributed datastore for managing spatial output from the Regional Hydro-Ecological Simulation System (RHESSys). The initial version of PatchDB uses message queueing to asynchronously write RHESSys model output to an Apache Cassandra cluster. Once stored in the cluster, these data can be efficiently queried to quickly produce both spatial visualizations for a particular variable (e.g. maps and animations), as well
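    The asynchronous-write pattern described (the model pushes output onto a message queue while a consumer persists it) can be sketched with an in-memory queue standing in for the messaging layer and a dict standing in for the Cassandra cluster; the key structure is hypothetical:

```python
import queue
import threading

def writer(q, store):
    """Consumer thread: drains queued model output into a datastore
    (a dict here; PatchDB targets an Apache Cassandra cluster)."""
    while True:
        item = q.get()
        if item is None:   # sentinel: shut down the writer
            break
        key, value = item
        store[key] = value

q, store = queue.Queue(), {}
t = threading.Thread(target=writer, args=(q, store))
t.start()

# The model loop pushes output without blocking on storage.
# Key layout (patch id, timestep, variable) is illustrative only.
for step in range(3):
    q.put((("patch-7", step, "soil_moisture"), 0.2 + 0.01 * step))
q.put(None)
t.join()
print(len(store))  # records persisted asynchronously
```

    Decoupling the model from the datastore this way keeps simulation throughput independent of storage latency, which is the design motivation behind PatchDB's queueing layer.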

  8. A Simple Evacuation Modeling and Simulation Tool for First Responders

    Energy Technology Data Exchange (ETDEWEB)

    Koch, Daniel B [ORNL; Payne, Patricia W [ORNL

    2015-01-01

    Although modeling and simulation of mass evacuations during a natural or man-made disaster is an on-going and vigorous area of study, tool adoption by front-line first responders is uneven. Some of the factors that account for this situation include cost and complexity of the software. For several years, Oak Ridge National Laboratory has been actively developing the free Incident Management Preparedness and Coordination Toolkit (IMPACT) to address these issues. One of the components of IMPACT is a multi-agent simulation module for area-based and path-based evacuations. The user interface is designed so that anyone familiar with typical computer drawing tools can quickly author a geospatially-correct evacuation visualization suitable for table-top exercises. Since IMPACT is designed for use in the field where network communications may not be available, quick on-site evacuation alternatives can be evaluated to keep pace with a fluid threat situation. Realism is enhanced by incorporating collision avoidance into the simulation. Statistics are gathered as the simulation unfolds, including most importantly time-to-evacuate, to help first responders choose the best course of action.
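    The core statistic named above, time-to-evacuate, can be illustrated with a deliberately crude agent loop (straight-line walking toward a single exit, no path network or collision avoidance, made-up coordinates; real tools like IMPACT model all of these):

```python
def time_to_evacuate(agents, exit_pos, speed=1.4, dt=0.5):
    """Step agents toward the exit at walking speed (m/s) in fixed
    time increments; return seconds until the last agent has left."""
    t = 0.0
    remaining = list(agents)
    while remaining:
        t += dt
        still_inside = []
        for (x, y) in remaining:
            dx, dy = exit_pos[0] - x, exit_pos[1] - y
            dist = (dx * dx + dy * dy) ** 0.5
            if dist > speed * dt:          # not yet at the exit
                step = speed * dt / dist
                still_inside.append((x + dx * step, y + dy * step))
        remaining = still_inside
    return t

# two hypothetical agents, exit 60 m away from the farther one
print(time_to_evacuate([(0, 0), (30, 40)], exit_pos=(60, 0)))
```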

  9. Using urban forest assessment tools to model bird habitat potential

    Science.gov (United States)

    Lerman, Susannah B.; Nislow, Keith H.; Nowak, David J.; DeStefano, Stephen; King, David I.; Jones-Farrand, D. Todd

    2014-01-01

    The alteration of forest cover and the replacement of native vegetation with buildings, roads, exotic vegetation, and other urban features pose one of the greatest threats to global biodiversity. As more land becomes slated for urban development, identifying effective urban forest wildlife management tools becomes paramount to ensure the urban forest provides habitat to sustain bird and other wildlife populations. The primary goal of this study was to integrate wildlife suitability indices into an existing national urban forest assessment tool, i-Tree. We quantified available habitat characteristics of urban forests for ten northeastern U.S. cities, and summarized bird habitat relationships from the literature in terms of variables that were represented in the i-Tree datasets. With these data, we generated habitat suitability equations for nine bird species representing a range of life history traits and conservation status that predict habitat suitability based on i-Tree data. We applied these equations to the urban forest datasets to calculate the overall habitat suitability for each city and the habitat suitability for different types of land use (e.g., residential, commercial, parkland) for each bird species. The proposed habitat models will help guide wildlife managers, urban planners, and landscape designers who require specific information, such as desirable habitat conditions within an urban management project, to help improve the suitability of urban forests for birds.
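    A habitat suitability equation of the kind described can be sketched as a weighted combination of normalized urban-forest variables. The variables, weights, and species profile below are invented for illustration; they are not the equations fitted in the study.

```python
# Illustrative habitat suitability index: a score in [0, 1] from a
# weighted combination of urban-forest predictors (each pre-normalized
# to [0, 1]). Weights here are hypothetical, not from the i-Tree study.
def suitability(canopy_cover, shrub_density, large_trees, weights):
    score = (weights["canopy"] * canopy_cover
             + weights["shrub"] * shrub_density
             + weights["large_trees"] * large_trees)
    return score / sum(weights.values())

# Hypothetical weights for a shrub-nesting species.
w = {"canopy": 0.3, "shrub": 0.5, "large_trees": 0.2}
print(round(suitability(0.4, 0.8, 0.1, w), 3))  # → 0.54
```

    Evaluating such an equation per land-use class, then averaging over a city's parcels, gives the city- and land-use-level suitability summaries the abstract describes.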

  10. Prototype of Automated PLC Model Checking Using Continuous Integration Tools

    CERN Document Server

    Lettrich, Michael

    2015-01-01

    To deal with the complexity of operating and supervising large-scale industrial installations at CERN, Programmable Logic Controllers (PLCs) are often used. A failure in these control systems can cause a disaster in terms of economic losses, environmental damage or human losses. Therefore the requirements on software quality are very high. To provide PLC developers with a way to verify proper functionality against requirements, a Java tool named PLCverif has been developed which encapsulates and thus simplifies the use of third-party model checkers. One of our goals in this project is to integrate PLCverif into the development process of PLC programs. When the developer changes the program, all the requirements should be verified again, as a change in the code can produce side effects and violate one or more requirements. For that reason, PLCverif has been extended to work with Jenkins CI in order to automatically trigger the verification cases when the developer changes the PLC program. This prototype has been...

  11. A Relational Database Model and Tools for Environmental Sound Recognition

    Directory of Open Access Journals (Sweden)

    Yuksel Arslan

    2017-12-01

    Environmental sound recognition (ESR) has become a hot topic in recent years. ESR is mainly based on machine learning (ML), and ML algorithms first require a training database. This database must comprise the sounds to be recognized and other related sounds. An ESR system needs the database during training, testing and in the production stage. In this paper, we present the design and pilot establishment of a database which will assist all researchers who want to establish an ESR system. This database employs the relational database model, which has not been used for this task before. We explain the design and implementation details of the database, and the data collection and load process. In addition, we describe the tools and the graphical user interfaces developed for a desktop application and for the Web.
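    A relational model for such a sound database can be sketched with SQLite: one table of sound classes, one of recordings, joined by a foreign key. The table and column names below are assumptions for illustration; the paper's actual schema may differ.

```python
import sqlite3

# Minimal relational schema in the spirit of the described ESR database.
# Names are hypothetical stand-ins, not the paper's schema.
con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.executescript("""
CREATE TABLE sound_class (
    class_id INTEGER PRIMARY KEY,
    name     TEXT NOT NULL UNIQUE          -- e.g. 'dog bark', 'siren'
);
CREATE TABLE recording (
    rec_id      INTEGER PRIMARY KEY,
    class_id    INTEGER NOT NULL REFERENCES sound_class(class_id),
    file_path   TEXT NOT NULL,
    sample_rate INTEGER NOT NULL,
    split       TEXT CHECK (split IN ('train', 'test'))
);
""")
cur.execute("INSERT INTO sound_class (name) VALUES ('siren')")
cur.execute(
    "INSERT INTO recording (class_id, file_path, sample_rate, split) "
    "VALUES (1, 'siren_001.wav', 44100, 'train')"
)
row = cur.execute("""
    SELECT s.name, r.file_path FROM recording r
    JOIN sound_class s ON s.class_id = r.class_id
    WHERE r.split = 'train'
""").fetchone()
print(row)  # → ('siren', 'siren_001.wav')
```

    Keeping classes and recordings in separate tables is what makes the training/test splits and class relabeling cheap, which is the practical advantage of the relational model over flat file lists.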

  12. Computational Tools To Model Halogen Bonds in Medicinal Chemistry.

    Science.gov (United States)

    Ford, Melissa Coates; Ho, P Shing

    2016-03-10

    The use of halogens in therapeutics dates back to the earliest days of medicine when seaweed was used as a source of iodine to treat goiters. The incorporation of halogens to improve the potency of drugs is now fairly standard in medicinal chemistry. In the past decade, halogens have been recognized as direct participants in defining the affinity of inhibitors through a noncovalent interaction called the halogen bond or X-bond. Incorporating X-bonding into structure-based drug design requires computational models for the anisotropic distribution of charge and the nonspherical shape of halogens, which lead to their highly directional geometries and stabilizing energies. We review here current successes and challenges in developing computational methods to introduce X-bonding into lead compound discovery and optimization during drug development. This fast-growing field will push further development of more accurate and efficient computational tools to accelerate the exploitation of halogens in medicinal chemistry.

  13. Tool-chain for online modeling of the LHC

    International Nuclear Information System (INIS)

    Mueller, G.J.; Buffat, X.; Fuchsberger, K.; Giovannozzi, M.; Redaelli, S.; Schmidt, F.

    2012-01-01

    The control of high intensity beams in a high energy, superconducting machine with complex optics like the CERN Large Hadron Collider (LHC) is challenging not only from the design aspect but also for operation towards physics production. To support the LHC beam commissioning, efforts were devoted to the design and implementation of a software infrastructure aimed at using the computing power of the beam dynamics code MAD-X in the framework of the JAVA-based LHC control and measurement environment. Alongside interfaces to measurement data as well as to settings of the control system, the best knowledge of machine aperture and optic models is provided. In this paper, we will present the status of the tool chain and illustrate how it has been used during commissioning and operation of the LHC. Possible future implementations will be discussed. (authors)

  14. Standalone visualization tool for three-dimensional DRAGON geometrical models

    International Nuclear Information System (INIS)

    Lukomski, A.; McIntee, B.; Moule, D.; Nichita, E.

    2008-01-01

    DRAGON is a neutron transport and depletion code able to solve one-, two- and three-dimensional problems. To date DRAGON provides two visualization modules, able to represent respectively two- and three-dimensional geometries. The two-dimensional visualization module generates a postscript file, while the three dimensional visualization module generates a MATLAB M-file with instructions for drawing the tracks in the DRAGON TRACKING data structure, which implicitly provide a representation of the geometry. The current work introduces a new, standalone, tool based on the open-source Visualization Toolkit (VTK) software package which allows the visualization of three-dimensional geometrical models by reading the DRAGON GEOMETRY data structure and generating an axonometric image which can be manipulated interactively by the user. (author)

  15. The Innsbruck/ESO sky models and telluric correction tools*

    Directory of Open Access Journals (Sweden)

    Kimeswenger S.

    2015-01-01

    While ground-based astronomical observatories just have to correct for the line-of-sight integral of these effects, Čerenkov telescopes use the atmosphere as the primary detector. The measured radiation originates at lower altitudes and does not pass through the entire atmosphere. Thus, a decent knowledge of the profile of the atmosphere at any time is required. The latter cannot be achieved by photometric measurements of stellar sources. We show here the capabilities of our sky background model and data reduction tools for ground-based optical/infrared telescopes. Furthermore, we discuss the feasibility of monitoring the atmosphere above any observing site, and thus the possible application of the method to Čerenkov telescopes.

  16. Modeling of tool path for the CNC sheet cutting machines

    Science.gov (United States)

    Petunin, Aleksandr A.

    2015-11-01

    In the paper the problem of tool path optimization for CNC (Computer Numerical Control) cutting machines is considered. A classification of cutting techniques is offered. We also propose a new classification of tool path problems. The tasks of cost minimization and time minimization for the standard cutting technique (Continuous Cutting Problem, CCP) and for one of the non-standard cutting techniques (Segment Continuous Cutting Problem, SCCP) are formalized. We show that the optimization tasks can be interpreted as discrete optimization problems (generalized travelling salesman problem with additional constraints, GTSP). Formalization of some constraints for these tasks is described. To solve the GTSP we propose using the mathematical model of Prof. Chentsov, based on the concept of a megalopolis and dynamic programming.
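    The GTSP structure can be made concrete with a toy instance: each "megalopolis" is a cluster of candidate pierce points on a contour, and the tool must visit exactly one point per cluster while minimizing travel. Brute force over cluster orders and point choices (shown below, with made-up coordinates) is only for illustration; realistic instances need the dynamic programming approach the abstract refers to.

```python
from itertools import permutations, product

# Toy generalized travelling salesman solver by exhaustive search.
# Coordinates and clusters are invented; distances are Manhattan.
def dist(a, b):
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

def gtsp_bruteforce(start, clusters):
    """Best (cost, route) over all cluster orders and per-cluster picks."""
    best_cost, best_route = float("inf"), None
    for order in permutations(range(len(clusters))):
        for picks in product(*(clusters[i] for i in order)):
            route = (start,) + picks
            cost = sum(dist(route[i], route[i + 1]) for i in range(len(route) - 1))
            if cost < best_cost:
                best_cost, best_route = cost, route
    return best_cost, best_route

# Two megalopolises, each offering two candidate pierce points.
clusters = [[(2, 0), (5, 5)], [(4, 0), (0, 4)]]
cost, route = gtsp_bruteforce((0, 0), clusters)
print(cost, route)  # → 4 ((0, 0), (2, 0), (4, 0))
```

    The exhaustive search grows factorially with the number of clusters, which is precisely why the dynamic programming formulation over megalopolises matters for real cutting plans.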

  17. Development of hydrogeological modelling tools based on NAMMU

    Energy Technology Data Exchange (ETDEWEB)

    Marsic, N. [Kemakta Konsult AB, Stockholm (Sweden); Hartley, L.; Jackson, P.; Poole, M. [AEA Technology, Harwell (United Kingdom); Morvik, A. [Bergen Software Services International AS, Bergen (Norway)

    2001-09-01

    A number of relatively sophisticated hydrogeological models were developed within the SR 97 project to handle issues such as nesting of scales and the effects of salinity. However, these issues and others are considered of significant importance and generality to warrant further development of the hydrogeological methodology. Several such developments based on the NAMMU package are reported here: - Embedded grid: nesting of the regional- and site-scale models within the same numerical model has given greater consistency in the structural model representation and in the flow between scales. Since there is a continuous representation of the regional- and site-scales the modelling of pathways from the repository no longer has to be contained wholly by the site-scale region. This allows greater choice in the size of the site-scale. - Implicit Fracture Zones (IFZ): this method of incorporating the structural model is very efficient and allows changes to either the mesh or fracture zones to be implemented quickly. It also supports great flexibility in the properties of the structures and rock mass. - Stochastic fractures: new functionality has been added to IFZ to allow arbitrary combinations of stochastic or deterministic fracture zones with the rock-mass. Whether a fracture zone is modelled deterministically or stochastically its statistical properties can be defined independently. - Stochastic modelling: efficient methods for Monte-Carlo simulation of stochastic permeability fields have been implemented and tested on SKB's computers. - Visualisation: the visualisation tool Avizier for NAMMU has been enhanced such that it is efficient for checking models and presentation. - PROPER interface: NAMMU outputs pathlines in PROPER format so that it can be included in PA workflow. The developed methods are illustrated by application to stochastic nested modelling of the Beberg site using data from SR 97. The model properties were in accordance with the regional- and site

  18. Development of hydrogeological modelling tools based on NAMMU

    International Nuclear Information System (INIS)

    Marsic, N.; Hartley, L.; Jackson, P.; Poole, M.; Morvik, A.

    2001-09-01

    A number of relatively sophisticated hydrogeological models were developed within the SR 97 project to handle issues such as nesting of scales and the effects of salinity. However, these issues and others are considered of significant importance and generality to warrant further development of the hydrogeological methodology. Several such developments based on the NAMMU package are reported here: - Embedded grid: nesting of the regional- and site-scale models within the same numerical model has given greater consistency in the structural model representation and in the flow between scales. Since there is a continuous representation of the regional- and site-scales the modelling of pathways from the repository no longer has to be contained wholly by the site-scale region. This allows greater choice in the size of the site-scale. - Implicit Fracture Zones (IFZ): this method of incorporating the structural model is very efficient and allows changes to either the mesh or fracture zones to be implemented quickly. It also supports great flexibility in the properties of the structures and rock mass. - Stochastic fractures: new functionality has been added to IFZ to allow arbitrary combinations of stochastic or deterministic fracture zones with the rock-mass. Whether a fracture zone is modelled deterministically or stochastically its statistical properties can be defined independently. - Stochastic modelling: efficient methods for Monte-Carlo simulation of stochastic permeability fields have been implemented and tested on SKB's computers. - Visualisation: the visualisation tool Avizier for NAMMU has been enhanced such that it is efficient for checking models and presentation. - PROPER interface: NAMMU outputs pathlines in PROPER format so that it can be included in PA workflow. The developed methods are illustrated by application to stochastic nested modelling of the Beberg site using data from SR 97. The model properties were in accordance with the regional- and site

  19. "Tacit Knowledge" versus "Explicit Knowledge"

    DEFF Research Database (Denmark)

    Sanchez, Ron

    creators and carriers. By contrast, the explicit knowledge approach emphasizes processes for articulating knowledge held by individuals, the design of organizational approaches for creating new knowledge, and the development of systems (including information systems) to disseminate articulated knowledge...

  20. Planning the network of gas pipelines through modeling tools

    Energy Technology Data Exchange (ETDEWEB)

    Sucupira, Marcos L.L.; Lutif Filho, Raimundo B. [Companhia de Gas do Ceara (CEGAS), Fortaleza, CE (Brazil)

    2009-07-01

    Natural gas is a non-renewable energy source used by different sectors of the economy of Ceara. Its use may be industrial, residential, commercial, as an automotive fuel, in co-generation of energy, and as a source for generating electricity from heat. For its practicality this energy has strong market acceptance and serves a broad list of clients, which makes it possible to reach diverse parts of the city. Its distribution requires a complex network of pipelines that branches throughout the city to meet all potential clients interested in this source of energy. To facilitate the design, analysis and expansion of the network, and the location of bottlenecks and breaks in it, modeling software is used that allows the network manager to maintain the various kinds of information about the network. This paper presents the advantages of modeling the gas distribution networks of natural gas companies in Ceara, showing the tool used, the steps necessary for the implementation of the models, the advantages of using the software and the findings obtained with its use. (author)

  1. Complex Coronary Hemodynamics - Simple Analog Modelling as an Educational Tool.

    Science.gov (United States)

    Parikh, Gaurav R; Peter, Elvis; Kakouros, Nikolaos

    2017-01-01

    Invasive coronary angiography remains the cornerstone for evaluation of coronary stenoses despite there being a poor correlation between luminal loss assessment by coronary luminography and myocardial ischemia. This is especially true for coronary lesions deemed moderate by visual assessment. Coronary pressure-derived fractional flow reserve (FFR) has emerged as the gold standard for the evaluation of hemodynamic significance of coronary artery stenosis, which is cost effective and leads to improved patient outcomes. There are, however, several limitations to the use of FFR including the evaluation of serial stenoses. In this article, we discuss the electronic-hydraulic analogy and the utility of simple electrical modelling to mimic the coronary circulation and coronary stenoses. We exemplify the effect of tandem coronary lesions on the FFR by modelling of a patient with sequential disease segments and complex anatomy. We believe that such computational modelling can serve as a powerful educational tool to help clinicians better understand the complexity of coronary hemodynamics and improve patient care.
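    The electronic-hydraulic analogy described above can be reduced to a few lines: pressures behave like voltages and flow like current, so tandem stenoses act as series resistors feeding the microvascular resistance. The resistance values below are illustrative teaching numbers, not patient data or the authors' model parameters.

```python
# Analog-circuit sketch of tandem coronary stenoses: Ohm's-law analogue
# with venous pressure taken as ~0. Values are illustrative only.
def ffr_tandem(p_aortic, r_stenoses, r_micro):
    """FFR = distal pressure / aortic pressure for series stenoses."""
    r_total = sum(r_stenoses) + r_micro
    flow = p_aortic / r_total                     # I = V / R
    p_distal = p_aortic - flow * sum(r_stenoses)  # drop across stenoses
    return p_distal / p_aortic

# Two moderate serial lesions upstream of a hyperemic microvasculature.
print(round(ffr_tandem(100.0, [5.0, 10.0], 60.0), 3))  # → 0.8
```

    The model also shows why serial lesions confound FFR: removing one stenosis raises flow through the other, so the pressure drops are interdependent rather than additive, which is the pitfall the article illustrates.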

  2. Extending the Will, Skill, Tool Model of Technology Integration: Adding Pedagogy as a New Model Construct

    Science.gov (United States)

    Knezek, Gerald; Christensen, Rhonda

    2016-01-01

    An expansion of the Will, Skill, Tool Model of Technology Integration to include teacher's pedagogical style is proposed by the authors as a means of advancing the predictive power of the model for level of classroom technology integration to beyond 90%. Suggested advantages to this expansion include more precise identification of areas to be…

  3. Explicit Versus Implicit Income Insurance

    OpenAIRE

    Thomas J. Kniesner; James P. Ziliak

    2001-01-01

    October 2001 (Revised from July 2001). Abstract: By supplementing income explicitly through payments or implicitly through taxes collected, income-based taxes and transfers make disposable income less variable. Because disposable income determines consumption, policies that smooth disposable income also create welfare improving consumption insurance. With data from the Panel Study of Income Dynamics we find that annual consumption variation is reduced by almost 20 percent due to explicit and ...

  4. Crop Monitoring as a Tool for Modelling the Genesis of Millet Prices in Senegal

    Science.gov (United States)

    Jacques, D.; Marinho, E.; Defourny, P.; Waldner, F.; d'Andrimont, R.

    2015-12-01

    Food security in Sahelian countries strongly relies on the ability of markets to transfer staples from surplus to deficit areas. Market failures, leading to the inefficient geographical allocation of food, are expected to emerge from high transportation costs and information asymmetries that are common in moderately developed countries. As a result, important price differentials are observed between producing and consuming areas, which damages both poor producers and food-insecure consumers. It is then vital for policy makers to understand how the prices of agricultural commodities are formed, accounting for the existing market imperfections in addition to local demand and supply considerations. To address this issue, we have gathered a unique and diversified set of data for Senegal and integrated it in a spatially explicit model that simulates the functioning of agricultural markets and is fully consistent with economic theory. Our departure point is a local demand and supply model around each market, with its catchment area determined by the road network. We estimate the local supply of agricultural commodities from satellite imagery, while the demand is assumed to be a function of the population living in the area. From this point on, profitable transactions from areas with low prices to areas with high prices are simulated for different levels of per-kilometer transportation cost and information flows (derived from call detail records, i.e. mobile phone data). The simulated prices are then compared with the actual millet prices. Despite the parsimony of the model, which estimates only two parameters, i.e. the per-kilometer transportation cost and the information asymmetry resulting from low levels of mobile phone activity between markets, it impressively explains more than 80% of the price differentials observed in the 40 markets included in the analysis. On the one hand, these results can be used in the assessment of the social welfare impacts of the further development of
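    The core price-formation mechanism can be sketched for two markets: start from autarky prices implied by local supply and demand, then let traders ship grain from the low-price to the high-price market until the price gap no longer covers transport. The numbers and the linear price response below are assumptions for illustration, not the paper's calibrated model.

```python
# Two-market spatial arbitrage sketch: trade narrows the price gap until
# it equals the transport cost. Prices and responses are illustrative.
def equilibrate(prices, distance_km, cost_per_km, price_response=1.0, steps=1000):
    prices = list(prices)
    transport = distance_km * cost_per_km
    for _ in range(steps):
        lo, hi = (0, 1) if prices[0] < prices[1] else (1, 0)
        if prices[hi] - prices[lo] <= transport:
            break  # no profitable transaction remains
        # Shipping one unit raises the surplus market's price and lowers
        # the deficit market's price (linear response assumption).
        prices[lo] += price_response
        prices[hi] -= price_response
    return prices

# Surplus producing area at 100 XOF/kg, deficit consuming area at 160.
print(equilibrate([100.0, 160.0], distance_km=200, cost_per_km=0.1))
# → [120.0, 140.0]: the residual gap equals the 20 XOF transport cost
```

    In the paper's full model the same logic runs over 40 markets on a road network, with information asymmetry (low inter-market phone activity) acting as an extra friction on top of transport cost.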

  5. Three Software Tools for Viewing Sectional Planes, Volume Models, and Surface Models of a Cadaver Hand.

    Science.gov (United States)

    Chung, Beom Sun; Chung, Min Suk; Shin, Byeong Seok; Kwon, Koojoo

    2018-02-19

    The hand anatomy, including the complicated hand muscles, can be grasped by using computer-assisted learning tools with high quality two-dimensional images and three-dimensional models. The purpose of this study was to present up-to-date software tools that promote learning of stereoscopic morphology of the hand. On the basis of horizontal sectioned images and outlined images of a male cadaver, vertical planes, volume models, and surface models were elaborated. Software to browse pairs of the sectioned and outlined images in orthogonal planes and software to peel and rotate the volume models, as well as a portable document format (PDF) file to select and rotate the surface models, were produced. All of the software tools were downloadable free of charge and usable off-line. The three types of tools for viewing multiple aspects of the hand could be adequately employed according to individual needs. These new tools involving the realistic images of a cadaver and the diverse functions are expected to improve comprehensive knowledge of the hand shape. © 2018 The Korean Academy of Medical Sciences.

  6. A crowdsourcing model for creating preclinical medical education study tools.

    Science.gov (United States)

    Bow, Hansen C; Dattilo, Jonathan R; Jonas, Andrea M; Lehmann, Christoph U

    2013-06-01

    During their preclinical course work, medical students must memorize and recall substantial amounts of information. Recent trends in medical education emphasize collaboration through team-based learning. In the technology world, the trend toward collaboration has been characterized by the crowdsourcing movement. In 2011, the authors developed an innovative approach to team-based learning that combined students' use of flashcards to master large volumes of content with a crowdsourcing model, using a simple informatics system to enable those students to share in the effort of generating concise, high-yield study materials. The authors used Google Drive and developed a simple Java software program that enabled students to simultaneously access and edit sets of questions and answers in the form of flashcards. Through this crowdsourcing model, medical students in the class of 2014 at the Johns Hopkins University School of Medicine created a database of over 16,000 questions that corresponded to the Genes to Society basic science curriculum. An analysis of exam scores revealed that students in the class of 2014 outperformed those in the class of 2013, who did not have access to the flashcard system, and a survey of students demonstrated that users were generally satisfied with the system and found it a valuable study tool. In this article, the authors describe the development and implementation of their crowdsourcing model for creating study materials, emphasize its simplicity and user-friendliness, describe its impact on students' exam performance, and discuss how students in any educational discipline could implement a similar model of collaborative learning.

  7. Tools and Models for Integrating Multiple Cellular Networks

    Energy Technology Data Exchange (ETDEWEB)

    Gerstein, Mark [Yale Univ., New Haven, CT (United States). Gerstein Lab.

    2015-11-06

    In this grant, we have systematically investigated the integrated networks which are responsible for the coordination of activity between metabolic pathways in prokaryotes. We have developed several computational tools to analyze the topology of integrated networks consisting of metabolic, regulatory, and physical interaction networks. The tools are all open source, available to download from GitHub, and can be incorporated in the Knowledgebase. Here, we summarize our work as follows. Understanding the topology of the integrated networks is the first step toward understanding their dynamics and evolution. For Aim 1 of this grant, we have developed a novel algorithm to determine and measure the hierarchical structure of transcriptional regulatory networks [1]. The hierarchy captures the direction of information flow in the network. The algorithm is generally applicable to regulatory networks in prokaryotes, yeast and higher organisms. Integrated datasets are extremely beneficial in understanding the biology of a system in a compact manner due to the conflation of multiple layers of information. Therefore, for Aim 2 of this grant, we have developed several tools and carried out analyses for integrating system-wide genomic information. To make use of the structural data, we have developed DynaSIN for protein-protein interaction networks with various dynamical interfaces [2]. We then examined the association between network topology and phenotypic effects such as gene essentiality. In particular, we have organized the E. coli and S. cerevisiae transcriptional regulatory networks into hierarchies. We then correlated gene phenotypic effects by tinkering with different layers to elucidate which layers were more tolerant to perturbations [3]. In the context of evolution, we also developed a workflow to guide the comparison between different types of biological networks across various species using the concept of rewiring [4]. Furthermore, we have developed
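    The idea of organizing a regulatory network into a hierarchy can be illustrated with a generic longest-path layering over a directed acyclic graph: master regulators end up at level 0, downstream targets below them. This is a textbook layering sketch, not the published algorithm.

```python
# Assign hierarchy levels in a regulatory DAG: a node's level is the
# longest regulatory path leading into it. Assumes the network is
# acyclic; a cyclic input would make the relaxation loop forever.
def hierarchy_levels(edges):
    nodes = {n for e in edges for n in e}
    level = {n: 0 for n in nodes}
    changed = True
    while changed:                  # relax edges until levels stabilize
        changed = False
        for regulator, target in edges:
            if level[target] < level[regulator] + 1:
                level[target] = level[regulator] + 1
                changed = True
    return level

# Tiny regulatory cascade: TF A -> TF B -> gene C, plus A -> C directly.
edges = [("A", "B"), ("B", "C"), ("A", "C")]
print(sorted(hierarchy_levels(edges).items()))  # → [('A', 0), ('B', 1), ('C', 2)]
```

    Once every node carries a level, "information flow" statistics (e.g. how many edges point down versus up the hierarchy) fall out directly.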

  8. Assimilation of Remotely Sensed Leaf Area Index into the Community Land Model with Explicit Carbon and Nitrogen Components using Data Assimilation Research Testbed

    Science.gov (United States)

    Ling, X.; Fu, C.; Yang, Z. L.; Guo, W.

    2017-12-01

    Information on the spatial and temporal patterns of leaf area index (LAI) is crucial to understanding the exchanges of momentum, carbon, energy, and water between the terrestrial ecosystem and the atmosphere, while both in-situ observation and model simulation usually show distinct deficiencies in terms of LAI coverage and value. Land data assimilation, combining observation and simulation, is a promising way to provide improved variable estimates. The Data Assimilation Research Testbed (DART), developed and maintained by the National Center for Atmospheric Research (NCAR), provides a powerful tool to facilitate the combination of assimilation algorithms, models, and real (as well as synthetic) observations to better understand all three. Here we systematically investigated the effects of data assimilation on improving LAI simulation based on the NCAR Community Land Model with the prognostic carbon-nitrogen option (CLM4CN), linked with DART using the deterministic Ensemble Adjustment Kalman Filter (EAKF). Random 40-member atmospheric forcing was used to drive the CLM4CN with or without LAI assimilation. The Global Land Surface Satellite (GLASS) LAI product is assimilated into the CLM4CN at a frequency of 8 days, and LAI (and leaf carbon/nitrogen) are adjusted at each time step. The results show that assimilating remotely sensed LAI into the CLM4CN is an effective method for improving model performance. In detail, the CLM4CN-simulated LAI systematically overestimates global LAI, especially at low latitudes, with a largest bias of 5 m2/m2. If both LAI and leaf carbon and nitrogen are updated simultaneously during assimilation, the analyzed LAI is corrected, especially in low-latitude regions, with the bias kept within about ±1 m2/m2. The analyzed LAI can also represent the seasonal variation, except for the Southern Temperate zone (23°S-90°S). The most clearly improved regions are located in central Africa, the Amazon, southern Eurasia, and the northeast of
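    The EAKF increment at the heart of this setup can be sketched for a scalar state: the ensemble is shifted to the posterior mean and its spread contracted, preserving the rank order of members. This is a minimal textbook EAKF update, not DART code, and the LAI numbers are invented.

```python
import statistics

# Scalar Ensemble Adjustment Kalman Filter update: deterministic shift
# and contraction of the ensemble toward the observation.
def eakf_update(ensemble, obs, obs_var):
    prior_mean = statistics.mean(ensemble)
    prior_var = statistics.variance(ensemble)
    # Combine prior and observation precisions (1/variance).
    post_var = 1.0 / (1.0 / prior_var + 1.0 / obs_var)
    post_mean = post_var * (prior_mean / prior_var + obs / obs_var)
    shrink = (post_var / prior_var) ** 0.5
    # Shift the mean and contract the spread; member order is preserved.
    return [post_mean + shrink * (x - prior_mean) for x in ensemble]

prior = [4.0, 5.0, 6.0]   # modelled LAI (m2/m2) from 3 ensemble members
posterior = eakf_update(prior, obs=3.5, obs_var=1.0)
print([round(x, 2) for x in posterior])  # → [3.54, 4.25, 4.96]
```

    In the CLM4CN-DART system the same kind of increment computed for LAI is also regressed onto leaf carbon and nitrogen, which is why updating those pools jointly keeps the analysis consistent.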

  9. Explicit MDS Codes with Complementary Duals

    DEFF Research Database (Denmark)

    Beelen, Peter; Jin, Lingfei

    2018-01-01

    In 1964, Massey introduced a class of codes with complementary duals which are called Linear Complementary Dual (LCD for short) codes. He showed that LCD codes have applications in communication systems, side-channel attack (SCA) countermeasures, and so on. LCD codes have been extensively studied in the literature. On the other hand, MDS codes form an optimal family of classical codes which have wide applications in both theory and practice. The main purpose of this paper is to give an explicit construction of several classes of LCD MDS codes, using tools from algebraic function fields. We exemplify this construction...

  10. 33 CFR 385.33 - Revisions to models and analytical tools.

    Science.gov (United States)

    2010-07-01

    ... on a case-by-case basis what documentation is appropriate for revisions to models and analytic tools... analytical tools. 385.33 Section 385.33 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE... Incorporating New Information Into the Plan § 385.33 Revisions to models and analytical tools. (a) In carrying...

  11. Improving Power System Modeling. A Tool to Link Capacity Expansion and Production Cost Models

    Energy Technology Data Exchange (ETDEWEB)

    Diakov, Victor [National Renewable Energy Lab. (NREL), Golden, CO (United States); Cole, Wesley [National Renewable Energy Lab. (NREL), Golden, CO (United States); Sullivan, Patrick [National Renewable Energy Lab. (NREL), Golden, CO (United States); Brinkman, Gregory [National Renewable Energy Lab. (NREL), Golden, CO (United States); Margolis, Robert [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2015-11-01

    Capacity expansion models (CEMs) provide a high-level, long-term view of the prospects of the evolving power system. In simulating the possibilities of long-term capacity expansion, it is important to maintain the viability of power system operation on short-term (daily, hourly and sub-hourly) scales. Production cost models (PCMs) simulate routine power system operation on these shorter time scales using detailed load, transmission and generation fleet data, by minimizing production costs while following reliability requirements. When based on CEM 'predictions' about generating unit retirements and buildup, PCMs provide a more detailed simulation of short-term system operation and, consequently, may confirm the validity of the capacity expansion predictions. Further, production cost model simulations of a system that is based on a capacity expansion model solution are 'evolutionarily' sound: the generator mix is the result of a logical sequence of unit retirement and buildup resulting from policy and incentives. The above has motivated us to bridge CEMs with PCMs by building a capacity expansion-to-production cost model Linking Tool (CEPCoLT). The Linking Tool is built to map capacity expansion model prescriptions onto production cost model inputs. NREL's ReEDS and Energy Exemplar's PLEXOS are the capacity expansion and the production cost models, respectively. Via the Linking Tool, PLEXOS provides details of operation for the regionally defined ReEDS scenarios.

  12. Uncertainty Analysis of Coupled Socioeconomic-Cropping Models: Building Confidence in Climate Change Decision-Support Tools for Local Stakeholders

    Science.gov (United States)

    Malard, J. J.; Rojas, M.; Adamowski, J. F.; Gálvez, J.; Tuy, H. A.; Melgar-Quiñonez, H.

    2015-12-01

    While cropping models represent the biophysical aspects of agricultural systems, system dynamics modelling offers the possibility of representing the socioeconomic (including social and cultural) aspects of these systems. The two types of models can then be coupled in order to include the socioeconomic dimensions of climate change adaptation in the predictions of cropping models. We develop a dynamically coupled socioeconomic-biophysical model of agricultural production and its repercussions on food security in two case studies from Guatemala (a market-based, intensive agricultural system and a low-input, subsistence crop-based system). Through the specification of the climate inputs to the cropping model, the impacts of climate change on the entire system can be analysed, and the participatory nature of the system dynamics model-building process, in which stakeholders from NGOs to local governmental extension workers were included, helps ensure local trust in and use of the model. However, the analysis of climate variability's impacts on agroecosystems involves uncertainty, especially in the case of joint physical-socioeconomic modelling, and the explicit representation of this uncertainty in the participatory development of the models is important to ensure appropriate use of the models by the end users. In addition, standard model calibration, validation, and uncertainty interval estimation techniques used for physically based models are impractical in the case of socioeconomic modelling. We present a methodology for the calibration and uncertainty analysis of coupled biophysical (cropping) and system dynamics (socioeconomic) agricultural models, using survey data and expert input to calibrate and evaluate the uncertainty of the system dynamics model as well as of the overall coupled model. This approach offers an important tool for local decision makers to evaluate the potential impacts of climate change and their feedbacks through the associated socioeconomic system.

  13. Single Event Kinetic Modelling without Explicit Generation of Large Networks: Application to Hydrocracking of Long Paraffins

    Directory of Open Access Journals (Sweden)

    Guillaume D.

    2011-08-01

    Full Text Available The single event modelling concept allows developing kinetic models for the simulation of refinery processes. For reaction networks with several hundreds of thousands of species, as is the case for catalytic reforming, rigorous relumping by carbon atom number and branching degree was efficiently employed by assuming chemical equilibrium in each lump. This relumping technique yields a compact lumped model without any loss of information, but requires the full detail of an explicitly generated reaction network. Classic network generation techniques become impractical when the hydrocarbon species contain more than approximately 20 carbon atoms, because of the extremely rapid growth of the reaction network. Hence, implicit relumping techniques were developed in order to compute lumping coefficients without generating the detailed reaction network. Two alternative and equivalent approaches are presented, based either on structural classes or on lateral chain decomposition. These two methods are discussed and the lateral chain decomposition method is applied to the kinetic modelling of long chain paraffin hydroisomerization and hydrocracking. The lateral chain decomposition technique is exactly equivalent to the original calculation method based on the explicitly generated detailed reaction network, as long as Benson's group contribution method is used to calculate the necessary thermodynamic data in both approaches.

  14. Analytical Modeling Tool for Design of Hydrocarbon Sensitive Optical Fibers

    Directory of Open Access Journals (Sweden)

    Khalil Al Handawi

    2017-09-01

    Full Text Available Pipelines are the main transportation means for oil and gas products across large distances. Due to the severe conditions they operate in, they are regularly inspected using conventional Pipeline Inspection Gages (PIGs for corrosion damage. The motivation for researching a real-time distributed monitoring solution arose to mitigate costs and provide a proactive indication of potential failures. Fiber optic sensors with polymer claddings provide a means of detecting contact with hydrocarbons. By coating the fibers with a layer of metal similar in composition to that of the parent pipeline, corrosion of this coating may be detected when the polymer cladding underneath is exposed to the surrounding hydrocarbons contained within the pipeline. A Refractive Index (RI change occurs in the polymer cladding causing a loss in intensity of a traveling light pulse due to a reduction in the fiber’s modal capacity. Intensity losses may be detected using Optical Time Domain Reflectometry (OTDR while pinpointing the spatial location of the contact via time delay calculations of the back-scattered pulses. This work presents a theoretical model for the above sensing solution to provide a design tool for the fiber optic cable in the context of hydrocarbon sensing following corrosion of an external metal coating. Results are verified against the experimental data published in the literature.

  15. Analytical Modeling Tool for Design of Hydrocarbon Sensitive Optical Fibers.

    Science.gov (United States)

    Al Handawi, Khalil; Vahdati, Nader; Shiryayev, Oleg; Lawand, Lydia

    2017-09-28

    Pipelines are the main transportation means for oil and gas products across large distances. Due to the severe conditions they operate in, they are regularly inspected using conventional Pipeline Inspection Gages (PIGs) for corrosion damage. The motivation for researching a real-time distributed monitoring solution arose to mitigate costs and provide a proactive indication of potential failures. Fiber optic sensors with polymer claddings provide a means of detecting contact with hydrocarbons. By coating the fibers with a layer of metal similar in composition to that of the parent pipeline, corrosion of this coating may be detected when the polymer cladding underneath is exposed to the surrounding hydrocarbons contained within the pipeline. A Refractive Index (RI) change occurs in the polymer cladding causing a loss in intensity of a traveling light pulse due to a reduction in the fiber's modal capacity. Intensity losses may be detected using Optical Time Domain Reflectometry (OTDR) while pinpointing the spatial location of the contact via time delay calculations of the back-scattered pulses. This work presents a theoretical model for the above sensing solution to provide a design tool for the fiber optic cable in the context of hydrocarbon sensing following corrosion of an external metal coating. Results are verified against the experimental data published in the literature.
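    The localisation step described above is a time-of-flight calculation. A minimal sketch (the group index and the delay value below are illustrative assumptions, not figures from the paper):

```python
# Time-of-flight localisation of a back-scattered OTDR event.
# Group index and delay are illustrative assumptions, not values from the paper.

C_VACUUM = 2.99792458e8  # speed of light in vacuum, m/s

def event_distance(round_trip_delay_s, group_index=1.468):
    """Distance along the fiber to the event; the pulse travels out and back,
    hence the factor of 2 in the denominator."""
    return C_VACUUM * round_trip_delay_s / (2.0 * group_index)

print(round(event_distance(10e-6)))  # a 10 us round trip ~ 1021 m of fiber
```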

  16. Modeling the Spray Forming of H13 Steel Tooling

    Science.gov (United States)

    Lin, Yaojun; McHugh, Kevin M.; Zhou, Yizhang; Lavernia, Enrique J.

    2007-07-01

    On the basis of a numerical model, the temperature and liquid fraction of spray-formed H13 tool steel are calculated as a function of time. Results show that preheating the substrate to an appropriate temperature increases the liquid fraction in the deposited steel and can thereby lead to very low porosity. The calculated cooling rate can lead to a microstructure consisting of martensite, lower bainite, retained austenite, and proeutectoid carbides in as-spray-formed material. In the temperature range between the solidus and liquidus temperatures, the calculated temperature of the spray-formed material increases with increasing substrate preheat temperature, again raising the liquid fraction and lowering the porosity. In the temperature region where austenite decomposition occurs, the substrate preheat temperature has a negligible influence on the cooling rate of the spray-formed material. On the basis of the calculated results, it is possible to generate a sufficient liquid fraction during spray forming by using a high growth rate of the deposit without preheating the substrate, and the growth rate of the deposit has almost no influence on the cooling rate in the temperature region of austenite decomposition.

  17. ModelMage: a tool for automatic model generation, selection and management.

    Science.gov (United States)

    Flöttmann, Max; Schaber, Jörg; Hoops, Stephan; Klipp, Edda; Mendes, Pedro

    2008-01-01

    Mathematical modeling of biological systems usually involves implementing, simulating, and discriminating several candidate models that represent alternative hypotheses. Generating and managing these candidate models is a tedious and difficult task and can easily lead to errors. ModelMage is a tool that facilitates management of candidate models. It is designed for the easy and rapid development, generation, simulation, and discrimination of candidate models. The main idea of the program is to automatically create a defined set of model alternatives from a single master model. The user provides only one SBML model and a set of directives from which the candidate models are created by leaving out species, modifiers or reactions. When data are available, the software can automatically fit all of these models to the data and provides a ranking for model selection. In contrast to other model generation programs, ModelMage aims at generating only a limited set of models that the user can precisely define. ModelMage uses COPASI as a simulation and optimization engine; thus, all simulation and optimization features of COPASI are readily incorporated. ModelMage can be downloaded from http://sysbio.molgen.mpg.de/modelmage and is distributed as free software.
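    The master-model idea can be sketched generically. The snippet below is not ModelMage's actual API; the reaction names and the notion of "optional" reactions are hypothetical placeholders for the SBML directives the abstract describes:

```python
# Hypothetical sketch of master-model expansion (not ModelMage's API):
# every candidate is the master model minus some subset of optional reactions.
from itertools import combinations

def candidate_models(master_reactions, optional):
    """Yield the master reaction set minus every subset of 'optional'."""
    optional = [r for r in master_reactions if r in optional]
    for k in range(len(optional) + 1):
        for removed in combinations(optional, k):
            yield [r for r in master_reactions if r not in removed]

master = ["v_synthesis", "v_degradation", "v_feedback", "v_transport"]
cands = list(candidate_models(master, optional=["v_feedback", "v_transport"]))
print(len(cands))  # 4: both optional reactions, one of each, neither
```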

  18. Accelerated bridge construction (ABC) decision making and economic modeling tool.

    Science.gov (United States)

    2011-12-01

    In this FHWA-sponsored pooled-fund study, a set of decision-making tools based on the Analytic Hierarchy Process (AHP) was developed. This tool set is prepared for transportation specialists and decision-makers to determine if ABC is more effective ...

  19. The Will, Skill, Tool Model of Technology Integration: Adding Pedagogy as a New Model Construct

    Science.gov (United States)

    Knezek, Gerald; Christensen, Rhonda

    2015-01-01

    An expansion of the Will, Skill, Tool Model of Technology Integration to include teacher's pedagogical style is proposed by the authors as a means of advancing the predictive power for level of classroom technology integration to beyond 90%. Suggested advantages to this expansion include more precise identification of areas to be targeted for…

  20. Verifying Real-Time Systems using Explicit-time Description Methods

    Directory of Open Access Journals (Sweden)

    Hao Wang

    2009-12-01

    Full Text Available Timed model checking has been extensively researched in recent years. Many new formalisms with time extensions, and tools based on them, have been presented. On the other hand, explicit-time description methods aim to verify real-time systems with general untimed model checkers. Lamport presented an explicit-time description method using a clock-ticking process (Tick) to simulate the passage of time, together with a group of global variables for time requirements. This paper proposes a new explicit-time description method with no reliance on global variables. Instead, it uses rendezvous synchronization steps between the Tick process and each system process to simulate time. This new method achieves better modularity and facilitates the use of more complex timing constraints. The two explicit-time description methods are implemented in DIVINE, a well-known distributed-memory model checker. Preliminary experimental results show that our new method, with better modularity, is comparable to Lamport's method with respect to time and memory efficiency.
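    The contrast between the two methods hinges on how ticking is modelled. Below is a sequential toy caricature of the shared-variable (Lamport-style) flavour, with countdown timers standing in for the global time variables; the actual models are concurrent DIVINE processes, and the paper's rendezvous variant replaces the shared timers with synchronization steps:

```python
# Toy, sequential caricature of explicit-time description: a Tick "process"
# advances global time, and each system process holds a countdown timer
# encoding its timing constraint. (Not the paper's DIVINE models.)

def run(timers, horizon):
    """Advance time tick by tick; report each process when its timer expires."""
    fired = []
    for now in range(1, horizon + 1):        # the Tick process
        for name in list(timers):
            timers[name] -= 1                # one time unit passes
            if timers[name] == 0:            # timing constraint met
                fired.append((name, now))
                del timers[name]
    return fired

print(run({"p1": 2, "p2": 5}, horizon=10))  # [('p1', 2), ('p2', 5)]
```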

  1. Age effects on explicit and implicit memory

    Directory of Open Access Journals (Sweden)

    Emma eWard

    2013-09-01

    Full Text Available It is well documented that explicit memory (e.g., recognition declines with age. In contrast, many argue that implicit memory (e.g., priming is preserved in healthy aging. For example, priming on tasks such as perceptual identification is often not statistically different in groups of young and older adults. Such observations are commonly taken as evidence for distinct explicit and implicit learning/memory systems. In this article we discuss several lines of evidence that challenge this view. We describe how patterns of differential age-related decline may arise from differences in the ways in which the two forms of memory are commonly measured, and review recent research suggesting that under improved measurement methods, implicit memory is not age-invariant. Formal computational models are of considerable utility in revealing the nature of underlying systems. We report the results of applying single and multiple-systems models to data on age effects in implicit and explicit memory. Model comparison clearly favours the single-system view. Implications for the memory systems debate are discussed.

  2. Age effects on explicit and implicit memory.

    Science.gov (United States)

    Ward, Emma V; Berry, Christopher J; Shanks, David R

    2013-01-01

    It is well-documented that explicit memory (e.g., recognition) declines with age. In contrast, many argue that implicit memory (e.g., priming) is preserved in healthy aging. For example, priming on tasks such as perceptual identification is often not statistically different in groups of young and older adults. Such observations are commonly taken as evidence for distinct explicit and implicit learning/memory systems. In this article we discuss several lines of evidence that challenge this view. We describe how patterns of differential age-related decline may arise from differences in the ways in which the two forms of memory are commonly measured, and review recent research suggesting that under improved measurement methods, implicit memory is not age-invariant. Formal computational models are of considerable utility in revealing the nature of underlying systems. We report the results of applying single and multiple-systems models to data on age effects in implicit and explicit memory. Model comparison clearly favors the single-system view. Implications for the memory systems debate are discussed.

  3. Multi-category micro-milling tool wear monitoring with continuous hidden Markov models

    Science.gov (United States)

    Zhu, Kunpeng; Wong, Yoke San; Hong, Geok Soon

    2009-02-01

    In-process monitoring of tool conditions is important in micro-machining due to the high precision requirement and the high tool wear rate. Tool condition monitoring in micro-machining poses new challenges compared to conventional machining. In this paper, a multi-category classification approach is proposed for tool flank wear state identification in micro-milling. Continuous Hidden Markov models (HMMs) are adapted for modeling of the tool wear process in micro-milling and for estimation of the tool wear state given the cutting force features. To make the approach robust to noise, the HMM outputs are passed through a median filter to suppress spurious state transitions caused by the high noise level. A detailed study on the selection of HMM structures for tool condition monitoring (TCM) is presented. Case studies on tool state estimation in the micro-milling of pure copper and steel demonstrate the effectiveness and potential of these methods.
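    A drastically simplified sketch of that pipeline: a discrete two-state HMM decodes wear states from quantized force features, and a median filter then removes isolated, noise-induced state jumps. The paper uses continuous-density HMMs; all probabilities below are invented for illustration:

```python
# Invented-number sketch: a two-state discrete HMM (fresh vs worn) decodes
# wear states from quantized force features; a median filter then removes
# isolated, noise-induced state jumps. The paper uses continuous-density HMMs.

def map_states(obs, trans, emit, prior):
    """Per-step MAP state from the HMM forward recursion."""
    n = len(prior)
    alpha = [prior[s] * emit[s][obs[0]] for s in range(n)]
    path = [max(range(n), key=lambda s: alpha[s])]
    for o in obs[1:]:
        alpha = [emit[s][o] * sum(alpha[t] * trans[t][s] for t in range(n))
                 for s in range(n)]
        path.append(max(range(n), key=lambda s: alpha[s]))
    return path

def median3(seq):
    """3-point median filter; edge samples pass through unchanged."""
    out = list(seq)
    for i in range(1, len(seq) - 1):
        out[i] = sorted(seq[i - 1:i + 2])[1]
    return out

trans = [[0.9, 0.1], [0.0, 1.0]]   # wear only progresses: fresh -> worn
emit = [[0.8, 0.2], [0.3, 0.7]]    # P(low/high force | state)
raw = map_states([0, 0, 1, 0, 1, 1, 1], trans, emit, prior=[1.0, 0.0])
print(raw)                               # decoded wear states
print(median3([0, 0, 1, 0, 1, 1, 1]))   # [0, 0, 0, 1, 1, 1, 1]
```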

  4. Tailored graph ensembles as proxies or null models for real networks I: tools for quantifying structure

    International Nuclear Information System (INIS)

    Annibale, A; Coolen, A C C; Fernandes, L P; Fraternali, F; Kleinjung, J

    2009-01-01

    We study the tailoring of structured random graph ensembles to real networks, with the objective of generating precise and practical mathematical tools for quantifying and comparing network topologies macroscopically, beyond the level of degree statistics. Our family of ensembles can produce graphs with any prescribed degree distribution and any degree-degree correlation function; its control parameters can be calculated fully analytically, and as a result we can calculate (asymptotically) formulae for entropies and complexities and for information-theoretic distances between networks, expressed directly and explicitly in terms of their measured degree distribution and degree correlations.
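    The simplest member of this family, control of the degree distribution alone, is the classic configuration model, which can be sampled by stub matching. The sketch below does only that; the analytic control of degree-degree correlations described in the paper is not attempted here:

```python
# Stub-matching ("configuration model") sampler for a prescribed degree
# sequence; self-loops and multi-edges are allowed in this raw variant.
import random
from collections import Counter

def configuration_model(degrees, seed=0):
    rng = random.Random(seed)
    stubs = [node for node, k in enumerate(degrees) for _ in range(k)]
    assert len(stubs) % 2 == 0, "degree sum must be even"
    rng.shuffle(stubs)
    return [tuple(sorted(stubs[i:i + 2])) for i in range(0, len(stubs), 2)]

edges = configuration_model([2, 2, 1, 1])
deg = Counter(v for e in edges for v in e)   # recount degrees from the edges
print([deg[n] for n in range(4)])            # [2, 2, 1, 1] by construction
```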

  5. Response Surface Modeling Tool Suite, Version 1.x

    Energy Technology Data Exchange (ETDEWEB)

    2016-07-05

    The Response Surface Modeling (RSM) Tool Suite is a collection of three codes used to generate an empirical interpolation function for a collection of drag coefficient calculations computed with Test Particle Monte Carlo (TPMC) simulations. The first code, "Automated RSM", automates the generation of a drag coefficient RSM for a particular object to a single command. "Automated RSM" first creates a Latin Hypercube Sample (LHS) of 1,000 ensemble members to explore the global parameter space. For each ensemble member, a TPMC simulation is performed and the object drag coefficient is computed. In the next step of the "Automated RSM" code, a Gaussian process is used to fit the TPMC simulations. In the final step, Markov Chain Monte Carlo (MCMC) is used to evaluate the non-analytic probability distribution function from the Gaussian process. The second code, "RSM Area", creates a look-up table for the projected area of the object based on input limits on the minimum and maximum allowed pitch and yaw angles and pitch and yaw angle intervals. The projected area from the look-up table is used to compute the ballistic coefficient of the object based on its pitch and yaw angle. An accurate ballistic coefficient is crucial in accurately computing the drag on an object. The third code, "RSM Cd", uses the RSM generated by the "Automated RSM" code and the projected area look-up table generated by the "RSM Area" code to accurately compute the drag coefficient and ballistic coefficient of the object. The user can modify the object velocity, object surface temperature, the translational temperature of the gas, the species concentrations of the gas, and the pitch and yaw angles of the object. Together, these codes allow for the accurate derivation of an object's drag coefficient and ballistic coefficient under any conditions with only knowledge of the object's geometry and mass.
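    The LHS step of "Automated RSM" can be sketched in a few lines. The dimensions and bounds below are placeholders (e.g. object velocity in m/s and surface temperature in K), not the suite's actual TPMC input space:

```python
# Minimal Latin Hypercube Sampler: one point per equal-width stratum in each
# dimension, with random pairing across dimensions. Bounds are placeholders.
import random

def latin_hypercube(n_samples, bounds, seed=0):
    rng = random.Random(seed)
    cols = []
    for lo, hi in bounds:
        # one uniform draw inside each of n_samples strata, then shuffle
        pts = [lo + (hi - lo) * (i + rng.random()) / n_samples
               for i in range(n_samples)]
        rng.shuffle(pts)
        cols.append(pts)
    return list(zip(*cols))

# e.g. (velocity m/s, surface temperature K) -- illustrative ranges only
sample = latin_hypercube(5, bounds=[(7000.0, 8000.0), (300.0, 2000.0)])
print(len(sample))  # 5 design points
```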

  6. Enhancing Formal Modelling Tool Support with Increased Automation

    DEFF Research Database (Denmark)

    Lausdahl, Kenneth

    Progress report for the qualification exam of PhD student Kenneth Lausdahl. Initial work on enhancing tool support for the formal method VDM, and the concept of unifying an abstract syntax tree with the ability for isolated extensions, is described. The tool support includes a connection to UML and a test automation principle based on traces written as a kind of regular expressions.

  7. An Overview of the Object Protocol Model (OPM) and the OPM Data Management Tools.

    Science.gov (United States)

    Chen, I-Min A.; Markowitz, Victor M.

    1995-01-01

    Discussion of database management tools for scientific information focuses on the Object Protocol Model (OPM) and data management tools based on OPM. Topics include the need for new constructs for modeling scientific experiments, modeling object structures and experiments in OPM, queries and updates, and developing scientific database applications…

  8. Time-dependent density functional theory (TD-DFT) coupled with reference interaction site model self-consistent field explicitly including spatial electron density distribution (RISM-SCF-SEDD)

    Energy Technology Data Exchange (ETDEWEB)

    Yokogawa, D., E-mail: d.yokogawa@chem.nagoya-u.ac.jp [Department of Chemistry, Graduate School of Science, Nagoya University, Chikusa, Nagoya 464-8602 (Japan); Institute of Transformative Bio-Molecules (WPI-ITbM), Nagoya University, Chikusa, Nagoya 464-8602 (Japan)

    2016-09-07

    Theoretical approaches to the design of bright bio-imaging molecules are progressing rapidly. However, because of the system size and the required computational accuracy, the number of theoretical studies is, to our knowledge, limited. To overcome these difficulties, we developed a new method based on the reference interaction site model self-consistent field explicitly including spatial electron density distribution and time-dependent density functional theory. We applied it to the calculation of indole and 5-cyanoindole in the ground and excited states in the gas and solution phases. The changes in the optimized geometries were clearly explained with resonance structures, and the Stokes shift was correctly reproduced.

  9. A Modeling Tool for Household Biogas Burner Flame Port Design

    Science.gov (United States)

    Decker, Thomas J.

    Anaerobic digestion is a well-known and potentially beneficial process for rural communities in emerging markets, providing the opportunity to generate usable gaseous fuel from agricultural waste. With recent developments in low-cost digestion technology, communities across the world are gaining affordable access to the benefits of anaerobic digestion derived biogas. For example, biogas can displace conventional cooking fuels such as biomass (wood, charcoal, dung) and Liquefied Petroleum Gas (LPG), effectively reducing harmful emissions and fuel cost respectively. To support the ongoing scaling effort of biogas in rural communities, this study has developed and tested a design tool aimed at optimizing flame port geometry for household biogas-fired burners. The tool consists of a multi-component simulation that incorporates three-dimensional CAD designs with simulated chemical kinetics and computational fluid dynamics. An array of circular and rectangular port designs was developed for a widely available biogas stove (called the Lotus) as part of this study. These port designs were created through guidance from previous studies found in the literature. The three highest performing designs identified by the tool were manufactured and tested experimentally to validate tool output and to compare against the original port geometry. The experimental results aligned with the tool's prediction for the three chosen designs. Each design demonstrated improved thermal efficiency relative to the original, with one configuration of circular ports exhibiting superior performance. The results of the study indicated that designing for a targeted range of port hydraulic diameter, velocity and mixture density in the tool is a relevant way to improve the thermal efficiency of a biogas burner. Conversely, the emissions predictions made by the tool were found to be unreliable and incongruent with laboratory experiments.
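    The port hydraulic diameter targeted by the tool follows the standard relation D_h = 4A/P; a small sketch with example dimensions (not the Lotus burner's measured geometry):

```python
# D_h = 4A/P; for a circular port this reduces to the port diameter itself.
# The 3 mm figure is an example, not the Lotus burner's measured geometry.
import math

def hydraulic_diameter(area_m2, perimeter_m):
    return 4.0 * area_m2 / perimeter_m

d = 0.003                        # 3 mm circular port
a = math.pi * (d / 2.0) ** 2     # port area
p = math.pi * d                  # wetted perimeter
print(round(hydraulic_diameter(a, p) * 1000, 1))  # 3.0 mm
```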

  10. Hypersonic Control Modeling and Simulation Tool for Lifting Towed Ballutes, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Aerospace Corporation proposes to develop a hypersonic control modeling and simulation tool for hypersonic aeroassist vehicles. Our control and simulation...

  11. Meteoric 10Be as a tool to investigate human induced soil fluxes: a conceptual model

    Science.gov (United States)

    Campforts, Benjamin; Govers, Gerard; Vanacker, Veerle; De Vente, Joris; Boix-Fayos, Carolina; Minella, Jean; Baken, Stijn; Smolders, Erik

    2014-05-01

    The use of meteoric 10Be as a tool to understand long-term landscape behavior is becoming increasingly popular. Due to its long residence time, meteoric 10Be in principle allows investigating in situ erosion rates over time scales exceeding the period studied with classical approaches such as 137Cs. The use of meteoric 10Be strongly complements the traditional interpretation of sedimentary archives, which cannot be unequivocally coupled to sediment production and could provide biased information over longer time scales (Sadler, 1981). So far, meteoric 10Be has successfully been used in geochemical fingerprinting of sediments, to date soil profiles, to assess soil residence times and to quantify downslope soil fluxes using accumulated 10Be inventories along a hillslope. However, less attention has been given to the potential use of the tracer to directly assess human-induced changes in soil fluxes through deforestation, cultivation and reforestation. A good understanding of the processes governing the distribution of meteoric 10Be both within the soil profile and at the landscape scale is essential before meteoric 10Be can be successfully applied to assess human impact. We developed a spatially explicit 2D model (Be2D) in order to gain insight into meteoric 10Be movement along a hillslope that is subject to human disturbance. Be2D integrates both horizontal soil fluxes and vertical meteoric 10Be movement throughout the soil profile. Horizontal soil fluxes are predicted using (i) well-studied geomorphological laws for natural erosion and soil formation as well as (ii) human-accelerated water and tillage erosion. Vertical movement of meteoric 10Be throughout the soil profile is implemented by inserting depth-dependent retardation calculated using experimentally determined partition coefficients (Kd). 
The model was applied to different environments such as (i) the Belgian loess belt, characterized by aeolian deposits enriched in inherited meteoric 10Be, (ii) highly degraded and stony
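    For a linear sorption isotherm, the depth-dependent retardation mentioned above follows the standard relation R = 1 + (rho_b/theta)Kd; a sketch with illustrative, not calibrated, values:

```python
# R = 1 + (rho_b / theta) * Kd for a linear sorption isotherm; the tracer's
# downward velocity is the pore-water velocity divided by R. Values invented.

def retardation(kd_m3_per_kg, bulk_density_kg_m3, water_content):
    return 1.0 + (bulk_density_kg_m3 / water_content) * kd_m3_per_kg

# strongly sorbing 10Be in a hypothetical loamy topsoil
print(round(retardation(0.05, bulk_density_kg_m3=1400.0, water_content=0.3)))
# -> 234: the tracer moves roughly 234x slower than the percolating water
```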

  12. PHREEQC modelling of concrete/clay interactions in a 2D geometry with explicit effect of porosity evolution on transport properties due to mineralogical changes

    International Nuclear Information System (INIS)

    Claret, F.; Marty, N.C.M.; Tournassat, C.; Gaboreau, S.; Burnol, A.; Chiaberge, C.; Gaucher, E.C.; Munier, I.; Cochepin, B.; Michau, N.

    2010-01-01

    Document available in extended abstract form only. In the context of a deep repository for radioactive waste, significant use of concrete will be made. This material constitutes a compromise between properties, technical uses and costs. Within the French concepts, concrete will be used to build access structures and drifts as well as waste disposal cells and waste packages for Intermediate Level Wastes (ILW). With this design, concrete will be at the interface with the host rock (Callovo-Oxfordian argillites in our case) and/or the clay plug built with swelling clay such as bentonite. Due to the chemical disequilibrium between concrete and clay, chemical reactions can modify both the chemical and physical properties of these materials (e.g. mineralogical composition, diffusion coefficient...). In order to assess the long-term behaviour of concrete/clay interfaces and the evolution of their properties with time, predictive modelling has to be performed. The high chemical contrast (e.g. pH or pe at the interface) often leads to problems of numerical convergence. Our own experience showed that PHREEQC is very successful in handling such difficulties in 1D geometry. PHREEQC is also able to handle 2D geometries, as presented hereafter, thanks to the MIX option, as well as the feedback of mineralogical changes on porosity thanks to the MCD option (multi-component diffusion). Indeed, a 2D simulation of a drift sealing concept developed by Andra was attempted using PHREEQC with the MIX option, which allows the use of different transport properties in the different cells. A basic program was developed to generate this complex 2D mesh and another one to treat the outputs under TECPLOT. The mesh is composed of 3081 cells with a refinement of 3 cm at each interface. Such a simulation was already conducted with the ALLIANCES geochemistry-transport tools, but in our case the mesh refinement and the chemistry of the system are extended and the feedback on porosity is now considered. 
Furthermore, the new multi
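    The MIX-style coupling of cells with different transport properties can be caricatured as a weighted-average update on a tiny grid. This is a conceptual stand-in, not PHREEQC's actual numerics, chemistry, or mesh; grid size and mixing fractions are arbitrary:

```python
# Conceptual MIX-style transport step: each cell's new composition is a
# weighted average of itself and its neighbours, with per-cell mixing
# fractions encoding cell-specific diffusion properties. Not PHREEQC itself.

def mix_step(conc, frac):
    """One explicit mixing step on an n x m grid; 'frac' is the per-cell
    fraction exchanged with each von Neumann neighbour."""
    n, m = len(conc), len(conc[0])
    new = [row[:] for row in conc]
    for i in range(n):
        for j in range(m):
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                a, b = i + di, j + dj
                if 0 <= a < n and 0 <= b < m:
                    f = min(frac[i][j], frac[a][b])   # symmetric exchange
                    new[i][j] += f * (conc[a][b] - conc[i][j])
    return new

conc = [[1.0, 0.0], [0.0, 0.0]]   # a "concrete" cell next to "clay" cells
frac = [[0.1, 0.1], [0.1, 0.02]]  # lower mixing fraction = lower diffusivity
print(mix_step(conc, frac))       # mass spreads out but is conserved
```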

  13. Modelling Machine Tools using Structure Integrated Sensors for Fast Calibration

    Directory of Open Access Journals (Sweden)

    Benjamin Montavon

    2018-02-01

    Full Text Available Monitoring of the relative deviation between commanded and actual tool tip position, which limits the volumetric performance of the machine tool, enables the use of contemporary methods of compensation to reduce tolerance mismatch and the uncertainties of on-machine measurements. The development of a primarily optical sensor setup capable of being integrated into the machine structure without limiting its operating range is presented. The use of a frequency-modulating interferometer and photosensitive arrays in combination with a Gaussian laser beam allows for fast and automated online measurements of the axes’ motion errors and thermal conditions with comparable accuracy, lower cost, and smaller dimensions as compared to state-of-the-art optical measuring instruments for offline machine tool calibration. The development is tested through simulation of the sensor setup based on raytracing and Monte-Carlo techniques.

  14. Spatially explicit spectral analysis of point clouds and geospatial data

    Science.gov (United States)

    Buscombe, Daniel D.

    2015-01-01

    The increasing use of spatially explicit analyses of high-resolution spatially distributed data (imagery and point clouds) for the purposes of characterising spatial heterogeneity in geophysical phenomena necessitates the development of custom analytical and computational tools. In recent years, such analyses have become the basis of, for example, automated texture characterisation and segmentation, roughness and grain size calculation, and feature detection and classification, from a variety of data types. In this work, much use has been made of statistical descriptors of localised spatial variations in amplitude variance (roughness), however the horizontal scale (wavelength) and spacing of roughness elements is rarely considered. This is despite the fact that the ratio of characteristic vertical to horizontal scales is not constant and can yield important information about physical scaling relationships. Spectral analysis is a hitherto under-utilised but powerful means to acquire statistical information about relevant amplitude and wavelength scales, simultaneously and with computational efficiency. Further, quantifying spatially distributed data in the frequency domain lends itself to the development of stochastic models for probing the underlying mechanisms which govern the spatial distribution of geological and geophysical phenomena. The software package PySESA (Python program for Spatially Explicit Spectral Analysis) has been developed for generic analyses of spatially distributed data in both the spatial and frequency domains. Developed predominantly in Python, it accesses libraries written in Cython and C++ for efficiency. It is open source and modular, therefore readily incorporated into, and combined with, other data analysis tools and frameworks with particular utility for supporting research in the fields of geomorphology, geophysics, hydrography, photogrammetry and remote sensing. The analytical and computational structure of the toolbox is

  15. Spatially explicit spectral analysis of point clouds and geospatial data

    Science.gov (United States)

    Buscombe, Daniel

    2016-01-01

    The increasing use of spatially explicit analyses of high-resolution spatially distributed data (imagery and point clouds) for the purposes of characterising spatial heterogeneity in geophysical phenomena necessitates the development of custom analytical and computational tools. In recent years, such analyses have become the basis of, for example, automated texture characterisation and segmentation, roughness and grain size calculation, and feature detection and classification, from a variety of data types. In this work, much use has been made of statistical descriptors of localised spatial variations in amplitude variance (roughness), however the horizontal scale (wavelength) and spacing of roughness elements is rarely considered. This is despite the fact that the ratio of characteristic vertical to horizontal scales is not constant and can yield important information about physical scaling relationships. Spectral analysis is a hitherto under-utilised but powerful means to acquire statistical information about relevant amplitude and wavelength scales, simultaneously and with computational efficiency. Further, quantifying spatially distributed data in the frequency domain lends itself to the development of stochastic models for probing the underlying mechanisms which govern the spatial distribution of geological and geophysical phenomena. The software package PySESA (Python program for Spatially Explicit Spectral Analysis) has been developed for generic analyses of spatially distributed data in both the spatial and frequency domains. Developed predominantly in Python, it accesses libraries written in Cython and C++ for efficiency. It is open source and modular, therefore readily incorporated into, and combined with, other data analysis tools and frameworks with particular utility for supporting research in the fields of geomorphology, geophysics, hydrography, photogrammetry and remote sensing. 
The analytical and computational structure of the toolbox is described.
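The spectral approach described above, recovering amplitude (roughness) and wavelength scales simultaneously from spatially distributed data, can be illustrated with a minimal sketch. This is not the PySESA API; it is a generic 1-D example using NumPy's FFT, with a synthetic sinusoidal profile as an assumed input:

```python
import numpy as np

def spectral_roughness(z, dx=1.0):
    """Estimate RMS amplitude (roughness) and the dominant wavelength of a
    1-D elevation profile from its power spectrum."""
    z = np.asarray(z, dtype=float)
    x = np.arange(z.size)
    z = z - np.polyval(np.polyfit(x, z, 1), x)   # remove linear trend
    spec = np.abs(np.fft.rfft(z)) ** 2           # one-sided power spectrum
    freqs = np.fft.rfftfreq(z.size, d=dx)        # cycles per unit distance
    spec[0] = 0.0                                # ignore the DC component
    rms = np.sqrt(np.mean(z ** 2))               # vertical roughness scale
    peak = freqs[np.argmax(spec)]                # dominant spatial frequency
    wavelength = 1.0 / peak if peak > 0 else np.inf
    return rms, wavelength

# synthetic profile: 0.5 m amplitude ripples with a 10 m wavelength
x = np.arange(0.0, 200.0, 1.0)
rms, wl = spectral_roughness(0.5 * np.sin(2 * np.pi * x / 10.0), dx=1.0)
```

For the synthetic ripple field the recovered dominant wavelength is 10 m and the RMS roughness is close to 0.5/√2 ≈ 0.35 m, showing how the vertical and horizontal scales are obtained together from a single spectrum.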

  16. Understanding and making practice explicit

    Directory of Open Access Journals (Sweden)

    Gráinne Conole

    2006-12-01

    Full Text Available This issue contains four, on the face of it, quite different papers, but on looking a little closer there are a number of interesting themes running through them that illustrate some of the key methodological and theoretical issues that e-learning researchers are currently struggling with; central to these is the issue of how do we understand and make practice explicit?

  17. Explicit Oral Narrative Intervention for Students with Williams Syndrome

    Science.gov (United States)

    Diez-Itza, Eliseo; Martínez, Verónica; Pérez, Vanesa; Fernández-Urquiza, Maite

    2018-01-01

    Narrative skills play a crucial role in organizing experience, facilitating social interaction and building academic discourse and literacy. They are at the interface of cognitive, social, and linguistic abilities related to school engagement. Despite their relative strengths in social and grammatical skills, students with Williams syndrome (WS) do not show parallel cognitive and pragmatic performance in narrative generation tasks. The aim of the present study was to assess retelling of a TV cartoon tale and the effect of an individualized explicit instruction of the narrative structure. Participants included eight students with WS who attended different special education levels. Narratives were elicited in two sessions (pre and post intervention), and were transcribed, coded and analyzed using the tools of the CHILDES Project. Narratives were coded for productivity and complexity at the microstructure and macrostructure levels. Microstructure productivity (i.e., length of narratives) included number of utterances, clauses, and tokens. Microstructure complexity included mean length of utterances, lexical diversity and use of discourse markers as cohesive devices. Narrative macrostructure was assessed for textual coherence through the Pragmatic Evaluation Protocol for Speech Corpora (PREP-CORP). Macrostructure productivity and complexity included, respectively, the recall and sequential order of scenarios, episodes, events and characters. A total of four intervention sessions, lasting approximately 20 min, were delivered individually once a week. This brief intervention addressed explicit instruction about the narrative structure and the use of specific discourse markers to improve cohesion of story retellings. Intervention strategies included verbal scaffolding and modeling, conversational context for retelling the story and visual support with pictures printed from the cartoon. Results showed significant changes in WS students’ retelling of the story, both at

  18. Explicit Oral Narrative Intervention for Students with Williams Syndrome

    Directory of Open Access Journals (Sweden)

    Eliseo Diez-Itza

    2018-01-01

    Full Text Available Narrative skills play a crucial role in organizing experience, facilitating social interaction and building academic discourse and literacy. They are at the interface of cognitive, social, and linguistic abilities related to school engagement. Despite their relative strengths in social and grammatical skills, students with Williams syndrome (WS do not show parallel cognitive and pragmatic performance in narrative generation tasks. The aim of the present study was to assess retelling of a TV cartoon tale and the effect of an individualized explicit instruction of the narrative structure. Participants included eight students with WS who attended different special education levels. Narratives were elicited in two sessions (pre and post intervention, and were transcribed, coded and analyzed using the tools of the CHILDES Project. Narratives were coded for productivity and complexity at the microstructure and macrostructure levels. Microstructure productivity (i.e., length of narratives included number of utterances, clauses, and tokens. Microstructure complexity included mean length of utterances, lexical diversity and use of discourse markers as cohesive devices. Narrative macrostructure was assessed for textual coherence through the Pragmatic Evaluation Protocol for Speech Corpora (PREP-CORP. Macrostructure productivity and complexity included, respectively, the recall and sequential order of scenarios, episodes, events and characters. A total of four intervention sessions, lasting approximately 20 min, were delivered individually once a week. This brief intervention addressed explicit instruction about the narrative structure and the use of specific discourse markers to improve cohesion of story retellings. Intervention strategies included verbal scaffolding and modeling, conversational context for retelling the story and visual support with pictures printed from the cartoon. Results showed significant changes in WS students’ retelling of the

  19. MOVES - A tool for Modeling and Verification of Embedded Systems

    DEFF Research Database (Denmark)

    Ellebæk, Jens; Knudsen, Kristian S.; Brekling, Aske Wiid

    2007-01-01

We demonstrate MOVES, a tool which allows designers of embedded systems to explore possible implementations early in the design process. The demonstration of MOVES will show how designers can explore different designs by changing the mapping of tasks on processing elements, the number and/or speed of processing elements, the size of local memories, and the operating systems (scheduling algorithm).

  20. Using urban forest assessment tools to model bird habitat potential

    Science.gov (United States)

    Susannah B. Lerman; Keith H. Nislow; David J. Nowak; Stephen DeStefano; David I. King; D. Todd. Jones-Farrand

    2014-01-01

    The alteration of forest cover and the replacement of native vegetation with buildings, roads, exotic vegetation, and other urban features pose one of the greatest threats to global biodiversity. As more land becomes slated for urban development, identifying effective urban forest wildlife management tools becomes paramount to ensure the urban forest provides habitat...

  1. Correction tool for Active Shape Model based lumbar muscle segmentation.

    Science.gov (United States)

    Valenzuela, Waldo; Ferguson, Stephen J; Ignasiak, Dominika; Diserens, Gaelle; Vermathen, Peter; Boesch, Chris; Reyes, Mauricio

    2015-08-01

In the clinical environment, accuracy and speed of the image segmentation process play a key role in the analysis of pathological regions. Despite advances in anatomic image segmentation, time-effective correction tools are commonly needed to improve segmentation results. Therefore, these tools must provide fast corrections with a low number of interactions and a user-independent solution. In this work we present a new interactive method for correcting image segmentations. Given an initial segmentation and the original image, our tool provides a 2D/3D environment that enables 3D shape correction through simple 2D interactions. Our scheme is based on direct manipulation of free-form deformation adapted to a 2D environment. This approach enables an intuitive and natural correction of 3D segmentation results. The developed method has been implemented into a software tool and has been evaluated for the task of lumbar muscle segmentation from Magnetic Resonance Images. Experimental results show that full segmentation correction could be performed within an average correction time of 6±4 minutes and an average of 68±37 interactions, while maintaining the quality of the final segmentation result within an average Dice coefficient of 0.92±0.03.
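The Dice coefficient reported above is a standard overlap measure between a segmentation and a reference mask. A minimal sketch of the metric itself (not the authors' correction tool):

```python
import numpy as np

def dice_coefficient(seg, ref):
    """Dice overlap between two binary masks: 2|A∩B| / (|A| + |B|)."""
    seg = np.asarray(seg, dtype=bool)
    ref = np.asarray(ref, dtype=bool)
    denom = seg.sum() + ref.sum()
    if denom == 0:
        return 1.0   # both masks empty: count as perfect agreement
    return 2.0 * np.logical_and(seg, ref).sum() / denom

# two 6x6 squares offset by one pixel: overlap is 5x5 = 25 px
a = np.zeros((10, 10), dtype=bool); a[2:8, 2:8] = True
b = np.zeros((10, 10), dtype=bool); b[3:9, 3:9] = True
score = dice_coefficient(a, b)   # 2*25 / (36+36) ≈ 0.694
```

A Dice score of 1.0 means perfect overlap, 0.0 means no overlap; the 0.92±0.03 quoted above therefore indicates corrections that stay very close to the reference segmentation.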

  2. Advanced Computing Tools and Models for Accelerator Physics

    International Nuclear Information System (INIS)

    Ryne, Robert; Ryne, Robert D.

    2008-01-01

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics

  3. Tool-Body Assimilation Model Based on Body Babbling and Neurodynamical System

    Directory of Open Access Journals (Sweden)

    Kuniyuki Takahashi

    2015-01-01

Full Text Available We propose a new method of tool use based on a tool-body assimilation model built on body babbling and a neurodynamical system, which enables robots to use tools. Almost all existing studies of robot tool use require predetermined motions and tool features; the motion patterns are limited and the robots cannot use novel tools. Other studies fully search all available parameters for novel tools, but this leads to massive amounts of calculation. To solve these problems, we took the following approach: we used a humanoid robot model to generate random motions based on human body babbling. These rich motion experiences were used to train recurrent and deep neural networks for modeling a body image. Tool features were self-organized in parametric bias, modulating the body image according to the tool in use. Finally, we designed a neural network for the robot to generate motion only from the target image. Experiments were conducted with multiple tools for manipulating a cylindrical target object. The results show that the tool-body assimilation model is capable of motion generation.

  4. Regional Sediment Management (RSM) Modeling Tools: Integration of Advanced Sediment Transport Tools into HEC-RAS

    Science.gov (United States)

    2014-06-01

sediment transport within the USACE HEC River Analysis System (HEC-RAS) software package and to determine its applicability to Regional Sediment Management (RSM) challenges. HEC-RAS SEDIMENT MODELING BACKGROUND: HEC-RAS performs (1) one-dimensional (1D) steady and unsteady hydraulic river ... Albuquerque (SPA)), and recently, the USACE RSM Program. HEC-RAS is one of several hydraulic modeling codes available for river analysis in the

  5. Assessment of wear dependence parameters in complex model of cutting tool wear

    Science.gov (United States)

    Antsev, A. V.; Pasko, N. I.; Antseva, N. V.

    2018-03-01

This paper addresses the wear dependence of the generic efficient life period of cutting tools, taken as an aggregate of the law of tool wear rate distribution and the dependence of the parameters of this law on the cutting mode, factoring in the random factor as exemplified by the complex model of wear. The complex model of wear takes into account the variance of cutting properties within one batch of tools, the variance in machinability within one batch of workpieces, and the stochastic nature of the wear process itself. A technique for assessing the wear dependence parameters in a complex model of cutting tool wear is provided. The technique is supported by a numerical example.
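The stochastic ingredients of such a complex wear model (tool-to-tool variance, workpiece variance, and process noise) can be sketched with a Monte-Carlo toy example. The normal distributions and all parameter values below are illustrative assumptions, not the fitted wear law from the paper:

```python
import random

def simulate_tool_lives(n_tools, wear_limit, mean_rate, cv_tools, cv_process, seed=0):
    """Monte-Carlo tool life: each tool draws its own mean wear rate
    (tool-to-tool variance) and each realisation adds process noise;
    life = wear limit / effective wear rate. Distributions are assumed."""
    rng = random.Random(seed)
    lives = []
    for _ in range(n_tools):
        tool_rate = rng.gauss(mean_rate, cv_tools * mean_rate)
        rate = max(tool_rate * (1.0 + rng.gauss(0.0, cv_process)), 1e-9)
        lives.append(wear_limit / rate)
    return lives

# illustrative parameters: 0.3 mm wear limit, 0.01 mm/min mean wear rate
lives = simulate_tool_lives(1000, wear_limit=0.3, mean_rate=0.01,
                            cv_tools=0.1, cv_process=0.05)
mean_life = sum(lives) / len(lives)   # clusters around 0.3 / 0.01 = 30 min
```

The spread of the resulting life distribution, not just its mean, is what such a model exposes for tool-replacement decisions.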

  6. GEOQUIMICO : an interactive tool for comparing sorption conceptual models (surface complexation modeling versus K[D])

    International Nuclear Information System (INIS)

    Hammond, Glenn E.; Cygan, Randall Timothy

    2007-01-01

Within reactive geochemical transport, several conceptual models exist for simulating sorption processes in the subsurface. Historically, the K[D] approach has been the method of choice due to ease of implementation within a reactive transport model and straightforward comparison with experimental data. However, for modeling complex sorption phenomena (e.g. sorption of radionuclides onto mineral surfaces), this approach does not systematically account for variations in location, time, or chemical conditions, and more sophisticated methods such as a surface complexation model (SCM) must be utilized. It is critical to determine which conceptual model to use, that is, when the difference between them becomes important to regulatory decisions. The geochemical transport tool GEOQUIMICO has been developed to assist in this decision-making process. GEOQUIMICO provides a user-friendly framework for comparing the accuracy and performance of sorption conceptual models. The model currently supports the K[D] and SCM conceptual models. The code is written in the object-oriented Java programming language to facilitate model development and improve code portability. The basic theory underlying geochemical transport and the sorption conceptual models noted above is presented in this report. Explanations are provided of how these physicochemical processes are implemented in GEOQUIMICO, and a brief verification study comparing GEOQUIMICO results to data found in the literature is given.
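The K[D] conceptual model treats the sorbed concentration as a linear function of the dissolved concentration, S = K[D]·C, which leads to a constant retardation factor R = 1 + (ρ_b/θ)·K[D] for transport. A minimal sketch with illustrative parameter values (not GEOQUIMICO code):

```python
def kd_sorbed(c_aq, kd):
    """Linear K_D isotherm: sorbed concentration S = K_D * C."""
    return kd * c_aq

def retardation_factor(kd, bulk_density, porosity):
    """Retardation of a solute front under the K_D model:
    R = 1 + (rho_b / theta) * K_D (consistent units assumed)."""
    return 1.0 + (bulk_density / porosity) * kd

# illustrative values: K_D = 0.5 mL/g, rho_b = 1.6 g/cm^3, theta = 0.4
R = retardation_factor(0.5, 1.6, 0.4)   # R = 3: front moves 3x slower than water
```

The constancy of R is exactly the limitation the abstract points to: a surface complexation model would instead recompute sorption from the local chemistry at each location and time step.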

  7. A unified tool for performance modelling and prediction

    International Nuclear Information System (INIS)

    Gilmore, Stephen; Kloul, Leila

    2005-01-01

    We describe a novel performability modelling approach, which facilitates the efficient solution of performance models extracted from high-level descriptions of systems. The notation which we use for our high-level designs is the Unified Modelling Language (UML) graphical modelling language. The technology which provides the efficient representation capability for the underlying performance model is the multi-terminal binary decision diagram (MTBDD)-based PRISM probabilistic model checker. The UML models are compiled through an intermediate language, the stochastic process algebra PEPA, before translation into MTBDDs for solution. We illustrate our approach on a real-world analysis problem from the domain of mobile telephony

  8. Customer Data Analysis Model using Business Intelligence Tools in Telecommunication Companies

    Directory of Open Access Journals (Sweden)

    Monica LIA

    2015-10-01

Full Text Available This article presents a customer data analysis model for a telecommunication company and business intelligence tools for data modelling, transformation, data visualization and dynamic report building. As the Romanian market matures, knowing the information inside the data and making forecasts for strategic decisions becomes more important. Business Intelligence tools are used in business organizations as support for decision making.

  9. A Decision Support Model and Tool to Assist Financial Decision-Making in Universities

    Science.gov (United States)

    Bhayat, Imtiaz; Manuguerra, Maurizio; Baldock, Clive

    2015-01-01

In this paper, a model and tool are proposed to assist universities and other mission-based organisations in systematically ascertaining the optimal portfolio of projects, in any year, that meets the organisation's risk tolerances and available funds. The model and tool presented build on previous work on university operations and decision support systems…
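The abstract does not give the model in detail, but the core task, choosing a subset of projects that maximises value within available funds, is a budget-constrained (knapsack-style) selection. A hypothetical brute-force sketch with made-up project data:

```python
from itertools import combinations

def best_portfolio(projects, budget):
    """Exhaustively choose the subset of projects with maximum total value
    whose total cost fits the available funds (fine for small portfolios)."""
    best, best_value = (), 0.0
    for r in range(1, len(projects) + 1):
        for subset in combinations(projects, r):
            cost = sum(p["cost"] for p in subset)
            value = sum(p["value"] for p in subset)
            if cost <= budget and value > best_value:
                best, best_value = subset, value
    return [p["name"] for p in best], best_value

# made-up projects and scores, purely for illustration
projects = [
    {"name": "lab upgrade", "cost": 40, "value": 60},
    {"name": "IT refresh",  "cost": 30, "value": 40},
    {"name": "new campus",  "cost": 90, "value": 95},
]
names, value = best_portfolio(projects, budget=100)   # two small projects beat one big one
```

A real decision-support tool would add risk tolerances as extra constraints and use integer programming rather than enumeration, but the structure of the decision is the same.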

  10. Transformation of UML models to CSP : a case study for graph transformation tools

    NARCIS (Netherlands)

    Varró, D.; Asztalos, M.; Bisztray, D.; Boronat, A.; Dang, D.; Geiß, R.; Greenyer, J.; Van Gorp, P.M.E.; Kniemeyer, O.; Narayanan, A.; Rencis, E.; Weinell, E.; Schürr, A.; Nagl, M.; Zündorf, A.

    2008-01-01

    Graph transformation provides an intuitive mechanism for capturing model transformations. In the current paper, we investigate and compare various graph transformation tools using a compact practical model transformation case study carried out as part of the AGTIVE 2007 Tool Contest [22]. The aim of

  11. Community Intercomparison Suite (CIS) v1.4.0: A tool for intercomparing models and observations

    NARCIS (Netherlands)

    Watson-Parris, Duncan; Schutgens, Nick; Cook, Nicholas; Kipling, Zak; Kershaw, Philip; Gryspeerdt, Edward; Lawrence, Bryan; Stier, Philip

    2016-01-01

    The Community Intercomparison Suite (CIS) is an easy-to-use command-line tool which has been developed to allow the straightforward intercomparison of remote sensing, in-situ and model data. While there are a number of tools available for working with climate model data, the large diversity of

  12. Formal Development of a Tool for Automated Modelling and Verification of Relay Interlocking Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Kjær, Andreas A.; Le Bliguet, Marie

    2011-01-01

This paper describes a tool for formal modelling of relay interlocking systems and explains how it has been stepwise, formally developed using the RAISE method. The developed tool takes the circuit diagrams of a relay interlocking system as input and gives as result a state transition system modelling...

  13. Teachers' Use of Computational Tools to Construct and Explore Dynamic Mathematical Models

    Science.gov (United States)

    Santos-Trigo, Manuel; Reyes-Rodriguez, Aaron

    2011-01-01

    To what extent does the use of computational tools offer teachers the possibility of constructing dynamic models to identify and explore diverse mathematical relations? What ways of reasoning or thinking about the problems emerge during the model construction process that involves the use of the tools? These research questions guided the…

  14. A GIS Tool for evaluating and improving NEXRAD and its application in distributed hydrologic modeling

    Science.gov (United States)

    Zhang, X.; Srinivasan, R.

    2008-12-01

In this study, a user-friendly GIS tool was developed for evaluating and improving NEXRAD using raingauge data. This GIS tool can automatically read in raingauge and NEXRAD data, evaluate the accuracy of NEXRAD for each time unit, implement several geostatistical methods to improve the accuracy of NEXRAD using raingauge data, and output a spatial precipitation map for a distributed hydrologic model. The geostatistical methods incorporated in this tool include Simple Kriging with varying local means, Kriging with External Drift, Regression Kriging, Co-Kriging, and a new geostatistical method developed by Li et al. (2008). This tool was applied in two test watersheds at hourly and daily temporal scales. The preliminary cross-validation results show that incorporating raingauge data to calibrate NEXRAD can markedly change the spatial pattern of NEXRAD and improve its accuracy. Using different geostatistical methods, the GIS tool was applied to produce long-term precipitation input for a distributed hydrologic model, the Soil and Water Assessment Tool (SWAT). Animated video was generated to vividly illustrate the effect of using different precipitation input data on distributed hydrologic modeling. Currently, this GIS tool is developed as an extension of SWAT, which is used as a water quantity and quality modeling tool by the USDA and EPA. The flexible module-based design of this tool also makes it easy to adapt to other hydrologic models for hydrological modeling and water resources management.
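The kriging variants listed above all merge gauge observations into the radar field. A much simpler stand-in, inverse-distance weighting of gauge-minus-radar residuals, conveys the idea; this is an illustrative substitute, not one of the tool's methods:

```python
import numpy as np

def idw_residuals(grid_xy, gauge_xy, residuals, power=2.0):
    """Interpolate gauge-minus-radar residuals onto grid points with
    inverse-distance weighting; adding the result to the radar field
    yields a gauge-corrected precipitation map."""
    grid_xy = np.asarray(grid_xy, dtype=float)
    gauge_xy = np.asarray(gauge_xy, dtype=float)
    residuals = np.asarray(residuals, dtype=float)
    out = np.empty(len(grid_xy))
    for k, p in enumerate(grid_xy):
        d = np.linalg.norm(gauge_xy - p, axis=1)
        if np.any(d < 1e-12):            # grid point coincides with a gauge
            out[k] = residuals[np.argmin(d)]
        else:
            w = 1.0 / d ** power
            out[k] = np.sum(w * residuals) / np.sum(w)
    return out

# radar is 2 mm low at gauge (0,0) and 1 mm high at gauge (10,0)
corr = idw_residuals([[0, 0], [5, 0], [10, 0]], [[0, 0], [10, 0]], [2.0, -1.0])
```

Kriging methods replace the fixed distance weights with weights derived from a fitted variogram, which is why they can also provide estimation variances at each grid cell.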

  15. Numerical modelling of structure and mechanical properties for medical tools

    OpenAIRE

    L. Jeziorski; J. Jasinski; M. Lubas; M. Szota; P. Lacki; B. Stodolnika

    2007-01-01

Purpose: In order to design forceps and bowl cutters properly, it is necessary to optimise many parameters and consider the functions which these medical tools should fulfil. Of course, some simplifications are necessary in respect of calculation methodology. In the paper a solution procedure concerning this problem has been presented. The presented solution allows for precise determination of the geometrical dimensions according to the functional requirements that forceps should fulfil. The ...

  16. Solid-state-drives (SSDs) modeling simulation tools & strategies

    CERN Document Server

    2017-01-01

This book introduces simulation tools and strategies for complex systems of solid-state drives (SSDs), which consist of a flash multi-core microcontroller plus NAND flash memories. It provides a broad overview of the most popular simulation tools, with special focus on open source solutions. VSSIM, NANDFlashSim and DiskSim are benchmarked against the performance of real SSDs under different traffic workloads. PROs and CONs of each simulator are analyzed, and it is clearly indicated which kind of answers each of them can give and at what price. It is explained that speed and precision do not go hand in hand, and that it is important to understand when to simulate what, and with which tool. Being able to simulate SSD performance is mandatory to meet time-to-market, together with product cost and quality. Over the last few years the authors developed an advanced simulator named “SSDExplorer”, which has been used to evaluate multiple phenomena with great accuracy, from QoS (Quality Of Service) to Read Retry, fr...

  17. Requirements for UML and OWL Integration Tool for User Data Consistency Modeling and Testing

    DEFF Research Database (Denmark)

    Nytun, J. P.; Jensen, Christian Søndergaard; Oleshchuk, V. A.

    2003-01-01

The amount of data available on the Internet is continuously increasing; consequently, there is a growing need for tools that help to analyse the data. Testing of consistency among data received from different sources is made difficult by the number of different languages and schemas being used. ... In this paper we analyze requirements for a tool that supports integration of UML models and ontologies written in languages like the W3C Web Ontology Language (OWL). The tool can be used in the following way: after loading two legacy models into the tool, the tool user connects them by inserting modeling... An important part of this technique is attaching OCL expressions to special boolean class attributes that we call consistency attributes. The resulting integration model can be used for automatic consistency testing of two instances of the legacy models by automatically instantiating the whole integration...

  18. Rapid Response Tools and Datasets for Post-fire Hydrological Modeling

    Science.gov (United States)

    Miller, Mary Ellen; MacDonald, Lee H.; Billmire, Michael; Elliot, William J.; Robichaud, Pete R.

    2016-04-01

Rapid response is critical following natural disasters. Flooding, erosion, and debris flows are a major threat to life, property and municipal water supplies after moderate and high severity wildfires. The problem is that mitigation measures must be rapidly implemented if they are to be effective, but they are expensive and cannot be applied everywhere. Fires, runoff, and erosion risks are also highly heterogeneous in space, so there is an urgent need for a rapid, spatially-explicit assessment. Past post-fire modeling efforts have usually relied on lumped, conceptual models because of the lack of readily available, spatially-explicit data layers on the key controls of topography, vegetation type, climate, and soil characteristics. The purpose of this project is to develop a set of spatially-explicit data layers for use in process-based models such as WEPP, and to make these data layers freely available. The resulting interactive online modeling database (http://geodjango.mtri.org/geowepp/) is now operational and publicly available for 17 western states in the USA. After a fire, users only need to upload a soil burn severity map, and this is combined with the pre-existing data layers to generate the model inputs needed for spatially explicit models such as GeoWEPP (Renschler, 2003). The development of this online database has allowed us to predict post-fire erosion and various remediation scenarios in just 1-7 days for six fires ranging in size from 4 to 540 km2. These initial successes have stimulated efforts to further improve the spatial extent and amount of data, and to add functionality to support the USGS debris flow model, batch processing for Disturbed WEPP (Elliot et al., 2004) and ERMiT (Robichaud et al., 2007), and to support erosion modeling for other land uses, such as agriculture or mining. The design and techniques used to create the database and the modeling interface are readily repeatable for any area or country that has the necessary topography

  19. Prediction models : the right tool for the right problem

    NARCIS (Netherlands)

    Kappen, Teus H.; Peelen, Linda M.

    2016-01-01

    PURPOSE OF REVIEW: Perioperative prediction models can help to improve personalized patient care by providing individual risk predictions to both patients and providers. However, the scientific literature on prediction model development and validation can be quite technical and challenging to

  20. Advanced aviation environmental modeling tools to inform policymakers

    Science.gov (United States)

    2012-08-19

Aviation environmental models which conform to international guidance have advanced over the past several decades. Enhancements to algorithms and databases have increasingly shown these models to compare well with gold standard measured data. The...